How Safe Are Self-Driving Cars? - A Case Study

The Safety Levels of Self-Driving Cars

According to the National Safety Council, an estimated 38,300 people were killed and another 4.4 million were injured in accidents on U.S. roads in 2015. Because the vast majority of fatal accidents are due to human error, self-driving vehicles have the potential to save many lives. Indeed, one study estimated that widespread adoption of self-driving vehicles by the year 2030 could eliminate 90 percent of all auto accidents in the United States, saving close to $190 billion in auto repair and health care–related costs annually and, even more importantly, tens of thousands of lives (90 percent of 38,300 is roughly 34,500 deaths averted per year at 2015 levels). The National Highway Traffic Safety Administration (NHTSA) recently adopted the Society of Automotive Engineers’ (SAE) levels for automated driving systems. The six levels range from complete driver control to full autonomy, as summarized below (a minimal code sketch of the levels follows the list):

  • Level 0 (no automation): A human driver controls it all: steering, brakes, acceleration, and the like, although the car may include some warning or intervention systems.
  • Level 1 (driver assistance): Most functions are controlled by the human driver, but some specific functions (such as steering or accelerating) can be done automatically by the car.
  • Level 2 (partial automation): The car has at least one driver-assistance system that automates “both steering and acceleration/deceleration using information about the driving environment” (for example, adaptive cruise control combined with lane-centering). The driver must still be ready to take control of the vehicle at all times, however, to handle the remaining “dynamic driving tasks.”
  • Level 3 (conditional automation): Drivers are able to completely shift “safety-critical functions” to the vehicle under certain traffic or environmental conditions. The driver is still present and is expected to “respond appropriately” if asked to intervene.
  • Level 4 (high automation): At this level, cars are fully autonomous and are designed to handle all aspects of the dynamic driving task—even if a human driver does not respond appropriately to a request to intervene—including performing all safety-critical driving functions and monitoring roadway conditions for an entire trip. However, it’s important to note that this is limited to the “operational design domain” of the vehicle—meaning it does not cover every driving scenario.
  • Level 5 (full automation): Cars at this level have a fully autonomous system that is designed to handle all aspects of the dynamic driving task under all the roadway and environmental conditions that could be managed by a human driver—including extreme environments, such as on dirt roads (which are unlikely to be navigated by driverless vehicles in the near future).
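
A compact way to keep these levels straight is to encode them as an enumeration. The sketch below is illustrative only; the class and function names are invented and are not part of any SAE or NHTSA specification:

    from enum import IntEnum

    class SAELevel(IntEnum):
        """SAE driving-automation levels, as summarized above."""
        NO_AUTOMATION = 0           # human controls everything
        DRIVER_ASSISTANCE = 1       # car automates steering OR speed
        PARTIAL_AUTOMATION = 2      # car automates steering AND speed
        CONDITIONAL_AUTOMATION = 3  # car drives; human takes over on request
        HIGH_AUTOMATION = 4         # car handles its entire design domain
        FULL_AUTOMATION = 5         # car handles everything, everywhere

    def driver_must_monitor(level: SAELevel) -> bool:
        # Through level 2, the human driver is responsible for monitoring
        # the road at all times; from level 3 up, the system monitors while
        # engaged (per the summary above).
        return level <= SAELevel.PARTIAL_AUTOMATION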

Autonomous vehicles are chock full of sensors and cameras that observe, monitor, and record the surrounding environment, including other vehicles in the vicinity. All these data are fed into an artificial intelligence algorithm that decides which maneuvers are right, wrong, safe, or unsafe for the car to perform given the conditions it is experiencing. Self-driving cars can even share their driving experiences and recorded data with other cars, so that each car’s computer can adapt its algorithm to the environments encountered by other vehicles. The goal of this information sharing is to improve the ability of all self-driving vehicles to react to situations on the road without having to experience those situations firsthand. Tesla CEO Elon Musk, perhaps optimistically, anticipates that the first fully autonomous Tesla will be ready by 2018 but expects that regulatory approval may take an additional one to three years. Audi, BMW, Fiat Chrysler, Ford, General Motors, Mercedes, Nissan, Toyota, Volvo, and Waymo (the new name of Google’s self-driving division) all have some level of autonomous vehicle today, and all plan to deliver a fully autonomous vehicle by 2025 or sooner.
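
The information sharing described above amounts to a fleet-wide pool of driving experiences that each car can consult before attempting a maneuver it has never performed itself. Here is a minimal sketch of that idea; the Observation and FleetKnowledge names and the record layout are hypothetical, invented for illustration, and do not reflect any manufacturer’s actual system:

    from dataclasses import dataclass, field

    @dataclass
    class Observation:
        """One recorded driving situation (hypothetical schema)."""
        scenario: str       # e.g., "bus approaching in adjacent lane"
        action_taken: str   # e.g., "merge"
        outcome_safe: bool  # whether the maneuver ended safely

    @dataclass
    class FleetKnowledge:
        """Pool of experiences shared across every car in the fleet."""
        observations: list = field(default_factory=list)

        def share(self, obs: Observation) -> None:
            self.observations.append(obs)

        def safety_rate(self, scenario: str) -> float:
            # Fraction of shared experiences of this scenario that ended
            # safely -- a crude stand-in for how one car might weight a
            # maneuver using only other cars' experience.
            matches = [o for o in self.observations if o.scenario == scenario]
            if not matches:
                return 0.5  # no fleet data yet: fall back to a neutral prior
            return sum(o.outcome_safe for o in matches) / len(matches)

A production system would share learned model parameters or labeled sensor logs rather than flat records like these, but the principle is the same: one car’s experience updates every car’s estimates.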

Testing of autonomous vehicles has not been without incident, however. In 2016, one of Google’s self-driving cars hit a bus during a test drive in California because the car made an incorrect assumption about how the bus would react in a particular situation. The vehicle had identified an obstruction in the road ahead, so it decided to stop, wait for the lane next to it to clear, and then merge into that lane. Although the vehicle detected a city bus approaching in that lane, it incorrectly assumed that the bus driver would slow down. The bus driver, however, assumed the car would stay put, so he kept moving forward. The car pulled out, hitting the side of the bus while going about 2 mph. This was the first time in several years of testing on public roads that a Google self-driving car caused a crash. Understanding why a crash involving a self-driving car occurred is important in order to avoid a repeat of that accident scenario. In this case, Google made the necessary changes to its software so that it would “more deeply understand that buses and other large vehicles are less likely to yield” than other types of vehicles.
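
Google has not published the software change, but the kind of fix it describes (lowering the assumed probability that large vehicles will yield) can be illustrated with a short sketch. The base rate, the per-type discounts, and the merge threshold below are invented numbers; the point is the structure of the adjustment, not the values:

    def yield_probability(vehicle_type: str, base_estimate: float = 0.8) -> float:
        """Illustrative only: estimate the chance another vehicle yields."""
        # Post-crash adjustment: buses and other large vehicles are treated
        # as less likely to give way than the average road user.
        LESS_LIKELY_TO_YIELD = {"bus": 0.4, "truck": 0.5}
        return base_estimate * LESS_LIKELY_TO_YIELD.get(vehicle_type, 1.0)

    # A merge planner would then demand high confidence before pulling out:
    SHOULD_MERGE = yield_probability("bus") >= 0.7  # False here: wait instead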

A Tesla Model S with its Autopilot system activated was involved in a fatal crash in 2016, the first known fatality in a Tesla in which Autopilot was active. The accident occurred when a tractor trailer drove across the highway perpendicular to the Model S. Neither the driver nor the car noticed the big rig or the trailer “against a brightly lit sky,” and the brakes were not applied. The vehicle’s radar didn’t help in this case because, according to Tesla, it “tunes out what looks like an overhead road sign to avoid false braking events.” The NHTSA is investigating the accident to determine whether the Autopilot system was working properly; if not, it could consider ordering a recall to repair the problem.
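
Tesla’s explanation suggests a filter that discards radar returns whose estimated height makes them look like overhead structure (signs, bridges) rather than obstacles in the car’s path. The sketch below shows that general idea; the names and the clearance threshold are invented for illustration and are not Tesla’s actual logic:

    from dataclasses import dataclass

    @dataclass
    class RadarReturn:
        distance_m: float   # range to the detected object
        elevation_m: float  # estimated height of the object above the road

    OVERHEAD_CLEARANCE_M = 4.5  # invented threshold: typical sign/bridge height

    def is_braking_hazard(r: RadarReturn) -> bool:
        # Returns well above the vehicle's path are treated as overhead
        # structure and ignored to avoid false braking events. The failure
        # mode in the 2016 crash: the high, bright side of a trailer can be
        # misclassified as overhead structure and filtered out as well.
        return r.elevation_m < OVERHEAD_CLEARANCE_M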

  1. When self-driving cars are involved in accidents, where does liability reside? Is it the driver’s fault? Is the car manufacturer or software manufacturer liable? How might the deployment of self-driving cars affect the insurance industry?
  2. Automated driving systems range from complete driver control (level 0) to full autonomy (level 5). Should the degree of care exercised in developing vehicle software increase as the level of autonomy increases, or should all vehicle software be treated with the same level of care? Explain your answer.

Your well-written report should be 4-5 pages in length, not including the title or reference pages. Follow APA style guidelines, citing at least two references in support of your work, in addition to your text and the assigned readings.
