Autonomous cars (also known as self-driving cars, robot cars, and automated cars) are vehicles that can sense their surroundings and use that sensor data, together with prior data, to decide how to move. Autonomous capabilities have evolved from park-assist technologies to fully monitoring surroundings, changing lanes, and making emergency stops. Self-driving cars have been featured at expos and World's Fairs, and have made recent appearances at the North American International Auto Show and the Consumer Electronics Show. Proponents argue these cars are safer than traditional driving because they can sense obstacles and react to them more quickly than humans can. However, there are ethical considerations pertaining to the safety decisions these cars must make. Because autonomous vehicles are designed by humans, there is debate over who should determine the vehicles' tendencies. Moreover, there is the question of who should be held responsible for a vehicle's decisions when accidents occur, a question illustrated by the MIT Moral Machine and the Trolley Problem.
- 1 History
- 2 Future
- 3 Public Opinion
- 4 Legislation
- 5 Current Models
- 6 Safety Concerns
- 7 Ethical Concerns
- 8 See Also
- 9 References
Autonomous cars have origins in the 1920s, when the Houdina Radio Control Company installed radio control equipment in a 1926 Chandler. From 1950 to 1980, numerous companies and universities attempted to control cars by laying guide wires in the roadway. These electronically controlled cars made appearances at numerous world showcases. During the 1990s, the transition toward today's autonomous vehicles began.
Modern self-driving cars do much more than follow wires placed in the roadway; modern technology allows them to drive on public roads. New designs attempt to enable cars to make decisions about routes, safety, and changing lanes. Collectively, these enhancements are referred to as "driver assists," which designates them as distinct from driver replacements and true autonomy. This technology is far more sophisticated than that of previous models, which were controlled by humans on the other end of a remote control.
Autonomous driving systems were deployed by large corporations such as Uber and Tesla, Inc. as early as the summer of 2016 and are now beginning to be adopted. Such systems already require little to no human input while driving on public roadways under certain conditions. These modern vehicles use a combination of radar, GPS, odometry, computer programs, and machine learning to safely navigate roadways.
A wide variety of machine learning (ML) methods are employed today in the construction of autonomous driving systems, including established models such as encoder/decoder architectures, in which input images are fed into a CNN encoder whose output flows into a series of decoders, including but not limited to detection decoders, segmentation decoders, motion decoders, depth decoders, and localization decoders.
Convolutional Neural Networks (CNNs) are used for visual perception tasks such as object recognition, motion and depth estimation, and visual SLAM. In most self-driving systems, each of these tasks is modeled individually, and research teams such as that of Ganesh Sistu et al. are investigating joint designs that model these tasks simultaneously.
Decoders in autonomous driving systems consist of a series of "deconvolutions," or "upscaling" steps applied to different layers in the model, specifically the task-dependent layers, such as those for calibration or depth estimation. These tasks share a single CNN encoder, whose output feeds multiple parallel task-dependent decoders.
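As a rough illustration of this layout (not any production system's actual architecture), the shared-encoder/parallel-decoder idea can be sketched in a few lines. The numeric operations below are placeholders chosen only to show the data flow: one encoder pass, consumed by several task-specific heads.

```python
# Structural sketch of a multi-task encoder/decoder system. The arithmetic
# is a placeholder; the point is that one shared encoder feeds several
# parallel, task-specific decoders.

def cnn_encoder(image):
    """Stand-in for a CNN encoder: pool each row of the image into one
    shared feature value."""
    return [sum(row) / len(row) for row in image]

def detection_decoder(features):
    """Stand-in detection head: a single 'object present?' score."""
    return max(features)

def depth_decoder(features):
    """Stand-in depth head: one depth estimate per feature."""
    return [0.5 * f for f in features]

def segmentation_decoder(features):
    """Stand-in segmentation head: threshold each feature into a label."""
    return [1 if f > 0 else 0 for f in features]

image = [[2, 4], [-1, -3], [8, 6]]
features = cnn_encoder(image)      # encoded once...
outputs = {                        # ...then shared by every decoder
    "detection": detection_decoder(features),        # 7.0
    "depth": depth_decoder(features),                # [1.5, -1.0, 3.5]
    "segmentation": segmentation_decoder(features),  # [1, 0, 1]
}
print(outputs)
```

The design choice this mimics is hard parameter sharing: the expensive encoding is computed once and reused, which is why joint designs of the kind Sistu et al. study can be cheaper than running one full network per task.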
Levels of Autonomy
Any autonomous driving system is classified by common convention into one of six levels, ranging from Level 0 to Level 5, depending on the sophistication of the system. The measure of that sophistication, and thus the metric that determines the appropriate level, is the amount of input and attention required of a human operator in order to safely direct the vehicle.
- Level 0
- No Automation: The human driver is in charge of all aspects of driving, though the vehicle may have warning or minor intervention systems.
- Level 1
- Driver Assistance: The driver assistance system is in charge of either steering or accelerating and decelerating using driving environment information, while the human driver is in charge of all remaining driving tasks.
- Level 2
- Partial Automation: One or more assistance systems handle both steering and accelerating/decelerating using driving environment information, while the human driver is in charge of all remaining driving tasks.
- Level 3
- Conditional Automation: The automated driving system performs all aspects of driving but with a human driver responding to requests to intervene to override the system.
- Level 4
- High Automation: The automated driving system performs all aspects of driving even in the case where a human driver does not properly respond to a request to intervene and override the system.
- Level 5
- Full Automation: The automated driving system performs all aspects of driving under all road and environmental conditions that a human driver could handle.
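The taxonomy above is simple enough to encode directly. The sketch below shows one way a program might look up a level; the descriptions are paraphrased from the list in this article, not official SAE wording.

```python
# The six autonomy levels described above, as a lookup table.
# Descriptions are paraphrases of this article's list, not SAE text.
AUTONOMY_LEVELS = {
    0: "No Automation: human drives; vehicle may warn or briefly intervene",
    1: "Driver Assistance: system steers OR controls speed, not both",
    2: "Partial Automation: system steers AND controls speed; human does the rest",
    3: "Conditional Automation: system drives; human must answer requests to intervene",
    4: "High Automation: system drives even if the human ignores a request",
    5: "Full Automation: system drives under all human-manageable conditions",
}

def describe_level(level: int) -> str:
    """Return the description for a Level 0-5 system, or raise on bad input."""
    if level not in AUTONOMY_LEVELS:
        raise ValueError(f"autonomy level must be 0-5, got {level}")
    return AUTONOMY_LEVELS[level]

print(describe_level(2))
```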
Benefits of Autonomous Vehicles
Advocates of autonomous vehicles believe that they would substantially improve public safety and the economy. Current estimates predict that the adoption of autonomous vehicles could reduce deaths on the road by 90%. This would result in approximately 300,000 lives saved each decade in the United States and $190 billion saved in healthcare costs associated with car accidents. Furthermore, individuals riding in an autonomous vehicle would be able to accomplish tasks other than driving. While commuting to work, people could use their time to accomplish more work, or they could use the commute for leisure, such as watching the news or a movie.
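The lives-saved figure can be sanity-checked with back-of-the-envelope arithmetic, assuming a baseline of roughly 33,000 US road deaths per year (an approximate figure, not taken from this article's sources):

```python
# Back-of-the-envelope check of the "300,000 lives per decade" claim.
# The 33,000 annual-deaths baseline is an assumption, not a cited figure.
annual_us_road_deaths = 33_000
reduction = 0.90  # the 90% reduction estimate cited above

lives_saved_per_year = annual_us_road_deaths * reduction
lives_saved_per_decade = lives_saved_per_year * 10
print(int(lives_saved_per_decade))  # 297000, i.e. roughly 300,000
```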
Additional potential benefits of autonomous vehicles include:
- Less time spent driving
- Parking becoming easier
- New models of transportation
- Improved transportation efficiency
In a 2011 survey by Accenture, 49% of US and UK consumers stated that they would be comfortable using a "driverless car".
In a 2012 survey by JD Power and Associates, 37% of respondents said they would be interested in purchasing a fully autonomous car. However, that number dropped to 20% when told that the technology would cost more than $3,000.
In a 2014 survey by Insurance.com, over 75% of licensed drivers said they would at least consider buying a self-driving car. This number rose to 86% if their car insurance would be cheaper for an autonomous vehicle. 31.7% said they would not continue to drive anymore if autonomous cars became available.
In a 2015 survey by Delft University of Technology, respondents, on average, found manual driving the most enjoyable mode of driving. 22% of the survey respondents did not want to spend any money for a fully automated driving system, whereas 5% indicated they would be willing to pay more than $30,000. 33% of respondents indicated that fully automated driving would be highly enjoyable. 69% of respondents estimated that fully automated driving will reach a 50% market share between now and the year 2050.
With rapid developments still happening in the field of autonomous cars, it is hard for regulatory agencies to make laws about their design and usage. Today, 33 states have enacted legislation pertaining to autonomous vehicles.
Most laws regarding autonomous vehicles are similar to the legislation that has been passed in the state of Michigan.
- Michigan Senate Bill 0169, which was passed in 2013
- Defines automation as:
- - “Automated technology” means technology installed on a motor vehicle that has the capability to assist, make decisions for, or replace an operator.
- Defines what an operator of an autonomous vehicle is:
- - Sec. 35a. “Operate” or “operating” means 1 or more of the following:
- Being in actual physical control of a vehicle. This subdivision applies regardless of whether or not the person is licensed under this act as an operator or chauffeur.
- Causing an automated motor vehicle to move under its own power in automatic mode upon a highway or street regardless of whether the person is physically present in that automated motor vehicle at that time. This subdivision applies regardless of whether the person is licensed under this act as an operator or chauffeur. As used in this subdivision, “causing an automated motor vehicle to move under its own power in automatic mode” includes engaging the automated technology of that automated motor vehicle for that purpose.
- Explains the registration and insurance protocol for such a vehicle:
- H.R.3388 - SELF DRIVE Act
The Federal Government has also begun developing legislation regarding autonomous vehicles. The House passed the SELF DRIVE Act on September 7th, 2017. This act calls upon the Department of Transportation (DOT) to research how to cost-effectively inform car buyers about autonomous (and partly autonomous) vehicles and their limits.
Many of the world's leading automakers, along with technology companies, are developing autonomous vehicles or autonomous technology for vehicles. Most estimate that, at the very least, they will be producing a partially autonomous vehicle by 2020. Some companies developing autonomous vehicles include:
- General Motors
- Mercedes Benz
Ford calls itself one of the pioneers in autonomous vehicle technology and has recently begun testing its vehicles at the University of Michigan's Mcity and conducting real-world snow tests. Ford has two generations of autonomous vehicles. In 2003, Ford installed LiDAR technology on an F-250 Super Duty for participation in the DARPA challenge. The second generation of testing began in 2011 with the Ford Fusion sedan, chosen for its advanced electrical systems. In January 2016, Ford announced that it would expand its fleet of autonomous Fusions to 30 vehicles.
Google's self-driving car project, based out of Mountain View, California, began in 2009 and hit the road for testing in 2012. Currently, Google operates two types of autonomous vehicles. The first group is composed of Toyota Priuses and Lexus RX450hs. These vehicles are used by engineers to test driving systems in real-world traffic. Google has also begun testing a concept car developed in collaboration with Bosch, ZF Lenksysteme, LG, Continental, and Roush. These vehicles are designed without pedals or steering wheels. As part of its concept car experiment, Google has asked citizens to submit artwork for its Paint the Town initiative; artwork has been featured on cars in Mountain View, California, and Austin, Texas. Google expects to have vehicles available for purchase in 2020. The cars still need to undergo testing in both snow and heavy rain.
Tesla currently sells cars with Enhanced Autopilot and Full Self-Driving Capability. Enhanced Autopilot uses cameras and sensors that help the car match its speed to traffic conditions, stay in a lane, change lanes without driver input, switch freeways, exit a freeway, self-park near a parking spot, and be summoned to and from a garage. This feature, however, is not fully automatic, and a driver should still be in control of the car at all times. The Full Self-Driving Capability feature, by contrast, allows the car to fully drive itself; Tesla claims that this technology is estimated to be twice as safe as an average human driver. The system is supposed to complete short and long trips with no help from a human driver: the driver simply gets into the car and tells it the destination. If the driver says nothing, the car will consult the driver's calendar and drive to the next scheduled location, defaulting to home if the calendar is empty. The car will find the optimal route and navigate streets, even without street markings, traversing stop signs, stop lights, roundabouts, complex intersections, and crowded freeways at high speed. Once the passenger exits at the destination, the car will search for a parking spot and park itself; tapping a button on the owner's phone recalls the car for pickup. This functionality relies on eight cameras placed around the car, giving Tesla 360° visibility at up to 250 meters of range. Twelve ultrasonic sensors complement this view, detecting soft and hard objects at almost twice the distance of the previous system, and a forward-facing radar cuts through fog, heavy rain, and dust, and can even see beyond the car directly ahead, providing additional data about the world around the vehicle.
Uber has tested its self-driving cars, intended to pick up passengers and take them where they need to go, like the rest of Uber's services. Uber first intended to test the cars in San Francisco, California, assuming that the self-driving regulations and the $150 permit required for self-driving did not apply, since a driver would be in the seat during testing. California, however, disagreed and told Uber that it needed the permit. In response, Uber pulled the testing and went to Arizona instead, where the governor welcomed the company and asked to be the first test rider in February 2017. The state overall has not regulated self-driving technology, allowing testing to happen freely. On March 18th, 2018, one of Uber's self-driving cars was involved in a high-speed crash in Tempe, Arizona. The car hit a pedestrian crossing the street; she was rushed to the hospital and later died. Uber pulled all tests while the crash was being investigated. The crash resulted in serious scandal and questions about fault, though Yavapai County officials have since determined that Uber bears no criminal liability. Some see this as an argument for self-driving cars, saying that they are safer than cars driven by humans since humans can be unpredictable. Others say that, as a ride-sharing company, Uber needs to figure out self-driving cars in order to stay relevant, keep up with advancing technologies, and continue being a successful company.
GM and Lyft
General Motors (GM) is also working on producing its own self-driving cars and hopes to put as many as three hundred more autonomous vehicles on the road, which would give GM what appears to be the biggest fleet of self-driving cars in the United States. Its first release of autonomous cars would be an on-demand ride-share network. GM has partnered with Lyft, a ride-sharing company, to produce self-driving Bolts to be used for ride-sharing purposes. Lyft plans for the majority of its rides to be in self-driving cars by the year 2021, according to its president and co-founder, John Zimmer. Zimmer says that it is unclear who would own these cars, as there will be no drivers for them. He is also unsure of the role governments will play in the regulation of autonomous vehicles (especially autonomous ride-share vehicles), but he believes that driverless cars are the future of ride sharing -- and that ride sharing is the future of driving. Zimmer claims that, by 2025, personal car ownership in the United States will be virtually nonexistent and nearly everyone will be using driverless ride-share cars. The two companies hoped to deploy thousands of the self-driving Bolts in 2018, which may be the largest test of self-driving cars before 2020.
In 2017, Toyota revealed its own take on the self-driving car, the Concept-i. It is a compact car that features an artificial intelligence system named Yui. This system is meant to make the driver and the car safer while still allowing the driver to maintain control of the car. Yui does this by measuring human emotion and responding with calming techniques such as turning on the radio or maintaining a conversation with the driver. The driver remains in control unless there is an impending accident, in which case Yui will take control of the car. This is a very different system from those of many of Toyota's competitors, as it is far from a fully autonomous vehicle, opting instead for Level 2 autonomy. The system is meant to combine the idea of autonomous vehicles preventing crashes with the idea of a robot that can assist people at home, such as Amazon's Alexa.
Although Google has been able to make its autonomous cars street legal, a variety of safety concerns have arisen over self-driving vehicles sharing the road with traditional human-driven vehicles. Since 2009, Google's self-driving cars have been involved in a total of sixteen crashes, mostly fender-benders. All sixteen crashes were attributed to human error; however, in a 2015 report, the New York Times wrote that the cars' tendency to "drive by the book" led to overly cautious maneuvering to prevent collisions, often confusing other human drivers on the road and contributing to the subsequent accidents.
In 2015, Tesla Motors enabled an autopilot feature through a software update to existing cars. While CEO Elon Musk claimed that the feature would result in radical improvements to vehicle safety, the company failed to prevent drivers from abusing it. While in autopilot mode, the company recommends that drivers keep both hands on the steering wheel. However, additional constraints had to be enforced after a slew of YouTube videos emerged of drivers using the autopilot feature without their hands on the wheel. Currently, regulators have banned Tesla from activating the feature in Hong Kong, and other countries may soon follow suit.
In 2016, a man named Joshua Brown died on May 7th when he crashed into a tractor-trailer while driving his Tesla in autopilot mode. Tesla was under investigation following the fatal crash, but the company was cleared of fault in the design. After the incident, Tesla released new versions of the autopilot software that gave drivers more frequent warnings to keep their hands on the steering wheel.
In 2018, an Uber autonomous vehicle operating without driver control struck and killed a woman in Arizona, and soon after, a passenger in a semi-autonomous Tesla was killed in a crash that occurred even with the driving software engaged.
Computer ethicists and informed citizens alike are closely following the development of autonomous cars. With the transition from automated to autonomous, there are an increasing number of decisions that must be made by designers.
Much of the legislation regarding autonomous vehicles has related to responsibility. The question is whether consequences from autonomous cars should fall on manufacturers, owners, or others. This debate is complicated further when cars are considered autonomous agents with the ability to learn and change themselves. Product liability law, which places responsibility for defects, injuries, and other unforeseen negative effects of sold products on the entities involved in manufacturing or selling them, is one avenue of research on this dilemma. Because autonomous cars are tangible products, these laws may provide a framework for how responsibility can be regulated.
Bryant Walker Smith, an assistant law professor at the University of South Carolina, was quoted in April 2017 saying that the level of automation today is at two or below and that drivers are responsible for crashes in these cases. He went on to say that liability for crashes will move from driver to vehicle as the level of automation rises. Many experts believe that when a computer takes over the role of the driver, the companies that own the computer and programming will be held liable instead of the human driver and their insurance. Many auto companies, however, want to keep humans liable instead of themselves. General Motors' Super Cruise, for example, requires that the driver stay alert and prepared to take over the steering wheel at any time, reducing the liability on General Motors.
In May 2016, Joshua Brown got into a fatal accident using Tesla's autopilot system when the system hit a tractor-trailer in front of his car. In response to Brown's accident, Tesla stated that the vehicle's autopilot was functioning correctly and that it is meant to keep the car in its lane and adjust its speed; human drivers are still responsible for staying vigilant about their surroundings and taking over control of the car if they are entering an unsafe situation. On the other hand, others argue that human responsibility should not be a factor when dealing with autonomous vehicles, and that this factor can be eliminated by removing the pedals and steering wheel from autonomous cars. Such a change in design would require a substantial improvement in the AI technology that operates autonomous vehicles, since humans would have no option to take over in an unsafe situation.
Car Computer Security
As internet-connected cars become more mainstream, automakers are challenged to ensure that users are safe from car hacking. Mainstream vehicles, such as Teslas, are constantly connected to the internet. Tesla also plans to make all of its cars autonomous in the near future and is currently installing the hardware necessary to do so. It is currently possible to connect to the Tesla API and remotely control a Tesla while it is running. Some of the capabilities that hacking the Tesla API allows are turning on the lights, honking the horn, turning off the car, and giving faulty warnings to the driver. Although the seriousness of these hacks varies, they shed light on a more serious issue: users' ability to trust Tesla. In light of private researchers demonstrating a hack on a 2014 Jeep Grand Cherokee that disabled the brakes of the moving vehicle, the FBI warns that car hacking will be a real risk in the future, yet our laws will be unable to change in time to offset this widespread capability. Waymo, an autonomous car company owned by Alphabet (the parent company of Google), has pledged that its cars will connect to the internet only occasionally and will try to avoid doing so. While this is a security precaution, it does not fully ward off potential hackers: as long as autonomous vehicles can connect to the internet, there is a possibility of the cars being breached, and the risk of hacking never completely diminishes. Proposed autonomous vehicles that do not require user involvement, and therefore lack pedals and a steering wheel, are at greater risk, because a passenger would have no way to control the car at all if it were attacked and remotely controlled by a hacker. As long as cars, especially autonomous cars, connect to the internet, manufacturers will be held accountable for developing secure systems to ensure consumer safety.
The risks that hacking poses to autonomous cars could be lethal.
Impacts on employment
Some economists fear that the adoption of autonomous vehicles will lead to severe, and potentially problematic, unemployment. In the United States, over 4.4 million Americans work as drivers, and truck driving is among the most common occupations in the country. Autonomous vehicle technology threatens these jobs, and many people worry about what will happen when they are replaced.
While some believe that any worry of incoming technological unemployment is a Luddite-fallacy, others are not convinced. Rather, many people are concerned that the rate at which jobs will be lost is unlike what economies have experienced in previous periods of technological improvement. Internet entrepreneur and Tesla CEO, Elon Musk, believes that in order to address the incoming technological unemployment that autonomous vehicles will create, economies will need to adopt policies such as universal basic income.
MIT Moral Machine
MIT Moral Machine is a project sponsored by MIT's Media Lab that poses various ethical dilemmas to gather public opinion on how autonomous vehicles should behave in life-or-death scenarios. The underlying dilemma is: given that an autonomous car must crash, whose life or lives should it save? For example, if the car must kill either a pedestrian or its passenger, how should it choose?
From their website, the goals of Moral Machine are “1) building a crowd-sourced picture of human opinion on how machines should make decisions when faced with moral dilemmas, and 2) crowd-sourcing assembly and discussion of potential scenarios of moral consequence.”
In response to the question of whether an autonomous vehicle should choose to save a passenger or a pedestrian, some argue that the vehicle should choose to protect its passenger. In October 2016, Mercedes Benz executive Christoph von Hugo commented that, in the future, Mercedes autonomous cars would choose to save its passenger. Their reasoning is that “crash situations, he says, are so chaotic that an A.I. should prioritize the lives it has most direct control of.” 
Another argument in favor of saving the passenger is that people are less likely to use autonomous vehicles if they know that the vehicle won't choose to save their own life. In a paper published in the journal Science, psychologists found that, of the 2000 respondents to a survey, most would not be willing to use autonomous cars in this case. If drivers were unwilling to adopt autonomous cars for this reason, society would not achieve the reduction in driving-related accidents that autonomous vehicles promise.
An argument in favor of saving the pedestrian, or other surrounding parties, instead of the passenger is that it is ethically more just to do so: passengers choose to ride in the autonomous vehicle and are ethically responsible for the risks and costs associated with a car crash. Another argument is that a passenger in a crashing car has a better probability of survival than a pedestrian who is struck. Overall, there seems to be agreement among most people that, from a moral standpoint, autonomous cars should save the greatest number of people possible. Such ethical inconsistencies could become a barrier to the adoption of self-driving cars.
How Autonomous Cars Make Decisions
While AI decision making cannot fully replace that of a human, it is estimated that 90% of car accidents are caused by human error. Autonomous vehicles make decisions based on the speed they are traveling, the road and weather conditions, distance, and other data gathered by sensors. The car will make a calculation based on how fast an object is traveling toward it, where the challenge is processing the calculation quickly enough to avoid the dangerous circumstance. While the human element of control is flawed and imperfect, many still prefer to have the decision of how to react in their own hands; however, this could mean being responsible for the death of someone else instead of having a car be responsible. Humans do not have time to think about the outcomes of their decision, reacting instinctively regardless of whether there could be another way to swerve the wheel and save more lives. With the information gathered by the car from its different sources, choosing a route that avoids killing more people over a route that endangers more people will be a byproduct of these calculations rather than a direct thought of the algorithm in and of itself.
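A minimal sketch of the kind of calculation described here, comparing the gap to an obstacle against the distance needed to stop; the deceleration and latency values are chosen purely for illustration, not drawn from any real system:

```python
# Sketch: given speed, obstacle distance, and closing speed, estimate
# time-to-collision and whether braking in time is possible.
# max_decel and reaction_s are illustrative assumptions.

def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if nothing changes (inf if not closing)."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def can_stop_in_time(speed_mps: float, distance_m: float,
                     max_decel_mps2: float = 8.0,
                     reaction_s: float = 0.1) -> bool:
    """Compare stopping distance (latency gap + v^2 / 2a) to the free gap."""
    stopping_m = speed_mps * reaction_s + speed_mps ** 2 / (2 * max_decel_mps2)
    return stopping_m <= distance_m

# A car at 20 m/s (72 km/h) with a stationary obstacle 40 m ahead:
print(time_to_collision(40.0, 20.0))  # 2.0 seconds
print(can_stop_in_time(20.0, 40.0))   # True: needs 2 + 25 = 27 m of the 40 m
```

The point of the sketch is the one the text makes: the arithmetic itself is trivial, and the real engineering challenge is sensing and computing it reliably within that two-second window.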
The trolley problem, or trolley dilemma, is a thought experiment used by ethicists and designers across many different disciplines. While there are a number of variations on how the story goes, the main idea is that a runaway trolley is headed down a track to which five people are tied. You, however, are standing next to a switch that you could throw to reroute the trolley onto a track with only one person tied to it. You can do one of two things: do nothing and let the trolley kill five people, or throw the switch, which would result in only one person being killed. An analogous decision will have to be made by the designers of autonomous vehicles.
There are many different scenarios that designers will need to take into account. Consider a 10-year-old running in front of an autonomous car: the car can either hit the child, or it can swerve and crash, injuring or killing the person in the vehicle. Questions like these will need to be answered.
While we might be faced with the same problem while driving an ordinary car down the road, it is generally recognized that there is a difference between ordinary cars and driverless cars. In a normal car, this would be a split-second decision made by the driver; with autonomous cars, the decision is far from split-second. This leads to the ethical question of who will be making the decision of which path our cars should take.
Chris Urmson, who leads Google's self-driving car project, notes that Google's driverless cars will not be able to determine the most ethical person to hit in a collision; he argues that it is impossible to ethically determine one person's worth over another. Instead, Google's driverless cars are engineered to first avoid pedestrians and bicyclists, then other vehicles, and then immobile objects.
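The reported avoidance ordering can be expressed as a simple cost ranking. The categories and cost values below are illustrative, not Google's actual implementation:

```python
# Sketch of the priority ordering reported for Google's cars: avoid
# pedestrians and bicyclists first, then other vehicles, then immobile
# objects. Cost values are arbitrary; only their ordering matters.
COLLISION_COST = {
    "pedestrian": 3,       # highest cost: avoid first
    "bicyclist": 3,
    "vehicle": 2,
    "immobile_object": 1,
    "none": 0,             # a clear path is always preferred
}

def choose_path(paths):
    """Pick the path whose obstacle category carries the lowest cost.

    `paths` maps a path name to the category of obstacle on that path.
    """
    return min(paths, key=lambda p: COLLISION_COST[paths[p]])

options = {"swerve_left": "vehicle",
           "straight": "pedestrian",
           "swerve_right": "immobile_object"}
print(choose_path(options))  # swerve_right
```

Note that this kind of fixed category ranking is exactly what Urmson describes: it avoids weighing one person's worth against another's by never comparing individuals, only broad classes of obstacle.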
Drinking and Driving
Current laws do not address the role autonomous vehicles play in the consumption of alcohol. It is still undetermined whether fully autonomous vehicles will, by law, require an unimpaired, technically skilled driver in the vehicle, or whether all occupants will simply become passengers.
Many experts believe that the introduction of autonomous vehicles will change the automotive industry from a goods-based economy to a service economy. Rather than owning a car, users will own a subscription to an automotive company. This essentially "uberizes" the market: users will request rides, and a car will come pick them up, drop them off, and then proceed to serve another customer. Experts believe this will have dramatic effects on parking services, as cars in constant use would negate much of the need for parking. Experts also believe that this could have severe consequences for traffic, likely dramatically increasing congestion.