Autonomous Vehicles
Driverless vehicles, or autonomous vehicles as they are formally known, are a major topic of interest these days.
A self-driving vehicle offers many advantages to the public, especially to the elderly and people with
physical impairments who otherwise would not be able to drive. There is also the fact that autonomous vehicles do not get drunk or drowsy, and are not distracted by cell phones or children in the back seat. In principle, they should therefore reduce the incidence of accidents and injuries to occupants.
occupants. Another advantage suggested by the concept’s supporters is that by controlling speed and
vehicle spacing patterns on highways, better overall fuel economy can be achieved and CO2 emissions
reduced.
So who is working on autonomous vehicles and when can you expect to be able to buy one? The answer
depends on how you define an autonomous vehicle. Google has been actively promoting and testing
vehicles that can drive themselves for the last several years. However, they can currently test their vehicles legally on public roads only in California, Florida, Michigan and Nevada, subject to speed restrictions and a requirement that the vehicles be clearly labeled as test vehicles. Other states are considering similar legislation. From an international perspective, starting in January 2015 driverless vehicles will be allowed to drive in yet-to-be-designated cities in the United Kingdom, once again for testing purposes
only. Other European countries, as well as Japan and China, have enacted similar legislation.
The situation changes if you want a vehicle that can steer itself while you are in the driver’s seat in case
of an emergency. Mercedes-Benz now offers its 2014 S-Class (S 500) with a self-drive option costing $2,800 to $3,000. The version sold in the United States automatically adjusts the steering wheel so
that the vehicle stays in the center of the lane. Similar systems are now offered by Infiniti, Lexus and
Acura. The option for the S Class sold in Europe also self-drives at speeds under 6 mph, and can apply
the brakes to avoid a collision. Mercedes-Benz leads the pack of automotive companies in terms of test
experience. The company says its research on autonomous driving goes back to 1986, and that by 1994
they had developed technology allowing a vehicle to change lanes, pass another vehicle, and maintain a
safe driving distance between vehicles. Mercedes-Benz and other automotive companies, most notably BMW and Volvo, expect to offer such features incrementally over the next two years. Others, including GM, Ford, Chrysler and the other major European and Japanese companies, expect to see vehicles
with full autonomous driving capabilities on the road by 2020.
The Google story is quite different, and began in 2005, when an autonomous vehicle named Stanley won that year's Grand Challenge sponsored by DARPA. The vehicle completed a treacherous 132 mile desert course
in 6 hours and 54 minutes. Stanley was developed by the Stanford University Artificial Intelligence
Laboratory, and key members of the project later joined Google, along with robotics engineers from
another competitor in the race, Carnegie Mellon, to develop the current generation of Google
autonomous vehicles. The latest rendition of the Google vehicle has no steering wheel, accelerator or
brakes so that the occupant must totally rely on the control system. Google has collected more test data
in terms of miles driven on public roadways than even Mercedes-Benz, and they claim no accidents have
occurred to date. The unique feature of the Google approach is the use of a device called a Lidar (laser
radar) that uses 64 rotating lasers mounted on the roof of the vehicle to gather very accurate
information about what is on the road and what the local terrain looks like. The Lidar was first used on
the Stanley vehicle. The images are very data intensive, and the data is processed both on board the vehicle and by a remote computer connected by a high-speed data link. Images are constantly compared
to data collected from Google’s street map database. The Lidar is produced by a California based
company, Velodyne, and costs a staggering $70,000. When asked whether the Lidar can be replaced with something less expensive, Google's answer is that they are working on it. Recently a German company named Ibeo stated that it can bring the cost down to $250 per vehicle when produced in high volume. Google's future seems to be aimed at collaborations with automotive companies such as those it now has with Kia and Hyundai.
For a vehicle to be truly autonomous requires, at a minimum, that the vehicle's control system know where the vehicle is and where it is going, what lane it is in, which other vehicles are in close proximity to it, and what speed they are travelling at. The vehicle must
also be able to identify the status of traffic lights and identify road signs such as stop, yield or merge,
and identify if bicycles, pedestrians or animals such as deer or farm animals are on the road. The vehicle
then must use this data to safely navigate toward its destination.
Enhanced GPS positioning is the key means of determining the position of the vehicle on its designated
route. GPS by itself is not sufficient given that contact with a satellite GPS signal may not always be
available at all locations due to weather conditions or local road terrain. By using information provided
by on-board linear and angular accelerometers, the vehicle's position can be accurately updated between GPS fixes. Google also acknowledges that GPS can be inaccurate and uses Lidar data to correct GPS errors. Route information is
supplied by a map service provider such as Google or Nokia, where the data is downloaded to the
vehicle several times per minute. The downloaded data presumably includes the locations of
intersections and traffic lights located along the route, so that the vehicle knows ahead of time that an
intersection is ahead and can calculate when the vehicle will reach it.
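To make the idea of bridging GPS dropouts concrete, the sketch below shows a highly simplified dead-reckoning update in Python. The class name, the flat 2-D world, and the noise-free sensors are illustrative assumptions made for this article, not a description of any production navigation system.

```python
# Minimal sketch of dead reckoning between GPS fixes, assuming a flat 2-D
# world and idealized, noise-free sensors. Names and the simple integration
# scheme are illustrative, not taken from any real vehicle.

import math

class PositionEstimator:
    def __init__(self, x, y, heading_rad, speed):
        self.x, self.y = x, y          # position in meters
        self.heading = heading_rad     # heading in radians
        self.speed = speed             # speed in m/s

    def update_from_gps(self, gps_x, gps_y):
        """When a GPS fix is available, trust it and reset the estimate."""
        self.x, self.y = gps_x, gps_y

    def dead_reckon(self, accel, yaw_rate, dt):
        """When GPS is lost, integrate the linear accelerometer and the
        angular (yaw) rate sensor to propagate the last known position."""
        self.speed += accel * dt
        self.heading += yaw_rate * dt
        self.x += self.speed * math.cos(self.heading) * dt
        self.y += self.speed * math.sin(self.heading) * dt

# Example: two seconds without GPS while accelerating gently through a curve.
est = PositionEstimator(x=0.0, y=0.0, heading_rad=0.0, speed=20.0)
for _ in range(20):                     # 20 steps of 0.1 s
    est.dead_reckon(accel=0.5, yaw_rate=0.02, dt=0.1)
print(round(est.x, 1), round(est.y, 1)) # propagated position estimate
```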
Lane identification and positioning are two important tasks that must be addressed by a robotic control system. Lane positioning is usually determined by analyzing in real time where the vehicle is relative to
the lane markers painted on the roadway, typically using two stereoscopic cameras located in the front
of the vehicle, and making small adjustments to the steering wheel angle. As previously mentioned, this
feature is now part of advanced cruise control systems offered by several automotive suppliers.
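A minimal sketch of the steering-correction idea follows, assuming the camera system already reports the vehicle's lateral offset from the lane center and its heading error; the gains and limits are illustrative placeholders, not values used by any supplier.

```python
# Minimal sketch of lane centering as a proportional steering correction.
# Positive offset/heading error means the vehicle is to the right of and
# pointed to the right of the lane center; gains are illustrative only.

def steering_correction(lateral_offset_m, heading_error_rad,
                        k_offset=0.05, k_heading=0.5, max_angle_rad=0.05):
    """Return a small steering angle adjustment (radians) that nudges the
    vehicle back toward the lane center."""
    angle = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    # Clamp to a small correction: lane keeping systems make gentle
    # adjustments rather than aggressive maneuvers.
    return max(-max_angle_rad, min(max_angle_rad, angle))

# Vehicle drifting 0.3 m right of center, pointed slightly further right:
print(steering_correction(0.3, 0.01))   # small corrective angle to the left
```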
Collision avoidance is also a major design issue for an autonomous vehicle. Sensors mounted on the
front and side of the vehicle are typically used to detect the presence of other vehicles on the roadway,
particularly those in close proximity that are approaching on a trajectory that may lead to a collision.
Both short and long range radar are now in use. For example, the 2014 Mercedes-Benz S-Class has short
and long range radar on the front and rear of the vehicle and short range radar on the sides. These
sensors not only measure distance but also the speed and direction of approach. The radar sensors are complemented by twelve ultrasonic sensors located on the sides and rear of the vehicle. Their function is to measure the short-range proximity of objects, detect objects behind the vehicle when backing up, and assist in self-parking operations. Current generation systems simply alert the driver of a possible
collision, and in some situations apply the brakes. The automotive companies have also implemented
algorithms to eliminate false positive conditions and eliminate unwanted and distracting warning
signals.
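The distance and closing-speed measurements described above translate naturally into a time-to-collision calculation. The following sketch shows that logic under simplified assumptions (constant speeds, purely longitudinal geometry); the warning and braking thresholds are illustrative, not taken from any production system.

```python
# Minimal sketch of a forward collision warning decision based on radar
# range and closing speed. Thresholds are illustrative assumptions.

def time_to_collision(range_m, closing_speed_mps):
    """Seconds until contact if neither vehicle changes speed; infinite if
    the gap is opening."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def collision_response(range_m, closing_speed_mps,
                       warn_ttc=2.5, brake_ttc=1.0):
    ttc = time_to_collision(range_m, closing_speed_mps)
    if ttc < brake_ttc:
        return "apply brakes"
    if ttc < warn_ttc:
        return "warn driver"
    return "no action"

# 30 m behind a slower vehicle, closing at 15 m/s:
print(collision_response(30.0, 15.0))   # "warn driver" (TTC = 2.0 s)
```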
The autonomous vehicle is designed to take corrective action by changing speed or direction. Mercedes-Benz already implements hands-free driving at low speeds, such as in heavy traffic congestion, in its 2014 S-Class.
Another important role for radar and ultrasound is to identify crash events that cannot be avoided. Crash detection would then be predictive rather than reactive. Crash severity could also be better estimated, so that seatbelt pre-tensioners could be deployed much earlier and airbag systems deployed in a far more effective way than today's on-board systems allow. Braking could also be initiated to reduce crash severity. The benefits may be even more important for side impact crashes, where current generation detection times are limited to around 10 to 15 milliseconds and inertial sensor coverage of the side is limited to the region between the A and B pillars, whereas ultrasound sensors would respond even to impacts at extreme frontal or rear locations.
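The predictive deployment idea can be sketched as follows, under the assumption that the radar and ultrasonic sensors already supply a time to impact and a closing speed for an unavoidable crash. The thresholds and actions are illustrative placeholders, not the logic of any actual restraint controller.

```python
# Minimal sketch of a predictive pre-crash trigger: once sensors indicate an
# unavoidable impact, estimate severity from closing speed and arm the
# restraint systems before contact. All thresholds are illustrative.

def predicted_severity(closing_speed_mps):
    """Crude severity classification from closing speed alone."""
    if closing_speed_mps > 15.0:
        return "high"
    if closing_speed_mps > 7.0:
        return "moderate"
    return "low"

def pre_crash_actions(time_to_impact_s, closing_speed_mps):
    actions = []
    if time_to_impact_s < 0.6:                       # impact imminent
        actions.append("pre-tension seatbelts")
        actions.append("pre-charge brakes")
        if predicted_severity(closing_speed_mps) == "high":
            actions.append("stage airbag deployment parameters")
    return actions

print(pre_crash_actions(0.4, 18.0))
```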
The other key task is identifying road signs, traffic lights and objects on the road, including pedestrians.
United States patent application US 2014/0016826 A1, titled “Traffic Signal Mapping and Detection,” was published in early 2014 with Google as the assignee. It discusses the use of frontal stereoscopic cameras in identifying the status and
location of traffic lights by scanning the pixels in real time and looking for red, yellow or green “blobs” in
the pixel map and not confusing them with the tail lights of other vehicles. This is a very complex task in
terms of pattern recognition. Not covered in the patent is applying the same methodology to detect
people or other objects including fallen tree branches or truck tire carcasses. Google and every
automotive company offering an autonomous vehicle must solve this problem.
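As a rough illustration of the “colored blob” idea, the fragment below thresholds an RGB image for bright, strongly red pixels and reports whether a plausibly sized red region is present. Real systems additionally use the mapped position of each light, stereo geometry, and tracking over time; the threshold values here are arbitrary illustrative choices.

```python
# Highly simplified sketch of red-blob detection in a camera frame.
# Thresholds and the synthetic test image are illustrative only.

import numpy as np

def find_red_blob(image, min_pixels=20):
    """image: H x W x 3 uint8 RGB array. Returns True if enough bright,
    strongly red pixels are found to plausibly be a red signal lamp."""
    r = image[:, :, 0].astype(int)
    g = image[:, :, 1].astype(int)
    b = image[:, :, 2].astype(int)
    red_mask = (r > 180) & (r - g > 80) & (r - b > 80)
    return int(red_mask.sum()) >= min_pixels

# Synthetic 100x100 frame with a 6x6 bright red patch (a fake signal lamp):
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:46, 50:56] = (255, 30, 30)
print(find_red_blob(frame))   # True
```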
There is no doubt that autonomous vehicles will also cause accidents in the future, but knowing what we do now, what are the likely causes? The list below contains the most likely candidates.
Software Related Issues – According to an article in MIT Technology Review, there are currently around
100 million lines of code in each vehicle sold today, including entertainment systems. To put this
number in perspective there are 8 million lines of source code supporting the new F-35 fighter jet plane.
How many additional lines of code will be required for autonomous vehicles? There are no good
estimates, but it may be in the range of 10 million. The Stanley from the DARPA sponsored race only had
100 thousand lines of code, but its mission was far less complicated. The most likely major problem is software bugs; consider the recent revelation that faulty software caused accelerators in the 2005 Toyota Camry to stick. The Toyota problems arose because the overall software architecture
did not comply with industry guidelines. The other problem is checking embedded algorithms for
formulation mistakes or incorrect algorithm parameters.
Map Data Errors – It is difficult to understand what type of errors may exist in the road maps compiled
by Google, Nokia and others, but it is highly likely that they do exist. Google now asks its customers to
report mistakes on a hotline. A recent incident in Iowa where a Google map survey vehicle entered a
street going the wrong way and caused an accident illustrates the problem. The cause of such an error may simply be that a township changed a street's direction, or installed a traffic light at an intersection, after Google made its survey.
However, if the data is being used by an autonomous vehicle the results could be catastrophic.
Component Malfunction – Presumably the sensors and controllers associated with autonomous vehicles have diagnostic codes indicating a malfunction, and this data can be used to generate an error signal that is transmitted to the occupant, as is the case with other systems such as an airbag system
failure. However, the problem could be that, for example, a camera or radar unit is out of alignment
because of a minor collision. Accumulated dirt on the front of sensors can also impact performance.
Here again the autonomous vehicle is much more susceptible to error than a vehicle with a driver.
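A minimal sketch of the diagnostic idea: each sensor reports a health state, and any fault both warns the occupant and forces the vehicle out of autonomous mode. The sensor names, fault states, and fallback behavior are illustrative assumptions, not an actual diagnostic trouble-code scheme from any manufacturer.

```python
# Minimal sketch of sensor self-diagnosis and fallback. Names and fault
# states are illustrative placeholders.

SENSOR_FAULTS = {
    "camera_front": "misaligned",     # e.g. after a minor collision
    "radar_front":  "ok",
    "ultrasonic_7": "blocked",        # e.g. accumulated dirt or ice
}

def evaluate_sensor_health(faults):
    problems = [f"{name}: {state}" for name, state in faults.items()
                if state != "ok"]
    if problems:
        return {"mode": "driver takeover requested", "warnings": problems}
    return {"mode": "autonomous", "warnings": []}

print(evaluate_sensor_health(SENSOR_FAULTS))
```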
Weather Conditions – There are already reports that camera performance can be impacted by snow. A
recent news item stated that the lane control system in some Kias is adversely affected by snow. Google
has even had to admit that they have never tested their vehicles in snow storms. Presumably fog can
also impede performance. Strangely, there are no similar reports about European vehicles, for which driving in snow conditions is inevitable.
Unforeseen Events – It is virtually guaranteed that road events will occur for which, although they may have been foreseen by the engineers, there is no feasible solution to offer other than having the occupant
assume control of the vehicle. A situation as simple as a policeman guiding vehicles along an unmarked
temporary detour or a very recent bridge washout with no warning signs would most likely totally
confuse the system. These types of events represent a major stumbling block to the current Google
vehicle design. Other events may occur too quickly for the vehicle to respond such as avoiding a vehicle
running a red light at high speed at an intersection with limited lateral visibility.
What is not discussed is the impact that an intelligent roadway system might have on autonomous vehicles. If stop lights could communicate to an approaching vehicle what their status will be when the vehicle reaches the intersection, the control system's recognition task could be greatly simplified. Another facet of the intelligent highway is that vehicles could communicate with each other. These systems are technically possible, but given the financial problems of the Federal Highway Trust Fund it seems unlikely that such infrastructure will be in place by the automotive companies' target date of 2020.
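The simplification that signal-to-vehicle communication would bring can be illustrated with a short sketch: the intersection broadcasts its current phase and the time remaining in that phase, and the vehicle predicts whether it will arrive on green. The message format below is an invented placeholder, not an actual vehicle-to-infrastructure standard such as the SAE signal phase and timing (SPaT) message.

```python
# Minimal sketch of a traffic-signal-to-vehicle message and the vehicle's
# arrival-on-green prediction. Fields and logic are illustrative only.

from dataclasses import dataclass

@dataclass
class SignalPhaseMessage:
    intersection_id: str
    phase: str               # "red", "yellow", or "green"
    seconds_remaining: float

def arrives_on_green(msg, distance_m, speed_mps):
    eta = distance_m / speed_mps
    if msg.phase == "green":
        return eta <= msg.seconds_remaining
    if msg.phase == "red":
        # Assumes the light goes directly from red to green.
        return eta >= msg.seconds_remaining
    return False             # treat yellow conservatively as a stop

msg = SignalPhaseMessage("Main_and_5th", "red", 8.0)
print(arrives_on_green(msg, distance_m=200.0, speed_mps=20.0))  # True
```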
Background of Author
Ralph L. Hensler has a PhD in Physics granted under a joint program with Rutgers University and Bell
Laboratories. He then spent 16 years in a technical management capacity at the former engineering
company Ebasco Services working on projects including nuclear power plants, nuclear fusion reactor
design, solar energy, advanced magnetic power conversion for space satellites and superconducting
magnetic energy storage. In 1989 he joined Breed Technologies where he served as a Director of R&D
working on automotive crash detection, algorithm development and advanced airbag design concepts.
After leaving Breed he worked with a variety of startup companies specializing in two-phase flow
applications for cleaning water treatment systems, batteryless RFID tags, underwater energy storage
and most recently, a portable water treatment system for military Special Forces located in remote
locations. He has also served as a consultant and expert witness on several automotive product liability
lawsuits and patent infringement lawsuits. He has twenty publications covering a wide range of topics.