
Conference Session A8
Paper # 216
Submission Date 2.10.2017
Disclaimer—This paper partially fulfills a writing requirement for first year (freshman) engineering students at the University
of Pittsburgh Swanson School of Engineering. This paper is a student, not a professional, paper. This paper is based on publicly
available information and may not provide complete analyses of all relevant data. If this paper is used for any purpose other
than these authors’ partial fulfillment of a writing requirement for first year (freshman) engineering students at the University
of Pittsburgh Swanson School of Engineering, the user does so at his or her own risk.
DETECTION SYSTEMS ON AUTONOMOUS VEHICLES
Rob Colville, [email protected], Sanchez 5pm, Devon Grupp, [email protected], Vidic 2pm
Abstract- In this paper we aim to show how a system of cameras and sensors can be integrated to map 3D environments and detect surroundings effectively enough to serve as a safe and viable mechanism for operating self-driving cars. People around the world have dreamt for decades about self-operating machines and what they could do for society. That dream is now becoming reality, as companies invest in and design machines elegant and intelligent enough to take the wheel from humans. The key is utilizing color cameras, lasers, radar, and ultrasonics to create a virtual image of the surroundings. The computer system takes in the same second-by-second information a human driver would have, processes it, and drives the car based on the information gathered by all of the detection apparatuses. We aim to build a deeper understanding of how radar and LiDAR sensors work and how the computer system turns all of their information into decisions. We also look into possible alternatives and other technologies that could be implemented in the future. In this paper we hope to convey our research on the processing systems in autonomous vehicles and how they will change the industry, for better or worse.
Key words- Autonomous vehicles, 3D imaging, LiDAR, Self-driving cars, Image processing
RELEVANCE TO TODAY
The self-driving car is no longer a thing of science fiction.
It is possible, it is here, and it is already beginning to integrate
into people's lives. Tech giants like Google, Tesla, and Uber
see the self-driving car as the future of the automotive
industry, and each is spending millions of dollars on
developing the systems that make this vision possible.
Different companies have different ideas of exactly how to
meet this end, but each of their designs employs the same
overall principle: a suite of sensors on the exterior of the car
gathers environmental data and relays it to a central computer,
which processes the data and uses it to make real-time
decisions, just as a human driver would. Some components of
the self-driving car are cutting edge, while others have existed
for some time. It is the integration of these sensors with 3D
mapping and decision-making algorithms that is now being
perfected by designers, and that will finally allow vehicles to
drive themselves.
To witness first-hand the first instances of the self-driving car
in the real world, all one has to do is walk the streets of
Pittsburgh, where ride-hailing company Uber has unleashed a
fleet of fully functional prototype autonomous taxis [1]. If this
technology reaches its full potential, people may look back on
this past year as the year the 'modern' car was invented, a truly
paradigm-shifting eventuality. This point in time stands to be
the boundary between an old and a new era in human
transportation.
This paper outlines how innovations in sensor and 3D
mapping technology are making the self-driving car possible.
By describing the functionality of two of the most important
sensors, as well as the computer used to process their data, it
shows how the self-driving car can take the place of a human
driver and how the world will change with the coming of the
autonomous vehicle.
HISTORY
For decades, people have dreamed of technological advances
that would make life easier. The first autonomous vehicle dates
all the way back to 1478, when Leonardo da Vinci designed a
self-propelled cart that could travel and complete a
predetermined course. He used coiled clockwork springs and
certain control mechanisms to make the invention possible [2].
Although it was able to operate by itself, it could only carry
out a single function and was incapable of taking in any
information from its surroundings. The next evolution of this
technology came in 1866, when Robert Whitehead, working
for the Austro-Hungarian Navy, invented the self-guided,
self-propelled torpedo [2]. The torpedo proved critical during
World War II, sinking over 1,500 ships and causing the deaths
of over 200,000 men. The Germans followed with the V-2
ballistic missile, first launched against London in 1944. This
self-guided missile traveled at speeds up to 3,600 mph and
delivered a 2,200-pound warhead onto its targets [2].
Nearly fifty years later, in 1995, the United States
introduced the "Predator," a remotely operated drone, which
beginning in 2001 was used for strikes against enemies of the
United States. This computer-guided machine could be
controlled from hundreds of miles away to perform
reconnaissance and other important
tasks for the CIA [2]. These drones are fitted with various types
of cameras, targeting systems, and programmable autopilot
systems, and can do a considerable number of things on their
own. Still, they relay all data back to a human operator and do
not carry out any action until the operator gives a command.
In this way the drone represents the last hurdle for autonomous
vehicles: the technology to carry out functions exists, but the
decision making has always been entrusted to humans.
Today's cars already have self-braking, lane-departure
warning, backup warning, blind-spot alert, and parallel-park
assist systems. Companies have been adding these detection
systems for years to help drivers avoid accidents and improve
safety on the road, and these systems already employ a few of
the sensors now being built into the autonomous vehicle. All
of these systems are in effect, but until now there had been no
successful attempt to combine them into a completely
self-driving car capable of operating on public roads.
Developers disagree on which of these sensors are best to
use and which are unnecessary. While LiDAR has been
championed by companies such as Google and Uber, it has
been famously dismissed by Tesla. Although radar and LiDAR
are both meant to produce the same type of readings,
differences in price and versatility have made the choice
between them controversial [4].
Both radar and LiDAR are based on the same principle.
In the simplest terms, they work much like a bat using
echolocation: they send out a signal and measure the time it
takes to come back. Because they measure something
generated by the system itself, they are considered active
remote sensors [5]. This is a much more accurate form of data
collection than that of passive sensors, which only detect
inputs already present in the environment, such as sunlight [5].
For active sensors, the initial emission of the wave is called a
pulse, and the recording of the wave coming back to the sensor
is called a return [5]. By emitting a wave and recording the
time it takes the pulse to return, these sensors can detect the
location of objects with respect to the car. Aside from this
basic functional similarity, however, the two sensors do not
have much in common. The differences lie in the orientation
of the emitter and the type of pulse emitted, as discussed in
detail below.
MODERN IMPLEMENTATION
PURPOSE OF SENSORS
For a car to operate independently, it needs to be able to
steer, accelerate, brake, and signal on its own. To do so, it must
take in information from its surroundings, just as a human
driver would. Each second as it moves through its
environment, its situation changes: roads bend, stoplights
approach, other cars pass by, and pedestrians cross the street.
A human driver is able to interpret all of these things using
only their sense of sight, and make decisions on how to react
based on this sensory input [3]. This is because the human eye
can sense movement, changes in color and brightness, and the
distances of objects. The brain takes all of this information and
uses it to determine where the car needs to go and when [3].
On today's autonomous vehicles, sensors are the eyes, and the
computer is the brain.
No single sensor can replicate the versatility of the
human eye. To take in all the information a human would,
multiple types of sensors must be integrated into a single
system. Normal visible-spectrum cameras, for example, can be
used to detect colors and shapes, such as those of stoplights
and yield signs. However, they are useless for determining the
distances or speeds of other objects [3].
It is essential that the sum of the inputs from all of the
sensors on the car be detailed enough for the computer to
create a three-dimensional map of the world around the car. To
meet this end, some of the most important sensors on the car
are those that can detect the distances and shapes of nearby
surfaces and objects. This category includes two technologies:
radar and LiDAR (light detection and ranging). One or a
combination of these two types of sensors is used on every
viable autonomous car currently being designed [3].
LIDAR
LiDAR sensors consist of a single spinning module that
emits laser beams to detect objects in a full 360 degrees around
the car. Instead of sending out radio waves like radar, LiDAR
units use near-infrared laser beams, which have a wavelength
of only 1050 nanometers [6]. LiDAR units used for terrain
mapping send out pulses at 50,000 to 200,000 Hz and generate
millions of data points in all directions [6]. Mounted on top of
the car, this single sensor can be a very effective way to create
a virtual model of the car's surroundings [6].
Once the laser beam is generated, it is directed into a
spinning mirror that directs it outward, away from the car. As
the beams travel out at different angles, the system uses a
high-precision clock to record the time that passes after a beam
leaves the mirror at a certain orientation. When the return is
detected from that direction, a calculation determines the range
between the car and the object the beam bounced off of, and a
data point is assigned coordinates [6]. The basic formula is
R = v*t, or more specifically R = (1/2)*c*t, where R is the
range, c is the speed of light, t is the round-trip time, and the
factor of 1/2 accounts for the time the wave spends moving
away from the sensor versus moving back toward it. The sum
of all these return data points is called a point cloud [6]. This
point cloud is the virtual map that the LiDAR creates for the
computer. By providing a huge number of highly accurate data
points, it allows the computer to make inferences and "see" the
shapes of surfaces and objects. Because the spinning sensor
is constantly picking up new data points and refreshing the
point cloud, changes in distance can be used to detect
movement and speed as well [5].
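To make this calculation concrete, the short sketch below (in Python, with hypothetical function names, not taken from any production system) turns one measured round-trip time and the mirror's orientation into a single point-cloud coordinate:

import math

C = 299_792_458.0  # speed of light in m/s

def lidar_range(round_trip_time_s):
    # R = (1/2) * c * t: half the round trip is the one-way distance
    return 0.5 * C * round_trip_time_s

def to_point(range_m, azimuth_rad, elevation_rad):
    # Convert a return (range plus mirror orientation) into x, y, z
    # coordinates relative to the sensor: one point in the point cloud
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# A return detected 400 nanoseconds after the pulse left the mirror
r = lidar_range(400e-9)                               # roughly 60 m
point = to_point(r, math.radians(45), math.radians(-2))

Repeating this calculation for the millions of returns collected each second yields the point cloud described above.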
Because the lasers travel in straight lines to and from the
sensor, each pulse can only gather a single data point in each
direction, accounting for the first solid surface the laser
touches and bounces back from. This is not always true when
detecting objects that are partially translucent, such as foliage
and glass [5]. In these cases the LiDAR is able to "see through"
the first surface and detect surfaces behind it: the sensor picks
up the first and last returns and interprets them as two surfaces,
one closer and one farther away. For most solid objects,
however, the full beam is bounced back, creating a shadow in
the point cloud behind the surface [5]. This is usually of no
consequence, as the nearest faces of an object are all that
matter when trying to avoid hitting it. Also, as the car moves,
the sensor is able to partially fill in these shadows by viewing
objects from different angles. The main issue that arises is the
shadow thrown by the car itself. The optimal location to mount
the sensor is the top of the car, where it can spin in all
directions horizontally without being completely obstructed,
but the body of the car will always partially block the bottom
of the sensor's field of view, as can be seen in Figure 1 [7].

FIGURE 1 [7]
LiDAR-generated point cloud around a car
The dark ring around the blue car is a blind spot in the
LiDAR’s field of view created by the car itself. By mounting
the sensor higher above the car, the shadow in the bottom of
the field of view can be minimized, allowing the LiDAR to see
as close as possible to the car [6].
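As a toy illustration of why mounting height matters, the similar-triangles sketch below (with assumed, simplified dimensions, not taken from any cited design) estimates how far the car's own shadow extends along the ground:

def shadow_reach(sensor_h, edge_h, edge_dist):
    # Ground distance from the sensor at which the beam that just grazes
    # the car's body edge finally lands; anything closer is in shadow.
    # Similar triangles: reach = edge_dist * sensor_h / (sensor_h - edge_h)
    return edge_dist * sensor_h / (sensor_h - edge_h)

# Raising the sensor lets beams clear the body and land closer to the car
print(shadow_reach(sensor_h=1.6, edge_h=1.4, edge_dist=0.9))  # about 7.2 m
print(shadow_reach(sensor_h=2.0, edge_h=1.4, edge_dist=0.9))  # about 3.0 m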
Due to their high perch above the car, LiDAR sensors are
not discreet. They are easily recognizable and turn heads when
seen on the road, most notably on Uber's self-driving taxis.
Uber has championed the technology over radar and believes
it is the best technology to use for terrain mapping [4].
RADAR

Radar as a technology has been around much longer than
LiDAR. Developed in the 1930s as a long-range military
detection system, radar is one of the most effective ways of
sensing the movement and range of objects. Instead of using
focused infrared laser beams to emit a pulse, radar generates
radio waves and records their return [8]. Just like LiDAR, radar
is an active remote sensor and uses the return data of the waves
it sends out to do its range calculations, but its hardware and
setup are very different.
Radar is not designed to take readings in 360 degrees;
instead, each sensor can only look out in one direction,
requiring multiple sensors positioned around the car. It
requires no rotating mirrors, however, as radio waves are not
nearly as focused as the infrared light of a laser beam. The
emitted radio waves are out of phase with each other, causing
them to spread out as they move away from the car and cover
a larger area with a single pulse [3].
There are two ways radar can be used to take readings.
The first is similar to LiDAR: the sensor times the interval
between the release of the pulse and the detection of the return,
and uses the same speed-of-light equation (R = (1/2)*c*t) to
calculate the distance of the object being detected. Multiple
consecutive readings can be used to detect any relative
movement of the object [8].
The second way is called the frequency-modulated
continuous wave (FMCW) method. In this technique a wave
with a modulated frequency is emitted; when it is picked back
up, the frequency of the return wave is measured again. The
difference between the frequency of the pulse and that of the
return can be used to calculate both the range and the relative
velocity of the object with respect to the sensor, using only the
data from a single pulse [8].
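A sketch of the arithmetic behind this, assuming a triangular frequency sweep (the formulas are standard for FMCW; the function name and numbers are made up for illustration): on one half of the sweep the range and Doppler contributions to the measured beat frequency subtract, and on the other half they add, so measuring both halves separates the two.

C = 299_792_458.0  # speed of light in m/s

def fmcw_range_velocity(f_up_hz, f_down_hz, sweep_bw_hz, sweep_time_s, carrier_hz):
    # Split the beat frequency into its range part and its Doppler part
    f_range = (f_up_hz + f_down_hz) / 2.0
    f_doppler = (f_down_hz - f_up_hz) / 2.0
    rng = C * sweep_time_s * f_range / (2.0 * sweep_bw_hz)
    vel = C * f_doppler / (2.0 * carrier_hz)  # positive = closing, by this convention
    return rng, vel

# 77 GHz automotive band, 300 MHz sweep over 1 ms (illustrative numbers):
# beat frequencies of 95 and 105 kHz give a target about 50 m away,
# closing at roughly 9.7 m/s
print(fmcw_range_velocity(95e3, 105e3, 300e6, 1e-3, 77e9))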
THE SUPERIOR SENSOR
Many companies use both radar and LiDAR in tandem.
Google and Uber have been utilizing them with great success.
But one of the leaders in the autonomous car industry, Tesla,
has publically stated that they do not feel that LiDAR is neither
necessary nor consistent enough to build a functioning
autonomous car [4].
Despite only requiring one unit per car as opposed to
many, LiDAR is currently a very expensive technology. While
Google and others are working to make it more affordable, it
remains the most expensive range-detecting sensor to choose
from [3]. Another downside to LiDAR is that it has trouble
functioning in certain weather conditions where radar does
not. Because the wavelength of near-infrared light is so small,
around 1000 nanometers, false initial returns are sent back to
the sensor as the waves bounce off of water particles in the air,
such as fog. Radar does not have this problem, as the radio
waves it emits have a wavelength of a few millimeters, which
is much larger and allows them to effectively ignore small
particles [8].
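One way to see this size argument is the standard scattering size parameter x = 2*pi*r / lambda: when x is much greater than 1 a particle scatters the wave strongly, and when x is much less than 1 the wave effectively passes it by. A quick sketch with an assumed 5-micrometre fog droplet:

import math

def size_parameter(particle_radius_m, wavelength_m):
    # x = 2 * pi * r / lambda: large x means strong scattering
    return 2.0 * math.pi * particle_radius_m / wavelength_m

fog_droplet = 5e-6  # assumed typical fog droplet radius, about 5 micrometres
print(size_parameter(fog_droplet, 1050e-9))  # near-infrared LiDAR: x ~ 30
print(size_parameter(fog_droplet, 4e-3))     # millimetre-wave radar: x ~ 0.008

The LiDAR beam is thus strongly scattered by fog droplets, while radar's millimetre waves barely notice them.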
HOW A COMPUTER PROCESSES THE DATA

If these autonomous vehicles are to be successful,
there must be a way to gather all the information from the
sensors and use it to make decisions. One system, created by
Nvidia, uses deep learning, sensor fusion, and a neural network
so the car can function on public roads without accidents [9].
The convolutional neural network (CNN) is the network the
company has developed to process the data. The network has
roughly twenty-seven million connections and over 250
thousand parameters that determine the actions the car will
take [9]. The training data for such systems is normally biased
toward driving in straight lines, so in this system frames
representing road curves are included at a higher proportion
[9]. The CNN is able to pick up on the outline of a road without
being explicitly shown or told where the road is or what it
looks like. The car is also able to perform its tasks and
complete a course even in adverse weather conditions; the
system adapts to outside conditions and can handle the many
different types of roads it will face over the course of a drive.
The system is also set up so that the steering
command is expressed as 1/r instead of r, where r is the turning
radius. This allows the car to transition seamlessly from left
turns (negative values) through 0 (straight ahead) to right turns
(positive values) [9].
FIGURE 2 [10]
Computer data input/steering output mechanism

Figure 2 shows an example of what one of these
computer processing and data storage devices, which Nvidia
calls the DAVE-2 system, looks like. In the bottom left of the
picture are a couple of small greenish-blue ports; any cameras
needed to determine the steering of the car are hooked up to
these ports.
The data is collected and stored on the external
solid-state drive. When data comes in, it is processed and then
passed to the component that adjusts the angle of the steering
wheel. Along the bottom of the panel, on the far right, one can
see a rather large device that protrudes from the main graphics
processing unit (GPU) [9]. This is the piece that takes the
information it has received and
adjusts the steering wheel to the angle it needs to be turned to.
This device attaches to the vehicle's Controller Area Network
(CAN) and then has access to all the controls in the car. The
CAN allows devices to communicate with one another without
an overseeing host, which means information can be passed on
and processed into the decisions the autonomous vehicle
should make. This is seen in all parts of the autonomous
vehicle, not only in the lane-guidance feature of the CNN
system [9].
This technology was tested first in a simulator and
later on public roads. Nvidia was testing how autonomous the
system was and how many mistakes it made over a long
distance. To help combat some errors, the CNN was taught
how to correct from poor positions or orientations [9]. It would
be fed images that were slightly rotated or off center, and the
CNN would have to fix these errors and keep the vehicle
travelling safely along the road. The simulator would take
images from an actual dash cam, and once the CNN made its
adjustments, the simulator would take the next image and alter
it based on the path the CNN had the car take [9]. With all of
this, very detailed simulations could be performed. The next
step was to test how the system worked on public roads.
Driven around the streets of New Jersey, the vehicle using the
CNN was fully autonomous 98 percent of the time [9]. There
is room for improvement, and new breakthroughs and systems
arrive every day, but these tests are a positive sign that society
is growing ever closer to fully autonomous cars all over the
world.

DURABILITY

One of the most significant issues currently affecting
the self-driving car is that, because it is a novel, untested
technology, questions can be raised about the sustainability of
these vehicles. Automobiles have been breaking down since
their invention, the same as any other type of mechanical
equipment, and the need for the occasional repair or
replacement part has become standard. But different driving
patterns and tendencies will lead to different wear and tear on
a car operated by a computer than on one driven by a human
[11]. In addition, without a proper analog, it is hard to say with
certainty how long the new sensors will last on the road
relative to the total lifespan of the car. Radar and LiDAR
sensors mounted on a road car would be subjected to yearly
exposure to rain, salt, vibrations, and dust. It is possible that
the sensors would need to be checked, calibrated, or replaced
regularly, like brake pads or windshield wiper blades. While
similar sensor apparatuses used in other scenarios can last for
long periods of time, such as those used by weather stations or
military vehicles, these devices are subject to regular
maintenance themselves, and are not put under the same
adverse conditions as sensors mounted on a car [6].
While it is unclear whether the longevity of the
individual sensors will contribute to the sustainability of the
automobile, these sensors will make the car itself last longer
and in turn save car owners money. With the implementation
of self-driving vehicles, the nation is projected to save upwards
of 576 million dollars on crashes alone [11]. By making
efficient driving decisions, following other cars safely, and
decreasing air resistance, autonomous vehicles are projected to
save an average of forty-two lives every day and cut four
hundred and twenty thousand barrels of oil from daily
consumption [11]. By reducing the number of accidents and
limiting traffic, the nation will be able to save on fuel
consumption. Oil is a valuable resource, so saving hundreds of
thousands of barrels a day will make it less scarce, and the
price of gasoline will in turn go down. This technology will
help cars run smoother and cleaner for a much longer time. The
average car today lasts roughly eight years, or about one
hundred and fifty thousand miles [12].
Owners also have to change their brakes around every thirty
thousand miles, depending on their driving habits.
Autonomous vehicles will be able to brake smarter because
they can sense information and react much quicker than a
human: they will see the need to brake earlier and ease into it
rather than hitting the brake hard and causing a sudden stop.
As a result, their brakes, and the car as a whole, will run
smoother and require much less maintenance over a longer
period of time. When autonomous vehicles are driving, they
will not constantly jam on the brakes or accelerate rapidly, so
the cars will run smoothly and efficiently [12]. The car will
also stay on a relatively straight path without veering outside
the lanes, which can help prevent picking up debris or
puncturing a tire on anything at the side of the road.
Ultimately, the change from human to autonomous driving
makes the car a superior and more efficient method of
transportation than ever before, allowing for optimization and
efficiency that will lead to a more useful and convenient
product for consumers.
WHO WILL BE AFFECTED

As this technology develops and becomes more
prevalent, one must take into account how our lives could
change and who will be affected by that change. Many
industries will be affected by autonomous vehicles. The
taxicab industry could be drastically altered by these new
self-driving vehicles. This can be seen today, as the company
Uber employs more people every day and has created an
armada of cars to transport people around. The taxi driver may
fade out and no longer be necessary to get a ride from one place
to another. The yellow-taxi business will either fade away and
succumb to companies like Uber or become automated itself.
Ease of access and increased privacy may lead to a major
increase in people using ride-hailing services, possibly to the
point where the societal norm shifts from personal car
ownership toward communal car-sharing networks.
The current methods of public transportation will also be
affected by these autonomous vehicles. The previous
paragraph mentioned the ride-sharing systems that could be
implemented and how they could take the place of the taxicab.
With these ride-sharing systems there would be "fleet
managers" in charge of managing a number of cars in a certain
area [13]. In densely populated areas and urban centers, these
fleet managers would be able to access the cars' computer
systems and see where there is heavy traffic, and as the use of
these autonomous vehicles increases they will be able to get
more accurate and precise data. They could then relay this
information to the city traffic coordinator [13]. Traffic could
be managed appropriately by making certain lights stay green
longer to allow more cars to flow through, and making lights
with minimal to no cars stay green for a shorter amount of
time. This will lead to more real-time traffic management and
make for less traffic and shorter travel times [13]. The cars in
these fleets will be able to slow down to avoid clogging
intersections, or speed up to allow more vehicles to make it
through a light in one cycle. There are endless possibilities and
ways to innovate people's everyday commutes to make them
more efficient [13].
Finally, and most importantly, almost every person in the
world could end up being affected by these autonomous
vehicles. The self-driving car has already hit the road and has
been transporting people around for months. Numerous
companies are investing in this technology and trying to create
fully autonomous cars capable of operating on public roads.
Citizens will be able to hail these vehicles from their phones,
which will be far more convenient than having to wait around
and hail a cab. With this computer system in the cars, the
nearest car will pick up the request, saving the average person
time by getting to them quicker. There is also the convenience
of being able to pay straight from a phone or bank account
instead of rummaging through a purse or wallet looking for the
proper change. It should also be mentioned that, because these
vehicles are expected to be much safer, riders will not have to
worry about getting into an accident when taking a ride in one
of them.
The demand from everyday citizens to have access to a
car like this at all times could lead to a shift in car production.
Companies will soon be selling these autonomous vehicles to
the public. Everyday people are going to want to be able to
relax in the car instead of dealing with the stress of traffic and
of other people on the road who might not be as good drivers.
Everyone has dealt with other drivers whose decision making
on the road is frustrating. With the autonomous vehicle, people
will no longer have to worry about getting into an accident
because another driver did not see them in their blind spot, or
was simply negligent and did not notice the light change.
Having these self-driving vehicles will lead to people being
less stressed and much
happier in general due to not having to deal with the everyday
worries and stresses put on people from traffic jams and other
automobile issues.
FUTURE OF THIS TECHNOLOGY
There is no way of knowing what the future holds,
and the fate of this technology cannot be predicted. There are
many different paths it could travel down, but based on what
experts are saying, one can form an idea of how this technology
will grow and evolve in the future.
Volkswagen, one of the leaders in the auto industry, has
already implemented a plan to prepare for a driverless future
[13]. There could come a time when it is illegal for humans to
be behind the wheel. With more and more testing, these
self-driving cars are proving to be safer and more efficient on
the roads. The cars themselves are able to see and sense more
of what is going on, and can process information and make
decisions faster. Where humans may sometimes make bad
judgements or errors, these autonomous vehicles are set up to
take the real-time data they receive from their sensors and
process it to make the correct decision. One plan for these cars
is to make them capable of attaching to caravans: lines of cars
that drive very close together, bumper to bumper, to reduce
drag from the wind, which in turn increases miles per gallon
and saves the owner of the car money on gas. On long
commutes the vehicles will be able to notice these caravans
and latch on to one for as long as possible.
Another perk of these autonomous vehicles could be the
time saved behind the wheel. Imagine how much more one
could accomplish in a day without spending multiple hours a
week driving to work, friends' houses, restaurants, and so on.
The average commuter spends 50 minutes a day going from
home to work and back [9]. Someone working five days a
week therefore spends over four hours a week behind the
wheel just commuting. With the implementation of the
autonomous vehicle, everyday people will be more productive
and able to complete more work than ever before. The inside
of these cars could be set up as an office for getting work done,
or as a relaxation area for taking a break or de-stressing after a
long day at work. This is just a small glimpse of all the
incredible things the autonomous car has to offer.
ADDITIONAL SOURCES

E. Ackerman. "Cheap Lidar: The Key to Making Self-Driving Cars Affordable." IEEE Spectrum. 9.22.2016. Accessed 2.10.2017. http://spectrum.ieee.org/transportation/advanced-cars/cheap-lidar-the-key-to-making-selfdriving-cars-affordable
SOURCES

[1] S. Gould, Y. Han, D. Muoio. "Here's the Tech that Lets Uber's Self-driving Cars See the World." Business Insider. 9.14.2016. Accessed 1.11.2017. http://www.businessinsider.com/how-ubers-driverless-cars-work-2016-9
[2] M. Weber. "Where to? A History of Autonomous Vehicles." Computer History Museum. 5.08.2014. Accessed 1.11.2017. http://www.computerhistory.org/atchm/where-to-a-history-of-autonomous-vehicles/
[3] D. Santo. "Autonomous Cars' Pick: Camera, Radar, Lidar?" EE Times. 7.7.2016. Accessed 3.3.2017. http://www.eetimes.com/author.asp?section_id=36&doc_id=1330069
[4] S. Gibbs. "Uber riders to be able to hail self-driving cars for first time." The Guardian. 8.18.2016. Accessed 1.11.2017. https://www.theguardian.com/technology/2016/aug/18/uber-riders-self-driving-cars
[5] "A Complete Guide to LiDAR: Light Detection and Ranging." GIS Geography. Accessed 2.10.2017. http://gisgeography.com/lidar-light-detection-and-ranging/
[6] "Light Detection and Ranging." Portland State University. Accessed 2.10.2017. http://web.pdx.edu/~jduh/courses/geog493f12/Week04.pdf
[7] "See How The Google Self Driving Car Sees." Universal Design Style. 10.2.2013. Accessed 3.31.2017. http://www.universaldesignstyle.com/see-google-self-driving-car-sees/
[8] "Distance Sensors – RADAR." Clemson University Vehicular Electronics Laboratory. Accessed 3.3.2017. http://www.cvel.clemson.edu/auto/sensors/distanceradar.html
[9] "Introducing the New NVIDIA Drive PX 2." NVIDIA. Accessed 3.3.2017. http://www.nvidia.com/object/drive-px.html
[10] "The AI Car Computer for Self-Driving Vehicles." NVIDIA. 2017. Accessed 1.11.2017. http://www.nvidia.com/object/drive-px.html
[11] "Daily Impact of Self Driving Cars in the United States." AUVSI. 2016. Accessed 1.11.2017. http://www.auvsi.org/auvsiresources/knowledge/dailylossesinaworldwithoutselfdrivingcars
[12] H. Weisbaum. "What's the life expectancy of my car?" NBC. 2006. Accessed 3.30.2017. http://www.nbcnews.com/id/12040753/ns/business-consumer_news/t/whats-life-expectancy-my-car/#.WN7Pg8s2yi1
[13] A. Hars. "Driverless Car Market Watch." 2.27.2017. Accessed 3.3.2017. http://www.driverless-future.com/?cat=26
ACKNOWLEDGMENTS

First, we would like to thank Rachel Lukas, our co-chair,
for meeting with us and helping with revisions throughout the
whole writing process, and for setting up the meetings with
Mr. Andes and helping us prepare for this conference.
Next, we would like to thank Mr. Maddocks, our writing
instructor, who gave us feedback and instruction on how to
improve our paper.
Another thank you goes to Mr. Andes, who helped us
discuss our topic and gave us many tips and much feedback on
how to make a good presentation.
Also, thanks to Dr. Budny and the rest of the
engineering staff who helped set up this conference and
provided us this wonderful stage to gain experience.
Finally, I would like to thank my father, Ron Grupp, who
listened to what I had to say and gave me his input on the
subject. He has been very influential in my life and has shown
me a great deal of useful things. I would like to thank him for
all the time and effort he put into helping me become who I
am, and for his feedback and information on this paper.