
Google Driverless Car

CHAPTER 1
INTRODUCTION

The inventions of the integrated circuit and later, the microcomputer, were major factors
in the development of electronic control in automobiles. The importance of the
microcomputer cannot be overemphasized as it is the brain that controls many systems in
today’s cars. For example, in a cruise control system, the driver sets the desired speed and
enables the system by pushing a button. A microcomputer then monitors the actual speed of
the vehicle using data from velocity sensors. The actual speed is compared to the desired
speed and the controller adjusts the throttle as necessary.
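
As a rough illustration of this feedback loop (not any manufacturer's actual code), the sketch below shows a simple proportional controller in Python; the function name, gain value and throttle range are illustrative assumptions.

```python
# Minimal sketch of the cruise-control feedback loop described above.
# The names (cruise_control_step, gain) and values are hypothetical placeholders,
# not part of any real vehicle API.

def cruise_control_step(desired_speed, actual_speed, throttle, gain=0.05):
    """Proportional controller: nudge the throttle toward the desired speed."""
    error = desired_speed - actual_speed      # positive -> car is too slow
    throttle = throttle + gain * error        # open the throttle if too slow
    return max(0.0, min(1.0, throttle))       # clamp to the valid range [0, 1]

# Example: doing 90 km/h with a 100 km/h set point nudges the throttle up.
throttle = cruise_control_step(desired_speed=100.0, actual_speed=90.0, throttle=0.30)
print(round(throttle, 2))  # 0.8
```

A real cruise controller would add integral and derivative terms and rate limits, but the compare-and-adjust structure is the same.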

A completely autonomous vehicle is one in which a computer performs all the tasks
that the human driver normally would. Ultimately, this would mean getting in a car, entering
the destination into a computer, and enabling the system. From there, the car would take over
and drive to the destination with no human input. The car would be able to sense its
environment and make steering and speed changes as necessary. This scenario would require
all of the automotive technologies mentioned above: lane detection to aid in passing slower
vehicles or exiting a highway; obstacle detection to locate other cars, pedestrians, animals,
etc.; adaptive cruise control to maintain a safe speed; collision avoidance to avoid hitting
obstacles in the roadway; and lateral control to maintain the car's position on the roadway.

In addition, sensors would be needed to alert the car to road or weather conditions to ensure
safe travelling speeds. For example, the car would need to slow down in snowy or icy
conditions. We perform many tasks while driving without even thinking about them. Completely
automating the car is a challenging task and is still a long way off. However, advances have been
made in the individual systems.

Google's robotic car is a fully autonomous vehicle equipped with radar and LIDAR; as such it can
take in much more information, process it much more quickly and reliably, make a correct
decision about a complex situation, and then implement that decision far better than a human can.


The Google car system combines information gathered from Google Street View with
artificial intelligence software that combines input from video cameras inside the car, a
LIDAR sensor on top of the vehicle, radar sensors on the front of the vehicle and a position
sensor attached to one of the rear wheels that helps locate the car's position on the map. The
car also carries a number of hardware components: an Applanix positioning system, a
Velodyne LIDAR unit, a network switch, a Topcon unit, a rear monitor, a computer, a router,
a fan, an inverter and a battery, along with the software installed to control them. Google
anticipates that the increased accuracy of its automated driving system could help reduce the
number of traffic-related injuries and deaths, while using energy and space on roadways more
efficiently.

Figure 1.1 Google Car


The combination of these technologies and other systems, such as video-based lane
analysis, steering and brake actuation systems, and the programs necessary to control all of
the components, will become a fully autonomous system. The problem is winning the trust of
people to allow a computer to drive a vehicle for them; because of this, research and testing
must be done over and over again to assure a near fool-proof final product. The product will
not be accepted instantly, but over time, as the systems become more widely used, people
will realize its benefits.

Overview
The aim of this project is to implement a driverless car that can drive itself
from one point to another without assistance from a driver. One of the main impetuses
behind the call for driverless cars is safety. An autonomous vehicle is fundamentally
a passenger vehicle that drives itself; it is also referred to as an autopilot,
driverless car, auto-drive car, or automated guided vehicle (AGV).

Most prototypes built so far have performed automatic steering based on
sensing the painted lines on the road or magnetic monorails embedded in the road.


CHAPTER 2
HISTORY
Sebastian Thrun invented the Google driverless car. He was the director
of the Stanford Artificial Intelligence Laboratory. A friend of Thrun's was killed in a car
accident, and he decided there should be no more such accidents on the road; that decision
led to the invention of the Google driverless car.

Figure 2.1 Sebastian Thrun

"Our goal is to help prevent traffic accidents, free up people's time and reduce carbon
emissions by fundamentally changing car use." - Sebastian Thrun.

The Google driverless car was first tested in 2010. Google has tested
several vehicles equipped with the system, driving 1,609 kilometres (1,000 mi)
without any human intervention, in addition to 225,308 kilometres (140,000 mi) with
occasional human intervention. Google expects that the increased accuracy of its
automated driving system could help reduce the number of traffic-related injuries and
deaths, while using energy and space on roadways more efficiently. The project was
introduced in October 2010, and autonomous cars became legal in Nevada in June 2011.


CHAPTER 3

COMPONENTS
3.1 HARDWARE SENSORS

3.1.1 RADAR Sensor

Radar is an object-detection system which uses electromagnetic waves - specifically radio
waves - to determine the range, altitude, direction, or speed of both moving and fixed objects
such as aircraft, ships, spacecraft, guided missiles, motor vehicles, weather formations, and
terrain.

Figure 3.1 Radar

A radar system has a transmitter that emits radio waves called radar signals in predetermined
directions. When these come into contact with an object they are usually reflected and/or
scattered in many directions. Radar signals are reflected especially well by materials of
considerable electrical conductivity, especially by most metals, by sea water and by wetlands.
Some of these make the use of radar altimeters possible. The radar signals that are reflected
back towards the transmitter are the desirable ones that make radar work. If the object is
moving either closer or farther away, there is a slight change in the frequency of the radio
waves, due to the Doppler effect.
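
The two measurements described here, range from the echo delay and radial speed from the Doppler shift, can be sketched as follows; the 77 GHz carrier frequency and the sample numbers are illustrative assumptions, not specifications of the car's radar units.

```python
# Hedged sketch of the two basic radar measurements: range from the round-trip
# time of the echo, and closing speed from the two-way Doppler shift.
C = 3.0e8  # speed of light, m/s

def radar_range(round_trip_time_s):
    """Distance to the reflector: the signal travels out and back."""
    return C * round_trip_time_s / 2.0

def radial_speed(doppler_shift_hz, carrier_hz=77e9):
    """Closing speed from the Doppler shift (77 GHz is a typical automotive band)."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# Example: a 0.5 microsecond echo corresponds to an object about 75 m away;
# a 5 kHz Doppler shift at 77 GHz is roughly 9.7 m/s of closing speed.
print(radar_range(0.5e-6))          # 75.0
print(round(radial_speed(5e3), 1))  # 9.7
```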

Radar receivers are usually, but not always, in the same location as the transmitter.
Although the reflected radar signals captured by the receiving antenna are usually very weak,
these signals can be strengthened by the electronic amplifiers that all radar sets contain. More
sophisticated methods of signal processing are also nearly always used in order to recover
useful radar signals.

Three radar sensors are fixed on the front bumper and one on the rear
bumper. These measure the distance to various obstacles and allow the system to
reduce the speed of the car when needed. The rear sensor also helps locate the position of
the car on the map.

Figure 3.2 Front and Back side of the road

For example, when the car is travelling on the road, the radar beams are
projected onto the road from the front and back of the car.


3.1.2 LIDAR Sensors

LIDAR (Light Detection And Ranging, also LADAR) is an optical remote-sensing
technology that can measure the distance to, or other properties of, a target by illuminating the
target with light, often using pulses from a laser.

Figure 3.3 LIDAR

LIDAR uses ultraviolet, visible, or near-infrared light to image objects and can
be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical
compounds, aerosols, clouds and even single molecules. A narrow laser beam can be used to
map physical features with very high resolution. LIDAR has been used extensively for
atmospheric research and meteorology.

The LIDAR unit is fixed on top of the car. It contains 64 lasers that
scan the surroundings of the car. The laser beams hit objects around the car and
reflect back; from the time each reflection takes to return, the distance between
the car and the object is determined.
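
A minimal sketch of how such returns could be turned into obstacle positions around the car is shown below; the (angle, range) input format and the sample sweep are assumptions made for illustration, not the Velodyne unit's actual data format.

```python
# Minimal sketch of turning LIDAR returns into obstacle positions around the car.
# Each return is assumed to be (azimuth angle in degrees, measured range in metres).
import math

def returns_to_points(returns):
    """Convert (angle, range) pairs into x, y points in the car's frame."""
    return [(r * math.cos(math.radians(a)), r * math.sin(math.radians(a)))
            for a, r in returns]

def nearest_obstacle(returns):
    """Distance to the closest reflecting object in the sweep."""
    return min(r for _, r in returns)

scan = [(0.0, 42.0), (15.0, 8.5), (180.0, 3.2)]   # fake single sweep
print(returns_to_points(scan)[1])  # a point about 8.2 m ahead, 2.2 m to the side
print(nearest_obstacle(scan))      # 3.2 -> something close behind the car
```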


Figure 3.4 Road

For example, if a person is crossing the road, the LIDAR sensor will recognize
him: the laser beams it sends out are interrupted, so the system identifies that an
object is crossing and the car slows down.

3.1.3 Global Positioning System (GPS)

The Global Positioning System (GPS) is a space-based global navigation
satellite system (GNSS) that provides location and time information in all weather, anywhere
on or near the Earth where there is an unobstructed line of sight to four or more GPS
satellites. A GPS receiver calculates its position by precisely timing the signals sent by GPS
satellites high above the Earth.

Figure 3.5 Google Map


Each satellite continually transmits messages that include

1) The time the message was transmitted.

2) Precise orbital information (the ephemeris).

3) The general system health and rough orbits of all GPS satellites.

The receiver uses the messages it receives to determine the transit time of each message and
computes the distance to each satellite. These distances along with the satellites’ locations are
used with the possible aid of trilateration, depending on which algorithm is used, to compute
the position of the receiver. This position is then displayed, perhaps with a moving map
display or latitude and longitude; elevation information may be included. Many GPS units
show derived information such as direction and speed, calculated from position changes.
Three satellites might seem enough to solve for position since space has three dimensions and
a position near the Earth's surface can be assumed. However, even a very small clock error,
multiplied by the very large speed of light (the speed at which satellite signals propagate),
results in a large positional error. Therefore, receivers use four or more satellites to solve for
the receiver's location and time. The very accurately computed time is effectively hidden by
most GPS applications, which use only the location. A few specialized GPS applications do
however use the time; these include time transfer, traffic signal timing, and synchronization
of cell phone base stations.
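
The position calculation described above can be sketched as a small least-squares problem: given at least four satellite positions and the measured pseudoranges, solve for the receiver's coordinates and its clock bias. The sketch below uses a few Gauss-Newton iterations with NumPy; it illustrates the principle only and is not receiver firmware.

```python
# Hedged sketch of a GPS position fix: solve for (x, y, z) and the receiver
# clock bias from four or more satellite positions and pseudoranges.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def gps_fix(sat_positions, pseudoranges, iterations=10):
    """Least-squares receiver position and clock bias from >= 4 satellites."""
    sats = np.asarray(sat_positions, dtype=float)   # satellite coordinates (m)
    rho = np.asarray(pseudoranges, dtype=float)     # measured pseudoranges (m)
    x = np.zeros(4)                                 # [x, y, z, clock_bias_in_metres]
    for _ in range(iterations):
        ranges = np.linalg.norm(sats - x[:3], axis=1)
        predicted = ranges + x[3]                   # geometric range + clock bias
        residual = rho - predicted
        # Jacobian: unit vectors from receiver to satellites, plus the bias column.
        H = np.hstack([-(sats - x[:3]) / ranges[:, None], np.ones((len(sats), 1))])
        dx, *_ = np.linalg.lstsq(H, residual, rcond=None)
        x += dx
    return x[:3], x[3] / C                          # position (m) and clock bias (s)
```

In practice the satellite coordinates would come from the broadcast ephemeris and the pseudoranges from the timed signals; atmospheric and relativistic corrections are ignored in this sketch.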

3.1.4 Position Estimator

Figure 3.6 Position Estimator


A position-estimating sensor is mounted on the left rear wheel. It measures any small
movements made by the car and helps to accurately locate its position on the map. The
position of the car can be seen on the monitor.
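
The idea can be illustrated with a simple dead-reckoning step: each batch of encoder ticks from the wheel advances the position estimate along the current heading. The tick count and wheel size below are made-up values, not the car's actual hardware parameters.

```python
# Minimal sketch of the wheel-mounted position estimator idea: accumulate small
# wheel movements (dead reckoning) between GPS/map updates.
import math

def dead_reckon(x, y, heading_rad, encoder_ticks,
                ticks_per_rev=1024, wheel_circumference_m=1.94):
    """Advance the (x, y) estimate by the distance the wheel just rolled."""
    distance = (encoder_ticks / ticks_per_rev) * wheel_circumference_m
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad))

# Example: 512 ticks (half a revolution) heading due east moves the estimate ~0.97 m.
print(dead_reckon(0.0, 0.0, 0.0, 512))   # (0.97, 0.0)
```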

3.1.5 Video Cameras

Figure 3.7 Video Camera

A video camera is fixed near the rear-view mirror. It detects traffic
lights and any moving objects in front of the car. For example, if a vehicle or a traffic light
is detected, the car slows down automatically; this slowing-down operation is handled by the
artificial intelligence software.
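
As a purely illustrative sketch (not Google's detection pipeline), a camera frame could be flagged for a red traffic light with a simple colour threshold in OpenCV; the threshold values and pixel count are assumptions.

```python
# Hedged sketch: flag a camera frame that contains a patch of saturated red,
# as a stand-in for the much more sophisticated traffic-light detection the
# report describes. 'frame' is any BGR image from the camera.
import cv2
import numpy as np

def looks_like_red_light(frame, min_red_pixels=150):
    """Return True if the frame contains a significant patch of saturated red."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 0/180 in OpenCV's HSV space, so combine two ranges.
    mask = cv2.inRange(hsv, np.array([0, 120, 120]), np.array([10, 255, 255])) | \
           cv2.inRange(hsv, np.array([170, 120, 120]), np.array([180, 255, 255]))
    return cv2.countNonZero(mask) > min_red_pixels
```

A production system would also check shape, position and temporal consistency, but the sketch shows the kind of low-level image processing involved.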


Figure 3.8 Hardware sensors and their locations

3.2. GOOGLE STREET VIEW

Google Street View is a technology featured in Google Maps and Google Earth
that provides panoramic views from various positions along many streets in the world. It was
launched on May 25, 2007, originally only in several cities in the United States, and has since
gradually expanded to include more cities and rural areas worldwide.

Google Street View displays images taken from a fleet of specially adapted cars.
Areas not accessible by car, like pedestrian areas, narrow streets, alleys and ski resorts, are
sometimes covered by Google Trikes (tricycles) or a snowmobile. On each of these vehicles
there are nine directional cameras giving 360° views at a height of about 8.2 feet (2.5 metres),
GPS units for positioning, and three laser range scanners that measure up to 50 metres over a
180° arc in front of the vehicle. There are also 3G/GSM/Wi-Fi antennas for scanning 3G/GSM
and Wi-Fi hotspots. More recently, 'high quality' images have been based on open-source
hardware cameras from Elphel.


Figure 3.9 Street View

Where available, street view images appear after zooming in beyond the highest zooming
level in maps and satellite images, and also by dragging a "pegman" icon onto a location on a
map. Using the keyboard or mouse the horizontal and vertical viewing direction and the
zoom level can be selected. A solid or broken line in the photo shows the approximate path
followed by the camera car, and arrows link to the next photo in each direction. At junctions
and crossings of camera car routes, more arrows are shown.

3.3 ARTIFICIAL INTELLIGENCE

Artificial intelligence is the intelligence of machines and the branch of
computer science that aims to create it. AI textbooks define the field as "the study and design
of intelligent agents", where an intelligent agent is a system that perceives its environment and
takes actions that maximize its chances of success. John McCarthy, who coined the term in
1956, defines it as "the science and engineering of making intelligent machines".


Figure 3.10 Hardware assembly of the system

Just like a human, a self-driving car needs sensors to understand the world around it and a
brain that collects, processes and chooses specific actions based on the information gathered.
Each autonomous vehicle is therefore outfitted with advanced tools to gather information,
including long-range radar, LIDAR, cameras, short/medium-range radar, and ultrasound. Each
of these technologies is used in a different capacity, and each collects different information.
However, this information is useless unless it is processed and some form of action is taken
based on it. This is where artificial intelligence comes into play; it can be compared to the
human brain, and the goal is for the self-driving car to conduct in-depth learning.
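
The collect-process-act idea can be sketched very simply: merge the distance reports from the different sensors and pick a safe speed. The data structure, thresholds and the crude "take the closest report" fusion rule below are illustrative assumptions only.

```python
# Minimal sketch of the "collect, process, act" loop: merge detections from
# several sensors into one distance estimate, then choose a speed.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "lidar", "radar", or "camera"
    distance_m: float  # reported distance to the obstacle

def fuse(detections):
    """Very crude fusion: trust the closest report from any sensor."""
    return min(d.distance_m for d in detections) if detections else float("inf")

def choose_speed(current_kmh, nearest_m, safe_gap_m=30.0):
    """Back off when anything is reported closer than the safe gap."""
    return current_kmh * 0.5 if nearest_m < safe_gap_m else current_kmh

obstacles = [Detection("lidar", 24.0), Detection("radar", 26.5), Detection("camera", 25.0)]
print(choose_speed(60.0, fuse(obstacles)))   # 30.0 -> the car slows down
```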

Artificial Intelligence has many applications for these vehicles; among the more immediate
and obvious functions:

 Directing the car to a gas station or recharge station when it is running low on fuel.
 Adjusting the trip's directions based on known traffic conditions to find the quickest
route.
 Incorporating speech recognition for advanced communication with passengers.
 Eye tracking for improved driver monitoring.


CHAPTER 4
IMPLEMENTATION

How does it Work…?

The "driver" sets a destination. The car's software calculates a route and starts
the car on its way. A rotating, roof-mounted LIDAR (Light Detection and Ranging, a
technology similar to radar) sensor monitors a 60-metre range around the car and
creates a dynamic 3D map of the car's current environment.
A sensor on the left rear wheel monitors sideways movement to detect the car's
position relative to the 3D map. Radar systems in the front and rear bumpers calculate
distances to obstacles. Artificial intelligence (AI) software in the car is connected to
all the sensors and has input from Google Street View and video cameras inside the
car.
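
Putting these pieces together, the overall behaviour can be summarised as a single control loop. The sketch below is Python in outline only: every name (plan_route, fuse_position, decide and so on) is a placeholder standing in for a subsystem described in this report, not an actual interface of Google's software.

```python
# High-level sketch of the drive loop described above. All attributes and methods
# of 'car' are hypothetical placeholders for the subsystems in this chapter.

def drive(destination, car):
    route = car.plan_route(destination)                     # software calculates a route
    while not car.at(destination):
        scan = car.lidar.sweep()                            # dynamic 3D map of surroundings
        pose = car.fuse_position(car.gps.read(),            # GPS fix ...
                                 car.wheel_encoder.read())  # ... refined by the wheel sensor
        obstacles = car.radar.ranges() + scan.obstacles()   # bumper radars + LIDAR returns
        steering, throttle = car.ai.decide(route, pose, obstacles,
                                           car.camera.frame())  # traffic lights, lanes
        car.actuate(steering, throttle)                     # steering and brake/throttle actuation
```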

Figure 4.1 Working


CHAPTER 5
ROAD TESTING

 The Nevada Department of Motor Vehicles issued the first license for an autonomous
car in May 2012, to a Toyota Prius. Later, Florida, California and Michigan also allowed
testing of driverless cars on public roads.

 In August 2012, the team announced that they had completed over 300,000
autonomous-driving miles (500,000 km) accident-free.

 In April 2014, the team announced that their vehicles had logged nearly
700,000 autonomous miles (1.1 million km).

 In June 2015, the team announced that their vehicles had driven over 1,000,000
mi (1,600,000 km); in the process they had encountered 200,000 stop signs, 600,000
traffic lights, and 180 million other vehicles.

 As of July 2015, Google's 23 self-driving cars had been involved in 14 minor traffic
accidents on public roads.

 In July 2015, three Google employees suffered minor injuries when the self-driving
car they were riding in was rear-ended by a car whose driver failed to brake at a
traffic light. This was the first time that a self-driving car collision resulted in injuries.

Figure 5.1 Toyota Prius modified to operate as a Google driverless car


CHAPTER 6

APPLICATIONS

6.1. Intelligent transportation

Intelligent transport systems vary in the technologies applied, from basic management systems
such as car navigation; traffic signal control systems; container management systems;
variable message signs; automatic number plate recognition or speed cameras; to monitoring
applications such as security CCTV systems; and to more advanced applications that
integrate live data and feedback from a number of other sources, such as parking guidance
and information systems, weather information, bridge de-icing systems, and the like.
Additionally, predictive techniques are being developed to allow advanced modelling and
comparison with historical baseline data. This technology will be a revolutionary step in
intelligent transportation.

6.2. Military applications

An automated navigation system with real-time decision-making capability makes the
technology applicable on battlefields and in other military applications.

6.3. Transportation in hazardous places

Real-time decision-making capability and sensor-guided navigation will make it possible to
replace human drivers for transportation in hazardous places.

6.4. Shipping

Autonomous vehicles will have a huge impact on the land shipping industry. One way to
transport goods on land is by freight truck. There are thousands of freight trucks on the road
every day, driving for multiple days to reach their destinations, and all of them are driven by
paid employees of trucking companies. If the trucks were able to drive on their own, a person
would no longer be needed to move the vehicle from one point to another. The truck would
also be able to drive to its destination without having to stop for sleep, food, or anything
besides fuel; all that is necessary is someone to load the vehicle and someone to unload it.
This would save trucking companies a very large amount of money, but it would also put
thousands of people out of jobs. These people would have to find and learn a new profession,
as driving a freight truck is a full-time job with little time spent at home to learn how to do
another profession. This is potentially life-ruining for many employees in this industry.

6.5. Public transportation

Various forms of public transportation are controlled by a human operator. Whether it is a
bus, train, subway, streetcar, or shuttle, there is a person sitting in the driver's seat
controlling what the vehicle is doing. For trains and other rail-based transportation, the
process is simpler, concerned mostly with accelerating and decelerating the train out of and
into stops, with no need to keep in a lane. However, on a bus or shuttle, a person must
follow rules, watch the actions of other drivers and pedestrians, keep the bus in its lane, and
make sure to stop at every bus stop. These are many tasks that one person must handle, react
to, and control at the same time. In the early stages of implementation, the driver would most
likely be kept behind the wheel as a safeguard in case there is a problem with the system. The
driver would also be needed in the beginning in order for the general public to trust the
technology. As autonomous vehicle systems mature, bus drivers
would no longer be needed as the system would be able to perform all of the required tasks. It
is a simple job of following a specific route and stopping at designated points. The problems
would arise from the actions of other vehicles in the area. The ideal situation is one in which the
autonomous vehicle systems have matured to the point that nearly every vehicle on the road
is autonomously driven.

6.6. Taxi services

Another business that would be strongly affected is the taxi service. The business is based
solely on driving around people who do not have a car or do not want to drive: an employee is
dispatched to pick up the person and bring them to their destination. Taxis also drive
around cities and wait in busy areas for people to request a cab. A taxi service composed
entirely of autonomous vehicles could be started. A person could call in and request to be
picked up and then be brought to their destination for a fee. There could be autonomous taxis
waiting in designated areas for people to come and use them. Many taxi drivers need the job
because they are unable to perform other jobs for various reasons. The need for a human in
the service goes away almost completely. This is another example of a large number of
people being removed from their jobs because autonomous vehicles can perform
the task without the need for an extra person.


CHAPTER 7

MERITS

 Without the need for a driver, cars could become mini-leisure rooms. There would be
more space and no need for everyone to face forwards. Entertainment technology, such
as video screens, could be used to lighten long journeys without the concern of
distracting the driver.

 Sensory technology could potentially perceive the environment better than human
senses, seeing farther ahead, seeing better in poor visibility, and detecting smaller and more
subtle obstacles, leading to fewer traffic accidents.

 Speed limits could be increased to reflect the safer driving, shortening journey times.

 Parking the vehicle and difficult maneuvering would be less stressful and require no
special skills. The car could even just drop you off and then go and park itself.

 People who historically have difficulties with driving, such as disabled people and older
citizens, as well as the very young, would be able to experience the freedom of car
travel. There would be no need for drivers' licenses or driving tests.

 Autonomous vehicles could bring about a massive reduction in insurance premiums for
car owners.

 Efficient travel also means fuel savings, cutting costs.

 Reduced need for safety gaps means that road capacities for vehicles would be
significantly increased.

 Passengers should experience a smoother riding experience.

 Self-aware cars would lead to a reduction in car theft.

 Travelers would be able to journey overnight and sleep for the duration.

 Traffic could be coordinated more easily in urban areas to prevent long tailbacks at busy
times. Commute times could be reduced drastically.

 Reduced or non-existent fatigue from driving, plus arguments over directions and
navigation would be a thing of the past.


CHAPTER 8

DEMERITS

 Driverless cars would likely be out of the price range of most ordinary people when
generally introduced, likely costing over $100,000.

 Truck drivers and taxi drivers will lose their jobs, as autonomous vehicles take over.

 A computer malfunction, even just a minor glitch, could cause worse crashes than
anything that human error might bring about.

 If the car crashes without a driver, whose fault is it: Google/the software designer, or the
owner of the vehicle?

 The cars would rely on the collection of location and user information, creating major
privacy concerns.

 Hackers getting into the vehicle's software and controlling or affecting its operation
would be a major security worry.

 There are problems currently with autonomous vehicles operating in certain types of
weather. Heavy rain interferes with roof-mounted laser sensors, and snow can interfere
with the cameras.

 Reading human road signs is challenging for a robot.

 As drivers become more and more used to not driving, their proficiency and experience
will diminish. Should they then need to drive under certain circumstances, there may be
problems.

 The road system and infrastructure would likely need major upgrades for driverless
vehicles to operate on them. Traffic and street lights, for instance, would likely all need
altering.

 Self-driving cars would be great news for terrorists, as they could be loaded with
explosives and used as moving bombs.

 Human behavior such as hand signals is difficult for a computer to understand.


CHAPTER 9

FUTURE SCOPE

The transition to an automated transportation structure will prevent many of the problems
caused by traffic. Implementation of autonomous cars will allow vehicles to use the roads
more efficiently, thus saving space and time. With automated cars, narrow lanes will no
longer be a problem, and most traffic problems will be avoided to a great extent with the help
of this new technology. Research indicates that traffic patterns will be more predictable and
less problematic with the integration of autonomous cars.

Smooth traffic flow is at the top of the wish list for countless transportation officials. Car
manufacturers are already using various driver assist systems in their high-end models and
this trend is becoming more and more common. As a result, the early co-pilot
systems are expected to gradually evolve into auto-pilots. All developments show that one day
the intelligent vehicles will be a part of our daily lives, but it is hard to predict when. The
most important factor is whether the public sector will be proactive in taking advantage of
this capability or not. The Public Sector will determine if the benefits will come sooner rather
than later.

Since these assist systems are very similar to the systems used in autonomous car
prototypes, they are regarded as transition elements on the way to the implementation of
fully autonomous vehicles.


CHAPTER 10

CONCLUSION

Currently, there are many different technologies available that can assist in creating
autonomous vehicle systems. Items such as GPS, automated cruise control, and lane-keeping
assistance are available to consumers on some luxury vehicles. The combination of these
technologies and other systems, such as video-based lane analysis, steering and brake
actuation systems, and the programs necessary to control all of the components, will become a
fully autonomous system. The problem is winning the trust of people to allow a computer
to drive a vehicle for them; because of this, research and testing must be done over and
over again to assure a near fool-proof final product. The product will not be accepted
instantly, but over time, as the systems become more widely used, people will realize its
benefits. The implementation of autonomous vehicles will also raise the problem of
replacing humans with computers that can do the work for them. There will not be an instant
change in society, but the change will become more apparent over time as these vehicles are
integrated into society.


