In 2011, Google began to seriously test what it later referred to as a ‘driverless car’. The goal of this testing was to one day develop a vehicle that does not require a driver to operate it. Navigation, speed, and steering would all be handled by an artificial intelligence designed to transport passengers safely from one place to another.
Five years later, other companies such as Ford and Tesla now share Google’s
aspirations and are working on driverless car technology in their own vehicles.
Once this technology is fully developed and available on the market,
individuals such as the elderly and disabled will experience a new level of
independence that they previously did not have. When the technology was first announced, it was commonly believed that driverless cars would not be seen on the road for many years. Now, in 2016, Tesla has become the first company to release a vehicle with basic driverless car technology onto the market. There is no doubt that further advancements will be made as companies rush to compete with Tesla’s vehicles. As the concept of what it means to be a ‘driver’ changes over time, courts across the country will face new and interesting questions as to who can really be held accountable for car accidents. There have already been two car accident cases involving driverless car technology, and these cases have forced courts to consider who is and who is not at fault for such accidents.
Driverless car features first appeared in Tesla’s Model S vehicles. These features include automatic braking and steering, and the vehicle can also detect its surroundings and make adjustments accordingly. Collectively, these features work together in a
function called ‘Autopilot’. Upon the release of the Model S, Tesla stressed
the fact that the Autopilot technology was still in its ‘beta’ stage of
development, meaning that research and development of the product is ongoing.
When applied to software, ‘beta’ is largely associated with early product
releases that forecast a greater technology to come. It also marks the
transition of a program from its ‘creation’ phase to its ‘usable’ phase. This
means that there are likely to be software bugs and glitches that surface as
the program is being used. Tesla’s statement should be taken as a warning by
drivers to use the Autopilot with discretion and not to rely upon it
completely. Many legal professionals believe that the title ‘Autopilot’ has led
some drivers to put more faith than they should in the driverless car feature.
So far, Tesla has refused to change the name despite two accidents that
occurred in Model S cars.
In the first accident, a man was using the
Autopilot feature in his vehicle when he was struck and killed by a tractor
trailer that passed in front of the car. Tesla has since been questioned about the reliability of the software and why it did not detect the tractor trailer as it crossed the car’s path. The car was traveling at approximately 65 miles per hour when it collided with the trailer. It is unclear whether speed affects the software’s ability to detect oncoming objects; in any case, the collision happened suddenly.
In the second case, the Autopilot program
failed to traverse a winding road when activated. Instead, the car veered off the road completely before crashing. Fortunately, the accident was not
fatal. In this case, Tesla stated that the driver was at fault for the accident
because he ignored the Autopilot’s safety features.
The Model S cars are equipped with pressure sensors in the driver’s seat and steering wheel. When Autopilot is activated, the driver must periodically place his or her hands on the wheel or the car will issue an audio warning. This security measure is meant to keep drivers awake and alert even though they are not steering the vehicle. If the driver fails to comply with the warning, the car will then slow down until the wheel is gripped. According to Tesla, the man in the second case failed to follow the car’s warnings and place his hands on the steering wheel.
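For readers curious about how this kind of escalation might work, below is a minimal sketch of the logic described above, written in Python. Tesla’s actual software is proprietary, so every class name, method, and threshold in this example is a hypothetical stand-in, not the real system:

```python
import time

# Hypothetical sketch of the hands-on-wheel escalation described above.
# All names and thresholds are invented for illustration; the real
# Autopilot implementation is proprietary.

HANDS_OFF_WARNING_SECONDS = 15.0   # assumed delay before an audio warning
HANDS_OFF_SLOWDOWN_SECONDS = 30.0  # assumed delay before the car slows down


class SimulatedCar:
    """Stand-in for the vehicle's sensors and controls."""

    def __init__(self):
        self.hands_on_wheel = False

    def wheel_pressure_detected(self) -> bool:
        # The Model S uses pressure sensors in the wheel and driver's seat.
        return self.hands_on_wheel

    def play_audio_warning(self):
        print("Audio warning: place your hands on the wheel.")

    def reduce_speed(self):
        print("Warning ignored: slowing down until the wheel is gripped.")


class AutopilotMonitor:
    def __init__(self, car: SimulatedCar):
        self.car = car
        self.last_hands_on = time.monotonic()

    def update(self):
        # Called periodically while Autopilot is active.
        if self.car.wheel_pressure_detected():
            self.last_hands_on = time.monotonic()
            return
        hands_off = time.monotonic() - self.last_hands_on
        if hands_off > HANDS_OFF_SLOWDOWN_SECONDS:
            self.car.reduce_speed()      # driver ignored the warning
        elif hands_off > HANDS_OFF_WARNING_SECONDS:
            self.car.play_audio_warning()


if __name__ == "__main__":
    monitor = AutopilotMonitor(SimulatedCar())
    monitor.update()  # no hands detected yet, but too soon for a warning
```

Note that a safeguard like this depends entirely on the pressure sensors reporting honestly, which is exactly what the trick described next defeats.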
While these security measures certainly
improve the vehicle’s safety, there is still much room for improvement.
Irresponsible and dangerous drivers have found ways of tricking the security system
in an attempt to lounge in the back seat of their vehicle while the program
navigates it. By placing weighted objects in the correct spots in the front
seat, a driver can fool the car into thinking that someone is at the wheel.
Doing so is dangerous, irresponsible, and illegal, as it endangers both the driver and anyone else who may be involved in an accident that the driver’s actions cause.
Another hiccup in the Autopilot program
involves its vehicle detection system. This feature allows the car to recognize
other vehicles that travel in its proximity. While this is essential for the
car’s navigation, some glitches have been found in the software. When a vehicle
is detected, the Autopilot program will attempt to match it with a make and model of car, which allows it to estimate the vehicle’s dimensions for navigation purposes. However, there have been occasions where the software has failed to match the car with a recognized make and model, or has matched the identified vehicle with the wrong brand or model. While this has yet to cause an accident, it is concerning because the Autopilot’s steering is determined by the environment around it, including other cars. In particular, the Autopilot’s ‘Merging’ feature may be affected by this glitch.
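To make the concern concrete, here is a rough Python sketch of what a make-and-model matching step like the one described above could look like. The real detection pipeline is proprietary; the profiles, tolerance, and fallback behavior here are all assumptions for illustration:

```python
from dataclasses import dataclass

# Hypothetical sketch of the vehicle-matching step described above.
# Every profile, threshold, and name here is invented; the real
# Autopilot pipeline is proprietary.


@dataclass
class VehicleProfile:
    make: str
    model: str
    length_m: float
    width_m: float


# Toy database of known vehicle dimensions.
KNOWN_VEHICLES = [
    VehicleProfile("Ford", "F-150", 5.9, 2.0),
    VehicleProfile("Toyota", "Corolla", 4.6, 1.8),
]

# Conservative default used when no confident match is found.
GENERIC_FALLBACK = VehicleProfile("unknown", "unknown", 5.5, 2.0)


def match_detection(observed_length_m: float, observed_width_m: float,
                    max_error_m: float = 0.5) -> VehicleProfile:
    """Return the closest known profile, or a generic fallback.

    A wrong match matters because steering and merging decisions
    downstream rely on the estimated dimensions.
    """
    best, best_error = None, float("inf")
    for profile in KNOWN_VEHICLES:
        error = (abs(profile.length_m - observed_length_m)
                 + abs(profile.width_m - observed_width_m))
        if error < best_error:
            best, best_error = profile, error
    # Falling back to generic dimensions is safer than committing
    # to the wrong make and model.
    if best is None or best_error > max_error_m:
        return GENERIC_FALLBACK
    return best


if __name__ == "__main__":
    print(match_detection(5.8, 2.0))  # close to the F-150 profile
    print(match_detection(9.0, 2.5))  # no close match: generic fallback
```

The glitch described above corresponds to the case where a matching step like this either finds nothing or confidently returns the wrong profile, skewing the dimensions that features such as ‘Merging’ depend on.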
If the software can be tricked and the
person sitting in the front seat isn’t actually driving the car, who can be
held accountable if a car accident happens? The law is that the owner of a
vehicle is responsible for any damage caused by that vehicle whether or not
they are the driver. For more information about the ownership and liability of
drivers, please read our article, Who Should Be Driving Your Vehicle. Even so, drivers have a duty to be alert and do
everything within their power to avoid car accidents. For more information on
that, refer to our article, Use Your Right-of-Way The Right Way. Drivers must exercise discretion when using the technology in their vehicles and judge its effectiveness based on the situation.
While the Autopilot feature is innovative,
it should not be used to navigate complicated roads with many curves and changing contours. It is best used on highways or long stretches of roadway. Using the feature in these areas will allow a driver to rest and be prepared for more complicated navigation ahead.
Even though the Autopilot program is
capable of steering the vehicle, whoever sits behind the steering wheel at that
time is still considered a driver, even when doing something stupid like climbing into the back seat to trick the so-called ‘Autopilot’ system. The Autopilot does not activate on its own. A driver must flip a switch by the steering wheel in order to turn it on. This means that a driver must make the conscious decision to use the technology available to him or her. Therefore, the driver’s actions will still be considered by the court. Tesla released an additional statement noting that drivers have a responsibility to keep their hands on the wheel. Although a driver may be allowed to relax when Autopilot is active, he or she still has the same driving responsibilities as when the feature is turned off.
In this way, the consideration of a driver’s role in a driverless car has not changed in a legal sense. The owner-driver relationship in terms of liability still applies to this situation. Yet a third party, such as the manufacturer or designer of the vehicle, may be held accountable for such a car accident if a claim in product liability can be established. A product liability lawsuit arises when someone is injured by a product, like a motor vehicle, due to an error made by the product’s manufacturer or designer. This error could be related to unsafe product design, ignored federal or state regulations, a defective product, or the inability of the product to do what it was designed to do.
This technology is new and exciting, but
it must be used with care. Drivers should not push and break the boundaries of
safety just because they believe it is possible to do so. When you own or drive a vehicle, your actions can result in serious consequences. Think
of Autopilot as an aid rather than the entity its name suggests. Despite what
you may believe, this technology is nowhere near capable of replacing a driver.
As a side note, I would like to share my
opinion on certain ‘reward’ systems that insurance companies offer drivers for
good driving. These rewards come in the form of reduced payments and sometimes
gift cards. A person should not be rewarded for doing something as simple as being a careful driver. Maintaining your own safety and the
safety of others should always be a concern of drivers whether they have a
monetary incentive to do so or not. Acting to save your life and the lives of
others is a legal duty when you are behind the wheel and the fact that some
people need an extra push to do so is disconcerting. Be a good driver whether
you are or are not operating a driverless car.
If you have questions about the liability
involved in owning and operating a car or other motor vehicle, contact
Poissant, Nichols, Grue, and Vanier at:
367 West Main Street, Malone, New York 12953
45 Market Street, Potsdam, New York 13676
Phone: (518) 483-1440
Toll Free: 1-800-924-3529
-Joseph Nichols
-Paul Nichols