T. Krisher, April 19, "The fiery crash of a Tesla near Houston with no one behind the wheel is drawing scrutiny from two federal agencies that could bring new regulation of electronic systems that take on some driving tasks. The National Highway Traffic Safety Administration and the National Transportation Safety Board said Monday they would send teams to investigate the Saturday night crash on a residential road that killed two men in a Tesla Model S. Local authorities said one man was found in the passenger seat, while another was in the back. They’re issuing search warrants in the probe, which will determine whether the Tesla’s Autopilot partially automated system was in use. Autopilot can keep a car centered in its lane, keep a distance from cars in front of it, and can even change lanes automatically in some circumstances. On Twitter Monday, Tesla CEO Elon Musk wrote that data logs “recovered so far” show Autopilot wasn’t turned on, and “Full Self-Driving” was not purchased for the vehicle. He didn’t answer reporters’ questions posed on Twitter...."
Read more Hmmmm... I'll stand by my quote... "...“Elon’s been totally irresponsible,” said Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University. Musk, he said, has sold the dream that the cars can drive themselves even though in the fine print Tesla says they’re not ready. “It’s not a game. This is serious stuff.”..." ... even though it isn't the most critical comment. What is more concerning.... "Why didn't Tesla's Automated Emergency Braking system prevent the Tesla from hitting the tree?" The common theme in the Joshua Brown, Elaine Herzberg, Walter Huang, Firetruck/Derrick Monet, and 2nd_Firetruck_Tesla crashes...: Teslas seem to disregard stationary objects directly ahead, or certainly don't avoid hitting them often enough. The Tesla code must assume that it can pass underneath them. Can such an egregious oversight in Tesla's AEB computer code really exist? Is the Society of Automotive Engineers (SAE) involved in this oversight because it has made Tesla and maybe others so averse to false positives that they simply assume that Teslas can pass under any and all stationary objects in the road ahead? Not a pretty situation. Alain
The SmartDrivingCars eLetter, Pod-Casts, Zoom-Casts and Zoom-inars are made possible in part by support from the Smart Transportation and Technology ETF, symbol MOTO. For more information: www.motoetf.com. Most funding is supplied by Princeton University's Department of Operations Research & Financial Engineering and Princeton Autonomous Vehicle Engineering (PAVE) research laboratory as part of its research dissemination initiatives.
Press Release, Dec 17, 2019, "The U.S. Department of Transportation’s National Highway Traffic Safety Administration today released an update on the progress of 20 automakers in manufacturing new passenger vehicles with low-speed automatic emergency braking systems. The installation of AEB is part of a voluntary commitment by 20 automakers to equip virtually all new passenger vehicles with low-speed AEB that includes forward collision warning by September 1, 2022...." Read more Hmmmm... Poop!! It's only low-speed... where safety is not much of an issue. All those automakers who promote the speed of their products only commit to "automated... warning" at "low-speed"... by 2022. Be sure not to go too far too fast with any of this safety stuff! So Bad!!! Never mind! 😬 Alain
K. Barry, April 22, "Consumer Reports engineers easily tricked our Tesla Model Y this week so that it could drive on Autopilot, the automaker’s driver assistance feature, without anyone in the driver’s seat—a scenario that would present extreme danger if it were repeated on public roads. Over several trips across our half-mile closed test track, our Model Y automatically steered along painted lane lines, but the system did not send out a warning or indicate in any way that the driver’s seat was empty. “In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” says Jake Fisher, CR’s senior director of auto testing, who conducted the experiment. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road...." Read more Hmmmm... Essentially everything is hackable or can be circumvented by those who wish to misbehave. I'm sure Consumer Reports could break into Fort Knox. The issue here is not about the misbehavior of the driver. We have a justice system to deal with that. Hopefully it can deal with this kind of misbehavior better than it can deal with broken tail lights and hanging air fresheners.
It is about the visions and dreams that motivate us to buy these cars and entice us to misbehave in our search to experience those dreams. Those dreams and visions are inspired and created by Mad Men to please just a few individuals, the leaders of the corporations that make and want us to buy these cars. These automaker leaders can simply insist that their messages and the messages produced by their minions don't engender such behavior. As I've written, autoPilot and maybe even FSD are fantastic Comfort & Control features that may well substantially improve safety, but only if they are used responsibly. The proper use should not be drowned out by hype that clearly can result in tragedy.
Moreover, these leaders can instruct their coders to write code that is smart enough to know when we misbehave in using these systems and do something about it. If the system catches you misbehaving, it should have been written to turn on the emergency flashers, slow down, find the next opportunity to pull over, and stop. It should then disable its operation until the driver (engager of the system) gets a note from his mother.
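That escalation sequence can be sketched in a few lines. This is purely a hypothetical sketch of the logic suggested above; every name, field, and threshold here is my assumption, not anything from Tesla's actual software:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    driver_seat_occupied: bool
    hands_on_wheel: bool
    eyes_on_road: bool
    flashers_on: bool = False
    pulled_over: bool = False
    system_locked_out: bool = False

def misbehaving(s: VehicleState) -> bool:
    """Misuse = any of the driver's basic responsibilities unmet."""
    return not (s.driver_seat_occupied and s.hands_on_wheel and s.eyes_on_road)

def respond_to_misuse(s: VehicleState) -> VehicleState:
    """The escalation described above: flashers on, slow down, pull over,
    then lock the system out until the misuse is reviewed."""
    if misbehaving(s):
        s.flashers_on = True        # warn surrounding traffic
        s.pulled_over = True        # stands in for: slow down, stop at next safe spot
        s.system_locked_out = True  # disabled until the "note from mother"
    return s
```

The point of the sketch is how little machinery is needed: three sensor bits in, three actuator bits out.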
This isn't tough. This doesn't need DeepLearning, MachineLearning or the latest AI. It needs responsible leadership that stops promoting these products as ways to show off one's bravado. Why are car commercials so focused on speed and acceleration? Do we really like to plow through deep snow, drive down river banks and up Great Walls? Really??? So depressing! Alain
R. Mitchell, April 19, "It’s a 21st century riddle: A car crashes, killing both occupants — but not the driver. That’s what happened over the weekend in Houston, where a Tesla Model S slammed into a tree and killed the two men inside. According to police, one had been sitting in the front passenger seat, the other in the back of the car. Although investigators have not said whether they believe Tesla’s Autopilot technology was steering, the men’s wives told local reporters the pair went out for a late-night drive Saturday after talking about the system. Tesla Chief Executive Elon Musk pushed back on speculation but also asserted no conclusion, tweeting Monday that “Data logs recovered so far show Autopilot was not enabled.” The company has resisted sharing data logs for independent review without a legal order...." Read more Hmmmm... I'll stand by my quote... "“I suspect there will be big fallout from this,” said Alain Kornhauser, head of the driverless car program at Princeton University." Alain
B. Templeton, April 22, "... Regardless of whether the owner was able to activate Autopilot or AC or just made a mistake trying to push the accelerator, the real question about this accident is what made somebody do something so fatally foolish? This has been the central question around Tesla’s deployment of these systems. Used correctly, they are useful tools. Driving with Autopilot is at a similar safety level to driving without it. (Not much safer, as Tesla misleadingly claims.) The problem arises because people decide, against all warnings, that it is better than it is, and they misuse it...." Read more Hmmmm... All really good points. The shame is that the Automated Emergency Braking system which, by its very name, is supposed to intervene and "save-the-day", may well have been "out-to-lunch" again. (Why does the AEB need lunch???) Alain
B. Pietsch, April 18, "Two men were killed in Texas after a Tesla they were in crashed on Saturday and caught fire with neither of the men behind the wheel, the authorities said. Mark Herman, the Harris County Precinct 4 constable, said that physical evidence from the scene and interviews with witnesses led officials “to believe no one was driving the vehicle at the time of the crash.” The vehicle, a 2019 Model S, was going at a “high rate of speed” around a curve at 11:25 p.m. local time when it went off the road about 100 feet and hit a tree, Constable Herman said. The crash occurred in a residential area in The Woodlands, about 30 miles north of Houston. The men were 59 and 69 years old. One was in the front passenger seat and one in the rear seat, Constable Herman said. He said that minutes before the crash, the men’s wives watched them leave in the Tesla after they said they wanted to go for a drive and were talking about the vehicle’s Autopilot feature....." Read more Hmmmm... This was my 1st read on this. Alain
F. Lambert, April 22, "In what could be a first, Tesla has reportedly publicly released the data logs from a customer’s vehicle involved in a crash that led the owner to protest at Tesla’s booth at the Shanghai Motor Show.... She jumped on a display car to claim that Tesla’s “brakes are not working.” The owner was eventually dragged out of the booth and reportedly put in “police detention,” but not before the event was filmed and posted to social media.”..." Read more Hmmmm... Maybe that's what I need to do to get Tesla to release their data so that I, or some other independent entity, can ascertain the safety of autoPilot. More importantly, "...The front collision warning and automatic emergency braking function were activated (the maximum brake master cylinder pressure reached 140.7 bar) and played a role, reducing the amplitude of the collision. 1.8 seconds after the ABS was applied, the system recorded the occurrence of the collision. After the driver stepped on the brake pedal, the vehicle speed continued to decrease, and before the collision, the vehicle speed was reduced to 48.5 kilometers per hour.”..."
Why did Elon's coders design the AEB:
1. to wait until 1.8 seconds before collision to activate? The coders knew/know that is too late given the speed at 1.8 seconds before collision.
2. to not have the AEB, or autoPilot, begin to slow down earlier than the "1.8 seconds to collision"?
It is straightforward to compute a feasible "master cylinder pressure profile" to not crash with a stationary object ahead. These computations can be done quickly and repeated continuously. At the start of a trip this process is trivial because the car is at rest and the solution is simply the brake pressure that keeps the car stationary. As the car starts to move it is going slowly and, hopefully, its cameras can detect a stationary object ahead at a sufficient distance that many feasible "master cylinder pressure profiles" exist that can keep the Tesla from crashing. I'm sure that the coder can pick a good profile to implement.
As the car gets going faster and objects get closer, fewer feasible profiles exist, but still, one can be found and implemented so that the car does not crash. At some point, the combination of speed and distance is such that physics can no longer help and there exists no feasible profile that will keep the car from crashing.
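The feasibility check itself is elementary kinematics. A minimal sketch, assuming constant maximum deceleration (the real computation would account for friction, grade, tire condition and brake dynamics; the 8 m/s² figure is my assumed dry-pavement maximum, not a Tesla parameter):

```python
def stopping_distance(speed_mps: float, max_decel_mps2: float) -> float:
    """Distance needed to brake from speed_mps to rest at constant
    deceleration: v^2 / (2a)."""
    return speed_mps ** 2 / (2.0 * max_decel_mps2)

def feasible_profile_exists(speed_mps: float, gap_m: float,
                            max_decel_mps2: float = 8.0) -> bool:
    """True if at least one brake-pressure profile can stop the car short
    of a stationary object gap_m ahead."""
    return stopping_distance(speed_mps, max_decel_mps2) < gap_m
```

At rest the check is trivially true, exactly as described above; at 74 kph (about 20.6 m/s) the car needs roughly 26.5 m of clear road, so a profile exists at a 30 m gap but not at a 20 m gap.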
NHTSA or some regulator or Elon himself should require that the AEB be designed to not let the car get between that rock and that hard place: not let it enter a Design Domain (the combination of road condition, speed and distance to the stationary object ahead) that has no feasible "master cylinder pressure profile" that will avoid a crash!
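Inverting the same kinematics gives the speed governor that requirement implies: never drive faster than the speed from which you can still stop within your detection range. Again a hedged sketch with an assumed deceleration limit, not any manufacturer's actual design rule:

```python
import math

def max_safe_speed_mps(detection_range_m: float, max_decel_mps2: float = 8.0) -> float:
    """Highest speed from which the car can still stop within its sensors'
    reliable detection range: v = sqrt(2 * a * d)."""
    return math.sqrt(2.0 * max_decel_mps2 * detection_range_m)
```

With 100 m of reliable detection the cap is 40 m/s (144 kph); shrink the range to 20 m, say in fog or with weak sensors, and the cap drops below 18 m/s (about 64 kph). The car never enters the no-feasible-profile region because its speed is always conditioned on what it can see.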
These "no-man's lands" don't just appear out of thin air. Every trip starts with speed = 0. Conditions rarely change instantaneously. Even the "instantaneous" appearance of stationary objects in the lane ahead is rare. In those instances, yes, it is "everybody do the best they can", but in the preponderance of others, no crash would occur!
Now maybe Tesla needs Lidar in order to first "see" stationary objects sufficiently far ahead when conditions are poor and the speed is high. In that case Elon should either back off his "no Lidar" stance or the AEB (or whatever other automated system) should not let you go that fast under those conditions, period!
This is serious business here and these systems should be there to give us a "get out of jail free" card, pass GO and collect $200.
Again, the Tesla knew it had waited too long to stop. It was not designed to slow down in order to not let itself get into a situation in which it can't stop. It was designed to crash at 48.5 kph in this situation. While some may be happy that it slowed from above 74 kph to 48.5 kph, additional brake pressure could have been applied by Tesla's great autoPilot/AEB/... system earlier than 6:14:26.37 PM that would have slowed it sufficiently below 74 kph at that time such that no collision would have occurred. It is also likely that a brake profile could have been computed and implemented that would not have concerned the driver and almost assuredly would have had the driver praising Tesla at the Shanghai Auto Show rather than protesting. Alain
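A back-of-envelope check on the quoted log numbers supports this. I assume roughly constant deceleration over the final 1.8 s and take 74 kph as the speed when hard braking began; all derived figures below are my arithmetic, not Tesla's:

```python
v0 = 74.0 / 3.6          # entry speed, m/s (~20.6)
v1 = 48.5 / 3.6          # impact speed per the log, m/s (~13.5)
t_braking = 1.8          # seconds of hard braking before the collision

decel = (v0 - v1) / t_braking            # deceleration actually achieved, ~3.9 m/s^2
time_to_stop = v0 / decel                # ~5.2 s needed to fully stop at that decel
dist_to_stop = v0 ** 2 / (2.0 * decel)   # ~53.7 m needed to fully stop at that decel
```

In other words, the braking achieved was quite modest (well under half of what dry pavement allows), and applying that same modest braking roughly 3.4 seconds sooner would have stopped the car with no collision at all.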
Alain
F. Lambert,
April 22,
"Elon Musk,
who has
generally
welcomed valid
criticism, has
now dismissed
some serious
concerns about
Tesla as
“weird”
attacks from
the media.
We previously
reported on
how “Tesla
superfandom is
becoming toxic
and negative
for the
electric
revolution.”
Part of that
involves Elon
Musk’s
feedback loop,
which has been
extremely
valuable for
Tesla, getting
corrupted by
those
superfans.
Musk has often
highlighted
the feedback
loop, which
often consists
of him
responding
directly to
people and
criticism on
Twitter, as
one of Tesla’s
biggest
advantages..."
Read more Hmmmm ... Again, it is a shame
because Elon,
Tesla and
probably
autoPilot and
really
fundamentally
good. They
don't need to
be oversold
and hyped.
And their
challenges and
limitations
should be made
clear. We can
then really
appreciate all
of the
excellent
real features
of these
cars. Alain
F. Fishkin, Nov 25, "What you should know about electric cars, climate change and more. The Dispatcher publisher Michael Sena joins Princeton's Alain Kornhauser and co-host Fred Fishkin in an eye opening edition of Smart Driving Cars.."
F. Fishkin, Nov 24, "When it comes to active driver assistance systems, what works and what needs improvement? Some answers from Kelly Funkhouser… program manager for vehicle interface, head of connected and automated vehicles at Consumer Reports. She joins Princeton's Alain Kornhauser and co-host Fred Fishkin for episode 186 of Smart Driving Cars."
F. Fishkin, July 20, "Is Driverless home delivery the fastest route to Affordable Mobility for the Mobility Disadvantaged? ... "
F. Fishkin, July 2, "Transportation, racial injustices and changing the thinking around the future of mobility. NYU McSilver Institute for Poverty Policy & Research fellow Henry Greenidge joins Princeton's Alain Kornhauser and co-host Fred Fishkin in an eye and mind opening episode of Smart Driving Cars. Plus Amazon, Zoox, Waymo, Tesla & more."
F. Fishkin, June 2, "But the debate is not really about technology nor is it about who delivers the best value for the money or the most privacy. It is about ..."
A. Ohnsman, April 2, "John Krafcik, the auto industry veteran who’s run Waymo for over five years, is stepping down as CEO of the Alphabet Inc. self-driving tech giant and is being replaced by two high-ranking company executives...."
J. Gallagher, March 24, "Two prominent labor unions want the U.S. Department of Transportation (DOT) to reject the Trump administration’s automated vehicle (AV) strategy for relying too much on the viewpoint from industry without enough attention paid to potential damage to worker safety and jobs. The 38-page Automated Vehicles Comprehensive Plan (AVCP), one of the last documents released for public comment by DOT under Secretary Elaine Chao before she left the administration in January, laid out the previous administration’s vision for integrating AVs – both cars and heavy trucks – into the U.S. transportation system.
The plan received 23 comments before the comment period closed on Tuesday, with trucking technology companies generally supporting the strategy and labor rejecting it.
“This document doubles down on the previous administration’s irresponsible, hands-off approach to AV deployment and regulation and mostly boosts the agency’s role as cheerleader and enabler rather than safety regulator,” wrote John Samuelsen, international president of the Transport Workers Union of America (TWU), which represents transit workers...." Read more Hmmmm... One might suggest that TWU's position is enormously short-sighted. Transit pre-Covid served 1% of the person-miles in the US. That is a niche of a niche. During Covid, almost anyone who could afford a car and didn't have one bought one. Transit ridership took an enormous hit. Even with enormous subsidy, Transit, especially bus transit, is hardly ever the "mode of choice" for anyone because its level-of-service is fundamentally poor. It serves relatively few locations, loosely connected by a route which delivers service only at infrequent fixed times. Essentially no other consumer commodity today operates with so little regard to its customers' real-time needs and desires. Even network television has adapted to become demand-responsive as opposed to take-it-or-leave-it.
Conventional transit is labor intensive because it needs a chauffeur for each vehicle, and that chauffeur deserves nice working conditions and a living wage. Unfortunately, the service that a chauffeur can deliver can't attract enough customers to make that service a going concern. However, an automated driver can arguably deliver demand-responsive service while having the total cost of its working conditions and level-of-effort be substantially less than a TWU driver's. This might let a Transit entity actually develop a going concern that would serve 10x or more person-miles and create better pay and better working conditions for all TWU members.
M. Hogan, March 19, "A beta version of Tesla's "Full Self Driving" Autopilot update has begun rolling out to certain users. And man, if you thought "Full Self Driving" was even close to a reality, this video of the system in action will certainly relieve you of that notion. It is perhaps the best comprehensive video at illustrating just how morally dubious, technologically limited, and potentially dangerous Autopilot's "Full Self Driving" beta program is...." Read more Hmmmm... The video is a MUST watch. This is what I would call a "Semi-SelfDriving Alpha" product in this Operational Design Domain (non-dense city/commercial suburban streets, during daylight, in clear weather with moderate temperature conditions).
Drivers have four (4) "responsibilities": 1. Feet/foot on/near the pedals, 2. Hand(s) on the wheel, 3. Eyes on the road, and 4. Butt in the driver's seat (and possibly 5.... Have reasonable cognitive brain functions).
If the Operational Design Domain is a straight lane with a slight downgrade and nothing else around, my "55 Chevy" can "Self-drive" and even be "Driverless". I don't even have to be in it. However, we must all agree that we can't call my "55 Chevy" a "Driverless" car. We can't even call it Self-driving, because I'm going to need to have my butt in the driver's seat to do something when the ODD changes (the road turns, ...), and it is, at best, Semi-Self driving because my eyes will need to be on the road for me to realize that the "55 Chevy" is about to exit its ODD. It is going to need help from me to not crash.
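The '55 Chevy argument is really about detecting ODD exit and deciding who drives next. A toy sketch of that gate, with every condition and string label hypothetical and chosen only to mirror the ODD described above:

```python
def inside_odd(daylight: bool, clear_weather: bool,
               road_is_straight: bool, lane_lines_visible: bool) -> bool:
    """Toy ODD membership test: self-driving is permitted only while every
    condition the system was designed for still holds."""
    return daylight and clear_weather and road_is_straight and lane_lines_visible

def control_decision(odd_ok: bool, driver_ready: bool) -> str:
    """Leaving the ODD must hand control back, or stop if the driver can't take over."""
    if odd_ok:
        return "self-drive"
    return "hand back to driver" if driver_ready else "pull over and stop"
```

The moment the road turns, `inside_odd` goes false, and someone's butt had better be in the driver's seat; otherwise the only safe decision left is to pull over and stop.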
So Elon's FSD is definitely Semi-SelfDriving because its ODD doesn't come close to including many of the situations that it found in its video journey above. It is Alpha because any potential user can be expected to have little if any idea what is required to use this product without getting hurt. So, please be very careful out there and don't stop paying attention to the road ahead!!! Alain