E. Niedermeyer, Nov 1, "Congrats! This car is all yours, with no one up front,” the pop-up notification from the Waymo app reads. “This ride will be different. With no one else in the car, Waymo will do all the driving. Enjoy this free ride on us!” Moments later, an empty Chrysler Pacifica minivan appears and navigates its way to my location near a park in Chandler, the Phoenix suburb where Waymo has been testing its autonomous vehicles since 2016.
Waymo, the Google self-driving-project-turned-Alphabet unit, has given demos of its autonomous vehicles before. More than a dozen journalists experienced driverless rides in 2017 on a closed course at Waymo’s testing facility in Castle ...Doesn't count!...; and Steve Mahan, who is legally blind, took a driverless ride in the company’s Firefly prototype on Austin’s city streets way back in 2015 ...Counts, but just as one '< 10 mile ride'. (We'll assume that the road conditions weren't staged.)...
But this driverless ride is different — and not just because it involved an unprotected left-hand turn, busy city streets or that the Waymo One app was used to hail the ride. It marks the beginning of a driverless ride-hailing service that is now being used by members of its early rider program and eventually the public..." Read more Hmmmm... Maybe!? If it is, it is so well hidden that my student wasn't able to find one being used in this way, to verify that this is not just another staged event. If this is really going to be an economically sustainable mobility service, it is going to have to deliver "millions" of rides without a driver, without an attendant and without staging. It would be nice to be able to just find one. FYI: In Chandler on a typical day there are approximately a million person trips taken using a vehicle. Across the nation there are about a billion. See also Tim Lee's reporting of this trip. Alain
T. Randall, Nov 5, "We asked 5,000 Model 3 owners about Tesla’s software for automated driving on highways and parking lots. More than 90% said the feature makes them safer....
Six drivers claimed that Autopilot actually contributed to a collision, while nine people in the Bloomberg survey went so far as to credit the system with saving their lives. Hundreds of owners recalled dangerous behaviors, such as phantom braking, veering or failing to stop for a road hazard. But even those who reported shortcomings gave Autopilot high overall ratings....
Autopilot Puts Owners in Danger—And Saves Them From It. Explore owners’ close calls with Autopilot engaged" Read more Hmmmm... Very interesting. Substantive 1st-hand accounts that can be taken seriously. See also Tim Lee's take on this. Alain
E. Becic, Nov. 5, "Uber ATG developmental ADS installed on the crash-involved vehicle was designed to operate in a fully autonomous mode only on pre-mapped designated routes. Although the system was designed to be fully automated along a specific route, a human operator located inside the vehicle was tasked with overseeing the operation of the system and monitoring the environment. Unless stated otherwise, the ADS discussed in this report refers only to Krypton platform, software version 2018.071.3p1 that was installed on the crash-involved vehicle....
1.6 Object Detection and Hazard Avoidance
When ADS is activated, it performs all driving tasks, including changing lanes, overtaking slow moving or stopped vehicles, making turns, or stopping at traffic lights and stop signs.
1.6.1. Object Detection and Classification, and Path Prediction
As the ADS navigates and controls the vehicle along a designated route, the system continually monitors the environment for any objects, whether moving or stationary, on or outside a roadway. The detected objects are incorporated into the virtual environment, and the system dynamically updates the vehicle’s motion plan to avoid potential conflicts. Object detection is conducted primarily by the lidar, radar and camera systems, each of which has different specialized functions. When an object is detected, it is tracked, its heading and velocity calculated, and classified by the perception system. Detected objects can be classified as vehicles, pedestrians and bicyclists; a detected object may also be classified as “other”, indicating an unknown object....
However, if the perception system changes the classification of a detected object, the tracking history of that object is no longer considered when generating new trajectories.
...
When the system detects an emergency situation, it initiates action suppression. This is a one-second period ...one second is a very long time in an emergency situation... during which the ADS suppresses planned braking while the (1) system verifies the nature of the detected hazard and calculates an alternative path, or (2) vehicle operator takes control of the vehicle. ATG stated that it implemented the action suppression process due to concerns about the developmental ADS identifying a false alarm—detection of a hazardous situation when none exists—causing the vehicle to engage in unnecessary extreme maneuvers.
Read more Hmmmm... ...false alarms, aka mistakes, are the problem here. Only with hindsight is a mistake identified. So waiting any amount of time expecting to correct a mistake has no value. If a 'better whatever' would not have made the mistake, why wasn't the 'better whatever' used in the first place? Mistakes (false alarms) tend to be transient (short lasting) rather than persistent. Plus, these systems re-evaluate everything 10 to 30 times a second. Doing nothing while waiting for the system to stop making mistakes might be a valid design; however, the brakes take time to activate. By the time a brake force actually begins to be applied, the system may well have stopped making mistakes, thus suspending the brake action, maybe even before any discomfort is felt by those inside the car. All good.
If the mistakes are so persistent that the brakes are not released and discomfort is felt in the car for no reason, then the system is simply not good enough to be used on public roadways. Unfortunately, it gets worse...
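The timing argument above can be made concrete with a toy model. Only the one-second suppression window comes from the report; everything else (a 10 Hz perception loop, a transient false alarm that clears after a few cycles, roughly 0.3 s for brake force to build once commanded) is an illustrative assumption:

```python
# Toy timeline of "action suppression" vs. brake-actuation latency.
# Only the 1-second suppression window comes from the report; the loop
# rate and actuation delay are illustrative assumptions.
CYCLES_PER_S = 10         # assumed 10 Hz perception loop
SUPPRESSION_CYCLES = 10   # the report's one-second suppression window
ACTUATION_S = 0.3         # assumed time for brake force to build

def time_to_brake_force(hazard_flags):
    """Seconds until brake force is applied, or None if the hazard flag
    clears (a transient false alarm) before the suppression window ends."""
    for n, flag in enumerate(hazard_flags, start=1):
        if not flag:
            return None              # alarm cleared; planned braking dropped
        if n >= SUPPRESSION_CYCLES:
            return n / CYCLES_PER_S + ACTUATION_S
    return None

transient = [True] * 3 + [False] * 20   # false alarm, gone after 0.3 s
persistent = [True] * 20                # a real, persistent hazard

print(time_to_brake_force(transient))    # None: no braking, no discomfort
print(time_to_brake_force(persistent))   # 1.3: force ~1.3 s after detection
```

Under these assumed numbers a transient false alarm indeed costs the rider nothing, but a real hazard waits a full 1.3 s for brake force, which is exactly the trade the commentary is questioning.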
... if the collision cannot be avoided with the application of the maximum allowed braking, the system is designed to provide an auditory warning to the vehicle operator while simultaneously initiating gradual vehicle slowdown. In such circumstance, ADS would not apply the maximum braking to only mitigate the collision. ...Unbelievable!! Did Uber really say ...crash faster??? Unbelievable!!...
Table 1. Selected parameters recorded by the ADS. ...Please read. This is really scary. Classification of objects is a very important objective of this algorithm, probably because the future actions of an object are assumed to be well correlated with object type. Given an object, its future behavior is assumed to be better anticipated by an historical pattern assigned to that object type rather than by what has been observed about its behavior since it was first sighted. That is why, if the classification of an object that is being tracked changes, its previous position and velocity data are disregarded. I doubt that this approach has any merit, even if one could reliably classify objects in real time. Seems like Uber tried to get fancy here but its capabilities were simply not up to the task. It couldn't reliably classify objects, and even if it did, historic behaviors are unlikely to be better predictors of future behavior than simple extrapolations of recent observations. Using recent tracking data has to be better than...
...ADS predicts the object’s path as static... if the object's classification changes. That is the big mistake! In fact, only two classifications are needed: OK to hit versus Do NOT hit. Track each object (position, velocity, ...) and use the tracking data to predict each object's future trajectory.
But there is more...
1.9 Interaction Between Volvo ADAS and ATG ADS
When the SUV was operated in a manual mode—controlled by a vehicle operator—all the Volvo ADAS components were active and operated as designed. When the SUV was operated in the autonomous mode—controlled by the ADS—all Volvo ADAS components were automatically disengaged.
...
1.10 Crash History of the ATG Fleet of ADS-Equipped Vehicles
ATG shared records of fleet crash history with NTSB investigators. The records showed that between September 2016 and March 2018 (excluding the current crash), there were 37 crashes and incidents involving ATG test vehicles which at the time operated in autonomous mode. Most of these crashes involved another vehicle striking the ATG test vehicle—33 such incidents; 25 of them were rear-end crashes, and in 8 crashes the ATG test vehicle was sideswiped by another vehicle.
In only two incidents, the ATG test vehicles were the striking vehicles. In one incident, the ATG vehicle struck a bent bicycle lane bollard that partially occupied the ATG test vehicle’s lane of travel. In another incident, the vehicle operator took control of the vehicle to avoid a rapidly approaching oncoming vehicle that entered the ATG vehicle’s lane of travel; the vehicle operator steered away and struck a parked car. In the remaining two incidents, an ATG vehicle was damaged by a passing pedestrian while the vehicle was stopped. ...To be fair, this is actually a really good crash history. My interpretation is that the crash probability of the Uber ATG vehicles with trained operators behind the wheel is almost 10 times lower than the driving public's overall crash probability. (I don't buy the argument that these cars cause the driving public to hit them because of the way the automated system drives.) Alain
T. Krisher, Nov. 7, "A National Transportation Safety Board report on the March 18, 2018 crash in Tempe, Arizona, that killed Elaine Herzberg, 49, found that the Uber self-driving system couldn’t determine if she was a pedestrian, vehicle or bicycle. It also could not predict that she was jaywalking in the path of the moving SUV. “The system design did not include a consideration for jaywalking pedestrians,” the agency said in its report, released ahead of a Nov. 19 board meeting to determine the cause of the Tempe crash. That, and the fact that Uber disconnected braking systems and relied on the human safety driver to stop the SUV in an emergency, shows that the Uber system wasn’t ready to be tested on public roads, experts say. Some say that stronger standards or more government regulation are needed to set standards before testing is allowed. “These have to be much better than that before they can go out there (on public roads),” said Alain Kornhauser, chair of autonomous vehicle engineering at Princeton University. “If you can’t do better than that, stay on your test tracks. Don’t come out in public.”
Consumer Reports said the report showed “outrageous safety lapses” by Uber and called for stronger rules governing autonomous vehicle testing. “We hope Uber has cleaned up its act, but without mandatory standards for self-driving cars, there will always be companies out there that skimp on safety,” Ethan Douglas, senior policy analyst for the magazine and website, said in a statement. “We need smart, strong safety rules in place for self-driving cars to reach their life-saving potential.”..." Read more Hmmmm... See above. See also Tim Lee's take on this. Alain
D. Etherington, Nov. 8, "Lyft has another year of building out its autonomous driving program under its belt, and the ride-hailing company has been expanding its testing steadily throughout 2019. The company says that it’s now driving four times more miles on a quarterly basis than it was just six months ago, and has roughly 400 people worldwide dedicated to autonomous vehicle technology development. Going into next year, it’s also expanding the program by adding a new type of self-driving test car to its fleet: Chrysler’s Pacifica hybrid minivan, which is also the platform of choice for Waymo’s current generation of self-driving car. The Pacifica makes a lot of sense as a ridesharing vehicle, as it’s a perfect passenger car with easy access via the big sliding door and plenty of creature comforts inside..." Read more Hmmmm... Sounds great! Alain
P. Dave, Nov 7, " Uber (UBER.N) said it “will likely” have to strike a licensing deal with Waymo or opt for costly changes to its autonomous driving software, after an expert found the ride-hailing giant still used technology from the Alphabet Inc (GOOGL.O) unit. While it was unclear by when the company needed to decide on its next move in the blockbuster trade secrets dispute, Uber, in a quarterly securities filing on Tuesday, said that a detour in its software development “could limit or delay our production of autonomous vehicle technologies.”...
Waymo told Reuters in a statement that the independent software expert’s findings “further confirm Waymo’s allegations that Uber misappropriated our software intellectual property. We will continue to take the necessary steps to ensure our confidential information is not being used by Uber.”..." Read more Hmmmm... Really??? That's all Uber had to do to get Waymo to give it a licensing deal on its technology? Doesn't seem that Waymo would want to do that unless the licensing deal was at least as valuable as the service that it enables. The price for a goose that lays golden eggs is the price of the goose plus the net present value of the eggs that it produces ad infinitum. Ironically, such pricing wouldn't help Uber reach profitability nor its ability to compete with Waymo One. See also Tim Lee's reporting of this. Alain
W. Feuer, Nov.8, "...The Alphabet subsidiary is removing employees at the facility and says it will offer them relocation to the company’s production center in Detroit, or to Phoenix, Arizona, where Waymo has a fleet of hundreds of self-driving cars. Those who do not wish to relocate will be offered a transition pay package. The move will affect fewer than 10 employees, but an undisclosed number of contractors also worked in the office. The relocations will not affect other Waymo employees around the country...
“Waymo is growing our investment and teams in both the Detroit and Phoenix areas, and we want to bring our operations teams together in these locations to best support our riders and our ride-hailing service,” a Waymo spokesperson said in a statement sent Friday to CNBC. “As a result we’ve decided to relocate all Austin positions to Detroit and Phoenix...." Read more Hmmmm... I guess that they weren't welcomed, even though they've tried to play nice for more than a few years. Waymo should send the team to Princeton, where we are working to create a welcoming environment for these mobility machines to affordably serve everyone, especially the mobility marginalized (the physically, mentally and/or financially challenged, the too young or too old to drive and those who prefer not to drive their own car), throughout Mercer County and eventually all of New Jersey. See the excellent NBC video that is embedded. Alain
T. Lee, Aug 28, "Uber lost another $1.1 billion in the third quarter of 2019, the company announced on Monday. This wasn't a surprise: Uber lost about the same amount in the first quarter of 2019 and lost even more last quarter. Yet the company argues that things aren't as bad as that headline figure suggests. To show why, Uber broke its earnings down by business area, distinguishing its core "rides" app from Uber Eats, Uber Freight, and other operations. Uber says that, if you exclude certain non-operating expenses—mainly interest, depreciation, and stock-based compensation—the "rides" app actually earned a substantial $631 million profit. That's enough to cover the company's core operating expenses, the company said. But Uber's profitability was dragged down by losses in its other businesses—mainly a $316 million loss from Uber Eats. Of course, interest, depreciation, and stock-based compensation are real costs. So the fact that Uber looks less unprofitable excluding them isn't going to be particularly reassuring to Uber investors." Read more Hmmmm... Also read the comments. Uber's stock price is not pretty. Neither is Lyft's. Alain
H. Somerfield, Nov. 6, "Uber Technologies Inc. shares hit an all-time low Wednesday as the "lockup" period following its May initial public offering ended, delivering a blow to a company that has struggled to satisfy investors. The expiration of the period sent a flood of shares onto the market, pushing the stock as low as $25.58, down 43% from its IPO price. While stock volatility and losses aren't unusual following lockup expirations, the loss of $2 billion from its market capitalization in 24 hours added injury to Uber after a lackluster earnings report this week..." Read more Hmmmm... Yup! Alain
Staff, Nov. 8, "...Tesla's total autonomous miles logged has grown exponentially from 0.1 billion in May 2016 to an estimated 1.88 billion as of October 2019...." Read more Hmmmm... It is unfair to compare Tesla's self-driving mileage with that of Waymo & GM/Cruise. The functionality of Tesla's system is very much inferior to that of Waymo & GM/Cruise. Nonetheless, the AutoPilot miles are impressive. Alain
A. Roy, Nov. 2, "Outlaw racer Alex Roy & roboticist Bryan Salesky sit with Sam Abuelsamid — Navigant Research Analyst, Forbes writer and one of the world’s leading experts on the automotive & technology sectors — to discuss and debunk the Top 10 Myths about Autonomous Vehicles...." Listen Hmmmm... Very interesting. Alain
G. Anadiotis, Nov 4, "...Autonomy and cloud don't go well together. To understand why, let's consider the notion of autonomy. Autonomy is defined as 'independence or freedom, as of the will or one's actions'. Can you be autonomous when relying on someone else's computer? Not really...." Read more Hmmmm... That's why Edge doesn't cut it either. Autonomy, by definition, needs its own on-board dedicated computing; else it's constrained to locations that have Edge, making it not so autonomous. Alain
I. Randall, Nov. 7, "An alarming video shows a 'smart summoned' driverless Tesla Model 3 car tentatively trying to find its owner — while going down the wrong side of the road. Stopping and starting — in the dead middle of the road at one point — the vehicle's ham-fisted driving is seen to attract the concerned attention of passersby. This latest worrying exhibition of driverless tech was filmed in a shopping centre parking lot in Richmond, British Columbia...." Read more Hmmmm... See video. So bad!! Please stop. Alain
November 20th-22nd, 2019
HILTON MIAMI DOWNTOWN
1601 BISCAYNE BLVD, MIAMI, FL 33132
F. Fishkin, May 18,, "From the 3rd Annual Princeton Smart Driving Car Summit, join Professor Alain Kornhauser and co-host Fred Fishkin. In this special edition, the summit's focus on mobility for all with guests Anil Lewis, Executive Director of Blindness Initiatives at the National Federation of the Blind and ITN America Founder Katherine Freund."
April 5, F. Fishkin, "The success of on demand transit company Via is proving that ride sharing systems can work. Public Policy head Andrei Greenawalt joins Princeton's Alain Kornhauser and co-host Fred Fishkin for a wide ranging discussion. Also: Uber, Tesla, Audi, Apple and Nuro are making headlines"
April 5, F. Fishkin, "Here comes congestion pricing in New York City...but what will it mean? Former city Taxi and Limousine Commission head and transportation expert Matthew Daus joins Princeton's Alain Kornhauser and co-host Fred Fishkin. Also...Tesla, VW and even Brexit! All on Episode 98 of Smart Driving Cars."
March 28, F. Fishkin, "The Future Networked Car? From Sweden, The Dispatcher publisher, Michael Sena, joins Princeton's Alain Kornhauser and co-host Fred Fishkin for the latest edition of Smart Driving Cars. Plus ...the Boeing story has much to do with autonomous vehicles and more. Tune in and subscribe."
F. Fishkin, Sept 6, "The coming new world of driverless cars! In Episode 55 of the Smart Driving Cars podcast former GM VP and adviser to Waymo Larry Burns chats with Princeton's Alain Kornhauser and Fred Fishkin about his new book "Autonomy: The Quest to Build the Driverless Car and How it Will Reshape Our World"
Elon, you sell cars to individuals at which point you relinquish control and responsibility, and thankfully, liability, for that car. Please do everything that you can to be certain that your cars are used responsibly at all times and that those individuals are alert and in control at all times; else, you'll re-acquire the responsibility and the liability. The burden of liability is not good for any business. Liability without control is TrainWreck. The regulators won't save you. Alain
- If you get matched with a fully driverless car, you'll see a notification in your Waymo app that confirms the car won't have a trained driver up front....
- you can enjoy having the car all to yourself....
R. Mitchell, Oct. 4, "Smart Summon is for parking lot use. But drivers have other ideas. Tesla unleashed the latest twist in driverless car technology last week, raising more questions about whether autonomous vehicles are outracing public officials and safety regulators.
...Using a smartphone, a person can now command a Tesla to turn itself on, back out of a parking space and drive to the smartphone holder's location - say at a curb in front of a Costco store.." Read more Hmmmm.... Russ, great article. A must read!
Elon, please stop. StupidSummon was a bad Valley-entitled idea before you released it. Now that it is out there, it will ruin all that is good about Tesla, AutoPilot and Driverless cars. The shorters are going to have a field day. While you are at it, also remove all of the DistractTainment add-ons, or limit their use when AutoPilot is NOT on and drivers are engaged in driving. Just go back to V09! Along the way, also get the Automated Emergency Braking (AEB) system to work properly (see NTSB below).
To do that, maybe you should take a serious look at Velodyne's new Tesla LiDAR. It may be able to tell you if the stationary object in the lane ahead is high enough above the road surface before your AEB system decides to disregard it. Then Teslas may stop decapitating drivers.
If you don't remove StupidSummon, then at least be sure to limit its use to the Tesla owner's own private property by responsible users. (You know the GPS coordinates of where each owner lives, so you can geofence it. You also know about each irresponsible use; you get the videos.) Irresponsible use (use in violation of the conditions spelled out in the user's manual) should void the feature's future availability in that car unless proper amends are made. If not, then insurance companies should clearly state that insuring the use of this feature requires a substantial additional premium; else, you're not covered. Courts should view use of this feature as implying premeditated harm and demonstrating an extreme indifference to human life. Parking lot owners should install signs forbidding the use of this feature on their property to protect themselves from being dragged into the claims process.
K. Korosec, Sept 16, "Waymo transported 6,299 passengers in self-driving ...drivered, not driverless... Chrysler Pacifica minivans in its first month participating in a robotaxi pilot program in California, according to a quarterly report the company filed with the California Public Utilities Commission. In all, the company completed 4,678 passenger trips in July — plus another 12 trips for educational purposes. It’s a noteworthy figure for an inaugural effort that pencils out to an average of 156 trips every day that month. And it demonstrates that Waymo has the resources, staff and vehicles to operate a self-driving vehicle pilot while continuing to test its technology in multiple cities and ramp up its Waymo One ride-hailing service in Arizona...
The CPUC authorized in May 2018 two pilot programs for transporting passengers in autonomous vehicles. The first one, called the Drivered Autonomous Vehicle Passenger Service Pilot program, allows companies to operate a ride-hailing service using autonomous vehicles as long as they follow specific rules. Companies are not allowed to charge for rides, a human safety driver must be behind the wheel and certain data must be reported quarterly. The second CPUC pilot would allow driverless passenger service — although no company has yet to obtain that permit...." Read more Hmmmm.... Be sure to look at the Waymo Quarterly Report and that of the other 3 companies: Zoox, AutoX and Pony.ai. Those 4 companies reported respectively [4,678; 103; 9; 0] vehicleTrips; [6,299; 134; 13; 0] personTrips; [59,917; 352; ?; 0] vehicleMiles, and [55; 10; 1; 0] number of unique vehicles used throughout the quarter. Note Waymo only began operating on July 2, the last month of the quarter [May, June, July]. Note: the CPUC does not permit casual shared-ride services (serving individuals or groups of individuals who weren't predisposed to travel together). Go figure??? Alain
Also note: This is Drivered Service, meaning there is an attendant/driver inside each vehicle for each trip; so this is actually conventional ride-hailing, a la Lyft/Uber with fancy schmancy vehicles. The CPUC did NOT require "disengagement reporting", so one has no idea as to the extent of driver/attendant involvement in the provision of the Drivered service. It will be interesting to learn if Waymo considers this activity to be part of its AV testing program and includes the disengagement performance of these vehicles in its disengagement report to the CA DMV at the end of the year. We'll be able to infer that the disengagement performance is exemplary when Waymo decides to begin Driverless service (w/o an attendant, as opposed to Drivered service).
1. Figure 4, The speed of the Tesla in the last 221 seconds before the crash showing that the Tesla was traveling rather slowly in the 100 seconds before the crash (under 20 mph), but then accelerated (as discussed above) in the 3 seconds just prior to the crash, beginning as soon as the lead SUV changed lanes,
2. Figure 5, the distance between the Tesla and its lead vehicle, showing that the TACC worked really well until the lead vehicle "disappeared" (changed lanes), and "...Data show that at about 490 msec before the crash, the system detected a stationary object in the path of the Tesla. At that time, the forward collision warning was activated; the system presented a visual and auditory warning. Data also show that the AEB did not engage and that there was no driver-applied braking or steering prior to the crash. According to Tesla, the AEB was active at the time of the crash, and considering that the stopped fire truck was detected about half a second before impact, there likely was not sufficient time to activate the AEB." ...This implies that the AEB and its functioning in collaboration with the TACC needs to be substantially re-evaluated/re-designed. Alain
3. Figure 6, which clearly depicts the movement of the Tesla relative to the lead vehicle and the firetruck in the 15 seconds before the crash. The Tesla's radar and front-facing camera must have "seen" the firetruck 4 seconds before the crash and on every sensing loop (1/10th of a second) during the last 4 seconds, yet...
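The "not sufficient time" conclusion quoted above can be checked with back-of-envelope kinematics. A quick sketch; the approach speed and deceleration are assumed round numbers (roughly in line with the speeds discussed around Figure 4), not values taken from the report:

```python
# Back-of-envelope kinematics for the 490 ms detection-to-impact interval.
# Speed and deceleration are assumed round numbers, not the report's values.
MPH_TO_MS = 0.44704
v = 30 * MPH_TO_MS        # assumed approach speed, ~13.4 m/s
t_detect = 0.490          # detection-to-impact time from the report
a = 0.8 * 9.81            # assumed hard-braking deceleration, ~0.8 g

travelled = v * t_detect       # distance covered between detection and impact
stopping = v ** 2 / (2 * a)    # distance a full stop from v would require
print(f"covered {travelled:.1f} m; a full stop needed {stopping:.1f} m")
# -> covered 6.6 m; a full stop needed 11.5 m
```

Under these assumptions, only about 6.6 m remained at detection while even instant maximum braking would have needed roughly 11.5 m, so a crash was unavoidable by then; the real question, as item 3 notes, is why detection came only half a second before impact.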
M. Isaac, Aug 27, "Anthony Levandowski was once one of Silicon Valley’s most sought after technologists. As a pioneer of self-driving car technology, he became a confidant of Larry Page, a co-founder of Google, and helped develop the search giant’s autonomous vehicles. Uber wooed him to gain an edge in self-driving techniques. Venture capitalists threw their money at him. But on Tuesday, Mr. Levandowski, 39, fell far from that favored stature. Federal prosecutors charged him with 33 counts of theft and attempted theft of trade secrets from Google. ... The criminal indictment against Mr. Levandowski from the United States Attorney’s Office for the Northern District of California opens a new chapter in a legal battle that has embroiled Google, its self-driving car spinoff Waymo and its rival Uber in the high-stakes contest over autonomous vehicles. The case also highlights Silicon Valley’s no-holds-barred culture, where gaining an edge in new technologies versus competitors can be paramount....
According to the indictment, Mr. Levandowski downloaded more than 14,000 files containing critical information about Google’s autonomous-vehicle research before leaving the company in 2016. He then made an unauthorized transfer of the files to his personal laptop, the indictment said. Mr. Levandowski joined Uber later that year when the ride-hailing firm bought his new self-driving trucking start-up, which was called Otto....
“The Bay Area has the best and brightest engineers, and they take big risks,” John Bennett, the F.B.I. special agent in charge of the San Francisco Division, said at a news conference on Tuesday. “But Silicon Valley is not the Wild West. The fast-paced and competitive environment does not mean federal laws do not apply.” Mr. Levandowski’s next court date is Sept. 4. If he is convicted, he could face a maximum of 10 years in prison, a $250,000 fine for every count and additional restitution. “All of us are free to move from job to job,” said David L. Anderson, United States attorney in the Northern District of California. “What we cannot do is stuff our pockets on the way out the door.”..." Read more Hmmm... Central to this technology is the perception of personal safety and trust. Lying, cheating & stealing can't be part of this industry, else it will never emerge from the venture stage. If DieselGate and the Uber crash weren't enough, let this be the next wake-up call to this industry to clean up its ethical behavior. Hopefully the FBI will also aggressively pursue all cyber attackers. It isn't cute, nor a virtual reality game. It is hard serious work and creativity focused on improving the quality of everyday life. Alain
J. Browne, Aug 16, "Autonomous vehicles are the future. Self-driving cars could change our lives, heralding an era of greater convenience, improved productivity and safer roads...." Read more Hmmmm.... Actually, much of this opening sentence is a myth... It doesn't take Self-driving or Driverless to have automation technology yield safer roads. It takes safe-driving technology that works, like Automated Emergency Braking (front and rear)... And ... are we really going to do our "manufacturing or service job" (increase "productivity") if we don't have to do the work of driving anymore??? Of the few "riding shotgun to work", what percentage are doing work while riding shotgun? Certainly less than 10%. Less than 1%? So much for productivity improvements.
If we get to Driverless, then the myths aren't myths. There will be fewer private cars, downtown congestion will be reduced, the environment will be saved, the insurance industry's gross revenues will go down substantially (but their profits will go up), and AVs are already safer than humans that text and/or are "under the influence" while driving.
If we don't get to Driverless, then we'll remain with "Do-it-yourself private mobility" that will include Self-driving assistance. Armed with that form of personal mobility, then all the myths are myths: More private cars ... and the policy implications are clear. See: J. M. Greenwald, A. L. Kornhauser "It’s up to us: Policies to improve climate outcomes from automated vehicles". Also, to have a proper perspective of the role of transportation and car/"FordF150s" in greenhouse gas emissions see... M. Sivak, Aug 22, "Increased relative contribution of medium and heavy trucks to U.S. greenhouse gas emissions" Alain
K. Conger, Aug 7, "Uber set two dubious quarterly records on Thursday as it reported its results: its largest-ever loss, exceeding $5 billion, and its slowest-ever revenue growth. The double whammy immediately renewed questions about the prospects for the company, the world’s biggest ride-hailing business. Uber has been dogged by concerns about sluggish sales and whether it can make money, worries that were compounded by a disappointing initial public offering in May. For the second quarter, Uber said it lost $5.2 billion, the largest loss since it began disclosing limited financial data in 2017. A majority of that — about $3.9 billion — was caused by stock-based compensation that Uber paid its employees after its I.P.O. Excluding that one-time expense, Uber lost $1.3 billion, or nearly twice the $878 million that it lost a year earlier. On that same basis and excluding other costs, the company said it expected to lose $3 billion to $3.2 billion this year... Lyft has also reported a series of deep losses. This week, it said it lost $644.2 million in the second quarter, though it added that it expected that amount to abate. Several months earlier, Lyft had also posted a particularly steep loss related to stock-based compensation payouts to its employees..." Read more Hmmmm.... No wonder Uber looked so good prior to its IPO; it hadn't "paid" its employees. So is this really a "one time" expense?? Anyway, Driverless is their only potential savior as a $40 stock. They can't afford to pay their employees, their gig workers can't feed families, new customers can't afford their prices and food delivery generates only chump change. Uber stock price. See also... Uber and Lyft keep losing money while driving up the number of cars on our overcrowded streets. Alain
Tesla, July 16, "At Tesla, we believe that technology can help improve safety. That's why Tesla vehicles are engineered to be the safest cars in the world. We believe the unique combination of passive safety, active safety, and automated driver assistance is crucial for keeping not just Tesla drivers and passengers safe, but all drivers on the road. It's this notion that grounds every decision we make, from the design of our cars, to the software we introduce, to the features we offer every Tesla owner. Model S, X and 3 have achieved the lowest probability of injury of any vehicle ever tested by the U.S. government's New Car Assessment Program.

... In the 2nd quarter, we registered one accident for every 3.27 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.19 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.41 million miles driven. By comparison, NHTSA's most recent data shows that in the United States there is an automobile crash every 498,000 miles...." Read more Hmmmm.... This summary uses "accident" for Teslas and "crash" for NHTSA. This may suggest that the Tesla and NHTSA figures are not comparable: Tesla is reporting about apples and NHTSA is referring to "oranges". That noted, however, it does seem that for Teslas with and without AutoPilot and the other active safety features, there is consistency in the measure. A more detailed question arises about the equivalence of the driving domain for each category, as well as who is at fault in each of these situations. Even in light of these issues and details, the large variation in the rates, 3.27 v 2.19 v 1.41, is very significant among Teslas. It seems as if AutoPilot and Tesla's other active collision avoidance safety features are improving the safety of Teslas. The spread from the roughly 0.5 value for NHTSA is really astonishing, making Teslas much safer than the average of all other cars. Unfortunately these numbers only scratch the surface and beg for more details. In the past I have called for an independent evaluation of the Tesla crash statistics and I do so again here today. I'll offer to do it. Tesla should encourage someone to do it. As it stands today, not enough people believe or trust Tesla (see below). That's unfortunate because improved safety is THE major objective of SmartDrivingCar technology. Alain
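The relative rates quoted above can be sanity-checked with a few lines of arithmetic. A minimal sketch: the figures are those reported by Tesla and NHTSA in the excerpt, and "accident" and "crash" are treated as comparable here only for illustration, which, as noted, they may not be:

```python
# Reported miles driven per incident (Q2 2019 figures quoted above).
MILES_PER_INCIDENT = {
    "Autopilot engaged": 3.27e6,
    "Active safety only": 2.19e6,
    "No Autopilot, no active safety": 1.41e6,
    "NHTSA national average": 0.498e6,
}

# Convert each interval to incidents per million miles (the inverse
# measure) and express it as a multiple of the NHTSA baseline.
baseline = MILES_PER_INCIDENT["NHTSA national average"]
for label, miles in MILES_PER_INCIDENT.items():
    per_million = 1e6 / miles
    multiple = miles / baseline
    print(f"{label}: {per_million:.2f} incidents per million miles, "
          f"{multiple:.1f}x the NHTSA interval")
```

On these numbers, the Autopilot interval is about 6.6 times the national average and about 2.3 times the interval for Teslas with neither Autopilot nor active safety features, which is the spread the commentary above calls significant.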
Oct 16, Establishes fully autonomous vehicle pilot program, A4573. Sponsors: Zwicker (D16); Benson (D14)

Oct 16, Establishes New Jersey Advanced Autonomous Vehicle Task Force, AJR164. Sponsors: Benson (D14); Zwicker (D16); Lampitt (D6)
May 24, "About 9:58 p.m., on Sunday, March 18, 2018, an Uber Technologies, Inc. test vehicle, based on a modified 2017 Volvo XC90 and operating with a self-driving system in computer control mode, struck a pedestrian on northbound Mill Avenue, in Tempe, Maricopa County, Arizona. ...The vehicle was factory equipped with several advanced driver assistance functions by Volvo Cars, the original manufacturer. The systems included a collision avoidance function with automatic emergency braking, known as City Safety, as well as functions for detecting driver alertness and road sign information. All these Volvo functions are disabled when the test vehicle is operated in computer control..." Read more Hmmmm.... Uber must believe that its systems are better at collision avoidance and Automated Emergency Braking than Volvo's. At least this gets Volvo "off the hook".
"...According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph..." (= 63 feet/second). So the system started "seeing" an obstacle when it was 63 x 6 = 378 feet away... more than a football field, including end zones!
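That distance estimate is just a unit conversion and a multiplication; a quick sketch of the arithmetic, using only the 43 mph and 6 second figures from the NTSB excerpt:

```python
# 1 mph = 5280 ft per 3600 s
MPH_TO_FTPS = 5280 / 3600

speed_ftps = 43 * MPH_TO_FTPS        # ~63.1 ft/s at 43 mph
detection_distance = speed_ftps * 6  # distance covered in the 6 s before impact
print(f"{speed_ftps:.1f} ft/s; first detection ~{detection_distance:.0f} ft out")
```

That reproduces the "more than a football field" figure: roughly 378 feet.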
"...As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path..." (NTSB: Please tell us precisely when it classified this "object" as a vehicle, and be explicit about the expected "future travel paths." Forget the path; please just tell us the precise velocity vector that Uber's system attached to the "object", then the "vehicle". Why didn't the Uber system instruct the Volvo to begin to slow down (or speed up) to avoid a collision? If these paths (or velocity vectors) were not accurate, then why weren't they accurate? Why was the object classified as a "vehicle"?? When did it finally classify the object as a "bicycle"? Why did it change classifications? How often was the classification of this object done? Please divulge the time and the outcome of each classification of this object. In the tests that Uber has done, how often has the system mis-classified an object as a "pedestrian" when the object was actually an overpass, or an overhead sign, or overhead branches/leaves that the car could safely pass under, or was nothing at all?? Basically, what are the false-alarm characteristics of Uber's Self-driving sensor/software system as a function of vehicle speed and time-of-day?)
"...At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision..." (Impact speed was 39 mph = 57.2 ft/sec, so 1.3 x 57.2 = 74.4 ft, which is about equal to the braking distance. So it still could have stopped short.)
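The "about equal to the braking distance" claim can be checked against the textbook stopping-distance formula v^2/(2a). A minimal sketch, assuming a dry-pavement emergency deceleration of about 0.7 g; that deceleration is my assumption for illustration, not a figure from the NTSB report:

```python
G = 32.2          # gravitational acceleration, ft/s^2
decel = 0.7 * G   # ASSUMED achievable braking deceleration, ~22.5 ft/s^2

v = 39 * 5280 / 3600         # 39 mph in ft/s, ~57.2
distance_left = v * 1.3      # ft travelled in the final 1.3 s, ~74.4
braking_dist = v**2 / (2 * decel)  # kinematic stopping distance, ~72.6 ft
print(f"distance remaining ~{distance_left:.1f} ft, "
      f"braking distance ~{braking_dist:.1f} ft")
```

Under that assumed deceleration, the braking distance (~73 ft) is indeed just inside the ~74 ft the vehicle had left when its own system called for emergency braking, which is the point of the commentary: the collision was arguably still avoidable at that moment.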
"...According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce (eradicate??) the potential for erratic vehicle behavior..." NTSB: Please describe/define "potential" and "erratic vehicle behavior". Also please uncover and divulge the design & decision process that Uber went through to decide that this risk (disabling the AEB) was worth the reward of eradicating "erratic vehicle behavior". This is fundamentally BAD design. If the Uber system's false alarm rate is so large that the best way to deal with false alarms is to turn off the AEB, then the system should never have been permitted on public roadways.
"...The vehicle operator is relied on to intervene and take action." Wow! If Uber's system fundamentally relies on a human to intervene, then Uber is nowhere near creating a Driverless vehicle. Without its own Driverless vehicle, Uber is past "peak valuation".
Video similar to part of Adam's Luncheon talk @ 2015 Florida Automated Vehicle Symposium on Dec 1. Hmmm ... Watch Video especially at the 13:12 mark. Compelling; especially after the 60 Minutes segment above! Also see his TipRanks. Alain
This list is maintained by Alain Kornhauser and hosted by Princeton University.