Draft Program, 4th Annual Princeton SmartDrivingCar Summit. Postponed until after June 30. More later.
A. Mueller, March 2020, "Currently available Level 2 driving automation has the potential to reduce crashes. However, there are known risks with drivers misusing these systems, particularly as they relate to drivers becoming disengaged from the driving task. The purpose of this paper was to summarize the human factors literature and make empirically supported design recommendations for Level 2 driving automation on the best methods to encourage driver engagement and communicate where the system can safely be used. Our recommendations pertaining to driver engagement concern driver monitoring systems that detect signs of driver disengagement, driver attention reminder methods, escalation processes, consequences for sustained noncompliance when monitoring systems have detected driver disengagement, and proactive methods for keeping drivers engaged with respect to driver-system interactions and system functionality considerations. We also provide guidance on how the operational design domain should be communicated and restricted. We advise you to consider these recommendations in a holistic context, as selectively adhering to only some may inadvertently exacerbate the dangers of driver disengagement and system misuse....
The operational design domain (ODD) for Level 2 automation systems should be clearly defined and communicated to the driver." Read more Hmmmm... This is very important and a MUST read for everyone. I'd even go farther by:
Alain
Mar. 13, "Few are aware that driverless 18-wheelers are already on the road. The test runs on highways have humans in them just in case sensors or computers fail, but an autonomous trucking executive says by next year, they won't. The future of freight on America's roads can be a driverless one, this executive says. And that's news to many, especially the truck drivers who stand to lose their livelihoods. 60 Minutes cameras ride aboard a test run and Jon Wertheim reports on the potential disruption to a storied American industry on the next edition of 60 Minutes, Sunday, March 15 at 7 p.m. ET/PT on CBS...." Read more

Hmmmm... they now "...have humans in them just in case...", next year they will also "...have humans in them just in case...", and the year after that... "...have humans...". There is an enormous benefit to the driver, the carrier and the shipper from automated driving aids for the long-haul trucking industry (Safe- and Self-driving trucks). What is being presented here are Self-driving trucks, where the professional driver remains attentive to the driving task, as IIHS has recommended above.

It is complete watch-bait that 60 Minutes is even suggesting that any of these trucks will be going down any interstate or any other public road driverlessly any time soon (no attendant on board anywhere). The risk to the surrounding public would simply be many times greater than the minuscule private benefit derived from saving the cost of the driver. No trucking company CEO is going to risk his job and the value of his stock options on such a stunt. TuSimple isn't going to bet its ranch on it either. Just because you might be able to do it doesn't mean that you will do it, especially when one screw-up loses the ranch. It's even doubtful that a "Level 3" operation (the driver is able to sleep while traveling within the Operational Design Domain (ODD)) will become operational in any significant way. There isn't even a reasonable business case for platooning except for maybe very limited niche situations.

Again, systems that improve the driver's work environment and possibly help her/him get another one or maybe even two hours of service are fantastic. Removing the driver altogether, not so much. Alain
C. Reinicke, Mar 2, "Tesla stock fell to a one-month low on the last trading day in February as coronavirus fears rattled global markets, snapping a record rally that had sent shares up more than 250% from October to their all-time high close of about $918 per share on February 19. It might still be too early to buy the dip, according to Morgan Stanley. "While the shares trade ~34% above our $500 target, we would wait and see if a challenging 1Q and supply disruptions come to pass before revisiting the stock," equity analyst Adam Jonas wrote in a Monday note, referring to Friday's closing price. He reiterated his "underweight" rating on shares of the company, which gained about 10% Monday. ..." Read more

Hmmmm... Tesla Stock Price. Alain
D. Proeber, Mar 6, "This promotional video by AutonomouStuff in Morton shows how it is bringing self-driving technology to life." Read more Hmmmm... See video. Alain
Mar 9, "A new AAA survey on automated vehicles reveals only one in 10 drivers (12%) would trust riding in a self-driving car. Even more Americans – 28% – don’t know how they feel about the technology, signaling that consumers are stuck in neutral on the road to accepting self-driving cars. AAA believes consumer sentiment of automated vehicles will be driven by tangible information on key issues and, equally important, quality education and experience....
Seven in 10 (72%) U.S. adults would feel safer riding in a self-driving car if they had the ability to take over control if something goes wrong. ..." Read more Hmmmm... Ability to take over??? Self-driving cars require you to continuously be paying attention and be ready and able to "take over" if something bad is starting to happen!!! Once again, the AAA didn't properly describe Self-driving cars, so it is impossible to draw any conclusions from this survey. Alain
F. Fishkin, May 18, "From the 3rd Annual Princeton Smart Driving Car Summit, join Professor Alain Kornhauser and co-host Fred Fishkin. In this special edition, the summit's focus on mobility for all with guests Anil Lewis, Executive Director of Blindness Initiatives at the National Federation of the Blind and ITN America Founder Katherine Freund."
April 5, F. Fishkin, "The success of on demand transit company Via is proving that ride sharing systems can work. Public Policy head Andrei Greenawalt joins Princeton's Alain Kornhauser and co-host Fred Fishkin for a wide ranging discussion. Also: Uber, Tesla, Audi, Apple and Nuro are making headlines"
April 5, F. Fishkin, "Here comes congestion pricing in New York City...but what will it mean? Former city Taxi and Limousine Commission head and transportation expert Matthew Daus joins Princeton's Alain Kornhauser and co-host Fred Fishkin. Also...Tesla, VW and even Brexit! All on Episode 98 of Smart Driving Cars."
March 28, F. Fishkin, "The Future Networked Car? From Sweden, The Dispatcher publisher, Michael Sena, joins Princeton's Alain Kornhauser and co-host Fred Fishkin for the latest edition of Smart Driving Cars. Plus ...the Boeing story has much to do with autonomous vehicles and more. Tune in and subscribe."
F. Fishkin, Sept 6, "The coming new world of driverless cars! In Episode 55 of the Smart Driving Cars podcast former GM VP and adviser to Waymo Larry Burns chats with Princeton's Alain Kornhauser and Fred Fishkin about his new book "Autonomy: The Quest to Build the Driverless Car and How it Will Reshape Our World"
Press release, Feb 6, "NHTSA announced today that it granted Nuro’s request for a temporary exemption from certain low-speed vehicle standard requirements. The exemption will allow the company to deploy its low-speed, occupantless electric delivery vehicle, the “R2.” Unlike a conventional low-speed vehicle, the R2 is designed to have no human occupant and operates exclusively using an automated driving system. “Since this is a low-speed self-driving delivery vehicle, certain features that the Department traditionally required – such as mirrors and a windshield for vehicles carrying drivers – no longer make sense,” said U.S. Secretary of Transportation Elaine L. Chao..." Read more

Hmmmm... this is: one small step. The bigger one will be for the GM/Cruise vehicle. Be sure to read the Supplemental Information. Details matter. Alain
Kyle Vogt, Jan 17, "In a few weeks the California DMV will release disengagements data from Cruise and other companies who test AVs on public roads. This data is really great for giving the public a sense of what’s happening on the roads. Unfortunately, it has also been used by the media and others to compare technology from different AV companies or as a proxy for commercial readiness. Since it’s the only publicly available metric, I don’t really blame them for using it. But it’s woefully inadequate for most uses beyond those of the DMV. The idea that disengagements give a meaningful signal about whether an AV is ready for commercial deployment is a myth. ..." Read more

Hmmmm... Amen! This is a MUST read. As with everything, details matter. It is true that figures don't lie, but it is easy to game systems such that figures, without the underlying details, do lie. As Kyle points out, there are important details associated with disengagements. These need to be well understood for disengagements to be a proxy for safety and market readiness. The when, where and associated details of each disengagement are critically important if the objective is safety and market readiness.

What is also most important here is the underlying objective of the companies doing the tests and reporting the data. Consider what has happened in our secondary education, where students are taught what is on the SATs and how to take them, rather than simply taught to learn. The objective is not learning, but getting 800s on the SATs so that they can get into 'Princeton'. This is perpetuated by the 'Princetons' of this world that don't look into the details of the student's academic qualities and capabilities. In the academic world, we know these students as 'box checkers', gamers of the college admission process. The gaming is continued by the 'banks and med schools' that use simplistic GPA (Grade Point Average, aka 'disengagements') cutoffs. The 'box checkers' then take 'underwater basket weaving' courses and become grade grubbers. It is lazy and irresponsible to use simplistic measures as proxies for very complex concepts such as intelligence, creativity, compatibility, and all the other details that make a good student, a good employee, a good citizen, a good mobility system.
In our case, testing is assumed to be about safety and market readiness; however, for some, it may be about trying to "make a silk purse out of a sow's ear" or "putting lipstick on the pig". It is easy to game the metric 'disengagements' by simply testing in easy places, under easy conditions, instead of really trying to find the corner/edge cases that you don't yet know, in the places and conditions of the Operational Design Domain that you are actually going to serve and make a business of with all this technology; rather than just trying to get good press, flip it to someone else, or put it on an academic shelf. The details would readily divulge the real objective of the company doing the testing.
I hope that Kyle, in his next post, will divulge what he, GM's lawyers and GM's board are requiring of his system for each of them to sign off and begin to operate an economically viable mobility service to the general public in some ODD. Each will demand that it be safe. The board will also demand that it be profitable. What details are they requesting that will make each comfortable signing on the bottom line? Alain

T. Lee, Jan. 10, "...In a Tuesday speech at the Consumer Electronics Show, Mobileye CEO Amnon Shashua made clear just how big of a strategic advantage this is. He laid out Mobileye's vision for the evolution of self-driving technology over the next five years. And he made it clear that he envisions Mobileye staying at the center of the industry...
In his Tuesday speech, Mobileye's Shashua calls ADAS systems with high-definition maps, like Super Cruise, "Level 2+"—a small step above regular ADAS systems that are called "Level 2" in the five-level SAE framework. A number of carmakers have developed similar systems. Shashua says Mobileye is supplying the technology for 70 percent of them, including systems from Nissan, Volkswagen, and BMW..." Read more
Hmmmm... This is all about Self-driving, just like Tesla's AutoPilot. It is not Driverless. A lot is made about HD maps that I simply don't appreciate. "...The company uses all this data to generate detailed, high-definition maps of the areas where the cars drive..." HD maps don't have any info on the other cars, pedestrians and so forth that are moving around you when you drive. Nor do they have the "stopped firetrucks" in your lane ahead. Call these things "half" of the things that you don't want to hit while driving down the road. You and I need something (cameras, radars) to sense these in real time as we move down the road. These things need to "see" everything around you (especially in front of you), which likely includes the things that are NOT in the HD maps. Moreover, by sensing them relative to "my nose", I only need "10 cm" accuracy, especially when I do this in real time 20 to 30 times a second.
Also, I don't really need to know where I am. I only care about objects relative to where I am. (Since I only care about my position relative to the static map data, I need to take the difference between my position and the position of the objects in the map data. The accuracy of that difference between the two values (my location and the object's location in the map data) is limited by the inferior accuracy of the two. Good luck at independently knowing your own position to centimeter accuracy every 20th or 30th of a second.) So "centimeter" accuracy in the HD data is totally useless, and the map need not be any more accurate than your independent positional accuracy. What is easier and better is to simply directly measure the relative positions (and velocities and accelerations and...) of everything, every time step or every few time steps, in (near) real-time, and disregard any of the "precision" in the map data, which is in any case incomplete and latent.
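The point about the difference of two positions can be sketched in a few lines. Assuming independent Gaussian errors (a textbook error-propagation assumption, not anything from the articles above), the errors of the two estimates add in quadrature, so the worse one dominates; the specific sigma values below are purely illustrative:

```python
import math

def relative_error(sigma_self, sigma_object):
    """Std. dev. of the difference of two independent position estimates.
    Errors add in quadrature, so the larger (worse) error dominates."""
    return math.sqrt(sigma_self**2 + sigma_object**2)

# Illustrative numbers (assumptions, not measured values):
sigma_gps = 0.30      # ~30 cm: self-localization error of the vehicle (m)
sigma_map = 0.01      # 1 cm: advertised "centimeter" HD-map accuracy (m)
sigma_sensor = 0.10   # 10 cm: direct camera/radar relative sensing error (m)

via_map = relative_error(sigma_gps, sigma_map)
print(f"object position via map + self-localization: {via_map:.3f} m")
print(f"object position via direct sensing:          {sigma_sensor:.3f} m")
```

With these numbers the map route gives roughly 0.30 m of relative error (the 1 cm map precision is swamped by the 30 cm localization error), while direct relative sensing at 10 cm is three times better, which is the argument the paragraph above is making.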
So, please, explain to me why I need super accurate info about the stationary things. It seems like an enormous amount of overhead to carry around when it is still up to the real-time sensing system to spot the stopped firetruck in the lane ahead. (Also, most folks, if they pay attention and behave, drive very safely without HD maps, with just Rand-McNally fold-out maps.)
Also, can you imagine how useless much of the real-time image data are (data is plural)? Everything that is moving in each frame is unique, never to happen precisely again. All of that needs to be purged, as does all of the non-"permanent" stuff like parked cars and "stopped firetrucks". One thing that our brains do very well is to forget (especially those of Steelers fans). In addition to "Optimal Learning" algorithms, we need some "Optimal Forgetting" algorithms. Alain
A. Kornhauser, Jan 12, Hmmmm... Self-driving cars are hot and the OEMs are responding. I'm about to buy a new Subaru Outback and EyeSight is standard. It is no longer just AutoPilot or expensive options that car salesmen don't sell. Car companies, as reflected in what is in showrooms and what was promoted at CES, have realized the comfort and convenience of Self-driving technology (cars that have a lot of the Safe-driving car features but also enable you to take your feet off the pedals and hands off the wheel, at least for short periods of time). These technologies are really becoming the 'chrome and fins' that sell cars to individuals in the 2020s. The momentum is all behind that happening and there is little Washington or Trenton or Princeton Council can do about it. Hopefully part of that momentum will be to make these systems actually work well, especially the Automated Emergency Braking systems (they MUST quit assuming that all stationary objects in the lane ahead can be passed under and consequently disregarding each of them; as Tesla is finding out, sometimes those objects are parked firetrucks), and to begin to put hard limits on over-speeding, tailgating and use while the driver is impaired. Self-driving cars are unfortunately going to lead to substantial urban sprawl, increased VMT, increased congestion and do nothing to help the energy and pollution challenges of our addiction to the personal automobile. Only 'Waymo-style Driverless' (autonomousTaxis (aTaxis)) tuned to entice ride-sharing can potentially stem the tide of ever more personal car ownership and ever-expanding urban sprawl. Alain
A. Kornhauser, Jan. 6, Hmmmm... I'm in rehab and hope to go home on Wednesday morning. Thank you to so many of you for all the good wishes and prayers. They each helped. I'm looking forward to making a full recovery. Remember, if you don't feel well, get evaluated by a doctor. I was totally clueless about what hit me from out of nowhere. Alain
autonomousTaxi (aTaxi) stop facilitating true ride-sharing to any destination within the autonomous transit system's Operational Design Domain. The first of what may well become a half million or so others, each strategically located to be less than a 5-minute walk from essentially any of the billion or so person trip ends made on any typical day in the USA (outside of Manhattan, whose subway stations provide comparable accessibility). Twenty million or so aTaxi vehicles could readily provide on-demand, shared-ride mobility from these ~0.5M aTaxi stops. Provided would be essentially the same 24/7 on-demand level-of-service as we provide for ourselves with our own conventional automobiles; however, this mobility would be affordably achieved using half the energy, creating half the pollution, eliminating essentially all the congestion, doubling conventional transit ridership and making such improved mobility available to those who today can't or wish not to drive a conventional automobile. This is a MAJOR 1st. Alain
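The fleet-sizing arithmetic implied above can be checked quickly. All of the numbers below are the round figures from the paragraph itself (a billion daily trip ends, ~0.5M stops, ~20M vehicles), used purely for illustration, not as a demand model:

```python
# Round figures from the paragraph above (illustrative only)
trip_ends_per_day = 1_000_000_000   # ~1 billion daily person trip ends in the USA
atax_stops = 500_000                # ~0.5M aTaxi stops, each within a 5-minute walk
fleet_size = 20_000_000             # ~20 million aTaxi vehicles

trips_per_vehicle = trip_ends_per_day / fleet_size   # trips served per vehicle per day
trip_ends_per_stop = trip_ends_per_day / atax_stops  # daily trip ends per stop

print(f"{trips_per_vehicle:.0f} trips/vehicle/day")   # 50
print(f"{trip_ends_per_stop:.0f} trip ends/stop/day") # 2000
```

At ~50 trips per vehicle per day, each vehicle averages roughly two trips an hour around the clock, which is the utilization scale at which shared, on-demand service starts to look economically plausible.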
R. Wile, Nov 22, "Sen. Jeff Brandes (R-St. Petersburg) had just finished serving in the Army, and was looking to make a name for himself in Tallahassee as a junior representative. He came across a talk given by the founder of Google’s driverless car project. He quickly realized the potential of self-driving cars to transform many aspects of daily life. Ever since, he has made it his mission to turn Florida into what he calls “an angel investor” in automation policy. “We want to have policies in place for this technology to flourish,” Brandes said in an interview at the 7th Annual Florida Automated Vehicles conference in Miami, which concluded Friday."
R. Mitchell, Oct. 4, "Smart Summon is for parking lot use. But drivers have other ideas. Tesla unleashed the latest twist in driverless car technology last week, raising more questions about whether autonomous vehicles are outracing public officials and safety regulators. ...Using a smartphone, a person can now command a Tesla to turn itself on, back out of a parking space and drive to the smartphone holder's location - say at a curb in front of a Costco store..." Read more Hmmmm....
Russ, great article. A must read!

Elon, please stop. StupidSummon was a bad Valley-entitled idea before you released it. Now that it is out there it will ruin all that is good about Tesla, AutoPilot and Driverless cars. The shorters are going to have a field day.

While you are at it, also remove all of the DistractTainment add-ons, or limit their use when AutoPilot is NOT on and drivers are engaged in driving. Just go back to V09! Along the way, also get the Automated Emergency Braking (AEB) system to work properly (see NTSB below). To do that, maybe you should take a serious look at Velodyne's new Tesla LiDAR. It may be able to tell you if the stationary object in the lane ahead is high enough above the road surface before your AEB system decides to disregard it. Then Teslas may stop decapitating drivers.
If you don't remove StupidSummon, then at least be sure to limit its use to the Tesla owner's own private property by responsible users. (You know the GPS coordinates of where each owner lives, so you can geofence it. You also know each irresponsible use; you get the videos.) Irresponsible use (use in violation of the conditions spelled out in the user's manual) should void the feature's future availability in that car unless proper amends are made. If not, then insurance companies should clearly state that insuring the use of this feature requires a substantial additional premium; else, you're not covered. Courts should view use of this feature as implying premeditated harm and demonstrating an extreme indifference to human life. Parking lot owners should install signs forbidding the use of this feature on their property to protect themselves from being dragged into the claims process.
Oct 16, Establishes fully autonomous vehicle pilot program, A4573. Sponsors: Zwicker (D16); Benson (D14)
Oct 16, Establishes New Jersey Advanced Autonomous Vehicle Task Force, AJR164. Sponsors: Benson (D14); Zwicker (D16); Lampitt (D6)
May 24, "About 9:58 p.m., on Sunday, March 18, 2018, an Uber Technologies, Inc. test vehicle, based on a modified 2017 Volvo XC90 and operating with a self-driving system in computer control mode, struck a pedestrian on northbound Mill Avenue, in Tempe, Maricopa County, Arizona. ...The vehicle was factory equipped with several advanced driver assistance functions by Volvo Cars, the original manufacturer. The systems included a collision avoidance function with automatic emergency braking, known as City Safety, as well as functions for detecting driver alertness and road sign information. All these Volvo functions are disabled when the test vehicle is operated in computer control..." Read more

Hmmmm.... Uber must believe that its systems are better at avoiding collisions and at automated emergency braking than Volvo's. At least this gets Volvo "off the hook".
"...According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph..." (= 63 feet/second) So the system started "seeing" an obstacle when it was 63 x 6 = 378 feet away... more than a football field, including end zones!
"...As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path..." (NTSB: Please tell us precisely when it classified this "object" as a vehicle and be explicit about the expected "future travel paths." Forget the path; please just tell us the precise velocity vector that Uber's system attached to the "object", then the "vehicle". Why didn't the Uber system instruct the Volvo to begin to slow down (or speed up) to avoid a collision? If these paths (or velocity vectors) were not accurate, then why weren't they accurate? Why was the object classified as a "vehicle"? When did it finally classify the object as a "bicycle"? Why did it change classifications? How often was the classification of this object done? Please divulge the time and the outcome of each classification of this object. In the tests that Uber has done, how often has the system mis-classified an object as a "pedestrian" when the object was actually an overpass, or an overhead sign, or overhead branches/leaves that the car could safely pass under, or was nothing at all?? Basically, what are the false-alarm characteristics of Uber's Self-driving sensor/software system as a function of vehicle speed and time-of-day?)
"...At 1.3 seconds before impact, (impact speed was 39 mph = 57.2 ft/sec) the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision..." (1.3 x 57.2 = 74.4 ft, which is about equal to the braking distance. So it still could have stopped short.)
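The arithmetic behind these two distance estimates (first detection at 6 seconds, the emergency-braking call at 1.3 seconds) can be checked in a few lines. The deceleration figure of roughly 0.7 g used for the braking distance is my own assumption for dry pavement, not a number from the NTSB report:

```python
MPH_TO_FTPS = 5280 / 3600  # 1 mph = ~1.4667 ft/s

# First detection: 43 mph, 6 seconds before impact
v_detect = 43 * MPH_TO_FTPS          # ~63 ft/s
d_detect = v_detect * 6              # ~378 ft of warning distance

# Emergency-braking decision: 39 mph, 1.3 seconds before impact
v_brake = 39 * MPH_TO_FTPS           # 57.2 ft/s
d_remaining = v_brake * 1.3          # ~74.4 ft left when braking was deemed needed

# Braking distance from 39 mph at an assumed 0.7 g deceleration (dry pavement)
a = 0.7 * 32.2                       # ft/s^2
d_stop = v_brake**2 / (2 * a)        # ~72.6 ft

print(round(d_detect), round(d_remaining, 1), round(d_stop, 1))
```

Under that assumed deceleration, the stopping distance (~72.6 ft) fits just inside the ~74.4 ft remaining, consistent with the comment above that the vehicle still could have stopped short, though with essentially no margin.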
"...According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce (eradicate??) the potential for erratic vehicle behavior. ..." NTSB: Please describe/define "potential" and "erratic vehicle behavior". Also please uncover and divulge the design & decision process that Uber went through to decide that this risk (disabling the AEB) was worth the reward of eradicating "erratic vehicle behavior". This is fundamentally BAD design. If the Uber system's false-alarm rate is so large that the best way to deal with false alarms is to turn off the AEB, then the system should never have been permitted on public roadways.

"...The vehicle operator is relied on to intervene and take action." Wow! If Uber's system fundamentally relies on a human to intervene, then Uber is nowhere near creating a Driverless vehicle. Without its own Driverless vehicle, Uber is past "Peak valuation".
Video similar to part of Adam's Luncheon talk @ 2015 Florida Automated Vehicle Symposium on Dec 1. Hmmm ... Watch Video especially at the 13:12 mark. Compelling; especially after the 60 Minutes segment above! Also see his TipRanks. Alain
This list is maintained by Alain Kornhauser and hosted by Princeton University.