A. Kornhauser, Jan 6, Hmmmm...
I'm in rehab and hope to go home on
Wednesday morning. Thank you to so many of
you for all the good wishes and prayers.
They each helped. I'm looking forward to making a
full recovery. Remember, if you don't
feel well, get evaluated by a doctor. I
was totally clueless about what hit me
from out of nowhere. Alain
A. Pressman, Jan. 7, "Autonomous cars from
Waymo, owned by Google parent Alphabet, drove
10 million miles on public roads in about the
past year, doubling the company's self-driving
record of the prior 10 years, CEO John Krafcik
said on Monday.
"We're now beyond 20 million miles of fully
self-driving, like, really self driving,"
Krafcik said in an interview at Fortune's
Brainstorm Tech dinner in Las Vegas during the
CES conference. "You need to have a lot of
real world experience. There's no way to avoid
that. You must have it."..." Read
more Hmmmm... What is
fundamentally impressive is that Waymo
captured the data associated with
essentially each foot of those 20M miles
of travel (~100B feet). Those data,
some/most/essentially all are boring and
repetitive; however, can you imagine how
many golden nuggets are buried in all
that gravel and rock? It is those
nuggets that
contain the opportunity that is
absolutely necessary for this technology
to succeed. Congratulations and keep
mining those nuggets. It would be very
nice of you to share them with everyone
else so that they don't stumble;
however, that may be too much to ask.
But remember, if one stumbles, everyone
pays. Alain
A. Charlton, Dec 31, "So many vehicle
technology companies are exhibiting at the CES
technology show in Las Vegas next week, that
you'll count 32 of them before you even get to
the letter B.
Many of these are directly involved with the
business of autonomous driving, whether they
be builders of Lidar sensors, producers of
vehicle systems, or car manufacturers
themselves.
Daimler
suffers 'reality check' with reality of
autonomous cars...." Read
more Hmmmm... Disappointed
that I had to cancel my participation,
but it is nice that Daimler 'suffers
reality check' because what they
demonstrated several years ago was very
much an elitist toy for those 0.1 %ers
that already have more mobility choices
than they deserve. Instead, they should
be working on mobility machines that
substantially enhance the quality of
life of the so many that Daimler has
left behind over the years. Alain
L. Fabian, Jan. 6, "Only a determined Mode
Shift will save us! Yet nowhere in the US or
the rest of the world are officials determined
to move cities and society at large away from
highway-oriented policies which assume that
basic transport and public safety needs are
met by gasoline-powered street-running
vehicles...." Read
more Hmmmm... Enjoy reading.
Proper land use and ride-sharing go
hand-in-hand. Alain
B. Reimer, Jan. 5, "... The safety of
Autopilot continues to be questioned by
experts worldwide. A recent letter from
Senator Markey to Elon Musk adds to an already
heated conversation. In a post earlier this
year on the Tesla automation strategy, I
raised the question of whether it is
appropriate for consumers to be used as test
subjects, the need for a well validated
measure of risk associated with the use of
Autopilot, and the need for camera-based
driver monitoring to manage inattention. In
the months since, visuals in the media of
drivers falling asleep, engrossed in
non-driving activities, and otherwise
participating in inattentive driving continue
to appear. Seemingly random, but realistically
predictable, crashes keep happening that might
be preventable with camera-based driver
monitoring and driver management...." Read
more Hmmmm... The simple
answer is... Sure! However, some/many
of those videos are fake and the
fundamental problem is that the
automated emergency braking (AEB) system
on Teslas, and essentially every other
make of cars, simply doesn't work as
well as it should. It doesn't deal
appropriately with stationary objects in
the lane ahead and the false alarm rates
are way too high. Getting the AEB
system to actually work throughout the
operational design domain of AutoPilot
is a much better thing to focus on. All
the Tesla owners that I know are not at
all cavalier about their use of
Autopilot and treat the system with
utmost respect and care. The very few
that mis-use AutoPilot are unlikely to
change their ways by what is discussed
here, so such policies will have
minuscule impact. Focus on fixing the
AEB in Teslas and essentially every
other car make and model. Alain
J. Crider, Jan. 5, "In a new
report by MIT’s Lex Fridman, we learn
that Tesla now has over 730,000 vehicles with
Autopilot Hardware 2 and 3 on the world’s
roads...." Read
more Hmmmm... This is a
non-trivial accomplishment not only in
terms of implementation of a safety
technology but also in terms of the
data/information flow that such a
penetration affords Tesla. Hopefully
Tesla will either release or have
independent researchers substantively
evaluate the safety implications of the
system. The sample size is so large
that major safety implications could
readily be addressed. Alain
T. Krishner, Jan. 3, "... On Dec. 29, a
Tesla Model S sedan left a freeway in Gardena,
Calif., at a high speed, ran a red light and
struck a Honda Civic, killing two people
inside, police said. On the same day, a
Tesla Model 3 hit a parked firetruck on an
Indiana freeway, killing a passenger in the
Tesla...." Read
more Hmmmm... Ultimately
tragic and my sincere condolences go out
to all those touched by these two
tragedies. Unfortunately there is and
has been nothing in AutoPilot that keeps
Teslas or any other car or truck make
and model from running red lights (I
agree that there should be ASAP) and I
have been pointing out in this eLetter
the limitations of Tesla's, and every
other car or truck make or model's,
Automated Emergency Braking (AEB) system
in properly addressing the "stationary
object in the lane ahead" problem. NTSB
and NHTSA have known (or should have
known) about this problem since the Joshua
Brown crash. To my knowledge,
they've done nothing about it. I
suspect that for NTSB/NHTSA to recall
one AEB system, they'd have to recall
all of them. I do fault Tesla for not
fixing this code and over-the-air
updating it to solve it for Teslas.
Apparently, the fix is not so easy
because the false alarm issue
substantially interacts with
"ride-quality". Seems to me that safety
should trump ride-quality. Alain
R. Mitchell, Jan. 3, "Tesla Inc. said it
delivered 367,500 vehicles in 2019, achieving
the sales forecast issued by Chief Executive
Elon Musk and propelling its stock to a new
record high. The Silicon Valley automaker
finished 2019 on an upswing. Last year it
opened a new manufacturing plant in China,
scored a rare profit in the third quarter, and
saw its stock price soar to record highs.
Tesla needed to deliver 104,000 vehicles in
the fourth quarter to meet Musk’s 2019
forecast of between 360,000 and 400,000
vehicles.
Tesla delivered 112,000 vehicles in the fourth
quarter. Of those, 92,550 were Model 3s, a
growth rate of 16.3% over the previous quarter
and 46% above the fourth quarter of 2018. The
more expensive and aging Model S sedan and
Model X SUV models saw a combined 19,450
deliveries for the quarter. That’s 11.8%
higher than the previous quarter, but 37%
below the fourth quarter of 2018...." Read more
Hmmmm... A non-trivial accomplishment
in the marketplace. Alain
T. Lee, Dec. 19, "On Thursday, Tesla stock
rose above $400 for the first time in the
company's history. The record price caps a
year—and a decade—when Tesla proved its
doubters wrong.
At the start of 2010, Tesla had produced fewer
than 1,000 units of the high-priced Roadster.
The Model S was years away. The firm's
finances were still precarious, having
narrowly escaped bankruptcy in the final days
of 2008. Few would have guessed that Tesla was
poised to become a major automaker. Indeed,
over the last decade, people repeatedly
predicted that the company would run out of
money and be unable to raise more. They
doubted that Tesla could deliver new car
models on time—or at all. They said that
quality problems and missed deadlines would
sour customers on the Tesla brand.
But Tesla has proved these critics wrong. It's
true that the company has repeatedly missed
deadlines and has sometimes shipped cars with
quality problems. But those setbacks have had
little impact on its customers' enthusiasm for
the company. Compelling features like instant
acceleration, over-the-air software updates,
and Tesla's vast supercharger network have
been enough to convince hundreds of thousands
of fans to overlook the company's flaws and
open their wallets...." Read
more Hmmmm... Yup!
F. Lambert, Dec 30, "... A Model 3 driver
going by Dougal Vlogs on YouTube was driving
on Autopilot while filming from inside his car
with a handheld camera. While filming, he
crashed his car, which he described in the
video description..." Read
more Hmmmm... In the
performance of any safety analysis
one must discount the "15 seconds
of fame" seekers and the fakes. The
fact that he lived demonstrates
the efficacy of the Model 3's
structural design. Tesla, thank
you for that contribution. Alain
J. Yoshida, Dec 26, "In 2020, a host of new
industry standardization initiatives for
artificial intelligence will roll out in
earnest. Their common mission is to develop
safety standards for AI-driven systems in
autonomous vehicles and robotics...." Read
more Hmmmm... Wait a
minute... You still don't have Automated
Emergency Braking systems that actually
work throughout the normal driving
operational domain... brake the car
before it hits something in the lane ahead
and whose false alarm rate is essentially
zero. No need to move on to other AV
"Safety Standards" until you've fixed this
one which is less than half-baked. Alain
Too many to print...
StupidSummon is competing with cat videos
as common click-bait.
F. Fishkin, May 18, "From the 3rd Annual Princeton Smart Driving Car Summit, join Professor Alain Kornhauser and co-host Fred Fishkin. In this special edition, the summit's focus on mobility for all with guests Anil Lewis, Executive Director of Blindness Initiatives at the National Federation of the Blind and ITN America Founder Katherine Freund."
April 5, F. Fishkin, "The success of on demand transit company Via is proving that ride sharing systems can work. Public Policy head Andrei Greenawalt joins Princeton's Alain Kornhauser and co-host Fred Fishkin for a wide ranging discussion. Also: Uber, Tesla, Audi, Apple and Nuro are making headlines"
April 5, F. Fishkin, "Here comes congestion pricing in New York City...but what will it mean? Former city Taxi and Limousine Commission head and transportation expert Matthew Daus joins Princeton's Alain Kornhauser and co-host Fred Fishkin. Also...Tesla, VW and even Brexit! All on Episode 98 of Smart Driving Cars."
March 28, F. Fishkin, "The Future Networked Car? From Sweden, The Dispatcher publisher, Michael Sena, joins Princeton's Alain Kornhauser and co-host Fred Fishkin for the latest edition of Smart Driving Cars. Plus ...the Boeing story has much to do with autonomous vehicles and more. Tune in and subscribe."
F. Fishkin, Sept 6, "The coming new world of driverless cars! In Episode 55 of the Smart Driving Cars podcast former GM VP and adviser to Waymo Larry Burns chats with Princeton's Alain Kornhauser and Fred Fishkin about his new book "Autonomy: The Quest to Build the Driverless Car and How it Will Reshape Our World"
A. Kornhauser, Dec 30, Hmmmm...On about December 2, I became ill with what I thought was just a cold. I went through the week teaching my classes, did PodCast, etc. but finally, at the insistence of my wife Elizabeth, I went to my doctor at 2 pm, Friday Dec. 6. He immediately told me to go to the emergency ward of the hospital. For the next 16 days a team of doctors, Elizabeth, her father, Laura, Michelle, ... brought me back to life. On 12/25 I was released to a rehab facility to relearn how to walk, etc. Unfortunately, on Dec. 28 my fever came back and I was returned to the hospital where they've gotten control of the fever. Indications are that I'll be in the hospital for a week or so while they track this down. Just an obvious recommendation to all... Don't mess with kidney failure, acute pneumonia, and/or bacterium Legionella. All the best for 2020. Alain
An autonomousTaxi (aTaxi) stop facilitating true ride-sharing to any destination within the autonomous transit system's Operational Design Domain. The first of what may well become a half million or so others. Each strategically located to be less than a 5 minute walk from essentially any of the billion or so person trip ends that are made on any typical day in the USA (outside of Manhattan, whose subway stations provide comparable accessibility). Twenty million or so aTaxi vehicles could readily provide on-demand, shared-ride mobility from these ~0.5M aTaxi stops. Provided would be essentially the same 24/7 on-demand level-of-service as we provide for ourselves with our own conventional automobiles; however, this mobility would be affordably achieved using half the energy, creating half the pollution, eliminating essentially all the congestion, doubling conventional transit ridership and making such improved mobility available to those who today can't or wish not to drive a conventional automobile. This is a MAJOR 1st. Alain
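A back-of-the-envelope check of the sizing in that description (all inputs are the rounded figures quoted above, so the outputs are only order-of-magnitude estimates):

```python
# Rough sizing check for the proposed nationwide aTaxi system,
# using the rounded figures from the text above.
person_trips_per_day = 1_000_000_000   # ~1B daily US person-trip ends
stops = 500_000                        # ~0.5M aTaxi stops
vehicles = 20_000_000                  # ~20M aTaxi vehicles

trips_per_stop = person_trips_per_day / stops        # trip ends served per stop per day
trips_per_vehicle = person_trips_per_day / vehicles  # trips served per vehicle per day

print(int(trips_per_stop), int(trips_per_vehicle))   # 2000 trip ends/stop/day, 50 trips/vehicle/day
```

At roughly 2,000 trip ends per stop per day and 50 trips per vehicle per day, neither figure is outlandish for an on-demand shared-ride fleet, which is the point of the sketch.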
R. Wile, Nov 22, "Sen.
Jeff Brandes (R-St. Petersburg) had
just finished serving in the Army, and
was looking to make a name for himself
in Tallahassee as a junior
representative. He came across a talk
given by the founder of Google’s
driverless car project.
He quickly realized the potential of
self-driving cars to transform many
aspects of daily life. Ever since, he
has made it his mission to turn
Florida into what he calls “an angel
investor” in automation policy. “We
want to have policies in place for
this technology to flourish,” Brandes
said in an interview at the 7th Annual
Florida Automated Vehicles conference
in Miami, which concluded Friday.
A. Karpathy, Nov. 6, "Hear from Andrej Karpathy on how Tesla is using PyTorch to develop full self-driving capabilities for its vehicles, including AutoPilot and Smart Summon. ..." Read more Hmmmm... Worth watching the video, (except for the StupidSummon part) Alain
Elon, you sell cars to individuals, at which point you relinquish control and responsibility, and thankfully, liability, for that car. Please do everything that you can to be certain that your cars are used responsibly at all times and that those individuals are alert and in control at all times; else, you'll re-acquire the responsibility and the liability. The burden of liability is not good for any business. Liability without control is a TrainWreck. The regulators won't save you. Alain
- If you get matched with a fully driverless car, you'll see a notification in your Waymo app that confirms the car won't have a trained driver up front....
- you can enjoy having the car all to yourself....
R. Mitchell, Oct. 4, " Smart
Summon is for parking lot use. But drivers
have other ideas.
Tesla unleashed the latest twist in driverless car technology last week, raising more questions about whether autonomous vehicles are outracing public officials and safety regulators.
...Using a smartphone, a person
can now command a Tesla to turn itself on,
back out of a parking space and drive to the
smartphone holder's location - say at a curb
in front of a Costco store.." Read
more Hmmmm.... Russ, great
article. A must read!
Elon, please
stop. StupidSummon was a bad
Valley-entitled idea before you
released it. Now that it is out
there it will ruin all that is
good about Tesla, AutoPilot and
Driverless cars. The shorters are
going to have a field day.
While you are at
it also remove all of the
DistractTainment add ons or limit
their use when AutoPilot is NOT on
and drivers are engaged in
driving. Just go back to V09!
Along the way also get the
Automated Emergency Braking (AEB)
system to work properly (See NTSB
below). To do that, maybe you
should take a serious look at Velodyne's new
Tesla LiDAR. It may be able
to tell you if the stationary
object in the lane ahead is high
enough above the road surface before
your AEB system decides to
disregard it. Then Teslas may stop decapitating drivers.
If you don't remove
StupidSummon then at least be sure to
limit its use to the Tesla owner's own
private property by responsible users.
(You know the GPS coordinates of where
each owner lives, so you can geofence
it. You also know each irresponsible
use (You get the videos). Irresponsible
use (use in the violation of the
conditions spelled out in the user's
manual) should void its future
availability in that car unless proper
amends are made. If not, then insurance
companies should clearly state that
insuring the use of this feature
requires a substantial additional
premium; else, you're not covered.
Courts should view that use of this
feature implies premeditated harm and
demonstrates an extreme indifference to
human life. Parking Lot owners should
install signs forbidding the use of this
feature on their property to protect
themselves from being dragged into the
claims process.
K. Korosec, Sept 16, "Waymo transported 6,299
passengers in self-driving ...drivered, not
driverless...Chrysler
Pacifica minivans in its first month
participating in a robotaxi pilot program in
California, according to a quarterly report the
company filed with the California Public
Utilities Commission.
In all, the company completed 4,678 passenger
trips in July — plus another 12 trips for
educational purposes. It’s a noteworthy figure
for an inaugural effort that pencils out to an
average of 156 trips every day that month. And
it demonstrates that Waymo has the resources,
staff and vehicles to operate a self-driving
vehicle pilot while continuing to test its
technology in multiple cities and ramp up its
Waymo One ride-hailing service in Arizona...
The CPUC authorized in May 2018 two pilot
programs for transporting passengers in
autonomous vehicles. The first one, called the
Drivered Autonomous Vehicle Passenger
Service Pilot program, allows companies to
operate a ride-hailing service using autonomous
vehicles as long as they follow specific rules.
Companies are not allowed to charge for rides, a
human safety driver must be behind the wheel and
certain data must be reported quarterly.
The second CPUC pilot would allow driverless
passenger service — although no company has yet
to obtain that permit...."Read
more Hmmmm.... Be sure to look
at the Waymo
Quarterly Report and that of the
other 3 companies: Zoox,
AutoX
and Pony.ai.
Those 4 companies reported, respectively:

            vehicleTrips   personTrips   vehicleMiles   unique vehicles
Waymo           4,678         6,299        59,917            55
Zoox              103           134           352            10
AutoX               9            13             ?              1
Pony.ai             0             0             0              0

Note Waymo only began operating on July 2, the last month of the quarter [May, June, July]. Note: the CPUC does not permit casual shared-ride services (serving individuals or groups of individuals who weren't predisposed to travel together). Go figure??? Alain
Also note: This is Drivered Service, meaning there is an attendant/driver inside each vehicle for each trip; so this is actually conventional ride-hailing, a la Lyft/Uber with fancy schmancy vehicles. The CPUC did NOT require "disengagement reporting", so one has no idea as to the extent of driver/attendant involvement in the provision of the Drivered service. It will be interesting to learn if Waymo considers this activity to be part of its AV testing program and includes the disengagement performance of these vehicles in its disengagement report to the CA DMV at the end of the year. We'll be able to infer that the disengagement performance is exemplary when Waymo decides to begin Driverless service (w/o an attendant, as opposed to Drivered service).
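As a quick sanity check on the per-day arithmetic in the CPUC figures above (the 30-day divisor is an assumption, July 2 through July 31 inclusive):

```python
# Sanity check on the CPUC quarterly figures quoted above.
# The 30-day operating window is an assumption (Waymo began July 2).
reports = {
    "Waymo":   {"vehicle_trips": 4678, "person_trips": 6299, "vehicle_miles": 59917, "vehicles": 55},
    "Zoox":    {"vehicle_trips": 103,  "person_trips": 134,  "vehicle_miles": 352,   "vehicles": 10},
    "AutoX":   {"vehicle_trips": 9,    "person_trips": 13,   "vehicle_miles": None,  "vehicles": 1},
    "Pony.ai": {"vehicle_trips": 0,    "person_trips": 0,    "vehicle_miles": 0,     "vehicles": 0},
}

days = 30
waymo = reports["Waymo"]
trips_per_day = waymo["vehicle_trips"] / days                      # ~156/day, matching the article
riders_per_trip = waymo["person_trips"] / waymo["vehicle_trips"]   # ~1.35 persons per vehicle trip
miles_per_trip = waymo["vehicle_miles"] / waymo["vehicle_trips"]   # ~12.8 miles per vehicle trip

print(round(trips_per_day), round(riders_per_trip, 2), round(miles_per_trip, 1))
```

The ~1.35 persons per vehicle trip is worth noticing: under the CPUC's no-casual-shared-rides rule, that ratio reflects only people already traveling together.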
- "Vehicle Data Recorder Specialist's Factual Findings": especially the charts of "Vehicle drive mode information": Figure 1 for the hour leading up to the crash and Figure 2 for just the 15 minutes prior. It is very interesting to have the precision and richness of data of the vehicle's behavior prior to the crash. Armed with this information, no wonder Elon wants to insure these cars. What is most interesting about these data is the chart of Lead Vehicle Distance (m). It shows that "lead vehicle distance" is not the instantaneous value obtained by the radar but some smoothed-out value of {previous readings plus the latest radar value} (else, there would be some discrete jumps in the data when other cars either cut in or cut out of the Tesla's lane ahead). Moreover, the appearance of a stationary object (approach speed = Tesla speed) in the lane ahead is disregarded (or very lightly weighted) in the determination of "lead vehicle distance": it grew to its saturated value (much greater than the distance to the firetruck) once the lead SUV had changed lanes, whenever that was determined to have occurred. At some point (possibly 490 msec before the crash, see below), the system decided that the stationary object detected ahead was not a "false reading" but actually a stationary object that should no longer be disregarded. Since it was being disregarded, the Traffic Aware Cruise Control (TACC) operated using a large value for "lead vehicle distance", so it began to accelerate to its desired cruise speed, as would be expected if "lead vehicle distance" is a large value. Yipe!!!!! If Elon hasn't already demanded it, NTSB should require Tesla, and all other manufacturers, to: 1. substantially improve the software/logic governing TACC's behavior during transitions involving a cut-out or a cut-in, and 2. substantially improve the reliability of the identification of stationary objects in the lane ahead so that they cease to be assumed to be false alarms.
- "Vehicle Automation Data Summary Report": especially:
1. Figure 4, The speed of the Tesla in the last 221 seconds before the crash showing that the Tesla was traveling rather slowly in the 100 seconds before the crash (under 20 mph), but then accelerated (as discussed above) in the 3 seconds just prior to the crash, beginning as soon as the lead SUV changed lanes,
2. Figure 5, the distance between the Tesla and its lead vehicle, showing that the TACC worked really well until the lead vehicle "disappeared" (changed lanes), and "... Data show that at about 490 msec before the crash, the system detected a stationary object in path of the Tesla. At that time, the forward collision warning was activated; the system presented a visual and auditory warning. Data also shows that the AEB did not engage and that there was no driver-applied braking or steering prior to the crash. According to Tesla, the AEB was active at the time of the crash, and considering that the stopped fire truck was detected about half a second before impact, there likely was not sufficient time to activate the AEB." ...This implies that the AEB and its functioning in collaboration with the TACC needs to be substantially re-evaluated/re-designed. Alain
3. Figure 6 which clearly depicts the movement of the Tesla relative to the lead vehicle and the Firetruck in the 15 seconds before the crash. The Tesla's radar and front-facing camera must have "seen" the firetruck 4 seconds before the crash and on every sensing loop (1/10th of a second) during the last 4 seconds, yet...
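The down-weighting behavior described above can be illustrated with a toy smoother. This is a sketch of the general idea only; the filter form, the weights, and the stationary-return down-weighting are all assumptions for illustration, not Tesla's actual algorithm:

```python
# Toy exponential smoother for "lead vehicle distance" that down-weights
# radar returns flagged as stationary -- illustrating how a stopped object
# in the lane ahead can be effectively ignored. NOT Tesla's actual logic;
# alpha and stationary_weight are made-up illustrative values.

def smooth_lead_distance(readings, alpha=0.3, stationary_weight=0.02):
    """readings: list of (radar_distance_ft, is_stationary) tuples."""
    est = readings[0][0]
    history = []
    for dist, stationary in readings:
        w = stationary_weight if stationary else alpha
        est = (1 - w) * est + w * dist   # stationary returns barely move the estimate
        history.append(round(est, 1))
    return history

# A lead SUV tracked at ~120 ft cuts out, revealing a stationary
# firetruck 80 ft ahead for the next ten sensing cycles.
scene = [(120, False)] * 5 + [(80, True)] * 10
print(smooth_lead_distance(scene))
# The estimate remains well above the true 80 ft gap ten cycles later,
# so a cruise controller fed this value would hold or raise speed.
```

Under these assumed weights the estimate is still above 112 ft ten cycles after the truck appears, which is the qualitative failure mode described in the findings: the controller "sees" a large gap and accelerates.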
Friday, August 30, 2019
Former Star Google and Uber Engineer Charged With Theft of Trade Secrets
Tesla, July 16, "At Tesla, we
believe that technology can help improve safety.
That’s why Tesla vehicles are engineered to be the
safest cars in the world. We believe the unique
combination of passive safety, active safety, and
automated driver assistance is crucial for keeping
not just Tesla drivers and passengers safe, but
all drivers on the road. It’s this notion that
grounds every decision we make – from the design
of our cars, to the software we introduce, to the
features we offer every Tesla owner.
Model S, X and 3 have achieved the lowest
probability of injury of any vehicle ever tested
by the U.S. government’s New Car Assessment
Program.
... In the 2nd quarter, we registered one accident for every 3.27 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.19 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.41 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 498,000 miles.... " Read more Hmmmm.... This summary uses "accident" for Teslas and "crash" for NHTSA. This may suggest that the Tesla and NHTSA figures are not comparable... Tesla is reporting about apples and NHTSA is referring to "oranges". That noted, however, it does seem that for Teslas with and without AutoPilot and the other active safety features, there is consistency in the measure. A more detailed question arises about the equivalence of the driving domain for each category as well as who is at fault in each of these situations. Even in light of these issues and details, the large variation in the rates, 3.27 v 2.19 v 1.41, is very significant among Teslas. Seems as if AutoPilot and Tesla's other active collision avoidance safety features are improving the safety of Teslas. The spread from the 0.5 value for NHTSA is really astonishing, making Teslas much safer than the average of all other cars. Unfortunately these numbers only scratch the surface and beg for more details. In the past I have called for an independent evaluation of the Tesla crash statistics and I do so again here today. I'll offer to do it. Tesla should encourage someone to do it. As it stands today, not enough people believe or trust Tesla (see below). That's unfortunate because improved safety is THE major objective of SmartDrivingCar technology. Alain
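Putting the quoted miles-per-incident figures side by side (the labels are mine; as noted above, Tesla's "accident" and NHTSA's "crash" may not measure the same thing, so the ratios are suggestive rather than conclusive):

```python
# Side-by-side comparison of the miles-per-incident figures quoted above.
# Caveat: "accident" (Tesla) and "crash" (NHTSA) may not be comparable.
miles_per_incident = {
    "Autopilot engaged":              3_270_000,
    "Active safety only":             2_190_000,
    "No Autopilot, no active safety": 1_410_000,
    "NHTSA all-vehicle average":        498_000,
}

baseline = miles_per_incident["NHTSA all-vehicle average"]
for label, miles in miles_per_incident.items():
    print(f"{label}: {miles / baseline:.1f}x the NHTSA baseline")
```

Even the worst Tesla category comes out at roughly 2.8x the NHTSA baseline, which is why the commentary stresses that an independent evaluation would be so valuable: the spread is too large to ignore and too opaque to trust as reported.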
Oct 16, Establishes
fully autonomous vehicle pilot program A4573
Sponsors: Zwicker (D16); Benson (D14)
Oct 16, Establishes
New
Jersey Advanced Autonomous Vehicle Task Force
AJR164 Sponsors: Benson (D14); Zwicker
(D16); Lampitt (D6)
May 24, "About 9:58 p.m., on
Sunday, March 18, 2018, an Uber Technologies,
Inc. test vehicle, based on a modified 2017
Volvo XC90 and operating with a self-driving
system in computer control mode, struck a
pedestrian on northbound Mill Avenue, in Tempe,
Maricopa County, Arizona.
...The vehicle was factory
equipped with several advanced driver assistance
functions by Volvo Cars, the original
manufacturer. The systems included a collision
avoidance function with automatic emergency
braking, known as City Safety, as well as
functions for detecting driver alertness and
road sign information. All these Volvo functions
are disabled when the test vehicle is operated
in computer control..."
Read more Hmmmm....
Uber must believe that its systems are
better at avoiding Collisions and
Automated Emergency Braking than Volvo's. At least
this gets Volvo "off the hook".
"...According
to data obtained from the self-driving
system, the system first registered radar
and LIDAR observations of the pedestrian
about 6 seconds before impact, when the
vehicle was traveling at 43 mph..." (= 63
feet/second) So the system started
"seeing an obstacle when it was 63 x
6 = 378 feet away... more than a
football field, including end zones!
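The unit conversion and detection-range arithmetic above can be checked directly:

```python
# Checking the detection-range arithmetic from the NTSB excerpt above.
mph_to_fps = 5280 / 3600               # 1 mph = 1.4667 ft/s
speed_fps = 43 * mph_to_fps            # ~63 ft/s at 43 mph
detection_range_ft = speed_fps * 6.0   # first radar/LIDAR return came 6 s before impact
print(round(speed_fps), round(detection_range_ft))  # 63 378
```

So the system had well over a football field of forward range on the pedestrian, exactly as the commentary says.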
"...As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path..." (NTSB: Please tell us precisely when it classified this "object" as a vehicle and be explicit about the expected "future travel paths". Forget the path; please just tell us the precise velocity vector that Uber's system attached to the "object", then the "vehicle". Why didn't the Uber system instruct the Volvo to begin to slow down (or speed up) to avoid a collision? If these paths (or velocity vectors) were not accurate, then why weren't they accurate? Why was the object classified as a "vehicle"?? When did it finally classify the object as a "bicycle"? Why did it change classifications? How often was the classification of this object done? Please divulge the time and the outcome of each classification of this object. In the tests that Uber has done, how often has the system mis-classified an object as a "pedestrian" when the object was actually an overpass, or an overhead sign, or overhead branches/leaves that the car could safely pass under, or was nothing at all?? Basically, what are the false alarm characteristics of Uber's self-driving sensor/software system as a function of vehicle speed and time-of-day?)
"...At 1.3 seconds before impact (impact speed was 39 mph = 57.2 ft/sec), the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision." (1.3 x 57.2 = 74.4 ft, which is about equal to the braking distance; so it still could have stopped short.)
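A rough check of that stopping-distance claim. The 0.7 g deceleration is an assumed typical dry-pavement braking rate, not a figure from the NTSB report, and reaction/actuation lag is ignored:

```python
# Rough check: could the Volvo have stopped in the ~74 ft of warning
# distance at 39 mph? Assumes 0.7 g braking on dry pavement (my
# assumption, not a figure from the report) and no actuation lag.
mph_to_fps = 5280 / 3600
v = 39 * mph_to_fps                        # 57.2 ft/s, the impact-speed figure above
warning_distance = 1.3 * v                 # ~74.4 ft covered during the 1.3 s warning
g = 32.2                                   # ft/s^2
braking_distance = v**2 / (2 * 0.7 * g)    # ~72.6 ft to stop from 57.2 ft/s at 0.7 g
print(round(warning_distance, 1), round(braking_distance, 1))  # 74.4 72.6
```

Under those assumptions the margin is only about 2 ft, so "about equal to the braking distance" is fair: hard braking at 1.3 seconds out would at minimum have greatly reduced the impact speed.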
"...According to
Uber, emergency braking
maneuvers are not enabled while
the vehicle is under computer
control, to reduce (eradicate??)
the potential for erratic
vehicle behavior. ..."
NTSB: Please describe/define "potential" and "erratic vehicle behavior". Also please uncover and divulge the design & decision process that Uber went through to decide that this risk (disabling the AEB) was worth the reward of eradicating "erratic vehicle behavior". This is fundamentally BAD design. If the Uber system's false alarm rate is so large that the best way to deal with false alarms is to turn off the AEB, then the system should never have been permitted on public roadways.
"...The vehicle operator is relied on to intervene and take action." Wow! If Uber's system fundamentally relies on a human to intervene, then Uber is nowhere near creating a Driverless vehicle. Without its own Driverless vehicle, Uber is past "Peak valuation".
Video similar to part of Adam's Luncheon talk @ 2015 Florida Automated Vehicle Symposium on Dec 1. Hmmm ... Watch Video especially at the 13:12 mark. Compelling; especially after the 60 Minutes segment above! Also see his TipRanks. Alain
This list is maintained by Alain Kornhauser and hosted by Princeton University.