A. Karpathy, Nov. 6, "Hear from Andrej Karpathy on how Tesla is using PyTorch to develop full self-driving capabilities for its vehicles, including AutoPilot and Smart Summon. ..." Read more Hmmmm... Worth watching the video, (except for the StupidSummon part) Alain
M. Hineman, Oct 2019, "Background: Public acceptance is critical to the long-term success and near-term testing of self-driving cars....
The Data: The data from all four demos was compiled
to create a discussion starter for industry, regulators,
technology suppliers, vehicle makers and others impacted by
self-driving technology and those seeking to understand public
sentiment....
Methodology: Design of the SAE Demo Days survey was a collaborative effort between SAE and its strategic partners. Questions were added and removed for each event based upon participant feedback, partner ideas and value of collected responses. Surveys were collected on-site via electronic tablets connected to a cloud instance to store replies as they were entered both before and after the self-driving demo ride. Proctors oversaw and offered assistance with use and ensured the responses were collected independent of any other participant’s opinions....
Survey Results: The Public Is Enthusiastic About Self-Driving Cars...
If you would like to learn more about SAE Demo Days, or how
you can get involved visit
sae.org/demodays..." Read more
Hmmm... Very interesting: If you would like to
learn more about SAE Demo Days, or how you can get
involved visit sae.org/demodays and/or attend the
upcoming 7th Annual Florida
Automated Vehicles Summit. Alain
ikerobotics, Nov. 2019, "... In the future, trucks
powered by Ike's technology will navigate the highway without a
driver, handing off loads to truckers for the more complex part
of the journey on either end. It means more jobs for truckers
that keep them closer to home, using their skills and expertise
where it matters..." Read more Hmmmm... Maybe...
Simply scroll through the presentation of their case.
Maybe... Alain
P. Holley, Nov 13, "On the muggy streets of
suburban Houston, amid McMansions, bright green lawns and
stately oak trees, a futuristic race is quietly afoot.
The contestants are not people but late-model Toyota Priuses
outfitted with an array of sophisticated sensors. Despite fierce
competition and unending pressure to perform, the nearly silent
electric vehicles do not speed. They move cautiously, rigorously
following traffic laws and never topping 25 mph.
Their goal is not an easily discerned finish line but to map
large swaths of the nation’s fourth-largest metropolis, a
sprawling patchwork of neighborhoods, mini-cities, strip malls,
gridlocked superhighways and mazelike gated communities — an
area so prodigious in size it easily could swallow Manhattan,
Brooklyn, Queens and Staten Island whole.
The vehicles are owned by Nuro, a Silicon Valley robotics
company with an ambitious goal — to become the world’s
preeminent autonomous delivery service, allowing millions of
people to have groceries and other goods delivered by robots
instead of making trips to the store, potentially reducing
traffic and kicking off a new chapter in our relationship with
machines. For months now, Nuro’s robotically piloted vehicles
have been successfully, if quietly, delivering groceries to
restaurants and homes around Houston, the vehicles’ sensors
mapping the city as they go...." Read
more Hmmmm...
Maybe...Houston has already been mapped many times. What
hasn't/can't be mapped are the parked cars, pedestrians
and vehicles moving around
when the AI is driving the car. The Driverless car can't
run into those things. They are the tough things. So all
of this isn't being done to "map" but to "learn" and as
is pointed out in the article ... "...
To get there, they will first have to run their autonomous
vehicles, or AVs, through millions of miles of driving tests in
cities such as Houston until they are glitch-free and
unquestionably safe...." and "...As the
country’s most ethnically diverse large city — and with a
foreign-born population of 1.4 million — Houston also is a place
where Nuro officials could probe fundamental questions about its
business model.
“The big question for us is: Who is going to use this service,
and how often will they do it?” said Sola Lawal, a Nuro product
operations manager based in Houston who formerly worked for
Uber. “Our robots don’t care who they’re delivering to, but we
want to understand how different demographics interact with and
feel about the robots. Houston allows for this broad swath of
experience in one city.”..." A Driverless vehicle that delivers
groceries, Amazon packages, US Mail, ... doesn't need to
be able to overcome misbehavior by human riders
nor endure ride quality and comfort nuances. These
vehicles will have their own access and, especially,
egress challenges. Moving goods instead/before people is
not a bad way to start for no other reason than
goods/things don't have their own way to move. They must
be chauffeured in one way or another. (Note... while
"safety" is an absolute necessity, selling all of this on
"safety" and "visions of zero" is naive.) Alain
A. Roy, Nov. 11, "The director of Friday Night Lights, Patriots Day, Very Bad Things and Deepwater Horizon joins Alex & Bryan to discuss what drives the characters in his films, fatherhood, risk, freedom, self-driving cars, and the greatest WW2 movie that has never been made. Also, boxing..." Read more Hmmm... Enjoy! Alain
K. Pyle, Oct. 5, "An impromptu mini-panel blossoms
in front of Viodi’s video camera at the World Safety Summit on
Autonomous Technology, as Princeton’s Dr. Alain Kornhauser leads
a discussion with Diana Furchtgott-Roth, Deputy Assistant
Secretary for Research and Technology for the U.S. Department of
Transportation. As with all conferences, some of the best
learnings and tidbits are discovered from the informal
conversations in the hallways, like the one documented in our
video. ..." Read more
Hmmm... Excellent interview with Diana
Furchtgott-Roth. Thank you Ken. Alain
J. Fortin, Nov. 11, "Uber’s chief executive, Dara
Khosrowshahi, drew a backlash on Sunday after calling the murder
of Jamal Khashoggi a “mistake” by the Saudi government and
comparing the killing to the accidental death of a woman hit by
a self-driving car, remarks that he quickly walked back.
“It’s a serious mistake,” Mr. Khosrowshahi said in the interview
with Axios on HBO, which aired on Sunday, after he was asked
about the brutal killing last year by Saudi operatives of Mr.
Khashoggi, a Saudi journalist. “We’ve made mistakes too, right,
with self driving, and we stopped driving, and we’re recovering
from that mistake. So I think that people make mistakes.”..." Read
more Hmmm...
The two are not comparable. Uber did make serious
mistakes. It has paid and will continue to pay a
substantial price for those mistakes (market cap);
however, the Herzberg crash was not premeditated. Alain
N. Lavars, Nov. 11, "Autonomous driving technologies could significantly shape the future of trucking over the coming decade, and as we're continuing to see through various trials around the world, platooning may play a big part in this transformation. By sending a pair of self-driving trucks down a test highway in South Korea, Hyundai has now carried out a platooning trial of its own, in which it says the vehicles successfully demonstrated some key platooning maneuvers....
The test saw two connected Hyundai Xcient trucks
take to the highway, along with other autonomous testing
vehicles that Hyundai says helped to simulate real-world driving
conditions. With platooning mode engaged, the following truck
trailed the leading truck by 16.7 m (55 ft), automatically
adjusting its braking and acceleration to keep that distance
constant. ..." Read
more Hmmm...
Nice that trucks are finally getting what is in Teslas and
many cars today without the need for V2V communications.
Brake lights actually illuminate slightly before the brake
calipers actually begin decelerating the truck, and
at the same time that any V2V brake-initiation message
might be received by the trailing truck. Alain
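For a sense of scale, a 16.7 m gap is a very short time headway at highway speed. A quick sketch of the arithmetic (the 90 km/h cruise speed and the 300 ms brake-light-to-reaction latency are assumptions for illustration, not figures from the article):

```python
gap_m = 16.7            # following distance reported in the Hyundai trial
speed_kmh = 90.0        # assumed highway cruise speed (not stated in the article)
latency_s = 0.3         # assumed delay between brake-light onset and follower reaction

speed_ms = speed_kmh / 3.6          # 25.0 m/s
headway_s = gap_m / speed_ms        # time gap between the trucks
closed_m = speed_ms * latency_s     # distance the gap closes during the latency

print(f"headway: {headway_s:.2f} s; gap closed during latency: {closed_m:.1f} m")
```

At those assumed numbers the time headway is only about 0.67 s, and roughly 7.5 m of the 16.7 m gap would be consumed during a 300 ms reaction delay, which is why brake lights that precede actual deceleration matter as much as any V2V message.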
J. Hecht, Nov. 12, "...A planning document says
AI development should match public expectations so:
• AI never engages in careless, dangerous, or reckless driving
behavior
• AI remains aware, willing, and able to avoid collisions at
all times
• AI meets or exceeds the performance of a competent, careful
human driver...
Read more Hmmm... The first is easy; the 2nd is impossible because it includes "all times"; the 3rd needs something about an Operational Design Domain (ODD) and must measure/quantify the performance of a competent, careful human driver. I'm not hopeful. I don't believe that a "Driving Test" will establish public trust. What will establish public trust is a system that actually delivers driverless mobility and doesn't crash. Alain
AP, Nov. 13, "Facebook said it removed 3.2 billion fake accounts from its service from April to September, up slightly from 3 billion in the previous six months. ... Facebook estimates that about 5% of its 2.45 billion user accounts are fake...." Read more Hmmm... What??? You claim/admit to have removed more fake accounts than you have active accounts. And these fake accounts accumulate every 6 months at a rate greater than the total real accounts that you have, about 5% of the active base per week (5.0% = (3.2/2.45)/26). Moreover, the claim is that of the 2.45 B users (today?) only 5% are fake, yet September is more than 6 weeks ago, which at that removal rate would suggest that something like 30% more fake accounts have accumulated since then!?? I need Machine Learning to straighten all of this out. C'mon Mark! Alain
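Treating six months as roughly 26 weeks, the removal rate implied by the AP figures works out as follows (a back-of-the-envelope sketch using only the numbers quoted above):

```python
removed = 3.2e9       # fake accounts removed April through September
active = 2.45e9       # total user accounts per Facebook
weeks = 26            # six months is roughly 26 weeks

# Fraction of the active base removed as fake, per week
weekly_rate = (removed / active) / weeks
print(f"removals per week = {weekly_rate:.1%} of the active base")
```

That is about 5% of the entire active base removed as fake every week, which is the arithmetic behind the skepticism that only 5% of accounts could be fake at any given moment.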
A. Moser, Nov 16, "A team of ducks crossing the road in
Gisborne, New Zealand were saved when Tesla Autopilot sensed the
fowl in the road and swerved at the last minute, narrowly
avoiding them...." Read more
Hmmm... I watched this several times
and I suspect that this is a fake. The ducks seem to be
walking across faster than ducks walk. This suggests that
the video has been sped up. Plus the brakes don't seem to
have been applied, which suggests that the collision
avoidance was done using only swerving. Luckily there
wasn't a car coming the other way. I have my doubts. Hopefully Andrej
Karpathy will chime in and correct me here. Alain
November
20th-22nd, 2019
HILTON MIAMI DOWNTOWN
1601 BISCAYNE BLVD
MIAMI, FL 33132
F. Fishkin, May 18, "From the 3rd Annual Princeton Smart Driving Car Summit, join Professor Alain Kornhauser and co-host Fred Fishkin. In this special edition, the summit's focus on mobility for all with guests Anil Lewis, Executive Director of Blindness Initiatives at the National Federation of the Blind and ITN America Founder Katherine Freund."
April 5, F. Fishkin, "The success of on demand transit company Via is proving that ride sharing systems can work. Public Policy head Andrei Greenawalt joins Princeton's Alain Kornhauser and co-host Fred Fishkin for a wide ranging discussion. Also: Uber, Tesla, Audi, Apple and Nuro are making headlines"
April 5, F. Fishkin, "Here comes congestion pricing in New York City...but what will it mean? Former city Taxi and Limousine Commission head and transportation expert Matthew Daus joins Princeton's Alain Kornhauser and co-host Fred Fishkin. Also...Tesla, VW and even Brexit! All on Episode 98 of Smart Driving Cars."
March 28, F. Fishkin, "The Future Networked Car? From Sweden, The Dispatcher publisher, Michael Sena, joins Princeton's Alain Kornhauser and co-host Fred Fishkin for the latest edition of Smart Driving Cars. Plus ...the Boeing story has much to do with autonomous vehicles and more. Tune in and subscribe."
F. Fishkin, Sept 6, "The coming new world of driverless cars! In Episode 55 of the Smart Driving Cars podcast former GM VP and adviser to Waymo Larry Burns chats with Princeton's Alain Kornhauser and Fred Fishkin about his new book "Autonomy: The Quest to Build the Driverless Car and How it Will Reshape Our World"
Elon, you sell cars to individuals, at which point you relinquish control and responsibility, and thankfully, liability, for that car. Please do everything that you can to be certain that your cars are used responsibly at all times and that those individuals are alert and in control at all times; else, you'll re-acquire the responsibility and the liability. The burden of liability is not good for any business. Liability without control is a TrainWreck. The regulators won't save you. Alain
- If you get matched with a fully driverless car, you'll see a notification in your Waymo app that confirms the car won't have a trained driver up front....
- you can enjoy having the car all to yourself....
R. Mitchell, Oct. 4, " Smart Summon is for
parking lot use. But drivers have other ideas.
Tesla unleashed the latest twist in driverless car technology last week, raising more questions about whether autonomous vehicles are outracing public officials and safety regulators.
...Using a smartphone, a person can now command
a Tesla to turn itself on, back out of a parking space and
drive to the smartphone holder's location - say at a curb in
front of a Costco store...." Read
more Hmmmm....
Russ, great article. A must read!
Elon, please
stop. StupidSummon was a bad Valley-entitled idea
before you released it. Now that it is out there
it will ruin all that is good about Tesla,
AutoPilot and Driverless cars. The shorters are
going to have a field day.
While you are
at it, also remove all of the DistractTainment add-ons,
or limit their use when AutoPilot is NOT on
and drivers are engaged in driving. Just go back
to V09! Along the way also get the Automated
Emergency Braking (AEB) system to work properly
(See NTSB
below). To do that, maybe you should take a
serious look at Velodyne's
new
Tesla LiDAR. It may be able to tell you if
the stationary object in the lane ahead is high
enough above the road surface before your
AEB system decides to disregard it. Then Teslas
may stop decapitating
drivers.
If you don't remove StupidSummon, then at least be sure to
limit its use to the Tesla owner's own private property
by responsible users. (You know the GPS coordinates of
where each owner lives, so you can geofence it. You
also know of each irresponsible use; you get the videos.)
Irresponsible use (use in violation of the
conditions spelled out in the user's manual) should void
its future availability in that car unless proper amends
are made. If not, then insurance companies should
clearly state that insuring the use of this feature
requires a substantial additional premium; else, you're
not covered. Courts should view use of this
feature as implying premeditated harm and demonstrating an
extreme indifference to human life. Parking lot owners
should install signs forbidding the use of this feature
on their property to protect themselves from being
dragged into the claims process.
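The geofencing idea above is simple to sketch. A minimal illustration, assuming a plain distance-from-home rule (the coordinates, the 100 m radius, and the function names are all hypothetical, not anything Tesla actually ships):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (mean Earth radius)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def summon_allowed(car_pos, home_pos, radius_m=100.0):
    """Hypothetical policy: permit Summon only within radius_m of the owner's home."""
    return haversine_m(*car_pos, *home_pos) <= radius_m

home = (40.3573, -74.6672)      # illustrative home coordinates
driveway = (40.3577, -74.6670)  # ~50 m away: would be allowed
mall_lot = (40.3700, -74.6500)  # ~2 km away: would be blocked
```

The point is that the enabling check is trivial once the owner's home coordinates are known; the hard part is policy, not software.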
K. Korosec, Sept 16, "Waymo transported 6,299 passengers in
self-driving ...drivered,
not driverless...Chrysler
Pacifica minivans in its first month participating in a robotaxi
pilot program in California, according to a quarterly report the
company filed with the California Public Utilities Commission.
In all, the company completed 4,678 passenger trips in July —
plus another 12 trips for educational purposes. It’s a
noteworthy figure for an inaugural effort that pencils out to an
average of 156 trips every day that month. And it demonstrates
that Waymo has the resources, staff and vehicles to operate a
self-driving vehicle pilot while continuing to test its
technology in multiple cities and ramp up its Waymo One
ride-hailing service in Arizona...
The CPUC authorized in May 2018 two pilot programs for
transporting passengers in autonomous vehicles. The first one,
called the Drivered Autonomous Vehicle Passenger Service
Pilot program, allows companies to operate a ride-hailing
service using autonomous vehicles as long as they follow
specific rules. Companies are not allowed to charge for rides, a
human safety driver must be behind the wheel and certain data
must be reported quarterly.
The second CPUC pilot would allow driverless passenger service —
although no company has yet to obtain that permit...." Read
more Hmmmm....
Be sure to look at the Waymo
Quarterly Report and that of the other 3 companies:
Zoox,
AutoX
and Pony.ai.
Those 4 companies reported, respectively:
[4,678; 103; 9; 0] vehicleTrips;
[6,299; 134; 13; 0] personTrips;
[59,917; 352; ?; 0] vehicleMiles;
and [55; 10; 1; 0] number of unique vehicles used throughout the quarter.
Note: Waymo only began operating on July 2, the last
month of the quarter [May, June, July]. Note:
the CPUC does not permit casual shared-ride
services (serving individuals or groups of
individuals who weren't predisposed to travel
together). Go figure??? Alain
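A quick back-of-the-envelope check on Waymo's reported figures (assuming roughly 30 operating days, since Waymo began on July 2):

```python
# Reported quarterly figures: vehicleTrips and unique vehicles used
vehicle_trips = {"Waymo": 4678, "Zoox": 103, "AutoX": 9, "Pony.ai": 0}
vehicles = {"Waymo": 55, "Zoox": 10, "AutoX": 1, "Pony.ai": 0}

waymo_days = 30  # Waymo operated only in July, from July 2
trips_per_day = vehicle_trips["Waymo"] / waymo_days        # matches the article's 156
trips_per_vehicle_day = trips_per_day / vehicles["Waymo"]  # fleet utilization

print(f"{trips_per_day:.0f} trips/day, {trips_per_vehicle_day:.1f} trips/vehicle/day")
```

That reproduces the article's "average of 156 trips every day" and shows each of the 55 vehicles averaged fewer than 3 passenger trips per day, which is modest utilization for a ride-hailing fleet.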
Also note:
This is Drivered Service, meaning
there is an attendant/driver inside each
vehicle for each trip; so this is actually
conventional ride-hailing, a la Lyft/Uber with
fancy schmancy vehicles. The CPUC did NOT
require "disengagement reporting", so one has
no idea as to the extent of driver/attendant
involvement in the provision of the Drivered
service. It will be interesting to learn if
Waymo considers this activity to be part of
its AV testing program and includes the
disengagement performance of these vehicles in
its disengagement report to the CA DMV at the
end of the year. We'll be
able to infer that the disengagement
performance is exemplary when Waymo decides to
begin Driverless service (w/o an
attendant, as opposed to Drivered service).
1. Figure 4, The speed of the Tesla in the last 221 seconds before the crash showing that the Tesla was traveling rather slowly in the 100 seconds before the crash (under 20 mph), but then accelerated (as discussed above) in the 3 seconds just prior to the crash, beginning as soon as the lead SUV changed lanes,
2. Figure 5, the distance between the Tesla and its lead vehicle, showing that the TACC worked really well until the lead vehicle "disappeared" (changed lanes), and "...Data show that at about 490 msec before the crash, the system detected a stationary object in the path of the Tesla. At that time, the forward collision warning was activated; the system presented a visual and auditory warning. Data also show that the AEB did not engage and that there was no driver-applied braking or steering prior to the crash. According to Tesla, the AEB was active at the time of the crash, and considering that the stopped fire truck was detected about half a second before impact, there likely was not sufficient time to activate the AEB." ...This implies that the AEB and its functioning in collaboration with the TACC needs to be substantially re-evaluated/re-designed. Alain
3. Figure 6, which clearly depicts the movement of the Tesla relative to the lead vehicle and the firetruck in the 15 seconds before the crash. The Tesla's radar and front-facing camera must have "seen" the firetruck 4 seconds before the crash and on every sensing loop (1/10th of a second) during the last 4 seconds, yet...
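Half a second is very little warning distance. A rough sketch of why (the 30 mph figure is an assumption for illustration; the report says only that the Tesla had been accelerating from under 20 mph in the final seconds):

```python
MPH_TO_FTPS = 5280 / 3600   # 1 mph = 1.4667 ft/s

warning_s = 0.49            # stationary fire truck detected ~490 msec before impact
assumed_mph = 30.0          # assumed speed at detection (illustrative, not from the report)

# Distance the Tesla covered between detection and impact
warning_ft = warning_s * assumed_mph * MPH_TO_FTPS
print(f"distance covered during the warning: {warning_ft:.1f} ft")
```

Roughly 22 ft of warning distance against a braking distance on the order of 75 ft, which is the quantitative version of "there likely was not sufficient time to activate the AEB".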
M. Isaac, Aug 27, "Anthony Levandowski was once one of Silicon
Valley’s most sought after technologists. As a pioneer of
self-driving car technology, he became a confidant of Larry
Page, a co-founder of Google, and helped develop the search
giant’s autonomous vehicles. Uber wooed him to gain an edge in
self-driving techniques. Venture capitalists threw their money
at him.
But on Tuesday, Mr. Levandowski, 39, fell far from that favored
stature. Federal prosecutors charged him with 33 counts of theft
and attempted theft of trade secrets from Google. ...
The criminal
indictment against Mr. Levandowski from the United States
Attorney’s Office for the Northern District of California opens
a new chapter in a legal battle that has embroiled Google, its
self-driving car spinoff Waymo and its rival Uber in the
high-stakes contest over autonomous vehicles. The case also
highlights Silicon Valley’s no-holds-barred culture, where
gaining an edge in new technologies versus competitors can be
paramount....
According to the indictment, Mr. Levandowski downloaded more than 14,000 files containing critical information about Google’s autonomous-vehicle research before leaving the company in 2016. He then made an unauthorized transfer of the files to his personal laptop, the indictment said. Mr. Levandowski joined Uber later that year when the ride-hailing firm bought his new self-driving trucking start-up, which was called Otto....
“The Bay Area has the best and brightest engineers, and they
take big risks,” John Bennett, the F.B.I. special agent in
charge of the San Francisco Division, said at a news conference
on Tuesday. “But Silicon Valley is not the Wild West. The
fast-paced and competitive environment does not mean federal
laws do not apply.” Mr. Levandowski’s next court date is Sept. 4.
If he is convicted, he could face a maximum of 10 years in
prison, a $250,000 fine for every count and additional
restitution.
“All of us are free to move from job to job,” said David L.
Anderson, United States attorney in the Northern District of
California. “What we cannot do is stuff our pockets on the way
out the door.”..." Read
more Hmmm...
Central to this technology is the perception of
personal safety and trust. Lying, cheating & stealing
can't be part of this industry, else it will never emerge
from the venture stage. If DieselGate
and the Uber
crash weren't enough, let this be the next wake-up
call to this industry to clean up its ethical behavior.
Hopefully the FBI will also aggressively pursue all cyber
attackers. It isn't cute, nor a virtual reality game. It
is hard serious work and creativity focused on improving
the quality of everyday life. Alain
J. Browne, Aug 16, "Autonomous vehicles are the
future. Self-driving cars could change our lives, heralding an
era of greater convenience, improved productivity and safer
roads...." Read
more Hmmmm.... Actually much
of this opening sentence is a myth... It doesn't take
Self-driving or Driverless to have automation technology
yield safer roads. It takes safe-driving technology that
works, like Automated Emergency Braking (front and rear)...
And ... are we really going to do our "manufacturing or
service job " (increase "productivity") if we don't have to
do the work of driving anymore??? Of the few "riding
shotgun to work" what percentage are doing work while riding
shotgun? Certainly less than 10%. Less than 1%? So much
for productivity improvements.
If we get to Driverless,
then the myths aren't myths. There will be fewer private
cars, downtown congestion will be reduced, the environment
will be saved, the insurance industry's gross revenues will
go down substantially (but their
profits will go up) and AVs are already safer than humans
that text and/or are "under the influence" while driving.
If we don't get to Driverless, then we'll remain with "Do-it-yourself private mobility" that will include Self-driving assistance. Armed with that form of personal mobility, then all the myths are myths: More private cars ... and the policy implications are clear. See: J. M. Greenwald, A. L. Kornhauser "It’s up to us: Policies to improve climate outcomes from automated vehicles". Also, to have a proper perspective of the role of transportation and car/"FordF150s" in greenhouse gas emissions see... M. Sivak, Aug 22, "Increased relative contribution of medium and heavy trucks to U.S. greenhouse gas emissions" Alain
K. Conger, Aug 7, "Uber set two dubious quarterly
records on Thursday as it reported its results: its largest-ever
loss, exceeding $5 billion, and its slowest-ever revenue growth.
The double whammy immediately renewed questions about the
prospects for the company, the world’s biggest ride-hailing
business. Uber has been dogged by concerns about sluggish sales
and whether it can make money, worries that were compounded by a
disappointing initial public offering in May.
For the second quarter, Uber said it lost $5.2 billion, the
largest loss since it began disclosing limited financial data in
2017. A majority of that — about $3.9 billion — was caused by
stock-based compensation that Uber paid its employees after its
I.P.O. Excluding that one-time expense, Uber lost $1.3 billion,
or nearly twice the $878 million that it lost a year earlier. On
that same basis and excluding other costs, the company said it
expected to lose $3 billion to $3.2 billion this year...Lyft has
also reported a series of deep losses. This week, it said it lost
$644.2 million in the second quarter, though it added that it
expected that amount to abate. Several months earlier, Lyft had
also posted a particularly steep loss related to stock-based
compensation payouts to its employees..." Read
more Hmmmm.... No wonder Uber
looked so good prior to its IPO, it hadn't "paid" its
employees. So is this really a "one time" expense?? Anyway,
Driverless is their only potential savior as a $40 stock. They
can't afford to pay their employees, their gig workers can't
feed families, new customers can't afford their prices and
food delivery generates only chump change. Uber
Stock price, See also...Uber and Lyft keep losing money while
driving up the number of cars on our overcrowded streets.
Alain
Tesla, July 16, "At Tesla, we believe that
technology can help improve safety. That’s why Tesla vehicles are
engineered to be the safest cars in the world. We believe the
unique combination of passive safety, active safety, and automated
driver assistance is crucial for keeping not just Tesla drivers
and passengers safe, but all drivers on the road. It’s this notion
that grounds every decision we make – from the design of our cars,
to the software we introduce, to the features we offer every Tesla
owner.
Model S, X and 3 have achieved the lowest probability of injury of
any vehicle ever tested by the U.S. government’s New Car
Assessment Program.
... In the 2nd quarter, we registered one accident for every 3.27 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.19 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.41 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 498,000 miles.... " Read more Hmmmm.... This summary uses "accident" for Teslas and "crash" for NHTSA. This may suggest that the Tesla and NHTSA figures are not comparable... Tesla is reporting about apples and NHTSA is referring to "oranges". That noted, however, it does seem that for Teslas with and without AutoPilot and the other active safety features, there is consistency in the measure. A more detailed question arises about the equivalence of the driving domain for each category, as well as who is at fault in each of these situations. Even in light of these issues and details, the large variation in the rates, 3.27 v 2.19 v 1.41, is very significant among Teslas. Seems as if AutoPilot and Tesla's other active collision-avoidance safety features are improving the safety of Teslas. The spread from the ~0.5 value for NHTSA is really astonishing, making Teslas much safer than the average of all other cars. Unfortunately these numbers only scratch the surface and beg for more details. In the past I have called for an independent evaluation of the Tesla crash statistics, and I do that again here today. I'll offer to do it. Tesla should encourage someone to do it. As it stands today, not enough people believe or trust Tesla (see below). That's unfortunate because improved safety is THE major objective of SmartDrivingCar technology. Alain
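Taking the quoted figures at face value, the implied rates compare as follows (a sketch of the arithmetic, not the independent evaluation called for above):

```python
# Miles per reported accident/crash, from Tesla's Q2 safety report and NHTSA
miles_per_crash = {
    "Autopilot engaged":       3.27e6,
    "active safety only":      2.19e6,
    "no AP, no active safety": 1.41e6,
    "NHTSA average":           0.498e6,
}

# Invert to accidents/crashes per million miles for easier comparison
per_million = {k: 1e6 / m for k, m in miles_per_crash.items()}
spread = miles_per_crash["Autopilot engaged"] / miles_per_crash["NHTSA average"]

for k, v in per_million.items():
    print(f"{k}: {v:.2f} per million miles")
print(f"Autopilot vs NHTSA spread: {spread:.1f}x")
```

The Autopilot rate works out to about 0.31 accidents per million miles versus NHTSA's roughly 2 crashes per million miles, a spread of about 6.6x, which is exactly why the apples-vs-oranges and driving-domain caveats matter so much.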
Oct 16, Establishes
fully autonomous vehicle pilot program A4573 Sponsors:
Zwicker (D16); Benson (D14)
Oct 16, Establishes
New Jersey
Advanced Autonomous Vehicle Task Force AJR164 Sponsors:
Benson (D14); Zwicker (D16); Lampitt (D6)
May 24, "About 9:58 p.m., on Sunday, March 18,
2018, an Uber Technologies, Inc. test vehicle, based on a
modified 2017 Volvo XC90 and operating with a self-driving
system in computer control mode, struck a pedestrian on
northbound Mill Avenue, in Tempe, Maricopa County, Arizona.
...The vehicle was factory equipped with several
advanced driver assistance functions by Volvo Cars, the original
manufacturer. The systems included a collision avoidance
function with automatic emergency
braking, known as City Safety, as well as functions for
detecting driver alertness and road sign information. All these
Volvo functions are disabled when the test vehicle is operated
in computer control..."
Read more Hmmmm.... Uber must believe
that its systems are better at avoiding Collisions and
Automated Emergency Braking than Volvo's. At least this gets Volvo
"off the hook".
"...According to data obtained
from the self-driving system, the system first registered
radar and LIDAR observations of the pedestrian about 6
seconds before impact, when the vehicle was traveling at 43
mph..." (= 63 feet/second) So
the system started "seeing an obstacle when it was
63 x 6 = 378 feet away... more than a football
field, including end zones!
"...As the vehicle and pedestrian
paths converged, the self-driving system software classified
the pedestrian as an unknown object, as a vehicle, and then
as a bicycle with varying expectations of future travel
path..." (NTSB: Please tell us
precisely when it classified this "object' as a
vehicle and be explicit about the
expected "future
travel paths." Forget the path,
please just tell us the precise velocity vector that
Uber's system attached to the "object", then the
"vehicle". Why didn't the the Uber system instruct the
Volvo to begin to slow down (or speed up) to avoid a
collision? If these paths (or velocity vectors) were
not accurate, then why weren't they accurate? Why was
the object classified as a "Vehicle" ?? When did it finally classify the
object as a "bicycle"? Why did it change classifications?
How often was the classification of this object done.
Please divulge the time and the outcome of each
classification of this object. In
the tests that Uber has done, how often has the
system mis-classified an object as a "pedestrian"when
the object was actually an overpass, or an
overhead sign or overhead branches/leaves that
the car could safely pass under, or was nothing
at all?? (Basically, what are the false alarm
characteristics of Uber's Self-driving
sensor/software system as a function of vehicle
speed and time-of-day?)
"...At 1.3 seconds before impact, (impact speed was 39mph = 57.2 ft/sec) the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision" (1.3 x 57.2 = 74.4 ft. which is about equal to the braking distance. So it still could have stopped short.
"...According
to Uber, emergency braking maneuvers are not
enabled while the vehicle is under
computer control, to reduce (eradicate??)
the potential for erratic vehicle
behavior. ..." NTSB: Please
describe/define potential and erratic
vehicle behavior Also
please uncover and
divulge the design
& decision process
that Uber went through
to decide that this
risk (disabling the
AEB) was worth the
reward of eradicating
" "erratic vehicle behavior". This
is
fundamentally
BAD design.
If the Uber
system's false
alarm rate is
so large that
the best way
to deal with
false alarms
is to turn off
the AEB, then
the system
should never
have been
permitted on
public
roadways.
"...The
vehicle operator is relied
on to intervene and take
action. " Wow! If Uber's
system
fundamentally
relies on a
human to
intervene, then
Uber is nowhere
near creating a
Driverless
vehicle.
Without its own
Driverless
vehicle Uber is
past "Peak
valuation".
Video similar to part of Adam's Luncheon talk @ 2015 Florida Automated Vehicle Symposium on Dec 1. Hmmm ... Watch Video especially at the 13:12 mark. Compelling; especially after the 60 Minutes segment above! Also see his TipRanks. Alain
This list is maintained by Alain Kornhauser and hosted by Princeton University.