
SmartDrivingCar.com/6.23-NTSB-Uber-052518
23rd edition of the 6th year of SmartDrivingCars

Friday, May 25,  2018

PRELIMINARY REPORT: HIGHWAY: HWY18MH010 (Uber/Herzberg Crash)

May 24, “About 9:58 p.m., on Sunday, March 18, 2018, an Uber Technologies, Inc. test vehicle, based on a modified 2017 Volvo XC90 and operating with a self-driving system in computer control mode, struck a pedestrian on northbound Mill Avenue, in Tempe, Maricopa County, Arizona.

…The vehicle was factory equipped with several advanced driver assistance functions by Volvo Cars, the original manufacturer. The systems included a collision avoidance function with automatic emergency
braking, known as City Safety, as well as functions for detecting driver alertness and road sign information. All these Volvo functions are disabled when the test vehicle is operated in computer control…” Read more  Hmmmm…. Uber must believe that its own collision avoidance and automatic emergency braking are better than Volvo’s.  At least this gets Volvo “off the hook”.

“…According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph…” (= 63 feet/second)  So the system started “seeing” an obstacle when it was 63 x 6 = 378 feet away… more than a football field, including end zones!

“…As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path…”  NTSB: Please tell us precisely when it classified this “object” as a vehicle, and be explicit about the expected “future travel paths.”  Forget the path; please just tell us the precise velocity vector that Uber’s system attached to the “object”, then to the “vehicle”.  Why didn’t the Uber system instruct the Volvo to begin to slow down (or speed up) to avoid a collision?  If these paths (or velocity vectors) were not accurate, then why weren’t they accurate?  Why was the object classified as a “vehicle”?  When did it finally classify the object as a “bicycle”?  Why did it change classifications?  How often was the classification of this object done?  Please divulge the time and the outcome of each classification of this object.  In the tests that Uber has done, how often has the system mis-classified an object as a “pedestrian” when the object was actually an overpass, an overhead sign, or overhead branches/leaves that the car could safely pass under, or was nothing at all?  (Basically, what are the false-alarm characteristics of Uber’s self-driving sensor/software system as a function of vehicle speed and time of day?)

“…At 1.3 seconds before impact (impact speed was 39 mph = 57.2 ft/sec), the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision…”  (1.3 x 57.2 = 74.4 ft, which is about equal to the braking distance from that speed.  So it still could have stopped short.)
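As a sanity check on the arithmetic above, here is a minimal back-of-the-envelope sketch; the 0.7 g constant deceleration is an assumption for illustration, since actual braking distance depends on tires, road surface, and load:

```python
# Back-of-the-envelope check of the distances quoted above.
# The 0.7 g deceleration is an assumption for illustration only.

MPH_TO_FPS = 5280 / 3600   # 1 mph = 1.4667 ft/s
G = 32.2                   # gravitational acceleration, ft/s^2

def fps(mph):
    """Convert miles per hour to feet per second."""
    return mph * MPH_TO_FPS

def braking_distance_ft(mph, decel_g=0.7):
    """Distance to stop from a given speed at constant deceleration."""
    v = fps(mph)
    return v ** 2 / (2 * decel_g * G)

# First detection: 6 seconds before impact at 43 mph
print(f"43 mph = {fps(43):.1f} ft/s; distance covered in 6 s = {6 * fps(43):.0f} ft")

# Emergency-braking determination: 1.3 seconds before impact at 39 mph
print(f"39 mph = {fps(39):.1f} ft/s; distance covered in 1.3 s = {1.3 * fps(39):.0f} ft")
print(f"Stopping distance from 39 mph at 0.7 g = {braking_distance_ft(39):.0f} ft")
```

Running it reproduces the numbers in the text: roughly 378 ft covered during the 6 seconds after first detection, and about 73 ft needed to stop from 39 mph versus the 74 ft remaining at the 1.3-second mark.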

“…According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce (eradicate?) the potential for erratic vehicle behavior. …”  NTSB: Please describe/define “the potential” and “erratic vehicle behavior”.  Also, please uncover and divulge the design and decision process that Uber went through to decide that this risk (disabling the AEB) was worth the reward of eradicating “erratic vehicle behavior”.  This is fundamentally BAD design.  If the Uber system’s false-alarm rate is so large that the best way to deal with false alarms is to turn off the AEB, then the system should never have been permitted on public roadways.
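To make that tradeoff concrete, here is a toy comparison; both rates below are hypothetical numbers chosen only to illustrate the asymmetry, not Uber’s actual figures:

```python
# Toy illustration of the false-alarm tradeoff discussed above.
# Both rates below are hypothetical, for illustration only.

false_alarms_per_1000_miles = 50        # assumed nuisance AEB triggers if enabled
true_emergencies_per_1000_miles = 0.01  # assumed genuine "must brake now" events

miles = 100_000
nuisance_brakes = false_alarms_per_1000_miles * miles / 1000
real_emergencies = true_emergencies_per_1000_miles * miles / 1000

print(f"Nuisance AEB activations over {miles:,} miles (AEB enabled): {nuisance_brakes:,.0f}")
print(f"Genuine emergencies left entirely to the safety driver     : {real_emergencies:,.0f}")
# However large the nuisance count, disabling AEB shifts every genuine
# emergency onto a human monitor, which is the design choice criticized above.
```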

“…The vehicle operator is relied on to intervene and take action…”  Wow!  If Uber’s system fundamentally relies on a human to intervene, then Uber is nowhere near creating a Driverless vehicle.  Without its own Driverless vehicle, Uber is past “peak valuation”.

“…The system is not designed to alert the operator…”  That may be the only good part of Uber’s design.  In a Driverless vehicle there is no one to warn, so don’t waste your time.  If it is important enough to warn about, then it is important enough for the automated system to start doing something about it.  Plus, the Driver may not know what to do anyway.  This is pretty much as I stated in Podcast 30 and the March 24 edition of SmartDrivingCar; see below.  Alain

Smart Driving Cars Podcast Episode 40

F. Fishkin, May 25, “With the NTSB preliminary report out on the fatal crash in Arizona …where does Uber go from here? Princeton University’s Alain Kornhauser and co-host Fred Fishkin tackle those issues plus more on Mobileye, Apple and the report on the dangers of push button car ignitions. Listen and subscribe!”

Hmmmm…. Now you can just say “Alexa, play the Smart Driving Cars podcast!”  Ditto with Siri and Google Play.  Alain

Real information every week.  Lively discussions with the people who are shaping the future of SmartDrivingCars.  Want to become a sustaining sponsor and help us grow the SmartDrivingCars newsletter and podcast? Contact Alain Kornhauser at alaink@princeton.edu!  Alain

Uber chose to disable emergency braking system before fatal Arizona robot car crash, safety officials say

Russ Mitchell, May 24, “…The failure to brake in the Arizona accident highlights the immaturity of driverless technology and tradeoffs made by programmers that could end in tragedy. While the basics of various companies’ driverless systems are the same, in detail they differ greatly, from how software is written to which sensor systems are used. Tesla, for instance, bucks the industry in general by dismissing the need for lidar, an expensive technology that uses laser pulses to draw images of physical objects.

“What is not being stressed is that the performance of these systems varies greatly,” said Alain Kornhauser, director of Princeton University’s autonomous vehicle engineering program. “The technology is not all the same.”…”   Read more  Hmmmm…. Uber was testing a system that fundamentally requires a human operator to be alert and monitoring in ALL situations in which emergency brakes need to be applied, because its design turns off the Emergency Braking System whenever the “automatic driving system” is turned on.  Yipes!!! Alain

Feds: Uber self-driving SUV saw pedestrian, did not brake

T. Krisher, May 24,  “The autonomous Uber SUV that struck and killed an Arizona pedestrian in March spotted the woman about six seconds before hitting her, but did not stop because the system used to automatically apply brakes in potentially dangerous situations had been disabled, according to federal investigators….Uber, he said, likely determined in testing that its system braked in situations it shouldn’t have, possibly for overpasses, signs and trees. “It got spoofed too often,” Kornhauser said. “Instead of fixing the spoofing, they fixed the spoofing by turning it off.”…”  Read more  Hmmmm….Yup, Tesla & Mobileye likely also turn off (or fail to activate) the AEB in similar situations (when closing speed = vehicle speed), which the NTSB failed to point out as a root cause in the Joshua Brown crash and which is a likely factor in the Tesla firetruck and the “405/butt-end NJ Barrier” Tesla crashes. Alain

Fatal crash prompts Uber to shut down Arizona self-driving car trial

A. Ganz, May 23, “Two months after one of its vehicles was involved in a fatal crash with a pedestrian, Uber said Wednesday that it will formally cease self-driving car testing and development in Arizona…”  Read more  Hmmmm…. Uber must have been anticipating the NTSB’s preliminary report (above). Alain

Internal Uber email announces shutdown of Arizona driverless car testing

T. Lee, May 23, “Uber is shutting down testing of self-driving cars in Arizona after one of its cars killed a pedestrian in a March crash. In an internal email, Uber executive Eric Meyhofer wrote that Uber would be shifting its focus to Pittsburgh, …Uber hopes to resume testing in Pittsburgh this summer.

Meyhofer also indicated that Uber would be changing how it tested its driverless cars. “When we get back on the road, we intend to drive in a much more limited way to test specific use cases,” Meyhofer wrote. “Taking this approach will allow us to continually hone the safety aspects of our software and operating procedures. We have also used the past two months to strengthen our simulation capability, which will allow us to be more efficient with our use of road miles.” Read more  Hmmmm…. Uber really needs to understand and appropriately justify what its code is doing and not doing.  If the determination “we are on a collision course with a pedestrian” is made so frequently in error that Uber inserted code to treat every such determination by its “AI” as a false alarm and carry on with business as usual, that is totally unethical, if it isn’t criminal.  Alain

Intel’s Mobileye wants to dominate driverless cars—but there’s a problem

T. Lee, May 21, “Mobileye, the Israeli self-driving technology company Intel acquired last year, announced on Thursday that it would begin testing up to 100 cars on the roads of Jerusalem. But in a demonstration with Israeli television journalists, the company’s demonstration car blew through a red light. (near end of video)….While most companies working on full self-driving technology have made heavy use of lidar sensors, Mobileye is testing cars that rely exclusively on cameras for navigation. Mobileye isn’t necessarily planning to ship self-driving technology that works that way. Instead, testing a camera-only system is part of the company’s unorthodox approach for verifying the safety of its technology stack. That strategy was first outlined in an October white paper, and Mobileye CTO Amnon Shashua elaborated on that strategy in a Thursday blog post.

“We target a vehicle that gets from point A to point B faster, smoother, and less-expensively than a human-driven vehicle; can operate in any geography; and achieves a verifiable, transparent 1,000-times safety improvement over a human-driven vehicle without the need for billions of miles of validation testing on public roads,” Shashua wrote on Thursday.

It’s a bold claim. We’re skeptical it’s actually true.

The company argues that validating the sensing system is made even easier by capitalizing on sensor redundancies….But this argument relies on two big assumptions, and it’s far from clear that either of them is true.

The first assumption is that the failure modes of the two sensing systems are independent…

And it’s that last point that should really worry us: that Mobileye’s model likely makes assumptions that don’t actually describe the real world.

For example, Mobileye is implicitly assuming that fusing the two sensor systems together won’t introduce any new sources of error. But as Navigant Research analyst Sam Abuelsamid pointed out to us, that’s not likely to be true….

So the company’s leadership has convinced itself that it can rely heavily on formal mathematical proofs as a substitute for millions of miles of real-world testing, because its business model doesn’t leave it with many good alternatives. But wishing this were true doesn’t make it so.”  Read more  Hmmmm…. This is an excellent article!  Read all of it.  Alain
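As a minimal numeric illustration of why the independence assumption matters, here is a sketch with hypothetical failure rates (these are not Mobileye’s figures):

```python
# Illustration of the failure-mode independence assumption discussed above.
# All probabilities are hypothetical, for illustration only.

p_camera = 1e-4   # assumed chance the camera-only subsystem misses an object
p_radar  = 1e-4   # assumed chance the radar/lidar subsystem misses the same object

# If the two failure modes are truly independent, the joint miss rate multiplies:
p_joint_independent = p_camera * p_radar          # 1e-8

# If both can be defeated by a shared condition (glare, heavy rain, an unusual
# object class), a common failure mode dominates the joint rate:
p_common = 1e-5
p_joint_correlated = p_common + (1 - p_common) * p_camera * p_radar

print(f"Joint miss rate, independent failures : {p_joint_independent:.1e}")
print(f"Joint miss rate, shared failure mode  : {p_joint_correlated:.1e}")
# The promised large safety factor only materializes if independence
# actually holds in the real world, which is the article's point.
```

With these made-up numbers, a single shared failure mode wipes out a factor of 1,000 in the combined miss rate, which is exactly the kind of gap between the model and the real world the article is worried about.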

Deadly Convenience: Keyless Cars and Their Carbon Monoxide Toll

 D. Jeans, May 13, “It seems like a common convenience in a digital age: a car that can be powered on and off with the push of a button, rather than the mechanical turning of a key. But it is a convenience that can have a deadly effect.

On a summer morning last year, Fred Schaub drove his Toyota RAV4 into the garage attached to his Florida home and went into the house with the wireless key fob, evidently believing the car was shut off. Twenty-nine hours later, he was found dead, overcome with carbon monoxide that flooded his home while he slept….” Read more  Hmmmm….This is an example of why real-world testing is necessary.  No simulation or “math model” predicted, nor would have predicted, this outcome.  Only implementation in the real world allows one to identify this new risk.  A similar thing happened with airbags… they can kill little kids.

One simply doesn’t know what one doesn’t know, and Mother Nature throws a lot of curve balls and change-ups; one only finds out about these by going up to bat.

The remedy for the airbags was/is yellow stickers on all sun visors alerting parents to have their children sit in the back seats.  In the case of the push start, the remedy is to allow push starts only in cars that automatically shut off the engine after a brief idle (which also improves fuel economy).  So if you drive into your garage and get out, the car will idle for a few seconds and then automatically shut itself off if you’ve forgotten to.  Alain
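A minimal sketch of that idle-shutoff remedy follows; the timeout value and the input names are assumptions for illustration (a real implementation would also have to account for occupants, climate control, remote start, and so on):

```python
# Sketch of the automatic idle-shutoff remedy described above.
# The timeout and the input names are assumptions for illustration only.

IDLE_SHUTOFF_SECONDS = 60   # assumed grace period after the driver walks away

def should_auto_shutoff(engine_running: bool, in_park: bool,
                        fob_in_range: bool, idle_seconds: float) -> bool:
    """Return True if the engine should be shut off automatically."""
    if not engine_running or not in_park:
        return False
    # Driver has walked away with the fob, or the car has idled past the
    # grace period: shut the engine off rather than let it run in a garage.
    return (not fob_in_range) or idle_seconds > IDLE_SHUTOFF_SECONDS

# Example: parked in the garage, fob carried into the house, idling for 90 s
assert should_auto_shutoff(True, True, False, 90.0)
```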

Fiat Chrysler recalls 4.8 million U.S. vehicles for cruise control defect

D. Shepardson, May 25, “Fiat Chrysler Automobiles NV said on Friday it is recalling 4.8 million U.S. vehicles over a defect that could prevent drivers from deactivating cruise control and warned owners not to use the function until they get software upgrades.  The Italian-American automaker said no injuries or crashes are related to the large recall campaign but said it had one report of a driver of a 2017 Dodge Journey rental car unable to deactivate the cruise control….NHTSA said drivers could overpower the system by forcefully applying the brakes until the vehicle stopped. Fiat Chrysler also said the vehicle could be stopped by shifting into neutral and braking….

Fiat Chrysler noted that at times cruise control systems automatically initiate acceleration to help vehicles maintain driver-selected speeds, including when going up an incline. If an acceleration occurs simultaneously with a short-circuit in a specific electrical network, a driver could be unable to deactivate the function…” Read more  Hmmmm…. Kudos to Fiat-Chrysler for fixing the problem.  Again, another one of these massive curve balls thrown by Mother Nature.  Simulation can crunch through a lot of combinatorics, but “an acceleration that occurs at the same time as a short circuit in one specific electrical network, and that can be fixed by software” is a really rare one.  How did they reproduce it?  Whew!

I bet Fiat wishes that it had Over-the-Air software update capability (but who knows what that would break and of course there’s Cyber vulnerability).  Details matter and details aren’t simple. Again, Kudos to F-C for JFing It (Just Fixing It).  Alain
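The “forcefully apply the brakes” workaround NHTSA describes amounts to a brake-override backstop. Here is a generic sketch of that pattern, not FCA’s actual implementation, with an assumed pedal-force threshold:

```python
# Generic sketch of a brake-override backstop for a cruise controller.
# This is NOT FCA's implementation; the threshold value is an assumption.

BRAKE_OVERRIDE_THRESHOLD_N = 100.0   # assumed pedal force, in newtons

def cruise_step(cruise_engaged: bool, requested_throttle: float,
                brake_pedal_force_n: float) -> tuple[float, bool]:
    """Return (throttle_command, cruise_still_engaged) for one control step."""
    # A firm brake application cancels cruise unconditionally, even if the
    # normal cancel path (button press relayed over the vehicle network)
    # is lost to a short circuit, as in the recall scenario above.
    if brake_pedal_force_n >= BRAKE_OVERRIDE_THRESHOLD_N:
        return 0.0, False
    if not cruise_engaged:
        return 0.0, False
    return requested_throttle, True

# Example: driver stands on the brake while cruise is holding speed up an incline
throttle, engaged = cruise_step(True, 0.4, 250.0)
assert throttle == 0.0 and engaged is False
```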

An Autonomous Vehicle Ecosystem Crash Course at Princeton

K. Pyle, May 22, “A kind of college experience compressed into 2+ days is how the SmartDrivingCars Summit could be described. Led by the affable, gracious and entertaining host Professor Alain Kornhauser on the historic and beautiful grounds of Princeton University, this two-day event was a deep dive into all things dealing with autonomous transportation, as well as adjacent topics, such as land-use in a world where the need and the role for parking will change.

The SmartDrivingCars Summit drew a diverse group from academia, industry and government to discuss the challenges of transitioning to autonomous vehicles.  Highlights Include: …”  Read more  Hmmmm….Ken, thank you for the kind words.  Plus the two tweets of the Kornhauser Law: one, two.  And a couple of videos:  Kornhauser – Images & Perspectives,  Kornhauser – Elevator Analogy.

The Open Source Solution to Autonomous Safety #smartdrivingcar

K. Pyle, May 9 (updated) “Safety and, as importantly, the perception of safety could be the pin that pricks the expectations surrounding the autonomous vehicle future. Recognizing the importance of safety to the success of this still nascent industry, autonomous taxi start-up, Voyage, recently placed their testing and reporting procedures in an open source framework. Voyage Co-Founder, Eric Gonzalez explains in the above interview that, at launch, there are four functional blocks to OAS (Open Autonomous Safety) that are now part of a GitHub repository:…” Read more  Hmmmm…. and see video.  Alain

2nd Annual Princeton SmartDrivingCar Summit

A. Kornhauser, May 17, “Outstanding Summit.  Click on the URL to read the presentations.”  Read more  Hmmmm…. Enjoy!  Alain

2nd Annual Princeton SmartDrivingCar Summit:  Interviews with Key Participants

F. Fishkin, May 16 & 17:

Interview with Velodyne’s John Eggert
Interview with Sam Schwartz
Interview with NVIDIA’s Danny Shapiro
Interview with Adriano Alessandrini & Michel Parent
Interview with Kurtis Hodge of Local Motors
Interview with the Alliance for Transportation Innovation’s Paul Brubaker
Interview with Voyage CEO Oliver Cameron & Eric Gonzalez

Apple, Spurned by Others, Signs Deal With Volkswagen for Driverless Cars

J. Nicas, May 23, “…In recent years, Apple sought partnerships with the luxury carmakers BMW and Mercedes-Benz to develop an all-electric self-driving vehicle, according to five people familiar with the negotiations who asked not to be identified because they were not authorized to discuss the matter publicly. But on-again, off-again talks with those companies have ended after each rebuffed Apple’s requirements to hand over control of the data and design, some of the people said…”  Read more  Hmmmm…. Is that really the best that Apple could do?  Alain


Half-baked stuff that probably doesn’t deserve your time


 C’mon Man!  (These folks didn’t get/read the memo)

Uber to open Advanced Technologies Center in Paris focused on flying taxis

A. Hawkins, May 24, “Uber’s plan to fill the skies above cities with swarms of electric-powered flying taxis is getting its own dedicated laboratory. …” Read more  Hmmmm….  Does Uber really believe that this will turn around its plunging valuation????  C’mon Man!! Alain



Calendar of Upcoming Events:


3rd Annual Princeton SmartDrivingCar Summit
evening May 14 through May 16, 2019
Save the Date; Reserve your Sponsorship
Photos from 2nd Annual Princeton SmartDrivingCar Summit

Program & Links to slides from 2nd Annual Princeton SmartDrivingCar Summit


  On the More Technical Side

https://orfe.princeton.edu/~alaink/SmartDrivingCars/Papers/

Capsule Networks (CapsNets) – Tutorial