CPUC AUTHORIZES PASSENGER CARRIERS TO PROVIDE FREE TEST RIDES IN AUTONOMOUS VEHICLES WITH VALID CPUC AND DMV PERMITS
Press Release, May 31, "…Today’s decision also allows TCP permit-holders that hold a “DMV Manufacturer’s Testing Permit – Driverless Vehicles” to operate autonomous vehicles without a driver in the vehicle, subject to certain restrictions. Authorization to provide this service is available only to TCP permit-holders with driverless autonomous vehicles that have been in DMV-permitted driverless operation on California roads for a minimum of 30 days. Entities seeking to participate in the pilot program are not allowed to operate from or within airports; must limit the use of the vehicle to one chartering party at any given time (i.e., fare-splitting is not permitted); must ensure that the service can only be chartered by adults 18 years and older; and may not accept monetary compensation for the ride. Participants are also required to continuously comply with all DMV regulations, and to report certain data to the CPUC on a quarterly basis that will be publicly available…." Read more Hmmmm….. Good news: able to serve customers with autonomous taxis. Bad news: not able to share rides. (This is really bad news: by having the public oversight body focus driverless service on single occupants, the CPUC makes the fundamental problem of the personal auto even worse. Simply REALLY BAD! Its opportunity is to encourage ride-sharing whenever possible so as to alleviate congestion and reduce energy use and pollution. C’mon CPUC!!) The fact that the rides are free is largely irrelevant at this time, except as, once again, a subsidy to the 1%ers who are a disproportionate element of the early adopters likely to hail this service. Alain
Decision, May 31, "DECISION AUTHORIZING A PILOT TEST PROGRAM FOR AUTONOMOUS VEHICLE PASSENGER SERVICE WITH DRIVERS AND ADDRESSING IN PART ISSUES RAISED IN THE PETITIONS FOR MODIFICATION OF GENERAL MOTORS, LLC/GM CRUISE, LLC, LYFT, INC., AND RASIER-CA, LLC/UATC, LLC FOR PURPOSES OF A PILOT TEST PROGRAM FOR DRIVERLESS AUTONOMOUS VEHICLE PASSENGER SERVICE…" Read more Hmmmm…. Read carefully. Contains all of the important details. Alain
F. Fishkin, June 12, "What is the big mistake California is making in driverless vehicle testing? Princeton University’s Alain Kornhauser says the key is to promote ride sharing. Join the professor and co-host Fred Fishkin for Episode 44 of the Smart Driving Cars Podcast for more on that, Waymo, Tesla and more. Listen and subscribe." Hmmmm…. Now you can just say "Alexa, play the Smart Driving Cars podcast!". Ditto with Siri and Google Play. Alain
Real information every week. Lively discussions with the people who are shaping the future of SmartDrivingCars. Want to become a sustaining sponsor and help us grow the SmartDrivingCars newsletter and podcast? Contact Alain Kornhauser at firstname.lastname@example.org! Alain
A Chance to Reinvent the Way We Live – Thanks to MaaS & More
K. Pyle, June 12, "A recurring idea at the Prospect SV 2018 Impact and Innovation conference is that changes to mobility will have the potential for significantly large positive impacts on the built environment and energy consumption. Speakers at the conference embraced these upcoming changes as opportunities to improve the quality of life, particularly in urban areas. Jesse Denver, DER Program Manager for the City and the County of San Francisco’s Department of the Environment, suggested that these changes provide a “Chance to reinvent the way we live.”…
Waymo announced, on the morning of the Impact and Innovation Summit, its intent to purchase up to 62,000 Pacifica Hybrid mini-vans from FCA. This was especially timely, as Waymo’s Head of Local Policy, Ellie Casson, was able to provide context about the announcement, as well as details of Waymo’s approach, including: Read more Hmmmm….. Very interesting. You must read. Waymo is finally saying something substantive about its intentions. See also the video reporting its 7M miles of testing and "infinite" amount of simulation. Alain
Driverless testing in NJ? Phil Murphy aide wants to move in that direction
J. Cichowski, June 6, "…“Finally, somebody in power recognizes that New Jersey is a microcosm of the nation that has everything necessary for a grand experiment,” he said, citing the state’s limited mass-transit options and its balance of urban, suburban and rural roads and population demographics. "And the weather isn’t always great," he added, "but that makes it ideal for testing under all conditions."…" Read more Hmmmm…. See video. New Jersey may finally start trying to be a player. Also see videos of several of the other presentations at the 2nd Princeton SDC Summit. Alain
T. Lee, June 11, "…the car "began a left steering movement" seven seconds before the crash that put it on a collision course with a concrete lane divider.
This isn’t the only recent case where Autopilot steered a Tesla vehicle directly into a stationary object—though thankfully the others didn’t get anyone killed. Back in January, firefighters in Culver City, California, said that a Tesla with Autopilot engaged had plowed into the back of a fire truck at 65mph. In an eerily similar incident last month, a Tesla Model S with Autopilot active crashed into a fire truck at 60mph in the suburbs of Salt Lake City.
A natural reaction to these incidents is to assume that there must be something seriously wrong with Tesla’s Autopilot system. After all, you might expect that avoiding collisions with large, stationary objects like fire engines and concrete lane dividers would be one of the most basic functions of a car’s automatic emergency braking technology.
But while there’s obviously room for improvement, the reality is that the behavior of Tesla’s driver assistance technology here isn’t that different from that of competing systems from other carmakers. As surprising as it might seem, most of the driver-assistance systems on the roads today are simply not designed to prevent a crash in this kind of situation.
Sam Abuelsamid, an industry analyst at Navigant and former automotive engineer, tells Ars that it’s "pretty much universal" that "vehicles are programmed to ignore stationary objects at higher speeds."
To understand why these systems behave like this, it’s helpful to keep in mind how they evolved. About 20 years ago, …These systems were designed to work on controlled-access freeways, and, in the vast majority of cases, stationary objects near a freeway would be on the side of the road (or suspended above it) rather than directly in the car’s path. Early adaptive cruise control systems simply didn’t have the capability to distinguish the vast majority of objects that were near the road from the tiny minority that were on the road.
So cars were programmed to focus on maintaining a safe distance from other moving objects—cars—and to ignore stationary objects. Designers assumed it would still be the job of the human driver to pay attention to the road and intervene if there was an obstacle directly in the roadway.
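The "ignore stationary objects at high speed" behavior described above can be sketched in a few lines. This is an illustrative toy, not any manufacturer’s actual algorithm; the speed threshold and tolerance are assumptions for the example:

```python
# Illustrative sketch ONLY -- not any real carmaker's code. It mimics the
# reported legacy behavior: a radar target whose closing speed equals the
# ego vehicle's speed is not moving longitudinally, so it is classified as
# "stationary" and disregarded above a speed threshold.

STATIONARY_IGNORE_SPEED_MPS = 20.0  # assumed threshold, roughly 45 mph

def target_is_stationary(ego_speed_mps, closing_speed_mps, tol=0.5):
    """A target closing at (approximately) the ego speed is stationary in
    the longitudinal direction -- e.g. a stopped fire truck, a concrete
    barrier, or a truck crossing laterally, as in the Joshua Brown crash."""
    return abs(closing_speed_mps - ego_speed_mps) < tol

def aeb_should_brake(ego_speed_mps, closing_speed_mps):
    """Legacy adaptive-cruise/AEB policy: disregard stationary returns at
    high speed, leaving the obstacle to the human driver."""
    if (target_is_stationary(ego_speed_mps, closing_speed_mps)
            and ego_speed_mps > STATIONARY_IGNORE_SPEED_MPS):
        return False  # disregarded -- the failure mode discussed above
    return True

# A stopped fire truck ahead of a car doing 29 m/s (~65 mph):
print(aeb_should_brake(29.0, 29.0))   # False: target ignored
# The same stopped truck, but at city speed:
print(aeb_should_brake(10.0, 10.0))   # True: braking allowed
# A slower-moving lead car at freeway speed:
print(aeb_should_brake(29.0, 12.0))   # True: treated as a moving vehicle
```

The sketch also shows why the Brown crash fits this pattern: a laterally crossing truck has a closing speed equal to the ego speed, so the filter lumps it in with roadside signs and overpasses.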
Abuelsamid points out that a car’s adaptive cruise control system was often a totally separate system—made by a different supplier and using different sensors—from the lane-keeping system. Adaptive cruise control systems are often radar-based, while lane-keeping systems more often use cameras. The two systems don’t necessarily even share data, and on many cars they don’t do any kind of sophisticated path-planning…." Read more Hmmmm….. Pretty much as I’ve been commenting: it is the shortcomings of AEB that are the problem. It amazes me that the NTSB & NHTSA haven’t realized this. Let’s go back to the NTSB Findings in the Joshua Brown crash:
- Why did the NTSB so easily accept MobilEye’s excuse that its system didn’t see the trailer because of the background color and not ask why it didn’t see the cab of the truck or the truck’s front tires?
- Why didn’t the NTSB uncover that, because Josh was traveling fast (less than 10 over), the AEB would have disregarded those items even if it had seen them (which it probably did), because the closing speed was equal to the vehicle speed? The truck was moving laterally and was stationary in the longitudinal direction. Thus it was classified as a stationary object, and stationary objects are disregarded by the AEB when traveling at high speeds.
- NTSB should have issued an advisory to everyone that they should "up their AEB game" so that stationary objects in the lane ahead are NOT disregarded.
The NTSB is thus complicit in the Barrier crash, because it didn’t expose the shortcoming of Tesla’s AEB. Had the AEB been fixed in the ensuing 2 years, the improved AEB would likely have, at least, caused the Tesla to slow down, rather than speed up, as it approached the butt end of a NJ Barrier.
With respect to the Huang crash (NJ Barrier), watch this video; I’ve put together a few slides that imply that CA DoT was also complicit. Highway engineers have repeatedly asked: "What are the infrastructure needs of AVs (SDCs)?" and have repeatedly been told: "NOTHING except what conventional drivers need to be good drivers: Good (non-ambiguous) Paint (lane markings) and Good (readable, non-ambiguous) Signs." It is time that CA DoT made sure that there is good paint, especially on heavily traveled roads like US 101. AND why is it that CA DoT doesn’t stripe the area inside the point? If either of these had been the case, the Huang crash, and possibly the crash 11 days earlier that collapsed the attenuator, could have been averted. Since this newsletter has no influence, the NTSB needs to do its job and consider the same "Findings". Alain
Toyota’s long list of what will stump its automatic braking system
D. Gershgorn, June 12, "Automakers are slowly nudging their cars towards becoming autonomous. Automatic lane assist and automatic braking are slowly becoming standard on new vehicles, with 92% of Toyota’s cars now shipping with the technology….The new technology has its limits. Toyota, to its credit, makes many caveats clear in the Camry’s owner manual. Basically, the automatic braking will work if the pedestrian is standing still against a plain background. Otherwise, its cameras, radar, and sensors may not recognize:..The manual for the Ford F-150, America’s best-selling truck, says its accuracy for automatic braking suffers in “direct or low sunlight,” as well as with “vehicles at night without tail lights, unconventional vehicle types, … Honda makes the fewest promises of all, not promising to stop the car but simply to decrease its speed in the face of an unavoidable collision." Read more Hmmmm….. Even though much of this is CYA, it is nice that some honesty is being displayed. Moreover, there are at least two issues here: Recognition and Response. Failure to Recognize implies that the recognition system needs to become better. Failure to Respond when there is recognition means that the response system needs to be overhauled. It would be nice if these companies explained their Response Policy: How do they respond to recognition, especially when recognition is less than certain? How certain do they have to be to respond? How cautious are they, and why was this level of cautiousness chosen? Is the user/owner offered an opportunity to choose the level of cautiousness? Alain
Tesla updates Autopilot to nag users to hold the wheel more often
T. Lee, June 11, "Tesla has begun rolling out a new version of its software, version 2018.21.9, that is stricter about requiring drivers to keep their hands on the wheel.
Previous versions of the software allowed drivers to take their hands off the wheel for one to two minutes before reminding them to put them back on the wheel—a measure designed to make sure drivers were paying attention to the road. The new update dramatically shortens this interval, with videos showing warnings popping up after around 30 seconds….
The latest change is an effort to improve driver safety after at least three Tesla crashes with Autopilot engaged since the start of the year. One of those crashes—in March in Mountain View, California—led to a fatality.
That crash killed engineer Walter Huang. Tesla argued that "the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so."…" Read more Hmmmm….. I highly prefer GM’s eye-tracker approach, which directly monitors whether the driver is paying attention to the road ahead rather than inferring it from whether the wheel was touched. Elon needs to retrofit an eye tracker into every Tesla with AutoPilot. Alain
Safer Roads with Automated Vehicles
June, 2018, "This report examines how increasing automation of cars and trucks could affect road safety and which security vulnerabilities will need to be addressed with the rise of self-driving vehicles. The report applies the principles of the “Safe System”, which is at the forefront of current thinking about road safety, to the wider discussion on vehicle automation. It also takes into consideration the security of the cyber-physical system associated with automated driving. This includes a definition of relevant system boundaries and future-proof minimum requirements for relevant safety and security indicators. …" Read more Hmmmm….. Extensive report on the implications of Self-driving cars in the EU. As with most of the "research" coming out of the EU, this report is focused almost exclusively on the Safety of consumer-owned personal Self-driving cars and, only by implication, on conventionally owned and operated Self-driving cars. It almost seems as if the EU either doesn’t know about or doesn’t understand the concept of fleet-owned & operated, on-demand driverless cars that would offer shared-ride services when appropriate. This is not surprising, since each of these EU efforts has legacy auto OEMs on the study team, and the EU OEMs are most interested in using the comfort & convenience aspects of Self-driving to continue their 100-year-old business model of selling cars to individual consumers to use to drive themselves, usually alone and never with a stranger, to wherever they wish to go. Alain
How Safe Is Safe Enough for Self‐Driving Vehicles?
P. Liu, May 21, "Self‐driving vehicles (SDVs) promise to considerably reduce traffic crashes. One pressing concern facing the public, automakers, and governments is “How safe is safe enough for SDVs?” To answer this question, a new expressed‐preference approach was proposed for the first time to determine the socially acceptable risk of SDVs. In our between‐subject survey (N = 499), we determined the respondents’ risk‐acceptance rate of scenarios with varying traffic‐risk frequencies to examine the logarithmic relationships between the traffic‐risk frequency and risk‐acceptance rate. … the results showed that SDVs were required to be safer than HDVs….
Two risk‐acceptance criteria emerged: the tolerable risk criterion, which indicates that SDVs should be four to five times as safe as HDVs, and the broadly acceptable risk criterion, which suggests that half of the respondents hoped that the traffic risk of SDVs would be two orders of magnitude lower than the current estimated traffic risk. The approach and these results could provide insights for government regulatory authorities for establishing clear safety requirements for SDVs…." Read more Hmmmm….. Very interesting and very well done. Since this fundamentally involves the perception of safety, it is very challenging to quantify and forecast eventual behavior. People smoke.
This paper reinforces the notion that SDCs will need to be perceived as substantially safer. This has always been a strict necessary condition on the advancement of the technology. Alain
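The two criteria reported above reduce to simple arithmetic. The HDV baseline risk below is an assumed placeholder for illustration only, not a figure from the paper:

```python
# Back-of-envelope illustration of the two risk-acceptance criteria.
# HDV_RISK is an ASSUMED baseline (fatalities per vehicle-mile), chosen
# only to make the ratios concrete -- it is not from Liu et al.
HDV_RISK = 1.25e-8

# Tolerable risk criterion: SDVs should be four to five times as safe.
tolerable = HDV_RISK / 4.5

# Broadly acceptable criterion: two orders of magnitude lower risk.
broadly_acceptable = HDV_RISK / 100.0

print(f"assumed HDV baseline: {HDV_RISK:.2e}")
print(f"tolerable:            {tolerable:.2e}")
print(f"broadly acceptable:   {broadly_acceptable:.2e}")
```

The gap between the two criteria is the interesting part: the public "tolerates" a ~4.5x improvement but only half of respondents would consider anything short of a 100x improvement broadly acceptable.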
Tesla’s AI director gives insights into Autopilot’s computer vision and neural net development
F. Lambert, June 11, "Andrej Karpathy, Tesla’s director of AI and computer vision, is currently hard at work trying to improve Autopilot by training Tesla’s neural net with incredible amounts of data from Tesla’s fleet…." Read more Hmmmm….. See also the video; however, AutoPilot has much more fundamental flaws associated with its inability to deal appropriately with stationary objects and its totally inadequate Automated Emergency Braking system. In the Huang/NJ_Barrier crash it confused two neighboring lane markings and chose to follow the outer rather than the inner, which should never be the default choice. Alain
BYTON unveils new all-electric and autonomous sedan concept
F. Lambert, June 12, "Just a day after announcing that they secured $500 million in funding to bring their electric vehicles to market, BYTON is now unveiling an all-electric and autonomous sedan concept…" Read more Hmmmm….. This is just another "Self-driving Car" that has little hope of becoming driverless anytime soon and is focused on the conventional personal car ownership model. Chances to upstage legacy OEMs or Tesla are slim2none. It isn’t easy to succeed in the car business: List of defunct automobile manufacturers of the United States. Alain
Interested in working in Toronto? Have a good background and interest in working on safety and security for autonomous driving vehicles and fleets? Contact Dr. Fengmin Gong, DiDi Labs
Calendar of Upcoming Events:
3rd Annual Princeton SmartDrivingCar Summit
evening May 14 through May 16, 2019
Save the Date; Reserve your Sponsorship
Photos from 2nd Annual Princeton SmartDrivingCar Summit
Program & Links to slides from 2nd Annual Princeton SmartDrivingCar Summit