http://smartdrivingcar.com/Another-071416

Thursday, July 14, 2016

  Another Tesla crash blamed on car’s Autopilot system

S. Musil, July 12, "The most recent crash involved a Model X near the small town of Whitehall, Montana, on Sunday morning, according to the Detroit Free Press. Neither the driver nor the passenger was injured in the single-vehicle crash, the Montana Highway Patrol told the newspaper….The car failed to detect an obstacle in the road, according to a thread posted on the Tesla Motors Club forum by someone who said they’re a friend of the driver. The thread included photos showing the damage to the vehicle.

Tesla said Tuesday that it appears the driver in the crash was using the system improperly.

"The data suggests that the driver’s hands were not on the steering wheel, as no force was detected on the steering wheel for over 2 minutes after autosteer was engaged (even a very small amount of force, such as one hand resting on the wheel, will be detected)," a Tesla spokesman said in a statement. "This is contrary to the terms of use that are agreed to when enabling the feature and the notification presented in the instrument cluster each time it is activated.

"As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway," the spokesman said. He added that the Autopilot feature was being used on an undivided mountain road despite being designed for use on a divided highway in slow-moving traffic….Read more  Hmmm….Interesting that Tesla didn’t say that the car began to slow down, as it is supposed to if the driver does not put his/her hands back on the wheel!!!!????  The "lane-centering" should NOT turn off if the driver does not respond.  (I believe the Mercedes "997 package" turns off lane-centering if you don’t respond to the buzzer 🙁 .  However, since the lane-centering on my 2014 S-550 only works if the lane is essentially perfectly straight, and Mercedes has never made an effort to fix/update my software, I rarely take my hands off the wheel.  The system is so poor that I can’t tell whether lane-centering is just not working or the buzzer turned it off.  🙁 )  What should happen is that the car should turn on its emergency flashers, slow down at a rate proportional to the quality of the road conditions and, once it reaches a slow enough speed, determine whether a lane change to the right (in the US and …) is safe or a clear shoulder to the right is available.  If so, it should make the lane change and come to a complete stop, all the while announcing to the driver what the system is doing because hands have not been put back on the wheel.  After stopping, "AutoPilot" should then turn off, as should "AutoPilot" privileges, until a "Tesla" representative resets the system.  If that doesn’t convince the driver to keep "hands-on-wheel", then the car has at least averted a possible catastrophe associated with a comatose driver.  Alain
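The escalation sequence proposed above can be written out as simple control logic. What follows is a minimal, illustrative Python sketch against a toy simulated car; every class, function name, threshold, and rate is hypothetical and for discussion only, not any manufacturer's actual software.

```python
# Toy sketch of the proposed driver-unresponsive fallback: flashers on,
# decelerate in proportion to road-condition quality, move right only if
# clear, stop, and revoke AutoPilot until reset.  All names and values
# are hypothetical, for illustration only.

BASE_DECEL_MPH_PER_STEP = 5.0   # braking per time step on a good road
LANE_CHANGE_SPEED_MPH = 15.0    # slow enough to evaluate a move to the right

class ToyCar:
    def __init__(self, speed_mph, road_quality, shoulder_clear):
        self.speed_mph = speed_mph
        self.road_quality = road_quality      # 0.0 (poor) .. 1.0 (good)
        self.shoulder_clear = shoulder_clear
        self.flashers = False
        self.autopilot_enabled = True
        self.log = []

    def announce(self, msg):
        self.log.append(msg)          # stand-in for spoken announcements

def hands_off_fallback(car, hands_on_wheel=lambda: False):
    """Escalate instead of silently turning lane-centering off."""
    car.flashers = True
    car.announce("Hands not detected: slowing down.")
    # Gentler braking on worse roads, but never zero (guarantees progress).
    decel = max(1.0, BASE_DECEL_MPH_PER_STEP * car.road_quality)
    while car.speed_mph > LANE_CHANGE_SPEED_MPH:
        if hands_on_wheel():
            car.announce("Hands detected: resuming normal operation.")
            return
        car.speed_mph = max(0.0, car.speed_mph - decel)
    # Once slow enough, move right only if a clear lane/shoulder exists.
    if car.shoulder_clear:
        car.announce("Moving to the right shoulder.")
    car.speed_mph = 0.0
    car.autopilot_enabled = False     # privileges revoked until reset
    car.announce("Stopped. AutoPilot disabled until reset.")

car = ToyCar(speed_mph=60.0, road_quality=0.8, shoulder_clear=True)
hands_off_fallback(car)
print(car.speed_mph, car.autopilot_enabled)  # → 0.0 False
```

The point of the sketch is the ordering: warn, then degrade gracefully to a safe stop, then revoke the privilege.  Disengaging lane-centering while the car is still at speed is the one option the sequence never takes.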

July 8 Letter to Tesla

J. Quandt, July 8 "This letter is to inform you that the ODI of NHTSA has opened a Preliminary Investigation PE 16-007 to examine the performance of the Automatic Emergency Braking (AEB) system and any other forward crash mitigation or forward crash avoidance system enabled and in use at the time of the fatal crash …" You MUST Read more  Hmmm….  Specifically, on page 3: "1. …Separately, for each subject vehicle (All Tesla vehicles equipped with any version of the Autopilot Technology Package …) manufactured to date by Tesla, state the following…k. The total number of AEB events that occurred with…"; on the next page: "2…and 3. …"; on page 5: "5. Describe all assessments, analyses, tests, …that relate to, or may relate to the alleged defect that have been conducted, are being conducted, are planned, … and 6.  Describe all modifications or changes made by, or on behalf of Tesla in …."; on page 6: "Provide the following information related to the subject system: a. …, l. …"; and on page 7: "8.  Provide the following information on all AEB events experienced by the subject vehicles that involve avoidance or mitigation of crashes with vehicles crossing the path of the subject vehicle: …. 9.  Provide Tesla’s reconstruction of the subject crash, including: a.  The positions of the vehicles at impact and the positions at each 10 msec increment up to two seconds prior to impact…"  Unfortunate that velocities aren’t also being requested, and that positions are requested for only 2 seconds before the crash.  Why not 5 or even 15 seconds, if Tesla has the data?  "10.  Furnish Tesla’s assessment of the alleged defects in the subject vehicle including: a.  the reason the subject system did not activate in the subject crash; …"  This is a VERY good letter; unfortunately, there are caveats on page 8 under "Confidential Business Information" which will probably mean that most of the information will not appear in the public domain nor be shared with other automakers and developers so that "the same thing doesn’t happen to another manufacturer".  This information/data is all about safety.  It should be open and shared as "lessons learned", and Tesla should be held harmless in return for sharing the information and placing it in the public domain.  In that spirit, NHTSA should make a similar request of all manufacturers that offer AEB, especially identifying which VINs have AEB systems so that one can reliably identify when these systems are involved in crashes.  Undoubtedly the May 7 Tesla crash was NOT the first crash in which there was a fatality and the AEB did not avoid the crash.  Likely there have been many.  As an industry, we MUST learn from each of these crashes.  By the way, VIN numbers should reflect the optional safety/automated features that are included in the vehicle.  Alain

New York Times Editorial Board Crashes into Automated Vehicles

M. Scribner, Jul 13 "…In a country where more than 35,000 were killed in traffic accidents last year, national conversations about one or two additional highway fatalities, tragic as they are, are completely unwarranted. This is particularly true when the New York Times editorial board is attempting to lead the conversation….Tesla looked around and discovered their early-adopting owner pool would make the perfect guinea pigs…It is also important to note that even when much safer automated vehicle technology comes to market, people will still die on the roads…. The promise of far safer roads, not impossibly and perfectly safe “Vision Zero” roads, should be our focus…

But it gets worse.  In the closing three paragraphs, the Times editorial board makes the following claims:

  • NHTSA needs to speed the deployment of vehicle-to-vehicle (V2V) communications, which could have plausibly saved the dead Tesla driver;
  • Federal regulators should take to heart the lesson with early deployment of airbags, which killed women and children; and
  • NHTSA should be prepared to update its rules rapidly and frequently.

On the first, I have written about why NHTSA’s looming V2V dedicated short-range communications (DSRC) mandate will actually harm automated vehicle development and do little to promote highway safety. The V2V DSRC mandate is itself a major distraction funded by self-interested auto companies who, having forgotten about the sunk-cost fallacy of throwing good money after bad, refuse to give up on already obsolete technology because they’ve already thrown more than $1 billion down the V2V DSRC hole…"  Read more Hmmm….Yup!  Alain

SEC Is Reportedly Investigating Tesla for Not Disclosing Autopilot Death

S. Gandel, July 11, "The Securities and Exchange Commission would like to know a little bit more about what Tesla Motors and Elon Musk knew about a fatal Tesla crash in early May.

On Monday, the Wall Street Journal reported that SEC officials are looking into whether Tesla and Musk violated securities laws when the car company and its CEO sold $2 billion worth of shares in mid-May without disclosing the fact that a driver had been killed while reportedly using the car manufacturer’s autopilot feature…."  Read more  Hmmm….  Amazing how poorly this "inevitability" has been handled by Tesla.  How many other crashes have Teslas been involved in for which Tesla has captured data during the "last minute" before the crash?  These data, excluding any information about the individuals involved, should be released to the public.  These data are incredibly valuable and should be shared.  In the end, they are likely all "discoverable" in a legal proceeding.  Released earlier, they could help everyone improve their systems and avert many repeated "shortcomings" and crashes.  We have a lot yet to learn about these rare events.  The faster we can learn, the more hardships we will avert and the more lives we will save.  Alain

#TeslaCrash: Three reasons for Tesla (and all of us) to be concerned

S. LeVine July 2,"…Last week my team and I released a new Working Paper on the liability issues raised by Automated Cars, and how this will constrain how they drive.  I want to therefore share initial thoughts on this incident, starting with specific reasons for Tesla to be concerned, and concluding with broader reasons for the rest of us to also pay attention: "…  Read more Hmmm….Yup!  Alain

As U.S. Investigates Fatal Tesla Crash, Company Defends Autopilot System

B. Vlasic & N. Boudette, July 12,  "…The questions raised by the N.H.T.S.A., in a nine-page letter that was dated July 8 but not made public until Tuesday, indicated the agency was investigating whether there are defects in the various crash-prevention systems related to Autopilot.


Those systems include automatic emergency braking, which is supposed to stop Tesla models from running into other vehicles detected by radar and a camera…."   Read more  Hmmm….See letter above.  The accident image is not a correct representation of the intersection because it fails to depict the 76-foot grass median.  See images in Link  Alain

Fatal Tesla Crash Draws In Transportation Safety Board

N. Boudette, July 10,  "A second federal agency is investigating a fatal May 7 crash in Florida involving a Tesla automobile operating in Autopilot mode that failed to stop when a tractor-trailer turned in front of it…The involvement of the transportation safety board signals even greater scrutiny of the accident and Tesla’s Autopilot technology. The agency specializes in determining the causes of crashes and is familiar with the self-driving technology used in trains and airplanes…."   Read more  Hmmm….Yea.  Alain

Willow Run autonomous car test site needs $60M

July 13,  "The American Center for Mobility has grand plans to transform the weed-strewn concrete ruins of the former Willow Run bomber plant into a world-class autonomous car test site, but it still needs tens of millions of dollars to make it a reality.

The state of Michigan has pledged $3 million, and officials expect approval this month for an additional $17 million in state aid. But the nonprofit still needs another $60 million to fully realize a 335-acre test site that would include tunnels, bridges, traffic stops, suburban cul-de-sacs and city streets to test the driverless cars of tomorrow…"   Read more  Hmmm….As I found out, not an easy sell 🙁  (unless you’re willing to do "v2v" as per Columbus.)  Alain

IBM’s Watson makes a move into self-driving cars with Olli, a minibus from Local Motors

I. Lunden, June 16,  "IBM today took the wraps off its first big foray into the world of self-driving cars, not as the driver of them, but as the brain behind making your self-driving journey a little more interesting.

IBM Watson, the company’s AI platform, is powering services in Olli… The cars will start operations first in Washington, DC, before expanding to deployments in Miami-Dade County and Las Vegas later this year. IBM says Miami-Dade County will run a pilot to transport people around Miami using these autonomous vehicles.

Olli will be using a special version of Watson aimed at automotive applications and it is not fully powering the car’s self-driving features. Instead it’s aimed at “improving the passenger experience,” according to a statement from IBM."   Read more  Hmmm….What a letdown!  Not a very big "move" if it’s just doing "travelTainment" (aka "passenger experience").  C’mon IBM, you really should be able to do more.  Alain


Some other thoughts that deserve your attention

Playing ‘Telephone’ with Transportation Data

S. Polzin, July 11, "… As I continued reading, my confidence in a transformational change diminished with each sentence. But I did stumble across one of my pet peeves—the infamous allusion to 30 percent of traffic in cities being due to drivers seeking parking. The Guardian story contained the following quote.

"The emails and documents show that Flow applies Google’s expertise in mapping, machine learning and big data to thorny urban problems such as public parking. Numerous studies have found that 30 percent of traffic in cities is due to drivers seeking parking."…

The story’s "numerous studies" linked to the often-cited Donald Shoup article "Cruising for Parking" in Transport Policy. The reference to numerous studies evidently referred to the case studies Dr. Shoup was able to assemble in his research—some of which appear to be single data points. This 30 percent average, rather incidental and appropriately qualified data in the original paper, has taken on a life of its own and is regularly cited as evidence for everything from the opportunities for better parking management to the benefits of public transit to the wastefulness of auto dependency to the current prospect of mitigating congestion by virtue of eliminating parking through automated vehicles. Indeed, Google Scholar shows over 300 citations, with nearly 100 reported in Web of Science…. Read more  Hmmm….Details matter!  Alain


On the More Technical Side

http://orfe.princeton.edu/~alaink/SmartDrivingCars/Papers/


Recompiled Old News & Smiles:

Videos of Automated Emergency (Non) Braking


Half-baked stuff that probably doesn’t deserve your time:

Hyperloop Connecting Helsinki and Stockholm Turns 300-Mile Trip Into 28 Minute Ride

A. Walker, July 5, "Where will the first Hyperloop be? So far there are plans to use the tubular transportation system to move passengers in Slovakia and freight in Switzerland. But a proposed application for the Hyperloop announced today could solve a transportation conundrum that has been challenging planners for centuries: Connecting the neighboring nations of Sweden and Finland…"  Read more  Hmmm….Do-able, since one actually was built and operated 150 years ago in NYC by Alfred E. Beach, resurrected by Lawrence Edwards as Gravity-Vacuum Transit in the mid-1960s and reinvented several times since (Vactrain), only to be "Al Gored" (self-proclaimed "creator" of the internet) by Elon Musk.  Given the graveyard of failures, the initial investors better put a lot of lipstick on this pig and flip it quickly.  Alain


C’mon Man!  (These folks didn’t get/read the memo)

LG Electronics says to jointly develop connected car platform with Volkswagen

Reuters, July 6, "LG, in a statement, said it and Volkswagen will work to jointly develop over "the next few years" technologies allowing drivers to control and monitor devices in their homes such as lights and security systems, as well as in-vehicle entertainment technologies and an alerting system for drivers providing "recommendations" based on real-time situations.

Automakers and technology companies have been forming partnerships in recent years, as the race to develop self-driving cars has created need for more sophisticated components and software that will allow vehicles to seamlessly communicate with various external devices and servers via the internet.

LG Electronics …has identified the auto industry as a new growth driver and has been pushing to grow new businesses amid continued struggles for its mobile phones division."  Read more  Hmmm….How can VW, whose image is so shattered, have this view of a "connected car"?  Instead of focusing on driving, a driver is now going to monitor their home and receive "recommendations".  If this is inspired by LG, its struggles are not over.  C’mon VW & LG  Alain


Calendar of Upcoming Events:


Sept 15 & 16, 2016
Arlington, VA


  
Sept 19-21, 2016
Antwerp, Belgium


Recent Highlights of:

Monday, July 11, 2016

 

Lessons From the Tesla Crash

Editorial Board, July 11, "A recent fatal crash in Florida involving a Tesla Model S is an example of how a new technology designed to make cars safer could, in some cases, make them more dangerous. These risks, however, could be minimized with better testing (Hmmm….Yes!) and regulations (Still too early; we don’t know enough yet)…Tesla’s electric cars are not self-driving, but when the Autopilot system is engaged it can keep the car in a lane, adjust its speed to keep up with traffic and brake to avoid collisions. Tesla says audio and visual alerts warn drivers to keep their hands on the steering wheel and watch the road. If a driver is unresponsive to the alerts, the car is programmed to slow itself to a stop.

Such warnings aren’t sufficient, though; some Tesla drivers, as shown in videos on YouTube, have even gotten into the back seat while the car was moving. Such reckless behavior threatens not just the drivers but everyone else on the road, too. (Absolutely!)… If that system (V2V) had been in place, Mr. Brown might have survived. (Sure, but Mr. Brown would have had to wait more than his normal expected life span before that system would have been adopted by more than 70% of all vehicles for it to have better than a "coin flip" chance of helping him.  What would have helped Mr. Brown is if the Automated Emergency Braking system had worked on his Tesla, or if the truck driver had seen him coming (not become distracted) and had not "failed to yield".)  Federal officials could take lessons from the history of airbags and the lack of strong regulations. (This is a VERY appropriate and relevant lesson!)… The agency does not yet have regulations for driverless cars or cars that have driver assistance systems. But when officials do put rules in place, they will have to update them regularly as they learn about how the technology works in practice. Automation should save lives. But nobody should expect these vehicles to be risk-free. (This is very wise.  They should also immediately focus on Automated Emergency Braking systems, which are the foundation of any Self-driving or Driverless systems.)  Read more  Hmmm….Comments in-line above.  Alain
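The "coin flip" arithmetic behind the V2V comment is worth making explicit: a V2V warning requires both vehicles in a conflict to be equipped, so if a fraction p of the fleet carries DSRC, a random pair of vehicles is covered with probability p·p. A quick check in Python:

```python
# A V2V warning needs BOTH vehicles in the conflict to be equipped.
# With fraction p of the fleet equipped, a random vehicle pair is
# covered with probability p * p.

def pair_coverage(p):
    """Probability that both vehicles in a random pair are V2V-equipped."""
    return p * p

print(round(pair_coverage(0.70), 2))  # → 0.49  (70% adoption: still under a coin flip)
print(round(0.5 ** 0.5, 3))           # → 0.707 (adoption needed for a 50/50 chance)
```

So even at 70% fleet penetration, which would take decades of vehicle turnover, a given two-vehicle conflict has slightly less than even odds of both parties being equipped.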

Tuesday, July 5, 2016

May 7 Crash

Hmmm…What we know now (and don’t know):

1.  On May 7, 2016 at about 4:40pm EDT, there was a crash between a Tesla and a Class 8 Tractor-Trailer. The accident is depicted in the Diagram from the Police Report: HSMV Crash Report # 85234095. (1)  Google Earth images from the site.

2. The driver of the Tesla was Joshua Brown.  "No citations have been issued, but the initial accident report from the FHP indicates the truck driver 'failed to yield right-of-way.'" (2)  Hmmm….No citations???  Did the truck have a data recorder?  Was the truck impounded; if so, how is the truck driver making a living since the crash?  Why was his truck not equipped with sensors that can warn him of collision risks at intersections?  As I’ve written, driving is one of the most dangerous occupations.  Why isn’t OSHA concerned about improving the environment of these workers?  Why doesn’t ATRI (the American Trucking Association’s research arm) recognize the lack of availability/adoption of "SmartDrivingTruck technology" as one of its Critical Issues?  Why didn’t his insurance agent encourage/convince him to equip his truck with collision-risk sensors?  If they aren’t commercially available, why hasn’t his insurance company invested in/promoted/lobbied for their development?  These low-volume rural highway intersections are very dangerous.  Technology could help.

"…(the truck driver)…said he saw the Tesla approaching in the left, eastbound lane. Then it crossed to the right lane and struck his trailer. "I don’t know why he went over to the slow lane when he had to have seen me,” he said…." (2)  Hmmm….If the driver saw the Tesla change lanes, why did he "fail to yield right-of-way"???

"…Meanwhile, the accident is stoking the debate on whether drivers are being lulled into a false sense of security by such technology. A man who lives on the property where Brown’s car came to rest some 900 feet from the intersection where the crash occurred said when he approached the wreckage 15 minutes after the crash, he could hear the DVD player. An FHP trooper on the scene told the property owner, Robert VanKavelaar, that a "Harry Potter" movie was showing on the DVD player, VanKavelaar told Reuters on Friday.

Another witness, Terence Mulligan, said he arrived at the scene before the first Florida state trooper and found "there was no movie playing."   "There was no music. I was at the car. Right at the car," Mulligan told Reuters on Friday.

Sergeant Kim Montes of the Florida Highway Patrol said on Friday that "there was a portable DVD player in the vehicle," but wouldn’t elaborate further on it. She also said there was no camera found, mounted on the dash or of any kind, in the wreckage….

…Mulligan said he was driving in the same westbound direction as the truck before it attempted to make a left turn across the eastbound lanes of U.S. Highway 27 Alternate when he spotted the Tesla traveling east.  Mulligan said the Tesla did not appear to be speeding on the road, which has a speed limit of 65 miles per hour, according to the FHP…." (2) .

3. "…the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents…" (3)  Not sure how Tesla knows what Joshua Brown saw or did not see.  Events prior to the crash unfolded over many seconds.  Tesla must have precise data on the car’s speed and steering angle and video for those many seconds prior to the crash, as well as what it was "seeing" from MobilEye’s cameras and radar data.  At no time prior to the crash did it see anything crossing its intended travel lane?  More important, why didn’t the truck driver see the Tesla?  WHAT WAS HE DOING?  What was the truck doing?  How slow was it going?  Hopefully there was a data recorder on the truck capturing its speed.  Was the truck impounded; if so, how is the truck driver making a living since the crash?

One can also ask: Why was the truck not equipped with sensors that can warn the driver of collision risks at intersections?  As I’ve written, driving is one of the most dangerous occupations.  Why isn’t OSHA concerned about improving this workplace environment?  Why doesn’t ATRI (the American Trucking Association’s research arm) recognize the lack of availability/adoption of "SmartDrivingTruck technology" as one of its Critical Issues?  Why didn’t the driver’s insurance agent encourage/convince him to equip his truck with collision-risk sensors?  If they aren’t commercially available, why hasn’t his insurance company invested in/promoted/lobbied for their development?  These low-volume rural highway intersections are very dangerous.  Technology could help.

While the discussion is about AutoPilot, the Tesla also has Automated Emergency Braking (AEB), which is supposed to always be on.  This seems more like an AEB failure than an AutoPilot failure; the Tesla didn’t just drive off the road.  The discussion about "hands-on-wheel" is irrelevant.  What was missing was "foot-on-brake" by the Tesla driver and "eyes-on-road" by, most importantly, the truck driver, since he initiated an action in violation of the "rules of the road" that may have made a crash unavoidable.

4. "Problem Description: A fatal highway crash involving a 2015 Tesla Model S which, according to Tesla, was operating with automated driving systems (“Autopilot”) engaged, calls for an examination
of the design and performance of any driving aids in use at the time of the crash."
(4).  Not to be picky, but the initiator of the crash was the failure to yield by the truck driver.  Why isn’t this human failure the most fundamental "Problem Description"?  If "driving aids" were supposed to "bail out" the truck driver’s failure to yield, why isn’t the AEB system’s "design and performance" being examined?  AutoPilot’s responsibility is to keep the Tesla from steering off the road (and, as a last resort, yield to the AEB).  The focus should be on AEB.  How many other Tesla drivers have perished who didn’t have AutoPilot on, but had AEB?  How many drivers of other cars that have AEB have perished?  It seems as if this crash was more about an automated emergency system failing to apply the brakes than about a driver not having his hands on the wheel.  Unfortunately, it is likely that we will eventually have a fatality in which an "AutoPilot" fails to keep a "Tesla" on the road (or in a "correct" lane), but from what is known so far, this does not seem to be that crash.

5. "What we learn here is that Mobileye’s system in Tesla’s Autopilot does gather the information from the vehicle’s sensors, primarily the front-facing camera and radar, but while it gathers the data, Mobileye’s tech can’t (or not well enough until 2018) recognize the side of vehicles and therefore it can’t work in a situation where braking is required to stop a Tesla from hitting the side of another vehicle.

Since Tesla pushed its 7.1 update earlier this year, the automaker’s own system used the same data to recognize anything, under adequate conditions, that could obstruct the path of the Tesla and if the radar’s reading is consistent with the data from the camera, it will apply the brakes.

Now that’s something that was put to the test by Model S owners earlier in the week:" (4). See video,  "In the last two tests, the Autopilot appears to detect an obstacle as evidenced by the forward collision warning alerts, but the automatic emergency braking didn’t activate, which raised questions – not unlike in the fatal crash.

Though as Tesla explained, the trailer was not detected in the fatal crash, the radar confused it for an overhead sign, but in the tests above, the forward collision warning system sent out an alert – though as evidenced by the fact that the test subject wasn’t hit, the AEB didn’t need to activate and therefore it didn’t. Tesla explains:

“AEB does not engage when an alternative collision avoidance strategy (e.g., driver steering) remains viable. Instead, when a collision threat is detected, forward collision warning alerts the driver to encourage them to take appropriate evasive action. AEB is a fallback safety feature that operates by design only at high levels of severity and should not be tested with live subjects.”…" Read more (5)  With all of the expertise that MobilEye has in image processing, it is surprising that it can’t recognize the side of a tractor trailer or gets confused by overhead signs and tunnel openings.  If overhead signs (and overpasses and tree canopies) are really the issue, then these can be readily geocoded and included in the digital map database.
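Taken together, the quoted statements describe a two-stage decision: the radar return must be corroborated by the camera before it is treated as a real obstacle (otherwise it may be dismissed as, say, an overhead sign), and even then AEB brakes only when no evasive option remains and the threat is severe; otherwise it merely warns. A sketch of that logic in Python; the function, its inputs, and the severity threshold are all illustrative assumptions, not Tesla’s or Mobileye’s actual code:

```python
# Sketch of the decision logic described in the quotes above: forward
# collision warning fires first, and AEB engages only when no alternative
# (e.g., driver steering) remains viable and severity is high.  All names
# and thresholds are illustrative assumptions.

def forward_collision_logic(radar_sees_obstacle, camera_sees_obstacle,
                            steering_escape_viable, severity):
    """Return the action taken: 'none', 'warn', or 'brake'."""
    # Require radar and camera to agree before treating the return as a
    # real obstacle (a radar-only return might be an overhead sign).
    if not (radar_sees_obstacle and camera_sees_obstacle):
        return "none"
    # AEB is a fallback: brake only when no evasive option remains and
    # the threat is severe; otherwise just warn the driver.
    if not steering_escape_viable and severity >= 0.9:
        return "brake"
    return "warn"

# Radar return discounted (e.g., taken for an overhead sign): no action.
print(forward_collision_logic(False, True, True, 1.0))   # → none
# Obstacle confirmed but a steering escape remains: warn only.
print(forward_collision_logic(True, True, True, 0.5))    # → warn
# No escape and high severity: brake.
print(forward_collision_logic(True, True, False, 1.0))   # → brake
```

In the fatal-crash scenario as described, the first check is the one that fails: the radar return was discounted as an overhead sign, so neither the warning nor the braking stage was ever reached.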

6.  It seems that all of the other stuff about the DVD player, watching movies, and previous postings on YouTube is noise.  Automated Collision Avoidance Systems and their Automated Emergency Braking sub-system MUST be more robust at mitigating "failed to yield right-of-way" situations, irrespective of whether the "failure to yield" derived from a human action (as seems to have occurred in this crash) or an "autoPilot" (which doesn’t seem to be the case in this crash).  Alain

(1) Self-Driving Tesla Was Involved in Fatal Crash, U.S. Says, NYT, June 30

(2) DVD player found in Tesla car in fatal May crash, Reuters, July 1

(3) A Tragic Loss, Tesla Blog, June 30

(4) NHTSA ODI Resume PE 16-007, Automatic vehicle control system, June 28, 2016

(5) Tesla elaborates on Autopilot’s automatic emergency braking capacity over Mobileye’s system, Electrek, July 2, 2016.  See also: Understanding the fatal Tesla accident on Autopilot and the NHTSA probe, July 2, 2016, and Tesla Autopilot partner Mobileye comments on fatal crash, says tech isn’t meant to avoid this type of accident [Updated], July 1, 2016

Monday, June 27, 2016

Who Will Build the Next Great Car Company?

E. Griffith, June 24, "…Also, he’s hit the decoy plenty of times. In 2012 he even did it in front of Ford’s board of directors.  Back then the idea of self-driving cars looked, to Ford’s leadership, like a frivolous Silicon Valley moonshot. Four years later things have dramatically changed. Today Ford’s vehicle lineup features more than 30 options for semiautonomous features, including the automatic brakes I tested, and the company is aggressively working on cars that fully drive themselves. By year-end the company expects to have the largest fleet of autonomous test vehicles of any automaker.

Ford is not alone. The entire automotive industry is in the midst of a radical transformation that is reshaping the very definition of what it means to be a car company. There is hype, hope, fear, and insecurity—and at the center of it all is the self-driving car. Thanks to cheap sensors, powerful machine-learning technology, and a kick in the butt from the likes of Google and Tesla Motors, driverless vehicles are becoming a sooner-than-you-think reality…." Read more  Hmmm….A very good summary of where the industry stands with respect to Self-driving; however, it really doesn’t address Driverless (autonomousTaxi (aTaxi) shared-ride on-demand transit).  It makes no mention of the low-speed EasyMile, 2GetThere, CityMobil2 approaches.  Fortune is still seeing a personal-car future and not a Mobility-on-Demand future.  That would be way too disruptive.  See also the intro video  Alain

Sunday, May 22, 2016

Derailment of Amtrak passenger train 188, Philadelphia, PA, May 12, 2015 NTSB/ DCA15MR010

Public meeting of May 17 "… Executive Summary…This report addresses the following safety issues:

  • Crewmember situational awareness and management of multiple tasks.
  • Positive train control. In the accident area, positive train control had not yet been implemented at the time of the accident, but it has since been implemented.  The NTSB found that the accident could have been avoided if positive train control or another control system had been in place to enforce the permanent speed restriction of 50 mph at the Franklin Junction curve.

Read more

Hmmm… Kudos to NTSB for finding "the accident could have been avoided if positive train control or another control system had been in place to enforce..."

HOWEVER, that PTC was mandated by Congress in 2008 with a deadline of December 31, 2015, and that 6 months before the deadline PTC had NOT been implemented on Amtrak’s highest-volume segment (PHL-NYC), is so unacceptable that this deserved to have been their #1 bullet, NOT some poor train engineer who was simply trying to do a job made enormously more dangerous and stressful because Amtrak management failed to implement in a timely manner what had been mandated by its "sugar daddy"!!  So the NTSB "threw" the engineer "under the bus" and essentially all of the news reports pointed to the engineer rather than Amtrak’s senior (mis)management (The Atlantic, NBC, Washington Post, WSJ, NYT, etc.  Why didn’t the NYT do a long story on why Amtrak management didn’t install PTC in a timely manner???)

My point here is larger: this same issue exists in the rest of the transit industry, where crash-avoidance technology exists today that can substantially reduce collisions while, in effect, printing money for the transit industry.  Dr. Jerome Lutin and I have pointed out, to deaf ears, that automated collision-avoidance systems exist today for buses whose costs are substantially less than the net present value of the liability that these buses can be expected to impose on society.  This is about the cash that a hopelessly bankrupt transit industry has to pay out because it isn’t installing crash-avoidance technology that is available today.  On top of that cash are all of the societal benefits associated with eliminating collisions.  There is no rush (not even a faint heartbeat) by the industry to do this.  FTA is totally asleep, yet bus drivers continue to be placed in some of the most stressful and unsafe working conditions without the help that such technologies can deliver.  I can’t be more blunt… The major cause of accidents in the transit industry is that transit-industry management is not installing in its fleets existing and available automated collision-avoidance systems.  What is even more derelict is that new bus procurements don’t include such provisions either.  When is the finger finally going to be pointed at "Management" and the FTA instead of the poor bus driver or train engineer?  NTSB is getting close by at least putting it 2nd, but if the public is to become aware, it will need to rise to the top bullet.  Alain
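The cost-versus-liability argument above boils down to a simple net-present-value comparison per bus. A minimal sketch, with all dollar figures, rates, and service life purely hypothetical illustrations (not the Lutin/Kornhauser numbers):

```python
def npv(cashflows, rate):
    """Discount a list of annual cashflows (year 0 first) at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical figures for illustration only
system_cost = 10_000             # up-front cost of a collision-avoidance retrofit, per bus
annual_liability_saved = 2_500   # expected annual liability payout avoided, per bus
years = 12                       # assumed remaining service life of the bus
discount_rate = 0.05

# If savings > 0, the retrofit pays for itself in avoided liability alone,
# before counting any of the broader societal benefits.
savings = npv([annual_liability_saved] * years, discount_rate) - system_cost
```

With these illustrative inputs the discounted liability avoided (about $23,000) comfortably exceeds the retrofit cost, which is the shape of the argument being made to the transit industry.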

Sunday, May 15, 2016

Extracting Cognition out of Images for the Purpose of Autonomous Driving

Chenyi Chen PhD Dissertation, "…the key part of the thesis, a direct perception approach is proposed to drive a car in a highway environment. In this approach, an input image is mapped to a small number of key perception indicators that directly relate to the affordance of a road/traffic state for driving….."  Read more  Hmmm… FPO 10:00am, May 16, 120 Sherrerd Hall: establishing a foundation for image-based autonomous driving using deep-learning neural networks trained in virtual environments. Very promising. Alain
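The "direct perception" idea is that a learned model outputs a handful of affordance indicators (e.g., heading angle relative to the road, lateral distance to the lane center), and a simple controller turns those into a driving command. A minimal sketch of such a controller; the proportional form and the gain are illustrative, not the dissertation's exact controller:

```python
def steering_command(angle, dist_to_center, road_width, gain=0.5):
    """Map two affordance indicators to a steering command.

    angle: heading angle between the car and the road tangent (radians)
    dist_to_center: lateral offset from the lane center (meters)
    road_width: lane width used to normalize the offset (meters)
    The proportional form and gain value are assumptions for illustration.
    """
    # Steer to cancel the heading error and pull the car back toward the
    # lane center; dist_to_center is normalized by the road width.
    return gain * (angle - dist_to_center / road_width)
```

The point of the approach is that the perception network only has to estimate a few such numbers from the image, rather than reconstruct the whole scene or regress the steering angle directly.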

Saturday, April 23, 2016

  N.J. superintendent killed while jogging was struck by student late for trip

K. Shea, April 19, "…The Robbinsville High School student who was driving the car that struck and killed the district’s superintendent Tuesday morning was late for a school trip when the crash occurred, according to two sources involved in the investigation.…" Read more Hmmm…Most tragic in so many dimensions!!!  HOWEVER, it was NOT the student that STRUCK the Superintendent, it was the CAR.  AND the CAR needs to start being held responsible for ALLOWING such tragedies to ruin so many lives.  It is very likely that this tragedy could have been averted had the car been equipped with an automated collision-avoidance system and/or lane-keeping system.  Given the availability of these "tragedy avoidance systems", we should all be asking why this CAR wasn’t equipped with such a system and why all cars aren’t so equipped.  Certainly innocent runners and dogs need to be asking such questions.  So too, that young lady’s car insurance company; it must be muttering: "shouda bought her that upgrade".  What about the car companies themselves, who are largely just sitting on the technology, or the dealerships that don’t feel compelled to espouse the benefits of such technology while pushing more "horsepower" and "Corinthian Leather" (and worse yet: "AppleCarXYZ" that distracts drivers)?  We all know that Washington is broken.  Them staying out of the way is probably best (although aggressively applying better human-visible paint/laneMarkings and human-readable signs would go a long way to helping both attentive drivers and automated lane-keeping systems).  Everyone else has a fundamental self-interest at stake, and each needs to stop pointing the finger at the frail human driver.  We have the technology and the self-interest to make mobility substantially safer.  Let’s really get on with it.  It’s time!   Alain

Friday, March 25, 2016

Hearing focus of SF 2569 Autonomous vehicles task force establishment and demonstration project for people with disabilities

March 23 Hmmm… Watch the video of the Committee Meeting.  The testimony is Excellent and very compelling! Also see Self-Driving Minnesota Alain

Thursday, March 17, 2016

U.S. DOT and IIHS announce historic commitment of 20 automakers to make automatic emergency braking standard on new vehicles

Press Release, Mar 17, NHTSA & IIHS "announced today a historic commitment by 20 automakers representing more than 99 percent of the U.S. auto market to make automatic emergency braking a standard feature on virtually all new cars no later than NHTSA’s 2022 reporting year, which begins Sept 1, 2022. Automakers making the commitment are Audi, BMW, FCA US LLC, Ford, General Motors, Honda, Hyundai, Jaguar Land Rover, Kia, Maserati, Mazda, Mercedes-Benz, Mitsubishi Motors, Nissan, Porsche, Subaru, Tesla Motors Inc., Toyota, Volkswagen and Volvo Car USA. The unprecedented commitment means that this important safety technology will be available to more consumers more quickly than would be possible through the regulatory process…The commitment takes into account the evolution of AEB technology. It requires a level of functionality that is in line with research and crash data demonstrating that such systems are substantially reducing crashes, but does not stand in the way of improved capabilities that are just beginning to emerge. The performance measures are based on real world data showing that vehicles with this level of capability are avoiding crashes…" Watch NHTSA video on AEB.  Download AEB video from IIHS.  Read more  Hmmmm…Fantastic!  Automakers leading, with the regulatory process staying out of the way.   Alain

Thursday, February 18, 2016

  Motor Vehicle Deaths Increase by Largest Percent in 50 Years

Press Release Feb 16 "With continued lower gasoline prices and an improving economy resulting in an estimated 3.5% increase in motor-vehicle mileage, the number of motor-vehicle deaths in 2015 totaled 38,300, up 8% from 2014.

The 2015 estimate is provisional and may be revised when more data are available. The total for 2015 was up 8% from the 2013 figure. The annual total for 2014 was 35,398, a less than 0.5% increase from 2013. The 2013 figure was 3% lower than 2012. The estimated annual population death rate is 11.87 deaths per 100,000 population, an increase of 7% from the 2014 rate. The estimated annual mileage death rate is 1.22 deaths per 100 million vehicle miles traveled, an increase of 5% from the 2014 rate. Read more Hmmmm…This is REALLY BAD news.  Come on, insurance: this is costing you money!  Accident rates going up means that your actuarial tables are behind, your regulated pricing lags, and you are losing money.  To get ahead of your actuarial tables, you MUST incentivize the adoption of automated collision-avoidance systems.  You’ll then do very well, thank you, AND help society.  Alain
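The press release's rates can be recovered from the underlying counts. A quick check, where the population and vehicle-miles-traveled figures are approximate values supplied here for illustration (not stated in the release):

```python
# Recompute the press release's 2015 rates from the underlying counts.
deaths_2015 = 38_300
deaths_2014 = 35_398
population = 322.7e6   # approximate 2015 US population (assumption)
vmt = 3.14e12          # approximate 2015 vehicle miles traveled (assumption)

pct_change = (deaths_2015 - deaths_2014) / deaths_2014 * 100
rate_per_100k = deaths_2015 / population * 100_000
rate_per_100m_vmt = deaths_2015 / vmt * 100e6

print(f"{pct_change:.0f}% increase over 2014")       # ≈ 8%
print(f"{rate_per_100k:.2f} deaths per 100,000")     # ≈ 11.87
print(f"{rate_per_100m_vmt:.2f} per 100M VMT")       # ≈ 1.22
```

The recomputed rates match the release's 11.87 per 100,000 population and 1.22 per 100 million VMT, so the quoted figures are internally consistent.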

Thursday, January 14, 2016

 Obama’s $4 Billion Plan for Self-Driving Cars Will Make Google Very Happy

M. Bergen, Jan 14 "The Obama Administration has seen the self-driving future, and it’s jumping aboard.  At the Detroit auto show on Thursday morning, U.S. Transportation Secretary Anthony Foxx will unveil a plan to develop a national blueprint for autonomous driving technology within the next six months.  He will also announce that President Obama is planning to insert $4 billion into the 2017 budget for a 10-year plan to support and “accelerate” vehicle automation projects.

“We are on the cusp of a new era in automotive technology with enormous potential to save lives, reduce greenhouse gas emissions, and transform mobility for the American people,” Secretary Foxx said in a statement. …But here’s the part of Foxx’s talk that really matters for Google: These national rules will allow fully driverless cars..." Read More  Hmmm… A few months ago it was $42M for Connected Vehicles. Today it is 100x that for automated vehicles! Finally, Secretary Foxx… "YES! YES! JESUS H. TAP-DANCING CHRIST… I HAVE SEEN THE LIGHT" (Blues Brothers)  Yea!!!!!   🙂 Alain

Sunday, January 3, 2016

 Google Pairs With Ford To Build Self-Driving Cars

J. Hyde & S. Carty, Dec. 21 "Google and Ford will create a joint venture to build self-driving vehicles with Google’s technology, a huge step by both companies toward a new business of automated ride sharing, …According to three sources familiar with the plans, the partnership is set to be announced by Ford at the Consumer Electronics Show in January. By pairing with Google, Ford gets a massive boost in self-driving software development; while the automaker has been experimenting with its own systems for years, it only revealed plans this month to begin testing on public streets in California….

Google already has several links to Ford; the head of the self-driving car project, John Krafcik, worked for 14 years at Ford, including a stint as head of truck engineering, and several other ex-Ford employees work in the unit as well. Former Ford chief executive Alan Mulally joined Google’s board last year.

And Ford executives have been clear for years that the company was ready to embrace a future where cars were sold as on-demand services. Ford CEO Mark Fields has repeatedly said Ford was thinking of itself “as a mobility company,” and what that would mean for its business." Read more  Hmmm…Not surprising, and not exclusive. 🙂 Alain

Sunday, December 19, 2015

Adam Jonas’ View on Autonomous Cars

Video similar to part of Adam’s luncheon talk @ the 2015 Florida Automated Vehicle Symposium on Dec 1.  Hmmm… Watch the Video, especially at the 13:12 mark.  Compelling, especially after the 60 Minutes segment above!  Also see his TipRanks.  Alain


This list is maintained by Alain Kornhauser and hosted by the Princeton University LISTSERV.

Unsubscribe | Re-subscribe

 
