An autonomousTaxi (aTaxi) stop facilitating true ride-sharing to any destination within the autonomous transit system's Operational Design Domain. The first of what may well become a half million or so others, each strategically located to be less than a 5-minute walk from essentially any of the billion or so person trip ends made on any typical day in the USA (outside of Manhattan, whose subway stations provide comparable accessibility). Twenty million or so aTaxi vehicles could readily provide on-demand, shared-ride mobility from these ~0.5M aTaxi stops. Provided would be essentially the same 24/7 on-demand level-of-service that we provide for ourselves with our own conventional automobiles; however, this mobility would be affordably achieved using half the energy, creating half the pollution, eliminating essentially all the congestion, doubling conventional transit ridership, and making such improved mobility available to those who today can't, or wish not to, drive a conventional automobile. This is a MAJOR 1st. Alain
M. Kynaston, Nov. 21, "A common complaint during the school year for valley parents is traffic in school zones. Some smartphone apps aim to cut down on school traffic and congestion. For many parents at Desert Oasis High School, taking their teenager to school can be a nightmare. With Cactus Avenue and Rainbow Boulevard the only major intersection near the high school, some parents told us they wait up to 45 minutes in the drop-off line. There are several apps now for organizing an old-fashioned carpool...."
M. Sena, Dec. 2019, "Turning Over Vehicle Infotainment to Tech Titans
How did car navigation and infotainment get mixed up in the US presidential campaign? Ask that question of one of the candidates and they would not know what you were talking about. Ask the question of a CEO of one of the car OEMs, and they would refer you to their governmental affairs office, where the path would end. But mixed up it is, and if Elizabeth Warren wins the Democratic nomination and the presidential election, and if she follows through with her promise to voters, the big gold...." Continue reading
A. Li, Nov 21, "Ahead of more and more self-driving
cars appearing on the road, Waymo today published an
instructional video on how first responders should
interact with its autonomous vehicles in case of
roadside emergencies.
This instructional video is designed to equip first
responders to confidently identify, approach, and
interact with Waymo fully self-driving vehicles in a
wide range of collision and other emergency scenarios.
Not all of the techniques described will be required
during every interaction with a Waymo vehicle.
This “Guide for First Responders” offers “recommended techniques for emergency situations.” It starts off by explaining how to identify the 2017 Chrysler Pacifica Hybrid by the various sensors located on the roof, front, and sides of the vehicle...." Read
more Hmmmm... See
video. All part of the overhead that needs
to be done in order to begin to scale. Alain
K. Pyle, Nov 25, "Imagine a mobility service that offers on-demand, door-to-door capability for all citizens, regardless of physical ability. Gary Miskell, Chief Innovation/Technical Officer of VTA, Silicon Valley’s transit and congestion agency, has been imagining that scenario and has begun putting the pieces in place to test such an approach in a real-world setting at the campus of the Veteran’s Hospital in Palo Alto.
By focusing on one of the most difficult and expensive transportation challenges, paratransit, the applied research Miskell and his team are doing could have a huge long-term impact on the way mobility services are delivered in Silicon Valley. As Miskell describes in the above interview, an on-demand shuttle would meet the customer at her home. For those people needing help, a paratransit professional would meet that person and help her launch or end the trip...." Read more Hmmmm... See video. Very interesting. Alain
K. Laing, Nov. 21, "Gov. Gretchen Whitmer announced Thursday an agreement with self-driving car-vision technology developer Mobileye to create a pilot program to test the company's products in the ice and snow on Michigan roads. Whitmer's office said the program, which will last six months to take advantage of winter road conditions in Michigan, will involve Israel-based Mobileye...." Read more Hmmmm... Very nice. We need some of these initiatives to come to New Jersey. Alain
A. Rosenzwig, Nov. 25, "Teleoperation enables a human
operator to control vehicles from far away. In July,
Florida became the sixth state to explicitly allow
autonomous vehicle companies to use remote operators.
The teleoperation case
This is what makes the Florida legislation so
relevant. Now that both legislators and the AV
industry understand that full autonomy isn't as
imminent as once thought, we need to find alternative
solutions. That's where teleoperation comes in...." Excuse
me!!! Teleoperation is NOT an alternative
solution to "full autonomy" and should NOT be
promoted as such. It is a necessary element of
"full autonomy" that addresses the
rare/emergency instances where "full autonomy"
fails. It is not an easy way to extend the
Operational Design Domain. It is only a way to
address failures, crashes, and very unusual
circumstances...
"... When the AV encounters an edge case it cannot
negotiate, it stops and calls for a remote human
operator to assist. Over a reliable network connection
(based on a network bonding solution that utilizes
several independent cellular connections and fuses
them into one reliable connection), the operator
receives all relevant information from the vehicle
cameras and sensors and decides how to best resolve
the edge case for the vehicle.
This can involve indirect control, such as selecting
or drawing a path for the AV to execute, or direct
control, such as driving the vehicle using the
steering wheel and pedals at the teleoperation
station.
Teleoperation combines the safety of human
intervention with the scalability required for
commercial deployment. ..." Read
more Hmmmm... Excuse me again. This
is not about scalability. It is about being a
responsible commercial operation which can have
remote operation as one of its tools to use in
extremely rare circumstances. "Teleoperation"
is the wrong name for this because "operation"
implies a normal process. This is to be used in
abnormal/emergency situations only. At worst it
should be called "teleEmergencyRecovery" or
"teleFailureRecovery" or "teleFailureOperation"
or ??? but it should have "emergency" or
"failure" in the name to make certain that it is
understood to be rarely used. Alain
Press Release, Nov. 29, "Daimler and the General
Works Council have agreed on key points in order to
streamline the Group structure and thus increase
efficiency and flexibility. Therefore, measures to
reduce costs and employment in a socially responsible
manner have jointly been agreed upon. Daimler will,
among other things, use natural fluctuation to reduce
jobs. In addition, the possibilities for part-time
retirement will be expanded and a severance program
will be offered in Germany in order to reduce jobs in
the administration. The implementation of this Key
Points Agreement will be further developed with
employee representatives in the coming weeks.
Daimler aims to cut thousands of jobs worldwide by the
end of 2022. The agreed job protection in Germany
until the end of 2029, which was promised and agreed
upon in the spin-off of Mercedes-Benz Cars & Vans
and Daimler Trucks & Buses, stays untouched for
Daimler AG, Mercedes-Benz AG and Daimler Truck AG..."
Read more Hmmmm... 10 year job
protection must mean that Daimler expects to
grow substantially in those years and train its
German workforce for new skills. Alain
P. Baker, Nov. 27, "I discovered the videos in January. One of the first I saw was posted by the irreverent online car magazine Jalopnik. “Dangerous Idiot Sleeps While Driving His Tesla on Autopilot,” read the headline. The cellphone-captured footage did, as promised, appear to show a man snoozing in the driver’s seat of a Tesla as it zipped along a road somewhere outside Las Vegas. And because the man was asleep, it was safe to assume that navigation was indeed being handled by the car’s onboard driving-assistance system, which is called Autopilot...
These videos are magnetic not just because of the eerie images they contain, but also because, watching them, we can’t actually be sure what we’re seeing. Is this danger or safety or both at once? Perhaps in a different era we would have cried out in excitement: How cool! Today we are more tempted to gasp in shock and call out a warning: Wake up!" Read more Hmmm... And since none of the videographers called out the warning "Wake up!", the videos are fake! With this article the NYT Magazine amplifies the Fake News. Moreover, the water-bottle trick was used to fool the Daimler system in 2014, before AutoPilot was even released. As with everything, there are a very few who misuse perfectly good products. 94% of car crashes have human misuse/misbehavior as a contributing cause. Alain
M. Wisniewski, Nov. 26, "With the Illinois Tollway’s
permission and cooperation, Autobon’s truck will be on
the road starting Monday. A driver will be behind the
wheel at all times, according to Gebis.
“The driver still has full control,” Gebis said. “We
want to make sure the truck isn’t weaving in the lane
and it maintains a safe following distance. We want to
make sure it’s very precisely driving."...The test by
Autobon this week is a harbinger of something road users
are going to be seeing more of in the coming decades —
vehicles that are at least partially self-driving...."
Read
more Hmmm... Partially self-driving is not near being Driverless, and seemingly not focused on the carrier benefits of "partially self-driving". Alain
Too many to print...
StupidSummon is competing with cat videos as common
click-bait.
F. Fishkin, May 18, "From the 3rd Annual Princeton Smart Driving Car Summit, join Professor Alain Kornhauser and co-host Fred Fishkin. In this special edition, the summit's focus on mobility for all with guests Anil Lewis, Executive Director of Blindness Initiatives at the National Federation of the Blind and ITN America Founder Katherine Freund."
April 5, F. Fishkin, "The success of on demand transit company Via is proving that ride sharing systems can work. Public Policy head Andrei Greenawalt joins Princeton's Alain Kornhauser and co-host Fred Fishkin for a wide ranging discussion. Also: Uber, Tesla, Audi, Apple and Nuro are making headlines"
April 5, F. Fishkin, "Here comes congestion pricing in New York City...but what will it mean? Former city Taxi and Limousine Commission head and transportation expert Matthew Daus joins Princeton's Alain Kornhauser and co-host Fred Fishkin. Also...Tesla, VW and even Brexit! All on Episode 98 of Smart Driving Cars."
March 28, F. Fishkin, "The Future Networked Car? From Sweden, The Dispatcher publisher, Michael Sena, joins Princeton's Alain Kornhauser and co-host Fred Fishkin for the latest edition of Smart Driving Cars. Plus ...the Boeing story has much to do with autonomous vehicles and more. Tune in and subscribe."
F. Fishkin, Sept 6, "The coming new world of driverless cars! In Episode 55 of the Smart Driving Cars podcast former GM VP and adviser to Waymo Larry Burns chats with Princeton's Alain Kornhauser and Fred Fishkin about his new book "Autonomy: The Quest to Build the Driverless Car and How it Will Reshape Our World"
R. Wile, Nov 22, "Sen. Jeff
Brandes (R-St. Petersburg) had just finished
serving in the Army, and was looking to make a
name for himself in Tallahassee as a junior
representative. He came across a talk given by
the founder of Google’s driverless car
project.
He quickly realized the potential of
self-driving cars to transform many aspects of
daily life. Ever since, he has made it his
mission to turn Florida into what he calls “an
angel investor” in automation policy. “We want
to have policies in place for this technology
to flourish,” Brandes said in an interview at
the 7th Annual Florida Automated Vehicles
conference in Miami, which concluded Friday...."
A. Karpathy, Nov. 6, "Hear from Andrej Karpathy on how Tesla is using PyTorch to develop full self-driving capabilities for its vehicles, including AutoPilot and Smart Summon. ..." Read more Hmmmm... Worth watching the video (except for the StupidSummon part). Alain
Elon, you sell cars to individuals, at which point you relinquish control and responsibility, and thankfully, liability, for that car. Please do everything that you can to be certain that your cars are used responsibly at all times and that those individuals are alert and in control at all times; else, you'll re-acquire the responsibility and the liability. The burden of liability is not good for any business. Liability without control is a TrainWreck. The regulators won't save you. Alain
- If you get matched with a fully driverless car, you'll see a notification in your Waymo app that confirms the car won't have a trained driver up front....
- you can enjoy having the car all to yourself....
R. Mitchell, Oct. 4, " Smart Summon is
for parking lot use. But drivers have other ideas.
Tesla unleashed the latest twist in driverless car technology last week, raising more questions about whether autonomous vehicles are outracing public officials and safety regulators.
...Using a smartphone, a person can now
command a Tesla to turn itself on, back out of a
parking space and drive to the smartphone holder's
location - say at a curb in front of a Costco
store.." Read
more Hmmmm.... Russ, great article. A
must read!
Elon, please stop.
StupidSummon was a bad Valley-entitled
idea before you released it. Now that it
is out there it will ruin all that is good
about Tesla, AutoPilot and Driverless
cars. The shorters are going to have a
field day.
While you are at it, also remove all of the DistractTainment add-ons, or limit their use to when AutoPilot is NOT on and drivers are engaged in driving.
Just go back to V09! Along the way also
get the Automated Emergency Braking (AEB)
system to work properly (See NTSB
below). To do that, maybe you should
take a serious look at Velodyne's
new
Tesla LiDAR. It may be able to tell
you if the stationary object in the lane
ahead is high enough above the road
surface before your AEB system
decides to disregard it. Then Teslas may
stop decapitating
drivers.
If you don't remove StupidSummon, then at least be sure to limit its use to the Tesla owner's own private property by responsible users. (You know the GPS coordinates of where each owner lives, so you can geofence it. You also know of each irresponsible use; you get the videos.) Irresponsible use (use in violation of the conditions spelled out in the user's manual) should void its future availability in that car unless proper amends are made. If not, then
insurance companies should clearly state that
insuring the use of this feature requires a
substantial additional premium; else, you're not
covered. Courts should view that use of this
feature implies premeditated harm and
demonstrates an extreme indifference to human
life. Parking Lot owners should install signs
forbidding the use of this feature on their
property to protect themselves from being
dragged into the claims process.
K. Korosec, Sept 16, "Waymo transported 6,299
passengers in self-driving ...drivered,
not driverless...Chrysler
Pacifica minivans in its first month participating in a
robotaxi pilot program in California, according to a
quarterly report the company filed with the California
Public Utilities Commission.
In all, the company completed 4,678 passenger trips in
July — plus another 12 trips for educational purposes.
It’s a noteworthy figure for an inaugural effort that
pencils out to an average of 156 trips every day that
month. And it demonstrates that Waymo has the
resources, staff and vehicles to operate a self-driving
vehicle pilot while continuing to test its technology in
multiple cities and ramp up its Waymo One ride-hailing
service in Arizona...
The CPUC authorized in May 2018 two pilot programs for
transporting passengers in autonomous vehicles. The
first one, called the Drivered Autonomous
Vehicle Passenger Service Pilot program, allows
companies to operate a ride-hailing service using
autonomous vehicles as long as they follow specific
rules. Companies are not allowed to charge for rides, a
human safety driver must be behind the wheel and certain
data must be reported quarterly.
The second CPUC pilot would allow driverless passenger service — although no company has yet obtained that permit...." Read
more Hmmmm.... Be sure to look at the Waymo
Quarterly Report and that of the other 3
companies: Zoox,
AutoX
and Pony.ai.
Those 4 companies reported, respectively, [4,678; 103; 9; 0] vehicleTrips; [6,299; 134; 13; 0] personTrips; [59,917; 352; ?; 0] vehicleMiles; and [55; 10; 1; 0] unique vehicles used throughout the quarter. Note Waymo only began operating on July 2, the last month of the quarter [May, June, July].
Note: the CPUC does not permit casual shared-ride services (serving individuals or groups of individuals who weren't predisposed to travel together). Go figure??? Alain
Also note: This is Drivered Service, meaning there is an attendant/driver inside each vehicle for each trip; so this is actually conventional ride-hailing, a la Lyft/Uber with fancy schmancy vehicles. The CPUC did NOT require "disengagement reporting", so one has no idea as to the extent of driver/attendant involvement in the provision of the Drivered service. It will be interesting to learn if Waymo considers this activity to be part of its AV testing program and includes the disengagement performance of these vehicles in its disengagement report to the CA DMV at the end of the year. We'll be able to infer that the disengagement performance is exemplary when Waymo decides to begin Driverless service (w/o an attendant, as opposed to Drivered service).
- "Vehicle Data Recorder Specialist's Factual Findings": especially the charts of "Vehicle drive mode information": Figure 1 for the hour leading up to the crash and Figure 2 for just the 15 minutes prior. It is very interesting to have the precision and richness of data on the vehicle's behavior prior to the crash. Armed with this information, no wonder Elon wants to insure these cars. What is most interesting about these data is the chart of Lead Vehicle Distance (m). It shows that "lead vehicle distance" is not the instantaneous value obtained by the radar but some smoothed-out value of {previous readings plus the latest radar value} (else, there would be some discrete jumps in the data when other cars either cut in or cut out of the Tesla's lane ahead). Moreover, the appearance of a stationary object (approach speed = Tesla speed) in the lane ahead is disregarded (or very lightly weighted) in the determination of "lead vehicle distance": it grew to its saturated value (much greater than the distance to the firetruck) once the lead SUV had changed lanes (whenever that was determined to have occurred). At some point (possibly 490 msec before the crash, see below), the system decided that the stationary object detected ahead was not a "false reading" but actually a stationary object that should no longer be disregarded. Since it was being disregarded, the Traffic Aware Cruise Control (TACC) operated using a large value for "lead vehicle distance", so it began to accelerate to its desired cruise speed, as would be expected if "lead vehicle distance" is a large value. Yipe!!!!! If Elon hasn't already demanded it, NTSB should require Tesla, and all other manufacturers, to: 1. substantially improve the software/logic governing TACC's behavior during transitions involving a cut-out or a cut-in, and 2. substantially improve the reliability of the identification of stationary objects in the lane ahead, so that they cease to be assumed to be false alarms.
- "Vehicle Automation Data Summary Report": especially:
1. Figure 4, The speed of the Tesla in the last 221 seconds before the crash showing that the Tesla was traveling rather slowly in the 100 seconds before the crash (under 20 mph), but then accelerated (as discussed above) in the 3 seconds just prior to the crash, beginning as soon as the lead SUV changed lanes,
2. Figure 5, the distance between the Tesla and its lead vehicle, showing that the TACC worked really well until the lead vehicle "disappeared" (changed lanes), and "... Data show that at about 490 msec before the crash, the system detected a stationary object in the path of the Tesla. At that time, the forward collision warning was activated; the system presented a visual and auditory warning. Data also show that the AEB did not engage and that there was no driver-applied braking or steering prior to the crash. According to Tesla, the AEB was active at the time of the crash, and considering that the stopped fire truck was detected about half a second before impact, there likely was not sufficient time to activate the AEB." ...This implies that the AEB and its functioning in collaboration with the TACC needs to be substantially re-evaluated/re-designed. Alain
3. Figure 6, which clearly depicts the movement of the Tesla relative to the lead vehicle and the firetruck in the 15 seconds before the crash. The Tesla's radar and front-facing camera must have "seen" the firetruck 4 seconds before the crash and on every sensing loop (1/10th of a second) during the last 4 seconds, yet...
Friday, August 30, 2019
Former Star Google and Uber Engineer Charged With Theft of Trade Secrets
Tesla, July 16, "At Tesla, we believe that
technology can help improve safety. That’s why Tesla
vehicles are engineered to be the safest cars in the
world. We believe the unique combination of passive
safety, active safety, and automated driver assistance is
crucial for keeping not just Tesla drivers and passengers
safe, but all drivers on the road. It’s this notion that
grounds every decision we make – from the design of our
cars, to the software we introduce, to the features we
offer every Tesla owner.
Model S, X and 3 have achieved the lowest probability of
injury of any vehicle ever tested by the U.S. government’s
New Car Assessment Program.
... In the 2nd quarter, we registered one accident for every 3.27 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.19 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.41 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 498,000 miles.... " Read more Hmmmm.... This summary uses "accident" for Teslas and "crash" for NHTSA. This may suggest that the Tesla and NHTSA figures are not comparable... Tesla is reporting about apples and NHTSA is referring to "oranges". That noted, however, it does seem that for Teslas with and without AutoPilot and the other active safety features, there is consistency in the measure. A more detailed question arises about the equivalence of the driving domain for each category, as well as who is at fault in each of these situations. Even in light of these issues and details, the large variation in the rates, 3.27 v 2.19 v 1.41, is very significant among Teslas. Seems as if AutoPilot and Tesla's other active collision avoidance safety features are improving the safety of Teslas. The spread from the 0.5 value for NHTSA is really astonishing, making Teslas much safer than the average of all other cars. Unfortunately these numbers only scratch the surface and beg for more details. In the past I have called for an independent evaluation of the Tesla crash statistics and I do so again here today. I'll offer to do it. Tesla should encourage someone to do it. As it stands today, not enough people believe or trust Tesla (see below). That's unfortunate because improved safety is THE major objective of SmartDrivingCar technology. Alain
Oct 16, Establishes fully autonomous vehicle pilot program, A4573. Sponsors: Zwicker (D16); Benson (D14)
Oct 16, Establishes New Jersey Advanced Autonomous Vehicle Task Force, AJR164. Sponsors: Benson (D14); Zwicker (D16); Lampitt (D6)
May 24, "About 9:58 p.m., on Sunday,
March 18, 2018, an Uber Technologies, Inc. test vehicle,
based on a modified 2017 Volvo XC90 and operating with a
self-driving system in computer control mode, struck a
pedestrian on northbound Mill Avenue, in Tempe, Maricopa
County, Arizona.
...The vehicle was factory equipped with
several advanced driver assistance functions by Volvo
Cars, the original manufacturer. The systems included a
collision avoidance function with automatic emergency
braking, known as City Safety, as well as functions for
detecting driver alertness and road sign information.
All these Volvo functions are disabled when the test
vehicle is operated in computer control..."
Read more Hmmmm.... Uber must believe that its systems are better at collision avoidance and Automated Emergency Braking than Volvo's. At least this gets Volvo "off the hook".
"...According to data
obtained from the self-driving system, the system
first registered radar and LIDAR observations of the
pedestrian about 6 seconds before impact, when the
vehicle was traveling at 43 mph..." (= 63
feet/second). So the system started "seeing"
an obstacle when it was 63 x 6 = 378 feet
away... more than a football field,
including end zones!
"...As the vehicle and
pedestrian paths converged, the self-driving system
software classified the pedestrian as an unknown
object, as a vehicle, and then as a bicycle with
varying expectations of future travel path..." (NTSB: Please tell us precisely when it classified this "object" as a vehicle and be explicit about the expected "future travel paths." Forget the path; please just tell us the precise velocity vector that Uber's system attached to the "object", then the "vehicle". Why didn't the Uber system instruct the Volvo to begin to slow down (or speed up) to avoid a collision? If these paths (or velocity vectors) were not accurate, then why weren't they accurate? Why was the object classified as a "Vehicle"?? When did it finally classify the object as a "bicycle"? Why did it change classifications? How often was the classification of this object done? Please divulge the time and the outcome of each classification of this object.)
In the tests that Uber has done, how often
has the system mis-classified an object as a
"pedestrian" when the object was
actually an overpass, or an overhead
sign or overhead branches/leaves that
the car could safely pass under, or was
nothing at all?? (Basically, what are
the false alarm characteristics of
Uber's Self-driving sensor/software
system as a function of vehicle speed
and time-of-day?)
"...At 1.3 seconds before impact (impact speed was 39 mph = 57.2 ft/sec), the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision..." (1.3 x 57.2 = 74.4 ft, which is about equal to the braking distance; so it still could have stopped short.)
"...According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce (eradicate??) the potential for erratic vehicle behavior. ..." NTSB: Please describe/define "potential" and "erratic vehicle behavior". Also please uncover and divulge the design & decision process that Uber went through to decide that this risk (disabling the AEB) was worth the reward of eradicating "erratic vehicle behavior". This is fundamentally BAD design. If the Uber system's false alarm rate is so large that the best way to deal with false alarms is to turn off the AEB, then the system should never have been permitted on public roadways.
"...The vehicle operator is relied on to intervene and take action." Wow! If Uber's system fundamentally relies on a human to intervene, then Uber is nowhere near creating a Driverless vehicle. Without its own Driverless vehicle, Uber is past "Peak valuation".
Video similar to part of Adam's Luncheon talk @ 2015 Florida Automated Vehicle Symposium on Dec 1. Hmmm ... Watch Video especially at the 13:12 mark. Compelling; especially after the 60 Minutes segment above! Also see his TipRanks. Alain
This list is maintained by Alain Kornhauser and hosted by Princeton University.