White Surface Tesla Crash in Taiwan

Taiwan Tesla Accident. Source: Liberty Times

Guest essay by Eric Worrall

A Tesla on Autopilot crashed into the roof of an overturned truck in Taiwan on Monday, prompting concern about Autopilot’s ability to recognise visually confusing obstacles.

Watch an oblivious Tesla Model 3 smash into an overturned truck on a highway ‘while under Autopilot’

Driver braked but it was too late

TUE 2 JUN 2020 // 01:07 UTC

Video: A Tesla Model 3 plowed straight into the roof of an overturned truck lying across a highway in Taiwan, sparking fears the driver trusted the car’s Autopilot a little too much.

The smash occurred on Monday at 0640 local time (2240 UTC) and the drivers of both vehicles were unharmed according to Taiwan’s Liberty Times. You can see the accident for yourself below:

The prang reminds us of a previous case where a 40-year-old man was beheaded after his Tesla Model S, while in Autopilot mode, hit a white 18-wheeler tractor trailer in 2016.

Read more: https://www.theregister.com/2020/06/02/tesla_car_crash/

To the credit of Tesla’s safety technology, the driver survived a high-speed collision with a truck.

commieBob
June 5, 2020 10:16 am

The thing most people miss is just how good human drivers are at avoiding collisions. It’s also impressive how good Teslas are at avoiding hitting pedestrians. Even so, I suspect it will be a long time before Teslas, or any other self-driving cars, are up to human standards.

Scouser in AZ
Reply to  commieBob
June 5, 2020 10:50 am

Interesting that none of the other traffic seemed to slow down with a truck overturned blocking two lanes…

Clyde Spencer
Reply to  Scouser in AZ
June 5, 2020 11:18 am

And the Tesla was out in the passing lane, pretty much alone. It looks like most of the human drivers saw the obstacle long before the ‘passenger’ in the Tesla realized that there was a problem.

Reply to  Clyde Spencer
June 5, 2020 12:28 pm

What about the “driver”? Can’t he handle that, or is he completely out of the game?

yarpos
Reply to  Scouser in AZ
June 5, 2020 9:11 pm

Driving in Asia: if you slow down for every weird thing/incident you see, you won’t get anywhere.

Cliff E. Hilton
Reply to  Scouser in AZ
June 6, 2020 5:19 am

Scouser in AZ

You should visit Taiwan to understand why. My wife is made in Taiwan. We visit her family, who have now moved to the countryside, away from the busy city life.

BTW, you would feel very safe living there. Not much crime.

Laufangpi
Reply to  Cliff E. Hilton
June 8, 2020 6:52 am

LOL, I’m in the same position! One really needs to drive in Taiwan sometime to understand how surreal it can be. My father-in-law left me instructions as he turned over his keys, reminiscent of a scene in the ’80s movie The Gumball Rally: tearing off the rearview mirror and declaring, “What is behind you doesn’t matter.”

Unfortunately, what’s in front apparently doesn’t matter too much either!

HD Hoese
Reply to  commieBob
June 5, 2020 11:25 am

I was told quite a few years ago that automatic systems would have difficulty understanding non-verbal signals. These can come from the individual driver or the vehicle; think of four-way stops, among others. Robots need to play basketball against humans, interesting experiment.

John Endicott
Reply to  HD Hoese
June 8, 2020 2:04 am

Robots need to play basketball against humans, interesting experiment.

That was a Gilligan’s Island TV movie, “The Harlem Globetrotters on Gilligan’s Island” (1981). The robots were crushing it playing straight-up basketball in the first half, but the Globetrotters beat them in the second half by being their usual wacky selves rather than playing straight basketball.

Reply to  commieBob
June 5, 2020 12:47 pm

I would love to have all other passenger cars use a good, safe autopilot system… it would be a lot easier to make ’em get out of my way (which would, eventually, result in regulations against unduly disturbing autopilot cars).

Patrick B
Reply to  commieBob
June 5, 2020 1:03 pm

The problem I had from the beginning with Tesla’s “self drive” is that it encourages drivers to treat the car as if it were self-driving. In my mind that makes it so defective it should never have been permitted by the regulators. I can’t imagine any other field in which a device that is clearly life-threatening, and that encourages life-threatening behavior, would be permitted before it was perfected.

And I would not mind so much if it was only the driver and his passengers at risk, but it’s clear the use of the feature puts everyone on or near the road at risk.

Javert Chip
Reply to  Patrick B
June 5, 2020 8:38 pm

PB

Absolutely agree.

Any idiot thinking current state-of-the-art “auto-driving” is safe is one of the many definitions of a liberal arts graduate.

Yooper
Reply to  Patrick B
June 6, 2020 5:56 am

Hmmm…. What about autopilots on ships and airplanes?

MarkW
Reply to  Yooper
June 6, 2020 9:03 am

Both are much simpler problems.

Greg Cavanagh
Reply to  MarkW
June 6, 2020 10:46 am

And they are time-limited and hold a fixed heading.

They are also not fully automated, in that they don’t decide for themselves where they are going.

Richard Patton
Reply to  Yooper
June 6, 2020 7:12 pm

With both planes and ships, autopilot is just a mechanism to maintain speed and heading (and, in the case of planes, altitude); it does not do what the Tesla is trying to do. Even so, ships must always have someone on the bridge watching the radar to take evasive action if needed. Aircraft above FL180 are under ATC control, which monitors the aircraft for potential conflicts and advises the pilots of actions needed to avoid them.

Tom
Reply to  Yooper
June 8, 2020 7:43 am

A master, or captain, is legally required to maintain a watch at all times.

Autopilots are an aid to a master, not a substitute.

Michael Jankowski
Reply to  commieBob
June 5, 2020 3:21 pm

“…It’s also impressive how often Teslas are at avoiding hitting pedestrians…”

The autopilot didn’t notice the stranded one flagging down the Tesla in the accident video at all.

Philo
Reply to  Michael Jankowski
June 6, 2020 6:18 am

The Tesla briefly hit the brakes just after it passed the guy signaling along the median barrier. A puff of tire smoke, and the Tesla kept going straight into the truck.

A Darwin award to the driver for stupidity.
Some kudos for being well-buckled up, and to the Tesla for protecting the passenger compartment. It looked like a 50mph or so collision.

About time for traffic safety agencies to require the Autodrive to be disabled in all Teslas. If they can change the battery capacity wirelessly, they certainly can download a modified operating system.

Reply to  commieBob
June 6, 2020 1:23 pm

I think AI is still a lot more A than I

Stewart Pid
June 5, 2020 10:16 am

The credit for survival should maybe go to the foodstuff in the truck that cushioned the blow… it looked like a load of cake icing or some other goo.

Reply to  Stewart Pid
June 5, 2020 10:30 am

Exactly. I guarantee, had the load been something made of steel, the outcome would have been less happy.

Smart Rock
Reply to  Stewart Pid
June 5, 2020 11:33 am

The truck roof was obviously quite flimsy, so the car had the height of the truck’s box (2 metres?) plus the truck was light enough that it slid about 3 metres along the road, so the car had a 5-metre stopping distance. Plus any crumple zone built into the car itself, so probably a 6-metre stopping distance for the occupants. Easily survivable at 100 km/h with seat belts and air bags, probably even if the driver hadn’t braked.

Next one may not be so lucky.
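
For what it’s worth, the survivability estimate above holds up on the back of an envelope. A minimal constant-deceleration sketch (the 6-metre figure is the commenter’s estimate above, not a measured value):

```python
# Average deceleration for a stop from 100 km/h over ~6 m of combined
# truck-box crush and car crumple zone, assuming constant deceleration.
v = 100 / 3.6            # 100 km/h in m/s (~27.8 m/s)
d = 6.0                  # estimated stopping distance in metres
a = v ** 2 / (2 * d)     # from v^2 = 2*a*d
print(f"{a:.0f} m/s^2 = {a / 9.81:.1f} g")   # ~64 m/s^2, ~6.6 g
```

An average pulse in the 6-7 g range is well inside what belted, airbag-protected occupants routinely survive.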

John MacDonald
Reply to  Smart Rock
June 5, 2020 2:31 pm

Look closely, the car applied the brakes about 40 meters before the crash. Smoke from tires.
Truck roofs are made of light semi-transparent fiberglass to reduce weight up high and admit light into the interior.

mobihci
Reply to  John MacDonald
June 5, 2020 6:56 pm

No, that’s some white powder on the road. It didn’t slow down at all.

Watching some videos of close calls by autonomous cars: even though they are touted as the “saviour”, they are more often than not the cause of the problem. They fail to see cues that humans pick up from experience and an understanding of intent, e.g. someone wanting to barge into your lane may indicate and edge out of their lane to force the issue. The auto kicks in, supposedly saves the day, and puts everyone there in harm’s way, because the actual driver is not even looking at the problem in the first place. Just a mess that should not be allowed to happen on our roads.

Self-driving cars should be classed as a major distraction, more so than texting while driving, and therefore illegal.

Javert Chip
Reply to  mobihci
June 5, 2020 8:49 pm

My 2019 BMW has several useful “safety assists”, but it also has a couple of “active-control” functions (maintain-lane & lane-change warning) that drive me nuts (under certain circumstances, both dramatically affect steering).

The last thing I want during the split-second response required to avoid danger is a fight with my car for control.

Latitude
Reply to  Stewart Pid
June 5, 2020 1:46 pm

Exactly… the Tesla ran into the roof of a big pillow.

scott finegan
June 5, 2020 10:19 am

Doesn’t look like the Tesla detected the pedestrian in the road either.

Roger Knights
Reply to  scott finegan
June 5, 2020 1:37 pm

That pedestrian was, I’ve read, the driver of the truck, attempting to warn off approaching traffic.

Reply to  scott finegan
June 6, 2020 6:49 am

“Doesn’t look like the Tesla detected the pedestrian in the road either.”

If you watch the footage closely, the pedestrian is never in the path of the car so what would you expect the auto pilot to actually do? Pedestrians must walk within a couple of feet of passing cars all the time.

MarkW
June 5, 2020 10:19 am

With all of the problems, I’m surprised that nobody has sued Tesla for including an “autodrive” feature that is obviously not ready for prime time.

Harry Davidson
Reply to  MarkW
June 5, 2020 12:08 pm

What is the accident rate per km driven for Tesla, compared to humans? I would not be surprised if it were already better.

MarkW
Reply to  Harry Davidson
June 5, 2020 12:45 pm

Better than humans? Not for many decades.

MarkG
Reply to  Harry Davidson
June 5, 2020 5:33 pm

“What is the Accidents per Km driven for Tesla, compared to humans.”

Here’s an idea. Drive a Tesla to Land’s End, program it to drive to John O’Groats, get out, and see how far it can get before it crashes.

Javert Chip
Reply to  Harry Davidson
June 5, 2020 8:52 pm

Whatever the stats, make sure the driving environments are equivalent.

Reply to  MarkW
June 6, 2020 1:30 pm

Maybe designed by the smart system designer on the Boeing 737MAX?

Carl Friis-Hansen
June 5, 2020 10:20 am

Interesting.
I thought those kinds of scenarios were already considered. Even a blind mouse could have seen the obstacle half a mile away. So was it because the software only recognizes trucks, buses and cars upright and from behind?

Reply to  Carl Friis-Hansen
June 5, 2020 11:09 am

Computer vision is a very challenging field of study. The processing capabilities of the human mind are way ahead of what’s available now. You can train computer vision to spot specific things, but to generalize into a category as broad as general threats from a lot of different things is exceedingly difficult.

MarkW
Reply to  Carl Friis-Hansen
June 5, 2020 12:44 pm

I’ll go out on a limb and guess that the autodrive wasn’t able to distinguish the white truck top from the horizon.

Javert Chip
Reply to  MarkW
June 5, 2020 8:54 pm

Just a wild-assed guess, but I doubt the horizon in Taiwan is white. Sweden, maybe.

yarpos
Reply to  Javert Chip
June 5, 2020 9:13 pm

Not a wild-ass guess; the “white” truck roof will be a dirty grey, as will the sky.

MarkW
Reply to  Javert Chip
June 6, 2020 9:05 am

Looking at the picture above, the horizon is pretty close to white.

If you have a lot of water vapor in the air, the horizon is going to be white.

Reply to  MarkW
June 6, 2020 6:42 am

Have you seen the footage?

There is no question in my mind the truck was seen, but it was probably either misclassified or, even more likely, classified as a truck while the AI couldn’t determine that it was a truck in a situation requiring an action (i.e. stop or swerve), because it was never trained to deal with trucks on their sides.

AI systems are stupid. If they don’t recognize something as matching what they have experienced before, it may as well not exist.
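
A toy sketch of the failure mode being described, assuming a planner that only acts on detections above a confidence threshold (all names and numbers below are invented for illustration; this is not Tesla’s actual stack):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float  # classifier confidence, 0.0 to 1.0

BRAKE_THRESHOLD = 0.7  # hypothetical action threshold

def plan(detections: list[Detection]) -> str:
    """Brake only for confidently classified obstacles."""
    for d in detections:
        if d.label == "obstacle" and d.confidence >= BRAKE_THRESHOLD:
            return "BRAKE"
    return "CONTINUE"

# An upright truck seen from behind resembles the training data well:
print(plan([Detection("obstacle", 0.95)]))   # -> BRAKE
# An overturned truck roof the network never trained on may score low,
# or flip between labels, and never cross the action threshold:
print(plan([Detection("obstacle", 0.35)]))   # -> CONTINUE
```

tty’s account of the Uber crash further down the thread, with the classifier repeatedly re-identifying the pedestrian and “starting over”, reads like exactly this kind of gate failing to latch.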

Louis Hunt
Reply to  Carl Friis-Hansen
June 5, 2020 2:18 pm

I think the roadrunner painted a tunnel on the roof of the truck. How does a computer tell the difference between a real tunnel and a painting?

Javert Chip
Reply to  Louis Hunt
June 5, 2020 8:55 pm

Among other things, radar.

yarpos
Reply to  Javert Chip
June 5, 2020 9:09 pm

Which Elon says Teslas don’t need. Lidar is stupid, apparently, and you can do it all with optical recognition.

Reply to  yarpos
June 6, 2020 6:59 am

Teslas have radar. And you and I drive just fine with optical recognition only, so in theory a computerized version of that is capable of doing it better than us, because it won’t be distracted. We don’t have such a system yet, obviously.

Reply to  Carl Friis-Hansen
June 5, 2020 7:02 pm

Adding data from an old-fashioned radar system to the programming would have easily detected a large object ‘approaching’ at the same speed as the car was moving. The safety systems on my car (Lexus GS 350) would have been screaming at me to brake. Not having a system like this indicates either that they are ‘purists’ who want a totally new concept exclusively using computer vision, or that it is a cost-cutting measure. Either way, bad move.

ChrisW
June 5, 2020 10:23 am

I guess this means that the Tesla autopilot system won’t work very well in a snow storm. Despite all of the scary climate predictions, we still get snow up here in Wisconsin. This is another reason I’ll pass on Tesla (and other self-driving technologies).

ironargonaut
Reply to  ChrisW
June 5, 2020 10:54 am

Have you also noticed you don’t hear about testing in cold climates? Only in Cali, Arizona and Israel.

Stewart Pid
Reply to  ironargonaut
June 5, 2020 1:07 pm

Car & Driver had a Tesla Model 3 in Michigan this past winter: https://www.caranddriver.com/reviews/a30209598/2019-tesla-model-3-reliability-maintenance/

From the article: “So, where have we driven our Model 3? Not very far. We’ve mostly stayed in our home state of Michigan as we’ve soldiered through winter. Its logbook is full of anxiety-ridden comments about near misses on range, which we’ve been chewing through at a rate that’s roughly 50 percent higher than predicted.”

niceguy
Reply to  ironargonaut
June 5, 2020 2:16 pm

One electric plane needed heating blankets on the wings before it could be started, in a demo in France, not even in winter.

MarkG
Reply to  ironargonaut
June 5, 2020 5:38 pm

“Have you noticed also you don’t hear about testing in cold climates?”

According to a Tesla driver–I think he may be the one I see occasionally around the city–here in Canada where the temperature drops to 40 below zero or lower on some days, the car loses about half its range in the winter.

Trying to Play Nice
Reply to  ChrisW
June 5, 2020 11:27 am

My brother complained that his lane detection system only works when the weather is perfect and he doesn’t need it. When he needs help during bad weather it can’t tell where the lanes are.

Rob
Reply to  ChrisW
June 5, 2020 12:49 pm

I have had to push the same Tesla out of a fairly small snow drift twice here in Canada. The kind of drift that any normal car can handle stumps the electronic drive system. I spoke to another Tesla owner in Canada and he said it was only when he got an upgrade that he heard about the snow setting which allows you to turn off the anti-spin setting. I guess these anti-spin features are on a lot of cars, but at slow speeds they will let you spin the wheels until you get traction in snow. Fully electronic drive still needs to be programmed.

Teslas are **** heavy as well!

Stewart Pid
Reply to  Rob
June 5, 2020 9:35 pm

My 2008 Camry hybrid had the same issue, with the electronic nannies being so protective of the transmission that you would be left sitting when there was glare ice. Good Michelin X-Ice snow tires solved that problem for many years, but now my daughter runs Toyo Celsius all-weather tires… almost as good as the Michelins, but without the seasonal tire-changing hassle.

Jeff Labute
June 5, 2020 10:29 am

Dick Jones: “I’m sure it’s only a glitch. A temporary setback.”

I don’t know why anyone would ever trust autopilot. This person is a true pioneer.

Clyde Spencer
Reply to  Jeff Labute
June 5, 2020 11:20 am

Jeff
Almost a candidate for the infamous Posthumous Darwin Award.

MrGrimNasty
June 5, 2020 10:30 am

I suppose if you were going to pick a crash, the thin top of a truck on its side loaded with cardboard boxes is as soft and safe as it gets. Stuntman style – lucky.

RayG
June 5, 2020 10:33 am

Yes, as the video shows, self-driving technology is ready for prime time. I can hardly wait for self-driving long-haul semis carrying 80,000 lbs of cargo, or a triple-bottom rig with even more. And just think, fully automated aircraft may be just over the horizon.

co2isnotevil
Reply to  RayG
June 5, 2020 11:07 am

They already have fully automated aircraft. They’re called autonomous drones. They don’t carry passengers yet, but their competence with landings and takeoffs is on par with a human pilot’s. It’s also a whole lot easier for a plane to avoid collisions than it is for a motor vehicle. Autopilots for airplanes have been in ubiquitous use for many decades, and I’m unaware of a case where the autopilot caused a plane to crash into anything, with the possible exception of a plane crashing by running out of fuel due to a disabled crew. In that case, a smarter plane could change course to the nearest airport.

David Lilley
Reply to  co2isnotevil
June 5, 2020 12:47 pm

Boeing 737 MAX. OK, so it wasn’t strictly an autopilot issue, but it was still a dumb computer. Not only did it crash the plane, it wasn’t possible for the human pilot to override it.

Dodgy Geezer
Reply to  David Lilley
June 5, 2020 1:51 pm

My understanding is that it WAS possible to override it – if you knew how. This was difficult, because Boeing had taken this information out of the training schedule.

Also, the new plane had very heavy trim. You needed auto assistance to move it. When the auto system started to trim the plane nose-down, you could turn off all automation – but then you could not move the nose trim manually. The last crash happened when the pilots were forced to turn the auto-trim back on to move the trim back, and it promptly forced the nose hard down…

tty
Reply to  David Lilley
June 5, 2020 2:46 pm

“This was difficult, because Boeing had taken this data out of the training schedule.”

No, they had never put it in, because the sales department had promised the lead customer that there would be no changes in the manual requiring re-training.

Though the second crash is still quite incomprehensible to me. I’m just a retiree following the aviation world out of ex-professional interest, and still I knew quite well after the first crash how to handle the situation: trim the plane out with the electric trim, then pull the breaker and use the manual trim.

Also Boeing were exceptionally unlucky in having two AoA-sensor failures right after each other. In 40 years I never encountered one, or heard of one.

MarkG
Reply to  tty
June 5, 2020 5:41 pm

“I knew quite well after the first crash how to handle the situation: trim the plane out with the electric trim, then pull the breaker and use the manual trim.”

If I remember correctly, they tried that, but weren’t strong enough to override the trim manually. So they turned the electric trim back on, and that just made things worse.

I’m only a flight sim pilot who does some work in the aviation business, but even I can see what a crazy design that was.

AngryScotonFraggleRock
Reply to  tty
June 6, 2020 6:21 am

An AOA sensor failure should never cause an aircraft to crash. The humans involved failed to do their job. The first crew did a good job to begin with but then reinstated the hydraulic power to the STAB at high MACH#; this action allowed the powerful STAB to pitch the aircraft nose down with the pilots having no elevator authority to correct. The same happened with the second one – they had left takeoff thrust set and accelerated to a high IAS. Again, the elevator has less effect as the IAS increases and the mechanical trim becomes very difficult to move. Solution – slow down. When you cut through the diatribe about MCAS (not particularly well thought out by BOEING, either in design or implementation) these were really simple emergencies to deal with (I have had 2 or 3 similar ones in B747 Classics) – simply disconnect the STAB TRIM HYDS and DO NOT REINSTATE! Two perfectly flyable aircraft and many lives needlessly wasted.

Paul767
Reply to  David Lilley
June 6, 2020 11:01 am

The human pilots at both airlines were poorly trained. It could certainly and easily be shut off, as happened with the last aircraft on a flight the previous day. Boeing took a soft line on the pilot-training issue; as the airline was the biggest purchaser of Boeing aircraft in the world, they didn’t want to criticize it for the bad training. Poor decision on Boeing’s part! Search: New York Times Magazine, Boeing 737 Max.

tty
Reply to  co2isnotevil
June 5, 2020 2:33 pm

“I’m unaware of a case where the autopilot caused a plane to crash into anything”

Air New Zealand 901 is a case which immediately comes to mind. Caused by inserting faulty coordinates into the system.

Michael Jankowski
Reply to  tty
June 5, 2020 3:42 pm

I was somehow unfamiliar with this crash. Wow. Crazy story.

Reply to  tty
June 5, 2020 6:24 pm

Isn’t that human error? Obviously if a human directs the autopilot to crash the plane, it will crash, just as surely as a human pilot intent on crashing a plane will do so. Unless a brick wall is the same color and texture as the road, I don’t think you can even get a Tesla to crash into it, and many new cars with the latest safety systems won’t even let you crash into something like that under manual control. Nonetheless, flying airplanes is still far easier to automate safely than driving cars.

Some of the most advanced autonomous autopilot systems can already closely follow terrain with or without a map and won’t be surprised by things like mountains. In human assist mode, they would generate loud obnoxious warnings when safety is being compromised by overriding automatic controls or directing the plane to crash. I can even see a future where a pilotless plane is preferred since it removes human error, which is by and large the cause of most crashes, plus there will be some seats with a great view in the front of the plane.

MarkG
Reply to  co2isnotevil
June 5, 2020 5:35 pm

“Autopilots for airplanes have been in ubiquitous use for many decades and I’m unaware of a case where the autopilot caused a plane to crash into anything”

No, but whenever conditions get difficult, they dump the problem back on the pilots. A significant fraction of the time, the pilots then crash, and the crash is blamed on ‘pilot error’, not an autopilot failure.

AngryScotonFraggleRock
Reply to  MarkG
June 6, 2020 4:12 am

Autopilots are simply tools to help fly the aircraft. As a B777 captain, I have a choice – fly the aircraft by good old-fashioned stick-and-rudder, or delegate. Delegate to the First Officer (I continue to ‘fly’ the aircraft through him or her) or to the autopilot (likewise, I continue to fly the aircraft through this system). So I have two co-pilots. The autopilot still needs to be programmed and manipulated to put the aircraft precisely where I want it. This is what this Taiwanese muppet failed to do. The analogy I use is: you enter the office and switch on the computer; do you (a) sit back and see what happens, or (b) interact with the machine to produce results?

Richard Patton
Reply to  co2isnotevil
June 6, 2020 7:24 pm

Umm, drones aren’t automated. They are actually remotely operated aircraft. The things the military likes about them: if one goes down, the pilot isn’t lost; the lack of an onboard pilot means longer range, better maneuverability, or greater weapons load; and since pilots don’t have to be on board, they can be swapped out after an 8-hour day (actually, to maintain alertness, it is 4 hours, with the pilot doing other duties for the other 4) while the drone remains aloft for a long, long time.

Douglas Lampert
Reply to  RayG
June 5, 2020 11:18 am

Humans have all sorts of stupid crashes. It used to be that you could assume alcohol was involved in a really stupid crash (not always correctly, but it was the way to bet); nowadays, playing with your phone while driving may be overtaking it.

What matters are the rate and severity of the accidents, values for which I have no numbers that I trust (self-driving enthusiasts and companies sometimes give numbers, but I don’t trust them, as they have an obvious incentive to inflate their safety record). I wouldn’t use a self-driving car, but if you offered to give my father one, I might urge him to use it.

Javert Chip
Reply to  Douglas Lampert
June 5, 2020 9:05 pm

Let’s clear up some ambiguity: here’s hoping you have a good relationship with your father.

MarkW
Reply to  RayG
June 5, 2020 12:47 pm

Fully automated aircraft have been here for a decade or more.

Flying a plane is orders of magnitude easier than driving a car.

Philip
Reply to  MarkW
June 5, 2020 3:27 pm

Well, it was a while ago, but … Flying from CDG to LHR – a very short, 40 minute trip.
Everything went normally until the landing. It was one of the bumpiest landings I’d had in a long time, and it included a free bounce in the process.

As the plane was taxiing to the gate, the captain made a short announcement, which went something like this:

“Ladies and gentlemen, we up here in the cockpit feel that you are owed an apology for that abysmal landing. We just want you to be aware that neither of the pilots was responsible. The company have been insisting that when conditions are good, we test out the automated landing system that this plane is equipped with. I think you may agree with us that it has a way to go before it can compete with a human pilot.”

I know that these days you can hardly tell the difference. But it took a while to get there, even after regulators had decided that this stuff actually was trustworthy with 100+ lives.

co2isnotevil
Reply to  Philip
June 5, 2020 8:09 pm

A lot of work has been done on this over the last few years, and it’s a good bet that the technology will soon surpass all capabilities of a human pilot; in many cases it already surpasses most. Many trains are already there; planes will follow, and after that other mass transit and autopiloted cars that are as safe as the safest driver at much faster speeds. We’re not there yet, but we are approaching quickly. If all cars were automated in congested cities, traffic would flow far better. A whole line of cars can accelerate simultaneously when a light changes, maintaining near contact with the cars in front and behind, much like a train. Keep in mind that what we are seeing now in autonomous driving is what was in the labs 4-5 years ago. What’s in the labs now is even more capable and less expensive to produce.

Javert Chip
Reply to  co2isnotevil
June 5, 2020 9:13 pm

Uh-huh.

I’ve been hearing that for the 50 years I worked around computers. Technology definitely has gotten better, but so has the realization of the true magnitude of the problem.

John Endicott
Reply to  co2isnotevil
June 8, 2020 2:31 am

Planes and trains generally travel in straight lines between two points (i.e. fewer choices need to be made getting from point A to point B; and while train lines can be all kinds of curvy, there’s still only one direction for the train to go: along the rail). Cars, not so much, as they can change direction at every intersection; that alone is magnitudes more complex.

Not only is their own “path” less complex, but planes and trains generally don’t have to worry about other planes and trains on complex paths of their own travelling at speed in close proximity all the time (and long before other planes and trains do approach such close proximity, the pilots/conductors are warned of the approaching danger and usually have time to decide how to deal with the problem).

In short, autopiloted cars have a long, long, long way to go before they’re ready for prime time (yes, even the ones in the labs now; heck, even the ones that will be in the labs 4-5 years from now).

chris moffatt
Reply to  Philip
June 6, 2020 8:58 am

The HS121 Trident had a fully automated landing system many years ago that worked, IIRC, off the standard ILS of the day. It worked well, but it was comforting to have a human pilot as backup…

MarkW
Reply to  Eric Worrall
June 5, 2020 8:07 pm

In some cities, that would just be normal Sunday morning traffic.

markl
June 5, 2020 10:34 am

Reminds me of climate modeling. You can’t input every conceivable variable into a program when the variables are unlimited, ever-changing and some unknown. This is proof that people are safer drivers than robot systems… today. I wonder what the driver was doing just before they realized the car didn’t recognize that the overturned truck was blocking the roadway?

Moderately Cross of East Anglia
Reply to  markl
June 5, 2020 11:16 am

signing his will…

Richard Patton
Reply to  markl
June 6, 2020 7:31 pm

Probably on his cell phone.

Steve Case
June 5, 2020 11:13 am

I still want an electric car; they are just WOW to drive. So far I’ve driven two of them. BUT Tesla can keep all the automatic driving, braking and other bells-and-whistles stuff.

markl
Reply to  Steve Case
June 5, 2020 12:31 pm

+1 But I’ll keep the auto braking.

co2isnotevil
Reply to  Steve Case
June 5, 2020 6:31 pm

Electric cars need an efficient gas-powered motor-generator to keep the batteries charged and provide enough power to sustain highway speeds.

Javert Chip
Reply to  co2isnotevil
June 5, 2020 9:18 pm

As soon as EVs get 600 miles on a charge (about as far as I want to drive in any given day, including road trips), I’ll be right behind Steve Case buying a Tesla…and I’ve been driving BMWs for the last 21 years.

June 5, 2020 11:21 am

Was Will Smith driving?
Is V.I.K.I. still pissed at him?

mario lento
June 5, 2020 11:24 am

What worries me is that software can be used to direct these ground drones to “do stuff”… think about it.

Please don’t call me “OK Boomer”; I love technology, but the technology is ripe for cyber takeover.

niceguy
Reply to  mario lento
June 5, 2020 2:29 pm

Think of all the murders one could commit “by accident”…

ozspeaksup
Reply to  niceguy
June 6, 2020 3:42 am

Yeah, and then have to cover it up,
until someone exposes it, and then you have to call THEM treasonous.
Free Julian Assange!

MaxP
June 5, 2020 11:29 am

Self driving vehicles? We’re not there. A beach ball dropped into traffic from an overpass would bring it all to a screeching halt.

Regards.

MaxP

n.n
June 5, 2020 11:29 am

Expect the unexpected.

Earthling2
June 5, 2020 11:43 am

Autopilot works great in predictable situations, like a port, for example, where the truck/robot just goes back and forth all day delivering containers to another part of the port or to the crane to be lifted off and put on the ship, or vice versa. They have that figured out 99.99%, for the most part. But it is obvious that a lot of the vehicular autopilot accidents that have happened on streets and highways were so simply avoidable that it appears this autopilot feature can never anticipate real-world circumstances the way an alert human driver can. Under ideal weather conditions, on a divided 4-lane highway, perhaps, but no way is the technology right now reliable for the real world of a simple parking lot or a street. I am surprised that the Lidar/Radar wouldn’t have picked this up as an obstacle, and that the car chose to just drive right into it. Obviously a huge FAIL.

Carl Friis-Hansen
Reply to  Earthling2
June 5, 2020 12:22 pm

“I am surprised that the Lidar/Radar wouldn’t have picked this up as an obstacle”

Suppose the roof of the truck was soft cloth and the laser beam was not adequately reflected back. That, in combination with the low contrast between the white roof and the white horizon.
Add to that that the truck roof was covering more than the full lane width, which could make it look like the sun shining through from the driver’s left-hand side.

My work colleague in Scotland once mistook an empty black plastic flowerpot for a black cat. We laughed a lot, but it just tells you how difficult the road can be to scan correctly.

boffin77
Reply to  Carl Friis-Hansen
June 5, 2020 3:10 pm

Carl, I wondered if the LIDAR/RADAR/machine learning was still recovering its sanity after the shock it got seeing that truck driver trying to flag it down (you can see it pump the brakes and swerve as it went by him). I see no indication that it even touched the brakes as it approached the truck.

MarkW
Reply to  Earthling2
June 5, 2020 12:56 pm

I don’t believe any of these systems are equipped with LIDAR or RADAR yet.
Two big problems:
1) Cost
2) Interference from the units in other cars.
Every car on the road would either have to operate at a different frequency or somehow communicate with the others so that no two cars transmit on the same frequency at the same time.

The auto-braking systems that I have read about use SONAR, but at low power.

tty
Reply to  MarkW
June 5, 2020 3:13 pm

The Uber car that crashed in Arizona had a LIDAR. It kept changing the identification of the lady leading a bicycle, from inanimate to cyclist to inanimate, to unknown etc, and starting over each time until a split second before the crash, when it gave up and alerted the driver, who of course hadn’t time to react.

Javert Chip
Reply to  MarkW
June 5, 2020 9:25 pm

My 2019 BMW 5-series adaptive cruise control uses radar.

I hadn’t really thought of MarkW’s observation regarding several cars simultaneously using radar.

_Jim
Reply to  MarkW
June 7, 2020 2:16 am

re: “Every car on the road would either have to operate at a different frequency or”

We are WAAAY beyond simply ‘spitting out a burst of RF’ (at a different frequency, even); the RF can be pulse-coded, uniquely identifying the source (or allowing a source to uniquely identify its own transmitted RF signal).

An example (and you know this): your home WiFi network. EVERYBODY (neighbors, etc.) can use the same RF channel at the same time precisely BECAUSE the RF is “coded” (modulated, in this case) with a unique DSSS (direct-sequence spread spectrum) spreading code…
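
A toy numerical illustration of that spreading-code idea (just the DSSS principle; nothing to do with any actual automotive radar implementation):

```python
# Two transmitters share the same channel, each spreading a data bit
# with its own pseudo-random +/-1 chip sequence. A receiver correlating
# against its own code recovers its own signal and largely rejects the
# other one.
import random

random.seed(0)
CHIPS = 64  # spreading-code length, chosen arbitrarily for the demo

def make_code(n: int) -> list[int]:
    return [random.choice((-1, 1)) for _ in range(n)]

def correlate(rx: list[int], code: list[int]) -> float:
    return sum(r * c for r, c in zip(rx, code)) / len(code)

code_a, code_b = make_code(CHIPS), make_code(CHIPS)

# Both cars transmit a "+1" bit at the same time on the same channel;
# the air interface sees the sum of the two spread signals.
rx = [a + b for a, b in zip(code_a, code_b)]

print(correlate(rx, code_a))       # ~1.0: receiver A recovers its bit
print(correlate(rx, code_b))       # ~1.0: receiver B likewise
print(correlate(code_a, code_b))   # near 0: the codes barely interact
```

Longer codes push the cross-correlation closer to zero, which is the trade-off John Endicott raises below: the rejection is statistical, not absolute.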

BTW, the SunCell guys are reporting having successfully reached the 100-hour “mile marker” on a several-hundred-kilowatt device, making it almost in the power class that could POWER a Tesla… the assumption is that the device is still operating presently, with the objective now being to run for an additional 100 hrs. This test run has been halted periodically for teardown and examination of the ‘reactor’ for any anomalous trends in the mechanical assembly, e.g. corrosion or ‘wear-through’ in the reactor walls.

John Endicott
Reply to  _Jim
June 8, 2020 2:42 am

Yeah, and when all of your neighbors and you are on the same channel, it can cause interference that degrades your wifi’s performance (particularly if one or more of your neighbors has a particularly strong signal, as might be the case when your neighbors are very close such as in an apartment building). Which is why it’s recommended you switch to a less busy channel to avoid such interference

John Endicott
Reply to  _Jim
June 8, 2020 3:02 am

In other words, two cars driving side by side on a multilane road could be close enough to interfere with each other’s signal when on the same channel, even with pulse coding. Add another two close in front of them, and two more close behind, and the potential interference is even greater. Pulse coding doesn’t do you much good when it’s being jammed like that.

tom0mason
June 5, 2020 11:48 am

Surely just having a modified version of something like a Leica Disto S910 Touch Laser Distance Meter ( https://www.engineersupply.com/Leica-Disto-S910-Touch-Laser-Distance-Meter-808183.aspx ) incorporated into the autopilot’s design could alert the driver to a possible hazard ahead (up to 300 meters, or about 1,000 feet).

tty
Reply to  tom0mason
June 5, 2020 3:03 pm

Try that on anything but a completely empty interstate in dead-flat country with no curves and you will get constant warnings. There is very rarely a completely free area 300 meters ahead.

That is the problem with GPWS in aircraft. The parameters for warning are so tight that there is usually not enough time to pull up. This is because when you get a warning you are supposed to pull up immediately, without trying to check whether it is a false alarm. If the parameters were less strict, it would be impossible to land in a lot of places. Even so, I’ve heard of one airport where you often get a GPWS warning on final, so you really can’t make a legal landing there.

tom0mason
Reply to  tty
June 6, 2020 2:28 am

The not-insurmountable problems you outline are exactly why I said “surely just having a modified version of something like a Leica Disto S910 Touch Laser Distance Meter”, i.e. ENGINEER the device to the requirements rather than just slotting it in unmodified. And it was my poor wording that suggested it must see a “possible hazard ahead (up to 300 meters or about 1,000 feet)”; it does not HAVE to see 300 meters ahead, just that this device (unmodified) could do that! It should be within the wit of engineers to modify and fit such a device appropriately to the circumstances, i.e. within required limits, maybe with speed-dependent variable distance monitoring (see the sketch below).
It is, after all, what much of engineering is about: easing difficulties by imaginative use of known technology, making problems practical and solvable.
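
A sketch of what that speed-dependent monitoring could look like (reaction time and braking figures assumed purely for illustration): warn only when an obstacle falls inside the distance actually needed to react and stop from the current speed.

```python
# Warn when an obstacle is inside reaction distance plus braking distance.
def warning_distance_m(speed_kmh: float,
                       reaction_s: float = 1.5,       # assumed reaction time
                       decel_ms2: float = 7.0) -> float:  # assumed braking
    """Distance needed to react and then brake to a stop from speed_kmh."""
    v = speed_kmh / 3.6                  # km/h -> m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

for kmh in (50, 100, 130):
    print(f"{kmh:>3} km/h -> warn inside {warning_distance_m(kmh):.0f} m")
# 50 km/h -> ~35 m, 100 km/h -> ~97 m, 130 km/h -> ~147 m
```

Even the 130 km/h case needs less than half of the 300-metre range quoted for the unmodified instrument.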

Ed Zuiderwijk
June 5, 2020 11:53 am

Bang = Stop

June 5, 2020 11:58 am

Teslas rely on visual data?
Microwave detectors don’t care about the color of an object.

tty
Reply to  Sam Grove
June 5, 2020 3:16 pm

No, but they have vastly worse definition than an optical system.

Reply to  tty
June 5, 2020 9:32 pm

You don’t need much definition to detect a wall approaching.

Reply to  Sam Grove
June 5, 2020 9:36 pm

And can be used as a complement to optical systems.

Smart Rock
June 5, 2020 12:04 pm

Not really relevant, but I’m wondering how an apparently undamaged truck ended up on its side, cross-wise to the road, in the fast lane of an expressway. No sign of any other vehicles being involved. The road is well fenced.

Reply to  Smart Rock
June 5, 2020 12:41 pm

There is a sort of forest corridor, so maybe there was a gust, but during the video the trees don’t move, so that’s not probable. Playing with his phone, reading a newspaper, lighting a cigarette, being drunk, or a wrongly placed load may be reasons.

Javert Chip
Reply to  Smart Rock
June 5, 2020 9:28 pm

Placed there by a Prius salesman…

John Endicott
Reply to  Smart Rock
June 8, 2020 2:47 am

He swerved to avoid the chicken that was crossing the road, the sudden turn causing the truck to tip over. Now quite why the chicken was crossing the road is a question no one has been able to adequately ascertain.

June 5, 2020 12:10 pm

From “boxed” theregister.com article Eric provides in his article: “. . . sparking fears the driver trusted the car’s Autopilot a little too much.”

Don’t you just love the spin put there: “sparking fears the driver trusted . . .”?

All the objective evidence is the driver DID in fact trust the Tesla’s Autopilot far too much . . . he freakin’ crashed at high velocity into an obvious-to-humans road obstacle.

Instead, how about “. . . leading to the conclusion that the driver trusted the Autopilot far too much.”

Climate believer
June 5, 2020 12:29 pm

Jeremy Clarkson challenged Audi back in 2017:
“You drive one of your driverless cars over the Death Road in Bolivia and I’ll buy one. Sit there with your hands folded and let it drive you up there, then squeeze past a lorry with half the tyre hanging over a 1,000 ft drop while the car drives itself. Fine, I’ll buy into it.”

It came as the UK government prepared to announce changes to regulations to allow developers to test self-driving cars on UK roads for the first time.

The Treasury sees the rules as the last barrier to advanced, on-road testing, and hopes it will help realise the Chancellor’s vision of autonomous cars on British roads by 2021.

…. they may be running a bit behind schedule.

MarkW
Reply to  Climate believer
June 5, 2020 1:02 pm

I don’t mind them doing tests, so long as there is a fully trained test driver, alert and ready to take over at an instant’s notice.

tty
Reply to  MarkW
June 5, 2020 3:27 pm

A basic design rule is to never design anything that requires a person to stay alert for a long period without actually doing anything. It can’t be done. Our nervous system simply doesn’t work like that.

FranBC
Reply to  tty
June 5, 2020 6:47 pm

+10

tsk tsk
Reply to  tty
June 5, 2020 7:52 pm

And our recovery time to full situational awareness is seconds. Full automation or none(*).

*Fly-by-wire automation excepted.

MarkW
Reply to  tty
June 5, 2020 8:10 pm

Key word: Tests

Reply to  tty
June 6, 2020 6:32 am

“It can’t be done. ”

Speak for yourself, it can. As a passenger, I usually find myself watching the road just as intently as the driver for the whole trip.

John Endicott
Reply to  tty
June 8, 2020 9:02 am

tty, in a test, they wouldn’t be “doing nothing” they’d be monitoring the automation’s responses (and taking control when it doesn’t respond appropriately). One of the things they certainly would be doing is looking out for things further down the road that potentially might trip up the automation (IE unusual traffic situations, unexpected objects in the roadway, road work, etc) and then watch the automation’s response as it approaches those things. For example, if this incident in the article happened during such a test, the test driver could have spotted the overturned truck long before the car got there, noticed the automation wasn’t responding to it as it should and taken control to avoid a crash.

Vanessa
June 5, 2020 12:34 pm

Musk was never as clever as he thought he was!!!! This proves it. The basic ability to see a solid, hard obstruction seems pretty easy and obvious to me, but the car has “NO EYES”!!!!!!!!

Javert Chip
Reply to  Vanessa
June 5, 2020 9:39 pm

Musk is not everybody’s piece of cake, but I’d say that carbon-based unit is doing one hell of a job.

Anybody who could sleep the night before a manned Falcon 9 launch (1.1 million pounds of liquid oxygen, kerosene, and a bunch of nuts & bolts) has my total respect.

PayPal to Tesla to SpaceX to The Boring Company (flame-throwers fit in there somewhere): you can’t accuse Elon of repeating himself.

John Endicott
Reply to  Vanessa
June 8, 2020 9:16 am

It’s not a lack of cleverness that’s the real issue here. It’s the rush to put out something that isn’t ready for prime time.

ColMosby
June 5, 2020 12:56 pm

The not-very-auto autopilot. How can Tesla claim that their system is an autopilot? They have been sued before when fatalities occurred, in cases where the object struck was a lot more solid than the aluminum siding of a truck. Tesla believes its legal liability is covered by its recommendation that drivers never let go of the steering wheel. Tesla just plain sucks.

Simon
June 5, 2020 1:10 pm

On that day:
Self-driving accidents around the world = 1; human error = 24,673.

Reply to  Simon
June 5, 2020 2:24 pm

No idea how many self-driving cars are on the roads, but in relation to human-driven ones…
BTW, “1” is wrong; we had the discussion here a month earlier: not a truck, but a pedestrian crossing the road in darkness.
Also:

In any case, Autopilot has been involved in three fatal Tesla accidents since 2016. The National Transportation Safety Board has complained that Autopilot had too few safety precautions and allowed the driver to be distracted from his actual tasks. (German source, via Google Translate)

https://en.wikipedia.org/wiki/List_of_self-driving_car_fatalities

tty
Reply to  Simon
June 5, 2020 3:22 pm

On that day:
Self-driving cars around the world = 1,000; human-driven = 1,000,000,000.

John Dilks
Reply to  Simon
June 5, 2020 4:57 pm

Sorry Simon,
That does not compute. There were many millions more human driven cars that day than Auto-Pilot cars.

MarkW
Reply to  Simon
June 5, 2020 8:12 pm

Which means that, on a per-capita basis, self-driving cars had hundreds of times more accidents than human-driven cars did.

You just aren’t very good at this math thing, are you, Simon?

John Endicott
Reply to  MarkW
June 8, 2020 2:52 am

That’s why he’s simple Simon: complex stuff, like math, is just beyond his abilities.

Alex
June 5, 2020 1:31 pm

Here, it looks like this Tesla has really poor brakes.

Consumer Reports:
“The Tesla’s stopping distance of 152 feet from 60 mph was far worse than any contemporary car we’ve tested and about 7 feet longer than the stopping distance of a Ford F-150 full-sized pickup.”

152 feet is some 46 meters!
Facepalm.
Tesla simply has no brakes.

My MB stops in 35 m from that speed.
Well, my MB has already performed automatic emergency braking several times on the autobahn, saving at least the car, if not my life.
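
As a cross-check, straight constant-deceleration kinematics puts those two figures about a quarter of a g apart (a sketch that ignores brake fade, load and surface):

```python
# Implied average deceleration from a 60 mph stop: a = v^2 / (2*d).
G = 9.81  # m/s^2

def decel_g(speed_mph: float, distance_m: float) -> float:
    v = speed_mph * 0.44704          # mph -> m/s
    return v * v / (2 * distance_m) / G

print(f"60 mph in 46.3 m (152 ft): {decel_g(60, 46.3):.2f} g")  # ~0.79 g
print(f"60 mph in 35.0 m:          {decel_g(60, 35.0):.2f} g")  # ~1.05 g
```

The 35 m stop works out to roughly the grip limit of good road tires, which suggests that figure was achieved on very good rubber.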

MarkW
Reply to  Alex
June 5, 2020 8:13 pm

It’s not the brakes, it’s the weight. Those batteries are heavy.

Javert Chip
Reply to  MarkW
June 5, 2020 9:46 pm

MarkW

Not that heavy.

The heaviest Tesla 3 (3500 to 4100 pounds) is only 9% heavier than the heaviest BMW series 3 (3500 to 3770 pounds).

Carguy Pete
June 5, 2020 1:43 pm

This just proves yet again that we are a long, long way from self-driving cars.
IMHO we will never have truly autonomous passenger vehicles. The whole technology is one class-action lawsuit away from extinction.

Robert of Texas
June 5, 2020 2:57 pm

Seems to me that the Tesla went dead center into that truck! Maybe they should send this technology to the military for shooting down incoming missiles?

MarkMcD
June 5, 2020 6:06 pm

Anyone taking bets that the proposed solution will be to ban white trucks?

Nashville
June 5, 2020 7:14 pm

I have a 2018 Ford Edge, 60-0 in 39 meters.
I trust the autonomous cruise control to a point, only had to ‘take over’ once.
This is my first car with all the new safety gizmos…like it so far.

tsk tsk
June 5, 2020 7:57 pm

CLOUD!

For some reason Teslas like to confuse trucks and clouds.

toorightmate
June 6, 2020 5:52 am

The truck rolled because some idiot packed the Tesla nose-down on top of the truck.
The idiot who loaded the truck that way should be sacked.

Alasdair Fairbairn
June 6, 2020 10:17 am

One of my concerns about driverless cars is courtesy. How on earth do you incorporate that into the software, particularly as it depends so much on visual communication between drivers, usually by eye and hand signals, subject to interpretation? There are endless examples of this courtesy enabling flow in heavy traffic, which we have all experienced, so I don’t need to give examples. I do wonder, though, how two such cars would react when they meet head-on on a single-track road, apart from stopping and digitally glaring at each other.

Lee
June 6, 2020 1:28 pm

Autonomous driving systems need to be trained to ‘see’ the road to use and the obstacles to avoid. They aren’t too good at independent thought, because they are not actually thinking! That said, I have been driving quite a long time, and I have never seen a truck overturned on a highway in that way. The auto-driver may well have interpreted the roof as a tunnel; it did hit it almost dead center (the same way a smart bomb hits a target dead center). If not now, then very soon the self-driving car will be safer than a human driver, and then we will have to fight for the right to drive.

John Endicott
Reply to  Lee
June 8, 2020 9:29 am

If not now, then very soon the self driving car will be safer than human driven

Not for a long time, as long as the roads are filled with chaotic human drivers. AI is a long ways from being able to handle “unexpected” things at any moment (Exhibit A – an overturned truck) whereas humans do a better (not perfect, but still considerably better) job of reacting to the unexpected things (be it the stupidity of their fellow drivers or a natural phenomena such as extremely bad weather that can block, obscure, interfere with or just generally foul up autonomous vehicles sensors). Self driving cars are great in “perfect” orderly conditions that pretty much only exist on a test track, not so great in the chaotic conditions of the real world. No amount of wishful thinking changes that reality.

June 6, 2020 4:03 pm

The new Shelby GT500 will do 0-100-0 mph in 10.8 seconds and in a touch over 800 feet.

Mine isn’t quite that fast, but I can’t understand distracted driving in a Tesla.

June 6, 2020 6:51 pm

Does the Tesla come with FREE crash reporting and Emergency Services Notification?