Taiwan Tesla accident. Source: Liberty Times

Tesla Slammed Over “Full Self-Driving Capability” Claims

Guest essay by Eric Worrall

h/t Breitbart; California regulators are officially reviewing Tesla’s claim to have a “full self-driving capability”.

Tesla’s ‘Full Self-Driving Capability’ Falls Short of Its Name

The pricey option doesn’t make the car self-driving, and now Tesla’s promises are under scrutiny by state regulators in California

By Mike Monticello and Keith Barry
Last updated: May 19, 2021

The features might be cutting edge, even cool, but we think buyers should be wary of shelling out $10,000 for what electric car company Tesla calls its Full Self-Driving Capability option. Tesla claims every new vehicle it builds includes all the hardware necessary to be fully autonomous, and the company says that through future over-the-air software updates, its cars should eventually be capable of driving themselves—for a price.

But for now, Full Self-Driving Capability, which includes features that can assist the driver with parking, changing lanes on the highway, and even coming to a complete halt at traffic lights and stop signs, remains a misnomer. And as federal investigations of crashes involving Tesla vehicles add up, regulators are increasingly scrutinizing Tesla’s claims.

Earlier this week, the California Department of Motor Vehicles put Tesla “under review” for public statements that may violate state regulations that prohibit automakers from advertising vehicles for sale or lease as autonomous unless the vehicle meets the statutory and regulatory definition of an autonomous vehicle and the company holds a deployment permit, the agency’s press office confirmed to CR. 

As of May 2021, the National Highway Traffic Safety Administration (NHTSA) has initiated 28 special crash investigations into crashes involving Tesla vehicles with advanced driver assistance systems, including Autopilot and Full Self-Driving Capability. And safety experts worry that the automaker’s bold claims risk the kind of misuse that has been widely seen on social media, where some owners have demonstrated unsafe behavior by relying too much on the car’s autonomous abilities.  

Read more: https://www.consumerreports.org/autonomous-driving/tesla-full-self-driving-capability-review-falls-short-of-its-name/

The rest of the Consumer Reports article describes detailed testing of the self-driving features performed on a Tesla, and highlights some fascinating small-print caveats in Tesla’s description of its self-driving systems.

Mike
May 22, 2021 10:16 pm

I would only get into a self driving car under the barrel of a gun.

Rory Forbes
Reply to  Mike
May 22, 2021 10:39 pm

What would worry me in a “self driving” car is that I have no way of knowing who’s actually doing the driving. We just assume the programmers are as good at driving as we are … or possibly better. That’s a bad assumption.

Adam Gallon
Reply to  Rory Forbes
May 23, 2021 12:54 am

Looking at the way a large chunk of the public drives, you’re probably safer in a self-drive car.

Reply to  Adam Gallon
May 23, 2021 4:59 pm

No, I’m safer if THEY are in the s-d car!

Björn Eriksson
Reply to  Rory Forbes
May 23, 2021 3:58 am

I don’t understand your reasoning.
The car pays better attention and is never a distracted, angry, tired or bored driver. That is why, in a few years, cars will be much better drivers than humans; we are now in experimental mode. The programmers do not drive the cars; that is a misunderstanding. It is possible, likely even, that some programmers of the autopilot have never driven a car. The programmers’ driving skills do not factor into the autopilot’s skill. I think you know that, but it is worth spelling out: this is a paradigm shift in robotics.

Reply to  Björn Eriksson
May 23, 2021 8:03 am

Always “in a few years”.

Reply to  Gordon A. Dressler
May 23, 2021 4:59 pm

Yeah, “any day now,” as they used to say in the software business.

Rory Forbes
Reply to  Björn Eriksson
May 23, 2021 9:04 am

You’re far more confident in robotics than I am. Personally, I don’t see AI becoming ‘intelligent’ in a human sense any time soon.

The programmers’ driving skills do not factor into the autopilot’s skill.

I disagree. In fact, I believe it will be the lack of “skill” that will cause most of the problems. It’s the lack of skill that makes climate models useless. Besides, driving is fun, so why would anyone want to give that pleasure away? If you don’t enjoy driving, take a bus. Single-person transport is not for you.

MAL
Reply to  Rory Forbes
May 23, 2021 10:50 am

I enjoy driving; in fact, I once had a boss who told me I would not be happy unless I drove at least 750 miles a week. I am not a truck driver; I spent a lifetime fixing computers and installing them and the infrastructure to support them. I would welcome a fully functional autonomous car: my wife has gone blind so she cannot drive, and I am having problems staying awake. Driving has become a different ball game from what it was for me twenty years ago.

Rory Forbes
Reply to  MAL
May 23, 2021 10:59 am

Driving has become a different ball game from what it was for me twenty years ago.

For me too and since I’m closing in on 80, the things that make driving an increasing problem are piling up. However, I still enjoy the autonomy of deciding to go to the store, throwing on a jacket and shoes and arriving there minutes later.

There’s always taxis for those who find driving a challenge and it’s probably cheaper.

chris pasqualini
Reply to  Rory Forbes
May 23, 2021 1:10 pm

I, too, would welcome a fully autonomous car, as I can look forward and see inevitable physical decline rob me of independence. But I am highly skeptical that the technology will become available in time for me.

goracle
Reply to  chris pasqualini
May 23, 2021 3:18 pm

The only way for autonomous cars to be truly autonomous on public roads is if all cars are autonomous. Who’s at fault if the car is fully autonomous and gets into a fender bender: you? The car manufacturer? The software maker? The other autonomous car? The other driver? The other software maker? What if it’s the same software developer for both cars involved in the fender bender? Whose insurance will pay? This is a can of worms.

Rory Forbes
Reply to  chris pasqualini
May 23, 2021 3:19 pm

I absolutely KNOW I won’t live to see it happen.

Joe Crawford
Reply to  MAL
May 24, 2021 12:46 pm

“I would welcome a fully functional autonomous car…”

MAL, I would think that with your experience fixing computers for years there would be no way you would ever trust software designers, programmers and testers with your or your family’s life. I knew a software manager that believed in “error free programming.” I pointed out that the shortest Type 1 (IBM’s term for multiple installation company supported) program ever shipped was only a single line of code. It was executed by hundreds of customers for many years before an error was reported and had to be fixed. That fix doubled the size of the program.

Having worked in everything from field support and hardware/software design and development through system architecture, there is no way I would ever drive a fully software-controlled vehicle. I don’t even trust automatic braking on black ice without all-wheel drive and studded tires so I can safely override it.

Nashville
Reply to  Rory Forbes
May 23, 2021 7:12 pm

We bought a Used Ford Edge (2018) loaded.
The car can park itself, the adaptive cruise control on the highway is amazing. Set the distance you want to maintain, and it does it.
Nowhere near hands free, ( for me ) but the technology is advancing rapidly.

Streetcred
Reply to  Nashville
May 23, 2021 8:37 pm

That is standard fare in cars today … but it is not self-driving and it has not advanced much since your 2018 Ford Edge.

In fact, my experience driving prestige vehicles with all of the ‘bells and whistles’ is that they are damned dangerous. For example: violently pulling the car back into the lane you are trying to leave during a straightforward overtaking manoeuvre, then slamming the brakes on because the car (now) in front (again) is too close; or setting auto speed limiting incorrectly because it failed to read the speed limit signs.

I’ve turned off all the driver aids in my car, as I want to live longer.

Rory Forbes
Reply to  Nashville
May 23, 2021 9:01 pm

I would worry about the same problems mentioned by Streetcred. I don’t want some machine assuming it knows more about driving than I do … and that goes for the programmers too. My older model BMW does have cruise control, but that is easily overridden. I’m not sure I’d like to be on the road with drivers who would trust an autonomous vehicle.

Paul of Alexandria
Reply to  Björn Eriksson
May 23, 2021 11:36 am

Get back to me when the programmers put in basic common sense, like “if something is in front of you, don’t bother trying to classify it, just don’t hit it!” A woman walking her bike was killed last year because the AI couldn’t figure out how to classify her.

IAMPCBOB
Reply to  Björn Eriksson
May 23, 2021 12:32 pm

AS Mike said, above, only “under the barrel of a gun.” Maybe not even then!

Reply to  Björn Eriksson
May 23, 2021 3:32 pm

Sorry, but there have been too many times when gut instinct saved me from disaster. I don’t even know why I acted defensively in some situations, but I did, and it was the right thing to do. So how in the heck is a programmer going to program that?

Here’s a situation that should be obvious to all of us (not the same as above): you are driving down a residential street at the 35 mph speed limit. A soccer ball rolls into the street in front of you, but will completely cross before you get to it. Is everything good? What would the car do, i.e., how would the programmer treat an object that the car was in no danger of hitting?

I suspect most of us here would brake the car rather stiffly. There’s often a child running behind a loose ball. What is the likelihood of a programmer thinking about that in his zillion lines of code?
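The anticipation problem jtom describes can at least be written down as a rule. Here is a toy sketch in Python (purely hypothetical logic, invented for illustration — not any real vendor’s code) of a heuristic that brakes for what an object *implies*, not just for objects on a collision course:

```python
# Toy illustration of jtom's point: brake for the hazard an object implies
# (a child may chase a ball), not only for objects the car would hit.
# Hypothetical logic, not taken from any real driving system.

def hazard_response(obj_class: str, on_collision_course: bool) -> str:
    """Return a driving action for a detected roadside object."""
    # Objects that often precede an occluded pedestrian entering the road.
    precursor_objects = {"ball", "toy", "pet"}
    if on_collision_course:
        return "emergency_brake"
    if obj_class in precursor_objects:
        # Nothing will be hit, but slow down anyway: the real hazard
        # (a running child) may still be hidden behind parked cars.
        return "precautionary_brake"
    return "maintain_speed"

assert hazard_response("ball", on_collision_course=False) == "precautionary_brake"
assert hazard_response("pedestrian", on_collision_course=True) == "emergency_brake"
```

The hard part, of course, is not writing this rule but knowing to write it: the set of "precursor objects" has to be anticipated by a human in advance, which is exactly jtom’s point.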

Reply to  jtom
May 23, 2021 5:10 pm

Friend of mine was sitting on a side street, stopped at a major boulevard. The light turns green. He just sits there. His wife asks why he doesn’t go. He waits. ZHOOOM!! A car runs the red light at about 70 mph. “That’s why,” he says. He could not see the car coming from where he sat, but he kept his foot on the brake. Could he hear it? Maybe unconsciously. A robot vehicle would have gotten them killed, no maybe about that.

Lynx
Reply to  jorgekafkazar
May 23, 2021 5:32 pm

Incorrect, an autonomous vehicle would have detected it.

MarkG
Reply to  Lynx
May 23, 2021 7:54 pm

Yes. That’s always the answer you people give.

Magically, an autonomous vehicle would just have detected it. Because magic. Sorry, AI.

John Dilks
Reply to  Lynx
May 23, 2021 8:04 pm

BS. Not one coming at a right angle to the direction of travel, and at that speed.

Reply to  Lynx
May 24, 2021 2:37 pm

Lynx posted “Incorrect, an autonomous vehicle would have detected it.”

Hmmm . . . as of today, I wonder how many souls (particularly those of former Tesla drivers) shout from their graves “But, but, but the Tesla Autopilot should have detected that!”

Reply to  jorgekafkazar
May 23, 2021 7:01 pm

I’ve done that myself. TWICE.

The first time, fortunately, nobody else rabbited out, the car sailed right through. Not going all that fast, anyway; I think I saw the gray hair as they went past (for those who know Tucson, yes, this was during Snowbird Season).

The second time, oh GOD! Lady that was next to me hit the gas, and WHAM! Accident report said “in excess of 80 MPH.” Dead right there; she ended up wrapped around the post on the passenger side of her vehicle.

I have the habit, which I’ve taught to the wife and all of my kids, of taking a quick look both ways before going through an intersection. Whether I have the right of way or not! As one of my sisters taught me, there is frequently a disconnect between “being right” and “being alive.”

John
Reply to  Björn Eriksson
May 23, 2021 4:56 pm

You’d better hope that is not the case. Remember RoboCop, etc.
A delusional robot is worse than a delusional human. Remember Arthur C. Clarke’s HAL.

Tom Johnson
Reply to  Rory Forbes
May 23, 2021 4:35 am

I would worry more about the sensors than the programmers. What does “autopilot” do about:
Your lane partially blocked on a two lane road with oncoming traffic?
Overhead branches and obstructions?
Oncoming traffic passing in your lane?
Sudden snow, sleet, or rain squalls?
Side road traffic approaching with or without a stop sign?
Road repair crews?
Damaged guard rails?
Pot Holes?
Storm Damage lying in the road?
Accident damage lying in the road?
Flagmen allowing you to drive in oncoming traffic lane?
Check engine light comes on?
Etc.
Etc.
And another 100,000, or so other things I have missed.

noaaprogrammer
Reply to  Tom Johnson
May 23, 2021 7:36 am

Another sensor fail is misinterpreting large white paneled trucks crossing or stalled across your lane as gray sky.

MAL
Reply to  noaaprogrammer
May 23, 2021 10:51 am

Radar would fix that; why don’t they use it? Too expensive?

noaaprogrammer
Reply to  MAL
May 23, 2021 2:47 pm

Tesla has a sensor that uses radar but is considering discontinuing it.

Rory Forbes
Reply to  Tom Johnson
May 23, 2021 8:52 am

Yes, you’re right about the number of variables, unknowns, and things that require instant (or even deliberate) decisions. However, it was my understanding that it’s the programmer’s job to anticipate such things and write them into the software. Above all else, though, it’s human hubris that worries me. What happens when someone hacks the entire system?

IAMPCBOB
Reply to  Rory Forbes
May 23, 2021 12:39 pm

Exactly, and with Tesla being able to ‘upgrade’ them remotely, how can anyone be sure they won’t be hackable? We just had an oil pipeline get hacked!

Rory Forbes
Reply to  IAMPCBOB
May 23, 2021 12:49 pm

You know, the pipeline is what I was thinking about when I wrote that. I still can’t understand (or quite believe) how such an important facility, which should be totally bomb-proof, could be so vulnerable … and the hackers won.

Reply to  Rory Forbes
May 23, 2021 5:38 pm

IT security people should have skunk works diplomas.

Rory Forbes
Reply to  IAMPCBOB
May 24, 2021 11:13 am

What if Tesla itself goes rogue, in the way that YouTube, Twitter and the others have? That would be my concern. I drive a car to exercise my freedoms, not to live in a box.

Reply to  Rory Forbes
May 23, 2021 5:36 pm

What happens when someone uses a self-driving vehicle to send a rude contraption downtown at rush hour?

Reply to  Rory Forbes
May 24, 2021 10:32 am

As a programmer, I can guarantee that these programmers won’t anticipate every possibility. Even on relatively simple systems, when exposed to the real world, things come up constantly that were never anticipated.

Just last week we had a video of an autonomous taxi that couldn’t handle a construction zone.

Rory Forbes
Reply to  TonyG
May 24, 2021 11:19 am

Just last week we had a video of an autonomous taxi that couldn’t handle a construction zone.

I remember that. It would have made good comedy if we lived in fantasy land, but it was real and there are real dangers. The thing that no programmer can anticipate is what I call cascading errors, where chaos takes hold. One loses all reference points from which to anticipate a correction.

David A
Reply to  TonyG
May 24, 2021 1:23 pm

And an AI, having ZERO real intelligence or initiative, will, when confronted with that missed situation, proceed to crash, possibly killing the driver, passengers or another, without one nanosecond of awareness, and promptly repeat the same mistake the next day.

IAMPCBOB
Reply to  Tom Johnson
May 23, 2021 12:37 pm

Besides, with all the sensors we already have in our cars, having a sensor fail to work is a constant ‘thing’! What happens to the supposedly ‘intelligent’ cars when that happens? Does the ‘intelligent’ car pull over to the side of the road and request that a human drive? I doubt it, very much!

Rory Forbes
Reply to  IAMPCBOB
May 23, 2021 12:55 pm

The car simply takes itself to the nearest fully automated service station for repairs (with you in it), makes the necessary adjustments and then automatically deducts what it determines is the correct cost from your bank account.

What could possibly go wrong with that?

Reply to  IAMPCBOB
May 23, 2021 6:00 pm

If a man did the programming, it won’t even pull over to ask directions.

Reply to  Tom Johnson
May 23, 2021 3:36 pm

You will know who is in the ‘self-driving’ car. It’ll be the person behind the mail truck, stopping at every mailbox. It won’t know the difference between that and stop-and-go traffic.

John Dilks
Reply to  jtom
May 23, 2021 8:08 pm

ROFLMAO.
Good one and true.

Reply to  Tom Johnson
May 23, 2021 5:13 pm

“Check engine light?” In a Tesla?

John Dilks
Reply to  jorgekafkazar
May 23, 2021 8:10 pm

OK. OK. “Check motor” light.

Paul Penrose
Reply to  Tom Johnson
May 24, 2021 10:31 am

What about when sensors fail? All these computers are connected via some sort of network, so what happens when the network fails? How about when one of the main processors fails? If you are going 70 down the highway and the driver is asleep or not even at the controls, and one of these failures occurs, what will happen to the car and its occupants?

Generally in safety-critical systems we have complete redundancy for all these components, but that quickly adds to the total cost and complexity. And the more complex a system gets, the more testing it requires. This isn’t rocket science, people; it’s even more difficult than that.
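The redundancy Paul Penrose refers to is often done as triple modular redundancy (TMR): three independent readings are taken and a vote masks a single faulty component. A minimal sketch in Python, illustrative only — real automotive safety architectures are far more involved:

```python
# Minimal sketch of triple modular redundancy (TMR): read three
# independent sensors and take the median, so one arbitrarily wrong
# reading cannot determine the outcome. Illustrative only.

from statistics import median

def tmr_speed(sensor_a: float, sensor_b: float, sensor_c: float) -> float:
    """Vote among three speed readings; the median masks one bad sensor."""
    return median([sensor_a, sensor_b, sensor_c])

# Two sensors agree at ~70 mph; the third has failed and reads 0.
assert tmr_speed(70.1, 69.9, 0.0) == 69.9
```

Note how this connects to the cost point in the comment above: even this one voter triples the sensor count, and the voter itself now becomes a component whose failure must be handled.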

Reply to  Rory Forbes
May 23, 2021 2:46 pm

When these first came out, I recalled the scene at the beginning of Robert Heinlein’s “Stranger In a Strange Land” where Ben Caxton (a journalist who is a thorn in the side of the world government) gets into an autotaxi – and disappears.
Being a person who has no love for big government myself, I would be more worried about having no way of being sure what my destination would be.

Rory Forbes
Reply to  writing observer
May 23, 2021 3:15 pm

I would be more worried about having no way of being sure what my destination would be.

There’s the thing. I’m not sure I’d want to put my entire existence in the “hands” of any machine … regardless who vouched for its safety.

Reply to  Rory Forbes
May 24, 2021 6:42 pm

My computer language teachers always emphasized the value of explicit coding over implicit coding.

Programs operating under “what if” conditions are following implicit command structures: implicit commands forced through single-threaded decision trees by which the car decides on basic commands.
Confusion reigns when environmental conditions are not relevant to the current code structures, data sources, inputs, or the decision trees themselves.
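The explicit-vs-implicit distinction in the comment above can be shown with a toy example (hypothetical code, not from any real autopilot): the implicit version silently treats every unanticipated input as “go”, while the explicit version enumerates the known states and refuses to guess.

```python
# Toy contrast of implicit vs. explicit coding. Hypothetical example
# invented for illustration; not from any real driving system.

def implicit_action(signal: str) -> str:
    # Implicit: anything that isn't "red" is treated as go,
    # including states the author never anticipated.
    return "stop" if signal == "red" else "go"

def explicit_action(signal: str) -> str:
    # Explicit: every known state is enumerated; the unknown is an error.
    actions = {"red": "stop", "yellow": "slow", "green": "go"}
    if signal not in actions:
        raise ValueError(f"unhandled signal state: {signal!r}")
    return actions[signal]

assert implicit_action("flashing") == "go"   # silently wrong for an unknown state
assert explicit_action("green") == "go"
```

The explicit version fails loudly on an unforeseen condition instead of plowing ahead, which is the behavior the commenter’s teachers were arguing for.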

Reply to  Mike
May 23, 2021 12:13 am

In 1910, many a man said the exact same thing about those loud contraptions with engines while they rode their horse-drawn buggy in to town.

Warren
Reply to  Joel O'Bryan
May 23, 2021 1:12 am

The advancement delusion, Joel. All advancement is finite; eventually one comes to the limit of natural laws. How far has the engine in your car advanced? The basic design hasn’t changed since day one, despite tens of millions invested in alternative configurations. Autonomous self-driving will never happen in any city of ‘relative free will’.

TonyL
Reply to  Warren
May 23, 2021 2:10 am

The Monorail in Las Vegas.
Totally automated, including the conductor’s voice.
“Ladies and gentlemen, the doors are about to close, please stand clear.”
“Children are half price, adults who act like children pay double.”

Much easier than a car, of course, but safe and useful enough. Perhaps it points the way.

On the other hand:
A car that reports your every movement to Big Brother is a whole different story.

Rich Davis
Reply to  TonyL
May 23, 2021 6:32 am

Yes indeed. The “autonomous” in “autonomous vehicle” is a misnomer if it depends on a server out in the cloud and we can’t opt out of sharing telemetry. Or worse, if someone can remotely shut down or take control of our car.

It’s understandable that engineers developing these systems crave data. But politicians can have less benign intentions for using our data.

John Adams
Reply to  TonyL
May 23, 2021 9:52 am

I’m old enough to remember when elevators had operators.

IAMPCBOB
Reply to  TonyL
May 23, 2021 12:45 pm

The original idea behind ‘autonomous’ automobiles (per science fiction from the 1950’s) was that the entire HIGHWAY, and ALL of the vehicles, would be controlled by one huge computer network. You would get in, tell the car where you wanted to go and it would deliver you there. There would be NO accidents, since the computer controlled everything, including all the OTHER cars! When only one car is being controlled, the rest of them are, what, OUT of control? One or two cars, or even a few thousand cars, being controlled individually, is NOT going to work! It’s just another one of those pie-in-the-sky dreams, like solar energy and Wind Power.

Streetcred
Reply to  TonyL
May 23, 2021 8:46 pm

The Vegas monorail operates on a predetermined track to a specific timetable with known constraints … unlike a car amongst many in a chaotic scramble.

Rhee
Reply to  Streetcred
May 24, 2021 3:36 pm

Chaos is the variable that cannot be managed by software, AI or otherwise, and it is one prime reason fully autonomous self-driving is pie in the sky. Another reason is entropy: the mechanical failure of sensors cannot be precluded; there will always be a failure at some point in time. Chaos and random failure are use cases that cannot be coded away.

John Endicott
Reply to  TonyL
May 25, 2021 6:07 am

You do realize that programming for movement in a single direction along a single, fixed and unaltering path (the clue is in the name: the prefix ‘mono’ means one, single, only), which is fairly well controlled and regulated (i.e., there is no other traffic going every which way to consider), is a whole different ball game from the chaotic landscape of automobiles, where the paths are virtually infinite and the number of other objects to account for, and the directions they’re going, are virtually uncountable.

Rich Davis
Reply to  Warren
May 23, 2021 6:22 am

If man were meant to fly, God would have given us wings.

It seems obvious to me that autonomous vehicles can be “perfected” to levels of risk similar to or lower than those we readily accept when handing the keys to a 16-year-old.

The question is whether there is a plan for dealing with that risk. When your teenager damages someone’s property driving your car, it’s clear that your insurance is going to get more expensive. If your kid is a slow learner, the insurance company may drop you as a customer.

If society reaches a consensus that autonomous cars are a risk worth taking (benefits expected to exceed damages), then those questions will be answered. If not, some trial lawyers will feast on the corpses of the companies making autonomous vehicles.

I’ve heard it claimed that when automobiles first came out, some jurisdictions required that in order to travel through town, the driver had to hire someone to run ahead of the car warning passersby. The great-great-great grandchildren of those town elders will probably also pass ordinances that hinder autonomous vehicles for a time. (Especially if they own a taxi business).

I’m not ready to trust my life to an autonomous vehicle just yet, but having pizza delivered doesn’t scare me.

Streetcred
Reply to  Rich Davis
May 23, 2021 8:48 pm

… so long as you’re standing in the street waiting for it 😉

Reply to  Warren
May 23, 2021 8:18 am

Warren posted: “. . . The basic design [of the ICE —GD] hasn’t changed since day-1 despite . . .”

Hmmm,
— electronic fuel injection
— variable ignition timing
— use of overhead cam for combustion chamber valve actuation
— supercharging and turbocharging
— automatic emission controls (timing and mixture ratio controls based on sensor feedbacks)
— using diesel, propane, natural gas, and hydrogen as substitutes for gasoline

Yeah, NOTHING in the “basic design” has changed. 😉

Rich Davis
Reply to  Gordon A. Dressler
May 23, 2021 11:12 am

A lot of things were changed in the pursuit of better efficiency and/or performance and/or pollution control, but the basic idea of compressing a flammable vapor to get mechanical energy out of the enclosed explosion hasn’t changed. Not sure how that is particularly relevant though.

Maybe his point is that there isn’t going to be a major increase in efficiency going forward that comes from reducing waste heat.

In my lifetime we’ve probably seen a 4-fold increase in fuel efficiency. There isn’t another comparable efficiency gain left to achieve.

About the only way to gain efficiency in fuel per passenger-distance driven is to enforce carpooling or mass transit. For example, if everyone travelled in autonomous taxis, there could be efficiency gains and somewhat less need for vehicles in total.

All we have to do is give up any privacy and end our love affair with driving. No thanks.

D. J. Hawkins
Reply to  Joel O'Bryan
May 23, 2021 3:51 pm

It’s still the case that horse-drawn buggies rarely go off into a ditch or collide with oncoming buggies on their own.

Reply to  Joel O'Bryan
May 23, 2021 7:06 pm

When self-driving cars can see that a truckload of chicken guts has spilled all over the freeway a mile ahead, just as a whole bunch of speeding motorcycles fly past you, and an angry, late truck driver behind you, all jazzed up on speed, is trying to push you to speed up or go onto the gravel shoulder, just as it starts pouring with rain and a spider falls on the lap of the driver beside you, AND take safe corrective action, then I might consider getting in. In the meantime, you go first….

George Tetley
Reply to  Mike
May 23, 2021 2:49 am

My day is full of UNFORESEENS. Which dimension is E.M. living in?

Richard Page
Reply to  Mike
May 24, 2021 6:33 am

What is needed is a new, up-to-date version of Ralph Nader’s “Unsafe at Any Speed”. Not joking on this; it’s probably an approach that is needed every few generations.

May 22, 2021 10:39 pm

It’s good money if you can pull it off. Charge US$?k+ (C$10.6k right now) for a software update that’s coming in the future. Real soon. Promise!

Imagine if Microsoft wanted to charge you $250 for a new version of Windows for a PC. They tell you it’s going to come out ‘soon’ [they’ve been saying so for 5 years], but you have the great chance to pay for it in full up front because the price is only going to go up! On top of that, you get to try out a few apps from the next Windows right now! What a deal!

Easy way to increase your margins.

And they’ve been selling FSD for 5+ years, expecting people to believe that older hardware/processors/cameras will be just as capable as their newest versions once they install a software update.

TonyL
May 22, 2021 11:32 pm

I have reviewed some of the many videos of tests of the self-driving features in real-world conditions. It is clear that the Teslas so tested are fully self-driving. That is to say, they exhibit all the driving characteristics we customarily associate with human drivers.

We can make a checklist of human driver characteristics and objectively determine whether the Tesla under test exhibits that characteristic.

*Speeding: check
*Distracted Driving: check
*Negligence: check
*Reckless Driving: check
*Blocking Traffic: check
*Drunk Driving: check (computer simulated)

It sure seems like Tesla has got this self driving thing surrounded.

Reply to  TonyL
May 23, 2021 2:14 am

Who pays the fine? There is a good test of the self-driving system warranty. If Tesla are willing to pay all fines and handsome injury and damage bills then that is putting money behind their claim.

I have watched Hyundai gradually upping their warranty over the years to suck in the unwary and most survive the warranty period but there are not many 20 year old Hyundais still on the roads in Australia. I am aware of a few that have not made it through warranty though before major repairs and Hyundai have reluctantly honoured the warranty. Some very old Mazdas, Hondas and Toyotas though. All brands probably have their Monday builds but some seem to have more than others.

Michael S. Kelly
Reply to  RickWill
May 23, 2021 10:23 am

The reliability of cars today is partly due to progress in materials science, partly to computerized engine control and vehicle diagnostics, but mainly to the high degree of automation that has been achieved in assembly. That takes the “Monday build” aspect out of the equation.

In the mid-1970s, I worked summers in the Chrysler car and truck assembly plants in Fenton, Missouri. It was usually either in material handling or custodial work, but one summer I did work on the car assembly line. I couldn’t fit a description of all of my screw ups in any comment section.

A good alternative, though, is describing what greeted every employee as they came in the employee entrance each day. Opposite the entrance was a 20 foot tall graph called “The Cherry Chart.” The horizontal axis was time, and the vertical was the percentage of cars coming off the line that did not have to go to the repair pool.

The “goal” on the chart was a dotted line across the entire graph. It was set at 66%. Yep, Chrysler would have been thrilled if as few as 2/3 of the cars produced didn’t need some kind of repair or rework. The actual percentage was plotted on a daily basis. It never got above 13%.

Today, the average amount of touch labor involved in assembling an automobile, from parts, components and subsystems to the final product, is 13 hours. Everything else is put together by robots. Back in the 1970s, just the instrument panel wiring required almost 13 hours of touch labor. Things have definitely improved.

Streetcred
Reply to  Michael S. Kelly
May 23, 2021 8:57 pm

You mean the ‘unreliability’ of modern cars ? The electronic components are woeful, not any better than PC perpetual motherboard failures of the 90’s.

For instance, my prestige European sports car … electronic water pump failed, engine fried in 400m – repair $10k; electronic transmission failed – repair $6k, electronic dynamic traction control failed – $3k, etc.

Ha!

Michael S. Kelly
Reply to  Streetcred
May 24, 2021 4:33 am

Sounds like your prestige European sports car is a Jaguar, complete with Lucas electrics. I’m referring to ordinary cars with actual electronics.

Richard Page
Reply to  TonyL
May 23, 2021 1:51 pm

I would venture the opinion that Tesla have only avoided huge numbers of lawsuits up to this point by victims of the ‘self driving cars’ because there have been so few survivors.
Remember – Thelma and Louise would still be with us today if they hadn’t been test driving a Tesla!!

May 23, 2021 12:11 am

Putting cabbies out of a job. Tech is just doing to the driver what technology did to the boiler stokers shoveling coal into the boiler furnace. You can be a Luddite or you can embrace technology that ultimately improves everyone and everything.

This is not the same thing though as going backwards on energy density (wind and solar) on the energy that drives our ever increasingly complex technical society.

commieBob
Reply to  Joel O'Bryan
May 23, 2021 1:22 am

You’re assuming that autonomous vehicles will eventually become as safe as human-driven vehicles.

There is not yet a true Level 5 self-driving car; we don’t yet even know whether that will be possible to achieve, or how long it will take to get there.

link

So, there’s that.

Thus far, technology has brought us productivity gains that have increased the well-being of nearly everyone in the world. What would happen though if advanced AI rendered all human beings obsolete?

In The Innovator’s Dilemma, Clayton Christensen points out that disruptive technology has a tendency to push companies up market until there is no market left, at which point they go bankrupt.

The nightmare is that technology will push human workers into a more remunerative but smaller market until there is no market left for human workers.

We currently have the problem that, as the technical demands of all jobs increase, people with an IQ of less than 90 may become unemployable. As an example, it is possible that someone who can’t properly read and understand a WHMIS form would be unable to get a job pushing a broom. IIRC, such people are currently about 15% of the population.

Just because something has worked great in the past, there’s no guarantee that it will continue to do so. We have to consider that possibility. On the other hand, we don’t want to be stampeded into doing something stupid because we think there’s a possibility that something bad will happen. Does that kind of thing happen? Well there’s CAGW …

Reply to  commieBob
May 23, 2021 2:22 am

We currently have the problem that, as the technical demands of all jobs increase, people with an IQ of less than 90 may become unemployable. 

I thought all these people were tasked with making climate models and then paid to support the projected claims.

Reply to  RickWill
May 23, 2021 6:12 pm

Or they’re elected to high government offices.

John Endicott
Reply to  jorgekafkazar
May 25, 2021 6:13 am

Those are just the ones with IQs lower than their shoe sizes.

Reply to  Joel O'Bryan
May 23, 2021 2:19 am

Tech is just doing to the driver what technology did to the boiler stokers shoveling coal into the boiler furnace. 

Covid has already proven the validity of this statement, though more through a reduced need for cabs than through self-driving cabs.

Think of all the cab fares lost through reduced air travel. Zoom has replaced a lot of the need for travel.

Here in Australia, taxi queues at airports could be observed from space. They are no more – at least for a while yet.

Streetcred
Reply to  RickWill
May 23, 2021 9:13 pm

Covid has not done what that analogy describes, and it will not, as it is a temporary phenomenon. All it has achieved is to set back business a few years through failures in demand and production.

Zoom (and the rest) is rubbish for anything but the most highly structured of meetings … design management is almost impossible. One thing that has happened is that all of those unnecessary meetings that could have been resolved between two people on a phone call are now being resolved between two people over the phone.

The “new normal” will be just like the ‘old normal’ in due course … employers are demanding that employees return to the workplace … you still need to eat and drink … you still need to meet objectives … nothing has changed.

May 23, 2021 12:48 am

It is impossible for a computer to handle this, and it staggers me that any government has even allowed such cars on the road.

People make mistakes, but they also make new decisions. Computers can’t make new decisions. They can’t think for themselves. They can only react in a preprogrammed way to situations the software engineer thought of.

These situations are called ‘use cases’. In reality there are an infinite number of them, because they evolve and change continuously and the events that lead to them are random.

Person x leaves a shop to cross a road, a cyclist is coming at an angle y, a truck is parked up on the left, a broken-down car is on the right. An ever-changing, infinite number of variants of just this one scene exist.
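To put a rough number on that combinatorial point, here is a toy sketch. Every factor name and state count below is made up purely for illustration (no real driving dataset is involved), and the real scenario space is continuous, so any finite count badly understates it:

```python
# Purely illustrative discretisation of one street scene; the factor
# names and state counts are invented for this sketch, not measured.
factors = {
    "pedestrian_position": 20,  # coarse spots along the kerb
    "pedestrian_speed": 5,      # standing .. running
    "cyclist_angle": 36,        # 10-degree increments
    "cyclist_speed": 5,
    "parked_truck": 2,          # present / absent
    "broken_down_car": 2,
    "weather": 4,               # dry, rain, fog, snow
    "light": 3,                 # day, dusk, night
}

# Number of distinct combinations under this (very coarse) grid.
n_scenarios = 1
for states in factors.values():
    n_scenarios *= states

print(f"{n_scenarios:,} coarse scenarios")  # → 864,000 coarse scenarios
```

Even this crude eight-factor grid yields 864,000 distinct cases, and each factor is really a continuum, so the true space the software must cover is effectively unbounded.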

And we saw recently what the addition of bollards (traffic cones) to a road did: it caused the car to just stop.

But stopping like this is dangerous; you risk being rear-ended. So even that is not a catch-all for unknown situations.

They must be banned because they will take lives. And computers are NOT gods. They are NOT clever.

Izaak Walton
Reply to  Matthew Sykes
May 23, 2021 1:16 am

Matthew,
What exactly do you think is the difference between a brain and a computer? Both are physical machines designed to perform computational tasks. There is nothing in the laws of physics that says a computer can’t emulate a brain to an arbitrary degree of precision.

And if you intend to ban things because they take lives, why allow cars at all? Or let humans drive faster than 10 km/h? Banning SUVs would also save the lives of pedestrians, as would putting an upper limit on the weight of a car.

Hokey Schtick
Reply to  Izaak Walton
May 23, 2021 1:29 am

Dude here can’t tell the difference between a human brain and a computer. Snigger.

Bill Toland
Reply to  Izaak Walton
May 23, 2021 3:10 am

Izaak, have you ever written a computer program? The computer can only do what you have programmed it to do. I have been writing computer models for 48 years and I would never get into a self driving car.

Jay Hendon
Reply to  Bill Toland
May 23, 2021 7:38 am

Mr Toland, if you would never get into a self driving car, have you flown on a commercial jetliner recently?

Bill Toland
Reply to  Jay Hendon
May 23, 2021 7:40 am

No.

noaaprogrammer
Reply to  Bill Toland
May 23, 2021 8:12 am

I’m with you, Bill. If I die in transport going somewhere, I want to be the driver.

MarkG
Reply to  Jay Hendon
May 23, 2021 8:01 pm

No. And it’s worth noting that, even with decades and decades of work, we’ve been unable to create a fully self-flying aircraft reliable enough to put passengers on board.

Is there even a fully self-driving train outside of tightly-controlled urban areas?

Streetcred
Reply to  Jay Hendon
May 23, 2021 9:18 pm

I have. How many aircraft are up there at any given time, flying predetermined routes in three dimensions? Cars on the road? Even at that level of autopilot sophistication they still have a full complement of aircrew … yet to see the airlines putting the janitor in charge of the ’bus.
Not to mention that air traffic control on the ground is still very busy. 😉

Reply to  Jay Hendon
May 23, 2021 11:09 pm

They have pilots.

A car has cruise control; that’s basic automation. The driver can override it in an instant.

Same with autopilot. Or controlled descent.

But recognise this, air space is VERY clean. When was the last time a kid with an ice cream ran out in front of a jet liner?

Cars live in an incredibly ‘dirty’ world in comparison, my example was clearly explaining that.

So planes are very different.

Reply to  Matthew Sykes
May 24, 2021 10:51 am

“The driver can override that in an instant. Same with autopilot.”

And we saw what a disaster that could be when the pilot COULDN’T override with the 737 mess.

dk_
Reply to  Bill Toland
May 23, 2021 7:57 am

Bill T,
It may well be that you are arguing with a computer program, or perhaps a software enhanced human. Watching for a while now, I haven’t been able to decide with that one.

Simon
Reply to  Bill Toland
May 23, 2021 12:18 pm

“I have been writing computer models for 48 years and I would never get into a self driving car.”
But I bet you would fly in a plane with autopilot? The difference is only that one is more developed (at this stage) than the other. And planes crash on AP, so if a 100% success rate is your criterion, then it’s back to walking for you. Oh wait, humans have heart attacks, and mental breakdowns…

Bill Toland
Reply to  Simon
May 23, 2021 12:48 pm

Simon, the difference is that planes still have pilots. As far as I know, there are no plans to remove the pilots. Would you get on a plane with no pilot? Would anyone?

Bill Toland
Reply to  Bill Toland
May 23, 2021 1:06 pm

I now see that there are plans for self flying planes. I’m sure that the public will be lining up in their droves to fly in them. Not.

Reply to  Bill Toland
May 24, 2021 10:52 am

“I now see that there are plans for self flying planes”

So we learned nothing from the 737 mess…

John Dilks
Reply to  Simon
May 23, 2021 8:28 pm

Air traffic is much easier to handle than automobile traffic.

Reply to  Bill Toland
May 23, 2021 11:06 pm

Yep, SW engineer myself, as you can probably tell. I wouldn’t go near one in a million years.

Bill Toland
Reply to  Izaak Walton
May 23, 2021 3:20 am

Izaak, would you be willing to board a plane which has no pilots?

Warren
Reply to  Bill Toland
May 23, 2021 4:09 am

Izaak your next operating theatre procedure will be fully automated. Please check-in online and thanks for choosing Autocryptocropnchop Inc.

Bill Toland
Reply to  Warren
May 23, 2021 5:22 am

In a fully automated surgery, you will be given the choice from the computer menu which procedure you require. Just imagine your surprise when you wake up if you want a vasectomy and your finger slips and you accidentally choose the sex change option.

Mr.
Reply to  Izaak Walton
May 23, 2021 6:35 pm

Izaak, the top line Tesla weighs in at about 2.3 tonnes.

That’s a 100 Series Landcruiser.

Also, our brains use rationality on top of instincts & intuition.

Cray supercomputers can’t emulate that combo.

And never will.

Reply to  Izaak Walton
May 23, 2021 11:06 pm

IMAGINATION!!!!!!

Dear oh dear, do I have to spell it out for you? ORIGINAL THOUGHT!

Rusty
May 23, 2021 4:32 am

Calling it ‘Autopilot’ is the opposite of designing to take account of idiots. Just the other day I read of one man who was caught literally asleep at the wheel by police in the US.

He’d been stopped on 3 other occasions but they couldn’t see if he was asleep.

There have been numerous cases where the system doesn’t do what’s advertised. If it were only the brain-dead who were in danger of relying on the system, it wouldn’t be so bad, but they endanger every other road user.

The system should be banned.

Scissor
Reply to  Rusty
May 23, 2021 5:07 am

Safe and effective.

Arthur J
Reply to  Rusty
May 23, 2021 6:47 am

Light aircraft makers Piper and Cessna nearly went broke in the ’70s and ’80s because they were sued over crashes that were essentially pilot error. When you take responsibility for safe operating off the driver/pilot, you’re on a slippery slope. Sooner or later a very wealthy person is going to sue Tesla for billions for the loss of a loved family member in an autonomous driving accident, and they’ll win.

May 23, 2021 5:52 am

Why does anyone want a self driving car?

PaulH
May 23, 2021 6:00 am

“Caveat emptor” meets “a fool and his money are soon parted”. Hey, if you think it’s a good idea to spend $10,000 for vaporware, I may have a nice assortment of slightly used bridges for your consideration. 😉

H.R.
Reply to  PaulH
May 23, 2021 10:55 am

Is the Brooklyn Bridge taken? I kinda fancy that one.
😜

dk_
Reply to  PaulH
May 23, 2021 8:36 pm

PaulH, not contradicting your point, and no scholar myself, but I thought it was “let the buyer beware.” In this case, status buyers are having to justify their own lack of wariness/gullibility. The vaporware comment is right on, and with vaporware it’s even harder to demonstrate failed delivery.

H. D. Hoese
May 23, 2021 7:04 am

I learned to drive with a standard shift. My wife’s 2013 car gives so much visual and audible information that it is distracting. What does that sound or display mean? When we bought it used, we encountered someone being shown a new Ford Expedition. I asked if it needed a road [flight] engineer. Add to that cell phones going off, and drivers sometimes not planning their destination precisely, requiring them to follow the map downloaded from the sky. Twice this month I have ridden with drivers doing this, when I could have saved time and gas. Last-minute planning! I am reading a book where they similarly got lost in the mountains. Had to get a map.

Assume that this could all be cured automatically. I am driving on the interstate at too fast a speed, still legal; far in the distance I see a deer, almost hidden, heading rapidly for the road. I cautiously slow down; it may come across, it may not. Who decides who goes first at a four-way stop when cars get there at exactly the same time? Non-verbal communication works there. Do they have that? Are the other myriad possibilities considered?

If they design it without great consideration of driver experience, it will be dangerous. Such computer modeling has been getting a deserved bad reputation.

Coach Springer
May 23, 2021 7:30 am

A techie’s arrogance.

May 23, 2021 7:48 am

My relatively late-model car has adaptive cruise control, as well as various lane-assist and auto-braking features. These features all work great in terms of enhanced safety and convenience, but I would never consider giving up control of the vehicle, even under ideal conditions. The main reason is that while the system knows exactly what the car ahead of me is doing, it has no clue why, or what’s literally going on down the pike from that car. This frequently means that when the car ahead accelerates or changes lanes, my car will try to accelerate up to its set speed even when traffic ahead is stopped or stopping – something an attentive driver would clearly avoid doing, if only to save gas and/or brake-pad wear. Maybe another 10 grand to Tesla gets you better instrumentation, but there’s no way that instrumentation can see through a leading vehicle.
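That “resume to set speed” behaviour falls out naturally from how a simple gap-based controller works. Below is my own toy sketch of such logic – hypothetical, and not any manufacturer’s actual algorithm: the controller sees only the distance to the vehicle directly ahead, so the instant that vehicle changes lanes the gap jumps open and the controller commands acceleration, no matter what is stopped farther down the road.

```python
def acc_command(own_speed, set_speed, gap_m, min_gap_m=30.0):
    """Toy adaptive-cruise logic (illustrative only).

    Sees ONLY the gap to the vehicle directly ahead -- it has no
    knowledge of traffic beyond that vehicle.
    """
    if gap_m < min_gap_m:
        return "brake"        # too close: slow down
    if own_speed < set_speed:
        return "accelerate"   # gap is open: resume set speed
    return "hold"

# Slow car ahead, gap shrinking: the controller brakes.
print(acc_command(own_speed=90, set_speed=110, gap_m=20))   # brake

# The slow car changes lanes; the next obstacle is a stopped queue
# 200 m away. The controller sees only an open gap and accelerates
# toward the 110 km/h set speed -- the behaviour described above.
print(acc_command(own_speed=90, set_speed=110, gap_m=200))  # accelerate
```

The sketch makes the limitation obvious: nothing in the controller’s inputs can represent “the queue beyond the lead car”, so it cannot anticipate it.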

ScarletMacaw
May 23, 2021 7:52 am

They only said “full self-driving.” They didn’t claim it would be adequate. 🙂

May 23, 2021 8:01 am

What???? . . . You mean to tell me “bait and switch” sales tactics is against the law?

Who knew?

May 23, 2021 10:14 am

Humans have evolved brains that can infer what might happen next in a situation with various factors and without total information. Computers have not evolved for this. Computers are not predators who must make life-and-death decisions on what action to take in uncertain circumstances. Even just coming to a stop can be dangerous, not only to you but to the 10th car back in your line. Cognition is what lets us make these decisions. I have yet to see a computer with true cognition. They only respond to what they are programmed to do; they don’t think outside the box.

Simon
May 23, 2021 12:10 pm

Another day another anti-Tesla article from the man who admits he has never been in one….

Mr.
Reply to  Simon
May 23, 2021 6:52 pm

Simon, there’s nothing revolutionary about battery-powered vehicles.
The first cars (“town cars”) were the genesis of Teslas.
And as “town cars”, Teslas do the job.
Expensively though.

Carlo, Monte
Reply to  Simon
May 24, 2021 7:31 am

When you are cruising down the road in your shiny T, with the other poor schlubs getting by on gasoline, do you feel better than they are? Superior?

John Endicott
Reply to  Simon
May 25, 2021 6:27 am

Once again, Simon can’t handle the message so he uses a fallacious argument to avoid listening or responding to it. Do you need to ingest poison to be able to discuss the downsides of ingesting poison? No. Same here. You don’t need to have ever been in a Tesla to discuss the pros and cons of a Tesla.

Jim
May 23, 2021 1:35 pm

Tesla annual net profit: ~$721M

Tesla revenue due to government forcing other car companies to give it money because of global warming: ~$1.5B

Tesla not a legitimate business.

Simon
Reply to  Jim
May 23, 2021 3:28 pm

So every company that receives a legal break from the government through tax or subsidy is not a “legitimate business.” Wow.

MarkG
Reply to  Simon
May 23, 2021 8:08 pm

A company which relies on taxing poor people to reduce the cost of its products so the rich people who buy them can afford to do so is not a legitimate business.

From what I’ve read, Tesla sales in Hong Kong dropped to near zero when the government removed subsidies. Clearly the customers don’t consider them worth their true cost.

John Dilks
Reply to  Simon
May 23, 2021 8:40 pm

That is not what was said, and you know it.

David A
Reply to  Simon
May 24, 2021 1:41 pm

Simon, when you write a strawman you demonstrate your inability to comprehend. Jim made no such assertion.

Wow!

John Endicott
Reply to  David A
May 25, 2021 6:30 am

Strawmen are all Simon can handle. And even they outweigh him intellectually.

Peter Morris
May 23, 2021 4:06 pm

Ten thousand? We could almost buy our own ship for that!

Yeah but who’s going to fly it kid? You?

You bet I could! I’m not such a bad pilot myself. We don’t have to sit here and listen to this!

John
May 23, 2021 4:53 pm

Tesla / Spacex etc are just the delusions of a delusionist

He thinks he is a god and acts like an idiot

May 23, 2021 6:46 pm

Ah, found the passage…

Caxton spoke with half a dozen underlings and became more aggressive with each one. He was so busy that he did not notice when his cab ceased to hover.

When he did notice, it was too late; the cab refused to obey orders. Caxton realized bitterly that he had let himself be trapped by a means no hoodlum would fall for; his call had been traced, his cab identified, its robot pilot placed under orders of an over-riding police frequency—and the cab was being used to fetch him in, privately and with no fuss.

He tried to call his lawyer.

He was still trying when the taxicab landed inside a courtyard and his signal was cut off by its walls. He tried to leave the cab, found that the door would not open—and was hardly surprised to discover that he was fast losing consciousness—

Robert A. Heinlein, Stranger In a Strange Land

Sixty years ago this year.

Do any of the other inconvenient “climate deniers” here feel lucky? Do you?

MarkG
Reply to  writing observer
May 23, 2021 8:09 pm

The whole point of fully self-driving cars is to control where people can travel to. Once you remove the ability for people to control where the car goes, they’re prisoners.

May 23, 2021 6:57 pm

Speaking of Tesla – don’t get a flat tire. They come with no spare, and don’t have ‘run-flat’ capability.

(Read about that at Quora last night, and just ran a search – confirmed. After-market tires, wheels and jacks are available)

MarkG
Reply to  Tombstone Gabby
May 23, 2021 8:10 pm

On the other hand, the tires must be pretty robust because they typically seem to survive even when the car spontaneously combusts and burns itself to the ground.

Why don’t they make the rest of the car out of whatever materials they use for the wheels?

Reply to  MarkG
May 23, 2021 8:38 pm

G’day Mark,

I believe that tire makers have switched to using silica rather than carbon black as a ‘filler’ in tires. That might account for their not burning easily. (Picked that up from a paper put out by the “Continental” tire company.)

John Endicott
Reply to  Tombstone Gabby
May 25, 2021 6:32 am

Tesla isn’t the only automaker that doesn’t automatically provide a spare. Other manufacturers have been doing the same with some of their “cheaper” models for a decade or more.

Reply to  John Endicott
May 25, 2021 12:08 pm

G’day John,

Thanks for the ‘heads-up’. Something else to ‘worry about’ if we start looking at new vehicles.

I was 14 before my father bought his “first car”, an English Ford Prefect. (Brisbane, Australia, excellent public transport in the mid-1950’s.) Wasn’t raised with cars, so I don’t ‘follow’ what is happening in the automotive world. Currently: an ’01 Cherokee and an ’03 F-350, both 4 wheel drive. Some of the places we visit – off the main roads – being without a spare would be just asking for trouble.

Steve Z
May 24, 2021 10:31 am

I would never get into a self-driving car as a passenger, and wouldn’t want to be driving (or a pedestrian) in front of one.

There are far too many situations (many of them mentioned by other commenters below) that a computer, no matter how sophisticated it is, cannot correctly interpret. One simple example would be a stop sign hidden by leaves on overhanging tree branches. A driver who had been through the intersection before knows the stop sign is there, and knows to stop and look both ways. Even a driver unfamiliar with the intersection would probably slow down and check, just in case.

But a computer, which would get its information from a GPS, might not “know” there is a stop sign there, and run right through it.

There are some “driver assist” features that can be useful, such as lights on a side-view mirror that turn on if there is another vehicle close behind in the neighboring lane, which can warn a driver against changing lanes. But, if one is stuck in traffic, and the vehicle in the next lane is stopped or moving very slowly, the driver can still make the lane change.

Some commenters have made comparisons with self-driving trains, particularly in airports. But there is a major difference–the automated trains are always stopping in the same places, and are riding on rails free of cross-traffic or obstructions. But a self-driving car on public roads will always encounter hazards (other vehicles, pedestrians, possibly animals) as well as natural hazards (bad weather or other obstructions) that it may not be programmed to deal with, and is inherently much more dangerous than an adult human being behind the wheel.

S.K.
May 24, 2021 10:56 am

Michael Burry, a successful fund manager, has shorted Tesla hugely.
This may be one of the reasons for that decision.

May 24, 2021 12:23 pm

And now Tesla is hit with this:

https://www.foxbusiness.com/lifestyle/tesla-pay-16000-cutting-battery-charging

Tesla downloaded a software update to all the 2013-2015 Model S vehicles sold that had the 85 kWh battery pack. The update reduced the capacity and charging rate in order to extend the battery’s longevity. It also reduced the range as much as 11%.

They just lost a class action lawsuit in Norway, and there are similar lawsuits in other countries.

Longer battery lifetime, faster charging, longer range: it’s almost as if the product of those three equals a constant. Improve one, and the other two suffer.
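As a back-of-the-envelope way to picture that constant-product intuition, here is a toy model. It is a deliberate oversimplification of real battery chemistry with made-up numbers: it simply assumes lifetime × charge rate × range is conserved and that a change in charge rate is absorbed equally by the other two factors.

```python
def rebalance(lifetime_yr, charge_kw, range_km, new_charge_kw):
    """Toy constant-product battery trade-off (illustrative only).

    Assumes lifetime * charge rate * range is conserved, with the
    adjustment split evenly between lifetime and range.
    """
    k = lifetime_yr * charge_kw * range_km          # the "constant"
    scale = (charge_kw / new_charge_kw) ** 0.5      # even split
    new_lifetime = lifetime_yr * scale
    new_range = range_km * scale
    # Sanity check: the product is (numerically) unchanged.
    assert abs(new_lifetime * new_charge_kw * new_range - k) < 1e-6
    return new_lifetime, new_range

# Under this toy model, doubling the charge rate costs ~29% of both
# lifetime and range.
life, rng = rebalance(lifetime_yr=10, charge_kw=100, range_km=400,
                      new_charge_kw=200)
print(f"lifetime {life:.1f} yr, range {rng:.0f} km")
```

On that (entirely assumed) model, Tesla’s update works the other way around: throttling the charge rate and capacity buys back longevity, at the cost of the range the owners paid for.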

Shawn Marshall
May 26, 2021 4:32 am

I am a very conservative person, but I have high hopes for automated driving. We have ‘EyeSight’ adaptive cruise in our Subaru and it works great 99% of the time; snow, rain and lowboys are issues. The radar in the rear to detect crossing vehicles in the parking lot is very beneficial too – very sensitive, and it protects you from those jerks nailing it in parking areas. The lane-assist feature is helpful, as are the lane-change warning lights on the rear-view mirrors. So Tesla does a disservice by overselling their software. Many deaths will be prevented as software-assisted driving improves. Driver override is still paramount.

Assisted driving will help many elderly stay out of nursing homes – we’ll tell the car where to take us and simply jam on the brakes if a need arises. Bars and restaurants will benefit from sober computer driving.

But perhaps we are missing two real alternatives. Maybe a lot of local driving can be eliminated by electric drone taxis that are self-driving. And maybe our nation could offload the highway system by using automated trucks on the railroads that could exit at any at-grade crossing and park themselves to await a local driver, or self-drive at low speed (preferably at night) to the delivery point. For that application we could divert highway funds to new rail construction. Highway costs would go way down if only cars were permitted. On I-81 in Virginia on a rainy day with moderate to heavy traffic you can pretty well bet on a wreck – many times involving trucks and autos, because the trucks go too fast and travel in the left lanes, which gets the lane-changing speed demons in a frenzy – pretty soon everything stops.