Tesla’s FSD Beta’s Driving Modes Bring Up Interesting Ethical Issues We Should Talk About

Ever rolled through a stop sign? Of course you have. I think I did it this morning, in fact. If you’re a driver who hasn’t, then I hope being a liar is working out for you, because I bet you have. As far as illegal things go, rolling through a stop sign is about as minor as you can get, though it’s still technically illegal, and I suppose for a decent reason: stop signs are generally placed at locations where stopping fully is, at the very least, a pretty good idea. So, with that in mind, should we be programming self-driving car AI to commit this admittedly minor crime? Tesla seems to have already decided that it’s okay.

The current version of Tesla’s Full Self-Driving (FSD) Beta software contains a feature known as “FSD Profiles,” which arrived with version 10.3, released in October 2021 (it was quickly pulled over issues, then returned very soon after as 10.3.1).

As noted in other articles about these updates, a big feature was the introduction of three “profiles” for FSD’s behaviour: Chill, Average, and Assertive.

In Average and Assertive modes, the little description text reveals a bit about how the software will make the car behave:

[Image: Tesla, JDT]

In both these modes, the description states that the car may perform a rolling stop. Let’s be absolutely clear about what this is: Yes, it’s incredibly minor and perhaps even trivial, but this is the car telling you that its programming may cause it to decide to perform an illegal act.
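To make concrete what a setting like this amounts to, here’s a minimal, purely hypothetical Python sketch of a profile-to-behaviour mapping. None of this is Tesla’s actual code; every name and number below is invented for illustration:

```python
from dataclasses import dataclass
from enum import Enum


class FSDProfile(Enum):
    CHILL = "Chill"
    AVERAGE = "Average"
    ASSERTIVE = "Assertive"


@dataclass(frozen=True)
class ProfileBehaviour:
    follow_distance_s: float  # headway to the car ahead, in seconds (invented values)
    may_rolling_stop: bool    # whether the planner is allowed to roll a stop sign


# Hypothetical mapping: only Chill is guaranteed to come to a complete stop.
BEHAVIOURS = {
    FSDProfile.CHILL: ProfileBehaviour(3.0, False),
    FSDProfile.AVERAGE: ProfileBehaviour(2.0, True),
    FSDProfile.ASSERTIVE: ProfileBehaviour(1.0, True),
}


def approach_stop_sign(profile: FSDProfile, intersection_clear: bool) -> str:
    """Decide whether to stop fully or roll the sign, based on the selected profile."""
    behaviour = BEHAVIOURS[profile]
    if behaviour.may_rolling_stop and intersection_clear:
        return "rolling stop"  # technically illegal, and the software made the call
    return "full stop"


print(approach_stop_sign(FSDProfile.AVERAGE, intersection_clear=True))  # rolling stop
```

The point of the sketch is the last branch: once the driver has picked a profile, whether the car actually breaks the law at any given intersection is decided by the software, not the human.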

The reason I’m making a Big Thing out of this is that we’re still early enough in humanity’s development of what we hope will one day be actually, fully self-driving cars that we can still look at what we’re able to do and really ask ourselves if this is the path we want to take.

Is it? I’m honestly not certain.

The specific act — rolling through a stop sign — is less important than the bigger implication here: we have traffic laws on our books that are routinely broken, because we’re human beings, and we feel the overall experience of driving can be improved by wilfully ignoring some of those laws.

Almost all of us speed at times, too. And while you can get Tesla’s (and others’, of course) driver-assist systems to speed as well, it’s always been the human’s decision to do so. If you set the upper speed limit of your cruise control or Level 2 semi-automated driving system at 153 km/h, that’s what the car will do, but that was your choice, not the car’s.

This situation is different, because the driver is not part of the decision-making process that could result in the Tesla rolling through a stop sign and breaking a law. If a cop sees you do this and pulls you over, who is to blame?

Is it the driver’s fault, because they were informed that the car might pull off such a crime when they selected the driving mode? Or should the cop send the ticket to Tesla HQ, since it was their software that wilfully decided to roll through the stop sign?

Do we want our eventual self-driving cars to be willing to break laws? Does this mean we need to take a realistic look at our traffic laws and maybe adapt them a bit better to real-world, real-life behaviours and situations? Should we just legalise a slow-rolling stop at certain intersections and conditions, and maybe have more flexible speed laws?

Or should we just program our cars to follow the law? Isn’t part of what makes the possibility of computer-controlled driving so appealing that computers can always do the safe thing, and won’t ever be tempted to break laws or run stop signs or speed, because they’re not burdened with our flawed, impulsive, horny, hungry human brains?

It may sound minor, but the line of thinking shown in these driving profiles isn’t conceptually different from, say, Tesla actually managing to develop and sell its humanoid robot (stop laughing, this is a thought experiment) and including a Shoplift Mode that would let it attempt to shoplift items if it thought it could get away with it.

[Image: Tesla, JDT]

Of course, Shoplift Mode doesn’t exist, but it’s not really any different from a slider that lets the car decide to violate a traffic law.

We need to think all this through now and decide what we want for our future. Do we want total law-and-rule following? Do we want certain exceptions? The ability to override as needed and permit law-breaking behaviour? Or to hand the decision over to the machine, possibly with a sliding set of acceptable parameters?
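Each of those four options could be written down, purely as a thought experiment, as something like the Python sketch below. The policy names, parameters, and threshold here are all invented for illustration, not drawn from any real system:

```python
from enum import Enum, auto


class CompliancePolicy(Enum):
    STRICT = auto()              # always follow the letter of the law
    LEGAL_EXCEPTIONS = auto()    # only carve-outs the law itself permits
    DRIVER_OVERRIDE = auto()     # a human explicitly authorises each violation
    MACHINE_DISCRETION = auto()  # the car decides, within tunable bounds


def may_roll_stop(policy: CompliancePolicy,
                  driver_authorised: bool = False,
                  legal_carve_out: bool = False,
                  risk_estimate: float = 1.0,
                  risk_threshold: float = 0.1) -> bool:
    """Return True if a rolling stop would be permitted under the given policy."""
    if policy is CompliancePolicy.STRICT:
        return False
    if policy is CompliancePolicy.LEGAL_EXCEPTIONS:
        return legal_carve_out
    if policy is CompliancePolicy.DRIVER_OVERRIDE:
        return driver_authorised
    # MACHINE_DISCRETION is the "slider" option: the car weighs its own risk estimate
    return risk_estimate < risk_threshold
```

Laid out this way, the uncomfortable part is obvious: the last branch is the only one where neither the legislature nor the driver has signed off on the violation, and that’s the branch the Average and Assertive profiles effectively live in.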

Honestly, I’m not sure exactly how this should play out. What I am sure about is that we, collectively, as a society, need to take the time and do the admittedly hard work to decide on a standard set of rules, before we just start trying shit out and seeing how far we can push it.

Because, remember, we’re humans, and part of that deal means we’ll always push it, maybe too far.

