Tesla brings “Smart Summon” to its AVs, but is it safe?


Tuesday, October 22, 2019

Tesla, Inc., recently released what the company calls “Smart Summon,” a remote valet feature. When a Tesla Model 3 is summoned via a smartphone app, the vehicle drives in full autonomous mode, performing every trick in the AV playbook, from autonomous steering and braking to free space analysis, path planning and semantic segmentation… sort of.

These tasks are what every Level 4 autonomous vehicle is supposed to do perfectly. The hitch is that Smart Summon falls far short of perfect.

Tesla makes clear that the vehicle owner wielding the smartphone is responsible for remote-control operation of his or her vehicle. This “human factor” is not a safety feature, however. All it does is absolve Tesla from liability when this seeming Level-4 function malfunctions. After all, when autonomy features inside a Level-2 vehicle don't work as planned, carmakers have an age-old tradition upon which they can rely: Disclaim responsibility and finger the driver.

Intended or not, Tesla’s “Smart Summon” has become a full-blown wedge issue, dividing the world in two: die-hard Tesla fans and everyone else.

Are we surprised?

Tesla owners and Elon Musk cheerleaders — who contend Tesla can do no wrong — are veritably giddy about Smart Summon. It has engendered more than 100 videos on the web. In these clips, proud Tesla owners conduct a variety of show-and-tell science experiments with Smart Summon.

The performances are at times erratic, but Tesla fans are a forgiving bunch. Deeming Smart Summon a “work in progress,” they are convinced that the glitches will be fixed by over-the-air software updates — sometime soon. So, they ask: why worry?

In contrast, skeptics like me are a little nervous about “autonomous features” such as Smart Summon and Autopilot. Features like these let regulators and carmakers sidestep the question of who bears the onus for AV safety. As more autonomy features creep into vehicles, who is accountable? Should carmakers that ship vehicles with incomplete “autonomy” capabilities feel free to shirk responsibility for any minor or major accidents those features cause?

How Smart Summon works

Before going further, let's review how Smart Summon works, as Consumer Reports explained:

When Smart Summon is activated, the Model 3 slowly makes its way to the person summoning it with a smartphone. Let me be clear: in those cases, the car is indeed controlling itself, steering, braking, and making decisions about its route. But the person operating the app still bears the responsibility to monitor the car and keep it out of trouble.

The system requires the user to press and hold the “Come to Me” or “Go to Target” button on the smartphone app. Releasing the button immediately stops the vehicle.
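To make that hold-to-move arrangement concrete, here is a minimal sketch of the kind of dead-man's-switch check the description above implies. The class, method names, and 0.25-second timeout are my own assumptions for illustration, not Tesla's actual implementation.

    # Hypothetical hold-to-move ("dead man's switch") logic; illustrative only.
    import time

    HEARTBEAT_TIMEOUT_S = 0.25   # assumed window: stop if no "button held" ping arrives

    class DeadMansSwitch:
        """The vehicle may move only while the app keeps sending 'button held' pings."""

        def __init__(self):
            self.last_ping = float("-inf")

        def button_held(self):
            # The phone app calls this repeatedly while "Come to Me" is pressed.
            self.last_ping = time.monotonic()

        def may_move(self):
            # Vehicle-side check, run every control cycle.
            return time.monotonic() - self.last_ping <= HEARTBEAT_TIMEOUT_S

    switch = DeadMansSwitch()
    switch.button_held()
    print("may move:", switch.may_move())   # True right after a ping
    time.sleep(0.3)
    print("may move:", switch.may_move())   # False once the pings stop: the car halts

The point of such a design is that the default state is “stopped”: movement has to be continuously re-authorized by the person holding the phone, which is exactly why Tesla can argue the operator is in control at all times.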

Do you see the line being drawn here? Although the vehicle is steering and braking itself, Tesla is off the hook when something goes wrong. The blame falls on the vehicle owner holding the remote.

Of course, I’m the first to admit that I can’t afford a Tesla and I’m probably biased.

So, I checked with one of my trusted car guys — Tesla owner Phil Magney. Magney is also the founder and principal advisor of VSI Labs, a tech research company that examines the building blocks for autonomous vehicle (AV) technologies.

First-hand experience

Asked about his first-hand experience with Smart Summon, here’s what Magney told me.

“I got the latest download yesterday and have been messing with it all day. Initially, I tried it in our parking lot and it worked pretty well, albeit it wanders around a bit. Then I took it to a strip mall, and I got worried because it was moving too slow.”

What made you worry? Magney pointed out that he felt the vehicle’s “overly cautious” driving behavior “would upset human drivers in no time.”

But why does Smart Summon behave like that?

Magney said, “As I mentioned, the most noticeable behavior is the wandering as the vehicle does not move in a straight line when you hold down the 'come-to-me' button on the app.” He noted, “It is not completely clear how it does path planning in a parking lot. I believe it pretty much does free space analysis and basically moves within that. Often that space is very open and since there are no lane lines within a parking lot the vehicle figures it can take as much space as it wants.”

Magney added, “It relies on GPS for this application and since there is some error and drift the trajectory can get wiggly. I also believe it is doing some semantic segmentation. In other words, it knows where the drivable free space is and stays within the confines of the paved surface. While in motion the vehicle is situationally aware as we stepped in front of the vehicle a few times and it always stopped and the app told me it has stopped for a pedestrian.”
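A toy sketch can illustrate the behavior Magney describes: the planner keeps the car inside the segmented free space, a noisy position estimate makes the resulting path wander, and a pedestrian in the drivable area halts the car. This is purely hypothetical pseudologic based on his description, not Tesla's planner.

    # Toy illustration of the behavior described above: stay inside detected free
    # space, wander a little because of noisy GPS, and stop for pedestrians.
    # Purely hypothetical; not Tesla's planner.
    import random

    def next_step(position, target, free_space, pedestrians, gps_noise=0.5):
        """Return the next grid cell to move to, or None if the vehicle must stop."""
        if any(p in free_space for p in pedestrians):
            return None                                # pedestrian in the drivable area: halt
        # A noisy position estimate makes the chosen heading (and hence the path) wiggle.
        est_x = position[0] + random.uniform(-gps_noise, gps_noise)
        est_y = position[1] + random.uniform(-gps_noise, gps_noise)
        step_x = round(est_x + (1 if target[0] > est_x else -1 if target[0] < est_x else 0))
        step_y = round(est_y + (1 if target[1] > est_y else -1 if target[1] < est_y else 0))
        candidate = (step_x, step_y)
        # Only move if the candidate cell lies inside the segmented free space.
        return candidate if candidate in free_space else position

    parking_lot = {(x, y) for x in range(10) for y in range(5)}   # drivable free space
    print(next_step((0, 0), (9, 4), parking_lot, pedestrians=[]))         # creeps toward (9, 4)
    print(next_step((0, 0), (9, 4), parking_lot, pedestrians=[(3, 2)]))   # None: stopped

Even this crude version shows why the real thing wanders: with no lane lines to track, the path is constrained only by the free-space boundary, so any error in the position estimate shows up directly in the trajectory.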

Where do regulators stand on this?

As expected, the National Highway Traffic Safety Administration (NHTSA) is not actively doing anything about Tesla’s Smart Summon. The agency said it has ongoing contact with the company and will continue to gather information. It noted that consumers are encouraged to report any concerns to the NHTSA complaint database, which regulators use to track potential safety defects. For now, that’s it.

In the automotive industry, regulators are conditioned to be reactive. They take action only after a catastrophe.

Legally speaking, “there is nothing that forbids this kind of parking application,” said Magney. “If there are injuries and property damage then the agencies will do a cease and desist for this operation, I suppose.”

But will they? Colin Barnden, lead analyst at Semicast Research, is not holding his breath.

“We’ve seen this already with Autopilot and it is all about accountability. If nothing goes wrong, Teslas are amazing. If anything goes wrong, Tesla goes into full denial mode. It is all the responsibility of the human operator,” he said. Typically, Tesla points out that “liability [is] specified in detail in the disclaimer. Oh, and we have the data logs which prove it is all your fault.”

Barnden’s rejoinder: “Serious car companies do not treat their customers with derision.”

It’s also clear that NHTSA can’t possibly keep up with Tesla. “Tesla’s use of OTA updates means features can be added faster than NHTSA can evaluate them and legislate accordingly,” observed Barnden. “Serious car companies do not treat NHTSA (and the NTSB) with derision.”

Magney agreed: “The regulators are pretty far behind in their understanding of this technology or even the availability of the technology.”

ODD defined?

One issue raised about Smart Summon is its Operational Design Domain (ODD). Tesla’s guidance says that Smart Summon should be used in “private” parking lots. But this could cause confusion.

Magney said, “I am not completely sure who makes the rules within a private parking facility.”

Consumer Reports also noted, “Many consumers would consider shopping centers as a kind of public space.” This raises “the question of where exactly [Smart Summon] can be used.” The Consumer Reports article explained:

… we found that the [Smart Summon] system works only intermittently, depending on the car’s reading of the surroundings. The system is designed to work only in private parking lots, but sometimes it seemed confused about where it was. In one case, the system worked in one section of a private lot, but in another part of the lot it mistakenly detected that it was on a public road and shut itself down. At various times, our Model 3 would suddenly stop for no obvious reason.

When it did work, the Model 3 appeared to move cautiously, which could be a positive from a safety perspective. But it also meant the vehicle took a long time to reach its driver. The Model 3 also didn’t always stay on its side of the lane in the parking lots.

As Magney pointed out, “Its ODD is the parking lot for now.” But even within that ODD, Smart Summon reportedly could get confused.

It’s a gimmick

Unless you are a die-hard Tesla fan, the world seems to agree that Smart Summon is a gimmick.

Barnden told me, “It is important to understand that in no way can Smart Summon be classed as a safety product. At best it is a convenience feature, but gimmick would be a more accurate description.”

He added, “Any product which only ‘mostly works’ is, by definition, not specified for safety critical applications. In automotive, you can always tell what a safety product is, because you never even notice it is there, right up until the moment it saves your life.” He asked, “See anything ever done by Volvo as an example. How many people upload video of airbags doing their job, or Automatic Emergency Braking (AEB)? None, because airbags and AEB are safety products and safety is boring.”

Magney concurred with Barnden. He said, “Honestly, this new feature [Smart Summon] is a gimmick in my opinion. And I don’t believe anyone purchased Autopilot for the Smart Summon feature. Tesla is rubbing their technology in the faces of the industry. Look what we can do! We are so far ahead! Our cars have 360-degree situational awareness. We can do things nobody else can do!”

Smart Summon is an autonomy feature that embodies the ethos of Tesla being Tesla.

Magney explained, “Tesla is so committed to driving a wedge between themselves and the rest of the industry that they will do things like this. It’s like the games and silly apps that they push out. Nobody really cares about those things. Tesla does it because it can.”

Barnden went a step further.

“Tesla isn’t a serious car company, it is a social media brand bordering on light entertainment and not very different from ‘Keeping Up with the Kardashians,’” he said. But “what Tesla does brilliantly is to tap into the innate human desire for a sense of belonging, which creates the extremely loyal and cult-like fan base worshipping St. Elon. This is no different to fanatically following a sports team or memorizing baseball statistics. For some people, spending hours on Twitter aggressively arguing with random strangers about Tesla fulfills their need for intimacy.”

As Magney acknowledged, “Tesla customers are a unique kind of buyer. Like the company itself. Early adopters and proud of their vehicle despite its quirks. The general public may not be so tolerant.”

But hand it to Tesla, Barnden conceded. “Tesla has certainly found a different way to sell cars.”

Let’s not trash Tesla all the way

Regardless of the controversies surrounding Elon Musk, one fact is manifest: carmakers admire Tesla’s pioneering efforts to enable OTA updates, which can add entirely new features to a vehicle, and not all of them are silly apps.

Magney said, “Tesla’s V10 update comes with a bunch of new features that are really good. Like the driving visualization that enables the driver to have greater situational awareness. This is an awesome feature.” He added, “The new visualizer lets the driver see traffic much further ahead, and oncoming traffic. These features give the Tesla owner greater assurance and confidence in the technology.”

Is ‘Smart Summon’ Level 4?

One big question came to my mind while watching Smart Summon videos: shouldn’t Smart Summon technically be categorized as a Level 4 feature? And if so, who is responsible for any malfunctions that might occur?

Magney doesn’t see much ambiguity. He said, “Remember, this feature may only be operated within line of sight. Also, the throttle is in your hand. In other words, the operator has to press the button and hold it for the vehicle to move. The moment you release the button, the vehicle stops. So, the owner/operator is in control at all times.”

Magney acknowledges that with features like Smart Summon, “for Tesla, and increasingly the industry at large, the notion of adhering to strict SAE levels is beginning to fade.”

It strikes me that features like Autopilot and Smart Summon are “wolves in sheep's clothing.” They are designed to push Level 4 autonomy into Level 2 vehicles. I can’t help but wonder how regulators and carmakers might deal with this mixed reality in the future.

As Barnden pointed out, “For any technology to be Level 4, the OEM must be liable for whatever happens.” But “for Tesla, this is never — which is why every technology feature they offer is Level 2, irrespective of what they may claim,” he noted.

Asked about Tesla calling its vehicles’ autonomy “Level 4,” Barnden said: “This is the biggest barrier to their introduction of a robotaxi fleet — there is no human to throw under the bus when the [driverless] cars inevitably crash or kill. Which they will.”

This suggests that Tesla might have devised a perfectly artful dodge: introduce a host of autonomy features, but put them inside Level 2 cars, so the automaker can’t be held responsible when the gimmicks go haywire. It can always point the finger at the fall guy behind the wheel — or under it.

By: DocMemory
Copyright © 2023 CST, Inc. All Rights Reserved
