The government has urged Tesla to improve driver education on its so-called “autopilot” features after a string of videos raised safety concerns.
Several Tesla vloggers have filmed themselves driving with autopilot systems engaged and without hands on the wheel, in contravention of Hong Kong law, while other drivers have been caught doom-scrolling or texting, oblivious to the world around them, as they travel at high speed on highways.
Transit Jam asked the government whether such technology was safe for Hong Kong’s streets and whether the proliferation of such videos was a cause for concern.
In response, the Transport Department (TD) said drivers needed more education on autopilot functions.
“TD has urged the local representatives of Tesla to step up customer education on proper use of [autopilot] functions, so that they are fully aware of the operations and limitations of these functions,” a TD spokeswoman said.
Tesla’s Hong Kong website says “autopilot” functions require “active driver supervision and [do] not make the vehicle autonomous.”
But that warning appears to be lost on some local Tesla owners. Vlogger “Toofless Tesla”, for example, shows his Tesla driving through tunnels without his hands on the wheel, in one case despite a dashboard warning that “auto steering” was about to be disabled.
Last month, the US National Highway Traffic Safety Administration’s (NHTSA’s) Office of Defects Investigation upgraded a year-long investigation into Tesla’s autopilot system from a “preliminary investigation” to an “engineering analysis”.
According to the agency, in most crashes involving the autopilot system, the struck vehicles were “parked and surrounded by scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones”, factors which should have made them visible to both the Tesla systems and the drivers at the wheel.
But Hong Kong’s TD has no comment on the NHTSA investigation and says, based on international standards, the systems themselves are safe.
“TD has approved the use of ‘auto steer’, ‘auto lane change’ and ‘navigate on autopilot’ functions of ADAS (Advanced Driver-Assistance System) in Tesla vehicle models after assessing their impact on road traffic and road safety, and their compliance with the relevant recognised international vehicle safety standards,” said the spokeswoman.
According to the government, autopilot features may only be used on dual carriageways or roads with a central divider, and only where the speed limit is 70 kph or above. That also makes Tesla’s nifty “auto parking” feature, a frequent favourite of Tesla vloggers, illegal on most urban streets, which have a 50 kph speed limit and no central divider. Self-driving cars have kept a low profile in Hong Kong, with the government still assessing new laws for how the technology might be safely implemented.
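Spelled out, the conditions above amount to a simple rule: the road must be a dual carriageway or have a central divider, and its speed limit must be 70 kph or higher. A minimal Python sketch, purely illustrative, of that check follows; the Road type and adas_use_permitted() function are hypothetical, not part of any official TD or Tesla system.

# Illustrative only: the Road dataclass and adas_use_permitted() are hypothetical,
# not any official Transport Department or Tesla interface.
from dataclasses import dataclass

@dataclass
class Road:
    is_dual_carriageway: bool
    has_central_divider: bool
    speed_limit_kph: int

def adas_use_permitted(road: Road) -> bool:
    """Check the conditions described above: a dual carriageway or a road
    with a central divider, and a speed limit of 70 kph or above."""
    physically_separated = road.is_dual_carriageway or road.has_central_divider
    fast_enough = road.speed_limit_kph >= 70
    return physically_separated and fast_enough

# A typical 50 kph urban street with no central divider fails the check.
print(adas_use_permitted(Road(False, False, 50)))  # False
print(adas_use_permitted(Road(True, True, 80)))    # True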
A survey last year revealed that as many as 10% of Teslas sold worldwide are ordered with the company’s automated driving support package, badged “Full Self Driving” (FSD). That proportion is down from a peak of around 40%, a drop analyst Troy Teslike attributes to price increases and growing sales in China, where FSD uptake has been low.
Hi everybody. The last report I wrote about FSD take rate is now public. The data source is my Tesla Order Tracker spreadsheets.
You can find all the details here https://t.co/IhbEPZOQu1 including a breakdown by region and model and why it peaked and then dropped. pic.twitter.com/VTA1hSoADR
— Troy Teslike (@TroyTeslike) August 31, 2021
Critics of the system have pointed to an expectation gap between Tesla’s “Full Self Driving” badge and genuine “Level 4” or “Level 5” self-driving systems where human supervision is not required. The “autopilot” label may lead drivers to assume the car is more capable than it is, and to take less care as a result.
The first person killed while using autopilot, Joshua Brown, was thought to have been watching a movie on an iPad when his car drove into a container truck at 120 kph.
The carmaker has responded with new safeguards and systems around its self-driving packages, including sensors that detect hands on the wheel. In the US at least, use of Tesla’s FSD package is tied to a “Safety Score” calculated by the car as it is driven: hard cornering, tailgating, heavy braking or abuse of the auto-steering system (by taking hands off the wheel, for example) will reduce the score, while three “disengagements” triggered by dangerous driving will disable FSD for the remainder of a trip.
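As a rough illustration of that gating logic, and not Tesla’s actual formula, a toy sketch might deduct points per risky event and count forced disengagements; the event names, penalty weights and thresholds below are assumptions for illustration only.

# Illustrative only: the event names, penalty weights and three-strike rule
# below are assumptions; Tesla's real Safety Score formula is not reproduced here.
PENALTIES = {
    "hard_cornering": 2.0,
    "tailgating": 3.0,
    "heavy_braking": 2.5,
    "hands_off_wheel": 5.0,
}

def trip_monitor(events, starting_score=100.0, max_disengagements=3):
    """Deduct points for risky-driving events and disable FSD for the rest
    of the trip once the allowed number of forced disengagements is reached."""
    score = starting_score
    disengagements = 0
    fsd_enabled = True
    for event in events:
        score -= PENALTIES.get(event, 0.0)
        if event == "forced_disengagement":
            disengagements += 1
            if disengagements >= max_disengagements:
                fsd_enabled = False  # disabled for the remainder of the trip
    return score, fsd_enabled

score, enabled = trip_monitor(
    ["tailgating", "forced_disengagement", "forced_disengagement", "forced_disengagement"]
)
print(score, enabled)  # 97.0 False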
In theory, and backed up by some real-world data, driverless cars and those fitted with ADAS are safer than vehicles operated by humans alone. Driverless car firm Waymo says it aims to build “the world’s most experienced driver”, with 15 billion miles travelled or simulated.
But high-profile crashes such as the one that killed Brown have raised social and political concerns, while safe-streets advocates say systems still under development should not be tested on public streets.
In China, meanwhile, fully autonomous cars are making technological headway, with Baidu securing permits to run five Apollo Go “robotaxis”, without a safety driver, in both Wuhan and Chongqing. And in Beijing, the company was allowed to move its “safety supervisors” from the driver’s seat to the passenger seat of its robotaxis.
Tesla has been approached for comment on the Hong Kong government’s request for driver education and its strategy for “autopilot” in Hong Kong.