Tesla has revealed details and possible causes of the Model X accident, admitting that Autopilot was active at the time of the fatal crash.

The US National Transportation Safety Board (NTSB) has sent experts to California to investigate a fatal accident involving a Tesla electric vehicle.

The driver (operator) of a Tesla Model X died on March 23 in a crash on California's Highway 101 near Mountain View. For unknown reasons the crossover lost control, crashed into a crash attenuator at high speed, then caught fire and exploded. The car was reportedly being driven in autonomous mode, with the operator in the cabin as a safeguard against unforeseen situations, as required by California regulations; fully driverless operation without a person on board was due to become legal only later, and the accident occurred a week before the new rules took effect. The operator died in the ambulance on the way to the hospital.

However, the main topic of the upcoming investigation is the fire. After the accident, two more cars following the electric vehicle crashed into it, leaving the Tesla with significant damage; it caught fire and then, according to eyewitnesses, exploded. It is assumed that the collision caused the car's battery, located under the floor, to detonate, although the manufacturer had previously stated that it was completely safe. Tesla noted that the car's battery is designed to minimize the likelihood of a fire, and that if a fire does occur, it should spread slowly enough for passengers to leave the cabin.

Notably, eyewitnesses say they managed to pull the operator out of the car before the explosion. His death was therefore not caused by burns but by injuries sustained in the collision.

The company indicated that more than 200 Teslas drive along this section of the highway in Autopilot mode every day, and no accidents involving self-driving cars had ever been recorded there before.

Tesla offered its own explanation for the fire and subsequent explosion: the crash attenuator the car hit was supposed to soften the blow, but for some reason it did not. It is possible that this section of the barrier was either installed incorrectly or had been damaged in an earlier collision and never replaced.

But the National Highway Traffic Safety Administration does not intend to focus on the ensuing fire; its main task is to establish the causes of the Tesla accident in California. The American road regulator is increasingly faced with the need to study the role of driverless technologies, writes The Wall Street Journal.

American officials and politicians have recently increasingly talked about the need to review regulatory rules in connection with the development of the self-driving car industry.


Americans are wary of self-driving cars and express aggression toward them

Nearly two-thirds (63%) of Americans are wary of traveling in a vehicle that is completely independent of the driver, according to a survey conducted by the American Automobile Association (AAA).

The death of a cyclist in Arizona and the latest fatal Tesla accident only intensify these concerns.

San Francisco cyclists previously sent a petition to California authorities asking that self-driving cars not be allowed to be tested on state streets without drivers, since existing technologies are not yet safe enough.

The Department of Motor Vehicles of the American state of California has recorded an increase in the number of incidents involving self-driving cars since the beginning of 2018. As it turned out, of all the accidents involving self-driving cars recorded in the first month of the year, almost a third were caused by pedestrians who behaved aggressively and attacked the cars.

Most US states have a rule that requires a person to be behind the wheel when testing self-driving cars. California ended that rule in late February. The absence of an operator in the cabin will become legal from April 2018. However, none of the companies have yet applied for such permits.

In addition, Americans fear that hackers could gain remote unauthorized access to the control systems of unmanned vehicles - engine start, steering, braking system - and disable them, as well as lock people in the car.

Experts warn that such scenarios are potentially possible through remote hacking of cars equipped with remote-control functionality. Modern cars are an "open door" for hackers, and hostile states or terrorists could exploit the opportunity to hack them, turning a car into a deadly weapon.

Hackers are already able to take control of any car manufactured since 2005, but some cars manufactured in 2000 are also at risk.

"Any nation with the capability to launch a cyber strike could kill millions of civilians through car hacking," said Justin Cappos, a computer security expert at New York University.

And while automakers argue over the timing of bringing self-driving cars to the global market, cybersecurity experts paint a rather grim picture of the "terrorist" potential of autonomous vehicles, whose systems can be hacked just like any other computer.

And it is very difficult to protect against such hacking: a single error is enough to make the system accessible to attackers. Even for an organization like the US National Security Agency (NSA), which has all the necessary technical and intellectual resources, the main question is not "if" a potential hack will occur, but "when" it will happen. With regard to self-driving cars, one of the main dangers may be so-called zero-day vulnerabilities. Virus writers focus their efforts precisely on finding such unknown vulnerabilities in software, which, if successful, gives them control over the entire system rather than its individual components.

Tesla lost three billion dollars in a month

The company's credit rating outlook remains "negative", indicating the possibility of a further downgrade.

Since the beginning of March, Tesla shares have fallen 25%. Tesla's capitalization decreased by $14.6 billion over the month. Shares are falling amid investor concerns about a lack of funding, as well as because of the scandal over a fatal crash on California's Highway 101. The company's founder and CEO, Elon Musk, lost about $3 billion in just one month.

To pay down debt and avoid a liquidity crunch, Tesla management will need to raise significant capital in the near future.

Tesla has released the results of its analysis of the on-board systems of the Model X crossover in the moments before the tragic accident last Friday, March 23.

As a reminder, there was a fatal traffic accident on Highway 101 in Mountain View, California. The electric car crashed into a concrete divider at high speed, after which it collided with two more cars. As a result of the terrible impact, the Model X crossover completely lost its front end, and the battery pack caught fire. The driver was taken to the hospital, but died from his injuries.

As Tesla now reports, the crossover was moving on Autopilot before the collision. Shortly before the accident, the driver received several visual warnings and one audible warning about the need to hold the steering wheel. However, in the six seconds before the collision with the divider, the sensors did not detect the driver's hands on the wheel.

The electric car maker claims that the driver had approximately five seconds and an unobstructed view of 150 meters to avoid the collision, but no action was taken.
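Tesla's figures also allow a rough back-of-the-envelope speed estimate. Assuming (and this is only an assumption, since the report does not state the speed) that the 150 meters of unobstructed view correspond roughly to the distance covered in those five seconds:

```python
# Hypothetical sanity check, not part of Tesla's report: estimate the
# implied average speed from the figures cited in the text.
distance_m = 150            # unobstructed view distance cited by Tesla
time_s = 5                  # reaction window cited by Tesla

speed_ms = distance_m / time_s      # average speed in m/s
speed_kmh = speed_ms * 3.6          # convert to km/h
speed_mph = speed_kmh / 1.609344    # convert to mph

print(f"{speed_ms:.0f} m/s = {speed_kmh:.0f} km/h = {speed_mph:.0f} mph")
# → 30 m/s = 108 km/h = 67 mph
```

That works out to roughly highway speed, which is consistent with the description of a high-speed impact.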

It is also noted that the consequences of the collision were so destructive because the impact attenuator on the divider had been destroyed in an earlier accident, and road services had not yet replaced it.

Tesla emphasizes that Model X crossovers have never received such serious damage in any accident. In any case, the company claims, Autopilot in its current form cannot prevent all possible accidents, but it significantly reduces their likelihood.

However, it is still not clear why the Model X's numerous sensors allowed the car to hit the divider. After all, theoretically, the car should have detected its presence and at least performed emergency braking.

US authorities have begun investigating the causes of the first fatal accident involving a car driving on Autopilot. The accident itself occurred in Florida back in May, but the incident, in which the driver died, became known only now. A Tesla Model S with Autopilot engaged rammed a tractor-trailer that was crossing the highway perpendicular to the car's direction of travel. According to the Tesla press service, the accident resulted from a combination of tragic circumstances.

By the way, about a year ago a self-driving car from Google was also involved in a serious accident; there were no fatalities, but three Google employees were injured. The self-driving hybrid crossover Lexus RX 450h was approaching an intersection when two cars ahead of it suddenly began to slow down, despite a green light. Their drivers saw a traffic jam beyond the intersection and, even with the light in their favor, chose not to enter it so as not to block cross traffic. The autonomous Lexus also applied its brakes. The driver of the car behind the Googlemobile did not have time to react and hit the Lexus at 27 km/h; it later turned out he had not even managed to press the brake pedal before the impact.

On July 1, Geektimes published news that a Tesla Model S with the autopilot system turned on was involved in a fatal accident. So far, this is the first and only case of death of the driver of a car controlled by a computer system.

Formally, the electric car manufacturer is not to blame for anything. Autopilot is disabled by default in Tesla vehicles. The company previously stated the following: "Safety is Tesla's top priority and we design and engineer our vehicles with that in mind. We also ask our customers to practice safe driving when using our cars... Tesla Autopilot is the most advanced system available, but it does not turn the car into a self-driving vehicle or relieve the driver of responsibility." All of this is true, but there are a couple of controversial points.

Late deadline for informing the regulator about the accident

The accident happened on May 7, and the US National Highway Traffic Safety Administration learned about it on May 16, 9 days later. The public was told about the car accident even later - on June 30, almost 2 months after the incident itself. Why didn't regulators or the company itself report this earlier?

Tesla Motors gave a detailed answer to this question: “Tesla, like any other automaker, does not find it necessary to share information about all accidents involving Tesla electric vehicles. More than a million people die in accidents every year, but automakers do not share information about every incident with investors...”


Tesla Model S electric car after an accident on May 7

Sale of shares of Tesla Motors

On May 18, another interesting event occurred: Tesla and Elon Musk sold $2 billion worth of company shares at $215 per share. By this time, the company's management already knew about the disaster, and an investigation was underway. If the incident had become known before the sale, the proceeds would likely have been smaller. The question arises: are the share sale and the delay in reporting the fatal accident related?

Immediately after the incident became known, the company's share price fell from $212 to $206. However, the losses were quickly recouped: by evening the price had risen to $216. Elon Musk, the head of Tesla Motors, responded on Twitter to accusations of delaying informing regulators and the public about the accident, saying none of it was "material" to the company's investors.

There are a few more questions that don't have answers yet.

Was the deceased driver watching a movie? Several people reported hearing the soundtrack of a Harry Potter film coming from the electric car. If the driver was watching the film, then responsibility for what happened lies with him. If he was not (as one of the accident investigators says) and was watching the road while the Autopilot failed him, then Tesla shares the responsibility. Whatever the company says about Autopilot's beta status and drivers' responsibility for using it, Google does not plan to ship its self-driving system until it is fully ready, and Volvo representatives say the same. Does that mean Tesla brought an unfinished product to market to attract buyers?

How does Tesla plan to avoid accidents in the future? According to some experts, "the high ground clearance, coupled with the position of the semi-trailer on the road and extremely rare circumstances, caused the Model S to pass under the trailer and the bottom of the trailer to collide with the windshield of the Model S."

If there had been something else in place of the trailer, with less clearance, the autopilot would probably have responded.

Soon the company will bring to market an updated version of the autopilot, which they promise to make even safer than the previous version.

Some features of the new version of the autopilot are already known: cameras have been added and the software has been updated. “The dual-camera system is capable of recognizing and responding to stop signs and traffic lights without driver intervention,” says a source familiar with the updated version’s functionality. Currently, the system only responds to physical obstacles in front of the car. If the updated version is able to respond to signs and traffic lights, then this is much closer to autonomous driving.

System components such as Traffic-Aware Cruise Control and Autosteer have been updated. A Tesla driver with the new Autopilot will also receive an updated interface: the objects surrounding the car will be shown in first person, the way the electric car's sensors "see" them. Currently the picture is shown in third person, which may interfere with the driver's perception of obstacles.

Speech recognition has been improved and new voice commands have been added. You will no longer need to press and hold a button in order to give a voice command: the system will constantly listen to the driver. Once recognized, the command will be shown on the screen. It is planned to improve interaction with large objects on the road - trucks and trailers.

The new autopilot is currently undergoing beta testing, and it could take several weeks or several months before release.

By the way, those owners of Tesla electric vehicles who placed an order before September 14, 2014 will not receive the update. Most likely, the Tesla Model 3 will come with an updated version of the software.

What does Musk think?

Earlier, the head of Tesla Motors wrote the following to a Fortune journalist: "...around the world, about a million people die every year in car accidents. About half of this million could be saved if Tesla Autopilot were available to everyone." Musk has also stated that, in his opinion, Autopilot roughly halves the likelihood of a car getting into an accident.

UPD. Today it became known about another Tesla electric car accident. This time, trouble happened to the owner of a Tesla Model X, who activated the autopilot mode before the accident. Fortunately, no one was killed and there were no serious injuries.

The accident occurred on July 1 in Pennsylvania, USA. The Model X, with autopilot engaged, drove into a protective fence, crossed several lanes of the road and crashed into a concrete divider.

The electric car rolled over onto its roof and came to a stop. Another vehicle, a 2013 Infiniti G37, was damaged in the accident when it was struck by detached body parts of the Model X. According to those familiar with the stretch of road where the accident occurred, driving there is very easy: the markings and all the barriers are installed where they should be, so driver error is practically excluded. Why Tesla's Autopilot went astray is unclear and remains to be studied.


As we already know, on March 23 a tragic accident involving a Tesla Model X occurred on a highway near Mountain View: the electric car crashed into a concrete divider at high speed and then collided with a Mazda and an Audi. The driver of the Model X died in hospital from his injuries. The US National Transportation Safety Board (NTSB) began investigating the accident, and media reported that the deceased driver had repeatedly complained about the malfunctioning of Tesla's Autopilot. Last night, Tesla clarified the Autopilot question.

After analyzing data collected from the destroyed Model X's on-board computer, the manufacturer confirmed that Autopilot was engaged at the time of the collision.

According to Tesla, shortly before the collision the driver received several visual notifications and one audible warning telling him to take the wheel. On-board computer data indicates that six seconds before the collision with the divider, the car was being driven by Autopilot and the driver was not holding the steering wheel at all. The manufacturer also claims that the driver had at least five seconds and an unobstructed view 150 meters ahead to avoid a collision, but, judging by the on-board computer records, no action was taken.

In addition, it is separately noted that the consequences of the collision turned out to be so destructive for the reason that the energy-absorbing barrier fence was destroyed as a result of a previous accident and road services had not yet had time to repair it.

Let's remember that the collision was so strong that it caused a fire in the battery pack, which was designed to prevent such an outcome.

“We have never seen such a severe level of damage to a Model X after a crash,” the manufacturer said in the first statement.

Unfortunately, Tesla does not provide data on the speed at which the car was moving, but it was clearly high.

Tesla admits that its Autopilot is imperfect, while emphasizing the superiority of its own solution over alternative driver assistance systems. Referring to WHO statistics (1.25 million people die in road accidents around the world every year), the company notes that with the current level of safety, Tesla cars could potentially save about 900 thousand human lives. In other words, if only Teslas were driven around the world, the number of road deaths would decrease by more than 70%. The company expects that in the future, self-driving cars will be 10 times safer than human-driven cars.
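The percentage quoted here can be checked directly against the cited figures; a minimal sketch (the numbers are the article's, the calculation is mine):

```python
# Sanity check of the figures cited in the text: WHO's annual road-death
# estimate versus the ~900,000 lives Tesla claims could be saved.
annual_road_deaths = 1_250_000   # WHO estimate quoted in the article
lives_saved = 900_000            # Tesla's claimed potential

reduction = lives_saved / annual_road_deaths
print(f"Implied reduction: {reduction:.0%}")
# → Implied reduction: 72%
```

A 72% reduction matches the article's "more than 70%" figure.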

These statements cannot be called absurd, especially considering the many cases in which Tesla's still-developing Autopilot has saved lives and prevented collisions of various kinds. Moreover, we should not forget that no production self-driving cars exist on the market yet; at this stage, all autonomous driving technologies are still being developed and refined.

On the other hand, the recent case with Uber's self-driving car and the current one with the Model X raise many questions. In the case of the Model X, it is unclear why the system did not react at all and allowed the car to hit the divider in broad daylight; at a minimum, the emergency braking system should have engaged. The manufacturer has not yet offered any additional explanation. It would be foolish to deny that self-driving cars are our future, but one thing is clear now: the corresponding technologies are still too raw, and recent mistakes paid for with human lives will slow their development, however sad that may be. In other words, Tesla cars, or any others, driving freely without a driver throughout the United States or any other country will not appear any time soon. However, I would like to be wrong.