Should We Trust Autonomous Cars Without Transparency?
As the owner of a Tesla Model 3 with Full Self-Driving (FSD) capability, which I use under constant supervision, I find the recent news that the incoming administration wants to scrap car-crash reporting for autonomous vehicles seriously concerning.
This isn’t just about the safety of those who choose to drive or purchase these vehicles—it’s about everyone on the road. Whether you’re driving a traditional car, riding a bike, walking across the street, or simply sharing the road as a passenger, everyone deserves to know that the vehicles around them, especially those with autonomous capabilities, are being held to the highest safety and accountability standards. Transparency isn’t just a luxury for consumers like me; it’s a basic right for all road users to ensure trust, safety, and fairness on our shared roads.
Editor’s Note:
We've added a new section addressing common misunderstandings about Tesla’s autonomous systems, including misleading depictions in media coverage. Scroll down to the "Misunderstandings Around Tesla’s Systems" section to learn more.
Transparency Matters for Safety
Currently, car-crash data for autonomous vehicles provides critical insight into how these systems perform in real-world scenarios. This information isn’t just a dry statistic—it’s a vital tool for:
- Consumers: To make informed decisions about which vehicle to trust with their safety.
- Regulators: To identify patterns or systemic issues and adjust safety standards.
- Manufacturers: To improve their systems based on real-world performance.
Scrapping this reporting feels like a step backward, especially at a time when autonomous technology is still in its infancy. According to Reuters, “Removing the crash-disclosure provision would particularly benefit Tesla, which has reported most of the crashes – more than 1,500 – to federal safety regulators under the program.”
While this might seem like a win for Tesla, it raises a much broader concern: what about other automakers that are not as far along in autonomous technology?
A Major Concern: Opening the Door to Unsafe Practices
Another troubling aspect of scrapping crash reporting is the precedent it sets for other automakers. While Tesla has been transparent—reporting over 1,500 crashes—it’s concerning to think about newer or less advanced companies entering the autonomous vehicle space without any accountability.
Imagine a scenario where a company rushes out an autonomous feature that isn’t rigorously tested, resulting in accidents. Without mandatory reporting, there would be no oversight, no public data, and no consequences for unsafe practices. This lack of accountability could lead to:
- Unsafe vehicles on the road: Companies might prioritize speed-to-market over thorough testing.
- Erosion of consumer trust: If brands cut corners without accountability, it casts doubt on the entire industry.
- Regulatory blind spots: Without crash data, regulators won’t know when or how to step in.
This isn’t just a theoretical issue—it’s a reality we could face if reporting requirements disappear.
Why Reporting Should Stay
- Protects Public Safety: Mandatory reporting ensures that companies prioritize safety in their designs.
- Levels the Playing Field: Transparency holds all brands to the same high standard, preventing bad actors from slipping through the cracks.
- Builds Consumer Trust: Buyers need data to feel confident in their decisions.
The idea of scrapping crash reporting doesn’t just remove accountability from Tesla—it also leaves the door wide open for less responsible players to enter the space unchecked.
Misunderstandings Around Tesla’s Systems
Another concern is the frequent misunderstanding of Tesla’s autonomous capabilities, even by reputable sources. For example, in the Wall Street Journal video “The Hidden Autopilot Data That Reveals Why Teslas Crash,” during the section labeled “Why this is happening: Tesla’s camera-based system,” the footage shows someone engaging Tesla FSD on the street. This is misleading because FSD (Full Self-Driving) and Autopilot are two distinct software systems.
While it’s good that the video raises awareness about Tesla not sharing important crash data, it undermines its own credibility by showing footage of a different software suite. Autopilot, Tesla’s more basic driver-assist system, doesn’t receive updates as frequently as FSD, yet it remains in wide use among Tesla owners. That disparity in update frequency makes it crucial to scrutinize Autopilot’s safety just as much as FSD’s.
Misrepresentations like this confuse the public and risk creating false perceptions about how these systems operate and how safe they truly are. Accurate reporting and clear distinctions between the systems are critical for addressing the safety challenges of autonomous driving technology. If crash reporting is eliminated, it could further muddy the waters, making it even harder for consumers to understand these differences and trust the technology.
Final Thoughts
For autonomous driving to succeed, it must be safe, transparent, and trustworthy. Removing the requirement to report crashes takes us in the wrong direction, endangering both consumers and the future of this technology.
As someone who owns an autonomous-capable vehicle that requires supervision, I can’t help but worry about the ripple effects of this decision. Autonomous vehicles have the potential to shape the future—but only if they’re developed responsibly.
What do you think about this push to eliminate crash reporting? Does it make you question the safety of autonomous vehicles?