

News
Tesla Autopilot Abusers need to be held accountable, but how?
Tesla Autopilot abusers need to be held accountable for their actions. For years, Tesla engineers have worked long and hard to improve Autopilot and Full Self-Driving, pouring hundreds of thousands of hours into these driver-assistance programs through software development and beyond. However, years of hard work, diligence, and improvement can be wiped away from the public’s perception in a minute by one foolish, irresponsible, and selfish act, often born of an owner’s need to show off their car’s semi-autonomous features to others.
The most recent example involves Param Sharma, a self-proclaimed “rich as f***” social media influencer who has spent the last few days sparring with Tesla enthusiasts over his selfish and undeniably dangerous habit of climbing into the backseat while his car operates on Autopilot. Sharma has been seen on numerous occasions sitting in the backseat while the vehicle drives itself. It is almost certain that Sharma is using cheat devices to bypass the barriers Tesla has installed to ensure drivers are paying attention: a steering wheel sensor, seat sensors, and seatbelt sensors, all of which must register driver engagement while Autopilot is in use. We have seen several companies and some owners use DIY hacks to bypass these safety thresholds. These are hazardous acts for several reasons, the most important being a lack of regard for other human lives.
This is a preview from our weekly newsletter. Each week, I go ‘Beyond the News’ and handcraft a special edition that includes my thoughts on the biggest stories, why they matter, and how they could impact the future.
While Tesla fans and enthusiasts are undoubtedly confident in the abilities of Autopilot and Full Self-Driving, they will also admit that these suites need to be used responsibly and as the company describes. Tesla has never indicated that its vehicles can drive themselves, which would be characterized as “Level 5 Autonomy.” The company also states that drivers must keep their hands on the steering wheel at all times, and it has installed several safety features to ensure the car’s operator complies. If these precautions are not followed, the driver risks being put in “Autopilot Jail,” where the feature will be unavailable for the remainder of the drive.
As previously mentioned, however, there are cheat devices for all of these safety features. This is where Tesla cannot necessarily control what goes on, and where, in my opinion, law enforcement bears more responsibility than the company does. It is law enforcement’s job to stop this behavior when an officer sees it occurring. Nobody should be able to climb into the backseat of their vehicle while it is driving. At least not until many years of testing are completed and many miles of fully autonomous operation prove the technology accurate and robust enough to handle real-world traffic.
The reason Tesla should step in, in my opinion, and create a list of repeat offenders who have proven themselves too irresponsible and untrustworthy for Autopilot and FSD is simple: if an accident happens while these influencers or everyday drivers are abusing Autopilot’s capabilities, Tesla, along with every other company working toward Level 5 autonomous vehicles, takes a huge step backward. Not only will Tesla draw the most criticism from the media, but that criticism will be poured on if the company appears to be taking no real steps to prevent the abuse. We in the Tesla community know what the vehicles can do and what safety precautions have been installed to prevent these incidents from happening. Mainstream media outlets, however, do not have an explicit and in-depth understanding of Tesla’s capabilities, and there is plenty of evidence to suggest they have no intention of improving their comprehension of what Tesla does daily.
While I was discussing this subject with someone on Thursday, they argued that this isn’t Tesla’s concern. Strictly speaking, it isn’t, but I don’t think that’s an acceptable answer to all of the abuses going on with the cars. Tesla should take matters into its own hands, and I believe it will because it has done so before. Elon Musk and Tesla recently expanded the FSD Beta testing pool, but the company also revoked access from some people who decided not to use the functionality properly. Why should AP/FSD be any different? Just because someone pays for something doesn’t mean the company cannot revoke access to it. If you pay for access to online video games and hack or use abusive language, there are major consequences: your console can get banned, and you would need to buy a completely new unit if you ever wished to play online again.
While unfortunate, Tesla will have to take a stand against those who abuse Autopilot, in my opinion. The company needs to impose heavier consequences, simply because an accident caused by abuse or misuse of these functionalities could set it back several years and stall its work toward Level 5 autonomy. There is entirely too much at stake here to even begin to let people off the hook. I believe Tesla’s actions should follow law enforcement’s: when police officers find someone violating the proper use of the system, the normal reckless driving charges should apply, with increasingly severe consequences for every subsequent offense. Perhaps after the third offense, Tesla could be contacted and AP/FSD could be removed from the car. There could be a probationary period or a zero-tolerance policy; that would all be up to the company.
I believe this needs to be taken seriously, and there need to be consequences, because of the blatant disregard for other people and their work. The irresponsible use of AP/FSD by childish drivers means Tesla’s hard work is being jeopardized by horrible behavior. While many people don’t enjoy driving, it still requires responsibility, and everyone on the road is trusting you to drive responsibly. Abusing that trust could cost your life or, even worse, someone else’s.
A big thanks to our long-time supporters and new subscribers! Thank you.
I use this newsletter to share my thoughts on what is going on in the Tesla world. If you want to talk to me directly, you can email me or reach me on Twitter. I don’t bite, so be sure to reach out!
News
Tesla Model 3 gets perfect 5-star Euro NCAP safety rating
Tesla prides itself on producing some of the safest vehicles on the road today.

Tesla prides itself on producing some of the safest vehicles on the road today. Based on recent findings from the Euro NCAP, the 2025 Model 3 sedan continues this tradition, with the vehicle earning a 5-star overall safety rating from the agency.
Standout Safety Features
As shown on the Euro NCAP’s official website, the 2025 Model 3 achieved an overall score of 90% for Adult Occupants, 93% for Child Occupants, 89% for Vulnerable Road Users, and 87% for Safety Assist. This rating, per the Euro NCAP, applies to the Model 3 Rear Wheel Drive, Long Range Rear Wheel Drive, Long Range All Wheel Drive, and Performance All Wheel Drive.
The Euro NCAP highlighted a number of the Model 3’s safety features, such as its Active Hood, which automatically lifts during collisions to mitigate injury risks to vulnerable road users, and Automatic Emergency Braking System, which now detects motorcycles through an upgraded algorithm. The Euro NCAP also mentioned the Model 3’s feature that prevents initial door opening if someone is approaching the vehicle’s blind spot.
New Safety Features
In a post on its official Tesla Europe & Middle East account, Tesla noted that the company is also introducing new features that make the Model 3 even safer than it is today. These include functions like head-on collision avoidance and crossing traffic AEB, as well as Child Left Alone Detection, among other safety features.
“We also introduced new features to improve Safety Assist functionality even further – like head-on collision avoidance & crossing traffic AEB – to detect & respond to potential hazards faster, helping avoid accidents in the first place.
“Lastly, we released Child Left Alone Detection – if an unattended child is detected, the vehicle will turn on HVAC & alert caregivers via phone app & the vehicle itself (flashing lights/audible alert). Because we’re using novel in-cabin radar sensing, your Tesla is able to distinguish between adult vs child – reduced annoyance to adults, yet critical safety feature for kids,” Tesla wrote in its post on X.
Below is the Euro NCAP’s safety report on the 2025 Tesla Model 3 sedan.
Elon Musk
USDOT Secretary visits Tesla Giga Texas, hints at national autonomous vehicle standards
The Transportation Secretary also toured the factory’s production lines and spoke with CEO Elon Musk.

United States Department of Transportation (USDOT) Secretary Sean Duffy recently visited Tesla’s Gigafactory Texas complex, where he toured the factory’s production lines and spoke with CEO Elon Musk. In a video posted following his Giga Texas visit, Duffy noted that he believes there should be a national standard for autonomous vehicles in the United States.
Duffy’s Giga Texas Visit
As seen in videos of his Giga Texas visit, the Transportation Secretary seemed to appreciate the work Tesla has been doing to put the United States at the forefront of innovation. “Tesla is one of the many companies helping our country reach new heights. USDOT will be right there all the way to make sure Americans stay safe,” Duffy wrote in a post on X.
He also praised Tesla for its autonomous vehicle program, highlighting that “We need American companies to keep innovating so we can outcompete the rest of the world.”
National Standard
While speaking with Tesla CEO Elon Musk, the Transportation Secretary stated that other autonomous ride-hailing companies have been lobbying for a national standard for self-driving cars. Musk shared the sentiment, stating that “It’d be wonderful for the United States to have a national set of rules for autonomous driving as opposed to 50 independent sets of rules on a state-by-state rules basis.”
Duffy agreed with the CEO’s point, stating that, “You can’t have 50 different rules for 50 different states. You need one standard.” He also noted that the Transportation Department has asked autonomous vehicle companies to submit data. By doing so, the USDOT could develop a standard for the entire United States, allowing self-driving cars to operate in a manner that is natural and safe.
News
Tesla posts Optimus’ most impressive video demonstration yet
The humanoid robot was able to complete all the tasks through a single neural network.

When Elon Musk spoke with CNBC’s David Faber in an interview at Giga Texas, he reiterated the idea that Optimus will be one of Tesla’s biggest products. Seemingly to highlight the CEO’s point, the official Tesla Optimus account on social media platform X shared what could very well be the most impressive demonstration of the humanoid robot’s capabilities to date.
Optimus’ Newest Demonstration
In its recent video demonstration, the Tesla Optimus team featured the humanoid robot performing a variety of tasks. These include household chores such as taking out the trash, sweeping with a broom, vacuuming, tearing a paper towel, stirring a pot of food, opening a cabinet, and closing a curtain, among others. The video also featured Optimus picking up a Model X fore link and placing it on a dolly.
What was most notable in the Tesla Optimus team’s demonstration was that the humanoid robot completed all of the tasks with a single neural network. The robot’s behaviors were also learned directly from first-person videos of humans performing similar tasks. This system should pave the way for Optimus to learn and refine new skills quickly and reliably.
Tesla VP for Optimus Shares Insight
In a follow-up post on X, Tesla Vice President of Optimus (Tesla Bot) Milan Kovac stated that one of the team’s goals is to have Optimus learn straight from internet videos of humans performing tasks, including footage captured in third person or by random cameras.
“We recently had a significant breakthrough along that journey, and can now transfer a big chunk of the learning directly from human videos to the bots (1st person views for now). This allows us to bootstrap new tasks much faster compared to teleoperated bot data alone (heavier operationally).
“Many new skills are emerging through this process, are called for via natural language (voice/text), and are run by a single neural network on the bot (multi-tasking). Next: expand to 3rd person video transfer (aka random internet), and push reliability via self-play (RL) in the real-, and/or synthetic- (sim / world models) world,” Kovac wrote in his post on X.