Tesla Cybertruck Crash: Brakes May Not Work on Some Terrain

Remember the sales pitch?


And remember how some people talked about the Tesla CEO while they were waiting for their Cybertruck?

In reality, Tesla's software engineering seems to be even worse than its hardware, prone to sudden catastrophic failures.

Here’s a case where Tesla says the brakes may not work in some (unspecified) terrain, by design, so insurance should pay for a Cybertruck crash.

In related news, Cybertruck owners still say a simple wash with water completely kills it… dumb and dumber.

Discovery of 14th Century BCE Shipwreck Proves Ancient Navigation of Open Seas

The narrative has been that ancient ships hugged the shoreline to avoid open seas because they couldn't navigate. This new shipwreck discovery near Israel helps prove that open-sea navigation goes back thousands of years.

A 3,300-year-old ship has been discovered at the bottom of the Mediterranean Sea, making it one of the oldest shipwrecks ever discovered and rewriting our understanding of sailing in the ancient world, according to the Israel Antiquities Authority.

The vessel is estimated to be from the 13th or 14th century BCE, the authority said in a statement. It was discovered 90 kilometers (around 56 miles) from the shore, in waters 1.8 kilometers (1.1 miles) deep, with hundreds of intact jars still on board, the statement added.

Tesla Investor Spreads Dangerous “Head-On Collision Avoidance” Disinformation About FSD

As someone who records nearly weekly head-on collisions by Teslas into other cars, I was of course asked to evaluate claims made on the Nazi-addled Swastika platform that Tesla FSD can prevent the very thing it repeatedly fails to prevent.

Let me start by saying that my first review of this new video evidence, posted by “CyberMikeOG”, showed a car NOT approaching the Tesla head-on at full speed, but rather a confused car slowly creeping along a lane to the side.

FSD12.3.6 (Supervised) saved me this morning at 5:40am from a potential head on collision. A driver pulled out onto a west bound roadway and turned right into my lane. My car quickly moved over as if nothing was wrong and continued in my lane. I believe they were intoxicated. Could not get license but I’m grateful for @Tesla and it’s team of engineers for making a product that will save millions of lives. @tesla_na @Tesla_AI @elonmusk @WholeMarsBlog @SawyerMerritt

These claims sound like total bullshit to me, after watching the video. My first thought was that this text must be generated by a Tesla investor desperate to pump and dump stock.

“Saved me”?

“My car quickly moved over”?

Both are factually false statements.

I watched many times to carefully time the Tesla as it continued driving directly at a growing hazard in front of it, until it almost crashed and then panicked. It appeared to me that the person in the Tesla driver’s seat must have been asleep or distracted, because there’s no way a human would have maintained full speed in the same lane the way FSD does in the video.

Here is the final frame, several crucial seconds after the car was first seen.

This raises the question of why the Tesla kept driving at full speed for several seconds after it first observed a car entering in the wrong direction and posing a danger, instead of slowing down, stopping, or moving away entirely by changing lanes.

At 0:16 the car is blocking the lane, and at 0:18 you can see a scene that should have immediately initiated evasive action. My reaction right then would be to slow down and steer toward the right shoulder, but FSD waits, and waits, and waits.

FSD thus seems blind: it reacts far too late, at the very last second, due to an obvious failure to perceive risk.

The Tesla swerves harshly to the right as if surprised, when objectively a safer, more controlled maneuver was long overdue. Because it didn’t react sooner, it was about to crash into a car so close that it had to swerve within an inch of it.

Note that swerving a little bit at the last second looks less like a head-on collision risk than a side-swipe. Nevertheless, the Tesla should never have continued driving at high speed without an immediate reaction.

In other words, this is evidence of failure, not success. FSD puts people FAR MORE AT RISK than a human alone at the wheel because of overconfidence, such as continuing to drive straight at a car on the wrong side of the road.

17 June 2024 actual test results:

Tesla Full Self-Driving is far less safe than an average human driver.

Tesla owners (who clearly aren’t paying attention) will attempt to frame these failures as successes, which is the same problem I’ve seen since April 2016. Joshua Brown made this exact analytic mistake, and as a result he was killed by Tesla a couple of weeks later.

Has Tesla improved “self-driving” since 2016? This video proves to me that it has NOT. And it definitely does NOT prevent head-on collisions. My research indicates that FSD is to blame for head-on collisions.

Tesla FSD nearly crashing head-on into a police car.

Now for the obvious question: why was this Swastika profile compelled to drive dangerously distracted by FSD, and then to post very dangerous disinformation about FSD capabilities? Take a look at the CyberMikeOG profile description on Twitter:

Elon Musk enthusiast and $TSLA long term investor since 2012.

He’s a Tesla stock investor, who will profit from disinformation about FSD.

Or, perhaps more to the point, this profile might be part of a coordinated stock pump and dump effort.

Lying about Tesla should not go unnoticed? Ok then. Take a look in the mirror? It’s almost as if the tweets are saying stock prices are just a reflection of whatever the CEO says, not based in reality.

Where does that supposed April 5th, 2024 “stock crash” figure into the price? Here’s the six month view:

And here’s yet more evidence of stock manipulation going on from the top, based on dangerous disinformation about car safety.

Teslas regularly run red lights and blow through stop signs, so such PR work is terribly dangerous fraud. Even worse, Elon Musk fraudulently promotes drunk driving with FSD as safe.

Related: Tesla blows stop signs and red lights.

“Safety Comes Last”: Experts Warn Against Getting in a Tesla

Experts all over continue sounding an alarm, telling people not to get in a Tesla, and not to let friends or family get in one either.

Brooks, the executive director of the Center for Auto Safety, thought it was “absurd” to blame the firefighters for not knowing how to open the car.

“It’s not the firefighters’ fault that Tesla chose electronic door latches that don’t have proper emergency safeguards,” he said.

Teslas have manual door releases for when you are inside the car and unable to get out, but they are unmarked, unlike seatbelts and airbags. Brooks said this was Tesla’s choice to put “form over function,” which was ultimately “unsafe.”

“When there’s not a federal standard that specifies how these vehicles are to be made, Tesla very rarely chooses routes that are safe,” Brooks noted. “They’re usually choosing something glitzy: safety comes last.”

Brooks added that this incident contributed to an overall “failure in Tesla’s safety culture.”

Firefighters smashed a Tesla window to rescue a toddler from death after the car locked all the doors and wouldn’t open without following a complicated, error-prone, and mostly unknown procedure.

Monkeys banging on a keyboard couldn’t design a worse car.

[Opening the car doors from outside] involves opening a three-inch circle near the front of the car called a toe cover, pulling out the cables within it, and connecting those cables to an external power supply (like a portable jump starter). That would allow the hood of the trunk to open, giving drivers access to the 12-volt battery, which they could then jumpstart.


Related: Tesla had no worries remotely popping a door open when the police asked for help arresting someone inside.

So you’re not safe in a Tesla because you can’t get in… and you’re not safe in a Tesla because you can get in.

Same story over and over again.