
Shocking! Safety Tests Show Tesla Full Self-Driving Software Hitting A Dummy Baby & Stroller Repeatedly

No, we are not trying to fool you or scare you away from Tesla’s self-driving software, but this story has been making the rounds on social media, and we need to find out whether there is any truth to it.

Last month, Elon Musk announced that the beta version of Tesla’s “Full Self-Driving” (FSD) software is now available to everyone in North America who pays for the option.

According to the official statement, any owner can buy this extended FSD access for $15,000 directly from the car’s touchscreen.

Unfortunately, it seems not everyone is elated by the news. Not the people at The Dawn Project, at least.

Specifically, Dan O’Dowd seems very unhappy with what Tesla’s Autopilot is doing, and he is hell-bent on proving that the Full Self-Driving software is unsafe.

What Is The Dawn Project?

The Dawn Project is the latest in a line of groups bashing Tesla and Elon Musk-led innovation.

The Dawn Project, an advocacy group well known for its anti-Tesla sentiment and its campaign to ban FSD, recently ran a series of tests that produced some questionable results.

Our advice: take the findings with caution.

This smear campaign is led by Dan O’Dowd, a billionaire hell-bent on proving Tesla to be the worst software company. The Dawn Project began with a full-page advert in a leading New York newspaper early this year that read, “Making Computers Safe for Humanity”.

To get his new project, The Dawn Project, off the ground, he bought the whole page to launch an initiative he claims is devoted to making sure the software in essential systems and automobiles is both safe and free of glitches.

Following this bold move, his name quickly spread around the world after he harshly criticized Tesla for letting everyday customers test its “Full Self-Driving” software on public roads.

Soon after, a viral video showed various clips of a Tesla repeatedly hitting a baby stroller while on Autopilot. Much of the media took the adverts seriously, and rumours began circulating that the Full Self-Driving software is extremely unsafe.

These videos are still popular on social media, where critics and Tesla supporters continue to argue their respective sides.

Kudos to the team for installing a camera inside the car that shows the pedals clearly illuminated by a flashlight, meant to prove that no one is pressing them manually, although a strobing effect from the camera takes away from the clarity.

Not only this, but Autosteer can also be seen engaged on the infotainment screen; the only problem, however, is that the video never makes clear whether Full Self-Driving is actually active.

Did anybody notice the quality of the video? We’ll get there in a bit. First, let’s find out more about the man behind all the action.


The Man Behind The Project

Have you heard of Dan O’Dowd? He has invested millions in a rival autonomous driving feature that does not work. His “studies” are too far-fetched to be replicated by his peers, and even the videos he uploads, which never show both the interior and exterior views, fall short of credibility.

It’s practically as if someone took a Tesla for a spin while randomly hitting baby strollers – all under the pretence that this is what an advanced Autopilot or FSD looks like!

And that’s not all – with no one to challenge his claims, O’Dowd has been able to spread dangerous misinformation about Tesla and its vehicles for a while now.

Also, why does he need to use Tesla’s name to sell his expertise?

Perhaps that’s why Tesla doesn’t have a press department, instead relying on its users to tell the true story.

On what makes O’Dowd’s claims so far-fetched:

The first and most obvious way that O’Dowd’s claims fall short is in their sheer scope. While he may have convinced other companies to copy his idea of an autonomous car, it’s obviously not a reality yet – because it simply doesn’t work!

The technology isn’t there yet. It hasn’t been perfected, and if you try to recreate the conditions shown in his videos, you won’t see a similarly consistent result. Why? Because there’s no way an autonomous driving feature could behave so precisely, especially under such varying circumstances!

On why it’s dangerous to believe in O’Dowd’s claims:

Not only is it dangerous to trust someone who hasn’t been able to make good on their promises, but it’s also dangerous to blindly trust someone who doesn’t understand the reality of how autonomous driving features work.

Even if a car can take over certain aspects of your driving experience, you still need to be alert and ready to take control should anything go wrong. That means that O’Dowd’s idea of “lazy” driving – where you essentially put the car on autopilot and do whatever you want while it drives itself – is simply irresponsible.

And that’s not all – if O’Dowd’s plans become reality, they’ll change the face of driving forever. If technology like this becomes mainstream, we could see a sharp increase in car accidents, as drivers let their guard down and assume that they won’t need to be alert and responsible behind the wheel.

Is FSD Unsafe?

Of course, it is unsafe. Why? Because it is not yet perfect.

Tesla FSD is still a beta version that is constantly evolving and getting better by the day. There have already been cases where the software got it wrong, leading to crashes and fatalities.

And there could be more such incidents in the future. However, these mishaps do not make Tesla FSD unsafe as a whole or grossly ineffective.

It is like using Google Maps navigation – at first it was buggy, which led to various errors (wrong directions, etc.). But over time, it has become more reliable and accurate.

The same goes for the Tesla FSD. It works fine most of the time, but not all of the time. However, this is true for any software under heavy development.

At present, Tesla FSD is safe enough to be adopted by a majority of the population, provided some common sense, awareness, and vigilance are applied.

Remember: hands on the wheel at all times!

As far as FSD being unsafe for a particular person is concerned – this depends on the person’s driving skills, reaction time, and willingness to learn from mistakes.

You see, every new technology takes time to get used to and becomes comfortable with practice. Unfortunately, some people are too impatient to wait, so they resort to risky driving behaviours.

The stroller crash video, for instance, is strangely misleading. The test was run in a parking lot with a 30 mph speed limit and various additional obstacles, such as a speed bump.

Additionally, certain error messages that those versions of FSD would normally display never appear on screen in the video, even though the group had previously been caught for not showing them. The shape of the on-screen text also didn’t match the version they claimed to be running, which caused further confusion among viewers.

Why Is Dan O’Dowd After Tesla?

This is the ultimate viral human hack, one that has recently gained popularity, likely thanks to social media.

To make it work, one must boldly state something objectively false and then repeat that false claim until enough traction builds for it to become more convincing and accepted as fact.

Even if it is bogus or reality proves otherwise!

Folks familiar with FSD are aware of its efficacy, but the rest of the world is yet to learn about it.

Unfortunately, this method works; across the web, there seems to be a deep-seated animosity toward Elon Musk that leads people to subscribe only to what already fits their preconceived notions.

Be prepared to be attacked over the car you drive.

What To Expect In The Future?

Avoiding a fuss is occasionally the best way to go: Tesla’s decision to file a lawsuit will result in massive media coverage and undeserved publicity for an individual who deserves no such attention.

Tesla has been the target of these attacks for months and is simply defending itself from a person who deliberately engaged in a campaign to smear Tesla’s reputation, mislead the public by claiming that he worked at Tesla, and maliciously destroy the company’s exemplary professional track record.

If a company as prominent as Tesla is forced to bring legal proceedings to stop deliberate lies by a malicious individual, imagine the plight of ordinary citizens who cannot afford to take such dramatic defensive measures.

Tesla does not have a press department because the EV leader’s culture is based on innovation, and the company believes that all accurate information about its cars should come from only one source: Tesla.

Bottom Line

There is no software on the planet that can make a slow, reckless, and distracted driver safer than he or she would have been without any automation. But there are smart drivers who know how to use FSD to their advantage by being attentive, aware, and disciplined about the way they drive.

You see, in this fast-changing world, people need to learn how to adopt and adapt to new technologies without compromising personal and public safety. If a person is not willing to keep learning, they should refrain from using FSD at all, because misuse can lead to fatal consequences.

The takeaway: Tesla’s autonomous driving features are pioneering the future of transportation – but O’Dowd simply can’t keep up with them!

For reliable, safe autonomy, trust Tesla for now.

Purnima Rathi
Purnima has a strong love for EVs. Whether it's classic cars or modern performance vehicles, she likes to write about anything with four wheels, especially if there's a cool story behind it.
