
Tesla cleared of some claims in Blade Runner lawsuit

A judge has ruled that the lawsuit can continue, despite dismissing certain parts of the allegations.


A judge has issued a tentative ruling dismissing some claims that Tesla and Warner Bros. violated trademark laws when using imagery from the film Blade Runner 2049 during an event in October.

Following the automaker’s “We, Robot” Cybercab launch event in October, movie studio Alcon Entertainment filed a lawsuit against Tesla and Warner Bros., claiming that the companies violated trademark and copyright laws with the use of Blade Runner 2049 imagery. On Monday, however, Los Angeles-based U.S. District Judge George Wu ruled to dismiss claims related to trademark infringement, while letting Alcon continue pursuing the copyright case (via Reuters).

The image in question was an AI-generated rendering of the futuristic world depicted in Blade Runner, which you can see below. Wu also said that the event only referenced the original Blade Runner film, and noted that Tesla and Alcon are not competing companies.

“Tesla and Musk are looking to sell cars,” Wu said. “Plaintiff is plainly not in that line of business.”

Alcon also claimed in the suit that Tesla and Warner Bros. had requested rights to use images from the film last year, but that Alcon denied those requests.


At the time of writing, legal representatives for Tesla, Warner Bros., and Alcon have not yet responded to requests for comment. You can also see the full lawsuit below.

READ MORE ON TESLA’S WE, ROBOT EVENT: Elon Musk explains Tesla Robovan suspension system

Tesla unveiled its autonomous, two-seater Cybercab at the We, Robot event, along with a surprise reveal of its larger Robovan. Teslarati was among the first to ride in the Cybercab at the event, and you can see our coverage of the ride below.

During the event, Musk said that he was a fan of Blade Runner, though he noted that he would rather live in a world with an exciting future than in a dystopia like the one the film depicts.

“So you see a lot of sci-fi movies where the future is dark and dismal, where it’s not a future you want to be in,” Musk said. “I love Blade Runner, but I don’t know if we want that future—I think we want that duster he’s wearing—but not the bleak apocalypse. We want to have a fun, exciting future.”



Zach is a renewable energy reporter who has been covering electric vehicles since 2020. He grew up in Fremont, California, and he currently lives in Colorado. His work has appeared in the Chicago Tribune, KRON4 San Francisco, FOX31 Denver, InsideEVs, CleanTechnica, and many other publications. When he isn't covering Tesla or other EV companies, you can find him writing and performing music, drinking a good cup of coffee, or hanging out with his cats, Banks and Freddie. Reach out at [email protected], find him on X at @zacharyvisconti, or send us tips at [email protected].


xAI receives more Tesla Megapacks for Colossus 2


xAI is bolstering its Colossus 2 data center in Memphis with 168 Tesla Megapacks, enhancing the energy infrastructure for its ambitious AI supercomputer expansion. The deployment underscores xAI’s push to lead AI innovation while addressing environmental concerns.

The first Colossus site is connected to a 150-megawatt (MW) substation powered by MLGW and TVA. It is supported by approximately 156 Megapacks, which provide 150 MW of battery backup to xAI’s supercomputer. The 168 Tesla Megapacks recently delivered to xAI’s Memphis site will provide battery storage backup for Colossus 2.

In December 2024, xAI doubled the capacity of Colossus to 200,000 Nvidia H100 GPUs, which consume 250 MW of power, enough to power roughly 250,000 homes. In March 2025, the AI company bought a 1-million-square-foot site in Whitehaven, Memphis, for $80 million. xAI’s Whitehaven site could host up to 350,000 GPUs and could eventually deploy the largest number of Tesla Megapacks for backup power.

xAI plans to scale Colossus up to 1 million GPUs to create the world’s largest AI supercomputer. A 1-million-GPU setup would require over 1 gigawatt, about one-third of Memphis’s peak summer demand.
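For a rough sense of that scaling, here is a back-of-the-envelope sketch based on the figures above. It assumes power draw scales roughly linearly with GPU count and that an average home draws about 1 kW; these are simplifications, not official xAI or utility numbers.

```python
# Rough, back-of-the-envelope scaling using the figures cited in this article.
# Assumes power draw scales linearly with GPU count, ignoring cooling,
# networking, and utilization overheads.

gpus_current = 200_000        # H100 GPUs after the December 2024 expansion
power_current_mw = 250        # reported draw for those GPUs, in megawatts

mw_per_gpu = power_current_mw / gpus_current   # ~0.00125 MW (1.25 kW) per GPU

gpus_target = 1_000_000       # xAI's stated goal for Colossus
power_target_mw = gpus_target * mw_per_gpu     # ~1,250 MW, i.e. over 1 GW

homes_equivalent = power_target_mw * 1_000     # assuming ~1 kW per average home

print(f"Estimated draw at {gpus_target:,} GPUs: {power_target_mw:.0f} MW "
      f"(~{homes_equivalent:,.0f} homes)")
```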


Initially reliant on natural gas turbines, Colossus faced criticism for nitrogen oxide emissions. The 150 MW substation, completed in early 2025, cut turbine use in half, with Megapacks providing cleaner backup power. xAI expects a second substation to come online by fall 2025, after which the remaining turbines will be used only for backup, reducing the project’s carbon footprint.

Tesla Energy’s Q1 2025 performance, with 10.4 GWh of storage deployed, a 156% increase year over year, supports xAI’s needs. Tesla’s Megapack factory in Waller County, Texas, set to create 1,500 jobs, signals further commitment to scaling energy solutions for projects like Colossus.

xAI’s rapid expansion, backed by Tesla Megapacks, positions it to rival AI leaders like OpenAI and Google. The Colossus 2 deployment reflects a strategic blend of cutting-edge AI and sustainable energy. As Memphis’ infrastructure adapts to unprecedented power demands, xAI and Tesla are reshaping the AI landscape with a focus on efficiency and environmental responsibility.



Grok 3 by xAI Rolls Out on Azure AI Foundry with Free Trial

Grok 3 is now on Azure AI Foundry with a free preview until early June. From code to vision, Grok joins a growing roster of powerhouse models.


(Credit: xAI)

xAI’s Grok 3 model is now available on Microsoft’s Azure AI Foundry Models, launching with a free preview to drive AI innovation. The collaboration marks a significant step in making advanced AI accessible to developers worldwide.

Grok 3 became available on Microsoft’s Azure AI Foundry Models on May 19, 2025. Developers can explore xAI’s Grok 3 at no cost through early June, after which standard paid pricing will apply.

“Microsoft and xAI are thrilled to unveil the availability of Grok 3 into the Azure AI Foundry Models, marking a significant milestone in AI accessibility and innovation,” Microsoft stated in its announcement.

The partnership integrates xAI’s cutting-edge model with Azure’s secure, scalable infrastructure, enabling enterprise scenarios in reasoning, coding, and visual processing. Grok 3 is accessible via Azure AI Foundry’s catalog, alongside models from OpenAI, Meta, Cohere, NVIDIA, and Hugging Face, reflecting Microsoft’s commitment to a diverse AI ecosystem.

“The addition of xAI’s Grok 3 underscores Microsoft’s commitment to support an open, diverse AI ecosystem, rather than relying on a single model provider,” the company noted.


As with other AI models in Azure, developers can easily discover Grok 3’s model card and deploy the model from it. Grok 3 is also available for testing on GitHub Models.

Microsoft provides two flexible deployment options for integrating xAI’s Grok 3 into applications: Standard Pay-Go or Provisioned Throughput Units (PTUs). The Standard Pay-Go option allows pay-per-token API calls for quick scaling, while PTUs suit workloads that need reserved capacity and predictable latency.

“For production scenarios where you expect steady high volume or need strict latency, provisioning Grok 3 with PTUs can be cost-effective and reliable,” Microsoft advised.
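To make the pay-per-token path concrete, here is a minimal sketch of calling a deployed Grok 3 endpoint with the azure-ai-inference Python SDK. The endpoint URL, API key, and the “grok-3” model name are placeholders; the actual endpoint and deployment name depend on how the model is provisioned in your Azure AI Foundry project.

```python
# Minimal sketch: chat completion against a Grok 3 deployment on Azure AI Foundry.
# Endpoint, key, and model name below are placeholders, not real values.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-resource>.services.ai.azure.com/models",  # placeholder
    credential=AzureKeyCredential("<your-api-key>"),                  # placeholder
)

response = client.complete(
    model="grok-3",  # assumed deployment name; use the name from your project
    messages=[
        SystemMessage(content="You are a concise assistant."),
        UserMessage(content="In one sentence, what are Provisioned Throughput Units?"),
    ],
    max_tokens=200,
)

print(response.choices[0].message.content)
```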

The launch of Grok 3 on Azure AI Foundry empowers developers to build intelligent assistants, process large documents, or explore new AI applications. As xAI and Microsoft combine innovation with robust tools, Grok 3’s arrival signals a new era of AI development, inviting creators to leverage its capabilities and shape the future of technology.



Tesla Robotaxi deemed a total failure by media — even though it hasn’t been released

Nearly two weeks before its planned rollout, Tesla Robotaxi has already been deemed a failure, even though it has not been publicly released.


Credit: Tesla

Tesla Robotaxi is among the biggest tech developments of the year, and its June launch date has not yet arrived.

This does not matter to skeptics of the company, who have already deemed the rollout a “failure,” “an enormous mess,” and plenty of other things. Several outlets are leaning on biased opinions rather than any real evidence pointing in either direction.

Futurism posted an article this morning claiming that Robotaxi is “already an enormous mess,” citing the opinions of Dan O’Dowd, perhaps Full Self-Driving’s biggest critic. There is no mention of the enthusiasm or optimism found on the other side of the argument.

Instead, it noted that O’Dowd judged the system a failure after an 80-minute drive around Santa Barbara.

This is fair to include: Full Self-Driving is not perfect, which is why Tesla will implement safeguards like teleoperation at first. But it is not so flawed that it is nowhere near ready. Personally, my experience with FSD was overwhelmingly positive, and it is something I still wish I had on my car today. I wish the article had also included a quote from someone equally passionate about FSD, just from the other side of the argument.

Credit: Tesla

There is no mention of Tesla’s most recent Vehicle Safety Report, which showed that Autopilot-enabled cars are involved in accidents at roughly one-tenth the rate of the national average. This might not be the same as Full Self-Driving, but it is still a testament to what Tesla has achieved with its driver assistance systems.

To be fair, Tesla has a history of missing timelines, especially when it comes to FSD. I used to roll my eyes a bit when CEO Elon Musk would say, “We’ll have Full Self-Driving finished by the end of the year,” or “We’ll have a million robotaxis on the road next year.” I was always skeptical.

However, Tesla has handled things differently this year. The company has admitted the Robotaxi rollout will be controlled at first, with a fleet of only 10 to 20 Model Y vehicles. It will be private at launch, and only those lucky enough to be invited will get to experience it in Austin in June.

It is less than a full public rollout, which is of course disappointing for people like you and me. But let’s be real: if Tesla launched a full-blown Robotaxi platform with no regulatory oversight or small-batch testing, there would be criticism of that, too.

Some media outlets are pointing to NHTSA’s recent request for more information, intended to “assess the ability of Tesla’s system to react appropriately to reduced roadway visibility conditions.” This seems more than reasonable, as Robotaxi will be among the first driverless ridesharing programs in the United States.


It’s no more than a request for information on how things will be handled and how the tech works.

It is sad to see so many outlets deem something that could be the next big thing a failure when there is no real indication of either failure or success. Let’s be fair and give Tesla an opportunity to meet its June target, and give Robotaxi some time to operate and prove itself as a reliable ride-share option.

