Google surprises the world!

"This Week in AI" is now Superintelligence.

The TLDR
Google unveiled major AI breakthroughs at Cloud Next '25, headlined by the Ironwood TPU v7 (42.5 exaflops per pod), new Gemini 2.5 models, and enhanced generative tools like Imagen 3, Chirp 3, Lyria, and Veo 2. With Google Distributed Cloud, these powerful models can now run securely on-premise. Google’s bold moves suggest it's rapidly pulling ahead in the AI race.

Google Cloud Next '25 presented groundbreaking innovations that could fundamentally change the AI landscape! The seventh generation of TPUs, dubbed “Ironwood”, marks a huge leap forward: an impressive 42.5 exaflops per pod with more than 9,000 chips, roughly a tenfold performance increase over previous generations.

This computing power drives the latest Gemini family, including Gemini 2.5 Pro and the cost-effective Gemini 2.5 Flash. Also noteworthy is the expanded Generative Media portfolio: Imagen 3 with improved inpainting, Chirp 3 with custom voice generation from just 10 seconds of audio, the music-generating Lyria models, and Veo 2 with professional video editing capabilities.
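For readers who want to try the image side of this lineup right away, here is a minimal sketch of generating a picture with the google-genai Python SDK. The model identifier "imagen-3.0-generate-002" and the availability of Imagen 3 through a plain API key are assumptions on our part, so check the current documentation before relying on it.

```python
# Minimal sketch: generating an image with Imagen 3 via the google-genai SDK.
# Assumes `pip install google-genai` and a valid key in the GOOGLE_API_KEY env var;
# the model id below is an assumption and may differ for your project or region.
from google import genai
from google.genai import types

client = genai.Client()  # reads GOOGLE_API_KEY from the environment

response = client.models.generate_images(
    model="imagen-3.0-generate-002",  # assumed Imagen 3 model id
    prompt="A watercolor sketch of a data center at sunrise",
    config=types.GenerateImagesConfig(number_of_images=1),
)

# Save the first generated image to disk.
with open("imagen3_sample.png", "wb") as f:
    f.write(response.generated_images[0].image.image_bytes)
```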

Of particular interest to enterprises: with Google Distributed Cloud, Gemini models can now also run on-premises in highly secure environments, a solution for strict regulatory requirements. Will we soon live in a world where AI hypercomputers with unprecedented computing power solve complex problems that were previously out of reach?

Google surprised the world today and left the competition looking pale. Is Google now overtaking OpenAI and everyone else? It will be exciting to watch.

Learn AI in 5 minutes a day

This is the easiest way for a busy person to learn AI in as little time as possible:

  1. Sign up for The Rundown AI newsletter

  2. They send you 5-minute email updates on the latest AI news and how to use it

  3. You learn how to become 2x more productive by leveraging AI

Trump's punitive tariffs on Chinese exports

Trump's punitive tariffs on Chinese exports, now exceeding 100%, mark a new escalation in the trade war. The political subtext: the USA is deliberately embracing isolation to reduce its technological dependencies. For artificial intelligence, this means supply chains for chips and high-performance hardware are coming under pressure. It could slow the pace of AI development worldwide, but it could also foster new alliances, for example between China and the Global South. The long-term question is who controls the AI infrastructure: democracies or authoritarian regimes.

EU Launches €20 Billion AI Gigafactory Initiative to Compete Globally

The European Union has announced a €20 billion program to build large-scale AI gigafactories with advanced supercomputers. The project aims to strengthen Europe's technological sovereignty and its competitiveness against the USA and China. The planned gigafactories are to be equipped with over 100,000 AI processors and to drive innovation in areas such as healthcare, robotics, and biotechnology. However, there are concerns about environmental impact and a possible weakening of existing AI regulation.

Australia’s First Fully AI-Generated Election Ad Sparks Debate on Political Ethics and Regulation

A fully AI-generated election ad has been released in Australia for the first time. The 40-second Liberal Party ad uses AI tools such as Midjourney, Sora, and Runway to create scenes highlighting current fuel prices. This use of AI in political campaigning raises questions about authenticity and potential voter manipulation. Although Australia currently imposes few legal restrictions on AI in political communication, there is growing discussion about the need for clear rules to protect voters from misleading content.

Poll of the Day

Will Trump's tariffs have a fundamental impact on the speed of AI development?

In The News

Google’s Next '25: Ironwood TPUs and Gemini 2.5 Redefine the AI Landscape

Google is back with a vengeance!

At Next '25, Google unveiled the seventh-generation “Ironwood” TPUs, delivering 42.5 exaflops per pod across 9,000+ chips, roughly 10x more powerful than before. Google also introduced vLLM support, enabling cost-effective PyTorch workloads on TPUs. On the model front, Gemini 2.5 Flash is optimized for low-latency, cost-efficient applications, alongside updates to Imagen 3, Chirp 3, Lyria, and Veo 2. Google claims Gemini 2.0 Flash delivers 24x more intelligence per dollar than GPT-4o, cutting costs by 30% and latency by 60%. The competition should watch out!
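To give a feel for the developer-facing side, below is a minimal sketch of calling Gemini 2.5 Flash from Python with the google-genai SDK. The model string "gemini-2.5-flash" is assumed from the announcement and may vary by availability tier; treat it as illustrative rather than definitive.

```python
# Minimal sketch: text generation with Gemini 2.5 Flash via the google-genai SDK.
# Assumes `pip install google-genai` and GOOGLE_API_KEY set in the environment.
# The model id "gemini-2.5-flash" is an assumption; check the current model list.
from google import genai

client = genai.Client()  # picks up GOOGLE_API_KEY automatically

response = client.models.generate_content(
    model="gemini-2.5-flash",
    contents="Summarize the Ironwood TPU announcement in two sentences.",
)
print(response.text)
```

Swapping the model string would target other Gemini variants, assuming they are enabled for your account.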

Llama-3.1-Nemotron-Ultra-253B: Optimized Reasoning at Scale

Llama-3.1-Nemotron-Ultra-253B is a 253B-parameter model fine-tuned for reasoning, chat, and tool use, with a 128K token context length. Using Neural Architecture Search and vertical compression, it delivers high accuracy with improved efficiency and lower latency. Its multi-stage training and lightweight design make it ideal for commercial deployment on fewer GPUs.
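For teams curious to experiment, here is a minimal sketch of loading the model with Hugging Face transformers. The repository id "nvidia/Llama-3_1-Nemotron-Ultra-253B-v1" is an assumption, and a 253B-parameter model still needs a substantial multi-GPU setup even with the efficiency gains described above.

```python
# Minimal sketch: chatting with Llama-3.1-Nemotron-Ultra-253B via Hugging Face transformers.
# The repo id is an assumption; trust_remote_code may be required depending on the release.
# A 253B model needs several high-memory GPUs; device_map="auto" shards it across them.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nvidia/Llama-3_1-Nemotron-Ultra-253B-v1"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory footprint
    device_map="auto",           # shard layers across available GPUs (requires accelerate)
)

messages = [{"role": "user", "content": "Explain what a 128K token context window enables."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```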

ChatGPT Enhances Memory with Past Chat Referencing

ChatGPT now references your past chats, in addition to saved memories, to provide more relevant and smoother responses. You can manage or disable this feature anytime in settings. The update is rolling out to Plus and Pro users globally (excluding the EEA, UK, and select regions), with Team, Enterprise, and Edu users getting it soon.

Chart of the Day

After Trump said that the tariffs would be suspended for 90 days, the NASDAQ rose significantly again.

Hi All,

Thank you for reading. We would be delighted if you shared the newsletter with your friends! We look forward to expanding the newsletter in the future with even more specialized topics. Until then, follow us on social media to stay up to date.

Cheers,
Dan