In partnership with:

Dear Readers,
Some weeks feel like AI’s pace has shifted from acceleration to free fall — everything is happening now. A million tokens per second on Azure’s new Blackwell machines, an AI-education revolution taking shape in Icelandic classrooms, and OpenAI teaming up with AWS to rewrite the physics of scale. What ties them together? Infrastructure. The invisible backbone that decides who builds, who lags, and who gets to dream bigger.
Today’s issue dives right into that frontier. You’ll see how entire economies are betting their futures on AI, how robotics is crossing the threshold from scripted motion to learning in real time, and how accessible hardware is shrinking the distance between labs and living rooms. By the end, you might find yourself asking: if machines can now keep up with thought — how long until they help us think beyond it?
In Today’s Issue:
⚡ Azure's ND GB300 v6 virtual machines shatter AI inference records
🐕 DOFBOT launches a fully functional robot dog for just $999
🏭 AgiBot deploys the first robot to learn on the factory floor via reinforcement learning
🎭 AI researchers "embodied" an LLM into a robot
✨ And more AI goodness…
All the best,




US Economy Bets Big On AI
The US economy is increasingly riding on artificial intelligence, with the biggest tech firms now accounting for about one-third of the entire stock market’s value! This shift underscores a major structural tilt from broad-based growth to concentrated bets on AI-powered companies — meaning the economy and investors alike may become more exposed to how well AI delivers.

AWS × OpenAI: Cloud Partnership Unleashed
OpenAI and Amazon Web Services (AWS) have entered a multi-year strategic partnership that gives OpenAI immediate and large-scale access to AWS’s infrastructure—hundreds of thousands of GPUs plus tens of millions of CPUs—to power its advanced AI workloads. The race for compute continues!

Iceland Teams Up for AI-Education
This is how the transformation of education begins: in a bold move, Anthropic and the Ministry of Education and Children of Iceland are launching one of the world’s first nationwide AI-education pilots. Hundreds of teachers across Iceland — from Reykjavík to remote villages — will gain access to the AI tool Claude, along with educational resources and training, to explore how AI can transform lesson preparation and student learning.


AI strategy coach
“Analyze my daily workflow and identify 3 repetitive cognitive tasks I could automate with GPT-5 or other LLMs. Suggest practical automation ideas, including prompts or API workflows.”



Supercharged AI Inference Unleashed
The Takeaway
👉 Infrastructure records aren’t just bragging rights—1.1M tokens/s means real-world latency drops and bigger context models become viable.
👉 For AI builders: this unlocks new design space—multi-agent systems, longer context, on-the-fly reasoning become practical.
👉 For researchers: benchmark ceilings keep shifting—make sure your assumptions about throughput and cost are updated.
👉 Watch costs and access: as infrastructure advances, access may still lag—choose your deployment strategy accordingly.
A seismic leap in AI infrastructure just dropped: Azure’s new ND GB300 v6 virtual machines, built on NVIDIA’s Blackwell architecture, have shattered records by hitting ~1.1 million tokens per second on the Llama 2 70B model. For readers immersed in AI, that’s like comparing a sprint car to a horse-drawn wagon: this infrastructure launches inference workloads into warp speed.

What exactly does that mean? In simple terms, tokens are the “units” of text that large language models process. Hitting over a million tokens/second means you can feed and get results from huge models almost instantly. Azure’s system uses racks of 72 Blackwell Ultra GPUs, with custom networking and memory architecture designed to treat the whole rack as a single powerful unit.
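To put that throughput in perspective, here is a back-of-the-envelope sketch. The rack throughput is the reported aggregate figure; the novel-length token count is an illustrative assumption, not from the benchmark:

```python
# Rough latency math for aggregate inference throughput.
# Numbers are illustrative assumptions, not Azure benchmark details.

def generation_time(num_tokens: int, tokens_per_second: float) -> float:
    """Seconds needed to produce num_tokens at a given throughput."""
    return num_tokens / tokens_per_second

RACK_THROUGHPUT = 1_100_000  # ~reported aggregate tokens/s for the rack
NOVEL_TOKENS = 120_000       # assumed token count of an average novel

print(f"{generation_time(NOVEL_TOKENS, RACK_THROUGHPUT):.2f} s")  # ≈ 0.11 s
```

In other words, at that aggregate rate the system could chew through a novel’s worth of text in roughly a tenth of a second — the gap between “batch job” and “interactive.”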

Why does this matter? Because it signals that large-scale, real-time AI (think multi-agent reasoning, huge context windows, multimodal input) is increasingly feasible outside of secretive labs. It lowers latency, enables bigger models, and shifts the conversation from “can we” to “how do we use it.”
The bottom line: faster inference removes one of the last real bottlenecks in scaling AI systems — and it signals a turning point in what we can expect from the technology itself. Real-time, large-context, enterprise-ready models are no longer a distant promise but an emerging reality.


2025 was the toughest year for the job market.
2025 was the year AI evolved most rapidly.
2025 has fewer than 60 days left…
If you’re reading this, you’re not too late. Forget 60 days: Outskill can help you learn AI, the most in-demand skill of the decade, in just 2 days.
Join Outskill's LIVE 2-day AI Mastermind - 16 hours of intense training on AI tools, automations & building agents to help you work smarter, earn more, and reclaim your time.
It’s happening this Saturday & Sunday; it’s usually $395, but as part of their BLACK FRIDAY SALE 🔮, you can get in completely FREE!
Rated 9.8/10 on Trustpilot – an opportunity that makes you an AI Generalist who can build, solve & work on anything with AI, instead of being replaced.
In the Mastermind, you will learn how to:
✅ Build AI agents that save up to 20+ hours weekly and turn time into money
✅ Master 10+ AI tools that professionals charge $150/hour to implement
✅ Automate 80% of your workload and scale your income without working more hours
✅ Learn the basics of LLMs and master prompt engineering in 16 hours
✅ Create high-quality images and videos for content, marketing, and branding
Learn the exact AI playbook Fortune 500 companies use to automate workflows and 10x revenue 💰
🧠Live sessions- Saturday and Sunday
🕜10 AM EST to 7PM EST
🎁 You will also unlock $5000+ in AI bonuses: prompt bibles 📚, roadmap to monetize AI 💰 and your personalised AI toolkit builder ⚙️ — all free when you attend!



AgiBot Makes History: First Robot to Learn Directly on the Factory Floor
The robotics firm AgiBot has successfully deployed a real-world reinforcement-learning (RL) system on a production line, enabling robots to learn tasks in minutes instead of weeks and adapt to part variation and new product types. This is significant because it moves beyond scripted automation toward self-learning physical systems.

AI researchers "embodied" an LLM into a robot – and it started channeling Robin Williams
Researchers embedded a large language model into a physical robot that, surprisingly, adopted a humorous and human-like style of interaction — akin to a “Robin Williams” presence. This marks a jump from pure language models toward embodied AI that doesn’t just “think” but acts in the world.








