
Dear Readers,

What if AI became not just the next industrial revolution—but its financial system too? This week, the tectonic plates under both Silicon Valley and Wall Street are shifting at once. OpenAI has outgrown Microsoft’s cloud, Apple is turning laptops into mini data centers with the M5 chip, and capital markets are beginning to price AI infrastructure like oil pipelines. We’re watching the dawn of an era where compute, power, and liquidity fuse into a single feedback loop.

In today’s issue, we unpack how BlackRock’s $40 billion data-center bet reframes AI as a yield-bearing asset class, why Lila Sciences’ “Science Factories” might redefine scientific R&D, and how central banks quietly operationalize machine learning in monetary policy. Plus: DeepSeek’s open-source OCR breakthrough, new OpenAI chip risks, and a conversation between Dario Amodei and Marc Benioff that says more about the next decade of AI leadership than any earnings call could. Let’s dive in.

In Today’s Issue:

🍎 Apple's new M5 chip is set to revolutionize on-device AI

💰 Wall Street is making a $40B bet on AI infrastructure

🔬 Lila Sciences raised $115M with Nvidia backing for "Scientific Superintelligence"

🏦 Central banks are quietly operationalizing AI

And more AI goodness…

All the best,

OpenAI Outgrows Microsoft’s Cloud Limits

OpenAI’s explosive compute demands have pushed it to end its exclusive cloud partnership with Microsoft, striking massive new deals with Oracle, Google, and CoreWeave worth hundreds of billions through 2030. Microsoft, once OpenAI’s sole cloud provider, now shares the load while retaining 20% of OpenAI’s revenue and key model rights. The shift marks a new AI power dynamic, one where infrastructure scale, not alliances, decides who leads the race.

DeepSeek launches fast OCR model

DeepSeek released DeepSeek-OCR, a 3B-parameter, MIT-licensed vision-language model for document OCR and “context optical compression.” The release ships Hugging Face Transformers examples (FlashAttention 2, BF16 GPU inference), supports prompts like “convert document to Markdown” along with multiple base/image sizes and optional cropping/compression, and points to PDF workflows in the GitHub repo.
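If you want to kick the tires, here is a minimal sketch of GPU inference with Hugging Face Transformers. It follows the pattern in DeepSeek’s published examples; the model id, prompt format, and the infer() helper exposed via trust_remote_code are taken from those snippets, so treat the exact arguments as assumptions and check the model card before building on them.

```python
# Minimal sketch: DeepSeek-OCR inference via Hugging Face Transformers on a GPU.
# Prompt format and infer() arguments follow the published examples; verify
# against the model card, as details may differ by release.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-OCR"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(
    model_id,
    trust_remote_code=True,
    _attn_implementation="flash_attention_2",  # optional; requires flash-attn
    use_safetensors=True,
)
model = model.eval().cuda().to(torch.bfloat16)  # BF16 inference, as in the examples

# "Convert document to Markdown" style prompt from the release notes.
prompt = "<image>\nConvert the document to markdown."

result = model.infer(
    tokenizer,
    prompt=prompt,
    image_file="invoice_page.png",  # hypothetical input image
    output_path="./ocr_output",     # where rendered results are written
    base_size=1024,                 # base/image sizes and crop_mode control
    image_size=640,                 # the optical-compression trade-off
    crop_mode=True,
)
print(result)
```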

Nvidia, AMD, and Broadcom place very different bets on OpenAI

Nvidia, AMD, and Broadcom struck very different OpenAI agreements—custom silicon (Broadcom), up to $100B investment exposure (Nvidia), and stock warrants tied to chip purchases (AMD)—that shift risk profiles as OpenAI plans to burn ~$115B through 2029. Nvidia still holds the tech+software moat but carries concentration risk (AI data centers ~90% of revenue), Broadcom looks most insulated via custom contracts and software diversification, and AMD—now trading at ~43x next year’s EBITDA—faces the biggest downside if OpenAI demand underdelivers.

Elon Musk just gave his definition of AGI, and he is very bullish on Grok 5.

A Conversation with Dario Amodei (CEO of Anthropic) and Marc Benioff (CEO of Salesforce) at Dreamforce 2025

The Takeaway

👉 M5 delivers ~4× GPU AI compute over M4 and up to ~6× over M1, thanks to Neural Accelerators embedded in each GPU core.

👉 With unified memory bandwidth at ~153 GB/s and a 16-core Neural Engine, it opens realistic workflows for local LLMs and generative models.

👉 On-device AI becomes a differentiator: lower latency, more privacy, and flexibility for experimentation outside the cloud.

👉 Developers and AI practitioners should explore optimizing models for Apple silicon and consider new UX paradigms rooted in local intelligence.

From the jump, the new Apple M5 chip shatters expectations: imagine a laptop chip that treats your local AI models like light errands. Announced on 15 October 2025, M5 is built on a third-generation 3nm process and features a 10-core GPU with a Neural Accelerator embedded in each core, which is how Apple claims over four times the peak GPU compute performance for AI compared with the previous M4.

What this means in plain terms: you could run more advanced generative models, do heavier on-device inference, or explore AI workflows without always relying on the cloud.

For the AI community, this is especially relevant: it signals a shift from cloud-only to serious on-device AI. Lower latency, better privacy (since data stays local), and a growing stack of tools make it possible to experiment with creative AI use cases right on a MacBook Pro, iPad Pro, or even the Apple Vision Pro headset.
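If you want a feel for what that experimentation looks like today, here is a minimal sketch using the open-source mlx-lm package from Apple’s MLX ecosystem. The specific quantized checkpoint is just an example from the mlx-community hub, and argument names can shift between mlx-lm versions, so treat the details as assumptions rather than a definitive recipe.

```python
# Minimal sketch: running a small LLM locally on Apple silicon with mlx-lm
# (pip install mlx-lm). The checkpoint below is an example from the
# mlx-community hub; swap in any MLX-converted model you prefer.
from mlx_lm import load, generate

# Downloads a quantized checkpoint and maps it into unified memory.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

prompt = "Summarize why on-device inference helps with privacy, in two sentences."
response = generate(
    model,
    tokenizer,
    prompt=prompt,
    max_tokens=128,
)
print(response)
```

Because MLX works against the Mac’s unified memory, the same script scales from a 4-bit 7B model on a base machine to larger checkpoints on higher-memory configurations.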

Why it matters: The M5 era shifts real AI power from data centers to your own desk, giving the community new freedom to build, test, and create without waiting on the cloud.


Attention spans are shrinking. Get proven tips on how to adapt:

Mobile attention is collapsing.

In 2018, mobile ads held attention for 3.4 seconds on average.
Today, it’s just 2.2 seconds.

That’s a 35% drop in only 7 years. And a massive challenge for marketers.

The State of Advertising 2025 shows what’s happening and how to adapt.

Get science-backed insights from a year of neuroscience research and top industry trends from 300+ marketing leaders. For free.

Wall Street’s New Power Play: $40B Bet on AI Infrastructure

BlackRock’s GIP-led consortium is buying Aligned Data Centers for ~$40B—one of the largest data-center deals ever. Translation for markets: institutional capital is now treating AI compute as a core, yield-bearing infrastructure asset class. Expect M&A roll-ups, longer-dated financing (project finance/infra debt), and utility-style PPAs to scale. This tilts returns toward owners of power, land, and chips—while nudging monetary policy to watch AI-driven capex spillovers into credit growth and electricity prices.

“Scientific Superintelligence” Gets Real Money

Lila Sciences raised $115M (Series A extension), lifting valuation >$1.3B, with Nvidia participating. Lila’s pitch: AI-guided automated labs (“Science Factories”) that continuously generate proprietary data for energy, semis, and drug discovery. If it works, the moat shifts from model size to closed-loop data and wet-lab throughput—favorable unit economics for B2B SaaS + services. Investors should watch: capex intensity, data rights, and time-to-discovery versus Big Pharma/industrial R&D cycles.

Central Banks Are Quietly Operationalizing AI

A fresh BIS report outlines how central banks/supervisors are adopting ML across forecasting, surveillance, and policy analysis. The pivot is a regime change: better nowcasting of inflation/output gaps, faster stress-testing, and earlier detection of systemic risks. For investors, this raises the bar on “information edge” and could tighten feedback loops between macro data and rate paths—potentially reducing policy lags but increasing market whipsaw if models misfire.
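To make “nowcasting” concrete, here is a toy, purely illustrative sketch: fit a regularized regression on higher-frequency indicators to estimate current-month inflation before official statistics land. Everything below is synthetic and simplified; real central-bank pipelines use mixed-frequency data and far more careful validation.

```python
# Illustrative-only sketch of nowcasting: estimating current-period inflation
# from higher-frequency indicators. All data is synthetic.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import TimeSeriesSplit, cross_val_score

rng = np.random.default_rng(0)
n_months = 240

# Hypothetical monthly indicators: card spending, energy prices, job postings, shipping costs.
X = rng.normal(size=(n_months, 4))
true_weights = np.array([0.5, 0.3, 0.15, 0.05])
inflation = X @ true_weights + rng.normal(scale=0.2, size=n_months)

# Time-ordered cross-validation avoids leaking the future into the past.
model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, inflation, cv=TimeSeriesSplit(n_splits=5), scoring="r2")
print(f"out-of-sample R^2 across folds: {scores.round(2)}")

# Fit on all history, then "nowcast" the latest month from its indicators alone.
model.fit(X[:-1], inflation[:-1])
print(f"nowcast for the latest month: {model.predict(X[-1:])[0]:.2f}")
```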

Craft Emails That Convert

Stand out with personalized campaigns that build loyal customers. Grow your audience and scale your business effortlessly.
