Google launched Gemini 3
The newest flagship model scorches benchmarks with a 37.4 score on Humanity's Last Exam while rivals scramble to keep pace
☕ Good morning,
Seven months. That's how long Google waited between Gemini 2.5 and Gemini 3. Not two years. Not even a full year. Seven months. And it's not like they're iterating on minor improvements: this thing topped every major benchmark and came bundled with an entirely new coding environment.
—Here’s to the first sip.
TODAY IN AI
Google launches Gemini 3

Image: Google
Google dropped Gemini 3, its newest flagship model, and it’s already being positioned as one of the strongest AI systems available right now. It showed up only seven months after Gemini 2.5, which tells you how fast this race is moving. The timing is wild too: OpenAI just pushed out GPT-5.1, and Anthropic recently launched Sonnet 4.5. Everyone’s firing on all cylinders.
Google says Gemini 3 is a big jump in reasoning, and early benchmarks back that up. It scored 37.4 on Humanity’s Last Exam, the highest score recorded so far; the previous top model was GPT-5 Pro at 31.64. It also took first place on LMArena, the leaderboard ranked by head-to-head human preference votes rather than automated scoring.
A more advanced version called Gemini 3 Deep Think is coming soon for Google AI Ultra subscribers, but only after additional safety testing.
The model already has a huge built-in audience. Google says the Gemini app now gets 650 million monthly active users, and 13 million developers are using Gemini in their workflows. So whatever improvements land here, they instantly reach a massive crowd.
Google also shipped a new coding environment called Google Antigravity, which is basically an agentic coding setup baked into one interface. You get a prompt window like ChatGPT, a terminal, and a browser panel showing your app as the AI edits it. It’s Google’s answer to agentic IDEs like Cursor or Warp, and the idea is to let the AI hop between your editor, terminal, and browser so it can actually help you build projects end to end.
TECH BARISTA
Cloudflare outage caused by bot system glitch

Image: Cloudflare
Cloudflare explained what caused the massive outage, and it turns out the whole thing started with a mistake inside its Bot Management system. This system is supposed to decide which bots are allowed to crawl websites, especially now that AI scrapers are everywhere. Instead of protecting sites, a small database change ended up taking down a huge part of the internet for hours.
Here’s what actually happened. Cloudflare uses a machine learning model that assigns bot scores to incoming traffic. That model depends on a configuration file that gets updated all the time. After a change in how their ClickHouse database handled queries, that file suddenly started filling up with duplicate rows. It kept expanding until it blew past memory limits, which then crashed a core proxy system that sits in the middle of Cloudflare’s entire network.
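To make that failure mode concrete, here’s a rough sketch of how a loader like that can fall over. Everything below is hypothetical and invented for illustration (the names, the size limit, the data), not Cloudflare’s actual code: a routine rebuilds the bot-scoring config from a database query, and when the query suddenly starts returning duplicate rows, the file blows past its preallocated limit and takes its consumer down with it.

```python
# Hypothetical illustration only -- not Cloudflare's actual code.
# A loader rebuilds a bot-scoring config file from a database query and
# enforces a hard cap on how many entries it will accept.

MAX_ENTRIES = 1_000  # preallocated capacity (number invented for this sketch)


def rebuild_config(rows: list[dict]) -> list[dict]:
    """Turn query results into the config the proxy reloads on every refresh."""
    entries = []
    for row in rows:
        entries.append({"feature": row["feature"], "weight": row["weight"]})
        if len(entries) > MAX_ENTRIES:
            # In this sketch the overflow is fatal, which mirrors the failure
            # described above: the file outgrew its limit and the component
            # consuming it went down with it.
            raise MemoryError("config exceeded preallocated capacity")
    return entries


# Before the database change: one row per feature, comfortably under the cap.
healthy_rows = [{"feature": f"f{i}", "weight": 0.5} for i in range(400)]
rebuild_config(healthy_rows)  # fine

# After the change: the same query returns duplicate rows, so the generated
# file silently balloons until it trips the limit.
duplicated_rows = healthy_rows * 3
try:
    rebuild_config(duplicated_rows)
except MemoryError as exc:
    print("config rebuild failed:", exc)  # the point where the proxy fell over
```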
Once that proxy went down, any website relying on Cloudflare’s bot rules started blocking real users by mistake. That’s why ChatGPT, X, Downdetector, Spotify, and tons of major sites all fell over at the same time. The funny part is that customers who didn’t use Cloudflare’s bot scoring stayed online, completely unaffected.
Cloudflare originally suspected a cyberattack, since spikes in weird traffic often point in that direction. But this time it wasn’t AI scrapers, DNS issues, or a huge DDoS attack. It was just a subtle internal behavior change that spiraled out of control.
To prevent this from happening again, Cloudflare says it’s going to harden how its systems ingest internal config files, add global kill switches for risky features, stop error logs from overwhelming system resources, and review how its core proxy reacts to failure conditions.
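As a rough sketch of what that hardening can look like in practice (all names and numbers below are hypothetical, not taken from Cloudflare’s write-up): dedupe and validate the freshly generated file, fall back to the last known-good version instead of crashing, and keep a global kill switch to turn the feature off entirely.

```python
# Hypothetical sketch of hardened config ingestion -- names and limits invented.

MAX_ENTRIES = 1_000            # hard size cap on the generated file
BOT_SCORING_ENABLED = True     # global kill switch for the risky feature
_last_known_good: list[dict] = []


def ingest_config(rows: list[dict]) -> list[dict]:
    """Validate a freshly generated config; never hand the proxy a bad file."""
    global _last_known_good

    if not BOT_SCORING_ENABLED:
        return []  # feature switched off globally: serve traffic without bot scores

    # Drop duplicate rows before doing anything else.
    seen, entries = set(), []
    for row in rows:
        if row["feature"] in seen:
            continue
        seen.add(row["feature"])
        entries.append(row)

    # An empty or oversized file is treated as suspect: keep the last good one
    # instead of letting the bad file crash the consumer.
    if not entries or len(entries) > MAX_ENTRIES:
        return _last_known_good

    _last_known_good = entries
    return entries
```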
PRESENTED BY ROKU
Find your customers on Roku this Black Friday
As with any digital ad campaign, the important thing is to reach streaming audiences who will convert. To that end, Roku’s self-service Ads Manager stands ready with powerful segmentation and targeting options. After all, you know your customers, and we know our streaming audience.
Worried it’s too late to spin up new Black Friday creative? With Roku Ads Manager, you can easily import and augment existing creative assets from your social channels. We also have AI-assisted upscaling, so every ad is primed for CTV.
From there, you can easily set up A/B tests to flight different creative variants and Black Friday offers. If you’re a Shopify brand, you can even run shoppable ads directly on-screen so viewers can purchase with just a click of their Roku remote.
Bonus: we’re gifting you $5K in ad credits when you spend your first $5K on Roku Ads Manager. Just sign up and use code GET5K. Terms apply.
BIZ BARISTA
Adobe is buying Semrush

Image: Adobe
Adobe made a big move in the marketing world. It’s buying Semrush for about $1.9 billion, and the whole point is simple: brands are scrambling to stay visible in a world where AI is changing how people search.
Adobe already has a stack of marketing tools, but this deal plugs a big gap. Semrush is one of the top platforms for SEO and traffic analysis, and it’s been shifting hard toward what it calls “generative engine optimization”: basically, helping companies show up in AI answers from ChatGPT, Claude, Perplexity, and all the other AI search tools people now rely on.
Adobe’s betting that this is the next big marketing channel. Its own analytics show traffic from generative AI chatbots to retail sites has jumped more than 1,000% in a year. So instead of just optimizing for Google, brands now have to optimize for AI models too. Semrush has already built tools for that, and Adobe wants that tech inside its ecosystem.
It’s a much smaller deal than the Figma acquisition Adobe had to abandon, but it still needs regulatory and shareholder approval and isn’t expected to close until the first half of 2026.
But if it goes through, Adobe basically gets a front-row seat to the future of SEO, or whatever we’ll call it when AI models become the new gatekeepers of web traffic.


