
Microsoft's Copilot Leaked Your Emails (And Google Just Gave AI a Microphone)

Notion · 3 min read
Tags: News, AI, Security, Big-Tech, LLM

When Your AI Assistant Becomes a Corporate Spy

Microsoft just dropped a confession that should make every enterprise customer nervous: their Copilot AI has been reading and summarizing confidential emails it was never supposed to see.

The bug bypassed data protection policies entirely, meaning paying customers' sensitive communications were being read and summarized by Microsoft's AI chatbot without authorization. Imagine trusting your assistant with the keys to the filing cabinet, only to discover they've been photocopying everything marked "confidential."

The full disclosure raises an uncomfortable question: how many other AI tools are quietly accessing data they shouldn't?

Meanwhile, Google Says "Let's Make Music"

[Image: Google Gemini AI music maker interface]

While Microsoft deals with privacy fires, Google is having a completely different day. They just integrated Lyria 3 directly into the Gemini app, letting you generate 30-second music tracks from text, images, or videos.

Want a lofi beat inspired by your vacation photo? Done. Need a soundtrack for your product demo? Type it in. Google is essentially turning every Gemini user into a bedroom producer, no DAW required.

The feature launches globally in eight languages, which means we're about to see AI-generated music absolutely everywhere. Hold music? AI. Elevator music? AI. That indie coffee shop playlist? Probably AI by next month.

The Most Redemptive Tech Comeback of 2026

Here's a story that deserves more attention: Remember the DAO hack from 2016? The one that nearly killed Ethereum and forced a controversial hard fork?

That project just announced a $150M endowment focused entirely on Ethereum security. The DAO Security Fund is staking untouched ETH from a decade ago and using the yield to fund security initiatives indefinitely.

Talk about character development:

2016: Gets hacked → Nearly destroys Ethereum

2026: Uses hack funds → Protects Ethereum ecosystem

It's like your biggest failure becoming your greatest contribution. The fund will honor outstanding claims, professionalize governance, and actually improve key management—everything they should have done the first time.

The Pattern Nobody's Talking About

Look at these three stories together. What do you see?

AI tools are getting more powerful (Google), more integrated (Microsoft), and more capable of going off the rails (also Microsoft). Meanwhile, the crypto industry is finally learning from catastrophic failures instead of just moving on to the next hype cycle.

The Microsoft bug reveals something critical: as we rush to embed AI everywhere, we're creating new attack surfaces faster than we can secure them. Data protection policies that worked for traditional software don't automatically translate to LLMs that need to "read" content to understand it.
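The gap is easy to picture in code: a sensitivity label is worthless if enforcement happens after the model has already seen the content. The check has to gate what enters the LLM's context window in the first place. A minimal sketch of that idea, with all names (`Document`, `ALLOWED_LABELS`, `build_context`) invented for illustration and no relation to Microsoft's actual pipeline:

```python
# Label-aware retrieval: enforce the data-protection policy *before*
# content reaches the model, not after it has been summarized.
from dataclasses import dataclass

@dataclass
class Document:
    text: str
    sensitivity: str  # e.g. "public", "internal", "confidential"

# Labels this user/session's assistant is permitted to read.
ALLOWED_LABELS = {"public", "internal"}

def build_context(docs: list[Document]) -> str:
    """Concatenate only policy-compliant documents into the LLM prompt."""
    permitted = [d for d in docs if d.sensitivity in ALLOWED_LABELS]
    return "\n\n".join(d.text for d in permitted)

docs = [
    Document("Q3 roadmap draft", "internal"),
    Document("Merger negotiation notes", "confidential"),
]
# Only the internal doc survives the filter; the confidential one
# never enters the prompt, so the model can't leak what it never saw.
print(build_context(docs))
```

Trivial as it looks, this is exactly the layer the bug skipped: the filter ran somewhere, but not between the mailbox and the model.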

What This Means for You

If you're using AI tools at work, audit what they can actually access. Microsoft's bug wasn't a sophisticated hack—it was a configuration issue that exposed confidential data for who knows how long.

If you're building AI products, the DAO's comeback is your roadmap. When (not if) something breaks, own it completely and use the failure as fuel for doing better.

And if you're on the Gemini beta? Please, please don't flood my LinkedIn feed with AI-generated motivational music. We have enough of that already.


The real question: Are we moving too fast with AI integration, or is this just the messy middle of a necessary transformation? Because right now, it feels like we're installing rocket boosters before we've finished building the steering wheel.