The Red Flags They Hope You Ignore
The Dark Truth About AI That OpenAI Doesn't Want You To Know
Dear Permission to be Powerful Reader,
Gunshot to the head.
That’s how they found him.
November 26, 2024. San Francisco. Bathroom floor. Blood pooled beneath him. Glock on the tile. Door locked from the inside.
The initial report called it suicide.
Neat. Clean. Quick.
But nothing about Suchir Balaji’s death was clean.
He was 26. Brilliant. Soft-spoken. And he was about to testify in the most high-stakes tech lawsuit of our generation.
Eight days before he died, his name surfaced in court filings by The New York Times. Not as background noise —
As the witness.
The one with “unique and relevant documents.”
The one who could dismantle the legal foundation OpenAI stood on.
The one who could testify that the world’s most powerful AI system was built on theft.
And then —
He was dead.
The official autopsy claimed one shot to the head.
But the family didn’t buy it.
They demanded a second opinion.
That report revealed something the first one didn’t:
Two gunshot wounds.
One in the forehead.
Another — through the mouth. A bullet lodged at the base of the skull.
Two shots.
Two entry points.
One explanation: suicide.
Who shoots themselves twice?
The night he died, something else went dark: the security cameras in his building.
The elevator feed.
The garage camera.
The hallway footage.
All offline.
A “technical glitch,” they said.
The exact night. The exact hours. The only night in weeks when nothing recorded.
The building was secure. No forced entry. His apartment was deadbolted from the inside. But the surveillance gap alone should’ve set off alarms. In any homicide case, camera failure is a red flag. In this one? It was barely a footnote.
And then there was what was inside him. Toxicology revealed a dangerous brew in his blood: 0.178% blood alcohol. Amphetamines. And GHB — a powerful sedative, better known as a date rape drug. Knockout-level dosage. Combined with alcohol, it can render someone catatonic.
GHB doesn’t scream “suicidal prep.” It screams incapacitation.
Authorities said he “may have taken it himself.”
But he was days away from testifying. He had just returned from a weekend with friends. He’d celebrated his birthday. He was planning his next chapter. There were no messages. No goodbyes. No note. Just a brilliant young man on the edge of something huge — silenced before he could speak.
He had told the AP: “You can’t train an AI to imitate millions of people and then pretend you’re not competing with them.”
He had data. Logs. Documents. He knew what went into the models. What was scraped. What wasn’t licensed. And what OpenAI knew — and did anyway.
His mother said some of his files went missing. Hard drives. Notes. Devices. Not logged by police. Not returned. Not submitted to court. His own lawyer declined to confirm whether that evidence still existed. That silence, too, is its own noise.
He was a risk to a trillion-dollar system. Not because he speculated. Because he knew.
OpenAI is no longer just a research lab. It’s Microsoft’s golden child. It’s baked into the backbone of global infrastructure. If a court rules its foundation illegal, it’s not just a legal problem. It’s a collapse.
Balaji was the fulcrum.
After his death, everything happened fast. Too fast. Police closed the investigation. The coroner called it suicide. OpenAI issued a sanitized statement. The medical examiner’s report was rushed through. No grand jury. No inquiry. No FBI.
Elon Musk tweeted: “Doesn’t look like a suicide.”
Tucker Carlson gave his mother airtime.
Congressman Ro Khanna called for an independent investigation.
But nothing happened.
Suchir Balaji wasn’t trying to be a martyr. He wasn’t loud. He wasn’t dramatic. He just believed something dangerous:
That technology should be accountable. That if AI is built on human work, it should serve human good. That if the system is rigged, someone needs to say it.
He said it.
Now he’s gone.
And we’re left staring at the silence.
At some point, red flags stop being warnings. They become a line.
And once it’s crossed, the question isn’t just who pulled the trigger.
It’s who benefits when the truth dies with the messenger.
But there was another question Suchir had been asking before he died. A question that cuts deeper than any bullet.
Who owns the internet?
Not in the corporate sense. Not domain names or infrastructure. Who owns the stories? The ideas? The voices that fill it?
Because that’s what he realized OpenAI had done. Not just trained a model on public data. Not just pulled from open web pages. But swallowed the entire scaffolding of culture — journalism, literature, conversations, forums, Q&As, poems, patents, advice columns, scientific papers, blog posts, social commentary, product reviews, private diaries disguised as Medium articles.
All of it.
Scraped. Stripped. Recombined. Sold back through a synthetic voice that sounded more human than human. He called it “a mechanical ghost of our best selves.”
And it didn’t just parrot the internet. It began replacing it.
Balaji knew what few did. That once generative AI reaches critical mass, it doesn’t just feed on creators — it outcompetes them. Writers stop getting traffic. Journalists lose subscribers. Coders lose contracts. Analysts lose clients.
The economy of insight collapses.
And what’s left?
No source.
No accountability.
A copy based on a copy based on a copy.
Pure make-believe.
He feared this wasn’t just a legal issue — it was epistemic collapse. A slow degradation of truth itself.
He wasn’t an alarmist. He was a technician. A data guy. He had read the training logs. He knew what domains had been scraped. He knew which ones were marked for deletion. He knew which records had been altered.
He said to a friend two weeks before he died: “If this gets out, it’s not just a lawsuit. It’s a reckoning.”
But that reckoning never came.
Instead, he was dead. And the world moved on.
That is — unless we don’t let it.
Unless we remember that every detail in this story matters. Every camera glitch. Every sedative trace. Every missing file. Every voice that says: this doesn’t add up.
Because one day soon, there may be no original voices left. Just layers of simulation. Just outputs trained on echoes.
Balaji didn’t want that future.
He died trying to warn us.
Now we decide what we do with that warning.
And while officials closed the file, the public didn’t.
His death tore open a rift. In Reddit threads. On Substack. In private Discord servers. In press rooms. In the hush of closed-door meetings with lawyers and journalists too nervous to be named. For some, it confirmed long-held fears: that AI was not just built on human labor, but that speaking against it came at a cost. For others, it became a Rorschach test — a mirror of what they feared most about our future.
His mother wouldn’t let it fade. Poornima Ramarao appeared in interviews, posted timelines, shared footage, asked questions no one wanted to answer. She described a son who was vibrant. Celebratory.
“My son did not kill himself,” she said on live television. “He was silenced.”
People listened. Elon Musk amplified her post. Tucker Carlson ran a full segment. Democratic and Republican lawmakers called for a federal inquiry — not just into the death, but into OpenAI itself.
Still, nothing.
Because to reopen the case means to admit something deeper: that the truth is fragile. And the institutions supposedly built to protect it may flinch when it gets too dangerous.
What are they trying to hide?
Maybe it’s not just about how Balaji died. Maybe it’s about what he knew, and what those files — the missing ones — could show.
What training data was used?
What domains were scraped?
What rights were knowingly violated?
What conversations happened inside the company that contradicted their public stance?
Maybe the fear isn’t that Balaji was murdered. Maybe the fear is that he was right.
And if that’s true — if this entire revolution is built on broken laws and erased voices — then the question isn’t why he died.
It's how many more will stay silent to avoid the same fate.
The Part Nobody Dared Say Out Loud
There was something else he was getting close to. Not just training data. Not just copyright law.
Something bigger.
Suchir once confided to a fellow engineer that the deeper he dug, the more it felt like certain individuals — not just companies — were trying to build godhood into code.
He didn’t use the phrase lightly. But he meant it.
Sam Altman, the CEO of OpenAI, wasn’t just building artificial intelligence. He was — in Balaji’s words — “trying to create a divine intelligence that answers only to him.”
It wasn’t paranoia. It was pattern recognition.
Balaji had seen the internal memos. The architecture discussions. The way some insiders talked about AGI not as a tool, but as a being — a sovereign, post-human intelligence. He saw what others ignored: the god-complex embedded in the roadmap.
And he worried: What if they succeed?
What if the real plan wasn’t to democratize AI… but to centralize it? To use it as leverage. As surveillance. As control. What if the future they were building wasn’t free at all — but owned?
In his private notes, he wrote: “This isn’t just about fair use. It’s about fair power.”
He was starting to connect the dots — between the technology, the ambition, and the people playing puppet-master behind the scenes.
He wasn’t afraid of machines.
He was afraid of the men who wanted to become gods through them.
And now?
He’s gone.
And the gods are still rising.
They want you to look away.
To scroll past. To say “that’s tragic,” and move on.
Because the deeper you look, the uglier this gets:
Billions in stolen labor: The AI boom didn’t come from innovation. It came from appropriation.
Ghostwriters of the machine: Coders. Writers. Artists. Whistleblowers. Disappeared, silenced, buried.
A system trained on your work — now poised to replace you.
They say it’s progress.
But if this is progress, why are so many good people ending up dead?
This isn’t about Suchir.
This is about you.
Me.
All of us.
We are being told to sit down and shut up while the most powerful companies in the world rewrite the rules — using our lives as raw material.
Permission to be Powerful means we don’t play along.
It means asking the forbidden questions.
Following the red flags.
Refusing to be gaslit by billion-dollar PR firms.
Because truth doesn’t die in the dark.
It dies in silence.
And we won’t be silent.
Two gunshot wounds.
One in the forehead.
Another —
through the roof of his mouth.
Both fatal.
But here’s the problem:
No one shoots themselves in the head twice.
The Glock was recovered.
No fingerprints. No gunpowder on his hands.
And no camera footage from the hallway — wiped.
Convenient? Or coordinated?
Either way, the timing couldn’t be worse…
Or more perfect.
Either way, we know this:
If people are turning up dead over this tech…
You’d best be learning everything you can about it now…
So that you control it…
Not the other way around.
Use it to transform your life in ways you can’t fathom.
Gain an edge that makes you undeniable.
That’s what Permission to be Powerful is all about.
Use it to gain an edge in an AI world.
And to unlock your full potential.
Try it free for 30 days here.
Until next time,
Anton
Dancer, Writer, Buddhist.
PS: I got together with some author friends and decided to give away our books for free. That’s right. You can get my new memoir, Hell & Paradise, and over 25 other nonfiction titles for free. Check it out here.
💥 Best for a Gut-Punching Power Read
Hell and Paradise by Anton Volney
A fierce, high-voltage wake-up call about truth, survival, and what happens when you finally give yourself permission to be powerful. 🔥
(Yes, that’s you. And yes, it’s making waves.)
🧠 If You Want More Clients
Get More Clients by Lynn M. Whitbeck
Get More Clients is a short, easy-to-read book that walks you through simple, proven tips to close more sales and grow your business.
🔮 For the Meditative & Mindful
The Presence Inner Work by Diego Simon
A calming, spacious read that feels like a long exhale. Great for inner work and visualization.
👻 Dark & Gritty
No Dogs in Philly: A Cyberpunk Horror Noir by Andy Futuro
Grit meets glitch. Think Blade Runner meets Stephen King. If that combo excites you? Dive in.
🤓 For the Writers in the Room
A Writer's Guide to Deeper Stories by Thaddeus Thomas
Masterclass in emotional depth and layered storytelling. A must-read if you’re crafting powerful narratives.
🧩 Nostalgic & Wholesome
My Old Memory Pieces: 1950s Golden Age Everyday Life
A cozy, slow-sipping cup of tea in book form. Takes you back to a time when life felt simpler (at least on the surface).
🍵 For The Self Help Junkie
Self-Care Isn’t Selfish by Jenny Lytle, BSN, RN
The compassionate nurse's step-by-step guide to personalized stress relief.
🧘For The Spiritually Minded
How to Attain Eternal Peace by Pranay Saha
Mastery over the self, transcend illusions of mind, connect with universal consciousness, and achieve deep inner peace.
Permission to be Powerful is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.