Ever wonder if ‘mastering AI’ is code for surrendering your brain to a glitchy oracle?
Justice Sotomayor dropped that bomb on law students at the University of Alabama School of Law. AI, she said, is the new revolution—like computers back in the day. Don’t graduate without learning to wield it. Sounds empowering. Right?
But hold on. She dubbed it a ‘dangerous hallucination machine.’ And here’s the kicker: “You must learn to master the dangerous hallucination machine to do good in the world.”
“AI is a sophisticated human,” Sotomayor said. “All of its input is input from human beings. And because it is that, it has the potential to perpetuate the very best in us and the very worst in us.”
Poetic. Sure. But calling AI a ‘sophisticated human’? That’s jurist flair, not tech savvy. It’s a tool—black box variety, trained on our messiest data. And now, law students? Dive in headfirst.
Should Law Students Really ‘Master’ AI?
Mastery. What a loaded word. Picture this: you’re grinding 10,000 hours prompting Claude or Harvey. Feels productive. Until your legal intuition atrophies. Thinking like a lawyer—spotting nuances, weaving precedents—that’s hard-won. AI spits out briefs laced with hallucinations. You fix ‘em. Rinse. Repeat. Soon, you’re not reasoning; you’re editing machine drool.
And the exhaustion factor? Biglaw associates already snatch three to four hours of sleep, if they're lucky. One veteran described living that way for decades. Now layer AI on top of that delirium. Does it sharpen you? Or warp your neurons? There are no long-term studies on heavy AI exposure: effects on neuroplasticity, maybe even psychosis risk in high-stakes fields. Adapt-or-die rhetoric? It's everywhere. Sensational? A bit. But ignore it at your peril.
Sotomayor’s warning felt tepid. Dangerous, yes. But master it anyway. No room left for Luddites, apparently. Here’s a parallel worth drawing: the slide-rule era in engineering. Calculators arrived; mental math withered. Engineers got lazy with approximations. Law’s version? Prompt engineers instead of case crafters. Firms pivot to lateral hires and stop training juniors. Sustainable? Laughable.
Short answer: no.
Why Does AI Mastery Threaten Legal Culture?
Editing hell. Remember submitting your masterpiece brief, only to get it back drenched in red ink? Brutal. Humiliating. But it builds skills. Teamwork. Resilience. Now? Run it through the firm’s AI thrice. The partner glances, nods. Done.
Alienation brews. No feedback loops. No ‘aha’ moments from a senior’s slash. Firms save time, sure. But at what cost? We’re seeing it: less investment in green lawyers. Hire experienced guns instead. Bubble? Maybe. Snake oil? Some peddlers, absolutely—cashing in pre-burst.
Sotomayor nailed the human input flaw. Best and worst of us, amplified. In judging human complexity? Catastrophic potential. Yet her advice skips infrastructure gaps. Courts? Firms? Unready for AI floods. Hallucinated citations already tanking cases.
Look, use AI. Voluntarily or not, Godspeed. But guard your critical edge. Amy Coney Barrett skips it entirely. First Amendment shield? Smart play.
And the PR spin? Sotomayor’s framing as ‘revolution’ mirrors tech bros’ hype. Computers revolutionized law, yeah—but slowly, with regulations. AI? Warp speed. No brakes.
Bold prediction: in five years, we’ll mandate ‘AI detox’ courses in law schools. Brains fried from over-reliance. Firms lamenting rote prompters, not thinkers.
This isn’t an anti-AI screed. Tools evolve. But blind adoption? A recipe for dull lawyers. A shortage of sharp ones is coming, not from replacement, but from erosion.
What Happens to Biglaw When Everyone’s an AI Wrangler?
Picture the grind. Sleep-deprived, you prompt at 2 a.m. AI hallucinates a precedent from 1872—fake. You catch it. Mostly. But patterns emerge: trust erodes, haste reigns.
Team dynamics? Shredded. No red-lined rituals forging bonds. Just solitary screens. Lateral hires fill gaps, but culture frays. Sustainability? Firms chase efficiency, birth burnout.
Historical parallel: typewriters standardized legal docs, sped production. But didn’t gut reasoning. AI? It thinks for you. Or pretends to.
Sotomayor’s right—it’s us in there. Our biases, errors. Perpetuated at scale. Dangerous? Understatement.
So, students: master warily. Probe outputs. Question inputs. Or risk becoming the hallucination.
Frequently Asked Questions
What did Justice Sotomayor say about AI to law students?
She called AI a ‘dangerous hallucination machine’ and a ‘sophisticated human’ that amplifies our best and worst, urging students to master it before graduating.
Is AI really dangerous for lawyers?
Yes—hallucinations, bias amplification, and potential skill erosion make it risky, especially under Biglaw pressures like sleep deprivation.
Will AI replace law school training?
Not outright, but over-reliance could dull critical thinking, pushing firms toward lateral hires over training juniors.