In January 2025, the World Economic Forum released its Future of Jobs Report estimating that AI and automation would displace 92 million jobs globally by 2030 — while simultaneously creating 170 million new ones. Those two numbers should sit next to each other for a second, because that’s the real tension: this isn’t a story about AI taking jobs or AI creating jobs. It’s about a simultaneous disruption happening faster than most institutions are built to absorb. And unlike previous automation waves that hit factory floors, this one is hitting knowledge workers, lawyers, coders, marketers, and analysts — the people who thought they were safely on the right side of the automation line.
The Productivity Gap Is Already Opening
The most important economic story right now isn’t mass unemployment — it’s a productivity split happening inside companies, between workers who’ve integrated AI into their daily workflows and those who haven’t. This gap is measurable and it’s widening fast.
A 2023 study on GitHub Copilot found software developers completed tasks 55% faster when using the tool. A separate Harvard Business School field experiment found BCG consultants using GPT-4 outperformed non-users by roughly 40% on quality for tasks within the model's capabilities — notably, performance dropped on tasks beyond them. These aren’t theoretical projections. They’re controlled experiments showing that AI is already a serious productivity multiplier for people who actually use it well.
What’s less discussed is who benefits. The Copilot study found the biggest gains came from less-experienced developers — not the senior engineers. AI is acting as a skill leveler in some domains, compressing the gap between a junior coder and a mid-level one. That’s economically significant. It means companies can do more with leaner teams, and it means the marginal value of being “pretty good” at something drops when AI can close most of the gap.
Andrej Karpathy has talked about this directly — his framing of “software 3.0,” where natural language becomes the new programming interface, points to a world where the barrier to building software collapses. That’s not just a developer story. It’s a labor market story.
Which Jobs Are Actually at Risk (and Which Aren’t)
The “AI will take all jobs” narrative is too blunt. The “AI only takes repetitive tasks” narrative is now clearly outdated. The real picture is more granular and, honestly, more unsettling for certain categories of knowledge work.
Here’s a cleaner framework for thinking about job vulnerability:
- High substitution risk: Tasks that are primarily about processing information, generating structured documents, or following repeatable decision trees. Think: first-draft legal contracts, basic data analysis reports, entry-level customer support scripts, boilerplate code, routine radiology reads.
- High augmentation (not replacement): Tasks requiring judgment, relationship management, physical presence, or cross-domain synthesis. Think: senior engineering decisions, sales relationships, surgery, complex negotiation, creative direction.
- New job creation: AI trainer, prompt engineer (yes, still), AI deployment specialist, model auditor, trust and safety researcher, human-AI workflow designer. These are real roles being hired for right now at companies like Anthropic, Scale AI, and major enterprises.
- Unclear / too early: Creative work, management, teaching, social work. AI is entering these spaces but the displacement timeline is genuinely uncertain.
Goldman Sachs estimated in 2023 that roughly 300 million full-time jobs globally could be exposed to automation. “Exposed” doesn’t mean eliminated — it means the task composition of those jobs changes. A paralegal’s job doesn’t disappear when AI can draft a contract. It changes: less drafting, more reviewing, more client interface, more judgment calls. Whether that’s better or worse depends heavily on the person and the firm.
The Sectors Being Reshaped Right Now
Some industries aren’t waiting for 2030. The reshaping is happening today, in measurable ways.
Software Development
GitHub Copilot has over 1.3 million paid subscribers. Cursor, the AI-native code editor, went from niche to mainstream in under 18 months. Cognition’s Devin made headlines as an “AI software engineer” — the reality is more modest (it handles isolated, well-defined tasks), but the direction is clear. Engineering teams are already reporting they can ship faster with smaller headcounts. The entry-level developer job market softened noticeably in 2024, and many in the industry connect that to AI tooling compressing the need for junior engineers on routine tasks.
Legal and Professional Services
Harvey AI is now being used by Am Law 100 firms. Firms using these tools report meaningful time savings on document review, due diligence, and contract analysis. Casetext (acquired by Thomson Reuters for $650 million in 2023) built an AI legal assistant on top of GPT-4 — the model that scored around the 90th percentile on the bar exam. The billable hour model — the entire economic engine of Big Law — is under structural pressure when AI can do in minutes what used to take a first-year associate days.
Customer Support and Operations
Klarna made headlines when it reported its AI assistant handled 2.3 million customer conversations in a month — doing the work of 700 human agents, it claimed. That figure got scrutinized and the reality is messier (quality issues, escalation rates), but it illustrated what’s possible. Intercom, Zendesk, and Salesforce have all embedded AI agents into their platforms. This is one of the highest-velocity areas of actual deployment.
Media and Content
The Associated Press has been using AI for automated earnings reports since 2014. By 2024, AI-generated content was embedded at scale across sports recaps, financial summaries, and product descriptions. The disruption here isn’t coming — it arrived. Entry-level content production jobs have already contracted. The jobs that remain are editorial, strategic, and relationship-based.
The Wage and Inequality Question Nobody Wants to Answer Honestly
Here’s the uncomfortable economic question: even if AI creates more jobs than it destroys, does it distribute the gains fairly?
Historical precedent isn’t encouraging. The automation wave from 1980–2016 increased aggregate GDP significantly, but real wages for the bottom 60% of earners barely moved. The gains concentrated at the top — to capital owners and high-skill workers. AI could easily repeat this pattern, or accelerate it.
The reason is structural. AI is predominantly a capital good. Companies buy it, deploy it, and capture the productivity gains. Workers who use AI well become more valuable — but that’s a small percentage. Workers whose tasks AI replaces face wage pressure or displacement without necessarily seeing a path to the new roles being created (which often require different skills, different geographies, different educational backgrounds).
Dario Amodei at Anthropic, Sam Altman at OpenAI, and others have at least acknowledged this tension publicly. Altman’s proposals around universal basic income and national equity stakes in AI companies are one answer — though none of these have moved from concept to policy. Mustafa Suleyman’s framing in The Coming Wave is sobering: the containment problem isn’t just about safety, it’s also about power — who controls the technology, and who shares in its gains.
