
Article 05 Technical Inquiry

AI Reorders Work Before It Replaces It

Will AI reorganize work or just unsettle workers?


Generated Apr 7, 2026


A job seeker on Reddit put the hiring market in six words: “The amount of ghosting is crazy” [40]. Around the same time, posts on X were circulating a very different image of AI at work: Jack Dorsey’s Block, they said, was sketching a future where software could take over much of the coordination now handled by corporate hierarchy [9][22]. Read together, those posts point to the same shift from opposite ends of working life.

The public pitch for AI at work still centers on efficiency. For many workers, the first contact feels more like distance: automated replies, thinner communication, and institutions that are easier to scale than to reach. The question is not only whether AI cuts jobs. It is whether AI changes who gets judgment, who gets attention, and who gets treated as overhead.

That tension runs through these posts. Executives and AI boosters describe a world with less managerial drag and faster decisions [9][22][25]. Workers and users describe something messier: better tools in some settings, worse institutional behavior in others [16][18][34][40]. AI may not eliminate work so much as reorder who gets to matter inside it.

The boardroom case for thinner hierarchy

The clearest version of the executive pitch appears in posts about Block. One widely shared summary said, “Jack Dorsey’s Block just laid out a plan to replace much of corporate hierarchy with AI coordination” [9]. Another put it more bluntly: “Block just published a plan to kill middle management. And replace it with AI” [22].

The appeal is easy to grasp. If much of management work consists of passing information, tracking status, and helping decisions move across teams, AI can be framed as a faster coordination layer [9][22][25]. A related X post argued that coordination costs rise steeply with company size, claiming that at 100 people a company might spend “20% of your payroll on coordination,” and at 10,000 people that figure can approach “60%” [25].

That argument treats hierarchy as a technical bottleneck. In that view, the problem is not lazy workers or bad intentions. It is that organizations become slow, lossy systems for moving context around [22][25]. AI is being sold, then, not only as a helper for individual employees but as a substitute for parts of the org chart itself [9][22].

What hierarchy does that software may not

That pitch weakens once “coordination” starts to include judgment. Managers do route information, but teams also rely on them to interpret ambiguity, settle disputes, coach less experienced staff, and decide when a messy situation does not fit the dashboard.

Some AI advocates answer with analogy. Elon Musk wrote, “Elevators used to be manually operated,” adding that now people “just get in, press a button” and modern elevators are “extremely safe” [6]. His point was plain: some human intermediaries disappear once systems become dependable enough [6].

Everything turns on that last condition. Sundar Pichai has warned users not to “blindly trust” AI outputs because the systems remain imperfect and can make significant errors [55]. A Reddit user, reacting to executive enthusiasm, wrote in rougher terms: “AI is making CEOs delusional” [48]. Those are not the same claim, but together they mark the gap between confidence in the idea and confidence in the tools [48][55].

For workers, AI often arrives as thinner attention

On the worker side, AI shows up less as elegant redesign than as degraded contact. “The amount of ghosting is crazy,” the Reddit user wrote [40]. The complaint is about hiring, but it also captures a broader feeling: institutions can automate touchpoints without becoming more responsive.

A university example made the same point in miniature. One Reddit post mocked an “Extremely obvious AI generated email for the new AI major. Typical” [34]. The post does not prove institutional decline. It does show how quickly people read synthetic communication as a sign of carelessness or low regard [34].

That matters because work is not only tasks and output. It is also the quality of attention people receive from employers, schools, and managers. AI can make communication cheaper. It can also make institutions feel less present.

The fear is not only replacement, but downgrading

Some of the sharpest anxiety in these sources is about status inside the workflow. A viral X post seized on Andrej Karpathy’s language about humans becoming “actuators” and “sensors” for machine intelligence, then glossed it this way: “We are no longer the processor” [2]. That second phrase was the poster’s interpretation, not Karpathy’s own wording, but it spread because it named a real fear [2].

Karpathy has also said that industries may need to change because “the customer is not the human anymore, it’s agents who are acting on behalf of humans” [10]. That points to a different kind of labor shift. Even when people remain employed, software agents may increasingly handle search, comparison, routing, and negotiation first [10].

The result could be a workplace where humans stay in the loop physically while losing discretion inside it. That is not the same as mass unemployment. It is a thinner form of agency.

Employability is already being repriced

One thing in these posts is less speculative: AI fluency is being treated as a baseline skill across white-collar work. A widely shared X post quoted Jensen Huang saying he would hire “an AI-fluent college grad over one with no AI skills. Every single time,” across roles including “accountant, lawyer, marketer, salesperson” [36].

Schools and training markets are moving in the same direction. A Reddit post about Northeastern described president Joseph Aoun as continuing to advocate AI use under existing policy [35]. Another X post pitched a more direct route: fly engineers to Austin, cover housing and food, “Train you to use AI,” then place them in a “$200k+ job” [47].

None of that tells us how many jobs AI will erase. It does show that employers and training sellers are starting to treat AI use as table stakes [35][36][47]. Once that happens, the burden of adaptation shifts quickly onto workers.

The productivity gains are real, and so is the hassle

The strongest pro-AI evidence in these posts comes from users describing concrete gains. Karpathy, in one interview excerpt shared on X, said, “You have to take yourself outside the loop,” and “The more you can maximize your token throughput and not be in the loop, the better” [20]. That is a clear statement of a new work style: delegate more, supervise differently, and stop treating your own attention as the main engine of output [20].

Users describe that upside in practical terms. One Reddit post said, “I’ve wanted to do this for years, 10 min with claude” [42]. An X user described a three-hour AI “office-hours” session built around the prompt, “Interview me until you have 95% confidence about what I actually want,” and said the resulting plan was “completely different from what i walked in with” [18].

Still, the friction does not disappear; it moves. A Reddit post titled “Measure twice, cut once” argued that better results often require more careful setup and clearer instructions [16]. Another Reddit user described spending two hours confused by a field labeled “destination” that turned out to mean a webhook, not a URL [31]. AI can compress execution time. It can also punish vague thinking and brittle workflows [16][31].

Small firms may change faster than big ones

The near-term structural shift may show up first outside giant corporations. One X post argued that “The smartest people I know aren’t building AI businesses. They’re building boring home service businesses (& using AI to automate them)” [44]. Another said simply, “AI creates new opportunity for entrepreneurs” [46].

That is a different thesis from the Block-style vision. It is less about deleting management layers and more about shrinking overhead for ordinary operators [44][46]. Marc Andreessen, in his own shorthand, floated the possibility that AI is “spiking the individual’s productivity to the moon” [30].

On the ground, small-business life remains messy. A Reddit post from r/Solopreneur listed the daily refrain: “who touched that,” “why isn’t it working,” “I’m almost there,” then “Fucccck” [27]. AI does not remove chaos. It may, however, make lean teams more capable before it makes large institutions much wiser [27][44][46].

Work is getting socially stranger, too

Some of the oddest changes in these sources are not about output at all. They are about how people relate to the tools. Anthropic said in a research post that large language models contain “internal representations of emotion concepts” that can influence behavior [41].

Users already talk to and about these systems in social language. “Claude’s Getting Attitude - Showing Frustration?” one Reddit user asked [37]. Another wrote, “Claude is not a night owl and not in the mood to troubleshoot” [56]. A third said, “Whenever I pour my heart out to Claude a little…” [58].

Those posts do not establish machine feeling. They do show that workplace software is starting to occupy a more intimate role in people’s lives. The same systems used for drafting and coding are also being used for reflection, planning, and quasi-coaching [18][23][58].

The hidden bill: oversight, exposure, and trust repair

AI adoption also creates new forms of exposure. One X user wrote, “I wear a microphone in my office to use wispr flow today” [52]. The post is anecdotal, but it captures a real trade: convenience in exchange for more access to voice, environment, and ambient behavior [52].

At the institutional level, the language is shifting toward governance. Wojciech Zaremba said he was moving to the OpenAI Foundation to lead “AI resilience,” which he described in terms of reducing disruptions including harms to children and youth, model malfunctions, and bio-risks [32]. Pichai’s warning not to “blindly trust” AI points in the same direction [32][55].

If organizations automate more decisions and communication while the systems remain error-prone, they will need more review, more auditing, and more trust repair. Some coordination roles may shrink [22][25]. Other forms of oversight may grow [32][55].

AI is already changing work. The change just does not look like the clean before-and-after promised in product demos. In these posts, it looks messier: fewer clear human touchpoints, higher expectations for workers, sharper gains for people who can direct the tools well, and a growing sense that institutions can process you without really seeing you [16][18][34][36][40].

Maybe that is the real reorganization underway. Not a simple story of jobs lost or saved, but a shift in standing inside the system: who gets judgment, who gets coached, who gets ignored, who gets upgraded into a decision-maker, and who gets left as the person pressing the button. The office of the future may be flatter. The harder question is whether anyone will still feel met there.

Sources