The tech jobs bust is real. Don't blame AI (yet) (economist.com)
113 points by andsoitis 16 hours ago | hide | past | favorite | 174 comments



Two big tech FAANG jobs; org is 95% H-1B engineers from China/India. Tons of resumes from American grads somehow never hit my desk; we continue to interview random candidates from India with some low-quality USA master's from Mississippi State. The candidate has spent the last year locked in a room memorizing algorithm interview questions.

I've seen two interns through to FTE jobs at a FAANG; both were at least second-generation immigrants (so citizens, I guess, though I never asked). So kids are still getting jobs, and they come from reasonable universities. I haven't seen UW resumes yet (my alma mater); for some reason those kids are all scooped up before I can take a look (I work at an office in Seattle, so I find it weird that we can't hire from the local university).

the weird thing is seeing American kids from a top-20 American school (Duke, Carnegie Mellon, etc.) mixed with a much larger majority of H-1Bs from absolute no-name schools. it's almost like the bar for Americans is higher

I guess it's different since I mostly deal with interns. But I haven't noticed a lot of H-1Bs being hired at my FAANG, at least ones that are obviously F-1s?

We aren’t an AI tech group or anything like that. I was on a Z (working) visa in China for 9 years though.

I knew an H1 in another group from an obviously second rate school in China and a masters from a no name in America. He was pretty successful in his career so I guess school brand doesn’t mean that much. Likewise when I was in China some of the PhDs from second rate Chinese universities also did very well.


Americans are more expensive. More pressure to justify yourself these days.

no they get paid the same

100% is

I think the issue could also be this:

- Tech or engineering company hires foreign resident engineer.

- That engineer works their way up in the company to a position where they are in control of hiring.

- They hire more people like them.

We used to accuse start-ups of hiring for cultural fit (people like them), which looked like sexism and ageism. Same thing, different group.


DSA itself may no longer be as indicative, but grinding is absolutely a key trait in good hires. That type of person will put in serious effort at work.

Yup. ATS has always been a problem. Even worse with AI

Why are we letting tech companies in the USA do this?


Because if we didn't we wouldn't have a tech industry?

The main problem with immigrant talent in computer field is that legislators don't understand the difference between IT and Tech product development jobs. IT jobs don't need immigrant talent, so companies like Accenture, Infosys etc. should not be given H-1B visas. But tech companies like Google, Meta, Apple, OpenAI etc. absolutely need immigrant talent, or they will lose to Chinese competitors.


That is absolutely not true. There are plenty of Americans who can do these jobs.

> There are plenty of Americans who can do these jobs.

This thinking is wrong. For IT jobs, the work is pre-defined and you go find people who can do the job. For product development this is sometimes the case, but for truly innovative products, such as AI models, this is not the case. You have to hire the best in the world and give them the resources they need, as opposed to defining the project upfront and hiring people "who can do the job".


"Truly innovative".

I wrote my first neural net in the late 90s. Based on nothing but an old geocities post some rando put up about training a model to only unlock a pet door for their cat.

I implemented the same and it worked.

Where you see true innovation I see run of the mill. OpenAI, Google, etc. are propping up the data-center rental business they've come to rely on, titillating biology with whatever spaghetti sticks. That's it.

The interesting science isn't happening anywhere close to big tech.

The mathematics of LLMs exists in textbooks from 1950s. Your entire comment chain here is little more than reciting propaganda.


> I wrote my first neural net in the late 90s

Why aren't we using it, if that's all the world needs?


Oh this game, huh?

If your powers of analysis were worth anything you would be running the world not copy-pasting hype straight from CNBC and Big Tech PR

As you are not running the world your analysis is worthless. As such I say; good day, sir.


Why is it important that Google (or any of these large companies) only hire Americans for their jobs in the first place? They are global companies now, they make money from everywhere. Why is the insular "Americans only" idea worthy of consideration at all?

How many “best in the world” people are we talking about, though? Based on what I’ve seen that’s a very small percentage (maybe 5%) while the rest were being hired by companies who valued having workers with limited negotiating power.

(I’m not opposed to immigration at all but it was transparent how for decades the industry resisted any change which would make it easier for a skilled H1-B worker to take a better job)


> having workers with limited negotiating power

Sorry, this is 100% false. Companies like Google, Microsoft, Apple, Meta etc. do not hire H-1Bs in order to depress wages. That does happen, but not at these companies. It typically happens when hiring IT workers.


If it's "100% false", I'd think you could have addressed the point. Do you think that H1-Bs have had the same negotiating power as permanent residents and citizens? Do you think that companies, especially the huge contractors and enterprise vendors who hired so many of them, did not exploit them?

I’m not saying that there aren’t people who really lived up to the idea that the best in the world were coming here—I’ve known a few of them myself—but that there were a much greater number of people who were not in that class and it wasn’t exactly a secret that their managers knew they could be imposed on more than their equivalently-skilled colleagues.


The mere presence of H1-B workers depresses wages systemically.

In the early 1990s a good software engineer was paid $40K starting salary, and good companies like Sun Microsystems paid $45K. If you adjust that for inflation it is around $100K. But good companies in silicon valley today pay $120K plus stock grants (so around $170K or so), and Meta and Google pay much more.

So software engineer salaries have gone up dramatically in the 35 years the H-1B visa has been around. In fact, the H-1B visa is the reason the salaries have gone up. Without it the industry would be stagnant, just like non-tech S&P 500 companies and most companies in Europe over the same period.
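The inflation adjustment in that comparison can be sanity-checked with a quick compound-growth calculation. The ~2.6% average annual CPI rise and the 33-year span below are assumptions for illustration; the exact figure depends on the start year and the price index used:

```python
def adjust_for_inflation(amount, annual_rate, years):
    """Compound a past salary forward at an assumed average inflation rate."""
    return amount * (1 + annual_rate) ** years

# $40K in the early 1990s, compounded over ~33 years at an assumed 2.6% CPI rise
today = adjust_for_inflation(40_000, 0.026, 33)
```

This lands in the low-to-mid $90Ks, i.e. roughly the $100K figure cited above, so the comparison with today's $120K-plus-stock packages holds under these assumptions.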


Are you trying to argue that increased supply of labor is responsible for increasing wages?

As others have said, H1-B has been good for companies, and bad for American workers. The same companies who were found to be colluding to keep wages down.

Europe is stagnant because of regulation, not because of immigration.


> Are you trying to argue that increased supply of labor is responsible for increasing wages?

I am saying the reason silicon valley exists is because of the immigration of the smartest people from around the world. High Tech needs the best in the world, not the best in the US.

Consider the seminal research paper that kicked off the AI revolution (titled "Attention is all you need"). It was written by 2 Indians, 1 German, 1 British Canadian, 1 Pole, 1 Ukrainian, and 2 US born people. These people came to America, worked together and changed the world as we know it. Why would we want to stop it? Has this immigration been bad for American workers? Far from it. These immigrants are the lifeblood of the tech industry, without them the center of tech would be Beijing.


I think they are making a meta argument?

That H1B labour allowed other firms to build tech, which kept those firms competitive, creating a deeper economy and experienced bench.

That depth then enabled more advanced tech firms to be born.

At least that's what I think they are saying.

The analogy would be that China took over low tech manufacturing, and then because of that were able to develop expertise to move up the value chain.

At the same time, supply demand curves are real. If you have more workers, it should result in competition that drives down wages. (ALL THINGS BEING EQUAL)

There was a distinction being made between Tech and IT, which I am not too sure about.
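The "all things being equal" caveat above is just the textbook linear supply-and-demand model. A toy sketch (coefficients entirely made up for illustration) shows that an outward labor-supply shift, holding demand fixed, lowers the equilibrium wage while raising employment:

```python
def equilibrium(a, b, c, d):
    """Solve w_demand = a - b*q against w_supply = c + d*q."""
    q = (a - c) / (b + d)   # quantity where the two curves cross
    return q, a - b * q     # (employment, wage)

q0, w0 = equilibrium(a=200, b=1.0, c=50, d=0.5)  # baseline labor supply
q1, w1 = equilibrium(a=200, b=1.0, c=30, d=0.5)  # more workers: supply shifts out

# with demand fixed, the wage falls (w1 < w0) and employment rises (q1 > q0)
```

Whether "all else equal" ever actually holds, as the comments below argue, is exactly where this model stops being useful.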


> If you have more workers, it should result in competition that drives down wages. (ALL THINGS BEING EQUAL)

Sort of a meaningless statement when all things are definitely not equal.

If there are 5 million people in a country, or 200 million, the too-many-workers theory implies the 5-million-person country should be paying everyone vastly more.

But that is trivially untrue.

Economies grow, shrink, and adapt around the number of people.


you fundamentally don't understand economics

These companies were all guilty of wage collusion years ago. Their culture has not gotten any better about this since.

I'm not opposed to hiring the cream of the crop using H1Bs, but that would only be a few thousand people a year. The vast majority of H1Bs though are people taking jobs that Americans can definitely do.

Not sure how you came to draw this conclusion, as there's lots of data out there showing droves of Computer Science graduates here in the states unable to land jobs.

I think that data captures the fact that there are more people being handed degrees without an education than there are jobs. Especially when there are thousands of mid career people on the market right now.

We had a tech industry prior to H-1Bs, before the 90s. What we didn't have was Silicon Valley corporatism that values neither American labor nor American education. It's why SV is so gung-ho on charter schools and on devaluing American labor.

Let's not act like we need to import 80k "high tech" workers whose work amounts to writing React components and Spring endpoints.

Hardly any of it is so hard that we couldn't have companies train workers to do it, but they don't ever want to help people; they just want to suck up all the money in the room while decimating entire populations.

Also, as an American I don't really benefit if US corporations are doing "better." How does that help the person that can't pay for healthcare or afford to go to school, but they sure can get their serving of Zuckerberg slop? I'm supposed to care about these companies success? Really? I hope they go down in flames.

The problem is that the rich and elite have captured and dictated American tech policy for far too long.


Do you share the same thoughts for blue collar immigrants?

No, because blue collar immigrants aren't working for cartels that hold deeply undemocratic views.

It is interesting to see the different views on immigration. Here in the UK, leading up to the Brexit vote, everyone said blue-collar workers were the problem, because they depressed wages for the poor and made the middle class richer, since they could build cheaper houses, pick cheaper crops, etc.

In Singapore, the rage is mostly against higher earner immigrants, because they take all the good jobs, making the middle class in Singapore poorer.

I'm sensing a bit of a mix in your US centric argument.

All in all, a lot of people just hate immigration, always have, always will. It is a topic as old as time.


> How does that help the person that can't pay for healthcare or afford to go to school

How would you like to make t-shirts for rich Chinese, for $5 an hour? There is a reason Americans are not doing that. It is because we are smarter than the rest of the world. How do you think that happened? Were all the smart people born here? Nope. It is because smart people born around the world immigrated here. The prosperity they bring doesn't only help high tech workers, it feeds the economy, so everyone benefits.


I mean, we aren't doing it because capitalists decided they would rather move the factories out of the country, because they don't care about workers.

Americans are absolutely willing to work in factories, but capitalists want chattel slave labor instead.

Your view of history is farcical, acting as if American workers had any real say in their country's industrial capacity, rather than a few thousand people deciding to inflict mass poverty on tens of millions of Americans.


We would absolutely have a tech industry. The richest people on the planet, however, would make slightly less money. It is not an exaggeration to say this is what the entirety of American society is based on right now.

> But tech companies like Google, Meta, Apple, OpenAI etc. absolutely need immigrant talent, or they will lose to Chinese competitors.

Let them lose.

Google and the rest do not prop up humanity. They prop up a financial engineering Ponzi scheme.

You're just parroting media and social tropes you grew up with.

We could instill in our children social truths about other forms of economics: healthcare as a tent pole rather than stock valuations, for example. That still requires technology and jobs, and we don't remain the last modern economy on the planet without universal healthcare. We're losing to Russia and China on healthcare.

But thankfully we win when the metric is phallic rockets to nowhere and Google search uptime?

You should consider your economic benchmarks and their provenance: a bunch of self-selecting biological organisms that we socially describe as billionaires have convinced you, via their fear mongering, that if we don't give them all the power a giant foot will step on us.


Doesn't sound accurate. Please list some sources here that are credible that explain your point. Just sounds like dismissive bias.

To do what, attract talent from the rest of the world?

Because we live in a techno-oligarchy now? Because the leaders of the top tech companies (by revenue) literally sat behind the President at his inauguration?

Because they can afford to buy the 'right' to do what they want, and you can't, and what they want is cheaper labor who they have more control over, and H1B workers will never rock the boat because the visa is a sword hanging over their heads.

Downvote all you want, it's the truth.


[flagged]


You can’t just write off everyone downvoting you for being conspiratorial as government plants lol

honestly i have no idea. in some cases they are working weekends / are hyper-focused on extremely boring, somewhat manual work. some of the systems are complicated and break constantly, so they are almost just oncall fodder for manually fixing a constantly breaking high-scale service

Will these kinds of jobs survive Claude code? It sounds like exactly the job that would be easiest to automate.

And who gets paged when the Claude integration fails?

Why would we stop them? Labor is a free market.

It generates economic activity and taxes in the US and suppresses wages.

Most of the H1 candidates are in shitty roles that are well defined low/moderate skill jobs for giant companies. Hire people whom you can’t actively exploit and those are the kind of jobs where unions can organize.

The alternative is offshoring the work, not hiring Americans.

The smart thing would be to just let people immigrate. Instead we have a weird tiered system with a small number of highly skilled specialists and an army of serfs facing deportation if they piss off the bosses.


Then offshore the work. Americans aren't getting the jobs anyway and the imported labor now competes for things like groceries, gas, housing, etc. which drives up prices.


Keep up with that rhetoric. I'm sure it will go well for you come election season.

"listless rednecks"

Yea, keep that same energy and see how it plays out.


Just so people know since the user above wanted to edit their comment away to hide how they really see things. It was this:

"They also pay the social security and Medicare taxes so listless rednecks can collect SSDI for being unable to work."


Look, I can make a solid economic argument against offshoring and how certain business practices hollow out local economies.

However, immigrants are a net increase in investment and GDP. Yes, terms and conditions apply (it's economics; when do they not?).

Immigrants have to pay rent and buy clothes and groceries wherever they live. This creates demand, which deepens the consumption economy. These are positive growth levers, regardless of whatever work they do in that region.

In contrast, asset prices like house prices rising because they have become stores of wealth is a different deal altogether. In that situation house owners benefit from just holding onto property, not even renting it out; the asset appreciates all the same.

The issue which can be brought up is wage depression, and paying immigrants under the table. This should depress wages for American labour.

One solution for this is to increase minimum wage, and to ensure that everyone is paid minimum wage.

This is a simplified model of the situation, but in general immigrants put more into the system than they take out.


> These are positive growth levers.

These are pointless growth levers.


FYI, these jobs pay the highest in the world. If these jobs are exploitative, then so are other non tech jobs that employ citizens and pay lower wages.

My former organization employed ~750 contractors developing software.

Their billable rates ranged from $44-76/hr in 2022. The people in the cafeteria probably made more. They get minimum viable salary like indentured workers in hopes of getting a green card and more opportunity.


Offshoring isn't a given. It's simply permitted.

Without an archive link I think 90% of these comments will be in response to the title only.


I'd happily provide one but I've had enough of being repeatedly trashed and denigrated here for posting too many archive links.

You could look at your denigrators and decide: "fuck you". Internet points are strangely attractive but not vital. You can always post with another account with a bit of effort too.

I understand that you are pissed off (as am I), but debating with an army of bots, LLMs, wankers and Russians is unfortunately the status quo, quotidian.

On the bright side there are lots of lovely folk hereabouts with a large thing between their ears.


> You could look at your denigrators and decide: "fuck you"

That escalated quickly and I enjoyed this comment a lot


Happy to oblige.

It is interesting that sites like HN have existed for decades and still don't have any real solution for this sort of problem. Is providing a way for us to all bypass paying for this content that cost money to produce actually the most desirable outcome?

Like imagine if there was some 90 minute tech documentary on Netflix that was worth discussing here. Could I just rip it and link to a copy on my Google Drive? How long would that link stay up? I can't imagine long. I guess the conclusion based off how these sites operate is that piracy doesn't count when it's just words.


> It is interesting that sites like HN have existed for decades and still don't have any real solution for this sort of problem.

Heck, I find it odd that we don't even try to model the problem, let alone solve it.

For example, there's no submitter tickbox for "this requires additional access", no icon to distinguish between the kinds of items, and we don't even have an informal convention like putting [paywall] in the title.

Without even tracking that information, it's very difficult to address the pain-points, such as allowing the submitter to preemptively provide alternate links, or allowing readers to select against certain sites they've already decided they won't subscribe to.



That first link is not relevant to the point of my comment. I was not complaining about paywalls. The comment also doesn't address whether paywall bypasses would be acceptable for non-text links.

Regarding the second link, I'll happily engage with something specific dang said on this topic if you want to link to it, but a link to every time he said the word "paywalls" is not a productive contribution to this conversation.


I have only ever gotten thanks and net positive points for posting archive links.

Looking at your karma, I think you can afford it. ;)

It's dead Jim.


It doesn't work.

> You’ve just missed out—free access to this article has expired. Register to view


The gift that doesn't keep on giving

Given that it's paywalled, there's a good reason for that. (The gift link below no longer works.)

Posts with links to paywalled articles should be identified somehow. (Flashing red title??)

I would prefer to just skip them, but they are not easy to spot.


Software devs lost their pricing power due to LLMs but not exactly how most people think.

What's missed is the value of answering "how exactly does this functionality work for this specific case?" or "can we implement this tiny one-off feature in some legacy code base?" Both are why you keep the guy that wrote it around. And you couldn't really replace him, because digging into what he wrote was hard.

Now, LLMs can do that stuff better than the guy that wrote it.

Software devs were non-fungible. Now they're commodities. When things become commodities, they lose their value.

I'm not sure why I haven't heard people talk about this aspect. It's the biggest effect on jobs.


While this is true to an extent, oftentimes the important context is not in the code but in the head of the writer. The code is just the fence in the Chesterton’s Fence analogy. And that is still non-fungible and will (presumably) forever be.

> There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”


You're correct that it doesn't answer the why.

But it answers the what, how, and allows one-off features.

So the guy that wrote it might (or might not) still have the edge on the why. But that's not the moat it used to be.


In big companies, the why is 80% of the work. I could swear actual dev work is less than 20% of a “developer’s” job at a standard large (non-SV/FAANG/tech-first) company. The rest is holding a lot of really weird organization-specific context in your head to make the right decision.

With my own tiny company, I used to answer questions about my code to support. Supporting the support. I remember doing that when working at big companies too.

Now, my support asks claude about the codebase to answer those sorts of questions. He's better than my memory.

Maybe we've had different experiences.


I love it. And no doubt, the smaller your company, the more profound the impact of “agentic” tooling can be.

But I am skeptical that people who have not seen a complex corporate environment understand how different it is.


> I could swear actual dev work is less than 20% of a “developer’s” job at a standard large (non-SV/FAANG/tech-first) company.

In banks it's more like 10%, at least in mine.


Software devs were always fungible up to p99.9

I have yet to see this pan out in the enterprise. Enterprises are full of mini kingdoms built by VP+ leaders with the tools they prefer or were sold on. Many of these tools are inherently, and sometimes by design, cumbersome, expensive, and not onboarding-friendly. LLMs haven't breached this domain, and this domain empirically is 80% of enterprise software. I have yet to see direct examples of LLM agents replacing, say, 4 engineers out of an existing 8-person team.

Feels early to blame AI for most of this. A lot of it still looks like a reset from 2021-era hiring assumptions rather than actual labor replacement

Over-hiring is one thing, but that wouldn't be a problem if there were an endless stream of value-creating projects to take on.

So the issue is not necessarily the over-hiring; it's more that the large tech firms are running out of value-creating projects to take on. Which is not surprising: the labour market in its current state is absolutely not perfect at allocating labour.

It should be noted that fixing tech debt is not necessarily value-creating from a financial standpoint. What engineers think is value-creating has nothing to do with what a CFO, whose job is to maximize firm value, determines to be a value-creating project.


I don't think it's that simple. The 2021 over-hiring was a weird time where hiring as many people as possible became a goal in itself.

There were companies growing organizations faster than they could be made productive. They acquired a lot of bureaucracy, excess structure, and inefficiency. The current backlash is trying to reverse the excess structure and middle management buildout so they can get back to more functional teams.


Glad I was overemployed working three full-time jobs (remote W2) during that period. I stacked A LOT of cash during those years that has appreciated nicely.

People like you are why several companies I know of have stopped hiring remote: too many people trying the overemployed schtick and making remote employees synonymous with people who are constantly distracted and divided.

Thanks a lot for ruining it for all of us.


How does this show up in background checks? Genuinely curious

It doesn't. I have one of these jobs on my resume and I simply don't mention the others. Who would know if I don't tell them? The IRS knows I had 3 W2s, but they don't know the dates of employment. It does so happen all three companies used the same large payroll management provider, so that company knows, but it would be pretty bad to share customer data, wouldn't it?

Hiring manager here. There are actually new tools from the background check and payroll providers to detect some of these past situations.

> It does so happen all three companies used the same large payroll management provider, so that company knows, but it would be pretty bad to share customer data wouldn’t it?

You may be in for a surprise.


The "value" creation is tied to the interest rate.

When interest rates are low, even low value creation projects are viable.

When rates are high, those exact same projects are no longer viable.

Therefore, I would argue that the labour market is not perfectly allocating labour, but it is close enough for practical purposes.
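The rate sensitivity described above is just discounted cash flow: the same project flips from positive to negative net present value as the discount rate rises. A minimal sketch with invented numbers (the $100K cost and $12K/year payoff are purely illustrative):

```python
def npv(rate, cashflows):
    """Net present value of a cashflow series, one flow per period."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# hypothetical project: $100K upfront, then $12K/year for 10 years
flows = [-100_000] + [12_000] * 10

low = npv(0.02, flows)   # positive at a 2% discount rate: project is viable
high = npv(0.08, flows)  # negative at 8%: the same project no longer clears the bar
```

Nothing about the project's cash flows changed between the two lines; only the rate used to discount them did, which is the point being made about low-value projects under low rates.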


Value creation is more of a function of cash flow potential on a project than the risk-free rate, cost of equity or cost of debt.

"labour market is not perfectly allocating labour, but it is close enough for practical purposes."

No it's not, lmao. Do you even know what characteristics a 'perfect labour market' comprises? Go on, surprise me.


Isn’t it both - ROI must exceed whatever cost benchmark?

"..is more of a function of cash flow potential "

"more of".

If you wanna finesse the discount rate by a few percentage points go ahead. Cash flows contribute more toward the end value number.

in b4 some numpty writes about the fed changing a market set rate.


Don't conflate businesses that have tech organizations in them, with actual technology companies. They are not the same thing. Technology companies exist as investment vehicles for R&D. There are endless streams of projects when your whole existence is structured around finding endless streams of projects.

Market saturation for tech products + new competition from vibe coded startups moving into mature enterprise spaces.

The rate of non-tech business growth has slowed, who is going to continue to buy all these cloud software services? Tricking consumers into subscribing to AI tools or extra storage only goes so far.


> Over hiring is one thing.. but that wouldnt be a problem if there was an endless stream of projects to take that are value creating.

I very much agree.

A lot of tech job growth during the late 2010s and the pandemic period was frankly BS from an ROI perspective. The late 2010s was really the first time in tech that I started to feel like most of the stuff that needed to be built had been built, and increasingly I was working on BS projects offering less and less value every year.

Consider:

- In the 80s developers were needed to write fundamental business software for word processing and spreadsheets

- In the 90s computers became mainstream and there was a huge demand for consumer software

- In the 00s the internet took off and we needed people to build the web

- In the 10s the smartphone revolutionised computing and we needed people to build apps and rebuild websites to be mobile-first

But towards the late 10s entrepreneurs and investors seemingly ran out of no-brainer tech investments, so they increasingly started trying mental stuff still promising tech-like returns: blockchain, metaverse, Web 3.0, [insert traditional industry here] but a tech company.

I'm not saying there's nothing to build or maintain anymore, but I also no longer see where people think the exponential need for new software and software developers could come from, and I suspect this would have become obvious earlier if it wasn't for ZIRP.

But it's not just a lack of productive things to build; other trends are hurting demand for new SWEs today too. Consider how completely non-technical people can now start and scale an ecommerce company without any developers. Things that would have taken armies of developers just 10-15 years ago can now largely be done in an afternoon on platforms like Shopify. It's actually hard to believe that just 15 years ago selling things online was very hard if you weren't technical.

Similarly, starting in the early 2010s even being a developer got significantly easier, because increasingly there were packages for everything. Things I might have spent weeks building before could now be built in days or less. Another thing that changed was sites like Stack Overflow and blogs which help you solve problems and learn new skills. I remember that trying to learn how to do things before the 2010s was hard, and before the 00s it was very hard.

And of course now we also have AI coding tools which don't just hurt the overall demand for developers, but effectively expands the supply of developers to anyone with an internet connection and computer.

So to summarise:

- There are far fewer good investments to be made in new software today.

- Where there are investments to be made, you need far fewer developers.

- When you need developers, there are far more people who can do the job.

Even if tech companies are doing well and the number of tech jobs is increasing, the above means the average person trying to find a job in tech today will find it much, much harder than they would have in the past. People working in tech today should genuinely consider a career change if they're primarily in tech for the money.


Isn't the need right now the need for more intelligent, context-aware software?

I mean, I feel like we've barely touched on what applications AI could be used in, that we have only just begun to start developing. It's tech that can be used in more advanced robotics, video games (adaptive NPCs, PCG, narratives), better data analysis, search, and more.

I understand the fears and dislikes around AI at the moment, but I think people need to understand that it can never fully replace human labor. At least, I am not concerned about that aspect in the slightest. To me, the real thing to fear isn't so much the tech itself, but our fellow man, and the ways they might possibly use it against others.


Yeah it feels like a lot of this could still be shakeout from FAANG and FAANG wannabes hiring 10x more engineers than needed. I realize Square is a more complicated business than OnlyFans. But OnlyFans isn't a trivial app and it runs on 42 employees. Block had more than 10k employees pre-layoff.

42 employees and a legion of contractors.

How many is a legion?

No, this is 100% AI this time.

These companies need liquid cash to invest in AI infrastructure, and so they're doing mass layoffs to give them the capital to invest. Simple as that.

2022-2024 was COVID reset. 2025-2027? is reallocation of spending from R&D to infrastructure.


We still haven’t fully corrected from ZIRP either

I don’t think that sufficiently explains it either. We all want a nice little bow on a simple, easy target. But there isn’t one.

Block, for example, said that they had experienced record profits… in spite of their, “hiring spree.”

This is just capitalism working as usual, only more of it, faster.

AI has a part in it. So does austerity policies and the rise of a certain political climate.


You'd think if the market were rational that Block would be punished for firing that many people, and obviously lying about why, but we haven't reached that point yet I guess.

I work for a fairly big tech company. For the last 2 years, we've basically stopped hiring people in the US. Only hiring engineers in the India office.

Can anyone provide historical data for "job busts" or other types of declines in tech employment, massive layoffs or hire freezes? I seem to read about something like this every few years. Would like some data to see if this trend is stronger than the previous ones or not.

Here's some historical data for employment in category NAICS 5415: https://fred.stlouisfed.org/series/IPUMN5415W200000000. NAICS 5415 definitely isn't a complete picture of tech employment, but it should give a rough correlation, I would hope.

Layoffs.fyi maybe? Not sure if that’s the data you are asking for

I just put your question into GPT-5 Pro

developer jobs are up: postings ~11% YoY, tech workforce +1.9% projected for 2026, BLS 15% long-term growth. Rebound from 2024-25 lows

Jevons paradox


Job postings are not jobs. A company can lay off 10k and post jobs for 1k, and job postings will be up.

Projections are useless. The past does not predict the future.

BLS is useless for the same reason, amplified.


Maybe if you’re at the point where you dismiss three datapoints that all disagree with your thesis without any real argument you should rethink your thesis.

I think a big part is that the hiring market for juniors feels apocalyptic rn. Jobs may be up in aggregate but that doesn’t necessarily mean everyone’s feeling it equally.

Maybe a case of really high profile companies doing RIFs while medium and smaller companies keep hiring?

Reversion to the mean?

One of my mentors at my company was made redundant exactly a year ago. He is still looking for his next role. I really feel bad for him.

Post dot-com, a LOT of people basically left the industry. They may have continued on in somewhat adjacent roles but people can only afford to keep looking for the sort of employment they had for so long.

Where did they go?

I'm sure you'll find a variety of answers. Often mid-management jobs in possibly tangentially related industries. Basically retired early, or pursued a hobby business, if they could afford to do so. In my case an industry analyst job for about ten years until I went back to marketing for a software company.

So I want to talk about the Claude Mythos mega-model that had a real marketing push in the last week. Marcus Hutchins (of WannaCry fame) posted about the economics of this [1].

The upshot here is that this was a longstanding BSD bug, mostly because nobody is paying BSD bug bounties, and a null-pointer dereference may induce a crash but rarely (if ever) leads to privilege escalation the way buffer overflows generally do, so it isn't as high-priority. The estimate on the cost is $20-50k of tokens.

$50k gets you a lot of developer time, upwards of 2 months. As has been stated in a bunch of threads, much smaller models could also find this bug. The defense is "they didn't" and "once you know it's there, it's easier to find" but again, nobody has been paying BSD bug bounties. Put another way: far fewer people have been looking.

My point is that the economics of these models are still highly debatable. What they will do is allow you to use 1 engineer with AI tools instead of 2-4 engineers.

And we've arrived at the real goal here: to reduce head counts. Why? To suppress labor costs. To get people who didn't get laid off to do more free work.

In addition, AI capex is used as an excuse to cut costs, i.e. reduce labor, for exactly the same reasons.

Lastly, the threat of this happening in the near to medium future is also used to scare labor and reduce costs.

Sufficiently large companies can't grow anymore. Their only path forward is to raise prices and reduce costs. For tech companies, labor is a significant cost. That's what's going on here.

[1]: https://www.tiktok.com/@itsmarcushutchins/video/762774007353...


> What they will do is allow you to use 1 engineer with AI tools instead of 2-4 engineers.

In a competitive market this doesn’t work exactly that way because you will then lose to your competitor who has 2-4 engineer + AI.


> $50k gets you a lot of developer time, upwards of 2 months.

God, I wish I was American sometimes...


Right up until you lose that job and get sick.

You have to consider all the costs of employing someone, not just direct compensation: benefits, health insurance, taxes (e.g. employer payroll taxes), office space, equipment, various insurances, utilities, costs related to just having that person (e.g. HR), and so on. So even for a $100k salary, which is pretty modest for a US-based engineer, the additional costs may well add another $50-100k.
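As a rough sketch of that fully-loaded math (the overhead percentages here are illustrative assumptions, not actual figures for any employer):

```python
# Rough fully-loaded cost of a US employee. All overhead rates are
# illustrative assumptions; real numbers vary by state, benefits plan, etc.
def fully_loaded_cost(salary, payroll_tax=0.0765, benefits=0.25, overhead=0.20):
    """Salary plus employer payroll tax, benefits, and office/HR/equipment overhead."""
    return salary * (1 + payroll_tax + benefits + overhead)

print(fully_loaded_cost(100_000))  # ~$153k all-in for a $100k salary
```

With heavier benefits or pricier office space the multiplier climbs toward 2x, which is where the "$50-100k on top" range comes from.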

You’re missing the cost curve of inference. What costs $20-50k now will cost $2-5k in a year or two, at which point the math is a no-brainer. It makes a lot of sense to build products that almost work now, or are almost economical, and ride the trend line.

Why would it reduce ten fold? I think performance of the chips gets a little better but not that much better. The cost of the infra is probably going to go up if energy costs keep going up and presuming the US can keep getting cheap chips.

They have been decreasing exponentially so far [1], and the Vera Rubin generation of chips going live will be 35x more efficient in terms of inference per megawatt [2]. Even with rising prices, 10x is possibly conservative.

Maybe if demand is truly crazy the labs will take more margin

1. epoch.ai/data-insights/llm-inference-price-trends

2. http://hashrateindex.com/blog/nvidia-vera-rubin-nvl72-specs-...
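For a feel of how fast an exponential price decline compounds, here's a back-of-envelope sketch; the ~10x-over-two-years rate is the claim from upthread, assumed for illustration rather than taken from the linked sources:

```python
# Back-of-envelope: per-task inference cost under a steady exponential
# price decline. The decline rate is an assumption for illustration.
def projected_cost(cost_now, annual_decline_factor, years):
    """Cost after `years` if prices fall by `annual_decline_factor` per year."""
    return cost_now / (annual_decline_factor ** years)

rate = 10 ** 0.5  # ~10x cheaper over two years ≈ 3.16x per year
print(projected_cost(50_000, rate, 2))  # a $50k task → ~$5k in two years
```

Whether the historical curve actually continues is of course the whole debate.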


The problem with these models is that they are not reliable. You never know when the provider decides to nerf the model, and after a good week of making progress you find the model is now running in circles and you are burning tokens like you are operating a steam locomotive while standing still.

Tbf people aren't reliable either. Not that I'm defending the AI models but remember these long-standing bugs were introduced by people and not found by people for years.

Anytime anyone feels insecure that AI will genuinely do their job, I point them to creators like this [1]. Also, I find it so wild there are people, even here on HN, who don't admit that LLMs hallucinate, lie and forget.

I would have a hard time trusting any critical code produced by an LLM. This was a frequent complaint with OpenClaw (being Vibe coded) and the general advice was to not run it on any computer you're logged onto as you'll get hit by prompt injection attacks and the like.

Anyway, the models don't need to be reliable. They just need to be an excuse to fire people to suppress labor costs. That's the true use case.

[1]: https://www.tiktok.com/@huskistaken


> Tbf people aren't reliable either.

I've only seen one instance where a developer came to work still under the influence of drugs after the weekend. He was moonlighting as a DJ, and as he said, the afters was legendary. He was fired that day. Usually you get an odd Monday where the younger devs call off sick, or family/health leave with the older ones. But they never suddenly run in circles for weeks, seemingly not knowing what they're doing, after weeks of exemplary performance.


> What they will do is allow you to use 1 engineer with AI tools instead of 2-4 engineers.

Stop there. The costs of today's state-of-the-art hype-ware will be lower tomorrow.

In a year, there will be fewer easy-to-find bugs, and models like this will be cheaper.

Doing math based on hardware deployed today is a bit naive.

Assuming any of this is intentional about reducing devs is just dumb.


It is the perfect storm: companies overhired during COVID by 1000%.

Then COVID went away, life went back to "normal", and there were tons of juniors everywhere who had moved to IT because of WFH and the salary.

Then comes AI, within the IT field, I am seeing first hand companies firing by the thousands because of AI.

Only blue-collar work was affected in the past, but now it's everybody: no matter your degree or experience, you either already had a hard time or you will.

If Google Quantum computing somehow finds a massive breakthrough, we are all fu :)


I think a lot of people don't see the historical trend of where investment moves around. The 90s dot-com era was driven by massive investment in e-commerce (remember that term?) businesses. It created a bubble, which then burst.

When bubbles burst, investors change tack, and when investors pull back, companies turtle and do layoffs.

After Y2K, investment moved on to real estate. In 2008, that bubble burst. It moved on to social media next.

I would argue that bubble burst somewhat silently during COVID with the quiet deaths of Xitter and Facebook (from a standpoint of cultural relevance), and investment transitioned towards and into its current AI bubble.

Investors are always going to move on once the lustre of a field has waned, and I'd hazard that we'll see investment move somewhere other than software tech next (if I had to prognosticate, I'd say it's moving into robotics/ drones).


If inflation kicks off, it will go back into physical assets.

Would really appreciate an archive link to read this, even though it appears to be another meta-analysis wrapped in a “everything will be fine” narrative, judging from the limited previews I’ve been able to glean and The Economist’s general lean.

The problem with the tech layoffs is that it’s poisoning the well downstream. Smaller employers have repeatedly cited their layoffs as justification for abysmal, unlivable salaries, and demanding those of us looking for work suck it up and deal with it while they search for bottom-dollar unicorns.

AI isn’t replacing the IT crowd outside of the expected junior roles, and even that’s starting to rebound as executives realize Juniors were how they got “white glove service” for themselves - a Senior Engineer isn’t going to wipe their ass for them, job market or not, because said Engineer’s time is infinitely better spent on literally anything else.

One other thing I’ll note is that the layoffs also seem to be remnants of the Brogrammer hustle culture: tech folks were enjoying more time for themselves to grow or live life outside of work specifically thanks to a few good years of paying down technical debt with properly-staffed teams, but the grifters up top see anything less than a 9-9-6 as somehow stealing from the employer and slash accordingly. The remnants are expected to do more work for less pay, with AI tools somehow filling the gap (even though these same employers often don’t want to pay for proper tooling to maximize use of AI).

This is definitely an industry downturn as those who stand to gain maximize their immediate returns. From the perspective of the C-Suite and Boards, the safest (albeit unethical) move is to betray: if AI is a bust, they’ll have made their wealth and can fuck off; if AI eliminates jobs and work, they believe their wealth will protect them in the future dystopia of their own creation.

It’s in that context (“fuck you got mine”) that the broader narrative fits with the myriad of puzzle pieces out there (higher interest rates, stock pumps, circular financing, tariffs, aging population, AI, etc).


Would really appreciate if you made an archive link and shared it instead of complaining.

Says the person who came here specifically to complain instead of doing it themselves.

For the record, I tried, but the archive site only had the preview cached; I suspect it’s looking for a URL identifier to bypass the paywall that I lack access to, but I can’t confirm either way from a mobile device.


Click on one of the other previews that come up, last I looked there were 4 options, at least one had the full article

corps do that to make stocks go up, once the quarter ends, they start hiring again

AI + outsourcing are the major reasons for the tech job bust. But I think politicians won't care, as they are here for corporations, not for the people.

> politicians won't care

They don't care because nobody has a real gut-feel affinity for computer programmers or the work, the sort of feeling that is required to animate somebody to action. It's never been a profession with any esteem, and the field never professionalized in the past 60 years, which is a shame, because we now see the outcome.


Politicians would deeply care about a specialized highly paid workforce that was politically active, but what we have is mostly a do-nothing industry that has nearly 100% ceded to their corporate masters.

I mean look at what the coal mining industry is able to get and their numbers are quite small. There's more fast food workers at Wendy's than coal miners, but something tells me that Congress doesn't care about the baconator over rolling coal.

Start doing some advocacy work and this can easily change.


The only reason to blame AI is that it doesn't live up to promises, so corps are betting that scale will solve the problem. So datacenter after datacenter wants to be built in areas that can't supply them water/power.

And that money sink is causing the layoffs. If AI really helped like it's expected to, you would grab any dev you could so that you could have an army of 100x devs.

Humans are still cheaper for the real costs. Augmented humans are on average a few percentage points better.


The same thing was said about the crazy build-out of fiber and telecommunications infrastructure. That infrastructure did prove useful, but it took about 10-20 years before that was the case. It took 4G becoming broadly available and the ensuing increase in mobile devices to use at least some of the overbuilt network capacity.

While I get this line of thought, the major difference is that the internal hardware will age much faster than the core infrastructure. The real question is how much of these build-out costs is the center itself versus the hardware within it.

That fiber build out will last for decades, a Blackwell GPU, not so much.


I could be wrong but it seems like in the case of a crash no one will be buying new GPUs and thus the existing ones could hold their value longer. Of course that value will no longer be massively inflated by bubble FOMO.

>in the case of a crash no one will be buying new GPUs and thus the existing ones could hold their value longer.

No, because no one has any use for those monstrous GPUs outside of ML and some research projects. They can't even be dropped onto the consumer market because a SOHO is not equipped to house devices like that. The best case scenario is that the boards get dismantled and the VRAM gets salvaged for refurbishing. They've built these machines so specialized that they're essentially disposable.


There's a baseline paid demand for AI inference that can fully occupy today's GPUs (even after a crash) so there's no need to sell or scrap them.

What are you basing that on? Some of the demand that currently exists, exists because of all the money sloshing around the AI ecosystem (i.e. people using AI to sell AI solutions to other people), so how are you so sure demand can fully utilize all existing compute even after a crash?

It isn’t about holding value, the cards are going to burn up. If they don’t, in 5 years one could run a rack of 4 cards at home at an affordable rate. Either the cards become affordable again and the datacenter is useless, or they don’t, and nobody can fucking afford to rent them.

GPUs definitely have higher failure rates than CPUs but I'm not sure what the absolute rates will turn out to be. If 10% of GPUs die within 5 years that's very high but also probably economically fine. If 50% die that's a disaster.
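A toy model of why that failure-rate spread matters so much economically; fleet size and unit cost are assumed numbers for illustration only:

```python
# Toy model: capital written off to GPU failures over a payback window.
# Fleet size, unit cost, and failure rates are illustrative assumptions.
def value_lost(fleet_size, unit_cost, failure_rate):
    """Capital lost if `failure_rate` of the fleet dies before payback."""
    return fleet_size * unit_cost * failure_rate

fleet, cost = 100_000, 30_000  # 100k GPUs at $30k each (assumed)
print(value_lost(fleet, cost, 0.10))  # ~$300M written off at 10% failures
print(value_lost(fleet, cost, 0.50))  # ~$1.5B written off at 50% failures
```

At 10% the losses are painful but absorbable against a $3B fleet; at 50% the depreciation schedule, and the business case built on it, falls apart.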

Sorry, I meant at some point the current cards in the data centers will be obsolete, financially. They’ll be sold on the secondary market. Buying 8 H200s at $150 a pop will either be a real thing, or they all burn up and capex explodes again, which would be a death knell.

Either way, the moat is about 7 inches wide.


Chalk and cheese. Wavelength Division Multiplexing took out Global Crossing, Worldcom, etc…

But the secondary market that grew out of it existed because once fibre is in the ground it has a long lifespan and low upkeep costs; that is not the same thing as ultra-high-power-density data centers.

Cooling needs to be balanced with demand, they may not work for even cloud scale type loads without serious issues etc…

Not that it matters; my hometown has an announced DC, and it is looking more and more like it is a sham, as do several of the others in the area.


The difference is that fibre is infrastructure, LLMs are an application. Who knows, maybe they will pass and leave behind that server infra, and that's where our digital consciousnesses will live once our bodies die.

> The difference is that fibre is infrastructure, LLMs are an application

When I zoom out, I see “token generation” as an infrastructural layer, with applications built upon it.


> If AI really helped like it's expected to, you would grab any dev you could so that you could have an army of 100x devs.

This seems maybe a bit reductionist.

AI will have diminishing returns because at a certain point, coding is not the bottleneck and coordination is or some other thing that hasn’t been optimized yet. The exact bottleneck seems like it depends on the organization.

My theory is that in general, augmented devs are much more productive, but 100% of that gain doesn’t translate into 100% more software delivered to customers, and there is a point where coding isn’t the longest pole.

But I don’t think most orgs are at that break even yet, and I think we can still get more out of engineering before we plateau.


What you say is accurate. Just remember that generating code is not the only way for an engineer to amplify themselves.

"America’s Census Bureau estimates that just 28% of firms in the San Francisco metropolitan area use AI regularly as part of their day-to-day operations. In America as a whole, adoption is much lower."

Nothing happens in a vacuum. "Tech companies" act in a coordinated fashion despite claiming they don't. Remote work was great until it wasn't. This is a coordinated action to shed all the mass gathered during ZIRP. Market forces are one naive explanation, but as history has shown, factors extraneous to market forces are at work. This period will be written about, analyzed, and diced in a thousand different ways by future "thought leaders".

> Don't blame AI

Some may blame AI, but the opposite is also true: The believers use the bust as "proof" that AI coding "works".


When anyone asks the question: "What is AGI", it is actually this. An "abundance" of nothing else but this.

It's just that the tech workers are the canaries in the coal mines for the other white collar knowledge workers.

This is "AGI".


Meanwhile... I had to step in and hand-code lots of CSS today because Copilot couldn't do the thing. And I had to step in and manually fix tests yesterday because Copilot couldn't do the thing.

Are you paid by OpenAI and/or Anthropic?


Have you considered not using copilot and using Claude Code or Codex directly?

I'm using Codex's latest model via Copilot.

Is this where you say I'm holding it wrong? Is it so hard to admit these AI tools aren't as good as they are hyped up to be?


> Are you paid by OpenAI and/or Anthropic?

No. The VCs and angel investors screaming “abundance” are the paid promoters.

The problem is their “utopia of abundance” is not for us. They know the opposite is the reality (layoffs, offshoring, wage suppression and AI backlash)

They built their own bunkers and moats for a reason. Because true “AGI” will bring an abundance of very angry people going after them.

That is not worth being paid for by any AI lab.


Another reason is that wages don't keep up with the cost of living. For instance, in the UK it makes little sense to be in IT unless it's something boring and typing React boilerplate feels easier to you than stacking shelves or running deliveries.

Oh no, we are all going to die, again. How many times already?

Bad news gets sold better. Even better hysteria. Why not write a hysterical article to milk some ads money on doom and gloom? Because we value your privacy, we use cookies and similar technologies. Some collect your data, like your IP address. Others collect anonymous data. Together with our 177 (holy fuck) trusted partners.


Reciprocal tariffs have put the non-tech and tech economy in stasis (except for AI hardware). They are also better than tax breaks and will supercharge bottom lines for large corporations once reclaimed, and if prices remain high.

Also, if you want to test/force AI adoption, you have to apply pressure by firing some people.

Now wars will put us into further stasis or decline via increased inflation pressure.



