Why you can’t buy a high-end graphics card at Best Buy (arstechnica.com)
200 points by smacktoward on Jan 21, 2018 | 210 comments


I'm legitimately worried about the long term future of my hobby (PC gaming) at this point.

Prices like this have a cascade effect: when fewer people can afford to build or upgrade a PC, there's less incentive for developers and publishers to spend time on the platform. Consoles are shielded by Sony's and Microsoft's ability to take losses, which are absorbed by online fees and higher game prices, but no such cushion exists for PCs (this independence is a strength, to be sure, but a double-edged sword in this respect).

PC gamers have spent years disproving the image of PCs as prohibitively expensive and now it's coming crashing down due to people in a faux gold rush wasting hardware and energy. I wish it would stop, but I don't know how much of the damage to the hobby can be repaired.

Of course there are other factors. Machine learning is eating up cards, smartphones and tablets are eating RAM, and I don't doubt there's price fixing. But ML is at least useful to humanity (sometimes), smartphones and tablets are tangible products, and price fixing can be investigated and punished.

I'm relatively shielded from this as I just built a high end machine (i7-7700, 1080ti, 32GB DDR4) that will last a while but the next time I want to build a machine might be painful. While I would probably pay the price, others will not, and the hobby will suffer.


Most PC gamers aren't using high-end machines and don't upgrade very often. The Steam Hardware Survey shows that the majority (74%) of PC gamers have a 1080p monitor, and pluralities have a video card with 2GB of VRAM (37%) and 8GB of system RAM (47%). The GTX 1060 only recently overtook the GTX 750 Ti as the single most popular video card.

Pricing of high-end hardware is largely irrelevant to the success of PC gaming as a platform. The majority of users have always opted for value-oriented hardware.

http://store.steampowered.com/hwsurvey/


I can't get a mid level graphics card either. These cards in the article are $200 and $250 cards from AMD. They are going for over $400.

I am the person that buys $200 graphics cards; high-end is $500+, like a 1080 Ti or some sort of Titan. I always thought that $200 was the sweet spot with the best price-to-performance ratio. I can't buy a $200 card now.

I was lucky and got a new AMD 480 last year for $220. It sadly has an issue and needs to go in for warranty (3 years); it can only handle the simplest of graphics or it hangs and freezes the computer. No big deal, except that I wanted to build my kids a Minecraft-and-coding computer, and I can't buy a decent $150 card either.

RAM is also sky-high, and I do video editing from time to time and want at least 16GB in my rig. That would have cost $100 18 months ago and now it's $225+, also due to the miners.


When the crash comes, or when someone comes up with an ASIC for Ethereum, all these cards will flood the secondary market, and the glut of spare capacity will crash prices. You'll be able to get the highest-end card in the world for half the current price.


Unless there's no crash, and/or Ethereum is replaced by another cryptocurrency that's profitable to mine with GPUs.


> RAM is also sky-high, and I do video editing from time to time and want at least 16GB in my rig. That would have cost $100 18 months ago and now it's $225+, also due to the miners.

That is crazily out of whack. Back in 2011 I bought 8GB (2x4GB, 1333MHz) for my 2009 15" MBP and it cost me €25 secondhand. Worked like a champ too; I even transferred it to my little brother's computer when I upgraded to a new MacBook.


It's not that long ago that I used to think of RAM as _almost_ disposable: you could max out your capacity pretty cheaply, and it wasn't really worth reselling a lot of the time. Over the past few years it seems to have just been getting increasingly expensive, though.


The effects on prices are quite bad, but the pendulum always comes back in the other direction too. Every GPU shortage leads to the second-hand market getting flooded a few years later with used mining GPUs.

So in effect we get short term problems, but if you time your purchases correctly you can get deals too.


The problem is that a few years later those GPUs aren't worth much, and with more and more coins coming out, miners always have something to fall back on for their older hardware.

It doesn't look like the GPU flood of 2015 will happen again.


Would you want to buy a piece of hardware that had been run at 100% of its capacity for years, without any warranty?


Well, not everyone is aware of this, and second-hand mining card price points on various market platforms will probably push down the prices of everything else to varying extents (both new cards and second-hand gamer cards).


I might. If it still works after mining for years then that tells me it'll last for a while.


This impacts everything from mid-range upwards. A friend of mine, not so well versed in gaming hardware, recently asked me for advice on buying a new GPU to replace the three-generation-old card he's using.

I recommended something like a 1060, assuming they'd be reasonably affordable by now, only to discover that they've actually increased in price, which especially hurts considering he's on a tight budget.

In 20 years of PC gaming I can't remember a situation comparable to what's going on right now, and frankly, it's quite depressing.

With this going on, neither Nvidia nor AMD has much reason to innovate and keep increasing performance, as most of their performant cards sell like crazy anyway, which is a horrible situation for consumers.


Keep in mind that the hardware survey may be skewed by Steam being used on machines that aren't primarily for gaming, so I'd be somewhat cautious about drawing firm conclusions (e.g. I have Steam on an old Surface Pro 3 that is basically just for chat/non-real-time games).

I'd also wonder if the small percentage of people on top-end gear drive the market forward (e.g. purchasing significantly more games even if the users are smaller in number).


Keep in mind that the numbers have been skewed in the last few months because of an influx of PUBG players from China, who probably don't have as much money to spend as Western gamers.

see: http://store.steampowered.com/hwsurvey/directx/

Windows 7 had a huge uptick from Aug-Oct, and high-end graphics cards' market shares dropped over the same period.


Simplified Chinese having 63.90% share by language suggests that as well.


Are those stats normalized for game purchases?


Gamers keep asking why nVidia can't just ramp up production.

Having worked in a fab and an assembly factory, I can say there is a tremendous amount of coordination that needs to happen. Let's forget the business aspects for a moment: ramping up production is not just a matter of building more lines. Silicon industry supply chains are mind-bogglingly complex, from oil-free-air supply hardware, etching chemical suppliers, and automation equipment all the way to the copper mine. All of these interdependent supply chains have to scale in perfect coordination, without exception, for production to scale.

Then there is the business side - building more lines could be foolish investment if the LRP (long range projections) is weak.

nVidia fabs their chips at TSMC. Say TSMC has wafers ready but the assembly houses are struggling to keep up: then the entire supply chain is broken and demand cannot be met.


> Gamers are complaining about why can't nVidia just ramp up their production.

I don't see too much of that in gaming circles. It's understood that it would be risky, as crypto will crash sooner rather than later. The anger in the PC community is almost exclusively directed at miners. Threads about it pop up daily, but what can we do? We could lobby NVIDIA/AMD to somehow implement restrictions on what the Gaming SKUs can be used for, as they do with the Quadro/FirePro workstation cards, but it's unclear if that would work, let alone if it's technically/legally possible.

It's a bad time to be a PC gamer, and just months ago it was a great time to be a PC gamer.


> We could lobby NVIDIA/AMD to somehow implement restrictions on what the Gaming SKUs can be used for

In fact, the opposite has happened. NVidia has restricted consumer cards from data centers, with an exception for blockchains! They are very much making hay while the sun shines.


I think something interesting will happen with TPUs.

Nvidia's Titan V has tensor cores rated at 110 TFLOPS (and only ~12 TF of ordinary GPU compute, IIRC). It's $3000, but that price seems meant not to reflect costs but to avoid cannibalizing their even higher-end scientific computing offerings... so the price could come down (to forestall competition).

Can TPUs give better ROI than GPUs? Conversely, will games end up exploiting TPUs?


Google's TPU is a GPU without the silicon burden of needing to render graphics, so no.


I was referring to Nvidia's Titan V, which is independent of Google's TPU. https://wikipedia.org/wiki/Tensor_processing_unit https://www.nvidia.com/en-us/titan/titan-v/

I also didn't say it would be used for rendering in games, but that it could be used by games. Games can use GPUs for compute: physics, simulation, and even AI. Some of this is already happening. TPUs would be even better at it, so I find that interesting.

The point of my comment was that TPUs providing some differentiation relevant to mining vs gaming would affect prices.


Why exactly would the TPU be better? It's not a general computation processor; it only does MMA (matrix multiply-accumulate).


I don't know, but at 110 TF for $3000, it would be worthwhile figuring out some way to use it.


For gaming? The 110 TF of the Titan V is also very, very conditional: it only applies when you are doing mixed-precision MMA. For any other operations you aren't getting 110 TF...


What about for mining?


If mining were to crash hard, it'd be a fantastic time to be a gamer, with all the second-hand GPU offerings. ;)

Also, I bought my own 1080 Ti last May and have had it mining when the PC is not in use. It's already earned itself back, potentially speaking (I haven't actually sold anything yet for real dollars). A single card isn't enough to bring anything solid in, but it's at least a foot in the door, especially for the fun factor.


A card used for mining will have worn-out fans and potentially heat damage. You'll have to hope for really cheap used cards to make up for that.


I'm running my GPU in the mid-50s Celsius, with plenty of case cooling. (Lowered the power target, etc.)


I think goda90 was referring to the second-hand market. You may have taken good care of yours, but I suspect that miners won't have been as careful, so buying these could be a bit of a crapshoot.


When it comes to GPU mining (especially ETH) it is very common to undervolt / power-limit a GPU, as it reduces power usage greatly, so the temperatures are lower than usual. Power consumption, together with the price per kWh, is an important factor in mining profitability, as sketched below.
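
A minimal sketch of that arithmetic in Python (the hashrate, reward rate, and electricity price are made-up round numbers for illustration, not real market data):

    def daily_profit(hashrate_mh: float, power_w: float,
                     usd_per_mh_day: float, usd_per_kwh: float) -> float:
        """Mining revenue minus electricity cost for one day."""
        revenue = hashrate_mh * usd_per_mh_day   # income scales with hashrate
        energy_kwh = power_w / 1000 * 24         # watts -> kWh over 24 hours
        return revenue - energy_kwh * usd_per_kwh

    # Stock settings vs. an undervolted/power-limited card (illustrative figures):
    stock = daily_profit(30, 180, usd_per_mh_day=0.10, usd_per_kwh=0.20)
    tuned = daily_profit(28, 120, usd_per_mh_day=0.10, usd_per_kwh=0.20)
    print(f"stock: ${stock:.2f}/day, tuned: ${tuned:.2f}/day")

Giving up ~7% of the hashrate to cut a third of the power draw is a net win at that electricity price, which is why miners undervolt and why mining cards often run cooler than you'd expect.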


Yes, but they are still stuck in open-air cases in not exactly pristine environments.

Getting a GPU that has been running 24/7 for 2-3 years will be the mother of all lotteries.


I wonder if Nvidia et al could short Ethereum as a hedge while expanding production.


The markets are not liquid enough for a hedge of that scale (e.g. to cover a factory investment of 8, 9, or 10 figures USD).


This is why futures exist. They would sell the futures to hedge their crypto exposure.


I don't think that taking out naked short positions on the cryptocurrency market, or on a coin that's had a 15,000% run-up over the last year, is anything that a publicly traded company outside of finance would do.


You'd trust your business to a bitcoin exchange?


Futures are sold on CME/CBOE, not on bitcoin exchanges.


Doesn't matter, since the price can be easily manipulated. It pays to drain millions into inflating the price or keeping it afloat if you can fleece the counterparty for billions...


I wonder if they could cripple integer performance; this would prevent mining while probably not hurting gaming performance.


That may not impact mining speeds that much, as most integer operations would then be emulated using floats (a toy example below).

Also, mining speeds are bottlenecked far more by memory speed than by GPU compute these days.
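
To make the float-emulation point concrete, here's a toy sketch (plain Python just to show the trick; real GPU shader code would look different) of an exact 32-bit integer multiply done with only floating-point arithmetic, by splitting each operand into 16-bit halves so every partial product fits in a double's 53-bit mantissa:

    def mul32_via_floats(a: int, b: int) -> int:
        ah, al = float(a >> 16), float(a & 0xFFFF)
        bh, bl = float(b >> 16), float(b & 0xFFFF)
        lo = al * bl                         # bits 0..31, exact in a double
        mid = (ah * bl + al * bh) * 65536.0  # bits 16..48, still exact
        # ah*bh only affects bits >= 32, which the final mask discards anyway
        return int(lo + mid) & 0xFFFFFFFF

    assert mul32_via_floats(0xFFFFFFFF, 0xFFFFFFFF) == (0xFFFFFFFF * 0xFFFFFFFF) & 0xFFFFFFFF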


I don't understand why companies like Nvidia don't adapt to the market and offer cards that are specifically designed for miners. Wouldn't that be possible, and make them potentially more useful than mere graphics cards?


Because when the crypto bubble pops NVIDIA will be holding the bag on a new line of unwanted products.


There is at least one [0], but I'm frankly not sure where you can buy it.

[0] https://www.asus.com/Graphics-Cards/MINING-P106-6G/


As much as I think it's bad for gaming (right now at least), I think hardware manufacturers artificially limiting what it can be used for is a bad route to go down.


It is worth noting that nVidia contributed as much revenue to TSMC as Bitmain did last quarter. A key reason nVidia can't just ramp up is that other customers might be able to pay more than them for priority.

https://seekingalpha.com/news/3323673-tsmc-expects-iphone-sh...


I remember when the PS3 and Xbox360 had just come out, and everyone was screaming from the rooftops about how PC gaming will just crash and burn. Not for a week or two, but for years after those boxes had been released.

So you're saying PC gaming survived that sustained direct threat only to be wiped out by cryptocurrency mining that isn't even concerned with gaming?

This is a silly exaggeration. In the long term, if the crypto market collapses the prices will come back to normal.

If it thrives instead, the GPU manufacturers will probably scale up their production lines to meet demand.

Or maybe a miracle will happen and we'll see some really good GPU manufacturers pop up who'll make the market more competitive.


> I remember when the PS3 and Xbox360 had just come out, and everyone was screaming from the rooftops about how PC gaming will just crash and burn. Not for a week or two, but for years after those boxes had been released.

This still led to a situation where, for years, AAA publishers wouldn't touch the PC gaming market with a ten-foot pole and focused pretty much all of their efforts on the 7th console generation; this was additionally compounded by the "piracy factor".

During that time PC gaming pretty much only had MMORPGs (which made the big bucks due to subscription-based payment), some RTS titles, and a couple of rare exceptions in the FPS genre going for it, while Steam was still busy establishing itself and the indie scene was nowhere to be found yet.

It was a rather boring time to be a PC gamer as you'd miss out on many of the interesting and unique releases due to them being only released on consoles.


PC as in Windows, or Linux/Mac? Big titles like Quake and Unreal were available on PC (meaning x86-32 or x86-64 plus Windows in my post, though things like Wine have been amazing for me).

There's always been an ample number of games available on PC. Whether they were boring or not is a matter of opinion (I wouldn't say missing out on, say, Resident Evil is any issue). I am entirely positive all these games had recent equivalents in their genres. I say recent because otherwise the comparison is too broad. Even Rez is just a newer version of Space Harrier; or compare the first Doom or Quake against the last.

Also, some game vendors sign exclusive deals with a console vendor where they initially only release the game for that specific console. Then you have to buy the console, or wait.

So yeah curious what you felt you were missing out on.

Personally, I don't like consoles because I want nothing to do with this whole Trusted Computing thing. Instead I got a Steam Link for 5 EUR during Black Friday.


> PC as in Windows, or Linux/Mac? Big titles like Quake and Unreal were available on PC (meaning x86-32 plus Windows or x86-64 plus Windows in my post though things like Wine have been amazing for me).

Counter-Strike was way more relevant at that point, but that doesn't change the basic fact that pretty much all the major publishers gave little to no attention to the PC sector during that time.

Case in point: while Halo 1 and 2 still had Windows ports, the following Halo games didn't get them. Same situation with Gears of War: the first game got a Windows port, and none after that until recently.

> Whether they were boring or not is a matter of opinion (I wouldn't say missing out on say Resident Evil is any issue).

That's a matter of personal taste and preferences. Personally, I really enjoyed RE1+2 on my PlayStation back then (just like Metal Gear Solid), which was also my last console before gaming exclusively on PC, skipping the whole sixth console generation; that already left a bit of a backlog of many interesting GameCube/PS2 titles.

It wasn't until late in the seventh generation (Xbox 360/PS3) that I got back into console gaming, mainly due to the lack of certain genres on PC (third-person games/the backlog of exclusives) and the consoles/games having gotten more affordable by that point.

If it wasn't for my interest in MMORPG/RTS/western RPG games, I'm not sure I could have held out that long back then, as I also enjoy the occasional JRPG/arcade racing game/third-person shooter, genres which had been rare to non-existent on PC back then.

> Personally, I don't like consoles because I got nothing with this whole Trusted Computing thing. Instead I got a Steam Link for 5 EUR during Black Friday.

There's something to be said for getting tons of games very cheaply from a still-functioning second-hand market. Getting a cheap used PS2 plus tons of games, or a Wii plus tons of GameCube games, offered a lot of game for very little money. Steam had a phase where this worked somewhat well too, but it feels like over these past few years good deals have mostly been replaced by tons of 99-cent shovelware games bought for card collections, or by having to wait for the Summer/Winter sale events.


> So you're saying PC gaming survived that sustained direct threat only to be wiped out by cryptocurrency mining that isn't even concerned with gaming?

Once bitten, twice shy. I remember the dark time you're referring to. I don't think they've actively contributed to it, but I can't help thinking that MS and Sony must see this as an opportunity to "twist the knife" a bit and bring back the good old days of the 7th generation.

I admit that in reality my fears will probably never come to fruition but that doesn't change the way I feel about the current situation.


Are there any non-VR games that even push the 1080ti?

It seems like you'd need some giant 4k monitor to make it sweat.

But also, Steam doesn't seem to be seeing a ton of graphics-intensive releases. It seems much more cost-effective to get away with a lower price and less intensive art requirements and focus on gameplay. And lighter graphics put you in a safer place as far as ports to consoles and tablets/mobile go (e.g. Stardew Valley).

The Vive HD upgrade will push the envelope further, but I don't think VR has had great success. (Elite Dangerous in particular gets boring because it's rather easy to make a ton of credits and afford everything.)


A 1080 Ti can take heavy hits even at 2560x1080: Battlefield 1 at max settings can drop to 90 FPS at times. It can't come close to driving 2560x1080 @ 200Hz consistently with high-end games and settings. PUBG? Frame rates are meh. Even with an overclocked card it can be rough sometimes, in my experience.

Personally I'll be upgrading to the next card that trumps the 1080 Ti, moving from 2560x1080 @ 200Hz (for all but CS, which runs at 1280x960 @ 144Hz on another monitor) and going back to 1080p @ 250Hz.

Ultrawide is god-tier for movies and productivity work, though.


I just upgraded to 2560x1080 and was wondering if 3440x1440 is worth it. But just the idea of having to push that many pixels made me decide that 1080p is enough. I also have a 200Hz monitor, but I do not mind if I do not get 200 FPS.


90 vs 200 FPS is meaningless on its own.

If it feels off, something else on your rig is messed up. Check the latency on your keyboard/monitor, etc.


And here I sit in front of a 1680x1050 monitor with an ATI Radeon HD 5770, playing Rocket League on Minecraft-like graphics so that splitscreen with a friend does not stutter noticeably. :D


I have a 4K monitor with two GTX 1080s in SLI and I usually don't get 60 FPS if I play on max settings. Annoyed by this, I got another monitor at 2560x1440 @ 144Hz, and I can't max that out either. The games were the likes of Rise of the Tomb Raider and The Witcher 3.

I think the 1080 Ti situation is similar.


I have a G-Sync 1440p 144Hz monitor running on a 980 Ti. IMHO, if you can get at least 90 FPS it's a great experience. I don't really see the need for faster frame rates.

Granted, I believe G-Sync does a lot to make it appear smoother, so if you don't have that it may be more important to max out the frame rate. I do notice it when I drop to 60 FPS, though; it feels choppy.


It's still a silly situation to be in: you spend all that money on a fancy monitor and GPU, yet you can't even max out visual fidelity if you want to make full use of them.

About G-Sync: AFAIK, isn't its purpose to eliminate screen tearing without the use of vsync, thus avoiding the awkward halving of FPS when the GPU can't keep up?


Why do people still think you need a high-end rig updated every year or two just to be a PC gamer? Almost every game I want to play runs satisfactorily on a five-year-old machine.

This is bad for the tech-obsessed top 10% of PC gamers. For the rest of us it's meaningless, and it's not going to last in any case.


It's kind of funny when 99% of games are optimised for the PS4, or maybe the Xbox, and both run Jaguar CPUs with roughly 1.8 TFLOPS of GPU compute (the Xbox slightly lower).

My i7-4770K with a GTX 670 is still running all the games I throw at it, even if I might have to step down from ultra-high effects in some.

However, I have to admit I play mostly on PS4/Switch nowadays.


A lot of those people are buying cards/systems for the next 5 years, replacing their 5-year-old setups.


Depends on the quality of your five-year-old machine. Graphics cards matter; CPUs less so these days.

e.g. I found I could play high-end games (Deus Ex: MD, for instance) fine on my rig with a Core i7-3770 (almost 6 years old!). However, it lagged on my original GeForce GTX 650 (~4.5 years old at that time), prompting an upgrade to a GeForce 1060 (an upper-mid-range card that is now super expensive).


You should be glad those people exist, because they subsidize the R&D for people who buy older/cheaper cards.


Enh, honestly, if graphics somehow stopped advancing forever right now, it wouldn't bother me that much as a gamer. The era where good game concepts just couldn't be made because the hardware couldn't handle them is long over. We're just adding successive layers of polish now. That's nice, but it doesn't add much to gameplay or fun.


You could find somebody who said or thought this very thing each year for the last 20 years.


Who do you think paid for the R&D on the 5 year old cards you're playing on today?


The games I sink the most time into aren't even that intensive: Klei and Zachtronics games.


We've had the most fun recently with one of those "Pandora's Box" pirate 1980s arcade simulators. I believe internally it runs on a 32 bit ARM chip.


If those games are your thing, you'd probably love Factorio.


>I'm legitimately worried about the long term future of my hobby (PC gaming)

The equipment required for gaming normally loses value very quickly, yet old 7970s released in 2012 command amazing resale prices in 2018. Everything else in a rig from back then is now worthless apart from scrap metal.

GPUs are one of the biggest expenses in gaming, and as a hobby it has gotten far cheaper. The fear is unfounded and this is typical alarm journalism.

Now if only CPUs held their value six years on.


My i7-2600K from 2011 is still doing great now, even without overclocking. Maybe not much resale value, but I would guess it would play almost all of today's games fine (not much of a gamer myself). Actually, the upgrade from DDR3 to DDR4 is probably as much of a factor.


As an avid gamer with an i7-2600K @ 5GHz, I can confirm that it's still up to the task with anything I've thrown at it so far; the bottleneck is pretty much always the GPU, and it's not looking like that will change anytime soon.

And I'm still sticking with DDR3, which runs with way sharper timings compared to DDR4.


> Old 7970's released in 2012 command amazing resale prices in 2018. Everything else in a rig back then is now worthless apart from scrap metal.

RAM is much more expensive and keeps its resale value well. I could sell my 16GB from 2012 now for almost twice the price (!!), which is also keeping me from upgrading to 32GB.


Monetary value, perhaps.

Gaming value? Actually the reverse is true: everything in my i5-3570K rig is from 2012 except the GTX 1080, and it ran everything in 2017 @ 1440p very well.

Admittedly no BF1 or PUBG as I grew tired of the FPS genre a long time ago.


Don’t worry. This kind of supply constraint is an intrinsically short-term thing.

Either...

* crypto mining will crash (in which case you’ll be able to get powerful cards for cheap since the secondhand market will be flooded)

* or crypto mining will move on from the stop-gap measure of using gaming cards (more sophisticated setups will be more efficient and everyone is competing)

* or — in the highly unlikely event the current situation proves to be stable — card manufacturers will ramp up production


I agree with all of these. I think the biggest worry is that graphics card manufacturers will be, justifiably, very reluctant to ramp up production due to how volatile the cryptocurrency ecosystem is. If they first ramped up production to meet increased demand, your scenario 1 (the crypto market crashes) would hit them doubly hard, since not only would they no longer sell lots of cards to miners, but the miners would be selling their used cards at cut-rate prices.


The 1050 and the 1060 3GB GPUs are more than good enough to run any tier-one game on nearly the highest graphics settings on a 24" monitor. Everything above that is just ridiculousness for its own sake in the gaming industry, and for benchmarks that don't reflect real usage - unlike GPU mining, where the cards are actually pushed to their maximums.


I have the 1060 6GB. It can max out any game on my 1080p 60hz 24" monitor.

However, that's table stakes these days. 1440p monitors are gaining ground as people notice the DPI difference between their PC monitor and their smartphone. Moreover, most people can also notice the difference between 60Hz and 100Hz monitors.


So now we're talking about bleeding-edge technology in the 1440p/100Hz realm. I don't feel really bad that gamers can't readily get access to the technologies that drive the 0.0001% of gamers to these settings, any more than I feel bad for cryptocurrency miners who spend $1200 on a 1080 Ti, configure it like an idiot, and get half the earning potential they should.


1440p is hardly bleeding-edge on the productivity end.

If you already have that kind of monitor, why not reuse it for gaming?


~4% of people are on 1440p or higher monitors according to the Steam hardware survey - http://store.steampowered.com/hwsurvey/

It's high-end, but not exactly what I'd call bleeding-edge / 0.0001% of gamers.


Why is a noticeable difference considered meaningful? It won't make the gameplay any better.

(Asking as a Luddite who doesn't see the point of Retina displays.)


Gameplay is only part of what a game offers. Increased realism or visual acuity from higher resolutions, larger textures, higher framerates, better anti-aliasing, better shadow rendering, and so on can contribute to feeling more immersed in the world of the game. Many gamers, myself included, use gaming as an escape from the world - it is helpful that the "world" I escape to look and feel real within the context of the fictional world.


I don't have a strong argument for DPI in gaming; I find DPI to be more noticeable when watching movies or reading text.

Going from 60 => 100 fps on the other hand makes fast-paced games feel much more fluid and responsive.


"It looks nice(r)" is basically the end of that argument. Some people will care (I find easier to be immersed in a game when technical limitations aren't as apparent), others won't.


1060 3GB cards are also going through price shocks. I bought a 6GB GeForce 1060 for $250 in Oct 2016. Even the 3GB version is going for $450+ now.


> Consoles are shielded by the ability of Sony and Microsoft to take losses which will be absorbed by online fees and higher game prices

I imagine they are also shielded to some extent by being AMD APU-based, so the market for discrete GPUs has less of an effect.


AMD APUs (and Intel+AMD Graphics APUs) might be what keeps PC gaming alive for the time being. If AMD can pack a chip that can match an RX 460 onto a single die for something like $200, it'd be the go-to chip for an entry-level system.


Let's add some perspective. The average price of a PC in 1995, right in the middle of PC gaming's glory years, was $1500, which is about $2500 in today's dollars. That's an average PC, not top of the line, which went for a multiple of that. Unless average graphics cards start costing $1000, we're nowhere near 90s prices.

PC gaming will do fine. Don’t worry. Maybe game makers will be a little slower to move to higher end specs, and that’s about all it will amount to.


The long term is not actually a problem: with high demand, more factories will be built, but that takes time.

Short-term, we're stuck with high prices.


I had no issues at all getting a 1080 when those were new. I don’t think the next generation will really be that bad either. Even with what happened with the Vega release, it was easier to find a Vega GPU than it was to find a Nintendo Switch when those were just coming out.


One side effect is that the margins on prebuilt PCs should rise.


"... ML is at least useful to humanity (sometimes), smartphones and tablets are tangible products, and price fixing can be investigated and punished."

I'm amused that you're qualifying the worthiness of a project by its usefulness to humanity. In comparison to PC gaming, to me, revolutionizing the financial world seems a lot more "useful to humanity".


> revolutionizing the financial world seems a lot more "useful to humanity".

1. Citation needed.

2. Revolutions aren't always positive changes.


The shortage is not entirely because of miners. Nvidia's board partners switched to using slower RAM in some of their cards. This was actually a very smart move (they should expand it to all their cards, IMO). If you want the fast RAM, you have to buy the "mining" version of the cards. So in theory, it should have no impact on the gaming market.

For some of the popular mining cards (e.g. RX 470/80), well they simply haven't been available in many places for over a year now. Their successors aren't that great for mining either because of the increased power usage and RAM "lottery" (you don't really know what type of RAM you're getting when you buy a card). But most gamers don't generally care about RAM speed. This is also why the 5 series is generally cheaper than the 4 series (where you can find them).

However, the biggest problem is that there simply isn't enough RAM. The RAM shortage is being driven by smartphone and GPU production, and suspected price fixing. Samsung said they plan on increasing production (apparently they don't like that their competitors, who make inferior RAM, are making similar profits), but we will probably only see the effect mid-2018.

https://www.dramexchange.com/WeeklyResearch/Post/2/4815.html

http://www.pcgamer.com/samsung-to-increase-dram-output-as-ra...


AMD gambled on HBM2 and it paid off in ways they never expected. They were simply getting dominated by Nvidia and threw a hail mary pass on memory bandwidth and speed, knowing they couldn't compete on compute loads.

Enter the CryptoNight and Dagger-Hashimoto algorithms, and well, the rest is recent history for the RX architecture, primarily the Vega line.


And yet the most efficient (hashrate/TDP) GPU for Ethereum mining is the GeForce 1070, not any of the Radeon models. Even the Vega 64 couldn't beat it.


Vega 64 has improved since launch, beating the GTX 1070 in efficiency as of this past September 2017 by achieving 0.20 MH/s per watt over the 1070's 0.18 MH/s per watt.

They're still within 10%, so it's not a great margin unless you're very focused on density.


It's bad. It's never been this bad before. Just go on PCPartPicker and realize that there are GTX 1080 Tis selling on third-party websites for over $1500, and that's the only place you can get them, beyond getting lucky.

Nvidia isn't going to ramp up production significantly because crypto could crash any day, simultaneously destroying their new sales and flooding the used market with old mining hardware. Combine it with the industry-wide RAM shortage... What a disaster.

The funny thing is, I remember back when ASICs first started coming out for Bitcoin, and a whole bunch of new cryptos came out saying "we're specifically designed to thwart ASICs so everyone can mine on the cards they have!" Well, here's what you've created. No one can afford cards anymore, and you made making more efficient hardware harder. Thanks for destroying the planet.


ASICs aren't saving the planet either: people are incentivized to burn up to $BLOCK_REWARD of electricity per block, and the number of hashes it takes to do so is irrelevant. If you come up with a better ASIC, people will buy more of them, the network difficulty increases, and you're back where you started. It's just another stage in the Red Queen's race.

The only thing ASICs do is centralize control (because economies of scale favor companies who can produce+operate their own chips). Full stop.

Proof of stake doesn't actually work either, or at least not without a centralized master key to prevent stake grinding/rewriting history (at which point why have staking in the first place). And in many cases, it may devolve to proof of work anyway.

There is no efficient cryptocurrency, and the cat's out of the bag now, there is nothing that can (practically) be done to stop it. Apart from charging people exponentially-increasing rates on their utility bills, that is, but China isn't going to do that, so the planet is just going to have to take another one for the team.


Is Cryptocurrency a form of entropy exchange from Energy (increasing entropy via heat) to information entropy (decreasing information entropy by hashing)?

I am thinking more broadly in terms of thermodynamics. I understand classical thermodynamics, but I don't understand information entropy well.

From the standpoint of physics, energy is the ultimate currency, and it seems like we are trading it to decrease information entropy. Typically, entropy is the tax when performing work (e.g. in a heat engine) if that process is irreversible. I wonder if this kind of theory applies to cryptocurrencies?


Proof of work is about consensus among mutually distrusting parties. It has approximately nothing to do with entropy.

Of course there is some connection, but the simple "exchange" you're looking for does not exist.


>> Proof of work is about consensus among mutually distrusting parties. It has approximately nothing to do with entropy.

Think more abstractly: at the end of the day, energy is being consumed to compute some problem. That solution has intrinsic value precisely because it took energy to solve it. Extrinsic value is the market economy of cryptocurrency, and that's where 'proof of work' and 'consensus amongst distrusting parties' come into play: that concept of a currency is only valuable if a sufficient number of parties agree on its value. I am wading into the underlying fabric that holds cryptocurrency together.

Imagine if energy were free: despite 'consensus', the supply of such a currency would be virtually unlimited and it would be worthless.


Energy is consumed and that is why there is a cost. I'm not sure there is any such thing as "intrinsic value", but there is a cost.

Because of that cost, the larger system works, because that is how the system keeps itself honest, by making it expensive to lie.

But there are other ways to have an honest system that don't involve wasting energy, so there's no direct connection between the energy that's wasted and any value the system has.

If energy was free, silicon would be more expensive, mining difficulty would go up, and bitcoin would still "work" just like it does today.


> Imagine if Energy was free, despite of 'consensus', the supply of such currency would be virtually unlimited and it would be worthless.

They would just ramp up the network difficulty. Soon mining would be counted in exhausted stars per second.

Cryptocurrencies are the perfect example of the maxim that "just because you can, doesn't mean you should". They're funny in theory, proof of work being an interesting solution to a particular mathematical problem. In the real world, with problems determined by laws of physics and (in consequence) the structure of human minds, this is about the worst imaginable way to run the economy. The constraints of the problem solved by cryptocurrencies are not very meaningful in reality.


None of that makes entropy any more relevant.


> It has approximately nothing to do with entropy.

Err... it's removing surprisal from bit sequences. That's exactly what Shannon entropy measures. There's even a boundary trade-off between the Boltzmann and Shannon pictures (which we're obviously trading well, well above).
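
For reference, the standard textbook definitions being invoked here (the "boundary trade-off" is Landauer's principle):

    % Surprisal of an outcome x, and Shannon entropy as its expectation:
    I(x) = -\log_2 p(x), \qquad H(X) = \mathbb{E}[I(X)] = -\sum_x p(x)\,\log_2 p(x)

    % Landauer's bound: erasing one bit of information dissipates at least
    E_{\min} = k_B T \ln 2

At room temperature that bound is roughly 3e-21 joules per bit; proof-of-work hashing spends many orders of magnitude more than that per bit of output, which is what operating "well, well above" the boundary means.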


Yes, that's the "work". The value of that work is as part of a mechanism for enforcing consensus. To the question "does entropy tell us anything about the larger system" I am arguing that it does not (in any non-trivial way).

Obviously entropy is related to any computation we perform. It's just not going to give us any special insight into cryptocurrencies to point that out.


Though there is some benefit to cryptocurrencies that rely on ASICs for mining: it means that someone with a significant amount of CPUs/GPUs under their control (such as botnet operators, or government groups like the NSA) can't waltz in and trivially 51% attack the blockchain with their existing general-purpose hardware, and then go back to using it for other things. An attacker would have to commit to buying/building specialized mining hardware for the attack, and then that hardware can't be repurposed for anything else after the attack other than mining.


ASICs are supposed to be orders of magnitude more efficient at their job than GPUs. So the NSA can make 0.01x the investment and "waltz in and trivially 51% attack the blockchain", all while not touching their existing equipment.


Only if they've developed an ASIC that is orders of magnitude more efficient than what anyone else has - which seems unlikely given the enormous economic incentive to find more efficient hardware.


I'm not sure if you understood my post right. Bitcoin mining relies on ASICs. If the NSA wanted to do a 51% attack, then they'd have to spend a bunch of money building ASICs that are then worthless to them once they're done with their attack.

If Bitcoin mining relied on GPU or CPU computation, then the NSA could just redirect some of their existing GPU/CPU hardware to do a 51% attack, and then go back to using it for something else when they were done. It would be cheaper and easier for them to do.


The only reason to invoke NSA in this context is for their ability to make a 10000x investment, with no concern for this being (directly) economically viable. The question then is, what will they do with control of the blockchain?


Idea: prevent North Korea from spending their coins?


Unrelated - did you use to frequent a Halo tricking forum called HIH?


Yeah, that's me. I stuck around its game-modding subforums a lot back then.


I did, too - not sure if you’d remember me, but I used to go by Zanzang. I went from modding games to programming them, then from the game industry to startups.

Funny how 10 (plus?) years later we’d run into each other on a different site.


A huge amount of the electricity being used was unused/not competed for in the first place. Mining farms are coalescing in North America around excess production in hydroelectric areas. This is an inconvenient fact that is ignored in the "electricity oh nooooo" debate.


Except excess hydroelectric production is extremely valuable to other utilities on the interconnection who run primarily thermal plants.

Hydro is one of the few forms of baseload power that can be ramped in a hurry: you can increase or decrease production just by increasing or reducing flow through the turbine. If you use less water, it builds up in the reservoir as potential energy and you can use it later.

Compare this to thermal plants, which require many hours to have a significant swing in energy output. I have heard of ramp figures on the order of days for very large nuclear reactors.

Because thermal stations run with pretty flat power output, hydroelectric stations have a critical role smoothing out load conditions, ramping down at night and up during the day. Since electricity is traded on a market system, utilities with significant hydro operations really benefit from this: they can reduce production and import excess capacity from thermal plants at night, and increase production during the day to export at a much higher rate.

This is part of why regions with predominantly hydro power have such low rates: in addition to hydroelectric being pretty cheap once a dam is built, hydro operators make significant amounts of money by appropriate import/export to thermal operators. Since a lot of hydro utilities are state owned (eg: BC Hydro, Hydro Quebec, Bonneville Power Administration), the profits are used to subsidize lower rates for ratepayers.


I had no idea I was so lucky with my timing. It's hard to believe they're going for $1200-1500 right now! I bought my son a 1080 Ti for Christmas and paid about half of what they're going for now.


Destroying the planet = upset some gamers?


Bought some cards during the Christmas break and prices/availability were fine. I was surprised when I went to buy some more and saw the prices! I reason that if cryptocurrencies crash I can repurpose my nice new collection of GPUs for deep learning training. It's got me pondering how one could use distributed deep-learning training as the proof-of-work for a cryptocurrency. Then the PoW would be doing useful work anyway!

Along with causing issues with GPU supplies, there have been a lot of articles talking about how crypto mining wastes electricity and hurts the environment. That's not necessarily the case across the board. Instead of running an electric heater this winter, I am running a few GPU miners, which produce a decent amount of heat. That gets me a two-for-one benefit for ~9 months of the year, since it's relatively chilly here year-round. It might not work in all climates, but in many places it's a nice way to produce heat that would otherwise have to come from a heater anyway.


Yeah, heat and ~80dB of noise: the equivalent of a vacuum cleaner in both power and noise. You can't have these near humans.


I think the supply cost issue is more related to memory production shortages and cost increases. RAM has increased in price over 100% in 12 months in a number of markets. Crypto isn't the only culprit here.


Interestingly, the French market (and more generally the European market?) seems to have been spared from that trend, at least for now. Prices have remained steady during the last year and there is still stock available for a large choice of card flavors. See for example https://www.materiel.net/achat/gtx-1080/catNom-cartes+graphi...


Perhaps electricity costs there are higher than in the U.S. so mining is less profitable?

Although I don’t have these issues in Ohio, either. You can find any NVidia card you want easily, and only AMD Vega is in short supply at the moment. Still doable though, if you’re willing to put in a bit of effort.


France famously derives >70% of its electricity from nuclear power and has vastly lower prices than neighboring countries.


Electricity in France is 0.10 to 0.20 €/kWh, depending on the terms and the time of day.

It's more than in China and US cities with hydro. It's less than some other places.

We may have lower production prices but it's either not significant or not passed down to the consumer.


Cards at French retailers are 20 to 40% more expensive than at American retailers.

They are constantly sold out as well. In my experience, only the overclocked/premium editions are left, and they charge a lot more than the RRP.


20% is because of VAT, so you have to discount that from comparisons.


Consumers have to pay it. No, this can't be discounted from comparisons.

The $500 card becomes > €600 because the VAT is added and the exchange rate is ignored.


In the long run, the explosion in demand for GPUs will help lower prices and push the technical envelope for everyone, including the gamers complaining in this article.


I think that's only likely if the demand is sustained through another development cycle / generation of GPU products, though. If demand collapses soon, it'll just be yet another story about how the market was really weird for a bit.


I wouldn't be so sure about that. Increased demand for RAM due to smartphones has pushed prices upwards, not downwards.


This is an oft-repeated article of faith, which I have yet to see realized despite years of sustained extra demand for GPUs.


I'm not saying doitLP is right, but "years" is not a long time for building extra production capacity across the manufacturing process. Hell, they probably need a couple years of sustained demand to even consider upgrading it.


Only if the folks providing the cards decide that it is worth investing in having a larger pipeline of production. If they feel the volatility might be too great... they may choose not to.


It's (unsurprisingly) pushing the used market up. I'm selling a 1070 I don't need anymore, and it's already at more than I originally paid for it, despite being over a year old at this point.


My local Microcenter has an open-box 1070 they want $800 for, an open-box 1060 3 GB they want $475 for, a couple new 1060 6 GBs they want $600 for, and a couple 1050 Tis they want $350 for. That's their entire stock apart from some misc GTS cards from like 10 years ago.

So yeah, they're marking everything up to double MSRP basically.


That’s crazy, where are you located? My Microcenter is basically fully stocked with whatever GPU you’d want (aside from Vega)


Strange; Microcenter is blamed for not marking up the AMD Vega56/64 and selling it for MSRP, though reference cards are no longer being produced.


I wanted to buy two 1080s for deep learning experiments, but that's going to have to wait.

I picked up a used 32-core, 128GB server for $1000 recently... It's only a matter of time before these 1080s are sold for pennies on the dollar.

If your livelihood doesn't depend on it, wait.


I understand most people are hoping to see cryptocurrencies crash, but what if they don't?

What would happen if crypto booms big time?


First, segmondy didn't attribute future low GPU prices to a cryptocurrency crash. Second, it isn't necessary for a crash to happen to collapse demand for GPUs.

Bitcoin isn't driving GPU demand; Bitcoin is mined with ASICs. Ethereum is mined with GPUs, a consequence of deliberate design. However, when 'Casper' lands (Ethereum's Proof of Stake scheme) mining will end, and so will the extreme demand for GPUs. If that happens, eBay will fill up with thousands of discrete GPUs and you'll buy them for a song.


Nobody buys GPUs that have run 24/7 if they cost more than 10% of the original (pre-crypto-craze) price.

Undervolting or not, running your cards 24/7 makes them worthless.


In terms of graphics cards? Well presumably if cryptos keep driving up demand for an extended period of time, then nVidia will just increase production. Right now they haven't increased production because they think it's a bubble.


>> Right now they haven't increased production because they think it's a bubble.

Much more complicated than that. They haven't increased production because there is a RAM shortage and collusion is suspected. Samsung is the bottleneck, not Nvidia. This is changing, as Samsung has already announced they will increase production over time as Hynix continues to eat into their profits despite being inferior.

It's not as simple as cryptocurrency-haters would like to make it sound.


New cards will come out, which will make the current cards obsolete.


Because of the Spectre/Meltdown mitigations, I would expect a sharp fall in the price of used servers that use the 6th generation and before.


From where, might I ask, did you buy that server?


I don't have a link to that deal but buying quality used hardware isn't difficult or expensive.

I built a Quanta Open Compute node with 12 physical cores (24 logical with HT) and 96GB of RAM for around $350.

There are many companies selling Sandy/Ivy Bridge-EP Xeon servers for pennies on the dollar.

For example in the UK/EU: www.bargainhardware.co.uk/e5-2600-lga2011-series-machines


Amazon; I bought an HP Z820.


I am curious as to what we'll do with all that surplus computing power after cryptocurrencies (or mining on general-purpose hardware) fade out of fashion.

Better weather forecasting? AI?

And don't forget the investment on cheap power sources.


Lol. The cards are not held by companies; if the crypto market crashes, a lot of individual people will each have a handful of cards.

How do you think these people with their cards are going to do anything else with them? There isn't one single pool of computing power but 1,000,000 individual ones, and the only reason they burn electricity is that they get money for it.


It could be used for decentralized cloud computing.


Yes, but it's numerics-heavy, so the workload would be very specific. Think SETI@Home on steroids. If we could solve the trust issues, pooling retired mining rigs into distributed supercomputers for rent would be viable.


About trust: you could distribute the load randomly and do the work twice; if the two results don't match, you do the calculation again and blacklist the node that was wrong. (Toy sketch below.)
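
A toy sketch of that scheme in Python (hypothetical API, not an existing framework; it assumes at least three non-blacklisted nodes, and in practice the tie-breaking node is itself only probabilistically trustworthy, so you'd repeat until a quorum agrees):

    import random

    def run_verified(task, nodes, execute, blacklist):
        trusted = [n for n in nodes if n not in blacklist]
        a, b = random.sample(trusted, 2)           # two independent workers
        ra, rb = execute(a, task), execute(b, task)
        if ra == rb:
            return ra                              # agreement: accept the result
        referee = random.choice([n for n in trusted if n not in (a, b)])
        truth = execute(referee, task)             # tie-breaking re-run
        for node, result in ((a, ra), (b, rb)):
            if result != truth:
                blacklist.add(node)                # punish whoever was wrong
        return truth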


Yes, but you won't want any sensitive data processed on the platform unless what is processed is an irreversibly transformed, unrecognizable version of it.


It seems as though the large PC companies like Dell are still getting high-end graphics cards. Ironically, it's probably cheaper to buy an Alienware right now than to build a PC from parts.


I was this close to building a hackintosh with high-end parts (before all this, it would have saved me a lot), and now the iMac 27" 5K doesn't look bad...

If only Apple offered bigger SSDs at lower prices...


The trade off is that the parts that go into a machine that don't affect its performance on paper are usually sub-par. You can get a 1000W PSU from some no name OEM or you can get a 1000W PSU from e.g. Corsair. Both supply the amount of power your rig needs but one is cleaner, lasts longer, quieter, etc.


One of the biggest culprits, Ethereum, is working to change its consensus model to Proof of Stake, which will remove the need for mining. This should help with a lot of the demand.


After I found and disclosed a vulnerability in a graphics card manufacturer's site, they offered me cash or a brand-new GTX 1070, which was sold out everywhere at the time and was selling for about 2x the retail cost on eBay.

Not sure what you can take away from this, but could it be possible that the "shortage" is in part manufactured by the card manufacturers themselves? (Or maybe they thought this would be a special reward I might have wanted?)


Curious: what will happen when it's no longer economical to mine Ether? Especially if it happens suddenly (e.g. after a crash or a switch to proof-of-stake).

Won't the opposite situation appear, where second-hand GTX 1070s and GTX 1080s suddenly flood the market and sell for $100?

Or will miners just switch their rigs to whatever coin is possible to mine with their rigs, instead of selling off their rigs?


Probably depends on if it's a broad crash or a single-currency one.

If it's just one currency people would probably switch.

I suspect that when there's a crash it's going to shake confidence in all crypto and you're going to see them all drop significantly in value.


Isn't this a pricing problem then?


Not really; if NVidia set their MSRP to $1,000 then gamers would still complain.


To be fair, any time anyone does anything, gamers complain.


Maybe there's something I missed about GPUs and mining, but how can an amateur rig like these hope to keep up with ASIC rigs? Can you profitably mine with residential electricity and GPUs still? How long can that possibly last?


Algorithms that are memory-hard cannot be profitably mined using ASICs. For now. The problem is endemic to computer architecture and physics, so it is unlikely to change for the major algorithms like CryptoNight and Dagger-Hashimoto (a toy illustration below).
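
Here's a toy Python illustration of the shape of such an algorithm (it mimics the structure of Ethash/Dagger-Hashimoto, not its real constants or hash functions): the hot loop is dominated by random reads into a dataset too large for on-chip SRAM, so a custom chip gains little over commodity hardware with fast DRAM.

    import hashlib

    DATASET_WORDS = 1 << 20                # toy size; Ethash's DAG is gigabytes

    def make_dataset(seed: bytes) -> list:
        words, h = [], seed
        for _ in range(DATASET_WORDS):
            h = hashlib.sha256(h).digest()
            words.append(int.from_bytes(h[:8], "little"))
        return words

    def toy_pow(dataset, header: bytes, nonce: int, rounds: int = 64) -> int:
        seed = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        acc = int.from_bytes(seed[:8], "little")
        for _ in range(rounds):
            acc ^= dataset[acc % DATASET_WORDS]  # memory-latency-bound random read
            acc = (acc * 0x9E3779B97F4A7C15 + 1) & 0xFFFFFFFFFFFFFFFF
        return acc

    dag = make_dataset(b"toy epoch seed")
    print(hex(toy_pow(dag, b"block header", nonce=0)))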


Some cryptocurrencies are mined using only ASICs and others are mined using only GPUs (because there aren't any ASICs for those algorithms).


Some (like Monero) are also still able to be efficiently mined by CPUs. Mining Monero isn’t very profitable right now though. It’s probably half as good as ETH.


It's covered in the article: Different currencies use different hash algorithms. Ethereum takes a lot of memory (gigabytes apparently). No one's building ASICs with GB of memory, at least so far.


>> No one's building ASICs with GB of memory, at least so far.

It's also nearly impossible to do at any price that approaches reasonable.


There are no ASICs for Ethereum.


Shouldn't be surprised that the stock prices of the graphics card makers are skyrocketing


It's great. Once Ethereum etc. crash, we can buy used high-end graphics cards cheap.


What is the basis for the overwhelming assumption on HN that crypto will crash? I am genuinely curious. What specifically indicates that it will crash?


Most people here are engineers and can see through the buzzwords all these crypto tokens are throwing around to lure in naive investors. So it's assumed that it is a bubble, as the majority of tokens are worthless from a technical perspective.


It's basically as if there were 300 new social media networks all popping up within a year, and 99% of them were clones of an open-source social network off GitHub, with a few minor alterations and a ton of marketing fluff, while perhaps 2-3 networks (Facebook, Instagram, Reddit) command 90% of the market.

There's just no utility behind all these coins. Hell, there isn't even a significant number of Bitcoin users, let alone Ethereum or Ripple users, or users of any of the more ridiculous coins. They all present themselves as interesting for their utility, but are valued as an extremely short-term store-of-value play. It just can't last.

Lastly, note that there are only 3 cryptos with a market cap of at least 10% of Bitcoin's, and one is Bitcoin Cash, so it doesn't count. The others are Ripple and Ethereum. Ripple involves no mining at all, and Ethereum is expected to go proof of stake in the next hard fork, likely this year.

Just to put that into perspective: Nvidia's revenues are about $7b a year, AMD's about $4b; call it $11b together. They command pretty much 100% of the discrete GPU market, with Intel being the major integrated player, which is irrelevant in this context. Then look at https://digiconomist.net/ethereum-energy-consumption for an idea of mining revenues: about $14b a year, with after-electricity profits estimated at about $12b a year. Now you'd expect that a miner who can make $12 by turning on a device (with its electricity already paid for) is willing to pay anywhere between $0 and $12 for the privilege, and that with enough competition, they'll spend close to $12.

That gives you a rough idea of how significant Ethereum is as a player, with Bitcoin running on ASICs, Ripple not mined at all, and most other coins being quite insignificant (and also either proof-of-stake or ASIC-based, like Litecoin, anyway).

So in short: I think cryptos will crash, and even if a lot of them remain, I'd argue a lot of this price pressure is due to the few big coins, of which only Ethereum has viable GPU mining, and that'll end in 2018.


It doesn't have to crash. It just has to become uneconomical to mine on GPUs, for example by becoming too hard, or by switching to proof-of-stake. The question is rather whether there is another coin that will take over the workload of the rigs (I know too little about cryptocurrencies to know whether another coin will have the same properties as Ether, i.e. be economical to mine on GPUs).


In the short term I think we may see Bitfinex/Tether blow up pretty soon. Just this last week they printed around $500M in fake money to keep bitcoin prices from falling.


You really want to buy a card a miner has pushed hard for weeks and months on end with who-knows-what cooling?


Depends how far they crash. If it's cheap enough, it might be a worthwhile trade-off.


I’m not sure I’d want a card that’s been at full utilization, 24 hours a day, for who knows how long.


On the other hand, you can be sure these cards have made it well past the "infant mortality" part of the bathtub curve.

In fact, military and other high-reliability applications specify "burn-in" periods where components are operated at elevated temperatures for several hundred hours --- they are effectively "used":

http://reliabilityanalytics.com/reliability_engineering_libr...


I recently learned something really fascinating about semiconductors: as they heat up, the dopant atoms tend to move by diffusion. Heat = more diffusion.
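
The textbook description of this is Arrhenius-type diffusion: dopant diffusivity grows exponentially with absolute temperature,

    D = D_0 \, e^{-E_a / (k_B T)}

where D_0 is a material-dependent prefactor, E_a the activation energy, k_B Boltzmann's constant, and T the absolute temperature. A modest rise in T means a large rise in D, which is why sustained high temperatures matter so much.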


Another fun fact: Chips are designed to last years at full power and max temperature.


It's actually the temperature cycling that's far worse for them.

Plenty of available reading on cryptocurrency mining forums.


Yep: atoms in general move more with more heat. In solids, the energy required for movement is higher, but it still happens. The more heat available, the more energy you have to break bonds, the more diffusion (movement) you have.

I have a feeling that's not the main reason for heat failure of semiconductors though. (Not to put your comment down, it's an interesting fact and I'm glad you mentioned it.)


The weak link on a lot of hardware is the capacitors - I'm not sure about GPU cards, but if that's the case then you sure won't be replacing that surface mount stuff when it fails.


Why? All you need is a heat gun and some spare parts.


I've read that thermal cycles do more damage than constant running.


Yep, this applies to most things, e.g. engines.


If I can buy a 1080 for $100 when they flood ebay, I'm willing to take my chances.


Sell pickaxes (during a gold rush).


Go out of business after gold rush ends, because market is flooded with barely-used pickaxes being sold at a loss by failed miners.


I'd be glad to make a bet with you that the 1080ti doesn't collapse significantly after the "gold rush ends."

Come to think of it, I'd like to hear your predictions about when the gold rush will end. Or what the gold rush even is.


What, in your definition, are the pickaxes?


About two years ago, I bought significant amounts (for me) of stock in both AMD and Nvidia on the selling pickaxes theory. I'm quite happy with my choice.


[flagged]


>You are buying expensive computers to solve mathematical puzzles with the desperate hope that it will get you rich fast

There are plenty of worse hobbies and professions for trying to get rich quick, like buying lottery tickets or investment banking.

Instead of focusing your tangent on a small group of individuals, why not focus on entire industries built on destroying the environment?


Whataboutism will convert nobody to your side.

Cryptocurrency is a bad thing. Unrelatedly, other bad things exist.


>Whataboutism will convert nobody to your side

Rationality isn't entirely about getting some number of people onto one side. Plenty of people flock to the wrong side.

Your post is akin to complaining that people kill innocent spiders while homeless people die every day. The way you prioritize cryptocurrency as a huge evil is completely silly and nonsensical.


Global warming is a bit more serious than killing spiders.


Why is building and tuning a mining machine to get the most hashes per second more of a waste than tuning a car to squeeze out the most horsepower?


Funny that in this image in the article:

https://cdn.arstechnica.net/wp-content/uploads/2018/01/bare_...

The entire shelf is sold out of GPUs, but it's a near-lock that the RX550 2GB model for $71.xx had a higher ROI than any single card purchased for cryptocurrency mining.

Now, that being said, gamers (and to a lesser extent, deep learning researchers) need to shut up and enjoy a good thing while it lasts. Gamers who have 1080 Tis (and don't get close to their maximum performance, nor need it) just walked into a situation where they can print money with equipment they already have or would buy anyway - in their spare time when they aren't gaming.

It's very odd to watch a group of people bitch about being handed a bunch of free money, but gamers manage to do it.


That is in my opinion an overly simplistic way of looking at it.

First of all, many people (myself included) would not want to have to burn electricity in order to offset the increased price. Even if you disregard the obvious environmental reasons, people sleeping close to their computers simply do not have the option of leaving them running 24/7, unless they want vivid dreams about vacuum cleaners every night.

Second, even if the higher prices can be evened out in the long run from mining, this forces people to pay more money up-front in both hardware and electricity. Buying gaming equipment suddenly became an investment.


> gamers [...] need to shut up and enjoy a good thing while it lasts.

I don't consider what is happening to be good. I don't think cryptocurrencies are a positive change for the world at large nor for gaming in particular. I don't want to make money off of my hobby, I want it to be my hobby.

> It's very odd to watch a group of people bitch that they are handed a bunch of free money, but gamers manage to do it.

Gamers are being handed insanely inflated prices for hardware, and many of us don't want to deal with the crypto bubble - we just want to game.

I'd flip this on its head. Miners and AI enthusiasts really should be thanking gamers: we've put up the money that allowed GPU research to reach the point it's at today.



