Fuck the cloud (2009) (textfiles.com)
149 points by downbad_ 13 hours ago | hide | past | favorite | 88 comments



Up to the early 2000s, people would go to the internet to have fun. Everything was new; it was the mass migration from the analog to the digital era.

In the 2020s, people are going offline to have fun.

Homelab is becoming a thing even for people who never had experience with computers: people hosting their own documents, movies, music, and backups in case things go bad.

Even some companies have realised the true price of going cloud; some are moving back to on-prem hardware with full control.


Before we had Facebook and iPhones, the only people able to run a home lab were technically adept. In 1998 I used AvantGo and Vindigo to browse the news on the train and find restaurants when nobody else could. In 2005 I remember running my Netgear MP101 UPnP player, and everybody was impressed that I could stream music.

Then we made things like iPhones and Facebook, which got everybody on the internet, and we made all the “hard” things like music, video, news, reservations, etc. a “service” – democratizing it (what a nice word). But for technical people it was actually shittier than just running it on your own. Not right away -- there was a small overlapping period from 2005 to 2014 or so, where the pace of technological advancement was complementary to hosting it yourself, but after the corporate monopolies got fully involved everything just went to shit.

I think it has come full circle again, where the “technically illiterate” will just consume the shitty services and be happy or oblivious to it – they are actually serfs giving their labor/money away and they don’t care. The rest of the technical folks are just going to do their own thing again because we’re sick of the crappy services. And it will be better than the general public can ever do, just like 1998 again.

"Homelab is becoming a thing even for people who never had experience with computers, people hosting their own documents, movies, music, backups in case things go bad."

Does the term "hosting" come from "web hosting", or from some earlier terminology?

Does the term "hosting" in the "homelab" context mean storing data locally on your own computers, or running locally stored programs?

If yes, could the term "storing" be used instead?

If no, then why is "hosting" the term used?

This is a sort of rhetorical question. I think I know why, but I'm looking for clarification.


Well, we're talking about network services. Which run on a host machine.

Not entirely unlike how viruses (or memes) execute on a host organism ;-)


> Homelab is becoming a thing even for people who never had experience with computers

This sounds like a filter bubble plus wishful thinking. Most people can barely manage their phone settings, let alone run a homelab.


> Homelab is becoming a thing even for people who never had experience with computers, ...

Oh totally. I got my brother, who lives on the other side of the world and who's not a dev/sysadmin, just a poweruser, to install Proxmox and he's now using GPU passthrough to have VMs run different AI models (in either Linux or Windows) for image generation, experimenting, etc. He's also got a NAS with RAID etc.

To me a homelab is the 2020s version of having fun with computing: there's something incredibly refreshing in disconnecting my sub-LAN from the Internet and still having music, movies, a private pastebin (yup, I use this at times between computers for simple stuff I don't want to bother scp'ing), private Git repositories, a complete backup system (including offline HDDs/SSDs that I rotate into a safe at the bank), etc.
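A private LAN pastebin of the sort described can be absurdly small. As an illustrative sketch (my own toy, not the commenter's actual setup), Python's standard library alone is enough for a trusted sub-LAN: POST a blob, get back an id, GET it from the other machine.

```python
# Toy LAN "pastebin": POST text to /, GET it back by id.
# Sketch only -- no auth, no TLS, for a trusted sub-LAN.
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

pastes = {}  # id -> raw bytes

class PasteHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        paste_id = str(len(pastes) + 1)
        pastes[paste_id] = self.rfile.read(length)
        self.send_response(201)
        self.end_headers()
        self.wfile.write(paste_id.encode())  # reply with the new id

    def do_GET(self):
        body = pastes.get(self.path.lstrip("/"))
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the terminal quiet
        pass

def serve(port=0):
    """Start the pastebin in a background thread; port=0 picks a free port."""
    server = ThreadingHTTPServer(("127.0.0.1", port), PasteHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Obviously anything real should at least sit behind the LAN firewall, but it shows why "private pastebin" barely registers as infrastructure.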

A movie projector, a dumb one, is another very cool thing: connected to nothing but an HDMI cable (not that HDMI is the best standard ever, but it does the job).

And to be sure I can still code and work without having a nanny holding my hand as if I was a toddler, I regularly have coding sessions where I don't use Claude Code (but I also pay for a subscription: these things aren't mutually exclusive).

For anyone who wants to have fun, a used HP workstation with ECC memory is basically $200 and makes a perfectly fine server at home. It doesn't need to be up 24/7 either: the only service I keep up 24/7 is my unbound DNS resolver, and I run that one on a Raspberry Pi (for the low power consumption). The rest of my homelab (two Proxmox servers) is basically something I only need when I'm awake/at my desk, so I turn them off at night.
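For reference, a recursive unbound resolver of the kind described needs only a handful of lines; this is a hedged sketch (the interface address and LAN range are placeholders, not the commenter's config):

```
# /etc/unbound/unbound.conf.d/lan-resolver.conf (sketch)
server:
    interface: 192.168.1.2              # the Pi's LAN address (placeholder)
    access-control: 192.168.1.0/24 allow
    # No forward-zone configured: unbound recurses from the root servers
    # itself, so no upstream provider sees your queries.
    prefetch: yes                       # refresh popular entries before expiry
    cache-min-ttl: 300                  # optional: smooth over short TTLs
```

Point the router's DHCP at the Pi and every device on the LAN resolves through it.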

You never go full cloud.


It's kind of funny that people are talking about "home labs" as a new thing because I've been running some form of servers on consumer PC hardware in my home since around 1998. For me this was an inseparable part of getting to know Linux and *BSD in that era.

I guess I'm just old though.


>I've been running some form of servers on consumer PC hardware in my home since around 1998

My excuse is that I never had the financial stability that I have now in my mid-30s to get things going; moving overseas and whatnot didn't help either.

But I didn't go crazy, I have 3 Proxmox servers running a few services, Pihole + Unbound as recursive DNS to avoid DNS poisoning and personal data tracking.

A DIY TrueNAS as the primary system to have a copy of my data.

I have a 4K Blu-ray player and physical media, but I also have Jellyfin, because nothing matches 80s, 90s, and early-2000s movies, and buying DVDs in 2026 is pointless. Also, it is not easy, or it is very, very expensive, to find a Blu-ray copy of old movies in 2026. Jellyfin solves that.

All my servers consume 110W / 200VA tops, connected to a second-hand APC 1000VA UPS.

If the whole world goes to shit right now, I can still run all my stuff without depending on the internet.

My last goal is to have a solar/battery system, so if WW3 really happens, sending us back to the cave age, wherever I am will still be the 21st century.


How does Jellyfin solve finding Blu-ray copies of old movies? Unless you mean you just pirate them? Jellyfin isn’t just for movie pirates.

You can still borrow a lot at your local library and rip them yourself.

1988. On a math TA salary I paid $600 for an 80MB (that's megabytes) hard drive. I had dialup. I also had Turbo Pascal and an 8087 coprocessor. I was an MS student in computational math, AKA numerical analysis.

It was goddam glorious.

It took until 1995ish to have a homelab to experiment with FreeBSD and later Linux over a 10BASE-T network, with gcc/g++ and dialup access to this thing called "The World Wide Web". The browser had a throbber dinosaur.

It was even more goddam glorious.

Right now I've got three main systems with decent CPUs and 128GB of memory, and several ephemeral satellite systems. With 8GB of NVIDIA VRAM I'm running gemma4:31b just fine on my media system. Which, curiously enough, has, ah... media on it.

I feel like I have a good idea how EV owners feel right now. (We have a Prius.)


>I feel like I have a good idea how EV owners feel right now. (We have a Prius.)

The difference is that you don't own your EV; it is a computer on wheels. One firmware update, like Tesla has pushed in the past, and features are no longer available.

That is the total opposite of a homelab: you have full control, and you flash firmware that gives you full control over your devices.

I am a hardcore Linux user: my wireless access point runs Linux, my router is a Sophos bare-metal box running OPNsense (FreeBSD). My 3D printer is DIY, running Debian Linux.

That is the best thing about homelab, nobody can take it away from you, you own everything, it is yours and yours only.


"The difference is that you don't own your EV; it is a computer on wheels. One firmware update, like Tesla has pushed in the past, and features are no longer available."

Yeah, I think that's right.

I only thought in one dimension: reliance on the existing, corporate-controlled, high-density fossil-fuel delivery infrastructure. Which is worthless if the price is occasionally exorbitantly volatile, or supply might even run to zero.

Another equally important dimension is: that EV car might just be a puppet, and not you running the puppet.

I'm pretty sure the Prius (a 2015) doesn't phone home, but I admit that I've not gone deep into it.

I can't stand this thing I just did in this comment where I tried not to sound like an AI. I might have to give up short comments entirely because I can't generate enough context for authenticity credibility. <= It's a fact, and that right there sounds like AI to me now.


"since 1998"

We're old.

A lot of HNers weren't born yet.


> and who's not a dev/sysadmin

> He's also got a NAS with RAID etc.

https://xkcd.com/2501/


"Oh totally. I got my brother, who lives on the other side of the world and who's not a dev/sysadmin, just a poweruser, to install Proxmox and he's now using GPU passthrough to have VMs run different AI models (in either Linux or Windows) for image generation, experimenting, etc. He's also got a NAS with RAID etc."

Dude, this is way more than "power user"; you're being unserious.

If you tell a genuine power user, someone comfortable with Windows registry edits, Office macros, maybe some light PowerShell scripting, that they can "totally do what my brother did", and the actual task list is Proxmox installation, IOMMU group isolation, VFIO stub drivers, GPU passthrough debugging, multi-OS VM management, subnetting, and RAID and HBA configuration, you're setting them up for a brutal wall of frustration.
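For anyone calibrating "power user", even the best-understood slice of that list (binding the GPU to vfio-pci so a VM can own it) looks roughly like this on a Debian-based host. A hedged sketch of the usual procedure, with placeholder PCI IDs, not a complete recipe:

```
# /etc/default/grub -- enable the IOMMU (Intel shown; AMD uses amd_iommu=on)
GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"

# /etc/modprobe.d/vfio.conf -- claim the GPU (and its audio function) for
# vfio-pci instead of the host driver; the 10de:xxxx IDs are placeholders
# for whatever `lspci -nn` reports on your card
options vfio-pci ids=10de:1b81,10de:10f0

# then: update-grub && update-initramfs -u, reboot, and verify with
#   lspci -nnk    (the device should show "Kernel driver in use: vfio-pci")
```

And that's before IOMMU group isolation quirks, host-driver blacklisting, or anything breaks on a kernel upgrade; hence the "brutal wall of frustration" point above.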


>you're being unserious

PSA: the answer to "you're being unserious" starts with "are you fucking kidding me?"


I'm surprised people are advocating self-hosting as a viable solution. It takes a lot of knowledge to do sync and backup yourself, most of it implicit knowledge that people here don't realize we have, so to us it seems very easy.

There was a comment in another post on the front page about how anyone "remotely technical" can set up a docker container, and I think this is a good example because the mechanics of it are simple (edit a couple text files, run a couple commands), but half the world couldn't tell you what a terminal is and they're focused on other things in life instead of learning how computers work. Cloud succeeded because cloud is easy (at least in the beginning), it's that simple.

If we are to solve this problem, we're going to have to make self-hosting easy enough for the average 7-8 year old to do without struggling. One promising way forward is local-first E2EE sync and backup. The only good implementation I know of personally is Obsidian Sync, which has a UX that I adore and hope to see more of in the future. There are other good options too, but none that I'd feel comfortable trusting a seven-year-old to execute correctly on the first try.


I agree about Obsidian Sync, I'm a happy user.

A distinction worth making is between "self-hosting" (running docker-compose, Proxmox, etc.) and "local-first software" (applications that store data on your own machine with no cloud infrastructure required). The former is hard, the latter is just how desktop software worked before SaaS took over.

In small-business software the shift has been nearly total. Tools aimed at craft makers, small food producers, etc. have almost universally migrated to monthly subscriptions. The practical result: you're paying tens to hundreds of dollars a month to track whether you have enough beeswax for your next candle batch, the price increases annually, and if the vendor folds you get 90 days to export your data (if you're lucky).

These users won't set up a homelab, but a desktop app that installs normally, stores data locally, works offline, and has a one-time price is achievable - I've been building one [1] and it's a reasonable middle ground between "trust us with your data forever" and "configure your own NAS."

[1] https://kitted.site (inventory and production management for small manufacturers)


> The only good implementation I know of personally is Obsidian Sync

Obsidian Sync gets around a major platform problem with Apple iOS devices, which is that they don't allow one app to change the data of another. You can use Syncthing for local E2EE sync, but it won't work on your mobile because of this. It works fantastically machine to machine. I'm paying for Obsidian Sync now just to get around that, but it shows how some of the platforms are built to prevent functionality. Ostensibly it's for security, but I'd argue the benefit is mainly financial for app makers (and therefore their app store cut).


> If we are to solve this problem, we're going to have to make self-hosting easy enough

It used to be easy enough in the 90s, when plenty of folks had their own custom website. You signed up with a hosting company; they provided you with a bunch of different ways to upload files; your website was hosted.

Of late, I've decided that the problem is that HTML development halted at what is still a very beta product. HTMX is a reasonable attempt to continue the spirit of HTML. Where I'm going with this is that I think HTML should have continued adding enough reasonable features, without needing JavaScript or massive amounts of CSS, that most websites most people wanted to build would still be straightforward to do.

HTML stopped before it even had a usable <table> with sorting and filtering defined, so we've spent decades with inferior tables in every web app that all suck compared to what we got used to with even Windows 3.1 apps. HTML finally grew date and colour pickers, but it should have had all kinds of rich UI controls and behaviour that would have made it totally unnecessary for people to build all the JavaScript middleware that essentially treats the browser as a display canvas and otherwise reimplements the GUI from scratch. And we wonder why the beautiful new Macbook Neo is kneecapped by only having 8GB????

It's time for HTML6. My standard will be: everything a restaurant website needs should be basically batteries included, with the exception of an e-commerce backend. It should all be doable in static HTML files with almost no Javascript and really just enough CSS to set artistic theming elements instead of having to do arcane shit just to position things.


> You signed up with a hosting company; they provided you with a bunch of different ways to upload files; your website was hosted.

But that is not self-hosting. You're still using a cloud service. The problem is how to run something local, at home, that you have full control over.


> half the world couldn't tell you what a terminal is and they're focused on other things in life instead of learning how computers work.

Thankfully, the converse: the computers these days are focused on nothing else but learning how humans work.

Hell, half the world doesn't even have a computer with a physical keyboard.


The irony of this is hilarious

> Resource Limit Is Reached

> The website is temporarily unable to service your request as it exceeded resource limit. Please try again later.


A copy for people who want to read the article:

https://archive.md/Q0DYu


So what? It will be back up.

...and the person just lost the majority of the traffic that would've gone to his site

yeah I'm sure Jason is just fine with that.

Oh no! All those ad views they missed out on!

Yeah I get the impression that making money from advertising isn't the primary reason that site exists. I know, hard to fathom right?


As it appears to be hugged to death, archive link: https://archive.ph/qsdc3

Sometimes you fuck the cloud, sometimes the cloud fucks you

Maybe the old man is on to something.

Forgot to put a cache on it probably. :)

Nope


lol, as a VPN user, I get to read nothing. No offense to archive.org, I get it.

Ironically I've been opening up Tor for archive.org lately and it seems to never be on the same blocklist the VPN IPs are on.

the irony

Flared by the cloud.. sic

The idea of offshoring computing is good. However, the cloud developed as a centralized computing platform instead of a distributed one. This has created power dynamics that harm customers. The same happened with social media, and has happened to other industries. I think it would be better for customers if there were many small cloud providers and they could easily switch between them. But even migrating from one cloud provider to another is a huge endeavor these days.

If you actually mean offshore as in located in a different country, or especially a different continent, then that is a terrible idea for latency for many forms of computation. There are acceptable use cases, e.g. when round trips are infrequent and average latency is already high, like batch workloads or some forms of LLM, but even then closer compute is pretty much always a better experience.

I feel this way about the sandbox in android.

I've taken to buying the occasional CD and DVD. While I still use Spotify more than physical CDs, I still have my old CD collection, and the sound quality is so refreshing. And soundtracks aren't on Spotify. With movies, it's hard to rewatch favourites because they don't stay on the streaming services. Again, it's much more satisfying to own them yourself.

I echo similar sentiments. It is high time to choose self-hosting over handing essentials to the cloud; you don't know when it could become inaccessible for a plethora of reasons. It is just that, every time I looked into setting up a home lab, it felt cost-prohibitively expensive.

> home lab

Well that's your problem right there. The home labber setups are for experimentation or "hot rodding" purposes and they typically way overbuild their solutions.

What most people need is an old desktop in a corner somewhere (preferably close to your router so you can get to it with an ethernet cable).

It won't be Grandma-proof, but if you're remotely technical you can write a docker compose file that glues together some useful home server utilities that sound interesting to you.

My setup is roughly speaking: Ubuntu LTS, ZFS (with 4 disks in a RAID10 style config), and a docker compose file that runs plex, transmission, syncthing, vaultwarden behind an nginx-proxy[1] container that even automagically renews my Let's Encrypt certs for me (though it's probably even easier if you use a Cloudflare tunnel).
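For a sense of how small that glue layer is, here's a hedged sketch of the compose pattern described (two services only; the image names and the docker.sock mount follow nginx-proxy's documented convention, but the domain and volume paths are placeholders, not the commenter's actual file):

```yaml
# docker-compose.yml (sketch)
services:
  proxy:
    image: nginxproxy/nginx-proxy
    ports: ["80:80", "443:443"]
    volumes:
      # nginx-proxy watches containers come and go and generates vhosts
      - /var/run/docker.sock:/tmp/docker.sock:ro

  vaultwarden:
    image: vaultwarden/server
    environment:
      - VIRTUAL_HOST=vault.example.home   # placeholder domain
    volumes:
      - ./vaultwarden:/data
```

Each additional service is another stanza with its own VIRTUAL_HOST; `docker compose up -d` and the proxy wires it up.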

If you're confident all your apps are available on these platforms, the storage part is easier with something like TrueNAS or Unraid. If you don't need storage at all, you can slim down your hardware a lot and just use a Raspberry Pi.

IMO, just find an old beater machine and get hacking :)

[1]: https://github.com/nginx-proxy/nginx-proxy


I moved my DO server to a Pi that was gathering dust. I agree, folks need to get off the cloud; find an old laptop or an old $40 Mac mini, they are usually low-power enough.

What makes it grandma proof is software that makes it extremely simple, which is like a home appliance, which is within the realm of possibility.

The simpler way to go on most fronts is some form of Proxmox with things like the above managed for you; it takes care of much of the overhead and doubt on its own, or through a reasonably point-and-click interface, which could be pre-configured.


Sounds like you’re describing Umbrel.

Could be, there should be more and more every year.

>It is just that, every time I looked into setting up a home lab, it feels cost prohibitively expensive.

You would say that if you looked into my 12U rack right now, but only 6 months ago all I had was 2x second-hand Dell SFF computers from eBay that might have cost me AUD 300.

Before that, I had one of those mini PCs with two network ports that cost me AUD 200. I installed Proxmox on it, then OPNsense (router) and Pi-hole as virtual machines, and it ran like that for years.

Install Proxmox on them and you can run everything.

This is the major misconception regarding homelabs: you absolutely don't need expensive gear. A single mini PC + Proxmox is all you need to start; try to have at least 16GB of memory, and a 256GB NVMe is more than enough to begin with.

Don't let those massive homelab setups you see on the internet tell you that's the only way :)


We're teetering on the brink of highly capable software agents that can run on a phone using a local model and manage things like basic digital hygiene and operating a self-hosted cloud. With Tailscale and other private VPNs leveraging your own home internet service, plus a well-maintained set of firewall rules and locked-down access, it's actually practical.

An inspired nerd can do it right now, but grandma will be able to do a curated, accessible set of things by the end of the year, and by the end of 2027 the internet and self-hosted things are going to be incredibly different. When people can self-host Plex and anonymously pirate anything, and their local model can do the ethically gray-area stuff like ensuring everything is done so they don't get caught - cloud services can't compete with that. Cable and Netflix and Spotify and the rest are going to have to up their game, and not do the stupid lashing-out, price-gouging, hunting-the-pirates type of thing, or they're just going to burn down faster.

We're headed for some really cool, interesting times.


It’s like drugs, the first hit is cheap or free, but you end up spending all your money and your entire life on it!

Just get an old server for free somewhere and go …


People overengineer homelabs all the time for fun and practice. To selfhost (which is not, in fact, the same thing), you can get a mini PC and probably host all of the basics. A small two-bay NAS plus a mini PC and you're really cooking with gas.

Homelab = Experimenting with environments you might use at work. Selfhost = Hosting what you need at home.

These are two very different goals with very different reasonable choices. People homelab with Kubernetes clusters, selfhosting with Kubernetes is dumb.


We've come full circle from mainframes (the 1980s cloud) and back again!


Thanks, Macroexpanded, and some others too...

Fuck the Cloud (2009) - https://news.ycombinator.com/item?id=10771539 - Dec 2015 (219 comments)

Fuck the cloud (2009) - https://news.ycombinator.com/item?id=2984083 - Sept 2011 (2 comments)

Fuck the Cloud - https://news.ycombinator.com/item?id=441885 - Jan 2009 (23 comments)


Resource Limit Is Reached - Hug of death

The irony.

That's really odd. If I remember correctly Jason Scott has talked in his podcast about how textfiles.com is a co-lo'd self hosted box. I wonder what "resource limit" got hit.

It really isn’t odd at all.


The reliability, speed, and internet connectivity make local-first more appealing. Honestly, I host my own webpage, file server, and some compute locally.

17 years on, no value created, trillions extracted and used for stock buybacks. Destabilizing the economy, raising borrowing costs, and making the Fed print, print, print. In the 90s there were 8000+ publicly listed companies doing real business. Now there are ~3600; 10 years from now there will be at most 1000.

Gambling and endless consolidation feel good for monkey brains. Governments are supposed to step in, but we have a heckin' Cheeto in the White House.


The sequel to Kiss the sky

Anyway, I love how well GDPR demonstrated this:

> Insult, berate and make fun of any company that offers you something like a “sharing” site that makes you push stuff in that you can’t make copies out of or which you can’t export stuff out of. They will burble about technology issues. They are fucking lying. They might go off further about business models. They are fucking stupid.


Counter-take: this was almost entirely wrong, and the author should be embarrassed looking back after 17 years.

I mean, it was 2009. How much of your personal data from then is still around on non-archival media you still control? Even among the geek set here, the answer is likely to be "almost none of it". At best it's "backed up" on media you haven't validated.

Or, more likely, copied somewhere else to keep it secured. Like... Dropbox or Backblaze or S3, one of those, you guessed it, CLOUD services.

Likewise, do you still have your email from 2009 online in a useful form? Gmail users, many of them in this very thread, still do.


All of mine. Music, photos, copies of important documents, archived sets of email (and Gmail) across different eras. My Facebook archive export, IRC & IM logs stretching back to ~2000. A lot of it even on SSDs, let alone HDDs, let alone "archival media". The spinning rust is mostly used for double- and triple-redundant copies of my music and photos, as well as the usual movie collection.

I'm not sure HN is the best place for such... anachronistic technological skepticism? A lot of us ARE going to be storing all that for shits and giggles.


Yeah, I have all this data backed up on a couple of different drives. IRC and ICQ logs going back to when I was a teenager. Digitised photos from when I was a kid through to the present day. Source code for projects I worked on from when I was 10. Rips of all the CDs I used to own. And yes, email exports dating back to about 2003.

I wish I kept more, honestly. It’s a beautiful record.

I think my most treasured possession is videos of myself and my parents from when I was young. I’m thinking of sitting my sister’s kids down in front of a camera for 15 minutes and getting them to talk about their lives. It’s beautiful to rewatch this stuff decades later. It’s transporting.


> How much of your personal data from then is still around on non-archival media you still control? Even among the geek set here, the answer is likely to be "almost none of it".

As other commenters have stated, maybe this isn't the best place to ask.

I'm definitely in the "almost all of it" camp. I have Diablo II game saves on my desktop that were carried forward directly from my Windows 98 SE box circa 2002-2003. As well as Linux ISOs I acquired on Kazaa while still on dial-up internet.


I have all my music from 2009, shuffled from drive to drive. It out-survived my subscriptions to on-demand music streaming services (I do Pandora for discovery but don’t like the feeling of building an Amazon streaming “library” that will actually vanish when I stop paying).

I think the drive that held my old home directory might have died, though.


Uhhh, me? My home directory has 20-30 years of documents, photos, emails, the email address itself, instant-messaging logs, etc. Even a downloaded zip of every comment I ever made on Reddit. (But not HN, I should look into that.)

The primary exception would be Google Photos pictures which were auto-uploaded from my phone that I haven't curated and downloaded yet.

I predict I will maintain my custom-domain email address much longer than if I had used Gmail, given the attrition rate of bannings without support.

> on non-archival media you still control [...] Or more likely, copied somewhere else to keep it secured.

Hold up, is this OR or XOR? It sounds like you're trying to add unreasonable (dis-)qualifiers. TFA isn't saying one must boycott "the cloud" and erase all data, it just advocates that you retain an independent copy.

> Dropbox or Backblaze or S3 one of those, you guessed it, CLOUD services.

I think that's conflating different use-cases.

* Having a regular offsite backup into S3 isn't that different from when the data was rsync'ed to a Linux machine I paid for an account on. Any cloud-ness is a remote implementation detail, not a change in the consumer relationship.

* In contrast, "all my photos are in the cloud and my friends and family can collaborate on shared albums" is different, it permanently moves the locus of control.


Funny you bring up Gmail as a positive example when they reneged on their promise of unlimited storage 5 years ago.

Most of my media is backed up on my Unraid server, the most important stuff is backed up on an external drive and I also have some things that exist in the cloud, which I do not trust, which is why it's tiered as least important.


Is your Unraid free and unlimited then? Funny argument, indeed.

I'm amazed at the number of people jumping out here to insist that people don't use or value cloud storage because of the existence of one or thirty or whatever kludgey manual solutions. I mean, I know you can store stuff manually. I still have all that junk too! It's fun. But I don't recommend it to friends or coworkers or family or anyone else because... well, duh, as it were.

This forum's cherished (and, apparently, deeply insecure) geek cred notwithstanding, THE MARKET walked straight into the arms of the cloud, and has derived immense value from it. Grandmothers have terabyte archives of their progeny's development and will take it to the grave, without needing to puzzle out (sigh) an unraid install.


I'm grandfathered in to get unlimited updates, though if they rug-pull on that, the drives are just formatted as XFS. It'd be a hassle to move to something like TrueNAS, but I could do it even if the OS stopped working. Even if Lime Technology completely disappears one day and makes every Unraid USB stick self-destruct, I'll still have physical access to the data.

Cloud services, like everything else in control of rent seeking companies, are getting worse. That was always the obvious, inevitable trap with all of this, with any system where you pay a subscription for remote access to a timeshare computer. Which isn't to say that it isn't useful, I even use it, but I don't rely on it.

You didn't frame your initial post around the market of grandmas; your rhetoric was targeted at those reading your post: "How much of your personal data", "do you still have your email".


I still have hour-long techno/house mixes that I downloaded from some dude who was trying to get into DJing in 2008 and did house shows or something, because we played on the same Garry's Mod server. They don't exist anywhere else on the internet as far as I could possibly find. Searching his DJ name doesn't bring up anything.

A UK trance artist called Deathboy left directory traversal open on his website about 23 years ago. Since then I've had a lot of MP3s that he's never released or put on albums, which is sad because a lot of them are pretty great.

Similarly (also from ~2003), the (Australian) ABC's website held a lot of recorded breakfast-radio show clips from when Adam & Wil hosted it, getting the awesome comedy band Tripod [0] to write songs in an hour. Many of these were released on their CDs, but nowhere near all of them.

Eventually that ABC server was shut down due to lack of government funds. There's a very good chance I'm the only one on the planet with these excellent songs & interviews from those shows.

[0] https://en.wikipedia.org/wiki/Tripod_(band)


Well have you uploaded them to archive.org?

I think very soon we will read “Fuck AI”.

I've been reading it for a couple of years already; where have y'all been looking? At marketing materials?

See e.g. https://aphyr.com/tags/the-future-of-everything-is-lies-i-gu...

(That whole series is also available as PDF; I recommend printing it, spiral-binding the printout kinda like a pirated college textbook, and surreptitiously leaving it in cafes and the like.)


Neither would be fucked if they were open source.

Or if they had a ton of viable competition.


Is not most cloud tech based on open-source? Without Linux I feel like we would have seen cloud take off 20 years later than it did.

Most cloud features are open-source tools with special sauce sprinkled in. But at the same time, these companies heavily fund said open-source projects, so I suppose it's not just pure community-based work.

Please show me the open source AWS.

It's an encrustified open-source offering where the original vendors aren't compensated. Where there's lock-in, proprietary offering creep, and highway-robbery billing.


>Please show me the open source AWS.

I will not, because that is not what I asked.

>Is not most cloud tech *based* on open-source?


No. It's not.

The proprietary pieces are 99% of the offering. It just wouldn't have ever worked without the open source bits.


[flagged]


It took me way too long to get that.

Can you more accurately translate this to English for us?

According to Google Translate: “A man who loves text files and wears a peculiar hat announces a plan to have sex with the weather.”

Hope this helps.


Keep hoping!


