Yup! Since 1993… I started with Linux on my desktop and haven’t looked back.
I thought you were going to say you liked lint (the source code checker).
I have one set up as an irrigation controller. I was going to build an OpenStack cluster to test configuration settings on (I run a production cluster at work), but gave up when the supply chain problems happened and prices skyrocketed.
I still use Perl for most things – it’s my go-to language when I have to get something done quickly. And quickly doesn’t have to mean small one-liner scripts.
My biggest reason for using it is that mod_perl is still blazingly fast.
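For anyone who hasn’t seen it, a mod_perl 2 response handler is just a Perl module that Apache calls directly, with the interpreter already loaded in the server process – no fork/exec per request like old-school CGI, which is where the speed comes from. A minimal sketch (the My::Hello name is made up):

    package My::Hello;

    use strict;
    use warnings;
    use Apache2::RequestRec ();               # provides $r->content_type
    use Apache2::RequestIO ();                # provides $r->print
    use Apache2::Const -compile => qw(OK);

    # Apache calls this for each request; no new process is spawned.
    sub handler {
        my $r = shift;
        $r->content_type('text/plain');
        $r->print("Hello from mod_perl\n");
        return Apache2::Const::OK;
    }

    1;

Wired up in httpd.conf with something like:

    <Location /hello>
        SetHandler perl-script
        PerlResponseHandler My::Hello
    </Location>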
I thought this was an article about the X Window System based on the preview for the article. Boy, are those two similar-looking.
Probably like the peanut butter, because I wouldn’t want it to be confused with my gorilla, whose name is spelled the same way but pronounced with a hard ‘g’ sound.
I loved QuickBASIC. I’d write Assembly Language routines in Turbo Assembler and call them from QuickBASIC.
I wrote a DeskMate clone for fun, and it was actually pretty good; TASM gave it decent performance.
Dvorak keyboard mode enabled.
Maybe they ‘won’, but I don’t count a Pyrrhic victory as winning. It will take years to recover.
Thank you. I hadn’t considered the payment part. The cloud system that I manage is in education, so everyone pays in advance.
This makes sense, and I’ll start with a lower number and ask to have it raised later. It will take a couple of months to migrate everything from Linode anyhow, so I don’t need them all at once.
My identity infrastructure alone uses a whole bunch of servers.
There are the three Kerberos servers, the two clusters of LDAP servers behind HAProxy, the RabbitMQ servers to pass requests around, and the web servers, also load-balanced/HA behind HAProxy… For me, service reliability and security are two of the biggest factors, so I isolate services and use HA where available.
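The LDAP balancing, for example, is just HAProxy in TCP mode. A rough sketch of the idea (the names and addresses here are made up, not my actual config):

    listen ldap
        bind *:389
        mode tcp
        balance roundrobin
        option tcp-check
        server ldap1 10.0.0.11:389 check
        server ldap2 10.0.0.12:389 check

HAProxy just verifies that each backend accepts TCP connections and spreads queries across whichever LDAP servers are up.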
I told them everything that I wrote here in my original request – I need 25 now, but would like a quota of 50 to allow for elasticity, testing, etc.
They followed up with a request for the actual resources needed.
I haven’t answered since then.
I loved Reddit until I realized they were just going to do whatever they wanted, and that the community didn’t matter beyond providing free content and labor. But the lying about discussions with the app creator was the straw that broke the camel’s back.
Suddenly they weren’t just a bully; they were a proven, dishonest, lying bully. Everything they say going forward will be suspect, so I decided to walk away. Who knows what they’re doing with my data/content? I know what they’re telling me. I don’t know what’s true.
I deleted most of my posts from my nearly 14-year history, except for a handful that I think need to stay up and a couple of others that I’m testing something on. I log in every once in a while to leave any subreddits that might have unlocked since I was last there and to delete those posts too.
I don’t hate them. But they’ve lost my trust, and I don’t see any way to regain it.
There could have been other, better solutions. The biggest problem right now is that the only tool in Steve Huffman’s toolbox is a hammer.
I kept my account, but deleted most of my posts/comments from the past thirteen years unless I felt it was super important to leave them. Some I’m leaving while watching to see what happens with the subreddit.
I’m done with Reddit though. I didn’t care so much about the API, but when they started lying about talks with the developer and then running roughshod over everything, it became clear that they can’t be trusted. How do I know that I’m even being presented with an accurate view of the world – are the moderators hand-picked by Reddit to push an agenda? Can a corporation/government/political campaign buy moderators brokered through Reddit now?
Too many questions with no good answers. So I’m glad I’m gone.
I’ve been doing this for 30+ years and it seems like the push lately has been towards oversimplification on the user side, but at the cost of resources and hidden complexity on the backend.
As an Assembly Language programmer, I’m used to programming with consideration for resource consumption. Did using that extra register just cause a couple of extra PUSH and POP instructions in the loop? What’s the overhead of that?
But now some people just throw in a JavaScript framework for a single feature and don’t even worry about how it works or what the overhead is, as long as the frontend looks right.
The same is true with computing. We’re abstracting containers inside of VMs on top of base operating systems, which adds that much more resource utilization to the mix (what’s the carbon footprint of that?) along with an extremely complex but hidden backend. Everything’s great until you have to figure out why you’re suddenly losing packets that pass through a virtualized router to linuxbridge or OVS to a Kubernetes pod inside a virtual machine. And if one of those processes fails along the way, BOOM! It’s all gone. But that’s OK; we’ll just tear it down and rebuild it.
I get it. I understand the draw, and I see the benefits. IaC is awesome, and the speed with which things can be done is amazing. My concern is that I’ve seen a lot of people using these things who don’t know what’s going on under the hood, so they often make assumptions or mistakes that lead to surprises later.
I’m not sure what the answer is, other than to understand what you’re doing every step of the way and to always choose the simplest route (but future-proofed).