I mean, this was 100% predictable.
And anyone who didn't think it would happen was willfully blind or just plain ignorant.
Every 2nd Microsoft OS is bad. It's normal for them. XP good, Vista bad, 7 good, 8 bad, 10 good, 11 bad.
Uh, no.
95 bad, 98 bad, 98SE good only compared to 98, XP actually decent, Vista only really bad because of the change in how drivers were handled and the lack of a robust driver library that resulted, 7 THE GOD KING OF WINDOWS OSes… The Best, The Pinnacle, The Peak, The Top of the bell curve. 8 was shit, 10 was more shit than 8, 11 is just spyware.
Seriously. Windows 7 was the first genuinely stable OS from Microsoft.
Everything before it required regular reformatting. Granted, the frequency of reformatting lessened over time, but it was still required. Like, Win95/98 needed it every 3 months or so, XP every 6 months to a year, just to avoid the bloat, the slowdowns, and the issues. Same with reboots; older versions practically needed a reboot every time you ran a heavy program.
Windows 7? My longest run between formats was like 4-5 years IIRC, and that was due to hardware changes, not any performance or maintenance need. And reboots? The only time that computer ever got rebooted was when a Windows update demanded it, or when the power went out. Neither of which was particularly frequent.
It was also slick, agile, easy to use. You didn't have to think about shit when you used Windows 7, you just did shit.
I'm not a fanboy, despite what this sounds like, but 7 was legitimately the best Windows OS; hell, it wouldn't take much twisting for me to say it was the best desktop OS, period. It was the first time ever that I was able to use the computer and not have to stop and think, “Well, I just finished running a heavy game, I need to reboot before I do something else.” I just stopped one heavy task, gave the background processes a second to finish up, then went right to another heavy task without issue or concern.
It also had a very good UI. But Windows has always had the best UI on the market by comparison, because they spent billions developing it so that even the most computer-illiterate could pick it up and use it with 15 minutes of instruction.
If you're calling 95 bad, I don't think you spent a lot of time in 3.1. Resolving IRQ conflicts, configuring winsock.DLL, whatever the hell else. 95 had its issues, especially on the gaming side, but it was leaps and bounds better than what came before. Meanwhile, 98SE was good enough to keep people, especially gamers, on it for a long time.
Actually: 2K, ME, XP.
2K December 1999, ME June 2000, XP October 2001.
So the good-bad-good pattern is preserved.
No. They're all bad, some are just worse than others. You've all just been Stockholm-syndromed into thinking better of the “less bad” ones.
Everything after Win7, I'd agree. Windows 7 was actually legit. It ran fine on my AMD Athlon with 512MB of RAM. Ran Dolphin back in the day too. After that, though, it was all shite.
No, 7 sucked too. It just came off the back of Vista, which was a real hot mess, so 7 appeared better.
The thing is, Microsoft has always had an adversarial (or abusive) relationship with its customers, forcing things on them that most of them don't want. Like Active Desktop and IE integration in Windows 9x, “activation” and the Fisher-Price UI in XP, the bloated (for the time) Aero UI that required a 3D-capable GPU in Vista, UAC in Vista, forced automatic updates in 7, abandoning the Start menu in favor of that awful tile UI in 8.x, telemetry you can't disable in 10, a Start menu that acts more like an app store and advertising space in 10, forced TPM and Microsoft accounts in 11 … the list is endless. And then when they back down on one thing, people are like: “Hurray, the czar heard us! Windows is actually good now!” … forgetting all the other things they have been forced to swallow in the past.
I generally agree, but I feel like Windows 8.1 was a vast improvement on 8. It was really more like Windows 9 with a Windows 8 theme.
This is just Vista all over again. Calm down people. Go to Linux or church if you’re scared.
The difference is that you could just keep using XP until Win7 was released, or keep using Win7 until Win10 was released. Win10 will reach end of life next year, and then the only supported Windows will be Windows 11. Vista and Win8 were never as forced as Win11 is now.
Not really, because Vista didn't have strict hardware requirements. This one does.
Vista was absolutely the slowest thing imaginable. They reduced the requirements as part of the “Vista Capable” marketing campaign, but PCs that ran it “well” were few and far between. Even after 7 came out, if you went back to Vista it was noticeably slower.
I decided to look up what that term meant.
The minimum spec seems to be an 800MHz system with 512MB of memory. No, Vista will not run well on that. Even Windows 7 will not like it. Windows XP with SP3 will run on that, but even XP will feel sluggish at 800MHz.
That's like early XP computers being released with 64 or 128 megs of RAM. That may meet the minimum spec, but it's not gonna be usable.
Today, sure.
2005 was a different story, and the opposite of this one.
While Vista didn't have high stated requirements, it gobbled resources, so upgrading from XP to Vista came with a noticeable slowdown.
Win11 is the opposite of that story. While modern PC models (as in, 5 years old when Win11 first came out) can run Win11 fine, Microsoft enforces requirements that aren't needed.
Sure, having a better TPM and a newer processor is a good thing, but turning everything below that into e-waste (Windows runs 90+% of consumer PCs, with Apple making up most of the remaining 10%) definitely isn't.
wants people to use Windows 11
makes it difficult to use Windows 11
people find ways to use Windows 11 anyway (what you wanted in the first place)
punishes them for using Windows 11
???
People who are running a Windows install modified to disable the hardware eligibility checks are probably also disabling or deleting the telemetry and activation checks (a sketch of the eligibility tweak is below).
Microsoft doesn't want you to use Windows 11; they want your money and data.
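For context, the “modification” in question is often nothing more exotic than a registry flag; Microsoft has itself documented an AllowUpgradesWithUnsupportedTPMOrCPU value for upgrading on unsupported hardware. A minimal sketch of setting it in Python, assuming an elevated prompt on the machine in question (and note this only skips the upgrade eligibility check; it does nothing about telemetry or activation):

```python
# Sketch: set the Microsoft-documented registry flag that lets Windows 11
# setup proceed on a machine with an unsupported TPM or CPU.
# Must be run as Administrator; back up your registry first.
import winreg

key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\Setup\MoSetup",  # key may not exist yet; CreateKeyEx creates it
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(
    key,
    "AllowUpgradesWithUnsupportedTPMOrCPU",
    0,
    winreg.REG_DWORD,
    1,  # 1 = allow the in-place upgrade despite the failed check
)
winreg.CloseKey(key)
```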
Which is why I dropped Windows after 7 and went Linux. The telemetry bullshit was odious in 10, but in 11 the spyware is basically one of the OS's core functions.
It's why they pushed Windows 11 for free. Because it's not the product, you are.
There's more money to be made in monetizing your daily usage habits and selling them (and serving you tons of ads) than there is in making you pay 150-200 bucks for a new OS once.
And that new direction radically alters how they develop the OS, and how you, the user, may interact with it. Which is why Windows is on the path to becoming a walled-garden experience, with strict controls for “security” (i.e., to keep you from doing anything that might impede their harvesting of data).
Greed.
Sure, they want you to run Win11, but chances are you’re already running it, or at least Win10, so there’s not much to gain there.
By setting higher requirements for Win11 than necessary, Microsoft makes a killing on Windows licences.
OEMs have to pay Microsoft for keys, and for MS to make money off of keys, OEMs need to make more PCs. And how does MS force/incentivise them to do that? By making 80% of Win10 PCs incompatible with Win11.
Oh, and also, now they get to push their Copilot key as well.
Microsoft has a vested interest in PC sales not stagnating any more than they do, and sometimes it takes an artificial push to make that a reality.
I just installed Linux Mint on a 15-year-old desktop that has never been upgraded and was middle-of-the-road when I got it. It shipped with Windows 7, and I tried a couple of times to upgrade to 10 (it failed every time, either losing core hardware functionality, running so slowly as to be unusable, or just refusing to boot altogether). But it runs Linux like a dream. Seriously—it’s easily running the latest version of Mint better than it ran an 11-year-old service pack of Windows 7.
What's even crazier is that I installed VirtualBox on it and put Windows 10 on that, to use some work programs (a rough sketch of the setup follows this comment). It runs Windows 10 a bit slowly, but otherwise more or less flawlessly!
That’s right: I’m having a better Windows experience in Linux than I’ve ever had on baremetal Windows on this box.
I can’t believe I didn’t do this…well, 15 years ago.
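For anyone wanting to replicate this, the setup boils down to a handful of VBoxManage calls. A minimal sketch driven from Python, assuming VirtualBox is already installed; the VM name, disk size, memory, and ISO filename here are illustrative, not from the post above:

```python
# Sketch: create and boot a Windows 10 VM in VirtualBox via its VBoxManage CLI.
# Assumes VirtualBox is installed and a Windows 10 installer ISO is on hand.
import subprocess

def vbox(*args):
    """Run one VBoxManage command, raising if it fails."""
    subprocess.run(["VBoxManage", *args], check=True)

vbox("createvm", "--name", "Win10-work", "--ostype", "Windows10_64", "--register")
vbox("modifyvm", "Win10-work", "--memory", "4096", "--cpus", "2", "--vram", "128")
vbox("createmedium", "disk", "--filename", "Win10-work.vdi", "--size", "51200")  # MB
vbox("storagectl", "Win10-work", "--name", "SATA", "--add", "sata")
vbox("storageattach", "Win10-work", "--storagectl", "SATA", "--port", "0",
     "--device", "0", "--type", "hdd", "--medium", "Win10-work.vdi")
vbox("storageattach", "Win10-work", "--storagectl", "SATA", "--port", "1",
     "--device", "0", "--type", "dvddrive", "--medium", "Win10.iso")  # installer
vbox("startvm", "Win10-work")  # boots into Windows setup on first run
```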
For what it’s worth, your experience 15 years ago likely would have been very different. It’s only in the past few years that things like drivers for basic hardware have become widely available on Linux without a bunch of weeping and wailing and gnashing of teeth. And even today, there are still certain drivers that often don’t like to play nice.
Ask anyone who had an Nvidia GPU 15 years ago if they'd suggest switching to Linux. The answer would have been a resounding “fuck no, it won't work with your GPU.”
Eh, “a few years” here is selling Linux a bit short. I switched about 15 years ago, and while driver issues were a thing, it was still a pretty solid experience. I had to fiddle with my sound card and I replaced my wifi card in my laptop, but other than that, everything else worked perfectly. That still occasionally happens today, but as of about 10 years ago, I honestly haven’t heard of many problems (esp. w/ sound, that seems largely solved, at least within a few months of HW release).
I don't know what you're talking about WRT GPUs. Bumblebee (graphics switching) was absolutely a thing back in the day for Nvidia GPUs on laptops; it kinda sucked, but it did work, and today there are better options. On desktops, I ran Nvidia because ATI's drivers were more annoying at the time, and Ubuntu would detect your hardware and offer to install the proprietary drivers for whichever card you had. I ended up getting a laptop w/o a dGPU, mostly because I didn't want to deal with graphics switching, but that doesn't mean it didn't work, it was just a pain. Dedicated systems were pretty simple, though; the GPU that came with my motherboard (ATI?) ran beta Minecraft just fine, along with some other simple games.
In short, if you were on a desktop, pretty much everything would work just fine. If you were on a laptop, most things would work just fine, and the better your hardware, the fewer problems you’d have (i.e. my ThinkPad worked just fine ~10 years ago).
Playing games could be a bit more tricky, but for just using the machine, pretty much any hardware would work out of the box, even 15 years ago. It has only gotten better since then.