Windows 95 was a bold step into a new way of using computers. In hindsight, its reputation for instability overshadows how it single-handedly revolutionised the idea of the user interface, and that revolutionary character was something very much in the minds of people at the time. Now every idiot could use a computer. I was raised on 3.1, Norton Commander and plain DOS, and I remember just how different '95 felt at the time, and how you no longer needed to understand a computer's inner workings to be able to use it.
That’s an interesting take, and not at all how I remember it actually going down at the time, at least in the U.S.
By 1995 Macs had already been around for 11 years, and they were the “idiot-proof” computers that had revolutionized the idea of the personal computer user interface (even though that interface was lifted from Xerox PARC), adding a mouse, driving everything graphically, and hiding the inner workings away. By the early 90s it was pretty commonplace to see a Mac in almost every college student’s dorm room in the U.S., and they were in most school computer labs as well (alongside aging Apple II and PET computers, with a smattering of C64s).
The early Windows UI was always a “me too” graphical layer slapped on top of DOS as an attempt to respond to the Mac UI, offering the choice of a graphical UI environment in color on a more affordable and performant hardware platform. In the early 90s this was up to Windows 3.1, which many computers auto-booted into, even though it was running on top of DOS. Office apps worked reasonably well in 3.1 (though it was notorious for locking up the entire UI if any single application barfed, much like the Mac at the time), but you always bailed out of Windows back into DOS to run serious games, because Windows was a resource hog. In that sense, one of Windows 3.1’s best features was the ability to exit it.
Windows 95 was essentially just a fresh coat of paint on Windows 3.1, but without the ability to truly exit back to DOS (you could sort of get to a DOS command prompt, but 95 never truly went away in the background). Make no mistake, it was still DOS underneath; it was just wallpapered over to look like it wasn’t, and you could no longer get all the way out to maximize game performance. 95 was actually a bit of a nightmare for PC gamers in that respect, and that didn’t get sorted out until years later, when PCs got more powerful, accelerated video cards became the norm, and the Windows driver situation got more stable (mostly after the move to the NT-based Windows 2000 and XP, which finally left the DOS underpinnings behind).
Yes, Windows 95 introduced the abomination of the Start button, which gave us decades of people navigating an ugly cascade of poorly arranged folder-dialog things until people were able to start pinning apps to the taskbar to avoid it. I don’t think this was an innovation worthy of celebration.
The cool kids in the early 90s were actually running IBM’s OS/2, a 32-bit OS with true separation of app environments, so the crash of one app wouldn’t take everything down. It could run DOS, Windows 3.1, and native OS/2 apps in the same graphical environment without a meaningful performance hit, and once you got over the shock of it being delivered on 50 floppy disks (as opposed to Windows’ 10 or so), it was the place to be. OS/2 Warp 4 even had voice control built into the OS that ran well on the hardware of the time and actually worked. One thing the introduction of Windows 95 assured was that OS/2 wouldn’t be able to run new Windows applications going forward, which nipped IBM’s competitive edge in the bud, even though late-90s Windows was a far inferior product.
So, yeah, no rosy memories of Windows 9X here. It was actually one of the factors that drove me away from PC gaming and onto consoles instead.