Let's try and get 1,000,000 replies to this post

Windows '95 was a bold step into a new way of using computers. In hindsight, its reputation for instability overshadows how it single-handedly revolutionised the idea of the user interface, and this revolutionary character was something very much in the minds of people at the time. Now every idiot could use a computer. I was raised on 3.1, Norton Commander and simple DOS, and I remember just how different '95 was at the time, and how you no longer needed to understand a computer's inner workings to be able to use it.
 
Windows '95 was a bold step into a new way of using computers. In hindsight, its reputation for instability overshadows how it single-handedly revolutionised the idea of the user interface, and this revolutionary character was something very much in the minds of people at the time. Now every idiot could use a computer. I was raised on 3.1, Norton Commander and simple DOS, and I remember just how different '95 was at the time, and how you no longer needed to understand a computer's inner workings to be able to use it.
That’s an interesting take, and not at all how I remember it actually going down at the time, at least in the U.S.

By 1995 Macs had already been around for 11 years, and they were the “idiot-proof” computers that had revolutionized the idea of the personal computer user interface (even though the interface was lifted from Xerox), adding a mouse and driving everything graphically, hiding the inner workings away. By the early 90s it was commonplace to see a Mac in almost every college student’s dorm room in the U.S., and they were in most school computer labs as well (alongside aging Apple II and PET computers, with a smattering of C64s).

The early Windows UI was always a “me too” graphical layer slapped on top of DOS as an attempt to respond to the Mac UI, offering the choice of a graphical UI environment in color on a more affordable and performant hardware platform. In the early 90s this was up to Windows 3.1, which many computers auto-booted into, even though it was running on top of DOS. Office apps worked reasonably well in 3.1 (though it was notorious for locking up the entire UI if any single application barfed, much like the Mac at the time), but you always bailed out of Windows back into DOS to run serious games, because Windows was a resource hog. In that sense, one of Windows 3.1’s best features was the ability to exit it.
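
For the curious, here's why a single app could hang everything: multitasking in Windows 3.1 was cooperative, so the system only regained control when a program pumped its message loop. Below is a rough sketch of that loop, written in modern Win32 syntax so it actually compiles (the Win16 original had the same shape); the point is that GetMessage() was the yield point, and an app that stopped calling it froze the whole UI.

```c
/* Sketch of cooperative multitasking, Windows 3.1 style. On 3.1, the
 * GetMessage() call in the loop below was where the OS switched to other
 * programs; if any app spun without returning to its loop, nothing else
 * on the machine got to run. Modern Win32 is preemptive, so this compiles
 * and runs today, but no longer carries that burden. */
#include <windows.h>

static LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l)
{
    /* Imagine a long busy-loop in a handler here: on 3.1 the entire
     * desktop would hang, because control only returned to the system
     * through the message loop in WinMain. */
    if (m == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProc(h, m, w, l);
}

int WINAPI WinMain(HINSTANCE inst, HINSTANCE prev, LPSTR cmd, int show)
{
    WNDCLASS wc = {0};
    MSG msg;

    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = inst;
    wc.lpszClassName = TEXT("coopdemo");
    RegisterClass(&wc);
    CreateWindow(TEXT("coopdemo"), TEXT("Cooperative multitasking demo"),
                 WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                 CW_USEDEFAULT, CW_USEDEFAULT, 320, 200,
                 NULL, NULL, inst, NULL);

    /* The cooperative heart: every well-behaved app had to keep pumping
     * this loop so the rest of the system could breathe. */
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    return (int)msg.wParam;
}
```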

Windows 95 was essentially just a fresh coat of paint on Windows 3.1, but without the ability to truly exit back to DOS (you could sort of get to a DOS command prompt, but 95 never truly went away in the background). Make no mistake, it was still DOS underneath, it was just wallpapered over to sort of look like it wasn’t, and you weren’t able to get all the way out to maximize game performance. 95 was actually a bit of a nightmare for PC gamers in that respect, and that didn’t get sorted out until years later when PCs got more powerful, accelerated video cards became the norm, and the Windows driver situation got more stable (mostly after the move to the NT-based Windows 2000, which finally left its DOS underpinnings behind).

Yes, Windows 95 introduced the abomination of the Start button, which gave us decades of people navigating an ugly cascade of poorly arranged folder-dialog things until people were able to start pinning apps to the task bar to avoid it. I don’t think this was an innovation worthy of celebration.

The cool kids in the early 90s were actually running IBM’s OS/2, which was a 32-bit OS that had true separation of app environments, so the crash of one app wouldn’t take everything down. It could run DOS, Windows 3.1, and native OS/2 apps in the same graphical environment without a meaningful performance hit, and once you got over the shock of it being delivered on 50 floppy disks (as opposed to Windows’ 10 or so disks), it was the place to be. OS/2 4.0 even had voice control built into the OS that ran well on the hardware of the time and actually worked. One thing the introduction of Windows 95 assured was that OS/2 wouldn’t be able to run new Windows applications going forward, which killed IBM’s competitive edge, even though late-90s Windows was a far inferior product.

So, yeah, no rosy memories of Windows 9X here. It was actually one of the factors that drove me away from PC gaming and onto consoles instead.
 
Windows '95 was a bold step into a new way of using computers. In hindsight, its reputation for instability overshadows how it single-handedly revolutionised the idea of the user interface, and this revolutionary character was something very much in the minds of people at the time. Now every idiot could use a computer. I was raised on 3.1, Norton Commander and simple DOS, and I remember just how different '95 was at the time, and how you no longer needed to understand a computer's inner workings to be able to use it.

Really?

I didn't see it that way.

Thinking about it right now, it's probably down to the concept of a desktop, which is user-centric. Win 3.11 is application-centric, like a glorified program launcher. You first need to fire up the application and then open your file, or at least fire up the file manager, remember where you put your files in the directory hierarchy, go there and open them. In both cases there are steps between the user and his work that depend on some basic knowledge of the OS's shell, while in 95 you have a desktop, which is the biggest and most accessible thing on the entire screen. It also reflects the IRL desk paradigm somewhat.

For me, DOS compatibility, especially for those fat SVGA SoundBlaster 3D games, was the main thing, so I hung on to DOS/3.11 for quite a while. But I learned to appreciate 95 in the following years. I still know my CD key. I had a nostalgia trip last year, ordered some old hardware and made an installation. Played with it for a month or two, then shoved it in storage. It's funny how primitive everything is :D

Btw, I don't believe the desktop is a paradigm invented by Microsoft; I'm fairly sure other computers had it before: UNIX workstations, Amigas, etc.
 
I agree with Jer. Apple revolutionized the personal computer with the first affordable GUI and mouse, starting with the Macintosh's release in 1984. Windows 95 came a decade later and, thanks to Microsoft's business model, took the masses by storm.

By the way, is anyone familiar with the M1 Apple chip? It was released last year, blew the industry away, and I thought it was almost as revolutionary as the first iPhone. I'm writing this on a (fanless) M1 MacBook Air, which may easily be an all-time top-5 value-for-money laptop.
 
What Jer mentions up there about early Windows' best option - exiting to DOS - is entirely true.
The amount of 16-bit software from the late 80s to the mid-90s was staggering. The transition to the new model of programming and OS was very rough. Keeping up with the 'times' was completely unimportant compared to keeping your software running.

The notion of having some fat UI just to organize your files and applications was largely nonexistent on the early-90s PC, when computers frequently shipped with two 5.25" floppy drives and no HDD. You sorted your work around the computer: diskette holders, labels, etc. You put in the OS disk to boot, put the application disk in the second drive, ran the application, then removed the OS disk from the first drive and put in your files/work disk.

There was a steep climb in the following years, but for most people it set the standard of keeping track of your computer's content somewhat outside the OS's facilities. I never found the 95 desktop that usable, because I always kept my own directory hierarchy.

The PC wasn't in a good position until Windows shifted to NT and usable Unixes appeared in the form of BSD and Linux.

DOS is a toy OS, and so are all the Windows versions up to and including the Millennium Edition. It took them years to unify and stabilize the driver model; it shifted three times, while all your DOS stuff just kept working.

Then again, the best perk of DOS is that it was basically a non-existent operating system. It exposed a handful of services through BIOS and DOS interrupt calls and yielded all other resource management to the user application, which could then utilize the entire hardware. In its later versions, which supported extended memory and CPU protected mode through popular extenders (DOS4GW :*), it's actually a fairly good 'embedded OS', while pre-NT Windows isn't.
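
To show just how thin that layer is, here's a minimal sketch, assuming a 16-bit DOS compiler such as Turbo C or Open Watcom (int86x(), segread() and the FP_SEG/FP_OFF macros come from their dos.h). The program asks DOS for exactly one service via a software interrupt and is otherwise free to own the whole machine:

```c
/* DOS as a "basically non-existent" OS: the only OS involvement here is
 * INT 21h, function 09h (print a '$'-terminated string). Everything else,
 * such as banging on VGA or SoundBlaster registers directly, was up to
 * the application. Assumes a 16-bit DOS compiler (Turbo C / Open Watcom). */
#include <dos.h>

static char msg[] = "Hello from (nearly) bare metal\r\n$"; /* 09h wants a '$' terminator */

int main(void)
{
    union REGS r;
    struct SREGS s;

    segread(&s);               /* fill segment registers with sane values */
    r.h.ah = 0x09;             /* DOS service 09h: write string to console */
    s.ds   = FP_SEG(msg);      /* DS:DX must point at the string */
    r.x.dx = FP_OFF(msg);
    int86x(0x21, &r, &r, &s);  /* software interrupt into DOS */
    return 0;                  /* beyond calls like this, DOS stays out of the way */
}
```

That one interrupt is the whole 'OS' footprint; a game could then take over the entire machine, which is exactly why exiting Windows back to DOS mattered so much for performance.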
 
The NT kernel was a huge change, and while from the in-depth computer-nerd perspective, yes, 95/98 was in many ways a skin on DOS, from a UI user's perspective it made computing much easier. The Start Menu alone changed the way we common folk do business.
 
I got Windows 11, but I'm not a prolific enough user to tell you if it's worth getting or not. My biggest gripe is that the Start button (or whatever it's called now) is in the middle instead of in the left corner, and I keep forgetting.
 