The Internet has brought us many great things (besides my blog, of course 😉). From contacting people around the globe quickly, easily, and cheaply, to buying groceries, to managing finances, almost everything we do is now done online. Because of this, more and more hardware is being built around our online habits and our digital lives. The question this leads to is: is this changing the face of everyday computing?
The answer to this is sadly yes. “Sadly?” you may ask. Yes, let me take some time to explain.
When I was 3 years old, my parents bought the first family computer – a Packard Bell Legend 660. This 486 machine was my first experience of a PC-compatible machine, running MS-DOS 6.0 and Windows 3.1. All software had to be installed from floppy disks, and to use DOS, you needed to learn the commands. A few years later, my Dad bought a Toshiba 200CDS laptop to be able to work on the move. Neither of these machines had the hardware to connect to the Internet out of the box.
Using a computer without the Internet was a normal thing in the 1990s and even the early 2000s. Files were swapped on floppy disks, or ZIP disks if you had the technology (later, CD-Rs and USB memory sticks). Graphically rich software was generally installed from CD-ROMs or DVDs. The graphical interfaces of computers, whether you were using Windows or Macintosh, were designed to work with a standard keyboard-and-mouse combination.
Most people I know from my childhood didn’t experience DOS, or at least didn’t use it directly, so their first experience with PCs was Windows 95. The importance of this? Well, Windows 95 was the first major release of Windows to use an interface that has become familiar to computer users worldwide, and it all revolves around that little button in the bottom left-hand corner of the screen (or the bottom right-hand corner in countries that write from right to left).
The Start Menu is, arguably, the most recognised feature of modern computing. So much so that, one year after the release of Windows 8, Microsoft listened to the public outcry and brought back in Windows 8.1 the Start button it had originally removed. It seems people don’t generally like big changes.
Of course, more recently, the focus has shifted from standalone computers, on which you install and manage your own applications and data, to cloud computing. Here all your data is stored online on someone else’s server, and your apps are generally linked to web applications. Whilst this in itself isn’t totally a bad thing – your work and digital life are easily accessible wherever you are – you are trusting someone else not to mess with these applications or, worse still, to spy on your data.
Also, desktop systems are becoming less like desktop systems. Take Windows 8.x, for example: although Microsoft restored the Start button, it still takes you to the same Metro interface – an interface which works great on touch screens, but is bulky and unnecessary for users of the good old-fashioned keyboard and mouse. Apple have done a similar sort of thing by introducing their Launchpad interface – an obvious attempt to make OS X act the same way iOS does on their mobile devices.
Now, if you know me (or have read enough posts on here), you’ll know what’s coming next: “Surely you’re going to recommend everyone switches to Linux.” Actually, no. Even the Linux world isn’t immune to today’s touch-centric computing, with Ubuntu’s Unity sporting a semi-touch-friendly interface, which Canonical is pushing onto Ubuntu Phone. But the beauty of systems like Linux is that if you don’t like something, such as the interface, you can quite easily install another one. For example, KDE users have the luxury of the K Menu, which undoubtedly bears some resemblance to the Start Menu found in older versions of Windows.
But back to the point: there is another side effect of this move to cloud computing, where apps are installed and used on the web. People will become less computer literate. Let me expand on this.
Sure, people will know how to use their tablet computers, launch an application, and work with it. But fewer people will know, or care, how these systems work. As a self-confessed computer geek, I’ve always claimed I was introduced to the world of computing at the right time (pre-1995). Why? Because everyday computing still required a bit of command line knowledge. Most educational games (as well as other games and applications) I had were written for DOS, and to run one of them, I had to learn some DOS commands. Many years later, the old 486 Packard Bell became my own personal system, and I opted to reinstall MS-DOS (this time, 6.22) and Windows 3.11, expanding my command line knowledge even further.
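To give you an idea of what that looked like, a typical session to launch one of those games went something like this (the directory and program names here are made up for illustration, but `cd` and `dir` are the genuine DOS commands):

```
C:\> cd \games\mathgame
C:\GAMES\MATHGAME> dir /w
C:\GAMES\MATHGAME> mathgame.exe
```

Nothing complicated, but you had to know where your files lived and how to navigate to them – knowledge a tap-an-icon interface never asks of you.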
Another geeky characteristic involves messing around with the working parts of the system to try and understand more about it. Playing around with the Windows registry, writing scripts, and, later, installing Linux, are all things that “young geeks” are less likely to do unless they happen to install old systems in virtual machines.
Because the option of tinkering with your own system is in danger of disappearing as these new systems become locked down, our potential “young geeks” will never get the chance to mess around with a system to see what makes it tick. The hardware is also a contributing factor: 10–20 years ago, it was relatively easy to take your computer apart, take bits out of it, upgrade parts, and figure out how it worked. I was fortunate enough to get a few old PCs from school when they were decommissioned, and one of them powered the OP-EZY website for many years. The only thing you’ll get out of prying open an iPad is a voided warranty.
In my opinion, the more locked down computer systems become, the less likely people – and especially kids – are to be interested in poking around and learning how those systems work, and as a professional system administrator, that scares me. If no-one is interested in computing, we might find ourselves entering a dark age where our technology won’t advance any further, and we end up with a lot of systems no-one really knows how to maintain.
A bleak future? Something to think about.