The last decade has seen huge changes. Ric Parkin looks at technology and its effects.
Welcome to the first Overload of the new decade! (Note: as defining a decade is purely a matter of semantics, I've decided that the common convention - that years starting with the same three digits, e.g. 200x and 201x, belong to the same decade - is a reasonable way of partitioning years, so no pedantic letters please!) So on this completely arbitrary cusp, I thought I'd have a look back at what has changed technologically over the last decade, and peer into a foggy crystal ball to hazard a guess at what the next may bring.
This is going to be full of facts and figures - so many that I can't realistically give references without going way over the top. Most were found by searching for 'history of X', if you're interested.
Disco 2000
So let's start with personal computers. In many ways they were quite similar to now, but as you would expect most of the parts were much less powerful: the latest chips were things like the Intel Pentium III, with around 9.5 million transistors and 512KB of onboard cache, running at around 750MHz (although this was rising fast, from 500MHz shortly before to 1GHz quite soon after). Windows 2000 was just about to be released, so most people were on Windows 98 or NT4, while Apple had been making a comeback with the iMac for a couple of years, and had just released OSX (but only for servers - the desktop version was still over a year away). Linux (and other free software such as StarOffice, which would shortly become OpenOffice) was increasingly being seen as a challenge to Microsoft's dominance of PC operating systems and office software.
Most PCs came in big towers, the occasional 'pizza box', or still-bulky laptops, although more slimline desktops were making inroads. Monitors were pretty much exclusively big bulky CRTs with limited resolutions - the first 36" 1920 × 1200 monitors only appeared around 2000. Hard disk sizes were in the region of a few GB. USB 2 was new but taking off, and 3.5-inch floppy disk drives were still around but being phased out, replaced by CDs, USB sticks and file transfer over networks.
Handheld computing was still in its infancy in many ways. Small and tablet computers had been around for ages, notably the Apple Newton in the mid 90s, and Psion and Palm had popular PDAs, but they had never really sold beyond some business use and tech hobbyists. Mobile phones, while they had become a mainstream device during the 90s, were still mainly simple phones, although a few early smartphones were around, such as the Nokia Communicators. Some could now access the internet via the newly introduced Wireless Application Protocol, but the slowness over the old phone networks, the reduced experience compared to a normal web browser, and the limitation of having to use special cut-down sites meant it was a patchy success at best.
By 2000 the internet had moved beyond the preserve of the more technically minded and was becoming mainstream. Access was still mainly via dial-up connections, but ISDN and cable modems were becoming more popular. Microsoft was close to winning the so-called Browser Wars with Internet Explorer 5. The dot-com bubble was at its peak as people scrabbled for a foothold in this rapidly growing new medium, but it would burst only a few months later. Few people had much idea of what would actually work on the internet. Much of the investment capital was thrown at all sorts of ideas, in the hope that some of the companies would survive and go on to dominate. Inevitably, many of the companies folded quickly after the money had run out - for example the notorious Boo.com, which spent $188 million in six months. In contrast Amazon, then around 5 years old, was criticised for only having slow and steady growth instead of spending as much as possible to get market share. Google was a fairly new search facility - only 40 employees - but becoming popular as its new page ranking technology allowed people to find the most relevant information. The internet had also started to make an impact in mainstream life: for example one of the early blogs, the Drudge Report, had broken the Lewinsky scandal a couple of years earlier. Many news sources were setting up a web presence, in the UK notably The Guardian and the BBC. Wikipedia was still a year off, although the underlying wiki technology had been around for a while, having been invented to aid collaborative pattern writing.
Due to the internet, computer security was becoming much more important: the improved interconnectedness made it much easier for viruses and trojans to spread, and there were worries that the dominance of Microsoft operating systems and applications as script hosts created a mono-culture in which a major outbreak could spread quickly and widely. Governments too were concerned with security, but mainly so that they could intercept and read people's telephone calls and emails. Strong encryption algorithms were even classified as munitions and covered by arms trading legislation.
Right here, right now
That was then, so where are we now? Moore's law has continued to hold, and the transistors on a modern Intel chip now number around 750 million! Significantly though, the clock speed hasn't maintained the rapid rise of a decade ago - after peaking at a bit over 3GHz, speeds have dropped back to around 2.9GHz as heat dissipation became a major problem. So instead of relying on clock speed increases to improve performance, chip manufacturers have had to use increasingly complicated techniques, such as instruction lookahead and speculative execution, hugely expanded on-chip memory caches - we're now talking multiple MB - to avoid having to wait for main memory, and multiple cores to allow true multitasking. This is where all those extra transistors have gone - instead of implementing a much bigger instruction set, most are actually memory, copies of the main processing cores, and the tricky algorithms that improve instruction and memory throughput. In contrast the instruction sets haven't expanded as much, although they have gained some extra facilities for multimedia and for processing large data sets.
As well as the sort of complex chips found in PCs, much smaller, simpler and more focused chips are now much more common, whether it's an ARM RISC chip in a mobile phone or a custom ASIC implementing Bluetooth or GPS. And as well as simple chips, it's easier to combine off-the-shelf modules on a single piece of silicon to create a so-called System on a Chip. These have allowed a wide range of powerful, yet cheap and small, consumer devices to be released.
The amount of main memory has also expanded, so many PCs now come with 4GB, which is the limit of addressable space for 32-bit pointers. Disk sizes have shot up faster than Moore's Law, helped by exotic techniques such as spintronics, with 1TB disks easily available and even laptops coming with 256GB disks. A major problem with disks and storage is now the transfer rate.
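As an aside, that 4GB figure follows directly from the pointer width: a 32-bit pointer can distinguish 2^32 byte addresses, which is exactly 4GiB. A minimal C++ sketch of the arithmetic (purely illustrative, not from any particular codebase):

#include <cstdint>
#include <iostream>

int main()
{
    // A 32-bit pointer can address 2^32 distinct bytes.
    const std::uint64_t addressable_bytes = std::uint64_t(1) << 32;

    // 1 GiB = 2^30 bytes, so this prints 4.
    std::cout << addressable_bytes / (1024 * 1024 * 1024) << " GiB\n";
}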
The actual form factors have changed a lot though. As well as the traditional tower and big laptop, there are now tiny form factors such as the Shuttle boxes, Mac Mini and Acer Revo, and all-in-one computer-and-monitor designs have become more widespread. Laptops have become smaller, lighter and more powerful, with a new niche of cheap 'net-books'. Having multiple monitors has become much more common, initially via graphics cards with multiple outputs and then via software solutions such as DisplayLink. USB 2 itself has become ubiquitous, and is used to connect a wide range of consumer electronics as well as PC peripherals. Version 3, theoretically ten times faster, has been defined and new products are starting to be rolled out.
In terms of operating systems, Windows 7 is now out to replace Vista, although XP remains popular on lower-powered machines, OSX has gone through several iterations, and there are others such as Ubuntu's version of Linux, the iPhone operating system, and Google's Android and upcoming ChromeOS.
In terms of raw numbers, the iPhone is not that big a player in the massively expanded mobile phone market, but it is very important in terms of influence. Smartphones had been around for ages, but the iPhone made the leap to making them usable and desirable, with its large touch-sensitive screen, smooth graphics and UI, and excellent design. With 3G and WiFi access from it and other phones, mobile internet access is now easy, and with other technologies such as built-in cameras, GPS and compasses, portable 'information appliances' are now a reality. By the time you read this, Apple will have launched its tablet computer - could this change the new eReader segment and portable computer market in a similar way?
Google has been one of the biggest successes of the decade. The range of services it now provides is stunning, although there are increasing worries about data security in the cloud, and about how much personal data it keeps, which raises important privacy concerns.
All of which shows how important the internet now is in people's everyday lives. With the advent of widespread ADSL and cable broadband, plus upgrades to mobile phone networks, wired and wireless networks in the home, and wireless hotspots in public places, virtually all PCs and mobile devices are now connected to a vastly expanded range of information and services. Whether it's a developer looking up some up-to-date documentation, doing your tax return online, getting medical advice via NHS Direct and booking a doctor's appointment, uploading a video of an anti-government protest, tweeting during the Haiti earthquake, someone checking IMDB to cheat in a pub quiz, or looking up the nearest restaurant and checking its reviews, the internet is now an essential service. That does have its dangers - cyber attacks and information theft are much more common and more severe than ever before, although these are mitigated to some extent by advances in the defences in operating systems, firewalls and anti-malware. It is a continuing arms race, though.
If I had to name the one disruptive change over the last decade, it must surely be the roll out of fast, pervasive networks, leading to permanent connection to a vast information source, and other people. Virtually all the other big changes are built upon this.
A look into the future
Predicting disruptive technology and future directions is very hard. Sometimes it's because the technology hasn't been thought of yet. Sometimes it exists, but needs other things to happen for it to become important. The last big one was probably the internet, which took 40-odd years. Something in the medical or biotech arenas might be next - the price of genome sequencing is dropping fast, opening up lots of possible developments. Integrating GPS and cameras into powerful phones is opening up some interesting syntheses - think of William Gibson's Virtual Light. RFID has been promising much and gaining footholds in niches, but would it change everyone's life significantly? Parallel and distributed computing will be important, whether it's multi-core, grid or cloud computing. But what will be the killer application? If I knew that, I'd be rich soon!
But you want a prediction. I think the major technological changes of the next decade or so will be driven by nanotech. Improvements in things such as battery efficiency, LCD screens, chip development, disk densities and photovoltaic efficiency have already been happening due to its application, but I expect this to accelerate and affect many things, resulting in the next step change in small, efficient, ubiquitous computing.