As computers became steadily more connected during the 90s, new forms of application started to emerge. In the 80s applications ran on one computer, with people accessing them from a terminal. This was fine for the simple text applications we were used to running, but a new approach was already becoming the norm – the graphical user interface. It became known as the GUI, which I personally pronounce something similar to gooey.
The emergence of the GUI, and increasing compute power on the client, led to a realisation that it was perhaps better to do some work on the client device and the rest on the server supporting it. This became known as client-server, with each application having its own client and its own server.
My first experience of a graphical client-server application was, I think, Lotus Notes. We'd done some work with cc:Mail before it was purchased by Lotus, but it wasn't really client-server because there wasn't any cc:Mail running on the server; all of the work was done by the client. All that cc:Mail needed to work was a shared file area that each of the clients wrote to and read from. Lotus Notes was different because it had a server that the client talked to. I don't think we did too much with it prior to version 3, and even then it wasn't the dominant email client in the organisation. We also did some playing around with Microsoft Mail and, eventually, early versions of Microsoft Exchange. The organisation I worked for, and later supported (after being outsourced), had a number of divisions, and each division had its own opinion about email so ran its own systems. These systems talked to each other via the X.400 protocol, and we exchanged directory information using X.500. At some point the division I was supporting chose to consolidate its systems onto a divisional Lotus Notes system, as did some other divisions; other divisions chose Microsoft Exchange, then at version 5.5 and still quite limited (a server could only have one database, and it was practically limited to 100GB).
At one point I did some pretty basic Lotus Notes database development. We had a paper-based ordering system for IT equipment and wanted to get away from all of the writing, so I was tasked with creating a database with the forms in it. We still printed the forms out and sent them off for approval and processing, but at least the creation of the order was done electronically. It was a great idea, and the database lived on long after I'd stopped supporting it (it was given to a more professional development team to look after). Like many organisations, our deployment of Lotus Notes led to an explosion of databases for different tasks: everything from a lending database to numerous discussion databases, from document library stores to customer relationship lists.
Lotus Notes was only one of many applications built on this client-server architecture. We thought that this was the way applications would be written for the foreseeable future, but a new way of displaying information was already being used and a new way of finding information was about to be launched.
I can't be confident that I used Google in the 90s, but certainly by the early 00s 'to Google' had become second nature. Before Google, though, there had already been Yahoo and a number of other ways of searching the growing catalogue of information on the Internet. The Internet was now the place to find information (text and pictures), but little did we perceive that it would become the default way of providing any and every capability. Although you could argue that the browser is itself client-server in architecture, the difference comes from the browser being the universal client for all sorts of applications. It took until the late 90s for Microsoft to realise this, and it wasn't until the successive releases of Internet Explorer 5.0 (1999) and 6.0 (2001) that they built a (not too healthy, as it turned out) dominant position.
In the early 1990s we were still in a world where every device was built to support the person using it, and that person could do whatever they liked on it. If we wanted to upgrade a client-server application we had to plan an upgrade to the server and to all of the clients. There was very little commonality between devices, and every device required local support. We knew how much it cost us to buy IT equipment, and it wasn't cheap, but a new term was starting to gain momentum – total cost of ownership (TCO). TCO was popularised by Gartner, and it meant that it was no longer sufficient to think only about the procurement costs of IT; we needed to start talking about the other costs – operating costs, training costs, support costs, licensing costs, management costs, consumable costs. This change in thinking put personal computing under a bright spotlight as a place where organisations were spending huge amounts of money compared to the procurement costs of the equipment. PC support organisations were shown to be significantly larger per user than the equivalent mainframe support organisation (or so it seemed).
In the mid-90s we started looking at ways of driving down the costs of personal computing: techniques that made use of network technologies to make things easier and to reduce the amount of travel needed for support purposes. We figured that if things could be centrally managed then they ought to be cheaper to operate, and perhaps there were some benefits for the end user in doing this too. The organisation I now worked for had a Novell NetWare infrastructure (the dominant way of doing it at the time), but our client preferred us to look at Windows NT as the way of achieving this. They'd already done some work with the technology, some of it out of curiosity about its key developer, Dave Cutler, who had also been heavily involved in the development of Digital's OpenVMS. People who already supported OpenVMS, like myself, saw a lot of commonality between the two.
The free-spirited pioneering days were coming under significant pressure to demonstrate their value, but that’s another journey for another day.
There were UNIX workstations from Sun, Digital, Silicon Graphics and IBM. I remember being quite impressed by the Sun SPARCstations, particularly the 'lunch-box' sized ones that stacked together with other SCSI-connected peripherals. If I remember correctly, there was one particular character in the office whose stacks of papers on his desk were always at least as high as his stacks of SPARCstation equipment. The IBM AIX workstations were used extensively in the design department for 3D CAD. There were a few Silicon Graphics devices, limited to some high-end graphics requirements that we had. There were also VAXstations and X terminals. These systems were all used for engineering purposes. Calculation has always been a huge part of engineering; those calculations were becoming computations, and the computations were being integrated into applications. The human barrier to calculation was being removed.
On the hardware side of things, being an IBM shop, we preferred the PS/2. We thought that the MCA architecture was superior to the ISA architecture that all the clone manufacturers were pursuing. We stuck with IBM even after the PS/2 had been superseded by the IBM PC Series 300 and 700 (named in BMW model style), by which time PCI was replacing both MCA and ISA. There were a number of clone devices around, primarily those from DEC and latterly Compaq, but there were also a number of Toshiba laptops around. The DEC devices were introduced by the teams supporting the engineering computing environment. IBM's grip on the PC hardware and software market was well and truly slipping. At some point, I don't quite remember when, we left IBM behind for desktop devices and moved over to HP Pavilions; we never went back.
The laptop was a luxury, provided only to those who were important enough to justify it. I remember the embarrassment of one particular manager who had left his laptop on the roof of his car, forgotten about it, and then reversed down a hill, eventually driving over the top of his much-loved Toshiba. It survived quite well, remaining in working order apart from a big crack down the middle of the screen. Most laptops were, however, IBM ThinkPads, even after we had switched over to HP for the desktops. There were a number of people who got massively excited about the Toshiba Libretto. The problem with laptops of the day was weight, and the diminutive Libretto promised a lot more mobility. They never really took off. The same was also true of the HP OmniBook 300 with its odd, inbuilt mouse contraption. I've had a number of ThinkPads down the years and they've always been reliable workhorses.
Another class of device was also starting to be used: the PDA. Some people had tried to use the original Psion organisers, but it was the release of the Psion Series 3 that moved these devices into the mainstream. When Psion got together with Nokia and Ericsson in 1998 to form Symbian, everyone thought that this plucky British company was onto a winner; we were, again, wrong. Another device was already starting to become popular: the PalmPilot. The Apple man in the office played around with the Newton for a period of time too. HP introduced the Jornada PDA running Windows CE in 1998; that never really took off either.