The Joy of Digital Technology: A Personal History
Richard Blake
(12th February 2016)

I bought my first computer in February 1984. I had used the mainframe system at my university as a word processor, but found it baffling, and there was a print queue of at least a day. For most of my time as an undergraduate, and for eighteen months after, I found it most convenient to write with a fountain pen on lined paper. For business letters and presentation drafts, I had a manual portable typewriter. In 1983, I “acquired” a manual desktop machine from an abandoned office building.

However, relentless advertising by computer companies, and a growing volume of my work as a writer, suggested that I should move into the last fifth of the twentieth century. A further reason was that I had taught myself touch typing, and was soon able to type faster than I could write – a sure recipe for typing mistakes, and a challenge to my habit of revising as I wrote. Lined paper covered with amendments and scorings out and insertions seemed wholly appropriate. Scribbling on a sheet of apparently perfect typescript, and then retyping, seemed less appropriate. And so I went off to the nearest computer shop, and put myself into the hands of a teenage boy. For £600, he sold me an Atari XE64, and a 5.25” disk drive, and a drum printer.

The two-inch strip of paper that came with the boxes told me to connect everything, install the software, and then to print the instruction manuals. This was before the Internet, and before there was anyone in my circle of friends who knew more about these things than I did. The all-night sitting that resulted was, in retrospect, a useful experience. But I was eventually delighted with what I had bought. 64KB of RAM, or whatever was left after loading the AtariWriter software, was enough to write nearly a thousand words before starting a new file. The formatting language was a set of codes – rather like HTML tags, though less intuitive – that had to be inserted before and after words and blocks of text. By modern standards, the printer was a strange thing. The characters were on a flexible rubber drum, about the diameter of a broomstick, that rubbed itself against an inked pad before spinning round so an internal hammer could press the relevant character onto paper. It printed about as fast as I could type at full stretch, and produced the same wavy line of text as a manual typewriter with a worn-out ribbon. There was no monitor – I had to buy a second-hand television set on the same day as everything else, to see jagged white text on a greyish background.

As said, though, I was delighted with what I had bought. Two years later, I upgraded to a 128KB model, and was able to create files of nearly two thousand words. I also bought a Juki dot matrix printer for heavy-duty work, and an Epson daisy wheel printer for work that needed to look good. It was a minor nuisance that I had to spend nearly £100 on a parallel interface to get these working with the computer. A slightly greater nuisance was that I was unable to share files, or even disks, with anyone else. But I wrote three books on that system, plus a dissertation, and reams of other stuff.

In 1990, I upgraded to an Atari ST, with an internal 3.5” drive and a whole megabyte of RAM. This was, in itself, a marvellous thing. Sadly, I could find no way of transferring any of the work I had produced on the old system. Even worse, I soon realised that I had backed the wrong horse. By now, Intel and Microsoft had taken over the market, and the best I could say about my Atari ST was that it allowed me to save plaintext files on a floppy that others could read with MSDOS. I wrote more on this in two years than I had on the earlier system in six. In 1992, I backed out of the dead end of Atari by spending £1,600 on a 286 notebook with 1Mb RAM and a 20Mb hard disk. I ran it with MSDOS 5, and fell hopelessly in love with WordPerfect 5.1. Indeed, I used WordPerfect until 2003, when I jumped over to MSWord, which I have used, in its various incarnations, ever since.

Nowadays, I have two computers. The big one in my office I built myself – I have been building computers since 1998. I last upgraded it in 2009, and it has an Asus motherboard, and a Socket 775 processor, and 8GB of RAM, and 5TB of disk space. It has a 28” monitor, and is attached to a Brother HP100 laser printer. The computer I mostly use for writing is a Samsung notebook that I bought new in 2012. This has 6GB of RAM and 1TB of disk space. It is part of a home network that allows me to use the laser printer. I run both systems with Windows 10, and do all my writing with MSWord 2010.

Here, the story appears to end. Before the free Windows 10 offer runs out, I shall probably need to replace the motherboard on my big computer. It has a problem with the BIOS settings that two new batteries have failed to solve. I suspect something vital and irreparable is about to fail. This means I shall also need to replace the now obsolete processor. Sooner or later, my notebook will suffer a fault I am unable to repair. The printer will not last forever. But, hardware failure aside, I feel no present reason to upgrade. Indeed, the specifications of both my computers are greatly in excess of my needs. I need to process the occasional video. I browse the Web. I send and receive a large volume of e-mail. My first and essential need, however, is word processing. Do I really need 8Gb RAM for that?

This brings me to what may be an entirely personal reflection, and may even be a sign of advancing years. Until about 2010, every upgrade, whether of hardware or software, was a joy. The jump from one Atari to another was exciting. I loved WordPerfect, but loved Office 2003 still more. Windows XP was a step into the future. There was no going back from Windows 7. Nothing since then has given me more than a languid pleasure. I upgraded to Office 2010 because I needed to format a three column book in English, Greek and Latin, and the 2003 version failed to give me the control I needed over the placing of tables. For all other purposes, I am not sure if it is an improvement on Office XP. I upgraded to Windows 10 because it was free, and because Windows 7 had gone unstable on my notebook, and upgrading saved me the trouble of reformatting and reinstalling – and, having upgraded one, I upgraded all the other computers in my house for the sake of neatness. It is an improvement on what I had, but was hardly needed for what it offered. I doubt I would have paid sixpence to buy a copy. My printer is twenty years old, and 600dpi text looks as good today as it ever did.

Oh, there are some new peripherals that I like. My Blue Yeti USB microphone is lovely. So too my Logitech HD webcam. But they are as perfect as I shall ever need.

What I am saying is that, after three decades of hectic improvement, digital technology, for all but specialist and perhaps gaming uses, appears to have matured. Replacement of worn out hardware aside – and any consequential replacement of incompatible peripherals and software – I feel no incentive to upgrade. I am happy with what I have.

Indeed, I suspect that this is not entirely a personal reflection. It seems that continued growth in sales of retail computing technology is based at present on a combination of replacement of what is worn out, and rising incomes outside the West. Sooner or later, this second curve will merge with that for the West. Or, moving away from the retail market, there is the embedding of computers in things like refrigerators and washing machines. But most people only replace these once a decade. If I am right, most hardware and software makers will soon be in trouble.

I am enormously grateful to live at the dawn of the age of digital technology. If I have written somewhere between twenty and thirty books, and thousands of other pieces, and if I am able to make a living from what I write, this is almost wholly because of my jump from pen and typewriter to computer keyboard. But I fail to see what more I could want beyond what I already have. For me, and for many others, enough is enough.

26 thoughts on “The Joy of Digital Technology: A Personal History”

  1. Couldn’t agree more! My first box was a TRS-80 Model III, in 1980. I then succumbed to the siren song of the Apple IIGS in 1986. After that, I built my own. My current mega-box, “Brynhldr”, is my last build, though I have upgraded all of her components, and will continue to do so as whim directs and budget allows. And, barring forceful arm-twisting, the dual operating systems of Windows 7 and Ubuntu will serve me well.

  2. But haven’t people always thought they were living at a time of “matured” technology, where there was no more room to improve? Speech to text software still needs a lot of improvement, for one thing. It would surely be better if you could walk up and down the room dictating your work rather than straining your fingers and eyes typing. I also think there’s a lot of room for improvement with e-readers.

  3. On the technology front, one major development we will probably see over the next two or three years in the retail market is the rapid decline of the hard drive. Possibly also we will see the first signs of the disappearance of device-specific software and a shift towards ubiquitous computing within the average household, in which computing becomes something you can do on a range of integrated devices. Cloud computing [‘cloud’ here just being another word for the Internet and World Wide Web] has reached the stage where the hard drive is no longer needed. All that’s required is a monitor of some kind, a router and a keyboard, and maybe a residual processor. At the moment, cloud services are based on the notion of synchronisation between the remote server and the user’s own memory capacity (be it hard drive or local server), but that is not a necessary arrangement, as the entire process could be run from a managed server.

    To what extent cloud computing will change working practices, I am unsure, because I imagine a lot of writers, academics and home-based professionals will prefer to retain a hard drive and continue with synchronisation to hybrid cloud services, and professional firms and businesses will want to do the same, and also use a server if they can afford it. The potential is probably more in consumer entertainment and gaming. The more sophisticated market probably offers opportunities for companies like Google, who have the capability and resources to offer synergies to users and have the credibility to tap that end of the market. So you could see Google or Facebook or Microsoft offering cloud services.

    I do a lot of creative writing, and believe it or not, most of it I still type on an old-fashioned Brother Charger 11 that I’ve had since I was small. I’m not sure why, but I think it’s just that there is something romantic about typewriters. That, or an old-fashioned notebook (the paper type) and pen, used while sat on a beach rock or outside a café in the open air, seem to be apt for finding inspiration.

    For whatever reason, I seem to struggle with fiction writing, poetry and so on when using a computer. It’s an aesthetic issue. There is just something cold and uninspiring about the medium. It’s also a discipline thing. I think the problem with computers is that if you try to write something and it doesn’t quite work, there’s a temptation to go back and delete it, whereas with a typewriter, the ‘bad’ writing is preserved. I have found this important because often when re-typing the whole thing, it’s better that I work from what I previously thought of and improve on it, whereas on a computer, those original thoughts would be lost forever.

    I also find aspects of the typewriter functionally superior to the PC. For one thing, there is no chance of file errors, malware, hacking, and whatnot, whereas with a PC, there is always the possibility of some mishap that will result in lost work. For instance, the operating system might become corrupted, with the result that any Word files open at the time are then ‘blanked’. Fortunately, that has only happened to me once, and the work lost wasn’t very important, but it gave me a fright.

    • People who get used to a manual typewriter often have to be dragged away from it, such is their attachment to it. There are still writers out there using 1970s machines and ordering their ribbons from obscure specialist outlets. One advantage they had a decade or two ago is that if you passed enough rubbish skips on the street you were sure to find a machine in perfect working order – they were being thrown away every day in the rush to digitalised office management.

      If you’re worried about corruption of files, it might be worthwhile buying a cheap laptop for writing only, one that you never connect to an external network of any kind.

    • I personally think that trusting your data and, indeed, ability to compute at all to a few remote servers in “the cloud” is extremely dubious. I want a computer that I own and which works whether or not it’s connected to some licensing authority, thanks.

      • I agree, Ian. I’m just saying that cloud computing in various forms is probably going to be a big part of the future. I wasn’t making a sales pitch. That said, I already use cloud services quite a lot and I think it has its benefits, and the issue you identify could be addressed by making sure that there are thousands of providers rather than an oligopoly of just a few corporate giants. It’s possible also that groups could run their own cloud services locally or on a corporate basis. We shouldn’t forget that there are also problems with local storage, be it a server or just reliance on a hard drive. I have relied on both in the past for business, and there is always the fear that one day you will arrive at the office and find you can’t switch the thing on. Even printing things out has its risk in that files can be lost, misplaced, and stolen. There’s no perfect solution.

        • I’ve got complete continuity of data back to 1999 thanks to a mixture of RAID and backup. I don’t see any problem with using remote storage as well as part of that, though I don’t currently (broadband is not practically fast enough yet anyway for this much data).

          What scares the pants off me is the prospect of computers that are nothing but terminals and cannot do anything without a server connection. This will in practice put everything you do under an Acceptable Use Policy, which may come to include various forms of censorship. The kind of large corporations running cloud services are generally quite happy to participate in such censorship so long as it is for “progressive” purposes, even something as simple and silly as imposing the American terror of women’s nipples on Facebook users, etc.

          The irony is that these are the same companies that grew up in Silicon Valley, founded by youngsters eager to get computers of their own, on their own desktops, away from the need for access to a mainframe. Now that they have turned into older, very wealthy people, they are imposing the same dependence themselves.

          And what happens if there is a sudden economic crash and the “cloud” you rely on goes bankrupt and closes down? I find this thought particularly interesting, as I have a minor interest in old computer systems and enjoy, for instance, watching vintage computer videos on YouTube. We may soon be in an era, though, where machines become entirely redundant, unable to boot because the remote support they need from the “cloud” is no longer there. Imagine a machine with a processor too old to run Windows 18, while the Windows 12 it could run no longer exists, having been just a terminal service to Microsoft that has since been discontinued…

          • I read an article in The Economist back in 1995, which promised that increasingly few people wanted hard disks any more, but were moving to some kind of early cloud system. It seems that most people back then agreed with me, that the Economist was talking through its hat, and preferred the idea of looking after their own stuff. I predict that cloud computing will simply be part of a mix, and that no one in his right mind will trust his private data off-site.

  4. I bought into the personal computer revolution around the same time you did, but I bought what was described by the techie who sold it to me as “an IBM XT equivalent in a genuine IBM case.” He had a 10 meg hard drive in it, but took it out and wouldn’t sell it because it was unreliable. So I got it with two 5.25″ floppy drives and 640K of RAM, which was pretty good when most XTs had 512K. Dual floppies were perfect for me, as I was doing number crunching (analyzing election returns) using Lotus 1-2-3; I could keep the Lotus program disk in one drive and read and write data with the other. I used WordStar, with a third-party extension called StarFixer, for word processing. I bought a cheap tractor-feed dot matrix printer. For nice looking letters, I had a portable Brother memory typewriter with a two-line display.

    Before this I had spent one day working with an Apple and hated it. Back at university in the 70s, we (econ and business students) had teletype terminals connected by acoustic modems to an HP-2000 Time-Shared BASIC mainframe across campus in the natural sciences building. Program and data storage was on punched paper tape, which was easily damaged. My empty pipe tobacco tins were sought after for protecting those paper tapes.

    I was on CompuServe when it was text only and you paid for access in six-minute increments. For a little more you could search something like 60 newspapers in North America. If the search got no hits, it was free. If you got a hit and the title seemed interesting you could view the article for an additional fee.

    There was a similar deal for the abstracts of papers and reports of the National Institute of Standards and Technology and a few other agencies. Full text could be had on paper or microfiche delivered by mail and charged to your credit card or you could maintain a NIST account to draw from. When in the wind energy business, I used this clunky system to acquire wind data from the National Weather Service and the US Air Force.

    Unless you were rich, you were constantly logging on, running a search, then reviewing the results offline. You downloaded your email to read offline, wrote replies, then logged on again to send them. To exchange files with another computer, you established a voice phone call and then both parties plugged in their computers to transfer the files. I still have the first modem I bought – 75 baud, I think.

    The IBM standard upgrade path was fairly straightforward: a Leading Edge 286 and a 13″ CGA monitor. When I moved up to a 386, it was a custom-built machine, and that techie also sold me what was then a monster 20″ CRT monitor. Since then I have had another custom-built machine and three HP machines and, along the way, four or five laptops, and more inkjet and laser printers than I can remember, including an oversize photo printer. About the middle or late 1990s, I gave up trying to do my own hardware upgrades – I found I was spending more time trying to understand the latest wrinkles in computers than using one. Aside from hard drive crash recoveries and a couple of laptops and printers, I haven’t spent any money on hardware for about 15 years.

    BTW, the computer control of appliances goes back a long way. In the progression from the 8086 and 8088 to the 80286, 80386, etc., did you ever wonder why there was no 80186 computer? The much more capable 80286 came along very quickly, and the 80186 went into appliance controls.

  5. Ah, the sweet nostalgia!

    I well remember, at university, the replacement of the old Titan machine by the brand new IBM 370. With 2 megabytes of RAM! (For an entire university). I recall my pleasure at discovering how much easier it was to edit my programs on punched cards than on paper tape. No more splices! Fortunately, I never suffered the disaster one of my friends did – an entire boxful of cards deciding to imitate a downward pointing fountain. Sorting them back into order was a nightmare.

    In 1975, in my first job as a professional programmer, I used a machine where, to start it up, I had to enter a particular pattern on the handswitches, then run two paper tapes through the reader. Only then could I load the paper tape for the program I actually wanted to run, such as the compiler. We used to have girls called “programmers’ assistants” to type in our programs and to splice our tapes. One does wonder what happened to people like these when technology advanced to the point where programmers doing their own editing became cost-effective.

    Since I bought my first PC in 1991, I have always adhered to the same policy I use with other goods, including cars. I buy the very best I can possibly afford, then run it into the ground. In 25 years, I’ve only owned four PCs, and one of those was a laptop. Which perhaps partly explains my annoyance when software seems to change just for the sake of change. As to hardware, I would never even contemplate building my own computer, being the second best person I know at breaking hardware.

    I’m not sure whether Sean’s pessimism about the future of the industry is warranted. What I do know is that there’s still plenty of mileage in the area I currently work in – systems which help businesses to run their businesses, and which evolve as their businesses evolve.

    As to older means of doing the same job, I still do possess a typewriter (1971 vintage) but these days I only use it for typing labels. And I carry an A6 notebook all the time I am out and about. When I have an idea, whether out walking, or in the pub, or travelling, it goes into the little book. And if it’s a good enough idea, it will get transferred on to the computer in due course.

  6. It does depend on what you’re using machines for. Various applications have “matured” over time; I remember when bitmap graphics was something you kept needing more speed for, but that matured in the very early 2000s. In 1998 I recorded a CD of music using Cubase; it was possible on my 300MHz Pentium II, but only just, but since then that has matured.

    3D graphics still has a long way to go to reach maturity. As a hobby, I am playing with 3D animation in Blender, and I desire a machine orders of magnitude faster than the one I have. We are a very, very long way from real-time professional-quality 3D rendering, which would be the benchmark of maturity in that application.

    But yes, word processing and office stuff has been mature for a long time now, I think.

    • That’s interesting. I will look into Blender. I haven’t attempted animation yet, but would like to as it seems to be very effective when it comes to keeping people interested in e-commerce websites.

      • You may notice that the book advertisements to the right of this post are missing. This is because the cloud-based server on which the images are stored suffered a power cut a few days ago, and everything is down.

        My own preference is to have everything stored, or at least backed up, locally.

        • The book advertisements are shown on my screen, so maybe it’s a local problem rather than the cloud-based server? I agree, though, that there are potential hazards and risks with this, and I’m not trying to ‘sell’ the concept. The safest way is to have local back-ups, I agree, and as I state above, that’s what the more sophisticated and risk-aware users will do. But I imagine there will be major growth in ‘device-neutral’ cloud computing among ordinary household users, where there is a market for gaming and entertainment and people will enjoy using integrated devices without all the costs and hassles of hardware.

  7. We are all very privileged to be living right now. None of us could do what we have learned to do without what Jerry Pournelle in Byte magazine portentously called, in the early 90s, “the coming little computers”.

      • To me, it’s the paradox of living in a time of technological wonders, and political catastrophe. I sometimes lapse into short periods of a kind of wistful depression, imagining how the country would be if Thatcherism had led onwards to libertarianism of some form, instead of the ghastly progressive statism and monetary inflationism we got.

        • I recall a documentary from about 20 years ago about the favelas of Brazil that depicted how high technology can exist and be utilised even by very poor communities. It had people in these slums using satellite dishes and even computers.

  8. You all missed out on the pleasures of the Amiga 500. Modern computing, way back when. My son is an IT professional, but he still wants to program for the Amiga too. It is a very rewarding machine.

  9. What about the problem of those who cannot come to terms with IT? What of the millions of people too old to have had any experience of computers in their youth and who find computers terrifying? What of those who are disabled in ways which make using computers impossible, for example, many blind people? What about those who are simply lacking in the intellect required to operate them? What about technophobes? What about those who are too poor to have the Internet? How are all these people to survive in a world which increasingly expects, and indeed demands, that services and goods may only be obtained online?

    • My experience with people with learning difficulties is that they seem entirely capable with IT, and manipulate their smartphones fully competently.

  10. What’s the sense in a private user availing himself of a cloud service merely to store data? Data storage and backing up of files are easily done on one’s own computer and storage facilities. I suspect computer giants have some plan in mind in their encouragement of cloud usage.
