1976 – Apple

You know the old joke: the world is divided into two kinds of people, those who believe there are two kinds of people, and those who don’t! As I sit typing on my Apple Mac, there is another and rather more intriguing division: between those who are Apple users, and those who aren’t. I am not talking about iPhones, but about the serious devices, Apple computers, PowerBooks, iPads and the like. Did I say ‘Apple users’? I meant Apple followers, fanatics and fans! Perhaps this is no more meaningful than the division between those who believe there are two sorts of people and those who don’t, but I think it’s significant: Apple people are different.

To put my thoughts in context, I have to go back to the earlier days of computing, and the way things were, reminiscences which are likely to confirm two things: first, that what the world was like back then is ancient history, and second, that those who talk about such things are themselves ancient and their recollections largely irrelevant! Well, too bad. I am going to talk about the 1960s and 1970s, because I think they help set the scene for the changes that Wozniak and Jobs introduced. As a relatively early computer user, I was to join an IBM world. Not from childhood, of course, but using computers saw me seamlessly introduced to the world of IBM, and, not incidentally, MS-DOS. In fact, when I first used a computer in the 1960s, it was in the time of Titan, punched cards, and machine code. You never saw the computer, but left your tray of cards at reception, and one of the machine’s acolytes would whisk the tray away, to run your program during the night. The computer was a black box, invisible to humble users, even if it was a rather large black box! My interactions were mediated by FORTRAN, an early computer ‘language’, and the tasks were simple – in my case carrying out basic statistical analyses on questionnaire data. This was like the time of the Wright brothers and the beginnings of commercial aviation – using hardware and programs that were, quite frankly, rather esoteric, their systems akin to those crazy Heath Robinson and Rowland Emett mechanical sculptures.

I won’t dwell on those early days, although I am tempted, but rather acknowledge that by the end of the 1960s business had taken over, and the giant was IBM. IBM had long been the leader in mechanical counting and tabulating machines, sorting data on punched cards. Its roots ran back to the 1880s, with four predecessor companies combining in 1911, those businesses covering routine tasks like employee time-keeping and punched card equipment among other activities. The punched card technology had been used for the 1890 US Census, and was the core part of the Computing-Tabulating-Recording Company, which its CEO, Thomas J Watson, renamed the International Business Machines Corporation, IBM. Watson was both a risk taker and a strategist. He focussed on sales, with a generous incentive scheme, and demanded attention to customer service, together with a commitment to his staff – hiring the company’s first disabled worker in 1914, setting up an employee education department in 1916, and, famously, introducing the slogan THINK in 1915. At one level it was hokey, with singing at meetings and an employee newspaper, but employees were rewarded for innovative ideas, and particular encouragement was given to a core of leading engineers. His company introduced the 80-column punched card in 1928, which was to remain the industry standard up to the 1970s.

The company grew and grew, both in the US and overseas. While UNIVAC launched the first commercial electronic computer in 1951, IBM was only a year behind. When Watson died in 1956, his son Thomas J Watson Jr took over, modernising operations. Perhaps indicating he was as far-sighted as his father, in 1957 he recognised the emerging promise of transistors and committed IBM to solid-state circuitry in all future machines, stopping the development of any new product based on ‘old-fashioned’ vacuum tube systems. Working on government contracts, especially military applications, IBM drew on cutting-edge research into digital computers being undertaken under military auspices. It saw programming as a short-term business, which meant it allowed others to develop software for a diversity of applications.

The hardware business grew and grew, and in 1964 IBM introduced the first truly general-purpose machine, the System/360, a ‘family’ of computers using interchangeable software and peripheral equipment. Within two years, System/360 became the dominant mainframe computer and IBM quickly emerged as the world’s largest computer supplier. Then, in 1969, it “unbundled” software and services from hardware sales, which until that time had been provided free with the hardware, a decision that was to play a key role in encouraging an independent software industry. By the 1970s, the company’s dominance ensured it kept prices high, product innovation slowed, and manufacturing relied on IBM components. It didn’t seem to matter. As an academic with an interest in computing, it wasn’t long before I was using a computer via a terminal connected to an IBM 370. The days of Titan were long gone, and now I could run programs like the Statistical Package for the Social Sciences (SPSS), a flexible programming system designed for use on mainframe computers: it was a lot easier than writing my own programs in FORTRAN.

However, change was coming, and nowhere more fatefully than on April 1, 1976, when Steve Wozniak and Steve Jobs founded Apple Computer, in Jobs’s parents’ garage in Cupertino, California, south of San Francisco, now the heart of ‘Silicon Valley’. Fatefully? Certainly. By the end of the year, the Apple I, first seen in prototype at the Homebrew Computer Club in the middle of the year, was on sale, offered as a kit rather than a fully fledged personal computer. Within a year, the Apple II appeared, and by the end of the decade Apple Computer Inc. was ready to sell the Apple III, targeted to do battle with IBM in the business market. The next few years have been described in numerous histories, most of which focus on the demanding and somewhat mercurial Jobs. However, one significant moment came when Jobs and other Apple staff visited the Xerox PARC R&D facility. After three days looking at the ideas being developed there, Jobs was convinced the future would see the use of graphical user interfaces on computers. That insight led to Apple’s Lisa, and eventually the Macintosh in 1984.

More than 37 years ago, the Macintosh was launched with a $1.5m television advertisement during Super Bowl XVIII. Directed by Ridley Scott, it still stands out as a masterpiece, like a science fiction epic about to begin, and ending with the memorable line “On January 24th Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984’”. It seems so unlikely, but that advertisement is still accessible on YouTube. Equally compelling was Jobs’s own launch of the Macintosh, both he and the computer behaving like master showmen, and all to the background of Vangelis’ music for Chariots of Fire! Apple had grown up, and from that moment, despite the trials and tribulations of Jobs’s career, the peaks and troughs of the company’s business, Apple was always going to be pushing the envelope of what was possible, what modern electronic devices could do and what they would look like.

If Apple was selling personal computers from 1976, IBM seemed unconcerned. Its focus was on Japanese competition in the business market, and by the late 1970s it had shifted its strategy to offering volume discounts and lower prices to large customers, together with the frequent introduction of ‘new’ products. However, its share of the overall computer market declined from 60% in 1970 to 32% in 1980, as it completely missed or ignored the fast-growing market for minicomputers, falling behind such rivals as Wang and Hewlett-Packard. Having realised its failure in this segment of the market, IBM was determined not to lose out in the emerging personal computer sector. Its slow entry into the competition was a function of a massive development task, and the groundbreaking IBM PC, launched in August 1981, proved a major success. It was a powerful device, with 128 kilobytes of memory (expandable to 256 kilobytes), one or two floppy disk drives, and an optional colour monitor. It wasn’t cheap, nor were the major buyers corporate computer departments (after all, the IBM PC wasn’t a ‘proper’ computer), but purchases were often made by middle managers and senior staff who saw its business potential, especially once the ‘VisiCalc’ spreadsheet program became available for it.

By 1984 IBM had launched a more powerful PC, at a relatively low price, and this contributed to its return to dominance across the computer industry. By 1985, IBM accounted for 41% of all revenue and 69% of all profit among the 100 largest data processing companies. Its revenue was about nine times that of second-place DEC, and larger than that of its six largest Japanese competitors combined. Its 22% profit margin was three times the 6.7% average for the other 99 companies. However, even as it seemed unstoppable, IBM changed its approach. It had relied on a vertically integrated strategy, building most key components of its systems itself, including processors, operating systems, peripherals, databases and the like. However, facing high demand and growing anxiety over time-to-market, IBM chose not to build a proprietary operating system and microprocessor for the PC. Instead, it sourced these vital components from Microsoft and Intel, ending its monopoly and in doing so passing extraordinary power over operating systems and processor architecture to those two companies. This was to open the door to IBM ‘clones’, and the creation of hundreds of billions of dollars of market value outside IBM. Apple was well aware of what was happening, and was determined to keep control over software and systems, even if it did have to acquire processors from independent integrated circuit manufacturers.

Was that the ‘secret sauce’ for Apple? I don’t think so. What Steve Jobs brought to the business was not just technical expertise, nor was it his deep understanding of changing approaches to software; his most fateful contribution was an aesthetic of elegant simplicity: Apple products would be stylish and easy to use. That approach was clearly the motivator in the development of the iPod. Portable MP3 players weren’t new, having been on the market since the late 1990s. However, existing players were “big and clunky or small and useless” with user interfaces that were “unbelievably awful”. Apple was concerned about storage: flash memory systems didn’t carry enough songs, and hard drives were big and heavy. The company decided to develop its own player.

Work on the device was undertaken by Tony Fadell, who wanted to build a better MP3 player than those on the market, and who believed the player should be accompanied by a music sales store. Fadell had started a company, Fuse Systems, but his attempts to sell to major electronics firms like Sony and Philips had been unsuccessful. Then Apple came on the scene and he was taken on as an independent contractor to work on various projects. With Apple’s own engineers busy on the iMac line, Fadell hired engineers from his own startup, as well as veterans from other companies, and outsourced the software to PortalPlayer. That development led to the iPod OS, the software that Apple was to extend and develop across the iPod line. The other key person was Jonathan Ive, who designed the player with a wheel interface, said to have been prompted by Bang & Olufsen products. Jobs quickly became deeply involved in the design, and Apple’s “Walkman of the twenty-first century” was developed within a year. It was released in late 2001.

Within a few years, the iPod came to dominate digital music player sales in the United States, with over 90% of the market for hard drive-based players and over 70% of the market for all types of players. But it was much more than that. It established a reputation for quality, in design, simplicity and ease of use, that became the signature for Apple. It was also the symbol of Apple’s revival after Steve Jobs’s return as CEO in 1997. Six years after the iPod’s release, the iPhone was launched. The history of the iPhone deserves more than a brief paragraph in a blog. Sufficient to say that the project began with Steve Jobs asking Tony Fadell, software engineer Scott Forstall, and design engineer Jonathan Ive to create a touchscreen device to supersede existing mobile phones and music players, while also drawing on what had been learnt in the innovative but unsuccessful Newton project. On January 9, 2007, Steve Jobs announced the first iPhone at the Macworld convention, and it has led the market ever since.

I began these comments by talking about two kinds of people: Apple users, and those who weren’t Apple users. I was certainly in the second group. I did succumb to an iPhone, but that was because of a special offer to a group of Qantas frequent flyers. For everything else, I was in with the vast legion of IBM users. Not IBM computers themselves. I was one of those ‘discriminating’ people who studied the differences between Acer, HP and Dell PCs, comparing CPU speeds, storage capacity, USB ports and similar data. I thought Apple products were for academics in design, arts subjects, and the like. I never expressed it that way, but I knew ‘real men’ and businesspeople used IBM lookalikes. And then I leapt over the chasm.

I can’t explain what happened, but when we arrived in the US, my wife and I walked into a big-box store, Best Buy, where we promptly bought a desktop Mac, a MacBook, and two iPads. We had moved to the other side, instantly becoming Apple people, and we never looked back. It wasn’t just the aesthetic, though that was impressive. It wasn’t some kind of siren call from our iPhones, though I’m sure they played a role. It wasn’t about processing speeds, though they were very fast. It was, I think, that we felt welcomed into a family, where everything worked seamlessly together. We were seduced. Could I cross back over? Yes, I could. Will I? I don’t think so. For a variety of reasons, I’ve become ‘Appled’. I get grumpy about software upgrades. I don’t like the latest version of iBooks – oops, now just ‘Books’ – but I am already being seduced by some of the new features. Looking over from Apple, it’s a long way back to the other side.
