Brainstorm Magazine


The mainframe is dead; long live the mainframe

Since its arrival in 1964, the mainframe has been the one constant in an ever-changing sea of technology. It remains so despite periodic predictions of its imminent demise.

By Brian Bakker, 1 October 2007

An IBM System/360 Model 50, circa 1968 (source: IBM Archives)

In the May edition of Brainstorm, we investigated the origins of electronic computing and how that culminated in the arrival of what is often referred to as the first mainframe: IBM's System/360. More recently, we examined two other classes of machine that emerged from those same roots: the minicomputer and the supercomputer. Now it's time to look at how the mainframe evolved from those IBM-dominated days of the '60s and early '70s.

In Inventing the Electronic Century, author Alfred Chandler explains how the 360 represented a consolidation for an IBM that was facing increasing competition in the late 1960s. Indeed, before launching the 360 family, Big Blue had seven different lines, each targeting a different price/performance range. On top of that, each used different peripherals and had its own operating software and set of applications. In a word, they were incompatible. Add to this the fact that IBM's competitors all followed different development paths and created proprietary systems and devices, and you have a sense of the computing landscape through the '50s and early '60s.


But IBM changed all this with the 360 by standardising all IBM computers on a single operating system (OS/360). This created the economies of scale that allowed IBM to leap ahead of its rivals in the then fledgling industry. In Big Blues: The Unmaking of IBM, Paul Carroll elucidates: “The 360 line was the first true family of computers. To that point, anybody buying a more powerful computer from IBM or anyone else couldn't use his old software on the new machine. With the 360 line, customers could start with a small machine and work their way up as their needs grew, taking all their software along with them. The ability to run old software on new machines was a crucial feature that convinced IBM customers to write hundreds of billions of dollars' worth of software to run on IBM machines over the years.”

Readers may recall that, in the early days, computers shipped with a basic operating system and tools in the form of programming languages like Cobol and Fortran, needed for customers to develop their own business applications (Brainstorm, June 2007). Back then, there was no such thing as packaged software [another topic we'll explore in a subsequent chapter – Ed.].


Less likely to move

Indeed, the very fact that customers needed to invest in the skills and time to develop their own applications meant they were less likely to move to a different, incompatible platform. Since such a move would be tantamount to writing off the previous investment and require the purchase (or leasing) of new hardware, customers tended not to do it. The phrase “vendor lock-in” may resonate.

However, the economic impact of the 360 was significant for another reason. Kenneth Flamm relates the influence it had on the industry as a whole, and IBM in particular, in Creating the Computer: Government, Industry and High Technology: “The introduction of the concept of compatibility proved a turning point in the economic history of the industry.”

He notes further that this concept “created a unified market that greatly stimulated the commercial use of computers…. By drawing the boundaries of this large, unified market with a proprietary, internally-controlled standard, IBM created serious obstacles for current and potential competitors”. This is not to say that there was no competition. Other players in the US vying for a slice of that pie included Sperry Rand, Control Data, Honeywell, Philco, Burroughs, RCA, General Electric and NCR.

However, IBM still dominated the industry – largely because it had effectively cornered the market on computer research funding provided by the US government, to the extent that such funding made up over half the company's research budget in the early 1950s. This is exemplified by the Sage air defence programme, for which IBM developed 56 specialised computers at a whopping $30 million apiece. Mull this over for a second: IBM received an effective $1.68 billion in research funding from the federal government for a single programme. To put this into context, the commercial computer market was worth less than $1 billion in 1955, according to figures provided by James Cortada in Historical Dictionary of Data Processing.

Another key research programme of the time was Project Stretch, which started in 1956. This project ultimately provided a new component technology that was used in IBM's large 700 systems and its smaller 1400 computers, and was also central to the development of the all-important System/360 in the 1960s, according to Chandler. Fortunately for the competitors, the US Justice Department had taken cognisance of the benefit IBM was deriving at their expense and resolved to address the imbalance legally. This process culminated in the consent decree of January 1956, the most critical part of which called on IBM to license its “existing and future patents” to “any person making a written application”.


Opening up


In the interest of opening up the electronics field, similar agreements were inked by Justice with RCA and AT&T – two other virtual monopolies of the time. This opened the door for competitors but, ironically, the first company to take advantage of the consent decree to challenge IBM was fellow monopolist RCA. The company had spotted a major weakness in the way the IT industry was evolving. Each of the competitors had developed its own proprietary architecture and claimed it was superior to that of everybody's chief rival, IBM, as well as those of the smaller players. RCA bucked the trend by launching the compatible mainframe market in 1964 with its Spectra 70 – a line of four computers that would execute, without modification, software written for corresponding models of the IBM 360 line. And they cost up to 40 percent less (Brainstorm, May 2007).

Paul Ceruzzi reports in A History of Modern Computing that while RCA ultimately failed in its attempt to challenge IBM, this did not invalidate the basic economics of copying the 360 architecture. “Other companies with far less capital than RCA proved successful, not by copying the entire line, but by targeting pieces of the 360 system: memory units, tape units and central processing units. These companies, operating under the umbrella of IBM's pricing policies, established the ‘plug-compatible manufacturer’ or PCM business...”

At this juncture it is worth noting that, unlike today, when most components are housed in the same physical chassis, early mainframes featured many discrete components connected by under-floor cables. Chandler writes that a number of existing enterprises started producing IBM-compatible products in the mid to late '60s.

“They usually started producing on an OEM basis – that is, selling directly to other manufacturers to be used in those producers' final products. Soon, however, they were marketing them [under] their own labels to computer users as replacement parts.”

Among these was Telex, a pre-war hearing-aid producer. The company got into the IBM plug-compatible tape drive business in 1967 by reselling the products of Information Storage Systems, a company formed by 12 defecting IBM engineers (aka the Dirty Dozen). Another group left a short while later to form Storage Technology Corporation (better known as StorageTek).

Memorex got into the tape drive business the following year and later expanded into compatible disk packs and drives. Also in 1968, Ampex – an established maker of tape drives – started producing plug-compatible units and, a year later, plug-compatible core memories. Meanwhile, IBM's more traditional competitors weren't being left behind. Control Data entered the PCM business toward the end of the '60s by redeveloping its existing range of peripherals to be IBM-compatible. CDC started with disk drives, then add-on memory and, shortly thereafter, was selling a full line of substitute peripherals for the IBM 360 line.


Tit for tat

Writes Chandler: “Each time IBM introduced a new peripheral, [CDC] quickly followed by bringing one out at a lower price. Through the practice of [reverse] engineering and making slight changes, it could stay abreast of IBM without incurring similar development costs. By 1970, Burroughs, NCR and Sperry Rand, as well as RCA and GE, were producing plug-compatible peripherals.”

Meanwhile, international competitors weren't just sitting on their hands, despite being at a serious disadvantage from not having access to Cold War-era US defence spending. Four European and five Japanese companies were mounting a challenge.

Chandler: “The European enterprises were International Computers Ltd., Compagnie des Machines Bull S.A., Olivetti S.p.A. and Siemens-Nixdorf, a subsidiary of Siemens AG (a descendant of Siemens & Halske AG). The Japanese enterprises were Fujitsu Limited, NEC Corporation, Toshiba Corporation, Hitachi Ltd. and Mitsubishi Electric, a member of the Mitsubishi Group.”

It wasn't long before attention moved beyond peripherals to the processors themselves. The first serious challenge to IBM's dominance in this area came from one of its own. Ceruzzi reports that top IBM designer Gene Amdahl left to establish Amdahl Corporation in 1970 with the express intention of building a compatible processor. Unfortunately, he chose to start his company in the midst of an economic downturn and was unable to find a venture capitalist to provide the necessary funds.

Enter Fujitsu, which, with other Japanese companies and the government agency MITI, had been making little headway in its efforts to build an IBM plug-compatible. Fujitsu jumped at Amdahl's offer and purchased 24 percent of the company in return for an exchange of technical information. The effect on IBM's market share was gradual but inevitable.

An unattributed graph published in Downsizing Information Systems (1992, edited by Steven Guengerich) lists the top five mainframe processor manufacturers as IBM (44.4 percent), Fujitsu (11.6 percent), Hitachi (11.1 percent), NEC (8.5 percent) and Amdahl (5.2 percent). There was, at that time, still a large pool of smaller players – the “other” category makes up the remaining 19.2 percent of the pie. However, IBM didn't only have to compete with Amdahl and the Japanese; it was also facing a challenge from a group of independent companies.

Chandler reports: “As the System/360 came on stream in the late 1960s, its leasing immediately became big business. These firms purchased IBM machines, leasing them at a lower price and offering more financing alternatives than IBM… Some of the leasing enterprises were units of large established enterprises, but most were start-ups, including such firms as Itel, Diebold, Leasco, Levin-Townsend and MAI.”


And then there were the threats to the mainframe itself. In the mid to late '80s, a new system architecture emerged called client-server [the full story will be told in a subsequent issue of Brainstorm – Ed.]. Until then, the architecture of computers had been built around a large central processing unit, coupled to a network of peripheral devices for input, output and data storage.


Peripheral intelligence

Initially, data capture and programming were accomplished by means of punched cards or tape, but these started to give way to video display units or so-called dumb terminals in the late '70s. And then came the PC [another story for another time – Ed.]. Because there could now be intelligence at the periphery of the computer, the theory was that the processing job could be spread around. Unfortunately, this proved tricky to accomplish in practice and typically meant customers would have to redevelop their entire software portfolio for the new platform. Since few big customers were prepared to throw away decades of investment in home-grown software, the advent of the client-server model didn't kill off the mainframe as many had predicted. If anything, this software legacy ensured the longevity of both IBM and the mainframe.

The advent of client-server – the model we still use today – wasn't the last time the refrain was heard. Indeed, in the late '90s, pundits were discoursing at length about how Y2K would be the death knell of big iron. In Accidental Empires, Robert X. Cringely wrote: “We can predict the date by which the old IBM – IBM the mainframe computing giant – will be dead. We can predict the very day that the mainframe computer era will end. Mainframe computing will die with the coming of the millennium. On 31 December 1999, right at midnight, when the big ball drops and people are kissing in New York's Times Square, the era of mainframe computing will be over.”

There were very good reasons for his pessimism and he was by no means alone. But what he and others failed to consider was ingenuity: the ability of the human animal to think its way out of difficulty. Looking back now, we can unequivocally say that Cringely was wrong. The mainframe didn't die on the day and neither did IBM, but it was a close-run thing.

To give Cringely his due, he could have been right on both counts. When his book was published in 1992, IBM was in freefall. That year, the company posted what was then the largest financial loss in corporate history.

Writes Carroll: “John Akers [then IBM chief executive] was finally in serious trouble in late 1992. After years of one step forward and two backward, he had promised the board members that IBM would take another step forward in the second half of the year.

“Securities analysts, with some guidance from IBM, had begun 1992 projecting a $4 billion profit for the year, but prospects for any kind of profit began to fade in late September as IBM's business in Europe fell apart. Akers was on his way to a $5 billion loss...”

What Cringely and other pessimists didn't count on was Lou Gerstner's appointment in 1993. Even then, many were openly dismissive.

Carroll wrote: “Gerstner will be able to apply only management consulting dogma to IBM. His choice by the board indicates that it doesn't think there's any grand vision out there that could revitalise IBM the way the movement into computers did in the 1950s, the way the 360 mainframe family did in the 1960s, and the way the PC did briefly in the early 1980s. Without that grand vision and without a breakthrough product, Gerstner will just be fiddling.”

IBM turnaround

In retrospect, it is probably simply the case that nobody from within the IT industry at that time could have turned IBM around. It needed a person not wedded to the concept of pumping out new products every few months. It needed someone who would find alternative revenue streams to replace those, such as hardware, whose margins were dwindling. What few commentators realised at the time was that IT products – be they hardware or software – had become, or were fast becoming, a commodity. But Gerstner knew this and, under his guidance, IBM became the leading IT services organisation on the planet. He tells the tale himself in his fascinating autobiography Who Says Elephants Can't Dance?

Even now in 2007, the mainframe is still alive and well, living in a data centre near you. Granted, it's not the same machine it was back in 1964. It has gone from being air-cooled to relying on liquid coolant and back again. It has much faster processors, oodles more memory and terabytes of disk capacity. And it's certainly not running the same software. In fact, in some cases, it may even be running software called Linux, originally developed as recently as 1991 by a Finnish university student named Linus Torvalds. All of that may well be true, but one thing hasn't changed: the mainframe is still a monolithic central processing unit sitting in a carefully controlled basement computer centre.

It's still sitting there, computing complex calculations at a blistering pace. To paraphrase Mark Twain: reports of the mainframe's demise have been greatly exaggerated. The mainframe is dead; long live the mainframe.