How Microsoft did not negotiate the contract of the century

Paul Allen and Bill Gates in 1970, sitting in front of a Teletype Model 33 ASR terminal at Lakeside School – public domain photo by Bruce Burgess via Wikimedia Commons.

The MO5.com association, of which I am a member, aims to preserve the heritage of information technology. As part of this activity, we also document the history of the field. Concerning this history, one regularly hears that Microsoft negotiated a contract with IBM in the very early 1980s that allowed it to become the giant it is today, at the expense of its former partner.

The story then goes on to explain how Bill Gates, a good poker player and fine negotiator, managed to negotiate the contract of the century, the one that led Microsoft to become the leader in the world of personal computers to the detriment of IBM. It is all part of Bill Gates’ legend, which does not seem to bother him.

However, the documents from the time tell a different story. They paint a picture of a combination of circumstances rather than a skilful negotiation by the young head of a struggling start-up … Let me tell you how Microsoft did not negotiate what would become the contract of the century.

To reconstruct this story, I have used the book Accidental Empires [1], the associated documentary Triumph of the Nerds [2], the episode of Computer Chronicles dedicated to Gary Kildall and Digital Research [3], and the series of articles dedicated to these events on The Digital Antiquarian [4], as well as the preservation work carried out by the MO5.com association. The information had to be cross-checked: even though the events in question were reported by the people involved, their memories are imperfect, and they may have an interest in the story being told not matching the facts exactly. In any case, there is enough evidence to reconstruct the most likely story.

In the beginning was IBM

IBM headquarters in Armonk – photo by Treesmittenex under the Creative Commons Attribution-ShareAlike 4.0 International licence, via Wikimedia Commons.

The computer, in its modern sense, emerged in the wake of World War Two, as the beginnings of computing were put to use in the war effort. In this new market for computers, which at the time were machines that took up a whole room and would soon be known as “mainframes”, IBM quickly established itself as one of the main actors. Its decisions shaped the market. However, another kind of computer, the minicomputer, appeared, offering slightly less computing power, but with the advantage of occupying a space equivalent to a few cupboards rather than an entire room. In this field, manufacturers other than IBM managed to make a name for themselves alongside the giant. One example is the PDP series from DEC, widely used in universities.

In 1971, Intel brought to market a new development in computing: the microprocessor. A processor is a logical data processing unit: it is the entity that brings together the computer’s calculation and data processing capabilities. Originally, the processor was made up of several independent hardware components that communicated with each other. The principle of the microprocessor is to bring together the various parts of a processor in a single component, which not only saves a great deal of space but also greatly reduces energy consumption and speeds up communication between the various processor units.

Introducing this type of chip made it possible to design much smaller computers that could be placed on a desk: microcomputers.

Incidentally, one of the very first microcomputers, if not the first – that is, a computer based on a microprocessor – was designed in France: the Micral from R2E. MO5.com is restoring one of these, as well as documenting its history.

The advent of microcomputers led to the emergence of two new areas in computing: small-scale dedicated applications (for instance, the Micral restored by the MO5.com association was used to control centrifuges) and computing for small organizations, such as small businesses, and even individuals. This increased the size of the computer market.

Then came Digital Research

Gary Kildall in 1988 – © Tom O’Neal, Carmel Valley, CA, via the Computer History Museum.

A computer, whether it is a mainframe, a minicomputer, or a microcomputer, is not just its hardware: it also needs a very central piece of software, the operating system. Each component of the computer is physically capable of communicating with the other components, but it is still necessary to establish the conventions that allow these components to work together, the form in which data will be stored, and the way the various programs, and the user, will access the resources of the computer. These tasks are performed by the operating system.

In the mid-1970s, Gary Kildall created the CP/M (Control Program for Microcomputer) operating system. It was particularly well-suited to the resource-poor microcomputers of the time, and very quickly became the benchmark in this emerging field. In order to market this operating system, Gary Kildall founded Digital Research in 1974, one of the very first software publishers dedicated to microcomputers.

Meanwhile, the computer manufacturer Seattle Computer Products (SCP) marketed kits based on Intel 8086 processors. Initially, SCP supplied its kits without an operating system, but it soon became apparent that sales would be much higher if the kits had a dedicated one. However, rather than use CP/M-86 from Digital Research, SCP commissioned one of its employees, Tim Paterson, to develop an alternative. The result was an operating system offering the same programming interface and virtually the same commands as CP/M, marketed under the name 86-DOS (DOS for Disk Operating System). Gary Kildall had his suspicions, but Tim Paterson always denied having drawn on the CP/M code itself: according to Paterson, it was a compatible operating system, not a clone.

It was in these same years that Microsoft was founded (in 1975 to be precise), positioning itself in this emerging market. The company mainly sold interpreters for the BASIC language, which had acquired a good reputation.

Computers bite the apple

Steve Wozniak (left) and Steve Jobs (right) in 1976 – © Apple.

In the 1970s, two electronics enthusiasts, Steve Wozniak and Steve Jobs, created a microcomputer that was easier to access than those available at the time: the Apple-1. To sell this computer, they founded the Apple company in 1976. While this first computer was already a step forward in terms of accessibility, it still required its purchaser to assemble the various components. The next Apple computer, the Apple ][, launched in 1977, remedied this problem.

This new computer was sold already assembled, with an operating system designed to be used without spending too much time reading a complex user manual. As a result, it broadened the audience for microcomputers. This computer also had several extension ports, that is, slots into which electronic cards could be inserted to expand its capabilities. This meant that the computer could be adapted to a wide range of tasks.

One piece of software in particular led to the adoption of the microcomputer in many companies, particularly small ones: VisiCalc, the first spreadsheet, published in 1979. VisiCalc made it easy to create balance sheets and financial projections, and very quickly became a huge success in the finance sector, remarkable for a software package of its time.

IBM rides the wave

Kelly Slater in 2008 – photo by Rob Keaton under the Creative Commons Attribution-ShareAlike 3.0 Unported licence, via Wikimedia Commons.

At the very end of the 1970s, a new demand emerged for microcomputers equipped with spreadsheets. At the same time, the first word-processing software had appeared, making it possible to combine in a single system the tools used for financial work and for drafting the documents usually produced on typewriters. Customers of IBM tended to be large corporations rather than small companies, but they too were interested in equipping their employees, at least in the financial departments, with such machines. As IBM customers, they preferred to have the same supplier for both their mainframes and their office machines, so they put in requests to IBM. Besides, for IBM, marketing microcomputers could broaden its customer base.

However, managers at IBM did not believe that microcomputers could really become an important sector. In their view, the market would always remain relatively small, and the potential profit margins were clearly lower than for large systems. So they decided to start designing a microcomputer, but with the aim of doing it at a lower cost. Moreover, it was clear that to have a chance of entering this market, they would have to design the new computer in a short amount of time, which was contrary to the usual way of working at IBM.

This had a major impact on the way the project was run. First, the team working on the IBM PC 5150 (PC for Personal Computer) had very little time, which meant that a large part of the work had to be outsourced. In particular, despite IBM employees’ experience in this area, the operating system and compilers [5] were produced by third-party software vendors.

For the compilers, IBM turned to Microsoft, which had already produced compilers for the processor chosen for the IBM PC 5150, namely the 16-bit Intel 8088 (a variant of the 8086). In an emerging market, survival is difficult for the small companies that have entered the field. This was the case for Microsoft, for whom the contract with IBM was a godsend not to be turned down, whatever the conditions. So Microsoft gladly accepted.

Concerning the operating system, IBM naturally turned to the market leader and contacted Digital Research to ask it to produce a version of CP/M adapted to the IBM PC 5150. Unlike Microsoft, Digital Research was a prosperous company at the time. As a result, the company’s lawyer was much more careful about the terms of the contract. What she read struck her as unacceptable: in short, Digital Research would not have the right to communicate about the IBM PC 5150, let alone its specifications. IBM, on the other hand, could communicate as much as it wished, and the CP/M version would become part of its assets, enabling it to carry out internal developments on this operating system. In other words, not only would Digital Research be unable to communicate about the work it would do for the IBM PC 5150, which meant it could not really capitalize on this work to win new markets, but it would also be dispossessed of the said work. While having a contract with IBM was interesting in itself, Digital Research did not need it to ensure its survival. It was far more important for its managers to retain ownership of their work. Besides, there were very few companies capable of supplying a working operating system for this machine in the time allowed, and, at the time, Microsoft was not one of them. The position of Digital Research in the negotiations was therefore much stronger than the negotiators from IBM thought. As a result, the negotiations stalled [6].

Chief executives of Microsoft were aware of these difficulties in the negotiations. Bill Gates, CEO of Microsoft, then tried a bluff. He knew that SCP had an operating system compatible with CP/M and dedicated to the processor architecture selected by IBM for the PC 5150. Bill Gates had the idea of buying 86-DOS from SCP and adapting it for the IBM PC 5150. Even before starting negotiations with SCP, he told IBM that he had an operating system that he could easily adapt for the IBM PC 5150. Negotiating with SCP, he obtained a non-exclusive licence to 86-DOS for $25,000, the contract also stipulating that Microsoft would pay $10,000 per computer manufacturer that integrated the operating system, rather than a share of the revenue from copies sold [7]. In this negotiation, at least, Microsoft did indeed secure terms entirely favourable to itself. Microsoft also recruited Tim Paterson, who was in any case unhappy with his position at SCP at the time. As for IBM, it is unlikely that it was aware of the manoeuvring of Microsoft’s management. However, IBM was very happy to find someone prepared to supply it with the operating system it needed on the terms it wanted to impose. And so the deal was made: Microsoft would supply the operating system, to be known as PC-DOS, on the terms IBM wanted. At this stage, we are still a long way from the contract of the century.

IBM told Digital Research that it no longer required its services. However, Gary Kildall eventually learned that the IBM PC 5150 would be delivered with an operating system reproducing the way CP/M worked. Unhappy, he began negotiations with IBM as a prelude to possible legal action. In the end, he came to an agreement with IBM: the operating system would not be tied to the machine, as was customary at the time; instead, the purchaser would be able to choose one of three operating systems: UCSD p-System, CP/M-86 or PC-DOS. UCSD p-System, dedicated to the creation and execution of programs written in the Pascal language, was a niche operating system that was not expected to sell much. It essentially allowed some universities to acquire machines dedicated to Pascal programming at a lower cost than workstations, which were much more expensive at the time and whose broader capabilities were of no use for that purpose. CP/M-86 and PC-DOS, on the other hand, were in direct competition. However, as already mentioned, the two systems were compatible with each other, so it did not really matter which one was selected. CP/M-86 was sold for $240, while PC-DOS was sold for $40 – in his memoirs, Gary Kildall criticizes IBM for artificially raising the price of CP/M-86 [8]. Moreover, CP/M-86 was not available for the IBM PC 5150 until three months after its release, as Digital Research had not been able to work on the machine during its design. No surprise, then, that PC-DOS was the most installed operating system on the IBM PC 5150 once it went on sale in 1981.

The clone war

Promotional image from the 2003 animated series Star Wars: Clone Wars, directed by Genndy Tartakovsky – © Lucasfilm.

Thanks to a well-judged purchase and to the negotiations between Digital Research and IBM, Microsoft found itself in possession of an operating system, PC-DOS, which it could sell to anyone it wished, without being constrained by IBM. On the other hand, this system was dedicated to the IBM PC 5150, so although Microsoft was theoretically independent of IBM, in practice its situation was identical to that of a contractor who had supplied an operating system to IBM. At least, that was true as long as there was no machine compatible with the IBM PC 5150.

The engineers who designed the IBM PC 5150 had done a good job: it was an office machine suitable for small businesses, capable of handling their accounting, office work and the minimalist database management sufficient for small structures. Although it did not match the computing power of the servers of the time, it offered more power than the average desktop machine then available on the market. Also, in line with the inspiration of the Apple ][, IBM engineers had given the PC 5150 significant expansion capabilities. The reduced budget and time allocated to the project had no impact on the quality of the machine. The real impact was that, except for one central component, the BIOS (Basic Input/Output System), all the components making up the IBM PC 5150 were standard parts, unlike other microcomputers of the time, which included numerous proprietary components. This feature was to prove crucial for the rest of the story.

The machine was good and the reputation of IBM was well established, so the IBM PC 5150 was a commercial success. In fact, sales exceeded IBM’s forecasts by around 800% [9]. Many users bought this machine, which naturally attracted envy.

As we have seen, the IBM PC 5150 essentially contained standard components, so anyone could build a compatible machine. The exception was the BIOS, which was covered by IBM’s intellectual property rights. Since IBM did not permit duplication of the BIOS, it seemed impossible to make IBM PC 5150 clones.

However, American law, at least at the time, allowed a technician who had not participated in the development of a piece of equipment to carry out a complete study of it in order to document how it worked. On the other hand, anyone who had carried out such a study was prohibited by law from making a clone of the component analysed.

Still, another engineer with the necessary skills, who had not worked on the development of the hardware either, was allowed to follow that technical description to make a clone of the hardware. Consequently, by paying a consultant who had not participated in the design of the IBM PC 5150 to analyse the BIOS and document its behaviour, and by hiring engineers who had never worked on the IBM machine to reimplement it from that documentation – the now classic “clean room” approach – it was possible to produce clones quite legally. Several manufacturers stepped into this breach, leading to a proliferation of IBM PC compatible computers.

As it happened, Microsoft owned the operating system of the IBM PC 5150, which the negotiations between IBM and Digital Research had left it free to market independently of the IBM microcomputer. Renaming it MS-DOS, Microsoft did not hesitate to distribute the system for the clones of the IBM machine. As a result, Microsoft found itself at the head of a larger market than expected.

Anyway, at the time, this contract with IBM could not yet be described as the contract of the century.

The emergence of a de facto standard

The original Tandy 1000 – photo by Judson McCranie under the Creative Commons Attribution-ShareAlike 4.0 International licence, via Wikimedia Commons.

As we have seen, the IBM PC was well suited to the tasks that small businesses were beginning to computerize in the early 1980s: accounting and stock management. It also offered word processing: everything that a professional outside a large corporation might need. On the other hand, it was not suited to the main uses of home computing, then in its infancy, namely digital entertainment. Its graphics and sound capabilities, while not absent, lagged behind 8-bit computers such as the Commodore 64, released around the same time as the IBM PC 5150, and even further behind the 16-bit home machines released in the mid-1980s.

The Tandy brand, with the Tandy 1000 range of computers launched in 1984, offered a machine with the computing power and compatibility of the IBM PC 5150, while providing graphics and sound capabilities at least equivalent to those of other domestic machines. IBM had already tried to bring out the IBM PC Junior, which had the same concept as the Tandy 1000, but was surprisingly less compatible with the IBM PC 5150.

As a result, numerous IBM PC clones went on sale. This created a sufficiently large market for it to be profitable for an electronics manufacturer to offer extension cards for these machines, which enabled IBM PC compatible machines to be equipped with the capabilities needed for a particular application, such as graphics or music. In this way, the IBM PC became a base to which the components required for the applications the user wanted to perform could be added. The versatility of this machine was to encourage its spread.

Changes in home computing

The Amiga 1000 – photo by Rama for the Bolo museum under the Creative Commons Attribution-ShareAlike 2.0 France licence, via Wikimedia Commons.

In 1984, Apple released the first Macintosh, a machine that was rather expensive compared to other home computers at the time, but still accessible to the domestic public. This computer offered a graphical interface that made it much easier to use than other computers aimed at the same audience, which were essentially operated by means of typed commands. The concept of a graphical user interface had in fact been in the pipeline in computer research laboratories since the late 1950s [10], and Xerox had marketed the first professional systems with such interfaces from the very end of the 1970s, notably the Xerox Star, sold from 1981. However, it was Apple that, with the Macintosh, offered the first computer with a graphical interface at a price that was more or less affordable for private users.

Following the release of the Macintosh, both Digital Research and Microsoft quickly developed their own graphical interface systems. While Microsoft Windows failed to convince, Digital Research came up with GEM (Graphical Environment Manager), which was very competitive with what the Macintosh offered. In fact, GEM introduced a number of ergonomic practices that are still in use today, and its programming interface was much more rigorous than that of the Macintosh. Thanks to GEM, the versatile IBM PC could now also offer a very accessible user interface, at least as accessible as that of the competition.

Seeing the danger of IBM PC-compatible computers becoming as accessible as the Macintosh, Apple sued Digital Research for plagiarism. There is a tragic irony here: Microsoft had imposed itself on IBM PC-compatible computers with an operating system that reproduced the behaviour of the one from Digital Research, and Apple was suing over a technology that it had itself taken from a Xerox laboratory.

As Apple had not invented the principle of the graphical interface, it was unable to get GEM withdrawn from the market. However, it did obtain the removal of some secondary elements, such as the dustbin icon, which nonetheless undermined the coherence of the user experience. In addition, the lawsuit delayed the spread of GEM, while more and more utility programs were released for MS-DOS that did not take advantage of GEM, making it much less attractive. As a result, GEM was a failure.

As a result, IBM PC compatibles acquired a reputation for being austere machines designed for technical use.

Moreover, in 1985 Atari Corporation released the Atari ST home computer, followed by the Amiga from Commodore International. Although the Atari ST was marketed before the Amiga, it was actually designed to compete with it. The Amiga took a long time to develop, but it had made a name for itself even before it went into production. Atari tried to buy it, but Commodore International beat it to the punch. Not to be outdone, Atari then designed the Atari ST, a less innovative machine, but one that could still stand up to the Amiga, especially as it was marketed at a much lower price. Furthermore, since we have mentioned GEM, it should be pointed out that this was the system chosen by Atari, especially as the lawsuit with Apple only concerned IBM PC-compatible computers. It was the marketing of the Amiga and the Atari ST that really launched the era of 16-bit home computers, the Macintosh also being a 16-bit computer. All three of these computers had a graphical interface.

In the United States, this situation fragmented home computing and computing for independent professionals into four parts. IBM PC-compatible computers were dedicated to simple office automation, programming, spreadsheets and simple database management. The Macintosh was aimed at more affluent users, mainly for office automation, graphics and desktop publishing. The Amiga was the machine of choice for graphic designers and the audiovisual industry, as well as being a particularly compelling gaming platform. The Atari ST, with its standard MIDI connection and highly stable operating system, was the computer of choice for computer-assisted music; with a more powerful processor than the Macintosh and the availability of a larger screen, it also competed with the Macintosh in desktop publishing, and it was a convincing gaming platform too, even if its capabilities in this area were a little behind those of the Amiga.

In Europe, manufacturers did offer alternative machines, but the market was too fragmented between different countries, making competition from American manufacturers far too overwhelming, as the latter drew considerable financial power from their domestic market.

Sales of the Macintosh, Amiga and Atari ST were similar to those of the sector before the release of the IBM PC, with the Macintosh selling better and the Atari ST less well. IBM PC-compatible computers, on the other hand, had opened up a much larger market, creating an installed base an order of magnitude larger than that of the other machines. As a result, it was profitable to sell software at affordable prices exclusively for the IBM PC, which was much more difficult to do for other computers. This created a dynamic in which the growing supply of software for IBM PC compatible computers increased the number of users of these machines, which in turn reduced the number of users of the others. The success of IBM PC compatibles led to the failure of the other home microcomputers.

In 1990, with the release of Microsoft Windows 3.0, Microsoft finally succeeded in offering a truly convincing graphical interface. Apple sued Microsoft for copying its user experience. Probably to protect itself, Xerox in turn sued Apple for the same reasons. None of these lawsuits succeeded [11].

In addition, IBM PC-compatible computers were able to keep up with the evolution of Intel’s x86 processors, moving on to the 80286, then the 80386, entering the 32-bit world. At this stage, IBM PC-compatible computers offered credible options in all the areas covered by competing machines. It was also around this time that computers produced by Asian assemblers appeared on the market at lower cost, making these machines all the more affordable. So by the early 1990s, it was clear that home computing would be dominated by IBM PC-compatible computers, and that Microsoft would play a central role.

However, while the contract with IBM had already proved highly profitable for Microsoft, it was not yet being described as the contract of the century.

You too, my son!

The Assassination of Caesar (1798) by Vincenzo Camuccini (1771–1844) – public domain painting via Wikimedia Commons.

As we have seen, a large part of the success of the IBM PC came from the existence of clones. As a result, IBM was losing control of the platform it had created. On several occasions, it tried to regain the upper hand against its competitors. The IBM PC Junior, mentioned above, was the first of these attempts. IBM also proposed various evolutions of the IBM PC, for example the IBM PS/2 (Personal System/2), which offered a more powerful processor and proprietary (but soon copied) connection ports. None of these attempts really made a difference.

However, it soon became clear that the PC-DOS system needed a major overhaul. This time, IBM decided to take advantage of its in-house expertise, albeit still in partnership with Microsoft. The two companies therefore worked on a joint project for a new operating system, called OS/2 (Operating System/2), the first version of which was published in 1987. Initially, it had only a text-based interface, but the next version, released in 1988, added a graphical one. From the outset, OS/2 offered compatibility with PC-DOS, meaning that software developed under PC-DOS could run on OS/2 without even a recompilation.

OS/2 was an ambitious project and, while it did not neglect the consumer market, its focus was more on professional uses. However, there was already a de facto standard in that sector: the Unix system. This system had first been popularized in universities and then, in the 1980s, in business and industry. At that time, it still ran on machines more powerful than IBM PC compatibles. Although most of the original Unix systems are no longer in use, their conventions were formalized in the POSIX standard, which is still in use today. Applications on large-scale systems are therefore based on this standard. However, OS/2 was not, or at least not completely, POSIX compatible. In order to survive on the professional market, OS/2 therefore had to create a new ecosystem, even though one was already well established.

At this stage, executives at Microsoft felt that, while the partnership with IBM had helped the company to flourish, it was now acting as a brake on its growth. So they decided to end the partnership and launch a rival project. This article has already mentioned Microsoft Windows, which at the time was an overlay on MS-DOS offering a graphical interface. Using the same graphical interface, Microsoft came up with a completely new system, Microsoft Windows NT (NT standing for New Technology). Although its graphical interface was the same as that of Microsoft Windows, internally the system differed greatly from MS-DOS, with which it was not very compatible. This system was also aimed at professionals. The two former partners, IBM and Microsoft, were therefore competing in exactly the same segment, with equivalent success. Incidentally, customers opting for OS/2 tended to equip themselves with IBM hardware, while Microsoft Windows NT tended to equip clones, though this was only a general trend.

In 1995, Microsoft released the Microsoft Windows 95 operating system, aimed more at home users. It was a largely new system, but it was still highly compatible with MS-DOS, so migration from one to the other was simple. Aided by an aggressive advertising campaign and anticompetitive practices [12], this operating system quickly became established. As a result, Microsoft consolidated its position as a key player in home computing. In industry, however, POSIX remained the standard. Microsoft had succeeded in steering its own course. But we are still not at the stage where the contract with IBM could be called the contract of the century.

One Unix to rule them all

The One Ring from The Lord of the Rings – © 2003 John Howe.

In 1969, Kenneth Lane Thompson and Dennis MacAlistair Ritchie, researchers at the Bell laboratory, found themselves a little idle after their laboratory had withdrawn from the development of the Multics system. Multics was a particularly ambitious operating system project, but one that never achieved the success its creators had hoped for. To occupy his spare time, Kenneth Thompson decided to port a game he had programmed to a PDP-7 computer. However, this computer did not have an operating system suited to his needs. With the help of Dennis Ritchie, he designed a system from scratch. Since this system performed all its tasks in a unified way, whereas Multics offered several paradigms, it was called “Unics”, soon spelt with an “x”: Unix.

In truth, this system was much more than just a support for a game: its designers drew on all their experience in designing operating systems, particularly that gained from working on the Multics project, to create a complete and efficient operating system. In particular, it was a perfectly functional system that was both multitasking and multi-user. Multitasking means that the system can run various programs at the same time; multi-user means that many people can access the same computer, each with an individualized workspace. By offering a unified interface, multitasking and the ability to connect different users, all within an efficient operating system, Kenneth Thompson and Dennis Ritchie succeeded where several operating system projects had failed, starting with Multics. What’s more, Unix’s unified interface greatly simplified the use of a computer. As a result, Unix generated a great deal of interest in the then small but growing community of computer system designers and users. It was quickly ported to a wide range of machines, and methodologies and tools were put in place to make porting easier.
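To give a concrete idea of what multitasking means for a programmer, here is a minimal sketch in C using the POSIX process interface descended from those early Unix designs; the fork() call it relies on has been part of Unix since its very first versions.

```c
/* Minimal illustration of Unix multitasking: fork() duplicates the
 * running program into two processes that the system then schedules
 * concurrently. A sketch assuming any POSIX system (Linux, a BSD, macOS). */
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    pid_t pid = fork();                 /* split into parent and child */
    if (pid == -1) {
        perror("fork");
        return 1;
    }
    if (pid == 0) {
        printf("child : pid=%d\n", getpid());   /* runs in the new process */
    } else {
        printf("parent: pid=%d, child=%d\n", getpid(), pid);
        wait(NULL);                     /* wait for the child to exit */
    }
    return 0;
}
```

Both printf calls execute, each in its own process: two programs running at once on one machine, which is exactly the facility Unix made routine.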

At the time, however, anti-monopoly laws prohibited Bell from exploiting Unix commercially. As a result, the system was distributed in the form of source code for a token amount. This facilitated its proliferation, but also encouraged its users to propose evolutions and improvements. In this way, Unix built up a very solid user base, while building on its initial qualities and evolving along with the computer industry. Very quickly, this system became a reference.

In the 1980s, a change in the status of Bell allowed Unix to be distributed on a more commercial basis. At the same time, advances in microprocessors made it possible to produce high-performance microcomputers. This led to the emergence of desktop computers designed for professional use: workstations, which generally ran a Unix operating system or a derivative. These microcomputers were too expensive for private individuals, but were a real success in industry and universities. Unix therefore became the de facto standard in these areas.

From the outset, Unix offered a high degree of portability for the programs running on it. Whatever the hardware architecture on which the system runs, it offers a unified programming interface, so that a simple recompilation is enough to run a given program on a new architecture.

However, in the 1980s, computer manufacturers with Unix-derived systems were introducing their own developments, undermining the Unix portability model. To counter this, the main suppliers of Unix-derived systems worked together to establish an operating system standard, the Portable Operating System Interface, known as “POSIX” (the “x” being an allusion to Unix). Any program based on this standard could be easily compiled on any system that implemented it. So Unix was ready to dominate the world of computing!
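As a minimal sketch of what this portability model buys, the following C program uses only interfaces specified by POSIX, so the same source compiles unchanged on any POSIX system, whatever the underlying processor.

```c
/* The POSIX portability model in miniature: only standardized calls
 * are used, so this source recompiles as-is on any POSIX system. */
#include <stdio.h>
#include <sys/utsname.h>

int main(void) {
    struct utsname un;
    if (uname(&un) == -1) {     /* POSIX call describing the host system */
        perror("uname");
        return 1;
    }
    /* un.machine reports the hardware architecture the binary runs on */
    printf("%s %s on %s\n", un.sysname, un.release, un.machine);
    return 0;
}
```

Recompiled on a PDP-11 successor, a RISC workstation or a modern x86 machine, the program behaves identically; only the architecture it reports changes. This is the opposite trade-off to the binary compatibility model discussed later in this article.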

The War of Unices

Excerpt from Steven Spielberg’s Jurassic Park (“It’s a Unix system, I know this!”) – © 1993 Universal Pictures.

As mentioned above, at the end of the 1980s, it was quite clear that the industry needed a standard for its IT applications. The POSIX standard was based on an ecosystem that had emerged almost twenty years earlier and had demonstrated a formidable ability to be updated and adapted to change: POSIX was clearly the ideal candidate. And the market promised to be a huge one: the bulk of the industrial and academic markets, plus that of large corporations.

Naturally, all the manufacturers wanted to take advantage of the windfall, so they tried to attract users with their own specific functionalities. While the basic programming interface for accessing the resources of the system was the subject of a standard, this was not the case for graphical interfaces. The Unix world had in fact developed graphical interface systems in the early 1980s, under the influence of the Xerox Star and before the introduction of the Macintosh by Apple. By the end of the 1980s, however, no standard had become established. Each manufacturer therefore tried to offer its own system, in the hope of increasing its market share: the result was exactly the problem that the POSIX standard was intended to solve, namely a weakening of the Unix portability model.

After a number of twists and turns that I will not go into in detail here – they would take us too far from our subject, namely how Microsoft arrived at the situation where it could be said to have signed the contract of the century – the formation of The Open Group, bringing together the major players in the POSIX world (including IBM), enabled the sector to agree on a standard for graphical interfaces as well. At last there was a standard covering most of what a program needs in order to use the hardware resources of a machine. However, this battle of the Unices damaged the reputation for interoperability of POSIX systems, delaying the hegemony to which this ecosystem seemed destined.

One manufacturer nevertheless managed to establish a strong foothold: Sun Microsystems offered systems that were widely used in Internet infrastructure and for managing corporate networks. The growth of the Internet meant that the market of Sun Microsystems was constantly expanding. Ultimately, it appeared to be only a matter of time before POSIX systems covered the bulk of installed systems.

The bursting of the dot-com bubble

Logo of Sun Microsystems from the 1990s until the acquisition by Oracle – © Sun Microsystems via Wikimedia Commons.

In the early 2000s, however, what had been a speculative bubble burst: many small firms that had taken advantage of the liberalization of the Internet to launch themselves had to cease trading, having never found a profitable business model. This hit the POSIX system vendors at the heart of the digital economy. Sun Microsystems, in particular, was itself a very solid company, but many of its customers were small firms that could not withstand the end of the bubble. The sudden loss of so many customers left Sun Microsystems very vulnerable: the major player of the POSIX world in the 1990s was forced to scale back considerably during the 2000s.

What does this have to do with Microsoft and its famous contract of the century? At this stage, not much. In fact, Microsoft had been dragging its feet a little over the Internet, even though it used anticompetitive practices at the turn of the millennium to impose its Internet solutions over the alternatives on offer, which were often more advanced from a functional point of view.

At this stage, the PC platform was still a 32-bit platform, while the most demanding industrial and computing applications were running on 64-bit POSIX workstations: in the early 2000s, the separation between personal computing and computing for demanding applications was still clear.

AMD revolutionizes Intel

Logo of AMD – © AMD via Wikimedia Commons.

While it was clear that the MS-DOS operating system could not be kept indefinitely, it also seemed clear that the x86 platform from Intel could not endure. The architecture had successfully made the transition from 16-bit to 32-bit and had managed to renew itself, but it was still an architecture originally designed in the late 1970s, which had had to make do with the technical constraints of the time. In particular, between the end of the 1980s and the 1990s, POSIX workstations had switched to a new form of processor: RISC processors [13]. These architectures offered significantly better performance than their predecessors. Analysts were quick to point out that the logical move for computers was to migrate to RISC architectures.

Intel itself had produced RISC processors, but these were not very successful, at least not to the same extent as the x86 architecture, which was driven by the success of IBM PC compatible computers. IBM, on the other hand, in partnership with Motorola, offered the RISC architecture named POWER, whose qualities, especially those of its desktop version, the PowerPC, were quickly recognized. Microsoft did offer a version of Windows NT for PowerPC, but it never managed to gain a foothold: the main argument in favour of Microsoft Windows is binary compatibility – in other words, unlike the Unix portability model, there is no need to recompile programs to make them run on a new machine. While this immediately simplifies program distribution for software publishers, it also blocks any change of architecture. In any case, on platforms not based on the x86 architecture, Microsoft Windows has never managed to gain a significant market share.

In the mid-1990s, despite the success of Microsoft Windows 95, it seemed fairly clear that a change of architecture was imminent, and Unix seemed much better placed to accompany it. However, with the introduction of the Pentium Pro, Intel succeeded in applying the optimizations specific to RISC processors to the x86 architecture. Pentium processors from Intel thus became competitive with PowerPC processors from IBM and Motorola: for most practical applications, there was no decisive difference between the two architectures. Consequently, the Microsoft Windows compatibility model encouraged people to keep the same architecture: for the 32-bit generation of processors, at least, the future of x86 processors was assured, and Microsoft Windows was able to establish a solid user base.

However, from the second half of the 1990s, computers for industrial use switched over to 64-bit processors. Once again, these were not x86 architectures, and the systems running on these computers were POSIX systems. The demarcation remained clear and, despite the ability Intel had shown in evolving its architecture, it was clear even to Intel that it needed a new architecture. Intel therefore launched a project for a particularly ambitious 64-bit architecture, leading to the Itanium series of processors, which unfortunately never lived up to its promise.

Still, Intel remained quite strong, thanks on the one hand to the installed base of IBM PC-compatible computers, and on the other to the x86 processors it manufactured for dedicated server use. The latter allowed small organizations to set up their own servers at low cost, using either a server version of Microsoft Windows or a free POSIX-compatible operating system such as GNU/Linux.

In contrast, the POWER architecture had had a 64-bit version since the mid-1990s, so PowerPC processors evolved easily towards 64 bits. Intel, for its part, was reluctant to produce a 64-bit version of the x86 architecture, probably mainly to avoid competing with its own Itanium brand.

However, Intel was not the only manufacturer of x86 processors. As early as 1976, Intel had signed a cross-licensing agreement with AMD [14]. This allowed AMD to produce x86 processors if, for example, demand exceeded Intel’s production capacity. Intel supplied AMD with the masks needed to produce 8086/8088 processors, and AMD could then produce processors based on this architecture: the processors were identical, in that the design was carried out entirely by Intel, regardless of who manufactured them.

However, in 1985, in a series of events a little too long to summarize here, Intel established itself as the sole producer of the 80386 and subsequent processors. To survive, AMD had to invest in research and development on the x86 architecture. Over the next ten years, AMD failed to convince. In 1996, it finally bought NexGen, a company specializing in computer chip design. Building on the expertise of NexGen, in 2003 AMD finally came up with an evolution of the x86 architecture that would set it apart, in particular by switching to 64 bits. Not only was this architecture compatible with the processors used in IBM PC-compatible computers, it was also competitive with what was available on workstations at the time.

Open-source operating systems such as GNU/Linux already supported other 64-bit architectures, and their development was highly responsive, so they quickly took advantage of the new architecture. The first consequence was to make this architecture highly competitive on the workstation and server markets, which in turn drove costs down. Microsoft, for its part, published versions of Microsoft Windows adapted to this architecture about a year after it became available. Microsoft thus offered a system that was globally competitive with that of workstations, while offering binary compatibility with programs developed for earlier versions of the system and for previous architectures. What’s more, its server offering was also becoming competitive across this market as a whole.

At first, Intel did not want to follow this development. However, as the market evolved, it had to react and soon offered a 64-bit version of the x86 architecture, confirming the failure of the Itanium architecture.

So it was only at this stage that all the elements were in place for Microsoft to become the major player it is today across a large part of IT. It was only then that one could truly say that the contract it had signed with IBM had turned out to be “the contract of the century”.

A temporary conclusion

In 1982, Ridley Scott’s film Blade Runner used the Atari logo to represent the future, as the company seemed to have a bright one ahead of it; yet it went bankrupt the following year – © 1982 Warner Bros.

The legend of Microsoft, and more specifically that of Bill Gates, often recounts how he managed to outwit IBM during the design of the IBM PC 5150. However, at the time, Microsoft was a very small company that was in no position to impose anything on IBM. Indeed, it was precisely because Microsoft could only accept the conditions demanded by IBM, without negotiation, that it obtained the contract rather than Digital Research.

The reasons this originally one-sided partnership became the contract of the century for Microsoft are circumstance, luck, the opportunism of Microsoft’s managers, and anticompetitive practices. A less romantic story, certainly less flattering for Microsoft’s managers, but far more credible.

We must also be wary of our tendency towards teleology, that is, our tendency to see the ending already present in the beginnings of events. We tend to imagine that Microsoft carried out an implacable strategy that was bound to make it the undisputed giant of the computer industry. In truth, many things could have turned out quite differently, and events would have seemed just as inevitable.

From the 1950s until the early 1980s, IBM was the hegemonic giant that dominated the IT world, and it did not look as though this state of affairs was going to change any time soon. During the 1990s, many players in the Unix world feared that Sun would become an ogre that would devour the entire market. While these analyses were entirely relevant, they nonetheless proved to be wrong. Today, Microsoft is a giant, but it was not destined to become one, and the current situation cannot be considered immutable.

Just a few years ago, the situation seemed to be set in stone. ARM architecture dominated the embedded world, using Android, QNX or dedicated Linux-based systems as operating systems. Desktops, including Apple Macintoshes, all used x86 architecture, with Microsoft Windows overwhelmingly dominant in the field, even though macOS had a smaller but solid user base and a tiny proportion of workstations ran GNU/Linux (this observation comes from an exclusive GNU/Linux user). In the server sector, once again the x86 architecture was hegemonic, while the majority operating system was GNU/Linux, even though Microsoft Windows remained a major player. When it came to high-demand applications, such as scientific computing, x86 architecture was once again hegemonic, as was the GNU/Linux system.

Today, however, there are strong indications that we could see a large-scale change of architecture: several factors are contributing to this. On the one hand, the x86 architecture and the Microsoft Windows system form two black boxes, both designed in the United States, which poses major sovereignty problems. The RISC-V open architecture offers a more collaborative solution, while open source software is already a functionally equivalent offering to the Microsoft Windows ecosystem, even if it involves major changes in user habits. Furthermore, the binary compatibility model of Microsoft Windows has certainly facilitated the distribution of proprietary software, but is a major obstacle to innovation. Apple very recently introduced its own processors for desktop machines, based on ARM architecture, one of the most widely publicized signs that a large-scale change of architecture is imminent. For its part, the European Union has launched projects aimed at producing high-performance computing machines based on RISC-V architecture.

It would be wrong to think that the position of Microsoft is unshakeable: the cases of IBM and Sun, mentioned in this article, as well as many other examples in different industries, show that things can change radically in a short space of time. Moreover, it has to be said that all efforts by Microsoft to establish itself on architectures other than x86 have failed. On the other hand, before burying Microsoft and Intel, it is worth remembering that there have already been several periods when it was clear that this ecosystem was out of date, and yet it has managed not only to survive, but even to flourish.

The current period is an interesting one to observe, because what seemed to be a static situation for a long time may now be in the process of complete change. Or things could go on as before. We should not need to wait too long to find out.

But, in any case, it is clear that at the very beginning of the 1980s Microsoft was not in a position to impose anything on IBM. When we know the end of a story, we tend to see that end in its beginnings, but the contract of the century never existed as such: it was the evolution of the situation that produced it.

Acknowledgements

Logo of the MO5.com association.

I would like to thank the members of the MO5.com association for reviewing this article, and in particular Christophe Menebœuf, who made a number of corrections and clarifications.

Notes

[1] Cringely, Robert X. (1996). Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date, updated edition, Harper-Collins. Available on the author’s website.
[2] Cringely, Robert X. (1996). Triumph of the Nerds, Channel 4. Available on-line in two parts. There’s also a companion website to this documentary, with a whiff of the 1990s!
[3] Cheifet, Stewart (1995). “Gary Kildall Special”, Computer Chronicles, PBS. Available on-line.
[4] Maher, Jimmy (2012). The IBM PC, 4 parts, The Digital Antiquarian. Available on-line.
[5] A compiler is a program that transforms source code into a machine-executable program, making it the central tool for creating software.
[6] Executives from Microsoft have stated, for example in Triumph of the Nerds, that they referred IBM to Digital Research, but that Gary Kildall was more interested in flying his plane than in meeting the negotiators from IBM. However, this version is denied by Digital Research employees, for example in the special episode of Computer Chronicles devoted to Gary Kildall, as well as by the work of journalist Andrew Orlowski, both in “MS-DOS paternity suit settled” (July 30th, 2007, The Register, available on-line) and in “Bill Gates, Harry Evans and the smearing of a computer legend” (August 7th, 2012, The Register, available on-line). Furthermore, it is hardly credible that the IBM PC 5150 development team would have heard of Microsoft, at the time a very small organization, but not of Digital Research, which was then a major player in the world of microcomputing. The version presented here is therefore the most likely.
[7] See the copy of the contract provided as evidence in one of the lawsuits against Microsoft, available on-line.
[8] Kildall, Gary (1994). Computer Connections, self-published. Available on-line.
[9] See for instance: Burton, Kathleen (1983). “Anatomy of a colossus, part III”, PC Mag, vol. 1, n° 10, pp. 467–478. Available on-line.
[10] The pioneer of graphical user interfaces is Douglas Carl Engelbart, who worked on this subject from the late 1950s at the Stanford Research Institute.
[11] A Wikipedia page provides details of these lawsuits.
[12] A Wikipedia article summarizes Microsoft’s controversial practices and provides numerous references to back up the facts.
[13] Given the small word size of 8-bit and earlier processors, in order to increase the number of instructions, processor designers created architectures whose instructions could be encoded over several words. The larger word size of 32-bit and 64-bit processors makes it possible to have a varied instruction set with instructions of constant size: this is the principle of RISC architectures. The constant size of instructions allows many optimizations to be applied to their execution.
[14] Intel has confirmed this.

Published by

Yoann Le Bars

A researcher and teacher with slightly too many interests to sum this up …
