Every so often, I look around our office here at Software Advice and marvel at the fact that there’s no paper on our desks. (Needless to say, it’s a far cry from the strips of dot matrix paper that once lined my family’s home office.) What’s even more amazing is to think about where computing technology has been over my nearly four decades of living: from practicing math on a TRS-80 and shattering the backboard in the One on One: Dr. J vs. Larry Bird basketball game on my Commodore 64 to sitting in this paperless, Mac-based office using entirely cloud-based apps.
But what seems like a relatively transparent interaction for us laypeople has been a long, and often difficult, road for the engineers, programmers and developers whose entrepreneurial foresight and hunger for innovation make life simpler today. From punched cards to the vacuum tubes and transistors of the mainframe, early innovations in computing hardware created successive platforms for growth (which you can read more about in the first article of this series). Along the way, of course, there were also curious programmers hellbent on mastering programming languages and operating systems, among the likes of BASIC, FORTRAN and Unix, to connect the computer’s core applications.
By the end of the 1970s, the industry was embarking on an entirely new frontier: the birth of the personal computer and, along with it, broader adoption of enterprise software applications. Before we get into the thick of the story, however, there is an important, albeit brief, stepping stone that would lead the way.
Integrated Circuit Spurs Minicomputer Development
The minicomputers that emerged in the late 1960s and early 1970s served as a bridge between the room-sized mainframes of years earlier and the personal computers that would soon revolutionize homes and businesses alike. As integrated circuits (ICs) and core memory technologies matured, computers began occupying a much smaller footprint.
For mid-range organizations, minicomputers (such as those produced by Hewlett-Packard, Digital Equipment Corp. and Data General) filled a much-needed gap where mainframes left off. Compared to the “big iron,” minicomputers were relatively cheap, small and fast to build. And because they were designed with specialized hardware and software to handle sizable tasks, minicomputers were well-suited for jobs like running manufacturing processes and company email. In fact, by the 1970s, minicomputers would help launch the computer-aided design (CAD) industry.
The heart of the minicomputer was the IC, and these silicon wonders could pack a number of transistors onto a single chip. While development of the IC had been ongoing since the late 1940s, Texas Instruments’ Jack Kilby was the first to get a chip working successfully, in 1958. “TI was the only company that agreed to let me work on electronic component miniaturization more or less full time, and it turned out to be a great fit,” wrote Kilby in his autobiography upon receiving the 2000 Nobel Prize in Physics.
Ongoing development of the IC in the late 1960s spurred the rise of the semiconductor industry, including the emergence of companies like Intel. Intel was the first to make a commercially available microprocessor, the Intel 4004, in 1971. The microprocessor was, in simplistic terms, an extension of the IC, packing more functions onto one chip. The difference, however, was that a microprocessor packed in so many functions that a simple programming language was needed to make it all work.
Around this time, in the early 1970s, developers were further redefining what was possible with software, particularly the BASIC language and the Unix operating system. Aggressive young programmers such as Bill Gates and Paul Allen, along with experienced computer science professor Gary Kildall, were making significant strides in laying the foundation for modern programming. And it would be these innovations that would soon spur widespread microprocessor application in the personal computer (PC), not to mention a whole host of software applications that would emerge alongside it.
Computing Still Limited to the Enterprise
Clearly, discussion around bringing smaller, faster computers to the enterprise market had been circulating for years. Yet, as authors Paul Freiberger and Michael Swaine point out in their book, Fire in the Valley, many of the traditional mainframe vendors weren’t pushing to develop PCs because they wrongly assumed minicomputers would fill the gap within the enterprise.
“Without exception, existing computer companies passed up the chance to bring computers into the home and onto the desk. The next generation of computers, the microcomputer, was created entirely by individual entrepreneurs working outside established corporations.” -Paul Freiberger and Michael Swaine, Fire in the Valley
Although the industry was rapidly evolving, computer hardware and software were still quite costly to implement and maintain. And given that the microcomputer, or PC, was still in development, software applications were limited to the large enterprise. This was fine by companies like Oracle, J.D. Edwards, SAS and Baan, which were emerging on the scene in the mid-to-late 1970s to deliver powerful database, enterprise resource planning (ERP) and accounting software solutions.
German software vendor SAP, founded by five former IBMers in 1972, was also gaining ground in the European ERP market (although it would be another decade before the company emerged in the States). SAS was founded in 1976 to deliver statistical analysis software for government and higher education, as well as the pharmaceutical, insurance and banking industries.
“When you talk about enterprise computing, pretty much until the mid-1970s, all the software was enterprise software,” explains Christopher Baum, an industry consultant with more than 30 years of experience. “Enterprises were the only ones that could afford computers, and any enterprise that had a computer wanted to use it all the time to help defray the cost.”
Yet the computing world was changing faster than ever, and software development was inextricably linked to the race to launch the personal computer on a large scale. While Silicon Valley had no doubt been a hotbed of development, the duo of Bill Gates and Paul Allen was working frantically in Albuquerque, and their innovation would be a major turning point in the game.
Altair Has Huge Influence on PC Industry
Drawing from their many years spent tinkering with programs, childhood friends Gates and Allen developed BASIC software for the MITS Altair 8800 microcomputer, introduced in 1975. According to Freiberger and Swaine in Fire in the Valley, the work that went into developing the Altair computer was some of the most significant in history. That’s because the Altair not only “introduced the first affordable computer,” they write, “but it also pioneered computer shows, computer retailing, computer company magazines, users’ groups, software exchanges, and many hardware and software products.”
But the success of the Altair 8800 also did something else: it led Gates and Allen to launch their own company, Microsoft. And while MITS was celebrating the Altair 8800 and Gates and Allen were hatching their business, innovators in other reaches of the country were just as busy.
Around this time, video games were also making their way into the market, with Pong emerging in 1972. It was this game, in fact, that caught the eye of former engineering student and computer hobbyist Steve Wozniak. After playing the game at a bowling alley arcade, Wozniak and his buddy Steve Jobs (who was working at Atari) began developing their own version of the game, called Breakout. But video games weren’t all the two friends were interested in: both shared a love of computers.
Wozniak and Jobs began dabbling with a $20 MOS Technology 6502 microprocessor, which morphed into their own computer design. Not long after the Altair 8800 hit the market, another little company named Apple was born. In 1976, the two Steves introduced the Apple I to their hobby computer club, with the sequel, the Apple II, not far behind.
Personal Computing Puts Software in Everyone’s Hands
Led by Wozniak and Jobs, Apple raised the bar on personal computing with the introduction of the Apple II in 1977. The Apple II quickly put the start-up on the map as the seller of one of the first highly successful mass-produced PCs. It featured an integrated keyboard, a built-in BASIC interpreter, expandable memory and support for color graphics.
Not to be outdone, however, Commodore’s PET computer and Radio Shack’s Tandy TRS-80 were both introduced that same year. Selling 10,000 units in its first month and a half alone, the Tandy TRS-80 soon led what BYTE Magazine called the “1977 Trinity.” But what really put these three computers on the map wasn’t just their novelty or accessible price point; it was that their software now gave both enterprise and home users the opportunity to crunch numbers, process words and play games. (Oh, and the fact that these computers were aggressively marketed and had national distribution footprints didn’t hurt either.)
According to Baum, the veteran industry consultant, software wasn’t highly valued in the industry before this point, and what was being developed was often given away by those who favored an open exchange of code. Other software developers racked up enormous production costs and made very little money, simply because there still weren’t enough users buying the software to recoup those costs.
“The Apple II, TRS-80 and Commodore PET were sold nationally and had a large enough base of users that you could actually write a piece of software for them and have enough people buy it,” Baum says. “So they had an installed base, and as it turns out, that’s pretty important for software development.”
Three companies began to change the software game early on: Ashton-Tate, Software Arts (both now gone) and Microsoft. Ashton-Tate’s dBase II became the standard for databases on small systems for more than a decade, while Software Arts’ VisiCalc was the first spreadsheet program. And then, of course, there’s Microsoft. Its BASIC interpreter for PCs was its flagship product, though DOS would soon eclipse it as the company’s main business.
PCs Revolutionize the Enterprise
While Tandy, Commodore and Apple were gaining ground among consumers, IBM and other big iron mainframe vendors, including Burroughs and Honeywell, were looking to move into the PC market as well. IBM was the forerunner with the introduction of the IBM Personal Computer Model 5150 in 1981, which opened an entirely new set of doors for Big Blue. IBM used its many years of computing successes (such as the IBM 1401 and S/360 in the 1960s) to move its PC deep into the enterprise market. In a recent article reflecting on the 30-year anniversary of the IBM PC’s introduction, BYTE Editor Gina Smith writes:
“Now, IBM never envisioned its explosive success, nor the resulting aftershock of PC-compatible hardware or software that followed. But it was the PC’s open architecture and use of third-party hardware and software that enabled an industry of PC hardware and software makers to grow up around it.”
By the early 1980s, software developers had indeed taken note, and the race was on to unveil ubiquitous solutions that would translate across platforms. A 24-year-old programmer named Tim Paterson, who worked at Seattle Computer Products, created an operating system he originally called the Quick and Dirty Operating System, or QDOS, which was made commercially available as 86-DOS. But soon Microsoft snatched up DOS. After purchasing it, Microsoft made some tweaks and released its branded version, MS-DOS, as part of IBM’s new PC.
Although MS-DOS became IBM’s gold standard for operating systems, the relationship between the industry giant and the up-and-coming Microsoft had always been shrouded, shall we say, in a bit of skepticism. At the very first meeting, IBM asked Gates and colleague Steve Ballmer to sign confidentiality agreements before the conversation even began. (That way, if Gates and Ballmer happened to say something that made its way into IBM’s forthcoming PC, Big Blue was covered.)
By the second meeting, both parties came to the table with an army of representatives and corporate lawyers and another round of confidentiality agreements. While negotiating deals and securing intellectual property was nothing new for either party, the Microsoft-IBM partnership was a big opportunity and both companies knew it. According to Freiberger and Swaine’s historical account in Fire in the Valley, the rounds of meetings with Microsoft were one of the most “unusual things” IBM had ever done.
Eventually, Microsoft signed onto the PC project, which demanded grueling schedules and tight security standards. But the relationship eventually paid off in a big way for IBM and Microsoft – helping both companies make the enormous leap into becoming the enterprise-wide standard for computing.
Around this time, other industry players were starting to make names for themselves in the enterprise as well, including Oracle and its founding trio of Larry Ellison, Bob Miner and Ed Oates. After success with its government contracts, Oracle released Version 3 in 1983; written in the C language, it ran on mainframes, minicomputers and PCs. This enabled identical databases to be built and analyzed on any type of machine, marking a major step toward today’s open-architecture IT systems. And this, of course, opened a whole new door for Oracle, as well as for other vendors such as SAS, Baan and JD Edwards.
Approximately 200 miles north of Oracle’s headquarters, another major enterprise player, German software vendor SAP, was making inroads into the U.S. market with an educational partnership at Chico State University. Working with graduate students in the business school, SAP made improvements to R/2, the real-time version of its ERP solution. But it would be SAP’s R/3 version, launched in 1992, that would help set the stage for a major architectural shift in the enterprise.
Of course all the while, the development machine at Microsoft was also embarking on what would be a lengthy undertaking: creating a more user-friendly interface for IBM computers. But before we dive headfirst into the world of Windows and the web, let’s pause and catch our breath here for a minute.
Because if you thought changes in business and personal computing were happening at warp speed with the proliferation of the PC, well, the hit 1974 Bachman-Turner Overdrive song sums it up best: “You Ain’t Seen Nothing Yet.”
Check out the next article in our series: Enterprise Software History, Part 3: Client-Server to the Web.