Enterprise Software History, Part 1: Origins of Modern Computing

Note: This is the first of a four-part series exploring the history of enterprise software and the evolution of computing technologies. You can read the other articles here:  Part 2: Minicomputers to the PC; Part 3: Windows to the Web; and Part 4: Dotcom to Today.

It’s a bit sad to me that the computing conveniences we enjoy every day rarely get a second thought – unless, of course, we can’t seem to find a WiFi connection or our email has a hiccup and we lose a message. And I’m no different. But diving into this four-part blog series on the history of enterprise software opened my eyes to an incredibly intricate (and dare I say it, fascinating) story of technological evolution.

It’s a story with an endless cast of personalities, mind-boggling inventions and the giants we know as IBM, Hewlett-Packard, Oracle, Microsoft and Apple. From the early days of finding common programming languages to the series of booms, busts and consolidations along the way – the maturation of enterprise software has hands down revolutionized the way we do business.

But let me not get too far ahead of myself. Every story needs context, and to adequately tell this particular tale, we must venture back long before we emailed, swiped, texted, chatted, moused over or scanned anything. To fully understand and appreciate how marvelous it is that I’m typing on this keyboard using Google Apps or you’re reading this from your laptop or iPad, we must journey a little further back in time.

History of Punched Cards

Punched Card Sets the Stage for Computation

Although punched cards date back to the 1700s, it wasn’t until the turn of the 20th century that they emerged in full force. Herman Hollerith’s work on punched card data processing was used in the 1890 census. Several years later he founded the Tabulating Machine Company, which was sold to Charles Flint in 1911. After a name change to the Computing-Tabulating-Recording Company, Flint’s company would morph yet again in 1924 into a name we all still know today – International Business Machines (IBM).

IBM soon dominated the punched card market, cornering almost 90 percent of market share by the early 1950s. But along the way other players emerged to help lay new ground for collecting, sorting and storing data. In 1936, mathematician Alan Turing described the principles of the universal Turing machine, a theoretical device for determining whether a task is computable. Turing’s principles would later give rise to the earliest concepts of a computer operating system.

Meanwhile, two other household names – Bill Hewlett and Dave Packard – were tinkering away in their garage to develop their first product, the Model 200A audio oscillator; Walt Disney Studios became one of their earliest customers, buying oscillators for its work on Fantasia. After a coin toss in 1939, the friends settled on the name of their company, Hewlett-Packard. Hewlett is on record as saying that he and Packard didn’t start with any sort of plan to launch a business; rather, they were driven by curiosity – and of course, “there were lots of little steps” along the way to growing the global icon of today.

While those early years of punched cards set the stage for the soon-to-emerge mainframe, perhaps one of the most significant events in early computing took place in 1942, when Iowa State College professor John Vincent Atanasoff and graduate student Clifford Berry completed their electronic digital computer. Decades later, a patent lawsuit over who invented the computer ended with the ruling that the electronic digital computer, as a concept, was unpatentable. So, it was freely open to all to invent and modify.

That would be good news for researchers like John von Neumann, who, a few years later, outlined the architecture of a stored-program computer while consulting for the University of Pennsylvania on the EDVAC project. Soon von Neumann’s design ideas about a single, unified store were put to the test as mainframe computers began filling rooms.

The Mainframe Ushers In the Computing Era

By the 1950s, the first mainframes entered the scene, with the Ferranti Mark 1 – the commercial version of the Manchester Mark 1 – cited as the first commercially delivered computer when it arrived at the University of Manchester in 1951. That same year, the first “mass produced” computer – Remington Rand’s UNIVAC I – arrived at the U.S. Census Bureau. Not to be outdone, IBM released its first fully electronic data processing system, the IBM 701, the following year.

Early mainframes (commonly referred to as “big iron”) had no operating system and ran only a single, preloaded program at a time. Data was stored on punched cards and on magnetic or paper tape. Reliability was always a bit questionable, as the machines simply ran until the program completed or crashed. And because the machines were so expensive, access to them was extremely limited.

As the machines became more robust, run times shrank and functionality increased, enabling the computers to process larger volumes of enterprise data. With demand for computer time growing, big players like IBM, UNIVAC, Burroughs and Control Data Corp. established service bureaus where they leased time on their machines. According to a Computer Economics article by Frank Scavo, these service bureaus helped “expand the market and make computers available to smaller companies.”

History of Mainframe Computers

Needless to say, competition among mainframe developers in this emerging industry was fierce. Ongoing research and development in both industry and higher education spurred a string of firsts in the mid-to-late 1950s. For example, researchers at the Massachusetts Institute of Technology completed the TX-0, one of the first general-purpose computers built with transistors, in 1956. MIT researchers were also among the first to experiment with direct keyboard input. Another first came when a batch-processing operating system was rolled out for the IBM 704.

However, the biggest win for IBM came in 1959 when the company introduced the 1401 Data Processing System – one of the first computers to run business applications entirely on transistors rather than vacuum tubes. IBM’s 1620 Data Processing System, aimed at science and engineering applications, also made its debut during this time. With the move to transistors, the IBM 1401 was significantly smaller and more reliable as it processed data and performed accounting functions.

Given its smaller footprint and capacity to handle larger volumes of business data, the IBM 1401 was quickly embraced. In fact, IBM notched more than 5,200 orders for the 1401 just five weeks after its introduction – more than the company had expected to sell over the machine’s entire life cycle.

With the growing demand for data processing, the need for better safeguards also increased. Vendors came under pressure to beef up run-time libraries and automated monitoring, and audit trails of program files were added. Given the advances of the 1950s, the stage was set to usher in supercomputing.

Supercomputers Deliver More Power

Supercomputer development tracked what would come to be known as Moore’s law – the observation (the term was coined around 1970) that the number of transistors that can inexpensively be placed on a single integrated circuit doubles roughly every two years. Whereas the early mainframes offered only nominal processing capability, the supercomputers introduced in the 1960s could perform up to 3 million instructions per second.
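
To put that doubling in concrete terms – using a purely illustrative starting figure – a chip holding 1,000 transistors would, by Moore’s rule of thumb, hold roughly 2,000 transistors two years later, 4,000 after four years and about 32,000 after a decade.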

Control Data Corp. debuted its Seymour Cray-designed 6600 supercomputer in 1964, quickly cornering the market as the fastest computer in the world. The key to Cray’s design – and what gave the 6600 its speed – was the 10 smaller computers (or peripheral processors) that funneled data to a larger central processing unit (CPU). Unlike most CPUs of the time, Cray’s used a simplified instruction set that kept the processor lean and fast, an approach that would later become known as reduced instruction set computing (RISC).

Early supercomputers were used primarily for repetitive calculation tasks, such as forecasting, modeling or simulations of physical systems. Given their enormous processing capacity – and multimillion dollar price tag – supercomputers were most often used in scientific research and government programs, such as NASA and national security operations.

OS & Languages Support a New Breed of Computing

The earliest mainframes lacked any type of operating system (OS) – other than the human operator who manually set up the machine to run each program. By the early 1960s, however, advances in features and programs were paving the way for a true OS environment. For example, run-time libraries were becoming more sophisticated, and the development of background programs, or monitors, helped manage multi-step jobs.

These early OSs were unique to each manufacturer, and even then each new machine had its own operating procedures, commands and applications. By the late 1950s, IBM’s existing machines were running out of capacity, and the charge was set forth: develop a compatible family of computers that would replace IBM’s five aging product lines. Amid much internal turmoil, IBM put everything at risk to roll out its S/360 series of machines in 1964, which featured a common operating system that could be configured according to the application.

To convey the scale of the IBM S/360 project, author Jim Collins highlighted the S/360 as one of the top business feats of all time in his bestselling book Good to Great. In an article, Collins wrote that the project was “the largest privately financed commercial project undertaken to that point in history, requiring more resources than the Manhattan Project to develop the atomic bomb in World War II.”

History of Programming Languages

The hallmark of the S/360 computers was compatibility, something no one had yet mastered. The S/360 would also be the catalyst for the company’s ongoing line of systems that were the precursors to the IBM z/OS and z/VSE systems still in use today. While IBM was at work on the S/360, however, other manufacturers were equally focused on bringing new innovations to market.

By this time, a host of programming languages had been developed, including FORTRAN, COBOL and CPL. In 1964, Dartmouth College professors John George Kemeny and Thomas E. Kurtz invented the BASIC programming language. Based in part on FORTRAN II and ALGOL 60, BASIC was designed for a time-sharing environment, and its simplicity made computers easier to use for more students. Like many early innovators, Kemeny and Kurtz were more interested in getting their creation into the hands of those who could benefit from it than in making money – so BASIC was available for free.

Integration Spurs New Developments

As advances in operating systems and languages emerged, manufacturers began building these technologies into their computers. By 1964, computing was heading into its third generation, moving away from vacuum tubes and transistors toward integrated circuits. With new speed and efficiency, computers were packing more power and operating capacity into a smaller footprint. For example, in 1966, Hewlett-Packard introduced its first general-purpose computer, the HP 2115, which offered the kind of power in a small machine that was once found only in much larger ones. HP’s computer supported a wide variety of languages, including BASIC, ALGOL and FORTRAN.

Heading into the latter half of the decade, another innovation would yet again change the face of computing. In 1969, AT&T Bell Labs programmers Kenneth Thompson and Dennis Ritchie developed the UNIX operating system, whose design combined the file management and time-sharing features of the Multics project – the first big push, in the mid-1960s, to create a multi-user, multi-tasking operating system.

From the vacuum tubes of the 1940s and ’50s to the transistors and integrated circuits of the 1960s, the computing industry was maturing – laying the foundation for what was to come. Until then, computing applications had been rooted in business computation and functionality. But with the emergence of more sophisticated operating systems and programming languages – along with more robust processing and storage capacity – a whole new breed of personal computing would soon revolutionize the business landscape.

Check out the next installment of Enterprise Software History, Part 2: Minicomputers to the PC.

 