Researching and writing this four-part series on the history of enterprise software (and here are the first and second articles in case you missed them) has taken me on a stroll down memory lane – even at the expense of becoming the butt of office jokes. That’s because the journalist in me couldn’t help but get downright nosy about all of these technological milestones – so what better way to appease that curiosity than to return to my roots and (gasp) go to the public library.
You see, unlike many of my colleagues here at Software Advice, I spent my college years tucked among the stacks rummaging through books and scouring microfiche machines, not surfing from my laptop. Although in my own defense, I was one of the first among my college friends to have an email address – only because I worked in the university’s software training lab (a foreshadowing, perhaps?).
Just as the mainframe established computing and PCs transformed the enterprise, the emergence of a ubiquitous operating system in the 1980s blew open the doors of innovation in software application development. From human resources (HR) and customer relationship management (CRM) to enterprise resource planning (ERP), a whole new crop of solutions was emerging on the scene to create new levels of efficiency, productivity and revenue.
Client-Server Makes Data Sharing Even Easier
Although more businesses were adopting PCs by the 1980s, these machines still couldn’t compete with the power mainframes delivered. While mainframes were expensive, they had a sizable advantage: they could handle large volumes of both users and data, so costs could be spread over multiple groups within an organization. Mainframes also had more robust security protocols, so users could complete tasks on their desktop PCs and then send their local work back to the mainframe to save and store. As you can imagine, this wasn’t the most efficient process, and files were prone to being deleted or saved over.
As data capacity became cheaper, faster and more widely available, PCs were grouped together with servers – with the server acting as a host to store and share information back and forth between the computers (called “clients”) linked to it. On occasion, organizations still used their mainframes as giant servers as well. This client-server environment provided a new level of flexibility within the enterprise, giving PC users a more centralized way to house and access their mission-critical applications and databases. And the lifeline within this new world was connectivity via local area networks (LANs), which connect computers within a specific area, such as an office building, enabling them to access the same software, data and printers.
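For readers who like to see the idea in miniature, here is a sketch of the client-server pattern described above – not any particular 1980s product, just the shape of it: one server process holds the shared data, and a “client” PC asks for it over the network instead of keeping its own fragile local copy. The record name and value are invented for illustration.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 0  # port 0: let the OS pick a free port

# Shared, centrally stored data -- the server is the single source of truth.
shared_records = {"Q1 revenue": "$1.2M"}

def serve_one_request(server_sock):
    """Accept one client connection and answer one lookup request."""
    conn, _ = server_sock.accept()
    with conn:
        key = conn.recv(1024).decode()                        # client asks for a record
        conn.sendall(shared_records.get(key, "?").encode())   # server answers

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind((HOST, PORT))
server.listen()
port = server.getsockname()[1]
threading.Thread(target=serve_one_request, args=(server,), daemon=True).start()

# The "client" PC: no private copy of the data, just a request over the LAN.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((HOST, port))
client.sendall(b"Q1 revenue")
answer = client.recv(1024).decode()
client.close()
print(answer)  # the centrally stored value, fetched on demand
```

The point of the pattern is in that last block: the client never stores the record itself, so there is nothing to delete or save over locally – the problem the paragraph above describes with shuttling files to and from a mainframe.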
By the early-to-mid 1980s, solutions like Novell NetWare emerged to deliver LAN software based on file-server technology. It was also a time of rapid change for established software vendors such as Oracle, SAS, and German-based SAP – which would go on to make inroads into the U.S. market with its client-server R/3 ERP solution in the early 1990s. These firms had to adapt their products to run across new operating environments, while also ensuring the PC hardware could support them. SAS, for example, comprised several million lines of code that had to be rewritten several times – moving away from the IBM-centric PL/I language to C.
According to an InformIT article by IT veteran Rich Schiesser, as database and application servers became more robust, the gap narrowed between mainframe and client-server environments – especially in the area of storage management. “Midrange and client-server platforms were by now both running critical applications and enterprise-wide systems,” wrote Schiesser. “The back up of their data was becoming just as crucial a function as it had been for years on the mainframe side.”
Windows Spurs the Move from Infrastructure to Applications
Whereas computing to date had relied heavily on hardware infrastructure, Microsoft Windows was establishing an entirely new platform to support widespread application development. And adding even more fuel to the fire was the enterprise itself. Demand was growing for packaged business applications, and established vendors and newcomers alike were ready to deliver.
Responding to this demand, in 1987, Oracle created its applications division to develop essential software that could integrate with its database. With an acquisition of TCI and its project management applications, Oracle also introduced an accounting module that same year. In 1990, Oracle released its first packaged client-server application, which featured accounting applications running as client programs networked to an Oracle database.
Established accounting vendor JD Edwards was also migrating to a server-centric environment with its World software – which would later evolve into an integrated ERP solution, OneWorld, by the mid-1990s. By this time, ERP vendor Baan had been steadily gaining ground as well – with a surge of growth coming in the mid-1990s thanks to its Dynamic Enterprise Modeler.
While resource planning was gaining ground in the enterprise, ERP’s roots actually date back 30 years earlier to material requirements planning (MRP). In the 1960s, Joseph Orlicky and Gene Thomas were developing ways to help manufacturers better understand where resources were being allocated – tracking production schedules, purchasing plans and bills of materials. With the ongoing development of computing technology, MRP software emerged in the mid-1970s. And it was from this foundation that ERP solutions emerged for organizations in industries beyond manufacturing.
As Gartner analyst Jim Shepherd wrote in a recent commentary about the future of ERP: “ERP has been successful largely because thousands of companies have been convinced that they can and should do most things the same way. The twin holy grails of ‘process standardization’ and ‘industry best practices’ have allowed the ERP vendors to write one set of applications, and sell them over and over again, which is how you make money in the software business.”
Applications Become the New Cornerstone of the Enterprise
While the move to Windows-based, client-server environments was the trend, another factor also had an influence: application programming interfaces (APIs). Similar to an instruction manual, an API documents the functions and data a software solution makes available, so developers can use that information to build on and extend the software.
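A toy sketch of that “instruction manual” idea: a vendor publishes an API – the names, inputs and outputs it promises to support – and a third-party developer adds value against it without ever touching the vendor’s internals. The CRM class and its methods here are invented for illustration, not any real vendor’s interface.

```python
class VendorCRM:
    """The vendor's product. Its published API is just these two methods."""

    def __init__(self):
        self._contacts = []          # internal storage; not part of the API

    def add_contact(self, name, city):
        self._contacts.append({"name": name, "city": city})

    def list_contacts(self):
        return list(self._contacts)

# A third-party add-on: new value built purely against the published API.
def contacts_in_city(crm, city):
    return [c["name"] for c in crm.list_contacts() if c["city"] == city]

crm = VendorCRM()
crm.add_contact("Ada", "Austin")
crm.add_contact("Grace", "Boston")
print(contacts_in_city(crm, "Austin"))  # ['Ada']
```

The vendor can rewrite its internal storage at will; as long as the published methods keep their contract, the add-on keeps working – which is why, as the quote below notes, the large vendors welcomed this kind of third-party development.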
This type of open environment was nothing new – as it had characterized computing from the beginning – with one developer creating an application and then another adding to it. But according to IT veteran consultant Christopher Baum, APIs were exactly what the industry needed to innovate quickly.
“For software developers this was utopia. Now they had powerful tool kits they could use. If you could go out and write software that worked against someone's Oracle or SAP installation, the large vendors welcomed it because you just added new value to their offerings.” -Christopher Baum, IT Consultant
Because APIs provided third-party developers a starting point, they could bring new applications to market even faster. As a result, new software vendors were cropping up left and right to support the enterprise’s adoption of client-server environments. Among the front-runners was PeopleSoft, which released its first product in 1988: a fully integrated, client-server human resources management system (HRMS). PeopleSoft would later roll out its financial management and distribution modules in the early 1990s.
As the enterprise was now relying heavily on PCs, the volume of electronic customer data was exponentially increasing as well. Introduced in 1987, ACT! contact management software was one of the early pioneers in this budding sector. Customer relationship management (CRM) industry analyst Denis Pombriant recalls ACT!’s early ads that appeared in in-flight magazines, which appealed to sales people on the road.
“ACT! gave you a way to move your leads out of wire-bound notebooks and scraps of paper and put those leads into your PC,” says Pombriant. “It was the late ‘80s, and many organizations didn’t really understand the power of CRM. Most products that we would consider CRM today – like sales, marketing and help desk – were all standalone products.”
Before there were true CRM solutions, Pombriant says, he often had to cobble together applications. In fact, he almost got fired once for using the company’s database to track his leads. But as the standalone software vendors began to consolidate, they acquired new products that they could then sell into the enterprise as integrated application suites. For example, Siebel made its debut in 1993 with sales force automation, and later purchased Scopus, which enabled it to move into CRM and call center support.
Of course, in the midst of the growing enterprise applications market, Microsoft was steadily releasing new innovations as well – including Windows 3.0 in 1990 and 3.1 in 1992. More than 10 million copies of Windows were sold in those first two years. Windows for Workgroups 3.11 was also launched to bring new levels of peer-to-peer workgroup and domain networking support to the client-server environment. However, the platform that would really turn heads in the enterprise was Windows NT 3.1, a 32-bit operating system introduced in 1993. Then, let’s not forget the hype of Windows 95, which was greeted with epic sales – 7 million copies in the first five weeks alone.
Hardware Evolves to Deliver Advanced Processing
While software was no doubt taking center stage in the enterprise, the hardware running these applications was also rapidly evolving. The computing backbone had morphed from room-sized mainframes and minicomputers into one supported by the ubiquitous desktop PC. This was driven in part by manufacturers developing computers similar to IBM’s original PC, released in 1981. Not long after, vendors such as Compaq, Hewlett-Packard, Texas Instruments and Xerox were bringing to market their own versions of the PC, dubbed “IBM clones.”
With this surge of PC vendors, the market soon began consolidating and companies like Dell and Gateway rose to prominence in the 1990s. Dell would soon emerge as the leader, however, offering affordable PCs – thanks to a highly efficient supply chain and manufacturing processes, which meant Dell could quickly produce build-to-order PCs.
With the growth in client-server environments and more affordable PC options, the mid-1990s were shaky times for Apple. Co-founder Steve Jobs had resigned from the company in 1985, leaving to start NeXT. With the business media counting it out, Apple was struggling to compete in a PC-dominant world. Making matters worse, Apple’s operating system was considered outdated compared to Microsoft’s bright-and-shiny Windows 95. Many naysayers didn’t expect Apple to make it to the 21st century.
Although Jobs had intended NeXT computers to “run Apple into the ground,” it was the other way around: Apple wound up purchasing the failed NeXT in 1996. And Jobs reluctantly returned to the helm at Apple in 1997 on the exact same date (Sept. 16) that he had departed a dozen years earlier. Heading into the new millennium, Apple was looking for a resurgence and it would soon make what some have called the greatest turnaround in business history.
The Web Widens the World
Although developments in software and hardware were now making things faster, easier and more efficient within the enterprise – the introduction of the World Wide Web in 1990 turned the world upside down. The concepts and tools for a widespread data network – or Internet – such as packet switching and, later, TCP/IP, had been around since the late 1960s and 1970s. By the early 1980s, CompuServe was gaining popularity, offering people the ability to connect to online communities via a nationwide packet-switched network.
Several networks were evolving toward the modern Internet – including ARPANET, which dated back to 1969, and later the National Science Foundation’s CSNET and NSFNET in the 1980s. With the growing popularity of NSFNET, a consortium was formed among IBM, MCI, and the Merit Network at the University of Michigan, charged with upgrading the network’s backbone to carry data at T1 speeds of 1.5 megabits per second. It worked – and in 1987 the modern Internet was born.
However, the real turning point came a few years later – on Christmas Day 1990 – when programmer and physicist Tim Berners-Lee, working on a NeXT computer, successfully established communication between a Web browser and a server over the Internet. It had been a long road to this point for Berners-Lee, whose two previous proposals outlining the World Wide Web had been rejected.
Berners-Lee demonstrated his Web to others at the European Organization for Nuclear Research (better known as CERN), and soon his urging to take the web seriously paid off. According to a Scientific American article celebrating the web’s 20-year anniversary, he finally convinced “professors, students, programmers, and Internet enthusiasts to create more Web browsers and servers that would soon forever change the world of human communication.”
Soon millions would follow, using services like CompuServe, The Source, AOL and Prodigy to venture online. And by the early 1990s, the race was on to develop user-friendly interfaces for the web. Marc Andreessen and Eric Bina’s Mosaic was the early contender, debuting its first version in 1993. Andreessen and Bina then teamed with Jim Clark to develop a spin-off browser, Netscape Navigator, a year later. Users loved the integration of web access, email, and newsgroups – and Netscape’s popularity skyrocketed.
Realizing the potential and threat of the newly emerging web, Bill Gates and the team at Microsoft rolled out Internet Explorer in 1995. The initial version was an add-on to Windows 95, but Microsoft introduced a cross-platform Version 2.0 later that year, supporting both Mac and PC users. And thus the browser war began, with Netscape and IE going at it for nearly two years before IE took the top spot with the introduction of its fourth version in 1997.
New developments were emerging to support this new environment as well. Sun Microsystems unveiled its Java programming language in 1995, giving developers an entirely new cross-platform playground for creating applications. The Secure Sockets Layer (SSL) protocol was also emerging, enabling secure web-based transactions. Thus, the way was paved for e-commerce. Pizza Hut was among the first to give it a go, with online ordering in 1994. But it would be Amazon and eBay, both launching in 1995, that took the e-commerce game to a whole new level.
With the web’s growing popularity among consumers, businesses of all sizes were beginning to pay close attention. And by 1996 most companies had no choice but to join the migration to the web. It was quickly becoming the new standard repository for information sharing and conducting business transactions. More importantly, the web introduced a revolutionary new architecture for application development. Soon the web would also be the hotbed of entrepreneurial dreams, as dotcom start-ups sprouted up everywhere.
Click here for the final installment of this series: Enterprise Software History, Part 4: Dotcom to Today.