Enterprise Software History, Part 4: Dotcom to Today

by the Managing Editor

If you’ve been following this four-part series on the history of enterprise software, then you know the journey that’s led us here has been a long and winding road paved with technological foresight and sheer ingenuity. And for those of us who have lived through this evolution – from firing up our very first desktop PCs to dialing into the World Wide Web – it’s been a pretty incredible ride.

It’s a story that began with punched cards and mainframes, then moved to personal computers – which transformed the enterprise with a slew of new application development. Clearly, the pace and breadth of innovation that led us into the new millennium completely changed the way we do business. And another (r)evolution was in store with the ubiquitous adoption of the web.


Dotcom and Y2K Create Widespread IT Boom

The big story as we moved into the mid-to-late 1990s was the Y2K rollover. Everywhere you looked, it seemed, the media was abuzz about the implications of the changeover to the year 2000 – or the “Y2K bug,” as it was often called. That’s because many of the legacy computing systems, which had been in place a decade or more, were built using two-digit years to record dates.

Back in the mainframe computing days, this two-digit approach was used because computing power and storage were limited, and two digits were cheaper to store than four. But as we headed into the new century, no one was quite sure what would happen when the year rolled over from 99 to 00. Would we be zapped back in time to 1900?
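To see why this was such a headache, here is a toy illustration – a minimal sketch in Python, not how any actual legacy system was written (most were COBOL on mainframes) – showing how date arithmetic that works fine through 1999 breaks the moment a two-digit year wraps to 00.

```python
# Toy illustration of the Y2K problem: years stored as two digits.
# (Hypothetical example -- the function name and data are made up,
# but the arithmetic failure is the real issue.)

def age_in_years(birth_yy, current_yy):
    """Compute an age from two-digit years, as many legacy systems did."""
    return current_yy - birth_yy

# Works fine through 1999...
print(age_in_years(65, 99))   # someone born in 1965, in 1999 -> 34

# ...but at the rollover, "00" reads as less than "99".
print(age_in_years(65, 0))    # in 2000 -> -65 instead of 35
```

Remediation generally meant widening date fields to four digits or applying “windowing” logic to infer the century – simple in principle, but tedious across millions of lines of legacy code.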

To circumvent any fallout from this transition, businesses in every industry made sizable investments to upgrade their hardware and software systems. In fact, according to a Federal Reserve Bank of St. Louis white paper authored by Kevin Kliesen, expenditures on equipment and software to prepare for Y2K, across both the public and private sectors, were in the neighborhood of $114 billion.

And the bulk of this spending came between 1997 and 1999 – primarily to replace enterprise resource planning (ERP), accounting and other mission-critical applications. The spending went far beyond software licenses, however, as countless hours of consulting time were also needed to install and customize the software on site.

Running parallel to Y2K, another lucrative story was emerging by the late 1990s with the widespread adoption of the Internet. Ongoing efforts to create a rich user experience on the front end – along with developments such as HTTP, XML, Java and Secure Sockets Layer (SSL) on the back end – meant the web was no longer just a way to check movie listings. It was becoming a robust platform upon which to generate and transact business.

Sites like Amazon and eBay – along with throngs of others – were quickly racking up massive sales. Everywhere you looked, it seemed, a dotcom idea was sprouting up with a frenzy of investment to match. And this “new economy” was fertile ground for entrepreneurs with promising (and not-so-promising) ideas to find capital. According to business reporter and columnist Sarah Lacy, venture capital investment surged from an estimated $12 billion in 1996 to $106 billion in 2000.

In her book, Once You’re Lucky, Twice You’re Good, Lacy takes a look at the emergence of web 2.0 in the years after the dotcom era. She contends that while venture capitalists were enamored with the promises of this shiny new toy called the Internet, Silicon Valley was no stranger to technology hype. She writes:

“From a behavioral standpoint, there was nothing inherently unique about how venture capitalists responded to a big new innovation like the Internet. As long as Silicon Valley has been around, bubbles have existed. A new technology emerges – chips, computers, video games, business software – that suddenly creates market opportunity.” -Sarah Lacy, Once You’re Lucky, Twice You’re Good

While the business world was now engulfed with start-ups, established tech companies were developing Internet strategies, too. Firms that had built applications around a legacy enterprise architecture were now questioning how to best migrate to and deliver Internet-based technologies.

Software Struggles in the Aftermath of the Bust

While the enterprise was consumed with Y2K and Wall Street was making huge investments in dotcom start-ups, a deeper, double-sided reality was emerging. For software vendors, the reality was that Y2K came and went without much of a hiccup – and once the concern passed, spending on enterprise solutions slowed dramatically. Then, of course, we all know what happened with the dotcoms – or rather, dotbombs. The quick recap: many of them failed to make money as quickly as they were spending it, and they were forced out of business.

All things combined, by the early 2000s the software industry was facing a much different landscape. In an article, Summit Partners’ Greg Goldfarb and Jason Glass write:

“Forecasts of a Y2K disaster, Internet-driven hype and gangbuster corporate profits allowed both vendors and customers to embark on a spree of untethered spending that led, inevitably, to restructuring the software industry over the subsequent decade.” -Greg Goldfarb and Jason Glass, Summit Partners

While the software industry – and the market as a whole – struggled to get back on its feet after the turn of the century, the industry was also maturing, and with that came a natural shakeout.

Consolidation Signals End of an Era

Heading into the new millennium, many of the legacy vendors that had emerged as major players in the client-server era of the 1980s and early 1990s were struggling to compete. Soon the industry resembled a game of Pac-Man, with many of these early software vendors being gobbled up through acquisition. In fact, by the mid-2000s, consolidation in the software industry was outpacing all other U.S. industries. Software acquisitions represented the largest share of all M&A in the technology sector – accounting for approximately 40 percent of the $298 billion in deals in 2006, according to Mary Hayes Weier’s InformationWeek article.

While companies have historically used acquisitions as a strategic approach to expand product breadth and depth, many of the M&As during this time were driven by financial pragmatism. Firms like Oracle, Infor, SSA and Consona would purchase an older, growth-challenged company and scale back that firm’s R&D, sales and marketing. Because mission-critical systems are not easily replaced, customers continued to pay for those systems to be supported, which delivered “sticky” revenue for the software vendor. This created a stable flow of revenue – and threw off huge gross profits – which could be leveraged with debt to yield higher financial returns.


In this spree of consolidation, legacy ERP vendor Baan was purchased by fellow legacy vendor SSA Global Technologies in 2003, with capital provided by large private equity firms. Then, three years later, Infor acquired SSA. JD Edwards, which had been a pioneer in ERP software since the late 1970s, merged with PeopleSoft in 2003. Oracle then snatched up PeopleSoft in 2005 through a hostile takeover.

Oracle followed this with a string of other acquisitions in the applications sector, such as Siebel, Hyperion, Primavera, TempoSoft and Agile. But the database company hasn’t stopped with applications. Oracle also made massive buys in hardware with Sun Microsystems and middleware vendors like BEA.

Industry giant IBM also went on an M&A spree, acquiring more than 100 companies in the 2000s. These acquisitions expanded Big Blue’s breadth and depth across the board – including applications (FileNet Corp., Cognos and Telelogic); middleware (ISS, Ascential Software and Rational Software); and services (PricewaterhouseCoopers Consulting).

Needless to say, the face of the software industry underwent enormous change in the 2000s. Legacy vendors that had once influenced the direction of the industry were now quickly being bought out. And those that were left standing were being forced to rapidly evolve because another major chapter in software was now underway.

Web Spurs New Model for Software Delivery

Whereas traditional software installations required buyers to purchase licenses, host and then maintain the software in-house, vendors were exploring how to deliver the same applications through the Internet. What emerged were two similar business models: application service providers (ASPs) and software as a service (SaaS).

The ASP model emerged as both a deployment and a business model, giving traditional vendors with client/server architectures a way to offer hosted versions of their software. Buyers could then access these products via an emulation tool like Citrix or, in some cases, a web browser.

With SaaS, on the other hand, the entire experience is web based – both in hosting the software on the back end and in delivering the solution to customers strictly via a web browser. Rather than the software being installed on-site on an individual computer or server, SaaS applications are hosted, maintained and upgraded by the software vendor. Users simply purchase a subscription, log in through a web browser and access their application.

Given the roller coaster of a ride the industry experienced from the late 1990s to early 2000s, the enterprise was “suffering from exhaustion,” explains industry analyst Denis Pombriant. So when pure SaaS companies like Salesforce.com emerged, built entirely upon a web-based, pay-as-you-go, SaaS delivery model, everyone began taking note.

“It was a big turning point,” says Pombriant. “Salesforce.com became the poster child for proving that SaaS is a viable business model. The development of SaaS has definitely been one of those lines in the sand that distinguishes the industry from before and after Y2K and the dotcom bust.”

The SaaS model was not only putting new pressure on traditional software vendors – and creating a new point of entry for pure SaaS companies – but the benefits were proving to go well beyond the web browser.

SaaS Impacts Enterprise Software Consumption

Perhaps one of the more significant advantages of SaaS from the buyers’ perspective has been the purchasing process itself. Buying traditional on-site software licenses required a significant upfront investment of time and capital – not to mention ensuring that you had the right hardware to support the application.

With SaaS, however, purchasing software has become much like signing up for any subscription-based service. In many cases, you go to the SaaS vendor’s web site, check out the features and prices, and then use a credit card to subscribe to the service on the spot. That means buyers can bypass the bureaucratic chain of approval to instantly purchase and access a software solution that previously would have taken months – or even the better part of a year – to purchase and implement.

According to Tom Nolle, president of consulting firm CIMI Corp., it’s this type of instant, inexpensive access that’s dramatically changing how the enterprise relates to software. For example, Nolle says, in the 1990s IT spending was centered around the cost-benefit ratio of investing dollars in software and hardware to gain productivity. Yet with SaaS and mobile computing, end users have direct access to the same technologies – for a fraction of the cost.

“Through the years, what we’ve seen is that new technologies that have brought about changes in the fundamental cost-benefit relationship that enterprises use to justify their projects. What’s now starting to take hold and this is where mobility – on devices like the iPad and iPhone come along – is that we are now creating a situation where instead of educating the enterprise, we’re changing the culture of the workforce.” -Tom Nolle, President, CIMI Corp.

From the users’ perspective, the emergence of a subscription-based software model has no doubt provided a new level of convenience. Behind the scenes, software developers are enjoying a similar benefit: the ability to innovate new applications faster and at far less cost.

Cloud Creates New Platform for App Development

One of the key drivers of this SaaS environment is the widescale emergence of virtualization and cloud computing. Virtualization enables users to simulate and run multiple virtual computers on one physical computer. Each of these virtual machines can run an entirely different operating system and set of software applications. And this can be scaled out across many physical machines, creating an entire virtual infrastructure. This means that multiple departments or customers can share machines to leverage hardware investments.

In the early 2000s, firms like VMware led the way in virtualization – providing the platform for IT users to create a network of interconnected, simulated hardware and software systems. Given its virtual nature, the technology is essentially hidden from the end user behind a “cloud” – hence the term cloud computing. Another approach to cloud computing – and one used effectively by most SaaS vendors – is multitenancy, in which multiple customers and users run on the same application code base and share the same database, while careful partitioning of records once again hides this complexity behind the metaphorical cloud.
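To make the multitenancy idea concrete, here is a minimal sketch in Python with SQLite – purely illustrative, with a hypothetical schema; real SaaS platforms layer indexing, access control and row-level security on top of this. Every record carries a tenant identifier, and every query is filtered by it, so customers share one database without ever seeing each other’s data.

```python
import sqlite3

# Minimal multitenancy sketch: all customers share one table, and every
# query is scoped by a tenant_id column. (Table and column names are
# hypothetical.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (tenant_id TEXT, name TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO accounts VALUES (?, ?, ?)",
    [("acme", "Operating", 1000.0),
     ("acme", "Payroll", 250.0),
     ("globex", "Operating", 9999.0)],
)

def accounts_for(tenant_id):
    """Return only the rows belonging to a single tenant."""
    return conn.execute(
        "SELECT name, balance FROM accounts WHERE tenant_id = ?",
        (tenant_id,),
    ).fetchall()

print(accounts_for("acme"))    # [('Operating', 1000.0), ('Payroll', 250.0)]
print(accounts_for("globex"))  # [('Operating', 9999.0)]
```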

While there’s been much speculation over the years about the viability of the cloud, one thing is certain: the cloud offers a platform for developers to create new applications and bring them to market far more quickly and at far less cost than traditional software. And this is having a ripple effect throughout the industry, explains Carl Brooks, an industry consultant who previously covered cloud computing as a reporter for TechTarget.

“Whether you do an internal build of a private cloud or access apps on a public cloud, you choose your buying capability,” says Brooks. “And now you can do it for practically pennies a month per terabyte. A lot of enterprise companies are still comfortable with the data-center model, but now we’re moving from a traditional life cycle to being able to get everything on demand. It’s pretty incredible.”

In fact, there are now SaaS solutions available for almost every business sector – from ERP and HR to accounting and supply chain management. And then of course, there’s Google, which delivers a pretty powerful set of business productivity applications like email, calendars, spreadsheets and word processing.

But firms like Google, Salesforce.com, NetSuite and Marketo aren’t the only ones playing in this sandbox. Traditional software vendors are also making the move to deliver SaaS-based applications. For example, Microsoft now offers a pay-as-you-go Office 365 for small- and mid-sized businesses and enterprise clients. Legacy ERP provider SAP is also in the SaaS game with its integrated SAP Business ByDesign suite for finance, HR, CRM, supply chain and procurement functions. Oracle offers Oracle CRM OnDemand, as well as other SaaS enterprise apps.


Mobile Technology Creates Broad Accessibility

While cloud computing is certainly demanding its time in the spotlight these days, there’s ongoing speculation as to just how viable a solution it is for large-scale enterprises. Although questions remain around the cloud’s security and accessibility, there’s one trend that’s undoubtedly changing the face of the enterprise: mobile technology.

No company has catalyzed this change more than Apple. Just as Apple co-founders Steve Jobs and Steve Wozniak revolutionized personal computing with the Apple II in 1977 and Macintosh in 1984 – Apple’s launch of the iPhone in 2007 changed the face of mobile connectivity. And Apple’s smartphone has helped add fuel to the company’s turnaround, which started with the iPod in 2001.

While the iPhone created a pretty, user-friendly way to text, talk and browse the web, its most powerful aspect was behind the scenes. Apple’s iOS operating system provided a dynamic platform upon which software developers could create a multitude of applications. And compared to traditional software development, these apps can be created for far less cost and either given away for free or sold inexpensively – to thousands or even millions of users.

While Apple has pushed the envelope on mobile phones – now in its fifth generation of the iPhone – the company created a tipping point for another evolution in computing with the release of its tablet device, the iPad, in 2010. Following on Apple’s heels, other big-name manufacturers have been rolling out their own tablets this year, including Amazon.com, Dell, Samsung and Hewlett-Packard.

While the enterprise is where computing took root – and where software found its footing – the shift over the past decade toward consumer-driven technology has transformed the face of enterprise computing. And with smart technology – not to mention social media – the lines between the way we play, live and work are now forever blurred.

Certainly, computing’s early visionaries probably never imagined we could generate more computing power and access more mind-boggling software applications on a single, thin tablet than was ever conceived when those first mainframe computers were released 60 years ago. While we could speculate – and many do – on what lies ahead, the truth is there are likely developments on the horizon we cannot even fathom yet. But I can tell you none will ever be as cool to me as the first time I shattered that backboard on my Dr. J vs. Larry Bird Commodore 64 game.

What do you think? What do you foresee as the big trends in enterprise software over the next 10, 20 or 30 years? Feel free to leave your comments below. (And if you’ve made it to the end of this fourth article, thanks for reading!)

 