As the “cloud” concept arose, there was a lot of debate about whether cloud computing would really replace the client-server architectures of today’s enterprises. Now it’s clear that the answer is both “yes” and “no.”
Cloud computing isn’t an “either/or” proposition. What will happen at first is that CD and DVD-ROM drives on computers will start to disappear. “App stores” and the like are nothing more than cloud-based software management systems -- but they work really, really well. Few of the “apps” one downloads to a device or PC are actually cloud-based apps -- but as more and more applications are developed for cloud-based distribution, more and more cloud-based functionality is being written into these apps.
Over time, applications will have a mix of “client/server” and cloud-based features, and it will be difficult for end users to tell where “client/server” ends and “cloud” begins. For software developers, this means they don’t need to completely rewrite their software for the cloud. They merely need to develop a strategy to migrate functionality into the cloud over time.
What is the right balance of “client/server” vs. “cloud” functionality? It depends. Features that are used often, are overly “chatty” (requiring a high volume of back-and-forth messages between the client and the server), or are highly time-sensitive will probably remain on the client side. At the very least, cloud-based apps will always include client-side modules for those devices capable of handling offloaded processing tasks. Features that are rarely used, or that can tolerate a little more latency and jitter, will migrate toward the cloud.
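The placement logic described above can be sketched as a simple heuristic. This is an illustrative sketch only -- the `Feature` fields and the numeric thresholds are assumptions chosen for the example, not values from any real framework:

```python
from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    calls_per_hour: int        # how often the feature is used
    round_trips_per_call: int  # "chattiness": client/server exchanges per use
    max_latency_ms: int        # how much delay the feature can tolerate

def placement(f: Feature) -> str:
    """Return 'client' for hot, chatty, or time-sensitive features,
    and 'cloud' for rarely used features that can tolerate jitter."""
    if f.max_latency_ms < 50:          # highly time-sensitive
        return "client"
    if f.round_trips_per_call > 10:    # overly chatty
        return "client"
    if f.calls_per_hour > 100:         # used often
        return "client"
    return "cloud"

print(placement(Feature("live video preview", 500, 40, 16)))  # client
print(placement(Feature("yearly tax report", 1, 2, 5000)))    # cloud
```

In practice the thresholds would come from profiling real usage, but the shape of the decision -- frequency, chattiness, latency tolerance -- is the same one described above.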
As a result of this trend, cloud computing may actually be a boon for software developers. In the old days, the software industry’s pricing model assumed that a significant number of illegal copies would be made for every legitimate copy that was sold. Now, with the added security of cloud-based software distribution, developers could theoretically reduce the price of their applications, as well as create innovative “by the drink” pricing models. These might include licenses for software that is used only a few weeks a year, or licenses where the software is used year-round but its advanced functionality is needed for only a few weeks out of the year.
Adobe, creator of in-demand programs like Photoshop and Premiere, is one company that has had to charge relatively high prices for its sought-after software, often putting it out of reach of all but professionals. Now, through Adobe Creative Cloud, people like me who edit a professional-quality video only rarely can pay $79 for one month of access to Adobe Premiere Pro without any annual commitment. As you can imagine, paying $79 every couple of years for access to the best video editing software is a big deal for a guy like me, who could never justify shelling out $699 for such infrequent video projects.
Even the enterprise applications that CIOs are hesitant to move to cloud-based services will eventually end up in the cloud. Why? Because a cloud-based app is perfectly capable of keeping a local copy of itself on each client, so that business processes are not affected during the 0.01 percent of the time that the cloud might be down (and local database servers can provide disaster recovery by keeping a local copy of all data, while also supporting some data needs during outages). As such, it's not a question of “client/server” vs. “cloud.” That debate is meaningless. The only relevant question in the cloud “war” is which applications move to the cloud before the others.
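The local-copy fallback just described can be sketched as a cloud-first read with a cached backup. This is a minimal illustration under assumed names -- `fetch_from_cloud`, `CloudUnavailable`, and the cache file are hypothetical stand-ins, not a real API; here the network call always fails, simulating the rare outage:

```python
import json
import os
import tempfile

class CloudUnavailable(Exception):
    """Raised when the cloud service cannot be reached."""

# Local copy of the data, kept in sync while the cloud is up.
LOCAL_CACHE = os.path.join(tempfile.gettempdir(), "app_cache.json")

def fetch_from_cloud(key: str) -> str:
    # Stand-in for a real network call; it fails here to simulate
    # the 0.01 percent of the time the cloud is down.
    raise CloudUnavailable("cloud service unreachable")

def save_local_copy(data: dict) -> None:
    # Normally this would run on every successful sync with the cloud.
    with open(LOCAL_CACHE, "w") as f:
        json.dump(data, f)

def get_record(key: str) -> str:
    try:
        return fetch_from_cloud(key)
    except CloudUnavailable:
        # Business processes continue from the local copy during an outage.
        with open(LOCAL_CACHE) as f:
            return json.load(f)[key]

save_local_copy({"invoice-42": "paid"})
print(get_record("invoice-42"))  # served from the local copy
```

The design choice is the one the paragraph argues for: the client never notices whether the answer came from the cloud or from the local copy, which is exactly why the “client/server vs. cloud” line blurs.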