This article first appeared in BRINK on September 7, 2017.
Large financial institutions are starting to consider new ways that the public cloud can help them drive innovation and competitive advantage. JP Morgan and ANZ Bank have announced that they will join long-standing public-cloud users such as Capital One in shifting more of their workload to the cloud. Other major players will soon follow suit.
After spending billions of dollars to strengthen key areas such as security, scalability, and multi-region support, cloud providers such as Amazon, Google, and Microsoft now offer companies a viable way to modernize their IT infrastructure and improve their technical capabilities while simultaneously cutting costs and driving efficiency.
Smaller organizations with lower technology budgets, such as credit unions, made up the first wave of financial firms that took advantage of the cloud. But now, large banks, asset managers, and insurance companies are running tests and modifying policies and procedures to allow them to move large compute loads and data infrastructures to the cloud as well.
Negative Cloud Assumptions Proving Untenable
Until recently, most large financial firms resisted migrating to the cloud for reasons that are becoming less and less tenable. Banks generally felt that any “public” infrastructure could never meet their stringent security requirements, yet there are recent examples of hackers gaining access to critical applications and customer data even on infrastructure the banks themselves own. In parallel, the large cloud providers have hired some of the top security talent in the world and invested heavily in supporting capabilities, allowing them to provide a level of security that in many ways exceeds what is available to all but the largest corporations.
Many institutions also assumed that regulators would not permit business-critical infrastructure to reside in the cloud. Interpreted literally, many global regulations require financial institutions to directly supervise vendors and monitor hardware down to “bare metal,” which is impossible in the cloud. To bridge this gap, cloud providers have implemented advanced logging and reporting mechanisms that support monitoring of applications and data systems operating on their virtualized infrastructure. Moreover, regulators such as the Financial Industry Regulatory Authority were among the earliest adopters of the public cloud, as their data and computing requirements outstripped internal capacity.
The final rationale for remaining with captive data centers was the belief that large firms could deploy less expensive and more flexible computing infrastructure than Amazon, Microsoft, or Google. This assumption has also proven false, given that these providers are investing billions of dollars each year in expanding their offerings, acquiring emerging startups, and integrating their capabilities. As cloud providers “climb up the stack” to provide more sophisticated capabilities (including machine learning and advanced analytics), the business case for remaining on private infrastructure has become more difficult to support.
Cloud Use Cases
The emerging cloud use cases at most large financial institutions involve large amounts of data and compute power. Applications migrating to the cloud include complex risk analysis, stress testing, regulatory reporting, surveillance, anti-money laundering, and customer marketing. These workloads pair massive databases with non-uniform spikes in computing demand, a combination that makes the public cloud’s scalable infrastructure a natural fit. Moreover, since full-scale changes in business requirements are already forcing major rewrites of these applications, banks don’t have to transfer much, if any, code to the cloud and can instead start nearly from scratch.
To hedge their bets, some firms are implementing “hybrid” cloud installations, using both internal data centers and the cloud. This allows them to store mission-critical or commercially sensitive data on their own infrastructure, while gaining the cost savings and capabilities of the cloud.
Higher Order Cloud Capabilities
Cloud developers are adding functional capabilities at a furious pace, going well beyond simple server replacement. Cloud infrastructure can now support massive data lakes, complex data transformation and analytics, visualization, machine learning, call centers, collaboration, and out-of-the-box mobile-device support. On Amazon’s cloud platform AWS, for example, you can deploy a custom chatbot with a few clicks, a task that would have been a huge, expensive project just a year ago.
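To make the “functionality-in-a-box” point concrete, the sketch below calls a managed machine-learning service to score a piece of customer feedback for sentiment. It assumes the AWS Python SDK (boto3) and Amazon Comprehend purely as an illustrative pairing; the configured credentials, region, and feedback text are likewise assumptions. The broader point stands regardless: a capability that once required a dedicated analytics project reduces to a single API call.

    # Illustrative sketch: sentiment analysis as a managed-service call
    # rather than an in-house machine-learning project.
    # Assumes AWS credentials are already configured for boto3.
    import boto3

    comprehend = boto3.client("comprehend", region_name="us-east-1")

    feedback = "The new mobile app made opening an account painless."

    # A single call returns a sentiment label plus confidence scores.
    result = comprehend.detect_sentiment(Text=feedback, LanguageCode="en")

    print(result["Sentiment"])        # e.g. "POSITIVE"
    print(result["SentimentScore"])   # confidence per sentiment label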
More recently, the emergence of “serverless” capabilities is beginning to change how business software is built. These offerings move business capabilities and processes into “logic containers” that run on demand, without dedicated virtual servers. They scale to whatever volume is required, and users are charged only for the transactions they process, so there is no ongoing server cost.
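As a rough illustration of what such a “logic container” looks like in practice, the Python function below follows the shape of an AWS Lambda handler invoked once per request behind an API gateway. The payment fields and the review threshold are hypothetical, invented only to show that the unit of deployment is a function, not a server.

    import json

    # Sketch of a serverless function: the platform runs this handler on demand,
    # once per incoming transaction, with no server to provision or pay for
    # while idle. The "payment" payload and threshold are hypothetical.
    def handler(event, context):
        payment = json.loads(event.get("body", "{}"))

        # Illustrative business rule: flag large transfers for manual review.
        flagged = payment.get("amount", 0) > 10_000

        return {
            "statusCode": 200,
            "body": json.dumps({"flagged": flagged}),
        }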
While financial services firms debate the extent to which they want to embrace the cloud, it is critical that they at least start to build up their cloud skills. Competitors are discovering the advantages of infinite computing power, functionality-in-a-box, and pre-built components that are instantly deployable. These capabilities provide business value far beyond what on-premises servers can provide, and companies locked into legacy infrastructure will struggle along at a competitive disadvantage.