To state that the financial services industry is undergoing tremendous change is an understatement. We hear daily about the headwinds of fintech disruption, as well as regulatory and compliance requirements that continue to tighten. Customer expectations also continue to grow, and managing them becomes tougher every day.
But these institutions are not sitting idle. There is a definite sense of urgency to transform both business and operational models, with a view towards growing the top line and optimizing the bottom line. Consequently, digitization initiatives are top of mind, as technology is used both to enable change and to fight disruption. Automation is critical to these digitization efforts, and it can be an excellent pivot between speed and risk management, especially as DevOps processes and cloud models are embraced.
Recently, I had occasion to address over 100 executives from banks, non-banking financial companies (NBFCs), hedge funds and other financial services firms in the UK. Those attending were focused on several areas, including compliance and risk management, quality assurance, and technology design and planning, and some had broader architectural responsibilities. Based on these engagements with the financial services sector in the UK and the United States, as well as with customers of my own company, here are five trends and challenges worth sharing:
1. Data is the new oil, gold and everything else valuable put together. Ensuring complete data privacy is still a struggle. Data storage is also very distributed and complex to manage.
Data is a double-edged sword. On one hand, these financial institutions are privy to considerable personal and financial data; on the other, as guardians and gatekeepers they have a tremendous responsibility to ensure privacy and security. Handling data in fast-paced, dynamic environments is a challenge, as is ensuring it is “touched” appropriately even for “mundane” testing operations. Accountability is difficult, particularly when public cloud is adopted or when external vendors are in the picture. Regulatory requirements like the EU’s General Data Protection Regulation (GDPR) have brought significant overhead, without much resource allocation to deal with it. Amidst this drive to do more with less, there is recognition that cryptocurrencies will intersect the mainstream very rapidly, and that backend operational systems and regulations are not really geared to accommodate them.
While the nature of data is changing in real time, data storage is also quite distributed, non-uniform and spread across all manner of archaic media. It is hard to know what is where, and there is active recognition that leaks could more readily occur from data on backup drives and secondary storage than from someone hacking into a production server. The infamous Equifax incident and other recent breaches are poster children for these problems. Managing this concern is a complex task, and with cheap storage readily available, it is often easier to defer acting on it, leading to continued fragmentation.
2. Software investment is growing exponentially, and software engineering headcount has grown 2-3X over the past few years. DevOps and Agile practices are being rapidly adopted along with automation across pockets. Software lifecycle automation is ideal, but still fraught with potholes.
The popular adages that every company is a software company and that software is eating the world are exemplified by one simple metric: headcount. These financial institutions have hired more software engineers than any other function within their organizations, whether locally or offshore. Making these software dev/test engineers productive and aligning them with business transformation initiatives requires adopting Agile methodologies (though most didn’t care for the label) and instituting DevOps practices. This change is shifting organizational practices and influencing cultures.
While several execution elements have been automated, these aren’t necessarily standardized across distributed organizations. Fragmented tool adoption and disparate open source projects across different teams still contribute to inefficiencies. While open source is seeing increasing adoption, non-standard usage and support issues become bottlenecks at scale. Likewise, building complex dev/test environments at scale in an expedited manner is still cumbersome. Raising tickets for IT teams to set up or tear down these environments means considerable waiting in line, leading to productivity losses and slower time to market. Not everyone feels this pain uniformly.
3. (Public) Cloud is still a risk, but project managers are ready to take the plunge where regulations permit. Application migration will be selective. Mainframes running COBOL are still humming away!
While some projects are budgeted to go to the cloud, they are taking small steps with a crawl-walk-run approach. One executive I met mentioned that he would “put something up there on AWS and see how things go and run things full steam till management shuts it down.” These institutions have considerable in-house technical intellectual property that would take significant time to modify and replicate in the cloud. The technical baggage varies, and consequently different institutions are at different points in their journey to the (public) cloud. And while they do see hybrid deployment models, they don’t necessarily see hybrid workloads for the same applications, nor any application being multi-cloud.
They also don’t think it is possible to lift-and-shift every application. Nor do they want to. For one, it isn’t necessary: several workloads run on mainframes that are still reliably executing COBOL and other applications, and they really don’t want to touch them. Many prefer to front-end them with something that resides in the (public) cloud. Second, migrating would be quite expensive. These are reliable machines with significant investment and optimization efforts behind them, and there’s no point in ripping them out.
It is a fact that these Fortune 500 institutions have invested considerably in technology over the past few decades and kept most of it even as they have added new elements to the stack. We cannot dismiss these outright as legacy systems, as they’re serving critical functions in many cases.
4. “Non-core” functions that were previously outsourced are becoming strategic and vendor management in the new era has considerable challenges.
Several financial institutions have outsourced or offshored many aspects of their software development, software maintenance, quality assurance and other non-core execution elements. This has led to distributed teams and allowed them to gain cost efficiencies. Vendors like Infosys, Capgemini, Accenture, Wipro and others have gained from this trend and built up commendable FSI practices. While they help with cost efficiencies, there are overheads to deal with. Many of these offshore vendors have good domain knowledge but poor business context. This shortcoming has implications for testing, security and quality assurance.
Since solution architects were most often co-located with the business and had better context, they became the go-to people for ensuring that nothing was “lost in translation.” To reduce this dependency, the idea of having solution architects design environment blueprints and publish them in self-service catalogs that distributed dev/test teams could leverage consistently the world over appealed to many. This ensures standardization across teams through a non-disruptive workflow, with fewer elements misunderstood along the way. Productivity definitely increases.
5. Security and compliance are hanging like the Sword of Damocles – but several black holes persist in the organization. Managing certification and compliance without visibility isn’t easy.
Regulations and compliance have always been integral to the financial services sector. But especially in Europe, with the GDPR deadline looming, it is certainly top of mind for a lot of people. While they aren’t thrilled about it, most view it as a necessary evil and are pragmatic enough to accept that while no regulation by itself will make things foolproof, it is another way to showcase a better security posture. I see the same posture adopted by several US-based companies doing business in Europe, which have dedicated significant resources to getting ahead of this initiative.
Regulations aside, security is top of mind in most day-to-day operations. Without fancy names like DevSecOps or continuous security, these institutions have had to put processes in place to become more thorough in adhering to a security checklist. However, the sheer velocity of code changes, as well as environment complexity, has made even simple tasks like applying a software patch cumbersome. Whether it be certifying infrastructure or applications (or both), the system is under pressure to do so at the same pace as before.
Standing up to auditors is another issue. It is hard to protect what you don’t know about or have visibility into. Plus, there is certainly a lot of “legacy gear.” Several consultants and auditors do their jobs, certifying and providing reports. However, these environments are fast-changing, and it is quite a challenge to ensure that even something recently audited isn’t already compromised, given the pace of change. How, then, to perform quality assurance or security assurance in such dynamic environments? Many are employing automation here as well, bringing continuous security assurance alongside the principles of continuous testing. Cyber ranges are also getting attention because they help train employees and contractors in best practices in authentic environments.
Money as we know it, and our perception of it, is itself changing rapidly. The currencies of yesteryear could end up just becoming commodity metals or plain paper. What then can these noble banking institutions – the gatekeepers and citadels of money – do to make themselves relevant? While there will be some dinosaurs, my money is on the fact that most of these institutions will be successful in their transformation and will adapt to come out ahead.
You can take that to the bank.
About the Author:
Shashi Kiran is currently the CMO at Quali. He has 20-plus years of experience in the high-tech industry in the areas of data center and cloud computing, security, internetworking, software and analytics for enterprises and service providers. Shashi is based in San Jose, Calif. He was previously the head of worldwide marketing for Cisco’s data center and cloud portfolio. In addition to his day job, Shashi advises a few startups and venture firms on marketing functions, strategy and scale. Shashi can be reached via LinkedIn at http://www.linkedin.com/in/skiran