By Silvia Davi, Chief Marketing Officer, Equities.com
The constant influx of new technologies into the financial sector is driving major transformation on Wall Street. Most recently, the advent of advanced systems like artificial intelligence, big data and high-frequency trading has made the finance industry almost unrecognizable from what it looked like just a decade ago. While this transformation has largely been a net positive, bringing more efficiency and transparency to the market, it has also made market activity impossible to track through traditional means. For investors and regulators, an inability to keep up with this progress creates very concerning vulnerabilities, to say the least.
These new issues are something the Financial Industry Regulatory Authority (FINRA) realized early on. It’s also why Wall Street’s leading self-regulatory organization, whose mission is to provide investor protection and promote market integrity, invested so heavily in modernizing its operations years before this current wave of innovation really hit the market. Leading the effort all the while has been Steven Randich, Chief Information Officer of FINRA. Prior to joining FINRA, Randich’s extensive experience had been on the other side of the table, holding similar roles at Citigroup (C), NASDAQ (NDAQ), the Chicago Stock Exchange, IBM Global Services and KPMG.
Today, FINRA’s architecture isn’t just keeping pace with Wall Street, it’s arguably ahead of most of the industry’s key players. In this Equities.com interview, Randich gives us a peek under the hood of FINRA’s newly revamped, well-oiled machine and explains how it positions the regulator to protect investors better than ever.
EQ: The financial industry has been undergoing a significant period of innovation and technological disruption in recent years. We’ve seen this through the prevalence of Fintech startups, big data analysis, algorithmic trading, and so forth. Yet, as the industry regulatory body, FINRA hasn’t only kept pace, it’s actually been ahead of the curve. How has fully embracing emerging trends such as open source data and cloud computing early on helped you stay ahead of the industry you’re tasked with watching over?
Randich: It is really amazing how FINRA, as a financial services industry regulator, has been able to achieve this level of innovation, creation and successful deployment of such advanced and pioneering technology. In most respects, we believe we are several years ahead of the vast majority of financial services firms. At the recent Amazon Web Services (AWS) re:Invent annual conference, where FINRA had a total of eight speakers, AWS’s CEO stated that FINRA is “one of the very, very top practitioners of building on top of AWS” and that they have learned much from us. We set out in 2013 with a deliberate yet unconventional approach to implementing big data software in the public cloud.
First, we insisted on being self-sufficient: doing the work ourselves and training our own employees rather than relying on vendors. Next, we wanted to do it in the public cloud, not the private cloud that is so prevalent in the financial services industry. We did not want to continue acquiring proprietary technology, provisioning it for peak load and disaster recovery, and then watching it depreciate in our data centers. Instead, we wanted access to massive scale at commodity costs. Third, we wanted to use open source database and big data software. We had numerous proprietary database vendors telling us to use their software, insisting that the open source software was not mature and would not scale. Lastly, we felt that in order to fully leverage the cloud model, we needed to re-architect and rewrite our software from the ground up, as opposed to doing a “lift and shift.” This approach allowed us to fully leverage the demand elasticity of the cloud, build cybersecurity controls into the foundation of our software, and fully automate the development operations (DevOps) function.
EQ: What are some advantages FINRA now has at its disposal with open source and cloud technology that weren’t available with your legacy infrastructure?
Randich: There are several. First, it is important to understand that open source big data software and the public cloud present distinct yet symbiotic advantages. The public cloud allows massive (virtually limitless) processing and storage scale at commodity prices. Previously, we had major challenges with the physical storage and processing capacity of our legacy data warehouse appliances. When implemented wisely, with 100% elasticity, we have access to this scale on demand. In other words, we pay for it only when we need it, so it is significantly less expensive than resident infrastructure. A related benefit is that we don’t need to provision for peak capacity, because we can burst our available capacity to meet a processing peak for only the time it is needed, and pay only for that. Resiliency and business continuity are also superior to the traditional primary-and-secondary data center approach, again with the ability to provision and pay for disaster recovery infrastructure only when needed.
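The economics Randich describes, paying for burst capacity only during a processing peak instead of owning peak-sized infrastructure around the clock, can be sketched with some simple arithmetic. The numbers below are purely illustrative assumptions, not FINRA's actual figures:

```python
# Illustrative comparison (hypothetical numbers, not FINRA's): owning
# infrastructure sized for peak load 24x7 vs. paying a small baseline
# all month and bursting to peak capacity only for the hours needed.

HOURS_PER_MONTH = 730

def fixed_cost(peak_units, unit_hourly_rate):
    """Owned/reserved capacity sized for peak is paid for every hour."""
    return peak_units * unit_hourly_rate * HOURS_PER_MONTH

def elastic_cost(baseline_units, peak_units, peak_hours, unit_hourly_rate):
    """Pay for the baseline all month, plus the burst only while it runs."""
    baseline = baseline_units * unit_hourly_rate * HOURS_PER_MONTH
    burst = (peak_units - baseline_units) * unit_hourly_rate * peak_hours
    return baseline + burst

# Example: the peak needs 100 units, but only ~40 hours a month;
# the steady-state baseline is 10 units.
fixed = fixed_cost(100, 1.0)              # 100 * 730 = 73,000
elastic = elastic_cost(10, 100, 40, 1.0)  # 7,300 + 3,600 = 10,900
print(f"fixed: {fixed:,.0f}  elastic: {elastic:,.0f}")
```

With these assumed numbers the elastic model costs roughly a seventh of the peak-provisioned one; the gap widens the spikier the workload is relative to its baseline.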
Open source software projects in the Apache Software Foundation, such as Hadoop, HBase, Hive, and Spark, allow FINRA to leverage the massive scale and elasticity of the public cloud. With open source and its broad developer community, we have access to much more product development and innovation, because the number of software developers contributing to the software is not limited to the employees of one company. Often, we see hundreds of developers from all over the world contributing to these open source products. That brings me to the last major advantage of open source: you are not locked into a vendor’s proprietary software and its limited pool of software developers. In other words, there is no vendor lock-in.
EQ: In terms of scope, how much data and what moving parts are involved on a day-to-day basis regarding the infrastructure you’ve built on the cloud? What are the advantages and challenges you face when dealing with the size and complexity of the financial markets, now that they’re moving at lightning speed?
Randich: We now have over 20 petabytes in the cloud, representing over 90% of FINRA’s total data volume. The volume in the cloud is growing rapidly: on average, every day we collect 50 billion market events from broker-dealer firms and exchanges. These include pretty much every quote, order, cancel, and trade occurring in our equity securities market today. We have had peak days where we collected 75 billion market events. Before moving to the cloud, we had clear challenges in dealing with this data volume given our fixed-footprint infrastructure. This was particularly challenging when, after we had collected, processed, and surveilled the data, a firm or exchange would inform us that the data it had previously sent had been submitted in error and needed to be resent. In these cases, which are not rare, we would have to determine the impact of the resubmitted data on our previous surveillance results.
Now, in the cloud, we can store and process multiple versions of the data simultaneously, so we can handle resubmissions with relative ease. We have been observing 20% year-over-year increases in total market event volume. With the unlimited scale, at commodity costs, of the public cloud, we can absorb these market volume increases with no effort whatsoever. Finally, now that we are not spending so much time managing our infrastructure and dealing with capacity constraints, we can focus much more time on innovations to our surveillances, so that we can stay ahead of the latest developments in market manipulation and fraud and consequently be a more effective regulator.
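The idea of keeping multiple versions of a submitted batch, so a corrected resubmission can replace an erroneous one and surveillance can be rerun on just the affected data, can be sketched as below. This is a hypothetical illustration; FINRA's actual pipeline and schemas are not public, and the field names and threshold are invented:

```python
# Hypothetical sketch: retain every version of a firm's daily batch of
# market events, so an erroneous submission can be superseded by a
# resubmission and surveillance recomputed against the latest version.

from collections import defaultdict

class EventStore:
    def __init__(self):
        # (firm, trade_date) -> list of batch versions, oldest first
        self._versions = defaultdict(list)

    def submit(self, firm, trade_date, events):
        """Store a new version of a batch; earlier versions are retained."""
        self._versions[(firm, trade_date)].append(list(events))
        return len(self._versions[(firm, trade_date)])  # version number

    def latest(self, firm, trade_date):
        return self._versions[(firm, trade_date)][-1]

def surveil(events):
    """Toy surveillance rule: flag trades above a notional threshold."""
    return [e for e in events if e["qty"] * e["price"] > 1_000_000]

store = EventStore()
store.submit("firm-A", "2017-03-01", [{"qty": 100, "price": 50.0}])
# The firm later reports the earlier file was wrong and resends it:
v = store.submit("firm-A", "2017-03-01", [{"qty": 30_000, "price": 50.0}])
alerts = surveil(store.latest("firm-A", "2017-03-01"))
print(v, len(alerts))  # version 2; the corrected data now triggers an alert
```

Because old versions are kept rather than overwritten, the impact of a resubmission on earlier surveillance results can be measured by diffing the alerts produced by each version.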
EQ: Artificial intelligence is another area garnering intense interest in the market right now. Is this an essential area for FINRA going forward as you continue to try and transform data into information?
Randich: Yes, FINRA is investing specifically in machine learning software and algorithms to improve the efficiency and effectiveness of our surveillances, and is currently pursuing a proof of concept in our market surveillance area. We are experimenting with several open source machine learning platforms and expect to benefit from the rapid innovation occurring in the open source community. While it is early in our process, we are optimistic that machine learning will ultimately add value to our regulatory effectiveness.
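FINRA does not disclose its specific models, but a common first step in surveillance-style anomaly detection, before heavier machine learning, is statistical outlier scoring. The sketch below is a generic example of that idea, with invented data, and makes no claim about FINRA's actual methods:

```python
# Generic anomaly-scoring sketch (not FINRA's actual models): flag recent
# trading volumes that deviate strongly from an account's historical mean.

from statistics import mean, stdev

def zscore_outliers(history, recent, threshold=3.0):
    """Return recent observations more than `threshold` standard
    deviations away from the historical mean."""
    mu, sigma = mean(history), stdev(history)
    return [x for x in recent if abs(x - mu) > threshold * sigma]

# Hypothetical daily volumes for one account:
history = [100, 110, 95, 105, 98, 102, 107, 99]
flags = zscore_outliers(history, [103, 500])
print(flags)  # [500]
```

In practice such a score would be one feature among many feeding a learned model, but it illustrates the basic shape of turning raw event data into candidate alerts.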
EQ: FINRA’s cloud-computing roll-out began in early 2014 and was projected to be a 30-month project. Can you reflect on the process and how it compared to your initial expectations? Did it evolve as the market evolved?
Randich: The migration of our market surveillance platform to the cloud began in January 2014 and was completed in July 2016. Our initial expectation was that this migration would reduce our costs, both data center infrastructure and the labor associated with the operational work necessary to keep the system working within a constrained-capacity environment. The project completed on time and delivered these cost reductions. However, there were several big and pleasant surprises.
First, the performance has been stunning: on average, a 400-fold improvement in the time required to run surveillance queries. Second, the cybersecurity is superior to that of a private data center model. Features like encryption, micro-segmentation and many others contribute to making our data safer in the cloud. Third, the resilience and business continuity are better than in a private data center model, because our processing and storage of data run virtually across dozens of geographically distributed data centers with varying data center utility providers. Of course, these additional benefits require a proper system design and architecture, taking advantage of functions that AWS makes available to its clients.
EQ: FINRA’s primary goal is to protect investors from fraud and bad actors within the industry. How have new innovations and technologies in the financial industry changed the way in which fraud is being committed? What are some new threats and risks that investors need to start paying more attention to?
Randich: As trading has become increasingly electronic, and consequently faster, market manipulation and fraud have become more sophisticated and consequently more difficult to catch. With the fragmentation of our markets, where orders in the same security can be sent to and executed on multiple exchanges, alternative trading systems and liquidity pools, there are greater opportunities for manipulators to disperse their trading activity in an attempt to avoid detection. We also see market manipulators using accounts at different firms to trade in an attempt to obfuscate their activity.
For all the reasons I noted above about the benefits of the AWS cloud, particularly its speed and on-demand capacity, FINRA is much better positioned to detect and deter this kind of nefarious activity. Through our cloud-enabled surveillance systems, we see one virtual, integrated market, not a fragmented one. This is powerful and essential for us to continue to protect investors and promote market integrity.
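The "one virtual, integrated market" idea can be sketched as merging per-venue order streams into a single time-ordered tape and then regrouping by account, so activity an actor has dispersed across venues appears in one place. The venue names, fields, and events below are hypothetical, and this is only a conceptual illustration of the consolidation step, not FINRA's system:

```python
# Illustrative sketch of a consolidated market view: merge order streams
# from multiple venues (each already sorted by timestamp) into one tape,
# then group by account so cross-venue activity can be examined together.

import heapq
from collections import defaultdict

def consolidate(*venue_streams):
    """Merge per-venue streams into a single time-ordered tape."""
    return list(heapq.merge(*venue_streams, key=lambda o: o["ts"]))

def by_account(tape):
    grouped = defaultdict(list)
    for order in tape:
        grouped[order["acct"]].append(order)
    return grouped

# Hypothetical events: account X splits its activity across two venues.
nyse = [{"ts": 1, "venue": "NYSE", "acct": "X", "qty": 500}]
bats = [{"ts": 2, "venue": "BATS", "acct": "X", "qty": 500},
        {"ts": 3, "venue": "BATS", "acct": "Y", "qty": 100}]

tape = consolidate(nyse, bats)
activity = by_account(tape)
print(len(tape), len(activity["X"]))  # 3 events; X is visible on 2 venues
```

Viewed venue by venue, account X's two orders look unremarkable; on the consolidated tape they line up in time and can be surveilled as one pattern.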
EQ: The industry’s perception of FINRA as a technology heavyweight has certainly grown in recent years, largely due to the efforts you’ve spearheaded here. How does this wider acknowledgement as an innovator help FINRA better achieve its goals as a regulatory body?
Randich: As with most things nowadays, the more advanced the technology being used, the better. Particularly in our industry, with electronic high-frequency trading (HFT), its speed, and the massive volume of market events every day, we simply could not keep up and properly surveil this activity without sophisticated, advanced technology. In many respects, our regulatory competency is fueled by our technological prowess. For this reason, we in FINRA technology have been quite vocal in the press, at industry conferences, and elsewhere about our innovation and success in implementing advanced, pioneering technology. Today, we have dozens of financial services firms visiting us frequently to learn from our experience in migrating our most critical and data-intensive applications to the cloud.
This article originally appeared on Equities.com