U.S. financial institutions have long been recognized as early adopters of technology in order to increase efficiency, better understand their own risks, and gain a competitive advantage over their peers both in the United States and internationally. Before the financial crisis, however, many firms did not have integrated data systems, so that information was often segregated by division, by office and, in some cases, even by individual trader. Consequently, many firms lacked a cohesive picture of their financial positions, risks and exposures.
As many of today’s financial services firms have come to learn, big data systems can be used to analyze and share large, complex data sets. These systems can accommodate massive amounts of data, conduct deep and thorough analysis, and interoperate with both firms’ and regulators’ systems. At NYSE Euronext, for example, daily data volumes are up 200 percent year over year, which led the exchange to implement a fine-grained analytics system that measures latency across all transactions on a daily basis. By implementing new technology, the exchange now has better intelligence, delivered more rapidly and at lower cost. Firms throughout the financial sector have seen similar results.
What has worked well in the private sector is now being adopted in the public sector. In 2010, Congress passed the Dodd-Frank Wall Street Reform and Consumer Protection Act, which included a number of provisions aimed at creating more transparent and stable financial markets. For example, Dodd-Frank includes measures designed to increase regulators’ understanding of financial markets by requiring financial institutions to improve their data collection and reporting and by giving regulatory agencies significant latitude in the types and amounts of data they collect from firms.
The impact of the massive regulatory overhaul has been felt throughout the financial industry, touching everything from transaction types and the nature of financial advice to the authorities of regulatory agencies. One of the central guiding principles of the overhaul was the need for more information. Regulators are using this information to provide a real-time view of bank performance and market conditions and improve their oversight of the financial system. Four federal financial regulatory organizations are at the heart of this effort: the Federal Reserve Board, the Securities and Exchange Commission (SEC), the Office of Financial Research (OFR) and the Consumer Financial Protection Bureau (CFPB).
The Need for Better Data
Following the 2008 crisis, the Federal Reserve Board recognized its need for more detailed data on mortgage and credit markets and launched the Risk Assessment, Data Analysis and Research (RADAR) Group. RADAR was part of an effort to help the Fed acquire and centralize a broad array of U.S. consumer credit data — credit cards, auto loans, student loans and mortgages — and make the information more broadly available to Fed staff and, in some cases, the public. Since its launch in mid-2010, RADAR has helped the Fed produce timely reports and research papers and has produced meaningful insights that inform monetary policy and bank supervision and regulation, as well as macroprudential supervision. While the data warehouse is mainly used for bank surveillance purposes, it has also proven useful in the Fed’s community development initiative.
In 2011, the Federal Reserve Bank of New York began work on a Sentiment Analysis and Social Media Monitoring Solution that enabled the bank to monitor discussions across Facebook, Twitter, blogs, YouTube, web forums and other media. The project recognized “a need for the FRBNY Communications Group to be timely and proactively aware of the reactions and opinions expressed by the general public as it relates to the Federal Reserve and its actions on a variety of subjects,” as the bank stated in a request for proposals. In essence, the Federal Reserve wanted to know what people were saying about the economy and to better understand how consumer confidence was trending.
With social media, a real-time opportunity exists to monitor local, national and even global consumer psychology. Consumer spending is said to account for roughly 70 percent of the economy, so listening to what consumers are saying — and what they are spending their money on — is important. As a result of its experience during the financial crisis, the Federal Reserve has stated that it is committed to “improving the responsiveness and flexibility of its business intelligence tools for analysis and decision-making,” and numerous business intelligence initiatives are underway.
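The kind of lexicon-based scoring that such monitoring tools often start from can be sketched in a few lines of Python. The word lists and sample posts below are invented for illustration; they are not drawn from the FRBNY’s actual (commercially procured) solution.

```python
# Minimal lexicon-based sentiment sketch. POSITIVE/NEGATIVE are
# hypothetical word lists, not any agency's actual lexicon.
POSITIVE = {"recovery", "growth", "confident", "hiring", "gains"}
NEGATIVE = {"layoffs", "default", "inflation", "recession", "worried"}

def sentiment_score(post: str) -> int:
    """Return (# positive terms) - (# negative terms) for one post."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def average_sentiment(posts) -> float:
    """Aggregate per-post scores into a single sentiment index."""
    scores = [sentiment_score(p) for p in posts]
    return sum(scores) / len(scores) if scores else 0.0

posts = [
    "hiring is up and consumers feel confident about the recovery",
    "worried about inflation and possible layoffs this quarter",
]
print(average_sentiment(posts))
```

Tracking such an index over time — rather than any single post’s score — is what makes this style of monitoring useful as a rough gauge of how confidence is trending.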
A Clearer View
The SEC is similarly beefing up its business intelligence efforts, employing data analytics and increasing the use of dashboards across the organization. For example, the SEC recently procured an analytics tool that collects real-time trade data from the exchanges and hosts its entire repository of historical market data logs on a Virtual Private Cloud (VPC). The VPC, dedicated to the SEC, provides a secure environment in which users can access the data, analyze it and build complex models.
The SEC is also planning to develop and deliver a system that allows four of its divisions to track the creation, modification and cancellation of orders in real time. The system is intended to allow the SEC to collect, store, aggregate, monitor, query, manipulate and analyze trades, quotes and orders on stocks and options, as disseminated by national securities exchanges, over-the-counter markets and alternative trading systems. Analysts within the newly created Office of Analytics and Research will be looking for patterns of disruptive activity and nefarious trading practices, intentional or accidental.
The SEC’s Electronic Data Gathering, Analysis and Retrieval (EDGAR) system, which electronically receives, processes and disseminates more than 500,000 financial statements every year, has also been going through an upgrade. Using interactive data, an investor can pull out specific information and compare it to information from other companies, performance in past years and industry averages. At the SEC, interactive data can provide investors faster access to the information they want in a form that’s easily used and can help companies prepare information more quickly and accurately. As more companies embrace interactive data, sophisticated analysis tools that are now used by financial professionals could become available to the average investor.
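What “interactive data” buys an investor can be illustrated with a minimal Python sketch. The company names, tags and figures below are hypothetical stand-ins for XBRL-style tagged filing concepts, not real EDGAR data.

```python
# Hypothetical tagged filings: each concept (e.g., "Revenues") carries
# the same machine-readable tag across companies, so values can be
# pulled and compared directly instead of re-keyed from PDFs.
filings = {
    "CompanyA": {"Revenues": 120.0, "NetIncome": 15.0},
    "CompanyB": {"Revenues": 300.0, "NetIncome": 24.0},
}

def compare_concept(filings, concept):
    """Pull one tagged concept from each filing for side-by-side comparison."""
    return {name: data.get(concept) for name, data in filings.items()}

def net_margin(data):
    """Derive a ratio from two tagged concepts in a single filing."""
    return data["NetIncome"] / data["Revenues"]

print(compare_concept(filings, "Revenues"))
print({name: round(net_margin(d), 3) for name, d in filings.items()})
```

The point is that consistent tagging does the hard work: once every filer uses the same tags, cross-company comparison reduces to a dictionary lookup.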
In the years leading up to the financial crisis, policymakers and investors lacked sufficient data to anticipate emerging threats to financial stability or assess how shocks to one financial firm could impact the whole system. Accordingly, Dodd-Frank established the Office of Financial Research (OFR) within the Treasury Department to improve the quality of financial data available to policymakers and facilitate more robust and sophisticated analysis of the financial system.
OFR is empowered to collect “all data necessary” from financial companies, including banks and private equity firms. OFR’s mandate includes providing critical information and analytical tools to anticipate and respond to future emerging vulnerabilities; making it easier to aggregate and organize data; and maximizing data efficiency and security.
OFR operates a data center to standardize, validate and maintain the data necessary to help regulators identify vulnerabilities in the system as a whole. It also runs a Research and Analysis Center to conduct, coordinate and sponsor research to support and improve regulation of financial firms and markets. These data and analytical capabilities are intended to help policymakers and regulators, including the newly created Financial Stability Oversight Council (FSOC), as well as to promote financial stability and enhance market discipline.
The OFR has already begun to support the FSOC and its member agencies by providing analyses and data-related services. For example, the OFR is providing FSOC with data and analysis related to the designation of nonbank financial companies for consolidated supervision by the Federal Reserve Board. The OFR is also actively working with the FSOC to develop and maintain an initial “dashboard” of metrics and indicators related to financial stability.
In December 2012, the OFR and FSOC hosted a conference titled “The Macroprudential Toolkit: Measurement and Analysis.” The conference brought together thought leaders from the financial regulatory community, academia, public interest groups and the financial services industry to discuss issues related to data, technology and analytical approaches for assessing, monitoring and mitigating threats to financial stability. The OFR is also promoting stronger data-related standards to improve the quality and scope of financial data, which in turn should help regulators and market participants mitigate risks to the financial system. Such standards will also help firms link and aggregate information more easily and allow them to use the same basic data for reporting to regulators and for managing their businesses, providing important efficiencies and cost savings.
21st Century Tools
The basic idea underlying the OFR’s mandate is that better data and analysis can support the design of stronger financial shock absorbers and guardrails to reduce the risk of crises. They can also support earlier warning and effective responses to reduce the effects of crises when they occur, and help draw lessons for the future, which fits with the CFPB’s mission. Elizabeth Warren, former special adviser for the CFPB and now a U.S. senator representing Massachusetts, proposed that “a 21st-century agency should use 21st-century tools” and promised that the CFPB would employ innovative technologies to advance its goals. As the first federal “start-up agency” in a generation, the CFPB has already begun several large-scale data collection efforts on topics ranging from credit cards to mortgages to student loans.
Since opening its doors in July 2011, the CFPB has launched numerous consumer complaint databases and plans to collect data from large and small financial firms and rely on “crowdsourcing” — using technology to gain input from large groups of people — to inform its oversight efforts. As part of its Project Catalyst, the CFPB now makes consumer credit card complaint information available to the public through the bureau’s Consumer Complaint Database. In addition, the CFPB is able to identify patterns of deceptive and unfair sales and billing practices that have been reported by consumers. The agency can then use its crowd-sourced big-data analytics to alert credit and debit cardholders to similar charges on their cards and help them resolve the problem.
This treasure trove of data also allows the CFPB to observe trends in consumer complaints and resolution and act on them. For example, “credit card protection” is so far the 12th-most prevalent type of complaint overall, accounting for 3.6 percent of all complaints filed. Based on this data, the CFPB found that at every stage of the consumer experience — from advertising to enrollment to payment to collection — several companies had violated consumer financial laws. The agency brought actions against those companies for deceptive marketing practices.
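The share-and-rank arithmetic behind figures like the 3.6 percent cited above is straightforward to sketch. The complaint categories and counts below are invented for illustration; they do not reflect actual CFPB database contents.

```python
from collections import Counter

# Hypothetical complaint records, sketching how category shares and
# ranks are computed from a raw complaint feed.
complaints = (
    ["billing dispute"] * 40
    + ["credit card protection"] * 9
    + ["late fees"] * 26
    + ["apr change"] * 25
)

counts = Counter(complaints)
total = sum(counts.values())
ranked = counts.most_common()  # categories ordered by prevalence

for rank, (category, n) in enumerate(ranked, start=1):
    share = 100 * n / total
    print(f"{rank}. {category}: {share:.1f}%")
```

Running the same tally over a rolling window is what lets the agency see a category's rank or share shift over time and decide where to look next.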
On the bright side, those same companies can leverage the CFPB’s consumer complaint database to see where they stand in relation to competitors. And other companies can monitor the database in order to identify compliance issues and take action to avoid similar penalties, which could help them to retain customers, strengthen relationships and avoid fines.
The CFPB has also set up a Social Networks and Citizens Engagement System that, according to the CFPB, “will enable the CFPB to interact with the public in effective and meaningful ways, encourage the wide ranging sharing of consumer financial information and the strengthening of an online community of consumers, and ensure that critical information about the agency and key consumer finance issues is distributed.” According to Sen. Warren, “real-time data collection will be essential, both for the agency to serve as an effective cop on the beat, and for giving third parties a chance to glean insights from the data quickly enough to be useful.”
Going forward, federal financial regulators must continue to leverage new and emerging technologies that will enable them to better aggregate and understand the vast wealth of financial and market information that is available today. As our financial system becomes increasingly complex, data aggregation and storage challenges will only continue to grow. In order to fulfill their oversight, supervision and enforcement responsibilities effectively, regulators must rapidly adopt new technologies so that problems in the economy can be identified sooner rather than later — thereby helping to forestall the next financial crisis.