
Remarks by Under Secretary for Domestic Finance Nellie Liang at the Brookings Institution

Introduction

A key lesson from the global financial crisis is that opacity about critical markets and institutions, resulting from a lack of high-quality data, can contribute to financial instability. And, because different financial regulators have visibility into different market segments, detecting and addressing financial stability vulnerabilities requires close coordination and information sharing among regulators. Simply put, in a dynamic, interconnected economy such as ours, regulators cannot effectively safeguard financial stability or respond to crises if they do not have good data and are not talking to one another.

Congress took this lesson to heart when it passed the landmark Dodd-Frank Wall Street Reform and Consumer Protection Act in the summer of 2010. Among many important reforms, the Act created the Financial Stability Oversight Council (FSOC or Council) to identify risks to U.S. financial stability, and established the Office of Financial Research (OFR) to support the Council and member agencies in collecting and standardizing financial data.

In my remarks today, I will talk about how Treasury, with FSOC and OFR, has been approaching data collection, standardization, and risk measurement to safeguard financial stability. I’ll use ongoing work in two important areas – Treasury market resilience and the financial sector’s climate risks – to illustrate. The financial regulatory community has made tremendous progress in reducing opacity since the financial crisis and is poised to advance further thanks to investments in data and analytic infrastructure. This progress helps regulators, the regulated firms themselves, and, more broadly, the public.

Data and the FSOC’s New Analytic Framework

Last month, the Council issued its new Analytic Framework, which explains to the public how it uses regulatory data and data from other sources to monitor risks to financial stability.1 The Framework details key vulnerabilities and transmission channels that most commonly contribute to risks to financial stability, as well as sample quantitative metrics.

Vulnerabilities include, of course, leverage, liquidity and maturity mismatch, and complexity and opacity, among others, all of which have been well studied. The Framework’s discussion highlights how data gaps, such as those arising from a lack of regulatory or public disclosures or from difficulties in determining the interactions of market participants, may exacerbate complexity and opacity. More broadly, the Framework makes clear that filling significant data gaps in order to better evaluate vulnerabilities is an integral part of the Council’s risk identification and assessment process.

To be sure, in the 13 years since the Dodd-Frank Act became law, financial regulators, on their own and working with the Council and OFR, have improved the quantity and quality of data, increasing visibility into some previously opaque market segments. To list a few examples: the Federal Reserve now collects highly granular loan-level data from the largest banks, as well as resolution plans with key interlinkage data; the SEC has significantly expanded the detail on assets in money market mutual funds and open-end funds, and is collecting more timely data on private funds through Form PF; and the CFTC and SEC now have access to transaction-level data on swap trades reported to registered trade repositories.

Clearly, reliable data are a necessary input into our assessment of financial stability risks, and a lack of transparency may exacerbate those risks. But we know that data production is not costless, both for the entities that report the data and for the regulators who maintain them. So how can we ensure we get the greatest “bang for the buck” from the data that we rely on? I believe we can most effectively leverage our resources when we adopt a holistic, end-to-end approach to data collection and standardization.

This approach starts with a rigorous process for identifying data gaps and unmet needs in monitoring vulnerabilities, such as the one described in the FSOC Framework. Having identified a gap that needs to be filled, regulators or the OFR actively consult with relevant stakeholders to develop a data collection process that addresses regulators’ objectives in a way that is efficient and avoids unnecessary burdens on reporting entities. By being careful in the way we design data collections, for example by relying on open standards for data reporting formats and variable definitions, we can often improve the efficiency of data sharing and interoperability among regulators.
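To make the point about open standards concrete, here is a minimal, purely illustrative sketch of a reporting record built on widely used open standards – an LEI for entity identification, ISO 8601 dates, and ISO 4217 currency codes. The field names, structure, and values are invented for illustration and do not depict any actual regulatory collection.

```python
from dataclasses import dataclass, asdict
import json

# Illustrative only: a reporting record whose identifiers and formats follow
# open standards, so any receiving agency can parse and join the data without
# a bespoke translation layer. Field names and values are hypothetical.

@dataclass
class ExposureRecord:
    reporter_lei: str      # Legal Entity Identifier (ISO 17442)
    counterparty_lei: str  # the same open identifier enables cross-agency joins
    as_of_date: str        # ISO 8601 date, e.g. "2023-11-15"
    currency: str          # ISO 4217 currency code, e.g. "USD"
    notional: float        # reported exposure amount

record = ExposureRecord(
    reporter_lei="5493001KJTIIGC8Y1R12",    # LEI-format example, not a real filer
    counterparty_lei="549300MLUDYVRQOOXS22",
    as_of_date="2023-11-15",
    currency="USD",
    notional=25_000_000.0,
)

# Serializing to a common machine-readable format keeps the collection
# interoperable across agencies and reporting systems.
print(json.dumps(asdict(record), indent=2))
```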

Once we’ve determined what must be reported, we need to deploy robust infrastructure to onboard, maintain, warehouse, and analyze the data. This infrastructure includes not only physical data storage and server capacity, but also data governance structures that enable us to protect confidential data while facilitating information sharing among regulators with a “need to know.” Finally, to the extent feasible, I believe there is great value in making as much data as possible available to the public, subject to the need to protect privacy and intellectual property, prevent market distortions, and other constraints.

Providing data to the public has a variety of benefits. I am perhaps “preaching to the choir,” but academics and practitioners outside the regulatory community have much to contribute to the discourse on financial stability, and access to better data raises the quality of that discourse. And it’s been my experience that the more people who use a particular dataset, the more quickly its quality improves over time, as users identify weaknesses and suggest improvements. Of course, there is also a vast academic literature on the effects of public transparency. This literature has documented many benefits, including lower costs, improved liquidity, and better price discovery, but there may be unintended consequences as well, and we need to be cognizant of these in determining what and how much information to disclose.

To illustrate this holistic approach in practice, I’ll turn now to two current data initiatives, Treasury market resilience and climate-related financial risks.

Current Data Initiatives

Treasury Markets

The $26 trillion U.S. Treasury market is the deepest and most liquid market in the world, and is the foundation for global financial markets. Ensuring that it functions well, particularly during times of stress, is fundamental to safeguarding financial stability. Greater data transparency and new data collections are critical parts of our efforts to improve Treasury market resilience.

Secondary Market Transparency

During the March 2020 “dash for cash” episode, when Treasury markets were disrupted at the onset of the COVID pandemic, the data on Treasury market trading volumes available to the public were very limited. Only weekly data on trading volumes were publicly disclosed, and with a one-week lag. It’s hard to assess how well a market is functioning during stress periods if you cannot match price data to contemporaneous quantity data, so this lack of transparency was a serious gap.

Earlier this year, the Financial Industry Regulatory Authority (FINRA), working with Treasury, replaced the weekly reports on secondary market trading with daily reports. These reports also provide information on trade counts and volume-weighted average prices for on-the-run nominal coupon securities. Coincidentally, this greater transparency became available just weeks before the regional banking turmoil in March, allowing market participants to benefit from enhanced information about Treasury securities market activity during that difficult time.

The Treasury Department has been working further with FINRA to enable public dissemination of secondary market trading at the transaction level. We have consulted closely with market participants and other stakeholders on what data could be disclosed without compromising market functioning. We have proposed to begin by disclosing transaction-level data for on-the-run nominal coupon securities. The disclosures would apply caps on disclosed trade sizes and a modest time delay to prevent strategic front-running and other distortionary behavior that might reduce market integrity. Last month, FINRA submitted its proposed rule filing to the SEC to move forward with transaction-level dissemination. If the rule goes forward, we will assess over time the effects of these disclosures for on-the-run securities and consider possible steps to provide additional transparency.
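For intuition on the two safeguards just described, here is a purely illustrative sketch of how a trade-size cap and a publication delay might be applied before a trade is disseminated. The cap level, delay length, field names, and example identifiers are hypothetical assumptions for illustration, not the parameters in FINRA’s actual rule filing.

```python
from datetime import datetime, timedelta

# Hypothetical dissemination parameters (not FINRA's actual values).
SIZE_CAP = 250_000_000                      # cap on disclosed trade size, in dollars
PUBLICATION_DELAY = timedelta(minutes=15)   # delay before public release

def prepare_for_dissemination(trade: dict) -> dict:
    """Return the publicly disseminated version of a reported trade."""
    return {
        "cusip": trade["cusip"],
        "price": trade["price"],
        # Sizes above the cap are shown as "250,000,000+" rather than the exact
        # amount, limiting information leakage that could enable front-running.
        "size": f"{SIZE_CAP:,}+" if trade["size"] > SIZE_CAP
                else f"{trade['size']:,}",
        # Publication is delayed relative to execution time.
        "publish_at": trade["executed_at"] + PUBLICATION_DELAY,
    }

trade = {
    "cusip": "91282CJK8",  # illustrative identifier, not a specific security
    "price": 99.84,
    "size": 400_000_000,
    "executed_at": datetime(2023, 11, 15, 10, 30),
}
print(prepare_for_dissemination(trade))
```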

Non-Centrally Cleared Bilateral Repo Markets

We have also made significant progress in improving regulators’ visibility into the Treasury repo market. A vibrant Treasury repo market is an important prerequisite for a deep and liquid Treasury securities market. But as we saw during the global financial crisis, repo markets can be subject to run dynamics. Financial regulators have done much to improve visibility into this market since then, but gaps remain, and we are working to fill them.

The Federal Reserve now collects data on tri-party repo transactions and the OFR collects data on centrally cleared repo transactions, but currently no one systematically collects data on the non-centrally cleared bilateral repo (NCCBR) market. At roughly $2 trillion, this is the largest segment of the Treasury repo market. It consists of dealer-client transactions, which may feature significant borrowing by leveraged actors such as hedge funds.

The OFR turned its attention to closing this gap and conducted a pilot collection in 2022. In January of this year, the OFR proposed a rule to begin collecting NCCBR data on an ongoing basis. The proposed collection would provide daily transaction-level information from an estimated forty financial companies. These data would allow regulators to monitor the NCCBR market in near-real time, helping them identify emerging market vulnerabilities and track developments in repo markets that might indicate stresses elsewhere in the financial system. The OFR received over 30 comments on its proposed rule, representing a range of stakeholders from industry associations to individual household investors. Public feedback is being used to refine the final rule, which the OFR expects to publish in the coming months.

The NCCBR data collection will require the OFR to manage data submissions from reporters on a daily basis. To support the proposed NCCBR data collection and other projects, the OFR has developed a Data Collection Utility (DCU). The DCU is an efficient, cost-effective tool for data collections that will allow the OFR to securely collect and store business-confidential data and large-scale bespoke collections. The utility will provide flexibility for financial industry participants and other data reporters, allowing for both automated and manual submissions. Looking beyond the NCCBR data collection, the utility also positions the OFR to respond quickly to the FSOC’s evolving data needs through pilots, surveys, and other ongoing collections. The OFR’s Data Collection Utility will become operational in early 2024.

Climate Risk Data Initiatives

A second area where Treasury is working to improve financial data and analysis is risks to the financial sector from climate change. Some of the climate data needed to measure risks – such as information on acute physical risks like hurricanes and wildfires, and on more chronic physical risks like protracted excessive heat or sea-level rise – already exist. Other data, such as the granular emissions information on borrowers used to help identify financial firms’ exposures to climate transition risks, are still being developed.

Joint Analysis Data Environment (JADE)

For regulators and researchers accustomed to analyzing more standardized financial data, the size, complexity, and format of many climate datasets make them difficult to work with. In 2022, the OFR launched a pilot program to assess the feasibility of a collaborative research environment with data and analytical tools for assessing climate-related financial risks for FSOC member agencies. Following the successful completion of the pilot, the project was made permanent as the Joint Analysis Data Environment, or JADE.

JADE is a high-performance computing platform that will enable collaborative, interdisciplinary research by providing FSOC member agencies access to analysis-ready data and analytical software in a secure, cloud-based environment. JADE integrates processes and protocols to ensure data privacy, based on those the OFR has developed and refined over the years. These processes and protocols ensure that only those with the proper permissions, in agreement with the providing agency, are allowed access to the data on the platform. Currently, JADE hosts a mix of publicly available climate and financial data. For example, it includes information on the geography of flood and wildfire hazards, as well as information on the geographic distribution of mortgage loans exposed to those hazards. Providing these resources in a collaborative space enables FSOC member agencies to focus on the research, rather than individually obtaining the resources necessary to conduct their work.
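As a simple illustration of the kind of geographic analysis this enables, the sketch below joins a hypothetical wildfire-hazard table with a hypothetical table of mortgage balances, keyed on county, to estimate lending exposure in high-hazard areas. The tables, column names, and figures are all invented; this is not JADE’s actual data or tooling.

```python
import pandas as pd

# Invented hazard ratings by county (FIPS code).
hazards = pd.DataFrame({
    "county_fips": ["06037", "06073", "48201"],
    "wildfire_risk": ["high", "moderate", "low"],
})

# Invented outstanding mortgage balances by county, in billions of dollars.
mortgages = pd.DataFrame({
    "county_fips": ["06037", "06073", "48201"],
    "mortgage_balance_bn": [310.0, 145.0, 120.0],
})

# Join hazard geography to loan geography, then total the balances
# sitting in counties rated high-risk.
exposure = hazards.merge(mortgages, on="county_fips")
high_risk = exposure.loc[
    exposure["wildfire_risk"] == "high", "mortgage_balance_bn"
].sum()
print(f"Mortgage balances in high-wildfire-risk counties: ${high_risk}bn")
```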

Currently, in addition to the two pilot participants, the Federal Reserve Bank of New York and the Federal Reserve Board, FSOC and the Office of the Comptroller of the Currency have access to the platform, and there are plans for additional FSOC member agencies to join in the near future. In addition, the OFR is planning for JADE to accommodate other data and analytical software and to support research on other financial stability topics as well.

Federal Insurance Office Data Collection

We are also working to expand the data we collect to fill an important gap in our ability to evaluate and monitor the impact of climate-related hazards on households and real estate markets, and the associated risks to financial stability.

We’ve all seen the headlines about spiking homeowners’ insurance premiums or limited insurance availability in places like California and the Gulf Coast that are struggling to cope with the rising economic costs of wildfires, hurricanes, and other climate-related hazards. Many physical climate hazards tend to have highly localized effects, and the risks from these hazards depend on local geography, as well as migration and development patterns. To understand how physical climate risks affect property insurance markets, regulators and researchers need to know how the availability of, prices of, and loss claims on insurance policies vary across local geographies, but current data are typically available only at the state level.

To address this gap, in October 2022 the Treasury Department’s Federal Insurance Office (FIO) sought public comment on a proposal to collect geographically granular data on multi-peril homeowners’ insurance.  Since then, FIO has carefully reviewed comments received and engaged with numerous stakeholders, including the National Association of Insurance Commissioners (NAIC), state insurance regulators, insurance companies, and consumer groups.  Based on these discussions, FIO streamlined and refined its data collection request to facilitate a more effective implementation of this first-ever data collection. 

Last month, FIO announced that it would seek formal OMB approval to begin collecting insurance data at the ZIP Code level on a consistent, granular, and comparable basis from the largest homeowners’ insurance providers, which collectively underwrite around 70% of homeowners’ insurance premiums nationwide. These data will help regulators and analysts better understand how physical climate risks influence insurance coverage and availability for homeowners, as well as potential knock-on effects for mortgage lenders, real estate markets, and financial stability.
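One example of the kind of analysis a ZIP-Code-level collection would support is computing loss ratios (claims paid relative to premiums earned) by ZIP Code, to see where climate-related losses are outpacing premiums. The sketch below is illustrative only; all figures and column names are invented and do not reflect FIO’s actual data template.

```python
import pandas as pd

# Invented policy-level figures, in dollars, for two illustrative ZIP Codes.
policies = pd.DataFrame({
    "zip_code": ["90210", "90210", "70112", "70112"],
    "premiums_earned": [12_000, 15_000, 9_000, 8_500],
    "claims_paid": [3_000, 40_000, 11_000, 2_500],
})

# Aggregate to the ZIP Code level, then compute the loss ratio:
# claims paid divided by premiums earned.
by_zip = policies.groupby("zip_code").sum()
by_zip["loss_ratio"] = by_zip["claims_paid"] / by_zip["premiums_earned"]
print(by_zip)
```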

Conclusion

To conclude, let me emphasize that we at Treasury and in the broader financial regulatory community understand the critical importance of data to our financial stability mission. Collecting and maintaining data for analysis is a critical part of that work because we can’t manage risks if we can’t effectively measure them. The examples I’ve presented today of ways we’re using data and infrastructure to improve the resilience of Treasury markets and better understand climate-related financial risks demonstrate our holistic approach to data management: we identify data gaps; consult closely with stakeholders to develop data collections; build infrastructure to collect, secure, and analyze data; and work to share data among regulators and, where practical, with the public. In this way, we are working to improve both the extensive margin – by expanding the data we collect – and the intensive margin – by using our data more efficiently – to advance our financial stability mission.

###

 


1 The Council unanimously voted to issue its new Analytic Framework for Financial Stability Risk Identification, Assessment, and Response (the Framework): https://home.treasury.gov/system/files/261/Analytic-Framework-for-Financial%20Stability-Risk-Identification-Assessment-and-Response.pdf

