Private Data Providers May ‘Over-Claim’ Ability to Predict Local Climate Impacts

Experts in a panel discussion on Monday addressed claims that the booming private sector climate data services industry is over-claiming its accuracy and failing to deliver equitable, reliable, or transparent datasets.

Hosted by Climate & Capital Media, the event brought together global industry experts, including former U.S. deputy treasury secretary Sarah Bloom Raskin, to discuss the accelerating private sector “climate intelligence arms race”.

Madison Condon, of the Boston University School of Law, opened the panel by presenting her paper, Climate Services: The Business of Physical Risk. Condon’s research has revealed that private climate data analytics firms may be over-claiming the utility of “downscaling” techniques.

Downscaling is the bottom-up process of using location-specific information to flesh out aggregate geographical data from publicly available global climate models. It aims to provide greater granularity and more usable information on physical climate risks.

However, Condon argued that downscaling often results in claims that “aren’t particularly scientifically rigorous”. To improve the quality of data, she said more investment is needed in resource-intensive catastrophe modelling techniques.

The panel—which included Climate & Capital Media’s climate editor Blair Palese and moderator Kate Mackenzie—considered how to reform the industry to make it fit for purpose.

The climate intelligence arms race is currently dominated by a small number of corporate players that acquired many early entrants in the space. Palese highlighted the importance of moving away from a corporate black box structure toward transparent, useful, and accessible analytics.

Raskin expressed concern over findings that climate analytics are often not validated, authenticated, or complete, stating that she “shudders to think what policy looks like when models being used are not incorporating full climate data”.

In her paper, Condon proposes creating publicly-owned, open-source national climate data services. The approach would require federal, state or provincial, and international agencies to invest in and help build datasets that can be used to integrate climate modelling into financial risk analysis.

Raskin endorsed Condon’s concept of a public sector “climate hub”. To enable regulators to price climate risk, Raskin emphasized that the hub should include credible and usable information regarding the exposures and vulnerabilities of assets, communities, or municipalities.

The underrepresentation of climate scientists in building climate finance models was another issue highlighted in the discussion. Panelists stressed the need for interdisciplinary approaches to produce robust data on physical risks.

Citing the U.S. Federal Reserve’s hurricane stress test released early this year, Condon said the Fed’s failure to include climate scientists resulted in an “expertise breakdown”.

“How did a bunch of economists think that they could design hurricane stress tests?” she asked.

Palese said practitioners are “very early in the stages” of developing climate analytics, even though climate-induced extreme weather is increasing globally. She pointed to the need for advancements internationally as well as nationally.

This requires planning for “what that risk looks like in places where the impacts are most intense”, such as in the Pacific Islands region. Planning must ensure that affected regions have equal access to risk data, she added, because “they are getting hit hardest and have done the least to create this problem”.
