

Financial institutions face major structural challenges regarding the reliability and use of ESG data. In this article, we take a closer look at the risks to data quality and share the strategic recommendations of WeeFin's team of experts.
To explore the subject in detail, WeeFin's data and ESG experts have written a comprehensive guide to help you implement an ESG data management system that is tailored to your organisation and delivers results. Download it for free here to discover our roadmap.
Unlike financial data, which has been standardised for decades, the ESG ecosystem suffers from a glaring lack of standardisation. This situation leads to divergent interpretations and compromises the comparability of analyses.
One example is the coexistence of two distinct methodologies for measuring Scope 2 emissions: market-based and location-based. The same company may have a near-zero carbon footprint under the market-based approach while retaining a significant footprint under the location-based method. A company can therefore choose whichever methodology presents it in the most flattering light, a practice that borders on greenwashing.
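To make the divergence concrete, here is a minimal sketch of the two calculations. All figures (consumption, grid factor, residual-mix factor, renewable share) are hypothetical and chosen purely for illustration.

```python
def scope2_location_based(consumption_mwh: float,
                          grid_factor_tco2_per_mwh: float) -> float:
    """Location-based: apply the average emission factor of the local grid."""
    return consumption_mwh * grid_factor_tco2_per_mwh

def scope2_market_based(consumption_mwh: float, renewable_share: float,
                        residual_factor_tco2_per_mwh: float) -> float:
    """Market-based: electricity covered by renewable certificates counts
    as zero; the remainder uses the residual-mix factor."""
    return consumption_mwh * (1 - renewable_share) * residual_factor_tco2_per_mwh

consumption = 10_000       # MWh consumed, hypothetical
grid_factor = 0.35         # tCO2e/MWh, hypothetical grid average
residual_factor = 0.45     # tCO2e/MWh, hypothetical residual mix
renewable_share = 0.98     # 98% covered by green certificates

print(scope2_location_based(consumption, grid_factor))                     # 3500.0 tCO2e
print(scope2_market_based(consumption, renewable_share, residual_factor))  # ~90 tCO2e
```

With the same physical consumption, the reported footprint differs by a factor of nearly forty depending solely on the methodology chosen.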
Financial institutions rely on specialised providers to obtain ESG data. Coverage varies significantly across both indicators and providers: for carbon footprints, for example, Scope 3 data is far less well covered than Scope 2 data.
Data from non-financial reporting is also far from perfect. Companies rely on multiple, non-binding methodological frameworks (GRI, SASB, TCFD), which limits comparability. The same metric can be calculated using different scopes with varying assumptions. The arrival of the CSRD in the European Union was intended to provide a more comprehensive and binding framework, but the Omnibus Directive unveiled in early 2025 paved the way for a significant reduction in its scope (by narrowing the scope of companies covered and significantly reducing the number of ESG indicators published). The problem is therefore likely to persist in the future.
Given this situation, implementing tools that enable methodical, automated data completion is crucial. Industrialising this processing, provided that analytical quality is preserved, is a powerful lever for optimising data management.
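One common pattern for such automated completion is a provider "waterfall": when the company-reported figure is missing, the gap is filled from the next-best source in a defined priority order, with the source recorded for auditability. The sketch below is a simplified, hypothetical illustration of this idea; the source names and priority order are assumptions, not a reference implementation.

```python
from typing import Optional

# Hypothetical priority order: reported data first, then data providers,
# then estimated values as a last resort.
PROVIDER_PRIORITY = ["reported", "provider_a", "provider_b", "estimated"]

def complete_indicator(
    values: dict[str, Optional[float]],
) -> tuple[Optional[float], Optional[str]]:
    """Return the first available value in priority order,
    along with the source used (kept for audit trails)."""
    for source in PROVIDER_PRIORITY:
        value = values.get(source)
        if value is not None:
            return value, source
    return None, None

# Scope 3 emissions for one issuer, with the reported figure missing:
scope3 = {"reported": None, "provider_a": None,
          "provider_b": 1250.0, "estimated": 1400.0}
print(complete_indicator(scope3))  # (1250.0, 'provider_b')
```

Keeping the source alongside the value makes the completion transparent rather than hiding where each figure came from.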
This is a first step that can be accompanied by other strategic actions:
The challenge: compensate for the lack of standardisation through methodical diversification of sources.
Key actions:
By allowing methods and scopes to be compared, this approach transforms the fragmentation of the ESG ecosystem into an analytical advantage.
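A simple way to exploit this diversification analytically is to cross-check providers and surface divergence explicitly rather than averaging it away. The sketch below compares two hypothetical providers' values for the same indicator; issuer codes and figures are invented for illustration.

```python
def relative_gap(a: float, b: float) -> float:
    """Relative difference between two providers' values for one issuer."""
    return abs(a - b) / max(abs(a), abs(b))

# Hypothetical Scope 2 figures (tCO2e) from two providers:
provider_a = {"AAA": 3500.0, "BBB": 410.0}
provider_b = {"AAA": 90.0,  "BBB": 400.0}

divergence = {issuer: round(relative_gap(provider_a[issuer], provider_b[issuer]), 2)
              for issuer in provider_a if issuer in provider_b}
print(divergence)  # {'AAA': 0.97, 'BBB': 0.02}
```

A large gap, as for issuer AAA here, is itself a signal: it often indicates that the providers apply different methodologies or scopes, which is precisely the information an analyst needs.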
The challenge: establish a single repository that meets standards of coverage, consistency, transparency and robustness, and that is suitable for all use cases.
Key actions:
This architecture guarantees the operational usability of the data.
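In practice, such a repository is only usable if it is guarded by quality checks. The sketch below shows two basic, hypothetical gates: a coverage ratio per indicator, and a consistency rule flagging issuers whose Scope 3 figure falls below Scope 1+2 (a common red flag for under-reported value-chain emissions). Field names and thresholds are assumptions for illustration.

```python
def coverage_ratio(records: list[dict], indicator: str) -> float:
    """Share of issuers for which the indicator is populated."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(indicator) is not None)
    return filled / len(records)

def consistency_alerts(records: list[dict]) -> list[str]:
    """Flag issuers whose Scope 3 is lower than Scope 1+2."""
    alerts = []
    for r in records:
        s12, s3 = r.get("scope_1_2"), r.get("scope_3")
        if s12 is not None and s3 is not None and s3 < s12:
            alerts.append(r["issuer"])
    return alerts

# Hypothetical repository extract (tCO2e):
repo = [
    {"issuer": "AAA", "scope_1_2": 120.0, "scope_3": 900.0},
    {"issuer": "BBB", "scope_1_2": 300.0, "scope_3": 50.0},   # suspicious
    {"issuer": "CCC", "scope_1_2": 80.0,  "scope_3": None},   # not covered
]
print(coverage_ratio(repo, "scope_3"))  # ~0.67
print(consistency_alerts(repo))         # ['BBB']
```

Running such gates at ingestion time means downstream use cases, from reporting to portfolio analysis, can trust the repository rather than re-validating the data each time.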
ESG data has not yet reached the reliability standards that modern financial institutions require. Implementing a hybrid data architecture, combining diversification of sources with the construction of a single repository, is now a strategic imperative.
This approach not only addresses current market shortcomings, but also prepares organisations for future changes in the regulatory framework. The challenge goes beyond simple compliance: it is about building the foundations for a truly sustainable and analytically robust financial system.