THETRADETECH DAILY THE OFFICIAL NEWSPAPER OF TRADETECH 2025

Buy-side must prioritise needs before wants when it comes to data

The TRADE sits down with Simon Hordum Bonde, head of trading quants at Nordea Asset Management, to discuss what’s front of mind when it comes to continued data consolidation, the ongoing pursuit of high quality, and the impact of third-party data on trading processes.

What advice would you give those looking to consolidate their transaction data?
Begin with a thorough analysis of your scope and objectives. Clearly define the problems you’re trying to solve through data consolidation and what capabilities you want to establish. Consider whether you’re focusing on a single asset class like equities or expanding to cross-asset and derivatives, as this affects the number of data sources and adds complexity to the consolidation. Prioritise need-to-have versus nice-to-have capabilities, such as intraday updates for real-time decision-making versus a one- or two-day delay in data that is only used for quarterly assessments.

Don’t underestimate the ongoing effort and cost of maintaining data pipelines and ensuring data quality. Budget accordingly, as internal management requires significant investment, especially with multiple data sources and pipelines. Consider leveraging external providers, as this might be the easiest way to get going on a small budget, but be aware that this may result in less control, oversight, and flexibility for future changes.

Assess your team’s technical capabilities and resources. Consolidating transaction data often requires expertise in data engineering, database management, and analytics. If these skills are lacking internally, factor in the cost of hiring or training.

How important is it to supplement this with third-party and other market data? What does this look like empirically?
Supplementing transaction data with third-party and market data is crucial for comprehensive analysis and informed decision-making. Most firms have access to transaction cost analysis (TCA) of their data, as it’s challenging to benchmark transactions without market data. While request-for-quote (RFQ) protocols might allow for some analysis without additional market data due to their auction nature, it is generally difficult to be truly data-driven without external data sources.

Empirically, we use external TCA providers to enrich our data across all available asset classes. This enhancement improves trading desk operations, interactions with portfolio managers, and broker relationships. Third-party peer data comparison services offer valuable relative performance insights that can only be obtained through external providers. We’re also exploring execution-related data such as indications of interest (IOIs), particularly in the fixed income space, to get a sense of the liquidity landscape before going into the market.

The importance of additional data is evident in its impact on our decision-making process both pre- and post-trade. For instance, externally provided cost models and market data allow for more precise evaluation of execution quality, helping identify opportunities for improvement. Having a proper TCA solution available has helped us change the conversations we are having with our brokers. We have been able to steer the conversation more and improve performance this way.
When it comes to maintaining a high quality of data, what should be front of mind?
Data visibility is paramount – if no one is looking at the data, it’s likely to be incorrect. Our goal has been to democratise data, making it accessible to as many users as possible. This approach serves multiple purposes: it enables employees to explore data and enhance their data management skills, fosters data-driven decision-making, and improves data quality by identifying issues earlier and more frequently.
Continuous monitoring of data pipelines and data state is essential to ensure completeness and accuracy. With multiple data providers and sources, fields inevitably contain noisy content. Balancing the number and types of quality checks is crucial – too many tests can lead to an overwhelming number of failures, so it’s important to prioritise and potentially flag failures only when a certain fraction of the data is impacted.
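The threshold idea above – alerting only when a certain fraction of records fails a check rather than on every noisy field – can be sketched as follows. This is an illustrative example, not a description of any firm's actual monitoring system; the check name, threshold value, and field are hypothetical.

```python
# Illustrative sketch of a fraction-based data-quality check: a check only
# "fails" (alerts) when the share of bad rows crosses a tolerance threshold,
# so isolated noisy records don't flood the team with alerts.
from dataclasses import dataclass

@dataclass
class QualityCheck:
    name: str
    failure_threshold: float  # fraction of bad rows tolerated before alerting

    def passes(self, total_rows: int, bad_rows: int) -> bool:
        """Return True if the bad-row fraction is below the threshold."""
        if total_rows == 0:
            return False  # an empty feed is itself a completeness failure
        return bad_rows / total_rows < self.failure_threshold

# Hypothetical example: tolerate up to 2% of trades missing a venue code
venue_check = QualityCheck(name="missing_venue", failure_threshold=0.02)
print(venue_check.passes(total_rows=10_000, bad_rows=150))  # 1.5% bad -> True
print(venue_check.passes(total_rows=10_000, bad_rows=300))  # 3.0% bad -> False
```

Tuning the threshold per check is the prioritisation step: critical fields get a tight tolerance, known-noisy ones a looser one.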
While external systems and data providers often highlight data normalisation as a selling point, it’s important to thoroughly evaluate and test these claims. In our experience, we have found the normalisation provided by external sources to be lacking in some areas, often requiring additional in-house efforts to meet our specific needs. This consideration also supports using fewer providers and simpler data pipelines to reduce the overhead of data normalisation.
Implementing a robust data governance framework is vital. This includes clear data ownership and well-defined data quality standards. Additionally, fostering a culture of data quality awareness across the organisation can significantly contribute to maintaining high-quality data, but in practice this is difficult unless it is enforced by the systems themselves, e.g. offering a list of pre-defined options instead of free-text fields.
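The point about enforcing standards in the system rather than relying on culture can be made concrete with a small sketch. The field name and the allowed values below are purely hypothetical illustrations of validating entries against a pre-defined option list instead of accepting free text.

```python
# Hypothetical sketch: enforce data-entry standards at the system level by
# validating a field against a pre-defined set of options rather than
# accepting arbitrary free text.
ALLOWED_TRADE_REASONS = {"rebalance", "inflow", "outflow", "risk_reduction"}

def validate_trade_reason(value: str) -> str:
    """Normalise the entry and reject anything outside the allowed set."""
    normalised = value.strip().lower().replace(" ", "_")
    if normalised not in ALLOWED_TRADE_REASONS:
        raise ValueError(
            f"{value!r} is not a recognised trade reason; "
            f"choose one of: {sorted(ALLOWED_TRADE_REASONS)}"
        )
    return normalised

print(validate_trade_reason("Rebalance"))  # -> 'rebalance'
```

Because invalid values are rejected at entry time, downstream quality checks no longer have to guess at the meaning of free-text variants.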
Are you an advocate of in-house solutions in this space, or an advocate of third-party providers?
I generally advocate for a ‘buy before build’ approach, especially for services like transaction cost analysis data. The pre-trade and portfolio implementation spaces are generally where we see more benefit in developing bespoke solutions.
However, we have found value in building internal data pipelines and data models to enhance the quality, timeliness, and flexibility of our data. This approach allows us to tailor our data infrastructure to our specific needs and workflows. Moreover, by exposing our data internally to multiple teams, we achieve a level of convenience and integration that would be challenging if we relied solely on external providers’ platforms.
The decision between in-house solutions and third-party providers often depends on factors such as the organisation’s size, available resources, and specific requirements. In-house solutions offer greater control and customisation but require significant investment in technology and expertise. Third-party providers can offer cost-effective, ready-to-use solutions but may lack the flexibility of custom-built systems.
Our strategy involves a hybrid approach, leveraging third-party providers for standardised analyses and data sets while developing in-house capabilities for areas where we need more control or have unique requirements. This balanced approach allows us to benefit from external expertise while maintaining the ability to innovate and adapt quickly to our evolving needs, but it’s definitely not the cheapest approach.