
Why Banks Are Only as Good as the Data That Underpins Their Trading

Imagine this scenario: the head of equities trading at one of the world’s largest investment banks is trying to navigate clients through the worst trading day since the 2008 financial crisis. $125bn is wiped off the value of major US firms, while the Dow tumbles more than 2,000 points after a trading halt and the biggest oil crash in nearly 30 years. Then, as if the pincer movement of an oil dispute between Russia and Saudi Arabia combined with an unprecedented world health crisis were not enough to think about, the same head of equities is frantically trying to get hold of someone in the back office to figure out why there are two listings for Delta Air Lines stock in his trading system.

The stock is sinking like a stone and the head of equities desperately wants to get out of a long position. Unfortunately, there is confusion as to exactly why there are duplicate positions for Delta Air Lines, with no indication of which one will trade without error or break.

While this may seem like a very specific problem, it is the last thing any head of desk needs in the middle of a market meltdown. The problem lies in the fact that the data underpinning Delta’s stock price is sourced from many systems, which often leads to duplications and inaccuracies. Someone in the back office may have wanted to correct the error, but they had no way of doing so before it arrived at the front office. Nor could they have corrected it in every system, because they lack the visibility to do so. By the time the head of equities finds out which instrument to act on, it will be too late, as Delta Air Lines may well have plummeted further.

The trouble is that, far from being a one-off, this is a typical example of a problem heads of desk face on a daily basis. The equity data underpinning the Delta Air Lines stock, or any stock for that matter, is present across multiple data management systems at a tier-one bank. Before the information reaches the front office, numerous checks are carried out not just across equities, but across corporate and government bonds, FX, and listed derivatives, to name just a few.
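
To make the “two Deltas” problem concrete, the sketch below shows, in Python, the kind of duplicate detection a bank could run before records reach a trading screen. The system names, fields and sample values are illustrative assumptions, not any particular vendor’s schema.

```python
# A minimal sketch of detecting the "two Deltas" problem: the same instrument
# (keyed here by ISIN) appearing as separate records across source systems.
# System names, field names and sample values are illustrative assumptions.
from collections import defaultdict

positions = [
    {"system": "front_office", "isin": "US2473617023", "ticker": "DAL",   "qty": 50_000},
    {"system": "back_office",  "isin": "US2473617023", "ticker": "DAL.N", "qty": 50_000},
]

def duplicate_instruments(positions: list) -> dict:
    """Group records by ISIN and return instruments that appear more than once."""
    by_isin = defaultdict(list)
    for pos in positions:
        by_isin[pos["isin"]].append(pos)
    return {isin: recs for isin, recs in by_isin.items() if len(recs) > 1}

# Flag duplicates before they ever reach a trading screen.
print(duplicate_instruments(positions))
```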


In some cases, there are over thirty different asset classes, often with close to three hundred critical data elements: the essential data needed to trade, settle and report in a compliant way. With millions of securities involved, and data elements managed inconsistently across departments and systems, it is hardly surprising that errors abound by the time the information reaches the trading desk.
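
What does validating those critical data elements look like in practice? The sketch below is one simplified illustration, assuming a hypothetical mapping of asset classes to required elements; real taxonomies run to hundreds of elements per asset class.

```python
# A minimal sketch of a completeness check for critical data elements (CDEs).
# The asset classes and CDE names below are illustrative assumptions, not a
# standard taxonomy.

# Hypothetical mapping of asset class -> the CDEs required to trade, settle
# and report that instrument in a compliant way.
CRITICAL_DATA_ELEMENTS = {
    "equity": {"isin", "ticker", "exchange", "currency", "issuer"},
    "corporate_bond": {"isin", "coupon", "maturity_date", "currency", "issuer"},
    "listed_derivative": {"underlying_isin", "expiry", "contract_size", "exchange"},
}

def missing_cdes(record: dict, asset_class: str) -> set:
    """Return the critical data elements absent or empty in a security record."""
    required = CRITICAL_DATA_ELEMENTS.get(asset_class, set())
    return {cde for cde in required if not record.get(cde)}

# An equity record missing its exchange and currency is flagged before it
# ever reaches the desk.
record = {"isin": "US2473617023", "ticker": "DAL", "issuer": "Delta Air Lines"}
print(missing_cdes(record, "equity"))  # -> {'exchange', 'currency'}
```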

This is why, with current volatility placing a greater emphasis on making trading decisions based on accurate data, banks are looking at ways to measure and validate the quality of their data for common attributes across all asset classes. For example, no trading desk can afford to have the same internal identifier code attached to both Google and Microsoft. Both are listed on exchanges, but each needs its own identifier, and each venue assigns its own ID as well: the LSE would have a different identifier for the same stock than the NYSE, for instance.
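
A rule of that kind is straightforward to express. The sketch below is a simplified, assumed implementation of an identifier-uniqueness check; the sample IDs and venue codes are illustrative, not real reference data.

```python
# A minimal sketch of an identifier-uniqueness rule: no internal code may be
# attached to more than one issuer. Field names, sample IDs and venue codes
# are illustrative assumptions.
from collections import defaultdict

listings = [
    {"internal_id": "EQ-0001", "issuer": "Microsoft", "venue": "NASDAQ", "venue_id": "MSFT"},
    {"internal_id": "EQ-0001", "issuer": "Google",    "venue": "NASDAQ", "venue_id": "GOOGL"},  # collision
    {"internal_id": "EQ-0002", "issuer": "Google",    "venue": "LSE",    "venue_id": "GOOG-L"},
]

def identifier_collisions(listings: list) -> dict:
    """Return internal IDs that point at more than one issuer."""
    issuers_by_id = defaultdict(set)
    for row in listings:
        issuers_by_id[row["internal_id"]].add(row["issuer"])
    return {iid: issuers for iid, issuers in issuers_by_id.items() if len(issuers) > 1}

print(identifier_collisions(listings))  # -> {'EQ-0001': {'Microsoft', 'Google'}}
```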

Banks that fail to make these checks across their data systems undoubtedly run the risk of making inaccurate decisions at a time when they can least afford to do so. Given these risks, regulators are understandably pushing the industry to carry out its own due diligence on data vendors through independent verification across data sources. This means collating, ranking and defining instrument types in a central place where records can be audited easily, rather than across numerous systems. Or, if multiple systems are inevitable, it means putting umbrella data quality oversight across them all.
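
One common pattern for that central, auditable place is a “golden record” built by ranking sources and logging which source won each attribute. The sketch below assumes hypothetical vendor names and ranks; it illustrates the approach rather than any specific product.

```python
# A minimal sketch of collating vendor records into one auditable "golden"
# record: each attribute is taken from the highest-ranked source that supplies
# it, and every choice is logged. Vendor names and ranks are assumptions.
SOURCE_RANK = {"vendor_a": 1, "vendor_b": 2, "back_office": 3}  # 1 = most trusted

def build_golden_record(records_by_source: dict) -> tuple:
    """Merge one instrument's records from several sources, keeping an audit
    trail of which source supplied each attribute."""
    golden, audit = {}, []
    for source in sorted(records_by_source, key=lambda s: SOURCE_RANK.get(s, 99)):
        for attr, value in records_by_source[source].items():
            if attr not in golden and value is not None:
                golden[attr] = value
                audit.append(f"{attr} <- {source}")
    return golden, audit

golden, audit = build_golden_record({
    "vendor_b":    {"isin": "US2473617023", "currency": "USD"},
    "vendor_a":    {"isin": "US2473617023", "ticker": "DAL"},
    "back_office": {"currency": "USD", "lot_size": 100},
})
print(golden)  # isin and ticker from vendor_a, currency from vendor_b, lot_size from back_office
print(audit)   # e.g. ['isin <- vendor_a', 'ticker <- vendor_a', 'currency <- vendor_b', ...]
```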

Only by applying rules that detect issues before they occur can financial institutions hope to demonstrate responsible trading decisions to regulators and investors during this unprecedented period of market volatility.
