Data Deluge and the Dread of Latency

In recent years, data has taken the business world by storm. The amount of information captured by enterprises, together with rapid advances in technology, has fuelled a global data explosion, and its effects on industry have been substantial. In a 2011 study, the McKinsey Global Institute projected 40% annual growth in the volume of data generated globally; the same study found that in 88% of US sectors, the average company stores more data than the US Library of Congress (MGI). The strain of burgeoning data is felt especially in the energy and commodities markets, where analyzing and interpreting data has long been of high importance.

In the three decades since IBM released the IBM 5150, one of the first widely successful personal computers available to consumers, technology has developed at an ever-accelerating pace (IBM). On January 1, 1983, the ARPANET, the research network that became the foundation of the modern internet, switched to the TCP/IP protocol (Wired); in the years since, nearly every aspect of technology has advanced at an exponential rate. As technology has grown more sophisticated, it has infiltrated nearly every aspect of our lives. We have access to more information in the palm of our hands than previous generations could find in an entire library, and our access to this data happens in the blink of an eye, with the touch of a finger.

These rapid developments have had a profound impact on trading exchanges. As early as 1986, while the internet was still in its infancy, the London Stock Exchange introduced the Stock Exchange Automated Quotation (SEAQ) system to take over processes that had previously been done manually; in that same year, the exchange began transacting significantly more trade than ever before. As Karl Flinders notes, “just before the meteoric impact of the SEAQ, the average number of daily trades at the London Stock Exchange was 20,000. After the introduction of automated trading, that figure went up to a daily average of 59,000 trades in only a few months” (Computer Weekly).

The SEAQ was merely a harbinger of what was to come. Today, trading floors rely on high-speed servers linked to exchanges through Ethernet cables, and high-frequency traders use algorithms to profit from price movements of fractions of a cent, executing immense volumes of trades each day. In the high-speed trading game, some estimates put the value of an advantage as small as a millisecond at as much as $100 million per annum for a large brokerage (Information Week). In recent years, exchanges have begun implementing extreme and costly measures to gain that millisecond advantage and reduce latency.

Data latency refers to the amount of time it takes for data to be stored or retrieved; in other words, the delay between the moment data is entered into a system and the moment a user can view it (Big Data Imperatives, 84). For high-frequency traders, it’s the difference between a penny earned and a penny lost.
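To make that definition concrete, the short Python sketch below (not drawn from the article itself) measures latency as the gap between a data point entering a pipeline and a consumer reading it; the queue structure, field names, and simulated delay are all hypothetical.

import time

def publish(queue, payload):
    # Stamp the record with the moment it enters the system.
    queue.append({"entered_at": time.monotonic(), "payload": payload})

def consume(queue):
    # Read the oldest record and compute how long it waited, i.e. its latency.
    record = queue.pop(0)
    latency_ms = (time.monotonic() - record["entered_at"]) * 1000
    return record["payload"], latency_ms

pipeline = []
publish(pipeline, {"symbol": "XYZ", "price": 101.25})
time.sleep(0.005)  # simulate roughly 5 ms spent in storage and transit
quote, waited_ms = consume(pipeline)
print(f"Received {quote} after {waited_ms:.2f} ms of latency")

In a real trading system the same measurement would span network hops, storage, and processing stages rather than a single in-memory queue, but the principle is identical: the smaller that gap, the sooner a trader can act on the data.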

“If everyone has access to the same information, when the market moves, you want to be first. The people who are too slow are going to be left behind,” says Alistair Brown, founder of Lime Brokerage (Information Week).

While this kind of down-to-the-millisecond information flow isn’t a concern for most businesses, the need to gain a competitive advantage through rapid access to data is becoming an issue across a variety of industries. Many companies rely on up-to-the-minute data to make trading decisions; if they experience latency issues and their competitors do not, they lose that advantage. In the information age, businesses with the most current data are better equipped to make informed decisions, and rapid access to data has become increasingly important for corporations competing for an edge.
