Businesses are overwhelmed by an ever-increasing volume of incoming data. As Foreign Affairs reports, “Today, there is enough information in the world to give every person alive 320 times as much of it as historians think was stored in the Library of Alexandria’s entire collection—an estimated 1,200 exabytes’ worth.”[1] With data collection and analysis becoming increasingly difficult as the amount of available information grows, concerns about data quality naturally arise. Traders and analysts manually entering data into spreadsheets can further jeopardize information quality by making errors, which inhibits successful analysis.
The Problem: Maintaining Data Hygiene
According to Joe Norburn, head of client and front-office solutions at Coutts in London, “Poor data quality impairs banks’ ability to deliver on their commitments to customers and to meet regulatory requirements.”[2] The data challenge is not an easy one. A quantitative research survey entitled “Data and the CFO,” commissioned by SAS in July 2012, found that 78% of chief financial officers and financial directors at large U.K. companies are concerned that their company might accidentally submit inaccurate data during internal or external financial reporting.[3] In a study conducted by LEPUS, all 44 of the surveyed financial services institutions reported that data quality remains a primary area of concern, and 60% commented that the majority of their data arrives late, with questionable consistency and completeness.[4]
Issues with data integrity can include the following:
- Outliers – e.g., the price of crude oil will never be $5,000 a barrel, so a data point that claims this is almost certainly wrong. Traders need a system that automatically flags anomalous data, as illustrated in the sketch following this list.
- Incomplete Data – Data that is essential to the client may be missing. What is missing for one department may be no issue for another, and vice versa. Implementing notifications that automatically alert analysts when data is missing is important.
- Timeliness of Data – Some data may be quite useful, but it may not be available when needed. Ensuring the timeliness of data can make an investment successful.
- Frequency of Data – Some data arrives more slowly or more quickly than needed. Automating and scheduling data collection helps minimize human error.
- Changing Formats – Data comes in a variety of formats, so making sure it is in the appropriate one is essential. Data that is collected manually usually features inconsistent formatting. Analysts who have a software system that normalizes their information to maintain consistent formatting save storage space and weeks of tedious work.
- Sheer Size of Data – Some data is inefficiently stored and formatted. Ensuring that files aren’t duplicated and that data is sorted by relevant properties helps individuals evaluate risk.
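For illustration only, the sketch below shows how a few of these checks (outlier flagging, completeness, and timeliness) might be automated in Python with pandas. The column names, price bounds, and staleness threshold are hypothetical assumptions, not settings from any particular vendor’s system.

```python
import pandas as pd

# Hypothetical plausibility bounds and staleness threshold (illustrative only).
PRICE_MIN, PRICE_MAX = 0.0, 500.0       # USD per barrel of crude oil
MAX_AGE = pd.Timedelta(hours=24)        # data older than this is flagged as stale

def validate_prices(df: pd.DataFrame, now: pd.Timestamp) -> pd.DataFrame:
    """Return the frame with boolean flag columns for common data-quality issues."""
    out = df.copy()
    # Outliers: present values outside the plausible range are almost certainly entry errors.
    out["outlier"] = out["price"].notna() & ~out["price"].between(PRICE_MIN, PRICE_MAX)
    # Completeness: rows missing a price or timestamp cannot be used in analysis.
    out["incomplete"] = out["price"].isna() | out["timestamp"].isna()
    # Timeliness: data that arrives too late loses much of its value.
    out["stale"] = (now - out["timestamp"]) > MAX_AGE
    return out

feed = pd.DataFrame({
    "timestamp": pd.to_datetime(["2014-09-22 09:00", "2014-09-20 09:00", None]),
    "price": [96.5, 5000.0, None],      # includes the $5,000 barrel from the example above
})
checked = validate_prices(feed, now=pd.Timestamp("2014-09-22 12:00"))
print(checked[["price", "outlier", "incomplete", "stale"]])
```

In practice, checks like these would run automatically after each scheduled collection, so analysts can be notified as soon as any flag is raised rather than discovering bad data during analysis.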
The Difficulties of Regulatory Constraints
According to a study published by Microsoft earlier this year, 38% of financial services firms do not have budgeted disaster recovery plans; 22% have no formal risk management program; 23% have inadequate policies for secure data disposal; 29% do not have a plan for responding to security breaches; and 37% do not use standardized data classification. Most of these issues arise from poor data management. For example, some companies rely on several different business intelligence (BI) tools and spreadsheets to organize data.
On top of this, Capco’s 2012 study, which surveyed chief information officers (CIOs) and technology executives at U.S. banks, concluded that 25% of respondents had inaccurate data; 47% lacked sufficient automation methods for data management; and 48% found they had inadequate integration and analysis methods for their data. Regulatory compliance has also been on the minds of financial executives, with nearly two-thirds of respondents concerned that Dodd-Frank, the Patriot Act, and the Foreign Account Tax Compliance Act (FATCA) will pose data management challenges. Navigating regulations can be a hassle for organizations that lack a way of logging which data entitlements correspond to which users, as well as where data is being accessed and transferred.
The Solution
Market participants who have an all-in-one data management solution can prevent data loss and increase security, since their corporate data is contained within a single program. ZEMA is one such solution: it automates data collection, analysis, and persistence. Further, ZEMA enables its users to track data entitlements and restrict access to specific areas of the software for security purposes. ZEMA allows analysts to schedule data collection and validate that data according to customizable settings, which helps ensure regulatory compliance and data integrity.
ZEMA’s Data Validation application allows users to automate quality checks on the data the system collects. Data Validation verifies that incoming data falls within set parameters (correctness), that all expected data is successfully collected (completeness), and that data is collected on time (timeliness). This enables ZEMA users to ensure their data is clean and ready for analysis. ZEMA’s Admin Console allows administrators to restrict data entitlements so that users who should not see certain data reports cannot access them. This supports regulatory compliance and prevents users from accessing more data than data vendors have licensed them to use. Admin Console also lets administrators manage passwords and other security settings, ensuring the security of data, analytics, and curves.
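As a generic illustration of the entitlement idea described above (and not ZEMA’s actual API), the sketch below maps hypothetical user IDs to the report IDs they are licensed to view, denies access by default, and logs every access decision.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class EntitlementRegistry:
    """Maps users to the reports they are entitled to see (illustrative only)."""
    grants: dict[str, set[str]] = field(default_factory=dict)
    audit_log: list[tuple[str, str, bool]] = field(default_factory=list)

    def grant(self, user: str, report: str) -> None:
        # Record that a data vendor has licensed this report to this user.
        self.grants.setdefault(user, set()).add(report)

    def can_access(self, user: str, report: str) -> bool:
        # Deny by default: a user sees only reports explicitly granted to them.
        allowed = report in self.grants.get(user, set())
        self.audit_log.append((user, report, allowed))
        return allowed

registry = EntitlementRegistry()
registry.grant("analyst_a", "bond_yields_10y")              # hypothetical IDs
print(registry.can_access("analyst_a", "bond_yields_10y"))  # True
print(registry.can_access("analyst_b", "bond_yields_10y"))  # False: no entitlement
```

Keeping an audit trail of every access decision is what makes it possible to show regulators who accessed which data, and when an access attempt was denied.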
The ZEMA solution is also highly capable of tracking updates to major financial markets. For instance, Figure 1 is a ZEMA graph of Bank of Canada bond yields (ten-year issue) for the past five years. ZEMA supports dynamic dates, over 150 mathematical formulas, and powerful data collection and software integration capabilities.
ZEMA is used by global financial institutions, including major North American and European banks; the world’s largest oil and gas producers; and Fortune 500 companies. ZEMA collects over 300 financial market reports at present and can easily collect more upon a client’s request. To learn how ZEMA can transform financial market data into relevant market intelligence, book a complimentary demo.
_____________________________________________________________________________________________________________________
[1] Kenneth Neil Cukier and Viktor Mayer-Schoenberger, “The Rise of Big Data: How It’s Changing the Way We Think About the World,” Foreign Affairs, June 1, 2013, accessed September 22, 2014, http://www.foreignaffairs.com/articles/139104/kenneth-neil-cukier-and-viktor-mayer-schoenberger/the-rise-of-big-data.
[2] Alison Ebbage, “Regulatory Reform and Data Management: Problems and Benefits,” Risk.net, February 10, 2013, accessed October 1, 2014, http://www.risk.net/operational-risk-and-regulation/feature/2240987/regulatory-reform-and-data-management-problems-and-benefits.
[3] Ibid.
[4] “Data Challenges Faced by the Financial Industry,” LEPUS, January 1, 2014, accessed September 22, 2014, http://www.lepus.com/2012/data-challenges-faced-by-the-financial-industry-2/.