Using Trustworthy Data in Our Spreadsheets Avoids Costly Mistakes

An interesting article crossed my path this week while I was browsing online. Forbes published a story with a headline that must have raised a few eyebrows, not least at Microsoft HQ.

Unashamedly titled ‘Microsoft’s Excel Might Be the Most Dangerous Software on the Planet’, the article argued that while the emergence of modern financial markets over the last three decades was probably made possible by Excel, the business intelligence tool likely also contributed to the Global Financial Crisis.

It’s no small claim. Could sloppy handling of Excel spreadsheets really have helped cause the crisis?

When it comes down to it, traders have for years been copying and pasting data from one spreadsheet to another. Whilst some have no doubt been meticulous in their approach, in truth the room for human error has been immense.

As an example, the Forbes journalist pointed to one bank that followed this practice, got one of its equations wrong and as a result ‘lost several billion dollars’.

And this doesn’t just happen to traders. Just think of the recent Reinhart and Rogoff controversy: the two Harvard economists’ credibility was shaken when it emerged that their research on rising government debt and its association with much weaker rates of economic growth contained significant data omissions.

What triggered my interest, however, was the mention that both the Switzerland-based Basel Committee on Banking Supervision (BCBS) and the Financial Services Authority (FSA) in the UK have recently issued warnings on poor spreadsheet management.

We haven’t seen regulatory bodies warn about this before.

But it makes sense. For someone who works with energy and commodity markets data on a regular basis, the rise of new sources and reports is really something to behold. From here on in, our complex data world can only get more chaotic.

The post-crisis risk management and regulatory world we find ourselves in requires us to understand and monitor our data or risk the financial consequences (think Dodd-Frank and its newly formed European counterpart, EMIR).

The tools and solutions we use must be based on the current and future needs of the markets. They must come with built-in transparency, solid error-handling and validation options.

In short we need to know the data we’re using is trustworthy.

Here’s the selling part of the blog – but it sure is relevant.

For the past decade, the company I work with, ZE PowerGroup Inc., has been developing an enterprise data management solution known as the ZEMA Suite that comes with all of the above.

We understand the weaknesses of typical spreadsheets, which is why we’ve developed a solution that gets data into tip-top shape (validated, filtered and secure) before sending it to Excel using our Data Direct tool. Data Direct embeds normalized data and saves queries directly into spreadsheets, eliminating time-consuming manual processes.
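To make that concrete, here is a rough sketch of the general pattern: a saved, repeatable query that lands normalized data in a spreadsheet programmatically rather than by hand. To be clear, the function names and parameters below are invented for illustration only; this is not Data Direct’s actual interface.

    # Illustrative only: a saved query that writes normalized data
    # straight into a spreadsheet, replacing manual copy-and-paste.
    # fetch_curve() is a hypothetical stand-in for a real data feed.
    import pandas as pd

    def fetch_curve(series: str, start: str, end: str) -> pd.DataFrame:
        # A real feed would return market prices here; we fabricate
        # a frame of the same shape for the sake of the example.
        dates = pd.date_range(start, end, freq="D")
        return pd.DataFrame({"date": dates, "price": 50.0})

    # Because the "query" is code, it runs identically every time --
    # there is no manual transcription step where a cell can go wrong.
    curve = fetch_curve("power_daily", "2013-01-01", "2013-01-31")
    curve.to_excel("curve.xlsx", index=False)  # requires openpyxl

The point is not the specific library but the workflow: the query is saved once and re-run, so the spreadsheet is always rebuilt from the source data rather than patched by hand.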

To ensure data is clean, timely and correct, there is Data Validation. It provides logging, notifications and on-screen identifiers so successes and failures can be measured in real time and historically.
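Again, purely as an illustration of the general idea (this is not ZEMA’s validation engine, and the rules shown are invented examples), automated checks with logging of passes and failures might look something like this:

    # Illustrative sketch: validation rules that log every pass and
    # fail, so data quality can be tracked over time.
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("validation")

    def validate_row(row: dict) -> bool:
        checks = {
            "price is numeric": isinstance(row.get("price"), (int, float)),
            "price is non-negative": isinstance(row.get("price"), (int, float))
                                     and row["price"] >= 0,
            "date is present": bool(row.get("date")),
        }
        for name, ok in checks.items():
            if not ok:
                log.warning("FAIL %s: %r", name, row)
                return False
        log.info("PASS: %r", row)
        return True

Because every check is logged, a bad feed shows up as a spike in failures rather than as a silent error buried in a cell somewhere.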

Finally, there’s Admin Console, a tool that enables users to protect and manage their data by providing security and reporting for all the data being processed. Importantly, it also allows users to designate and manage permissions for colleagues in an organization who need access to specific information.
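The underlying idea here is the familiar one of role-based access control. A toy sketch of the concept (hypothetical, and nothing like what Admin Console actually does under the hood):

    # Toy role-based access check, for illustration only.
    PERMISSIONS = {
        "analyst": {"read"},
        "trader": {"read", "write"},
        "admin": {"read", "write", "grant"},
    }

    def can(role: str, action: str) -> bool:
        # Unknown roles get no permissions at all.
        return action in PERMISSIONS.get(role, set())

    assert can("trader", "write")
    assert not can("analyst", "write")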

To learn more about these and other components of our award-winning solution, visit our website and see what others have said about ZEMA.

 
