Strategic Treasury

Riding the Wave of Big Data

Big Data is a new and exciting frontier, and only time will show in which direction its capabilities and effects will take us. One thing is for certain: it is a trend that is ignored at one’s peril.

by Donya Rose, Chief Operating Officer UK and Ireland, Global Transaction Banking, Deutsche Bank

Donya Rose, Chief Operating Officer UK and Ireland, Global Transaction Banking, Deutsche Bank, explains what is really meant by ‘Big Data’ and why grappling with it has become a ‘must do’ for banks and businesses everywhere.

It is a truth universally acknowledged that most of the data in the world today was created over the last few years. The amount of data being created and stored globally has soared due to the lower costs and greater capabilities of storage, the granular richness of data flows and, of course, the increased frequency and ease of digital interactions. Each year brings storage improvements, in terms of both methods and cost; three decades ago a gigabyte of storage cost around 1,000,000 times what it does today, and the hardware holding it was the size of a large fridge rather than the miniature disks now available. With the growth of ‘data lakes’, information of all kinds – but particularly unstructured data – is being generated on all fronts and stored in unprecedented volumes, and its effects and power are only beginning to be explored.

As industries in every area undergo digital transformation, they face decisions about what data to store and where to store it – whether in on-site, remote or cloud-based ‘containers’ – and, most important of all, the question of how to process it. The flood of data being captured every second can become a millstone without the power and technology to leverage it correctly. Indeed, without the right tools, even filtering incoming information to separate the meaningful from the white noise becomes a needle-in-a-haystack task. Yet to manage such information inadequately is to risk falling behind, whether in terms of market trends, impending threats or potential operational improvements.

The Big Data phenomenon

It should be noted that the term ‘Big Data’ (one that is increasingly searched for online) refers to something rather more nebulous than the sheer volume of digital information being produced. Big Data can be defined as anything from ‘a greater scope of information’ to real-time information to social media data, but it also covers technologies with data-related functions. Regardless of the definition used, the importance of both harvesting and correctly utilising data is now widely acknowledged: nine out of ten C-suite executives consider data to be the fourth factor of production, after the traditional pillars of land, labour and capital. This shift in mindset is being steadily followed by an appetite to make better use of this data, with global investment in Big Data expected to grow at a compound annual growth rate (CAGR) of 17% to reach $76bn by the end of 2020.

Figure 1 - Interpretations of what Big Data is
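
To put that forecast in context, the compounding arithmetic is simple to reproduce. The short Python sketch below works backwards from the two figures quoted above (a 17% CAGR and $76bn by the end of 2020); the five-year horizon is an assumption made purely for illustration, since the base year is not stated here.

# Minimal CAGR sanity check. Only the 17% growth rate and the $76bn
# end-of-2020 figure come from the article; the five-year horizon is
# an assumption used for illustration.
cagr = 0.17          # compound annual growth rate
end_value_bn = 76.0  # projected global Big Data investment, $bn, end of 2020
years = 5            # assumed forecast horizon (hypothetical)

# Implied starting value: V_0 = V_t / (1 + r)^t
start_value_bn = end_value_bn / (1 + cagr) ** years
print(f"Implied base-year investment: ${start_value_bn:.1f}bn")

# Year-by-year path back up to the 2020 figure
for year in range(1, years + 1):
    value = start_value_bn * (1 + cagr) ** year
    print(f"Year {year}: ${value:.1f}bn")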

What this signifies in practice is that we have moved on from questioning the value of investing in such tools to asking how we can best put them to use. Part of that investment will be spent on human skills, such as hiring Chief Analytics Officers, but on the tech side it covers four key areas: infrastructure, storage, processing and management, and analytics. Clearly, Big Data cannot exist without the first three factors, and meeting the demand for secure and inexpensive storage and sufficient processing power continues to require technological developments. But without effective data analytics, even the largest hoards of data are worthless.

Four main forces are driving this investment in Big Data technology. The first is the explosive growth of data, for the reasons mentioned above. As the number of consumers and bank account holders – and the devices they use – continues to swell, so does the volume of data produced by internal and external sources. The recent surge of interest in data from wearable devices will only accelerate this growth; the world’s data volume has been predicted to grow 50-fold between 2010 and 2020. Meanwhile, existing legacy systems struggle to keep up. For example, much of banks’ data sits in difficult-to-access databases, meaning it is not put to best use, and most of the data held by financial institutions (80-90%) is unstructured and therefore difficult to analyse.
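
As a rough yardstick for that prediction, a 50-fold increase over the decade from 2010 to 2020 corresponds to data volumes compounding at close to 48% per year. A minimal sketch of that calculation, using only the figures quoted above:

# Implied annual growth rate behind the predicted 50-fold rise in data volume
growth_multiple = 50   # total growth factor, 2010 to 2020
years = 10             # length of the forecast period

# Solve (1 + r) ** years = growth_multiple for r
annual_rate = growth_multiple ** (1 / years) - 1
print(f"Implied annual growth in data volume: {annual_rate:.1%}")  # roughly 48%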

The second driver is the need to access point-in-time information in order to comply with changing regulations, for example regarding liquidity planning, asset and liability management or risk reporting. Not too long ago, a static analysis of financial ratios was sufficient, but newer regulations such as Basel III require intraday reporting, edging ever closer to real-time visibility. For the sake of risk management – whether compelled by regulation or not – a key treasury goal is increasingly to achieve real-time monitoring of the financial environment, from counterparty exposure to interest rate volatility.
