by Donya Rose, Chief Operating Officer UK and Ireland, Global Transaction Banking, Deutsche Bank
Donya Rose, Chief Operating Officer UK and Ireland, Global Transaction Banking, Deutsche Bank, explains what is really meant by ‘Big Data’ and why grappling with it has become a ‘must do’ for banks and businesses everywhere.
It is a truth universally acknowledged that most of the data in the world today was created in the last few years. The amount of data being created and stored globally has soared due to the falling cost and growing capability of storage, the granular richness of data flows and, of course, the increased frequency and ease of digital interactions. Each year brings improvements in both storage methods and storage costs; three decades ago, a gigabyte of storage cost around 1,000,000 times what it does today and was the size of a large fridge rather than the miniature disks now available. With the growth of ‘data lakes’, information of all kinds – but particularly unstructured data – is being generated on all fronts and stored in unprecedented volumes, and its effects and power are only beginning to be explored.
As industries in every area undergo digital transformation, they face decisions about what data to store and where to store it – whether in on-site, remote or cloud-based ‘containers’ – and, most importantly of all, the question of how to process it. The flood of data being captured every second can become a millstone without the power and technology to leverage it correctly. Indeed, without the right tools, even filtering incoming information to separate the meaningful from the white noise becomes a needle-in-a-haystack task. Yet to manage such information inadequately is to risk falling behind, whether in terms of market trends, impending threats or potential operational improvements.