Petabyte potential: harnessing the data flood

Companies must start treating their data as an asset rather than a burden, writes Piers Ford.

By Piers Ford, published April 24, 2013


If data were water, we’d need more than a fleet of arks to save us from the flood. The volumes being bandied around – take your pick from terabytes, petabytes and exabytes – defy comprehension for most of us. But digital data is spilling out of its traditional home in the corporate database, crying out for analysis, integration and, above all, use.
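To put those units in perspective: each step from terabyte to petabyte to exabyte is a factor of a thousand. A quick back-of-envelope sketch (illustrative decimal figures only, not from the article):

```python
# Decimal (SI) storage units, each 1,000 times the previous one.
TB = 10**12  # terabyte: a trillion bytes
PB = 10**15  # petabyte: 1,000 terabytes
EB = 10**18  # exabyte: 1,000 petabytes

print(PB // TB)  # terabytes in a petabyte -> 1000
print(EB // TB)  # terabytes in an exabyte -> 1000000
```

So an organisation holding a single petabyte is managing the equivalent of a thousand terabyte drives.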

And with analysts like Gartner and IDC predicting that the digital universe will double in size every year between now and 2020, resistance is futile. Instead, it’s time for enterprises to embrace the phenomenon of big data and acknowledge the opportunity it represents: to find innovative, cost-effective ways to manage and store information, and build applications that will exploit it in increasingly creative ways.
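Annual doubling compounds faster than intuition suggests. Taking the article's 2013 starting point, doubling every year until 2020 implies roughly a 128-fold increase; a trivial check (the exact multiplier depends on which analyst forecast you read, so treat this as illustration only):

```python
# Compound growth under the "doubles every year" assumption.
start_year, end_year = 2013, 2020
growth_factor = 2 ** (end_year - start_year)  # one doubling per year
print(growth_factor)  # -> 128
```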

If the infrastructure is not to buckle under the pressure, organisations must start treating their data as a dynamic asset rather than an overwhelming by-product requiring expensive storage just to hold it at bay.

Defining big data
What is big data? Definitions abound, but a broad consensus is that it is the deluge of unstructured data generated by what might loosely be called ‘life in the 21st century’: everything from multimedia traffic to mobile devices, surveillance networks and social media. The list is endless. The smallest transaction creates its own wave of data, adding another layer to the information pool.

“Organisations are producing more data than ever before from various internal and external sources, thereby making it critical for them to manage and analyse this enormous volume,” explains Boby Joseph, chief executive officer at StorIT Distribution.

“Although there is no exact definition of big data, most research firms define it as the massive volumes of complex, high velocity and variable data that an organisation collects over time and which it is difficult to analyse and handle using traditional database management tools. Such large volumes of unstructured data require advanced technologies and techniques to capture, store, analyse, distribute and manage this information.”

Joseph says that simply acknowledging the phenomenon and trying to apply traditional management tools to accommodate this bewildering array of data sets is not the answer. Businesses need to interact with big data in real time so that they can react quickly and make fast business changes in response to the live situation it represents. The wealth of information can only yield its true value if there is a shift in attitude.

“To address the big data problem, organisations need to change their mindset in addition to upgrading their technology,” states Joseph. “To use big data effectively, organisations need to choose from a number of advanced technologies and new platforms that will help them tap into internal systems, silos, warehouses and external systems. They also need to add resources with skills to use this massive volume of data optimally. This means that the organisation’s infrastructure, operations and development team need to work together to tap the full potential of big data.”

So it’s a challenge for everyone. And there are some important questions to consider.
