OPINION
By Nicolas Adamek, Big Data CoE, Teradata Labs
Data visualisations and the work of data scientists can be very impressive. However, organisations that depend on extracting maximum value from their data may be better off looking to the unsung hero of data analytics to secure their results.
Fragmented big data
Countless suppliers have embraced big data, and organisations have built ecosystems that combine data management and analytics technologies from multiple vendors. With vital data sourced from, and stored on, isolated and disconnected systems, organisations are finding it difficult to answer the big business questions, and this is hurting productivity and limiting their data advantage.
It is well understood that big data involves high volumes of data flowing into an organisation from multiple sources. Problems arise because the resulting disjointed data repositories make it difficult to use and analyse that data and extract its secrets. As a result, big data simply grows bigger and IT teams struggle to keep up.
Organisations often operate several data and analytics environments as stovepipes, which means they spend more and more time sifting data and building the underlying IT processes and infrastructure; the plumbing itself becomes a distraction. Data lakes have emerged to deal with the volume and variety of data types, but dumping everything into a data lake won't help if it cannot be analysed within the context of the data already held in enterprise data warehouses.
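To make that point concrete, here is a minimal sketch of what analysing lake data in the context of warehouse data might look like, assuming PySpark as the engine; the paths, table names, column names and connection details below are illustrative assumptions, not a reference to any specific product's API.

# Minimal sketch: join raw data-lake events with warehouse reference data.
# All paths, tables and credentials are hypothetical placeholders, and a
# suitable JDBC driver is assumed to be on the Spark classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lake-warehouse-join").getOrCreate()

# Raw clickstream events landed in the data lake (hypothetical HDFS path).
clicks = spark.read.json("hdfs:///lake/weblogs/2024/")

# Customer reference data held in the enterprise data warehouse, read over
# a generic JDBC connection (placeholder URL and credentials).
customers = (
    spark.read.format("jdbc")
    .option("url", "jdbc:teradata://dw.example.com/DATABASE=retail")
    .option("dbtable", "customers")
    .option("user", "analyst")
    .option("password", "***")
    .load()
)

# The value comes from combining both sides: lake events gain warehouse
# context, then get aggregated per customer segment.
result = (
    clicks.join(customers, on="customer_id", how="inner")
          .groupBy("segment")
          .agg({"page_views": "sum"})
)
result.show()

The design point is not the specific engine but the join itself: without some way to query the lake and the warehouse together, each side answers only half the question.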
To take a concrete business example, an online retail company with web server logs stored on Hadoop might receive an …