NICOLE REINEKE
Distinguished Engineer, Dell Technologies
Consolidate data to avoid duplication
There is a tendency to reinvent the wheel rather than consolidate (and improve) what businesses already have. For instance, the Data Paradox study shows that almost half of businesses are currently investing in more data lakes, even though nearly as many leaders say a lack of consolidation across too many data lakes is a barrier to getting value from their data.* This suggests businesses are taking the easiest option (in the short term, it's less time-consuming to bolt on a new data lake than to consolidate what they have).
In the same vein, enterprises are wasting precious hours and storage recreating the same datasets over and over again, simply because the data isn't responding swiftly enough to queries and the business doesn't know whether someone else has already created the exact same query with the same joins. Data teams then have the arduous and time-intensive job of tracking that data. It is no surprise that data scientists spend less than 10 percent of their time actually mining data for patterns; 60 percent of their time goes into the drudgery of finding, aggregating, and cleaning data sets. We need to empower our data teams to do the extraordinary, rather than confine them to a life of déjà vu. It's not good for them, and it's certainly not good for business.
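To make the duplication problem concrete, here is a minimal sketch (not Dell's implementation) of one common mitigation: fingerprint each query before materializing its results, so a team can discover that an identical dataset already exists instead of rebuilding it. All names here (QueryRegistry, fingerprint, lookup, register) and the example paths are purely illustrative.

```python
import hashlib


def fingerprint(sql: str) -> str:
    """Normalize a SQL query and hash it, so trivially different
    formatting still maps to the same fingerprint."""
    normalized = " ".join(sql.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


class QueryRegistry:
    """In-memory stand-in for a shared data catalog that maps query
    fingerprints to the datasets they already produced."""

    def __init__(self) -> None:
        self._datasets: dict[str, str] = {}

    def lookup(self, sql: str) -> str | None:
        """Return the existing dataset for this query, if any."""
        return self._datasets.get(fingerprint(sql))

    def register(self, sql: str, dataset_path: str) -> None:
        """Record that this query has already been materialized."""
        self._datasets[fingerprint(sql)] = dataset_path


if __name__ == "__main__":
    registry = QueryRegistry()
    query = """
        SELECT c.region, SUM(o.total)
        FROM orders o JOIN customers c ON o.customer_id = c.id
        GROUP BY c.region
    """
    registry.register(query, "s3://lake/curated/sales_by_region.parquet")

    # A second team writes the same query with different casing and
    # whitespace; the registry points them at the existing dataset
    # instead of letting them recreate it.
    duplicate = (
        "select c.region, sum(o.total) from orders o "
        "join customers c on o.customer_id = c.id group by c.region"
    )
    print(registry.lookup(duplicate))  # -> s3://lake/curated/sales_by_region.parquet
```

In practice the registry would be a shared catalog rather than an in-process dictionary, and the fingerprint would come from a parsed query plan rather than normalized text, but the principle is the same: check before you recreate.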