New choices bring enterprise big data home
An IT industry analyst article published by SearchDataCenter.
Enterprises recognize the tantalizing value of big data analytics, but traditional concerns about data management and security have held back deployments — until now.
When big data practices come to your organization, it’s all about location, location, location.
I’ve heard recently from a bunch of big-data-related vendors, all vying to gain a share of your sure-to-grow big data footprint. After all, big data isn’t about minimizing your data set, but about making the best use of as much data as you can possibly manage. That’s not a bad definition of big data if you are still looking for one. And with all that growing data, you will need a growing data center infrastructure to match.
This big data craze really got started with the Apache Hadoop Distributed File System (HDFS), which unlocked the vision of massive data analysis on cost-effective, scale-out clusters of commodity servers with relatively cheap locally attached disks. Hadoop and its ecosystem of solutions let you keep and analyze all kinds of data in their natural, raw form (i.e., not fully structured as in a database), no matter how much piles up or how fast it grows.
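For the hands-on crowd, here is a minimal sketch of what that "just keep everything" approach looks like against Hadoop's standard Java FileSystem API. The NameNode address and file path below are placeholder assumptions for illustration, not details from any particular cluster:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.nio.charset.StandardCharsets;

public class RawIngest {
    public static void main(String[] args) throws Exception {
        // Point the client at the cluster's NameNode.
        // "namenode" and port 8020 are placeholders for your environment.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");

        FileSystem fs = FileSystem.get(conf);

        // HDFS doesn't care what the bytes are -- raw logs, CSV, JSON,
        // sensor output. It simply splits the stream into large blocks
        // and replicates each block across the nodes' local disks.
        Path target = new Path("/data/raw/clickstream-sample.log");
        try (FSDataOutputStream out = fs.create(target)) {
            out.write("raw, unstructured event data...\n"
                    .getBytes(StandardCharsets.UTF_8));
        }

        fs.close();
    }
}
```

No schema, no load-time transformation: the structure gets imposed later, at analysis time, which is exactly what makes the "keep it all" economics work.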
The problem is that once you get beyond, err, data science projects, old familiar enterprise data management issues return to the forefront: data security, protection, reliability, operational performance and creeping OpEx costs.
While Hadoop and HDFS mature with each release, significant gaps remain when it comes to meeting enterprise requirements. It turns out that those commodity scale-out clusters of direct-attached storage (DAS) might not actually offer the lowest total cost of ownership once big data lands in production operations…
…(read the complete as-published article there)