Open source strategies bring benefits, but don’t rush in

An IT industry analyst article published by SearchDataCenter.


Before your organization can reap the benefits of open source, it’s important to understand your options and map out a plan that will guarantee success.

Mike Matchett
Small World Big Data

It’s ironic that we spend a lot of money on proprietary databases, business applications and structured business intelligence platforms for “little” data, but we turn to open source platforms for big data analytics. Why not just scale down free, open source big data systems to handle the little data too?

Of course, there are a number of real reasons, including minimizing risk and meeting enterprise-class data management requirements. Cost probably isn’t even the first criterion for most enterprises. And even when it comes to cost, open source doesn’t mean free in any real economic sense. Open source strategies require cutting-edge expertise, professional support and often a buy-up into proprietary enterprise-class feature sets. The truth is, open source platforms don’t necessarily maximize ROI.

Still, open source strategies create attractive opportunities for businesses that want to evolve their aging applications. Many IT investment strategies now include a core principle of preferring open source for new applications. In fact, we’d claim open source now represents the fastest-growing segment of enterprise IT initiatives. When it comes to developing new ways of doing business, new types of agile and web-scale applications, and new approaches to analyzing today’s ever-bigger data, open source presents innovative opportunities to compete and even disrupt the competition.

But this is much easier said than done. We’ve seen many enterprises fumble with aggressive open source strategies, eventually reverting to tried-and-true proprietary software stacks. So if enterprises aren’t adopting open source because it’s cheaper, and it often lacks enterprise-class features, then why has it become such a popular strategy?

Adopting open source strategies goes hand in hand with an ability to attract top technical talent, Rajnish Verma said at the DataWorks Summit in June, when he was president of big data software vendor Hortonworks. Smart people want to work in an open source environment so they can develop in-demand skills, establish broader relationships outside a single company and potentially contribute back to a larger community, all part of building a personal brand, I suppose.

In other words, organizations adopt open source because that’s what today’s prospective employees want to work on…(read the complete as-published article there)

Storage technologies evolve toward a data-processing platform

An IT industry analyst article published by SearchDataCenter.


Emerging technologies such as containers, HCI and big data have blurred the lines between compute and storage platforms, breaking down traditional IT silos.

Mike Matchett

With the rise of software-defined storage, in which storage services are implemented as a software layer, the whole idea of data storage is being reimagined. And as compute increasingly converges with storage, the difference between a storage platform and a data-processing platform continues to erode.

Storage takes new forms

Let’s look at a few of the ways that storage is driving into new territory:

  • Now in containers! Almost all new storage operating systems, at least under the hood, are being written as containerized applications. In fact, we’ve heard rumors that some traditional storage systems are being converted to containerized form. This has several important implications, including the ability to better handle massive scale-out, increased availability, cloud-deployment friendliness and easier support for converging computation within the storage layer.
  • Merged and converged. Hyper-convergence bakes software-defined storage into convenient, modular appliance units of infrastructure. Hyper-converged infrastructructure products, such as Hewlett Packard Enterprise’s SimpliVity and Nutanix’s appliances, can greatly reduce storage overhead and help build hybrid clouds. We also see innovative approaches merging storage and compute in new ways, using server-side flash (e.g., Datrium), rack-scale infrastructure pooling (e.g., DriveScale) or even integrating ARM processors on each disk drive (e.g., Igneous).
  • Bigger is better. If the rise of big data has taught us anything, it’s that keeping more data around is a prerequisite for having the opportunity to mine value from that data. Big data distributions today combine Hadoop and Spark ecosystems, various flavors of databases and scale-out system management into increasingly general-purpose data-processing platforms, all powered by underlying big data storage tools (e.g., Hadoop Distributed File System, Kudu, Alluxio).
  • Always faster. If big is good, big and fast are even better. We are seeing new kinds of automatically tiered and cached big data storage and data access layer products designed around creating integrated data pipelines. Many of these tools are really converged big data platforms built for analyzing big and streaming data at internet of things (IoT) scale; a minimal sketch of the tiering mechanic follows this list.
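
To make the tiering idea concrete, here is a minimal Python sketch of the hit/miss/promote/evict loop behind automatic tiering. It is a toy illustration only; the TieredStore class and its method names are invented for this example and don’t come from any particular product.

    from collections import OrderedDict

    class TieredStore:
        """Toy two-tier store: a small, fast cache over a large, slow capacity tier."""

        def __init__(self, cache_size=1024):
            self.cache_size = cache_size
            self.cache = OrderedDict()  # fast tier (think flash or RAM), kept in LRU order
            self.capacity = {}          # slow tier (think disk or object storage)

        def put(self, key, value):
            # Writes land on the cheap capacity tier; reads decide what becomes hot.
            self.capacity[key] = value

        def get(self, key):
            if key in self.cache:
                self.cache.move_to_end(key)      # cache hit: refresh LRU position
                return self.cache[key]
            value = self.capacity[key]           # cache miss: read the slow tier...
            self.cache[key] = value              # ...and promote the data to the fast tier
            if len(self.cache) > self.cache_size:
                self.cache.popitem(last=False)   # evict the least recently used entry
            return value

Real products layer far more on top of this, including write-back policies, prefetching for streaming pipelines and distributed cache coherence, but the promote-on-read, evict-cold-data loop is the essential mechanic.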

The changing fundamentals

Powering many of these examples are interesting shifts in underlying technical capabilities. New data processing platforms are handling more metadata per unit of data than ever before. More metadata leads to new, highly efficient ways to innovate …(read the complete as-published article there)

CI and disaggregated server tech can converge after all

An IT industry analyst article published by SearchDataCenter.


I’ve talked about the inevitability of infrastructure convergence, so it might seem like I’m doing a complete 180-degree turn by introducing the opposite trend: infrastructure disaggregation. Despite appearances, disaggregated server technology isn’t really the opposite of convergence. In fact, disaggregated and converged servers work together.

In this new trend, physical IT components come in larger and denser pools for maximum cost efficiency. At the same time, compute-intensive functionality, such as data protection, that was once tightly integrated with the hardware is pulled out and hosted separately to optimize performance and use cheaper components.
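
As a rough sketch of that split, consider modeling the dense pool as a dumb block store while data protection runs as a separate software service layered across pools. This is a hypothetical illustration with invented names; it models the architecture, not any vendor’s API.

    class BlockPool:
        """Dense, cheap commodity pool: it stores blocks and does nothing else."""

        def __init__(self, name):
            self.name = name
            self.blocks = {}

        def write(self, block_id, data):
            self.blocks[block_id] = data

        def read(self, block_id):
            return self.blocks[block_id]

    class ProtectionService:
        """Compute-intensive data protection, disaggregated from the pools:
        it mirrors each write across several pools and reads around failures."""

        def __init__(self, pools, copies=2):
            self.pools = pools
            self.copies = copies

        def write(self, block_id, data):
            for pool in self.pools[:self.copies]:  # mirror across N pools
                pool.write(block_id, data)

        def read(self, block_id):
            for pool in self.pools:                # fall through missing copies
                try:
                    return pool.read(block_id)
                except KeyError:
                    continue
            raise KeyError(block_id)

    pools = [BlockPool(f"pool{i}") for i in range(3)]
    protection = ProtectionService(pools)
    protection.write("blk-1", b"payload")
    assert protection.read("blk-1") == b"payload"

The point of the separation is that the pools stay dense and inexpensive while the compute-heavy protection logic scales independently on ordinary servers.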

Consider today’s cloud architects building hyper-scale infrastructures; instead of buying monolithic building blocks, they choose to pool massive amounts of dense commodity resources.

…(read the complete as-published article there)

Assimilate converged IT infrastructure into the data center

An IT industry analyst article published by SearchDataCenter.


I feel like the Borg from Star Trek when I proclaim that “IT convergence is inevitable.”

Converged IT infrastructure, the tight vendor integration of multiple IT resources like servers and storage, is a good thing, a mark of forward progress, and resistance to convergence is futile. Convergence is a great way to simplify and automate the complexities between two (or more) maturing domains and drive cost efficiencies, reliability improvements and agility. As the operations and management issues for any set of resources become well understood, new solutions will naturally evolve that internally converge them into a more unified, integrated single resource. Converged solutions are faster to deploy, simpler to manage and easier for vendors to support.

Some resistance to convergence does happen within IT organizations. Siloed staff might suffer, as convergence threatens domain subject matter experts by embedding their fiefdoms inside larger realms. That’s not the first time that has happened, and there is always room for experts to dive deep under the covers and work through levels of complexity when things inevitably go wrong. That makes for more impactful and satisfying jobs. And let’s be honest: converged IT is far less threatening than the public cloud.

…(read the complete as-published article there)

Scale-out architecture and new data protection capabilities in 2016

An IT industry analyst article published by SearchDataCenter.


January was a time to make obvious predictions and short-lived resolutions. Now is the time for intelligent analysis of the shark-infested waters of high tech. The new year is an auspicious time for new startups to come out of the shadows. But what is just shiny and new, and what will really impact data centers?

From application-focused resource management to scale-out architecture, here are a few emerging trends that will surely impact the data center.

…(read the complete as-published article there)