What’s the future of data storage technology and the IT pro?

An IT industry analyst article published by SearchConvergedInfrastructure.


Several enterprise data storage trends are all about getting rid of storage as an IT silo. That will have consequences for both the industry and IT pros who work in it.

Mike Matchett
Small World Big Data

Have you sensed a shift in storage these days? Maybe you’ve noticed a certain resignation among storage industry veterans contemplating the future of data storage technology. Or maybe when it comes time to refresh aging storage arrays, there’s less differentiation among competing products, no exciting new storage technologies — everyone has flash by now, right? — and no flaming vendor wars to get excited about. Maybe many of your important storage needs are now met by a cloud service, a relatively no-name vendor product or even open source.

For many years, it’s been fun to watch the big storage vendors fight the good fight. They used to line up elbow-to-elbow in the front row at big shows like VMworld, vying for the biggest booth to show off their hottest products. This last year, it seemed storage had moved back a few rows. Market forces and trends such as software-defined and hyper-converged have changed large parts of the storage game, sure. But when the game shifted in the past, competitive storage vendors shifted with it. Maybe this is harder to do now that storage is getting embedded, integrated, converged and “cheapened” through cloud competition.

Many recent storage trends involve getting rid of storage as an IT silo, raising questions about the future of data storage technology. How can you sell Storage with a capital S if no one is buying stand-alone storage anymore? Are we coming to the end of storage as an important, first-class industry? The short answer is no.

But data? Lots of data
Accounting-focused industry reports show legacy storage-centric companies continue to suffer from thinning margins for their high-end hardware arrays. But the collective storage footprint keeps growing. With data volumes exploding from globalized applications, web-scale databases, big data analytics, online archiving and that little internet of things opportunity, all those new bits will have to go somewhere.

Maybe the job won’t even be called storage administrator in a few years, but rather something like chief data enabler.

All this new data simply can’t go into cheap and deep cold cloud storage. If data is worth having, as much business value as possible must be wrung out of it. And if it’s important data, it has to be governed, protected, secured and ultimately actively managed…(read the complete as-published article there)

Future of data storage technology: Transformational trends for 2018

An IT industry analyst article published by SearchStorage.


Risk-averse enterprises finally accepted the cloud in 2017, and we didn’t even notice. Expect the same for these data storage technology trends in the new year.

Mike Matchett
Small World Big Data

Sometimes big changes sneak up on you, especially when you’re talking about the future of data storage technology. For example, when exactly did full-on cloud adoption become accepted by all those risk-averse organizations, understaffed IT shops and disbelieving business executives? I’m not complaining, but the needle of cloud acceptance tipped over sometime in the recent past without much ado. It seems everyone has let go of their fear of cloud and hybrid operations as risky propositions. Instead, we’ve all come to accept the cloud as something that’s just done.

Sure, cloud was inevitable, but I’d still like to know why it finally happened now. Maybe it’s because IT consumers expect information technology will provide whatever they want on demand. Or maybe it’s because everything IT implements on premises now comes labeled as private cloud. Influential companies, such as IBM, Microsoft and Oracle, are happy to help ease folks formerly committed to private infrastructure toward hybrid architectures that happen to use their respective cloud services.

In any case, I’m disappointed I didn’t get my invitation to the “cloud finally happened” party. But having missed cloud’s big moment, I’m not going to let other obvious yet possibly transformative trends sneak past as they go mainstream with enterprises in 2018. So when it comes to the future of data storage technology, I’ll be watching the following:

Containers arose out of a long-standing desire to find a better way to package applications. This year we should see enterprise-class container management reach maturity parity with virtual machine management — without giving up the advantages containers hold over VMs. Expect modern software-defined resources, such as storage, to be delivered mostly in containerized form. When combined with dynamic operational APIs, these resources will deliver highly flexible programmable infrastructures. This approach should let vendors package applications and their required infrastructure as redeployable units — that is, blueprinted or specified in editable, versionable manifest files — making full-environment and even data center-level cloud provisioning possible. Being able to deploy a data center on demand could completely transform disaster recovery, to name one use case.
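
To make that a little more concrete, here’s a minimal Python sketch of manifest-driven provisioning. Everything in it (the manifest layout, the deploy helper, the provision_volume and run_containers calls) is a hypothetical stand-in for whatever container platform API a shop actually runs; the point is simply that one editable, versionable blueprint can rebuild the same environment anywhere.

```python
import json

# A hypothetical, versionable manifest describing an application and the
# infrastructure it needs. In practice this would live in source control
# alongside the application code.
MANIFEST = json.loads("""
{
  "version": "1.2.0",
  "services": [
    {"name": "web", "image": "shop/web:1.2.0", "replicas": 3},
    {"name": "db",  "image": "shop/db:5.7",    "replicas": 1,
     "storage": {"tier": "fast-flash", "size_gb": 200}}
  ]
}
""")

def deploy(manifest, target):
    """Walk the manifest and (re)create each service on the target.

    `target` stands in for whatever container platform API is in use; the
    same blueprint can rebuild the environment in any data center, which
    is the disaster recovery case described above.
    """
    for svc in manifest["services"]:
        storage = svc.get("storage")
        if storage:
            target.provision_volume(svc["name"], **storage)
        target.run_containers(svc["image"], count=svc["replicas"])

# deploy(MANIFEST, primary_site)    # normal operations
# deploy(MANIFEST, recovery_site)   # same blueprint, different data center
```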

Everyone is talking about AI, but it is machine learning that’s slowly permeating just about every facet of IT management. Although there’s a lot of hype, it’s worth figuring out how and where carefully applied machine learning could add significant value. Most machine learning is conceptually made up of advanced forms of pattern recognition. So think about where using the technology to automatically identify complex patterns would reduce time and effort. Expect the increasing availability of machine learning algorithms to give rise to storage management processes that learn and adjust operations and settings to optimize workload services, quickly identify and fix the root causes of abnormalities, and broker storage infrastructure and manage large-scale data to minimize cost.
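
To ground that pattern-recognition framing, the sketch below applies scikit-learn’s off-the-shelf IsolationForest to flag abnormal storage volume metrics with no labeled failure data. The IOPS and latency numbers are invented for illustration; a real storage management process would feed in its own telemetry and act on the flags.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic per-sample volume metrics: columns are IOPS and latency (ms).
rng = np.random.default_rng(0)
normal = rng.normal(loc=[5000, 2.0], scale=[500, 0.3], size=(500, 2))
abnormal = np.array([[4800, 9.5], [900, 2.1]])  # latency spike, IOPS drop
samples = np.vstack([normal, abnormal])

# Unsupervised pattern recognition: the model learns what "normal" looks
# like and flags outliers. The contamination threshold is illustrative.
model = IsolationForest(contamination=0.01, random_state=0)
flags = model.fit_predict(samples)              # -1 marks an anomaly

for idx in np.where(flags == -1)[0]:
    iops, latency = samples[idx]
    print(f"anomaly: sample {idx} (IOPS={iops:.0f}, latency={latency:.1f} ms)")
```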

Management as a service (MaaS) is gaining traction when it comes to the future of data storage technology. First, every storage array seemingly comes with built-in call home support replete with management analytics and performance optimization. I predict the interval for most remote vendor management services will quickly drop from today’s daily batch to five-minute streaming. I also expect cloud-hosted MaaS offerings to be the way most shops manage their increasingly hybrid architectures, and many will start to shift away from the burdens of on-premises management software…(read the complete as-published article there)

Reap IT automation benefits in every layer of the stack

An IT industry analyst article published by SearchITOperations.


Automation technologies create an artificial brain for IT operations, but that won’t turn skilled admins and engineers into zombies — far from it.

Mike Matchett
Small World Big Data

As a technology evangelist and professional IT systems optimizer, I see the benefits of IT automation and can only champion trends that increase it. When we automate onerous tasks and complex manual procedures, we naturally free up time to focus our energies higher in the stack. Better and more prevalent automation increases the relative return on our total effort so that we each become more productive and valuable. Simply put, IT automation provides leverage. So it’s all good, right?

Another IT automation benefit is that it captures, encapsulates and applies valuable knowledge to real-world problems. And actually, it’s increasingly hard to find IT automation platforms that don’t promote embedded machine learning and artificially intelligent algorithms. There is a fear that once our hard-earned knowledge is automated, we’ll no longer be necessary.

So, of course, I need to temper my automation enthusiasm. Automation can eliminate low-level jobs, and not everyone can instantly adjust or immediately convert to higher-value work. For example, industrial robots, self-driving cars or a plethora of internet of things (IoT)-enabled devices that cut out interactions with local retailers all tend to remove the bottom layer of the related pyramid of available jobs. In those situations, there will be fewer, more-utilized positions left as one climbs upward in skill sets.

Still, I believe automation, in the long run, can’t help but create even more pyramids to climb. We are a creative species after all. Today, we see niches emerging for skilled folks with a combination of internal IT and, for example, service provider, high-performance computing, data science, IoT and DevOps capabilities.

Automation initiatives aren’t automatic

If one squints a bit, almost every IT initiative aims to increase automation.

A service provider has a profit motive, so the benefit of IT automation is creating economies of scale. Those, in turn, drive competitive margins. But even within enterprise IT, where IT is still booked as a cost center, the drive toward intelligent automation is inevitable. Today, enterprise IT shops, following in the footsteps of the big service providers, are edging toward hybrid cloud-scale operations internally and finding that serious automation isn’t a nice-to-have, but a must-have. Most projects can be sorted roughly into these three areas with different IT automation benefits, from cost savings to higher uptime:

  • Assurance. Efforts to automate support and help desk tasks, shorten troubleshooting cycles, shore up security, protect data, reduce outages and recover operations quickly (see the sketch after this list).
  • Operations. Necessary automation to stand up self-service catalogs, provision apps and infrastructure across hybrid and multi-cloud architectures to enable large-scale operations, and orchestrate complex system management tasks.
  • Optimization. Automation that improves or optimizes performance in complex, distributed environments, and minimizes costs through intelligent brokering, resource recovery and dynamic usage balancing.
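
As a toy illustration of the assurance bucket, here’s a minimal Python watchdog that detects a down service and recovers it. It assumes systemd hosts and hypothetical service names; a production assurance platform would hang alerting, escalation and root-cause capture off the same detect-and-remediate loop.

```python
import subprocess
import time

SERVICES = ["nginx", "postgresql"]  # hypothetical units to watch

def is_active(service):
    """True if systemd reports the unit as active."""
    result = subprocess.run(["systemctl", "is-active", "--quiet", service])
    return result.returncode == 0

def watch(interval_seconds=60):
    """Minimal assurance loop: detect an outage, recover operations."""
    while True:
        for service in SERVICES:
            if not is_active(service):
                print(f"{service} is down; restarting")
                subprocess.run(["systemctl", "restart", service])
        time.sleep(interval_seconds)

# watch()  # runs until interrupted
```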

Automation enablers at large
Successful automation initiatives don’t necessarily start by implementing new technologies like machine learning or big data. Organizational commitment to automation can drive a whole business toward a new, higher level of operational excellence…(read the complete as-published article there)

Cloud-based environment: The new normal for IT shops

An IT industry analyst article published by SearchServerVirtualization.


The sky is the limit as new cloud management tools and evolutions in storage help make hybrid and multicloud IT a viable option for organizations with on-prem data centers.

Mike Matchett
Small World Big Data

Doubts about a cloud-based environment being little more than a passing fancy are vanishing. Plenty of real enterprises are not only comfortable releasing key workloads to public clouds, but are finding that hybrid operations at scale offer significant economic, productivity and competitive advantages over traditional on-premises data centers.

In fact, many of the big announcements at VMworld 2017 highlighted how mainstream businesses are now building and consuming hybrid and multicloud IT.

NSX all around
VMware has accelerated its transition from hypervisor vendor to cloud-management tool provider. Its virtual networking product, NSX, is not only a big source of revenue for VMware, but it also underpins many newer offerings, such as AppDefense, VMware Cloud on AWS and Network Insight. Basically, NSX has become the glue, the ether that fills VMware’s multicloud management business.

By shifting the center of its universe from hypervisor to the network between and underneath everything, VMware can now provide command and control over infrastructure and applications running in data centers, clouds, mobile devices and even out to the brave new internet of things (IoT) edge.

More MaaS, please
VMware rolled out seven management as a service (MaaS) offerings. MaaS describes a sales model in which a vendor delivers systems management functionality as a remote, subscription utility service. MaaS is ideal for systems management tasks across multiple clouds and complex hybrid infrastructures.

One of the motivations for MaaS is that the IT administrator doesn’t need to install or maintain on-premises IT management tools. Another is that the MaaS vendor gains an opportunity to mine big data aggregated across its entire customer pool, which should enable it to build deeply intelligent services.

Four of these new services are based on existing vRealize Operations technologies that VMware has repackaged for SaaS-style delivery. We’ve also heard that there are more MaaS products on the way.

It’s important for vendors to offer MaaS — such as call home and remote monitoring — because that is the inevitable consumption model for all systems management. Few organizations benefit from dedicating an expert to maintaining an internal, complex systems management tool. And with mobile, distributed and hybrid operations, most existing on-premises management products fall short of covering the whole enterprise IT architecture. I have no doubt the future is MaaS, a model bound to quickly attract IT shops that want to focus less on maintaining management tools and more on efficiently operating hybrid, multicloud architectures.

Storage evolves
The VMworld show floor has been a real storage showcase in recent years, with vendors fighting for more attention and setting up bigger, flashier booths. But it seemed this year that the mainline storage vendors pulled back a bit. This could be because software-defined storage products such as VMware vSAN are growing so fast or that the not-so-subtle presence of Dell EMC storage has discouraged others from pushing as hard at this show. Or it could be that in this virtual hypervisor market, hyper-convergence (and open convergence too) is where it’s at these days.

If cloud-based environments and hybrid management are finally becoming just part of normal IT operations, then what’s the next big thing?

Maybe it’s that all the past storage hoopla stemmed from flash storage crashing its way through the market. Competition on the flash angle is smoothing out now that everyone has flash-focused storage products. This year, nonvolatile memory express, or NVMe, was on everyone’s roadmap, but there was very little NVMe out there ready to roll. I’d look to next year as the big year for NVMe vendor positioning. Who will get it first? Who will be fastest? Who will be most cost-efficient? While there is some argument that NVMe isn’t going to disrupt the storage market as flash did, I expect similar first-to-market vendor competitions.

Data protection, on the other hand, seems to be gaining momentum. Cohesity and other relatively new vendors have lots to offer organizations with large virtual and cloud-based environments. While secondary storage hasn’t always seemed sexy, scalable and performant secondary storage can make all the difference in how well the whole enterprise IT effort works. Newer scale-out designs can keep masses of secondary data online and easily available for recall or archive, restore, analytics and testing. Every day, we hear of new machine learning efforts to use bigger and deeper data histories.

These storage directions — hyper-convergence, faster media and scale-out secondary storage — all support a more distributed and hybrid approach to data center architectures…(read the complete as-published article there)

Secondary data storage: A massively scalable transformation

An IT industry analyst article published by SearchStorage.


Capitalize on flash with interactive, online secondary data storage architectures that make a lot more data available for business while maximizing flash investment.

Mike Matchett
Small World Big Data

We all know flash storage is fast, increasingly affordable and quickly beating out traditional spinning disk for primary storage needs. It’s like all our key business applications have been magically upgraded to perform 10 times faster!

In the data center, modern primary storage arrays now come with massive flash caching, large flash tiers or are all flash through and through. Old worries about flash wearing out have been largely forgotten. And there are some new takes on storage designs, such as Datrium’s, that make great use of less-expensive server-side flash. Clearly, spending money on some kind of flash, if not all flash, can be a great IT investment.

Yet, as everyone builds primary storage with flash, there is less differentiation among those flashy designs. At some point, “really fast” is fast enough for now, assuming you aren’t in financial trading.

Rather than argue whose flash is faster, more reliable, more scalable or even cheaper, the major enterprise IT storage concern is shifting toward getting the most out of whatever high-performance primary storage investment gets made. Chasing ever-greater performance can be competitively lucrative, but universally, we see business demand for larger operational data sets growing quickly. Flash or not, primary storage still presents an ever-present capacity-planning challenge.

A new ‘big data’ opportunity
The drive to optimize shiny new primary storage pushes IT folks to use it as much as possible, supported by suitable secondary data storage. As this is literally a new “big data” opportunity, there is a correspondingly big change happening in the secondary storage market. Old-school backup storage designed solely as an offline data protection target doesn’t provide the scale, speed and interactive storage services increasingly demanded by today’s self-service-oriented users.

We’re seeing a massive trend toward interactive, online, secondary storage architectures. Instead of dumping backups, snapshots and archives into slow, near-online or essentially offline deep storage tiers, organizations are finding it’s worthwhile to keep large volumes of second-tier data in active use. With this shift to online secondary data storage, end users can quickly find and recover their own data like they do with Apple’s Time Machine on their Macs. And organizations can profitably mine and derive valuable insights from older, colder, larger data sets, such as big data analytics, machine learning and deep historical search.
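
As a rough sketch of what enables that self-service recall, consider an online index of file versions that can answer point-in-time lookups directly, rather than queuing an admin-driven restore from offline media. The VersionIndex class and its catalog layout below are hypothetical, just enough to show the idea.

```python
import bisect
from collections import defaultdict

class VersionIndex:
    """Toy catalog of file versions kept online in secondary storage.

    Each backup or snapshot records a (timestamp, object_id) pair per
    path, so a user can self-serve "this file, as of last Tuesday"
    without waiting on an admin-driven restore.
    """
    def __init__(self):
        self._versions = defaultdict(list)  # path -> sorted [(ts, obj_id)]

    def record(self, path, timestamp, object_id):
        bisect.insort(self._versions[path], (timestamp, object_id))

    def as_of(self, path, timestamp):
        """Return the newest version recorded at or before `timestamp`."""
        versions = self._versions[path]
        timestamps = [ts for ts, _ in versions]
        i = bisect.bisect_right(timestamps, timestamp)
        return versions[i - 1][1] if i else None

# Hypothetical usage: two nightly backups of the same spreadsheet.
index = VersionIndex()
index.record("/finance/q3.xlsx", 1700000000, "obj-001")
index.record("/finance/q3.xlsx", 1700600000, "obj-002")
print(index.as_of("/finance/q3.xlsx", 1700500000))  # -> obj-001
```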

If that sounds like a handy convergence of backup and archive, you’re right. There’s increasingly less difference between data protection backup and recovery and retention archiving…(read the complete as-published article there)