Becoming an IT Industry Analyst

In April 2012 I changed career directions to become a full-time IT Industry Analyst (with Taneja Group), focusing on enterprise IT infrastructure, HPC technologies, and the Big Data opportunities coming into data centers everywhere. Oh, and all things virtualization, storage, and cloud.

Since I will now be writing for a living, and that writing will mostly belong to my new employer and our clients, I will add a post here whenever I get something published, but it will likely be only an excerpt plus a link to the authoritative (copyright-owning) website.

But I still expect to write a few things here just for myself now and then 🙂

Thanks for reading!


P.S. New bio for the new job!


Senior Analyst & Consultant

Mike brings to Taneja Group over 20 years of experience in managing and marketing IT data center solutions, particularly at the nexus of performance, capacity, and virtualization. Currently he is focused on IT optimization for virtualization and convergence across servers, storage, and networks, especially to handle the requirements of mission-critical applications, Big Data analysis, and the next-generation data center. Mike has a deep understanding of systems management, IT operations, and solutions marketing that helps drive architecture, messaging, and positioning initiatives. He has previously worked at a senior level in services, marketing, and product management at a diverse portfolio of companies ranging from large enterprises like Dell Storage and BMC to successful startups like BGS and Akorri (acquired by NetApp), and was most recently CTO and co-founder of a social media startup applying machine learning to augment human capability. He started out in IT implementing highly secure networks for federal agencies after proudly serving as a USAF intelligence officer in Desert Storm. Mike received his Bachelor’s degree in Electrical Engineering from MIT.

Amazon Defines Big Data As Big Opportunity

(Excerpt from original post on the Taneja Group News Blog)

Today I had the opportunity to hear about Amazon’s Big Data solutions at an Amazon Web Services Big Data Summit here in Boston. At this event, crowded with local tech talent from the hot bio, research, financial, and web industries, AWS showcased their EC2 compute cluster instances for HPC and their Elastic MapReduce service that runs Hadoop in the cloud – trotting out several interesting real-world users.
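To make “Hadoop in the cloud” a little more concrete, here is a minimal sketch of what kicking off an EMR cluster with a single Hadoop streaming step can look like from Python. This is my own illustration, not anything AWS demoed – the boto3 SDK, region, bucket names, instance types, and IAM role names below are placeholder assumptions you would swap for your own.

```python
# Illustrative only: launch a small, transient EMR cluster that runs one
# Hadoop streaming step and then terminates. All names/sizes are placeholders.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="wordcount-demo",
    ReleaseLabel="emr-6.15.0",           # any EMR release that bundles Hadoop
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,               # 1 master + 2 core nodes
        "KeepJobFlowAliveWhenNoSteps": False,   # tear down when the step finishes
        "TerminationProtected": False,
    },
    Steps=[{
        "Name": "streaming word count",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": [
                "hadoop-streaming",
                "-files", "s3://my-bucket/mapper.py,s3://my-bucket/reducer.py",
                "-mapper", "mapper.py",
                "-reducer", "reducer.py",
                "-input", "s3://my-bucket/input/",
                "-output", "s3://my-bucket/output/",
            ],
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Launched cluster:", response["JobFlowId"])
```

In this sketch the cluster exists only for the life of the job: it spins up, chews through whatever is in the input bucket, writes results back to S3, and shuts itself down.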

John Rauser, an Amazon big data scientist, presented a thought-provoking session on how Amazon uses and views Big Data. I wouldn’t want to shamelessly steal his material, but I just have to relate his definition of Big Data. He said that you are really dealing with Big Data at the point when you have data that needs distributed processing. In other words, it’s Big because it’s more than one node or a single monolithic application can handle today, or can even be expected to handle “forever” as the dataset grows. Once you have to cross the threshold to “distributed,” you have effectively entered the land of the Big.
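As a toy illustration of that threshold (mine, not John’s), here is a hypothetical Hadoop streaming word count in Python. The same mapper and reducer logic runs as a single pipeline on a laptop while the data is small, and fans out unchanged across a cluster once it isn’t – which is exactly the “distributed” crossing point in his definition.

```python
#!/usr/bin/env python3
# Toy Hadoop streaming word count (illustrative sketch, not Amazon's material).
# Usage, locally:   cat input.txt | ./wordcount.py map | sort | ./wordcount.py reduce
# On a cluster, Hadoop runs the map and reduce phases on many nodes and
# performs the sort/shuffle between them.
import sys

def mapper():
    # Emit "word<TAB>1" for every word on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word.lower()}\t1")

def reducer():
    # Input arrives grouped (sorted) by word; sum the counts per word.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    mapper() if mode == "map" else reducer()
```

Locally the sort command stands in for Hadoop’s shuffle; once the input no longer fits one machine, the framework supplies the splitting, shuffling, and sorting across nodes, and by this definition you are officially doing Big Data.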

In this view, the effective Big Data market isn’t just the “extremely Big Data” of petabyte-sized datasets that others are talking about. Those petabyte apps get a lot of press, but they sit out on the long tail of dataset sizes. Rather, the Big Data opportunity is every potential analysis and app that is just out of reach of current single-node IT systems, developers, and operators, up to and including the petabyte monsters. That covers a really broad swath of datasets, one that isn’t limited by any hard threshold on dataset size.

I really like that definition of Big Data. It’s a practical and useful way to think about when you can, and should, get into Big Data technologies. And it clearly ties into Amazon’s strategy of enabling both cost-effective analytical development and ongoing cluster operations for any indefinitely scalable dataset, even if that dataset is only a few hundred GB today. With nearly every dataset conceivably growing large over time, eventually all datasets will be Big Data. If you are developing a new data analytics app, the time to build it as a Big Data app might be now.

…(read the full post)

DH2i: Getting Mission Critical Apps Virtualized

(Excerpt from original post on the Taneja Group News Blog)

Last week we found ourselves in a discussion with a new vendor in the virtualization space. It’s a crowded space to break into, but their target seems right on – virtualizing mission-critical applications. That’s a hot topic these days, as folks try to squeeze even more benefit out of enterprise virtualization efforts by tackling apps with high availability, performance, and integrated storage requirements.

…(read the full post)