Digital transformation is a hot topic that triggers countless conversations about technology adoption and capabilities. But one often overlooked aspect comes with dire consequences: the very ability to manage the complexity of the digital business environment. Digital businesses are adopting technologies at breakneck speed, but are the capabilities to efficiently manage new IT assets evolving as fast as they should? This post opens the conversation on some frequent monitoring “misses” of the digital business and the impact they have on its ability to compete on the new standard of digital experience. Don’t be surprised by this post’s conclusion: legacy monitoring will die its own death in 2021.
Consider the data*
- Only 54.25% of IT systems are monitored.
- 49% of IT professionals report using between 4 and 10 different monitoring tools to cover their monitoring needs.
- 51% use manual procedures to update their monitoring perimeter.
- 20% track the performance and availability of IT assets using only alerts and notifications.
- 34% manually create graphical maps, diagrams and dashboards to report availability and performance data.
- And finally, over a third report they’ll need to invest money and effort to overhaul their monitoring capabilities in the next five years.
Let’s be honest, that’s a lot of manual labor to keep up with a digitally transformed business. It’s like asking a horse to check whether a Tesla is running at peak performance. (It’s really unfair to the horse.)
Is your IT monitoring like a horse managing a Tesla?
Checking whether you’re running on an outdated IT monitoring system is easy. You can either take our quick self-assessment of your monitoring maturity or peruse this list of common legacy IT monitoring symptoms:
- A disparate set of monitoring solutions is used; they work in silos and are often point-based, falling short at clouds and containers.
- No automated processes to configure an ever-changing set of interacting technologies in the cloud and at the edge, where users and connected objects interact.
- No comprehensive understanding of which devices, infrastructure, networks, and applications are essential to support workflows.
- Data (which should be the purpose of monitoring) is inconsistent, incomplete or duplicated—it cannot be contextualized for a deeper understanding of IT performance.
- No fluid, dynamic view on how the digital business works as a whole.
Digital disruptions falling under the radar
So, you may be thinking, no big deal: a little manual work only slows down ITOps. But the reality is that legacy monitoring damages the ability of the digital business to reach full throttle. Unoptimized monitoring is essentially sand in the gears.
It increases the likelihood of service disruptions, slows their resolution, and negatively impacts the customer experience, which is the new standard businesses compete on.
According to a prediction by IDC, we’re reaching the point at which 80% of IT executive leadership will be compensated based on business KPIs and metrics that measure IT’s effectiveness in driving business performance and growth. In today’s world, this means responding quickly to changing needs and staying vigilant and aligned with what’s going on at the front lines by building operational agility and resilience throughout the I&O organization.
Serious data starvation
The inability to monitor modern, hybrid infrastructure from cloud to edge deprives the organization of a mine of data that could inform anything from security and capacity vulnerabilities, to predictions of future digital needs, to improved customer journeys. In short, it limits the ability of enterprises to leverage the full value of the agile technologies they invested in. On the other end of the spectrum, a third of IT operations teams have already drafted a clear roadmap to build an IT Ops environment augmented by AI*, eventually connecting it to the business analytics platform.
The statistics have spoken: More IT complexity in 2021
- 32% of IT budgets will be dedicated to the cloud by 2021 (Forbes)
- Along with Kubernetes adoption, 95% of new microservices will be deployed in containers by 2021 (IDC)
- By 2022, over 90% of enterprises worldwide will be relying on a mix of on-premises/dedicated private clouds, multiple public clouds, and legacy platforms to meet their infrastructure needs (IDC)
- By 2021, AI spending for the data center will increase to 52.2 billion US dollars (IDC)
- By 2024, the edge computing market will balloon to 9 billion US dollars, driven by 5G, IoT and AI technologies (Statista)
2021: the year legacy monitoring dies
In 2021, major IT trends will widen the gap between teams that have access to Smart Monitoring tools and those toiling with outdated ones.
IT environments are going to get even more hybrid, as businesses address exponential data growth, a by-product of digital transformation reaching the deepest layers of industrial activity. IT is moving closer to users, with more popular hybrid cloud solutions, increased connectivity between cloud and on-premises assets, and a myriad of connected objects powered by new 5G network deployments. Add to this the global work-from-home “epidemic” and you reach the breaking point where IT monitoring done the old way loses all relevance. The end of 2020 is the now-or-never time for a team discussion on the unmet needs of IT monitoring—and this assessment tool might just be the perfect starter.
If alive-and-kicking IT monitoring is what you need to face the challenges of 2021, ask us about the possibilities of Smart Monitoring.
*Centreon with FocusVision surveyed an online panel of 200 IT professionals from France, the UK, Spain, Italy, the US, and Canada in September 2020.