
The State of Observability in 2021


Today, we released our second annual Observability Maturity Community Research Findings report. This year-over-year report identifies trends occurring in the observability community that we use to further develop our Observability Maturity Model.

Our goal in running this annual report is to understand community perceptions and awareness of observability, learn how engineering teams are approaching observability, and map an observability maturity model that reflects current research findings.

The report is a helpful way to understand what’s working across the industry, what’s not, and what you can do to further your own business goals with observability.

Key findings in 2021

The 2021 Observability Maturity Community Research Findings report is a follow-up to the inaugural report we published in 2020. This is the first report to examine trends in the observability landscape year-over-year.

This year, we found that overall observability adoption is on the rise. More teams report having moved beyond planning phases for their observability initiatives and into actual practice (up 8%). However, a majority of teams are still at the earliest stages of observability maturity.

Organizations on the higher end of the observability maturity spectrum are realizing benefits such as higher productivity, better code quality, higher rates of end-user satisfaction, and greater retention among their engineering teams. They also report greater confidence in their ability to catch bugs both before and after deploying to production, and the ability to deploy features faster than those on the lower end of the spectrum.

Teams further along the observability maturity spectrum not only realized additional benefits but also delivered higher-impact business outcomes. Virtually all respondents in the Advanced group and nearly 90% in the Intermediate group indicated high rates of customer satisfaction. They were also 3x more likely than groups who had not yet started practicing observability to indicate their customers were always satisfied.

The observability maturity model focuses on team capabilities and outcomes. In other words, it goes beyond what types of data some teams may be collecting (e.g., logs, metrics, and traces) and focuses on what types of problems they’re able to solve. Interestingly, many teams who report using tools like logs, metrics, and traces in production still have trouble catching bugs before and after deploying to production. They also report lower rates of confidence in knowing how to locate the correct source of issues in production. These less mature teams have started down the path of observability, but they aren’t realizing the same benefits as their more mature counterparts.

While interest in observability has gained significant momentum, we also found that maturity is shifting at different paces. The Intermediate and Advanced groups have shifted toward higher maturity over the past year. However, the overall proportion of these two higher-spectrum groups remained constant, signaling that lower-maturity groups are stagnating in their journey. That could indicate that teams who are well versed in observability practices are accelerating their skills and pulling away from their lower-maturity counterparts.

The evolving middle, or Novice group, had more respondents reporting that they had started their observability adoption journey. They report practicing observability processes or having tooling related to observability, but rarely both. They also report some, but not most, key capabilities, such as identifying and resolving bugs before and after deploying to production. Lastly, at the time of the survey, approximately one in five respondents were not practicing or using observability tooling but had plans to do so within the next year.

An interesting trend this year: lower-maturity teams that are already practicing observability but not yet realizing its benefits disproportionately cite a lack of implementation skills as one of their biggest barriers to adoption. Combined with other factors, such as reporting observability practices or tooling but not both, we believe this signals either confusion in the market over which capabilities should be achievable when practicing observability, or that teams need more training to achieve those capabilities (or both). There was not enough granularity in this year's data to dissect that further (that will have to wait for follow-up years of research), but one thing is clear: a wide swath of novice teams needs to build up their skills in order to be successful with observability.

About this research

The survey used to generate these findings was conducted from December 2020 through January 2021 in tandem with ClearPath Strategies, a boutique consulting firm recommended to Honeycomb by Dr. Nicole Forsgren, former lead researcher for DORA and the State of DevOps Reports. Invitations to participate were distributed via a Honeycomb email list and social media outreach, and a total of 405 respondents completed the survey. Our mutual goal is to continue developing an understanding of observability awareness, perceptions, and engineering practices, and to evolve a model for observability maturity.

What this means for you

Download and read the report to understand what’s driving results on the upper end of the observability spectrum, what those just getting started can do to avoid stagnating in the middle, and how to tell if your observability adoption initiative is heading in the right direction.

On June 29 at 9 a.m. PT, you can also join James Governor, Analyst at Redmonk, and Charity Majors, CTO and co-founder of Honeycomb, for the webinar, “The State of Observability 2021: Mature Teams Ship Better Code Faster and You Can, Too,” as they talk through what the findings mean and what you can do to advance your own business impacts with observability.