Reflections on Monitorama 2019
By Christine Yen | Last modified on June 25, 2019
This year was my third in a row attending (and now speaking at!) Monitorama. Because the organizers do a great job of turning introverts into extroverts for three days straight, it’s always a fun and exhausting time—but one of my favorite parts is how much folks continue talking about and sharing the content, days or weeks after it’s over. So, to continue the drumbeat, here are some of my highlights from this year:
I love stories about real-world engineering teams struggling to make sense of their systems, and Luke Demi’s talk (“Logs, Metrics, and the Evolution of Observability at Coinbase”) was no exception. Unlike many past talks at Monitorama, his didn’t describe the arduous journey of inventing their own solution; instead, he walked the audience through the tradeoffs Coinbase engineering made as they tried out different vendors and learned which qualities ultimately mattered to them.
As a bonus, he made the most compelling case for understanding context that I’ve come across yet:
https://twitter.com/Catchpoint/status/1135963674789732352
https://twitter.com/cyen/status/1135960311813496832
Another highlight for me was Nida Farrukh’s talk (“The Power and Creativity of Postmortem Repair Items”), which argued that outages are the best catalyst for change in engineering teams and practices. A real benefit of bringing together folks from hundreds of different organizations, all interested in harnessing the power of data to understand systems, is being able to talk about the human aspect of our business: how the results of our work can feed back into how our engineering teams work together.
https://twitter.com/devblackops/status/1135625061933568001
And, when postmortem’ing an outage, she reminded us all to include different perspectives (which sounds to me like a “don’t forget to pull devs into postmortems, too” reminder!):
https://twitter.com/lizthegrey/status/1135625417560186880
And finally—while the event itself is consistently great, it’s almost more fun to look at Monitorama as a metacommentary on the industry as a whole. As the monitoring/observability zeitgeist changes, so too does the tenor of the conversation and programming:
- From my first Monitorama in 2017, which had zero talks on “observability” and at which Charity, speaking on “Monitoring: A Post Mortem,” predicted seeing everyone the next year at “Observabilityorama”...
- To 2018, in which there were no fewer than 6 talks on “observability,” each with a painstakingly crafted “definition of observability” slide...
- To 2019, in which the event tagline itself was changed to “An Inclusive Event for Monitoring and Observability Practitioners,” and I only counted one "what is observability?" definition across 7 observability-related talks.
It's been great to see the industry's focus evolve and begin to mature, and I am looking forward to seeing what the landscape looks like at next year's Monitorama!
A note from Honeycomb Marketing
We'd be remiss not to share links to the talks Christine and Liz Fong-Jones delivered at Monitorama :)
Christine spoke about developing your own superpowers with observability:
https://vimeo.com/341142053
And Liz spoke about the many tradeoffs on the road to observability:
https://vimeo.com/341145526
We always learn a lot at Monitorama, and hope to see you all there next year. Until then, find out what observability can do for you, no strings attached, at Honeycomb Play.