Observations on the Enterprise of Hiring
By Charity Majors | Last modified on August 18, 2022
Almost everybody hates interviewing.
You aren’t wrong to hate it: interviewing is fucking broken, in ways that tear you down and rob you of your self-respect and your will to live.
If interviewing were a dreaded yet rigorous process that delivered a high degree of confidence and accuracy in the results, that would be one thing. But it’s not. Predicting who will succeed and be happy is scarcely better than a coin flip, even at Facebook and Google. Your own personal enthusiasm for the hire rarely predicts your top performers or your strugglers. Meanwhile, it’s fraught with biases and risk aversion.
At Honeycomb we have dissected all the things wrong with all of the interview processes we have ever been part of, and decided to err in new and exciting ways. Maybe we can’t do better than the industry mean, but we could hardly do much worse. Why not experiment?
The Honeycomb Hiring Experiment
We believe a good process starts with attracting the kind of candidates we value, puts the candidate at ease and shows them at their best, and gives both us and the candidate a semi-realistic sense of what it feels like to sit together and work on something. We want to be respectful of their time, and we want them to leave feeling valued whether we hire them or not. And we want to keep the “must-have” skill list as short as possible…no laundry lists. After all, almost any skill can be learned.
We believe bad processes make people memorize things or study to prepare. Bad processes are adversarial or unpredictable. But a more subtle point is that bad processes can be bad because they don’t produce enough signal to guess whether the person will ultimately be successful on the team. We could design a process that anyone could cruise through and come away loving us … but we’d end up having to manage a lot more people out afterwards. More heartache.
Our interview may feel “easy” to some. It’s not. We have an extraordinarily high bar, in that we don’t hire the large majority of candidates. But our bar is tuned for many criteria that are not algorithms and data structures. Here are some of our interview goals:
- No surprises. The research is clear: unknowns cause anxiety, and people don’t perform well when they’re anxious. We willingly offer up as much detail in advance about the process as we can, even down to the questions. The hiring manager will walk you through what to expect in advance, and tell you what we’re looking for in your answers.
- Realistic code. Obviously we don’t do whiteboard coding: it’s terrible. The coding exercises we use involve extending or improving existing code…from home, the night before, in the comfort of your own dev environment and without an interviewer watching. The code should relate to actual work that you would actually do, and…that doesn’t usually mean green field, does it?
- The communication is the interview. Plot twist…the “coding interview” isn’t really about the code. When you come in for your interview, you’ll talk through your PR with a couple of engineers, and that is the meat and bones of the interview. We believe that if you can communicate clearly about your choices and the tradeoffs, you can certainly do the work, while the reverse is not necessarily true: there are people who can do the work but can’t communicate clearly about it, and that doesn’t work for us. We need both. (We have to teach the world about distributed systems, after all!)
- Product and production. Our other questions should explore your product sense and your production skills … your taste when it comes to shipping user experience, and the impact you will have on the team around you over the lifecycle of your code.
- See every candidate at their best. We don’t hire people for lack of weaknesses, duh. We hire them for their strengths. We try to ask everyone in advance, “what can we do to see you at your best?” We aren’t interested in dwelling on areas where you already know you are weak. Don’t bother trying to fake it! Instead, help us learn what makes you amazing.
- Interview us! We try to allow plenty of time for you to interview us as well. We invite all candidates to come to our engineering meetings and/or all-hands, and usually set up a “get to know us” visit separately from the interview.
How is the Experiment going? Join us and find out!
At this early stage, every single person we hire will leave a lasting imprint on who we are as a company. Who we hire is a mix of skills, interests, personality, background, and availability.
The big companies will hire anyone who “passes” the interview; for us, it’s like delicately playing Jenga with people. We have had to regretfully turn away lots of terrific people with great skills that we simply don’t need at the moment.
And we value diversity a lot. We could have hired all our awesome former coworkers instead of casting a wide net and taking a risk on unknowns…but dammit, we are building a product for everyone, we can’t risk developing a monoculture or tunnel vision.
Honeycomb is not for everyone. As our values statements try to convey, this is a high trust environment, where we operate with abnormally high ownership and autonomy. We also talk about our feelings more openly, try to give feedback regularly, receive it with vulnerability, and operate with a level of transparency that is above the norm. These are some of the most important relationships in our lives: we want to invest in them.
As Chris once put it: we’re “more than friends, less than family”. We aren’t trying to be your life — you’re an adult, you already have one! But we are trying to do something really big and hard together, and if you find all of this interesting, we would love to chat with you.