I moved companies recently, which came with the requisite job search.
Having conducted four or five hundred interviews over the past two years,
I felt more prepared than in any previous search, but being back on the other
side of the table again really solidified some ideas for me.
The first thing that struck me is that the state of interviewing
is improving: many processes now involve giving a prepared presentation
on a technical topic instead of an impromptu one (closer to replicating
a real work task),
and many have replaced the whiteboard algorithm problem
with collaborative pair programming on a laptop with your editor of choice.
Looking back on my early interviewing experiences, where I was once asked to
do calculus on the whiteboard, it’s amazing how far things have improved.
That said, it’s certainly not the case that interviewing has improved
uniformly. There is still a lot of whiteboard programming out there,
and a disproportionate number of the most desirable
companies continue with the practice due to the combined power of
inertia (it was the state of play when many engineers and managers–including
myself–entered the profession) and coarse-grained analytics (if you’re hitting
your hiring goals–and with enough dedicated sourcers, any process will hit
your hiring goals–then it can be hard to prioritize improving your process).
Reflecting on the interviews I’ve run over the past few years and those I got to
experience recently, I believe that while interviewing well is far from easy,
it is fairly simple:
- be kind to the candidate,
- ensure all interviewers agree on the role’s requirements,
- understand the signal your interview is checking for (and how to search that signal out),
- come to your interview prepared to interview,
- deliberately express interest in candidates,
- create feedback loops for interviewers and the loop’s designer,
- instrument and optimize like you would any conversion funnel.
You don’t have to do all of these to be effective! Start from being kind
and slowly work your way through to the analytics.
Be Kind
A good interview experience starts with being kind to your candidate.
Being kind comes through in the interview process in a hundred different ways.
When an interview runs over time before getting to the candidate’s questions,
the kind thing to do is to allow the candidate a few minutes to ask questions instead of
running on to the next interview to catch up. Likewise, in that scenario the kind thing is
to then negotiate new staggered start times versus kicking off a cascade of poor interviewer
time management as each person tries to fractionally catch up to the original schedule.
My experience is that you can’t conduct a kind, candidate-centric interview process if your
interviewers are tightly time constrained. Conversely, if an interviewer is unkind to a candidate
(and these unkindnesses are typically of the “with a whisper not a bang” variety), I believe
it is very often a structural problem with your interviewing process, and not something you
can reliably dismiss as an issue with that specific interviewer.
Almost every unkind
interviewer I’ve worked with has been either suffering from interview burnout after doing
many interviews per week for many months, or has been busy with other work to the extent
that they started to view interviews as a burden rather than a contribution.
To fix it, give them an interview sabbatical for a month or two, and make sure their overall
workload is sustainable before moving them back into the interview rotation.
(Identifying interview burnout is also one of the areas where having a strong relationship
with open communication between engineering managers and recruiters is important.
Having two sets of eyes looking for these signals helps.)
What Role Is This, Anyway?
The second critical step towards an effective interview loop is
ensuring that everyone agrees on the role they are interviewing for,
and on which skills that role requires, and to what degree.
For some roles–especially roles
which vary significantly between companies like engineering managers,
product managers or architects–this is the primary failure mode for
interviews, and preventing it requires reinforcing expectations during
every candidate debrief to ensure interviewers are “calibrated.”
I’ve found that agreeing on the expected skills for a given role can
be far harder than anticipated, and can require
spending significant time with your interviewers to agree on what the role requires.
(Often in the context of what extent and kind of programming experience is needed
in engineering management, devops, and data science roles.)
After you’ve broken the role down into a certain set of skills and requirements,
the next step is to break your interview loop into a series of interview slots
which together cover all of those signals. (Typically each skill is covered by
two different interviewers to create some redundancy in signal detection if one of
the interviews doesn’t go cleanly.)
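To make that slot-to-signal mapping concrete, here is a minimal sketch in Python; the slot names, skill names, and the redundancy threshold of two are illustrative assumptions, not a prescribed loop:

```python
from collections import Counter

# Illustrative mapping from interview slots to the skills each slot covers;
# the slot and skill names are invented, not a recommended loop.
LOOP = {
    "prepared_presentation": {"system design", "communication"},
    "pair_programming": {"coding", "collaboration"},
    "codebase_debugging": {"coding", "system design"},
    "manager_interview": {"communication", "collaboration"},
}

def uncovered_skills(loop, required, redundancy=2):
    """Return the required skills covered by fewer than `redundancy` slots."""
    coverage = Counter(skill for skills in loop.values() for skill in skills)
    return {skill for skill in required if coverage[skill] < redundancy}
```

Running `uncovered_skills` against your required skill set flags any signal that lacks the two-interviewer redundancy described above before a candidate ever shows up.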
Identifying the signals you want is only half of the battle, though: you also
need to make sure the interviewer and the interview format actually expose that signal.
It really depends on the signal you’re looking for, but a few of the interview
formats which I’ve found very effective are:
- prepared presentations on a topic: instead of asking the candidate to explain
some architecture on the spur of the moment, give them a warning before the
interview that you’ll ask them to talk about a given topic for 30 minutes,
which is a closer approximation of what they’d be doing on the job,
- debugging or extending an existing codebase on a laptop (ideally on their
laptop) is much more akin to the day to day work of development than writing
an algorithm on the board. A great problem can involve algorithmic components
without coming across as a pointless algorithmic question (one company I spoke
with had me implement a full-stack auto-suggest feature for a search inbox,
which required implementing a prefix tree, but without framing it as yet-another-algos-question),
- giving demos of an existing product or feature (ideally the one they’d be working on)
helps them learn more about your product, get a sense of if they have interest
around what you’re doing, and helps you get a sense of how they deliver feedback,
- roleplays (operating off a script which describes the situation) can be pretty effective
if you can get the interviewers to buy into it, allowing you to get the candidate to
create more realistic behavior (architecting a system together, giving feedback on poor performance,
running a client meeting, etc).
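As a hint at how an algorithmic component can hide inside a practical problem like the auto-suggest exercise above, a minimal prefix tree might look like the following sketch (the class and method names are my own invention, not taken from that company’s exercise):

```python
class TrieNode:
    """One node in the prefix tree: children keyed by character."""
    def __init__(self):
        self.children = {}
        self.is_word = False

class Trie:
    """Minimal prefix tree backing an auto-suggest lookup."""
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def suggest(self, prefix, limit=5):
        """Return up to `limit` stored words starting with `prefix`, alphabetically."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        # Depth-first walk below the prefix node, collecting complete words.
        results = []
        stack = [(node, prefix)]
        while stack and len(results) < limit:
            current, word = stack.pop()
            if current.is_word:
                results.append(word)
            # Push children in reverse-sorted order so the stack pops alphabetically.
            for ch in sorted(current.children, reverse=True):
                stack.append((current.children[ch], word + ch))
        return results
```

The point of framing it inside a feature is that the candidate reaches for the data structure because the problem demands it, not because the question announces it.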
More than suggesting you specifically try these four approaches (though you should!),
the key point is to keep trying new and different approaches that improve your chances of
finding signal from different candidates.
Show Up Prepared
If you know the role you’re interviewing for and know the signal your interview
slot is listening for, then the next step is showing up prepared to find that
signal. Being unprepared is, in my opinion, the cardinal interview sin because it shows
disinterest in the candidate’s time, your team’s time, and your own time.
In the fortunately rare situations when I’ve been interviewed by someone who was both
rude and unprepared, I still remember the unprepared part first and the rude part second.
I’ve also come to believe that interview preparedness is much more company dependent than it
is individual dependent. Companies which train interviewers (more below), prioritize interviewing,
and maintain a survivable interview-per-week load tend to do very well, and otherwise they just don’t.
Following from this, if you find your interviewers are typically unprepared, it’s probably
a structural problem for you to iterate on improving and not a personal failing of your
interviewers.
Deliberately Express Interest
Make sure your candidates know that you’re excited about them.
I first encountered this idea reading Rands' “Wanted” article
and he does an excellent job of covering it there. The remarkable thing is how
few companies and teams do this intentionally: in my last interviewing process, three of the companies
I spoke with expressed interest exceptionally well, and those three companies ended up being the ones
I engaged with seriously.
Whenever you extend an offer to a candidate, have every interviewer send a note
to them saying they enjoyed the interview (compliment rules apply: more detailed explanations
are much more meaningful). At that point as an interviewer it can be easy
to want to get back to “your real job”, but resist the temptation to quit closing
just before you close: it’s a very powerful human experience to receive a dozen
positive emails when you’re pondering if you should accept a job offer.
Create Feedback Loops
Interviewing is not a natural experience for anyone involved. With intentional practice you’ll
slowly get better, but it’s also easy to pick up poor interviewing habits (asking brainteaser questions)
or to keep using outdated techniques (focusing on whiteboard coding). As mentioned earlier, even great
interviewers can become poor ones when experiencing interview burnout or when they are overloaded
with other work.
The fix for all these issues is to ensure you build feedback loops into your process, both
for the interviewers and for the designer of the interview process. Analytics (discussed in the
next section) are powerful for identifying broad issues, but for actively improving your process,
I’ve found that pair interviews, practice interviews and weekly sync ups between everyone
strategically involved in recruiting (depending on your company’s structure, this might be
recruiters and engineering managers or something else) work best.
For pair interviews, have a new interviewer (even if they are experienced somewhere else!)
start by observing a more experienced interviewer for a few sessions, then gradually take on more
of the interview until eventually the newer interviewer leads and the more experienced one observes.
Since your goal is to create a consistent experience for your candidates, this is equally important
for new hires who are experienced interviewing elsewhere as it is for a new college grad.
To get the full benefit of calibration and feedback, after the interview have each interviewer
write up their candidate feedback independently before the two discuss the interview and candidate
together (generally I’m against kibitzing about a candidate before the group debrief to reduce biasing
later interviews based on an earlier one, but I think this is a reasonable exception given you’ve
experienced the same interview together and in a certain sense calibrating on interviewing at your
company is about having a consistent bias in how you view candidates, independently of who on your
team interviews them).
Beyond the interviewers getting feedback, it’s also critical that the person who owns or designs the
interview loop get feedback. The best places to get that are from the candidates and from the interviewers themselves.
For direct feedback from candidates, in my “manager interview” sessions, I’ve started to ask every candidate how the process
has been and what we could do to improve. The feedback is typically surprisingly candid,
although many candidates aren’t really prepared to answer the question after five hours of interviews
(it’s easy to get into the mode of surviving the interviews rather than thinking critically about the
process which is being used to evaluate you). The other–more common–mechanism is to have the recruiters
do a casual debrief with each candidate at the end of the day.
Both of these mechanisms are tricky because candidates are often exhausted and the power dynamics
of interviewing work against honest feedback. Maybe we should start proactively asking every candidate
to fill out an anonymous Glassdoor review on their interview experience. That said, this is definitely
a place where starting to collect some feedback is more important than trying to get it perfect in the
first pass: start collecting something and go from there.
Optimize The Funnel
Once you have the basics down, the final step of building a process which remains healthy for the long haul
is instrumenting the process at each phase (sourcing, phone screens, take-home tests, onsites, offers, and so on)
and tracking those metrics over time. If your ratio of
referrals:sourced+direct goes down, then you probably have a
problem (specifically, probably a morale problem in your existing team), and if your acceptance rate goes down then
perhaps your offers are not high enough, but also it might be that your best interviewer has burned out on
interviewing and is pushing people away.
Keep watching these numbers and listening to candidate post-process feedback, and you can go to sleep at
night knowing that the process is still on the rails.
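As a minimal sketch of what that instrumentation could look like (the stage names and counts here are invented for illustration, not benchmarks), the funnel reduces to per-stage candidate counts and the conversion rate between adjacent stages:

```python
# Hypothetical per-stage candidate counts for one quarter; the numbers are
# invented for illustration, not benchmarks.
FUNNEL = [
    ("sourced", 400),
    ("phone_screen", 120),
    ("onsite", 40),
    ("offer", 12),
    ("accepted", 8),
]

def conversion_rates(funnel):
    """Return (from_stage, to_stage, rate) for each adjacent pair of stages."""
    return [
        (stage_a, stage_b, count_b / count_a)
        for (stage_a, count_a), (stage_b, count_b) in zip(funnel, funnel[1:])
    ]

for src, dst, rate in conversion_rates(FUNNEL):
    print(f"{src} -> {dst}: {rate:.0%}")
```

Even something this crude, snapshotted each quarter, is enough to notice the offer-to-accept rate drifting down before it becomes a crisis.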
As a side note: I’ve put optimizing your funnel–and by this I include the entire process of building explicit analytics
around your process–as the last priority in building a great interviewing process.
From a typical optimization perspective, you should always measure
first and optimize second, and here I’m giving the opposite advice.
Doing this first instead of last is definitely reasonable; in fact, I considered making this the first priority,
and when I was setting up my last hiring process it was the first thing I did.
In the end I think you’ll find that your process cannot thrive without handling the first six priorities, and
that your analytics will direct you towards fixing those issues. Plus, the underlying data is very often poor
and it can be easy to get lost spending your cycles on the process of instrumenting your process
instead of improving it.
Altogether, the most important aspect of interviewing well is to budget enough time to
interview well and to maintain a healthy skepticism about the effectiveness of your current
process. Keep iterating forward and your process will end up being a good one.
I’m sure I missed quite a few components of interviewing well, I’d love to hear more ideas
at @lethain on Twitter or over email (included in the right-rail).