And this is why most of the interview processes as practiced today are a joke. It's far more a weird cargo-cult hazing ritual than any reasonable assessment. Few interviewers give challenging problems they don't expect to be solved in order to step through a candidate's thought process; instead they have some predetermined ideal solution in mind, or expect the provably optimal algorithm on the spot. That lends itself well to assessing a combination of rote memorization and chance, I suppose.
This is 100% what it is. A "we went through it, so you do too" sort of thing, combined with the fact that management can't interview everyone and SWEs aren't good at reading people to determine if they're lying.
I have a friend who's a manager at a FAANG and while there's a lot of good, smart people there, there are many, many people who are terrible at what they do and are very difficult to deal with.
My company does not do whiteboarding and makes an effort to get to know the person. We rarely get bad hires. Maybe 1 out of 100? Give me 10 to 20 minutes just to talk to someone and I'll tell you if they'll be a good employee. One of the first things I do is assess if they're lying on their resume. If they're not, then I'm comfortable believing in their skill set.
Every other practice in our interview process is justified by historical precedent as well as a team decision about where the team wants to go next.
An interview situation is pretty much the opposite: even when it's a subject people know well, there's way too much "unknown" about your particular setup that the interviewer doesn't/can't know.
2. I usually ask candidates questions and scenarios based upon their submitted code samples, and will presume an OS or cloud provider they’ve listed on their resume is something they’re familiar with. If a candidate has lied for any reason, it becomes very obvious very fast as I keep adapting the question to a narrower and narrower scope, until it’s a purely toy problem that is useless for measuring anything beyond whether they’ve seen it before.

Usually we can spot some errors in the submitted code samples, and we make it clear to candidates that we do not expect perfect or even necessarily working code! The test isn’t about being bug-free but about the attitude one has toward their work output and how they’ve thought about various failure modes. Surprisingly often, candidates say “that can’t happen, I made sure of it in my test cases,” and then I show with a quick test I think up that their submitted code does indeed have flaws that would have caused issues in production: say, OOM kills, segfaults, etc.

A successful candidate’s attitude is accepting and welcoming of criticism as a team effort; they hold strong opinions loosely, updating appropriately when data contradicts their position; and they can accept a challenge from a junior engineer with respect and sincerity. Red flags: arguing with the interviewer for asking an admittedly irrelevant question, getting very defensive about a technical decision, or adamantly saying “this should have been caught in code review so it’s not relevant” and being dismissive of the question. Yes, I’ve seen all of these responses before.

Nowhere in this process does “is an expert at X” enter the picture, and we are usually evaluating the quality of the questions the candidate asks as well. Many of us on the team have respectfully asked about the relevance and presumptions of a question, and that’s a big plus IMO - it shows courage, respect, and critical thinking under pressure.
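To illustrate the kind of “that can’t happen, my tests cover it” moment described above, here’s a hypothetical sketch (not from any real candidate): a recursive `flatten` helper that passes typical hand-written unit tests but blows the stack on a quick adversarial input the interviewer can construct on the spot.

```python
import sys

def flatten(xs):
    """Recursively flatten nested lists (the candidate's version)."""
    out = []
    for x in xs:
        if isinstance(x, list):
            out.extend(flatten(x))
        else:
            out.append(x)
    return out

# Passes the kind of shallow tests a candidate typically writes:
assert flatten([1, [2, [3]], 4]) == [1, 2, 3, 4]

# ...but a quick adversarial input exceeds the interpreter's stack depth:
deep = []
for _ in range(sys.getrecursionlimit() * 2):
    deep = [deep]

try:
    flatten(deep)
    print("survived deep nesting")
except RecursionError:
    print("blew the stack on deeply nested input")
```

The point isn’t the bug itself; it’s whether the candidate responds with curiosity (“interesting, an iterative version with an explicit stack would fix that”) or with defensiveness.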
I make sure to tell them that during the interview, because people are so used to taking orders and shutting up, and that’s bad for the organization.
Another red flag I’ve seen is “I’ve never had to do X, it’s taken care of by Y,” combined with being unable to reason or even conjecture about how Y could be designed, even after being reassured it’s not about correctness, when Y is on their resume. For example, I had someone who was pretty clearly a competent, skilled programmer but was applying to be an SRE and didn’t know how to work on a system below the container level with any tools, proprietary or OSS. They were recommended to a different team, but not for the position they applied for.
I assure you that everyone I’ve interviewed for years now has said in the follow-ups with recruiters that they had a great experience, that the interview felt like a technical conversation rather than a hazing / torture test, and, once hired, that none of the questions felt irrelevant to the job. I don’t act like the asshole interviewer trying to push people to some limit, which seems to be the aim of most technical interviews in our industry.
When that happens, do you tell them they have 5 minutes and if they don't figure it out they're fired? Try that sometime and see how they perform.