"Do coding interviews work in the actual world in which we live in?" No, fundamentally not. Almost nobody is sufficiently capable to actually analyze code, and writing code with an audience is absolutely a stupid anxiety inducing high stress situation that makes candidates into babbling children who can't define an array - EVEN IF THE CANDIDATE CAN IN FACT PROGRAM JUST FINE. Add in one idiot on the interview panel who can't stop "tsk"-ing or asking obnoxious questions ("Why did you use a foreach and not a for loop?") and you have an hour long anxiety sandwich where the only thing you are getting from the candidate is whether or not they can dance in front of a sufficiently hostile crowd on demand.
With the exception of one, which was a bit too much like a puzzle, I think the assignments were completely fair examples of what I might experience on a daily basis. Not at all tailored to people doing competition-style programming, which was what I expected after being exposed to HN for years.
I have to say I enjoyed both preparing for the interview as well as the interviews themselves. And I believe that overall the interviewers were able to grasp how I would code.
With zero preparation, I could go to an interview with you right now and tell you anything you wanted to know about designing a database schema, how to get any kind of data out of that schema, how to make that data available via an API, how to cache the data and perform permissions checks, how to deploy code to a cloud environment, how to set up a build pipeline, or any other question you might want to ask about the day-to-day process of developing and deploying web applications. I can answer those questions because I already work in the industry and I broadly know what I'm doing.
However, I probably couldn't pass a FAANG interview without months of prep work, precisely because whiteboard interviews bear almost no resemblance to the reality of software development.
The questions that are asked have little to no basis in my daily work as an engineer. Same is true for the hundreds of other engineers I’ve worked with who have gone through the same hoops that I have.
In my experience, you have a lot more than an hour to do it in the real job. And while perhaps 1% of SW folks will deal with it on a daily basis, most will not. I certainly have gone years without needing to use graph algorithms (including even DFS/BFS). And I've needed a BST only once in a decade - and I didn't need to code it, just needed to understand the data structure and complexity to know that "binary_search" in C++ was a superior option to using a map for my use case (even though the complexity is the same).
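To illustrate that comparison, here is a minimal sketch (the function names and `int` keys are invented for the example; the comment gives no actual code). Both lookups are O(log n), but the sorted vector keeps its elements contiguous in memory, while `std::map` is a node-based tree whose lookups chase pointers between separate allocations:

```cpp
#include <algorithm>
#include <map>
#include <vector>

// O(log n) lookup over a sorted, contiguous vector.
bool in_sorted_vec(const std::vector<int>& v, int key) {
    return std::binary_search(v.begin(), v.end(), key);
}

// Also O(log n), but each std::map node is a separate heap allocation,
// so every lookup hops across memory rather than scanning a flat array.
bool in_map(const std::map<int, int>& m, int key) {
    return m.count(key) > 0;
}
```

Same asymptotic complexity, different constant factors - which is exactly the kind of judgment the commenter says mattered in practice.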
Some context: at work I recently solved a problem where I had to build a graph (DAG) and do some queries on it mildly efficiently[1]. I did it quickly thanks to time wasted on Leetcode, but no one knew how quickly, and people were impressed that I "got it done in less than a day". The reality in most SW work is that whether it takes you a day or an hour to solve this problem, the overall rate of progress will not be impacted.
[1] Mildly efficiently because for the first POC pass I didn't memoize, and then I got lazy and decided I'd optimize only after getting a real-world data set and seeing if it was slow. It wasn't, even for a decently large data set. The reason? Business logic dictated that the max distance between two nodes did not exceed 8, so from a complexity standpoint, skipping memoization doesn't increase the work N-fold; it just changes the constant factor.
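A sketch of the trade-off described (the graph representation, query, and function names are invented; the original problem isn't specified). Without memoization a recursive DAG query can revisit shared nodes repeatedly, but if business rules cap every path at a small constant length, that rework is a bounded constant factor rather than an N-fold blow-up:

```cpp
#include <algorithm>
#include <unordered_map>
#include <vector>

// A DAG as an adjacency list; node ids index into the outer vector.
using Dag = std::vector<std::vector<int>>;

// Longest path out of `node`, no memoization. Shared subgraphs get
// recomputed, but with path lengths capped at ~8 that's a constant factor.
int longest_path(const Dag& g, int node) {
    int best = 0;
    for (int next : g[node])
        best = std::max(best, 1 + longest_path(g, next));
    return best;
}

// The memoized variant computes each node once: O(V + E) overall.
int longest_path_memo(const Dag& g, int node,
                      std::unordered_map<int, int>& memo) {
    auto it = memo.find(node);
    if (it != memo.end()) return it->second;
    int best = 0;
    for (int next : g[node])
        best = std::max(best, 1 + longest_path_memo(g, next, memo));
    return memo[node] = best;
}
```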
Somehow, I suspect that in half of the FAANG interviews, had I coded this and argued that the complexity didn't change, I would not get a pass.[2]
[2] Not trying to be cynical: I did argue this for a different problem in a Google phone screen - I had been asked to write some code + data structures to handle a Manager/Report relationship, and the interviewer pointed out my data structure was less than optimal. I countered with "In most companies, the number of employees a manager has under him/her is usually bounded by a small number." Surprisingly, the interviewer gave in.
Curious, were there any questions about what you think of their company ethics, and if you have a problem with it?
You learn a bit about the company during the interview, and if that bit is bad, then move on. If they ask shitty leetcode questions and the interviewers are hostile or hung up on unimportant details - move on to another company.
Take for example the "can't define an array" crowd. Are there people virtually sitting in front of me who just forget the basic syntax? Absolutely. But given that we a) tell them correct syntax isn't dreadfully important, and b) let them pick the language, so a bare [] would probably suffice, I don't think it's a completely unreasonable ask.
Moreover, when we added a pre-step of basic questions that one can google, the "can't define an array" crowd mostly disappeared. Sure, there may be some other cause at work there, but it's hard to hire a coder who you know can't demonstrate coding.
Finally, anecdata: since we started coding interviews, the quality of the developers we hire is simply better. We have had internal debates about this, but the truth is, everyone is overall happier with what we're doing, and I'm a lot happier that false positives are way down.
I'm sure there's other ways of course. :)
Knowledge generally isn't a great indicator of future job performance; intelligence is. There might be some correlation between the two (you probably need a level of intelligence to acquire the knowledge for LC-style questions), but most jobs are layering proxy upon proxy to test something that has little to do with job performance. Being able to muddle through syntax, or to search for basic solutions and adapt them, is leagues different from solving LC problems in an isolated environment.
Then add to that the fact that the alternatives tend to be even worse. Personality testing works when done right, but the vast majority of interviewers play pretend-psychiatrist and make the entire thing way too subjective to be effective beyond filtering obvious red flags. Take-homes work, but most companies won't compensate for the time, and there's no framework for showing which take-homes an individual has already passed (the way a certification would), making them an incredible time sink.
And then there are always the cases who have little experience yet manage to pick everything up in 2-3 months and become anywhere from average to top performers.
Tell that to the people who are always the false negative.
I wonder at what point it stops being a "whiteboard interview" and becomes just "a few sprinkles of algorithmic questions to test logical skill".
Also, when the talent pool is large enough, obviously you can ask as many brutal questions as you like and still fill the headcount with good developers. So I think saying "but it still works" is not very productive.
I look forward to seeing you disrupt every major tech company in the world via your superior hiring strategy.
> EVEN IF THE CANDIDATE CAN IN FACT PROGRAM JUST FINE.
You don’t seem to have thought very hard about what employers are optimising for. Everyone realises that programming interviews produce lots of false negatives. Even the strongest programmers have bad days. From the employer’s point of view, that’s okay - minimising false negatives is not the goal. The goal is minimising false positives. Failing to hire a strong candidate carries a much lower cost than hiring someone who can’t do the job.
Which becomes even worse when realizing "false positive/true positive" is too binary of a definition for a spectrum such as job performance.
And I say this as someone who's in favor of giving people a quick check on whether they can actually do FizzBuzz.
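For reference, FizzBuzz is about this much code (a minimal sketch; the exact wording of the spec varies, and the function form is my own framing):

```cpp
#include <string>

// FizzBuzz: "Fizz" for multiples of 3, "Buzz" for multiples of 5,
// "FizzBuzz" for multiples of both, the number itself otherwise.
std::string fizzbuzz(int n) {
    if (n % 15 == 0) return "FizzBuzz";
    if (n % 3 == 0) return "Fizz";
    if (n % 5 == 0) return "Buzz";
    return std::to_string(n);
}
```

That a quick check at this level still filters people out is the whole argument for keeping it.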
Have you ever seen a hiring process that's simply not effective?
The undercurrent of this irks me a bit. It assumes a kind of omniscience on part of the employer, and enough time for them to develop a good interview process. I think this is really hard to do.
I think you're generally right. And a lot of the time, employers are empathetic, pragmatic, self-aware and really are doing us a favor when they deny us. Some take into account I-O psychology stuff, too.
I (and others probably) are eager to hear what companies you feel do a good job at hiring.
> Everyone realises that programming interviews produce lots of false negatives.
Let's assume a bad scenario. A company can disregard evidence of competency, like having a portfolio, and basically run a rolling, year-round tech Olympiad. To them, it's about candidates who can front as strong performers in these events, even at the expense of being good at the role.
You can be a pro at pole vault or discus throwing, but a poor carpenter. But there's a mentality out there - with some - that a strong, fit candidate can handle any technical challenge.
Can a chemist be a great chef? I bet. I also bet some could overthink and/or overengineer preparing a simple meal. Does the interview process do anything to check for soft skills, like curiosity, simplicity, etc.? Hopefully.
> The goal is minimizing false positives. Failing to hire a strong candidate carries a much lower cost than hiring someone who can’t do the job.
I think some companies - not all - can create a "squid game" contest out of candidates. Worse, I feel candidates can unduly suffer and be humiliated in these cases.
The company gets a superb interviewer, who is no doubt a very intelligent person, but not necessarily one that can adapt to technical reality, contingencies, ambiguities, etc. Some interview processes end up focusing on theoretical and rote knowledge, not integration and synthesis, when they deeply need that. Turnover happens.
But we're back to the beginning: did the company 1. really know what it needed for that position, and 2. did the interview process check for that? My guess is that when you wrote that, you were assuming situations where both were true?
Hire for strengths, not to avoid weaknesses.
You should be able to explain why you wrote something the way you did; that's not an absurd ask.
Perf characteristics of different iteration styles are often significant to the task at hand.
This is ridiculously untrue.
It's also a great example of how dumb current interviews are. Someone like this poster could easily interview you, and now what? Do you just play along or begin to explain how incredibly wrong they are?
See, that's just not right. I mean, we all agree there's a hard problem to solve here, right? The world is populated by engineers of widely varying skill levels. It just is.
So how do you detect them? Well, the ability to bang out a correct solution on a whiteboard is a verifiably correct way to do that. It may not be optimal (passing candidates with other issues that make them bad fits), and it may have undesired failure modes (rejecting people who have trouble performing in the high pressure environment). And those are problems worth discussing.
But... it works. It clearly works. We've all seen it work. Companies that use these techniques aggressively tend strongly to be organizations with a long track record of producing working software. The FAANG employers are desirable because FAANGs ship code and make money!
So I just don't get the absolutism here. If there's a clearly better way that produces a more effective workforce, where's the darwinian example? Who's winning by hiring better people than Apple or Meta? I just don't see it.
Once you get to a scale where you need to hire 20 devs a year and get 100s of CVs per week, it all breaks down.
[0] side note - I work in games where we're often filtering resumes for specific roles, e.g. a gameplay programmer or an online programmer. The filtering here is most of the time bucketing candidates based on their experience and sorting that way.
If you have a better way on how to assess a software developer, I would love to hear it.
Instead I try to find an isolated problem/task we have in our backlog that the candidate should try to implement on their own time, so I can review what kind of solution they will actually deliver once hired. We then go through the solution with them and do a code review as they would go through if they got hired. You then see both the problem-solving skills as well as how they take feedback.
I would recommend checking out DuckDuckGo's excellent "How we hire" guide[0] which describes some of the best hiring process I have come across (albeit it may be too extensive to do in full for some companies).
Step 1: An initial phone interview with the candidate regarding technologies of choice, war stories, etc
Step 2: A small coding exercise that is similar to what we work on. The candidate can solve this on their own time and email us the solution.
Step 3: Review the solution together and ask questions about their methods.
Step 4: Meet the boss & offer.
The coding exercise is only sent _after_ the initial phone call, so we don't waste their time or ours.
The only problem here is that we can't virtue signal that we follow FANG whiteboard interviews and pretend that everyone working at the company can invert a Trie Tree in a 30-minute coding session.
1st interview - Go over the role etc, both parties ask question to see if they are a good fit. If both parties are happy to move to the next stage then there is a take home exercise to do.
Tech Exercise - Something simple which touches the core competencies that the role requires.
2nd interview - Go through the exercise, discussing the design choices etc. If everyone is happy, offer the job.
I think the probation period should be used to determine if the candidate is the right fit for the role. That way the business gets a much better idea of who it is they are hiring and the same goes for the candidate.
I'm sure Google wouldn't be as successful as they are if their answer to "how do you define an array" was "well, think about what you're trying to do here".
One of the things I like to do, especially when a candidate is super anxious, is just ask them to talk about a recent project they've done that they like and why they like it.
It’s about the person’s behavior, the way he/she interacts with you. It’s about the person’s culture, tabs vs spaces, and that kind of thing. And it’s mostly about whether you like the person or not.
And I think it’s fine. You mainly need to be sure that the person knows what a for loop is, other than that, most people are ok at programming. What really matters is that you can work with that person.
Unfortunately interviewers who don’t realize this will unconsciously tweak the interview to make it harder for people they don’t like and easier for people they do like. And then they have an “objective” reason to not hire the person: “oh my god, he didn’t know merge sort”.
I give all kinds of random questions. And many times I recommend hiring people who bombed it, and many times I recommended bailing on ones who nailed it (overconfidence, poor communication, etc).
Not in FAANG or famous startups it's not. You're either gonna solve 3-6 medium-hard Leetcode questions perfectly or you're out. Sure, being likable helps, and you'd better not come off as a douche, but you're not gonna get an offer without being able to solve them (unless it's some kind of diversity hire).
* read code
* extend some working code with a new feature
* parse some semi-structured text into a data structure
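A toy instance of that last task might look like this (the `key=value;key=value` input format and the function name are invented for illustration):

```cpp
#include <map>
#include <sstream>
#include <string>

// Parse "key=value;key=value" text into a map, skipping malformed fields.
std::map<std::string, std::string> parse_pairs(const std::string& text) {
    std::map<std::string, std::string> out;
    std::istringstream in(text);
    std::string field;
    while (std::getline(in, field, ';')) {  // split on ';'
        auto eq = field.find('=');
        if (eq != std::string::npos)
            out[field.substr(0, eq)] = field.substr(eq + 1);
    }
    return out;
}
```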
Programming interviews suck for everyone, even the interviewer. I feel absolutely horrible when someone locks up, or fails really hard. The questions I ask aren't hard, but half don't pass. What folks should do is practice coding in front of someone: writing little exploratory snippets and understanding the base library of the language they claim to know. I don't even ask people to vocalize while they are working on the problem. And I solve it at the same time they are, and if they get stuck, I show them some of my code.
I had one person start crying they put so much pressure on themselves, I had to take a two week break from interviewing after that.
The idea that everything boils down to answering some intro to programming questions quickly is absurd.
Of course I interview for my specific situation. The candidate will be working with me directly, so they should at least be able to interact with me. If I don't like them, that's almost a showstopper, but if the candidate otherwise qualifies, I should try to figure out what my problem is and see if I can get over it.
The analogs of the traditional coding interview in the law school world are the LSAT and classroom timed memory tests. Both have absolutely nothing whatsoever to do with being a lawyer or the kinds of activities that lawyers actually do.
What they really are is intelligence tests. They are an easy way for judges and top-tier law firms to tell if you are "one of them." You can get your foot in the door if you can perform, at a minimum, at a certain level. There are plenty of things you can do without a 97+ percentile LSAT and A's in law school, but you are screened out of the top echelon of work.
People object that this isn't fair and screens good people out -- false negatives. But experience has shown that it is effective and easy. Perhaps there is a better way, but this way seems to work well enough. There is little incentive, beyond some people complaining about it, to change a system that is working well.
I see the traditional leetcode interview in the same light. The point is to see if you can perform at a certain minimal level. Yes, it's arbitrary, and often unconnected to real life. But it is an effective way to gauge a minimum level of intelligence and competence. Not perfect, but easy and effective.
Finally, there is ego element to this. People who can perform at this level see these skills as a minimum qualification. They are uncomfortable seeing themselves in relation to others who cannot perform at this level. Knowing that their peers have similar minimum qualifications makes them feel safe and secure about their status. This is not a criticism, just an observation about human nature. People in a club need a shared identity.
Especially if the interviewers make comments like "um how much do you actually practice law?" if you stumble on one of their pet quiz questions.
How are these analogous? LSAT is the Law School Admission Test. It gets you into law school. It doesn't get you a job as a lawyer. (The bar exam doesn't get you a job as a lawyer either.) And you only have to take it once, not for every job interview.
You can go to a top tier university, land a great job for several years, have positive references, and still be required to do leet code for a recruiter... that seems like ongoing gatekeeping, rather than a one-time effort/pass.
When I'm interviewing a senior developer that will guide other juniors, they will need to prove they know more than a for loop. They will need to show they understand the underlying concepts and technologies. Else it's a no hire, tough luck.
We also fired plenty of people that weren't up to the quality we expect.
Where are you working? At some government agency?
If they come up with, or start walking with an interesting solution, I listen to them.
Tabs and spaces don't matter.
I actively work against the biases of "whether I like the person or not" because I'm aware of that as a factor.
We can absolutely minimize all of those peripherals when we explicitly train against it and a good programming interview doesn't have to be about that.
But to be perfectly unbiased, the process would have to double blind. And I don’t know anyone who likes to hire people (or anyone who wants to be hired) following a double blind process.
So at least there is _some_ non-technical bias that gets in the process.
It’s hilarious listening to people review candidates on third coding interviews. Usually it’s “this person didn’t know this one thing I think makes me really smart”.
This is why I lean so heavy in OSS to review candidates. I can see how they interact with people, and whether they are actually able to build software that _people want_
I find with this approach it’s a lot harder to justify nonsense.
I hear people say maybe they don’t have time for OSS and then hand them a Leetcode interview, which as we all know is just about memorizing problems. OSS is such a better metric and has the added benefit of improving the world.
Yup. I just went through a round of interviews at multiple companies. In the past I successfully interviewed at a FAANG.
I agree completely.
This.
There is absolutely no need to ask such ridiculous algorithm and advanced-math questions that you will never use or implement from scratch yourself, or to ask the candidate to produce a formal proof by rote for an entry-level software engineering position, or even a senior-level one.
Interviewers who ask these questions have googled the answers before the interview and have never used them practically; they only do this to appear smarter than they actually are. The worst part is that not even FAANG asks these formal-proof questions - only certain startups that, last time I checked, have failed to make money and have shut down. Why? Because they seriously thought they were a FAANG company.
Open source cuts through the algorithms nonsense, saves time, and gives a realistic idea of what the candidate has actually worked on in the open. I can ignore the typical hello-world or demo projects, but I can ask whether they have created open-source libraries or contributed to any serious projects like compilers, OS projects, or anything related to the job description.
A take home project is also a fine metric, but asking questions about algorithms and finding out that the interviewer (or the company) doesn't even use them tells you that the interviewer doesn't know what they are looking for.
If you don't answer the 2 leetcode mediums in optimal complexity within 45 minutes then you don't pass. It's about coding skills.
Yet most companies interview as if they are inventing novel storage/processing mechanisms. Theory is important to understand which tools to leverage when, of course, but not to the extent that it's typically prioritized in the typical interview.
I've had plenty of people crush the theory portion of the interview and be poor performers on the job (slow coding, low volume output), but I've never had somebody crush the coding portion (implement very fast and proficiently), and perform poorly on the job
Even as their manager it’s not my job to lead a death march.
Somebody who produces 2x at same quality of work is worth 2x the other person, fairly objectively.
It's common to find people who produce up to 10x the average. Anyone in a startup or growth oriented company is smart to optimize for these people.
The most proficient and fast coders often write higher quality code too, in my experience.
People who over-index on comp-sci theory are often lacking in the theory underlying good code. But it's not mutually exclusive. It's just that if you have to optimize for one, coding proficiency matters much more for startup- or growth-type companies.
Exception might be like an AI/ML based startup with a lot of theoretical challenges. Most companies much more business/rule oriented though
Being able to come up with an algorithm in a reasonable time doesn’t make it crappy. No wonder you can’t pass those.
I think the reason for that observation relates to why these interview tests are generally a waste of time.
Based on the tests that I've seen, they are 'recall lots of useless information' tests. That means it's actually possible to study for these interview tests (if you have the desire), and you can ace them just by spending time memorizing answers.
However, 'slow coding' has very little to do with coding and everything to do with problem solving. You only learn how to problem solve through lived experience and many people never learn that skill.
These interview tests generally fail only because they're not actually testing for the skills needed to do the job.
What does that even mean?
In an interview context, I've seen cases where somebody fully understands the logic underlying the solution, yet it takes them 30m to write it, while another person can do it in 5m or less.
Essentially how long it takes to translate from mental understanding to working code
Leetcode-style interviews became popular in the mid-00s, primarily because they were used by the hot tech companies of the time. The thing to understand is that at that time, the idea of asking people to write code during an interview was sort of revolutionary. Prior to that, interviews had little structure. It wasn't unusual for the hiring manager to make the decision, sometimes based on credentials, recommendations, or trivia-like questions.
This type of interview became wildly popular because it allowed budding unicorns to hire programmers of high quality at scale. The process was less biased than the alternative, reproducible and scalable. Here you have two blog posts [1][2] that show the line of thought at the time.
The reality is that big tech has elevated the leetcode-type interview to an art. They have reached a near-local optimum through years of experiments and refinements. It is working well for them, so they don't feel the need to take big risks such as completely revamping their hiring process.
I love the topic of hiring and interviewing, and I'd love to truly get to the bottom of which method works best. I like this article because it explicitly calls out shortcomings with typical alternatives that are not usually mentioned. I hope in the future a new crop of unicorns can take these practices to the next level and do a fair comparison.
[1] https://www.joelonsoftware.com/2006/10/25/the-guerrilla-guid... [2] https://sites.google.com/site/steveyegge2/five-essential-pho...
And I think it's a great idea, personally I would never consider working anywhere that hired devs without seeing them write some code. But my problem is with the types of questions asked at many companies, specifically the types that require weeks (or months!) of prepping.
During my last job search, one interview that stood out (positively) was a problem around parsing some HTTP headers (it started simple and then had layers of complexity added as I solved each one). It was honestly one of the best questions I've seen in an interview: it requires the candidate to be able to write code and solve a problem, without requiring or expecting the candidate to have prepped beforehand on theoretical stuff far removed from the types of problems we actually solve on a daily basis (the fact that people need to prep at all is evidence of this disconnect).
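The first layer of a question like that could be as small as this sketch (my assumptions, not the actual interview question: one `Name: value` header per line, with later layers presumably adding case-insensitive names, duplicate headers, line folding, and so on):

```cpp
#include <map>
#include <sstream>
#include <string>

// Split raw "Name: value" lines into a map of header names to values.
std::map<std::string, std::string> parse_headers(const std::string& raw) {
    std::map<std::string, std::string> headers;
    std::istringstream in(raw);
    std::string line;
    while (std::getline(in, line)) {
        auto colon = line.find(':');
        if (colon == std::string::npos) continue;  // skip malformed lines
        std::string value = line.substr(colon + 1);
        if (!value.empty() && value.front() == ' ')
            value.erase(0, 1);  // drop the single space after the colon
        headers[line.substr(0, colon)] = value;
    }
    return headers;
}
```

The appeal of the format is exactly that each added layer grows this code in the same way real requirements would.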
I had two jobs at the turn of the decade and they were conversations about topics such as “where would I go to find authoritative information on x” (google is not the answer) and approaches to troubleshooting.
Granted, I’m a script monkey (sysadmin) but some level of automation has always been part of the job.
I’ve had precisely 1 leet-code-style interview (2 hours for 16 problems) and I couldn’t tolerate the pressure.
The interview I had at google was much more humane than what people are attempting to cargo cult.
I see whole bunch of people complaining about Leetcode (LC) style coding questions, but I don't really see any alternative that is as good as LC. LC interviews solve all the following criteria better than other interview formats in aggregate.
- Objectivity: can you evaluate candidates as objectively as possible as opposed to subjectivity?
- Scalability: can you evaluate many candidates fast with high throughput?
- Accuracy: can you filter out bad candidates? Yes, the filter is often restrictive to the point where good candidates also get filtered out unfortunately.
- Meritocratic: can you make sure candidates are assessed on the merit of solving LC questions well, as opposed to having a connection to someone at the company, or having gone to some prestigious school or worked at prestigious companies?
The Leetcode format does well on all the criteria above, or at least better than the other formats below in aggregate.
1. Take home assignments.
- Objectivity: bad
- Scalability: bad. Once the question leaks, it's hard to come up with another question easily. It also doesn't scale on the candidate side. As a rule of thumb, I always decline interviews that require a take-home, because the effort doesn't transfer to interviews at other companies. ROI is extremely low for candidates.
- Accuracy: good.
- Meritocratic: good
2. Coding questions with heavy emphasis on practicality (Stripe & Square)
I was pretty impressed with the questions from Stripe and Square. They created problems heavily related to string, array, and hashmap manipulation: searching, replacing, and so on, using regex and such. If you are not into LC questions, you might prefer this, but I actually found some of these questions harder than the dynamic programming questions I got from Google.
- Objectivity: good
- Scalability: bad. Once the question leaks, it's harder to come up with another question easily.
- Accuracy: good.
- Meritocratic: good
3. Knowledge / Trivia questions about programming language or framework
This might make sense a bit if you are specifically looking for iOS, Android or web developer but it still will need to be paired with LC questions.
- Objectivity: good
- Scalability: good
- Accuracy: bad. Candidates can memorize and recite answers to these questions. Also, too many good candidates simply do not memorize the details of a programming language or framework. And the answers can easily be obtained by Googling, so people can cheat easily.
- Meritocratic: good
4. Pure behavioral
- Objectivity: bad
- Scalability: good
- Accuracy: bad
- Meritocratic: bad
5. Ex-coworker
- Objectivity: bad
- Scalability: bad
- Accuracy: good. If you've worked with the candidate before, you'll have a pretty good grasp of whether or not he/she can do the job well.
- Meritocratic: bad
> Meritocratic: good
I'd go with "iffy": because of the "take home" part there's no guarantee it's the candidate that did the work. This can probably be determined with a follow-up conversation during the in-person part, but without such an addition I don't see this format as "good" on this metric.
Best I can think of is take home tests but I wouldn't want to rely entirely on that, and they have their own problems too.
I have a decade of experience, working as a senior and staff engineer at megacorporations, have experience as a research scientist, and am in a PhD program for computer science research, but let's just double-check that I know how to use a hash table.
If it is standard at the company to give all candidates that same question, this is actually a good way to reduce bias in hiring decisions. Otherwise, you might bias toward hiring, say, people with college degrees or people who worked at large corporations, for example. And this might unintentionally bias against certain demographics that obtain college degrees at lower rates (for example).
Also, typically the point of these questions is to see if the candidate can solve a novel problem, not whether they can use a hash table.
This is a hyperbolic rhetorical question but consider how much of modern work really is CRUD API gluing, and how much of that work is not rigorous engineering, but tedious implementation as per documentation and wading through hastily-thrown together legacy cruft written during previous crunch time.
How much of that work involves novel programming and memorized understanding? Most of it requires study of existing codebases, not given a blank REPL to hash out something new.
When I started interviewing I started off by jumping into what I considered to be not insulting questions, but I very quickly learnt how awkward that can be if the candidate just had no clue.
Now I start with for loops and preface it with "apologies if this seems too simple" or something like that.
You would be surprised how many people can't do a simple for loop. I mean really simple.
I really hate the contrived problems that interviewers pull from HackerRank and its ilk. Even if you could glean useful information from watching someone program a knight hopping around a phone dial pad (a real problem I've gotten, and one whose utility I'm highly skeptical of), it comes off as pretty irritating to the interviewee. You're not solving little programming riddles in day-to-day work, you're solving engineering problems, and I think that these leetcode questions don't reflect that.
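For what it's worth, the dial-pad knight question mentioned above is typically solved with a short dynamic program over the keypad's knight moves. A minimal Python sketch (the move table below is my own transcription of the standard 0-9 keypad layout):

```python
# Which digits a chess knight can jump to from each key on a
# standard phone keypad (5 is unreachable by knight moves).
KNIGHT_MOVES = {
    0: [4, 6], 1: [6, 8], 2: [7, 9], 3: [4, 8],
    4: [0, 3, 9], 5: [], 6: [0, 1, 7], 7: [2, 6],
    8: [1, 3], 9: [2, 4],
}

def knight_dialer(n: int) -> int:
    """Count distinct n-digit numbers a knight can dial, starting anywhere."""
    # counts[d] = number of valid numbers of the current length ending on d
    counts = [1] * 10
    for _ in range(n - 1):
        # Knight moves are symmetric, so KNIGHT_MOVES[d] also lists
        # every digit from which a knight can land on d.
        counts = [sum(counts[src] for src in KNIGHT_MOVES[d]) for d in range(10)]
    return sum(counts)

print(knight_dialer(1), knight_dialer(2), knight_dialer(3))  # 10 20 46
```

Compact once you know the trick, which is rather the point: it rewards having seen the trick, not day-to-day engineering ability.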
I think there's evidence of this too; you can study for these leetcode questions and get better at them, so what exactly are we testing? You're not testing the person's experience, you're testing how long they spent on hackerrank the night before, or how many interviews they had recently.
I think one can fairly accurately determine my level of skill from my resume, and thus I don’t think that the leetcode interviews are all that useful. I keep hearing about senior engineers who can’t code, and maybe I’m just lucky, but that really has not been my experience when interviewing, and I don’t think “spent the previous night practicing on hackerrank” really gives you any useful information.
Should it be from university CS programs? Which are supposed not to be vocational? And would discriminate against those who are not degree holders?
Should it be from internships, which are nowadays subject to the same Leetcode examinations as actual work?
Should it be from junior developer roles, which are an endangered species, as every business is in desperate need of seniors and lacks the patience to train?
Maybe open source can be a sort of free apprenticeship to teach developers these good practices?
Or maybe grinding away at Leetcode, Project Euler, and arbitrary coding puzzles is really the only pedagogical solution.
It is how people pass these tests, but it’s pretty poor pedagogy.
We have no issue whatsoever hiring juniors (we hire about as many as mid-levels), but we don't have to settle for someone who doesn't have internships or projects, and the openings can be closed in 2 weeks. So it seems like nobody is hiring, but actually the bar has risen (and there are plenty of candidates who clear the bar).
I am confused - how is this question materially different from asking how engineers are expected to know what they will need in order to properly do their jobs? Just how are engineers expected to know programming languages, or domain-specific languages, or standard libraries, or platform APIs? Is this what you are asking?
Corporations are not testing your knowledge of algorithms per se. While that knowledge is obviously important, what's more important is how dedicated you are when it comes to reaching a goal. Essentially, they are testing your willpower. You know the rules. Can you achieve success and not give up in the middle of the road by skipping, say, Dynamic Programming? They could've tested your level of dedication by asking you to bake some fancy cake, but throwing out CS-related questions just makes much more sense.
That reasoning completely changed my perspective about algo-heavy coding interviews.
That's an interesting theory. And they pick the goal that happens to be best suited to fresh/current undergrads from Stanford who weren't working a draining job to put themselves through school, so can afford the additional person-months to achieve the gatekeeping that was set up by people like them, for people like them.
I and many other experienced software engineers have demonstrated perseverance and rising to the occasion of extremely challenging goals. But that theory suggests they're not interested in that. Maybe they're interested in the class shibboleth that was established when computers became less of a meritocracy for people with a passion for it, and more of a go-to status job for the upper middle class?
I'm not sure how to make this come through but I'm rather sympathetic. I think practical experience & can-do coder know-how count for most of the marbles. But I also think they're impressively hard to assess, because most of what we do is just taste & path dependence on the specific techs we've seen that happened to leave good impressions on us: not science, not knowledge, not truth: most coding, most practice is almost entirely happenstance.
But again I want to empathize. Because I think it's cruel to disregard engineers who have seen so much. But I also think you're hyperbolizing overmuch & discrediting the discourse when you make your argument so sharply. Because there's some truth, absolutely, but it's also nowhere near as lopsided. For example, you talk about fresh/current undergrads at Stanford. But this is a pretty regular, basic CS path that any university student should be familiar with. I only had two undergrad classes that really talked about algorithms (Algorithms and then Data Structures; OS kind of somewhat), albeit there was some computational thinking already in play at that point. Understanding complexity & algorithms is understanding the basics, it really is. If you can't see time flowing, if you don't know the tally & impact of the work you're asking for, you're lacking the knowledge to avoid a lot of bad decisions. This isn't made-up fake shit taught only to the elite.
What's alluring about this system is that it has some hard facts & theory underneath it. It's not just opinion & engineering pop-culture. The answers don't change, the rules don't change, the material stays the same, and it's all rooted in being able to analyze and understand problems, rooted in comprehension. Being able to see & understand & analyze, being able to apply some basic principles: this is a pretty sure way to find people who will be able to Not Mess It Up, who are capable of looking, assessing, & navigating through scenarios computationally.
And last and perhaps most key to me: it's also material that someone with even a modest bit of aptitude can cram for. Pick up a book like "Cracking the Coding Interview" and you can semi-accurately reproduce the output of four years of undergrad in two or three months of occasional light practice & trying. Projects like Leetcode can get you the experience. You may feel bad that what you want yourself to be measured on doesn't count here, but to people trying to go get hired, it's enormously helpful to have well-defined expectations, to have specific kinds of tests to expect & be able to prepare & study for. I see so many protests that it's not fair, that it favors only those with the luxury of time, but those views to me massively undersell how wonderfully accessible it is that there's a known, well-described, well-supported subject area we can study. The industry overflows with things to learn, study, & know, but here's something concrete & specific, which undergirds it all, which alone might not determine your coding skill, but which does indicate you at least have some raw basic intellectual capacity to go understand problems put before you & apply some sensible computational thinking to tackling them.
There's tons of things not tested for, but by having a well-defined, technology-neutral set of computational-thinking tests, based on actual science with objective, factual answers, rather than a much wider corpus of future/current/legacy pop-engineering-of-the-day built on nothing but wishy-washy subjective opinion, I think we achieve one of the best possible wins we can against an enemy we both hate: class-based discrimination, cruel tests that reject outgroups.
There are other ways to tease out whether someone is "dedicated to reaching a goal", like actually asking them questions about their life, daily routine, accomplishments, etc. I think these are much better signals than "this person can implement many sorting algorithms and solve Towers of Hanoi or the egg drop puzzle without a Google search to refresh their memory", with an N=1 sample size used to gauge how reliably they can do that. How many people, if asked to do this on the job a year later rather than in an interview, would go and implement it from memory without double-checking on Stack Overflow that they didn't forget some edge case?
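For reference, the Towers of Hanoi solution alluded to above really is a few lines once you recall the recursion, which rather supports the point that reproducing it in an interview measures recall more than engineering judgment. A sketch in Python (peg names are arbitrary labels):

```python
def hanoi(n, src="A", dst="C", via="B"):
    """Return the list of (from_peg, to_peg) moves for n disks."""
    if n == 0:
        return []
    # Move n-1 disks out of the way, move the biggest disk,
    # then move the n-1 disks on top of it.
    return hanoi(n - 1, src, via, dst) + [(src, dst)] + hanoi(n - 1, via, dst, src)

print(len(hanoi(3)))  # 7, i.e. 2**3 - 1 moves
```

Trivial to write with the idea in your head; easy to fumble live if you last thought about it a decade ago.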
I know grinding leetcode is fundamentally useless because you will immediately start losing your ability to interview once you get hired. If you don't change jobs within a year you'll need to start studying all that crap again for the next interview.
An excellent question.
If the interviewer is going to ask algorithm questions looked up on leetcode, etc., then they must be prepared for the candidate to ask whether they also use them on the job. If the interviewer admits they don't, then they have also admitted that they don't know what they are looking for and are really wasting the candidate's time.
If it were me, I would look at open-source contributions (no hello-world or demo projects); that is enough proof for me to evaluate an entry-level candidate, cut through the algorithms nonsense, and ask the candidate questions based on that work, which will save everyone time.
That this isn't the bulk of what we do doesn't change the fact that these challenges do test computational thinking acuity. Having the ability to see & speak computationally is a good skill, one that connects our day-to-day abstract practice with actual real processes. Being able to break down problems & analyze how to tackle them shows an objective ability to assess & work through problems. I want to work with people who can be clear, who can model & explain & step through situations.
And these skills are, generally, learnable, and relatively quickly. I disagree that these abilities fade, but yes, some re-familiarizing & re-training is probably important, especially because, as you point out, the sample size is indeed often N=1, and that's pretty wild.
If you are a no-name company paying whatever your HR department defines as "market rate" and you don't get hundreds of very qualified candidates for every role then interviewing like Google might not be a great idea.
"Great" filtering for parents or people caring for ailing family or any other non-work commitment.
I guess it's not ageism at all, right?
That may sound good in theory, but in practice I've been rejected quite a few times for not delivering the correct or most optimal solution despite trying my best.
I have two very memorable no's. They were for pretty senior/advanced IC roles. Juniors would get far more slack. I usually ask an expression evaluator question.
One person used eval(string). Brilliant, absolutely brilliant. Full points for answering, but how do we make sure the string is safe to eval? Long frustrating discussion around regexes that maybe a junior dev is responsible for maintaining. Man, just write the parser. I know you can do it. Give me something that's not _crazy_ to put into prod. Give me something that won't be a bug tarpit. Sorry man, I loved your answer. I hated your response to how this can be operationalized. Clearly super smart. Maybe I should have given the green light, but the response to criticism was so bad. There are lots of people I disagree with that I'm happy to work with. I think I made the right choice, but I wonder from time to time.
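For the curious, the "just write the parser" direction doesn't have to mean much code. One hedged sketch (my own illustration, not the interviewer's expected answer): in Python you can sidestep raw eval entirely by parsing the expression into an AST and whitelisting the node types you're willing to evaluate:

```python
import ast
import operator

# Whitelisted binary operators; anything not listed here is rejected.
OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
}

def safe_eval(expr: str) -> float:
    """Evaluate a simple arithmetic expression without calling eval()."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -walk(node.operand)
        # Function calls, attribute access, names, etc. all land here.
        raise ValueError("disallowed syntax")
    return walk(ast.parse(expr, mode="eval"))

print(safe_eval("2 + 3 * (4 - 1)"))  # 11
```

No regex guard needed: `safe_eval("__import__('os')")` raises ValueError because calls aren't whitelisted.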
Another person I absolutely would have loved to work with, and would have learned a ton. Very into formal methods, or slightly relaxed versions of verifying reliability. My organization at the time was pretty fast and loose. I believed they would be miserable. In the interview I told them I'd give them the green light, but they'd have a huge uphill battle.
It's so hard to tell. People are adaptable, small mismatches can be papered over, but big mismatches - it's so tough to say. Relaxing and being vulnerable on both sides is so hard to do well. Maybe I was just a jerk. I think I made the right calls, but it haunts me.
Personally I don’t mind a whiteboard session or two during a loop but what’s wild to me is how, especially at big companies, you’re expected to do four or five of these to get an offer.
How often do these companies decide “well they understand when to use DFS and they can merge-sort a linked-list and they knew to use dynamic programming but we can’t hire them because they couldn’t remember how to implement a heap”?
The more casual, flexible, subjective coding interview described by the OP is not the prevalent form of whiteboarding interviews, which has been superseded by the weeder version.
I agree that this is unreasonable but not for the reason that you mention.
Once you can pass this kind of interview, you can throw me 10 more and I will still pass them.
Edit: on further thought, I think I'd be OK with coding interviews if the interviewee got to bring in their own coding question and the interviewer had to do it as well. That way they also know that the people they are working with can live up to their ideal of a good coder.
It's saying coding interviews are not spectacularly awful if you (thoughtfully) change them.
Great! That's what everyone complaining about them has been saying in one way or another all along, change them.
I wrote about my own experiences with coding challenges during my recent job search on my blog at https://blog.urth.org/2022/04/19/software-job-search-2022-re...
I'll summarize my conclusion about live coding challenges, which is that I'm not convinced that my performance on these challenges reflected any abstract skill I have. Instead, they mostly reflected the fact that the problems I was given were either nearly identical to work I've done in the past, or were similar enough to things I'd done recently that they felt pretty easy.
I guess there's _some_ signal in that, but I don't know if it really says anything about how good I am at coding in general.
Here's some data models, and here's how they can reference each other by properties. Cool. Let's do 3 levels of nested loops to extract the relationships and map them. That's like 99% of what they'll be doing.
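A minimal sketch of that kind of exercise, with invented record shapes (the names and fields here are purely illustrative):

```python
# Hypothetical flat records that reference each other by id properties,
# the way rows come back from three separate API calls or tables.
users = [{"id": 1, "name": "Ada"}]
projects = [{"id": 10, "user_id": 1, "title": "compiler"}]
tasks = [{"id": 100, "project_id": 10, "done": True}]

# Three levels of nested loops to stitch the relationships together.
mapped = []
for user in users:
    for project in projects:
        if project["user_id"] != user["id"]:
            continue
        for task in tasks:
            if task["project_id"] == project["id"]:
                mapped.append((user["name"], project["title"], task["id"]))

print(mapped)  # [('Ada', 'compiler', 100)]
```

Whether the candidate reaches for index dicts instead of the O(n³) loops is exactly the "do they start by looking at structures" signal.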
The signals I'd look for are "do they start the entire thing by looking at structures or not", etc.
But unfortunately this breaks down when you're interviewing 10-year+ candidates; it doesn't yield anything useful. So for those positions, sticking with a really, really basic problem and pseudocoding it, then using the remaining time to have them map out solutions they built in the past, seems to be useful.
The main thing leetcode interviews screen for is "does this person know how to cram", because I tell you, everyone I know who aced leetcode interviews crammed for 3 months and literally knew every problem you can think of, and every variation. Juniors tend to do better in those, ironically enough.
Is it ironic or intentional? Seems like a great way to hide age discrimination under the guise of "fairness" (everyone gets the same test).
As I commented above, I successfully passed multiple leetcode interviews in the last four years, including FAANG. I did not spend a single minute on prep.
A) SWEs are shit behavioral interviewers in any other way. You are not about to change that even if you are Google.
B) It is crucial that newcomers be hired by their peers and by their direct manager. This has solid research behind it. People hate it when they get a new hire and they didn't have a meaningful part in the decision... The new hire is more likely to fail.
A+B means it is better to have a terrible process led by SWEs (e.g. coding interviews) than a good one led by professional recruiters, which will fail no matter their capabilities and methodology. (And yes, setting A aside, I do believe that some non-technical recruiters trained in workplace psychology would have a much better success rate without needing any technical interview. They would need someone technical to have a talk with the candidate, but definitely not to ask leetcode etc.)
It is an interview about your communication, whether the code you write can be maintained, whether you can inspire/build trust (can you explain what you're doing? does that feel reasonable to the other person?).
Where people fail is to think that it's a test of experience, or a "heads down I must make every test case pass", that the outcome matters more than the process.
It's all about communication.
The "write code" type of interview fits juniors who might not know how to even write code properly. If a person doesn't know how to do that and is a junior, they usually have the right energy and can put 100% into the job to adapt/learn. I hired such juniors who had no experience in the languages/platforms and were productive within a month. They had good character and discipline.
Distilling a question like that for an advanced coder is impossible. No wonder the article didn't provide any sample of a "good interview". We'd all burn him at the stake because we'd all hate it. There's no such piece of code that would provide a valuable data point about a candidate. So this is a waste of your time when dealing with a junior and with an advanced coder.
Coding interviews are difficult. You will make mistakes, this is unavoidable. I think the best mistakes to make are the ones where you hire people who are part of your team and can grow with it. Even if they aren't the best coders around, being part of the team will help them hone those skills and evolve. If I had to pick a 10/10 coder or a team player I'd pick the latter every time.
Got the job.
Then I did a slight pivot to get into BigTech via cloud app dev consulting - “application modernization” [2] - still with no coding interview although I still do the same type of enterprise/cloud app dev I did in the latter part of my career in the “real world” - and a shit ton of yaml.
Even in 2020, if I couldn’t have gotten a job at either Azure or AWS, I probably would have done what it takes and practiced to increase my income by an easy six figures or more.
Coding interviews are a “gravity problem”. You can dislike them, but they still exist.
That being said, if I were starting out today and could break the can’t get a job <-> don’t have experience cycle by “grinding leetcode” for a few months and end up in the top 10% of income earners in the US right out of college, I would have definitely done it, and I encourage my younger relatives graduating in CS to do it.
However, if you’re just hiring for your standard yet another senior CRUD developer and paying as such ($90K-$160K) - don’t expect me to do the monkey dance.
I think algorithm style coding interviews are the great equalizer. Even though I at 48 probably couldn’t pass one without some serious practice.
[1] https://www.hanselman.com/blog/dark-matter-developers-the-un...
[2] yes it’s a full time position with the same compensation structure.
I'm finishing up a personal open-source project and am looking for a new position focused on ruby or rails, so if you're hiring and are open to evaluating me based on something cool I've written, hit me up.
And here's where our industry isn't a "real" profession yet. There are plenty of qualifications one can get, but none that gates the ability to call oneself a software engineer. The result: we have to prove it in interviews.
And I don't know whether this is better or not. It is different though. The price of not having to sit through exams is to have to jump through hoops.
The thing with software is people tend to think of it as something that needs maintenance to stay fresh, perhaps a bit like piloting an airplane.
Airplanes have logs and pilots recertify on an ongoing basis, but software doesn't work like that. Someone might have stopped coding but still have a title.
I'm not saying software actually does work like that, but I'm sure it's a view some people subscribe to.
Obviously it’s pants on head stupid to do that if you want a web dev to make a pretty app, tie frameworks together and liaise with 3rd party integrators. However, I think many employers want the crème of such devs who are also good at algos.
Which is not going to work for the vast majority of devs and jobs.
Most devs either don’t care or need to care about algorithms, and the few devs who do, spend most days not using them.
This would be akin to asking a civil engineer to build a wooden road bridge. Having the skills is great, but how often is the company going to need them?
Maybe not in the 21st century however, and I don’t think your analogy is perfect.
It might be more like wanting to hire an architect who is also great at civil eng. The skills definitely synergize and can help prune the architect’s search space of good designs, but such a candidate is probably just going to go for an engi job, and lean on their architecture skills there: for example an algo dev that can do mock-ups.
I think that many employers don’t have a clue what makes a good dev, in general, or in the positions they’re trying to fill.
A completely fictitious language is developed. Simple enough that you can become fluent within an hour. From this you read a bunch of problems in said language and must solve them.
A set of assertions about the problems is written down and you must verify them. Finally, a full program, created from a dialect of the given language, is given and you must try to understand it.
I’m also not familiar with other fields, but are doctors or lawyers asked to do anything analogous to code assessments during interviews?
Drive, experience, intelligence, communication skills, teamwork, emotional intelligence, grit, curiosity, physical health, mental stamina, family/home situation, and dozens of other factors play into job performance, many of which you can’t even ask about to generate your test dataset. Many of these traits are what you’re trying to measure the effects of in the interview.
a) It's like saying a tobacco company has a paper saying smoking is healthy.
b) Google measures things that matter to them. They operate on a scale that 99% of companies aren't. Their predictor of success only applies to companies of their scale.
c) What is their definition of success? And how does it apply to smaller companies less than 1000 employees?
https://news.ycombinator.com/item?id=31544634
How do you test knowledge indeed?
Lol be careful not to ask for a loop, nobody uses loops in real programming problems
Don't mind talking over code casually, but don't put me on the spot or give me homework. Not doing that.
I just want to put this out there to anyone who gets involved in interviews that this is a very important rule -- if your question can basically be redefined as some really clever solution you found on a problem you put > 30 minutes into, it's probably not a good question.
A lot of the seniors I work with would come up with questions like this, and I get their logic:
1. I'm a senior and this was a challenge for me
2. The logic is clear when you see it and look at it
3. I want people who can do work similar to mine
4. I know how to get to the conclusion, so I can judge based on the path the candidate takes
Conclusion: It's a good question
While I understand how we can get to such a conclusion, I think that in most cases the only takeaway you can get is "can this person think like I did in this situation?", which maybe has some value for judging workplace compatibility, but I don't think it's a fair assessment of someone's technical aptitude. If you yourself required a lot of time and brainpower to sort through a tricky issue like this, with the benefit of the problem scope and system needs that led to it, it's very unlikely you'll be able to fairly judge a candidate who has none of that context.
Point 4 seems good at first because we can say "the goal is to just see them think and use their experience"; there is truth to it but without the rest of the context it's not really a good assessment because ultimately, you're looking to see "did you get the same conclusion I did? Did you avoid the same pitfalls I hit?"
Instead, focus on a general scenario that tests their understanding of basic principles. How do they use fundamental knowledge of the subject matter to work through a new problem that you yourself haven't really dealt with? It removes your bias a bit and opens you up to those "wow, why didn't I think of that?" moments that you'd have working on tough projects together.
Try your best to give a good faith interpretation to any path the candidate takes and help take it to natural conclusions, and see how the candidate reacts; if you can see they've made a workable but problematic solution, nudge them towards the problems you see and see how they discuss it.
For me a candidate that can really process their own thoughts and do a good analysis of their own solution (its benefits and shortcomings) is extremely valuable; I usually lead off with "I know how I'd do it, but that doesn't mean it's the right answer, so focus on your strategy and walk me through it." If you want to test compatibility, or you feel you have a stronger solution, accept their solution (if it's valid), then go ahead and discuss your thoughts a bit and see how they respond and how they discuss it. The more you make it a technical conversation between peers and less a university exam, the more you get to see how they think and operate, and the better you can understand them.
It amazes me how many people will defend the current interviewing process and then openly admit that the majority of their engineers could not pass the same process without spending dozens of hours preparing.
[1] https://news.ncsu.edu/2020/07/tech-job-interviews-anxiety/
But other than that, the study didn't find any evidence that men have any issues with being observed while coding. So unless a better study comes around, we can dismiss complaints about observed interviews being stressful unless they come from a woman; it is in the study you linked, after all. (I do believe there is a stress effect, but that study isn't showing it; we'd need a few more samples and probably several different observers to get better data.)