I just had to conduct a round of interviews in a non-SF large US city, and it was a hellish crapshoot. Resumes are meaningless, and often re-written by recruiters to match the job anyway. Everyone has the same canned answers to the stupid behavioral questions. And as for the code, we included what we thought was a trivial nested for-loop problem and virtually nobody could even get started on it.
Is this kind of code problem too complicated in your opinion? For all that I join in complaining about irrelevant algorithmic questions, I have to admit that they at least test something, even if it's just willingness to study for the interview.
Instead of reading everybody's complaints about interviewing, I'd love to hear how you think it should be done. Because I have to admit I'm pretty much lost right now.
It'd be nice to think I have some special skill here but I really don't. This is just how interviewing was done in the 90s. To some of the younger generations, I've been told it sounds crazy.
If you send me your resume I'll actually read it, carefully. If the person described in this resume fits the background experience the role needs, you get an interview.
During the interview we'll talk about all those projects you worked on that are relevant to this role. Which parts did you enjoy the most and why? Which parts were boring and why? Which parts were the most challenging and why? What did you find too easy and why? What would you have done differently? Could you have? If you were to do the same project all over, how would you approach it? Other open-ended conversations along these lines.
I don't ask anyone to whiteboard code, that's not part of the job so it's not part of the interview. No puzzles, no trivia-pursuit style questions.
It works great. You can't BS your way through such a conversation with a senior technical peer if you didn't actually do the work described in the resume. You just can't.
It is, however, vital that the interviewer be an expert in the field.
> > No puzzles, no trivia-pursuit style questions.
I swear HN technical interview threads are the poster child of talking past one another. First, a for-loop problem is nowhere near a trivia-pursuit question. Second, different companies of different sizes/industries/goals have different requirements. Let's all move this discussion forward by acknowledging that we can't all use the same process because we're not all hiring for the same type of job. If the engineers you hire consider for-loops a "puzzle", that's totally fine, and OP using it doesn't invalidate your process for your multinational or startup company.
> I can say that I've never regretted a hire I said yes to
The real question is whether anyone else has regretted that hire, which unfortunately can't be answered, as they may not tell you.
> To some of the younger generations, I've been told it sounds crazy.
Doesn't sound crazy at all. A single process guaranteed to work for everyone? Now that sounds crazy.
You don’t want ‘bozo cliques’ to form, so you make a semi-objective process like ‘solve this algorithmic question’ as part of the interview loop. I think execs and the founders doing a final review before hiring all engineers comes from that fear.
Then other companies cargo cult interview processes from larger companies and the trend propagates.
If you want to ‘hack hiring’ as a smaller company, you should use hard to scale processes like the one described in the parent post.
If you don't actually verify the technical problem-solving ability of the candidate in some way you're forgoing signal that can massively increase the confidence you can have in your decision.
After a few bad interviews I got the hang of it and aced a couple algos interviews. I’ve worked as a developer for 8 years, have tons of software in production, have some open source contributions and have worked productively on several teams.
Interviewing is a skill. I understand why a company wouldn’t hire someone who doesn’t pass a programming test but failing a programming test doesn’t mean you can’t do the job.
Yes, I wrote that right. He struggled to understand conditionals in general when building his own logic. The guy even had a master's degree.
A company trying to hire engineers can easily give a take-home programming challenge to a dozen applicants and spend very little time analyzing the submissions, while each applicant sinks hours into theirs. That's pretty unfair. But it also feels a bit untenable for a senior engineer at the hiring company to deeply investigate each applicant's work history.
Another problem is objectivity. If your company’s engineering hiring process relies heavily on a senior engineer’s subjective impression of an applicant, you’re going to have big problems with your own engineers’ biases, whether subconscious or not. Expect to hear a lot of evaluations like “well, the applicant did seem to have good knowledge and experience, but I just wasn’t impressed for some reason.”
Is it? The cost of hiring the wrong person can be huge. Not just agency fees if they came through a recruiter (those aren't cheap!), but also all the time people then spend on the bad employee and all the damage that person does before the mistake is rectified.
If this system of reading the resumes and then spending a few hours of senior employee time on an interview can reliably find good employees, that's an absolute bargain.
I agree with your approach and use it myself, but one thing is different now, and that's the proliferation of tiny skills. Back then you would have a few big skills: you would claim to know one or two main languages, one or two databases, and so on. Now people list hundreds - literally hundreds - of skills sometimes. And there's no way to tell on reading whether they really know it, or just saw it once and maybe did a tutorial or "hello world". People now will add a skill to their CV if they've done it for an hour total in their entire lives! Or read a blog post about it. It is a massive time sink to pick through that.
This tells me the person doesn't know the difference between skills and implementation details, so I don't need to proceed to an interview.
I know the theory of how to use it on large projects. I even know enough to know that I'm pretty much just scratching the surface, but so are probably most other people.
Same with most other skills. At how many lines of code can I call myself proficient in a language? And what if I copied large parts of code from stackoverflow?
All in all I think I spent something like 8 hours writing and formatting that freaking thing, and it still feels like some of my skills are a stretch.
It takes a lot more effort on the part of the interviewer and is "harder to scale", in that you can't just train people to ask canned questions. But since it's open ended, it's a lot better at finding out what the candidate is really good at, and it's very useful to have one of these in every hiring loop, usually by a senior team member or exec.
This is key. There are a lot of hiring managers masquerading as experts who are frustrated when cargo-culted hiring processes falter and who lack the people skills to diagnose the situation. You'd also be amazed at the quality of resumes a high, advertised salary will bring.
I don't disagree that there are a lot of fake-it-till-you-make-it developers who did a vo-tech class and are applying for jobs out of their league, but I just don't know how people don't spot them. It takes me about 5 phone calls to find a competent individual, and then I bring them in for an in-person interview. As a hiring manager, I don't find it cumbersome to weed through 5-10 people to find a good hire.
You think this is a claim about how good your hiring practice is, but the only way I can think to read this is as a point about how little hiring you do or how little evaluation of hires you do. In the real world, perfection isn't possible, so claims of it are a sign of inexperience or naivete.
This may just mean that you say a lot of wrong "no"s. Getting very high precision or very high recall alone is really easy... what you should measure is your F-score.
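To make that concrete, here is a minimal sketch (with hypothetical numbers and a plain F1 helper) of why optimizing precision or recall alone is easy, and the harmonic mean is the honest summary:

```python
def f_score(precision, recall):
    """F1 score: the harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# A very picky interviewer: nearly every "yes" works out (high precision),
# but most good candidates get wrongly rejected (low recall).
picky = f_score(precision=0.95, recall=0.20)    # ~0.33

# A very lenient interviewer: almost no good candidate is rejected
# (high recall), but many weak hires slip through (low precision).
lenient = f_score(precision=0.30, recall=0.95)  # ~0.46

print(picky, lenient)
```

Both extremes score poorly on F1 even though each looks great on one axis.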
I would be honest and say "I'm under NDA for a lot of that but I'll do my best". If neither of you are jerks, you should be able to manage it.
Have you let new hires go in the first 90 days? That's just one of the ways personal regret is probably not the best metric here.
This is one way the current status quo might be better than the past: you don't get pigeonholed so much by your past experience into being a "fit" only for similar roles. Sometimes the hiring manager is really looking for a specialist, but in general, we don't care what industry you were in or what tools you were using, as long as you can prove you're smart. Some of the most impressive people we have working on Go microservices were enterprise C# developers before.
While it may be easy to hire devs who have been on the market for 10 years, you still have to keep in mind that new developers make up a very large proportion of the dev population.
If someone can keep up in a technical conversation about their background with me and answer every question I have about a technical project they did, then basically they pass. It works especially well even if I'm not familiar with their project, because I have an opportunity to learn, so I can ask any question that comes to mind until they teach me what they learned.
I did hire someone that I regretted, though, but to be fair, this was among my first interviews. The mistake I made was getting too easily caught up talking about programming and technical things without specifically diving deep into his past projects. He and I vibed quickly and I liked him, and that felt like enough, but after only a week it seemed obvious that he wasn't going to be producing much code, and we let him go. Otherwise, I've been happy, and my ability to discern has only gotten better as I've become more experienced.
I got a little offtopic, but my main answer to your question was "if the company leaves the decision up to a majority of engineers saying 'yes', then a lot of companies do this." Google does this, the startups I've worked at do this, and some of my friends' companies do this.
Reader beware, of course, I know at least one company on there shouldn't be.
I also found very few people could solve this (similar non-SF large city location). Occasionally, people who could not solve this were hired for other teams. Based on their performance, I don't think I would have been comfortable working with them.
I don't think it's unreasonable. But I'm not sure I'd use it as a screen if I was hiring now. I think I'd just have a chat and try and discuss a previous project. After that I'd move to a paid take home project (ideally representing real, useful work).
[1] Take a string, for example "ABCCABC" and count the number of times each 3 character substring occurs. In this case the answer would be 2xABC 1xBCC 1xCCA 1xCAB.
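A brute-force reading of that footnote might look like the following Python sketch (the function name `count_triples_naive` is mine; this is the nested-loop O(n²) version, which interviewers upthread said was perfectly acceptable):

```python
def count_triples_naive(s):
    """Count occurrences of each 3-character substring, brute force O(n^2)."""
    counts = {}
    for i in range(len(s) - 2):
        sub = s[i:i + 3]
        if sub in counts:
            continue              # this substring was already counted
        n = 0
        for j in range(len(s) - 2):
            if s[j:j + 3] == sub:
                n += 1
        counts[sub] = n
    return counts

print(count_triples_naive("ABCCABC"))
# {'ABC': 2, 'BCC': 1, 'CCA': 1, 'CAB': 1}
```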
I froze up. I couldn't talk and think at the same time. I didn't have the skillset for doing this in a very intense scenario. In my case, I was homeless and needed a job ASAP. Every interview felt like life or death to me.
First one I was asked to reverse a string in C. I hadn't done C in a few months. I froze up on syntax. I looked like an idiot who couldn't do it.
I could imagine many people who have never experienced this format (or haven't experienced it much) would easily freak out and look stupid as bricks like I did.
I've since done over 200 technical interviews (as the interviewee) and I usually sweep. Still fail at FAANG, but I always get the solutions (even the leetcode hard ones). Just not sure why I fail, but c'est la vie.
I realize people can still freeze up, but at some point I think there's just no solution to that unless the candidate can produce a significant portfolio. Also, I read your post and it sounds to me like you gained confidence in this area as your coding skills improved, which I don't think is as much of a coincidence as you seem to believe.
Is there anything they could have done differently to make the situation easier? (I would guess perhaps giving you more notice about language requirements?). Or is there a different interview format that you feel would have worked better?
Perhaps some people feel that, while they could write code to solve this problem (or similar problems) outside of an interview, under the pressure of an interview they would not be able to solve it (due to anxiety, stress, etc.).
If that were relatively common, then such a question would not be a useful interview question, or would at least not give an accurate estimate of a candidate's programming ability. This seems unlikely to me, but I would be interested in hearing different views.
I just tried it, and it was trivial to do in a minute or two on my laptop. However, I did take note of two syntax mistakes I made in the Python REPL that were immediately obvious there and took seconds to fix, but which I most likely would not have noticed on a whiteboard.
So there's quite a bunch of problems where if the acceptable "format" of the answer is "hey, I'll just pull out my laptop from the backpack and push the solution to github in 15 minutes" then it'd be okay, but it'd be hard to do it while 'whiteboarding' without access to immediate feedback and easily accessible API documentation.
For example, I work in many languages, and for many APIs I can't remember whether, in a particular language, the same thing is called add or append or something else. I've worked on Java code for a dozen years and would be a quite productive Java developer, but since I haven't written any new Java code for quite some time, I can't remember offhand the right boilerplate to open a text file for reading. It's something a Java tutorial might cover in the first pages right after Hello World, but I'd still have to look up the incantation to pass the encoding properly: there are something like three Reader classes to instantiate, and I don't recall their names off the top of my head.
It’s about your approach to problem solving and your ability to communicate that.
When I used this question I wasn't looking for accurate syntax. If it was a solution that looked like it would work, after some debugging etc. I'd consider that a pass.
Regardless, most people couldn't answer it, which I found surprising.
I've also had a couple of FAANG onsites and get as nervous as can be, but I feel that's just a matter of practicing more and becoming more confident in my abilities. No offers yet, though. Still trying and practicing.
I'd put out my fingers 3 characters wide at the first substring and step through the rest of it in 3-char 'spans' with my fingers. I'd say "extract each substring, put it into a map with that substring as the key, and either 1 as the value if it wasn't already present or, if it was, increment the value".
I'd probably not bother mentioning how to extract the results unless asked.
If someone said that to me it would show the solution, and I'd be 100% happy.
I'd assume that they could then render that into code; perhaps that would be a mistaken assumption, though. In my experience, solving the core problem is the thing I'm interested in, not the syntax. But those with more experience may say the former doesn't always imply the latter, and the code needs to be shown.
What I struggle to understand is why this sort of thing still serves as a useful weed-out screen in our industry, even for people whose resumes indicate masters degrees and multiple years of experience in development organizations. But experience shows that there really are people who are ‘faking it’.
If so, then the company you're interviewing for must not be attractive to first-class CS grads (top 10-20% I think) from any non-online university.
As a comparison, any FAANG + Palantir/Jane Street/Two Sigma in the UK have tougher questions as their FIRST phone interview for INTERNSHIPS. (Palantir requires you to go through 6-8? interviews before getting an offer)
If this is a job that requires actual software engineers, I think rejecting everyone that failed this question would be perfectly reasonable.
If you’re saying only the top 10-20% of CS grads can answer such questions then that would make sense. I’d guess I was getting a random selection of candidates.
Do you ever get candidates who ask "can I assume the strings are ASCII?" The usual solutions to this will break in amusing ways if you allow arbitrary Unicode. I know some interviewers who actually have "candidate asked about input encoding" as a hidden scoring criterion, and a lack of asking about that is a fail even if they correctly solve the problem for ASCII. I myself disagree with using such hidden criteria: if I'm going to score something and not tell the candidate exactly what I'm scoring, it's at least going to be something in the code, like "correctly avoided the divide-by-zero case without me pointing it out", not an expectation that the candidate read my mind about things I haven't told them. I do tell candidates to try to write code without errors (I help fix up basic syntax quirks, and if I spot a typo I'll point it out), or I ask how confident they are in their code and whether they have any edge cases in mind to try. I'd love to get a candidate who actually writes a unit test on their own; I always point out that JUnit is set up...
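For what it's worth, here is one concrete way "arbitrary Unicode" bites the usual solutions (a Python sketch; UTF-16 languages hit the analogous problem with surrogate pairs): the same visible word can contain a different number of code points depending on its normalization form, so a code-point sliding window gives different answers.

```python
import unicodedata

word_nfc = "n\u00e9e"                              # "née" with precomposed é
word_nfd = unicodedata.normalize("NFD", word_nfc)  # same word, decomposed accent

print(len(word_nfc), len(word_nfd))                # 3 vs 4 code points

def windows(s):
    """All 3-code-point sliding windows of s."""
    return [s[i:i + 3] for i in range(len(s) - 2)]

# The same visible 3-character word yields one "3-character" substring
# in NFC form but two in NFD form.
print(len(windows(word_nfc)), len(windows(word_nfd)))
```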
You see my process better if you watch me write and correct this program on a computer. If my task is to think up a fully correct solution without iteratively trying incorrect solutions, you're missing an important part of my process, even with a trivial FizzBuzz. (Though with this one you could probably just act as the interpreter and point out the errors for me.)
Can be done in O(n) time with a hash table.
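A sketch of that hash-table approach for the substring-counting problem (here via Python's `collections.Counter`; a plain dict works the same way):

```python
from collections import Counter

def count_triples(s):
    """Count each 3-character substring in a single pass: O(n)."""
    return Counter(s[i:i + 3] for i in range(len(s) - 2))

counts = count_triples("ABCCABC")
print(counts["ABC"])   # 2; every other substring appears once
```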
- Treat recruiting in the same way as you do software development.
- Formulate a set of requirements.
- Define interview questions that give insight into whether or not the candidate meets those requirements. This is the equivalent of "tests" in the software process.
- Specific skills with your technology stack are good, but not necessarily essential.
- Ability to discuss sophisticated software concepts, and to explain software that they have built, and how they would build out ideas given to them is good.
- Evidence that this person gets stuff done is good (ref Joel Spolsky).
Coding tests are, for the most part, garbage. Not because the test is of no value, but because you the employer probably don't evaluate the result properly.
If someone could blag that while not being able to even write FizzBuzz, all I can say is well played to them.
IMHO this process works fairly well and does a good job of being economical with people's time.
No. Just programming is actually not easy and a lot of people apply for jobs they can't do.
Plus, a nonzero number of people freeze up in any interview situation.
I once failed an interview loop because I forgot how bucket sort works.
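For anyone who also forgot: bucket sort scatters values into ranged buckets, sorts each small bucket, and concatenates. A minimal sketch, assuming uniformly distributed floats in [0, 1):

```python
def bucket_sort(xs, n_buckets=10):
    """Bucket sort for floats in [0, 1): scatter, sort each bucket, gather."""
    buckets = [[] for _ in range(n_buckets)]
    for x in xs:
        buckets[int(x * n_buckets)].append(x)  # scatter by value range
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))          # each bucket is small on average
    return result

print(bucket_sort([0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21]))
```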
At the final interview to join the SRE team at Google I was asked to implement the kNN algorithm. I barfed at implementing a kD-tree after regurgitating the brute force solution.
Has any SRE ever had to implement a kD-tree in < 20 minutes or Google would go down?
I asked the interviewer at the end. They had never implemented one on the job.
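For reference, the brute-force kNN mentioned above really is short; it's the k-d tree that's the 20-minute trap. A sketch, assuming 2-D points as tuples:

```python
import math

def knn(points, query, k):
    """Brute-force k-nearest-neighbours: sort all points by distance
    to the query and take the first k. O(n log n) per query; a k-d
    tree improves the average query time, at the cost of far more code."""
    return sorted(points, key=lambda p: math.dist(p, query))[:k]

pts = [(0, 0), (1, 1), (5, 5), (2, 2)]
print(knn(pts, (0, 0), 2))   # [(0, 0), (1, 1)]
```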
As long as companies insist on these inane rituals I think it’s fair game to optimize for it as an interviewee.
It’s stupid but what else can you do?
- Gmail loads slower than Eudora did on dial-up
- Chrome takes so much memory it’s basically a meme now
- The Google homepage (a text bar on a white background) is several hundred KB.
So what I want to know is, if they hire so many algorithmic whiz kids, where the fuck are they hiding?
a. hard workers, who will just do what is required without fussing
b. people who really want to work there, either because of the fantastic comp, or because they drank the Kool-Aid, or because they genuinely want to work on specific stuff that nobody else except the tech giants does.
I agree that it's basically hazing, and it has similar purposes: it discourages people who are not hard workers, or who just don't care that much about working at a FAANG, from applying. As long as they still get tons of candidates, it's a great initial filter.
This is more true for startups, which need less "how do I reverse a binary string" and more "how do I properly design a database in third normal form", etc. If you're presenting an interview question, it should have actual job relevance, and your co-workers should be able to solve it in the same time as the candidate. If you're drilling people on non-job qualities (e.g. invert a binary tree for a web dev role...), then you should expect a large difference between the audience that can pass your bad interview tests and the audience that will perform well at the actual job.
Not trying to complain. I think job interviews should focus more on the 99% of what you do in your job on a Tuesday. Poor interviews seem to be more about gotchas and algorithm tricks that disqualify candidates for roles which have little actual use for algorithms and data structures. The tests might seem too easy, but as a front-end engineer I would rather work with a coworker who understands the CSS box model, knows semantic markup for accessibility, and similar web things than a person who is good at creating hash tables and doubly-linked lists in JS. Leetcode probably doesn't test vertical centering techniques with CSS, but if you're applying for a web dev position you'd better know them.
I felt this was much better in that it was less stressful, yet allowed me to demonstrate both knowledge and design skills.
Another company did something similar but more thorough: They invite candidates for a full day of work where they try to solve a small problem. Then the code is reviewed and evaluated together. They also start the day with a 1-hour overview of their current architecture and you get to ask questions and talk about alternatives. I think this gives both sides a better chance of finding the right fit.
I realize this is not always reasonable.
The first type requires the candidate to learn a lot about the existing code base. The second requires more mental work but is atypical of real work.
Implementing features sounds like a good idea. Perhaps create a small well defined system which is a simplification of the real world system and let the candidate implement some additional features.
It's possible that you asked the question poorly, or the solution wasn't as obvious as you thought.
Designing interview questions is hard[1]. I'll test out new questions on my peers at least two or three times before putting them in front of a candidate. And many don't make the cut. If a good engineer who's relaxed can't solve it easily, then a stressed out candidate will have no hope.
[1] This is why I hate seeing candidates share specific questions online. As an interviewer you'll have to scrub a good question, and switch to something you're not as familiar with. This hurts good candidates.
I think this was a great way not only to verify coding ability, but also to test teamwork and communication skills.
The art of behavioral questions isn’t “ask and answer”; it’s the follow-up questions. As you astutely point out, the questions are ‘stupid’. They might as well be “do you want a stick of gum?”
Next time you ask those questions do a couple of things. First keep in the forefront of your thoughts what information you’re trying to get out of it and keep the candidate on track answering your data point. Do that by relentlessly asking follow up questions. When you think you have everything ask more.
As an anecdote, I was being shadowed during an on-site recently. I asked some arbitrary ‘dumb’ behavioral question, went back and forth a bit, wasn’t getting much out of it. I noticed my shadow clearly moving on to the next question in their notes and decided to keep pushing on the original question - why did you do this, what were you trying to solve, what motivated you. Turns out the candidate did all of this to generate new revenue for the company and ended up bringing in $10m a year extra at the small company they currently worked for. Loads of great data; I would never have gotten there if I’d settled for the canned answer the candidate had.
Behavioral questions aren’t comp-sci trivia questions; you can’t just ask the behavioral equivalent of FizzBuzz/Fibonacci/flood fill and copy down the answer (and you should never be asking those questions either, but that’s a separate rant).
Behavioral questions are stupid and to some degree that’s the point. When you ask your significant other or kids “how was your day?” — guess what, that’s a stupid question too. What matters is what follows from your line of interviewing.
If you want to get good at behavioral questions listen to Fresh Air and try to be like Terry Gross.
> Is this kind of code problem too complicated in your opinion?
Could you please post the problem so that we can provide you with meaningful feedback on it?
I agree that some coding problem needs to be used to try and answer the question, though with the right interviewer it can be answered without seeing code. The problems you use for that don't have to be at the level of some octree-collision-detection challenge; a trivial nested for-loop is fine, fizzbuzz level is fine. Sometimes you can rely on GitHub or a strong internal referral to skip this, but watch out, and in any case it's worth giving your questions to people you're sure will do fine (you've timed at least yourself, right?), both for the benchmark data and because sometimes they don't do fine, in which case your question may be too much. E.g. Floyd-Warshall can be done simply with a few nested loops, yet I would never give it as a problem, and I'd expect nearly everyone I've worked with to flunk it given only the standard hour (which really means 45 minutes).
Some jobs only need basic competence, so you might want to extend an offer if you've been convinced of its presence. At my last job, which ended up being more technically challenging / interesting than my current job, I was hired after posting my resume to Craigslist which led to exchanging some emails and having lunch with the startup founder to talk about my past work and whether I would be useful for his most pressing work. At my current job, I've been part of on-sites where I've established "can you even code?" is "no". Those were costly failures of not having that answered earlier. But we also like to believe we need more than basic competence, so rejections can still occur because of a lack of "testing mindset" or certain "behavioral answers". Only once you fix your "can you even code?" filter is it even worth considering what else you might want to justify an interview pipeline with more stages than a 'phone' screen or lunch conversation.
I used to wait to the first in-person interview to try simple fizzbuzz style questions (with the candidates on a machine and a compiler/interpreter). In about a third of cases that meant we'd committed a significant chunk of time to engineers that apparently couldn't solve trivial problems.
Now it's one of the first things I check. Done right, it's a relatively small hurdle for capable people to overcome, but really helps as a filter for those who aren't suited to the role.
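For calibration, the "fizzbuzz style" bar being described here really is roughly this; a minimal Python version:

```python
def fizzbuzz(n):
    """Classic screen: multiples of 3 -> 'Fizz', of 5 -> 'Buzz',
    of both -> 'FizzBuzz', everything else -> the number itself."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```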
I recently created a service (https://candidatecode.com) to help companies manage issuing and reviewing their coding challenges; I think it's got real potential to help some people out.
But if you need to recruit someone without any such credentials, then you may need to give a simple coding aptitude test. It could be a code review or a simple exercise, but whatever you do, don't do whiteboard coding and don't have people recite/implement memorized CS textbook algorithms. Anyone can do that and still not be able to code.
If the person conducting the interview thinks the behavioral questions are stupid, then perhaps they are. In that case, don't ask "stupid" behavioral questions.
> Resumes are meaningless, and often re-written by recruiters to match the job anyway
Was the position entry level? Students coming right out of compsci often have little to no practical experience. They may have difficulty thinking about what to put in their resume. After one or two years of full-time experience that should no longer be an issue.
> For all that I join in complaining about irrelevant algorithmic questions, I have to admit that they at least test something, even if it's just willingness to study for the interview.
Asking those "stupid" behavioural questions and receiving the same canned answers also demonstrates a willingness to study for an interview.
The coding problem should test a candidate's problem-solving capabilities as practically required by the role being interviewed for. The chosen problem should reflect the types of problems they will actually need to solve if hired. For example, you could select a small PR from one of the projects being actively developed by the company. The selected PR should involve only one or two classes (assuming a language with classes) and require improvement. You can look through the history of a PR and pull out a segment that was selected for improvement by the reviewer(s), or have the team select it for you. Then ask the candidate:
- to conduct a code review of the PR
- to improve the code
But we emphasized repeatedly we weren't looking for the O(n) solution; the brute-force naive solution was 100% OK. And it definitely didn't look like people were freezing up trying to figure out the optimal solution, they were struggling with the basic nested loop.
For instance, a front-end engineer can be expected to be able to write a to-do list or similar app in a framework (ideally the one your team uses, but it's not a hard no-hire if not) with minimal googling (which is fine as long as it's not excessive) in ~45 minutes.
Then you have to look at what level of experience they have. Less experience requires more mentoring generally, which may be fine depending on how much time your team budgets for that work.
Lastly, measure their body language and tone of voice to check for red flags pointing to difficult communication styles or people who treat others poorly.
If all three match, hire!
I recently had one requiring me to develop a native mobile application, which I enjoyed. It was interesting, the code is useful down the line, and if I don't land the job, it beefs up my portfolio.
Initial screening by recruiters is tough as my background's missing a degree and industry experience.
Context: self-taught, started out with game dev, tried going solo - not a runaway success. Looking to move away from the field.
I find that this is very useful especially when interviewing juniors who don't have many projects under their belt for the first part. It's also useful when a candidate has good verbalisation skills, but poor programming ones (which happens).
Could you give the exact wording of your "nested for-loop" question?
If no one can answer the question, maybe they don't understand it? Maybe there's something unclear in the way it is worded?
Which I think is why a lot of startups locate in the expensive Bay Area - not a lot of cities have a similar concentration of decent talent.
You'd be surprised (or maybe not now) how many applicants for a senior frontend position can't build a progress bar for the phone screen.
I loathe writing algorithms on a whiteboard, especially the 'catchy' type. But I've interviewed people who can't even write a for loop... the amount of brain drain in flyover country is insane.
Could it be that you're having issues communicating the problem?
How did you run these? While it sounds like something that should be ok even on paper, you can vary comfort a lot through the medium. E.g. for the last interview I had with a substantial coding part, I think being able to do it on my personal machine made a big difference. (I obviously was told before what kind of environment I'd need to have ready)
I’ve been in a panel where I was the only person who asked a code question, the candidate flunked, and then the VP of Engineering went over my complaints and hired the guy anyways. He had been a Professor of Software Engineering and had a graduate degree from Princeton. Within three weeks, the VP of Engineering had to fire him because he couldn’t make it through a simple code review.
BUT the extremely negative sentiment here towards the technical interview process is very well-deserved.
Assessment of code (and the selection of problems) is most often no less subjective than any non-technical assessment. Sometimes the interviewer doing the grading is flat out wrong. Several times I’ve been asked the famous “given an array of stock prices, find the optimal buy and sell indices for the biggest profit.” One interviewer was not aware of the linear time solution to this problem, and didn’t believe me when I wrote and tried to explain it to him.
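For reference, the linear-time answer the commenter describes is a single pass that tracks the cheapest price seen so far and the best spread found so far. A minimal sketch (the function name and signature are mine, not from the original interview):

```python
def best_trade(prices):
    """Return (buy_index, sell_index) maximizing prices[sell] - prices[buy].

    One pass over the array: O(n) time, O(1) extra space.
    """
    best_buy, best_sell, min_i = 0, 0, 0
    for i, p in enumerate(prices):
        if p < prices[min_i]:
            min_i = i                        # new candidate buy point
        elif p - prices[min_i] > prices[best_sell] - prices[best_buy]:
            best_buy, best_sell = min_i, i   # new best profit found
    return best_buy, best_sell
```

The key observation the interviewer in the story missed: you never need to compare all pairs, because the best sell for any day only ever pairs with the minimum price seen before it.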
But sometimes, the interviewer doesn’t even want you to do well. One time I was interviewing with an injury that prevented me from typing efficiently. I had a doctor’s note and the injury was quite conspicuous. Nevertheless, three start-ups made me solve problems by typing on a keyboard, which guaranteed an excessively long completion time. Those companies held those results against me. (And it’s not like there is anybody to hold those panels accountable.)
And then there are those who just don’t care. I had a phone interview with Airbnb that was literally as bad as the stories on Glassdoor: the guy answered the phone in a noisy office (not a conference room), gave no introduction, then simply stated the problem and dumped me into Coderpad. I literally thought it was a prank, since I had met with people at Airbnb face-to-face prior to the call. But the recruiters confirmed the guy was a real employee.
The root problem here is there is no feedback loop back to interviewers. The candidates get “feedback,” but people asking code questions, especially new grads, typically get zero assessment on how well they are doing as interviewers. What’s worse is that recruiters and hiring managers both have incentives to deprive ICs of such feedback, since it would invariably make ICs more aware of opportunities outside the company.
Until the incentive structure of technical interviewing changes dramatically, we’re stuck with Leetcode and hope for the best. People like Gayle Laakmann are helpful (especially when Facebook gives candidates a live hour-long session with her for free), but these people are ultimately invested in their own income and not the task of fundamentally fixing this broken process.
Don't give them a FizzBuzz. Give them an example of a real problem your engineers need to solve or are trying to solve. How do they respond? Thoughts, intuitions, pseudocode? Do they show knowledge of the problem space / domain?
Or, if you just want a kid who can code, and they pass the FizzBuzz but fail at the real job, what does your training/culture look like? Who does that really reflect on?
It seems to me that interviewing is terribly cargo-cult: the problem is real, but the practices ostensibly supposed to be solutions are not.
/end rant
I was like this a couple years ago. I was a self-taught, "college is a scam", "practical experience" type guy. Now, however, I do see immense value in the ability to work through these algorithm questions, especially if you ever want to do something besides web / app development.
The classic detect a loop in a linked list question. The original guy took years to devise the algorithm for it. In an interview either you’ve seen it before, in which case you rattle it off, or you have to write a research paper effectively in 5 minutes.
“This problem is (analogous to) loop detection”
“This problem requires a sorted list”
“This problem can be solved by depth first traversal”
Actually sorting a list or doing a traversal is easy after that crucial step. You can look it up. You can’t look up the step that told you what algorithm or data structure to use though.
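For anyone who hasn't seen it, the algorithm being alluded to is Floyd's tortoise-and-hare cycle detection, which is trivial to rattle off once you know it — exactly the parent's point. A minimal sketch (the `Node` class is my own scaffolding for illustration):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None


def has_cycle(head):
    """Floyd's tortoise-and-hare: the fast pointer moves two steps per
    slow step; if the list loops, the two pointers must eventually meet
    inside the cycle. O(n) time, O(1) space."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

Nobody derives this from scratch in five minutes under pressure; knowing to say "this is loop detection" is the whole game.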
So am I smarter than Karl Popper? Not really. It's just easier to learn something than to find it. So, yes, I know the linked list answer.
What you want is very good football (soccer) players. Unfortunately you (or upper management) may not know all of the rules to soccer. You may not know the training regime that goes into winning a good soccer game - and it's a big risk spending money training for the big game on the off-season only to lose during the Big Show. So what do you do to test potential candidates? You see if you can get along with them, if they're a team player, and then see how well they play foosball.
It's perfect! There are soccer players on the field, there's a goal, it takes skill and coordination. But oh no, it turns out in the population at large really good soccer players really sort of suck at foosball. After all, they'd rather spend their time and energy playing soccer. So now they spend all their time reading up on books about foosball and what the best foosball strategies are.
You see where I'm going with this? No one uses algorithms in their day jobs. OK a few of you, but come on man, I make full stack web applications. As do most programmers. So why are you testing theory that has literally nothing to do with the job? Somehow someone thought this was a proxy for smart people, but I mean, if the guy who wrote this "Tech Interview Handbook" was really smart wouldn't he have spent his time writing a cool program? I mean how lame is this?
If you want to hire competent engineers tell them what you're building and ask them how they feel is best for them to show they're competent. If you have a big data pipeline in Scala ask if they can construct a data pipeline example that is cool over a few days (take home) or do something similar in the office. Some people like one, some like the other. But just communicate with the people you want to hire! And if not everyone's interview process is the same then maybe that's ok.
I mean I just got a guy who sent me some automated code interview program that had a timer that counted down from an hour at the top! WHO THINKS TREATING THEIR POTENTIAL EMPLOYEES LIKE STAR WARS DRONES IS A GOOD THING? All you have to do is treat the people you want to work with like you would want to be treated and demand that they know their shit.
This. Isn't. That. Hard.
Ah, dynamic programming.
Are we going to see tech recruitment become more and more like college admissions where a top score in the interview is just one of the criteria and no longer sufficient to get a job?
Perhaps next is asking people to write essays about their career, goals, why they want to work there, extracurriculars, etc.
I definitely see some of those kids doing the same tactics to get into a "good" company. The steps are slightly different but the philosophy is the same: grind Leetcode, participate in tech clubs and take bullshit jobs juuust long enough to get an internship, get said internship and parlay it into a return offer. Put your head down, play the game and reap the rewards.
Friends in college took less challenging majors (business admin, environmental policy, etc.) to boost their GPA for med school; their core classes were easy, which gave them more time to focus on med school pre-reqs.
It has been and always will be a game. Few things in this world come down to meritocracy because the way we socialize and the way we interview does not beget that in absolute terms.
Tech did a better-than-most job as a meritocracy for a while, but I believe those days are done. Most areas of tech are too mainstream, with too much obvious money at stake.
There is also the short game vs. the long game. For example, if you cheat on a test without learning the material, you might win the short game, but if that material was important and you need it later, you're losing the long game. That said, if the material was just fluff and no one really cares, it's actually smart just to cheat and get by.
If you want to get into those schools, you gotta know how the game works, and practice specifically for that match.
I went to a school like that, and the majority of my classmates were from upper-middle to upper class families that had poured money into their education since they were young, with the specific goal of getting them into top schools.
Yeah, that's unfortunately the thing when you start throwing in a ton of different criteria and measurements: the people who know how the system works will study to maximize those points.
My honest opinion is that you kind of end up with smart people who are very good at test-taking, but who may not be the best people when presented with a set of problems that have non-obvious solutions and no guides or road maps for solving them.
Also cover letters cover some of this.
Actually, I wouldn't mind a standardized test like the GRE, where a good score might actually keep your resume from getting thrown out immediately.
Engineering competence is not even slightly oversupplied compared to useful engineering work. Project failures, incompetent people, and systematically incompetent orgs are still very much alive at the most selective tech employers. There are real business needs to hire better engineers.
So they have a full 30% of their applicants qualified to be there but they have to narrow it to 1%. Whatever method they choose must have the appearance of meritocracy, abide by laws and regulations, be resistant to corruption, achieve goals other than academic performance such as culture, volunteering, and sports, and so on.
I think the same must be true for companies. At some point in the elimination process everyone is technically qualified, so they might as well hire someone "because we like his face". But that's demoralizing and corrupt so they invent some criteria that on its face seems useful, even though it's truly not.
You have something really special there; hang on to it. People dream of working at a place like that. And have fun taking over the world, because with a team like that, you will.
Until I interviewed a lot of people with my colleagues over the years. Interview processes are highly biased by the interviewer's background, values, or even mood.
Some interviewers are sloppy about interviewing: asking ill-defined questions, demanding the answers they want, or just in a hurry and wanting to get back to work. I often feel bad and angry for interviewees - they spent time and patience preparing themselves carefully, then were treated very casually. It isn't fair at all.
I hate to say it, but although standardized tests are bad, they are better than most of today's interviews.
> This course will prepare students to interview for software engineering and related internships and full-time positions in industry. Drawing on multiple sources of actual interview questions, students will learn key problem-solving strategies specific to the technical/coding interview. Students will be encouraged to synthesize information they have learned across different courses in the major. Emphasis will be on the oral and combination written-oral modes of communication common in coding interviews, which are an unfamiliar setting for problem solving for many students.
At least personally in my hirings I'll never use or trust anything like this.
There are so many things wrong with this approach that I'm kind of speechless as to where to start.
Most companies today are already using a version of this that is way less respectful of applicant time.
But you’re speechless, so I guess there are strong points on both sides.
The best way is someone brings in an existing code portfolio and discusses it.
The second best way is someone completes multiple design and development exercises of varying complexity, constraints, and use cases.
The third best way is they complete a single exercise and provide commentary on alternative designs.
There is no fourth best way; all other approaches are essentially stochastic and select for interviewing traits, not development traits.
The actual best method, I think, is a 3-month probationary period, which is more or less an extended interview. Candidates are asked to contribute to existing codebases, participate in code review, go through architecture design sessions, and conduct stakeholder interviews - things that, again, are mostly impossible to accurately gauge in a typical candidate assessment window.
By the way a tremendous book for interviewers and hiring managers is How Judges Think by Richard Posner. A lot of it applies to hiring, and he's a great writer.
Number 3 is my favorite, but there's a trade-off: almost anything large enough to allow for multiple designs is probably too large to require all applicants to complete. But I also love having a candidate discuss their own code. I'm really curious what types of exercises you have for this kind of thing, and where it fits in the interview process?
I’ve seen this mistake happen so many times. Especially in early stage start ups. If you want morale issues and disruptions, you’ll achieve it 100% of the time by retaining underperforming staff. Even worse, this ultimately leads to your best team members leaving if it’s an issue you can’t solve. Fail fast applies to HR too, you need to learn to fire fast.
My workplace has a six month probationary period; I brought in a mandatory three month review after watching one group screw this up badly. At the end of the six months, the employee came in to work all happy as usual thinking everything was great, and was promptly fired.
The three month review is the point at which the employee is told that they're on track, or they're below standard. If on track, just keep going the same way and if they don't get taken aside for a specific chat in the following three months, told to assume that they're going to pass the probation period; we have definitely turned probation failures into probation successes via this mid-point review. It's also their opportunity to tell the company what the company is doing wrong; what the company is doing that will make them choose not to stay. This too has happened, and we have retained good employees by listening to them at the three month point and making changes.
If they're below standard, they're told what they need to improve and are offered help to improve, or they can just sack it now and walk (or, as happened once and once only so far, they're considered unrecoverable and we take a long hard look at how that person was hired).
The principle we subscribe to is that if the employee is surprised by the results of their probation period, that employee's team lead and by association "the company" has really screwed up. If an employee doesn't know how they're doing after six months on the job, something has gone very wrong.
I don't live or work in the US, though; we can dismiss someone during their probationary period with ease, but after that they have workplace protections. If we could fire anyone at anytime for anything we like, that effectively makes the entire employment period a probationary period. Probation never ends if you can be fired at any time.
Most states in the US have at-will employment terms. Probationary periods are common in other countries. And yet no one seems to want to do what you're suggesting.
Why would anyone give this advice? Can we stop handing it out and encourage everyone to stand up for their rights instead? If you think about it, this advice by proxy amounts to: as a programmer you will have no life outside of work, and you are supposed to be an idiot who spends his time working and studying outside office hours even when you could be spending time with your family. So yeah, go for it and suck it up, you idiot.
At least that's how most employers handle this problem. And interestingly, by comparison, no one tells an MBA holder that it's a great career for passionate individuals who like to spend their whole lives studying.
And before you think I am against studying, that's not the case at all. If my company pays for it and I can kick back on a sofa during working hours to study, I am fine with that. But I cannot see any value in studying something at my own expense, on my own time, that will be outdated in 3 years anyway and so by definition only benefits my employer.
It seems relevant to getting a tech job. If one is looking for a job, chances are that things have changed since the last time they looked for a job, which on an average for most people these days is every 2-5 years.
This is not limited to engineering or tech, and extends to most specialized jobs in most industries.
Can we stop handing out this advice and encourage everyone to stand up for their rights instead?
In the right context, absolutely. In general, it's possibly a bad idea. You could always balance it out and educate folks about the rights of an employer.
If you were on my team, I would not expect you to sacrifice any of your rights. But if you kept falling behind your peers, to the detriment of the team's performance, at some point you would be put on a performance plan. The unfortunate thing about performance plans is that by the time one is enforced, things are close to unrecoverable.
And if you did find yourself failing, I hope someone tells you that:
The technology industry is an extremely fast-moving one. Many technologies used today either didn't exist or weren't popular a decade ago; in 2009, mobile app development and blockchain were pretty much unheard of. Engineers constantly need to upgrade their skills to stay relevant to the demands of the job market.
Because, I will not.
Is this public to employees or you just simply whip everyone until they work themselves to death without telling them the reasoning?
Also, I assume you are in the US. In most European countries you can do absolutely nothing about someone who works full time on your team unless they, say, cause you financial losses or punch you in the face, so in other cases you can basically shove your plan where the sun doesn't shine.
Most of the US follows At-will employment[0], so generally no performance plans are necessary.
This is not public. We've only used it twice: one person was suspected of misappropriation of funds (later proven), and one case currently in motion involves an employee who misrepresented their technical knowledge - i.e., cannot write code and convinced other employees to do 90% of their contributions.
I get that there is this impression of the US whipping everyone until they work themselves to death. Anecdotally, I've never really experienced that. If anything, folks in the US like being busy, and workplaces (at least in tech) are overly happy, in a way that reminds me of a cult.
I have lived and worked in 5 countries on 3 continents and worked with people living in many more countries. In my own observations, my US colleagues have usually exhibited the highest level of happiness/satisfaction. They also tend to find their next jobs the quickest. And are paid the highest.
It was always basically assumed that you would study all the necessary things on your own time, and that when you sat in the office you made productive things, a.k.a. delivered. You always had to code and show progress on a specific development task each week. There was never such a thing as time to research/study.
> It turns out that the median number of hours racked up by an MBA in his or her first year of employment is a whopping 54 hours a week
https://www.forbes.com/sites/poetsandquants/2018/03/06/the-6...
54 sounds about average for an engineer anyway. In Germany, for example, it's not rare to have 9-hour working days with 30-to-60-minute lunch breaks in between, which comes to about 45 hours a week. If you study 2 hours a day - reading books, following the news and trends, watching recordings of past conferences, etc., which are pretty standard things you are supposed to do in this job - it easily comes to 54 hours a week. So that's not high at all.
Karl Marx's entire philosophy was based on the abuses of workers he observed.
Nowadays people mock progressive causes as 'socialism', mock unions and worker protections, demonize progressive politicians, and idolize oligarchs. Even though it's all against their own best interest.
Your idealistic notion of 'rights' doesn't exist and will never happen. You have no rights as long as someone controls your purse strings, which, with booming inequality, describes more and more people nowadays.
I can't wait until most jobs are finally automated and we're done with this whole capitalist system entirely and have to figure out what to do next.
We have to figure that out much sooner than all jobs are automated.
> What is the most costly technical decision made early on that the company is living with now?
> What is something you wish were different about your job?
> What has been the worst technical blunder that has happened in the recent past?
But I would be careful how you interpret these. In fact I would almost factor in these answers in the opposite way of what I think you intended. The company that admits to the worst technical issues is at least honest and self reflective. The company that doesn’t admit to any serious issues might be just as bad or worse, but their strategy is to tell employees to lie about it rather than be open to addressing it.
Perhaps companies will start asking candidates to construct mathematical proofs of data structures, algorithms, formulas and common equations from university-level entrance examinations just to do a mobile app or a web dev job.
As soon as that happens, the 'ideal candidate' companies will be expecting to interview would be a very prodigious candidate, former math Olympiad champion and decorated with titles and research papers in their name.
You guessed it: 𝔜𝔢 𝔬𝔩𝔡𝔢 𝔩𝔢𝔤𝔢𝔫𝔡 𝔬𝔣 𝔶𝔢 10𝔵 𝔡𝔢𝔳𝔢𝔩𝔬𝔭𝔢𝔯.
Perhaps we’d produce better software if people prioritised correctness like this in practice!
Now, would this make sense for a graduate entry-level web / mobile app developer position? Interviewers looking for such candidates need to lower their expectations a bit for positions like that.
As much shit as we give white-boarding, I would have chosen it instead if it were an option.
Needless to say, after the interview ended I ran away from that deal.
I don't think I've ever seen anyone explicitly recommend against using a custom domain for email.
https://haseebq.com/how-to-break-into-tech-job-hunting-and-i...
1. If I am not using a lot of data structures and algorithms in my day-to-day work, how am I supposed to be good at them?
I am very good at finding solutions to problems but I am very bad at remembering a lot of things.
2. How does one even prepare for a subject as big as Android? The thing is vast. And asking trivial things about it won't be very useful.
When I was prepping I got the most value out of LeetCode for solo prep, followed by mock interviews with sites like Pramp.com, Gianlo.co, and PracticeCodingInterview.com.
I don’t know why tech companies don’t just admit that this is all pretty much standardized at this point. Just build a standardized test, or certification, and get it over with.
I’m only half joking. At least we’d only have to go through the process once.
I don’t think so
Maybe this should be titled “Just Graduated? Some Useful Tips”
https://interviewing.io/recordings/Python-Google-6/
I'm not affiliated with this site, I just thought this was a great idea and well executed.
I used to lean toward the "studying algorithms, data structures, whiteboarding, etc. is useless since I'll never actually need them" ideology until later in my career when I realized that worst case (for me) I can take a break from building CRUD apps and refresh my CS fundamentals. I enjoy speeding up code and then asking myself, "can I do better?" each step of the way, trying to make further improvements.
The fact is that there are tons of people out there representing themselves as programmers who actually can’t perform basic tasks.
The HN population massively selects for competence, so people here have a hard time imagining what things are like from the interviewer side.
The main thing that really makes a difference is problem-solving skill applied in a real project.
Thus, we pivoted to mini-project screening automation for companies to use in hiring. The site has backend-to-frontend tasks that come with a CLI for coding the project locally. The website is called https://real.dev. We want to solve these interview problems for companies.
Has anyone had any luck with an approach like this?
Really? I did the most interesting and challenging projects earliest in my career (late 80's to early 2000's) -- just due to the nature of the industry, I'd think that more recent project descriptions have undergone a certain homogenization that would make them less and less an indicator of talent and varied experience.
I can chat excitedly about early projects, but now people must all be saying "cloud blah JS blah containers blah blah bit-pipes and storage blah - oh yeah, and modands!"
The quickest way to do that in a 45-minute interview is to put the candidate under pressure.
The best way to put a developer under pressure is to ask coding questions and expect a solution in a very short time.
Note that the goal is NOT to find out if you know the ins and outs of a specific algorithm. It is to discover how you think under pressure.
1.) Everyone is studying these problems all of the time and they finally disappear.
2.) The other outcome is a dystopian field fueled by a race to the bottom where everyone is practicing algorithm problems all of the time. If you read the Blind forums, some people are completing 500-1000 leetcode problems before heading into interviews.
I'm putting my money on number 2, which is where we already are. Can only imagine what this is doing to code quality...
People just master the foundations behind the old test material. Now that stuff has become the trivia of today. So they need some more advanced stuff to test the test takers.
Over the course of one year, I completed, classified, and commented 200 leetcode problems. I also taught algorithms to third- and fourth-year university students not too long ago. I believe I write readable code, I know the language I'm using perfectly (at least for this purpose), I'm totally fine with complexity analysis, and I know most methods involved in these algorithms, including more advanced ones such as KMP.
Yet... I failed my round of interviews at Google. Even after this preparation, I'm still not able to quickly solve an arbitrary leetcode problem in the context of an interview: on a whiteboard, with an interviewer at my back, in a stressful situation. I need to practice more if I want consistent results.
So I agree with your conclusion. We are competing with a lot of people who train using the same resources. Including young graduates who have a lot of free time on their hand.
On the positive side, I'm thankful to Google for giving me a shot. Based on my resume (40+ with little experience in the software industry), I'm not sure I would have been interviewed at a more traditional company, say, a bank.
To come back to my interviews: system design went very well. Algorithms went quite well too, but not well enough. I couldn't solve one problem, and was a bit slow on another.
The recruiter first told me I passed, and that they were going to find a team for me and make me an offer. But they finally asked me to re-take the algorithmic interviews a few months later, to "make a stronger point to the hiring committee".
Never heard from them since then, about 8 months ago. Recruiter doesn't answer emails. I think she moved to a different position. Not sure what to do now?
I don't get why no one seems to consider the possibility that these sorts of interviews actually do get high quality engineers in the door.
I get downvoted for raising the question every time. But isn't it possible this interview style actually works, even though it doesn't resemble real coding and even though many of us hate it?
I have yet to see any compelling argument for why I should believe these interview practices don't work. And yet the fact that so many companies, with so many resources to change things up if they felt it was in their best interest, keep interviewing this way must at least suggest the possibility that maybe it works?
That is debatable, to say the least. I'm both a hiring manager and on the market for a new job (so I am still solving leetcode problems in my spare time). After 3 years of hiring based on leetcode for technical competency, I can say that the quality of engineers is hit-or-miss. I've had people who wrote brilliant solutions to hard leetcode problems crash and burn when writing production code. Currently on my team, the most technical debt was written by someone who completely aced the leetcode stage and was PIP-ed out a couple of months back. We even have an inside joke to look both ways before changing X's code. He easily landed a job at a unicorn and I'm really glad he's their problem now.
I really don't believe there is a strong correlation between competitive programming chops and being a competent engineer in a team environment.
We are currently changing our interview practices to ask questions that touch on more practical issues (multi-threading, reviewing a piece of code, changing a piece of code, making a unit test pass, instrumenting code with metrics, etc.), because what I personally identified as the better signal was competency in specific, practical types of leetcode problems such as LRU caches, O(1) data structures, and iterators for common data structures.
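For context, the LRU-cache problem mentioned above is usually solved with a hash map combined with an ordering structure. A minimal sketch using Python's `collections.OrderedDict` (the class and method names here are my own illustration, not the parent's interview question verbatim):

```python
from collections import OrderedDict


class LRUCache:
    """O(1) get/put backed by an ordered dict: the most recently used
    keys sit at the end; the least recently used is evicted from the front."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)          # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict least recently used
```

In an interview without `OrderedDict`, the same behavior is typically built from a dict plus a doubly linked list, which is where candidates tend to stumble.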
Why would studying algorithms and data structures affect code quality? They would similarly be able to learn to write quality code once they're inside the company, no?
While that is not a negative skill to have, it is also not a skill that I'd list anywhere in the Top-25 of most valuable skills for productive developers.
They work against creating readable, understandable, and debuggable code, which is much more important in general than being able to solve algorithmic problems you'll almost never see in real life.
I've seen this first hand where some were brilliant at these problems but wrote the worst code imaginable. I would rather hire someone who can write clean and simple code and teach them how to solve these problems than the reverse.
Why? Teaching people to write readable code happens automatically during code reviews, teaching people how to think is a lot harder.
First off, no, writing good code in an interview wins you additional points. Second, why do you think candidates can (or will) continue doing that on the job? New employees aren't allowed free rein to check in code from day 1 at most places - trust has to be earned. And I don't think any decent company allows check-ins without review.
Why is "passion" so important? And what even is passion? For professionals in every other field, competence, ability to deliver results, and getting along with people, are what matter. Many pros are passionate, in the sense of loving their work, but passion isn't a prerequisite for being a pro.