Google Interview Questions

The famous Google interview questions? They don’t work. Here’s Laszlo Bock, senior vice president of people operations at Google:

On the hiring side, we found that brainteasers are a complete waste of time. How many golf balls can you fit into an airplane? How many gas stations in Manhattan? A complete waste of time. They don’t predict anything. They serve primarily to make the interviewer feel smart.

Instead, what works well are structured behavioral interviews, where you have a consistent rubric for how you assess people, rather than having each interviewer just make stuff up.

Behavioral interviewing also works — where you’re not giving someone a hypothetical, but you’re starting with a question like, “Give me an example of a time when you solved an analytically difficult problem.” The interesting thing about the behavioral interview is that when you ask somebody to speak to their own experience, and you drill into that, you get two kinds of information. One is you get to see how they actually interacted in a real-world situation, and the valuable “meta” information you get about the candidate is a sense of what they consider to be difficult.


When I spent a day interviewing with Microsoft in 1987, I was disappointed that they didn't ask me any of their then-famous brainteaser questions like, "Why are manhole covers round?" Mostly they just asked the same things as everybody else, like "What do you see yourself doing in five years?"

Yeah, I thought they were the 'famous' MS questions, and I also assumed they were apocryphal, considering I worked somewhere famous for doing something similar that I never saw or heard tell of.

I have actually been asked brain teaser questions in software engineering job interviews. I have done more brain teasers than most in my life, so I do well on them.

I have never asked one. I think that if a person fails at them there are two explanations: either the person is stupid, or they have never spent time doing brain teasers. Stupidity is easily detectable in other ways; brain-teaser experience is not very interesting.

I did a lot of interviewing over the past couple years and I agree that having a stable interview process is important. You build up experience in what a good performance looks like, and skill at keeping things running on time, etc. Without a stable platform it will be hard to compare candidates with each other or with past hires, and the chaotic process may unnerve a candidate, or just make them not want to take the job if you offer it.

When I worked for Dun & Bradstreet in 1993, I got approval to hire a computer programmer. I went to the HR department and asked them to give me D&B's written programmer test. The HR lady said that they never had written tests because then they'd have to get them expensively validated or the EEOC would be on their cases, but that I should feel free to ask orally any questions I like about programming computers.

"But I don't know how to program computers," I pointed out. "That's why I've been told to hire a computer programmer."

This proved a logical conundrum for the HR lady and me. So I mostly just asked applicants what they saw themselves doing in five years, and eventually decided not to hire anyone and just do the whole project in Excel myself.

The Google quote sounds like an illustration of The Smart Get Smarter principle. The brilliant programmers employed by Google have no problem orally interviewing programmer job applicants to find the best ones, while the pointy-haired bosses like myself have no clue how to do it right.

Sounds like a business opportunity for someone willing to validate qualified programming candidates, also known as "Be Google."

Written programming tests are pretty common, and they are written by the programmers at the company and validated by nobody. I think that company had fallen prey to legal superstition.

"fallen prey to legal superstition."

No, the EEOC cracked down on a different market research company I worked for, which used as a hiring test co-founder Prof. Gerry Eskin's final exam from his Advanced Quantitative Methods in Market Research class. The quality of hiring never seemed to recover.

As far as I can tell, the federal government pretty much ignores enforcing disparate impact and similar anti-discrimination regulations in Silicon Valley and Hollywood, on the grounds that, well, they're too cool or something.

Just drive by a movie crew shooting on the street in L.A. and try to count how many Latinos are employed in skilled blue collar jobs. It's as if the last 40 years of demographic change in L.A. never happened. But they can get away with it because they're Hollywood.

My wife recently had a phone interview for an accounting position with a spinoff of a large corporation. They were using HR people from the parent company to do a technical interview. The HR woman had to ask my wife to go very slowly and repeat her answers, because she was writing everything down to show to someone in accounting later for evaluation.

And then the next round of interview was with people from accounting.

It was really stupid.

I once interviewed a chap for a tenure-track academic job: he, I decided, was either so much more intelligent than me that I couldn't engage with him, or was a charlatan. So I asked him why he wanted to change employer. He said it was because his current employer didn't recognise his genius and wouldn't promote him; indeed, proposed not to renew his contract. I decided he was vainglorious. I've since seen reference to his semi-stellar subsequent career, but that just left me feeling that he was a successful charlatan. So: should we have hired him?

For academia? Of course!

'What is it exactly that turns you off?'

I fail to see how "describing how an interviewee solved a problem" tells the interviewer more about the interviewee than watching the interviewee solve an actual problem. It should be obvious why the former suffers from a wide array of problems. Not a fan of behavioral interviews at all. They are so contrived.

Behavioral interview questions aren't the whole story, but they do work. Watching someone with excellent qualifications and a wonderful resume struggle to describe what they actually do in a situation that should be commonplace for them provides valuable information to an experienced interviewer.

The key is "experienced interviewer". Not some HR flunky. Nor some guy you hired a year or two ago.

This may seem unlikely, but I've seen both happen and at top companies too, especially for not-so-high-up positions. Biggest problem is HR runs the show, and most good technical people hate dancing to HR's tune, so the interview responsibility often devolves to non-cream employees or junior employees who cannot object.

e.g. If your interview rubric for a SysAdmin position was designed by an HR guy who's never seen a command line, there's only so much good it can do.

I disagree. Such questions measure a useless quality: the ability to recall and present things that no one ever has to recall and present in practice. You need to see someone *solve* the relevant issues, not discuss them with him. If this is a suggestion by the "vice president of people operations at Google," I find it embarrassing. The reason someone will "struggle to describe what they actually do" is that he is forced to explain something complicated to some dumb HR person, and also that explaining what he does is just not what he usually does and creates no value for his employer. The proposition that this "conveys useful meta-information" is just clueless HR gibberish.

You can tell the validity of Olaf's critique by noting how much a little practice helps you with this sort of question. Memorize a few stories that make you look good, and twist them to fit the interviewer's question. Choose at least one thing technically challenging and one thing personally challenging, and make sure you "failed" on the way to your eventual success.

These questions measure interview prep and smoothness. They might be things you care about when hiring a guy whose job is mostly social, but maybe not so much for a programmer. Arguably you should look for people with great proven technical ability, but who interview poorly, and then make them low offers. All the people who heavily weigh these BS interviewing metrics will pass them over, so maybe you can get them for a good price.

I was very bad at describing what I do at my current employer, until I got asked the third time, when I just sat down to write up an answer I could recite from muscle memory while thinking about something else entirely.

It more shows how far along in the job hunt someone is.

Arguably you should look for people with great proven technical ability, but who interview poorly, and then make them low offers.

People who interview poorly often have interpersonal communication issues. So if you are hiring for a team position in a challenging field, you can't just hire for proven technical ability. A first rate talent who's also a drama queen or passive-aggressive may well be a net negative.

Drama queens tend to be very good at interviewing. Passive-aggressives? Maybe, maybe not.

It's not like there's just one interpersonal metric and you want to maximize it.

Well, if you think the interview is measuring something important, you should probably pay for the positive results.

But if you don't, you should take the negative results with a grain of salt and realize "what I've got here is a candidate who's going to be undervalued by everybody else."

I could describe in an interview the toughest engineering problem I ever solved. You could not watch me do it because that would require having the same hardware/software environment and analytical tools that I had when I did it.

If you could watch me, it would be boring. It took days of beating my head against the problem to figure out there was a subtle error in the logic of the hardware which could only be evoked by a complex set of conditions. But I could describe the whole miserable saga in a few minutes.

But that's an easy interview. Other senior engineers could and would ask all kinds of technical and procedural questions, and they'd enjoy hearing your answers.

The challenging interviews are when you think you might have a viable candidate, but you can't get them to give you anything better than short, canned answers.

I make up a situation based on a real situation and explain how I "solved" it. People are ready for behavioral-based questions and pull out their contrived arsenal when the questions are asked. Even for graduate school admission, the behavioral questions were variants of run-of-the-mill behavioral questions. I don't lie in my typical life, because I feel it is immoral and that being honest with people is very important, but I do lie with behavioral-based questions. In many of the instances where I have "solved a difficult problem," the interviewer is unlikely to understand what I am talking about when I explain the problem. So I make up a similar, easy-to-understand problem and make up how I solved it.

Is "people operations" a way for HR to reinvent itself?

Note that most people in HR will do terribly at brainteasers. Part of the fondness for behavioral interviewing, as opposed to something more technical / quantitative / analytical, reflects the typical profile / IQ of an HR exec.

I have no clue how strong Laszlo Bock's technical credentials are, but in general I'd be very very skeptical about listening to anything HR says about what works or does not. HR has a stake in keeping its turf and behavioral interviews, SWOT, group discussions, psychological profiling etc. are all great opportunities for HR to retain its power.

On the plus side, had we done behavioral interview questions for Obama the campaign would have been very short.

Against Romney? Think again. Where is that sociopath thread when you need it ("Binders full of women.")

Slut-shaming the BDSM community, are we?

"The famous Google interview questions? They don’t work."

In my experience with Google from 2008 onwards, they never asked brainteasers. Instead they always asked extremely technical algorithms / data structures questions that involved coding on a whiteboard.

Which seems a great approach and is quite different from the behavioral crap this guy's spouting.

The big problem is that "brain teasers" means vastly different things to different people. I was disappointed to see that as the lede on this post.

Some people think "you drive past a bus stop in the pouring rain and see your best friend, a sick old woman, and the woman of your dreams there, what do you do?" as a brain teaser.

Some people think "write a computer program to count all games of tic-tac-toe" is a brain teaser.

One of those groups is wrong, but loves to muddy the issue by calling them both brain teasers.
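For what it's worth, the tic-tac-toe question really is a small, well-defined programming exercise: a brute-force backtracking counter fits in a few lines. A minimal Python sketch (my own illustration, not anything from an actual interview rubric; a "game" here is a move sequence that stops at the first completed line or a full board):

```python
# Count every distinct game of tic-tac-toe by brute-force backtracking.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if that player has a completed line, else None."""
    for a, b, c in WIN_LINES:
        if board[a] is not None and board[a] == board[b] == board[c]:
            return board[a]
    return None

def count_games(board=None, player='X'):
    """Count all move sequences from this position to the end of a game."""
    if board is None:
        board = [None] * 9
    if winner(board) is not None or all(cell is not None for cell in board):
        return 1  # game over: a win or a full-board draw
    other = 'O' if player == 'X' else 'X'
    total = 0
    for i in range(9):
        if board[i] is None:
            board[i] = player          # try the move
            total += count_games(board, other)
            board[i] = None            # backtrack
    return total

print(count_games())  # 255168, the well-known count of distinct games
```

Whether a candidate reaches for recursion, memoization, or symmetry arguments tells you something; the bus-stop question tells you nothing of the sort.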

Is the car a two seater?

In the canonical "best" answer, it doesn't matter as long as it's an at-least two seater. If you're on a scooter, you're shafted.

Oh yeah, only two seats.

I got asked that question myself. I wasn't offended (although if I work for them I'll suggest they move away from heteronormativity) even if I think the fact that I got it right doesn't actually improve me. There were plenty of better questions.

I'd have Candidate A create a challenging question for Candidate B and vice versa.

Which is why (at least in the tech industry) everyone with half a brain stopped doing this a long time ago. In fact, Google is famous not for the stupid brainteasers, but for their awful-hard data structure and algorithm questions. Steve Yegge wrote a fairly lengthy article about that a while ago[1]. TLDR: interviews are hard, and you are bound to interview with someone who asks very detailed stuff about a subject you don't know very well (if you fail, reapply in 6-12 months).

Alex Papadimoulis said it best about brainteasers[2]: Would you want to work with the guy who builds a water-displacement scale/barge, taxis a 747 to the docks, and then weighs the jumbo jet using that, instead of simply calling Boeing in the first place?


No need to call Boeing. The 747-400 has weight sensors built into the landing gear. Power up the FMC and you've got your answer.

Restriction of range. Everyone interviewing there is probably already good at problem solving/brain teasers.

A friend just did a distance interview of a sales job (medical devices - though nothing too fancy).

The interview was fully automated, via a laptop. The structure was this: he'd be asked a question and then respond into his laptop's camera while being recorded. He could see an image of himself on the screen, which he found distracting; the instructions had admonished him to look at the camera and not the screen. The multi-part questions would disappear when it came time to answer, so after the first few he began writing them down. He was given one practice question to get used to the system, but felt his first few answers weren't good because of the format. The next day, while going over his performance with someone from the company, he asked whether they liked the results of this kind of interview, and the other person admitted that the system was new to them and they were still trying to figure it out.

It sounded awful to me, and I wonder at its effectiveness at measuring rapport and likability, which will be major factors for the sales job he was interviewing for.

Some people think a good interview system is one that rejects a lot of people. The more people you reject, the more selective you are!

“Give me an example of a time when you solved an analytically difficult problem.”

I googled it.

I work in biotech, and a few years ago we were hiring for two BS/MS lab scientist positions. After the interview process we ended up with two people who looked pretty equivalent to me and the other evaluators. They both had the background, and interviewed equally well. Once hired, it took about a month to realize that they were night and day. The man was unmotivated, sloppy, and pretty much a wasted hire. The woman turned out to be the exact opposite--extremely competent, able to solve extremely challenging technical problems, and pretty much the ideal hire. I thought back to try to see whether there were any hints during the interview process that, in retrospect, might have helped me, but there weren't any. And I don't think that focusing on questions about motivation or problem solving would have helped. The man, for all his faults, was a good communicator, and probably could have come up with some good answers. The woman was a bit more reserved, and may not have been able to self-promote adequately.

My conclusion was that, at least for these types of positions in my field, you just can't tell how good someone is from an interview. The only way to know is to actually work with them. So now, a couple of companies later, my solution is to hire only people I either know from previous jobs or who have worked for someone I know and trust, and who can vouch for them. (For my most recent hire, I--surprise!--poached the female scientist described above).

I fully agree with you. Because no matter how clever you are with an interview, in the end, it's about how well someone interviews. This is one reason I always do a second interview--you start to see more of the person when they come in twice. Since I started doing that, I've never had a bad hire, but that doesn't mean that it's always turned out exactly as I hoped.

From an oldie but goodie summing up 85 years of literature on the predictive validity of general mental ability (GMA) for job performance:

Table 1 shows the added validity gained by supplementing GMA measures with other personnel measures, such as structured or unstructured interviews. (Unstructured interviews, of course, have never been viewed as great tools among those who study job performance.)

The predictive validity of GMA tests was .51. Additional validity gained by adding supplementary measures:

0.14 (Integrity tests)
0.12 (Work sample tests)
0.12 (Employment interviews - structured)
0.09 (Conscientiousness tests)
0.07 (Job knowledge tests)
0.07 (Job tryout procedure)
0.07 (Peer ratings)
0.07 (T & E behavioral consistency method)
0.06 (Reference checks)
0.04 (Employment interviews - unstructured)
0.03 (Job experience - years)
0.02 (Assessment centers)
0.01 (Biographical data measures)
0.01 (T & E point method)
0.01 (Years of education)
0.01 (Interests)
0 (Graphology)
0 (Age)

I wonder what Bryan Caplan would say about the similar scores of "Years of education" and "Graphology". :)

Ungated copy here.

Sorry about the formatting. Not sure why the carriage returns after each personnel measure were removed.

My anecdote: I had to interview new graduates for a while for a major company; typically I got half an hour with each one, about 10 a day for a week or so. The shortlisted 10 or so went to a second interview lasting two days, with about two actual positions on offer. My approach was that everyone shortlisted was probably acceptable (after all, they had achieved a good degree at a prestigious university), so it was mostly about weeding out the obvious excessive introverts and weirdos. Then it was whoever I would like to spend time with. My picks seem to have worked as well as the other interviewers'.

Now when I do mature hires, it's basically on references; if they haven't worked for someone I know, I usually know someone who knows someone who knows them. The interview is more for form's sake, and to clarify what they want from the role and whether it fits.

I have had HR people push behavioral based interviewing at me, but when I look at my employees, some of the best workers I have can't articulate very well. In fact very articulate persons raise my BS detector.

I thought behavioral interviewing was trying to corner someone to see what they would do when backed up against a wall.

No. It's pretty sound if you follow the script. It also helps if you have more than one questioner, preferably three. The questions are standardized and often followed up with "What was the result?" or "Did you achieve your goal?" It's an efficient method for screening a large volume of candidates.

My observation about hiring people in general, and I've done a fair amount, is that you must accept that sometimes you get it wrong.

If he punches you in the throat and jumps out a window, congratulations. You've identified a non-herd animal. Hire him.

The technical interview is dead

He's just preaching the new fad, so the old fad looks especially unfashionable.

"Look at the projects they've put in github at their job." Well, for lots of people, their best code is private and owned by the company that paid them to do it.

You also aren't going to get someone who is currently employed full-time to quit their job for a one-week internship at your company.
