
Ace Non-Technical Data Science Interviews

Season 6, episode 2 of the DataTalks.Club podcast with Nick Singh


Transcript

Alexey: This week, we'll talk about the non-technical parts of a data science interview. We already had an episode where we talked about how to prepare a CV and other aspects of the interview process. You can go and check our website for it. We didn't talk a lot about the behavioral interviews and other less technical parts of a data science interview, which is what we are going to cover today. We have a special guest today, Nick. Nick started his career as a software engineer on Facebook's growth team. Then he worked at a startup called SafeGraph, which is a location analytics company. He graduated from the University of Virginia with a degree in Systems Engineering. You also interned at Microsoft and Google, right? (1:58)

Nick: Google, yes. (2:48)

Alexey: As we already know, Nick is the author of a book about acing data science interviews. I think the name is “Ace the Data Science Interview” – I also have a copy, which just arrived. (2:49)

Nick: Yeah, amazing. Just amazing. (3:06)

Alexey: Just in time. [laughs] Okay. Anyways – welcome, Nick. (3:09)

Nick: Thank you for having me. I'm really excited to be part of your community and give a little talk, as well as do the Book of the Week a little bit later on in the month. (3:14)

Nick’s background

Alexey: Yeah, thanks. Before we go into our main topic of acing interviews (specifically behavioral interviews), let's start with your background. Can you tell us about your career journey so far? (3:21)

Nick: Yeah, absolutely. I've had a pretty interesting background that's all around data, but in indirect ways. In school, I studied systems engineering, which is a sort of mix of operations research, business, and math. I was on the data infrastructure team as an intern at Google. I was also a growth engineer who did data-driven experimentation, and ran and implemented many A/B tests at Facebook. Then I joined a location analytics startup called SafeGraph, which sold alternative data to hedge funds and retail analytics companies. Most recently, I wrote the book “Ace the Data Science Interview”, which is sort of like “Cracking the Coding Interview” or the “LeetCode” for data scientists, machine learning engineers, and data analysts. So that's my background around data. (3:32)

Being a career coach

Alexey: When I read your bio – I think your book has your biography – it says that you're a career coach. So what do you do as a career coach? (4:17)

Nick: Absolutely. These days, I'm super passionate about it. When COVID hit, I left my job and thought, “Hmm. A lot of people got laid off or had their offers rescinded.” I knew that there was a lot of advice out there on how to break into product management and software engineering, but not that much around data science. So these days, what I do is work with people who are trying to improve their interviewing, and coach them on positioning themselves for success in the job market – how to build portfolio projects, improve their resume, and build up the technical skills to land their dream jobs in tech as a data scientist, machine learning engineer, or data analyst. (4:30)

Alexey: So you help people get a job, right? (5:07)

Nick: Exactly. That's my goal nowadays. (5:10)

Overview of the hiring process

Alexey: Okay. Let's say I want to find a job, specifically – a data science job. Usually, I apply, send my CV. Then I talk to the recruiter. It's actually a whole process, where there are many, many, many different steps. So, what are the typical steps in this process? (5:11)

Nick: Totally. Of course, it differs from company to company. But at some of these larger FAANG-type companies, like Facebook, Google, Amazon – what they do is have a phone screening with a recruiter, maybe even the hiring manager if it’s a smaller company. They assess you – they might give you an online assessment, where they often ask either a coding question or a SQL question. After these, they usually bring you on-site (of course, in COVID times, there’s no on-site), where it’s a panel with three or four more interviews that encompass technical aspects, system design, and open-ended case-type problems, as well as, of course, a behavioral interview with your probable soon-to-be hiring manager or director, or someone else like that. So it varies, but usually it sticks to this script. (5:33)

Alexey: So there are like four or five interviews in total, right? (6:20)

Nick: Yeah, these days most companies are running four or five interviews by the end of it. (6:23)

Alexey: Yeah, I remember like six, seven years ago, when I got my first data science job, I actually had just one interview. That interview was a bit long, about an hour and a half. I mean, maybe by today's standards, this is probably considered a short one. But it was just one interview. Maybe it was two, I don't remember exactly. But after that interview, a couple of hours later, I got a call about an offer. Now, even smaller companies don't do that anymore. I wonder – why is that? (6:28)

Nick: Exactly. When I'm talking about this four-or-five-round process, it’s definitely more around these unicorn Silicon Valley startups or FAANG-type companies. At smaller companies, you might have a more expedited process, partly because they realize talent doesn't put up with something like a week's worth of interviewing. That's one thing. Another thing – a very interesting mindset they have in hiring is that they don't want false positives. They'd rather turn down people who are good than hire somebody who turns out to be bad, because it's very cumbersome to fire somebody or put them on a performance improvement plan and things like that. So to weed out these false positives, one easy way is just to make a tougher screening process and screen candidates multiple times. That's kind of where the industry is headed recently. It's a pain for candidates, so I have complete empathy there. It's a pain, but it's just what it is these days. (6:57)

Alexey: But also for companies, it's not very easy to find so many people. You have to find at least four or five people who can interview the candidates. It’s very difficult. Even smaller startups do that these days. (7:54)

Nick: Some days you hear “Oh, all the companies can't find enough talent. People are raising their wages.” On the other hand, you still have a lot of folks who study data science, who are like, “Dude, how do I break in? They're only hiring PhDs. They're only hiring people with years of experience. How do I break in as a fresh, new graduate from my Master’s or Bachelor's program?” So it's a tough matching problem. Companies are complaining about talent and talent is complaining about companies. I hope that with the work I do, and with what the book does, that it at least bridges the gap a little closer so that companies are better able to ask questions and assess talent, and talent – who’s super-skilled technically – is able to better portray their own skills and experience so that they don't get dinged for some arbitrary random reason that could have been prevented. (8:09)

Behavioral interviews for data scientists

Alexey: When you described the interview process, you mentioned this behavioral interview part, which is an interview with a hiring manager or a director who assesses how good you are as a person? Or – I don't know – how well you fit the company or the company culture? I don’t think there is a lot of material out there that can help you prepare for that. The technical part is a bit easier, because if you just go to your favorite search engine and type something in – you'll find dozens, maybe hundreds of different resources. But if you go and type something about a behavioral interview, then maybe it's tougher. Right? (8:58)

Nick: Yeah. I think there is enough material about behavioral interviews, but it's not really geared towards data scientists and machine learning engineers. There are things that list these kinds of arbitrary questions where they ask, “Tell me about your biggest weakness.” This is sort of something that you would ask a salesperson or something like that, but it's a bit random to be asking really smart technical people that kind of question. Often, a lot of your hiring is actually around your strengths, not even your weaknesses. There's definitely a lot of BS out there from people who haven't been in the game. But behavioral interviews are really important. (9:41)

Nick: Definitely, as technical people, we just think, “Oh, I’ll grind LeetCode. Oh, I’ll just practice data structures or SQL or get really good at Kaggle, and then I'll be fine.” But there's this behavioral component that is important. Behavioral interviews look for soft skills. Let's be honest, Alexey, part of being a data scientist isn't just sitting there and building a model – it's working with so many different stakeholders. If you're not able to hold your own, talk confidently, argue for your position, or argue for your business recommendation, you're not going to be an effective data scientist. So much of it is the soft skills, communication, and the ability to present your results.

Nick: Something else these behavioral interviewers look for is position fit and culture fit. Let's be honest – people want to work with people they like, and you want to work with people you like. We're not robots and we're not machines. We're humans at the end of the day, so of course there’s going to be this aspect of “Do they fit in well?” I think it is a real component. People think, “Oh, that's just BS.” But Amazon, for example, has these ‘Amazonian principles’ that they really adhere to – and they’re not for everybody.

Nick: Startups have their own set of principles, and often, someone from a company like Google – I'm sure they're very smart – just might not be a good fit for this type of place. So of course, these behavioral interviews assess not just your soft skills, but how good a fit you are for this specific role and this specific company. There is an art to it, but we can talk about that a bit more later.

Alexey: Basically, it's not enough just to grind LeetCode, learn SQL, and learn all the probability theory stuff or machine learning, and expect that it will be enough to get the job, right? Because even though you pass all the other interviews, there is this 45-minute interview where they don't ask about theory, they don't ask about the technical part. Actually, I remember when I interviewed with a company a couple of years ago, the name of which starts with F and ends with K. They asked me (I have a list here, because after interviews, I usually try to take notes of what they ask, for reference), “Tell me about something you're proud of.” The second one was, “Tell me about a time you disagreed and weren’t right.” Then, “Tell me about a project that got delayed. What was the impact? And what did you learn from that?” Then, “Tell me about a time that something changed your career.” And the last one was, “What are your personal areas for improvement? And why?” This wasn't out of the blue for me, because I interviewed with big companies before, especially Amazon – I think that at Amazon, their interviews are… (11:46)

Nick: They really hone this in, yeah. Leadership principles are really important to them. (13:09)

Preparing for behavioral interviews

Alexey: I was expecting that, but it’s still very hard to prepare for, right? With LeetCode, it’s clear – you just go to LeetCode and you solve some problem. (13:13)

Nick: Yeah. But there is a way to prepare. There’s good news and bad news. The bad news is, it’s work – it's gonna take some time. The good news is – there's a process. It's not some arbitrary thing that you can't ever predict. In the book, I talk about some of these very common questions, and you actually got hit with some of the more common ones like, “Tell me about your proudest career moment.” They’re looking for you to show off. That's kind of a well-known question. Another one is, “Talk about an unpopular or hard decision that you had to make and how you dealt with it.” That's another one where they're trying to see “Can you go against the grain? Can you do it tactfully?” Because, of course, as a data scientist, you're going to be making some recommendations that rub other people the wrong way or where you may get a little bit of pushback. So I think these are very reasonable questions. (13:20)

Nick: My advice for preparing is – first of all, think about your three most significant experiences. Maybe these are two jobs and one portfolio project. Or, if you're still in school, maybe they're all just projects and internships. But think about your three most important experiences that you love to talk about. In my book, we talk about some of the common questions that are asked, but you can find these on the Internet – what you should do is simply make a grid. For each of these questions, you can answer them in the grid, like “What are you most proud of, in this job, this job, or this project?” The same way for, “What was the hardest part? What was the biggest bug? What was something that went haywire and how did you recover? What was the setback?” Because really, they fall into these kinds of patterns: “What went well?”, “What went badly?”, and “How did you deal with it either way?” That's the first thing – just systemize it with a grid. Put it in an Excel sheet, and suddenly that will help you.

Nick: Now, I know some people listening might be thinking, “Oh, why do we have to prepare so much? This is my own experience. I already know it well enough.” But let's be honest, some of these things you talk about are from a job you did a year or two ago. You forgot about it – it's been a year or two, right? Let's be honest, we don't talk about ourselves all the time, like, “Oh, the best thing I ever did was this, this, this.” We're used to being a little shyer, a little bit more humble. But this is not the time for that. So there is some element of practicing to be a little bit boastful – to get those points across of why you're a kick-ass data scientist. It does require practice – it's not gonna come easy. It requires you to talk to the mirror, or talk to a friend, or talk to yourself, and verbalize these things and get those talking points down. But you can prepare. So that's my first thing, make a grid.

Nick: Now, what do we put in the grid for these questions? Alexey, are you familiar with the STAR format by any chance? Yeah, that's a common one. For people who might not have heard of it – and there are really great resources out there about it on the internet and also in my book (yes, out there on the internet, I don't want to just keep plugging my book) – the STAR format is basically Situation, Task, Action, Result. It's a way to structure your story. I'll give you an example. At Facebook, they asked me, “What's a time you used data to make a decision?” This is a common question for a data-oriented person: “Tell me about a time you used data to make a decision.” Because they want to hear that you do data-informed decision making. That's a big part of our jobs. My answer in the STAR format would be – the Situation: I was at Facebook and I was on the growth team. I was Tasked with figuring out “Is Facebook Stories good for new-user retention?” Meaning, “Is this product helping people who are new to the app stay on Facebook longer (retention rate)?” So that's the Situation (I was at Facebook) and the Task (I needed to figure this out). The Actions I took: I analyzed a whole bunch of data and wrote lots of SQL queries to understand “How do new users interact with this feature set?” and looked for gaps that I thought could be good opportunities. The Result: I was able to find real bugs that affected new users. Since I was a software engineer, I fixed these bugs. I also made recommendations to the Facebook Stories team – because they didn't really think about new users, they just thought about all users in general and weren't worried about the people who were new to the platform – that actually impacted their next quarter’s roadmap.

Nick: Do you see how I kind of really forced myself to say Situation, Task, Action, Result? Maybe you don't have to be as hard-hitting. But did you see how I narrowed down this success story? Like, “Wow, this guy actually did something with data and impacted Facebook's roadmap. Great!” And I said that in probably 8-10 sentences. Of course, I probably said a little bit more, but I think you can get these stories down to a minute and a half. Those punchy, exact stories that say, “Hey, I improved the user retention rate and I changed the whole roadmap.” Those are the things that you want to get across and they work really well when you narrow down your story to the STAR format.
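
To make the grid-plus-STAR idea concrete, here is a minimal sketch in Python (a spreadsheet works just as well – the point is the structure, not the code). Every experience, question, and STAR entry below is a hypothetical placeholder, loosely echoing Nick's Facebook example:

```python
# A minimal sketch of the prep grid: rows are your key experiences,
# columns are common behavioral questions, and each cell holds a
# STAR-structured answer. All entries here are hypothetical.
from dataclasses import dataclass

@dataclass
class Star:
    situation: str
    task: str
    action: str
    result: str

experiences = ["Job A", "Job B", "Portfolio project"]
questions = ["Proudest moment?", "Biggest setback?", "Used data to decide?"]

grid = {
    ("Job A", "Used data to decide?"): Star(
        situation="Growth team at a large social app",
        task="Assess whether a new feature helped new-user retention",
        action="Wrote SQL queries to see how new users interacted with it",
        result="Found and fixed real bugs; influenced next quarter's roadmap",
    ),
    # ...fill in the remaining (experience, question) cells before interviewing
}

# Rehearse by replaying each cell out loud, result first.
for (experience, question), star in grid.items():
    print(f"{experience} | {question} -> Result: {star.result}")
```

Filling in every cell before the interview is the actual exercise; the loop at the end is just a reminder to rehearse each answer out loud.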

Alexey: Yeah, I guess even if you don't follow this exact format during the interview, if you stick to this structure when you prepare, the structure will just pop up when you give examples. Maybe you will not follow it exactly (Situation, Task, etc.) – maybe it will be different. But still, if you were using this format when preparing, then it’ll help during the interview. (18:22)

Nick: 100%. Actually, you touched on a good point. I know some people in the audience are thinking, “Oh, I don't want to sound like a robot. I don't want to sound so scripted. I don't want to sound rehearsed.” You wish you were that good. You wish you were an actor who could just do two minutes of prep in the STAR format and then hit it verbatim. Right? You're not that good. So, practicing beforehand won't make you sound like a robot. It will only get your point across better. You're not going to sound robotic. People have this mindset, like, “Oh, if I prepare, it will hurt me.” Actually, probably not. Now, if you're the world's best salesman, maybe you might sound too salesy. But let's be honest, most of the data scientists and machine learning engineers who are reading this stuff and listening to our podcast right now – you're not in a position where you can just [snaps fingers] and spit things out. So, preparation is possible and it takes some work. But if you're systematic about it, it'll work out and you'll present yourself much, much better. (18:47)

Handling "tricky" questions

Alexey: What about tricky questions? For me, a tricky one that I wasn't able to answer during the interview I mentioned was “Tell me about a time you disagreed with someone and weren't right.” For me, I was like, “Um… I don't remember.” So how do you go about these tricky ones? The other one was about “a project that got delayed.” It's a very specific question, so in my head I was like “Okay, which project did I have in my career that actually got delayed? Well… all of them?” Right? [laughs] (19:45)

Nick: [laughs] All of them, right? Actually, I wish I had a better answer, but – just don't get tricked. You're calling these tricky, but I have some of these in my book. They're not that tricky – a delay is a very common thing, and so is a disagreement with somebody else. A common question that I wrote in the book is, “Tell me about a time you disagreed with a product manager or business stakeholder.” That's what your job is. You're going to be working with these people. So, they're actually not that tricky. (20:15)

Nick: I think tricky, in this sense, would be something like, “Tell me about your three biggest weaknesses.” But these days, honestly, that kind of question is a little taboo. People think it's dumb, because the typical answers are like, “Oh, I work too hard. That's my biggest weakness. I care too much.” So people have shied away from that question. Of course, at some level, they're looking for you to be you. It's okay if you get stumped. But preparation definitely makes perfect, and then these questions are not actually that tricky. They're tricky in the moment, but stepping back – every project gets delayed and every project has its disagreements. It's up to you to think about these things before you even step into the interview room.

Alexey: Yeah. I think you mentioned Amazon leadership principles. Amazon uses interviews to actually find out if people fit these principles. I think these principles kinda make sense. I mean, they make sense for many companies. Maybe not something like frugality – I don't know about every company. But in general, they make sense. What I personally found useful when preparing was taking these leadership principles into account. I think I selected two or three projects and put them in a grid. Then I took the leadership principles from Amazon and put them in this grid – how I demonstrated these leadership principles in a particular project. That was very helpful, not just for Amazon interviews, but for a bunch of other companies as well. (21:23)

Nick: Absolutely. Make a grid. As you hinted at, every company thinks that they're really unique, but a lot of the questions that are asked are repetitive. A lot of the principles too, like “customer first” – you know, half the companies in Silicon Valley say, “Oh, we're customer first, profit second.” I think Amazon does live by that a little bit better. But I'm just trying to say, you're right: putting your work into the grid and really spelling it out in the STAR format beforehand won't just help you at Amazon, it will help you with any company. This is the type of work you should be doing before you interview. This is why I always say that this whole interview thing is an art. Some people just want to show up and think they'll ace the interview. But there's an art to preparing, and it's not rocket science. It is work, though, and it is sometimes unintuitive. (22:17)

Alexey: Another good thing about these Amazon leadership principles is that they are available publicly. The similar values of many other companies aren't publicly available. (23:09)

Nick: Yeah. I'd say that's because a lot of companies might not even have those values or be as value-driven. But I would say, if a company is value-driven, they will publicize it, usually on their ‘careers’ page or their ‘about’ page. For example, the company I worked at last, SafeGraph, was super values-driven, to the point that we had our mission and values put on posters in every single room – every single conference room and in the bathroom. So while you're peeing at the urinal, it's right in front of you – one foot in front. You can't help but look at it every day and read it and kind of live by it. I mean, of course, it goes company to company – but at my company, if you were pooping, you'd be reading the principles. I think it'll be easy to find. Most companies try their best to publicize it. For the ones that do, you'd better expect that they're going to ask you those same kinds of questions. This is the kind of prep work you can do beforehand so you don't get stumped. Because you know that if they publicize it that much, they're going to be asking you about it in pretty much every interview. That's another thing I wanted to bring up, Alexey – often at an on-site, for 45 minutes, you might have a strictly behavioral interview with a hiring manager. But let's be honest, most interviews will start with a kind of, “Hey, tell me about yourself. Tell me about your project. Tell me about a failure.” These kinds of things seep in not just in your last round with the hiring manager – they seep into almost every interview. Even at the beginning, a recruiter might say, “Oh, we're data driven. Tell me about how you used data to make decisions.” You might be hit with that right from the beginning. (23:25)

Project deep dive

Alexey: Right. But that's not the only non-technical part of the interview, right? There are other parts. So what are they? (25:04)

Nick: Yeah. Another really popular one, especially in data science, is the project walkthrough question, where they ask you lots of details about work you've done in the past. You might be thinking, “But doesn't every industry have that kind of question?” I think in coding, you'll definitely see more of an emphasis on things like, “Hey, listen. It’s nice that you did this project, but… reverse this linked list.” Or like, “Hey, that's cool. But… do this graph theory problem. DFS.” Data science is such a broad field – especially in things like machine learning – that they can't really just hit you with, “Okay, let's do linear regression now.” So what they often do is ask you about your project, and then if you happened to use, let's say, logistic regression, they'll ask lots more follow-on questions like, “Oh, you used logistic regression? Why? Why not other techniques? How did you validate assumptions? What performance metrics did you use? Did your model overfit? Why or why not? How do you deal with overfitting?” (25:13)

Nick: This starts getting into the technical realm. But again, it usually starts out as more like, “Tell me about your project and what actually happened.” And Alexey, the biggest thing I've noticed among data scientists and data-driven people is that they don't talk about the results nearly enough. They talk too much about the technical details of a project and not enough about “I did this work and it saved the company a million dollars,” or, “Oh, I helped do this and it became the world's best-selling whatever.” You know?

Nick: Usually, you see that even in the STAR format, the Result is at the end. Often, though, when you're talking about a project, that's called ‘burying the lead’. If you talk about the impact of it at the end, that's not interesting. I'd rather that you first tell me “Yeah, so this project made a million dollars, let me tell you about what I actually did.” Then it’s like, “Oh, wow! That's super cool.” Because at the end of the day, Alexey, you're being hired to a business and they just want to see business impact.

Nick: Sometimes, I hate to tell data scientists this, but how you did it might not even matter. It's more about “What actually happened?” So, preparing your answers for these project walkthrough questions, really understanding “What is the quantifiable business impact? What actually happened?”, and putting that at the start of your story is a real skill and a real art. This has nothing to do with your technical abilities. It has to do with your preparation and your ability to convey yourself. It's a behavioral thing.
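
For the kinds of follow-up questions Nick lists above – metric choice, validation, overfitting – it helps to have actually run the checks on your own project before the interview. A minimal scikit-learn sketch, with synthetic data standing in for a real project's features and labels, might look like this:

```python
# Minimal checks behind common project follow-ups: a held-out split,
# an explicit performance metric, and a quick overfitting comparison.
# Synthetic data stands in for a real project's features and labels.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# "What performance metrics did you use?" -- AUC is one common choice.
train_auc = roc_auc_score(y_train, model.predict_proba(X_train)[:, 1])
test_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# "Did your model overfit?" -- a big train/test gap is one warning sign.
print(f"train AUC = {train_auc:.3f}, test AUC = {test_auc:.3f}")
```

Being able to walk through a few lines like these for your own project makes the “why this model, why this metric” follow-ups much easier to field.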

Alexey: There’s this pyramid principle from the folks at McKinsey. Basically, when they do a presentation to executives – executives don't have time for a lot of stuff. So they just start with the end, like, “Okay, this is what we want to do at the end.” Or “This is what the project will achieve at the end.” They start with that and then build a pyramid from that and then go into details. So you start with the most important thing and then build on it. “This is how we did it. These are the three things that actually helped. And for these three things, this is what we did.” So you start with the most important thing. (27:50)

Nick: Yeah. Exactly. These project walkthrough questions are oftentimes just “Let's hear about your project.” But they can also be good jumping-off points into technical things. It's something you'll see at most companies – a sort of blend. For example, they'll ask you one time about logistic regression in your project, but then they'll also ask about, “How did you make sure the model got productionized?” Or, “How did you prevent disagreements?” So they’ll mix it in. But it's another one where, if you put the grid out and really anticipate some of these questions – like, “Hey, how did you know it was successful? What was the hardest part of the project?” – you'll be ready. There are some very standard questions that I talk about in the book that people ask about your projects. As long as you make a grid, STAR it out, and anticipate these things, you'll not only do well on behavioral, you'll also do well on these sorts of semi-technical questions. You can anticipate that they're going to ask you something like, “If you used this type of model, why not that one?” (28:28)

Alexey: Yeah. I also take part in interviews quite often – I interview people. Usually these are the kinds of questions that I ask in the screening interview. In the very first interview, I ask, “Hey, let's talk about your background. What kind of things have you worked on?” and then, “Okay, now let's select one of the projects and do a deep dive into that project.” Then we do what you call the project walkthrough: “This is the project you took part in. Let's talk about that.” Actually, I usually ask, “What kind of project do you want to talk about? Pick one and let's talk about that.” Is this how it usually happens at other interviews? (29:25)

Nick: Yep. That's exactly it. You're falling into what most companies do. At the end of the day, they're not really trying to do a “gotcha”. They'll let you talk about what you want to talk about. But then, if that's the trade-off – if they're gonna let you pick – then you'd better come damn prepared to talk about your favorite project. Usually, pick two things, like your most important work experience and, if you’re junior in your career, maybe also a portfolio project or side project that's really interesting. So come damn prepared, because if they're gonna let you pick your project, you'd better know every detail. Often they are just trying to see if you are BSing about your project. It's so easy to be like, “Oh, yeah, I did all this.” And then it turns out, “No, you and a team of seven people did this.” Back in the day, in school projects, you'd be in a group of four and usually only two of the four people did the work, right? That's why they do these project walkthrough questions – just to see, “Did you actually do what you said you did? Do you actually understand things?” (30:04)

Business context

Alexey: When we talk about projects, what I often notice is that candidates don't give enough business context. They immediately jump into technical things like, “I used logistic regression and it had 70% accuracy. It wasn't good enough, so I used XGBoost and it hit 80% accuracy.” “Oh, cool. But tell me why. Why did you do this? What was the business problem you were solving? What were you trying to achieve with this? In the end, what did you actually achieve? Did it help solve the problem?” (31:06)

Nick: See, if they started with the result, they wouldn't even just say, “I achieved this percent accuracy.” Right? Because then I would have pushed and said – the business doesn't get paid in accuracy. They get paid in, “It drove this many dollars,” or, “It reduced false negatives here by X percent, which reduced the costs of this manual review process.” Once you bring in, as you said, the pyramid principle – bring that result up higher and make it a business result or a product result, not a technical result. You'll find yourself coming across as a much more mature data person. At the end of the day, you're there to drive business results, not to improve the accuracy of some small thing by X percent without the product or business context. (31:40)

Alexey: How much time do you think people should spend on giving the business context when they answer these kinds of questions? (32:29)

Nick: Nobody loves people who ramble. So just try to keep your whole answer to like two minutes. One interesting thing I've noticed, Alexey, when coaching different people is – in some cultures, people try to be exhaustive and give every single detail, because they think that if they don't hit every point, they're going to get dinged. But I think what you’ve got to remember is that this is a conversation. You're having a conversation with your future boss or future coworker. So don't just shoot off rapid-fire, “Oh, here's everything I did here, all the technical details, blah, blah, blah.” If you're talking to a normal person, it's just like, “Dude, I don't even need to hit all these technical details. Let's first talk about what I did and why it was really cool.” I think if people frame it as a conversation with their future coworker and less like, “Oh, they're trying to quiz me and get me,” that mindset change will itself lead you to better answers. Then your specific question – sorry, what was your specific question again? (32:36)

Alexey: How much time we should spend on business context. (33:36)

Nick: So yeah – don't ramble. Keep it to two minutes. First of all, keep the whole answer to two minutes – for the business context, I don't know, it could be a good 45 seconds. Of course, if they're curious about one thing or another, they're gonna ask questions, because it’s a conversation. So keep things on the shorter end. Again, preparing beforehand will stop you from rambling. (33:38)

Pacing, rambling, and honesty

Alexey: You actually brought up a good point – when somebody is trying to cover every single detail. When I interview, I don't feel very good saying, “Hey, stop. Let's not go into details,” because then I would have to interrupt. But I often have to do this because, you know, we have a limited amount of time and we want to cover many things. But I definitely don't feel good about, you know… shutting a person up by asking them to stop. (33:59)

Nick: Alexey, you're not unique in feeling bad. So you know what happens? You just let them ramble. And then you walk away and you say, “Oh, I didn't really like this person,” and they get rejected. Then they're like, “Oh, well… I hit every point and said everything amazingly. F that company… I did amazing.” People feel awkward shutting somebody down, but candidates are also not self-aware of the fact that they've been talking for seven or eight minutes and didn’t even make sense. People use so much jargon and this and that. I have some crazy stories of that. I'll give you an example. Hopefully she's not listening. (34:30)

Nick: This person worked on a male contraceptive device, so that's already an interesting topic. But they didn't even talk about that. They said, “Oh, I'm doing some testing around a medical device, blah, blah, blah, survival analysis.” I couldn't keep track of what they were saying. I'm like, “Yo, I'm in data. I have no idea what you're saying.” I happen to know this person personally. I'm like, “Dude, why don't you just start from the top?” Like “Oh, I work at a company that's making a male contraceptive. And one of the biggest things is – how long will it work? Will it work for a year or two? So I did survival analysis.” If they just said that, then I'd be like, “Oh yeah, survival analysis. Great. Let's talk about the results.”

Nick: But they buried the lead and just said, “Oh, obscure medical device. This, this, this…” and ranted on and I could not understand it. It is so easy to fall into that trap. Everyone thinks, “Oh, yeah. That's other people making that mistake. I will never make that mistake.” But my goodness, it is so easy – especially for people who don't talk for a living. If you're not a salesperson or marketing expert, you're gonna fall into these traps. So all you can do is prepare.

Alexey: Yeah, exactly. Because if I don't prepare, behavioral questions like the ones I mentioned will have me puzzled, like, “Okay, what do I answer? I don't know… Yeah, sorry.” Even worse, I can start rambling and lose 10 minutes of time without really answering the question. You mentioned that people can sometimes start sprinkling buzzwords into their rant. There is a comment that says, “I once interviewed somebody who said they used SVM in a project, but they couldn't really walk through what the model is or what the reason for choosing this model was.” I think this is the reason we have these kinds of interview questions – to understand how much the person actually knows. I guess this is getting into the technical side, but the non-technical takeaway would be – if you don't know much about these things, maybe don't talk about them, right? (36:14)

Nick: Right! Put real bullets on your resume, because you're going to be asked about them. Let's be honest – let's say you did do SVM and you just don't know everything about it. Well, go prepare that before your interview, because you'd better believe they're going to ask about it. That's why somebody who did a linear regression project but really understands it might do better than someone who made some crazy RNN or CNN using TensorFlow and Keras but literally doesn't know the foundations of this field. That's what they get assessed on. So, it goes both ways – technical and behavioral. In our field, we just can't ask a LeetCode-type question like, “Tell me everything you know about SVM. Let's test you.” That's why they come through with these conversations. If you're having a conversation, there's no right or wrong answer. Suddenly, it’s just the behavioral stuff – pacing, not rambling, preparing the STAR format – that starts to matter. (37:18)

“What’s your favorite model?”

Alexey: I remember in one interview, the interviewer asked me, “Hey, what's your favorite model?” And then I thought, “Okay, the next question will be ‘tell me how this model actually works.’” So I answered “linear regression.” And then he was like “Oh, what's your second favorite one?” He wanted me to pick something more complex. [laughs] (38:17)

Nick: Ah, okay. Usually what I've seen – and I talk about it in the book, “Ace the Data Science Interview” – is that with “What's your favorite model?”, people say something really crazy and then the follow-up question is like, “Okay, so what do you know about it?” And they're like, “Oh… I read a research paper about it seven months ago – I never used it.” You're better off answering with something that you've actually used in a project. Because think about this: “Oh, yeah. I love linear regression. I made a million dollars with it in the stock market.” Suddenly, even if they wanted something more complex – do you see how I just twisted it so that they're like, “Oh, wait… what? Tell me more.” “Oh, okay. Yeah. It turns out that predicting oil prices is just a linear regression based on these four variables. Then I did PCA to find that these four variables are the most predictive. And I made a million dollars.” Wow. Suddenly, your project is amazing, even though it's linear regression. Maybe sometimes people will push you to talk about something more complex, but I think linear regression, honestly, is a great answer for “What's your favorite model?” Or decision trees, random forest – great. (38:36)
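
As a rough illustration of the pipeline in that hypothetical stock-market story – compress many candidate predictors with PCA, then fit a plain linear regression on top – here is a minimal sketch on synthetic data (strictly speaking, PCA yields components rather than selecting original variables):

```python
# A rough sketch: compress 20 candidate predictors into 4 principal
# components, then fit a plain linear regression on top of them.
# Synthetic data stands in for real market data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 20))    # 20 candidate predictors
y = X[:, :4] @ np.array([2.0, -1.0, 0.5, 3.0]) + rng.normal(size=500)

model = make_pipeline(PCA(n_components=4), LinearRegression())
model.fit(X, y)
print("R^2:", round(model.score(X, y), 3))
```

The point of the story stands either way: a simple, well-understood model you can explain end to end beats an exotic one you can't.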

What if I haven’t worked on a project that brought $1 million?

Alexey: What if I haven't worked on a project that brought $1 million? What do I pick? (39:35)

Nick: Perfect. Yeah, that's a real question. Because let's say you're a student and all you've done is internships – or not even any internships. I feel like there's an art to being able to quantify your impact. In my book, I talk a ton about the value of making kick-ass portfolio projects. You can always quantify your impact, like, “7,000 people use my little web app” or “…downloaded my Chrome extension,” or something like that. You can always find some clever way to quantify it. It doesn't even have to be a million dollars. Maybe it's something like, “I reduced the latency by 5%.” Or, “I increased AUC by 7%.” (39:42)

Nick: I mean, those are a little bit more technical and, yes, you may not have a real business impact because it's a pet project. But go with the technical impact, because at least some impact is better than “I just did something” with no impact at all. But you're right, there's a benefit to doing real business things. Wherever you can, try to work on more real things – scrape your favorite websites, look at their data – so that it's at least somewhat more realistic than doing a purely toy project and saying there was no business impact.

Different questions for different levels

Alexey: How do you think these questions vary? How are they different for different levels? I imagine that if somebody is interviewing for a junior position, the questions they get are a bit different from somebody who's interviewing for a senior position, or even above that. (40:55)

Nick: Like staff or principal – yeah. I think behavioral questions about ‘your biggest failure in this and that’ are going to be asked at every level, because everyone should have some kind of failure. I think at the higher levels, you just have to give more mature answers. Often, there, you might be asked more people-type questions, because a lot of being a principal or director of data science is maybe less about being the world's best expert at one specific technique and more about, “Hey, how do you work with others to drive an initiative or business impact?” This is exactly why questions about a project delay or dealing with tough personalities are so popular. There might be more of an emphasis on the human aspect for senior roles compared to junior roles. For junior roles, you just want to know more about the technical details of a project or how they dealt with failure, but not necessarily, “Oh, tell me about the 11 personalities involved in that seven-year project you did.” So I think that's one thing. (41:11)

Nick: Then, going technical for a second – another thing you might be asked is simple SQL questions, at all kinds of levels. But I think more open-ended case studies are expected for senior talent. If you want, we can talk about that. These open-ended case studies often incorporate business or product sense. I'll give you an example. One question might be, “What metrics would you use to quantify the success of Facebook Dating?” That one is more of a product-sense question around product metrics. Here's another open-ended one that's more like system design: “How would you go about building Uber's surge-pricing algorithm?” For both of these kinds of open-ended questions, for more senior roles, you'll be asked to give more mature answers and have more interesting thoughts and perspectives.

Nick: Whereas a junior person might just jump into the answer and be like, “Oh, Uber surge pricing – I'd use this algorithm.” A more senior person would be like, “Why are we even building the surge-pricing algorithm? Is it to balance supply and demand? Is it that drivers are complaining that they're leaving money on the table? Or is it that riders are complaining that they can't get a cab because it's New Year's Eve, there just aren't enough drivers, and they wish there were more cabs on the market? Or is it Uber who's like, ‘Dude, I think I could make more money. Things are working right now, but if I just doubled my pricing, I could probably make 50% more money without much drop-off.’ Who is motivating this problem?” I mean, all parts of it are probably real for surge pricing. But this is the kind of maturity you expect in questions aimed at more senior people and in the kind of answers they give.

Alexey: Okay, so for case studies, basically – if you're interviewing for a senior position, when they ask questions, you do not jump into a solution immediately. You first ask “Why?” – “Why are we even talking about this problem?” As well as “Who is motivating this problem? Where is it coming from?” Right? (44:08)

Nick: Yeah, exactly. I want to caveat one thing, though. This kind of clarifying framework is true for all levels. I'm just saying the way a senior person might clarify would be more interesting than the way a junior person would. A junior person would be like, “What are the technical requirements?” And a senior person might be like, “Is this a one-year initiative or are we just trying to MVP it?” Because they’re used to some projects being a two-month MVP while some might be a year long. But a junior person is not even thinking, “Oh, I didn't know there's a multi-year Uber surge-pricing thing.” I mean, at Uber, there's going to be a whole team around it. It's probably a multi-year project where they’re going to keep improving it. So I don't want to make it seem like, “Oh, if you're junior, just jump in.” I think at all levels, we should clarify. It's one of the biggest tips I give in chapters 10 and 11, which are about product sense and case interviews – the framework for how to approach these open-ended questions. The first step, as you said – clarify, clarify, clarify. That's exactly the first step. (44:27)

Product-sense interviews

Alexey: Maybe you can tell us a bit more about what these product-sense interviews are? What are they about? (45:25)

Nick: Sure. So product-sense interviews are not asked by every company, but they are asked by more product-driven companies like Facebook, and for roles like product data science, business analytics, and marketing analytics. Most analytics shops will have this, as will more product-oriented roles where you're actually supporting a PM and engineers. So companies like Facebook, Uber, Snapchat, and Airbnb will ask these more product-oriented questions. They're kind of similar to what PMs (product managers) get asked in interviews, but often with more of a data spin to them. The reason these questions are asked is that at these kinds of companies, product data scientists work hand-in-hand with PMs to develop the roadmap. So if you don't have good product sense, how are you going to help develop the product roadmap with the PM? That's why these product-sense questions – like, “What success metrics would you use for Facebook Dating?” or, “Hey, do you have any product ideas for how to improve retention or engagement of this feature?” – are asked even of data scientists, not just product managers, at these types of companies. (45:30)

Alexey: If I get a question, like, “We have this feature – how do you think we should go about improving it? What should we do?” What kind of answer is expected? Should I go “Oh, let's change this button from red to blue, maybe more people will like it.” What kind of answer is expected there? (46:53)

Nick: So let's say the question was, “How do we improve engagement for Facebook Live?” The first thing is actually – again, you're going to notice a theme here – you have to do your homework, you have to do your prep work, get to know the company and know their products. Because you're allowed to clarify, like, “Oh, can you tell me more about Facebook Live?” Or, “Hey, I don't really use Facebook much, but I'm guessing it's a lot like YouTube Live.” It's a conversation, so the interviewer will be like, “Yeah, it is. Think of it as YouTube Live, honestly.” That's the first thing. You have to prepare and know their products, because otherwise, how the hell are you gonna answer their question? Let's be honest, these companies are not going to ask – Uber is not going to ask about Facebook Live. Facebook's gonna ask about Facebook Live. Uber is going to ask about surge pricing, or something to do with “How would you improve the rider/driver app?” Right? (47:13)

Nick: Usually, the first thing you try to do is, again, clarify. I talk about that framework in our book. You first have to say, “Hey! When you say ‘improve,’ what are we trying to improve? What's the key metric we're trying to improve?” Because ‘improve’ is a very nebulous term, and at these large product-driven companies, there are metrics like ‘engagement’ or ‘revenue’ or ‘time spent’ that they want to improve. Improvement questions are asked a little less often, I think. But there, you just start brainstorming, like, “I know this has worked well for engagement at my past company, so I could try it.” Or, “I know that pop-ups or notifications are always great for engagement. So I'd look at Facebook Live and its notifications and see if we can do more real-time notifications on the product.”

Nick: There the interviewer will usually help navigate you to something that they want to talk about. I know that that’s such an open-ended question, “How do you improve Facebook Live engagement?” That's too hard of a question. Usually, it's a conversation – they'll narrow it down for you. It's not as daunting as it seems. But definitely a mistake would be to just jump in and say something like, “Oh, I would add this feature.” You know – without even clarifying “What are we trying to do?” and “What's the product and business goal that we need to align our answer with?”

Identifying key metrics in unfamiliar domains

Alexey: One of the things you mentioned is that we need to ask, “What is the key metric?” I think this is a tricky one, especially if you don't have a lot of experience in that particular domain. Maybe you're even a senior already. I work in online classifieds, so I can understand what kinds of things are important in this domain. But if I go to a different website, outside of my typical expertise – let's say Uber – I don't know much about Uber. I don't know much about that domain. How do I come up with good metrics? Not just a set of metrics, but the key metrics? (49:34)

Nick: I'll tell you the politically correct answer and then I'll tell you the real answer. I talk about this in my book as well. The politically correct answer is “Alexey, you don't need to know about Uber’s business. We're judging all candidates fairly, and it's okay that you have a different background. We'll give you the context needed to answer the question.” That's the politically correct answer. Now here's the real answer. I'm an interviewer, I work at Uber. All day, I'm thinking about this damn problem around surge pricing. All day, the management's beating my ass saying, “Hey, we're dropping. Our driver numbers are not good. Riders are happy to pay money, but we can't recruit drivers fast enough in peak times. We can't bring them online.” So they're worried about that all the time. (50:11)

Nick: Then, guess who they're interviewing you against, Alexey? An internal candidate, an intern who's now looking for a return offer, two people who worked at Lyft, someone who worked at Facebook and had a similar problem – and then you, who might not have that context. So they tell you, “Oh, we're looking at everyone equally.” But let's be honest, the internal Uber candidate has a leg up, right? So how do you fix that? Again, for people watching at home, this is not to discourage you so that you’re like, “Oh, crap, I'm never gonna get a job at Uber, because there are Uber internal candidates.” It's not like that. But there is an art to preparing. Again, I talk about it in the book – there are some tactical ways to prepare for these product interviews.

Nick: One way to prepare is – Uber’s a public company. You can read their reports. You can see what financial metrics they're reporting against. You can see their CEO, Dara, saying, “Hey, listen. We're trying to focus on profit because we have great revenues, but not good profits. So we're working on that.” That's one way that you can actually see these things broken down. Secondly, go look on Reddit. Go look on Quora. Go read news stories around Uber. You'll start seeing actual business analysts and stock reporters and things like that, talk about the metrics you need. Even if you don't have perfect context, if you're a reasonably smart data scientist, and you do your homework – you look at their public reports, you look at the news, you look at what people are saying – you can get yourself a leg up.

Nick: Or let's pretend that we’re talking about a private company. There's usually a public comparable that you can look at. You know what I mean? For a while, DoorDash was private, but you could have always looked at UberEats data or GrubHub data and found these answers. So that's probably my first answer, which is “Do your damn homework.” Because it'll give you an unfair leg up. Someone might be thinking, “Hey, that's not fair. That doesn't mean we're smart. That just means you did your homework.” But let's face it, doing your homework is going to give you a leg up. If you're trying to get a job at Uber, knowing the ins and outs of the Uber product is gonna help you for sure.

Alexey: What if you're interviewed for Google or Microsoft – or some giant tech company that is doing everything? (53:03)

Nick: Yeah. Then I think you'd still want to know what their product set is. Oftentimes, you'll know what team you're interviewing with. Honestly, these days, if it's data science at GCP (Google Cloud Platform), there might be 100 teams there, but it's still GCP. It's not something else. Go read the earnings reports or go see what Azure and AWS are doing. That will give you a leg up. Of course, they can still hit you with something that surprises you, or they might ask about something really weird. But then, they will usually be smart enough to give you a little bit more context. It's not like you have to know every product and every product line, but I'm just trying to tell you, “Hey, if you know, overall, what GCP is struggling with, you will have a better sense of how to answer them.” Or, “If you know something more about cloud computing, or how enterprises use the cloud and what metrics they look at, like ‘total cost of ownership’, you will have a leg up.” (53:11)

Nick: Even the term ‘Total Cost of Ownership’ is something that a junior data scientist might not know, but a senior business leader thinks about all the time. I think there's always a way. I also think that there's no foolproof method for any of this. You can always get stumped. You can always be asked about a product in a weird way, or an interviewer might not like your answer – but all we can do is improve your odds. It's a process, but if you improve your odds significantly and you play the numbers game that is hiring – you're never interviewing with just one company, and maybe not even with just ten. If you can go from getting one offer out of ten to two out of ten, we're in business. You're making money. This is great. So there's no foolproof plan. All we can do is have better chances.

Tech blogs

Alexey: Each company usually has a tech blog. In the blog, they talk about use cases. Not always, but sometimes, in our tech blog at OLX, we say “This is the metric we care about and this is what we want to optimize.” So the end goal for this project was to impact this metric. (54:55)

Nick: 100%. That's one of the best ways to prepare. I can't believe I forgot to mention this. But I talk about it in the book and on LinkedIn all the time. Case studies are one of the best ways to level up, because they're more than just, “Oh, we used linear regression.” They explain: “What was the business problem? What was the product problem? What technical techniques were used? How did we productionize our system?” And in this era, where MLOps is important and people are talking about productionization and deployment, it's not enough to just build the models. Reading these technical blogs gives you an in-depth understanding of a business and of the problem that simply reading “Oh, here's how you do linear regression” in a textbook just doesn't give you. So, you're 100% right. Go read their tech blog before you interview with a company. (55:22)

Alexey: I have a funny story about that. A friend of mine was interviewing with SoundCloud. Do you know SoundCloud? It’s like Spotify. So, he was being interviewed there and he read their blog. They were doing the system design interview and then they asked, “Hey, let's design the search for SoundCloud.” Then he did the design and then they said, “Oh, really cool. It's almost like the search system that we already have.” And he said, “Yes, I read your blog and basically just followed whatever you wrote and designed the system like you did in your blog post.” OOPS! [laughs] It was funny. (56:12)

Nick: Yeah. It can happen. You might be thinking, “Oh, did that guy fail?” But actually no – the guy 1) showed that he knew their architecture and cared enough to read their tech blog. Remember, right at the beginning, we talked about position and culture fit? Alexey, let's be honest, some of being a good employee is just giving a damn. So if you read their technical blog before your interview, they realize, “Oh, this guy actually cared enough to read the blog.” You're already scoring high marks. They're like, “Oh, this guy didn't just shotgun applications or spray and pray. He actually cares about SoundCloud and cares about engineering enough to read our tech blog.” (56:50)

Nick: That's the first thing and the second thing is – dude, he gave a correct answer because that's all these engineering guys know – their own architecture. Someone who just regurgitates that? It's just like, “Oh, wow! That's pretty good.” Now, of course, they should have asked a better question, not something from their own blog. But you'd be surprised how often this happens. Listen, I'm going to look for any advantage. The politically correct answer would be like, “Oh, they should have nullified that and said he read the blog. So they should have not given him any marks.” But let's be honest, he crushed it. I hate to say that, but yeah. It's on SoundCloud to ask better questions – but as a candidate, go do that and you will crush it. But just don't lie about it. Don't be like, “Oh, I had no idea you had a blog about this.” As long as you're upfront, they're gonna be like, “Oh, dope. That's pretty cool.”

Cold emailing

Alexey: Yeah, thanks. There was something I wanted to ask you about the book. When I was ordering this on Amazon, I looked at reviews just because I was curious. In the reviews, many people actually mentioned the ‘cold emailing’ tip. They said that this was the most useful thing from the book. I wanted to ask you – what is this thing? (58:26)

Nick: Alright. So, I'm a little biased. I think, out of this book, the most interesting part is the 201 Real Questions. But I feel like questions are just questions, right? It's great practice, but no one's like, “Oh, my God. That changed my life.” I think the thing that gave most people a ‘light bulb’ moment, or something that really changed their outlook was my chapter on sending cold emails. A cold email is an email sent to a stranger – someone you don't know. Someone you did know would be a warm introduction, or if it was a friend of a friend, that's warm. Cold emails are emails to a complete stranger. It's not just even about email, you could send it on LinkedIn, you could send it as a cold Twitter DM – but the point is, people are willing to help you. (58:50)

Nick: If you reach out to people you don't know, you will be surprised by the results. I got my last job at SafeGraph by writing a cold email to the CEO – and I put what that email looked like up on my LinkedIn. I talk about this all the time. I think people just love this chapter because they thought they had to apply online with 300 other people, waiting to get their resume screened. Often, they never hear back. They don't even get an interview.

Nick: So this kind of changes it up, because you're actually doing the reach-outs. When you write a good reach-out to somebody, you say, “Hey, listen. I see you work at Uber. I analyzed the free data around New York City taxis. I made a public Tableau dashboard – here. And I did some analysis with scikit-learn – here. It's on my GitHub.” And you link those two things for a hiring manager who works on surge pricing. They're going to be like, “Oh, that's awesome. Cool. We would love to talk to you.” Or, “Oh, yeah. I just forwarded you to the recruiter.” People just don't know that they can build portfolio projects and then email people – the right people – to get ahead of this game of just applying online and never hearing back. I think that's why people love it.

Alexey: That's cool. So this is basically the cover letter. But you don't send this cover letter to the recruiter, right? You send this cover letter directly to the hiring manager. (1:00:48)

Nick: You could even send it to a recruiter. I've had great luck with sending it to recruiters as well. CEOs work too, for smaller companies, because they just want to hear from smart, talented people. I think the biggest thing is – a cover letter is very formal and it's often generic for 99% of people. That's why so many jobs these days don't even ask for a cover letter. If they do, nobody reads it. No one has time for that. Plus, a cover letter is just formatting – a clean page. A crisp email can carry a hyperlink, like, “Hey, here's my GitHub link,” or, “Here's the project.” (1:00:59)

Nick: Or you just embed the image. You could embed a GIF or an image, like, “Hey, look at this visualization of all the taxis in New York City over 24 hours. I love transportation and I want to help build the future of mobility at Uber.” That's like a two-sentence thing. It's one GIF. That's way more alive, and shows that this person actually knows how to deal with something like 8 million taxi trip records and did something with them, than any cover letter that just says, “I'm a hard-working data scientist who's passionate about making a difference.” You know, that's bullshit. So that's why this cold email stuff works.

Wrapping up

Alexey: Yeah, thanks. We should be wrapping up. Just before we finish – anything else you want to mention? Any tips you want to share? (1:02:09)

Nick: No. I would just say – people who are about to interview or just want to know about how to ace their data science, data analyst, or machine learning interviews – go check out the book on Amazon. It's basically the “Cracking the Coding Interview” of our field. It didn't exist, so we made it happen. It's only a few months old, but it's already getting some great reviews. So I just tell any person who wants to break into these FAANG-type companies or just level up in their career to go check it out. (1:02:16)

Alexey: How can people find you? (1:02:43)

Nick: You can check out the book on Amazon, of course. It’s just “Ace the Data Science Interview”. You can find me and my blog at NickSingh.com. I also talk a ton on LinkedIn. I have about 65,000 followers there, and I post daily tips around careers, job hunting, and data science. You can always find me there as Nick Singh. LinkedIn, my website, and buying the book are the best ways to learn more about me and my tips. (1:02:47)

Alexey: Yeah. Thanks a lot. Thanks for joining us today. Thanks for sharing your story. Thanks for sharing tips with us. And thanks, everyone, for joining us today – for watching us. There was one question I didn't cover. Sorry about that. But yeah, thanks a lot, everyone. And thanks, Nick. (1:03:17)

Nick: Yeah, thanks for having me. It was great to be on. Alright. Talk to you later. Bye. (1:03:36)
