

The Secret Sauce of Data Science Management

Season 13, episode 6 of the DataTalks.Club podcast with Shir Meir Lador



Alexey: This week, we'll talk about the secret sauce of data science management. We have a special guest today, Shir. Shir is a data science group manager at Intuit in the United States. She's working on developing machine learning and deep learning models for document intelligence for products like TurboTax and QuickBooks. Welcome, Shir. (1:40)

Shir: Thank you, Alex. It's great to be here. Should I call you Alex or Alexey? (2:03)

Alexey: Alexey. Well, some people do call me Alex and I'm used to that. But my mom would never call me Alex. (2:07)

Shir: Okay. [chuckles] So thank you, Alexey. I'm glad to be here with you guys. Sorry about the technical issues. (2:16)

Shir’s background

Alexey: Yeah, we always have these technical issues. Anyways. I think we should start. But before we go into our main topic of the secret sauce of data science management, let's start with your background. Can you tell us about your career journey so far? (2:22)

Shir: Yeah, sure. I studied electrical engineering – I did my Bachelor’s and Master's in that topic at Ben-Gurion University of the Negev in Israel. Since then, I have worked in two startups as a data scientist – an algorithm developer. Five and a half years ago, I moved to Intuit and took a leadership role. I've been at Intuit for five years, leading and growing the team, developing AI solutions for security, risk and fraud, and smart products. (2:40)

Shir: About seven months ago, I relocated to the Bay Area (where, at the moment, a large part of it has no electricity) to take on a new leadership position, leading a team that is focused on document intelligence. Basically, the mission of my current team is to use computer vision and NLP methods to extract and classify information from any financial document that is uploaded to our system, in order to simplify tax submission and small business management for our customers. Basically, it's going to save them time by taking the heavy lifting out of all the financial work they have in their lives. (2:40)

Alexey: Just curious, did you happen to serve in the army in Israel? We had another guest from Israel more than a year ago, and I think she was some sort of pilot. That was pretty interesting. Did you need to do anything like that? Did you need to pilot anything? (4:12)

Shir: Yeah, I actually served for three years in the army. I was a pilot instructor. It's interesting that you said that. I was a pilot instructor in a flight simulator. Basically, I trained helicopter pilots for Black Hawks and Super Stallions. (4:31)

Alexey: Was it useful for your career afterwards? (4:54)

Shir: Well, definitely. I've learned so much from the Israeli Air Force. The debrief culture really helped me develop a growth mindset. Also, being a pilot trainer really helps you be a good leader and a good mentor for your team. (4:58)

Debrief culture

Alexey: Which, I guess, is quite related to the topic today. So what is this debrief culture? What does “debrief” mean? (5:24)

Shir: Basically how it works is, every time a pilot goes on a flight – in the sky or in the flight simulator – they look at their lessons learned from previous flights in the simulator or in the air on the helicopter. They pick three things they want to focus on in this flight. After they get off the flight, they debrief – retrospect on what happened in that flight and what the three things are that they want to work on for next time. So that's something I’ve really taken into my life, because this is really the best way to get better. Just keep on learning from experience, keep on marking the areas where you want to focus. (5:35)

Alexey: I just want to make sure I understand it correctly. Before the flight (or before doing something) you think about “What are the three things I want to focus on in this flight/project/whatever.” Right? You think about these things in advance, and you try to focus on them. After the flight/project/thing, you do a retrospective and you think about, “Okay, what are the three things I want to improve based on the experience I just had?” Is this a correct summary? (6:26)

Shir: Yes, it's the correct summary. And the three things that you focus on are usually things that you noticed in the past where you should do better. (6:56)

Alexey: So it could be like “better navigating the helicopter,” right? Or something like that? (7:07)

Shir: Yeah, it could be better navigating. If it’s specifically in a simulator, we work on how they handle emergencies in the air. For example, if the tail rotor is now malfunctioning and the helicopter starts spinning around, how fast did they reduce the collective – the power? And how fast did they react with the pedals? Also things about teamwork – how much did they listen to their co-pilot? Many things like that. (7:14)

Shir: Sorry about the technical difficulties. You would think that they would have better infrastructure in Silicon Valley. [chuckles] Right? (8:45)

Alexey: Than in Israel? Why? (8:49)

Shir: Better than anywhere, than it’s not [inaudible] (8:51)

Alexey: Oh, because all the IT companies are there and they need electricity, right? (8:55)

Shir: Yeah. (8:59)

Alexey: Okay. [chuckles] Well, maybe they have backup generators or whatever. I think YouTube still works, although the servers are probably not in the Bay Area. It’s probably too expensive. (9:01)

The responsibilities of a group manager

Alexey: Anyways, we stopped at the debrief summary – debrief culture. What I wanted to ask you next is – what do you actually do as a group manager? What are your responsibilities? (9:18)

Shir: I am a manager of three teams. Basically, I have team leads that lead those teams. I also manage a couple of ICs. The group manager’s role is basically to define the strategy for the whole group and to plan ahead – look one year, three years ahead and see what the vision is. Where do we want to get to? Find the right things we want to focus on as a group. What is the right direction for research? What are the metrics we would want to look at? What does success look like for our team? (9:34)

Shir: Of course, we also make sure we're building a high-performing group by mentoring the managers, as well as reviewing what the team is doing – what we are delivering – setting the standards, and working with the leadership team to make sure we are delivering to the highest standards. (9:34)

Defining the success of a DS manager

Alexey: Okay, that's quite a few things. So you're managing the managers and you also mentioned ICs, but I guess these are people on the staff or principal level, right? Then you also take care of the strategy for the entire group when it comes to data, I guess, in terms of what exactly you want to achieve in the next couple of years. You need to know a lot about management to manage managers, right? So what does it mean to be a successful data science manager? (10:55)

Shir: As you know, I gave a talk about that about [cross-talk] (11:27)

Alexey: Exactly. The story – it was PyData Berlin, right? I saw your talk there and I reached out. It's been one year since then and we finally managed to be on the same call and discuss that. Sorry, for interrupting you. (11:32)

Shir: Yeah. I think that when you say “What is success,” you always need to have a measurement for that. The two things I focus on are, first, how much we're able to deliver impact for the business. This is one aspect of success. And the other aspect is the growth and engagement of the team members. These are the two things I measure myself by. (11:53)

Alexey: The growth of the business and growth of the people. [Shir agrees] These two aspects, business and people. What kind of metrics are there? I guess for business, it could be usual business metrics that we look at for the product, because we use data science to improve the product. But what about the second aspect – the people? What kind of metrics do you look at there? (12:31)

Shir: Luckily, because we are at a big company, we have what's called a “pulse survey”. Basically, that's a survey that's done every six months. That's not everything, but that actually measures at least one aspect of that. It measures, for example, the engagement of the team, by asking different questions for all the team members. (12:56)

Shir: That's one very easy measurement you can look at. It also includes our own questions. There's also something called a “manager score,” which measures different aspects of management, like giving recognition, giving guidance to the team, providing feedback – all of those things are measured and combined. This is what we call the manager score. (12:56)

Alexey: You ask the people whom these managers manage what they think about their managers, right? Then they give some feedback – whether they feel that they receive enough guidance, receive enough feedback, and have all the resources they need to be successful at their job. I’m guessing this is anonymized, right? (13:58)

Shir: Yeah, it's just a survey through this app – this company called Glint does it – and it's all anonymized. We just get the aggregated scores of our teams. The team can also provide comments, but I don't have access to the comments. You need to have, I think, at least 50 people to see all the comments. So the VP or director would have that access to all the comments and then they can direct me to comments that they think may be relevant to my team. (14:20)

Alexey: In this case, to call a data science manager successful, we look at two things. First, business metrics – to see if the team is delivering on their OKRs (or whatever framework they use for setting goals), if they’re delivering on the goals and showing impact on business metrics. And secondly, the manager score? If both metrics are good, then we can call the manager successful. (14:58)

Shir: Well, it's not just that. And it's not just the manager score, it's also the engagement score. One of the roles of the manager is to make sure the team is engaged, so it's not just directly through the manager's score. But, this is not the only way I would measure my managers. I would also look at the feedback I get from their team. I'm doing skip levels with their team. There are multiple aspects. This is just a part of it. (15:30)

The three pillars of data science management

Alexey: Okay. On that talk that you mentioned – the PyData talk – you talked about the three pillars of data science management. Do you remember what these pillars are? Or is this what we already talked about just now? (16:05)

Shir: Yeah. We have managing up, managing across, and managing below. Actually, I want to add something more – we have the Intuit leadership playbook, which I think is also really helpful. It also kind of defines three pillars. It's not exactly parallel, but it's also an interesting way to look at it. One is “lead with a clear vision”. The second is “drive winning results”. And the third one is “build a high-performing team and a high-performance culture”. (16:19)

Shir: I think these are all kind of related to the things we talked about, just a different way to frame it. But when you talk about managing up, across, and below, I think this is more about “How do you achieve all of those things we just mentioned?” Leading with a clear vision, driving winning results. So this is more of the practice of that. (16:19)

Managing up

Alexey: So what is “managing up”? What does it mean? (17:20)

Shir: “Managing up” is how you work with your leadership team – with your manager, with a manager of your manager – to give them an understanding of what your vision of your team is. Share with them the success of your team, the requirements for your team to be successful. Maybe you need their help, maybe you need more resources. It's a lot about educating them about the accomplishments of your team. Because if you don't share the successes of your team with them, how would they know? Right? This is the role of the manager. And if they don’t know, what would be their motivation to invest further in this team? So this is “managing up”. (17:23)

Alexey: What happens if we don't manage up? What happens if we don't talk about how good our team is? What can happen? What can go wrong? (18:15)

Shir: Yeah. It's like the saying, “If a tree falls in the forest, and no one hears it, did it even fall?” I think it's kind of similar to that. Because if you don't get the backup from your leadership, how would you be able to grow and get more impact and improve? This is also true for one's career. I also encourage my employees to do that for themselves and I try to do that for myself. Try to list your own accomplishments and share that with your manager. Because your managers don't see everything you do. So to be recognized, you need to help your manager recognize you. (18:28)

Alexey: I think I heard about a concept called the “brag document” where you “brag” about all your accomplishments. You say “Okay, in this quarter, I did this, this and this,” and then you have this document. Then you can share this document with your manager and the manager will realize how valuable you are. Maybe they don't know all the details about all the work you do, but with this brag document, you help them know about that. (19:15)

Alexey: In this instance, we’re talking about managing up from me, as an individual contributor, to my manager. But then it can also work further up – from the manager level to the group manager level or head of data science or whatever – to the manager of managers. Is this what you're talking about when you say “managing up”? (19:15)

Shir: Yeah. Also, find opportunities for your team to share their work with leadership. That's also very important. This way, leadership would understand the impact and also the complexities of the things we deliver. (20:07)

Alexey: What kind of tools can we use for that? What can we do for that? (20:27)

Shir: We have regular – we call them “side visits” or “tech reviews” – with the upper leadership. We need to make sure that our team gets a slot in this. And myself, as a group manager, I'm not a frontline manager, so I also do those tech reviews with my team members to get to know what they're working on and to see if I have some directions for them. It gives an opportunity to ask them some questions. (20:33)

Alexey: So it's a meeting where we have the management and the actual people who do the work, together in the same room (or virtual room) where the employees talk about their accomplishments, right? What they did so far, how successful their projects are – and then the management learns about all the complexities and all the things. (21:11)

Shir: Yeah. It's mainly focused around projects. When we get a project to a certain milestone, we ask the managers “Should we do a tech review on this? I would like to learn about this, this and this,” and then we can set the tech review. They prepare something and present it and then we have a discussion. (21:38)

Alexey: In practice, it could be just a Zoom meeting, depending on the setup of the team – a Zoom meeting, or just a meeting, right? (21:59)

Shir: Yeah. We have a regular schedule for that. Once a week, we have a tech review or design review. (22:08)

Alexey: Do you do anything else apart from these meetings? Maybe some sort of email with a summary of all the accomplishments? Or not really? (22:16)

Shir: No. That's what I do with my team members, but we have similar forums with upper management every few months. We present them with something. In the past, I actually tried to do a kind of ‘accomplishments’ email – I did a newsletter of what the team does. I think at small companies, it may be a good idea to do that. (22:27)

Shir: In a startup, it's really helpful to give the rest of the departments an understanding of what the data science or AI team is doing. I think that in a big company like Intuit, we have a lot of data science teams so it's a little bit challenging to actually read through those things. (22:27)

Alexey: How many teams are there? Do you even know? (23:10)

Shir: Wow. When I joined, we were like a few dozen – data scientists and MLEs. That was five and a half years ago. We were just building up the organization. Right now, I think we have a few hundred data scientists and MLEs. This organization has also been unified with the organization of data analysts and data engineers. So the entire organization has about 1500 people: data scientists, machine learning engineers, data engineers, and data analysts. Many of them work on AI. (23:14)

Alexey: So it's very hard to keep up with everything that is happening. (23:48)

Shir: Exactly. [chuckles] (23:52)

Managing down

Alexey: Okay, so we've talked about “managing up,” which is letting your manager, or the manager of your manager (when you are a manager, the upper management), know what exactly your team is doing and what their accomplishments are. (23:54)

Alexey: What about the other two main things you mentioned – “managing down” and “managing across”? What are these things? Let's maybe start with “managing down” or “managing below” as I think you called it. What is that? (23:54)

Shir: Yeah. “Managing down” or “managing below” basically means something similar to building a high-performing culture. It covers everything that makes up the manager score, if you are a manager. This is defining the expectations for your team, providing them with training and resources, giving them feedback and recognition, making sure they get the guidance they need from you, making sure people are getting growth opportunities in their role – all the aspects of building a team and making it successful. (24:24)

Alexey: I guess the tools you use here are performance reviews, setting goals, and all these things, right? (25:13)

Shir: Yeah, definitely. We're setting goals. We have weekly one-on-ones. We have monthly check-ins, during which we review the progress of the goals they set for themselves. As I mentioned, we provide feedback. That's really important. We provide recognition of things. (25:23)

Alexey: How does it work? These goals that you set and this monthly check-in meeting – how do you set these goals? What kind of framework do you use for setting these goals? And how often do you do this? (25:46)

Shir: We have year-level goals, which we update sometimes. If there's a shift in priorities, they might be changed a little bit. But we do a goal-setting session at the beginning of the year. [cross-talk] (26:01)

Alexey: With each person individually or with the team? (26:22)

Shir: No, I did something with the entire group. But in the company, because it's a big company, you have a lot of levels of hierarchy. It starts with the most upper levels – the directors, the VPs – they set their goals, and according to that, the managers set their goals in alignment with the roadmap. Then according to that, the team members define their own goals in collaboration with their managers. (26:25)

Shir: Basically, we share our goals, which are more high-level, and then the people – according to what they think they should be doing for the business and also where they want to focus from their personal development – set their goals and get feedback from the manager. We always try to set goals that have a timeline and also some metrics we can measure them by. (26:25)

Alexey: So your manager has goals, then you look at these goals, and based on (it's probably some sort of VP) – based on their goals, you need to define goals for your group. You have the goals for your group, and perhaps you have some personal development goals, too. Then the managers who report to you (your team) look at your goals and think, “Okay, which of these goals make sense for me?” Then they break them down? For each team, each manager does the same work that you did – they come up with the goals for their own teams. Eventually, each person in their team does the same thing. This is how they end up with individual goals. Is this correct? (27:37)

Shir: Exactly, you got it just right. We even have a system in which we are able to align all of those goals to each other. So if people on a manager’s team are making progress with their goals, and those goals are aligned to the manager's goal, then the manager makes progress on their goal as well. (28:28)

Alexey: Okay. So then at the very bottom, where individuals work, they have their own goals – business goals, I guess – and then they also have personal goals. For example, let's say I want to learn more about neural networks or whatever. This could also be a personal development goal, which should be related to one of the business goals, but can be a separate goal. Right? (28:47)

Shir: Exactly. (29:15)

Alexey: What does this look like in practice? What's the split between these business goals and individual self-development goals? (29:16)

Shir: Yeah. If you're lucky, you're able to achieve the development goals within the scope of your business work. Sometimes, there isn't a complete overlap. Then you try to focus 10% of your time on building those skills you're not able to build during your regular day-to-day work. (29:26)

Alexey: But I guess the business goals are more like, “Improve this KPI by X percent,” or “deliver this thing,” or “open a new flow,” or “index this amount of documents” or whatever. They are business-related goals with business metrics attached to them. But what I was asking more about was “Okay, there are business goals. But now I opened Twitter and I see all these things about large language models and all that and I want to kind of learn all that because I feel like I'm missing out if I don't.” Which can still be connected to these goals, but I can probably have some individual goals that are something like “Okay, I want to learn that thing in order to see how I can contribute the skills I build to one of the business goals.” Right? (29:50)

Shir: Yes. You definitely got it. I want to say that, in my team, the work we are doing is definitely relevant to large language models. So even if it's your development goals, it might be something that you will actually work on. (30:42)

Alexey: That's cool when you work on NLP stuff, right? This is exactly what your group is doing? You work on NLP. [Shir agrees] That's cool. Then you have these goals and every month, you have some sort of checkup where you check “Okay, am I progressing towards achieving those goals? Does each of these goals even make sense or should we rethink them?” Right? [Shir agrees] So that's “managing down”. I guess when you look at how these goals cascade down, it also makes sense. They go all the way down to individuals. Did we forget about anything? You talked about expectations for the team, feedback, recognition, growth opportunities. We kind of covered all that when it comes to managing down. (31:00)

Shir: Yeah. This is a lot about the individuals in the team. There are also some aspects of team building, where you want to make sure that you have good team spirit and a good team atmosphere. You want to have a group meeting that is a safe place to share things. You want to have those forums where an open discussion can be had. Some people run a book club or a paper club where they review papers together. So you want to open up the space for learning – you want to make sure you have a place like that. (32:00)

Shir: You make space, as a manager, for innovation in your team. It's really nice, because we have a hackathon for the entire company twice a year – it's a five-day hackathon, basically. It's called Global Engineering Days. I really like it because those are the five days in the year when I get to go back to actually being a hands-on engineer myself. Basically, it's just a few days where everyone plays with whatever technology they like and solves whatever problems they want. It's one way to encourage innovation, but I also try to do some work in encouraging innovation all the time. (32:00)

Shir: It’s about finding the balance between working on deliveries and doing some long-term research within the team. That's also a lot of the work of the manager. Of course, it’s also what we talked about – we need to set the vision and the roadmap for the team. These are things that go in all directions – above, sideways, and below. There’s planning for the team – there are so many aspects of management. (32:00)

Managing across

Alexey: Okay, interesting. The last pillar is “managing across”. I understand “up”. You have this hierarchy – you manage your managers, and the hierarchy goes down, where you manage your team. But what is “across”? Does it mean that, in your case, for example, your group manager – you collaborate with other group managers? This is the “across” part? Or what does it mean? (34:02)

Shir: It will change from organization to organization. One thing would be collaboration with other AI managers in the organization. That would be one aspect of it. But a lot of it is just working with product teams, because we are not a self-sufficient product team. We work cross-functionally with different product teams across the company. I have my own product managers, who handle all of the collaborations with the product teams. My team builds reusable services, which are used across many offerings at Intuit – different offerings of TurboTax, different offerings of QuickBooks use our services. (34:31)

Shir: We maintain relationships with a lot of teams. This is also managing across – getting product requirements, translating those into the capabilities that the team is building, collaborating with the product teams, communicating the expected performance when our services go to production, and handling maintenance once the services are in production. What is the support model we are building for these services? What is the maintenance model? All of those things. (34:31)

Alexey: So this means we need to talk with the product team, by which I guess you mean engineers and product managers and everyone who is not a data scientist – not AI folks. [Shir agrees] Okay. What kind of managing across do you need in this case? How do you actually do this with them? (36:03)

Shir: It's a lot of expectation management, sharing our roadmap, communicating. What are the capabilities that we are building? How are they solving the problems of the customers? How are they achieving the business metrics? What should they expect from them? Even just explaining things like, “These are machine learning models. They are not perfect. They have some predicted amount of error. (36:28)

Shir: We cannot review every error that they have in production. We want to focus on patterns.” For example, things that would require a retrain of the model or any deeper change to the model. All of that work – educating product teams on what AI looks like, what we can achieve, and what we cannot. (36:28)

Alexey: I guess the main goal for managing across is making sure that the product teams know how important the stuff that you’re working on is. They want to help you, they want to work with you, they want to integrate the models you build, because they know that these models contribute to the goals of the company, and they want to be a part of that. They also want to contribute to the goals of the company. Did I get this right? (37:34)

Shir: Yeah. It's even deeper than that. They come to us with their requests so we have to explain to them – how do we fulfill those requests with the solutions we build? (38:00)

Alexey: Okay. So the ask comes from the product team. They come to you and you discuss how exactly you can help. You need to make sure that you set expectations right and this is the expectation management aspect you mentioned. Then you need to make sure that there are no unreasonable requests, so you need to talk about education for them to know what is possible to achieve. I guess this is also kind of related to expectation management. (38:16)

Alexey: Then, based on that, you come up with a roadmap, you share this roadmap with them, and you start working. You then probably also need to somehow let them know how exactly you're progressing and what state you're in so they know when exactly they can start integrating your solution. Right? (38:16)

Shir: Right. Yeah, definitely. We try to meet with them regularly, discuss the progress, and raise questions. We also have joint Slack channels in which we can raise questions and answer their questions about things. Yeah. I think there's one more important thing that we kind of touched on, but didn't discuss – as AI leaders, we need to give the product team the perspective of what kind of problem is suitable for an AI solution and what is not. This is another aspect of our mutual work with the product team. (39:03)

Shir: Also, I wanted to mention that it's not always requests coming from the product team. In many cases, because we are the ones who play with the data and have a better understanding of the data – of AI technologies – in many cases we will push in the other direction and tell them “You have to adopt this. This will help the customers.” So we’re also making that case and making sure they integrate those solutions and build the right product experience to handle them. (39:03)

Alexey: So it's not only coming from the product teams. They don’t only come to you and say, “Hey, implement this,” but you also know that if they add this particular feature or capability, it will be a better user experience. You come to them with suggestions like, “Hey, this is what we can do in our team. What do you think about this? Does it sound interesting? Do you want to integrate it in the future? This is what the integration will look like.” [Shir agrees] I see. This is managing across. So “managing up” is managing your management, “managing down” is managing the team, and “managing across” is managing the people that you need to work with when you actually deliver your solution – the product teams. Anything else we forgot? (40:15)

Shir: Um. No? [chuckles] (41:02)

Managing data science teams vs business teams

Alexey: Okay. [chuckles] We have a few questions so maybe I'll go and check out these questions. A question from Patrick is, “Is there any major difference that you noticed regarding the management of data science teams versus usual business teams? Maybe you talked to your peers in management and discussed that there are some differences in that?” (41:06)

Shir: Yeah. I think that everyone who has worked on an AI team knows that there is a lot more risk and a lot more unknowns when you're working on a project that relies on data. Because until you get into the data, explore it, and play with it, you don't really know how relevant it is to actually solving the problem. In some cases, the results just won't be good enough. There are a lot more unknowns. Also, in many cases, things will pop up and take more time than expected for various reasons – the technology is not suitable, or there are certain aspects of data sensitivity or data security that you need to address in your work. (41:36)

Shir: Things that come from production (that's also true, maybe, for engineering teams) – sometimes we have things we need to maintain in production that interrupt the plans. So there are a lot of unknowns, I think, and it's hard to communicate that to the product team. That's why it's important, as you said, that we build a plan with milestones and communicate it in advance. I know that maybe when the role of the data scientist first started being a thing, people would go and research for months, and then they would come up with something. We really try to avoid that and try to have rapid experimentation, rapid communication, showing the progress, getting feedback, and trying to deliver in parts as small as possible. (41:36)

Scrum teams, brainstorming, and sprints

Alexey: Yesterday we had another interview about software engineering practices for machine learning. The guest, Nadia, mentioned that it's quite difficult right now to merge the data science process (how we deliver projects with data science) with traditional software engineering processes like Scrum, Kanban, and all that. I'm wondering, what does the process look like in your case for your teams? How do they work on things? (43:30)

Alexey: Let's say you have an idea – a business or a product team comes to you with an idea – what does it typically look like from the beginning (when you just start discussing this idea) to actually delivering this? I know this can be its own talk, but maybe you can briefly walk us through it? (43:30)

Shir: Yeah. [chuckles] Actually, my current team does work in Scrum. (44:18)

Alexey: The data science team? (44:23)

Shir: Yeah. Actually, the team that I lead right now is a mix of data scientists and machine learning engineers, because we have the model part, but we also have services that we build on top of those models. We have a lot of engineering work in our team because it may be that one service will include multiple models that we merge together. These are complex services. (44:24)

Shir: Our MLEs and data scientists are in all three Scrum teams – they all work together. They work in two-week sprints, like the rest of the company. They do sprint planning. We do retrospectives as a group. They try to size all of those tasks according to the amount of unknowns in each task. (44:24)

Alexey: I was going to ask about that. Because with the amount of unknowns in AI teams and data science teams, it's very different from software engineering teams, where it's a lot easier to scope the problem and understand, “Okay, this thing will take five story points (or whatever).” With data science, it’s more difficult. So how do you handle that? (45:17)

Shir: I think they're doing a really great job at that. But it's really hard. They try to break things into smaller points. I think that in the data science part, sometimes it will take longer than we expected. But they try to break it down as much as possible. I don't have any magic here. It's mainly just thinking about how you break things down, what kind of milestones you can define for each of those stories. When we start a project, or when we start looking at a problem, we try to think about what the design would be first. (45:36)

Shir: We kind of brainstorm during that. I would usually ask the data scientist or the machine learning engineer to do some exploration for a couple of weeks and then present some kind of an idea of what they think they're going to build – what they think they're going to do – after they've played a little bit with the data. Then there's an opportunity there to show the overall direction, discuss the plan and the anticipated impact, and talk about what architecture we think we're going to build around it. So I think that trying to plan ahead a little bit helps you break things down later. (45:36)

Alexey: So if there is an unknown – you don't know what exactly the data looks like, whether it's good or not – what you do is, one ML engineer or one data scientist just takes the entire sprint only to explore the data. Because you can say, “We don't know yet. We need time to figure this out.” Then they take the entire sprint to figure this out and at the end they say, “Okay, this looks promising. These are the tasks we need to do for the next steps.” (47:08)

Shir: Yeah. Well, it doesn't really translate directly to tasks, but more I would expect, “This is an overall type of solution I would look at. This is the type of architecture I would anticipate.” Maybe it's not a task level, but more of a story level thing. Then the tasks would be broken down later, sprint by sprint. (47:39)

Alexey: Do you do some sort of grooming sessions when you actually break down things as a team together? (48:04)

Shir: Yeah. The Scrum team does that. I'm not very much involved in that. But they do it. (48:10)

Alexey: Because you're a group manager. But the people who you manage, the frontline managers, they're probably involved in these things, right? [Shir agrees] So you have these grooming sessions, then you have estimation sessions or maybe it's the same session – I don't know how it works for you. Then, for two weeks, they work on a sprint and at the end they have a retrospective, right? That’s the process? (48:16)

Shir: Yeah. After they do these kinds of grooming or sizing sessions, I ask them to share the results. I review it, ask questions, and see if there are things that I think they missed. We also have principal engineers who look at it and give their feedback. (48:47)

Alexey: Do you know any resources that talk about Scrum for data science? I'm really curious to see how this could be implemented. (49:09)

Shir: Well, I'm not sure I do unfortunately. I did the regular Scrum training for years. We just tried to adjust it to what works. (49:20)

Alexey: So it's internal expertise that you have. (49:35)

Shir: Yeah. I think everyone should just experiment. There might be some good resources out there, because other people have experimented and written things up. But there’s nothing specific that I can recommend. (49:39)

The most important skills and strategies for DS and ML managers

Alexey: Okay, another question from Patrick. “What is, in your view, the most important skill for a manager in data science and machine learning?” (49:54)

Shir: Oh, wow. [chuckles] There are so many skills. (50:03)

Alexey: Well, maybe let's take the top three. “The most important one” is too difficult to come up with. (50:08)

Shir: I don't know if these are skills, but the three pillars that I mentioned are the three things you need to focus on – lead with a clear vision, drive winning results, and build a high performing team. You need to try to do all of those things. This is what leads me when I try to do my work. (50:14)

Alexey: It’s more like strategy – how exactly you manage, how exactly you lead – you follow this principle. What does it actually mean to lead with a clear vision? Is this about clarity? You know exactly what the team will do in the next quarter, in the next year? (50:37)

Shir: It’s about having a vision of where you want the team to be in a year. What is the roadmap for this year? But it's also about communicating, as we mentioned, both to leadership and to the team: What is expected? What is going to happen? What is going to be the impact? All of those things. So that people don't face ambiguity. (50:56)

Alexey: Okay, “lead with clear vision” is the first one and the other two – can you remind us? (51:22)

Shir: Drive winning results. (51:28)

Alexey: Drive winning results. This is about, again, setting expectations and then setting goals? What is it about? (51:31)

Shir: This is about making sure we deliver with the highest quality and actually make an impact for the business. That's more of the tactics, from my perspective. “Leading with a clear vision” is more of a strategy and “drive winning results” is more of a tactic. It’s making sure that we execute with efficiency. (51:40)

Alexey: If we talk about these large language models, they are fun to play with. But if we play with them, we also need to make sure that we’re actually contributing to the business goals. It's not like we just have fun with whatever models – we're doing this because we want to have results. We want to have an impact. That's what this thing is about. Right? (52:03)

Shir: Exactly. (52:28)

Alexey: Then the last one is what? (52:29)

Shir: Build a high-performing team. (52:32)

Alexey: I think this is what we talked about, like setting goals and all that, so that each team member knows what exactly they should do. (52:36)

Shir: Yeah. And, as we said, mentoring them, coaching them, giving them the growth opportunities they need. If you want to focus on one skill that I think is most important for me, I think this is the growth mindset, which I also mentioned in my talk back then. There are so many skills, but the most important thing for me, at least, is to be able to… we started talking in the beginning of this conversation about me being in the Air Force, and this is what I learned from that – the debrief culture. (52:45)

Shir: This is exactly what it is – always being in growth mode. When you get feedback, don't be sad that you're not good at something like “Why did you say this thing about me? I'm really trying.” Well, this is all great, but really understanding how feedback is a gift and everything that you didn't do that well in the past is an opportunity to get better in the future – nailing down those areas and those things you need to do to grow and to be better. (52:45)

Alexey: I realized that our talk is called “the secret sauce for data science management”. So what is the secret sauce? Is this the growth mindset? (53:55)

Shir: Maybe? [chuckles] I think that it’s all of the things we talked about. It's a lot about the growth mindset. It's a lot about internalizing those pillars of management that we talked about and understanding “What is the strategy of you as a manager, to handle those things in the organization that you're in?” (54:05)

Alexey: What does “internalizing” mean? (54:31)

Shir: I’m having a hard time thinking about other words. It’s just taking it in, really believing it, and living by it. (54:40)

Alexey: So doing it, basically. Implementing it. (54:46)

Shir: Yeah and believing it. Taking a belief and value out of it – something you go by. (54:55)

Making sure proof of concepts get into production

Alexey: There is another question. I know we don't have a lot of time – only two minutes. I don't know if you can answer this in two minutes. Maybe you can give us a very brief overview. The question is, “A lot of data science proof of concepts do not get productionized and efforts are ‘wasted’. How do you increase the chances of a POC being successful?” (54:59)

Shir: It's a lot about really deeply understanding the customer problem that you're trying to solve, working with the product partner to understand what the customer’s story looks like, and working with the customer, if you can. We call it “follow me home,” which is actually looking at what the customer is doing and what you can improve. (55:23)

Shir: It’s also about trying to choose problems to solve where you have an opportunity to make an impact for the customer and an opportunity to show that impact – a measurable impact. Then, when you go and do that POC, try to define “Okay, I want to get this metric from X to Y,” and then try to show that you have achieved that. So I think it's much easier to make the case to get a POC to production if you've shown that some business metric has moved because of it. If you have an opportunity to do A/B testing – experimentation that actually shows it, rolled out to some fraction of the users – that's really a great way to push things to production in a gradual manner, through rapid experimentation. (55:23)

Alexey: So it’s about focusing on solving the problem. If you do this, and you show the impact on business metrics, the POC will be successful. (56:50)

Shir: Yeah, hopefully. Of course, it takes a lot of communication with the product side. Listen and try to see how you can really help. (57:01)

Alexey: Maybe that's also the answer to the question earlier from Patrick, “What is the most important skill?” Maybe communication is the answer there, because I see that it's a lot about talking. You need to talk with the team, you need to talk with management, you need to talk with product teams, you need to talk about understanding the customer problem. All of this involves talking, right? So you need to be quite good at this. (57:12)

Shir: You definitely need to be a good communicator. It's important. (57:37)

Alexey: Okay, I think that's all we have time for today. So thanks a lot for joining us today and for sharing your experience. We will definitely link the talk, because I don't think we managed to cover most of it – I think we just talked about the first couple of slides for most of the time, but it went really well. If there is anything else you want us to add – anything useful that you think our listeners will enjoy on this topic – please send us links. And that's it. Thanks, everyone, for joining us today. (57:43)

Shir: Yeah. Thank you so much, Alexey. I do have a few blog posts I've written in the past about leading a data science team, about how to streamline the development process of AI solutions. So I'm happy to share those resources. (58:18)

Alexey: Okay, well, then, have a great weekend, everyone. Goodbye. (58:37)

Subscribe to our weekly newsletter and join our Slack.
We'll keep you informed about our events, articles, courses, and everything else happening in the Club.
