
DataTalks.Club

Season 21, Episode 8

From Semiconductors to Machine Learning: A Career in Data and Teaching | Dashel Ruiz Perez

About this Guest

Dashel Ruiz Perez

Dashel Ruiz Perez is a data analyst, ML engineer, and educator based in Oregon, USA. He spent nearly a decade at Microchip Technology, where he worked across production, process, yield, and software engineering roles. Today, he teaches programming and data skills at major U.S. universities through ThriveDX and continues to deepen his expertise in machine learning and AI. Dashel holds degrees in computer science and data analytics from Western Governors University and is a graduate of ML Zoomcamp.


The transcripts are edited for clarity, sometimes with AI. If you notice any incorrect information, let us know.

Dashel's unique career path from music to semiconductors

Alexey: Hi everyone, welcome to our event. This event is brought to you by DataTalks.Club, a community of people who love data. We have weekly events and today is one of them. If you want to find out more about our events, there is a link in the description. Check it out to see all the events we have planned. (0.0)

Alexey: Do not forget to subscribe to our YouTube channel to get notifications about future streams. Last but not least, join our amazing Slack community where you can hang out with other data enthusiasts. (14.0)

Alexey: During today's interview, you can ask any question you want. There is a pinned link in the live chat. Click on that link to ask your questions, and we will cover them during the interview. That is the usual introduction I do. (42.0)

Alexey: Thank you. I think I came up with it once and have used it ever since. We can start if you are ready. (1:02)

Dashel: Yeah, I am ready. (1:15)

Alexey: Hi everyone. Today we will talk about the journey of Dashel from working in semiconductors to machine learning. (1:25)

Alexey: We are joined by Dashel Ruiz Perez, who is a data analyst, ML engineer, software engineer, and ML Zoomcamp graduate. Dashel's career has spanned semiconductor manufacturing, software engineering, data analytics, and teaching. (1:51)

Alexey: At Microchip Technology, he worked with complex engineering and production data before moving into software and machine learning. Today he teaches programming and data at major U.S. universities as a lead instructor at ThriveDX. Welcome to our podcast. (2:03)

Dashel: Thank you for inviting me. (2:20)

Alexey: It is a pleasure to have you here. For me, it is always interesting to talk with graduates of our courses. I am also interested in learning about your career path. (2:26)

Alexey: It is not very common. You started in the semiconductor industry and then worked in software and data. Can you tell us more about your career journey so far? (2:42)

Dashel: You are right, it is not completely common. Even before semiconductors, I was a professional musician. I came to the US in 2012 as a political refugee. When I got here, I was a professional musician. (2:58)

Dashel: I studied classical guitar for nine years and was just playing music all day. (3:24)

Alexey: Where are you from? (3:31)

Dashel: I am from Cuba. (3:33)

Alexey: Are you in Florida right now? (3:36)

Dashel: No, I am in Portland. (3:38)

Alexey: Okay. You are not the first person from Cuba I have spoken with. (3:42)

Dashel: It is completely common. I went to Florida like every Cuban, but I did not like the city that much. There are too many Cubans. The US government placed me in a refugee program in different states. (3:48)


Dashel: Oregon was the least cold place of the states I could select, which is why I chose here. I love it here; it is really nice and not as cold as Pennsylvania or other places. We do not like cold weather; Cuba is like summer the whole year. (3:48)

Dashel: When I got here I could not speak English, and it was very hard to do anything. After a year I finally found a job at Microchip, which has a fab here in Oregon. The position was expediter, just walking with a little cart to deliver wafers manually. (4:49)

Dashel: That job helped me understand the flow of wafers through the fab. It was work where I had to walk a lot, about 25,000-26,000 steps every single day. I moved around to different positions until I got a process engineer technician position. (5:18)

Dashel: That was when I finally got involved with semiconductors. Until that moment I was working in production, but then I started working with engineers running tests and experiments. I finally started working with data, and that was interesting. (5:49)

The transition into data and software engineering at Microchip

Alexey: Was your original education in engineering? (6:16)

Dashel: No, my education was music. (6:22)

Alexey: Music? You first started as an expediter, moving wafers around the fab with a cart. How did you manage to get an engineering position? Did you study in the meantime? (6:28)

Dashel: First, I went to college to learn English. I wanted to be in a different position for several reasons. Being an engineer means more money and more responsibility. Another thing is I always wanted to know why and how things work. (6:43)

Dashel: If you want to know that, you have to move out of production and start learning how everything happens. With a semiconductor, it is like magic. The way chips are made is completely insane. It is physics and math, but if you do not have an engineering background, it is really weird. (7:06)

Alexey: It is very interesting even for me. I am a software engineer by education. If I needed to work with semiconductors, I would be completely lost. It is engineering, but for you with a music background, it was a completely different world. (7:44)

Dashel: It was completely different. Was it that hard? Maybe, but curiosity helped. I always wanted to be like my friends. My friends in Cuba were software engineers or architects. I was the musician, but I knew about their homework in C++. (8:10)

Dashel: I was there with them while they were doing homework. Finally, I was able to do something close to that. That is what drew me to data. Something interesting at Microchip, and I assume it is the same at other semiconductor companies, is that all the chip processes happen in tools, which are huge machines. (8:49)

Dashel: You have different areas. I worked in four different areas at Microchip. Every time you need to run an experiment or analyze data, you have to check the log files for the tools. They have thousands of lines of data coming out, and it is very complex. (9:23)

Dashel: For every second of the process, you have the amount of gases used, what tool it is, what lot. Some tools have six steps in different places. Everything that happens is logged; every detail like pressure and gas amount is recorded every millisecond. (9:55)

Dashel: The log files for one hour are huge. You have to do that manually, sitting there with the cursor moving to see everything going on. It was too complex because it is not only the data; sometimes you see an error message. It is way more data than you think. (10:23)

Dashel: Every single detail of the process is logged, including what should be happening in that step. That was really eye-opening regarding data in a semiconductor. It is way more complex. I assume Intel, which is huge and close by here in Oregon, does not do the steps manually like we used to do at Microchip. (11:06)
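
For readers who want a concrete picture of what automating that log review could look like, here is a minimal sketch. The file name, column names, and pressure limits are hypothetical, not Microchip's actual log format:

```python
# A minimal sketch of scripting the log review instead of scrolling by hand.
# The log layout, column names, and limits here are assumptions for illustration.
import pandas as pd

# Assume one row per timestamped reading: timestamp, tool, step, gas_flow, pressure
log = pd.read_csv("tool_log.csv", parse_dates=["timestamp"])

# Flag readings where chamber pressure drifts outside an expected window
out_of_spec = log[(log["pressure"] < 1.8) | (log["pressure"] > 2.2)]

# Summarize gas usage per tool and process step instead of eyeballing thousands of lines
summary = log.groupby(["tool", "step"])["gas_flow"].agg(["mean", "min", "max"])

print(out_of_spec.head())
print(summary)
```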

Alexey: You learned to parse these logs? How did you get into software? (11:38)

Discovering machine learning to solve real problems in semiconductor manufacturing

Dashel: In the beginning, I decided to go to college. I wanted to be a psychologist because I like psychology, but it was too hard. I liked software at the same time, so I decided to take computer science classes. I was in this position getting this data every single day. (11:44)

Dashel: I was working in the CMP area, which is chemical mechanical planarization. When you process wafers, you grow oxide or deposit other materials on them. To make sure everything is the right shape, you have to mechanically planarize the wafer. This is a very sensitive step. (12:15)

Dashel: You have to check those logs every time something goes wrong. You have to do that manually. I made the calculations to fix the wafer or chips manually with my notebook. I had to do that every single day because I worked from 6:30 p.m. to 6:30 a.m. (12:48)

Dashel: We always had issues, and I had to do it completely manually. I talked with my supervisor and said, "I am doing this every single day. I can write an application so I do not have to make the calculations manually." After six months I knew the wafer thickness by memory. (13:21)

Dashel: I could manually calculate that one needed three more seconds and another needed four more seconds. I was right too many times. I said we need a procedure to do this, and I can write software to do it. He was very happy about that. (13:50)
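
As an illustration of the kind of calculation Dashel describes doing by hand, here is a tiny sketch: given a measured thickness, a target, and an assumed removal rate, compute the extra polish time. All numbers and names are made up for the example:

```python
# A hedged sketch of the rework calculation described above: how many more
# seconds of polishing a wafer needs to reach its target thickness.
# The thicknesses and removal rate are illustrative, not real process values.
def extra_polish_seconds(measured_nm: float, target_nm: float,
                         removal_rate_nm_per_s: float) -> float:
    """Return the additional polish time needed to reach the target thickness."""
    excess = measured_nm - target_nm
    if excess <= 0:
        return 0.0  # already at or below target, no rework needed
    return excess / removal_rate_nm_per_s

# Example: 30 nm over target at ~10 nm/s removal -> about 3 more seconds
print(extra_polish_seconds(530.0, 500.0, 10.0))
```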

Dashel: I wrote everything in Java, using JavaFX for the UI. When everything was done, they did not want to use it because the IT department wanted to use C or something else. I was very frustrated about that. (14:12)

Alexey: Yeah, I can imagine. (14:46)

Dashel: I talked with another person in yield. They were trying to fill a position that normally required a master's degree. We started talking about the job, and I mentioned the data I was parsing and writing in Excel manually. (14:52)

Dashel: I was telling him we could do that automatically; we could write something fast in a week. He said, "I am trying to find someone to help me analyze data with JMP. Are you okay moving?" I said I was okay moving. I moved from the technician position to the engineering position without having the degree at that time. (15:23)

Dashel: The last position I took at Microchip was working in JMP. There I was able to start doing the analytics and the engineering data part because I knew the data. It was very complex because there was too much data about too many details. (15:58)

Dashel: It was very complicated. Engineers do not have access to the data outside of the fab. You have to go inside the fab, put on the suit, go to the tool, and do it manually. They have to go there to run an experiment. If you have a technician, you can rely on that person to collect data. (16:20)

Dashel: You have to make sure the person is collecting the data and knows what they are doing. This is the way I finally got to yield, just from frustration. I did not want to do the same thing every single time because I was bored. I knew I could do more than just making calculations by hand. (16:52)

Dashel: In a fab like Microchip that makes a billion dollars a quarter, you have to follow procedure. I thought we could do that with software. You just put the thickness of the wafer and get the processing time. In the end, I think that software was never built in that area. (17:17)

Alexey: So the software you wrote was never implemented by the developers? (18:00)

Dashel: No, not in that area. I wrote other stuff with the data in yield that was used. I had a couple of projects that people adopted. My title was software engineer, but I learned during the Zoomcamp that I was really a data engineer. (18:07)

Dashel: I had access to all that data and was writing scripts in Python to clean it. I put that in an Oracle database and wrote small PL/SQL applications so they had access. It was more automation for my supervisor. (18:38)
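
A rough sketch of that clean-and-load pattern, assuming pandas plus the python-oracledb driver; the connection details, table, and column names are placeholders rather than anything from Microchip:

```python
# Read raw exports, tidy them with pandas, and insert them into an Oracle table
# so other tools (for example PL/SQL reports) can query them.
import os
import pandas as pd
import oracledb  # the python-oracledb driver

raw = pd.read_csv("yield_export.csv")
clean = (
    raw.dropna(subset=["lot_id", "yield_pct"])
       .assign(yield_pct=lambda df: df["yield_pct"].clip(0, 100))
)

conn = oracledb.connect(
    user="yield_user",
    password=os.environ["ORACLE_PWD"],  # keep credentials out of the script
    dsn="fabdb/orcl",                   # placeholder host/service name
)
with conn.cursor() as cur:
    cur.executemany(
        "INSERT INTO yield_daily (lot_id, yield_pct) VALUES (:1, :2)",
        clean[["lot_id", "yield_pct"]].values.tolist(),
    )
conn.commit()
```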

Dashel: In yield, you try to make sure you have the data of the whole fab, what is going on, how many things fail and pass. Every request was about how to get this data. All my years in different positions at Microchip gave me access to how every area works and the data behind it. (19:11)

Dashel: I was curious and knew what was going on. That gave me an advantage because other engineers in the fab did not know what was happening in other areas. Moving around in production and as a technician gave me the opportunity to see how everything connects. (19:42)

Dashel: When they asked me to get something, I knew where to go and who to ask. I was able to talk with production people or engineers to ask questions and get answers. I was able to be successful with the data part of the job. (20:06)

How Dashel found the Machine Learning Zoomcamp and his experience with it

Alexey: How did you become interested in machine learning and data in general? You were doing data engineering and needed to analyze a lot of experiments. Did this lead you to think machine learning could help you? (20:40)

Dashel: Yes. In my last year of undergrad, we had a big final project. One of the last classes was AI, and they wanted us to make software that could use AI. I used the common house price dataset from Seattle. (21:02)

Dashel: It was interesting to predict something and know what was going to happen based on price. When I was getting all this data at Microchip, a question you always ask in a job with a repeating process is, "When is an error going to happen?" (21:39)

Dashel: That was always a question from my supervisor. We wanted to predict an error so we could fix the tool before it happened or take action to have better yield and more sales. I thought I could use machine learning to do that. (22:16)

Dashel: At Microchip, you have a schedule of tests, sometimes called quals, to check if a tool is ready for production. A tool might have a qual every two weeks, one week, or one month, depending on how many wafers it runs. Sometimes you feel you need to calculate based on the number of wafers processed instead of time. (22:48)

Dashel: With the amount of data available, I thought it might be possible to predict this. I worked on a project about "wafers at risk." A tool processes a certain number of wafers every hour. For every thousand wafers processed, we need to check if the tool is ready, verifying the amount of particles and gases. (23:29)

Dashel: In the process, I wrote software that checked the entire production database for the whole fab. It took me about three months to count the number of wafer process steps and determine how many wafers were at risk for different tools in each area. (24:13)
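
A simplified sketch of what that counting step might look like with pandas: for each tool, count the wafers processed since its last qual and flag tools near a limit. The data layout and the 1,000-wafer threshold are assumptions for illustration only:

```python
# Count "wafers at risk" per tool: wafers processed since the tool's last qual.
import pandas as pd

runs = pd.read_csv("wafer_runs.csv", parse_dates=["processed_at"])
quals = pd.read_csv("quals.csv", parse_dates=["qual_at"])

# Most recent qual per tool
last_qual = quals.groupby("tool_id", as_index=False)["qual_at"].max()

# Keep only wafer runs that happened after each tool's last qual
merged = runs.merge(last_qual, on="tool_id", how="left")
since_qual = merged[merged["processed_at"] > merged["qual_at"]]

# Wafers at risk per tool = wafers processed since that tool was last checked
at_risk = since_qual.groupby("tool_id").size().rename("wafers_at_risk")
print(at_risk[at_risk > 1000])  # tools overdue for a qual (threshold is illustrative)
```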

Dashel: After I had all that data, I thought I could write something to predict how many wafers would be at risk if we kept working at the current pace. I did not need to wait. If my prediction was good enough, say 85%, I could tell the engineers to run their quals every 10 days instead of 15 days. (24:38)

Dashel: This would mean fewer wafers at risk, less waste, and more money. That was the beginning of machine learning for me. I remember running some algorithms but I did not have the knowledge. I was missing stuff. (25:16)

Dashel: Sometimes you run a Bayesian algorithm or a random forest and get 65% accuracy. You make some tweaks and suddenly you have 85%, but you do not know what happened. At that point, I could not use an algorithm even if I got the result I wanted because I could not explain the steps to my supervisor. (25:35)
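
For context, here is a minimal example of that kind of model, with feature importances included so the result can at least partly be explained to a supervisor. The feature names and target are invented for the sketch:

```python
# Train a random forest on tool history and report accuracy plus feature
# importances. The dataset, features, and target are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("tool_history.csv")
X = df[["wafers_since_qual", "avg_pressure", "avg_gas_flow", "days_since_pm"]]
y = df["qual_failed"]  # 1 if the next qual failed, else 0

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

model = RandomForestClassifier(n_estimators=200, random_state=1)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
# Feature importances give at least a rough story about what drives the prediction
print(pd.Series(model.feature_importances_, index=X.columns).sort_values(ascending=False))
```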

Dashel: I decided I needed to learn analytics and machine learning to make better predictions. I fell in love with it because it was more interesting than making software that just does CRUD operations. CRUD operations felt a little boring at that time; machine learning was more involved. (26:08)

Alexey: I was also doing slightly more complicated stuff than CRUD. I remember the feeling that you can make predictions based on data; it is so amazing. (26:32)

Dashel: Yes, it is very interesting. To understand the wafer process, I was taking courses to learn about atoms and how they bond, but it was too hard. I needed more math and physics classes than I took for computer science. I did not want to do that. (26:57)

Dashel: It was not really interesting, at least not at Microchip. Microchip makes old technology for blenders, refrigerators, and cars. The new chips for computers are made by Samsung, Intel, and TSMC. You are not at the leading edge of the field, but the data was there. (27:22)

Dashel: When you do something and get a good result, you get noticed. When I finished the risk project and had the risk per tool per area, my supervisor was happy. I decided to add a page with basic stats like range, mean, max, standard deviation, and some prediction just to add spice. (27:55)

Dashel: He was very happy with that. It was not only the idea of predicting that in two weeks at this pace a tool would have an 85% chance with a margin of error, but you know you have a problem with a tool already. Now you are watching the tool to make sure that between 3 and 12 days you need to run a check. (28:33)

Dashel: If the measurements are where they should be, you can run the quals. If we stop processing due to an issue, we can wait a little. But you know what is probably going to happen in the future and you can plan for it. They were very happy with that. I thought this is something good; I can do more of this stuff. (29:06)

The practical advantages of DataTalks.Club courses over other platforms

Alexey: I think I already asked you before we started. Do you remember how you came across our course, Machine Learning Zoomcamp? (29:33)

Dashel: As I said, I do not remember well because it was my daughter's first year and it is blurry. I think I was trying to find how to learn more machine learning. Maybe it was from someone at school because I was in grad school already. (29:45)

Dashel: I went to Arizona State University for a master's in computer science. I think someone in my class told me about it. We were studying for a logic class using a weird language from a European university. I was talking about machine learning, and he said I should check it out. (30:17)

Dashel: I checked out some of the public videos about the courses. At first I decided to wait and learn more on my own. One day I decided to really check it out, because 95% of internet content is not good for people beyond the beginner level. When you go more advanced, there is not a lot of good stuff. (31:13)

Dashel: I had finished my undergrad. I went to the GitHub page and thought it was good, but I decided to wait until the course began in September because I wanted the cohort experience. I wanted to be more engaged with the course, to feel the pressure of homework and asking questions. (31:50)

Dashel: I wanted to have that interaction. Waiting was a really good decision. It is so far the best course ever for me. I have access to DataCamp, Coursera, and LinkedIn Learning through my school, but in comparison, their quality is sometimes not even close to DataTalks. (32:22)

Dashel: The structure is the best for learning. It is not easy if you are not following along. You guys change the answers, and each new cohort starts fresh; you cannot check the previous one. I think that is good because it keeps you engaged. It gives you a push to continue when you get tired or something does not work. (33:25)

Dashel: I remember once after the midterm project we moved to using TensorFlow. I was using a Mac and had issues running it. It was a pain, and I could not run it. I moved to a PC and finished there. I remember talking with you on Slack, and you answered. The idea of having people in Slack to answer questions is awesome. (34:34)

Dashel: The Q&A works really well. For students, I really admire that the person teaching the class, with that many students, is able to be there and answer. It makes a huge difference. The material is taught better than on other platforms. Technology changes, and the models change with it. (35:55)

Dashel: The practicality of DataTalks is the huge difference. I finished a course on Coursera from a huge university, but you cannot use that course to do something; you just have some knowledge. I took a six-month boot camp in Mexico in Spanish. The issue with that university and others is that everything stays in a Jupyter notebook. (36:30)

Dashel: You know how the algorithm works and how to write a random forest regressor, but it never goes beyond the notebook. You run an inference and get a result, but it is just in a Jupyter notebook. The difference with DataTalks is that the goal is for the person to use it in real life. (37:29)

Dashel: You learn to put a model into a Flask application, get a result, and create a REST API with simple authentication. That gives you the idea that you can write this and it works. It is a huge difference to be able to use cloud providers like Google Cloud and integrate tools into the process. (37:52)
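
A compact sketch of that pattern: load a trained model, expose a /predict endpoint with Flask, and guard it with a simple token check. The model file, token, and feature handling are placeholders, not the course's exact project code:

```python
# Serve a pickled model behind a small Flask API with a simple bearer-token check.
import pickle
from flask import Flask, request, jsonify, abort

with open("model.bin", "rb") as f_in:
    model = pickle.load(f_in)

app = Flask("predict-service")
API_TOKEN = "change-me"  # in real use, read this from an environment variable

@app.route("/predict", methods=["POST"])
def predict():
    if request.headers.get("Authorization") != f"Bearer {API_TOKEN}":
        abort(401)
    features = request.get_json()  # e.g. {"feature_1": 0.5, ...}
    # Assumes the client sends features in the order the model expects
    prediction = model.predict([list(features.values())])[0]
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=9696)
```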

Dashel: This makes a huge difference in the quality of the course. When you finish, maybe you do not get a job right away, but it gives you the ability to do a really good project that shows you can do the job. It gives you the tools to write something good and nice that works in a company. It is not just a notebook; it is something real. (38:30)

Dashel: That is the difference. For my career, doing the DataTalks courses has been a game changer. I did the ML Zoomcamp and the MLOps course too. It is awesome. (39:06)

Alexey: Okay, so you did the MLOps course too. That is cool. You said so many good things about our courses; we should just cut it and use it as a testimonial. Thank you very much. (39:26)

Overcoming challenges and the value of the learning community

Dashel: They are. I can tell you about the first midterm project in ML Zoomcamp. We were just coming out of COVID, and I had gone to Mexico. Through my school, I got access to a paper about how comorbidities affected the risk of dying from COVID. They had the paper and the data. (39:52)

Dashel: I was doing the Zoomcamp at that time and decided to do the midterm with this data. I finished the project, put the API on Render, and told my friends they could send a request to get the data. Everybody said, "Oh, it is live and working. How did you do it?" This included people with PhDs and master's degrees. (40:28)

Dashel: They were impressed. I remember when I did my master's in data mining and inference, we did everything in a Jupyter notebook. You write the result, and it is just a paper and a notebook. When you finish the project, nobody can use it. It is just on your computer. If nobody uses your machine learning model, it does not matter. (41:08)

Dashel: DataTalks gives you the ability to make your stuff matter. You can put an API online and say, "Format the JSON this way and you get the result." You have three months to use it for free on Google Cloud. The process you teach includes how to use Google Cloud, set up a VM, set up Anaconda, and use containers. (41:50)
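
From the caller's side, "format the JSON this way and you get the result" can look roughly like this; the URL, token, and payload fields are hypothetical:

```python
# Call a deployed prediction API with a JSON payload and read back the result.
import requests

url = "https://covid-risk-demo.onrender.com/predict"   # placeholder endpoint
payload = {"age": 54, "diabetes": 1, "hypertension": 0}  # illustrative fields

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": "Bearer change-me"},
    timeout=10,
)
print(response.json())  # e.g. {"prediction": 0.27}
```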

Dashel: You have everything there. A person who knows how to write ML in a Jupyter notebook can finally use it. You are not getting that at a university. The courses on YouTube do not go to that depth. (42:24)

Alexey: Thank you a lot for such kind words. When it comes to machine learning, I remember my education was about how algorithms work, implementing algorithms, and a lot of mathematics. For me as an engineer, I want to make things available. (42:49)

Dashel: I want people to use it, right? That is the thing I learned at Microchip without looking for it. I remember my supervisor telling me, "It is good that you know how to predict that. How do I get that data? How do I get this result?" (43:20)

Dashel: If nobody uses the prediction, it does not matter. Everybody needs to know what is going on. I was just doing that. Usually, the idea was to know that beforehand. I was still in school, and my supervisor was kind enough to guide me through the process. (43:38)

Dashel: I was the only one writing code on a team where everybody did something different. They gave me the space to learn. He said, "Take the time you need, but we need to do it." Finally, you have a course that I still use. (44:09)

Dashel: The other day I was thinking I usually do not run Terraform. I go to the cloud and do it manually because I think it is easier for one or two instances. I thought, "I need to automate some stuff I do not want to prepare all the time. I will need to use Terraform." (44:36)

Dashel: I decided to go to DataTalks again. I went to the MLOps course, session three. I took that week, and now I have everything I need. I know that if a new technology comes out, you will have it there for sure. I can go there and learn that step. (45:02)

Dashel: Even though I took the whole course and graduated with a certificate I am very proud of, I feel I can go to DataTalks to refresh or learn something new. That is awesome. (45:27)

Alexey: When it comes to Terraform, you said it is sometimes easier to use the web interface. While that is true, I have found that with AI coding tools, I can ask Cursor to create a Terraform file for me. It creates the file, and you just apply it. (45:52)

Alexey: It is so much faster than opening the interface. You just type a prompt, hit enter, and the file is ready. Some tools will even apply the plan for you. It is amazing. (46:14)

Dashel: It is awesome. I use PyCharm and VS Code, usually PyCharm more. I feel it makes my life easier if something goes wrong. You can have those AI tools there too. (46:41)

Alexey: I think PyCharm has an AI assistant plugin. I tried it, but I usually use VS Code, so I went back to it. It is pretty cool. (47:07)

Dashel: I am using more Gemini now because I do not have a limit on requests for Gemini Flash. I find the difference is not that big between models now. LLMs have become like cars; every one of them will get you from point A to point B. (47:26)

Dashel: It is about which one you prefer. It is not like one will solve all your problems. It depends what I am asking; I use one or the other. I have both in my toolkit. (47:50)

Hands-on project experience: From image classification to Kaggle competitions

Alexey: I really like what you said about LLMs being like cars. There is a lot of truth in that. Do you remember what you did for your second project in the ML course? (48:10)

Dashel: Yes, for that one we used AWS Lambda. I made a butterfly classification thing. (48:24)

Alexey: Was it an image classification model? (48:42)

Dashel: Yes, with TensorFlow Lite. We compared it at the end. It was a really good project. (48:47)

Alexey: That is pretty cool. Where did you get the data from? (48:53)

Dashel: I got the data from Kaggle. It was a huge amount of data, a lot of pictures. I think it was a terabyte. It worked really well. The dataset had about 50 different types of butterflies. (49:01)

Dashel: There was one dataset that was light and another that was the whole thing. I downloaded the whole thing to make a better prediction. In the end, TensorFlow worked well with either of them. On the small dataset, it worked just like the clothing dataset in the course. With ResNet, 85% accuracy does a good job. (49:26)
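
For reference, a hedged sketch of the TensorFlow Lite step in a project like this: convert a trained Keras model and run one image through the TFLite interpreter. The model file and the preprocessing are placeholders:

```python
# Convert a trained Keras classifier to TFLite (smaller, easier to ship to Lambda)
# and run a single inference with the lightweight interpreter.
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("butterflies.keras")  # placeholder model file
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("butterflies.tflite", "wb") as f_out:
    f_out.write(converter.convert())

interpreter = tf.lite.Interpreter(model_path="butterflies.tflite")
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]["index"]
output_index = interpreter.get_output_details()[0]["index"]

image = np.random.rand(1, 224, 224, 3).astype(np.float32)  # stand-in for a real photo
interpreter.set_tensor(input_index, image)
interpreter.invoke()
probabilities = interpreter.get_tensor(output_index)[0]
print("predicted class:", int(np.argmax(probabilities)))
```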

Alexey: This year we will do PyTorch in addition to TensorFlow. If you are interested, you can check it out. (50:07)

Dashel: Maybe I will. I do not like PyTorch because it is too verbose. (50:22)

Alexey: I also do not like it, but it seems more popular now. People have been asking to switch to PyTorch. I see many students redo the course module in PyTorch instead of TensorFlow. I thought I might record a webinar to show how to do it in PyTorch. (50:30)

Dashel: Maybe I will. I am a little busy, but what I like about the ML Zoomcamp is that it is big. I did three projects when I took it. I did the midterm and the other two at the end. (50:52)

Dashel: The other project I liked was about EDA. I got a dataset from a Kaggle competition. I remember going through the data to make the model. That was really interesting because it involved more analytics, feature engineering, and then putting it in the cloud. It is just awesome. (51:10)

Alexey: Another change this year is that instead of TensorFlow Lite, we will use ONNX. You can train a model with any framework, export it to ONNX, and then serve it. If you are interested, you can check it out. We did quite a few updates this year. (51:53)
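
A small sketch of that export-then-serve flow, using PyTorch as the training framework and onnxruntime for serving; the shapes and names are illustrative only, not the course's actual code:

```python
# Train in any framework (a toy PyTorch model here), export to ONNX, and serve
# the exported file with onnxruntime, which does not need PyTorch at all.
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))  # stand-in model
model.eval()

dummy_input = torch.randn(1, 4)
torch.onnx.export(
    model, dummy_input, "model.onnx",
    input_names=["input"], output_names=["output"],
)

# Serving side: load the ONNX file and run inference
session = ort.InferenceSession("model.onnx")
outputs = session.run(None, {"input": np.random.randn(1, 4).astype(np.float32)})
print(outputs[0])
```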

Dashel: That is good to know. I think I could take it. I already have a couple of projects I did not finish. You start doing something and stop in the middle. I could finish them because the course is really good. (52:24)

Dashel: As I said, for everybody watching, it is the best course you can take. You do not have to pay for DataCamp for a year to get this knowledge. On other platforms, you will have the knowledge but only be able to work in a Jupyter notebook. (52:44)

Dashel: When you finish, you will be in the same place. The hardest part is not making the model; it is using the model. You need to make the thing you built usable for others. You need to be able to deploy it. (53:09)

Dashel: In our field, it is more complex. Doing a machine learning model is not enough anymore. You need to know Docker, connect to a database, know a little cloud, deploy it, and know what a VM is. All that stuff is really important because it is the way you will solve the problem. (53:23)

Dashel: The problem is not just to create the model. The problem is to make everybody know how you solved the problem. If you know how to solve world hunger but do not tell anyone, there is still no solution. (53:48)

The most difficult parts of the course

Alexey: That is an interesting analogy. Do you remember what was the most difficult part for you in the course? (54:12)

Dashel: I think the hardest part was working with my M1 Mac at that time. I assume it will be different now with the M4. I remember having issues with the car dataset. (54:25)

Dashel: I was coming from just doing the analytics part. If I remember, that dataset had a bunch of cars. I was doing the dummy variable encoding with get_dummies. The dataset was too wide, with too many variables. (54:47)

Dashel: Some cars were in the training set but not in the test set. I kept getting an error because I was not doing it the way shown in the videos. I remember a student from Brazil gave me a Python package that used KNN for encoding. It worked, but I did not get the same results. (55:15)

Dashel: I went back and did it the same way you did. The issue was the dataset had too many variables. It is not hard; if you have a week you can do it. If you think you will do the homework in two hours, you are wrong. (56:02)
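
Here is a small illustration of the column mismatch Dashel describes, with one common way to handle it: align the test columns to the training columns (scikit-learn's OneHotEncoder with handle_unknown="ignore" is another option). The data is made up:

```python
# Reproduce the get_dummies train/test mismatch and fix it by reindexing.
import pandas as pd

train = pd.DataFrame({"make": ["bmw", "audi", "ford"]})
test = pd.DataFrame({"make": ["audi", "tesla"]})   # "tesla" never appears in train

X_train = pd.get_dummies(train)
X_test = pd.get_dummies(test)

# The two frames now have different columns, which breaks model.predict().
# Reindexing the test set to the training columns fixes the shape mismatch:
X_test = X_test.reindex(columns=X_train.columns, fill_value=0)

print(list(X_train.columns))
print(list(X_test.columns))
```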

Dashel: The idea is not just doing the homework. The homework is there to help you understand what you learned. I usually did the code along with you. The videos were about an hour and a half, and then I started the homework by myself. (56:28)

Dashel: I tried to do it without watching your videos first. If I had an issue over and over, I would go back to the video. It was not impossibly hard, but it was hard enough to make me learn new things. It was not so easy that I did not want to watch the video. (57:00)

Dashel: It was challenging. Some stuff was easier, but some required more time. The cloud part was where I had to spend more time, slowing down the videos. I think the course is very balanced for the level I was at. (57:29)

Dashel: If you do not know Python, it will be awful. If you do not know any programming language, you will need to spend at least five hours a day and watch the video twice. If you are at an intermediate level in Python, it strikes a balance. The Slack community is awesome. (58:07)

Dashel: The community in DataTalks is just awesome. Your question has probably been answered in the Q&A. If not, somebody will answer you. Nobody will just give you the answer outright. I wish we had this type of Slack in university. (58:42)

Dashel: In Slack, it is very fast. I remember asking a question and getting 20 answers in 10 minutes. I even studied a couple of times with a guy from Brazil. We made a study group. I remember talking with one person in Spanish who helped me directly. It was good. (59:12)

Staying motivated throughout the long course

Alexey: Okay. Well, we should be wrapping up. Maybe the last thing I want to ask you is how did you manage to stay motivated? This is a long course with challenges, from September until the end of January, about four or five months. What would you recommend other people do to stay motivated? (59:55)

Dashel: For me, it was the public learning on Slack. Seeing messages pop up on your phone, knowing people are doing it, helps. Think that in one or two months you will be done. When you finish, you will have at least two great projects to show future employers. (1:00:31)

Dashel: You will have a set of skills that allow you not only to understand machine learning but to do something with that knowledge. You will be able to really solve a problem. You will be able to deploy a project and give it to your neighbors to use. (1:01:07)

Dashel: Do not miss the feedback from other students. Evaluating three other projects is a huge learning experience because you have to reproduce their work. That happens in real life. Do not miss the community. (1:01:42)

Dashel: If you feel it is hard, just talk with people. You will see it is never easy for anyone. Life happens; you get sick or busy. If it is very hard, ask questions. Somebody will answer and give you resources. In my experience, somebody might even call you and go step-by-step. (1:03:03)

Dashel: Other students write posts about what is going on in the course. Please just hang out in Slack and say, "Hey, it is really hard," and you will have all the support you need. (1:03:35)

Alexey: I am really thankful for all the students who help each other. If it was just me, I would not be able to do this. When everyone is going through the same material and experiencing similar problems, you share and help each other. This is a great thing to see and it helps me manage my time with thousands of students. (1:03:53)

Dashel: As I said before, that is one of the distinctions. If you take a course on DataCamp, it is just you. When you finish, you will not have a real project because you work in their sanitized environment. In DataTalks, you have a real project with messy data. (1:04:33)

Dashel: You have to do all the steps. People will evaluate you, and you will see what others are doing. You have that support, the videos, and Slack. That is a huge difference. When somebody asks me why they should take this course, I say just take it and you will be fine. (1:05:01)

Dashel: You will have all the knowledge you need. You do not have to watch endless hours of random YouTube videos. You only have to watch one course. If I want to learn something like Terraform now, I do not search on YouTube. I go to DataTalks, to the third module. It is one stop, with good quality, and I can use it right away. (1:05:29)

Dashel: It is real, not like a tutorial where you finish and still do not know anything. Many tutorials are the same thing with different people. This is really good. (1:06:26)

Alexey: Okay, thank you a lot. We should be wrapping up. Thanks a lot for joining us today, for sharing your experience and your story about your career and how you discovered the course. I personally liked all the nice things you said about the course. (1:06:40)

Alexey: This is very pleasing to hear and will be very helpful for us. When we edit this, we can make posts on social media about your experience. This will help others decide to join our course. So thanks a lot for doing that. (1:07:04)

Closing thoughts on teaching and future courses

Dashel: No, thank you. I really was not expecting you to contact me to talk about my experience. I have really good things to say. I tried to create similar content in Spanish because there is not a lot of high-quality material in Spanish. (1:07:36)

Dashel: When I received the email, I was surprised. I told my wife, and she thought it was crazy but good. I have been a teacher for classical music, software, and Python. Having a good teaching structure is important. Some people make tutorials but do not know how to teach effectively. (1:08:09)

Dashel: That knowledge is sometimes missing. With DataTalks, everything is there. The homeworks and projects are perfect. I was thinking it is a good way to do it because, teaching in schools, you are usually tied to a book. The way you do it is very good; I thought you must have been a teacher. (1:09:05)

Alexey: Well, I have a second degree as an English teacher, but we never studied pedagogy. I guess it is something I learned on the job. (1:10:10)

Dashel: The job you have been doing is amazing with that. I have been waiting for the next course about using React with machine learning or AI dev tools. (1:10:28)

Alexey: We will start the AI for Devs course in November. (1:10:54)

Dashel: I want to take it with you because I know it will be good. I know React, but I know I will learn a bunch of new stuff. (1:11:07)

Alexey: The idea there is not to learn React, but to use AI to write React code without knowing React. It is about LLM-powered coding. You can produce React code without initially needing to know it deeply. (1:11:19)

Alexey: You can get started without deep knowledge and learn as you build more applications. It is a good practice. (1:11:38)

Dashel: I think one thing that happens with people is they feel they need to know everything. When you watch videos, you feel you do not know anything. I wrote applications at Microchip in JSL, the scripting language for the JMP software. (1:11:52)

Dashel: I wrote three or four applications that people used. If you ask me how JSL works now, I do not remember. I do not even put it on my resume after using it for two years. It is something people need to feel more comfortable with. (1:12:20)

Alexey: Okay. Yeah. We should be wrapping up. Thanks a lot for doing this, and thanks everyone for joining us. It was fun. If you decide to join our updated modules, we will see you there soon. (1:12:49)

Dashel: Okay, I think I will. I am interested in the new stuff. (1:13:08)

