Conversations on Applied AI
Welcome to the Conversations on Applied AI Podcast where Justin Grammens and the team at Emerging Technologies North talk with experts in the fields of Artificial Intelligence and Deep Learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real-world problems today. We hope that you find this episode educational and applicable to your industry and connect with us to learn more about our organization at AppliedAI.MN. Enjoy!
Eric Lealos - Test Bench to Tech Stack: Applying AI in Manufacturing
Welcome, everyone, to the Conversations on Applied AI podcast. Today we're speaking with Eric Lealos, a data and analytics leader with more than 20 years of experience helping organizations make better decisions with data.
Eric has worked across manufacturing, healthcare, and financial services, building analytics and AI systems that connect complex technology to real-world operations. Alongside leading Quantified Mechanics, Eric is also a professional member of the Strategic Synergy Network, where he collaborates with other senior practitioners to help companies solve operational, financial, and strategic challenges without the overhead of full-time teams.
His work spans data engineering, modern cloud analytics platforms, and AI-driven insights.
If you are interested in learning about how AI is being applied across multiple industries, be sure to join us at a future AppliedAI Monthly meetup and help support us so we can put on future Emerging Technologies North non-profit events!
Resources and Topics Mentioned in this Episode
- Modern Data Stack (Snowflake, Databricks, dbt, Fivetran)
- Quantified Mechanics' Test Bench Solution
- Agentic AI & LLMs in Production
- https://www.langchain.com (framework for building agentic LLM workflows)
- ChatGPT / Generative AI as a Work Accelerator
- The AI Adoption Gap — Further Reading
- https://www.gartner.com/en/research/methodologies/gartner-hype-cycle (Gartner Hype Cycle)
[00:00:00] Eric Lealos: So I always love it when people sort of reimagine, right? So originally, all of these tests were designed to pass or fail a product. That was the underlying intent of the engineer who designed the test. It's like, I wanna make sure it does X, Y, and Z well, and if it does those things well, I'm gonna put this thing in a box and ship it.
[00:00:23] But if you would actually change your perspective about it just slightly and say, I want to capture a bunch of data and find out why my products are failing, you would design the test differently and you would design the flow differently, and most importantly, you would realize how important not only the data for the one test is, but the aggregate data for all of the tests.
[00:00:54] AI Speaker: Welcome to the Conversations on Applied AI podcast, where Justin Grammens and the team at Emerging [00:01:00] Technologies North talk with experts in the fields of artificial intelligence and deep learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real world problems today.
[00:01:12] We hope that you find this episode educational and applicable to your industry, and connect with us to learn more about our organization at AppliedAI.MN. Enjoy.
[00:01:25] Justin Grammens: Welcome everyone to the Conversations on Applied AI podcast. Today we're speaking with Eric Lealos, a data and analytics leader with more than 20 years of experience helping organizations make better decisions with data.
[00:01:36] Eric has worked across manufacturing, healthcare, and financial services, building analytics and AI systems that connect complex technology to real-world operations. Alongside leading Quantified Mechanics, Eric is also a professional member of the Strategic Synergy Network, where he collaborates with other senior practitioners to help companies solve operational, financial, and strategic challenges without the overhead of full-time teams.
[00:01:58] His work spans data [00:02:00] engineering, modern cloud analytics platforms, and AI-driven insights. So thank you, Eric, for being on the podcast today.
[00:02:06] Eric Lealos: Thanks, Justin.
[00:02:07] Justin Grammens: I appreciate you being on the program. You've been a longtime supporter of Applied AI, all the stuff that we do, and one of the sponsors here at our last conference.
[00:02:15] So great to get you on the show. And the thing that I typically ask people is to maybe kind of paint the picture a little bit of the trajectory of your career after getting outta school or whatever. Kind of what led you down this path of getting into analytics and AI?
[00:02:28] Eric Lealos: I'd probably give a different answer to this question every time somebody asks it.
[00:02:31] Just whatever is currently on my mind. But essentially, when I started college as an undergraduate, I started as an engineering student. So I got into computer science and did all of those kinds of courses. The internet was really just beginning to unfold at that time. In fact, my first computer couldn't run Windows.
[00:02:52] I didn't have enough RAM to run Windows on my first computer that I would do my homework on. But because I had that [00:03:00] background, I was recruited as a student into a company called Computer and Information Technology, and at the time I was helping people set up email accounts and use computers for their schoolwork or their jobs.
[00:03:12] So I sort of got into this job that way. But then about halfway through undergraduate, I decided I wanted to go to law school.
[00:03:20] Justin Grammens: Mm.
[00:03:20] Eric Lealos: I changed my major, went into the regular pre-law course, but then when I finished undergraduate, I was broke. I didn't wanna stay in school. I was just literally hungry. Like, my stomach was empty and I wanted food in it.
[00:03:37] And I thought, I'm gonna go get a job at West Publishing in order to kind of be around a bunch of lawyers and just learn that whole deal. And I ended up working at Westlaw Customer Support, which at the time was, again, pre-internet. The internet existed; it just wasn't in its current conception.
[00:03:53] Justin Grammens: Yeah.
[00:03:53] Eric Lealos: And Westlaw was a big online legal database that people would have modems [00:04:00] on their computers and dial into and connect directly to, using software and hardware, so that they could do legal research. That's how I really got back into technology. That was a really great time, right? That was a very great time to be in technology. Things were developing really fast. It kind of feels like from, say, the mid-nineties until, I don't know, even 2010, it just seems like my career was on the fast track.
[00:04:26] I could sort of do no wrong. I just made decisions every other year or so and changed jobs and got new skills. And then eventually I started to work for a company called Information Advantage, which was one of the very, very early data and analytics software companies out there. In fact, a lot of people would say it was us and MicroStrategy, we were the two companies, and for whatever reason, Information Advantage didn't really launch the same way MicroStrategy did.
[00:04:58] But I learned a lot. I learned a [00:05:00] ton of things. And then eventually I left there, in a software engineering role, to go and work for a company called Cognos in professional services. But the whole reason Cognos hired me was because I had all these other skills that they didn't have.
[00:05:18] So I was a Unix engineer, I was a software engineer. I knew how to do all these things that they didn't have experience with, and I was also, we didn't have the words for it at the time, but I was essentially a data engineer. That really became my key play as it pertains to analytics: I knew how to make the data the shape and form that I wanted it to be, whereas most people didn't really have that skill set at the time.
[00:05:43] So from Information Advantage to Cognos is how I got into data analytics and data engineering, and then eventually machine learning, and then AI.
[00:05:53] Justin Grammens: You know, it's interesting, you talked about how the internet was developing really fast, and I was around during those times too. Man, like the mid-to-[00:06:00]late-nineties, everyone was trying to build websites as fast as they could, but then we were doing a lot of programming as well.
[00:06:06] I mean, back in the early days of Perl. And so I was writing Java applets and all sorts of stuff. And then my career kind of moved into, first it was a lot of full-stack development, and then it moved into mobile. So I feel like there's been these, at least, multiple waves, whether it's been the internet, mobile development.
[00:06:21] And now my question to you is, I mean, do you feel like we're at this cusp of another big expansion here with AI? Like, is it as exciting right now in your mind as it was back then?
[00:06:31] Eric Lealos: There are a lot of things that I don't really understand about what's going on right now, but I think ultimately it is. It's unbelievably exciting.
[00:06:38] I don't know if things just haven't quite caught on, but I see the development of AI approaches and technology and solutions currently in a phase that really reminds me of the kinds of changes that the iPad and iOS brought to the world. [00:07:00] As an example, once iOS came out and the iPad came out, suddenly there was this platform that people could make all kinds of applications on, and we saw lots of business applications built on top of it.
[00:07:12] Things could be built for tens of thousands of dollars that would've required tens of millions of dollars earlier. And so suddenly you see people in retail operations using an iPad, and we don't need fancy cash registers and point-of-sale systems from really proprietary application or hardware manufacturers that were super expensive and taking tons of money out of those operations.
[00:07:40] I think AI is in the same boat. So for all of my career, when we were doing machine learning types of projects, there was just a huge overhead in terms of data engineering, feature engineering, and there were these monstrous [00:08:00] disconnects between the data team, the data science team, and the people on the business side.
[00:08:05] And as a result, there never was a big payback for a lot of stuff, except for a few cases. The models also just drifted right away, and all of a sudden we have to keep redoing this. I'm gonna spend $300,000 a year to save $310,000 a year on some sort of operation. It was worth doing, it just was never the big smash hit that everyone expected.
[00:08:32] Whereas now, it feels like, between all of the things we've learned through data engineering and all the other technology that surrounds it (things like Snowflake, Databricks, dbt, ingestion pipelines) and our ability to just write code easily, it just seems like all of the little hurdles are gone, and we can easily engineer to build tools.
[00:08:52] So we're kind of on this agentic kick right now, where we're actually, amazingly, finding that all of our data [00:09:00] engineering skills are perfectly aligned with building tools in agentic workflows. So for us, we talk about delivering projects for $20,000 that would've cost a million dollars 20 years ago.
[00:09:15] And I know, just on technology alone, we can do things today, it's crazy. I actually just did the cost analysis on an AI LLM project, and I was like, we can't even tell them this. They're not gonna believe us. They're not gonna believe that they're gonna spend $1,500 on token costs. They're just not gonna believe that it isn't supposed to cost 50 grand.
[00:09:40] Justin Grammens: You know, you see the light, but it feels like a lot of companies aren't seeing it. They still either aren't believing it, or they're getting stuck on these pilots that aren't actually moving to production. They're maybe not doing the cost analysis. What are some of the hurdles or the barriers that you're seeing as you're going into organizations that are maybe not really firing off a whole lot of [00:10:00] AI projects?
[00:10:01] Eric Lealos: I think, generally speaking, there are some macroeconomic headwinds, some interest rate economic stuff. I think some of the uncertainty associated with changing administrations and that kinda stuff, and changing funding sources and money flows. All of those things are, I think, headwinds, but not necessarily fatal.
[00:10:20] But I think over the course of the last 10 or 15 years, I've personally increasingly observed that management leadership in organizations has gotten a little bit lazy. And what I mean by that is, generally speaking, I think a lot of management wants to shop. They wanna shop for a solution that solves exactly their pain.
[00:10:47] They wanna buy it, they want to implement it, and then they want to go and shop for the next thing. As opposed to: I wanna build a team of four or five people that are really capable of [00:11:00] building specific things that will give us a unique competitive advantage. But those kinds of things require, I think, a smaller team of more skilled people, both in terms of technology, but also on the business side.
[00:11:14] So a lot of times when I start a new engagement with a client, I know if we're gonna be successful or not right away, because there's usually somebody on the business side who really has a very in-depth understanding of what they wanna do with data and why it matters, and how it can be impactful.
[00:11:36] A lot of times I'll see somebody with a great big spreadsheet. They're downloading 10 different reports, they've got three or four intermediate pages where they're doing a bunch of stuff, and it takes 'em way too long, and the spreadsheet's broken and it doesn't work anymore. But then they're kind of worried.
[00:11:52] They know there are all kinds of errors in it, but they know it's the right thing to do, and there's a reason for it. Every time [00:12:00] I see that, I know that we can solve things and we can do things really well, both in terms of automating those data flows, but also correcting them. 'Cause they generally have access to a handful of reports, and they may not even fully understand what those are doing and why they have the data they have. But for them, it's close enough, and it's as close as they can get. They pull all that data together, they crunch it together. So, for example, we worked with a company who was pulling together weekly inventory reports, plus daily order reports, plus bill-of-material reports, so that they could make a manufacturing plan every day.
[00:12:41] And the person was trying to get this data right, and they had to factor in their safety stock, and they had to factor in certain products that they wanted to make sure they never ran out of, and anything that came in as an order. Certain categories from certain customers had to be sent right away.
So [00:13:00] you can imagine the data challenge associated with this, right? It's not easy. And that person never once in their entire career finished the plan. Never once. Any given day, they never finished the plan. It was just a constant effort. And then we brought all this data together and made it all essentially the same [00:13:22] era, meaning the same up-to-date, real-time data flow. And all of a sudden we went from, okay, we're building a plan at eight o'clock in the morning for the next eight hours, and then at four o'clock for the next eight hours, and then at midnight for the next eight hours, to: we just have a queue. Just build the things that are on this list, just go down this list.
[00:13:45] And then when something new comes in at 10 o'clock, you don't have a sales rep coming onto the floor going, hey, Justin, I really would appreciate it if you would get this thing there. And then they set something aside. And right now, if there are any manufacturing people listening, [00:14:00] they're all going, oh my God, I hate when that happens.
[00:14:02] Sure, they've all lived it. The thing gets set aside, and the thing that would've taken an hour is now gonna take three hours, and they're gonna forget to do it. And the whole thing got disrupted and you just went off plan. And now the plan doesn't reflect reality, and now the people are all confused, and the whole thing's a mess.
[00:14:18] Justin Grammens: Yes.
[00:14:20] Eric Lealos: It happens.
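The queue-based plan Eric describes, merging order, inventory, and bill-of-material data into one prioritized build list, might be sketched roughly like this. The SKUs, data, and three-tier priority scheme (rush orders, regular orders, then safety-stock replenishment) are hypothetical illustrations, not the actual system:

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class BuildJob:
    priority: int                     # lower number = build sooner
    sku: str = field(compare=False)
    qty: int = field(compare=False)
    reason: str = field(compare=False)

def build_queue(orders, inventory, safety_stock):
    """Merge order demand and safety-stock shortfalls into one ranked list.

    orders: list of (sku, qty, is_rush) tuples
    inventory / safety_stock: dicts of sku -> units on hand / floor level
    """
    heap = []
    for sku, qty, is_rush in orders:
        # Rush categories and customers jump the line; everything else queues behind.
        heapq.heappush(heap, BuildJob(0 if is_rush else 1, sku, qty, "order"))
    for sku, floor in safety_stock.items():
        shortfall = floor - inventory.get(sku, 0)
        if shortfall > 0:
            # Replenish anything sitting below its safety-stock floor.
            heapq.heappush(heap, BuildJob(2, sku, shortfall, "safety stock"))
    return [heapq.heappop(heap) for _ in range(len(heap))]
```

With something like this in place, the floor just works the list top to bottom, and a rush order arriving at 10 o'clock becomes one more push onto the queue rather than a sales rep walking onto the floor.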
[00:14:21] Justin Grammens: Yeah, yeah. So how do you adjust and build for that, is that how you guys are attacking the problem? And your focus is a lot in manufacturing, is that what you would say? I mean, is that where you're finding a lot of these opportunities?
[00:14:33] Eric Lealos: We go in streaks. Generally speaking, we're usually talking about what we're working on.
[00:14:39] And so whatever we're talking about ends up leading us to the next thing, right? So we just go in streaks, and it just depends on how things happen. So I've done a lot of work in financial services, in healthcare, specifically skilled nursing, but a lot of med device. So a lot of it overlaps, right? I mean, med [00:15:00] device is manufacturing, so there's a lot of manufacturing there, and some of the issues are the same. They just talk about them and think of them differently. But I just kinda wanna get back to, I forgot to really close the loop on your comment: why aren't people buying in on this? And I think increasingly there are fewer and fewer of those people who are really digging into these data challenges, for whatever reason.
[00:15:22] I think we're just in a phase right now of the world where a lot of people either don't know, or they're not empowered to have an impact. They're not wired to have an impact. Like you mentioned earlier, before we got started, about how entrepreneurs like to do hard, crazy things, like running and stuff.
[00:15:41] That's really not what people in big corporate America are being sold, right? They're being sold a pretty narrow role: follow these tasks. They may not necessarily know how those tasks connect to other tasks in the organization, and, as a result, [00:16:00] they're not necessarily sure what the opportunities might be for them to do it better or change it or reinvent it. They don't really see enough of the organization. And I think this is one of the things, like, I really do believe in small business, both in terms of technology, but also in terms of manufacturing, retail, restaurants, all of it. 'Cause I think that's where you see these people who are just not really tied up and bound by convention or inertia or whatever. And I think right now that's essentially what I see: there are not a lot of people who have the freedom or the will or the desire or the ability to get in and really make an impact.
[00:16:42] Justin Grammens: I love it. There's a monetary side of an organization, but then I think, more often than not, it feels like it's the culture.
[00:16:49] And like you said, there's many of these organizations where everyone just sort of gets pigeonholed, and they're in their own space. And when you have such a game-changing technology like generative AI that can be [00:17:00] literally used anywhere. There's not a job title, I don't think, at any company that I could think of, where they couldn't leverage it in some way.
[00:17:07] It's almost like it's too much. I go into organizations and do a fair amount of training with them, and to me and you it would feel very simple, right? Really, you haven't actually opened up ChatGPT and used it to generate, I don't know, a sales deck, for example? And people are like, nope.
[00:17:25] And some of it is that inertia, like you said, just that they've always done it the old way. And so, yeah, I think that is a huge driving factor that I think a lot of people overlook, so I'm glad you touched on that. I talked with a guy earlier today, and he feels like it's also a Minnesota thing.
[00:17:41] Like, on the coasts, you have a lot more people that are bucking that trend. I don't know if you see it or think about it that way too, but I feel like we're pretty conservative here in Minnesota around change, I guess is what I would say.
[00:17:55] Eric Lealos: Yeah, I think that's a fair statement. I think that's culturally fair.
[00:17:58] I think because we've [00:18:00] been doing some work in manufacturing, I think that's also a culture that's pretty conservative. Oftentimes I think about, if we do this rollout and we shut down this line, that's a problem. But of course, that's not a really realistic thing; we know how to mitigate those kinds of things. And I do think there is a big cultural question mark that sits out here, but this is something that's absolutely existed through both of our careers the entire time.
[00:18:30] For as long as I can remember, there's been this tension around technology, information technology and all kinds of technology, replacing people and taking their jobs, and there's some fear and some worry around all of that. But what's really interesting about that is it's almost never happened. Technology has created way more ability.
[00:18:56] It's exactly like the way that you were saying it, and I can tell [00:19:00] you and I are sort of wired the same way, because this technology is so interesting, and its impact can be so broad, that it is literally gonna change people's impact. And that's not ever gonna get rid of people; that's only gonna make people more valuable.
[00:19:18] You know, it kind of gets back to what you were saying earlier. It's one of the reasons why I think we're in a super exciting time. Once the inertia breaks free, we're just gonna end up doing a ton of amazing work.
[00:19:29] Justin Grammens: Yeah, yeah, for sure. It feels to me like these are the early days of the internet, and that it's gonna be that much bigger.
[00:19:34] You look back at the internet, I mean, no one had any idea that you would have an app on your phone to heat up the car and do remote start. People weren't thinking about that in the nineties. It was so infantile. I remember just doing websites that were just, hey, take this paper brochure and digitize it.
[00:19:50] That's all we were doing. But now everything's connected on the internet, right? And there were some people out there maybe that were thinking ahead. I certainly wasn't. When I was coming outta school, I was like, oh, interesting. All right, [00:20:00] cool. But yeah, now the internet is pervasive and it's used everywhere. And I feel like we're in those early days of brochureware with generative AI, and time's gonna tell with regards to where it's gonna fully be utilized.
[00:20:14] We just can't predict it. Who knows where it's gonna go in the next five to 10 years, but it feels very early right now, for sure.
[00:20:20] Eric Lealos: So I absolutely agree with all of that, but I also think that's one of the things that's contributing to sort of the sideline situation, because I think a lot of folks who are responsible right now for selling stuff are really coming up with big-time hype promises, and I think there's a lot of noise in the market.
[00:20:42] People will call it the hype cycle, whatever that might be. And maybe this is how it's always worked, I just haven't seen it from this perspective before. But I think there are so many people that are increasingly, urgently trying to push their own AI agendas. It's really [00:21:00] creating a lot of noise.
[00:21:00] And I'd be really curious, I'd like to ask you: if I come across somebody in an organization that's doing AI-related work or pitching AI-related work or whatever the case may be, I would say it's one out of three or four, maybe one out of five even, that just in a five-minute conversation, I can tell that they don't do the work.
[00:21:20] They'll say some basic things, like the way they're using LLMs, or the way the application's designed to work, or why it works. And I'll think, no, you don't do this work if that's what you are saying. And there's just too much of that. And I think too many people have tried it, right? Like, I gave a presentation at Applied AI and I was like, hey, you know, our first pilot, the first thing we did was, [00:21:42] we just gave the LLM all the data and said, what do you think?
[00:21:45] Justin Grammens: Yeah.
[00:21:46] Eric Lealos: And it said kind of interesting things, but it wasn't good.
[00:21:49] Justin Grammens: Yeah. And it wasn't correct probably.
[00:21:51] Eric Lealos: But there are still people that are saying, that's how you can build these applications. And I'm like, well, you haven't built it, if that's what you're thinking.
[00:21:58] Justin Grammens: Yeah. No, you're walking this line [00:22:00] between a deterministic system, which is funny, because AI encompasses machine learning, it encompasses deep learning, it encompasses all these analysis techniques, which are very much rooted in algorithms. And then you have generative AI, which is, again, a deep learning neural network.
[00:22:19] So it has that in there, but it's not deterministic all the time. It's basically generating the next word. But people want to go ahead and just use an LLM for literally everything. When you ask it the same question the next day, it might actually answer it differently.
[00:22:34] Eric Lealos: You're talking about something that is the crux of our approach.
[00:22:38] There are a couple things that we really like. We really like operational AI cases. Our entire design of our solution is, how do we make this thing behave deterministically? And that is the nature of it all. And that's where our background as data engineers comes in, right? We just carve up [00:23:00] all of these problems into these tiny little [00:23:03] data problems, and then let the LLM go and pick tools, get data, use the data that it gets in the next tool. And as long as you can kind of, I dunno, I've used this expression, "shackle the genie," which I stole from somebody else. I think it was at a St. Thomas event. There was somebody from, I think it was either Boston Scientific or Allianz, who said something about shackling the genie.
[00:23:26] And I went, oh, that's a great way to put this.
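A rough sketch of that "shackle the genie" idea: every tool is a small, deterministic data operation owned by ordinary application code, and the model is only allowed to name a tool and its arguments. The tool names, data, and JSON call format here are invented for illustration; in practice the `raw_call` string would come from an LLM's structured output, for example via LangChain or a provider's tool-calling API:

```python
import json

# Deterministic data operations; stand-ins for real warehouse queries.
def get_first_pass_yield(line: str) -> float:
    return {"line-1": 0.97, "line-2": 0.88}[line]

def get_top_failure(line: str) -> str:
    return {"line-1": "none", "line-2": "belt tension"}[line]

TOOLS = {
    "get_first_pass_yield": get_first_pass_yield,
    "get_top_failure": get_top_failure,
}

def run_tool_call(raw_call: str):
    """Validate and execute a model-proposed tool call.

    The model only ever picks a tool name and arguments; the application
    code, not the model, produces the actual answer.
    """
    call = json.loads(raw_call)
    if call["tool"] not in TOOLS:
        # Refuse anything the model invents outside the registry.
        raise ValueError(f"unknown tool: {call['tool']}")
    return TOOLS[call["tool"]](**call["args"])
```

Because every number comes out of a tool rather than the model's sampling, the same question yields the same answer every time.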
[00:23:29] Justin Grammens: Tell me a little bit about this product, I guess, that you've been working on, and then this kind of flows into your presentation that you did around Test Bench, right? For manufacturing?
[00:23:38] Eric Lealos: Yeah. So it's got all the makings of a good AI solution, in that there's a whole bunch of data collected real fast.
[00:23:49] And if you had the time and the patience to weed through it all, you could make good decisions really fast. So essentially, in [00:24:00] discrete manufacturing, people make something that costs a couple thousand bucks, so they're gonna test it before they send it. Usually they'll use software like LabVIEW TestStand to control a bunch of instruments to measure things like temperature, speed, current, [00:24:18] to make sure that whatever it is, it's running correctly, and it collects a whole bunch of data. There may be 40 or 50 data points collected over the course of a 10- or 15-minute test. And even in really innovative companies, 10 or 15 or even 20% of the products fail the first test; they have to be adjusted somehow.
[00:24:40] Imagine if you're making lawnmowers, for example. You might have to adjust fuel flows, motors, belts, all kinds of things, and they may not be well adjusted routinely upstream. So when those things get tested, in order to make sure that they're running properly, a test [00:25:00] operator has to run it. They see a test output report.
[00:25:03] It's super technical, written by an engineer. Meanwhile, the test operator is an assembly manufacturing person. It may be their first day. So they look at a test report, and they have no idea what to do. So this is a great example of what AI can do. So what happens is, they get an error, the test operator will get a recommendation, and they'll make a fix.
[00:25:30] If they make a fix, they'll have to test it again. We have an AI interface that receives the fix and translates it. It can be in any language. It can be like, hey, I kind of kicked it, or you can say whatever you want.
[00:25:43] Justin Grammens: Sure.
[00:25:44] Eric Lealos: It'll translate it, it'll look at those things, and it'll summarize and categorize it to essentially the list of things that somebody might be able to do.
[00:25:51] There are only probably 10 things a typical operator can even do, so it all gets translated. The thing gets tested again. If it passes, [00:26:00] we've now recorded a data point: this profile of an error was fixed by this solution. And over time, we now make good recommendations to the operator. So now I can essentially have all of these different operators floating around to all these different benches, and they don't have to know as much about the products.
[00:26:21] Then secondly, we have a loop, an engineering loop, so that only approved fixes get recommended. That's super important in med device, because they can't just start doing stuff; they can only do approved fixes. So, lots of benefits.
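In miniature, that record-and-recommend loop might look something like the following. The class, error profiles, and fix names are hypothetical; the key point is that recommendations are ranked by past retest successes but filtered down to the engineer-approved list:

```python
from collections import Counter, defaultdict

class FixRecommender:
    """Recommend engineer-approved fixes, ranked by past retest success."""

    def __init__(self, approved_fixes):
        self.approved = set(approved_fixes)
        # error profile -> Counter of fixes that led to a passing retest
        self.history = defaultdict(Counter)

    def record(self, error_profile, fix, passed_retest):
        # Only a passing retest counts as evidence that the fix works.
        if passed_retest:
            self.history[error_profile][fix] += 1

    def recommend(self, error_profile, n=3):
        ranked = self.history[error_profile].most_common()
        # Unapproved folk remedies ("I kind of kicked it") never surface.
        return [fix for fix, _ in ranked if fix in self.approved][:n]
```

Every fix an operator reports feeds `record`, and the next operator who hits the same error profile gets the approved fixes that have actually worked, in order.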
[00:26:36] Justin Grammens: That's awesome. How was this done in the past? Was it just tribal knowledge that these teams would build up?
[00:26:42] Eric Lealos: Yep, that's actually the exact way we describe it. So what happens is, when a new product, a new test bench, a new line gets launched, the manufacturing cell or line gets all set up. You've got new people on there. Essentially, in the beginning, it fails a ton, right? They're getting their [00:27:00] supplies right, the vendors' components, upstream processes.
[00:27:05] And at that point in time, it is the literal drinking-from-a-fire-hose phase. Like, 50% of the products are failing, and engineers are fixing things on the fly and figuring out how to set up processes. And eventually things start to settle down, and then they don't really know. So we've seen situations where the line is running, people are manufacturing stuff, and then all of a sudden their first-pass yield will drop from 97% normally to all of a sudden it's down at 88%, and nobody has any idea. They have no visibility into what's going on. And then, generally, an engineer will come out and go, why are we at 90%? Why aren't we at 97? And they'll say, I don't know. This is happening.
[00:27:52] We're kind of looking at this. And they'll figure it out. And then it'll just be sort of an ad hoc [00:28:00] analysis phase as to what's going on. So mostly these organizations don't have a good way of aggregating all of this test data. So I always love it when people sort of reimagine, right? So originally, all of these tests were designed to pass or fail a product; that was the underlying intent of the engineer who designed the test.
[00:28:18] It's like, I wanna make sure I do X, Y, and Z. Well, and if it does those things well. I'm gonna put this thing in a box and ship it, but if you would actually change your perspective about it just slightly and say, I wanna capture a bunch of data and find out why my products are failing. You would design the test differently and you would design the flow differently, and most importantly, you would realize how important not only the data for the one test is.
[00:28:52] But the aggregate data for all of the tests, you would, you would see that all differently. You would say, Hey, what are the [00:29:00] things that affected my line yesterday, last week, last month? Because you know, I might be looking at like some minor difference in a situation where there may be like producing 500 or a thousand products a day.
[00:29:15] 10 or 11 failures almost just blend into the background. But if you are failing 10 or 11 times every day, that's 200 failures a month and 2,000 failures a year, and now we're talking something, right? Because typically an organization might use between 50 and a hundred dollars as sort of an accounting cost of a failure, and a hundred dollars on 2,000 failures
[00:29:43] is 200,000 bucks.
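Eric's back-of-the-envelope math above can be sketched in a few lines. This is just an illustration of the arithmetic from the conversation; the $100 per-failure accounting cost and the roughly 200 working days a year are the figures he quotes, not data from any specific plant.

```python
def first_pass_yield(passed, tested):
    """Fraction of units passing the test on the first attempt."""
    return passed / tested

def annual_failure_cost(failures_per_day, cost_per_failure, working_days=200):
    """Annualized accounting cost of 'background noise' test failures."""
    return failures_per_day * working_days * cost_per_failure

# A line doing ~1,000 units/day: 97% yield normally vs. a dip to 88%
print(first_pass_yield(970, 1000))   # 0.97
print(first_pass_yield(880, 1000))   # 0.88

# 10 failures/day at $100 each over ~200 working days a year
print(annual_failure_cost(10, 100))  # 200000
```

The point of making this explicit is exactly what Eric describes: 10 failures on a thousand units looks like noise on any single day, but annualized it is a six-figure cost.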
[00:29:45] Justin Grammens: Crazy. The testing thing I thought was interesting too. You know, I was talking with somebody at a medical device company recently. He's a mechanical engineer, and he was actually thinking about, okay, how would I actually test this component? And he actually used generative AI
[00:29:59] to come up [00:30:00] with a different type of test. There was a shear test that he was going to do, and he ran that through generative AI, basically just communicating with ChatGPT, their enterprise version or whatever. And it suggested a different type of test, and he stepped back and he's like, actually, yeah, that's actually a better test.
[00:30:18] So I don't know if you guys have seen that as well, just sort of bouncing ideas off of it.
[00:30:23] Eric Lelos: What's funny about this is how impactful generative AI has been on our work. It is crazy. I mean, first of all, I gave you my background, right? I've been doing data engineering for literally, we're approaching 30 years, which is crazy to think about.
[00:30:41] And through this process, just all of a sudden I went, wait a minute here. I've got this different data. I wanna store this differently and I wanna use it differently. And I'm just gonna start asking ChatGPT or Perplexity, like, what should I do? And it absolutely changed [00:31:00] the whole deal. It changed everything.
[00:31:02] So normally, and a lot of people are gonna laugh, but a lot of the data engineering that I've done in my career has been with traditional tools like PowerCenter from Informatica or DataStage. And of course we've made this migration to tools like DBT and Fivetran and Snowflake, and love those tools.
[00:31:17] They're fantastic. But initially we really just did the same old stuff that we used to do with these tools, and then all of a sudden I saw this podcast five years ago where somebody was storing a big chunk of data as a JSON document in a variant data type, and I went, oh, that's kind of nice.
[00:31:37] That's pretty cool. Now I don't have to worry about my pipelines breaking, 'cause I don't care what the shape of the data is. Just give it to me and I'll do something with it. And then all of a sudden, one day, I was like, all right, I've got this JSON document. I want to engineer this data.
[00:31:51] And I was like, ChatGPT, what should I do with this? And it spit out SQL. I was like, what is that? I've never seen that before. [00:32:00] Like, what is that stuff? And it totally changed everything. We were like, oh my God, now we've got end-to-end robust data. It's not gonna break because something is different.
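The "land the raw JSON, shape it later" pattern Eric describes (in Snowflake this would be a VARIANT column queried with SQL) can be sketched in Python. The record structure and field names here are hypothetical, just to show why an unexpected extra field no longer breaks the pipeline.

```python
import json

# Raw test records landed as-is, one JSON document per row.
# The second record has an extra field the pipeline never planned for.
raw_rows = [
    '{"unit_id": "A1", "test": "leak", "result": "pass", "psi": 14.2}',
    '{"unit_id": "A2", "test": "leak", "result": "fail", "psi": 9.8, "operator": "bay3"}',
]

def extract(row_json, fields):
    """Pull only the fields we need downstream; ignore everything else."""
    doc = json.loads(row_json)
    return {f: doc.get(f) for f in fields}

# Shape the data at query time, not at load time.
shaped = [extract(r, ["unit_id", "result"]) for r in raw_rows]
print(shaped)
# [{'unit_id': 'A1', 'result': 'pass'}, {'unit_id': 'A2', 'result': 'fail'}]
```

The design point is that ingestion never fails on schema drift: upstream can add, rename, or drop fields, and the load still succeeds, because the shaping happens downstream against whatever arrived.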
[00:32:14] Now we've got this platform. I kind of forget where you were sort of leading me with this question, but this is what I was getting at when I was saying there's this crossroads of technology, AI, generative AI. You were talking about this engineer who figured out a new way to test. This is just the beginning.
[00:32:31] This is me as a data engineer being unbelievably assisted by generative AI in my work. And then in order to deliver this end to end, we had to show really cool demos. So generative AI is not capable, at least not as far as I know, of making LabVIEW tests the way that I want. But I'd never done LabVIEW before we really got involved in this, and it walked me through everything. And all of a sudden we started building [00:33:00] LabVIEW tests that were very sophisticated, very good, and people started to look at it and go, hey,
[00:33:08] can you help us with our LabVIEW?
[00:33:10] Justin Grammens: You said it, exactly. It almost feels like you're cheating sometimes in some ways, but you still need the experts to make the call and the decision on these things. But people can ramp up on literally anything, at least it gets them started, and then they can dive deeper much more quickly.
[00:33:26] Eric Lelos: Well, especially a guy like you, with all of your software engineering and architecture background, you can just sniff out, no, you're on the wrong path, do it this way. But I would be a little bit careful. There's this balancing act between being wide open to new approaches and also recognizing when an approach is no good.
[00:33:44] There's kind of two things there. Especially when it comes to asking about architectural approaches, that's where generative AI really can be frustrating. Like, I've been through a few cycles where I started out talking about [00:34:00] how do I want this architected? How do I want the data to flow? How do I want the AI to flow?
[00:34:05] And then all of a sudden, halfway through it, I'm like, hey, wait a minute, you just changed plans here. This is no good. I can't do it like that. And it can be a little bit tricky. But as a knowledge worker, as a software person, as a technology person, generative AI is awesome.
[00:34:22] Justin Grammens: Yeah. Yep. Exactly.
Yeah. We like to call it AI-assisted engineering. So the idea is it's assisting you, it's not replacing you. Back to what you were saying at the beginning, you know, you mentioned 30 years working on this stuff. One of the questions I do like to ask people is, if you're just coming outta school today, or you're maybe getting involved in this field, where would you point people?
[00:34:42] What would you suggest that they look at?
[00:34:44] Eric Lelos: I think that I would go to an industry that I was really interested in. That's where I would try to develop my capabilities. So we've talked a lot about manufacturing today, and one of the reasons I love manufacturing is because they make [00:35:00] things, right? I kind of love going out on the shop floor and seeing
[00:35:04] people putting stuff together, building stuff, welding, whatever. I think that's a really great thing. And I think it's important for us as a country to have that capability, those skills. As a society, as a civilization, we need to be able to make things.
[00:35:19] Justin Grammens: Yes. Yeah.
[00:35:20] Eric Lelos: And that's really important, and I would encourage people to go into manufacturing 'cause
[00:35:25] it has all the other things that every other company has. It's got sales, it's got finance, it's got accounting, it's got IT, all of those things. But then they also have things that were dirt, like literally dirt two years ago, and somebody figured out how to refine that stuff and turn it into something important.
[00:35:44] Engineers designed things, built things, prototyped things, all kinds of amazing capabilities. I don't wanna sell everybody on manufacturing, but whatever it is you love to do, that's where I would go. And then I would steer myself [00:36:00] towards how I want to be involved in technology. So almost everybody has an opportunity to use data and AI.
[00:36:09] Almost everybody. You could literally be a sales guy for a music company, and you could have the ability to inject data and AI into your work and build something that has way more impact than if you had just done the job that someone hired you to do.
[00:36:28] Justin Grammens: I love it, man. Yeah. Focus on the industry or the things that you're really interested in, and the technology can be applied anywhere, right?
So a lot of people wanna start with the tech first. That's kind of what I'm hearing in that. I find myself the same thing, you know? Hey, learn how to code this language. That was the story, honestly, up until just very recently: become a really good developer and get a really good job, right? And now that's becoming completely commoditized.
[00:36:53] That's actually not where you wanna focus. You wanna understand what I can apply technology to. And like I say, maybe it's healthcare, maybe it's [00:37:00] retail, there's all sorts of different areas, and everyone is obviously drawn to those industries, whatever that might be for them. And then, yeah, bring the tech in afterwards.
[00:37:09] I love it.
[00:37:10] Eric Lelos: I wholeheartedly agree with that. Even if you wanted to go to med school, I think if you became a doctor, you would have an amazing opportunity. Like, to me, there is no doubt in my mind that medicine is gonna be delivered way differently, and it's probably before we notice, before we realize, that there's gonna be so much more AI.
[00:37:35] You know, the irony is, it's gonna be better. And the thing that's funny about that is it's not gonna actually replace doctors. So don't stay away from going to med school because of this. In fact, there's a very good chance that this is gonna increase the demand for doctors. I dunno if you've been hearing about this, but there's a really often cited [00:38:00] study or trend or whatever, but almost all x-rays are read by AI now, right?
[00:38:06] Like, almost all of them. And at one point people expected, like, yeah, we're gonna completely replace radiologists. And it turns out there's five times as many radiologists now than there were when that first started. It, like, quadrupled the demand for radiologists. So, I don't know, lots of excitement there.
[00:38:25] Justin Grammens: Yeah, for sure, for sure. Cool. I guess last question, I, you know, how can people get ahold of you? Eric, what's a good place? We'll definitely put links to all of your stuff in the show notes, but.
[00:38:35] Eric Lelos: Well, I'm definitely on LinkedIn, so look for me on LinkedIn. We've got a website, quantifiedmechanics.com. You know, I post a little bit of content, so I'm more than excited to connect with people on LinkedIn especially.
[00:38:48] Justin Grammens: Awesome. Cool. You know, you've spoken at the Applied AI events, and we're gonna be doing something with MinneAnalytics at their Data Tech event in May as well, so hopefully we can get you [00:39:00] there speaking as well. Yeah, the whole test bench thing I think is fabulous. I think it's something that
[00:39:05] that industry, manufacturing, just in general is ripe for innovation, and I think Minnesota's the right place to do it. You're right, there is a lot of manufacturers here, and especially in agriculture for sure. You could say manufacturing of food, but I've talked to a couple different agriculture companies where
[00:39:22] it's, wow, if you can just increase it just a little bit, 'cause there's so much food production that if you can save even just 5%, that can turn into millions and millions of dollars.
[00:39:34] Eric Lelos: Yeah, I wholeheartedly agree. I think Minnesota especially has an area of manufacturing around that
[00:39:42] that is really ripe for some innovation and some new opportunities. So
[00:39:47] Justin Grammens: yeah,
[00:39:47] Eric Lelos: we're super excited.
[00:39:48] Justin Grammens: Cool. Well, very good. Awesome. Eric. I appreciate the time today. This is a lot of fun. And yeah, look forward to keeping in touch with you in the future.
[00:39:55] Eric Lelos: Thanks for having me. And we've known each other a long time.
[00:39:58] Hopefully someday we're gonna come up with [00:40:00] a great project to collaborate on.
[00:40:02] Justin Grammens: We will. And maybe we'll ask AI if that's a good idea or a bad idea once we come up with it, but
[00:40:08] Eric Lelos: hilarious.
[00:40:09] Justin Grammens: We'll get there. All right, Eric. Thanks again.
[00:40:13] AI Speaker: You've listened to another episode of the Conversations on Applied AI podcast.
[00:40:18] We hope you are eager to learn more about applying artificial intelligence and deep learning within your organization. You can visit us at AppliedAI.MN to keep up to date on our events and connect with our amazing community. Please don't hesitate to reach out to Justin at Applied AI if you are interested in participating in a future episode.
[00:40:39] Thank you for listening.