The conversation this week is with Corbin Fonville. Corbin is the owner of the Minnesota Championship Series, which delivers pro-quality, free-to-compete esports tournaments to amateur players of all levels. He's also the CEO at Skillquest, where he enjoys helping gamers go from zero to a career in technology. Skillquest is a platform where high school Rocket League players learn to code. He started with a mechanical engineering degree, but established his career through online learning and building personal projects.
If you are interested in learning about how AI is being applied across multiple industries, be sure to join us at a future Applied AI Monthly meetup and help support us so we can continue putting on future Emerging Technologies North non-profit events!
Resources and Topics Mentioned in this Episode
Corbin Fonville 0:00
If it codes a bug, which it does (for example, in one of the cases it messed up the ID that I should be using for Google Cloud Platform), it is difficult to figure that out. Because, you know, as an engineer, you're writing code and you kind of know where you make an assumption. You're like, well, I'm just gonna stick that ID in here because I think it's the right one, but if not, I'll come back and check. Well, when ChatGPT writes you a bundle of code, you don't know what assumptions it made, where it was very confident, or where it was just kind of guesswork. I don't know why it introduces these bugs. It's actually very, very interesting to me that it does that, but it has cost me hours in troubleshooting.
AI Announcer 0:42
Welcome to the Conversations on Applied AI podcast, where Justin Grammens and the team at Emerging Technologies North talk with experts in the fields of artificial intelligence and deep learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real-world problems today. We hope that you find this episode educational and applicable to your industry, and connect with us to learn more about our organization at appliedai.mn. Enjoy.
Justin Grammens 1:13
Welcome, everyone, to the Conversations on Applied AI podcast. Today we're talking with Corbin Fonville. Corbin is the owner of the Minnesota Championship Series, which delivers pro-quality, free-to-compete esports tournaments to amateur players of all levels. He's also the CEO at Skillquest, where he enjoys helping gamers go from zero to a career in technology. Skillquest is a platform where high school Rocket League players learn to code. He started with a mechanical engineering degree, but established his career through online learning and building personal projects. I'm super excited to have him on today to talk about what he's been working on, including a few different personal projects. Most importantly, I really love that Corbin shares what he's learning online and gives back to the community so people can learn together with him. And I know he's working on some pretty interesting stuff right now with ChatGPT. So thank you, Corbin, for being on the program today. Appreciate it.
Corbin Fonville 2:00
Yeah, definitely. I'm happy to be here, Justin.
Justin Grammens 2:01
Awesome. Well, I talked a little bit about where you are today with these two organizations. I'm curious, and my interest was kind of piqued by this: you got a mechanical engineering degree and worked your way into technology. Maybe you could fill in some of the hops along the way that got you from college to what you're doing now?
Corbin Fonville 2:18
Well, I guess it all starts from back when I was a kid. I knew I'd be an entrepreneur; there was no question in my mind that I wanted to start a business. So I went through school and got a mechanical engineering degree, because that's what I loved doing. I loved building things, getting in there with my hands, and it offers a lot of breadth for people who are fascinated by things, so it was a good degree to get. But then I found it easier to start a business in software. So I joined a group of folks who were starting a business, and I ended up being the CEO of that project. And then somewhere along the way (it's debatable whether this was a good decision or not) I decided to learn to code to help us move the project along faster, and kind of taught myself online. I think my total coding education cost me like 60 bucks, from a couple of really high-quality Udemy courses. And then one of the other guys, who was a software engineer professionally and saw what I was building, said, hey, I'll just hire you. I was working for Polaris at the time, so I had a stable job that I enjoyed, but the benefits and everything in software were so much better that I made the jump. And now that's kind of what Skillquest is all about: helping people get into a tech career, to discover it. Because in my mind, it's a great place for interested people. The education is extremely accessible compared to a mechanical engineering degree, where you really need to go to school because they have the welders and the machines and everything you need to actually get hands-on and learn the stuff. Software is right there on the internet, at your computer, right now. You can just start learning and start building.
Justin Grammens 3:50
Totally, yeah. And that's one of the beauties, I think. It's been my experience that, well, go back to the days of Apple: computers, just a couple of guys in a garage. What's so cool about software, as I've gotten more and more into it, is that it's anybody with a laptop, anywhere around the world. It's not even just the United States, per se, right? You have people from all sorts of developing countries who, if they can even get a low-cost Raspberry Pi, can start coding, start learning this stuff, and contribute.
Corbin Fonville 4:16
Yeah, exactly. And you can really show off and get people interested just because you can build something unique, and that can hold down a whole interview conversation. I actually got to meet Steve Wozniak recently; he was in Bemidji, Minnesota. No way. Yeah. And I got to talk to him about this. His entire story is one that I want to create for people over and over again, where he was just fascinated by having fun learning tech. He got his job before he got to college; he actually went to college after founding Apple.
Justin Grammens 4:46
Funny, cool. And let me just poke at that a little bit: why was he up there? Was he up there for the school?
Corbin Fonville 4:53
Let's see. There was actually an esports competition up there, put on by the Paul Bunyan Communications group.
Justin Grammens 5:00
Very cool. Very cool. Yeah. So I know that a lot of the people listening to this are interested in AI, and one of the questions I love to ask people is, and there's no right or wrong answer on this, because I've done 70-some episodes and no one has a right or wrong answer: how do you think of AI? How would you define it if somebody were to ask you for a short elevator pitch about it?
Corbin Fonville 5:22
Yeah, I knew that you were gonna ask this question, and I actually looked up some definitions. I was like, this is the one I have to be prepared for. I actually landed on maybe a colloquial definition that people might completely disagree with. To me, AI is anything that you can look at as a human and wonder, is that a human? If it exhibits such intelligence that you might compare it to your own, I'd say that qualifies as AI.
Justin Grammens 5:46
I see, I see. Okay, cool. Kind of like a Turing test type idea: if a human can be fooled into thinking that it's really human, then it's sort of done its job.
Corbin Fonville 5:55
A little bit, but maybe even a step back from that, since it is definitely artificial. Like, if you can wonder, hey, is that as smart as me? Then it's at least artificial intelligence.
Justin Grammens 6:06
Yeah, for sure. No, it totally works. That's great. So what's a day in the life for you? It feels like you've got this passion, this drive. You basically want to build this community around helping people get into technology careers and stuff like that. It sounds like that's probably your main gig. So what does a day typically look like for somebody who's kind of a company of one, I guess?
Corbin Fonville 6:37
Yeah, I do have some partners that I work with, and other businesses. So, man, you can't really describe a typical day. I'll say that my company relates to how I got into AI very strongly, because nowadays, if you say that you're going to teach somebody how to code, in the last four months you're going to suddenly have people saying, well, doesn't ChatGPT already do that for you? So that's a little of the lead-in to why I started heavily exploring AI projects: I'm essentially concerned about the future of tech, you know, whether I should be teaching people how to code, or maybe teaching people how to use AI to code. But yeah, day in the life of an entrepreneur: they're doing marketing, they're doing partnerships, they're doing web dev, they're managing DNS records while they're worried about their email campaigns. It's really all over the place.
Justin Grammens 6:37
That's great. No, I know; I have been a serial entrepreneur myself, so you're always wearing a bunch of different hats and always feeling like there are just not enough hours in the day, right? It's like, gosh, if I only had a couple more hours, I could get this and these other things done. So I have a lot of sympathy for the work that you're doing. Why don't you tell us a little bit about the project you're working on? You talked a little bit about Skillquest, but how specifically are you now looking at ChatGPT and AI within your business?
Corbin Fonville 7:53
Justin Grammens 9:40
Yeah. So you had this idea, and you're kind of taking in the technology. You could code this up yourself, I'm pretty certain of that, but you're sort of taking this "hey, I don't know what I'm doing" approach and trying to see if you could feed ChatGPT what it needs in order to build it for you.
Corbin Fonville 9:55
Yep. I picked the architecture, or the tech stack at least, but I made it the architect. I'm trying to make it solve all the problems, which has proved pretty difficult in many situations.
Justin Grammens 10:06
What are some examples?
Corbin Fonville 10:08
Yeah. So, for example, if it codes a bug, which it does. Like, in one of the cases, it messed up the ID that I should be using for Google Cloud Platform, and it is difficult to figure that out. Because, you know, as an engineer, you're writing code and you kind of know where you make an assumption. You're like, well, I'm just gonna stick that ID in here because I think it's the right one, but if not, I'll come back and check. Well, when ChatGPT writes you a bundle of code, you don't know what assumptions it made, where it was very confident, or where it was just kind of guesswork. Yeah. I don't know why it introduces these bugs. It's actually very, very interesting to me that it does that, but it has cost me hours in troubleshooting.
Justin Grammens 10:50
That's very interesting. So are you doing any code review on that code? Or do you literally just take it verbatim and slap it in there?
Corbin Fonville 10:58
I think it depends. Okay, I think it's easier to test it than it is to code review it, to some degree. I'll give you an example. I had my backend set up; I built it primarily on the Serverless Framework and AWS. And I wanted to test whether or not this endpoint that I had would actually take an audio file, feed it to AWS Transcribe, and spit out the transcription into an S3 bucket. I was trying to get it all worked into a Postman call so that I could manually test it. Well, it turned out to just be easier to ask ChatGPT to whip up a web interface that would allow me to record my voice and post it to the endpoint. And so, code like that, I don't have to code review. In terms of it being an MVP, you just kind of open it up and say, does it work?
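As a rough illustration of the endpoint Corbin describes (audio in, AWS Transcribe job out, result in S3), here is a minimal sketch of the parameter-building step. The bucket, key, and function names are illustrative assumptions, not his actual code; the SDK call shown in the comment is the standard AWS SDK v3 way to start a transcription job.

```javascript
// Build the parameters for an AWS Transcribe job from an uploaded S3 object.
// Bucket and key values are placeholders; media format is assumed to be WAV.
const buildTranscriptionParams = (bucket, key) => ({
  // Job names must be unique and use a restricted character set.
  TranscriptionJobName: `transcribe-${key.replace(/[^a-zA-Z0-9._-]/g, "-")}`,
  Media: { MediaFileUri: `s3://${bucket}/${key}` },
  MediaFormat: "wav",
  LanguageCode: "en-US",
  OutputBucketName: bucket, // Transcribe writes the JSON transcript back to S3
});

// In a real serverless handler, you would pass these params to the SDK, e.g.:
//   const { TranscribeClient, StartTranscriptionJobCommand } =
//     require("@aws-sdk/client-transcribe");
//   await new TranscribeClient({}).send(new StartTranscriptionJobCommand(params));
```

The job-name sanitization matters because Transcribe rejects names with characters outside letters, digits, dots, underscores, and hyphens.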
Justin Grammens 11:45
Yeah, sure, sure. Your point about the bug, I think, was interesting. Could you ask it to write unit tests? I've seen this, and I've done this just playing around: like, write a unit test for this function, right? Did you do some of that stuff at all, too?
Corbin Fonville 11:58
I have not actually played around with unit testing in this project; I have in other projects, and I think it's relatively good at that. That particular example comes from trying to hook up Google Analytics, where it was telling me to go to the web console and put in the property ID, but it was actually telling me to get, you know, another ID off the web console. So it was just something I wouldn't have known was wrong; I was just following its instructions. Actually, and this was one of my most profound findings: it expects you to follow its instructions to a T. And if you don't follow its instructions, it gets very confused, because it has no reason to believe that you wouldn't have done exactly what was in the chat log. Oh, sure, sure. And so if a bug is accidentally introduced because you didn't follow instructions exactly right, it will never catch that. You have to go back yourself and find it.
Justin Grammens 12:54
Gotcha. Sure, sure. So, when I've seen ChatGPT write code, people are like, this thing's amazing: write me a function that does this, and, right, we have a function that does that. And I may have even shared this with you when we first met, I think out at the thinking spot: I've always thought that ChatGPT is very good at these little, mundane things, you know, just little pieces of software. Where, in my mind, a true software engineer can actually take a look at the entire picture, can understand how all these modules are going to work together, can make it scalable, can make it flexible for the future. I mean, these are things that I've learned over my 25 years of coding. In my first couple of years, I knew Java inside and out, I could write a Java function, but it takes decades oftentimes to realize, oh, okay, here's where we use inheritance, here's where you want to use a composite framework, here's where you want to use singletons: all those design patterns. So I find it interesting. You mentioned it doing the architecture, and maybe that's the wrong word; you're talking about the platform, AWS, what it's going to be on. But have you seen any hints of it being smart about the code that it's writing? Or is it still just really good at performing little mundane tasks?
Corbin Fonville 14:06
Yeah, I think, at this point in the project, I would agree with your sentiment that it is very good at writing a function that does a thing, or writing the unit tests for that function. Which, you know, to my friend Mir (sorry, Amir, if you ever see this), who's not a software engineer, writing that type of code probably feels like a new power. Whereas for most software engineers, that's the boring stuff: you know how to do it, it just takes time click-clacking on the keyboard. Versus broader topics, which require a lot of context, like how to hook up a frontend to an API that is expected to operate in a certain way. It's difficult to describe all of that to ChatGPT and get an impactful answer. I think that takes much more time than just making your next move in the design of a system. Software engineers are more experienced at the orchestration, I would say, versus the playing of a single part. ChatGPT is great at being the lead fiddle, where you need to play this piece correctly, right? And then the software engineer still needs to be there to orchestrate everything to work together.
Justin Grammens 15:19
Yeah, cool. Well, I think I've seen you post maybe three or four LinkedIn posts so far, right? Right now, four. And I'm not sure how many you plan to do; kind of go until it ends, I guess?
Corbin Fonville 15:33
Yeah, my original goal for this project was to get this first project done in four to five posts. I'm doing 30-minute segments, so I thought it would take two and a half hours; we're well over the four-hour mark right now. Which is where I'm beginning to think, man, if I had just set out to code this myself, I probably would have been done in three. I wanted it to be at least as fast as what I would have done. But some of the additional time it's cost me might be learning curve, to be fair, and might be the fact that I'm recording it as content. I'm recording the whole thing as a YouTube video, or as a video where I explain verbally what I'm doing as I go along. That might be costing me time. But yeah, I was hoping it'd be quicker.
Justin Grammens 16:17
Yeah, so the jury's still out on whether it's faster or not, but it's feeling like it's not going to be faster than if you were to code it yourself. But again, you have the technical knowledge to be able to do it. If this were somebody who didn't know anything at all, it would be a huge time saver, I guess.
Corbin Fonville 16:32
Honestly, I don't think that somebody coming in with no software engineering knowledge could have done it. There are just some things that it missed about the assumptions of the Serverless Framework that I had to go in and catch right off the bat. Like, if you wanted to build a service and actually access AWS, you would have to at least get your credentials file set up. And just starting this project, it had no idea that it needed to tell Serverless to access the credentials file, or where to access the credentials file, which I think would have just derailed somebody in the first 10 minutes.
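For readers unfamiliar with the setup step Corbin is describing, this is roughly what it looks like. The service name and profile are illustrative; the shape of the credentials file and the `provider.profile` key are standard AWS and Serverless Framework conventions.

```yaml
# serverless.yml: telling the Serverless Framework which AWS credentials to use.
# The profile name "default" matches a section in ~/.aws/credentials, which
# looks like:
#
#   [default]
#   aws_access_key_id = YOUR_KEY_ID
#   aws_secret_access_key = YOUR_SECRET_KEY
#
service: transcription-demo   # service name is an illustrative placeholder
provider:
  name: aws
  runtime: nodejs18.x
  region: us-east-1
  profile: default            # which credentials profile Serverless should load
```

Without the credentials file in place, every `serverless deploy` fails with an authentication error, which is the kind of thing that can stall a newcomer immediately.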
Justin Grammens 17:09
They would have been pulling their hair out and said, ah, forget it, you know? Yeah, yeah, for sure. Or else they would have found other methods, I guess. I mean, I'm just wondering, you know, people talk about Stack Overflow coding, right? This is just cutting and pasting. Kind of before ChatGPT was here, everyone went on Stack Overflow and just tried stuff.
Corbin Fonville 17:27
Yeah, it'll probably be easier, if people are fresh to engineering, or fresh to building, I'll say, to use services that are already pre-baked, like otter.ai for voice transcription stuff, than it will be to access a framework for a cloud architecture like AWS. So maybe I positioned myself a little bit in a place that would be more difficult for a newbie to work in. Yeah, sure.
Justin Grammens 17:53
Sure. Well, one of the talks at the Applied AI Conference was about AI at the edge; not sure if you had a chance to see that one. But it was about sort of found-object engineering: how can you find these things that are out there? The example was with a company called IoT on a Stick, and I've done a little bit of work with these guys. The idea was, hey, let's put together a model that can do license plate recognition, and it was more of a security-type camera application. And at the end of the day, it was like, there are services out there that do this: you just throw it the picture and it gives you back what you need. And actually, these models can run on-device. So it is one of these things where I think you're right: how much do you want to stand on the shoulders of giants? Because a lot of this stuff has already been built, and you can just kind of string stuff together, which I know you're kind of an advocate for as well, with regard to some of these low-code, no-code frameworks.
Corbin Fonville 18:41
Definitely. So we haven't actually touched on no-code. I've been building on no-code for a few years now. Oh man, it feels like a few years; it might just be two years. But I think the pace of software development with AI should outstrip what you can do in low-code here in the next year or so. Which I'm a little bit sad to see, because I have invested a decent amount of time in low-code. But I think some of the problems that I've discovered through working on this project (which I actually do have a list of, by the way; we've been going through it in kind of an unstructured way) are going to be solved. For example, Auto-GPT is a pretty cool project. It is not near as cool as all the hype it gets, because everybody's like, Auto-GPT has ended ChatGPT, although then there are people who actually use Auto-GPT who are like, has anybody got it to work yet? Right. But I think the kind of self-critiquing or self-prompting, where the AI will be asked a question by a human and then come up with a response, but then another AI will be asked to critique that response, you know, rate it for risks, provide some additional critiques, and then either a human or an AI can make a decision: do we want to go address these concerns now, or do we want to take the path the first prompt suggested? I think that type of model eventually will win out. A lot of what we are talking about in terms of any engineering replacement, I think, will be done in a model like that, where it's multiple prompting, and maybe there are checkpoints along the way where an engineer, or somebody who's building, can check in and kind of guide the model.
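The critique loop Corbin describes can be sketched in a few lines. Here `askModel` is a stub standing in for a real LLM call (such as the ChatGPT API), so the control flow is visible and deterministic; the prompts, the risk threshold, and the round limit are all illustrative assumptions.

```javascript
// Stubbed "model": answers questions, critiques drafts, and revises when asked.
// A real implementation would replace this with an LLM API call.
const askModel = (prompt) => {
  if (prompt.startsWith("CRITIQUE:")) {
    // The critic rates the draft for risks.
    return prompt.includes("Revised")
      ? { text: "No major risks.", riskScore: 1 }
      : { text: "Looks risky: hardcoded credentials.", riskScore: 7 };
  }
  return prompt.includes("Address this concern")
    ? { text: "Revised draft without hardcoded credentials.", riskScore: 0 }
    : { text: "First draft with hardcoded credentials.", riskScore: 0 };
};

// Answer, critique, and revise until the critic's risk score is acceptable.
// Each iteration is a natural checkpoint where a human could step in instead.
const answerWithCritique = (question, maxRounds = 3) => {
  let draft = askModel(question);
  for (let round = 0; round < maxRounds; round++) {
    const critique = askModel(`CRITIQUE: ${draft.text}`);
    if (critique.riskScore < 5) break; // low risk: accept the draft
    draft = askModel(`${question}\nAddress this concern: ${critique.text}`);
  }
  return draft.text;
};
```

The first draft gets flagged, the revision passes the critic, and the loop exits; that generate-critique-revise shape is the core of Auto-GPT-style self-prompting.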
Justin Grammens 20:23
Yeah, interesting. I mean, have you looked at stuff like Copilot, or some of these other sort of code-generation tools that are built into VS Code?
Corbin Fonville 20:33
I have used Merlin; I haven't used Copilot yet, which I aim to in a coming episode. Yeah, I'm also wanting to play with Bard. Merlin is one that integrates straight into VS Code, and you can highlight text so it has that as context, and then you can ask it to do stuff with that. And that seems pretty effective. But really, context management has been the largest challenge of this entire project, because you can't feed it your entire codebase at one time and then tell it to work on that. I think that stops it from being able to work at a much higher level. If somebody can solve the context, and, you know, I know there are some long-term memory solutions out there using vector databases, if somebody can truly solve that, for an interface that's right in front of you and works with your entire project, I think that will be the time at which AI gets actually faster than humans at building.
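The vector-database idea mentioned above can be illustrated with a toy example: store past chunks of code or chat as embedding vectors, then retrieve the most similar chunk for a new query using cosine similarity. Real systems use learned embeddings with hundreds of dimensions; the 3-d vectors and chunk texts here are made up for illustration.

```javascript
// Cosine similarity between two equal-length vectors.
const cosine = (a, b) => {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (v) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
};

// Toy "long-term memory": chunks of a past project, each with an embedding.
const memory = [
  { text: "handler that posts audio to the endpoint", vec: [0.9, 0.1, 0.0] },
  { text: "serverless.yml profile configuration",     vec: [0.1, 0.9, 0.1] },
  { text: "npm package download-count check",         vec: [0.0, 0.2, 0.9] },
];

// Retrieve the stored chunk whose embedding is closest to the query embedding.
const recall = (queryVec) =>
  memory.reduce((best, item) =>
    cosine(item.vec, queryVec) > cosine(best.vec, queryVec) ? item : best);
```

A tool using this pattern would embed the user's new prompt, `recall` the most relevant old code, and paste only that chunk into the model's limited context window.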
Justin Grammens 21:29
Sure. What do you feel is going to happen now to students coming out of school with a computer science degree? I mean, how do you think things are changing, right?
Corbin Fonville 21:39
Yeah. Well, for one, I am not of the mindset that software engineering is dead. I think that people who are trained builders are going to be more empowered with AI tools than people who are not trained builders. And you have many of the same fundamental concepts of building, like loops, like singletons or inheritance, that are kind of built into your education as a software engineer, that other people don't come to the table with. So in the future of building, I think software engineers will be building with AI, rather than just everybody, you know, building applications with AI. But a lot of their education will be outdated. Anything that's teaching syntax, I think that's almost already dead. There's a great YouTube channel called Fireship, and he was talking about how you can ask ChatGPT to create a pseudocode for you to write in, and it'll come back and tell you a syntax that you can use. Then you're basically just writing a YAML file, but it will architect you a React page or whatever you want. It's a very interesting concept that I think will be used much more in the future. I mean, who wants to worry about curly braces and whether they're closed, or single versus double quotes, anyway? That stuff was always boring to software engineers. Let's get past that, and I think AI helps us get past that.
Justin Grammens 23:07
Yeah, for sure. You know, it's one of the things that I think is so cool about AI: I think we can use it to our advantage to do the mundane things that humans either are not good at, right? Like trying to find the curly brace that's missing; that's a frustrating thing for even a human to do. And then, you know, things that they don't want to do, some of the smaller mundane tasks. That's the beauty. Well, first of all, this entire podcast is transcribed, and we also have liner notes and stuff like that, so I'll make sure to put links into the liner notes for the YouTube channel and the other stuff that you've been mentioning, for sure. And so, were there a couple of things that you had as, like, a crib sheet, I guess, for us to make sure that we covered around your project?
Corbin Fonville 23:50
Yeah. So one of the dangers right now, and I think this could be solved in the future with building with AI, is that it is going to pick open-source packages to build with that are untrustworthy. It picked an open-source package, I don't remember the name of it, but it had like 49 weekly downloads. I looked at that first, and I was like, we're not going to use that. I asked it to pick another package, and lo and behold, there is another package for parsing incoming audio from an API that has the expected 30 million weekly downloads. And I'm like, okay, well, you picked the wrong one right off the bat. You have no idea what's going to happen to this one person's package that they randomly open-sourced. So, sure, I think it builds on fragile code.
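The sanity check Corbin applied by hand can be automated. npm publishes weekly download counts at `https://api.npmjs.org/downloads/point/last-week/<package>`, so a script can flag suspiciously unpopular suggestions before they land in a project. The threshold below is an arbitrary illustrative choice, not an official rule of thumb.

```javascript
// Flag packages with too few weekly downloads to trust blindly.
// 10,000 is an illustrative cutoff; pick whatever fits your risk tolerance.
const MIN_WEEKLY_DOWNLOADS = 10000;

const looksTrustworthy = (weeklyDownloads) =>
  weeklyDownloads >= MIN_WEEKLY_DOWNLOADS;

// To check a real package (Node 18+ ships a global fetch):
//   const res = await fetch(
//     "https://api.npmjs.org/downloads/point/last-week/express");
//   const { downloads } = await res.json();
//   console.log(looksTrustworthy(downloads));
```

Run against the two packages in Corbin's story, the 49-download package fails and the 30-million-download one passes, which is exactly the judgment he made manually.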
Justin Grammens 24:39
Good point. Yeah. There was one thing that I remember playing around with, and this was back last fall when it first came out. This was in an IoT machine learning class that I teach at the University of St. Thomas, and one of the students was asking about essentially doing sort of a bubble-sort type thing. I'm like, well, run it through ChatGPT. And what it did was it imported this phantom package. Like, I actually tried to go out and find the package, and I'm like, it's not there; it just doesn't exist, you know. And this was a C library, and I'm looking around going, like, no, you know. So it was one of these hallucinations, I guess. It was in the early days; this was before even 3.5 Turbo, I think it was just GPT-3 or whatever. But there's probably something like that still, it feels to me.
Corbin Fonville 25:25
You ran into it too, did you? I actually wasn't even gonna bring that up, because I thought that was maybe just an edge case that I ran into, but it completely made up this package. And I was like, could you tell me the docs on this? Because I couldn't find them anywhere. And it spat out documentation as if this thing actually existed.
Justin Grammens 25:41
Yes, yes. Yeah. So your mileage may vary, as they say, with that stuff.
Corbin Fonville 25:46
Yeah. And then, obviously, it runs a high risk of giving you outdated code, which is not a problem that is unique to AI; obviously, that's something that is easy to run into as a software engineer. And it's one of the primary reasons that I would recommend that people who are learning tech actually get a good instructor. Because it's so easy to get mixed up in the fact that one Stack Overflow post was posted five years ago and the other was posted a year ago, and you're trying to use the two together, and it just doesn't work. I think that can derail a lot of people on their building or learning track. But it definitely derails ChatGPT as well, because I want it to be using the most recent package, not the original alpha version, which is what it tried to use for one of the NPM packages I imported. The other thing, and this is about context again, but I think it's slightly different: I've been doing this all in a single chat, trying to just keep it all in one place, and I can't tell where the context runs out. Like, where it essentially forgets what we've built together before. It doesn't tell me. In fact, I ran into one point where I had been building for a while, and I asked it for something new that was based on code we'd written together before. And it gave me code, but it made assumptions about the code that we'd written together before which were clearly incorrect: it was using a different NPM package, and the interface between the functions was different. And I was like, okay, well, we must have run out of context here. Sure, sure. But later on, it actually referenced the correct code again, unexpectedly, in a different place. So I really don't know; this is something maybe you, or somebody out there, could answer. Is the context linear? Does it go, oh, we've hit 8,000 tokens, or 32,000 tokens now, we're out of that old stuff? Or does it retain certain tokens more? Because that's definitely what I seem to have run into.
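Corbin's "is the context linear?" question can be made concrete with the simplest possible model: the conversation fits in a fixed token window, and once it fills, the oldest text falls out first. The chars-per-token ratio (roughly 4 characters per token for English) is a common rule of thumb, not an exact tokenizer, and real chat products may summarize or prioritize rather than truncate linearly, which could explain the behavior he saw.

```javascript
// Crude token estimate: about 4 characters per token for English text.
const approxTokens = (text) => Math.ceil(text.length / 4);

// Keep only the most recent messages that fit in the window, dropping the
// oldest first. This is the naive linear model of context loss.
const fitToWindow = (messages, windowTokens) => {
  const kept = [];
  let used = 0;
  for (let i = messages.length - 1; i >= 0; i--) {
    const t = approxTokens(messages[i]);
    if (used + t > windowTokens) break; // oldest messages fall out here
    kept.unshift(messages[i]);
    used += t;
  }
  return kept;
};
```

Under this model, forgotten code could never "come back" later; the fact that it did for Corbin suggests the product is doing something smarter than a pure sliding window.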
Justin Grammens 28:01
Interesting. I don't have an answer for that one in particular. I mean, I do know that it is stateless. So, like, with ChatGPT, if you interface with the API directly, which is what I found when I first started playing with it, there are ideas that come back and I'm like, oh, interesting, I could probably use this in a continuation. But nope, you can't use it; you need to basically restate what you stated prior to it. When you're using the web interface, behind the scenes it is doing that: it's actually repeating in what you said prior, to keep it in line. But I think it's interesting. I don't have an answer with regard to how it forgot and then came back; I don't know how the internals of that are working. But something cued it to get back in line with the context that it had originally. Yeah.
Corbin Fonville 28:46
And to be clear, I was only using the web interface here. Nothing with the API quite yet.
Justin Grammens 28:52
Yeah, there would be no reason, in your particular case, to screw around with the API. I mean, the API is cool; you can basically get back a bunch of JSON. It's probably the way that a lot more people are building stuff on top of ChatGPT, which is what I'm working on for a client right now. But yeah, as the end user, absolutely just use the web interface right now, because it does a lot of that stuff automatically: it sets the temperature, you know, with regard to how much randomness it puts into the responses and stuff. So, I wanted to ask about the YouTube videos. Right now you're posting your stuff on LinkedIn, but there's no, as far as I saw, blog post I can go to or video I can watch, right? You're sort of cataloging this along the way.
Corbin Fonville 29:30
Yeah. So the whole project was spun out because I basically figured out that I had about 30 minutes more a day that I could use. In terms of a blog post or an email list, those are on my future horizons, but I need to get efficient enough at this. So what I do right now is I record the entire 30-minute build session, and then I actually upload that to AWS Transcribe to get the transcript of it. And then I've handed that over to my brother. He's been my partner kind of behind the scenes so far in this project; he's the marketing brains. And he has been feeding the transcripts to ChatGPT to clean them up into, you know, more direct transcripts, remove the ums and ahhs, and just build clarity into the process more. But then he also uses those to spin out these posts, where the format is kind of evolving, and then he just hands me back the text posts. And eventually, the place that I want to get to, and we'll get past project one, we'll be building on project two or three at this point, is where I can just do a 30-minute record session. I can upload that video to YouTube, but also, behind the scenes, some process that I build will download that video from YouTube, do the transcript, do the cleaning, and create the blog post, the LinkedIn posts, and the Twitter posts all at one time, just so that it's all very seamless and it doesn't take a bunch of my time.
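The cleanup step in Corbin's pipeline, stripping the ums and ahhs before an LLM polishes the text, can be sketched mechanically. This is a toy regex pass, not the actual workflow (which hands AWS Transcribe output to ChatGPT); the filler list and transcript line are illustrative.

```python
import re

# Fillers to drop, along with any comma that brackets them.
FILLERS = re.compile(r"(?:,\s*)?\b(?:um+|uh+|ah+|you know)\b,?",
                     flags=re.IGNORECASE)

def clean_transcript(text):
    """Remove filler words and tidy the whitespace left behind."""
    cleaned = FILLERS.sub("", text)
    return re.sub(r"\s{2,}", " ", cleaned).strip()

raw = "So, um, I uploaded the, uh, video and, you know, got the transcript."
print(clean_transcript(raw))
# → So I uploaded the video and got the transcript.
```

An LLM can go further than this, rewriting fragments into direct sentences, but the mechanical pass shows why word-level transcripts are such a convenient starting point.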
Justin Grammens 31:04
Yeah, yeah, for sure. I mean, these are ideas I've been kicking around in my head too. I love where you're going with this, because if you can build that, that's probably a product that a lot of content creators would pay some amount per month for, to automate all that stuff. I have an AI newsletter that I generate, and I've been figuring out ways in which I can summarize a lot of the articles that I've read. But even things like this podcast, you know, there's a fair amount of manual work that we go through: a human takes out the hums and ahhs, and they actually add in the intro segments, the outros. There are some human pieces in there that in some ways need to happen, because I need to take a look at it, maybe getting more in depth than what you're thinking. Like, you know, I listen to the entire recording of any conversation I have with somebody on the program, but then I like to pull out a nice 30-second segment of kind of what I like to call the gold. You know, it's like what the person talked about, a nice general summary. I haven't figured out a way for an AI to be like, hey, between one minute and 32 seconds and one minute, 52 seconds, here are the pieces you need. Yes, there's still some of that stuff. But there is a huge opportunity for any content creator to, yeah, eliminate a lot of their time doing this stuff.
Corbin Fonville 32:16
Well, it's funny you should mention that, because there was actually a side project I did that I missed recording, or I didn't think it would be so results-producing, so I didn't record it. But I actually used ChatGPT to build an app that would automatically cut a video based on the transcript delivered from AWS Transcribe. Interesting. So I think I'm like 25% of the way to what you were describing, and maybe we can collab on that at some point. Yeah, for sure.
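The transcript-driven cutter Corbin mentions boils down to timestamp arithmetic. This sketch assumes a simplified word list; the real AWS Transcribe JSON nests its start/end times differently, and the resulting keep-spans would be handed to a tool like ffmpeg to do the actual cutting. All names here are illustrative.

```python
def keep_segments(words, drop, total):
    """words: [(start, end, text)] with times in seconds;
    drop: set of words to cut out; total: video length.
    Returns [(start, end)] spans of the video to keep."""
    cut = sorted((s, e) for s, e, t in words if t.lower() in drop)
    segments, cursor = [], 0.0
    for s, e in cut:
        if s > cursor:                 # keep everything up to this cut
            segments.append((cursor, s))
        cursor = max(cursor, e)        # skip past the cut region
    if cursor < total:
        segments.append((cursor, total))
    return segments

words = [(0.0, 0.4, "so"), (0.5, 0.9, "um"), (1.0, 1.6, "here"),
         (1.7, 2.1, "uh"), (2.2, 3.0, "goes")]
print(keep_segments(words, {"um", "uh"}, 3.0))
# → [(0.0, 0.5), (0.9, 1.7), (2.1, 3.0)]
```

Merging and trimming segments like this is the core of the app; everything else is plumbing between the transcription service and the video editor.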
Justin Grammens 32:47
Yeah. I mean, all the pieces are probably there. It's like: listen to this audio file. Well, don't listen to the audio file, look at this transcript. Summarize it in general, and then find in this, you know, what are we at, 36 minutes now? In this 36-minute conversation, what was talked about. But yeah,
Corbin Fonville 33:04
pull the exciting bits out and put that at the beginning. And then, yeah,
Justin Grammens 33:07
yep, and then roll the intro music that we have and all that sort of stuff. It's definitely going to happen; it's just a matter of when, you know. But again, somebody needs to orchestrate that. That's where I still think humans need to, again, explain what the AI should be doing. My music and my intro are different than somebody else's who has a different podcast, like, you know, the people that I have on the program, what they're going to talk about. I still ultimately control all that, right? I still ultimately control the conversations. So
Corbin Fonville 33:35
let me ask you a question, as a guy who is staying up to date on the space and even using it to do some generation for your content and your AI newsletter. What do you think will be the impact on trust in content, for the humans who are consuming it? Like, do you think that people are going to get tired of the fact that there's just going to be 100 or 1,000 times more content to consume, and we're going to learn to love a certain set of content? What's going to be the differentiator here over the next couple of years?
Justin Grammens 34:11
Yeah, good question. Well, there are a couple of things in there. Number one, when you said trust, I was even thinking about, like, who do you trust anymore? Right? I mean, back in the day, a newspaper would show up on your doorstep, and that was the source of truth, right? And then, you know, the Internet came and people were publishing, and now everyone's publishing, and AI can publish more than any human possibly could. So I think there's going to be a huge problem around, well, where is your source of information, and is it actually real? You know, one of the things that I talked about at the Applied AI conference during the kickoff was, like, there's a deepfake of Joe Biden rapping, you know what I mean? It's so easy for people to now generate video and audio and stories around stuff that I'm suspect of. I mean, my wife was like, did you see this thing on Facebook? Did you see what they're doing to these animals somewhere? And I'm like, I don't know if I believe that, right? It's just somebody else out there doing stuff. So I think in some ways, every person is going to need to have an AI to take all that content and essentially filter it. It's almost going to be like people are going to be throwing crap into the ether, but we're going to need a filter to basically come down and be able to condense and find out what's true and what's not, because I don't have time to fact-check everything. So there's probably going to be an arms race; it's just going to continue to be leveling up around the level of crap, people putting out false and misleading information, and then everyone actually having some sort of an assistant. And that's where I think assistants are just going to be common practice.
Like, I'm just going to wake up in the morning, and I'm going to have the assistant tell me what I want to hear, I guess, you know, the content that I want to hear, but also, what does my day look like? What should I be eating this week? There are just a lot of things that I think we can be using these voice-activated assistants or digital assistants for in the future. One of them is around content; the other one is just, in general, how we're going to live our lives. So I'm not sure if I answered your question, but I definitely think it's scary with regards to all the content and the fake content. I think people are going to have to watch out and be wary of it.
Corbin Fonville 36:09
Yeah, well, I mean, I think you brought up a novel idea that I hadn't considered so much, that you need some filter that will probably also be AI-powered, because you just need something that can match the power of this rampant content creation. I mean, you and I right now, we are on this podcast on a video with each other. We know this is real. Hopefully this is not an AI Justin I'm talking to, a big deepfake live-stream Justin. I think I have maybe tended toward video content with people's faces on it here over the last few weeks to a couple of months. I think I value people who are talking, because I know that they're actually talking, versus, like, a lot of the newsletters, even the ones that I used to enjoy quite a bit. I just start wondering, like, did anybody write this? Is this something that is as bad at operating as AutoGPT, that just spun this text out? And now I'm, you know, attributing it importance as if the actual creator, who I do respect, had written it?
Justin Grammens 37:14
Yeah, it's interesting. I don't know where that's going to end, because AI is just going to get better and better. I still believe, well, I mean, I do tell people that I feel like the more AI-generated content there is, the more it's going to get sucked into the model, and it's just going to generate sort of this bland stuff over and over again. It'll be interesting to see, but potentially humans, as long as you can be creative and write things that are different, that are outside of the norm. You know, ChatGPT always has a way to bullet-list everything out, right? It just has this format that it does. And if you can write out of the norm, you might actually do quite well in the future. You know, it's not the same rinse-and-repeat type stuff over and over again. And there was an article, this goes back probably a good four or five months, by a guy at either The New Yorker or the Washington Post, I forget which publication it was, and it made all the rounds and then kind of just went away. But he was talking about, and I'm probably going to get this wrong, but he was talking about how ChatGPT is really like a compression algorithm. It was a unique spin on basically how it's taken all this text and really sort of compressed it, and he compared it to an MP3 or a JPEG and stuff like that, where you're not seeing all of the information. So the details of the article aren't really germane to what we're talking about; what was interesting to me was, I was like, dude, that was a unique take on this whole thing, like that came out of your brain. I was like, this is just a totally different way of spinning the whole conversation; it was not generated. And I was like, I've got to keep reading more of this guy's stuff, because I can tell that it's very unique and very creative and very different.
And he's the kind of author I like to read.
Corbin Fonville 38:52
But it is interesting. It's also almost foundational to how AI is built, at least for image generation. I know that any image that's generated typically starts with Gaussian noise, just a bunch of gray, random noise. And then it kind of tries to figure out what should go in the places based on the prompt, and the model basically refines this noise into a picture, which is how you make sure that the same image isn't generated all of the time. Yeah. So it's like pure randomness, given a prompt, filtered into something that you want.
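The denoising idea Corbin describes can be sketched as a toy loop. This is not a real diffusion model (those use a trained neural network to predict and subtract noise at each step); the "denoiser" here is just a blend toward a hypothetical target vector, which is enough to show the start-from-noise, refine-step-by-step shape.

```python
import random

def generate(target, steps=50, seed=None):
    """Start from pure Gaussian noise and refine it step by step
    toward `target` (a stand-in for what the prompt asks for)."""
    rng = random.Random(seed)
    x = [rng.gauss(0, 1) for _ in target]   # step 0: one noise value per "pixel"
    for t in range(steps):
        strength = (t + 1) / steps          # denoise harder as steps progress
        x = [(1 - strength) * xi + strength * ti
             for xi, ti in zip(x, target)]
    return x

target = [0.2, 0.8, 0.5]                    # hypothetical "prompt" values
out = generate(target, seed=1)
print([round(v, 3) for v in out])           # → [0.2, 0.8, 0.5]
```

In a real model the target is never known in advance; the network guesses the noise from the current image and the prompt embedding, which is why different starting noise yields different final images.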
Justin Grammens 39:30
Yeah, yeah. Really, really cool ideas.
Corbin Fonville 39:33
Well, yeah. So
Justin Grammens 39:34
was there anything else that you wanted to sort of chat about, I guess, or other things related to AI, maybe, that we didn't touch on?
Corbin Fonville 39:41
man. Well, I guess we didn't exactly conclude as we wandered through everything. I think my project's conclusion right now is that engineers are still absolutely necessary. You won't see AI really replacing good engineers for at least a couple of years, and at the point where AI does begin to replace engineers, you will see those engineers just moving into different jobs. It's just a different tool. Humanity has never run out of things to build. Every time we've built a better tool, we've found more things to build with it. So I don't think that engineering is dead. And if anyone who is not a software engineer, who has not had, you know, that discipline, wants to go up against a software engineer building with AI, I'd be happy to take that challenge.
Justin Grammens 40:34
Oh, good, good. Well, one of the things you did mention was you've got to find a good instructor, right? And I think that sort of ties back to some of the stuff that you're doing at Skillquest, making sure that people are actually, I guess, still learning the underpinnings and the tools of what they do, not just blindly taking this information on. Maybe we could talk just another minute or two about this. But Skillquest, I mean, is there a cohort of people that go through this, where people can lean on each other?
Corbin Fonville 41:02
Yeah, so we have a couple of models. It's all e-learning, so it's always available. But what I like to try and do is spin everything into a cohort-based design competition. So for example, we have a design competition running right now that is around programming a bot that will play the game Rocket League. And Rocket League, for anyone who doesn't know, is sort of a game where cars play soccer against each other. So it's a physics-based environment, very 3D, lots of variability in what can happen in the game. It's sort of a perfect programming challenge, along the lines of something like DeepRacer. And so students will program different bots to play against each other, and then it will end in a competition with a cash prize this August; the cash prize is $2,000. And all you have to do to get into the competition is buy and do the course and then enter your bot. So, you know, you have to learn something to get the chance of earning. But along the way, we have weekly scrims, and students, you know, support each other. It's all in Discord right now, where, if a question is asked, it usually gets answered within a few minutes, because people are pretty involved in watching the chats and seeing what's being built.
Justin Grammens 42:20
That's fabulous, that's fabulous. Yeah. So how do people get a hold of you, Corbin?
Corbin Fonville 42:24
If they want to get a hold of me to talk about an opportunity, they can probably reach out to me on LinkedIn; just send me a connection request and a message that is not automated. But if they're interested in checking out Skillquest and what we offer there, they can go to skillquest.io. And I'm also on Twitter, if that's more comfortable for people, just at Corbin Fonville.
Justin Grammens 42:45
Gotcha. Cool, cool. Well, it's been a lot of fun, Corbin. I appreciate you taking the time. I know you've got a lot of different things going on, and this competition at Skillquest sounds really, really cool. Hopefully some people join. Can they join any time somebody hears this podcast, or is it kind of...?
Corbin Fonville 43:03
Yep, they can join any time. And if they miss this one, they can be a part of the next one.
Justin Grammens 43:06
Got it, got it. Cool. And how long do they typically go for?
Corbin Fonville 43:09
Three to four months, depending on how long the course is. Yeah, we have a 3D design course that we're planning on launching later that, you know, might go around six months, depending on where we try to line it up with big events.
Justin Grammens 43:23
Cool. Well, good, good. Again, I appreciate the time. Thank you for all the work that you're doing in helping people, you know, get into this field. I know for a lot of people it can be kind of scary, like, oh, how do I do this stuff? And having an online course where people can kind of work at their own pace, I think, is a fabulous idea. I think it's really, really exciting. And it's really important, I think, for us to be very inclusive with regards to all the different perspectives of the people we can have working on software. It can't just be built by one type of individual, like it kind of has been in the past. So I'm all for more inclusion and different people getting involved. So thank you for all the work you do at Skillquest. This is fabulous.
Corbin Fonville 44:01
Definitely. Oh, and one more thing, I guess: I will be planning to publish a course on AI. Once I feel like I am enough of a subject matter expert to really make a smart decision on who the instructor is and what we should be teaching, I will be publishing a course on AI. So yeah.
Justin Grammens 44:18
Yeah, the one thing I love about these new emerging technologies is, I almost feel like everyone's at square one. You know, I mean, A, it's changing so fast, and then B, it's like, you know, ChatGPT has really only been out since last November or so. So it's still very early and very new. And especially when you have things like this, yeah, people are interested in learning about it and seeing how it can apply to them. So really, really cool. I look forward to maybe taking that class, actually. I always learn stuff. Every time I have somebody on this program, I talk to them for, you know, 45, 50 minutes or so, and I always learn stuff, so I'm always into the new stuff. I look forward to having you guys put that class out. Cool. All right. Take care, Corbin. Thanks again. Thanks, Justin.
AI Announcer 44:58
You've listened to another episode of the Conversations on Applied AI podcast. We hope you are eager to learn more about applying artificial intelligence and deep learning within your organization. You can visit us at appliedai.mn to keep up to date on our events and connect with our amazing community. Please don't hesitate to reach out to Justin at appliedai.mn if you are interested in participating in a future episode. Thank you for listening.