Conversations on Applied AI
Welcome to the Conversations on Applied AI Podcast where Justin Grammens and the team at Emerging Technologies North talk with experts in the fields of Artificial Intelligence and Deep Learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real-world problems today. We hope that you find this episode educational and applicable to your industry and connect with us to learn more about our organization at AppliedAI.MN. Enjoy!
Will Preble - Creating Heart-Centered Technology
The conversation this week is with Will Preble. Will is a media and tech entrepreneur passionate about the intersection of Web3 and human potential. He believes the world needs strong, empathetic, and holistically minded leaders who are willing to break from models of the past and help their communities imagine abundant futures. He is currently the head of strategy at Ascendance, where he partners with brands and leaders to amplify reach, revenue, and impact through AI and emerging technologies. Additionally, he is the director of advancement and a founding board member of Smart North, a Minnesota based 501c3 organization that is dedicated to closing the digital equity gap in marginalized urban and rural communities.
If you are interested in learning about how AI is being applied across multiple industries, be sure to join us at a future AppliedAI Monthly meetup and help support us so we can put on future Emerging Technologies North non-profit events!
Resources and Topics Mentioned in this Episode
Enjoy!
Your host,
Justin Grammens
[00:00:00] Will Preble: Say we go to AGI and we ask it, Hey, AGI, can you fix the climate crisis? And it says, sure, no problem, Will. I'm going to do that. And five years later, it's fixed. And what do you know, it killed half of humanity, right? It accomplished its goal, but its, its objectives weren't aligned with human life and human flourishing.
[00:00:22] AI Voice: Welcome to the Conversations on Applied AI podcast where Justin Grammens and the team at Emerging Technologies North talk with experts in the fields of artificial intelligence and deep learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real world problems today.
We hope that you find this episode educational and applicable to your industry and connect with us to learn more about our organization at AppliedAI.MN. Enjoy.
[00:00:52] Justin Grammens: Welcome everyone to the Conversations on Applied AI podcast. Today we're talking with Will Preble. Will is a media and tech entrepreneur passionate about the intersection of Web3 and human potential.
He believes the world needs strong, empathetic, and holistically minded leaders who are willing to break from models of the past and help their communities imagine abundant futures. He is currently the head of strategy at Ascendance, where he partners with brands and leaders to amplify reach, revenue, and impact through AI and emerging technologies.
Additionally, he is the director of advancement and a founding board member of Smart North, a Minnesota based 501c3 organization that is dedicated to closing the digital equity gap in marginalized urban and rural communities. Thank you, Will, for being on the Applied AI podcast today.
[00:01:35] Will Preble: Thanks for having me, Justin. Glad to be here.
[00:01:36] Justin Grammens: Awesome. Well, I told a little bit about, you know, what you're doing today, which is awesome. I love people working with nonprofit organizations. Actually, Applied AI is a nonprofit, 501c3 as well. But maybe you could give a little bit of a short background in terms of how you got to where you are today.
[00:01:50] Will Preble: Yeah, absolutely. So I graduated from school with a bachelor's degree back in 2018. I got a degree in data analytics and went onto a machine learning team at Optum right out of school when I was 22. Worked at Optum in data analytics and machine learning, a Fortune 3 company now, so about as big as you can get.
I got a lot of great corporate experience there, you know, got to play with some cool technologies and understand how big data really operates at the biggest levels, you know, UnitedHealth Group being the biggest healthcare company in the country. So I got to be involved in some fun projects there and learn a lot, but realized pretty early on that I was pretty entrepreneurial and couldn't sit still in a cubicle, and no offense to anyone in the corporate world, it's a great path, it just wasn't my path. And so I quit in 2019 and decided to go into media and start a marketing agency. And the reason being, back in 2018, 2019, I had started to leverage my data background to get into some influencer marketing consulting.
At the time, it was a bit of a Wild West, and a lot of brands were throwing around half a million here, a million dollars there, to influencers, but they didn't really understand how to pair an influencer with demographics that were actually relevant to their brand, or how to track and quantify from an analytics perspective what they were actually doing with their influencer marketing.
And so that's kind of the angle that got me into the media marketing space. And I just generally enjoyed creative work, so it was a way for me to kind of merge technology and creative. I ran that agency, it was called Eterna. I still own 50 percent of it, but I'm not super involved in the day to day right now.
Ran that for about four years and worked a lot in entertainment. I worked with Live Nation doing influencer marketing and digital marketing. We built First Avenue in virtual reality back in 2020 when all the music venues were shut down, and hosted a digital VR concert that had 5,000 people on it. So I got to do some pretty cool stuff, but my trajectory has always pushed me towards emerging tech because that's just generally what I'm interested in.
And so about a year, year and a half ago, I had this idea for an innovation studio. And this is just kind of when AI was starting to come into the pop culture conversation a bit more and people were getting more aware of generative AI and some of the tools that we all know and are using today. And this idea for this brand called Ascendance came to me, and what Ascendance represents is a new model of growth for the AI era.
And for me, I see the AI age as an opportunity for brands and leaders to rethink what growth means, maybe look at it a bit more holistically. Obviously, your next quarter's earnings are important. You've got to make money and have cash flow to grow as a business entity. But we also have the opportunity to think much farther out into the future now and take different approaches to growing, whether that's more of a Web3 approach if you're growing a community, or just really looking at different ways to grow and also align that growth with human potential. Maybe we can talk about the AI alignment problem.
I think aligning emerging technology with human potential, not just short-term business interest, is really important. And so those were the themes that kind of went into the brand that became Ascendance. And so I went full time on Ascendance this year. We're a growth and innovation studio. Primarily right now we're helping small to mid-sized companies, although we do have one Fortune 100 project right now. We help companies either develop an AI strategy, or do low-level automations, workflow automations, marketing automations, or, on the high end, we do custom enterprise application development, so conversational AI applications and stuff like that.
[00:05:53] Justin Grammens: Gotcha. Gotcha. Yeah. Well, AI strategy is kind of a hot buzzword these days.
I'm doing a fair amount of that myself. But speak to me a little bit about the AI alignment problem. I'm guessing a lot of our listeners maybe haven't heard what that means, so maybe give some examples of what you mean by that.
[00:06:10] Will Preble: Yeah, absolutely. So it's pretty simple. It's basically: we're creating AI, which is something that is going to become more intelligent than all of humanity combined very soon, and we would ideally want that intelligent thing to have interests that are aligned with humanity's interests. And so when you talk about the AI alignment problem, the environment is one example that comes up a lot.
Say we ask Artificial General Intelligence, or AGI, which is basically the term for when AI becomes as intelligent as humans. Say we go to AGI and we ask it, Hey, AGI, can you fix the climate crisis? And it says, sure, no problem, Will, I'm going to do that. And five years later it's fixed. And what do you know? It killed half of humanity, right? It accomplished its goal, but its objectives weren't aligned with human life and human flourishing. And so that's something that is being talked about at the highest levels of AI engineering and business right now, and it has been for the past year or so. And it's something that's important because AI, you know, especially generative AI, is really a reflection of us.
It's a reflection of everything that we as humans have produced, with an obvious heavy emphasis on the post-internet age. And so if we're the ones building these models, and we're reflecting content that we as humans generated back at ourselves, then being intentional about what that model is trained on, aligning it with things that are going to align very smart artificial intelligence with making us better and unlocking more of our human potential in the long run, I think is a meaningful problem to solve.
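To make the alignment idea above concrete, here is a minimal, purely illustrative Python sketch (not anything Will or Ascendance has built, and all plan names and numbers are invented): an optimizer that is only given a stated proxy objective will happily pick the option that scores best on that objective, even when it carries an enormous human cost that was never encoded.

```python
# Toy illustration of objective misspecification ("the alignment problem"):
# the optimizer sees only the stated goal, not the value we forgot to state.

plans = [
    {"name": "renewables buildout",  "emissions_cut": 0.6, "human_cost": 0.05},
    {"name": "halt all agriculture", "emissions_cut": 0.9, "human_cost": 0.95},
    {"name": "carbon capture",       "emissions_cut": 0.4, "human_cost": 0.02},
]

def misaligned_score(plan):
    # Only the stated objective is optimized; human cost is invisible to it.
    return plan["emissions_cut"]

def aligned_score(plan, human_weight=10.0):
    # The same objective, penalized by the value we actually care about.
    return plan["emissions_cut"] - human_weight * plan["human_cost"]

print(max(plans, key=misaligned_score)["name"])  # -> halt all agriculture
print(max(plans, key=aligned_score)["name"])     # -> carbon capture
```

The point is only that whatever objective gets written down, not the outcome we intend, is what gets optimized.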
[00:07:55] Justin Grammens: Yeah, yeah, yeah, for sure. So it kind of seems to border on the line of people talking about ethics, you know, ethical AI, sort of being able to follow a set of broad-based concepts that we all sort of agree upon, which is hard for humans to do, probably, but I think in general we know what's good and what's not good.
Yeah, agreed. I mean, is that a problem you're trying to tackle within Ascendance, like actually going after how you do that specifically from a technical standpoint? Or is it making your clients aware that they should be thinking through this? What's your day-to-day on it?
[00:08:26] Will Preble: Yeah, it's not a problem that I'm spending all of my time on necessarily, as far as like, I'm going to personally engineer the solution to AI alignment. But I do think it's important for myself as a leader in the space, and other leaders, to talk about it and be working towards solutions that are going to produce better AI models in general, whatever that looks like. I mean, people are going to disagree on what that looks like, but at least then we're orienting towards something instead of not even having it in mind.
I think part of the solution is just generally to create what I would call heart-centered technology. So again, it's looking at growth holistically from a business standpoint. And if I'm developing a very powerful technology, maybe I'm thinking about my business interests, but I'm also thinking about how this technology is going to impact humans, the people that interact with it, how it's going to impact the planet, and just maybe taking a little bit more holistic of an approach to the things that I'm bringing forth into the world. Because I think as entrepreneurs and business leaders, we have the tendency to move really fast and break things, which is great. I do that too at times, but this is maybe a situation where we don't want to move quite as fast and we don't want to break quite as many things, especially with parts of this technology that we don't fully understand yet.
[00:09:47] Justin Grammens: Yeah, yeah, for sure. I mean, it certainly has the capability, I think, to do a lot of damage because it's just across so many different sectors, right? It touches so many different things. And you've wrapped AI up into Web3, and I'm assuming when you say Web3, are you guys doing stuff with like blockchain and some of these other decentralization concepts around Web3? Are you guys looking at AI sort of as a part of that component?
[00:10:15] Will Preble: I think there's an intersection. I've worked with a few different Web3 companies on the marketing side, and I also co-wrote a white paper for an early-stage project about a year ago in the music space that was working on alternative funding models for independent artists.
You know, there's some scammy stuff that everyone wants to stay away from, but there's a lot of really good stuff coming out of Web3 as well, as far as new ways to build communities and to align the incentives of business and technology with that community. And I think with AI, what's interesting is there's this battle between open source and closed source, and kind of this overarching narrative around who's going to own these super powerful AI models. There's maybe a middle ground in some ways. Web3, I guess, can mean a lot of things, but just to use it as a term for the ethos of what Web3 represents: if there are large companies that take a more Web3 approach to building and scaling AI, I mean, you could have something like a token incentive structure where, instead of just being users of ChatGPT, maybe there's a token model where the users of the product are actually training the product, they're providing the data, they're helping improve it, and maybe you have some sort of ownership in that product too, where you can vote on the trajectory of that product. So again, it's really early, and I think it's going to take a lot of experimentation and iteration to get this right as far as implementing Web3 models with AI.
But I think there's a meaningful intersection.
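As a rough illustration of the token-incentive idea Will is gesturing at, here is a hypothetical, in-memory Python sketch. No real blockchain, token standard, or specific protocol is implied; the class and method names are made up for the example: users earn tokens for contributions that improve a product, and token balances double as voting weight on its direction.

```python
# Toy, in-memory model of contribution-weighted ownership and governance.

from collections import defaultdict

class CommunityOwnedProduct:
    def __init__(self):
        self.balances = defaultdict(float)  # tokens earned per user

    def reward_contribution(self, user: str, quality: float):
        """Mint tokens proportional to the (assumed) quality of a contribution,
        e.g. labeled examples or feedback used to improve the model."""
        self.balances[user] += quality

    def vote(self, proposals: dict) -> str:
        """Token-weighted vote: each option maps to its list of supporters,
        and a supporter's weight is their token balance."""
        tally = defaultdict(float)
        for option, supporters in proposals.items():
            for user in supporters:
                tally[option] += self.balances[user]
        return max(tally, key=tally.get)

product = CommunityOwnedProduct()
product.reward_contribution("alice", 3.0)  # alice contributed lots of training data
product.reward_contribution("bob", 1.0)
winner = product.vote({"open-source the model": ["alice"],
                       "keep it closed": ["bob"]})
print(winner)  # -> open-source the model
```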
[00:11:56] Justin Grammens: Yeah. Yeah, for sure. For sure. Yeah, I guess it remains to be seen. I'm a huge fan of open source software. I mean, my background is in technology and engineering, so, you know, I was writing web pages and whatnot back in the early days. And people were always like, well, why would anybody work on these projects as an open source community? But you take a look at Wikipedia as the prime example people point to. The encyclopedias went the way of the dodo because everyone was just passionate enough about keeping Wikipedia going. It became, you know, the best source, the most worked-on source, and a trusted source, for free, which I think is phenomenal. So yeah, the more we can move things into the open source space where everyone can contribute, I think the better, no matter what the technology is. I'm always a fan of that.
You mentioned a little bit about moving from a data analytics career at Optum, which, by the way, do you know Dan McCreary or Josh Cutler? Do those names ring a bell at all? Did you ever talk with those guys?
[00:12:47] Will Preble: Dan McCreary rings a bell. I don't think I ever really met him, I didn't really know him, but he was speaking a lot when I was there, and I remember hearing him speak a few times. Super, super intelligent dude.
[00:13:00] Justin Grammens: Yeah, he was on the podcast. In fact, I think he was episode number one when I started this thing back at the beginning of 2020, so it was a long time ago. But you know, to shift gears, I guess Ascendance is a tech growth and innovation studio, but you really kind of did a 180, I guess, around media marketing, right?
People go to college for four years to do this, and you seem to have jumped in and just sort of figured it out. This question is probably kind of leading a little bit, but I mean, do you see people doing that in AI today? Do you think AI is something that you can just sort of pivot into and start learning and just sort of pick pieces up?
[00:13:36] Will Preble: 100 percent, yes. And I think now more than ever is a time where anyone can get into the space and start doing something meaningful, because there are just so many tools now that you don't have to learn how to code. And I'm not saying that's not valuable. Coding is amazing. I work with a lot of developers. But you can do a lot now without having some of the heavy, heavy technical skill sets. So if you want to go really deep on the tech side and become a machine learning engineer, great. But if you're a marketer and you want to build an application with a chatbot or some kind of AI component, you can probably do that, at least something basic. You could probably figure it out in a weekend with YouTube and a no-code platform like Bubble. I think it's a cool time because we're starting to see less of a barrier between your creative vision and the execution of that vision. There are just so many tools, a lot of them enabled by AI, that help you get your vision out into the world.
And that's the biggest thing that really excites me about AI. It's a way to shift some of that time that humans spend executing technical stuff or processing information to just going directly to creating. And again, with the caveat being that if you're really good at a technical skill set, you're going to be way better with AI. Just because you have AI doesn't mean you're an advanced data scientist. But you can do a lot of stuff now that you couldn't do unless you had that advanced skill set.
[00:15:10] Justin Grammens: yeah, for sure. I guess I always ask people, kind of, how do you define AI? If somebody asks you, do you have an elevator pitch, or A sentence or two, or way do you kind of describe it?
Frame it up as?
[00:15:21] Will Preble: Yeah, I kind of gear towards generative AI as being the topic, because AI means, again, I've had these conversations too with people where it's like, what is AI? There are a lot of different ways people describe it. It's machine intelligence, some kind of intelligence, I guess. But generative AI, I think, is what everyone is referring to as AI.
That's where you're having content, videos, text, photos, et cetera, being generated in a way that typically only a human would have been able to do before.
[00:15:52] Justin Grammens: Yeah, for sure. And so going back to your idea of heart-centered technology, the whole concept is to make sure we generate these things that aren't offensive, I guess, or that are true representations of how we want to be shown as humans.
[00:16:06] Will Preble: I don't know that I would say not offensive, because, I mean, you're definitely not going to try to offend people, but that is the nature of the world and of people disagreeing. Like, you can't have a productive conversation if you're so afraid to offend people that you won't state your opinion.
Now, of course, that doesn't give you license to be a horrible person, but I would more so say that heart-centered technology is really about us as leaders in technology realizing that the things that we create are a reflection of ourselves, right? And especially in this time, when this is becoming increasingly true with AI, for those of us creating with AI at a high level, there's an increased responsibility for technology leaders to work on themselves.
And again, I'm not going to prescribe what that looks like for everyone. Yeah, I have a path that I've taken, but I think it's important for us to, I guess, spend time getting into our bodies, exercising, meditating, whatever that looks like for you, not just spending your entire life behind a screen, but really being multifaceted, exploring different parts of the human experience.
And when it comes to being heart-centered, I mean, understanding, like, what do you believe in? What are you orienting your life towards? What is your purpose, or what is this higher concept of the self that guides your actions, your morality, your spirituality? Again, I'm not going to sit here and prescribe any of that to anyone, although I can speak to it in my own path, but I think answering those questions, or at least thinking about them, is as important for technology leaders as ever in this moment, because we're shaping a very important technology that's going to have an impact on humanity for a long time.
And if we're not right, like if we're acting out of trauma, acting out of the wrong impulses, the wrong motives, if we don't have a clear vision for ourselves and can't articulate our own personal narrative, you know, we're just going to do things out of ignorance that may not be the best things we could be doing.
[00:18:10] Justin Grammens: Yeah, yeah, for sure. I mean, you remember earlier this year there was this letter to sort of put a pause on AI advancements. Do you feel like we're moving too fast, or was this letter a little bit premature, I guess? What's your feeling?
[00:18:24] Will Preble: Yeah, it's hard to say. For context, I'm sure you've talked about it, but the letter was a lot of the top AI researchers and leaders calling for a pause or a halt on new model development, or I don't remember the specific parameters they put on it. But I mean, it's hard in the world that we live in today to say that that's possible, because you have this multipolar world where you have the U.S., you have China, Russia. Because of how powerful this technology is, it's hard to say, hey, we over in the U.S. should stop and we hope China stops too, but then they get six months ahead of us and that means a shift in global power. Again, I'm not speaking to the geopolitics, but just in general, it's hard.
I don't think we can necessarily stop development. I think we can maybe be more intentional with development and we can put more guardrails on development, but I don't necessarily think it's going to slow down.
[00:19:25] Justin Grammens: Yeah, fair, for sure. You know, any interesting applications that maybe you've seen going on in the marketplace? Either things that you've developed, per se, or just in general, things that are crossing your desk where you're like, wow, this is an interesting application?
[00:19:41] Will Preble: I'm working on a really cool project right now. I can't speak to the client or the specifics of the data, but we're helping a large company take some of their market research data, which would traditionally be experienced as, like, a 200-page PDF, and we're turning it into a chat experience where they can speak to that data like they're speaking to a person. And we're seeing some really cool, just like aha moments come out of that, even in our early testing.
It's cool to see people experience technology in new ways and be able to come to insights or create content or outputs that might have taken them four months, might have taken them three weeks, and now they're getting it in a 10-minute chat with an AI application. It's certainly not a silver bullet, but it's really cool to see just kind of a reframe in how we're helping that client, how we're helping their people just interact with data in a new way.
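For readers curious what a "chat with a 200-page PDF" experience typically looks like under the hood, here is a hedged, minimal Python sketch of the general retrieval pattern: split the document into chunks, embed them, retrieve the chunks most relevant to a question, and pass those to a language model as context. The `embed` and `complete` callables, chunk size, and prompt wording are assumptions for illustration, not details of Will's client project.

```python
from typing import Callable
import numpy as np

def chunk(text: str, size: int = 800) -> list:
    # Naive fixed-size chunking of the document text.
    return [text[i:i + size] for i in range(0, len(text), size)]

def build_index(chunks: list, embed: Callable) -> list:
    # Pair each chunk with its embedding vector (embed is whatever model you use).
    return [(c, embed(c)) for c in chunks]

def ask(question: str, index: list, embed: Callable, complete: Callable, k: int = 4) -> str:
    q = embed(question)
    # Rank chunks by cosine similarity to the question.
    scored = sorted(index, key=lambda item: -float(
        np.dot(q, item[1]) / (np.linalg.norm(q) * np.linalg.norm(item[1]) + 1e-9)))
    context = "\n\n".join(c for c, _ in scored[:k])
    prompt = f"Answer using only this market-research excerpt:\n{context}\n\nQuestion: {question}"
    return complete(prompt)  # e.g. a call to whatever chat model you use
```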
[00:20:43] Justin Grammens: Nice. Yeah, no, that's cool. I mean, I usually ask that question and people give answers, and then sort of the follow-up is, how do you think this will change how humans work? Like, what's the future of work? It sounds like there are some applications in that case.
It feels to me like you're going to save humans a bunch of time. Somebody would otherwise actually need to read through all this stuff; now you can ask the questions that you want and get the results. So it seems like a huge time saver, a huge productivity gain there. I don't know if there's a downside to that particular application, or ways that you guys have thought about who's going to lose their job over this. But I don't know if you've seen any other examples or things you've been looking at in that space.
[00:21:20] Will Preble: Yeah. I mean, I would look at all the other industrial revolutions and what happened. I think, you know, people were afraid when they automated factory workers and you couldn't go to the textile mill and work a 12-hour shift. Well, I guess I can't speak for everywhere in the world, and there's definitely some problems in different labor industries, but that's another conversation. But whenever an industrial revolution happens, there's a bunch of stuff that gets automated or changed, and then we find new stuff to do, and we find new markets emerge, and there's just more abundance.
Now, is that abundance always equally distributed? Probably not, and that's, I think, an important conversation, and that's some of the work that I'm trying to do with the nonprofit stuff that I've been involved in. But I think there's going to be plenty of stuff for humans to do. I think we just need to redefine what the true value of humans in business is, because if we just see ourselves as, you know, these middle managers or these low-level marketing or technology output machines, where we output a little code, we output a little marketing copy, and we pick up a paycheck, I think at the lower levels, and even at the higher levels in a lot of cases, a lot of that will be automated. But I think it allows us to spend more time clarifying our own thinking, being creative, being visionary, and just learning how to output more with the same amount of time.
I think we're just going to be able to do a lot more, and so then maybe we need to look for more meaningful problems to solve and build markets around them.
[00:22:58] Justin Grammens: Yeah. I just, I actually interviewed somebody earlier today and we kind of went down this, this sort of thought exercise around, you know, what happens when the singularity hits, you know, people get scared about that.
Oh, wow. These machines are going to be smarter than us. So they're actually building basically super intelligence. Right. But there is another aspect to another facet to it. Then it's like, you know, so then how can we actually then. Think about, I guess, spending time on the right things, kind of getting back to what you were sort of saying, like, as a human, here's the thing that I enjoy doing, and now that all this other mundane work has sort of taken out for me, now I get a chance to actually experience what it's like to be human and actually be able to spend all of my time on it, rather than, essentially, you know, my first hour of my day is responding to stupid emails, so, I don't think it's always so bad, right?
Everyone's looking at it like it's a terrible, dire situation, but at least in my conversations with people, people tend to stay pretty positive.
[00:23:54] Will Preble: I think a lot of the fear is just around, how am I going to make money? I think that's a big chunk of it, because we're used to a certain way of accumulating resources to live and do the stuff that we want to do, and now that way is being threatened or might have to change. I don't think most people want to do 70 percent of the stuff that they're doing, and that's a general statement, I don't know if that's true for everyone. I certainly enjoy a lot of the stuff that I do on a day-to-day basis and feel very blessed. But I think in general there are a lot of types of work and activities that we don't have to do as humans if they're automatable.
And we probably would enjoy life more if we didn't have to do them, or if we had more help. And so if you look at it that way, to your point, it's freeing us up to have more time to do creative stuff or more meaningful stuff. The main question is just, how do we get paid to do that stuff? So I think if we can figure that out, then a lot of the fear goes away, because if you can get paid and still live and, if you're an entrepreneur, grow wealth or do whatever you want to do and have resources, and you can do more meaningful stuff with your time, I think that's a win-win for everyone.
[00:25:10] Justin Grammens: Yeah, a hundred percent. Yeah, that's great. What are some resources that you used as you got into this field? I always like to ask people: if I were just graduating from college, for example, regardless of whether it's data analytics or marketing or medical or law, whatever it is, and I want to get interested in this field, where do you typically find your resources, or where have you pointed people?
[00:25:33] Will Preble: Yeah, I mean, maybe an ironic answer, but ChatGPT is a great resource, or some of the other models if you want a free one. I'm not sure if Claude is still free, but yeah, I think if you want to learn and get a feel for the technology, playing with a large language model and asking it to teach you stuff is a really good place to start. You can learn about code, you can learn Python if you want, you can learn about marketing. Just getting up every day, doing your morning routine, and chatting with ChatGPT for 20 minutes about a subject that you're interested in, starting to build that kind of muscle of using it to help you learn, I think would be a really great thing to do, because you're learning the skill that you want to learn and you're also building this intuitive feel for how to work with AI. So that's what I would recommend first. And then second, YouTube is great.
I did take a lot of Microsoft certifications. There are some good ones out there for data science and AI stuff specifically, so if you're going more of a corporate direction, I think check out the Microsoft stuff. But yeah, that's what I'd say off the top.
[00:26:40] Justin Grammens: That's great. That's great. So how big is Ascendance? Or is it just a number of people?
[00:26:43] Will Preble: It's me and a bunch of contractors. So I don't have any full-time employees right now. I work with a couple of different development teams, and I've built a pretty large network of media marketing contractors over my time in that space. But I'm keeping it pretty lean for the time being.
We'll see if that changes in 2024, but I don't know that I want to really build a huge, centralized corporation. It's not what most interests me. Again, not that it's a bad thing, but I like being lean and having a lot of flexibility. So we do most things project by project right now and put different teams on different projects.
[00:27:24] Justin Grammens: That's great. That's great. No, I've lived a lot of different lives in my career. I've been doing this for 25 years. And yeah, I've gone through a high-growth opportunity where, you know, we went from 50 people to 500 people, and I was the VP of engineering throughout that entire thing.
It was quite an experience. I did it because it was something new to do, but it's something that I tell people I would rather not do again. So I really enjoy still being able to stay lean and mean, and frankly be able to touch the entire stack all the way down, to be able to actually do what I say the business should do. So be able to go low, but also be able to think strategically on the business side. So I totally get it. You shouldn't feel bad about wanting to continue to do that. Whatever makes you happy. I guess I said at the beginning, you're one of the founding board members of Smart North.
So I've known Ben Wallace for many years. I'm assuming he was there as a part of the initial group as well. I mean, are you guys looking to bring AI into that organization? You know, have you guys been thinking about AI through your initiatives?
[00:28:22] Will Preble: Yeah, just to comment on that, Ben's a great guy. He was involved in the founding group that all came together back in 2020, after all the civil unrest. It was a group of entrepreneurs and leaders across different spaces that wanted to come together and use technology to do something that had a long-term impact on the communities that we lived in. So that was the intent and the spirit behind the project. And I've been working on AI and education stuff here these last six months.
There's a program that's going to be coming out in early 2024 for youth, and then there's a program that's being tested right now with teachers called AI for Educators. That's really more about helping reskill and upskill teachers for the AI era, and just understanding: one, what is it, and how do I deal with it? Kids can go home and get on ChatGPT, so what does that mean for me? But also, what are some really cool and proactive ways that I can make my job easier and create better learning experiences now that these technologies are available to me? So there's those two sides to it. So long answer short, yes, we are absolutely bringing AI into the picture.
They're going to be doing a lot more with it in 2024. Yeah.
[00:29:43] Justin Grammens: Awesome. That's super exciting. Yeah. We for sure got to keep in touch as a part of Applied AI. We're starting to do some hands on workshops. I've got those scheduled here in January. And obviously we do conferences and monthly meetups and stuff.
And like I mentioned, we are a 501c3, so part of our mission is to get out into the community and start doing some of these things. I think starting with educators and teachers is really a great place, because a lot of teachers maybe just feel threatened in some ways. I mean, A, it's their job, and then also, like, yeah, students are just going to be quote-unquote cheating when they use this tool. But I've got a lot of data on that. Me personally, I'm an adjunct at the University of St. Thomas, so I teach a class on AI and Internet of Things there, and I've been able to structure my class in such a way that I encourage the students to use it.
I don't have any problems at all with them using it, because it's just one component. In fact, if it can help with your research, great; now you don't have to spend a lot of time digging. If you think back, people were digging through encyclopedias; they'd have to go to the library and dig through them. Then the internet made it easier.
Now we're just putting all that knowledge into more of a chat interface, so it's more about what you do with the information that you're getting. It's amazing. So cool. Very, very good. Well, yeah, I wish you guys luck on that initiative. There are a lot of people that are going to be tackling this, and we need a lot of different organizations to help out in that space and come at it from a number of different angles, so that's great.
Well, Will, how do people contact you? How do they reach out to you?
[00:31:06] Will Preble: Yeah, you can reach me on LinkedIn. That's probably the easiest place right now, just Will Preble on LinkedIn. Or Twitter; I'm trying to get more active on Twitter. It's kingwillxm, so like William, but with an x instead of the ia.
You can also go to ascendance.one, the word ascendance, dot O-N-E. Check out our studio and some of the stuff we're doing, and contact us there if you're interested in building something with AI, running an AI strategy session, or just generally getting more intentional about the opportunities for your brand.
I'm happy to help anyone who has a need, but yeah, LinkedIn is probably the easiest way. Just shoot me a DM; I'm pretty responsive.
[00:31:49] Justin Grammens: Excellent. That's cool. That's cool. Well, great. Was there anything else that maybe we didn't touch on that you wanted to mention?
[00:31:55] Will Preble: Not off the top, but I'll make one final point on what you were saying, and that's awesome that you're teaching. I know you taught at St. Thomas. I definitely want to get connected on some of the AI education stuff after this podcast. You mentioned that with AI we're just putting things in a different interface. I think that's a good way to think about it, because there's a lot of mystique and a lot of ideas about what this is being thrown around.
But at its core right now, it's, you know, you had an encyclopedia as a physical book, then you had Google and you could search for things, and now you have a chat-based interface that gives you answers in an even easier format. And that's kind of the way that things are going with technology.
It's like we're getting better and better interfaces that help us do more of what we already want to do as humans in faster and more meaningful ways. And so with that approach, I think it's really hopeful. And I see a really abundant future if just more people learn how to use these tools and do the things that they're passionate about in more efficient and effective ways.
[00:33:00] Justin Grammens: Yeah, yeah, amen to that, brother. Good stuff. All right, well, thanks. I appreciate the time today, and you know, nothing but the best for you guys at Ascendance. Sounds like you're doing some fun projects and, you know, raising up all these interesting use cases and ways to bring this new technology into businesses and do it the right way.
I like your heart-centered technology approach to this, so really, really cool. Thanks for being on the program today.
[00:33:24] Will Preble: Thanks for having me, Justin.
[00:33:28] AI Voice: You've listened to another episode of the Conversations on Applied AI podcast. We hope you are eager to learn more about applying artificial intelligence and deep learning within your organization.
You can visit us at AppliedAI.MN to keep up to date on our events and connect with our amazing community. Please don't hesitate to reach out to Justin at AppliedAI.MN if you are interested in participating in a future episode. Thank you for listening.