Conversations on Applied AI - Stories from Experts in Artificial Intelligence

Tyler Nigon - Artificial Intelligence in Production Agriculture

February 09, 2021 Justin Grammens Season 1 Episode 13

In this episode, Tyler Nigon shares with Justin Grammens his deep experience in production level agriculture and some of the cool ways AI is being applied to his field.

Tyler is a Ph.D. Candidate in Land and Atmospheric Science at the University of Minnesota’s Department of Soil, Water, and Climate. Tyler is passionate about sustainable agricultural productivity. His research focuses on evaluating remote sensing and other precision agriculture tools to reduce fertilizer loss and keep our soil and water resources clean and productive.

Tyler has been learning and applying AI techniques during his statistical training and graduate studies and he says that he finds AI to be a useful tool for the agricultural sector, not only in research but in practice as well.

Tyler also recently co-founded Insight Sensing (together with a fellow Ph.D. candidate at the U) as part of the National Science Foundation Small Business Technology Transfer program.

If you are interested in learning about how AI is being applied across multiple industries, be sure to join us at a future Applied AI Monthly meetup and help support us so we can continue to put on future Emerging Technologies North non-profit events!

Resources and Topics Mentioned in this Episode

Your host,
Justin Grammens

Tyler Nigon  0:00  
Knowing the use case or like in scientific terms, knowing our hypothesis or objectives, if we don't have that, it's gonna be very difficult because it starts with data collection, it goes on to processing the data and analyzing the data. If we don't do that a particular way for the use case that we're talking about, it's probably not going to be done right.

AI Announcer  0:23  
Welcome to the Conversations on Applied AI podcast, where Justin Grammens and the team at Emerging Technologies North talk with experts in the fields of artificial intelligence and deep learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real-world problems today. We hope that you find this episode educational and applicable to your industry, and connect with us to learn more about our organization at Applied AI. Enjoy.

Justin Grammens  0:53  
Welcome to the Conversations on Applied AI podcast. Today on the program we have Tyler Nigon. Tyler is a Ph.D. candidate in Land and Atmospheric Science at the University of Minnesota's Department of Soil, Water, and Climate. Tyler is passionate about sustainable agricultural productivity. His research focuses on evaluating remote sensing and other precision agriculture tools to reduce fertilizer loss and keep our soil and water resources clean and productive. Tyler's been learning and applying AI techniques during his statistical training and graduate studies, as he finds AI to be a useful tool in the agricultural sector, not only in research but in practice as well. Tyler also recently co-founded Insight Sensing together with a fellow Ph.D. candidate at the U as a part of the National Science Foundation Small Business Technology Transfer program. So really, really cool stuff Tyler's involved in. Thank you, Tyler, for joining us today. You're welcome. And thank you for having me. Yeah, absolutely. Absolutely. So I gave a little bit of bio, you know, kind of what you're currently doing. Kind of curious, maybe to rewind the clock back a couple years, I guess, and maybe talk a little bit about how you got to where you're at today. Maybe, you know, what did you work on in your undergraduate? What got you excited about this field? Yeah, absolutely. And if you don't mind, I'll maybe go back even further than that. I actually grew up on a dairy farm in central Wisconsin; my parents still own and operate that farm. And really, growing up on the farm through my childhood, I became very interested in agriculture. And so it really started with that. We are a dairy farm, but we are also a diversified farm and have more than just dairy. So I had, I don't know how to put this other than I wasn't really passionate about cows. To get into that, you have to really, really love them. But I did really like the other aspects of the farm.
And so I went to college and did a soil science major. Not too long into college, I was kind of looking to get a minor to diversify a little bit, and I stumbled across GIS, geographic information systems, really kind of the study or the practice of processing and interpreting spatial data. With that I got involved with a couple of remote sensing classes and learned all about satellite sensors and image sensors. And all the while I'm learning about soil fertility and conservation in agriculture, and things like that. And so I was looking for a way, you know, to really bring these things together. And when I was finishing up undergrad, I was looking into graduate programs, and I really didn't even have an idea of what graduate school was. You know, research was something that, you know, scientists do, and I had really very little knowledge of that. So I just started emailing, you know, advisors at the University of Wisconsin-Stevens Point, where I did my undergrad, kind of helping me out through that. And I got really lucky. I found a research group, Dr. Carl Rosen and Dr. David Mulla at the University of Minnesota, who said, yeah, we just got this grant where we're looking to find the relationship between nitrogen and water stress in potatoes using

Tyler Nigon  4:04  
hyperspectral and thermal imagery. And so, you know, I knew a little bit about hyperspectral at that time, but I'm like, oh, this is great. This is perfect. So I applied and was accepted, and I did my master's degree then at the University of Minnesota working on potatoes.

Justin Grammens  4:22  
Okay, what was the focus of that? I mean, was it really around this spectral imagery stuff or?

Tyler Nigon  4:27  
Yeah, really around the spectral imagery. So the, you know, the problem at hand is just that, okay, potatoes are grown in sandy soils, just because they need to be well drained. They require a lot of nitrogen fertilizer. They're irrigated because of the sandy soils. And they just tend to leach a lot of nitrate from the fertilizer, and nitrate in the groundwater is a problem that requires treatment, and it's really a pollutant. So we want to be able to keep that out and really manage and apply the nitrogen fertilizer as efficiently as possible. And so as it relates to the aerial imagery, the idea was to train statistical models, machine learning, AI, to be able to use aerial imagery to make predictions about the nitrogen status of the crop during the season. And so potato growers are oftentimes putting nitrogen fertilizer on four to six times a year through the months of June and August. And so there's all this opportunity then to take the aerial imagery, if we're able to make predictions about nitrogen status, and use that as kind of a management tool to more efficiently and effectively apply the fertilizer.

Justin Grammens  5:45  
Gotcha. And where does this imagery typically come from? Are we dealing with drones and stuff right now? Or is it just a lot of planes that are still flying over?

Tyler Nigon  5:52  
Well, it's been an evolution over the years. You know, we've had access to this aerial imagery for decades. You know, when Landsat was deployed, we had this multispectral imagery, and there was a lot of work done back 10 years ago. I mean, we didn't really have consumer-grade drones that were easy to just, okay, you know, put your camera on and go fly. So it's really changed a lot in the last 10 years, from satellites and manned aircraft, and now we're able to mount these cameras and sensors on drones, which is a big deal because of the expense. You know, it's a balance between data quality and cost. With satellites, relatively cheap, I mean, at least after it's actually deployed, relatively cheap, but we're dealing with pretty coarse spatial resolution, clouds, not a very frequent revisit time. And so a lot of times we're even lucky to get one image every two to three weeks. Manned aircraft, that's actually how we collected the imagery for my master's research project. I remember, you know, there's a group out of Nebraska that came and flew a plane, and they have a hyperspectral and a thermal camera mounted. And just to give an idea of the cost, I remember it being right around $10,000 per image campaign. You know, so to do this research in 2010 and 2011, we had to pay right around $40,000 to do two years of research, where we're only getting two image dates per year. And so then, you know, just kind of moving along the timeline, you know, really in 2013, 2014, consumer-grade drones became a thing. And then bigger consumer-grade drones became a thing. And now we're mounting these sophisticated sensors on a $10,000 drone, still rather expensive. But just for context, that's the same price we were paying for just one image campaign, sure, not very long ago.

Justin Grammens  7:55  
Yeah, I know, the cost of the technology is always sort of dropping, which is great for startups that want to get into this business, like you probably are leveraging some of that in your new endeavor, and then also just making it more and more generally available to the average farmer. Because I would assume, I guess, I was thinking about your background a little bit. I mean, you said your parents were on this farm, and you grew up on the farm. I mean, were your grandparents on a farm too? Was this kind of a generational thing for you, to sort of break away and start getting more into this stuff, I guess, rather than sort of having a life as a farmer?

Tyler Nigon  8:26  
Yeah, absolutely. You know, both of my grandfathers were farmers or had farms themselves. And, you know, as I mentioned, I didn't really have a whole lot of exposure to, like, okay, what is graduate school? What is the research side of things? And I certainly didn't know a whole lot about sensor technologies and AI and things like that. And so I had to make the decision. I mean, that decision really came between my undergraduate and when I started my master's program, because that was something that, you know, I remember having the conversation with my parents, something that they weren't familiar with at all. And they're like, well, why would you want to go to more school when you can have a job? And, well, it's like, there's more out there. I don't know what that is yet, but I want to find out. And I can always come back, you know, if I don't like it.

Justin Grammens  9:20  
For sure. Yeah, not to take it off too much on a tangent, but I was just thinking about a book that I'm reading by Kevin Scott. He's the CTO of Microsoft, and the book is called Reprogramming the American Dream. And, you know, when we publish these podcasts here, I'll have a link off to it and some of these other things that we've talked about, but it's a really, really good book. He grew up in rural Virginia, I think, and his whole story is really about how a lot of these rural communities need to modernize. And he talks a lot about AI and about how AI is going to change the workforce in general. And honestly, there's a lot of mundane work that people are doing that they actually don't want to do. You know, correct me if I'm wrong, but he's kind of saying, you know, most farmers don't want to drive the tractor back and forth 50,000 times; they would much rather automate that. And so the whole thing is that everyone gravitates to these cities, and that's where the knowledge center is, and no one goes back. And so he's really trying to have a very strong initiative with regards to having schools right in the rural communities be able to teach these advanced techniques in AI, machine learning, and deep learning. Like, why are they only available at the University of Minnesota? Why aren't they available in all these other areas? So, you know, I don't know if you have any future plans or whatever, but it's a really good book, and it's called Reprogramming the American Dream.

Tyler Nigon  10:36  
Yeah, I'll definitely take a look into that. And, you know, following up with everything you said, I mean, that's really what we're trying to do. I mean, I would say both in the precision agriculture research, but especially with the NSF grant and co-founding Insight Sensing, is to be able to take this technology to the practitioners, take it to the farmers so they can use it. And that has its own set of challenges, but it's definitely interesting. And as you said, much needed, for sure.

Justin Grammens  11:05  
So you mentioned a couple times here, as you were sort of walking us through what you've been doing, and one of them is just this idea around remote sensing. And I think, you know, I've been around the field, so I hear the term, but for people listening, that might be a new term for somebody. Do you have a definition, you know, of how you think of what remote sensing really entails and means?

Tyler Nigon  11:23  
Yeah, absolutely. So remote sensing is just the acquisition of information about an object without coming into physical contact with that object. So at the most basic level, we can think of our eyes as remote sensors. If we look across the street and see a tree there, and see that the leaves are changing colors, we can make inferences from what our eyes see. And this has really kind of become the term we use when we start to use sensors, cameras, to make interpretations about the environment around us based on what those pixels are picking up.

Justin Grammens  12:01  
Gotcha. You talked about some challenges, maybe, with regards to using drone and satellite imagery, you know, before you and I had started the conversation. I don't know if you wanted to talk a little bit about that, with regards to some of the challenges in trying to train a model with various sensors and make predictions.

Tyler Nigon  12:18  
Yeah, absolutely. I'll start by saying, okay, the availability of consumer-grade drones has done a lot on the front of collecting data. As long as the weather is okay, we're able to go out, collect the data, and, you know, store it on our computer hard drive. But there's a lot of processing that has to be done with imagery. With satellite sources, and even a lot of manned aircraft, they handle these remote sensing tasks that more or less kind of clean up and calibrate the data, so a lot of scientists using platforms from satellites and manned aircraft don't really have to deal with that, because, you know, it's either a public program with a lot of thought going into it, or it's like a manned aircraft system where you're paying to get that service. So the data is coming to you, like, ready to analyze, or to post-process and analyze and interpret. But now with drones, it's kind of, you know, turned that on its head, because now you can throw any camera on there. But when it comes to using this aerial imagery and calibrating it, or using it in a model, artificial intelligence and such, we have to do a fair amount of processing to be able to ensure, or have confidence, that the pixels are representing some physical value. One of these things is to convert to absolute units of radiance, and then to reflectance. So I'm going to get an image from a camera sensor; it's represented in digital numbers. It's just some arbitrary number based on the exposure time of the camera and the gain and offset that it was set at. And it'll be, you know, for an eight-bit camera, which doesn't really exist, but it's a good example, the pixel values are going to be between zero and 255. Well, you know, if the pixel value is 212 or something like that, what does that mean in our physical world? And so when we're collecting the imagery, we have to do some extra steps.
And there are different ways to collect the data required to make that calibration. One would be to put down a reference panel in the field when we go and capture the imagery. This reference panel, we're also capturing an image of that, and then more or less correlating it to the known physical radiance and reflectance values of that panel. There are other sensors out there, downwelling irradiance sensors, that are mounted maybe right on the drone, that are kind of keeping an eye on the light conditions every second, or, you know, every millisecond. Because, and this is kind of, you know, the point behind all of this, is that when we're outside, there are clouds in the sky, and the lighting conditions are really changing very frequently. And we have to be able to account for those lighting conditions, because otherwise our imagery is just going to be so noisy. And when we put that data into the model, the noise will just be too much for it to generally be able to predict anything meaningful.
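To make the reference-panel step concrete, here is a minimal sketch of a one-point empirical calibration from digital numbers to reflectance. The function name and the numbers are illustrative, not from the episode, and it assumes the simplest possible model: a single panel of known reflectance imaged under the same lighting and exposure, with no additive offset term.

```python
import numpy as np

def dn_to_reflectance(image_dn, panel_dn, panel_reflectance):
    """One-point empirical calibration: scale raw digital numbers (DN)
    to reflectance using a reference panel of known reflectance that
    was imaged under the same lighting, exposure, and gain settings.
    Assumes a zero intercept (no additive offset)."""
    panel_mean = panel_dn.mean()            # average DN over panel pixels
    scale = panel_reflectance / panel_mean  # reflectance per DN
    return image_dn * scale

# Hypothetical 8-bit example: a 50%-reflectance panel averages DN 128.
panel = np.array([127.0, 128.0, 129.0])
image = np.array([[64.0, 128.0],
                  [192.0, 255.0]])
refl = dn_to_reflectance(image, panel, 0.50)
print(refl)  # DN 64 -> 0.25, DN 128 -> 0.50
```

In practice, calibration models for real sensors also handle exposure, gain, vignetting, and the changing downwelling irradiance Tyler mentions; this sketch only shows the core idea of tying arbitrary pixel values to a physical quantity.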

Justin Grammens  15:35  
So to clarify, I guess, when you're up and you're taking these images, you've talked about different types of imagery. There's thermal imagery, so you're basically looking at heat, I guess, coming off. Hyperspectral, maybe you could define that a little bit for our listeners?

Tyler Nigon  15:46  
Yeah, absolutely. That's one that catches a lot of people, so I'm plenty used to it. So hyperspectral versus multispectral: we're talking in the visible and the near-infrared region of the electromagnetic spectrum, so from about 300 to 400 nanometers up to, you know, maybe 10,000 nanometers, which is getting into thermal. But really, you know, without getting too much into the technical details of the sensor, we're mostly capturing information from 400 to 1,000 nanometers. Now, multispectral, and this is more common, generally cheaper, it's fairly subjective, but it indicates that there are multiple spectral bands that we're sensing. So the camera on our cell phones, this is, you know, what we're used to seeing. Oftentimes it's really a combination of red light, green light, and blue light that, in combination with each other, kind of mimics the human eye and what we're seeing. But we can put a band in there for near-infrared, or, you know, maybe two near-infrared bands, one at 800 nanometers, one at 900 nanometers. This would be multispectral. Hyperspectral is, now instead of, like, five bands, we might have a band every two or three nanometers. So between 400 and 1,000 nanometers, we might have 250 bands, which really gives us the spectral precision aspect of it. When we kind of move along the spectral domain, we can see, okay, how does the reflectance change? How does the energy coming off of an object change when we just move from, you know, the middle of the green spectrum to just two nanometers away, which would be really impossible for the eye to pick up? And so to really digitize this signal, it's really beneficial.
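The multispectral-versus-hyperspectral contrast above can be sketched numerically. The band centers below are hypothetical examples, not any particular sensor's specification; the point is just the difference between a handful of discrete bands and hundreds of contiguous ones.

```python
import numpy as np

# Hypothetical multispectral sensor: a few discrete band centers
# (blue, green, red, and two near-infrared bands, in nanometers)
multispectral = np.array([475, 560, 668, 800, 900])

# Hypothetical hyperspectral sensor: 250 contiguous bands spanning
# 400-1000 nm, i.e. a band roughly every 2.4 nm
hyperspectral = np.linspace(400, 1000, 250)

print(multispectral.size)   # 5 bands
print(hyperspectral.size)   # 250 bands
print(round(float(hyperspectral[1] - hyperspectral[0]), 2))  # ~2.41 nm spacing
```

With 250 narrow bands instead of 5 broad ones, each pixel becomes a near-continuous spectrum, which is what lets you see the two-nanometer shifts Tyler describes.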

Justin Grammens  17:37  
Sure. Now, so I guess to back up even further, are we just looking at plants here? I mean, is that the whole point of a lot of this stuff? Based on all this imagery, whether it's thermal or spectral, or hyperspectral, multispectral, whatever it is, maybe you can give us the use case that we're trying to solve?

Tyler Nigon  17:55  
Yeah, I'm glad you brought that up, because this is a really important thing as well, the use case. And so, so many times over the last 10 years, since I've really been involved with this work, we've heard, okay, we have the data, we have a camera, apply it to agriculture to fix all of a farmer's problems. I don't know that I've seen that work even once, other than maybe for scouting. So knowing the use case, or, like, in scientific terms, knowing our hypothesis or objectives: if we don't have that, then it's going to be very difficult, because it starts with data collection, it goes on to processing the data and analyzing the data. If we don't do that a particular way for the use case that we're talking about, it's probably not going to be done right. So to answer your question, what are the use cases: I am applying the aerial imagery in the spectral domain to really look at the in-season nitrogen status of corn. I never, I guess, got into my dissertation research in my Ph.D. program here, but I'm looking at nitrogen uptake, nitrogen status in corn. Similar to potatoes, corn takes a fair amount of nitrogen fertilizer, and we want to apply that as efficiently as we can. We also recognize that one area of the field doesn't require as much nitrogen fertilizer as another area of the field that might have more nitrogen available in the soil, or might not use as much because it doesn't have as much crop potential or yield potential. And so nutrient management is a big one. And yeah, we're looking at the plants, we're looking at basically the color of the plants and how they are changing throughout the season and throughout space, to understand their status, how stressed they are, or if the crop feels pretty good and doesn't need more fertilizer, for example. And then making management decisions on that.
Others in my lab are working on phenotyping, so plant breeding, where, you know, the breeders are generally looking to make plants more robust to our changing environment, things with climate change, trying to expand crops into new regions. There are a lot of breeding efforts going into finding out, okay, how can we achieve a high yield so the farmer can earn a living growing it, but in the meantime having the ability to, you know, breed in such a way that it's resistant to disease, or maybe drought stress or flooding, things like that. So these breeders, they're doing DNA sequencing, and they're playing their numbers game, but part of any breeding program is the phenotyping aspect. So they're growing the different lines out in the field. And so now we're applying these AI, machine learning models to the data that we collect over those fields, whether it be soil sensors, weather data, aerial imagery. We're looking at the biomass change, the nutrient composition. And there it's really a lot more open-ended on exactly what they're looking for, because it depends on what they're trying to breed for. So that's another example.

Justin Grammens  21:15  
So they might just take a new corn species, I guess, for lack of a better term, but, you know, it's something that's been modified, and they might just try planting it in some different environment. You know, hey, we've never planted corn in, I don't know, Tennessee, right? So let's just try planting it in Tennessee, let's use the sensors, imagery. I'd like to talk a little bit more about the soil sampling, too, because I want to touch on some of that stuff. But yeah, let's get all this data around it, and then we can really see how it's performing, and kind of rinse and repeat.

Tyler Nigon  21:43  
Yeah, absolutely. And so we're getting to the point where some breeding labs at the University of Minnesota are now incorporating some of these machine learning models that have been developed, you know, in the past few years into their actual programs. So now, instead of just being research on the technology, now it's applying the technology to do the research that they're really interested in, which is really where we want to be with everything, right? You know, to be able to apply it and use it.

Justin Grammens  22:13  
Right. Right, exactly. So you can get images from drones, and from the sky, and all that type of stuff. You mentioned soil, which I think is key. I mean, I guess any data scientist wants more data. Does it open up a whole new realm of things for you to have that? Is it a complete game changer to be able to know what's going on in the soil? And of course, there are lots of questions around that, too. It's moisture levels, but it's probably also nitrogen, and there are probably a lot of other chemicals. How do you see that playing in?

Tyler Nigon  22:44  
Yeah, and this is probably something most data scientists would tell you, that this, you know, remote sensing aerial imagery is just another tool in your toolbox. And I've kind of focused on that because that's where a lot of my efforts and experience have really focused in the last several years. But you're absolutely right. I mean, there's all this other information that we really want to be able to get access to. And really, we're trying to understand the soil, and what's there, what's not there, what we might have to supplement. And remote sensing is oftentimes limiting when it comes to the soil, because if we're imaging, we can only really see the surface, and the plant roots obviously go down several inches, or feet even, to capture the moisture and the nutrients that they need to grow. And so this is where other information works in. And this is where I would say I'm really focused with Insight Sensing, in being able to bring this technology to practical use for farmers. Because, first off, a really practical thing to consider with aerial imagery is that, even with a drone, if a farmer has a few thousand acres, it's still not practical to go fly that drone to cover all of your acres, even at once a week. Until we get to the point where humans are not involved in the equation, and they can go collect the data, process the data, analyze the data, recharge the batteries, that's really not a feasible solution right now at the scale of farms that we're dealing with. You know, with cost considered, we're dealing with satellite imagery. So one of the problems now with satellite imagery is that the revisit time isn't all that great. Like I had mentioned earlier, every couple of weeks the satellite might come over, and we're not getting a really fine, detailed look at the crop. So the pixel size is usually right around three meters or 10 meters, and so it's a really coarse look at the plants. It's really the plant canopy.
Things are changing pretty drastically with PlanetScope imagery on satellites, but they have problems of their own. They're getting 50-centimeter resolution, or 50-centimeter pixel size, but the radiometric quality leaves much to be desired. So, like, putting that into a machine learning model, really, we found it doesn't work very well. So how do we use the satellite imagery to be able to make these predictions, but make the predictions on a daily basis instead of a monthly basis? And the answer right now is incorporating things like weather information, from weather stations that we have right on the field, or from modeled weather from weather services, and radar, and things like that. Management information is a big one. I mean, this is maybe really old-fashioned, but knowing the farmer's plans. Usually the farmer, they make their plan for how they're generally going to manage a crop in the wintertime, and usually it's the same year after year after year. And just kind of knowing the practices that a particular producer uses can be really valuable for building the models, or maybe making separate statistical models for each use case. And then, of course, you know, I think IoT and getting cheap sensors, you know, as they're becoming cheaper and cheaper, things like soil temperature, soil moisture, and getting them out in the fields to be able to access that data as well, is really something that is really exciting. On this topic, I heard of some research that's going on where, you know, they don't have anything working yet, this is, like, really basic research, but the goal is to build these decomposable sensors that are pennies, so five cents or less, so they can just put them in the fertilizer and spread them out in the field, and they'll sit on the soil surface.
And they'll be able to communicate and contribute data to a model, to really be able to get information about the soil, the environment, in every particular area of the field, at a very high resolution in both space and time. It's hard to believe that it'll ever be possible, but, I mean, think 20 years down the road. It might be, okay, the sensors cost $1 or $2, maybe they're not quite that cheap. But it's something that's really exciting for, you know, a data scientist to think about in agriculture.

Justin Grammens  27:14  
Yeah, the more features, I guess, for lack of a better term, you can load into your deep learning model, the better, because that's what it sort of thrives on, right? When you're doing deep learning, it's learning itself. So the more rows and features and columns you can give it, the more likely you are to have better predictions. The hard part, I think, with some of these sensors today is just the connectivity. Is it cellular? You know, obviously there's no Wi-Fi out there. Like, how can you daisy-chain these things together and create some sort of a mesh network? I think these will all be solved. I know they will all be solved, but it is a little bit of a challenge right now. And you guys are probably seeing it with your company as well. I don't know if you guys are offering sensors and stuff like that to put out in the field, or how you guys are overcoming some of that stuff.

Tyler Nigon  27:59  
Yeah, it is challenging, especially in a lot of the rural areas where there's farming. You know, I think it's not as bad as you might think. There is generally good cell reception, you know, out in a lot of rural areas, but it's not as great as in the city. And so it can be a problem sometimes, where, okay, your sensor goes offline for a couple days. Now, with Insight Sensing, currently, I mean, like I said, we're really early stage. I mean, this only started a year ago, and we're just trying to build a commercialized product that is halfway scalable, to be able to expand in future seasons. So we're not focused on hardware, or getting hardware out there ourselves. With that said, we've kind of identified, you know, some more or less standard pieces of hardware, things like weather stations and soil temperature and soil moisture sensors, that we want to be able to use, because we understand their importance for understanding nitrogen status in the crop, for sure.

Justin Grammens  29:00  
So we've talked a lot about getting the data, right? And, you know, a lot of the sensors and some of the imagery. Imagine you have this data now. What sort of tools, I guess, and techniques are you applying, first in your research, and then, I guess, even further down the road? So then what do you do about it? Like, okay, so we've identified a portion of this field that needs more nitrogen, or less nitrogen, or whatever. Walk me through also how you solve that problem, you know, as well, because you probably need to have specialized tractors to make that happen. So I guess maybe start with the first question: we have all this data, how are you applying that, you know, what you're doing right now, to then pinpoint where these issues are at, and I guess utilize AI to the best of what can be done today?

Tyler Nigon  29:43  
Yeah, so I'll use, I guess, our beachhead market with potato growers with Insight Sensing. So potato growers, they have agronomists that are already traveling out to the field and collecting tissue samples, petiole samples, sending them to the lab to get the nitrate concentration. Usually they're doing that weekly; a lot of these farms have someone, or multiple people, that specifically do this. And so we're already working with potato growers because they're already collecting a lot of the training data that we need. I think, you know, this is one thing I don't know that I specifically mentioned yet, but training data in agriculture is generally really difficult to come by, because it's expensive to collect. There are so many variables that even if you have some data collected, usually you only have a few of the features that you want to use, meaning that, you know, it may not be that useful. And then we're working on this annual cycle, and in our case, June and July are the months to collect all of the data. So there are challenges with that. So, like, how do we go ahead and use that data then? For the most part, you know, we realized, you know, with this National Science Foundation grant, part of this was customer discovery. And, you know, I've kind of always known this, you know, as a farmer myself, but a lot of these precision ag companies, they will present you with a product where they show you an image that, you know, is a beautiful map or color map, but it might be, like, relative stress or relative plant status. And, you know, at first, I'd say in 2013 to 2015, they were really, you know, not all that transparent that, okay, this is just a relative map. Now, I would say there's, like, an understanding: okay, you know, take this for what it is, it's a relative map that doesn't have any absolute meaning. We're trying to get past that next step.
So not provide just this colorful map, but provide a colorful map that is representing the units that they're used to dealing with. So I mentioned petiole nitrate concentration. That's what we want to train the models on: to be able to predict petiole nitrate and provide a map in petiole nitrate. In terms of training the models, this is a supervised regression problem. Honestly, as part of my dissertation research, I've looked at a few different models: lasso regression, support vector machine regression, random forest, partial least squares regression. And honestly, if you have clean data, and this is maybe a small sample set, so take this with a grain of salt, but if you have good data, clean data, the model, I found, doesn't really make a whole heck of a lot of difference, other than the number of hyperparameters that you have to tune for that model, and maybe the efficiency of the model to train and test and cross-validate. Support vector machine, in particular, takes a long time to do. Once you have a trained model, though, then we're taking new data for a potato farmer. Okay, we get a new satellite image, new weather information, maybe new management information: that farmer applied fertilizer last Thursday. Okay, let's consider that now, and first update our models. So to have this all automated is, I think, pretty important, because we want to be able to update the models as we get more information. We don't want to train a model and just kind of stick with that, because if that's a bad model, it's going to give us bad information forever. But the idea is, if we can keep updating the models, then we will continue to improve in our predictions and hopefully reach some plateau. And then ask the real question to the producer, the farmer: is this accuracy good enough for you? Can you use this in your operation to make management decisions and feel comfortable about it?
And, you know, really, frankly, from our perspective, would you pay something for that?

Justin Grammens  33:50  
Is there some value?

Tyler Nigon  33:51  
Is there a value there? And that's something that we're trying to figure out. It's something we've found to be pretty difficult, because a lot of the agronomists, the farmers, I don't know that they know what they want. It's like you have to build an MVP, a minimum viable product, and get it to them before you can really get good feedback. Those are some of the beautiful struggles of a startup.

Justin Grammens  34:21  
Yeah, yeah. Well, it's also just the nature of the market you're in, right? You're in an emerging area. It's one of these things where, you know, I don't think the consumer would have asked for the iPhone. If you said, "Would you like a new phone?" No, they wouldn't have, right? So Apple sort of led that. And it's a balancing act where you've got to take input from the customer, but yet, especially in these new and fast-evolving technology times, oftentimes they don't know what's possible, and sometimes they don't even know the right questions to ask. So it's all about iteration. I mean, I've done a number of different startups, a number of different companies, and you're right, you're pivoting. You've just got to make sure, okay, I'm going to pivot a little bit over here, a little bit over there, and hopefully over the course of time you don't run out of money, and there is value there, right? At the end of the day, sometimes you're chasing something that you think has some value, and it turns out the customers aren't willing to pay for it, and you need to sort of short-circuit that. It sounds like there were a couple of things that were coming to mind when you were talking. One is, it sounds like you were talking about getting very specific and hypersensitive to what they wanted to see. Right? It wasn't like, here's how the plant's feeling. It's like, no, this is the exact concentration that you need.

Tyler Nigon  35:30  
It's the same playing field as what you're doing already. You know, you're already taking petiole nitrate samples. This is what we're predicting here.

Justin Grammens  35:38  
Yeah. So that will resonate much better with a customer when you actually try to present a solution to them. I was also curious, you mentioned training data. Obviously, they weren't doing the samples back in the '30s and '40s and '50s, right? How far back do you go? Do you only have data from the past X number of years? What does that data set even look like?

Tyler Nigon  35:57  
Yeah, so that's an interesting question when it comes to the collaboration between the university and commercializing and working with growers. I mean, I don't even know for sure, with potato farmers, how long they have been doing petiole nitrate sampling. I would say since the '90s, definitely. It's probably been a much more prolific part of a management program in the last 10 years. The testing labs have really stepped up their game in terms of getting the samples and doing the wet chemistry to give the results back as soon as possible, within a 24-hour period, or even less.

Justin Grammens  36:35  
I mean, so the samples are there, and it definitely goes back at least a decade, you know, or more.

Tyler Nigon  36:40  
So that helps. But there are probably even more refined tests that are going to come out in the future that might be even better. Who knows? I'm guessing the science on this sort of continuously changes. So one other thing that I wanted to say: this is university research. In the last 15, even 20 years, we've always been, for example, growing potatoes and looking at the effects of nitrogen management on that potato. And even with remote sensing, for 20 years we've been collecting, even if it's not imagery, there are handheld sensors to collect this reflectance information, to be able to really evaluate the feasibility: can we use these sensors to make predictions about nitrogen status? The problem with using some of this old data, whether it's old or not, the problem with using a data set that maybe uses an active sensor to capture reflectance information, is that it's not directly transferable to the sensors that are scalable, or that we have access to. One way to put that is, I always say, all sensors are not created equal. Getting back to the electromagnetic spectrum and multispectral and hyperspectral: an image band in a camera might be centered on the green wavelength, about 550 nanometers, but camera A might have a 20-nanometer bandwidth, so from 540 to 560, while camera B might have a 40- or 60-nanometer bandwidth. And that's a big problem, because the precision on the spectral side introduces noise to that model that really negatively affects our prediction accuracy. Another thing I'm working on on the research side is to be able to collect hyperspectral imagery and, if we have the specifications of another sensor that's maybe more scalable or lower cost, try to mimic the specifications of that sensor using this really high-resolution data, train models with that mimicked data, and deploy that trained model. So it's kind of using hyperspectral imagery, but it's something that's more specific to what we're actually going to deploy.
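The "mimicking" idea Tyler describes, simulating a coarser camera band from hyperspectral data, can be sketched as below. This is a hedged illustration: the Gaussian spectral response, the synthetic spectrum, and the band centers/bandwidths are assumptions for demonstration, not the specifications of any particular camera or of Tyler's actual method.

```python
# Simulate one multispectral band from a hyperspectral spectrum by
# weighting the narrow hyperspectral bands with an assumed Gaussian
# spectral response centered on the target sensor's band.
import numpy as np

def mimic_band(wavelengths, reflectance, center_nm, fwhm_nm):
    """Gaussian-weighted average of hyperspectral reflectance for one band."""
    sigma = fwhm_nm / 2.3548  # convert FWHM to standard deviation
    response = np.exp(-0.5 * ((wavelengths - center_nm) / sigma) ** 2)
    return np.sum(response * reflectance) / np.sum(response)

# One hyperspectral pixel: 2 nm sampling from 400-900 nm (synthetic spectrum).
wl = np.arange(400.0, 900.0, 2.0)
refl = 0.3 + 0.1 * np.sin(wl / 50.0)

# "Camera A": green band at 550 nm, 20 nm bandwidth.
# "Camera B": same center, 60 nm bandwidth.
green_a = mimic_band(wl, refl, center_nm=550.0, fwhm_nm=20.0)
green_b = mimic_band(wl, refl, center_nm=550.0, fwhm_nm=60.0)
print(f"camera A green: {green_a:.4f}, camera B green: {green_b:.4f}")
```

Even with identical band centers, the two bandwidths yield different effective reflectance values, which is exactly the "all sensors are not created equal" problem: a model trained on one sensor's responses sees noise when fed another's.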

Justin Grammens  38:51  
Cool. No, that makes a ton of sense. And obviously, as any sensor evolves and technologies get better, you're right, if the data is collected at one point in time versus 5, 10, 15 years later, you've got to factor in the hardware and how it was collected. You mentioned training models, or deploying models, I guess. How have you been doing that? Have you been running AI at the edge, or have you been shipping data to the cloud? What's your overall game plan?

Tyler Nigon  39:17  
Yeah, so that's, I guess, future work. We're not even quite that far yet. Right now it's handling these different data sources, being able to ingest them, and working with the farmers to get access to their data. I mean, we didn't even talk about data privacy, but there are a lot of concerns around that as well. But yeah, the goal is to get our database on the cloud and our software up there, where every night, for a particular field, or thousands of times within a field, we check for new data. If there's new data, make new predictions; if there's data that we can train on, retrain the model. Getting that to be an automated flow, I think, is probably the only answer to some of these agricultural problems, because of the specificity of the soils, the climate, the environment. Even on one farm, if we go from one field to the next, it's a completely different ballgame. So if we have a model that works in field A, it's not guaranteed to do all that well in field B, even though they might be across the road from each other.
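The nightly check-retrain-predict loop Tyler outlines could be sketched as below. All names here (`FieldState`, `nightly_update`) are hypothetical stand-ins for illustration; this is not Insight Sensing's software, just the control flow he describes: ingest new data per field, retrain only when new labeled samples have arrived.

```python
# Sketch of a per-field nightly update loop: ingest new data, retrain the
# field's model only if new labeled samples arrived since the last training.
from dataclasses import dataclass, field

@dataclass
class FieldState:
    name: str
    X: list = field(default_factory=list)   # feature rows (imagery, weather, management)
    y: list = field(default_factory=list)   # labels (e.g., petiole nitrate)
    model_trained_on: int = 0               # sample count the current model was fit on

def nightly_update(f: FieldState, new_X, new_y):
    """Ingest tonight's data; return True if the field's model was retrained."""
    f.X.extend(new_X)
    f.y.extend(new_y)
    if len(f.y) > f.model_trained_on:
        # Retrain per field: a model fit on field A is not guaranteed to
        # transfer to field B, even across the road.
        f.model_trained_on = len(f.y)
        return True
    return False

field_a = FieldState("field_a")
retrained = nightly_update(field_a, [[0.21, 14.0]], [1200.0])  # new label arrived
skipped = nightly_update(field_a, [], [])                      # nothing new tonight
print(retrained, skipped)
```

The point of keeping the loop per-field is the transferability problem from the interview: each field gets its own continuously updated model rather than one global one.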