93: Ross ‘MEMPHIS’ Pambrun

Examining

Episode 93 | April 01, 2026 | 01:04:08

Hosted By

Kris Hans & Erik Christiansen

Show Notes

In this episode, Erik and Kris chat with Ross "Memphis" Pambrun about his unique path across healthcare, firefighting, technology, music, and media. They discuss how Ross built a career that spans public service, AI and cyber innovation through The Memphis Group, storytelling through Red River Gold, and performing with his band, Memphis and The Grande.


CONTACT

Website: examining.ca

Twitter: @ExaminingPod

Erik Christiansen, Co-Founder & Co-Host
Website: erikchristiansen.net

Kris Hans, Co-Founder & Co-Host
Website: krishans.ca


Episode Transcript

[00:00:08] Speaker A: Welcome to Examining, a technology focused podcast that dives deep. I'm Erik Christiansen. [00:00:16] Speaker B: And I'm Kris Hans. [00:00:24] Speaker A: And welcome to another episode of the Examining podcast, the technology focused podcast that dives deep. Today's a really special interview. Today we are here to welcome Ross "Memphis" Pambrun. Welcome, Ross. [00:00:38] Speaker C: Welcome, folks. Couldn't be happier to be here. It's my Saturday night. [00:00:41] Speaker A: Exactly. This is terrific. And so we're here to do an interview and talk about your background. So I think my colleague Kris is going to kick it off. [00:00:50] Speaker B: So, Ross, for our listeners who may not know you yet, can you introduce yourself and give us a short version of your background and how you ended up moving across worlds like healthcare, firefighting, technology, media and leadership? [00:01:08] Speaker C: That's that number one elevator question, and it's the hardest one for me to answer. So for any of the listeners out there, I'm Métis. Let's just say I'm a bit swarthy. How's that? I generally wear a blazer, and on my blazer I have a beaded poppy with moose hide on the back, because my father was an Aboriginal veteran. And I also wear a pin on the right side that shows the 15 colors of the foundational languages in the Indigenous community. And I'm very proud of the fact, and this is a story that maybe I'll share afterwards, that I wear medals of service, one from the federal government, one from the province of Alberta, and one from the city of Calgary. In my journey, I had two Saskatchewan parents. And as a result of that, my dad's choice when he was young, coming out of Saskatchewan, was you either become a priest or you become a soldier. So with my soldier father we ended up over in Europe, and I was born in Germany.
Now, as you end up coming back as a son of a soldier, you're an army brat, you live all over Canada. Or as I like to say, you get beat up all over Canada, because you've got to make new friends everywhere you go. So along the route, when we finally ended up out west, I took the opportunity to become a registered nurse, because my mom was a registered nurse and I had no idea what I was going to do. So I ended up getting to advanced critical care, worked to the highest level of cardiac sciences. And then I had an older brother who said, you know what you should do? You should become a firefighter. All right, that's not easy to do. But I managed to make the transition and get into that career. But at the same time, I'd been a musician since I was 18 years old. I'm an actor right now. I have shows that are on TV five days a week. I have other shows that are coming up that we're working on. But in the world of technology, we ended up building an occupational health company. Now, I've since sold that nationally, but as a result of the work we were doing, we were using machine learning to come up with new strategies to protect people's hearing in occupational health. Now, after we sold the company, on the success of sort of that awareness, I had another business partner reach out to me saying, hey, do you know that there's a gap in environmental monitoring when it comes to wildfires? And I said, well, if you can show me where it is, I will make sure we get funding for it. Now, at the same time, because I needed to learn, I needed to continue to advance my studies. And it's crazy. There have actually been places that I've forgotten that I've studied. That's how I've lived my life. I've always been learning. So I was taking courses out of Harvard and MIT, because they were the only ones that were offering what I needed to learn.
We built a satellite tool that looks back at the Earth, and I can tell you exactly which tree is going to burn down first and which house will burn down first as a result of that. So when you combine all of these worlds, and I used to give talks in the Indigenous community about Indigenous awareness and how to share culture, all of a sudden the Speakers' Bureau of Canada comes to me saying, boy, we've seen you talk, you're really good. Would you be interested in joining our bureau? But can you talk about anything else other than Indigenous topics? Well, I guess I own this small, European Space Agency recognized satellite company that does artificial intelligence. He smiled and said, you'll have a talk tomorrow. So since then I'm in New York City, I'm in Ottawa, I weave and interweave the relationships of understanding fairness and bias in artificial intelligence. But I use references from the Indigenous community, my knowledge, and there's one thing that always grounds me. We're still human and technology isn't. Now, how was that? [00:04:50] Speaker B: That's awesome. [00:04:51] Speaker A: That's terrific. That's probably our best. You know, we have some follow up questions to ask too, but that's a terrific encapsulation, because I like how you trace the thread throughout. [00:05:01] Speaker C: Well, I hope you do. I don't want you to hang up on me just yet. [00:05:03] Speaker A: No, no, no, no, we wouldn't. You know, but that's a great segue, Ross, because I was reading about your background, so it was interesting, the firefighting and how that connects to the Memphis Group, which I wanted to ask you about. But just as an aside, being a registered nurse, you did that for seven or eight years, as I understand. And would you say that was, I mean, was that experience, did that in any way change your.
I know that we're going to talk more about technology, but impact your perspective? Because you talked about bias and fairness, and, you know, there's a bedside manner to being a registered nurse. And I'm just kind of curious now about how that maybe changed your perspective, your personality, when it comes to even approaching technology. [00:05:51] Speaker C: I absolutely love that question, because you're bringing back the fundamental question of how do we create values. And people are always looking at ethics, but we need to continue to look at our values and the morality behind our values. So for me, when I'm keynoting for a thousand people, somebody invariably is going to come up to me and talk to me afterwards about the challenges that they're facing, that their threat actors are causing harm, the deepfakes, and what's happening to their families. And those questions always invariably go back to the human condition, our fears, the challenges we face. And nobody ever comes and tells me about just how wonderful their avatar is. Nobody ever does. So all of those years of nursing, when, no matter what the situation is, anybody I'm talking to, when you're sick, you don't care about anything else other than getting better. When your family member is sick, you don't care about anything else other than getting better. So that grounds me to realize we need to use our technology for good. How are we building better advanced tools to monitor for breast cancer faster, men's health concerns? How are we helping anybody in the community when it comes to exploring new ways of understanding medical science so that we can continue to strategize about better practices? And it's happening every day. That's one of the things about agentic tools: they think differently. That's good. You know, when it comes to neural nets and I have conversations with people, I'm like, remember the encyclopedia? That was one neural net. That was it. We all relied on that.
But I said, we also all think differently. So when each person is utilizing that tool, it's great that somebody says, well, now that we have these agentic tools, we don't have to think for ourselves. Yeah. I can still tell you, when I look at the color blue, my wife looks at it and sees a different color of blue. Well, how is that possible? Yes, because we're trained to think differently. We look differently. We have different experiences when it comes to it. That's the nature of neural technology. [00:08:07] Speaker A: And the idea of experience is interesting. So, as I understand it, the Memphis Group, well, you touched on it, uses satellite imagery, artificial intelligence, and machine learning for firefighter preparedness. Can you tell us a little bit about the gap that you saw to create that company with your colleague, maybe? And you mentioned funding. It would be interesting how you pitch a company like that. I would imagine, based on the current environment, that that was maybe easier today than it was 20 years ago. [00:08:39] Speaker C: Well, the Memphis Group, it's funny, and I'll let you in on a little secret how that worked many, many years ago. Again, I've been a musician, and I've traveled with some Hall of Fame rock stars, but at the time, I was quite young, and the only musician I ever wanted to meet was B.B. King. And I had a chance. The first time ever, somebody knew a guy who knew a guy who said, you could probably go backstage to meet him. And I was elated. I didn't know how to respond. Well, I'll jump forward in that experience. But I had him sign a picture, and it was a great picture. And he smiled at me, and off he went. I had my time with B.B. King, and off he went. Well, somebody behind me said, Memphis. That didn't mean anything to me. They said it a couple more times. So finally I turned and I looked at this gentleman and I said, sorry, you keep saying Memphis?
He looked and said, you got that picture in Memphis, Tennessee, didn't you? Yeah. He said, you got that picture at a little poster shop at the end of Beale Street. Yeah. He said, that's worth $5,000 now that it's signed by B.B. King. And it's a picture of Elvis and B.B. King in 1957 standing on Beale Street. And that's the only place you could ever buy that picture at the time. So when I went and told the musicians that I was playing music with at the time this story, they looked at me and they said, Memphis. And I've had people since, and it's the greatest accolade in the world, musicians say to me, you're the only person I've ever known that could pull that stage name off. And on my TV show, depending on which TV show you're watching, you know, I might go by Memphis. Depends on the nature of the character. So when it came time that I'd sold one company and I had my own numbered company, I just needed a name for doing business. So a friend of mine said, well, why don't you just call it the Memphis Group? Makes sense to me. All I'm doing is, you know, writing grants and getting opportunities, but nothing that was ever really going to expand into a brand that I was exploring. Well, then all of a sudden, again, we find this environmental gap. And what it is, is as we look back at the Earth. So often when I've been in conversations with government, they always say, okay, you want to do environmental monitoring, sure. How much helicopter fuel do you need? These are questions I actually get asked. I said, well, the interesting thing about what we're doing is we actually use data from satellites. We don't have any funding for that. Well, it's just data. Well, but do you need helicopter fuel? No, no. In fact, we could use drone technology. But the satellites are going all around the Earth, and they capture what's called multispectral or hyperspectral images of the Earth. So I've had to explain that to the governments.
Well, all of a sudden, when we explained it to one of our big government funding partners, they said, wow, that's amazing. Can you prove to us that you could actually do that? So we partnered with one of the universities and said, yeah, you prove our science, we prove that we can do what we're doing, you tell the government that it works. And that's what happened. And I'll give you an example of how we think differently. I made a comment a little earlier about my wife and I both seeing the color blue. Well, the three of us very likely see red, blue, green, our primary colors, how we visualize things. And from those three colors that the three of us can see, we can extrapolate a thousand shades of blue. But in the ocean, there's a little mantis shrimp, just this tiniest little shrimp. Colorful, beautiful. Imagine a little shrimp just walking along the floor, and it sees virtually the entire visual spectrum. Ultraviolets, infrareds, it sees thermal. But the thing about that little mantis shrimp is its brain is so small, it only knows what's going to eat it and what it needs to eat. So when we tie one of those visual cameras to a NASA satellite, which they have, or if you put them on a spaceship, or if you tie them to satellites like the European Space Agency's Copernicus network, as they go around the Earth they capture swaths of hyperspectral and multispectral imagery. Now, if we can use computers, folks, to look at the Earth and extrapolate that data, that's how we tell exactly what will burn down first. From that, imagine the rest of the information. Well, if it's really dry, that indicates the indices of drought. Okay, what are the risks of drought? Can we look for pests? Sure. I've had people ask me, can I follow where the buffalo are? Because they want to keep one buffalo separate from the buffalo herd. And I said, well, why don't you just use GPS for that? Like, there's a time and a need for certain values of information and data.
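[Editor's note: the "which tree burns first" idea Ross describes rests on band math over multispectral imagery. The sketch below is a toy illustration only, not The Memphis Group's actual model; it uses the standard NDVI and NBR indices with hypothetical reflectance values, and the band names follow the Sentinel-2 convention (B04 = red, B08 = near-infrared, B12 = shortwave infrared).]

```python
# Toy sketch: ranking pixels by dryness using two standard spectral indices.
# Assumption: reflectance values and the index combination are illustrative,
# not an operational fire-risk model.

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index; low values suggest sparse or stressed vegetation."""
    return (nir - red) / (nir + red)

def nbr(nir: float, swir: float) -> float:
    """Normalized Burn Ratio; lower values correlate with drier, more burnable fuel."""
    return (nir - swir) / (nir + swir)

def rank_by_dryness(pixels):
    """Return pixel ids ordered driest (highest burn risk) first.

    `pixels` is a list of (pixel_id, {"B04": red, "B08": nir, "B12": swir}).
    """
    def greenness(item):
        _, b = item
        return ndvi(b["B08"], b["B04"]) + nbr(b["B08"], b["B12"])
    # Lowest combined greenness sorts first, i.e. driest first.
    return [pid for pid, _ in sorted(pixels, key=greenness)]

# Two hypothetical pixels: a healthy stand and a drought-stressed one.
scene = [
    ("healthy_stand", {"B04": 0.05, "B08": 0.45, "B12": 0.10}),
    ("dry_stand",     {"B04": 0.20, "B08": 0.25, "B12": 0.30}),
]
print(rank_by_dryness(scene))  # the dry stand ranks first
```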
And that's what I try to do. Now, in some cases it's a hard sell, because a lot of communities go, well, we've never burned down yet. Is that how you want to live your life? Because California didn't seem to work for them. And then I have other communities that go, we're never going to change our landscape. How about, then, if we explore alternative routes for you to get out of your community when it does burn down? Because right now you only have one. And if you can imagine how many cars are going to be lined up to get out when that fire is coming, that's not going to be safe for you. So that's one of my goals. But as a result of the success that all of a sudden I had, the next thing you know, I was helping study terrain in Africa to understand species translocation. Is there enough of a natural habitat that we could put the indigenous animals back into that community so they could survive? Love that question. Then all of a sudden I became global. Then all of a sudden the Wall Street Journal's Future of Everything Festival in New York is calling me saying, can you come to their stage and talk to them about artificial intelligence and the bias and fairness in AI? And, folks, I just took too much time away from this dude's question. Let's get back to the next one. [00:14:56] Speaker A: Well, that's really interesting though, because you talk about kind of using information that's available in novel ways rather than reinventing the wheel. [00:15:04] Speaker C: And that's what happens. A big conversation I just had the other day in Regina was at one of the universities. It was their AI Futures conference, and it was about the nature of data sovereignty, but in the sense of a trust. What often happens is you've got 50 companies out there and they're all vying to work with some of these large municipal or provincial or federal partners. But it's just repeat data. Why are we re-buying data? And if we're buying data, why don't we hold it?
Why doesn't our agency hold it? Whether you're a university or a municipality who paid a vendor to do some work for you, you should be the one holding that data, sharing it with whoever you want. The nature being: buy it once, use it often. [00:15:56] Speaker A: We could talk about academic publishing another time. I think we would have a lot to say. I work in open educational resources, so I totally get where you're coming from, but I won't. [00:16:06] Speaker C: And I've been through your MRU library, and I love that interactive space that you guys have with the wall that surrounds you. It's just delightful. [00:16:17] Speaker B: Yeah, yeah, absolutely. So a lot of people talk about AI as either a miracle or a threat. From your perspective, Ross, where is the real value right now, and where's the hype getting ahead of reality? [00:16:33] Speaker C: So in the past, and I have no ego, again, I've traveled with rock stars. Ego doesn't work. And the good rock stars know ego doesn't work either. But it's something that's a little bit hard to get away from. And in my case, I want to meet everybody. As I tell anybody, everybody in the world can have 20 minutes of my time. But honestly, there's still only so much of me to go around. So when it comes to the nature of hype, the one thing I do know is me. And I know me. And I guess I get to trust me. So at the time, let's talk about five years ago, as the generative tools were starting to build and we had our large search engines, when you typed in who was Ross Pambrun, very little information came up in Google. It was a bit of a fight. And then all of a sudden ChatGPT came out, right? Just that beautiful spirally look, and said, guess what we can do now? This generative tool, this generative pre-trained transformer. Most people have no idea what a large language model is, but they're starting to explore it.
And the world said, you have to try this. So I tried: who is Ross Pambrun? And it said Ross Pambrun hasn't made enough of an impact that we have any information about him. I'm like, folks, I'd been a musician and a keynote speaker for a while by then. Okay, that one stung. But what I knew was we weren't multimodal yet, right? We weren't connecting all of the dots. This was just one LLM that had a certain repository of, let's call it borrowed information. Stolen's another word. But you know that somebody has put information into this tool. Okay, so as I was starting to give keynotes and explaining these generative tools, each time I would go to the next tool, I'd say, who is Ross Pambrun? Well, once we moved up a year and a half, two years later and we were into the Copilot space, all of a sudden I could see there was a ton of information that was starting to come up about me, but it wasn't connecting with all of the socials that were out there. But we're at a point now where I can set a prompt saying, who's Ross Pambrun? It shows me where the tool finds the information. It references podcasts, it references TV shows, talks, the interviews that I've been in. So the scraping of the Internet is occurring at a much faster speed, and the connections are occurring faster. Well, I tell people, you have to ask yourself, and this is one of my catchphrases: if you don't participate, AI will define you. But here's the thing, it's not such a bad thing if AI doesn't know anything about you. What is the nature of the role you're trying to play? Mine is to bring people into conversations, bring humans together. These tools are nothing more than tools. There is no consequence to an AI agent when it comes to giving you a wrong response. Now we're getting to a point that lawsuits are occurring for the agencies that manage large LLM agents, but we're at the infancy of that.
But what are you going to do if the agent gives you the wrong information? You're going to get upset. Again, these are simply tools. You have to understand what information is out there. And the disinformation that you may put out there about yourself, guess what? That's going to be attributed to you as well. So it's our job to understand who we are online. If it's a value to you, we're going to end up like the Europeans, where maybe we trademark or we start to copyright our images, so that at some point we're going to be in that space of control. But what I often run into right now is the conversation of deepfakes. At the end of every one of my talks, invariably some family member comes up to me to tell me in private that they had a son or daughter, or somebody in the LGBT community, who was deepfaked. Okay. I said, well, I understand the fear, because I'm mature enough to understand the fear. But what we haven't done is a job of educating our youth that this is your time where you have to put barriers up. And your responsibility in a situation like that, if you can't just walk away, is to say, it's none of your business. And based on the way the world is right now, what makes you think that would be me? But you have to protect yourself, and you have to get out of there. And our youth are going to make mistakes, right? There's too much opportunity right now. Everybody's got a camera in their hand, and they think that there will never be a consequence to them. Well, it very well could be a consequence to you. But here's the thing. The safe space right now is, yeah, with deepfakes, your answer out of it is that it's none of your business. Drew Barrymore, that poor thing, she posed for a men's magazine back in, I think, probably the 80s. And at that time, there was no Internet. We didn't have the availability of digital cameras.
So she just thought to herself, at her age, once these magazines have run their time and they're ragged and nobody looks at them again, they're just gone. Well, the Internet created a space where any bit of information, digital or printable, that you want to take a copy of, there's an exposure. She wishes she had never done it. But at the end of the day, we're all human. We all make mistakes. It can just be a passing fancy. [00:22:12] Speaker A: I was going to ask you, Ross, about, and I think you kind of answered this question already, bias and fairness in tech and culture, but I want to pivot it a little bit, because I think you already touched on this idea of data governance. You know, one of the things that Kris and I have talked about on this podcast is that these tools are really great, but there's instances where they've turned these AI tools on, even within a company, an organization, and suddenly now all the documents that shouldn't have been shared with me, I can now unearth with this incredible scraping tool. We've also talked in our last episode, which was out a little bit late, but I put it out, about kind of the advantage of working with these tools if you're of that generation, like Gen X, you know, and things like that. So I wondered if you could touch on those things, particularly the data governance, and some of the Indigenous perspectives on that. Because I sometimes wonder about these things being sucked into tools in the right place and time. [00:23:14] Speaker C: Yeah. [00:23:14] Speaker A: On the type of information. [00:23:17] Speaker C: Well, you know, I'm a bit of a storyteller, so how I'd like to present that is twofold, because I believe in tools like RAG, retrieval-augmented generation.
But we have to understand the why, you know, and have a plan for why we're doing things. So, having worked in healthcare and worked with a lot of different agencies, and I still speak to a lot of health agencies, the nature of that tool is this: imagine every unit, and this is not unusual, every unit in a hospital had their own different hand washing techniques. Here's the 10 things you have to do to wash your hands. Well, in retrieval-augmented generation, if you were to pull, let's say, all 50 hand washing techniques into your agentic tool and say, standardize these, the tool will standardize it. Now, the beauty of where we're at right now is we're getting to a space where, once we've standardized it, and now it's our responsibility to validate it, we could then ask the tool, let's work to put it in an available resource for anybody of a different language, or somebody who could actually talk to it because they're visually impaired, or vice versa. Now, we still need the validation of the information if it's going to be put into a different language. But imagine that as a resource. Now, one of the challenges with that is, if I'm an agentic tool, I know nurses. Ever since the moment you walk into your first hospital, you're taught to wash your hands. But I go back to the human condition. There was a study that was done to research the nature of hand washing. They came out with a document that didn't include "wash your hands" first for nurses, because every nurse washes their hands. Yeah. But once that document came out and it didn't say wash your hands, they started to see a reduction in nurses who were washing their hands. Because our human condition is that we still follow rules. And so once they put that rule back in there, wash your hands, then all of a sudden the rate of responsibility went back up.
So when it comes to who now holds that information: well, if you're the organization that created the value of this commodity of hand washing, once you've standardized it and you've made it work across your agency, that is a value to you. That is a product that you now should own and be able to share. Imagine how many hospitals there are across the world. We need to think globally about how the value of this information works. Now, in the Indigenous communities, there are two sort of frameworks that have been developed, and these frameworks can extend farther outside of the Indigenous communities to anybody. First Nations use one called OCAP, which is ownership, control, access and possession, and Inuit and Métis communities, and larger communities outside that group, use one called OCAS, which is ownership, control, access and stewardship. So there's a responsibility of who gets to manage it as well as who's going to guide it. Now, if you're a community, let's just say this little podcast right here, all of a sudden we're going to take every bit of information from the Examining podcast and we're going to put it into one resource: all the knowledge, the technology, the discussion. Now, that is a resource that you guys should be holding on to, which I know you do. And when somebody accesses that, when all of a sudden Google comes calling, because this is an amazing podcast and I love the charm and Kris's brilliant thoughts and Erik's ideas, when these things come out and somebody goes, hey, you had 50 hits, there's a value to that. You guys built this value. So you have the ownership. You should control who you give it to. Right? We need to control access. That is us ensuring that where it's stored is a place where you have the resource, but who gets to access it? You get to control that. And then the stewardship. Yeah, you know what? We don't want to give it to you.
We don't want to share it with this organization, because there's enough agentic tools, there's enough large groups out there, that your information is not stored anywhere near you. And a lot of times, DeepSeek, for example, stores all of your information securely in the People's Republic of China. So you have to decide where you want your information to be. So everybody has to start thinking about that. And I tell people, just because you have social media doesn't mean you need to put your information on there. You get to control it. And at the point that you don't feel comfortable with it, don't. Don't worry about it, people are still going to be able to find you. [00:28:13] Speaker B: Those are good points. I mean, I remember when we first started the podcast, that was a big decision that we had to make: where are we going to store our actual files? Because there were plenty of tools where you could store it for free, but then who knows what happens with your data and information? [00:28:32] Speaker C: Well, I'll ask you two: how much do you enjoy, and how often do you finish, reading the end user license agreement with all of the technology you use? [00:28:41] Speaker A: I don't, but, you know, now I can upload them into AI, into Claude, because the context window is so big and I can query it. So I feel like we may have solved that problem. [00:28:51] Speaker C: I've been in conversations where experts have said to Microsoft, you know, under this allowance over here, it says Microsoft has the right to your information. Well, that's not the intent of it. That wasn't the intent of what this end user license agreement is for. That's for a judge to decide someday. I get it that you're saying this.
You know, these are the conversations that I hear back and forth, and I hear experts looking at each other, and it's like, yeah, once it's online. And I asked an agentic tool this just the other day. I dropped in a whole bunch of information and I said, now here's my question to you as a tool, because this is what I teach a lot of people: can you never, ever, ever share that information? And it clearly responded back saying, I have neither the technical nor legal ability to guarantee that I cannot share the information. Of course not. Once it's been typed, it's out there. Much like once it's been said and been recorded, it's generally out there, until you change my voice and make me sound, you know, less glamorous through a deepfake. But until then, yeah. So we have to be very conscious. And it's unfortunate that that's the nature of privacy right now. But I was just in a meeting yesterday where a cyber security expert was saying, oh, they have somebody who lives in a world where they can't be online. I kind of started to laugh, like, okay, so there's some, you know, government agent and they can't be online. Have they ever walked past a bank? Well, yeah. So they've used a bank machine before. Yeah. Then they're online. Like, I'm sorry, right? There's no place that you can go that you can't be. Now, I get it. They're not creating social media posts for themselves. I get that. But then you've got to come up with better terminology, because your comment, for being an expert, actually makes no sense. If this person only walks wherever they go and takes back routes, maybe. But if they've ever flown, you're online. [00:31:08] Speaker A: Whether they know it or not. [00:31:10] Speaker C: Whether they know it or not. [00:31:11] Speaker B: Yeah, it's a good point.
So, I mean, you kind of touched on this, but maybe if you want to just tell us how AI has changed the way that you work personally, not just professionally. And if there's any type of tools or workflows that you genuinely find useful, or where you think that human judgment matters more. [00:31:37] Speaker C: Oh, that's a big question, isn't it? That's a fabulous one. When it comes to the conversations about bias and fairness, when I talk to organizations, I familiarize people with tools that are not native to their organization. So I'll talk about the DeepSeeks, I'll talk about, you know, the Geminis, the Claudes, and I'll explain to them, when you're using these, say, sort of publicly available technologies, if you're using it for free, then you are the product. And I tell people, write that on your arm. And every time you go to use your computer or use your phone, look at that on your arm, because understand, somebody is getting value out of that. Now, one of the things that I'm responsible for is, say, organizations that come to me and they're exploring human resources, because everybody thinks we've got to save money in human resources, so we're going to make this easier for ourselves. Okay, so I've explained to you the tools you're using that risk your information, and then I'll talk to you about things like Copilot, that technically are native to your organization, which means they're protected behind your firewall. They protect your information security, they're responsible for ensuring that information doesn't get out there. Yeah. Okay. So now, if you're going to use a tool that integrates with something like one of these programs and you want to try an HR tool, I said, use synthetic information and let's just see what opportunities present themselves out of it. And if you're going to use an HR tool.
I look at that person, that one individual who just asked that question, and I say, you put yourself through that hiring process and let's see if you get hired. Because if you don't get hired, then you might not want to work with that organization, or you might want to explore how their algorithms have been derived, because we're creating limitations. I'll give you an example. Amazon had created a hiring algorithm. Now, this was a number of years ago, and in that algorithm, if you said that you were a woman, if you went to a women's college, if you played a women's sport, if you had that word in there, you were not getting hired by Amazon. Now, there are great articles that explain what happened in that situation. So for one year, and you think about it, in STEM technology, you're a woman and you wanted to maybe work for that organization, you couldn't get hired. So for 3% of your life, if you're going to work for 30 years, there was not an opportunity. So when people look at that and go, yeah, but who set the bias? Absolutely. Who trained the model is the question. So what information was available in there? So for me, part of what I'm doing is, as I research each of the tools, I invariably will try the information through all these different agentic tools and go, Claude, help me out here, and see what the experience is like and does it meet my need? So in the nature of humanity, I find it's a wonderful tool. If we look at the Perplexity tools out there, a bit more Google-shaped. But again, folks, always understand that there is an inherent, somebody owns this, so where are they trying to direct you, depending on what you're using it for? As you fiddle around with OpenAI tools, I try them all and see what the answer is. And again, I go back to the simple part before, where I used to ask, who is Ross Pambrun? A lot of times I'd get misinformation, because that was single prompting.
And when I would say, where did you get your references? Once I asked that question, that secondary prompt, the tool itself had to provide where it got the reference, and all of a sudden it corrected itself. Well, I thought, now that's interesting. So if the most front-facing technological tools are not working, and you're just adding an API to it, all of a sudden you're just going to have those continued inherent mistakes. So I use them all. But when it comes to how I build my workflow, there's still a human side to me. I still write notes, and when I reach out to somebody and I pass them a business card, I tell them all the same thing: you can throw that business card out by the time you get to the door. Not worried. There's another relationship that I could have. But we're still human, and if you reach out to me, we're going to have a good time, going to have a good conversation. And so when I got your guys' conversation opportunity, this is going to be a good time. Yeah. Because I know you two do your work. [00:36:51] Speaker A: Interesting that you talked a bit about that human connection. Because I think, as I kind of alluded to, there's kind of an either-or thinking that's happening with AI. It's either going to replace people or they're going to be involved. But there's still, like you said, there's still plenty of opportunity to use your own judgment, think things through. It's funny when you mentioned the single prompting, because Chris and I have discussed this kind of ad nauseam, you know, the difficulties, why prompting is so hard. Because we taught people to boil down searches to simple search strings for so long, and now you have to talk to a tool where you kind of have to do the opposite. It kind of works in your favor, as I said, to be a bit of a blabbermouth with AI, because it gives it more detail to work with.
Do you find people's minds open up to these tools a bit more, and the quality of them, once they see the difference in responses from different models? I mean, that's been my experience working with students. I see them working with a free tool where they're perhaps the product, and then I will try that same prompt, or maybe a slightly edited version, in a reasoning model, and then they go, wow. From a research perspective, isn't that so much better? [00:38:02] Speaker C: You know what you're setting me up for here is for me to tell another story, which is essentially what happens with me. And in this concept, hearing a little bit about that, it takes me in the direction of anthropomorphizing. Now, one of the first times, after there was that period of time that all of us had to wear masks, I don't even want to say it out loud, but I was down south and there was a robot that went past me. And I ran up to my hotel room and my wife's like, what are you doing? And I called down to the front desk and I said, I need a comb. Okay. And a couple minutes later, the phone rings: there's somebody at your door. I go and open the door and there's this robot. Priscilla is her name. When it sees me, the top opens up and there's a comb inside there. Now, I've seen lots and lots of robots since. Now, people would ask me, well, did they take a job away? Did I need the comb? Right? So sometimes we create unnecessary value. But in the case of naming this thing Priscilla, I think it's so that we are less likely to kick that robot over. Because I was recently at a place where there were Waymos, the self-driving autonomous vehicles. And I'm watching them as they're trying to navigate each other and navigate one single person. And I just ended up crossing the street and stood there for a second going, now what happens?
So the question you're kind of asking, when it comes to the agency of agentic tools: what's the purpose and the value that you're trying to get out of it? If you just want some code, does the relationship that you have with that technology matter? But we're still human. And so when Sam Altman said, I spend millions of dollars to make this thing human — no, you spend millions of dollars because we're more likely to interact with a tool that we feel has some humanity, our humanity, behind it. But we need to really explain to our youth: there is no humanity in it. There's intelligence in it. There's even emotion in it. Because when I tell somebody, well, there's no emotion in artificial intelligence — if you think they didn't steal 3,000 of Dolly Parton's most beautifully written songs and put that into these LLMs with all of her emotion — I'm sorry, there is emotion in there. But is there intelligence? Anything that can get smarter, as far as I'm concerned, can gain intelligence. But is there humanity in it? Is there intuition? At some point there's going to be a hybrid where we go to Spock's Brain and all of a sudden there's a neural relationship, and so there's something I'd be more likely to be able to trust. But when it comes to our youth, you have to understand, whatever answer it's giving you, there could be a bias behind it, and we need to understand what our relationship with it is. And I'll give you a little story if I have a moment for it. This is a bit arrogant, or maybe it's narcissistic, I can't figure out which one it is. But I was in Saskatoon speaking at this absolutely tremendous business event. It was high-hitting, it was delightful. Saskatoon was going out of their way, their university, to build this product, a business product, for this meeting. Fully attended. Wonderful. So Katie King was there, there was another speaker, and I was a keynote on day one.
Now, I gave a wonderful talk. Loved it. I just love getting on those stages. I love the relationships of people coming to talk to me afterwards. I eat this stuff up because I want to share it. I have no ego. What happens is people say thank you and I just say, I'm still a human being. I appreciate that kindness, because that's what I need to keep going, that I'm sharing the right message. Then Katie King gave a talk. Well, the next night there was a keynote speaker. And I took my wife along, because at the same time, my TV show was coming out the same night that this keynote speaker was giving a talk. And my wife said, we need to go see him, Ross. You need to see him. Unfortunately, we can't watch your TV show Red River Gold tonight, even though it's episode one coming on TV, because we have to go see this guy. So we sat at the back of the room. His name is Adam Cheyer. Now, Adam Cheyer invented this little thing you may or may not have heard of called Siri. Siri just happens to be on anybody's iPhone. So he gave a wonderful talk, validated a lot of the things that I've been saying, and came up with the same expression. And we had to talk about it afterwards, about the nature of, is there humanity in this tool? No, it's a tool. Now, his talk was exceptional, and he's just a character. As he gave his talk, I don't want to give too much away, but he's a magician as well. And he described how AI sees him, and as he does, he takes some duct tape off his shirt — he's wearing a tie. It was delightful. And I said to my wife, I need to meet Adam. So at the end of his talk, there I am backstage. Now, there are not as many people out there who need to meet AI speakers as you think, folks. But there were about three or four people waiting to see him. As I'm waiting there patiently, he looks over the crowd, he sees me and he says, I need to get a photo with you. Folks, did I mention that Adam invented Siri?
So I'm looking around going, no, he must be talking about somebody else. And he says, my son and I did the research on you. I need to get a photo with you for my son. Now, everybody else left; we stayed. I had a wonderful conversation. We're good friends now. And I was asking him, like anybody else would, about Siri and its nature. And he talked a little bit about this in his keynote as well. He said Steve Jobs was asking him repeatedly to sell Siri. But it's like, I'm not interested. I'm not interested. On the 30th day in a row, Steve Jobs asked him, and finally Adam said to him, why do you want this product so bad? He said, I'm not looking at this as an AI tool. I'm looking at this as a product that will connect people. And Adam said, then I'll sell you my product. The fact that Adam needed to meet me, I still don't understand it. But he's a man who does the research, and it's something that stuck with me. My attitude has always been: if it doesn't connect us, it just separates us. So I try to live by that as well. What is the product of the tool? What are you using it for? If it's going to build a relationship between you and somebody you're with, it's going to connect us. And so that was a pretty high-spirited day for me, when he said he had to meet me. And at the same time, my TV show was on the air. Crazy. [00:45:29] Speaker A: Chris, did you want to take the next one? I've asked lots of follow-ups, so I feel like I'm hogging it. [00:45:34] Speaker B: Yeah, no worries. So, you know, one of the things — you've worked in pretty high-trust, high-stakes environments. How has that shaped the way that you think about technology, especially in a culture that often rewards speed before reflection? [00:46:01] Speaker C: That's another one that's actually incredibly important. So, Chris, that's very well framed. Here's what I tell people.
And this happens to me all the time. You know, having been a nurse before and a firefighter for over 25 years, and I'm just closing in on 30 years of service, and that's sort of the nature of why you get medals. But people will say these are the two most honorable professions. And I don't discount how important that is to me, to my relationship with my parents, my father, who served as a soldier, my family, my wife, who's a nurse practitioner. Our job is to still give and to serve. And so the idea when somebody comes up to me and says, Ross, can you keep a secret? No, so don't tell me. But I go, if it's something that you need to get off of your chest, then you need to talk about it. But understand, how big of a deal you think this is going to be is probably not as big of a deal as it will be as we move forward through time and space. But if it's just something that you just don't want to share, then don't share it. So that's the idea that I have when it comes to technologies. I was working with the Office of the Commissioner of Indigenous Languages, and I was working and emceeing some events. Because I get hired as an emcee, or I get hired to moderate or have conversations or be on panels, and one of these ones I was working with was languages. Now, one of the panelists had been working with one of the many large agencies, the Metas, the Microsofts, and this was in an Indigenous language, where the other panelists were researchers. And as I looked around the room, I could see some of these individuals that were working with language were giving away their language. Because let's say a large organization comes to you and says, well, we're going to pay you a reasonable wage hourly, and you need to validate all of your language for this language LLM. Okay, so it's a translation app. Now, I don't want to offend people. I have to be very cautious of how that works.
But I said, well, do you understand that now that you've put that in their space, they own that? Well, what do you mean? If you found a partner who was willing to help you translate the information, record the information, store the information — OCAS, this ownership, control, access, stewardship — then you could go back to these language translation devices saying, happy to work with you. Here's what it's going to cost each time you access our information. Now, there's a value to our information, because the likelihood of the translation is going to allow our community to now work with your community. And as we start to work and share information about environmental concerns, environmental governance, the history of our communities and the knowledge and the background that we have, that commodity is going to be shared amongst our community. And as we validate it, as we strengthen the data, we're going to build a product that other people are interested in. But what happens sometimes is people mix up the nature of beneficence versus non-maleficence, do good versus do no harm. So they think that they're doing good. Well, no. Are they doing no harm? Actually, yes. So when we go back to the relationships that we have when it comes to the protection of our health, that's the fundamental challenge that every physician has ever faced, every healthcare provider: how can I do good but also do no harm? So if we're going to ask ourselves, the way I like to think, seven-generationally, we have to ask ourselves that with every one of these questions. But a little earlier we were having this conversation about where we are and how I use technology. When ChatGPT came out, if you weren't trying it, you were one of the rarest people on earth.
And the peak of inflated expectations went so high that if everybody was trying it, then the novelty of asking the question of who is Ross Pambrun wore off for everybody. And so people started to realize, how does this help my life? How does it help the relationships with people that I have? And they were seeing, it's a bit of a tool, but it's not really helping me. So we need to focus on that, and our large agencies have to ask themselves that: what tool are we making available, and why? [00:51:08] Speaker B: Yeah, you know, it's interesting. I've been chatting with some former students of mine, Ross, and also how people are using the technology. Like, it's kind of mind-boggling, like using it as a therapist and psychologist, and, you know, apparently ChatGPT, and even how they're referring to it. It's like, he knows me so well. [00:51:33] Speaker C: That's the anthropomorphizing. Yeah. And the human condition is to get the answer that you want. Anybody who's ever been in a relationship — at the end of the day, we're self-serving human beings. Can we get what we want out of this? And technology is there to give it to you, because it wants you to stay connected and it wants you to enjoy. But I guarantee that if you were to sit with a group of kids and put a deck of cards out and teach them how to play fish, at the end of the day they're going to laugh differently, they're going to explore the relationships differently, they're going to try and cheat, and then somebody's going to laugh and go, you've been caught. Yeah, that's the beauty of the human relationship. That doesn't happen with AI, because there is no consequence. [00:52:25] Speaker A: That's a terrific answer to that question, and probably a good follow-up to our last thing that we do, Ross, with all our interview guests.
And this is, you can choose to not answer, or do both if possible, but they're kind of either-or questions. It's called our rapid fire section. And so we put together a bunch of, well, I guess all these questions are somewhat of a surprise, but, you know, random surprises. All innocuous, of course. Nothing political. It's really just personal preference questions. And I think Chris is usually our rapid fire question leader here, so he'll take it away. [00:53:11] Speaker C: Love it. [00:53:12] Speaker B: Ready to go now. [00:53:14] Speaker C: Just to lead into this a little bit, though. As you can tell by the way I do things, as much as you want a rapid fire answer from me, you usually get a story behind it. [00:53:26] Speaker A: Yeah, we want people to be them. So you can do you. That's fine. [00:53:30] Speaker C: Tremendous. Shoot. [00:53:32] Speaker B: All right, coffee or tea? [00:53:35] Speaker C: Now, I was a nurse before, so before I was 30 years old, I never had one coffee. Once I hit 30, I tried my first latte and I'm like, this is outstanding. So now I'm a coffee guy. [00:53:50] Speaker B: All right, Mac or PC? [00:53:53] Speaker C: I tell anybody, I'll answer any question you have ever had in the world, as long as you use a Mac. If you use a PC, don't call me. [00:54:00] Speaker B: All right, I guess on that note, then, iPhone or Android? [00:54:04] Speaker C: You know the answer to that one, folks. [00:54:07] Speaker B: What about from a working standpoint? Do you like a standing or sitting desk? [00:54:13] Speaker C: Depends on the work, I think. But I'm getting to this age now where motion is lotion. You've got to keep moving. So I actually will move from sitting to standing and change my space, because, yeah, it's just important. But I think I'm moving into standing a bit more. [00:54:30] Speaker B: Yeah, I like that. Motion is lotion.
[00:54:33] Speaker A: My wife also says that. I'll interject briefly because, yeah, years ago, I was complaining. I was so sore. And I was like, I wonder what the problem is. And she's like, I think it's called sitting-around disease. And I'll never forget it. It comes up every week to this day. And she's like, motion is lotion. And she just disappeared. [00:54:54] Speaker C: Yeah, she does that like a flyby. She's very, very intelligent. [00:55:00] Speaker A: She's very good at the flybys. [00:55:03] Speaker B: All right, this is a Hollywood thing. Star Wars or Star Trek? [00:55:07] Speaker C: So I'm a captain on the fire department now. I could have moved up anytime I wanted to become a chief. Here's the thing. You've got Captain Maverick, Top Gun. You've got Captain Kirk. You've got Captain Picard. I think there's probably your answer right there. Everybody respects the captain, so I'm a Star Trek guy. [00:55:24] Speaker B: All right, awesome. Do you like ebook or paper? [00:55:29] Speaker C: I still like holding a beautiful magazine or newspaper in my hand. [00:55:34] Speaker B: All right, what about in terms of web browser? What do you like to use the most? [00:55:40] Speaker C: Being a Mac guy, I'm pretty straightforward with still running Safari. Yeah. [00:55:45] Speaker B: And video conferencing? [00:55:48] Speaker C: I go back to my earlier comment. There used to be only three ways you met me. But right now, because of the clients that I work with, and depending on where they are internationally, it's dealer's choice, whatever they've got. I hate them all. They're all a struggle. There's always a problem. [00:56:06] Speaker B: Yeah, yeah. No, for sure. Do you have a favorite car? [00:56:12] Speaker C: A 1977 blue Honda Civic. If I could get my hands on one of those again. Yeah.
You know, you go back to that time where you had that first bit of independence. I drive nice cars now. But you go back to where you held on to that emotion, where you had that independence. You could go as far as the city limits, until your parents found out, in which case you had to come back. But apart from that, yeah, it got me there. Here's how I explain cars: they usually get me from some place I didn't want to be to some place I don't want to go. They're just a tool. [00:56:49] Speaker B: Do you like cable or streaming? [00:56:52] Speaker C: I'm on TV. I'm on cable TV. Red River Gold. [00:56:59] Speaker B: It doesn't stream anywhere? [00:57:01] Speaker C: It does, actually. It does. Apple TV Plus, I guess. Oh, and APTN Lumi. Oh, that's a good point. I guess I like both. [00:57:11] Speaker B: Who inspires you? [00:57:15] Speaker C: So I did talk about Adam, but who inspires me? Boy, that's wonderful. You know, the easy answer is always my dad, my mom and such. But there's a friend of mine who's a director and a producer. His name is Saxon De Kock, and we met once when I was doing a podcast that we took from a level of, you know, sort of five, six hundred dollars a week to twenty thousand dollars. It sort of became a variety show. And I met him, he was a director there, and he told me in our conversation that he's Métis. And I said, well, I'm Métis, and I'm, you know, the host of the show, and that's why we're here. So we started to work together. And what I came to realize was we were both on a journey of sharing the Indigenous history, the knowledge, the connection. He's very successful as a producer and director with Three Story Pictures. And I asked him one time, we're both trying to figure this out, why do we care? And we care because not enough people have a voice out there. Not enough people have heard the story of all the Indigenous communities, the challenges that we're faced with.
But I saw this man who not only is raising a wonderful family and is award-winning, but will take time. If I call and reach out, he'll take that call. That means the world to me, right? Because there are times where people will say, listen, I'll put you on the list and I'll get back to you in three to six months. Not this man. And I think that's the reason I see people are successful. And it's something I've said: if you surround yourself with successful people, positive, successful people, success will find you. It's hard to go looking for it, but if you surround yourself with great people, the next thing you know, two great guys reach out to you on a podcast saying, do you want to join? Absolutely. This is the way I've lived my life. So I hope Saxon doesn't hear this, because he'll make fun of me, but big shout-out to him as a guy who still inspires me. [00:59:13] Speaker B: Thank you for that. And maybe the last couple of rapid fires: since you have a music background, what's your favorite band? [00:59:24] Speaker C: Having had a chance to travel along with Rush — I didn't grow up loving that band. And they're at the forefront now of coming right back with the new German drummer. Being from Germany, there's a bit of an affection to that as well. When I first went out with them, I was like, ah, I don't know what this is going to be like. Then I came to look at my biggest favorite rock stars in the world, the Bruce Springsteens. I'm like, look how many albums they had. And I didn't love every song that he wrote either. But once I got a chance to live backstage and at the front of house, to see what the passion of driving musicians is — they do it for the same reason you guys run this podcast. You're passionate about it, and you want to give it to as big an audience as you can. So I've got to give it.
It seems like an easy answer, but when you're a nerd too, it's Rush. [01:00:18] Speaker B: And your favorite drummer? [01:00:22] Speaker C: Well, as much as people would probably think that it's Neil Peart, and I have some pretty funny stories about that, it was a guy like Charlie Watts, whose job was with the Rolling Stones: I'm not really a rock drummer, but I can keep the beat — and he just did the job patiently. I'd also be remiss if I didn't shout out my own drummer, Corey Baskin, who's with Memphis and The Grande. My bad, because that would almost be kind of rude. But yeah, look us up, folks. We've got albums out. Memphis and The Grande. [01:00:52] Speaker B: All right, well, thanks again for participating in the rapid fire. [01:00:57] Speaker C: I do have a question, I guess, for you guys. [01:00:59] Speaker B: Yeah, for sure. [01:01:00] Speaker C: When it comes to indentation, tab or space bar? I do tab. But there's always a risk, because in some cases tab jumps four spaces, in some cases it jumps five, if you're coding or if you're writing. How did you come up with that as your indentation? [01:01:24] Speaker A: Probably more of a result of how I was trained to type in school. Now it's impossible to fix it. I don't have anything against space. For the same reason, I always do space with my left hand. And there's no way to change it. I've tried. I've spent dozens of hours forcing myself. I've taped this thumb down so I alternate. I can't do it. I'm stuck with it until I get arthritis there. [01:01:48] Speaker C: I don't think I've ever touched the space bar with my left thumb in my life. That's quite interesting. Yeah. [01:01:54] Speaker B: You know, I was just looking down at my keyboard, because my spacebar is split, so I use either one. [01:02:01] Speaker A: That's incredible. [01:02:04] Speaker C: Can I finish on something?
[01:02:05] Speaker B: Yeah, sure. Sure. Yeah. [01:02:06] Speaker C: In 1962 — because, you know, I own a satellite company and I've used a lot of NASA information — in 1962, JFK was walking down the halls of NASA. People ask me about business, people ask me about technology, people ask me about human connection. Here's the President of the United States walking along. He's got his big entourage behind him. And in the middle of the hallway, there's a man who's cleaning the floor. He's one of the janitors. And the president walks up to him and he looks at him and he says, what do you do here? And that man looked at the president and said, I'm here to get a man on the moon. And the thing I explain about that is that in the organization, everybody knew what the mission was, what the goal was, what the celebration was. But here's the other side of that. Twelve men walked on the moon. Most of us don't even remember their names. And so for the listeners out there, it's okay to make mistakes. These guys came back to ticker tape parades, millions of people lining the streets to celebrate them. We don't even remember their names. Most of us may remember one or two, but for the most part, we don't know. So get involved. Don't be afraid to make a mistake. But you've got to figure out, do I know what the goal and the mission is? And if you understand that, you're going to realize humanity comes first. So always be learning. [01:03:34] Speaker A: Well, that's a terrific ending point and a great piece of advice for the people who listen to this podcast. So thank you very much, Ross, for your time today. [01:03:42] Speaker C: Thanks, folks. You can find me at www.rosspambrun.ca, or just type my name into the Google, you'll find me. Thanks, you guys. Take the rest of the day off on me. You bet. [01:03:53] Speaker B: Yeah, no, awesome. Thank you very much.
