In the latest episode of "Intel on AI," host Heather McGuigan sits down with Luke Norris, the pioneering co-founder of Kamiwaza AI, for a far-reaching conversation that maps the rapidly evolving landscape of enterprise artificial intelligence. The discussion, which kicks off Intel's 2025 podcast season, offers listeners a rare glimpse into how major corporations are reimagining their operations in an age where AI capabilities advance seemingly overnight.
Norris, who often guides Fortune 500 companies through global AI/ML deployments, challenges conventional wisdom about how enterprises should approach AI integration, often returning to a simple but revolutionary concept: bring AI to the data, not the other way around.
"The current system is stuck in proof of concepts," Norris says. He paints a picture of a technology ecosystem evolving at breakneck pace, describing the innovation cycle in open-source language models as "Christmas every week," with breakthroughs emerging from research hubs around the globe.
Norris suggests that the conventional chat interface—the way most people currently interact with AI—is "the worst way to interface with AI." Instead, he sees a legion of autonomous agents working tirelessly in the background, monitoring systems, analyzing contracts, and optimizing processes without requiring direct human prompting.
"In large enterprises, they have hundreds and thousands of contracts with software vendors and services vendors," he explains. An AI agent could rationalize these agreements across business lines, analyze actual software usage, and even generate optimized purchasing plans—tasks that would overwhelm human teams due to their scale and complexity.
When McGuigan asks about the fifth industrial revolution—a term increasingly used to describe the current wave of AI-driven transformation—Norris identifies multimodality as the defining breakthrough. The moment when AI models gained the ability to process not just text but also images, video, and computer interfaces marked a fundamental shift, enabling enterprises to implement process automation at unprecedented scale.
"That allows enterprises easily now to adopt 25 to 30% process automation across the entire organization, and they could take that step into the fifth industrial revolution," he explains. Unlike previous technological revolutions that unfolded over decades or centuries, Norris predicts this transformation will occur within just 5-10 years.
For AI, the traditional cloud approach, with its emphasis on usage-based pricing, makes little sense for always-running autonomous agents. Instead, Norris advocates for on-premises deployments or dedicated cloud resources positioned close to data sources.
"Three Gaudi servers running a medium-sized model can generate about 30,000 tokens a second. A human can only digest or read at its fastest about 20 tokens a second. So you're talking those three servers could represent 1500 plus PhDs now incorporated into an enterprise."
The conversation concludes with McGuigan posing an audacious question: "If this is the fifth industrial revolution, what's going to be the sixth?" After a thoughtful pause, Norris suggests the next revolution will involve "new states of matter, new states of energy, new states of actual work"—and ultimately, a redefinition of "what it means to be human and what it means to actually get enjoyment not just out of your work, but out of your purpose."
Watch the Full Podcast
Read the Transcript
[00:00:00] Heather McGuigan: Hi, and welcome to Intel on AI, the podcast that connects listeners with leaders. This is our first episode of 2025. My name is Heather and I'm going to be your host. Our first guest today is Luke Norris. Luke Norris is the co-founder of Kamiwaza AI, driving enterprise AI innovation with a focus on secure, scalable gen AI deployments.
[00:00:47] With extensive experience raising over a hundred million in venture capital and leading global AI/ML deployments for Fortune 500 companies, Luke is passionate about enabling enterprises to unlock the full [00:01:00] potential of AI with unmatched flexibility and efficiency. Luke, welcome to Intel on AI.
[00:01:07] Luke Norris: Thank you and thanks for that introduction.
[00:01:09] It's quite a lot.
[00:01:12] Heather McGuigan: It's all you. It's very impressive. Appreciate that. It's amazing to hear it all together, isn't it?
[00:01:18] Luke Norris: It's humbling.
[00:01:19] Heather McGuigan: We have about a 30 minute chat today and we have a ton to get through, so I'm gonna launch right in if you're ready.
[00:01:26] Luke Norris: Yeah, let's do it.
[00:01:27] Heather McGuigan: Great. So I wanna talk first about data architecture.
[00:01:32] Now you've been working to rethink enterprise AI data architecture. How does the current system work and how would you like to improve it?
[00:01:42] Luke Norris: I think the current system is stuck in proof of concepts, proof of value, and it really is where they had to move data to the AI engine. The actual processing of that data and the AI itself have had to come together.
[00:01:56] What we're seeing in the Global 2000, Fortune 500s is the agentic approach. An agent that actually executes autonomously on a goal or a task needs all the data, and it needs access to data at multiple locations, multiple SaaS services, and that can also span multiple geolocations. So what I'm hoping the world starts to adopt is they bring AI to the data.
[00:02:20] That's a paradigm change. That allows the ability for an agentic service to access that data wherever it lives, process it anywhere, and actually get a full outcome. And that's where the enterprises are gonna find that next step in value from AI.
[00:02:34] Heather McGuigan: It becomes a concept of efficiency, which has been a very popular word lately.
[00:02:43] Of course, all businesses are looking to improve. So talking a little more about efficiency, and I wanna move this over to the topic of multi LLM strategies, what are the risks of not embracing these new [00:03:00] models?
[00:03:00] Luke Norris: I think there's a lot in that question that you just posed. First off, the innovation on the open source side, when it comes to LLMs and the new model innovations, is incredibly rapid.
[00:03:10] Here at Kamiwaza, we actually call it Christmas every week, 'cause it literally seems like there's a new breakthrough, a new paradigm change in the open source models that comes out literally every week. Secondarily, the innovation cycle is all over the globe, so the ability to follow the new technology and the new services wherever it comes from is another major sort of step forward.
[00:03:32] Third, I think it's incumbent on the enterprise to understand that something that might be very complicated today is gonna be easily processed tomorrow, because of this innovation, because of the new refresh cycle. And last, in what you said, I think the ability to mix models is where the sort of future resides.
[00:03:51] The ability to run a large world model with and against a bunch of small SLMs is where you're gonna start to get, once again, [00:04:00] services that can mix and match across the entire enterprise.
[00:04:03] Heather McGuigan: And so it becomes an issue of scale as we move forwards with these processes using smaller businesses, larger businesses, and how they all come together.
[00:04:12] Luke Norris: Yeah, absolutely. But I also think replace business with business unit, you're gonna have small business units within larger companies that are gonna need their point solution. But it should all tie back to the larger governance, the larger processes, larger oversight of the enterprise, and that's where the world models can oversee the smaller models.
[00:04:29] Heather McGuigan: Right now, I'm feeling like one of my really direct interactions with AI at this point is with agents in customer service situations. You're typing on the computer, and I know that someone is trying to market and funnel my way to help me out. So when we talk about the rise of autonomous AI agents in enterprise, what are some more of the behind-the-scenes uses for these agents, and [00:05:00] how can they benefit businesses?
[00:05:02] Luke Norris: And I think this is where, as we were just talking about, all these new models and features are coming out. The chat interface is like the worst way to interface with AI. What I mean by that is AI is just so powerful. Yeah, so powerful. But we're seeing the ability to establish a 24/7 background process that is always running, is always looking for the new data to be uploaded.
[00:05:26] It's always looking for the files to be changed, and waiting for some pipeline to kick it off and start its process. And when you do that, you can start having amazing outcomes. And these outcomes are really augmentation of current knowledge worker processes. Of everything you could turn on, my favorite one here is the procurement agents.
[00:05:46] It's just my absolute favorite. In large enterprises, in medium companies, they have hundreds and thousands of contracts with software vendors and services vendors. Those contracts have add-on orders, they have entitlements, they have very nuanced details. When you're in a [00:06:00] large company, each line of business might have their own contracts.
[00:06:03] They might have their own entitlements, they might have their own add-on orders with all of it. You can put all that together in an AI framework. You can have an AI procurement agent that now can rationalize all contracts, all services, all entitlements. It can then upload that to software services that tell you what software's actually being utilized.
[00:06:20] It can then lay out a two or three month purchasing plan that tells you: renew this exact license structure at this exact price; have all the EULAs, the enterprise license agreements, change their language to match this. It gives you a rationalization, a cost of service, everything. And that's just the tip of the iceberg of starting a procurement agent.
[00:06:38] And it goes on and on from this.
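The rationalization loop Norris describes, pooling contracts from every business line and reconciling them against measured usage, can be sketched as a small data-joining exercise. This is a minimal illustration only; every vendor name, seat count, and price below is invented, and a real agent would pull these records from procurement and software-asset-management systems rather than hard-coded lists:

```python
from dataclasses import dataclass

@dataclass
class Contract:
    vendor: str
    business_unit: str
    seats_licensed: int
    price_per_seat: float

# Hypothetical contracts scattered across business lines.
contracts = [
    Contract("AcmeSoft", "Finance", 500, 40.0),
    Contract("AcmeSoft", "Marketing", 300, 55.0),
    Contract("DataCo", "Finance", 200, 90.0),
]

# Hypothetical measured usage, e.g. from a software-asset-management tool.
seats_in_use = {"AcmeSoft": 520, "DataCo": 120}

def rationalize(contracts, usage):
    """Pool per-vendor entitlements across business units, then
    size the renewal to actual usage at the best negotiated rate."""
    plan = {}
    for c in contracts:
        entry = plan.setdefault(c.vendor, {"licensed": 0, "best_price": c.price_per_seat})
        entry["licensed"] += c.seats_licensed
        entry["best_price"] = min(entry["best_price"], c.price_per_seat)
    for vendor, entry in plan.items():
        entry["in_use"] = usage.get(vendor, 0)
        entry["renewal_cost"] = entry["in_use"] * entry["best_price"]
    return plan

plan = rationalize(contracts, seats_in_use)
# AcmeSoft: 800 seats licensed across units, 520 in use ->
# renew 520 seats at the best rate ($40) instead of renewing everything.
```

The point of the sketch is the shape of the work, not the logic itself: the scale Norris cites (thousands of contracts, nuanced entitlements, per-unit terms) is what makes this infeasible for human teams and tractable for an agent.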
[00:06:39] Heather McGuigan: One really cool part of my job is that I get to, I wouldn't say stalk, research is a nicer word, the people that I get to interview, and I got to spend quite a bit of time on your LinkedIn profile. I did scope your Instagram a little bit as well.
[00:06:54] That's got some more personal things on it. But listeners, if you are not [00:07:00] following Luke Norris's LinkedIn profile, you should. It is filled with an enormous amount of insightful articles and videos that are all about the depth of your passion for this industry and how you're really looking to think and grow differently.
[00:07:18] One of the articles that I found the most interesting was your discussion around the fifth industrial revolution. Now, do you think there was a defining moment or breakthrough that marked the shift?
[00:07:35] Luke Norris: For me, it's when models went multimodal. So when you could not just process language, but you could process video.
[00:07:43] The full spectrum sort of came into play, and most recently computer use. So the ability to have these models access the tools and functions of a computer just like a human really opened up, I'd say, the full breadth of moving to full agent workloads. [00:08:00] That allows enterprises now to easily adopt 25 to 30% process automation across the entire organization, and they could take that step into the fifth industrial revolution, in my mind.
[00:08:09] Heather McGuigan: And what do you think AI's duty, it's a big word. It's duty, it's job. What do you think AI's duty is in transforming human productivity, but more importantly, enterprise value?
[00:08:23] Luke Norris: Duty is a big word. I won't step all the way into that, but I'll say this: when we're working with the enterprise, we are talking about what Kamiwaza means.
[00:08:33] Superhuman capability. Not just replacing a workflow, but what is that dream workflow, what is that dream output? If you had unlimited resources, unlimited knowledge to throw at something, what would be that next level that you would move it to if you had this capability? And that's where it starts to get really fun.
[00:08:51] Like I said, back with the procurement agent use case: it wasn't just consolidating everything, it was going out and checking all the actual use. It was then reporting that [00:09:00] use and actually putting that into the algorithm of what you were gonna renew as the next step. That 20% difference makes a hundred percent value difference to the organization, and those are typically use cases
[00:09:13] that don't get funded. Those are things that are so beyond the effort it would take from a manpower aspect in today's work, and now the agents and that AI can take it to that whole new level. And that's what we're really excited about delivering.
[00:09:25] Heather McGuigan: And when we talk about the impact of human productivity, every revolution is usually an advancement in technology.
[00:09:34] And those are always built by pioneers. And I consider you one of those pioneers that is really looking to continue to advance this technology. But pioneers always get asked the same question, and now I get to ask that question to you. How do you tackle the thoughts around AI replacing existing [00:10:00] jobs?
[00:10:01] Luke Norris: You're right, I get asked every time, and I don't think I have the best answer, but I have an answer. So I see this as really a K-shaped approach to the market. And what I mean by that is you're gonna have the 50, 60% of data input and data processing that is gonna be replaced with AI. Then you're gonna have that 50, 60%, the top part of the K, that is the knowledge worker, and they're gonna get enhanced by that 20, 40 or 50%.
[00:10:28] And I do think those things are gonna have a very large impact in the long run in a positive way, because now you're gonna be able to have so much more productivity, so much more insight, so much more value derived in the enterprise. That's a much better margin profile. I think it just opens up so many more resources to be put back into R&D for new products, new features, new functionality to come to market.
[00:10:51] But there is gonna be that short-term, very painful bottom part of that K that I talked about, of simple data processing, data entry and data manipulation [00:11:00] that AI is gonna do at a superhuman level today, out of the gate. And it's gonna make a big impact on that. So in the short term, I think it's gonna be rough; in the long term,
[00:11:07] and long term in AI is not that long, we're talking five, seven years, I think the benefits are just gonna be so exceeding that it's well worth it.
[00:11:15] Heather McGuigan: And when we look at some of those concepts of AI growth, we've been talking a lot about individual enterprise or individual businesses.
[00:11:23] When we scale that out into global governments, how can nations start to embrace this type of efficiency and growth?
[00:11:33] Luke Norris: Wow, that's a big one. You know, I don't know if I have the macro answer, but I do have a tactical answer, which I think is fun. We've actually done a project with the Department of Homeland Security's CISA; it's the Cybersecurity and Infrastructure Security Agency of DHS.
[00:11:47] And we did this amazing sort of work in proving out the effort and then eventually presenting it. And soon there'll be an online tool that everyone will be able to access. It's amazing. It [00:12:00] took 90 years of data around atmospheric pressure, low barometric pressure, from hundreds and hundreds of sensors in America and some sensors worldwide at Air Force bases.
[00:12:11] And just think about that: 90 years of weather sensors. So the actual data set was in many archaic file formats that don't exist now. They literally just don't exist.
[00:12:21] Heather McGuigan: Yeah, like a floppy disc
[00:12:22] Luke Norris: practically. Yeah. Stuff was written by hand and put in a BAM or BMC file. I don't even know this stuff.
[00:12:27] It's crazy, and it's spread everywhere. Multiple colleges have it, multiple data sets just reside in multiple locations in these different file formats. And just think of the breadth of the files. So it's a heartbeat that's just been collecting this data for 90 years from all these sensors. The human effort it would take to actually collect all that data, get it in a standard file format, clean all that data, and process all that data was staggering.
[00:12:53] It was so big, humans couldn't take on the effort. You'd have to have a team of hundreds, if not thousands, of people working in concert, [00:13:00] coordinated, to even get it. With Intel's help, actually turning on a very large cluster of Gaudi 2s and Gaudi 3s, we were able to turn on multiple agent-based services.
[00:13:11] Our partner, known as the Government Acquisition Corp, was helping us, with their data scientists telling the agents what to do. The agent went out over the internet, grabbed all the data, all the data in all the different formats, unpacked it, wrote the applications. It literally was writing its apps in real time
[00:13:25] to open those files in the formats, unpack them, put 'em all in a standard file format, and go through and clean up all the sensor data. We're talking multiple trillions of points of data, 1.3 billion rows in the parquet files, and putting that all together. And then writing the apps to actually build all of the graphs, and then coming back and telling the data scientists what anomalies it was seeing so it could then get permission to remove them,
[00:13:48] and then what insights it was seeing, and putting that in graph form. We're talking 30, 40, 50 man-years of work that was accomplished in a weekend. And now the government has an incredible tool that can actually say: the [00:14:00] next time a barometric pressure event like this is gonna hit this exact zip code, here's all the cost of insurance.
[00:14:06] Last time it did it, here's the cost of life, or what other problems there were. It can now have a direct correlation to a future low barometric pressure event, saying if another one hits this zip code, here's what we can expect. That is a massive deal, and we're talking something as simple as just saying, hey, increasing gutters from two inches to four inches will save us $2 billion
[00:14:26] the next time there's a barometric pressure event like this. It's just the forward forecasting, the beauty of bringing AI and technology to the government. And that's one little use case. Imagine that at scale.
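The heavy lifting in the project Norris describes, many archaic formats normalized into one standard schema before cleaning and loading, boils down to per-format parsers feeding a common record type. A minimal sketch follows; the file formats, station codes, and pressure readings here are invented stand-ins for the real 90-year archive, which involved far more exotic formats and parquet output at billion-row scale:

```python
import csv
import io
import json

# Invented examples of two "legacy" encodings of the same kind of
# measurement: a bare CSV and a terse JSON with different field names.
legacy_csv = "1955-03-02,KEDW,1012.4\n1955-03-03,KEDW,1009.1\n"
legacy_json = '[{"d": "1987-11-09", "site": "KOFF", "mb": 998.7}]'

def parse_csv(blob):
    """Map date,station,pressure rows onto the standard schema."""
    for date, station, pressure in csv.reader(io.StringIO(blob)):
        yield {"date": date, "station": station, "pressure_mb": float(pressure)}

def parse_json(blob):
    """Map the JSON variant's field names onto the same schema."""
    for rec in json.loads(blob):
        yield {"date": rec["d"], "station": rec["site"], "pressure_mb": rec["mb"]}

# One standard record shape, regardless of source format.
records = list(parse_csv(legacy_csv)) + list(parse_json(legacy_json))

# Trivial sanity filter before loading into columnar storage
# (sea-level barometric pressure plausibly falls in this range).
clean = [r for r in records if 800.0 <= r["pressure_mb"] <= 1100.0]
```

What made the real effort agent-shaped rather than script-shaped is that the agents reportedly wrote these per-format parsers themselves, on the fly, for formats no maintained library still reads.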
[00:14:37] Heather McGuigan: Yeah, and as we start to expand our space exploration, which I think the world is interested in continuing to do, if you can study those patterns and you can study every single thing in movement, and you can be studying data from around the world in real time,
[00:14:55] how much more effective our tests can be. Maybe we [00:15:00] don't need to have as many. Maybe there's a more direct way to have fewer mistakes, therefore saving billions and billions of dollars in space exploration.
[00:15:09] Luke Norris: Totally. And not just space exploration; I'm just thinking of the
[00:15:13] proactive ability for governments now to adjust, to change, the proactive ability for governments to react in real time, like you said, and the ability to say, hey, once again, the gutter use case: if we just make this small change, it's gonna save this much money over time. It's just having those cheaper investments up front, because it's obviously always more expensive to react after the fact.
[00:15:33] Heather McGuigan: Which also directly affects the individual, which is nice. Rather than
[00:15:39] "go do this, but we can't tell you why," or "we're gonna jump over and now do this," it's saying: we have an enormous amount of data that will tell us that going from two inches to four inches on your gutter is gonna save you from replacing your roof. And you go, great. Thanks.
[00:15:54] Luke Norris: Yeah,
[00:15:55] Heather McGuigan: that's amazing.
[00:15:55] Absolutely. Okay, I'm gonna move on. I wanna move over [00:16:00] to the inference and leverage index. So how does hardware-agnostic AI inference affect enterprise-scale intelligence?
[00:16:12] Luke Norris: There's a lot to think about. Right now the zeitgeist out there is thinking about AI's training. The massive computers and data centers that are required to train these models from the very large, massive data sets that are out there.
[00:16:28] The difference is, that's almost scared the enterprise away from adopting AI and operationalizing it, and that's what inferencing is. And the amazing part about inferencing is you can start incredibly small and scale it up, where with training, you have to start incredibly big and scale even bigger.
[00:16:46] So it's a different mindset. And what's now happening is there are multiple vendors, obviously Intel taking the lead on this, that are now bringing the capability to start with two servers, three servers, or servers [00:17:00] with decent power envelopes, and actually start building agentic services in the enterprise.
[00:17:05] I love this, just one little thought. Three Gaudi servers running a medium-sized model can generate, let's just say, 30,000 tokens a second. A user, a human, can only digest or read at its fastest, or type at its fastest, about 20 tokens a second. So you're talking those three servers could represent 1,500-plus PhDs now incorporated into an enterprise.
[00:17:30] Like, what could you do with that much processing power, with that much new information, with that many new workloads? Frankly, 1,500 new PhDs now on your staff, 24 hours a day, seven days a week. So it's amazing to think about. And the silicon neutrality approach is just saying:
[00:17:46] everyone's focused on training. Everyone's typically then focused on, possibly, Nvidia. They've really done a great job winning the training segment early, and there's a lot of incumbents coming at it. But the ability to right-size the GPU and CPU at the right location [00:18:00] to do the right level of inferencing off the right level of data, once again bringing AI to the data,
[00:18:05] is where the enterprise is now gonna start to unlock that data, get those workloads moving, get those PhDs processing for them, and really start to make the workflow changes that they're looking for.
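The arithmetic behind the "1,500 PhDs" framing is simple throughput division, using the figures as stated in the conversation; the PhD comparison is an analogy for parallel reading capacity, not equivalent expertise:

```python
# Throughput figures as stated in the conversation.
cluster_tokens_per_sec = 30_000  # ~three Gaudi servers, medium-sized model
human_tokens_per_sec = 20        # fastest human reading/typing rate

# How many full-time human readers the cluster's output could saturate.
equivalent_readers = cluster_tokens_per_sec // human_tokens_per_sec
print(equivalent_readers)  # 1500, the "1,500-plus PhDs" figure
```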
[00:18:14] Heather McGuigan: And as we move into the topic of monetization, the SaaS model has been industry standard. Does it still fit for these AI native businesses?
[00:18:27] Luke Norris: I think that's a hard no. Like, a very hard no. The SaaS model typically is measuring some sort of usage, and in that SaaS model, right now the sort of thematic unit is a million tokens: how much does it cost for you to process either a million input tokens or a million output tokens. But when you're establishing an agent to run 24/7 in that background process, always scanning, always being vigilant, always being ready,
[00:18:53] it's always processing those tokens. You don't wanna be paying for something when it's quote-unquote idle, but it's not [00:19:00] actually idle. It's waiting for that next process kickoff. It's waiting for the next file to go. It's waiting for the handoff from another agent to come into play. That's where I do believe the business model,
[00:19:09] the model, needs to move on-prem and needs to move to dedicated resources, whether those are dedicated resources in the cloud, dedicated resources on-prem, or dedicated resources at the edge. Further, the SaaS models typically are these shared GPU services or shared CPU services, and that's typically far away from the data.
[00:19:27] So if you just follow the themes of this podcast so far: we wanna move AI next to the data. We wanna have AI always running next to the data and always processing next to the data.
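The economics Norris is pointing at can be made concrete with a back-of-the-envelope comparison of per-token SaaS pricing against a flat-rate dedicated box for an always-on agent. Every number below is invented for illustration; real token prices, throughputs, and amortized server costs vary widely:

```python
SECONDS_PER_DAY = 86_400

# All figures below are assumptions, not vendor quotes.
agent_tokens_per_sec = 2_000        # sustained background throughput (assumed)
saas_price_per_million = 2.00       # $ per 1M tokens on per-token pricing (assumed)
dedicated_server_per_day = 150.00   # $/day amortized hardware + power (assumed)

tokens_per_day = agent_tokens_per_sec * SECONDS_PER_DAY          # 172.8M tokens
saas_cost_per_day = tokens_per_day / 1_000_000 * saas_price_per_million

# 2,000 tok/s around the clock -> $345.60/day on per-token pricing,
# versus a flat $150/day for a dedicated box that is never "idle-billed".
print(saas_cost_per_day, dedicated_server_per_day)
```

The crossover point depends entirely on utilization: an agent that bursts occasionally favors per-token pricing, while the always-scanning background agents Norris describes push utilization high enough that dedicated resources win.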
[00:19:36] Heather McGuigan: And in regards to inference, how does inference affect pricing for enterprises?
[00:19:43] Luke Norris: I've been talking a lot in two different
[00:19:46] modalities, and that might be the wrong word, but I think there are agents that need to be immediately responsive and there are agents that are processing. And the agent that needs to be immediately responsive has a totally different [00:20:00] inferencing cost. It needs to be loaded into the fastest GPUs that can get the fastest time to first token.
[00:20:06] Therefore, if you're interacting with it, you're not waiting for that token to come back. You're getting the immediate response you need. Then there's how do I get inferencing at the right price, at the right process for it to be in that background? And that has a different paradigm because all you're caring about there is, it's getting its job done, the work to be done, the work to be handed off to other agents, and you wanna do it at the right price in the right power envelope.
[00:20:28] And those are two sort of competing things in the inferencing. And that's where finding vendors with an incredibly large breadth, like Intel, that can mix Xeon 5 and Xeon 6 CPUs for the background process, and maybe the GPUs, like Gaudi 3, for the foreground process, is imperative.
[00:20:44] That's the partner you want to match up with.
[00:20:45] Heather McGuigan: Now, all of this growth obviously comes with power growth. Has there been an under-the-radar conversation surrounding how we power this AI technology to scale? [00:21:00]
[00:21:00] Luke Norris: Under the radar? It's countervailing, both on the training and the inferencing side. At scale, at that fifth-industrial-revolution size, where now you have 25, 30% of your enterprise being fully automated, that is gonna take a lot.
[00:21:15] A lot of big servers and big power. It's gonna be well offset by the ROI and the cost efficiency, but to do this, I think in the medium term, we're not just talking about the power grid, and the conversation outside of Silicon Valley needs to be opened up a little broader. We're actually talking about localized power capabilities.
[00:21:37] We're talking the ability to have the power generation literally on campus in these data centers. We're talking primary power that's gonna come from things like natural gas and abundant sources here in America, while we wait the four, five, ten years it's gonna take to get the power grid in America upgraded enough
[00:21:53] so that things like nuclear power and large power generation can augment that. And I think that's not something that's widely [00:22:00] spoken about, but the reality is, you look at xAI's recent data center; I think that thing is entirely powered off local generation from oil and natural gas. And that is gonna be the model in the short term.
[00:22:09] Heather McGuigan: And do you think there are current power companies that are starting to invest and build in this type of infrastructure? Or is it still theoretical at this point?
[00:22:20] Luke Norris: No, I don't think it's theoretical. Like I said, it's currently being implemented. I think the investments are definitely happening.
[00:22:25] I don't think it's the power companies per se, that are powering that local generation. These are newer companies, I'd call 'em, probably born in the last five, 10 years that have come up with very efficient, very stable local turbines, gas turbines and oil turbines to actually power those. And then we have the big power companies.
[00:22:45] Working with new technologies, like out of the Bill Gates Foundation, for small pebble-bed reactors on the nuclear side that would go not only next to the data centers, but then also power the grid for those surrounding areas. And I [00:23:00] think we're gonna have to implement both, and hopefully the timelines come together.
[00:23:03] Heather McGuigan: Amazing. We have two questions left. I wanted to mention a recent paper by Signal 65 called "The New AI Accelerator Economic Landscape." It is a fascinating 12-page study. Do you think that you could give us a brief explanation of its findings?
[00:23:26] Luke Norris: I, I can. So first off we have a customer called Signal 65.
[00:23:30] I believe Intel partnered with them to really test out the new Gaudi 3s. And what was interesting is that Signal 65 came to us; they wanted a single platform that had real business use cases and real test use cases built into it, that was silicon-agnostic, would work across the prevailing providers, and work with the new ones such as Intel and the Gaudi 3s.
[00:23:51] So we're very honored that they selected us. More importantly, their findings in working with us were fun, let's put it that way. Everybody looks at AI workloads, as we've said over and over [00:24:00] again, more from that training lens: how much sheer power, how many FLOPS, how much memory can I throw at a particular problem?
[00:24:08] And almost cost be darned; I just wanna scale this up in that training scenario. But on inferencing, there's a new paradigm that's taken effect, and it's effectively how many tokens can you process, at what total wattage, and at what total cost. And then you can build an ROI, and you can actually build a use off of it.
[00:24:27] And then you have not only what it's costing me to purchase it and run it, but also how much am I getting off the use of it, the ROIs from that, and what use cases fit those particular cards. And what their findings showed was there is a growing capability, especially from Intel, especially from the Gaudi 3, to be extremely competitive
[00:24:48] on real-world use cases and inferencing. And as the market moves from this training modality to actually getting ROI and moving to inferencing, and by the way, inferencing should be 300 times, [00:25:00] plus or minus, the size of the training market, like every enterprise, every small and medium business should be having their own inferencing capabilities, and as such,
[00:25:07] this ROI, this paradigm, this growth is gonna be imperative. And it's great to see that competitors are starting to get out there, and new services and new chips like the Gaudi 3 are making such a dent into that ROI. I recommend everybody read the paper 'cause it helps you change the lens on inference, change the lens on ROI, and actually look at the competitive landscape.
[00:25:24] And Gaudi 3 definitely stuck out.
[00:25:26] Heather McGuigan: It's always so fascinating to hear a topic that there's so much conversation around, and it's so exciting to see someone going, "Yep, but what about from this way?" That's what we have to keep doing. I have one question left. It's a question that I've been wanting to ask you since you agreed to come on the show.
[00:25:49] Are you ready? If this is the fifth industrial revolution, what's going to be the sixth?
[00:25:56] Luke Norris: Of course you said that. Wow. So the fifth industrial [00:26:00] revolution, let me just touch on that for one second. These revolutions typically take a lot of time. The first one, like the printing press, took a good hundred years almost for humanity to really get in tune; you had to break so many social constraints.
[00:26:17] Literally, religions didn't even want their books reprinted by certain people, et cetera. It was a whole paradigm shift for humanity to absorb. And each sequential one has been faster and faster. So I will tell you, I think this fifth industrial revolution is only a five-to-10-year one.
[00:26:31] And what do I mean by that? Even cloud, and this wasn't an industrial revolution, but even cloud, if you think of cloud as a general-purpose technology, it took a long time for people to adopt it and the economic model. We're already seeing the oldest, largest companies adopting generative AI solutions and overall AI.
[00:26:48] It is happening at breakneck pace. Then you layer on robots: later this year, beginning of next year, you're gonna have information completely moved to Moore's law, and you're gonna have robots moved to a whole other [00:27:00] economic law of labor. And those two things are just gonna transform humankind very quickly.
[00:27:05] So what is the sixth industrial revolution? I think you could probably muse on that on your podcast a lot more than me, and think about it a lot more. But I think it's the new revolution. And what I mean by that is new states of matter, new states of energy, new states of actual work.
[00:27:22] And then that also means a new state of what it means to be human and what it means to actually get enjoyment, not just out of your work, but out of your purpose. And I think that's what we're gonna find, and that's what that sixth industrial revolution is gonna be.
[00:27:34] Heather McGuigan: Amazing.
Thank you so much for coming on the show today. This has been an amazing chat, and it was so nice to meet you. I hope everyone has a really great day. This was just the first episode of the 2025 season of Intel on AI. My name is Heather. Thanks for joining us. Bye, Luke.
[00:27:55] Luke Norris: Bye. Thanks
[00:27:55] Heather McGuigan: everyone. [00:28:00]
[00:28:05] Visit intel.com/ai to learn more.