
This Week in Enterprise Tech Episode 562 Transcript

Please be advised this transcript is AI-generated and may not be word for word. Time codes refer to the approximate times in the ad-supported version of the show.


Lou Maresca (00:00:00):
On this week in Enterprise Tech, Brian Chee, Curtis Franklin, and I talk Cisco's acquisition of Splunk. What does that mean for you and Splunk's ecosystem? We'll see if it survives. Plus we'll talk about Intel's groundbreaking announcements from their annual Innovation event that just might be the dawn of a new AI era. Today we have Jason Kimrey, sales and marketing VP from Intel, to talk about neural processors and so much more. You definitely shouldn't miss it. TWiET on the set!

TWIT Intro (00:00:26):
Podcasts you love from people you trust. [00:00:30] This is TWiT.

Lou Maresca (00:00:39):
This is TWiET, This Week in Enterprise Tech, episode 562, recorded September 22nd, 2023: Who'da Thunk Splunk? This episode of This Week in Enterprise Tech is brought to you by Kolide. Kolide is a device trust solution for companies with Okta, and they ensure that if a device isn't trusted and secure, it [00:01:00] can't log into your cloud apps. Visit kolide.com/twiet to book an on-demand demo today. And by Miro. Miro is your team's online workspace to connect, collaborate and create together. Tap into a way to map processes, systems, and plans with the whole team. Get your first three boards for free to start creating your best work yet at miro.com/podcast. And by Lookout. Whether on a device or in the cloud, your business [00:01:30] data is always on the move. Minimize risk, increase visibility, and ensure compliance with Lookout's unified platform. Visit lookout.com today. Welcome to TWiET, This Week in Enterprise Tech, the show that is dedicated to you, the enterprise professional, the IT pro, and that geek who just wants to know how this world's connected. I'm your host, Louis Maresca, your guide to the big world of the enterprise, but I can't guide you by myself. I'm going to bring in the professionals and the experts, starting with our very own Mr. Brian Chee, [00:02:00] network security expert and all-around tech geek as well. Welcome back to the show. Brian, what's been keeping you busy this week?

Brian Chee (00:02:06):
Actually, I've been working with an intern at the Central Florida Fair and trying to teach him about the Internet of Things. He seems to have a background in Node.js, so I'm going to start him off with Node-RED.

Lou Maresca (00:02:21):
Node-RED. That's right.

Brian Chee (00:02:22):
Yes, and we're going to go and get him to go and build some sensors for the fairground so we can tell what temperature [00:02:30] the air is coming out of the air conditioning system and whether the lights are on. Hopefully that'll be a nice first project for him and he's going to build us a dashboard.
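For listeners who want to tinker along with that kind of project, here is a minimal sketch, assuming a Node-RED instance with an "http in" node listening for sensor posts; the URL, port, and field names below are illustrative assumptions, not anything from the show. A small Python script on the sensor side can push the two readings Brian mentions, vent temperature and lights on or off, as JSON, and a dashboard flow can chart whatever arrives.

```python
# Push fairground sensor readings to a Node-RED "http in" node as JSON.
# The endpoint URL and field names are assumptions for illustration only.
import time
import random

import requests  # pip install requests

NODE_RED_URL = "http://192.168.1.50:1880/fairground/sensors"  # assumed endpoint

def read_sensors():
    # Placeholder readings; on real hardware these would come from a
    # temperature probe on the A/C vent and a light sensor or relay state.
    return {
        "vent_temp_c": round(random.uniform(12.0, 18.0), 1),
        "lights_on": random.choice([True, False]),
        "timestamp": int(time.time()),
    }

while True:
    reading = read_sensors()
    try:
        resp = requests.post(NODE_RED_URL, json=reading, timeout=5)
        resp.raise_for_status()
    except requests.RequestException as exc:
        print(f"Post failed, will retry next cycle: {exc}")
    time.sleep(60)  # one reading per minute is plenty for HVAC monitoring
```

On the Node-RED side, wiring that http in node to a dashboard gauge or chart node completes the loop; that part is done in the flow editor rather than in code.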

Lou Maresca (00:02:42):
That's awesome. You're going to get him hooked on that stuff. That stuff's so much fun.

Brian Chee (00:02:45):
I hope so. I like node red. You can do some really cool stuff with it.

Lou Maresca (00:02:50):
You really can. You really can. Well, thank you Bert for being back. Of course. We also have thank and welcome back Mr. Curtis Franklin. He's principal analyst at I'm Dia, and of course the man who has pulse of the enterprise. [00:03:00] Curtis, how's the pulse feeling this week?

Curtis Franklin (00:03:03):
Well, it's just your basic boring week in cybersecurity. Nothing big going on. No real news. Yeah, right. We've had a big weekend, and we're going to be talking about that a little bit later on. The big news kept me busy the last couple of days. I've been working on some reports. I've been working on collaboration with Omdia analysts in other areas, [00:03:30] already looking at speeches and reports that we're going to be coming out with in the first and second quarter of next year. So lots of good stuff going on, lots of fun to try and figure out, and I can't wait to dive deeply into it here on TWiET.

Lou Maresca (00:03:51):
Thank you, Curtis. Well, speaking of lots of stuff going on, Cisco's surprising acquisition of Splunk is happening now. What does that mean for you and Splunk's [00:04:00] ecosystem? We'll have to see if things survive there. Plus we deep dive into Intel's groundbreaking announcements from their annual Innovation event. This might just be the dawn of a new AI era; we'll have to see. We have Jason Kimrey, sales and marketing VP from Intel. He's going to talk about neural processors and so much more, so lots to talk about, so definitely stick around. But before we get to everything, I just want to remind you, of course, there's an AMA coming up on September 28th at 9:00 AM Pacific time. That's right, it's with moi. That's right, you can come and ask me anything, [00:04:30] jump in and go ahead and give me hell. I'll definitely do that. But first, before we get to all the other stuff and the fun stuff, we're going to talk a little bit about this week's news blips. In the fast-paced world of tech, sometimes oversight can lead to significant vulnerabilities.

(00:04:44):
This week's story from Ars Technica is a testament to just that. We're diving deep into a concerning development from two tech giants we've come to trust, Apple and Google. A couple of weeks ago, Apple disclosed an actively exploited vulnerability in iOS. Now this wasn't just a minor kink in the armor. It allowed for the installation [00:05:00] of the notorious Pegasus spyware using a zero-click exploit, raising many eyebrows. Credit where credit's due: Citizen Lab at the University of Toronto played a pivotal role in identifying the buffer overflow issue, logged as CVE-2023-41064, rooted in Apple's ImageI/O. And just when we were processing that, Google stepped into the spotlight, reporting a critical vulnerability in Chrome also tied to WebP. But instead of a collaborative approach [00:05:30] to address the core issue, Apple, Google and Citizen Lab actually presented different CVE designations.

(00:05:37):
Now, Rezilion researchers suggested that the root cause of both vulnerabilities might indeed be the same issue, traced back to the code library for WebP images, libwebp. This implies a staggering ripple effect, leaving potentially millions of applications in a vulnerable state. Unfortunately, relying solely on vulnerability scanning outputs seems to leave gaps out there. This huge blind spot actually underscores [00:06:00] the complexities we're continuously navigating, and while Google's response pivoted towards addressing its immediate community, the larger concern still remains. Now for our audience out there, it's clear that you want to be proactive and actually patch. Plus, remember, don't pay attention to just single entities or single issues out there; ensure you pay attention to the entire security ecosystem.
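For readers who want to act on that advice, here is a minimal sketch of one proactive check, assuming a Unix-like system where Python's ctypes can find a shared libwebp. It calls libwebp's WebPGetDecoderVersion() and compares the result against 1.3.2, the release generally cited in the advisories as carrying the fix; treat that threshold as something to confirm against your vendor's advisory, and remember that applications bundling their own private copy of libwebp won't be caught by a system-level check like this.

```python
# Report the system libwebp version and compare it to the commonly cited
# patched release. Confirm the patched version against your vendor advisory.
import ctypes
import ctypes.util

def libwebp_version():
    path = ctypes.util.find_library("webp")
    if path is None:
        return None
    lib = ctypes.CDLL(path)
    lib.WebPGetDecoderVersion.restype = ctypes.c_int
    v = lib.WebPGetDecoderVersion()  # packed as (major << 16) | (minor << 8) | rev
    return ((v >> 16) & 0xFF, (v >> 8) & 0xFF, v & 0xFF)

PATCHED = (1, 3, 2)  # assumed patched release; verify against the advisories

version = libwebp_version()
if version is None:
    print("No system libwebp found (apps may still bundle their own copy).")
elif version < PATCHED:
    print("libwebp %d.%d.%d is older than %d.%d.%d: patch it." % (version + PATCHED))
else:
    print("libwebp %d.%d.%d looks current; keep watching the wider ecosystem anyway." % version)
```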

Curtis Franklin (00:06:23):
Well, once upon a time, the bot swarms responsible for massive operations like DDoS attacks [00:06:30] and similar activities were clustered in North America and Asia. But according to new research by Netacea, the number of reported bot attacks originating from the Middle East and Africa has increased dramatically over the last year and now totals 34% of all attacks reported in the survey of US and UK businesses. The researchers say the aim of these bot swarms has changed as well. The aim of a modern bot attack apparently is most commonly [00:07:00] to get access to accounts for streaming services, to steal and sell gift cards at a reduced rate, or to purchase limited or high-demand goods like event tickets or sneakers. And in yet another unintended consequence of an otherwise good thing, the attacks can often be conducted using open source software like OpenBullet and require only a configuration file to determine the target, making them perfect campaign tools for novice criminals and threat actors.

(00:07:30):
[00:07:30] The researchers show that the percentage of bot attacks originating from the Middle East has gradually increased over the last three years, from 2% in 2020 to 13% in 2021 and 21% last year. In Africa, the number in 2022 was slightly down from a high of 16% in 2021, but that was up from 5% in 2020. Now there is one big caveat in all of this. It's notoriously difficult to be sure that an attack [00:08:00] is actually originating from the IP address that's given as its source. It's possible, in fact, that these bot swarms are coming from, oh, let's say Antarctica, but just being routed through the Middle East and Africa. Regardless of whether that's true, it's obvious that there are significant swaths of infrastructure in the two regions that have been compromised and are being turned to criminal or at least highly unauthorized purposes by a whole host of organizations.

Brian Chee (00:08:29):
So [00:08:30] first off, a big thank you to the trueCABLE.com folks, who also like Fluke instruments for testing cables. A continuity test just isn't enough; you need to go and actually test to see if the cable can pass the data signals correctly with minimum crosstalk. Anyway, the big headline is: what is copper-clad aluminum and why is it a problem in the industry? Well, in my opinion, this whole thing started [00:09:00] with the construction of the Three Gorges Dam in China, which sucked up the world's supply of copper for nearly a decade. At the time, my father was the Revere Copper distributor in Hawaii, and I got to see firsthand how the price of copper products rose to nearly 10 times the original price and became extremely scarce. My speculation is that some less than scrupulous manufacturers decided to cut corners and thought that since it was copper clad, substituting a copper-clad aluminum [00:09:30] conductor would be just fine.

(00:09:32):
Well, even though this didn't meet the EIA/TIA specifications for unshielded twisted pair wire for Ethernet applications, copper-clad aluminum was advertised as CAT 7, which at that point in time hadn't even officially been ratified as a standard. Coming forward in time, testing by various laboratories seems to indicate that at the higher wattages of the emerging PoE [00:10:00] standards, copper-clad aluminum, a.k.a. CCA, could potentially be a fire hazard, especially as PoE++ is now reaching the 100-watt level. So my advice to our viewers is to check the fine print when they purchase unshielded twisted pair cable. One, CAT 7 doesn't exist yet. Two, if the cable is a mega deal, there's a good chance it's CCA. [00:10:30] Three, beware that CCA, especially for PoE applications, could very well be a fire hazard in your facility.
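To see why that combination of CCA and high-wattage PoE makes people nervous, here is a hedged back-of-envelope sketch of the resistive heating involved. The numbers are illustrative assumptions: a 23 AWG conductor, a 100-meter run, roughly 0.6 amps per conductor as a stand-in for a heavily loaded PoE++ link, and CCA treated as if it were solid aluminum, which slightly overstates the effect but shows the trend.

```python
# Compare I^2 * R heat dissipated in one copper conductor versus one
# copper-clad-aluminum conductor under an assumed PoE++ load.
import math

RHO_COPPER = 1.68e-8    # ohm-meters, copper resistivity
RHO_ALUMINUM = 2.65e-8  # ohm-meters, aluminum resistivity (upper bound for CCA)

DIAMETER_M = 0.573e-3   # 23 AWG conductor diameter, about 0.573 mm
LENGTH_M = 100.0        # one conductor over a full 100 m channel
CURRENT_A = 0.6         # assumed current per conductor under heavy PoE++ load

area = math.pi * (DIAMETER_M / 2) ** 2  # conductor cross-section in m^2

for name, rho in (("copper", RHO_COPPER), ("CCA (as aluminum)", RHO_ALUMINUM)):
    resistance = rho * LENGTH_M / area      # ohms for one conductor
    heat_w = CURRENT_A ** 2 * resistance    # watts dissipated along that conductor
    print(f"{name:18s} R = {resistance:5.2f} ohm, heat = {heat_w:4.2f} W per conductor")
```

With these assumptions the CCA conductor dissipates roughly 1.6 times the heat of the copper one, and that extra wattage gets multiplied across the conductors in each cable and every cable in a bundle, which is exactly the scenario the laboratory testing flags.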

Lou Maresca (00:10:40):
A significant development from OpenAI this week: they have begun previews of their new tool. That's right, DALL-E 3, an advanced version of their image-creating technology, will soon be integrated into the widely used ChatGPT chatbot. DALL-E 3 has notable advancements over its predecessors. It's more adept at translating complex text into coherent images, addressing previous [00:11:00] limitations in AI image generators. Aditya Ramesh, who leads the DALL-E 3 team, demonstrated its capabilities by crafting logos for a fictional company called Mountain Ramen. This rollout comes at a challenging time for OpenAI. The startup faces mounting competition, and traffic to both DALL-E and ChatGPT has also declined. However, the integration of DALL-E 3 into ChatGPT may be a strategic move to revitalize user engagement rather than offering it as a separate product. [00:11:30] However, there are concerns enhanced AI-generated images might blur the line between reality and fabrication.

(00:11:35):
As a UC Berkeley professor points out, distinguishing genuine photos from AI-generated ones is becoming more challenging. On the legal front, AI companies, including OpenAI's competitors, face allegations of copyright theft due to the extensive data scraping required for AI training. Moreover, the misuse of such tools in creating deepfakes or distributing misleading content has caught the attention of law enforcement and advocacy [00:12:00] groups as well. Now, in response, OpenAI's DALL-E 3 team engaged external experts to evaluate worst-case scenarios, incorporating their findings into safety measures. They also committed to developing mechanisms to label AI-generated content, aligning with the recent White House pledge. We'll see how this actually plays out. Well, folks, that does it for the blips. Next up we have the news bites. But before we get to the news bites, we do have to thank this week's sponsor of This Week in Enterprise Tech, and [00:12:30] that is Kolide.

(00:12:31):
Kolide is a device trust solution for companies with Okta, and they ensure that if a device isn't trusted and secure, it can't log into your cloud apps. If you work in security or IT and your company has Okta, this is definitely a message for you. Have you noticed that over the past few years, the majority of data breaches and hacks you read about have something in common? Well, it's actually employees. That's right, the weakest links are usually employees. Sometimes an employee's device gets hacked because of unpatched software. Sometimes [00:13:00] an employee leaves sensitive data in an unsecured place. We've heard that a lot. And it sometimes seems like every day a hacker breaks in using credentials they phished from an employee. The problem here isn't your end users, it's the solutions that are supposed to prevent these breaches. It doesn't have to be this way.

(00:13:19):
Imagine a world where only secure devices can access your cloud apps. In this world, phished credentials are useless to hackers, and you can manage every OS, including Linux, all from a single dashboard. [00:13:30] Best of all, you can get your employees to fix their own device security issues without creating more work for your IT team. The good news is you don't have to imagine this world. You can just start using Kolide. Visit kolide.com/twiet to book an on-demand demo today and see how it works for yourself. That's K-O-L-I-D-E dot com slash TWiET, and we thank Kolide for their support of This Week in Enterprise Tech. [00:14:00] Oh folks, it's time for the news bites. Now this week we've had a pretty massive announcement about Cisco actually acquiring Splunk, but I definitely want to hear about the details here. What's going on here, Curt?

Curtis Franklin (00:14:12):
Yesterday morning, those of us in the analyst community were greeted very first thing with this announcement; it happened early. Cisco, everybody knows Cisco, has announced plans to acquire Splunk, which is [00:14:30] the prominent, the leading provider of SIEM, that's security information and event management, solutions as well as analytics software for, let's call it, everything. Cisco in this deal is valuing Splunk at $157 per share, which leads to a purchase price of right at $28 billion. [00:15:00] Now, there are certain aspects of this that are not a surprise. Cisco had talked to Splunk last year about a purchase at a much lower price, but the timing is interesting, and the fact that this is just a massive, massive purchase makes it something that we have to take account of. Now, there [00:15:30] are tons of questions. One of them is when the deal will close. According to Cisco, they think that it will take three to four quarters.

(00:15:40):
So at the latest, by this time next year the deal should be closed, and they don't expect, they say, much regulatory pushback, because there is essentially no overlap in products between the two. There's a tiny bit, but analytics software [00:16:00] is not one of Cisco's big profit lines. So why do this? Now, there are several reasons. The number one is that this gives Cisco something that they've wanted for a while, and that is an ongoing revenue stream. Splunk has customers that buy subscriptions; they pay a certain amount on a regular basis for Splunk services. Cisco tends to get [00:16:30] one-time blobs of money from their customers. There may be some ongoing support expenditures, but it's primarily big dollars that happen once every few years. The other thing is that this gives Cisco a lot of presence in the SIEM market right now. They don't have to build a big SIEM. And again, they've got some SIEM capabilities, nothing [00:17:00] like what Splunk has, but they get SIEM capability, they get analytics capability, they get all kinds of capabilities, including, we have to say, AI, which Splunk has been researching.

(00:17:15):
Splunk talked a fair amount about their AI push in July at the Splunk conference I attended, and it's important to note [00:17:30] that Splunk said, quite correctly, that they've had AI in their products for a while. What they're talking about now, of course, is generative AI, or large language model AI, which is the new form of AI that everyone's talking about. Splunk has an interesting take on this. They spent a lot of time talking about mission-specific [00:18:00] AI, in other words, AI that has expertise in a specific area rather than being sort of this global generative AI that something like ChatGPT is. Regardless of what you think about AI, this is a massive thing and it's going to have massive repercussions. I wrote a piece that went up today where I looked at some of the categories [00:18:30] of interest, whether you're a Cisco customer or a Splunk customer, the company of Splunk, the company of Cisco, and what this could mean for you. There is a lot that we don't know yet because we haven't been told details, but this is something that is, without question, going to have some major repercussions up and down the industry.

(00:18:58):
And Lou, I want to [00:19:00] ask you, we've got this stuff going on. What do you see in this? You work in a company that has a massive installed user base, in terms of, actually, a couple of them: one for its operating system products, another for its productivity products, and there are others as well. But [00:19:30] let's focus on those two. What kind of response can you imagine from customers if they heard that, all of a sudden, I don't know, Intel was going to buy Microsoft? I mean, can you imagine that there would be some anxiety among the existing customer base if they heard news like that?

Lou Maresca (00:19:59):
I would [00:20:00] say so. I would think so. I've seen lots of anxiety from even smaller acquisitions: the acquisition of Slack, the acquisition of some of these other companies that are out there. I would say lots of organizations will worry about whether the support contracts will still exist, whether the ecosystem will still persevere and be able to continually upgrade and get new features. I think another thing that's really big [00:20:30] is the fact that they have lots of integrations with things. And so if they're utilizing Splunk for integrating with things, they're hoping that Cisco doesn't sever those things because they want to kind of favor their own stuff. So there will be a lot of questions here, and it might even prevent people from signing those new deals for the following couple of years until they figure out what happens after the acquisition. So that might be a very interesting [00:21:00] thing as well, which could lead to people moving to other SIEM solutions.

(00:21:05):
It could mean that people move to more of a monthly or maybe just a single-year type deal. So it could be pretty interesting to watch the share price, the stock price of Splunk, through the coming months, because in the most recent months before the acquisition, we've seen it pretty steadily increase. So it should be interesting to see what happens after. But I would say with any type of installed [00:21:30] base, any time you're integrating with something, any time you depend on a technology that's benefiting your solutions, you have to worry when a large corporation like Cisco comes along and gobbles it up, especially if they're going to integrate it tightly into their offerings. So I'm curious to see what happens here.

Curtis Franklin (00:21:50):
No, I think you're absolutely right about that. Now Brian, I wanted to turn to you because you've been familiar with Splunk for a very long time. For years, [00:22:00] they were an important part of the Interop network. They provided log file analysis and other things for that particular installation. But you know from being in the research world that Splunk's analytics capabilities are applicable to a wide range of things that have nothing to do with a firewall log file. What kind of impact can you imagine this having [00:22:30] on those customers, people who are very far from being a Cisco customer?

Brian Chee (00:22:36):
Well, I think one of the things is, being a researcher, this acquisition worries me a little bit. So the example is, there's a lot of people in the research community that want to play with big data, and Splunk was one of the easier ways of doing this, especially because they have continued to offer a free version [00:23:00] that can ingest up to 500 meg of data per day. Now, if you're doing logs from firewalls and switches and so forth, 500 meg per day will disappear in the blink of an eye. But if you're doing things like water temperature or air temperature, or things like how much power a hobby wind turbine is producing, you could do an awful lot of really good science [00:23:30] with 500 meg per day. So I'm truly hoping that our friends at Cisco don't kill that program. The other thing I'm worried about is the Splunk community; amazing, amazing solutions have come out of that community of people.

(00:23:54):
I saw a high school that used Splunk to log [00:24:00] power from a homebrew wind system and created one heck of a nice-looking power dashboard for the school. So there's a lot of really cool things that have happened in the past because of Splunk's direct involvement. A lot of their engineers actually volunteer their time to help out in their community panels, and I've known a couple [00:24:30] of the Splunk engineers that have actually taken vacation to go and help out charities that are using Splunk for various cool projects. Heck, Curt and I actually know a gentleman that was Splunk employee number five, and he did all kinds of really cool things. Now, flipping the coin just a little bit: Cisco, as I'd describe it, and this is my personal opinion, so don't get [00:25:00] me wrong, my personal opinion is Cisco's not so much a technology company as it is a holding company.

(00:25:08):
They buy technology and integrate it into their product line, and it is not always for the best. I happen to know the gentleman that created Precept Software, who was the inventor of IPTV, a great person. Cisco bought Precept, [00:25:30] and that's kind of it. They didn't keep it as a separate product anymore; it got integrated into a lot of other things. The gentleman that built and founded Precept became part of the chief scientist office, not necessarily a bad thing. He actually did a lot of really cool things for Cisco. But acquisitions, especially an acquisition like this, where it's going [00:26:00] to be completely absorbed within Cisco, are sometimes not for the best. I'm going to hold my breath and see what actually happens. And my hope is that Splunk keeps a lot of the really cool things that they've done for the scientific community and keeps the community that has created amazing solutions.
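For anyone curious what the kind of lightweight science ingest Brian describes actually looks like, here is a minimal sketch that pushes a sensor reading into Splunk through its HTTP Event Collector (HEC), the usual way to send small event volumes like water temperature or wind-turbine output into a free-tier instance. The host, token, and index name are placeholders you would replace with your own.

```python
# Send one field-sensor reading to a Splunk HTTP Event Collector endpoint.
# Host, token, and index are placeholders, not real values.
import time

import requests  # pip install requests

HEC_URL = "https://splunk.example.edu:8088/services/collector/event"  # placeholder host
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # placeholder token

def send_reading(turbine_watts: float, water_temp_c: float) -> None:
    payload = {
        "time": time.time(),
        "sourcetype": "field_sensor",
        "index": "science",  # assumed index name
        "event": {"turbine_watts": turbine_watts, "water_temp_c": water_temp_c},
    }
    resp = requests.post(
        HEC_URL,
        json=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()

send_reading(turbine_watts=42.5, water_temp_c=18.3)
```

At one reading a minute, a payload like this is a few hundred bytes a shot, nowhere near the 500 meg per day ceiling Brian mentions.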

Curtis Franklin (00:26:26):
That would be an amazing thing. And you're right; [00:26:30] at the last Splunk conference, I mean, there were lots of people walking around wearing fezzes to identify them as core members of this user community, people who were not only incredibly knowledgeable, but incredibly generous with that knowledge for other users, especially helping individuals and companies get up to speed on Splunk. [00:27:00] That is a huge part of what has made Splunk valuable. And let's acknowledge, just like Brian said, Cisco does have a long history of acquiring companies large and small, and not the best record at keeping the product lines and cultures of those companies intact. And I think there are a lot [00:27:30] of people who use and depend on Splunk who are nervous about that, and it's going to be fascinating to see what Splunk's plans are. Now we know that Gary Steele, the CEO of Splunk, is going to become part of the management team at Cisco; that's been announced.

(00:27:49):
So that's there, but there are a whole bunch of us who hope that the things that have made [00:28:00] Splunk special will continue to do so. It's also worth pointing out, pardon me, that Cisco is very clear in their announcement today. This is an acquisition that's about security. This is not about performance. This is not about getting to a root cause analysis on problems. This is about security and that has both the capability [00:28:30] of letting us know where they're going to go and some cautionary tales for anything that comes into the Splunk universe that isn't tied directly to that security. So we're going to be talking about this a lot more over the coming year, frankly, over the coming years. But for now, it's big news and the ripples from this particular boulder thrown into the water are going to continue fanning out across the pond of cybersecurity [00:29:00] and IT for some time yet to come.

Lou Maresca (00:29:04):
Thank you, Curtis. Yeah, I'm curious to see where this heads, because this is a big deal for the entire ecosystem, so we'll have to see how it travels. Well, guys, that does it for the bites, but we have the guest up next, so definitely stick around. But before we get to the guest, we do have to thank another great sponsor of This Week in Enterprise Tech, and that's Miro. Quick question: are you and your team still going from tab to tab, tool to tool, losing brilliant ideas and important information along the way? [00:29:30] With Miro, that doesn't need to happen. Miro is the collaborative visual platform that brings all of your great work together no matter where you are, whether you're working from home or in a hybrid workspace; everything comes together in one workspace online. At first glance, it might seem like just a simple digital whiteboard, but Miro's capabilities run far beyond that.

(00:29:50):
It's a visual collaboration tool packed with features for the whole team to build on each other's ideas and build the future. Shorten time to launch so your customers get what they need faster. [00:30:00] With Miro, you need only one tool to see your vision come to life. Planning, researching, brainstorming, designing and feedback cycles can all live on the Miro board across teams. And faster input means faster outcomes, right? In fact, Miro users report the tool increasing project delivery speed by 29%. View and share the big-picture overview in a cinch, really easily. When everyone has a voice and everyone can tap into a single source of truth, your team remains engaged, invested, [00:30:30] and most importantly, they're happy, right? Cut out any confusion on who needs to do what by mapping out processes, roles, and timelines. You can do that with several templates, including Miro's Swimlane Diagram.

(00:30:42):
Strategic planning becomes easier when it's visual and accessible. Tap into a way to map processes, systems, and plans with the whole team so they not only view it, but have a chance to give feedback as well. And if you're feeling meeting fatigue, I know I am, Miro users report saving up to 80 hours per user per year [00:31:00] just from streamlining conversations and feedback. Ready to be part of the more than 1 million users who join Miro every month? Get your first three boards for free to start working better together at miro.com/podcast. That's M-I-R-O dot com slash podcast, and we thank Miro for their support of This Week in Enterprise Tech. Well, folks, it's my favorite part of the show. We're going to have a guest drop some knowledge on the TWiET riot. Today we have Jason Kimrey, [00:31:30] he's Intel's sales and marketing vice president. Welcome to the show, Jason.

Jason Kimrey (00:31:34):
Hey, thanks for having me.

Lou Maresca (00:31:36):
So we've heard a lot of announcements this week, but before we get to that, I do want to take the audience through something very interesting, because we hear a lot from our audience that they love to hear people's journeys through tech. They have lots of experience, from entry level all the way up to the CTOs and CEOs of the world, and they want to hear people's journeys and what brought them to their current role. So could you take us through your journey [00:32:00] and what brought you to Intel?

Jason Kimrey (00:32:02):
Yeah, sure. It was kind of an interesting journey. I've actually been at Intel a little over 20 years, but coming out of college my first few years were in the healthcare industry, and I kind of decided around that time that I really wanted to get into tech, and went and got an MBA. And then I felt like there was no better tech company on the planet to get in with than Intel, because Intel is absolutely the heart of everything. And throughout the last 20 [00:32:30] years, I've been involved in sales, marketing, programs selling to the US government, and now my current role of working with partners around North America to sell and promote Intel technology to solve some of the world's greatest problems.

Lou Maresca (00:32:45):
Fantastic, fantastic. Now, they've definitely come a long way in the last 20 years. I was actually an intern back in 2002, 2003, 2004, back in the Itanium group. So they've definitely come a long way since then. Speaking of a long way, though, [00:33:00] we just heard a bunch of announcements. They had their amazing Innovation conference just recently, and a bunch of announcements came out of there. And one of the biggest things was the talk about neural processors. Now I'm curious, I know we have a lot to talk about with all the things that came out, but this is one of the big things our audience has been wondering about, and I've been wondering about: just what is a neural processor? What's different about a neural processor? Why is Intel kind of blanketing it? Why is it combining [00:33:30] it with some of the current cores that it has today?

Jason Kimrey (00:33:34):
Yeah, it's a great question, and you can't be involved in a technology conversation today without talking about AI. In an AI-dominated world, the amount of data that's generated, collected, and needing processing is expanding exponentially, and edge solutions, which are anything outside the data center that process, analyze and store data closer to where it's generated, are being deployed more now than ever. [00:34:00] So with all this data out there, you've got to have a way to process it faster at the edge. And one of the biggest, most prevalent edge devices out there is the PC. So what we're going to do is, in addition to a CPU and a GPU in every PC platform that goes out the door, we're going to add essentially an AI processor, and that's called the NPU, or neural processing unit. So that is going to become standard with platforms [00:34:30] that come out now, code-named Meteor Lake. We just announced the brand, Intel Core Ultra, and it's essentially a breakthrough client product that delivers power-efficient architecture scale and a huge advancement in AI acceleration with that dedicated NPU, which essentially will help offload those key AI applications or tasks onto a dedicated neural processing unit that will really help [00:35:00] process AI and data at the edge faster than has ever been possible before.

Lou Maresca (00:35:05):
And we hear a lot of different terms out there. Obviously a lot of people are saying you need a pretty powerful GPU if you want to do some of the current number crunching to support this vector math and so on. But we also hear about the tensor processing units, the TPUs, that a lot of companies are now kind of bundling or adding to their infrastructure. So what is the difference? What is the power of [00:35:30] this NPU that you get that these other processors don't give you?

Jason Kimrey (00:35:36):
Well, the NPU will actually be specific to the PC, whereas TPUs tend to be in the cloud, and GPUs, like with Nvidia, tend to be heavily in the data center. And again, there's so much data out there that you can't possibly process it all in a data center, and it's simply just too expensive to round-trip data to the cloud every time a piece of data is collected [00:36:00] or generated at the edge. So we believe that it's not a one-size-fits-all scenario for AI, and you really need to look at the workload and your price constraints and your time constraints and decision criteria and make sure you've got the right processor for the tasks that you're trying to accomplish. So we believe there's a place for all of them, but for tasks that are processed on one of the most prevalent edge devices [00:36:30] out there, which again is the PC, the NPU will be specifically and really ideally targeted for those.

Lou Maresca (00:36:37):
So one of the interesting things that I always worry about, especially in the workstation or desktop universe, is power consumption. And now that we're adding this additional daughter board or daughter processor or secondary processor core in there, what does that do for power? Do people have to worry, now that you're trying to compute and do AI models and processing at the edge, [00:37:00] does that mean that they're going to need to be at a spot where there's a lot of power, or consistent power, to be able to utilize this functionality? What does that look like? What's the plan there?

Jason Kimrey (00:37:12):
Yeah, I don't have the exact data, and in fact at launch we'll provide more of that, but what I can tell you is the intention of having a dedicated AI co-processor is to offload. So instead of running certain workloads on a CPU or on a graphics card that may not be really ideal [00:37:30] for that, the NPU is highly targeted for those applications that tend to be very data intensive and power hungry. So by offloading and being highly efficient, I think it'll be competitive. Again, the data will come out at launch, but I think it'll ultimately be a great thing for users.
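For developers wondering what that offload looks like in practice, here is a minimal sketch using Intel's OpenVINO runtime: list the devices the runtime can see and compile a model for the NPU, falling back to the CPU when no NPU plugin is present. The "NPU" device name assumes an OpenVINO release that ships the Meteor Lake NPU plugin, and model.xml is a placeholder for an OpenVINO IR model; neither detail comes from the interview.

```python
# Enumerate OpenVINO devices and compile a model for the NPU when available.
# "model.xml" is a placeholder for an OpenVINO IR file you already have.
from openvino.runtime import Core  # pip install openvino

core = Core()
print("Devices OpenVINO can see:", core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU']

model = core.read_model("model.xml")  # placeholder model path

# Prefer the NPU for the AI task; fall back to the CPU if no NPU plugin is present.
target = "NPU" if "NPU" in core.available_devices else "CPU"
compiled = core.compile_model(model, target)
print(f"Model compiled for {target}; inference requests will now run there.")
```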

Lou Maresca (00:37:48):
Fantastic. Now you talked a little bit about workloads. What are some of the targeted workloads for bringing some of this powerful AI number crunching and model crunching to the edge? What are some examples of the workloads that you're [00:38:00] talking about?

Jason Kimrey (00:38:02):
Yeah, there's a lot of different use cases that are being batted around; some I don't even think have been created yet. But I'll just go back to Innovation, which was our technology event that was hosted just the last couple of days this week. There were three great examples and demos that were shared, and I would encourage anyone who didn't see 'em to go search 'em online. One of 'em was a company called Deep Render, which uses AI to compress files by [00:38:30] five x what's possible today. Another one was called Rewind AI, which essentially listens, captures that data, and then transcribes what it hears for future reference, including ChatGPT-like queries. And then another one was Fabletics, which actually creates a virtual avatar of you to try on clothes in a much more performant way than has ever been possible. So those are three examples, but [00:39:00] I think you're going to see industries and different types of use cases that span across performance, security, digital transformation. But those are at least three that we just showed this week.

Lou Maresca (00:39:13):
Now obviously with the advent of large language models and ChatGPT, you've seen a lot of different models coming out. Is one of the target sectors of the market also LLMs and generative technology for the edge?

Jason Kimrey (00:39:28):
Yeah, I think what's interesting [00:39:30] about large language models is the models themselves. There are some models that are billions and billions of parameters, and when you have those, you need a heavy-duty GPU sitting in a big data center, and it's so much data that has to be processed. But not all large language models are extra large, and some are smaller. And we think that it's really going to be the type of thing where it depends on the size of the language model, and where you're just doing inference on those large language models, [00:40:00] a CPU is going to be perfectly fine for that workload. So again, I think we're really in the early stages of AI, and certainly ChatGPT has all the buzz in the headlines, but very few people are going to be able to afford to build a massive large language model, and they're going to be looking for ways to make decisions or analyze data in a much more cost-effective way. And that's why we think that edge AI is really where there's going to be a lot of action, and it's where a lot [00:40:30] of our focus is, in addition to making sure that there are viable alternatives to Nvidia GPUs in the data center as well.
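As a concrete illustration of the "smaller models are fine on a CPU" point, here is a minimal sketch that runs text generation on plain CPU with the Hugging Face transformers pipeline. The model choice, distilgpt2 at roughly 82 million parameters, is purely a stand-in to show the size argument, not something Intel or the guest endorses.

```python
# Run a deliberately tiny language model on CPU only, to show that inference
# on smaller models does not require data-center hardware.
from transformers import pipeline  # pip install transformers torch

generator = pipeline(
    "text-generation",
    model="distilgpt2",  # small stand-in model, ~82M parameters
    device=-1,           # -1 = CPU; no GPU involved
)

result = generator(
    "Edge devices can run inference locally because",
    max_new_tokens=40,
    do_sample=False,     # deterministic output for a quick sanity check
)
print(result[0]["generated_text"])
```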

Lou Maresca (00:40:39):
Speaking of Innovation, there were a lot of announcements out there. One of the biggest things that I thought was pretty interesting was a new process for chip design. They're shifting to using glass as a medium in chip packaging to help break up the slowdown of Moore's Law. Obviously, what's [00:41:00] going to happen with that? What is Intel going to be using that for? Are they going to be developing just a new type of CPU, or have they already done that?

Jason Kimrey (00:41:10):
Yeah, I think what's exciting about this announcement, a lot of the announcements, is that Moore's Law is alive and well. It's just evolving, and it's becoming as much of a physics challenge as it is a manufacturing challenge. So what you're seeing is Intel's not standing still, because to create and package the billions of transistors [00:41:30] that we're now putting into silicon, into CPUs, requires amazing innovations like the one that we announced. So I think you're just going to see Intel, with the five-nodes-in-four-years announcements that we've made, come with a lot of innovations around the packaging and the development as part of that.

Lou Maresca (00:41:55):
Fantastic. Well, we have lots more to talk about here, and I want to bring my co-hosts back in, but before we do, we do have [00:42:00] another great sponsor of This Week in Enterprise Tech, and that is Lookout. Business has changed forever. Boundaries to where we work, or even how we work, have literally disappeared. That means your data is always on the move, whether on devices, in the cloud, across networks, or even at the local coffee shop. While that's great for your workforce, it's a real challenge for IT security. Lookout helps you control your data and free your workforce. Now with Lookout, you'll gain complete visibility into all your data so you can [00:42:30] minimize risk from external and internal threats, plus ensure compliance by seamlessly securing hybrid work. Your organization doesn't have to sacrifice productivity for security, and Lookout makes IT security a lot simpler. Working with multiple point solutions and legacy tools in today's environment is just too complex. With its single unified platform, Lookout reduces IT complexity, giving you more time to focus on whatever else comes your way. Good data protection isn't a cage, it's a springboard, [00:43:00] letting you and your organization bound toward a future of your making. Visit lookout.com today to learn how to safeguard data, secure hybrid work and reduce IT complexity. That's lookout.com, and we thank Lookout for their support of This Week in Enterprise Tech. Well folks, we've been talking with Jason Kimrey, he's Intel's sales and marketing VP, about Intel's latest announcements at their Innovation conference, but I do want to bring my co-hosts back in here and [00:43:30] get their comments and questions. Let's throw it over to Curtis.

Curtis Franklin (00:43:34):
Jason, I appreciate what you've been saying. I've got some questions about some of our use cases. I know that some of my fellow analysts have been looking at generative AI and saying that we are maybe a year, maybe two, from having solid, usable generative AI [00:44:00] instances on devices as small as smartphones. When you look forward, do you see that being a realistic sort of use case, that we're going to have these sitting around our offices, sitting around work groups, sitting around our homes? Or is that perhaps some wishful thinking getting ahead of the industry?

Jason Kimrey (00:44:27):
I think generative AI is [00:44:30] just another evolution of AI, and I think AI at the edge is absolutely, it's already happening today, and it's just going to become more and more advanced. As we were talking earlier, I don't think you're going to see large language models where you're doing a lot of heavy-duty training in a home device or an edge device, but you're certainly going to see a lot of rapid decision making happening. I mean, one of the stats I heard today from IDC is that 50% of all data [00:45:00] that's captured is essentially not valuable in just a matter of hours. So data has to be used, for a lot of use cases, at the point of capture and at the point of decision. So I think the type of analytics, the type of generative AI that's happening is going to depend, but I absolutely think it's going to happen everywhere, and that's one of Intel's goals: to make AI accessible everywhere.

Curtis Franklin (00:45:30):
[00:45:30] Well, with AI being accessible everywhere, it indicates to me that people are going to be able to develop applications for a wide variety of platforms and a wide variety of use cases. Do you see the processors, the systems, the computers on board, all of these, making it [00:46:00] easier for companies to develop applications and for researchers to develop applications than we've seen in the past? Because, let's admit it, AI has been a real bear for development for as long as it's existed.

Jason Kimrey (00:46:19):
Yeah, you couldn't be more right, Curt. And I think that's one of the things: great hardware is made even greater by software, and it's really the software that unlocks the power of the [00:46:30] hardware. And that's a big focus for Intel, the developer, and bringing tools for the developer to help bring those models into reality. I mean, one of the platforms that Intel released is called the Intel Geti platform, which essentially empowers anyone in the enterprise to rapidly develop AI models without being AI experts themselves. You essentially need these types of platforms to streamline model development, [00:47:00] because it reduces complexity and it encourages collaboration. It's these types of tools that are going to be critical to enabling AI to be accessible for everyone. But it does start with having the right development tools to enable developers to unlock the power of all that hardware that's out there.

Curtis Franklin (00:47:20):
Well, with everybody developing these AI applications, we're hearing a lot of people, regulators, legislators, [00:47:30] people around the industry, talk about the potential privacy, IP, and other concerns about possible ramifications of AI as it spreads. Do you see this proliferation of AI being something that we have the [00:48:00] regulatory structure to deal with, or do the regulators and legislators need to get on their horse and catch up with where we're going?

Jason Kimrey (00:48:13):
I don't want to call it the Wild West, but it is. If we're not careful as an industry, it could get out of hand. I mean, it's why Intel, and not just Intel, but Microsoft and others, are really promoting this concept of responsible AI. And at Intel, we believe [00:48:30] that enabling an ethical and equitable AI requires a truly comprehensive approach around not just the technology; it's the people, the processes, the systems, the data, the algorithms, it's all of it. And it really requires the industry to come together, in coordination with our government and others, to set policy that ultimately enables responsible AI. We believe that [00:49:00] at Intel, we're doing our part to design AI to lower risk and optimize the benefits for our society. But you don't have to think too hard to imagine AI use cases that could ultimately be a bad thing.

Brian Chee (00:49:19):
Okay. I want to be very clear: I am a big Intel fanboy, so I'll be honest about that. And one of the things that has made Intel my favorite [00:49:30] ever since I was literally in high school was that the software side was never forgotten. Some of the developer and integration toolkits that Intel has provided in the past have been nothing short of stunning. I've actually developed class material around your integration tools. So one of the things I'm really excited about with AI isn't large language models. [00:50:00] No, no, no. It's small AI models, task-specific AI models at the edge. So my favorite example is actually the forestry service: smoke plumes are actually really, really easy for humans to spot, but really, really hard for computers. And I bought an Intel neural processor, a USB version. [00:50:30] I think I actually had to buy it directly from Intel; that was how early it was. So one, are we going to see those really cool integration developer toolkits coming out from Intel? Please, please, please, please, please. And two, are you seeing the edge go that far, into single-board computers, into OEM motherboards?

Jason Kimrey (00:50:56):
Yeah, I think so. One, thank you for being a [00:51:00] fan of Intel. And a couple of things. One, I would tell you, with Pat Gelsinger, our CEO, he has brought back our software mojo. We never left it, but he's brought it to another level. It's why we had the Intel Developer Forum for years; we didn't do it for a few. And with Intel Innovation, which we just did, it was all about the developer. It's all about the software. I think you're going to see more and more enhancements and [00:51:30] investments in software. And our goal with a lot of this stuff is integrating it into the platform. I think the example you're giving was Movidius, our Movidius product, which was a product that we acquired back in 2016 and which was a leader in mobile vision processor technology for connected devices. That technology, [00:52:00] which was a standalone mobile vision processor, is kind of the precursor to the NPU, which we're now making standard in the PC. So our goal is always to look at what's the great technology out there and then how we can integrate it in the platform to, one, make it ubiquitous and then, two, make it really performant.

Brian Chee (00:52:24):
Yeah. So one of the other things, and this is me polishing my crystal ball: [00:52:30] the Power over Ethernet trend has been going up and up and up. In fact, one of our news stories was about copper-clad aluminum cable. Well, they call it CAT 7. Yeah, that's bogus. But the power ratings are going up; PoE++ is all the way up to a hundred watts. And I want to be able to go and use these devices. Like, I was actually part of a DARPA grant where we were trying to go and find Japanese [00:53:00] factory ships near Kure Atoll, way, way, way at the northern part of the Hawaiian island chain. But we didn't want to have super expensive SATCOM bills, Iridium and so forth. What we wanted to do is only fire it up when we were absolutely sure. So here's my plug: can we get developer and integration help from Intel someday? This is me hoping [00:53:30] for help developing models. When I first got my Intel neural processor, my head almost exploded; the learning curve on that was just insane. Are we going to see that pretty soon?

Jason Kimrey (00:53:48):
Oh, I don't know. I can't comment on that. I can tell you that we're constantly, again, in this industry you've got to innovate in all aspects, and it's hardware, it's software, it's connectivity, and

Brian Chee (00:54:02):
Well, if you guys want help, I'm more than happy to help.

Jason Kimrey (00:54:06):
I'll follow up. I know we have some people working on things very similar to what you described, and I'd be glad to follow up on that one.

Brian Chee (00:54:15):
Well, anyway, it's been spectacular talking to you, and I see a bright future, because when the world changed and added GPUs, it changed the world. It was a paradigm [00:54:30] shift in the industry. I'm hoping NPUs will be a similar paradigm shift. Best of luck to you and your product line.

Jason Kimrey (00:54:40):
Well, thank you. And I would just say it launches soon, the Intel Core Ultra brand; it's going to be pervasive in the consumer platform, and it'll come out in the business platform next year, in laptops, everywhere, and look, get your hands on it. Now, we also announced at the [00:55:00] Innovation event the Intel Developer Cloud, which is really our way of making the latest Intel CPUs, GPUs, and Gaudi available in a cloud infrastructure to enable customers to get started quickly. It'll also include AI frameworks and tooling, including OpenVINO, optimized versions of PyTorch and TensorFlow, and even oneAPI. So our goal is making AI more accessible for [00:55:30] everyone, and that's going to happen through software, through hardware, through the Intel Developer Cloud. And I would just ask you, and anyone that's listening, to stay tuned, because that's our mission at Intel: to make it more accessible, efficient, and impactful for everyone. I'm excited to be on this journey with all of you, and I'm excited to see where it goes from here.

Lou Maresca (00:55:52):
Thank you so much, Jason, for being here. Since we are about to close, I want to give you a chance to tell the folks where they can go to learn a little [00:56:00] more about Intel Core Ultra and all their new platforms and all their stuff coming out.

Jason Kimrey (00:56:05):
Well, intel.com is where all of this information is. And the part of the business that I represent, which is our partner organization, we do have a program called Intel Partner Alliance. I encourage anyone that's part of a small business or any business to join, and you can find more information on that at intel.com/partner.

Lou Maresca (00:56:29):
Fantastic. Well,

Jason Kimrey (00:56:29):
Thanks. And [00:56:30] if you're interested, follow me on LinkedIn, Jason Kimrey.

Lou Maresca (00:56:34):
Thanks again. Well, folks, you've done it again. You sat through another hour of the best darn enterprise and IT podcast in the universe, so definitely tune your podcatcher to TWiET. We want to make sure we thank everyone who makes this show possible, especially my wonderful co-hosts, starting with our very own Mr. Brian "Cheebert" Chee. Cheebert, what's going on for you in the coming week? Where can people find you and get ahold of you?

Brian Chee (00:56:54):
Oh, sorry. I was going through old emails trying to find Keith512's email [00:57:00] and couldn't find it. Sorry. But anyway, I am building some racks for the Central Florida Fairgrounds and running fiber, getting a bunch of fiber in my diet, and basically enjoying being a retiree, donating my time to try and make the world a better place. So that's going to be fun. Other than that, apologies to Keith512, whose email I forgot [00:57:30] to tag with his question; normally I'm better at that. So, I am Cheebert, that's cheebert@twit.tv. You're also welcome to use twiet@twit.tv, and that'll hit all the hosts. I do a lot of my talking about cool things, projects that I'm building and so forth, on Twitter, which is now called X; that never made sense to me. I'm @advnetlab, [00:58:00] Advanced Net Lab, and I'd love to hear your ideas for shows. And if you have questions, I'll be more than happy to answer. I know one of our viewers got really excited about Bluetooth serial dongles. I actually had a copy of my old InfoWorld article on that and sent it to 'em. So, teaser for your viewers: if you ask questions, sometimes I can give you toys.

Lou Maresca (00:58:25):
Thanks, Cheebert, thanks for being here. We also have to thank our very own Mr. Curtis Franklin. Curtis, thank [00:58:30] you so much for being here. Where can people find you and all your work, and what's coming up for you this week?

Curtis Franklin (00:58:34):
Well, coming up this week, I've got a new piece that's going to be going up on Dark Reading, that's Dark Reading slash Omdia. I have a new report coming up on omdia.com for our subscribers. And if you're interested in knowing more detail about what I think about the Cisco Splunk deal, my first thoughts, head to [00:59:00] LinkedIn, that's linkedin.com, Curtis Franklin. I've got an article that I put up today about that and would love to hear your thoughts on it. Take a read, let me know what you think.

Lou Maresca (00:59:14):
Thank you, Curtis. Well, we also have to thank you as well. You're the person who drops in each and every week to get your enterprise and IT goodness, and I'm going to make it easy for you to get your enterprise and IT news. So go to our show page right now, twit.tv. There it is, twit.tv/twiet. There you'll [00:59:30] find all the amazing back episodes and the co-host information. But next to those videos, you'll get those helpful subscribe and download links, and we want you to get the audio version or the video version of your choice and be able to download it and watch it on any one of your devices. So definitely go ahead and click on those links and subscribe. And you can also subscribe in any one of your podcast app locations as well. On all of them, definitely look for TWiET and subscribe to the TWiET podcast.

(00:59:54):
You may have also heard, we also have Club TWiT as well. That's right. It's a members-only ad-free podcast [01:00:00] service where you get a bonus TWiT+ feed that you really can't get anywhere else, and it's only $7 a month. That's right. Not only that, you get access to an exclusive members-only Discord server. You can chat with hosts and producers, and there's lots of separate discussion channels and lots of fun stuff. So definitely join Club TWiT at twit.tv/clubtwit. And you know what, I also want to let you know that Club TWiT also offers corporate group plans as well. It's five members at a discount rate of $6 each per month to get you access to all of [01:00:30] our ad-free tech podcasts. It's a great way to give your teams, whether it's IT teams or developers, sales teams, tech teams, whoever, access to all of our podcasts.

(01:00:40):
And just like that regular membership, you can join the TWiT Discord server and you get that TWiT+ bonus feed as well. So definitely have them join and be part of that. Plus, get this: your family can also enjoy family plans. That's right, it's $12 a month. You get two seats with that, and it's only $6 each for each additional seat, and just like all the other memberships, you get access to the server [01:01:00] and you get access to the TWiT+ feed, just like the regular plans. So lots of options. Definitely check out Club TWiT at twit.tv/clubtwit. Now, after you subscribe, you can impress your friends, your family members, your coworkers with the gift of TWiT. Please do, because we talk a lot about some fun tech topics on the show, and I guarantee they'll find it fun and interesting as well.

(01:01:18):
So have them join and be part of that. Of course, we also do this show live. That's right, live.twit.tv is the live stream. We do it Fridays at 1:30 PM Pacific time. [01:01:30] Go ahead, watch the behind-the-scenes before the show and after the show, come see how the pizza's made, all the banter and fun stuff before and after the show. Definitely check out the live stream. And of course, if you're going to watch the live stream, you should jump into our IRC channel as well. It's irc.twit.tv. There you can find some amazing characters each and every week, and we have a lot of great discussions of fun stuff in there. In fact, we got some questions and some topics from them, and we get some really great show titles from them each and every week as well.

(01:01:55):
So thank you guys for being there and being part of that fun at irc.twit.tv. [01:02:00] Definitely hit me up. I want you to hit me up for questions, conversations, whatever. I'm on Twitter, or X: x.com/LouMM. I'm also on Threads, LouMPM on there. I'm on Mastodon, of course, Lou at twit.social, and of course always on LinkedIn, Louis Maresca on there. Definitely hit me up. I've had a lot of questions this week about getting started with AI, AI prompt engineering and design. I had some good conversations about that; that was a lot of fun. So please, definitely contact me. I love having conversations. [01:02:30] Hit me up on direct message on Twitter if you want. That'd be great. Of course, if you want to know what I do during my normal work week at Microsoft, definitely check out developers.microsoft.com/office. There we post the latest and greatest ways for you to customize your Office experience.

(01:02:44):
Of course, if you have Microsoft 365 right now, go ahead and open up Excel and check out the Automate tab. That's where my team lives. We allow you to create macros, modern macros we like to call them, where you can actually generate JavaScript and TypeScript. You can run them in [01:03:00] Power Automate, you can run them against all of your workbooks, whatever you want to do. A really great way to automate your processes. So definitely check out the Automate tab there. I want to thank everyone who makes this show possible, especially Leo and Lisa. They continue to support This Week in Enterprise Tech each and every week, and we couldn't do the show without them. So thank you for all their support over the years. I also want to thank all the engineers and staff at TWiT. And of course, I also want to thank our Mr.

(01:03:24):
Brian Chee one more time, because he's not only our co-host, but he's also our tireless producer. He does all the bookings and the planning [01:03:30] for the show. We couldn't do the show without him. So thank you, Cheebert, for all your help and your support. And of course, before we sign out, thank you to our editor today, because they make us look good after the fact. They cut out all of our mistakes, especially mine. And of course, thank you to our technical director today, the talented Mr. Ant Pruitt. He does some shows on TWiT, as well as some amazing photo walks and some other fun stuff that's going on at TWiT. So can you give us a little tease on what happened this week and maybe what's happening next week on TWiT?

Ant Pruitt (01:03:57):
Well, thank you, Mr. Lou. Appreciate [01:04:00] all the support. Well, what's happening at TWiT is we have an AMA coming up featuring some dude named Lou Maresca, and this is part of our Club TWiT package. So make sure you're signed up and can join in and check that out, and put your questions in our Club TWiT Discord. And also feel free to give me a follow over on Instagram. I'm still taking pictures and doing a bit of campaigning for my boys. So yeah, you're going to see a lot of pictures of my boy doing his [01:04:30] thing, but you can still join in, because every now and then I'll do little stories, even stuff like Life at TWiT, so you get a little behind the scenes too. So check it out, Ant Pruitt on Instagram.

Lou Maresca (01:04:44):
Thank you, sir. Well, until next time, I'm Louis Maresca, just reminding you: if you want to know what's going on in the enterprise, just keep TWiET!
 
