
Decision Point: NetSpeek - The End of Alert-Only AV Management with Lena AI

Written by Craig Durr | Apr 14, 2026 6:04:06 PM

Summary

On this episode of Decision Point, the Collab Collective’s Craig Durr speaks with Sam Kennedy, EVP of Go-to-Market, and Osman Bicakci, EVP of Product and Engineering at NetSpeek. Recorded during the expo, the conversation explores how NetSpeek is applying AI to move IT teams beyond reactive monitoring and into autonomous operations for AV and UC environments. The discussion centers on Lena, NetSpeek’s AI platform, and its evolution into a purpose-built orchestration layer designed to diagnose and resolve issues in real time. 

 

Their discussion covers:

  • AI-Driven AV Operations: How Lena transforms traditional alert-based monitoring into intelligent, automated workflows that detect, diagnose, and resolve issues without manual intervention
  • From Monitoring to Orchestration: Why legacy tools fall short and how NetSpeek’s platform introduces a deeper, multi-vendor understanding of conference room environments through structured data and API integrations
  • Knowledge-Based AI Advantage: How Lena leverages vendor-provided documentation, APIs, and contextual data to deliver accurate insights while avoiding the limitations of generic AI models
  • Autonomous Troubleshooting (Lena 1.1): A look at newly launched capabilities that identify root causes, resolve issues automatically when possible, and guide IT teams through resolution when physical intervention is required
  • Real-World Demo Insights: Live demonstrations showing how Lena restores room configurations, resolves vague user-reported issues, and reduces the need for on-site support across distributed environments
  • The Road to Self-Healing Environments: A preview of NetSpeek’s roadmap toward proactive, fully autonomous AV environments, with future updates expected to expand self-healing capabilities even further 

 

Listen to the Audio:

Or tune in on your preferred streaming platform:

Transcript

Craig Durr: Everyone, this is Craig Durr, Chief Analyst and Founder of the Collab Collective, and I want to welcome you to another edition of Decision Point. This is Workplace Insights for IT Leaders, and this is an exciting one. Why? Because we're going to be tackling a topic that everyone is talking about, which is AI and how you can make AI something that's practical and useful for you as an IT administrator. So very specific to that, I have a very, very exciting show plan today. In fact, it's going to align with some product announcements and even a live demo we're talking about. This all comes from the company NetSpeek, and to walk me through this, I want to introduce you to two of the leadership team from NetSpeek. First, we have Sam Kennedy, who's EVP of Go-to-Market.

Sam, how are you doing?

Sam Kennedy: I'm doing great. Great to be here, Craig.

Craig Durr: Good to have you.

And then we have Osman Bicakci, who is the EVP of Product and Engineering.

Osman, how are you doing?

Osman Bicakci: Doing great, Craig. Thank you.

Craig Durr: Okay, so here's the premise of the show, just to make sure you guys know what's going on. My job here is to ask some of the questions that I think our IT leaders want to hear. And this is a new product for a lot of people. They may not have had an introduction to it as of yet.

NetSpeek has been around for probably about two and a half years in stealth mode, then came out in the last year. But let's go ahead and go through an introduction and start learning about what the company is about.

Sam, why don't you do this? Give me an overview of NetSpeek, and then who is this character, Lena, that you keep talking about?

Sam Kennedy: NetSpeek is focusing on transforming how organizations manage their AV and UC environments.

The way that most teams operate today is in a fundamentally reactive mode. Traditional monitoring tools generate alerts, but they don't actually fix the problems. They'll tell you something's wrong, but it's ultimately up to a human to figure out what the problem is and how to solve it. With Lena, our AI platform, we shift from reactive monitoring to AI-driven operations, where issues can be detected, diagnosed, and ultimately resolved automatically. So Lena delivers AIOps specifically for AV, UC, and digital signage, and she understands these environments deeply, not generically, because this is all she does.

Craig Durr: I love it. You're personifying her, Lena. I know that acronym stands for something. What does it stand for?

Sam Kennedy: Language Enabled Network Administrator

Craig Durr: Okay, so what you're saying, almost in the promise of that name, is that we're not talking about a generic, horizontal AI platform. It sounds like this is purpose-built for AV pro and IT pro needs, especially around conference rooms, right?

Sam Kennedy: Yeah, absolutely. And this is what makes her so effective in these environments. A general AI tool doesn't understand how these systems work, how they fail, and ultimately how to fix them.

Craig Durr: And so this is going to be replacing what's the status quo right now in the market that we're talking about?

Sam Kennedy: Well, the status quo is more generalized monitoring tools that can give you uptime or downtime and tell you whether a device is online or offline, whereas with Lena we're delivering a lot more capabilities. A lot of times we'll call it an AI orchestration platform, because it's doing a lot more than just saying whether my system is up or not.

Craig Durr: That's one of the banes of existence when I talk to IT administrators: alert-based monitoring. Fundamentally, it's the right idea, but if you're getting pinged back and forth with multiple alerts, it's almost like the boy who cried wolf. I don't know which are the right ones to focus on. That seems like a core problem there.

Sam Kennedy: And even then, if you understand that something's offline, well, then what do you do? Your users have this downtime, and instead of being able to proactively fix these problems, your administrators are off chasing down how to fix them instead of just implementing the solutions.

Craig Durr: Okay. Osman, is he doing a good job here? I mean, he's giving me the marketing spiel, but you run product and engineering. Is he representing well what NetSpeek represents here?

Osman Bicakci: I think he's definitely touching the correct points. The one thing I would underline is that yes, we are currently focusing on device orchestration, but as a platform, we are an AI company. When somebody asks me where I work, I say I work at an AI company. I think that's the real key here, underlining that AI aspect of this conversation.

Craig Durr: Well, that's a great segue to the next level of question. I've been fortunate to have touchpoints with NetSpeek along the way, almost even before you came out publicly, which is great, but most people don't understand the journey that's taken place here over essentially two and a half years. Why don't you give us a walkthrough? A great touch point is probably where this really came to life, which was at ISE, the Integrated Systems Europe conference, just this past year.

Sam Kennedy: Yeah, we've been working on Lena for about two and a half years. We came out of stealth mode right before ISE in 2025. We launched our 1.0 of Lena in October. We actively have customers and partners that are using Lena today, and we started previewing at ISE 2026 some of the new capabilities that we're announcing today.

Craig Durr: I'm going to pause you there. That's key, because from what you promised in 2025 to what we saw in 2026, just this past February, you delivered almost 90% of what you talked about in the original vision. The execution is pretty commendable.

Sam Kennedy: Well, we're moving very fast, and I think part of the reason for that, as Osman just touched on, is that we are an AI company first. We spent a lot of that stealth time building a strong base, and we were fortunate enough to be able to leverage a lot of the AI capabilities that were starting to become available. Because we built a strong base, we're able to really take advantage of that now and move extremely fast.

Craig Durr: Alright, I'm going to push back and understand something here. You have some capabilities that are foundational. I think you outlined a couple for me when we were in the green room; please hit those again. I think this helps establish what you're promising as an AI-first company.

Sam Kennedy: Well, there are a couple of key parts of the architecture, and one is a knowledge management platform which trains Lena on documentation, release notes, admin guides… And we pull this all directly from the vendors that we're partnered with. So this is a primary source of information, which is constantly being updated so that we always have data that's relevant. This is in contrast to more of a generalized AI tool that pulls data from internet searches and forums, and that's just so prone to hallucinations. Additionally, with these vendors, we work very closely with them and leverage their APIs so that Lena can observe and provide a deep level of control for their devices.

Craig Durr: And so this all translates into deep knowledge, because a common problem is that no one has the same vendor across all their environments. In different rooms, or even the same room, you can have multiple vendors. So you're talking about getting insights into those specific devices and that knowledge base idea you mentioned before?

Sam Kennedy: Absolutely, and you hit a very important point: pretty much all customer environments are inherently multi-vendor. So what we're able to do is leverage all of that knowledge from multiple vendors and pull it together into a much deeper, richer understanding, which enables the ability to ask questions, answer questions, control devices, and tie it all together in a more cohesive system than a traditional monitoring tool could ever deliver.

Craig Durr: Osman, you guys call this your fleet partnership program, or your partnership programs. Who are some of the vendors you have these relationships with to get that firsthand, first-level knowledge to pull into your data sources?

Osman Bicakci: When we approach our partnership program in the ecosystem, our team comes from the industry, so we know the manufacturers in the market: their market share, their technology, and their perspective and vision on the market. So we are approaching all the mainstream manufacturers in this ecosystem, and they're approaching us as well, to partner with us. These include very well-known display manufacturers, but also companies on the video and collaboration side; you'll see the full list of partnerships on our website.

And this is a program that we are not limiting. We are open to any new manufacturer who wants to deliver the best experience for their end users, because what we bring to the table is getting the best out of the different manufacturers' devices in a space, whether it's a meeting room or an open area, wherever people sit to collaborate, as if a conductor were orchestrating them into the best possible concert for their users. So we are also enabling the best value for their customers. That's why we are open to any manufacturer willing to work with us to add their devices to our platform.

Craig Durr: So I'm going to try to translate this into layman's terms. You're saying, for, let's say, an HP device, formerly Poly, you would have insights into how that device is configured because you've absorbed all the release notes, all the installation documents, and even have troubleshooting guides built out for it as well. Is that the takeaway?

Osman Bicakci: That's definitely the takeaway. Let me bring in a bit of a boring AI fact here. Large language models are generally stateless, right? When they come out of training from their model providers, which are known names like Meta, OpenAI, etc., they're trained on general internet and human knowledge. But because our industry's knowledge is vertical, the knowledge base in the model itself, from its initial training by the model provider, does not include vendor-, manufacturer-, or AV-related information. So what we are providing is the base-level data for these models, so that when they're working on these devices, they know how to solve problems and operate them.

The other part that we enable is control. Control means sending commands to these devices, like API commands in the traditional sense, so you know how to act on them using the information you grab from the data part. This is what we do: we control these devices.

The last part is monitoring. We use the same APIs, or the information available from these devices, as telemetry to monitor them. So whatever we do, we take that documentation source and use it for multiple purposes. One is getting knowledge into our system so Lena can answer questions. The other is taking actions on behalf of the admins when they ask Lena to do something.
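The three roles Osman describes here, vendor documentation as grounding data, API commands for control, and the same APIs read back as telemetry, can be sketched in miniature. The Python below is purely illustrative: all names (`Device`, `VENDOR_DOCS`, the stubbed functions) are hypothetical and do not reflect NetSpeek's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Device:
    vendor: str
    model: str
    state: dict  # last telemetry snapshot pulled via the vendor API


# Knowledge base: vendor-provided docs, not scraped web forums
VENDOR_DOCS = {
    ("samsung", "display"): "Admin guide: input sources, power, volume settings.",
}


def monitor(device: Device) -> dict:
    """Poll the vendor API for current telemetry (stubbed here)."""
    return device.state


def control(device: Device, setting: str, value) -> None:
    """Send a vendor API command (stubbed: mutate local state)."""
    device.state[setting] = value


def answer(device: Device, question: str) -> str:
    """Ground the model's answer in vendor documentation, not the open web."""
    doc = VENDOR_DOCS.get((device.vendor, device.model), "no docs on file")
    return f"[grounded in: {doc}] answer to {question!r}"
```

The key design point, as described in the conversation, is that one source (vendor documentation and APIs) feeds all three paths: answering questions, acting on devices, and watching them.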

Craig Durr: I got it. Okay, that's good. You're creating your own language model specific to this industry, to this device, in some ways?

Osman Bicakci: No, we are not creating models. We are using two open-source models from leading model providers. Our platform is model-agnostic, so we are not dependent on any one model; in fact, one of our targets for the end of this year is a bring-your-own-model approach that lets customers use their own models. What we are delivering is the data the models need to operate in these environments, like device orchestration, and providing that information in a structured way.

Craig Durr: That helps. And now, when we talk about a structured way, we're still talking about a secure environment, enterprise-grade. I'm not sharing data with the broader internet and ecosystem, am I?

Osman Bicakci: Nothing. One of the things we said on the webinar, and it was a very cliché kind of thing, is that your data is your data. We never use customer data for training or collect it for any other purpose. We use that information only for your tenant. There's a tenant installation, and what this is known as in the AI world is context management. You've most probably heard the word "context" if you're interested in AI or AI development; what we use is a smart means of managing context, which is customer information. Together with the device information we have, the data Sam mentioned that comes from the manufacturers themselves, we put all of this into a kind of runtime memory and enable the LLM to work in that space only for that customer. Customer A's data and Customer B's data, and their contexts, are separate from each other and only live in their own tenant environments.
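The tenant isolation Osman describes, where each customer's context lives only in its own runtime memory and never reaches another tenant's model calls, can be illustrated with a minimal sketch. Everything here (`TenantContext`, `get_context`) is a hypothetical teaching example, not NetSpeek's architecture.

```python
class TenantContext:
    """Runtime memory scoped to a single tenant; never shared across tenants."""

    def __init__(self, tenant_id: str):
        self.tenant_id = tenant_id
        self._memory: list[str] = []  # this tenant's facts only

    def add(self, fact: str) -> None:
        self._memory.append(fact)

    def build_prompt(self, question: str) -> str:
        # Only this tenant's facts are assembled into the model's context
        context = "\n".join(self._memory)
        return f"Context:\n{context}\n\nQuestion: {question}"


_contexts: dict[str, TenantContext] = {}  # tenant_id -> isolated context


def get_context(tenant_id: str) -> TenantContext:
    return _contexts.setdefault(tenant_id, TenantContext(tenant_id))
```

The point of the sketch: because prompts are built per tenant, Customer A's facts can never appear in a prompt assembled for Customer B.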

Craig Durr: Alright, you're doing well. I appreciate this. I told you I wasn't going to pull punches. So the baseline right now with Lena, the 1.0 version, is some core foundation. Sam, I kind of cut you off; you were going through this, but that's the ability to ask questions, and natural-language device control. And there's something else I've got a note on, which is what you call the Room Health Check Agent. Did I get that right?

Sam Kennedy: Correct. With room check, or health check, Lena is able to detect configuration drift in a space and automatically restore the space to the correct state without any human intervention. Most organizations do this today with sneakernet, where they have employees, or someone from an MSP, walk into the room to make sure everything's functioning the way their users want it to be. What we've done is leverage Lena to automate that process, so that when users walk into the room, it's always in a ready state.
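The health-check idea, comparing a room's live device state to a desired target state and correcting any drift, can be sketched as follows. This is an illustrative example only; the function and setting names are assumptions, and the real product works through vendor APIs rather than local dictionaries.

```python
def check_room(target: dict, live: dict) -> list[tuple[str, object]]:
    """Return (setting, desired_value) pairs where the live state has drifted."""
    return [(k, v) for k, v in target.items() if live.get(k) != v]


def heal_room(target: dict, live: dict, send_command) -> list[str]:
    """Issue a corrective command for every drifted setting; return an action log."""
    log = []
    for setting, desired in check_room(target, live):
        send_command(setting, desired)  # in practice, a vendor API call
        live[setting] = desired         # reflect the corrected state
        log.append(f"restored {setting} -> {desired}")
    return log


# Example: a user changed the input and muted the display
target = {"input": "HDMI1", "muted": False, "volume": 40}
live = {"input": "HDMI2", "muted": True, "volume": 40}
actions = heal_room(target, live, send_command=lambda k, v: None)
```

A scheduled run of something like `heal_room` against each room's target state is, in spirit, what the demo later in this transcript shows: the room is checked, drift is found, and commands restore the ready state with no human in the loop.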

Craig Durr: I like what you're saying. So this is the picture I think I heard you paint: an enterprise-grade, data-specific, AI-first model that's taking data and creating these insights for us. This is what Osman was talking about.

Sam, if I reflect back on what you were saying, this is how it works as an end-user experience. I think you mentioned having natural-language controls. I could ask questions, I guess, just as is, and I could probably do some device control based on the other insights Osman was sharing as well, right? All of that rolls into these great feature sets, like room health check, which is kind of a baseline idea. Is that a good encapsulation of Lena? I know I'm not selling it; I'm just trying to make sure I can repeat it back to our audience.

Sam Kennedy: That's correct. From a 1.0 perspective, that's where we started, and really what we did with our 1.0 features is find real problems that many users are facing today, show what can be done with AI with our base platform, and now we're starting to add much more advanced capabilities on top of it.

Craig Durr: Got it. Well, you said "now," and that is why I have you guys here. You promised me some cool new stuff. Today, you had a press release, and I had a chance to preview it. So you are announcing some new features in Lena 1.1, is that right?

Sam Kennedy: That's correct.

Craig Durr: Spill the beans, tell me what you got.

Sam Kennedy: With 1.1, there are a number of features. We have a refreshed UI, updated partnerships, role-based access control, and quite a few other improvements. But the real headline of what we're announcing today is a new troubleshooting capability in Lena. We think this is one of the biggest innovations in the industry, and it's definitely one of the biggest innovations within Lena that we are launching today.

Craig Durr: Okay, troubleshooting is what we're talking about. What does that mean?

Sam Kennedy: As we said earlier, tools in this space have historically focused on monitoring. They surface alerts without actually resolving anything. Solving the problem still relies on a human being to figure out what the actual problem is and how to resolve it…

Craig Durr: Take an alert, translate it into what it means for my specific environment, figure out how to act upon it, then go out and do the action on it, right?

Sam Kennedy: Correct. Yeah, so the new troubleshooting capability within Lena identifies root causes, resolves issues automatically where possible, or can guide a user through what needs to be done to solve a problem.
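The triage flow Sam describes, find a root cause, fix it remotely when a command can, and otherwise hand back guided steps for a human, can be sketched as a simple decision function. The rules and names below are illustrative assumptions, not Lena's actual logic.

```python
def troubleshoot(devices: dict) -> dict:
    """devices: name -> state dict, e.g. {'reachable': bool, 'power': 'on'/'off'}.

    Returns a root cause plus either an automatic remote action or
    guided steps that require a person on site.
    """
    for name, state in devices.items():
        if not state.get("reachable", True):
            # No API path to the device: a human has to intervene
            return {
                "root_cause": f"{name} is unreachable",
                "auto_fixable": False,
                "guidance": [f"Check power and network cabling on {name}"],
            }
        if state.get("power") == "off":
            # Fixable remotely with a single API command
            return {
                "root_cause": f"{name} is powered off",
                "auto_fixable": True,
                "action": ("power_on", name),
            }
    return {"root_cause": None, "auto_fixable": False, "guidance": []}
```

The two demos later in the transcript map onto these two branches: the powered-off Samsung display (auto-fixable remotely) and the unplugged Barco ClickShare (unreachable, so Lena guides a person through the physical fix).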

Craig Durr: Okay. What actually is taking place in that framing? Are we on the path to being truly autonomous? What does this do?

Sam Kennedy: So this is a step toward that; I'd say the first step, though the health check we delivered was really the true first step. Ultimately, our goal with Lena is to provide truly autonomous, self-healing environments, and what we're delivering today with 1.1 and our new troubleshooting capability is part of the journey to get there.

Craig Durr: Okay. All right. I could ask more questions. I really think what I want to do is have you show a demo. Why don't we dive into this? Is that possible?

Sam Kennedy: It's absolutely possible.

Craig Durr: Alright, bring up your stuff.

Sam Kennedy: Okay, so let me start by sharing some content here. This is the Lena user interface, so from here I can see all the different devices I have in my environment. I can see previous chats, and as we were talking about earlier, we have that knowledge base where I can ask deep technical questions, because we've trained Lena on all that knowledge, and you can see some of the different vendors.

Craig Durr: So those are all the vendors that we were talking about before.

Sam Kennedy: These are all the vendors we were talking about before that you could ask those deep technical questions. Lena will answer the question and pull the documentation that she has on this. And like we were saying earlier, because we've trained Lena on this information, it's not being pulled from potentially nefarious sources. It's not being pulled from a blog somewhere where someone didn't put in the correct information. So we're very accurate on the data that Lena's going to respond with.

Now, one of the first major features, as we talked about, is the ability to do a room check. But before we go there, I'm going to go into the management view, and you can see my demo environment here and the number of devices. What we have is a room, and this is the configuration, which we call the canvas, for a specific room. I just happen to have a live stream up. This is just a YouTube video, a camera pointed at a traditional conference room. You can see a Neat Bar, a Samsung monitor, a Neat Pad, a BrightSign player, and a Barco. So I'm going to tell a little story.

Craig Durr: This is a multiple vendor environment. This isn’t unusual at all.

Sam Kennedy: Very typical, right? This is what most environments are. So I'm going to have one of our employees go into the room live and screw the system up. They want to take advantage of the great AV we have in this room, and they're going to go in and change the input on the monitor, lower the volume, and mute the monitor.

You can see Eric is now in the room; he's going to go and change it all up. While Eric's doing that, you'll see that he muted the system. So what I'm going to do is go back to Lena. As I was saying, this is the configuration of that room inside Lena. It does require some configuration for the room, but we think of this as a no-code solution: there's some setup, but no programming or anything more than that.

And I'd love to get into a lot more detail about the different things you can see here, but I'd suggest any customer who wants that level of depth follow up with me. What we're going to do is set different target states for the room. We talked a lot about higher education, where on some days what's on the ClickShare is most important, and on other days the Zoom Room is most important. On the weekends, we shut down the devices so we can save on power. But in this case, we want to set this room up for a video conference, and I can go through the configuration here.

It's really leveraging the APIs like Osman talked about earlier. I'm going to manually run a room check now for the space we just showed. Of course, you could schedule this; that's how most of our current customers have it set up. They schedule their room checks. And again, this is what we were talking about earlier: these devices tend to drift, and we're seeing this in real time in this particular room.

Craig Durr: The configuration is drifting, like a car? Is that what you meant when you said the word drift?

Sam Kennedy: Right. In this case, one of our users went in and changed the configuration of the room, and now, when users walk in for their Zoom meeting, they have a broken experience. So what Lena did was see that there was configuration drift in this room. She automatically switched the input on the monitor, took the system off mute, and changed the volume. She even dials a Zoom call and then drops it, testing to make sure that's good. Now, when users walk into the room, they're ready for their meeting.

Craig Durr: This is awesome. This is what you were calling that room health check functionality, right, where you established baselines, which are the true state you want to have? I love that you had that pull-down menu; you can see the different states as well. And when it got off that core state, Lena was able to go back in automatically, because you were talking to me, you weren't doing anything here, right? It went back, found the right input, and changed everything back to what it was supposed to be.

Sam Kennedy: It did it all automatically. And this is the true orchestration level that we were talking about. Lena found problems, realized that they were problems, and was able to send the commands to the devices to get them into the ready state where the users need them when they're going to go in for their meeting.

Craig Durr: That's incredible. This is still Lena 1.0, this isn't even…

Sam Kennedy: Yes, it’s Lena 1.0.

Craig Durr: Let's dive into troubleshooter. I can only imagine what this looks like.

Sam Kennedy: We're going to show a couple of scenarios now with the troubleshooter. For this next demo, we're going to send Eric back into the room. We're going to walk through an issue where a user reports a very vague statement about a problem, or a perceived problem, with their Barco ClickShare. So that everyone can see the screen and get into the scenario, Eric went and turned the display off. So we all know the issue is that the display is turned off.

But the scenario I'm going to talk through is this: a user walks into the conference room, takes their ClickShare dongle, plugs it into their personal device, goes to share content, and nothing happens. So they open a ticket in a very generic way and say that the Barco ClickShare isn't working. I'm now going to go back to Lena and show everyone how Lena solves this particular scenario.

I'm going to just start up a new chat because she is contextually aware of the discussions that I was having earlier. So I'm going to start up a new one. In this case, I'm going to act as the IT administrator who has now received that ticket, and I'm going to talk to Lena, leveraging natural language to troubleshoot this problem.

Craig Durr: I'm the guy. Hey, my Barco is not working!

Sam Kennedy: And you're the guy, and now I'm going to troubleshoot that issue. The Barco ClickShare is not working.

Craig Durr: You did this in voice; you didn't type this in…

Sam Kennedy: Correct, I leveraged natural language to talk to Lena and troubleshoot the problem. Now, in this particular case, I didn't give her enough information on purpose. I wanted you to see that if I don't give her enough information, she comes back and has a dialogue with me. So she's asking, "Well, what room are you talking about?" We have two Barcos in this environment, and these are the two. If there were hundreds, she would have clarified it in a slightly different way, but in this case, we only have the two. So I'm going to say this is in the demo room and confirm. And now Lena's probing, and she has found what the true problem is. Hopefully, Craig, you saw how fast that was.

So Lena came back and said the Samsung display is powered off, so the ClickShare output has nowhere to show; with the display turned on, the ClickShare unit will be able to present. Even though the ticket came in with a Barco problem statement, Lena realized it's actually not the Barco. She said it looks like it's the display, and that she can turn it on. She also gives some physical steps, but she prioritizes anything that can be done without sending someone to the room. So I'm going to ask her to turn the display on.

And as you can imagine, when we come back to the room, it'll take a second, because there's a slight delay from the YouTube stream. But you'll see in a moment that Lena found the actual problem and enabled me, as the administrator, to solve it without having to send a human being into that room.

Craig Durr: And talk about total cost of ownership… Wow, it just did it. That's fantastic. It was actually able to suggest what it wanted to do on its own. So you're still sitting at your desk, no sneakernet involved... You're not rolling a truck for anyone here, is that right?

Sam Kennedy: That's absolutely correct. And it's small challenges like this that add up over a day for the organizations we talk to, which have tens, hundreds, potentially thousands of conference rooms. Those sneakernet trips add up when you multiply across all of these rooms. So here, we were able to find the correct solution even with a vague problem statement. When the organizations I talk to share some of their help desk tickets with us, you can see that very often the users don't give a lot of detail. So even with just a little bit of information, Lena was able to find the problem and assist in solving it without ever having to send a person.

Craig Durr: Got it.

Sam Kennedy: Now I have one more demo to show you.

Craig Durr: I'm going to pause one moment. Hey Eric, pause one moment. Can you hear me? That's fine. We'll just keep running. Go ahead and go.

Sam Kennedy: So, the next problem. As you saw, Eric physically went and unplugged the Barco ClickShare. Now we all know the problem actually is with the Barco, because we removed the power. I'm going to go back to Lena and start a new chat, and I'm going to ask the same problem statement as before. In this case, she doesn't have arms and legs, so she can't go into the room to solve it herself, but I want you to see how she walks a human being through what the problem is and how to resolve it.

I'm going to use natural language in the same way: the Barco ClickShare is not working. So again, the same problem statement. Now she's going to come back to me and say, "Okay, what room are we talking about here?" I'll choose the demo room and click confirm. It's going to take her a little longer to solve this one, because she's not going to get much information back; again, the device is turned off. So in a moment, you're going to see her come back and say, "It looks like I found a problem, and it looks like it's a physical problem, and here are the things I want to walk through that you need to go do in order to solve it."

So it's not just saying the system is turned off; it's actually giving the people supporting these spaces a path to resolution. You can see she came back and said the Barco ClickShare unit is not reachable, so it can't send video to the display. And now she gives me the path to resolution for this device. She's saying it's likely due to a power or network issue, so the first step is to verify the power. We're going to send our engineer, Eric, back to the room to plug it in. We check the power, report that we found the problem and that the power was out, and click confirm. We want that to go into our ticketing system.

Again, we'll see that Eric's now plugging it in, and then we're back. So what we wanted to show everyone with our 1.1 troubleshooting capability is that Lena can, in cases where she has the ability, find and fix problems without ever having to send a human being, and when there is a physical problem, she's able to guide a user through the path to resolution. So we think this is a very significant release for the industry.
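The two paths described in the demo — fix remotely when possible, otherwise guide a human and open a ticket — can be sketched as a simple decision flow. This is an illustrative sketch only; the `Diagnosis` type, field names, and actions are hypothetical, not NetSpeek's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Diagnosis:
    device: str
    root_cause: str
    remotely_fixable: bool  # e.g. configuration drift vs. a pulled power cable

def resolve(diag: Diagnosis) -> dict:
    """Return an action plan: auto-fix when possible, otherwise guide a human."""
    if diag.remotely_fixable:
        # Hypothetical automated path: push the fix through the device's API.
        return {"mode": "auto", "action": f"reapply configuration on {diag.device}"}
    # Physical problem: produce step-by-step guidance plus a ticket payload.
    steps = [
        f"Go to the room and check power on {diag.device}",
        "Verify the network cable is seated",
        "Confirm the device comes back online in the dashboard",
    ]
    return {"mode": "guided", "steps": steps, "ticket": {"cause": diag.root_cause}}

plan = resolve(Diagnosis("Barco ClickShare", "power loss", remotely_fixable=False))
print(plan["mode"])  # guided
```

The key design point is that "unfixable remotely" is not a dead end: the same diagnosis that would drive an automated fix instead drives human guidance and a ticket.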

Craig Durr: Oh, it's a lot. It means fewer manual interventions. It tells you when you have to send someone versus when Lena can try to resolve it herself. And I would think this translates very specifically to more uptime and lower operational cost.

Sam Kennedy: And that's really the goal of what we're delivering with Lena: deliver a higher level of experience, and enable people to deploy more technology in more spaces at a lower operational cost.

Craig Durr: Fantastic. And you guys are just announcing that now. When is this going to be available in the market?

Sam Kennedy: So that's what we're announcing, that it is now available in the market.

Craig Durr: Fantastic. This is good. This is a lot of value for those IT teams, especially those that are stretched thin and have to manage multiple campuses. And these are physical rooms; it's not like a person can bring their laptop to the IT desk. This is where you have specialized equipment with specialized needs, and you're trying to get holistic insights across what could be a very large campus.

Sam Kennedy: Craig, I talk to customers who have offices all over the world where they may not even have IT staff in those locations. So enabling them to have better uptime for their users, where they may not even have people, is just a tremendous value for the whole industry. Everyone involved, we think, benefits from what we're delivering with Lena.

Craig Durr: Okay, I'm excited about this. Let's transition here. So, Osman, I want to bring you back into this conversation. We talked a lot about this architecture. Walk me through what we're talking about here. I mean, this is a lot of power. How are you able to deliver it?

Osman Bicakci: So, we teased earlier how our infrastructure and architecture are built around large language models. And as I mentioned, we are building an AI platform, not a device management platform with AI added on. When we work on this architecture, LLMs and AI tools require data. That's the foundational level for us, where we position Iris, an intelligence-source integration system that goes through the documentation our partners provide to us, as well as public documentation, to bring that data and context to our platform when it's operating.

Then we have to digest and process this information with the metadata required to ensure that every document is not an isolated document, but is connected in a graph-like manner. So you know exactly which document page belongs to which manufacturer, which of that manufacturer's devices it covers, and which environments that device supports.

When Lena acts on problems like these troubleshooting scenarios, she can go through this documentation, but she can also see the context, the metadata that we generated along with it. That's a really key difference: put the same document into ChatGPT or another general-purpose LLM and ask a question about it, versus our platform, which has all this documentation in place, plus the relationships between these documents in our data store.
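The "graph-like" documentation idea can be sketched as metadata-scoped retrieval: every ingested page carries manufacturer and device links, so a query is filtered by that context before anything reaches the model. A minimal sketch, assuming hypothetical field names (this is not NetSpeek's or Iris's actual schema):

```python
# Each ingested page carries graph metadata linking it to a manufacturer and
# device, so retrieval can scope by context instead of searching raw text blindly.
docs = [
    {"id": "d1", "manufacturer": "Barco", "device": "ClickShare", "page": 12,
     "text": "Power troubleshooting steps for the base unit."},
    {"id": "d2", "manufacturer": "Barco", "device": "ClickShare", "page": 30,
     "text": "Network configuration and proxy settings."},
    {"id": "d3", "manufacturer": "OtherCo", "device": "Camera", "page": 3,
     "text": "Mounting and cabling guide."},
]

def retrieve(manufacturer: str, device: str, query: str) -> list:
    """Filter by graph metadata first, then rank by a naive keyword match."""
    scoped = [d for d in docs
              if d["manufacturer"] == manufacturer and d["device"] == device]
    # Pages containing the query term sort first (True > False with reverse=True).
    return sorted(scoped, key=lambda d: query.lower() in d["text"].lower(),
                  reverse=True)

hits = retrieve("Barco", "ClickShare", "power")
print([h["id"] for h in hits])  # ['d1', 'd2']
```

A production system would use embeddings or a real graph store rather than keyword matching, but the ordering of operations — scope by metadata, then rank — is the point being contrasted with dropping a lone PDF into a general-purpose chatbot.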

Craig Durr: Yeah, this architecture is built from scratch, which I appreciate; it's not off-the-shelf, it's not bolted on, and it probably lets you have some really powerful native integrations for getting data in from the environment. So you've got the knowledge base; how are you interacting with third-party integrations or other cloud environments, things like that?

Osman Bicakci: So, one of the key things, as I mentioned earlier: the data part is one part of what your platform is connected to. With data alone, you can ask questions and get suggestions, but having Lena act on that information is another thing. For that acting part, you need to control devices; you need to monitor devices. This is where we use integrations with APIs; generally, they are publicly available from many of the manufacturers. And there are some manufacturers with whom you can have agreements and private communication, and use their APIs.

We use those APIs to talk with the devices to understand what space they are in, whether they are healthy, whether they are network-connected, and whether they have any issues with other devices. This is especially key in troubleshooting, when we attack a problem with a device. A device in a room is not a single device in isolation; there are multiple other devices in the same room, and maybe hundreds of devices on the same floor.
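The room-level view described here can be sketched as polling every device's status and reasoning over the aggregate rather than one device at a time. The structure and field names below are illustrative assumptions, not NetSpeek's actual device API:

```python
# Hypothetical room snapshot, as might be assembled from per-device status APIs.
room = {
    "ClickShare": {"reachable": False},
    "Display":   {"reachable": True},
    "Camera":    {"reachable": True},
}

def room_health(devices: dict) -> dict:
    """Summarize reachability across a room and guess a likely cause."""
    unreachable = [name for name, s in devices.items() if not s["reachable"]]
    if not unreachable:
        return {"unreachable": [], "likely_cause": None}
    # One device down while its neighbors are fine suggests a local power or
    # cable problem rather than a room-wide network outage.
    cause = ("local power/cable"
             if len(unreachable) == 1 and len(devices) > 1
             else "room network")
    return {"unreachable": unreachable, "likely_cause": cause}

print(room_health(room))  # ClickShare flagged; local power/cable suspected
```

This is the same inference shape as the demo: because the other devices in the room respond, an unreachable ClickShare points at its own power or cabling rather than the room's network.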

We look at that problem not at the device level but at the room level and the floor level, and that's how we address it in action. This is why, in the second demo, she didn't only check the specific device, the Barco ClickShare, that she reported as a problem; she also tested the other devices in the same room. Because she has the context, from the Neat cameras, of how the devices are connected, she tested the devices that device is connected to, to check: is there a power problem going on? Is there a network issue in the room? So she's looking at it in a more holistic way than a single shot.

It's not just, “Okay, let me reach this device and send this one recommended fix.” What we are doing is different from rule-based troubleshooting, where you define: if this happens, do that; if that happens, do this… What we are doing is more complex, and the power of large language models, combined with our data layer and our management layer, multiplies the paths and possibilities for solving troubleshooting problems.
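The contrast with rule-based troubleshooting can be made concrete: a rule engine is a fixed symptom-to-action table, while the approach described assembles device, room, and documentation context for a model to reason over. Both functions below are illustrative sketches with hypothetical names, not NetSpeek's implementation:

```python
# A rule-based troubleshooter is just a lookup table: anything the author
# didn't enumerate falls through to a human.
RULES = {
    "no_signal": "reboot device",
    "no_network": "check switch port",
}

def rule_based(symptom: str) -> str:
    return RULES.get(symptom, "escalate to human")

def build_llm_context(symptom: str, room_state: dict, doc_snippets: list) -> str:
    """Assemble everything known about the problem into one prompt.

    An LLM-driven path hands the model combined signals (symptom, room state,
    relevant docs) so it can handle cases no rule table anticipated.
    """
    return (f"Symptom: {symptom}\n"
            f"Room state: {room_state}\n"
            f"Relevant docs: {' | '.join(doc_snippets)}")

print(rule_based("hdmi_flicker_when_cable_moved"))  # escalate to human
```

The rule table answers only what it enumerates; the context-assembly path trades that brittleness for the model's ability to combine signals, which is the multiplication of possibilities being described.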

Craig Durr: Okay, I got it. Well, this is really interesting. I'll have our producer put up a view of what you just described so our viewers can follow along. This is exciting, you guys. I appreciate you sharing this new development within the Lena offering. 1.1 seems to offer a lot of value, and you continue on this path of providing an AI-first AV management construct. I love the term you use, AI orchestration. There are a lot of moving parts here, and I think you're removing the friction through the purpose-built, enterprise-secure, enterprise-grade AI tools you've put together. I know I'm telling you what you already know, but I think I'm hitting the nail on the head here. What do you say?

Sam Kennedy: Absolutely.

Osman Bicakci: I agree.

Craig Durr: So, alright, here we are. I appreciate you guys giving this update. Anything else you want to share along those lines? I mean, you guys have had a pretty aggressive roadmap, and InfoComm is not far from…

Sam Kennedy: I would encourage everybody that's going to be attending InfoComm to make sure to come by and see us…

Craig Durr: You don't want to tell me something right now? It's part of your final grade.

Osman Bicakci: So I can maybe tease a bit. Just one thing: as Sam showed, this is 1.1 of our platform; we are not calling it 2.0, we are not calling it 3.0. This is 1.1. So imagine what's going to come with the 2.0 version of our platform, which we plan to show at InfoComm. What we are currently delivering is human-initiated troubleshooting, where you have to ask Lena questions to address a troubleshooting problem.

We are targeting a more proactive, self-healing version, where troubleshooting is triggered by room checks and many other actions are triggered by signals. This is where we are going, and this is hopefully what we are going to show at InfoComm.

Craig Durr: I love it. This is part of what Eric shared with me when we spoke back at ISE just earlier.

Osman Bicakci: That's correct.

Craig Durr: I have to commend you guys. I love the fact that you're delivering what you talked about, and you're doing it in a timely fashion, which brings value to the customers participating in the platform.

Alright, so if our viewers want to hear more, learn more, what's our next step? What should they be doing?

Sam Kennedy: So for anyone in IT or AV, anybody managing collaboration environments, I would encourage you to go to netspeek.ai. You can schedule one-on-one demos with us. I would encourage you to connect with Lena on LinkedIn or follow NetSpeek on LinkedIn. We also have the ability to set up proofs of concept and pilots.

As I said, we already have customers using this today. Any partners, channels, integrators, or MSPs should definitely reach out to us, as we have a path for them as well. You can see we're not slowing down; there's a lot more to come. We think we're going to really help shape where the industry is going. Please reach out to us through our website, LinkedIn, or at InfoComm.

Craig Durr: And for the technology vendors, the people with whom you're building these deep relationships?

Osman Bicakci: Our doors are always open. Our doors are always open for adding more devices to enable more value to our customers.

Craig Durr: Okay, I appreciate this. So here are my closing thoughts. I think we were living in a world where monitoring just told us what was wrong. You just showed examples of your solution, Lena, fixing it on behalf of that IT administrator, reducing costs, increasing uptime, and creating a lot of operational efficiency. I love it. This is moving well beyond alert-only monitoring. I think there's a lot of value here.

Alright, Sam, Osman, I want to thank you for your time. This has been really insightful; I think another good edition of Decision Point. I'm going to give you guys a good grade here, somewhere in that A, A+ range. You bravely showed a live demo, and it's a working product, so I appreciate you doing that.

Let's go ahead and wrap this up. Again, my name is Craig Durr. I'm the Chief Analyst here at the Collab Collective, and I want to thank you for spending time with us at Decision Point. We're here to ask some of those tough questions on your behalf so you can get the insights you need to make your IT decisions. Until our next episode, take care.