Ctrl+Alt+Mfg: Ep. 3: Rethinking OT Security With Leah and Jeremy Dodson, Piqued Solutions

Episode 3 November 04, 2025 00:49:02

Show Notes

Cybersecurity isn’t just an IT issue — it’s a manufacturing issue. In this episode of Ctrl+Alt+Mfg, hosts Gary Cohen and Stephanie Neil talk with Leah and Jeremy Dodson, co-founders of Piqued Solutions, about the evolving threats facing operational technology (OT) environments.

From the myth of the “air gap” to the rise of AI-driven defenses, the Dodsons explain why factories must rethink security and move from perimeter protection to layered resilience. They share real-world insights on cyber hygiene, protecting critical data and building a culture where security enables innovation instead of blocking it.

Topics covered:

- Why OT security isn't the same as IT security
- The myth of the "air gap" and real-world workarounds
- Cyber hygiene: default credentials, access controls and legacy systems
- Layered defense, honeynets and protecting data over access
- AI in OT: anomaly detection, risks and keeping humans in the loop
- Applying DevSecOps practices like version control to PLC code

Subscribe for more conversations about people, technology, and strategies reshaping the future of manufacturing.

#Cybersecurity #Manufacturing #IndustrialSecurity #OTsecurity #DigitalTransformation #ControlAltManufacturing



Episode Transcript

[00:00:08] Speaker B: Hello, everybody, and welcome back to the Control Alt Manufacturing podcast: Resetting and Rethinking Manufacturing. Again, we're in the early stages, so let's talk about what this is about. On this podcast, we're going to be talking to people about the people, technology and strategies that are driving the digital transformation of manufacturing. We'll be having conversations with industry leaders, system integrators, engineers, innovators, interesting people who are working in all of these industries. I am one of your hosts, Gary Cohen. The other host is Stephanie Neil. Hi, Stephanie. How are you doing today? [00:00:41] Speaker A: Good. How are you, Gary? [00:00:43] Speaker B: I am okay. I'm excited, actually. We've got a great one today because we're talking to two people who have been really kind and have sort of become friends of mine here, the dynamic duo, Leah and Jeremy Dodson, founders of Piqued Solutions. And we're going to be talking about cybersecurity in OT environments. I know when people think about cybersecurity, they typically think IT, and trust me, OT is not the same. And it's not like cybersecurity hasn't been in the news. I mean, recent attacks have hit things like the airline Qantas, and SonicWall and Hertz and Aflac and AT&T. They've been all over the place. Yeah. [00:01:17] Speaker A: And you know, when you think about it, Gary, we've all experienced some sort of data breach where our personal and our financial information was exposed somewhere online. And if you don't think you have, you're wrong. I guarantee that all of your info is all over the dark web. It's not a good feeling when someone tries to take out a loan in your name, which happened to me. Or when a hacker takes over your phone and gets into your banking app, which has happened to a family member of mine.
But these are all really small incidents in the bigger picture, where corporations and even our infrastructure are constantly being threatened by bad actors. And for manufacturers, it can mean losing millions of dollars. Just one example: in 2023, Clorox was the victim of ransomware, which took many of its automated systems offline. It disrupted its operations. It disrupted its entire supply chain, down to the retailers. And it was reported that that breach cost Clorox $356 million. More recently, this July, Ingram Micro, which is a distributor of IT products, experienced its own ransomware attack on internal systems, disrupting its ability to process orders on its digital platform. And when you think about the connections that a distributor has with other technology companies, there's a trickle effect there as well, right? One breach opens the door to another. So as manufacturers connect more systems in digital transformation, one compromised controller can shut down an entire production line. And it's not only costly; if you're talking about, say, a chemical plant, it can be deadly. So with cybersecurity threats on the rise, we need to really take every precaution there is to keep the factory floor safe. [00:03:08] Speaker B: That's a really good point. We used to always talk about it as, you know, if you attack the OT side, you're turning off the cash register for companies, and nobody likes to stop the production line. But then, like you said, if you're talking about critical infrastructure, then it becomes human safety and all sorts of things. One of the things I want to point out: every year, the OT cybersecurity company Dragos releases its annual OT Cybersecurity Year in Review report, which really gives you the ground truth of what's happening across operational technology and ICS systems and the cyber threat landscape. I think they're in their eighth one this year. And there's one alarming thing they keep documenting.
What they keep documenting year over year is the arrival of ICS-specific malware, industrial control systems-specific malware. So it's not like IT malware, where people are stealing credit card numbers or getting your company data. It's purpose built to impact industrial control systems. So it's a huge operational risk to industrial companies, to critical infrastructure. They also have some great names. In 2024, that was Fuxnet and FrostyGoop, which are great names, although bad news: FrostyGoop was deployed against the energy sector in Ukraine. So they're out there being used. But what they're also showing in this Dragos report is that OT and critical infrastructure are getting targeted much more often. So Dragos documented 1,693 ransomware attacks that were targeting industrial organizations in 2024. [00:04:35] Speaker B: And that's an 87% increase in ransomware attacks over the previous year. Manufacturing was the hardest hit sector, but ransomware was also big in energy and critical infrastructure. So you're thinking about, you know, global warfare, all the conflicts around the globe. Those don't have to be hot wars. They can be cyber wars as well. So, you know, if you're thinking, who cares, that's an IT problem? It is not an IT problem. You know, like we were talking about, operational shutdowns: of all those ransomware incidents Dragos responded to in 2024, 75% led to a partial shutdown of OT, and 25% led to a full shutdown of OT. So that's a pretty big deal. [00:05:17] Speaker A: Well, and you know, I was looking at something recently about the top OT security threats. Ransomware with physical consequences in the form of downtime is number one because it creates leverage for hackers. Right? Like, we can't have downtime. [00:05:33] Speaker B: Absolutely. I also want to point out a little self-promotion here.
Leah and Jeremy also recently joined me for a Control Engineering webinar on the importance of site security as an integral part of automation cybersecurity. You can access all our webcasts at controleng.com. If you have any responsibility for OT security, or security in general, I highly recommend this one. And it was such an interesting topic because, as I said earlier, I don't think people generally treat OT security in the same way they treat IT security. And, you know, OT data is pretty intricately linked to business decisions, whether that's inventory or maintenance or billing. But there's also this idea, and Stephanie, you touched on this earlier, of how many unsecured devices do you currently have connected to the Internet in your office, whether that's cameras or printers? How many hidden back doors do you have in your systems? How many forgotten and unsecured access points? All of those things leave you vulnerable. So definitely a big deal. All right, that's enough preamble. Yeah, enough preamble, Stephanie. [00:06:38] Speaker A: Yes. Let's bring on the people who really know what they're talking about. [00:06:40] Speaker B: Let's do it. So I'm gonna introduce them and then we'll bring 'em in here. Leah Dodson is the co-founder of Piqued Solutions and a cybersecurity strategist. With more than a decade of experience, she helps organizations align security programs with business goals, regulatory needs and operational realities. She's known for making complex challenges actionable, with a passion for risk management, DevSecOps and creative approaches like gamified security training, which I think is really interesting. Jeremy Dodson is a cybersecurity leader with more than 20 years of experience across government, military and the private sector. Before co-founding Piqued Solutions with Leah, he held key roles at the NSA and U.S. European Command and led cybersecurity efforts at companies like Cylance and Dell.
He specializes in critical infrastructure protection, red teaming and secure automation, and is a frequent speaker on industrial cybersecurity and DevOps. Leah and Jeremy, welcome to the podcast. Come on in. [00:07:32] Speaker D: Hi, Gary. Stephanie. [00:07:34] Speaker B: I like how you magically appear like that. So I gave you a little intro there, but I always want to know how you came to cybersecurity. Leah, I know you used to be one of us. You used to be a content person. So how did you come to cybersecurity? And then why did you decide to start Piqued Solutions? [00:07:52] Speaker D: Yeah, so my background is very heavy in writing and technical writing. I got into technical writing for cyber through some people who were doing penetration testing and were looking for some help with writing up the reports. And so I would help structure the reports, get things into terms that were less technical, more digestible by the board, by leadership, so that it would become actionable. I met some really amazing technical people who had great technical chops, and there was a little bit of a translation error: how do we get that conveyed to the decision makers who are going to be doing something about, you know, the things that we found on the technical side? So I really enjoyed doing that. And the more that I wrote, the more I wanted to learn about the technical side and really tried to marry those two things so that we could have that conversation. And that's kind of what was one of my driving forces behind what I wanted out of Piqued Solutions and creating our own company: how do we make that translation for companies, for people who are decision makers, between what's being told to them from a regulatory standpoint, the "you must be doing X, Y and Z of this technical nature," and how do we translate that into a business decision? You know, why are we doing that just because the regulation says so? Is it really going to help our business?
Yeah, so that was one of my driving factors behind starting the company. [00:09:29] Speaker C: I just like to break things. [00:09:30] Speaker B: Did that surprise you? I didn't know that. [00:09:34] Speaker C: No, it's always fun, because she was very kind. Both you and her said something about taking complex things and making them actionable, and I just have to remember that I was one of those complex things that needed to be actionable for others. So it's always awesome to hear. You know, yeah, you could break into this facility or do the things that you do, but so what? And that's one of the things that I've always loved about Leah: her ability to take what is being done, grab that, capture that "so what," and so eloquently put it in front of the people that need to understand how this affects them. What's the bottom line? Is it going to affect money? Is it going to affect process? Is it going to affect, like you were talking about earlier, you know, the physical consequences? What is all going to be there? And for me, I just wanted to talk about: so I got in, or so I broke a thing, or so I could bypass, or, I know you spent all this time building this thing and you thought it was impenetrable, but look what I did, right? So it's funny to hear it all in one spot there. [00:10:45] Speaker B: Yeah. And I think, like engineering topics, cybersecurity sort of needs a translator, because especially with cybersecurity, if you're asking a business to spend money on it, it's a hypothetical: spend money on this thing that might happen to you. And it's like, but I could spend money on this practical thing over here. And so you really do need to tie it into business objectives and things like that so people understand why it's necessary, other than just scaring them into spending money on it. [00:11:10] Speaker A: Yeah. [00:11:10] Speaker C: And that's what's great. [00:11:13] Speaker B: Yeah.
So let's start at the start here. Cyber hygiene still seems to be a huge challenge in OT environments. So why are issues like default credentials, weak access controls and outdated legacy systems still so common? And what are the organizational or systemic reasons these hygiene issues continue to persist out there? [00:11:36] Speaker C: So from my perspective, they're so common because they're default. Right? So by default, you have thousands of things that get put in place. It's always going to be a numbers game. Some people are going to do the right thing because they have the process in place. Some people don't even know that it's a big deal because they haven't been educated on it. But the problem is all the attackers know that it's a big deal. When I sit down and go after an environment, for example, I have every single default credential for every single manufacturer of every device at my fingertips. So if I am fingerprinting, or if I'm researching or doing recon on a company, before I even sit at the keyboard, I already know exactly what my first tests are going to be. One of my mentors back in the day said, you know, as attackers, we flow like water. We want to do the easy stuff first, and those are the easy things. And conversely, on the defense side, these should be the easy things to protect against as well. You get a device, have a process in place before it's on the line that has already changed these things. But it just doesn't happen. [00:12:50] Speaker A: So interesting, though. Jeremy, are you going into these organizations and hacking them to show them where their weaknesses are? [00:12:58] Speaker C: Oh yeah. Like I said, I like to break stuff. Yeah. So either what we call physical or over the wire, and then a combination of both.
And the reason is, we learned early on when I was doing this with the Department of Defense or Department of Energy that when we would go in separately, we had a physical team go in to show that they could get to, you know, a reactor, or get to a spot they weren't supposed to be in, and then we had the over-the-wire team go in and show that, if they were inside the facility, they would be able to get access to all the things they weren't supposed to have access to. But the problem was we got told, well, yeah, but you have to get in, or, now that you're in, you don't know what to do. And so that's when we started realizing, well, let's just marry these two, because that's what the attack emulation is anyway. That's what the attacker is going to do. They're going to have someone break into a facility and plant a leave-behind, plug something in underneath the desk or put something in a server room that then allows the next part of the chain, the over-the-wire team, to start exfiltrating data or taking over stuff. And so we realized that that needs to be the way. That was the "so what," right? Like, how does this all piece together, and how does it become a big problem? Because when you can show someone that thinks they've created this hard candy shell that can't be penetrated what happens once you get through that shell, that's when they realize they need to go with what I usually talk about, which is the gobstopper effect: you have to have layer after layer after layer after layer after layer. Because once someone's taken over your cameras and bypassed your RFID doors, you've got to have something else, someone on the other side checking, or a mantrap that does some other detection. [00:14:47] Speaker A: Yeah. Go ahead, Leah. [00:14:51] Speaker D: Well, I was just going to mention that Gary had talked about the hypothetical. Right? There's so much conversation in cyber about hypotheticals.
There's a lot of numbers and data being thrown around about what could happen if you, you know, leave default credentials; you could have this happen. And it's different when you see it happen to your organization. And the idea behind ethical hacking is that you're being paid to show a company what could happen, in a manner in which they can fix it, as opposed to having it done, you know, on the less ethical side by somebody who's after your money. Taking those steps to have it done beforehand by an ethical hacker who gives you a report lets you see what actions you should take. That's the difference between taking something that's just hypothetical that you could talk about in a boardroom and making it something you can actually fix. [00:15:50] Speaker A: So many still believe in the safety of the air gap. But from your perspective, why is that no longer a reliable strategy? And what does effective network segmentation actually look like in a modern OT environment? [00:16:06] Speaker C: Oh, the air gap. So you mentioned Dragos. I have a background with them, as far as some of the people that work there, and we all have the same feeling about the air gap. It gets extremely technical and we don't need to go into it, but there really is no such thing as an air gap, even if someone considers it one. But all that aside, what we have found, especially after 2020, after COVID, is that there are so many shortcuts in place. People either need expertise to come in and help them with areas where they think that they've air gapped their environment, but then they've added a wireless access point. Or, one of the attacks that I did while I was at the Department of Energy: we actually used cellular devices, because it wasn't on their network.
So we would come in after breaking into a facility and leave literally an LTE USB dongle attached to one of their systems that allowed us to come in over LTE or over wireless signals. And the first time I saw that that was actually being done on purpose, years after I had done it, it blew my mind. That is actually a solution when they can't get, you know, expertise from overseas into their environment. Because they couldn't fly them here during COVID, stuff like that, they were putting these cellular connections attached to these devices so that expertise could come in, with the intent of, oh, we'll just take it out after. They never did. Or it was, hey, we're going to leave this in because we keep having to put it in and take it out, in and out. And so you have these things where too many people will learn or figure out how to circumvent. We actually talked about this during the webinar that Gary was talking about: the best way to find out where your problems are is to watch the people that are figuring out how to circumvent their daily workload by putting these shortcuts in place. And so that's one of the reasons the air gap fails, not only because it's not technically sound, but because too many people do a workaround with it because it's inconvenient. Sneakernet is what we would colloquially call it back in the day, where you have to take it from one system and walk it over, literally with your shoes, to another system. It just becomes too cumbersome, and people like to bypass that. [00:18:33] Speaker B: And, you know, I mean, companies tend to think, my people aren't that dumb. We wouldn't leave the default passwords in there, and no one would leave that access point open. But companies are also big, and, you know, you hear it all the time: the defenders have to be right 100% of the time, and the attacker only has to be right once. So, you know, you find that one access point and you go.
[00:18:54] Speaker C: Yeah, but I'm not a fan of that particular saying, okay? And it's because it minimizes something that I think is very important. Yes, attackers have to be right just the once, but at the same time, for the defenders, if you do security in layers, you have to be wrong so many times for there to be success from an attacker. When I attack a system and it is done in a way that flips that script, who cares if an attacker gets in? Because they're going to get in. When defenders instead protect the data instead of the access, it always messes with attackers. And this is what I mean by that; let me make it a little simpler in terms of an email process problem. A lot of times we say we need to train all of our staff not to click on a link, and I am not a fan of that. And the reason is, no matter how well you train them, there's going to be someone that clicks. I would rather my security team defend the space by expecting that to happen and denying any sort of access to data, or controlling the data in such a way that it can't be exfiltrated anyway. We all know what we care about. It's like, hey, everyone can break into our house, but you have a safe room. Who cares if they break into your house? You put all the stuff you don't care about outside the safe room, all the stuff you do care about in the safe room. They rummage through, they do whatever, no big deal. You walk out, all your valuables are protected. And when we teach defenders that they're up against this, you know, gigantic problem that they can never solve anyway, because we're just going to get in, I have found, being on both sides of it, that that narrative really hurt me as a defender. Because then I became like, well, why are we even worried? Why am I paying for this tool? Why am I doing whatever?
When I started realizing that when tools or people or whatever are looking at process and protecting the important parts, it doesn't matter if an attacker gets in if they can't get out with what they need, or they can't ransomware me, because I have great backups that are immutable, that don't have malware in them. Who cares if they take down my space? I'm back up and running right away. [00:21:23] Speaker A: That's so interesting. So you said protect the data and not the access. So we just have to assume they're going to get in. If they want to get in, they're going to get in. [00:21:33] Speaker C: Now, do the right things to have that access protected as well. I'm not saying also leave your windows open and your doors open. [00:21:40] Speaker B: Windows and doors open. It's fine. Lock your doors, lock your. [00:21:43] Speaker C: Windows, do all of that. But the idea is that most of your effort should be that when it gets to the part that's important, know when to pull the cable, know when to stop the access, know when to shut down a line before a fire happens in the warehouse because of something else, know when to spin down a turbine, whatever the case may be. But. Yeah, go on. [00:22:06] Speaker A: No, I just mean, I think it goes to what this podcast is about, rethinking manufacturing and how we approach these different topics. And cybersecurity is just a great example of, we need to rethink the way that we approach it. I was talking to somebody not too long ago, and they were talking about setting up like a fake gallery, you know, a fake area where their information was, so that if a hacker got in and they went into that area, they knew it was a hacker, because anybody who has access to that already, you know, knows that it's fake information.
So just, you know, kind of coming up with different ways to protect the information, and also creating a culture, like you said, knowing when to take something down or take something offline or react to something. [00:23:05] Speaker D: Yeah, I was at a manufacturing event a couple years ago talking about cybersecurity, and I spoke with a gentleman who, you know, when he thinks of cybersecurity, he thinks of the training that he had to do, and it was the phishing training that Jeremy was talking about. And he said, I fail every time and I have to do the re-education portion. And he said, I will fail every time because I'm in sales and it's my job to answer emails and it's my job to click on things. And it really got me thinking about that concept: you're never going to get 100% of your workforce to care about cybersecurity in a way that will be your layer of protection. You can have cybersecurity champions, you know, that are on the lookout for things, but you're never going to have the entire workforce take on that responsibility. And should they? You know, does it make sense that so much of their thought process is dedicated to that? Or should we be layering? Keep that training going, keep them, you know, thinking of these things in the back of their mind, but then also layer it, as Jeremy was talking about, with additional protections. If you have a great salesperson, you don't want to lose them because they're accidentally clicking on things they think are important. You know, you want to keep your business going in multiple ways. [00:24:27] Speaker C: Yeah, I love what you said also, Stephanie, about what would be called a honeynet or a honeypot, right? Something where an attacker goes in. Now, a honeypot is a singular thing. I love the concept of honeynets. I love the idea of attracting an attacker and then letting them rummage around.
And the absolute gold would be if they, you know, burn a zero day on your honeynet, because then you get to capture their tactics, their techniques and their procedures and start learning about all of that. One of the things, and again, I'm just going to call back to Dragos because they have a great process: when you learn from that data, you learn from the honeynet, the honeypot and/or these fake information areas, it's so important not only that your organization learns, but that you then put it into a system so that other manufacturing organizations get to learn those same TTPs, the tactics, the techniques and so on and so forth. And that is one of the reasons why they're so great, because we get to share that information as defenders and say, look for these things now, and these are the things that are relevant because we're actively getting attacked in this way. [00:25:47] Speaker A: So what's interesting, too, is AI is beginning to show up in these OT environments. And when you're talking about learning, can we use AI in that capacity to, you know, learn what's happening on the network, what these hackers might be doing, how we can respond? Or how do you see it being used? [00:26:04] Speaker C: My favorite way. I know Leah has a few things, but I'm going to talk real quickly about my favorite way, because I just got done talking about this. My favorite way that it can take a look at things is to normalize, like understanding a baseline. What does that traffic look like when a day is good in your space? When it can understand that and fine-tune that, it is phenomenal. Because now, when someone on the inside does something they shouldn't, they turn off a machine because they want to take a break or they want to do something they shouldn't do, that's going to pop as an anomaly. And we have different things that do that already without AI.
The nice thing about AI is it can parse through so much more data and look at it and learn from those anomalies. And then if we train it correctly, if we protect against side channel attacks and misinformation, and we have eyes on it, where we're empowering people as opposed to replacing them, it is such a great tool to take in a ton of data to know anomalous behavior. Anomalous behavior is the most used approach, meaning that if we look at a bell curve of attacks, more attacks are going to come with anomalous behavior. It's the ones that are on the right side of that bell curve where someone like me might go in and mask all of my stuff through behavior that's happening. I'm either going to piggyback off of a packet, or I'm going to piggyback off of something that would normally happen during that time, so I don't show up as anomalous. But the nice thing about AI, and this is what threat hunters do, is they follow that packet all the way through ingress and egress. And it's very hard for threat hunters to do that all the time. AI really helps with understanding an attack from the beginning to the end. [00:28:02] Speaker A: Can AI introduce risk, though, as well? [00:28:05] Speaker B: Oh yeah. [00:28:08] Speaker C: Leah, do you have anything about that? [00:28:11] Speaker D: Yeah, I mean, there's a lot of risk in how you're using AI, how you're setting it up, what data you're sharing, where you're sharing it to, how you're allowing it to be used within your organization. But then also there's the risk of not allowing it. And then, you know, people find workarounds. If you're not allowing it at an organizational level, are people taking that data and putting it into their own personal chats and trying to enhance their workflow and do their work better, you know, but in a less secure way? There's a lot of risk that can be introduced if you're using AI in a way that isn't thought through.
[00:28:51] Speaker B: So let's help our listeners, then. From a security standpoint, especially inside industrial environments, how should teams be approaching the adoption of AI tools or models? What should they be looking for? [00:29:03] Speaker C: I would say, if you wanted to boil it down, it's just like any other tool that gets brought in. You understand its limitations and you plan for those, right? All too often as humans, we're taught that when someone says something confidently, or information is presented in a very eloquent way, it was well thought out and it must be true. And fortunately slash unfortunately, that's AI in a nutshell, right? Like, you ask a question, it confidently tells you, I see six Rs in the word strawberry, and you go, wait, what? And it breaks it down for you, it tells you why. It feels like it's being transparent, but you just know better. And all too often we're seeing that when AI in any form, generative, machine learning or whatever, is being brought into these areas, you get two very opposing sides. From: I don't want it here, get it out, I'm not going to use it, and in fact, I'm going to make sure that it fails. To: I blindly trust it, it has made my life easier, I'm now way more productive. But you have to bring critical thought into the process and look at it with a curious mind and say, what is the output, what has it analyzed, what has it found? Fast forward 10, 15 years from now, maybe AGI and everything is in place and it's doing critical thinking on its own. But at this point in time, if you don't keep the human in the loop, there's going to be problems, and you need to know that and plan for it. [00:30:49] Speaker B: An example of that that has nothing to do with cybersecurity:
When ChatGPT was released to the masses, a friend of mine and I started to see what we could do with it from a content perspective. What can it do? So, because we're working in cybersecurity, we would say: write us a story on a cyber attack that happened in the oil and gas industry, whatever the prompt was. And it would write this wonderful article. Again, it's amazing how far confidence will get you. This wonderful article that sounded great, and it talked about the attack and the TTPs and had links. And then I'd go, I've never heard of that attack. So I'd do some research, and nope, that attack never existed. And no, none of those links go anywhere; they're all just dead links. But man, did it sound good. So if you don't want to dig a little bit deeper and do the fact-checking, yeah, it's easy to just go, it sounds good, I'm sure the AI knows what it's. [00:31:39] Speaker C: Doing. And that's what it is, right? We get told that it's AI, it's intelligent, it should know its stuff, and most people don't think to check it. And that's where these guardrails come into place. In some areas it shouldn't be making final decisions, because there's also the problem of accountability. If we make a change to the line to make it more efficient because AI told us to, and it doesn't work, do we blame the AI? Do we blame the human who could have taken a step back and gone, yeah, that would never work, why did we make that change? Or, we've made that change before and no, it doesn't work. That's the thing: it doesn't necessarily have all the history, it doesn't know all the things it needs to know, and it's still going to be something you have to train. Yep, does it learn about our environment a lot faster than Steve, who we just hired last week? Yes, it does.
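The fact-checking step described above, chasing down every link an AI-generated article cites, can be partly automated. A minimal sketch for readers, not anything from the episode: the URLs are hypothetical, and the fetcher is passed in as a callable (in real use, a thin wrapper around `urllib.request.urlopen`) so the checker can be exercised without network access.

```python
def check_links(urls, fetch):
    """Split a list of cited URLs into live and dead.

    `fetch` is any callable that returns an HTTP status code for a
    URL; anything that raises or returns a non-200 status counts as
    dead -- like the fabricated citations in a hallucinated article.
    """
    live, dead = [], []
    for url in urls:
        try:
            status = fetch(url)
        except Exception:
            status = None  # DNS failure, timeout, 404, etc.
        (live if status == 200 else dead).append(url)
    return live, dead

# Stub fetcher standing in for real HTTP requests.
known = {"https://example.com/real-report": 200}
live, dead = check_links(
    ["https://example.com/real-report", "https://example.com/made-up-attack"],
    fetch=lambda url: known[url],
)
print(dead)  # ['https://example.com/made-up-attack']
```

A dead link is only a hint, of course; as the hosts note, the decisive check is researching whether the attack itself ever happened.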
But we still have to have that lens of curiosity and understand that we have to question its output and put critical thought to it. [00:32:52] Speaker B: So I want to talk about DevSecOps for a second, because you guys are both interested in that. Some of the practices there, version control, testing: can these be realistically applied to PLCs or industrial control system code? And if they can, where do we start? [00:33:11] Speaker D: Yeah, I mean, you have some different thought processes here, right? Some PLCs, what they do is very simple, and it's not necessary to have version control for them. It's not necessary to bring in higher-level collaboration. But the complexity is rising in the things that we're doing, and the more things we're adding in, the more conditions we're adding in, the more developers we're getting involved in the process, it definitely can help speed things up when you have rollback options and version control. It can definitely help improve the process. [00:33:47] Speaker C: For everyone. And it depends where the systems reside. In my mind, I picture those PBS shows where crayons were moving across the line and the machine never moved once, and you can just see the years of dust on it, right? If it's doing a thing and it does that thing well, what are we trying to complicate here? But when you put these same systems, these same relays, into a robotic arm, a cobot arm, or any other robot, and it's using all of the same components, then very much there's going to be so much change. I think the litmus test should be: is what we're building with this system going to change, or can it change often? If it does, something like DevOps is going to be where it's at, some sort of version control where you can monitor that change and not keep repeating mistakes. Right.
And that's one of the tenets of DevOps: DRY, don't repeat yourself. So if you are able to do that version control, if you're able to put it into systems, do it. But it's not for everything, just like any other tool. Can it be used here? Should it be used here? Is it going to be effective? Is it going to cause more problems than it solves? All too often we put out these blanket statements that you have to do this. MFA was the example five or ten years ago, when all the IT and OT convergence talk was happening and everyone was saying MFA has to be everywhere. And I loved it when IT people would go to OT people and say, you have to put MFA on everything in this space, and they were like, where would you want me to put it? It's not even built in; it's not even a function to turn on. There's no data being held on it. It's a relay. I can't put MFA on something like this. And that was the thing: we have to be careful with these blanket statements. That's why, Stephanie, when I mentioned that access protection isn't as important as protecting this important data, I didn't mean for everything, but. [00:35:56] Speaker A: Yeah, that's right. I'm just trying to catch up on your cybersecurity talk. It took me a minute. MFA, and I'm like, oh, no. I know. [00:36:03] Speaker C: Oh, yes. Multi-factor authentication. [00:36:05] Speaker A: Took me a second. [00:36:06] Speaker C: I apologize. [00:36:07] Speaker B: Yeah, the acronyms in engineering are the worst part of being in engineering. It took me a while; I was like, I don't know what any of this stuff means. I had to look it all up. [00:36:17] Speaker C: Yeah, yeah, I apologize. [00:36:20] Speaker A: No, no worries. But just a quick follow-up on that DevOps question. We're talking about version control. How does this factor into cybersecurity? It's just like you've got to have everything updated.
There has to be a way to keep track of that; otherwise you're going to open up another window for someone to get in. [00:36:40] Speaker C: Yeah. So a lot of times when we build or program something, the industry has things called patterns and anti-patterns: good ways of writing code and bad ways, things that open up attack vectors. And the idea behind doing something in, for example, Git version control is that you can also tie in different scans that will look for those bad habits, or look for vulnerabilities that you might be introducing, things some developers don't even know they've been doing for years. In the IT space, SQL injection, which should be a thing of the past, was probably the most rampant way of getting into environments and attacking web pages, or attacking any application that had a form or a database behind it. It's one of the easiest things to protect against with input validation. But yet, like default passwords, it's still everywhere. And so these same things happen. For so many years, we never had IT visibility into the OT space, because the Purdue model put the very big DMZ in place and said it shall not do X. None of these practices were ever in place because there was never any need. The Internet wasn't touching it, the outside wasn't touching it. No one was putting LTE dongles in place or putting wireless access points on things that weren't supposed to have wireless access points. So it wasn't a big deal. But we're doing all those things now, and sometimes we're doing it by design. There are companies spinning up to do it on purpose, which is great if it's done and implemented correctly. But, you know, there are some, I don't want to say old-school people, because I consider myself old school as well, but there are some people who have been in the industry for 50-plus years who have never touched a wireless access point. It was never in their space.
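The SQL-injection point above is easy to demonstrate for readers. A self-contained sketch using Python's built-in sqlite3 module (the table, user, and input are invented for the example): concatenating user input into the query string lets the input rewrite the query, while a parameterized query treats the same input as plain data.

```python
import sqlite3

# In-memory database standing in for any app with a form in front of a DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'operator')")

malicious = "x' OR '1'='1"

# Vulnerable: string concatenation lets the input become SQL,
# so the injected OR '1'='1' condition matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Safe: the ? placeholder binds the input as data, not SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()

print(vulnerable)  # [('alice',)] -- injection matched every row
print(safe)        # [] -- no user is literally named "x' OR '1'='1"
```

This is exactly the kind of anti-pattern a scan wired into version control can flag automatically before the code ever ships.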
So now we've got to understand that by putting this in their space, we're not replacing something, and I don't mean to say it that way. Replacing a person with that much knowledge is not as easy as finding a way to mitigate the risk we're bringing into their space. Like I've said a lot, we're never going to know more about that space than the individuals who have been day in and day out in those warehouses and on those lines for the past decades. And conversely, unless it's just something they also do, they're never going to understand how to do cybersecurity and protect their physical space and their over-the-wire space as much as we do. And so we need to understand how to complement those skills, not combat them. [00:39:30] Speaker A: And given the changing dynamic of everything that's happening in the manufacturing space, what's one practical change that each of you could recommend to industrial teams, something they could make now to better secure their systems, either technical or cultural? [00:39:48] Speaker D: I think, from my perspective, I always look at things from a cultural standpoint, because we want things to be repeatable, right? Technology changes, and we want our culture to support whatever new technology changes come in. And so one of the big cultural changes I see being really effective is raising awareness of cybersecurity in the parts of the company where you might not normally think to raise it. Build champions on the plant floor. Build champions who, as they're doing a walk-around, can say: hey, why are we using those ports? Why are they exposed right now? Are we checking the vendors who walk through? Are we making sure they're on our approved vendor list? Raising those questions and opening those conversations, I think, is a pretty lasting change from a cultural standpoint.
[00:40:47] Speaker C: I was so mesmerized by your answer that I lost my train of thought. I even tried to write it down. From my perspective, culture definitely is a big one. And, I remember where I was going, people always find it hard when I say this, but if you're someone out there doing cybersecurity, it is your job to empower the company you're responsible for protecting, not to slow them down. If they want to move at an uncomfortable rate, get comfortable with that, and make sure they don't look at security as a blockage for their innovation and for what they're doing. It's something I learned the hard way, something Leah pointed out to me over a decade ago when I got uncomfortable in my research around cloud environments, for example. I said, don't go to cloud, and she said, stop being dumb. Why don't you teach them to go to cloud the right way instead of telling them not to move forward? Which is true. If anyone had listened to me back then, I don't even know what company you would be right now if you weren't playing in the cloud space. [00:42:02] Speaker D: Cloud is just someone else's server. [00:42:04] Speaker C: Exactly. AI is that next one. If you're uncomfortable right now allowing your company, or working with your company, to understand how to safely implement AI, you need to get comfortable with it. You need to put in the effort to learn what you need to learn, because it's where things are going. You're not going to be able to stop it, and if you try to stop it, you're not going to be there long in this day and age. So get comfortable being uncomfortable with innovation when you're in security. If I had my way, in the next five to ten years I would want security to be the driving force of innovation alongside engineering, as opposed to being bolted on or holding people back. And that's how I've changed the way I look at things.
It's why I come at it from the attacker's perspective. It's why I understand that sometimes we have to make decisions as business leaders, as plant leaders, as managers on the floor, that are going to seem like a corner cut. But if we put compensating controls in place that mitigate that risk, and make sure it's right and documented and goes through process and all the other things, and don't anyone twist my words, all those things still have to happen, that's what's going to allow innovation to move forward. And we want to be an enabler, not a disabler. Love that advice from you both. [00:43:32] Speaker B: We're getting toward the end here. Jeremy, I've got to ask you one question about Mr. Robot. You were a technical consultant on that show? Here's why I've got to ask. On our old cybersecurity podcast, we used to always end by asking our guests, what's your favorite TV show or movie that has anything to do with cybersecurity? And we would get everything from The Office to Parks and Rec to The Net to Mr. Robot. One thing we got a ton of was: you know, I should say Mr. Robot, but I have a hard time watching that show because it's often too realistic, and then it just feels like work and it bothers me. What was your experience on that show, and how deep did they let you go? [00:44:11] Speaker C: So I actually wore two different hats while I was working on that. We had a whole group of us who were technical consultants, and we looked at the realism of the hacks. A lot of the hacks were ones we had done ourselves; if we had done it, we made sure it was in there. We had people like Dave Kennedy, who let their Social-Engineer Toolkit be shown on screen. If you're watching something on screen, it's actually what was being typed, or what we would do in that moment, which was great.
And the other part was, because we were controlling what was on screen, we were able to make an ARG, an alternate reality game, that took all of that experience and gamified the show, so all the viewers could go through and do different hacks or do different things around the Internet and with each other. And it formed what technically became the Mr. Robot ARG Society, which was this huge group trying to solve the puzzles within the ARG. So I loved all the stuff that happened with that show. And I think that's the first time I've ever heard someone say it that way, but I completely understand the sentiment of: this just feels like work. All too often, I used to love watching things where an IP address would show up and 320 was the first octet, and you're like, okay, that's not real. Or there were reasons we couldn't show real IP addresses and stuff. But it was really fun to watch people react to the realism of the show. And obviously they took creative liberties in some areas. But yeah, I think it's funny that it felt like work for some people. Yeah. [00:46:03] Speaker B: Guys, thank you so much for coming. I love talking to you guys. Jeremy, Leah, always a pleasure. Thanks so much for coming on the podcast. [00:46:11] Speaker D: Thank you for having us. [00:46:13] Speaker A: Great discussion. Thank you both. [00:46:15] Speaker B: Yeah, well, we may have you come on again at some point. I love talking about cybersecurity with you guys. So, as I said, always a pleasure. Look them up: Piqued Solutions. Definitely look up Leah and Jeremy. They're good people. So thanks so much for being on, guys. Stephanie, I've got to tell you, I love talking to people like that. When we were doing the old cybersecurity podcast, it was really early in my time talking to cybersecurity people.
There was a guy we talked to, he was a white hat hacker too, who said: earlier in my career, I was in this hotel overseas, and it was the first one I was ever in where all the controls were on a pad, like an iPad. You could control the heat and the windows and all that stuff. So I was back in my room one night and I was bored, and I thought, I wonder what I can get into through this. And he stayed awake and spent the next 24 hours messing with it, and he was able to access the room controls in all the other rooms in the hotel, all sorts of things. I was like, oh, I guess it's good that he's out there. And then also, it's very scary to know that it's that easy. [00:47:19] Speaker A: Oh yeah, it's really scary. But you know what's so great about talking to Jeremy and Leah? For people like me, and I'm not a cybersecurity expert, I don't really know anything about it, because it's changing so quickly and there's so much going on. And like you said, it's very scary. I've seen demos where somebody hacks a phone and can tell where you are, or can find a random individual anywhere in the world. But Leah and Jeremy break it down in a way that I can understand. I think our audience has an elevated level of understanding in engineering, but it's also good to bring it down to earth and talk about what this really means: the repercussions, what organizations have to do to get started. Just real-world information and practical approaches. [00:48:17] Speaker B: And if you're looking for real-world information and practical approaches, and I'm always looking for a good segue, check out Control Engineering or any of our other engineering sites. Lots of great information on there: other podcasts, videos, written articles, webcasts. So check those out.
And also, if you are interested in what Leah and Jeremy are talking about, definitely go to Control Engineering, click that webcast button, and you'll see their webcast on there. Absolutely worth watching. Thank you again for joining us. Always happy to have all of you out there listening. So thanks for joining us on the Control Alt Manufacturing podcast, and we'll be back with you soon. Thanks, Stephanie. [00:48:54] Speaker A: No, thank you. Until next time.
