Securing healthcare, HIPAA, and beyond: Cybersecurity insights from CISO Bill Dougherty
Is HIPAA a healthcare CISO’s best friend or worst nightmare?
This episode of Code to Cloud features an interview with Bill Dougherty, CISO at Omada Health, a virtual-first, integrated care provider combining the latest clinical protocols with breakthrough behavior science to make it possible for people with chronic conditions to achieve long-term improvements in their health. Bill brings with him over 25 years of experience in IT and security at such companies as RagingWire, StubHub and Copart.
Time Stamps
[00:00:11] Tim Chase: Hello and welcome to Code to Cloud. I'm Tim Chase, Global Field CISO at Lacework, and today I'm excited to talk with Bill Dougherty. Bill is CISO at Omada Health, and he brings with him over 25 years of experience in IT and security at such companies as RagingWire, StubHub, and Copart. Bill, welcome to the show.
[00:00:31] Bill Dougherty: Thanks, Tim. It's great to be here.
[00:00:33] Tim Chase: All right, let's get straight into it. I don't believe in lead-ins; I wanna start talking about HIPAA right off the top. One of the most fun things about being in the security industry is dealing with regulations, and obviously, working for a healthcare organization, you've got some very specific ones that you have to deal with, for example, HIPAA. So as the CISO at Omada Health, obviously one of your primary challenges is navigating HIPAA. How do you deal with that? What do you and your team do to make sure that you're HIPAA compliant and that you're protecting the data of patients?
[00:01:03] Bill Dougherty: It's interesting. As a CISO, HIPAA is the bane of my existence, but it's also my best friend. What I mean by that is that in healthcare, everything is harder than it would be at another organization, because you have this overarching set of regulations. At the same time, it's easier to justify investing in security, investing in privacy, and investing in controls, because you have this overarching set of controls coming from the federal government that says you must do this. So things that were easy at other companies are really hard, but at the same time, things that were hard at other companies, like justifying budget, are actually easier at a healthcare organization, because HIPAA gives you the stick, if you will.
[00:01:47] Tim Chase: That's a really good way of looking at it. I've been talking with several CISOs lately, had a panel yesterday, and I think that really hones in on something that's important when it comes to budgeting, right? Obviously we're all fighting for budget; security leaders are no exception. Very rarely have I ever heard of a CISO that says, "I get what I want," and leaves it at that, right? We're always having to balance and fight for it, and you're doing that sometimes with a board that may not be security focused. So do you find that when you go up to your executives or your board, being able to say things like, look, we've gotta comply with HIPAA, so we have to enforce this encryption here, we have to be monitoring at the intrusion detection level, it kind of gives you that way of enforcing security and getting budget?
[00:02:34] Bill Dougherty: I would say I've never met a CISO that has too much budget. I certainly don't have too much budget, and if my board is listening in, I could absolutely use some more money. But it gives you kind of a baseline. For us, we are protecting millions of records of health information, and if we have a data breach, it is an existential risk to the company, because foundational to what we do is trust. We have to have the trust of our partners and our customers and our patients, and a data breach would break that trust. So you have to have a good relationship with your board and with the rest of your executive team, where if I say I must have this, it means if I don't have this, we're putting the life of the company at risk. You have to build up a trust where you can't just pull that card every time. If you pull that card every time, your Chicken Little act is not gonna work. But it certainly gives you a framework and a backdrop to say, these are the things that are really important, these are the things that we must do, and here's why. I was working with our general counsel the last couple months trying to come up with a more sophisticated financial model where we could actually model out what our real risk of a data breach is on a per-record basis. And the answer is, it's really, really complicated. It's also really large dollar amounts. This year we will cross having more than 1 million patients in our program, so think of that as 1 million people where we have health information. Now go look at any study on the cost or risk of a breached record, and let's say it gives you a range. The low end of the range is $5 or $10 a record; the high end of the range is $500 a record. So $500 a record times a million patients is a really large number for a data breach. But that isn't actually our risk, because not all records are created equal.
So, I guess it's a long way of saying, if you've got the right relationships and you build the right risk model, you can get the resources you need. Not necessarily what you want, but the resources you need to do the job right.
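The back-of-the-envelope arithmetic Bill describes can be sketched in a few lines of Python. The per-record figures are just the illustrative range from the conversation, not Omada's actual model, which, as he notes, also has to weight records by sensitivity:

```python
# Back-of-the-envelope breach-cost range: published studies put the cost
# of a breached record anywhere from ~$5 to ~$500, and not all records
# carry equal risk, so treat this as a bound, not a real risk model.

def breach_cost_range(num_records, low_per_record=5, high_per_record=500):
    """Return (low, high) total-cost estimates for a breach."""
    return num_records * low_per_record, num_records * high_per_record

low, high = breach_cost_range(1_000_000)
print(f"${low:,} - ${high:,}")  # $5,000,000 - $500,000,000
```

Even the low end of that spread makes the "existential risk" framing concrete when presenting to a board.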
[00:04:39] Tim Chase: Tying it back to business objectives, whether that's keeping the goodwill, not being fined, but just in general, helping them advance the business, right? They've got a million patients; I'm sure they would like to have more data, because that's how they grow and expand. So tying it to those business objectives is something that I've been hearing a lot about and talking a lot about lately. But one thing I do wanna hone in on, 'cause I'm really curious about your thoughts: you mentioned that HIPAA is a good start; I think you said it's a framework. Do you find that you start at HIPAA, but then you have to add more controls on top of HIPAA in order to protect the data to an even better extent?
[00:05:20] Bill Dougherty: Yeah, I think that's fair. HIPAA is a 25-year-old law, and things evolve over time. So certainly, if you just took the Security Rule of HIPAA and said, okay, this is my checklist, you aren't meeting the bare minimums. You might be meeting the bare minimums of HIPAA, but you're not meeting the bare minimums of what you actually have to do to protect your data. So what we've done over the years is put together our own control framework, and it's based on some standards. It's based on NIST, it's based on SOC 2, it's based on HITRUST, and it's based on the Security Rule of HIPAA. And then we layer onto that the obligations we've made to our customers. We sell to the largest employers and the largest insurance companies in the country, and when you do that, they have lots of opinions as to what your security ought to be as well. So we make contractual commitments, and when we do that, that then becomes a new set of controls that has to go into our bucket and into our framework.
[00:06:22] Tim Chase: One of the things that I've seen lately is that Florida is introducing new regulations that prohibit providers from storing data offshore. What's your interpretation of what they're doing, and what do you think the implications are?
[00:06:36] Bill Dougherty: If I remember correctly, it's SB 264, the new Florida law, and what they did is impose new obligations on healthcare providers using certified EHR (electronic health record) systems to store health information only in the United States. But it actually goes a little bit farther when you read into it: it covers storage, processing, and accessing of health information. The general idea is they don't want health information offshore. Now, there are two good things for us. One is we aren't using a certified EHR, so even though we have Florida patients, it's not really clear that the law applies to us. The second is we have always operated in the US, where we store and process exclusively in the US, and we commit to that in our customer contracts. The sticking point really is gonna be the definition of access. If you think about it, we are a cloud-native company, so we have lots of SaaS providers, and we run all of our infrastructure in Amazon. The large SaaS providers and Amazon have global support organizations, and there is a scenario where one of my SaaS providers that has electronic health records in it could have a support organization in Australia or in India. And if I contact them for support in the middle of the night, are they accessing health information from offshore? It's a harder problem to solve. So we're looking at it: (a) trying to figure out exactly what is meant by access, (b) does it apply to the systems we're talking about, or does it only apply to a certified EHR, and then (c) what do we do about it? I think the spirit of it is good, which is we should keep US health information inside the US, and that's how most health providers operate today. But the details of it become really tricky, and it's always at the edge cases. The edge cases for security are really hard to write into law.
[00:08:42] Tim Chase: That's tricky, right? Every time I feel like we try to do something here when it comes to the law in cybersecurity, there's always still a matter of catch-up. Like we talked about HIPAA earlier: it may say encrypt, but it doesn't say how to encrypt or what level of encryption, because it's kind of ambiguous. When you talk about this Florida healthcare law, it's one of those things that almost has to get out there, get the critique and feedback in a first or second round, and then, you know, do it. The one thing that's still a little bit interesting, to me at least, is the data can't be stored there, but what about the access? So what if you have an offshore provider, say in Eastern Europe or in Asia, that just connects in through Zero Trust or a VPN or something like that? I don't think it prevents anything like that. Does that kind of halfway negate the security risk, or does that
[00:09:29] Bill Dougherty: I think that's where there's a difference between how we think about security risk and how we think about compliance with the law. And I am far from an expert on the details of Florida's law; I've got great privacy attorneys that help me with these things. But I think the risk of accessing data from offshore, if you have the right controls in place, is relatively low. We still try to prevent it, but the law may not account for any nuance. The law may just say you can't access it. And so you then have to ask, exactly what do you mean by the term access? Does reading it on a screen mean access? Does downloading it to my computer mean access? Does printing it mean access? There are lots of different use cases, and we have to know exactly which ones are allowed and which ones aren't in order to comply.
[00:10:16] Tim Chase: And how do you define offshore? That was the thing that always bugged me a little bit. I would see contracts, or laws, that would say you can't do anything offshore. Well, does it say anything outside of the US? What about Canada, Mexico? Like, how do you define offshore, exactly?
[00:10:31] Bill Dougherty: that, that is a great question. In our contracts, we usually specify within the United States. Okay. So now then sometimes it says the 50 United States. Does that include or exclude Washington DC Sometimes it says the United States. and sometimes it says United States and Territories. So like we have a provision where we don't allow employees to take a a company laptop outside the US. can they take a laptop to Puerto Rico or the US Virgin Islands, or can they take a laptop to Alaska if their flight to Alaska actually lands in Canada on the way.
[00:11:09] Tim Chase: that's the one that I was thinking of.
[00:11:11] Bill Dougherty: It's always the edge cases that get you. And so we try to write policies and put in controls and do all kinds of things to deal with the nominal case and the edge cases, but it's always the edge cases that keep me up at night.
[00:11:25] Tim Chase: It's kind of the hacker in all of us, right? Even the ones that aren't hackers. Like, I could never be a hacker; that just wasn't the way my mind was built. But it's the way you wanna game the system a little bit, right? And figure out how to game it. That's the security mindset in all of us a little bit, I think. Speaking of which, gaming the system and kind of planning that out: one of the things that I want to talk to you about is threat modeling, because I think that's an area you have a lot of experience in and have put a lot of effort into. So before we go into the threat model that you and your team developed, I'd like to talk about, just for people who may not know, what is a threat model inside of the cybersecurity space?
[00:12:00] Bill Dougherty: Yeah, so generally speaking, a threat model is a systematic way of trying to identify what can go wrong with a system ahead of time, and then tailoring your controls around it. A lot of the work on this was developed at Microsoft several years ago, the STRIDE model. There is a fantastic book by a guy named Adam Shostack; anybody who's interested in threat modeling, read that book. It is the Bible. He's brilliant on it. And they were really focusing on the biggest, most commonplace risks in software development and how to prevent those. So, things like, we want our software to be available. The D in STRIDE stands for denial of service: what do we need to do in our code or in our system design to prevent denial of service, so that this goal of availability is achievable? So it's been around for a lot of years. One of the goals of using threat modeling is to do it early, and the other is to make it repeatable. You want the same general answers to come out every time you do it, across a large group of people, if that makes sense. So it's a systematic way of identifying risks, versus just getting into a room with a whiteboard and throwing stuff at a wall and seeing what sticks.
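For readers new to it, the six STRIDE categories from the Microsoft work Bill mentions each pair a threat with the security property it violates, and the repeatability he describes comes from walking that fixed list for every component. A minimal sketch:

```python
# STRIDE threat categories and the security goal each one attacks.
# "The D stands for denial of service," with availability as the goal;
# the other five pairs follow the same pattern.
STRIDE = {
    "Spoofing": "Authentication",
    "Tampering": "Integrity",
    "Repudiation": "Non-repudiation",
    "Information disclosure": "Confidentiality",
    "Denial of service": "Availability",
    "Elevation of privilege": "Authorization",
}

# A repeatable review walks every category for every component, so two
# teams running the model independently get the same checklist.
def checklist(components):
    return [(c, threat) for c in components for threat in STRIDE]
```

Running `checklist(["api", "database"])` yields twelve component/threat pairs to consider, which is exactly the "same answers every time" property Bill calls out.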
[00:13:22] Tim Chase: Yeah, and I think the repeatable part is something I've always honed in on in my cyber career. When I'm leading or starting a cloud security team or an application security team, whatever you do, the process you roll out has to be repeatable. It can't be something that you and you alone can do, where you can only hit play once and you're not gonna get the same thing every time. I think that even goes back to my testing days when I was running performance tests and automation tests, right? It's gotta be a hundred percent repeatable. So I agree with that. But you also mentioned that it has to be integrated into the lifecycle. How do you integrate threat modeling into the lifecycle? Where would you put it? Is it the beginning, the end? Is it part of the continuous improvement cycle of DevOps?
[00:14:01] Bill Dougherty: All of the above. When you're doing it right, you're doing it at the design phase of a project, and then you're coming back to it as the project progresses. So let's say you're designing a new application, and the application's gonna have sensitive data in it. We know we need an authentication mechanism, and we're gonna write down at the very beginning what we think the authentication mechanism ought to be, and whether it's single factor or multifactor. Okay, great, now we've got that written down. Now we start writing code, and three months in, we change our mind as to what the authentication mechanism is gonna be. All right, let's go back to the threat model and see what impact that has. Are we making things better or worse? What are the compensating controls? Let's update the model. And then when you get into testing or production, you have something to validate against: oh, at the very beginning in our design, we said we were using multifactor. Now it's live in production, and I can get in without multifactor. What changed? So it's a way of analyzing a system, and if you have a system in place already, you can go do a threat model on it and use that to assess the risk of the system. But ideally, you're doing it at the very, very beginning. We also use it for our vendors. Every time we're gonna bring on a new vendor, we do a threat model on 'em, because we need to figure out how that vendor is going to fit into our ecosystem. So a good threat model can be inserted at all the various stages of the lifecycle.
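The lifecycle loop Bill walks through, record the control decision at design time and then diff what shipped against it, can be illustrated with a toy check. The field names and values here are hypothetical, purely to show the shape of the idea:

```python
# Toy threat-model record: the design phase decides "multifactor", and
# later phases compare production reality against that decision.

design_model = {"authentication": "multifactor", "transport": "tls"}

def drift(observed):
    """Return controls where production no longer matches the design model."""
    return {k: (expected, observed.get(k))
            for k, expected in design_model.items()
            if observed.get(k) != expected}

# If production now accepts a single factor, the drift surfaces immediately:
print(drift({"authentication": "single_factor", "transport": "tls"}))
# {'authentication': ('multifactor', 'single_factor')}
```

The real work lives in how `observed` gets populated, but the principle is the one Bill describes: the written-down model gives testing and production something concrete to validate against.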
[00:15:33] Tim Chase: Yeah, it seems like there's a bit of a parallel with the way I do static testing sometimes. It's not always that you run a fresh static scan every single time; sometimes you run it on the diff, what has changed, and just run it based on that, right? 'Cause that's the way to scale, to some extent. At the very beginning, or maybe at the beginning of every release or something along those lines, maybe you'll run a fresh one just to make sure you have a complete understanding and haven't missed anything. And it comes at the end too. I'm a big fan of continuous improvement. Some people think it's very much linear, and then you're done with the development of what you're building, right? But at the end, you should be doing a lessons learned, a continuous improvement, and that threat model, I think, should be kind of in there as well.
[00:16:16] Bill Dougherty: Yeah, your static testing example fits really well with how we use threat modeling, because part of it is we have a baseline set of controls, and so we can go through a system and very quickly say, is this going to adhere to the baseline set of controls? If so, check the box and move on. So, using authentication as an example: we bring in lots of SaaS products. Do they integrate with our single sign-on tool, yes or no? If they do, then I automatically pick up all kinds of stuff. I pick up multifactor, I pick up device certifications, I pick up role-based access, lots of stuff. If they don't, now I need to go deeper and say, okay, since you can't adhere to my single sign-on thing, what are you doing? How are we managing factors? Who's managing passwords? How do we provision, how do we de-provision? All this stuff is built into my framework of base controls, so that if you just said yes, we get to move on. It's a way for us to identify the path of least resistance. Also, you can evaluate three vendors, and one of them adheres to all your controls and one of them is a unique unicorn. Go with the one that adheres to all your controls. Your life will be easier.
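The single sign-on shortcut Bill describes, one "yes" answer satisfying a whole cluster of baseline controls at once, is essentially set subtraction. The control names below are illustrative placeholders, not Omada's actual framework:

```python
# Baseline controls every new vendor must satisfy (illustrative names).
BASELINE = {"mfa", "device_check", "rbac", "deprovisioning",
            "audit_logging", "encryption_at_rest"}

# Controls inherited automatically when a vendor integrates with SSO.
SSO_INHERITED = {"mfa", "device_check", "rbac", "deprovisioning"}

def needs_review(vendor_controls, integrates_sso):
    """Return the baseline controls still needing a manual deep-dive."""
    satisfied = set(vendor_controls) | (SSO_INHERITED if integrates_sso else set())
    return BASELINE - satisfied

print(sorted(needs_review([], integrates_sso=True)))
# ['audit_logging', 'encryption_at_rest']
```

An SSO-integrated vendor leaves only a couple of items to review; a "unicorn" with no SSO support puts the entire baseline back on the table, which is the path-of-least-resistance point.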
[00:17:27] Tim Chase: A hundred percent. But continuing along with the threat model talk: you developed one yourself called Includes No Dirt, right? So you win for a better name than STRIDE. Maybe just talk about where in the world that name came from, and then, what does it do? How is it different from STRIDE or some of the other ones that maybe we're familiar with?
[00:17:48] Bill Dougherty: So, I take no credit for the name. My head of compliance and I co-wrote the paper on this. What we were trying to figure out is, we knew we wanted to insert threat modeling into our process. But as you can imagine, I was very focused on security, he was very focused on compliance, and we also have our privacy officer, very focused on privacy. And there was no one overarching model that encompasses that triad of security, compliance, and privacy. We had STRIDE for security; there's something on the privacy side, I think it's called LINDDUN; but there wasn't a great one for compliance. And so we said, well, gee, let's write our own. So we put together a list of the risk categories, and then he started playing with the first letter of all of them, 'cause STRIDE is just an acronym, and so is ours. We reordered things until he came up with Includes No Dirt, which covers security and privacy and compliance and has a cool acronym. It also has the advantage that if we've ticked off the whole threat model, then our system is dirt free, or at least we're reducing the dirt in a system. So it works both ways. Like, in the STRIDE model, the S stands for spoofing. Well, it stands for spoofing in Includes No Dirt too, but Includes No Dirt, for us, also includes identifiability. And for every risk, there's a goal. So for spoofing, your goal is authentication; for identifiability, your goal is anonymity. If you think about health data, sometimes we want to de-identify data, and there's lots of good reasons to do that. Well, if you do that, can you re-identify it? If you can, that's a risk, 'cause if we say something is de-identified, a whole different set of rules and controls and laws applies to it. So that is a privacy risk for us. I won't bore you with every risk and every goal in the model, but Includes No Dirt's an acronym.
And we wrote this down, we started using it, and we said, hey, we think this is really cool. We haven't found anyone else doing something similar in healthcare, but we wanna make healthcare better. So we wrote a white paper on it and published it to the world for free, and said, hey, other healthcare people, security and privacy practitioners, please start using this. It'll make your life better. So anyone who wants it can download it at includesnodirt.com, and we hope you use it, and we hope you extend it and modify it and make it your own.
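The identifiability risk Bill raises, de-identified data that can be re-identified, is easy to see in miniature. The field names here are hypothetical, and real de-identification (for instance, HIPAA's Safe Harbor method) removes a much longer list of identifiers:

```python
# Stripping direct identifiers is the easy half of de-identification.
DIRECT_IDENTIFIERS = {"name", "ssn", "email", "phone"}

def strip_direct_identifiers(record):
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

rec = {"name": "Jane Doe", "zip": "94103", "birth_year": 1980, "condition": "diabetes"}
cleaned = strip_direct_identifiers(rec)
# The hard half: zip + birth_year are quasi-identifiers, and joined with
# outside data sets they may re-identify the patient, which is exactly
# the risk the identifiability category (goal: anonymity) is meant to catch.
print(cleaned)  # {'zip': '94103', 'birth_year': 1980, 'condition': 'diabetes'}
```

The point of putting this in the threat model is that "can this supposedly de-identified data be re-identified?" gets asked systematically, not just when someone happens to think of it.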
[00:20:21] Tim Chase: I love it. I spent quite a bit of time reading through it after we talked last time, just 'cause I'm always curious, when I'm building training or talking to people about threat modeling, what some of the different options out there are. So if you haven't checked it out, just Google "Includes No Dirt threat model," learn more about it, and read the white paper. It's a really interesting way to go about it. So, taking the conversation in a little bit of a different direction: I kinda wanna talk about cybersecurity basics. Why do you think it's important to revisit cybersecurity basics, and what are the basics?
[00:20:55] Bill Dougherty: A couple years ago, actually during COVID, I got asked to be a contributing author on a book with a bunch of other CSOs called Back to Basics. And the topic was basically asking a bunch of people who've got a lot of scar tissue and have done this: how do we get back to just the core essence of what we're doing as security practitioners and cover the basics? If you look at a lot of the breaches that occur every year, it's pretty clear that there are some organizations that aren't even covering the basics. So, you know, what are the basics? To a certain extent, it varies by company, because your risk model is different than mine. I'm dealing with health information, so the basics of protecting health information are governed by things like HIPAA. The basics of covering social media information or auction information or pick-your-business are gonna be different. But there are some things that just kind of rise to the surface as basics across all businesses. I think I gave a list of 10 or 15 in that book. Like authentication: we all know authentication sucks, we all know passwords suck, we know that MFA works. Why aren't we using it? It used to be that MFA was expensive and hard, and single sign-on was expensive and hard. It's now cheap and easy. So to me, authentication and changing your default passwords, those are basics. Earlier, Tim, you mentioned encryption. Data encryption, again, used to be hard and expensive. Now it's cheap and easy. Encrypt your data. Encrypt it at rest, encrypt it in transit, and encrypt it with the latest algorithms. Don't try to roll your own. Don't get fancy. The stuff that's out there works pretty well. Patching is another one. Most vulnerabilities are resolved if you just patch your stuff. Yes, there are zero-days and they will hit you, but the odds are actually against you getting hit by a zero-day if you patch your systems.
Because the zero-days always hit the largest targets first, and then a patch gets created before it makes its way to you. So these are the kinds of things that I think are just fundamental basics for organizations. Occasionally I'll talk to startups, people just getting into this: what should my security program be? It's like, okay, here are 10 or 15 things. Go do these, and then we'll talk about everything else. If you haven't done these, the rest of it is gravy compared to the first 10.
[00:23:21] Tim Chase: Well, why do you think that's so difficult? 'Cause I feel the same way. When I'm talking with customers, sometimes I feel like, all right, let's just take two steps back on what you're doing: are you doing the right thing, are you focusing on the right thing? Why are the basics so hard? Is it 'cause we're busy? Is it that we don't understand? Is it so much information thrown at us? What's your take?
[00:23:45] Bill Dougherty: I think there are a lot of contributing factors. You know, we started the call talking about budgets, and you're always fighting for budgets. Basics aren't cool and sexy; a lot of times they're boring, and your staff doesn't wanna work on 'em. I'm at a company that's 13 years old but was born in the cloud and lives in the cloud. We don't have 30 or 40 years of legacy systems, and it's always the legacy systems that kill you. At my previous job, I was the CSO and CTO for a data center company, and data centers are very large refrigerated boxes with very expensive industrial control systems. And the industrial control systems were never designed with security in mind. They use protocols like BACnet across two-wire, and there is no authentication mechanism, there is no encryption of those protocols. And then oftentimes you're connecting them to, like, a Windows 98 or a Windows NT 4 system, because when they produced this thing 10 or 15 years prior, that's just what you did. And there's no budget to upgrade it. So now you've gotta put a whole bunch of secure wrappers around this thing. It's like the science project when you were a kid where you had to drop an egg and not have it break: you'd wrap it in bubble wrap and do other stuff because the core was fragile. So we start layering all this crap on top of it, and then somebody forgets, or they have trouble and things are breaking all the time, so they open up a port just so they can access it from home. They're not trying to do something malicious, but they blow past all your layers of security to make their life easier. So I kind of think that's why it's hard. I think legacy systems are hard. I think budgets are hard. And oftentimes, I think the third area is who's responsible for it. I run IT and security, so I'm responsible for it.
But if IT and security are separate, the security person says, patch your systems, and the IT person says, yeah, that's on my roadmap, I'll get to it next year.
[00:25:52] Tim Chase: All right, so let's talk a little bit about you and your journey to being a CISO. Walk us through your early career. How did you first get involved in IT and security?
[00:26:04] Bill Dougherty: By accident. I've been dealing with computers my whole life, going back to my Atari 800 days in the late seventies. I went to a camp at Caltech when I was, I don't know, 12 or 13, where we learned punch cards. So I've been dealing with computers and computer systems for a very long time. But I came outta college with a business degree, and my first job in life, I was a mortgage broker. And weirdly, that was also the first time I sold software commercially, because I was a mortgage broker and I was lazy, and every Monday morning we had to produce sales reports for our sales manager. There was a whole team of people all doing these spreadsheets, and it would take hours and hours. I figured this was a problem I could solve with a computer program, so I wrote something in FileMaker Pro to manage my portfolio of loans. And every Monday morning, the boss would ask me for my report; I'd click print, here's the report, and I'd go on with my day. All the other brokers said, hey, we want that too. So I sold it to 'em for beer, basically: we'd go out drinking, they would pay for my beer, and that got them unlimited software updates and support. So I was a mortgage broker for a couple years. Then interest rates went up, not unlike what's been going on in the markets the last year or so, and the mortgage market fell apart. So I said, well, gee, I need another way to earn an income. I know a little something about computers, so I started selling computers, and I spent 10 years in the reseller community, selling hardware, software, services, networking, et cetera, mainly to the Fortune 500. After five or six years of selling this stuff, I was tired of living on commission and tired of selling without technical support. I went to my boss and said, I wanna become an engineer, a sales engineer. And if I do that, I want you to put me on salary, and I wanna support all the other salespeople.
'Cause we didn't have any sales engineers. And I'll take a cut of their commissions. He said, okay, that sounds like a great plan, 'cause he thought it would take me years to become an engineer. I went and passed my MCSE in like six weeks and then came back to him. He was kind of shocked, but that started me on my journey. So I did that for a while. I co-founded an MSP in the early two-thousands, when MSPs were cool. And then, after a few years, one of my customers recruited me away to work in IT for them. So I jumped the fence from being a service provider to being a customer, and that was really where I got into security in depth. I looked at the landscape and realized security people made more money than IT people, and I also kind of liked security, so I went and got my CISSP and started going to Black Hat and DEF CON and building up my skills, and grew from there. For the next 10 or 15 years of my career, I kind of straddled this fence where I had an IT title but was also responsible for security, or I had a security title but was also responsible for IT. And now at Omada, my title is security, but I'm also responsible for internal IT. I think that model works really well, because I'm responsible not only for the policy but for the implementation, and that makes me accountable for how well it works.
[00:29:24] Tim Chase: So it sounds like CISO became kind of a natural path at some point, just because you wanted to do more. You got that skillset, you built it up, and then eventually the CISO role just made sense for you, because you were doing it, then managing it, and then just leading it.
[00:29:41] Bill Dougherty: Yeah, I think that's fair. At every job I entered, I was able to advance my career and take on more responsibility and more leadership. But CISOs have to be really good at talking to the business. One of the most important roles of a CISO is understanding the needs of your executive partners. So my early days in mortgage banking and in selling actually helped me out a lot, because I've carried a bag; I know what that looks like. At one company I was responsible for accounting and an accounting system transition, so I can talk to the CFO about our financials, I can talk to our head of commercial about the sales process, and I can understand the world they're in, because that's the world I came from before I started trying to implement systems.
[00:30:29] Tim Chase: That is awesome. One last question, and then we'll get to some rapid-fire questions. What advice do you have for someone entering the IT or information security field for success?
[00:30:40] Bill Dougherty: I almost always tell people to go become an expert in something else first. The best IT people and the best security people I know have some other expertise within the business first. If you want to be really good at operating a financial system, you should understand accounting first, and not just understand it; you should have actually had to pay a bill or make payroll and lived what that's like. If you want to run an HRIS system, you should have some experience working in an HR department. If you want to run Salesforce or another commercial system, you should have some expertise on the sales side of the house and some affinity for what it's like to get up every morning, make 50 cold calls, and get hung up on 50 times, because that gives you the knowledge you need to make the system better for the people who are actually going to use it. So I consider the fact that I didn't start out in IT or security to actually be a gift, because it gives me at least some empathy for my customers. That's why I love it when I'm able to take staff from other departments who are interested in transitioning and give them an entry role if possible, because they come with a different set of knowledge than you get coming out of college with a CS degree.
[00:32:02] Tim Chase: No, that's a great point. And I find the same thing when I talk with people who were on the CIO or CTO side of the house, or who were developers, before they came over to security. They have that empathy. So rather than taking things and flipping them over the wall to the developers and saying, do it, they have the empathy of having had security shove stuff in their face. When they're actually in charge of security, all of a sudden they're like, okay, we need to figure out how to be enablers for this dev team and make their life easier, not harder. So I think the empathy factor comes in all around. I like that.
[00:32:37] Bill Dougherty: Yeah, a hundred percent. Security departments and CISOs get a bad rap because people think our job is to say no. That isn't our job at all; it's the polar opposite. Our job is to say yes safely. And the best way to say yes is to understand what the other party is trying to do, because they aren't trying to do something malicious. They're trying to move the business forward. Your job is to enable them and be a support organization for them.
[00:33:03] Tim Chase: Spot on. Exactly. I love it. Okay, let's get to some rapid-fire questions. I've got three of them for you. Ready? What's the most important habit an IT leader can have?
[00:33:14] Bill Dougherty: Curiosity. When I'm talking with the rest of the business leaders and I'm being presented financial data or sales data, I want to know what's underneath it. If you ask a lot of why questions and dig three or four layers deep, oftentimes you'll uncover something where your department can actually have an impact. Like, it turns out the real reason we missed our numbers this month was that something my team did interrupted our most important sales meeting in the middle of it, and hey, maybe we should put in a control that prevents that from happening next time. So, curiosity.
[00:33:52] Tim Chase: What emerging technology or trend in cybersecurity excites you the most?
[00:33:57] Bill Dougherty: I would say AI both excites and scares me. It excites me because it has tons of potential for being kind of a force multiplier. It scares me because it has tons of risk, and not, like, Skynet risk, but a data breach. I've been playing a lot with these tools, and there isn't a day that goes by that I don't get them to hallucinate and give me flat-out factually wrong information, which the tools present as though it's a hundred percent factual. In a situation where we want to use AI to improve the quality of healthcare, factually wrong information is really bad. So I'm both excited and terrified of these tools, but, you know, never bored.
[00:34:44] Tim Chase: Never bored. That could be our motto in general with security, right? Terrified, but never bored. What one tip would you offer listeners to increase their cybersecurity?
[00:34:54] Bill Dougherty: Focus on your relationships with your customers.
[00:34:57] Tim Chase: That's a great one.
[00:34:58] Bill Dougherty: The worst incidents in security happen because nobody told you what was going on, and there are times when it's, well, why didn't you tell me you wanted to do that? We could have said yes safely, versus you trying to go around me. So build up great relationships so that you become an enabler and not a roadblock.
[00:35:18] Tim Chase: I love it. Thank you, Bill. That is all for us today. Thanks so much for tuning in. If you're enjoying the podcast, please take a moment to subscribe or write us a review, and we'll see you next time on the Code to Cloud podcast.
About the guest
Bill Dougherty is the CISO for Omada Health and the co-author of the INCLUDES NO DIRT threat model. Dougherty brings over 25 years of experience in IT and security at such companies as RagingWire, StubHub and Copart.