The Security Table

The Final Take on the National Cybersecurity Strategy: Software Liability and Privacy

Chris Romeo

Chris Romeo, Izar Tarandach, and Matt Coles discuss the national cybersecurity strategy, focusing on pillar three, which aims to shape market forces to drive security and resilience. They explore the idea of liability and the goal of shifting the consequences of poor cybersecurity away from the most vulnerable. The trio also considers the influence of GDPR and its impact on the US, comparing it to the European Union's experience.

The podcast hosts discuss the need for better security in IoT devices and the potential impact of the policy on the rest of the world, including China. In addition, they express concern about the potential for a tedious and complex liability process similar to the medical industry, which may not ultimately benefit users.

FOLLOW OUR SOCIAL MEDIA:

➜Twitter: @SecTablePodcast
➜LinkedIn: The Security Table Podcast
➜YouTube: The Security Table YouTube Channel

Thanks for Listening!

Chris Romeo:

Hey folks. Welcome to another episode of the Security Table. This is Chris Romeo, and I'm joined by Izar Tarandach and Matt Coles. We are gonna tackle the rest of the National Cybersecurity Strategy. We had lots of thoughts so far, many things to say, many things to discuss, debate slash argue about, but we have a few more. And really, one of the biggest things for software security, application security, and software in general that we haven't even talked about yet is this whole idea of liability. So let's pick right back up in Pillar Three, which is called "Shape Market Forces to Drive Security and Resilience." I want to get both of your takes on the high-level goal they have right at the top of Pillar Three, and I'm gonna read it directly: shift the consequences of poor cybersecurity away from the most vulnerable, making our digital ecosystem worthy of trust. What's your initial reaction when you hear that?

Izar:

Okay. So, funnily enough, we go back to that word of last week: trust. But now it's a different kind of trust, right? As someone who has seen the nitty-gritty of putting out secure software, in many instances I battled with the fact that I wanted to institute a new procedure, or put a new tool in place, or ask people to do things differently. And then there was that question: do we have a mandate to do this thing? And the answer was always no, we don't. The only mandate here is that the product has to go out the door. The market waits for nobody, right? If there's any problem, we're going to deal with it after. Sounds great, but the problem is that in between the launch and the afterwards, usually there is an incident in the middle where somebody may lose something that's very important and dear to them. And that wasn't really taken into account many times. I feel that the aim of this part of the policy is right on point. I mean, people can't defend themselves. They give us their data on trust, and trust is something that you earn, right? We haven't earned that trust so far. Quite the opposite in many aspects: we have lost that trust again and again and again, but people have no option. They have to deal with the world as it is today, and that goes for our systems. So I see this as an opportunity to actually earn that trust, to go after it, and on the inside, to have the mandate to actually institute those things that many times go against what the developer wants in terms of velocity or direction, but that, unfortunately, are so, so needed.

Matt:

Yeah, I definitely agree with this. I guess it provides a backstop, or... I won't say a stick, because it's not really a stick. It's an incentive for organizations that are producing products to take more of an interest. And I'm not saying that developers didn't always already, but as I mentioned, time to market was always a challenge, right? There was always a push for a lot of organizations to ship. And now, with this in place, there's an emphasis on don't just blindly ship, right? Make sure that your requirements match not only the functions you want to offer but also the customers you're selling to, and give them the capabilities and the tools, or build a system that is secure by default. I'm gonna use "secure by default" today. And, very important, there's a follow-up document that came out from CISA on this topic that we probably should highlight here: shipping products that can be consumed securely, not just products that customers can put into a trustworthy state, but ones that start trustworthy.

Izar:

Yep.

Chris Romeo:

And I mean, the thing I love about this statement is it really does correct for the sins of the past that we've had in technology. For so many years, the fallout of data breaches and whatnot has been put on the users. It's ultimately our data that's been stolen, lost, whatever, on the dark web. So when I see this statement, I see the industry shift that we needed 20 years ago being pushed front and center, to say we need to move away from the end customer being the recipient of whatever bad things happen, and really push this back on the people that are building technology products. And, you know, the three of us have been doing security for... we could count up the number of decades. We could very simply create a five-point list of what those companies need to do to really be successful with security and build secure products. The challenge is they just haven't been doing it. So for me, flipping this around and saying we're taking this off the users, or the customers, and we're putting it back on big company X, Y, and Z and startup X, Y, and Z, I think that's a really positive statement.

Matt:

Well, I think the policy actually goes a step further. We're sort of talking here about having a reason to build security into your system, deploy it in a secure way, and earn back trust. But keep in mind that the strategy itself will require Congress to establish a liability scheme, meaning not just drive software vendors to produce secure software, but also there's a liability, a legal hook, if they don't. Now, in that same paragraph there is discussion about safe harbor: some sort of legal framework that would allow a company that does the appropriate due diligence, whatever that is, to be shielded from liability, to some extent, if I'm reading that correctly. And that could go in a lot of different directions. So hopefully the emphasis stays on moving the role of security back into the vendor's world, because they have the ability to influence it from design, from development, from deployment, whether we're talking cloud or desktop software or whatever. Give customers the tools to know that the system can be trustworthy. And obviously vendors may want to provide options to make it less secure, not more; in other words, require users to opt out of security as opposed to requiring them to opt into security. And then if an organization is grossly negligent in terms of that, have liability for that. By the way, I think we've seen precedent for this in the medical, automotive, and safety-critical spaces, where an organization that doesn't go through the due diligence practices is potentially liable for any harms that are caused. So this is moving regular software development more towards that goal, and that's probably a good thing overall.

Izar:

Yeah.

Chris Romeo:

And it's painful.

Matt:

It will be interesting to see how this plays out with open source software, especially given how liability works. Who, or what, is liable in an open source project?

Chris Romeo:

I mean, liability's about money, right? If you don't have any money, then there isn't really any liability, because I guess you could sue the project maintainers personally. And so you might see LLCs start to be used for open source projects. Like we...

Izar:

Or the,

Chris Romeo:

an LLC.

Izar:

Or they'll just change the license to say, hey, we're not responsible for anything, you use it at your own... your own peril. And that's...

Chris Romeo:

a good.

Izar:

But actually, before we go into liability, 'cause I think we're going to spend a good amount of time there, there's another aspect of this that keeps coming back to me, this week especially, 'cause yesterday I had a very interesting chat about it. So we have the policy, and as Matt said, the policy already generated the secure by design document from CISA. Right at the start of the Pillar Three text, on the first strategic objective, "Hold the Stewards of Our Data Accountable," the opening sentence is: securing personal data is a foundational aspect of protecting consumer privacy in a digital future. That sentence, for me, is one of the most interesting here. The moment you declare something to be foundational and you marry it with secure by design, you are saying, okay, from now on everybody has to build these very strong foundations of security, and on top of that, develop your stuff. But at the same time, you are saying, we want you to plan ahead; we want you to think in terms of security even before you build those foundations, right? Start thinking in terms of what kind of security value your product is going to have. It goes back to the discussion we had a couple of episodes back about the security... what was it that we called it? The companies that provide the security products?

Matt:

Security vendors.

Izar:

Security utilities. We said it's like a water utility thing, right? Yeah, utilities. It goes back to that, because those people are immediately going to be expected to have an even broader and harder approach to security. But if you take the normal, everyday software producer, this discussion that I alluded to mentioned the fact that some people love saying, when considering the security you have to put in place, think like a hacker. The three of us don't like that, because we know that thinking like a hacker doesn't mean anything, right? If you don't have the knowledge, you can think until tomorrow and you're not going to come up with anything. But this conversation took me to another aspect of the think-like-a-hacker example. If we see an attack technique that we know is going to cost X amount of dollars and give you a return of Y amount of dollars, and Y minus X is a small sum, or what we consider a small sum, we are going to say, hey, you know what, I can live with that; that's fine, because attackers just won't come for that small amount of profit. Then this person told me: think in terms of not the US but the rest of the world. There are people out there to whom that small amount of dollars is actually a large amount of whatever the local currency is. So the think like a hacker here becomes: think in terms that somebody somewhere is going to have an incentive to go after your data. So don't take the approach of, it's okay, I'm going to accept the risk, and that risk is fine because nobody's going to come knocking on my door, because there will always be somebody coming and knocking at your door. Right? And that to me changes the whole equation of risk, of probabilities, of thinking about what exactly we are going to defend against. It raises the bar on the things I'm willing to accept as risk, right? And I think this is a push in the right way.
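
To make the calculus Izar describes concrete, here's a minimal sketch in Python. It's purely illustrative: the profit threshold, the buying-power multiplier, and the function name are all invented for this example, not anything from the strategy or the episode.

```python
# Purely illustrative: an attacker's go/no-go decision on an attack technique.
# The point is that "small profit" is relative to the attacker's local buying power.

MIN_WORTHWHILE_PROFIT = 500  # hypothetical attacker threshold, in local buying power

def attack_is_worthwhile(cost_usd: float, payoff_usd: float,
                         buying_power_multiplier: float = 1.0) -> bool:
    """Y minus X, weighted by what that money is worth to the attacker locally."""
    profit = (payoff_usd - cost_usd) * buying_power_multiplier
    return profit >= MIN_WORTHWHILE_PROFIT

# A $200 profit looks ignorable if you assume a US-centric attacker...
print(attack_is_worthwhile(400, 600))                             # False
# ...but the same $200 clears the bar where buying power is 5x.
print(attack_is_worthwhile(400, 600, buying_power_multiplier=5))  # True
```

The defender's takeaway, in Izar's terms: you don't get to pick the multiplier, so a risk you accepted because "nobody would bother" may still be worth somebody's while.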

Matt:

Uh

Chris Romeo:

let me throw

Matt:

oh

Chris Romeo:

gonna be a little more controversial

Matt:

Can I... yeah, I wanna respond to that. Will it actually change? I guess I wonder if it will really change the calculus when it comes to what level of risk you may accept. More importantly, though, it may drive organizations to be more complete in their risk analysis and risk treatment. Thinking back to medical devices and whatnot: when you thought about harms for medical devices, you had to be pretty thorough, right? There were a lot of different cases you had to consider, and even the most obscure case could still have a significant impact, so it still needed to be part of the hazard analysis and the exercise around risk management, as opposed to just taking a defined set, or the ones that tend to be the higher dollar amount or the higher impact in terms of publicity or whatever. So I think it may drive a more complete risk exercise as opposed to necessarily changing your thresholds. I think, maybe.
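
A tiny sketch of the distinction Matt draws: documenting a risk decision for every identified hazard versus pre-filtering to the headline ones. Everything here (the hazard list, the scoring) is invented purely for illustration.

```python
# Illustrative only: a "complete" risk exercise scores every hazard,
# rather than triaging only the big-dollar or high-publicity ones.

hazards = [
    {"name": "data breach via SQLi",        "likelihood": 0.30, "impact": 9},
    {"name": "obscure timing side channel", "likelihood": 0.05, "impact": 8},
    {"name": "default password unchanged",  "likelihood": 0.60, "impact": 7},
]

# Complete: every hazard gets a documented risk score and decision...
assessed = [(h["name"], h["likelihood"] * h["impact"]) for h in hazards]

# ...as opposed to only treating whatever clears an arbitrary headline bar.
headline_only = [h for h in hazards if h["impact"] >= 9]
```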

Izar:

Hmm.

Chris Romeo:

So I guess the controversial thing I want to throw in here for 3.1: GDPR for the USA. You heard it here first. I'm on the campaign trail now. GDPR. I mean, I was running a startup, Security Journey, when GDPR hit. I had to go through it; we had to do the dance, we had to build the features that made us GDPR compliant. Who knows if that actually was true or not. But I still like what GDPR did as far as forcing companies to protect data and pay the price if they don't. If you do bad things with data, if you disrespect the data of your user base, you will be held accountable for it, with GDPR, still to this day. So I'm advocating that we need a GDPR for the USA. We don't need to call it GDPR, but we need that level of personal privacy protection to exist across everything that we do. Now, I don't think that's what this is leading to, but it might be. I mean, hold the stewards of our data accountable. Accountability. I like what they're saying. Maybe I'm taking it a step further. What's your reaction, Izar, when you hear GDPR for the USA? Do you want a t-shirt? A hat? I'm printing t-shirts, the whole nine yards.

Izar:

Just don't tell me that you're going to make GDPR great again. But, uh,

Matt:

We're gonna get so much hate mail.

Izar:

Yeah. The thing that keeps coming back to me is: okay, privacy, fine, great, that's awesome. Did GDPR do anything to solve SQLi? No.

Chris Romeo:

It wasn't for that, though. It...

Izar:

it wasn't for that,

Matt:

Actually, indirectly it was, right? In order to have privacy, you have to have security.

Izar:

But the point here is, when I look at the foundational aspect again, right? To me, it's not the laws that we're going to put on top and say, this is the amount of fines you're going to pay if something bad happens. It's going back to people and saying, these are the bad things that can happen, and this is what you do about them.

Chris Romeo:

Yeah, but GDPR is something different, though. GDPR didn't exist to cause security to be built into applications.

Izar:

That's exactly my point.

Chris Romeo:

But that's... so, yeah, I agree that we need more protections against SQLi, but GDPR is not the answer. GDPR protected us from big companies taking our data and just doing whatever the heck they want with it, not respecting our rights to our data at all...

Izar:

Let's look

Chris Romeo:

I

Izar:

Okay, let me put it like this. I gave up on my privacy. Google knows everything about me. That's it, there's nothing more to know, right? But it's not Google that's really going to cause me to have a problem because of what they know about me, because of that loss of privacy, that chosen loss of privacy, right? No, what I'm worried about is the leak, somebody with intent to begin with coming after my data. That's not a matter of privacy, I agree. But again, I go to the foundation. We haven't solved the simple things. Why are we worried so much about the big ones? Pay more to the lawyers?

Matt:

So

Izar:

Pay more to the developers so that they do the right thing.

Matt:

Remember earlier you talked about the mandate? GDPR provides a mandate to do things during a system's requirements and design and implementation, because there's legal impact if you don't, right? And we don't have a matching one for security. The underwriter, UL, tried with their standard, and ISO has a standard around application security, but none of these are legally binding. GDPR provides a legal framework. So it isn't just a policy statement about protecting user data; it's presenting a legal framework, a liability framework, for doing so, which drives behavior.

Izar:

Right, but my point is that GDPR by itself built this whole industry around being GDPR compliant.

Matt:

Yep.

Izar:

And what you see is companies having to duplicate their internal environments: half of it is GDPR compliant, half of it isn't. But at the end of the day, nobody has yet solved the basic problems.

Matt:

but I would be careful

Izar:

lawyers at it.

Matt:

I would be careful about that blanket statement. I think that some organizations have taken the approach that, well, GDPR covers a large portion of our customer base, and therefore, rather than duplicating effort, we should really code to that level, right? And have systems in place so that everyone basically gets the protections of GDPR, even if they don't live in the EU.

Chris Romeo:

Yeah, that's my experience as well. I haven't seen separate infrastructure being deployed to cover a GDPR-friendly case. Think, for example, of one of GDPR's requirements: the right to be deleted. For European Union citizens, Google has to give them the ability to truly be removed from Google's databases, end to end. And I get that you're coming at this from the other end; you've kind of said, okay, on personal privacy, I'm throwing everything in, Google's already got everything about me and whatnot. I would say I tend to lean more towards the other side. I try to stay away from Google as much as I possibly can, just because I don't like the market for how they do advertising, for example. I don't like the profile they have on me. I don't like the profile they have on everybody else. So yes, can I completely fight it? No. Am I caught in their machine? Yes, but not willingly. I don't want them to have a profile that knows everything about me, but they've been able to put it together from all the sources of data. And so when I see GDPR for the USA, I see that at least I have some controls where I can take ownership of my data again, even if it lives in their system.
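
For a sense of what "end to end" removal implies in practice, here's a minimal sketch of a right-to-erasure handler (GDPR Article 17). The store objects and their delete_all method are hypothetical; the point is the fan-out to every system holding the user's data.

```python
# A minimal, hypothetical sketch: honoring an erasure request means touching
# every store that holds the user's data, not just the primary database.

from dataclasses import dataclass

@dataclass
class ErasureReceipt:
    user_id: str
    stores_cleared: list[str]

def erase_user(user_id: str, stores: list) -> ErasureReceipt:
    """Fan the deletion out to the primary DB, search index, analytics, caches..."""
    cleared = []
    for store in stores:
        store.delete_all(user_id)        # hypothetical per-store deletion API
        cleared.append(store.name)
    return ErasureReceipt(user_id, cleared)  # evidence for the user or an auditor
```

Izar's anecdote just below is the failure mode this sketch glosses over: backups and downstream copies that never receive the fan-out.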

Izar:

So, about the right to delete, a funny thing happened on the way to that. I heard of a friend of mine who asked to be deleted from a given system, and he got an email back saying, your request has been accepted, in 30 days your data is going to be gone from everything we have, blah, blah, blah. Six months later there was a breach, and amazingly, his data was exposed. So yeah, they can tell me until tomorrow that they're going to delete it and everything, but do we have enforcement of that? No, we don't.

Chris Romeo:

I mean, GDPR provides enforcement from a lawsuit perspective. It's happened a number of times already. It happened the day GDPR went live. I don't remember what the guy's name was, but he sued Facebook and Google and a bunch of others for non-compliance. I don't know if they ended up settling; I never chased the case down. But at least there is a legal recourse. Whoever that happened to technically could have filed a lawsuit against that company according to GDPR, and they probably would've won.

Izar:

Right, but the calculus of the company is still going to be: is it worth it for us to redesign our systems, or to deal with a lawsuit at some point? What's going to cost us more?

Chris Romeo:

The big companies that could potentially really cause that, that have the most data, have the most incentive to comply with it. Because if the European Union... I don't remember what it's called, it's not the Supreme Court, but there's an equivalent of a Supreme Court there. If they come after Google, they're not coming for $500. They're coming for 500 million or a billion dollars, which does impact the bottom line even of a company as big as Google. If they're losing a lawsuit for a billion dollars, sure, it doesn't ruin them, but it hurts them some. It's not painless to lose a billion dollars, or multiple billions.

Matt:

Yeah, but losing the right to operate would be a worse fate than a financial penalty.

Chris Romeo:

True. And to my knowledge that hasn't happened; I don't think anybody's been shut out. But yeah, to your point, Matt, that hammer does exist in GDPR. They could be kicked out of the European Union, which for a big company is gonna hit the bottom line a lot harder than a billion-dollar fine. Okay.

Izar:

So let's

Chris Romeo:

...into the deep end here. But Izar, I'll give you... give us the last thought on this.

Izar:

Yeah, no. Actually, not a thought but a question, at the end here of 3.1: this legislation should also set national requirements to secure personal data consistent with standards and guidelines developed by NIST. Does GDPR bring in requirements to secure data?

Matt:

Yes. Well, indirectly. Again, in order to have privacy, you have to have security. And so there's a requirement.

Izar:

I know that, you know that, but do the people who read the GDPR know that?

Matt:

uh,

Chris Romeo:

I don't think so

Matt:

In my experience, not directly. It depends, right? I guess on which lawyer you're talking to.

Izar:

As always. But this is a clear call to action, right? This is saying, hey, there has to be something attached to securing personal data.

Matt:

There's a clear call to action to have legislation, and this is a policy direction that says: we'll support legislation if you put something through Congress, and, oh, by the way, in order for us to be a leader here, these should be the things that are in this legislation. Right? So the legislation could just be a liability framework around privacy loss.

Izar:

But when you attach it to NIST, aren't you automatically proposing something that's a bit more doable, a bit more tangible in terms of action?

Matt:

If you consider the SSDF as tangible and actionable, then yeah. And generally that's true; I think the SSDF is a good thing. I'm gonna go out on a limb here and say it'd be nice if NIST got its act together with all the various initiatives that they have. They seem to be a little bit disjointed at times, and there's an opportunity for them to shore that up. I think they're doing a better job over time. CISA and NIST together have done some good stuff.

Chris Romeo:

Yeah, it's net...

Matt:

yeah

Chris Romeo:

They're not making the world a worse place. They're making the world a better place.

Matt:

Well, it's when you get contradictions, I think, or confusions. I'll throw out an instance: you have the Cybersecurity Framework, which is not a development framework, right? It's primarily an operational framework. Then you have the SSDF, which is based on that in terms of structure but uses development terms. And then separately you have NIST 800-160, which is all about secure systems engineering. And they aren't necessarily always aligned.

Chris Romeo:

Yeah. I

Izar:

Again, I marvel at your capability of keeping all those numbers aligned, and dashes and... it's...

Chris Romeo:

Like, NIST 160.4. Um...

Matt:

Special publications, 1800.

Chris Romeo:

That's right. People used to count sheep in the olden days. Now we count NIST documents and...

Izar:

Because today we are the sheep. Anyway...

Chris Romeo:

And that makes... you're making my point. See, let the audience recognize: Izar has come around on the whole personal privacy thing, 'cause he did admit that he is the sheep. Okay, moving on. All right, let's talk about 3.3, 'cause this is probably...

Izar:

Wait, wait, wait. We're going to jump over 3.2?

Chris Romeo:

I'm trying to... I want to talk about liability, but okay, let's talk about 3.2 real quick: drive the development of secure IoT devices. What are your thoughts there, Izar?

Izar:

I'm just asking myself why call out fitness trackers and baby monitors?

Matt:

Why restrict it to IoT devices? I guess maybe it...

Izar:

Like, do the whole thing everywhere.

Chris Romeo:

Well, IoT devices do have differences that I think require being called out, given that even we as security people have them in our homes now, and they live in a different... yeah, I mean, we're wearing them, they're everywhere. It's become part of the average home to have a Nest or an Ecobee or whatever.

Matt:

But why not these things too? These aren't IoT devices.

Chris Romeo:

Yeah, they

Izar:

It looks like... a Huawei. Um, anyway...

Matt:

Not... speaking of, speaking of Google...

Izar:

But Matt, going back to the numbers and everything: this thing mentions the IoT Cybersecurity Improvement Act of 2020, which, apart from being an awesome name for a document, I never heard of.

Matt:

I have never heard of it before either.

Chris Romeo:

No,

Izar:

I, I

Matt:

And I've been in companies that do IoT development. I don't recall that.

Izar:

I totally thought that if anybody would know, it would be you. Okay. Three point...

Chris Romeo:

Okay, let's talk about this whole idea of shifting liability for insecure software products and services. Let me give the standard disclaimer: Matt and Chris are not attorneys, nor have we gone to law school. So you should speak to your attorney if you do want to take anything that we say; your attorney needs to review it and then counter whatever we...

Matt:

Oh, actually, sorry, I need to go back to 3.2. I'm sorry for derailing that.

Chris Romeo:

again, but I mean,

Matt:

Okay, fine. So there were actually two parts of 3.2 that were kind of important. As Izar called out, there's the IoT Cybersecurity Improvement Act that none of us have ever heard of, which is interesting. But the two parts: first, secure by default. We have a link to the CISA document, which is secure by design and secure by default, two secure design principles that should be followed and are taking on more of an emphasis given the cybersecurity strategy, and 3.2 critically calls that out. So again, having a system, an IoT device, a smart TV or smart toaster, your smart oven, your smart whatever, be secure by default rather than as an opt-in. You have to opt out of security, as opposed to the other way around where you have to opt into security. That's a critical key concept there. And I think the other key concept of interest here is the labeling discussion. We may table that for some time in the future, but labeling is an interesting concept, sort of like nutrition labels; I think that's where it started from. When you buy a device, how does it compare to other devices on the market? If you're looking at a smart toaster and you have three smart toasters to choose from, you can look at the label and go, oh, this one's less fattening and this one has less sugar. Obviously in security and privacy terms instead.
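
As a concrete illustration of the opt-out-of-security idea Matt describes, here's a minimal configuration sketch. The device, the setting names, and the values are all hypothetical; the point is only that the shipped defaults are the secure ones, and weakening them is a deliberate, visible act.

```python
# Hypothetical device configuration where security is the default state.
from dataclasses import dataclass

@dataclass
class DeviceConfig:
    require_authentication: bool = True   # secure out of the box
    tls_only: bool = True                 # no plaintext fallback by default
    auto_update: bool = True              # patches arrive without user action
    remote_debug_enabled: bool = False    # risky features start disabled

shipped = DeviceConfig()                  # the trustworthy starting state
# A user must explicitly opt OUT of security, and it shows in the config:
lab_unit = DeviceConfig(remote_debug_enabled=True, auto_update=False)
```

Flip those defaults and you get the opt-in world the strategy is trying to end: the product works immediately, and security is homework left to the buyer.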

Chris Romeo:

That's what UL was trying to do, trying to build that standard, that UL-approved mark from a security perspective. If anything, they were a few years too early; now is probably more the correct time. To your point about labeling: what is labeling there for? It's not there for the three of us. We can go Google some design documents... oh, look at me, I used the word Google. Sorry, I take that back. We can go search, ask ChatGPT, to give us some details and analyze what an IoT device is doing.

Matt:

You mean ask Bard?

Chris Romeo:

Do that, right? Yeah, ask Bard. There you go. Come on, Matt, bring it back around here. But think about the average consumer, think about our parents. If people from our parents' generation are going out to buy a technology device, they don't know what the heck they're getting. They need something that's a stamp, a seal of approval. Not a Good Housekeeping seal of approval, but, you know, a good security seal of approval.

Matt:

Well, actually, not even a seal. A seal of approval suggests that somebody made a decision that it's good enough for them, as opposed to an information sheet that lets you compare apples to apples. Because how do you get through the marketing? If you look at the marketing for, we'll just say, Samsung versus Apple phones for instance, they use different terms for the same thing. We know they're using AES-256 encryption, securing those keys in a secure element, et cetera, blah, blah, blah...

Chris Romeo:

I think because on the iPhone they're using terms like best of breed, better than the other thing Matt was just talking about, all those. Yes, I'm an Apple fanboy and I admit it. So let's talk about liability. Okay.

Izar:

Oh, great. Security by marketing.

Chris Romeo:

Hey, we're all in sales and marketing. If you don't agree, or if you don't understand, you just haven't figured it out yet: either you're selling something to someone else, or they're selling something to you. You heard it here last, because it's been said in movies and whatnot before. But 3.3, shifting liability for insecure software products and services. Just at the highest level, what's your gut reaction to this, Izar? When you see something like that, I know it's generating a thought inside your head: is this the right stance for this document to take?

Izar:

Yes, then no. I think that, yes, it's the right way to go, because to tell you the truth, I can't imagine a single industry that's less responsible for the things they produce than we are. Right? We release these things into the world, and you use them at your own risk, and it's probably written somewhere in those licenses, like Windows and whatnot, that whatever happens, happens, and it's your problem, not ours. So yeah, I would love to see some liability changing places here. Unfortunately, I have to look at what happened, and what I understand could be wrong, in the medical industry, when liability became a real thing and became an extremely tedious thing, and doctors, I understand, have to pay through their ears and noses for insurance against liability. Right? Like everything, it's going to become an industry, and a lot of people are going to make a lot of money out of it. And I worry: is it going to bring something new, something different, or is it going to be more of the same with different names? And every time you have to litigate something like this, it's going to take years. So it's not going to go...

Matt:

I think the biggest problem I have with these statements is the word reasonable: fail to take reasonable precautions. If that is left vague, then the fear Izar is highlighting, I think, can be real.

Chris Romeo:

Well, they describe it further down, right? They talk about how this safe harbor will draw from current best practices for secure software development, such as the NIST Secure Software Development Framework, and it must evolve over time, because things are changing. So it doesn't sound like the idea is to just decide what some people think is reasonable, because, like we said, the three of us could sit here and argue about what's reasonable. We could do a whole episode on reasonable software security and we would argue different things. We'd all say threat modeling, yes, pole position for threat modeling, but the rest of the stuff... I mean, we'll definitely queue that up as an episode. But just as a teaser, I'm reaching the opinion where I don't think DAST is worth much in the world, and I'm just gonna leave that there for people to send me hate mail after they send Matt hate mail for the previous thing he said. But...

Matt:

It's our start

Chris Romeo:

You might have a difference of opinion, though. That might be one where you're like, oh no, DAST is super important, or whatever. But the point being, we're not gonna agree on what's reasonable ourselves, but we can look to something like the SSDF and say, okay, we can all get behind this as a baseline. It's got the right goodness in it. We might not agree with everything it says, but at least it's a structure, and the things in it we really believe are...

Izar:

Look, reasonable is always going to be a balance between cost and benefit, right? And I think it's fair to say that what's reasonable for one person is less reasonable for another. That's where my... yeah, I mean, Apple can do a lot of stuff that three guys in a garage can't.

Matt:

Yeah, I don't think that's the reasonable part, though. They're not looking at cost-benefit at that level.

Izar:

But that's how it's going to be interpreted. They're not looking at it that way, but afterwards someone will come to Apple and say, oh, you have a problem, you have an issue in your enclave chip; with all the test suites you had, you forgot to test this thing. And they'll say, well, that's reasonable, because, you know, I already have a test suite, I already have the test harness and everything; yeah, we forgot, sorry. And then can you take that exact same case and try it on the three guys in the garage? Listen, because Apple is doing it, that means it's reasonable, so you should be able to have exactly the same setup and exactly the same level of testing. You see what I mean? For them, that's just not going to be reasonable. They don't have everything they need for that thing to exist, even if they are making a cell phone. So it could be the same product, different levels of...

Matt:

Ability

Izar:

Thank you. So it goes to that. And again, I'm just afraid that it's going to go the medical way, and at the end of the day, who's going to make money out of it is the lawyers. The consumer is going to be left in perhaps a bit better place, but there's not much opportunity for a shift, because those things take a long time to happen. The burden of proof is... I don't know.

Chris Romeo:

I mean, the first

Izar:

I, I like what I hear, but I'm not optimistic.

Chris Romeo:

The first case is gonna be one of those landmark cases. Somebody's gonna get fined or something for a lack of proper software security discipline, or a lack of reasonable software security, and they're gonna end up going all the way up to the Supreme Court to argue about it: how did you determine that? What are the fines? I don't see this having an impact in the next few years. It's probably further out.

Izar:

I'm happy for Gary, 'cause he's going to make a lot of money as an expert witness. But you know, at the end of the day, what's going to move the needle?

Matt:

I guess so. Interestingly enough, they used some pretty particular language in here about liability, right? Establishing a liability framework; we talked about that previously, that was 3.1 also. And then it says legislation should prevent manufacturers and software publishers with market power from fully disclaiming liability by contract, and establish, the particular term here, higher standards of care for software. So that is the medical device scenario, right? An organization that has a lot of market power, meaning a large customer base potentially, or a technology that is fundamental to the operation of things for a large set of users, or for a lot of data potentially, will be held to a standard that's potentially higher than for others, I think is the...

Izar:

You jumped over the end of the sentence: in specific high-risk scenarios. That's already qualifying it to something intangible, something somebody is going to have to decide: what's the amount of risk that makes it a specific high-risk scenario?

Matt:

Well, this is the policy part that gets translated to, um...

Izar:

And Gona agrees with me

Chris Romeo:

What's just happened here is one of these big multinational corporations has been listening to this, violating our privacy, and now they're at Matt's door.

Izar:

again.

Matt:

I'll go meet the NSA. I'll be right back. You guys talk amongst yourselves.

Izar:

Yep.

Chris Romeo:

This

Izar:

I mean, I get what Matt's saying, okay, and that's right: market power. I would guess that somebody going, I don't know, against Apple because of something they forgot to do on the iPhone, we're talking about billions of users here; I don't know the numbers, but I'm guessing. But then somebody's going to come and say, okay, it's billions of users, but the risk is not that big, because what got breached was just a number of people, and now we have fixed it and everything. So we reduced the risk because we addressed the...

Chris Romeo:

Hmm. Yeah, it's gonna be an interesting number of years here as this peels out into something that's tangible. Liability is... I don't know. Hopefully it'll have the impact of scaring some of these companies straight. You know, you take young people to the prison, and they get scared about the potential outcome of being in prison, and then they change their ways. I don't know. I mean...

Izar:

Did GDPR do that?

Chris Romeo:

I think it did, because at least it forced compliance with the individual requirements: the right to be forgotten, the right to... I don't remember what the other tangible rights were. I think companies in general that service the European Union follow that guidance. They do live with the changes they made to their systems to support that.

Izar:

So what I'm hearing you say is that today everybody in Europe is GDPR compliant?

Chris Romeo:

Yeah,

Izar:

What? What's the amount of compliance?

Chris Romeo:

Yeah, I don't know what the number is. I mean, I would think it's probably pretty high, because even for companies here in the US, when we were dealing with it, there was this whole safe harbor thing...

Izar:

Mm-hmm.

Chris Romeo:

...where, as a US company, we could go through... there was some reciprocating agreement between...

Izar:

Mm-hmm.

Chris Romeo:

...the US and the EU. It's now been nullified, so it doesn't matter anymore. But we could go through it and basically self-attest that we were complying with all those requirements on behalf of users from the European Union. So yeah, I would say that the compliance level is probably high. And I think the only real value of GDPR is there's still the fear of the hammer. They could...

Izar:

This year.

Chris Romeo:

...hit one of these big companies with the hammer for five, ten billion, that they...

Izar:

I think I just got too cynical for this thing, 'cause I hear self-attestation, fear of the hammer... it's all up there. It's one more risk, a risk probability, right? It might happen to us, might not happen to us; we're willing to put up with the cost of it happening. Self-attestation: we investigated ourselves and found ourselves not guilty.

Chris Romeo:

That's only for US companies. That wasn't...

Izar:

Right?

Chris Romeo:

That was for us. That was how they tried to bridge the gap into the US. Turns out the EU Supreme...

Izar:

So it's a measure that can be found to be acceptable as a solution to this problem, because it was tried and it happened.

Chris Romeo:

I mean, it's

Izar:

So attestation is something that might happen, right?

Chris Romeo:

Yeah.

Izar:

It's an accepted level of protection. Let's...

Chris Romeo:

yeah, I mean,

Izar:

They self-assessed it, that's...

Chris Romeo:

I mean, nothing's gonna provide perfection, right? Like...

Izar:

No, not based on...

Chris Romeo:

But we're cynical. I mean, we're cynical security people, yes. And so we're all going to probably be too cynical about hope for the future and how these things could have a positive impact. I think there's gonna be some type of happy medium, though. It's not gonna be exactly what this document says; we've gone through this whole document, and the United States and companies that support the United States are not gonna shift to a hundred percent compliance with all these things they've said. But they're also going to do some things to get better as a result of whatever comes out of this document. And so, just like everything else, it's gonna land in the middle somewhere.

Matt:

What'd I miss?

Izar:

You touched... not much, but you touched on something very interesting to me. You said the whole world is going to do something about this. And I dunno if you meant the whole world or not, but...

Chris Romeo:

I meant not everybody; all the people that are technically governed according to this national cyber strategy.

Izar:

But we have five minutes here, and I would love to hear you guys tell me something about this. What would be the impact of this policy on the rest of the world? Think in terms of how GDPR impacted us here; you pointed to a good number of cases. How is this going to impact the rest of the world? What's China going to say about privacy?

Chris Romeo:

Well, I think you gotta segment the rest of the world into a few categories. We have friendly partnerships: there's the Five Eyes, you know, that share intelligence. Then there are the NATO member countries that are also more friendly towards us. And there are also a couple of other categories, including the seven countries, I think it's seven, that you can't even ship crypto to at this point. Maybe that's changed. For our friendly countries, I think this is gonna help pull them along. And Matt's brought up that CISA document, the secure by default one...

Izar:

But it's not.

Chris Romeo:

that's signed by a number of other countries

Matt:

Yeah, so specifically it brought together CISA, NSA, and the FBI within the US, the Australian Cyber Security Centre, the Canadian Centre for Cyber Security, and the UK's, Germany's, the Netherlands', and New Zealand's CERTs and NCSCs.

Chris Romeo:

And the Five Eyes, plus a couple of other ones.

Izar:

Which by itself is a huge accomplishment, let's be honest.

Matt:

So as a position, it is the United States becoming more... equitable is the wrong word... sort of in line with international norms, or setting an international norm. I will be careful: not universally, right?

Izar:

No, that's a given, 'cause those are US-led organizations. But let's address the dragon in the room: what's China going to do about this, if anything?

Matt:

Who

Chris Romeo:

think they're gonna do

Matt:

Who cares? Why is that...

Izar:

I'm worried about the data that New Zealand has on me.

Matt:

So the United States has always set a standard that other countries can choose to emulate. And I wanna be careful here: I think we set a standard that we hope others emulate. In many cases we're trailing, not leading, and playing catch-up, especially in terms of privacy and security. Well, for some things in security, certainly. But then you have other parts of the world that are going to follow their own paths or have their own standards, and similarly be either leaders or followers in their respective areas, with a set of folks who follow them. And it's not like we're talking about nuclear non-proliferation.

Izar:

Oh, isn't it? Or is it?

Chris Romeo:

It's a few steps below. I mean, I did learn about those fancy defense words you guys were teaching me that were in this document at the...

Izar:

But no, think about it. We've already seen companies having the US market closed to them, and by extension all those other markets that you guys mentioned. Could this policy be used as something like a hammer for a new economic policy, where we say, hey, we're not going to use any service or anything from that specific country or...

Matt:

So

Chris Romeo:

to

Izar:

I'm just drawing that.

Matt:

Well, I think what you're alluding to now is what shows up in section four, which we weren't gonna talk about. Section four is securing the technical foundation of the internet...

Chris Romeo:

which we don't

Matt:

Specifically, this is implementing technologies that, if other countries and companies don't adopt them, they won't be able to play.

Chris Romeo:

Now I wanna come back and address Izar's original point, 'cause I just thought of something from my professional experience that plays into this. When I worked at Cisco, I was part of the global certification team. We were responsible for Common Criteria and FIPS certifications, which were US government certifications but were also recognized by some of the other partner countries around the world. An interesting thing happened, though: China copied those standards but made their own version. For the most part they said a lot of the same things; they literally copied the document and put their own title page on it. So that was an example where, yes, we can think there's this big disagreement between the US and China on so many different levels, but there was an example where China took something, and Common Criteria was an international standard that many countries poured into, and yes, they weren't willing to sign on and say they were participating in the international space, but they were willing to copy it and use it. They had their own version of FIPS 140 as well, which said the same stuff that NIST FIPS 140 says. So this could have some positive outcomes, I think, for the global community, even some of those that we think of as maybe not so friendly today.

Matt:

Well, specifically, since we're now in 4.1, I'm gonna highlight paragraph two: as autocratic regimes seek to change the internet and its multi-stakeholder foundation to enable government control, dot dot dot.

Chris Romeo:

Yeah.

Matt:

So

Chris Romeo:

That's part of it. The internet does not belong to the United States government, even though ARPANET was originally created...

Matt:

And built NSFNET and all the rest. Yep.

Chris Romeo:

That car has left the garage, and there's no reverse gear on the internet. So, I know we're about out of time for today. We've spent a lot of time poring through this document, and I've really enjoyed this conversation. It's really made me think about some things that are outside of product security, where I tend to live and where all of my thinking goes. I've really enjoyed thinking big picture, and hearing your experiences and your knowledge about how we apply this to geopolitical issues. It's...

Izar:

I learned a lot about it. Thank you.

Matt:

It was fun.

Izar:

really

Chris Romeo:

Now, one thing we are gonna do is have a future episode called Reasonable Software Security, and it's gonna be a battle royale where we argue about what is reasonable in software security. So folks, Reasonable Software Security will be a future episode here. And with that, Matt's dog would like to tell you: goodbye.


Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.


The Application Security Podcast

Chris Romeo and Robert Hurlbut