The Security Table

Privacy and the creepiness factor of collecting data

Tania Ward, Izar Tarandach, Matt Coles, and Chris Romeo

What is privacy, and how does it intersect with security? We are joined by our first guest, Ally O'Leary, a privacy compliance expert. Ally works for a consumer electronics company, ensuring compliance with global privacy laws and acting as a data protection officer.

The episode delves into the intersection of privacy and security, with Ally explaining how these two areas often go hand in hand. She emphasizes the importance of understanding the definition of personal information and being aware of where such data is stored within a company's systems.

A significant part of the discussion revolves around why security and privacy are two different functions within a company. Ally explains that privacy is a relatively new concept for most companies, often triggered by regulations like the GDPR. She also mentions that privacy often becomes part of the legal function due to the close work with attorneys to interpret laws.

The conversation also touches on the challenges of data governance and the importance of proper data ownership on the business side. Ally highlights the need for regular reviews of data flows and audits to stay on top of data governance.

Towards the end of the episode, Ally advises security professionals on when to involve privacy experts in their processes, especially during the development life cycle. She encourages security professionals to notify their privacy colleagues about any projects or initiatives that might impact systems containing personal data.

Overall, the episode provides valuable insights into the world of privacy compliance, the relationship between privacy and security, and the role of data governance in protecting personal information.

FOLLOW OUR SOCIAL MEDIA:

➜Twitter: @SecTablePodcast
➜LinkedIn: The Security Table Podcast
➜YouTube: The Security Table YouTube Channel

Thanks for Listening!

Many Things About Privacy


[00:00:00] Chris Romeo: Hey folks. Welcome to another episode of the Security Table. This is Chris Romeo, and I am, as always, pleased to be joined by my friends Izar Tarandach and Matt Coles as we sit and ruminate around the Security Table. Well, this is kind of a momentous occasion for us. We have our first ever guest, and no, it's not because this is the first time someone actually agreed after the many, many efforts we made of begging and asking and pleading.

[00:00:39] No, no, no. We're super excited to have Ally O'Leary with us. I'll let Ally tell you what her expertise is and everything.

[00:00:49] Ally O'Leary: Yes. Hi. Thank you for having me. I'm very excited. So I work in privacy compliance for a consumer electronics company, and it's my responsibility to ensure the company is in compliance with global privacy laws. We have a global privacy program, and we've got a small team that works for me. I'm also a data protection officer, which I don't know if you guys are familiar with, but a lot of privacy regulations require that role. In a nutshell, it's someone who's supposed to remain independent and always be thinking about the individuals whose data you're collecting, employees, consumers, whatever it is, to make sure that you're honoring their rights and their freedoms. So you get to remain an independent person who gets to poke at what folks are doing and see if it's actually the right thing on behalf of those folks. So that's what I am. I've been in this role for about five-ish years now, but prior to that I did a brief stint in security. So I have some knowledge of it, and privacy and security go hand in hand, so having that little bit of background has definitely helped me in my current role today.

[00:01:54] Izar Tarandach: So many questions.

[00:01:56] Chris Romeo: Yeah, me too. That's what I'm thinking. There are so many things I want to ask. Where do we even

[00:02:01] Izar Tarandach: So many.

[00:02:02] Chris Romeo: begin? So we wanna focus our conversation on the world of privacy by design. Between Matt and Izar and myself, we spend a lot of time thinking about software applications, hardware, developers building stuff. And so when we think about privacy by design, I feel like this is something I don't know enough about; I'm just gonna put that on the public record here. So maybe you could start out, Ally, just by laying a foundation for us: what is privacy by design, and why do we care about it? And let's see where we go from there.

[00:02:42] Ally O'Leary: Sure. So it's based on the same principle as security by design, which is that the privacy of the individual is not an afterthought. You're always considering it and developing requirements throughout the entire development life cycle. It's not that you think of all the requirements at the beginning of the project and then maybe come back at the end to make sure they're right.

[00:03:01] You should be having that conversation the whole time. And so some things that you have to think about are the legal and ethical considerations when it comes to collecting personal information. Most privacy regulations have the same set of standard privacy principles. There's notice and consent: making sure people are aware of the data you're collecting, and that you get consent to that collection if needed. When you collect that data, you need to have a specific purpose and use for collecting it. You can't just collect all of this data and then decide, oh, we want to collect these other data elements too, because we might use them a year from now. So I have a lot of conversations with, not security, sorry, IT teams too, to vet the different data elements they're gonna be collecting. That gets at the data minimization principle of most privacy regulations. There's also the entire data lifecycle management. Most people like to collect data, but once they're storing it, they don't like to think about, one, how they secure it, and two, how long they're gonna keep it. When are they gonna delete it?

[00:04:01] Data gets stale all the time. We should be thinking about that. Then there are the rights of individuals. Individuals have the right, in some cases, to get a copy of the data a company holds on them, to delete that data, or even to update that data, and you have to build in processes for those. A lot of the time you have to build that as you're building the system. And then another thing that's always a consideration, there are a number of things, but the big one that's helpful at the beginning of a project versus the end, is something called a cross-border data transfer. If you're collecting data from an individual residing in one country and your IT system happens to be in another country where you're gonna store it, there are actually some requirements when it comes to where you store that data. And so of course you'd like to have those conversations upfront or along the way, because there are times when the IT teams have decided, we're gonna store this data in the US, and at the end of their project, right before they go live, I don't wanna come in telling them, sorry, nope, and now you have to do a bunch of rework.

[00:04:54] So it's all about ensuring the privacy of the individual, with the actual person you're collecting the data on always at the forefront, but also making sure the company is in compliance and there's no rework that these poor IT or software teams have to deal with.
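
Ally's purpose-and-retention principles translate naturally into code. Here is a minimal sketch of what collection vetting and retention checks could look like; the field names, purposes, and retention windows are invented for illustration, not taken from the episode or from any particular regulation.

```python
# Minimal sketch of two privacy-by-design checks: data minimization
# (every field needs a declared purpose before collection) and lifecycle
# management (every field has a retention window after which it should be
# deleted). All names and periods below are hypothetical.
from datetime import datetime, timedelta, timezone

# Hypothetical data inventory: field -> (declared purpose, retention period)
FIELD_POLICY = {
    "email":     ("account login and service notices", timedelta(days=730)),
    "ship_addr": ("order fulfillment",                 timedelta(days=90)),
}

def vet_collection(fields):
    """Reject any field with no declared purpose -- this blocks the
    'collect it now, we might use it a year from now' pattern."""
    undeclared = [f for f in fields if f not in FIELD_POLICY]
    if undeclared:
        raise ValueError(f"no declared purpose for: {undeclared}")

def retention_expired(field, collected_at, now=None):
    """True once the retention window for this field has lapsed and the
    value should be deleted rather than kept indefinitely."""
    now = now or datetime.now(timezone.utc)
    _, keep_for = FIELD_POLICY[field]
    return now - collected_at > keep_for

vet_collection(["email", "ship_addr"])           # passes
# vet_collection(["email", "browsing_history"])  # raises: no declared purpose
```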

[00:05:10] Chris Romeo: Yeah, and that's something we're always happy to hear on the software side: being able to minimize the rework. I could say, from my privacy journey, call it that, I encountered privacy with GDPR. Building a product, having a startup that was building a product, bumping up against GDPR and the, what was the thing that they called it?

[00:05:36] Privacy Shield, that was the thing we could sign up for here in the US that was supposed to cover us. So that was really my exposure to privacy. I'm curious about Izar and Matt's perspective. Now that we've got this foundation of privacy by design and the things that privacy teams are doing, what's been your experience, Matt and Izar? Have you worked with privacy teams before, in trying to solve the problems that Ally's talking about here?

[00:06:03] Izar Tarandach: You go first, cause I'm gonna take this somewhere completely different.

[00:06:10] Matthew Coles: Uh, yes. So first off, I'll level set that Ally and I have worked at the same consumer electronics company, so I do have some experience working with my peers and partners in privacy. It is definitely a different mindset, and definitely an interesting tug of war, I guess, between who owns what and who does what. But in the end, obviously security has to be able to support privacy, because you can't have privacy without security. Although I imagine the reverse is not true, and actually I wanted to get your opinion on that. I think you cannot have privacy without security, but can you have security without privacy?

[00:07:01] Ally O'Leary: When the data is not personal information, sure. I don't necessarily care about highly confidential IP data that a company may have, and you need to secure that. But if you're securing personal information, you can't not have privacy rules associated with it. I view it as a Venn diagram, right? Security is very focused on the requirements associated with the very important data that a company has; privacy is focused on a narrow set of that data, personal information. But we're more focused on, like I said, those legal and ethical considerations around use, collection, storage, the whole data life cycle. And so really the overlap of the Venn diagram is the security of that personal information. That's how I view it, at least.

[00:07:46] Chris Romeo: So let me throw this out just before Izar goes, and he's gonna take us in a different direction, but I have a similar direction. I think this could be a bit of a controversial question, so that's why I'm gonna throw it out on the security table, because that's what we do here: we throw controversial things out and see what happens.

[00:08:02] Why are security and privacy two different functions inside of a company? Why are these things not combined? Why don't we have a single group that serves both of these purposes together?

[00:08:16] Ally O'Leary: That's a valid question. Yeah. I will say, at most companies privacy is new, and at most companies it was the GDPR that forced them to stand up a function. For a lot of folks I've talked to, privacy ends up being in the legal function, just because you have to work so closely with attorneys to interpret the laws, and it's a de facto, let's throw them in this organization. In other cases it ends up being folded under an information security team, in a GRC function. Unless you're one of the big players like Microsoft, I don't think any company has really figured out the best way to do it. And I would argue the same problem exists with data governance, the teams that are responsible for ownership of data, data flows, data flow diagrams. All three of security, privacy, and data governance kind of fit together, and there's no one right answer. I personally have never experienced any fighting between privacy and security, though, where we've had to go to our different management streams and fight over something. We've always worked things out. So I'd be interested if you guys have other experiences in that regard.

[00:09:23] Chris Romeo: I can't say I have knowledge of any fights between them, but it's always been, for me, one of those things I'd look at and go, why are these separate? Because the people you're trying to influence, at least from a privacy by design perspective, are the same engineering teams. I'll reference my time at, I'll just say where I worked, cuz everybody knows Cisco. So I worked at Cisco, and one of the pushbacks we would get, in regards to security even, was product teams saying, you're coming at us from 27 different directions. You gotta get your story together.

[00:10:01] Come at us from one angle. We want to do the things you're preaching and prescribing for us, we think they're a good idea, but you're sending 27 different streams at us simultaneously, and we have to dedicate individuals just to manage all the noise coming from you. And so I think about security and privacy being one stream that could go to the engineering team, potentially, to save additional overhead. But it seems like from your perspective, you're used to privacy teams sitting under legal. Security teams, I don't think I've ever reported to legal.

[00:10:35] I've never seen a company where security reported to legal.

[00:10:37] Matthew Coles: Actually, at the company I work for now, our security team is under legal.

[00:10:44] Izar Tarandach: So it turns out that if I wait long enough, I don't even have to take things in a different direction, because somebody else is going to bring it around.

[00:10:52] Matthew Coles: Oh, is that the bombshell you were waiting for?

[00:10:54] Izar Tarandach: Yeah, yeah. No, no. You guys sprinkled the grenades while I was playing with the nuclear bomb. But the thing is, the other day I was working at this place, and of course one of my tasks was to train people, raise awareness, and all that good stuff. And one of our engineers caught me at lunchtime and told me that his manager had turned to him. The guy had been doing governance and stuff like that, and his boss turned to him one day and said, mazel tov, now you are a privacy engineer.

[00:11:26] And he looked at me with puppy eyes and said, could you please tell me what the hell privacy is? And it seems to me like we've just been assuming, since we started talking, that everybody knows what privacy is. Can you please give me a definition of what privacy is?

[00:11:45] Ally O'Leary: Yes. I would say, in layman's terms, it's the creepiness factor of whether or not you wanna collect all this data on individuals and what you wanna do with it. That's the simple version; if I had to tell my sister, that's how I would explain it. But it really comes down to your legal obligations and then your ethical considerations for when you wanna collect data and what you wanna do with it, and that's why it's all about the data life cycle management. And Chris, you were just talking about how it seems like everyone's going to the same teams with the same requirements, and that is true. So I have a couple of comments about that. There are obviously different gates and checkpoints in the software development lifecycle where I think it makes sense to have both a security and a privacy person in those meetings, as part of those same conversations. That makes complete sense. But I also think, and I'm not sure about you guys, that security doesn't necessarily work at the business function level, with business processes. For

[00:12:40] example, sales initiatives, marketing initiatives, those are all collecting data, but it's not necessarily a new system that we have to make sure has security requirements implemented in it. So I actually have to talk with those people, understand their business processes, and ask them, well, how are you planning to use this data? And then also, once we have the data secured, how is this other group gonna use the data? Have we disclosed that use to our customers? Do people know how we're gonna use it?

[00:13:02] Can we even use the data in that way? So really, I'm more concerned about the use of data, the spreading of data, and who has access to it for that purpose.

[00:13:13] Izar Tarandach: So

[00:13:13] Chris Romeo: Hmm.

[00:13:13] Matthew Coles: So would you say, sorry, would you say that security is focusing primarily on technology stacks and some process, whereas privacy is focusing on people and functions and processes?

[00:13:28] Ally O'Leary: Yes, I think that's accurate.

[00:13:31] Chris Romeo: It's always funny to think about this. So when we're attached to the engineering team and sales calls us, we're like, my phone is, my phone apparently is not working. My Slack or Teams is down.

[00:13:46] Izar Tarandach: I'm going into a tunnel

[00:13:48] Chris Romeo: Yeah, I'm going into a tunnel. And that's kind of our general approach. And it's always funny, because every place I've been, there's always been this weird kind of relationship between engineering and sales. I don't even know why it always exists. Like, I don't know why people are always like, oh, it's sales, you know, come on, we know we don't wanna talk to them. But it sounds like from the privacy perspective, you're thinking about all the things that salespeople are typing into their BlackBerrys, yes, I said BlackBerrys, and storing somewhere, into some data source somewhere. And so yeah, that's an interesting perspective, Ally, and one that I've never thought of. I'd never realized and thought about, oh wait, you have more than just these software pipelines and source control systems' contents to worry about. You have all the things that salespeople are doing along the way.

[00:14:40] Izar Tarandach: So now that I know that privacy is basically the creepiness factor of data, my next question is, Chris said, can there be privacy without security, or security without privacy?

[00:14:53] Matthew Coles: I asked that question actually, but that's okay.

[00:14:55] Izar Tarandach: And now I am going to scope that down. As we know, security likes to say that we have these top values that we try to work with, CIA: confidentiality, integrity, and availability.

[00:15:07] And I want to focus on confidentiality. Can you have privacy without confidentiality, and confidentiality without privacy? How do the two values link? Do you see them on the same level today, meaning now we have CIA-P, or is P still growing up?

[00:15:28] Chris Romeo: P is silent.

[00:15:30] Izar Tarandach: P is silent...

[00:15:31] Ally O'Leary: So it's interesting, because if you hold personal information about consumers, employees, whatever it is, you need to keep that confidential, and there are rules around how you protect and keep it. But there are other scenarios in the privacy world where companies may wanna scrape publicly available data that might not naturally be confidential, and there are obviously considerations around scraping data anyway, but you may wanna use it for marketing initiatives, or to segment and profile people, things like that. So once you bring it in-house and you're tying it to individuals, then it becomes confidential.

[00:16:13] So I think it's still part of that confidentiality umbrella. I would love for there to be a P; obviously I'm always advocating for privacy and for more and more people to learn about it. But yeah, I think it ultimately does fall under confidentiality.

[00:16:29] Izar Tarandach: Thanks. 

[00:16:29] Matthew Coles: Can we extend that question a bit, then? Let's not leave out I and A in those conversations. So if confidentiality evolves as you go from simply data to data that is scoped within privacy, what about integrity and availability? If I recall, there's something around loss of data that you hold on a user. Is that really a privacy problem, or is that really a security problem still?

[00:17:04] Izar Tarandach: Hmm. 

[00:17:05] Ally O'Leary: Both. Both, because if you lose data, then you're at risk of having a significant breach, and breach is very specifically defined within regulations and carries obligations. You have to report to regulators within a certain time period. You may have to notify the people whose data has been lost. So it goes hand in hand, but there are so many more legal obligations. Don't, don't, don't lose the data is what I would say. It gives me heart palpitations just talking about it.

[00:17:36] Matthew Coles: Well, I mean, losing data from a data breach, though, is arguably confidentiality, not availability, right? If somebody else steals the data, you've violated confidentiality.

[00:17:48] If a company has data on a person and they delete it by accident, it's availability. Correct?

[00:17:59] Ally O'Leary: Yes. And I mean, you'd have to think about the... go ahead. No, I didn't wanna...

[00:18:05] Matthew Coles: No, no, you finish your thoughts. Finishing thoughts is what we do here.

[00:18:09] Ally O'Leary: You'd have to think about the significant effects of losing that data to the individual. So for example, you are a healthcare company who has now just wiped out all of my medical records. That's gonna have a significant impact on me and my ability to get services. But if you have accidentally wiped out my Twitter account, it's not that big of a deal; I can recreate the account again. So I think it becomes more of a risk-based factor. Well, you might argue...

[00:18:36] Matthew Coles: Not Twitter. You can't get that blue check mark back, you know.

[00:18:43] Ally O'Leary: So yeah, I think it comes down to a risk-based decision about the type of data you have and whether or not it's available. There are other arguments, right? There are certain rights I mentioned earlier: you can get a copy of your data, delete your data, update your data. If you don't give me the ability to get a copy of my data because you've wiped it out, sure,

[00:18:59] you could have some legal implications there too. So you're right, all three of the CIA apply. Integrity too: the data has to be accurate. What you know about me should be accurate, and yeah, there should be good quality data. Yes.

[00:19:18] Matthew Coles: So I need to ask a question here, because it was initially confusing and then it was just a point of interest. Talking about personal data, and maybe thinking about GDPR and the various other schemes that are now popping up all over the place, every state seems to be coming out with its own. Seriously,

[00:19:39] is our email address really personal?

[00:19:43] Izar Tarandach: Good point.

[00:19:45] Ally O'Leary: So from a legal perspective, the answer is yes. I know you're saying everyone's email is just out there in the wild; that's probably where you're going with all of this. But if you were to look at it...

[00:19:52] Matthew Coles: It's also what we use almost universally for all authentication.

[00:19:57] Izar Tarandach: Yeah.

[00:19:57] Ally O'Leary: mm-hmm. 

[00:19:59] Izar Tarandach: Plus the schemes are well known. You always know that it's name at whatever dot whatever.

[00:20:06] Ally O'Leary: But you're tying that email address to a lot of other data that you have on this individual, and the moment you take that one identifiable piece of information and tie it to all of that other information, it all becomes personal data. And then it comes down to all the requirements under breach notification laws, which basically say, if you have email address plus X, Y, and Z, now you need to notify.

[00:20:24] So it's, it's definitely personal information from my perspective.
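
Ally's "email address plus X, Y, and Z" point is essentially a threshold rule, and a toy version fits in a few lines. The element names and the rule itself are simplified stand-ins for illustration, not any specific statute's definition.

```python
# Hedged illustration: many breach notification laws trigger when an
# identifier is exposed *together with* certain other elements. The set
# below is hypothetical, not taken from any actual law.
SENSITIVE_COMPANIONS = {"password", "ssn", "medical_record", "dob"}

def breach_requires_notification(exposed_fields):
    """An exposed email alone may be low risk; email plus a sensitive
    element typically crosses a notification threshold."""
    fields = set(exposed_fields)
    return "email" in fields and bool(fields & SENSITIVE_COMPANIONS)

print(breach_requires_notification({"email"}))              # False
print(breach_requires_notification({"email", "password"}))  # True
```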

[00:20:29] Chris Romeo: Let's not forget real quick, there are a number of email addresses people created, like at Gmail, that are super embarrassing now that they're adults.

[00:20:38] And so we wanna make sure those are, you know, super private and protected, because,

[00:20:43] you know, I'm trying to think of one.

[00:20:46] Izar Tarandach: What we want to be private here is the relationship, not the identifier itself. The string itself doesn't mean much. What matters is the relationship between that string and the rest of the data, or the person.

[00:21:00] So it's not like the email itself, the string of bytes stored, has some special value of privacy.

[00:21:09] Chris Romeo: You just mentioned something that I just happened to see scanning LINDDUN, because Ally, we're all good friends with Kim Wuyts, whose linkability...

[00:21:18] Izar Tarandach: Yes.

[00:21:19] Chris Romeo: I think that's what you just described. I was like, I think Izar just used something from LINDDUN, cuz I was just looking at it a second ago.

[00:21:25] So, Ally, give us a little more context on linkability, like in Izar's example here. It's the connectivity to the person that we're most concerned about, not the string of bytes and values.

[00:21:40] Ally O'Leary: Yes. It all comes down to identifiability. If you are able to individually identify a person through the data, then yes. And this is where privacy fights with development teams: you have direct identifiers and indirect identifiers. Direct identifiers are name, email address, things like that.

[00:22:00] And then indirect identifiers are things like IP address. There's a big argument in the world about whether or not cookies are personal information. But because it can all be tied and directly connected to one another, and you can make observations, profile people based on that data, or understand their browsing habits and all sorts of things like that, it's all personal information that could feel invasive, like you're monitoring them, like you're making automated decisions about things that they've done. So, linkability and identifiability as well.
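
The direct/indirect identifier distinction can be sketched as a toy classifier. The field lists below are illustrative only; real determinations depend on context and on what other data the fields can be linked to.

```python
# Toy classifier for direct vs. indirect identifiers. Real-world
# classification is far more contextual; this only illustrates why several
# indirect identifiers together are treated as personal data (linkability).
DIRECT = {"name", "email", "phone"}
INDIRECT = {"ip_address", "cookie_id", "device_id", "zip_code"}

def classify_record(fields):
    fields = set(fields)
    if fields & DIRECT:
        return "personal data (directly identifiable)"
    # Multiple indirect identifiers combined can single a person out.
    if len(fields & INDIRECT) >= 2:
        return "likely personal data (linkable)"
    if fields & INDIRECT:
        return "review: indirect identifier present"
    return "not personal on its face"

print(classify_record({"cookie_id", "page_views"}))  # review
print(classify_record({"cookie_id", "ip_address"}))  # linkable
```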

[00:22:31] Matthew Coles: So this is where the conflict, I think, comes in between security and privacy, right? Let's take authentication as an example. If we're gonna use email addresses as the source of identity for a system, your username is your email address. Obviously we want to protect the interface that we use for authentication so that credentials can't be stolen, but we're going to wanna make that association between a username and a credential, and a username and their set of roles, and we're going to log any attempted authentication by writing who tried to do what, which means writing their username to the log. Now all of that becomes in scope from a privacy standpoint, but we can't not do that, right? We can't not collect that data, cuz if we don't collect that data, we lose all the security capabilities. But by doing so, we now have legal implications.

[00:23:29] Ally O'Leary: So here's what I like to say about that. It's not that you can't collect data or do anything with that data, it's a matter of making sure we have the appropriate controls and other things in place. For example, all privacy laws say you have to have a use for the data you collect, a reason why you've collected it.

[00:23:46] And in order to retain that data, you also have to have a justification, and security purposes, for example, is one of those legitimate reasons why you can retain data. So security plays a big part in all of this, in how you can collect and use data, or at least in justifying the argument.
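
One common way to reconcile Matt's point (security needs the auth log) with Ally's (keep controls around personal data) is to log a keyed pseudonym of the username rather than the raw email, so security can still correlate attempts per account while the log holds no direct identifier. This is a sketch of that idea under stated assumptions, not a prescribed control; the key handling shown is deliberately simplified.

```python
# Sketch: pseudonymize usernames in authentication logs. HMAC with a
# secret key (rather than a bare hash) prevents reversing the pseudonyms
# by hashing a list of known email addresses. The key below is a
# placeholder -- a real deployment would keep it in a secrets store.
import hmac, hashlib, logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
LOG_KEY = b"hypothetical-secret-rotated-out-of-band"

def pseudonym(username: str) -> str:
    digest = hmac.new(LOG_KEY, username.lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

def log_auth_attempt(username: str, success: bool) -> None:
    # Same user -> same pseudonym, so failed-login patterns stay visible
    # without writing the email address itself into the log.
    logging.info("auth attempt user=%s success=%s", pseudonym(username), success)

log_auth_attempt("alice@example.com", False)
log_auth_attempt("alice@example.com", True)
```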

[00:24:03] Chris Romeo: I have another bombshell question, just to throw on the table here, to mix it up even more. Should we institute a GDPR-like piece of legislation here in the United States?

[00:24:21] Matthew Coles: Wow. 

[00:24:21] Izar Tarandach: Wow.

[00:24:23] Matthew Coles: Boom. 

[00:24:25] Ally O'Leary: Do you want me to answer that first?

[00:24:26] Chris Romeo: Please do.

[00:24:27] Cuz I actually do have strong opinions about this as well, but I wanna understand your perspective first.

[00:24:35] Ally O'Leary: So as the average consumer: hell yes, is what I, am I allowed to swear? Absolutely. I wish I lived in Europe right now. As a privacy professional, I will also say hell yes, because as someone who needs to interpret all of these, we're now up to, I think, 10 or 11 different state laws, they're all different, and they all have varying degrees of strictness too.

[00:24:58] And so it's almost like, oh man, I wish I lived in California now, not Florida, for example. And as a privacy professional, it's very hard, not hard, it makes it more complex, to digest all the requirements and make sure your program is in compliance, cuz most companies operate globally or across the United States, and there are just too many different states.

[00:25:15] So it would be great if we had one GDPR-like law that consolidated all of it. And if I had to request it: right now, California has the strictest privacy laws, so it should be based off of California, in my opinion.

[00:25:27] Matthew Coles: Do you have a good example of any conflicting privacy laws that you can...

[00:25:32] Ally O'Leary: Not conflicting ones, just weak ones. Like, why even put that into effect? It's not even doing any good, in my mind.

[00:25:41] Chris Romeo: So Izar, what's your take? I mean, do you think we should have a GDPR for the US? Like a version that applies across all 50 states?

[00:25:54] Izar Tarandach: What could possibly go wrong?

[00:25:59] Matthew Coles: All 50 states and the District of Columbia? Yes.

[00:26:01] Izar Tarandach: Or now we have to start litigating every single new piece of data that we want to collect and store and save and protect and use and whatnot. And as we know, everybody around here loves litigation. So yeah, do we want that added, or do we want to do the right thing because it's the right thing to do?

[00:26:25] Matthew Coles: Why would we ever want to do that?

[00:26:26] Izar Tarandach: Because we don't want the creepiness.

[00:26:29] Matthew Coles: We don't wanna be creepy. Yeah. Okay. Now, where's this all gonna go? So first off, right off the top of my head, I'm thinking every credential data breach, every credential theft, is a privacy violation. Large language models and other things are just gonna make this infinitely worse. Having inconsistency of regulation across just the United States is very, very bad in these cases, especially when we talk about cross-geographical data transfers. Maybe the question isn't, do we need a US GDPR, but do we need a global GDPR? And I'm using GDPR as a placeholder for a privacy regulation.

[00:27:14] Chris Romeo: How do you get global anything, though? Like, how do you... Oh, hold on.

[00:27:19] Izar Tarandach: For a second. For a second, I thought that he was going to say that he was using global as a placeholder.

[00:27:27] Matthew Coles: Well, I wanna leave room for Mars and the moon, right? So,

[00:27:31] Chris Romeo: Elon's gonna take us there. So, but I mean...

[00:27:34] Matthew Coles: The International Space Station.

[00:27:35] Chris Romeo: ISO has a standard, like, give me an example of where ISO's done anything on a global scale that was super positive, that I'm gonna get excited about here.

[00:27:45] Izar Tarandach: Oh.

[00:27:45] Matthew Coles: ISO 27001.

[00:27:47] Izar Tarandach: Positive...

[00:27:47] Chris Romeo: No, no, you didn't hear the rest of the question.

[00:27:49] Izar Tarandach: The rest of the question.

[00:27:49] Chris Romeo: I said one that I'm gonna get excited about. Like, I mean, what is ISO? What has ISO 27001 really done for the world?

[00:27:58] Matthew Coles: It's only created the underpinning for the entire information security space.

[00:28:01] Izar Tarandach: Why do I feel like I'm in a scene from Monty Python's Life of Brian?

[00:28:08] Chris Romeo: So, hold on. I'm...

[00:28:11] Izar Tarandach: What has ISO ever done for us?

[00:28:12] Matthew Coles: And I will...

[00:28:16] Chris Romeo: It's the underpinnings of the information security space, you're saying.

[00:28:22] Matthew Coles: Everything refers to ISO, right? So ISO is the uber standard.

[00:28:26] Chris Romeo: What? Everything what? I don't refer to ISO...

[00:28:30] Matthew Coles: You don't refer to ISO directly, but presumably you use things like NIST 800-53, right?

[00:28:38] Chris Romeo: I'm not a federal government. I don't...

[00:28:39] Izar Tarandach: Listen, I have to be honest here. I get fascinated every time Matt starts declaring and claiming ISO numbers. I have no idea if they're right or wrong, but it's amazing, the number of ISO document numbers that he knows. It floors me every time.

[00:29:00] Chris Romeo: Okay, so.

[00:29:01] Matthew Coles: So I will go out on a limb and suggest that ISO, certainly the ISO 27000 series, provides a common base for a lot of the things that we refer to and rely on today, even if we're not using ISO directly as a...

[00:29:19] Chris Romeo: But what teeth does it have, though? Like, I can create an ISO standard and I can get a bunch of people to agree. We can have a meeting and we can make a standard. I can't make it enforceable. There's no enforceability to it, though.

[00:29:32] Matthew Coles: Well, so there's compliance. There obviously is compliance. People love taking other people's money, right? So there are compliance regimes.

[00:29:38] Chris Romeo: Be careful. Be careful. Go ahead.

[00:29:43] Matthew Coles: What?

[00:29:43] Chris Romeo: It was a joke. It was a joke.

[00:29:46] Matthew Coles: So obviously, well, not obviously, there are some ISO standards that have no teeth. I'll give you an example: ISO 27034. It's in the 27000 series. It's application security related.

[00:30:00] Chris Romeo: The SDL one. It has the SDL.

[00:30:02] Matthew Coles: It's basically entirely: you should have a standard and you should follow that standard, but we're not gonna measure you on the following of said standard. So there are ISO documents that provide references, and then there are ISO documents that provide measurable, compliance-based things. Now, what makes people want to measure against them is other regulations and other requirements and other standards that say, measure against X.

[00:30:30] Right. So 27001, for instance, is sort of the superset of controls and processes for managing information security management systems. There are various regulations across the world that then refer to the ISO 27000 series as the reference point, and not just 27001. So if you look at NIST 800-53, for instance, as a set of controls, those controls are implementations of things that are found in 27001. And if you look at many of the other standards we refer to, the Cybersecurity Framework, for instance, the SSDF, COBIT, those and other things all have relations to ISO standards.

[00:31:23] Chris Romeo: Hmm. Okay, I wanna go back. I want to bring Ally back into the conversation, cuz we kind of went off on a bit of a rant here.

[00:31:30] Ally O'Leary: I can...

[00:31:31] Matthew Coles: If you had just taken my statement as is, you...

[00:31:33] Chris Romeo: I mean, I still don't agree with where you landed, but I wanna be polite to our guest, and I want to try to answer this question about how we could make a global standard for privacy.

[00:31:44] Because I'm not opposed to it. I just don't think ISO would do that; I don't think it would get us there. That's a whole thing, we can continue arguing it for the next 10 years. But Ally, from your perspective, how would we do this on a global scale? Could we do this on a global scale?

[00:31:58] Ally O'Leary: I think the simple answer is no, and it comes down to other countries not trusting the US. I was talking earlier about data transfers, and the worry is that if we transfer data to the United States, current US laws allow the government to get access to that data under FISA orders and other things like that.

[00:32:21] And so that's why people are nervous to send the data of European citizens or Chinese citizens to be stored in the US. For example, right now I'm working to get in compliance with the China privacy law, and we basically have to justify to the Chinese government why the data is okay to be stored in the US.

[00:32:38] And I just think China would never get on board with the same requirements around data that the US would have, so I just don't think it's practical. And then, so I don't lose the thought I had when Matt was bringing up ISO: what I will say is, if a company has a security program that's ISO certified, or based on ISO or another industry leader, and ISO, I will say, is the industry leader globally, and a regulator is inquiring cuz you've received a complaint or you have an issue, and you say, we're ISO certified, they kind of walk away. They're like, oh great, that meets the adequate security safeguards of our law, which are very basic. And so I would argue ISO is actually valuable from that perspective.

[00:33:17] In terms of the argument, if you had to...

[00:33:21] Izar Tarandach: But then comes the question. Okay, so before we go doing global policies and global policing, is privacy a global value? Is what they decide is private data in China the same data as here?

[00:33:41] Ally O'Leary: Generally speaking, actually, yes. If you were to look at the definition of personal data across global regulations, they're nuanced a little bit here or there, but yeah, I would say yes. But I think it's the perception of the individuals in those countries that differs. So for example, in China, I think you would argue the average person

[00:33:56] doesn't anticipate having any privacy, because of the government they have. In Europe, though, they're very much, oh yes, and thank you GDPR, cuz you're now forcing international companies to honor our rights. And then in the US, people are like, eh, we just assume everyone has our data. It's fine, not a big deal. So

[00:34:14] it varies.

[00:34:15] Izar Tarandach: Oh, okay.

[00:34:19] Ally O'Leary: But that's why I do have difficult conversations, for example, with our marketing teams, because it's the mindset of where you work. In marketing they want all of the data, because they wanna be able to profile and get everything. And in their mind, because they market, they know other people have all their data.

[00:34:34] And so it's not a concern. But I think it's probably the same in security, right? You innately assume a platform is insecure, so you are cautious not to put your data in there, or you don't use it until you've done some due diligence. So I just think it's a mindset of what data of your own you value. It's a personal mindset, influenced by maybe what you do for work, or whether you've been breached before, things like that.

[00:34:56] And it definitely varies by country.

[00:34:58] Izar Tarandach: So yeah...

[00:35:01] Matthew Coles: Yes.

[00:35:02] Izar Tarandach: A word from our sponsor.

[00:35:03] Chris Romeo: Let me translate, hold on real quick. The dog just said ISO is not the answer.

[00:35:09] Matthew Coles: Actually, on that point, I just looked it up. ISO 27701 is the privacy management standard.

[00:35:17] Izar Tarandach: Three quarters.

[00:35:18] Matthew Coles: Privacy information management standard.

[00:35:21] Izar Tarandach: So, Ally, I have a question for you, and this is going to be a bit strange, cause perhaps... another word from our sponsors. Is it data? Hmm, I hear a dog. Okay, so anyway.

[00:35:38] My experience of privacy lately has been that at some point somebody woke up and said, hey, we need a privacy effort and whatnot. And as you said, marketing loves collecting data, and they've been collecting all the data all the time, and it's all over the company. Nobody knows who owns it, nobody knows where it's stored, nobody knows which systems it's in, all that stuff.

[00:35:57] So we decided that we needed to look into a system that would let us take some control of the data. And of course we saw the different options out there. Many of them had agents and things that went into your cloud environment and started going through your databases, and at some point they start waving their hands, like I'm doing right now, very fast, and say, somebody has to go and tag the data.

[00:36:24] So from your experience, what are the solutions out there that people could use? Without vendor names, what are the approaches for that kind of challenge?

[00:36:35] Ally O'Leary: It's funny, I participated in another, kind of like a privacy forum, where the same question was raised, and we were in fact talking about vendors. It was a smattering of smaller companies and bigger companies participating in the conversation. And the answer is, no one company has figured it out. They are either doing it all manually, or they have a hodgepodge of this vendor, this vendor, and this vendor, because no one vendor does it all for them. And even still, there's no such thing as automation. You still have to have a human looking at stuff, tagging things, validating things, making sure there aren't any false positives.

[00:37:06] So unfortunately it's really good marketing, I think, on the vendor side, and then when you actually get the tool, it's still too new. I think data governance tools in general are still too new in the industry.

[00:37:18] Izar Tarandach: Is it too new or is it inherently too subjective for us to be able to put rules into place?

[00:37:26] Ally O'Leary: That's fair. Especially when it comes to tagging personal information, like I said, it's the indirect identifiers that get people hung up.

[00:37:34] Those are hard to identify, especially if they're not connected to other data. You could have data that's pseudonymized versus anonymized versus not. So it's more of a generic tag of, this table may contain personal information, let's do that. Because here's the reality: we'd love to have all of those data flows, and if you have an incident happen, yes, it would be great to reference them, but I'm gonna be calling people directly to tell me exactly what's there, cuz I'm never gonna trust those data flows or those diagrams or those systems.

[00:38:03] Izar Tarandach: Oh? Why?

[00:38:04] Ally O'Leary: Because that data changes so often. I feel like data feeds are not reliable, ownership of the data changes, and new systems are added that are never actually incorporated into those diagrams. You have to be really confident, and I haven't met a company that's been really confident in their processes to date.
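
The coarse tagging Ally describes (flag a table as possibly containing personal information and route it to a human steward) can be sketched simply. The table names, the steward, and the single email regex are illustrative assumptions; real scanners use many detectors and, as she says, still produce false positives.

```python
# Sketch of coarse data tagging: sample a table, flag it if anything looks
# like an email address, and hand it to a human for confirmation. The
# regex and names are illustrative; false positives are expected.
import re

EMAIL_RE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def scan_table(name, sample_rows, steward):
    hit = any(EMAIL_RE.search(str(v)) for row in sample_rows for v in row)
    tag = "MAY_CONTAIN_PERSONAL_INFO" if hit else "NO_HIT_IN_SAMPLE"
    return {"table": name, "tag": tag, "review_by": steward}

print(scan_table("orders",  [("o-1", "a@b.com"), ("o-2", "n/a")], "data-steward"))
print(scan_table("metrics", [(1, 0.2), (2, 0.9)],                 "data-steward"))
```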

[00:38:26] Izar Tarandach: What could be done to raise your confidence?

[00:38:32] Ally O'Leary: A big data governance team, proper ownership of data on the business side, proper roles and responsibilities around what it means to own data and be a steward of data, I would say. And regular reviews of all of the data flows. You can do audits and things like that, but you have to stay on top of it.

[00:38:54] Izar Tarandach: Okay, so you just made my budget for privacy and security four times as big as it was before. Here at the security table, we are very big on things being reasonable. What's a measure of privacy for a company to decide what's reasonable for them, to stand by their privacy goals? In other words, does everybody have to have the same goals?

[00:39:22] Or if I'm a very small company, a startup, a mom and pop shop, do I have to do the same things as the big corporations?

[00:39:32] Ally O'Leary: So there are a lot of factors that play into that. One, I would say, is volume of data. I would also say the jurisdictions in which you operate. Do you just have US consumers? Are you small enough that you only work in Massachusetts, for example? Do you have global data? Those would be factors.

[00:39:48] And then the type of data, right? There's personal information, and then there's sensitive personal information. Your name and email address are just standard personal information, but then there is sensitive data, like biometric and genetic data, Social Security numbers, things like that.

[00:40:05] Trade union membership. There are actual data elements that are more sensitive, like medical data. So if you are processing that data, you definitely need to invest more, because your risk level is higher. Also, if you are processing data of what are considered vulnerable populations, so children, the elderly, things like that, you would have a bigger risk profile, and therefore I would think you would wanna invest more in privacy and security for those. So: volume of data, the type of individuals you're collecting from, the type of data, and your jurisdiction, I think.
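
Ally's factors (volume, jurisdictions, sensitivity, vulnerable populations) amount to an informal risk rubric. The weights and thresholds below are made up purely to show the shape of such a rubric, not a real methodology.

```python
# Toy privacy-risk rubric over the factors mentioned above. Weights and
# cutoffs are invented for illustration only.
def privacy_risk_score(record_count, jurisdictions, sensitive, vulnerable):
    score = 0
    score += 2 if record_count > 1_000_000 else (1 if record_count > 10_000 else 0)
    score += 2 if len(jurisdictions) > 1 else 0   # cross-border obligations
    score += 3 if sensitive else 0                # biometric, medical, SSN...
    score += 3 if vulnerable else 0               # children, the elderly...
    return score

# A Massachusetts-only mom-and-pop shop vs. a global firm with medical data:
print(privacy_risk_score(5_000, {"US-MA"}, False, False))       # 0
print(privacy_risk_score(2_000_000, {"EU", "US"}, True, True))  # 10
```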

[00:40:39] Izar Tarandach: Okay.

[00:40:39] Matthew Coles: Does tr...

[00:40:40] Izar Tarandach: Just a second, Matt, before we go there. Sorry. No, actually Matt, you go for it.

[00:40:45] Matthew Coles: Yeah. So why does trade union data, why is that private?

[00:40:49] Ally O'Leary: It's a sensitive topic in Europe, I will say. I'd be speaking out of turn, because I don't actually know the answer off the top of my head. Political affiliations, religious affiliations, all of that is sensitive too. It

[00:41:00] probably comes down to profiling, and then also making sure you're not making automated decisions on a lot of those things. This is a bad example, but making sure you're not hiring, or not hiring, an employee because of a certain religion they have, things like that. There's a lot that goes into those factors; I could give you more information after.

[00:41:20] Izar Tarandach: That was awesome, cause it leads me to my next question, the how-to question. The more we talk about this, the more it sounds to me like that person who came to me at lunch and said, now I'm a privacy engineer, was set up to fail, because it sounds to me that what you need is actually a lawyer. There is so much interpretation here.

[00:41:41] There's so much nuance. Who are the people who are doing privacy, and who are the people who should be doing privacy?

[00:41:49] Ally O'Leary: Yeah, that's a valid question. It's a hodgepodge. What I will say is, I think it's probably a 50-50 split right now between folks like myself, who came from security and went into privacy because these laws popped up and there were no bodies, or because they were fascinated by it, and attorneys. It's a mixture.

[00:42:05] And so you likely have a privacy compliance function that is led by an attorney. I am not an attorney, but like myself, you work directly with privacy and security related attorneys who help you interpret the law, come up with these justifications and arguments to regulators, and help you balance the risk. So it doesn't have to be a lawyer. It's good to have a lawyer, but sometimes you don't want the lawyer, because you want the person to be able to talk with the teams. Here's my other thing: I'm not a big fan of being within the legal department, because when you join a call and people know you're from legal, people just shut down. They don't wanna say things, they don't wanna talk to you.

[00:42:45] Izar Tarandach: Guilty. 

[00:42:47] Ally O'Leary: Yes. And so here's what I like to...

[00:42:50] Matthew Coles: I like talking to lawyers. I don't know what you guys are talking about.

[00:42:52] Ally O'Leary: Well, that's why I'm not a lawyer. Guys, please talk to me! One of my big thoughts here is, in any compliance role, or any role where you have to give security requirements or requirements that people don't wanna do, one of my big things is to actually build relationships with people.

[00:43:07] Like literally let them know you have a life outside of this, that you're family, a friend outside of this. You're not a police officer. You can do things, it's just that there's a certain confine of what you can do. And I know people aren't in the office anymore, but I like to just strike up a conversation with a person in the hallway, or send them a funny meme over Teams, because then you build a relationship, you build trust, and they'll actually raise issues to you and talk to you about things and actually work through things, which is key.

[00:43:35] And then when you have to ask them to do something that they don't wanna do, they actually understand: oh, I know why I have to do this. She's actually trying to help us here, you know? So I think that's the biggest part of the job, and if you're an attorney, and I love attorneys, I work with them all day long, people just immediately kind of shut down. So I don't think the actual function needs to be an attorney.

[00:43:55] Matthew Coles: I know we're running short on time, but I want to ask this really important question, related to, I think, where Izar is going with this. What can we as security professionals do to help? How can we as security consultants, advisors, experts, pros, whatever, and/or the security champion communities that we work with, help facilitate or enable privacy, and facilitate those conversations?

[00:44:23] Ally O'Leary: Yes, thank you.

[00:44:24] I would say learn the definition of personal information. Understand it. Understand whether, where you work, you have it in any of your systems. That would be key. It's similar when I talk to architects too: if you become aware of any project or initiative that is going to touch or change a system with personal data in any way, just be like, hey team, you might wanna check with your privacy folks about this. Similarly, when we go through reviews, if we know someone's gonna store data, we have pointers: you're gonna store it in this system, go make sure it's secure, work with your security team. So it's more about pointing people in the right direction, because I think any of my colleagues in privacy would say we'd rather be notified than not, and then we will work with them as needed.

[00:45:04] Yeah, I would say that. And then also, the other thing I struggle with is getting into the right spot of the development life cycle. So if you are aware of certain gates or things like that where it makes sense to bring in a privacy person, we'd be more than happy to participate. It's likely, because security's been around for so long, that you guys are already in the right conversations with the right people at the right point. Then we just wanna bring a friend in to listen and learn from there.

[00:45:31] Matthew Coles: So obviously we do a lot with threat modeling here. So is that a good point where we should bring in privacy, if we find a data flow that has personal information on it?

[00:45:42] Ally O'Leary: Absolutely. Yes. If anything,

[00:45:44] more to be like, did you guys even know this exists? Cause I can't tell you how many times I'm learning about a business function doing something that we didn't know about. And like I said, we were talking earlier about data flows and everything being manual and there being no one system.

[00:45:55] So on any trigger, any pocket where you might be doing something with data, definitely let them know and make sure it's on their radar.

[00:46:01] Chris Romeo: Hmm. Yeah, that's excellent. Ally, thank you for walking us through this conversation. And you know, the fact that you're thinking about developer empathy from a privacy perspective, this is something

[00:46:13] we've been thinking about for the last number of years as well, and it's just a powerful way to connect with the people we're trying to influence.

[00:46:20] And so I love to hear that that's your perspective coming from the privacy side. So yeah, thanks for being our first ever guest around the security table. And just a quick message for those out there that might be ISO fans and might wanna send me some select comments: you can send those to isotrolls at securitytable.com.

[00:46:42] Izar Tarandach: Wait, wait, wait, wait. There's a standard for that. It's 666.

[00:46:46] Chris Romeo: Yeah, you have to look in ISO standard 17472-7472-38 dash...

[00:46:51] Izar Tarandach: 666 and three quarters.

[00:46:53] Chris Romeo: which will tell you how... I'm totally kidding. Don't send an email there, cuz it won't actually go anywhere.

[00:46:59] Izar Tarandach: No. No.

[00:46:59] Chris Romeo: Yeah, I mean, this is all in fun. Around the security table, we do argue about stuff; that's for fun.

[00:47:06] It's to help us explore the ideas. And so Ally, thanks for being with us as our first ever guest. And folks, we look forward to seeing you again soon around the security table.

