The Security Table

The Future Role of Security and Shifting off the Table

October 17, 2023 Chris Romeo Season 1 Episode 32

The Security Table gathers to discuss the evolving landscape of application security and its potential integration with development. Chris posits that application or product security will eventually be absorbed by the development sector, eliminating the need for separate teams. One hindrance to this vision is the friction between security and engineering teams in many organizations.

Many people think that security incidents have negative implications for brand reputation and value. Izar points out that, contrary to popular belief, major security breaches, such as those experienced by Sony and MGM, do not have a lasting impact on stock prices. Chris counters this by highlighting the potential for upcoming privacy legislation in the U.S., which could shift the focus and importance of security in the corporate world.

Chris envisions a future where the security team is dissolved and its functions are absorbed across various business units. This would lead to better alignment, reduced infighting, and more efficient budget allocation. Security functions need to be placed where they can have the most significant impact, without the potential conflicts that currently exist between security teams and other business units.

The second topic of discussion is the "shift left" movement in the realm of application security. There is ambiguity and potential misuse of the term. What exactly is being shifted and from where does the shift start? The term "shift left" suggests moving security considerations earlier in the development process. However, the hosts point out that the phrase has been co-opted and weaponized for marketing purposes, often without a clear understanding of its implications. For instance, they highlight that while it's easy to claim that a product or process "shifts left," it's essential to define what is being shifted, how much, and the tangible benefits of such a shift.

Matt emphasizes the idea of not just shifting left but starting left, meaning that security considerations should begin from the requirements phase of a project. Chris mentions that the concept of shifting left isn't new and cites Joe Jarzombek's late 90s initiative called "Building Security In" as a precursor to the current shift left movement. The hosts also humorously liken the shift left movement to a game of Frogger, suggesting that if one shifts too much to the left, they might miss the mark entirely. The discussion underscores the need for clarity and purpose when adopting the shift left philosophy, rather than just using it as a buzzword.

FOLLOW OUR SOCIAL MEDIA:

➜Twitter: @SecTablePodcast
➜LinkedIn: The Security Table Podcast
➜YouTube: The Security Table YouTube Channel

Thanks for Listening!

Chris Romeo:

Hey folks, welcome to The Security Table. This is Chris Romeo, joined by Izar Tarandach and Matt Coles, and we are going to jump right in and skip all the witty banter, so you can just remember witty banter from a previous episode that usually happens here in this little segment. We're going to dive right back into where we left off in that last episode, and I kind of left off at a cliffhanger. But I gave you guys a week to think about this, so I'm expecting some brilliant solutions have been crafted inside of your brains. The issue that I left us with at the conclusion, which was based on an interview I did with somebody else and some of their opinions, is the future of application security. I want to offer a hypothesis, I guess. Maybe not a hypothesis, just a prediction. I think that in the next number of years, I'm not sure if it's going to be 5 or 10 or how long it's going to take, but I think application slash product security is going to be eaten by development. I think it's going to be absorbed. And when I say eaten, I mean it's going to be absorbed. The premise being, we have a lot of friction in our day-to-day here between security and engineering. Different levels of friction, I would say, in different companies, but I've seen some companies where the friction level is, there might be a fight here shortly, all the way down to, eh, we just don't really listen to what they say, and stuff like that. So I'm imagining a world where security becomes part of engineering, and then everybody reports to the same boss: the CTO, the VP of engineering, the whatever. So let's dive in there. I want to see, what do you guys think about this? Do you think this is possible? Do you think it's irrational? Where does it sit on the scale here? Matt, you go first.

Matt Coles:

Well, so first rule of Security Fight Club is you don't talk about Security Fight Club. It's actually an interesting premise. I may answer it with a question or two. So: quality engineering is part of engineering. It is not a distinct function, although it's maybe not best done by developers, because you do want a check and balance in place, but it's part of an engineering discipline. And so the requisite knowledge and experience and training and capability can all be served out of the engineering organization, right? That exists, has existed for decades, certainly in our industry, both hardware and software related. You can have quality engineering and software engineering as side by side functions. What you're suggesting is making security engineering one of those functions. And in that regard, certainly for product development, it sounds reasonable. Are we there yet? Probably not. But should we be there? That's an interesting question, because what that means is that engineering organizations would have all the functions within them to basically govern their own destiny, and they already have structures in place for handling requirements and reporting and working with other parts of the business. Do you really need a check and balance from security? Is there any benefit to having security outside of that organization? That's what you're proposing. So actually really interesting. I wonder if that would work.

Chris Romeo:

Governance is kind of one of the sticking points, because...

Matt Coles:

Well, also, response. So things like, you have customer support that is not part of the engineering function, right? People to handle fielding bugs and working with customers. So, similarly, would vulnerability and incident response be outside of that function? So, would it be as...

Chris Romeo:

Maybe they become a part of customer success or customer service. Maybe become a function there.

Matt Coles:

Right. But either way, security as a whole probably can't be consumed by the engineering organization. It sounds like, if you use quality as a comparison, engineering already contains quality functions. Could it contain security functions too? Maybe. And that might really work. Izar, what are you thinking there?

Izar Tarandach:

I have to agree with you, and the thing is that for a long time we have been using that pithy thing that says if we do our job right, then we're going to work ourselves out of a job, and we usually say that when we're training developers or training other security people. And for a while I think it was cute, but now we are seeing what success looks like looming over us. One thing that I can say for sure is that if that ever happens, and as Matt says, for a certain portion of the work that's definitely the way that things are going, it's not going to happen because of tools. It's not going to happen because the magical AI in the sky is doing its whole thing. It's going to happen because at the end of the day, developers are going to understand that secure code is quality code. They're going to incorporate the things that we have been talking about for years upon years, all the principles and all the specifics and all that stuff. But at the same time, there will be other developers who will develop new ways of things that could go wrong. So there will be aspects of security that will still be very much needed, namely design and architecture. And I think that at the end of the day, what we're going to see is a sort of commoditization of security knowledge, where developers know just enough to get by day after day without, perhaps, security and application security engineers sitting on their shoulder. But there will still be security functions out there beyond GRC, which, I completely agree with Matt, is always going to be a problem. Not a problem, but it's always going to require more expertise than a developer probably wants to put there. One thing, though, that I want to throw in the mix: throughout this whole cycle where we teach, we teach, they learn, we measure, we get to conclusions, we never really got to ask them, do you want to do this? And I have the feeling that most developers, I won't say most, but many... a large percentage of the developers that I have met through the years are not really interested in taking on security as a process, as a responsibility of their own.

Matt Coles:

Uh, can we just, uh, talk a little bit about DevSecOps then? Um,

Izar Tarandach:

Do we have to?

Matt Coles:

Well, so first off, let me just jump in a little bit on something you're describing. You use the word developers a lot. In my description, when I was talking about this, I wasn't thinking developers specifically. I was thinking engineering organizations. So that includes non-developers within that organization. I agree with you, most engineers probably don't want to take on security responsibility, although with DevSecOps we kind of give them that responsibility. Although at times I have a sense that we give them a lot of responsibility, but not a lot of authority or capability. And that's not universal, but sometimes there are observations that seem to suggest that. But the notion is that an engineering organization, not necessarily the engineers themselves, may take on that function, right? Developers get taught how to write quality code, but they don't often get taught how to do quality engineering. Likewise, we train engineers how to write secure code, but not necessarily on the function of ensuring security.

Izar Tarandach:

There is

Matt Coles:

And if you look at it that way, I think that's where the engineering organization as a whole can pick up those tasks, as opposed to having a dedicated security organization sort of do it for them. And that's where I was going with this. So I just want to clarify that, because developers may want to take on that responsibility or they may not, but the engineering function as a whole should.

Chris Romeo:

And when I think about this model, I'm not envisioning what we think of as the people that are on an AppSec team or on a ProdSec team today as going away. They just don't report to the CISO anymore. They're part of the engineering organization and they're deployed. So it's not like we're taking all the security people and saying, you're just done, the developers are replacing you, the product managers are replacing you, the program managers are replacing you. We're bringing them into the fold. One of the big pluses of this is we're eliminating any potential political fighting between two separate organizations by putting them under one organization, in one management chain, that can make risk management decisions at the end of the day. They can decide, what are we going to do for security? And I think we've come far enough in our history. Like, this would not have worked 20 years ago. Would not have worked 10 years ago. When I first started in security, people were still saying, hmm, no. Like, we'd go to engineering, they'd be like, you need to do this. No, we don't have to do that. We're not there anymore, right? And that's why I can start to think about this new world approach. Because I know I don't have people that are just going to tell me we're not doing it.

Matt Coles:

They have to be equal partners to your organization, again, along the same lines of quality engineering and other aspects. But it would certainly help with things like funding, right? So you'd have fewer opportunities for cross charging and whatnot, and it certainly would help with reporting. The only downside is the check and balance part, right? The GRC function, audit, we talked about vulnerability and incident response, where those functions go, how they're managed. There's still some cross-org communication that has to happen.

Chris Romeo:

So that's one threat. You've identified one potential threat.

Izar Tarandach:

There is more. So again, in a perfect world, assuming that the cow is a perfect sphere, that would really work. But now throw into the mix the fact that if you take that team and bring it into the engineering team, you are most probably going to put them under, as Matt said, engineering management. Now, engineering management has very set ways of measuring performance, measuring goals, measuring achievements, that don't really mesh with the way that we measure things in security. And sometimes that's even a very big source of friction. So now we are taking an overloaded management level, giving them a bunch of people that speak a completely different language, and saying, for the good of everybody, play nice, kids. Historically, it doesn't work like that.

Chris Romeo:

Historically. But we're reaching new places now in...

Izar Tarandach:

that's going to be a new place that has nothing to do with security itself, but a lot to do with management culture. And I'm not touching that one with, uh...

Matt Coles:

Well, let's talk about the language. Let's talk about that language that you were describing, right? The language of security today versus, imagine a world where security is part of the engineering function. How different, first off, would it be? Would it be the same language as we have today? And if not, what might it look like? And I'm using quality as an analog here. Developers talk about code complexity, but there's also code velocity, number of lines of code produced, features implemented, et cetera. Quality engineering is about test cases developed, cases executed, pass rate or fail rate, quality defects or performance defects, basically failure to meet requirements of validation and verification. If security was part of the engineering function, why wouldn't they just be translated into those terms? Where would there need to be, what translation would need to happen, do you think?

Izar Tarandach:

Let me give you an extreme example, okay? And maybe even a very weak one. But let's say that you just brought a security researcher slash pen tester into the team, and that person just went over what came out of the sprint and did their thing. Now they look at their engineering manager and say, I didn't find anything. Now the engineering manager will say, oh, that's good, we don't have any risks, we don't have any problems, right? And the security person is going to look at them and say, I didn't say that. I said I didn't find anything.

Chris Romeo:

But doesn't that force a cultural shift that will either break things or make things very successful?

Izar Tarandach:

What is cheaper? To do a cultural shift, or for the engineering manager to say, he didn't find anything, there's nothing to be found, we are good, ship it?

Matt Coles:

So, I'm going to ask a naive question here. When this researcher did their research, did they follow a process for this? And again, let's talk about QA for a moment, right? So QA runs through all their tests and doesn't find any bugs. They go to their engineering manager and go, we didn't find any bugs. We ran all these tests. We had X amount of code coverage, right? Maybe we covered 80 percent of the code. We didn't find any bugs. The engineering manager will probably look at that and go, we have quality code.

Izar Tarandach:

But QA is testing for spec. They're testing the positive case. Yes, the code does what the spec says the code should be doing. The security person is doing the negative case. Hey, I clicked some stuff and it exploded. If it doesn't explode, does it mean that it won't explode if I click two things? No, it doesn't.
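A minimal sketch, not from the episode, of the positive-versus-negative distinction Izar is drawing, using a hypothetical transfer() function and pytest as the assumed test runner. The spec test checks what the code should do; the security-style tests probe what it must not do, and passing them still says nothing about the abuses nobody thought to try.

```python
# Hypothetical sketch (not from the episode): contrast a QA "positive" spec test
# with security-style "negative" tests against the same invented transfer() function.
import pytest


def transfer(balance: int, amount: int) -> int:
    """Debit `amount` from `balance`; the spec only defines behavior for valid input."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid amount")
    return balance - amount


# QA / positive case: does the code do what the spec says it should do?
def test_transfer_matches_spec():
    assert transfer(100, 40) == 60


# Security / negative cases: what happens when someone "clicks the wrong thing"?
# Passing these only shows that *these* abuses fail safely; it says nothing about
# the abuses nobody thought to write down.
def test_transfer_rejects_negative_amount():
    with pytest.raises(ValueError):
        transfer(100, -40)  # attempted credit via a negative debit


def test_transfer_rejects_overdraw():
    with pytest.raises(ValueError):
        transfer(100, 1_000_000)
```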

Chris Romeo:

In your example, though, you're looking at it in isolation, meaning you're not considering any of the other things that would be happening in any process that would be part of any program any of the three of us would build. We wouldn't rely...

Izar Tarandach:

If we run DAST, we know that everything works, but, uh,

Matt Coles:

Boom!

Chris Romeo:

And we'd have WAFs, WAFs blocking our traffic and the hamster wheel...

Matt Coles:

You would be scanning and fixing

Chris Romeo:

scanning and fixing

Izar Tarandach:

AI reading the code.

Chris Romeo:

up issues. But the point is, though, this wouldn't be an isolated incident where the result of the pen tester's work is the only data the engineering manager has, and now they're going to say, we're good. No, they're going to have, you know, pipelines that are part of code builds happening that are doing software composition analysis. They're doing SAST. They're doing other, you know, runtime stuff.
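As an aside, here is an illustrative sketch, with invented tool names and finding formats rather than anything from the episode, of the kind of roll-up an engineering manager might actually see when SCA, SAST, DAST, and pen test results land in one pipeline, including the caveat that an empty report is not the same as no risk.

```python
# Illustrative sketch only: invented tool names and finding format, not a real
# pipeline from the episode. The idea is the roll-up an engineering manager sees
# when SCA, SAST, DAST, and pen-test results all land in one place.
from dataclasses import dataclass


@dataclass
class Finding:
    source: str    # e.g. "sca", "sast", "dast", "pentest"
    severity: str  # "low" | "medium" | "high" | "critical"
    title: str


SEVERITY_ORDER = ["low", "medium", "high", "critical"]


def summarize(findings: list) -> str:
    """Roll scanner output into the one-line status a manager actually reads."""
    if not findings:
        # The caveat from the conversation: "no findings" is not "no risk".
        return "No findings reported (tool coverage and limitations still apply)."
    worst = max(findings, key=lambda f: SEVERITY_ORDER.index(f.severity))
    return f"{len(findings)} findings; worst is {worst.severity}: {worst.title} ({worst.source})"


if __name__ == "__main__":
    print(summarize([]))
    print(summarize([
        Finding("sca", "high", "vulnerable transitive dependency"),
        Finding("sast", "medium", "possible SQL injection"),
    ]))
```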

Izar Tarandach:

So now you're asking the engineering manager to become aware of those things, understand their limitations, understand the output they put out, interpret all that, and come out with a risk assessment, on top of all the other things that they already do every day.

Chris Romeo:

No, I mean, I'm thinking let's put a security person on their team. Like, they're not experts in... they don't, don't...

Izar Tarandach:

Which at the end of the day will come to the engineering manager and say, looking at all the stuff that we have, we didn't find much. So the guy is going to ask, are we secure? That means that we're secure, right? And the guy says, I didn't say that.

Chris Romeo:

Engineering managers don't ask that. That's executives that ask that.

Izar Tarandach:

But the language is different. The language is different.

Chris Romeo:

It immediately goes up to, are we secure? Izar, are we secure?

Izar Tarandach:

We don't measure things with the same stick and we don't take risks with the same stick, right? In our book, Matt and I said that engineering moves at the speed of innovation and security moves at the speed of caution. You can't drive the same car at two speeds at the same time.

Chris Romeo:

Well, we just have to make a super highway and we got to merge everybody together onto the same highway.

Izar Tarandach:

But you know what this whole conversation reminds me of? Your last LinkedIn post. Because basically what we're saying here is, hey, we have achieved left maximum, critical mass, right? We went left all the way. I say we have shifted left so much that we are falling off of the table. I mean, it's, it's...

Chris Romeo:

So we're falling left. That's a new one. I haven't heard that one yet. So, um, yeah. So, I mean, I think

Izar Tarandach:

it, yeah.

Chris Romeo:

just to, to kind of wrap up, put a bow on the future of AppSec and then we can transition and talk about shift left. Uh, you guys have challenged my thinking in good ways here because you made me think, Matt, specifically about like where does GRC sit? Where does audit sit? And I think it's possible for those things to live on in the finance organization. I think it's possible.

Matt Coles:

Or legal.

Chris Romeo:

We're just going to do away with a separate security department. We know that privacy is already pretty much embedded with the legal side. That's just by nature where they've landed.

Izar Tarandach:

Sad as that is.

Matt Coles:

And, and, and by the way, we're not talking about things like IT security, right? We're talking about AppSec, development security, ProdSec...

Chris Romeo:

I mean, InfoSec should, like, you could even pull this thread more and say why does InfoSec exist? Why doesn't it live in IT? Why isn't, why don't we just have people in IT thinking about security? Versus having a separate group that thinks about security versus a group who just thinks about building network architectures. Why do we have two separate teams doing this at the same time?

Izar Tarandach:

We keep talking about this and I keep thinking, who watches the watchers? But not in terms of who watches people doing stuff, but... There's a function here whose output is risk.

Chris Romeo:

That's governance. That's GRC though, right?

Izar Tarandach:

No, no, no, no. People are building stuff. That stuff carries risk. And we are saying, rather than having two teams, one that builds and one that checks what got built, let's take the one that checks what got built and put it together with the one that builds, and let's have it be consumed by all the processes in that team. And to me it goes back to: now I don't have a function that actually independently checks what's coming out and gives me an output of risk that can be RTP'd on.

Matt Coles:

Think about the quality engineering analogy again here. Developers write code, quality engineering validates the code that they write. And they are a risk avoidance function too, just not the same risk. So there is a check and balance within a single engineering organization for quality engineering to exist side by side with developers.

Chris Romeo:

There's even another function that exists in most big companies now that we haven't even touched on yet. There are groups now called enterprise risk. They're not part of cyber. They're not part of IT. They're looking at risk across the entire organization, from legal risk, regulatory risk, reputational risk. Cyber security is a small piece of what they're considering. But I know big companies, I know people that are doing that job now, and they're not part of cyber, they're not part of the security stack, because they're thinking risk.

Izar Tarandach:

And they wouldn't recognize a line of code if it bit them in the ass.

Chris Romeo:

Yeah, but their job is to think, though.

Izar Tarandach:

They're interested in that risk index that comes out of the function of security that says, Today, our developmental risk is blah, blah, blah.

Chris Romeo:

But you're, but part, I mean, part of your objection here is that we don't have anybody watching. I would argue we don't have anybody watching now.

Izar Tarandach:

No, but

Chris Romeo:

If somebody wants to sneak code into something, most likely they can get away with it today.

Izar Tarandach:

Yes, but that goes to a different discussion, which we already sort of touched upon, which is, we have nice tools, we don't have tools that embrace everything, and that seems to be changing; there's a lot of stuff out there that's worrying about this kind of stuff, and even signed commits and whatnot. I think that my point was that we keep trying to overload developers, and sorry, Matt, engineers, and I don't remember which word I was thinking about using, but we keep trying to overload them with the security thing without me really seeing... no, I'm going to contradict myself here.

Matt Coles:

We've done our job. We're done.

Chris Romeo:

Yeah. Victory victory. I I

Izar Tarandach:

This week it is really, really easy to get me in a loop. The thing here is that I think that engineers should know security. But we shouldn't expect them to know more security than they need to be able to do their jobs. Because at some point security takes a jump, both in level, in depth, and even in breadth, that we shouldn't be expecting them to be able to explore and embrace and own by themselves. So that's why I think that there will always be a place for the security professional, at so many different tasks, who will pick up that thing that happens after basic development happened. And again, Matt, quality control is a nice parallel, but it's not an analogy, I think, because, you know, in the set of things that are being sought here, theirs is a closed set; security is an open set. We don't know the size of the set of things that we are looking for. They are testing for spec. We are testing for how can this go wrong.

Chris Romeo:

I mean, they're testing further than spec. Right. They're testing for performance usually through the same function.

Izar Tarandach:

So performance there with

Chris Romeo:

You have a non-performance...

Matt Coles:

With chaos engineering now, you'll have quality engineers running chaos engineering type tests, right? So...

Izar Tarandach:

Chaos is a new thing and it's great, but still, and I'm speaking here from a position of ignorance, I think that at the end of the day it comes down to a gorilla banging on a keyboard, right? You're just throwing stuff out there and seeing if it breaks or not.

Matt Coles:

All my QA engineering friends are going to be like, ah, what did I just talk to?

Izar Tarandach:

Fuzz, fuzz world, right? I love the idea of fuzzing the world. We all know the limitations of a fuzzing system. So, yeah, I think that my bottom line is: it's great to push things left, but not too many things should go too far left, and there will always be a place for the security professional somewhere in there. What can and might, and I hope, happen is that we are going to have to elevate our own game to be able to answer these coming challenges in a way that actually brings value.
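For readers unfamiliar with the limitation Izar is pointing at, here is a toy sketch, with an invented parse_record() target, of the "gorilla banging on a keyboard" style of random fuzzing: it throws random bytes at a function and records crashes, and finding no crashes only means none were found, not that none exist.

```python
# Toy random fuzzer -- the "gorilla banging on a keyboard". The parse_record()
# target and its planted bug are invented for illustration.
import random


def parse_record(data: bytes) -> int:
    """Invented parser with a deliberately planted edge-case bug."""
    if len(data) > 3 and data[0] == 0xFF:
        return 1 // (data[1] - data[2])  # crashes only when bytes 1 and 2 match
    return 0


def fuzz(iterations: int = 10_000, seed: int = 0) -> list:
    """Throw random byte strings at the target and collect the inputs that crash it."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        sample = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 16)))
        try:
            parse_record(sample)
        except Exception:
            crashes.append(sample)
    return crashes


if __name__ == "__main__":
    # With these odds, most runs find nothing at all -- which is the point:
    # "no crashes found" is not the same as "no crashes possible".
    print(f"{len(fuzz())} crashing inputs found in 10,000 random tries")
```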

Matt Coles:

You know, one... Oh, sorry, Chris, just...

Chris Romeo:

I just wanted to get to closing arguments. That was his closing argument, yours...

Matt Coles:

Oh, I actually have one additional point I wanted to

Chris Romeo:

Go long, it's on my side, you're fine to share it.

Matt Coles:

Oh, I don't know if it is, I just want to be on... The question always comes down to: which cost center should security engineering be part of? Should it be part of a revenue stream, or should it be part of operations?

Izar Tarandach:

I don't know about you, it's part of my revenue stream!

Chris Romeo:

Yeah, it's gotta be. I mean, it should be part of the revenue stream. Too many people treat it as an operational expense, and that's the classic route. That's what we did 25 years ago. It was an operational expense. It didn't make money. Security is not supposed to make money. How many times have I heard that in my career.

Matt Coles:

We're revenue protection.

Chris Romeo:

We're here to protect the fort and whatnot. It's like, no, well, not in a product focused company. If you're a product focused company, you don't have revenue if you don't have security and privacy. Your revenue goes away in the world that we live in now. You can't build a SaaS solution that doesn't consider security and privacy.

Izar Tarandach:

So here's the funny thing, I used to think like that too. Until I saw the statistics, and I think that it's still true. We used to tell people, hey, if you have a security incident, your brand is going to be impacted and your stock price is going to be impacted. It isn't. They have little blips and they just go back exactly to where they were. You look at the big ones, Sony, MGM, all that good stuff, or the big data breaches, some telcos and whatnot. The stock price just goes right back to where it was.

Chris Romeo:

Yeah, that's, that is a,

Izar Tarandach:

Sad,

Chris Romeo:

It's a sad state of affairs though, right? So you're basically, with that conclusion, you're saying a company can willfully slash woefully decide not to focus on security and get away with it.

Izar Tarandach:

Yeah.

Matt Coles:

Well, but then you'll have, you know, regulatory environments that will say otherwise, right? Especially in critical infrastructure, private-public partnerships, et cetera.

Izar Tarandach:

But everybody is there.

Chris Romeo:

Yeah, critical is a whole other kind of set... I think people understand the risks of critical infrastructure. I don't, I mean, I think most people understand. Oh, we talked about that before. I think we concluded most people probably don't understand how much danger there is in the water supply, for example, or...

Izar Tarandach:

oh God, don't, don't even go there.

Chris Romeo:

Yeah. I don't want to go

Izar Tarandach:

Just to give you an example, Matt. You're totally right, and we talked about this in one of our first episodes, the utility value of things, right? But one of the biggest wireless carriers, that thing is breached every day that ends with Y. Now, they sent me an email saying, oh, you are in our auto pay thing, so in order for you to continue on this, you're going to have to use a bank account rather than a credit card. So that took me to having to open a new account, with a separate number and a separate card, to once a month put enough money into that account to pay that specific bill, and that whole account is just for that bill. Why? Because I don't trust my bank account number, like my everyday account number, with that specific company.

Matt Coles:

Yep.

Izar Tarandach:

And they don't care.

Matt Coles:

And nobody, and, but how many other people would have done that isolation? Probably very few.

Izar Tarandach:

I hope many.

Matt Coles:

But probably pretty, pretty few.

Izar Tarandach:

I told everybody that I know who works with them, but at the end of the day, the point is, they don't care. For that 0.00005 fee that they have to pay for the credit card every month, they are going to elevate your risk to a point... they don't care, because they're not hit.

Chris Romeo:

You're going to make me an advocate for regulation now. I can't believe that's going to happen. By the way, my name is Chris Romeo and I'm running for Congress.

Izar Tarandach:

By the way, my name is Zartan and I'm running to blah. Anyway, uh,

Chris Romeo:

Hey,

Izar Tarandach:

people know what I'm running. You people know what I'm running for.

Chris Romeo:

uh, days from

Matt Coles:

I'm not running unless somebody's chasing me, so I don't know what you guys are doing.

Chris Romeo:

They're going to have to be bigger than you two, right? Like, I'm not running if somebody's smaller than me.

Matt Coles:

Remember, you don't have to be the fastest.

Izar Tarandach:

just faster than the last guy.

Chris Romeo:

That's the rule of bears, you know, running away from a bear. You don't have to be faster than the bear, just gotta be faster than your friend. Alright, so did we, did we... I don't know, I didn't get...

Matt Coles:

Oh, we mangled this topic.

Chris Romeo:

Let me.

Izar Tarandach:

We, we didn't go left.

Chris Romeo:

We will, we will, but I have to provide my closing arguments. You had your closing

Izar Tarandach:

Oh yeah, please.

Chris Romeo:

So my closing argument. I mean, when I think about the value proposition of taking a separate organization and melding it together, yes, I understand there will be pain in figuring all those things out. That's why I don't think it's going to happen now. I think it may happen 10 years in the future, it may happen further out, but I think the future world is one where the security team is dissolved and is absorbed by various functions across the business. So the AppSec people and ProdSec people are in engineering. GRC, you'd have to take GRC and drop it under finance. Like I said, leave privacy under legal, or maybe bring privacy engineering back to engineering and let legal deal with the privacy issues on the side. But I think the value proposition in this is there's no longer infighting. There is, to Matt's point, budgetary alignment, because I can now just align on the things that I need to build the best possible product. And yes, your evidence about data breaches is stark and scary, but it's the truth. But I think over time, you know, when we get privacy legislation here in the United States, and I think it's coming, I just don't know when it's going to happen, I don't know if it's going to be GDPR for the USA, but I think it's coming, I think we need it. I think that can drive a lot of the focus here. But I just think at the end of the day, once we get this new system up and running, it'll save a lot of arguments and budgetary disagreements, and we'll put things in the right place, in my mind, put those functions where they can have the biggest impact, without the potential infighting that occurs between security teams and various other functions within the business. And so with that, Your Honor, I rest my case. Wait, no, I want to call a witness. I don't know who I would call. Who would I call as a witness? I don't have any witnesses myself. Um, okay, let's spend a couple of minutes having some fun with this idea of shift left. So, I got this new idea. I want to bounce this off you guys. I'm going to start a marketing campaign, okay? I'm going to try to get everybody in our industry to believe something that we were talking about in the late nineties, but I'm going to summarize it in a very simple way. It's going to be called shift left, and I'm going to try to get them to... I'm trying to reinvent this thing right before your eyes, but that's partially a joke. I mean, I wrote a post on LinkedIn. It's getting a lot of attention now, and I called it, It's Time to Stop Shifting Left. And the point of it is, like, I remember, do you guys remember Joe Jarzombek? From the Department of Homeland Security. You guys were probably in kindergarten when Joe was at DHS.

Izar Tarandach:

I'm older than you.

Chris Romeo:

This is the late... no, you weren't, but you guys were all about the same age. But late nineties, Joe Jarzombek was at the Department of Homeland Security and he started this revolution for security, for software security. And he called it Building Security In. You could go to buildingsecurityin.gov, and Joe and his team were writing all about software security and secure code, you know, things. And they did a pretty good job. But that's really what shift left is. Shift left isn't anything new. It's the same thing Joe was talking about back in the late nineties. And I looked it up. Okay, Larry Smith in 2001 was the first person to use this term shift left, and he used it in the context of shifting left in the project cycle. So he was like a program, project person.

Izar Tarandach:

Wait, Perl, Perl, Larry Smith?

Chris Romeo:

Ooh, is that the same one?

Izar Tarandach:

I'm asking.

Chris Romeo:

Now I don't know. Now you've got me now I'm, it's such a common name though. Larry Smith. Um, I'm going to,

Izar Tarandach:

I don't know. Izar Tarandach is the only thing that I have to think about how common it is.

Chris Romeo:

Yeah, uh, let me do a quick check on that, see my sources at Wikipedia, see if it was in fact Perl. Um, Larry Smith. Was it Larry Smith that did Perl?

Izar Tarandach:

No, it was Larry. Something else.

Chris Romeo:

Yeah, I don't think it was. So, but yeah, at the end of the day, this individual, Larry Smith, created this idea in 2001. And, you know, it's like 2017 or 18 or so when somebody in AppSec found it somewhere, and I can't figure out exactly who the first person was to use it in the AppSec space, but it wasn't... Oh, I thought you were saying Matt. Matt was the first one.

Izar Tarandach:

Larry Wall.

Chris Romeo:

Larry Wall is the Perl guy. Yes. How could I not remember that? That was my first programming language.

Izar Tarandach:

Sorry, my bad.

Chris Romeo:

Try to program... Let's do some regular expressions in Perl, shall we? Come on. It was so painful. It never worked. Okay, I'm off on a whole other tangent. So, I mean, Matt, I know you've been maybe not as fond of the shift left terminology from its very early days. Give us your... how do you really feel about this term shift left?

Matt Coles:

I think I have this quote in our book, actually: don't shift left, start left. Right? Think from requirements onwards. And I think we traditionally, or historically, fall into this trap of security comes later, right? And I'm looking at the article that you're talking about, from 2001, from Larry Smith; he's talking specifically about shifting quality left, and we've adopted this practice from a security standpoint for the exact same reasons, right? So, test driven development and having QA be part of the process. If you have requirements, why not involve QA? Well, again, if you have requirements, why not involve security? Security, as a stakeholder, should be introducing requirements, and as soon as requirements are known, or a concept has been developed, that's the moment when security can start to be effective, and the time when you can most cheaply impact the security of a system. And so, in my opinion, you don't really want to shift left, because shift left assumes you're starting from somewhere, meaning you started from as far right as possible, and you're working your way through the development lifecycle back towards requirements, or towards concept. But along the way, maybe you stop at whatever your quality gate is, your code freeze, your functional freeze date. Well, you've already built stuff. And now you have, okay, how do I retrofit, or how do I patch? Well, I have to patch after the fact, after I ship, or I have to delay shipment. And that's very costly. So I'm going to stop now, I'm going to ship. Keep shifting left: I'm going to stop at development. Now I'm going to find bugs in code, but again, I've already architected a system. So I'm going to continue shifting left, and I'm now going to stop at architecture. Well, but now I've already set my requirements, so I built an architecture around those requirements, and I can't really retrofit those. Or I've done concept development, and so I've gone down a certain path where it may be too late to make some security alterations. But if you had started left, then you could be most effective and most efficient, from a resource, cost, et cetera, standpoint.

Chris Romeo:

Isn't that just SDL though?

Matt Coles:

Well, that's how you implement this: an SDL program, or an SDLC that has security milestones and activities. And yes, an organization that has a good development life cycle that includes security at each phase of their life cycle has effectively already done the shifting, right? Now, this gets challenging because, again, we're no longer really in a waterfall model anymore, right? Even those companies that do waterfall aren't doing waterfall. But then again, companies that are doing agile aren't necessarily doing pure agile anymore either. So where is left in an iterative process, or a spiral? So it gets a little bit challenging there. But basically, anytime you have concept and requirements, that's the time to get security involved, because it's the exact same time you would get quality involved. It's enough like quality that you can use it as a placeholder. If you're going to involve quality, involve security.

Chris Romeo:

Yeah, and I think the part that's bothered me is this has been taken as a marketing term. It still has its core meaning, but now, if you look in various sources, you have shift left, you have shift right, you have shift everywhere. I even found shift up and shift down references. Where is that going? Like, are we shifting up to the moon, and you're going to run a secure product on the moon?

Matt Coles:

Actually,

Izar Tarandach:

Don't give them ideas, please!

Matt Coles:

We do, well, if we were going to look at those terms, I mean, we see this with shifting down the stack or up the stack, right? So traditionally, software security is software security, it's application development, as opposed to moving into the platform or moving into the hardware, right? And so you're moving down the stack, I guess. Don't...

Chris Romeo:

Are you legitimizing their term here?

Matt Coles:

No, absolutely not. Uh, just at least, or, or if it hasn't been coined, then maybe I'm coining it now, but, uh,

Chris Romeo:

No, people have. I've found people using shift up and shift down, but they didn't really have a good explanation for what it meant. It was just another kind of marketing: let's do something a little different than what the rest of the pack is doing. They're shifting left, so we're going to shift up or shift down, and...

Matt Coles:

Yeah. I can imagine shifting down is a lot easier as a concept. Like, if you are starting with applications already, there is no up, right? But there's probably a down.

Izar Tarandach:

There's always a down.

Matt Coles:

And, and there could be an out, I guess there could be an out as well, you know, looking

Chris Romeo:

Shift out?

Matt Coles:

Well, looking beyond your application now to the things around it, your ecosystem, customer base.

Chris Romeo:

I hope there aren't any marketers listening to this and they're like, Ooh, that's a good idea. We should shift out. That's our new campaign. We're gonna, our product shifts out right out of the building. Like, it's not even there anymore.

Matt Coles:

We need some... it's some four dimensional chess thing...

Izar Tarandach:

We call it Elvis!

Matt Coles:

We need some four... we need a fourth dimension there.

Chris Romeo:

Shift Elvis. Elvis has left the building.

Matt Coles:

Well, instead of directional, you know, directional, now we need time. So shift later, or shift sooner, or...

Chris Romeo:

Why can't we time shift?

Izar Tarandach:

But why do we have to shift at all? Like, it goes together with what we were talking about before,

Matt Coles:

There's a Rocky Horror Picture Show quote going on here. We need to get introduced...

Chris Romeo:

Matt's singing.

Izar Tarandach:

To tell you the truth. Okay, confession time. The first time that I heard the term shift left as applied to security to development, the image that I had in my mind was a bunch of developers doing the shuffle.

Matt Coles:

The time warp. It's a time warp.

Izar Tarandach:

It is not something I want to ever see again.

Matt Coles:

As opposed to Thriller. We could do the Thriller. It's perfect for Halloween!

Chris Romeo:

Oh, the time warp.

Izar Tarandach:

But, uh,

Chris Romeo:

Whew.

Izar Tarandach:

I think that where this goes together with what we were talking about before, and we already touched on it a bit in there, is that, first of all, it sounds good. Like, you say shift left, people can have this mental image of things tipping towards the beginning of a timeline and whatnot. So it's useful as a mental framework to think about how am I organizing my pipeline and stuff like that, right? But I think that every time that we say shift left, again, we... We. People. All of us. The industry. I love that term, the industry. It sounds like the mafia. The industry... it's very easy to say shift left. And it's very easy to say my product, my process, my way shifts left. But what are we shifting? How much are we shifting? How are we proving that it's shifting? And how are we showing that the fact that we shifted gives you a better result? That we haven't quite worked out yet.

Matt Coles:

The problem is shift is a relational term. When you shift, you start from some place and you go to some place. And when you say shift left, what are you shifting and where did you start from? We make the assumption that when you say shift left, it means, well, I didn't shift left originally and now I'm going to shift left, so I probably waited until the end. As opposed to, well, I could be starting from the middle somewhere and shifting left.

Izar Tarandach:

No, I think that you put it well when you reminded us that bolting on security is very much still a thing, unfortunately, for all the efforts that we have made in the past years. And I think that you pointed out that people adopted the shift left mentality because we were doing so much bolting on. But now, with what you just said, to me comes the question: how much is getting away from bolting things on at the end? How much of shifting away from that is a meaningful shift to the left?

Chris Romeo:

Now we're

Matt Coles:

I think unless you're, unless you're introducing it, jumping in at requirements and concepts, you're still bolting on.

Izar Tarandach:

Because that's

Matt Coles:

You're going to have to, you're going to have to bolt on, right? You're going to have to bolt on at the end because you're going to miss stuff, right?

Izar Tarandach:

Won't we have to bolt on something almost always because there will always be a finding that happened after the thing is out and deployed?

Matt Coles:

Well, so I wouldn't say patching. I would not say patching as bolt-on.

Izar Tarandach:

No, patching no, but changing stuff at the end.

Matt Coles:

Well, so here's an interesting question on that. So bolting on, I think, in my mind at least, is seen as: oops, I forgot about this thing, and now somebody reminds me I need to do it, and so I'm going to bolt it on. As opposed to: I have a concept and I want to make this thing modular, and so this is now an add on addition that can improve capability or whatever, and maybe I can charge extra for it or not. But you're making an informed decision. So a lot of our work is about making informed decisions, risk decisions, or business decisions. And so if an organization or, you know, product team, for instance, makes a decision to bolt on as an add on capability, that may still be acceptable compared to the traditional bolt on of, oh, we forgot, and now we're in reactive mode. So proactive bolt on versus reactive bolt on.

Izar Tarandach:

So let me put something out there, and this is going to be convoluted because it's that kind of week, so I apologize from the beginning. Ten years from now, when we have application and development and deployment security people inside of engineering, because they have shifted left so much, if at the end of the pipeline it is found that, hey, we forgot something, or we didn't think about this at the security level, that means that what was needed, the knowledge or the finesse needed to identify that thing, wasn't present in the engineering framework that created the thing. So it's almost like we need somebody with even more of a security background than those people that are in the pipeline to say, whoops, we forgot this thing, or hey, I just thought about this threat, or I just thought of this use case. So have we just, like, closed the wheel, gone back to where we started, and repackaged everything?

Matt Coles:

No, and I'm going to use another quality analogy here. When you make a mistake and you ship something, and you discover that you shipped it, you go through an RCA process, right? Some sort of quality control process. And as long as that process is consistently applied, or at least consistently devised, then that would work too. In other words, you don't necessarily need to have the foreknowledge to know that you missed something, because regulations will change, or customers will ask for something new, or somebody will decide to use your product in a way that wasn't originally intended, and now suddenly it is intended. You just have to have the process to bring that feedback back into your life cycle and make the adjustments.

Izar Tarandach:

But the trigger for that is an exception. The trigger, as you said, is it went out and somebody corrected us. What I'm saying is, in between that event of it going out and somebody being able to create an exception, there needs to be a different level that looks at different things that probably we don't look at today, to say, hey,

Matt Coles:

Uh,

Izar Tarandach:

be the last gate before this thing shipped.

Matt Coles:

So I wonder... I think in engineering organizations we call them technologists, right? We have CTO organizations that do advanced R&D, you know, advanced concept development. I think you have those opportunities to take those functions, the outcomes of their work, and feed them into this process. And if you...

Izar Tarandach:

Oh god, that's what they do? I never knew.

Matt Coles:

I think so. And so if you put security into those functions, so if you take the security folks, like security researchers, and put them in that function, and have appropriate feedback and communication channels, right, that may solve the problem that you're highlighting.

Izar Tarandach:

God, I disagree with that at so many levels.

Matt Coles:

Chris, do I have an ally here?

Izar Tarandach:

CTO organizations, in my experience, are some kind of like the gods on Olympus, saying we are looking forward 20 years, and we are talking quantum cryptography, and we are talking homomorphic operations.

Matt Coles:

We've

Izar Tarandach:

While you poor people are still writing PHP code. Don't come bother us with your thing. We are somewhere else.

Chris Romeo:

CTO organizations, in the context that we're talking about them here, let's be honest, a good part of their work is outbound. They're out speaking at conferences and they're thought leaders. Like, this idea where you can have 75 CTOs inside of an org... you know, it's a chief, you're a chief technical officer. There's only one CTO in a company. And maybe you have one at the business unit level that's truly making technology decisions, but I see a lot of the folks that wear the title of CTO now, they're not really... they may be influencing the product a little bit, but they're primarily out trying to demonstrate technical dominance in the industry of ideas and thoughts and things there, you know. They're like a developer advocate, developer relations type of role. But I think, you know, I want to wrap this whole thing up here and try to put a bow on it. Shift left, build security in, whatever we want to call it. It's not perfect. It's never going to be perfect. Like, even if you, and I'm not even going to say shift left, but if you built security in from the very beginning, you started left, to Matt's point, it's still a process that's being executed by people that aren't perfect, that is executing tools that are not perfect, that's building software and hardware that's not perfect. So at the end of the day, these are preventative measures. There are things that we can do to try to get ahead of as many things as we possibly can, but we're still going to have challenges. There's still going to be problems that come out of it, no matter where you start in the process. And what we've learned collectively over decades of experience, all the way back to what Jarzombek was doing at DHS, is if you start thinking about the security things when you start building the product, you're going to have a better security outcome when you ship. That's really all that matters. And what's happened in our industry is this shift left has been manipulated, weaponized, and used as a term for people to try to demonstrate that their product is the best one out there. And I think they missed the mark. I think they took something and they twisted it. And at the end of the day, it doesn't really say anything about what you do, because you don't really shift left. Your SAST tool doesn't shift left.

Izar Tarandach:

I think that when you say miss the mark, that's exactly the thing. When you point something way too much to the left, you will miss the mark that's in front of you. And that's what keeps happening. Like, shift left became a marketing thing. I'm leftier than you in the timeline, so therefore I am better than you, I'm saving more money than you.

Matt Coles:

It's kinda like a game of Frogger. You're shifting across and there's stuff coming from the side.

Izar Tarandach:

Yep,

Chris Romeo:

I.

Izar Tarandach:

And that's why I said we're shifting ourselves right off the table.

Chris Romeo:

I mean, I tell you what, with that, there's no way we can continue. Thanks, folks, for joining us at The Security Table. We have to wrap with that analogy. Shift left...

Izar Tarandach:

Wait, wait, wait, wait. What was the Frogger song? yeah, yeah,

Matt Coles:

Oh, no.

Izar Tarandach:

Yeah, yeah, yeah. I was thinking of the Pac-Man one. No, yeah, you're...

Matt Coles:

so you get run over and it.

Chris Romeo:

Yeah.

Izar Tarandach:

Splat.

Chris Romeo:

That's what happens when you shift left too much: you get run over and squashed. So, all right, folks, thanks for listening to The Security Table. Hopefully you found some value in this conversation. I think we ourselves have made some progress moving forward. So we look forward to you joining us again in future weeks to hear us debate, discuss, and pick apart just about anything we can think of in the world of security. So thanks for listening.
