The Security Table
The Security Table is four cybersecurity industry veterans from diverse backgrounds discussing how to build secure software and all the issues that arise!
Secure by Default in the Developer Toolset and DevEx
Matt, Chris, and Izar talk about ensuring security within the developer toolset and the developer experience (DevEx). Prompted by a recent LinkedIn post by Matt Johansen, they explore the concept of "secure by default" tools. The conversation highlights the importance of not solely relying on tools but also considering the developer experience, suggesting that even with secure tools, the ultimate responsibility for security lies with the developers and the organization.
The trio also discusses the role of DevEx champions in advocating for security within development processes, emphasizing the need for a balance between security and usability to prevent developers from seeking workarounds. They touch upon integrating security into the developer workflow, known as "shifting left," and the potential downsides of overburdening developers with security responsibilities.
There's a recurring theme of the complexity and challenges in achieving a "secure by default" stance, acknowledging the difficulty in defining and implementing this concept. The conversation concludes with an acknowledgment that while progress is being made in understanding and implementing security within DevEx, there's still a long way to go, and the need for further clarification and discussion on these topics is evident.
Matt Johansen's Original Post:
https://www.linkedin.com/posts/matthewjohansen_i-really-feel-like-a-lot-of-security-problems-activity-7170811256856141825-lKyx
FOLLOW OUR SOCIAL MEDIA:
➜Twitter: @SecTablePodcast
➜LinkedIn: The Security Table Podcast
➜YouTube: The Security Table YouTube Channel
Thanks for Listening!
All right. That's yes, that's a terrible visual. I'll just read it. So, hey folks, welcome to another episode of Behind the Scenes at the Security Table, which is now transitioning into the Security Table. Uh, my name is Chris Romeo, joined by Izar Tarandach, Matt Coles, uh, the Sultans of Security. I don't know. I'm just, I'm always, I'm always branding people. You guys know this.
Matt Coles:Where's the, where's the, where's the Mark Knopfler riff, uh,
Chris Romeo:Sorry, we didn't have the budget for that, but uh, maybe if we get a sponsor someday we can have like
Izar Tarandach:have a triangle here somewhere.
Chris Romeo:Yeah, or maybe Brook, maybe Brook Schoenfield will create us a, uh, on his guitar, he'll create
Izar Tarandach:Oh, that would be awesome.
Matt Coles:or, or our number one fan. Use some AI to, to do it for us.
Chris Romeo:That's true. Okay. So the tasking has been sent out to, uh, everyone to find us a little riff that we can use here. But let me read the, the LinkedIn post. It's going to be the, it's going to kind of set the stage for us and lead our discussion today.
Izar Tarandach:We call that
Chris Romeo:Matt Johansen posted this on LinkedIn. And what this post says is the following, and I quote, I really feel like a lot of security problems are actually developer experience problems. The more tools I put in my engineers' hands to do their job that are just secure by default, the less they figure out workarounds to introduce risk. I'm more successful the more I get out of their way and figure out ways to enable them to do what they need to do, but with guardrails. If you aren't already, find a champion with your DevEx org. It'll pay dividends. So we're talking about developer experience. We're talking about secure by default. We're talking about guardrails. Uh, so I want to give you guys the chance to set the stage or give us the first opinion here to, to get the conversation rolling. Matt, we always go to you. So we're going to do it again.
Matt Coles:Well, so actually I want, I, I want you, if you could to, there's a few points. So there's a few points. I mean, luckily, uh, Matt, uh, Johansen here, uh, constructed this, this, um, this post as individual sentences. We can pick it apart one by one, pick it apart. So I didn't mean to say that, uh, I meant to say, uh, analyze each, each point individually. But where does it come from? And I want to touch upon the secure by default statement first, because that's probably the more contentious out of all of them. So when I read that, the more tools I put in my engineer's hands to do their job that are just secure by default, the less likely they figure out workarounds to introduce risk. So, when I first read that, I was thinking, is the tool secure by default? Or is the tool producing guidance or results that are secure by default? So which do you think it is?
Chris Romeo:That's a good question. I mean, you can, you can, you could almost say it's both if we, if we, and we're definitely not a textual analysis podcast here, but we, we analyze the, well, there was the word "the" at the beginning of the sentence. Did you not see that? That changes the entire meaning, Matt.
Matt Coles:so I guess maybe the other piece here is, and it's another matter of interpretation, Izar.
Izar Tarandach:But no, point of contention, we can't. Because the first sentence actually says that the problems that he's trying to solve are actually developer experience problems. The result of the tool is not a developer experience problem per se.
Matt Coles:it's the use of the tool?
Chris Romeo:It's the usage of the tool. Yeah.
Matt Coles:Well, so, and then I was thinking that there's maybe another interpretation of tool here. So, and again, we need to, we need to bound our conversation because otherwise we're going to go off in tangents and it's going to get ugly. It may get ugly anyway. So, so do you think, because we don't, we don't know what Matt specifically was, was going for here, but, um, is he talking about like SAST or is he talking about, uh, production frameworks? So, for instance, are we talking about tools used during the development process? Or are we talking about the tools that developers deploy to, which I think we would commonly call infrastructure or frameworks, right? Is that what he's referring to? Because that,
Chris Romeo:I'd say yes.
Matt Coles:That's a very big difference there. D, all of the above. Okay.
Chris Romeo:So what is developer experience in the modern world of technology? Developer experience tends to be a team that's responsible for the tooling that a group of developers slash engineers use to build something,
Matt Coles:We used to just call them tools teams.
Chris Romeo:Tools teams, yeah. I mean, we used to call them tools teams back in the day, but they've gotten a fancier name now. That's fine. Everybody has fancier names for things. But yes, classically, in the early 2000s, even 2010s, we referred to them as the tools team,
Matt Coles:Or release engineering, or something like
Chris Romeo:Release engineering, yeah. I mean, for me release engineering was more of a team that was focused on getting builds out the door. They manage the pipelines, but developer experience was the tools team. They uploaded, or they, they upgraded pieces of software that developers use to build, uh, the things that they build. And so that's what I think we're talking about here, though, is tools that the developers use to build something. And so when I think of this, I'm thinking the IDEs are in this category. I think all of the AppSec tools that we've talked about ad nauseam on this podcast would fit into that category. Uh, CICD build related tools fit into that pipeline, right? All of these things are part of the developer experience. Anything the developer uses to build stuff is part of the developer experience.
Izar Tarandach:and that includes frameworks and SDKs and all that code
Chris Romeo:Yeah. Libraries, open source software, things like that.
Matt Coles:So I rate this a, I rate this as partially true.
Chris Romeo:Okay,
Matt Coles:So, I would contend, I will contend, that I could give every SAST, DAST tool, whatever, uh, SCA tool that is itself secure by default and have a piece of garbage outcome of a product.
Izar Tarandach:Yes, but those are tools serving the dev, those are not dev tools. Right? Okay, let's
Chris Romeo:they should be dev tools, but I would say classically and even in the modern age right now, I don't consider a SAST tool to be a developer tool. It is a security tool that we make developers run, sometimes against their will.
Izar Tarandach:a bit further into what he's saying, okay? It says, the more tools I put in my engineers hands to do their job that are just secure by default, the less they figure out workarounds to introduce risk. Let's look at SAST.
Chris Romeo:Well, SAST wouldn't really fit in that category though.
Izar Tarandach:right,
Matt Coles:oh, so now I'm, I see now I would think that a SAST tool is something you give to developers, meaning they run SAST, they go into
Chris Romeo:To do their job, developers don't need SAST to do their job.
Izar Tarandach:no. And they are the first ones to say that.
Matt Coles:hold on, if you do SAST, if developers are going to write secure code, right, you're going to run it
Izar Tarandach:No, that's, that's, that's them doing our job.
Chris Romeo:Of course that's good process, but that's not, so what are the, so if I, if, okay, I take a developer first day on the job, I sit them down. I say, here's your laptop. You only have a SAST tool. They can do nothing. They cannot be productive. They cannot, they cannot do anything in their job. They will quit two days later because they're like, all you gave me was this SAST tool and you locked down my laptop, so I couldn't even load an IDE or any of the things I needed to build software. So that's why
Matt Coles:a developer tool?
Izar Tarandach:No, but, but that's not where I see the problem. It's not the fact that it's the only tool that I have and that proves that it's, it's not a developer tool. To me, it's not a developer tool because a developer can achieve their objectives with or without the SAST. With the SAST, they are going to achieve better objectives from our security point of view, right?
Chris Romeo:a supporting
Izar Tarandach:to me makes it a security supporting tool.
Chris Romeo:it's a supporting tool or technology,
Matt Coles:Okay, so,
Chris Romeo:to the
Izar Tarandach:Not only that, not only that, but in that class of tools, there isn't anything that a developer can do to end with a less, no, there is, they can change config and, and stuff like that, but that's not going to lead them, apart from perhaps creating fewer tickets or whatever, or alerts or whatever, it's not going to lead them to their outcome, which is, at the end of the day, better code. Perhaps if you have a tool that's very aggressive in the CICD and it kills the build and they can shut that tool down, then okay, that lets them achieve their objective of putting code out in a less secure manner because the tool had bad defaults. But what I read in this, in this line here, is exactly what we spoke about, I think, last week and the week before and the week before that. Give them a library, give them an ORM. If they think that they need to, they will find ways around that goodness of the ORM to write their raw queries and introduce injection. So that's what I think is going on. If we start giving them the things that we're talking about, with the guardrails and libraries that work by default in a secure manner, and if those things are too in their faces, if those things are too restrictive, that is going to be an incentive for them to find smart ways around the controls that we put in place. So the whole thing defeats the objective. When you, as Matt says, find a champion with the DevEx org, you are bringing in somebody who can be the voice of balance in the world. You want this to be secure, but the developers want it to be flexible. Here's the middle ground where everybody can live happily.
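To make the ORM point concrete, here is a minimal sketch, not from the episode, of the workaround Izar describes, written in plain JDBC for illustration since no specific ORM or language is named; the users table and column names are hypothetical.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserLookup {

    // The workaround: dropping down to a raw, concatenated query.
    // Input like  x' OR '1'='1  changes the meaning of the SQL itself.
    static String findEmailUnsafe(Connection conn, String username) throws SQLException {
        String sql = "SELECT email FROM users WHERE username = '" + username + "'";
        try (Statement stmt = conn.createStatement(); ResultSet rs = stmt.executeQuery(sql)) {
            return rs.next() ? rs.getString("email") : null;
        }
    }

    // The guardrail an ORM or a parameterized API gives by default:
    // the user input is bound as data and never re-parsed as SQL.
    static String findEmailSafe(Connection conn, String username) throws SQLException {
        String sql = "SELECT email FROM users WHERE username = ?";
        try (PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setString(1, username);
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString("email") : null;
            }
        }
    }
}
```

If the raw-query path is easier to reach than the parameterized one, the defaults create exactly the incentive Izar is warning about.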
Chris Romeo:I think you've got to take a, you have to declare a moment in time where you're making this shift from developer tools that give developers all the freedom to developer tools that are secure by default.
Izar Tarandach:Yes. Agreed.
Chris Romeo:I don't think we are there yet. I think Matt is describing a future state that we should aim for, that he says he's trying for. He, you know, the more tools I put in my engineer's hands, he's, he's iterating on his, on his process of how he's working with developers. I don't think we are there yet, but I think there are things that we can do, for example, with the IDE, to make the decisions that the developers are making influenced or constrained by the guardrails, but also directed into a secure by default stance.
Izar Tarandach:So let's take the IDE. One popular example today is VS Code. Almost everybody's using it one way or another. Extremely powerful thing, like almost as powerful as Emacs, right? But the thing is that you, you, you bring it up, and the first thing that you get is there in the corner: restricted mode enabled, or restricted mode flashing, or whatever. And what that does, what that means, is that it's not going to interpret a lot of code, both local and external, that you could be importing as modules, stuff like that. The second thing that happens is the developer turns that off. It says, trust, trust, trust the code in here, because it's me writing, I'm great, trust it, or whatever, right? Now,
Chris Romeo:so well throughout our history.
Izar Tarandach:true, but everybody feels good because there was a security by default thing there in place
Chris Romeo:Well, you
Izar Tarandach:and the developer, wait, wait, wait, wait, I'm not done with the whole scenario, so they turned it off, cool, everybody continues doing their thing, writes their code, whatever. Now they go and they find a plugin, a plugin that they need to bring in. That plugin could come from a central repository with Microsoft, it could be some repository on GitHub, whatever. It comes in, they start using it. Nobody knows what that thing actually does, right? So my point in here is, you have this tool, developers need it, it's the experience that they choose, they build the experience by importing modules and functionality, and security has absolutely no control, or nice way of control, or visibility into what these guys are bringing in. So, the approach up to now has been that security takes a step back and says, we are going to create a local repository with things that we've vetted beforehand and we know where they come from, and we expect everybody to only pull modules and functionality from that repository that's vetted. Would we call this secure by default?
Matt Coles:Could be, and, and to follow up on that, so thank you for realigning my expectation here, uh, but now I want to ask a clarifying question along those lines. When we talk about secure by default, and we talk about risk, risk to who or what? Are we protecting the end system and the customers? Or are we protecting the business that's creating it? Because what you're describing, using vetted, validated components within an IDE, you know, as a developer plugin, is a business risk problem. Not a customer solution
Chris Romeo:Yeah, but isn't, I mean, at the end of the day, this is a great question, Matt, because I've been thinking about this over the last couple of weeks. At the end of the day, for me, every issue is a customer issue.
Izar Tarandach:If you're a SaaS,
Chris Romeo:my build infrastructure gets broken, ultimately the biggest thing that scares me is that my customer's data gets disclosed. I don't care how it happened, so everything for me funnels towards customer data disclosure.
Izar Tarandach:Or if it interrupts your service.
Chris Romeo:or denial of service or whatever, right? So, that's why I don't separate business,
Matt Coles:Well, it makes sense if you're in there. So we've, we talked about this a couple weeks ago with, with product versus, you know, service provider, right? And whether that's business service provider or cloud application service provider, right, where your infrastructure is where your customer data lives and your service resides and your business model is based on. As opposed to, like, product development, where if the CICD pipeline goes down and you can't ship a product, you know, on day one, well, if it comes up on day two, oh, okay, so you ship, you delay a day. It may be, may be bad, but it may not be catastrophic to your end customers, right? Um, so the risk level is different, but we're still talking about business risk, and I agree there's a transitive effect there to customers when you're talking about your business as a service provider.
Chris Romeo:Yeah, because without customers we don't have a business.
Matt Coles:that's right,
Chris Romeo:we can, we can measure all the business risk we want. If the customers are gone, we're just, we're just twiddling our thumbs,
Matt Coles:right. So secure by default, I will, I will argue that secure by default is not the only metric here, and unless we, unless we further define what secure by default is, right, so secure by default is securely configured, right, and in theory patched for vulnerabilities. Right, for known vulnerabilities.
Izar Tarandach:Mmm, I'm not sure if I'm there with you.
Matt Coles:Okay, I did that, I did that on purpose. I did that on purpose.
Chris Romeo:have to do with, I mean, you got the intended result. What does patching have to do with, uh, with secure by default?
Matt Coles:Okay, so secure by default is at least secure configuration, right? And you do secure configuration because you want to prevent somebody from, uh, having an exposure of something that can be bad, right? So you don't, you don't configure HTTP, because you want data in transit to be secure, so you use HTTPS only. You authenticate with strong credentials because you don't want to have a bypass of that authentication. You have strong RBAC in place, right, as a configuration option, because you want strong access controls. So if you don't patch as part of that secure by default process, you're leaving exposures, potentially, especially if those are attack surface type, uh, vulnerabilities.
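As a rough illustration of Matt's "securely configured" examples, here is a minimal sketch, assuming a hypothetical in-house helper class, of what an HTTPS-only default can look like at the code level.

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class HttpsOnlyClient {

    private final HttpClient client = HttpClient.newHttpClient();

    // Secure by default: callers simply cannot make a cleartext request through this helper.
    public String get(String url) throws IOException, InterruptedException {
        URI uri = URI.create(url);
        if (!"https".equalsIgnoreCase(uri.getScheme())) {
            throw new IllegalArgumentException("Plain HTTP is not allowed: " + url);
        }
        HttpRequest request = HttpRequest.newBuilder(uri).GET().build();
        return client.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }
}
```

The point of a wrapper like this is that the secure transport choice is the path of least resistance rather than an extra step.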
Chris Romeo:I think we've got to break secure by default up. I want, I want to segment it. So I think secure by default, I think design decisions. We have a number of loaded design decisions up front that, if we do not do or do not enact, will result in a crappy product whenever we get to production. Now there are supporting operational processes that'll have to, that are part of calling something secure by default. You can't tell me it's secure by default if you don't have a patching process. Oh, well, we can't, we don't, there's no upgrade. Well, then you're not, you're not secure by default. If there's no upgrade path, if you're not triaging vulnerabilities to that product and releasing new versions, hopefully in an, in an automated way that I don't even have to think about or worry about. Like the coolest thing I've seen in the last number of years is software patches from products I've purchased that just happen. Just magically. And then sometimes they tell me, sometimes they don't, right? So, but big picture for me, we have secure by default design decisions. We have operational things that are supporting, that have to be there for something to be secure by default, but they're not secure by default themselves. You can't say, well, because I patch, I'm secure by default. That, that doesn't add up.
Matt Coles:Sure, let me, actually I wanna, I wanna touch upon something since you just brought it up. Uh, if you have an automated patching process in your tool that you've given to developers as a developer tool, and you can push patches that potentially have functionality side effects, have you broken developer experience?
Chris Romeo:What do you mean by patching what? Patching, auto patching like the end product that we send to customers, or auto patching my IDE, or what? I don't,
Matt Coles:auto patching, auto patching the IDE.
Chris Romeo:Auto patching the IDE.
Matt Coles:So we know that secure, we know that patches that come out for system, for things can include functional side, functional side effects. Deprecation of APIs, deprecation of functions, removal of functions, change of behavior of functions.
Chris Romeo:But I mean, that's why you have a developer experience team though, that they don't just, they don't just let visual studio code updates flow to everybody whenever Microsoft posts them.
Matt Coles:So just having a secure by def,
Chris Romeo:has a triage process to ensure that they aren't breaking the whole world.
Matt Coles:So then just having a secure by default tool is not sufficient. Mm
Izar Tarandach:no, Matt, wait,
Chris Romeo:I never said it was the only thing. I never said it had to be the only thing. I'm not saying you don't have other things that support it. It's not standalone by any stretch. Sorry, Izar, go ahead.
Izar Tarandach:No, wait, wait, developer, developer, developer experience. I, I need to, to understand this better. To me, if you have a Porsche or if you have a bicycle, both are able to take you from point A to point B. You're probably going to enjoy the ride much more in the Porsche. Developer experience, same thing for me. VS Code or vi, both are going to enable you to write code. You're probably going to enjoy the experience much more in VS Code, because it's going to do a lot more for you. To my probably mistaken eyes, that defines developer experience. We are giving our developers the best environment where they can flourish and be innovative and be everything, and we are whispering in the back, and be secure, and be secure, right? And, and, I, I, I, I, to, to Matt's point,
Matt Coles:Which Matt?
Izar Tarandach:If you, the last one that you, that I think you made, if you patch that, that tool, that developer experience tool, and it takes a bit out of the developer experience, the developers will still be able to write code, just not with all the functionality that they are used to or demanding. So they will come with the pitchforks to the door of security, but it's not going to stop the company from functioning.
Chris Romeo:we've gotten, we've gotten hyper focused on the IDE versus focusing in on what the developer is actually building. We want the secure by, we want the tool to be secure by default in such a way that it drives the developer to the right decision and takes away any crappy decisions. Somebody
Izar Tarandach:goes to the library for me.
Chris Romeo:somebody adding a feature that enables the HTTP protocol, in this day and age, should not be allowed to happen. That should be prevented somewhere. Whether it's the IDE putting up a little buzzer and going, wah, sorry, wrong answer, you cannot do that. Whether it's the check-in process at GitHub, GitLab, whatever you're using, having some rules of engagement that say you can't do that. But, but somewhere we need the developer experience to catch that problem. And right now the challenge, this is all crystallizing for me now, the challenge we've had as an industry is that we've pushed all of that stuff off to these, uh, other tools. We've pushed the responsibility to SAST and the other one that starts with D and many other tools that exist out there, versus bringing it back into the developer's workflow.
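A minimal sketch of the check-in guardrail Chris is gesturing at, written here as a hypothetical standalone check rather than any particular GitHub or GitLab feature; the file scan and exit code are illustrative only.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class NoCleartextHttpCheck {

    // Returns true if the file references the cleartext http:// scheme.
    // A real guardrail would allow-list exceptions (e.g. localhost) and run in CI or the IDE.
    static boolean flagsCleartextHttp(Path file) throws IOException {
        List<String> lines = Files.readAllLines(file);
        return lines.stream().anyMatch(line -> line.contains("http://"));
    }

    public static void main(String[] args) throws IOException {
        for (String arg : args) {
            if (flagsCleartextHttp(Path.of(arg))) {
                System.err.println("Blocked: " + arg + " enables plain HTTP");
                System.exit(1); // fail the check-in or pipeline step
            }
        }
    }
}
```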
Izar Tarandach:So that, that's where the, the guardrails come in place, right? I mean, if I start bringing that into the developer workflow, basically what I'm doing is, I can't believe I'm saying this. I'm shifting left. I'm taking
Chris Romeo:boo, boo.
Izar Tarandach:I'm taking those tools, right, and bringing it in into the developer's desktop, which for some reason, some people thought that it was a really good idea. But, uh, the, the, the thing is that you're just transferring things, you're moving things from one place to another. You're not really, Changing the way they're done, changing the expectations, or changing the responsibilities. You're just giving more shitty work to the developer, and you're making that thing that they use to, to, why people need Macs with 64 gigs memory? Is it for, uh, uh, uh, running AI models? No, it's for running Visual Studio. It's, it's like, it's, it's becoming ridiculous, right? And we want to pile even more on that. So. I don't know, I think that these are very separate things that connect in that word experience. So the last line of Matt's post, uh, what was it?
Matt Coles:How about champion?
Izar Tarandach:guys. Yeah, uh, if you aren't already, find a champion with your DevEx org, it will pay dividends. So now we want to go to DevEx and explain security to them, and get a champion that, and everybody knows what I think about champions, and, and, and, again, transfer the responsibility. Do your work the way that suits me. Do your DevEx thing the way that I think it should be, because it's
Chris Romeo:but we've argued this point on the, at this, on the, around the security table a dozen times, at least about who's ultimately responsible for security and where do you get the results? You can't, yeah, but you can't get the results from you because there's not enough of you.
Izar Tarandach:you, you can't, you, you're totally right. But even if I transfer the responsibility to a developer, the final outcome is still my responsibility, because I decide what responsibility I'm transferring to them. Right? So, at the end of the day, it becomes two extremes. Actually, two extremes and a middle. You have security in here, defining the requirements, you have the developer in the middle actually doing the work, and then you have security at the end verifying that the work was done. I don't think that any of these three pieces in here is going to be better, well, better served by introducing dev experience into the mix. I mean, I love the idea of security by design. I love the idea of, uh, tools that provide security by design. I, I, I'm still not very clear on what's a tool that's secure by design. I think that we, we, we need to ask Matt actually what he means here. And, and just a plug, Matt has a great newsletter called Vulnerable You, no, Vulnerable U, and it's really cool. So,
Matt Coles:Sorry, secure by default was the statement that Matt, uh, Johansen said, not secure by design. That's important. It's important. Secure by design
Izar Tarandach:Yeah, yeah, yeah, yeah,
Matt Coles:and secure by default are two different things.
Izar Tarandach:It's a default, completely
Chris Romeo:what I've figured out as I've continued to study these things is everybody thinks they know what that means and nobody freaking knows what it means or how to do it. Which is hilarious
Izar Tarandach:I know I don't.
Chris Romeo:it, but most people are like, people are throwing it around now because of CISA's document and everything. And I'm like, I'm kind of, I'm starting to unpack that for some research I'm doing. And I'm like, nobody knows what this means. And there is no defined definition of how to do it.
Izar Tarandach:You, you know what secure by default means?
Matt Coles:We tackled this
Chris Romeo:Secure by default is easier than secure by design to wrap your brain around.
Izar Tarandach:Secure by default to me, and I know that nobody asked, but to me, it means that once upon a time I worked in a place that was very, uh, forceful about publishing hardening guides. Every product goes out with a hardening guide. To me that means the product is going out, by default, open, and it's on the customer's side to apply these hardening guides to close it. Secure by default to me means it leaves the door closed. It goes out
Matt Coles:We, we, we had this, we had this argument, this discussion about loosening guides. That's the new term that CISA wants to use. yeah,
Izar Tarandach:boo, boo, boo.
Matt Coles:because releasing, releasing with a hardening guide, it implies that it was left, it was open when it left the factory, but it should be hardened, and that's why we give hardening guides. But, but loosening is not the right, I think we've decided, we've, we've had this conversation that loosening is not necessarily the right thing. It's a secure deployment guide that gives you options for deployment. But it should be, by default, one of those secure options. Oh,
Izar Tarandach:so we go right to the, the heart of the thing in this dev experience discussion. If we do something that's completely closed, like the IDE or the libraries or whatever, and we say, we are not giving you guys a loosening guide because that's a bad thing, I think that that's what Matt means when he says they are going to find ways around. So if you don't give them the loosening opportunities, they will find them, and you won't have control over them, because you believe that, that everything is by default,
Chris Romeo:But you have to make it easier for them to use the new tooling and the new guidance that you're enforcing in the tooling than if they had to go do it themselves. So you can't make it harder so they have to try to sneak around it. You have to enable them with this and think of it as a positive instead of a negative.
Matt Coles:so Matt basically, Matt said though, it's my job as a security person, or his job as a security person, to give developers tools that are secure by default, so that they don't have to find workarounds. And what Izar was saying, and I agree with him, is if you give a locked down system, the reason why I don't recommend hardening an operating system before you give it to a developer is because they're gonna have to unharden it to do their job. And so if you give a secure by default component, and let's talk about frameworks and components more so than business tools, although the IDE is probably a good example of a business tool, a developer tool, if you give it in a secure by default state and it doesn't work the way the developer needs it to work, they're going to have to find workarounds. And that, that contradicts Matt's argument that if I give a secure by default tool, developers will be happy and can do their job.
Chris Romeo:we've gotten hyper focused on the IDE. But we're not talking, this is not talking about hardening an IDE environment. It's, it's, it's the decisions that are coming and being implemented in the IDE that are the things that measure this. It's, it's secure by defaultness. I'm making that
Matt Coles:I
Izar Tarandach:I think that it's talking about giving to the developers something that's already closed from the shop. It comes to them closed. This is what you're going to use to work,
Matt Coles:Right. And, and, and, and you can't, because, because, well, so let's, let's, let's go with, with the line of, of the thread that you were just, was just suggesting there. So if you're talking about what gets built in the IDE, and you give, so first off, you're not going to give pre hardened code, right? You're not going to give secure by default code because the developer, that's the developer's
Izar Tarandach:No, yes, you will, you will, you will, libraries,
Chris Romeo:it's the
Izar Tarandach:components, libraries, third party.
Chris Romeo:I don't think that secure by default IDE means that it only allows secure code. But let's talk about, let's use an open source example, okay? So, if I want to add a component to my project, should I as a developer inside of a big company be able to go scour the internet and find any package I want and load it into my, via my IDE? I say no. I don't, I, in a large engineering, a startup is a different thing. A startup is, you get, you, you got to have that, that ability to move and shake and do things differently. But in an established engineering organization, I do not want packages being added from some place where we have no idea how trustworthy they are. They could have been backdoored, they could be malware. And so I want the IDE to prevent somebody from adding that package that hasn't been vetted into a
Izar Tarandach:Yeah, we, we spoke about that with the visual code example, right? Are you going to get it from a repository or you're going to go everywhere? But
Matt Coles:as long as there's guardrails, as long as there's guardrails for innovation, that's fine, right? So developer,
Chris Romeo:there's gotta be a process where you can say, Matt, you can say, hey, hey, but I found this cool library. It's not on the list. Somebody's got to review it and we got to get this into our product because it's great. That's fine, because then we don't, we don't just have random software being added into our software, and then we wonder why our build systems got hacked, or some other type of challenge came as a result of an open source component that was, that was breached.
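Here is a rough sketch, with made-up package coordinates, of the vetted-list-plus-exception idea Chris describes; in practice the approved set would live in an internal artifact repository or a policy file, not in code.

```java
import java.util.List;
import java.util.Set;

public class DependencyGuardrail {

    // Components security has already vetted (illustrative; normally sourced from
    // an internal repository or policy file rather than hard-coded).
    private static final Set<String> APPROVED = Set.of(
            "org.apache.commons:commons-lang3",
            "com.fasterxml.jackson.core:jackson-databind");

    public static void main(String[] args) {
        List<String> requested = List.of(
                "org.apache.commons:commons-lang3",
                "io.example:cool-new-library");   // hypothetical unvetted package

        for (String dependency : requested) {
            if (APPROVED.contains(dependency)) {
                System.out.println("OK (vetted): " + dependency);
            } else {
                // Not a hard "no" -- route it to the review/exception process instead.
                System.out.println("Needs review before use: " + dependency);
            }
        }
    }
}
```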
Izar Tarandach:so from experience there will be those draconian places, and by the way, this happens with Chrome extensions as well. No, here's the list of the ones that you can use and whatever else, shut up, we don't want to hear about it. There will be those that will say, here's a set of procedures and tools that you can use to make sure that your risk is as little as possible, and after you run your proposed extension or whatever through this, then we are sort of okay with you using it. And there are those places that say, do whatever you want, right? But, let's bring that to, to, let's get out of the IDE for a second. Another thing that we all love and love to hate, Java. There is that file, java-dot-whatever, that I forget the name of, where you can define which ciphers are available in the cryptographic extensions of Java. If there was a way to enforce that every single developer would use only that version of that file, I don't believe there is, but if there was, would we say that this is an example of a developer tool that's secure by default, and that there might be some cases where a developer could come and say, I can't work with these ciphers, I need something else?
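The file Izar is reaching for is most likely the JDK's java.security configuration, where properties such as jdk.tls.disabledAlgorithms restrict what the runtime will negotiate. Below is a hedged sketch of the same control applied programmatically; it assumes the standard jdk.tls.disabledAlgorithms security property, has to run before the TLS stack is first used, and only governs TLS negotiation, not every direct use of a cipher.

```java
import java.security.Security;

public class TightenTlsDefaults {
    public static void main(String[] args) {
        // Normally these values are pinned in the JDK's conf/security/java.security file
        // so every developer builds against the same baseline.
        String current = Security.getProperty("jdk.tls.disabledAlgorithms");
        if (current == null) {
            current = "";
        }
        System.out.println("Default disabled algorithms: " + current);

        // Tighten the baseline: make sure legacy ciphers stay off the table.
        Security.setProperty("jdk.tls.disabledAlgorithms",
                current + ", DES, DESede, RC4, MD5withRSA");
    }
}
```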
Matt Coles:I don't think it's a developer tool.
Izar Tarandach:Compiler's not a developer tool?
Matt Coles:The configuration of your it's not your compiler. It's the configuration of your runtime, I believe. It doesn't affect the
Izar Tarandach:Right, right, right, right, right, right, you're right, you're right, you're right, you're right, you're right,
Chris Romeo:again, it's a design decision. Right? It's a, we're back to that. That's a design decision. What crypto algorithms do you allow to be included in a project? Now,
Izar Tarandach:But you are guardrailing it, you're guardrailing it by only giving the known good options for that design decision.
Chris Romeo:yeah,
Matt Coles:but it's not
Izar Tarandach:default.
Chris Romeo:Flip it around the other way. Do you want a developer to be able to add DES into that configuration file and start using DES encryption, which we all know isn't even encryption. We can, it's barely, we can barely call it encryption at this point because it can
Matt Coles:DES with the, Triple DES with distinct keys, Triple DES with distinct keys is at least something.
Chris Romeo:Yeah, but that's not, yeah, Triple DES, yes. Now you said Triple DES, which is a different algorithm. Yes. It's using the same, the
Izar Tarandach:But, but that's not a flip. That's not a flip, that's the continuation of the scenario.
Matt Coles:Well, so, so then, then by extension, are we, are we bringing, are we bringing, um, specifications, whether that's coding conventions or, or in this case, a crypto specification, into the fold of secure by default and/or developer tools? So as an example, if we set a standard for what crypto algorithms to be used or can be used and/or cannot be used, we will end up with things that are secure by default. Well, secure by design, potentially secure by default. Uh, and, but, but that's, that's a standard. That's a specification. That is not a tool. And it's not a developer experience,
Izar Tarandach:but it's not a tool, yeah.
Matt Coles:but it's not really a developer experience, is
Chris Romeo:I mean, the, the, that, that's, certainly everything you just described is right on, that is part of secure by design, potentially secure by default. But let's bring it back to this conversation and say, do you want the technology that your developers use to enforce that for you? Because that would make it, that would make the technology, the IDE they're using, secure by default. If they go to, or maybe it's when they go to check in the code, like we can also push it out of the IDE into the source code control system.
Matt Coles:Or a CICD, or
Chris Romeo:yeah, I like that a little bit less because now we're letting them waste time on something that's never going to get approved. And they may spend three days implementing something and they go to check it in and the thing says no way you can't use DES encryption. What are you doing? Whereas if it's in the IDE and we're able to enforce that and say, hey, this is something we don't do. We don't allow DES encryption anymore because it's not 1964.
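For contrast with the DES example, here is a minimal JCE sketch of the modern default a guardrail might steer developers toward, AES-GCM with a fresh IV per message; key handling is simplified for illustration and would come from a key management service in practice.

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class ModernEncryptionExample {
    public static void main(String[] args) throws Exception {
        // A 256-bit AES key; in a real system this comes from a key management service.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // AES-GCM: authenticated encryption with a random 96-bit IV and 128-bit tag.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal("customer data".getBytes(StandardCharsets.UTF_8));

        System.out.println("Encrypted " + ciphertext.length + " bytes");
    }
}
```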
Matt Coles:it's part of the SAST, or it's identified by the SAST
Izar Tarandach:so here's, here's another take. What if we found a way to do all this securing by default that he's proposing in those tools? Would it be fair to say that, going on an arbitrary 80/20 divide, 80 percent of the developers don't care enough, or don't reach the level that they have to disagree with the defaults of the tools that they are using? They just want to get code out of the door. And the 20 percent that do disagree are hopefully going to engage in a dialogue with security around those defaults, because they understand that they are breaking the defaults. So could we say that if we go the way that he's proposing, by closing things as part of developer experience, what you get at the end of the, of the experience is that either you're going to have things that are secure by default, 80 percent of the cases, and 20 percent of the cases, people are going to come and talk to, to security and actually have a dialogue around those defaults, and perhaps figure out that, hey, we need different defaults, or, hey, we need a different way of doing this kind of thing, right? So could it be that what he's proposing, rather than leading to a more secure and beautiful future, actually leads to a higher level of dialogue between security and developers through developer experience? Does that make any kind of sense?
Chris Romeo:Yeah. Yeah.
Izar Tarandach:Yes,
Chris Romeo:I think the, the connection once again. We all, I think, agree on this point, that part of, part of being a good security person is understanding the plight of the other functions across the business, from developer experience to product to engineering. The best way that we support them is when we walk a mile in their shoes and understand what they're going through. So yeah, I mean, I think you're, I think you're going in the right direction here, that this is going to create more dialogues. So yeah, that comes back once again to the exception process. The secure by default tool cannot be draconian in that it's locked down and there is no, sorry, we don't allow that. Uh, it has to have an exception process where you can have a request sent to Matt as our security engineer and he can review it and go, um, okay, I can't find any reasonable reason why you'd want to use DES encryption. Make your argument, give them the opportunity. But it could be something a lot more, you know, it's okay. Here's a, here's an even better example. The, um, the, the engineer, you have an engineer who really knows crypto and they say, well, um, I want to use, uh, some of this, this new quantum crypto in our product, which is, potentially there are quantum safe algorithms or something like that. They want to use, they want to use something that's actually potentially a higher level of security than what you're doing, than what your minimum standard or your maximum standard kind of prescribes. Then Matt could have a dialogue about, well, you know, we don't think quantum safe algorithms, we're not that afraid of quantum computers yet. But thanks for looking into that. And we'll, we're going to be chasing that in our minimum standard for the future. And we'll definitely make a note to circle back with you, cause it looks like you've got some really strong opinions,
Matt Coles:So, I
Izar Tarandach:I'm using a memory safe language. Hehehehe
Matt Coles:actually disagree, I think I might disagree with that just a little bit. Security should be consulted, I think, in that case, because there's value in that conversation. But you don't want to, if a product wants to go beyond and innovate, because maybe they have pressures coming from their customers, right? Then we should be supportive of that within reason. I, I, I, that's setting the guardrail. It's not the exception process, because what we're talking about here is making a choice that goes above and beyond the minimums, and you should never set a maximum, right? Security should never say, well, you, you should, you, you should at least use AES 128, but you can't use anything more than AES 256.
Chris Romeo:Yeah. But what if they want to use a different
Izar Tarandach:No, no, no, that's not true. That's not true. What if you have a performance problem and if you go too high, that's going to bring things down? There is such a thing as too much security. No, no, no, but security could be influencing that performance problem. What I'm saying is there is such a thing as too much security for any given
Chris Romeo:There is a such thing as reasonable security too.
Izar Tarandach:There should be.
Chris Romeo:It
Matt Coles:reasonable security doesn't preclude, reasonable security does not preclude the ability to go above and beyond, right? Because a developer, development team may decide they want more than
Izar Tarandach:But that's future proofing. What today is reasonable tomorrow could be the minimum.
Chris Romeo:yes. So
Matt Coles:like RSA, like RSA, going RSA 4096 on a, on a key, key length, as opposed to RSA 2048, which is the, which is the
Izar Tarandach:Right, but if you tell people you always have to use 4096, anybody writing for those things is going to come and say I can't do that.
Matt Coles:That's right. Developers will find ways around though, if those are too stringent guidelines.
Izar Tarandach:Yep.
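A tiny sketch of the minimum-not-maximum point using Matt's RSA example; the key sizes are illustrative, not a recommendation from the episode.

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;

public class RsaKeySizes {
    public static void main(String[] args) throws Exception {
        // Baseline the standard requires today.
        KeyPairGenerator baseline = KeyPairGenerator.getInstance("RSA");
        baseline.initialize(2048);
        KeyPair ok = baseline.generateKeyPair();

        // Going above the minimum should not need an exception; it just costs more CPU,
        // which is the performance trade-off Izar raises.
        KeyPairGenerator stronger = KeyPairGenerator.getInstance("RSA");
        stronger.initialize(4096);
        KeyPair alsoOk = stronger.generateKeyPair();

        System.out.println(ok.getPublic().getAlgorithm() + " keys generated: 2048 and 4096 bits");
    }
}
```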
Matt Coles:But that's not secure by default. That is guidelines or specifications, right? Guardrails, the choice of tools in the post originally, is where I think we're getting hung up. The definition of what a tool is, a developer tool specifically, versus the things that developers use, that they configure, that they build, that they produce. Those are the things that we're talking about more
Izar Tarandach:I think that the verdict here is that we have to invite him to explain himself.
Chris Romeo:It took us 42 minutes to get to the point where we decided we should have just had the author of this post here so he could just tell us what he meant.
Izar Tarandach:yeah, I don't think we quite grasped what he was going for, but
Chris Romeo:I mean, I
Izar Tarandach:we're in the ballpark, I think.
Matt Coles:Yeah. I mean, there's a lot of discussion points here and, and, you know, maybe he didn't intend, maybe he wasn't thinking in, in any of these areas, but from a secure by design and secure by default standpoint, you kind of have to.
Izar Tarandach:Right, right.
Chris Romeo:Yeah, I think that's a good place to end our discussion for today about, uh, this developer experience, secure by default, guardrails, and DevEx champions. I feel like we made a little bit of progress. I don't know if I could measure how much, but I think we made a little bit of progress forward, at least in our own understanding. We had to crystallize some points to make this whole thing work. So, uh, folks, thanks for listening again to The Security Table and stay tuned for another episode real soon.