Skills verification has been a facet of our lives for most of the modern era, granting us credibility and opening doors that wouldn’t otherwise be available. Driving, for example, is an important rite of passage for most of us, and we’re expected to pass a set of standardized assessments to confirm that we can be trusted with a four-thousand-pound machine capable of traveling over a hundred miles an hour. Mistakes, especially at speed, can cost you that privilege, or even a human life.
But what if, for some, driving is more than a day-to-day convenience and becomes an elite profession? A person can continue their upskilling journey and potentially become an F1 driver, permitted to operate machines far faster than any civilian could realistically handle without a high likelihood of error.
Against that backdrop, it seems baffling that most developers who work on the code powering critical infrastructure, automobiles, medical tech, and everything in between do so without first verifying their security prowess. On the other hand, why do security-skilled developers, who have repeatedly proven that they understand how to build software securely, need to queue with everyone else in ever-slowing development pipelines because of all the security gates? The industry doesn’t see this as an oversight; it’s the norm.
We know from extensive research that most developers simply do not prioritize security in their code, and lack the regular education required to navigate many common security bugs. They tend to be part of the reason that security at speed seems like a pipe dream, and many security-skilled developers feel like they are stuck in the slow lane on the Autobahn behind a bunch of learner drivers.
Despite this, the security world is slowly lurching forward, and there is increasing demand for developers with verified security skills who can hit the ground running. The Biden administration’s Executive Order on Improving the Nation’s Cybersecurity specifically calls for evaluating the security practices of vendors – and their development cohorts – for any supplier in the US government’s software supply chain. It stands to reason that the emphasis on developer security skills will only grow across most sectors. But with little on offer in the way of industry-standard assessments, how can organizations prove their security program is building verifiable developer security skills in a way that won’t bring delivery to its knees, or stop security-aware developers from spreading their wings?
Merit-based access control: Could it work?
Least-privilege security controls are a mainstay in many organizations: each role is assigned access to software, data, and systems on a need-to-know basis in the context of its job, and nothing more. This method – especially when paired with zero-trust authorization principles – helps rein in the attack surface. And, really, we should apply the same strategy to API permissions and other software-based use cases as standard.
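As a rough illustration of what that deny-by-default posture looks like in practice (the role names and scopes below are hypothetical, not drawn from any particular platform), a least-privilege check can be as simple as refusing any request whose required scope isn’t in the caller’s explicitly granted set:

```python
# Minimal least-privilege sketch: each role carries an explicit allow-list of
# scopes, and anything not on that list is denied by default (zero trust).
# Role names and scope strings are illustrative only.

ROLE_SCOPES = {
    "payments-engineer": {"repo:payments:read", "repo:payments:write"},
    "frontend-engineer": {"repo:web:read", "repo:web:write"},
    "contractor":        {"repo:web:read"},
}

def is_allowed(role: str, required_scope: str) -> bool:
    """Deny by default; allow only scopes explicitly granted to the role."""
    return required_scope in ROLE_SCOPES.get(role, set())

if __name__ == "__main__":
    print(is_allowed("contractor", "repo:web:read"))        # True: explicitly granted
    print(is_allowed("contractor", "repo:payments:write"))  # False: never granted
    print(is_allowed("unknown-role", "repo:web:read"))      # False: unknown roles get nothing
```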
Most of us in the security business are hyper-aware that software is eating the world, and the embedded systems code running your air fryer is really no different from the code keeping the power grid up and running in terms of its potential exploitability. Our lives and critical data are at the mercy of threat actors, and every developer, when properly educated, must understand the power they have to fortify their code. It requires a serious upgrade to an organization’s security culture, but for true DevSecOps-style shared responsibility, developers need a reason to care more about the role they play, and perhaps the fastest way to shift their mindset would be to tie code repository access to secure coding learning outcomes.
Take an organization in the BFSI space, for example: chances are good that it has highly sensitive repositories containing customer data or storing valuable information like credit card numbers. Why, then, should we assume each engineer who has been granted access is security-aware, compliant with stringent PCI DSS requirements, and able to make changes to the master branch quickly and without incident? While that may be the case for some, it would be far safer to restrict access to these sensitive systems until that knowledge is proven.
The challenge is that in most companies, enacting a “license to code” scenario would be arduous and, depending on the training solution, a little too manual to support any security-at-speed objectives. However, the right combination of integrated education and tooling can form the core of a developer-driven, defensive security strategy.
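As a sketch of what that tooling could look like (the skill tiers, repository names, and developer lookup below are assumptions for illustration, not a reference to any specific product), a pre-merge check might compare a developer’s verified secure-coding level against the sensitivity of the repository they are pushing to:

```python
# Hypothetical "license to code" gate: before a merge to a protected branch is
# accepted, compare the developer's verified secure-coding tier against the
# sensitivity tier of the target repository. In practice the verified tier
# would come from the learning platform's records; here it is stubbed with a dict.

REPO_SENSITIVITY = {
    "cardholder-data-service": 3,   # PCI-scoped, highest tier
    "internal-dashboard": 2,
    "marketing-site": 1,
}

# Stand-in for a lookup against the training platform (names are illustrative).
VERIFIED_SKILL_TIER = {
    "alice": 3,
    "bob": 1,
}

def may_merge(developer: str, repo: str) -> bool:
    """Allow the merge only if the developer's verified tier meets the repo's bar."""
    required = REPO_SENSITIVITY.get(repo, 3)          # unknown repos default to strictest
    verified = VERIFIED_SKILL_TIER.get(developer, 0)  # unverified developers get tier 0
    return verified >= required

if __name__ == "__main__":
    print(may_merge("alice", "cardholder-data-service"))  # True: tier 3 meets tier 3
    print(may_merge("bob", "cardholder-data-service"))    # False: needs more verified upskilling
    print(may_merge("bob", "marketing-site"))             # True: low-sensitivity repo
```

The point of a gate like this is that it fails closed: access to the most sensitive code expands only as verified skills do, rather than being granted by default and clawed back later.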
Effective training integration is not impossible.
Finding developer upskilling solutions that complement both high-velocity business objectives and developers’ workflows is half the battle, but going the extra mile to move past “one-and-done” compliance training is the only way we will start to see a meaningful reduction in code-level vulnerabilities. And for developers who successfully prove themselves? Well, the coding world is their oyster, and they don’t have to be hamstrung by security controls that assume they can’t navigate the basics.
Hands-on skills advancement that integrates seamlessly with the development environment provides the context engineers need to truly understand and apply secure coding concepts. Those same integrations can be used to manage access to critical systems, ensuring that developers who excel at their learning outcomes can work on the highest-priority sensitive tasks without hindrance. It also makes it easier to implement rewards and recognition, so that security-skilled developers are seen as aspirational within their cohort.
Like many things in life, fortune favors the brave, and breaking the status quo to adopt an outside-the-box approach to developer skills verification is exactly what we need to uplift tomorrow’s standards of acceptable code quality without sacrificing speed.