
Developers don’t care about (data) security!

I’ve heard the title of this article uttered in exasperation by more than a few CISOs. That can’t be the case though, right? Developers are some of the most paranoid, cautious, security-conscious people I know. Compared to the average person, developers are far more skeptical when it comes to their personal data. Even as a CEO, those instincts from my time as a full-time dev persist. Whether it’s using a VPN, enabling 2FA everywhere possible, or simply having a lower risk tolerance for flashy new “free” services, I care much more about my data than many of the non-tech people I know.

I think we can at least all agree that developers care about protecting their privacy and, by extension, their data. But what happens when it comes to the data of their users? Should they care? Is it the developer’s job to protect user data to the level that they protect their own?

Spoiler: yes, but with some help.

What does security mean for developers?

Security is about protecting assets, but that’s quite vague. Ultimately, security means different things to different people.
Talk to information security people who started their careers 20 years ago, and security means firewalls, antivirus, networking, and the infrastructure that went toward securing physical devices. If you are a developer today, security means something different. It’s much more granular than one-size-fits-all tools and less tangible than a hardware firewall.

For developers, security is about writing or using low-risk, low-vulnerability code. Those risks can be categorized and prioritized; that’s what the OWASP Top Ten is for. The Open Web Application Security Project (OWASP) maintains this ranked list of the most critical web application security risks, along with guidelines and principles to counter them.
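
To make that concrete, take injection, a risk that has sat at or near the top of the OWASP Top Ten for years. Here is a minimal sketch in Python (using the standard-library sqlite3 module purely for illustration) of the difference between interpolating user input into SQL and using a parameterized query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "' OR '1'='1"  # a classic injection payload

# Vulnerable: user input is concatenated straight into the SQL string,
# so the payload rewrites the WHERE clause and matches every row.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# Safer: a parameterized query treats the input as data, not SQL,
# so the payload matches nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # prints [('alice',)] – the payload leaked the whole table
print(safe)    # prints []
```

The fix costs nothing at coding time, which is exactly the point: most of these risks are cheap to counter when developers know to look for them.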

Development is a field that changes extremely fast, and with it, the security risks and countermeasures change quickly too. Compare the 2017 and 2021 editions of the OWASP Top Ten and you can see significant differences in only four years!

This practice, where developers focus on security at the application level, is known as AppSec.

Security in the developer’s journey

Developers at modern companies need to keep moving. In addition to their daily tasks, there’s an expectation of continuous improvement. Leveling up, upskilling, whatever you like to call it. Be it a new version of Rails, the hot new front-end architecture, or even the latest deployment pipeline. This is a lot.

Security is a matrix of risks that you can apply throughout the software development lifecycle. As a developer, these risks permeate everything you touch. With the constant barrage of vulnerabilities and security expectations on top of this, you have to wonder: do developers have time for security?

The short answer is: they should. Because if they don’t, they are part of the problem. Security is not an add-on that you can delay until the MVP is finished, or until v2.0, or even v1.1. Security is everyone’s responsibility, from idea to production.

That said, developers are under more pressure than ever to ship code quickly. Unfortunately, most developers are not security experts, so they aren’t well equipped to write secure code on their own. This creates a huge problem: organizations are shipping insecure code, quickly.

So how do we solve this? The answer is not to put more pressure on developers.

Developers should not be security experts

Let's start with the assumption that all developers intend to write secure code. Even so, they shouldn’t be the ones responsible for discovering, understanding, and finding the best remediation strategy for every security risk. So they want to write secure code, but we shouldn’t expect them to self-assess. How does that work?

A good parallel is quality assurance (QA). Developers are notoriously bad at QA’ing their own code. There are many reasons for this, but the most important one is that you can’t impartially evaluate your own work, no matter how good you are. That’s true for everyone, which is why we have QA engineers and QA processes.

For application security, that’s the role of the application security engineer, or AppSec engineer. Here is how GitLab describes this role:

“The AppSec Engineer is responsible for working with the development team to help them build security into the application development process and to find and fix security vulnerabilities in GitLab’s products.”

The main responsibility of an AppSec engineer is to help developers write more secure code by providing training, guidance, and expertise. It’s not about reviewing every line of code or being a gatekeeper; it’s about empowering developers to think about security from day one, while coding.

Developers should not be security experts. They should focus on writing code, and writing it securely, thanks to the expertise they get from the application security team.

AppSec is about recognizing that security is not an afterthought but an integral part of designing and building software. AppSec is the security side of the “shift left” movement.

Why does it matter so much?

Let’s look at numbers to make sure we all agree on the urgency.

85% of organizations will be building cloud-native applications by 2025, essentially shipping code at a bigger scale than today, while application vulnerabilities remain the most common external attack method. Add to that the fact that 98% of companies have experienced at least one cloud data breach in the past 18 months, up from 78% the year before.

The good news is that 57% of security teams have already shifted security left or are planning to this year. That still leaves a lot of work and education ahead if we want to actively build more secure software. We are not there yet.

At Bearer, we are on a mission to empower developers to build more secure applications with a data-first security model. Protecting sensitive data should not be an afterthought, so why wait?


Ready to shift data security left?

Request early access and start discovering data in minutes.