A new report out today from the software security firm Veracode found that civilian federal agencies — those largely unconnected to the military or intelligence communities — rank dead last in fixing security problems in the software they build and buy.
That’s particularly relevant given that the massive hacking attack on the U.S. federal government’s Office of Personnel Management has exposed the personal information of at least four million people, and that number is likely to grow as the criminal investigation proceeds and more information comes to light.
The attack on the OPM, likely carried out by a group based in China, was significant for the damage caused, but it’s only the latest in a long string of computer security incidents at federal government agencies, the number of which has increased by more than 1,100 percent since 2006.
Veracode, based in Burlington, Mass., runs a cloud-based service that audits the source code of software applications for security vulnerabilities. The report documents the results of these scans carried out over the course of 18 months, ending in March, of 208,670 applications for its customers in both the private and government sectors. And it doesn’t make government IT managers look good.
The firm examined how often software used by its customers contained security flaws, how often those applications complied with widely accepted security standards, and how often vulnerabilities were fixed.
The company found that Web applications in use by federal agencies failed to comply with security standards 76 percent of the time. The standards, created by the nonprofit Open Web Application Security Project, are widely used across the Web. By comparison, it found that applications in the financial services industry comply with the OWASP standards 42 percent of the time.
It gets worse: Veracode also measured how often and how quickly software security flaws are fixed after they’re found. During the 18 months covered by the report, Veracode discovered a total of 6.9 million security flaws, of which its customers fixed 4.7 million. But when you break down the tendency to fix those flaws by industry, government agencies ranked dead last again. Veracode found the agencies patched the flaws found in their software only 27 percent of the time. By comparison, companies in the manufacturing sector fixed their flaws 81 percent of the time.
Why aren’t government agencies fixing their flaws? Because no one is requiring them to do so, says Veracode CTO Chris Wysopal. “They don’t fix them because there’s no regulation or compliance rules that require it,” he said in an interview with Re/code.
Additionally, government agencies often work with outside contractors to build their software or to deploy commercial software, Wysopal said. Often when security problems are discovered, government contracts don’t specifically require that the contractor fix the problem.
Government agencies tend to follow what IT pros call a policy-based approach to computer security, where agencies check off a list of requirements set by lawmakers and regulators that they have to follow. Private companies typically do the same thing, but they also add to their mix a risk-based approach. “With a risk-based approach, you look at what you have that attackers might want and what’s in place to stop them,” Wysopal said. “Both approaches are valid, but everyone should do both.”
And sadly, none of this is news in government circles. An April report by the Government Accountability Office found that the number of security incidents at federal agencies grew from 5,500 in 2006 to more than 67,000 last year. And the number of security incidents involving the personal information of employees or other people rose from about 10,500 to nearly 28,000 in 2014.
Gregory Wilshusen, the GAO’s director for information security issues and the author of that report, says agencies rarely have adequate programs and procedures for testing the security of their software and systems. “When we evaluate these agencies, we often find that their internal testing procedures involve nothing more than interviewing the people involved, and not testing the systems themselves,” he said. “We consistently found that vulnerabilities that we identify as part of our testing and audit procedures are not being found or fixed by the agencies because they have inadequate or incomplete testing procedures.”
And even when agencies try to fix the problems they’ve found, they often fail to fix them on the first attempt, Wilshusen said. “When we find these problems, sometimes we go back and look to see if they’ve been corrected, and if so, how. … We found that, often, the actions the agencies take aren’t sufficient.”
All of this might be forgivable if it weren’t for the fact that the government spends about $80 billion a year on computing and IT systems, of which a little less than $13 billion was devoted to cybersecurity last year, according to a report by the Office of Management and Budget. As the details about the OPM hack unfold, some people may wonder what all that money was for.
This article originally appeared on Recode.net.