Software quality and security specialists are struggling to apply conventional safeguards to new and rapidly changing technologies. Now, new research is showing that conventional software or network-based security measures can be undermined.
A May 15, 2015 eSecurity Planet blog post, Integrating Bulletproof Security into App Development, by SoftServe’s Nazar Tymoshyk, gives some up-to-date advice on building secure software. The post notes the current parlous state of play in a section titled Sad but True: Security Process in Reality. The author goes on to list 14 Steps to Secure Software, including the use of the Open Web Application Security Project (OWASP) Software Assurance Maturity Model to evaluate the process.
In terms of the current state of play, the SANS 2015 State of Application Security report, Closing the Gap, is cautiously optimistic. While “Many information security engineers don’t understand software development—and most software developers don’t understand security,” this year’s State of Application Security Survey shows that the gap is closing slightly. The survey uses the OWASP Builder (developers), Breaker (pen testers), and Defender (infosec) community model. The results show that “The gap between defenders and builders is closing, and they share a common goal of eliminating risk from their processes. However, there is still much work to be done at many levels.”
If only it were that simple. A 28 April 2015 ScienceDaily article, Advancing security and trust in reconfigurable devices, from the Georgia Institute of Technology shows that the Builders, Breakers, and Defenders may all have more to worry about. Conventional approaches are now exposed to new challenges involving programmable logic devices, particularly field programmable gate arrays (FPGAs). “FPGAs are integrated circuits whose hardware can be reconfigured — even partially during run-time — enabling users to create their own customized, evolving microelectronic designs.”
The Georgia Tech researchers have identified multiple issues with FPGAs, which could introduce a whole new class of vulnerabilities. “Conventional protections such as software or network-based security measures could be undermined by altering the logic of a system utilizing programmable devices.”
On the plus side, the article describes a range of techniques the Georgia Tech researchers are developing to provide assurance in programmable logic designs. They have also developed the Trustworthy Autonomic Interface Guardian Architecture (TAIGA), “a digital measure that is mapped onto a configurable chip such as an FPGA and is wrapped around the interfaces of process controllers.”
The research team leader notes that “TAIGA ensures process stability — even if that requires overriding commands from the processor or supervisory nodes. It’s analogous to the autonomic nervous system of the body, which keeps your heart beating and your lungs respiring — the basic things that your body should be doing to be in a stable state, regardless of anything else that’s going on.”
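TAIGA itself is a hardware architecture mapped onto FPGA fabric, but the override principle it embodies can be sketched in software. The following is a minimal, purely illustrative Python sketch, assuming a hypothetical guardian that sits between supervisory nodes and a process actuator; all names (InterfaceGuardian, SAFE_RANGE, mediate) and the safe operating range are invented for illustration and are not part of the actual TAIGA design.

```python
# Illustrative sketch only: TAIGA is a hardware design, not software.
# This mimics its core idea: the guardian wraps the controller interface
# and overrides any command that would destabilize the process.

SAFE_RANGE = (0.0, 100.0)  # hypothetical stable operating bounds

class InterfaceGuardian:
    """Mediates commands from supervisory nodes to the actuator,
    clamping anything outside the stable operating range."""

    def __init__(self, lo: float = SAFE_RANGE[0], hi: float = SAFE_RANGE[1]):
        self.lo, self.hi = lo, hi
        self.overrides = 0  # count of commands the guardian rewrote

    def mediate(self, command: float) -> float:
        # Pass through commands that keep the process stable;
        # override everything else, regardless of its source.
        if self.lo <= command <= self.hi:
            return command
        self.overrides += 1
        return min(max(command, self.lo), self.hi)

guardian = InterfaceGuardian()
print(guardian.mediate(42.0))   # in range: passed through unchanged
print(guardian.mediate(250.0))  # out of range: overridden to 100.0
print(guardian.overrides)       # one override recorded
```

The key design point mirrors the quote above: the guardian enforces stability unconditionally, even when the out-of-range command comes from an otherwise trusted processor or supervisory node.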
See also the article The Airline Bug Scandal in this issue for other recent details of potentially disastrous security vulnerabilities.