1. Design security into the mobile app
The first step should always be to consider security during the application design stage. Relying too much on data stored on the client, for example, can open an attack vector for a variety of bad actors, according to game developer Glaser.
“I have become a big believer that you can’t retrofit security, which is how most things get done,” Glaser says. “You build an application—maybe you do some testing—but that is very difficult and very costly.”
By considering this advice at the beginning of a project, developers can more easily and cheaply create secure applications.
2. Test each iteration of the product
Once a secure design is created, developers should make sure their code doesn’t result in vulnerabilities. Frequent code scanning (not just at the end of the project during the quality assurance stage) and threat modeling can help detect any vulnerabilities or design flaws that creep into the application, says Sriram Ramanathan, chief technology officer at Kony, a maker of mobile app development tools.
When the company builds its own tools, it follows a secure design lifecycle that incorporates extensive testing.
“We document a very clear set of security use cases, and then we design abuse cases, which are ways of testing the products,” he said. “The design process for us includes a threat model, which defines the threat vectors, and then we come up with means of engineering mitigations and we test for those issues.”
As part of testing, developers should also run their application and monitor network traffic. Often, coding libraries and advertising frameworks can perform insecure activities, which are revealed through monitoring.
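The "abuse cases" Ramanathan describes can be expressed as negative tests that run alongside ordinary unit tests on every build. The sketch below is purely illustrative, built around a hypothetical session-ID validator; it is not Kony's product code, and the validation rule is an assumption for the example.

```java
import java.util.regex.Pattern;

// Illustrative abuse-case testing: a security use case (valid input is
// accepted) paired with abuse cases (attacker-style input is rejected).
public class AbuseCaseDemo {
    // Hypothetical rule: session IDs are 8-32 alphanumeric characters.
    private static final Pattern SESSION_ID = Pattern.compile("[A-Za-z0-9]{8,32}");

    static boolean isValidSessionId(String id) {
        return id != null && SESSION_ID.matcher(id).matches();
    }

    public static void main(String[] args) {
        // Security use case: a well-formed ID passes.
        System.out.println(isValidSessionId("abc123XYZ789"));      // true
        // Abuse cases: hostile inputs must all fail.
        System.out.println(isValidSessionId("../../etc/passwd"));  // false
        System.out.println(isValidSessionId("a".repeat(10_000)));  // false
    }
}
```

Because the abuse cases live next to the code, a regression that re-opens one of these holes fails the build immediately rather than surfacing at the quality assurance stage.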
3. Encrypt data stored on the device
Poorly implemented encryption is a major problem for many mobile apps. Just ask Starbucks. In 2014, a security expert found that the company’s mobile app left users’ data unencrypted on the device. Historically, mobile apps have struggled to protect data due to oversights, such as not implementing encryption on the connection to the server and not storing authentication credentials securely.
“You might not be able to guard that information while the application is running, but in terms of the way you store data and the way that you transmit the data off the application, it is vitally important that you think of data in those terms and encrypt it,” Trustwave’s Henderson says.
When deciding which communications and data to encrypt, developers should consider how the data could be protected if an attacker gains control of the application. Programmers have to view their application through the eyes of the attacker, he says.
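A minimal sketch of what encrypting data at rest can look like with the standard javax.crypto API, using AES-GCM so the data is authenticated as well as encrypted. The key handling here is deliberately simplified: on a real device the key should come from a platform keystore (such as Android Keystore) rather than being generated and held in the app, and it must never be stored beside the ciphertext.

```java
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

// Sketch: encrypt/decrypt a blob of local data with AES-256-GCM.
public class StorageCrypto {
    private static final int IV_BYTES = 12;   // 96-bit nonce, standard for GCM
    private static final int TAG_BITS = 128;  // authentication tag length

    static SecretKey newKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        return kg.generateKey();
    }

    // Returns IV || ciphertext so the random nonce travels with the data.
    static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv);  // fresh nonce for every encryption
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        byte[] ct = c.doFinal(plaintext);
        byte[] out = new byte[iv.length + ct.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ct, 0, out, iv.length, ct.length);
        return out;
    }

    static byte[] decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key,
               new GCMParameterSpec(TAG_BITS, Arrays.copyOf(blob, IV_BYTES)));
        // Throws AEADBadTagException if the stored blob was tampered with.
        return c.doFinal(blob, IV_BYTES, blob.length - IV_BYTES);
    }
}
```

Because GCM authenticates the ciphertext, tampering with the stored blob causes decryption to fail loudly instead of silently yielding corrupted data.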
4. Identify and actively manage third-party libraries
Developers should use a system to regularly check the third-party code in their products for updates, so it stays current with the latest versions. Failing to do so could leave a known security hole in their products that attackers can exploit.
Keeping up to date on a handful—if not dozens—of coding libraries and application frameworks is difficult. Once the effort of keeping up with third-party libraries is understood, more companies may want to pare down the workload by minimizing the amount of third-party code in their applications, says Theodora Titonis, vice president of mobile for application security provider Veracode.
“Having insight into what those third-party libraries are doing is critical and asking whether the functionality is really necessary,” she says.
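A toy sketch of the kind of check such a system automates: compare the library versions an app ships against the latest known releases and flag anything stale. In practice this job is done by tools such as OWASP Dependency-Check or a build-system plugin rather than hand-rolled code, and the library names and versions below are made up for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Toy dependency audit: flag any bundled library whose version differs
// from the latest known release.
public class DependencyAudit {
    static List<String> findOutdated(Map<String, String> shipped,
                                     Map<String, String> latest) {
        List<String> stale = new ArrayList<>();
        for (Map.Entry<String, String> dep : shipped.entrySet()) {
            String newest = latest.get(dep.getKey());
            if (newest != null && !newest.equals(dep.getValue())) {
                stale.add(dep.getKey() + " " + dep.getValue() + " -> " + newest);
            }
        }
        return stale;
    }
}
```

Run against every release build, a report like this also makes the pare-down question concrete: any library that keeps showing up in the audit but serves no essential function is a candidate for removal.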
5. Minimize the attack surface area
That attitude should apply to other facets of mobile development as well, says Adrian Mettler, a development engineer on the mobile team at FireEye. Rather than pulling in broad frameworks, developers should limit the mobile app to just the capabilities it needs, shrinking the opportunities for attack: a concept known as minimizing the application's attack surface.
A mobile app, for example, doesn’t need to trust many certificates. In many cases, a company can hardcode trusted certificates into the software. Known as certificate pinning, this technique could, in the case of the JBOH vulnerability, eliminate the threat of an attack.
“The way that the developer can ensure that they are protected against that [the WebView issue] is to make sure that they only load trusted sites that they control,” Mettler says. “If you are careful to properly validate SSL connections and only load those pages…then there is no way for the attacker to get malicious code into the application through WebView.”
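The pinning check itself reduces to comparing fingerprints. Below is a minimal sketch, assuming the app ships a base64-encoded SHA-256 hash of the server's certificate; in a production app this logic belongs inside a custom TrustManager or the HTTP client's built-in pinning support, and the certificate bytes would come from the actual TLS handshake rather than being passed in directly.

```java
import java.security.MessageDigest;
import java.util.Base64;

// Sketch of a certificate-pinning check: the app refuses any TLS peer
// whose certificate fingerprint does not match the one it ships with.
public class CertPinner {
    // pinnedFingerprintB64 is base64(SHA-256(certificate bytes)).
    static boolean matchesPin(byte[] certificateBytes,
                              String pinnedFingerprintB64) throws Exception {
        byte[] actual = MessageDigest.getInstance("SHA-256")
                                     .digest(certificateBytes);
        byte[] expected = Base64.getDecoder().decode(pinnedFingerprintB64);
        return MessageDigest.isEqual(actual, expected);  // constant-time compare
    }

    // Helper for computing the fingerprint to embed at build time.
    static String fingerprintOf(byte[] certificateBytes) throws Exception {
        return Base64.getEncoder().encodeToString(
            MessageDigest.getInstance("SHA-256").digest(certificateBytes));
    }
}
```

One design caveat: because the fingerprint is baked into the app, rotating the server certificate requires shipping an update, so many teams pin the public key or an intermediate CA instead of the leaf certificate.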
6. Obfuscate the code
Finally, developers can adopt a number of techniques to harden their application against attackers’ efforts to reverse engineer the code. Obfuscation, which turns the code into indecipherable gibberish, raises the bar slightly for attackers. As Mettler points out, why make it easy for the bad guys?
“If you make it difficult enough to reverse engineer your application, it may make it less likely that there is a trojanized version floating around somewhere,” he says.
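On Android, this hardening is commonly applied at build time with R8/ProGuard (enabled with minifyEnabled true in the module's Gradle configuration), driven by a rules file. A minimal sketch follows; the class name is a placeholder:

```
# Entry points the platform must look up by name are kept;
# everything else is renamed to short, meaningless identifiers.
-keep public class com.example.app.MainActivity

# Retain file/line info only if readable crash reports are worth
# the extra hint it gives reverse engineers.
-keepattributes SourceFile,LineNumberTable
```

Obfuscation is not encryption: a determined attacker can still decompile the renamed code, so it complements, rather than replaces, the earlier steps.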
In the end, creating secure mobile applications boils down to education and resolve. Developers need to take the time to learn about secure application development and the common vulnerabilities and security weaknesses that creep into applications. Only by incorporating security into their development process—whether through secure design, review of third-party libraries, or the simple step of obfuscating the resulting code—can programmers create applications that do not empower attackers, but resist them.