Stop me if you’ve heard this before: Users are becoming increasingly uneasy with the way in which apps collect, store and share their personal information. It’s a story we’ve discussed a lot here on the uTest Blog over the years (and more recently, on the Applause Blog), but unfortunately, it’s a story that isn’t going away anytime soon.
Late last week, MEF Global Chairman Andrew Bud penned a thoughtful guest post for VentureBeat on this very topic, where he argued that trust in apps is on a downward trajectory. In his view, it all has to do with personal information.
In many ways, the apps economy runs on personal information. It’s the currency – the lifeblood – and the main reason why apps can succeed with a freemium model. As Bud argues, it’s also the reason why trust is quickly declining. He writes:
What underpins this transactional relationship is consumer trust and it follows that, for the mobile industry, this should be the watchword for how mobile businesses build and retain customers. The less confidence people have in their mobile device, the less they will use it and the apps on it. That’s bad news for everyone.
Yet for almost as long as apps have been on the market, consumers have been bombarded with stories in the press and across social media platforms that raise privacy concerns about the way apps gather, store and use personal information. As an industry we have a long way to go.
He backs his opinion with some hard figures from a recent MEF/AVG Technologies study, which found that:
40 percent of consumers cite a lack of trust as the reason they don’t purchase more via their mobile — by far the most significant barrier. And it’s getting worse. In 2012, 35 percent named trust as an obstacle compared to 27 percent in 2011.
Second, 37 percent claim a lack of trust prevents them from using apps once they’ve installed them on their phone. Third, 65 percent of consumers say they are not happy sharing their personal information with an app.
Hard to argue with numbers like that. So what’s to be done? While Bud places a small amount of the burden on users – arguing that they should be more aware of the threats – he places most of it on the industry as a whole: marketers, developers, publishers, aggregators, executives and so forth.
And to that I would add software testers.
Just as quality cannot be tested into a product, neither can trust. It has to be a shared value on the part of the brand and/or app publisher. That said, there are a few areas in which testers can ensure a higher level of security – and hence, trust – among current and prospective users. Here are a few areas in particular:
One perk of creating a mobile app is the slew of device features available for apps to access. However, this can be a tricky thing to navigate and leads to some non-traditional security considerations.
Developers have historically been liberal in asking for permissions, typically requesting more access than is strictly necessary for the app to function properly. In early 2012 it came to light that apps were accessing address books and other data and features without any good reason. This practice was exposed by the media to much public backlash on the part of users, resulting in many operating systems keeping a closer eye on permissions.
When testing an app, carefully consider which features the application truly needs to access. Apps should not access device features (such as the camera or address book) or gather data that is not necessary to the app’s function. Also be careful when using outside libraries. Be sure to validate the origin to ensure the library is secure and its inputs are not compromised or malicious.
With the rise of BYOD, companies, as well as individuals, will be keeping an eye out for apps that collect unusual data, work with suspicious libraries or access excessive permissions, particularly in relation to contacts or camera/recording features. Superfluous permission requests will often get your app flagged by official reviewers or keen-eyed users.
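One way a tester might make this permissions review concrete is with a quick audit script. The sketch below – using a hypothetical allow-list you would tailor to the app under test – parses an Android-style manifest and flags any requested permission the app has no stated need for:

```python
import xml.etree.ElementTree as ET

# Hypothetical allow-list: permissions this app legitimately needs.
# Adjust to match the app under test.
EXPECTED_PERMISSIONS = {
    "android.permission.INTERNET",
    "android.permission.ACCESS_NETWORK_STATE",
}

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

def audit_permissions(manifest_xml: str) -> set:
    """Return permissions requested in an AndroidManifest.xml
    that are not on the expected allow-list."""
    root = ET.fromstring(manifest_xml)
    requested = {
        elem.get(ANDROID_NS + "name")
        for elem in root.iter("uses-permission")
    }
    return requested - EXPECTED_PERMISSIONS

manifest = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>"""

# READ_CONTACTS is not on the allow-list, so it gets flagged for review.
print(audit_permissions(manifest))
```

A flagged permission isn’t automatically a bug – but it is a question the developer should be able to answer.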
As mentioned above, some apps collect user data for marketing or research purposes. This is not, in and of itself, bad – much of the information these apps gather is valuable for improving the app itself or fixing bugs. However, apps should always make users aware that data is being collected and inform them about how the data will be used. Apps should also notify users which permissions are being granted to the app. This is required by some app stores and even legally required by some states (California, for example). If you are testing an app, be sure you are prompted to grant feature access and data collection permission upon initial launch.
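The behavior a tester should expect here is consent gating: no data collection until the user has explicitly agreed. The sketch below is a minimal, hypothetical model of that pattern, useful as a mental checklist when probing an app’s analytics:

```python
class Analytics:
    """Minimal sketch of consent-gated collection (hypothetical API):
    nothing is recorded until the user has explicitly opted in."""

    def __init__(self):
        self.consented = False
        self.events = []

    def grant_consent(self):
        """Call only after the user accepts the data-collection prompt."""
        self.consented = True

    def track(self, event: str):
        # Events fired before consent are dropped, not queued for later.
        if self.consented:
            self.events.append(event)
```

When testing, verify the real equivalent: events fired before the prompt is accepted should never reach the backend, and declining the prompt should leave the app functional.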
Part of creating a secure app is making sure data isn’t accessible without the right permissions. This goes beyond the realm of authorization and authentication and into the little places you might not think of.
Side Channel Data Leakage appears in OWASP’s Top 10 Mobile Risks. This leak occurs when little trails of data are “left behind” in different places that developers may not realize. Caches, logs and temporary files are all common culprits. When security testing an app, be sure that potentially sensitive information isn’t left unprotected in any of these scenarios.
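A simple way to start hunting for these trails is to scan logs, caches and temp files for sensitive-looking strings. The patterns below are illustrative only – a real test pass would extend them for the specific data the app handles:

```python
import re

# Hypothetical patterns for data that should never appear in logs or
# temp files; extend these for the app under test.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "password_field": re.compile(r"password\s*[=:]\s*\S+", re.IGNORECASE),
}

def scan_for_leaks(text: str) -> list:
    """Return (label, match) pairs for sensitive-looking strings."""
    hits = []
    for label, pattern in SENSITIVE_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

# A debug log line like this is exactly the kind of side channel
# OWASP warns about.
log_line = "DEBUG login ok user=jane@example.com password=hunter2"
print(scan_for_leaks(log_line))
```

Running something like this over an app’s cache and log directories after a test session often surfaces data the developers never intended to persist.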
Just as bad as data leaks is storing information – like API keys or proprietary logic – inside the app’s own code. Apps can be reverse-engineered and shouldn’t contain anything sensitive in the code.
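Embedded keys can often be spotted before an attacker finds them: keys and tokens tend to be long, random-looking strings, so a rough entropy check over string literals catches many of them. The sample key below is fabricated for illustration:

```python
import math
import re

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character; high values suggest keys/tokens."""
    freq = {c: s.count(c) / len(s) for c in set(s)}
    return -sum(p * math.log2(p) for p in freq.values())

def find_suspect_strings(source: str, min_len: int = 20,
                         threshold: float = 4.0) -> list:
    """Flag long, high-entropy string literals that may be embedded keys.
    The thresholds are rough heuristics, not hard rules."""
    return [
        literal
        for literal in re.findall(r'"([^"]+)"', source)
        if len(literal) >= min_len and shannon_entropy(literal) > threshold
    ]

code = ('api_key = "sk_live_4f9aB7xQ2mZ8pR1tK6wY3vN5"\n'
        'label = "Welcome back to the app"')
# Flags the random-looking key; ordinary UI text scores low and passes.
print(find_suspect_strings(code))
```

Anything this flags should be moved server-side or fetched at runtime over an authenticated channel rather than shipped in the binary.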
Of course, many security issues are not the result of careless behavior on the part of users, but rather careless behavior on the part of the app publisher. Specifically, failing to properly encrypt sensitive data.
How secure is the data created or stored by the app? You should ask yourself this question before launch.
Even if you think the data is secure, it’s best to encrypt any sensitive or user-generated data – including usernames, passwords, personal information, shopping history, etc. – for an extra layer of protection. Even without leaks, unencrypted or poorly encrypted data is exposed to multiple risks, especially when moving through potentially vulnerable mobile networks.
Sensitive data should be encrypted by the app and then decrypted with a user password. If you are dealing with truly sensitive information, consider whether it really needs to be stored or transmitted. If it is necessary, consider storing only partial information, or storing a hash of the data instead. That way, if data is leaked, the damage will be mitigated. If you do have a leak, it’s best if no one can decipher the data.
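The partial-information-plus-hash idea can be sketched in a few lines using Python’s standard library. Here a (fictional) card number is reduced to its last four digits plus a salted, deliberately slow hash – enough to recognize the card later, but useless on its own if the store leaks:

```python
import hashlib
import hmac
import os

def store_card_reference(card_number: str) -> dict:
    """Keep only the last four digits plus a salted PBKDF2 hash,
    rather than the full number."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac(
        "sha256", card_number.encode(), salt, 100_000)
    return {"last4": card_number[-4:], "salt": salt, "hash": digest}

def matches(card_number: str, record: dict) -> bool:
    """Constant-time check of a candidate number against the stored hash."""
    candidate = hashlib.pbkdf2_hmac(
        "sha256", card_number.encode(), record["salt"], 100_000)
    return hmac.compare_digest(candidate, record["hash"])

record = store_card_reference("4111111111111111")
print(record["last4"])                       # only the last four digits are kept in the clear
print(matches("4111111111111111", record))   # True
```

The random salt and high iteration count make brute-forcing leaked records expensive; the constant-time comparison avoids leaking information through timing. This is a sketch of the principle, not a substitute for a vetted payment or credential-storage library.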
Even casual apps should encrypt data to avoid unintentionally leaking user information. At the end of the day, most users assume apps are secure, and any leaked information (even something as innocent as game scores) will cause users to second-guess your company’s integrity.
User trust in apps might be diminishing, but that doesn’t mean it cannot be regained. With the right philosophy – and the right testing processes – users can and will trust their apps once again.
For more great security testing tips, be sure to check out our Security Testing Whitepaper.