Who can you trust?

Technology, like any tool, can be used well or badly. Sometimes the bad is malicious, and sometimes it’s just the result of laziness, carelessness, inexperience, or thoughtlessness.

I’m hoping the latest mobile revelation falls into one of the latter categories.

Researchers at Columbia University discovered that many apps in the official Android app store, Google Play, have secret keys to services such as Google, Facebook and Amazon hard-coded within the app. That makes them accessible to anyone who downloads the app and knows how to look for them. Those keys can be used to steal resources or user information from the vulnerable services.

Needless to say, hard-coding any security credential into an app is very bad programming practice. Not only does it expose users to attack, it creates aggravation and extra work for the developer should that credential ever change: every change means inconvenience for users and a new release of the app. And that doesn’t even take into account the hit to the developer’s reputation (and possible lawsuits or prosecution) if an app results in loss of critical information that is subsequently used to cause harm.
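
To make that concrete, here’s a minimal sketch in Java of the anti-pattern the researchers describe, alongside a safer alternative in which the developer’s own server holds the real secret. Everything here is hypothetical, including the class names and the token endpoint:

```java
// Hypothetical sketches, not any real app's code.

// ANTI-PATTERN: a long-lived service secret baked into the binary.
// Anyone who unzips the APK and decompiles classes.dex can read it.
class BadApiClient {
    private static final String SERVICE_SECRET = "EXAMPLE-SECRET-DO-NOT-SHIP";

    String authHeader() {
        return "Bearer " + SERVICE_SECRET;
    }
}

// SAFER: the app never holds the long-lived secret. The developer's own
// backend (a hypothetical endpoint) keeps the real key and hands the app
// a short-lived token instead.
class SaferApiClient {
    private static final String TOKEN_ENDPOINT = "https://api.example.com/token";

    String fetchShortLivedToken() throws java.io.IOException {
        java.net.HttpURLConnection conn = (java.net.HttpURLConnection)
                new java.net.URL(TOKEN_ENDPOINT).openConnection();
        try (java.io.InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(),
                    java.nio.charset.StandardCharsets.UTF_8).trim();
        } finally {
            conn.disconnect();
        }
    }
}
```

Note that the safer version also solves the update problem: rotating the key becomes a server-side change, invisible to users, with no new app release required.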

The researchers found that it wasn’t only casual developers who were at fault; even some designated “Top Developers” in Google Play engage in the same practices.

Fortunately for users, Google is taking these flaws seriously, notifying offending developers that they have to fix their apps. It is also using the Columbia team’s techniques to check incoming apps so the problem doesn’t recur.
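
The general idea behind such a check is easy to picture, even though the researchers’ actual tooling is far more sophisticated. Here’s a toy sketch, with made-up patterns and file layout, that walks a directory of decompiled app sources and flags key-shaped strings:

```java
// KeyScan.java — a toy illustration of scanning for embedded credentials.
// The patterns and directory layout are assumptions, not the Columbia
// team's actual heuristics.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.regex.Pattern;

public class KeyScan {
    // Example patterns: an AWS-style access key ID, and a generic
    // secret-looking string assignment. Real scanners use many more.
    private static final List<Pattern> PATTERNS = List.of(
            Pattern.compile("AKIA[0-9A-Z]{16}"),
            Pattern.compile("(?i)(secret|api[_-]?key)\\s*=\\s*\"\\S{16,}\""));

    public static void main(String[] args) throws IOException {
        Path root = Path.of(args.length > 0 ? args[0] : "decompiled");
        try (var files = Files.walk(root)) {
            files.filter(Files::isRegularFile).forEach(KeyScan::scan);
        }
    }

    private static void scan(Path file) {
        try {
            List<String> lines = Files.readAllLines(file);
            for (int i = 0; i < lines.size(); i++) {
                for (Pattern p : PATTERNS) {
                    if (p.matcher(lines.get(i)).find()) {
                        System.out.printf("%s:%d: possible embedded credential%n",
                                file, i + 1);
                    }
                }
            }
        } catch (IOException e) {
            // Likely a binary file; skip it.
        }
    }
}
```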

However, that won’t stop authors from uploading bad apps to other Android marketplaces and download sites — one good reason to stick to using the official app store.

This whole mess raises an important question: who can you trust? It’s impossible for most users to peer into the guts of an app and see what it’s actually doing. Probably none of the users of the apps identified by the Columbia researchers have any idea that, simply by having the app on their devices, they’re putting themselves at risk. The researchers found that an app doesn’t even need to be actively running to be vulnerable; many apps today lurk in the background so they can generate alerts, even if the user has never launched them.
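
To see why an app that’s never been opened is still exposed, consider how a background wake-up works on Android. The receiver below uses standard Android classes, but the scenario is hypothetical: once the system delivers a broadcast the app registered for, the app’s code runs without the user doing anything.

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

// Hypothetical receiver: declared in the app's manifest for a system
// broadcast such as BOOT_COMPLETED, so Android instantiates it and runs
// onReceive() without the user ever tapping the app's icon.
public class WakeupReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        // The app's code is now executing in the background. Any key
        // hard-coded anywhere in this code path is in play right now.
        // e.g. startPollingForAlerts(context);  // hypothetical helper
    }
}
```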

The problem is compounded by the ease with which anyone with access to a software development kit can create and publish an app. There’s no official training required, and no instruction on basic good practices. Consequently, stuff happens, and the developer is oblivious to why.

This is especially an issue with Android apps. While Apple, Microsoft, and BlackBerry have approval processes in place that should (in theory) catch bad behaviour in apps, Google Play is wide open: you pay a small fee to create a developer account, and have at it. With few or no controls, developers take the easy road, glossing over security while demanding ever more access to user information and to everything the device is doing. Why, for example, should an app that allegedly just plays music need access to phone records? Why should an app developed to help users navigate a conference and build their agendas need to send email without the owner’s knowledge? (And why did one such app contain contact info for everyone who had installed it? Yes, this is a true story.) Why does an app that displays transit schedules need access to the user’s contact list?
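
That last example is worth making concrete. Under the install-time permission model of the day, granting an app READ_CONTACTS meant every code path in it could read every contact; nothing tied the permission to the feature that supposedly justified it. A hedged sketch, using standard Android APIs with a class name of my own invention:

```java
import android.content.Context;
import android.database.Cursor;
import android.provider.ContactsContract;
import android.util.Log;

// Hypothetical class; the APIs are standard Android. With READ_CONTACTS
// granted at install time, nothing stops a "transit schedule" app from
// doing exactly this.
public class ContactSlurp {
    public static void dumpContacts(Context context) {
        Cursor c = context.getContentResolver().query(
                ContactsContract.Contacts.CONTENT_URI, null, null, null, null);
        if (c == null) return;
        try {
            int nameCol = c.getColumnIndex(ContactsContract.Contacts.DISPLAY_NAME);
            while (c.moveToNext()) {
                // Every saved contact's name, available to any code path
                // in the app, for any purpose the developer chooses.
                Log.d("ContactSlurp", c.getString(nameCol));
            }
        } finally {
            c.close();
        }
    }
}
```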

We can hope that these blatant and gratuitous invasions of privacy are merely the result of lazy or less-than-competent developers, but at the same time, we have to fear that the information is being captured (and transferred to insecure locations) for other, less savoury reasons. And with our lives increasingly being run from mobile devices, that fear should translate into demands that we be given ways to ensure that we can, indeed, trust the apps we rely on.
