
Experts call for risk scores to improve smartphone app security

Researchers have found a simple risk score could help improve mobile device security (Photo: Shutterstock)

Next time you download or update an app for your smartphone or tablet and blitz through the prompts asking for permission approvals, you may be unnecessarily exposing your personal information to cyber intrusion. Researchers suggest the problem could be addressed through better consumer education and an easy-to-understand risk score for each app.

Researchers from Purdue University, working as part of a U.S. National Science Foundation (NSF) funded project, examined the decision-making patterns of smartphone users with regard to app usage. They came away with the disturbing conclusion that most users habitually ignore security warnings and consent to app permissions without a second thought as to what they are actually approving.

"Although strong security measures are in place for most mobile systems, the area where these systems often fail is the reliance on the user to make decisions that impact the security of a device,” the researchers wrote in a recent report.

Furthermore, beyond users paying little attention to what they are clicking through, there is the plain fact that the permission requests often seem written by programmers for programmers. In other words, they aren't always expressed in plain English and, at a minimum, demand time and considerable effort from the average person to understand.

"The complexity of modern access control mechanisms in smartphones can confuse even security experts," said Jeremy Epstein, lead program director for the Secure and Trustworthy Cyberspace program in NSF's Directorate for Computer and Information Science and Engineering. ”Safeguards and protection mechanisms that protect privacy and personal security must be usable by all smartphone users, to avoid the syndrome of just clicking 'yes' to get the job done.”

The Android ecosystem as an example

The sheer scale of the problem raises big red flags. In the Android ecosystem alone, more than 400 million devices were activated in 2012, and as of July 2013 users had downloaded over 50 billion apps from Google's official online store. Users are warned that granting permissions to apps in certain categories could allow them to read and modify contact details and calendar events, send emails without the user's knowledge, or change settings that control the device's mobile data connection, but even these warnings may not be enough to inform the average user.

What the researchers are proposing instead is a simple risk score system that would inform users of potential risks in a clearer, more transparent way and prompt app developers to create apps that use less personal information. Experiments conducted by the team found that users generally paid closer attention to, and showed more curiosity about, security warnings presented this way.
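The article doesn't detail how such a score would be calculated, but the basic idea can be illustrated with a short sketch. The Python snippet below is a hypothetical example, not the researchers' actual model: it assigns each requested permission an illustrative weight and maps the total to a coarse label a user could read at a glance.

```python
# Hypothetical illustration of a per-app risk score (not the researchers' model).
# Each permission gets a weight reflecting roughly how much personal data it can
# expose; the app's score is the capped sum, mapped to a coarse label.

PERMISSION_WEIGHTS = {          # illustrative weights, chosen for this example only
    "READ_CONTACTS": 3,
    "READ_CALENDAR": 2,
    "SEND_SMS": 4,
    "ACCESS_FINE_LOCATION": 3,
    "CHANGE_NETWORK_STATE": 1,
    "INTERNET": 1,
}

def risk_score(requested_permissions):
    """Return a 0-10 score and a coarse label for a list of requested permissions."""
    raw = sum(PERMISSION_WEIGHTS.get(p, 1) for p in requested_permissions)
    score = min(10, raw)
    label = "low" if score <= 3 else "moderate" if score <= 6 else "high"
    return score, label

if __name__ == "__main__":
    flashlight_app = ["INTERNET", "READ_CONTACTS", "ACCESS_FINE_LOCATION"]
    print(risk_score(flashlight_app))  # (7, 'high') - a lot of data for a flashlight
```

A real score would presumably also need to weigh how unusual a permission request is for an app of a given type, something this toy example ignores.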

“This is a classic example of the links between humans and technology," said Heng Xu, program director in the Secure and Trustworthy Cyberspace program in NSF's Social, Behavioral and Economic Sciences Directorate. "The Android smartphones studied by this group of scientists reveals the great need to understand human perception as it relates to their own privacy and security."

The team's research appears in the journal IEEE Transactions on Dependable and Secure Computing.

Source: NSF

3 comments
Daishi
I was thinking about this the other day and I came up with a partial solution to the problem through something like "app store filters".
The way it works is mostly simple: I go into the app store, select a filter, say Gizmag, and what's left are the applications in the store that Gizmag has reviewed or recommends.
The benefits of this are twofold. For starters, it means not only did Google approve the app, but a reviewer likely selected it after doing some basic research and deciding it was best of breed for its category and not entirely shady. Additionally, an app publisher with a solid but lesser-known app can be seen in the app store instead of being buried under a pile of crap. It improves the general signal-to-noise ratio of apps.
On top of this, users have the ability to select different filters. If I'm someone who cares deeply about privacy, maybe I select the EFF or some other pro-privacy group that has evaluated applications, specifically making sure they aren't requesting too many permissions or collecting data they have no business collecting, and then whitelisting them. This improves security and punishes applications that currently have little to no incentive to take better security/privacy measures.
It would give people other methods of finding software and improve the quality of the software they find, and it's something that could be built right into the store, so the normal process for updates etc. would be the same as now.
People not wanting to use the feature could just not use it, so I don't see a good reason not to do it. Once it was supported, I'm sure many communities would be eager to take on the role of helping hand-pick good applications people can trust.
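For what it's worth, the filter idea described in the comment above can be sketched in a few lines. The curator names and app IDs below are invented for illustration; a real implementation would live inside the store's backend.

```python
# Sketch of the "app store filter" idea from the comment above (hypothetical data).
# Each curator publishes a whitelist of app IDs; applying a filter simply
# restricts the catalog to apps that curator has reviewed or recommends.

CURATED_LISTS = {
    "Gizmag": {"com.example.weather", "com.example.notes"},
    "EFF":    {"com.example.notes", "com.example.securechat"},
}

def filter_apps(all_apps, curator):
    """Return only the apps endorsed by the selected curator."""
    whitelist = CURATED_LISTS.get(curator, set())
    return [app for app in all_apps if app in whitelist]

store_catalog = ["com.example.weather", "com.example.adware", "com.example.securechat"]
print(filter_apps(store_catalog, "EFF"))  # ['com.example.securechat']
```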
christopher
Of course warnings/permissions are ignored - there's no other option. When you install an app, there needs to be a checkbox beside ALL the things it wants access to, letting us deny the stuff we don't want it to know!
Never going to happen of course - advertising powers most revenue today, and they want to know everything about you.
Koolski
Five stars Daishi!