There are some interesting factors that can’t be replicated during lab tests. You can’t account for network strength, what other programs are running on a given device, or a user’s intuitive understanding. You also run into issues you might not have thought of (and that wouldn’t have been issues in the past), such as how users hold a mobile device. It turns out this is a bigger variable than you might realize. Steven Hoober wrote an in-depth piece for UX Matters on how people hold and use mobile devices in the wild. In his study he found that there are three major ways people interact with their mobile phones.
For two months, ending on January 8, 2013, I—and a few other researchers—made 1,333 observations of people using mobile devices on the street, in airports, at bus stops, in cafes, on trains and busses—wherever we might see them. Of these people, 780 were touching the screen to scroll or to type, tap, or use other gestures to enter data. The rest were just listening to, looking at, or talking on their mobile devices. …
Most people hold their mobile phone with one hand, but a large percentage “cradle” the phone – holding it with one hand while using the other to perform actions. Broken down into percentages, 49% of people use their mobile phones one-handed, 36% cradle the device, and 15% use two hands.
While most of the people that we observed touching their screen used one hand, very large numbers also used other methods. Even the least-used case, two-handed use, is large enough that you should consider it during design.
Steven points out that it’s not uncommon for users to change their grip depending on the task they’re performing or the situation they’re in. Plus, each of the three methods he and his team observed has variations depending on the user. Here are some illustrations that demonstrate how a user might hold their phone and what kind of reach each position gives them. Green means the area is easily accessible, yellow requires some reaching, and red means the user would have to alter their grip.
Read the full article for more insights at UX Matters >>>
These illustrations should give developers insight into which parts of the screen are most easily accessible, but as Steven pointed out, several different holding positions have enough user adoption to make them an important consideration.
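As a rough illustration of how a developer might act on this kind of reach data, here is a minimal sketch that classifies a tap target into an easy/stretch/hard zone for a one-handed, right-thumb grip. The pivot point and radii are invented assumptions for demonstration – they are not measurements from Steven’s study, and a real app would need values validated by usability testing.

```python
from dataclasses import dataclass
import math

@dataclass
class Screen:
    width: float   # logical points
    height: float

def reach_zone(x, y, screen, easy_radius=250, stretch_radius=400):
    """Classify a point by thumb reach for a one-handed right grip.

    Assumes the thumb pivots near the bottom-right corner; the radii
    are illustrative guesses, not values from Hoober's research.
    """
    pivot = (screen.width, screen.height)
    dist = math.hypot(x - pivot[0], y - pivot[1])
    if dist <= easy_radius:
        return "easy"      # green: comfortably reachable
    if dist <= stretch_radius:
        return "stretch"   # yellow: requires some reaching
    return "hard"          # red: user must shift their grip

screen = Screen(width=375, height=667)  # e.g. a 4.7" phone in points
print(reach_zone(300, 600, screen))  # near the bottom-right corner
print(reach_zone(40, 80, screen))    # top-left corner, out of reach
```

A sketch like this could flag primary actions placed in the “hard” zone during design review, though it only models one of the three grips – a left-handed or cradled grip would need its own pivot.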
Steven’s research didn’t include tablets (“Since we made our observations in public, we encountered very few tablets, so these are not part of the data set. The largest device that we captured in the data set was the Samsung Galaxy Note 2.”), which add yet another dimension to the mix. The larger size of a tablet will almost certainly change how a user holds and interacts with the device, and that may change again as you get into the varying sizes of tablets.
This adds many more testing variables to the mix – ones you definitely can’t automate. Ultimately, how users hold their mobile devices, and how that affects your mobile application, is something that needs to be explored with good usability testing. Odds are you won’t be able to please everyone, but if enough people complain that your app is hard to use, you might want to reconsider some design elements.