iPhone and Android touchscreen smartphones have plenty of interaction design and usability problems, many of which stem from interfaces that were originally designed for 3.5-inch screens and were never adapted system-wide for today's much larger devices. Here are some tips to help resolve those shortcomings. All of the general interaction design principles apply to smartphones too, so be sure to see General User Experience Design Guidance as well.
-
Place interactive elements where fingers can reach them
Why?
So users can press the buttons!
Judging by the user interface designs of most apps, and of Google's Android and Apple's iOS as of 2019, most UI designers don't consider hand placement when designing interactive elements. Most appear to assume that users are standing still, holding the phone in one hand and poking at it with the other. Often you'll find menu items, buttons, and even hidden notification/action drawers both at the top of the screen AND at the bottom. No human hand can reach all of those places while holding a touchscreen phone larger than 4 inches, and just about every phone released after 2015 has a far larger screen than that.
The two-handed "hold in one hand, poke with the other" interaction method is fine if you're sitting at a desk or on the couch, but mobile phones are meant to be mobile. It falls apart very quickly when you're standing on a train holding a handle with one hand, walking somewhere with bags in one hand, driving a car, and so on. UI designers often assume stationary usage instead of mobile usage, even though we call them mobile phones.
If you're a UI designer working on a large desktop or laptop display and wondering where to place interactive elements in a smartphone UI, there's an easy way to find out: install a drawing app on your phone, hold the phone in one hand, and paint within the app using your thumb, without scooting your hand around or repositioning it. The area you can paint is the area you can comfortably reach.
For efficient usability, a user's thumb should be able to reach all system and app controls while holding the device, without repositioning the hand or using a second hand. This rule of thumb should be common sense for smartphone UI designers, but it clearly is not.
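The thumb-zone idea above can be sketched as a simple heuristic check. This is a rough, framework-agnostic model, not a platform API: the pivot point, radius, and coordinates below are illustrative assumptions for a right-handed grip.

```typescript
// Hypothetical thumb-reach check: models the reachable zone as a circle
// centered near the bottom-right corner of the screen (right-handed grip).
// All numbers are illustrative assumptions, not platform constants.

interface Point { x: number; y: number }

// Coordinates in density-independent pixels; origin at top-left.
function isThumbReachable(
  target: Point,
  screenWidth: number,
  screenHeight: number,
  thumbReachDp = 480, // assumed comfortable reach radius
): boolean {
  // Approximate pivot of the thumb: slightly outside the bottom-right edge.
  const pivot: Point = { x: screenWidth + 30, y: screenHeight - 60 };
  const dx = target.x - pivot.x;
  const dy = target.y - pivot.y;
  return Math.hypot(dx, dy) <= thumbReachDp;
}

// A bottom-center button on a 400x850dp screen passes the check,
// while a top-left menu icon does not:
console.log(isThumbReachable({ x: 200, y: 800 }, 400, 850));
console.log(isThumbReachable({ x: 24, y: 40 }, 400, 850));
```

A check like this makes the design rule concrete: anything that fails it should probably move toward the bottom of the screen, or be reachable through a control that passes it.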
Arguments
Placing buttons at the top of the screen makes them easier to see since that’s where most people look first due to our top-to-bottom left-to-right reading habits.
This is true, but don't you think being able to actually use the buttons is more important than merely seeing them?
Also see:
- 8 ways to tell if your mobile app sucks
- Responsive Navigation: Optimizing for Touch Across Devices
- How Do Users Really Hold Mobile Devices?
- How to design for thumbs in the Era of Huge Screens
- The Thumb Zone: Designing For Mobile Users
- Designing for Large Screen Smartphones
- Why Mobile Menus Belong at the Bottom of the Screen
-
Place interactive elements where fingers can reach them in landscape mode, too.
Why?
Most apps simply stretch the width and shorten the height of their interface when a user rotates the phone into landscape orientation, but UI designers forget that holding a phone in landscape involves a completely different grip. The fingers available for interaction are in totally different positions than they are in portrait mode.
Arguments
Resizing the UI width and height is much easier than rethinking the layout. True, but again, this is an argument based on laziness.
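The point above can be sketched as layout logic that chooses a control placement per orientation instead of just stretching the same layout. The placement names and breakpoint rule here are illustrative assumptions, not a real framework API.

```typescript
// Sketch of orientation-aware control placement. In portrait, the thumb
// rests at the bottom edge; in a two-handed landscape grip, both thumbs
// sit on the left and right edges, so controls belong on a side rail
// rather than stretched across the bottom.

type Orientation = "portrait" | "landscape";
type ControlPlacement = "bottom-bar" | "side-rail";

function orientationFor(width: number, height: number): Orientation {
  return width > height ? "landscape" : "portrait";
}

function placementFor(width: number, height: number): ControlPlacement {
  return orientationFor(width, height) === "portrait"
    ? "bottom-bar"
    : "side-rail";
}

console.log(placementFor(400, 850)); // portrait phone
console.log(placementFor(850, 400)); // same phone rotated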
-
Don’t rely on touch screen gestures
Why?
This goes back to the "easy to learn" concept. Touch gestures are often discoverable only by accident, so they require training, and users rarely have the time or interest to go through a training process just to use an app. If you add a left-edge swipe gesture that does something important, no user is going to know that just by looking at the screen.
Arguments
Gestures can be "easy to use" once the user learns them. This is true, and that's why it's good to implement them, but it's important not to rely on them, since they are clearly not "easy to learn".
You can strike a better balance between "easy to learn" and "easy to use" by making gestures more discoverable: use buttons or text labels that explain what the gestures do, then allow the user to opt in to removing the labels (or tooltips) once they have learned the gesture commands.
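The opt-in labeling idea above can be sketched as progressive disclosure: show a labeled hint until the user has performed the gesture a few times, then let the label disappear. The class name, threshold, and in-memory counter below are illustrative assumptions, not a platform pattern.

```typescript
// Sketch of progressive gesture disclosure: the text label teaching a
// gesture stays visible until the user has completed the gesture enough
// times to be considered "learned". The threshold (3) is an assumption.

class GestureHint {
  private uses = 0;

  constructor(
    public readonly label: string,
    private readonly learnedAfter = 3,
  ) {}

  // Call whenever the user successfully completes the gesture.
  recordUse(): void {
    this.uses += 1;
  }

  // The UI shows the label only while the gesture is still unlearned.
  shouldShowLabel(): boolean {
    return this.uses < this.learnedAfter;
  }
}

const archiveHint = new GestureHint("Swipe left to archive");
console.log(archiveHint.shouldShowLabel()); // still learning: label shown
archiveHint.recordUse();
archiveHint.recordUse();
archiveHint.recordUse();
console.log(archiveHint.shouldShowLabel()); // learned: label can be hidden
```

In a real app the use count would be persisted per user so the hint doesn't reappear on every launch; this sketch keeps it in memory for brevity.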
Consistency is very important with gestures as well. An app developer may assign one function to a left-to-right swipe, but that same gesture may do something completely different on the home screen, or behave differently depending on how close to the edge the swipe begins. These inconsistencies can be extremely confusing and demand extra cognitive effort to memorize and carry out.