16.1 Touch Input
Devices have many different input methods, such as buttons, keyboards, and touch. This section provides guidelines for implementing and using touch input.
If a device has touch input, incorporate it into MIDP's high-level UI components so that users can interact with MIDlets in the same way that they interact with native applications. This follows the advice, "Make It Predictable" on page 9. For example, Figure 1 shows Sun's MIDP for Palm OS, which enables users to select an element of an exclusive-choice list by tapping it.
Figure 1: Exclusive-Choice List
If a device has touch input, make it available to application developers for use on their canvas and game canvas screens.
Test whether a MIDP implementation offers touch input by calling the methods hasPointerEvents and hasPointerMotionEvents of the Canvas class. If a device has pointer events, you will be able to tell where a user tapped on the screen. If a device has pointer motion events, you will also be able to tell when a user drags a pointing device from one point on the screen to another.
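The decision those two checks drive can be sketched as follows. This is an illustrative, self-contained sketch: on a real device the two boolean flags would come from calling `canvas.hasPointerEvents()` and `canvas.hasPointerMotionEvents()`; they are plain parameters here so the logic stands on its own.

```java
// Sketch: classifying the interaction styles a screen can offer, based on
// the two Canvas capability flags. On a device these booleans come from
// canvas.hasPointerEvents() and canvas.hasPointerMotionEvents().
public class TouchSupport {

    /** Returns which interaction style the screen should enable. */
    public static String describe(boolean pointerEvents, boolean pointerMotionEvents) {
        if (!pointerEvents) {
            return "keys-only";          // no touch: rely on key/game controls
        }
        return pointerMotionEvents
                ? "tap-and-drag"         // taps plus stylus dragging
                : "tap-only";            // taps, but no drag tracking
    }

    public static void main(String[] args) {
        System.out.println(describe(true, true));   // tap-and-drag
        System.out.println(describe(false, false)); // keys-only
    }
}
```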
If touch input is available, incorporate it into your Canvas screens.
For example, most screens in SmartTicket are structured, high-level screens (such as implicit lists). On a PDA, users interact with these screens using touch input. If SmartTicket's canvas screen (shown in Figure 2) also supports touch input, users can interact with it in the same way that they interact with the other, structured screens. If the Canvas screen does not respond to touch input, it feels very different from the rest of the application: awkward and unusable. In usability testing, users thought that the application was broken.
Figure 2: SmartTicket Canvas
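One common way to wire taps into a canvas like SmartTicket's seat map is to translate the coordinate delivered to `pointerPressed(x, y)` into the cell of a grid the screen painted. The sketch below is a hypothetical illustration; the grid origin and cell size are assumptions, and a real screen would use the same values it drew with.

```java
// Hypothetical sketch: mapping a pointerPressed(x, y) coordinate onto a
// grid of touch targets (for example, a seat map) drawn on a canvas.
public class SeatGrid {
    final int originX, originY;   // top-left corner of the grid on screen
    final int cellW, cellH;       // size of one cell in pixels
    final int cols, rows;         // grid dimensions

    public SeatGrid(int originX, int originY, int cellW, int cellH,
                    int cols, int rows) {
        this.originX = originX; this.originY = originY;
        this.cellW = cellW; this.cellH = cellH;
        this.cols = cols; this.rows = rows;
    }

    /** Returns the tapped cell index (row-major), or -1 when the tap misses. */
    public int cellAt(int x, int y) {
        if (x < originX || y < originY) {
            return -1;                        // tap above or left of the grid
        }
        int col = (x - originX) / cellW;
        int row = (y - originY) / cellH;
        if (col >= cols || row >= rows) {
            return -1;                        // tap below or right of the grid
        }
        return row * cols + col;
    }
}
```

A canvas subclass would call `cellAt` from its `pointerPressed` override and repaint the selected cell.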
16.1.1 Usability of Canvas and Game Canvas Screens
Tailor the behavior of your Canvas and Game Canvas screens to the types of interaction that devices support. A Canvas screen that can accept touch input should also accept game controls or phone keypad input (game controls are preferable to the phone keypad, as noted in Chapter 10). Incorporating the flexibility to use both touch and key input will enable the screens to behave predictably on the widest range of devices.
For example, the Push Puzzle game, shown in Figure 3, should accept game controls on all devices. It could add touch input, if a device has it available, so that users could tap on the screen to indicate the box's new location. It could add support for using the stylus to drag the box to its new location if a device also supports pointer motion events. Using whatever input methods a device makes available lets the application integrate well with the widest range of devices.
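Routing both kinds of input to the same game logic can be sketched as below. This is an assumption-laden illustration, not Push Puzzle's actual code: the `GAME_UP`/`GAME_DOWN`/`GAME_LEFT`/`GAME_RIGHT` constants stand in for the Canvas game-action values, and on a device the key handler would first call `getGameAction(keyCode)` before dispatching.

```java
// Sketch: funneling game-control keys and screen taps into one
// "move the box" action, so both input styles behave identically.
public class PushInput {
    // Stand-ins for the Canvas game-action constants (UP, DOWN, LEFT, RIGHT).
    public static final int GAME_UP = 1, GAME_LEFT = 2,
                            GAME_RIGHT = 5, GAME_DOWN = 6;

    /** Maps a game action to a move as {dx, dy}, or null if unmapped. */
    public static int[] moveForGameAction(int action) {
        switch (action) {
            case GAME_UP:    return new int[] { 0, -1};
            case GAME_DOWN:  return new int[] { 0,  1};
            case GAME_LEFT:  return new int[] {-1,  0};
            case GAME_RIGHT: return new int[] { 1,  0};
            default:         return null;
        }
    }

    /** Maps a tap to a single step toward the tapped cell, favoring the
     *  axis with the larger remaining distance. */
    public static int[] moveTowardTap(int boxCol, int boxRow,
                                      int tapCol, int tapRow) {
        if (Math.abs(tapCol - boxCol) >= Math.abs(tapRow - boxRow)) {
            return new int[] {Integer.compare(tapCol, boxCol), 0};
        }
        return new int[] {0, Integer.compare(tapRow, boxRow)};
    }
}
```

Because both entry points produce the same `{dx, dy}` move, the rest of the game logic never needs to know which input method the user chose.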
Include a Help screen that explains how to operate your Canvas screen. The Help screen should explain how to use all types of input that the screen accepts. For example, Figure 4 shows a Help screen that explains how to use the game controls and a touch screen with the SmartTicket canvas.
Figure 4: Help Screen That Explains How to Use a Canvas
Test your touch-sensitive application with users on platforms both with and without touch input. This will ensure that your application is usable.
If your MIDlet does not support touch input, notify consumers with product documentation and Help screens. Consumers with touch-input devices might be less frustrated by the difference in behavior if you tell them how to operate the application and they know what to expect.
Figure 3: Push Puzzle Game
16.1.2 Sizing Touch-Sensitive Components
Touch-sensitive components must be an appropriate size. If they are too small, users will have a difficult time tapping them accurately.
Determine the sizes of your touch-sensitive components, such as buttons, by examining successful native applications. For example, Figure 5 shows that the buttons used by a high-level component of a MIDP implementation are the same height and shape as those of a native application on the same device.
Figure 5: Buttons on a Native Application and a MIDP Application
Look to MIDP implementations on touch-sensitive devices as your model for the size of any touch-sensitive components you draw on a canvas.
Make output you draw to the Canvas screen scalable so that you can adapt it to the screen size of the device. (See "Accommodating Different Screen Sizes" on page 148 for more information.)
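Combining the two guidelines above, a scalable button size with a touch-friendly floor can be sketched as follows. The 1/8-of-screen ratio and the 18-pixel minimum are illustrative assumptions, not values from this guide; in practice you would take the minimum from the native applications you examined on the target device.

```java
// Sketch: a button height that scales with the canvas but never drops
// below a minimum touch-target size. Both constants are assumptions
// chosen for illustration.
public class TouchSizing {
    static final int MIN_TOUCH_HEIGHT = 18;   // assumed floor for accurate taps
    static final int SCREEN_FRACTION = 8;     // button = 1/8 of screen height

    /** Returns a touch-friendly button height for the given canvas height. */
    public static int buttonHeight(int canvasHeight) {
        return Math.max(MIN_TOUCH_HEIGHT, canvasHeight / SCREEN_FRACTION);
    }

    public static void main(String[] args) {
        System.out.println(buttonHeight(320)); // large screen: scales up (40)
        System.out.println(buttonHeight(100)); // small screen: floor wins (18)
    }
}
```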
16.1.3 Interaction Styles
Both MIDP implementors and application developers should follow the same advice with respect to the way they expect users to interact with the touch screen. The advice in this section applies to both groups.
Application Developers and MIDP Implementors
Use a single-tap interaction model. The natural jitter of the human hand makes it difficult to tap twice in the same spot with a stylus. Because double-tapping is error-prone, do not require it.
Typically, interactive components on a touch screen show a state change at a user's touch (stylus down) and execute when the touch releases (stylus up). The state change could be as simple as turning on the highlight. For example, Figure 6 shows the state change that takes place when the user touches an abstract command button on the Movie List screen of SmartTicket.
Figure 6: State Change When Abstract Command Is Touched
This model decreases user error because the highlight shows users what would happen if they released their touch at their current screen location. If users do not want to take the highlighted action, they can move their point of contact with the screen to a point outside the highlighted area before releasing. Components that follow this model are predictable and easier to learn and use.
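The press-highlight / release-execute model just described can be sketched as a small state machine. The class and method names here are illustrative, not MIDP API; on a device, a Canvas subclass would call this logic from its `pointerPressed` and `pointerReleased` overrides and repaint when the highlight changes.

```java
// Sketch of the press-highlight / release-execute interaction model.
// Press inside the button turns the highlight on; release executes only
// if the stylus is still inside, and releasing outside cancels.
public class TapButton {
    final int x, y, w, h;          // button bounds on the canvas
    boolean highlighted = false;   // state change shown at stylus down
    boolean executed = false;      // action taken at stylus up

    TapButton(int x, int y, int w, int h) {
        this.x = x; this.y = y; this.w = w; this.h = h;
    }

    boolean contains(int px, int py) {
        return px >= x && px < x + w && py >= y && py < y + h;
    }

    void pointerPressed(int px, int py) {
        highlighted = contains(px, py);            // show what release would do
    }

    void pointerReleased(int px, int py) {
        if (highlighted && contains(px, py)) {
            executed = true;                       // execute at stylus up
        }
        highlighted = false;                       // releasing outside cancels
    }
}
```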
Make obvious targets active and touchable. For example, when using elements like check boxes or radio buttons, allow users to tap either the text or the associated box or button; users expect both to operate the item or list element. Making these obvious targets selectable will make the MIDP implementation or the application's canvas screen less error-prone and more user friendly.
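Concretely, this means hit-testing the union of the control and its label rather than the control alone. The sketch below is a hypothetical illustration with assumed coordinates; real bounds would come from where the screen painted the box and its text.

```java
// Sketch: a check-box item whose touchable area covers both the box
// and its text label, so a tap on either toggles it.
public class CheckItem {
    final int boxX, boxY, boxSize;              // the check box square
    final int labelX, labelY, labelW, labelH;   // the text label's bounds
    boolean checked = false;

    CheckItem(int boxX, int boxY, int boxSize,
              int labelX, int labelY, int labelW, int labelH) {
        this.boxX = boxX; this.boxY = boxY; this.boxSize = boxSize;
        this.labelX = labelX; this.labelY = labelY;
        this.labelW = labelW; this.labelH = labelH;
    }

    private static boolean inRect(int x, int y, int rx, int ry, int rw, int rh) {
        return x >= rx && x < rx + rw && y >= ry && y < ry + rh;
    }

    /** Toggles when the tap lands on either the box or its label. */
    boolean tap(int x, int y) {
        if (inRect(x, y, boxX, boxY, boxSize, boxSize)
                || inRect(x, y, labelX, labelY, labelW, labelH)) {
            checked = !checked;
            return true;
        }
        return false;
    }
}
```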