Let them focus on the real world

Pervasive applications should not distract from shoppers' natural interactions with products, each other, and the environment. Shoppers’ focus should remain on their physical surroundings until either they choose to focus on the application or the application takes action to assist them in completing their task.

When the application does need to interrupt, these calls-for-attention (CFAs) should be contextually appropriate and should take advantage of multiple modalities (visual, auditory, and haptic) to accommodate differences in shoppers' abilities and in the state of the environment.

1 Modality matters - alerts should be relevant and distinct.

High-value events require shoppers' attention, and the CFAs that announce them should be distinct, emphatic, and relevant. As a general rule, the greater the risk that the shopper will go off task, the more distinct the CFA should be. Reserve audible and haptic methods for infrequent but critical alerts, or to escalate an earlier alert that has gone unnoticed. This helps avoid habituation or desensitization to these types of feedback. You don't want your shoppers to become blind to the fact that their attention is needed.
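To make the rule concrete, here is a minimal sketch (in TypeScript) of a severity-to-modality policy. The severity levels, the unnoticedAttempts counter, and the chooseModalities function are illustrative assumptions, not part of any platform API.

```typescript
// Hypothetical sketch: choose CFA modalities from alert severity and how
// many earlier attempts went unnoticed. Names and thresholds are
// illustrative assumptions, not a prescribed implementation.
type Modality = "visual" | "auditory" | "haptic";
type Severity = "info" | "important" | "critical";

function chooseModalities(severity: Severity, unnoticedAttempts: number): Modality[] {
  // Routine information stays visual so shoppers do not habituate to
  // sounds and vibrations.
  if (severity === "info") {
    return ["visual"];
  }
  // Important alerts start visual and escalate only if ignored.
  if (severity === "important") {
    return unnoticedAttempts === 0 ? ["visual"] : ["visual", "haptic"];
  }
  // Critical alerts may use every channel, adding sound once a quieter
  // attempt has already gone unnoticed.
  return unnoticedAttempts === 0
    ? ["visual", "haptic"]
    : ["visual", "haptic", "auditory"];
}

// Example: a second attempt to flag a critical event uses all three channels.
console.log(chooseModalities("critical", 1)); // ["visual", "haptic", "auditory"]
```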

2 Don’t be needy - the shopper’s primary focus is on the environment.

The most useful pervasive applications stay out of the way until they are needed. They limit the amount of dialog with the shopper and are smart enough to know when to ask for attention. By attracting attention only at key points in the experience, they allow shoppers to distribute their attention more accurately, efficiently, and safely.

3 Show, don’t tell.

Display a persistent view of the shoppers' environment. Visually explain the context, but allow shoppers to zoom in on areas or elements of interest. This contextual view of the environment allows shoppers to quickly orient themselves, understand their options, and choose the best path. The visual representation should give them anytime, anywhere access to just about anything the application or environment has to offer. But remember, this is a pervasive application: it should not be the focus; the environment should be.

4 Detect and adapt to the attention of the shopper.

While the system actively tracks shoppers and their context, the application patiently waits for a cue to act. It should not require shoppers to tap, swipe, or yell to get its attention. Most modern mobile devices have front-facing cameras, and some even have proximity sensors (and who knows what's next). Use these features to sense the shopper's attention and adapt accordingly. If your application can detect the shopper's focus, it can more accurately assess when to adapt its feedback methods and modalities. For instance, if the shopper is looking at the device, there may be no need for an audible alert for an approaching event.
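As one way to sketch this, the snippet below drops the audible channel when the device believes the shopper is already looking at it. The AttentionState interface and isLookingAtDevice flag are placeholders for whatever camera or proximity signal the platform actually exposes.

```typescript
// Hypothetical attention signal, e.g. derived from a front-facing camera
// or proximity sensor. How it is populated is platform-specific.
interface AttentionState {
  isLookingAtDevice: boolean;
  secondsSinceLastInteraction: number;
}

type Modality = "visual" | "auditory" | "haptic";

function adaptToAttention(requested: Modality[], attention: AttentionState): Modality[] {
  // If the shopper is already looking at the screen, a visual cue is
  // enough; drop the audible channel to avoid needless interruption.
  if (attention.isLookingAtDevice) {
    return requested.filter((m) => m !== "auditory");
  }
  // Otherwise keep the requested channels so the CFA is not missed.
  return requested;
}

// Example: an approaching event no longer needs a sound if the shopper
// is watching the display.
const attention: AttentionState = { isLookingAtDevice: true, secondsSinceLastInteraction: 2 };
console.log(adaptToAttention(["visual", "auditory"], attention)); // ["visual"]
```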

5 Adapt calls-for-attention (CFAs) to the situation or environment.

Understanding the context of the shopper includes situational factors of the environment, like ambient noise and inconsistent lighting. This information can be used to more accurately choose the modality of a CFA. For instance, in a loud environment, there is a point where raising the volume of an audible alert is less a solution than it is an aggravation. In situations like this, the application should escalate its CFAs appropriately and distinctly, perhaps with a random or erratic pattern of vibrations.

Example: A shopper is presented with a visual CFA but does not appear to be paying attention. The system escalates the CFA to include auditory cues. If that cue is also ignored, the system increases the volume only to a level that will not distract other shoppers. The system then escalates the CFA further by flashing the display with a bold color (visual) or vibrating the device in a distinct pattern (haptic).
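A minimal sketch of that escalation ladder might look like the following; the 75 dB noise cutoff, the volume caps, and the step ordering are illustrative assumptions rather than measured values.

```typescript
// Hypothetical escalation sketch for an ignored CFA. Each step is tried in
// order until the shopper responds; thresholds are illustrative only.
type CfaStep =
  | { kind: "visual-banner" }
  | { kind: "audible"; volume: number }      // 0.0 - 1.0
  | { kind: "visual-flash"; color: string }
  | { kind: "haptic"; pattern: number[] };   // vibration pattern in ms

function escalationLadder(ambientNoiseDb: number, othersNearby: boolean): CfaStep[] {
  // Cap the audible volume so escalation does not disturb other shoppers,
  // and lower it further when people are close by.
  const volumeCap = othersNearby ? 0.4 : 0.7;

  const steps: CfaStep[] = [{ kind: "visual-banner" }];

  // Past roughly 75 dB of ambient noise (an assumed threshold), louder
  // audio is more aggravation than solution, so skip the audible step.
  if (ambientNoiseDb < 75) {
    steps.push({ kind: "audible", volume: volumeCap });
  }

  // Final, most distinct steps: a bold flash and an irregular vibration.
  steps.push({ kind: "visual-flash", color: "#ff6600" });
  steps.push({ kind: "haptic", pattern: [100, 50, 300, 50, 100] });

  return steps;
}

// Example: in a loud aisle with other shoppers nearby, the audible step
// is skipped and the ladder goes straight from banner to flash/vibration.
console.log(escalationLadder(82, true));
```

Keeping the ladder as data (an ordered list of steps) also makes it easy to tune per store or per shopper without touching the escalation logic itself.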

Accessibility Considerations

Audible and haptic feedback can help visually and cognitively impaired shoppers understand and navigate the environment more effectively than visual cues alone. Haptic feedback should replace audible feedback in areas with excessive ambient noise, or when the shopper requests silent operation.
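A small sketch of that substitution rule, assuming hypothetical silentRequested and ambientNoiseDb inputs and the same assumed 75 dB cutoff:

```typescript
type Modality = "visual" | "auditory" | "haptic";

// Hypothetical inputs: a user preference for silent operation and an
// ambient noise estimate in dB. The 75 dB cutoff is an assumption.
function applyAccessibilityRules(
  modalities: Modality[],
  silentRequested: boolean,
  ambientNoiseDb: number
): Modality[] {
  const tooNoisy = ambientNoiseDb >= 75;
  if (silentRequested || tooNoisy) {
    // Replace audible feedback with haptic rather than dropping it,
    // then remove any duplicate channels.
    return Array.from(new Set(modalities.map((m) => (m === "auditory" ? "haptic" : m))));
  }
  return modalities;
}

// Example: in a noisy aisle, an audible cue becomes a vibration.
console.log(applyAccessibilityRules(["visual", "auditory"], false, 80)); // ["visual", "haptic"]
```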

14 references informed this principle.

[1] Baber, C; Bristow, H; Cheng, S; Hedley, A; Kuriyama, Y; Lien, M; Pollard, J & Sorrell, P, Augmenting Museums and Art Galleries, 439-446, 2001.

[2] Li, Ian; Dey, AK & Forlizzi, Jodi, Understanding my data, myself: supporting self-reflection with ubicomp technologies, Conference on Ubiquitous computing, 2011.

[3] Lim, Brian, Improving Understanding, Trust, and Control with Intelligibility in Context-Aware Applications, Human-Computer Interaction, May 2011.

[4] Phansalkar, Shobha; Edworthy, Judy; Hellier, Elizabeth; Seger, Diane L; Schedlbauer, Angela; Avery, Anthony J & Bates, David W, A review of human factors principles for the design and implementation of medication safety alerts in clinical information systems., Journal of the American Medical Informatics Association : JAMIA, Vol. 17, No. 5, 493-501, 2010.

[5] Ritsos, PD; Ritsos, DP & Gougoulis, AS, Standards for Augmented Reality: a User Experience Perspective, Standards Meeting, 1-9, February 17, 2011.

[6] Roy, Nirmalya; Gu, Tao & Das, Sajal K, Supporting pervasive computing applications with active context fusion and semantic context delivery, Pervasive and Mobile Computing, Vol. 6, No. 1, 21-42, 2010.

[7] Srinivas, Preethi; Huang, Haidan; Pirzadeh, Afarin & Bolchini, Davide, Clever Shopper: Supporting In-Store Decision-Making , 2011.

[8] Wobbrock, Jacob O.; Kane, Shaun K.; Gajos, Krzysztof Z.; Harada, Susumu & Froehlich, Jon, Ability-Based Design, ACM Transactions on Accessible Computing, Vol. 3, No. 3, 1-27, April 2011.

[9] Zhang, Xiaohui & Uruchurtu, Elizabeth, A User Experience Perspective of Design for Context-Aware Adaption, 2011 Seventh International Conference on Intelligent Environments, 322-325, July 2011.

[10] Black, D & Clemmensen, NJ, Drawing Attention to Context-Awareness with CaST: A Context-Aware Shopping Trolley, 2006.

[11] Vermeulen, J, Improving Intelligibility and Control in Ubicomp, Hasselt University, Ubicomp ’10, Sept. 2010.

[12] Ballendat, T; Marquardt, N & Greenberg, S, Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment, November 2010.

[13] Black, D & Clemmensen, NJ, When More is Less: Designing for Attention in Mobile Context-Aware Computing – Exploring a Context-Aware Shopping Trolley, Aalborg University, 2006.

[14] Clemmensen, NJ & Black, D, Reduce Task Complexity by Dividing Attention: Exploring a Context-Aware Shopping Trolley, 2006.

© 2014 - Jonathan Morgan | @promorock | LinkedIn