iWalk: A Lightweight Navigation System for Low-Vision Users

Amanda Stent

AT&T Labs - Research 180 Park Ave., Building 103 Florham Park, NJ 07932 [email protected]

Shiri Azenkot

University of Washington Seattle, WA 98195 [email protected]

Ben Stern

AT&T Labs - Research 180 Park Ave., Building 103 Florham Park, NJ 07932 [email protected]

ABSTRACT

Smart phones typically support a range of GPS-enabled navigation services. However, most navigation services on smart phones are of limited use to people with visual disabilities. In this paper, we present iWalk, a speech-enabled local search and navigation prototype for people with low vision that runs on smart phones. iWalk supports speech input and provides real-time turn-by-turn walking directions in speech and text, using distance and time-to-turn information in addition to street names so that users are not forced to read street signs. Between turns, iWalk uses non-speech cues to indicate to the user that s/he is ‘on track’.

Categories and Subject Descriptors

H.5 [Information Interfaces and Presentation]: User Interfaces; K.4 [Computers and Society]: Social Issues—Assistive technologies for persons with disabilities

Keywords

Speech recognition, text to speech, navigation, assistive technology

1. INTRODUCTION

There has been considerable research on navigation and wayfinding applications for people with vision loss; however, many solutions rely on physical sensors in the environment (e.g., [2, 5]) or on special-purpose hardware (e.g., [1, 7, 9, 10, 11]). Recent studies have highlighted the challenges posed by existing general-purpose navigation systems on mobile devices [6, 8]. One key finding is the necessity of getting around the menu-and-tap interaction paradigm on these devices [6]; another is the potential for these devices to be useful if they support user-installable applications [8]. In this paper we present iWalk, a navigation prototype for people with low vision that takes advantage of the dramatic increases in the use of smart phones, the sophistication of the software on those phones, and the speed of the cellular network.

2. IWALK INTERFACE

A typical iWalk interaction takes place in three stages: query input, destination selection, and routing. Let’s look at how Jenny, a user of iWalk who lives in Morristown, NJ, might get directions to a pizza restaurant.

Query Input. During query input, the user provides information about a destination business listing to iWalk using text or speech.¹ For speech recognition, we use AT&T’s Speech Mashups [4]. iWalk also collects the user’s current location from the GPS on the user’s phone, unless the user overrides this in the input query. So Jenny might say pizza, Domino’s, or pizza in Morristown, NJ (see Figure 1(a)).

¹ We currently do not attempt to recognize all street addresses in the USA. If the user wants to go to a residence, s/he provides the name of the closest business listing.
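As a concrete illustration, here is a minimal sketch of how an utterance like Jenny’s could be split into search terms plus an optional spoken location that overrides the GPS fix. The function names and the simple ‘X in Y’ grammar are illustrative assumptions, not iWalk’s actual implementation.

    # A minimal sketch of query parsing (hypothetical; iWalk's actual grammar
    # and its use of AT&T's Speech Mashups are not reproduced here).
    import re

    def parse_query(utterance, gps_location):
        """Split a recognized utterance like 'pizza in Morristown, NJ' into
        search terms and a location, falling back to the phone's GPS fix."""
        match = re.match(r"(?P<what>.+?)\s+in\s+(?P<where>.+)", utterance)
        if match:
            # A spoken location overrides the GPS reading, as in iWalk.
            return match.group("what"), match.group("where")
        return utterance, gps_location

    # Jenny's three ways of asking for pizza:
    for spoken in ["pizza", "Domino's", "pizza in Morristown, NJ"]:
        print(parse_query(spoken, gps_location="40.7968,-74.4815"))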

Destination Selection. iWalk uses the input query and user location to retrieve up to ten listings from a business listing database comprising tens of millions of listings. iWalk presents these listings to the user one by one, in increasing order of distance from the user’s current location. Summary information about each listing, including name, address, phone number, and distance from the user, is presented in text and speech (see Figure 1(b)). So iWalk might say to the user Destination Pizza Grille, 3 Headquarters Plaza, 0.23 miles (about 3 minutes). Jenny can use the left and right soft keys, or the Next and Back buttons, to move between listings. When she has found her desired destination, she can click either on the listing summary or on the Directions button. iWalk also stores the ten destinations most recently chosen by the user as ‘favorites’; if the user chooses Favorites, iWalk lists these destinations.
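The ranking and summary step can be sketched as follows; the listing data, the walking-speed constant, and the use of straight-line distance are all assumptions for illustration (a deployed system would use the search backend’s own distances).

    # Rank candidate listings by great-circle distance and format the
    # spoken summaries. Constants and data are illustrative assumptions.
    from math import radians, sin, cos, asin, sqrt

    WALK_MPH = 3.0  # assumed average walking speed; not specified in the paper

    def miles_between(lat1, lon1, lat2, lon2):
        """Haversine distance in miles."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 3959 * asin(sqrt(h))  # Earth radius is about 3959 miles

    def summarize(user, listings, limit=10):
        """Yield up to `limit` listings, nearest first, as spoken-style summaries."""
        ranked = sorted(listings, key=lambda b: miles_between(*user, b["lat"], b["lon"]))
        for biz in ranked[:limit]:
            d = miles_between(*user, biz["lat"], biz["lon"])
            minutes = round(d / WALK_MPH * 60)
            yield f"Destination {biz['name']}, {biz['address']}, {d:.2f} miles (about {minutes} minutes)"

    user = (40.7968, -74.4815)  # Morristown, NJ (approximate)
    listings = [{"name": "Pizza Grille", "address": "3 Headquarters Plaza",
                 "lat": 40.7990, "lon": -74.4820}]
    for line in summarize(user, listings):
        print(line)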

Routing. iWalk collects routing information from a publicly available routing service, CloudMade [3], and presents the user with turn-by-turn walking directions. Each step is associated with a single screen and is presented in text and in speech. iWalk uses the user’s location, reported at 30-second intervals, to update the directions and to reroute if the user gets off track. In its directions, iWalk uses distance and estimated time information as well as street names, so that navigation instructions are clear even if the user cannot read street signs. For example, iWalk’s second instruction to Jenny might be Turn left in about 50 meters (1 minute). iWalk also provides non-verbal feedback: when the user is close to a turn, iWalk indicates that the turn is coming up by vibrating the phone or beeping.
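The routing behavior described above amounts to a polling loop. The sketch below is a simplified reading of it: the geometry, thresholds, and callback names are assumptions; the paper specifies only the 30-second reporting interval and the mixed speech and vibration cues.

    # A simplified navigation loop: poll the GPS, cue upcoming turns,
    # and reroute when the fix drifts away from every remaining maneuver.
    import math, time

    TURN_ALERT_M = 50   # cue distance, matching the paper's example instruction
    OFF_ROUTE_M = 30    # assumed drift tolerance before rerouting

    def meters(a, b):
        """Equirectangular approximation; adequate at walking scales."""
        lat = math.radians((a[0] + b[0]) / 2)
        dx = math.radians(b[1] - a[1]) * math.cos(lat) * 6371000
        dy = math.radians(b[0] - a[0]) * 6371000
        return math.hypot(dx, dy)

    def navigate(get_fix, turns, speak, vibrate, reroute, poll_s=30):
        """`turns` is an ordered list of (lat, lon, instruction) maneuvers."""
        while turns:
            fix = get_fix()
            if min(meters(fix, t[:2]) for t in turns) > OFF_ROUTE_M:
                turns = reroute(fix)    # e.g. ask the routing service again
                speak("Rerouting")
            elif meters(fix, turns[0][:2]) < TURN_ALERT_M:
                vibrate()               # non-speech 'turn coming up' cue
                speak(turns.pop(0)[2])  # e.g. "Turn left in about 50 meters (1 minute)"
            time.sleep(poll_s)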

Interface Accessibility. iWalk uses a high-contrast color palette and a large font to increase viewability for users with low vision. It also speaks all screen elements as the user mouses over them. For example, in the data input screen in Figure 1, as the user mouses over the buttons iWalk says Click to Speak, Search, Favorites. As the user mouses over the text input field, iWalk speaks the content of the field.
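The mouse-over speech behavior reduces to a mapping from screen elements to spoken labels; the sketch below assumes a generic speak() stand-in for whatever text-to-speech engine the platform provides.

    # Speak-on-hover, sketched with a placeholder TTS call.
    def speak(text):
        print(f"[TTS] {text}")  # stand-in for a real text-to-speech engine

    LABELS = {
        "speak_button": "Click to Speak",
        "search_button": "Search",
        "favorites_button": "Favorites",
    }

    def on_hover(element_id, field_text=None):
        """Speak the element's label, or a text field's current contents."""
        speak(field_text if field_text is not None else LABELS.get(element_id, element_id))

    on_hover("speak_button")  # -> [TTS] Click to Speak
    on_hover("query_field", field_text="pizza in Morristown, NJ")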

Figure 1: Three screens in the iWalk interface: data input, destination selection, and routing

3. IWALK ARCHITECTURE

The iWalk client is a lightweight ‘widget’-based client that runs on smart phones including iPhones, BlackBerries, and Android phones. The client accesses various servers ‘in the cloud’ as needed for functions including automatic speech recognition, local business search, geocoding and reverse geocoding, and routing.
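In outline, the thin client keeps only the user interface and delegates the heavy lifting to HTTP services; the sketch below uses hypothetical placeholder endpoints, not the actual AT&T or CloudMade interfaces.

    # A thin-client pattern: each capability is one cloud call.
    # Endpoint URLs are hypothetical placeholders.
    import json
    from urllib import parse, request

    SERVICES = {
        "asr": "https://example.com/asr",          # automatic speech recognition
        "search": "https://example.com/search",    # local business search
        "geocode": "https://example.com/geocode",  # geocoding / reverse geocoding
        "route": "https://example.com/route",      # walking directions
    }

    def call(service, **params):
        """POST form-encoded parameters to a service and decode the JSON reply."""
        data = parse.urlencode(params).encode()
        with request.urlopen(request.Request(SERVICES[service], data=data)) as resp:
            return json.load(resp)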

4. CONCLUSIONS AND FUTURE WORK

iWalk is a navigation service for people with low vision that does not rely on special-purpose hardware, but instead uses widely available hardware and network services. Its key features include speech input for destination selection, integrated local business search, spoken turn-by-turn navigation directions, and speech cues in the GUI. iWalk is a mobile example of a “mashup”: a service created by fitting together services available over the internet. There is an increasing number of such services, and many of them have potential for assistive technologies [8]. For example, we anticipate that in the next year public transit and event information will also become widely available through routing APIs. In the future, we plan to add event search and public transit information to iWalk. iWalk currently uses only GPS location information from the user’s device. However, smart phones increasingly come with additional sensors and can report the user’s speed. This provides opportunities to personalize navigation information for particular types of user; for example, for a slower user iWalk may suggest an alternative mode of transportation at a shorter maximum distance.
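The speed-based personalization could be as simple as capping the suggested walking radius by the trip time at the user’s own measured pace; the time budget below is an illustrative assumption.

    # Cap the walking radius by a fixed time budget at the user's pace.
    MAX_WALK_MINUTES = 20  # assumed comfort budget for a walking trip

    def max_walk_miles(user_speed_mph):
        """Walking range reachable within the time budget at the user's pace."""
        return user_speed_mph * MAX_WALK_MINUTES / 60

    # A slower walker gets a shorter radius before iWalk suggests transit instead.
    print(max_walk_miles(3.0))  # typical pace -> 1.0 miles
    print(max_walk_miles(1.5))  # slower pace  -> 0.5 miles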

5. REFERENCES

[1] T. Amemiya and H. Sugiyama. Haptic handheld wayfinder with pseudo-attraction force for pedestrians with visual impairments. In Proceedings of ASSETS, 2009.
[2] Y.-J. Chang et al. A novel wayfinding system based on geo-coded QR codes for individuals with cognitive impairments. In Proceedings of ASSETS, 2007.
[3] CloudMade. http://www.cloudmade.com.
[4] G. Di Fabbrizio, J. G. Wilpon, and T. Okken. A speech mashup framework for multimodal mobile services. In Proceedings of ICMI-MLMI, 2009.
[5] S. Gifford et al. Introduction to the Talking Points project. In Proceedings of ASSETS, 2005.
[6] T. Guerreiro, H. Nicolau, J. Jorge, and D. Gonçalves. NavTap: A long term study with excluded blind users. In Proceedings of ASSETS, 2009.
[7] J. Ju, E. Ko, and E. Kim. EYECane: Navigating with camera embedded white cane for visually impaired person. In Proceedings of ASSETS, 2009.
[8] S. Kane et al. Freedom to roam: A study of mobile device adoption and accessibility for people with visual and motor disabilities. In Proceedings of ASSETS, 2009.
[9] D. Ross and B. Blasch. Wearable interfaces for orientation and wayfinding. In Proceedings of ASSETS, 2000.
[10] O. Venard, G. Baudoin, and G. Uzan. Experiment and evaluation of the RAMPE interactive auditive information system for the mobility of blind people in public transport. In Proceedings of ASSETS, 2008.
[11] Z. Wang, B. Li, T. Hedgpeth, and T. Haven. Instant tactile-audio map: Enabling access to digital maps for people with visual impairment. In Proceedings of ASSETS, 2009.

