Voice Actions & Interaction API: Ubiquitous & Hands-Free Interfaces for Android

UWS 11:50am

Voice actions let users interact with mobile apps using natural speech, enabling hands-free use in situations where the user is multi-tasking or otherwise occupied. Consistent support for voice actions across Android devices (phones, tablets, Wear) makes them a perfect enabler for ubiquitous computing experiences, where actions are auto-magically handled by the right app, in the right context, across all the devices a user may own.
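To make this concrete, here is a minimal sketch (in Java, not taken from the talk) of an activity that handles a system voice action such as "search for cat videos in MyApp". It assumes the com.google.android.gms.actions.SEARCH_ACTION intent action, with the spoken query delivered in the SearchManager.QUERY extra; the SearchableActivity class and the manifest intent filter it relies on are hypothetical.

    import android.app.Activity;
    import android.app.SearchManager;
    import android.content.Intent;
    import android.os.Bundle;

    // Hypothetical activity; assumes the manifest declares an intent filter for
    // the system voice action "com.google.android.gms.actions.SEARCH_ACTION".
    public class SearchableActivity extends Activity {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            Intent intent = getIntent();
            if ("com.google.android.gms.actions.SEARCH_ACTION".equals(intent.getAction())) {
                // The recognized speech ("cat videos" in "search for cat videos in MyApp")
                // arrives as an ordinary string extra on the launching intent.
                String query = intent.getStringExtra(SearchManager.QUERY);
                showResultsFor(query);
            }
        }

        private void showResultsFor(String query) {
            // App-specific search UI goes here (placeholder).
        }
    }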

While voice actions simply trigger the relevant app, the Voice Interaction API (announced in the Android M Preview) lets the app engage the user in a follow-up dialog to clarify the action (e.g., confirm it) or narrow it down (e.g., select from alternatives).
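As a hedged sketch of how such a follow-up dialog might look in code, the snippet below uses the Android M VoiceInteractor.ConfirmationRequest to confirm a voice-triggered action before committing it. The OrderActivity class and placeOrder() helper are hypothetical; isVoiceInteraction(), getVoiceInteractor(), and submitRequest() are the M Preview voice-interaction entry points.

    import android.app.Activity;
    import android.app.VoiceInteractor;
    import android.os.Bundle;

    // Hypothetical activity launched by a voice action (e.g., "order a pizza in MyApp").
    public class OrderActivity extends Activity {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            if (isVoiceInteraction()) {
                // Ask the voice system to speak/display a confirmation prompt.
                VoiceInteractor.Prompt prompt =
                        new VoiceInteractor.Prompt("Place the order now?");
                getVoiceInteractor().submitRequest(
                        new VoiceInteractor.ConfirmationRequest(prompt, null) {
                            @Override
                            public void onConfirmationResult(boolean confirmed, Bundle result) {
                                if (confirmed) {
                                    placeOrder();  // hypothetical app logic
                                }
                                finish();          // hand control back to the user
                            }
                        });
            }
        }

        private void placeOrder() {
            // App-specific fulfillment goes here (placeholder).
        }
    }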

In this talk we’ll look at support for Voice Actions and the Voice Interaction API in Android, explore different usage contexts, and walk through code snippets that show how we can leverage them in our own applications.
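For the "select from alternatives" case, a similar sketch uses VoiceInteractor.PickOptionRequest to let the user narrow a request down by voice; the SizePickerActivity class, the option labels, and the applySize() helper are hypothetical.

    import android.app.Activity;
    import android.app.VoiceInteractor;
    import android.os.Bundle;

    // Hypothetical activity that narrows down a voice-triggered request by
    // asking the user to pick one of several options.
    public class SizePickerActivity extends Activity {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            if (!isVoiceInteraction()) {
                return;  // fall back to the normal touch UI
            }

            VoiceInteractor.PickOptionRequest.Option[] options = {
                    new VoiceInteractor.PickOptionRequest.Option("Small", 0),
                    new VoiceInteractor.PickOptionRequest.Option("Medium", 1),
                    new VoiceInteractor.PickOptionRequest.Option("Large", 2)
            };
            getVoiceInteractor().submitRequest(
                    new VoiceInteractor.PickOptionRequest(
                            new VoiceInteractor.Prompt("Which size would you like?"),
                            options, null) {
                        @Override
                        public void onPickOptionResult(boolean finished, Option[] selections,
                                                       Bundle result) {
                            // finished == true once the user has settled on a single option.
                            if (finished && selections != null && selections.length == 1) {
                                applySize(selections[0].getIndex());
                            }
                            if (finished) {
                                finish();
                            }
                        }
                    });
        }

        private void applySize(int sizeIndex) {
            // App-specific handling goes here (placeholder).
        }
    }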

Nitya Narasimhan

Nitya is a software professional with more than 15 years of R&D experience in industry, startups, and academia. She spent over a decade at Motorola Labs working on advanced concepts for next-generation phone and television platforms, and currently balances consulting with early-stage technology development in the mobile and web domains.

She co-organizes the Google Developer Group chapters in New York City and Hudson Valley, and recently facilitated a successful 3-month Android Study Jam that taught a ‘beginner’ cohort of students and professionals to develop a non-trivial Android application.