Prototyped and implemented context-awareness across domains and languages to enrich the conversational user experience of Samsung's Bixby, in collaboration with design, strategy, and engineering teams.
"What’s the time now?"
"Should I take an umbrella today?"
"Is it a holiday tomorrow?"
"Set a timer for 30 minutes."
"Remind me to stop at the grocery store to buy bread when I’m jogging."
Users want answers to these immediately. They get frustrated when their voice assistant gives the wrong answer, or fails to help at the moment they need it most. Simply put, users want contextual help with everyday tasks.
Several factors come into play when we talk about conversational intelligent assistants. Location is one key factor: whether you ask a question from Seoul, San Francisco, Bangalore, or London can change whether the answer holds good for you. Context-awareness signals include location, timezone, language, news, weather, food, reservations, and other internal apps (notes, reminders).
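To make "context" concrete, here is a minimal sketch of what a per-request context object might look like in Kotlin. Every name and field below is an illustrative assumption, not Bixby's actual data model.

```kotlin
// Minimal sketch of a per-request context object. All names and fields
// here are illustrative assumptions, not Bixby's actual model.
data class GeoLocation(val latitude: Double, val longitude: Double)

data class UserContext(
    val location: GeoLocation,     // where the request originates
    val timezone: String,          // e.g. "Asia/Seoul"
    val language: String,          // e.g. "en-US"
    val activity: String? = null   // e.g. "jogging", if health data is available
)

// The same utterance resolves differently depending on context:
// "Is it a holiday tomorrow?" needs the holiday calendar for the
// user's region, not one global answer.
fun holidayCalendarFor(ctx: UserContext): String = when {
    ctx.timezone.startsWith("Asia/Seoul")    -> "KR"
    ctx.timezone.startsWith("Europe/London") -> "GB"
    ctx.timezone.startsWith("Asia/Kolkata")  -> "IN"
    else                                     -> "US"  // fallback for the sketch
}
```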
"Remind me to stop at the grocery store to buy bread when I’m jogging."
There are several things at play here. A task that seems so simple to the user needs multiple applications working together: it has to invoke the native Clock, Maps, and Samsung Health applications.
While Clock is a system application, Maps and Samsung Health are cloud-based applications that must call the backend. The backend has to fetch responses from these multiple applications, compose a single response, and speak it in a language the user understands. All of this must happen in a very short period of time.
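One way to keep that latency budget is to fan out to the cloud applications concurrently. The sketch below uses Kotlin coroutines; the service interfaces and method names are hypothetical, since the actual backend APIs are not public. The point is that end-to-end latency tracks the slowest call rather than the sum of all calls.

```kotlin
import kotlinx.coroutines.async
import kotlinx.coroutines.coroutineScope

// Hypothetical service interfaces for illustration; the real backend
// APIs are not public, so these names and signatures are assumptions.
interface ClockService { fun scheduleReminder(text: String): String }
interface MapsService { suspend fun findNearby(category: String, lat: Double, lon: Double): String }
interface HealthService { suspend fun currentActivity(): String }

// "Remind me to stop at the grocery store to buy bread when I'm jogging."
// The two cloud calls run concurrently; the local system app is invoked
// directly. One spoken response is composed from all three results.
suspend fun handleReminder(
    clock: ClockService,
    maps: MapsService,
    health: HealthService,
    lat: Double,
    lon: Double
): String = coroutineScope {
    val store = async { maps.findNearby("grocery store", lat, lon) }  // cloud call
    val activity = async { health.currentActivity() }                 // cloud call
    val reminder = clock.scheduleReminder("buy bread")                // local system app

    // Compose a single natural-language answer for the user.
    "OK, I'll remind you to $reminder at ${store.await()} " +
        "the next time you're ${activity.await()}."
}
```

The final string here stands in for the response-generation step: in practice that composed answer would still need to be rendered in the user's own language before being spoken aloud.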