Location: EC5 Room 1111
In this talk I will discuss the system architecture for Cortana, Microsoft's personal digital assistant, with a particular focus on its single-turn and multi-turn dialog capabilities. I will describe a ranking-based approach for multi-domain, multi-turn dialogs. I will show how this approach can be extended to incorporate additional information from a speech recognition system to determine the correct system response, and how to handle cases where no correct system response is available. Finally, I will present a technique for automatically handling queries that refer to a user's past interactions with the system, combining an information retrieval approach with a personal knowledge graph. For all these components, I will present experimental results based on data collected from actual Cortana users over time, and discuss how user behaviour can be incorporated to improve the product.
Speaker: Omar Zia Khan, Senior Applied Scientist, Microsoft