My research centers on the study and development of computational models for situated spoken language interaction and collaboration. The long-term question that shapes my research agenda is: how can we enable interactive systems to reason more deeply about their surroundings and seamlessly participate in open-world, multiparty dialog and collaboration with people?
Physically situated interaction hinges critically on the ability to reason about
and model processes like conversational engagement,
turn-taking, grounding, interaction planning, and action coordination. Creating robust
solutions that operate in the real world brings broader AI challenges to the fore.
Example questions include issues of representation (e.g., what are useful formalisms for creating actionable, robust models for multiparty interaction?), machine learning methods for multimodal inference from streaming sensory data, predictive modeling, and decision making and planning under uncertainty and temporal constraints.