COCOA Research Lab


The COCOA (CO-designing COllaborations with AI) Lab explores how humans and AI can collaborate effectively, designing intelligent interactive systems that enhance human productivity and empower diverse users. We combine Human-Computer Interaction, AI, and accessibility research to create collaborative tools and adaptive interfaces that support real-world tasks and emerging virtual experiences.


We are part of the Graphics and Media at York (GaMaY) Hub.

Latest News

Two new publications in Fall 2025!

Our lab has two new publications this semester:

Evaluating Usability Challenges in VR Games for Older Adults: A Comparison With and Without AI Assistance

In Adjunct Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology. Key highlights:

  • Older adults encountered several usability problems in VR, including unclear instructions and goals, confusing game mechanics, and difficult-to-navigate menus.
  • AI-generated suggestions helped UX evaluators identify more usability problems.

LLM-powered assistant with electrotactile feedback to assist blind and low vision people with maps and routes preview

In the International Journal of Human-Computer Studies. Key highlights:

  • An LLM-powered assistant that supports natural language queries about map elements, route conditions, and trajectories.
  • A wearable electrotactile feedback device that conveys map features via haptic signals.
  • A user study with blind and low vision participants demonstrating improved efficiency and spatial understanding of route previews.