PhD Candidate, Dynamic Graphics Project,
Dept. of Computer Science, University of Toronto
I am a PhD candidate in the Dynamic Graphics Project at the Department of Computer Science, University of Toronto.
My current focus is enhancing the effectiveness of remote meetings by addressing the limitations of virtual communication tools such as Zoom and Microsoft Teams from a Human-Computer Interaction (HCI) perspective. I have developed solutions such as JollyGesture and CoExplorer (the latter during my internship at Microsoft Research) to help presenters better convey visual non-verbal cues and to optimise screen sharing using generative AI. Currently, I am working on a self-view augmentation project, which aims to enable listeners to convey visual non-verbal feedback to speakers in an easily interpretable way, thereby improving interaction in virtual meetings.
Enhance the interpretation of visual non-verbal cues by encouraging self-monitoring.
Encourage more intentional meetings through LLM-initiated task-space management.
Enable virtual reality presenters to deliver more interactive presentations using gestures.
Tangible Artificial Neural Network (TANN) for an intuitive understanding of how to train an artificial neural network.
More efficient video lookup using tangible rate-based controller prototypes.
A reference deep learning-based feature detector for transfer learning in the field of epigenomics.
This project is ongoing.
CoExplorer.
JollyGesture.
An exploratory research project on computer-mediated meetings. A complete set of results has been obtained and published.
Topics: Human-computer interaction, virtual reality, and natural language processing
With Dr Nicolai Marquardt: I proposed the Tangible Artificial Neural Network project and worked on it.
With Prof. Yvonne Rogers and Prof. Anthony Steed: I contributed to five PhD projects, two of which have been published so far.
Conducted a research project on rate-based controllers for virtual reality environments.
Topic: Human-computer interaction
2024
The CoExplorer Technology Probe: A Generative AI-Powered Adaptive Interface to Support Intentionality in Planning and Running Video Meetings
JollyGesture: Exploring Dual-Purpose Gestures in VR Presentations
CoExplorer: Generative AI Powered 2D and 3D Adaptive Interfaces to Support Intentionality in Video Meetings

2023
Comparing Mixed Reality Agent Representations: Studies in the Lab and in the Wild
Lessons Learned Running Distributed and Remote Mixed Reality Experiments

2022
Should Chatbots Chat or Probe? Perceptions of Conversational Interfaces That Probe Human Decision-Making

2021
LDEncoder: reference deep learning-based feature detector for transfer learning in the field of epigenomics (Poster)
Kind | Description
---|---
Honours and Scholarships |
Service |
Invited Presentations |
Teaching |
Professional Membership |
© 2025 Warren Park. All rights reserved.