Building XR Devices to Sense Mind and Body

Wednesday, December 1, 2021 — 12:00 PM to 1:00 PM EST

Event Description

The next generation of head-mounted displays (HMDs) integrates sensors capable of seamlessly recording human body responses. This capability opens endless possibilities for extended reality (XR) technologies that are more sophisticated in capturing human emotions, psychological states, and reactions to immersive virtual content. This panel explores how three companies have been embedding physiological sensors into HMDs for virtual and augmented reality in order to connect mind and body and create more natural interfaces for XR applications. This integrated technology is enabling applications in assistive technology, affective and physiological computing, and adaptive games for health. The panel brings together three key industry players who are creating the new generation of XR HMDs, integrating sensing technologies such as eye tracking, electroencephalography (EEG), and facial electromyography (EMG), as well as developing novel algorithms to deconstruct physiological signals and create more humanized technologies.

Registration Link: https://uwaterloo.zoom.us/webinar/register/WN_5fXSShmfSqWD24M3bNL25w


Speakers

Charles Nduka
Chief Science Officer (CSO) and co-founder of Emteq Labs.

Charles is a leading facial musculature expert with over 20 years of surgical experience, including 10 years as Consultant Plastic & Reconstructive Surgeon (Queen Victoria Hospital). Charles has an extensive background in research and development, including clinical trials, has over 100 scientific publications, and chairs the Medical Advisory Board of the charity Facial Palsy UK.

Brian Chae
CEO and co-founder of LooxidLabs.

Brian is a researcher with experience in brain-computer and human-machine interfaces. Brian holds a PhD in Computer Science from the Korea Advanced Institute of Science and Technology (KAIST) in South Korea and has extensive experience integrating EEG signals into HCI applications such as teleoperation and virtual reality simulations.

Sarah Pearce
Biosignals engineering lead at Cognixion.

She is a systems design engineer with experience in biosignal processing, embedded machine learning, software development, and user experience design for mixed-reality applications. Her work focuses on developing brain-computer interfaces for communication, control, and neurorehabilitation. She uses a mixed-methods approach to solve problems using her skills in systems engineering and user experience design.

Moderator

John E. Muñoz

John is a game designer and interface technologist specializing in the use of physiological signals to optimize the user experience of interactive systems. John is currently a postdoctoral fellow at the University of Waterloo carrying out research in the fields of assistive technology, human-robot interaction, and virtual reality.
