Poster: Unobtrusively Mining Vital Sign and Embedded Sensitive Info via AR/VR Motion Sensors

Document Type

Conference Proceeding

Publication Date

10-23-2023

Abstract

Despite the rapid growth of augmented reality and virtual reality (AR/VR) in various applications, the understanding of information leakage through sensor-rich headsets remains in its infancy. In this poster, we investigate an unobtrusive privacy attack that exposes users' vital signs and embedded sensitive information (e.g., gender, identity, body fat ratio) based on unrestricted AR/VR motion sensors. The key insight is that the headset is mounted closely on the user's face, allowing the motion sensors to detect facial vibrations produced by the user's breathing and heartbeats. Specifically, we employ deep-learning techniques to reconstruct vital signs with signal quality comparable to that of dedicated medical instruments and to derive users' gender, identity, and body fat information. Experiments on three types of commodity AR/VR headsets reveal that our attack can successfully reconstruct high-quality vital signs, detect gender (accuracy over 93.33%), re-identify users (accuracy over 97.83%), and estimate body fat ratio (error below 4.43%).
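
The key insight in the abstract is that headset-mounted motion sensors pick up subtle facial vibrations caused by breathing and heartbeats. A minimal signal-processing sketch of that idea appears below: it band-pass filters synthetic accelerometer data into assumed respiration and heartbeat frequency bands and estimates the corresponding rates. The sampling rate, frequency bands, filter design, and variable names are illustrative assumptions and are not the poster's deep-learning reconstruction pipeline.

```python
# Illustrative sketch only: isolating respiration (~0.1-0.5 Hz) and heartbeat
# (~0.8-2.5 Hz) components from headset IMU data with band-pass filters.
# Sampling rate, bands, and signal model are assumptions for illustration.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100.0  # assumed IMU sampling rate (Hz)

def bandpass(signal, low_hz, high_hz, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    nyq = fs / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)

def dominant_rate_per_min(signal, fs=FS):
    """Estimate the dominant periodic rate (per minute) from the FFT peak."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)] * 60.0

# Synthetic example: 60 s of accelerometer z-axis data with simulated
# respiration (0.25 Hz ~ 15 breaths/min) and heartbeat (1.2 Hz ~ 72 bpm).
t = np.arange(0, 60, 1.0 / FS)
accel_z = (0.05 * np.sin(2 * np.pi * 0.25 * t)     # breathing-induced motion
           + 0.005 * np.sin(2 * np.pi * 1.2 * t)   # heartbeat-induced vibration
           + 0.002 * np.random.randn(len(t)))      # sensor noise

respiration = bandpass(accel_z, 0.1, 0.5)
heartbeat = bandpass(accel_z, 0.8, 2.5)
print(f"Estimated breathing rate: {dominant_rate_per_min(respiration):.1f} breaths/min")
print(f"Estimated heart rate:     {dominant_rate_per_min(heartbeat):.1f} bpm")
```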

Identifier

85176129251 (Scopus)

ISBN

9781450399265

Publication Title

Proceedings of the International Symposium on Mobile Ad Hoc Networking and Computing (MobiHoc)

External Full Text Location

https://doi.org/10.1145/3565287.3623624

First Page

308

Last Page

309

Grant

CCF-2000480

Fund Ref

National Science Foundation
