TY - GEN
T1 - BystandAR: Protecting Bystander Visual Data in Augmented Reality Systems
T2 - 21st Annual International Conference on Mobile Systems, Applications and Services, MobiSys 2023
AU - Corbett, Matthew
AU - David-John, Brendan
AU - Shang, Jiacheng
AU - Hu, Y. Charlie
AU - Ji, Bo
N1 - Funding Information:
We thank the anonymous shepherd and reviewers for their insightful feedback. We also thank Dr. Qinghua Li and his coauthors of [12] for kindly sharing their source code and dataset. Additionally, we thank the user study participants for volunteering their time. This work is supported in part by the Commonwealth Cyber Initiative (CCI) and the NSF grants under CNS 2112778 and 2153397.
Publisher Copyright:
© 2023 Owner/Author(s).
PY - 2023/6/18
Y1 - 2023/6/18
N2 - Augmented Reality (AR) devices are set apart from other mobile devices by the immersive experience they offer. While the powerful suite of sensors on modern AR devices is necessary for enabling such an immersive experience, they can create unease in bystanders (i.e., those surrounding the device during its use) due to potential bystander data leaks, which is called the bystander privacy problem. In this paper, we propose BystandAR, the first practical system that can effectively protect bystander visual (camera and depth) data in real time with only on-device processing. BystandAR builds on a key insight that the device user's eye gaze and voice are highly effective indicators for subject/bystander detection in interpersonal interaction, and leverages novel AR capabilities such as eye gaze tracking, wearer-focused microphone, and spatial awareness to achieve a usable frame rate without offloading sensitive information. Through a 16-participant user study, we show that BystandAR correctly identifies and protects 98.14% of bystanders while allowing access to 96.27% of subjects. We accomplish this with average frame rates of 52.6 frames per second without the need to offload unprotected bystander data to another device.
KW - augmented reality
KW - bystander privacy
KW - eye tracking
KW - visual data
UR - http://www.scopus.com/inward/record.url?scp=85169431207&partnerID=8YFLogxK
U2 - 10.1145/3581791.3596830
DO - 10.1145/3581791.3596830
M3 - Conference contribution
AN - SCOPUS:85169431207
T3 - MobiSys 2023 - Proceedings of the 21st Annual International Conference on Mobile Systems, Applications and Services
SP - 370
EP - 382
BT - MobiSys 2023 - Proceedings of the 21st Annual International Conference on Mobile Systems, Applications and Services
PB - Association for Computing Machinery, Inc
Y2 - 18 June 2023 through 22 June 2023
ER -