Data supporting "Evaluating the Usability of Microgestures for Text Editing Tasks in Virtual Reality"
Repository URI
Repository DOI
Change log
Authors
Description
- Origin of the Dataset
This dataset originates from a research project investigating microgesture-based text editing in virtual reality (VR). It was collected as part of an evaluation of the MicroGEXT system, which enables precise, efficient text editing through small, subtle hand movements. The research explores lightweight, ergonomic alternatives to traditional mid-air gesture interaction.
- Data Collection Methods
  • Hardware: The dataset was collected using a Meta Quest Pro VR headset, with Unity's XR Hand Tracking package capturing hand-skeleton data at 72 Hz.
  • Participants: 10 participants were recruited for gesture elicitation and evaluation.
  • Procedure:
- Participants interacted with a VR text-editing application that mapped microgestures to common editing functions.
- Before data collection, participants viewed a demonstration video to understand each gesture.
- Each participant performed each gesture 20 times to ensure data consistency.
- Static gestures were recorded as 2-second clips, while dynamic gestures were recorded as 5-second clips to capture complete motion sequences.
- Swipe gestures were segmented into sub-states (0–3) for granular phase analysis, with each frame assigned a sub-state label (see the sketch after this list).
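At 72 Hz, these clip lengths correspond to fixed frame counts (144 frames for static gestures, 360 for dynamic ones), and the per-frame sub-state labels let a swipe clip be split into its phases. Below is a minimal sketch of both steps; the array layout (frames × joints × 3), the 26-joint count, and all names are illustrative assumptions, not the dataset's actual schema:

```python
import numpy as np

SAMPLE_RATE_HZ = 72                   # hand-skeleton capture rate (from the description above)
STATIC_FRAMES = 2 * SAMPLE_RATE_HZ    # 2-second static clips  -> 144 frames
DYNAMIC_FRAMES = 5 * SAMPLE_RATE_HZ   # 5-second dynamic clips -> 360 frames

def pad_or_trim(clip: np.ndarray, target_frames: int) -> np.ndarray:
    """Force a (frames, joints, 3) clip to a fixed frame count by trimming the
    tail or repeating the last frame (a common convention; not necessarily the
    one used in the original study)."""
    if len(clip) >= target_frames:
        return clip[:target_frames]
    pad = np.repeat(clip[-1:], target_frames - len(clip), axis=0)
    return np.concatenate([clip, pad], axis=0)

def split_substates(clip: np.ndarray, labels: np.ndarray) -> dict[int, np.ndarray]:
    """Group the frames of a swipe clip by their per-frame sub-state label (0-3)."""
    return {s: clip[labels == s] for s in np.unique(labels)}

# Hypothetical usage with a random stand-in for one dynamic swipe clip:
clip = np.random.rand(350, 26, 3)               # 26 joints is illustrative
clip = pad_or_trim(clip, DYNAMIC_FRAMES)        # -> (360, 26, 3)
labels = np.repeat([0, 1, 2, 3], DYNAMIC_FRAMES // 4)
phases = split_substates(clip, labels)
print({s: p.shape for s, p in phases.items()})
```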
- Technical & Non-Technical Information for Reusability
  • The dataset is suitable for:
    - Gesture recognition research (static/dynamic gestures, sub-state segmentation).
    - Human-computer interaction (HCI) studies focusing on XR input methods.
    - Machine learning applications, including deep learning-based gesture classification.
  • Reuse considerations:
    - Compatible with Unity's XR Hand Tracking package and Python-based deep learning frameworks (e.g., PyTorch, TensorFlow).
    - Includes data augmentation scripts for expanding training datasets (see the sketch below).
    - The Null class helps mitigate false activations in real-time applications.
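For illustration, here is a minimal sketch of the kind of augmentation such scripts typically apply to skeleton sequences (random jitter and time scaling) and of a label set that reserves a Null class; this is a generic example under assumed array shapes and gesture names, not the dataset's bundled script:

```python
import numpy as np

# Illustrative label set: index 0 is the Null class (no intentional gesture),
# mirroring the dataset's use of a Null class to suppress false activations.
LABELS = ["null", "swipe", "pinch", "tap"]   # gesture names are hypothetical

def jitter(clip: np.ndarray, sigma: float = 0.005) -> np.ndarray:
    """Add small Gaussian noise to every joint position."""
    return clip + np.random.normal(0.0, sigma, clip.shape)

def time_scale(clip: np.ndarray, factor: float) -> np.ndarray:
    """Resample a (frames, joints, 3) clip to simulate faster/slower execution."""
    src = np.linspace(0, len(clip) - 1, num=round(len(clip) * factor))
    idx = np.clip(np.round(src).astype(int), 0, len(clip) - 1)
    return clip[idx]

def augment(clip: np.ndarray, copies: int = 4) -> list[np.ndarray]:
    """Produce perturbed variants of one clip to expand a training set."""
    rng = np.random.default_rng()
    return [time_scale(jitter(clip), rng.uniform(0.9, 1.1)) for _ in range(copies)]

# Hypothetical usage on a random stand-in clip:
variants = augment(np.random.rand(360, 26, 3))
print([v.shape for v in variants])
```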