Requirements & Approach
The embodiment of subjects should be as precise as possible. One goal is therefore to find a set of straightforward heuristics for offsetting and adjusting the virtual body according to individual physical characteristics of the subjects (e.g. their height or the distances between their joints). Implementing such heuristics as a frame of reference in the application may yield a close approximation of the real motion and therefore a better virtual experience.
In the latest versions of UnrealMe, calibration has been an issue costing both time and patience. Especially when using the Vicon tracking system in the HCI lab, calibrating a single subject took between 10 and 20 minutes.
To address this issue, one aim of this project is to find a calibration procedure that automates some steps of the process and thereby increases the framework's ease of use.
A possible approach is to have subjects stand in front of the tracking device in a basic pose, measure some relative distances (e.g. the distance between two joints such as shoulder and elbow) and apply them to the virtual body. Such a procedure could be implemented in a separate calibration application that passes the computed measures to the VR application.
To make the framework more comprehensible and tangible, a basic configuration needs to be implemented that both demonstrates what the framework can do and shows how it is used.
Therefore a basic demo configuration will be implemented. It will use Microsoft's Kinect v2 as the tracking device and the Oculus Rift DK2 as the display device. Furthermore, there will be a simple map/level to begin with and, depending on time and complexity, perhaps more complex environments later on.
Even though there will be a basic demo configuration, the framework still needs to pursue flexibility and extensibility. Therefore one of the main goals is to integrate the Virtual-Reality Peripheral Network (VRPN) so that the application is compatible with as many input devices as possible. In general, as many parts of the framework as possible shall be configurable via config files in order to avoid hard-coded functionality, as was the case in the first version of the framework.
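Such a config-driven setup might, for instance, declare the tracking device and its joint mapping in a plain text file. The file name, section and key names below are purely illustrative and not the framework's actual format; only the `Tracker0@localhost` address follows the standard VRPN `device@host` convention.

```ini
; UnrealMeTracking.ini -- hypothetical example, not the actual file format
[Tracker]
Type=VRPN
Device=Tracker0@localhost   ; standard VRPN device@host address
UpdateRateHz=60

[JointMapping]
; tracker sensor index -> virtual skeleton bone
0=head
1=spine
2=shoulder_l
3=elbow_l
```

Swapping tracking hardware would then only require editing this file instead of recompiling the application.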
If the main requirements can be implemented and tested in a timely manner, interaction with the virtual environment may become a relevant part of this project as well. In the earlier version of the framework it was possible to carry an object in one's hand, yet this interaction turned out to be too imprecise for users in the user study for my Bachelor thesis. Thus it would be desirable to improve the quality of such interaction.
To be continued...
- Introduction
- Requirements & Approach
- Hard- & Software Setup
  - 3.1 Tracking
  - 3.2 Display
- Implementation
  - 4.1 Core Application
  - 4.2 Embodiment
  - 4.3 Calibration (tbd)
- Tutorials
  - 5.1 Useful links
  - 5.2 General UE4 information/experiences
  - 5.3 Connecting and using a tracking device
  - 5.4 Basic Kinect Demo
  - 5.5 Linking static and dynamic C++ libraries
  - 5.6 Joint IDs
- Future potential