In this thesis the feasibility of a GPGPU (general-purpose computing on graphics processing units) approach to natural feature description on mobile phone GPUs is assessed. To this end, the SURF descriptor [4] has been implemented with OpenGL ES 2.0/GLSL ES 1.0 and evaluated across different mobile devices. The implementation is several times faster than a comparable CPU variant on the same device. The results prove the feasibility of modern mobile graphics accelerators for GPGPU tasks, especially for the detection phase of natural feature tracking as used in augmented reality applications. Extensive analysis and benchmarking of this approach against state-of-the-art methods have been undertaken. Insights into the modifications necessary to adapt the SURF algorithm to the limitations of a mobile GPU are presented. Finally, an outlook on a GPGPU-based tracking pipeline on a mobile device is provided.
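To make the descriptor stage concrete, the following CPU-side Python sketch outlines the core of an upright SURF descriptor (integral image, Haar wavelet responses, 4x4 subregion sums). It is a minimal illustration, not the thesis's implementation: all function names are invented here, and the thesis performs the equivalent arithmetic in GLSL ES 1.0 fragment shaders, with the integral image presumably held in a texture.

    import numpy as np

    def integral_image(gray):
        # Summed-area table with a zero border, so box sums need no edge checks.
        ii = np.zeros((gray.shape[0] + 1, gray.shape[1] + 1))
        ii[1:, 1:] = gray.cumsum(axis=0).cumsum(axis=1)
        return ii

    def box_sum(ii, x, y, w, h):
        # Sum over the rectangle [x, x+w) x [y, y+h) in O(1) via four lookups.
        return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

    def haar_x(ii, x, y, s):
        # Haar wavelet response in x: right half minus left half of a 2s x 2s box.
        return box_sum(ii, x, y - s, s, 2 * s) - box_sum(ii, x - s, y - s, s, 2 * s)

    def haar_y(ii, x, y, s):
        # Haar wavelet response in y: lower half minus upper half of a 2s x 2s box.
        return box_sum(ii, x - s, y, 2 * s, s) - box_sum(ii, x - s, y - s, 2 * s, s)

    def surf_descriptor(ii, kx, ky, s):
        # Upright SURF: 4x4 subregions around the keypoint (kx, ky) at scale s,
        # each contributing (sum dx, sum dy, sum |dx|, sum |dy|) -> 64 values.
        # Assumes the keypoint lies at least ~11*s pixels from the image border.
        desc = []
        for i in range(-2, 2):              # subregion grid
            for j in range(-2, 2):
                dx = dy = adx = ady = 0.0
                for u in range(5):          # 5x5 samples per subregion
                    for v in range(5):
                        px = int(kx + (i * 5 + u) * s)
                        py = int(ky + (j * 5 + v) * s)
                        rx, ry = haar_x(ii, px, py, s), haar_y(ii, px, py, s)
                        dx += rx; dy += ry; adx += abs(rx); ady += abs(ry)
                desc += [dx, dy, adx, ady]
        d = np.asarray(desc)
        return d / (np.linalg.norm(d) + 1e-12)  # unit length for contrast invariance

In a GPGPU variant, the inner loops are the natural candidates for parallelization: each sample (or each subregion) maps to a fragment, which is what makes the approach attractive on a mobile GPU.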
Research has shown that people recognize personality, gender, inner states and many other items of information simply by observing human motion. Expressive human motion therefore appears to be a valuable non-verbal communication channel. In the quest for more believable characters in virtual three-dimensional simulations, a great deal of visual realism has been achieved over the last decades. However, when interacting with synthetic characters in real-time simulations, human users often still sense an unnatural stiffness. This disturbance of believability is generally caused by a lack of human behavior simulation. Expressive motions that convey personality and emotional states can be of great help in creating more plausible and lifelike characters. This thesis explores the feasibility of automatically generating emotionally expressive animations from given neutral character motions. Such research is needed because common animation methods, such as manual modeling or motion capture, are too costly to create all the variations of motion required for interactive character behavior. To investigate how emotions influence human motion, relevant literature from various research fields has been reviewed and certain motion rules and features have been extracted. These movement domains were validated in a motion analysis and implemented, in an exemplary manner, in a system capable of automating the expression of angry, sad and happy states in a virtual character through its body language. Finally, the results were evaluated in a user test.
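The general idea of layering emotional expressivity onto a neutral clip can be sketched as follows. This is a minimal illustration under simplifying assumptions (joint-angle clips, three made-up style parameters); the numbers and the mapping are purely illustrative and do not reproduce the movement rules the thesis derives from the literature and motion analysis.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class EmotionStyle:
        amplitude: float  # scaling of joint excursions around the rest pose
        speed: float      # playback-rate factor
        posture: float    # static offset, e.g. slumped (-) vs. upright (+)

    # Illustrative values only; the thesis validates its rules empirically.
    STYLES = {
        "sad":   EmotionStyle(amplitude=0.6, speed=0.7, posture=-0.3),
        "angry": EmotionStyle(amplitude=1.3, speed=1.2, posture=0.1),
        "happy": EmotionStyle(amplitude=1.1, speed=1.1, posture=0.2),
    }

    def stylize(neutral, rest_pose, emotion, posture_joints=()):
        # neutral: (frames, joints) joint-angle array; rest_pose: (joints,).
        style = STYLES[emotion]
        # Scale excursions around the rest pose (damped when sad, exaggerated when angry).
        styled = rest_pose + style.amplitude * (neutral - rest_pose)
        # Add a static postural offset to selected joints (e.g. spine, neck).
        styled[:, list(posture_joints)] += style.posture
        # Re-time the clip: sad motion plays slower, angry motion faster.
        n = max(2, round(neutral.shape[0] / style.speed))
        t_old = np.linspace(0.0, 1.0, neutral.shape[0])
        t_new = np.linspace(0.0, 1.0, n)
        return np.stack([np.interp(t_new, t_old, styled[:, j])
                         for j in range(styled.shape[1])], axis=1)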