Is it possible to create immersion in users with a VR headset alone? To answer this question, two simulations of a roller-coaster ride without haptic feedback were developed and implemented with Unreal Engine 4.20.3 for an HTC Vive VR headset. The second simulation differs from the first by presenting unusual events during the ride, which are assumed to intensify the experience of immersion. Eleven subjects took part in the study. The evaluation of a questionnaire measuring the intensity of immersion and of the answers to open questions shows that immersion was successfully created in both simulations. Some features of the simulation deepened the immersive experience for individual subjects, but not for others. The significance of the results and possibilities for optimization in future studies are discussed.
The goal of this bachelor thesis was to add an image processing step to the music recognition software AudiVeris in order to extract data even from faulty music sheet images. The procedure starts with a binarization using a regional version of Otsu's method. Following this, the music sheet is searched for possible bends, similar to those a hardcover book would cause. To achieve this, the Hough transform is used for line detection and the k-means algorithm for cluster detection. Thereafter the music sheet image is straightened using the discovered curvature.
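The first step described above, a regional (tile-wise) variant of Otsu's method, can be sketched in a few lines of NumPy. This is a minimal illustration under assumed parameters (tile layout, 8-bit input), not the thesis's AudiVeris code:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the Otsu threshold for a uint8 grayscale image region."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    bins = np.arange(256)
    w0 = np.cumsum(hist)                  # pixels at or below each candidate threshold
    w1 = total - w0
    sum0 = np.cumsum(hist * bins)         # intensity mass at or below each threshold
    mu0 = np.where(w0 > 0, sum0 / np.maximum(w0, 1), 0)
    mu1 = np.where(w1 > 0, (sum0[-1] - sum0) / np.maximum(w1, 1), 0)
    between = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance, maximized by Otsu
    return int(np.argmax(between))

def regional_otsu(gray, tiles=4):
    """Binarize by computing a separate Otsu threshold per tile (regional Otsu)."""
    h, w = gray.shape
    out = np.zeros_like(gray)
    for i in range(tiles):
        for j in range(tiles):
            ys = slice(i * h // tiles, (i + 1) * h // tiles)
            xs = slice(j * w // tiles, (j + 1) * w // tiles)
            region = gray[ys, xs]
            out[ys, xs] = (region > otsu_threshold(region)) * 255
    return out
```

Computing the threshold per tile rather than globally is what lets the binarization cope with uneven lighting across a photographed page.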
A growing flood of advertising and information, as well as a limited capacity to absorb information, presents a challenge to marketing today. Marketing in general is very important for the success of a company, but classic marketing theory, which ignores the special characteristics of young companies, dominates. A new company with a new idea and a new product meets an existing market with customers and established competitors. For an innovation-driven founding context this is a special challenge.
In this bachelor thesis, the subjects of entrepreneurship and marketing are discussed first. In addition, the special characteristics of young companies, the so-called liabilities, are explained. The meaning of the term entrepreneurial marketing is explained by a detailed consideration of its narrow and wide conceptual understanding. A comparison then clarifies the difference to classic marketing. As part of the literature review, the current state of research is presented and the practical use is examined in more detail based on the central approaches: guerrilla marketing, ambient marketing, sensation marketing, viral marketing and ambush marketing. How entrepreneurial marketing with these central approaches is used in a start-up company is analyzed by a qualitative investigation in the form of a case study.
The case study shows that unconventional marketing measures and low media spending can have a large effect. Entrepreneurial marketing therefore offers an alternative to classic marketing because it pays attention to the special characteristics of a young start-up company. This bachelor thesis shows that entrepreneurial marketing can convert the weaknesses of young companies into strengths and lead to superiority over the competition.
With the rapidly advancing development of computer systems and algorithms, data can be collected and processed on an ever larger scale. Motivated by this, various initiatives have made it their task to raise awareness of the resulting dangers for personal rights and freedom of expression, aiming for a more conscious handling of personal data. Protecting fundamental rights requires enlightened and informed users, a task the initiatives cannot accomplish alone. Public educational institutions, and schools in particular, have a duty to contribute to solving this problem. To fully meet their educational mandate, structural changes such as revised curricula are needed. As long as these have not taken place, however, work must be done in and with the given structures. School computer science classes offer a platform for this.
This thesis presents a teaching unit on data protection and data security. A context-oriented approach modeled on Informatik im Kontext was chosen. Beyond these primary topics, the unit on smartphone applications covers further dimensions that arise when using smartphones. The direct connection to the students' everyday lives is intended to create the highest possible personal relevance. The students should thereby reconsider their previous usage behavior and, ideally, serve as role models for their peers. Testing the feasibility of the unit in class is still pending; it could not be carried out within this thesis due to the limited time available.
Image synthesis by ray tracing is becoming increasingly relevant due to hardware support in consumer graphics cards. The linespace serves as a new, promising acceleration structure. Because of its direction-based nature, it makes sense to integrate it into other data structures. So far it has been integrated into a uniform grid. Uniformly sized voxels, however, become problematic in scenes with varying levels of detail. This thesis introduces the adaptive linespace, a combination of octree and linespace. The structure is examined with regard to its acceleration capability and compared with the previous grid approach. It is shown that the adaptive linespace scales better for high grid resolutions but does not achieve optimal values due to inefficient GPU usage.
The goal of this work is the induction, conception, implementation and evaluation of an interactive game application for Android. The genre of the app is a 2D jump 'n' run side-scroller whose graphical design is based on the four elements earth, fire, water and wind. The application provides the classic functions of a jump 'n' run game and lets the player traverse the four game worlds to reach the finish. The implementation is based on the Unity engine and Adobe Photoshop. A user test asks basic questions about the application and specific questions about the research question, which are then evaluated. The research question examines the connection between the fun factor and color perception while playing the app, represented by the natural color combinations of the four elements. At the end, possibilities for expansion and future prospects are discussed.
The mitral valve is one of four human heart valves. It is located in the left heart and acts as a unidirectional passageway for blood between the left atrium and the left ventricle. A correctly functioning mitral valve prevents a backflow of blood into the pulmonary circulation (lungs) and thus constitutes a vital part of the cardiac cycle. Pathologies of the mitral valve can manifest in a variety of symptoms with severity ranging from chest pain and fatigue to pulmonary edema (fluid accumulation in the tissue and air space of lungs), which may ultimately cause respiratory failure.
Malfunctioning mitral valves can be restored through complex surgical interventions, which greatly benefit from intensive planning and pre-operative analysis. Visualization techniques provide a possibility to enhance such preparation processes and can also facilitate post-operative evaluation. The work at hand extends current research in this field, building upon patient-specific mitral valve segmentations developed at the German Cancer Research Center, which result in triangulated 3D models of the valve surface. The core of this work will be the construction of a 2D-view of these models through global parameterization, a method that can be used to establish a bijective mapping between a planar parameter domain and a surface embedded in higher dimensions.
A flat representation of the mitral valve provides physicians with a view of the whole surface at once, similar to a map. This allows assessment of the valve's area and shape without the need for different viewing angles. Parts of the valve that are occluded by geometry in 3D become visible in 2D.
An additional contribution of this work will be the exploration of different visualizations of the 3D and 2D mitral valve representations. Features of the valve can be highlighted by associating them with specified colors, which can for instance directly convey pathology indicators.
Quality and effectiveness of the proposed methods were evaluated through a survey conducted at the Heidelberg University Hospital.
This thesis covers the real-time rendering of clouds, from theory to implementation. The visual properties of clouds as well as different cloud types are to be simulated, and the computation of lighting is essential for a believable result. The rendering techniques use various noise textures; for modeling the clouds these are mainly Perlin and Perlin-Worley textures. The clouds are rendered with a compute shader to ensure real-time capability. To increase performance, temporal reprojection and other optimization techniques are applied.
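The thesis relies on Perlin and Perlin-Worley noise textures; as a simpler stand-in, the following sketch generates 2D value noise, which shares the same lattice-plus-smooth-interpolation idea. Function and parameter names are illustrative, not taken from the thesis:

```python
import numpy as np

def value_noise(width, height, grid=8, seed=0):
    """2D value noise: random lattice values, smoothstep-interpolated between nodes."""
    rng = np.random.default_rng(seed)
    lattice = rng.random((grid + 1, grid + 1))
    ys = np.linspace(0, grid, height, endpoint=False)
    xs = np.linspace(0, grid, width, endpoint=False)
    yi, xi = np.floor(ys).astype(int), np.floor(xs).astype(int)
    ty, tx = ys - yi, xs - xi
    # smoothstep fade gives C1-continuous transitions between lattice cells
    ty = ty * ty * (3 - 2 * ty)
    tx = tx * tx * (3 - 2 * tx)
    ty, tx = ty[:, None], tx[None, :]
    v00 = lattice[yi][:, xi]
    v01 = lattice[yi][:, xi + 1]
    v10 = lattice[yi + 1][:, xi]
    v11 = lattice[yi + 1][:, xi + 1]
    top = v00 * (1 - tx) + v01 * tx
    bottom = v10 * (1 - tx) + v11 * tx
    return top * (1 - ty) + bottom * ty
```

Summing several such layers at increasing frequencies (fractal Brownian motion) produces the billowy density fields that cloud shaders then modulate and raymarch.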
Simulation of Smoke
(2019)
This bachelor thesis deals with the simulation of smoke in a particle system. It investigates how smoke can be implemented as realistically as possible in a particle system and computed in real time. The physical simulation is based on the work of Müller and Ren, who deal with the physical properties of fluids and gases. The simulation was implemented on the GPU using C++, OpenGL and the compute shaders available in OpenGL. Special attention was paid to the performance of the simulation. Techniques by Hoetzlein are used to accelerate the particle system. Two acceleration methods were then implemented and compared. Both the runtime and the GPU memory usage are discussed.
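The Müller-style SPH density estimate that such a smoke particle system builds on can be sketched minimally with the poly6 smoothing kernel. This is the naive O(n²) all-pairs version; the thesis accelerates the neighborhood search with Hoetzlein's grid-based techniques:

```python
import numpy as np

def poly6(r2, h):
    """Poly6 kernel from Mueller et al.: W = 315/(64*pi*h^9) * (h^2 - r^2)^3 for r < h."""
    coef = 315.0 / (64.0 * np.pi * h**9)
    # clip makes the kernel zero outside the support radius h
    return coef * np.clip(h * h - r2, 0.0, None) ** 3

def densities(positions, mass, h):
    """Estimate per-particle density by summing the kernel over all particle pairs."""
    diff = positions[:, None, :] - positions[None, :, :]
    r2 = np.sum(diff * diff, axis=-1)      # squared pairwise distances
    return mass * poly6(r2, h).sum(axis=1)
```

Pressure and viscosity forces are then derived from these densities with gradient and Laplacian kernels; a spatial hash or uniform grid reduces the pair search from O(n²) to near-linear cost.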
Global illumination is an important part of rendering realistic images. However, the computational complexity of an accurate simulation of these effects is too high for use in real-time applications. In this thesis, light propagation volumes, screen-space reflections and multiple variants of screen-space ambient occlusion are investigated as a solution for real-time rendering. It is shown that they are fast enough for use in real-time applications. The various techniques each approximate only a few aspects of light transport, but complement each other.
This bachelor thesis implements a system for camera tracking based on a particle filter. For this purpose, marker tracking is realized and the camera position is calculated from the marker position. The marker is located with a particle filter: possible marker positions, called particles, are simulated and weighted with likelihood functions. The focus lies on the evaluation of different likelihood functions for the particle filter. As part of the implementation, the likelihood functions were implemented in CUDA.
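The predict-weight-resample loop at the core of such a particle filter can be illustrated in one dimension. This is a generic sketch with an assumed Gaussian likelihood, not the thesis's CUDA marker tracker:

```python
import numpy as np

def gaussian_likelihood(particles, measurement, sigma=1.0):
    """Weight each particle hypothesis by a Gaussian likelihood of the measurement."""
    d = particles - measurement
    w = np.exp(-0.5 * (d / sigma) ** 2)
    return w / w.sum()

def resample(particles, weights, rng):
    """Multinomial resampling: redraw particles in proportion to their weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

def particle_filter_step(particles, measurement, rng, motion_noise=0.5, sigma=1.0):
    # 1. predict: diffuse particles with motion noise
    particles = particles + rng.normal(0.0, motion_noise, size=particles.shape)
    # 2. weight: evaluate the likelihood function for each hypothesis
    weights = gaussian_likelihood(particles, measurement, sigma)
    # 3. resample: concentrate particles on likely positions
    return resample(particles, weights, rng)
```

The weighting step (2) is exactly the part the thesis varies and evaluates; since each particle's likelihood is independent of the others, it parallelizes naturally onto the GPU.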
The following bachelor thesis gives an overview of various approaches and techniques for the procedural generation of three-dimensional city models. In particular, the use of generative grammars is examined and later applied in the implementation of a custom application. Its focus was the embedding of predetermined primary street networks as well as the procedural generation of secondary street networks and different kinds of buildings. The application allows the efficient creation of extensive and variably structured city models, although there are restrictions regarding the realism and variation of the results.
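A generative grammar of the kind examined above reduces, at its simplest, to parallel string rewriting: every symbol is replaced by its production each step. The symbols and rules below are invented for illustration only:

```python
def derive(axiom, rules, steps):
    """Apply the production rules to every symbol of the string for a number of steps."""
    s = axiom
    for _ in range(steps):
        # symbols without a rule are terminals and copied unchanged
        s = "".join(rules.get(c, c) for c in s)
    return s

# Hypothetical productions: a Lot splits into a Building and a Garden,
# and each Building repeatedly gains a Floor.
rules = {"L": "BG", "B": "BF"}
```

Real city grammars (e.g. shape grammars) attach geometry and split operations to each symbol, so a derivation like this directly drives the subdivision of lots into facades, floors and windows.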
Simulation of Snow
(2019)
Physics simulations make it possible to replicate natural phenomena on the computer. The aim is to calculate a physical feature as correctly as possible in order to draw conclusions for the real world. Fields of application include medicine and industry, but also games and films.
Snow is a very complex natural phenomenon due to its physical structure
and properties. To simulate snow, different material properties have to be
considered.
The most important method that deals with the simulation of snow and its
dynamics is the material point method. It combines Lagrangian particles based on continuum mechanics with a Cartesian grid. The grid enables communication between the snow particles, which are not actually connected to each other.
For the particle computations, data is transferred from the particles to the grid nodes. There, calculations are carried out with information about neighboring particles. The results are then transferred back to the original particles. Using GPGPU techniques, physical simulations can be implemented
on the graphics card. Procedures like the material point method
can be parallelized well with these techniques.
This paper deals with the physical basics of the material point method and
implements them on the graphics card using compute shaders. Then performance
and quality are evaluated.
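The particle-to-grid transfer described above can be sketched in one dimension with linear (tent) shape functions. This is a minimal illustration of the transfer step, not the thesis's compute-shader code, and the quadratic B-spline weights typically used in practice are simplified to linear ones:

```python
import numpy as np

def particle_to_grid(positions, masses, n_nodes, dx=1.0):
    """Scatter particle mass onto a 1D grid with linear shape functions,
    the transfer at the heart of the material point method."""
    grid_mass = np.zeros(n_nodes)
    for x, m in zip(positions, masses):
        i = int(x // dx)               # index of the grid node left of the particle
        t = x / dx - i                 # fractional position inside the cell
        grid_mass[i] += (1 - t) * m    # linear weight to the left node
        grid_mass[i + 1] += t * m      # remaining weight to the right node
    return grid_mass
```

Momentum is scattered the same way; after forces are applied on the grid, the updated nodal velocities are gathered back to the particles with the identical weights, which is what conserves mass and momentum across the transfer.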
The development of a game engine is considered a non-trivial problem [3]. The architecture of such simulation software must be able to manage large amounts of simulation objects in real time while dealing with “crosscutting concerns” [3, p. 36] between subsystems. The use of object-oriented paradigms to model simulation objects in class hierarchies has been reported as incompatible with constantly changing demands during game development [2, p. 9], resulting in anti-patterns and eventual messy refactoring [13].
Alternative architectures using data-oriented paradigms revolving around object composition and aggregation have been proposed as a result [13, 9, 1, 11].
This thesis describes the development of such an architecture with the explicit goals to be simple, inherently compatible with data oriented design, and to make reasoning about performance characteristics possible. Concepts are formally defined to help analyze the problem and evaluate results. A functional implementation of the architecture is presented together with use cases common to simulation software.
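The composition-based architecture sketched above can be illustrated with a minimal entity-component-system: entities are plain ids, components live in per-type stores, and systems iterate over entities that own the required components. All names here are illustrative; the thesis's actual design is far more elaborate:

```python
class World:
    def __init__(self):
        self.next_id = 0
        self.components = {}  # component name -> {entity id -> component data}

    def create_entity(self, **components):
        """Create an entity by composing arbitrary components, no class hierarchy."""
        eid = self.next_id
        self.next_id += 1
        for name, data in components.items():
            self.components.setdefault(name, {})[eid] = data
        return eid

    def query(self, *names):
        """Yield (entity, components...) for entities owning all named components."""
        stores = [self.components.get(n, {}) for n in names]
        for eid in set(stores[0]).intersection(*map(set, stores[1:])):
            yield (eid, *(s[eid] for s in stores))

def movement_system(world, dt):
    # a system touches only the component stores it declares, avoiding
    # crosscutting dependencies between subsystems
    for eid, pos, vel in world.query("position", "velocity"):
        pos[0] += vel[0] * dt
        pos[1] += vel[1] * dt
```

Because component data of one type is stored together rather than scattered across class instances, such layouts are also the basis for the cache-friendly, data-oriented designs the cited proposals advocate.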
This thesis is about the design and implementation of a virtual reality experience. The goal is to answer two questions: Is it possible to create an immersive virtual reality experience that mainly uses impulses and triggers to scare and frighten users? Secondly, is this immersion strong enough to create an illusion in which the user cannot separate the real world from the virtual world? To realise this project, the design program Unity3D as well as Visual Studio 2017 were used. Furthermore, in order to verify that the experience is indeed immersive, an experiment with a sample size of seven people was conducted. Afterwards, the candidates were asked via a questionnaire how they felt during the virtual reality application. The study showed that the application tends to be immersive, but the users were still aware of their situation. It can be concluded that the immersion was not strong enough to fool users regarding the separation of the virtual and the real world.
This bachelor thesis investigates the utilization of the Wii Balance Board
in virtual reality applications. For the investigation a snowboard game is
implemented, in which the virtual avatar can be controlled with the pressure
sensors of the Wii Balance Board. The user should be able to move playfully and intuitively through the virtual environment by shifting their body weight. The immersiveness and the influence on motion sickness and cybersickness are investigated. In addition, the Wii Balance Board will be
compared with the Xbox Controller. The aim of the work is to evaluate
whether the Wii Balance Board is able to allow free movement in virtual
environments and whether it is more advantageous to use it rather than
a conventional controller. The results of the survey indicate that the Wii Balance Board has a positive influence on the immersiveness of the game, even though players achieved better game results with a conventional controller. The survey also reveals that the use of the Wii Balance Board leads to more cases of motion sickness and cybersickness.
Over the past few decades, society's dependence on software systems has grown significantly. These systems are utilized in nearly every area of life today and often handle sensitive, private data. This situation has turned software security analysis into an essential and widely researched topic in the field of computer science. Researchers in this field tend to assume that the quality of a software system's code directly affects the possibility for security gaps to arise in it. Because this assumption is based on properties of the code, proving it true would mean that security assessments can be performed on software even before a certain version of it is released. A previous study based on this implication has already attempted to mathematically assess the existence of such a correlation, studying it based on quality and security metric calculations. The present study builds upon that work by finding an automatic method for choosing well-fitted software projects as a sample for this correlation analysis and extends the variety of projects considered for it. This thesis also introduces the automatic generation of graphical representations both of the correlations between the metrics and of their evolution. With these improvements, this thesis verifies the results of the previous study with a different and broader project input. It also focuses on analyzing the correlations between the quality and security metrics and real-world vulnerability data metrics. The data is extracted and evaluated from dedicated software vulnerability information sources and serves to represent the existence of proven security weaknesses in the studied software. The study discusses some of the difficulties that arise when trying to gather such information and link it to the differing information contained in the repositories of the studied projects. This thesis confirms the significant influence that quality metrics have on each other.
It also shows that it is important to view them together as a whole and suggests that their correlation could influence the appearance of unwanted vulnerabilities as well. One of the important conclusions drawn from this thesis is that visualizing metric evolution graphs helps explain the values as well as their connection to each other in a more meaningful way, allowing a better grasp of their influence on each other than studying their correlation values alone. This study confirms that studying metric correlations and evolution trends can help developers improve their projects and prevent them from becoming difficult to extend and maintain, increasing the potential for good quality as well as more secure software code.
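A metric correlation analysis of this kind can be illustrated with NumPy's `corrcoef`. The per-release metric series below are invented purely for illustration and do not come from the study:

```python
import numpy as np

# Hypothetical per-release metric series (five releases of one project).
loc = np.array([10_000, 12_500, 15_000, 21_000, 26_000])    # lines of code
complexity = np.array([180.0, 220.0, 260.0, 390.0, 480.0])  # summed cyclomatic complexity
vulns = np.array([1, 1, 2, 4, 5])                           # reported vulnerabilities

# Pearson correlation matrix: r[i, j] correlates series i with series j.
r = np.corrcoef(np.vstack([loc, complexity, vulns]))
```

Plotting each series over release history alongside such a matrix is the kind of paired visualization the thesis argues makes metric interdependencies easier to interpret than raw correlation values.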
This thesis deals with the conception and implementation of an action role-playing game using the game engine Unity. In an evaluation, the game was assessed with regard to the usability of the integrated control modes, the visual persuasiveness of the animations, and the user-friendliness of the provided tools and visualizations. In addition, weaknesses and problems in the game were to be identified through open feedback. The results of the evaluation showed that the game can still be improved in terms of usability and user-friendliness, but left a good impression on the test persons.
This thesis examines whether the use of forestry machines on Andosol forest sites affects soil compaction. Factors such as the number of machine passes, the slope and the soil texture are considered and analyzed. The Andosol soils in the study area are characterized by very low bulk densities and high pore volumes, caused by Laacher See tephra (LST).
As a result of machine traffic, partly under unfavorable weather conditions, wheel ruts up to 67 cm deep formed. Furthermore, the bulk density increases by 17 % on average compared to the unaffected forest, and by up to 54 % in extreme cases. Considered individually, a strong correlation between fine soil, especially silt and clay, and bulk density can be established for the soil texture. A correlation between slope and soil compaction could not be established for the skid trail examined. The number of machine passes appears to correlate with the degree of compaction in the study area.