Real-time graphics applications are becoming increasingly realistic, and approximating real-world illumination is increasingly feasible thanks to improvements in graphics hardware. Using a wide variety of algorithms and ideas, graphics processing units (GPUs) can simulate complex lighting situations and render computer-generated imagery with complicated effects such as shadows, refraction and reflection of light. Reflections in particular improve realism, because they make shiny materials, e.g. brushed metals, wet surfaces like puddles or polished floors, appear more realistic and reveal information about their properties such as roughness and reflectance. Moreover, reflections can become more complex depending on the view: a wet surface such as a street during rain, for example, will reflect light depending on the distance of the viewer, resulting in a streakier reflection that looks more stretched if the viewer is located farther away from the light source. This bachelor thesis aims to give an overview of the state of the art in rendering reflections. Understanding light is a prerequisite for understanding reflections, and therefore a physical model of light and its reflection is covered in section 2, followed by the motivational section 2.2, which gives visually appealing examples of reflections from the real world and the media. Turning to rendering techniques, the main principle is explained in section 3, followed by a short overview of the wide variety of approaches that try to generate correct reflections in section 4. This thesis describes the implementation of three major algorithms that produce plausible local reflections. The developed framework is described in section 5, after which three algorithms that are common in most current game and graphics engines are covered: screen-space reflections (SSR), parallax-corrected cube mapping (PCCM) and billboard reflections (BBR). After describing their functional principles, they are analysed with respect to their visual quality and their suitability for real-time application. Finally, they are compared to each other to investigate their respective advantages and disadvantages. In conclusion, the gained experience is summarized by listing the advantages and disadvantages of each technique and giving suggestions for improvements. A short outlook is given on upcoming real-time rendering techniques for the creation of reflections as specular effects.
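As background for the screen-space reflections (SSR) mentioned above, the following is a minimal, hypothetical sketch of the core idea: a reflected ray is marched through the already rendered depth buffer until it passes just behind the recorded scene depth, and the colour at that pixel is reused as the reflection. All names and parameters are illustrative and not taken from the thesis; real implementations run this per pixel in a shader.

    import numpy as np

    def ssr_trace(depth, color, origin, direction,
                  step=0.05, max_steps=128, thickness=0.1):
        """Minimal screen-space reflection ray march (illustrative only).

        depth : (H, W) view-space depth per pixel; color : (H, W, 3) rendered image.
        origin/direction : view-space start point and normalized reflection direction,
        with x and y assumed to lie in [-1, 1] as a stand-in for a real projection.
        """
        h, w = depth.shape
        pos = np.asarray(origin, dtype=float).copy()
        direction = np.asarray(direction, dtype=float)
        for _ in range(max_steps):
            pos += step * direction                        # advance along the reflected ray
            # map the assumed [-1, 1] view-plane coordinates to pixel indices
            px = int((pos[0] * 0.5 + 0.5) * (w - 1))
            py = int((pos[1] * 0.5 + 0.5) * (h - 1))
            if not (0 <= px < w and 0 <= py < h):
                return None           # ray left the screen; fall back to e.g. a cube map
            scene_depth = depth[py, px]
            # report a hit once the ray has passed just behind the stored surface depth
            if scene_depth <= pos[2] <= scene_depth + thickness:
                return color[py, px]  # reuse the already shaded on-screen colour
        return None

The typical SSR limitation follows directly from this sketch: anything not visible in the depth and colour buffers cannot be reflected.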
Emotion regulation – an empirical investigation in female adolescents with nonsuicidal self-injury
(2015)
Nonsuicidal self-injury (NSSI) was included as a condition for further study in the DSM-5. Therefore, it is necessary to investigate the suggested diagnostic criteria and the clinical and psychological correlates. In order to provide an optimal treatment best tailored to the patient's needs, a clear differentiation between Borderline Personality Disorder (BPD) and NSSI is needed. The investigation of personality traits specific to patients with NSSI might be helpful for this differentiation. Furthermore, social difficulties can often be a trigger for NSSI. However, little is known about how adolescents with NSSI perceive social situations. Therefore, we examined how adolescents with NSSI process emotional expressions. A new emotion recognition paradigm (ERP) using colored and morphed facial expressions of happiness, anger, sadness, disgust and fear was developed and evaluated in a student sample selected for being highly (HSA) or low socially anxious (LSA). HSA participants showed a tendency towards impaired emotion recognition, and the paradigm demonstrated good construct validity.
For the main study, we investigated characteristics of NSSI, clinical and psychological correlates, personality traits and emotion recognition. We examined 57 adolescents with NSSI diagnosis, 12 adolescents with NSSI without impairment/distress and 14 adolescents with BPD, 32 clinical controls without NSSI, and 64 nonclinical controls. Participants were interviewed regarding mental disorders, filled out self-report questionnaires and participated in the ERP.
Results indicate that adolescents with NSSI experienced a higher level of impairment than clinical controls. There were similarities between adolescents with NSSI and adolescents with BPD, but also important differences. Adolescents with NSSI were characterized by specific personality traits such as high harm avoidance and novelty seeking compared to clinical controls. In adolescents with BPD, these personality traits were even more pronounced. No group differences in the recognition of facial expressions were found. Nonetheless, compared to the control group, adolescents with NSSI rated the stimuli as significantly more unpleasant and arousing.
In conclusion, NSSI is a highly impairing disorder characterized by high comorbidity with various disorders and by specific personality traits, providing further evidence that NSSI should be handled as a distinct diagnostic entity. Consequently, the proposed DSM-5 diagnostic criteria for NSSI are useful and necessary.
Virtueller Konsum - Warenkörbe, Wägungsschemata und Verbraucherpreisindizes in virtuellen Welten
(2015)
Virtual worlds have been investigated by several academic disciplines for several years, e.g. sociology, psychology, law and education. Since the developers of virtual worlds have implemented aspects like scarcity and needs, even economic research has become interested in these virtual environments. Exploring virtual economies mainly deals with research into the trade in virtual goods used to supply the needs that have emerged. On the one hand, economics analyzes the meaning of virtual trade in the overall interpretation of the economic characteristics of virtual worlds. On the other hand, since some virtual worlds allow exchanging virtual world money for real money and vice versa, and virtual goods are thus traded by users for real money, researchers study the impact of the interdependencies between virtual economies and the real world. The presented thesis mainly focuses on trade within virtual worlds in the context of virtual consumption and the observation of consumer prices. For this purpose, the four virtual worlds World of Warcraft, RuneScape, Entropia Universe and Second Life have been selected. Several components are required to calculate consumer price indices. First, a market basket, which contains the relevant consumed goods existing in virtual worlds, must be developed. Second, a weighting scheme has to be established, which shows the dispersion of consumption tendencies. Third, prices of relevant consumer goods have to be recorded. The challenge is to apply the corresponding real-world methods within virtual worlds. Accordingly, this dissertation contains three corresponding parts of the investigation. In a first analysis, it is evaluated to what extent virtual worlds can be explored to identify consumable goods. As a next step, the consumption expenditures of the avatars are examined based on an online survey. Finally, prices of consumable goods are recorded, making it possible to calculate consumer price indices. While investigating those components, the thesis focuses not only on the general findings themselves, but also on methodological issues that arise, such as limited access to relevant data, missing legal legitimation or security concerns of the users. Besides these aspects, the methods used also allow the examination of several other economic aspects such as the consumption habits of the avatars. At the end of the thesis, it is considered to what extent the economic characteristics of virtual worlds can be compared with the real world.
Aspects like the important role of weapons or the different usage of food show significant differences from the real world, caused by the business models of virtual worlds.
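For context, consumer price indices constructed from a market basket, a weighting scheme and recorded prices are typically computed with a Laspeyres-type formula, which weights the price relatives of the basket goods with base-period expenditure shares. This is a standard textbook illustration, not necessarily the exact formula applied in the thesis:

    P_L^{0,t} = \frac{\sum_i p_i^t q_i^0}{\sum_i p_i^0 q_i^0}
              = \sum_i w_i^0 \frac{p_i^t}{p_i^0},
    \qquad
    w_i^0 = \frac{p_i^0 q_i^0}{\sum_j p_j^0 q_j^0}

Here p_i^t denotes the price of basket good i in period t, q_i^0 its base-period quantity, and w_i^0 the expenditure weight supplied by the weighting scheme.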
In this thesis, we deal with the question of whether challenge, flow and fun in computer games are related to each other, and what influence the motivational, psychological components (motivation of success, motivation of failure and the chance of success) have. In addition, we want to know whether a free choice of the level of difficulty is the optimal way to flow. To examine these theories, a study based on an online survey was conducted, in which the participants played the game "flOw". The results were evaluated with the help of a two-factorial analysis of variance with repeated measures and correlation tests. We found that there actually is a relation between challenge, flow and fun and that motivation matters indirectly.
The increasing, anthropogenic demand for chemicals has created large environmental problems with repercussions for the health of the environment, especially aquatic ecosystems. As a result, the awareness of the public and decision makers on the risks from chemical pollution has increased over the past half-century, prompting a large number of studies in the field of ecological toxicology (ecotoxicology). However, the majority of ecotoxicological studies are laboratory based, and the few studies extrapolating toxicological effects in the field are limited to local and regional levels. Chemical risk assessment on large spatial scales remains largely unexplored, and therefore, the potential large-scale effects of chemicals may be overlooked.
To answer ecotoxicological questions, multidisciplinary approaches that transcend classical chemical and toxicological concepts are required. For instance, the current models for toxicity predictions - which are mainly based on the prediction of toxicity for a single compound and species - can be expanded to simultaneously predict the toxicity for different species and compounds. This can be done by integrating chemical concepts such as the physicochemical properties of the compounds with evolutionary concepts such as the similarity of species. This thesis introduces new, multidisciplinary tools for chemical risk assessments, and presents for the first time a chemical risk assessment on the continental scale.
After a brief introduction of the main concepts and objectives of the studies, this thesis starts by presenting a new method for assessing the physiological sensitivity of macroinvertebrate species to heavy metals (Chapter 2). To compare the sensitivity of species to different heavy metals, toxicity data were standardized to account for the different laboratory conditions, and species were ranked by their resulting sensitivity. These rankings were not significantly different for different heavy metals, allowing the aggregation of physiological sensitivity into a single ranking.
Furthermore, the toxicological data for macroinvertebrates were used as input data to develop and validate prediction models for heavy metal toxicity, which are currently lacking for a wide array of species (Chapter 3). Apart from the toxicity data, the phylogenetic information of species (evolutionary relationships among species) and the physicochemical parameters for heavy metals were used. The constructed models had a good explanatory power for the acute sensitivity of species to heavy metals with the majority of the explained variance attributed to phylogeny. Therefore, the integration of evolutionary concepts (relatedness and similarity of species) with the chemical parameters used in ecotoxicology improved prediction models for species lacking experimental toxicity data. The ultimate goal of the prediction models developed in this thesis is to provide accurate predictions of toxicity for a wide range of species and chemicals, which is a crucial prerequisite for conducting chemical risk assessment.
The latter was conducted for the first time on the continental scale (Chapter 4), by making use of a dataset of 4,000 sites distributed throughout 27 European countries and 91 respective river basins. Organic chemicals were likely to exert acute risks for one in seven sites analyzed, while chronic risk was prominent for almost half of the sites. The calculated risks are potentially underestimated by the limited number of chemicals that are routinely analyzed in monitoring programmes, and a series of other uncertainties related with the limit of quantification, the presence of mixtures, or the potential for sublethal effects not covered by direct toxicity.
Furthermore, chemical risk was related to agricultural and urban areas in the upstream catchments. The analysis of ecological data indicated chemical impacts on the ecological status of the river systems; however, it is difficult to discriminate the effects of chemical pollution from other stressors that river systems are exposed to. To test the hypothesis of multiple stressors, and investigate the relative importance of organic toxicants, a dataset for German streams is used in chapter 5. In that study, the risk from abiotic (habitat degradation, organic chemicals, and nutrients enrichment) and biotic stressors (invasive species) was investigated. The results indicated that more than one stressor influenced almost all sites. Stream size and ecoregions influenced the distribution of risks, e.g., the risks for habitat degradation, organic chemicals and invasive species increased with the stream size; whereas organic chemicals and nutrients were more likely to influence lowland streams. In order to successfully mitigate the effects of pollutants in river systems, co-occurrence of stressors has to be considered. Overall, to successfully apply integrated water management strategies, a framework involving multiple environmental stressors on large spatial scales is necessary. Furthermore, to properly address the current research needs in ecotoxicology, a multidisciplinary approach is necessary which integrates fields such as, toxicology, ecology, chemistry and evolutionary biology.
This thesis addresses the problem of terrain classification in unstructured outdoor environments. Terrain classification includes the detection of obstacles and passable areas as well as the analysis of ground surfaces. A 3D laser range finder is used as the primary sensor for perceiving the surroundings of the robot. First of all, a grid structure is introduced for data reduction. The chosen data representation allows for multi-sensor integration, e.g., cameras for color and texture information or further laser range finders for improved data density. Subsequently, features are computed for each terrain cell within the grid. Classification is performed with a Markov random field for context-sensitivity and to compensate for sensor noise and varying data density within the grid. A Gibbs sampler is used for optimization and is parallelized on the CPU and GPU in order to achieve real-time performance. Dynamic obstacles are detected and tracked using different state-of-the-art approaches. The resulting information about where other traffic participants move and are going to move is used to perform inference in regions where the terrain surface is partially or completely invisible to the sensors. Algorithms are tested and validated on different autonomous robot platforms, and the evaluation is carried out with human-annotated ground truth maps of millions of measurements. The terrain classification approach of this thesis proved reliable in all real-time scenarios and domains and yielded new insights. Furthermore, if combined with a path planning algorithm, it enables full autonomy for all kinds of wheeled outdoor robots in natural outdoor environments.
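To illustrate the kind of inference described above, the following sketch runs a Gibbs sampler over a grid of terrain cells: each cell's label is repeatedly resampled from a distribution that combines a per-cell data term with a Potts-style smoothness term over its four neighbours. This is a generic, serial sketch with assumed inputs, not the thesis' actual CPU/GPU implementation.

    import numpy as np

    def gibbs_terrain_labels(unary, n_sweeps=50, beta=1.5, rng=None):
        """Gibbs sampling for a Potts-style MRF over a terrain grid (illustrative).

        unary : (H, W, K) per-cell log-likelihoods for K terrain classes
                (e.g. passable, obstacle, unknown), computed from cell features.
        beta  : smoothness weight encouraging neighbouring cells to share a label.
        """
        rng = rng or np.random.default_rng()
        h, w, k = unary.shape
        labels = unary.argmax(axis=2)                  # initialise from the data term only
        for _ in range(n_sweeps):
            for y in range(h):
                for x in range(w):
                    logp = unary[y, x].copy()
                    # pairwise term: reward agreement with the 4-neighbourhood
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w:
                            logp[labels[ny, nx]] += beta
                    p = np.exp(logp - logp.max())
                    p /= p.sum()
                    labels[y, x] = rng.choice(k, p=p)  # resample this cell's label
        return labels

In the thesis this optimization is parallelized on the CPU and GPU to reach real-time rates; the serial double loop above is only meant to show the local update rule.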
Flowering habitats to enhance biodiversity and pest control services in agricultural landscapes
(2015)
Meeting growing demands for agricultural products requires management solutions that enhance food production whilst minimizing negative environmental impacts. Conventional agricultural intensification jeopardizes farmland biodiversity and associated ecosystem services through excessive anthropogenic inputs and landscape simplification. Agri-environment schemes (AES) are commonly implemented to mitigate the adverse effects of conventional intensification on biodiversity. However, such schemes have so far had only moderate success and would strongly benefit from more explicit goals regarding ecosystem service provisioning. Providing key resources to beneficial organisms may improve their abundance, fitness, diversity and the ecosystem services they provide. With targeted habitat management, AES may synergistically enhance biodiversity and agricultural production and thus contribute to ecological intensification. We demonstrate that sown perennial wildflower strips, as implemented in current AES focusing on biodiversity conservation, also benefit biological pest control in nearby crops (Chapter 2).
Comparing winter wheat fields adjacent to wildflower strips with fields without wildflower strips, we found strongly reduced cereal leaf beetle (Oulema sp.) density and plant damage near wildflower strips. In addition, winter wheat yield was 10 % higher when fields adjoined wildflower strips. This confirms previous assumptions that wildflower strips, known for positive effects on farmland biodiversity, can also enhance ecosystem services such as pest control, and the positive correlation of yield with flower abundance and diversity suggests that floral resources are key. Refining sown flower strips for enhanced service provision requires a mechanistic understanding of how organisms benefit from floral resources. In climate chamber experiments investigating the impact of single and multiple flowering plant species on fitness components of three key arthropod natural enemies of aphids, we demonstrate that different natural enemies benefit differently from the offered resources (Chapter 3).
Some flower species were overall more valuable to natural enemies than others. Additionally, the mixture with all flowers generally performed better than the monocultures, yet without transgressive overyielding. By explicitly tailoring flower strips to the requirements of key natural enemies of crop pests, we aimed to maximise natural enemy mediated pest control in winter wheat (Chapter 4) and potato (Chapter 5) crops.
Respecting the manifold requirements of diverse natural enemies, but not pests, in terms of temporal and spatial provisioning of floral, extrafloral and structural resources, we designed targeted annual flower strips that can be included in crop rotation to support key arthropods at the place and time they are needed. Indeed, field experiments revealed that cereal leaf beetle density and plant damage in winter wheat can be reduced by 40 % to 61 %, and aphid densities in potatoes even by 77 %, if a targeted flower strip is sown into the field. These effects were not restricted to the vicinity of flower strips and, in contrast to fields without a flower strip, often prevented action thresholds from being reached. This suggests that targeted flower strips could replace insecticides. All adult natural enemies were enhanced inside targeted flower strips when compared to control strips. Yet, spillover into the field was restricted to key natural enemies such as ground beetles (winter wheat), hoverflies (potato) and lacewings (winter wheat and potato), suggesting their dominant role in biological control. In potatoes, targeted flower strips also enhanced hoverfly species richness in strips and crop, highlighting their additional benefits for diversity.
The present results provide more insights into the mechanisms underlying conservation biological control and highlight the potential of tailored habitat management for ecological intensification.
In this thesis, an interactive application is developed for Android OS. The application is a virtual-reality game in the genre of first-person shooters, set in a space scenario. By using a stereo renderer, it is possible to play the game with virtual-reality glasses.
The publication of freely available and machine-readable information has increased significantly in recent years. Especially the Linked Data initiative has been receiving a lot of attention. Linked Data is based on the Resource Description Framework (RDF), and anybody can simply publish their data in RDF and link it to other datasets. The structure is similar to the World Wide Web, where individual HTML documents are connected with links. Linked Data entities are identified by URIs which are dereferenceable to retrieve information describing the entity. Additionally, so-called SPARQL endpoints can be used to access the data with an algebraic query language (SPARQL) similar to SQL. By integrating multiple SPARQL endpoints it is possible to create a federation of distributed RDF data sources which acts like one big data store.
In contrast to the federation of classical relational database systems, there are some differences for federated RDF data. RDF stores are accessed either via SPARQL endpoints or by resolving URIs. There is no coordination between RDF data sources, and machine-readable metadata about a source's data is commonly limited or not available at all. Moreover, there is no common directory which can be used to discover RDF data sources or ask for sources which offer specific data. The federation of distributed and linked RDF data sources has to deal with various challenges. In order to distribute queries automatically, suitable data sources have to be selected based on query details and the information that is available about the data sources. Furthermore, the minimization of query execution time requires optimization techniques that take into account the execution cost of query operators and the network communication overhead for contacting individual data sources. In this thesis, solutions for these problems are discussed. Moreover, SPLENDID is presented, a new federation infrastructure for distributed RDF data sources which uses optimization techniques based on statistical information.
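As a concrete illustration of how such a SPARQL endpoint is accessed programmatically, the following snippet queries the public DBpedia endpoint with the SPARQLWrapper library; it is a generic example and not part of SPLENDID itself, which would additionally select suitable sources and join partial results from several endpoints.

    from SPARQLWrapper import SPARQLWrapper, JSON

    # Query a single endpoint of a potential federation; a federator such as
    # SPLENDID would dispatch sub-queries like this to several endpoints.
    endpoint = SPARQLWrapper("https://dbpedia.org/sparql")
    endpoint.setQuery("""
        PREFIX dbo: <http://dbpedia.org/ontology/>
        PREFIX dbr: <http://dbpedia.org/resource/>
        SELECT ?city ?population WHERE {
            ?city a dbo:City ;
                  dbo:country dbr:Germany ;
                  dbo:populationTotal ?population .
        } LIMIT 10
    """)
    endpoint.setReturnFormat(JSON)

    results = endpoint.query().convert()
    for row in results["results"]["bindings"]:
        print(row["city"]["value"], row["population"]["value"])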
This thesis investigates how model errors affect positioning accuracy and handling when manoeuvring with a driver assistance system. Particular emphasis is placed on determining error bounds. The question examined is how large the input error may be so that the assistance system still exhibits sufficient quality with regard to precision and robustness. To this end, the errors are first analysed quantitatively on the basis of the kinematic model. Then a qualitative analysis is carried out by means of systematic experiments. First, a controller is developed with which a manoeuvre can be simulated using the visual information provided by the assistance system.
Then a method is presented for evaluating the manoeuvre against defined error bounds. In order to search a large space of possible error combinations efficiently, the probabilistic Annealed Particle Filter is used. Finally, systematic experiments are carried out in a test environment. For further evaluation of the assistance system in a controlled environment, it was ported to the RODOS simulation environment in cooperation with the Fraunhofer ITWM in Kaiserslautern.
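To illustrate the annealed particle filter idea mentioned above, here is a generic sketch with assumed names and scoring, not the thesis implementation: candidate error combinations are treated as particles, weighted by a user-supplied score (e.g. how strongly a simulated manoeuvre violates the defined error bounds), and resampled over several annealing layers with an increasingly peaked weighting so that the search concentrates on the critical regions of the error space.

    import numpy as np

    def annealed_particle_search(score, lower, upper,
                                 n_particles=200, n_layers=5, noise=0.05, rng=None):
        """Annealed particle filter style search over an error-combination space.

        score(x)     : scalar rating how critical the error vector x is, e.g. the
                       deviation of the simulated manoeuvre from its error bounds.
        lower, upper : arrays bounding the admissible input errors.
        """
        rng = rng or np.random.default_rng()
        lower, upper = np.asarray(lower, float), np.asarray(upper, float)
        particles = rng.uniform(lower, upper, size=(n_particles, lower.size))
        for layer in range(n_layers):
            beta = (layer + 1) / n_layers                  # annealing: weights get sharper
            scores = np.array([score(p) for p in particles])
            weights = np.exp(beta * (scores - scores.max()))
            weights /= weights.sum()
            idx = rng.choice(n_particles, size=n_particles, p=weights)  # resample
            spread = noise * (1.0 - beta) * (upper - lower)             # shrinking diffusion
            particles = particles[idx] + rng.normal(0.0, 1.0, particles.shape) * spread
            particles = np.clip(particles, lower, upper)   # stay inside the error box
        return particles[np.argmax([score(p) for p in particles])]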
Fachbereich 4 (Computer Science) consists of twenty-five working groups led by professors, who collaborate in research and teaching within six institutes.
In each annual report, the working groups present themselves according to a uniform scheme: their staff composition, the projects that fall within the reporting period, and the scientific achievements that were accomplished. The following chapters list individual indicators that describe the department in quantitative terms with regard to third-party funding, teaching coverage, graduates and publications.
Uniprisma Ausg. 2005
(2015)
The subject of this thesis was to analyse the involvement of classical creativity techniques and IT tools in the different phases of the innovation process. In addition, the present work deals with the integration of Design Thinking and TRIZ into the innovation process. The aim was to define a specific innovation process based on various existing innovation process models from the literature. This specific innovation process serves as a basis for the analysis of the integration of creativity techniques, IT tools, Design Thinking and TRIZ. In summary, the application of creativity techniques and IT tools is admissible and useful in every phase of the innovation process. This work shows that the Design Thinking method can be integrated into the early stages of the innovation process. Also, the process model of TRIZ, which differs from traditional innovation processes, can be combined with classical innovation processes.
Campuszeitung Ausg. 1/2015
(2015)
Simulation von Schnee
(2015)
Physics simulations allow the creation of dynamic scenes on the computer. Computer-generated images become lively and find use in movies, games and engineering applications. GPGPU techniques make use of the graphics card to simulate physics. The simulation of dynamic snow is still little researched. The Material Point Method is the first technique which is capable of showing the dynamics and characteristics of snow.
The hybrid use of Lagrangian particles and a regular Cartesian grid enables the solving of partial differential equations. To this end, particles are transferred to the grid. The grid velocities can then be updated by computing gradients in an FEM-like manner (finite element method). Finally, grid node velocities are weighted back to the particles to move them across the scene. This method is coupled with a constitutive model to cover the dynamic nature of snow, including collisions and breaking.
This bachelor thesis connects the recent developments in GPGPU techniques of OpenGL with the Material Point Method to efficiently simulate visually compelling, dynamic snow scenes.
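To make the particle-grid coupling described above concrete, here is a minimal, one-dimensional particle-to-grid transfer with linear weights. It is only an illustration of the transfer step with assumed inputs, not the full 3D Material Point Method with its constitutive snow model used in the thesis.

    import numpy as np

    def particle_to_grid(x_p, m_p, v_p, n_nodes, dx):
        """Transfer particle mass and momentum to grid nodes with linear weights (1D).

        x_p, m_p, v_p : particle positions, masses and velocities
        n_nodes, dx   : number of grid nodes and grid spacing
        """
        grid_mass = np.zeros(n_nodes)
        grid_mom = np.zeros(n_nodes)
        for x, m, v in zip(x_p, m_p, v_p):
            i = int(x / dx)                    # left grid node of the particle's cell
            frac = x / dx - i                  # position within the cell, in [0, 1)
            for node, w in ((i, 1.0 - frac), (i + 1, frac)):   # linear hat weights
                if 0 <= node < n_nodes:
                    grid_mass[node] += w * m
                    grid_mom[node] += w * m * v
        # nodal velocities = momentum / mass; these are updated on the grid and
        # later weighted back to the particles, as described in the abstract
        grid_vel = np.where(grid_mass > 0, grid_mom / np.maximum(grid_mass, 1e-12), 0.0)
        return grid_mass, grid_vel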
Heat exchangers are used for thickening of various products or desalination of saltwater. They are also used as cooling units in industry. Thereby, the stainless steel heat-transferring elements come into contact with media containing microorganisms, such as river water or saltwater, and corrode. After at least two years of utilization the material is covered with bacterial slime called biofilm. This process is called biofouling; it causes loss in efficiency and creates huge costs depending on cleaning technique and efficiency. Cleaning a heat exchanger is very expensive and time consuming, and it can only be done while the device is out of operation.
Changing the surface properties of materials is the best and easiest way to lengthen the initial phase of biofilm formation. This leads to less biofouling (Mogha et al. 2014).
Thin polymer films as novel materials are cheaper to produce than stainless steel and are easy to handle. Furthermore, they can be functionalized easily and can be bought in different sizes all over the world. Because of this, they can reduce the costs of cleaning techniques and lead to a longer high-efficiency state of the heat exchanger. If the efficiency of the heat exchanger decreases, the thin polymer films can be replaced.
For a successful investigation of the microbial and the process engineering challenges, a cooperation between the Technical University of Kaiserslautern (chair of separation science and technology) and the University of Koblenz-Landau (working group microbiology) was established.
The aim of this work was the design, engineering and construction of a reactor for the investigation of biofouling taking place on thin polymeric films and stainless steel. Furthermore, an experimental design had to be established, and several requirements had to be met for these tasks. A real heat exchanger was therefore downscaled, so that the process parameters are at least comparable. There are many commercial flow cell kits available; reducing the costs by self-assembly increased the number of samples, providing a basis for statistical analysis. In addition, fast and minimally invasive online in-situ microscopy and Raman spectroscopy can be performed. By creating laminar flow and using a weir, we implemented homogeneous inflow to the reactors. Reproducible data on biomass and cell number were created.
The assessment of biomass and cell number is well established for drinking water analysis. Epifluorescence microscopy and gravimetric determination are the basic techniques for this work, too. Differences in cell number and biomass between surface modifications and materials are quantified and statistically analysed.
The wild-type strain Escherichia coli K12 and an inoculum of 500 ml fresh water were used to describe the biofouling of the films. Thereby, we generated data with a natural bacterial community in media of unknown properties as well as data with well-known media properties, so the technical relevance of the data is ensured.
Free surface energy and surface roughness are the first attachment hurdles for bacteria. These parameters were measured according to DIN 55660 and DIN EN ISO 4287. The materials science data were correlated with the number of cells and the biomass. This correlation provides a basic link between biofouling, as a biologically induced parameter, and the material properties, so that material properties which reduce biofouling can be identified.
By using Raman spectroscopy as a cutting-edge method, future investigations could be shortened. If biomass or cell number can be linked with the spectra, new functional materials can be investigated in a short time.
Central tasks of higher education institutions are the assessment, the clarification of causes, and the promotion of academic achievement (Heublein & Wolter, 2011, p. 215). In this context, achievement motivation, alongside intellectual abilities, is considered a significant predictor of academic success (e.g. Schmidt-Atzert, 2005, p. 132; Steinmayr & Spinath, 2009, p. 80). The present study therefore focuses on the motivational processes of 332 first-year students at the Hochschule der Bundesagentur für Arbeit and on the factors that have a beneficial effect on their learning outcomes. With a response rate of 89 %, the data obtained are representative of the population. Using an ex-post-facto design in the form of a quantitative predictor-criterion approach (a special variant of a longitudinal design) with different data collection methods, such as a standardized self-assessment questionnaire, achievement tests and official documents/records, the following research hypotheses were examined: the strength of achievement motivation depends both on expectancy components (academic self-concept, self-esteem, subjective grade expectation, hope of success and fear of failure) and on incentive components (object-, activity- and consequence-related incentives), which in turn, mediated by achievement-motivated behaviour, influence academic performance. It was further postulated that motivational variables still exert a significant effect on academic performance even when other performance predictors, such as school-leaving grade, intelligence, emotional stability and conscientiousness, are controlled for.
Web application testing is an active research area. Garousi et al. conducted a systematic mapping study and classified 79 papers published between 2000 and 2011. However, there seems to be a lack of information exchange between the scientific community and tool developers.
This thesis systematically analyzes the field of functional, system-level web application testing tools. 194 candidate tools were collected in the tool search and screened, with 23 tools being selected as the foundation of this thesis. These 23 tools were systematically used to generate a feature model of the domain. The methodology supporting this is an additional contribution of this thesis: it processes the end-user documentation of tools belonging to an examined domain and creates a feature model. The feature model gives an overview of the existing features, their alternatives and their distribution. It can be used to identify trends, problems and extraordinary features, to support tool purchase decisions, or to guide scientists in focusing their research.
The topic of this thesis is the development of hardware-accelerated single-image compression for video transmission. Techniques for single-image compression have existed for a long time. However, the common methods do not meet the real-time and performance requirements needed to be used during a video transmission without noticeable latency. In this thesis, one of the most common image compression algorithms is examined for its parallelizability with the aid of the graphics card, in order to achieve real-time capability during the compression and decompression of computer-generated images. The results are evaluated and placed in the context of current parallelized compression techniques.
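As an illustration of the kind of data-parallel step such a compression scheme can exploit, consider a JPEG-style 8x8 block DCT (an assumption for illustration, since the abstract does not name the algorithm): every block is independent, which is what makes the step well suited to the graphics card. A CPU reference in Python:

    import numpy as np
    from scipy.fftpack import dct

    def blockwise_dct(image, block=8):
        """JPEG-style 8x8 block DCT of a grayscale image (CPU reference).

        Each block is transformed independently, so a GPU version can assign
        one thread group per block for compression and decompression.
        """
        h, w = image.shape
        out = np.zeros((h, w), dtype=float)
        for y in range(0, h - h % block, block):
            for x in range(0, w - w % block, block):
                tile = image[y:y+block, x:x+block].astype(float) - 128.0   # level shift
                # separable 2D DCT-II: transform columns, then rows
                out[y:y+block, x:x+block] = dct(dct(tile, axis=0, norm='ortho'),
                                                axis=1, norm='ortho')
        return out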