Real-time graphics applications tend to become more realistic, and approximating real-world illumination becomes more feasible as graphics hardware improves. Using a wide variety of algorithms and ideas, graphics processing units (GPUs) can simulate complex lighting situations, rendering computer-generated imagery with complicated effects such as shadows, refraction and reflection of light. Reflections in particular improve realism, because they make shiny materials, e.g. brushed metals, wet surfaces like puddles or polished floors, appear more realistic and reveal information about their properties such as roughness and reflectance. Moreover, reflections can become more complex depending on the view: a wet surface like a street during rain, for example, will reflect lights depending on the distance of the viewer, resulting in a streakier reflection that looks more stretched the farther the viewer is located from the light source. This bachelor thesis aims to give an overview of the state of the art in rendering reflections. Understanding light is a basic prerequisite for understanding reflections, so a physical model of light and its reflection is covered in section 2, followed by the motivational section 2.2, which gives visually appealing examples of reflections from the real world and the media. Coming to rendering techniques, the main principle is explained in section 3, followed by a short overview of the wide variety of approaches that try to generate correct reflections in section 4. The thesis then describes the implementation of three major algorithms that produce plausible local reflections and are common in most current game and graphics engines: screen space reflections (SSR), parallax-corrected cube mapping (PCCM) and billboard reflections (BBR); the developed framework is described in section 5. After describing their functional principles, the three algorithms are analysed with respect to their visual quality and their suitability for real-time application, and finally compared with each other to investigate their respective advantages and disadvantages. In conclusion, the gained experience is summarized by listing the advantages and disadvantages of each technique and giving suggestions for improvements, followed by a short perspective on upcoming real-time rendering techniques for creating reflections as specular effects.
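As an illustration of the core idea behind SSR (not code from the thesis): a reflected ray is marched through the depth buffer until it steps just behind the recorded surface. The buffer contents, step size and thickness threshold below are illustrative assumptions; a minimal sketch in Python with NumPy:

```python
import numpy as np

def ssr_trace(depth, origin, direction, max_steps=128, step=1.0, thickness=0.05):
    """March a reflected ray through a screen-space depth buffer.

    depth:     2D array, view-space depth per pixel (stand-in for a real buffer).
    origin:    (x, y, z) ray start; x, y in pixel coordinates, z is view depth.
    direction: (dx, dy, dz) screen-space ray direction.
    Returns the hit pixel (x, y), or None if the ray leaves the screen.
    """
    h, w = depth.shape
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float) * step
    for _ in range(max_steps):
        pos += d
        x, y = int(pos[0]), int(pos[1])
        if not (0 <= x < w and 0 <= y < h):
            return None                      # ray left the screen: no reflection
        scene_z = depth[y, x]
        # Hit: the ray just passed behind the stored surface,
        # but not deeper than the assumed surface thickness.
        if scene_z <= pos[2] <= scene_z + thickness:
            return x, y
    return None

# Toy usage: a flat "floor" at depth 10 and a ray tilted into it.
buf = np.full((64, 64), 10.0)
print(ssr_trace(buf, origin=(5, 5, 1.0), direction=(0.6, 0.6, 0.2)))
```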
Emotion regulation – an empirical investigation in female adolescents with nonsuicidal self-injury
(2015)
Nonsuicidal self-injury (NSSI) was included as a condition for further study in the DSM-5. Therefore, it is necessary to investigate the suggested diagnostic criteria and the clinical and psychological correlates. In order to provide an optimal treatment best tailored to the patient's needs, a clear differentiation between Borderline Personality Disorder (BPD) and NSSI is needed. The investigation of personality traits specific to patients with NSSI might be helpful for this differentiation. Furthermore, social difficulties can often be a trigger for NSSI. However, little is known about how adolescents with NSSI perceive social situations. Therefore, we examined how adolescents with NSSI process emotional expressions. A new emotion recognition paradigm (ERP) using colored and morphed facial expressions of happiness, anger, sadness, disgust and fear was developed and evaluated in a student sample selected for high (HSA) or low social anxiety (LSA). HSA participants showed a tendency towards impaired emotion recognition, and the paradigm demonstrated good construct validity.
For the main study, we investigated characteristics of NSSI, clinical and psychological correlates, personality traits and emotion recognition. We examined 57 adolescents with an NSSI diagnosis, 12 adolescents with NSSI without impairment/distress, 14 adolescents with BPD, 32 clinical controls without NSSI, and 64 nonclinical controls. Participants were interviewed regarding mental disorders, filled out self-report questionnaires and participated in the ERP.
Results indicate that adolescents with NSSI experienced a higher level of impairment than clinical controls. There were similarities between adolescents with NSSI and adolescents with BPD, but also important differences. Adolescents with NSSI were characterized by specific personality traits such as high harm avoidance and novelty seeking compared to clinical controls. In adolescents with BPD, these personality traits were even more pronounced. No group differences in the recognition of facial expressions were found. Nonetheless, compared to the control group, adolescents with NSSI rated the stimuli as significantly more unpleasant and arousing.
In conclusion, NSSI is a highly impairing disorder characterized by high comorbidity with various disorders and by specific personality traits, providing further evidence that NSSI should be handled as a distinct diagnostic entity. Consequently, the proposed DSM-5 diagnostic criteria for NSSI are useful and necessary.
Virtueller Konsum - Warenkörbe, Wägungsschemata und Verbraucherpreisindizes in virtuellen Welten
(2015)
Virtual worlds have been investigated by several academic disciplines for several years, e.g. sociology, psychology, law and education. Since the developers of virtual worlds have implemented aspects like scarcity and needs, even economic research has become interested in these virtual environments. Exploring virtual economies mainly deals with the research of trade in the virtual goods used to supply the emerged needs. On the one hand, economics analyzes the meaning of virtual trade for the overall interpretation of the economic characteristics of virtual worlds. On the other hand, as some virtual worlds allow the exchange of virtual world money for real money and vice versa, so that users trade virtual goods for real money, researchers study the impact of the interdependencies between virtual economies and the real world. The presented thesis mainly focuses on trade within virtual worlds in the context of virtual consumption and the observation of consumer prices. For this purpose, the four virtual worlds World of Warcraft, RuneScape, Entropia Universe and Second Life have been selected. Several components are required to calculate consumer price indices. First, a market basket, which contains the relevant consumed goods existing in virtual worlds, must be developed. Second, a weighting scheme has to be established, which shows the dispersion of consumer tendencies. Third, prices of relevant consumer goods have to be recorded. The challenge is to apply real-world methods within virtual worlds. Accordingly, this dissertation contains three corresponding investigation parts. A first analysis evaluates to what extent virtual worlds can be explored to identify consumable goods. As a next step, the consumption expenditures of the avatars are examined based on an online survey. Lastly, prices of consumable goods are recorded, which finally makes it possible to calculate consumer price indices. While investigating these components, the thesis focuses not only on the general findings themselves, but also on methodological issues arising along the way, like limited access to relevant data, missing legal legitimation or security concerns of the users. Beside these aspects, the used methods also allow the examination of several other economic aspects like the consumption habits of the avatars. At the end of the thesis, it is considered to what extent the economic characteristics of virtual worlds can be compared with the real world.
Aspects like the important role of weapons or the different usage of food show significant differences from the real world, caused by the business models of virtual worlds.
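Not spelled out in the abstract, but the standard real-world method alluded to is a Laspeyres-type index, which prices a fixed base-period market basket at current prices:

```latex
P_{L}(t) = \frac{\sum_{i} p_{i,t}\, q_{i,0}}{\sum_{i} p_{i,0}\, q_{i,0}} \times 100
```

where \(p_{i,t}\) is the period-\(t\) price of basket item \(i\) and \(q_{i,0}\) its base-period quantity taken from the weighting scheme.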
In this thesis, we deal with the question of whether challenge, flow and fun in computer games are related to each other, and what influence the motivational psychological components (motivation of success, motivation of failure and the chance of success) have. In addition, we want to know whether a free choice of difficulty level is the optimal way to flow. To examine these theories, a study based on an online survey was conducted, in which the participants played the game "flOw". The results were evaluated with the help of a two-factorial analysis of variance with repeated measures and correlation tests. We found that there is indeed a relation between challenge, flow and fun, and that motivation matters indirectly.
The increasing anthropogenic demand for chemicals has created large environmental problems with repercussions for the health of the environment, especially of aquatic ecosystems. As a result, the awareness of the public and of decision makers of the risks from chemical pollution has increased over the past half-century, prompting a large number of studies in the field of ecological toxicology (ecotoxicology). However, the majority of ecotoxicological studies are laboratory-based, and the few studies extrapolating toxicological effects in the field are limited to local and regional levels. Chemical risk assessment on large spatial scales remains largely unexplored, and therefore the potential large-scale effects of chemicals may be overlooked.
To answer ecotoxicological questions, multidisciplinary approaches that transcend classical chemical and toxicological concepts are required. For instance, the current models for toxicity predictions - which are mainly based on the prediction of toxicity for a single compound and species - can be expanded to simultaneously predict the toxicity for different species and compounds. This can be done by integrating chemical concepts such as the physicochemical properties of the compounds with evolutionary concepts such as the similarity of species. This thesis introduces new, multidisciplinary tools for chemical risk assessments, and presents for the first time a chemical risk assessment on the continental scale.
After a brief introduction of the main concepts and objectives of the studies, this thesis starts by presenting a new method for assessing the physiological sensitivity of macroinvertebrate species to heavy metals (Chapter 2). To compare the sensitivity of species to different heavy metals, toxicity data were standardized to account for the different laboratory conditions. The resulting sensitivity rankings were not significantly different across heavy metals, allowing the aggregation of physiological sensitivity into a single ranking.
Furthermore, the toxicological data for macroinvertebrates were used as input data to develop and validate prediction models for heavy metal toxicity, which are currently lacking for a wide array of species (Chapter 3). Apart from the toxicity data, the phylogenetic information of species (evolutionary relationships among species) and the physicochemical parameters of heavy metals were used. The constructed models had good explanatory power for the acute sensitivity of species to heavy metals, with the majority of the explained variance attributed to phylogeny. Therefore, the integration of evolutionary concepts (relatedness and similarity of species) with the chemical parameters used in ecotoxicology improved prediction models for species lacking experimental toxicity data. The ultimate goal of the prediction models developed in this thesis is to provide accurate predictions of toxicity for a wide range of species and chemicals, which is a crucial prerequisite for conducting chemical risk assessment.
The latter was conducted for the first time on the continental scale (Chapter 4), by making use of a dataset of 4,000 sites distributed throughout 27 European countries and 91 respective river basins. Organic chemicals were likely to exert acute risks for one in seven sites analyzed, while chronic risk was prominent for almost half of the sites. The calculated risks are potentially underestimated owing to the limited number of chemicals that are routinely analyzed in monitoring programmes, and a series of other uncertainties related to the limit of quantification, the presence of mixtures, or the potential for sublethal effects not covered by direct toxicity.
Furthermore, chemical risk was related to agricultural and urban areas in the upstream catchments. The analysis of ecological data indicated chemical impacts on the ecological status of the river systems; however, it is difficult to discriminate the effects of chemical pollution from those of other stressors that river systems are exposed to. To test the hypothesis of multiple stressors, and to investigate the relative importance of organic toxicants, a dataset for German streams was used in chapter 5. In that study, the risks from abiotic stressors (habitat degradation, organic chemicals and nutrient enrichment) and a biotic stressor (invasive species) were investigated. The results indicated that more than one stressor influenced almost all sites. Stream size and ecoregions influenced the distribution of risks: the risks for habitat degradation, organic chemicals and invasive species increased with stream size, whereas organic chemicals and nutrients were more likely to influence lowland streams. In order to successfully mitigate the effects of pollutants in river systems, the co-occurrence of stressors has to be considered. Overall, to successfully apply integrated water management strategies, a framework involving multiple environmental stressors on large spatial scales is necessary. Furthermore, to properly address the current research needs in ecotoxicology, a multidisciplinary approach is necessary which integrates fields such as toxicology, ecology, chemistry and evolutionary biology.
This thesis addresses the problem of terrain classification in unstructured outdoor environments. Terrain classification includes the detection of obstacles and passable areas as well as the analysis of ground surfaces. A 3D laser range finder is used as primary sensor for perceiving the surroundings of the robot. First of all, a grid structure is introduced for data reduction. The chosen data representation allows for multi-sensor integration, e.g., cameras for color and texture information or further laser range finders for improved data density. Subsequently, features are computed for each terrain cell within the grid. Classification is performedrnwith a Markov random field for context-sensitivity and to compensate for sensor noise and varying data density within the grid. A Gibbs sampler is used for optimization and is parallelized on the CPU and GPU in order to achieve real-time performance. Dynamic obstacles are detected and tracked using different state-of-the-art approaches. The resulting information - where other traffic participants move and are going to move to - is used to perform inference in regions where the terrain surface is partially or completely invisible for the sensors. Algorithms are tested and validated on different autonomous robot platforms and the evaluation is carried out with human-annotated ground truth maps of millions of measurements. The terrain classification approach of this thesis proved reliable in all real-time scenarios and domains and yielded new insights. Furthermore, if combined with a path planning algorithm, it enables full autonomy for all kinds of wheeled outdoor robots in natural outdoor environments.
Flowering habitats to enhance biodiversity and pest control services in agricultural landscapes
(2015)
Meeting growing demands for agricultural products requires management solutions that enhance food production whilst minimizing negative environmental impacts. Conventional agricultural intensification jeopardizes farmland biodiversity and associated ecosystem services through excessive anthropogenic inputs and landscape simplification. Agri-environment schemes (AES) are commonly implemented to mitigate the adverse effects of conventional intensification on biodiversity. However, such schemes have had only moderate success thus far and would strongly benefit from more explicit goals regarding ecosystem service provisioning. Providing key resources to beneficial organisms may improve their abundance, fitness and diversity and the ecosystem services they provide. With targeted habitat management, AES may synergistically enhance biodiversity and agricultural production and thus contribute to ecological intensification. We demonstrate that sown perennial wildflower strips, as implemented in current AES focusing on biodiversity conservation, also benefit biological pest control in nearby crops (Chapter 2).
Comparing winter wheat fields adjacent to wildflower strips with fields without wildflower strips, we found strongly reduced cereal leaf beetle (Oulema sp.) density and plant damage near wildflower strips. In addition, winter wheat yield was 10 % higher when fields adjoined wildflower strips. This confirms previous assumptions that wildflower strips, known for their positive effects on farmland biodiversity, can also enhance ecosystem services such as pest control, and the positive correlation of yield with flower abundance and diversity suggests that floral resources are key. Refining sown flower strips for enhanced service provision requires a mechanistic understanding of how organisms benefit from floral resources. In climate chamber experiments investigating the impact of single and multiple flowering plant species on fitness components of three key arthropod natural enemies of aphids, we demonstrate that different natural enemies benefit differently from the offered resources (Chapter 3).
Overall, some flower species were more valuable to natural enemies than others. Additionally, the mixture of all flowers generally performed better than monocultures, yet with no transgressive overyielding. By explicitly tailoring flower strips to the requirements of key natural enemies of crop pests, we aimed to maximise natural-enemy-mediated pest control in winter wheat (Chapter 4) and potato (Chapter 5) crops.
Respecting the manifold requirements of diverse natural enemies, but not pests, in terms of the temporal and spatial provisioning of floral, extrafloral and structural resources, we designed targeted annual flower strips that can be included in the crop rotation to support key arthropods at the place and time they are needed. Indeed, field experiments revealed that cereal leaf beetle density and plant damage in winter wheat can be reduced by 40 % to 61 %, and aphid densities in potatoes even by 77 %, if a targeted flower strip is sown into the field. These effects were not restricted to the vicinity of flower strips and, in contrast to fields without a flower strip, often prevented action thresholds from being reached. This suggests that targeted flower strips could replace insecticides. All adult natural enemies were enhanced inside targeted flower strips when compared to control strips. Yet, spillover to the field was restricted to key natural enemies such as ground beetles (winter wheat), hoverflies (potato) and lacewings (winter wheat and potato), suggesting their dominant role in biological control. In potatoes, targeted flower strips also enhanced hoverfly species richness in strips and crop, highlighting their additional benefits for diversity.
The present results provide more insights into the mechanisms underlying conservation biological control and highlight the potential of tailored habitat management for ecological intensification.
In this thesis, an interactive application is developed for Android OS: a virtual-reality game in the first-person shooter genre, set in a space scenario. By using a stereo renderer, the game can be played with virtual-reality glasses.
The publication of freely available and machine-readable information has increased significantly in recent years. Especially the Linked Data initiative has been receiving a lot of attention. Linked Data is based on the Resource Description Framework (RDF): anybody can simply publish their data in RDF and link it to other datasets. The structure is similar to the World Wide Web, where individual HTML documents are connected with links. Linked Data entities are identified by URIs, which are dereferenceable to retrieve information describing the entity. Additionally, so-called SPARQL endpoints can be used to access the data with an algebraic query language (SPARQL) similar to SQL. By integrating multiple SPARQL endpoints it is possible to create a federation of distributed RDF data sources which acts like one big data store.
In contrast to the federation of classical relational database systems, there are some differences for federated RDF data. RDF stores are accessed either via SPARQL endpoints or by resolving URIs. There is no coordination between RDF data sources, and machine-readable metadata about a source's data is commonly limited or not available at all. Moreover, there is no common directory which can be used to discover RDF data sources or ask for sources that offer specific data. The federation of distributed and linked RDF data sources has to deal with various challenges: in order to distribute queries automatically, suitable data sources have to be selected based on query details and the information available about the data sources, and the minimization of query execution time requires optimization techniques that take into account the execution cost of query operators and the network communication overhead for contacting individual data sources. In this thesis, solutions for these problems are discussed. Moreover, SPLENDID is presented, a new federation infrastructure for distributed RDF data sources which uses optimization techniques based on statistical information.
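To illustrate the general idea of querying a SPARQL endpoint (this shows plain endpoint access, not SPLENDID's source selection or cost model), a minimal sketch using the SPARQLWrapper Python library against the public DBpedia endpoint; the endpoint and query are illustrative:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# One SPARQL endpoint of a federation; DBpedia serves as a public example.
endpoint = SPARQLWrapper("https://dbpedia.org/sparql")
endpoint.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?city ?population WHERE {
        ?city a dbo:City ;
              dbo:populationTotal ?population .
    } LIMIT 5
""")
endpoint.setReturnFormat(JSON)

# A federator would send such subqueries to several endpoints,
# chosen by its source selection, and join the partial results.
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["city"]["value"], row["population"]["value"])
```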
This thesis investigates how model errors affect positioning accuracy and handling when maneuvering with a driver assistance system. Particular emphasis is placed on determining error bounds. The question pursued is how large the input error may be for the assistance system to still exhibit sufficient quality with respect to precision and robustness. To this end, the errors are first examined quantitatively on the basis of the kinematic model. A qualitative examination by means of systematic experiments follows. First, a controller is developed with which a maneuver can be simulated using the visual information of the assistance system.
Then a method is presented for evaluating the maneuver against defined error bounds. To search a large space of possible error combinations efficiently, the probabilistic annealed particle filter is used. With the help of a test environment, systematic experiments are then carried out. For further evaluation of the assistance system in a controlled environment, the system was ported to the RODOS simulation environment in cooperation with the Fraunhofer ITWM in Kaiserslautern.
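A rough illustration of the annealed-particle-filter idea behind such an error-space search (the cost function, schedule and parameters below are illustrative assumptions, not the thesis setup): candidate error combinations are weighted by a progressively sharpened function and resampled over several annealing layers, concentrating particles in the regions of interest. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(x):
    # Illustrative stand-in for "maneuver quality given error combination x".
    return np.sum((x - 0.3) ** 2, axis=1)

def annealed_particle_filter(n=500, dim=2, layers=8, noise=0.2):
    particles = rng.uniform(-1, 1, size=(n, dim))    # initial error combinations
    for m in range(layers):
        beta = (m + 1) / layers                      # annealing: weights sharpen
        w = np.exp(-beta * 20 * cost(particles))     # low cost -> high weight
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)             # resample by weight
        particles = particles[idx] + rng.normal(0, noise * (1 - beta), (n, dim))
    return particles

best = annealed_particle_filter()
print(best.mean(axis=0))   # concentrates near the cost minimum at (0.3, 0.3)
```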
Fachbereich 4 (Informatik), the department of computer science, consists of twenty-five working groups headed by professors, which cooperate in research and teaching within six institutes.
In each annual report, the working groups present themselves according to a uniform pattern: their staff composition, the projects falling within the reporting period, and the scientific output achieved. The following chapters list individual parameters that describe the department in quantitative terms with respect to third-party funding, teaching coverage, graduates and publications.
Uniprisma Ausg. 2005
(2015)
The subject of this thesis was to analyse the involvement of classical creativity techniques and IT tools in different phases of the innovation process. In addition, the present work deals with the integration of Design Thinking and TRIZ into the innovation process. The aim was to define a specific innovation process based on diverse existing innovation process models from the literature. This specific innovation process serves as the basis for analysing the integration of creativity techniques, IT tools, Design Thinking and TRIZ. In summary, the application of creativity techniques and IT tools is admissible and useful in every phase of the innovation process. This work shows that the Design Thinking method can be integrated into the early stages of the innovation process, and that the process model of TRIZ, which differs from traditional innovation processes, can be combined with classical innovation processes.
Campuszeitung Ausg. 1/2015
(2015)
Simulation von Schnee
(2015)
Physics simulations allow the creation of dynamic scenes on the computer. Computer-generated images become lively and find use in movies, games and engineering applications. GPGPU techniques make use of the graphics card to simulate physics. The simulation of dynamic snow is still little researched; the Material Point Method is the first technique capable of capturing the dynamics and characteristics of snow.
The hybrid use of Lagrangian particles and a regular Cartesian grid enables the solving of partial differential equations: particle quantities are transferred to the grid, the grid velocities are updated by calculating gradients in an FEM-like manner (finite element method), and finally the grid node velocities are weighted back to the particles to move them across the scene. This method is coupled with a constitutive model to cover the dynamic nature of snow, including collisions and breaking.
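To make the particle-grid transfer concrete, a much-simplified 1D PIC-style sketch (linear weights, gravity only, no constitutive model, so the snow-specific physics is omitted); grid spacing, time step and particle setup are illustrative assumptions:

```python
import numpy as np

dx, dt, g = 1.0, 0.1, -9.81
grid_n = 8
px = np.array([2.3, 2.7, 4.1])           # particle positions
pv = np.zeros_like(px)                   # particle velocities
pm = np.ones_like(px)                    # particle masses

for _ in range(3):
    gm = np.zeros(grid_n)                # grid mass
    gmv = np.zeros(grid_n)               # grid momentum
    # Particle-to-grid: scatter mass and momentum with linear weights.
    for x, v, m in zip(px, pv, pm):
        i = int(x / dx)
        w = (x / dx) - i                 # fraction toward node i+1
        gm[i] += (1 - w) * m;  gmv[i] += (1 - w) * m * v
        gm[i + 1] += w * m;    gmv[i + 1] += w * m * v
    # Grid update: here just gravity; the thesis applies stress gradients too.
    gv = np.divide(gmv, gm, out=np.zeros(grid_n), where=gm > 0) + dt * g
    # Grid-to-particle: gather updated velocities and advect the particles.
    for k, x in enumerate(px):
        i = int(x / dx)
        w = (x / dx) - i
        pv[k] = (1 - w) * gv[i] + w * gv[i + 1]
    px += dt * pv

print(px, pv)
```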
This bachelor thesis connects recent developments in the GPGPU techniques of OpenGL with the Material Point Method to efficiently simulate visually compelling, dynamic snow scenes.
Heat exchangers are used for thickening various products or for the desalination of salt water; they are also used as cooling units in industry. In such use, the stainless steel heat-transferring elements come into contact with media containing microorganisms, such as river water or salt water, and corrode. After at least two years of use, the material is covered with bacterial slime called biofilm. This process, called biofouling, causes losses in efficiency and creates substantial costs, depending on the cleaning technique and its efficiency. Cleaning a heat exchanger is very expensive and time-consuming, and it can only be done while the device is out of operation.
Changing the surface properties of materials is the best and easiest way to lengthen the initial phase of biofilm formation, which leads to less biofouling (Mogha et al. 2014).
Thin polymer films as novel materials are cheaper to produce than stainless steel and easy to handle. Furthermore, they can be functionalized easily and can be bought in different sizes all over the world. They can therefore reduce the cost of cleaning and extend the period during which the heat exchanger operates at high efficiency: if the efficiency decreases, the thin polymer films can simply be replaced.
For a successful investigation of the microbiological and process engineering challenges, a cooperation was established between the Technical University of Kaiserslautern (chair of separation science and technology) and the University of Koblenz-Landau (working group microbiology).
The aim of this work was the design, engineering and construction of a reactor for investigating the biofouling taking place on thin polymer films and stainless steel, and the establishment of an experimental design. Several requirements applied to these tasks. A real heat exchanger was downscaled so that the process parameters remain at least comparable. Although many commercial flow cell kits are available, reducing the costs by self-assembly increased the number of samples, providing a basis for statistical analysis. In addition, fast and minimally invasive online in-situ microscopy and Raman spectroscopy can be performed. By creating laminar flow and using a weir, we achieved homogeneous inflow to the reactors. Reproducible data on biomass and cell number were generated.
The assessment of biomass and cell number is well established in drinking water analysis; epifluorescence microscopy and gravimetric determination are the basic techniques of this work, too. Differences in cell number and biomass between surface modifications and materials are quantified and statistically analysed.
The wild-type strain Escherichia coli K12 and an inoculum of 500 ml fresh water were used to describe the biofouling of the films. We thereby generated data with a natural bacterial community in unknown media properties as well as data with well-known media properties, establishing the technical relevance of the data.
Free surface energy and surface roughness are the first attachment hurdles for bacteria. These parameters were measured according to DIN 55660 and DIN EN ISO 4287. The materials science data were correlated with cell number and biomass. This correlation provides the basic link between biofouling, as a biologically induced parameter, and the material properties, from which material properties that reduce biofouling can be projected.
Using Raman spectroscopy as a cutting-edge method, future investigations could be shortened: if biomass or cell number can be linked with the spectra, new functional materials can be screened in a short time.
Central tasks of higher education institutions include the assessment, the explanation and the promotion of academic performance (Heublein & Wolter, 2011, p. 215). In this context, achievement motivation is, alongside intellectual abilities, considered a significant predictor of academic success (e.g. Schmidt-Atzert, 2005, p. 132; Steinmayr & Spinath, 2009, p. 80). The present study therefore focuses on the motivational processes of 332 first-year students at the Hochschule der Bundesagentur für Arbeit and on the factors that promote their learning outcomes. With a response rate of 89 %, the data obtained are representative of the population. Using an ex-post-facto design in the form of a quantitative predictor-criterion approach (a special variant of a longitudinal design) with different survey methods, such as a standardized self-report questionnaire, achievement tests and official documents/records, the following research hypotheses were examined: the strength of achievement motivation depends both on expectancy components (academic self-concept, self-esteem, subjective grade expectation, hope of success and fear of failure) and on incentive components (object-, activity- and consequence-related incentives), which in turn, mediated by achievement-motivated behavior, influence academic performance. It was postulated that motivational variables still exert a significant effect on academic performance when further performance predictors, such as school-leaving grade, intelligence, emotional stability and conscientiousness, are controlled for.
Web application testing is an active research area. Garousi et al. conducted a systematic mapping study and classified 79 papers published between 2000 and 2011. However, there seems to be a lack of information exchange between the scientific community and tool developers.
This thesis systematically analyzes the field of functional, system-level web application testing tools. 194 candidate tools were collected in the tool search and screened, with 23 tools being selected as the foundation of this thesis. These 23 tools were systematically examined to generate a feature model of the domain. The supporting methodology is an additional contribution of this thesis: it processes the end-user documentation of tools belonging to the examined domain and creates a feature model. The feature model gives an overview of the existing features, their alternatives and their distribution. It can be used to identify trends and problems and extraordinary features, to support tool purchase decisions, or to guide scientists in focusing their research.
The topic of this thesis is the development of hardware-accelerated single-image compression for video transmission. Methods for single-image compression have existed for quite some time. However, the common methods do not meet the real-time and performance requirements needed for use during video transmission without noticeable latency. In this thesis, one of the most common image compression algorithms is examined for parallelizability with the help of the graphics card, in order to achieve real-time capability during the compression and decompression of computer-generated images. The results are evaluated and placed in the context of current parallelized compression techniques.
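The abstract does not name the algorithm; assuming a JPEG-style transform coder, the computational core that lends itself to parallelization is the blockwise 2D discrete cosine transform, since every 8x8 block is independent (each block could map to one GPU work group). A minimal CPU reference sketch in Python with SciPy, purely for illustration:

```python
import numpy as np
from scipy.fft import dctn, idctn

def blockwise_dct(image, block=8):
    """Forward 2D DCT on independent 8x8 blocks."""
    h, w = image.shape
    out = np.empty_like(image, dtype=float)
    for y in range(0, h, block):
        for x in range(0, w, block):
            out[y:y+block, x:x+block] = dctn(image[y:y+block, x:x+block], norm="ortho")
    return out

def blockwise_idct(coeffs, block=8):
    """Inverse transform, block by block."""
    h, w = coeffs.shape
    out = np.empty_like(coeffs)
    for y in range(0, h, block):
        for x in range(0, w, block):
            out[y:y+block, x:x+block] = idctn(coeffs[y:y+block, x:x+block], norm="ortho")
    return out

img = np.random.default_rng(1).random((64, 64))
coeffs = blockwise_dct(img)
coeffs[np.abs(coeffs) < 0.1] = 0.0        # crude "compression": drop small coefficients
print(np.abs(blockwise_idct(coeffs) - img).max())   # small reconstruction error
```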
The present research work deals with the positioning and provider-internal communication of the innovative IT architecture SOA. The central goals of this explorative, empirical study, which is situated in the context of innovation success factor research, are to answer the following two guiding research questions:
Research question 1: Which conditions contribute to a successful positioning of SOA? Research question 2: Which conditions contribute to successful provider-internal communication regarding SOA? To examine these two research questions, a two-stage Delphi study was conducted. First, a qualitative survey wave (N=53) was carried out to identify the SOA positioning conditions and the provider-internal SOA communication conditions. In total, the first wave identified 122 SOA positioning conditions, comprising 65 conditions on the provider side, 35 conditions on the customer side, 19 conditions on the SOA side and 3 conditions in the wider environment. For provider-internal SOA communication, 31 conditions were identified. The SOA positioning conditions and provider-internal SOA communication conditions identified in the first wave were subjected to a quantitative analysis in the second survey wave (N=83). The present study thus provides conditions that contribute both to a successful SOA positioning and to successful provider-internal SOA communication.
The results of this work are summarized and placed in their theoretical context. The methodological approach is critically discussed and the quality of the data is assessed. Finally, an outlook on future fields of research is given.
Campuszeitung Ausg. 1/2013
(2015)
Uniprisma Ausg. 2010
(2015)
The 2008 UN Convention on the Rights of Persons with Disabilities formulates a legal right to inclusive education for people with impairments. In Germany, this right has been implemented since 2009 through amendments to school laws that introduce inclusive education via parental choice. Against the background of these newly created parental decision options, it has not yet been examined what expectations parents of children with complex impairments attach to their child's right to inclusive education and to what extent they see these expectations fulfilled at the school of their choice. At the center of the present work is the reconstruction of the educational offer from the parents' perspective, compared with the view of the pedagogical classroom teams. The questions of parental expectations and experiences were pursued from the system-theoretical perspective of Luhmann. The qualitative study concerns pupils with complex impairments who entered school in 2010 and 2011, after the amendment of the Hamburg school law (2009), and who, by their parents' decision, learn in different settings at mainstream primary and special schools. Data were collected through guideline-based interviews with parents, teachers and school administrators, complemented by classroom observations in the school year 2011/12 and documents provided by the schools. The data were analysed using Grounded Theory according to Strauss/Corbin (1996). The results show parental educational expectations regarding the enabling of autonomy and participation for their children, and a differentiated perception of how these expectations are met in everyday school life. Parents attach particular importance to the cooperation between school and family, which is significant for the development of trust or mistrust. From these findings and their connection to systems theory, a model of professional trust/mistrust was developed.
The analysis yields indications of quality criteria for inclusive education and of development requirements for professionalization, aimed both at the level of the school as an organization and at the interaction between school actors and parents.
Uniprisma Ausg. 2008
(2015)
In current research on autonomous mobile robots, path planning is still a very important issue.
This master's thesis deals with various path planning algorithms for the navigation of such mobile systems. The task is not only to determine a collision-free trajectory from one point to another: the path should also be optimal and comply with all vehicle-given constraints. Autonomous driving in an unknown and dynamic environment in particular poses a major challenge, because closed-loop control is necessary and thus a certain responsiveness of the planner is demanded.
Two types of algorithms are presented. First, path planners based on the common graph search algorithm A*: A*, Anytime Repairing A*, Lifelong Planning A*, D* Lite, Field D* and hybrid A*. Second, algorithms based on the probabilistic planning algorithm Rapidly-exploring Random Tree (RRT, RRT*, Lifelong Planning RRT*), as well as some extensions and heuristics. In addition, methods for collision avoidance and path smoothing are presented. Finally, these algorithms are evaluated and compared with each other.
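For orientation, a compact grid-based A*, the foundation of the first algorithm family above; the 4-connected grid and unit step costs are illustrative simplifications, not the thesis setup:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid; grid[y][x] == 1 marks an obstacle."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start)]
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        f, g, node = heapq.heappop(open_set)
        if node == goal:
            path = []                     # reconstruct path back to start
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        if g > g_cost[node]:
            continue                      # stale queue entry, already improved
        y, x = node
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            free = 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0
            if free and g + 1 < g_cost.get((ny, nx), float("inf")):
                g_cost[(ny, nx)] = g + 1
                came_from[(ny, nx)] = node
                heapq.heappush(open_set, (g + 1 + h((ny, nx)), g + 1, (ny, nx)))
    return None                           # no collision-free path exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))  # routes around the obstacle row
```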
The lasting hype around the mobile internet and the related technology of mobile applications does not seem to drop off. The immense economic potential of this market leads businesses and ventures to continuously find new ways of monetization, while the underlying causes of the phenomenon are rarely challenged. Scientific research in the field of "ubiquitous mobile" has not yet developed a clear overall picture of the cause-and-effect chains. Attempts to derive causes from studies of related mass media such as the computer or the internet have been discussed controversially. By combining the research streams of media usage motives and customer retention, this paper presents a new research model. Based on a quantitative survey in the German-speaking area, the data obtained show the motives of mobility, information gathering and entertainment to be the most important drivers of customer satisfaction with mobile applications. The paper also highlights a significant correlation between customer satisfaction and the other determinants of customer retention.
Uniprisma Ausg. 2007
(2015)
Campuszeitung Ausg. 1/2014
(2015)
In the new epoch of the Anthropocene, global freshwater resources are experiencing extensive degradation from a multitude of stressors. Consequently, freshwater ecosystems are threatened by a considerable loss of biodiversity as well as a substantial decrease in adequate and secured freshwater supply for human usage, not only on local scales but also on regional to global scales. Large-scale assessments of the human and ecological impacts of freshwater degradation enable integrated freshwater management and complement small-scale approaches. Geographic information systems (GIS) and spatial statistics (SS) have shown considerable potential in ecological and ecotoxicological research to quantify stressor impacts on humans and ecological entities, and to disentangle the relationships between drivers and ecological entities on large scales through an integrated spatial-ecological approach. However, integrations of GIS and SS with ecological and ecotoxicological models are scarce, and hence the large-scale spatial picture of the extent and magnitude of freshwater stressors as well as their human and ecological impacts is still opaque. This Ph.D. thesis contributes novel GIS and SS tools, adapts and advances available spatial models, and integrates them with ecological models to enable large-scale identification of the human and ecological impacts of freshwater degradation. The main aim was to identify and quantify the effects of stressors, i.e. climate change and trace metals, on freshwater assemblage structure and trait composition and on human health, respectively, on large scales, i.e. European and Asian freshwater networks. The thesis starts with an introduction to the conceptual framework and objectives (chapter 1). It proceeds with outlining two novel open-source algorithms for the quantification of the magnitude and effects of catchment-scale stressors (chapter 2). The algorithms, jointly called ATRIC, automatically select an accumulation threshold for stream network extraction from digital elevation models (DEM) by assuring the highest concordance between DEM-derived and traditionally mapped stream networks. Moreover, they delineate catchments and upstream riparian corridors for given stream sampling points after snapping them to the DEM-derived stream network. ATRIC showed similar or better performance than the available comparable algorithms and is capable of processing large-scale datasets. It enables integrated and transboundary management of freshwater resources by quantifying the magnitude and effects of catchment-scale stressors. Spatially shifting temporal points (SSTP), outlined in chapter 3, estimates pooled within-time-series (PTS) variograms by spatializing temporal data points and shifting them. Data were pooled by ensuring consistency of spatial structure and temporal stationarity within a time series, while pooling a sufficient number of data points and increasing data density for a reliable variogram estimation. SSTP-estimated PTS variograms showed higher precision than those of the available method. The method enables regional-scale stressor quantification by filling spatial data gaps with temporal information in data-scarce regions. In chapter 4, the responses of assumed climate-associated traits from six grouping features to 35 bioclimatic indices were compared for five insect orders, their potential for changing distribution patterns under future climate change was evaluated, and the most influential climatic aspects were identified.
Traits of the temperature preference grouping feature and the insect order Ephemeroptera exhibited the strongest response to climate as well as the highest potential for changing distribution patterns, while seasonal radiation and moisture were the most influential climatic aspects that may drive a change in insect distribution patterns. The results contribute to trait-based freshwater monitoring and change prediction. In chapter 5, the concentrations of 10 trace metals in drinking water sources were predicted and compared with guideline values. In more than 53% of the total area of Pakistan, inhabited by more than 74 million people, drinking water was predicted to be at risk from multiple trace metal contamination. The results inform freshwater management by identifying potential hot spots. The last chapter (6) synthesizes the results and provides a comprehensive discussion of the four studies and of their relevance for freshwater resource conservation and management.
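For context, the empirical (semi)variogram that methods like SSTP pool is conventionally estimated with the Matheron estimator, halving the mean squared difference of value pairs per distance bin. A minimal sketch (regular lag bins, synthetic data, purely illustrative of the classical estimator, not of SSTP's pooling):

```python
import numpy as np

def empirical_variogram(coords, values, lags):
    """Classical Matheron estimator: one gamma value per distance bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d > lo) & (d <= hi)
        # Each pair appears twice in the matrices, which cancels in the mean:
        # gamma(h) = mean(squared differences at lag h) / 2.
        gamma.append(sq[mask].mean() / 2 if mask.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(2)
pts = rng.random((200, 2)) * 100                            # sampling locations
vals = np.sin(pts[:, 0] / 20) + rng.normal(0, 0.1, 200)     # spatially structured data
print(empirical_variogram(pts, vals, lags=np.linspace(0, 50, 6)))
```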
Protest & justice - a topic in research and teaching at Campus Landau
Adult education: the challenge of modern continuing education
Hospice work: new quality index ready for practice
Dissatisfied at work? How positive psychology can help
Psychotherapy: new outpatient clinic for children and adolescents
The mitral valve is one of the four valves in the human heart. It is located in the left heart chamber, and its function is to control the blood flow from the left atrium to the left ventricle. Pathologies can lead to malfunctions of the valve, so that blood can flow back into the atrium. Patients with a faulty mitral valve function may suffer from fatigue and chest pain. The functionality can be surgically restored, which is often a long and exhausting intervention. Thorough planning is necessary to ensure a safe and effective surgery; this can be supported by creating pre-operative segmentations of the mitral valve, and a post-operative analysis can determine the success of an intervention. This work combines existing and new ideas into a new approach for (semi-)automatically creating such valve models. The manual part guarantees a high-quality model and reliability, whereas the automatic part contributes to saving valuable labour time.
The main contributions of the automatic algorithm are an estimated semantic separation of the two leaflets of the mitral valve and an optimization process that is capable of finding a coaptation line and area between the leaflets. The segmentation method can perform a fully automatic segmentation of the mitral leaflets if the annulus ring is already given. The intermediate steps of this process are integrated into a manual segmentation method, so a user can guide the whole procedure. The quality of the valve models generated by the proposed method is measured by comparing them to completely manually segmented models. This shows that commonly used methods to measure the quality of a segmentation are too general and do not suffice to reflect the real quality of a model. Consequently, the work at hand introduces a set of measurements that can qualify a mitral valve segmentation in more detail and with respect to anatomical landmarks. Besides the intra-operative support for a surgeon, a segmented mitral valve provides additional benefits: the ability to obtain and objectively describe the valve anatomy patient-specifically may be the basis for future medical research in this field, automation allows large data sets to be processed with reduced expert dependency, and simulation methods that use the segmented models as input may predict the outcome of a surgery.
Factors triggering the ecotoxicity of metal-based nanoparticles towards aquatic invertebrates
(2015)
Nanoparticles are produced and used in huge amounts, increasing the probability that they end up in surface waters. There, they are subject to environmentally driven modification processes. Consequently, aquatic life may be exposed to different nanoparticle agglomerate sizes, while after sedimentation benthic organisms are more likely to be affected.
However, most ecotoxicity studies with nanoparticles have exclusively investigated the implications of nanoparticle characteristics (e.g. size) for pelagic organisms, ignoring environmentally modified nanoparticles. A systematic assessment of the factors triggering the fate and toxicity of nanoparticles under environmentally relevant conditions is therefore needed. The present thesis investigates the implications of nanoparticle-related factors (i.e., inherent material properties and nanoparticle characteristics) as well as environmental conditions for the pelagic organism Daphnia magna and the benthic species Gammarus fossarum. In detail, inert titanium dioxide (nTiO2) and ion-releasing silver nanoparticles (nAg), both with varying particle characteristics (e.g. initial size), were tested for their toxicity under different environmental conditions (e.g. ultraviolet (UV) light).
The results indicate that the toxicity of nTiO2 and nAg is mainly determined by their adsorption potential onto biota and by their fate in terms of reactive oxygen species or Ag+ ion release. Thus, inherent material properties, nanoparticle characteristics and environmental conditions promoting or inhibiting these aspects had significant implications for the toxicity of nTiO2 and nAg towards daphnids.
Furthermore, the presence of ambient UV light, for example, adversely affected gammarids at 0.20 mg nTiO2/L, while under darkness no effects occurred even at 5.00 mg nTiO2/L. Hence, the risk currently associated with nanoparticles might be underestimated if their interaction with environmental parameters is disregarded.
The formulation of the decoding problem for linear block codes as an integer program (IP) with a rather tight linear programming (LP) relaxation has made a central part of channel coding accessible for the theory and methods of mathematical optimization, especially integer programming, polyhedral combinatorics and also algorithmic graph theory, since the important class of turbo codes exhibits an inherent graphical structure. We present several novel models, algorithms and theoretical results for error-correction decoding based on mathematical optimization. Our contribution includes a partly combinatorial LP decoder for turbo codes, a fast branch-and-cut algorithm for maximum-likelihood (ML) decoding of arbitrary binary linear codes, a theoretical analysis of the LP decoder's performance for 3-dimensional turbo codes, compact IP models for various heuristic algorithms as well as ML decoding in combination with higher-order modulation, and, finally, first steps towards an implementation of the LP decoder in specialized hardware. The scientific contributions are presented in the form of seven revised reprints of papers that appeared in peer-reviewed international journals or conference proceedings. They are accompanied by an extensive introductory part that reviews the basics of mathematical optimization, coding theory, and the previous results on LP decoding that we rely on afterwards.
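For readers outside coding theory, the underlying formulation (standard in the LP decoding literature following Feldman et al., not specific to this thesis): with received log-likelihood ratios \(\lambda \in \mathbb{R}^n\), maximum-likelihood decoding of a binary linear code \(C \subseteq \{0,1\}^n\) and its LP relaxation read

```latex
\text{ML:}\quad \min_{x \in C} \ \lambda^{\top} x
\qquad\qquad
\text{LP:}\quad \min_{x \in \mathcal{P}} \ \lambda^{\top} x,
\quad \operatorname{conv}(C) \subseteq \mathcal{P} \subseteq [0,1]^{n}
```

where the relaxed polytope \(\mathcal{P}\), e.g. the fundamental polytope obtained by intersecting the convex hulls of the single-parity-check codes, is tractable; an integral LP optimum is guaranteed to be the ML codeword.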
This thesis examines the topic of leadership and health. It summarizes various findings from the literature and has them assessed by managers from business and the police as well as by personnel and organizational developers. The aim was to find out whether the managers and the personnel and organizational developers consider the topic important, what they see as the main causes of absenteeism, and how they assess various findings from the health management literature. In addition, they were asked to judge which measures they consider suitable and which resources are necessary to support employees in staying healthy. Finally, the managers and the personnel and organizational developers were asked to assess which leadership style is regarded as health-promoting. The business and police managers as well as the personnel and organizational developers consider the topic of health important and do not regard it as a mere fashion trend. Their assessments of suitable measures to improve employee health largely correspond to the suggestions for health-oriented leadership derived from the literature. The broad agreement of the views in research and practice suggests that the findings of the health management literature are likely perceived as plausible by practitioners.
This thesis deals with the development of an interactive Android card game, implemented using the Israeli card game Yaniv as an example. The focus is the elaboration of the required background components and their implementation in the application. The required game processes are examined and a possible solution is identified.
Geographic cluster-based routing in ad-hoc wireless sensor networks is a current field of research. Various algorithms for routing in wireless ad-hoc networks based on position information already exist, among them algorithms that use the traditional beaconing approach as well as algorithms that work beaconless (requiring no information about the environment besides the node's own position and the destination). Geographic cluster-based routing with guaranteed message delivery can also be carried out on overlay graphs. Until now, however, the required planar overlay graphs have not been constructed reactively.
This thesis proposes a reactive algorithm, the Beaconless Cluster Based Planarization (BCBP) algorithm, which constructs a planar overlay graph and noticeably reduces the number of messages required for that. Based on an algorithm for cluster-based planarization, it beaconlessly constructs a planar overlay graph in a unit disk graph (UDG). A UDG is a model of a wireless network in which every participant has the same sending radius. Evaluation shows the algorithm to be more efficient than the non-beaconless variant. Another result of this thesis is the Beaconless LLRAP (BLLRAP) algorithm, for which planarity but not continued connectivity could be proven.
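For intuition, a classical (non-reactive) way to obtain a planar overlay of a UDG is the localized Gabriel graph rule: keep edge (u, v) only if no third node lies inside the circle with diameter uv. This is not BCBP, only a hedged illustration of planarization on a UDG:

```python
import itertools
import math

def unit_disk_graph(nodes, radius=1.0):
    """All links between nodes closer than the common sending radius."""
    return {(u, v) for u, v in itertools.combinations(range(len(nodes)), 2)
            if math.dist(nodes[u], nodes[v]) <= radius}

def gabriel_overlay(nodes, edges):
    """Keep (u, v) only if no witness w lies in the circle with diameter uv."""
    keep = set()
    for u, v in edges:
        mid = ((nodes[u][0] + nodes[v][0]) / 2, (nodes[u][1] + nodes[v][1]) / 2)
        r = math.dist(nodes[u], nodes[v]) / 2
        if all(math.dist(nodes[w], mid) > r
               for w in range(len(nodes)) if w not in (u, v)):
            keep.add((u, v))
    return keep

pts = [(0.0, 0.0), (0.9, 0.1), (0.5, 0.4), (0.2, 0.8)]
udg = unit_disk_graph(pts)
print(sorted(gabriel_overlay(pts, udg)))   # planar subset of the UDG links
```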
A fundamental understanding of the attachment of engineered nanoparticles to environmental surfaces is essential for the prediction of nanoparticle fate and transport in the environment.
The present work investigates the attachment of non-coated and citrate-coated silver nanoparticles to different model surfaces and environmental surfaces in the presence and absence of humic acid, using batch sorption experiments.
The objective of this thesis was to investigate how silver nanoparticles interact with surfaces bearing different chemical functional groups, and how the presence of humic acid affects these particle-surface interactions. In the absence of humic acid, nanoparticle-surface attachment was influenced by the chemical nature of the interacting surfaces. In the presence of humic acid, on the other hand, attachment was influenced by the specific surface area of the sorbent surfaces. The sorption of non-coated and citrate-coated silver nanoparticles to all surfaces was nonlinear and best described by a Langmuir isotherm, indicating monolayer sorption of nanoparticles onto the surfaces. This can be explained by the blocking effect generated by particle-particle repulsion. In the presence of humic acid, sorption of nanoparticles to the surfaces was linear: both the nanoparticles and the surfaces become coated with humic acid, which masks the chemical functionalities of the surfaces and thus changes the particle-surface interactions. For silver nanoparticle sorption from an unstable suspension, the sorption isotherms did not follow any classical sorption model, suggesting an interplay between aggregation and sorption. Citrate-coated and humic acid-coated silver nanoparticles showed depressed sorption compared to non-coated silver nanoparticles. In the case of citrate-coated silver nanoparticles, the decrease in sorption can be explained by their more negative zeta potential compared to non-coated ones. For humic acid-coated nanoparticles, the sorption depression may be due to steric hindrance caused by free humic acid molecules coating the sorbent surface, or to competition for sorption sites between the nanoparticles and the free humic acid molecules present in the suspension. Nanoparticle surface chemistry is thus an important factor determining the attachment of nanoparticles to surfaces, which makes the characterization of the nanoparticle surface an essential step in the study of their fate in the environment.
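The Langmuir isotherm referred to here has the standard textbook form (not thesis-specific), where \(q\) is the sorbed amount, \(C_e\) the equilibrium concentration, \(q_{\max}\) the monolayer capacity and \(K_L\) the affinity constant:

```latex
q = \frac{q_{\max}\, K_L\, C_e}{1 + K_L\, C_e}
```

Monolayer saturation at \(q_{\max}\) is what makes this isotherm consistent with the blocking effect described above.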
Another aim of this study was to introduce the potential of chemical force microscopy for nanoparticle surface characterization. With this technique it was possible to distinguish between bare silver nanoparticles, citrate-coated silver nanoparticles and humic acid-coated silver nanoparticles by measuring the adhesion forces between the nanoparticles and five AFM probes with different chemical functionalization.
The intention of this thesis was to characterise the effect of naturally occurring multivalent cations such as calcium and aluminium on the structure of soil organic matter (SOM) as well as on the sorption behaviour of SOM for heavy metals such as lead.
The first part of this thesis describes the results of experiments in which the Al and Ca cation content was changed for various samples originating from soils and peats of different regions in Germany. The second part focuses on SOM-metal cation precipitates in order to study rigidity as a function of cation content. In the third part, the effects of various cation contents in SOM on the binding strength of Pb cations were characterised using a cation exchange resin as a desorption method.
For soil and peat samples as well as for precipitates, matrix rigidity was found to be affected by both the type and the content of the cation. The influence of Ca on rigidity was less pronounced than that of Al and of Pb used in the precipitation experiments. For each sample, one cation content was identified at which matrix rigidity was most pronounced. This specific cation content lies below the cation saturation expected from the cation exchange capacity. These findings resulted in a model describing the relation between cation type, cation content, and the degree of networking in SOM. For all treated soil and precipitate samples, a glass-transition-like step transition was observed, characterised by the step transition temperature T*. It is known from the literature that this type of step transition is due to bridges between water molecules and organic functional groups in SOM. In contrast to the glass transition temperature, this thermal event reverses slowly over days or weeks, depending on the re-conformation of the water molecules. Therefore, changes of T* with different cation compositions in the samples are explained by the formation of water-molecule-cation bridges between SOM functional groups. No influence of different cation compositions on the desorption kinetics of lead in soil samples was observed. It can therefore be assumed that the observed changes of matrix rigidity are largely reversible through changes in water status or pH, or through the input of agitation energy by shaking.
This master's thesis investigates the topic of intercultural web design. As examples, two websites from different countries are compared. On the basis of cultural dimensions, cultural differences are identified on each respective website. The analysis particularly focuses on how closely the respective website designers and operators attend to their users' cultural differences and to the creation of a cross-cultural web design. The analysis illustrates which cultural, and particularly intercultural, aspects of the countries were taken into consideration in the design of the websites. The investigation led to the conclusion that these considerations were not consistently implemented for all websites. Hence, this thesis offers suggestions for improving the aspects that matter most in intercultural web design.
Proceedings of the 9th Open German-Russian Workshop on Pattern Recognition and Image Understanding
(2015)
The Proceedings of the 9th Open German-Russian Workshop on Pattern Recognition and Image Understanding include publications (extended abstracts) that cover, but are not limited to, the following topics:
- Mathematical Theory of Pattern Recognition, Image and Speech Processing, Analysis, Recognition and Understanding
- Cognitive Technologies, Information Technologies, Automated Systems and Software for Pattern Recognition, Image, Speech and Signal Processing, Analysis and Understanding
- Databases, Knowledge Bases, and Linguistic Tools
- Special-Purpose Architectures, Software and Hardware Tools
- Vision and Sensor Data Interpretation for Robotics
- Industrial, Medical, Multimedia and Other Applications
- Algorithms, Software, Automated Systems and Information Technologies in Bioinformatics and Medical Informatics
The workshop took place from December 1st to 5th, 2014, at the University of Koblenz-Landau in Koblenz, Germany.
The present work examines the influence of forest and forestry roads on runoff generation and soil erosion rates within a forested catchment in the Laacher See nature reserve. To this end, existing erosion and accumulation forms were mapped in the field, and erosion simulations were carried out using a small-scale rainfall simulator. Finally, the erosion potential was modelled on the basis of the simulation results.
The analysis of existing erosion and accumulation forms in the field indicated soil erosion rates from road surfaces of between 27.3 and 93.5 t ha-1 a-1, which is of the same order of magnitude as erosion rates under intensive arable land use.
The simulation runs showed that persistent forest roads exhibit markedly altered infiltration behaviour. On natural forest soils, an average of 96% of the precipitation infiltrated. On forest roads, this share dropped to between 14% and 7% on average. The results on skid trails were particularly striking: a considerable influence of soil compaction due to machine traffic was demonstrated. Here, the infiltrated share of precipitation dropped to 31% in the wheel tracks, while 76% still infiltrated between the tracks.
During the simulation runs, maximum sediment amounts of 446 g m-2 were eroded, corresponding to a mean soil erosion rate of 4.96 g m-2 min-1. These high erosion rates were measured on persistent roads with little surfacing. Skid trails showed the lowest erosion values; at most 37 g m-2 were eroded, equivalent to an erosion rate of 0.41 g m-2 min-1. On average, the eroded sediment amounts were 167 to 319 g m-2 for roads and 17 g m-2 for skid trails. Comparative measurements on forest sites, where a mean soil loss of about 5 g m-2 was determined, confirmed an increased erodibility for every type of road construction.
The models were calibrated on the basis of the erosion rates measured in the field. For the study area, the ABAG / DIN 19708 results showed a mean annual soil erosion risk of 2.4 - 5.8 t ha-1 a-1 for persistent roads and 0.5 t ha-1 a-1 for skid trails. Compared with the mean of 0.1 t ha-1 a-1 for largely unaffected forest areas in the study area, this again indicated an increased erosion potential. The physically based modelling of the rainfall experiments with WEPP yielded satisfactory estimates of the runoff behaviour; for persistent roads, deviations of at most -5% were found. In contrast, runoff modelling on skid trails and the general modelling of soil erosion during the rainfall experiments were still error-prone, which can be attributed to the comparatively shallow input data for a physically based model.
It was demonstrated that forest roads have a substantial influence on the water balance and on soil erosion processes. The retention of precipitation is reduced, and soil erosion processes are intensified. Poorly surfaced roads showed strongly increased soil loss, which can lead to consequential ecological damage. Erosion can also impair trafficability. These consequences highlight the relevance of studying runoff and soil erosion processes on forest and forestry roads. The present work is the first study to investigate runoff and soil erosion processes for forest access networks in Central Europe.
Satzung zur Festsetzung von Zulassungszahlen an der Universität Koblenz-Landau für das Studienjahr 2015/2016
Satzung zur Festsetzung der Normwerte für den Ausbildungsaufwand (Curricularnormwerte) der Universität Koblenz-Landau
Satzung der örtlichen Studierendenschaft an der Universität Koblenz-Landau, Campus Koblenz
Ordnung zur Änderung der Beitragsordnung des Studierendenwerks Koblenz
Demographic change forces companies in the social sector, which already face greater difficulties than other enterprises in recruiting from the primary labor market and in keeping qualified workers committed, to pay more attention to employee satisfaction. This research paper analyses the connection between dialogic management in the relation between managers and their staff and employee satisfaction. It measures the personal preferences of leaders and staff as perceived by the employees.
The following personal preferences are distinguished: a harmony-seeking relationship preference versus a dominant autonomy preference, and a thrill-seeking stimulance preference versus a controlling balance preference, following Riemann (1999) and Paschen and Dihsmaier (2011). The empirical research was conducted by means of a survey at the Samaritan institution Fürstenwalde, in which 364 of 560 employees participated. It finds significant correlations between employees' satisfaction with their managers and the managers' dialogic competence, as well as their ability to create trust and convey appreciation. It identifies differences in employees' job satisfaction depending on the perceived psychological preferences of the staff and the perceived preferences of their managers. It could be shown that relationship-oriented leaders display a higher degree of willingness for dialogue than autonomy-oriented managers. Employee satisfaction with such leaders is clearly higher than with managers who are perceived as autonomy-oriented. For stimulance- and relationship-oriented employees, the correlation between satisfaction and dialogic management behavior was higher than for employees with a preference for autonomy or balance. The highest satisfaction values were reached by relationship-oriented employees who also perceive their managers as relationship-oriented. A foundation of dialogic thought and behavior based on trust and appreciation must first be developed among managers and employees before dialogic management can be introduced in a company.
In this context, relationship-oriented management approaches should be taken into consideration when recruiting managers and should also have a high priority in human resources development.
For the unambiguous isolation and classification of important features in 3D multi-attribute volume data, multidimensional transfer functions are indispensable. Yet, when using multiple dimensions, comprehending the data and interacting with it become a challenge, because neither the control of the versatile input parameters nor the visualization in a higher-dimensional space is straightforward.
The goal of this thesis is the implementation of a transfer function editor which supports the creation of a multidimensional transfer function. To this end, different visualization and interaction techniques, such as parallel coordinates, are used. Furthermore, it is possible to choose and combine the dimensions in use interactively, and the rendered volume adapts to the user interaction in real time.
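To make the idea of a multidimensional transfer function concrete, here is a minimal Python sketch that maps two voxel attributes (e.g. intensity and gradient magnitude) to an RGBA color through a 2D lookup table; the editor described in the thesis would let the user shape such a table interactively, and all names here are illustrative.

```python
import numpy as np

def make_tf2d(resolution=256):
    """Create an empty 2D transfer function table with one example widget."""
    tf = np.zeros((resolution, resolution, 4), dtype=np.float32)
    # Widget: voxels of medium intensity and high gradient become red and opaque.
    tf[96:160, 192:, :] = (1.0, 0.1, 0.1, 0.8)      # (R, G, B, alpha)
    return tf

def classify(tf, intensity, gradient):
    """Look up the RGBA value for a voxel with attributes in [0, 1]."""
    n = tf.shape[0] - 1
    return tf[int(intensity * n), int(gradient * n)]

tf = make_tf2d()
print(classify(tf, 0.5, 0.9))   # falls inside the red widget
```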
Change of ecosystems and the associated loss of biodiversity are among the most important environmental issues. Climate change, pollution, and impoundments are considered major drivers of biodiversity loss. Organism traits are an appealing tool for the assessment of these three stressors, owing to their ability to provide mechanistic links between organism responses and stressors, and their consistency over wide geographical areas.
Additionally, traits such as feeding habits influence organismal performance and ecosystem processes. Although the trait responses of specific taxonomic groups to stressors are known, little is known about whether traits respond consistently across different taxonomic groups. Similarly, little is known about the effects of small impoundments on stream ecosystem processes, such as leaf litter decomposition, and on food webs.
After briefly introducing the theoretical background and objectives of the studies, this thesis begins by synthesizing the responses of traits of different taxonomic groups to climate change and pollution. Based on 558 peer-reviewed studies, the uniformity (i.e., convergence) in trait response across taxonomic groups was evaluated through meta-analysis (Chapter 2). Convergence was primarily limited to traits related to tolerance.
In Chapter 3, the hypothesis that small impoundments would modify leaf litter decomposition rates at the sites located within the vicinity of impoundments, by altering habitat variables and invertebrate functional feeding groups (FFGs) (i.e., shredders), was tested. Leaf litter decomposition rates were significantly reduced at the study sites located immediately upstream (IU) of impoundments, and were significantly related to the abundance of invertebrate shredders.
In Chapter 4, the invertebrate FFGs were used to evaluate the effect of small impoundments on stream ecosystem attributes. The results showed that heterotrophic production was significantly reduced at the sites IU. With regard to food webs, the contribution of methane-derived carbon to the biomass of chironomid larvae was evaluated by correlating stable carbon isotope values of chironomid larvae with methane gas concentrations.
The results indicated that the contribution of methane-derived carbon to the stream benthic food web is low. In conclusion, traits are a useful tool for detecting ecological responses to stressors across taxonomic groups, and the effects of small impoundments on stream ecological integrity and food webs are limited.
Ray tracing enables close-to-reality rendering of a modelled scene. Owing to its mode of operation, it is able to display optical phenomena and complex lighting. However, numerous computations have to be performed per pixel. In practice, implementations cannot achieve computer graphics' aim of real-time rendering at close to 60 frames per second. Current graphics processing units (GPUs) allow highly parallel execution of general-purpose computations. Using the graphics API OpenGL, this parallelism can be exploited to design and realize a ray tracer which operates entirely on the GPU. The developed approach is extended by a uniform grid, a ray tracing acceleration structure, from which a speed-up is expected.
The purpose of this thesis is the implementation of a ray tracer which operates completely on the GPU, and its extension with a uniform grid. Afterwards, the maximum achievable performance is evaluated. Possible problems regarding GPU programming are identified and analysed.
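A uniform grid is typically traversed with a 3D digital differential analyzer in the style of Amanatides and Woo; the following CPU-side Python sketch shows the traversal logic only. The thesis' implementation targets the GPU via OpenGL, and all names here are illustrative.

```python
import math

def traverse_uniform_grid(origin, direction, grid_min, cell_size, grid_res):
    """Yield the indices of the grid cells pierced by a ray (3D DDA after
    Amanatides & Woo). Assumes the origin lies inside the grid and the
    direction is non-zero in at least one component."""
    # Cell index containing the ray origin, clamped to the grid bounds.
    cell = [min(grid_res - 1, max(0, int((origin[i] - grid_min[i]) / cell_size)))
            for i in range(3)]
    step = [0, 0, 0]                      # -1, 0 or +1 per axis
    t_max = [math.inf] * 3                # ray parameter of the next cell boundary
    t_delta = [math.inf] * 3              # parameter distance between boundaries
    for i in range(3):
        if direction[i] > 0:
            step[i] = 1
            boundary = grid_min[i] + (cell[i] + 1) * cell_size
        elif direction[i] < 0:
            step[i] = -1
            boundary = grid_min[i] + cell[i] * cell_size
        else:
            continue                      # ray parallel to this axis
        t_max[i] = (boundary - origin[i]) / direction[i]
        t_delta[i] = cell_size / abs(direction[i])
    while all(0 <= cell[i] < grid_res for i in range(3)):
        yield tuple(cell)                 # intersect the primitives stored here
        axis = t_max.index(min(t_max))    # advance across the nearest boundary
        cell[axis] += step[axis]
        t_max[axis] += t_delta[axis]

# Walk a ray diagonally through an 8x8x8 grid of unit cells:
print(list(traverse_uniform_grid((0.5, 0.5, 0.5), (1.0, 0.7, 0.2),
                                 (0.0, 0.0, 0.0), 1.0, 8)))
```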
101worker is the modular knowledge engineering component of the 101companies project. It has developed maintainability and performance problems due to growing organically, rather than following best software design practices. This thesis lays out these problems, drafts a set of requirements for refactoring the system and then describes and analyzes the resulting implementation. The solution involves collation of scattered and redundant information, setup of unit and functional test suites and incrementalization of the bus architecture of 101worker.
While the 1960s and 1970s still knew permanent education (Council of Europe), recurrent education (OECD) and lifelong education (UNESCO), over the past 20 years, lifelong learning has become the single emblem for reforms in (pre-) primary, higher and adult education systems and international debates on education. Both highly industrialized and less industrialized countries embrace the concept as a response to the most diverse economic, social and demographic challenges - in many cases motivated by international organizations (IOs).
Yet, literature on the nature of this influence, the diffusion of the concept among IOs and their understanding of it is scant and usually focuses on a small set of actors. Based on longitudinal data and a large set of education documents, the work identifies rapid diffusion of the concept across a heterogeneous, expansive and dynamic international field of 88 IOs in the period 1990-2013, which is difficult to explain with functionalist accounts.
Based on the premises of world polity theory, this paper argues that what diffuses resembles less the bundle of systemic reforms usually associated with the concept in the literature and more a surprisingly detailed model of a new actor: the lifelong learner.
The identification of experts for a specific technology or framework produces a large benefit for collaborative software projects, since it reduces the communication overhead otherwise required to identify an expert on the fly. This thesis therefore describes a tool and an approach that can be used to identify experts with a specific skill set. It mainly focuses on the skills and expertise of developers that use the Django framework. By adding more rules to our framework, the approach can easily be extended to other technologies or frameworks. The thesis closes with a case study on an open source project.
One task of executives and project managers in IT companies or departments is to hire suitable developers and to assign them to suitable problems. In this paper, we propose a new technique that directly leverages previous work experience of developers in a systematic manner. Existing evidence for developer expertise based on the version history of existing projects is analyzed. More specifically, we analyze the commits to a repository in terms of affected API usage. On these grounds, we associate APIs with developers and thus we assess API experience of developers. In transitive closure, we also assess programming domain experience.
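A heavily simplified sketch of the idea, not the authors' tool: scan a repository's history and count, per author, the changed lines that import a known API. The prefix-to-API table is a stand-in for proper dependency resolution.

```python
import re
import subprocess
from collections import Counter, defaultdict

# Hypothetical mapping from import prefixes to API names.
API_PREFIXES = {"django": "Django", "numpy": "NumPy", "requests": "Requests"}

def api_experience(repo_path):
    """Count, per author, how often their commits touch API import lines."""
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "-p", "--format=AUTHOR:%ae"],
        capture_output=True, text=True, check=True).stdout
    experience = defaultdict(Counter)
    author = None
    for line in log.splitlines():
        if line.startswith("AUTHOR:"):
            author = line[len("AUTHOR:"):]
        elif author and line.startswith(("+", "-")):
            # Added or removed import lines hint at API usage changes.
            m = re.match(r"[+-]\s*(?:from|import)\s+([A-Za-z_][\w.]*)", line)
            if m:
                prefix = m.group(1).split(".")[0]
                if prefix in API_PREFIXES:
                    experience[author][API_PREFIXES[prefix]] += 1
    return experience
```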
Fünfte Ordnung zur Änderung der Prüfungsordnung für die Prüfung im lehramtsbezogenen Bachelorstudiengang Berufsbildende Schulen an der Universität Koblenz-Landau und der Hochschule Koblenz
Vierte Ordnung zur Änderung der Ordnung für die Prüfung im Masterstudiengang für das Lehramt an berufsbildenden Schulen an der Universität Koblenz-Landau und der Hochschule Koblenz
Promotionsordnung des Fachbereichs 3: Mathematik/Naturwissenschaften der Universität Koblenz-Landau, Campus Koblenz
Zweite Ordnung zur Änderung der Masterprüfungsordnung für den Weiterbildenden Fernstudiengang "Energiemanagement" an der Universität Koblenz-Landau
Uniprisma Ausg. 2009
(2015)
For decades, a worldwide decline of biological diversity has been reported. Landscapes are influenced by several kinds of anthropogenic disturbance. Agricultural land use, the application of fertilizers and pesticides, and the removal of corridors simplify and homogenize a landscape, whereas others, such as road construction, lead to fragmentation. Both kinds constrain habitats, reduce living space and the gene pool, hinder gene flow, and change the functional characteristics of species. Furthermore, they facilitate the introduction of alien species. On the other hand, disturbances of different temporal and spatial dimensions lead to a more diverse landscape, because they prevent competitive exclusion and create niches in which species are able to coexist.
This study focuses on the complexity of disturbance regimes and their influence on phytodiversity. It differs from other studies, which mostly select one or a few disturbance types, by including all identifiable disturbances. Data were derived from three study sites in the north of Bavaria that are subject to different land-use intensities. Two landscapes are used for agriculture and forestry, one intensively and the second rather moderately and on a small scale. The third dataset was collected on an actively used military training area. The first part of the study deals with the influence of disturbance regimes on phytodiversity, first with a focus on military disturbances, and afterwards in comparison with the agricultural landscapes. The second part examines the influence of disturbance regimes on red-listed species, the distribution of neophytes and generalist plant species, and the homogenization of the landscape. All analyses were conducted on the landscape and local scales.
A decisive role was played by the variety of disturbance types, especially in different temporal and spatial dimensions, rather than by single kinds of disturbance; this was demonstrated significantly in the military training area with its multiple and undirected disturbance regime. Homogeneous disturbance regimes, as typically found in agricultural landscapes, led to a reduced species number. On the local scale, the abiotic heterogeneity originating from recent and historical disturbances superimposed the positive effects of disturbance regimes, with dry and nutrient-poor sites showing a negative effect. Owing to low tree density and moderate treatment, species numbers were significantly higher in the forests of the training area than in the two agricultural landscapes.
Numbers of red-listed species were positively correlated with the total number of species in all three sites. However, the military training area showed a significantly higher abundance of these species across the area than the agricultural landscapes, where rare species were mostly found on marginal strips. Furthermore, the numbers of neophytes and generalist species were lower, and consequently so was homogenization.
In conclusion, the military training area is an ideal landscape from a nature conservation point of view. The moderately used agricultural area showed high species numbers and agricultural productivity. However, yield is too low to withstand either abandonment or land-use intensification.
This research examines information audit methodologies and information capturing methods for enterprise social software, which are an elementary part of the audit process. Information auditing lacks a standardized definition and methodology because the scope of the audit process is diversified and dependent on the organization undertaking the audit. The benefits of information auditing, and the potential challenges of Enterprise 2.0 that an audit can overcome, are comprehensive and provide a major incentive for managers to conduct an audit. Information asset registers, as a starting point for information auditing, do not specifically focus on social software assets. Therefore, this research project combines asset registers from different areas to create a new register suitable for the requirements of Enterprise 2.0. The necessary adaptations caused by the new character of the assets are minor. However, the case study applying the asset register for the first time reveals several problematic areas for information auditors completing the register. Rounding off the thesis, a template is developed for setting up new workspaces on enterprise social software systems with appropriate metadata, taking into account the meaningful metadata discovered in the asset register.
Modern agriculture is a dominant land use in Europe, although it has been associated with negative effects on biodiversity in agricultural landscapes. One species-rich insect group in agro-ecosystems is the Lepidoptera (moths and butterflies); however, the populations of a number of Lepidoptera species are currently declining. The aims of this thesis were to assess the amount and structure of field margins in agricultural landscapes, study the effects of realistic field margin input rates of agrochemicals (fertilizer and pesticides) on Lepidoptera, and provide information on moth pollination services.
In general, field margins are common semi-natural habitat elements in agro-ecosystems; however, data on the structure, size, and width of field margins are limited. An assessment in two German agricultural landscapes (4,000 ha each) demonstrated that many of the evaluated field margins were less than 3 m wide (Rhineland‐Palatinate: 85% of margin length; Brandenburg: 45% of margin length). In Germany, risk mitigation measures (such as buffer zones) to reduce pesticide inputs to terrestrial non-crop habitats do not have to be established by farmers next to narrow field margins. Thus, narrow field margins receive inputs of agrochemicals, especially via overspray and spray drift. These field margins were used as a development habitat for caterpillars, but the mean abundance of caterpillars was 35 - 60% lower compared with that in meadows. Caterpillars were sensitive to realistic field margin input rates of an insecticide (the pyrethroid lambda-cyhalothrin) in a field experiment as well as in laboratory experiments. Moreover, 40% fewer Hadena bicruris eggs were observed on Silene latifolia plants treated with this insecticide compared with control plants, and the flowers of these insecticide-treated plants were less likely to be pollinated by moths. In addition, realistic field margin input rates of herbicides can also affect Lepidoptera. Ranunculus acris L. plants treated with sublethal rates of a sulfonylurea herbicide were used as host plants for Mamestra brassicae L. caterpillars, which resulted in significantly lower caterpillar weights, increased time to pupation, and increased overall development time compared with caterpillars feeding on control plants. These results might have been caused by a lower nutritional value of the herbicide-treated plants or by increased concentrations of secondary metabolites involved in plant defense. Fertilizer applications slightly increased the caterpillar abundance in the field experiment. However, fertilizers reduce plant diversity in the long term and thus, most likely, also reduce caterpillar diversity.
Moths such as Noctuidae and Sphingidae have been observed to act as pollinators for numerous plant species, including a number of Orchidaceae and Caryophyllaceae. Although in temperate agro-ecosystems moths are less likely to act as the main pollinators for crops, they can pollinate non-crop plants in semi-natural habitats. Currently, the role of moths as pollinators appears to be underestimated, and long-term research focusing on ecosystems is necessary to address temporal fluctuations in their abundance and community composition.
Lepidoptera represent a diverse organism group in agricultural landscapes and fulfill essential ecosystem services, such as pollination. To better protect moths and butterflies, agrochemical inputs to (narrow) field margin habitats should be reduced, for example via risk mitigation measures and agri-environmental schemes.
Global crop production increased substantially in recent decades due to agricultural intensification and expansion and today agricultural areas occupy about 38% of Earth’s terrestrial surface - the largest use of land on the planet. However, current high-intensity agricultural practices fostered in the context of the Green Revolution led to serious consequences for the global environment. Pesticides, in particular, are highly biologically active substances that can threaten the ecological integrity of aquatic and terrestrial ecosystems. Although the global pesticide use increases steadily, our field-data based knowledge regarding exposure of non-target ecosystems such as surface waters is very restricted. Available studies have by now been limited to spatially restricted geographical areas or had rather specific objectives rendering the extrapolation to larger spatial scales questionable.
Consequently, this thesis evaluated, on the basis of four scientific publications, the exposure, effects, and regulatory implications of concentrations of particularly toxic insecticides detected in agricultural surface waters worldwide. FOCUS exposure modelling was used to characterise the highly specific insecticide exposure patterns and to analyse the resulting implications for both monitoring and risk assessment (publication I). Based on more than 200,000 scientific database entries, 838 peer-reviewed studies finally included, and more than 2,500 sites in 73 countries, the risks of agricultural insecticides to global surface waters were analysed by means of a comprehensive meta-analysis (publication II). This meta-analysis evaluated whether insecticide field concentrations exceed legally accepted regulatory threshold levels (RTLs) derived from official EU and US pesticide registration documents and, among other things, how risks depend on insecticide development over time and on the stringency of environmental regulation. In addition, an in-depth analysis of the current EU pesticide regulations provided insights into the level of protection and field relevance of highly elaborated environmental regulatory risk assessment schemes (publications III and IV).
The results of this thesis show that insecticide surface water exposure is characterized by infrequent and highly transient concentration peaks of high ecotoxicological relevance. We thus argue in publication I that sampling at regular intervals is inadequate for detecting insecticide surface water concentrations and that traditional risk assessment concepts based on all insecticide concentrations, including non-detects, lead to severely biased results and critical underestimations of risks. Based on these considerations, publication II demonstrates that out of 11,300 measured insecticide concentrations (MICs; i.e., those actually detected and quantified), 52.4% (5,915 cases) exceeded the RTL for either water (RTLSW) or sediments. This indicates a substantial risk for the biological integrity of global water resources, as additional analyses of pesticide effects in the field clearly show that regional aquatic biodiversity is reduced by approximately 30% at pesticide concentrations equalling the RTLs. In addition, publication II shows that there is a complete lack of scientific monitoring data for ~90% of global cropland and that both the actual insecticide contamination of surface waters and the resulting ecological risks are most likely even greater, owing, for example, to inadequate sampling methods employed in the studies and the common occurrence of pesticide mixtures. A linear model analysis identified that RTLSW exceedances depend on catchment size, sampling regime, sampling date, insecticide substance class, and the stringency of countries' environmental regulations, as well as on the interactions of these factors. Importantly, the risks are significantly higher for newer-generation insecticides (i.e., pyrethroids) and are high even in countries with stringent environmental regulations. Regarding the latter, an analysis of the EU pesticide regulations revealed critical deficiencies and a lack of protectiveness and field relevance in the current, presumably highly elaborated FOCUS exposure assessment (publication IV) and overall risk assessment schemes (publication III). Based on these findings, essential risk assessment amendments are proposed.
In essence, this thesis analyses the agriculture–environment linkages for pesticides at the global scale and it thereby contributes to a new research frontier in global ecotoxicology. The overall findings substantiate that agricultural insecticides are potential key drivers for the global freshwater biodiversity crisis and that the current regulatory risk assessment approaches for highly toxic anthropogenic chemicals fail to protect the global environment. This thesis provides an integrated view on the environmental side effects of global high-intensity agriculture and alerts that beside worldwide improvements to current pesticide regulations and agricultural pesticide application practices, the fundamental reformation of conventional agricultural systems is urgently needed to meet the twin challenges of providing sufficient food for a growing human population without destroying the ecological integrity of global ecosystems essential to human existence.
Campuszeitung Ausg. 2/2011
(2015)
Topics: Methods Centre at the Koblenz campus
"Macbeth" in a whole new way
Founders' Office opened
KOpEE congress
20 years of ZFUW
10 years of Semantic Web research
Lahnstein school students on campus
Westpoint meets the university in Koblenz
MINT action day 2011 on campus
Kickoff of the Women Career Center
Graduation ceremonies
PTHV and university strengthen their cooperation
Aquatic macrophytes can contribute to the retention of organic contaminants in streams, but knowledge of the dynamics and the interaction of the determining processes is very limited. The objective of the present study was thus to assess how aquatic macrophytes influence the distribution and fate of organic contaminants in small vegetated streams. In a first study performed in stream mesocosms, the peak reductions of five compounds were significantly higher in four vegetated stream mesocosms than in a stream mesocosm without vegetation. Compound-specific sorption to macrophytes was determined; the mass retention in the vegetated streams, however, did not explain the relationship between the mitigation of contaminant peaks and macrophyte coverage. A subsequent mesocosm study revealed that the mitigation of peak concentrations in the stream mesocosms was governed by two fundamentally different processes: dispersion and sorption. Again, the reductions of the peak concentrations of three different compounds were of the same order of magnitude in a sparsely and a densely vegetated stream mesocosm, respectively, but higher than in an unvegetated stream mesocosm. The peak reduction in the sparsely vegetated stream mesocosm was found to be fostered by longitudinal dispersion as a result of the spatial distribution of the macrophytes in the aqueous phase. The peak reduction attributable to longitudinal dispersion was, however, smaller in the densely vegetated stream mesocosm, which was compensated by compound-specific but time-limited and reversible sorption to macrophytes. The observations on the reversibility of sorption processes were subsequently confirmed by laboratory experiments. These experiments revealed that sorption to macrophytes leads to compound-specific elimination from the aqueous phase while transient contaminant peaks are present in streams. These sorption processes, however, were found to be fully reversible, resulting in the release of the previously adsorbed compounds once the concentrations in the aqueous phase start to decrease. Nevertheless, the results of the present thesis demonstrate that the processes governing the mitigation of contaminant loads in streams are fundamentally different from those already described for non-flowing systems. In addition, the present thesis provides knowledge on how the interaction of macrophyte-induced processes in streams contributes to mitigating loads of organic contaminants and the related risk for aquatic environments.
Campuszeitung Ausg. 2/2014
(2015)
Engineered nanoparticles are emerging pollutants. Their increasing use in commercial products suggests a similar increase of their concentrations in the environment. Studying the fate of engineered colloids in the environment is highly challenging due to the complexity of their possible interactions with the main actors present in aquatic systems. Solution chemistry is one of the most central aspects. In particular, the interactions with dissolved organic matter (DOM) and with natural colloids are still weakly understood.
The aim of this work was to further develop the dedicated analytical methods required for investigating the fate of engineered colloids in environmental media as influenced by DOM. Reviewing the literature on DOM interactions with inorganic colloids revealed that a systematic characterization of both colloids and DOM, although essential, is lacking in most studies, and that further investigations of the fractionation of DOM on the surface of engineered colloids are needed. Another knowledge gap concerns the effects of DOM on the dynamic structure of colloid agglomerates. For this question, analytical techniques dedicated to the characterization of agglomerates in environmental media at low concentrations are required. Such techniques should remain accurate at low concentrations, be specific, be widely matrix-independent, and be free of artefacts due to sample preparation. Unfortunately, none of the currently available techniques (microscopy, light scattering based methods, separation techniques, etc.) fulfils these requirements.
However, a compromise was found with hydrodynamic chromatography coupled to inductively coupled plasma mass spectrometry (HDC-ICP-MS). This method has the potential to size inorganic particles in complex media at concentrations below the ppb range and is element-specific; however, its limitations had not been systematically explored. In this work, the potential of this method has been explored further. The simple size separation mechanism ensures high flexibility of the elution parameters, and universal calibration can be applied accurately to particles of different compositions and surface chemistries. The most important limitations of the method are its low size resolution and the effect of particle shape on the retention factor. The implementation of HDC coupled to single-particle ICP-MS (HDC-SP-ICP-MS) offers new possibilities for recognizing particle shape and hence for differentiating between primary particles and homoagglomerates. This coupling technique is therefore highly attractive for monitoring the effects of DOM on the stability of colloids in complex media. The versatility of HDC-ICP-MS is demonstrated by its successful application to diverse samples. In particular, it was used to investigate the stability of citrate-stabilized silver colloids in reconstituted natural water in the presence of different types of natural organic matter. These particles were stable for at least one hour, independently of the type of DOM used and of the pH, in accordance with a coauthored publication addressing the stability of silver colloids in the River Rhine. Direct monitoring of DOM adsorption on colloids was not possible using UV and fluorescence detectors. Preliminary attempts to investigate the adsorption mechanism of humic acids on silver colloids using fluorescence spectroscopy suggest that fluorescent molecules are not adsorbed on silver particles. Several solutions for overcoming the difficulties encountered in the analysis of DOM interactions are proposed, and the numerous perspectives offered by further developments and applications of HDC-(SP-)ICP-MS in environmental sciences are discussed in detail.
Within this thesis, a methodology is to be developed that translates English keyword-based queries into SPARQL and rates the results. From all generated SPARQL queries, the most relevant ones are to be identified and a favorite determined. The outcome is to be assessed in a user evaluation.
Traditional Driver Assistance Systems (DAS), such as Lane Departure Warning Systems or the well-known Electronic Stability Program, have in common that their system and software architecture is static. This means that neither the number and topology of Electronic Control Units (ECUs) nor the presence and functionality of software modules change after the vehicles leave the factory.
However, some future DAS do face changes at runtime. This is true, for example, for truck and trailer DAS, as their hardware components and software entities are spread over both parts of the combination. These new requirements cannot be met by state-of-the-art approaches to automotive software systems. Instead, a different technique for designing such Distributed Driver Assistance Systems (DDAS) needs to be developed. The main contribution of this thesis is the development of a novel software and system architecture for dynamically changing DAS, using the example of driving assistance for truck and trailer. This architecture has to be able to autonomously detect and handle changes within the topology. In order to do so, the system decides which degree of assistance and which types of HMI can be offered every time a trailer is connected or disconnected. To this end, an analysis of the available software and hardware components, a determination of the possible assistance functionality, and a re-configuration of the system take place. Such adaptation can be provided by the principles of Service-oriented Architecture (SOA). In this architectural style, all functionality is encapsulated in self-contained units, so-called Services. These Services offer their functionality through well-defined interfaces whose behavior is described in contracts. Using these Services, large-scale applications can be built and adapted at runtime. This thesis describes the research conducted to achieve these goals by introducing Service-oriented Architectures into the automotive domain. SOA deals with the high degree of distribution, the demand for re-usability, and the heterogeneity of the required components.
It also applies automatic re-configuration in the event of a system change. Instead of adapting one of the available frameworks to this scenario, the main principles of Service-orientation are picked up and tailored. This leads to the development of the Service-oriented Driver Assistance (SODA) framework, which implements the benefits of Service-orientation while ensuring compatibility and compliance with automotive requirements, best practices, and standards. Within this thesis, several state-of-the-art Service-oriented frameworks are analyzed and compared. Furthermore, the SODA framework and all its aspects relevant to the automotive software domain are described in detail. These aspects include a well-defined reference model that introduces and relates terms and concepts and defines an architectural blueprint. Some of the modules of this blueprint, such as the re-configuration module and the Communication Model, are presented in full detail. In order to demonstrate the framework's compliance with state-of-the-art automotive software systems, a development process respecting today's best practices in automotive design procedures, as well as the integration of SODA into the AUTOSAR standard, are discussed. Finally, the SODA framework is used to build a full-scale demonstrator in order to evaluate its performance and efficiency.
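To make the Service notion concrete, here is a toy, hypothetical sketch in Python (not the SODA framework's actual interfaces): services declare provided and required capabilities, and a registry re-resolves the available assistance functions whenever the topology changes, e.g. when a trailer connects or disconnects.

```python
from dataclasses import dataclass, field

@dataclass
class Service:
    name: str
    provides: set                                   # capabilities offered
    requires: set = field(default_factory=set)      # capabilities depended on

class Registry:
    """Toy registry that re-resolves assistance functions on topology change."""
    def __init__(self):
        self.services = {}

    def connect(self, service):        # e.g. a trailer ECU announcing services
        self.services[service.name] = service
        return self.resolve()

    def disconnect(self, name):        # e.g. the trailer being unhitched
        self.services.pop(name, None)
        return self.resolve()

    def resolve(self):
        """Return the services whose requirements are currently satisfied."""
        available = set().union(*(s.provides for s in self.services.values()),
                                set())
        return [s.name for s in self.services.values()
                if s.requires <= available]

registry = Registry()
registry.connect(Service("TruckCamera", {"rear_view"}))
# Trailer assistance becomes available only while the camera is present:
print(registry.connect(Service("TrailerAssist", {"reverse_guidance"},
                               {"rear_view"})))
```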
Uniprisma Ausg. 2006
(2015)
This thesis deals with three-dimensional image warping for diffuse and reflective surfaces. The warping method for the reflective case has only existed since 2014. With this new algorithm, artifacts appear as soon as an image is to be computed for an alternative viewing angle onto a very uneven surface.
This thesis covers the path from a ray tracer that generates the input textures, via the warping method for both kinds of surfaces, to the optimization of the reflective warping method. Finally, the results of the optimization are evaluated and placed in the context of the current and future state of the art.
The present work uses a literature review of academic articles to identify the prevailing thematic characteristics and focal points of the topic of "BMI" and "GMI" worldwide. The identified relationships and connections are visualized and localized in order to provide a global view of the topic. Among other things, the articles included in the final set are checked for a correlation between BMI and controlling or management. As a final step, possible research gaps are derived.
Ordnung für die Eignungsprüfung Bildende Kunst der Universität Koblenz-Landau, Campus Landau
Beitragsordnung der Studierendenschaft der Universität Koblenz-Landau, Campus Landau
Vierte Satzung zur Änderung der Satzung der Universität Koblenz-Landau über das Auswahlverfahren in zulassungsbeschränkten Studiengängen
Ordnung zur Änderung der Einschreibeordnung für die Universität Koblenz-Landau
Elfte Ordnung zur Änderung der Prüfungsordnung für die Prüfung im lehramtsbezogenen Bachelorstudiengang an der Universität Koblenz-Landau
Zehnte Ordnung zur Änderung der Prüfungsordnung für die Prüfung in den Masterstudiengängen für das Lehramt an Grundschulen, das Lehramt an Realschulen plus, das Lehramt an Förderschulen sowie das Lehramt an Gymnasien an der Universität Koblenz-Landau
Neunte Ordnung zur Änderung der Ordnung für die Prüfung im lehramtsbezogenen Zertifikatsstudiengang (Erweiterungsprüfung) an der Universität Koblenz-Landau
Achte Ordnung zur Änderung der Prüfungsordnung für die Prüfung im Zwei-Fach-Bachelorstudiengang an der Universität Koblenz-Landau
Erste Ordnung zur Änderung der Gemeinsamen Prüfungsordnung für die Bachelor- und Masterstudiengänge des Fachbereichs Informatik an der Universität Koblenz-Landau
Animations can be used in instructional contexts to convey knowledge about subject matter involving processes or sequences. Dynamic subject matter can thus be depicted explicitly and does not have to be mentally constructed by the learner, but merely followed along the animation. This should have a positive effect on knowledge acquisition. At the same time, animations, with their distinctive property of depicting temporal sequences, place particular demands on the learner. The human information processing system is subject to certain limitations regarding the perception of speeds. Speeds that are too fast or too slow, for example, are difficult to perceive and accordingly cannot be processed cognitively. The resulting objective of this work was a systematic investigation of the effect of different presentation speeds on the perception and comprehension of a dynamic subject matter by means of an animation.
To answer the research questions, four experimental studies were conducted. The pilot study aimed to evaluate both the learning material and the developed knowledge test. Study 1 addressed the influence of presentation speed on knowledge acquisition when learning with an interactive animation.
Studies 2 and 3 examined the influence of different orderings of speeds on knowledge acquisition. The aim was a systematic assessment of the perceptual and cognitive processing of dynamic information at two different speeds by means of eye tracking (Study 2) and repeated testing of knowledge acquisition between the individual learning phases (Study 3).
The results of the studies suggest that at slow speed, knowledge about events was acquired at a subordinate temporal level, and that the faster an animation was viewed, the more knowledge was proportionally acquired at a superordinate temporal level (Study 1). However, the results do not allow definite statements about the influence of speed on knowledge acquisition at different temporal hierarchy levels. With regard to the learning benefits of different ways of sequencing speeds, the results were likewise not clear-cut. Based on the analysis of the eye movement data, however, it appears that the order "slow-fast" accommodates the learners' conditions better than the order "fast-slow".
Nutzung von Big Data im Marketing : theoretische Grundlagen, Anwendungsfelder und Best-Practices
(2015)
The increasing digitalization of everyday life and the associated omnipresent generation of data offer companies, and marketing departments in particular, the chance to obtain information about their customers and products in previously unknown abundance. Extracting such information from huge volumes of data, made possible by new technologies, has become established under the term big data.
The present thesis analyzes this development with regard to its potential for the business discipline of marketing. To this end, the theoretical foundations of the use of big data in marketing are identified, and fields of application and best-practice solutions are derived from them. The investigation is based on a literature analysis on the topic of big data marketing, which includes various studies and surveys as well as expert opinions and forecasts. The literature is first analyzed for the theoretical foundations of the big data construct.
Subsequently, the suitability of big data solutions for use in companies is examined before the application in the area of marketing is specified and analyzed. It was found that, based on the theoretical aspects of big data, there is a strong case for its use in marketing. This is characterized above all by detailed information about customers' behavioral patterns and their purchase decisions. Furthermore, potential fields of application were identified, particularly in the areas of customer orientation and market research. With regard to best-practice solutions, a rough guideline for integrating big data into the organizational structure could be developed. Finally, it was concluded that big data is highly relevant to marketing and will decisively shape it in the future.
Political and societal polarization is an interesting phenomenon, about whose effects many different, partly contradictory, theories exist.
In the literature, polarization is measured with different methods. The present work gives an overview of existing polarization measures, and two novel measures from the field of spectral graph theory are introduced. Subsequently, the known and the newly developed measures are applied to the LiquidFeedback dataset of the Piratenpartei Deutschland. The result is that the measures partly come to different conclusions, because not all measures measure the same thing. To understand what the individual measures express, essential properties of polarization measures are worked out, and for each measure it is shown which properties it satisfies. The polarization measures discussed refer to the development of polarization between users of the LiquidFeedback system. When looking at individual persons and votes, it was noticeable, among other things, that polarizing persons hold more power through delegations than the remaining persons, and that polarized proposals are implemented about twice as often.
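As one illustration of a spectral approach (a stand-in, not one of the thesis' two novel measures), the Fiedler vector of the graph Laplacian can split an interaction graph into two camps, while the algebraic connectivity indicates how weakly the camps are connected; small values suggest stronger polarization.

```python
import numpy as np

def fiedler_polarization(adjacency):
    """Return (algebraic connectivity, camp labels) for a weighted graph."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian
    eigenvalues, eigenvectors = np.linalg.eigh(L)
    fiedler_value = eigenvalues[1]          # second-smallest eigenvalue
    camps = eigenvectors[:, 1] >= 0         # sign pattern of the Fiedler vector
    return fiedler_value, camps

# Two triangles joined by a single weak edge: low connectivity, clear camps.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
print(fiedler_polarization(A))
```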
Immersion into narrative worlds - theoretical and empirical approaches to audience experience
(2015)
The present dissertation addresses the phenomenon of the experience of receiving audiovisual narrative entertainment media content. This is first situated within reception and media effects research and specified for the further procedure: Transportation and Narrative Engagement currently represent the two most important concepts in media psychology regarding the use and effects of stories.
Subsequently, three research questions are addressed. Until now, researchers faced the problem of manipulating the reception experience. Therefore, two procedures were proposed in the present work and tested in four experimental studies. The use of reviews proved suitable for economically manipulating the reception experience for all narrative entertainment texts. Furthermore, there has so far been no established procedure for measuring the reception experience during reception.
In this work, a procedure was developed from a combination of real time response measurement (RTR), secondary task reaction times (STRT), and the recording of eye blink rate. RTR in particular was able to capture the emotional processes occurring in connection with the reception experience. The concern that measurement methods applied during reception might prevent the reception experience was largely dispelled in a further experimental study. Finally, the process of summarizing the reception experience into a post-receptive judgment was addressed. After the development of a framework model of the answering of post-receptive scales, a further study examined the importance of various time-course parameters for the post-receptive judgment. Four selected parameters together can explain the post-receptive judgment better than the mean of the time course. The work closes with a discussion in which, among other things, the dynamic and the post-receptive measurement of the reception experience are related to each other and critically appraised with regard to their significance.
Today, augmented reality is becoming more and more important in several areas, such as industrial sectors, medicine, and tourism. This gain in importance is easily explained by its powerful extension of real-world content: augmented reality has become a way to explain and enhance real-world information. Yet, to create a system which can enhance a scene with additional information, the relation between the system and the real world must be known. A commonly used method to establish this relationship is optical tracking, in which the system calculates its relation to the real world from camera images. To do so, a known reference is needed in the scene to serve as an orientation. Today, this is mostly a 2D marker or a 2D texture placed in the real-world scene to serve as a reference. This, however, is an intrusion into the scene, which is why it is desirable for the system to work without such an additional aid. A strategy that avoids manipulating the scene is object tracking, in which any object from the scene can be used as a reference for the system. As an object is far more complex than a marker, it is harder for the system to establish its relationship with the real world. For this reason, most methods for 3D object tracking simplify the problem by not using the whole object as a reference. The focus of this thesis is to research how a whole object can be used as a reference, such that either the system or the camera can be moved through any 360-degree angle around the object without losing the relation to the real world. The augmented reality framework VisionLib is used as a basis. Extensions to this system for 360-degree tracking are implemented in different ways, analyzed within the scope of this work, and compared with one another. The best results were achieved by improving the reinitialization process. With this extension, current camera images of the scene are given to the system; with the help of these images, the system can recompute the relation to the real world faster in case the relation is lost.
Code package managers like Cabal track dependencies between packages. But packages rarely use all of the functionality that their dependencies provide. This leads to unnecessary compilation of unused parts and to speculative conflicts between package versions where there are no actual conflicts. In two case studies we show how relevant these two problems are. We then describe how they could be avoided by tracking dependencies not between packages but between individual code fragments.
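The thesis targets Haskell packages managed by Cabal; purely as an illustration of fragment-level dependency tracking, the following Python sketch attributes imports to the top-level functions that use them instead of to the whole module. The package lookup table is a stub for real dependency metadata.

```python
import ast

# Hypothetical stub: map a module name to the package that defines it.
DEFINING_PACKAGE = {"numpy": "numpy", "pandas": "pandas", "os": "<stdlib>"}

def fragment_dependencies(source):
    """Extract per-fragment dependencies: which packages each top-level
    function actually imports from, rather than whole-module dependencies."""
    tree = ast.parse(source)
    deps = {}
    for node in tree.body:
        if isinstance(node, ast.FunctionDef):
            used = set()
            for sub in ast.walk(node):
                if isinstance(sub, ast.Import):
                    used.update(a.name.split(".")[0] for a in sub.names)
                elif isinstance(sub, ast.ImportFrom) and sub.module:
                    used.add(sub.module.split(".")[0])
            deps[node.name] = {DEFINING_PACKAGE.get(m, m) for m in used}
    return deps

print(fragment_dependencies(
    "def f():\n    import numpy\n\ndef g():\n    import os\n"))
```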
Einfluss eines Ausrichtungswerkzeugs auf die Bedienbarkeit in unbeaufsichtigten Eyetrackingsystemen
(2015)
Eye gaze trackers are devices that can estimate the direction of a person's gaze. Besides usability testing, eye tracking also allows persons with reduced limb mobility to control and interact with a computer. The quality and availability of eye tracking equipment have been increasing while costs have been decreasing. This development is opening up new markets that use eye tracking as an additional input dimension for a variety of applications. Up to now, eye tracking has been supervised by qualified experts, who ensured that important conditions such as the position in front of the eye tracking device, the calibration, and the lighting were maintained during use.
This thesis examines an adjustment tool which helps the user take up a position in front of the eye tracker and keep this position during the experiment. Furthermore, the accuracy under head movement was analysed. In this experiment, a remote eye gaze tracker was used to control a game character in the video game 'Schau Genau!'. The goal was to determine whether the game is playable without the barrier of adjustment and calibration. The results show that positioning in front of an eye tracker is not a problem; keeping this position is. Small changes of the head position after the calibration process lead to a loss of accuracy. Omitting one's own calibration and using someone else's shows far larger deviations. Additional head movement increases the error rate and makes controlling more difficult.
The present work introduces a rigid-body physics engine, focusing on GPU-based collision detection. The increasing performance and accessibility of modern graphics cards mean that they can also be used for algorithms beyond image synthesis. This advantage is exploited to implement an efficient, particle-based collision detection. The performance differences between CPU and GPU are presented using a test environment.
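Particle-based collision detection typically hashes each particle into a uniform grid cell so that collision partners can only occur in neighbouring cells. The following sketch shows that broad-phase scheme in plain Python for readability; the thesis runs the equivalent data-parallel version on the GPU.

```python
# Hedged sketch of a uniform-grid broad phase for particle collisions.
from collections import defaultdict
import itertools

def collision_pairs(positions, radius):
    """Return pairs of particles closer than 2*radius. Each particle is
    hashed into a cell of edge 2*radius, so partners can only live in
    the 27 neighbouring cells."""
    cell = 2.0 * radius
    grid = defaultdict(list)
    for i, (x, y, z) in enumerate(positions):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(i)

    pairs = set()
    for (cx, cy, cz), members in grid.items():
        for dx, dy, dz in itertools.product((-1, 0, 1), repeat=3):
            for i in members:
                for j in grid.get((cx + dx, cy + dy, cz + dz), ()):
                    if i < j:
                        ax, ay, az = positions[i]
                        bx, by, bz = positions[j]
                        d2 = (ax - bx)**2 + (ay - by)**2 + (az - bz)**2
                        if d2 < (2 * radius)**2:   # narrow-phase sphere test
                            pairs.add((i, j))
    return pairs

print(collision_pairs([(0.0, 0.0, 0.0), (0.05, 0.0, 0.0), (1.0, 1.0, 1.0)], 0.05))
# {(0, 1)}: only the two nearby particles collide
```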
Campuszeitung, Issue 1/2012
(2015)
Topics: Focus areas of the research initiative (the state's research initiative funds four focus areas)
Student at the University in Koblenz wins special prize
Studying moves you ("Studieren bewegt")
A wanderer between cultures
Alphajump eases career entry for students
Apps für Deutschland
QR codes on campus, and more
Placing questions before or after the material constitutes different reading situations, and readers may adapt to these situations by applying appropriate reading strategies. The reading strategies induced by question location have been intensively explored in the context of text comprehension. (1) However, it is still unclear whether text plays the same role as pictures when readers apply different reading strategies. To answer this research question, three reading strategies are experimentally manipulated by displaying the question before or after materials that blend text and pictures: (a) unguided processing of text and pictures without the question; (b) information gathering to answer the question after prior experience with text and pictures; (c) comprehending text and pictures to solve the question with prior knowledge of the question. (2) Moreover, it is open whether readers prefer text or pictures when the instructed questions differ in difficulty. (3) Furthermore, it is still uncertain whether students from the higher school tier (Gymnasium) rely more on text or on pictures than students from the lower school tier (Realschule). (4) Finally, it is rarely examined whether higher graders are better able to apply reading strategies in text processing and picture processing than lower graders.
Two experiments were conducted to investigate the use of text and pictures from the perspectives of task orientation, question difficulty, school tier, and grade. In a 2x2(x2x2x2) mixed design using eye tracking, participants were recruited from grade 5 (N = 72) and grade 8 (N = 72). In Experiment 1, thirty-six 5th graders came from the higher tier (Gymnasium) and thirty-six from the lower tier (Realschule); in Experiment 2, thirty-six 8th graders came from the higher tier and thirty-six from the lower tier. Participants were asked to comprehend materials combining text and pictures and to answer questions, while a Tobii XL60 eye tracker recorded their eye movements and answers. Eye-tracking indicators such as accumulated fixation duration, time to first fixation, and transitions between Areas of Interest were analyzed and reported. The results reveal that students process text differently from pictures when they follow different reading strategies. (1) Consistent with Hypothesis 1, students mainly use text to construct their mental model in unguided spontaneous processing of text and pictures; they seem to rely mainly on the pictures as external representations when answering questions after prior experience with the material; and they attend to both text and pictures when questions are presented before the material. (2) Inconsistent with Hypothesis 2, students attend more to both text and pictures as question difficulty increases, but the increase in focus on pictures exceeds that on text for difficult questions. (3) Contrary to Hypothesis 3, higher tier students did not differ from lower tier students in text processing; instead, students from the higher tier attended more to pictures than students from the lower tier. (4) Contrary to Hypothesis 4, 8th graders outperform 5th graders mainly in text processing, with only a subtle difference between the grades in picture processing.
To sum up, text processing differs from picture processing when different reading strategies are applied. In line with the Integrative Model of Text and Picture Comprehension by Schnotz (2014), text is likely to play the major part in guiding the processing of meaning in general reading, whereas pictures serve as external representations for information retrieval in selective reading. When a question is difficult, pictures are emphasized because of their advantage in visualizing the internal structure of information. Compared to lower tier students (poorer problem solvers), higher tier students (good problem solvers) are more capable of comprehending pictures rather than text, and 8th graders are more efficient than 5th graders in text processing rather than picture processing. This suggests that, in designing school curricula, more attention should be paid to students' competence in picture comprehension and text-picture integration.
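To make the reported indicators concrete, the following sketch computes accumulated fixation duration, time to first fixation, and AOI transitions from a fixation list. Field names and data are illustrative; they are not the thesis's Tobii export format.

```python
# Hedged sketch of common eye-tracking indicators over Areas of Interest.
def aoi_metrics(fixations, aoi):
    """fixations: list of (start_ms, duration_ms, aoi_label), ordered by
    time. Returns accumulated fixation duration and time to first
    fixation for one AOI."""
    total = sum(d for _, d, a in fixations if a == aoi)
    first = next((t for t, _, a in fixations if a == aoi), None)
    return {"accumulated_duration_ms": total, "time_to_first_ms": first}

def aoi_transitions(fixations, src, dst):
    """Number of gaze transitions from AOI src to AOI dst."""
    labels = [a for _, _, a in fixations]
    return sum(1 for p, n in zip(labels, labels[1:]) if p == src and n == dst)

fix = [(0, 250, "text"), (250, 180, "picture"), (430, 300, "text")]
print(aoi_metrics(fix, "text"))                 # 550 ms accumulated, first at 0 ms
print(aoi_transitions(fix, "text", "picture"))  # 1
```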
Software systems are often developed as a set of variants to meet diverse requirements. Two common approaches to this are "clone-and-own" and software product lines. Both approaches have advantages and disadvantages. In previous work, we and our collaborators proposed an idea that combines both approaches to manage variants, similarities, and cloning by using a virtual platform and cloning-related operators.
In this thesis, we present an approach for aggregating essential metadata to enable a propagate operator, which implements a form of change propagation. For this we have developed a system to annotate code similarities which were extracted throughout the history of a software repository. The annotations express similarity maintenance tasks, which can then either be executed automatically by propagate or have to be performed manually by the user. In this work we outline the automated metadata extraction process and the system for annotating similarities; we explain how the implemented system can be integrated into the workflow of an existing version control system (Git); and, finally, we present a case study using the 101haskell corpus of variants.
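As a rough illustration of a propagate step, the following sketch routes a change to the annotated clones of the edited fragment, splitting the work into automatic and manual maintenance tasks. The annotation model is heavily simplified and all names are invented; the real system operates on a Git history.

```python
# Hedged sketch of a propagate operation over annotated similarities.
from dataclasses import dataclass

@dataclass
class Similarity:
    source: str      # file:fragment in the variant that changed
    targets: list    # similar fragments in other variants
    auto: bool       # True: propagate may patch automatically

def propagate(change, similarities, apply_patch):
    """Route a change to all annotated clones of the edited fragment.
    Returns the fragments the user still has to update by hand."""
    manual = []
    for sim in similarities:
        if sim.source != change["fragment"]:
            continue
        for target in sim.targets:
            if sim.auto:
                apply_patch(target, change["diff"])   # automatic task
            else:
                manual.append(target)                 # manual task
    return manual

sims = [Similarity("variantA/Parser.hs:parse",
                   ["variantB/Parser.hs:parse"], auto=True)]
todo = propagate({"fragment": "variantA/Parser.hs:parse", "diff": "..."},
                 sims, apply_patch=lambda t, d: print("patched", t))
print("manual follow-up:", todo)
```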
Mass Business Process Management
(2015)
This dissertation answers the research question of which fundamentally suitable approaches and which necessary information technologies are to be considered for managing business processes in large volumes (Mass Business Process Management, MBPM) in service companies. It is shown that executing mass processes requires a dedicated approach that uses methods from the manufacturing industry. The research aim, to develop an MBPM approach for service companies, was pursued using the Design Science Research approach and is explained in this dissertation in consecutive steps. For the development of the MBPM approach, a longitudinal in-depth case study was conducted with a business process outsourcing provider to gain insights from its approach. Outsourcing providers have to produce their services very efficiently and effectively, otherwise they cannot offer their products at favorable conditions. Over the ten-year observation period, the provider's factory-oriented approach proved suitable for executing mass processes of the highest quality, at constantly decreasing prices and with ever fewer people.
The assumed need for research on MBPM was verified through an extensive literature review based on the journal rating VHB-Jourqual and further literature sources. Since many approaches for introducing BPM were found, a selection of BPM approaches was analyzed to gain further insights for the development of the MBPM approach. Based on the analysis and comparison of the different BPM approaches, as well as the comparison with the approach of the process outsourcing provider, it was found that BPM and MBPM differ in many respects. MBPM has a strong operational focus and needs intensive IT support. The operational focus mainly shows in the operative control of processes and people, and in the correspondingly high demands on process transparency, which is achieved through detailed monitoring, fine-grained process measurements, and timely reporting. Information technology is needed, for example, to conduct process monitoring in a timely manner, but also to give internal and external stakeholders the desired overview of the current workload and of the invoicing of services.
Contrary to the approach of the process outsourcing provider, it could also be shown that change management can positively influence the implementation, the continuous operation, and the constant change associated with MBPM.
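As an illustration of the kind of fine-grained process measurement MBPM depends on, the following sketch derives per-case cycle times and a mean from a case-level event log. Data and field names are invented for illustration.

```python
# Hedged sketch: cycle-time measurement from a case-level event log.
from datetime import datetime

log = [  # (case_id, event, timestamp)
    ("c1", "start", "2015-03-02 09:00"), ("c1", "end", "2015-03-02 09:40"),
    ("c2", "start", "2015-03-02 09:10"), ("c2", "end", "2015-03-02 10:30"),
]

def cycle_times(log):
    """Minutes from 'start' to 'end' per case."""
    ts = {}
    for case, event, stamp in log:
        ts.setdefault(case, {})[event] = datetime.strptime(stamp, "%Y-%m-%d %H:%M")
    return {c: (e["end"] - e["start"]).total_seconds() / 60 for c, e in ts.items()}

ct = cycle_times(log)
print(ct)                                              # {'c1': 40.0, 'c2': 80.0}
print("mean cycle time:", sum(ct.values()) / len(ct), "min")
```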
The Real-Time Systems group at the University of Koblenz has been working on autonomous and assisted driving for several years. Multi-segment vehicles pose a particular challenge in this context, as steering them while reversing is very demanding for the driver. Electronic driver assistance systems can be used to enable precise maneuvers. Several prototypes have emerged from previous work, but none of them offers a suitable solution for modern two-axle trailers. In this thesis, a prototypical driver assistance system was developed, although further research and development work is needed to make the system roadworthy.
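As a rough illustration of the underlying control problem, the following sketch steps the classic single-axle tractor-trailer kinematics forward in time. The thesis targets two-axle trailers, which require a richer model; this is only the simplified textbook case, with invented parameter values.

```python
# Hedged sketch: simplified single-axle tractor-trailer kinematics.
import math

def step(state, v, steer, L=2.5, d=4.0, dt=0.05):
    """One Euler step. state = (x, y, theta, psi): tractor position and
    heading plus trailer heading; v = velocity (negative when reversing),
    steer = front wheel angle, L = wheelbase, d = hitch-to-axle length."""
    x, y, theta, psi = state
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += v / L * math.tan(steer) * dt
    psi += v / d * math.sin(theta - psi) * dt   # trailer follows the hitch
    return (x, y, theta, psi)

s = (0.0, 0.0, 0.0, 0.0)
for _ in range(100):                 # reverse for 5 s with slight steering
    s = step(s, v=-1.0, steer=0.1)
print("trailer heading after reversing:", round(s[3], 3), "rad")
```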
Twelfth ordinance amending the examination regulations for the examination in the teacher-training Bachelor's program at the Universität Koblenz-Landau
Eleventh ordinance amending the examination regulations for the examinations in the Master's programs for teaching at primary schools (Grundschulen), at Realschulen plus, at special-needs schools (Förderschulen), and at Gymnasien at the Universität Koblenz-Landau
Tenth ordinance amending the regulations for the examination in the teacher-training certificate program (extension examination) at the Universität Koblenz-Landau
Ninth ordinance amending the examination regulations for the examination in the two-subject Bachelor's program at the Universität Koblenz-Landau
Fourth ordinance amending the joint examination regulations for students of the Bachelor's and Master's programs "Kulturwissenschaft" of Department 2: Philology / Cultural Studies at the Universität Koblenz-Landau
Third ordinance amending the regulations for the examination in the Bachelor's program Umweltwissenschaften and in the Master's programs Umweltwissenschaften and Ecotoxicology at the Universität Koblenz-Landau, Campus Landau
Ordinance amending the enrolment regulations of the Universität Koblenz-Landau
Habilitation regulations of Department 5: Educational Sciences of the Universität Koblenz-Landau
In this work, a framework is developed and used to create an evaluation scheme for text-processing tools. The scheme follows a model-dependent software evaluation approach, where the model-dependent part is the text-processing process derived from the Conceptual Analysis Process of the GLODERS project. As input data, a German court document is used that covers two incidents of extortion racketeering from 2011 and 2012. The evaluation of six different tools shows that one tool produces very good results on the given dataset when compared to manual analysis: it identifies and visualizes relations between concepts without any additional manual work. Other tools also deliver good results with minor drawbacks. The biggest drawback of some tools is the lack of models for the German language; they can perform automated tasks only on English documents. Nonetheless, some tools can be extended with custom code, which allows users with development experience to apply additional methods.
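A sketch of how such a tool-versus-manual comparison can be scored, assuming extracted relations are represented as subject-predicate-object triples. The example relations are invented and not taken from the court document.

```python
# Hedged sketch: scoring extracted relations against a manual gold standard.
def precision_recall(extracted, gold):
    """Both arguments are sets of relation triples."""
    tp = len(extracted & gold)                       # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

gold = {("suspect_A", "extorts", "owner_B"), ("owner_B", "pays", "suspect_A")}
tool = {("suspect_A", "extorts", "owner_B"), ("suspect_A", "knows", "owner_B")}
print(precision_recall(tool, gold))   # (0.5, 0.5)
```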
A systematic literature review is used to cover the most important aspects of the crowdsourcing phenomenon. Since the overall set of research questions is relatively broad, the thesis focuses on the following questions: What exactly is meant by the term crowdsourcing? How can the crowdsourcing phenomenon be distinguished from other adjacent concepts? Where do the commonalities and essential differences between the individual concepts lie? Which forms of crowdsourcing are found in theory and practice? In which areas is crowdsourcing applied? Which companies apply crowdsourcing successfully? Which platforms exist to support crowdsourcing? Which goals and results are to be achieved through the use of crowdsourcing? How does the crowdsourcing process proceed, and into which phases can it be divided? What does value creation through crowdsourcing look like (a) in general and (b) specifically for companies? Which opportunities and potentials, as well as risks and limits, arise for companies? What can still be improved in the field of crowdsourcing in the future, i.e., in which areas does further research need exist?
“Mittelstand” businesses are the backbone of the German economy. To operate effectively, these businesses require sufficient financing provided through adequate financing instruments. Yet which characteristics do capital seekers value in adequate financing instruments? Despite the macroeconomic relevance of the topic, only few empirical studies exist to date that examine the financing behaviour of the “Mittelstand”. For the paper at hand, all PREPS-financed German businesses were asked to fill out an online survey. PREPS is a standardized mezzanine financing instrument that was offered to “Mittelstand” businesses with a high degree of creditworthiness, primarily to finance business growth. PREPS-financed businesses are of particular interest for this research because, due to their size and creditworthiness, they can choose from the greatest variety of financing options amongst their peers. Financing instruments differ in the rights and obligations associated with them; depending on their design, financing contracts can fulfil a variety of functions beyond the obvious supply of liquidity, such as financial transformation, influencing behaviour, and signalling private information. The paper at hand suggests that the businesses in question selected the same financing instrument, but for different reasons, and that the degree of appreciation for certain characteristics of the financing instrument varies with business- and situation-specific context. When exploring individual hypotheses on how individual factors influence this degree of appreciation for certain financing characteristics, the paper builds on core capital structure theories as well as recent empirical insights into financing behaviour. In addition, the paper examines several explorative hypotheses.
In this thesis we present an approach to track an RGB-D camera in 6DOF and construct 3D maps. We first acquire, register, and synchronize RGB and depth images. After preprocessing, we extract FAST features and match them between two consecutive frames. By depth projection we regain the z-value for the inlier correspondences. We then estimate the camera motion by least-squares 3D point set alignment over the correspondence set; this local motion estimate is incrementally applied to a global transformation. Additionally, we present methods to build maps from point cloud data acquired by an RGB-D camera. For map creation we use the OctoMap framework and optionally create a colored point cloud map. The system is evaluated with the widespread RGB-D benchmark.
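The least-squares alignment step can be implemented with the standard SVD-based (Kabsch/Horn style) solution for rigid motion between two 3D correspondence sets. The following sketch illustrates that general technique; it is not the thesis's exact code.

```python
# Hedged sketch: least-squares rigid alignment of 3D correspondences.
import numpy as np

def align(P, Q):
    """Find R, t minimizing sum ||R @ P_i + t - Q_i||^2.
    P, Q: (N, 3) arrays of corresponding 3D points."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # guard against reflections
    t = cq - R @ cp
    return R, t

# sanity check: recover a known rotation about z plus a translation
ang = 0.3
R_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                   [np.sin(ang),  np.cos(ang), 0],
                   [0, 0, 1]])
P = np.random.rand(20, 3)
Q = P @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = align(P, Q)
print(np.allclose(R, R_true), np.allclose(t, [0.5, -0.2, 1.0]))  # True True
```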
The study builds on theoretical findings about the transition between school types and about foreign language teaching. Primary school teachers and Gymnasium teachers in Saarland were surveyed on aspects relevant to foreign language teaching, and consequences for education policy and classroom practice were derived from the results.