This thesis presents an interaction with primitives in three-dimensional space performed by gestures. Functions that are difficult to perform by gestures without any absolute sense of position are implemented with a touchscreen. Besides the touchscreen, a second input device, a Leap Motion, is used to obtain data on the motion of the hand. To capture this data, the Leap Motion uses two CCD cameras and three infrared LEDs. The interactions that can be done without any feedback on the absolute position are translation, rotation and scaling. These three and the movement through space are implemented as gestures in this thesis. This is done in Blender with the Blender Game Engine and Python. The only function implemented for the touchscreen is selecting an object. In addition, a comparable mouse control was developed to contrast it with the gesture control. There are two big differences between these two controls. On the one hand, the gesture control operates in three-dimensional space, but most people are not used to it yet. On the other hand, the mouse control offers only two-dimensional input, but it is familiar to most users. The evaluation should reveal whether people prefer interaction by mouse control or by gestures. The result shows that the preferred control is the mouse. However, in some categories of the tests the gestures come quite close to the results of the mouse.
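As a rough illustration of how such hand-motion data could drive the implemented transformations, the following minimal Python sketch derives translation and scale deltas from two consecutive hand frames; the frame values and function names are invented stand-ins for Leap Motion tracking data, not the thesis code.

```python
import numpy as np

def gesture_deltas(palm_prev, palm_curr, hand_span_prev, hand_span_curr):
    """Derive translation and scale deltas from two consecutive hand frames.

    palm_prev/palm_curr: 3D palm positions; hand_span_*: distance between
    two tracked hands (illustrative stand-ins for Leap Motion frame data).
    """
    translation = np.asarray(palm_curr) - np.asarray(palm_prev)  # move object
    scale = hand_span_curr / hand_span_prev                      # grow/shrink
    return translation, scale

# Example: palm moved 2 cm along x, hands moved apart by 10 %
t, s = gesture_deltas([0, 0, 0], [0.02, 0, 0], 0.20, 0.22)
print(t, s)  # [0.02 0. 0.]  ~1.1
```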
This thesis conducts a text and network analysis of criminological files. The specific focus of the research is the field of money laundering. The analysis showed the most important concepts present in the text, which were classified into eleven different classes. The relationships of those concepts were analysed using ego networks, key entity identification and clustering. Some of the statements made about money laundering could be validated by the findings of this analysis and their interpretation. Specific concepts like banks and organizations as well as foreign subsidiaries were identified. Aggregating these concepts with the statements in chapter 1.4.3 on the circular process of money laundering, it can be stated that different organizations and individuals present in the criminological files were placing money through different banks, organizations and investments in the legal financial market. Finally, this thesis tries to validate the benefits of the tools used for this kind of research process. An assessment of ORA's and AutoMap's applicability for this kind of research is given in the end.
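The ego network and key entity steps can be illustrated with networkx; the concept graph below is invented for demonstration and is not taken from the analysed files.

```python
import networkx as nx

# Toy concept network: nodes are concepts extracted from the files,
# edges are co-occurrence relations (all names are illustrative).
G = nx.Graph()
G.add_edges_from([
    ("bank", "organization"), ("bank", "investment"),
    ("organization", "foreign subsidiary"), ("investment", "financial market"),
    ("bank", "individual"), ("individual", "organization"),
])

# Ego network of "bank": the concept plus its direct neighbours.
ego = nx.ego_graph(G, "bank", radius=1)
print(sorted(ego.nodes()))

# Key entity identification via a simple centrality ranking.
print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:3])
```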
NeuLand 2011,02 = Nr. 31
(2015)
Campusschule: Uni goes Schulpraxis
Praxisluft schnuppern - Wie in der Lehre praktisches Know-how vermittelt wird
Money, money, money - Fördermöglichkeiten für Studierende und Promovierende
Hinter den Kulissen - Ein Tag mit Chefhausmeister Hans-Jürgen Lösch
Wolfgang Huber - Interview mit dem Frank-Loeb-Gastprofessor 2011
Information systems research has recently started to use crowdsourcing platforms such as Amazon Mechanical Turk (MTurk) for scientific research. In particular, MTurk provides a scalable, cheap workforce that can also be used as a pool of potential respondents for online survey research. In light of the increasing use of crowdsourcing platforms for survey research, the authors aim to contribute to the understanding of their appropriate usage. Therefore, they assess whether samples drawn from MTurk deviate from those drawn via conventional online surveys (COS) in terms of answers in relation to relevant e-commerce variables, and they test the data in a nomological network for assessing differences in effects.
The authors compare responses from 138 MTurk workers with those of 150 German shoppers recruited via COS. The findings indicate, inter alia, that MTurk workers tend to exhibit more positive word-of-mouth, perceived risk, customer orientation and commitment to the focal company. The authors discuss the study results, point to limitations, and provide avenues for further research.
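A group comparison of this kind typically rests on two-sample tests; the sketch below runs Welch's t-test on synthetic scores standing in for one of the e-commerce variables (the means and scale are invented, not the study's data).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Illustrative stand-ins for e.g. word-of-mouth scores (7-point scale)
mturk = rng.normal(5.2, 1.0, 138)  # 138 MTurk workers
cos   = rng.normal(4.8, 1.0, 150)  # 150 COS respondents

# Welch's t-test: does not assume equal variances between samples
t, p = stats.ttest_ind(mturk, cos, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```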
The increasing application of titanium dioxide nanoparticles (nTiO2) entails an increased risk regarding their release to surface water bodies, where they likely co-occur with other anthropogenic stressors, such as heavy metals. Their co-occurrence may lead to an adsorption of the metal ions onto the particles. These nanoparticles often sediment, due to their agglomeration, and thus pose a risk for pelagic or benthic species. The combined toxicity of nTiO2 and heavy metals is likely influenced by the properties of both stressors (since they may alter their interaction) and by environmental parameters (e.g., organic matter, pH, ionic strength) affecting their fate.
These issues have not yet been systematically examined in the literature. Therefore, this thesis investigated the influence of nTiO2 products with differing crystalline phase composition on the toxicity of copper (as a representative of heavy metals) in the presence of different types of organic matter, using the pelagic test organism Daphnia magna.
Moreover, the duration of the stressors' interaction (= aging) likely modulates the combined toxicity. Hence, the influence of nTiO2 on copper toxicity after aging was additionally investigated as a function of environmental parameters (i.e., organic matter, pH, ionic strength).
Finally, the transferability of the major findings to benthic species was examined using Gammarus fossarum. The present thesis found that nTiO2 reduced copper toxicity in all assessed scenarios, while the magnitude of this reduction was determined by the surface area and structure of nTiO2, the quantity and quality of organic matter as well as the aging of both stressors. The general reduction of copper toxicity by nTiO2 was also transferable to benthic species, despite their potentially increased exposure due to the sedimentation of nTiO2 with adsorbed copper. These observations suggest the application of nTiO2 as a remediation agent, but potential side effects (e.g., chronic toxicity, reactive oxygen species formation) require further investigation. Moreover, questions regarding the transferability to other stressors (e.g., different heavy metals, organic chemicals) and the fate of stressors adsorbed to nTiO2 in aquatic ecosystems remain open.
Real-time graphics applications are tending to get more realistic, and approximating real-world illumination becomes more feasible due to improvements in graphics hardware. Using a wide variety of algorithms and ideas, graphics processing units (GPUs) can simulate complex lighting situations, rendering computer-generated imagery with complicated effects such as shadows, refraction and reflection of light. Reflections in particular improve realism, because they make shiny materials, e.g. brushed metals, wet surfaces like puddles or polished floors, appear more realistic and reveal information about their properties such as roughness and reflectance. Moreover, reflections can get more complex depending on the view: a wet surface like a street during rain, for example, will reflect lights depending on the distance of the viewer, resulting in a streakier reflection that looks more stretched the farther the viewer is located from the light source. This bachelor thesis aims to give an overview of the state of the art in rendering reflections. Understanding light is a basic requirement for understanding reflections, and therefore a physical model of light and its reflection is covered in section 2, followed by the motivational section 2.2, which gives visually appealing examples of reflections from the real world and the media. Coming to rendering techniques, the main principle is first explained in section 3, followed by a short overview of a wide variety of approaches that try to generate correct reflections in section 4. This thesis describes the implementation of three major algorithms that produce plausible local reflections. To this end, the developed framework is described in section 5; then three major algorithms are covered that are common methods in most current game and graphics engines: screen space reflections (SSR), parallax-corrected cube mapping (PCCM) and billboard reflections (BBR). After describing their functional principle, they are analysed with respect to their visual quality and the possibilities of their real-time application. Finally, they are compared with each other to investigate their respective advantages and disadvantages. In conclusion, the gained experience is summarized by listing advantages and disadvantages of each technique and giving suggestions for improvements. A short perspective is given on upcoming real-time rendering techniques for the creation of reflections as specular effects.
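All three techniques ultimately evaluate the same reflection identity, r = d - 2(d . n)n; a tiny numpy sketch of it follows (illustrative, not code from the thesis framework).

```python
import numpy as np

def reflect(d, n):
    """Reflect incident direction d about unit surface normal n:
    r = d - 2 (d . n) n, the identity behind SSR, PCCM and BBR alike."""
    d, n = np.asarray(d, float), np.asarray(n, float)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# A ray hitting a floor (normal +y) at 45 degrees bounces upward.
print(reflect([1, -1, 0], [0, 1, 0]))  # -> [1. 1. 0.]
```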
Emotion regulation – an empirical investigation in female adolescents with nonsuicidal self-injury
(2015)
Nonsuicidal self-injury (NSSI) was included as a condition for further study in the DSM-5. Therefore, it is necessary to investigate the suggested diagnostic criteria and the clinical and psychological correlates. In order to provide an optimal treatment best tailored to the patients' needs, a clear differentiation between Borderline Personality Disorder (BPD) and NSSI is needed. The investigation of personality traits specific to patients with NSSI might be helpful for this differentiation. Furthermore, social difficulties can often be a trigger for NSSI. However, little is known about how adolescents with NSSI perceive social situations. Therefore, we examined how adolescents with NSSI process emotional expressions. A new emotion recognition paradigm (ERP) using colored and morphed facial expressions of happiness, anger, sadness, disgust and fear was developed and evaluated in a student sample selected for being either high (HSA) or low socially anxious (LSA). HSA participants showed a tendency towards impaired emotion recognition, and the paradigm demonstrated good construct validity.
For the main study, we investigated characteristics of NSSI, clinical and psychological correlates, personality traits and emotion recognition. We examined 57 adolescents with an NSSI diagnosis, 12 adolescents with NSSI without impairment/distress, 14 adolescents with BPD, 32 clinical controls without NSSI, and 64 nonclinical controls. Participants were interviewed regarding mental disorders, filled out self-report questionnaires and participated in the ERP.
Results indicate that adolescents with NSSI experienced a higher level of impairment than clinical controls. There were similarities between adolescents with NSSI and adolescents with BPD, but also important differences. Adolescents with NSSI were characterized by specific personality traits such as high harm avoidance and novelty seeking compared to clinical controls. In adolescents with BPD, these personality traits were even more pronounced. No group differences in the recognition of facial expressions were found. Nonetheless, compared to the control group, adolescents with NSSI rated the stimuli as significantly more unpleasant and arousing.
In conclusion, NSSI is a highly impairing disorder characterized by high comorbidity with various disorders and by specific personality traits, providing further evidence that NSSI should be handled as a distinct diagnostic entity. Consequently, the proposed DSM-5 diagnostic criteria for NSSI are useful and necessary.
Virtueller Konsum - Warenkörbe, Wägungsschemata und Verbraucherpreisindizes in virtuellen Welten
(2015)
Virtual worlds have been investigated by several academic disciplines for many years, e.g. sociology, psychology, law and education. Since the developers of virtual worlds have implemented aspects like scarcity and needs, economic research has also become interested in these virtual environments. Exploring virtual economies mainly means studying the trade in virtual goods used to satisfy the emerging needs. On the one hand, economics analyzes the meaning of virtual trade for the overall interpretation of the economic characteristics of virtual worlds. On the other hand, as some virtual worlds allow exchanging virtual world money for real money and vice versa, so that users trade virtual goods for real money, researchers study the interdependencies between virtual economies and the real world. The presented thesis mainly focuses on trade within virtual worlds in the context of virtual consumption and the observation of consumer prices. For this purpose, the four virtual worlds World of Warcraft, RuneScape, Entropia Universe and Second Life were selected. Several components are required to calculate consumer price indices. First, a market basket containing the relevant consumed goods existing in virtual worlds must be developed. Second, a weighting scheme has to be established, which shows the dispersion of consumption tendencies. Third, prices of relevant consumer goods have to be collected. Following real-world methods, the challenge is to apply those methods within virtual worlds. Therefore, this dissertation contains three corresponding investigation parts. In a first analysis, it is evaluated to what extent virtual worlds can be explored to identify consumable goods. As a next step, the consumption expenditures of the avatars are examined based on an online survey. Finally, prices of consumable goods are recorded, making it possible to calculate consumer price indices. While investigating those components, the thesis focuses not only on the general findings themselves, but also on arising methodological issues, such as limited access to relevant data, missing legal legitimation or security concerns of the users. Beside these aspects, the methods used also allow the examination of several other economic aspects, like the consumption habits of the avatars. At the end of the thesis, it is considered to what extent the economic characteristics of virtual worlds can be compared with the real world.
Aspects like the important role of weapons or the different usage of food show significant differences from the real world, caused by the business models of virtual worlds.
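Consumer price indices built from a base-period market basket and weighting scheme are commonly computed as a Laspeyres index; whether the thesis uses exactly this form is an assumption, and the goods and prices below are invented.

```python
# Laspeyres consumer price index: P_L = sum(p_t * q_0) / sum(p_0 * q_0),
# i.e. current prices weighted with the base-period market basket.
# Goods and numbers are invented for illustration (in-game currency units).
basket = {          # good: (base price p_0, base quantity q_0, price p_t)
    "sword":  (100.0,  2, 120.0),
    "potion": (  5.0, 30,   4.5),
    "food":   (  2.0, 50,   2.2),
}

num = sum(p_t * q0 for (_p0, q0, p_t) in basket.values())
den = sum(p0 * q0 for (p0, q0, _pt) in basket.values())
print(f"CPI = {100 * num / den:.1f}")  # base period = 100; here 107.8
```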
In this thesis, we deal with the question of whether challenge, flow and fun in computer games are related to each other, and what influence the motivational and psychological components of success motivation, failure motivation and the probability of success have. In addition, we want to know whether a free choice of the level of difficulty is the optimal way to flow. To examine these theories, a study based on an online survey was conducted, in which the participants played the game "flOw". The results were evaluated with the help of a two-factorial analysis of variance with repeated measurement and tests of correlation. We found that challenge, flow and fun are indeed related and that motivation matters indirectly.
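A repeated-measures ANOVA of the kind described can be sketched with statsmodels; the design and flow scores below are invented for illustration, and a single within factor is used for brevity where the study used a two-factorial design.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
# Balanced within-subject design: every participant plays both difficulty
# modes; the 'flow' scores are invented.
subjects = np.repeat(np.arange(30), 2)
mode = np.tile(["free_choice", "fixed"], 30)
flow = rng.normal(5.0, 1.0, 60) + (mode == "free_choice") * 0.5

df = pd.DataFrame({"subject": subjects, "mode": mode, "flow": flow})
res = AnovaRM(df, depvar="flow", subject="subject", within=["mode"]).fit()
print(res)
```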
The increasing anthropogenic demand for chemicals has created large environmental problems with repercussions for the health of the environment, especially aquatic ecosystems. As a result, the awareness of the public and of decision makers regarding the risks from chemical pollution has increased over the past half-century, prompting a large number of studies in the field of ecological toxicology (ecotoxicology). However, the majority of ecotoxicological studies are laboratory based, and the few studies extrapolating toxicological effects in the field are limited to local and regional levels. Chemical risk assessment on large spatial scales remains largely unexplored, and therefore, the potential large-scale effects of chemicals may be overlooked.
To answer ecotoxicological questions, multidisciplinary approaches that transcend classical chemical and toxicological concepts are required. For instance, the current models for toxicity predictions - which are mainly based on the prediction of toxicity for a single compound and species - can be expanded to simultaneously predict the toxicity for different species and compounds. This can be done by integrating chemical concepts such as the physicochemical properties of the compounds with evolutionary concepts such as the similarity of species. This thesis introduces new, multidisciplinary tools for chemical risk assessments, and presents for the first time a chemical risk assessment on the continental scale.
After a brief introduction of the main concepts and objectives of the studies, this thesis starts by presenting a new method for assessing the physiological sensitivity of macroinvertebrate species to heavy metals (Chapter 2). To compare the sensitivity of species to different heavy metals, toxicity data were standardized to account for the different laboratory conditions. The resulting sensitivity rankings were not significantly different between heavy metals, allowing the aggregation of physiological sensitivity into a single ranking.
Furthermore, the toxicological data for macroinvertebrates were used as input data to develop and validate prediction models for heavy metal toxicity, which are currently lacking for a wide array of species (Chapter 3). Apart from the toxicity data, the phylogenetic information of species (evolutionary relationships among species) and the physicochemical parameters for heavy metals were used. The constructed models had a good explanatory power for the acute sensitivity of species to heavy metals with the majority of the explained variance attributed to phylogeny. Therefore, the integration of evolutionary concepts (relatedness and similarity of species) with the chemical parameters used in ecotoxicology improved prediction models for species lacking experimental toxicity data. The ultimate goal of the prediction models developed in this thesis is to provide accurate predictions of toxicity for a wide range of species and chemicals, which is a crucial prerequisite for conducting chemical risk assessment.
The latter was conducted for the first time on the continental scale (Chapter 4), by making use of a dataset of 4,000 sites distributed throughout 27 European countries and 91 respective river basins. Organic chemicals were likely to exert acute risks for one in seven sites analyzed, while chronic risk was prominent for almost half of the sites. The calculated risks are potentially underestimated by the limited number of chemicals that are routinely analyzed in monitoring programmes, and a series of other uncertainties related with the limit of quantification, the presence of mixtures, or the potential for sublethal effects not covered by direct toxicity.
Furthermore, chemical risk was related to agricultural and urban areas in the upstream catchments. The analysis of ecological data indicated chemical impacts on the ecological status of the river systems; however, it is difficult to discriminate the effects of chemical pollution from those of the other stressors that river systems are exposed to. To test the hypothesis of multiple stressors and investigate the relative importance of organic toxicants, a dataset for German streams is used in chapter 5. In that study, the risk from abiotic stressors (habitat degradation, organic chemicals and nutrient enrichment) and biotic stressors (invasive species) was investigated. The results indicated that more than one stressor influenced almost all sites. Stream size and ecoregions influenced the distribution of risks: for example, the risks from habitat degradation, organic chemicals and invasive species increased with stream size, whereas organic chemicals and nutrients were more likely to influence lowland streams. In order to successfully mitigate the effects of pollutants in river systems, the co-occurrence of stressors has to be considered. Overall, to successfully apply integrated water management strategies, a framework involving multiple environmental stressors on large spatial scales is necessary. Furthermore, to properly address the current research needs in ecotoxicology, a multidisciplinary approach is necessary which integrates fields such as toxicology, ecology, chemistry and evolutionary biology.
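The abstract above does not spell out the exact risk metric; a common approach in such large-scale assessments is to sum toxic units per site, roughly as in this sketch (all concentrations, EC50 values and thresholds are invented for illustration).

```python
# Toxic units: TU_i = c_i / EC50_i for each measured chemical; the site-wise
# sum is compared against acute/chronic screening thresholds.
measured = {"chlorpyrifos": 0.02, "diuron": 0.10, "copper": 5.0}   # ug/L
ec50     = {"chlorpyrifos": 0.10, "diuron": 10.0, "copper": 25.0}  # ug/L

tu_sum = sum(c / ec50[chem] for chem, c in measured.items())
print(f"sum TU = {tu_sum:.2f}")
# Illustrative decision rule only; real thresholds are assessment-specific.
print("acute risk flagged" if tu_sum > 0.1 else "below acute screening level")
```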
This thesis addresses the problem of terrain classification in unstructured outdoor environments. Terrain classification includes the detection of obstacles and passable areas as well as the analysis of ground surfaces. A 3D laser range finder is used as primary sensor for perceiving the surroundings of the robot. First of all, a grid structure is introduced for data reduction. The chosen data representation allows for multi-sensor integration, e.g., cameras for color and texture information or further laser range finders for improved data density. Subsequently, features are computed for each terrain cell within the grid. Classification is performed with a Markov random field for context-sensitivity and to compensate for sensor noise and varying data density within the grid. A Gibbs sampler is used for optimization and is parallelized on the CPU and GPU in order to achieve real-time performance. Dynamic obstacles are detected and tracked using different state-of-the-art approaches. The resulting information - where other traffic participants move and are going to move to - is used to perform inference in regions where the terrain surface is partially or completely invisible for the sensors. Algorithms are tested and validated on different autonomous robot platforms and the evaluation is carried out with human-annotated ground truth maps of millions of measurements. The terrain classification approach of this thesis proved reliable in all real-time scenarios and domains and yielded new insights. Furthermore, if combined with a path planning algorithm, it enables full autonomy for all kinds of wheeled outdoor robots in natural outdoor environments.
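The core idea of MRF-based label smoothing with a Gibbs sampler can be sketched as follows; this is a generic, serial Potts-model toy (grid size, scores and the beta parameter are invented), not the parallelized implementation of the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, LABELS = 32, 32, 3            # terrain grid, e.g. street/rough/obstacle
unary = rng.random((H, W, LABELS))  # per-cell class scores from features
labels = unary.argmax(axis=2)       # noisy initial classification
BETA = 1.5                          # strength of the smoothness prior

for _ in range(10):                 # Gibbs sweeps over the grid
    for y in range(H):
        for x in range(W):
            nbrs = [labels[j, i] for j, i in
                    ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                    if 0 <= j < H and 0 <= i < W]
            # Potts MRF: energy = -unary - beta * (# agreeing neighbours)
            energy = np.array([-unary[y, x, k] - BETA * nbrs.count(k)
                               for k in range(LABELS)])
            p = np.exp(-energy)
            p /= p.sum()
            labels[y, x] = rng.choice(LABELS, p=p)  # resample cell label
```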
Flowering habitats to enhance biodiversity and pest control services in agricultural landscapes
(2015)
Meeting growing demands for agricultural products requires management solutions that enhance food production whilst minimizing negative environmental impacts. Conventional agricultural intensification jeopardizes farmland biodiversity and associated ecosystem services through excessive anthropogenic inputs and landscape simplification. Agri-environment schemes (AES) are commonly implemented to mitigate the adverse effects of conventional intensification on biodiversity. However, such schemes have had only moderate success thus far and would strongly benefit from more explicit goals regarding ecosystem service provisioning. Providing key resources to beneficial organisms may improve their abundance, fitness, diversity and the ecosystem services they provide. With targeted habitat management, AES may synergistically enhance biodiversity and agricultural production and thus contribute to ecological intensification. We demonstrate that sown perennial wildflower strips, as implemented in current AES focusing on biodiversity conservation, also benefit biological pest control in nearby crops (Chapter 2).
Comparing winter wheat fields adjacent to wildflower strips with fields without wildflower strips, we found strongly reduced cereal leaf beetle (Oulema sp.) density and plant damage near wildflower strips. In addition, winter wheat yield was 10 % higher when fields adjoined wildflower strips. This confirms previous assumptions that wildflower strips, known for positive effects on farmland biodiversity, can also enhance ecosystem services such as pest control. The positive correlation of yield with flower abundance and diversity suggests that floral resources are key. Refining sown flower strips for enhanced service provision requires a mechanistic understanding of how organisms benefit from floral resources. In climate chamber experiments investigating the impact of single and multiple flowering plant species on fitness components of three key arthropod natural enemies of aphids, we demonstrate that different natural enemies benefit differently from the offered resources (Chapter 3).
Overall, some flower species were more valuable to natural enemies than others. Additionally, the mixture with all flowers generally performed better than monocultures, yet without transgressive overyielding. By explicitly tailoring flower strips to the requirements of key natural enemies of crop pests, we aimed to maximise natural-enemy-mediated pest control in winter wheat (Chapter 4) and potato (Chapter 5) crops.
Respecting the manifold requirements of diverse natural enemies but not pests, in terms of temporal and spatial provisioning of floral, extrafloral and structural resources, we designed targeted annual flower strips that can be included in crop rotation to support key arthropods at the place and time they are needed. Indeed, field experiments revealed that cereal leaf beetle density and plant damage in winter wheat can be reduced by 40 % to 61 %, and aphid densities in potatoes even by 77 %, if a targeted flower strip is sown into the field. These effects were not restricted to the vicinity of flower strips and, in contrast to fields without a flower strip, often prevented action thresholds from being reached. This suggests that targeted flower strips could replace insecticides. All adult natural enemies were enhanced inside targeted flower strips when compared to control strips. Yet, spillover to the field was restricted to key natural enemies such as ground beetles (winter wheat), hoverflies (potato) and lacewings (winter wheat and potato), suggesting their dominant role in biological control. In potatoes, targeted flower strips also enhanced hoverfly species richness in strips and crop, highlighting their additional benefits for diversity.
The present results provide more insights into the mechanisms underlying conservation biological control and highlight the potential of tailored habitat management for ecological intensification.
In this thesis, an interactive application is developed for Android OS. The application is a virtual-reality game in the genre of first-person shooters and takes place in a space scenario. By using a stereo renderer, the game can be played with virtual-reality glasses.
The publication of freely available and machine-readable information has increased significantly in the last years. Especially the Linked Data initiative has been receiving a lot of attention. Linked Data is based on the Resource Description Framework (RDF) and anybody can simply publish their data in RDF and link it to other datasets. The structure is similar to the World Wide Web where individual HTML documents are connected with links. Linked Data entities are identified by URIs which are dereferenceable to retrieve information describing the entity. Additionally, so called SPARQL endpoints can be used to access the data with an algebraic query language (SPARQL) similar to SQL. By integrating multiple SPARQL endpoints it is possible to create a federation of distributed RDF data sources which acts like one big data store.
In contrast to the federation of classical relational database systems, there are some differences for federated RDF data. RDF stores are accessed either via SPARQL endpoints or by resolving URIs. There is no coordination between RDF data sources, and machine-readable metadata about a source's data is commonly limited or not available at all. Moreover, there is no common directory which can be used to discover RDF data sources or ask for sources which offer specific data. The federation of distributed and linked RDF data sources has to deal with various challenges. In order to distribute queries automatically, suitable data sources have to be selected based on query details and information that is available about the data sources. Furthermore, the minimization of query execution time requires optimization techniques that take into account the execution cost of query operators and the network communication overhead for contacting individual data sources. In this thesis, solutions for these problems are discussed. Moreover, SPLENDID is presented, a new federation infrastructure for distributed RDF data sources which uses optimization techniques based on statistical information.
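For readers unfamiliar with SPARQL endpoints, the snippet below queries a single public endpoint with SPARQLWrapper; a federation layer such as SPLENDID would decompose such queries and route subqueries to several endpoints. The DBpedia endpoint and query are illustrative, not from the thesis.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Query one public endpoint (DBpedia) for the English label of a resource.
sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?label WHERE {
        <http://dbpedia.org/resource/Koblenz> rdfs:label ?label .
        FILTER (lang(?label) = "en")
    }
""")
sparql.setReturnFormat(JSON)
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["label"]["value"])
```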
This thesis investigates how model errors affect positioning accuracy and handling when maneuvering with a driver assistance system. Particular emphasis is placed on determining error bounds. The question pursued is how large the input error may be for the assistance to still exhibit sufficient quality with respect to precision and robustness. To this end, the errors are first analyzed quantitatively on the basis of the kinematic model. A qualitative analysis by means of systematic experiments follows. First, a controller is developed with which a maneuver can be simulated using the visual information provided by the assistance system.
Then a method is presented for evaluating the maneuver against defined error bounds. In order to search a large space of possible error combinations efficiently, the probabilistic method of the annealed particle filter is used. Finally, systematic experiments are carried out using a test environment. For further evaluation of the assistance system in a controlled environment, the system was ported to the RODOS simulation environment in cooperation with the Fraunhofer ITWM in Kaiserslautern.
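The annealed particle filter can be sketched generically as repeated importance-weighted resampling with progressively sharpened weights; the error function, parameters and annealing schedule below are invented placeholders, not the thesis setup.

```python
import numpy as np

rng = np.random.default_rng(3)

def maneuver_error(params):
    """Illustrative stand-in: maps an error combination (e.g. two sensor
    offsets) to a scalar deviation of the simulated maneuver."""
    return np.sum((params - np.array([0.5, -0.2])) ** 2, axis=1)

N, LAYERS = 200, 6
particles = rng.uniform(-1, 1, (N, 2))      # candidate error combinations
for layer in range(LAYERS):
    beta = 2.0 ** layer                      # annealing: sharpen weights
    w = np.exp(-beta * maneuver_error(particles))
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)         # importance resampling
    noise = 0.3 / (layer + 1)                # shrink diffusion per layer
    particles = particles[idx] + rng.normal(0, noise, (N, 2))

print(particles.mean(axis=0))                # concentrates near the optimum
```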
The Department 4: Computer Science consists of twenty-five working groups led by professors, which cooperate in research and teaching across six institutes.
In each annual report, the working groups present themselves according to a uniform pattern: their staffing, the projects falling within the reporting period, and the scientific output produced. The following chapters list individual parameters that describe the department in quantitative terms with respect to third-party funding, teaching coverage, graduates and publications.
Uniprisma Ausg. 2005
(2015)
The subject of this thesis is the analysis of the involvement of classical creativity techniques and IT tools in different phases of the innovation process. In addition, the present work deals with the integration of Design Thinking and TRIZ into the innovation process. The aim was to define a specific innovation process based on diverse existing innovation process models from the literature. This specific innovation process serves as a basis for the analysis of the integration of creativity techniques, IT tools, Design Thinking and TRIZ. In summary, the application of creativity techniques and IT tools is admissible and useful in every phase of the innovation process. This work shows that the Design Thinking method can be integrated into the early stages of the innovation process. Also, the process model of TRIZ, which differs from traditional innovation processes, can be combined with classical innovation processes.
Campuszeitung Ausg. 1/2015
(2015)
Simulation von Schnee
(2015)
Physics simulations allow the creation of dynamic scenes on the computer. Computer-generated images become lively and find use in movies, games and engineering applications. GPGPU techniques make use of the graphics card to simulate physics. The simulation of dynamic snow is still little researched. The Material Point Method is the first technique which is capable of showing the dynamics and characteristics of snow.
The hybrid use of Lagrangian particles and a regular Cartesian grid enables the solving of partial differential equations. For this purpose, particles are transferred to the grid. The grid velocities can then be updated with the calculation of gradients in an FEM manner (finite element method). Finally, grid node velocities are weighted back to the particles to move them across the scene. This method is coupled with a constitutive model to cover the dynamic nature of snow, including collisions and breaking.
This bachelor thesis connects the recent developments in GPGPU techniques of OpenGL with the Material Point Method to efficiently simulate visually compelling, dynamic snow scenes.
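The particle-to-grid transfer at the heart of each MPM step can be sketched in a few lines; this 1D toy with linear tent weights (positions, velocities and grid size invented) only illustrates the scheme, which the thesis realizes on the GPU in higher dimensions.

```python
import numpy as np

# Particle-to-grid (P2G) mass/velocity transfer, the first step of each MPM
# time step, with linear (tent) weights on a 1D grid for brevity.
dx = 1.0
grid_v = np.zeros(8)
grid_m = np.zeros(8)
px = np.array([2.3, 2.7, 4.1])   # particle positions
pv = np.array([1.0, -0.5, 0.2])  # particle velocities
pm = np.ones(3)                  # particle masses

for x, v, m in zip(px, pv, pm):
    base = int(x // dx)                       # left grid node
    for node in (base, base + 1):
        w = 1.0 - abs(x - node * dx) / dx     # tent weight
        grid_m[node] += w * m
        grid_v[node] += w * m * v             # accumulate momentum

mask = grid_m > 0
grid_v[mask] /= grid_m[mask]                  # momentum -> velocity
print(grid_v)
```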
Heat exchangers are used for the thickening of various products or the desalination of salt water; they are also used as cooling units in industry. Thereby, the stainless steel heat-transferring elements come into contact with microorganism-containing media, such as river water or salt water, and corrode. After at least two years of utilization, the material is covered with bacterial slime, a so-called biofilm. This process is called biofouling; it causes losses in efficiency and creates huge costs depending on the cleaning technique and its efficiency. Cleaning a heat exchanger is very expensive and time-consuming, and it can only be done while the device is out of service.
Changing the surface properties of materials is the best and easiest way to lengthen the initial phase of biofilm formation. This leads to less biofouling (Mogha et al. 2014).
Thin polymer films as novel materials are cheaper to produce than stainless steel and are easy to handle. Furthermore, they can be functionalized easily and can be bought in different sizes all over the world. Because of this, they can reduce the costs of cleaning and keep the heat exchanger in a high-efficiency state for longer. If the efficiency of the heat exchanger decreases, the thin polymer films can be replaced.
For a successful investigation of the microbial and process engineering challenges, a cooperation between the Technical University of Kaiserslautern (chair of separation science and technology) and the University of Koblenz-Landau (working group microbiology) was established.
The aim of this work was the design engineering and production of a reactor for the investigation of biofouling taking place on thin polymer films and stainless steel. Furthermore, an experimental design had to be established. Several requirements applied to these tasks. A real heat exchanger was therefore downscaled so that the process parameters are at least comparable. There are many commercial flow cell kits available; reducing the costs by self-assembly increased the number of samples, providing a basis for statistical analysis. In addition, fast and minimally invasive online in-situ microscopy and Raman spectroscopy can be performed. By creating laminar flow and using a weir, we implemented homogeneous inflow to the reactors. Reproducible data on biomass and cell number were generated.
The assessment of biomass and cell number is well established for drinking water analysis. Epifluorescence microscopy and gravimetric determination are the basic techniques for this work, too. Differences in cell number and biomass between surface modifications and materials were quantified and statistically analysed.
The wild-type strain Escherichia coli K12 and an inoculum of 500 ml fresh water were used to describe the biofouling of the films. We thereby generated both data with a natural bacterial community in media of unknown properties and data with well-known media properties, so the technical relevance of the data is given.
Free surface energy and surface roughness are the first attachment hurdles for bacteria. These parameters were measured according to DIN 55660 and DIN EN ISO 4287. The materials science data were correlated with the number of cells and the biomass. This correlation provides a basic link between biofouling, as a biologically induced parameter, and the material properties, so that material properties which reduce biofouling can be identified.
By using Raman spectroscopy as a cutting-edge method, future investigations could be shortened. If biomass or cell number can be linked with the spectra, new functional materials can be investigated in a short time.
Central tasks of higher education institutions are the assessment, the explanation of causes and the promotion of academic achievement (Heublein & Wolter, 2011, p. 215). In this context, achievement motivation, alongside intellectual abilities, is considered a significant predictor of academic success (e.g., Schmidt-Atzert, 2005, p. 132; Steinmayr & Spinath, 2009, p. 80). This study therefore focuses on the motivational processes of 332 first-year students at the Hochschule der Bundesagentur für Arbeit and on the factors that have a beneficial effect on their learning outcomes. With a response rate of 89 %, the data obtained are representative of the population. Based on an ex-post-facto design in the form of a quantitative predictor-criterion approach (a special variant of a longitudinal design) with different data collection methods, such as a standardized self-report questionnaire, achievement tests and official documents/records, the following research hypotheses were examined: the strength of achievement motivation depends both on expectancy components (academic self-concept, self-esteem, subjective grade expectation, hope of success and fear of failure) and on incentive components (object-, activity- and consequence-related incentives), and, mediated by achievement-motivated behavior, it in turn influences academic performance. It was postulated that motivational variables still exert a significant effect on academic performance even when further predictors of achievement, such as school-leaving grade, intelligence, emotional stability and conscientiousness, are controlled for.
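The final hypothesis, a motivational effect that survives controlling for other predictors, corresponds to a regression with control variables; the statsmodels sketch below uses invented data and coefficients purely to illustrate that analysis pattern, not the study's results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 332
# Invented data mirroring the design: does motivation predict achievement
# once school grades, intelligence and personality are controlled for?
df = pd.DataFrame({
    "school_grade": rng.normal(2.3, 0.6, n),
    "intelligence": rng.normal(100, 15, n),
    "conscientiousness": rng.normal(0, 1, n),
    "motivation": rng.normal(0, 1, n),
})
df["achievement"] = (-0.5 * df["school_grade"] + 0.02 * df["intelligence"]
                     + 0.2 * df["conscientiousness"] + 0.3 * df["motivation"]
                     + rng.normal(0, 0.5, n))

X = sm.add_constant(df[["school_grade", "intelligence",
                        "conscientiousness", "motivation"]])
print(sm.OLS(df["achievement"], X).fit().summary())
```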
Web application testing is an active research area. Garousi et al. conducted a systematic mapping study and classified 79 papers published between 2000 and 2011. However, there seems to be a lack of information exchange between the scientific community and tool developers.
This thesis systematically analyzes the field of functional, system-level web application testing tools. 194 candidate tools were collected in the tool search and screened, with 23 tools being selected as the foundation of this thesis. These 23 tools were systematically used to generate a feature model of the domain. The methodology supporting this is an additional contribution of this thesis: it processes the end-user documentation of tools belonging to an examined domain and creates a feature model. The feature model gives an overview of the existing features, their alternatives and their distribution. It can be used to identify trends and problems, spot extraordinary features, support tool purchase decisions or guide scientists in focusing their research.
The topic of this thesis is the development of hardware-accelerated single-frame image compression for video transmission. Methods for single-frame image compression have existed for a long time. However, the common methods do not meet the real-time and performance requirements necessary to be used during a video transmission without noticeable latency. In this thesis, one of the most common image compression algorithms is examined for parallelizability with the aid of the graphics card, in order to achieve real-time capability during the compression and decompression of computer-generated images. The results are evaluated and placed in the context of current parallelized compression techniques.
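The abstract does not name the algorithm; assuming a JPEG-like pipeline, the lossy core is a blockwise DCT with quantization, sketched below on a single 8x8 block with scipy (the quantization step is invented). A GPU version would run this on many blocks in parallel.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
img = rng.random((8, 8)).astype(np.float32)   # one 8x8 image block

# Forward DCT, coarse quantization, inverse DCT: the lossy core step.
Q = 0.1                                       # quantization step (invented)
coeffs = dctn(img, norm="ortho")
quantized = np.round(coeffs / Q)              # this is where data is lost
restored = idctn(quantized * Q, norm="ortho")

print(np.abs(img - restored).max())           # reconstruction error
```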
This research project deals with the positioning and provider-internal communication of the innovative IT architecture SOA. The central goals of this exploratory, empirical study, which is situated in the context of research on innovation success factors, are to answer the following two guiding research questions:
Research question 1: Which conditions contribute to a successful positioning of SOA? Research question 2: Which conditions contribute to successful provider-internal communication regarding SOA? To examine these two research questions, a two-stage Delphi study was conducted. First, a qualitative survey wave (N=53) was carried out to identify the conditions for SOA positioning and for provider-internal SOA communication. In total, 122 SOA positioning conditions were identified in the first survey wave, comprising 65 conditions on the provider side, 35 conditions on the customer side, 19 conditions on the SOA side and 3 conditions in the wider environment. For provider-internal SOA communication, 31 conditions were identified. The SOA positioning conditions and provider-internal SOA communication conditions identified in the first wave were then subjected to a quantitative analysis in the second survey wave (N=83). The present study thus provides conditions that contribute both to a successful SOA positioning and to successful provider-internal SOA communication.
The results of this work are summarized and theoretically contextualized. The methodological approach is critically discussed and the quality of the data is assessed. Finally, an outlook on future research fields is given.
Campuszeitung Ausg. 1/2013
(2015)
Uniprisma Ausg. 2010
(2015)
The 2008 UN Convention on the Rights of Persons with Disabilities formulates a legal entitlement to inclusive education for people with impairments. In Germany, this has been implemented since 2009 through amendments to school laws that introduce inclusive education via parental choice. Against the background of these newly created parental decision options, it has not yet been investigated what expectations parents of children with complex impairments associate with their child's entitlement to inclusive education, and in what way they see this entitlement fulfilled at the school of their choice. At the center of this thesis is the reconstruction of the educational offer from the parents' perspective, compared with the view of the pedagogical classroom teams. The questions concerning parental expectations and experiences were pursued from the systems-theoretical perspective of Luhmann. The qualitative study concerns students with complex impairments who started school in Hamburg in 2010 and 2011, after the amendment of the school law (2009), and who, based on their parents' decision, learn in different settings at primary and special schools. Data were collected through guideline-based interviews with parents, teachers and school administrators, supplemented by classroom observations in the 2011/12 school year and documents provided by the schools. Data analysis follows Grounded Theory according to Strauss/Corbin (1996). The results show parental educational expectations with regard to enabling autonomy and participation of their children, and a differentiated perception of how their expectations are met in everyday school life. Parents attach particular importance to the cooperation between school and family, which is significant for the development of trust or mistrust. From the findings and their connection to systems theory, a model of professional trust/mistrust was developed.
The evaluation yields indications of quality criteria for an inclusive educational offer and of development requirements in professionalization, aimed both at the level of the school as an organization and at the interaction between school actors and parents.
Uniprisma Ausg. 2008
(2015)
In current research on autonomous mobile robots, path planning is still a very important issue.
This master's thesis deals with various path planning algorithms for the navigation of such mobile systems. The task is not only to determine a collision-free trajectory from one point to another: the path should also be optimal and comply with all vehicle-specific constraints. Autonomous driving in an unknown and dynamic environment in particular poses a major challenge, because closed-loop control is necessary and thus a certain responsiveness of the planner is demanded.
In this thesis, two families of algorithms are presented. First, path planners based on the common graph search algorithm A*: A*, Anytime Repairing A*, Lifelong Planning A*, D* Lite, Field D* and Hybrid A*. Second, algorithms based on the probabilistic planning algorithm Rapidly-exploring Random Tree: RRT, RRT* and Lifelong Planning RRT*, as well as some extensions and heuristics. In addition, methods for collision avoidance and path smoothing are presented. Finally, these different algorithms are evaluated and compared with each other.
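As a baseline for the graph-search family, here is a compact A* on a 4-connected occupancy grid; the grid and costs are invented, and the listed variants build on this basic scheme.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle); returns the path."""
    def h(p):  # admissible Manhattan heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start)]
    came, g = {}, {start: 0}
    while open_set:
        _, cost, cur = heapq.heappop(open_set)
        if cur == goal:                      # reconstruct path backwards
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        for d in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + d[0], cur[1] + d[1])
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and cost + 1 < g.get(nxt, float("inf"))):
                g[nxt] = cost + 1
                came[nxt] = cur
                heapq.heappush(open_set, (g[nxt] + h(nxt), g[nxt], nxt))
    return None

grid = [[0, 0, 0], [1, 1, 0], [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # detours around the obstacle row
```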
The lasting hype around the mobile internet and the related technology of mobile applications seems not to be dropping off. The immense economic potential of this market leads businesses and ventures to continuously find new ways of monetization. The underlying causes of this phenomenon are rarely examined. Scientific research in the field of "ubiquitous mobile" has not yet developed a clear overall picture of the cause-effect chains. Attempts to derive causes from studies of related mass media such as the computer or the internet have been discussed controversially. By combining the research streams of media usage motives and customer retention, this paper presents a new research model. Based on a quantitative survey in the German-speaking area, the data obtained show the motives of mobility, information gathering and entertainment to be the most important drivers of customer satisfaction with mobile applications. The paper also highlights a significant correlation between customer satisfaction and the other determinants of customer retention.
Uniprisma Ausg. 2007
(2015)
Campuszeitung Ausg. 1/2014
(2015)
In the new epoch of the Anthropocene, global freshwater resources are experiencing extensive degradation from a multitude of stressors. Consequently, freshwater ecosystems are threatened by a considerable loss of biodiversity as well as a substantial decrease in adequate and secure freshwater supply for human usage, not only on local scales, but also on regional to global scales. Large-scale assessments of the human and ecological impacts of freshwater degradation enable integrated freshwater management and complement small-scale approaches. Geographic information systems (GIS) and spatial statistics (SS) have shown considerable potential in ecological and ecotoxicological research to quantify stressor impacts on humans and ecological entities, and to disentangle the relationships between drivers and ecological entities on large scales through an integrated spatial-ecological approach. However, integrations of GIS and SS with ecological and ecotoxicological models are scarce, and hence the large-scale spatial picture of the extent and magnitude of freshwater stressors as well as their human and ecological impacts is still opaque. This Ph.D. thesis contributes novel GIS and SS tools, adapts and advances available spatial models, and integrates them with ecological models to enable the identification of human and ecological impacts of freshwater degradation on large scales. The main aim was to identify and quantify the effects of stressors, i.e., climate change and trace metals, on freshwater assemblage structure and trait composition, and on human health, respectively, on large scales, i.e., European and Asian freshwater networks. The thesis starts with an introduction to the conceptual framework and objectives (chapter 1). It proceeds by outlining two novel open-source algorithms for the quantification of the magnitude and effects of catchment-scale stressors (chapter 2). The algorithms, jointly called ATRIC, automatically select an accumulation threshold for stream network extraction from digital elevation models (DEM) by assuring the highest concordance between DEM-derived and traditionally mapped stream networks. Moreover, they delineate catchments and upstream riparian corridors for given stream sampling points after snapping them to the DEM-derived stream network. ATRIC showed similar or better performance than the available comparable algorithms, and is capable of processing large-scale datasets. It enables an integrated and transboundary management of freshwater resources by quantifying the magnitude of effects of catchment-scale stressors. Spatially shifting temporal points (SSTP), outlined in chapter 3, estimates pooled within-time-series (PTS) variograms by spatializing temporal data points and shifting them. Data were pooled by ensuring consistency of spatial structure and temporal stationarity within a time series, while pooling a sufficient number of data points and increasing data density for a reliable variogram estimation. SSTP-estimated PTS variograms showed higher precision than the available method. The method enables regional-scale stressor quantification by filling spatial data gaps, integrating temporal information in data-scarce regions. In chapter 4, the responses of assumed climate-associated traits from six grouping features to 35 bioclimatic indices were compared for five insect orders, their potential for changing distribution patterns under future climate change was evaluated, and the most influential climatic aspects were identified.
Traits of the temperature preference grouping feature and the insect order Ephemeroptera exhibited the strongest response to climate as well as the highest potential for changing distribution patterns, while seasonal radiation and moisture were the most influential climatic aspects that may drive a change in insect distribution patterns. The results contribute to trait-based freshwater monitoring and change prediction. In chapter 5, the concentrations of 10 trace metals in drinking water sources were predicted and compared with guideline values. In more than 53% of the total area of Pakistan, inhabited by more than 74 million people, the drinking water was predicted to be at risk from multiple trace metal contamination. The results inform freshwater management by identifying potential hot spots. The last chapter (6) synthesizes the results and provides a comprehensive discussion of the four studies and of their relevance for freshwater resource conservation and management.
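PTS variogram estimation builds on the classical empirical semivariogram, gamma(h) = half the mean squared value difference over point pairs at lag h; the numpy sketch below computes that basic quantity on invented coordinates (the pooling and shifting of SSTP are not reproduced).

```python
import numpy as np

def empirical_variogram(coords, values, bins):
    """Classical semivariogram: for each lag bin, half the mean squared
    difference over all point pairs whose separation falls into the bin."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)        # count each pair once
    gamma = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        sel = (d[iu] >= lo) & (d[iu] < hi)
        gamma.append(0.5 * sq[iu][sel].mean() if sel.any() else np.nan)
    return np.array(gamma)

rng = np.random.default_rng(5)
pts = rng.random((100, 2)) * 10                   # e.g. monitoring sites
vals = np.sin(pts[:, 0]) + rng.normal(0, 0.1, 100)
print(empirical_variogram(pts, vals, np.linspace(0, 5, 6)))
```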
Protest & Gerechtigkeit - Thema in Forschung und Lehre am Campus Landau
Erwachsenenbildung: Die Herausforderung einer modernen Weiterbildung
Hospizarbeit: Neuer Qualitätsindex startklar für die Praxis
Unzufrieden im Job? Wie positive Psychologie helfen kann
Psychotherapie: Neue Kinder- und Jugendambulanz
The mitral valve is one of the four valves in the human heart. It is located in the left heart chamber and its function is to control the blood flow from the left atrium to the left ventricle. Pathologies can lead to malfunctions of the valve so that blood can flow back into the atrium. Patients with faulty mitral valve function may suffer from fatigue and chest pain. The functionality can be surgically restored, which is often a long and exhausting intervention. Thorough planning is necessary to ensure a safe and effective surgery. This can be supported by creating pre-operative segmentations of the mitral valve. A post-operative analysis can determine the success of an intervention. This work combines existing and new ideas to propose a new approach to (semi-)automatically create such valve models. The manual part can guarantee a high-quality model and reliability, whereas the automatic part contributes to saving valuable labour time.
The main contributions of the automatic algorithm are an estimated semantic separation of the two leaflets of the mitral valve and an optimization process that is capable of finding a coaptation line and area between the leaflets. The segmentation method can perform a fully automatic segmentation of the mitral leaflets if the annulus ring is already given. The intermediate steps of this process are integrated into a manual segmentation method so a user can guide the whole procedure. The quality of the valve models generated by the proposed method is measured by comparing them to completely manually segmented models. This will show that commonly used methods to measure the quality of a segmentation are too general and do not suffice to reflect the real quality of a model. Consequently, the work at hand introduces a set of measurements that can qualify a mitral valve segmentation in more detail and with respect to anatomical landmarks. Besides the intra-operative support for a surgeon, a segmented mitral valve provides additional benefits. The ability to obtain the valve anatomy patient-specifically and describe it objectively may be the basis for future medical research in this field, and automation allows processing large data sets with reduced expert dependency. Further, simulation methods that use the segmented models as input may predict the outcome of a surgery.
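A typical example of the general overlap measures that the thesis argues are too coarse on their own is the Dice coefficient; the sketch below computes it for two synthetic binary masks (shapes and sizes invented).

```python
import numpy as np

def dice(a, b):
    """Dice coefficient of two binary masks: 2|A n B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto = np.zeros((64, 64), bool)
auto[10:40, 10:40] = True        # automatic segmentation (toy mask)
manual = np.zeros((64, 64), bool)
manual[12:42, 10:40] = True      # manual reference (toy mask)
print(f"Dice = {dice(auto, manual):.3f}")  # 1.0 would mean perfect overlap
```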
Factors triggering the ecotoxicity of metal-based nanoparticles towards aquatic invertebrates
(2015)
Nanoparticles are produced and used in huge amounts, increasing the probability that they end up in surface waters. There, they are subject to environmentally driven modification processes. Consequently, aquatic life may be exposed to different nanoparticle agglomerate sizes, while after sedimentation benthic organisms are more likely to be affected.
However, most ecotoxicity studies with nanoparticles have exclusively investigated the implications of their characteristics (e.g. size) for pelagic organisms, ignoring environmentally modified nanoparticles. Therefore, a systematic assessment of factors triggering the fate and toxicity of nanoparticles under environmentally relevant conditions is needed. The present thesis therefore investigates the implications of nanoparticle-related factors (i.e., inherent material properties and nanoparticle characteristics) as well as environmental conditions for the pelagic organism Daphnia magna and the benthic species Gammarus fossarum. In detail, inert titanium dioxide (nTiO2) and ion-releasing silver nanoparticles (nAg), both of varying particle characteristics (e.g. initial size), were tested for their toxicity under different environmental conditions (e.g. ultraviolet (UV) light).
The results indicate that the toxicity of nTiO2 and nAg is mainly determined by their adsorption potential onto biota and by their fate in terms of reactive oxygen species formation or Ag+ ion release. Thus, inherent material properties, nanoparticle characteristics and environmental conditions promoting or inhibiting these aspects had significant implications for the toxicity of nTiO2 and nAg towards daphnids.
Furthermore, the presence of ambient UV light, for example, adversely affected gammarids at 0.20 mg nTiO2/L, while under darkness no effects occurred even at 5.00 mg nTiO2/L. Hence, the currently assumed risk of nanoparticles might be underestimated if their interaction with environmental parameters is disregarded.
The formulation of the decoding problem for linear block codes as an integer program (IP) with a rather tight linear programming (LP) relaxation has made a central part of channel coding accessible for the theory and methods of mathematical optimization, especially integer programming, polyhedral combinatorics and also algorithmic graph theory, since the important class of turbo codes exhibits an inherent graphical structure. We present several novel models, algorithms and theoretical results for error-correction decoding based on mathematical optimization. Our contribution includes a partly combinatorial LP decoder for turbo codes, a fast branch-and-cut algorithm for maximum-likelihood (ML) decoding of arbitrary binary linear codes, a theoretical analysis of the LP decoder's performance for 3-dimensional turbo codes, compact IP models for various heuristic algorithms as well as ML decoding in combination with higher-order modulation, and, finally, first steps towards an implementation of the LP decoder in specialized hardware. The scientific contributions are presented in the form of seven revised reprints of papers that appeared in peer-reviewed international journals or conference proceedings. They are accompanied by an extensive introductory part that reviews the basics of mathematical optimization, coding theory, and the previous results on LP decoding that we rely on afterwards.
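A minimal instance of the LP relaxation of ML decoding (Feldman's formulation with odd-subset parity inequalities) can be set up with scipy; the parity-check matrix and LLRs below form a toy example, not one of the codes studied in the thesis.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Toy parity-check matrix of the length-3 repetition code (assumed example).
H = np.array([[1, 1, 0],
              [0, 1, 1]])
llr = np.array([-2.0, 0.4, 0.8])   # channel LLRs; negative favours bit = 1

# Feldman's constraints: for each check j and every odd-sized S in N(j):
#   sum_{i in S} x_i - sum_{i in N(j)\S} x_i <= |S| - 1
A_ub, b_ub = [], []
for row in H:
    nbrs = np.flatnonzero(row)
    for k in range(1, len(nbrs) + 1, 2):
        for S in itertools.combinations(nbrs, k):
            a = np.zeros(H.shape[1])
            a[list(S)] = 1.0
            a[[i for i in nbrs if i not in S]] = -1.0
            A_ub.append(a)
            b_ub.append(len(S) - 1)

res = linprog(llr, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, 1)] * H.shape[1])
print(res.x)   # -> [1, 1, 1]; an integral LP optimum is the ML codeword
```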
This thesis examines the topic of leadership and health, summarizing various findings from the literature and having them evaluated by executives from industry and the police as well as by personnel and organizational developers. The goal was to find out whether the executives and the personnel and organizational developers consider the topic important, which main causes of absenteeism they see, and how they assess various findings from the health-management literature. In addition, they were asked to judge which measures they consider suitable and which resources are necessary to support employees in staying healthy. Finally, the executives and the personnel and organizational developers were asked to assess which leadership style is regarded as health-promoting. The executives from industry and the police as well as the personnel and organizational developers consider the topic of health important and do not see it as a mere fashion trend. Their assessments of suitable measures for improving employee health largely correspond to the suggestions for health-oriented leadership derived from the literature. The broad agreement between the views of research and practice suggests that the findings of the health-management literature are likely to be perceived as plausible by practitioners.
This thesis deals with the development of an interactive Android card game; as an example, the Hebrew game Yaniv was implemented. The focus lies on working out the required background components and their implementation in the application. The required game processes are examined and a possible solution is identified.
Geographic cluster-based routing in ad-hoc wireless sensor networks is a current field of research. Various algorithms for routing in wireless ad-hoc networks based on position information already exist, among them algorithms that use the traditional beaconing approach as well as beaconless algorithms (which require no information about the environment besides a node's own position and the destination). Geographic cluster-based routing with guaranteed message delivery can be carried out on overlay graphs as well. Until now, however, the required planar overlay graphs have not been constructed reactively.
This thesis proposes a reactive algorithm, the Beaconless Cluster Based Planarization (BCBP) algorithm, which constructs a planar overlay graph and noticeably reduces the number of messages required to do so. Based on an algorithm for cluster-based planarization, it beaconlessly constructs a planar overlay graph in a unit disk graph (UDG), a model for a wireless network in which every participant has the same sending radius. The evaluation shows the algorithm to be more efficient than the non-beaconless variant. Another result of this thesis is the Beaconless LLRAP (BLLRAP) algorithm, for which planarity, but not continued connectivity, could be proven.
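To illustrate the kind of overlay construction involved, the following is a minimal sketch of the classical Gabriel-graph test, a localized planarization rule often used in geographic routing; it is not the BCBP algorithm itself, which additionally works beaconlessly and on clusters. The function names and coordinates are hypothetical.

    import math

    def udg_edges(pos, radius):
        # Unit disk graph: an edge exists iff two nodes lie within the
        # common sending radius of each other.
        ids = sorted(pos)
        return [(u, v) for i, u in enumerate(ids) for v in ids[i + 1:]
                if math.dist(pos[u], pos[v]) <= radius]

    def gabriel_overlay(pos, edges):
        # Keep edge (u, v) only if no third node lies strictly inside
        # the circle whose diameter is the segment uv. Applied to a UDG,
        # this yields a planar overlay that preserves connectivity.
        kept = []
        for u, v in edges:
            cx = (pos[u][0] + pos[v][0]) / 2
            cy = (pos[u][1] + pos[v][1]) / 2
            r = math.dist(pos[u], pos[v]) / 2
            if all(math.dist((cx, cy), pos[w]) >= r
                   for w in pos if w not in (u, v)):
                kept.append((u, v))
        return kept

    # Hypothetical example: node id -> (x, y) position, sending radius 1.0
    pos = {0: (0.0, 0.0), 1: (0.8, 0.1), 2: (0.4, 0.7)}
    print(gabriel_overlay(pos, udg_edges(pos, 1.0)))

In the non-reactive (beacon-based) setting each node can evaluate this test locally from its neighbors' advertised positions; the reactive setting addressed by BCBP must achieve the same overlay without such periodic position broadcasts.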
A fundamental understanding of the attachment of engineered nanoparticles to environmental surfaces is essential for predicting nanoparticle fate and transport in the environment.
The present work investigates the attachment of non-coated silver nanoparticles and citrate-coated silver nanoparticles to different model surfaces and environmental surfaces in the presence and absence of humic acid. Batch sorption experiments were used for this investigation.
The objective of this thesis was to investigate how silver nanoparticles interact with surfaces bearing different chemical functional groups; the effect of the presence of humic acid on the particle-surface interactions was also investigated. In the absence of humic acid, nanoparticle-surface attachment was influenced by the chemical nature of the interacting surfaces. In the presence of humic acid, on the other hand, attachment was influenced by the specific surface area of the sorbent surfaces. The sorption of non-coated and citrate-coated silver nanoparticles to all surfaces was nonlinear and best described by a Langmuir isotherm, indicating monolayer sorption of nanoparticles onto the surfaces; this can be explained by the blocking effect generated by particle-particle repulsion. In the presence of humic acid, sorption of nanoparticles to the surfaces was linear. When humic acid was present in the interacting medium, both the nanoparticles and the surfaces became coated with humic acid, which masks the chemical functionalities of the surfaces and thus changes the particle-surface interactions. For silver nanoparticle sorption from an unstable suspension, the sorption isotherms did not follow any classical sorption model, suggesting an interplay between aggregation and sorption. Citrate-coated silver nanoparticles and humic-acid-coated silver nanoparticles showed a depression in sorption compared to non-coated silver nanoparticles. In the case of citrate-coated silver nanoparticles, the decrease in sorption can be explained by their more negative zeta potential compared to non-coated ones. For humic-acid-coated nanoparticles, the sorption depression can be due to steric hindrance caused by free humic acid molecules, which may coat the sorbent surface, or to competition for sorption sites between the nanoparticles and the free humic acid molecules present in the suspension. Nanoparticle surface chemistry is thus an important factor determining the attachment of nanoparticles to surfaces, which makes the characterization of the nanoparticle surface an essential step in the study of their environmental fate.
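For reference, the two isotherm shapes contrasted here are standard sorption models; with $q$ the sorbed amount, $C_e$ the equilibrium concentration, $q_{\max}$ the monolayer capacity, and $K$ and $K_d$ the affinity and distribution coefficients,

    q = \frac{q_{\max} K C_e}{1 + K C_e} \quad \text{(Langmuir, nonlinear)}, \qquad q = K_d\, C_e \quad \text{(linear)}.

The Langmuir plateau at $q_{\max}$ is what indicates monolayer coverage: once mutually repelling particles occupy the surface, further sorption is blocked regardless of the suspension concentration.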
Another aim of this study was to demonstrate the potential of chemical force microscopy for nanoparticle surface characterization. With this technique it was possible to distinguish between bare silver nanoparticles, citrate-coated silver nanoparticles, and humic-acid-coated silver nanoparticles by measuring the adhesion forces between the nanoparticles and five AFM probes with different chemical functionalizations.
The intention of this thesis was to characterise the effect of naturally occurring multivalent cations such as calcium and aluminium on the structure of soil organic matter (SOM) as well as on the sorption behaviour of SOM for heavy metals such as lead.
The first part of this thesis describes the results of experiments in which the Al and Ca cation content was varied for various samples originating from soils and peats of different regions in Germany. The second part focusses on SOM-metal cation precipitates in order to study rigidity as a function of cation content. In the third part, the effects of various cation contents in SOM on the binding strength of Pb cations were characterised using a cation exchange resin as desorption method.
It was found for soil and peat samples as well as for precipitates that matrix rigidity was affected by both the type and the content of the cation. The influence of Ca on rigidity was less pronounced than that of Al and of the Pb used in the precipitation experiments. For each sample, one cation content was identified at which matrix rigidity was most pronounced; this specific cation content lies below the cation saturation expected from the cation exchange capacity. These findings resulted in a model describing the relation between cation type, cation content and the degree of networking in SOM. For all treated soil and precipitate samples a glass-transition-like step transition was observed, characterised by the step transition temperature T*. It is known from the literature that this type of step transition is due to bridges between water molecules and organic functional groups in SOM. In contrast to the glass transition temperature, this thermal event reverses only slowly over days or weeks, depending on the re-conformation of the water molecules. Therefore, changes of T* with different cation compositions in the samples are explained by the formation of water-molecule-cation bridges between SOM functional groups. No influence of the different cation compositions on the desorption kinetics of lead was observed in the soil samples. It can therefore be assumed that the observed changes in matrix rigidity are highly reversible upon changes in water status or pH, or upon the input of agitation energy by shaking.
This master's thesis investigates the topic of intercultural web design. Two websites from different countries are compared as examples. On the basis of cultural dimensions, cultural differences are identified on each website. The analysis focuses in particular on how thoroughly the respective website designers and operators take their users' cultural differences into account and create a cross-cultural web design. The analysis illustrates which cultural, and particularly intercultural, aspects of the countries were taken into consideration in the design of the websites. The investigation led to the conclusion that this was not consistently implemented for all websites. Hence, this thesis offers suggestions for improving the aspects that are most important in intercultural web design.
Proceedings of the 9th Open German-Russian Workshop on Pattern Recognition and Image Understanding
(2015)
The Proceedings of the 9th Open German-Russian Workshop on Pattern Recognition and Image Understanding include publications (extended abstracts) that cover, but are not limited to, the following topics:
- Mathematical Theory of Pattern Recognition, Image and Speech Processing, Analysis, Recognition and Understanding
- Cognitive Technologies, Information Technologies, Automated Systems and Software for Pattern Recognition, Image, Speech and Signal Processing, Analysis and Understanding
- Databases, Knowledge Bases, and Linguistic Tools
- Special-Purpose Architectures, Software and Hardware Tools
- Vision and Sensor Data Interpretation for Robotics
- Industrial, Medical, Multimedia and Other Applications
- Algorithms, Software, Automated Systems and Information Technologies in Bioinformatics and Medical Informatics
The workshop took place from December 1st to 5th, 2014, at the University of Koblenz-Landau in Koblenz, Germany.