This Master's thesis is an exploratory study of whether it is feasible to construct a subjectivity lexicon from Wikipedia. The key hypothesis is that all quotes in Wikipedia are subjective and all regular text is objective. The degree of subjectivity of a word, its "Quote Score", is determined from the ratio of the word's frequency inside quotations to its frequency outside quotations. The proportion of words in the English Wikipedia that occur within quotations is found to be much smaller than the proportion outside quotes, resulting in a right-skewed distribution and a low mean Quote Score.
The methodology used to generate the subjectivity lexicon from the English Wikipedia text corpus is designed so that it can be scaled and reused to produce similar subjectivity lexica for other languages. This is achieved by abstaining from domain- and language-specific methods, apart from using readily available English dictionary packages to detect and exclude stopwords and non-English words in the Wikipedia text corpus.
The subjectivity lexicon generated from English Wikipedia is compared against other lexica, namely MPQA and SentiWordNet. Words classified as strongly subjective tend to have high Quote Scores, and there is a large observable difference between the distribution of Quote Scores for strongly subjective words and the distributions for weakly subjective and objective words. However, weakly subjective and objective words cannot be differentiated clearly based on Quote Score alone. In addition, a questionnaire was commissioned as an exploratory approach to investigate whether a subjectivity lexicon generated from Wikipedia could be used to extend the word coverage of existing lexica.
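The Quote Score above is, at its core, a frequency ratio. A minimal sketch of one plausible way to compute it is shown below; the thesis's exact normalization, smoothing and filtering are not specified here, so the function name and the `min_count` cutoff are assumptions for illustration.

```python
from collections import Counter

def quote_scores(in_quote_tokens, out_quote_tokens, min_count=5):
    """Quote Score per word: relative frequency inside quotations
    divided by relative frequency outside quotations. Rare words
    (fewer than min_count total occurrences) are skipped."""
    in_freq = Counter(in_quote_tokens)
    out_freq = Counter(out_quote_tokens)
    n_in, n_out = sum(in_freq.values()), sum(out_freq.values())
    scores = {}
    for word in set(in_freq) | set(out_freq):
        if in_freq[word] + out_freq[word] < min_count:
            continue  # too rare for a stable ratio
        p_in = in_freq[word] / n_in
        p_out = out_freq[word] / n_out
        # words never seen outside quotes get an infinite score;
        # smoothing would be an alternative to this special case
        scores[word] = p_in / p_out if p_out > 0 else float("inf")
    return scores
```

Because far fewer Wikipedia tokens fall inside quotations than outside, most ratios computed this way come out small, matching the right-skewed distribution reported above.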
Groundwater is essential for the provision of drinking water in many areas around the world. The ecosystem services provided by groundwater-dwelling organisms are crucial for the quality of groundwater-bearing aquifers. Therefore, if remediation of contaminated groundwater is necessary, the remediation method has to be carefully selected to avoid risk-risk trade-offs that might impact these valuable ecosystems. In the present thesis, the ecotoxicity of the in situ remediation agent Carbo-Iron (a composite of zero-valent nano-iron and activated carbon) was investigated, its environmental risk was estimated, and the risk and benefit of a groundwater remediation with Carbo-Iron were comprehensively analysed.
At the beginning of the work on the present thesis, a sound assessment of the environmental risks of nanomaterials was impeded by a lack of guidance documents, resulting in many uncertainties in the selection of suitable test methods and a low comparability of test results from different studies with similar nanomaterials. The low comparability stemmed from methodological aspects of the testing procedures before and during toxicity testing. Therefore, decision trees were developed as a tool to systematically decide on ecotoxicity test procedures for nanomaterials. Potential effects of Carbo-Iron on embryonic, juvenile and adult life stages of zebrafish (Danio rerio) and the amphipod Hyalella azteca were investigated in acute and chronic tests. These tests were based on existing OECD and EPA test guidelines (OECD, 1992a, 2013a, 2013b; US EPA, 2000) to facilitate the use of the obtained effect data in the risk assessment. Additionally, the uptake of particles into the test organisms was investigated using microscopic methods. In zebrafish embryos, the effects of Carbo-Iron on gene expression were investigated. The obtained ecotoxicity data were complemented by studies with the waterflea Daphnia magna, the alga Scenedesmus vacuolatus, larvae of the insect species Chironomus riparius, and nitrifying soil microorganisms.
In the fish embryo test, no passage of Carbo-Iron particles into the perivitelline space or the embryo was observed. In D. rerio and H. azteca, Carbo-Iron was detected in the gut at the end of exposure, but no passage into the surrounding tissue was detected. Carbo-Iron had no significant effect on soil microorganisms and on survival and growth of fish. However, it had significant effects on the growth, feeding rate and reproduction of H. azteca and on survival and reproduction in D. magna. Additionally, the development rate of C. riparius and the cell volume of S. vacuolatus were negatively influenced.
A predicted no-effect concentration (PNEC) of 0.1 mg/L was derived from the ecotoxicity studies, based on the no-effect level determined in the reproduction test with D. magna and an assessment factor of 10. It was compared to measured and modelled environmental concentrations of Carbo-Iron after its application to an aquifer contaminated with chlorohydrocarbons in a field study. Based on these concentrations, risk quotients were derived. Additionally, the overall environmental risk before and after the Carbo-Iron application was assessed to verify whether the chance of a risk-risk trade-off from the remediation of the contaminated site could be minimized. With the data used in the present study, a reduced environmental risk was identified after the application of Carbo-Iron. Thus, the benefit of remediation with Carbo-Iron outweighs its potential negative effects on the environment.
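The PNEC derivation and risk quotient comparison described above follow the standard ERA arithmetic (PNEC = NOEC / assessment factor; RQ = PEC / PNEC). The sketch below reproduces that arithmetic; the NOEC of 1 mg/L is implied by the stated PNEC and assessment factor, while the exposure concentration is a purely illustrative placeholder, not a value from the field study.

```python
def pnec(noec_mg_l, assessment_factor):
    """Predicted no-effect concentration: the lowest reliable
    no-effect level divided by an assessment factor covering
    extrapolation uncertainty."""
    return noec_mg_l / assessment_factor

def risk_quotient(pec_mg_l, pnec_mg_l):
    """RQ = PEC / PNEC; values >= 1 flag a potential risk."""
    return pec_mg_l / pnec_mg_l

# A NOEC of 1 mg/L with an assessment factor of 10 yields the
# PNEC of 0.1 mg/L stated in the thesis.
p = pnec(1.0, 10)
# Hypothetical exposure concentration for illustration only:
rq = risk_quotient(0.05, p)  # below 1, so no risk is indicated
```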
The thesis develops and evaluates a hypothetical model of the factors that influence user acceptance of weblog technology. Previous acceptance studies are reviewed, and the various models employed are discussed. The resulting model is based on the technology acceptance model (TAM) by Davis et al. It is conceptualized and operationalized in a quantitative survey conducted by means of an online questionnaire, strictly from a user perspective. Finally, it is tested and validated using methods of data analysis.
Constructing a business process model manually is a highly complex and error-prone task that takes a lot of time and deep insight into the organizational structure, its operations and its business rules. To improve the output of business analysts dealing with this process, researchers have introduced different techniques that support them during construction with helpful recommendations. These recommendation systems vary both in what they recommend in the first place and in the calculations performed under the hood to suggest the most fitting element to the user. After a broad introduction to the field of business process modeling and its basic recommendation structures, this work takes a closer look at diverse proposals and descriptions published in the current literature regarding implementation strategies to effectively and efficiently assist modelers during business process model creation. A critical analysis of the selected literature points out the strengths and weaknesses of the approaches, the studies and their descriptions. As a result, the final concept matrix in this work gives a precise and helpful overview of the key features and recommendation methods used and implemented in previous research, pinpointing an entry into future work that avoids the downsides already spotted by fellow researchers.
Today’s agriculture heavily relies on pesticides to manage diverse pests and maximise crop yields. Despite elaborate regulation of pesticide use based on a complex environmental risk assessment (ERA) scheme, the widespread use of these biologically active compounds has been shown to be a threat to the environment. For surface waters, pesticide exposure has been observed to exceed safe concentration levels and to negatively impact stream ecology, raising the question of whether current ERA schemes ensure a sustainable use of pesticides. To answer this, the large-scale “Kleingewässer-Monitoring” (KgM) assessed the occurrence of pesticides and related effects in 124 streams throughout Germany (Central Europe) in 2018 and 2019.
Based on five scientific publications originating from the KgM, this thesis evaluated pesticide exposure in streams, ecological effects and the regulatory implications. More than 1,000 water samples were analysed for over 100 pesticide analytes to characterise occurrence patterns (publication 1). Measured concentrations and effects were used to validate the exposure and effect concentrations predicted in the ERA (publication 2). By jointly analysing real-world pesticide application data and measured pesticide mixtures in streams, the disregard of environmental pesticide mixtures in the ERA was evaluated (publication 3). The toxic potential of mixtures in stream water was additionally investigated using suspect screening for 395 chemicals and a battery of in-vitro bioassays (publication 4). Finally, the results from the KgM stream monitoring were used to assess the capability to identify pesticide risks in governmental monitoring programmes (publication 5).
The results of this thesis reveal the widespread occurrence of pesticides in non-target stream ecosystems. The water samples contained a variety of pesticides occurring in complex mixtures, predominantly in short-term peaks after rainfall events (publications 1 & 4). The respective pesticide concentration maxima were linked to declines in vulnerable invertebrate species and exceeded regulatory acceptable concentrations in about 80% of agricultural streams, while these thresholds were nevertheless estimated to be partly insufficient to protect the invertebrate community (publication 2). The co-occurrence of pesticides in streams led to a risk that the single-substance-oriented ERA underestimated by a factor of about 3.2 in realistic worst-case scenarios, further exacerbated by the high frequency at which non-target organisms are exposed to pesticides (publication 3). Stream water samples taken after rainfall caused distinct effects in bioassays that were only explainable to a minor extent by the many analytes, indicating the relevance of unknown chemical or biological mixture components (publication 4). Finally, the regulatory monitoring of surface waters under the Water Framework Directive (WFD) was found to significantly underestimate pesticide risks, as about three quarters of critical pesticides and more than half of the streams at risk were overlooked (publication 5).
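Mixture underestimation of the kind quantified in publication 3 is commonly illustrated with concentration addition, i.e. summing the toxic units (concentration over threshold) of co-occurring substances instead of considering only the single worst compound. The sketch below uses hypothetical concentrations and thresholds; it is not the publication's actual data or method.

```python
def single_substance_risk(concentrations, thresholds):
    """Single-substance view: the risk quotient of the worst
    compound alone, as in a substance-by-substance ERA."""
    return max(c / t for c, t in zip(concentrations, thresholds))

def mixture_risk(concentrations, thresholds):
    """Concentration-addition view: the toxic units of all
    co-occurring compounds are summed."""
    return sum(c / t for c, t in zip(concentrations, thresholds))

# Three hypothetical pesticides, each at half its individual
# regulatory acceptable concentration (units arbitrary).
conc = [0.5, 0.5, 0.5]
rac = [1.0, 1.0, 1.0]
single = single_substance_risk(conc, rac)  # 0.5: each looks safe alone
mixed = mixture_risk(conc, rac)            # 1.5: the mixture exceeds 1
```

In this toy case the single-substance view understates the additive mixture risk threefold, qualitatively matching the kind of underestimation the thesis reports.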
Essentially, this thesis provides a new level of validation of the ERA of pesticides in aquatic ecosystems by assessing pesticide occurrence and environmental impacts at a so far unique scale. The overall results demonstrate that the current agricultural use of pesticides leads to significant impacts on stream ecology that go beyond the level tolerated under the ERA. This thesis identified the underestimation of pesticide exposure, the potential insufficiency of regulatory thresholds and the general inertia of the authorisation process as the main causes of the ERA failing to meet its objectives. To achieve a sustainable use of pesticides, the thesis proposes substantial refinements of the ERA. Adequate monitoring programmes such as the KgM, which go beyond current governmental monitoring efforts, will continue to be needed to keep pesticide regulators constantly informed about the validity of their prospective ERA, which will always be subject to uncertainty.
Instructor feedback on written assignments is one of the most important elements in the writing process, especially for students writing in English as a foreign language. However, students are often critical of both the amount and quality of the feedback they receive. In order to better understand what makes feedback effective, this study explored the nature of students’ assessments of the educational alliance, and how their receptivity to, perceptions of, and decisions about using their instructors’ feedback differed depending on how strong they believed the educational alliance to be. This exploratory case study found that students not only assessed the quality of the educational alliance based on goal compatibility, task relevance, and teacher effectiveness, but that there was also a reciprocal relationship between these elements. Furthermore, students’ perceptions of the educational alliance directly influenced how they perceived the feedback, which made the instructor’s choice of feedback method largely irrelevant. Stronger educational alliances resulted in higher instances of critical engagement, intrinsic motivation, and feelings of self-efficacy. The multidirectional influence of goal, task, and bond means that instructors who want to maximize their feedback efforts need to attend to all three.
Thousands of chemicals from daily use are discharged from civilization into the water cycle via different pathways. Ingredients of personal care products, detergents, pharmaceuticals, pesticides, and industrial chemicals thus find their way into aquatic ecosystems and may adversely impact their ecology. Pharmaceuticals, for instance, represent a central group of anthropogenic chemicals because of their designed potency to interfere with physiological functions in organisms. Ecotoxicological effects of pharmaceutical burden have been verified in the past. Therapeutic groups with pronounced endocrine-disrupting potential, such as steroid hormones, are gaining increasing focus in environmental research, as they have been reported to cause endocrine disruption in aquatic organisms even at environmentally relevant concentrations. This thesis comprises a comprehensive investigation of the occurrence of corticosteroids and progestogens in wastewater treatment plant (WWTP) effluents and surface waters, as well as the elucidation of the fate and biodegradability of these steroid families during activated sludge treatment. For the first goal of the thesis, a robust and highly sensitive analytical method based on liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed in order to simultaneously determine the occurrence of around 60 mineralocorticoids, glucocorticoids and progestogens in the aquatic environment. A special focus was set on compound selection due to the diversity of marketed synthetic steroids. Several analytical challenges were addressed with individual approaches regarding sensitivity enhancement and compound stability. These results may be important for further research in the environmental analysis of steroid hormones.
Reliable and low quantification limits are the prerequisite for determining corticosteroids and progestogens at relevant concentrations, given their low consumption volumes and simultaneously low effect-based trigger values. Achieved quantification limits for all target analytes ranged between 0.02 ng/L and 0.5 ng/L in surface water and 0.05 ng/L to 5 ng/L in WWTP effluents. This sensitivity enabled the detection of three mineralocorticoids, 23 glucocorticoids and 10 progestogens within the sampling campaign across Germany. Many of them were detected for the first time in the environment, particularly in Germany and the EU. To the best of our knowledge, this in-depth steroid screening provided a good overview of single-steroid burden and allowed, for the first time, the identification of the predominant steroids of each steroid type analyzed. The frequent detection of highly potent synthetic steroids (e.g. triamcinolone acetonide, clobetasol propionate, betamethasone valerate, dienogest, cyproterone acetate) highlighted insufficient removal during conventional wastewater treatment and indicated the need for regulation to control their emission, since the steroid concentrations were found to be above the reported effect-based trigger values for biota. Overall, the study revealed reliable environmental data on poorly or even not yet analyzed steroids. The results complement the existing knowledge in this field but also provide new information which can be used particularly for compound prioritization in ecotoxicological research and environmental analysis. Based on the data obtained from the monitoring campaign, incubation experiments were conducted to compare the biodegradability and transformation processes of structurally related steroids in activated sludge treatment under aerobic and standardized experimental conditions. The compounds were carefully selected to cover the manifold structural moieties of commonly used glucocorticoids, including non-halogenated and halogenated steroids, their mono- and diesters, and several acetonide-type steroids. This approach allowed for a structure-based interpretation of the results. The obtained biodegradation rate constants suggested large variations in biodegradability (half-lives ranged from < 0.5 h to > 14 d). Increasing stability was identified in the order from non-halogenated steroids (e.g. hydrocortisone), over 9α-halogenated steroids (e.g. betamethasone), to C17-monoesters (e.g. betamethasone 17-valerate, clobetasol propionate), and finally to acetonides (e.g. triamcinolone acetonide), suggesting a strong relationship between biodegradability and glucocorticoid structure.
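The reported half-life range maps onto biodegradation rate constants via first-order kinetics, t1/2 = ln 2 / k, which is the usual assumption in such incubation experiments. The rate constants below are illustrative values chosen to match the two ends of the reported range, not measured constants from the thesis.

```python
import math

def half_life_h(k_per_h):
    """First-order half-life: t1/2 = ln(2) / k (hours)."""
    return math.log(2) / k_per_h

def remaining_fraction(k_per_h, t_h):
    """Fraction of parent compound left after t hours: exp(-k t)."""
    return math.exp(-k_per_h * t_h)

fast = half_life_h(1.4)    # ~0.50 h: the readily degradable end of the range
slow = half_life_h(0.002)  # ~347 h (~14.4 d): the recalcitrant end
```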
Some explanations for this behavior were obtained by identifying the transformation products (TPs) and elucidating individual transformation pathways. The results revealed, for the first time, how the likelihood of transformation reactions depends on the chemical steroid structure. Among the identified TPs, the carboxylates (e.g. TPs of fluticasone propionate and triamcinolone acetonide) proved persistent in subsequent incubation experiments. The newly identified TPs were furthermore frequently detected in the effluents of full-scale wastewater treatment plants. These findings emphasized (i) the transferability of the lab-scale degradation experiments to the real world and (ii) that insufficient removal may cause adverse effects in the aquatic environment owing to the ability of the precursor steroids and TPs to interact with the endocrine system of biota. For the last goal, the conceptual study for glucocorticoids was applied to progestogens.
Here, two sub-types of the steroid family frequently used for hormonal contraception were selected (the 17α-hydroxyprogesterone and 19-nortestosterone types). The progestogens showed fast and complete degradation within six hours, demonstrating pronounced biodegradability. However, cyproterone acetate and dienogest were found to be more recalcitrant in activated sludge treatment. This was consistent with their ubiquitous occurrence during the previous monitoring campaign. The elucidation of TPs again revealed crucial information regarding the observed behavior and furthermore highlighted the formation of hazardous TPs. It was shown that 19-nortestosterone-type steroids can undergo aromatization at ring A in contact with activated sludge, leading to the formation of estrogen-like TPs with a phenolic moiety at ring A. In the case of norethisterone, the formation of 17α-ethinylestradiol was confirmed, a well-known potent synthetic estrogen with elevated ecotoxicological potency. Thus, the results indicated for the very first time a previously unknown source of estrogenic compounds, particularly of 17α-ethinylestradiol. In conclusion, depending on their chemical structure, some steroids were found to be very stable in activated sludge treatment, others degraded well, and still others degraded but predominantly into active TPs. Fluorinated acetonide steroids such as triamcinolone acetonide and fluocinolone acetonide are poorly biodegradable, which is reflected in the high concentrations detected ubiquitously in WWTP effluents. Endogenous steroids and their closest synthetic relatives, such as hydrocortisone, prednisolone or 17α-hydroxyprogesterone, are readily biodegradable. Regardless of their high influent concentrations, they are almost completely removed in conventional WWTPs.
Steroids between these extremes were found to form elevated quantities of TPs that are partially still active, which is particularly the case for betamethasone, fluticasone propionate, cyproterone acetate and dienogest. The thesis illustrates the need for an extensive evaluation of the environmental risks and shows that corticosteroids and progestogens merit more attention in environmental regulation and research than is currently the case.
The decline of biodiversity can be observed worldwide, and its consequences are alarming. It is therefore crucial that nature be protected and, where possible, restored. A wide variety of project options is available. Yet, given the limited availability of resources, the selection of the most efficient measures is increasingly important, and for this purpose there is still a lack of information. As outlined in the next paragraph, this pertains in particular to information at different scales of projects.
Firstly, there is a lack of information on the concrete added value of biodiversity protection projects. Secondly, there is a lack of information on the actual impacts of such projects and on the costs and benefits associated with a project. Finally, there is a lack of information on the links between the design of a project, the associated framework conditions and the perception of specific impacts. This paper addresses this knowledge gap by providing more information on the three scales by means of three empirical studies on three different biodiversity protection projects in order to help optimize future projects.
The first study “Assessing the trade-offs in more nature-friendly mosquito control in the Upper Rhine region” examines the added value of a more nature-friendly mosquito control in the Upper Rhine Valley of Germany using a contingent valuation method. Recent studies show that the widely used biocide Bti, which is used as the main mosquito control agent in many parts of the world, has more negative effects on nature than previously expected. However, it is not yet clear whether the population supports a more nature-friendly mosquito control, as such an adaptation could potentially lead to higher nuisance. This study attempts to answer this question by assessing the willingness to pay for an adapted mosquito control strategy that reduces the use of Bti, while maintaining nuisance protection within settlements. The results show that the majority of the surveyed population attaches a high value to a more nature-friendly mosquito control and is willing to accept a higher nuisance outside of the villages.
The second study, “Inner city river restoration projects: the role of project components for acceptance”, examines the acceptance of a river restoration project in Rhineland-Palatinate, Germany. Despite much effort, many rivers worldwide are still in poor condition, so a rapid implementation of river restoration projects is of great importance. In this context, acceptance by society plays a fundamental role; however, the factors determining such acceptance are still poorly understood. In particular, the complex interplay between the acceptance or rejection of specific project components and the acceptance of the overall project requires further exploration. This study addresses this knowledge gap by assessing the acceptance of the project, its various ecological and social components, and the perception of real and fictitious costs as well as the benefits of the components. Our findings demonstrate that while acceptance of the overall project is generally rather high, many respondents reject one or more of the project's components. Complementary social project components, like a playground, find less support than purely ecological components. Overall, our research shows that complementary components may increase or decrease acceptance of the overall project. We furthermore found that differences in the acceptance of the individual components depend on individual concerns, such as perceived flood risk, construction costs, expected noise and littering, as well as on the quality of communication, attachment to the site, and the age of the respondents.
The third study, “What determines preferences for semi-natural habitats in agrarian landscapes? A choice-modelling approach across two countries using attributes characterizing vegetation”, investigates people's aesthetic preferences for semi-natural habitats in agricultural landscapes. The EU Common Agricultural Policy promotes the introduction of woody and grassy semi-natural habitats (SNH) in agricultural landscapes. While the benefits of these structures in terms of regulating ecosystem services are already well understood, the effects of SNH on visual landscape quality are still not clear. This study investigates the factors determining people’s visual preferences in the context of grassy and woody SNH elements in Swiss and Hungarian landscapes using picture-based choice experiments. The results suggest that respondents’ choices strongly depend on specific vegetation characteristics that appear and disappear over the year. In particular, flowers as a source of colour, green vegetation, ordered structure and the proportion of uncovered soil in the picture play an important role in respondents’ aesthetic perceptions of the pictures.
The three empirical studies can help to make future projects in the study areas of biodiversity protection more efficient. While this thesis highlights the importance of exploring biodiversity protection projects at different scales, further analyses of the different scales of biodiversity protection projects are needed to provide a sound basis to develop guidance on identifying the most efficient biodiversity protection projects.
Generalized methods for automated theorem proving can be used to compute formula transformations such as projection elimination and knowledge compilation. We present a framework based on clausal tableaux suited for such tasks. These tableaux are characterized independently of particular construction methods, but important features of empirically successful methods are taken into account, especially dependency directed backjumping and branch local operation. As an instance of that framework an adaption of DPLL is described. We show that knowledge compilation methods can be essentially improved by weaving projection elimination partially into the compilation phase.
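For orientation, a textbook DPLL core on clausal form looks as follows; this is a generic sketch for illustration, not the thesis's tableau-based adaptation with dependency directed backjumping.

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL satisfiability test.
    clauses: list of clauses, each a list of non-zero ints,
    where -n denotes the negation of variable n. Returns a set
    of satisfying literals, or None if unsatisfiable."""
    if assignment is None:
        assignment = set()
    simplified = []
    for clause in clauses:
        if any(lit in assignment for lit in clause):
            continue  # clause already satisfied
        reduced = [lit for lit in clause if -lit not in assignment]
        if not reduced:
            return None  # clause falsified: conflict
        simplified.append(reduced)
    if not simplified:
        return assignment  # every clause satisfied
    for clause in simplified:
        if len(clause) == 1:  # unit propagation
            return dpll(simplified, assignment | {clause[0]})
    lit = simplified[0][0]  # branch on an unassigned literal
    return (dpll(simplified, assignment | {lit})
            or dpll(simplified, assignment | {-lit}))
```

Each recursive call corresponds to extending one branch of a clausal tableau; the framework described above generalizes exactly this kind of branch-local operation.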
More than 10,000 organic chemicals such as pharmaceuticals, ingredients of personal care products and biocides are ubiquitously used in every day life. After their application, many of these chemicals enter the domestic sewer. Research has shown that conventional biological wastewater treatment in municipal wastewater treatment plants (WWTPs) is an insufficient barrier for the release of most of these anthropogenic chemicals into the receiving waters.
This bears unforeseen risks for aquatic wildlife and drinking water resources. Especially for recently introduced and/or recently detected compounds (so-called emerging micropollutants), there is a growing need to investigate their occurrence and fate in WWTPs. In order to get a comprehensive picture of their behavior in municipal wastewater treatment, the following groups of emerging organic micropollutants, spanning a broad range of applications and physico-chemical properties, were selected as target compounds: pharmaceuticals (beta blockers, psycho-active drugs), UV filters, vulcanization accelerators (benzothiazoles), biocides (anti-dandruff agents, preservatives, disinfectants) and pesticides (phenylurea and triazine herbicides).
Problematic smartphone use (PSU) and excessive consumption are increasingly being reported. In this study, an experiment was developed to investigate the influence of screen coloration, using the grayscale setting, on smartphone usage time in repeated measurements. We also investigated how individuals’ perceived suffering correlates with smartphone usage time and PSU, and whether differences exist by smartphone usage type (social, process, habitual). 240 subjects completed a questionnaire about smartphone usage time, PSU, perceived suffering, and smartphone usage types. Afterward, their smartphones were switched to the grayscale setting for at least 24 h, after which 92 of these participants completed a second questionnaire. Analyses showed that the grayscale setting decreases usage time and that there is a positive correlation between PSU, smartphone usage duration, and perceived suffering. The types of use (process and habitual) influence one’s perceived suffering. This shows that individuals are aware of their PSU and suffer from it. Using the grayscale setting is effective in reducing smartphone use time.
The work presented in this thesis investigated interactions of selected biophysical processes that affect zooplankton ecology at small scales. In this endeavour, the extent of the changes in swimming behaviour and of the fluid disturbances produced by swimming Daphnia in response to changing physical environments was quantified. For the first research question addressed in this context, the size and energetics of the hydrodynamic trails produced by Daphnia swimming in non-stratified still water were characterized and quantified as a function of organism size and swimming pattern.
The results revealed that neither the size nor the swimming pattern of Daphnia affects the width of the induced trails or the dissipation rates. Nevertheless, as the size and swimming velocity of the organisms increased, trail volume increased in proportion to the cube of the Reynolds number, and the biggest trail volume was about 500 times the body volume of the largest daphnids. The larger spatial extent of fluid perturbation and the prolonged decay period caused by bigger trail volumes would play a significant role in zooplankton ecology, e.g. by increasing the risk of predation.
The study also found that increased trail volume brought about significantly enhanced total dissipated power at higher Reynolds numbers; the magnitudes of total dissipated power varied in the range of (1.3–10) × 10⁻⁹ W.
Furthermore, this study provided strong evidence that swimming speed of Daphnia and total dissipated power in Daphnia trails exceeded those of some other selected zooplankton species.
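The Re³ scaling of trail volume reported above can be made concrete with the swimming Reynolds number, Re = uL/ν. The daphnid size, speed and reference point below are illustrative numbers for the sketch, not measurements from this thesis.

```python
def reynolds_number(speed_m_s, body_length_m, nu_m2_s=1.0e-6):
    """Swimming Reynolds number Re = u * L / nu, with nu the
    kinematic viscosity of water (~1e-6 m^2/s at 20 degrees C)."""
    return speed_m_s * body_length_m / nu_m2_s

def relative_trail_volume(re, re_ref, ratio_ref):
    """Trail volume in body volumes, assuming the reported
    proportionality to Re^3, anchored at a reference point."""
    return ratio_ref * (re / re_ref) ** 3

# A 3 mm daphnid swimming at 10 mm/s gives Re = 30; doubling Re
# would inflate the relative trail volume eightfold under Re^3 scaling.
re = reynolds_number(0.01, 0.003)
```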
In recognizing turbulence as an intrinsic environmental perturbation in aquatic habitats, this thesis also examined the response of Daphnia to a range of turbulent flows corresponding to turbulence levels that zooplankton generally encounter in their habitats. Results indicated that, within the range of turbulent intensities to which Daphnia are likely to be exposed in their natural habitats, increasing turbulence compelled the organisms to enhance their swimming activity and swimming speed. However, as the turbulence increased to extremely high values (10⁻⁴ m² s⁻³), Daphnia began to withdraw from their active swimming behaviour. The findings also demonstrated that the threshold level of turbulence at which the animals start to reduce their largely active swimming is about 10⁻⁶ m² s⁻³. The study further illustrated that in the intermediate range of turbulence, 10⁻⁷–10⁻⁶ m² s⁻³, kinetic energy dissipation rates in the vicinity of the organisms are consistently one order of magnitude higher than those of the background turbulent flow.
Swarming, a common and conspicuous behavioural trait observed in many zooplankton species, is considered to play a significant role in the freshwater ecology of their habitats, from food exploitation and mate encounter to predator avoidance mediated by the hydrodynamic flow structures the animals produce. This thesis therefore also investigated the implications of varied abundance and swarm densities of Daphnia swarms for their swimming kinematics and the induced flow field.
The results showed that Daphnia aggregated in swarms with swarm densities of (1.1–2.3) × 10³ L⁻¹, which exceeded the abundance densities (1.7–6.7 L⁻¹) by two orders of magnitude. The estimated swarm volume decreased from 52 cm³ to 6.5 cm³, and the mean neighbour distance dropped from 9.9 to 6.4 body lengths. The findings of this work also showed that mean swimming trajectories were primarily horizontal concentric circles around the light source. Mean flow speeds were found to be one order of magnitude lower than the corresponding swimming speeds of Daphnia. Furthermore, this study provided evidence that the flow fields produced by swarming Daphnia differed considerably between unidirectional vortex swarming at low abundances and bidirectional swimming at high abundances.
Agricultural land use may lead to brief pulse exposures of pesticides in edge-of-field streams, potentially resulting in adverse effects on aquatic macrophytes, invertebrates and ecosystem functions. Higher-tier risk assessment is mainly based on pond mesocosms, which are not designed to mimic stream-typical conditions, and relatively little is known about exposure and effect assessment using stream mesocosms.
Thus, the present thesis evaluates the applicability of stream mesocosms to mimic stream-typical pulse exposures, to assess the resulting effects on flora and fauna, and to evaluate aquatic-terrestrial food web coupling. The first objective was to mimic stream-typical pulse exposure scenarios of different durations (≤ 1 to ≥ 24 hours). These exposure scenarios, established using a fluorescence tracer, were the methodological basis for the effect assessment of an herbicide and an insecticide. In order to evaluate the applicability of stream mesocosms for regulatory purposes, the second objective was to assess effects on two aquatic macrophytes following a 24-h pulse exposure to the herbicide iofensulfuron-sodium (1, 3, 10 and 30 µg/L; n = 3). Growth inhibition of up to 66 and 45% was observed for the total shoot length of Myriophyllum spicatum and Elodea canadensis, respectively. Recovery of this endpoint was demonstrated within 42 days for both macrophytes. The third objective was to assess effects on structural and functional endpoints following a 6-h pulse exposure to the pyrethroid ether etofenprox (0.05, 0.5 and 5 µg/L; n = 4). The most sensitive structural (abundance of Cloeon simile) and functional (feeding rate of Asellus aquaticus) endpoints revealed significant effects at 0.05 µg/L etofenprox. This concentration is below field-measured etofenprox concentrations and thus suggests that pulse exposures adversely affect invertebrate populations and ecosystem functions in streams. Such pollution of streams may also reduce the emergence of aquatic insects and potentially lead to an insect-mediated transfer of pollutants to adjacent food webs. Test systems capable of assessing aquatic-terrestrial effects are not yet integrated into mesocosm approaches but might be of interest for substances with bioaccumulation potential. Here, the fourth part provides an aquatic-terrestrial model ecosystem capable of assessing cross-ecosystem effects.
Information on the riparian food web, such as the contribution of aquatic (up to 71%) and terrestrial (up to 29%) insect prey to the diet of the riparian spider Tetragnatha extensa, was assessed via stable isotope ratios (δ¹³C and δ¹⁵N). Thus, the present thesis provides the methodological basis to assess aquatic-terrestrial pollutant transfer and effects on the riparian food web.
Overall, the results of this thesis indicate that stream mesocosms can be used to mimic stream-typical pulse exposures of pesticides, to assess the resulting effects on macrophytes and invertebrates within prospective environmental risk assessment (ERA), and to evaluate changes in riparian food webs.
The Internet of Things (IoT) is a concept in which connected physical objects are integrated into the virtual world to become active partakers of businesses and everyday processes (Uckelmann, Harrison and Michahelles, 2011; Shrouf, Ordieres and Miragliotta, 2014). It is expected to have a major impact on businesses (Council, Nic and Intelligence, 2008), but small and medium enterprises’ business models are threatened if they do not adopt the new concept (Sommer, 2015). Thus, this thesis aims to showcase a sample implementation of connected devices in a small enterprise, demonstrating its added benefits for the business.
Design Science Research (DSR) is used to develop a prototype based on a use case provided by a carpentry workshop. The prototype comprises a hardware sensor and a web application which the wood shop can use to improve its processes. The thesis documents the iterative process of developing the prototype from the ground up to usable hardware and software.
This contribution provides an example of how IoT can be used and implemented at a small business.
Remote Working Study 2022
(2022)
The Remote Working Study 2022 focuses on the transition to working from home (WFH) triggered by the stay-at-home directives of 2020. These directives required employees to work from their private premises wherever possible to reduce the transmission of the coronavirus. The study, conducted by the Center for Enterprise Information Research (CEIR) at the University of Koblenz from December 2021 to January 2022, explores the transition to remote working.
The objective of the survey is to collect baseline information about organisations’ remote work experiences during and immediately following the COVID-19 lockdowns. The survey was completed by the key persons responsible for the implementation and/or management of the digital workplace in 19 German and Swiss organisations.
The data presented in this report was collected from member organisations of the IndustryConnect initiative. IndustryConnect is a university-industry research programme that is coordinated by researchers from the University of Koblenz. It focuses on research in the areas of the digital workplace and enterprise collaboration technologies, and facilitates the generation of new research insights and the exchange of experiences among user companies.
As Enterprise 2.0 (E2.0) initiatives are gradually moving out of the early experimentation phase it is time to focus greater attention on examining the structures, processes and operations surrounding E2.0 projects. In this paper we present the findings of an empirical study to investigate and understand the reasons for initiating E2.0 projects and the benefits being derived from them. Our study comprises seven in-depth case studies of E2.0 implementations. We develop a classification and means of visualising the scope of E2.0 initiatives and use these methods to analyse and compare projects.
Our findings indicate a wide range of motivations and combinations of technology in use and show a strong emphasis towards the content management functionality of E2.0 technologies.
Social media platforms such as Twitter or Reddit allow users almost unrestricted access to publish their opinions on recent events or discuss trending topics. While the majority of users approach these platforms innocently, some groups have set their minds on spreading misinformation and influencing or manipulating public opinion. These groups disguise themselves as native users from various countries to spread frequently manufactured articles and strongly polarizing political opinions, and may become providers of hate speech or extreme political positions. This thesis aims to implement an AutoML pipeline for identifying second-language speakers from English social media texts. We investigate style differences of texts across topics and across the platforms Reddit and Twitter, and analyse linguistic features. We employ feature-based models with datasets from Reddit, which include mostly English conversations from European users, and from Twitter, which was newly created by collecting English tweets on selected trending topics in different countries. The pipeline classifies the language family, native language and origin (native or non-native English speaker) of a given textual input. We evaluate the resulting classifications by comparing the prediction accuracy, precision and F1 scores of our classification pipeline to those of traditional machine learning processes. Lastly, we compare the results from each dataset and find differences in language use across topics and platforms. We obtained high prediction accuracy for all categories on the Twitter dataset and observed high variance in features such as average text length, especially for Balto-Slavic countries.
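The feature-based approach described above can be illustrated with a minimal sketch. This is not the thesis's actual AutoML pipeline: the stylometric features (average word length, type-token ratio), the nearest-centroid classifier and the class labels are all simplifying assumptions chosen for brevity:

```python
from collections import defaultdict
import math

def features(text):
    """Two simple stylometric features: average word length and
    type-token ratio. Real pipelines use many more features."""
    words = text.lower().split()
    if not words:
        return (0.0, 0.0)
    avg_len = sum(len(w) for w in words) / len(words)
    ttr = len(set(words)) / len(words)
    return (avg_len, ttr)

class NearestCentroid:
    """Minimal feature-based classifier: one centroid per class,
    prediction by Euclidean distance in feature space."""
    def fit(self, texts, labels):
        sums = defaultdict(lambda: [0.0, 0.0, 0])
        for text, label in zip(texts, labels):
            f = features(text)
            s = sums[label]
            s[0] += f[0]; s[1] += f[1]; s[2] += 1
        self.centroids = {y: (s[0] / s[2], s[1] / s[2])
                          for y, s in sums.items()}
        return self

    def predict(self, text):
        f = features(text)
        return min(self.centroids,
                   key=lambda y: math.dist(f, self.centroids[y]))
```

An AutoML system would search over such feature sets, model families and hyperparameters automatically; the sketch fixes all three for clarity.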
The Internet of Things is still one of the most relevant topics in economics and research, powered by the increasing demand for innovative services. Cost reductions in the manufacturing of IoT hardware and the development of completely new ways of communication have led to billions of devices being connected to the internet. But in order to master this new IoT landscape, a standardized solution for these challenges must be developed: the IoT architecture.
This thesis examines the structure, purpose and requirements of IoT architecture models in the global IoT landscape and provides an overview of selected models. For that purpose, a structured literature analysis is conducted within this thesis, including an analysis of three existing research approaches that attempt to frame the topic and a tool-supported evaluation of IoT architecture literature with over 200 accessed documents.
Furthermore, 30 different IoT architecture models are coded with the help of the specialised coding tool ATLAS.ti 8. In a final step, these architecture models are categorized and compared, showing that the environment of IoT and its architectures becomes ever more complex the further the research goes.
The automatic detection of the position and orientation of subsea cables and pipelines in camera images enables underwater vehicles to perform autonomous inspections. However, growth such as algae on top of and near cables and pipelines complicates their visual detection: determining the position via border detection followed by line extraction often fails. Probabilistic approaches are superior to deterministic approaches here: by modeling probabilities it is possible to make assumptions about the state of the system even if the number of extracted features is small. This work introduces a new tracking system for cable/pipeline following in image sequences based on particle filters. Extensive experiments on realistic underwater videos show the robustness and performance of this approach and demonstrate advantages over previous works.
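The abstract does not spell out the filter recursion, but the bootstrap particle filter it builds on can be sketched for a one-dimensional state, here standing in for something like the lateral offset of the cable in the image. The motion model, observation model and all noise levels below are illustrative assumptions, not the thesis's actual design:

```python
import math
import random

def gauss_pdf(x, mu, std):
    """Unnormalised Gaussian likelihood."""
    return math.exp(-0.5 * ((x - mu) / std) ** 2)

def particle_filter_step(particles, weights, observation,
                         motion_std=0.5, obs_std=1.0):
    """One predict-update-resample cycle of a bootstrap particle filter
    for a 1D state. All noise parameters are illustrative."""
    # Predict: propagate each particle through a random-walk motion model.
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Update: reweight particles by the likelihood of the observation.
    weights = [w * gauss_pdf(observation, p, obs_std)
               for p, w in zip(particles, weights)]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportionally to the weights.
    particles = random.choices(particles, weights=weights, k=len(particles))
    weights = [1.0 / len(particles)] * len(particles)
    return particles, weights

def estimate(particles):
    """Posterior mean estimate of the state."""
    return sum(particles) / len(particles)
```

The practical advantage hinted at in the abstract shows up in the update step: even when few features are extracted, the likelihood merely reweights the particle cloud instead of forcing a hard detection decision.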
This thesis addresses the automated identification and localization of a time-varying number of objects in a stream of sensor data. The problem is challenging due to its combinatorial nature: if the number of objects is unknown, the number of possible object trajectories grows exponentially with the number of observations. Random finite sets are a relatively new theory that has been developed to arrive at principled and efficient approximations. It is based around set-valued random variables that contain an unknown number of elements which appear in arbitrary order and are themselves random. While extensively studied in theory, random finite sets have not yet become a leading paradigm in practical computer vision and robotics applications. This thesis explores random finite sets in visual tracking applications. The first method developed in this thesis combines set-valued recursive filtering with global optimization. The problem is approached in a min-cost flow network formulation, which has become a standard inference framework for multiple object tracking due to its efficiency and optimality. A main limitation of this formulation is its restriction to unary and pairwise cost terms, which makes the integration of higher-order motion models challenging. The method developed in this thesis addresses this limitation by application of a Probability Hypothesis Density filter, the first practically implemented state estimator based on random finite sets. It circumvents the combinatorial nature of data association by propagating an object density measure that can be computed efficiently, without maintaining explicit trajectory hypotheses. In this work, the filter recursion is used to augment measurements with an additional hidden kinematic state, which is then used to construct more informed flow network cost terms, e.g., based on linear motion models.
The method is evaluated on public benchmarks where a considerable improvement is achieved compared to network flow formulations based on static features alone, such as the distance between detections and appearance similarity. A second part of this thesis focuses on the related task of detecting and tracking a single robot operator in crowded environments. In contrast to the conventional multiple object tracking scenario, the tracked individual can leave the scene and reappear after a longer period of absence. Therefore, a re-identification component is required that picks up the track on re-entrance. Based on random finite sets, the Bernoulli filter is an optimal Bayes filter that provides a natural representation for this type of problem. In this work, it is shown how the Bernoulli filter can be combined with a Probability Hypothesis Density filter to track the operator and non-operators simultaneously. The method is evaluated on a publicly available multiple object tracking dataset as well as on custom sequences specific to the targeted application. Experiments show reliable tracking in crowded scenes and robust re-identification after long-term occlusion. Finally, a third part of this thesis focuses on appearance modeling as an essential aspect of any method applied to visual object tracking. For this purpose, a feature representation that is robust to pose variations and changing lighting conditions is learned offline, before the actual tracking application. This thesis proposes a joint classification and metric learning objective in which a deep convolutional neural network is trained to identify the individuals in the training set. At test time, the final classification layer can be stripped from the network and appearance similarity can be queried using cosine distance in representation space.
This framework represents an alternative to direct metric learning objectives that have required sophisticated pair or triplet sampling strategies in the past. The method is evaluated on two large scale person re-identification datasets where competitive results are achieved overall. In particular, the proposed method better generalizes to the test set compared to a network trained with the well-established triplet loss.
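The cosine-distance querying described above can be sketched as follows. The embeddings and identity names are illustrative; in the thesis the embeddings come from the trained network rather than being given directly:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def reidentify(query, gallery):
    """Return the gallery identity whose stored embedding is most
    similar to the query embedding. Gallery keys are hypothetical."""
    return max(gallery,
               key=lambda name: cosine_similarity(query, gallery[name]))

# Illustrative gallery of stored embeddings (one per known identity).
gallery = {"operator": [1.0, 0.0, 0.0], "bystander": [0.0, 1.0, 0.0]}
```

Because similarity is a single vector comparison per gallery entry, re-identification after occlusion reduces to a nearest-neighbour query in representation space, which is why no pair or triplet sampling is needed at test time.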
Climate change is an existential threat to human survival, the social organization of society, and the stability of ecosystems. It is thereby profoundly frightening. In the face of threat, people often want to protect themselves instead of engaging in mitigating behaviors. When psychological resources are insufficient to cope, people often respond with different forms of denial. In this dissertation, I contribute original knowledge to the understanding of the multifaceted phenomenon of climate denial from a psychological perspective.
There are four major gaps in the literature on climate denial: First, the spectrum of climate denial as a self-protective response to the climate crisis has not received attention within psychology. Second, basic psychological need satisfaction, a fundamental indicator of human functioning and the ability to cope with threat, has not been investigated as a predictor of climate denial. Third, relations of the spectrum of climate denial to climate-relevant emotions, specifically climate anxiety, have not been examined empirically. Fourth, it has not been investigated how the spectrum of climate denial relates to established predictors of climate denial, namely right-wing ideological convictions and male gender. To address these gaps, I investigate what the spectrum of climate denial looks like in the German context and how it relates to basic psychological need satisfaction and frustration, pro-environmental behavior, climate anxiety, ideological conviction, and gender.
Five manuscripts reveal that climate denial exists on a spectrum in the German context, ranging from the distortion of facts (interpretive climate denial, specifically denial of personal and global outcome severity) to the denial of the implications of climate change (implicatory climate denial, specifically avoidance, denial of guilt, and rationalization of one's own involvement). Across analyses, low basic psychological need satisfaction predicted the spectrum of climate denial, which was negatively related to pro-environmental behavior. Climate denial was generally negatively related to climate anxiety, except for a positive association of avoidance and climate anxiety. Right-wing ideological conviction was the strongest predictor of climate denial across the spectrum. However, low need satisfaction and male gender were additional weaker predictors of implicatory climate denial.
These findings suggest that the spectrum of climate denial serves many psychological functions. Climate denial is possibly both a self-protective strategy to downregulate emotions and to protect oneself from loss of privilege. In short, it represents a barrier to climate action that may only be resolved once people have sufficient psychological resources to face the threat of climate change and cope with their underlying self-protective, emotional responses.
In a world where language defines the boundaries of one's understanding, the words of the Austrian philosopher Ludwig Wittgenstein resonate profoundly. Wittgenstein's assertion that "Die Grenzen meiner Sprache bedeuten die Grenzen meiner Welt" ("The limits of my language mean the limits of my world"; Wittgenstein 2016: v. 5.6) underscores the vital role of language in shaping our perceptions. Today, in a globalized and interconnected society, fluency in foreign languages is indispensable for individual success. Education must break down these linguistic barriers, and one promising approach is the integration of foreign languages into content subjects.
Teaching content subjects in a foreign language, a practice known as Content and Language Integrated Learning (CLIL), not only enhances language skills but also cultivates cognitive abilities and intercultural competence. This approach expands horizons and aligns with the core principles of European education (Leaton Gray, Scott & Mehisto 2018: 50). The Kultusministerkonferenz (KMK) recognizes the benefits of CLIL and encourages its implementation in German schools (cf. KMK 2013a).
With the rising popularity of CLIL, textbooks in foreign languages have become widely available, simplifying teaching. However, the appropriateness of the language used in these materials remains an unanswered question. If textbooks impose excessive linguistic demands, they may inadvertently limit students' development and contradict the goal of CLIL.
This thesis focuses on addressing this issue by systematically analyzing language requirements in CLIL teaching materials, emphasizing receptive and productive skills in various subjects based on the Common European Framework of Reference. The aim is to identify a sequence of subjects that facilitates students' language skill development throughout their school years. Such a sequence would enable teachers to harness the full potential of CLIL, fostering a bidirectional approach where content subjects facilitate language learning.
While research on CLIL is extensive, studies on language requirements for bilingual students are limited. This thesis seeks to bridge this gap by presenting findings for History, Geography, Biology, and Mathematics, allowing for a comprehensive understanding of language demands. This research endeavors to enrich the field of bilingual education and CLIL, ultimately benefiting the academic success of students in an interconnected world.
The purpose of this thesis is to explore the sentiment distributions of Wikipedia concepts.
We analyse the sentiment of the entire English Wikipedia corpus, which includes 5,669,867 articles and 1,906,375 talks, by using a lexicon-based method with four different lexicons.
Also, we explore the sentiment distributions from a time perspective using the sentiment scores obtained from our selected corpus. The results obtained have been compared not only between articles and talks but also among four lexicons: OL, MPQA, LIWC, and ANEW.
Our findings show that among the four lexicons, MPQA has the highest sensitivity and ANEW the lowest sensitivity to emotional expressions. Wikipedia articles show more sentiment than talks according to OL, MPQA, and LIWC, whereas Wikipedia talks show more sentiment than articles according to ANEW. In addition, sentiment shows trends over time, and each lexicon has its own bias with regard to texts describing different things.
Moreover, our research provides three interactive widgets for visualising sentiment distributions for Wikipedia concepts regarding the time and geolocation attributes of concepts.
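The lexicon-based method named above can be illustrated with a minimal polarity scorer. This is a rough sketch, not the thesis's actual scoring scheme, and the toy lexicon is a hypothetical stand-in for OL/MPQA-style word lists:

```python
def lexicon_sentiment(text, lexicon):
    """Score a text with a polarity lexicon: sum of per-word polarity
    values, normalised by token count. Tokenisation is naive."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    score = sum(lexicon.get(token, 0) for token in tokens)
    return score / len(tokens)

# Toy lexicon (hypothetical); real lexicons contain thousands of entries
# and, in LIWC/ANEW, graded rather than binary values.
TOY_LEXICON = {"good": 1, "great": 1, "excellent": 1,
               "bad": -1, "poor": -1, "terrible": -1}
```

The lexicon comparison in the study then amounts to running the same corpus through four different `lexicon` dictionaries and comparing the resulting score distributions.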
Foliicolous lichens are among the most abundant epiphytes in tropical rainforests and one of the few groups of organisms that characterize these forests. Tropical rainforests are increasingly affected by anthropogenic disturbance resulting in forest destruction and degradation. However, not much is known about the effects of anthropogenic disturbance on the diversity of foliicolous lichens. Understanding such effects is crucial for the development of appropriate measures for the conservation of these organisms. In this study, foliicolous lichen diversity was investigated in three tropical rainforests in East Africa. Godere Forest in Southwest Ethiopia is a transitional rainforest with a mixture of Afromontane and Guineo-Congolian species. The forest is secondary and has been affected by shifting cultivation, semi-forest coffee management and commercial coffee plantations. Budongo Forest in West Uganda is a Guineo-Congolian rainforest consisting of primary and secondary forests. Kakamega Forest in western Kenya is a transitional rainforest with a mixture of Guineo-Congolian and Afromontane species. The forest is a mosaic of near-primary forest, secondary forests of different seral stages, grasslands, plantations, and natural glades.
While the 1960s and 1970s still knew permanent education (Council of Europe), recurrent education (OECD) and lifelong education (UNESCO), over the past 20 years, lifelong learning has become the single emblem for reforms in (pre-) primary, higher and adult education systems and international debates on education. Both highly industrialized and less industrialized countries embrace the concept as a response to the most diverse economic, social and demographic challenges - in many cases motivated by international organizations (IOs).
Yet, literature on the nature of this influence, the diffusion of the concept among IOs and their understanding of it is scant and usually focuses on a small set of actors. Based on longitudinal data and a large set of education documents, the work identifies rapid diffusion of the concept across a heterogeneous, expansive and dynamic international field of 88 IOs in the period 1990-2013, which is difficult to explain with functionalist accounts.
Based on the premises of world polity theory, this paper argues that what diffuses resembles less the bundle of systemic reforms usually associated with the concept in the literature and more a surprisingly detailed model of a new actor: the lifelong learner.
The estimation of various social objects is necessary in different fields of social life, science, education, etc. This estimation is usually used for forecasting, for evaluating different properties and for other goals in complex man-machine systems. At present such estimation is possible by means of computer and mathematical simulation methods, which involves significant difficulties, such as:
- the time-distributed process of receiving information about the object;
- determination of a corresponding mathematical device and structure identification of the mathematical model;
- approximation of the mathematical model to real data, generalization and parametric identification of the mathematical model;
- identification of the structure of the links of the real social object.
The solution of these problems is impossible without a special intellectual information system which combines different processes and allows predicting the behaviour of such an object. However, most existing information systems address only one special problem. From this point of view, the development of a more general technology for designing such systems is very important. The technology of developing an intellectual information system for estimating and forecasting the professional ability of respondents in the sphere of education is a concrete example of such a technology. Job orientation is necessary and topical in present economic conditions. It helps to solve the problem of the expediency of investments in a certain sphere of education. Scientifically validated combined diagnostic methods of job orientation are necessary to carry out professional selection in higher education establishments. The requirements of a modern society are growing, and the earlier developed techniques are unable to meet them sufficiently. All these techniques lack the capacity to account for all necessary professional and personal characteristics.
Therefore, it is necessary to use a system of various tests. Thus, the development of new methods of job orientation for entrants is necessary. The information model of the process of job orientation is necessary for this purpose. Therefore, it would be desirable to have an information system capable of giving recommendations concerning the choice of a trade on the basis of complex personal characteristics of entrants.
With the appearance of modern virtual reality (VR) headsets on the consumer market, VR technology has seen the biggest boom in its history. Naturally, this was accompanied by an increasing focus on the problems of current VR hardware. Control in VR, especially, has always been a complex topic.
One possible solution is the Leap Motion, a hand tracking device that was initially developed for desktop use, but with the last major software update it can be attached to standard VR headsets. This device allows very precise tracking of the user’s hands and fingers and their replication in the virtual world.
The aim of this work is to design virtual user interfaces that can be operated with the Leap Motion to provide a natural method of interaction between the user and the VR environment. After that, subject tests are performed to evaluate their performance and compare them to traditional VR controllers.
Placing questions before or after the material constitutes different reading situations. To adapt to these situations, readers may apply appropriate reading strategies. Reading strategies induced by the location of the question have been intensively explored in the context of text comprehension. (1) However, there is still not enough knowledge about whether text plays the same role as pictures when readers apply different reading strategies. To answer this research question, three reading strategies were experimentally manipulated by displaying the question before or after the blended text-picture materials: (a) unguided processing of text and pictures without the question; (b) information gathering to answer the question after prior experience with text and pictures; (c) comprehending text and pictures to solve the question with prior knowledge of the question. (2) Besides, it is arguable whether readers prefer text or pictures when the instructed questions differ in difficulty. (3) Furthermore, it is still uncertain whether students from the higher school tier (Gymnasium) focus more on text or on pictures than students from the lower school tier (Realschule). (4) Finally, it is rarely mentioned whether higher graders are more able to apply reading strategies in text and picture processing than lower graders.
Two experiments were undertaken to investigate the use of text and pictures from the perspectives of task orientation, question difficulty, school tier and grade. In a 2x2(x2x2x2) mixed design using the eye-tracking method, participants were recruited from grade 5 (N = 72) and grade 8 (N = 72). In Experiment 1, thirty-six 5th graders were recruited from the higher tier (Gymnasium) and thirty-six from the lower tier (Realschule). In Experiment 2, thirty-six 8th graders were recruited from the higher tier and thirty-six from the lower tier. They were asked to comprehend materials combining text and pictures and to answer the questions. A Tobii XL60 eye tracker recorded their eye movements and their answers to the questions. Eye-tracking indicators such as accumulated fixation duration, time to first fixation and transitions between different Areas of Interest were analyzed and reported. The results reveal that students process text differently from pictures when they follow different reading strategies. (1) Consistent with Hypothesis 1, students mainly use text to construct their mental model in unguided spontaneous processing of text and pictures. They seem to rely mainly on the pictures as external representations when trying to answer questions after prior experience with the material. They attend to both text and pictures when questions are presented before the material. (2) Inconsistent with Hypothesis 2, students focus more on both text and pictures as question difficulty increases. However, the increase in focus on pictures is larger than that on text when the presented question is difficult. (3) Contrary to Hypothesis 3, the current study finds that higher tier students did not differ from lower tier students in text processing. Instead, students from the higher tier attend more to pictures than students from the lower tier. (4) Contrary to Hypothesis 4, 8th graders outperform 5th graders mainly in text processing.
Only a subtle difference is found between 5th graders and 8th graders in picture processing.
To sum up, text processing differs from picture processing when different reading strategies are applied. In line with the Integrative Model of Text and Picture Comprehension by Schnotz (2014), text is likely to play the major part in guiding the processing of meaning or general reading, whereas pictures are used as external representations for information retrieval or selective reading. When the question is difficult, pictures are emphasized due to their advantages in visualizing the internal structure of information. Compared to lower tier students (poorer problem solvers), higher tier students (good problem solvers) are more capable of comprehending pictures rather than text. Eighth graders are more efficient than 5th graders in text processing rather than in picture processing. This also suggests that in designing school curricula, more attention should be paid to students' competence in picture comprehension and text-picture integration in the future.
Wikipedia is the largest free online encyclopaedia and can be expanded by anyone. The users who create content on a specific Wikipedia language edition form a social network in which users are categorised into different roles: normal users, administrators, and functional bots. Within the network, a user can post reviews, suggestions, or simple messages to the "talk page" of another user. Each language edition of Wikipedia has this type of social network.
In this thesis, characteristics of the three roles are analysed in order to learn how they function in one Wikipedia language network and to apply them to another Wikipedia network to identify bots. Timestamps of created posts are analysed to reveal noticeable characteristics such as continuous messages, message rates, and irregular user behaviour. Through this process we show that the roles differ with respect to these characteristics.
Web programming is a huge field of different technologies and concepts. Each technology addresses a web-application requirement such as content generation or client-server communication. Different technologies within one application are organized by concepts, for example architectural patterns. This thesis describes an approach for creating a taxonomy of these web-programming components using the free encyclopaedia Wikipedia. Our 101companies project uses implementations to identify and classify the different technology sets and concepts behind a web-application framework. These classifications can be used to create taxonomies and ontologies within the project. The thesis also describes how we prioritize useful web-application frameworks with the help of Wikipedia. Finally, the created implementations concerning web programming are documented.
Based on dual process models of information processing, the present research addressed how explicit disgust sensitivity is re-adapted to implicit disgust sensitivity via self-perception of automatic behavioral cues. Contrary to preceding studies (Hofmann, Gschwendner, & Schmitt, 2009), which concluded that there was a "blind spot" for self- but not for observer perception of automatic behavioral cues, in the present research a re-adaptation process was found for both self-perceivers and observers. In Study 1 (N = 75), the predictive validity of an indirect disgust sensitivity measure was tested with a double-dissociation strategy. Study 2 (N = 117) reinvestigated the hypothesis that self-perception of automatic behavioral cues, predicted by an indirect disgust sensitivity measure, leads to a re-adaptation of explicit disgust sensitivity measures. Departing from the approach of Hofmann et al. (2009), the self-perception procedure was modified by (a) feeding back the behavior several times while a small number of cues had to be rated in each feedback condition, (b) using disgust sensitivity as a domain with clearly unequivocal cues of automatic behavior (facial expression, body movements) and describing these cues unambiguously, and (c) using a specific explicit disgust sensitivity measure in addition to a general explicit disgust sensitivity measure. In Study 3 (N = 130), the findings of Study 2 were replicated, and display rules and need for closure were additionally investigated as moderators of predictive validity and cue utilization. The moderator effects suggest that both displaying a disgusted facial expression and self-perception of one's own disgusted facial expression are subject to a self-serving bias, indicating that facial expression may not be an automatic behavior. Practical implications and implications for future research are discussed.
Leaf litter breakdown is a fundamental process in aquatic ecosystems, being mainly mediated by decomposer-detritivore systems that are composed of microbial decomposers and leaf-shredding, detritivorous invertebrates. The ecological integrity of these systems can, however, be disturbed, amongst others, by chemical stressors. Fungicides might pose a particular risk as they can have negative effects on the involved microbial decomposers but may also affect shredders via both waterborne toxicity and their diet; the latter by toxic effects due to dietary exposure as a result of fungicides’ accumulation on leaf material and by negatively affecting fungal leaf decomposers, on which shredders’ nutrition heavily relies. The primary aim of this thesis was therefore to provide an in-depth assessment of the ecotoxicological implications of fungicides in a model decomposer-detritivore system using a tiered experimental approach to investigate (1) waterborne toxicity in a model shredder, i.e., Gammarus fossarum, (2) structural and functional implications in leaf-associated microbial communities, and (3) the relative importance of waterborne and diet-related effects for the model shredder.
Additionally, knowledge gaps were tackled that were related to potential differences in the ecotoxicological impact of inorganic (also authorized for organic farming in large parts of the world) and organic fungicides, the mixture toxicity of these substances, the field-relevance of their effects, and the appropriateness of current environmental risk assessment (ERA).
In the course of this thesis, major differences in the effects of inorganic and organic fungicides on the model decomposer-detritivore system were uncovered; e.g., the palatability of leaves for G. fossarum was increased by inorganic fungicides but deteriorated by organic substances. Furthermore, non-additive action of fungicides was observed, making mixture effects of these substances difficult to predict. While the relative importance of the waterborne and diet-related effect pathways for the model shredder seems to depend on the fungicide group and the exposure concentration, it was demonstrated that neither pathway can be ignored, as they act additively. Finally, it was shown that effects can be expected at field-relevant fungicide levels and that current ERA may provide insufficient protection for decomposer-detritivore systems. To safeguard aquatic ecosystem functioning, this thesis thus recommends including leaf-associated microbial communities and long-term feeding studies using detritus feeders in ERA testing schemes, and identifies several knowledge gaps that must be filled to develop further reasonable refinements of fungicide ERA.
Real-time operating systems for mixed-criticality systems must support different types of software, such as real-time applications and general purpose applications, and, at the same time, must provide strong spatial and temporal isolation between independent software components. Therefore, state-of-the-art real-time operating systems focus mainly on predictability and bounded worst-case behavior. However, general purpose operating systems such as Linux often feature more efficient, but less deterministic, mechanisms that significantly improve the average execution time. This thesis addresses the combination of these two contradicting requirements and shows thread synchronization mechanisms with efficient average-case behavior, but without sacrificing predictability and worst-case behavior.
This thesis explores and evaluates the design space of fast paths in the implementation of typical blocking synchronization mechanisms, such as mutexes, condition variables, counting semaphores, barriers, or message queues. The key technique here is to avoid unnecessary system calls, as system calls have high costs compared to other processor operations available in user space, such as low-level atomic synchronization primitives. In particular, the thesis explores futexes, the state-of-the-art design for blocking synchronization mechanisms in Linux, which handles the uncontended case of thread synchronization using atomic operations in user space and calls into the kernel only to suspend and wake up threads. The thesis also proposes non-preemptive busy-waiting monitors that use an efficient priority ceiling mechanism to prevent the lock holder preemption problem without using system calls, and corresponding low-level kernel primitives to construct efficient wait and notify operations.
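The futex fast path described above can be sketched as a minimal mutex in C11, following the well-known three-state design (0 = free, 1 = locked, 2 = locked with waiters): the uncontended lock and unlock each need only a single atomic operation in user space, and the futex system call is reached only under contention. This is an illustrative, Linux-specific sketch of the general technique, not code from the thesis; the demo harness at the end is likewise hypothetical:

```c
#include <linux/futex.h>
#include <pthread.h>
#include <stdatomic.h>
#include <sys/syscall.h>
#include <unistd.h>

typedef struct { atomic_int state; } fmutex;  /* 0 free, 1 locked, 2 contended */

static void futex_wait(atomic_int *addr, int val)
{
    /* Sleeps only if *addr still equals val, avoiding lost wake-ups. */
    syscall(SYS_futex, addr, FUTEX_WAIT_PRIVATE, val, NULL, NULL, 0);
}

static void futex_wake(atomic_int *addr, int n)
{
    syscall(SYS_futex, addr, FUTEX_WAKE_PRIVATE, n, NULL, NULL, 0);
}

void fmutex_lock(fmutex *m)
{
    int c = 0;
    /* Fast path: one CAS acquires a free lock; no system call. */
    if (atomic_compare_exchange_strong(&m->state, &c, 1))
        return;
    /* Slow path: mark the lock contended and sleep in the kernel. */
    if (c != 2)
        c = atomic_exchange(&m->state, 2);
    while (c != 0) {
        futex_wait(&m->state, 2);
        c = atomic_exchange(&m->state, 2);
    }
}

void fmutex_unlock(fmutex *m)
{
    /* Fast path: if the state was 1, no thread is waiting; just release.
     * Only a previously contended lock (state 2) requires a wake-up call. */
    if (atomic_exchange(&m->state, 0) == 2)
        futex_wake(&m->state, 1);
}

/* Hypothetical demo harness: THREADS threads increment a counter under
 * the lock; the result shows the mutex provides mutual exclusion. */
enum { THREADS = 4, ITERS = 25000 };
static fmutex demo_lock;
static long demo_counter;

static void *demo_worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERS; i++) {
        fmutex_lock(&demo_lock);
        demo_counter++;
        fmutex_unlock(&demo_lock);
    }
    return NULL;
}

long demo_run(void)
{
    pthread_t t[THREADS];
    for (int i = 0; i < THREADS; i++)
        pthread_create(&t[i], NULL, demo_worker, NULL);
    for (int i = 0; i < THREADS; i++)
        pthread_join(t[i], NULL);
    return demo_counter;
}
```

Note that this sketch optimizes only the average case; the bounded worst-case behavior and priority ceiling mechanisms discussed in the thesis require additional machinery in the kernel.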
The evaluation shows that the presented approaches improve average-case performance to a level comparable to state-of-the-art approaches in Linux. At the same time, a worst-case timing analysis shows that the approaches need only constant or bounded temporal overheads at the operating system kernel level. Exploiting these fast paths is a worthwhile approach when designing systems that not only have to fulfill real-time requirements, but must also support best-effort workloads.