“Did I say something wrong?” A word-level analysis of Wikipedia articles for deletion discussions
(2016)
This thesis focuses on gaining linguistic insights into textual discussions at the word level. It was of special interest to distinguish messages that constructively contribute to a discussion from those that are detrimental to it. Thereby, we wanted to determine whether “I”- and “You”-messages are indicators of either of the two discussion styles. These messages are nowadays often used in guidelines for successful communication. Although their effects have been successfully evaluated multiple times, a large-scale analysis had never been conducted. Thus, we used Wikipedia Articles for Deletion (short: AfD) discussions together with the records of blocked users and developed a fully automated method for creating an annotated data set. In this data set, messages were labelled as either constructive or disruptive. We applied binary classifiers to the data to determine characteristic words for both discussion styles. Thereby, we also investigated whether function words like pronouns and conjunctions play an important role in distinguishing the two. We found that “You”-messages were a strong indicator of disruptive messages, which matches their attributed effects on communication. However, we found “I”-messages to be indicative of disruptive messages as well, which is contrary to their attributed effects. The importance of function words could neither be confirmed nor refuted. Other characteristic words for either communication style were not found. Yet, the results suggest that a different model might better represent disruptive and constructive messages in textual discussions.
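The word-level idea behind such a classification can be sketched with a smoothed log-odds score over a toy labelled corpus (an illustration only, not the thesis code; corpus, smoothing and names are assumptions):

```python
from collections import Counter
import math

# Toy labelled messages standing in for AfD discussion contributions.
constructive = ["i think we should keep this article",
                "i agree the sources look reliable"]
disruptive = ["you clearly have no idea what you are doing",
              "you should stop editing entirely"]

def word_counts(msgs):
    c = Counter()
    for m in msgs:
        c.update(m.split())
    return c

def log_odds(word, a, b, alpha=1.0):
    # Additive smoothing keeps scores finite for unseen words.
    pa = (a[word] + alpha) / (sum(a.values()) + alpha)
    pb = (b[word] + alpha) / (sum(b.values()) + alpha)
    return math.log(pa / pb)

ca, cb = word_counts(disruptive), word_counts(constructive)
vocab = set(ca) | set(cb)
# Words with the highest score are most characteristic of the
# disruptive class; the lowest, of the constructive class.
ranked = sorted(vocab, key=lambda w: log_odds(w, ca, cb), reverse=True)
print(ranked[:3])
```

On this toy corpus, "you" ranks as the most disruptive-typical word, mirroring the thesis finding at a much smaller scale.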
Chemical plant protection is an essential element in integrated pest management and hence, in current crop production. The use of Plant Protection Products (PPPs) potentially involves ecological risk. This risk has to be characterised, assessed and managed.
For the coming years, an increasing need for agricultural products is expected. At the same time, preserving our natural resources and biodiversity per se is of equally fundamental importance. The relationship of our economic success and cultural progress to protecting the environment has been made plain in the Ecosystem Service concept. These distinct 'services' provide the foundation for defining ecological protection goals (Specific Protection Goals, SPGs) which can serve in the development of methods for ecological risk characterisation, assessment and management.
Ecological risk management (RM) of PPPs is a comprehensive process that includes different aspects and levels. RM is an implicit part of tiered risk assessment (RA) schemes and scenarios, yet RM also explicitly occurs as risk mitigation measures. At higher decision levels, RM takes further risks, besides ecological risk, into account (e.g., economic). Therefore, ecological risk characterisation can include RM (mitigation measures) and can be part of higher level RM decision-making in a broader Ecosystem Service context.
The aim of this thesis is to contribute to improved quantification of ecological risk as a basis for RA and RM. The initial general objective had been stated as "… to estimate the spatial and temporal extent of exposure and effects…" and was found to be closely related to forthcoming SPGs with their defined 'Risk Dimension'.
An initial exploration of the regulatory framework of ecological RA and RM of PPPs and their use, carried out in the present thesis, emphasised the value of risk characterisation at the landscape scale. The landscape scale provides the necessary and sufficient context, including abiotic and biotic processes, their interaction at different scales, as well as human activities. In particular, spatially (and temporally) explicit landscape-scale risk characterisation and RA can provide a direct basis for PPP-specific or generic RM. From the general need for tiered landscape-scale context in risk characterisation, specific requirements relevant to a landscape-scale model were developed in the present thesis, guided by the key objective of improved ecological risk quantification. In principle, for an adverse effect (Impact) to occur, a sensitive species and life stage must co-occur with a significant exposure extent in space and time. Therefore, the quantification of the Probability of an Impact occurring is the basic requirement of the model. In a landscape-scale context, this means assessing the spatiotemporal distribution of species sensitivity and their potential exposure to the chemical.
The core functionality of the model should reflect the main problem structures in ecological risk characterisation, RA and RM, with particular relationship to SPGs, while being adaptable to specific RA problems. This resulted in the development of a modelling framework (Xplicit-Framework), realised in the present thesis. The Xplicit-Framework provides the core functionality for spatiotemporally explicit and probabilistic risk characterisation, together with interfaces to external models and services which are linked to the framework using specific adaptors (Associated-Models, e.g., exposure, eFate and effect models, or geodata services). From the Xplicit-Framework, and using Associated-Models, specific models are derived, adapted to RA problems (Xplicit-Models).
Xplicit-Models are capable of propagating variability (and uncertainty) of real-world agricultural and environmental conditions to exposure and effects using Monte Carlo methods and, hence, to introduce landscape-scale context to risk characterisation. Scale-dependencies play a key role in landscape-scale processes and were taken into account, e.g., in defining and sampling Probability Density Functions (PDFs). Likewise, evaluation of model outcome for risk characterisation is done at ecologically meaningful scales.
Xplicit-Models can be designed to explicitly address risk dimensions of SPGs. Their definition depends on the RA problem and tier. Thus, the Xplicit approach allows for stepwise introduction of landscape-scale context (factors and processes), e.g., starting at the definitions of current standard RA (lower-tier) levels by centring on a specific PPP use, while introducing real-world landscape factors driving risk. With its generic and modular design, the Xplicit-Framework can also be employed by taking an ecological entity-centric perspective. As the predictive power of landscape-scale risk characterisation increases, it is possible that Xplicit-Models become part of an explicit Ecosystem Services-oriented RM (e.g., cost/benefit level).
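The Monte Carlo core of such spatiotemporally explicit risk characterisation can be sketched as follows (a minimal illustration with assumed PDFs and an assumed effect threshold, not an Xplicit-Model):

```python
import random

random.seed(42)

# An Impact requires a sensitive life stage to co-occur with sufficient
# exposure; the probability of Impact is estimated by sampling both from
# assumed probability density functions (all values illustrative).
N = 100_000
impacts = 0
for _ in range(N):
    exposure = random.lognormvariate(0.0, 1.0)  # assumed exposure PDF
    present = random.random() < 0.3             # sensitive stage present?
    threshold = 3.0                             # assumed effect threshold
    if present and exposure > threshold:
        impacts += 1

p_impact = impacts / N
print(f"Estimated probability of impact: {p_impact:.4f}")
```

Real Xplicit-Models sample spatially and temporally correlated landscape factors rather than independent draws, but the propagation of variability to a risk probability follows this pattern.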
X-ray computed tomography (XRT) is a three-dimensional (3D), non-destructive, and reproducible investigation method capable of visualizing and examining internal and external structures of components independent of their material and geometry. In this work, XRT with its unique abilities complements conventionally utilized examination methods for the investigation of microstructure weakening induced by hydrogen corrosion and furthermore provides a new approach to corrosion research. The motivation for this is the currently inevitable transformation to hydrogen-based steel production. Refractories of the system Al2O3-SiO2 are significant as lining materials. Two exemplary material types A and B, which differ mainly in their Al2O3:SiO2 ratio, are examined here using XRT. Identical samples of the two materials are measured, analyzed, and then compared before and after hydrogen attack. In this context, hydrogen corrosion-induced porosity and its spatial distribution and morphology are investigated. The results show that sample B has a higher resistance to hydrogen-induced attack than sample A. Furthermore, the 3D representation revealed a differential porosity increase within the microstructure.
This thesis is concerned with an issue of considerable importance to the development of revision skills: the role of teacher feedback. Prompted by the concern to develop a model of instruction which will help students write to the best of their capacities, the present study forms a proposal: an interactive model of revision. The study researches whether the kind of feedback proposed in this model is indeed a helpful tool for revision and whether the kind of negotiated revision that occurs is a vehicle for learning. The first section of the thesis reviews different areas of literature which are relevant to the study. More specifically, Chapter 2 presents the historical and theoretical foundations of different writing instructional practices and sheds light on issues concerning the use of the process approach. It also reviews research based on sociocognitive theoretical perspectives in an attempt to delineate the impact of interpersonal or social activity on individual performance and progress. Chapter 3 examines issues associated with the process approach in particular and illustrates how theory and method come together in a process writing classroom. Chapter 4 presents the differences in revising behaviours between experienced and inexperienced writers in both L1 and L2 contexts and the various ways these differences have been justified. It also highlights a number of issues which have been identified as contributing to effective revision. Particular attention is paid to the role that teacher feedback has to play as a means of promoting substantive student revision with an instructional emphasis on fluency, organisation and language. Chapter 5 presents an interactive model of revision, which envisions a communicative exchange between two partners, the student-writer and the teacher-reader, collaborating in order to develop awareness of revision strategies and establish criteria for effective writing. 
Chapter 6 investigates the epistemological basis of the research and presents a set of research questions and hypotheses, which guided the investigation. Chapter 7 frames the context of the research and details the methods used to collect the data from the study. The study involved 100 Year 7 students in two gymnasia in Koblenz, Germany. During the time of the investigation, the students wrote and revised five tasks. Three of these tasks were revised after receiving teacher feedback, which focused on aspects such as appropriacy and sufficiency of information, organization, coherence and grammatical accuracy. The study investigates the effects of this kind of focused feedback on the students' revisions and explores the relationship between revision and text improvement. Large quantitative and qualitative data sets were generated during the research. The quantitative data was based on the student documents (1000 original and revised drafts) whereas the qualitative data emerged from student questionnaires and seven case studies. Chapter 8 presents descriptions of the data analyses. More specifically, it describes the initial and final coding of the revisions traced in the student documents. Then it focuses on the type of qualitative analysis employed in the case studies in order to investigate the relationship between revision and text improvement. The final section of the chapter describes the questionnaire analysis, which was carried out to investigate attitudes, benefits and constraints from the implementation of the model. Chapter 9 examines the statistical results from the analysis of the students' revisions. More specifically, it explores the revisions made by the students across tasks and the relationships between the features of the teacher feedback and these revisions. The analysis highlights patterns in the development of revision skills and positive correlations of student revisions with features of the teacher feedback.
Chapter 10 looks at the descriptive data from the case studies of seven individual student writers. The analysis of this data illustrates how the specific students negotiated the revisions and sheds more light on the relationship between feedback, revision and text improvement. Chapter 11 contains the analysis of the students' answers to the questionnaire, which provide illuminative information about the feedback-related attitudes. In Chapter 12, the thesis reaches its final destination. The journey over the paths of literature exploration, data gathering and data analysis ends with reflections on the messages that emerge from the data analysis. The conclusion reached is that young students can learn how to revise their writing and focused feedback is a viable pedagogic option for teaching revision. In addition to discussing the findings, this final section considers the pedagogical implications for the teaching of writing and suggests possible avenues for further work.
Instructor feedback on written assignments is one of the most important elements in the writing process, especially for students writing in English as a foreign language. However, students are often critical of both the amount and quality of the feedback they receive. In order to better understand what makes feedback effective, this study explored the nature of students’ assessments of the educational alliance, and how their receptivity to, perceptions of, and decisions about using their instructors’ feedback differed depending on how strong they believed the educational alliance to be. This exploratory case study found that students not only assessed the quality of the educational alliance based on goal compatibility, task relevance, and teacher effectiveness, but that there was also a reciprocal relationship between these elements. Furthermore, students’ perceptions of the educational alliance directly influenced how they perceived the feedback, which made the instructor’s choice of feedback method largely irrelevant. Stronger educational alliances resulted in higher instances of critical engagement, intrinsic motivation, and feelings of self-efficacy. The multidirectional influence of goal, task, and bond means that instructors who want to maximize their feedback efforts need to attend to all three.
The trends of Industry 4.0 and further enhancements toward an ever-changing factory lead to more mobility and flexibility on the factory floor. With this greater need for mobility and flexibility, the requirements on wireless communication rise. A key requirement in this setting is the demand for wireless Ultra-Reliable and Low Latency Communication (URLLC). Example use cases are cooperative Automated Guided Vehicles (AGVs) and mobile robotics in general. Working in this setting, this thesis provides insights regarding the whole network stack. Thereby, the focus is always on industrial applications. Starting at the physical layer, extensive measurements from 2 GHz to 6 GHz on the factory floor are performed. The raw data is published and analyzed. Based on this data, an improved Saleh-Valenzuela (SV) model is provided. As ad-hoc networks are highly dependent on node mobility, the mobility of AGVs is modeled. Additionally, Nodal Encounter Patterns (NEPs) are recorded and analyzed, and a method to record NEPs is illustrated. Latency and reliability are key performance parameters from an application perspective. Thus, measurements of these two parameters in factory environments are performed using Wireless Local Area Network (WLAN) (IEEE 802.11n), private Long Term Evolution (pLTE) and 5G. These measurements showed auto-correlated latency values. Hence, a method to construct confidence intervals based on auto-correlated data containing rare events is developed. Subsequently, several performance improvements for wireless networks on the factory floor are proposed. Of these optimizations, three cover ad-hoc networks, two deal with safety-relevant communication, one orchestrates the usage of two orthogonal networks, and lastly one optimizes the usage of information within cellular networks.
Finally, this thesis concludes with an outlook on open research questions. This includes questions remaining in the context of Industry 4.0 as well as those surrounding 6G. Among the 6G research topics, the two most relevant concern the ideas of a network of networks and overcoming best-effort IP.
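One standard way to build confidence intervals from auto-correlated samples such as the latency series described above is the moving-block bootstrap, sketched here on synthetic AR(1) data (an illustration of the general technique; the thesis' own method for rare events may differ):

```python
import random
import statistics

random.seed(1)

def ar1_series(n, rho=0.8):
    # Synthetic auto-correlated "latency" values around 10 ms (assumed).
    x, out = 0.0, []
    for _ in range(n):
        x = rho * x + random.gauss(0, 1)
        out.append(x + 10.0)
    return out

def block_bootstrap_ci(data, block_len=20, reps=2000, level=0.95):
    # Resample contiguous blocks so short-range correlation is preserved.
    n = len(data)
    means = []
    for _ in range(reps):
        sample = []
        while len(sample) < n:
            start = random.randrange(n - block_len + 1)
            sample.extend(data[start:start + block_len])
        means.append(statistics.fmean(sample[:n]))
    means.sort()
    lo = means[int((1 - level) / 2 * reps)]
    hi = means[int((1 + level) / 2 * reps)]
    return lo, hi

data = ar1_series(1000)
lo, hi = block_bootstrap_ci(data)
print(f"95% CI for mean latency: [{lo:.2f}, {hi:.2f}]")
```

A naive i.i.d. bootstrap on the same data would produce a misleadingly narrow interval, which is why correlation-aware methods matter for URLLC evaluation.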
Willingness to pay and willingness to accept on a two-sided platform - The use case of DoBeeDo
(2019)
It is widely known that especially for technology-based start-ups, entrepreneurs need to set up the boundaries of the business and define the product/service to offer in order to minimize the risk of failure. The goal of this thesis is to not only emphasize the importance of the business model development and evaluation but also show an example customer validation process for an emerging start-up named DoBeeDo, which is a mobile app operating on a two-sided market. During the process of customer validation a survey has been conducted to evaluate the interest of the target groups as well as the fit of their expectations using the Willingness to Pay and Willingness to Accept measures. The paper includes an analysis and evaluation of the gathered results and assesses whether the execution of the Customer Development Model can be continued.
In order to enhance a company's appeal for potential employees and improve the satisfaction of already salaried workers, it is necessary to offer a variety of work-life balance measures. But as their implementation causes time and financial costs, a prioritization of measures is needed. To express a recommendation for companies, this study is guided by the questions of whether there are work-life balance measures that have more impact on employee satisfaction than others, how big the relative impact of work-life balance measures on job satisfaction is in comparison to other work and private life variables, whether there is a relation between the effectiveness of measures and their use, and whether there is a difference between the measures that are most important from the employees' perspective and the companies' offers.
These questions are formulated in eight research hypotheses which are examined in a quantitative research design with online survey data from 289 employees of fifteen different German companies. The formation of a hierarchy of the effectiveness of measures towards job satisfaction as well as the investigation of the relative impact in comparison to other variables is performed using a multiple regression analysis, whilst the differences between employees’ expectations and the availability of offers are examined with t-tests.
Support in childcare, support in voluntary activities and teambuilding events have a significantly higher impact on job satisfaction than other work-life balance measures, and their potential use is higher than their actual use, which leads to the conclusion that there is yet potential for companies to improve their employees' satisfaction by implementing these measures. In addition, flexible work hours, flexible work locations and free time and overtime accounts are the most important measures from the employees' point of view and are already widely offered by the surveyed companies. In general, the overall use of the available measures and the quantity of offered measures are more important with regard to job satisfaction than the specific kind of measure. In addition, work-life balance measures are more important for the job satisfaction of younger people.
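The kind of comparison used to contrast employees' expectations with the measures companies actually offer can be sketched with Welch's t-test on toy ratings (illustrative data only, not the survey data):

```python
import math
import statistics

def welch_t(a, b):
    # Welch's t statistic: difference of means over the pooled
    # standard error, without assuming equal variances.
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(va / len(a) + vb / len(b))
    return (ma - mb) / se

# Hypothetical 1-5 ratings for one measure, e.g. flexible work hours.
expectation = [4.5, 4.2, 4.8, 4.6, 4.4, 4.7]   # importance to employees
availability = [2.1, 2.5, 1.9, 2.3, 2.0, 2.4]  # perceived availability

t = welch_t(expectation, availability)
print(f"t = {t:.2f}")  # a large positive t: expectations exceed offers
```

In the study itself such t statistics would be compared against critical values (with Welch-Satterthwaite degrees of freedom) to test the corresponding hypothesis.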
Men are currently underrepresented in traditionally female care-oriented (communal) engagement such as taking parental leave, whereas they are overrepresented in traditionally male (agentic) engagement such as breadwinning or leadership. We examined to what extent different prototypical representations of men affect men’s self-reported parental leave-taking intentions and more generally the future they can imagine for themselves with regard to work and care roles (i.e., their possible selves). We expected prototypes of men that combine the two basic stereotype dimensions of agency and communion to increase men’s communal intentions. In two experiments (N1 = 132, N2 = 233), we presented male participants with contrived newspaper articles that described the ideal man of today with varying degrees of agency and communion (between-subjects design with four conditions; combined agentic and communal vs. agentic vs. communal vs. control condition). Results of Experiment 1 were in line with the main hypothesis that especially presenting a combination of agency and communion increases men’s expectations for communal engagement: As compared to a control condition, men expected more to engage in caretaking in the future, reported higher parental leave-taking intentions, and tended to expect taking longer parental leave. Experiment 2 only partially replicated these findings, namely for parental leave-taking intentions. Both experiments additionally provided initial evidence for a contrast effect in that an exclusive focus on agency also increased men’s self-reported parental leave-taking intentions compared to the control condition. Yet, exclusively emphasizing communion in prototypes of men did not affect men’s communal intentions, which were high to begin with. We further did not find evidence for the preregistered mechanisms.
We discuss conditions and explanations for the emergence of these mixed effects as well as implications for the communication of gendered norms and barriers to men’s communal engagement more broadly.
Well-being is essential for all people. Therefore, important factors influencing people’s well-being must be investigated. Well-being is multifaceted and defined as, for example, psychological, emotional, mental, physical, or social well-being. Here, we focus on psychological well-being. The study aimed to analyze different aspects of connectedness as potential predictors of psychological well-being. For this purpose, we conducted a study examining the psychological well-being of 184 participants (130 women, 54 men, age: M = 31.39, SD = 15.24) as well as their connectedness with oneself (self-love), with others (prosocialness), with nature (nature connectedness), and with the transcendent (spirituality). First, significant positive correlations appeared between psychological well-being and self-love, nature connectedness, and spirituality. Furthermore, correlations between the four aspects of connectedness were significant, except for the relationship between self-love and prosocialness. A regression analysis revealed that self-love and nature connectedness positively predicted participants’ psychological well-being, while spirituality and prosocialness did not explain any incremental variance. The strong relationship between self-love and well-being was partly mediated by nature connectedness. Hence, self-love, understood as a positive attitude of self-kindness, should be considered in more detail to enhance psychological well-being. Besides this, a more vital connectedness to the surrounding nature could benefit people’s well-being.
Despite widespread plans of big companies like Amazon and Google to develop unmanned delivery drones, scholarly research in this field is scarce, especially in the information systems field. From technical and legal perspectives, drone delivery in last-mile scenarios is in a quite mature state. However, estimates of user acceptance vary between high skepticism and exaggerated optimism. This research follows a mixed-methods approach consisting of both qualitative and quantitative research to identify and test determinants of consumer delivery drone service adoption. The qualitative part rests on ten interviews among average consumers who use delivery services on a regular basis. Insights gained from the qualitative part were used to develop an online survey and to assess the influence of associated risks on adoption intentions. The quantitative results show that especially financial and physical risks impede drone delivery service adoption. Delivery companies that are currently thinking about providing a delivery drone service may find these results useful when evaluating usage behaviors in the future market for delivery drones.
We are entering the 26th year since the World Wide Web (WWW) became reality. Since the birth of the WWW in 1990, the Internet and therewith websites have changed the way businesses compete, shifting products, services and even entire markets.
Gathering and analysing visitor traffic on websites can therefore provide crucial information to understand customer behavior and numerous other aspects.
Web Analytics (WA) tools offer a wealth of diverse functionality, which calls for complex decision-making in information management. Website operators implement Web Analytics tools such as Google Analytics to analyse their website for the purpose of identifying web usage in order to optimise website design and management. The gathered data leads to emergent knowledge, which provides new marketing opportunities and can be used to improve business processes and understand customer behavior to increase profit. Moreover, Web Analytics plays a significant role in measuring performance and has therefore become an important component in web-based environments for making business decisions.
However, many small and medium-sized enterprises (SMEs) try to keep up with the web business competition but do not have the equivalent resources in manpower and knowledge to keep pace; some therefore even forgo Web Analytics entirely.
This research project aims to develop a Web Analytics framework to assist small and medium-sized enterprises in making better use of Web Analytics. By identifying business requirements of SMEs and connecting them to the functionality of Google Analytics, a Web Analytics framework with accompanying guidelines is developed, which guides SMEs on how to proceed in using Google Analytics to achieve actionable outcomes.
Recent estimates have confirmed that inland waters emit a considerable amount of CH4 and CO2 to the atmosphere at the regional and global scale. However, these estimates are based on extrapolated measured data, lack data from inland waters in arid and semi-arid regions and from carbon sources such as wastewater treatment plants (WWTPs), and insufficiently resolve the spatiotemporal variability of these emissions.
In this study, we analyzed monthly hydrological, meteorological and water quality data from three irrigation and drinking water reservoirs in the lower Jordan River basin and estimated the atmospheric emission rates of CO2. We investigated the effect of WWTPs on surrounding aquatic systems in terms of CH4 and CO2 emissions by presenting seasonally resolved data for dissolved concentrations of both gases in the effluents and in the receiving streams at nine WWTPs in Germany.
We investigated the spatiotemporal variability of CH4 and CO2 emissions from aquatic ecosystems using simple low-cost tools for measuring CO2 flux and bubble release rates from freshwater systems. Our estimates showed that reservoirs in semi-arid regions are oversaturated with CO2 and act as net sources to the atmosphere. The magnitude of the observed fluxes at the three water reservoirs in Jordan is comparable to that of tropical reservoirs (3.3 g CO2 m-2 d-1). The CO2 emission rate from these reservoirs is linked to changes of the water surface area, which result from water management practices. WWTPs were shown to discharge a considerable amount of CH4 (30.9±40.7 kg yr-1) and CO2 (0.06±0.05 Gg yr-1) to their surrounding streams, and emission rates of CH4 and CO2 from these streams are significantly enhanced by the effluents of WWTPs, by up to 1.2 and 8.6 times, respectively.
Our results showed that both diffusive fluxes and bubble release rates varied in time and space; both emission pathways should be included, and their variability should be resolved adequately in future sampling and measurement strategies. We conclude that future emission measurements and estimates from inland waters should consider water management practices, carbon sources from WWTPs, as well as the spatial and temporal variability of emissions.
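The diffusive flux component of such estimates is commonly computed with the boundary-layer model F = k · (C_water − C_equilibrium); a back-of-the-envelope sketch with assumed values (not the study's data) shows how a flux of the order quoted above arises:

```python
# All values are illustrative assumptions, not measurements.
k = 1.0          # gas transfer velocity in m d^-1
c_water = 100.0  # dissolved CO2 in mmol m^-3 (oversaturated water)
c_equil = 20.0   # concentration in equilibrium with the atmosphere
M_CO2 = 44.01    # molar mass of CO2 in g mol^-1

flux_mmol = k * (c_water - c_equil)   # mmol CO2 m^-2 d^-1
flux_g = flux_mmol * M_CO2 / 1000.0   # convert mmol to g
print(f"diffusive CO2 flux: {flux_g:.1f} g CO2 m^-2 d^-1")
```

With these assumed inputs the result (about 3.5 g CO2 m-2 d-1) is of the same order as the tropical-reservoir value cited above; in practice k is itself estimated from wind speed or chamber measurements.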
This paper introduces Vocville, a casual online game for learning vocabulary. I created this application as part of my master's thesis in Computervisualistik (computational visualistics) at the University of Koblenz-Landau. The application is an online browser game based on the idea of the highly successful Facebook game FarmVille. The application is separated into two parts: a Grails application manages a database which holds the game objects such as vocabulary, while a Flex/Flash application generates the actual game from these data. The user can create his own home with everything in it. To create things, the user has to give the correct translation of the object he wants to create several times. After every query he has to wait a certain amount of time before being queried again. When the correct answer has been given sufficiently often, the object is built. After building one object the user is allowed to build others. After building enough objects in one area (i.e. a room, a street etc.), the user can activate other areas by translating all the vocabulary of the previous area. Users can also interact with other users by adding them as neighbors and then visiting their homes or sending them gifts, for which they have to fill in the correct word in a given sentence.
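The build mechanic described above can be sketched as follows (class and method names, the required-answer count and the cooldown are assumptions for illustration, not the actual Vocville code):

```python
import time

class BuildableObject:
    """A game object that is built once its vocabulary item has been
    translated correctly a required number of times, with a waiting
    period (cooldown) between queries."""

    def __init__(self, word, translation, required=3, wait_seconds=60):
        self.word = word
        self.translation = translation
        self.required = required
        self.wait = wait_seconds
        self.correct = 0
        self.next_query = 0.0  # timestamp when the next query is allowed

    def answer(self, guess, now=None):
        now = time.time() if now is None else now
        if now < self.next_query:
            return "wait"                 # cooldown still active
        self.next_query = now + self.wait # start the next cooldown
        if guess == self.translation:
            self.correct += 1
            return "built" if self.correct >= self.required else "correct"
        return "wrong"

sofa = BuildableObject("Sofa", "sofa", required=2, wait_seconds=60)
print(sofa.answer("sofa", now=0))   # -> correct
print(sofa.answer("sofa", now=30))  # -> wait
print(sofa.answer("sofa", now=61))  # -> built
```

Passing `now` explicitly keeps the logic testable; the real game would rely on server time and persist the state in the Grails-managed database.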
This work describes a novel software tool for visualizing anatomical segmentations of medical images. It was developed as part of a bachelor's thesis project, with a view to supporting research into automatic anatomical brain image segmentation. The tool builds on a widely-used visualization approach for 3D image volumes, where sections in orthogonal directions are rendered on screen as 2D images. It implements novel display modes that solve common problems with conventional viewer programs. In particular, it features a double-contour display mode to aid the user's spatial orientation in the image, as well as modes for comparing two competing segmentation labels pertaining to one and the same anatomical region. The tool was developed as an extension to an existing open-source software suite for medical image processing. The visualization modes are, however, suitable for implementation in the context of other viewer programs that follow a similar rendering approach.
The modified code can be found here: soundray.org/mm-segmentation-visualization.tar.gz.
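The orthogonal-section approach the viewer builds on can be sketched with a toy integer volume (a minimal illustration; function names and the 3×3×3 volume are assumptions, not taken from the actual tool):

```python
def make_volume(n):
    # Each voxel value encodes its own (z, y, x) coordinates,
    # which makes the slicing easy to verify.
    return [[[100 * z + 10 * y + x for x in range(n)]
             for y in range(n)] for z in range(n)]

def axial(vol, z):     # section of constant z
    return vol[z]

def coronal(vol, y):   # section of constant y
    return [row[y] for row in vol]

def sagittal(vol, x):  # section of constant x
    return [[row[y][x] for y in range(len(row))] for row in vol]

vol = make_volume(3)
print(axial(vol, 1)[0][0])     # voxel (z=1, y=0, x=0) -> 100
print(coronal(vol, 2)[0][0])   # voxel (z=0, y=2, x=0) -> 20
print(sagittal(vol, 1)[0][0])  # voxel (z=0, y=0, x=1) -> 1
```

A real viewer renders the three sections through a shared cursor position and overlays the segmentation contours (e.g. the double-contour mode) on each 2D image.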
Artificial neural networks are a popular field of research in artificial intelligence. The increasing size and complexity of large models entail certain problems. The lack of transparency of the inner workings of a neural network makes it difficult to choose efficient architectures for different tasks. These problems are challenging to solve, and the lack of insightful representations of neural networks entrenches this state of affairs. With these difficulties in mind, a novel 3D visualization technique is introduced. Attributes of trained neural networks are estimated by utilizing established methods from the area of neural network optimization. Batch normalization is used with fine-tuning and feature extraction to estimate the importance of different parts of the neural network. A combination of the importance values with various methods such as edge bundling, ray tracing, 3D impostors and a special transparency technique results in a 3D model representing the neural network. The validity of the extracted importance estimations is demonstrated and the potential of the developed visualization is explored.
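One established way to derive part-importance from batch normalization, in the spirit of the approach described above, is to read the magnitude of each channel's learned batch-norm scale factor γ as an importance score (as in network-slimming-style pruning). The sketch below merely normalizes such scores into relative importances; it illustrates that idea and is not necessarily the exact estimator used in the thesis.

```python
def bn_channel_importance(gammas):
    """Normalize |gamma| values of a batch-norm layer into relative channel
    importances: channels whose scale factor is near zero contribute little
    to the layer's output and receive a low score."""
    total = sum(abs(g) for g in gammas)
    if total == 0:
        return [0.0 for _ in gammas]
    return [abs(g) / total for g in gammas]
```

For example, scale factors [2.0, 1.0, 1.0] yield relative importances [0.5, 0.25, 0.25], which could then drive the transparency or thickness of the corresponding parts of the 3D model.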
The automatic detection of the position and orientation of subsea cables and pipelines in camera images enables underwater vehicles to perform autonomous inspections. However, plants such as algae growing on top of and near cables and pipelines complicate their visual detection: determining the position via border detection followed by line extraction often fails. Probabilistic approaches are superior to deterministic approaches here: by modeling probabilities it is possible to make assumptions about the state of the system even if the number of extracted features is small. This work introduces a new tracking system for cable/pipeline following in image sequences which is based on particle filters. Extensive experiments on realistic underwater videos show the robustness and performance of this approach and demonstrate its advantages over previous works.
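A particle filter of the kind used here repeats three steps per frame: predict particle states with a motion model, weight them by the measurement likelihood, and resample. The following 1-D sketch (tracking a single scalar, e.g. the cable's horizontal position in an image row) is purely illustrative; the thesis tracks position and orientation with image-based likelihoods, and the noise parameters below are invented.

```python
import math
import random

def particle_filter_step(particles, measurement, motion_std=2.0, meas_std=5.0):
    """One predict-weight-resample cycle of a minimal 1-D particle filter.
    particles: list of scalar position hypotheses."""
    # Predict: diffuse particles with a random-walk motion model.
    predicted = [p + random.gauss(0.0, motion_std) for p in particles]
    # Weight: Gaussian likelihood of the measurement given each particle.
    weights = [math.exp(-0.5 * ((measurement - p) / meas_std) ** 2)
               for p in predicted]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportional to the weights.
    return random.choices(predicted, weights=weights, k=len(predicted))
```

Even with few informative features per frame, repeated cycles concentrate the particle cloud around the measured position, which is exactly the robustness argument made above.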
Virtual Goods + ODRL 2012
(2012)
This is the 10th international workshop for technical, economic, and legal aspects of business models for virtual goods incorporating the 8th ODRL community group meeting. This year we did not call for completed research results, but we invited PhD students to present and discuss their ongoing research work. In the traditional international group of virtual goods and ODRL researchers we discussed PhD research from Belgium, Brazil, and Germany. The topics focused on research questions about rights management in the Internet and e-business stimulation. In the center of rights management stands the conception of a formal policy expression that can be used for human readable policy transparency, as well as for machine readable support of policy conformant systems behavior up to automatic policy enforcement. ODRL has proven to be an ideal basis for policy expressions, not only for digital copy rights, but also for the more general "Policy Awareness in the World of Virtual Goods". In this sense, policies support the communication of virtual goods, and they are a virtualization of rules-governed behavior themselves.
This thesis connects the winemaker's goal of perfect and profitable wine making with an innovative application of Internet of Things technology. The winemaker's work can thereby be supported and enriched, enabling an optimization of the management and planning of his business that was unthinkable until recent years, including close monitoring of different areas of his vineyard, down to the single grapevine. This thesis shows exemplarily how to measure, transmit, store and make data available, demonstrated with live temperature, air humidity and soil humidity values from the vineyard. A modular architecture was designed for the presented system, which allows the use of current sensors as well as similar low-voltage sensors that will be developed in the future.
By using IoT devices in the vineyard, the winemaker advances to a new quality of precision in forecast data, starting from live data of his vineyard. More importantly, the winemaker can take immediate action when unforeseen severe weather conditions occur. The immediate use of current data is enabled by a cloud infrastructure; for this system, an open service infrastructure is employed. In contrast to other published commercial approaches, the described solution is based on open source software.
As a stand-alone part of this work, a physical prototype for measuring relevant parameters in the vineyard was designed and developed from scratch until it fulfilled the set of specifications. The features and requirements for a functioning, autonomously transmitting data-collection device were worked out and described, and their fulfilment by the prototype device was demonstrated. Through literature research and supporting live interviews with winemakers, theory and practical application were aligned and validated.
For the development of the prototype, the general principles of electronic device development were followed, in particular the rules of Design Science Research and the principles of Quality Function Deployment. Characteristics such as the re-use of approved constructions and the material price of the device's building blocks (e.g. housing, Arduino, PCB) were taken into consideration as well. Parts reduction, reduced complexity, and simplified assembly, testing and field service were integrated into the development process through the modular design of the functional vineyard device components, e.g. with partial reference to the innovative electrical cabinet construction system Modular-3.
The software architecture is based on a three-layer design including the TTN (The Things Network) infrastructure. The front end is realized as a rich web client using a WordPress plugin; WordPress was chosen for its wide adoption across the internet, enabling fast and easy user familiarization. Relevant quality aspects, such as functionality, extensibility, requirements fulfilment, and the usability and durability of the device and the software, were tested and discussed.
The prototype was characterized and successfully tested in the laboratory and in field exposure under different conditions, in order to measure and analyse how well the chosen electronic design and layout fulfil all requirements.
The presented solution may serve as a basis for future development and application in this showcase and within similar technologies. An outlook on future work and applications concludes this thesis.
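The measure-transmit-store chain described above can be sketched minimally. The following is a hedged illustration of how a vineyard node might pack temperature and humidity readings into a compact binary payload of the kind typically sent over LoRaWAN/TTN; the field layout, scaling factors and function names are assumptions for illustration, not the thesis device's actual format.

```python
import struct

def encode_readings(temperature_c, air_humidity_pct, soil_humidity_pct):
    """Pack readings into a 4-byte big-endian payload (layout hypothetical):
    temperature as a signed 16-bit value in 0.1 degC steps, each humidity
    as an unsigned byte in whole percent. Small payloads matter on LoRaWAN,
    where airtime budgets are tight."""
    return struct.pack(">hBB",
                       int(round(temperature_c * 10)),
                       int(round(air_humidity_pct)),
                       int(round(soil_humidity_pct)))

def decode_readings(payload):
    """Inverse of encode_readings, e.g. for a TTN payload decoder."""
    t, air, soil = struct.unpack(">hBB", payload)
    return t / 10.0, air, soil
```

A reading of 21.7 degC, 55% air humidity and 38% soil humidity round-trips through a 4-byte payload, small enough for even the most restrictive LoRaWAN duty-cycle settings.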
The development of algorithms in the sense of algorithm engineering proceeds in cycles. The designed algorithm is analyzed theoretically and subsequently implemented. After the practical evaluation, the design is developed further using the insights gained. Formal verification of the implementation, alongside the practical evaluation, can improve this development process. With the Java Modeling Language (JML) and the KeY tool, a simple specification language and a user-friendly, automated verification tool are available. This thesis investigates to what extent the KeY tool is suited to verifying more complex algorithms, and what feedback for algorithm designers can be gained from the verification. The investigation is carried out on Dijkstra's algorithm for computing shortest paths in a graph. A concrete implementation of the standard algorithm and, subsequently, implementations of further variants are to be verified. This mimics the development process of the algorithm, looking for possible feedback in each iteration. During the verification of the concrete implementation, we notice that it is necessary to first verify a more abstract implementation with simpler data structures. With the insights gained there, we can then continue the verification of the concrete implementation. Thanks to these preceding verifications, the variants of the algorithm can be verified as well. The complexity of Dijkstra's algorithm causes the KeY tool some performance difficulties, which is why we have to reduce the degree of automation somewhat during the verification. On the other hand, it turns out that a number of useful pieces of feedback can be derived from the verification.
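The shortest-path computation under discussion can be sketched with run-time checks standing in for the formally proven properties. This Python sketch of the standard Dijkstra algorithm uses assertions to mirror the kind of precondition and loop invariant one would state in JML and discharge with KeY; it is an illustration, not the verified Java implementation from the thesis.

```python
import heapq

def dijkstra(graph, source):
    """Standard Dijkstra over graph = {node: [(neighbor, weight), ...]}.
    The assertions dynamically check properties that a JML specification
    would state: non-negative weights (precondition) and the invariant
    that settled nodes carry final, non-decreasing distances."""
    dist = {source: 0}
    settled = set()
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in settled:
            continue
        # Invariant: every already-settled node has a final distance that
        # does not exceed the distance of the node being settled now.
        assert all(dist[s] <= d for s in settled)
        settled.add(u)
        for v, w in graph.get(u, []):
            assert w >= 0  # precondition: no negative edge weights
            if v not in dist or d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist
```

Turning such dynamic assertions into statically proven JML contracts is exactly the step the thesis evaluates with the KeY tool.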
Hybrid systems are the result of merging the two most commonly used models for dynamical systems, namely continuous dynamical systems defined by differential equations and discrete-event systems defined by automata. One can view hybrid systems as constrained systems, where the constraints describe the possible process flows, invariants within states, and transitions on the one hand, and characterize certain parts of the state space (e.g. the set of initial states, or the set of unsafe states) on the other hand. Therefore, it is advantageous to use constraint logic programming (CLP) as an approach to model hybrid systems. In this paper, we provide CLP implementations that model hybrid systems comprising several concurrent hybrid automata, whose size is only linearly proportional to the size of the given system description. Furthermore, we allow different levels of abstraction by making use of hierarchies as in UML statecharts. In consequence, the CLP model can be used for analyzing and testing the absence or existence of (un)wanted behaviors in hybrid systems. Thus, in summary, we obtain a procedure for the formal verification of hybrid systems by model checking, employing logic programming with constraints.
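The discrete part of the safety question such a model answers can be illustrated without the continuous dynamics. The sketch below checks reachability of a target location over an automaton's discrete transition structure only; the constraint solving over continuous flows and invariants that CLP contributes is deliberately omitted, and all location names are illustrative.

```python
from collections import deque

def reachable(transitions, initial, target):
    """Breadth-first search over the discrete locations of an automaton:
    can the target (e.g. unsafe) location be reached from the initial one?
    transitions maps a location to its list of successor locations."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        loc = queue.popleft()
        if loc == target:
            return True
        for nxt in transitions.get(loc, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False
```

In the full CLP setting, each discrete step would additionally be guarded by constraints over the continuous variables, so that only trajectories satisfying the flow conditions and invariants are explored.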
Unlocking the semantics of multimedia presentations in the web with the multimedia metadata ontology
(2010)
The semantics of rich multimedia presentations in the web, such as SMIL, SVG and Flash, cannot, or only to a very limited extent, be understood by search engines today. This hampers the retrieval of such presentations and makes their archival and management a difficult task. Existing metadata models and metadata standards are either conceptually too narrow, focus on a specific media type only, cannot be used and combined together, or are not practically applicable for the semantic description of rich multimedia presentations. In this paper, we propose the Multimedia Metadata Ontology (M3O) for annotating rich, structured multimedia presentations. The M3O provides a generic modeling framework for representing sophisticated multimedia metadata. It allows for integrating the features provided by existing metadata models and metadata standards. Our approach is based on Semantic Web technologies and can be easily integrated with multimedia formats such as the W3C standards SMIL and SVG. With the M3O, we unlock the semantics of rich multimedia presentations in the web by making the semantics machine-readable and machine-understandable. The M3O is used with our SemanticMM4U framework for the multi-channel generation of semantically rich multimedia presentations.
The present thesis investigates attitudes and prosocial behavior between workgroups from a social identity and intergroup contact perspective. Based on the Common In-group Identity Model (CIIM; Gaertner & Dovidio, 2000), it is hypothesized that "optimal" conditions for contact (Allport, 1954) create a common identity at the organizational level which motivates workgroups to cooperate and show organizational citizenship behavior (OCB) rather than intergroup bias. Predictions based on the CIIM are extended with hypotheses derived from the In-group Projection Model (IPM; Mummendey & Wenzel, 1999) and the Self-Categorization Model of Group Norms (Terry & Hogg, 1996). Hypotheses are tested with data from N1 = 281 employees of N2 = 49 different workgroups and their workgroup managers of a German mail-order company (Study 1). Results indicate that group- and individual-level contact conditions are predictive of lower levels of intergroup bias and higher levels of cooperation and helping behavior. A common in-group representation mediates the effect on out-group attitudes and intergroup cooperation. In addition, the effect of a common in-group representation on intergroup bias is moderated by relative prototypicality, as predicted by the IPM, and the effect of prosocial group norms on helping behavior is moderated by workgroup identification, as predicted by the Self-Categorization Model of Group Norms. A longitudinal study with Ntotal = 57 members of different student project groups replicates the finding that contact under "optimal" conditions reduces intergroup bias and increases prosocial behavior between organizational groups. However, a common in-group representation is not found to mediate this effect in Study 2. Initial findings also indicate that individual-level variables, such as helping behavior toward members of another workgroup, may be better accounted for by variables at the same level of categorization (cf. Haslam, 2004).
Thus, contact in a context that makes personal identities of workgroup members salient (i.e., decategorization) may be more predictive of interpersonal prosocial behavior, while contact in a context that makes workgroup identities salient (i.e., categorization) may be more predictive of intergroup prosocial behavior (cf. Tajfel, 1978). Further data from Study 1 support such a context-specific effect of contact between workgroups on interpersonal and intergroup prosocial behavior, respectively. In the last step, a temporal integration of the contact contexts that either lead to decategorization, categorization, or recategorization are examined based on the Longitudinal Contact Model (Pettigrew, 1998). A first indication that a temporal sequence from decategorization via categorization to recategorization may be particularly effective in fostering intergroup cooperation is obtained with data from Study 2. In order to provide a heuristic model for research on prosocial behavior between workgroups, findings are integrated into a Context-Specific Contact Model. The model proposes specific effects of contact in different contexts on prosocial behavior at different levels of categorization. Possible mediator and moderator processes are suggested. A number of implications for theory, future research and the management of relations between workgroups are discussed.
The Coronavirus Pandemic has influenced the lives of many people. We analyzed the effects of physical activity and stress on students’ motivation during the Pandemic. Participants were 254 university students who reported their academic motivation, physical activity, general stress, the Coronavirus Pandemic strain, and their Coronavirus stress. Women reported higher levels of Coronavirus stress, general stress, and motivation. The Coronavirus stress was predicted by the strain of the Coronavirus Pandemic but not by physical activity. General stress and gender predicted mastery goals, and performance goals were predicted by general stress. Physical activity was not related to students’ motivation during the Pandemic. Higher levels of general stress were associated with higher academic motivation. Negative emotions like stress could have enhanced students’ motivation during uncertain times of the Pandemic. Moreover, a moderate stress level could be favorable for academic dedication and achievement.
With the work presented in this thesis, the CH4 emissions of the River Saar were quantified continuously in space and time, and all relevant processes leading to the observed pattern were identified. The direct comparison between reservoir zones and free-flowing intermediate reaches revealed that the reservoir zones are CH4 emission hot spots and emitted over 90% of the total CH4. On average, the reservoir zones emitted over 80 times more CH4 per square meter than the intermediate reaches between dams (0.23 vs. 19.7 mol CH4 m⁻² d⁻¹). The high emission rates measured in the reservoir zones fall into the range of emissions observed in tropical reservoirs. The main reason for this is the accumulation of thick, organic-rich sediments, and we showed that the net sedimentation rate is an excellent proxy for estimating ebullitive emissions. Within the hot spot zones, the ebullitive flux also enhanced the diffusive surface emissions as well as the degassing emissions at dams.
To resolve the high temporal variability, we developed an autonomous instrument for continuous measurements of the ebullition rate over long periods (> 4 weeks). With this instrument we could quantify the variability and identify the relevant trigger mechanisms. At the Saar, ship-lock-induced surges and ship waves were responsible for over 85% of all large ebullition events. This dataset was also used to determine the error associated with short sampling periods, and we found that with sampling periods of 24 hours, as used in other studies, the ebullition rates were systematically underestimated by ~50%. Measuring the temporal variability enabled us to build a conceptual framework for estimating the temporal pattern of ebullition in other aquatic systems. With respect to the contribution of freshwater systems to the global CH4 emissions, hot spot emission sites in impounded rivers have the potential to increase the current global estimate by up to 7%.
Reducing gender bias in STEM is key to generating more equality and contributing to a more balanced workforce in this field. Spatial ability and its components are cognitive processes crucial to success in STEM education and careers. Significant gender differences have consistently been found in mental rotation (MR), the ability to mentally transform two- and three-dimensional objects. The aim of this pilot study is to examine factors in psychological assessment which may contribute to gender differences in MR performance. Moreover, the findings will inform the development of new approaches to assessment using computer adaptive testing (CAT). (1) Background: The study examines the impact of emotional regulation on MR performance in primary school children with a mean age of 9.28 years. (2) Methods: Skin conductance was measured to assess the impact of emotional reactivity (ER) on performance during an MR task. (3) Results: Patterns of ER influence response time (RT) on specific items in the task. (4) Conclusions: Identifying the effects of emotional arousal and issues of test construction, such as stereotyped stimuli and item difficulty, in tests of spatial ability warrants ongoing investigation. It is vital to ensure that these factors do not compromise the accurate measurement of performance and inadvertently contribute to the gender gap in STEM.
This habilitation thesis collects works addressing several challenges on handling uncertainty and inconsistency in knowledge representation. In particular, this thesis contains works which introduce quantitative uncertainty based on probability theory into abstract argumentation frameworks. The formal semantics of this extension is investigated and its application for strategic argumentation in agent dialogues is discussed. Moreover, both the computational as well as the meaningfulness of approaches to analyze inconsistencies, both in classical logics as well as logics for uncertain reasoning is investigated. Finally, this thesis addresses the implementation challenges for various kinds of knowledge representation formalisms employing any notion of inconsistency tolerance or uncertainty.
Graph-based data formats are flexible in representing data. In particular semantic data models, where the schema is part of the data, gained traction and commercial success in recent years. Semantic data models are also the basis for the Semantic Web - a Web of data governed by open standards in which computer programs can freely access the provided data. This thesis is concerned with the correctness of programs that access semantic data. While the flexibility of semantic data models is one of their biggest strengths, it can easily lead to programmers accidentally not accounting for unintuitive edge cases. Often, such exceptions surface during program execution as run-time errors or unintended side-effects. Depending on the exact condition, a program may run for a long time before the error occurs and the program crashes.
This thesis defines type systems that can detect and avoid such run-time errors based on schema languages available for the Semantic Web. In particular, this thesis uses the Web Ontology Language (OWL) and its theoretic underpinnings, i.e., description logics, as well as the Shapes Constraint Language (SHACL) to define type systems that provide type-safe data access to semantic data graphs. Providing a safe type system is an established methodology for proving the absence of run-time errors in programs without requiring execution. Both schema languages are based on possible-world semantics but differ in their treatment of incomplete knowledge. While OWL allows for modelling incomplete knowledge through an open-world semantics, SHACL relies on a fixed domain and closed-world semantics. We provide the formal underpinnings for type systems based on each of the two schema languages. In particular, we base our notion of types on sets of values, which allows us to specify a subtype relation based on subset semantics. In the case of description logics, deciding subsumption is a routine problem. For the type system based on SHACL, we are able to translate subtyping into a description logic subsumption problem.
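The subset-based notion of subtyping can be illustrated in miniature: model each type as the set of values it denotes, and decide T1 <: T2 by set inclusion. The example value sets below (employees and persons) are invented for illustration and are not from the thesis.

```python
def is_subtype(sub_values, super_values):
    """Toy illustration of subset-semantics subtyping: T1 is a subtype of
    T2 iff every value denoted by T1 is also denoted by T2. In the thesis,
    this inclusion is decided symbolically (e.g. via description logic
    subsumption), not by enumerating values as done here."""
    return set(sub_values) <= set(super_values)

# Hypothetical example: every node conforming to an Employee shape
# also conforms to a Person shape, but not vice versa.
employees = {"alice", "bob"}
persons = {"alice", "bob", "carol"}
```

Under this reading, an expression typed Employee may safely be used wherever a Person is expected, which is exactly the substitution guarantee a type system needs.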
UML models and OWL ontologies constitute modeling approaches with different strengths and weaknesses that make them appropriate for specifying different aspects of software systems. In particular, OWL ontologies are well suited to specify classes using an expressive logical language with highly flexible, dynamic and polymorphic class membership, while UML diagrams are much more suitable for specifying not only static models including classes and associations, but also dynamic behavior. Though MOF-based metamodels and UML profiles for OWL have been proposed in the past, an integrated use of both modeling approaches in a coherent framework has been lacking so far. We present such a framework, TwoUse, for developing integrated models, combining the benefits of UML models and OWL ontologies.
Tractography on HARDI data
(2011)
Diffusion weighted imaging is an important modality in clinical imaging and the only possibility to gain insight into the human brain noninvasively and in vivo. The applications of this imaging technique are diverse. It is used to study the brain, its structure, its development and the functionality of its different areas. Further important fields of application are neurosurgical planning, the examination of pathologies, and the investigation of Alzheimer's disease, strokes, and multiple sclerosis. This thesis gives a brief introduction to MRI and diffusion MRI. Based on this, the most widely used data representation in clinical diffusion MRI, the diffusion tensor, is introduced. As the diffusion tensor suffers from severe limitations, new techniques subsumed under the term HARDI (high angular resolution diffusion imaging) are introduced and discussed in detail. Further, an extensive introduction to tractography, i.e. approaches that aim at reconstructing neuronal fibers, is given. Based on the knowledge from the theoretical part, established tractography algorithms are redesigned to handle HARDI data and, thus, improve the reconstruction of neuronal fibers. Among these algorithms, a novel approach is presented that successfully reconstructs fibers on phantom data as well as on human brain data. Further, a novel global classification approach is presented to cluster voxels according to their diffusion properties.
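The basic scheme that deterministic tractography builds on can be sketched as streamline integration: starting from a seed, repeatedly step along the local fiber direction until no direction is available. The following Euler-integration sketch is illustrative only; real algorithms, including the HARDI-based variants in this thesis, use richer direction models (multiple fiber orientations per voxel), interpolation, and anatomical stopping criteria.

```python
def track_streamline(seed, direction_field, step=0.5, max_steps=1000):
    """Trace a streamline from seed using first-order Euler integration.
    direction_field maps a 3D position to a unit direction tuple, or None
    where tracking should stop (e.g. outside white matter); all names
    are illustrative."""
    pos = seed
    path = [pos]
    for _ in range(max_steps):
        d = direction_field(pos)
        if d is None:
            break
        pos = (pos[0] + step * d[0],
               pos[1] + step * d[1],
               pos[2] + step * d[2])
        path.append(pos)
    return path
```

The limitation of the diffusion tensor shows up precisely here: a tensor field can return only one direction per position, so crossing-fiber regions derail the integration, which is what the HARDI-based redesigns address.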
Water scarcity is already an omnipresent problem in many parts of the world, especially in sub-Saharan Africa. The dry years 2018 and 2019 showed that water resources are finite in Germany as well. Projections and predictions for the next decades indicate that renewal rates of existing water resources will decline due to the growing influence of climate change, but that water extraction rates will increase due to population growth. It is therefore important to find alternative and sustainable methods to make optimal use of the water resources currently available. For this reason, the reuse of treated wastewater for irrigation and recharge purposes has become one focus of scientific research in this field. However, it must be taken into account that wastewater contains so-called micropollutants, i.e., substances of anthropogenic origin. These are, e.g., pharmaceuticals, pesticides and industrial chemicals which enter the wastewater, but also metabolites that are formed in the human body from pharmaceuticals or personal care products. Through the treatment in wastewater treatment plants (WWTPs) as well as through chemical, biological and physical processes in the soil passage during the reuse of water, these micropollutants are transformed into new substances, known as transformation products (TPs), which further broaden the number of contaminants that can be detected within the whole water cycle.
Despite the fact that the presence of human metabolites and environmental TPs in untreated and treated wastewater has been known for many years, they are rarely included in common routine analysis methods. Therefore, a first goal of this thesis was the development of an analysis method based on liquid chromatography - tandem mass spectrometry (LC-MS/MS) that covers a broad spectrum of frequently detected micropollutants including their known metabolites and TPs. The developed multi-residue analysis method contained a total of 80 precursor micropollutants and 74 metabolites and TPs of different substance classes. The method was validated for the analysis of different water matrices (WWTP influent and effluent, surface water and groundwater from a bank filtration site). The influence of the MS parameters on the quality of the analysis data was studied. Despite the high number of analytes, a sufficient number of data points per peak was maintained, ensuring a high sensitivity and precision as well as a good recovery for all matrices. The selection of the analytes proved to be relevant as 95% of the selected micropollutants were detected in at least one sample. Several micropollutants were quantified that were not in the focus of other current multi-residue analysis methods (e.g. oxypurinol). The relevance of including metabolites and TPs was demonstrated by the frequent detection of, e.g., clopidogrel acid and valsartan acid at higher concentrations than their precursors, the latter even being detected in samples of bank filtrate water.
By the integration of metabolites, which are produced in the body by biological processes, and biological and chemical TPs, the multi-residue analysis method is also suitable for elucidating degradation mechanisms in treatment systems for water reuse that, e.g., use a soil passage for further treatment. In the second part of the thesis, samples from two treatment systems based on natural processes were analysed: a pilot-scale above-ground sequential biofiltration system (SBF) and a full-scale soil aquifer treatment (SAT) site. In the SBF system mainly biological degradation was observed, which was clearly demonstrated by the detection of biological TPs after the treatment. The efficiency of the degradation was improved by an intermediate aeration, which created oxic conditions in the upper layer of the following soil passage. In the SAT system a combination of biodegradation and sorption processes occurred. By the different behaviour of some biodegradable micropollutants compared to the SBF system, the influence of redox conditions and microbial community was observed. An advantage of the SAT system over the SBF system was found in the sorption capacity of the natural soil. Especially positively charged micropollutants showed attenuation due to ionic interactions with negatively charged soil particles. Based on the physicochemical properties at ambient pH, the degree of removal in the investigated systems and the occurrence in the source water, a selection of process-based indicator substances was proposed.
Within the first two parts of this thesis a micropollutant was frequently detected at elevated concentrations in WWTP effluents which was not previously in the focus of environmental research: the antidiabetic drug sitagliptin (STG). STG showed low degradability in biological systems, and thus it was investigated to what extent chemical treatment by ozonation can ensure its attenuation. STG contains an aliphatic primary amine as the principal point of attack for the ozone molecule. There is only limited information about the behaviour of this functional group during ozonation, and thus STG served as an example for other micropollutants containing aliphatic primary amines. pH-dependent degradation kinetics were observed due to the protonation of the primary amine at lower pH values. At pH values in the range 6 - 8, which is typical for the environment and for WWTPs, STG showed second-order rate constants in the range of 10³ M⁻¹ s⁻¹ and thus belongs to the group of readily degradable substances. However, complete degradation can only be expected at significantly higher pH values (> 9). The transformation of the primary amine moiety into a nitro group was observed as the major degradation mechanism for STG during ozonation. Other mechanisms involved the formation of a diketone, bond breakages and the formation of trifluoroacetic acid (TFA). Investigations at a pilot-scale ozonation plant using the effluent of the biological stage of a municipal WWTP as source water confirmed the results of the laboratory studies: STG could not be removed completely even at high ozone doses, and the nitro compound was formed as the main TP and remained stable during further ozonation and subsequent biological treatment.
It can therefore be assumed that under realistic conditions both a residual concentration of STG and the formed main TP as well as other stable TPs such as TFA can be detected in the effluents of a WWTP consisting of conventional biological treatment followed by ozonation and subsequent biological polishing steps.
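The reported second-order rate constant allows a back-of-envelope estimate of how fast STG reacts under ozonation. Assuming excess ozone, degradation becomes pseudo-first order with k' = k·[O3], so the half-life is ln 2 / k'. The ozone concentration used below is an assumed illustrative value, not a figure from the thesis.

```python
import math

def ozonation_half_life(k_second_order, ozone_molar):
    """Pseudo-first-order half-life of a micropollutant under excess ozone:
    t_1/2 = ln 2 / (k * [O3]). k_second_order in M^-1 s^-1, ozone_molar in M.
    This is standard water-chemistry arithmetic, not a thesis result."""
    return math.log(2) / (k_second_order * ozone_molar)
```

With k = 10³ M⁻¹ s⁻¹ and an assumed dissolved ozone concentration of 10 µM, this gives a half-life of roughly 70 seconds, consistent with the classification of STG as readily, but not completely, degradable under realistic contact times.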
Despite the significant presence of neuroactive substances in the environment, bioassays that allow the detection of diverse groups of neuroactive mechanisms of action are not well developed and not properly integrated into environmental monitoring and chemical regulation. Therefore, there is a need to develop testing methods which are amenable to fast and high-throughput neurotoxicity testing. The overall goal of this thesis is to develop a test method for the toxicological characterization and screening of neuroactive substances and their mixtures which could be used for prospective and diagnostic hazard assessment.
In this thesis, the behavior of zebrafish embryos was explored as a promising tool to distinguish between different neuroactive mechanisms of action. Recently, new behavioral tests have been developed, including the photomotor response (PMR), locomotor response (LMR) and spontaneous tail coiling (STC) tests. However, the experimental parameters of these tests lack consistency across protocols, e.g. in exposure time, imaging time, age at exposure, and endpoint parameters. To understand how experimental parameters may influence the toxicological interpretation of behavior tests, a systematic review of existing behavioral assays was conducted in Chapter 2. Its results show that exposure concentration and exposure duration highly influenced the comparability between different test methods, and the spontaneous tail coiling (STC) test was selected for further testing based on its comparatively higher sensitivity and capacity to detect neuroactive substances.
STC is the first observable motor activity generated by the developing neural network of the embryo, which is assumed to occur as a result of the innervation of the muscle by the primary motor neurons. Therefore, STC could be a useful endpoint to detect effects on muscle innervation and on the whole nervous system. Consequently, important parameters of the STC test were optimized and an automated workflow to evaluate the STC was developed with the open access software KNIME® (Chapter 3).
Appropriately interpreting the observed effects of a single chemical, and especially mixture effects, requires an understanding of toxicokinetics and biotransformation. Most importantly, the biotransformation capacity of zebrafish embryos might be limited, and this could be a challenge for the assessment of chemicals such as organophosphates, which require a bioactivation step to effectively inhibit the acetylcholinesterase (AChE) enzyme. Therefore, the influence of potentially limited biotransformation on the toxicity pathway of a typical organophosphate, chlorpyrifos, was investigated in Chapter 5. Chlorpyrifos could not inhibit AChE, and this was attributed to a possible lack of biotransformation in 24 hpf embryos (Chapter 5).
Since neuroactive substances occur in the environment as mixtures, it is more realistic to assess their combined effect rather than their individual effects. Therefore, mixture toxicity was predicted using the concentration addition and independent action models. The results show that mixtures of neuroactive substances with different mechanisms of action but similar effects can be predicted with both concentration addition and independent action (Chapter 4). Apart from being able to predict the combined effect of neuroactive substances for prospective risk assessment, it is also important to assess in retrospect the combined neurotoxic effect of environmental samples, since neuroactive substances are the largest group of chemicals occurring in the environment. In Chapter 6, the STC test was found to be capable of detecting the neurotoxic effects of a wastewater effluent sample. Hence, the STC test is proposed as an effect-based tool for monitoring acute and neurotoxic effects in the environment.
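The two reference models named above can be written down directly. The following sketch, using hypothetical concentration fractions and ECx values rather than data from the thesis, shows how a mixture prediction is obtained under each model:

```python
def concentration_addition(fractions, ecx_values):
    """Predict the mixture ECx under concentration addition (CA).

    fractions  : concentration fractions p_i of each component (summing to 1)
    ecx_values : ECx of each single substance at the same effect level x
    Returns the predicted mixture ECx: (sum_i p_i / ECx_i)^-1.
    """
    return 1.0 / sum(p / ecx for p, ecx in zip(fractions, ecx_values))

def independent_action(effects):
    """Predict the mixture effect under independent action (IA).

    effects : fractional effects E(c_i) of each component alone, in [0, 1]
    Returns E(mix) = 1 - prod_i (1 - E(c_i)).
    """
    prod = 1.0
    for e in effects:
        prod *= (1.0 - e)
    return 1.0 - prod

# Hypothetical two-component mixture: equal fractions, single-substance
# ECx values of 2 and 4 mg/L; single-substance effects of 20 % and 30 %.
print(concentration_addition([0.5, 0.5], [2.0, 4.0]))  # ~2.667 (mixture ECx, mg/L)
print(independent_action([0.2, 0.3]))                  # ~0.44 (fractional mixture effect)
```

Comparing such CA and IA predictions against observed mixture effects is the standard way to decide which reference model describes a given mixture better.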
Overall, this thesis shows the utility and versatility of zebrafish embryo behavior testing for screening neuroactive substances, which supports its use in prospective and diagnostic hazard assessment. This will encourage the move away from expensive and demanding animal testing. The information in this thesis has great potential to provide precautionary solutions, not only for human exposure to neuroactive chemicals but for the environment at large.
Towards Improving the Understanding of Image Semantics by Gaze-based Tag-to-Region Assignments
(2011)
Eye-trackers have been used in the past to identify visual foci in images, find task-related image regions, or localize affective regions in images. However, they have not been used for identifying specific objects in images. In this paper, we investigate whether it is possible to assign image regions showing specific objects with tags describing these objects by analyzing the users' gaze paths. To this end, we have conducted an experiment with 20 subjects viewing 50 image-tag-pairs each. We have compared the tag-to-region assignments for nine existing and four new fixation measures. In addition, we have investigated the impact of extending region boundaries, weighting small image regions, and the number of subjects viewing the images. The paper shows that a tag-to-region assignment with an accuracy of 67% can be achieved by using gaze information. In addition, we show that multiple regions on the same image can be differentiated with an accuracy of 38%.
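As an illustration of the general idea, the simplest conceivable fixation measure, counting fixations per region and assigning the tag to the region with the most fixations, can be sketched as follows. The region names, bounding boxes and fixation coordinates are hypothetical, not taken from the experiment:

```python
def assign_tag_to_region(fixations, regions):
    """Assign a tag to the image region receiving the most gaze fixations.

    fixations : list of (x, y) fixation points from the gaze path
    regions   : dict mapping region name -> bounding box (x0, y0, x1, y1)
    Returns the name of the region with the highest fixation count,
    i.e. a fixation-count measure in its most basic form.
    """
    counts = {name: 0 for name in regions}
    for (x, y) in fixations:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    return max(counts, key=counts.get)

# Hypothetical image with two labelled regions and five recorded fixations
regions = {"dog": (0, 0, 100, 100), "ball": (120, 0, 200, 80)}
fixations = [(10, 10), (50, 60), (90, 20), (130, 40), (150, 30)]
print(assign_tag_to_region(fixations, regions))  # dog (3 fixations vs 2)
```

The measures compared in the paper refine this scheme, e.g. by weighting fixation duration, extending region boundaries, or compensating for small regions.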
Augmented reality (AR) applications typically extend the user's view of the real world with virtual objects.
In recent years, AR has gained increasing popularity and attention, which has led to improvements in the required technologies. AR has become available to almost everyone.
Researchers have made great progress towards the goal of believable AR, in which the real and virtual worlds are combined seamlessly.
They mainly focus on issues like tracking, display technologies and user interaction, and give little attention to visual and physical coherence when real and virtual objects are combined. For example, virtual objects should not only respond to the user's input; they should also interact with real objects. Generally, AR becomes more believable and realistic if virtual objects appear fixed or anchored in the real scene, appear indistinguishable from it, and respond to any changes within it.
This thesis examines three challenges in the field of computer vision that must be met to achieve a believable combined world in which virtual objects appear and behave like real objects.
Firstly, the thesis concentrates on the well-known tracking and registration problem. The tracking and registration challenge is discussed and an approach is presented to estimate the position and viewpoint of the user so that virtual objects appear fixed in the real world. Appearance-based line models, which keep only relevant edges for tracking purposes, enable absolute registration in the real world and provide robust tracking. On the one hand, there is no need to spend much time creating suitable models manually. On the other hand, the tracking can deal with changes within the object or the scene to be tracked. Experiments have shown that the use of appearance-based line models improves the robustness, accuracy and re-initialization speed of the tracking process.
Secondly, the thesis deals with the subject of reconstructing the surface of a real environment and presents an algorithm to optimize an ongoing surface reconstruction. A complete 3D surface reconstruction of the target scene
offers new possibilities for creating more realistic AR applications. Several interactions between real and virtual objects, such as collision and occlusions, can be handled with physical correctness. Whereas previous methods focused on improving surface reconstructions offline after a capturing step, the presented method de-noises, extends and fills holes during the capturing process. Thus, users can explore an unknown environment without any preparation tasks such as moving around and scanning the scene, and without having to deal with the underlying technology in advance. In experiments, the approach provided realistic results where known surfaces were extended and filled in plausibly for different surface types.
Finally, the thesis focuses on handling occlusions between the real and virtual worlds more realistically, by re-interpreting the occlusion challenge as an alpha matting problem. The presented method overcomes limitations in state-of-the-art methods by estimating a blending coefficient per pixel of the rendered virtual scene, instead of calculating only their visibility. In several experiments and comparisons with other methods, occlusion handling through alpha matting worked robustly and overcame limitations of low-cost sensor data; it also outperformed previous work in terms of quality, realism and practical applicability.
The method can deal with noisy depth data and yields realistic results in regions where foreground and background are not strictly separable (e.g. caused by fuzzy objects or motion blur).
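Re-interpreting occlusion as alpha matting reduces the final rendering step to standard per-pixel compositing. A minimal numpy sketch, with made-up pixel values rather than real sensor data, illustrates the blending once the coefficients have been estimated:

```python
import numpy as np

def composite(virtual_rgb, camera_rgb, alpha):
    """Blend a rendered virtual frame over the camera image per pixel.

    virtual_rgb, camera_rgb : float arrays of shape (H, W, 3) in [0, 1]
    alpha                   : per-pixel blending coefficients, shape (H, W);
                              1 = virtual object fully visible,
                              0 = fully occluded by a real object
    Returns the composited frame C = alpha * F + (1 - alpha) * B.
    """
    a = alpha[..., None]  # add a channel axis so alpha broadcasts over RGB
    return a * virtual_rgb + (1.0 - a) * camera_rgb

# Tiny 1x2 example: left pixel fully virtual, right pixel half occluded
virtual = np.array([[[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]])  # red virtual object
camera  = np.array([[[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]]])  # blue real background
alpha   = np.array([[1.0, 0.5]])
print(composite(virtual, camera, alpha))
```

The fractional coefficients are what lets the method handle fuzzy boundaries and motion blur: a binary visibility mask would force each pixel to be entirely virtual or entirely real.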
The provision of electronic participation services (e-participation) is a complex socio-technical undertaking that needs comprehensive design and implementation strategies. E-participation service providers, in most cases administrations and governments, struggle with changing requirements that demand more transparency, better connectivity and increased collaboration among different actors. At the same time, fewer staff are available. As a result, recent research assesses only a minority of e-participation services as successful. The challenge is that the e-participation domain lacks comprehensive approaches to design and implement (e-)participation services. Enterprise Architecture (EA) frameworks have evolved in information systems research as an approach to guide the development of complex socio-technical systems. This approach can guide the design and implementation of such services if the collection of organisations with the commonly held goal of providing participation services is understood as an E-Participation Enterprise (EE). However, research and practice in the e-participation domain have not yet exploited EA frameworks. Consequently, the problem scope that motivates this dissertation is the existing gap in research on deploying EA frameworks in e-participation design and implementation. The research question that drives this research is: What methodical and technical guidance do architecture frameworks provide that can be used to design and implement better and more successful e-participation?
This dissertation presents a literature study showing that existing approaches have not yet covered the challenges of comprehensive e-participation design and implementation. Accordingly, the research moves on to investigate established EA frameworks such as the Zachman Framework, TOGAF, DoDAF, FEA, ARIS and ArchiMate for their applicability. While the application of these frameworks to e-participation design and implementation is possible, an integrated approach has been lacking so far. The synthesis of the literature review with practical insights into the design and implementation of e-participation services from four projects shows the challenges of adapting architecture frameworks to this domain. However, the research also shows the potential of combining the different approaches. Consequently, the research moves on to develop the E-Participation Architecture Framework (EPART-Framework). To this end, the dissertation applies design science research, including literature review and action research. Two independent settings test an initial version of the EPART-Framework. The results yield the EPART-Framework presented in this dissertation.
The EPART-Framework comprises the EPART-Metamodel with six EPART-Viewpoints, which frame the stakeholder concerns: the Participation Scope, the Participant Viewpoint, the Participation Viewpoint, the Data & Information Viewpoint, the E-Participation Viewpoint, and the Implementation & Governance Viewpoint. The EPART-Method supports the stakeholders in designing the EE and implementing e-participation, and stores its output in an architecture description and a solution repository. It consists of five consecutive phases accompanied by requirements management: Initiation, Design, Implementation and Preparation, Participation, and Evaluation. The EPART-Framework fills the gap between the e-participation domain and the enterprise architecture framework domain. The evaluation gives reasonable evidence that the framework is a valuable addition, in academia and in practice, to improve e-participation design and implementation. At the same time, it shows opportunities for future research to extend and advance the framework.
This paper addresses the problem of assessing the quality of a medical electronic service. A variety of quality dimensions and factors, as well as methods and models applied in different fields for assessing service quality, are reviewed. The basic aspects, requirements and peculiarities of implementing medical electronic services are investigated. The results of this analysis, together with a set of information models developed for this paper that describe the processes of assessing the quality of the electronic service "Booking an appointment with a physician", allowed us to describe the methodology and to state the problem of assessing the quality of this service.
Topic models are a popular tool to extract concepts from large text corpora. These corpora tend to contain hidden meta-groups whose size relation is frequently imbalanced, and their presence is often ignored when applying a topic model. This thesis therefore explores the influence of such imbalanced corpora on topic models.
The influence is tested by training LDA on samples with varying size relations. The samples are generated from data sets containing large group differences (i.e., language) and small group differences (i.e., political orientation). The predictive performance on these imbalanced corpora is judged using perplexity.
The experiments show that the presence of groups in training corpora can influence the prediction performance of LDA. The impact varies due to several factors, including language-specific perplexity scores. The group-related prediction performance changes when the relative group sizes are varied; the actual change varies between data sets.
LDA is able to distinguish between different latent groups in document corpora if the differences between groups are large enough, e.g. for groups with different languages. The proportion of group-specific topics is smaller than the group's share of the corpus, and relatively smaller still for minorities.
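Perplexity itself is straightforward to compute from per-token log-likelihoods. A minimal sketch, with hypothetical token probabilities rather than values from the experiments, illustrates why a group the model fits poorly receives a higher score:

```python
import math

def perplexity(log_probs):
    """Compute perplexity from per-token natural-log probabilities.

    log_probs : log probabilities the trained model assigns to each
                held-out token.
    Perplexity = exp(-mean log-likelihood); lower values mean the model
    predicts the held-out corpus better.
    """
    return math.exp(-sum(log_probs) / len(log_probs))

# Hypothetical held-out tokens: a model trained on a majority-group corpus
# may assign minority-group tokens lower probabilities, raising perplexity.
majority_tokens = [math.log(0.05)] * 100  # p = 0.05 per token
minority_tokens = [math.log(0.01)] * 100  # p = 0.01 per token
print(perplexity(majority_tokens))  # ~20.0
print(perplexity(minority_tokens))  # ~100.0
```

Comparing such group-specific perplexities, rather than a single corpus-wide score, is what makes the group-related effects described above visible.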
X-ray computed tomography (XRT) is a three-dimensional, nondestructive, and thus reproducible examination method that allows for the investigation of the internal and external structures of objects. Due to these characteristics, XRT has increasingly established itself as an alternative examination method and is also applied in the field of mineral processing. In this work, XRT is used to investigate the influence of hydrochloric acid leaching of iron-rich bauxites on grain composition. Acid leaching is a promising method for the beneficiation of iron-rich bauxites for refractories. Many studies have already established that leaching with hydrochloric acid can reduce the Fe₂O₃ content of bauxites. However, apart from the influence of the leaching process on the composition of the bauxites, aspects such as the influence of the acid on the exact grain constitution or the porosity behavior have rarely been considered so far. To address these open questions, XRT analysis was used to examine and characterize various bauxites. By comparing identical grains before and after leaching, it was observed that in gibbsite bauxites the acid penetrates deeper and the grain volume decreases significantly. In diasporic and boehmitic bauxites, clear leaching edges can be seen in which the iron content has been reduced.
Herein, the particle size distributions (PSDs) and shape analysis of in vivo bioproduced particles from aqueous Au3+ and Eu3+ solutions by the cyanobacterium Anabaena sp. are examined in detail at the nanoscale. Generally, biosynthesis is affected by numerous parameters. Therefore, it is challenging to find the key set points for generating tailored nanoparticles (NPs). PSDs and shape analysis of the Au- and Eu-NPs were performed with ImageJ using high-resolution transmission electron microscopy (HR-TEM) images. As the HR-TEM image analysis reflects only a fraction of the detected NPs within the cells, additional PSDs of the complete cell were performed to determine the NP count and to evaluate the different accuracies. Furthermore, local PSDs were carried out at five randomly selected locations within a single cell to identify local hotspots or agglomerations. The PSDs show that particle size depends mainly on contact time, while the particle shape is hardly affected. The particles formed are distributed quite evenly within the cells. HR-PSDs for Au-NPs show an average equivalent circular diameter (ECD) of 8.4 nm (24 h) and 7.2 nm (51 h). In contrast, Eu-NPs preferably exhibit an average ECD of 10.6 nm (10 h) and 12.3 nm (244 h). Au-NPs are classified predominantly as "very round" with an average reciprocal aspect ratio (RAR) of ~0.9 and a Feret major axis ratio (FMR) of ~1.17. Eu-NPs mainly belong to the "rounded" class with a smaller RAR of ~0.6 and an FMR of ~1.3. These results show that an increase in contact time is not accompanied by average particle growth for Au-NPs, but by a doubling of the particle number. Anabaena sp. is capable of biosorbing and bioreducing dissolved Au3+ and Eu3+ ions from aqueous solutions, generating nano-sized Au and Eu particles, respectively.
Therefore, it is a low-cost, non-toxic and effective candidate for a rapid recovery of these sought-after metals via the bioproduction of NPs with defined sizes and shapes, providing a high potential for scale-up.
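The descriptors reported above can be computed directly from the segmented particle geometry. A minimal sketch, with hypothetical axis lengths rather than measured values, shows the two definitions assumed here: ECD = 2·sqrt(A/π) and RAR = minor axis / major axis:

```python
import math

def shape_descriptors(area, major_axis, minor_axis):
    """Compute two particle descriptors commonly used in PSD analysis.

    area       : projected particle area (nm^2) from the segmented image
    major_axis : Feret major axis length (nm)
    minor_axis : Feret minor axis length (nm)
    Returns (ECD, RAR): the equivalent circular diameter, i.e. the
    diameter of a circle with the same area, and the reciprocal aspect
    ratio, which is 1.0 for a perfect circle.
    """
    ecd = 2.0 * math.sqrt(area / math.pi)
    rar = minor_axis / major_axis
    return ecd, rar

# Hypothetical near-round Au particle: area of a circle with d = 8.4 nm,
# slightly elongated axes
area = math.pi * (8.4 / 2) ** 2
ecd, rar = shape_descriptors(area, major_axis=8.8, minor_axis=8.0)
print(round(ecd, 1), round(rar, 2))  # 8.4 0.91
```

Applying such descriptors per particle and aggregating them is what yields the averaged ECD and RAR values reported for the Au- and Eu-NP populations.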
Current political issues are often reflected in social media discussions, gathering politicians and voters on common platforms. As these can affect the public perception of politics, the inner dynamics and backgrounds of such debates are of great scientific interest. This thesis treats user-generated messages from an up-to-date dataset of considerable relevance as time series and applies a topic-based analysis of inspiration and agenda setting to them. The Institute for Web Science and Technologies of the University of Koblenz-Landau has collected Twitter data generated by candidates of the European Parliament election 2019. This work processes and analyzes the dataset for various properties, focusing on the influence of politicians and the media on online debates. An algorithm to cluster tweets into topical threads is introduced. Subsequently, sequential association rules are mined, yielding a wide array of potential influence relations between both actors and topics. The elaborated methodology can be configured with different parameters and is extensible in functionality and scope of application.
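A simple form of sequential association rule mining over topical threads can be sketched as follows. The thread contents and the confidence threshold are hypothetical, and the rule definition used here (B follows A anywhere later in the same thread) is an assumption, not necessarily the thesis's exact formulation:

```python
from collections import Counter

def sequential_rules(sequences, min_conf=0.5):
    """Mine simple A -> B rules from ordered event sequences.

    sequences : list of per-thread sequences, e.g. the actors or topics
                of a thread in the order they appeared
    A rule A -> B holds when B occurs after A in the same sequence;
    confidence = support(A then B) / support(A). Rules below min_conf
    are discarded.
    """
    antecedent = Counter()
    pair = Counter()
    for seq in sequences:
        seen = set()
        for i, a in enumerate(seq):
            if a not in seen:           # count each antecedent once per thread
                seen.add(a)
                antecedent[a] += 1
                for b in set(seq[i + 1:]) - {a}:
                    pair[(a, b)] += 1
    return {(a, b): c / antecedent[a]
            for (a, b), c in pair.items()
            if c / antecedent[a] >= min_conf}

# Hypothetical threads: "media" posts tend to precede "politician" posts
threads = [["media", "politician", "voter"],
           ["media", "politician"],
           ["voter", "media"]]
print(sequential_rules(threads, min_conf=0.6))  # only media -> politician survives (confidence 2/3)
```

Rules surviving the threshold are the candidate influence relations; in this toy data, media activity preceding politician activity is the only pattern frequent enough to keep.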
Abstract for the print book: Walden, R. (2008). Architectural Psychology: School, University Campus, and Office Building of the Future. Lengerich: Pabst Science Publishers (in German). The need for display of self in architecture and for users' self-regulation of stress factors, which demonstrates that users crave individual control of their environment (cf. Flammer, 1990; Burger, 1992), motivated this study to use the concept of environmental control as a central criterion for the evaluation of the built environment. It was applied to three case studies: a school, a university campus, and an office building. Advantages and disadvantages of the data-gathering methods of architectural Programming, User-Needs Analysis, and Post-Occupancy Evaluation were analyzed to highlight their significance in terms of Building Performance Evaluation as described by Preiser and Schramm (1997, 2005). The "Koblenz Architecture Questionnaire" was used as an instrument for assessing the built environment of the three case studies, and the study reports selected findings from these questionnaires. The investigation seeks to determine the effect of architecture, especially buildings' provisions for user control of environmental conditions, on user performance (cf. BOSTI studies, 1984, 2001) in three innovative buildings: the Waldorf School in Cologne, the new campus of the University in Koblenz, and the Office Tower of Deutsche Post World Net AG in Bonn. Performance is measured in terms of (1) Learning and Work Efficiency, (2) Well-being, (3) Environmental Control, and (4) Social Behavior (the latter only for the school project), and by means of 21 and 16 additional psychological criteria for the success of the organization in the cases of the university and the office building, respectively.
The study aims, among other things, at reassessing the theoretical concept of 'environmental control' and at making recommendations both for the improvement of existing buildings and for the design of new projects. Two central questions are: In User-Needs Analysis, what is the difference between the assessment of a building for its current use and for its estimated performance in the future? Do certain architectural features influence user assessments on the given performance criteria? In the studies, three mapping sentences were developed according to the 'facet approach' (Borg, 1996), as well as two systems to judge the quality of school and office buildings. Using these systems, information was obtained in all three studies to construct questionnaires. In the school study, teachers were asked 139 questions and pupils 86 questions; responses were obtained from 26 teachers and 122 pupils. For the university, 147 students and 28 faculty members responded to 203 questions. For the office building, 56 student-experts were asked 254 questions. Characteristics of the built environment were rated on a scale from +2 (☺☺: very good "at present", and accordingly very important "in the future") down to −2 (very bad "at present", and very unimportant "in the future"). A general finding was a high and significant correlation between the responses for the three main performance criteria in all three case studies, especially for the 'importance for the future' aspect. This supports the conclusion that a perception of a higher degree of environmental control by users will lead to an increased sense of well-being and, consequently, to a higher expectation of improved work or learning efficiency 'in the future'.
The three studies further show, for example, that users in all three environments desire 'retreat opportunities', which may take the form of student offices in schools, niches and small-group seating in classrooms, and, for the university, sheltered seating in outdoor areas and work tables in the cafeteria. For the offices, users wanted more visual privacy (less transparent office partitions in the Combi Offices) to reduce the visual control of their activities by supervisors and co-workers. The relationships found by the studies between the responses on the central performance criteria and the spatial characteristics of the three buildings support the contention that focused improvements in the built environment, especially with respect to features that enhance user control of environmental conditions, will positively influence users' well-being as well as their work performance and work or learning efficiency.
Organic substances play an essential role in the formation of stable soil structures. In this context, their physico-chemical properties, their interactions with mineral soil constituents and soil-water interactions are particularly important. However, the underlying mechanisms contributing to soil particle cementation by swollen organic substances (hydrogels) remain unclear. Up to now, no mechanistic model has been available that explains the mechanisms of interparticulate hydrogel swelling and its contribution to soil-water interactions and soil structural stability. This mainly results from the lack of appropriate testing methods to study hydrogel swelling in soil, as well as from the difficulties of adapting available methods to the soil/hydrogel system.
In this thesis, 1H (proton) nuclear magnetic resonance (NMR) relaxometry was combined with various soil micro- and macrostructural stability testing methods in order to identify the contribution of hydrogel-swelling-induced soil-water interactions to the structural stability of water-saturated and unsaturated soils. In the first part, the potentials and limitations of 1H NMR relaxometry for elucidating soil structural stabilization mechanisms and various water populations were investigated. In the second part, 1H NMR relaxometry was combined with rheological measurements of soil to assess, in an isolated manner, the contribution of interparticulate hydrogel swelling and various polymer-clay interactions to soil-water interactions and soil structural stability. Finally, the effects of various organic and mineral soil fractions on soil-water interactions and soil structural stability were assessed in more detail for a natural, agriculturally cultivated soil by soil density fractionation, building on the experience gained from the previous experiments.
The increasing complexity of the experiments over the course of this thesis made it possible to link the physico-chemical properties of interparticulate hydrogel structures with soil structural stability on various scales. The established mechanistic model explains the contribution of interparticulate hydrogels to the structural stability of water-saturated and unsaturated soils: while swollen clay particles reduce soil structural stability by acting as a lubricant between soil particles, interparticulate hydrogel structures increase soil structural stability by forming a flexible polymeric network which interconnects mineral particles more effectively than soil pore or capillary water. It was apparent that soil structural stability increases with increasing viscosity of the interparticulate hydrogel, depending on incubation time, soil texture, soil solution composition and external factors such as moisture dynamics and agricultural management practices. The stabilizing effect of interparticulate hydrogel structures further increases in the presence of clay particles, which is attributed to additional polymer-clay interactions and the incorporation of clay particles into the three-dimensional interparticulate hydrogel network. Furthermore, the simultaneous swelling of clay particles and hydrogel structures results in competition for water and thus in a mutual restriction of their swelling in the interparticle space. Thus, polymer-clay interactions not only increase the viscosity of the interparticulate hydrogel, and thus its ability to stabilize soil structures, but also reduce the swelling of clay particles and consequently their negative effects on soil structural stability. The knowledge of these underlying mechanisms enhances our understanding of the formation of stable soil structures and enables appropriate management practices to be adopted in order to maintain a sustainable soil structure.
The limitations and challenges of the mechanistic model that are additionally outlined indicate areas with potential for optimization and further research.
The STOR project aims at the development of a scientific component system employing models and knowledge for object recognition in images. This interim report elaborates on the requirements for such a component system, structures the application area by identifying a large set of basic operations, and shows how a set of appropriate data structures and components can be derived. A small case study exemplifies the approach.
Based on dual process models of information processing, the present research addressed how explicit disgust sensitivity is re-adapted according to implicit disgust sensitivity via self-perception of automatic behavioral cues. Contrary to preceding studies (Hofmann, Gschwendner, & Schmitt, 2009), which concluded that there was a "blind spot" for self- but not for observer perception of automatic behavioral cues, in the present research a re-adaption process was found for both self-perceivers and observers. In Study 1 (N = 75), the predictive validity of an indirect disgust sensitivity measure was tested with a double-dissociation strategy. Study 2 (N = 117) reinvestigated the hypothesis that self-perception of automatic behavioral cues, predicted by an indirect disgust sensitivity measure, leads to a re-adaption of explicit disgust sensitivity measures. Using a different approach from Hofmann et al. (2009), the self-perception procedure was modified by (a) feeding back the behavior several times while a small number of cues had to be rated for each feedback condition, (b) using disgust sensitivity as a domain with clearly unequivocal cues of automatic behavior (facial expression, body movements) and describing these cues unambiguously, and (c) using a specific explicit disgust sensitivity measure in addition to a general explicit disgust sensitivity measure. In Study 3 (N = 130), the findings of Study 2 were replicated, and display rules and need for closure were additionally investigated as moderators of predictive validity and cue utilization. The moderator effects suggest that both displaying a disgusted facial expression and self-perception of one's own disgusted facial expression are subject to a self-serving bias, indicating that facial expression may not be an automatic behavior. Practical implications and implications for future research are discussed.
The present work investigates the wetting characteristics of soils with regard to their dependence on environmental parameters such as water content (WC), pH, drying temperature and wetting temperature, for wettable and repellent soils from two contrasting anthropogenic sites, the former sewage disposal field Berlin-Buch and the inner-city park Berlin-Tiergarten. The aim of this thesis is to deepen the understanding of the processes and mechanisms leading to changes in soil water repellency. This helps to gain further insight into the behaviour of soil organic matter (SOM) and to identify ways to prevent or reduce the negative effects of soil water repellency (SWR). The first focus of this work is to determine whether chemical reactions are required for wetting repellent samples. This hypothesis was tested via the time and temperature dependence of sessile drop spreading on wettable and repellent samples. Additionally, diffuse reflectance infrared Fourier transform (DRIFT) spectroscopy was used to determine whether various drying regimes cause changes in the relative abundance of hydrophobic and hydrophilic functional groups in the outer layer of soil particles and whether these changes can be correlated with water content and the degree of SWR. Finally, by artificially altering the pH in dried samples through the application of acidic and alkaline reagents in a gaseous state, the influence of pH alone on the degree of SWR was investigated separately from the influence of changes in moisture status. The investigation of the two locations, Buch and Tiergarten, each exceptionally different in the nature of their respective wetting properties, leads to new insights into the variety of appearances of SWR.
The temperature, water content and pH dependency of SWR at the two contrasting sites resulted in a hypothetical model of the nature of repellency for each site, which provides an explanation for most of the observations made in this and earlier studies: At the Tiergarten site, wetting characteristics are most likely determined by the micelle-like arrangement of amphiphiles, which depends on the concentration of water-soluble amphiphilic substances, pH and ionic strength in the soil solution. At low pH and high ionic strength, repulsion forces between hydrophilic charged groups are minimized, allowing their aggregation with outward-oriented hydrophobic molecule moieties. At high pH and low ionic strength, higher repulsion forces between hydrophilic functional groups lead to an aggregation of hydrophobic groups during drying, which results in a layer with outward-oriented hydrophilic moieties on the soil organic matter surface, leading to enhanced wettability. For samples from the Buch site, chemical reactions are necessary for the wetting process. The strong dependence of SWR on water content indicates that hydrolysis-condensation reactions are the controlling mechanisms. Since acid-catalyzed hydrolysis is an equilibrium reaction dependent on water content, an excess of water favours hydrolysis, leading to an increasing number of hydrophilic functional groups. In contrast, water deficiency favours condensation reactions, leading to a reduction of hydrophilic functional groups and thus a reduction of wettability. The results of the present investigation and their comparison with earlier investigations clearly show that SWR is subject to numerous antagonistically and synergistically interacting environmental factors.
The degree of influence which a single factor exerts on SWR is site-specific; e.g., it depends on the particular characteristics of the mineral constituents and the SOM, which are in turn shaped by climate, soil texture, topography, vegetation and the former and current use of the respective site.
Larvae of Cx. pipiens co-occurred with Cladocera, but the latter became established later. Biotope structure influenced the time of species occurrence, with ponds in reed-covered wetlands favouring crustacean development, while ponds in grassland biotopes favoured colonization by mosquito larvae. The mechanisms driving the negative effect of crustaceans on mosquito larvae were investigated within an experiment under artificial conditions. Crustacean communities were found to reduce both oviposition and larval development of Cx. pipiens. Crustacean communities with high taxa diversity, including both predatory and competing crustaceans, were more effective than crustacean communities dominated by single taxa. The presence of crustacean communities characterised by high taxa diversity increased the sensitivity of Cx. pipiens larvae towards Bti and prolonged the time to recolonization. In a final step, the combined approach, using Bti and crustaceans, was evaluated under field conditions. The joint application of Bti and crustaceans was found to reduce mosquito larval populations over the whole observation period, while the single application of Bti caused only a short-term reduction of mosquito larvae. The single application of crustaceans had no significant effect, because high abundances of previously established mosquito larvae impeded the propagation of crustaceans. With the combined treatment, mosquito larvae were reduced by the Bti application, and the crustaceans were hence able to proliferate without disturbance by interspecific competition. In conclusion, natural competitors were found to have a strong negative impact on mosquito larval populations. However, a time span of about two weeks has to be bridged before crustacean communities reach a level sufficient for mosquito control. The results of a combined approach, complementing the short-term effect of the biological insecticide Bti with the long-term effect of crustaceans, were promising.
Using natural competitors within an integrated control strategy could be an important tool for an effective, environmentally friendly and sustainable mosquito management.
The role of alternative resources for pollinators and aphid predators in agricultural landscapes
(2021)
The worldwide decline of insects is often associated with the loss of natural and semi-natural habitat caused by intensified land use. Many insects provide important ecosystem services to agriculture, such as pest control or pollination. To efficiently promote insects in the remaining semi-natural habitat, we need precise knowledge of their requirements for non-crop habitat. This thesis focuses on identifying the most important semi-natural habitats (forest edges, grasslands, and semi-open habitats) for pollinators and natural enemies of crop pests with respect to their food resource requirements. Special attention is given to floral resources and their spatio-temporal distribution in agricultural landscapes.
Floral resource maps may come closer to characterizing landscapes the way they are experienced by insects than classical habitat maps do. The performance of the two map types was compared in predicting wild bees and natural enemies that consume nectar and pollen, identifying habitats of special importance in the process. For wild bees, the influence of spatio-temporal floral resource availability was analysed, as well as the habitat preferences of specific groups of bees. Understanding the dietary needs of natural enemies of crop pests requires additional knowledge of prey use. To this end, ladybird gut contents were analysed by means of high-throughput sequencing to gain insight into aphid prey use.
Results showed that wild bees were predicted better by floral resource maps than by classical habitat maps. Forest edge area, as well as floral resources in forest edges, had positive effects on the abundance and diversity of rare bees and important crop pollinators. Similar patterns held for grassland diversity. Especially early floral resources seemed to have positive effects on wild bees. Crops and fruit trees produced a resource pulse in April that exceeded the floral resource availability in May and June tenfold. Most floral resources in forest edges appeared early in the season, with the highest floral density per area. Grasslands provided the lowest amount of floral resources but the highest diversity, which was evenly distributed over the season.
Despite the natural enemies' need for floral resources, classical habitat maps performed better than floral resource maps at predicting natural enemies of crop pests. Classical habitat maps revealed a positive effect of forest edge habitat on the abundance of pest enemies, which translated into improved aphid control. Results from the gut content analysis revealed high proportions of pest aphid species and nettle aphids, as well as a broader insight into the prey spectra obtained from ladybirds collected from sticky traps compared with individuals collected by hand. The aphid-specific primer designed for this purpose will be helpful for identifying aphid consumption by ladybirds in future studies.
The findings of this thesis show the potential of floral resource maps for understanding the interactions of wild bees with the landscape, but also indicate that natural enemies are limited by other resources. I would like to highlight the positive effects of forest edges on different groups of bees as well as on natural enemies and their performance in pest control.
Dualizing marked Petri nets results in tokens for transitions (t-tokens). A marked transition can strictly not be enabled, even if there are sufficient "enabling" tokens (p-tokens) on its input places. On the other hand, t-tokens can be moved by the firing of places. This permits flows of t-tokens which describe sequences of non-events. Their benefit to simulation is the possibility to model (and observe) causes and effects of non-events, e.g. when a component has broken down.
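To make the dualization concrete, the t-token flow described above can be sketched in a few lines of Python. This is an illustrative sketch, not the formalism from the thesis: the class name, method names, and the "broken pump" scenario are invented for the example, and the interplay with p-tokens that blocks marked transitions is deliberately omitted — only the dual firing rule (places fire and move t-tokens between transitions) is shown.

```python
# Sketch (assumed structure, not the author's definition): in a dualized
# Petri net, transitions carry tokens (t-tokens) and *places* fire,
# moving t-tokens from input transitions to output transitions.

class DualPetriNet:
    def __init__(self):
        self.t_tokens = {}       # transition name -> t-token count
        self.place_inputs = {}   # place name -> input transitions
        self.place_outputs = {}  # place name -> output transitions

    def add_transition(self, name, tokens=0):
        self.t_tokens[name] = tokens

    def add_place(self, name, inputs, outputs):
        self.place_inputs[name] = inputs
        self.place_outputs[name] = outputs

    def enabled(self, place):
        # A place is enabled when every input transition holds a t-token.
        return all(self.t_tokens[t] > 0 for t in self.place_inputs[place])

    def fire(self, place):
        # Firing a place moves one t-token from each input transition to
        # each output transition -- the dual of ordinary transition firing.
        if not self.enabled(place):
            raise ValueError(f"place {place!r} is not enabled")
        for t in self.place_inputs[place]:
            self.t_tokens[t] -= 1
        for t in self.place_outputs[place]:
            self.t_tokens[t] += 1


# Hypothetical "non-event" scenario: a t-token marking a failed pump
# propagates to mark the line stop it causes.
net = DualPetriNet()
net.add_transition("pump_failed", tokens=1)
net.add_transition("line_stopped")
net.add_place("propagate", inputs=["pump_failed"], outputs=["line_stopped"])

net.fire("propagate")
print(net.t_tokens)  # {'pump_failed': 0, 'line_stopped': 1}
```

The firing of the place traces a flow of t-tokens, i.e. a sequence of non-events: the pump's failure (a transition that cannot fire) causes the line stop, which is exactly the kind of cause-and-effect chain of non-events the paragraph above describes.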