This work addresses the challenge of calibrating multiple solid-state LIDAR systems. The study focuses on three solid-state LIDAR sensors that implement different hardware designs, leading to distinct scanning patterns for each system. Consequently, detecting corresponding points between the point clouds generated by these LIDAR systems, as required for calibration, is a complex task. To overcome this challenge, this paper proposes a method that involves several steps. First, the measurement data are preprocessed to enhance their quality. Next, features are extracted from the acquired point clouds using the Fast Point Feature Histogram method, which characterizes salient properties of the data. Finally, the extrinsic parameters are computed using the Fast Global Registration technique. The best set of parameters for the pipeline and the calibration success are evaluated using the normalized root mean square error. In a static real-world indoor scenario, a minimum root mean square error of 7 cm was achieved. Importantly, the paper demonstrates that the presented approach is suitable for online use, indicating its potential for real-time applications. By effectively calibrating the solid-state LIDAR systems and establishing point correspondences, this research contributes to the advancement of multi-LIDAR fusion and facilitates accurate perception and mapping in fields such as autonomous driving, robotics, and environmental monitoring.
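The evaluation step described above scores a candidate set of extrinsic parameters by the root mean square error over nearest-neighbor correspondences. The following is a minimal numpy/scipy sketch of that evaluation only, not the thesis's implementation; the helper names `apply_extrinsics` and `registration_rmse` are invented for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def apply_extrinsics(points, R, t):
    """Transform an Nx3 point cloud by rotation R (3x3) and translation t (3,)."""
    return points @ R.T + t

def registration_rmse(source, target, R, t):
    """RMSE over nearest-neighbor correspondences after applying (R, t)."""
    aligned = apply_extrinsics(source, R, t)
    dists, _ = cKDTree(target).query(aligned)
    return float(np.sqrt(np.mean(dists ** 2)))

# Synthetic check: a known rigid motion should be undone exactly.
rng = np.random.default_rng(0)
target = rng.uniform(-5.0, 5.0, size=(500, 3))
theta = np.deg2rad(10.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
t = np.array([0.30, -0.10, 0.05])
source = (target - t) @ R          # inverse of the motion (R, t)
rmse = registration_rmse(source, target, R, t)
```

With perfectly recovered extrinsics the RMSE drops to numerical noise; with real multi-LIDAR data it bottoms out at the sensor noise floor, e.g. the 7 cm reported above.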
Focusing on the triangulation of detective fiction, masculinity studies and disability studies, "Investigating the Disabled Detective – Disabled Masculinity and Masculine Disability in Contemporary Detective Fiction" shows that disability challenges common ideals of (hegemonic) masculinity as represented in detective fiction. After a theoretical introduction to the relevant focal points of the three research fields, the dissertation demonstrates that even the archetypal detectives Dupin and Holmes undermine certain nineteenth-century masculine ideals with their peculiarities. Shifting to contemporary detective fiction and adopting a literary disability studies perspective, the dissertation investigates how male detectives with a form of neurodiversity or a physical impairment negotiate their masculine identity in light of their disability in private and professional contexts. It argues that the occupation as a detective supports the disabled investigator in achieving ‘masculine disability’. Inverting the term ‘disabled masculinity’, which predominates in research, ‘masculine disability’ introduces a decisively gendered reading of neurodiversity and (acquired) physical impairment in contemporary detective fiction. The term implies that the disabled detective (re)negotiates his masculine identity by incorporating the disability into his professional investigations and accepting it as an important, yet not defining, characteristic of his (gender) identity. By applying this approach to five novels from contemporary British and American detective fiction, the dissertation demonstrates that masculinity and disability do not negate each other, as commonly assumed. Instead, it emphasises that disability allows the detective, as much as the reader, to rethink masculinity.
Empirical studies in software engineering use software repositories as data sources to understand software development. Repository data is used either to answer questions that guide decision-making in software development, or to provide tools that help with practical aspects of developers’ everyday work. Such studies belong to the field of Empirical Software Engineering (ESE), and more specifically to Mining Software Repositories (MSR). Studies working with repository data often focus on their results. Results are statements or tools, derived from the data, that help with practical aspects of software development. This thesis focuses on the methods and higher-order methods used to produce such results. In particular, we focus on incremental methods to scale the processing of repositories, declarative methods to compose heterogeneous analyses, and higher-order methods used to reason about threats to methods operating on repositories. We summarize this as technical and methodological improvements. We contribute these improvements to methods and higher-order methods in the context of MSR/ESE to produce future empirical results more effectively. We contribute the following improvements. We propose a method to improve the scalability of functions that abstract over repositories with high revision counts in a theoretically founded way. We use insights from abstract algebra and program incrementalization to define a core interface of higher-order functions that compute scalable static abstractions of a repository with many revisions. We evaluate the scalability of our method with benchmarks, comparing a prototype with available competitors in MSR/ESE. We propose a method to improve the definition of functions that abstract over a repository with a heterogeneous technology stack, by using concepts from declarative logic programming and combining them with ideas on megamodeling and linguistic architecture.
We reproduce existing ideas on declarative logic programming with languages close to Datalog, originating from architecture recovery, source-code querying, and static program analysis, and transfer them from the analysis of a homogeneous to a heterogeneous technology stack. We provide a proof of concept of this method in a case study. We propose a higher-order method to improve the disambiguation of threats to methods used in MSR/ESE. We focus on better disambiguating threats, operationalizing reasoning about them, and making the implications for a valid data-analysis methodology explicit by using simulations. We encourage researchers to accompany their work with ‘fake’ simulations of their MSR/ESE scenarios, to operationalize relevant insights about alternative plausible results, negative results, potential threats, and the data-analysis methodologies used. We show that such simulation-based testing contributes to the disambiguation of threats in published MSR/ESE research.
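The incremental idea above can be illustrated with a toy sketch (not the thesis's actual interface): instead of re-scanning every revision, an abstraction over the repository is maintained by folding per-commit deltas into a running state. The algebraic observation is that per-file line counts under element-wise addition form a monoid, so each new revision costs only the size of its delta. All names and data here are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class LineCountAbstraction:
    """Incrementally maintained {path: line count} over a revision history."""
    counts: dict = field(default_factory=dict)

    def apply_commit(self, delta):
        """Fold one commit's {path: lines added - lines removed} into the state."""
        for path, d in delta.items():
            self.counts[path] = self.counts.get(path, 0) + d
            if self.counts[path] == 0:      # file fully removed
                del self.counts[path]

# Invented three-commit history with per-file line deltas.
history = [
    {"a.py": +10},
    {"a.py": +5, "b.py": +20},
    {"a.py": -15},                          # a.py deleted in this commit
]

abstraction = LineCountAbstraction()
for commit in history:
    abstraction.apply_commit(commit)        # O(|delta|) per revision
```

After the fold, `abstraction.counts` holds only `b.py` with 20 lines; recomputing from scratch over all revisions would give the same answer, which is what makes the incremental formulation safe.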
In recent years, public interest in epidemiology and the mathematical modelling of disease spread has increased, mainly driven by the COVID-19 pandemic, which emphasized the urgent need for accurate and timely modelling of disease transmission. However, even before that, mathematical modelling had been used to describe the dynamics and spread of infectious diseases, which is vital for developing effective interventions and controls, e.g., for vaccination campaigns and social restrictions such as lockdowns. The forecasts and evaluations provided by these models influence political actions and shape the measures implemented to contain the virus.
This research contributes to the understanding and control of disease spread, specifically for Dengue fever and COVID-19, making use of mathematical models and various data-analysis techniques. The mathematical foundations of epidemiological modelling are presented, including ordinary differential equation (ODE) models, an original human-vector model for Dengue fever, and the standard SEIR model (with the optional inclusion of an equation for deceased persons), which is suited to describing COVID-19. Additionally, multi-compartment models, fractional diffusion models, partial differential equation (PDE) models, and integro-differential models are used to describe the spatial propagation of the diseases.
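A standard SEIR model of the kind mentioned above can be integrated numerically with very little code. The sketch below uses scipy's ODE solver; the parameter values (transmission rate, incubation and recovery periods) are purely illustrative, not fitted values from this work.

```python
import numpy as np
from scipy.integrate import solve_ivp

def seir(t, y, beta, sigma, gamma):
    """Right-hand side of the SEIR ODE system (no births/deaths)."""
    S, E, I, R = y
    N = S + E + I + R
    dS = -beta * S * I / N          # susceptibles become exposed
    dE = beta * S * I / N - sigma * E
    dI = sigma * E - gamma * I      # exposed become infectious, then recover
    dR = gamma * I
    return [dS, dE, dI, dR]

N = 1_000_000
y0 = [N - 10, 0, 10, 0]             # 10 initial infectious persons
# Illustrative parameters: beta=0.3/day, 5.2-day incubation, 10-day infectious period
sol = solve_ivp(seir, (0.0, 180.0), y0, args=(0.3, 1 / 5.2, 1 / 10),
                rtol=1e-8, atol=1e-8)
```

Because the system only moves mass between compartments, the total population stays constant over the run, which is a quick sanity check on any implementation.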
We make use of different optimization techniques, based on both Metropolis and Lagrangian methods, to adapt the models to medical data and estimate the relevant parameters, or to find optimal control strategies for containing diseases. Reasonable estimates for the unknown parameters are found, especially in the initial stages of a pandemic, when little to no information is available and the majority of the population has not yet come into contact with the disease. The longer a disease is present, the more complex the modelling becomes: additional factors (vaccination, different virus types, etc.) appear and reduce the estimation and prediction quality of the mathematical models.
While it is possible to create highly complex models with numerous equations and parameters, such an approach presents several challenges, including difficulties in comparing and evaluating data, an increased risk of overfitting, and reduced generalizability. Therefore, we also consider criteria for model selection based on fit and complexity, as well as the sensitivity of the model with respect to specific parameters. This also yields valuable information on which political interventions should be emphasized under possible variations of the parameter values.
Furthermore, the presented models, particularly the optimization using the Metropolis algorithm for parameter estimation, are compared with other established methods. The quality of the model computations, as well as the computational effort and applicability, play a role in this comparison. Additionally, the spatial integro-differential model is compared with an established agent-based model. Since the macroscopic results align very well, the computationally faster integro-differential model can be used as a proxy for the slower agent-based model, which is not amenable to traditional optimization, e.g., in order to find an apt control strategy.
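To make the Metropolis-based parameter estimation concrete, here is a deliberately small random-walk Metropolis sketch. It estimates a single rate parameter from synthetic Poisson-distributed daily case counts under a flat positive prior; the data, the Poisson likelihood, and all numbers are invented for illustration and are not the epidemiological setup of this work.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.poisson(12.0, size=200)          # synthetic "observed" daily counts

def log_post(lam):
    """Poisson log-likelihood up to an additive constant; flat prior on lam > 0."""
    if lam <= 0:
        return -np.inf
    return float(np.sum(data * np.log(lam) - lam))

samples, lam = [], 1.0                      # deliberately poor starting value
for _ in range(20_000):
    prop = lam + rng.normal(0.0, 0.5)       # symmetric random-walk proposal
    # Accept with probability min(1, posterior ratio)
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop
    samples.append(lam)

est = float(np.mean(samples[5_000:]))       # posterior mean after burn-in
```

The posterior mean lands near the sample mean of the data (about 12 here), and the spread of the retained samples doubles as an uncertainty estimate, which is the practical appeal of Metropolis over point optimizers in early pandemic stages.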
Artificial neural networks are a popular field of research in artificial intelligence. The increasing size and complexity of huge models entail certain problems. The lack of transparency of the inner workings of a neural network makes it difficult to choose efficient architectures for different tasks. It proves challenging to solve these problems, and with a lack of insightful representations of neural networks, this state of affairs becomes entrenched. With these difficulties in mind, a novel 3D visualization technique is introduced. Attributes of trained neural networks are estimated by utilizing established methods from the area of neural network optimization. Batch normalization is used with fine-tuning and feature extraction to estimate the importance of different parts of the neural network. A combination of the importance values with various methods such as edge bundling, ray tracing, 3D impostors, and a special transparency technique results in a 3D model representing a neural network. The validity of the extracted importance estimations is demonstrated and the potential of the developed visualization is explored.
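One established way to derive importance from batch normalization, as in the "network slimming" line of work, is to read the magnitude of the learned per-channel scale gamma as a proxy for how much a channel contributes. The sketch below illustrates only that general idea with invented gamma values; it is not this thesis's exact estimator.

```python
import numpy as np

def channel_importance(gamma):
    """Normalize |gamma| to [0, 1] so values can drive visual attributes
    such as transparency or edge thickness in a 3D rendering."""
    mag = np.abs(np.asarray(gamma, dtype=float))
    return mag / mag.max() if mag.max() > 0 else mag

# Invented per-channel batch-norm scales from one trained layer.
gamma = [0.02, 1.3, -0.9, 0.0, 0.45]
imp = channel_importance(gamma)
# Channels with importance near 0 (here indices 0 and 3) can be rendered
# nearly transparent; the dominant channel (index 1) gets full opacity.
```

Mapping such normalized scores onto opacity or geometry thickness is exactly the kind of attribute-to-visual binding a 3D network visualization needs.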
Leichte Sprache (LS, easy-to-read German) is a simplified variety of German. It is used to provide barrier-free texts for a broad spectrum of people, including low-literate individuals with learning difficulties, intellectual or developmental disabilities (IDD), and/or complex communication needs (CCN). In general, LS authors are proficient in standard German and do not belong to the aforementioned group of people. Our goal is to empower the latter to participate in written discourse themselves. This requires a special writing system whose linguistic support and ergonomic software design meet the target group’s specific needs. We present EasyTalk, a system based profoundly on natural language processing (NLP) for assistive writing in an extended variant of LS (ELS). EasyTalk provides users with a personal vocabulary underpinned with customizable communication symbols and supports them in writing at their individual level of proficiency through interactive user guidance. The system minimizes the grammatical knowledge needed to produce correct and coherent complex content by letting users formulate linguistic decisions intuitively. It provides easy dialogs for selecting options from a natural-language paraphrase generator, which offers context-sensitive suggestions for sentence components and correctly inflected word forms. In addition, EasyTalk reminds users to add text elements that enhance text comprehensibility in terms of audience design (e.g., time and place of an event) and improve text coherence (e.g., explicit connectors to express discourse relations). To tailor the system to the needs of the target group, the development of EasyTalk followed the principles of human-centered design (HCD). Accordingly, we matured the system in iterative development cycles, combined with purposeful evaluations of specific aspects conducted with expert groups from the fields of CCN, LS, and IT, as well as L2 learners of the German language.
In a final case study, members of the target audience tested the system in free writing sessions. The study confirmed that adults with IDD and/or CCN who have low reading, writing, and computer skills can write their own personal texts in ELS using EasyTalk. The positive feedback from all tests inspires future long-term studies with EasyTalk and further development of this prototypical system, such as the implementation of a so-called Schreibwerkstatt (writing workshop).
In a world where language defines the boundaries of one's understanding, the words of Austrian philosopher Ludwig Wittgenstein resonate profoundly. Wittgenstein's assertion that "Die Grenzen meiner Sprache bedeuten die Grenzen meiner Welt" (Wittgenstein 2016: v. 5.6) underscores the vital role of language in shaping our perceptions. Today, in a globalized and interconnected society, fluency in foreign languages is indispensable for individual success. Education must break down these linguistic barriers, and one promising approach is the integration of foreign languages into content subjects.
Teaching content subjects in a foreign language, a practice known as Content and Language Integrated Learning (CLIL), not only enhances language skills but also cultivates cognitive abilities and intercultural competence. This approach expands horizons and aligns with the core principles of European education (Leaton Gray, Scott & Mehisto 2018: 50). The Kultusministerkonferenz (KMK) recognizes the benefits of CLIL and encourages its implementation in German schools (cf. KMK 2013a).
With the rising popularity of CLIL, textbooks in foreign languages have become widely available, simplifying teaching. However, the appropriateness of the language used in these materials remains an unanswered question. If textbooks impose excessive linguistic demands, they may inadvertently limit students' development and contradict the goal of CLIL.
This thesis focuses on addressing this issue by systematically analyzing language requirements in CLIL teaching materials, emphasizing receptive and productive skills in various subjects based on the Common European Framework of Reference. The aim is to identify a sequence of subjects that facilitates students' language skill development throughout their school years. Such a sequence would enable teachers to harness the full potential of CLIL, fostering a bidirectional approach where content subjects facilitate language learning.
While research on CLIL is extensive, studies on language requirements for bilingual students are limited. This thesis seeks to bridge this gap by presenting findings for History, Geography, Biology, and Mathematics, allowing for a comprehensive understanding of language demands. This research endeavors to enrich the field of bilingual education and CLIL, ultimately benefiting the academic success of students in an interconnected world.
The trends of Industry 4.0 and further enhancements toward an ever-changing factory lead to more mobility and flexibility on the factory floor. With this greater need for mobility and flexibility, the requirements on wireless communication rise. A key requirement in this setting is the demand for wireless Ultra-Reliable Low-Latency Communication (URLLC). Example use cases are cooperative Automated Guided Vehicles (AGVs) and mobile robotics in general. Within this setting, this thesis provides insights regarding the whole network stack, with the focus always on industrial applications. Starting on the physical layer, extensive measurements from 2 GHz to 6 GHz on the factory floor are performed. The raw data is published and analyzed. Based on that data, an improved Saleh-Valenzuela (SV) model is provided. As ad-hoc networks are highly dependent on node mobility, the mobility of AGVs is modeled. Additionally, Nodal Encounter Patterns (NEPs) are recorded and analyzed, and a method to record NEPs is illustrated. From an application perspective, latency and reliability are key performance parameters. Thus, measurements of these two parameters in factory environments are performed using Wireless Local Area Network (WLAN) (IEEE 802.11n), private Long Term Evolution (pLTE) and 5G. These measurements showed auto-correlated latency values. Hence, a method to construct confidence intervals from auto-correlated data containing rare events is developed. Subsequently, four performance improvements for wireless networks on the factory floor are proposed. Of those optimizations, three cover ad-hoc networks, two deal with safety-relevant communication, one orchestrates the usage of two orthogonal networks, and one optimizes the usage of information within cellular networks.
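Why does auto-correlation matter for confidence intervals? An i.i.d. bootstrap destroys the temporal dependence in a latency trace and so understates the uncertainty. One common remedy is a moving-block bootstrap, which resamples contiguous blocks and thereby preserves short-range correlation. The sketch below illustrates that general technique on a synthetic AR(1) latency trace with rare spikes; the thesis's own interval construction may well differ, and all parameters are invented.

```python
import numpy as np

def block_bootstrap_ci(x, stat, block_len=50, n_boot=2000, alpha=0.05, seed=0):
    """Percentile CI for stat(x) via a moving-block bootstrap."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    reps = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        sample = np.concatenate([x[i:i + block_len] for i in starts])[:n]
        reps[b] = stat(sample)
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

# Synthetic AR(1) latency trace with occasional large spikes (rare events).
rng = np.random.default_rng(2)
lat = np.empty(5000)
lat[0] = 1.0
for t in range(1, 5000):
    lat[t] = 0.9 * lat[t - 1] + rng.normal(0.0, 0.1) \
             + (5.0 if rng.uniform() < 0.001 else 0.0)

# 95% CI for the 99th-percentile latency, the URLLC-relevant tail statistic.
lo, hi = block_bootstrap_ci(lat, lambda s: np.quantile(s, 0.99))
```

The block length trades off between preserving dependence (longer blocks) and resampling diversity (shorter blocks); in practice it should be matched to the measured correlation length of the trace.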
Finally, this thesis concludes with an outlook on open research questions. This includes questions remaining in the context of Industry 4.0, as well as those arising around 6G. Among the research topics of 6G, the two most relevant here concern the idea of a network of networks and overcoming best-effort IP.
With the increasing importance and urgency of climate change, companies are challenged, especially by younger generations, to contribute to sustainable development. However, existing corporate contributions have been criticized as insufficient, which may be caused in particular by a lack of employee engagement in corporate sustainability. In this context, gamification has been proposed, and increasingly investigated in recent years, as a promising, innovative tool to motivate sustainable employee behaviors in the workplace. However, there are few studies and applicable gamification solutions that address more than one specific sustainability issue and thus take a holistic perspective on sustainable behaviors in the workplace. Moreover, previous research lacks a comprehensive understanding of how different gamification elements elicit specific psychological effects, how these manifest in behavioral changes, and how these, in turn, cumulatively result in measurable corporate outcomes. The path from gamification as “input” to corporate sustainability as “output” thus remains unexplored.
This dissertation fills this gap by conceptualizing, designing, and evaluating a holistic gamified intervention that supports employees in various sustainable behaviors in their daily activities. The project uses a design science research approach that closely involves employees in the incremental development of the solution. As part of the iterative design process, this dissertation presents six studies to extend the theoretical understanding of gamification for sustainable employee behaviors. First, a comprehensive review of existing research on gamification for sustainable employee behavior is provided, analyzing gamification designs and results of previous studies and outlining an agenda for further research (Study 1). Theoretical foundations of research on gamification, serious games, and game-based learning (Study 2) and empirical design principles for gamification and persuasive systems (Study 3) are then systematically reviewed as a basis for the successful design of gamified applications. Subsequently, empirical studies explore employees’ motivations for sustainable behavior and illuminate their expectations for design features (Study 4), and identify contextual challenges and design dilemmas when implementing gamification in an organizational context (Study 5). Lastly, a quantitative field study (Study 6) explores how different gamification designs influence sustainable employee behavior and corporate sustainability in organizations. Based on the findings, this dissertation presents a comprehensive framework of gamification for sustainable employee behavior that incorporates design, individual behavior, and organizational perspectives. Finally, building on these insights, it provides practical recommendations for designing gamification to encourage sustainable employee behavior at work.
Counts of SARS-CoV-2-related deaths have been key numbers for justifying severe political, social and economic measures imposed by authorities world-wide. A particular focus thereby was the concomitant excess mortality (EM), i.e. fatalities above the expected all-cause mortality (AM). Recent studies, inter alia by the WHO, estimated the SARS-CoV-2-related EM in Germany between 2020 and 2021 to be as high as 200 000. In this study, we attempt to scrutinize these numbers by putting them into the context of German AM since the year 2000. We propose two straightforward, age-cohort-dependent models to estimate German AM for the ‘Corona pandemic’ years, as well as the corresponding flu seasons, from historic data. For Germany, we find an overall negative EM of about −18 500 persons for the year 2020, and a minor positive EM of about 7000 for 2021, revealing that officially reported EM counts are an exaggeration. In 2022, the EM count is about 41 200. Further, based on NAA-test-positive related death counts, we are able to estimate how many Germans have died due to rather than with COVID-19; an analysis not provided by the appropriate authority, the RKI. Through 2020 and 2021 combined, our ‘due to’ estimate is no more than 59 500. Varying NAA test strategies heavily obscured SARS-CoV-2-related EM, particularly within the second year of the proclaimed pandemic. We compensated for changes in test strategies by assuming that age-cohort-specific NAA-conditional mortality rates during the first pandemic year reflected SARS-CoV-2-characteristic constants.
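The excess-mortality accounting above reduces to: extrapolate each age cohort's expected all-cause deaths from historic years, subtract from the observed count, and sum over cohorts. The toy sketch below illustrates only that accounting with a simple per-cohort linear trend; the study's two models are more refined, and every number here is invented.

```python
import numpy as np

def expected_from_trend(years, deaths, target_year):
    """Linear least-squares extrapolation of one cohort's annual death counts."""
    slope, intercept = np.polyfit(years, deaths, 1)
    return slope * target_year + intercept

years = np.arange(2016, 2020)
historic = {                                   # deaths per cohort, invented numbers
    "60-79": np.array([300_000, 304_000, 309_000, 313_000]),
    "80+":   np.array([420_000, 428_000, 437_000, 445_000]),
}
observed_2020 = {"60-79": 315_000, "80+": 455_000}

# EM = observed minus trend-expected, summed over age cohorts.
em = sum(observed_2020[c] - expected_from_trend(years, d, 2020)
         for c, d in historic.items())
```

Note that with these invented inputs the cohorts partially cancel (one lands below its trend, the other above), which mirrors how aggregate EM can be small or even negative while individual cohorts deviate.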