The goal of this PhD thesis is to investigate possibilities of using symbol elimination for solving problems over complex theories and to analyze the applicability of such uniform approaches in different areas, such as verification, knowledge representation and graph theory. In the thesis we propose an approach to symbol elimination in complex theories that follows the general idea of combining hierarchical reasoning with symbol elimination in standard theories, and we analyze how this general approach can be specialized and used in the different areas of application.
In the verification of parametric systems it is important to prove that certain safety properties hold. This can be done by showing that a property is an inductive invariant of the system, i.e. it holds in the initial states of the system and is preserved under updates of the system. Sometimes the property itself is not inductive, but a strengthened version of it is. In this thesis we propose a method for goal-directed invariant strengthening.
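For reference, the two standard proof obligations for inductive invariants can be stated as follows (a textbook formulation under assumed notation, not a formula taken from the thesis): Init denotes the initial-state condition, Tr the transition relation, and primed variables the post-state.

```latex
% Inductive invariant conditions for a transition system (Init, Tr):
% initiation and consecution. If Inv fails consecution, invariant
% strengthening searches for a stronger Inv* with Inv* => Inv that
% satisfies both obligations.
\mathit{Init}(\bar{x}) \models \mathit{Inv}(\bar{x}), \qquad
\mathit{Inv}(\bar{x}) \land \mathit{Tr}(\bar{x},\bar{x}\,') \models \mathit{Inv}(\bar{x}\,')
```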
In knowledge representation we often have to deal with huge ontologies. Combining two ontologies usually leads to new consequences, some of which may be false or undesired. We are interested in finding explanations for such unwanted consequences. For this we propose a method for computing interpolants in the description logics EL and EL⁺, based on a translation to the theory of semilattices with monotone operators and a certain form of interpolation in this theory.
In wireless network theory one often deals with classes of geometric graphs in which the existence or non-existence of an edge between two vertices depends on properties of their distances to other vertices. One possibility to prove properties of such graphs, or to analyze relations between the graph classes, is to prove or disprove that one graph class is contained in another. In this thesis we propose a method for checking inclusions between geometric graph classes.
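As a toy illustration of such an inclusion (and explicitly not the symbolic method proposed in the thesis), the following sketch randomly tests whether the relative neighborhood graph is contained in the Gabriel graph, two classic distance-defined geometric graph classes. A numeric test like this can only falsify an inclusion on sampled instances, which is exactly why a symbolic proof method is valuable.

```python
# Toy empirical check (not the thesis's symbolic method): test on random
# point sets whether every relative neighborhood graph (RNG) edge is also
# a Gabriel graph (GG) edge. Both classes are defined by distance-based
# edge predicates over the remaining points.
import itertools
import random

def dist2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def gabriel_edge(pts, u, v):
    # Edge iff no third point lies in the closed disk with diameter uv.
    return all(dist2(pts[u], w) + dist2(pts[v], w) > dist2(pts[u], pts[v])
               for i, w in enumerate(pts) if i not in (u, v))

def rng_edge(pts, u, v):
    # Edge iff no third point is closer to both endpoints than they are
    # to each other (the "lune" criterion).
    d = dist2(pts[u], pts[v])
    return all(max(dist2(pts[u], w), dist2(pts[v], w)) >= d
               for i, w in enumerate(pts) if i not in (u, v))

random.seed(0)
for _ in range(100):
    pts = [(random.random(), random.random()) for _ in range(15)]
    for u, v in itertools.combinations(range(len(pts)), 2):
        # Every RNG edge must also be a Gabriel edge (RNG is a subgraph of GG).
        if rng_edge(pts, u, v):
            assert gabriel_edge(pts, u, v)
print("no counterexample found: RNG edges contained in GG on all samples")
```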
In international business relationships, such as international railway operations, large amounts of data can be exchanged among the parties involved. For the exchange of such data, the parties expect a limited risk of being cheated, e.g., by being provided with fake data, as well as reasonable cost and a foreseeable benefit. As the exchanged data can be used to make critical business decisions, there is a high incentive for one party to manipulate the data in its favor. To prevent this type of manipulation, mechanisms exist to ensure the integrity and authenticity of the data. In combination with a fair exchange protocol, it can be ensured that the integrity and authenticity of this data are maintained even when it is exchanged with another party. At the same time, such a protocol ensures that the exchange of data only takes place in conjunction with the agreed compensation, such as a payment, and that the payment is only made if the integrity and authenticity of the data are ensured as previously agreed. However, in order to guarantee fairness, a fair exchange protocol must involve a trusted third party. To avoid fraud by a single centralized party acting as the trusted third party, current research proposes decentralizing it, e.g., by using a distributed-ledger-based fair exchange protocol. However, when assessing the fairness of such an exchange, state-of-the-art approaches neglect the costs arising for the parties conducting it. This can result in a violation of the outlined expectation of reasonable cost, especially when distributed ledgers are involved, as they are typically associated with non-negligible costs. Furthermore, the performance of typical distributed-ledger-based fair exchange protocols is limited, posing an obstacle to widespread adoption.
To overcome these challenges, in this thesis we introduce the foundation for a data exchange platform that allows for fully decentralized fair data exchange with reasonable cost and performance. As a theoretical foundation, we introduce the concept of cost fairness, which incorporates cost into the fairness assessment by requiring that a party following the fair exchange protocol never suffers any unilateral disadvantage. We prove that cost fairness cannot be achieved using typical public distributed ledgers but requires customized distributed ledger instances, which usually lack complete decentralization. However, we show that the highest unilateral costs are caused by a griefing attack.
To allow fair data exchanges to be conducted with reasonable cost and performance, we introduce FairSCE, a distributed-ledger-based fair exchange protocol that uses distributed ledger state channels and incorporates a mechanism to protect against griefing attacks, reducing the possible unilateral costs that have to be covered to a minimum. Based on our evaluation of FairSCE, the worst-case cost for data exchange, even in the presence of malicious parties, is known, which allows an estimate of the possible benefit and, thus, a preliminary estimate of economic utility. Furthermore, to allow for an unambiguous assessment of the correctness of the transferred data while still allowing sensitive parts of the data to be masked, we introduce an approach for hashing hierarchically structured data, which can be used to ensure the integrity and authenticity of the transferred data.
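A minimal sketch of how hashing hierarchically structured data with maskable parts might look is given below. This assumes a Merkle-tree-style construction for illustration; the names h, tree_hash, and MaskedHash are hypothetical, and the thesis's actual scheme may differ.

```python
# Minimal Merkle-style sketch of hashing hierarchically structured data
# (an illustrative assumption, not the exact construction from the thesis).
# A node is either a leaf value (bytes) or a dict of named children.
# Masking a subtree means revealing only its hash, so the recipient can
# still verify the root hash without seeing the masked content.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

class MaskedHash(bytes):
    """A subtree replaced by its hash; content hidden, root still checkable."""

def tree_hash(node) -> bytes:
    if isinstance(node, MaskedHash):     # masked subtree: use stored hash
        return bytes(node)
    if isinstance(node, bytes):          # leaf: hash the raw field value
        return h(b"leaf:" + node)
    acc = b"node:"                       # inner node with named children
    for key in sorted(node):             # canonical order for determinism
        acc += key.encode() + tree_hash(node[key])
    return h(acc)

doc = {"route": b"Koblenz to Mainz", "price": b"420 EUR", "cargo": b"confidential"}
root = tree_hash(doc)

# The sender masks the sensitive field; the receiver recomputes the root
# from the masked document and checks it against the agreed root hash.
masked = dict(doc, cargo=MaskedHash(tree_hash(doc["cargo"])))
assert tree_hash(masked) == root
print("root verified with 'cargo' masked:", root.hex()[:16], "...")
```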
Degenerative changes in the spine, as well as back pain, can be considered a common ailment. Incorrect loading of the lumbar spine structures is often considered one of the factors that can accelerate degenerative processes and lead to back pain. One such degenerative change is the occurrence of spinal stenosis following spondylolisthesis. Surgical treatment of spinal stenosis mainly focuses on decompressing the spinal canal, with or without additional fusion through dorsal spondylodesis. There are differing opinions on whether fusion along with decompression provides potential benefits to patients or represents overtreatment. Both conventional therapies and surgical methods aim to restore a “healthy” (or at least pain-free) distribution of load. Surprisingly little is known about the interindividual variability of load distribution in “healthy” lumbar spines. Since medical imaging does not provide information on internal forces, computer simulation of individual patients could be a tool to gain a new set of decision criteria for these cases. Its advantage lies in calculating the internal load distribution, which is not feasible in in-vivo studies, as measuring internal forces in living subjects is ethically and, in part, technically infeasible. In the present research, a forward-dynamic approach is used to calculate the load distribution in multi-body models of individual lumbar spines. The work is structured into three parts: (I) load distribution is quantified depending on the individual curvature of the lumbar spine; (II) confidence intervals of the instantaneous center of rotation over time are determined, with which the motion behavior of healthy lumbar spines can be described; (III) lastly, the effects of decompression surgeries on the load distribution of lumbar spines are determined.
As part of this thesis, the biodegradable polymers polylactic acid (PLA) and polyhydroxybutyrate (PHB), produced from renewable raw materials, were coated with hydrogenated amorphous carbon layers (a-C:H) at different deposition angles and with various thicknesses. Like conventional polymers, biopolymers often have surface properties unsuitable for industrial purposes, e.g. low hardness. For some applications, it is therefore necessary and advantageous to modify the surface properties of biopolymers while retaining the main properties of the substrate material. A suitable surface modification is the deposition of thin a-C:H layers, whose properties depend essentially on the ratio of sp²- to sp³-hybridized carbon atoms and on the hydrogen content. In the present work, the sp²/sp³ ratio was to be controlled by varying the coating geometry: since coatings deposited at 0°, directly in front of the plasma source, contain a higher percentage of sp³, and indirectly coated samples (180°) a higher amount of sp², this work shows that it is possible to control the sp²/sp³ ratio. For this purpose, the samples were placed in front of the plasma source at angles of 0, 30, 60, 90, 120, 150 and 180° and coated for 2.5, 5.0, 7.5 and 10.0 minutes; at 0°, these times correspond to layer thicknesses of 25, 50, 75 and 100 nm. All a-C:H layers were deposited by radio-frequency plasma-enhanced chemical vapor deposition with acetylene as the carbon and hydrogen source, after the samples had been pretreated with an oxygen plasma for 10 minutes. Following the O₂ treatment and the a-C:H deposition, the surfaces were examined using macroscopic and microscopic measurement methods and the data analyzed. The surface morphology was recorded using scanning electron microscopy and atomic force microscopy, which also yielded data on layer stability and surface roughness. Contact angle (CA) measurements were used to determine not only the wettability but also the contact angle hysteresis, by pumping the drop volume up and down. By measuring the CA with different liquids and comparing the results, the surface free energy (SFE) and its polar and disperse components were determined. Changes in barrier properties were verified by water vapor transmission rate (WVTR) tests. The chemical analysis of the surface was carried out by Fourier transform infrared spectroscopy with specular reflection on the one hand, and by synchrotron-based techniques such as near-edge X-ray absorption fine structure and X-ray photoelectron spectroscopy on the other. Analysis of the surfaces after the O₂ treatment, which was initially assumed to serve only to clean and activate the surface for the a-C:H coating, revealed changes more drastic than originally assumed. For example, if PLA is treated at 0° for 10 minutes, the roughness increases fivefold; as the angle increases, it decreases again until it returns to the initial value at 180°. The same can be observed, to a lesser extent, for PHB at 30°. For both polymers, the polar fraction of the SFE increases. In the WVTR, a decrease in permeability is observed for PLA and an increase above the initial value for PHB. The chemical surface analysis shows that the O₂ treatment has little effect on the surface bonds. Overall, this work shows that the O₂ treatment affects the surface properties and cannot be regarded exclusively as a cleaning and activation step.
With direct a-C:H coating (at 0°), layer failure due to internal stress can be observed for both PLA and PHB; this also occurs with PHB at 30°, but to a lesser extent. The permeability of the polymers is reduced by 47% with a five-minute coating, and the 10.0-minute layer retains this effect despite the appearance of cracks. The deposited a-C:H layers show a dominance of sp³ bonds for both polymer types under direct coating; this dominance decreases with increasing angle, and sp² bonds become dominant for indirect coatings. This result is similar for all coating thicknesses; only the angle at which the dominant bond type changes differs. It is thus shown that an angle-dependent coating makes it possible to control the sp²/sp³ ratio and thereby the surface properties.
How to begin? This short question addresses a problem that is anything but simple, especially regarding something as sophisticated and multilayered as musical theatre. However, scholars of this vast research area have largely neglected this question so far. This study analyses and compares the initial sections of late Victorian popular musical theatre and thus contributes to several fields of research: the analysis of initial sections of musical theatre in general, the analysis of the music of popular musical theatre in particular, and thereby operetta studies. The 1890s are an especially interesting time for popular musical theatre in London: the premiered works include the last collaborations of Gilbert and Sullivan as well as offshoots of Savoy opera; but the so-called ‘naughty nineties’ also saw the emergence of a new genre, musical comedy, which captured the late Victorian zeitgeist like no other. This new form of theatrical entertainment was carefully and consciously constructed and promoted as modern and fashionable, walking a fine line between respectability and mildly risqué excitement.
Because a deep understanding of the developments and new tendencies in popular musical theatre of the 1890s is crucial for interpreting differences as well as similarities, the analyses of the opening numbers are preceded by a detailed discussion of the relevant genres: comic opera, musical comedy, musical play and operetta. Since the producers of the analysed works wanted to distance themselves from former, supposedly old-fashioned traditions, this book considers influences not only from their British predecessors but also from Viennese operetta and French opéra bouffe.
Nanoparticles are sensitive and robust systems; they are particularly reactive due to their large surface area and have properties that the bulk material does not have. At the same time, the production of nanoparticles is challenging, because even with the same parameters and conditions, the results can vary slightly from run to run. To avoid this variability, this work aims to develop a continuous synthesis of nanoceria in a microjet reactor. The aim is to obtain monodisperse nanoparticles that can be used in biosensors.
This work focuses on two precipitation syntheses, with cerium carbonate and cerium hydroxide as intermediate steps, as well as a microemulsion synthesis for the production of nanoceria. The resulting cerium oxide nanoparticles are compared using different characterisation and application methods: they are characterised with respect to their size, stability, chemical composition and catalytic capabilities by electron microscopy, X-ray diffraction, Raman spectroscopy and photoelectron spectroscopy.
The biosensor systems for evaluating the nanoceria are designed to detect histamine and glucose, or the hydrogen peroxide that results from the oxidation of histamine and glucose. Hydrogen peroxide and glucose are detected by an electrochemical sensor, histamine by a colorimetric sensor system.
The present dissertation, entitled "Blickanalysen bei mentalen Rotationsaufgaben" (gaze analyses in mental rotation tasks), analyzes the visual processing involved in mental rotation tasks by means of eye-tracking technology in order to examine the underlying cognitive processes and strategies applied in solving these tasks. One concern of this work is to address the question of how individual differences, in particular gender-specific differences in gaze patterns, influence visual processing and performance in mental rotation tasks. To this end, three studies were conducted that not only comprise the identification of gaze patterns and the analysis of gender-related performance differences, but also examine the correlation between gaze behavior and performance. The results of this research offer insights into the mechanisms of visual and cognitive processing in mental rotation tasks and highlight the importance of eye tracking as a research instrument in cognitive psychology for gaining a comprehensive understanding of the factors influencing spatial reasoning and problem-solving strategies.
Classical music has played a central role in German music education since at least the second half of the 20th century. However, in more recent music pedagogical discourse, classical music remains a controversial topic. But what do music teachers think about classical music as a subject for music education? This topic has not yet been systematically researched in German-speaking music education.
In this qualitative-empirical study, eight semi-structured interviews were conducted to address the question of how music teachers perceive classical music in music education. The data was evaluated using the Grounded Theory Methodology. The theory developed from the study indicates that music teachers have varying objectives when using classical music in music education. However, they generally consider it unfamiliar to their students. To address this situation, music teachers develop various methods and strategies. These can be categorized into three approaches for dealing with the unfamiliarity of classical music: avoidance, reduction/relativization, and utilization.
The study's findings are contextualized within the framework of foreignness theory, music didactics, and transformational educational theory. This dissertation contributes to the field of music education in classical music, laying the groundwork for further theoretical, empirical, and didactic research.
This dissertation is devoted to a quantitative content analysis of the Disney Princess compilation, applying Laura Mulvey's theory of the male gaze as presented in Visual Pleasure and Narrative Cinema (1975) and Afterthoughts on 'Visual Pleasure and Narrative Cinema' inspired by King Vidor's Duel in the Sun (1946) (1981).
Using Patrick Rössler's approach to quantitative content analysis, the author examines the films of the Disney Princess compilation from 1937 to 2016, as well as the film Die Eiskönigin (Frozen, 2013), with regard to the portrayal of female- and male-read characters in terms of body proportions, degree of activity and extent of presence, as well as the gender of the film crew members.
Focusing on the triangulation of detective fiction, masculinity studies and disability studies, "Investigating the Disabled Detective – Disabled Masculinity and Masculine Disability in Contemporary Detective Fiction" shows that disability challenges common ideals of (hegemonic) masculinity as represented in detective fiction. After a theoretical introduction to the relevant focal points of the three research fields, the dissertation demonstrates that even the archetypal detectives Dupin and Holmes undermine certain nineteenth-century masculine ideals with their peculiarities. Shifting to contemporary detective fiction and adopting a literary disability studies perspective, the dissertation investigates how male detectives with a form of neurodiversity or a physical impairment negotiate their masculine identity in light of their disability in private and professional contexts. It argues that the occupation as a detective helps the disabled investigator achieve ‘masculine disability’. Inverting the term ‘disabled masculinity’, which predominates in research, ‘masculine disability’ introduces a decidedly gendered reading of neurodiversity and (acquired) physical impairment in contemporary detective fiction. The term implies that the disabled detective (re)negotiates his masculine identity by implementing the disability in his professional investigations and accepting it as an important, yet not defining, characteristic of his (gender) identity. By applying this approach to five novels from contemporary British and American detective fiction, the dissertation demonstrates that masculinity and disability do not negate each other, as commonly assumed. Instead, it emphasises that disability allows the detective, as much as the reader, to rethink masculinity.
Empirical studies in software engineering use software repositories as data sources to understand software development. Repository data is either used to answer questions that guide decision-making in software development, or to provide tools that help with practical aspects of developers’ everyday work. Such studies are classified into the field of Empirical Software Engineering (ESE), and more specifically into Mining Software Repositories (MSR). Studies working with repository data often focus on their results: statements or tools, derived from the data, that help with practical aspects of software development. This thesis focuses instead on the methods and higher-order methods used to produce such results. In particular, we focus on incremental methods to scale the processing of repositories, declarative methods to compose heterogeneous analyses, and higher-order methods used to reason about threats to methods operating on repositories. We summarize this as technical and methodological improvements, contributed in the context of MSR/ESE to produce future empirical results more effectively. We contribute the following improvements. We propose a method to improve, in a theoretically founded way, the scalability of functions that abstract over repositories with high revision counts. We use insights from abstract algebra and program incrementalization to define a core interface of higher-order functions that compute scalable static abstractions of a repository with many revisions, and we evaluate the scalability of our method with benchmarks, comparing a prototype with available competitors in MSR/ESE. We propose a method to improve the definition of functions that abstract over a repository with a heterogeneous technology stack, using concepts from declarative logic programming combined with ideas on megamodeling and linguistic architecture. We reproduce existing ideas on declarative logic programming with languages close to Datalog, coming from architecture recovery, source code querying, and static program analysis, and transfer them from the analysis of a homogeneous to a heterogeneous technology stack; we provide a proof of concept of this method in a case study. Finally, we propose a higher-order method to improve the disambiguation of threats to methods used in MSR/ESE. We focus on better disambiguating threats, operationalizing reasoning about them, and making their implications for a valid data analysis methodology explicit, using simulations. We encourage researchers to accompany their work with ‘fake’ simulations of their MSR/ESE scenarios, to operationalize relevant insights about alternative plausible results, negative results, potential threats, and the data analysis methodologies used. We show that this kind of simulation-based testing contributes to the disambiguation of threats in published MSR/ESE research.
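The algebraic idea behind the incremental abstraction can be sketched roughly as follows (a simplified reconstruction under assumed names such as LocSummary and IncrementalMiner, not the thesis's actual interface): if per-revision summaries form a monoid, partial results can be cached and new revisions folded in without reprocessing history.

```python
# Simplified sketch of the abstract-algebra idea behind incremental
# repository mining (an illustration, not the thesis's actual interface):
# if per-revision summaries form a monoid (associative combine with a
# neutral element), cached results over old revisions can be reused and
# new revisions folded in without re-walking the whole history.
from dataclasses import dataclass, field

@dataclass
class LocSummary:
    """Monoid summarizing lines added/removed across revisions."""
    added: int = 0
    removed: int = 0

    @staticmethod
    def empty() -> "LocSummary":                        # neutral element
        return LocSummary(0, 0)

    def combine(self, other: "LocSummary") -> "LocSummary":  # associative op
        return LocSummary(self.added + other.added,
                          self.removed + other.removed)

@dataclass
class IncrementalMiner:
    cached: LocSummary = field(default_factory=LocSummary.empty)

    def add_revision(self, diff_stats: tuple) -> LocSummary:
        # Fold one new revision into the cached summary in O(1),
        # instead of reprocessing all earlier revisions.
        added, removed = diff_stats
        self.cached = self.cached.combine(LocSummary(added, removed))
        return self.cached

miner = IncrementalMiner()
for rev in [(10, 2), (5, 5), (0, 7)]:   # hypothetical per-commit diff stats
    total = miner.add_revision(rev)
print(total)                            # LocSummary(added=15, removed=14)
```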
When people in Germany are affected by care dependency, the concept of care dependency (Pflegebedürftigkeit) laid down in § 14 SGB XI regulates access to benefits from the long-term care insurance. This concept of care dependency is normatively defined and has so far not been based on empirical studies from the fields of nursing and nursing science. Through its legal codification, it steers the conditions and structures under which care services are provided by professional nurses in Germany. Furthermore, it can be assumed that professional nurses, through their professional socialization, take a specialist perspective on the construct of care dependency that differs from the legal concept and does not structurally enter into the assessment of benefits. This gives rise to aspects of under- and overprovision of care.
This Ph.D. thesis pursues the aim of highlighting the challenges of the concept of care dependency in Germany by empirically capturing the aspects of care dependency perceived by professional nurses in the outpatient setting, with regard to their interaction with care-dependent people, and elaborating them into a theoretical concept. To address this research interest methodically, problem-centered interviews are conducted with outpatient professional nurses and, with reference to Herbert Blumer's symbolic interactionism, collected and analyzed under methodological and methodical aspects by means of a Grounded Theory approach following Kathy Charmaz as well as Juliet Corbin and Anselm Strauss. A reflexive-constructivist mode of research and writing is applied as a consequence of the author's epistemological-methodological foundation.
The resulting theory describes the challenges of care dependency from the perspective of the interviewed nurses. The core category comprises negotiation processes in the areas of closeness and distance, advocacy and the ceding of responsibility, as well as ethos and technocracy. All aspects show the extent to which the legal concept of care dependency leads to challenges within nursing work. With its results, the Ph.D. thesis contributes to situating and assessing the relevance of relational care work with regard to the prevailing framework conditions of care dependency, and shows how the interaction and communication of the actors involved are mutually conditioned by the claim to individualized care and the German outpatient care system. It thus provides a professionally and empirically grounded approach for assessing and addressing care dependency as experienced by nursing professionals.
In recent years, public interest in epidemiology and the mathematical modelling of disease spread has increased, mainly due to the COVID-19 pandemic, which has emphasized the urgent need for accurate and timely modelling of disease transmission. However, even prior to that, mathematical modelling had been used to describe the dynamics and spread of infectious diseases, which is vital for developing effective interventions and controls, e.g., for vaccination campaigns and social restrictions like lockdowns. The forecasts and evaluations provided by these models influence political action and shape the measures implemented to contain the virus.
This research contributes to the understanding and control of disease spread, specifically for Dengue fever and COVID-19, making use of mathematical models and various data analysis techniques. The mathematical foundations of epidemiological modelling are presented, along with several concepts for spatio-temporal diffusion such as ordinary differential equation (ODE) models, an original human-vector model for Dengue fever, and the standard SEIR model (with the optional inclusion of an equation for deceased persons), which is suited to the description of COVID-19. Additionally, multi-compartment models, fractional diffusion models, partial differential equation (PDE) models, and integro-differential models are used to describe the spatial propagation of the diseases.
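As a point of reference, a standard SEIR model extended with a compartment for deceased persons can be written down and integrated as in the following sketch; the parameter values are illustrative assumptions, not values from the thesis.

```python
# Illustrative SEIRD integration (all parameter values are assumptions).
# beta: transmission rate, sigma: 1/latency period, gamma: recovery rate,
# mu: disease-induced mortality rate, N: total population size.
from scipy.integrate import solve_ivp

def seird(t, y, beta, sigma, gamma, mu, N):
    # S: susceptible, E: exposed, I: infectious, R: recovered, D: deceased
    S, E, I, R, D = y
    new_infections = beta * S * I / N
    return [-new_infections,                 # dS/dt
            new_infections - sigma * E,      # dE/dt
            sigma * E - (gamma + mu) * I,    # dI/dt
            gamma * I,                       # dR/dt
            mu * I]                          # dD/dt

N = 1e6
y0 = [N - 10, 0, 10, 0, 0]                  # ten initially infectious people
params = (0.3, 1 / 5.5, 1 / 10, 0.001, N)   # beta, sigma, gamma, mu (assumed)
sol = solve_ivp(seird, (0, 180), y0, args=params)
print("peak infectious:", int(sol.y[2].max()))
```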
We make use of different optimization techniques, using both Metropolis and Lagrangian methods, to fit the models to medical data and estimate the relevant parameters, or to find optimal controls for containing diseases. Reasonable estimates for the unknown parameters are found, especially in the initial stages of pandemics, when little to no information is available and the majority of the population has not yet come into contact with the disease. The longer a disease is present, the more complex the modelling becomes: additional factors (vaccination, different types, etc.) appear and reduce the estimation and prediction quality of the mathematical models.
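A minimal sketch of the Metropolis idea for parameter estimation follows; the data and the single growth-rate parameter are synthetic assumptions for illustration, whereas the thesis's actual models and likelihoods are considerably richer.

```python
# Hedged sketch of random-walk Metropolis parameter estimation in the
# early phase of an outbreak (synthetic data, assumed noise model).
# We fit the growth rate r of I(t) = I0 * exp(r t) to noisy case counts
# via a Gaussian log-likelihood.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(20)
true_r, I0, noise = 0.15, 10.0, 20.0
data = I0 * np.exp(true_r * t) + rng.normal(0, noise, t.size)

def log_likelihood(r):
    model = I0 * np.exp(r * t)
    return -0.5 * np.sum((data - model) ** 2) / noise ** 2

r_current, ll_current = 0.05, log_likelihood(0.05)
samples = []
for _ in range(5000):
    r_prop = r_current + rng.normal(0, 0.01)   # random-walk proposal
    ll_prop = log_likelihood(r_prop)
    # Accept with probability min(1, exp(ll_prop - ll_current)).
    if np.log(rng.random()) < ll_prop - ll_current:
        r_current, ll_current = r_prop, ll_prop
    samples.append(r_current)

posterior = np.array(samples[1000:])           # drop burn-in
print(f"r estimate: {posterior.mean():.3f} +/- {posterior.std():.3f}")
```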
While it is possible to create highly complex models with numerous equations and parameters, such an approach presents several challenges, including difficulties in comparing and evaluating data, an increased risk of overfitting, and reduced generalizability. Therefore, we also consider criteria for model selection based on fit and complexity, as well as the sensitivity of the model with respect to specific parameters. This also yields valuable information on which political interventions deserve particular emphasis under possible variations of parameter values.
Furthermore, the presented models, particularly the optimization using the Metropolis algorithm for parameter estimation, are compared with other established methods. The quality of model calculation, as well as computational effort and applicability, play a role in this comparison. Additionally, the spatial integro-differential model is compared with an established agent-based model. Since the macroscopic results align very well, the computationally faster integro-differential model can now be used as a proxy for the slower and non-traditionally optimizable agent-based model, e.g., in order to find an apt control strategy.
Leichte Sprache (LS, easy-to-read German) is a simplified variety of German. It is used to provide barrier-free texts for a broad spectrum of people, including low-literate individuals with learning difficulties, intellectual or developmental disabilities (IDD) and/or complex communication needs (CCN). In general, LS authors are proficient in standard German and do not belong to the aforementioned group of people. Our goal is to empower the latter to participate in written discourse themselves. This requires a special writing system whose linguistic support and ergonomic software design meet the target group’s specific needs. We present EasyTalk, a system profoundly based on natural language processing (NLP) for assistive writing in an extended variant of LS (ELS). EasyTalk provides users with a personal vocabulary underpinned with customizable communication symbols and supports writing at their individual level of proficiency through interactive user guidance. The system minimizes the grammatical knowledge needed to produce correct and coherent complex contents by letting users formulate linguistic decisions intuitively: it provides easy dialogs for selecting options from a natural-language paraphrase generator, which offers context-sensitive suggestions for sentence components and correctly inflected word forms. In addition, EasyTalk reminds users to add text elements that enhance text comprehensibility in terms of audience design (e.g., time and place of an event) and improve text coherence (e.g., explicit connectors to express discourse relations). To tailor the system to the needs of the target group, the development of EasyTalk followed the principles of human-centered design (HCD). Accordingly, we matured the system in iterative development cycles, combined with purposeful evaluations of specific aspects conducted with expert groups from the fields of CCN, LS, and IT, as well as L2 learners of German. In a final case study, members of the target audience tested the system in free writing sessions. The study confirmed that adults with IDD and/or CCN who have low reading, writing, and computer skills can write their own personal texts in ELS using EasyTalk. The positive feedback from all tests motivates future long-term studies with EasyTalk and further development of this prototypical system, such as the implementation of a so-called Schreibwerkstatt (writing workshop).
In a world where language defines the boundaries of one's understanding, the words of the Austrian philosopher Ludwig Wittgenstein resonate profoundly. Wittgenstein's assertion that "Die Grenzen meiner Sprache bedeuten die Grenzen meiner Welt" ("the limits of my language mean the limits of my world"; Wittgenstein 2016: v. 5.6) underscores the vital role of language in shaping our perceptions. Today, in a globalized and interconnected society, fluency in foreign languages is indispensable for individual success. Education must break down these linguistic barriers, and one promising approach is the integration of foreign languages into content subjects.
Teaching content subjects in a foreign language, a practice known as Content and Language Integrated Learning (CLIL), not only enhances language skills but also cultivates cognitive abilities and intercultural competence. This approach expands horizons and aligns with the core principles of European education (Leaton Gray, Scott & Mehisto 2018: 50). The Kultusministerkonferenz (KMK) recognizes the benefits of CLIL and encourages its implementation in German schools (cf. KMK 2013a).
With the rising popularity of CLIL, textbooks in foreign languages have become widely available, simplifying teaching. However, the appropriateness of the language used in these materials remains an unanswered question. If textbooks impose excessive linguistic demands, they may inadvertently limit students' development and contradict the goal of CLIL.
This thesis focuses on addressing this issue by systematically analyzing language requirements in CLIL teaching materials, emphasizing receptive and productive skills in various subjects based on the Common European Framework of Reference. The aim is to identify a sequence of subjects that facilitates students' language skill development throughout their school years. Such a sequence would enable teachers to harness the full potential of CLIL, fostering a bidirectional approach where content subjects facilitate language learning.
While research on CLIL is extensive, studies on language requirements for bilingual students are limited. This thesis seeks to bridge this gap by presenting findings for History, Geography, Biology, and Mathematics, allowing for a comprehensive understanding of language demands. This research endeavors to enrich the field of bilingual education and CLIL, ultimately benefiting the academic success of students in an interconnected world.
The trends of Industry 4.0 and further enhancements toward an ever-changing factory lead to more mobility and flexibility on the factory floor. With this higher need for mobility and flexibility, the requirements on wireless communication rise. A key requirement in this setting is the demand for wireless Ultra-Reliable Low-Latency Communication (URLLC). Example use cases are cooperative Automated Guided Vehicles (AGVs) and mobile robotics in general. Working along this setting, this thesis provides insights regarding the whole network stack, with the focus always on industrial applications. Starting at the physical layer, extensive measurements from 2 GHz to 6 GHz on the factory floor are performed; the raw data is published and analyzed, and based on that data an improved Saleh-Valenzuela (SV) model is provided. As ad-hoc networks are highly dependent on node mobility, the mobility of AGVs is modeled. Additionally, Nodal Encounter Patterns (NEPs) are recorded and analyzed, and a method to record NEPs is illustrated. From an application perspective, latency and reliability are key performance parameters. Thus, measurements of these two parameters in factory environments are performed using Wireless Local Area Network (WLAN, IEEE 802.11n), private Long Term Evolution (pLTE) and 5G. These measurements showed auto-correlated latency values. Hence, a method to construct confidence intervals based on auto-correlated data containing rare events is developed. Subsequently, four performance improvements for wireless networks on the factory floor are proposed. Of these optimizations, three cover ad-hoc networks, two deal with safety-relevant communication, one orchestrates the usage of two orthogonal networks, and one optimizes the usage of information within cellular networks.
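As background for the confidence-interval problem (and explicitly not the method developed in this thesis), a common baseline for auto-correlated data is the moving-block bootstrap, sketched below on a synthetic AR(1) latency trace; note that this baseline does not address rare events, which is precisely the gap the thesis targets.

```python
# Background sketch (not the thesis's method): a moving-block bootstrap
# confidence interval for the mean of auto-correlated latency samples.
# Resampling whole blocks preserves the short-range correlation that an
# i.i.d. bootstrap would destroy.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic AR(1) latency trace in milliseconds (assumed data).
n, phi = 2000, 0.8
noise = rng.normal(0, 1, n)
lat = np.empty(n)
lat[0] = 5.0
for i in range(1, n):
    lat[i] = 5.0 + phi * (lat[i - 1] - 5.0) + noise[i]

def block_bootstrap_ci(x, block_len=50, n_boot=2000, alpha=0.05):
    n_blocks = len(x) // block_len
    # Random block start indices for every bootstrap replicate.
    starts = rng.integers(0, len(x) - block_len, size=(n_boot, n_blocks))
    means = np.array([
        np.concatenate([x[s:s + block_len] for s in row]).mean()
        for row in starts
    ])
    return np.quantile(means, [alpha / 2, 1 - alpha / 2])

lo, hi = block_bootstrap_ci(lat)
print(f"95% CI for mean latency: [{lo:.3f}, {hi:.3f}] ms")
```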
Finally, this thesis concludes with an outlook on open research questions, both in the context of Industry 4.0 and around 6G. Among the 6G research topics, the two most relevant concern the idea of a network of networks and overcoming best-effort IP.
Organic binder mixtures and process additives have been used in refractory materials for a long time due to their property-improving effect. Coal tar pitches in particular can contain thousands of chemical compounds, of which especially polycyclic aromatic hydrocarbons (PAHs) are known to be carcinogenic and mutagenic and thus pose a risk to both the environment and human health. However, despite intensive research, the exact structure of these carbon mixtures has still not been fully clarified. This is becoming an increasing problem, especially with regard to the more stringent legal requirements arising from REACH, the European Chemicals Regulation for the Registration, Evaluation, Authorisation and Restriction of Chemicals. Furthermore, knowledge of the structural and chemical composition is also of great importance for the optimal processing of the carbon mixtures into high-quality technical products. In the present work, an analytical strategy for the investigation of complex carbon mixtures containing PAHs is developed. Due to their complexity, a combination of different methods is used, including elemental analysis, solvent extraction, thermogravimetry, differential thermal analysis, Raman and infrared spectroscopy as well as high-resolution mass spectrometry. In addition, a procedure for the evaluation of mass spectrometric data based on multivariate statistical methods such as hierarchical cluster analysis and principal component analysis is developed. The application of the developed analytical strategy to various industrially used carbon-based binder mixtures allowed the elucidation of characteristic properties, including aromaticity, molecular mass distribution, degree of alkylation and elemental composition. It was also shown that combining high-resolution time-of-flight mass spectrometry with multivariate statistical data analysis is a fast and effective tool for the classification of complex binder mixtures and the identification of characteristic molecular structures. In addition, the analytical strategy was applied to manufactured refractory products. Despite the small amount of organic phase they contain, characteristic structural features of each sample could be identified and extracted, which enabled an unambiguous classification of the refractory products.
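The statistical core of such a classification workflow can be sketched as follows (synthetic data assumed, not measurements from the thesis): feature intensities are standardized, reduced by principal component analysis, and grouped by hierarchical cluster analysis.

```python
# Illustrative sketch of the multivariate workflow (synthetic data, not
# the thesis's measurements): standardize mass-spectrometric feature
# intensities, reduce dimensionality with PCA, and group samples by
# hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# 12 binder samples x 200 m/z features, two synthetic sample groups.
group_a = rng.normal(0.0, 1.0, (6, 200))
group_b = rng.normal(1.5, 1.0, (6, 200))
X = np.vstack([group_a, group_b])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
Z = linkage(scores, method="ward")        # hierarchical cluster analysis
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster labels:", labels)          # should separate the two groups
```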
The geometry of our space of intuition, Euclidean geometry, is elementary to a general mathematics education. For mathematics teachers, their subject-matter knowledge fundamentally constitutes the foundation of their teaching. As part of their professional knowledge, mathematics teachers should in principle possess subject-matter knowledge that, in relation to academic mathematics, meets the instructional demands of school mathematics.
The theory of the metric-normal Euclidean space developed within this dissertation is characterized by its dual perspective: the mathematical rigor of an axiomatic-deductive approach on the one hand, and the consideration of the subject-didactic demands placed on mathematics teachers on the other; in this it sets itself apart from existing theories.
With the increasing importance and urgency of climate change, companies are challenged to contribute to sustainable development, especially by younger generations. However, existing corporate contributions have been criticized as insufficient, which could be particularly caused by a lack of employee engagement in corporate sustainability. In this context, gamification has been proposed and increasingly investigated in recent years as a promising, innovative tool to motivate sustainable employee behaviors in the workplace. However, there are few studies and applicable gamification solutions that address more than one specific sustainability issue and thus take a holistic perspective on sustainable behaviors in the workplace. Moreover, previous research lacks a comprehensive understanding of how different gamification elements elicit specific psychological effects, how these manifest in behavioral changes, and how these, in turn, cumulatively result in measurable corporate outcomes. The path from gamification as "input" to corporate sustainability as "output" thus remains unexplored.
This dissertation fills this gap by conceptualizing, designing, and evaluating a holistic gamified intervention that supports employees in various sustainable behaviors in their daily activities. The project uses a design science research approach that closely involves employees in the incremental development of the solution. As part of the iterative design process, this dissertation presents six studies that extend the theoretical understanding of gamification for sustainable employee behaviors. First, a comprehensive review of existing research on gamification for sustainable employee behavior is provided, analyzing gamification designs and results of previous studies and outlining an agenda for further research (Study 1). Theoretical foundations of research on gamification, serious games, and game-based learning (Study 2) and empirical design principles for gamification and persuasive systems (Study 3) are then systematically reviewed as a basis for the successful design of gamified applications. Subsequently, empirical studies explore employees' motivations for sustainable behavior and illuminate their expectations for design features (Study 4), and identify contextual challenges and design dilemmas when implementing gamification in an organizational context (Study 5). Finally, a quantitative field study (Study 6) explores how different gamification designs influence sustainable employee behavior and corporate sustainability in organizations. Based on these findings, this dissertation presents a comprehensive framework of gamification for sustainable employee behavior that incorporates design, individual behavior, and organizational perspectives. Building on these insights, it closes with practical recommendations for designing gamification to encourage sustainable employee behavior at work.