Particularly in applications with intense thermal and corrosive loads, phosphates are increasingly used as so-called chemical binders for high-performance ceramics. However, the overall sequence of reactions during the binding mechanism following thermally induced curing, and thus the mode of action of phosphate binders, has not been conclusively established in the literature. In this work, fundamental binding mechanisms were characterised for an exemplary phosphate-bonded Al₂O₃-MgAl₂O₄ high-temperature ceramic composition incorporating various inorganic phosphates, building on an extensive structure-analytical test programme (solid-state NMR, RBA, SEM-EDX). Mechanical and physicochemical property investigations (STA, dilatometry, DMA, KBF) additionally revealed the influence of the phosphates used on the property development of the refractory ceramics with respect to setting behaviour, bending strength and thermal expansion, which were correlated with the structural changes. It was shown that, when phosphates are used, the binding mechanisms over the temperature range 20 °C ≤ T ≤ 1500 °C generally consist of two reaction sequences running in parallel; the phosphate phases developing within the ceramic body were evaluated quantitatively and qualitatively with respect to their binding effect. On the one hand, the formation of a strength-increasing binder network of mostly amorphous aluminium phosphates was identified and characterised. This binding-promoting, three-dimensional aluminium phosphate network builds up homogeneously and continuously with temperature via multiple cross-linking reactions during the initialisation and cross-linking phases. On the other hand, these reaction sequences are complemented by parallel structural transformations of non-actively binding phosphate species such as magnesium, calcium or zirconium phosphates, which merely represent thermal transformation reactions of the starting phosphates. Predominantly at T > 800 °C, the phosphate binder network undergoes solid-state reactions with MgAl₂O₄, forming and agglomerating magnesium orthophosphate sinter structures. The formation of these low-melting high-temperature phases leads to a partial breakdown of the binder network.
Modern agriculture is a dominant land use in Europe, although it has been associated with negative effects on biodiversity in agricultural landscapes. One species-rich insect group in agro-ecosystems is the Lepidoptera (moths and butterflies); however, the populations of a number of Lepidoptera species are currently declining. The aims of this thesis were to assess the amount and structure of field margins in agricultural landscapes, study the effects of realistic field margin input rates of agrochemicals (fertilizer and pesticides) on Lepidoptera, and provide information on moth pollination services.
In general, field margins are common semi-natural habitat elements in agro-ecosystems; however, data on the structure, size, and width of field margins are limited. An assessment in two German agricultural landscapes (4,000 ha each) demonstrated that many of the evaluated field margins were less than 3 m wide (Rhineland-Palatinate: 85% of margin length; Brandenburg: 45% of margin length). In Germany, risk mitigation measures (such as buffer zones) to reduce pesticide inputs to terrestrial non-crop habitats do not have to be established by farmers next to narrow field margins. Thus, narrow field margins receive inputs of agrochemicals, especially via overspray and spray drift. These field margins were used as a development habitat for caterpillars, but the mean abundance of caterpillars was 35-60% lower than in meadows. Caterpillars were sensitive to realistic field margin input rates of insecticide (the pyrethroid lambda-cyhalothrin) in a field experiment as well as in laboratory experiments. Moreover, 40% fewer Hadena bicruris eggs were observed on Silene latifolia plants treated with this insecticide compared with control plants, and the flowers of these insecticide-treated plants were less likely to be pollinated by moths. In addition, realistic field margin input rates of herbicides can also affect Lepidoptera. Ranunculus acris L. plants treated with sublethal rates of a sulfonylurea herbicide were used as host plants for Mamestra brassicae L. caterpillars, which resulted in significantly lower caterpillar weights, increased time to pupation, and increased overall development time compared with caterpillars feeding on control plants. These results might have been caused by a lower nutritional value of the herbicide-treated plants or increased concentrations of secondary metabolites involved in plant defense. Fertilizer applications slightly increased caterpillar abundance in the field experiment. However, fertilizers reduce plant diversity in the long term and thus, most likely, also reduce caterpillar diversity.
Moths such as Noctuidae and Sphingidae have been observed to act as pollinators for numerous plant species, including a number of Orchidaceae and Caryophyllaceae. Although in temperate agro-ecosystems moths are less likely to act as the main pollinators for crops, they can pollinate non-crop plants in semi-natural habitats. Currently, the role of moths as pollinators appears to be underestimated, and long-term research focusing on ecosystems is necessary to address temporal fluctuations in their abundance and community composition.
Lepidoptera represent a diverse organism group in agricultural landscapes and fulfill essential ecosystem services, such as pollination. To better protect moths and butterflies, agrochemical inputs to (narrow) field margin habitats should be reduced, for example via risk mitigation measures and agro-environmental schemes.
Amphibian populations are declining worldwide for multiple reasons such as habitat destruction and climate change. An example of an endangered European amphibian is the yellow-bellied toad Bombina variegata. Populations have been declining for decades, particularly at the northern and western range margin. One of the extant northern range centres is the Westerwald region in Rhineland-Palatinate, Germany. To implement informed conservation activities for this threatened species, knowledge of its life-history strategy is crucial. This study therefore focused on different developmental stages to test predictions of life-history theory. It addressed (1) developmental, (2) demographic and (3) genetic issues of Bombina variegata as a model organism: (1) Carry-over effects from larval environment to terrestrial stages and associated vulnerability to predators were investigated using mesocosm approaches, fitness tests and predation trials. (2) The dynamics and demography of B. variegata populations were studied by applying a capture-mark-recapture analysis and skeletochronology. The study was complemented by (3) an analysis of genetic diversity and structuring of B. variegata populations using 10 microsatellite loci. In order to reveal general patterns and characteristics among B. variegata populations, the study focused on three geographical scales: local (i.e. a former military training area), regional (i.e. the Westerwald region) and continental (i.e. the geographical range of B. variegata). The study revealed carry-over effects of larval environment on metamorph phenotype and behaviour, causing variation in fitness in the early terrestrial stage of B. variegata. Metamorph size and condition are crucial factors for survival, as small-sized individuals were particularly prone to predator attacks. Yellow-bellied toads show a remarkable fast-slow continuum of the life-history trait longevity. A population's position within this continuum may be determined by local environmental stochasticity, i.e. an extrinsic source of variation, and the efficiency of chemical antipredator protection, i.e. an intrinsic source of variation. Extreme longevity seems to be an exception in B. variegata. Senescence was absent in this study. Weather variability affected reproductive success and thus population dynamics. The dispersal potential was low, and short-term fragmentation of populations caused significant genetic differentiation at the local scale. Long-term isolation resulted in increased genetic distance at the regional scale. At the continental scale, populations inhabiting the marginal regions were deeply structured with reduced allelic richness. As a consequence of environmental changes, short-lived and isolated B. variegata populations at the range margin may face an increased risk of extinction. Conservation measures should thus improve the connectivity among local populations and reinforce annual reproductive success. Further research on the intraspecific variation in B. variegata skin toxins is required to reveal potential effects on palatability and thus longevity.
Animations can be used in instructional contexts to convey knowledge about subject matter that involves processes or sequences of events. Dynamic content can thus be depicted explicitly and does not have to be constructed mentally by the learner but only followed along in the animation. This should have a positive effect on knowledge acquisition. At the same time, animations, with their characteristic depiction of temporal sequences, place particular demands on the learner. The human information-processing system is subject to certain limitations with regard to the perception of speed: speeds that are too fast or too slow, for example, can hardly be perceived and accordingly cannot be processed cognitively. The resulting aim of this work was a systematic investigation of the effect of different presentation speeds on the perception and understanding of dynamic subject matter presented in an animation.
To answer the research questions of this work, four experimental studies were conducted. The pilot study aimed to evaluate both the learning material and the newly developed knowledge test. Study 1 investigated the influence of presentation speed on knowledge acquisition when learning with an interactive animation.
Studies 2 and 3 examined the influence of different orderings of speeds on knowledge acquisition. The goal was a systematic assessment of the perceptual and cognitive processing of dynamic information at two different speeds by means of eye tracking (Study 2) and repeated testing of knowledge acquisition between the individual learning phases (Study 3).
The results of the studies suggest that at slow speed, knowledge about events at a subordinate temporal level was acquired, and that the faster an animation was viewed, the larger the share of knowledge acquired at a superordinate temporal level (Study 1); however, the results do not permit unambiguous conclusions about the influence of speed on knowledge acquisition at different temporal hierarchy levels. With regard to how conducive different ways of sequencing the speeds are to learning, the results were likewise inconclusive. The analysis of the eye-tracking data suggests, however, that the order "slow - fast" accommodates the learners' processing conditions better than the order "fast - slow".
As the load-bearing column of the human body, the spine is exposed to high loads during many movements. Incorrect loading and overloading often cause permanent damage. It is therefore of interest to determine the loads occurring within the spine. A modern and reliable method for determining these loads is the construction of a computational model.
In the present work, a multibody system (MBS) model of the lumbar spine was created. With the help of the model, the forces and moments transmitted in all internal structures can be computed and the kinematics of a movement can be simulated. The basic structure of the model is formed by the bony structures, assumed to be rigid bodies, of the five lumbar vertebrae L1 to L5, the sacrum and the ilium, which were obtained by segmenting a CT data set of a cast of the vertebral surfaces of an average-sized European. The elastic elements of the spine were implemented in the model taking their physical properties into account. The modelling of the intervertebral discs was based on experimental measurements carried out specifically for this purpose. The characteristic force-deformation behaviour of the ligaments was taken from the literature.
In addition to the physical behaviour of a single ligament, the implementation in the computer model also accounts, via a weighting factor, for the interplay of all ligaments in the complex ligamentous apparatus. The facet joints were realised by contact modelling in the cartilage layers. Furthermore, a model of an implant system used for dynamic stabilisation of the lumbar spine was developed. The models were validated by comparison with data obtained in vitro. Besides the intact spine, degenerative damage to the intervertebral disc and its surgical treatment by nucleotomy and dynamic stabilisation were considered. The simulation results show very good agreement with the experimentally determined values. By applying the computer models, the effects of different surgical interventions, such as interlaminotomy, hemilaminectomy and laminectomy, on the different structures of the lumbar spine could be computed. A further field of application was the investigation of the instantaneous centre of rotation. In addition to determining the path of the centre of rotation for the intact spine, the effects of a degeneratively damaged and surgically treated intervertebral disc on the course of the instantaneous centre of rotation could be computed and simulated.
Social Business Documents: An Investigation of their Nature, Structure and Long-term Management
(2018)
Business documents contain valuable information. In order to comply with legal requirements, to serve as organisational knowledge and to mitigate risks, they need to be managed. However, changes in the technology with which documents are produced have introduced new kinds of documents and new ways of interacting with documents. In particular, Web 2.0 led to the development of Enterprise Collaboration Systems (ECS), which enable employees to use wiki, blog or forum applications for conducting their business. Part of the content produced in ECS can be called Social Business Documents (SBD). Compared to traditional digital documents, SBD differ in their nature and structure: they are, for example, less well structured and do not follow a strict lifecycle. These characteristics bring along new management challenges. However, the research literature currently lacks investigations of the characteristics of SBD, their peculiarities and their management.
This dissertation uses document theory and documentary practice as theoretical lenses to investigate the new challenges of the long-term management of SBD in ECS. Using an interpretative, exploratory, mixed-methods approach, the study comprises two major research parts. First, the nature and structure of Social Business Documents are addressed by analysing them within four different systems, using four different modelling techniques for each. The findings are used to develop general SBD information models, outlining the basic underlying components, structure, functions and included metadata, as well as a broad range of SBD characteristics. The second phase comprises a focus group, a case study including in-depth interviews, and a questionnaire, all conducted with industry representatives. The focus group revealed that the kind of SBD used for specific content and the actual place of storage differ between organisations, and that there are currently almost no management practices for SBD in place. The case study provided deep insights into general document management activities and investigated requirements, challenges and actions for managing SBD. Finally, the questionnaire consolidated and deepened the previous findings. It provides insights into the value of SBD, their current management practices, and the remaining management challenges and needs. Although all participating organisations store information worth managing in SBD, most do not address it with management activities, and many challenges remain.
Together, the investigations contribute to both practice and theory. The contribution to practice is summarised in a framework addressing the long-term management of Social Business Documents. The framework identifies and outlines the requirements for, the challenges of and the actions for SBD management, and indicates the dependencies between these aspects. Furthermore, the findings advance theory within documentary practice by discussing the extension of document types to include SBD. Existing problems are outlined along the definitions of records, and the new document characteristics that emerge with Social Business Documents are taken into account.
The first group that was revised within my study is Ochralea Clark, 1865 (Hazmi & Wagner 2010a). I have checked the type specimens of most species that were originally described in Ochralea, and there is no doubt that this genus is clearly distinct from Monolepta. Weise (1924) synonymised Galeruca nigripes (Olivier, 1808) with O. nigricornis Clark, 1865, and the valid name of the species is O. nigripes (Olivier, 1808). Of the ten species originally described in this genus, only this species remains valid, and O. pectoralis is a new synonym of O. nigripes. Additionally, Monolepta wangkliana Mohamedsaid, 2000 is very closely related to O. nigripes and needs to be transferred to Ochralea. The second genus whose revision has already been published is Arcastes Baly, 1865 (Hazmi & Wagner 2010b). I have checked the genitalic characters of A. biplagiata and most of the type specimens of the other Arcastes species. Arcastes biplagiata possesses a peculiar shape of the median lobe and asymmetrically arranged endophallic structures. These peculiar characters are very useful for delimiting this genus from the others. Therefore, only three valid species remain in Arcastes, while two new synonyms were found and four other species need to be transferred to other genera. When the genitalic characters of the type species of Arcastes sanguinea were checked, the median lobe as well as the spermatheca of this species showed strong differences from A. biplagiata. The species was redescribed and transferred to a monotypic new genus, Rubrarcastes Hazmi & Wagner, 2010c. The fourth genus that has already been revised is Neolepta Jacoby, 1884. It was originally described on the basis of only two species, N. biplagiata and N. fulvipennis. Jacoby did not designate a type species for the genus; Maulik (1936) later designated N. biplagiata. In his original description, Jacoby only commented that Neolepta is very close and similar to Monolepta Chevrolat, 1837 and Candezea Chapuis, 1879. Subsequent authors described a further eight species and transferred one species from Luperodes to it, bringing the total to eleven described species in Neolepta. I have checked the genitalic characters of the type species, N. biplagiata, and found that the median lobe is not incised apically and that strongly sclerotised ventral carinae with an apical hook close to the apex occur. Of all described species, only two are closely related to the generotype, N. sumatrensis (Jacoby, 1884) new combination and N. quadriplagiata Jacoby, 1886, which remain in this group after the revision. All other species need to be transferred to other genera, including the newly described Paraneolepta and Orthoneolepta. The last paper of this thesis presents the results on Monolepta Chevrolat, 1837. Revising the massive number of Monolepta species from the entire Oriental Region, with about 260 described species names, is a long-term project and not practicable within a PhD study. Thus, in this work I have focused on the species of Monolepta known from the Sundaland area. A comprehensive revision, including the study of the primary types of the described species, has never been done for Monolepta from this sub-region, while new species have continued to be described in the last decades (e.g. Mohamedsaid 1993, 1997, 1998, 1999, 2000a,b, 2001, 2002, 2005).
On the basis of the most recent species lists of Mohamedsaid (2001, 2004, 2005) and Kimoto (1990), the number of valid species described from this region is about 72. After my revision, only thirteen valid species can remain in Monolepta in the sense of the generotype M. bioculata (Wagner 2007), while seven species were found to be new synonyms, three had already been transferred to other genera, and a further 49 species need to be transferred to other genera.
Augmented reality (AR) applications typically extend the user's view of the real world with virtual objects.
In recent years, AR has gained increasing popularity and attention, which has led to improvements in the required technologies. AR has become available to almost everyone.
Researchers have made great progress towards the goal of believable AR, in which the real and virtual worlds are combined seamlessly.
They mainly focus on issues like tracking, display technologies and user interaction, and give little attention to visual and physical coherence when real and virtual objects are combined. For example, virtual objects should not only respond to the user's input; they should also interact with real objects. Generally, AR becomes more believable and realistic if virtual objects appear fixed or anchored in the real scene, appear indistinguishable from the real scene, and respond to any changes within it.
This thesis examines three challenges in the field of computer vision that must be addressed to meet the goal of a believable combined world in which virtual objects appear and behave like real objects.
Firstly, the thesis concentrates on the well-known tracking and registration problem. The tracking and registration challenge is discussed and an approach is presented to estimate the position and viewpoint of the user so that virtual objects appear fixed in the real world. Appearance-based line models, which keep only relevant edges for tracking purposes, enable absolute registration in the real world and provide robust tracking. On the one hand, there is no need to spend much time creating suitable models manually. On the other hand, the tracking can deal with changes within the object or the scene to be tracked. Experiments have shown that the use of appearance-based line models improves the robustness, accuracy and re-initialization speed of the tracking process.
Secondly, the thesis deals with the subject of reconstructing the surface of a real environment and presents an algorithm to optimize an ongoing surface reconstruction. A complete 3D surface reconstruction of the target scene offers new possibilities for creating more realistic AR applications. Several interactions between real and virtual objects, such as collisions and occlusions, can be handled with physical correctness. Whereas previous methods focused on improving surface reconstructions offline after a capturing step, the presented method de-noises, extends and fills holes during the capturing process. Thus, users can explore an unknown environment without preparation tasks such as moving around and scanning the scene, and without having to deal with the underlying technology in advance. In experiments, the approach provided realistic results in which known surfaces were extended and holes were filled in plausibly for different surface types.
Finally, the thesis focuses on handling occlusions between the real and virtual worlds more realistically, by re-interpreting the occlusion challenge as an alpha matting problem. The presented method overcomes limitations in state-of-the-art methods by estimating a blending coefficient per pixel of the rendered virtual scene, instead of calculating only their visibility. In several experiments and comparisons with other methods, occlusion handling through alpha matting worked robustly and overcame limitations of low-cost sensor data; it also outperformed previous work in terms of quality, realism and practical applicability.
The method can deal with noisy depth data and yields realistic results in regions where foreground and background are not strictly separable (e.g. caused by fuzzy objects or motion blur).
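To illustrate the basic idea behind alpha-based occlusion handling, the following minimal sketch (a hypothetical illustration, not the thesis's implementation) composites a rendered virtual frame over the camera image using a per-pixel blending coefficient instead of a binary visibility mask; the array names and shapes are assumptions.

```python
import numpy as np

def composite_with_alpha(camera_rgb: np.ndarray,
                         virtual_rgb: np.ndarray,
                         alpha: np.ndarray) -> np.ndarray:
    """Blend a rendered virtual frame over the camera image.

    camera_rgb, virtual_rgb: HxWx3 float arrays in [0, 1]
    alpha: HxW per-pixel blending coefficients in [0, 1], e.g. estimated by
           an alpha-matting step rather than a hard depth-based visibility test.
    """
    a = alpha[..., None]                      # broadcast to HxWx1
    return a * virtual_rgb + (1.0 - a) * camera_rgb
```

Because alpha varies continuously, fuzzy object boundaries and motion-blurred regions receive partial contributions from both worlds instead of a hard, flickering cut.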
Mathematical models of species dispersal and the resilience of metapopulations against habitat loss
(2021)
Habitat loss and fragmentation due to climate and land-use change are among the biggest threats to biodiversity, as the survival of species relies on suitable habitat area and the possibility to disperse between different patches of habitat. To predict and mitigate the effects of habitat loss, a better understanding of species dispersal is needed. Graph theory provides powerful tools to model metapopulations in changing landscapes with the help of habitat networks, where nodes represent habitat patches and links indicate the possible dispersal pathways between patches.
This thesis adapts tools from graph theory and optimisation to study species dispersal on habitat networks as well as the structure of habitat networks and the effects of habitat loss. In chapter 1, I give an introduction to the thesis and the different topics it covers. Chapter 2 then gives a brief summary of the tools used.
In chapter 3, I present our model of possible range shifts for a generic species. Based on a graph-based dispersal model for a generic aquatic invertebrate with a terrestrial life stage, we developed an optimisation model that describes dispersal directed towards predefined habitat patches and yields a minimum time until these patches are colonised, given the landscape structure and the species' dispersal capabilities. We created a time-expanded network based on the original habitat network and solved a mixed integer program to obtain the minimum colonisation time. The results provide maximum possible range shifts and can be used to estimate how fast newly formed habitat patches can be colonised. Although specific to this simulation model, the general idea of deriving such an optimisation surrogate can in principle be adapted to other simulation models.
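As a rough illustration of how a minimum colonisation time can be encoded on a time-expanded network, one plausible formulation (a sketch under assumed notation, not the model from the thesis) uses binary variables x_{v,t} indicating whether patch v is colonised by time step t, a set S of initially occupied patches, the neighbourhood N(v) of patches reachable from v within one dispersal step, and a target patch p:

```latex
\min \sum_{t=1}^{T} \bigl(1 - x_{p,t}\bigr)
\quad \text{s.t.} \quad
x_{v,0} = 1 \ (v \in S), \qquad x_{v,0} = 0 \ (v \notin S),
\qquad
x_{v,t} \le x_{v,t-1} + \sum_{u \in N(v)} x_{u,t-1},
\qquad
x_{v,t} \ge x_{v,t-1},
\qquad
x_{v,t} \in \{0,1\}.
```

Since x_{p,·} is monotone, the objective counts the time steps during which p is not yet colonised, so minimising it yields the earliest colonisation time that the propagation constraints allow within the horizon T.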
Next, in chapter 4, I present our model to evaluate the robustness of metapopulations. Based on a variety of habitat networks and different generic species characterised by their dispersal traits and habitat demands, we modeled the permanent loss of habitat patches and subsequent metapopulation dynamics. The results show that species with short dispersal ranges and high local-extinction risks are particularly vulnerable to the loss of habitat across all types of networks. On this basis, we then investigated how well different graph-theoretic metrics of habitat networks can serve as indicators of metapopulation robustness against habitat loss. We identified the clustering coefficient of a network as the only good proxy for metapopulation robustness across all types of species, networks, and habitat loss scenarios.
Finally, in chapter 5, I utilise the results obtained in chapter 4 to identify the areas in a network that should be improved by restoration to maximise metapopulation robustness under limited resources. More specifically, we exploit our finding that a network's clustering coefficient is a good indicator of metapopulation robustness and develop two heuristics, a Greedy algorithm and a derived Lazy Greedy algorithm, that aim at maximising the clustering coefficient of a network. Both algorithms can be applied to any network and are not specific to habitat networks.
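A minimal sketch of the plain Greedy variant is given below, assuming the goal is to add a limited number of restoration links that raise the network's average clustering coefficient; the thesis's Lazy Greedy refinement and any habitat-specific candidate restrictions are not reproduced here.

```python
import itertools
import networkx as nx

def greedy_add_edges(G: nx.Graph, budget: int) -> nx.Graph:
    """Greedily add 'budget' edges, each time choosing the edge whose
    addition yields the largest gain in average clustering coefficient."""
    H = G.copy()
    for _ in range(budget):
        base = nx.average_clustering(H)
        best_edge, best_gain = None, 0.0
        for u, v in itertools.combinations(H.nodes, 2):
            if H.has_edge(u, v):
                continue
            H.add_edge(u, v)                    # tentatively add the edge
            gain = nx.average_clustering(H) - base
            H.remove_edge(u, v)
            if gain > best_gain:
                best_edge, best_gain = (u, v), gain
        if best_edge is None:                   # no edge improves the objective
            break
        H.add_edge(*best_edge)
    return H

# Example: improve a small random "habitat" network with 3 extra links
G = nx.gnp_random_graph(20, 0.15, seed=1)
H = greedy_add_edges(G, budget=3)
print(nx.average_clustering(G), nx.average_clustering(H))
```

The full re-evaluation of all candidate edges in every round is what the lazy variant avoids by caching marginal gains.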
In chapter 6, I summarize the main findings of this thesis, discuss their limitations and give an outlook on future research topics.
Overall this thesis develops frameworks to study the behaviour of habitat networks and introduces mathematical tools to ecology and thus narrows the gap between mathematics and ecology. While all models in this thesis were developed with a focus on aquatic invertebrates, they can easily be adapted to other metapopulations.
We are living in a world where environmental crises are coming to a head. To curb the aggravation of these problems, a socio-ecological transformation within society is needed, going along with changes in human behavior. How to encourage such behavior changes at the individual level is the core issue of this dissertation. It takes a closer look at the role of individuals as consumers whose purchase decisions have a more or less harmful impact on the environment. Using the example of plastic pollution, it takes up a current environmental problem and focuses on an understudied behavioral response to this problem, namely reduction behavior. More concretely, this dissertation examines which psychological factors can encourage the mitigation of plastic packaging consumption. Plastic packaging accounts for the largest share of current plastic production and is associated with products of daily relevance. Despite growing awareness of plastic pollution in society, behavioral responses do not follow accordingly and plastic consumption is still very high. As habits are often a pitfall when implementing more resource-saving behavior, this dissertation further examines whether periods of discontinuity can open a ‘window of opportunity’ to break old habits and facilitate behavior change. Four manuscripts approach this matter from the general to the specific. Starting with a literature review, a summary of 187 studies addresses the topic of plastic pollution and human behavior from a societal-scientific perspective. Based on this, a cross-sectional study (N = 648) examines the determinants of plastic-free behavior intentions in the private sphere and the public sphere by structural equation modeling. Two experimental studies in a pre-post design build upon this by integrating the determinants into intervention studies. In addition, it was evaluated whether an intervention presented during Lent (N = 140) or during the action month ‘Plastic Free July’ (N = 366) can create a ‘window of opportunity’ to mitigate plastic packaging consumption. The literature review emphasized the need for research on behavioral solutions to reduce plastic consumption. The empirical results revealed moral and control beliefs to be the main determinants of reduction behavior. Furthermore, the time point of an intervention influenced the likelihood of trying out the new behavior. The studies gave first evidence that a ‘window of opportunity’ can facilitate change towards pro-environmental behavior within the application field of plastic consumption. Theoretical and practical implications of creating the right opportunity for individuals to contribute to a socio-ecological transformation are finally discussed.
The effects of cognitive-behavioural interventions in patients with multiple somatoform symptoms are only of medium size and thus clearly below the effect sizes reported in psychotherapy outcome research. So far, however, no clearly replicable patient-side predictors have been identified that account for the success or failure of cognitive-behavioural outpatient therapy for somatoform complaints. In a longitudinal design, 78 patients (with at least two somatoform bodily complaints) who took part in an outpatient group intervention were examined regarding the value of symptom intensity and number, sociodemographic variables, comorbid mental disorders, and illness- and therapy-related attitudes and behaviours for predicting short-term therapy outcome. Bivariate analyses showed a significant association of the number of symptoms, anxiety, dysfunctional cognitions and the utilisation of medical resources assessed at the beginning of treatment with therapy outcome. Age, gender, educational level and the presence of a comorbid anxiety disorder or depressive episode were not associated with therapy outcome. In multiple regression analyses, however, the significant associations could only be confirmed for the number of symptoms and, with reservations, for anxiety. The results are discussed against the background of the empirical state of research with regard to their practical relevance for differential therapy indication.
Scientific and public interest in epidemiology and mathematical modelling of disease spread has increased significantly due to the current COVID-19 pandemic. Political action is influenced by forecasts and evaluations of such models, and the whole of society is affected by the corresponding containment measures. But how are these models structured? Which methods can be used to apply them to the respective regions, based on real data sets? These questions are certainly not new.
Mathematical modelling in epidemiology using differential equations has been researched for quite some time and is mainly carried out by means of numerical computer simulations. These models are constantly being refined and adapted to the corresponding diseases. However, it should be noted that the more complex a model is, the more unknown parameters it includes; a meaningful data adaptation thus becomes very difficult. The goal of this thesis is to design applicable models using the examples of COVID-19 and dengue, to adapt them adequately to real data sets and thus to perform numerical simulations. For this purpose, the mathematical foundations are first presented and a theoretical outline of ordinary differential equations and optimization is provided. The parameter estimations are performed by means of adjoint functions. This procedure represents a combination of static and dynamic optimization. The objective function corresponds to a least-squares method with the L2 norm and depends on the parameters sought. This objective function is coupled to constraints in the form of ordinary differential equations and is numerically minimized using Pontryagin's maximum (minimum) principle and optimal control theory. In the case of dengue, due to the transmission path via mosquitoes, a model reduction of an SIRUV model to an SIR model with a time-dependent transmission rate is performed by means of time-scale separation. The SIRUV model includes uninfected (U) and infected (V) mosquito compartments in addition to the susceptible (S), infected (I) and recovered (R) human compartments known from the SIR model. The unknown parameters of the reduced SIR model are estimated using data sets from Colombo (Sri Lanka) and Jakarta (Indonesia). Based on this parameter estimation, the predictive power of the model is checked and evaluated. In the case of Jakarta, the model is additionally provided with a mobility component between the individual city districts, based on commuter data. The transmission rates of the SIR models also depend on meteorological data, as correlations between these and dengue outbreaks have been demonstrated in previous data analyses. For the modelling of COVID-19 we use several SEIRD models, which in comparison to the SIR model also take into account the latency period and the number of deaths via exposed (E) and deceased (D) compartments. Based on these models, a parameter estimation with adjoint functions is performed for Germany. This is possible because, since the beginning of the pandemic, the cumulative numbers of infected persons and deaths have been published daily by Johns Hopkins University and the Robert Koch Institute. Here, a SEIRD model with a time delay regarding the deaths proves to be particularly suitable. In the next step, this model is used to compare the parameter estimation via adjoint functions with a Metropolis algorithm; analytical effort, accuracy and calculation speed are taken into account. In all data fittings, one parameter is determined in each case to assess the estimated number of unreported cases.
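For illustration, a basic SEIRD system of the kind referred to above can be written down and integrated numerically as follows; the parameter values and initial conditions are purely illustrative assumptions, and the adjoint-based parameter estimation and the time-delayed deaths variant used in the thesis are not reproduced here.

```python
import numpy as np
from scipy.integrate import solve_ivp

def seird(t, y, beta, sigma, gamma, mu, N):
    """Right-hand side of a basic SEIRD model (constant rates, no time delay)."""
    S, E, I, R, D = y
    new_infections = beta * S * I / N
    dS = -new_infections
    dE = new_infections - sigma * E          # sigma = 1 / latency period
    dI = sigma * E - (gamma + mu) * I        # gamma = recovery rate, mu = death rate
    dR = gamma * I
    dD = mu * I
    return [dS, dE, dI, dR, dD]

N = 83e6                                     # population size (roughly Germany)
y0 = [N - 200, 150, 50, 0, 0]                # illustrative initial compartments
params = (0.3, 1 / 5.5, 1 / 10, 0.002, N)    # beta, sigma, gamma, mu, N (assumed)
sol = solve_ivp(seird, (0, 180), y0, args=params,
                t_eval=np.linspace(0, 180, 181))
print(sol.y[4, -1])                          # cumulative deaths after 180 days
```

In a fitting setting, the simulated cumulative infections and deaths would be compared with the reported time series inside a least-squares objective, whose gradient the adjoint approach provides.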
The aim of this doctoral thesis was to examine whether acrylate-based emulsion polymers can be used as novel photocatalysts or catalysts.
Owing to the composition and properties of emulsion polymers, it can be assumed that their use as catalysts enables a new type of chemical catalysis, combining the advantages of heterogeneous and homogeneous catalysis while minimising the respective disadvantages. During the practical work, the use of emulsion polymers as photocatalysts proved particularly promising.
The photocatalytically active molecules are to be attached covalently to or within the polymer strand. A first subgoal of this work was therefore to synthesise prototypical catalyst molecules carrying an acrylate substituent that can react in a radical polymerisation reaction. Ruthenium polypyridine complexes were chosen as photocatalysts, as they are suitable for both inter- and intramolecular photocatalytic hydrogen production from water. For organocatalytic purposes, an L-proline derivative was synthesised, which, however, was not tested for its polymerisability.
In a first step, the prototypical 2,2'-bipyridine ligands were synthesised. In the process, an improved synthesis method for 4-bromo-2,2'-bipyridine was developed. The functionalisation was ultimately achieved by a Horner-Wadsworth-Emmons reaction following a one-pot synthesis of 4-formyl-2,2'-bipyridine. The prototypical photocatalysts showed moderate success (TON: 37-136, 6 h, 10% H2O, 470 nm) with regard to photocatalytic hydrogen production, so the corresponding catalytic systems should be improved at this point.
The polymerisation reaction could be carried out for two intermolecular and two intramolecular photocatalysts. It was noticeable that the intermolecular photocatalysts polymerise better than the intramolecular ones, which is assumed to be related to the solubility of the substances in the monomer ethyl methacrylate.
The photocatalytically functionalised emulsion polymers showed a photocatalytic activity (TON: 9-101, 6 h, 10% H2O, 470 nm) similar to that of the respective starting materials themselves. Nevertheless, it was demonstrated that emulsion polymers can be used as photocatalysts, even though further work is needed to optimise the systems.
For software engineers, conceptually understanding the tools they are using in the context of their projects is a daily challenge and a prerequisite for complex tasks. Textual explanations and code examples serve as knowledge resources for understanding software languages and software technologies. This thesis describes research on integrating and interconnecting
existing knowledge resources, which can then be used to assist with understanding and comparing software languages and software technologies on a conceptual level. We consider the following broad research questions that we later refine: What knowledge resources can be systematically reused for recovering structured knowledge and how? What vocabulary already exists in literature that is used to express conceptual knowledge? How can we reuse the
online encyclopedia Wikipedia? How can we detect and report on instances of technology usage? How can we assure reproducibility as the central quality factor of any construction process for knowledge artifacts? As qualitative research, we describe methodologies to recover knowledge resources by i.) systematically studying literature, ii.) mining Wikipedia, iii.) mining available textual explanations and code examples of technology usage. The theoretical findings are backed by case studies. As research contributions, we have recovered i.) a reference semantics of vocabulary for describing software technology usage with an emphasis on software languages, ii.) an annotated corpus of Wikipedia articles on software languages, iii.) insights into technology usage on GitHub with regard to a catalog of pattern and iv.) megamodels of technology usage that are interconnected with existing textual explanations and code examples.
The formulation of the decoding problem for linear block codes as an integer program (IP) with a rather tight linear programming (LP) relaxation has made a central part of channel coding accessible for the theory and methods of mathematical optimization, especially integer programming, polyhedral combinatorics and also algorithmic graph theory, since the important class of turbo codes exhibits an inherent graphical structure. We present several novel models, algorithms and theoretical results for error-correction decoding based on mathematical optimization. Our contribution includes a partly combinatorial LP decoder for turbo codes, a fast branch-and-cut algorithm for maximum-likelihood (ML) decoding of arbitrary binary linear codes, a theoretical analysis of the LP decoder's performance for 3-dimensional turbo codes, compact IP models for various heuristic algorithms as well as ML decoding in combination with higher-order modulation, and, finally, first steps towards an implementation of the LP decoder in specialized hardware. The scientific contributions are presented in the form of seven revised reprints of papers that appeared in peer-reviewed international journals or conference proceedings. They are accompanied by an extensive introductory part that reviews the basics of mathematical optimization, coding theory, and the previous results on LP decoding that we rely on afterwards.
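For orientation, the generic ML decoding problem and its standard LP relaxation (a Feldman-style sketch under assumed notation; the thesis's turbo-code-specific models differ in detail) can be written as

```latex
% ML decoding: given log-likelihood ratios \lambda_i and parity-check matrix H,
\min_{x \in \{0,1\}^n} \ \sum_{i=1}^{n} \lambda_i x_i
\quad \text{s.t.} \quad Hx \equiv 0 \pmod 2 .
% LP relaxation: replace each check j (neighbourhood N(j)) by its
% forbidden-set inequalities and relax the integrality constraints,
\sum_{i \in S} x_i - \sum_{i \in N(j)\setminus S} x_i \le |S| - 1
\quad \text{for all } S \subseteq N(j),\ |S| \text{ odd},
\qquad 0 \le x_i \le 1 .
```

If the LP optimum is integral, it is provably the ML codeword; fractional optima are what branch-and-cut or combinatorial post-processing must then resolve.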
We consider variational discretization of three different optimal control problems.
The first is a parabolic optimal control problem governed by space-time measure controls. This problem has a favourable sparsity structure, which motivates our aim of achieving maximal sparsity on the discrete level. Due to the measures on the right-hand side of the partial differential equation, we consider a very weak solution theory for the state equation and need an embedding into the continuous functions for the pairings to make sense. Furthermore, we employ Fenchel duality to formulate the predual problem and give results on the solution theory of both the predual and the primal problem. Later on, the duality is also helpful for the derivation of algorithms, since the predual problem can be differentiated twice, so that we can apply a semismooth Newton method. We then retrieve the optimal control by duality relations.
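A generic model problem of this type (a sketch under assumed notation; the precise functional-analytic setting in the thesis may differ) is the sparse measure-control problem

```latex
\min_{u \in \mathcal{M}(\Omega_T)} \ \frac{1}{2}\,\| y - y_d \|_{L^2(\Omega_T)}^2
  + \alpha\,\| u \|_{\mathcal{M}(\Omega_T)}
\quad \text{s.t.} \quad
\partial_t y - \Delta y = u \ \text{in } \Omega_T = \Omega \times (0,T), \qquad
y = 0 \ \text{on } \partial\Omega \times (0,T), \qquad y(0) = 0 ,
```

where M(Ω_T) denotes the space of regular Borel measures on the space-time cylinder; the measure-norm penalty is what induces the sparsity of optimal controls mentioned above.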
For the state discretization we use a Petrov-Galerkin method employing piecewise constant states and piecewise linear, continuous test functions in time. For the space discretization we choose piecewise linear, continuous functions. As a result the controls are composed of Dirac measures in space-time, centered at points on the discrete space-time grid. We prove that the optimal discrete states and controls converge strongly in L^q and weakly-* in the space of measures, respectively, to their smooth counterparts, where q ∈ (1, min{2, 1+2/d}] and d denotes the spatial dimension. The variational discrete version of the state equation with the above choice of spaces yields a Crank-Nicolson time stepping scheme with half a Rannacher smoothing step.
Furthermore, we compare our approach to a full discretization of the corresponding control problem, precisely a discontinuous Galerkin method for the state discretization, where the discrete controls are piecewise constant in time and Dirac measures in space. Numerical experiments highlight the sparsity features of our discrete approach and verify the convergence results.
The second problem we analyze is a parabolic optimal control problem governed by bounded initial measure controls. Here, the cost functional consists of a tracking term corresponding to the observation of the state at final time. Instead of a regularization term for the control in the cost functional, we consider a bound on the measure norm of the initial control. As in the first problem we observe a sparsity structure, but here the control resides only in space at initial time, so we focus on the space discretization to achieve maximal sparsity of the control. Again, due to the initial measure in the partial differential equation, we rely on a very weak solution theory of the state equation.
We employ a dG(0) approximation of the state equation, i.e. we choose piecewise linear and continuous functions in space, which are piecewise constant in time, for our ansatz and test space. Then, the variational discretization of the problem together with the optimality conditions induces maximal discrete sparsity of the initial control, i.e. Dirac measures in space. We present numerical experiments to illustrate our approach and to investigate the sparsity structure.
As the third problem we choose an elliptic optimal control problem governed by functions of bounded variation (BV) in one space dimension. The cost functional consists of a tracking term for the state and a BV seminorm in terms of the derivative of the control. We derive a sparsity structure for the derivative of the BV control. Additionally, we utilize the mixed formulation for the state equation.
A variational discretization approach with piecewise constant discretization of the state and piecewise linear and continuous discretization of the adjoint state yields that the derivative of the control is a sum of Dirac measures. Consequently the control is a piecewise constant function. Under a structural assumption we even get that the number of jumps of the control is finite. We prove error estimates for the variational discretization approach in combination with the mixed formulation of the state equation and confirm our findings in numerical experiments that display the convergence rate.
In summary, we confirm the use of variational discretization for optimal control problems involving measures with inherent sparsity. We are able to preserve the sparsity on the discrete level without discretizing the control variable.
The increase of plastic particles (< 5 mm) in the environment is a global problem that correlates directly with the increasing production quantity and variety. Through direct input (primary) or through the degradation of macroplastics (secondary), particles enter the environmental compartments water and/or soil via conventional material transport paths. The research and development work on the sustainable removal of microplastic particles (inert organic chemical stressors, IOCS) from wastewater is based on the construction of polymer inclusion compounds. IOCS are a group of organic chemical molecules which are highly persistent upon entering the ecosystem and whose degradation is limited.
Following the principle of Cloud Point Technology, a novel separation technique has been developed which induces particle growth in microplastics and, through the increase in volume, allows easier separation from the water with state-of-the-art processes. The concept of Herbort and Schuhen for the sustainable removal of microplastics is based on a three-step synthesis. This concept was further optimized as part of the research and adapted to the criteria of resource efficiency and profitability. The fundamental research is premised on the hypothesis that short-range van der Waals forces and localized hydrophobic interactions between the precursors and/or the material and the IOCS to be bound can induce fixation through the formation of an inclusion compound with particle growth. Through the addition of silicon-based, ecotoxicologically irrelevant coagulation and inclusion units, it is possible to initiate molecular self-organization with the hydrophobic stressors in a water-induced aggregation process. This results in adhesive particle growth around the polymer particles and between particles. Subsequently, the polymer extract can be separated from aquatic media by simple and cost-effective filtration processes (e.g. sand trap, grease trap), owing to the up to 10,000 times larger volume of the microplastic agglomerates.
Water scarcity is already an omnipresent problem in many parts of the world, especially in sub-Saharan Africa. The dry years 2018 and 2019 showed that water resources are finite in Germany as well. Projections and predictions for the next decades indicate that renewal rates of existing water resources will decline due to the growing influence of climate change, while water extraction rates will increase due to population growth. It is therefore important to find alternative and sustainable methods to make optimal use of the water resources currently available. For this reason, the reuse of treated wastewater for irrigation and recharge purposes has become one focus of scientific research in this field. However, it must be taken into account that wastewater contains so-called micropollutants, i.e., substances of anthropogenic origin. These are, e.g., pharmaceuticals, pesticides and industrial chemicals which enter the wastewater, but also metabolites that are formed in the human body from pharmaceuticals or personal care products. Through the treatment in wastewater treatment plants (WWTPs) as well as through chemical, biological and physical processes during soil passage in the course of water reuse, these micropollutants are transformed into new substances, known as transformation products (TPs), which further broaden the range of contaminants that can be detected within the whole water cycle.
Although the presence of human metabolites and environmental TPs in untreated and treated wastewater has been known for many years, they are rarely included in common routine analysis methods. Therefore, a first goal of this thesis was the development of an analysis method based on liquid chromatography - tandem mass spectrometry (LC-MS/MS) that covers a broad spectrum of frequently detected micropollutants including their known metabolites and TPs. The developed multi-residue analysis method contained a total of 80 precursor micropollutants and 74 metabolites and TPs of different substance classes. The method was validated for the analysis of different water matrices (WWTP influent and effluent, surface water and groundwater from a bank filtration site). The influence of the MS parameters on the quality of the analysis data was studied. Despite the high number of analytes, a sufficient number of data points per peak was maintained, ensuring high sensitivity and precision as well as good recovery for all matrices. The selection of the analytes proved to be relevant, as 95% of the selected micropollutants were detected in at least one sample. Several micropollutants were quantified that are not in the focus of other current multi-residue analysis methods (e.g. oxypurinol). The relevance of including metabolites and TPs was demonstrated by the frequent detection of, e.g., clopidogrel acid and valsartan acid at higher concentrations than their precursors, the latter even being detected in samples of bank filtrate water.
By integrating metabolites, which are produced in the body by biological processes, as well as biological and chemical TPs, the multi-residue analysis method is also suitable for elucidating degradation mechanisms in treatment systems for water reuse that, e.g., use a soil passage for further treatment. In the second part of the thesis, samples from two treatment systems based on natural processes were analysed: a pilot-scale above-ground sequential biofiltration system (SBF) and a full-scale soil aquifer treatment (SAT) site. In the SBF system mainly biological degradation was observed, which was clearly demonstrated by the detection of biological TPs after the treatment. The efficiency of the degradation was improved by an intermediate aeration, which created oxic conditions in the upper layer of the subsequent soil passage. In the SAT system a combination of biodegradation and sorption processes occurred. The different behaviour of some biodegradable micropollutants compared with the SBF system revealed the influence of redox conditions and the microbial community. An advantage of the SAT system over the SBF system was found in the sorption capacity of the natural soil. Especially positively charged micropollutants showed attenuation due to ionic interactions with negatively charged soil particles. Based on the physicochemical properties at ambient pH, the degree of removal in the investigated systems and the occurrence in the source water, a selection of process-based indicator substances was proposed.
Within the first two parts of this thesis, a micropollutant was frequently detected at elevated concentrations in WWTP effluents that had not previously been in the focus of environmental research: the antidiabetic drug sitagliptin (STG). STG showed low degradability in biological systems, and thus it was investigated to what extent chemical treatment by ozonation can ensure its attenuation. STG contains an aliphatic primary amine as the principal point of attack for the ozone molecule. There is only limited information about the behaviour of this functional group during ozonation; thus, STG served as an example for other micropollutants containing aliphatic primary amines. pH-dependent degradation kinetics were observed, owing to the protonation of the primary amine at lower pH values. At pH values in the range 6 - 8, which is typical for the environment and for WWTPs, STG showed second-order rate constants in the range of 10³ M⁻¹s⁻¹ and thus belongs to the group of readily degradable substances. However, complete degradation can only be expected at significantly higher pH values (> 9). The transformation of the primary amine moiety into a nitro group was observed as the major degradation mechanism for STG during ozonation. Other mechanisms involved the formation of a diketone, bond breakages and the formation of trifluoroacetic acid (TFA). Investigations at a pilot-scale ozonation plant using the effluent of the biological treatment of a municipal WWTP as source water confirmed the results of the laboratory studies: STG could not be removed completely even at high ozone doses, and the nitro compound was formed as the main TP and remained stable during further ozonation and subsequent biological treatment. It can therefore be assumed that under realistic conditions both a residual concentration of STG and the main TP formed, as well as other stable TPs such as TFA, can be detected in the effluents of a WWTP consisting of conventional biological treatment followed by ozonation and subsequent biological polishing steps.
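As an illustration of this pH dependence (a standard speciation argument, assuming that essentially only the neutral, deprotonated amine reacts with ozone; the exact kinetic evaluation in the thesis may differ), the apparent second-order rate constant can be written as

```latex
k_{\mathrm{app}}(\mathrm{pH}) \;=\;
\frac{k_{\mathrm{neutral}}}{1 + 10^{\,\mathrm{p}K_{\mathrm{a}} - \mathrm{pH}}},
```

which remains far below k_neutral as long as the pH lies well below the pK_a of the amine, consistent with the observation that complete degradation is only expected at pH > 9.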
Microbial pollution of surface waters poses substantial risks to public health, for example during recreational use. Microbial pollution was studied at selected sampling sites in the rivers Rhine, Moselle and Lahn (Germany) on the basis of commonly used fecal indicator organisms (FIO), indicating bacterial (Escherichia coli, intestinal enterococci) and viral (somatic coliphages) fecal contamination. In addition, blaCTX-M antibiotic resistance genes (ARG) were quantified at two sites in the river Lahn and were used as markers for tracking the spread of antibiotic resistance in the aquatic environment. The impact of changes in climate-related parameters on FIO was examined by studying monitoring results of contrasting flow conditions in the rivers Rhine and Moselle. Analyses at all studied river sites clearly indicate that high discharge and precipitation enhance the influx of FIO, ARG and thus potentially (antibiotic-resistant) pathogens into rivers. In contrast, a decrease in hygienic microbial pollution was observed under high solar irradiation and increasing water temperatures. Based on the identified key factors, multiple linear regression (MLR) models for five sites along a stretch of the river Lahn were established that allow a timely assessment of fecal indicator abundances. An interaction between abiotic and biotic factors (i.e. enhanced grazing pressure) considerably contributed to the formation of seasonal patterns among FIO abundances. This was enhanced during extraordinarily low flow conditions in rivers with pronounced trophic interactions, clearly hampering a transfer of model approaches between rivers of different biological and hydrological characteristics. Bacterial indicators were more strongly influenced by grazing pressure than phages. Hence, bacterial indicators alone do not sufficiently describe viral pollution in rivers. blaCTX-M genes were omnipresent in Lahn river water and corresponded to distribution patterns of FIO, indicating fecal sources. Agriculture and wastewater treatment plant effluents contributed to ARG loads, and participants in non-bathing water sports were found to be at risk of ingesting antibiotic-resistant bacteria (ARB) including ARG, bearing the risk of infection or colonization. The results of the present study highlight the need to be aware of such risks not only in designated bathing waters. ARG abundance at both riverine sampling sites could largely be explained by E. coli abundance and may thus also be incorporated into multiple regression models using E. coli-specific environmental predictors. It can be expected that the frequency of short-term microbial pollution events will increase over the next decades due to climate change. Several challenges were identified with regard to the implementation of early warning systems to protect the public from exposure to pathogens in rivers. Most importantly, the concept of the Bathing Water Directive (Directive 2006/7/EC) itself as well as the lack of harmonization in the regulatory framework at European Union (EU) level are major drawbacks and require future adjustments to reliably manage health risks related to microbial pollution in waters used in multifunctional ways.
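To illustrate the type of regression model referred to above, the following sketch fits a multiple linear regression of (log-transformed) E. coli abundance on a few environmental predictors; all variable names, values and coefficients are synthetic assumptions, not data or results from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 120                                     # e.g. weekly samples at one site
discharge = rng.gamma(2.0, 50.0, n)         # m^3/s   (synthetic values)
precipitation = rng.exponential(3.0, n)     # mm/d    (synthetic values)
water_temp = rng.normal(15.0, 5.0, n)       # deg C   (synthetic values)

# Synthetic response mimicking the reported tendencies: more discharge and
# rain increase, warmer water decreases the (log10) E. coli abundance.
log_ecoli = (2.0 + 0.004 * discharge + 0.05 * precipitation
             - 0.03 * water_temp + rng.normal(0.0, 0.3, n))

X = np.column_stack([discharge, precipitation, water_temp])
mlr = LinearRegression().fit(X, log_ecoli)
print(mlr.coef_, mlr.intercept_, mlr.score(X, log_ecoli))
```

In an operational early-warning setting, such a fitted model would be driven by current hydrological and meteorological observations to flag likely short-term pollution events.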
In this thesis we examined the question whether personality traits of early child care workers influence process quality in preschool.
Research has shown that in educational settings such as preschool, pedagogical quality affects children's developmental outcomes (e.g. NICHD, 2002; Peisner-Feinberg et al., 1999). A substantial part of pedagogical quality known to be vital in this respect is the interaction between teacher and children (e.g., Tietze, 2008). Results of prior classroom research indicate that the teachers' personality might be an important factor for good teacher-child interaction (Mayr, 2011). Thus, personality traits might play a vital role for the interaction in preschool. Therefore, the aims of this thesis were to a) identify pivotal personality traits of child care workers, b) assess ideal levels of the identified personality traits and c) examine the relationship between pivotal personality traits and process quality. To this end, we conducted two requirement analyses and a video study. The results of these studies showed that subject matter experts (parents, child care workers, lecturers) partly agreed as to which personality traits are pivotal for child care workers. Furthermore, the experts showed high consensus with regard to the minimum, ideal and maximum personality trait profiles. Moreover, child care workers whose profiles lay closer to the experts' ideal also showed higher process quality. In addition, regression analyses showed that the child care workers' levels of the Big Two (Communion and Agency) were significantly related to their process quality.