Real-time operating systems for mixed-criticality systems
must support different types of software, such as
real-time applications and general purpose applications,
and, at the same time, must provide strong spatial and
temporal isolation between independent software components.
Therefore, state-of-the-art real-time operating systems
focus mainly on predictability and bounded worst-case behavior.
However, general purpose operating systems such as Linux
often feature more efficient---but less deterministic---mechanisms
that significantly improve the average execution time.
This thesis addresses the combination of these two contradictory
requirements and presents thread synchronization mechanisms
with efficient average-case behavior that do not sacrifice
predictability and worst-case behavior.
This thesis explores and evaluates the design space of fast paths
in the implementation of typical blocking synchronization
mechanisms, such as mutexes, condition variables, counting
semaphores, barriers, or message queues. The key technique here
is to avoid unnecessary system calls, as system calls have high
costs compared to other processor operations available in user
space, such as low-level atomic synchronization primitives.
In particular, the thesis explores futexes, the state-of-the-art
design for blocking synchronization mechanisms in Linux
that handles the uncontended case of thread synchronization
by using atomic operations in user space and calls into the
kernel only to suspend and wake up threads. The thesis also
proposes non-preemptive busy-waiting monitors that use an
efficient priority ceiling mechanism to prevent the lock-holder
preemption problem without system calls, together with
corresponding low-level kernel primitives to construct
efficient wait and notify operations.
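The futex fast path described above can be sketched as a three-state lock (0 = unlocked, 1 = locked, 2 = locked with waiters). The following Python model is illustrative only: Python lacks a user-space atomic compare-and-swap, so a guard lock stands in for the hardware CAS, a condition variable emulates the futex_wait/futex_wake kernel primitives, and the class name and fast-path counter are invented for this sketch.

```python
import threading

class FutexLikeMutex:
    """Three-state mutex: 0 = unlocked, 1 = locked, 2 = locked with waiters.

    Model of the futex protocol: _guard stands in for the hardware
    compare-and-swap, and the condition variable emulates the
    futex_wait/futex_wake kernel primitives."""

    def __init__(self):
        self._state = 0
        self._guard = threading.Lock()
        self._cond = threading.Condition(self._guard)
        self.fast_acquires = 0  # acquisitions that skipped the "kernel" path

    def acquire(self):
        with self._guard:
            if self._state == 0:          # fast path: uncontended CAS 0 -> 1
                self._state = 1
                self.fast_acquires += 1
                return
            while self._state != 0:       # slow path: announce waiter, sleep
                self._state = 2
                self._cond.wait()         # emulates futex_wait
            self._state = 2               # acquired; conservatively stay "contended"

    def release(self):
        with self._guard:
            had_waiters = (self._state == 2)
            self._state = 0
            if had_waiters:
                self._cond.notify()       # emulates futex_wake

# demo: two threads bump a shared counter under the mutex
m = FutexLikeMutex()
counter = 0

def worker():
    global counter
    for _ in range(10000):
        m.acquire()
        counter += 1
        m.release()

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

In a real futex-based lock the fast path involves no kernel interaction at all; only the wait and wake steps enter the kernel.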
The evaluation shows that the presented approaches
achieve average-case performance comparable
to state-of-the-art approaches in Linux.
At the same time, a worst-case timing analysis shows
that the approaches only need constant or bounded temporal
overheads at the operating system kernel level.
Exploiting these fast paths is a worthwhile approach
when designing systems that must not only fulfill
real-time requirements but also serve best-effort workloads.
Leaf litter breakdown is a fundamental process in aquatic ecosystems, being mainly mediated by decomposer-detritivore systems that are composed of microbial decomposers and leaf-shredding, detritivorous invertebrates. The ecological integrity of these systems can, however, be disturbed, amongst others, by chemical stressors. Fungicides might pose a particular risk as they can have negative effects on the involved microbial decomposers but may also affect shredders via both waterborne toxicity and their diet; the latter by toxic effects due to dietary exposure as a result of fungicides’ accumulation on leaf material and by negatively affecting fungal leaf decomposers, on which shredders’ nutrition heavily relies. The primary aim of this thesis was therefore to provide an in-depth assessment of the ecotoxicological implications of fungicides in a model decomposer-detritivore system using a tiered experimental approach to investigate (1) waterborne toxicity in a model shredder, i.e., Gammarus fossarum, (2) structural and functional implications in leaf-associated microbial communities, and (3) the relative importance of waterborne and diet-related effects for the model shredder.
Additionally, knowledge gaps were tackled that were related to potential differences in the ecotoxicological impact of inorganic (also authorized for organic farming in large parts of the world) and organic fungicides, the mixture toxicity of these substances, the field-relevance of their effects, and the appropriateness of current environmental risk assessment (ERA).
In the course of this thesis, major differences in the effects of inorganic and organic fungicides on the model decomposer-detritivore system were uncovered; e.g., the palatability of leaves for G. fossarum was increased by inorganic fungicides but deteriorated by organic substances. Furthermore, non-additive action of fungicides was observed, rendering mixture effects of these substances hardly predictable. While the relative importance of the waterborne and diet-related effect pathway for the model shredder seems to depend on the fungicide group and the exposure concentration, it was demonstrated that neither pathway can be ignored, due to additive action. Finally, it was shown that effects can be expected at field-relevant fungicide levels and that current ERA may provide insufficient protection for decomposer-detritivore systems. To safeguard aquatic ecosystem functioning, this thesis thus recommends including leaf-associated microbial communities and long-term feeding studies using detritus feeders in ERA testing schemes, and identifies several knowledge gaps that must be filled to develop further reasonable refinements for fungicide ERA.
Based on dual process models of information processing, the present research addressed how explicit disgust sensitivity is re-adapted according to implicit disgust sensitivity via self-perception of automatic behavioral cues. Contrary to preceding studies (Hofmann, Gschwendner, & Schmitt, 2009), which concluded that there was a "blind spot" for self- but not for observer perception of automatic behavioral cues, in the present research a re-adaptation process was found for self-perceivers and observers. In Study 1 (N = 75), the predictive validity of an indirect disgust sensitivity measure was tested with a double-dissociation strategy. Study 2 (N = 117) reinvestigated the hypothesis that self-perception of automatic behavioral cues, predicted by an indirect disgust sensitivity measure, leads to a re-adaptation of explicit disgust sensitivity measures. Using a different approach from Hofmann et al. (2009), the self-perception procedure was modified by (a) feeding back the behavior several times while a small number of cues had to be rated for each feedback condition, (b) using disgust sensitivity as a domain with clearly unequivocal cues of automatic behavior (facial expression, body movements) and describing these cues unambiguously, and (c) using a specific explicit disgust sensitivity measure in addition to a general explicit disgust sensitivity measure. In Study 3 (N = 130), the findings of Study 2 were replicated, and display rules and need for closure were additionally investigated as moderators of predictive validity and cue utilization. The moderator effects give hints that both displaying a disgusted facial expression and self-perception of one's own disgusted facial expression are subject to a self-serving bias, indicating that facial expression may not be an automatic behavior. Practical implications and implications for future research are discussed.
Replication of a multi-agent simulation environment to check for integrity and consistency
(2012)
In this Master's thesis, I first present a simulation used to study the behavior of agents that try to survive in a generated world and can choose between several possible actions. I then briefly discuss the underlying theoretical aspects. The main part of my thesis is my replication of a simulation that Andreas König implemented in Java in 2000 [Kö2000]. I present his work in strongly condensed form and then describe my own development.
In the final part of the thesis, I compare the results of my simulation with those of Andreas König and discuss the tools used (Java and NetLogo). I close with a conclusion that briefly summarizes my undertaking and reports what could be implemented, what did not work, and why.
Standards are widely used in computer science and the IT industry. Different organizations, such as the International Organization for Standardization (ISO), are involved in the development of computer-related standards. An important domain of standardization is the specification of data formats that enable the exchange of information between different applications. Such formats can be expressed in a variety of schema languages, thereby defining sets of conformant documents. Often the use of multiple schema languages is required due to their varying expressive power and different kinds of validation requirements. This also holds for the Common Cartridge specification, which is maintained by the IMS Global Learning Consortium. The specification defines valid zip packages that can be used to aggregate different learning objects. These learning objects are represented by a set of files which are part of the package and can be imported into a learning management system. The specification makes use of other specifications to constrain the contents of valid documents. Such documents are expressed in the eXtensible Markup Language and may contain references to other files that are also part of the package. The specification itself is a so-called domain profile. A domain profile allows the modification of one or more specifications to meet the needs of a particular community. Test rules can be used to determine a set of tasks in order to validate a concrete package. The execution is done by a test system which, as we will show, can be created automatically. Hence this method may apply to other package-based data formats that are defined as part of a specification.
This work examines the applicability of this generic test method to the data formats introduced by the so-called Virtual Company Dossier. These formats are used in processes related to public e-procurement. They allow the packaging of evidence needed to prove the fulfillment of criteria related to a public tender. The work first examines the requirements common to both specifications. This introduces a new view on the requirements at a higher level of abstraction. The identified requirements are then used to create different domain profiles, each capturing the requirements of a package-based data format. The process is normally guided by supporting tools that ease the capturing of a domain profile and the creation of test systems. These tools are adapted to support the new requirements. Furthermore, the generic test system is modified. This system is used as a basis when a concrete test system is created.
Finally, the author comes to a positive conclusion. Common requirements have been identified and captured. The involved systems have been adapted to allow the capturing of further types of requirements that were not supported before. Furthermore, the backgrounds of the two specifications differ considerably. This indicates that the use of domain profiles and generic test technologies may be suitable in a wide variety of other contexts.
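The package-validation idea can be sketched in a few lines of Python. The two hard-coded tasks below (manifest present, manifest well-formed) are stand-ins for the tasks a real test system would derive from a domain profile; the sample file names are invented for the demo, although Common Cartridge packages do place an imsmanifest.xml at the package root.

```python
import io
import xml.etree.ElementTree as ET
import zipfile

def validate_package(data: bytes) -> list:
    """Run two validation tasks against a zip-based package and return
    the failures. A real test system would derive its task list from
    the domain profile; these two tasks are hard-coded stand-ins."""
    errors = []
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        names = set(zf.namelist())
        # task 1: a manifest must exist at the package root
        if "imsmanifest.xml" not in names:
            return ["missing imsmanifest.xml"]
        # task 2: the manifest must be well-formed XML with a manifest root
        try:
            root = ET.fromstring(zf.read("imsmanifest.xml"))
            if not root.tag.endswith("manifest"):
                errors.append("unexpected root element: " + root.tag)
        except ET.ParseError as exc:
            errors.append("manifest not well-formed: %s" % exc)
    return errors

# build a minimal package in memory for demonstration
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("imsmanifest.xml", "<manifest><resources/></manifest>")
    zf.writestr("lesson1.html", "<html></html>")
errors = validate_package(buf.getvalue())
```

Generating such task lists automatically from a machine-readable domain profile is exactly the step that makes the test method generic.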
Planning routes for trucks with a trailer is a complex procedure. In order to simplify this process, a route is segmented into elementary components, which represent basic motions of the considered vehicle. These elementary components are called maneuvers and are composed of two parts. First, paths are constructed for certain reference points. Second, the vehicle is enclosed by a corridor during the execution of a maneuver. The paths of the vehicle have to take driveability into consideration: they must respect the kinematic constraints of the vehicle. The maneuver corridor can be used as a basis to guarantee collision-free motion planning, since no part of the vehicle leaves the corridor during the maneuver. There are different types of maneuvers; currently, the bending maneuver, the cusp maneuver and the straight-line maneuver can be distinguished. In addition, a maneuver can be created with two different construction methods, the conventional and the iterative method.
In this thesis, a data structure to construct a maneuver is designed and implemented. The data structure is integrated into an already existing tool. The user can interact with the software to adjust various parameters of a maneuver. Afterwards, the maneuver is generated from these parameters. This also includes a visualization within the software, which can plot the parts of a maneuver. The visualization can be exported to an image file.
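A data structure along these lines might look as follows in Python. The class layout and the straight-line construction are a hypothetical sketch under the two-part definition above (reference-point paths plus corridor), not the structure actually implemented in the tool.

```python
import math
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple

class ManeuverType(Enum):
    BENDING = "bending"
    CUSP = "cusp"
    STRAIGHT_LINE = "straight line"

Point = Tuple[float, float]

@dataclass
class Maneuver:
    """Container for the two parts of a maneuver: reference-point paths
    and the corridor polygon the vehicle never leaves."""
    kind: ManeuverType
    paths: List[List[Point]] = field(default_factory=list)   # one path per reference point
    corridor: List[Point] = field(default_factory=list)      # corridor polygon

    @staticmethod
    def straight_line(start: Point, end: Point, width: float) -> "Maneuver":
        """Conventional construction of the simplest maneuver type:
        one straight path plus a rectangular corridor of the given width."""
        (x0, y0), (x1, y1) = start, end
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy)
        # normal vector scaled to half the corridor width
        nx, ny = -dy / length * width / 2, dx / length * width / 2
        corridor = [(x0 + nx, y0 + ny), (x1 + nx, y1 + ny),
                    (x1 - nx, y1 - ny), (x0 - nx, y0 - ny)]
        return Maneuver(ManeuverType.STRAIGHT_LINE, [[start, end]], corridor)

m = Maneuver.straight_line((0.0, 0.0), (10.0, 0.0), width=3.0)
```

Bending and cusp maneuvers would add curved reference paths and correspondingly shaped corridors, but the container stays the same.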
Web programming is a huge field of different technologies and concepts. Each technology implements a web-application requirement such as content generation or client-server communication. Different technologies within one application are organized by concepts, for example architectural patterns. The thesis describes an approach for creating a taxonomy of these web-programming components using the free encyclopaedia Wikipedia. Our 101companies project uses implementations to identify and classify the different technology sets and concepts behind a web-application framework. These classifications can be used to create taxonomies and ontologies within the project. The thesis also describes how we prioritize useful web-application frameworks with the help of Wikipedia. Finally, the created implementations concerning web programming are documented.
Wikipedia is the biggest free online encyclopaedia and can be expanded by anyone. A social network exists for the users who create content on a specific Wikipedia language edition. In this social network, users are categorised into different roles: normal users, administrators and functional bots. Within the network, a user can post reviews or suggestions, or send simple messages to the "talk page" of another user. Each language edition of Wikipedia has this type of social network.
In this thesis, characteristics of the three roles are analysed in order to learn how they function in one Wikipedia language network and to apply them to another Wikipedia network to identify bots. Timestamps of created posts are analysed, and noticeable characteristics referring to continuous messages, message rates and irregular user behaviour are discovered. Through this process we show that differences between the roles exist for the mentioned characteristics.
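As a toy illustration of the timestamp analysis, the following Python sketch flags accounts whose inter-post gaps are suspiciously regular. The coefficient-of-variation measure, the 0.1 threshold and the sample timestamp lists are illustrative choices, not the thesis' actual criteria.

```python
import statistics

def interarrival_cv(timestamps):
    """Coefficient of variation of the gaps between consecutive posts:
    a value near 0 means clockwork-regular posting."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.pstdev(gaps) / statistics.mean(gaps)

def looks_like_bot(timestamps, threshold=0.1):
    """Flag an account whose posting rhythm is suspiciously regular."""
    return interarrival_cv(timestamps) < threshold

# synthetic data: a bot posting once per minute vs. bursty human activity
bot_posts = [t * 60 for t in range(50)]
human_posts = [0, 70, 95, 400, 460, 1000, 1900, 2000, 5000]
```

Regularity alone is a hint, not proof; combining it with message rates and content features, as the thesis does, reduces false positives.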
The present thesis deals with the realization of a stepper motor driver on an 8-bit microcontroller from Atmel. The focus is on the development of a current control that allows microstepping in addition to the basic modes of operation such as full- and half-step. For this purpose, a PI controller is derived using physical and control-engineering principles and implemented on the microcontroller. In this context, essential knowledge for the practical implementation is discussed. In addition, the development of the hardware is documented, which is of great significance for the current measurement.
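As an illustration of the control law in such a driver, here is a minimal discrete PI current controller in Python. The gains, the 12 V output range and the crude first-order winding model in the demo loop are invented for this sketch and do not come from the thesis.

```python
class PIController:
    """Discrete PI controller in the form typically run on a
    microcontroller: u[k] = Kp*e[k] + Ki*Ts*sum(e), clamped to the
    driver's output range with simple anti-windup."""

    def __init__(self, kp, ki, ts, u_min, u_max):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.ts
        u = self.kp * error + self.ki * self.integral
        if u > self.u_max:                     # saturate and stop integrating
            u = self.u_max
            self.integral -= error * self.ts
        elif u < self.u_min:
            u = self.u_min
            self.integral -= error * self.ts
        return u

# toy closed loop: first-order model standing in for a motor winding
pi = PIController(kp=2.0, ki=50.0, ts=0.001, u_min=0.0, u_max=12.0)
current = 0.0
for _ in range(2000):
    voltage = pi.update(setpoint=1.5, measured=current)
    current += (voltage - 3.0 * current) * 0.01   # Euler step, R = 3 ohm
```

The anti-windup clamp matters in practice: without it, the integral keeps growing while the output is saturated and the current overshoots badly once saturation ends.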
Placing questions before or after the material constitutes different reading situations. To adapt to these situations, readers may apply appropriate reading strategies. Reading strategies induced by question location have been intensively explored in the context of text comprehension. (1) However, there is still not enough knowledge about whether text plays the same role as pictures when readers apply different reading strategies. To answer this research question, three reading strategies were experimentally manipulated by displaying the question before or after the blended text-and-picture material: (a) unguided processing of text and pictures without a question, (b) information gathering to answer questions after prior experience with text and pictures, and (c) comprehending text and pictures to solve questions with prior knowledge of the questions. (2) Besides, it is arguable whether readers prefer text or pictures when the instructed questions are at different difficulty levels. (3) Furthermore, it is still uncertain whether students from the higher school tier (Gymnasium) place more emphasis on text or on pictures than students from the lower school tier (Realschule). (4) Finally, it is rarely mentioned whether higher graders are more able to apply reading strategies in text and picture processing than lower graders.
Two experiments were undertaken to investigate the use of text and pictures from the perspectives of task orientation, question difficulty, school tier and grade. For a 2x2(x2x2x2) mixed design using the eye-tracking method, participants were recruited from grade 5 (N = 72) and grade 8 (N = 72). In Experiment 1, thirty-six 5th graders were recruited from the higher tier (Gymnasium) and thirty-six from the lower tier (Realschule). In Experiment 2, thirty-six 8th graders were recruited from the higher tier and thirty-six from the lower tier. They were asked to comprehend materials combining text and pictures and to answer questions. A Tobii XL60 eye tracker recorded their eye movements and their answers to the questions. Eye-tracking indicators such as accumulated fixation duration, time to first fixation and transitions between different Areas of Interest were analysed and reported. The results reveal that students process text differently from pictures when they follow different reading strategies. (1) Consistent with Hypothesis 1, students mainly use text to construct their mental model in unguided spontaneous processing of text and pictures. They seem to rely mainly on the pictures as external representations when trying to answer questions after prior experience with the material. They emphasize both text and pictures when questions are presented before the material. (2) Inconsistent with Hypothesis 2, students attend more to both text and pictures as question difficulty increases. However, the increase in focus on pictures is larger than on text when the presented question is difficult. (3) Contrary to Hypothesis 3, the current study finds that higher tier students did not differ from lower tier students in text processing. Instead, students from the higher tier attend more to pictures than students from the lower tier. (4) In contrast to Hypothesis 4, 8th graders outperform 5th graders mainly in text processing. Only a subtle difference is found between 5th and 8th graders in picture processing.
To sum up, text processing differs from picture processing when different reading strategies are applied. In line with the Integrative Model of Text and Picture Comprehension by Schnotz (2014), text is likely to play the major part in guiding the processing of meaning or general reading, whereas pictures serve as external representations for information retrieval or selective reading. When a question is difficult, pictures are emphasized because of their advantages in visualizing the internal structure of information. Compared to lower tier students (poorer problem solvers), higher tier students (good problem solvers) are more capable of comprehending pictures rather than text. Eighth graders are more efficient than 5th graders in text processing rather than picture processing. This suggests that in designing school curricula, more attention should be paid to students' competence in picture comprehension and text-picture integration.
With the appearance of modern virtual reality (VR) headsets on the consumer market, VR technology has seen the biggest boom in its history. Naturally, this was accompanied by an increasing focus on the problems of current VR hardware. Control in VR, in particular, has always been a complex topic.
One possible solution is the Leap Motion, a hand tracking device that was initially developed for desktop use, but with the last major software update it can be attached to standard VR headsets. This device allows very precise tracking of the user’s hands and fingers and their replication in the virtual world.
The aim of this work is to design virtual user interfaces that can be operated with the Leap Motion to provide a natural method of interaction between the user and the VR environment. After that, subject tests are performed to evaluate their performance and compare them to traditional VR controllers.
The present thesis gives an overview of the general conditions for the programming of graphics cards. For this purpose, the most important Application Programming Interfaces (APIs) available on the market are presented and compared. Subsequently, two standard algorithms from the field of data processing, prefix sum and radix sort, are presented and examined with regard to a parallel implementation on the GPU. Both algorithms were implemented using the OpenGL API and OpenGL compute shaders. Finally, the execution times of the two algorithms were compared.
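To make the two algorithms concrete, the following Python sketch shows a sequential exclusive prefix sum and an LSD radix sort that uses it to compute scatter offsets, which is the split step a GPU compute shader parallelises. This is a reference formulation for illustration, not the thesis' OpenGL implementation.

```python
def exclusive_scan(xs):
    """Sequential exclusive prefix sum: out[i] = sum(xs[:i])."""
    out, running = [], 0
    for x in xs:
        out.append(running)
        running += x
    return out

def radix_sort(xs, bits=8):
    """LSD radix sort on non-negative integers, one bit per pass.
    Each pass uses the prefix sum to compute scatter offsets: the
    'split' primitive that a GPU compute shader parallelises."""
    for b in range(bits):
        flags = [(x >> b) & 1 for x in xs]
        zero_pos = exclusive_scan([1 - f for f in flags])  # slots for 0-bit keys
        one_pos = exclusive_scan(flags)                    # slots for 1-bit keys
        total_zeros = len(xs) - sum(flags)
        out = [0] * len(xs)
        for i, x in enumerate(xs):
            if flags[i]:
                out[total_zeros + one_pos[i]] = x
            else:
                out[zero_pos[i]] = x
        xs = out
    return xs

data = [170, 45, 75, 90, 2, 24, 66]
result = radix_sort(data)
```

On the GPU, the per-element loop becomes one work item per key, and the scan itself is computed with a parallel algorithm such as Blelloch's work-efficient scan.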
To meet the growing demands in the automotive industry, car manufacturers constantly reduce the depth of production and shift value-adding processes to suppliers. This requires companies to work together more closely and promotes the creation of complex logistics networks. To meet the requirements for information exchange, a consortium of automobile manufacturers launched the project RFID-based Automotive Network (RAN) in 2009. The initiative aims at creating a standardized architecture for efficient material flow management along the entire supply chain. The core component of this architecture is the Informationbroker, an information unit which automatically communicates data captured via Auto-ID technology to supply chain participants. In cooperation with IBS AG, a software company and consortium partner in the project, the thesis focuses on the exchange of goods data.
At first, theoretical foundations are presented by describing the characteristics of a supply chain and explaining standardization efforts and related processes. The chapter on the supply chain focuses on trends in the automotive industry to create a link to the project. The topic of standardization provides in-depth information on electronic data exchange standards in order to additionally create a transition to the Informationbroker concept. In the analytical part, reference projects with a similar problem are presented and set in relation to RAN. Based on project documents, system requirements are defined and models are created in order to illustrate the problem. Rich Pictures are used to describe the current and target state.
Based on these models, the flow of goods related data is depicted between two companies and the role of the Informationbroker for the information exchange is clarified. The thesis aims at establishing an understanding of the challenges of the project and how the proposed concepts of the initiative can lead to an optimization of an automotive supply chain.
The estimation of various social objects is necessary in different fields of social life, science, education, etc. This estimation is usually used for forecasting, for evaluating different properties and for other goals in complex man-machine systems. At present, this estimation is possible by means of computer and mathematical simulation methods, which is connected with significant difficulties, such as: the time-distributed process of receiving information about the object; the determination of a corresponding mathematical device and structure identification of the mathematical model; the approximation of the mathematical model to real data, generalization and parametric identification of the mathematical model; and the identification of the structure of the links of the real social object. The solution of these problems is impossible without a special intellectual information system which combines different processes and allows predicting the behaviour of such an object. However, most existing information systems address only one special problem. From this point of view, the development of a more general technology for designing such systems is very important. The technology of developing an intellectual information system for estimating and forecasting the professional ability of respondents in the sphere of education is a concrete example of such a technology. Job orientation is necessary and topical in present economic conditions. It helps to solve the problem of the expediency of investments in a certain sphere of education. Scientifically validated combined diagnostic methods of job orientation are necessary to carry out professional selection in higher education establishments. The requirements of a modern society are growing, and the earlier developed techniques are unable to meet them sufficiently. All these techniques lack the ability to account for all necessary professional and personal characteristics.
Therefore, it is necessary to use a system of various tests, and new methods of job orientation for entrants must be developed. An information model of the job orientation process is needed for this purpose. It would thus be desirable to have an information system capable of giving recommendations concerning the choice of a trade on the basis of complex personal characteristics of entrants.
The present thesis examines the influence of forest and forestry roads on runoff generation and soil erosion rates within a forested catchment in the Laacher See nature reserve. For this purpose, existing erosion and accumulation forms were mapped in the field, and erosion simulations were carried out with a small rainfall simulator. Finally, the erosion potential was modelled on the basis of the simulation results.
The analysis of existing erosion and accumulation forms in the field indicated soil erosion rates from road surfaces between 27.3 and 93.5 t ha-1 a-1, i.e. of the same order of magnitude as erosion rates under intensive arable use.
The simulation runs showed that persistent forest roads exhibit markedly altered infiltration behaviour. On natural forest soils, an average of 96% of the precipitation infiltrated. On forest roads, this share dropped to between 14% and 7% on average. The results on skid trails were particularly striking: a considerable influence of soil compaction by vehicle traffic could be demonstrated there, with the share of infiltrated precipitation falling to 31% in the wheel tracks, while 76% still infiltrated between the tracks.
During the simulation runs, maximum sediment amounts of 446 g m-2 were eroded, corresponding to a mean soil erosion rate of 4.96 g m-2 min-1. These high erosion rates were measured on persistent roads with little surfacing. Skid trails showed the lowest erosion values; at most 37 g m-2 were eroded, equivalent to an erosion rate of 0.41 g m-2 min-1. The eroded sediment amounts averaged 167 to 319 g m-2 for roads and 17 g m-2 for skid trails. Comparative measurements on forest sites, where a mean soil loss of about 5 g m-2 was determined, confirmed an increased erodibility for every type of road construction.
The models were calibrated on the basis of the erosion rates measured in the field. For the study area, the results of the ABAG / DIN 19708 showed a mean annual soil erosion risk of 2.4 - 5.8 t ha-1 a-1 for persistent roads and of 0.5 t ha-1 a-1 for skid trails. Compared with the mean of largely unaffected forest areas in the study area of 0.1 t ha-1 a-1, an increased erosion potential was again evident. The physically based modelling of the rainfall experiments with WEPP yielded a satisfactory result for the estimation of runoff behaviour; for persistent roads, deviations of at most -5% were found. In contrast, runoff modelling on skid trails and the modelling of soil erosion during the rainfall experiments were still error-prone, which can be attributed to the input data being relatively sparse for a physically based model.
It was demonstrated that forest roads have a considerable influence on the water balance and on soil erosion. The retention of precipitation is reduced and soil erosion processes are intensified. Poorly surfaced roads showed strongly increased soil loss, which can lead to secondary ecological damage. The loss can also impair trafficability. These consequences make clear the relevance of studying runoff and soil erosion processes on forest and forestry roads. The present thesis is the first study in which runoff and soil erosion processes were investigated for forest access networks in Central Europe.
Innovation can help a forward-looking company rise very quickly; moreover, innovative products and services bring a company to a stage where it can win new customer segments and stay ahead of the competition. For their innovation process, companies can distinguish between open and closed innovation. Here, we focus on open innovation and on how companies share their innovation processes for the benefit of the company. They use information and innovation systems to define their innovation process as well as to track innovative ideas and their phase of development. There are always pros and cons when it comes to open innovation processes in an organization. We look at examples from the business world to illustrate how good or bad an open innovation process can be for a company. In this Bachelor thesis, we point out the essential criteria for an open innovation process and present companies that have used open innovation processes, in some cases successfully and in some cases unsuccessfully.
In this work, several network protocols are to be observed and described using the Wireshark protocol analyser. Wireshark is an offshoot of "Ethereal", one of the most popular protocol analysers. Wireshark analyses network traffic, records it and presents it clearly. VNUML is used to simulate the network. Since VNUML can only be used under Linux, Linux is run in a virtual machine in order to be able to work under Windows.
While the 1960s and 1970s still knew permanent education (Council of Europe), recurrent education (OECD) and lifelong education (UNESCO), over the past 20 years, lifelong learning has become the single emblem for reforms in (pre-) primary, higher and adult education systems and international debates on education. Both highly industrialized and less industrialized countries embrace the concept as a response to the most diverse economic, social and demographic challenges - in many cases motivated by international organizations (IOs).
Yet, literature on the nature of this influence, the diffusion of the concept among IOs and their understanding of it is scant and usually focuses on a small set of actors. Based on longitudinal data and a large set of education documents, the work identifies rapid diffusion of the concept across a heterogeneous, expansive and dynamic international field of 88 IOs in the period 1990-2013, which is difficult to explain with functionalist accounts.
Based on the premises of world polity theory, this paper argues that what diffuses resembles less the bundle of systemic reforms usually associated with the concept in the literature and more a surprisingly detailed model of a new actor: the lifelong learner.
Background: Somatoform symptoms are a prevalent and disabling condition in primary care, causing high medical care utilization. Objective: To compare the short- and long-term effects of cognitive-behavioral outpatient group therapy with a relaxation group and a waiting-list control group, on physical symptoms, anxiety, depression, functional health, symptom-specific cognitions and illness behavior. Methods: 135 subjects were treated and assessed in a randomized controlled design. The manualized interventions comprised eight sessions. Results: The cognitive-behavioral group treatment led to lower levels of somatoform symptoms (SOMS-7) and enhanced mental health (SF-12). There were no differential effects between cognitive-behavioral therapy and relaxation treatment on any of the analysed variables. Conclusions: This brief cognitive-behavioral group therapy has beneficial effects on ambulatory patients with somatoform symptoms. To enhance effect sizes and facilitate differential effects, future studies should consider applying an increased therapy dosage.
This report investigates the use of wireless sensor networks for temperature measurement in running waters. It discusses to what extent such networks can serve as a link between remote sensing and stationary sensors. The requirements that water monitoring places on sensor networks are determined, and a prototype implementation of nodes for such a sensor network is presented. As a result of this work, the accuracy of temperature measurements with these sensor nodes is compared against a temperature logger used as a reference system. The measurements show that comparatively good measurement accuracy can be achieved at low cost. Further development of the prototype presented here yields a promising and inexpensive new measurement instrument for temperature monitoring in bodies of water: it can measure water temperatures at greater depths than remote sensing allows, while achieving a higher spatial resolution than stationary measuring stations. In addition, the literature review and the formulated criteria help delimit the scope of application for follow-up work.
Foliicolous lichens are among the most abundant epiphytes in tropical rainforests and one of the few groups of organisms that characterize these forests. Tropical rainforests are increasingly affected by anthropogenic disturbance resulting in forest destruction and degradation. However, little is known about the effects of anthropogenic disturbance on the diversity of foliicolous lichens. Understanding such effects is crucial for developing appropriate measures for the conservation of these organisms. In this study, foliicolous lichen diversity was investigated in three tropical rainforests in East Africa. Godere Forest in Southwest Ethiopia is a transitional rainforest with a mixture of Afromontane and Guineo-Congolian species. The forest is secondary and has been affected by shifting cultivation, semi-forest coffee management and commercial coffee plantation. Budongo Forest in West Uganda is a Guineo-Congolian rainforest consisting of primary and secondary forests. Kakamega Forest in western Kenya is a transitional rainforest with a mixture of Guineo-Congolian and Afromontane species. The forest is a mosaic of near-primary forest, secondary forests of different seral stages, grasslands, plantations, and natural glades.
The purpose of this thesis is to explore the sentiment distributions of Wikipedia concepts.
We analyse the sentiment of the entire English Wikipedia corpus, which includes 5,669,867 articles and 1,906,375 talks, by using a lexicon-based method with four different lexicons.
Also, we explore the sentiment distributions from a time perspective using the sentiment scores obtained from our selected corpus. The results obtained have been compared not only between articles and talks but also among four lexicons: OL, MPQA, LIWC, and ANEW.
Our findings show that among the four lexicons, MPQA has the highest sensitivity and ANEW the lowest sensitivity to emotional expressions. Wikipedia articles show more sentiment than talks according to OL, MPQA, and LIWC, whereas Wikipedia talks show more sentiment than articles according to ANEW. In addition, sentiment shows trends over time, and each lexicon has its own bias with respect to texts describing different things.
Moreover, our research provides three interactive widgets for visualising sentiment distributions for Wikipedia concepts regarding the time and geolocation attributes of concepts.
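The lexicon-based method referred to above can be sketched roughly as follows. The two word lists and the normalized score are illustrative assumptions, not the actual OL, MPQA, LIWC, or ANEW lexicons, which are far larger and more nuanced:

```python
# Illustrative mini-lexicons; real lexicons such as OL or MPQA are much larger.
POSITIVE = {"good", "great", "helpful", "excellent"}
NEGATIVE = {"bad", "poor", "wrong", "terrible"}

def sentiment_score(text):
    """Return (positive - negative) lexicon matches normalized by token count."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)
```

Scores above zero count as positive sentiment and below zero as negative; scoring the same corpus under several lexicons is what makes the per-lexicon biases visible.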
Analysis of TV-based interaction for senior citizens: implementation and evaluation in the healthcare sector
(2007)
In a world where language defines the boundaries of one's understanding, the words of the Austrian philosopher Ludwig Wittgenstein resonate profoundly. Wittgenstein's assertion that "Die Grenzen meiner Sprache bedeuten die Grenzen meiner Welt" (Wittgenstein 2016: v. 5.6) underscores the vital role of language in shaping our perceptions. Today, in a globalized and interconnected society, fluency in foreign languages is indispensable for individual success. Education must break down these linguistic barriers, and one promising approach is the integration of foreign languages into content subjects.
Teaching content subjects in a foreign language, a practice known as Content and Language Integrated Learning (CLIL), not only enhances language skills but also cultivates cognitive abilities and intercultural competence. This approach expands horizons and aligns with the core principles of European education (Leaton Gray, Scott & Mehisto 2018: 50). The Kultusministerkonferenz (KMK) recognizes the benefits of CLIL and encourages its implementation in German schools (cf. KMK 2013a).
With the rising popularity of CLIL, textbooks in foreign languages have become widely available, simplifying teaching. However, the appropriateness of the language used in these materials remains an unanswered question. If textbooks impose excessive linguistic demands, they may inadvertently limit students' development and contradict the goal of CLIL.
This thesis focuses on addressing this issue by systematically analyzing language requirements in CLIL teaching materials, emphasizing receptive and productive skills in various subjects based on the Common European Framework of Reference. The aim is to identify a sequence of subjects that facilitates students' language skill development throughout their school years. Such a sequence would enable teachers to harness the full potential of CLIL, fostering a bidirectional approach where content subjects facilitate language learning.
While research on CLIL is extensive, studies on language requirements for bilingual students are limited. This thesis seeks to bridge this gap by presenting findings for History, Geography, Biology, and Mathematics, allowing for a comprehensive understanding of language demands. This research endeavors to enrich the field of bilingual education and CLIL, ultimately benefiting the academic success of students in an interconnected world.
Climate change is an existential threat to human survival, the social organization of society, and the stability of ecosystems. It is thereby profoundly frightening. In the face of threat, people often want to protect themselves instead of engaging in mitigating behaviors. When psychological resources are insufficient to cope, people often respond with different forms of denial. In this dissertation, I contribute original knowledge to the understanding of the multifaceted phenomenon of climate denial from a psychological perspective.
There are four major gaps in the literature on climate denial: First, the spectrum of climate denial as a self-protective response to the climate crisis has not received attention within psychology. Second, basic psychological need satisfaction, a fundamental indicator of human functioning and the ability to cope with threat, has not been investigated as a predictor of climate denial. Third, relations of the spectrum of climate denial to climate-relevant emotions, specifically climate anxiety, have not been examined empirically. Fourth, it has not been investigated how the spectrum of climate denial relates to established predictors of climate denial, namely right-wing ideological convictions and male gender. To address these gaps, I investigate what the spectrum of climate denial looks like in the German context and how it relates to basic psychological need satisfaction and frustration, pro-environmental behavior, climate anxiety, ideological conviction, and gender.
Five manuscripts reveal that climate denial exists on a spectrum in the German context, ranging from the distortion of facts (interpretive climate denial, specifically denial of personal and global outcome severity) to the denial of the implications of climate change (implicatory climate denial, specifically avoidance, denial of guilt, and rationalization of one's own involvement). Across analyses, low basic psychological need satisfaction predicted the spectrum of climate denial, which was negatively related to pro-environmental behavior. Climate denial was generally negatively related to climate anxiety, except for a positive association of avoidance and climate anxiety. Right-wing ideological conviction was the strongest predictor of climate denial across the spectrum. However, low need satisfaction and male gender were additional weaker predictors of implicatory climate denial.
These findings suggest that the spectrum of climate denial serves many psychological functions. Climate denial is possibly both a self-protective strategy to downregulate emotions and to protect oneself from loss of privilege. In short, it represents a barrier to climate action that may only be resolved once people have sufficient psychological resources to face the threat of climate change and cope with their underlying self-protective, emotional responses.
Remote rendering services offer the possibility of streaming high-quality images to lower-powered devices. Due to the transmission of data, the interactivity of applications suffers from a delay. One method to reduce the delay of camera manipulation on the client is 3D warping; this method, however, causes artifacts. This thesis presents different approaches to remote rendering setups, describes the artifacts and improvements of the warping method, and implements and analyzes methods to reduce the artifacts.
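To illustrate the idea behind 3D warping: instead of waiting for the next server frame, the client can reproject a pixel with known depth into the new camera pose. The pinhole intrinsics and the trivial pose below are made-up values for illustration, not the thesis's implementation:

```python
import numpy as np

def warp_pixel(px, py, depth, K, R, t):
    """Reproject one pixel with known depth into a second camera pose (R, t)."""
    p = np.array([px, py, 1.0])
    X = depth * (np.linalg.inv(K) @ p)   # back-project into 3D camera space
    Xc = R @ X + t                       # transform into the new camera frame
    q = K @ Xc                           # project with the intrinsics
    return q[0] / q[2], q[1] / q[2]      # perspective divide

# Made-up pinhole intrinsics; with an identity pose the pixel maps to itself.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
u, v = warp_pixel(100.0, 80.0, 2.0, K, np.eye(3), np.zeros(3))
```

With a non-trivial rotation or translation the returned coordinates shift, and holes appear wherever no source pixel maps to a target pixel, which is exactly the kind of artifact the warping method suffers from.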
In recent years, e-government has concentrated on the administrative aspects of administrative modernization. As a next step, e-discourses will gain importance as an instrument of citizen-friendliness and a means of e-democracy and e-participation. With growing acceptance, such e-discourses will quickly reach a complexity that participants can no longer master; many impressions that can be gained from face-to-face discussions will be lacking. This thesis therefore aims at the conception and prototypical implementation of an instrument (the discourse meter) that enables participants, in particular the moderators of an e-discourse, to survey the e-discourse at any time and thereby attain discourse awareness. Discourse awareness of the present informs about current activity in the e-discourse; discourse awareness of the past informs about past activity, making trends visible. The focus of discourse awareness lies on a quantitative view of activity in the e-discourse. From the model of the e-discourse developed in this thesis, the questions of discourse awareness follow, and their concretization forms the basis for the implementation of the discourse meter. Discourse sensors attached to the model of the e-discourse record the actions of the e-discourse as discourse events, which the discourse meter represents in various visualizations. The discourse meter concept offers the moderators of the e-discourse discourse awareness of the present as monitoring and discourse awareness of the past as queries (quantitative analysis).
One of the research focuses of the Computer Networks Group is the Routing Information Protocol (RIP). The Routing Information Protocol with Metric-based Topology Investigation (RMTI, formerly RIP-MTI) is a compatible in-house extension of this routing protocol. To test this protocol and compare it with its predecessor, the virtualization software VNUML is used. In these virtualized networks, routers running the RMTI protocol are observed by means of the Zebra/Quagga routing software suite, and the protocol's behavior is analyzed and evaluated in a wide range of simulation scenarios. To control and log such test runs centrally, the application RIP-XT (XTPeer) was created and has been continuously extended in subsequent diploma theses. It interfaces with the Zebra/Quagga routers and can control them; in addition, it collects and analyzes the routers' routing information. A user controls these processes via a GUI, which also provides a topology view for a visual overview of a network topology. The view represents the entire network with symbols through which the user can also interact with the simulation. The goal of this diploma thesis was to rework the existing topology view to adapt it to new requirements; furthermore, functional extensions were embedded into the RIP-XT GUI.
This thesis addresses the automated identification and localization of a time-varying number of objects in a stream of sensor data. The problem is challenging due to its combinatorial nature: If the number of objects is unknown, the number of possible object trajectories grows exponentially with the number of observations. Random finite sets are a relatively new theory that has been developed to arrive at principled and efficient approximations. It is based around set-valued random variables that contain an unknown number of elements which appear in arbitrary order and are themselves random. While extensively studied in theory, random finite sets have not yet become a leading paradigm in practical computer vision and robotics applications. This thesis explores random finite sets in visual tracking applications. The first method developed in this thesis combines set-valued recursive filtering with global optimization. The problem is approached in a min-cost flow network formulation, which has become a standard inference framework for multiple object tracking due to its efficiency and optimality. A main limitation of this formulation is a restriction to unary and pairwise cost terms. This circumstance makes integration of higher-order motion models challenging. The method developed in this thesis approaches this limitation by application of a Probability Hypothesis Density filter. The Probability Hypothesis Density filter was the first practically implemented state estimator based on random finite sets. It circumvents the combinatorial nature of data association itself by propagation of an object density measure that can be computed efficiently, without maintaining explicit trajectory hypotheses. In this work, the filter recursion is used to augment measurements with an additional hidden kinematic state to be used for construction of more informed flow network cost terms, e.g., based on linear motion models.
The method is evaluated on public benchmarks where a considerable improvement is achieved compared to network flow formulations that are based on static features alone, such as distance between detections and appearance similarity. A second part of this thesis focuses on the related task of detecting and tracking a single robot operator in crowded environments. Different from the conventional multiple object tracking scenario, the tracked individual can leave the scene and later reappear after a longer period of absence. Therefore, a re-identification component is required that picks up the track on reentrance. Based on random finite sets, the Bernoulli filter is an optimal Bayes filter that provides a natural representation for this type of problem. In this work, it is shown how the Bernoulli filter can be combined with a Probability Hypothesis Density filter to track operator and non-operators simultaneously. The method is evaluated on a publicly available multiple object tracking dataset as well as on custom sequences that are specific to the targeted application. Experiments show reliable tracking in crowded scenes and robust re-identification after long-term occlusion. Finally, a third part of this thesis focuses on appearance modeling as an essential aspect of any method that is applied to visual object tracking scenarios. Therefore, a feature representation that is robust to pose variations and changing lighting conditions is learned offline, before the actual tracking application. This thesis proposes a joint classification and metric learning objective where a deep convolutional neural network is trained to identify the individuals in the training set. At test time, the final classification layer can be stripped from the network and appearance similarity can be queried using cosine distance in representation space.
This framework represents an alternative to direct metric learning objectives that have required sophisticated pair or triplet sampling strategies in the past. The method is evaluated on two large scale person re-identification datasets where competitive results are achieved overall. In particular, the proposed method better generalizes to the test set compared to a network trained with the well-established triplet loss.
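Querying appearance similarity via cosine distance in representation space, as described above, reduces to a nearest-neighbor search over embeddings. The two-dimensional toy vectors below stand in for the CNN's output and are purely illustrative:

```python
import numpy as np

def cosine_distance(a, b):
    """Cosine distance between two appearance embeddings (0 = same direction)."""
    return 1.0 - float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 2-D gallery embeddings; in practice these come from the trained network.
gallery = {
    "person_a": np.array([1.0, 0.0]),
    "person_b": np.array([0.0, 1.0]),
}
query = np.array([0.9, 0.1])
best = min(gallery, key=lambda name: cosine_distance(query, gallery[name]))
```

Because cosine distance ignores vector magnitude, the comparison depends only on embedding direction, which is what makes it a convenient drop-in similarity query once the classification layer is stripped.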
There has been little research on out-of-school places of learning and their effectiveness in the context of ESD education measures. With the help of a multi-stage analysis, this study identifies out-of-school places of learning with reference to the ESD education concept in the Rhineland-Palatinate study area. To this end, qualitative literature analyses were first used to generate ESD criteria, which were operationalised as a methodological instrument in the form of an ESD checklist for out-of-school places of learning. The data obtained in this way provide the basis for the creation of a geographically oriented learning location database with ESD reference. A cartographic visualisation of the data results in a spatial distribution pattern: Thus, there are districts and cities that are well supplied with ESD learning locations, but also real ESD learning location deserts where there is a need to catch up. Furthermore, there is an accumulation of ESD learning sites in areas close to forests.
A guideline-based explorative interview with two ESD experts provides additional insights into the question of how ESD has been implemented in the federal state of Rhineland-Palatinate, the extent to which there is a need for optimisation, and which continuing measures are being taken for ESD outside schools within the framework of Agenda 2030.
In addition, a quantitative questionnaire study was carried out with 1358 pupils at 30 out-of-school places of learning after participation in an educational measure, in which environmental awareness, attitudes towards environmental behaviour and local learning were also considered. By including non-ESD learning locations, a comparative study on the effectiveness of ESD learning locations became possible. The statistical data evaluation leads to a variety of interesting results. Counter-intuitively, for instance, the type of learning location (ESD or non-ESD learning location) is not a significant predictor of the environmental awareness and environmental behaviour of the surveyed students, whereas communication structures within educational measures at extracurricular learning locations, multimediality and action orientation, and the duration of educational measures have a significant influence.
Keywords: extracurricular learning locations, education for sustainable development (ESD), ESD criteria, learning location landscape Rhineland-Palatinate, ESD learning locations, environmental awareness, environmental behaviour.
MP3 Player for Nintendo DS
(2007)
The goal of this work is to develop an MP3 player that allows user interaction in the way common computer programs for music playback do. Via a graphical interface, the user should be able to load MP3 files, play them and organize them in playlists. It should also be possible to store and edit metadata such as title, artist, genre, year of release and much more as an additional tag. The software should read this information while playing a track and display it clearly to the user; most players fail here because of their small displays. In addition, the MP3 player should offer rudimentary functions for real-time manipulation of music playback. The Nintendo DS game console serves as the hardware for playing the music files: its two displays provide enough room for a graphical user interface, and one of them doubles as a touchscreen for input.
Research in model-based object recognition and localization has a promising future; building recognition in particular offers a wide range of applications. Determining the position and orientation of the observer relative to a building is a central component of building recognition.
The core of this work is to develop a system for model-based pose estimation that operates independently of the application domain. Model-based pose estimation from building images is chosen as the application domain; as preparation, model-based recognition of dominoes and poker cards is implemented. An application-independent control strategy interprets application-specific models in order both to locate them in the image and to determine the pose with the help of these models. Explicitly represented model knowledge is used, so that image features can be assigned to model components. These correspondences make it possible to recover the camera pose from a single monocular image. The method is independent of the application and can also handle models of other rigid objects, provided they conform to the defined model representation. Determining the pose of a model from a single image, which may contain noise and occlusions, requires a systematic comparison of the model with the image data. Quantitative and qualitative evaluations confirm the accuracy of the estimated building poses.
This work also presents a semi-automatic method for generating a building model. The building model used, which contains both semantic and geometric knowledge, satisfies the requirements of object recognition and pose estimation while still following existing standards, and is a prerequisite for the pose estimation method. The guiding principle of the model's representation is that it remains interpretable by humans. A semi-automatic approach was chosen because a fully automatic implementation can hardly achieve the necessary precision. The developed method achieves the precision required for pose estimation while reducing user interaction to a minimum. A qualitative evaluation confirms the precision achieved in generating the building model.
The automatic detection of the position and orientation of subsea cables and pipelines in camera images enables underwater vehicles to perform autonomous inspections. However, plants like algae growing on top of and near cables and pipelines complicate their visual detection: determining the position via border detection followed by line extraction often fails. Probabilistic approaches are superior to deterministic approaches here: by modeling probabilities, it is possible to make assumptions about the state of the system even if the number of extracted features is small. This work introduces a new tracking system for cable/pipeline following in image sequences which is based on particle filters. Extensive experiments on realistic underwater videos show the robustness and performance of this approach and demonstrate advantages over previous works.
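A particle filter of the kind used for cable following maintains a set of weighted state hypotheses that are predicted forward, reweighted by a measurement likelihood, and resampled. The one-dimensional toy state (think of it as a lateral cable offset) and the Gaussian likelihood below are illustrative assumptions, not the system's actual state model:

```python
import math
import random

def particle_filter_step(particles, z, motion_noise=0.1, meas_noise=0.5):
    """One predict-update-resample cycle for a 1-D set of state hypotheses."""
    # Predict: diffuse each hypothesis with motion noise.
    predicted = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Update: weight each hypothesis by a Gaussian measurement likelihood.
    weights = [math.exp(-((p - z) ** 2) / (2.0 * meas_noise ** 2)) for p in predicted]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportionally to the weights.
    return random.choices(predicted, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(-5.0, 5.0) for _ in range(500)]
for z in [1.0, 1.1, 0.9, 1.0]:            # simulated measurements near 1.0
    particles = particle_filter_step(particles, z)
estimate = sum(particles) / len(particles)
```

Even with few or noisy measurements the particle cloud carries a full distribution over the state, which is why the probabilistic approach degrades more gracefully than a deterministic line extractor.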
Autonomous exhaustive exploration of unknown indoor environments with the mobile robot "Robbie"
(2007)
After disasters such as earthquakes, rescue robots help to find survivors in destroyed buildings. The robot should autonomously search the environment as completely and efficiently as possible while building a map that helps the rescue workers orient themselves when recovering victims. This requires an exploration strategy: a strategy for navigating known terrain and for exploring unknown terrain. In this work, a frontier-based approach to the exploration problem was selected and implemented for the mobile robot "Robbie" of the Active Vision working group. Frontiers to unknown terrain are extracted from the map the robot builds and are then approached. Navigation to a waypoint found in this way is based on the so-called path transform.
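The frontier-based strategy can be illustrated on a small occupancy grid: frontier cells are free cells adjacent to at least one unknown cell. The grid encoding below (0 = free, 1 = occupied, -1 = unknown) is an assumption for illustration, not Robbie's actual map format:

```python
def find_frontiers(grid):
    """Return (row, col) of free cells bordering at least one unknown cell."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:          # only free cells can be frontiers
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1
                   for nr, nc in neighbors):
                frontiers.append((r, c))
    return frontiers

grid = [
    [0, 0, -1],
    [0, 1, -1],
    [0, 0,  0],
]
```

The robot repeatedly drives to the nearest frontier; once no frontiers remain, the reachable environment has been fully explored.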
The Internet of Things (IoT) remains one of the most relevant topics in industry and research, driven by the increasing demand for innovative services. Cost reductions in the manufacturing of IoT hardware and the development of completely new communication channels have led to billions of devices being connected to the internet. But to master this new IoT landscape, a standardized solution to these challenges must be developed: the IoT architecture.
This thesis examines the structure, purpose and requirements of IoT architecture models in the global IoT landscape and provides an overview of selected models. For that purpose, a structured literature analysis is conducted within this thesis, including an analysis of three existing research approaches that attempt to frame the topic and a tool-supported evaluation of IoT architecture literature with over 200 accessed documents.
Furthermore, 30 different IoT architecture models are coded with the help of the specialized coding tool ATLAS.ti 8. In a final step, these architecture models are categorized and compared with each other, showing that the environment of IoT and its architectures becomes ever more complex the further the research goes.
How do you present complicated technical matters simply and comprehensibly, so that ordinary users without deeper technical background knowledge can use them quickly, without a long familiarization period and lengthy explanations? This student research project addresses exactly this question: making the benefits of, and the work with, semantic (search) queries easier for non-computer-scientists, if not enabling them in the first place, and presenting Networked Graphs, a new development and SPARQL extension by Simon Schenk within the Staab working group at the University of Koblenz.
This thesis presents a system for creating and displaying stereoscopic video panoramas. In addition to the theoretical foundations, the structure and operation of this system are explained.
Special cameras are used that can record panoramas and are synchronized for playback. A renderer is then implemented that displays the panoramas stereoscopically using a virtual reality headset; separate recordings are made for the two eyes and played back separately. Finally, the resulting video panorama is compared with a panorama from an existing system.
The race for the best technologies to realize autonomous driving is in full swing worldwide.
Despite great efforts, however, autonomous navigation in structured and, above all, unstructured environments remains unsolved.
A decisive building block in this field is environment perception and analysis by means of suitable sensors and corresponding evaluation of the sensor data.
Imaging methods in the part of the spectrum visible to humans, in particular, are widely used both in practice and in research.
However, this exploits only a fraction of the electromagnetic spectrum, and consequently a large share of the information available for environment perception is ignored.
To make better use of the available spectrum, other research fields have employed so-called spectral sensors for decades; these analyze the electromagnetic spectrum much more finely and over a wider range than classical color cameras. Due to technical limitations, however, such systems could only capture static scenes. Thanks to so-called snapshot mosaic filter technology, recent advances in sensor technology now enable the spectral sampling of dynamic scenes.
This dissertation investigates the use and suitability of snapshot mosaic technology for environment perception and scene analysis in autonomous navigation in structured and unstructured environments. It examines whether the captured spectral data offer an advantage over classical RGB or grayscale data with respect to semantic scene analysis and classification.
First, a suitable preprocessing stage is developed that computes spectral values from the raw sensor data. The construction of novel datasets containing spectral data is then described. These datasets serve as the basis for evaluating various classifiers from classical machine learning.
Building on this, methods and architectures from deep learning are presented, and selected architectures are examined as to whether they can also be trained with spectral data. The use of deep learning methods for data compression is addressed as well; in a next step, the compressed data are used to train network architectures that were previously compatible only with RGB data. Finally, it is analyzed whether the high-dimensional spectral data offer advantages over RGB data in scene analysis.
Social media platforms such as Twitter or Reddit allow users almost unrestricted access to publish their opinions on recent events or discuss trending topics. While the majority of users approach these platforms innocently, some groups have set their minds on spreading misinformation and influencing or manipulating public opinion. These groups disguise themselves as native users from various countries to spread frequently manufactured articles and strongly polarizing opinions across the political spectrum, and may become providers of hate speech or extreme political positions. This thesis aims to implement an AutoML pipeline for identifying second-language speakers from English social media texts. We investigate style differences of texts on different topics and across the platforms Reddit and Twitter, and analyse linguistic features. We employ feature-based models with datasets from Reddit, which include mostly English conversations from European users, and from Twitter, newly created by collecting English tweets on selected trending topics in different countries. The pipeline classifies the language family, native language and origin (native or non-native English speaker) of a given textual input. We evaluate the resulting classifications by comparing the prediction accuracy, precision and F1 scores of our classification pipeline to traditional machine learning processes. Lastly, we compare the results from each dataset and find differences in language use across topics and platforms. We obtained high prediction accuracy for all categories on the Twitter dataset and observed high variance in features such as average text length, especially for Balto-Slavic countries.
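A minimal feature-based baseline of the kind such a pipeline compares against might look like the sketch below. The stylometric features (average token length plus a handful of function-word frequencies) and the nearest-centroid classifier are illustrative choices, not the thesis's actual features or models:

```python
from collections import Counter

FUNCTION_WORDS = ["the", "of", "a", "is", "to"]   # illustrative stylometric cues

def features(text):
    """Average token length plus relative frequencies of a few function words."""
    tokens = text.lower().split()
    n = max(len(tokens), 1)
    counts = Counter(tokens)
    avg_len = sum(len(t) for t in tokens) / n
    return [avg_len] + [counts[w] / n for w in FUNCTION_WORDS]

def nearest_centroid(train, query):
    """Assign the label whose mean feature vector is closest to the query."""
    q = features(query)
    best_label, best_dist = None, float("inf")
    for label, texts in train.items():
        vecs = [features(t) for t in texts]
        centroid = [sum(col) / len(vecs) for col in zip(*vecs)]
        dist = sum((a - b) ** 2 for a, b in zip(centroid, q))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy training texts standing in for the Reddit/Twitter corpora.
train = {
    "native":  ["the cat is on the mat", "a dog is in the park"],
    "learner": ["magnificent constructions everywhere",
                "extraordinarily complicated situations"],
}
```

An AutoML pipeline would search over many such feature sets and model families automatically instead of fixing them by hand.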
Will Eisner's graphic novels testify to a deep identification with Judaism as an ethnicity, religion and culture, and reflect Judaism in all its facets. It is particularly noteworthy that the development of his complete works runs parallel to the history of the emancipation of the Jewish population in New York City. The volume clusters the Jewish aspects of Eisner's work into, for example, factors of collective memory, the handling of and fight against antisemitism, and religious particularities.
As Enterprise 2.0 (E2.0) initiatives are gradually moving out of the early experimentation phase it is time to focus greater attention on examining the structures, processes and operations surrounding E2.0 projects. In this paper we present the findings of an empirical study to investigate and understand the reasons for initiating E2.0 projects and the benefits being derived from them. Our study comprises seven in-depth case studies of E2.0 implementations. We develop a classification and means of visualising the scope of E2.0 initiatives and use these methods to analyse and compare projects.
Our findings indicate a wide range of motivations and combinations of technology in use, and show a strong emphasis on the content management functionality of E2.0 technologies.
Remote Working Study 2022
(2022)
The Remote Working Study 2022 focuses on the transition to work from home (WFH) triggered by the stay-at-home directives of 2020. These directives required employees to work from their private premises wherever possible to reduce the transmission of the coronavirus. The study, conducted by the Center for Enterprise Information Research (CEIR) at the University of Koblenz from December 2021 to January 2022, explores the transition to remote working.
The objective of the survey is to collect baseline information about organisations’ remote work experiences during and immediately following the COVID-19 lockdowns. The survey was completed by the key persons responsible for the implementation and/or management of the digital workplace in 19 German and Swiss organisations.
The data presented in this report was collected from member organisations of the IndustryConnect initiative. IndustryConnect is a university-industry research programme that is coordinated by researchers from the University of Koblenz. It focuses on research in the areas of the digital workplace and enterprise collaboration technologies, and facilitates the generation of new research insights and the exchange of experiences among user companies.
The Internet of Things (IoT) is a concept in which connected physical objects are integrated into the virtual world to become active participants in business and everyday processes (Uckelmann, Harrison and Michahelles, 2011; Shrouf, Ordieres and Miragliotta, 2014). It is expected to have a major impact on businesses (National Intelligence Council, 2008), but small and medium enterprises' business models are threatened if they do not adopt the new concept (Sommer, 2015). Thus, this thesis showcases a sample implementation of connected devices in a small enterprise, demonstrating the added benefits for the business.
Design Science Research (DSR) is used to develop a prototype based on a use case provided by a carpentry business. The prototype comprises a hardware sensor and a web application that the wood shop can use to improve its processes. The thesis documents the iterative process of developing the prototype from the ground up to usable hardware and software.
This contribution provides an example of how IoT can be used and implemented at a small business.
Compared to conventional computer graphics (perspective projection), ray tracing offers decisive advantages, mainly owing to the comparatively high physical correctness of the method. Its weakness, however, lies in the immense computational cost.
A ray tracer is so computationally intensive because at least one ray must be shot for every pixel, and each ray must be intersected with all objects in the scene. Added to this are the rays that arise when rays are reflected off objects (recursion). To reduce this computational cost and additionally produce a better image, an adaptive sampler is to support the ray tracer. During rendering, the adaptive sampler observes the progressive refinement of the image and excludes from further computation those pixels for which shooting additional rays is no longer worthwhile.
Unlike the purely progressive ray tracer, the adaptive sampler stops computing once the image has converged. It is thus intended to produce a better image faster and thereby increase performance. Overall, the adaptive sampler is expected to pay off for certain scenes, among them scenes with purely diffusely lit image regions and scenes whose regions differ in computational cost. A normal ray tracer cannot judge how useful its shots are; it can only shoot more rays in the hope of effectively improving the image.
In many scenarios, however, a linearly increasing number of shots per pixel does not yield a uniform improvement across the image: some regions already look good while others are still very noisy. One therefore wants to exclude image regions that have already converged, in which further shots no longer make a noticeable difference, and spend the computing power where it is still needed.
It is important that pixels which have not converged far enough are not unintentionally excluded from the computation too early. The adaptive sampler is to keep working until every pixel permanently shows no further changes, i.e. until the probability of a significant colour change of a pixel from shooting one more ray (with several light sources in RenderGin, several rays per pixel) is small enough. No probability is actually computed internally; instead, the ray tracer is given a kind of memory: it stores the changes in the lit image and their history in dedicated memory images. The "memory" for the old image (the state of the image in the last iteration over the pixels) represents the short-term memory; it is exact. The long-term memory is represented by three further images: the first records the number of rays shot per pixel; the second is a boolean image indicating for each pixel whether it should still be included in the computation; the third records how often each pixel underwent a colour change smaller than the required maximum distance of a pixel to itself (before and after shooting another ray).
With these three images it is possible to take quantitative information into account in addition to the qualitative information from comparing the new and the old image.
This thesis answers the question of whether the desired effects occur and whether a performance gain is possible when the approach is integrated into the existing structure of RenderGin. The adaptive sampler was implemented as a plug-in for the software RenderGin by Numenus GmbH. RenderGin is a real-time capable, progressive ray tracer distinguished by its performance; image generation runs entirely on the CPU, with the graphics card needed only to display the generated image.
The plug-in was implemented in Microsoft Visual Studio 2010 using the RenderGin SDK of Numenus GmbH.
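The per-pixel bookkeeping described in this abstract (a short-term memory of the previous image plus three long-term memory images: ray counts, an active mask, and a counter of sub-threshold colour changes) can be sketched roughly as follows. This is an illustrative Python/NumPy re-creation under assumed thresholds, not the RenderGin plug-in; `shade` is a hypothetical stand-in for tracing one more ray.

```python
import numpy as np

# Illustrative sketch of the adaptive sampler's memory images.
# Pixel colours are grey values in [0, 1]; thresholds are assumed, not RenderGin's.
H, W = 4, 4
EPS = 0.01          # maximum allowed colour change of a pixel "to itself"
STABLE_NEEDED = 3   # consecutive sub-threshold changes before a pixel is frozen

old_image   = np.zeros((H, W))          # short-term memory: last iteration's image
ray_count   = np.zeros((H, W), int)     # long-term memory 1: rays shot per pixel
active      = np.ones((H, W), bool)     # long-term memory 2: still being sampled?
stable_runs = np.zeros((H, W), int)     # long-term memory 3: consecutive small changes

def shade(i, j, n):
    """Hypothetical stand-in for tracing a ray: converges to 0.5 with noise ~ 1/n."""
    rng = np.random.default_rng(i * W + j + n)
    return 0.5 + rng.normal(scale=0.2 / (n + 1))

for _ in range(50):
    if not active.any():
        break                           # every pixel has converged
    for i, j in zip(*np.nonzero(active)):
        ray_count[i, j] += 1
        new = shade(i, j, ray_count[i, j])
        if abs(new - old_image[i, j]) < EPS:
            stable_runs[i, j] += 1
            if stable_runs[i, j] >= STABLE_NEEDED:
                active[i, j] = False    # converged: exclude from further shots
        else:
            stable_runs[i, j] = 0       # a significant change resets the counter
        old_image[i, j] = new
```

The active mask is what redirects the shooting budget: frozen pixels cost nothing, so the remaining iterations spend all rays on the still-noisy regions.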
The present study focuses on geometry tasks and the demand levels ("Anforderungsbereiche") formulated in the national educational standards for mathematics in primary education, binding since 2004. These levels describe the cognitive demands placed on pupils when working on tasks, distinguishing between "reproducing", "making connections" and "generalising and reflecting" (KMK, 2005a, p. 13).
The three demand levels are intended, among other things, to give teachers the opportunity to develop a demand-oriented task culture. Furthermore, they are meant to encourage the integration of tasks from all three demand levels into lessons and to counteract one-sidedly oriented teaching.
Since the demand levels have not yet been empirically validated and are not clearly distinguished from task difficulty in the publications of the Kultusministerkonferenz (KMK, 2005a, p. 13; KMK, 2005b, p. 17; KMK, 2004b, p. 13), the present study first examined whether geometry tasks can be unambiguously assigned to the three demand levels.
Second, it investigated how the cognitive demands contained in the geometry tasks relate to the empirical difficulty of tasks, the mathematical proficiency of pupils, gender, and the demands of the tasks set in class.
Against the background of the calculus and skills orientation attributed to German mathematics teaching (Baumert et al., 2001, p. 296; Granzer & Walther, 2008, p. 9) and the associated strengths of German pupils in routine tasks and weaknesses in tasks with higher cognitive demands (Grassmann et al., 2014, p. 11; Reiss & Hammer, 2013, p. 82; Schütte, 2008, p. 41), the study also analysed how the written geometry tasks collected from textbooks and lessons are distributed across the three demand levels.
By examining geometry tasks, the quantitative share of geometry in the textbooks and lessons of the fourth grade could be determined on a sample basis, thereby updating and supplementing the state of research on the status of geometry teaching (Maier, 1999; Backe-Neuwald, 2000; Roick, Gölitz & Hasselhorn, 2004).
This work examines how the existing model for simulating cables and hoses can be advanced. To this end, the main influences on the shape simulation were investigated and the constraints and side conditions were analyzed. To validate its accuracy, the simulation has to be compared with the behavior of real specimens. To obtain a very precise digitalization of the shape, a laser scanner was used that converts the point cloud into a .vrml file, which can be imported into the simulation environment. The assumption was that the simulation method itself has the greatest impact on the simulated shape, which is why the capabilities of the most sophisticated methods were analyzed. Contrary to expectation, the main criterion for the success of a simulation approach proved not to be accuracy: process integration and usability turned out to matter more for efficient use. Other factors such as pricing, functionality and real-time capability were assessed as well. The analyzed methods are based on solving the equations of elasticity with different discretization schemes, finite elements and a spring-impulse system. Since the finite-element system takes several minutes to compute the shape and the spring-impulse system reacts to user manipulation with a delay, the competitiveness of these approaches is low. The remaining methods differ more in real-time performance, data interfaces and functionality than in accuracy. For the accuracy of a system, the consideration of other factors proved to be very important, chief among them the accurate assignment of material properties. Until the start of this work, only the finite-element approach addressed this factor, but without documentation or validation; in the publications on the other methods, the material properties are estimated so as to obtain a plausible simulated shape.
Therefore, the specific material values of real specimens were measured and assigned to the simulation. Comparison with the real shape proved that accuracy is very high with the measured properties. Since these measurements are very costly and time-consuming, a faster and cheaper way to obtain these values was investigated. It was assumed that, with knowledge of the cross-section, the specimen behavior can be computed. Since the braid distribution changes individually from specimen to specimen, a more general way to obtain the required values had to be found. The program composer was developed, into which only the number of the different braids and the taping are entered; it computes the stiffness, the density and the final diameter of the bundle with very high precision. With the measured values and the fit to the real shape, it was proven that the simulation approach reflects the precise behavior of cables and hoses. Since the stiffness of the single braids is laborious to measure, a measurement setup was created in which the stiffness has a large impact on the shape; with known density, the stiffness of the specimen can then be reconstructed precisely. Thus a fast and inexpensive way of obtaining the stiffness of a cable was devised. The Poisson's ratio of cables and bundles cannot be measured with a tensile test, since the inner structure is very complex. For hoses, the variation of the inner diameter was also measured during the tensile test; the resulting values were reasonable, but their accuracy could not be proven. For cables and hoses, it was attempted to obtain the Poisson's ratio by computing the cross-section, but the influence of individual variation and of the crosstalk between the braids is very high. Therefore a setup was constructed in which the torsion stiffness can be measured. For cables and hoses, the individual cross-sections and taping lead to varying results.
For hoses, good and repeatable values for the Poisson's ratio were obtained as expected. The low influence of the Poisson's ratio in the range between 0 and 0.5 was demonstrated; following the advice of [Old06] and our own experience, the Poisson's ratio for cables and bundles was therefore set to 0.25. With the knowledge of the measurability and the capabilities of the developed program composer, a procedure to obtain material properties for bundles was designed: 1. measure the braid density via pycnometer or via mass, length and diameter; 2. empirically reconstruct the stiffness with the designed setup; 3. compose the bundle with the program composer; 4. add a factor for the taping and transfer the values to the simulation. The model of the cable simulation was improved as follows: the main influences in the simulation of cables and hoses are the simulation method, the material properties and the geometric constraints. To obtain higher accuracy, an investigation of the correct material properties is indispensable. The scientific determination of material properties for the simulation of cables, bundles and hoses was performed for the first time, and the influence of geometric constraints was analyzed and documented. The next steps are the analysis of pre-deformation and further investigations into determining the Poisson's ratio with a more precise torsion test. All analyses were made with the simulation approach fleXengine; a comparison with other simulation methods would be of high interest.
Agricultural land use may lead to brief pulse exposures of pesticides in edge-of-field streams, potentially resulting in adverse effects on aquatic macrophytes, invertebrates and ecosystem functions. Higher-tier risk assessment is mainly based on pond mesocosms, which are not designed to mimic stream-typical conditions, and relatively little is known about exposure and effect assessment using stream mesocosms.
Thus, the present thesis evaluates the applicability of stream mesocosms for mimicking stream-typical pulse exposures, for assessing the resulting effects on flora and fauna, and for evaluating aquatic-terrestrial food web coupling. The first objective was to mimic stream-typical pulse exposure scenarios of different durations (≤ 1 to ≥ 24 hours). These exposure scenarios, established using a fluorescence tracer, were the methodological basis for the effect assessment of an herbicide and an insecticide. In order to evaluate the applicability of stream mesocosms for regulatory purposes, the second objective was to assess effects on two aquatic macrophytes following a 24-h pulse exposure to the herbicide iofensulfuron-sodium (1, 3, 10 and 30 µg/L; n = 3). Growth inhibition of up to 66 and 45% was observed for the total shoot length of Myriophyllum spicatum and Elodea canadensis, respectively; recovery of this endpoint was demonstrated within 42 days for both macrophytes. The third objective was to assess effects on structural and functional endpoints following a 6-h pulse exposure to the pyrethroid ether etofenprox (0.05, 0.5 and 5 µg/L; n = 4). The most sensitive structural (abundance of Cloeon simile) and functional (feeding rate of Asellus aquaticus) endpoints revealed significant effects at 0.05 µg/L etofenprox. This concentration is below field-measured etofenprox concentrations, suggesting that pulse exposures adversely affect invertebrate populations and ecosystem functions in streams. Such pollution of streams may also reduce the emergence of aquatic insects and potentially lead to an insect-mediated transfer of pollutants to adjacent food webs. Test systems capable of assessing aquatic-terrestrial effects are not yet integrated in mesocosm approaches but might be of interest for substances with bioaccumulation potential. Here, the fourth part provides an aquatic-terrestrial model ecosystem capable of assessing cross-ecosystem effects.
Information on the riparian food web such as the contribution of aquatic (up to 71%) and terrestrial (up to 29%) insect prey to the diet of the riparian spider Tetragnatha extensa was assessed via stable isotope ratios (δ13C and δ15N). Thus, the present thesis provides the methodological basis to assess aquatic-terrestrial pollutant transfer and effects on the riparian food web.
Overall, the results of this thesis indicate that stream mesocosms can be used to mimic stream-typical pulse exposures of pesticides, to assess the resulting effects on macrophytes and invertebrates within prospective environmental risk assessment (ERA), and to evaluate changes in riparian food webs.
Since software influences nearly every aspect of everyday life, the security of software systems is more important than ever before. Evaluating the security of a software system still poses a significant challenge in practice, mostly due to the lack of metrics that can map the security properties of source code onto numeric values. It is a common assumption that the occurrence of security vulnerabilities and the quality of the software design are directly correlated, but there is currently no clear evidence to support this. Proof of such a correlation could help to optimize the measurement of program security, making it possible to apply quality measurements to evaluate it. For this purpose, this work evaluates fifty open-source Android applications using three security and seven quality metrics, and considers the correlations between the metrics. The quality metrics range from simple code metrics to high-level metrics such as object-oriented anti-patterns, which together provide a comprehensive picture of quality. Two visibility metrics, along with a metric that computes the minimal permission request for mobile applications, were selected to capture security. Across the evaluated projects, a clear correlation was found between most quality metrics; by contrast, no significant correlations were found for the security metrics. This work discusses the correlations and their causes as well as further recommendations based on the findings.
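A pairwise metric correlation analysis of this kind can be sketched with SciPy's Spearman rank correlation. The metric values below are made-up toy numbers for five hypothetical apps, chosen only to illustrate the contrast between a strong quality/quality correlation and a weaker quality/security one; they are not the thesis' data.

```python
# Illustrative correlation analysis between quality and security metrics,
# using Spearman rank correlation on synthetic toy values
# (the thesis' fifty real apps and ten metrics are not reproduced here).
from scipy.stats import spearmanr

# Hypothetical per-app metric values for five apps.
loc            = [1200, 3400, 560, 8900, 2100]   # quality: lines of code
anti_patterns  = [3, 9, 1, 22, 5]                # quality: anti-pattern count
min_permission = [2, 4, 3, 5, 1]                 # security: minimal permission requests

rho_q, p_q = spearmanr(loc, anti_patterns)       # quality vs. quality
rho_s, p_s = spearmanr(loc, min_permission)      # quality vs. security

print(f"quality/quality  rho={rho_q:.2f} (p={p_q:.3f})")
print(f"quality/security rho={rho_s:.2f} (p={p_s:.3f})")
```

Rank correlation is a natural choice here because code metrics are rarely normally distributed, so monotone association matters more than linear fit.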
The work presented in this thesis investigated interactions of selected biophysical processes that affect zooplankton ecology at smaller scales. In this endeavour, the extent of changes in swimming behaviour and fluid disturbances produced by swimming Daphnia in response to changing physical environments were quantified. In the first research question addressed within this context, size and energetics of hydrodynamic trails produced by Daphnia swimming in non-stratified still waters were characterized and quantified as a function of organisms’ size and their swimming patterns.
The results revealed that neither the size nor the swimming pattern of Daphnia affects the width of induced trails or the dissipation rates. Nevertheless, as the size and swimming velocity of the organisms increased, trail volume increased in proportion to the cube of the Reynolds number, and the biggest trail volume was about 500 times the body volume of the largest daphnids. The larger spatial extent of fluid perturbation and the prolonged decay period caused by bigger trail volumes would play a significant role in zooplankton ecology, e.g. by increasing the risk of predation.
The study also found that increased trail volume brought about significantly enhanced total dissipated power at higher Reynolds numbers, with magnitudes of the total dissipated power in the range of (1.3–10) × 10⁻⁹ W.
Furthermore, this study provided strong evidence that swimming speed of Daphnia and total dissipated power in Daphnia trails exceeded those of some other selected zooplankton species.
In recognizing turbulence as an intrinsic environmental perturbation in aquatic habitats, this thesis also examined the response of Daphnia to a range of turbulent flows corresponding to turbulence levels that zooplankton generally encounter in their habitats. Results indicated that, within the range of turbulent intensities to which Daphnia are likely to be exposed in their natural habitats, increasing turbulence compelled the organisms to enhance their swimming activity and swimming speed. However, as the turbulence increased to extremely high values (10⁻⁴ m² s⁻³), Daphnia began to withdraw from their active swimming behaviour. The findings also demonstrated that the threshold level of turbulence at which the animals start to scale back their largely active swimming is about 10⁻⁶ m² s⁻³. The study further illustrated that, in the intermediate range of turbulence (10⁻⁷–10⁻⁶ m² s⁻³), the kinetic energy dissipation rate in the vicinity of the organisms is consistently one order of magnitude higher than that of the background turbulent flow.
Swarming, a common and conspicuous behavioural trait of many zooplankton species, is considered to play a significant role in the freshwater ecology of their habitats, from food exploitation and mate encounter to predator avoidance via the hydrodynamic flow structures the swarms produce. This thesis therefore also investigated the implications of Daphnia swarms at varied abundance and swarm densities for their swimming kinematics and the induced flow field.
The results showed that Daphnia aggregated in swarms with swarm densities of (1.1–2.3) × 10³ L⁻¹, exceeding the abundance densities (1.7–6.7 L⁻¹) by two orders of magnitude. The estimated swarm volume decreased from 52 cm³ to 6.5 cm³, and the mean neighbour distance dropped from 9.9 to 6.4 body lengths. The mean swimming trajectories were primarily horizontal concentric circles around the light source, and mean flow speeds were found to be one order of magnitude lower than the corresponding swimming speeds of Daphnia. Furthermore, this study provided evidence that the flow fields produced by swarming Daphnia differed considerably between unidirectional vortex swarming at low abundance and bidirectional swimming at high abundance.
Problematic smartphone use behavior (PSU) and excessive consumption are increasingly reported. In this study, an experiment was developed to investigate, in repeated measurements, the influence of screen coloration via the grayscale setting on smartphone usage time. We also investigated how individuals' perceived suffering correlates with smartphone usage time and PSU, and whether differences exist by smartphone usage type (social, process, habitual). 240 subjects completed a questionnaire about smartphone usage time, PSU, perceived suffering, and smartphone usage types. Afterward, their smartphones were switched to the grayscale setting for at least 24 h, after which 92 of these participants completed a second questionnaire. Analyses showed that the grayscale setting decreases usage time and that there is a positive correlation between PSU, smartphone usage duration, and perceived suffering. The types of use (process and habitual) influence one's perceived suffering. This shows that individuals are aware of their PSU and suffer from it, and that the grayscale setting is effective in reducing smartphone usage time.
Performance of RIP-MTI
(2009)
This diploma thesis deals with the performance of RIP-MTI, in particular the performance of its loop detection. The aim of the work is to examine the time required by RIP-MTI's loop detection and to uncover and solve problems that can occur when detecting loops.
The maintenance strategy "predictive maintenance", which is characterized by predicting the failure behavior of technical units based on modern sensor technology, plays a key role in smart factories against the background of Industry 4.0. This paper evaluates the current state of research on this strategy and gives an overview of its areas of application to date. With the aid of a qualitative video analysis, the implementation in the industries and company divisions involved and the types of goods monitored are examined. The analyzed video clips were uploaded to YouTube, for example for marketing purposes, by various companies with different perspectives on predictive maintenance. The video analysis was realized by applying a previously defined coding plan to the video material. The results show a predominant application in the manufacturing industry, where predictive maintenance is used to monitor plants and machines. In addition, the strategy is mainly applied to means of transport used for freight and passenger services in various infrastructures. The video analysis also makes visible the currently high need to explain predictive maintenance; from these explanations, one also learns about the features that distinguish it from other maintenance strategies.
More than 10,000 organic chemicals such as pharmaceuticals, ingredients of personal care products and biocides are ubiquitously used in every day life. After their application, many of these chemicals enter the domestic sewer. Research has shown that conventional biological wastewater treatment in municipal wastewater treatment plants (WWTPs) is an insufficient barrier for the release of most of these anthropogenic chemicals into the receiving waters.
This bears unforeseen risks for aquatic wildlife and drinking water resources. Especially for recently introduced and/or detected compounds (so called emerging micropollutants), there is a growing need to investigate the occurrence and fate in WWTPs. In order to get a comprehensive picture on the behavior in municipal wastewater treatment, the following groups of emerging organic micropollutants, spanning a broad range of applications and physico-chemical properties, were selected as target compounds: pharmaceuticals (beta blockers, psycho-active drugs), UV-filters, vulcanization accelerators (benzothiazoles), biocides (anti-dandruffs, preservatives, disinfectants) and pesticides (phenylurea and triazine herbicides).
Research on male choral societies in nineteenth-century Germany attests to their social and political relevance. The so-called Sängerwesen (singers' movement) made a substantial contribution to nation-building in Germany: through their singing and their activities in the club and in public, the singers contributed to the inner unification of the population and thus helped to form a unified nation. By contrast, almost nothing was known about the social and political background of the male choral movement in the Palatinate during the same period. To close this gap, the present work investigates the history of the male choral movement of the Palatinate, particularly with regard to its significance for German nation-building. The period under investigation extends from 1816, the year in which the Palatinate became Bavarian territory, to the founding of the German Empire in 1871.
First, the development of the Palatine singers' movement is outlined with respect to the number of clubs founded in individual years and places and to the celebration of local and regional singers' festivals, in comparison with the Palatinate's music society movement in the same period. This view of the Palatine male choral movement as a whole is followed by an examination of its constituent parts, the people and events within individual singing clubs and within particular periods, against the background of the respective political and social situation. Of particular importance in this context are the effects of the major political and social events, the Hambach Festival of 1832 and the Revolution of 1848/49, on the amateur musical club cultures.
Finally, the individual phenomena and the overall development are related to one another. The appendix of the study contains, in addition to excerpts from the minute books of the "Cäcilienverein-Liedertafel Dürkheim" and posters of Palatine music and singers' festivals of the 1840s, overview tables with information on the Palatine music festivals of the nineteenth century and on the singing clubs founded in the period under investigation, as well as maps of the spatial distribution of the clubs and the score of the Masonic "Weihelied" by the Kaiserslautern seminary teacher Philipp Walter.
Organic Quality Management (OQM) has its roots in "Natural Church Development" (NCD). In an international research project, the Protestant theologian Christian A. Schwarz and the psychologist Christoph Schalk investigated whether there are universally valid principles for the qualitative and quantitative growth of church congregations. This study became one of the largest research projects ever conducted on church growth: by February 2011, 71,512 profiles had been compiled for congregations of the most diverse orientations and denominations in over 70 countries.
The result of this ongoing study offers a scientifically verifiable answer to the question: what are the growth principles that hold independently of culture, theological orientation and style of piety, and can these principles also be applied positively to non-profit organisations and secular businesses? Chapter 1 first describes the origin and development of OQM, i.e. the process of transferring the insights from the development of church congregations to the reality of church organisations and later to the requirements of a business enterprise. Chapter 2 describes the eight quality characteristics in detail. Here the focus is less on the nouns (leadership, staff, structures, relationships, etc.) than on the adjectives (empowering, gift-oriented, purposeful, trusting, etc.): they describe what really matters in change processes.
At the start of the project it was not foreseeable whether, and in what way, the quality characteristics and the Christian social principles would prove themselves as success factors in other types of organisation as well. Chapter 3 presents practical examples showing that a dialogically working OQM can help not only church congregations but also charitable organisations and even businesses to grow and thus to be successful.
Generalized methods for automated theorem proving can be used to compute formula transformations such as projection elimination and knowledge compilation. We present a framework based on clausal tableaux suited for such tasks. These tableaux are characterized independently of particular construction methods, but important features of empirically successful methods are taken into account, especially dependency-directed backjumping and branch-local operation. As an instance of that framework, an adaptation of DPLL is described. We show that knowledge compilation methods can be substantially improved by weaving projection elimination partially into the compilation phase.
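The abstract names an adaptation of DPLL as one instance of the tableau framework. Purely as a hedged illustration of the plain DPLL procedure being adapted (not the thesis's actual implementation, which works on clausal tableaux with backjumping), a minimal satisfiability check might look like this:

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL satisfiability check on CNF clauses.

    Clauses are lists of integer literals (negative = negated).
    Returns a satisfying assignment (set of literals) or None.
    """
    if assignment is None:
        assignment = set()
    # Simplify: drop satisfied clauses, strip falsified literals.
    simplified = []
    for clause in clauses:
        if any(lit in assignment for lit in clause):
            continue  # clause already satisfied
        rest = [lit for lit in clause if -lit not in assignment]
        if not rest:
            return None  # empty clause: conflict under this assignment
        simplified.append(rest)
    if not simplified:
        return assignment  # all clauses satisfied
    # Unit propagation: a one-literal clause forces its literal.
    for clause in simplified:
        if len(clause) == 1:
            return dpll(simplified, assignment | {clause[0]})
    # Branch on the first literal of the first open clause.
    lit = simplified[0][0]
    for choice in (lit, -lit):
        result = dpll(simplified, assignment | {choice})
        if result is not None:
            return result
    return None
```

The full framework described above additionally threads projection elimination and compilation output through such a search, which this sketch omits.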
Clubs such as the Scouts rely on the work of their volunteer members, who have a variety of tasks to accomplish. Their organizing teams and offices often change at short notice, so planning knowledge is lost and inexperienced members take over the planning. Since these special requirements are not covered by existing tools, ScOuT, a planning tool for club administration, is designed and developed in this work to support clubs with regard to these problems. The focus was on identifying and applying suitable guidelines and heuristic methods to create a usable interface. The developed product was evaluated empirically in a user survey in terms of usability.
The results of this study show that including the guidelines and methods already achieved a high degree of the desired goal. It can be concluded that, with the help of user-specific design concepts and the application of suitable guidelines and methods, a sound basis for a usable application to support clubs can be created.
The Microsoft Kinect made it possible for the first time to capture synchronized color and depth data (RGB-D) without large financial outlay, opening up new research opportunities. As technology advances, mobile devices are also becoming increasingly capable. Lenovo and Asus offer the first commercially available devices with RGB-D perception. With localization, environment recognition and depth perception integrated through Google's Tango platform, first tests have been carried out in various areas of computer vision, e.g. mapping. This thesis examines to what extent a Tango device is suitable for object recognition. RGB-D data are extracted from the Tango device's output and processed for object recognition. An overview of the current state of research and of the basics of the Tango platform is given, and existing approaches and methods for object recognition on mobile devices are examined. The implemented recognition is trained and tested on a self-created database of RGB-D images. Besides presenting the results, improvements and extensions to the recognition are proposed.
Online handwriting recognition of Chinese characters on Android-capable mobile devices
(2014)
Using mobile dictionaries or translators requires input that has to be processed and recognized first. Chinese characters are better suited to handwritten input than to keyboard input, because they consist mostly of pictograms or ideograms.
This thesis deals with the implementation of a prototypical recognition system on a mobile device. The recognition runs online, i.e. while the user is writing, which saves time because suggestions are made at runtime.
Basics and an overview of the current state of the art in online handwriting recognition are given. An approach is chosen and implemented such that recognition is fast and requires little memory. The implementation is tested, showing that fast recognition is possible on small devices. Suggestions for extensions and improvements are given, including future work.
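The abstract does not name the chosen approach. One common way to keep online recognition fast and memory-light, sketched here purely as an assumed illustration (not the thesis's code), is to resample each stroke to a fixed number of points and compare it against templates with a cheap pointwise distance; `resample` and `stroke_distance` are hypothetical helpers:

```python
import math

def resample(points, n=16):
    """Resample a stroke (list of (x, y)) to n evenly spaced points."""
    # Cumulative arc length along the stroke.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    result, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(points) - 2 and dists[j + 1] < target:
            j += 1
        span = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / span
        x = points[j][0] + t * (points[j + 1][0] - points[j][0])
        y = points[j][1] + t * (points[j + 1][1] - points[j][1])
        result.append((x, y))
    return result

def stroke_distance(a, b, n=16):
    """Mean pointwise distance between two resampled strokes."""
    ra, rb = resample(a, n), resample(b, n)
    return sum(math.hypot(x0 - x1, y0 - y1)
               for (x0, y0), (x1, y1) in zip(ra, rb)) / n
```

Because each template shrinks to a fixed n points, candidate characters can be ranked after every stroke, which is what makes runtime suggestions feasible on small devices.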
Forwarding loops
(2013)
The motivation for this work was to give students in the computer pools of the University of Koblenz the opportunity to work with the simulation software VNUML (Virtual Network User Mode Linux). The software is used in the lectures and exercises for Computer Networks I and II, which makes working with it unavoidable for the students. In the past, however, there were repeated problems with installation and setup on private computers, even though simplified installation routines had been developed several times in earlier student research projects. A further obstacle to using VNUML is the fact that the software only runs in a Linux environment. Since not all students use Linux and many shy away from installing it just to use VNUML, it had long been planned to make the software available on the university's computers. This thesis describes the process by which VNUML was installed in the computer pools, the problems that arose, and the alternatives to the chosen approach. The result also offers a very simple installation for private users without requiring a standalone Linux installation. Further improvements were made during development, increasing the user-friendliness of the final solution. The possibilities and ideas are so manifold that the working group will continue to pursue this topic and further optimizations can be made.
Since the beginning of the World Wide Web, the creation and distribution of digital assets has changed decisively. Creating, editing, distributing and consuming them no longer requires special physical equipment. As a result, the speed at which media are generated and transported has increased enormously. The possibilities for cooperation have likewise changed, or in some places were made possible in the first place.
Although the Internet decoupled digital assets from their physical carrier media, copyright law still applies. Especially for legally inexperienced users, this creates uncertainty about how a particular digital asset may be used. On the other hand, many users carry the familiar practice of sharing media over into the digital environment. Copyright infringements that previously took place on a small scale in private settings now happen globally and visibly to all. Since this form of sharing threatens the primary business model of rights exploiters, attempts are made to restrict the use of digital assets or to prevent it for unauthorized users, among other things by means of Digital Rights Management (DRM).
These mechanisms are controversial among users or are even openly rejected, as they can make using digital assets harder than using their physical counterparts. Moreover, many of these mechanisms proved insecure, and the encryption schemes used were broken. Usage Rights Management (URM) retains the core principle of DRM, but its practical implementation takes a different direction: the user gets full control over the digital assets (without the restrictive measures of classical DRM implementations), but also full responsibility again. The user is supported by software that informs them about their legal options and, at their request, also imposes technical restrictions on usage, similar to the enforcement in classical DRM systems.
URM uses the open rights expression language ODRL. This student research project is part of the URM project of the IT-Risk-Management research group, which in turn is part of the SOAVIWA project. Its goal is to develop a Java class that maps licenses written in ODRL to Java objects. Further components manage these objects and allow existing objects to be modified and new ones to be created. All components are to be part of the initially implemented Toolkit for URM (TURM).
This diploma thesis describes the concept and implementation of a software router for policy-based Internet regulation. It is based on the ontology InFO described by Kasten and Scherp. InFO is designed for a system-independent description of regulation mechanisms. Additionally, InFO enables transparent regulation by linking background information to the regulation mechanisms. The InFO extension RFCO extends the ontology with router-specific entities. A software router is developed to implement RFCO at the IP level. The regulation is made transparent by letting the router inform affected users about the regulation measures. The router implementation is tested exemplarily in a virtual network environment.
The decline of biodiversity can be observed worldwide and its consequences are alarming. It is therefore crucial that nature be protected and, where possible, restored. A wide variety of project options are possible, yet given limited resources, selecting the most efficient measures is increasingly important. For this purpose, information is still lacking. As outlined in the next paragraph, this pertains in particular to information at different scales of projects.
Firstly, there is a lack of information on the concrete added value of biodiversity protection projects. Secondly, there is a lack of information on the actual impacts of such projects and on the costs and benefits associated with a project. Finally, there is a lack of information on the links between the design of a project, the associated framework conditions and the perception of specific impacts. This paper addresses this knowledge gap by providing more information on the three scales by means of three empirical studies on three different biodiversity protection projects in order to help optimize future projects.
The first study “Assessing the trade-offs in more nature-friendly mosquito control in the Upper Rhine region” examines the added value of a more nature-friendly mosquito control in the Upper Rhine Valley of Germany using a contingent valuation method. Recent studies show that the widely used biocide Bti, which is used as the main mosquito control agent in many parts of the world, has more negative effects on nature than previously expected. However, it is not yet clear whether the population supports a more nature-friendly mosquito control, as such an adaptation could potentially lead to higher nuisance. This study attempts to answer this question by assessing the willingness to pay for an adapted mosquito control strategy that reduces the use of Bti, while maintaining nuisance protection within settlements. The results show that the majority of the surveyed population attaches a high value to a more nature-friendly mosquito control and is willing to accept a higher nuisance outside of the villages.
The second study “Inner city river restoration projects: the role of project components for acceptance” examines the acceptance of a river restoration project in Rhineland-Palatinate, Germany. Despite much effort, many rivers worldwide are still in poor condition. Therefore, a rapid implementation of river restoration projects is of great importance. In this context, acceptance by society plays a fundamental role, however, the factors determining such acceptance are still poorly understood. In particular, the complex interplay between the acceptance or rejection of specific project components and the acceptance of the overall project require further exploration. This study addresses this knowledge gap by assessing the acceptance of the project, its various ecological and social components, and the perception of real and fictitious costs as well as the benefits of the components. Our findings demonstrate that while acceptance of the overall project is generally rather high, many respondents reject one or more of the project's components. Complementary social project components, like a playground, find less support than purely ecological components. Overall, our research shows that complementary components may increase or decrease acceptance of the overall project. We, furthermore, found that differences in the acceptance of the individual components depend on individual concerns, such as perceived flood risk, construction costs, expected noise and littering as well as the quality of communication, attachment to the site, and the age of the respondents.
The third study “What determines preferences for semi-natural habitats in agrarian landscapes? A choice-modelling approach across two countries using attributes characterizing vegetation” investigates people's aesthetic preferences for semi-natural habitats in agricultural landscapes. The EU-Common Agricultural Policy promotes the introduction of woody and grassy semi-natural habitats (SNH) in agricultural landscapes. While the benefits of these structures in terms of regulating ecosystem services are already well understood, the effects of SNH on visual landscape quality is still not clear. This study investigates the factors determining people’s visual preferences in the context of grassy and woody SNH elements in Swiss and Hungarian landscapes using picture-based choice experiments. The results suggest that respondents’ choices strongly depend on specific vegetation characteristics that appear and disappear over the year. In particular, flowers as a source of colours and green vegetation as well as ordered structure and the proportion of uncovered soil in the picture play an important role regarding respondents’ aesthetic perceptions of the pictures.
The three empirical studies can help to make future projects in the study areas of biodiversity protection more efficient. While this thesis highlights the importance of exploring biodiversity protection projects at different scales, further analyses of the different scales of biodiversity protection projects are needed to provide a sound basis to develop guidance on identifying the most efficient biodiversity protection projects.
A tool for quickly creating individual typefaces for immediate needs would be a helpful instrument for graphic designers and typographers. Such a tool can hardly be required to produce good typefaces, as that lies in the hands of the designer; it should, however, offer anyone interested in the subject an easy entry into type design. This thesis therefore attempts to provide as simple a solution as possible to the complex topic of type design.
Thousands of chemicals from daily use are discharged from civilization into the water cycle via different pathways. Ingredients of personal care products, detergents, pharmaceuticals, pesticides, and industrial chemicals thus find their way into aquatic ecosystems and may cause adverse ecological impacts. Pharmaceuticals, for instance, represent a central group of anthropogenic chemicals because of their designed potency to interfere with physiological functions in organisms. Ecotoxicological effects of pharmaceutical burdens have been verified in the past. Therapeutic groups with pronounced endocrine-disrupting potential, such as steroid hormones, gain increasing focus in environmental research, as they have been reported to cause endocrine disruption in aquatic organisms even at environmentally relevant concentrations. This thesis comprises a comprehensive investigation of the occurrence of corticosteroids and progestogens in wastewater treatment plant (WWTP) effluents and surface waters, as well as the elucidation of the fate and biodegradability of these steroid families during activated sludge treatment. For the first goal of the thesis, a robust and highly sensitive analytical method based on liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed to simultaneously determine the occurrence of around 60 mineralocorticoids, glucocorticoids and progestogens in the aquatic environment. A special focus was placed on compound selection due to the diversity of marketed synthetic steroids. Several analytical challenges were addressed with individual approaches regarding sensitivity enhancement and compound stability. These results may be important for further research in the environmental analysis of steroid hormones.
Reliable and low quantification limits are the prerequisite for determining corticosteroids and progestogens at relevant concentrations, given low consumption volumes and simultaneously low effect-based trigger values. The achieved quantification limits for all target analytes ranged between 0.02 ng/L and 0.5 ng/L in surface water and 0.05 ng/L to 5 ng/L in WWTP effluents. This sensitivity enabled the detection of three mineralocorticoids, 23 glucocorticoids and 10 progestogens within the sampling campaign across Germany. Many of them were detected in the environment for the first time, particularly in Germany and the EU. To the best of our knowledge, this in-depth steroid screening provided a good overview of individual steroid burdens and, for the first time, allowed the predominant steroids of each analyzed steroid type to be identified. The frequent detection of highly potent synthetic steroids (e.g. triamcinolone acetonide, clobetasol propionate, betamethasone valerate, dienogest, cyproterone acetate) highlighted insufficient removal during conventional wastewater treatment and indicated the need for regulation to control their emission, since the steroid concentrations were found to be above the reported effect-based trigger values for biota. Overall, the study revealed reliable environmental data on poorly studied or previously unanalyzed steroids. The results complement the existing knowledge in this field but also provide new information which can be used particularly for compound prioritization in ecotoxicological research and environmental analysis.
Based on the data obtained from the monitoring campaign, incubation experiments were conducted to enable the comparison of biodegradability and transformation processes in activated sludge treatment for structurally related steroids under aerobic and standardized experimental conditions. The compounds were carefully selected to cover manifold structural moieties of commonly used glucocorticoids, including non-halogenated and halogenated steroids, their mono- and diesters, and several acetonide-type steroids. This approach allowed for a structure-based interpretation of the results. The obtained biodegradation rate constants suggested large variations in biodegradability (half-lives ranged from < 0.5 h to > 14 d). Increasing stability was identified in the order from non-halogenated steroids (e.g. hydrocortisone), over 9α-halogenated steroids (e.g. betamethasone), to C17-monoesters (e.g. betamethasone 17-valerate, clobetasol propionate), and finally to acetonides (e.g. triamcinolone acetonide), suggesting a strong relationship between biodegradability and glucocorticoid structure.
Some explanations for this behavior were obtained by identifying the transformation products (TPs) and elucidating individual transformation pathways. For the first time, the results allowed the likelihood of transformation reactions to be linked to the chemical steroid structure. Among the identified TPs, the carboxylates (e.g. TPs of fluticasone propionate and triamcinolone acetonide) proved persistent in the subsequent incubation experiments. The newly identified TPs were furthermore frequently detected in the effluents of full-scale wastewater treatment plants. These findings emphasized i) the transferability of the lab-scale degradation experiments to the real world and ii) that insufficient removal may cause adverse effects in the aquatic environment, owing to the ability of the precursor steroids and TPs to interact with the endocrine system of biota. For the last goal, the conceptual study for glucocorticoids was applied to progestogens.
Here, two sub-types of the steroid family frequently used for hormonal contraception were selected (the 17α-hydroxyprogesterone and 19-nortestosterone types). The progestogens showed fast and complete degradation within six hours, indicating pronounced biodegradability. However, cyproterone acetate and dienogest were found to be more recalcitrant in activated sludge treatment. This was consistent with their ubiquitous occurrence during the previous monitoring campaign. The elucidation of TPs again revealed crucial information regarding the observed behavior and furthermore highlighted the formation of hazardous TPs. It was shown that 19-nortestosterone-type steroids can undergo aromatization at ring A in contact with activated sludge, leading to the formation of estrogen-like TPs with a phenolic moiety at ring A. In the case of norethisterone, the formation of 17α-ethinylestradiol was confirmed, a well-known potent synthetic estrogen with elevated ecotoxicological potency. Thus, the results indicated for the very first time an unknown source of estrogenic compounds, particularly of 17α-ethinylestradiol. In conclusion, depending on their chemical structure, some steroids were found to be very stable in activated sludge treatment, others degrade well, and others degrade but predominantly to active TPs. Fluorinated acetonide steroids such as triamcinolone acetonide and fluocinolone acetonide are poorly biodegradable, which is reflected in the high concentrations detected ubiquitously in WWTP effluents. Endogenous steroids and their closely related synthetic analogues such as hydrocortisone, prednisolone or 17α-hydroxyprogesterone are readily biodegradable. Regardless of their high influent concentrations, they are almost completely removed in conventional WWTPs.
Steroids between these extremes were found to form elevated quantities of TPs that are partially still active; this is particularly the case for betamethasone, fluticasone propionate, cyproterone acetate and dienogest. The thesis illustrates the need for an extensive evaluation of the environmental risks and shows that corticosteroids and progestogens merit more attention in environmental regulation and research than is currently the case.
Instructor feedback on written assignments is one of the most important elements in the writing process, especially for students writing in English as a foreign language. However, students are often critical of both the amount and quality of the feedback they receive. In order to better understand what makes feedback effective, this study explored the nature of students’ assessments of the educational alliance, and how their receptivity to, perceptions of, and decisions about using their instructors’ feedback differed depending on how strong they believed the educational alliance to be. This exploratory case study found that students not only assessed the quality of the educational alliance based on goal compatibility, task relevance, and teacher effectiveness, but that there was also a reciprocal relationship between these elements. Furthermore, students’ perceptions of the educational alliance directly influenced how they perceived the feedback, which made the instructor’s choice of feedback method largely irrelevant. Stronger educational alliances resulted in higher instances of critical engagement, intrinsic motivation, and feelings of self-efficacy. The multidirectional influence of goal, task, and bond means that instructors who want to maximize their feedback efforts need to attend to all three.
Today’s agriculture heavily relies on pesticides to manage diverse pests and maximise crop yields. Despite elaborate regulation of pesticide use based on a complex environmental risk assessment (ERA) scheme, the widespread use of these biologically active compounds has been shown to be a threat to the environment. For surface waters, pesticide exposure has been observed to exceed safe concentration levels and negatively impact stream ecology leading to the question whether current ERA schemes ensure a sustainable use of pesticides. To answer this, the large-scale “Kleingewässer-Monitoring” (KgM) assessed the occurrence of pesticides and related effects in 124 streams throughout Germany, Central Europe, in 2018 and 2019.
Based on five scientific publications originating from the KgM, this thesis evaluated pesticide exposure in streams, ecological effects and the regulatory implications. More than 1,000 water samples were analysed for over 100 pesticide analytes to characterise occurrence patterns (publication 1). Measured concentrations and effects were used to validate the exposure and effect concentrations predicted in the ERA (publication 2). By jointly analysing real-world pesticide application data and measured pesticide mixtures in streams, the disregard of environmental pesticide mixtures in the ERA was evaluated (publication 3). The toxic potential of mixtures in stream water was additionally investigated using suspect screening for 395 chemicals and a battery of in-vitro bioassays (publication 4). Finally, the results from the KgM stream monitoring were used to assess the capability to identify pesticide risks in governmental monitoring programmes (publication 5).
The results of this thesis reveal the widespread occurrence of pesticides in non-target stream ecosystems. The water samples contained a variety of pesticides occurring in complex mixtures, predominantly in short-term peaks after rainfall events (publications 1 & 4). Respective pesticide concentration maxima were linked to declines in vulnerable invertebrate species and exceeded regulatory acceptable concentrations in about 80% of agricultural streams, while these thresholds were still estimated to be partly insufficient to protect the invertebrate community (publication 2). The co-occurrence of pesticides in streams led to a risk underestimated in the single-substance-oriented ERA by a factor of about 3.2 in realistic worst-case scenarios, which is further exacerbated by the high frequency at which non-target organisms are exposed to pesticides (publication 3). Stream water samples taken after rainfall caused distinct effects in bioassays which were only explainable to a minor extent by the many analytes, indicating the relevance of unknown chemical or biological mixture components (publication 4). Finally, the regulatory monitoring of surface waters under the Water Framework Directive (WFD) was found to significantly underestimate pesticide risks, as about three quarters of critical pesticides and more than half of streams at risk were overlooked (publication 5).
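The mixture-risk underestimation described above can be illustrated with the standard concentration-addition idea: a single-substance assessment looks only at the largest individual risk quotient, while organisms in a stream experience the sum of all quotients. The following sketch uses made-up concentrations and regulatory acceptable concentrations (RACs), for illustration only, not data from the study:

```python
def risk_quotients(sample, thresholds):
    """Per-substance risk quotients: measured concentration / RAC."""
    return {s: sample[s] / thresholds[s] for s in sample}

def mixture_ratio(sample, thresholds):
    """Factor by which the single-substance view underestimates
    mixture risk: sum of risk quotients over the largest quotient."""
    rq = risk_quotients(sample, thresholds)
    return sum(rq.values()) / max(rq.values())

# Hypothetical stream sample: measured concentrations and RACs
# in µg/L, chosen for illustration only.
sample = {"A": 0.10, "B": 0.06, "C": 0.04}
racs = {"A": 0.20, "B": 0.10, "C": 0.05}
```

Here no single substance exceeds its RAC (all quotients below 1), yet the summed quotient is well above the largest single one, which is the kind of gap the thesis quantifies at about 3.2 for realistic worst-case scenarios.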
Essentially, this thesis involves a new level of validation of the ERA of pesticides in aquatic ecosystems by assessing pesticide occurrence and environmental impacts at a scale so far unique. The overall results demonstrate that the current agricultural use of pesticides leads to significant impacts on stream ecology that go beyond the level tolerated under the ERA. This thesis identified the underestimation of pesticide exposure, the potential insufficiency of regulatory thresholds and the general inertia of the authorisation process as the main causes why the ERA fails to meet its objectives. To achieve a sustainable use of pesticides, the thesis proposes substantial refinements of the ERA. Adequate monitoring programmes such as the KgM, which go beyond current government monitoring efforts, will continue to be needed to keep pesticide regulators constantly informed of the validity of their prospective ERA, which will always be subject to uncertainty.
To construct a business process model manually is a highly complex and error-prone task that takes a lot of time and deep insight into the organizational structure, its operations and business rules. To improve the output of business analysts dealing with this process, researchers have introduced different techniques to support them during construction with helpful recommendations. These supporting recommendation systems vary in what they recommend in the first place as well as in the calculations taking place under the hood to recommend the element that fits the user best. After a broad introduction to the field of business process modeling and its basic recommendation structures, this work takes a closer look at diverse proposals and descriptions published in current literature regarding implementation strategies to effectively and efficiently assist modelers during business process model creation. A critical analysis of the selected literature points out the strengths and weaknesses of the approaches, studies and their descriptions. As a result, the final concept matrix in this work gives a precise and helpful overview of the key features and recommendation methods used and implemented in previous research studies, pinpointing an entry into future work without the downsides already spotted by fellow researchers.
The thesis develops and evaluates a hypothetical model of the factors that influence user acceptance of weblog technology. Previous acceptance studies are reviewed, and the various models employed are discussed. The eventual model is based on the technology acceptance model (TAM) by Davis et al. It conceptualizes and operationalizes a quantitative survey conducted by means of an online questionnaire, strictly from a user perspective. Finally, it is tested and validated by applying methods of data analysis.
This bachelor thesis deals with the conception, implementation and evaluation of a Jump'n'Run game and with the influence of achievement systems on players. In the game Age of Tunes you play Bardur, the beardless bard, and have to try to free the cursed magical creatures in the world of Harmonica. The emphasis of the thesis was on the clean conception and gradual development of the game, appealing graphic quality, the integration of opponents, a mini-game, and the consideration of the effects of an achievement system on players. In a final evaluation, the game and player behavior with regard to the achievements were assessed.
There are a few systems for gaze tracking, both high- and low-cost. Low-cost systems typically come with low-resolution cameras; since image quality is poor, the algorithms for detecting the gaze have to work more precisely. But how can they be tested and analysed correctly when the image quality is bad and no reference point is known? The idea of this work is to generate synthetic eye images whose reference points are known, because they are largely set manually, and then to test and analyse the algorithms on these synthetic images. By switching on features like Gaussian noise or a second glint-like reflection point, the synthetic images can be stepwise approximated to reality. The experiments lead to an improvement of the algorithms used in a low-resolution system environment.
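A minimal sketch of the synthetic-image idea described above (image size, intensities and noise level are illustrative assumptions, not the thesis's actual parameters): render a dark pupil and a bright glint at known positions, optionally add Gaussian noise, and score a detector against the exact ground truth.

```python
import numpy as np

def synthetic_eye(size=64, pupil=(32, 30), pupil_r=8,
                  glint=(36, 34), glint_r=2, noise_sigma=0.0, seed=0):
    """Grayscale eye image with known pupil/glint centres.

    Returns (image, ground_truth) where ground_truth holds the
    exact positions used for rendering.
    """
    yy, xx = np.mgrid[0:size, 0:size]
    img = np.full((size, size), 0.6)  # uniform iris/sclera background
    # Dark pupil disc, then bright glint drawn on top of it.
    img[(xx - pupil[0]) ** 2 + (yy - pupil[1]) ** 2 <= pupil_r ** 2] = 0.05
    img[(xx - glint[0]) ** 2 + (yy - glint[1]) ** 2 <= glint_r ** 2] = 0.95
    if noise_sigma > 0:
        rng = np.random.default_rng(seed)
        img = np.clip(img + rng.normal(0, noise_sigma, img.shape), 0, 1)
    return img, {"pupil": pupil, "glint": glint}

def detect_pupil(img):
    """Naive reference detector: centroid of the darkest pixels."""
    ys, xs = np.nonzero(img < 0.3)
    return xs.mean(), ys.mean()
```

A detector under test can then be compared pixel-exactly against the known centres, and features such as noise or a second glint can be switched on one at a time to see where the algorithm starts to fail.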
The annotation of digital media is not a new area of research; on the contrary, it has been widely investigated, and there are many innovative ideas for designing the annotation process. The most extensive segment of related work concerns semi-automatic annotation. One characteristic is common to the related work: none of it puts the user in focus. To build an interface that supports and satisfies the user, a user evaluation has to be done first. Within this thesis we analyze which features an interface should or should not have in order to meet these requirements of support, user satisfaction and intuitiveness. After collecting many ideas and discussing them with a team of experts, we selected only a few of them. Different combinations of these selected variables form the interfaces we investigate in our usability study. The results of the study lead to the assumption that autocompletion and suggestion features support the user. Furthermore, coloring tags to group them into categories does not disturb the user and even tends to be supportive. The same tendency emerges for an interface consisting of two user interface elements. An example is also given of how definitions of being intuitive differ. This thesis leads to the conclusion that, for reasons of user satisfaction and support, it is permissible to deviate from classical annotation interface features, and that further usability studies should be conducted in the area of annotation interfaces.
Groundwater is essential for the provision of drinking water in many areas around the world. The ecosystem services provided by groundwater-related organisms are crucial for the quality of groundwater-bearing aquifers. Therefore, if remediation of contaminated groundwater is necessary, the remediation method has to be carefully selected to avoid risk-risk trade-offs that might impact these valuable ecosystems. In the present thesis, the ecotoxicity of the in situ remediation agent Carbo-Iron (a composite of zero-valent nano-iron and activated carbon) was investigated, an estimation of its environmental risk was performed, and the risk and benefit of a groundwater remediation with Carbo-Iron were comprehensively analysed.
At the beginning of the work on the present thesis, a sound assessment of the environmental risks of nanomaterials was impeded by a lack of guidance documents, resulting in many uncertainties in the selection of suitable test methods and a low comparability of test results from different studies with similar nanomaterials. The reasons for the low comparability lay in methodological aspects of the testing procedures before and during toxicity testing. Therefore, decision trees were developed as a tool to systematically decide on ecotoxicity test procedures for nanomaterials. Potential effects of Carbo-Iron on embryonic, juvenile and adult life stages of zebrafish (Danio rerio) and the amphipod Hyalella azteca were investigated in acute and chronic tests. These tests were based on existing OECD and EPA test guidelines (OECD, 1992a, 2013a, 2013b; US EPA, 2000) to facilitate the use of the obtained effect data in the risk assessment. Additionally, the uptake of particles into the test organisms was investigated using microscopic methods. In zebrafish embryos, effects of Carbo-Iron on gene expression were investigated. The obtained ecotoxicity data were complemented by studies with the waterflea Daphnia magna, the alga Scenedesmus vacuolatus, larvae of the insect species Chironomus riparius, and nitrifying soil microorganisms.
In the fish embryo test, no passage of Carbo-Iron particles into the perivitelline space or the embryo was observed. In D. rerio and H. azteca, Carbo-Iron was detected in the gut at the end of exposure, but no passage into the surrounding tissue was detected. Carbo-Iron had no significant effect on soil microorganisms and on survival and growth of fish. However, it had significant effects on the growth, feeding rate and reproduction of H. azteca and on survival and reproduction in D. magna. Additionally, the development rate of C. riparius and the cell volume of S. vacuolatus were negatively influenced.
A predicted no-effect concentration (PNEC) of 0.1 mg/L was derived from the ecotoxicity studies, based on the no-effect level determined in the reproduction test with D. magna and an assessment factor of 10. It was compared to measured and modelled environmental concentrations of Carbo-Iron after its application to an aquifer contaminated with chlorinated hydrocarbons in a field study. Based on these concentrations, risk quotients were derived. Additionally, the overall environmental risk before and after the Carbo-Iron application was assessed to verify whether the chance of a risk-risk trade-off arising from the remediation of the contaminated site could be minimized. With the data used in the present study, a reduced environmental risk was identified after the application of Carbo-Iron. Thus, the benefit of remediation with Carbo-Iron outweighs potential negative effects on the environment.
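The risk characterization summarised above follows the standard PEC/PNEC scheme: the no-effect level is divided by an assessment factor to obtain the PNEC, and the risk quotient compares an environmental concentration against it. A minimal sketch; the NOEC of 1.0 mg/L and the PEC of 0.02 mg/L are illustrative assumptions, since the abstract states only the resulting PNEC of 0.1 mg/L and the assessment factor of 10:

```python
def pnec(noec_mg_per_l: float, assessment_factor: float = 10.0) -> float:
    """Predicted no-effect concentration: no-effect level / assessment factor."""
    return noec_mg_per_l / assessment_factor

def risk_quotient(pec_mg_per_l: float, pnec_mg_per_l: float) -> float:
    """RQ = PEC / PNEC; an RQ >= 1 indicates a potential environmental risk."""
    return pec_mg_per_l / pnec_mg_per_l

pnec_value = pnec(1.0)                 # 1.0 mg/L NOEC (assumed) / 10 -> 0.1 mg/L
rq = risk_quotient(0.02, pnec_value)   # hypothetical measured PEC of 0.02 mg/L
```

With a risk quotient below 1, no unacceptable risk would be indicated for the assumed exposure; the thesis applies this comparison to the measured and modelled field concentrations.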
The aim of this work was to implement in Java the operational approach to model transformation presented in [Rhe06], using the libraries "JGraLab" and "GReQL" available at the Institute for Software Technology (IST) of the University of Koblenz-Landau. The implementation was intended to prove that the proposed transformation approach is feasible in practice, which was demonstrated by several examples. The planned use in further projects of the IST will show whether additional transformations can be realized or where the limits of the approach lie. Furthermore, it is conceivable to run the transformations not in two steps (schema transformation before graph transformation) but in a single combined step; this, however, requires that JGraLab supports it as well.
Digitalisation and the ongoing development of media are core processes of the current digital age. For companies to benefit from technical progress, their employees must have, or are expected to acquire, the relevant skills. Companies therefore face the task of not being overwhelmed by the mass of innovations and opportunities and, in the best case, of using them to improve their own performance.
Small and medium-sized enterprises represent 99% of all enterprises in Germany. However, it has not been established how the majority of small enterprises and their employees participate in this development. The research question therefore consists of two parts: on the one hand, "Is the promotion of employees' skills taken care of in micro-enterprises?" and, on the other hand, "Where are the opportunities and challenges for companies of this size?"
In order to answer the research question, a qualitative research method was used, the guideline-based interview. The interviewed companies were all in the media and IT sector. Thus, the recorded and transcribed data provided a real insight into the current situation in micro-enterprises.
The responses to the interviews showed that companies with very small numbers of employees are more dependent on their employees than others. So, the commitment of the employees is decisive for the success of the company itself. It is the management's task to promote this and ensure employee satisfaction.
Companies that pay more attention to employee development are therefore more attractive to career starters who need and/or want to develop themselves and broaden their horizon of experience.
Problems in the analysis of requirements often lead to failures in the development of software systems. This problem is nowadays addressed by requirements engineering. The early involvement of all kinds of stakeholders in the development of such a system, together with a structured process to elicit and analyse requirements, has made it a crucial first step in software development. The increasing complexity of modern software systems, however, leads to a rising amount of information that has to be dealt with during analysis. Without the support of appropriate tools this would be almost impossible. Especially in bigger projects, which tend to be spatially distributed, effective requirements engineering could not be implemented without this kind of support. Today there is a wide range of tools dealing with this matter. They have been in use for some time now and, in their most recent versions, realize the most important aspects of requirements engineering. Within the scope of this thesis, some of these tools are analysed, focussing both on the major functionalities concerning the management of requirements and on the repositories of these tools. The results of this analysis are integrated into a reference model.
This master's thesis is an exploratory study of whether it is feasible to construct a subjectivity lexicon using Wikipedia. The key hypothesis is that all quotes in Wikipedia are subjective and all regular text is objective. The degree of subjectivity of a word, its "Quote Score", is determined as the ratio of the word's frequency inside quotations to its frequency outside quotations. The proportion of words in the English Wikipedia which are within quotations is found to be much smaller than the proportion which are not, resulting in a right-skewed distribution and a low mean Quote Score.
The methodology used to generate the subjectivity lexicon from text corpus in English Wikipedia is designed in such a way that it can be scaled and reused to produce similar subjectivity lexica of other languages. This is achieved by abstaining from domain and language-specific methods, apart from using only readily-available English dictionary packages to detect and exclude stopwords and non-English words in the Wikipedia text corpus.
The subjectivity lexicon generated from English Wikipedia is compared against other lexica; namely MPQA and SentiWordNet. It is found that words which are strongly subjective tend to have high Quote Scores in the subjectivity lexicon generated from English Wikipedia. There is a large observable difference between distribution of Quote Scores for words classified as strongly subjective versus distribution of Quote Scores for words classified as weakly subjective and objective. However, weakly subjective and objective words cannot be differentiated clearly based on Quote Score. In addition to that, a questionnaire is commissioned as an exploratory approach to investigate whether subjectivity lexicon generated from Wikipedia could be used to extend the coverage of words of existing lexica.
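The Quote Score described above (in-quote frequency divided by out-of-quote frequency) can be illustrated with a small sketch. The tokenization, the quote-splitting heuristic, and the decision to skip words that never occur outside quotes are simplifying assumptions for illustration, not the thesis' actual method:

```python
from collections import Counter
import re

def quote_scores(text: str) -> dict:
    """Quote Score per word: frequency inside quotation marks divided by
    frequency outside them. Words never seen outside quotes are skipped
    here to keep the sketch simple."""
    inside, outside = Counter(), Counter()
    # Split on straight double quotes: with balanced quotes, the
    # odd-indexed segments are the quoted passages.
    for i, segment in enumerate(text.split('"')):
        words = re.findall(r"[a-z']+", segment.lower())
        (inside if i % 2 == 1 else outside).update(words)
    return {w: inside[w] / outside[w] for w in inside if outside[w] > 0}

sample = 'The report was factual. He called it "an utterly terrible report".'
scores = quote_scores(sample)
# "report" occurs once inside and once outside the quotes -> score 1.0
```

On a full corpus, the skew reported above would appear naturally: most words occur far more often outside quotations, pushing the bulk of scores toward zero.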
Pelagic oxyclines, the transition zone between oxygen-rich surface waters and oxygen-depleted deep waters, are a common characteristic of eutrophic lakes during summer stratification. They can have tremendous effects on the biodiversity and ecosystem functioning of lakes and, to make matters worse, are expected to become more frequent and more pronounced as climate warming progresses. On these grounds, this thesis endeavors to advance the understanding of the formation, persistence, and consequences of pelagic oxyclines: we test whether the formation of metalimnetic oxygen minima is intrinsically tied to a locally enhanced oxygen-consuming process, investigate the relative importance of vertical physical oxygen transport and biochemical oxygen consumption for the persistence of pelagic oxyclines, and finally assess their potential consequences for whole-lake cycling. To pursue these objectives, the thesis relies almost exclusively on in situ measurements. Field campaigns were conducted at three lakes in Germany featuring different types of oxyclines and resolved either a short (hours to days) or a long (weeks to months) time scale. Measurements comprised temperature, current velocity, and concentrations of oxygen and reduced substances in high temporal and vertical resolution. Additionally, vertical transport was estimated by applying the eddy correlation technique within the pelagic region for the first time. The thesis revealed that the formation of metalimnetic oxygen minima does not necessarily depend on locally enhanced oxygen depletion, but can result solely from gradients and curvatures of oxygen concentration and depletion and their relative position to each other. Physical oxygen transport was found to be relevant for oxycline persistence when it considerably postponed anoxia on a long time scale. However, its influence on oxygen dynamics was minor on short time scales, although mixing and transport were highly variable.
Biochemical consumption always dominated the fate of oxygen in pelagic oxyclines. It was primarily determined by the oxidative breakdown of organic matter originating from the epilimnion, whereas in meromictic lakes, the oxidation of reduced substances dominated. Beyond that, the results of the thesis emphasize that pelagic oxyclines can be a hotspot of mineralization and, hence, short-circuit carbon and nutrient cycling in the upper part of the water column. Overall, the present thesis highlights the importance of considering physical transport as well as biochemical cycling in future studies.
This thesis addresses the implementation of a particle simulation of an explosion. The simulation is displayed via ray tracing in near real time, and the implementation makes use of the OpenCL standard. The focus of research in this thesis is to analyse the performance of this combination of components.
Fresh water resources like rivers and reservoirs are exposed to a drastically changing world. In order to safeguard these lentic ecosystems, they need stronger protection in times of global change and population growth. In recent years, the exploitation pressure on drinking water reservoirs has increased steadily worldwide. Besides securing the demands of a safe drinking water supply, international laws, especially in Europe (EU Water Framework Directive), stipulate minimizing the impact of dams on downstream rivers. In this study we investigate the potential of a smart withdrawal strategy at Grosse Dhuenn Reservoir to improve the temperature and discharge regime downstream without jeopardizing drinking water production. Our aim is to improve the existing withdrawal strategy for operating the reservoir in a sustainable way in terms of water quality and quantity. First, we set up and calibrated a 1D numerical model for Grosse Dhuenn Reservoir with the open-source community model “General Lake Model” (GLM) together with its water quality module, the “Aquatic Ecodynamics” library (AED2). The reservoir model reproduced water temperatures and hypolimnetic dissolved oxygen concentrations accurately over a 5-year period. Second, we extended the model source code with a selective withdrawal functionality (adaptive offtake) and added operational rules for realistic reservoir management. The model is now able to autonomously determine the best withdrawal height according to the temperature and flow requirements of the downstream river and the raw water quality objectives. Criteria for the determination of the withdrawal regime are selective withdrawal, the development of stratification, and the oxygen content in the deep hypolimnion. This functionality is not available in current reservoir models, where withdrawal heights are generally provided a priori to the model and kept fixed during the simulation.
Third, we ran scenario simulations identifying an improved reservoir withdrawal strategy to balance the demands for downstream river and raw water supply. Therefore we aimed at finding an optimal parallel withdrawal ratio between cold hypolimnetic water and warm epilimnetic or metalimnetic water in order to provide a pre-defined temperature in the downstream river. The reservoir model and the proposed withdrawal strategy provide a simple and efficient tool to optimize reservoir management in a multi-objective view for mastering future reservoir management challenges.
This thesis deals with quality assurance of model-based SRS, in particular SRS models and SRS diagrams. The interesting thing about model-based SRS is that they are generated by a documentation generator from the following input data: the SRS model, SRS diagrams and texts external to the model. To assure the quality of the documentation, the quality of its four factors must therefore be assured: the SRS model, the SRS diagrams, the external texts and the documentation generator. The thesis' goal is to define a quality concept for SRS models and diagrams and to show an approach for automatically realizing quality testing, measurement and assessment for the modelling tool Innovator.
Previous research concerned with early science education revealed that guided play can support young children’s knowledge acquisition. However, the questions whether guided play maintains other important prerequisites such as children’s science self-concept and how guided play should be implemented remain unanswered. The present dissertation encompasses three research articles that investigated 5- to 6-year-old children’s science knowledge, science theories, and science self-concept in the stability domain and their relation to interindividual prerequisites. Moreover, the articles examined whether children’s science knowledge, science theories, and science self-concept can be supported by different play forms, i.e., guided play with material and verbal scaffolds, guided play with material scaffolds, and free play. The general introduction of the present dissertation first highlights children’s cognitive development, their science self-concept, and interindividual prerequisites, i.e., fluid and crystallised intelligence, mental rotation ability, and interest in block play. These prerequisites are applied to possible ways of supporting children during play. The first article focused on the measurement of 5-to-6-year-old children’s stability knowledge and its relation to interindividual prerequisites. Results suggested that children’s stability knowledge could be measured reliably and validly, and was related to their fluid and crystallised intelligence. The second article was concerned with the development of children’s intuitive stability theories over three points of measurement and the effects of guided and free play, children’s prior theories as well as their intelligence on these intuitive theories. Results implied that guided play with material and verbal scaffolds supported children’s stability theories more than the other two play forms, i.e., guided play with material scaffolds and free play. 
Moreover, the consistency of children’s prior theories and their fluid and crystallised intelligence were related to children’s theory adaptation after the intervention. The third article focused on the effect of the playful interventions on children’s stability knowledge and science self-concept over three points of measurement. Furthermore, the reciprocal effects between knowledge acquisition and science self-concept were investigated. Results implied that guided play supported knowledge acquisition and maintained children’s science self-concept, whereas free play did not support children’s stability knowledge and decreased their science self-concept. No evidence for reciprocal effects between children’s stability knowledge and their science self-concept was found. Finally, in a general discussion, the findings of the three articles are combined and reflected upon in the context of children’s cognitive development. In summary, the present dissertation shows that children’s science knowledge, science theories, and science self-concept can be supported through guided play that takes children’s cognitive development into account.
Deformable Snow Rendering
(2019)
Accurate snow simulation is key to capturing snow's iconic visuals. Intricate methods exist that attempt to grasp snow behaviour in a holistic manner, but their computational complexity prevents them from reaching real-time performance. This thesis presents three GPU-based techniques that focus on the deformation of a snow surface in real time. The approaches are examined with respect to their ability to scale with an increasing number of deformation actors and their visual portrayal of snow deformation. The findings indicate that the approaches maintain real-time performance well into several hundred individual deformation actors; however, each approach has individual restrictions that handicap the visual results. An experimental approach is to combine the techniques at a reduced deformation actor count to benefit from the detailed, merged deformation pattern.
German politicians have identified a need for greater citizen involvement in decision-making than in the past, as confirmed by a recent study of German parliamentarians ("DEUPAS"). As in other forms of social interaction, the Internet provides significant potential to serve as the digital interface between citizens and decision-makers: in the recent past, dedicated electronic participation ("e-participation") platforms (e.g. dedicated websites) have been provided by politicians and governments in an attempt to gather citizens' feedback and comments on a particular issue or subject. Some of these have been successful, but a large proportion of them are grossly under-used; often only small numbers of citizens use them. Over the same period, society's enthusiasm for social networks has increased and their use is now commonplace. Many citizens use social networks such as Facebook and Twitter for all kinds of purposes, and in some cases to discuss political issues.
Social networks are therefore obviously attractive to politicians: from local government to federal agencies, politicians have integrated social media into their daily work. However, there is a significant challenge regarding the usefulness of social networks: the continuous increase in digital information. Social networks contain vast amounts of information, and it is impossible for a human to manually filter the relevant information from the irrelevant (so-called "information overload"). Even using the search tools provided by social networks, it is still a huge task for a human to extract meanings and themes from the multitude of search results. New technologies and concepts have been proposed that provide summaries of masses of information through lexical analysis of social media messages, and therefore they promise an easy and quick overview of the information.
This thesis examines the relevance of the results of these analyses for use in everyday political life, with the emphasis on the social networks Facebook and Twitter as data sources. We make use of the WeGov Toolbox and its analysis components, which were developed during the EU project WeGov. The assessment was performed in consultation with actual policy-makers from different levels of German government: policy-makers from the German Federal Parliament, the State Parliament of North Rhine-Westphalia, the State Chancellery of the Saarland and the cities of Cologne and Kempten took part in the study. Our method was to execute the analyses on data collected from Facebook and Twitter and present the results to the policy-makers, who then evaluated them using a mixture of qualitative methods.
The responses of the participants have provided us with some useful conclusions:
1) None of the participants believe that e-participation is possible in this way, but they confirm that "citizen-friendliness" can be supported by this approach.
2) The most likely users of the summarisation tools are those who have experience with social networks but are not "power users". The reason is that "power users" already know the relevant information provided by the analysis tools, while without any experience of social networks it is hard to interpret the analysis results correctly.
3) The evaluation considered geographical aspects, relating them, for example, to a politician's constituency as a local area of social networks. Comparing rural with urban areas shows that the amount of relevant political information in rural areas is low: while the proportion of publicly available information in urban areas is relatively high, the proportion in rural areas is much lower.
The findings that result from the engagement with policy-makers will be systematically surveyed and validated within this thesis.
This thesis describes the integration of a business intelligence solution into an existing social software. First, the terms business intelligence and social software, their architecture and their components are explained. This is followed by an analysis of the current (as-is) situation of the target group through interviews, the results of which are transformed into a list of requirements in the target (to-be) concept. Finally, the elicited requirements are checked and tested against the final installation to determine whether the expectations of the target group and their notions of business intelligence can be realized.
The result of this work is to be an installed business intelligence solution within a social software. It is intended to give an overview of what is already possible with the latest version of the software and to critically point out strengths and weaknesses that should be considered in future versions.
Retrospective Analysis of the Spread of Web Tracking and Its Dynamic Detection through Sandboxing
(2018)
Current quantitative analyses of web tracking do not provide a comprehensive overview of its origin, spread and development. By evaluating archived web pages, this thesis enables a retrospective reconstruction of the history of web tracking between the years 2000 and 2015. For this purpose, a suitable tool was designed, implemented, evaluated and used to analyse 10,000 websites. While in 2005 an average of 1.17 third-party resources were embedded per page, this number rose to 6.61 over the following 10 years. Network diagrams visualize the trend towards a monopolized network structure in which a single company can already monitor 80% of Internet usage.
Despite numerous attempts to counter this development through technical measures, only few self-protection and system-protection measures prove effective, and these often come at the cost of reduced website functionality or restricted browser usability. The presented study shows that legal regulations do not provide sufficient protection either. Websites of educational institutions were found to be deficient in fulfilling their data-protection obligations, as evidenced by missing, erroneous or incomplete privacy policies, whose provision is among the information duties of a service provider.
As a further study demonstrates, considering only classical trackers is not sufficient. By openly providing functional website components, a tracking company can increase its coverage from 38% to 61%. This situation is substantiated by measurements of websites from the healthcare sector and assessed from both a technical and a legal perspective.
Existing systemic tools for detecting web tracking use the browser's interfaces for their measurements. This thesis presents DisTrack, a framework for web-tracking analysis that follows a sandbox-based measurement methodology. This approach, successfully applied in dynamic malware analysis, specializes in detecting side effects on the surrounding system. The resulting behavioral analysis, which operates independently of the browser's interfaces, enables a holistic examination of the browser. In this way, systemic weaknesses in the browser that can be exploited by storage-based web-tracking techniques can be revealed.
An accurate diagnosis of the current state of knowledge of individual students is necessary in order to intervene adequately in group work processes. This connection is widely assumed in the literature, but so far there are hardly any empirical studies substantiating it. The present work focuses on the intervention behavior of pre-service teachers. The process-diagnostic skill of "interpreting" is used as a basis in order to trace differences in intervention behavior back to this skill. Reflection is considered helpful both for building diagnostic skills and for (further) developing one's own teaching practice. Accordingly, the interplay between process diagnosis and reflection behavior, as well as between intervention behavior and reflection behavior, is also examined.
To assess the process-diagnostic skill of "interpreting", three video vignettes were created and embedded in the video diagnosis tool ViviAn. Each vignette shows four students working on the topic of algebraic terms. Within a teaching-learning lab, all participating pre-service teachers were asked to work on the video vignettes over the course of four semesters. In groups of three, they also designed a lab station in the mathematics lab "Mathe ist mehr" and tested it with a school class. The pre-service teachers' interventions in the students' group work processes were videotaped. Afterwards, the pre-service teachers reflected in small groups on the trials and on the interventions they had made; these reflection sessions were also videotaped.
The results show that the participants, who were in their master's studies at the time of the survey, still have room for development regarding their process-diagnostic skill of "interpreting". With respect to the interventions, responsive interventions were more often appropriate than invasive ones, and responsive interventions also led comparatively more often to more students being active after the intervention. However, participants with a higher process-diagnostic skill of "interpreting" intervened invasively more often, while still making more appropriate and more activating interventions than their fellow students. The process-diagnostic skill of "interpreting" thus appears to have a positive effect on the interventions and should therefore be trained more intensively already during (teacher education) studies.
Model-Driven Engineering (MDE) aims to raise the level of abstraction in software system specifications and increase automation in software development. Modelware technological spaces contain the languages and tools for MDE that software developers take into consideration to model systems and domains. Ontoware technological spaces contain ontology languages and technologies to design, query, and reason on knowledge. With the advent of the Semantic Web, ontologies are now being used within the field of software development as well. In this thesis, bridging technologies are developed to combine two technological spaces in general. Transformation bridges translate models between spaces, mapping bridges relate different models between two spaces, and integration bridges merge spaces into new, all-embracing technological spaces. API bridges establish interoperability between the tools used in the spaces. In particular, this thesis focuses on the combination of modelware and ontoware technological spaces. Subsequent to a sound comparison of languages and tools in both spaces, the integration bridge is used to build a common technological space, which allows for the hybrid use of languages and the interoperable use of tools. The new space allows for language and domain engineering. Ontology-based software languages may be designed in the new space, where syntax and formal semantics are defined with the support of ontology languages, and the correctness of language models is ensured by the use of ontology reasoning technologies. These languages represent a core means for exploiting expressive ontology reasoning in the software modeling domain, while remaining flexible enough to accommodate varying needs of software modelers. Application domains are conceptually described by languages that allow for defining domain instances and types within one domain model.
Integrated ontology languages can provide formal semantics for domain-specific languages, and ontology technologies allow reasoning over types and instances in domain models. A scenario in which configurations for network device families are modeled illustrates the approaches discussed in this thesis. Finally, the implementation of all bridging technologies for combining technological spaces, as well as of all tools for ontology-based language engineering and use, is presented.
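The core idea of a transformation bridge — translating a modelware artifact into ontoware axioms — can be sketched in a few lines. The metamodel structure, the function name, and the axiom encoding below are illustrative assumptions, not the thesis's actual implementation; a common mapping turns each metamodel class into an ontology class and each attribute into a datatype property.

```python
# Hypothetical sketch of a modelware-to-ontoware transformation bridge.
# A tiny metamodel (classes with typed attributes) is translated into
# OWL-style axioms: one Class axiom per metamodel class and one
# DatatypeProperty axiom per attribute.

def model_to_ontology(metamodel):
    """Translate a metamodel dict into a list of ontology axiom tuples."""
    axioms = []
    for cls in metamodel["classes"]:
        axioms.append(("Class", cls["name"]))
        for attr, dtype in cls["attributes"].items():
            # (kind, property name, domain class, range datatype)
            axioms.append(("DatatypeProperty", attr, cls["name"], dtype))
    return axioms

# Example in the spirit of the thesis's network-device scenario
# (names are invented for illustration).
device_metamodel = {
    "classes": [
        {"name": "NetworkDevice",
         "attributes": {"hostname": "string", "portCount": "integer"}},
    ]
}

axioms = model_to_ontology(device_metamodel)
```

Once the model lives in the ontoware space, a reasoner can check the resulting axioms for consistency — which is exactly the benefit the integration bridge is meant to unlock.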
The master's thesis by Benjamin Waldmann, entitled "Flusskrebse in Deutschland – Aktueller Stand der Verbreitung heimischer und invasiver gebietsfremder Flusskrebse in Deutschland; Überblick über die erfolgten Schutzmaßnahmen und den damit verbundenen Erfahrungen; Vernetzung der Akteure im Flusskrebsschutz", presents, for the first time, distribution maps for Germany covering all native and non-native crayfish species (decapods). The thesis is based on extensive research and queries on the distribution of the species in the federal states, addressed to the responsible authorities, institutions, species experts, and private individuals. The raw data were quality-checked and then processed and displayed in a geographic information system, so that nationwide distribution maps could be produced for each species on a ten-kilometre grid (UTM grid in the ETRS89 reference system). In addition, likewise on the basis of extensive research and queries, the various options for protective measures for native crayfish populations were presented and evaluated, and recommendations were derived from them. Particular attention was paid to the management of invasive non-native crayfish species and to dealing with the crayfish plague (Aphanomyces astaci). Finally, recommendations were given for networking the actors involved in crayfish conservation, and the contact persons in the individual federal states were listed.
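The grid aggregation step described above — assigning each record to a ten-kilometre cell of the UTM grid — reduces to integer arithmetic on projected coordinates. The sketch below is a minimal illustration under that assumption; zone handling and datum transformation from the source coordinates to ETRS89/UTM are omitted, and the function name is invented.

```python
# Hypothetical sketch: map a projected UTM coordinate (ETRS89, metres)
# to the south-west corner of its 10 km x 10 km grid cell by integer
# division. Real GIS workflows would also track the UTM zone.

def grid_cell_10km(easting_m, northing_m):
    """Return the (easting, northing) of the containing 10 km cell's corner."""
    cell = 10_000  # cell size in metres
    return (easting_m // cell * cell, northing_m // cell * cell)

# Two nearby records fall into the same mapping cell:
cell_a = grid_cell_10km(417_345, 5_583_210)
cell_b = grid_cell_10km(412_001, 5_589_999)
```

Aggregating records by such cell keys is what turns point observations into the nationwide raster maps the thesis describes.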
Abstract of the print book: Walden, R. (2008). Architekturpsychologie: Schule, Hochschule und Bürogebäude der Zukunft. Lengerich: Pabst Science Publishers. People generally desire "control" over their environmental conditions (cf. Flammer, 1990; Burger, 1992). This need is expressed in the form of self-designed architecture and the self-regulation of stressors. For this reason, the concept of environmental control is applied as the central criterion for built environments in all three case studies on a school, a university, and an office building. The survey methods Programming (user-oriented program development), User-Needs Analysis, and Post-Occupancy Evaluation are explained with regard to their significance for Building Performance Evaluation (Preiser & Schramm, 1997; 2005). The "Koblenz Architecture Questionnaire" is presented in three variants as an instrument for assessing built environments. For each of the three studies, selected results on two different environmental areas are reported.
The effects of architecture on user performance (cf. BOSTI studies, 1984, 2001) are determined in the empirical part in three studies of innovative buildings (Waldorf School in Cologne, University in Koblenz, Post Tower in Bonn). Performance is measured in terms of (1) learning and work efficiency, (2) well-being, and (3) environmental control, as well as (4) social behavior in the case of the school study, and with the help of 16 (office building) and 21 (university) psychological criteria for organizational success. One aim of all three studies is to test the theoretical concept of "environmental control" and to derive recommendations for improving renovations and new buildings. Two central research questions are therefore: In a User-Needs Analysis, does the assessment of a building "at present" differ from the judgment of its "importance in the future"? Do architectural features affect the performance criteria?
For the studies, three mapping sentences were developed according to the facet approach (Borg, 1996), along with two schemata for judging the quality of school and office buildings. From the schemata, 139 questions for 26 teachers and 86 questions for 122 pupils were derived for the school, 203 questions for 147 students and 28 lecturers for the university, and 254 questions for 56 student assessors for the office building. Individual building features were rated on a scale from +2 ☺☺ (very good "at present" or very important "in the future") to –2 // (very bad "at present" or very unimportant "in the future"). The three main performance criteria, especially for the assessment of "importance in the future", generally correlate substantially and highly significantly with one another in all three studies, so a higher assessment of environmental control by users can be expected to lead to greater well-being and, via both variables, to higher judgments of learning and work efficiency "in the future". According to the results of the three studies, users and assessors in all three environments wish, for example, for retreat opportunities: in the school in the form of student offices, niches, and seating groups in the classroom; in the university in the form of sheltered benches outdoors and work tables, for example in the cafeteria; and in the office building through less transparency in the combi office and thus less visibility of all activities to supervisors and colleagues. The relationships between the central performance criteria found in the three studies justify the assumption that targeted improvements of important environmental aspects, especially those that enable environmental control, can positively influence users' well-being and thus their performance.
Summary of the print book (originally in Spanish): Walden, R. (2008). Architectural Psychology: School, University, and Office Building of the Future. Lengerich: Pabst Science Publishers. People generally desire "control" over the conditions of their environment (cf. Flammer, 1990; Burger, 1992). This need manifests itself in self-designed architecture and in the self-regulation of stress factors. For this reason, the concept of environmental control is applied as the central criterion for built environments in the three case studies: a school, a university, and an office building. The survey methods "Programming" (user-oriented program development), "User-Needs Analysis", and "Post-Occupancy Evaluation" are explained with regard to their significance for "Building Performance Evaluation" (Preiser & Schramm, 1997; 2005). The "Koblenz Architecture Questionnaire" is presented in three variants as an instrument for evaluating built environments. Selected results from the three studies are reported for two different environmental areas each. The effects of architecture on user performance (cf. BOSTI studies, 1984 and 2001) are established in the empirical part in three studies of innovative buildings (Waldorf School in Cologne, University in Koblenz, Post Tower in Bonn). Performance is measured by: (1) learning and work efficiency, (2) well-being, and (3) environmental control, as well as (4) social behavior in the case of the school study, and with the help of 16 (office building) and 21 (university) psychological criteria for organizational success.
The aim of the three studies is, among other things, to re-examine the theoretical concept of "environmental control" and to offer recommendations for improvement in renovations as well as in new constructions. Two central questions are therefore: In a User-Needs Analysis, does the assessment of the building "at present" differ from the evaluation of its "importance in the future"? Do architectural features affect the performance criteria? For the studies, three mapping sentences were developed according to the facet approach (Borg, 1996), along with two schemata for evaluating the quality of school and office buildings. From the schemata were obtained: for the school, 139 questions for 26 teachers and 86 questions for 122 pupils; for the university, 203 questions for 147 students and 28 lecturers; and for the office building, 254 questions for 56 student assessors. Individual building features were rated on a scale from +2 ☺☺ (very good "at present" or very important "in the future") to –2 // (very bad "at present" or very unimportant "in the future"). The three main performance criteria, especially for the evaluation of "importance in the future", generally correlate substantially and very significantly with one another in all three studies, so a higher assessment of environmental control by users can be expected to lead to greater well-being and, via both variables, to higher evaluations of learning and work efficiency "in the future".
According to the results of the three studies, users and assessors in all three environments desire, for example, retreat opportunities: in the school, in the form of student offices, niches, and seating groups in the classroom; in the university, in the form of sheltered benches outdoors and work tables, for example in the cafeteria; and in the office building, through less transparency in the shared (combi) offices and thus less visibility of all activities to supervisors and colleagues. The relationships between the central performance criteria established in the three studies justify the assumption that targeted improvements of important environmental aspects, above all those that enable environmental control, can positively influence users' well-being and thus their performance. Keywords: user needs analysis; building performance evaluation; environmental control (self-organization, regulation of stress factors, social control); well-being; learning and work efficiency; social behavior; criteria for organizational success; facet theory; schema for evaluating building quality; recommendations for forward-looking building construction.
Abstract for the print book: Walden, R. (2008). Architectural Psychology: School, University Campus, and Office Building of the Future. Lengerich: Pabst Science Publishers (in German). The need for display of self in architecture and for users' self-regulation of stress factors, which demonstrate that users crave individual control of their environment (cf. Flammer, 1990; Burger, 1992), motivated this study to use the concept of environmental control as a central criterion for the evaluation of built environment. It was applied to three case studies: a school, a university campus, and an office building. Advantages and disadvantages of the data-gathering methods of architectural Programming, User-Needs Analysis, and Post-Occupancy Evaluation were analyzed to highlight their significance in terms of Building Performance Evaluation as described by Preiser and Schramm (1997, 2005). The "Koblenz Architecture Questionnaire" was used as an instrument for assessing the built environment of the three case studies, and the study reports selected findings from these questionnaires. The investigation seeks to determine the effect of architecture - especially buildings' provisions for user control of environmental conditions - on user performance (cf. BOSTI studies, 1984, 2001) in three innovative buildings: the Waldorf School in Cologne, the new campus for the University in Koblenz, and the Office Tower of the Deutsche Post World Net AG in Bonn. Performance is measured in terms of (1) Learning and Work Efficiency, (2) Well-being, (3) Environmental Control, (4) Social Behavior (the latter just for the school project), and by means of 21 and 16 additional psychological criteria for success of the organization in the cases of the university and the office building, respectively.
The study aims, among other things, at reassessing the theoretical concept of 'environmental control' and at making recommendations for both improvement of existing buildings and the design of new projects. Two central questions are: In User-Needs Analysis, what is the difference between the assessment of a building for its current use and its estimated performance in the future? Do certain architectural features influence user assessments on the given performance criteria? In the studies, three mapping sentences were developed according to the 'facet approach' (Borg, 1996), as well as two systems to judge the quality of school and office buildings. Using these systems, information was obtained in all three studies to construct questionnaires. In the school study, teachers were asked 139 questions and pupils 86 questions; responses were obtained from 26 teachers and 122 pupils. For the university, 147 students and 28 faculty members responded to 203 questions. For the office building, 56 student experts were asked 254 questions. Characteristics of the built environment were rated using the following scale: +2 ☺☺ (very good "at present", and accordingly very important "in the future") down to –2 // (very bad "at present", and very unimportant "in the future"). A general finding was a high and significant correlation between the responses for the three main performance criteria in all three case studies, especially for the 'importance for the future' aspect. This supports the conclusion that a perception of a higher degree of environmental control by users will lead to an increased sense of well-being and, consequently, to a higher expectation of improved work or learning efficiency 'in the future'.
The three studies further show for example that users in all three environments desire 'retreat opportunities' which may take the form of student offices in schools, niches and small group seating in classrooms, and sheltered seating in outdoor areas and work tables in the cafeteria for the university. For the offices, users wanted more visual privacy (less transparent office partitions in Combi Offices) for less visual control of their activities by supervisors and co-workers. The relationships found by the studies between the responses on the central performance criteria and the spatial characteristics of the three buildings support the contention that focused improvements in the built environment, especially with respect to features that enhance user control of environmental conditions, will influence users’ well-being as well as work performance and work or learning efficiency in a positive way.
People generally desire "control" over their environmental conditions. This need is expressed in the form of self-designed architecture and the self-regulation of stressors. For this reason, the concept of environmental control is applied as the central criterion for built environments in all three case studies on a school (Waldorf School, Cologne), a university (University of Koblenz), and an office building (Post Tower, Bonn). This is the first monograph in the German-speaking world to describe psychological investigations of innovative buildings following the International Building Performance Evaluation approach. The "Koblenz Architecture Questionnaire" is presented in three variants as an instrument for assessing built environments. Up to 21 psychological criteria for organizational success are applied per study. One aim of all three studies is to derive recommendations for improving renovations and new buildings. Two central research questions are therefore: In a User-Needs Analysis, does the assessment of a building "at present" differ from the judgment of its "importance in the future"? Do architectural features affect the performance criteria? For the studies, three mapping sentences were developed according to the facet approach, along with two schemata for judging the quality of school and office buildings. The relationships between the central performance criteria found in the three studies justify the assumption that targeted improvements of important environmental aspects, especially those that enable environmental control, can positively influence users' well-being and thus their performance.
This work presents a quantitative analysis and visualization of scar tissue in the left ventricular myocardium. The scar information is contained in late-enhancement data, in which a contrast agent highlights the avital (non-viable) tissue. Automatic methods extract the scar from the image data and quantify its size, location, and transmurality, where transmurality is a local measure relating the width of the scar to the thickness of the heart wall. The developed methods help the cardiologist to assess the extent, cause, and degree of the heart failure within a short time, and several visual presentations allow the results to be checked. The deformation of the scar tissue over the heart cycle is implemented in another scientific work; here, a visual improvement of that deformation result is sought by extracting the scar from the data. The avital tissue is displayed more clearly by removing unnecessary image information, which improves the visual analysis of the beating heart. Together, both methods provide a detailed analysis of the scar tissue and support clinical practice beyond purely manual analysis.
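A transmurality measure of the kind described — scar width relative to wall thickness, evaluated locally — can be sketched as a per-chord ratio over binary segmentation masks. The function below is an illustrative simplification, not the thesis's actual implementation: it assumes the myocardium and the scar are given as aligned binary masks and treats each image column as one radial chord through the wall.

```python
# Minimal sketch of a local transmurality measure (hypothetical
# simplification): for each column of aligned binary masks, divide the
# number of scar pixels by the number of wall pixels on that chord.

def transmurality(wall_mask, scar_mask):
    """Return per-column scar/wall pixel ratios (0.0 where no wall)."""
    cols = len(wall_mask[0])
    ratios = []
    for c in range(cols):
        wall = sum(row[c] for row in wall_mask)  # wall thickness in pixels
        scar = sum(row[c] for row in scar_mask)  # scar extent in pixels
        ratios.append(scar / wall if wall else 0.0)
    return ratios

# Toy masks: the first chord is two-thirds scarred, the second not at all.
wall = [[1, 1], [1, 1], [1, 0]]
scar = [[1, 0], [1, 0], [0, 0]]
ratios = transmurality(wall, scar)
```

In a real pipeline the chords would follow radial lines from the ventricle's centroid rather than image columns, but the ratio itself is computed the same way.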