This thesis explores the possibilities of probabilistic process modelling for Computer Supported Cooperative Work (CSCW) systems in order to predict the behaviour of the users of a CSCW system. Toward this objective, the applicability, advantages, limitations and challenges of probabilistic modelling are examined in the context of CSCW systems. Finally, as the primary goal, seven models are created and examined to demonstrate the feasibility of probabilistic process discovery and of predicting user behaviour in CSCW systems.
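A minimal illustration of the kind of probabilistic process model described above is a first-order Markov chain fitted to user event traces; note that the event names and traces below are hypothetical, and the thesis's actual seven models are not reproduced here.

```python
from collections import defaultdict

def fit_markov_chain(traces):
    """Estimate first-order transition probabilities from event traces."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            counts[a][b] += 1
    return {
        a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
        for a, nxt in counts.items()
    }

def predict_next(model, action):
    """Most probable next action after `action`, or None if unseen."""
    nxt = model.get(action)
    return max(nxt, key=nxt.get) if nxt else None

# Hypothetical CSCW event traces (login, edit, comment, logout)
traces = [
    ["login", "edit", "comment", "logout"],
    ["login", "edit", "comment", "edit", "logout"],
    ["login", "comment", "logout"],
]
model = fit_markov_chain(traces)
print(predict_next(model, "edit"))  # → "comment" (2 of 3 observed successors)
```

Process discovery on real CSCW logs would use richer models (e.g. hidden states), but the discover-then-predict workflow is the same.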
The implementation of physiological indicators reflecting the response of organisms to changes in their environment is assumed to provide potential benefits for ecological studies. By analysing the physiological condition of organisms in freshwater ecological studies rather than their ultimate effects, physiological indicators can contribute to a faster assessment of effects than traditional ecological indicators, such as the evaluation of the benthic community structure or the determination of the reproductive success of organisms. This can increase the effectiveness of environmental health assessment and experimental ecology. In this respect, the thesis focuses on physiological measures characterizing the energetic condition and energy consumption (the concentration of energy storage compounds, the adenylate energy charge, the energy consumption in vivo), as well as individual growth (RNA:DNA ratio) of organisms. Although these sub-individual indicators are commonly applied in marine ecology and more recently in ecotoxicology, they have rarely been applied in freshwater ecology to date. With respect to an increased use of physiological indicators in freshwater ecological studies, the objectives of the present thesis are twofold. First, it highlights the potential of assessing individual fitness by means of physiological indicators in freshwater ecological studies. For that reason, Chapter 2 provides the basic assumptions as well as the theoretical and methodological fundamentals necessary for the application of physiological indicators within freshwater ecology and, furthermore, points out their applicability in several case studies. As a second objective, the thesis addresses selected ecophysiological aspects of native and non-native freshwater amphipods, which are considered suitable candidates for the determination of physiological indicators in ecological studies due to their function as keystone species within aquatic habitats.
The studies presented in Chapters 3−5 of the thesis provide information on (i) species- and sex-specific seasonal variations within the energetic condition of natural Gammarus populations (G. fossarum, G. pulex), (ii) differences in metabolic activity and behaviour between different amphipod species (G. fossarum, G. roeselii and D. villosus), as well as (iii) the direct effects of ambient ammonia on the physiology and behaviour of D. villosus. The fundamental conclusions drawn from the conducted field and laboratory studies, as well as their relevance and general implications for the application of physiological indicators in freshwater ecological research, are discussed in Chapter 6.
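One of the physiological measures named above, the adenylate energy charge, has a standard definition (after Atkinson): AEC = ([ATP] + 0.5·[ADP]) / ([ATP] + [ADP] + [AMP]), with values near 1.0 indicating a favourable energetic condition. A small sketch with illustrative (not measured) concentrations:

```python
def adenylate_energy_charge(atp, adp, amp):
    """Atkinson's adenylate energy charge:
    (ATP + 0.5 * ADP) / (ATP + ADP + AMP).
    Values near 1.0 indicate a favourable energetic condition."""
    return (atp + 0.5 * adp) / (atp + adp + amp)

# Illustrative (not measured) nucleotide concentrations, e.g. in nmol/g
print(adenylate_energy_charge(atp=8.0, adp=1.5, amp=0.5))  # → 0.875
```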
Introduction:
In March 2012 a secessionist-Islamist insurgency gained momentum in Mali and quickly took control of two-thirds of the state territory. Within weeks, radical Islamists, drug smugglers and rebels suddenly ruled over a territory bigger than Germany. News of the abuse of the population and the introduction of harsh Sharia law spread soon, and word got out that the Malian Army had simply abandoned the land. The general reaction of the international community (IC) was surprise, a reaction that was, as this research will show, as unfounded as it was unconstructive. When Malian state structures collapsed, the world watched in shock, even though the developments could have been anticipated, and prevented. Ultimately, the situation had to be resolved by international forces (most notably French troops), who are still in Mali at the time of writing (Arieff 2013a: 5; Lohmann 2012: 3; Walther and Christopoulos 2015: 514f.; Shaw 2013: 204; Qantara, Interview, 2012; L’Express, Mali, 2015; Deutscher Bundestag, MINUSMA und EUTM Mali, 2016; UN, MINUSMA, 2016; Boeke and Schuurmann 2015: 801; Chivvis 2016: 93f.).
This research will show that the developments in Mali in 2012 had been building for a long time and could have been avoided. In doing so, it will also show why state security can never be analyzed or consolidated in an isolated manner. Instead, it is necessary to take into account regional dynamics and developments in order to find a comprehensive approach to security in individual states. Once state failure occurs, not only has the state itself failed, but the surrounding region has equally failed to prevent the failure.
Weak states are a growing concern in many world regions, particularly in Africa. As international intervention often proves unsustainable for various reasons, the author believes that states which cannot stabilize themselves need a regional agent to support them. This regional agent should be a Regional Security Complex (RSC) as defined by Barry Buzan and Ole Waever (Buzan and Waever 2003). As the following analysis will show, Mali is a case in point. The hope is that this study will help avoid similar failures in the future by making a strong case for the establishment of RSCs.
…
In scientific data visualization, huge amounts of data are generated, which implies the task of analyzing them efficiently. This includes the reliable detection of important parts with a low expenditure of time and effort. It is especially important for the large seismic volume datasets required for the exploration of oil and gas deposits. Since the generated data is complex and a manual analysis is very time-intensive, a semi-automatic approach can, on the one hand, reduce the time required for the analysis and, on the other hand, offer more flexibility than a fully automatic approach.
This master's thesis introduces an algorithm that automatically locates regions of interest in seismic volume data by detecting anomalies in local histograms. Furthermore, the results are visualized, and a variety of tools for the exploration and interpretation of the detected regions are developed. The approach is evaluated in experiments with synthetic data and in interviews with domain experts on the basis of real-world data. Finally, further improvements for integrating the algorithm into the seismic interpretation workflow are suggested.
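The general idea can be sketched as follows, with the caveat that the thesis's actual algorithm differs in detail: compute an intensity histogram for each fixed-size block of the volume, measure each block's distance to the global histogram, and flag blocks whose distance is an outlier. Block size, bin count and threshold below are assumptions for illustration.

```python
import numpy as np

def local_histogram_anomalies(volume, block=8, bins=16, z_thresh=2.0):
    """Flag blocks whose local intensity histogram deviates strongly
    from the global histogram (L1 distance, z-score threshold)."""
    lo, hi = volume.min(), volume.max()
    global_h, _ = np.histogram(volume, bins=bins, range=(lo, hi), density=True)
    nb = [s // block for s in volume.shape]
    dists, coords = [], []
    for i in range(nb[0]):
        for j in range(nb[1]):
            for k in range(nb[2]):
                sub = volume[i * block:(i + 1) * block,
                             j * block:(j + 1) * block,
                             k * block:(k + 1) * block]
                h, _ = np.histogram(sub, bins=bins, range=(lo, hi), density=True)
                dists.append(np.abs(h - global_h).sum())
                coords.append((i, j, k))
    dists = np.array(dists)
    z = (dists - dists.mean()) / (dists.std() + 1e-12)
    return [c for c, zi in zip(coords, z) if zi > z_thresh]

# Synthetic volume: Gaussian noise with one anomalously bright block
rng = np.random.default_rng(0)
vol = rng.normal(0.0, 1.0, (32, 32, 32))
vol[8:16, 8:16, 8:16] += 5.0
print(local_histogram_anomalies(vol))  # the bright block (1, 1, 1) is flagged
```

Real seismic workflows would add overlapping blocks and more robust distance measures, but the detect-and-rank structure stays the same.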
This thesis proposes the use of MSR (Mining Software Repositories) techniques to identify software developers with exclusive expertise about specific APIs and programming domains in software repositories. A pilot tool for finding such “Islands of Knowledge” in Node.js projects is presented and applied in a case study to the 180 most popular npm packages. It is found that on average each package has 2.3 Islands of Knowledge, which is possibly explained by the finding that npm packages tend to have only one main contributor. In a survey, the maintainers of 50 packages were contacted and asked for their opinions on the results produced by the tool. Together with their responses, this thesis reports on experiences made with the pilot tool and on how future iterations could produce even more accurate statements about the distribution of programming expertise in developer teams.
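A hedged sketch of how such "Islands of Knowledge" might be approximated from commit authorship (the thesis's tool and its exact expertise metric are not reproduced here; the commit log and threshold below are hypothetical): a file becomes an island when a single author accounts for nearly all of its recorded changes.

```python
from collections import defaultdict

def islands_of_knowledge(commits, threshold=0.9):
    """Files where one author accounts for at least `threshold` of all
    recorded changes -- candidates for exclusive expertise."""
    per_file = defaultdict(lambda: defaultdict(int))
    for author, files in commits:
        for f in files:
            per_file[f][author] += 1
    islands = {}
    for f, authors in per_file.items():
        total = sum(authors.values())
        top_author, top = max(authors.items(), key=lambda kv: kv[1])
        if top / total >= threshold:
            islands[f] = top_author
    return islands

# Hypothetical commit log: (author, files touched in the commit)
commits = [
    ("alice", ["lib/api.js", "lib/util.js"]),
    ("alice", ["lib/api.js"]),
    ("alice", ["lib/api.js"]),
    ("bob",   ["lib/util.js", "README.md"]),
    ("carol", ["README.md"]),
]
print(islands_of_knowledge(commits))  # → {'lib/api.js': 'alice'}
```

In practice the input would come from `git log`, and the metric could weight lines changed rather than commit counts.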
With the emergence of current-generation head-mounted displays (HMDs), virtual reality (VR) is regaining much interest in the field of medical imaging and diagnosis. Room-scale exploration of CT or MRI data in virtual reality feels like an intuitive application. However, in VR, retaining a high frame rate is more critical than for conventional user interaction seated in front of a screen. There is strong scientific evidence suggesting that low frame rates and high latency have a strong influence on the occurrence of cybersickness. This thesis explores two practical approaches to overcoming the high computational cost of volume rendering for virtual reality. One lies in the exploitation of coherency properties of the especially costly stereoscopic rendering setup. The main contribution is the development and evaluation of a novel acceleration technique for stereoscopic GPU ray casting. Additionally, an asynchronous rendering approach is pursued to minimize the amount of latency in the system. A selection of image warping techniques has been implemented and evaluated methodically, assessing their applicability for VR volume rendering.
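The coherency idea behind stereoscopic acceleration can be sketched in simplified form: samples already computed for the left eye are reprojected to the right eye via depth-derived disparity, and only disoccluded pixels need new rays. The 1D scanline below is a toy CPU illustration, not the thesis's GPU technique; `focal` and `baseline` are assumed parameters.

```python
import numpy as np

def reproject_left_to_right(color, depth, focal, baseline):
    """Warp a left-eye scanline to the right eye using per-pixel depth.
    disparity = focal * baseline / depth (pixels shift left in the right eye).
    Returns warped colors and a mask of disoccluded pixels that still
    need to be ray cast for the right eye."""
    n = len(color)
    out = np.zeros_like(color)
    filled = np.zeros(n, dtype=bool)
    z_buf = np.full(n, np.inf)
    for x in range(n):
        disparity = focal * baseline / depth[x]
        xr = int(round(x - disparity))
        if 0 <= xr < n and depth[x] < z_buf[xr]:  # nearest sample wins
            z_buf[xr] = depth[x]
            out[xr] = color[x]
            filled[xr] = True
    return out, ~filled

# Toy scanline: constant depth 10, focal * baseline chosen so disparity = 2 px
color = np.arange(8, dtype=float)
depth = np.full(8, 10.0)
warped, holes = reproject_left_to_right(color, depth, focal=10.0, baseline=2.0)
print(warped, holes)  # pixels 6 and 7 are disoccluded holes
```

The z-buffer test resolves the case where several left-eye samples land on the same right-eye pixel, which is the main correctness concern of such warps.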
Part-of-speech tagging is the process of assigning words with similar grammatical properties to a part of speech (PoS). In the English language, PoS-tagging algorithms generally reach very high accuracy. This thesis tests against these accuracies in PoS tagging as a qualitative measure of the classification capabilities of a recently developed neural network model, the graph convolutional network (GCN). The novelty proposed in this thesis is to translate a corpus into a graph that serves as direct input to the GCN. The experiments in this thesis serve as a proof of concept with room for improvement.
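A minimal sketch of a single graph convolution layer of the kind a GCN stacks (symmetric normalisation with self-loops, after Kipf and Welling: H' = ReLU(D^(-1/2)(A + I)D^(-1/2) H W)); the tiny word graph and random features below are illustrative assumptions, not the thesis's corpus-to-graph translation.

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph convolution: H' = ReLU(norm(A + I) @ H @ W),
    where norm applies symmetric degree normalisation D^-1/2 A D^-1/2."""
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
    return np.maximum(0.0, a_norm @ features @ weights)

# Tiny word graph: 3 token nodes in a chain, 2 input features, 2 outputs
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 2))   # per-node feature vectors
w = rng.normal(size=(2, 2))   # learnable layer weights
out = gcn_layer(adj, h, w)
print(out.shape)  # (3, 2): one updated feature vector per node
```

For PoS tagging, a final per-node softmax over the tag set would turn these node features into tag probabilities.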
The Web contains some extremely valuable information; however, poor-quality, inaccurate, irrelevant or fraudulent information can often be found as well. With the increasing amount of data available, it is becoming more and more difficult to distinguish truth from speculation on the Web. One of the most important criteria used to evaluate data credibility, if not the most important, is the information source, i.e., the data origin. Trust in the information source is a valuable currency users have to evaluate such data. Data popularity, recency (or the time of validity), reliability, or vagueness ascribed to the data may also help users to judge the validity and appropriateness of information sources. We call this knowledge derived from the data the provenance of the data. Provenance is an important aspect of the Web. It is essential in identifying the suitability, veracity, and reliability of information, and in deciding whether information is to be trusted, reused, or even integrated with other information sources. Therefore, models and frameworks for representing, managing, and using provenance in the realm of Semantic Web technologies and applications are critically required. This thesis highlights the benefits of the use of provenance in different Web applications and scenarios. In particular, it presents management frameworks for querying and reasoning in the Semantic Web with provenance, and presents a collection of Semantic Web tools that explore provenance information when ranking and updating caches of Web data. To begin, this thesis discusses a highly flexible and generic approach to the treatment of provenance when querying RDF datasets. The approach re-uses existing RDF modeling possibilities in order to represent provenance. It extends SPARQL query processing in such a way that, given a SPARQL query for data, one may request provenance without modifying the query.
The use of provenance within SPARQL queries helps users to understand how RDF facts are derived, i.e., it describes the data and the operations used to produce the derived facts. Turning to more expressive Semantic Web data models, an optimized algorithm for reasoning and debugging OWL ontologies with provenance is presented. Typical reasoning tasks over an expressive Description Logic (e.g., using tableau methods to perform consistency checking, instance checking, satisfiability checking, and so on) are in the worst case doubly exponential, and in practice are often likewise very expensive. With the algorithm described in this thesis, however, one can efficiently reason in OWL ontologies with provenance, i.e., provenance is efficiently combined and propagated within the reasoning process. Users can use the derived provenance information to judge the reliability of inferences and to find errors in the ontology. Next, this thesis tackles the problem of providing Web users with the right content at the right time. The challenge is to efficiently rank a stream of messages based on user preferences. Provenance is used to represent preferences, i.e., the user defines his preferences over the messages' popularity, recency, etc. This information is then aggregated to obtain a joint ranking. The aggregation problem is related to the problem of preference aggregation in Social Choice Theory. The traditional problem formulation of preference aggregation assumes a fixed set of preference orders and a fixed set of domain elements (e.g., messages). This work, however, investigates how an aggregated preference order has to be updated when the domain is dynamic, i.e., the aggregation approach ranks messages 'on the fly' as they pass through the system. Consequently, this thesis presents computational approaches for online preference aggregation that handle the dynamic setting more efficiently than standard ones.
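The aggregation step can be illustrated with a simple Borda-style rule, one of the classic aggregation rules from Social Choice Theory (the thesis's actual online algorithms are more elaborate; the message identifiers and orderings below are hypothetical).

```python
def borda_aggregate(preference_orders):
    """Aggregate several preference orders (best first) by Borda count:
    a message ranked r-th among n receives n - 1 - r points."""
    scores = {}
    for order in preference_orders:
        n = len(order)
        for rank, msg in enumerate(order):
            scores[msg] = scores.get(msg, 0) + (n - 1 - rank)
    # Sort by descending score, breaking ties by message id
    return sorted(scores, key=lambda m: (-scores[m], m))

# Hypothetical per-criterion orderings of three messages
popularity  = ["m2", "m1", "m3"]
recency     = ["m3", "m2", "m1"]
reliability = ["m2", "m3", "m1"]
print(borda_aggregate([popularity, recency, reliability]))  # → ['m2', 'm3', 'm1']
```

The dynamic setting described above would additionally update these scores incrementally as messages enter and leave the stream, rather than recomputing from scratch.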
Lastly, this thesis addresses the scenario of caching data from the Linked Open Data (LOD) cloud. Data on the LOD cloud changes frequently, and applications relying on that data - by pre-fetching data from the Web and storing local copies of it in a cache - need to continually update their caches. In order to make the best use of the resources available (e.g., network bandwidth for fetching data, and computation time), it is vital to choose a good strategy for knowing when to fetch data from which data source. One strategy to cope with data changes is to check the provenance. Provenance information delivered by LOD sources can denote when a resource on the Web was last changed. Linked Data applications can benefit from this piece of information, since simply checking it may help decide which sources need to be updated. For this purpose, this work describes an investigation of the availability and reliability of provenance information in Linked Data sources. Another strategy for capturing data changes is to exploit provenance in a time-dependent function. Such a function should measure the frequency of the changes of LOD sources. This work therefore describes an approach to the analysis of data dynamics, i.e., the analysis of the change behavior of Linked Data sources over time, followed by the investigation of different scheduling strategies to keep local LOD caches up-to-date. This thesis aims to prove the importance and benefits of the use of provenance in different Web applications and scenarios. The flexibility of the approaches presented, combined with their high scalability, makes this thesis a possible building block for the Semantic Web proof layer cake - the layer of provenance knowledge.
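One such scheduling strategy can be sketched as follows, under a limited update budget: prefer the sources whose estimated change rate multiplied by the time since the last fetch is largest, i.e. those with the highest expected number of missed changes. The source names and rates below are invented for illustration.

```python
def schedule_updates(sources, budget):
    """Pick up to `budget` sources to refresh, preferring those with the
    highest expected number of missed changes (change_rate * age)."""
    def expected_missed(src):
        return src["change_rate"] * src["age"]
    ranked = sorted(sources, key=expected_missed, reverse=True)
    return [s["name"] for s in ranked[:budget]]

# Hypothetical LOD sources: changes per day and days since the last fetch
sources = [
    {"name": "dbpedia",  "change_rate": 0.2,  "age": 10},  # ~2 missed changes
    {"name": "geonames", "change_rate": 0.01, "age": 30},  # ~0.3
    {"name": "newsfeed", "change_rate": 5.0,  "age": 1},   # ~5
]
print(schedule_updates(sources, budget=2))  # → ['newsfeed', 'dbpedia']
```

The change rates themselves would be estimated from observed source dynamics, which is exactly what the data-dynamics analysis described above provides.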
With global and distributed project teams being increasingly common, Collaborative Project Management is becoming the prevalent paradigm for work in most organisations. Software has for many years been one of the most widely used tools for supporting Project Management, and with the focus on Collaborative Project Management, accompanied by the emergence of Enterprise Collaboration Systems (ECS), Collaborative Project Management Software (CPMS) is gaining increased attention. This thesis examines the capabilities of CPMS for the long-term management of information, which includes not only the management of files within these systems but the management of all types of digital business documents, particularly social business documents. Previous research shows that social content in collaboration software is often poorly managed, which poses challenges to meeting performance and conformance objectives in a business. Based on literature research, requirements for the long-term management of information in CPMS are defined, and 7 CPMS tools are analysed regarding the content they contain and the functionalities they offer for the long-term management of this content. The study shows that CPMS by and large are not able to meet the long-term information management needs of an organisation on their own, and that only the tools geared towards enterprise customers have sufficient capabilities to support the implementation of an Enterprise Information Management strategy.
Using semantic data from general-purpose programming languages does not provide the unified experience one would want for such an application. Static error checking is lacking, especially with regard to static typing of the data. Based on the previous work of λ-DL, which integrates semantic queries and concepts as types into a typed λ-calculus, this work takes its ideas a step further to meld them into a real-world programming language. This thesis explores how λ-DL's features can be extended and integrated into an existing language, researches an appropriate extension mechanism and produces Semantics4J, a JastAdd-based Java language semantic data extension for type-safe OWL programming, together with examples of its usage.
For a comprehensive understanding of evolutionary processes and for providing reliable prognoses about the future consequences of environmental change, it is essential to reveal the genetic basis underlying adaptive responses. The importance of this goal increases in light of ongoing climate change, which confronts organisms worldwide with new selection pressures and requires rapid evolutionary change to avoid local extinction. Freshwater ectotherms such as daphnids are particularly threatened. Unraveling the genetic basis of local adaptation is complicated by the interplay of forces affecting patterns of genetic divergence among populations. Due to their key position in freshwater communities, their cyclic parthenogenetic mode of reproduction and their resting propagules (which form biological archives), daphnids are particularly suited for this purpose.
The aim of this thesis was to assess the impact of local thermal selection on the Daphnia longispina complex and to reveal the underlying genetic loci. Therefore, I compared genetic differentiation among populations containing Daphnia galeata, Daphnia longispina and their interspecific hybrids across time, space, and species boundaries. I revealed strongly contrasting patterns of genetic differentiation between selectively neutral and functional candidate gene markers, between the two species, and among samples from different lakes, suggesting (together with a correlation with habitat temperatures) local thermal selection acting on candidate gene TRY5F and indicating adaptive introgression. To reveal the candidate genes’ impact on fitness, I performed association analyses among data on genotypes and phenotypic traits of D. galeata clones from seven populations. The tests revealed a general temperature effect as well as inter-population differences in phenotypic traits and imply a possible contribution of the candidate genes to life-history traits. Finally, utilizing a combined population transcriptomic and reverse ecology approach, I introduced a methodology with a wide range of applications in evolutionary biology and revealed that local thermal selection was probably a minor force in shaping sequence and gene expression divergence among four D. galeata populations, but contributed to sequence divergence among two populations. I identified many transcripts possibly under selection or contributing strongly to population divergence, a large amount thereof putatively under local thermal selection, and showed that genetic and gene expression variation is not depleted specifically in temperature-related candidate genes.
In conclusion, I detected signs of local adaptation in the D. longispina complex across space, time, and species barriers. Populations and species remained genetically divergent, although increased gene flow possibly contributed, together with genotypes recruited from the resting egg bank, to the maintenance of standing genetic variation. Further work is required to accurately determine the influence of introgression and the effects of candidate genes on individual fitness. While I found no evidence suggesting a response to intense local thermal selection, the high resilience and adaptive potential regarding environmental change I observed suggest positive future prospects for the populations of the D. longispina complex. However, overall, due to the continuing environmental degradation, daphnids and other aquatic invertebrates remain vulnerable and threatened.
The presence of anthropogenic chemicals in the natural environment may impact both habitats and human use of natural resources. In particular, the contamination of aquatic resources by organic compounds used as pharmaceuticals or household chemicals has become evident. These newly identified environmental pollutants, also known as micropollutants, often have i) unknown ecotoxicological impacts, ii) unknown partitioning mechanisms, e.g. sorption to sediments, and iii) limited regulation to control their emission. Furthermore, like any compound, micropollutants can be transformed in the environmental matrix into unknown transformation products (TPs), which add to the number of unknown chemicals to consider and thus increase the complexity of risk management. At the same time, transformation is a natural mechanism for the removal of anthropogenic compounds, either by complete degradation (mineralisation) or by conversion to innocuous TPs. However, how transformation occurs under real-world conditions is still largely unknown. During the transport of micropollutants from household wastewater to surface water, a large amount of transformation can occur during wastewater treatment, specifically during biological nitrifying–denitrifying treatment processes. The thesis considers the systematic optimisation of laboratory investigative techniques, the application of sensitive mass-spectrometry-based analysis techniques and the monitoring of full-scale wastewater treatment plants (WWTPs) to elucidate the transformation processes of five known micropollutants.
The first of the five compounds investigated was the antibiotic trimethoprim. Incubation experiments were conducted at different analyte spike concentrations and different sludge-to-wastewater ratios. Using high-resolution mass spectrometry, a total of six TPs were identified from trimethoprim. The types of TPs formed were clearly influenced by the spike concentration. To the best of our knowledge, such impacts have not been previously described in the literature. At the lower spike concentration, a relatively stable final TP was formed (2,4-diaminopyrimidine-5-carboxylic acid, DAPC), which could account for almost all of the transformed trimethoprim quantity. The results were compared to the process in a reference reactor. Both by the detection of TPs (e.g., DAPC) and by modelling the removal kinetics, it could be concluded that only the experimental results at the low spike concentrations mirrored the real reactor. The limits of using elevated spike concentrations in incubation experiments could thus be shown.
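The removal kinetics modelling mentioned above is commonly done with a pseudo-first-order model, C(t) = C0·e^(−kt); a log-linear least-squares fit then recovers the rate constant k and the half-life ln(2)/k. The incubation data below is illustrative, not the thesis's measurements.

```python
import math

def fit_first_order_rate(times, concentrations):
    """Least-squares fit of ln C = ln C0 - k*t; returns (k, half_life)."""
    ys = [math.log(c) for c in concentrations]
    n = len(times)
    x_mean = sum(times) / n
    y_mean = sum(ys) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(times, ys)) / \
            sum((x - x_mean) ** 2 for x in times)
    k = -slope
    return k, math.log(2) / k

# Illustrative (not measured) incubation data: hours vs. concentration in µg/L
times = [0, 2, 4, 8]
conc = [10.0, 6.07, 3.68, 1.35]   # roughly C0 * exp(-0.25 * t)
k, t_half = fit_first_order_rate(times, conc)
print(round(k, 2), round(t_half, 1))  # → 0.25 2.8
```

Comparing fitted rate constants between incubation experiments and the reference reactor is one way such lab results can be checked against full-scale behaviour.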
Three phenolic micropollutants, the antiseptic ortho-phenylphenol (OPP), the plastics additive bisphenol A (BPA) and the psychoactive drug dextrorphan, were investigated with regard to the formation of potentially toxic, nitrophenolic TPs. Nitrite is an intermediate in the nitrification–denitrification process occurring in activated sludge and was found to cause nitration of these phenols. To elucidate the processes, incubation experiments were conducted in purified water in the presence of nitrite, with OPP as the test substance. The reactive species HNO2 and N2O3 and the radicals ·NO and ·NO2 were likely involved, as indicated by scavenger experiments. Under conditions found at WWTPs, the wastewater is usually at neutral pH, and nitrite, being an intermediate, is usually present at low concentration. By conducting incubation experiments inoculated with sludge from a conventional WWTP, it was found that the three phenolic micropollutants OPP, BPA and dextrorphan were quickly transformed into biological TPs. Nitrophenolic TPs were only formed after an artificial increase of the nitrite concentration or a lowering of the pH. However, nitrophenolic TPs can be formed as sample preparation artefacts through acidification or freezing for preservation, which creates optimal conditions for the reaction to take place.
The final micropollutant to be studied was the pain-reliever diclofenac, a micropollutant on the EU-watch list due to ecotoxicological effects on rainbow trout. The transformation was compared in two different treatment systems, one employing a reactor with suspended carriers as a biofilm growth surface, while the other system employed conventional activated sludge. In the biofilm-based system, the pathway was found to produce many TPs each at relatively low concentration, many of which were intermediate TPs that were further degraded to unknown tertiary TPs. In the conventional activated sludge system some of the same reactions took place but all at much slower rates. The main difference between the two systems was due to different reaction rates rather than different transformation pathways. The municipal WWTPs were monitored to verify these results. In the biofilm system, a 10-day monitoring campaign confirmed an 88% removal of diclofenac and the formation of the same TPs as those observed in the laboratory experiments. The proposed environmental quality standard of 0.05 μg/L might thus be met without the need for additional treatment processes such as activated carbon filtration or ozonation.
Coordination and awareness mechanisms are important in systems for Computer-Supported Cooperative Work (CSCW) and traditional groupware systems. They have been a key focus of research into collaborative groupware and its capability to enable people to efficiently collaborate and coordinate work. Until now, no classification of these mechanisms has been undertaken to identify commonalities and differences in coordination and awareness mechanisms and to show their significance in collaborative environments. In addition, there has been little investigation of coordination and awareness mechanisms in new forms of groupware such as socially enabled Enterprise Collaboration Systems (ECS). Indeed, both in science and in practice, ECS incorporating social software have become increasingly important. Based on the combination of traditional groupware and social software, ECS also include coordination and awareness mechanisms that may simplify collaboration, but these have not yet been investigated.
Therefore, the aim of this thesis is to identify coordination and awareness mechanisms in the academic literature in order to provide a general overview of examples of those mechanisms. Additionally, this thesis aims to classify these examples. Based on a deep literature analysis, concepts described in the literature are chosen and applied with the intention of analysing the mechanisms and arriving at a classification. Based on the classification of the identified mechanisms, their commonalities and differences are examined and described to gain a better understanding of them. For illustration purposes, examples of coordination and awareness mechanisms and their application are portrayed. These examples refer to the derived classification groups. The selection of the mechanisms for the visualization is based on significant differences in their functionality. Subsequently, the selected mechanisms, mostly drawn from traditional groupware, are checked, to a limited extent, as to whether they can be found in socially enabled ECS. The collaborative platform IBM Connections serves as a practical example of ECS incorporating social software. IBM Connections is used at the University of Koblenz to run the platform "UniConnect". On this platform it is investigated which of the mechanism examples identified in the literature are applied in IBM Connections and which additional mechanisms are created by users. This work is a first step in the study of coordination and awareness mechanisms in socially enabled ECS. In addition, it is expected to detect new mechanisms, since the social dimension of collaborative work is new.
The purpose of this thesis is thus to collect and examine examples of coordination and awareness mechanisms in the literature and to analyse them. Additionally, the purpose is to provide a first overview of the mechanisms and to classify them by investigating their commonalities. Besides this, the thesis should provide an incentive for further investigation of coordination and awareness mechanisms in socially enabled ECS.
Grassland management has been intensified over the centuries, ever since mankind started to control and modify the landscape. Species communities were always shaped alongside management changes, leading to huge alterations in species richness and diversity, up to the point where land-use intensity exceeded a threshold beyond which biodiversity was increasingly lost. Today, global biodiversity, and grassland biodiversity in particular, is pushed beyond its boundaries. Policymakers and conservationists seek management options which fulfill the requirements of agronomic interests as well as biodiversity conservation, alongside the maintenance of ecosystem processes. However, there is and will always be a trade-off.
Earlier in history, natural circumstances in a landscape mainly determined regionally adapted land use. These regional adaptations shaped islands for many specialist species, and thus diverse species communities, favoring the establishment of a high β-diversity. With rising food demand, these regional and traditional management regimes became widely unprofitable, and the invention of mineral fertilizers ultimately led to a wide homogenization of grassland management and, consequently, the loss of biotic heterogeneity. In the wake of the green revolution, this immediate coherence and dependency between grassland biodiversity and traditional land-use practices has become increasingly noticed. Indeed, some traditional forms of management such as meadow irrigation have been preserved in a few regions and thus give us the opportunity to directly investigate their long-term relevance for species communities and ecosystem processes. Traditional meadow irrigation was a common management practice to improve productivity in lowland, but also alpine, hay meadows throughout Europe until the 20th century. Nowadays, meadow irrigation is only practiced as a relic in a few remnant areas. In parts of the Queichwiesen meadows, flood irrigation goes back to the Middle Ages, which makes them predestined as a model region to study the long- and short-term effects of lowland meadow irrigation on biodiversity and ecosystem processes.
Our study pointed out the conservation value of traditional meadow irrigation for the preservation of local species communities as well as plant diversity at the landscape scale. The structurally more complex irrigated meadows led to the assumption of a higher arthropod diversity (Orthoptera, Carabidae, Araneae), which, however, could not be detected. Nevertheless, irrigated meadows are a significant habitat for moisture-dependent arthropod species. In light of the agronomic potential, flood irrigation could be a way to at least reduce fertilizer costs to a certain degree and possibly prevent overfertilization pulses, which are hazardous to non-target ecosystems. Still, the re-establishment of flood irrigation in formerly irrigated meadows, or even the establishment of new irrigation systems, needs ecological and economic evaluation depending on regional circumstances and specific species communities, for which this study could serve as a reference point.
The Internet of Things (IoT) is a network of addressable, physical objects that contain embedded sensing, communication and actuating technologies to sense and interact with their environment (Geschickter 2015). Like every novel paradigm, the IoT sparks interest throughout all domains both in theory and practice, resulting in the development of systems pushing technology to its limits. These limits become apparent when having to manage an increasing number of Things across various contexts. A plethora of IoT architecture proposals have been developed, and prototype products, such as IoT platforms, have been introduced. However, each of these architectures and products applies its very own interpretation of an IoT architecture and its individual components, so that the IoT is currently more an Intranet of Things than an Internet of Things (Zorzi et al. 2010). Thus, this thesis aims to develop a common understanding of the elements forming an IoT architecture and to provide high-level specifications in the form of a Holistic IoT Architecture Framework.
Design Science Research (DSR) is used in this thesis to develop the architecture framework based on the pertinent literature. The development of the Holistic IoT Architecture Framework includes the identification of two new IoT Architecture Perspectives that became apparent during the analysis of the IoT architecture proposals identified in the extant literature. While applying these novel perspectives, the need for a new component of the architecture framework, which had only been implicitly mentioned in the literature, became obvious as well. The components of various IoT architecture proposals as well as the novel component, the Thing Management System, were combined, consolidated and related to each other to develop the Holistic IoT Architecture Framework. Subsequently, it was shown that the specifications of the architecture framework are suitable to guide the implementation of a prototype.
This contribution provides a common understanding of the basic building blocks, actors and relations of an IoT architecture.
Mapping ORM to TGraph
(2017)
Object Role Modeling (ORM) is a semantic modeling language used to describe objects and their relations to one another. Both objects and relations may be subject to rules, so-called ORM constraints.
TGraphs are ordered, attributed, typed and directed graphs. The type of a TGraph and its components, the edges and vertices, is defined using the schema language graph UML (grUML), a profiled version of UML class diagrams. The goal of this thesis is to map ORM schemas to grUML schemas in order to be able to represent ORM schema instances as TGraphs.
Up to this point, the preferred representation of ORM schema instances has been in the form of relational tables. Though mappings from ORM schemas to relational schemas exist, those publicly available do not support most of the constraints ORM has to offer.
Constraints can be added to grUML schemas using the TGraph query language GReQL, which can efficiently check whether a TGraph satisfies a constraint or not. The graph library JGraLab provides efficient implementations of TGraphs and of the query language GReQL, and supports the generation of grUML schemas.
The first goal of this work is to perform a complete mapping from ORM schemas to grUML schemas, using GReQL to specify constraints. The second goal is to represent ORM instances in the form of TGraphs.
This work gives an overview of ORM, TGraphs, grUML and GReQL and of the theoretical mapping from ORM schemas to grUML schemas. It also describes the implementation of this mapping, deals with the representation of ORM schema instances as TGraphs, and addresses the question of how grUML constraints can be validated.
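The core idea — representing schema instances as typed graphs and checking ORM constraints with queries — can be illustrated with a small sketch. The following Python analogy does not use GReQL or JGraLab; the types (`Person`, `hasPassport`) and the uniqueness check are hypothetical examples of how a binary ORM fact type with a uniqueness constraint could be validated over a TGraph-like structure.

```python
# Hypothetical sketch: an ORM uniqueness constraint checked over a
# TGraph-like structure of typed vertices and typed, directed edges.
# Names are illustrative, not taken from the thesis.
from collections import Counter

vertices = [("v1", "Person"), ("v2", "Person"), ("v3", "Passport"), ("v4", "Passport")]
edges = [
    ("e1", "hasPassport", "v1", "v3"),
    ("e2", "hasPassport", "v2", "v4"),
]

def violates_uniqueness(edges, edge_type):
    """An ORM uniqueness constraint on a binary fact type says each
    source vertex may play the role at most once; return the violators."""
    sources = Counter(src for _, t, src, _ in edges if t == edge_type)
    return [v for v, n in sources.items() if n > 1]

print(violates_uniqueness(edges, "hasPassport"))  # []
edges.append(("e3", "hasPassport", "v1", "v4"))   # second passport for v1
print(violates_uniqueness(edges, "hasPassport"))  # ['v1']
```

In GReQL, such a check would instead be phrased as a declarative query attached to the grUML schema, evaluated over the TGraph.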
Natural pest control and pollination are important ecosystem services for agriculture. They can be supported by organic farming and by seminatural habitats at the local and landscape scale.
The potential of seminatural habitats to support predatory flies (chapters 2 and 3) and bees (chapter 7) at the local and landscape scale was investigated. Predatory flies were more abundant in woody habitats and positively related to landscape complexity. The diversity and abundance of honey and wild bees were positively related to the supply of flowers offered in the seminatural habitats.
The influence of organic farming, adjacent seminatural habitats and landscape complexity on pest control (chapter 4) and pollination (chapter 6) was investigated in 18 pumpkin fields. Organic farming showed no strong effects on either the pest control or the pollination of pumpkin.
Pest control is best supported at the local scale by the flower abundance in the adjacent habitat. The flower supply positively affected the density of natural enemies and tended to reduce aphid densities in pumpkin fields.
Pumpkin provides a striking example of a dominant role of wild pollinators in pollination success, because bumble bees are the key pollinators of pumpkin in Germany, despite a higher visitation frequency of honey bees. Pollination is best supported by landscape complexity. Bumble bee visits and, as a result, pollen delivery in pumpkin were negatively related to the dominance of agricultural land in the surrounding landscape.
The influence of aphid density (chapter 8) and pollination (chapter 5) on pumpkin yield was evaluated. Pumpkin yields were not affected by aphid densities observed in the pumpkin fields and not limited by pollination at the current levels of bee visitation.
In conclusion, seminatural habitats that provide diverse, continuous floral resources are especially important for natural enemies and pollinators. A sufficient proportion of different seminatural habitat types in agricultural landscapes should be maintained and restored. Thereby, natural enemies such as predatory flies, wild pollinators such as bumble bees, and the pest control and pollination they provide can be supported.
The extensive literature in the data visualization field indicates that the process of creating efficient data visualizations requires the data designer to have a large set of skills from different fields (such as computer science, user experience, and business expertise). However, there is a lack of guidance about the visualization process itself. This thesis aims to investigate the different processes for creating data visualizations and to develop an integrated framework that guides this process and enables the user to create more useful and usable data visualizations. Firstly, existing frameworks in the literature are identified, analyzed and compared. During this analysis, eight views of the visualization process are developed. These views represent the set of activities which should be carried out in the visualization process. Then, a preliminary integrated framework is developed based on an analysis of these findings. This new integrated framework is tested in the field of Social Collaboration Analytics on an example from the UniConnect platform. Lastly, the integrated framework is refined and improved based on the results of testing, with the help of diagrams, visualizations and textual descriptions. The results show that the visualization process is not a waterfall-type process but an iterative methodology with certain phases of work, demonstrating how to address the eight views with different levels of stakeholder involvement. The findings form the basis for a visualization process which can be used in future work to develop a fully functional methodology.
The Internet of Things (IoT) recently developed from the far-away vision of ubiquitous computing into very tangible endeavors in politics and economy, implemented in expensive preparedness programs. Experts predict considerable changes in business models that need to be addressed by organizations in order to respond to competition. Although there is a need to develop strategies for upcoming transformations, organizational change literature has not yet turned to the specific change related to this new technology. This work aims at investigating IoT-related organizational change by identifying and classifying different change types. It therefore combines the methodological approach of grounded theory with a discussion and classification of the identified change, informed by a structured literature review of organizational change literature. This includes a meta-analysis of case studies using a qualitative, exploratory coding approach to identify categories of organizational change related to the introduction of IoT. Furthermore, a comparison of the identified categories with former technology-related change is provided, using the examples of Electronic Business (e-business), Enterprise Resource Planning (ERP) systems, and Customer Relationship Management (CRM) systems. As a main result, this work develops a comprehensive model of IoT-related business change. The model presents two main themes of change, indicating that personal smart things will transform businesses by means of using more personal devices, suggesting and scheduling actions of their users, and trying to avoid hazards. At the same time, the availability of information in organizations will further increase to a state where information is available ubiquitously. This will ultimately enable accessing real-time information about objects and persons anytime and from any place. As a secondary result, this work gives an overview of concepts of technology-related organizational change in academic literature.
Key mechanisms for the release of metal(loid)s from a construction material in hydraulic engineering
(2017)
Hydraulic engineering, and thus construction materials, are necessary to enable the navigability of waterways. Since a variety of natural as well as artificial materials is used, these materials are tested worldwide for a potential release of dangerous substances in order to prevent adverse effects on the environment. To determine the potential release, it is important to identify and understand the key mechanisms that are decisive for the release of hazardous substances. The conditions used in regulatory tests correlate only to a limited extent with those found in environmental systems; hence, the significance of results from standardised tests on construction materials is often questioned, since these tests are not designed to mimic environmental conditions.
In Germany, industrial by-products are used as armour stones in hydraulic engineering. In particular, the by-product copper slag has been used during the last 40 years for the construction of embankments, groynes and coastal protection. On the one hand, this material has a high density and its use protects natural resources (landscape). On the other hand, the material contains high quantities of metal(loid)s. Therefore, copper slag (product name: iron silicate stones) is very suitable as a test material. The metal(loid)s examined were As, Sb and Mo as representatives of (hydr)oxide-forming elements, and Cd, Co, Cu, Fe, Ni, Pb and Zn as representatives of elements forming cations during the release.
Questions addressed in this thesis were: (i) can the results from batch experiments be transferred to construction scenarios under the prevalent environmental conditions, (ii) which long-term trends exist for the release of metal(loid)s from copper slags, and (iii) how do environmental conditions influence the leaching of metal(loid)s from water construction materials?
To answer the first question, the surface-dependent release of the metal(loid)s from the construction materials was examined. For this purpose, batch leaching experiments with different particle sizes and a constant liquid/solid ratio were performed. In a second step, a comparison between different methods for the determination of the specific surface area of armour stones was performed, with a 3D laser scanning method as a reference. In a last step, it was possible to show that, via a roughness factor, the results for the specific surface area of small stones, measured with gas adsorption, can be connected with the results for armour stones, determined with an aluminium foil method. Based on these calculations of the specific surface area, it was possible to significantly improve catchment-scale calculations of the release of metal(loid)s and to evaluate the potential impact of construction materials in hydraulic engineering on the water chemistry of rivers and streams.
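The roughness-factor scaling described above can be sketched as follows. All numbers are made-up illustrations, not measurements from the thesis: the factor is determined on small stones where both gas-adsorption and geometric surface areas are measurable, and is then used to scale up the geometric area of an armour stone.

```python
# Hypothetical illustration of roughness-factor scaling; all values
# are invented examples, not data from the thesis.

def roughness_factor(ssa_gas_m2_per_g, ssa_geometric_m2_per_g):
    """Ratio of the gas-adsorption surface area to the geometric
    surface area, determined on small stones where both can be measured."""
    return ssa_gas_m2_per_g / ssa_geometric_m2_per_g

# Small stones: both measurements available (hypothetical values).
f = roughness_factor(ssa_gas_m2_per_g=0.12, ssa_geometric_m2_per_g=0.004)

# Armour stone: only the geometric surface (e.g. via the aluminium
# foil method) is practical to measure; scale it with the factor.
geometric_area_armour_m2 = 0.35
reactive_area_armour_m2 = f * geometric_area_armour_m2
print(round(reactive_area_armour_m2, 2))  # 10.5
```

The scaled surface area can then feed catchment-scale release estimates, which depend on the reactive surface exposed to the water.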
To answer the second question, long-term leaching experiments supported by diffusive gradients in thin films (DGT) were performed for half a year. DGT is an in situ method to passively sample metal(loid)s in water, sediments and soils. The DGT samplers were used as a sink for metal(loid)s in the eluate to control solution equilibria. Thus, the exchange of the eluent, which is normally performed in long-term experiments, was superfluous, and long-term effects could be studied under undisturbed conditions. The long-term leaching experiments with DGT have proven capable of (i) differentiating between the depletion of the material surface and solution equilibria and (ii) studying sorption processes with or without a further release of the analytes. For the practically relevant test material copper slag this means that: (i) the cations Cd, Co, Cu, Ni and Pb were confirmed to be released from the slag over the whole period of six months, (ii) a surface depletion of Zn was detected, and (iii) the (hydr)oxide-forming elements As, Mo and Sb were released from the slag over the whole period of six months, but their release was masked by adsorption to Fe-oxide colloids, which formed during the leaching experiments. It was confirmed that sulphide minerals are the main source of the long-term release of Cd, Cu, Ni, Pb and Mo.
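DGT infers a time-averaged labile concentration from the mass of analyte accumulated in the binding layer via the standard DGT equation, C = M·Δg / (D·A·t), where Δg is the diffusive layer thickness, D the diffusion coefficient, A the exposure area and t the deployment time. The sketch below uses invented example values, not data from the thesis:

```python
# Illustrative calculation of the standard DGT equation
# C = M * dg / (D * A * t); all numbers are hypothetical examples.

def dgt_concentration(M_ng, dg_cm, D_cm2_s, A_cm2, t_s):
    """Time-averaged labile concentration (ng/cm^3, i.e. ug/L) from the
    mass M accumulated in the binding layer, diffusive layer thickness
    dg, diffusion coefficient D, exposure area A and deployment time t."""
    return M_ng * dg_cm / (D_cm2_s * A_cm2 * t_s)

# Example: 50 ng of a metal accumulated over a 3-day deployment
# (hypothetical values for dg, D and A).
c = dgt_concentration(M_ng=50.0, dg_cm=0.094, D_cm2_s=6.0e-6,
                      A_cm2=3.14, t_s=3 * 24 * 3600)
print(round(c, 3))  # concentration in ug/L
```

Because the binding gel acts as a permanent sink, this calculation yields the mean concentration at the sampler over the whole deployment, which is what makes DGT suitable for undisturbed long-term leaching experiments.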
To answer the third question, short-term leaching experiments simulating environmental conditions in hydraulic engineering were performed. One factor is the salinity. The influence of this parameter was tested in batch experiments with sea salt solution (30 g/l), river Rhine water, ultra-pure water and, in addition, with different NaCl concentrations (5, 10, 20 and 30 g/l). In general, the ionic strength is an important factor for the metal(loid) release, but the composition of the water (e.g. the HCO3- content) may superimpose this effect. Accordingly, the metal(loid) concentrations in the experiments with ultra-pure water spiked with sea salt or with native river water differed significantly from those with ultra-pure water spiked with NaCl. In a second experiment, the influence of the environmental parameters pH (4–10), sediment content (0–3.75 g), temperature (4–36 °C) and ionic strength (0–30 g/l NaCl), as well as the interactions between these parameters, on the release of metal(loid)s from the test material was examined. Statistical Design of Experiments (DoE) was used to study the influence of these factors as well as their interactions. All studied factors may impact the release of metal(loid)s from the test material into the eluent, whereas the release and the partitioning of metal(loid)s between sediment and eluate were impacted by interactions between the studied factors. The main processes were sorption, complexation, solubility, buffering and ion exchange. In addition, by separating the sediment from the slag after the experiments by magnetic separation, the enrichment of metal(loid)s in the sediment became visible. Thus, the sediment was the most important factor for the release of the metal(loid)s, mediated via pH, temperature and ionic strength, because the sediment acted as a sink.
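A DoE approach like the one described varies all factors simultaneously so that interactions become estimable. The thesis does not specify the exact design here, so the sketch below assumes a simple two-level full factorial over the four stated factors, with the range endpoints as levels:

```python
# Minimal sketch of a two-level full factorial design for the four
# factors studied (pH, sediment content, temperature, ionic strength).
# The actual design used in the thesis may differ; the levels below
# are simply the endpoints of the stated ranges.
from itertools import product

factors = {
    "pH": (4, 10),
    "sediment_g": (0.0, 3.75),
    "temperature_C": (4, 36),
    "NaCl_g_per_l": (0.0, 30.0),
}

# One experimental run per combination of factor levels.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2^4 = 16 experimental runs
for run in runs[:2]:
    print(run)
```

With 16 runs, main effects and all two-factor interactions (e.g. sediment × pH on metal(loid) partitioning) can be estimated, which is exactly the kind of interaction the study reports as decisive.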