With the emergence of current-generation head-mounted displays (HMDs), virtual reality (VR) is regaining much interest in the field of medical imaging and diagnosis. Room-scale exploration of CT or MRI data in virtual reality feels like an intuitive application. However, in VR, retaining a high frame rate is more critical than for conventional user interaction seated in front of a screen. There is strong scientific evidence suggesting that low frame rates and high latency strongly influence the appearance of cybersickness. This thesis explores two practical approaches to overcoming the high computational cost of volume rendering for virtual reality. One lies in exploiting the coherency properties of the especially costly stereoscopic rendering setup. The main contribution is the development and evaluation of a novel acceleration technique for stereoscopic GPU ray casting. Additionally, an asynchronous rendering approach is pursued to minimize the amount of latency in the system. A selection of image warping techniques has been implemented and evaluated methodically, assessing their applicability to VR volume rendering.
Successful export sectors in manufacturing and agribusiness are important drivers of structural transformation in Sub-Saharan African countries. Backed by industrial policies and active state involvement, a small number of successful productive export sectors have emerged in Sub-Saharan Africa. This thesis asks the question: How do politics shape the promotion of export-driven industrialisation and firm-level upgrading in Sub-Saharan Africa? It exemplifies this question with an in-depth, qualitative study of the cashew processing industry in Mozambique in the period from 1991 until 2019. Mozambique was one of the world's largest producers and processors of cashew nuts in the 1960s and 1970s. At the end of the 20th century, the cashew processing industry broke down completely but has since re-emerged as one of the country's few successful agro-processing exports.
The thesis draws on theoretical approaches from the fields of political science, notably the political settlements framework, global value chain analysis and research on technological capabilities, to explore why the Mozambican Government supported the cashew processing industry and how Mozambican cashew processors acquired the technological capabilities needed to access the global cashew value chain and to upgrade. It makes an important theoretical contribution by linking the political settlements framework and the literature on upgrading in global value chains to study how politics shaped productive sector promotion and upgrading in the Mozambican cashew processing industry. The findings of the thesis are based on extensive primary data, including 58 expert interviews and 10 firm surveys, collected in Mozambique in 2018, as well as a broad base of secondary literature.
The thesis argues that the Mozambican Government supported the cashew processing industry because it became important for the Government's political survival. Promoting the cashew sector formed part of an electoral strategy for the ruling FRELIMO coalition and a means of keeping FRELIMO factions united by offering economic opportunities to key constituencies. In 1999, the Government adopted a protectionist cashew law that created strong incentives for cashew processing in Mozambique. This not only facilitated the re-emergence of the cashew processing industry after its breakdown. The law and the active involvement of the National Cashew Institute (INCAJU) also affected the governance of the local cashew value chain, the creation of backward linkages, and the upgrading paths of cashew processors. The findings of the thesis suggest that the cashew law reduced the pressure on the cashew processing industry to upgrade. The law further created opportunities for formal and informal rent creation for members of the political elite and lower-level FRELIMO officials, which prevented a far-reaching reform of the law. The thesis shows that international buyers do not promote upgrading among Sub-Saharan African firms in global value chains with market-based or modular governance. Moreover, firms that operate in countries where industrial policies are not enforced effectively cannot draw on the support of government institutions to enhance their capabilities and to upgrade. Firms therefore mainly depended on costly learning channels at firm level, e.g. learning by doing or hiring skilled labour, and/or on technical assistance from donors to build the technological capabilities needed to access global value chains and to remain competitive.
The findings of the thesis suggest that researchers, governments, development practitioners and consultants need to rethink their understanding of upgrading in GVCs in four ways. First, they need to move away from understanding upgrading solely in terms of moving towards more complex, higher value-added activities in GVCs (functional upgrading). Instead, it is important to consider the potential of other, more realistic types of upgrading for firms in low-income countries, such as reducing risks by diversifying suppliers and buyers or increasing rewards by making production processes more efficient. Second, they need to replace an overly positive view of upgrading that neglects possible side-effects at sector and/or country level. Third, they need to recognise that GVC participation on its own does not promote upgrading among local supplier firms in Sub-Saharan Africa. The interests of lead firms and Sub-Saharan African supplier firms may not be aligned, or may even conflict. Targeted industrial policies and the creation of institutions that effectively promote capability building among firms therefore become even more important. Finally, upgrading needs to be understood as a process that is shaped not only by interactions between firms, but also by local domestic politics.
The findings of the thesis are highly relevant for scholars from the fields of political science, development studies, and economics. Its practical implications and tools, e.g. a technological capabilities matrix for the cashew industry, are of interest to development practitioners, members of public institutions in Sub-Saharan African countries, local entrepreneurs, and representatives of local business associations involved in promoting export sectors and upgrading among local firms.
Research has shown that people recognize personality, gender, inner states and many other items of information by simply observing human motion. Expressive human motion therefore seems to be a valuable non-verbal communication channel. In the quest for more believable characters in virtual three-dimensional simulations, a great amount of visual realism has been achieved during the last decades. However, while interacting with synthetic characters in real-time simulations, human users often still sense an unnatural stiffness. This disturbance in believability is generally caused by a lack of human behavior simulation. Expressive motions, which convey personality and emotional states, can be of great help in creating more plausible and life-like characters. This thesis explores the feasibility of automatically generating emotionally expressive animations from given neutral character motions. Such research is required since common animation methods, such as manual modeling or motion capturing techniques, are too costly to create all possible variations of motions needed for interactive character behavior. To investigate how emotions influence human motion, relevant literature from various research fields has been reviewed and certain motion rules and features have been extracted. These movement domains were validated in a motion analysis and implemented, in an exemplary manner, in a system capable of automating the expression of angry, sad and happy states in a virtual character through its body language. Finally, the results were evaluated in a user test.
Semantic descriptions of non-textual media available on the web can be used to facilitate retrieval and presentation of media assets and documents containing them. While technologies for multimedia semantic descriptions already exist, there is as yet no formal description of a high-quality multimedia ontology that is compatible with existing (semantic) web technologies. We explain the complexity of the problem using an annotation scenario. We then derive a number of requirements for specifying a formal multimedia ontology, including compatibility with MPEG-7, embedding in foundational ontologies, and modularisation, including the separation of document structure from domain knowledge. We then present the developed ontology and discuss it with respect to our requirements.
Advanced Auditing of Inconsistencies in Declarative Process Models using Clustering Algorithms
(2021)
To have a compliant business process in an organization, it is essential to ensure a consistent process. Whether a process is consistent depends on the business rules of that process: if the process adheres to these business rules, it is compliant and efficient. For huge processes, checking this is quite a challenge. An inconsistency in a process can quickly lead to a non-functional process, which is a severe problem for organizations. This thesis presents a novel auditing approach for handling inconsistencies from a post-execution perspective. The tool identifies run-time inconsistencies and visualizes them in heatmaps. These plots aim to help modelers observe the most problematic constraints and make the right remodeling decisions. Many variables can be set in the tool to produce different heatmap representations, helping modelers grasp all perspectives of the problem. The heatmaps sort and show the run-time inconsistency patterns so that modelers can decide which constraints are highly problematic and should be remodeled. The tool can be applied to real-life data sets in a reasonable run-time.
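The abstract does not describe the aggregation step in detail; as a minimal illustration (with made-up constraint names and a made-up violation log, not the tool's actual data model), the ranking behind such heatmaps can be sketched as counting constraint violations across recorded traces:

```python
from collections import Counter

# Hypothetical run-time violation log: (trace id, violated constraint).
violations = [
    ("t1", "Response(A, B)"),
    ("t1", "Precedence(C, A)"),
    ("t2", "Response(A, B)"),
    ("t3", "Response(A, B)"),
]

# Count violations per constraint and rank them, most problematic first,
# so a modeler sees which constraints are the best remodeling candidates.
counts = Counter(constraint for _, constraint in violations)
for constraint, n in counts.most_common():
    print(f"{constraint}: {n}")
# Response(A, B): 3
# Precedence(C, A): 1
```

In the actual tool, such counts would be laid out per constraint and per variable setting to form the heatmap cells.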
The formulation of the decoding problem for linear block codes as an integer program (IP) with a rather tight linear programming (LP) relaxation has made a central part of channel coding accessible for the theory and methods of mathematical optimization, especially integer programming, polyhedral combinatorics and also algorithmic graph theory, since the important class of turbo codes exhibits an inherent graphical structure. We present several novel models, algorithms and theoretical results for error-correction decoding based on mathematical optimization. Our contribution includes a partly combinatorial LP decoder for turbo codes, a fast branch-and-cut algorithm for maximum-likelihood (ML) decoding of arbitrary binary linear codes, a theoretical analysis of the LP decoder's performance for 3-dimensional turbo codes, compact IP models for various heuristic algorithms as well as ML decoding in combination with higher-order modulation, and, finally, first steps towards an implementation of the LP decoder in specialized hardware. The scientific contributions are presented in the form of seven revised reprints of papers that appeared in peer-reviewed international journals or conference proceedings. They are accompanied by an extensive introductory part that reviews the basics of mathematical optimization, coding theory, and the previous results on LP decoding that we rely on afterwards.
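In standard notation (not necessarily the notation used in the thesis), the underlying ML decoding problem can be written as the integer program

```latex
\min_{x \in \{0,1\}^n} \; \sum_{i=1}^{n} \lambda_i x_i
\qquad \text{s.t.} \qquad H x \equiv 0 \pmod{2},
```

where the $\lambda_i$ are the log-likelihood ratios computed from the channel output and $H$ is the parity-check matrix of the code. LP decoding relaxes the integrality constraint $x \in \{0,1\}^n$ to a polytope described by linear inequalities derived from the parity checks, which makes the problem solvable in polynomial time at the cost of possibly fractional, i.e. undecodable, solutions.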
Despite the inception of new technologies at a breakneck pace, many analytics projects fail, mainly due to the use of incompatible development methodologies. As big data analytics projects differ from software development projects, the methodologies used in software development projects cannot be applied in the same fashion to analytics projects. Traditional agile project management approaches do not consider the complexities involved in analytics. In this thesis, the challenges involved in generalizing the application of agile methodologies will be evaluated, and some suitable agile frameworks that are more compatible with analytics projects will be explored and recommended. The standard practices and approaches currently applied in the industry for analytics projects will be discussed with respect to enablers and success factors for agile adoption. In the end, after a comprehensive discussion and analysis of the problem and its complexities, a framework will be recommended that copes best with the discussed challenges and complexities and is generally well suited for most data-intensive analytics projects.
The loss of biodiversity is recognised on a global scale, also in the anthropogenic landscapes used for agriculture, which now cover almost 50% of the global terrestrial land surface. In agriculture, pesticides, i.e. biologically active chemicals, are deliberately distributed to control pests, diseases and weeds in the cropped areas. The quantification of remaining semi-natural structures such as field margins and hedges is a prerequisite for understanding the impact of pesticides on biodiversity, since these structures represent habitats for many organisms in agricultural landscapes. The presence of organisms in these habitats and crops is required to obtain an estimate of their potential pesticide exposure. In this text I provide studies on animal groups so far not addressed in risk assessment procedures for the regulation of pesticides, such as amphibians, moths and bats. For all groups it becomes apparent that they are present in agricultural landscapes and potentially coincide with pesticide applications, indicating a risk. Risk quantification also requires data on the sensitivity of organisms, and here data for plants, amphibians and bees are presented. Effects translating to the community level were studied for herbicides, insecticides and fertilisers in a natural system. After three years the treatments resulted in simplified plant communities with lower species numbers and a reduction in flowering plants. This reduction of flowers is used as an example of an indirect effect and was especially obvious for the effect of a herbicide on the common buttercup. Sublethal herbicide effects on a plant translated into an impact on feeding caterpillars, indicating a reduction in food quality. Insecticide inputs realistic for field margins also reduced moth pollination of white campion flowers by 30%.
These indirect effects via distortions of food web characteristics play a critical role in understanding declines in organism groups, but are so far not accounted for in pesticide risk assessment schemes. The current intense use of pesticides in agriculture and their inherent toxicity may lead to a chemical landscape fragmentation, where populations are no longer connected. Source-sink dynamics are important ecological processes, and as a final result not only population size but also genetic population structure might be affected. Including potential pesticide impacts as costs in a model for amphibians migrating to breeding ponds in vineyards in Rhineland-Palatinate indicated the isolation of the investigated populations. A first validation by analysing the population structure of the European common frog confirmed the model prediction for some sites. For the regulation of pesticides in Europe a risk assessment is required, and for the organisms of the terrestrial habitat a multitude of guidance documents is in place or has recently been developed or improved. The results of the presented research indicate that wild plants, and especially their reproductive flower stage, are highly sensitive and that risks are underestimated. Population recovery of arthropods needs re-evaluation at the landscape scale, and the addition of amphibian risk assessment to regulation procedures is suggested. However, developing or adapting risk assessment procedures and test systems is a time-consuming task, and therefore the establishment of risk management options is a pragmatic alternative with immediate effect. Artificial wetlands in the agricultural landscape proved to be important foraging sites for bats, and their creation could mitigate negative pesticide effects. The integration of direct and indirect effects in a risk assessment scheme for all organism groups, addressing also the landscape scale and pesticide mixtures, requires a long development time.
The establishment of model landscapes where management options and integrated pest management are applied on a larger scale would allow us to study pesticide effects in a realistic scenario and to develop an approach for the agriculture of the future.
The use of agricultural plastic covers has become common practice for its agronomic benefits such as improving yields and crop quality, managing harvest times better, and increasing pesticide and water use efficiency. However, plastic covers are suspected of partially breaking down into smaller debris and thereby contributing to soil pollution with microplastics. A better understanding of the sources and fate of plastic debris in terrestrial systems has so far been hindered by the lack of adequate analytical techniques for the mass-based and polymer-selective quantification of plastic debris in soil. The aim of this dissertation was thus to assess, develop, and validate thermoanalytical methods for the mass-based quantification of relevant polymers in and around agricultural fields previously covered with fleeces, perforated foils, and plastic mulches. Thermogravimetry/mass spectrometry (TGA/MS) enabled direct plastic analyses of 50 mg of soil without any sample preparation. With polyethylene terephthalate (PET) as a preliminary model, the method limit of detection (LOD) was 0.7 g kg−1. But the missing chromatographic separation complicated the quantification of polymer mixtures. Therefore, a pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) method was developed that additionally exploited the selective solubility of polymers in specific solvents prior to analysis. By dissolving polyethylene (PE), polypropylene (PP), and polystyrene (PS) in a mixture of 1,2,4-trichlorobenzene and p-xylene after density separation, up to 50 g soil became amenable to routine plastic analysis. Method LODs were 0.7–3.3 mg kg−1, and the recovery of 20 mg kg−1 PE, PP, and PS from a reference loamy sand was 86–105%. In the reference silty clay, however, poor PS recoveries, potentially induced by the additional separation step, suggested a qualitative evaluation of PS. Yet, the new solvent-based Py-GC/MS method enabled a first exploratory screening of plastic-covered soil. 
It revealed PE, PP, and PS contents above LOD in six of eight fields (6% of all samples). In three fields, PE levels of 3–35 mg kg−1 were associated with the use of 40 μm thin perforated foils. By contrast, 50 μm PE films were not shown to induce plastic levels above LOD. PP and PS contents of 5–19 mg kg−1 were restricted to single observations in four fields and potentially originated from littering. The results suggest that the short-term use of thicker and more durable plastic covers should be preferred to limit plastic emissions and accumulation in soil. By providing mass-based information on the distribution of the three most common plastics in agricultural soil, this work may facilitate comparisons with modeling and effect data and thus contribute to a better risk assessment and regulation of plastics. However, the fate of plastic debris in the terrestrial environment remains incompletely understood and needs to be scrutinized in future, more systematic research. This should include the study of aging processes, the interaction of plastics with other organic and inorganic compounds, and the environmental impact of biodegradable plastics and nanoplastics.
Graphs are known to be a good representation of structured data. TGraphs, which are typed, attributed, ordered, and directed graphs, are a very general kind of graph that can be used for many domains. The Java Graph Laboratory (JGraLab) provides an efficient implementation of TGraphs with all their properties. JGraLab ships with many features, including a query language (GReQL2) for extracting data from a graph. However, it lacks a generic library for important common graph algorithms. This mid-study thesis extends JGraLab with a generic algorithm library called Algolib, which provides a generic and extensible implementation of several important common graph algorithms. The major aspects of this work are the generic nature of Algolib, its extensibility, and the methods of software engineering that were used to achieve both. Algolib is designed to be extensible in two ways. Existing algorithms can be extended for solving specialized problems, and further algorithms can easily be added to the library.
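Algolib itself is a Java library and its actual API differs; purely to illustrate the extension-by-subclassing idea described above, here is a minimal sketch in Python: a base breadth-first search exposes a visit hook, and a subclass extends it to solve a specialised problem (computing shortest distances).

```python
from collections import deque

class BFS:
    """Base algorithm: traversal order is fixed, behaviour is extensible."""

    def __init__(self, graph):
        self.graph = graph          # dict: vertex -> iterable of neighbours

    def visit(self, vertex, parent):
        pass                        # hook for subclasses

    def run(self, start):
        seen, queue = {start}, deque([(start, None)])
        while queue:
            v, parent = queue.popleft()
            self.visit(v, parent)
            for w in self.graph.get(v, ()):
                if w not in seen:
                    seen.add(w)
                    queue.append((w, v))

class DistanceBFS(BFS):
    """Extension: reuse the traversal to compute distances from the start."""

    def __init__(self, graph):
        super().__init__(graph)
        self.distance = {}

    def visit(self, vertex, parent):
        self.distance[vertex] = 0 if parent is None else self.distance[parent] + 1

g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
bfs = DistanceBFS(g)
bfs.run("a")
print(bfs.distance["d"])  # 2
```

The same pattern generalises to the second kind of extensibility: new algorithms can be added alongside existing ones without touching the base traversals.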
In the last decades, it became evident that the world is facing an unprecedented, human-induced global biodiversity crisis, with amphibians being one of the most threatened species groups. About 41% of amphibian species are classified as endangered by the IUCN, but even in amphibian species listed as "least concern", population declines can be observed on a local level. With land-use change and agrochemicals (i.e. pesticides), two of the main drivers of this amphibian decline are directly linked to intensive agriculture, which is the dominant landscape type in large parts of Europe. Thus, understanding the situation of amphibians in the agricultural landscape is crucial for conservation measures. In the present thesis, I investigated the effects of viticulture on amphibian populations around Landau in der Pfalz (Germany) in terms of habitat use, pesticide exposure, biometric traits as well as genetic and age structure. From the perspective of amphibians, land-use change usually means the destruction of habitats in agricultural landscapes, which often leads to landscape fragmentation. Thus, I investigated whether vineyards also lead to fragmentation of the landscape and whether pesticides that are frequently used in viticulture have to be considered a contributing factor, i.e. whether there is a chemical landscape fragmentation. Using telemetry, I could show that common toads (Bufo bufo) can be found directly in vineyards, but that they tend to avoid them as habitat. Analysing the genetic structure of common frogs (Rana temporaria) revealed that vineyards have to be considered a barrier for amphibians. To identify whether pesticides contribute to the resulting landscape fragmentation, I conducted an arena choice experiment in the laboratory, in which I found evidence for an avoidance of pesticide-contaminated soil. Such an avoidance could be one of the underlying reasons for a potential chemical landscape fragmentation.
By combining telemetry data with information about pesticide applications from local wine growers, I could show that a large proportion of common toads is likely to come into contact with pesticides. Further, I demonstrated that the agricultural landscape, probably due to the application of pesticides, can have negative effects on the reproductive capacity of common toads. By studying palmate newts (Lissotriton helveticus) I found that adult newts from agricultural ponds are smaller than those from forest ponds. As I did not find differences in age structure and growth, these differences might be carry-over effects from earlier life stages. While agricultural ponds might be suitable habitats for adult palmate newts, the potential carry-over effect indicates suboptimal conditions for larvae and/or juveniles. I conclude that the best management measure for sustaining amphibians in the agricultural landscape would be a heterogeneous cultural landscape with a mosaic of different habitat patches that are managed without pesticides, or at least with a reduced amount. Green corridors between populations and different habitats would allow migrating individuals to avoid agricultural and thus pesticide-contaminated areas. This would reduce the pesticide exposure risk of amphibians while preventing the fragmentation of the landscape and thus the isolation of populations.
With 47% land coverage in 2016, agricultural land was one of the largest terrestrial biomes in Germany. About 70% of the agricultural land was cropped area with associated pesticide applications. Agricultural land also represents an essential habitat for amphibians. Therefore, exposure of amphibians to agrochemicals, such as fertilizers and pesticides, seems likely. Pesticides can be highly toxic for amphibians, even a fraction of the original application rate may result in high amphibian mortality.
To evaluate the potential risk of pesticide exposure for amphibians, the temporal coincidence of amphibian presence on agricultural land and pesticide applications (N = 331) was analyzed for the fire-bellied toad (Bombina bombina), moor frog (Rana arvalis), spadefoot toad (Pelobates fuscus) and crested newt (Triturus cristatus) during spring migration. In 2007 and 2008, up to 80% of the migrating amphibians temporally coincided with pesticide applications in the study area of Müncheberg, about 50 km east of Berlin. Pesticide interception by plants ranged between 50 to 90% in winter cereals and 80 to 90% in winter rape. The highest coincidence was observed for the spadefoot toad, where 86.6% of the reproducing population was affected by a single pesticide in winter rape during stem elongation with 80% pesticide interception by plants. Late migrating species, such as the fire-bellied toad and the spadefoot toad, overlapped more with pesticide applications than early migrating species, such as the moor frog, did. Under favorable circumstances, the majority of early migrants may not coincide with the pesticide applications of arable fields during spring migration.
To evaluate the potential effect of pesticide applications on populations of the common frog (Rana temporaria), a landscape genetic study was conducted in the vinicultural area of Southern Palatinate. Due to small sample sizes at breeding sites within viniculture, several DNA sampling methods were tested. Furthermore, the novel repeated randomized selection of genotypes approach was developed to utilize genetic data from siblings for more reliable estimates of genetic parameters. Genetic analyses highlighted three of the breeding site populations located in viniculture as isolated from the meta-population. Genetic differentiation among breeding site populations in the viniculture (median pairwise FST=0.0215 at 2.34 km to 0.0987 at 2.39 km distance) was higher compared to genetic differentiation among breeding site populations in the Palatinate Forest (median pairwise FST=0.0041 at 5.39 km to 0.0159 at 9.40 km distance).
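As background for the reported pairwise values, F_ST quantifies genetic differentiation between subpopulations relative to the total population. The following is a minimal sketch of Nei's F_ST for a single biallelic locus, illustrative only and not the estimator used in the thesis:

```python
# F_ST = (H_T - H_S) / H_T, where H_S is the mean expected heterozygosity
# within subpopulations and H_T the expected heterozygosity of the pooled
# population. Allele frequencies below are made-up example values.

def fst(allele_freqs):
    """allele_freqs: frequency of one allele in each subpopulation."""
    h_s = sum(2 * p * (1 - p) for p in allele_freqs) / len(allele_freqs)
    p_bar = sum(allele_freqs) / len(allele_freqs)
    h_t = 2 * p_bar * (1 - p_bar)
    return (h_t - h_s) / h_t

# Strongly diverged subpopulations give a high F_ST; identical ones give 0.
print(round(fst([0.2, 0.8]), 3))  # 0.36
print(fst([0.3, 0.3]))            # 0.0
```

Values near 0, as reported for the Palatinate Forest sites, thus indicate well-connected populations, while the higher values in the viniculture indicate restricted gene flow over even shorter distances.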
The presented studies add valuable information about the risk of pesticide exposure for amphibians in the terrestrial life stage and possible effects of agricultural land on amphibian meta-populations. To conserve endemic amphibian species and their (genetic) diversity in the long run, the risk assessment of pesticides and the applied agricultural management measures need to be adjusted to protect amphibians adequately. In addition, other conservation measures, such as the creation of new suitable breeding sites, should be considered to improve connectivity between breeding site populations and ensure the persistence of amphibians in the agricultural land.
Traditional Driver Assistance Systems (DAS), such as Lane Departure Warning Systems or the well-known Electronic Stability Program, have in common that their system and software architecture is static. This means that neither the number and topology of Electronic Control Units (ECUs) nor the presence and functionality of software modules changes after the vehicles leave the factory.
However, some future DAS do face changes at runtime. This is true, for example, for truck-and-trailer DAS, as their hardware components and software entities are spread over both parts of the combination. These new requirements cannot be met by state-of-the-art approaches to automotive software systems. Instead, a different technique for designing such Distributed Driver Assistance Systems (DDAS) needs to be developed. The main contribution of this thesis is the development of a novel software and system architecture for dynamically changing DAS, using the example of driving assistance for truck and trailer. This architecture has to be able to autonomously detect and handle changes within the topology. To do so, the system decides which degree of assistance and which types of HMI can be offered every time a trailer is connected or disconnected. This involves an analysis of the available software and hardware components, a determination of the possible assistance functionality and a re-configuration of the system. Such adaptation can be granted by the principles of Service-oriented Architecture (SOA). In this architectural style, all functionality is encapsulated in self-contained units, so-called Services. These Services offer their functionality through well-defined interfaces whose behavior is described in contracts. Using these Services, large-scale applications can be built and adapted at runtime. This thesis describes the research conducted to achieve these goals by introducing Service-oriented Architectures into the automotive domain. SOA deals with the high degree of distribution, the demand for re-usability and the heterogeneity of the needed components.
It also applies automatic re-configuration in the event of a system change. Instead of adapting one of the available frameworks to this scenario, the main principles of Service-orientation are picked up and tailored. This leads to the development of the Service-oriented Driver Assistance (SODA) framework, which implements the benefits of Service-orientation while ensuring compatibility and compliance with automotive requirements, best practices and standards. Within this thesis, several state-of-the-art Service-oriented frameworks are analyzed and compared. Furthermore, the SODA framework and all its different aspects regarding the automotive software domain are described in detail. These aspects include a well-defined reference model that introduces and relates terms and concepts and defines an architectural blueprint. Some of the modules of this blueprint, such as the re-configuration module and the Communication Model, are presented in full detail. In order to prove the compliance of the framework with state-of-the-art automotive software systems, a development process respecting today's best practices in automotive design procedures as well as the integration of SODA into the AUTOSAR standard are discussed. Finally, the SODA framework is used to build a full-scale demonstrator in order to evaluate its performance and efficiency.
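SODA's internals are not reproduced here; as a hypothetical sketch of the reconfiguration principle (invented service names, not SODA's actual API), the offered degree of assistance can be recomputed from the set of currently registered Services whenever a component connects or disconnects:

```python
class ServiceRegistry:
    """Toy model: Services register on connect; the assistance level offered
    to the driver is derived from what is currently available."""

    def __init__(self):
        self.services = set()

    def connect(self, provided_services):
        self.services |= set(provided_services)

    def disconnect(self, provided_services):
        self.services -= set(provided_services)

    def assistance_level(self):
        # Re-evaluated after every topology change (hypothetical rules).
        if {"truck.camera", "trailer.camera"} <= self.services:
            return "full surround view"
        if "truck.camera" in self.services:
            return "basic rear view"
        return "none"

registry = ServiceRegistry()
registry.connect(["truck.camera"])
print(registry.assistance_level())   # basic rear view
registry.connect(["trailer.camera"])
print(registry.assistance_level())   # full surround view
```

In the real framework, each Service additionally carries a contract describing its interface behavior, and the re-configuration module performs this re-evaluation.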
Software systems are often developed as a set of variants to meet diverse requirements. Two common approaches to this are "clone-and-own" and software product lines. Both approaches have advantages and disadvantages. In previous work, we and collaborators proposed an idea that combines both approaches to manage variants, similarities, and cloning by using a virtual platform and cloning-related operators.
In this thesis, we present an approach for aggregating essential metadata to enable a propagate operator, which implements a form of change propagation. For this we have developed a system to annotate code similarities which were extracted throughout the history of a software repository. The annotations express similarity maintenance tasks, which can then either be executed automatically by propagate or have to be performed manually by the user. In this work we outline the automated metadata extraction process and the system for annotating similarities; we explain how the implemented system can be integrated into the workflow of an existing version control system (Git); and, finally, we present a case study using the 101haskell corpus of variants.
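The interplay of annotated similarities and the propagate operator can be sketched in a few lines. The data layout and the `"auto"`/`"manual"` task labels below are illustrative assumptions, not the thesis's actual metadata format:

```python
# Hypothetical in-memory stand-in for two cloned variants in a repository.
variants = {
    "v1": {"util.py": "def add(a, b): return a + b\n"},
    "v2": {"util.py": "def add(a, b): return a + b\n"},
}

# Similarity annotations (as if extracted from the repository history).
# 'auto' means propagate may apply the change itself; 'manual' means the
# maintenance task is deferred to the user.
similarities = [
    {"from": ("v1", "util.py"), "to": ("v2", "util.py"), "task": "auto"},
]


def propagate(changed_variant, changed_file, new_content):
    """Apply a change and push it along 'auto' similarity links.

    Returns the list of (variant, file) targets that still require
    manual attention."""
    variants[changed_variant][changed_file] = new_content
    pending = []
    for link in similarities:
        if link["from"] == (changed_variant, changed_file):
            target_variant, target_file = link["to"]
            if link["task"] == "auto":
                variants[target_variant][target_file] = new_content
            else:
                pending.append((target_variant, target_file))
    return pending
```

A change committed to `v1` is then automatically mirrored into the annotated clone in `v2`, while any `manual` links would be reported back to the user instead.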
As Enterprise 2.0 (E2.0) initiatives are gradually moving out of the early experimentation phase it is time to focus greater attention on examining the structures, processes and operations surrounding E2.0 projects. In this paper we present the findings of an empirical study to investigate and understand the reasons for initiating E2.0 projects and the benefits being derived from them. Our study comprises seven in-depth case studies of E2.0 implementations. We develop a classification and means of visualising the scope of E2.0 initiatives and use these methods to analyse and compare projects.
Our findings indicate a wide range of motivations and combinations of technology in use and show a strong emphasis on the content management functionality of E2.0 technologies.
An empirical study to evaluate the location of advertisement panels by using a mobile marketing tool
(2009)
The efficiency of marketing campaigns is a precondition for business success. This paper discusses a technique to transfer advertisement content via Bluetooth technology while collecting market research information at the same time. Conventional advertisement media were enhanced by devices that automatically measure the number, distance, frequency and exposure time of passersby, making information available to evaluate both the wireless media and the location in general. This paper presents a study analyzing these data. A cryptographic one-way function protects privacy during data acquisition.
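The privacy mechanism mentioned above can be sketched as follows: a cryptographic one-way hash maps each observed device address to a stable pseudonym, so repeat encounters (frequency) can still be counted without ever storing a raw address. The salt value and address format are illustrative assumptions:

```python
import hashlib


def pseudonymise(mac: str, salt: bytes = b"campaign-salt") -> str:
    """One-way mapping of a Bluetooth device address to a pseudonym.

    SHA-256 is practically irreversible, so the raw MAC cannot be
    recovered, yet the same device always yields the same pseudonym,
    which is all that frequency counting needs."""
    return hashlib.sha256(salt + mac.encode()).hexdigest()


# Example sightings at an advertisement panel (addresses are made up).
sightings = ["AA:BB:CC:DD:EE:01", "AA:BB:CC:DD:EE:02", "AA:BB:CC:DD:EE:01"]
pseudonyms = [pseudonymise(m) for m in sightings]

unique_passersby = len(set(pseudonyms))  # 2 distinct devices
repeat_visit = pseudonyms[0] == pseudonyms[2]  # same device seen twice
```

Only the pseudonyms need to be logged by the measurement device; the market research metrics (unique visitors, repeat frequency) remain computable.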
This study explores which factors influence the entrepreneurial intention of students in the construction industry and examines the impact of those factors, specifically among students of Hanoi Construction University and Hanoi Architecture University. Based on the findings, the study also proposes solutions for entrepreneurship in the construction field in Vietnam and directions for future research. The Theory of Planned Behavior is used as the theoretical framework. Both qualitative and quantitative methods are employed. A questionnaire was administered to students of the two universities mentioned above; an exploratory factor analysis (EFA) was then performed to test the validity of the constructs. The research findings identify the factors influencing entrepreneurial intention and their impact, and propose some solutions to improve entrepreneurship in the construction field in Vietnam.
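The core of an exploratory factor analysis as used in such studies can be sketched numerically: eigen-decompose the correlation matrix of the questionnaire items and retain factors by the Kaiser criterion (eigenvalue > 1). The synthetic two-factor data below is purely illustrative, not the study's actual survey data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Likert-style responses: 200 students x 6 items, generated
# from two latent factors (think: 'attitude' and 'perceived control').
latent = rng.normal(size=(200, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                     [0.1, 0.8], [0.0, 0.7], [0.1, 0.9]])
items = latent @ loadings.T + 0.3 * rng.normal(size=(200, 6))

# Principal-axis-style extraction: eigen-decompose the item correlations.
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]  # sorted descending

# Kaiser criterion: keep factors whose eigenvalue exceeds 1.
n_factors = int(np.sum(eigvals > 1.0))
```

With two strong latent factors driving the six items, the criterion recovers exactly two factors — the kind of construct-validity check the EFA in the study performs.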
Within aquatic environments, sediment-water interfaces (SWIs) are the most important areas for exchange processes between the water body and the sediment. These spatially restricted regions are characterized by steep biogeochemical gradients that determine the speciation and fate of natural or artificial substances. Apart from biologically mediated processes (e.g., burrowing organisms, photosynthesis), the determining exchange processes are diffusion and colloid-mediated transport. Hence, methods are required that capture the fine-scale structures at the boundary layer and distinguish between the different transport pathways. Among emerging substances that will probably reach the aquatic environment, engineered nanomaterials (ENMs) are of great concern due to their increased use in many products and applications. Since they are defined by their size (<100 nm), they comprise a variety of different materials that behave differently in the environment. Once released, they will inevitably mix with naturally present colloids (<1 μm), including natural nanomaterials.
With regard to existing methodological gaps concerning the characterization of ENMs (as emerging substances) and the investigation of SWIs (as receiving environmental compartments), the aim of this thesis was to develop, validate and apply suitable analytical tools. The challenges were to i) develop methods that enable high-resolution, low-invasive sampling of sediment pore water, ii) develop routine-suitable methods for the characterization of metal-based engineered nanoparticles, and iii) adopt and optimize size-fractionation approaches for pore water samples of sediment depth profiles to obtain size-related information on element distributions at SWIs.
In the first part, an available microprofiling system was combined with a novel micro-sampling system equipped with newly developed filtration probes. The system was thoroughly validated and applied to a freshwater sediment, proving its applicability for automatic sampling of sediment pore waters in parallel to microsensor measurements. Thereby, for the first time, multi-element information for sediment depth profiles was obtained at a millimeter scale that could be directly related to simultaneously measured sediment parameters.
Due to the expected release of ENMs into the environment, the aim was to develop methods that enable the investigation of the fate and transport of ENMs at sediment-water interfaces. Since standardized approaches are still lacking, methods were developed for determining the total mass concentration and the dissolved fraction of (nano)particle suspensions. These validated, routine-suitable methods enable for the first time the routine determination of two of the most important properties in the analysis of colloidal systems, which are also urgently needed as a basis for the development of appropriate future risk assessments and regulatory frameworks. On this methodological basis, approaches were developed to distinguish between dissolved and colloidal fractions of sediment pore waters. This made it possible for the first time to obtain fraction-related element information for sediment depth profiles at a millimeter scale, capturing the fine-scale structures and distinguishing between diffusion and colloid-mediated transport. In addition to the research-oriented parts of this thesis, questions concerning the regulation of ENMs in the case of a release into aquatic systems were addressed in a separate publication (included in the Appendix), discussing the topic against the background of the currently valid German water legislation and the current state of research.
The purpose of this thesis is to explore the sentiment distributions of Wikipedia concepts.
We analyse the sentiment of the entire English Wikipedia corpus, which includes 5,669,867 articles and 1,906,375 talks, by using a lexicon-based method with four different lexicons.
Also, we explore the sentiment distributions from a time perspective using the sentiment scores obtained from our selected corpus. The results obtained have been compared not only between articles and talks but also among four lexicons: OL, MPQA, LIWC, and ANEW.
Our findings show that among the four lexicons, MPQA has the highest sensitivity and ANEW the lowest sensitivity to emotional expressions. Wikipedia articles show more sentiment than talks according to OL, MPQA, and LIWC, whereas talks show more sentiment than articles according to ANEW. Moreover, sentiment exhibits trends over time, and each lexicon has its own bias with respect to texts describing different topics.
Moreover, our research provides three interactive widgets for visualising sentiment distributions for Wikipedia concepts regarding the time and geolocation attributes of concepts.
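A lexicon-based method of the kind used above can be sketched in a few lines: each token is looked up in a polarity lexicon and the matches are averaged. The tiny lexicon below is illustrative only — it is not OL, MPQA, LIWC, or ANEW:

```python
# Toy polarity lexicon: word -> polarity (+1 positive, -1 negative).
LEXICON = {
    "good": 1, "great": 1, "helpful": 1,
    "bad": -1, "poor": -1, "dispute": -1,
}


def sentiment_score(text: str) -> float:
    """Mean polarity of the lexicon words found in the text.

    Returns 0.0 when no lexicon word matches, i.e. the text is
    treated as neutral."""
    tokens = text.lower().split()
    hits = [LEXICON[t] for t in tokens if t in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0


print(sentiment_score("a great and helpful article"))  # positive
print(sentiment_score("poor sources dispute this"))    # negative
```

Real lexicons differ in coverage and scoring scale (ANEW, for instance, uses continuous valence ratings rather than binary polarity), which is exactly why the four lexicons compared in the thesis can disagree on the same corpus.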
Scientific and public interest in epidemiology and mathematical modelling of disease spread has increased significantly due to the current COVID-19 pandemic. Political action is influenced by forecasts and evaluations of such models and the whole society is affected by the corresponding countermeasures for containment. But how are these models structured?
Which methods can be used to apply them to the respective regions, based on real data sets? These questions are certainly not new. Mathematical modelling in epidemiology using differential equations has been researched for quite some time and is carried out mainly by means of numerical computer simulations. These models are constantly being refined and adapted to the corresponding diseases. However, the more complex a model is, the more unknown parameters it contains, and a meaningful data fit becomes very difficult. The goal of this thesis is to design applicable models using the examples of COVID-19 and dengue, to adapt them adequately to real data sets and thus to perform numerical simulations. For this purpose, the mathematical foundations are first presented and a theoretical outline of ordinary differential equations and optimization is provided. The parameter estimations are performed by means of adjoint functions. This procedure represents a combination of static and dynamic optimization. The objective function corresponds to a least-squares method with the L2 norm and depends on the parameters sought. This objective function is coupled to constraints in the form of ordinary differential equations and is numerically minimized using Pontryagin's maximum (minimum) principle and optimal control theory. In the case of dengue, due to the transmission path via mosquitoes, a model reduction of an SIRUV model to an SIR model with a time-dependent transmission rate is performed by means of time-scale separation. The SIRUV model includes uninfected (U) and infected (V) mosquito compartments in addition to the susceptible (S), infected (I) and recovered (R) human compartments known from the SIR model. The unknown parameters of the reduced SIR model are estimated using data sets from Colombo (Sri Lanka) and Jakarta (Indonesia). Based on this parameter estimation, the predictive power of the model is checked and evaluated.
In the case of Jakarta, the model is additionally provided with a mobility component between the individual city districts, based on commuter data. The transmission rates of the SIR models also depend on meteorological data, as correlations between these and dengue outbreaks have been demonstrated in previous data analyses. For the modelling of COVID-19 we use several SEIRD models which, in comparison to the SIR model, also take into account the latency period and the number of deaths via exposed (E) and deaths (D) compartments. Based on these models, a parameter estimation with adjoint functions is performed for Germany. This is possible because, since the beginning of the pandemic, the cumulative numbers of infected persons and deaths have been published daily by Johns Hopkins University and the Robert Koch Institute. Here, an SEIRD model with a time delay regarding the deaths proves to be particularly suitable. In the next step, this model is used to compare the parameter estimation via adjoint functions with a Metropolis algorithm, taking analytical effort, accuracy and calculation speed into account. In all data fittings, one additional parameter is determined to assess the estimated number of unreported cases.
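The fitting principle described above — simulate the ODE system, compare the trajectory to observed case numbers with an L2 objective, and adjust the parameters — can be sketched with the basic SIR model. For brevity this sketch uses forward Euler and a grid search over the transmission rate beta as a simple stand-in for the adjoint-based estimation; all numbers are synthetic:

```python
import numpy as np


def simulate_sir(beta, gamma, s0, i0, r0, days, steps_per_day=10):
    """Forward-Euler integration of the SIR equations
    dS/dt = -beta*S*I/N,  dI/dt = beta*S*I/N - gamma*I,  dR/dt = gamma*I.
    Returns the number of infected at the end of each day."""
    n = s0 + i0 + r0
    dt = 1.0 / steps_per_day
    s, i, r = float(s0), float(i0), float(r0)
    daily_infected = []
    for _ in range(days):
        for _ in range(steps_per_day):
            new_inf = beta * s * i / n * dt
            new_rec = gamma * i * dt
            s -= new_inf
            i += new_inf - new_rec
            r += new_rec
        daily_infected.append(i)
    return np.array(daily_infected)


# Synthetic 'observed' curve generated with a known beta; the
# least-squares grid search below then recovers it.
observed = simulate_sir(beta=0.30, gamma=0.10, s0=9990, i0=10, r0=0, days=60)

candidates = np.linspace(0.10, 0.50, 41)
sse = [np.sum((simulate_sir(b, 0.10, 9990, 10, 0, 60) - observed) ** 2)
       for b in candidates]
best_beta = candidates[int(np.argmin(sse))]
```

The adjoint approach in the thesis replaces this brute-force search with gradients of the same L2 objective obtained from the adjoint ODE system, which scales to the many parameters of the SEIRD and mobility-extended models.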