Software is vital for modern society. The efficient development of correct and reliable software is of ever-growing importance. An important technique to achieve this goal is deductive program verification: the construction of logical proofs that programs are correct. In this thesis, we address three important challenges for deductive verification on its way to wider deployment in industry: (1) verification of thread-based concurrent programs, (2) correctness management of verification systems, and (3) change management in the verification process. These are consistently brought up by practitioners when applying otherwise mature verification systems. The three challenges correspond to the three parts of this thesis (not counting the introductory first part, which provides technical background on the KeY verification approach). In the first part, we define a novel program logic for specifying correctness properties of object-oriented programs with unbounded thread-based concurrency. We also present a calculus for this logic, which allows verifying actual Java programs. The calculus is based on symbolic execution, which makes it readily understandable for the user. We describe the implementation of the calculus in the KeY verification system and present a case study. In the second part, we provide a first systematic survey and appraisal of the factors involved in the reliability of formal reasoning. We elucidate the potential and limitations of self-application of formal methods in this area and give recommendations based on our experience in the design and operation of verification systems. In the third part, we show how the technique of similarity-based proof reuse can be applied to the problems of the industrial verification life cycle. We address issues (e.g., coping with changes in the proof system) that are important in verification practice but have so far been neglected by research.
Over the last three decades, researchers in Cognitive Metaphor Theory have shown conclusively that metaphor is motivated rather than arbitrary and is often used to systematically map out conceptual territory. This cognitive semantic proposal holds potential for alternative L2 teaching strategies. As an abstract domain, business discourse is naturally rich in metaphors and is additionally filled with consciously used metaphorical language to strategically manipulate clients and business partners. Business English courses especially stand to profit from metaphor-oriented language teaching, as (future) managers aim to quickly improve their language performance in order to be prepared for international business communication. In using metaphors, speakers as well as hearers conceptualize, and thus experience, one thing in terms of another. Having been made aware of the conceptual linkage, students are immediately equipped with a whole set of vocabulary they may already have learned for a concrete domain and are then able to elaborate in the more abstract area of business discourse. Enhanced metaphor awareness may thus prove to be a valuable vehicle for vocabulary acquisition as well as vocabulary retention. This thesis is subdivided into ten chapters. With each successive chapter, the focus increasingly sharpens on the main hypothesis that metaphor awareness raising and explicit teaching in the business English classroom helps students to 'dip into their savings' and transfer already acquired vocabulary to abstract business discourse, and thus to become more proficient business communicators. After an introduction to the main objectives, chapter two critically looks at the different strands of Cognitive Linguistic contributions to metaphor theory made within the last three decades and discusses the structure, function and processing of figurative language in order to single out the aspects relevant to language classroom applications.
Chapter three narrows the perspective to socio-economic discourse as the target domain in focus and surveys the conceptual metaphors that have been identified for this target domain, i.e., the source domains most productive for the target and therefore most valuable for the language classroom. In chapter four, Cognitive Linguistic findings are brought into contact with language didactics; i.e., the Cognitive Linguistic basis is discussed in the context of language teaching and learning theories, and a first classification of metaphor teaching within the theoretical framework of language didactics is proposed. Ten cornerstones summarize the theoretical output of the previous chapters, and the respective didactic consequences are considered. Theories of cognitive psychology pertaining to noticing, processing, and storing metaphors are systematically revisited and expanded to formulate further didactic implications for metaphor teaching. The consequences drawn from both linguistic and didactic theory are translated into a list of ten short guidelines identifying the essentials for the explicit integration of metaphors into the language classroom. In chapter five, those experimental studies that have already been conducted in the field of Cognitive Linguistic-inspired figurative language teaching are systematically summarized, and their possible contributions to setting up a didactic framework for metaphor teaching are investigated. Chapters six to nine then present a piece of original research. Starting out from five research questions tackling receptive and productive vocabulary acquisition and retention as well as the influence of and on the learners' level of language proficiency, a three-fold study was designed and conducted in a regular business English classroom, and the results are discussed in detail. The last chapter deals again with specific implications for teaching.
Earlier statements about and claims for the language classroom are revisited and refined on the basis of the theoretical linguistic, didactic and empirical findings, and an agenda for further empirical investigations is sketched out.
Development of a Path-Following Control Method for a Model Vehicle with Semi-Trailer (Entwicklung eines Regelungsverfahrens zur Pfadverfolgung für ein Modellfahrzeug mit Sattelanhänger)
(2009)
Besides the progressive automation of internal goods traffic, there is another important area to consider: the carriage of goods in selected external areas. The use of driverless trucks in logistics centers can yield economic benefits. In particular, such operation requires precise control procedures so that the trucks follow predetermined paths. The general aim of this work is the adaptation and evaluation of a path-following control method for articulated vehicles. The differences in kinematic behavior between trucks with a one-axle trailer and semi-trailer vehicles are emphasized. Additionally, the characteristic kinematic properties of semi-trailers are taken into account when adapting the control procedure, which was initially designed for trucks with a one-axle trailer and must work for both forward and backward movement. The control process is integrated as a self-contained component into the control software of the model vehicle. To this end, the geometry of the model vehicle is specified and the possible special cases of the control process are identified. The work also documents the most relevant software components of the implemented control process.
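The kinematic difference between tractor and trailer can be illustrated with a simplified discrete-time model. The sketch below is illustrative only: the state layout, wheelbase parameters, Euler integration and the neglect of the hitch offset are simplifying assumptions, not the model actually used in this work.

```python
import math

def step(state, v, steer, dt, wheelbase, trailer_length):
    """One Euler step of a simplified tractor/semi-trailer kinematic model.

    state = (x, y, theta_tractor, theta_trailer): rear-axle position and
    headings of tractor and trailer; v is the velocity at the tractor's
    rear axle, steer the front-wheel steering angle. The hitch is assumed
    to sit directly over the tractor's rear axle (no offset)."""
    x, y, th, th_t = state
    x += v * math.cos(th) * dt                              # position update
    y += v * math.sin(th) * dt
    th += v / wheelbase * math.tan(steer) * dt              # tractor heading
    th_t += v / trailer_length * math.sin(th - th_t) * dt   # trailer follows hitch angle
    return (x, y, th, th_t)
```

With zero steering and an initially aligned trailer, the model drives a straight line; a nonzero hitch angle makes the trailer heading converge toward the tractor heading in forward motion, and diverge in reverse, which is why backward movement needs a dedicated controller.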
Colonoscopy is the gold standard for the detection of colorectal polyps that can progress into cancer. In such an examination, physicians search for polyps in endoscopic images so that the polyps can be removed. To support experts with a computer-aided diagnosis system, the University of Koblenz-Landau is currently researching different methods for automatic detection. As in traditional pattern recognition systems, features are first extracted and a classifier is trained on these data; unknown endoscopic images can then be classified with the previously trained classifier. This thesis concentrates on extending the feature extraction module of the existing system. New detection methods are compared to existing techniques. Several features are implemented, incorporating Gray-Level Co-occurrence Matrices, Local Binary Patterns and the Discrete Wavelet Transform. Different modifications of these features are applied and evaluated.
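As an illustration of one of the named feature types, a minimal pure-Python sketch of the basic 8-neighbour Local Binary Pattern is shown below; the system's actual implementation and the modifications evaluated in the thesis are not reproduced here.

```python
def lbp_codes(img):
    """Basic 8-neighbour Local Binary Pattern codes for all interior pixels.

    img is a grayscale image as a list of lists. Each neighbour that is at
    least as bright as the centre pixel contributes one bit to an 8-bit code."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = len(img), len(img[0])
    codes = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            centre = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= centre:
                    code |= 1 << bit
            row.append(code)
        codes.append(row)
    return codes

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes, usable as a texture feature vector."""
    hist = [0] * 256
    n = 0
    for row in lbp_codes(img):
        for code in row:
            hist[code] += 1
            n += 1
    return [v / n for v in hist]
```

The histogram, not the raw code image, is what typically serves as the feature vector handed to the classifier.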
This Thesis contributes by reporting on the current state of diffusion of collaboration information technology (CIT). The investigation concludes, with a high degree of certainty, that today we have a "satisfactory" diffusion level of some level-A CITs (mostly e-mail, distantly followed by audio conferencing), and an unsatisfactory diffusion level of higher-level CITs (i.e., those requiring significant collaboration and cooperation among users, such as Meeting Support Systems and Group Decision Support Systems). The potential benefits of the latter seem to be far from fully realised due to a lack of user acceptance. This conclusion developed gradually along the research cycle: it was suggested by Empirical Study I and tested through Empirical Studies II and III. An additional, unplanned and rather interesting finding from this study has been the recognition of a large volume of (mostly business) reporting on numerous Web 2.0, user-community-produced collaboration technologies (most of them belonging to the category of "social software") and their metamorphosis from autonomous, "bottom-up" solutions into enterprise-supported infrastructures. Another contribution of this Thesis, again suggested by Empirical Study I and tested through Empirical Studies II and III, pertains to the "process structure" of CIT diffusion. I have found that collaboration technology has historically diffused along two distinct (interdependent but orthogonal) diffusion paths: top-down (authority-based) and bottom-up. The authority-based diffusion path seems to be characterised by efforts aimed at "imposing" technologies on employees, the primary concern being to make sure that the technology seamlessly and easily integrates into the organisational IT infrastructure. The bottom-up diffusion trail, on the other hand, seems to be successful. The contribution of this investigation may be summarised as threefold: 1.
This investigation consolidates most of the findings to date pertaining to CIT adoption and diffusion that have been produced by the CIT research community. Thus, it tells a coherent story of the dynamics of the community's focus and the collective wisdom gathered over a period of (at least) one decade. 2. This work offers a meaningful framework within which to analyse existing knowledge, and indeed extends that knowledge base by identifying persistent problems of collaboration technology acceptance, adoption and diffusion. These problems have been repeatedly observed in practice, though the pattern does not seem to have been recognised and internalised by the community. Many of these problems were observed in cases of CIT use one decade ago, five years ago, and three years ago, and continue to be observed today in structurally the same form despite what is unarguably "rapid technological development". This gives me reason to believe that at least some of the persistent problems of CIT diffusion can be hypothesised as "determining factors". My contribution here is to identify these factors, discuss them in detail, and thus tackle the theme of CIT diffusion through a structured historical narrative. 3. Through contribution (2) above, I characterise a "knowledge-action gap" in the field of CIT and illuminate a potential path through which the research community might hope to bridge this gap. The gap may be operationalised as the cognitive distance between CIT "knowledge" and CIT "action".
The STOR project aims at the development of a scientific component system employing models and knowledge for object recognition in images. This interim report elaborates on the requirements for such a component system, structures the application area by identifying a large set of basic operations, and shows how a set of appropriate data structures and components can be derived. A small case study exemplifies the approach.
An empirical study to evaluate the location of advertisement panels by using a mobile marketing tool
(2009)
The efficiency of marketing campaigns is a precondition for business success. This paper discusses a technique to transfer advertisement content via Bluetooth technology while collecting market research information at the same time. Conventional advertisement media were enhanced by devices that automatically measure the number, distance, frequency and exposure time of passers-by, making information available to evaluate both the wireless media and the location in general. This paper presents a study analyzing these data. A cryptographic one-way function protects privacy during data acquisition.
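The paper does not specify which one-way function is used; the sketch below merely illustrates the principle of pseudonymising device identifiers with a salted cryptographic hash. The function name and the per-campaign salt are hypothetical, not taken from the study.

```python
import hashlib

def anonymize_device_id(mac_address: str, campaign_salt: bytes) -> str:
    """Map a Bluetooth MAC address to a stable pseudonym via a salted one-way hash.

    The same device yields the same pseudonym within one campaign, so repeat
    visits and exposure frequency can still be counted, but the original
    address cannot be recovered from the stored digest."""
    digest = hashlib.sha256(campaign_salt + mac_address.encode("ascii"))
    return digest.hexdigest()
```

Keeping the salt secret and rotating it per campaign prevents pseudonyms from being linked across campaigns or reversed by hashing the (small) space of possible MAC addresses.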
The present work investigates the wetting characteristics of soils with regard to their dependence on environmental parameters such as water content (WC), pH, drying temperature and wetting temperature of wettable and repellent soils from two contrasting anthropogenic sites, the former sewage disposal field Berlin-Buch and the inner-city park Berlin-Tiergarten. The aim of this thesis is to deepen the understanding of the processes and mechanisms leading to changes in soil water repellency. This helps to gain further insight into the behaviour of soil organic matter (SOM) and to identify ways to prevent or reduce the negative effects of soil water repellency (SWR). The first focus of this work is to determine whether chemical reactions are required for wetting repellent samples. This hypothesis was tested via the time and temperature dependence of sessile drop spreading on wettable and repellent samples. Additionally, diffuse reflectance infrared Fourier transform (DRIFT) spectroscopy was used to determine whether various drying regimes cause changes in the relative abundance of hydrophobic and hydrophilic functional groups in the outer layer of soil particles and whether these changes can be correlated with water content and the degree of SWR. Finally, by artificially altering the pH in dried samples through acidic and alkaline reagents applied in a gaseous state, the influence of pH alone on the degree of SWR was investigated separately from the influence of changes in moisture status. The investigation of the two locations Buch and Tiergarten, each exceptionally different in the nature of their respective wetting properties, leads to new insights into the variety of appearance of SWR.
The results of the temperature, water content and pH dependency of SWR at the two contrasting sites led to a hypothetical model of the nature of repellency for each site, which provides an explanation for most of the observations made in this and earlier studies: At the Tiergarten site, wetting characteristics are most likely determined by a micelle-like arrangement of amphiphiles, which depends on the concentration of water-soluble amphiphilic substances, the pH and the ionic strength in the soil solution. At low pH and high ionic strength, repulsion forces between hydrophilic charged groups are minimized, allowing their aggregation with outward-orientated hydrophobic molecule moieties. At high pH and low ionic strength, higher repulsion forces between hydrophilic functional groups lead to an aggregation of hydrophobic groups during drying, which results in a layer with outward-oriented hydrophilic moieties on the soil organic matter surface, leading to enhanced wettability. For samples from the Buch site, chemical reactions are necessary for the wetting process. The strong dependence of SWR on water content indicates that hydrolysis-condensation reactions are the controlling mechanisms. Since acid-catalyzed hydrolysis is an equilibrium reaction dependent on water content, an excess of water favours hydrolysis, leading to an increasing number of hydrophilic functional groups. In contrast, water deficiency favours condensation reactions, leading to a reduction of hydrophilic functional groups and thus a reduction of wettability. The results of the present investigation and their comparison with earlier investigations clearly show that SWR is subject to numerous antagonistically and synergistically interacting environmental factors.
The degree of influence which a single factor exerts on SWR is site-specific; e.g., it depends on special characteristics of the mineral constituents and of the SOM, which in turn is subject to the influence of climate, soil texture, topography, vegetation and the former and current use of the respective site.
This thesis focuses on the utilization of modern graphics hardware (GPU) for visualization and computation purposes, especially of volumetric data from medical imaging. The considerable increase in raw computing power in recent years has turned commodity systems into high-performance workstations. In combination with the direct rendering capabilities of graphics hardware, "visual computing" and "computational steering" approaches on large data sets have become feasible. In this regard, several example applications and concepts such as the "ray textures" have been developed and are discussed in detail. As the amount of data to be processed and visualized is steadily increasing, memory and bandwidth limitations require compact representations of the data. While the compression of image data has been investigated extensively in the past, this thesis addresses possibilities of performing computations directly on the compressed data. To this end, different categories of algorithms are identified and represented in the wavelet domain. By using special variants of the compressed format, efficient implementations of essential image processing algorithms are possible and demonstrate the potential of the approach. From the technical perspective, the GPU-based framework "Cascada" has been developed in the course of this thesis. The introduction of object-oriented concepts to shader programming, as well as a hierarchical representation of computation and/or visualization procedures, led to a simplified utilization of graphics hardware while maintaining competitive performance. This is shown with different implementations throughout the contributions, as well as two clinical projects in the field of diagnosis assistance. On the one hand, the semi-automatic segmentation of low-resolution MRI data sets of the human liver is evaluated. On the other hand, different possibilities of assessing abdominal aortic aneurysms are discussed; both projects make use of graphics hardware.
In addition, "Cascada" provides extensions towards recent general-purpose programming architectures and a modular design for future developments.
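The wavelet-domain processing described above rests on transforms such as the Haar wavelet. A minimal one-level 1-D Haar analysis/synthesis pair is sketched below; this is a generic illustration, not the compressed format or the GPU implementation used in the thesis.

```python
def haar_step(signal):
    """One analysis level of the Haar transform (unnormalized variant):
    pairwise averages (low-pass part) and pairwise half-differences
    (high-pass detail part). Signal length is assumed to be even."""
    avg = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    det = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(len(signal) // 2)]
    return avg, det

def haar_inverse(avg, det):
    """Exact synthesis: recovers the original signal from averages and details."""
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out
```

Because smooth regions produce near-zero details, thresholding the detail coefficients yields a compact representation, and simple operations (e.g., scaling, thresholding) can be performed directly on the coefficients without full decompression.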
This thesis evaluates automated techniques for removing objects from an image and proposes several modifications for the specific application of removing a colour checker from structure-dominated images. The selection of approaches covers the main research field of image inpainting as well as an approach used in medical image processing. Their results are investigated to assess their applicability to removing objects from structure-intense images. The advantages and disadvantages discovered in the process are then used to propose several modifications for an adapted inpainting approach suitable for removing the colour checker.
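A common baseline in the inpainting literature is diffusion-based filling, where masked pixels are repeatedly replaced by the average of their neighbours. The sketch below illustrates this generic baseline only; it is not one of the approaches evaluated in the thesis, which must additionally preserve dominant structures that plain diffusion blurs away.

```python
def inpaint(img, mask, iterations=100):
    """Fill masked pixels by iterative 4-neighbour averaging (heat diffusion).

    img is a grayscale image (list of lists of floats); mask[y][x] is True
    for pixels to be filled. Known pixels are never modified."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for _ in range(iterations):
        new = [row[:] for row in out]
        for y in range(h):
            for x in range(w):
                if mask[y][x]:
                    neighbours = []
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < h and 0 <= xx < w:
                            neighbours.append(out[yy][xx])
                    new[y][x] = sum(neighbours) / len(neighbours)
        out = new
    return out
```

Diffusion propagates smooth intensity into the hole but cannot continue edges or texture across it, which is exactly the weakness that structure- and exemplar-based inpainting methods address.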