Refine
Document Type
- Doctoral Thesis (245)
- Master's Thesis (91)
- Part of Periodical (84)
- Bachelor Thesis (45)
- Diploma Thesis (27)
- Article (13)
- Study Thesis (11)
- Conference Proceedings (10)
- Habilitation (4)
- Other (2)
Language
- English (534)
Keywords
- Pestizid (8)
- Pflanzenschutzmittel (6)
- Software Engineering (6)
- Internet of Things (5)
- Biodiversität (4)
- Bluetooth (4)
- Bodenchemie (4)
- Landwirtschaft (4)
- Semantic Web (4)
- ecotoxicology (4)
- ontology (4)
- risk assessment (4)
- soil organic matter (4)
- API (3)
- Crayfish plague (3)
- E-KRHyper (3)
- Enterprise 2.0 (3)
- Gamification (3)
- Insektizid (3)
- Knowledge Compilation (3)
- Maschinelles Lernen (3)
- Nanopartikel (3)
- OWL (3)
- OWL <Informatik> (3)
- Ontologie <Wissensverarbeitung> (3)
- Ontology (3)
- Pesticides (3)
- Risikoanalyse (3)
- Systematik (3)
- UML (3)
- Umweltpsychologie (3)
- University (3)
- agriculture (3)
- classification (3)
- computer clusters (3)
- model-based (3)
- pesticide (3)
- pesticides (3)
- virtual reality (3)
- Abduktion <Logik> (2)
- Abwasserreinigung (2)
- Agriculture (2)
- Akzeptanz (2)
- Annotation (2)
- Anpassung (2)
- Araneae (2)
- Beschaffung (2)
- Bestäubung (2)
- Bildverarbeitung (2)
- Biodiversity (2)
- Campus Information System (2)
- Cloud Computing (2)
- Computer Graphics (2)
- Computergraphik (2)
- Computersimulation (2)
- Data Mining (2)
- Diffusion (2)
- Ecotoxicology (2)
- Emissionen (2)
- Equality (2)
- Feldsaum (2)
- Formale Ontologie (2)
- Freshwater crayfish (2)
- GIS (2)
- Genetische Variabilität (2)
- Graphik (2)
- Grounded Theory (2)
- Habitat Fragmentation (2)
- Hydrodynamik (2)
- Kakamega Forest (2)
- Klimawandel (2)
- Kognitive Linguistik (2)
- Line Space (2)
- Linked Open Data (2)
- Logistik (2)
- Metamodel (2)
- Modellgetriebene Entwicklung (2)
- Nanoparticles (2)
- Netzwerk (2)
- Neuronales Netz (2)
- OpenGL (2)
- Petri Nets (2)
- Petri-Netze (2)
- Propagation (2)
- RDF (2)
- Risikobewertung (2)
- Risikomanagement (2)
- Schlussfolgern (2)
- Sediment (2)
- Serviceorientierte Architektur (2)
- Simulation (2)
- Softwaretest (2)
- Sozialpsychologie (2)
- Sustainability (2)
- Taxonomie (2)
- Taxonomy (2)
- Theorem Proving (2)
- Umwelttoxikologie (2)
- Umweltwissenschaften (2)
- Usability (2)
- Volumen-Rendering (2)
- Wastewater treatment plants (2)
- Wikipedia (2)
- Wirbellose (2)
- XML (2)
- aquatic ecotoxicology (2)
- aquatic macrophytes (2)
- artificial neural networks (2)
- constraint logic programming (2)
- decomposition (2)
- description logic (2)
- diffusion (2)
- ecotoxicity (2)
- emic-etic (2)
- eye tracking (2)
- framework (2)
- freshwater organisms (2)
- governance (2)
- hybrid automata (2)
- invertebrates (2)
- micropollutants (2)
- mobile phone (2)
- modelling (2)
- monitoring (2)
- mosquito control (2)
- multi-agent systems (2)
- multimedia metadata (2)
- optimal control (2)
- parallel algorithms (2)
- probability propagation nets (2)
- risk (2)
- semantics (2)
- simulation (2)
- social media (2)
- social simulation (2)
- soil water repellency (2)
- sorption (2)
- streams (2)
- tracking (2)
- traits (2)
- transformation (2)
- visualization (2)
- wastewater treatment (2)
- Ökosystem (2)
- Ökosystemdienstleistung (2)
- Ökotoxikologie (2)
- 101companies (1)
- 1H-NMR Relaxometry (1)
- 2019 European Parliament Election (1)
- 8C model (1)
- ABox (1)
- API Analysis (1)
- API Migratiom (1)
- API analysis (1)
- API-Analyse (1)
- AUTOSAR (1)
- Abbildung <Mathematik> (1)
- Abdrift <Pflanzenbau> (1)
- Absolutismus (1)
- Abwasser (1)
- Abwasserbehandlung (1)
- Acceleration Structures (1)
- Action Recognition (1)
- Action Segmentation (1)
- Ad-hoc-Netz (1)
- Adaptation (1)
- Adaptive Services Grid (ASG) (1)
- Adobe Flex (1)
- Africa (1)
- Afrika (1)
- Agenten (1)
- Agentenorientiertes Software Engineering (1)
- Agents (1)
- Agrarlandschaft (1)
- Agrochemikalien (1)
- Aktionsart (1)
- Aktiver Wortschatz (1)
- Algolib (1)
- Algorithm Engineering (1)
- Algorithmische Geometrie (1)
- Amazon Mechanical Turks (1)
- Amazonia (1)
- Amazonien (1)
- Amphibia (1)
- Analysis of social platform (1)
- Android <Systemplattform> (1)
- Anforderung (1)
- Antagonistic interactions (1)
- Aphanomyces astaci (1)
- Aphid predator (1)
- Aquatic Ecotoxicology (1)
- Aquatic Guidance Document (1)
- Aquatic ecology (1)
- Aquatische Makrophyten (1)
- Aquatisches Ökosystem (1)
- Architektur <Informatik> (1)
- Archivierung (1)
- Articles for Deletion (1)
- Artificial Intelligence (1)
- Artificial Neural Networks (1)
- Arzneimittel (1)
- Arzneistoffe (1)
- Aspekt <Linguistik> (1)
- Association Rules (1)
- Astacus astacus (1)
- Auchenorrhyncha (1)
- Auditing (1)
- Augenbewegung (1)
- Augmented Reality (1)
- Ausbreitung (1)
- Auslese (1)
- Auswahl (1)
- Automated Theorem Proving (1)
- Automated Theorem Proving Systems (1)
- Automatische Klassifikation (1)
- Automatisches Beweisverfahren (1)
- Automotive Systems (1)
- Autoritarismus (1)
- Avatar (1)
- B2B Integration (1)
- BPM (1)
- BPMN (1)
- BPMS (1)
- Bach (1)
- Barbatula barbatula (1)
- Basic psychological needs (1)
- Bayes Procedures (1)
- Bayes-Netz (1)
- Bayesian Networks (1)
- Beaconless (1)
- Bebauungsdichte (1)
- Bedarfsanalyse (1)
- Bedarfsforschung (1)
- Bedrohte Tiere (1)
- Bees (1)
- Befahrbarkeit (1)
- Belebtschlamm (1)
- Belief change, concept contraction, EL (1)
- Benetzung (1)
- Benutzerverhalten (1)
- Beruflicher Kontakt (1)
- Beschichtung (1)
- Beschreibungslogik (1)
- Bestäuber (1)
- Beta-Blocker (1)
- Beta-Diversität (1)
- Bewertungskriterien (1)
- Bienen <Familie> (1)
- Big Five (1)
- Bildanalyse (1)
- Bildsegmentierung (1)
- Bilingualer Unterricht (1)
- Binnengewässer (1)
- Bioassay (1)
- Biohydrogel (1)
- Biometric Authentication (1)
- Biopolymere (1)
- Biotransformation (1)
- Biozide (1)
- Bipartiter Graph (1)
- Blickbewegung (1)
- Blickpunktabhängig (1)
- Blog marketing (1)
- Boden (1)
- Bodenphysik (1)
- Bodenwasser (1)
- Bodenökologie (1)
- Bombina variegata (1)
- Border Gateway Protocol (1)
- Border Gateway Protocol 4 (1)
- Breeding tree selection (1)
- Budongo Forest (1)
- Building Performance Evaluation (1)
- Business Collaboration (1)
- Business English (1)
- Business Process Management Recommender Systems Survey (1)
- Business Process Modeling (1)
- Business Rule Bases, Inconsistency Measurement (1)
- Butterflies (1)
- Bärlappe (1)
- Bürgerbeiteiligung (1)
- C++ (1)
- CLIL (1)
- COVID-19 (1)
- CSCW (1)
- Calcium (1)
- Calculus (1)
- Carry-over effects (1)
- Case Study Analysis (1)
- Cashew-Sektor (1)
- Casual Games (1)
- Cations (1)
- Challenges (1)
- Chaos (1)
- Cheilolejeunea ; continental tropical Africa ; rainforest (1)
- Chemische Abwasserreinigung (1)
- Chironomus riparius (1)
- Chromatographie (1)
- Climate (1)
- Climate anxiety (1)
- Climate change (1)
- Climate denial (1)
- Cloud Point Extraction (1)
- Clustering coefficient (1)
- CodeBlue (1)
- Cognitive functions (1)
- Cold Chain (1)
- Coleoptera (1)
- Collaboration (1)
- Coloskopie (1)
- Communication Networks (1)
- Computational Toxicology (1)
- Computational biology (1)
- Compute Shader (1)
- Computer Security (1)
- Computer Supported Cooperative Work (1)
- Computer Vision (1)
- Computer assisted communication (1)
- Computeranimation (1)
- Computersicherheit (1)
- Computerspiel (1)
- Computertomografie (1)
- Computervisualistik (1)
- Conceptual Metaphor Theory (1)
- Conference (1)
- Connected Vehicles (1)
- Conservation (1)
- Consumer behaviour (1)
- Consumption renunciation (1)
- Container Entity Modell (1)
- Content Analysis (1)
- Content Management (1)
- Content and Language Integrated Learning (1)
- Content and Language Integrated Learning (CLIL) (1)
- Context-aware processes (1)
- Core Ontology on Multimedia (1)
- Core Ontology on Multimedia (COMM) (1)
- Core Self-Evaluations (1)
- Corvus frugilegus (1)
- Cottus gobio (1)
- Crayfish (1)
- Creativity (1)
- Criteria Matrix (1)
- Crowdsourcing (1)
- Curriculum (1)
- DMN (1)
- DPLL procedure (1)
- DRIFTS (1)
- DTI (1)
- Daphnia (1)
- Daphnia longispina (1)
- Daphnia longispina complex (1)
- Daphnia longispina-Komplex (1)
- Darmpolyp (1)
- Data compression (1)
- Data manipulation (1)
- Data protection (1)
- Datenaustausch (1)
- Datenkompression (1)
- Datenschutz (1)
- Decision-support (1)
- Decodierung (1)
- Deduktion (1)
- Deep Metric Learning (1)
- Defi-Now! (1)
- Defibrillator (1)
- Delta (1)
- Demographie (1)
- Demography (1)
- Densimetric Measurement (1)
- Depth Profile (1)
- Description Logic (1)
- Description Logics (1)
- Design Pattern (1)
- Design Science Research (1)
- Destiny (1)
- Developer profiling (1)
- Diabetes (1)
- Diabetische Retinopathie (1)
- Diagnose (1)
- Diagnosekriterien (1)
- Diagnoseunterstützung (1)
- Diagnosis (1)
- Diagnosis assistance (1)
- Dichtemessung (1)
- Differentia Scanning Calorimetry (1)
- Differential scanning calorimetry (1)
- Diffuse Quellen (1)
- Diffusionsbildgebung (1)
- Digitale Bilder (1)
- Digitalisation (1)
- Digitalisierung (1)
- Dijkstras Algorithmus (1)
- Dimension 3 (1)
- Dimensionality Reduction (1)
- Dimensionsreduzierung (1)
- Discussion Forums (1)
- Diskrete Simulation (1)
- Distance Vector Routing (1)
- Distanz Vektor Routing (1)
- Distributed Algorithm (1)
- Distributed Environments (1)
- Distributed process execution (1)
- Documents (1)
- Dokumentation (1)
- Dracaena (1)
- Drahtloses Sensorsystem (1)
- Drahtloses lokales Netz (1)
- Drahtloses vermachtes Netz (1)
- Drainagegräben (1)
- Dredging (1)
- Dreidimensionale Bildverarbeitung (1)
- Driver Assistance Systems (1)
- Dynamische Analyse (1)
- Düngemittel (1)
- E-Business (1)
- E-Hyper Tableau (1)
- E-KRHyper theorem prover (1)
- E-Participation (1)
- E-Partizipation (1)
- E-government (1)
- E-participation (1)
- E-services (1)
- ECMS 2012 (1)
- ECSA (1)
- EU (1)
- East Africa (1)
- Ebener Graph (1)
- Ebullition (1)
- Eclipse <Programmierumgebung> (1)
- Economic potential (1)
- Ecosystem service (1)
- Ecotoxicity (1)
- Eddy-covariance (1)
- Edelkrebs (1)
- Effectiveness (1)
- Einkauf (1)
- Einstellung (1)
- Einstellungen gegenüber bestimmten Filmeigenschaften (1)
- Ekel (1)
- Elastic net (1)
- Elektronenmikroskopie (1)
- Elevation gradient (1)
- Emergenz (1)
- Emission (1)
- Empfehlungssystem (1)
- Empirical Research (1)
- Empirical Studies (1)
- Employee Behavior (1)
- Emulation (1)
- Endangerment (1)
- Endokrine Regulation (1)
- Energiefluss (1)
- Energy fluxes (1)
- Englisch (1)
- Enhanced Reality (1)
- Enhanced Representation (1)
- Enterprise Architecture Framework (1)
- Enterprise Information Management (1)
- Enterprise Systems (1)
- Entity Component System Architecture (1)
- Entrepreneurship (1)
- Entrepreneurship Education (1)
- Entrepreneurship Experience and Extra-curricular Activity (1)
- Entscheidungsunterstützung (1)
- Entwickler Profil (1)
- Entwurfsmuster (1)
- Environmental Risk Assessment (1)
- Environmental factors (1)
- Environmental organic chemistry (1)
- Environmental psychology (1)
- Environmental samples (1)
- Epiphyten (1)
- Ergonomic Principles (1)
- Erste Hilfe (1)
- Erzieher (1)
- Erzieherin (1)
- European Conference on Modelling and Simulation (1)
- Europäischer Schadensbericht (1)
- Evacuation modeling (1)
- Evaluierung (1)
- Evidence-based Psychotherapy (1)
- Eye Tracking (1)
- Eyetracking (1)
- FTIR (1)
- Fabric Simulation (1)
- Facebook Application (1)
- Facet Theory (1)
- Fahrverhalten (1)
- Fahrzeug (1)
- Farbkalibrierung (1)
- Farnpflanzen (1)
- Fast-slow continuum (1)
- Fault Trees (1)
- Faxonius limosus (1)
- Feature Extraction (1)
- Feature Modeling (1)
- Fehlerbaum (1)
- Felis catus (1)
- Felis silvestris domestica (1)
- Ferns (1)
- Feuchtgebiet (1)
- Fiber Tracking (1)
- Filmbewertung (1)
- Fingerprint Recognition (1)
- First aid (1)
- Fischgewebe (1)
- Five Factor model (1)
- Fledermäuse (1)
- Flesch-Reading-Ease Index (1)
- FlexViz (1)
- Fließgewässer (1)
- Flow decomposition (1)
- Fluid-Struktur Wechselwirkung (1)
- Fluss (1)
- Foliicolous lichens (1)
- Food (1)
- Food Transportation System (1)
- Foodstuff (1)
- Formal Methods (1)
- Formale Methoden (1)
- Fotoauswahl (1)
- Fractionation (1)
- Fragebeantwortung (1)
- Freeze Coring (1)
- Fremdsprachendidaktik (1)
- Fremdsprachenunterricht (1)
- Function Words (1)
- Fungicides (1)
- Fungizid (1)
- Fuzzy-Logik (1)
- Fächerkanon (1)
- GDPR (1)
- GDS (1)
- GPGPU (1)
- GPS (1)
- GPU (1)
- GRAF1 (1)
- GReQL2 (1)
- GSM-Standard (1)
- Galerucinae (1)
- Game-based Learning (1)
- Gammarus fossarum (1)
- Gangart (1)
- Ganzzahlige Optimierung (1)
- Gas storage capacity (1)
- Gasblasen (1)
- GazeTheWeb (1)
- Gefrierkernverfahren (1)
- Gefrierpunktserniedrigung (1)
- Gefährdung (1)
- Gefäßanalyse (1)
- Gefühl (1)
- Gehirn (1)
- Gel effect (1)
- Gelbbauchunke (1)
- Gelände (1)
- Gemischt-ganzzahlige Optimierung (1)
- Generative Model (1)
- Genetic diversity (1)
- Genetics (1)
- Genetik (1)
- Genetischer Fingerabdruck (1)
- Geocaching (1)
- Geographic routing (1)
- Geoinformationssystem (1)
- Geometric spanner (1)
- Geowissenschaften (1)
- Gerichteter Graph (1)
- Germany (1)
- Geschlecht (1)
- Gewässer (1)
- Gewässerqualität (1)
- Gewässerökologie (1)
- Glasumwandlung (1)
- Glasübergang (1)
- Globale Wertschöpfungsketten (1)
- Grafikkarte (1)
- Grafikprogrammierung (1)
- Grails (1)
- Grails 1.2 (1)
- Graph (1)
- Graph Technology (1)
- Graph theory (1)
- Graphentheorie (1)
- Graphicsprogramming (1)
- Graphik-Hardware (1)
- Graphische Benutzeroberfläche (1)
- Grayscale (1)
- Grundbedürfnis (1)
- Gruppenarbeit (1)
- Größenfraktionierung (1)
- Grünlandbewirtschaftung (1)
- Gut content analysis (1)
- Habitat loss (1)
- Habitat networks (1)
- Habitatfragmentierung (1)
- Habitatsverlust (1)
- Hand-based Gestures (1)
- Handsfree editing (1)
- Hard and Soft News (1)
- Haskell (1)
- Hauskatze (1)
- Health (1)
- Healthcare institution (1)
- Hedonic (1)
- Hedonisch (1)
- Heimarbeit (1)
- Herbizid (1)
- Herzrate (1)
- Hindernis (1)
- Horn Clauses (1)
- Human Disturbance (1)
- Human motion (1)
- Human resources management (1)
- Human-Computer Interaction (1)
- Humus (1)
- Hyaluronan (1)
- Hyaluronsäure (1)
- Hydratation (1)
- Hydration (1)
- Hydrodynamics (1)
- Hydrogel (1)
- Hydrophobie (1)
- Hyper Tableau Calculus (1)
- Hypertableaux (1)
- I-messages (1)
- IASON (1)
- IAT (1)
- IBM Bluemix (1)
- ICM (1)
- ICP-MS (1)
- IPT (1)
- IT Guru (1)
- IT Outsourcing (1)
- IT Security (1)
- IT Services (1)
- IT-Security (1)
- IceCube (1)
- Image (1)
- Image Processing (1)
- Image Understanding (1)
- Imitation Learning (1)
- Implicit Association Test (1)
- Incremental Algorithms (1)
- Industrial-CT (1)
- Industriepolitik (1)
- Informatik (1)
- Information Asset Register (1)
- Information Audit (1)
- Information Capturing Methods (1)
- Information Centric Networking (1)
- Information Retrieval (1)
- Information system (1)
- Inkompressible Fluide (1)
- Innerbetriebliche Kooperation (1)
- Inpainting-Verfahren (1)
- Insecticide (1)
- Instructed Second Language Acquisition (1)
- Insurance (1)
- Integrated Model (1)
- Intelligent Information Network (1)
- Interactive Video Retrieval (1)
- Interaktion (1)
- Interaktionseffekt (1)
- Intergruppenprozesse (1)
- International organization (1)
- Internationale Organisationen (1)
- Internet (1)
- Internet Voting (1)
- Interoperability (1)
- Interoperabilität (1)
- Interparticulate hydrogel swelling (1)
- IoT (1)
- JGraLab (1)
- JML (1)
- Java (1)
- Java <Programmiersprache> (1)
- Java Modeling Language (1)
- Java. Programmiersprache (1)
- Journalismusforschung (1)
- Justification (1)
- KRHyper (1)
- Kalkmagerrasen (1)
- Kanalcodierung (1)
- Kantenbewerteter Graph (1)
- Kantenverfolgung (1)
- Katastrophentheorie (1)
- Kation-Brücken (1)
- Kationen (1)
- Katze (1)
- Kenya (1)
- Klassifikation (1)
- Klima (1)
- Knowledge (1)
- Knowledge Engineering (1)
- Knowledge Graphs (1)
- Knowledge Sharing (1)
- Kognitive Entwicklung (1)
- Kohlenstoffkreislauf (1)
- Kohlenstoffschichten (1)
- Kollaboration (1)
- Kollektivismus (1)
- Kolloid (1)
- Kolloide (1)
- Kolloids (1)
- Komplexität / Algorithmus (1)
- Konjugation (1)
- Konkurrenz (1)
- Konsistenz. Psychologie (1)
- Konsumentenverhalten (1)
- Konsumverzicht (1)
- Kontaktwinkel (1)
- Konturfindung (1)
- Konzept (1)
- Krebspest (1)
- Kriterium (1)
- Kryo (1)
- Körperliche Aktivität (1)
- Künstliche Intelligenz (1)
- Künstliche Neuronale Netze (1)
- L2 writers (1)
- Lake Kinneret (1)
- Lake Naivasha (1)
- Lake Wamala (1)
- Lakes (1)
- Landscape ecology (1)
- Landschaftskartierung (1)
- Landschaftsökologie (1)
- Langlebigkeit (1)
- Laser (1)
- Lasso (1)
- Last-year students (1)
- Latent Negative (1)
- Laufen (1)
- Lebenslanges Lernen (1)
- Lebensmittel (1)
- Lebensstandard (1)
- Lehrerbildung (1)
- Lehrerkompetenzen (1)
- Leichte Sprache (1)
- Leugnung (1)
- Life history (1)
- Limnologie (1)
- Limnology (1)
- Limology (1)
- Linespace (1)
- Linguistic Requirements (1)
- Link Prediction (1)
- Linked Data Modeling (1)
- Loans (1)
- Local algorithm (1)
- Logik (1)
- Logischer Schluss (1)
- Lokalisation (1)
- Longevity (1)
- Lurche (1)
- Lycophytes (1)
- MIA (1)
- MPEG-7 (1)
- MSR (1)
- Machine-Learning (1)
- Machinelles lernen (1)
- Magnetis (1)
- Maifisch (1)
- Makrophyten (1)
- MapReduce (1)
- Mapping <Mathematics> (1)
- Maschinelles Sehen (1)
- Mass-Spektrometrie (1)
- Matching (1)
- Material Point Method (1)
- Mathematical optimisation (1)
- Mathematik (1)
- Maßtheorie (1)
- MeVisLab (1)
- Measure-theory (1)
- Mediator framework (1)
- Medical Image Analysis (1)
- Medizinische Bildanalyse (1)
- Medizinische Bildverarbeitung (1)
- Meiofauna (1)
- Mensch-Maschine-Interaktion (1)
- Merkmalsdetektion (1)
- Merkmalsextrahierung (1)
- Mesofauna (1)
- Metalle/Matalloide (1)
- Metalloids (1)
- Metals (1)
- Metals/metalloids (1)
- Metapher (1)
- Metapopulation dynamics (1)
- Metapopulationsdynamiken (1)
- Methan (1)
- Methane emissions (1)
- Methode (1)
- Microfinance (1)
- Microfinance institutions (1)
- Microplastics (1)
- Micropollutants (1)
- Migration (1)
- Mikrofinanzierung (1)
- Mikroorganismus (1)
- Mikroplastik (1)
- Mikrosatelliten-DNA (1)
- Mikroverunreinigung (1)
- Minderung (1)
- Minimalschnitt (1)
- Mining (1)
- Mining Software Repositories (1)
- Mister X (1)
- Mitral Valve (1)
- Mitralklappe (1)
- Mixed integer programming (1)
- Mixed method (1)
- Mixed methods (1)
- Mixture Toxicity (1)
- Mobile Information Systems (1)
- Model-Driven Engineering (1)
- Modellfahrzeug (1)
- Monitoring (1)
- Monolepta (1)
- Morphologische Operatoren (1)
- Mosambik (1)
- Motion Capturing (1)
- Motivation (1)
- Mouse Gestures (1)
- Movie evaluation criteria (1)
- Mucilage (1)
- Multi-Agenten-Systeme (1)
- Multi-robot System (1)
- Multiagent System (1)
- Multiagentensysteme (1)
- Multidimensional (1)
- Multimedia Metadata Ontology (1)
- Multimodal Action Recognition (1)
- Multimodal Medical Image Analysis Cochlea Spine Non-rigid Registration Segmentation ITK VTK 3D Slicer CT MRI CBCT (1)
- Multiple Object Tracking (1)
- Multivariable Statistik (1)
- N-Body Simulation (1)
- N-Körper Simulation (1)
- NMR relaxometry (1)
- NMR-Spektroskopie (1)
- Nachbarschaftsgraph (1)
- Nachtfalter (1)
- Nachtschmetterlinge (1)
- Named Function Networking (1)
- Nanoröhren (1)
- Nassbaggerung (1)
- Nationalismus (1)
- Native language identification (1)
- Natural Feature Tracking (1)
- Natural Language Processing (1)
- Naturschutzgenetik (1)
- Naturschutzmanagement (1)
- Natürliche Schädlingskontrolle (1)
- Natürliches organisches Material (1)
- Navier-Stokes Gleichungen (1)
- Navier-Stokes equations (1)
- Near-surface turbulence (1)
- Network robustness (1)
- Networks (1)
- Netzwerk Routing (1)
- Netzwerkanalyse (1)
- Netzwerkrobustheit (1)
- Netzwerktopologie (1)
- Neuroactive chemicals (1)
- Neutino (1)
- Nicht-Ziel-Pflanzen (1)
- Non-freezing water (1)
- Nuclear Magnetic R (1)
- Nutzererleben (1)
- Nyungwe National Park (1)
- Nährstoffverfügbarkeit (1)
- Nützlinge (1)
- OCB (1)
- OCL <Programmiersprache> (1)
- ODRL (1)
- ONDEX (1)
- OPD-SHRM (1)
- OPNET (1)
- OVTK (1)
- Oberflächen-Runoff (1)
- Oberflächeneigenschaft (1)
- Oberflächenveredelung (1)
- Object Recognition (1)
- Objektentfernung (1)
- Oligomer (1)
- One-Shot Action Recognition (1)
- Online Community (1)
- Online grocery shopping (1)
- Online-Lebensmittelhandel (1)
- Ontologie. Wissensverarbeitung (1)
- Ontology API model (1)
- Ontology alignment (1)
- Open Content (1)
- Open Source (1)
- OpenGL Shading Language (1)
- OpenVDB (1)
- Optimierung (1)
- Optimization (1)
- Oracle Generation (1)
- Oraklegenerierung (1)
- Organische Bodensubstanz (1)
- Organizational Change (1)
- Oriental region (1)
- Ostafrika (1)
- Osteocephalus (1)
- Ozon (1)
- Ozonisierung (1)
- PEPPOL (1)
- POIs (1)
- Pan European Public Procurement OnLine (1)
- Parteienkommunikation (1)
- Passiver Wortschatz (1)
- Path Tracing (1)
- Pattern Recognition (1)
- Perfect (1)
- Perfekt (1)
- Personalised Information Systems (1)
- Personality (1)
- Persönlichkeit (1)
- Pestizide (1)
- Petri net (1)
- Petri-Netz (1)
- Petrinetz (1)
- Pfadnachverfolgung (1)
- Pfadplanung (1)
- Pfadverfolgung (1)
- Pflanzen (1)
- Pharmaceuticals (1)
- Pharmakokinetik (1)
- Phosphorsäureester (1)
- Photographie (1)
- Phylogeographie (1)
- Physik (1)
- Physiksimulation (1)
- Placement Strategies (1)
- Planar graphs (1)
- Plant Communities (1)
- Plant protection products (1)
- Plastic mulching (1)
- Plasticization; Glass transition (1)
- Plastifizieren (1)
- Plastifizierung (1)
- Plug in (1)
- Pointing Devices (1)
- Policy Language (1)
- Political Communication (1)
- Politik (1)
- Politische Ökonomie (1)
- Pollinators (1)
- Pollution (1)
- Polysaccharide (1)
- Polysaccharides (1)
- Populationsgenetik (1)
- Pore Water (1)
- Pragmatic (1)
- Pragmatisch (1)
- Predictive Model (1)
- Present Perfect (1)
- Pro-environmental behaviour change (1)
- Proactive Caching (1)
- Probabilistic finite automata (1)
- Probability (1)
- Probability propagation nets (1)
- Problematic smartphone use (1)
- Procambarus virginalis (1)
- Proceedings (1)
- Process (1)
- Process Quality (1)
- Process tracing (1)
- Product choice (1)
- Produktbewertung (1)
- Produktentscheidung (1)
- Produktwahl (1)
- Programmierung (1)
- Prosoziales Verhalten (1)
- Proteinstrukturanalyse (1)
- Provenance (1)
- Prozedurale Synthese (1)
- Prozessqualität (1)
- Prädikatenlogik (1)
- Präposition (1)
- Präsentisches Perfekt (1)
- Pteris (1)
- Py-GC/MS (1)
- Pyrethroide (1)
- Pädagogik (1)
- Quality assessment system (1)
- Quasi unit disk graph (1)
- Query Expansion (1)
- RDF Graphs (1)
- RDF modeling (1)
- RNA sequencing (1)
- Railway Research (1)
- Railway Research Topics (1)
- Railway Safety (1)
- Railway Safety Research (1)
- Random Finite Sets (1)
- Random Forest (1)
- Raupe (1)
- Raytracing (1)
- ReDSeeDS-Project (1)
- Reactive algorithm (1)
- Real-Life Game (1)
- Real-Life Spiel (1)
- Real-Time (1)
- Rechtfertigung (1)
- Rechtfertigung <Philosophie> (1)
- Recommender System (1)
- Recommender Systems, Business Process Modeling, Literature Review (1)
- Recovery (1)
- Reddit (1)
- Reengineering (1)
- Reference Model (1)
- Referenzrahmen (1)
- Reflections (1)
- Reflektionen (1)
- Regenwald (1)
- Regenwald ; Afrika ; Cheilolejeunea (1)
- Regionenlabeling (1)
- Registratur (1)
- Rehabilitation (1)
- Relevance Feedback (1)
- Religiosität (1)
- Rendering (1)
- Renewable energy (1)
- Reproduktion (1)
- Reservoir Sedimentation (1)
- Reservoirs (1)
- Resource Description Framework (RDF) (1)
- Resource Governance (1)
- Retina Befundbilder (1)
- Retina Fundus Bilder (1)
- Retina Fundus Images (1)
- Reverse Engineering (1)
- Revision (1)
- Rezeptionsforschung (1)
- Rhein (1)
- Rheinland-Pfalz (1)
- Rheometry (1)
- Rhineland-Palatinate (1)
- Rhizosphere (1)
- Right-wing ideology (1)
- Risikoabschätzung (1)
- Risikominimierung (1)
- Risk assessment (1)
- RoboCup (1)
- Robocup 2008 (1)
- Roboter (1)
- Robotik (1)
- Robust Principal Component Analysis (1)
- Rook (1)
- Rothe's method (1)
- Rothe-Methode (1)
- Routing (1)
- Routing Information Protocol (1)
- Routing Information Protocol (RIP) (1)
- Routing Information Protokoll (1)
- Routing Loops (1)
- Routing with Metric based Topology Investigation (RMTI) (1)
- Ruscaceae (1)
- Russia (1)
- Räuber (1)
- Rückverfolgbarkeit (1)
- SOA (1)
- SPARQL (1)
- SPEAR (1)
- STOF Model (1)
- Salinisation (1)
- Sand (1)
- Satelliten-DNS (1)
- Sattelkraftfahrzeug (1)
- Sattelzug (1)
- Saving (1)
- Saving and credit cooperatives (SACCOs) (1)
- Schadstoffbelastung (1)
- Schadstoffkonzentration (1)
- Schema Information (1)
- Schizophrenie (1)
- Schnee (1)
- Schreiben (1)
- Schreibtechnik (1)
- Schulden (1)
- Schwache Lösungen (1)
- Schwebstoffe (1)
- Schwermetalle (1)
- Schädlingskontrolle (1)
- Search engine (1)
- Security (1)
- Security Requirements (1)
- Security Routing (1)
- Sediment Water Interface (1)
- Sediment-Water-Interfaces (1)
- See (1)
- Segmentation (1)
- Segmentierung (1)
- Selbstbeobachtung (1)
- Selbstbeschädigung (1)
- Selbsteinschaetzung (1)
- Selbstorganisation (1)
- Selbstregulation (1)
- Self-determination theory (1)
- Semantic Data (1)
- Semantic Web Data (1)
- Semantics (1)
- Sensing as a Service (1)
- Serious Games (1)
- Service identification (1)
- Service-Identifizierung (1)
- Service-oriented Architectures (SOA) (1)
- Service-orientierte Architektur (1)
- Sexuelle Orientierung (1)
- Shader (1)
- Sicherheit Routing (1)
- Simulationswerkzeug (1)
- Size-fractionation (1)
- Skalenkonstruktion (1)
- Skalenvalidierung (1)
- Skalierungsmodelle (1)
- Smartphone (1)
- Smartphone Applikation (1)
- Smartphone addiction (1)
- Social Cognitive Career Theory (1)
- Social Entrepreneurship in Vietnam (1)
- Social Games (1)
- Social Networking Platforms (1)
- Social identity theory (1)
- Socio-ecological transformation (1)
- Socio-economic development (1)
- Software (1)
- Software Development (1)
- Software Language (1)
- Software Repositories (1)
- Software Technology (1)
- Software migration (1)
- Software techniques for object recognition (STOR) (1)
- Software-Migration (1)
- Softwarearchitektur (1)
- Softwareentwicklung (1)
- Softwareergonomie (1)
- Softwaretesting (1)
- Softwarewartung (1)
- Soil physics (1)
- Soil structural stability (1)
- Solutions (1)
- Sorption (1)
- Southern Amazonia (1)
- Sozial-ökologische Transformation (1)
- Soziale Identität (1)
- Soziale Wahrnehmung (1)
- Soziales Netzwerk (1)
- Soziales System (1)
- Sozialwissenschaftliche Simulation (1)
- Sparen (1)
- Speaker Recognition (1)
- Spear (1)
- Speciation (1)
- Species turnover (1)
- Specification (1)
- Specular (1)
- Spezifikation (1)
- Spiralcurriculum (1)
- Sprechweise (1)
- Standard of living (1)
- Statistical Shape Model (1)
- Staubewässerung (1)
- Staugeregelte Flüsse (1)
- Stausee (1)
- Stauseeverlandung (1)
- Stechmücke (1)
- Stechmücken-Kontrolle (1)
- Stereotyp (1)
- Stereotype Content Model (1)
- Steuerung (1)
- Stimme (1)
- Stimmungsveränderung (1)
- Stochastic Logic (1)
- Stoffsimulation (1)
- Strassenkreuzung (1)
- Straßenzustand (1)
- Streams (1)
- Structural Equation Modeling (1)
- Structural Validity (1)
- Strukturelle Validität (1)
- Suffering (1)
- Sufficiency (1)
- Sufficiency orientation (1)
- Suffizienz (1)
- Suffizienzorientierung (1)
- Support System (1)
- Surface Science (1)
- Survey Research (1)
- Systematics (1)
- Systembiologie (1)
- Säugetiere (1)
- Südafrika (1)
- Süßwasserhaushalt (1)
- TAP (1)
- TBox (1)
- TRECVID (1)
- Tableau Calculus (1)
- Technical potential (1)
- Technologischer Raum (1)
- Telearbeit (1)
- Tempus (1)
- Tenneco Automotive (1)
- Tense (1)
- Test Generation (1)
- Testen (1)
- Testgenerierung (1)
- Text (1)
- Text Analysis (1)
- Text Mining (1)
- Text classification (1)
- Texterkennung (1)
- Theorem prover (1)
- Theorembeweiser (1)
- Time (1)
- Titandioxid-Nanopartikeln (1)
- Tokens (1)
- Tool Evaluation (1)
- Torf (1)
- Toxicological characterization (1)
- Toxicology (1)
- Toxikologische Bewertung (1)
- Toxizität (1)
- Traceability (1)
- Tracing (1)
- Tracking-System (1)
- Transfer coefficients (1)
- Transferfunction (1)
- Transferfunktion (1)
- Transformation products (1)
- Transformationsprodukte (1)
- Transport (1)
- Tropfenform (1)
- Tropical rainforest (1)
- Tropischer Regenwald (1)
- Turbulence (1)
- Turbulenz (1)
- Type System (1)
- Type system (1)
- Types of smartphone use (1)
- Ubuntu (1)
- Ultraschall (1)
- Ultrasound (1)
- Umfrage (1)
- Umfrage in Koblenz (1)
- Umkehrosmose (1)
- Umwelt (1)
- Umweltchemikalie (1)
- Umweltproben (1)
- Umweltverhaltensänderung (1)
- Umweltverschmutzung (1)
- Unified Modeling Language (UML ) (1)
- Unit disk graph (1)
- Unlink Prediction (1)
- Unsicheres Schließen (1)
- Unterrichtsforschung (1)
- Unterrichtsqualität (1)
- Untersuchung (1)
- Unterwasser-Pipeline (1)
- Unterwasserfahrzeug (1)
- Unterwasserkabel (1)
- Unterwasserwelt (1)
- User experience (1)
- User-Needs Analysis (1)
- VCD (1)
- VIACOBI (1)
- Variabilität (1)
- Vascular analysis (1)
- Vegetation (1)
- Vegetation distribution (1)
- Verb (1)
- Verbal Aspect (1)
- Verbraucherverhalten (1)
- Vergangenheitstempus (1)
- Verhandlung (1)
- Verification (1)
- Verifikation (1)
- Vermeidung (1)
- Versalzung (1)
- Verteilter Algorithmus (1)
- Verteilung (1)
- Virtual Company Dossier (1)
- Virtual characters (1)
- Virtuelle Realität (1)
- Visibility Skeleton (1)
- Visual Stimuli Discovery (1)
- Visualisierung von Verbformen (1)
- Vocabulary (1)
- Vocabulary Mapping (1)
- Vocabulary Reuse (1)
- Vocabulary Trainer (1)
- Vokabellernen (1)
- Volume Hatching (1)
- Vorschulkind (1)
- Vulnerability (1)
- WCET (1)
- WEB (1)
- WLAN Fingerprinting (1)
- WSDL (1)
- WSN (1)
- Wachstumsregler (1)
- Wahlen zum europäischen Parlament (EU-Wahlen) (1)
- Wahrscheinlichkeit (1)
- Wahrscheinlichkeitsrechnung (1)
- Wanderfische (1)
- Wasser-Sediment-Grenzschichten (1)
- Wasserverschmutzung (1)
- Wastewater (1)
- Water Management (1)
- Water quality (1)
- Wavelet (1)
- Wearables (1)
- Web (1)
- Web Analytics (1)
- Web Analytics Framework (1)
- Web Mining (1)
- Web Ontology Language (OWL) (1)
- Web Science (1)
- Web Services (1)
- Web log (1)
- Web-application framework (1)
- Web-programming technologies (1)
- Weblog (1)
- Website (1)
- Wechselkursänderung (1)
- Weinbau (1)
- Weltkultur (1)
- Werbung (1)
- WiFi Fingerprinting (1)
- Wiederbesiedlung (1)
- Wild pollinator (1)
- Wildbienen (1)
- Wildtiere (1)
- Wireless sensor network (1)
- Wirtschaft (1)
- Wirtschaftsenglisch (1)
- Wissensbasis (1)
- Wissensmanagement (1)
- Word-of-Mouth (1)
- World Wide Web 2.0 (1)
- Wortschatz (1)
- Wrapping (1)
- X-ray computer tomography (XRT) (1)
- Yellow-bellied toad (1)
- You-messages (1)
- Zeit (1)
- Zikaden (1)
- Zoologie (1)
- Zoology (1)
- Zooplankton (1)
- Zuckmücken (1)
- Zusammenhängender Graph (1)
- absolutism (1)
- acceptance (1)
- acid leaching (1)
- activated sludge (1)
- adaptive GUI Design (1)
- adaptive resonance theory (1)
- adjoint functions (1)
- advanced wastewater treatment (1)
- age cohorts (1)
- agent-based simulation (1)
- agricultural intensification (1)
- agroecosystems (1)
- amorphous hydrogenated carbon layer (1)
- amphibians (1)
- analytics (1)
- anthropogenic disturbance (1)
- application programming interfaces (1)
- aquatic environment (1)
- aquatic invertebrates (1)
- archiving (1)
- artifcial neural networks (1)
- artiffficial neural networks (1)
- artififfcial neural networks (1)
- assessment model (1)
- attitudes towards specific movie features (1)
- authoritarianism (1)
- automated theorem prover (1)
- automatic behavioral cues (1)
- backpropagation (1)
- bait-lamina test (1)
- bats (1)
- bauxite (1)
- behavior change (1)
- behavioural ecology (1)
- belief in just world (1)
- beneficial insects (1)
- benefits (1)
- benthic oxygen fluxes (1)
- bias (1)
- biocide (1)
- biocides (1)
- biodegradation (1)
- biodiversity (1)
- biodiversity conservation (1)
- biofiltration (1)
- bioindicator (1)
- biological degradation (1)
- biologischer Abbau (1)
- biotransformation (1)
- bioturbation (1)
- blockchain (1)
- blood analysis (1)
- bribery (1)
- business intelligence (1)
- business process management (1)
- by-stander effect (1)
- carbon hybridisation (1)
- catastrophy theory (1)
- categorisation (1)
- cation bridges (1)
- cation-bridges (CaB) (1)
- chalk grassland (1)
- change (1)
- chaos (1)
- chemical force microscopy (1)
- chemical risk assessment (1)
- chironomids (1)
- clonal diversity (1)
- cognitive development (1)
- cognitive linguistic approach (1)
- collaboration (1)
- collaborative technologies (1)
- collectivism (1)
- colloid (1)
- colour calibration (1)
- competence- and control beliefs (1)
- competition (1)
- concept (1)
- concurrency (1)
- conflict detection (1)
- conservation genetics (1)
- construction materials (1)
- contact angle (1)
- contemporary detective fiction (1)
- contexts of use (1)
- cooperation (1)
- core ontologies (1)
- core self-evaluations (1)
- corrosion protection (1)
- corrosion resistance (1)
- covid-19 (1)
- criminal victimization (1)
- critical section (1)
- crop pollination (1)
- cross-cultural psychology (1)
- cross-linking (1)
- cryo-electron microscopy (1)
- cultural dimensions (1)
- cultural landscape (1)
- currency exchange rates (1)
- data (1)
- data mining (1)
- data protection (1)
- data sharing (1)
- data warehouse (1)
- decision support tool (1)
- deductive (1)
- delivery drone (1)
- dengue (1)
- density separation (1)
- design thinking (1)
- deutsche Hochschulen (1)
- diabetic retinopathy (1)
- digestion (1)
- digital transformation (1)
- digital workplace (1)
- directed acyclic graphs (1)
- disabled detective (1)
- disabled masculinity (1)
- disgust sensitivity (1)
- distinct object identifiers (1)
- distributed information systems (1)
- distributed ledger (1)
- distribution (1)
- disturbance (1)
- drone (1)
- dry inland waters (1)
- e-Commerce (1)
- e-learning (1)
- e-service (1)
- e-service quality (1)
- eGovernment (1)
- eSourcing (1)
- eXperience methodology (1)
- ebullition (1)
- ecological risk management (1)
- ecology (1)
- ecosystem functioning (1)
- ecosystem functions (1)
- ecosystem services (1)
- eddy correlation (1)
- edge linking (1)
- educational alliance (1)
- effect assessment (1)
- effect-directed analysis (1)
- emergence (1)
- emerging micropollutants (1)
- empirische Untersuchung (1)
- endocrine disrupting chemicals (1)
- endokrine Regulation (1)
- energetics (1)
- engineered nanoparticles (1)
- english prepositions (1)
- enterprise collaboration platforms (1)
- enterprise collaboration systems (1)
- entrepreneurial design thinking (1)
- entrepreneurial thinking (1)
- entrepreneurship education (1)
- environmental compatibility (1)
- environmental control (displays of self, control of stress factors, social control), well-being, work or learning efficiency, social behavior, communication (1)
- environmental fate (1)
- environmental psychology (1)
- environmental risk assessment (1)
- environmental surfaces (1)
- epidemiology (1)
- epoxide (1)
- erweiterte Abwasserbehandlung (1)
- estimation of algorithm efficiency (1)
- evaluation (1)
- event model (1)
- event-based systems (1)
- events (1)
- evolution (1)
- excess deaths (1)
- expansion (1)
- faceted search (1)
- fairness (1)
- feedback (1)
- field experiment (1)
- field margin (1)
- finite state automata (1)
- first-order logic (1)
- fish tissues (1)
- floral resources (1)
- flows over time (1)
- fluid disturbances (1)
- fluid-structure interaction (1)
- focused feedback (1)
- folksonomies (1)
- freshwater ecosystem (1)
- functional web testing tools (1)
- fungicide (1)
- fungus resistant grapevine (1)
- futex (1)
- gait (1)
- galvanic anodes (1)
- gaze information (1)
- gender (1)
- genotyping error (1)
- giftedness (1)
- glass transition (1)
- global carbon cycle (1)
- grade (1)
- gradient method of training weight coefficients (1)
- grassland (1)
- grassland irrigation (1)
- greenhouse gases (1)
- groundwater remediation (1)
- groupwork (1)
- hazard prediction (1)
- healthcare (1)
- heart rate (1)
- high power impulse magnetron sputtering (1)
- humic acid (1)
- hybrid systems (1)
- hybrid work (1)
- hybride Automaten (1)
- hydrodynamic chromatography (1)
- hydrodynamische Chromatographie (1)
- hydrophobicity (1)
- hypertableaux (1)
- iCity project (1)
- image processing (1)
- image semantics (1)
- image warping (1)
- immediate priority ceiling protocol (1)
- implicit-explicit consistency (1)
- impounded rivers (1)
- in situ (1)
- incompressible fluids (1)
- information infrastructure (1)
- information retrieval (1)
- information system (1)
- insecticide (1)
- interaction (1)
- intergroup contact (1)
- internet of things (1)
- invasive Arten (1)
- invasive crayfish (1)
- invasive species (1)
- iot development platforms (1)
- iron removal (1)
- jOWL (1)
- kinematics (1)
- klonale Diversität (1)
- knowledge base (1)
- knowledge management system (1)
- knowledge work (1)
- land use (1)
- land use change (1)
- landmarks (1)
- landscape (1)
- landscape complexity (1)
- landscape mapping (1)
- laser induced fluorescence (1)
- leaching (1)
- lead desorption (1)
- leaf beetles (1)
- leafhoppers (1)
- leap motion (1)
- lexical sophistication (1)
- life cycle test (1)
- lifelong learning (1)
- living book (1)
- logistic (1)
- long-living systems (1)
- longitudinal (1)
- mPayments (1)
- machine learning (1)
- macroinvertebrates (1)
- mammals (1)
- masculine disability (1)
- masculine identity (1)
- mathematical Modelling (1)
- mathematical model (1)
- mathematische Modellbildung (1)
- measure (1)
- media competence model (1)
- medical care (1)
- medical image processing (1)
- metadata formats (1)
- metadata standards (1)
- metal-film phase plate (1)
- methane (1)
- methodology (1)
- micro-agent (1)
- microorganisms (1)
- microsatellite DNA (1)
- microsatellite analysis (1)
- minimal pruning (1)
- minimum self-contained graphs (1)
- mitigation (1)
- mitigation measures (1)
- mixtures (1)
- mobile application (1)
- mobile devices (1)
- mobile facets (1)
- mobile health care (1)
- mobile interaction (1)
- model generation (1)
- model-driven engineering (1)
- modeling (1)
- modulares System (1)
- monitor (1)
- mood change (1)
- morphological operators (1)
- multiagent systems (1)
- multidimensional (1)
- mutual exclusion (1)
- nanoparticle (1)
- nationalism (1)
- natural language generation (1)
- natural organic matter (1)
- nature conservation (1)
- negotiation (1)
- networks (1)
- neuartige Spurenstoffe (1)
- neural (1)
- nicht gefrierbares Wasser (1)
- nichtlinearer Zusammenhang (1)
- nichtsuizidale Selbstverletzung (1)
- non-consumptive effects (1)
- non-crop habitats (1)
- non-point source (1)
- non-target effects (1)
- non-target plants (1)
- norm (1)
- international organizations (1)
- numerical simulation (1)
- off-field habitats (1)
- olive mill wastewater (1)
- optimization (1)
- organic coatings (1)
- organic pollution (1)
- organizational behavior (1)
- organophosphate (1)
- ozonation (1)
- ozonation of beta blockers (1)
- ozone (1)
- ozone reactivity (1)
- pH-Wert (1)
- parallel calculations (1)
- parameter estimation (1)
- path planning (1)
- peat (1)
- pilzresistente Rebsorten (1)
- performance optimization (1)
- periphyton (1)
- personal information management (1)
- persönliches Informationsmanagement (1)
- pest control (1)
- pesticide risk assessment (1)
- pharmaceuticals (1)
- phenolic compounds (1)
- photo selection (1)
- physical activity (1)
- plain language (1)
- plant protection products (1)
- planthoppers (1)
- plants (1)
- plastic consumption (1)
- plastic debris (1)
- playful learning (1)
- point source (1)
- points of interest (1)
- policy modelling (1)
- pollen diet (1)
- pollinator development (1)
- pollinator fitness (1)
- pollution (1)
- polyurethane (1)
- population genetics (1)
- predation (1)
- predictability (1)
- preschool children (1)
- priority effects (1)
- priority-Effekte (1)
- privacy and personal data (1)
- privacy by design (1)
- privacy competence model (1)
- privacy impact assessment (1)
- privacy protection (1)
- probabilistic (1)
- procedural content generation (1)
- prognosis model (1)
- prosocial behavior (1)
- prosoziale Gruppennorm (1)
- public key infrastructure (1)
- punishment goals (1)
- pyrethroids (1)
- question answering (1)
- rainforest (1)
- reasoning (1)
- recolonisation (1)
- recurrent (1)
- refractory grade (1)
- region labeling (1)
- regression analysis (1)
- regular dag languages (1)
- regulation (1)
- relative Prototypikalität (1)
- religiousness (1)
- remote work (1)
- repellency (1)
- reproduction (1)
- requirements analysis (1)
- retina fundus images (1)
- reverse osmosis (1)
- revision (1)
- rich multimedia presentations (1)
- risks (1)
- robotics (1)
- rocking-phase plate (1)
- running (1)
- runoff (1)
- sample pretreatment (1)
- scaffolded writing (1)
- scale construction (1)
- scale validation (1)
- scene analysis (1)
- school attack (1)
- school tier (1)
- science learning (1)
- security (1)
- security awareness (1)
- self-concept (1)
- self-efficacy (1)
- self-organisation (1)
- semantic annotation (1)
- semantic desktop (1)
- semantischer Desktop (1)
- sequent calculi (1)
- sexual orientation (1)
- shiq (1)
- silver nanoparticles (1)
- single-particle analysis (1)
- smartphone app (1)
- social media data (1)
- social object (1)
- social perception (1)
- social system (1)
- software engineering (1)
- soil (1)
- soil aquifer treatment (1)
- soil pH (1)
- soil solution (1)
- soils (1)
- sparsity (1)
- spatial Fuzzy Logic (1)
- spatial and temporal varibility (1)
- speech (1)
- spray-drift (1)
- stereoscopic rendering (1)
- stereotypes (1)
- stream (1)
- stream mesocosm (1)
- student misbehavior (1)
- student writing (1)
- summative evaluation (1)
- surface characteristics (1)
- survey in Koblenz (1)
- suspended particle matter (1)
- swarming (1)
- swimming behaviour (1)
- systematics (1)
- systems to judge the quality of buildings (1)
- tag recommendation (1)
- tagging (1)
- task orientation (1)
- teacher beliefs (1)
- teacher education (1)
- teacher motivation (1)
- teacher training (1)
- teaching (1)
- teams (1)
- technology acceptance model (1)
- text-picture integration (1)
- thermal analysis (1)
- time series (1)
- titanium nitride (1)
- tool-integration (1)
- toxicokinetics (1)
- trace organic chemicals (1)
- traceability (1)
- trait-mediated effects (1)
- transformation products (1)
- transport (1)
- trophic cascades (1)
- unique name assumption (1)
- uptake (1)
- usability study (1)
- variational discretization (1)
- vascular plants (1)
- vaskuläre Pflanzen (1)
- vegetated treatment systems (1)
- vegetation modeling (1)
- verification (1)
- video games (1)
- virtual goods (1)
- viticulture (1)
- voice (1)
- volume rendering (1)
- warp divergence (1)
- wastewater treatment plant (1)
- water pollution (1)
- water re-use (1)
- water reuse (1)
- water scarcity (1)
- water treatment (1)
- water-molecule-bridges (WaMB) (1)
- weak solution (1)
- wear resistance (1)
- web 2.0 (1)
- web-portal medical e-services (1)
- wettability (1)
- wild bees (1)
- wildlife management (1)
- window of opportunity (1)
- wireless sensor networks (1)
- work from anywhere (1)
- work from home (1)
- world polity (1)
- writing (1)
- zooplankton (1)
- Ästuar (1)
- Ökologie (1)
- Ökotoxikologie (1)
- Überarbeitung (1)
- Überwachung (1)
Institute
- Fachbereich 4 (116)
- Institut für Informatik (81)
- Fachbereich 7 (78)
- Institut für Wirtschafts- und Verwaltungsinformatik (53)
- Institut für Computervisualistik (52)
- Institute for Web Science and Technologies (50)
- Institut für Management (30)
- Institut für Integrierte Naturwissenschaften, Abt. Biologie (23)
- Institut für Umweltwissenschaften (23)
- Fachbereich 8 (20)
- Institut für Softwaretechnik (14)
- Institut für Integrierte Naturwissenschaften, Abt. Chemie (10)
- Mathematisches Institut (9)
- Institut für Anglistik und Amerikanistik (7)
- Institut für Integrierte Naturwissenschaften (6)
- Institut für Integrierte Naturwissenschaften, Abt. Physik (6)
- Institut für Psychologie (5)
- Fachbereich 6 (4)
- Arbeitsbereich Sozial- und Wirtschaftspsychologie (2)
- Institut für Mathematik (2)
- Arbeitsbereich Allgemeine und Pädagogische Psychologie (1)
- Arbeitsbereich Biopsychologie, Klinische Psychologie und Psychotherapie (1)
- Arbeitsbereich Diagnostik, Differentielle und Persönlichkeitspsychologie, Methodik und Evaluation (1)
- Arbeitsbereich Entwicklungspsychologie und Pädagogische Psychologie (1)
- Fachbereich 5 (1)
- Institut für Bildung im Kindes- und Jugendalter (1)
- Institut für Erziehungswissenschaft (1)
- Institut für Integrierte Naturwissenschaften, Abt. Geographie (1)
- Institut für Kommunikationspsychologie und Medienpädagogik (1)
- Institut für Sozialwissenschaften (1)
- Institut für fremdsprachliche Philologien (1)
- Universitätsbibliothek Koblenz-Landau (1)
- Zentrale Einrichtungen (1)
This volume contains those research papers presented at the Second International Conference on Tests and Proofs (TAP 2008) that were not included in the main conference proceedings. TAP was the second conference devoted to the convergence of proofs and tests. It combines ideas from both areas for the advancement of software quality. To prove the correctness of a program is to demonstrate, through impeccable mathematical techniques, that it has no bugs; to test a program is to run it with the expectation of discovering bugs. On the surface, the two techniques seem contradictory: if you have proved your program, it is fruitless to comb it for bugs; and if you are testing it, that is surely a sign that you have given up on any hope of proving its correctness. Accordingly, proofs and tests have, since the onset of software engineering research, been pursued by distinct communities using rather different techniques and tools. And yet the development of both approaches leads to the discovery of common issues and to the realization that each may need the other. The emergence of model checking has been one of the first signs that contradiction may yield to complementarity, but in the past few years an increasing number of research efforts have encountered the need for combining proofs and tests, dropping earlier dogmatic views of their incompatibility and taking instead the best of what each of these software engineering domains has to offer. The first TAP conference (held at ETH Zurich in February 2007) was an attempt to provide a forum for the cross-fertilization of ideas and approaches from the testing and proving communities. For the 2008 edition we found the Monash University Prato Centre near Florence to be an ideal place providing a stimulating environment. We wish to sincerely thank all the authors who submitted their work for consideration. 
And we would like to thank the Program Committee members as well as additional referees for their great effort and professional work in the review and selection process. Their names are listed on the following pages. In addition to the contributed papers, the program included three excellent keynote talks. We are grateful to Michael Hennell (LDRA Ltd., Cheshire, UK), Orna Kupferman (Hebrew University, Israel), and Elaine Weyuker (AT&T Labs Inc., USA) for accepting the invitation to address the conference. Two very interesting tutorials were part of TAP 2008: "Parameterized Unit Testing with Pex" (J. de Halleux, N. Tillmann) and "Integrating Verification and Testing of Object-Oriented Software" (C. Engel, C. Gladisch, V. Klebanov, and P. Rümmer). We would like to express our thanks to the tutorial presenters for their contribution. It was a team effort that made the conference so successful. We are grateful to the Conference Chair and the Steering Committee members for their support. And we particularly thank Christoph Gladisch, Beate Körner, and Philipp Rümmer for their hard work and help in making the conference a success. In addition, we gratefully acknowledge the generous support of Microsoft Research Redmond, who financed an invited speaker.
This paper describes the robots TIAGo and Lisa used by team homer@UniKoblenz of the University of Koblenz-Landau, Germany, for the participation at RoboCup@Home 2019 in Sydney, Australia. We finished first at RoboCup@Home 2019 in the Open Platform League and have now won the competition in our league three times in a row (four times in total), which makes our team the most successful in RoboCup@Home. In the finals we demonstrated approaches for learning from demonstration, touch-enforcing manipulation and autonomous semantic exploration. A special focus is put on novel system components and the open source contributions of our team. We have released packages for object recognition, a robot face including speech synthesis, mapping and navigation, a speech recognition interface, gesture recognition and imitation learning. The packages are available (and new packages will be released) at http://homer.uni-koblenz.de.
Proceedings of the 9th Open German-Russian Workshop on Pattern Recognition and Image Understanding
(2015)
The Proceedings of the 9th Open German-Russian Workshop on Pattern Recognition and Image Understanding include publications (extended abstracts) that cover but are not limited to the following topics: - Mathematical Theory of Pattern Recognition, Image and Speech Processing, Analysis, Recognition and Understanding - Cognitive Technologies, Information Technologies, Automated Systems and Software for Pattern Recognition, Image, Speech and Signal Processing, Analysis and Understanding - Databases, Knowledge Bases, and Linguistic Tools - Special-Purpose Architectures, Software and Hardware Tools - Vision and Sensor Data Interpretation for Robotics - Industrial, Medical, Multimedia and Other Applications - Algorithms, Software, Automated Systems and Information Technologies in Bioinformatics and Medical Informatics. The workshop took place from December 1st-5th, 2014, at the University of Koblenz-Landau in Koblenz, Germany.
Engineered nanoparticles (ENP) are widely used in different industrial fields and products. In the last years, the risk potential for the release of ENP in the environment has increased as never before. ENP are expected to pass the wastewater-river-topsoil-groundwater pathway. In the terrestrial and aquatic environment ENP can undergo aging and transformation processes which can influence fate, transport and toxicological effects to different living organisms.
The scope of this workshop is to gather researchers, scientists, experts and specialists from nanoparticle and colloid science, soil and environmental chemistry, ecotoxicology or neighbouring disciplines to discuss the latest results and findings in the field of aging, fate, transport and toxicological effects of nanoparticles in the environment.
Five personality traits commonly known as the “Big Five” have been widely acknowledged as universal. But most available psychological instruments are not necessarily transferable to other cultures. They are referred to as “W.E.I.R.D.” (western, educated, industrial, rich, democratic) and lack the combined emic-etic approach that is necessary for a transcultural perspective. This intercontinental congress brings experts from Kenya and Germany together – thinking out of the box and collecting ideas for a science-based partnership between East Africa and Europe. Main topics are psychological constructs that prove relevant for Human Resources Management. The Five-Factor Model, core self-evaluations, coping processes and acculturation as well as globalization effects and gender issues are discussed.
This paper describes the robots TIAGo and Lisa used by team homer@UniKoblenz of the University of Koblenz-Landau, Germany, for the participation at the RoboCup@Home 2018 in Montreal, Canada. Further, this paper serves as qualification material for the RoboCup@Home participation in 2018. A special focus is put on novel system components and the open source contributions of our team. This year the team from Koblenz won the biggest annual scientific robot competition in Montreal in the RoboCup@Home Open Platform track for the third time and also won the RoboCup@Home German Open for the second time. As a research highlight, a novel symbolic imitation learning approach was demonstrated during the finals. The TIAGo robotic research platform was used for the first time by the team. We have released packages for object recognition, a robot face including speech synthesis, mapping and navigation, a speech recognition interface via Android, and a GUI. The packages are available (and new packages will be released) on http://wiki.ros.org/agas-ros-pkg. Further information can be found on our project page http://homer.uni-koblenz.de.
Folksonomies are Web 2.0 platforms where users share resources with each other. Furthermore, they can assign keywords (called tags) to the resources for categorizing and organizing them. Numerous types of resources, like websites (Delicious), images (Flickr), and videos (YouTube), are supported by different folksonomies. Folksonomies are easy to use and thus attract the attention of millions of users. Together with the ease they offer, there are also some problems. This thesis addresses different problems of folksonomies and proposes solutions for them. The first problem occurs when users search for relevant resources in folksonomies. Often, users are not able to find all relevant resources because they do not know which tags are relevant. The second problem is assigning tags to resources. Although many folksonomies (like Delicious) recommend tags for resources, others (like Flickr) do not recommend any tags. Tag recommendation helps users to easily tag their resources. The third problem is that tags and resources lack semantics, which leads, for example, to ambiguous tags. The tags lack semantics because they are freely chosen keywords. The automatic identification of the semantics of tags and resources helps in reducing problems that arise from this freedom users have in choosing tags. This thesis proposes methods which exploit semantics to address the problems of search, tag recommendation, and the identification of tag semantics. The semantics are discovered from a variety of sources. In this thesis, we exploit web search engines, online social communities and the co-occurrences of tags as sources of semantics. Using different sources for discovering semantics reduces the effort needed to build systems which solve the problems mentioned earlier. This thesis evaluates the proposed methods on a large-scale data set. The evaluation results suggest that it is possible to exploit semantics to improve search, the recommendation of tags, and the automatic identification of the semantics of tags and resources.
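The co-occurrence of tags, one of the sources of semantics named above, can be turned into a simple recommender in a few lines. The toy posts and the plain counting scheme below are illustrative only, not the thesis's actual method:

```python
from collections import Counter
from itertools import combinations

# Hypothetical toy folksonomy: each post is the set of tags one user
# assigned to one resource. Data is invented for illustration.
posts = [
    {"python", "programming", "tutorial"},
    {"python", "web", "django"},
    {"web", "javascript", "tutorial"},
    {"python", "programming", "web"},
]

# Count how often each unordered pair of tags is assigned together.
cooccur = Counter()
for tags in posts:
    for a, b in combinations(sorted(tags), 2):
        cooccur[(a, b)] += 1

def recommend(tag, k=3):
    """Suggest the k tags most often co-assigned with `tag`."""
    scores = Counter()
    for (a, b), n in cooccur.items():
        if a == tag:
            scores[b] += n
        elif b == tag:
            scores[a] += n
    return [t for t, _ in scores.most_common(k)]

print(recommend("python"))
```

A real system would replace the raw counts with a similarity measure and draw on the larger evidence sources (search engines, social communities) the thesis exploits.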
A fundamental understanding of attachment of engineered nanoparticles to environmental surfaces is essential for the prediction of nanoparticle fate and transport in the environment.
The present work investigates the attachment of non-coated silver nanoparticles and citrate-coated silver nanoparticles to different model surfaces and environmental surfaces in the presence and absence of humic acid. Batch sorption experiments were used for this investigation.
The objective of this thesis was to investigate how silver nanoparticles interact with surfaces having different chemical functional groups. The effect of the presence of humic acid on the particle-surface interactions was also investigated. In the absence of humic acid, nanoparticle-surface attachment was influenced by the chemical nature of the interacting surfaces. In the presence of humic acid, on the other hand, attachment was influenced by the specific surface area of the sorbent surfaces. The sorption of non-coated and citrate-coated silver nanoparticles to all surfaces was nonlinear and best described by a Langmuir isotherm, indicating monolayer sorption of nanoparticles onto the surfaces. This can be explained by the blocking effect generated by particle-particle repulsion. In the presence of humic acid, sorption of nanoparticles to the surfaces was linear. When humic acid was present in the interacting medium, both the nanoparticles and the surfaces became coated with humic acid, which masks the chemical functionalities of the surfaces and thereby changes the particle-surface interactions. For silver nanoparticle sorption from an unstable suspension, the sorption isotherms did not follow any classical sorption model, suggesting an interplay between aggregation and sorption. Citrate-coated silver nanoparticles and humic acid-coated silver nanoparticles showed a depression in sorption compared to non-coated silver nanoparticles. In the case of citrate-coated silver nanoparticles, the decrease in sorption can be explained by their more negative zeta potential compared to non-coated ones.
For humic acid-coated nanoparticles the sorption depression can be due to steric hindrance caused by free humic acid molecules, which may coat the sorbent surface, or due to competition for sorption sites between the nanoparticles and the free humic acid molecules present in the suspension. Thus nanoparticle surface chemistry is an important factor that determines the attachment of nanoparticles to surfaces, which makes the characterization of the nanoparticle surface an essential step in the study of their fate in the environment.
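The contrast drawn above between nonlinear (Langmuir) and linear sorption can be made concrete with the model equations. This is a minimal sketch with invented parameter values, not a fit to the thesis data:

```python
# Minimal sketch of the two isotherm models contrasted above.
# Parameter values are illustrative only.

def langmuir(c, q_max, k):
    """Langmuir isotherm: monolayer sorption saturating at q_max."""
    return q_max * k * c / (1.0 + k * c)

def linear(c, kd):
    """Linear isotherm: sorbed amount proportional to concentration."""
    return kd * c

q_max, k, kd = 10.0, 0.5, 2.0
for c in (0.1, 1.0, 10.0, 100.0):
    # At low c the Langmuir curve is nearly linear (slope q_max * k);
    # at high c it plateaus near q_max, reflecting monolayer coverage.
    print(f"c={c:6.1f}  langmuir={langmuir(c, q_max, k):6.3f}  linear={linear(c, kd):7.1f}")
```

The plateau of the Langmuir curve at q_max is what corresponds to the monolayer coverage and blocking effect described above, whereas the linear isotherm observed in the presence of humic acid has no such saturation.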
Another aim of this study was to introduce the potential of chemical force microscopy for nanoparticle surface characterization. With the use of this technique, it was possible to distinguish between bare silver nanoparticles, citrate coated silver nanoparticles, and humic acid coated silver nanoparticles. This was possible by measuring the adhesion forces between the nanoparticles and five different AFM probes having different chemical functionalization.
Vertebrate biodiversity is rapidly decreasing worldwide, with amphibians being the most endangered vertebrate group. In the EU, 21 of 89 amphibian species are recognized as endangered. The intensively used European agricultural landscape is one of the major causes of these declines. As agricultural land represents an essential habitat for amphibians, exposure to pesticides can have adverse effects on amphibian populations. Currently, the European risk assessment of pesticides for vertebrates relies on fish for aquatic vertebrate toxicity and on birds and mammals for terrestrial vertebrate toxicity, but does not address the unique characteristics of amphibians. Therefore, the overall goal of this thesis was to investigate the ecotoxicological effects of pesticides on Central European anuran amphibians. For this, effects on aquatic and terrestrial amphibian life stages as well as on reproduction were investigated. Then, in anticipation of a risk assessment of pesticides for amphibians, this thesis discusses potential regulatory risk assessment approaches.
For the investigated pesticides and amphibian species, it was observed that the acute aquatic toxicity of pesticides can be addressed using the existing aquatic risk assessment approach based on fish toxicity data. However, lethal as well as sublethal effects were observed in terrestrial juveniles after dermal exposure to environmentally realistic pesticide concentrations, which cannot be covered using an existing risk assessment approach. Therefore, pesticides should also be evaluated for potential terrestrial toxicity using risk assessment tools before approval. Additionally, effects of co-formulants and adjuvants of pesticides need specific consideration in a future risk assessment as they can increase toxicity of pesticides to aquatic and terrestrial amphibian stages. The chronic duration of combined aquatic and terrestrial exposure was shown to affect amphibian reproduction. Currently, such effects cannot be captured by the existing risk assessment as data involving field scenarios analysing effects of multiple pesticides on amphibian reproduction are too rare to allow comparison to data of other terrestrial vertebrates such as birds and mammals. In the light of these findings, future research should not only address acute and lethal effects, but also chronic and sublethal effects on a population level. As pesticide exposure can adversely affect amphibian populations, their application should be considered even more carefully to avoid further amphibian declines. Overall, this thesis emphasizes the urgent need for a protective pesticide risk assessment for amphibians to preserve and promote stable amphibian populations in agricultural landscapes.
Most social media platforms allow users to freely express their opinions, feelings, and beliefs. However, in recent years the growing propagation of hate speech, offensive language, racism and sexism on social media outlets has drawn attention from individuals, companies, and researchers. Today, sexism both online and offline, in different forms including blatant, covert, and subtle language, is a common phenomenon in society. A notable amount of work has been done on identifying sexist content and computationally detecting sexism online. Although previous efforts have mostly used people's activities on social media platforms such as Twitter as a public and helpful source for collecting data, they neglect the fact that the method of gathering sexist tweets could be biased towards the initial search terms. Moreover, some forms of sexism could be missed, since some tweets containing offensive language could be misclassified as hate speech. Further, in existing hate speech corpora, sexist tweets mostly express hostile sexism, and the other forms of sexism that also appear online were largely disregarded. Besides, creating labeled datasets manually, relying on users to report offensive comments and on tremendous effort by human annotators, is not only a costly and time-consuming process but also raises the risk of discrimination under biased judgment.
This thesis generates a novel sexist and non-sexist dataset constructed via "UnSexistifyIt", an online web-based game that incentivizes players to make minimal modifications to a sexist statement with the goal of turning it into a non-sexist statement and convincing other players that the modified statement is non-sexist. The game applies the "Game With A Purpose" methodology to generate data as a side effect of play, and employs gamification and crowdsourcing techniques to enhance non-game contexts. When voluntary participants play the game, they help to produce non-sexist statements, which can reduce the cost of generating a new corpus. This work explores how diverse individual beliefs concerning sexism are. Further, the results highlight the impact of various linguistic features and content attributes on sexist language detection. Finally, this thesis could help to expand our understanding of the syntactic and semantic structure of sexist and non-sexist content, provides insights for building a probabilistic classifier that labels single sentences as sexist or non-sexist, and points towards a potential ground truth for such a classifier.
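A probabilistic single-sentence classifier of the kind envisioned here can be sketched as a tiny multinomial Naive Bayes. The training sentences, labels, and smoothing choice below are invented placeholders, not the thesis's model or data:

```python
import math
from collections import Counter, defaultdict

# Toy multinomial Naive Bayes for binary sentence classification.
# Training data is an invented placeholder; a real classifier would
# be trained on the game-generated corpus described above.
train = [
    ("women cannot do this job", "sexist"),
    ("she is too emotional to lead", "sexist"),
    ("the team finished the project", "non-sexist"),
    ("she leads the project team", "non-sexist"),
]

word_counts = defaultdict(Counter)
class_counts = Counter()
vocab = set()
for sentence, label in train:
    class_counts[label] += 1
    for word in sentence.split():
        word_counts[label][word] += 1
        vocab.add(word)

def classify(sentence):
    """Return the class with the highest log-posterior (Laplace smoothing)."""
    best, best_lp = None, -math.inf
    for label in class_counts:
        lp = math.log(class_counts[label] / len(train))
        total = sum(word_counts[label].values())
        for word in sentence.split():
            lp += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(classify("she is emotional"))  # sexist
```

In practice the linguistic features and content attributes analyzed in the thesis would replace the bag-of-words representation used in this sketch.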
Nowadays, almost any IT system involves personal data processing. In such systems, many privacy risks arise when privacy concerns are not properly addressed from the early phases of the system design. The General Data Protection Regulation (GDPR) prescribes the Privacy by Design (PbD) principle. At its core, PbD obliges protecting personal data from the onset of the system development, by effectively integrating appropriate privacy controls into the design. To operationalize the concept of PbD, a set of challenges emerges: First, we need a basis to define privacy concerns. Without such a basis, we are not able to verify whether personal data processing is authorized. Second, we need to identify where precisely in a system the controls have to be applied. This calls for system analysis concerning privacy concerns. Third, with a view to selecting and integrating appropriate controls based on the results of system analysis, a mechanism to identify the privacy risks is required. Mitigating privacy risks is at the core of the PbD principle. Fourth, choosing and integrating appropriate controls into a system are complex tasks that, besides risks, have to consider potential interrelations among privacy controls and the costs of the controls.
This thesis introduces a model-based privacy by design methodology to handle the above challenges. Our methodology relies on a precise definition of privacy concerns and comprises three sub-methodologies: model-based privacy analysis, model-based privacy impact assessment and privacy-enhanced system design modeling. First, we introduce a definition of privacy preferences, which provides a basis to specify privacy concerns and to verify whether personal data processing is authorized. Second, we present a model-based methodology to analyze a system model. The results of this analysis denote a set of privacy design violations. Third, taking into account the results of privacy analysis, we introduce a model-based privacy impact assessment methodology to identify concrete privacy risks in a system model. Fourth, concerning the risks, and taking into account the interrelations and the costs of the controls, we propose a methodology to select appropriate controls and integrate them into a system design. Using various practical case studies, we evaluate our concepts, showing a promising outlook on the applicability of our methodology in real-world settings.
Knowledge-based authentication methods are vulnerable to the shoulder-surfing phenomenon.
The widespread usage of these methods, combined with a failure to address their limitations, could result in the user's information being compromised. A user authentication method ought to be effortless to use and efficient, yet secure.
The problem we face concerning the security of PIN (Personal Identification Number) or password entry is shoulder surfing, in which a direct or indirect malicious observer could identify the user's sensitive information. To tackle this issue we present TouchGaze, which combines gaze signals and touch capabilities as an input method for entering the user's credentials. Gaze signals are primarily used for targeting and touch for selecting. In this work, we have designed three different PIN entry methods which all have similar interfaces. For the evaluation, these methods were compared based on efficiency, accuracy, and usability. The results revealed that although gaze-based methods require extra time for the user to become familiar with them, they are considered more secure. In terms of efficiency, they have an error margin similar to that of the traditional PIN entry methods.
One task of executives and project managers in IT companies or departments is to hire suitable developers and to assign them to suitable problems. In this paper, we propose a new technique that directly leverages previous work experience of developers in a systematic manner. Existing evidence for developer expertise based on the version history of existing projects is analyzed. More specifically, we analyze the commits to a repository in terms of affected API usage. On these grounds, we associate APIs with developers and thus we assess API experience of developers. In transitive closure, we also assess programming domain experience.
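The commit-based assessment described above can be sketched in a few lines; note that the data model (commits as author/API pairs) and the API-to-domain mapping are hypothetical illustrations, not the paper's actual implementation:

```python
from collections import Counter, defaultdict

def api_experience(commits):
    """Aggregate per-developer API usage counts from commit records.

    Each commit is an (author, apis) pair, where `apis` lists the API
    packages whose usages the commit affected."""
    experience = defaultdict(Counter)
    for author, apis in commits:
        experience[author].update(apis)
    return experience

def domain_experience(experience, api_to_domain):
    """Lift API experience to programming-domain experience via a
    (hypothetical) API -> domain mapping, i.e. the transitive step."""
    domains = defaultdict(Counter)
    for author, counts in experience.items():
        for api, n in counts.items():
            domains[author][api_to_domain.get(api, "unknown")] += n
    return domains

# Illustrative data (invented for this sketch):
commits = [
    ("alice", ["java.sql", "javax.swing"]),
    ("alice", ["java.sql"]),
    ("bob", ["javax.swing"]),
]
exp = api_experience(commits)
dom = domain_experience(exp, {"java.sql": "databases", "javax.swing": "GUI"})
```

In this toy run, `alice` accumulates two `java.sql` commits and is therefore credited with experience in the `databases` domain.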
Human action recognition from video has received growing attention in computer vision and has made significant progress in recent years. Action recognition is the task of deciding which human actions appear in videos. The difficulties involved in distinguishing human actions are due to the high complexity of human behaviors as well as appearance variation, motion pattern variation, occlusions, etc. Many applications use human action recognition on video captured from cameras, including video surveillance systems, health monitoring, human-computer interaction, and robotics. Action recognition based on RGB-D data has drawn increasing attention in recent years. RGB-D data contain color (Red, Green, and Blue (RGB)) and depth data that represent the distance from the sensor to every pixel in the object (object point). The main problem this thesis deals with is how to automate the classification of specific human activities/actions from RGB-D data. The classification process of these activities utilizes the spatial and temporal structure of actions. Therefore, the goal of this work is to develop algorithms that can distinguish these activities by recognizing low-level and high-level activities of interest from one another. These algorithms are developed by introducing new features and methods using RGB-D data to enhance the detection and recognition of human activities. In this thesis, the most popular state-of-the-art techniques are reviewed, presented, and evaluated. From the literature review, these techniques are categorized into hand-crafted features and deep learning-based approaches. The proposed new action recognition framework is based on these two categories, which are validated in this work by embedding novel methods for human action recognition. These methods are based on features extracted from RGB-D data that are
evaluated using machine learning techniques. The work presented in this thesis improves human action recognition in two distinct parts. The first part focuses on improving current successful hand-crafted approaches. It contributes to two significant areas of the state of the art: applying existing feature detectors, and classifying human actions in the 3D spatio-temporal domain by testing new combinations of different feature representations. The contributions of this part are tested with machine learning techniques that include unsupervised and supervised learning, to evaluate their suitability for the task of human action recognition. K-means clustering represents the unsupervised learning technique, while the supervised learning techniques are represented by Support Vector Machine, Random Forest, K-Nearest Neighbor, Naive Bayes, and Artificial Neural Network classifiers. The second part focuses on studying current deep-learning-based approaches and how to use them with RGB-D data for the human action recognition task. As the first step of each contribution, an input video is analyzed as a sequence of frames. Then, pre-processing steps are applied to the video frames, such as filtering and smoothing methods, to remove noisy data from each frame. Afterward, different motion detection and feature representation methods are used to extract the features present in each frame. The extracted features
are represented by local features, global features, and feature combinations, besides deep learning methods, e.g., Convolutional Neural Networks. The feature combination achieves an excellent accuracy that outperforms other methods on the same RGB-D datasets. All the results of the methods proposed in this thesis are evaluated on publicly available datasets, which illustrates that using spatio-temporal features can improve the recognition accuracy. Competitive experimental results are achieved overall. In particular, the proposed methods generalize better to the test sets than the state-of-the-art methods on the RGB-D datasets.
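As a minimal sketch of the supervised-learning side of such an evaluation, the toy example below classifies a fixed-length action descriptor with a K-Nearest Neighbor classifier, one of the classifiers named above; the feature vectors and action labels are invented for illustration and do not reproduce the thesis's actual features:

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training vectors under Euclidean distance."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Toy spatio-temporal descriptors (hypothetical): each action clip is
# assumed to be reduced to a fixed-length feature vector beforehand.
train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = ["wave", "wave", "jump", "jump"]
print(knn_predict(train, labels, (0.85, 0.85)))  # -> jump
```

The same scheme extends unchanged to higher-dimensional descriptors; only the distance computation grows with the feature length.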
Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. Registered and fused images help doctors by providing a more informative image that takes advantage of the different modalities. The cochlea's small size and complex structure, in addition to the different resolutions and head positions during imaging, pose a big challenge for the automated registration of the different image modalities. To obtain an automatic measurement of the cochlea length and the volume size, a segmentation method for cochlea medical images is needed. The goal of this dissertation is to introduce new practical and automatic algorithms for multi-modal 3D image registration, fusion, segmentation and analysis of the human cochlea. Two novel methods for automatic cochlea image registration (ACIR) and automatic cochlea analysis (ACA) are introduced. The proposed methods crop the input images to the cochlea part and then align the cropped images to obtain the optimal transformation. After that, this transformation is used to align the original images. ACIR and ACA use Mattes mutual information as the similarity metric and the adaptive stochastic gradient descent (ASGD) or the stochastic limited-memory Broyden–Fletcher–Goldfarb–Shanno (s-LBFGS) optimizer to estimate the parameters of a 3D rigid transform. A second, non-rigid registration stage estimates B-spline coefficients that are used in an atlas-model-based segmentation to extract the cochlea scalae and the relative measurements of the input image. The image with an existing segmentation is aligned to the input image to obtain the non-rigid transformation. After that, the segmentation of the first image, along with the point models, is transformed to the input image. The detailed transformed segmentation provides the scala volume size.
Using the transformed point models, the A-value, the central scala lengths, and the lateral and organ-of-Corti scala tympani lengths are computed. The methods have been tested using clinical 3D images of a total of 67 patients: from Germany (41 patients) and Egypt (26 patients). The patients are of different ages and genders. The number of images used in the experiments is 217; they are multi-modal 3D clinical images from CT, CBCT, and MRI scanners. The proposed methods are compared to state-of-the-art optimizer-based medical image registration methods, e.g. fast adaptive stochastic gradient descent (FASGD) and efficient preconditioned stochastic gradient descent (EPSGD). The comparison used the root mean square error (RMSE) between the ground-truth landmarks and the resulting landmarks. The landmarks are located manually by two experts to represent the round window and the top of the cochlea. After obtaining the transformation using ACIR, the landmarks of the moving image are transformed using the resulting transformation, and the RMSE between the transformed landmarks and the fixed-image landmarks is computed. I also used the active length of the cochlear implant electrodes to compute the error arising from image artifacts, and found an error ranging from 0.5 mm to 1.12 mm. The ACIR method's average RMSE was 0.36 mm with a standard deviation (SD) of 0.17 mm. The average total time required for registration of an image pair using ACIR was 4.62 seconds with an SD of 1.19 seconds. All experiments were repeated 3 times for validation. Comparing the RMSE of ACIR2017 and ACIR2020 using a paired t-test shows no significant difference (p-value = 0.17). The total average RMSE of the ACA method was 0.61 mm with an SD of 0.22 mm. The average total time required for analysing an image was 5.21 seconds with an SD of 0.93 seconds.
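The landmark-based RMSE comparison can be illustrated with a small sketch; the 3D landmark coordinates below are invented for illustration, not taken from the clinical datasets:

```python
import math

def rmse(landmarks_a, landmarks_b):
    """Root mean square error between two corresponding sets of 3D
    landmarks (e.g. round window and cochlea apex), in the same unit
    as the input coordinates (here: mm)."""
    assert len(landmarks_a) == len(landmarks_b)
    sq = [
        sum((p - q) ** 2 for p, q in zip(a, b))
        for a, b in zip(landmarks_a, landmarks_b)
    ]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical transformed (moving) vs. fixed-image landmarks, in mm:
moved = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
fixed = [(1.1, 2.0, 3.0), (4.0, 5.2, 6.0)]
print(round(rmse(moved, fixed), 3))  # -> 0.158
```

A per-landmark error would use the same squared distances without averaging; the RMSE summarizes them into one registration-quality figure per image pair.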
The statistical tests show that there is no difference between the results from the automatic A-value method and the manual A-value method (p-value = 0.42). There is also no difference between the length measurements of the left and the right ear sides (p-value > 0.16). Comparing the results from the German and Egyptian datasets shows no difference when using the manual or automatic A-value methods (p-value > 0.20). However, there is a significant difference between the German and the Egyptian results when using the ACA2020 method (p-value < 0.001). The average time to obtain the segmentation and all measurements was 5.21 seconds per image. The cochlea scala tympani volume ranged from 38.98 mm³ to 57.67 mm³. The combined scala media and scala vestibuli volume ranged from 34.98 mm³ to 49.3 mm³. The overall volume of the cochlea thus ranges from 73.96 mm³ to 106.97 mm³. The lateral wall length of the scala tympani ranged from 42.93 mm to 47.19 mm. The organ-of-Corti length of the scala tympani ranged from 31.11 mm to 34.08 mm. Using the A-value method, the lateral length of the scala tympani ranged from 36.69 mm to 45.91 mm, and the organ-of-Corti length of the scala tympani from 29.12 mm to 39.05 mm. The length from the ACA2020 method can be visualised and has well-defined endpoints. The ACA2020 method works on different modalities and different images regardless of the noise level or the resolution. On the other hand, the A-value method works neither on MRI nor on noisy images. Hence, the ACA2020 method may provide more reliable and accurate measurements than the A-value method. The source code and the datasets are made publicly available to help reproduce and validate my results.
Social media provides a powerful way for people to share opinions and sentiments about a specific topic, allowing others to benefit from these thoughts and feelings. This process generates a huge amount of unstructured data, such as texts, images, and references, that constantly increases through daily comments on related discussions. However, the vast amount of unstructured data poses risks to the information-extraction process, and decision making becomes highly challenging, because data overload may cause the loss of useful data due to its inappropriate presentation and its accumulation. This thesis contributes to the field of analyzing and detecting sentiments in images and texts by extracting the feelings and opinions hidden in huge collections of image data and texts on social networks. These feelings are then classified as positive, negative, or neutral, according to the features of the classified data. The process of extracting these feelings greatly helps decision-making processes on various topics, as explained in the first chapter of the thesis. A system has been built that can classify the sentiments inherent in the images and texts on social media sites, such as people's opinions about products and companies, personal posts, and general messages. This thesis begins by introducing a new method for reducing the dimension of text data based on data-mining approaches and then examines the sentiment using neural and deep neural network classification algorithms. Subsequently, in contrast to sentiment analysis research on text datasets, we examine sentiment expression and polarity classification within and across image datasets by building deep neural networks based on the attention mechanism.
The bio-insecticide Bacillus thuringiensis israelensis (Bti) has become the most commonly used agent worldwide in mosquito control programs that pursue two main objectives: the control of vector-borne diseases and the reduction of nuisance, mainly coming from mosquitoes that emerge in large quantities from seasonal wetlands. The Upper Rhine Valley, a biodiversity hotspot in Germany, has been treated with Bti for decades to reduce mosquito-borne nuisance and increase human well-being. Although Bti is presumed to be an environmentally safe agent, adverse effects on wetland ecosystems are still a matter of debate, especially when it comes to long-term and indirect effects on non-target organisms. In light of the above, this thesis aims at investigating direct and indirect effects of Bti-based mosquito control on non-target organisms within wetland food chains. Effects were examined in studies of increasing eco(toxico)logical complexity, ranging from laboratory over mesocosm to field approaches, with a focus on the non-biting Chironomidae and amphibian larvae (Rana temporaria, Lissotriton sp.). In addition, public acceptance of environmentally less invasive alternative mosquito control methods was evaluated in surveys among the local population.
Chironomids were the most severely affected non-target aquatic invertebrates. Bti substantially reduced larval and adult chironomid abundances and modified their species composition. Repeated exposure to commonly used Bti formulations induced sublethal alterations of enzymatic biomarker activity in frog tadpoles. Bti-induced reductions of chironomid prey availability indirectly decreased the body size of newts at metamorphosis and increased predation on newt larvae in mesocosm experiments. Indirect effects of severe reductions in midge biomass might equally be passed through aquatic but also terrestrial food chains, influencing predators of higher trophic levels. The majority of affected people in the Upper Rhine Valley expressed a high willingness to contribute financially to environmentally less harmful mosquito control. Alternative approaches could still include Bti applications while exempting ecologically valuable areas from treatment. Potentially rising mosquito levels could be counteracted with locally acting mosquito traps in domestic and urban areas, because mosquito presence was experienced as most annoying in the home environment.
As Bti-based mosquito control can adversely affect wetland ecosystems, its large-scale application, including in nature conservation areas, should be considered more carefully to avoid harmful consequences for the environment of the Upper Rhine Valley. This thesis emphasizes the importance of reconsidering the current practice of mosquito control and encourages research on alternative mosquito control concepts that are endorsed by the local population. In the context of the ongoing amphibian and insect declines, further human-induced effects on wetlands should be avoided to preserve biodiversity in functioning ecosystems.
Recent estimates have confirmed that inland waters emit a considerable amount of CH4 and CO2 to the atmosphere at the regional and global scale. However, these estimates are based on extrapolated measurements, lack data from inland waters in arid and semi-arid regions and from carbon sources such as wastewater treatment plants (WWTPs), and insufficiently resolve the spatiotemporal variability of these emissions.
In this study, we analyzed monthly hydrological, meteorological, and water quality data from three irrigation and drinking water reservoirs in the lower Jordan River basin and estimated the atmospheric emission rates of CO2. We investigated the effect of WWTPs on surrounding aquatic systems in terms of CH4 and CO2 emissions by presenting seasonally resolved data for the dissolved concentrations of both gases in the effluents and in the receiving streams at nine WWTPs in Germany.
We investigated the spatiotemporal variability of CH4 and CO2 emissions from aquatic ecosystems by using simple low-cost tools for measuring the CO2 flux and bubble release rate from freshwater systems. Our estimates showed that reservoirs in semi-arid regions are oversaturated with CO2 and act as net sources to the atmosphere. The magnitude of the observed fluxes at the three water reservoirs in Jordan is comparable to that of tropical reservoirs (3.3 g CO2 m-2 d-1). The CO2 emission rate from these reservoirs is linked to changes of the water surface area, which result from water management practices. WWTPs have been shown to discharge a considerable amount of CH4 (30.9±40.7 kg yr-1) and CO2 (0.06±0.05 Gg yr-1) to their surrounding streams, and the emission rates of CH4 and CO2 from these streams are significantly enhanced by the effluents of WWTPs, by up to a factor of 1.2 and 8.6, respectively.
Our results showed that both the diffusive flux and the bubble release rate varied in time and space; both emission pathways should be included, and their variability resolved adequately, in future sampling and measurement strategies. We conclude that future emission measurements and estimates for inland waters should consider water management practices and carbon sources from WWTPs, as well as the spatial and temporal variability of emissions.
For decades a worldwide decline of biological diversity has been reported. Landscapes are influenced by several kinds of anthropogenic disturbances. Agricultural land use, the application of fertilizers and pesticides, and the removal of corridors simplify and homogenize a landscape, whereas others, like road construction, lead to fragmentation. Both kinds constrain habitats, reduce living space and the gene pool, hinder gene flow, and change the functional characteristics of species. Furthermore, they facilitate the introduction of alien species. On the other hand, disturbances of different temporal and spatial dimensions lead to a more diverse landscape because they prevent competitive exclusion and create niches where species are able to coexist.
This study focuses on the complexity of disturbance regimes and their influence on phytodiversity. It differs from other studies, which mostly select one or a few disturbance types, in including all identifiable disturbances. Data were derived from three study sites in northern Bavaria that are subject to different land-use intensities. Two landscapes are used for agriculture and forestry, of which one is intensively used and the second rather moderately and at a small scale. The third dataset was collected on an actively used military training area. The first part of the study deals with the influence of disturbance regimes on phytodiversity, first with a focus on military disturbances, then in comparison with the agricultural landscapes. The second part examines the influence of disturbance regimes on red-listed species, the distribution of neophytes and generalist plant species, and the homogenization of the landscape. All analyses were conducted at the landscape and local scales.
A decisive role was played by the variety of disturbance types, especially in different temporal and spatial dimensions, rather than by single kinds of disturbances; this was significantly demonstrated in the military training area with its multiple and undirected disturbance regime. Homogeneous disturbance regimes, which are typically found in agricultural landscapes, led to a reduced species number. At the local scale, the abiotic heterogeneity originating from recent and historical disturbances superimposed the positive effects of disturbance regimes, whereas dry and nutrient-poor sites showed a negative effect. Due to a low tree density and moderate treatment, species numbers were significantly higher in the forest of the training area than in the two agricultural landscapes.
Numbers of red-listed species were positively correlated with the total number of species in all three sites. However, the military training area showed a significantly higher abundance within the area in comparison to the agricultural landscapes, where rare species were mostly found on marginal strips. Furthermore, the numbers of neophytes and generalist species were lower, and consequently so was homogenization.
In conclusion, the military training area is an ideal landscape from a nature conservation point of view. The moderately used agricultural area showed high species numbers and agricultural productivity. However, yield is too low to withstand either abandonment or land-use intensification.
In the present study the flora and vegetation of Kakamega Forest, an East African rainforest in Western Kenya, were investigated. Kakamega Forest is highly degraded and fragmented and is an ideal model to study the anthropogenic influence on the forest inventory. The main focus was to analyse the influence of human impact on the vascular plant species composition. During five field phases in the years 2001 to 2004, a total of 19 study sites scattered over the whole forest, including all fragments, were investigated regarding forest structure, species composition, and plant communities. The different forest sites were analysed by three different methods: phytosociological relevés, the line-transect method, and the variable-area transect method. The forest survey revealed about 400 vascular plant taxa, among them 112 trees, 62 shrubs, 58 climbers, and 114 herbs. Several species are restricted to this forest in Kenya, but only one endemic species, the herb Commelina albiflora, could be discovered. About 15 species were recorded as new for Kenya, and probably at least one species is new to science. Kakamega Forest is a unique mixture of Guineo-Congolian and Afromontane floral elements. About one half of the vascular plant species has its origin in the lowland forests of the Congo basin and one third originates from Afromontane habitats. The present study represents the first description of plant communities of Kakamega Forest. An analysis of different forest sites and plantations resulted in 17 different vegetation units. For the mature forest sites eleven plant communities were described. The young succession stage consists of two plant communities. Since the disturbance history and the age of the different plant communities could be estimated, their chronology was also described. An exception is the study sites within the plantations and afforested areas.
The four defined vegetation units were not described as plant communities, because they are highly affected by man and do not belong to the natural succession of Kakamega Forest. Nevertheless, the regeneration potential of such forests was investigated. Due to the different succession stages, the changing species composition along a disturbance gradient could be analysed. Most of Kakamega Forest consists of middle-aged secondary forest, often surrounded by very young secondary forest. A true primary rainforest could not be found due to the massive influence of over-exploitation. In all parts of the forest the anthropogenic influence could be observed. The forest develops towards a climax stage, but a comparison with former surveys shows that the regeneration is much slower than expected. Human impact has to be avoided to allow the forest to develop into a primary-like rainforest. But several climax tree species might be missing anyway, because after the broad logging activities of the past not enough seed trees remain. Species richness was highest in disturbed forest sites, where a mixture of pioneer, climax, and bushland species could be recorded. Therefore, high species richness is not a suitable indicator of forest quality; the proportion of climax species typical for Kakamega Forest would be a better measure. Compared to the main forest block, the forest fragments do not lack diversity, as would be expected due to fragmentation processes. Instead, the only near-primary forest could be recorded in Kisere, a northern fragment. The high proportion of climax species and the more or less undisturbed forest structure are a result of the strict protection by the Kenya Wildlife Service and of low logging activities. Differences in species composition between the studied forest sites are a result of the different logging histories or management regimes rather than of different edaphic or climatic conditions.
Predictive Process Monitoring is becoming more prevalent as an aid for organizations to support their operational processes. However, most software applications available today require extensive technical know-how from the operator and are therefore not suitable for most real-world scenarios. Therefore, this work presents a prototype implementation of a Predictive Process Monitoring dashboard in the form of a web application. The system is based on the PPM Camunda Plugin presented by Bartmann et al. (2021) and allows users to easily create metrics, visualizations to display these metrics, and dashboards in which visualizations can be arranged. A usability test with test users of different computer skill levels is conducted to confirm the application's user-friendliness.
The role of alternative resources for pollinators and aphid predators in agricultural landscapes
(2021)
The worldwide decline of insects is often associated with the loss of natural and semi-natural habitat caused by intensified land use. Many insects provide important ecosystem services to agriculture, such as pest control or pollination. To efficiently promote insects on the remaining semi-natural habitat, we need precise knowledge of their requirements for non-crop habitat. This thesis focuses on identifying the most important semi-natural habitats (forest edges, grasslands, and semi-open habitats) for pollinators and natural enemies of crop pests with respect to their food resource requirements. Special attention is given to floral resources and their spatio-temporal distribution in agricultural landscapes.
Floral resource maps might come closer to characterizing landscapes the way they are experienced by insects than classical habitat maps do. The performance of the two map types was compared on the prediction of wild bees and natural enemies that consume nectar and pollen, identifying habitats of special importance in the process. For wild bees, the influence of spatio-temporal floral resource availability was analysed, as well as the habitat preferences of specific groups of bees. Understanding the dietary needs of natural enemies of crop pests requires additional knowledge of prey use. To this end, ladybird gut contents were analysed by means of high-throughput sequencing for insight into aphid prey use.
Results showed that wild bees were predicted better by floral resource maps than by classical habitat maps. Forest edge area, as well as floral resources in forest edges, had positive effects on the abundance and diversity of rare bees and important crop pollinators. Similar patterns were observed for grassland diversity. Especially early floral resources seemed to have positive effects on wild bees. Crops and fruit trees produced a resource pulse in April that exceeded the floral resource availability of May and June tenfold. Most floral resources in forest edges appeared early in the season, with the highest floral density per area. Grasslands provided the lowest amount of floral resources but the highest diversity, which was evenly distributed over the season.
Despite natural enemies' need for floral resources, classical habitat maps performed better at predicting natural enemies of crop pests than floral resource maps. Classical habitat maps revealed a positive effect of forest edge habitat on the abundance of pest enemies, which translated into improved aphid control. Results from the gut content analysis reveal high proportions of pest aphid species and nettle aphids, as well as a broader insight into the prey spectra obtained from ladybirds collected from sticky traps compared to individuals collected by hand. The aphid-specific primer designed for this purpose will be helpful for identifying aphid consumption by ladybirds in future studies.
Findings of this thesis show the potential of floral resource maps for understanding interactions of wild bees and the landscape, but also indicate that natural enemies are limited by other resources. I would like to highlight the positive effects of forest edges on different groups of bees as well as on natural enemies and their performance in pest control.
Semantic descriptions of non-textual media available on the web can be used to facilitate retrieval and presentation of media assets and documents containing them. While technologies for multimedia semantic descriptions already exist, there is as yet no formal description of a high quality multimedia ontology that is compatible with existing (semantic) web technologies. We explain the complexity of the problem using an annotation scenario. We then derive a number of requirements for specifying a formal multimedia ontology, including: compatibility with MPEG-7, embedding in foundational ontologies, and modularisation including separation of document structure from domain knowledge. We then present the developed ontology and discuss it with respect to our requirements.
Public electronic procurement (eProcurement), and here electronic sourcing (eSourcing) in particular, is almost certainly on the agenda when eGovernment experts meet. Not surprisingly, eProcurement is the first high-impact service to be addressed in the European Union's recent Action Plan. This is mainly due to the fact that public procurement makes up almost 20% of Europe's GDP and therefore holds a huge saving potential. To some extent this potential lies in the common European market, since effective cross-border eSourcing solutions can open many doors, both for buyers and suppliers. To achieve this, systems, processes, and tools need to be adaptable and transferable, as well as able to communicate with each other. In one word, they need to be interoperable. In many relevant domains, interoperability has reached a very positive level: standards have been established and workflows put in place. In other domains, however, there is still a long road ahead. As a consequence, it is crucial to define requirements for such interoperable eSourcing systems and to identify the progress in research and practice.
This thesis explores the possibilities of probabilistic process modelling for Computer Supported Cooperative Work (CSCW) systems in order to predict the behaviour of the users of a CSCW system. Toward this objective, the applicability, advantages, limitations and challenges of probabilistic modelling are examined in the context of CSCW systems. Finally, as the primary goal, seven models are created and examined to show the feasibility of probabilistic process discovery and of predicting user behaviour in CSCW systems.
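As one generic illustration of the idea (not one of the seven models from the thesis), a first-order Markov chain over logged user actions already supports simple next-action prediction; all action names below are hypothetical:

```python
from collections import Counter, defaultdict

def train_markov(traces):
    """Count action-to-action transitions across observed user traces."""
    counts = defaultdict(Counter)
    for trace in traces:
        for current, nxt in zip(trace, trace[1:]):
            counts[current][nxt] += 1
    return counts

def predict_next(counts, action):
    """Return the most frequently observed next action, or None."""
    if action not in counts:
        return None
    return counts[action].most_common(1)[0][0]

# Hypothetical CSCW event traces (the action names are invented).
traces = [
    ["login", "edit", "comment", "logout"],
    ["login", "edit", "share", "logout"],
    ["login", "edit", "comment", "logout"],
]
model = train_markov(traces)
print(predict_next(model, "edit"))  # prints "comment"
```

A real process model would additionally handle transition probabilities, unseen actions, and longer histories, but the counting core stays the same.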
Implementation of Agile Software Development Methodology in a Company – Why? Challenges? Benefits?
(2019)
The software development industry is evolving rapidly. The introduction of agile software development methodologies was a tremendous structural change in companies. Agile transformation provides many opportunities and benefits to existing companies as well as to new ones. Along with benefits, agile conversion also brings many unforeseen challenges. New entrants have the advantage of being flexible and able to cope with environmental, consumer, and cultural changes, whereas existing companies are bound to rigid structures.
The goal of this research is to gain deep insight into the agile software development methodology, the agile manifesto, and the principles behind it; the prerequisites a company must know before implementing agile software development; the benefits a company can achieve by implementing it; and the significant challenges a company can face during an agile implementation.
The research objectives of this study give rise to strong motivating research questions. These research questions cover the cultural aspects of company agility, the values and principles of agile, and the benefits and challenges of agile implementation. The project management triangle shows how benefits in cost, time, and quality can be achieved by implementing agile methodologies. Six significant areas are explored, showing the different challenges a company can face while implementing the agile software development methodology. Finally, after an in-depth systematic literature review, conclusions are drawn, followed by open topics for future work and recommendations on the implementation of agile software development methodology in a company.
Over the past few decades, Single-Particle Analysis (SPA), in combination with cryo-transmission electron microscopy, has evolved into one of the leading technologies for structural analysis of biological macromolecules. It allows the investigation of biological structures in a close to native state at the molecular level. Within the last five years the achievable resolution of SPA surpassed 2 Å and is now approaching atomic resolution, which so far has only been possible with X-ray crystallography in a far from native environment. One remaining problem of Cryo-Electron Microscopy (cryo-EM) is the weak image contrast. Since the introduction of cryo-EM in the 1980s, phase plates have been investigated as a potential tool to overcome these contrast limitations. Until now, technical problems and instrumental deficiencies have made the use of phase plates difficult; an automated workflow, crucial for the acquisition of the thousands of micrographs needed for SPA, was not possible. In this thesis, a new Zernike-type Phase Plate (PP) was developed and investigated. Freestanding metal films were used as a PP material to overcome the ageing and contamination problems of standard carbon-based PPs. Several experiments, evaluating and testing various metals, identified iridium as the best-suited material. A thorough investigation of the properties of the iridium PP followed in the second part of this thesis. One key outcome is a new operation mode, the rocking PP. By using this rocking mode, fringing artifacts, another obstacle of Zernike PPs, could be eliminated. In the last part of this work, acquisition and reconstruction of SPA data of apoferritin was performed using the iridium PP in rocking mode. A special semi-automated workflow for the acquisition of PP data was developed and tested. The recorded PP data was compared to an additional reference dataset without a PP, acquired following a conventional workflow.
This thesis describes the implementation of a path-planning algorithm for multi-axle vehicles using machine learning algorithms. For that purpose, a general overview of genetic algorithms is given and alternative machine learning algorithms are briefly explained. The software developed for this purpose is based on the EZSystem simulation software developed by the AG Echtzeitsysteme at the University of Koblenz-Landau and on a path correction algorithm developed by Christian Schwarz, which is also detailed in this thesis. This includes a description of the vehicle used in the simulations. Genetic algorithms as a solution for path planning in complex scenarios are then evaluated based on the results of the developed simulation software and compared to alternative, non-machine-learning solutions, which are also briefly presented.
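A minimal sketch of the general technique (a toy genetic algorithm, not the EZSystem-based implementation described above): candidate paths are fixed-length move sequences on a grid, fitness is the negative Manhattan distance of the path's endpoint to a hypothetical goal cell, and truncation selection plus point mutation evolve the population:

```python
import random

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # right, left, up, down
GOAL = (5, 3)    # hypothetical target cell, not from the thesis
PATH_LEN = 10

def endpoint(path):
    x = y = 0
    for dx, dy in path:
        x, y = x + dx, y + dy
    return x, y

def fitness(path):
    # Negative Manhattan distance to the goal: larger is better.
    x, y = endpoint(path)
    return -(abs(x - GOAL[0]) + abs(y - GOAL[1]))

def evolve(pop_size=30, generations=150, rng=random.Random(42)):
    pop = [[rng.choice(MOVES) for _ in range(PATH_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]      # truncation selection (elitist)
        children = []
        for parent in survivors:
            child = list(parent)
            child[rng.randrange(PATH_LEN)] = rng.choice(MOVES)  # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Real vehicle path planning adds kinematic constraints and obstacle costs to the fitness function; the selection/mutation loop itself is unchanged.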
Culture and violence
(2010)
The basic assumption of this study is that specific cultural conditions may lead to psychopathological reactions through which an increase in interpersonal violence may occur. The objective of this study was to determine to what extent homicide rates across national cultures might be associated with the strength of their attitudes toward specific beliefs and values, and their scores on specific cultural dimensions. To answer this question, nine independent variables were defined, six of which were related to people's attitudes concerning the importance of religion (Religiosity), an excessive feeling of choice and control (Omnipotence), a clear-cut distinction between good and evil (Absolutism), pride in their nationality (Nationalism), approval of competition (Competitiveness), and high respect for authorities with an emphasis on obedience (Authoritarianism). The data for these variables were collected from the World Values Survey. For two cultural dimensions, Collectivism and Power Distance, Hofstede's scores were used. The ninth variable was GNI per capita. After estimating the 7% of missing values in the data set through multiple imputation, a sample of 81 nations was used for further statistical analyses.
Results: Stepwise regression analysis indicated Omnipotence and GNI as the strongest predictors of homicide (β = .44, p < .001 and β = -.27, p = .006, respectively). The nine independent variables loaded on two factors, socio-economic development (SED) and a psycho-cultural factor (Psy-Cul), which were negatively correlated (-.47). Psy-Cul was interpreted as an indicator of narcissism and a mediator between SED and homicide. Hierarchical cluster analysis made a clear distinction among three main groups of Western, developing, and post-Communist nations on the basis of the two factors.
We aim to demonstrate that automated deduction techniques, in particular those following the model computation paradigm, are very well suited for database schema/query reasoning. Specifically, we present an approach to compute completed paths for database or XPath queries. The database schema and a query are transformed to disjunctive logic programs with default negation, using a description logic as an intermediate language. Our underlying deduction system, KRHyper, then detects whether a query is satisfiable or not. In case of a satisfiable query, all completed paths -- those that fulfill all given constraints -- are returned as part of the computed models. The purpose of our approach is to dramatically reduce the workload on the query processor. Without the path completion, a usual XML query processor would search the database for solutions to the query. In the paper we describe the transformation in detail and explain how to extract the solution to the original task from the computed models. We understand this paper as a first step that covers a basic schema/query reasoning task by model-based deduction. Due to the underlying expressive logic formalism, we expect our approach to easily adapt to more sophisticated problem settings, like type hierarchies as they evolve within the XML world.
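As a toy illustration of what "completed paths" means (a plain graph search, not the KRHyper-based transformation described above), the following sketch enumerates schema paths that visit a query's element names in order; the bibliographic schema is invented:

```python
def completed_paths(schema, start, query):
    """Enumerate schema paths completing the query steps in order.
    `schema` maps an element name to its allowed child elements (a toy
    schema). A path is 'completed' here when it starts at `start` and
    visits the query's element names in order, possibly with gaps in
    between (roughly a descendant axis)."""
    results = []

    def walk(node, path, remaining):
        # Consume the next query step if this node matches it.
        matched = remaining[1:] if remaining and node == remaining[0] else remaining
        if not matched:
            results.append(path)
            return
        for child in schema.get(node, []):
            walk(child, path + [child], matched)

    walk(start, [start], query)
    return results

# Hypothetical schema and the query //book//title.
schema = {
    "library": ["book", "journal"],
    "book": ["title", "author"],
    "journal": ["title"],
}
print(completed_paths(schema, "library", ["book", "title"]))
# prints [['library', 'book', 'title']]
```

The logic-programming encoding in the paper additionally handles constraints and recursion via default negation; this sketch only conveys the input/output shape of the task.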
Hyper tableaux with equality
(2007)
In most theorem proving applications, a proper treatment of equational theories or equality is mandatory. In this paper we show how to integrate a modern treatment of equality in the hyper tableau calculus. It is based on splitting of positive clauses and an adapted version of the superposition inference rule, where equations used for paramodulation are drawn (only) from a set of positive unit clauses, the candidate model. The calculus also features a generic, semantically justified simplification rule which covers many redundancy elimination techniques known from superposition theorem proving. Our main results are soundness and completeness, but we briefly describe the implementation, too.
The Living Book is a system for the management of personalized and scenario-specific teaching material. The main goal of the system is to support active, explorative and self-determined learning in lectures, tutorials and self-study. The Living Book includes a course on 'logic for computer scientists' with uniform access to various tools like theorem provers and an interactive tableau editor. It is routinely used in teaching undergraduate courses at our university. This paper describes the Living Book and the use of theorem proving technology as a core component in the knowledge management system (KMS) of the Living Book. The KMS provides a scenario management component where teachers may describe those parts of given documents that are relevant in order to achieve a certain learning goal. The task of the KMS is to assemble new documents from a database of elementary units called 'slices' (definitions, theorems, and so on) in a scenario-based way (like 'I want to prepare for an exam and need to learn about resolution'). The computation of such assemblies is carried out by a model-generating theorem prover for first-order logic with a default negation principle. Its input consists of metadata that describe the dependencies between different slices, and logic-programming style rules that describe the scenario-specific composition of slices. Additionally, a user model is taken into account that contains information about topics and slices that are known or unknown to a student. A model computed by the system for such input then directly specifies the document to be assembled. This paper introduces the e-learning context we are faced with, motivates our choice of logic and presents the newly developed calculus used in the KMS.
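The slice-assembly idea can be sketched as a dependency closure that skips slices the user model marks as known (a plain depth-first traversal standing in for the model-generating prover used by the KMS; the slice names are invented):

```python
def assemble(goal_slices, depends_on, known):
    """Collect `goal_slices` and all their (transitive) prerequisites,
    skipping slices the user model marks as already known."""
    order, seen = [], set()

    def visit(slice_id):
        if slice_id in seen or slice_id in known:
            return
        seen.add(slice_id)
        for dep in depends_on.get(slice_id, []):
            visit(dep)                 # prerequisites come first
        order.append(slice_id)

    for s in goal_slices:
        visit(s)
    return order

# Hypothetical slice database for "I need to learn about resolution".
depends_on = {
    "resolution": ["clause-form", "unification"],
    "clause-form": ["propositional-logic"],
    "unification": ["terms"],
}
known = {"propositional-logic"}        # from the user model
print(assemble(["resolution"], depends_on, known))
# prints ['clause-form', 'terms', 'unification', 'resolution']
```

The actual KMS expresses the same closure declaratively as rules with default negation, so that scenario-specific composition rules and the user model combine in one computed model.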
The model evolution calculus
(2004)
The DPLL procedure is the basis of some of the most successful propositional satisfiability solvers to date. Although originally devised as a proof procedure for first-order logic, it has been used almost exclusively for propositional logic so far because of its highly inefficient treatment of quantifiers, based on instantiation into ground formulas. The recent FDPLL calculus by Baumgartner was the first successful attempt to lift the procedure to the first-order level without resorting to ground instantiations. FDPLL lifts to the first-order case the core of the DPLL procedure, the splitting rule, but ignores other aspects of the procedure that, although not necessary for completeness, are crucial for its effectiveness in practice. In this paper, we present a new calculus loosely based on FDPLL that lifts these aspects as well. In addition to being a more faithful lifting of the DPLL procedure, the new calculus contains a more systematic treatment of universal literals, one of FDPLL's optimizations, and so has the potential of leading to much faster implementations.
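For reference, the propositional core that FDPLL and the model evolution calculus lift to first order can be sketched in a few lines: unit propagation plus the splitting rule, here over clauses encoded as frozensets of signed integers (a textbook sketch, not the calculus itself):

```python
def simplify(clauses, lit):
    """Remove clauses satisfied by `lit`; delete the complementary literal."""
    out = []
    for c in clauses:
        if lit in c:
            continue
        out.append(c - {-lit})
    return out

def dpll(clauses, assignment=()):
    """Minimal DPLL: returns a satisfying assignment (tuple of literals)
    or None. Clauses are frozensets of ints; -n negates atom n."""
    clauses = list(clauses)
    changed = True
    while changed:                        # unit propagation
        changed = False
        for c in clauses:
            if len(c) == 1:
                (lit,) = c
                clauses = simplify(clauses, lit)
                assignment += (lit,)
                changed = True
                break
    if any(len(c) == 0 for c in clauses):
        return None                       # empty clause: conflict
    if not clauses:
        return assignment                 # all clauses satisfied
    lit = next(iter(clauses[0]))          # splitting rule: branch on a literal
    pos = dpll(simplify(clauses, lit), assignment + (lit,))
    if pos is not None:
        return pos
    return dpll(simplify(clauses, -lit), assignment + (-lit,))

# (a or b) and (not a or b) and (not b or c)
cnf = [frozenset(s) for s in ({1, 2}, {-1, 2}, {-2, 3})]
print(dpll(cnf))
```

FDPLL's contribution, as the abstract explains, is lifting the branch step to literals with variables so that no ground instantiation is needed.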
The goal of this master thesis was to develop a CRM system for the Assist team of CompuGroup Medical that aids in integrating open innovation into the development of the Minerva 2.0 software. To achieve this, CRM methodology has been combined with Social Networking Systems, following the research of Lin and Chen (2010, pp. 11 – 30). To achieve the predefined goals, literature was analyzed on how to successfully implement a CRM system as well as an online community. Subsequently, the results were applied to the development of the Minerva Community according to the guidelines of Design Science suggested by Hevner et al. (2004, pp. 75 – 104). The finished product is designed based on customer and management requirements and evaluated from a customer and company perspective.
The implementation of physiological indicators reflecting the response of organisms to changes in their environment is assumed to provide potential benefits for ecological studies. By analysing the physiological condition of organisms in freshwater ecological studies rather than their ultimate effects, physiological indicators can contribute to a faster assessment of effects than using traditional ecological indicators, such as the evaluation of the benthic community structure or the determination of the reproductive success of organisms. This can increase the effectiveness of environmental health assessment and experimental ecology. In this respect the thesis focuses on physiological measures characterizing the energetic condition and energy consumption (the concentration of energy storage compounds, the adenylate energy charge, the energy consumption in vivo), as well as individual growth (RNA:DNA ratio) of organisms. Although these sub-individual indicators are commonly applied in marine ecology and more recently in ecotoxicology, they have been rarely applied in freshwater ecology to date. With respect to an increased use of physiological indicators in freshwater ecological studies, the objectives of the present thesis are twofold. First, it highlights the potential of assessing the individual fitness by means of physiological indicators in freshwater ecological studies. For that reason, Chapter 2 provides the basic assumptions as well as the theoretical and methodological fundamentals necessary for the application of physiological indicators within freshwater ecology and, furthermore, points out their applicability by several case studies. As second objective, the thesis addresses selected ecophysiological aspects of native and non-native freshwater amphipods, which are considered suitable candidates for the determination of physiological indicators in ecological studies due to their function as keystone species within aquatic habitats. 
The studies presented in Chapters 3−5 of the thesis provide information on (i) species- and sex-specific seasonal variations within the energetic condition of natural Gammarus populations (G. fossarum, G. pulex), (ii) differences in metabolic activity and behaviour between different amphipod species (G. fossarum, G. roeselii and D. villosus), as well as (iii) the direct effects of ambient ammonia on the physiology and behaviour of D. villosus. The fundamental conclusions drawn from the conducted field and laboratory studies, as well as their relevance and general implications for the application of physiological indicators in freshwater ecological research, are discussed in Chapter 6.
SUMMARY
Buildings and infrastructures characterize the appearance of our cultural landscapes and provide essential services for the human society. However, they inevitably impact the natural environment e.g. by the structural change of habitats. Additionally, they potentially cause further negative environmental impacts due to the release of chemical substances from construction materials. Galvanic anodes and organic coatings regularly used for corrosion protection of steel structures are building materials of particular importance for the transport infrastructure. In direct contact with a water body or indirectly via the runoff after rainfall, numerous chemicals can be released into the environment and pose a risk to aquatic organisms. Up to now, there is no uniform investigation and evaluation approach for the assessment of the environmental compatibility of building products. Furthermore, galvanic anodes and organic coatings pose particular challenges for their ecotoxicological characterization due to their composition. Therefore, the objective of the presented thesis was the ecotoxicological assessment of emissions from galvanic anodes and protective coatings as well as the development of standardized assessment procedures for these materials.
The possible environmental hazard posed by the use of anodes on offshore installations was investigated on three trophic levels. To ensure a realistic and reliable evaluation, the experiments were carried out in natural seawater and under natural pH conditions. Moreover, the anode material and its main components zinc and aluminum were exposed while simulating a worst-case scenario. The anode material examined caused a weak inhibition of algae growth; no acute toxicity was observed on the luminescent bacteria and amphipods. However, an increase of aluminum and indium levels in the crustacean species was found. On the basis of these results, no direct threat has been identified for marine organisms from the use of galvanic aluminum anodes. However, an accumulation of metals in crustaceans and a resulting entry into the marine food web cannot be excluded.
The environmental compatibility of organic coating systems was evaluated by example, using a selection of relevant products based on epoxy resins (EP) and polyurethanes. For this purpose, coated test plates were dynamically leached over 64 days. The eluates obtained were systematically analyzed for their ecotoxicological effects (acute toxicity to algae and luminescent bacteria, mutagenic and estrogenic effects) and their chemical composition. In particular, the EP-based coatings caused significant bacterial toxicity and estrogen-like effects. The continuously released 4-tert-butylphenol was identified as a main contributor to these effects and was quantified in all samples at concentrations exceeding the predicted no-effect concentration for freshwater. Interestingly, the overall toxicity was not governed by the content of 4-tert-butylphenol in the products but rather by the release mechanism of this compound from the investigated polymers. This finding indicates that an optimization of the composition can result in a reduction of emissions and thus of environmental impacts - possibly due to a better polymerization of the compounds.
Coatings for corrosion protection are exposed to rain, changes in temperature and sunlight, leading to a weathering of the polymer. To determine the influence of light-induced aging on the ecotoxicity of top coatings, the emissions and associated adverse effects of UV-irradiated and untreated EP-based products were compared. To that end, the investigation of static leachates focused on estrogenicity and bacterial toxicity, which were detected in the classic microtiter plate format and in combination with thin-layer plates. Both materials examined showed a significant decrease of the ecotoxicological effects after irradiation, with a simultaneous reduction of the 4-tert-butylphenol emission. However, bisphenol A and various structural analogues were detected as photolytic degradation products of the polymers, which also contributed to the observed effects. In this context, the identification of bioactive compounds was supported by the successful combination of in-vitro bioassays with chemical analysis by means of an effect-directed analysis. The presented findings provide important information to assess the general suitability of top coatings based on epoxy resins.
Within the scope of the present study, an investigation concept was developed and successfully applied to a selection of relevant construction materials. The adaptation of single standard methods allowed an individual evaluation of these products. At the same time, the suitability of the ecotoxicological methods used for the investigation of materials of unknown and complex composition was confirmed and the basis for a systematic assessment of the environmental compatibility of corrosion protection products was created. Against the background of the European Construction Products Regulation, the chosen approach can facilitate the selection of environmentally friendly products and contributes to the optimization of individual formulations by the simple comparison of different building materials e.g. within a product group.
This minor thesis shows a way to optimise a generated oracle to achieve shorter runtimes. Shorter runtimes of test cases allow the execution of more test cases in the same time, and the execution of more test cases leads to higher confidence in the software quality. Oracles can be derived from specifications. However, specifications are used for different purposes and are therefore not necessarily executable; even if they are executable, their runtime may be high. Both facts stem mostly from the use of quantifiers in the logic. If the quantifier range is unbounded, or if the bounds lie outside the limits of the target language's data types, the specification is too expressive to be exported into a program. Even if the bounds lie inside those limits, the quantification is represented as a loop, which leads to a runtime blow-up, especially if quantifiers are nested. This work explains four different possibilities to reduce the execution time of the oracle by manipulating the quantified formula, although this approach is only applicable if the quantified variables are of type Integer.
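One generic way such a quantifier loop can be shortened (a standard illustration, not necessarily one of the four techniques from the thesis) is short-circuiting a universally quantified check at the first counterexample; the sortedness property below is an invented example:

```python
def oracle_naive(xs):
    """forall i: xs[i] <= xs[i+1], evaluated by a full loop over the range."""
    ok = True
    for i in range(len(xs) - 1):
        if xs[i] > xs[i + 1]:
            ok = False          # keeps iterating even after a failure
    return ok

def oracle_short_circuit(xs):
    """Same property, but the loop exits at the first counterexample."""
    for i in range(len(xs) - 1):
        if xs[i] > xs[i + 1]:
            return False
    return True

print(oracle_naive([3, 1, 2]), oracle_short_circuit([3, 1, 2]))
```

Both oracles compute the same predicate; the second merely exploits that a universal quantifier is falsified by its first witness, which matters most when quantifiers are nested.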
E-KRHyper is a versatile theorem prover and model generator for first-order logic that natively supports equality. Inequality of constants, however, has to be given by explicitly adding facts. As the amount of these facts grows quadratically in the number of distinct constants, the knowledge base is blown up. This makes it harder for a human reader to focus on the actual problem, and impairs the reasoning process. We extend the E-hyper tableau calculus underlying E-KRHyper to avoid this blow-up by implementing native handling for inequality of constants. This is done by introducing the unique name assumption for a subset of the constants (the so-called distinct object identifiers). The obtained calculus is shown to be sound and complete and is implemented in the E-KRHyper system. Synthetic benchmarks, situated in the theory of arrays, are used to demonstrate the benefits of the new calculus.
In automated theorem proving, some problems need information on the inequality of certain constants. In most cases this information is provided by adding facts which explicitly state that two constants are unequal. Depending on the number of constants, a huge number of such facts can clutter the knowledge base and distract the author and readers of the problem from its actual proposition. For most cases it is safe to assume that a larger knowledge base reduces the performance of a theorem prover, which is another drawback of explicit inequality facts. Using the unique name assumption in those reasoning tasks renders the introduction of inequality facts obsolete, as the unique name assumption states that distinct constants denote distinct objects. Implicit handling of non-identical constants makes the problems easier to comprehend and reduces the execution time of reasoning. In this thesis we show how to integrate the unique name assumption into the E-hyper tableau calculus and prove that the modified calculus is sound and complete. The calculus is implemented in the E-KRHyper theorem prover, and we show by empirical evaluation that the changed implementation, which is able to use the unique name assumption, is superior to the traditional version of E-KRHyper.
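The quadratic blow-up that the unique name assumption avoids is easy to quantify: without it, one explicit inequality fact is needed per unordered pair of distinct constants (the constant names below are invented):

```python
def inequality_facts(constants):
    """Enumerate the explicit facts c_i != c_j a prover would need
    without the unique name assumption (one per unordered pair)."""
    facts = []
    for i, a in enumerate(constants):
        for b in constants[i + 1:]:
            facts.append((a, b))
    return facts

names = [f"c{i}" for i in range(100)]
print(len(inequality_facts(names)))  # 100 * 99 / 2 = 4950 facts
```

Under the unique name assumption, the same information is carried by zero facts: distinctness of the identifiers is built into the calculus.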
Many pharmaceuticals (e.g. antibiotics, contrast media, beta blockers) are excreted unmetabolized and enter wastewater treatment plants (WWTPs) through the domestic sewage system. Research has shown that many of them are not effectively removed by conventional wastewater treatment and are therefore detected in surface waters. Reverse osmosis (RO) is one of the most effective means for removing a wide range of micropollutants in water recycling. However, one significant disadvantage is the need to dispose of the resultant RO concentrate. Because micropollutants occur at elevated concentrations in the concentrate, direct disposal to surface water could be hazardous to aquatic organisms. As a consequence, further treatment of the concentrate is necessary. In this study, ozonation was investigated as a possible treatment option for RO concentrates. Concentrate samples were obtained from a RO-membrane system which uses municipal WWTP effluents as feed water to produce infiltration water for artificial groundwater recharge. It could be shown that ozonation is efficient in the attenuation of selected pharmaceuticals, even in samples with high TOC levels (46 mg C/L). Tests with chlorinated and non-chlorinated WWTP effluent showed an increase of ozone stability, but a decrease of hydroxyl radical exposure in the samples after chlorination. This may shift the oxidation processes towards direct ozone reactions and favor the degradation of compounds with high apparent second-order rate constants. Additionally, it might inhibit the oxidation of compounds predominantly reacting with OH radicals. Ozone reaction kinetics were investigated for beta blockers (acebutolol, atenolol, metoprolol and propranolol), which are permanently present in WWTP effluents. Beta blockers share two moieties that are reactive towards ozone: a secondary amine group and an activated aromatic ring.
The secondary amine is responsible for a pH dependence of the direct ozone reaction rate, since only the deprotonated amine reacts very quickly. At pH 7, acebutolol, atenolol and metoprolol reacted with ozone with an apparent second-order rate constant of about 2000 M^-1 s^-1, whereas propranolol reacted at ~1.0 x 10^5 M^-1 s^-1. The rate constants for the reaction of the selected compounds with OH radicals were determined to be 0.5-1.0 x 10^10 M^-1 s^-1. Oxidation products (OPs) formed during ozonation of metoprolol and propranolol were identified via liquid chromatography (LC) tandem mass spectrometry. Ozonation led to a high number of OPs being formed. Experiments were carried out in MilliQ water at pH 3 and pH 8, both with and without the radical scavenger tertiary butanol (t-BuOH). This revealed the influence of pH and the OH radical exposure on OP formation. The OH radical exposure was determined by adding the probe compound para-chlorobenzoic acid (pCBA). Metoprolol: To define the impacts of the protonated and non-protonated metoprolol species on OH radical formation, the measured pCBA attenuation was compared to modeled values obtained by a simplified kinetic model (Acuchem). A better agreement with the measured results was obtained when the model was based on a stoichiometric formation of OH radical precursors (O2-) during the primary ozone reaction of metoprolol. However, for the reaction of a deprotonated molecule (attack of the aromatic ring) a formation of O2- could be confirmed, but an assumed stoichiometric O2- formation overestimated the formation of OH radicals in the system. Analysis of ozonated raw wastewater and municipal WWTP effluent spiked with 10 μM metoprolol exhibited a similar OP formation pattern as detected in the reaction system at pH 8 without added radical scavenger. This indicated a significant impact of OH radical exposure on the formation of OPs in real wastewater matrices.
Propranolol: The primary ozonation product of propranolol (OP-291) was formed by an ozone attack on the naphthalene ring, which resulted in a ring opening and two aldehyde moieties being formed. OP-291 was further oxidized to OP-307, presumably by an OH radical attack, which was then further oxidized to OP-281. Reaction pathways via ozone as well as OH radicals were proposed and confirmed by the chemical structures identified with MS2 and MS3 data. It can be concluded that ozonation of WWTP effluent results in the formation of a high number of OPs with an elevated toxic potential (e.g. formation of aldehydes).
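For orientation, the apparent second-order rate constants reported above translate into compound half-lives once an ozone exposure is fixed; the ozone concentration used below (1e-5 M, roughly 0.5 mg/L) is an assumed illustrative value, not a figure from the study:

```python
import math

def pseudo_first_order_half_life(k_o3, ozone_molar):
    """Half-life of a compound under constant ozone exposure:
    k' = k_O3 * [O3] (pseudo-first-order), t_1/2 = ln 2 / k'."""
    return math.log(2) / (k_o3 * ozone_molar)

# Rate constants as reported above (M^-1 s^-1); assumed [O3] = 1e-5 M.
for name, k in [("acebutolol/atenolol/metoprolol", 2000.0),
                ("propranolol", 1.0e5)]:
    print(name, round(pseudo_first_order_half_life(k, 1e-5), 1), "s")
```

The roughly 50-fold gap in rate constants between propranolol and the other three beta blockers shows up directly as a 50-fold shorter half-life under the same exposure.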
Fate and effects of insecticides in vegetated agricultural drainage ditches and constructed wetlands
(2006)
Studies have shown that runoff and spray-drift are important sources of nonpoint-source pesticide pollution of surface waters. Accordingly, public concern over the presence of pesticides in surface and ground water has resulted in intensive scientific efforts to find economical, yet environmentally sound solutions to the problem. The primary objective of this research was to assess the effectiveness of vegetated aquatic systems in providing buffering between natural aquatic ecosystems and the agricultural landscape following insecticide-associated runoff and spray-drift events. The first set of studies was implemented using vegetated agricultural ditches, one in Mississippi, USA, using pyrethroids (bifenthrin, lambda-cyhalothrin) under simulated runoff conditions and the other in the Western Cape, South Africa, using the organophosphate insecticide azinphos-methyl (AZP) under natural runoff and spray-drift conditions. The second set of studies was implemented using constructed wetlands, one in the Western Cape using AZP under natural spray-drift conditions and the other in Mississippi, USA, using the organophosphate MeP under simulated runoff conditions. Results from the Mississippi ditch study indicated that ditch lengths of less than 300 m would be sufficient to mitigate bifenthrin and lambda-cyhalothrin. In addition, mass balance calculations determined that the ditch plants were the major sink (generally > 90%) and/or sorption site for the rapid dissipation of the above pyrethroids from the water column. Similarly, results from the ditch study in South Africa showed that a 180 m vegetated system was effective in mitigating AZP after natural spray-drift and low-flow runoff events. Analytical results from the first wetland study show that the vegetated wetland was more effective than the non-vegetated wetland in reducing loadings of MeP. Mass balance calculations indicated approximately 90% of MeP mass was associated with the plant compartment.
Ninety-six hours after the contamination, a significant negative acute effect of contamination on abundances was found in 8 out of the 15 macroinvertebrate species in both wetland systems. Even with these toxic effects, the overall reaction of macroinvertebrates clearly demonstrated that the impact of MeP in the vegetated wetland was considerably lower than in the non-vegetated wetland. Results from the constructed wetland study in South Africa revealed that concentrations of AZP at the inlet of the 134 m wetland system were reduced by 90% at the outlet. Overall, results from all of the studies in this thesis indicate that the presence of the plant compartment was essential for the effective mitigation of insecticide contamination introduced after both simulated and natural runoff or spray-drift events. Finally, both the vegetated agricultural drainage ditch and vegetated constructed wetland systems studied would be effective in mitigating pesticide loadings introduced from either runoff or spray-drift, in turn lowering or eliminating potential pesticide associated toxic effects in receiving aquatic ecosystems. Data produced in this research provide important information to reduce insecticide risk in exposure assessment scenarios. It should be noted that incorporating these types of best management practices (BMPs) will decrease the risk of acute toxicity, but chronic exposure may still be an apparent overall risk.
Introduction:
In March 2012 a secessionist-Islamist insurgency gained momentum in Mali and quickly took control of two-thirds of the state territory. Within weeks, radical Islamists, drug smugglers and rebels suddenly ruled over a territory bigger than Germany. News of the abuse of the population and the introduction of harsh Sharia law soon spread, and word got out that the Malian army had simply abandoned the land. The general echo of the international community was surprise, a reaction that was, as this research will show, as unfounded as it was unconstructive. When Malian state structures collapsed, the world watched in shock, even though the developments could have been anticipated, and prevented. Ultimately, the situation had to be resolved by international forces (most notably French troops), who are still in Mali at the time of writing (Arieff 2013a: 5; Lohmann 2012: 3; Walther and Christopoulos 2015: 514f.; Shaw 2013: 204; Qantara, Interview, 2012; L'Express, Mali, 2015; Deutscher Bundestag, MINUSMA und EUTM Mali, 2016; UN, MINUSMA, 2016; Boeke and Schuurmann 2015: 801; Chivvis 2016: 93f.).
This research will show that the developments in Mali in 2012 had been building for a long time and could have been avoided. In doing so, it will also show why state security can never be analyzed or consolidated in an isolated manner. Instead, regional dynamics and developments must be taken into account in order to find a comprehensive approach to security in individual states. Once state failure occurs, not only has the state itself failed, but the surrounding region has equally failed to prevent that failure.
Weak states are a growing concern in many world regions, particularly in Africa. As international intervention often proves unsustainable for various reasons, the author believes that states which cannot stabilize themselves need a regional agent to support them. This regional agent should be a Regional Security Complex (RSC) as defined by Barry Buzan and Ole Waever (Buzan and Waever 2003). As the following analysis will show, Mali is a case in point. The hope is that this study will help avoid similar failures in the future by making a strong case for the establishment of RSCs.
…
This dissertation introduces a methodology for the formal specification and verification of user interfaces under security aspects. The methodology allows formal methods to be used pervasively in the specification and verification of human-computer interaction. This work consists of three parts. In the first part, a formal methodology for the description of human-computer interaction is developed. In the second part, existing definitions of computer security are adapted for human-computer interaction and formalized. A generic formal model of human-computer interaction is developed. In the third part, the methodology is applied to the specification and verification of a secure email client.
In scientific data visualization, huge amounts of data are generated, which must be analyzed efficiently. This includes reliably detecting important parts at a low expenditure of time and effort, and it is especially important for the large seismic volume datasets required for the exploration of oil and gas deposits. Since the generated data are complex and manual analysis is very time-intensive, a semi-automatic approach can on the one hand reduce the time required for the analysis and on the other hand offer more flexibility than a fully automatic approach.
This master's thesis introduces an algorithm that automatically locates regions of interest in seismic volume data by detecting anomalies in local histograms. Furthermore, the results are visualized, and a variety of tools for the exploration and interpretation of the detected regions are developed. The approach is evaluated in experiments with synthetic data and in interviews with domain experts on the basis of real-world data. Finally, further improvements for integrating the algorithm into the seismic interpretation workflow are suggested.
In the new epoch of the Anthropocene, global freshwater resources are experiencing extensive degradation from a multitude of stressors. Consequently, freshwater ecosystems are threatened by a considerable loss of biodiversity as well as a substantial decrease in adequate and secure freshwater supply for human usage, not only on local scales, but also on regional to global scales. Large-scale assessments of the human and ecological impacts of freshwater degradation enable integrated freshwater management and complement small-scale approaches. Geographic information systems (GIS) and spatial statistics (SS) have shown considerable potential in ecological and ecotoxicological research to quantify stressor impacts on humans and ecological entities, and to disentangle the relationships between drivers and ecological entities on large scales through an integrated spatial-ecological approach. However, integrations of GIS and SS with ecological and ecotoxicological models are scarce, and hence the large-scale spatial picture of the extent and magnitude of freshwater stressors as well as their human and ecological impacts is still opaque. This Ph.D. thesis contributes novel GIS and SS tools, adapts and advances available spatial models, and integrates them with ecological models to enable the identification of human and ecological impacts of freshwater degradation on large scales. The main aim was to identify and quantify the effects of stressors, i.e. climate change and trace metals, on freshwater assemblage structure and trait composition, and on human health, respectively, on large scales, i.e. European and Asian freshwater networks. The thesis starts with an introduction to the conceptual framework and objectives (chapter 1). It proceeds with outlining two novel open-source algorithms for quantification of the magnitude and effects of catchment-scale stressors (chapter 2). The algorithms,
jointly called ATRIC, automatically select an accumulation threshold for stream network extraction from digital elevation models (DEM) by ensuring the highest concordance between the DEM-derived and traditionally mapped stream networks. Moreover, they delineate catchments and upstream riparian corridors for given stream sampling points after snapping them to the DEM-derived stream network. ATRIC showed similar or better performance than the available comparable algorithms and is capable of processing large-scale datasets. It enables an integrated and transboundary management of freshwater resources by quantifying the magnitude and effects of catchment-scale stressors. Spatially shifting temporal points (SSTP), outlined in chapter 3, estimates pooled within-time series (PTS) variograms by spatializing temporal data points and shifting them. Data were pooled by ensuring consistency of spatial structure and temporal stationarity within a time series, while pooling a sufficient number of data points and increasing data density for a reliable variogram estimation. SSTP-estimated PTS variograms showed higher precision than those of the available method. The method enables regional-scale stressor quantification by filling spatial data gaps with temporal information in data-scarce regions. In chapter 4, the responses of assumed climate-associated traits from six grouping features to 35 bioclimatic indices were compared for five insect orders, their potential for changing distribution patterns under future climate change was evaluated, and the most influential climatic aspects were identified. Traits of the temperature preference grouping feature and the insect order Ephemeroptera exhibited the strongest response to climate as well as the highest potential for changing distribution patterns, while seasonal radiation and moisture were the most influential climatic aspects that may drive a change in insect distribution patterns.
The results contribute to trait-based freshwater monitoring and change prediction. In chapter 5, the concentrations of 10 trace metals in drinking water sources were predicted and compared with guideline values. In more than 53% of the total area of Pakistan, inhabited by more than 74 million people, drinking water was predicted to be at risk from multiple trace metal contamination. The results inform freshwater management by identifying potential hot spots. The last chapter (6) synthesizes the results and provides a comprehensive discussion of the four studies and of their relevance for the conservation and management of freshwater resources.
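The threshold-selection idea at the core of ATRIC can be illustrated with a short sketch: derive a candidate stream network from the flow-accumulation grid for each threshold and keep the threshold that agrees best with the traditionally mapped network. The grid encoding, the cell-wise agreement measure and all names below are simplifying assumptions for illustration, not the published algorithm:

```python
# Hypothetical sketch of accumulation-threshold selection for stream
# network extraction. Grids are lists of rows; the mapped reference
# network marks stream cells as True.

def derive_streams(accumulation, threshold):
    """Cells whose flow accumulation reaches the threshold become stream cells."""
    return [[cell >= threshold for cell in row] for row in accumulation]

def concordance(derived, mapped):
    """Fraction of cells on which the derived and mapped networks agree."""
    cells = [(d, m) for drow, mrow in zip(derived, mapped)
             for d, m in zip(drow, mrow)]
    return sum(d == m for d, m in cells) / len(cells)

def select_threshold(accumulation, mapped, candidates):
    """Return the candidate threshold yielding the highest concordance."""
    return max(candidates,
               key=lambda t: concordance(derive_streams(accumulation, t), mapped))
```

In practice the concordance measure would compare extracted stream lines rather than raw cells, but the optimization structure stays the same.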
Artificial intelligence (AI) is of rising importance these days and is increasingly used in various company functions. Nonetheless, no high-quality scientific sources could be found documenting the use of AI in the field of leadership. This elaboration addresses this research gap through expert interviews with leaders; in total, seventeen companies were questioned. The results indicate that AI is not widely used in leadership yet, since only one company uses it currently and only about 10% of the participants plan an implementation in the near future. Chances for automation as well as time and cost savings explain why companies want to use AI in leadership, but several important disadvantages and issues prevent them from actively using it now: no areas of application are known, no need justifies the use, human interaction as a key aspect of leadership would be reduced, and it is hard to collect all the necessary data. Beyond that, the study aimed to identify changes in the field of leadership through the use of AI. This objective could not be addressed due to the limited number of participants using AI in leadership.
Keywords: Leadership, artificial intelligence, transformation, state-of-use
The industry standard Decision Model and Notation (DMN) has enabled a new way of formalizing business rules since 2015. Rules are modeled in so-called decision tables, which are defined by input columns and output columns. Furthermore, decisions are arranged in a graph-like structure (the DRD level), which creates dependencies between them. Given an input, the decisions can then be requested by appropriate systems, whereby activated rules produce output for future use. However, modeling mistakes produce erroneous models, both in the decision tables and at the DRD level. Following the Design Science Research Methodology, this thesis introduces the implementation of a verification prototype for the detection and resolution of these errors during the modeling phase. The presented basics provide the theoretical foundation needed for the development of the tool. This thesis further presents the architecture of the tool and the implemented verification capabilities. Finally, the created prototype is evaluated.
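One typical decision-table error such a verification tool can detect is rule overlap, i.e. two rules that can fire on the same input. A minimal sketch, assuming rules are encoded as maps from input column to a set of allowed values (with None meaning "any value"); this encoding and the function names are illustrative, not the prototype's actual data model:

```python
# Detect pairs of overlapping rules in a DMN-style decision table.

def rules_overlap(rule_a, rule_b):
    """True if some input satisfies the input entries of both rules."""
    for col in rule_a:
        a, b = rule_a[col], rule_b[col]
        if a is not None and b is not None and not (a & b):
            return False  # the entries are disjoint in this column
    return True

def find_overlaps(rules):
    """Return the index pairs of all overlapping rules in a decision table."""
    return [(i, j)
            for i in range(len(rules))
            for j in range(i + 1, len(rules))
            if rules_overlap(rules[i], rules[j])]
```

For a table with a unique hit policy, any pair returned by `find_overlaps` would be flagged to the modeler as an error to resolve.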
Avoidance of routing loops
(2009)
We introduce a new routing algorithm which can detect routing loops by evaluating routing updates more thoroughly. The new algorithm, called Routing with Metric based Topology Investigation (RMTI), is based on the simple Routing Information Protocol (RIP) and is compatible with all RIP versions. In case of a link failure, a network can reorganize itself if redundant links are available. Redundant links are only available in a network system like the Internet if the topology contains loops; therefore, it is necessary to recognize and prevent routing loops. A routing loop can be seen as a circular trace of routing update information that returns to the same router, either directly from a neighbor router or via a loop topology. Routing loops can consume a large amount of network bandwidth and impact the end-to-end performance of the network. Our RMTI approach is capable of improving the efficiency of distance vector routing.
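The circular-trace idea can be sketched in a few lines of a distance-vector update. RMTI's actual topology investigation is more elaborate; the check used here, where a neighbour advertises a metric equal to our own metric plus the link cost, hinting that the route runs back through us, is only an illustrative assumption:

```python
# Highly simplified sketch of metric-based loop suspicion in a
# RIP-style distance-vector update.

INFINITY = 16  # RIP's conventional "unreachable" metric

def process_update(table, neighbour, link_cost, advertised):
    """Merge a neighbour's advertised metrics into our routing table.

    table maps destination -> (metric, next_hop); advertised maps
    destination -> metric. Returns destinations suspected of looping.
    """
    suspected = set()
    for dest, metric in advertised.items():
        if dest in table and metric == table[dest][0] + link_cost:
            suspected.add(dest)  # route may lead back through this router
            continue
        new_metric = min(metric + link_cost, INFINITY)
        if (dest not in table or new_metric < table[dest][0]
                or table[dest][1] == neighbour):
            table[dest] = (new_metric, neighbour)
    return suspected
```

A suspected route is simply not adopted here; a real implementation would investigate the topology further before discarding or accepting it.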
The University of Koblenz-Landau would like to apply for participation in the RoboCup Mixed Reality League in Suzhou, China, in 2008. Our team is composed of ten members and two supervisors. All members are graduate students of Computational Visualistics. Our supervisors are Ph.D. candidates currently researching in the working groups of artificial intelligence and computer graphics.
The research described in this thesis was designed to yield information on the impact of particle-bound pesticides on organisms living at the interface between sediment and water column in a temporarily open/closed estuary (TOCE). It was hypothesized that natural variables such as salinity and temperature, and anthropogenic stressors such as particle-bound pesticides, contribute to the variability of the system. A multiple-lines-of-evidence approach is necessary due to the variability in sediment type and contaminant distribution and the spatial and temporal variability within the ecosystem, in particular within TOCEs. The first aim of this thesis was to identify which particle-bound pesticides are important to the contamination of the Lourens River estuary (Western Cape, South Africa), taking into account their environmental concentrations and their physico-chemical and toxicological properties (exposure assessment). The second aim was to identify spatial and temporal variations in particle-bound pesticide contamination, natural environmental variables and benthic community structure (effect assessment). The third aim was to test the hypothesis that adaptation to fluctuating salinities leads to enhanced survival of the harpacticoid copepod Mesochra parva when exposed to a combination of particle-associated chlorpyrifos exposure and hypo-osmotic stress during a 96 h sediment toxicity test. The last aim was to identify the driving environmental variables (including natural and anthropogenic stressors) in a "natural" (Rooiels River) compared to a "disturbed" (Lourens River) estuary and to identify if and how these variables change the benthic community structure in both estuaries. Data produced in this research thus provide important information for understanding the impact of pesticides and their interaction with natural variables in a temporarily open estuary.
To summarise, this study indicated, through the multiple-lines-of-evidence approach, that the pesticides endosulfan and chlorpyrifos posed a risk to benthic organisms in a temporarily open estuary, in particular during the spring season. Furthermore, an important link between pesticide exposure/toxicity and salinity was identified, which has important implications for the management of temporarily open estuaries.
Method development for the quantification of pharmaceuticals in aqueous environmental matrices
(2021)
As a consequence of world population growth and the resulting water scarcity, water quality is the object of growing attention. In that context, organic anthropogenic molecules, often defined as micropollutants, represent a threat to water resources. Among them, pharmaceuticals are the object of particular concern due to their permanent discharge, their increasing consumption and their effect-based structures. Pharmaceuticals are mainly introduced into the environment via wastewater treatment plants (WWTPs), along with their metabolites and the transformation products (TPs) formed on-site. Once in the aquatic environment, they partition between the different environmental compartments, in particular the aqueous phase, suspended particulate matter (SPM) and biota. In the last decades, pharmaceuticals have been widely investigated in the water phase. However, extremely polar pharmaceuticals have rarely been monitored due to the lack of robust analytical methods. Moreover, metabolites and TPs have seldom been included in routine analysis methods, although their environmental relevance is proven. Furthermore, pharmaceuticals have been only sporadically investigated in SPM and biota, and adequate multi-residue methods are lacking to obtain comprehensive results about their occurrence in these matrices. This thesis endeavors to cover these gaps in knowledge by developing generic multi-residue methods for the determination of pharmaceuticals in the water phase, SPM and biota, and by evaluating the occurrence and partitioning of pharmaceuticals in these compartments. For a complete overview, a particular focus was laid on extremely polar pharmaceuticals, pharmaceutical metabolites and TPs. In total, three innovative multi-residue methods were developed; they include analytes covering a broad range of physico-chemical properties. First, a reliable multi-residue method was developed for the analysis of extremely polar pharmaceuticals, metabolites and TPs dissolved in water.
The selected analytes covered a significant range of elevated polarity, and the method can easily be extended to further analytes. This versatility was achieved by using freeze-drying as sample preparation and zwitterionic hydrophilic interaction liquid chromatography (HILIC) in gradient elution mode. The suitability of HILIC chromatography for the simultaneous quantification of a large range of micropollutants in aqueous environmental samples was thoroughly studied. Several limitations were pointed out: a very complex and time-consuming method development, a very high sensitivity with regard to modifications of the acetonitrile-to-water ratio in the eluent or the diluent, and high positive matrix effects for certain analytes. However, these limitations can be overcome by using a precise protocol and appropriate labeled internal standards, and they are outweighed by the benefits of HILIC, which permits the chromatographic separation of extremely polar micropollutants. Investigation of environmental samples showed elevated concentrations of the analytes in the water phase. In particular, gabapentin, metformin, guanylurea and oxypurinol were measured at concentrations in the µg/L range in surface water. Subsequently, a reliable multi-residue method was established for the determination of 57 pharmaceuticals and 47 metabolites and TPs sorbed to SPM down to the low ng/g range. This method was conceived to cover a large range of polarity, in particular through the inclusion of extremely polar pharmaceuticals. The extraction procedure was based on pressurized liquid extraction (PLE) followed by a clean-up via solvent exchange and detection via direct-injection reversed-phase LC-MS/MS and freeze-drying HILIC-MS/MS. Pharmaceutical sorption was examined in laboratory experiments. The derived distribution coefficients Kd varied by five orders of magnitude among the analytes and confirmed a high sorption potential for positively charged and nonpolar pharmaceuticals.
The occurrence of pharmaceuticals in the SPM of German rivers was evaluated by investigating annual composite SPM samples taken at four sites on the river Rhine and one site on the river Saar between the years 2005 and 2015. It revealed the ubiquitous presence of pharmaceuticals sorbed to SPM in these rivers. In particular, positively charged analytes, even very polar ones, and nonpolar pharmaceuticals showed appreciable concentrations. For many pharmaceuticals, a distinct correlation was observed between the annual quantities consumed in Germany and the concentrations measured in SPM. Studies of the spatial distribution of composite SPM provided hints of specific industrial discharges by comparing the pollution pattern along the river. For the first time, these results showed the potential of SPM for the monitoring of positively charged and nonpolar pharmaceuticals in surface water. Finally, a reliable and generic multi-residue method was developed to investigate 35 pharmaceuticals and 28 metabolites and TPs in fish plasma, fish liver and fish fillet. For this matrix, it was very challenging to develop an adequate clean-up allowing for sufficient separation of matrix disturbances from the analytes. In the final method, fish tissue extraction was performed by cell disruption, followed by a non-discriminating clean-up based on silica gel solid-phase extraction (SPE) and restricted access media (RAM) chromatography. Application of the developed method to the measurement of bream and carp tissues from German rivers revealed that even polar micropollutants such as pharmaceuticals are ubiquitously present in fish tissues. In total, 17 analytes were detected for the first time in fish tissues, including 10 metabolites/TPs. The importance of monitoring metabolites and TPs in fish tissues was confirmed by their detection at concentrations similar to those of their parent compounds.
Liver and fillet were shown to be appropriate for the monitoring of pharmaceuticals in fish, whereas plasma is less convenient due to very low concentrations and collection difficulties. Elevated concentrations of certain metabolites suggest a possible formation of human metabolites in fish. The measured concentrations indicate a low bioaccumulation potential for pharmaceuticals in fish tissues.
This thesis proposes the use of MSR (Mining Software Repositories) techniques to identify software developers with exclusive expertise about specific APIs and programming domains in software repositories. A pilot tool for finding such
“Islands of Knowledge” in Node.js projects is presented and applied in a case study to the 180 most popular npm packages. It is found that on average each package has 2.3 Islands of Knowledge, which is possibly explained by the finding that npm packages tend to have only one main contributor. In a survey, the maintainers of 50 packages are contacted and asked for their opinions on the results produced by the tool. Together with their responses, this thesis reports on experiences made with the pilot tool and how future iterations could produce even more accurate statements about the distribution of programming expertise in developer teams.
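The mining step behind such an analysis can be sketched in a few lines: files whose whole commit history belongs to a single developer are flagged as potential Islands of Knowledge. The input format (author, file-path pairs) and the single-author criterion below are simplifying assumptions for illustration, not the pilot tool's actual heuristics:

```python
# Flag single-author files as potential "Islands of Knowledge".
from collections import defaultdict

def find_islands(commits):
    """Map each file touched by exactly one author to that sole contributor.

    commits is an iterable of (author, file_path) pairs, one per change.
    """
    authors_per_file = defaultdict(set)
    for author, path in commits:
        authors_per_file[path].add(author)
    return {path: next(iter(authors))
            for path, authors in authors_per_file.items()
            if len(authors) == 1}
```

A refinement in the spirit of the thesis would group files by the APIs they use, so that exclusivity is measured per API or domain rather than per file.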
The novel mobile application csxPOI (short for: collaborative, semantic, and context-aware points-of-interest) enables its users to collaboratively create, share, and modify semantic points of interest (POIs). Semantic POIs describe geographic places with explicit semantic properties of a collaboratively created ontology. As the ontology includes multiple subclassifications and instantiations and links to DBpedia, the richness of annotation goes far beyond mere textual annotations such as tags. With the intuitive interface of csxPOI, users can easily create, delete, and modify their POIs and those shared by others. Thereby, the users adapt the structure of the ontology underlying the semantic annotations of the POIs. Data mining techniques are employed to cluster and thus improve the quality of the collaboratively created POIs. The semantic POIs and the collaborative POI ontology are published as Linked Open Data.
Gel effect induced by mucilage in the pore space and consequences on soil physical properties
(2020)
Water uptake, respiration and exudation are some of the biological functions fulfilled by plant roots. They drive plant growth and alter the biogeochemical parameters of the soil in the vicinity of roots, the rhizosphere. As a result, soil processes such as water fluxes, carbon and nitrogen exchange or microbial activity are enhanced in the rhizosphere in comparison to the bulk soil. In particular, the exudation of mucilage, a gel-like substance, by plant roots seems to be a strategy for plants to overcome drought stress by increasing soil water content and soil unsaturated hydraulic conductivity at negative water potentials. Although the variations of soil properties due to mucilage are increasingly understood, a comprehensive understanding of the mechanisms in the pore space leading to such variations is lacking.
The aim of this work was to elucidate the gel properties of mucilage in the pore space, i.e. of interparticulate mucilage, in order to link changes of the physico-chemical properties in the rhizosphere to mucilage. The fulfilment of this goal faced three challenges: the lack of methods for the in situ detection of mucilage in soil; the lack of knowledge concerning the properties of interparticulate mucilage; and the unknown relationship between the composition and the properties of model substances and of root mucilage produced by various species. These challenges are addressed in several chapters.
In a first instance, a literature review gathered information from various scientific fields about methods enabling the characterization of gels and gel phases in soil. The variation of soil properties resulting from biohydrogel swelling in soil was named the gel effect. The combined study of the water entrapment of gels and gel phases in soil and of soil structural properties, in terms of mechanical stability or visual structures, proved promising for disentangling the gel effect in soil.
The acquired methodological knowledge was used in the subsequent experiments to detect and characterize the properties of interparticulate gel. 1H NMR relaxometry allows the non-invasive measurement of water mobility in porous media. A conceptual model based on the equations describing the relaxation of water protons in porous media was developed to integrate the several gel effects into the NMR parameters and to quantify the influence of mucilage on proton relaxation. Rheometry was additionally used to assess mucilage viscosity and soil microstructural stability, and ESEM images were used to visualize the network of interparticulate gel. Combining the results enabled the identification of three main interparticulate gel properties: the spider-web effect restricts the elongation of the polymer chains due to the grip of the polymer network on the surface of soil particles; the polymer network effect illustrates the organization of the polymer network in the pore space according to its environment; and the microviscosity effect describes the increased viscosity of interparticulate gel in contrast to free gel. The impact of these properties on soil water mobility and microstructural stability was investigated. Consequences for soil hydraulic and soil mechanical properties found in the literature are further discussed.
The influence of the chemical properties of polymers on the gel formation mechanism and on gel properties was also investigated. For this, model substances with various uronic acid contents, degrees of esterification and amounts of calcium were tested, and their amount of high-molecular-weight substances was measured. The substances investigated included pectic polysaccharides and chia seed mucilage as model polymers, as well as wheat and maize root mucilage. Polygalacturonic acid and low-methoxy pectin proved unsuitable as model polymers for seed and root mucilage, since ionic interactions with calcium control their properties. Mucilage properties rather seem to be governed by weak electrostatic interactions between the entangled polymer chains. The amount of high-molecular-weight material varies considerably depending on the origin of the mucilage and seems to be a direct factor for mucilage's gel effect in soil. In addition to the chemical characterization of the high-molecular-weight compounds, determination of their molecular weight and of their conformation in several mucilage types is needed to draw composition-property profiles. The variations measured between the various mucilages also highlight the necessity to study how the specific properties of the various mucilages fulfill the needs of the plant from which they are exuded.
Finally, the integration of molecular interactions in gel and interparticulate gel properties to explain the physical properties of the rhizosphere was discussed. This approach offers numerous perspectives to clarify, for example, how water content or hydraulic conductivity in the rhizosphere vary according to the properties of the exuded mucilage. The hypothesis that the gel effect is general for all soil-borne exudates showing gel properties was considered. As a result, a classification of soil-borne gel phases, including the gel-like material exuded by roots, seeds, bacteria, hyphae and earthworms, according to their common physico-chemical gel properties is recommended for future research. An outcome could be that the physico-chemical properties of such gels are linked with the extent of the gel effect, with their impact on soil properties and with the functions of the gels in soil.
The loss of biodiversity is recognised on a global scale, including in the anthropogenic landscapes used for agriculture, which now cover almost 50% of the global terrestrial land surface. In agriculture, pesticides, i.e. biologically active chemicals, are deliberately distributed to control pests, diseases and weeds in the cropped areas. The quantification of remaining semi-natural structures such as field margins and hedges is a prerequisite to understanding the impact of pesticides on biodiversity, since these structures represent habitats for many organisms in agricultural landscapes. The presence of organisms in these habitats and crops is required to obtain an estimate of their potential pesticide exposure. In this thesis I provide studies on animal groups so far not addressed in risk assessment procedures for the regulation of pesticides, such as amphibians, moths and bats. For all groups it becomes apparent that they are present in agricultural landscapes and potentially coincide with pesticide applications, indicating a risk. Risk quantification also requires data on the sensitivity of organisms, and here data for plants, amphibians and bees are presented. Effects translating to the community level were studied for herbicide, insecticide and fertiliser treatments in a natural system. After three years the treatments resulted in simplified plant communities with lower species numbers and a reduction in flowering plants. This reduction of flowers is used as an example of an indirect effect and was especially obvious for the effect of a herbicide on the common buttercup. Sublethal herbicide effects on a plant translated into an impact on feeding caterpillars, indicating a reduction in food quality. Insecticide inputs realistic for field margins also reduced moth pollination of white campion flowers by 30%.
These indirect effects via distortions of food web characteristics play a critical role in understanding declines in organism groups, but so far they are not accounted for in pesticide risk assessment schemes. The current intense use of pesticides in agriculture and their inherent toxicity may lead to a chemical fragmentation of the landscape, in which populations may no longer be connected. Source-sink dynamics are important ecological processes, and as a final result not only population size but also genetic population structure might be affected. Including potential pesticide impacts as costs in a model for amphibians migrating to breeding ponds in vineyards in Rhineland-Palatinate indicated the isolation of the investigated populations. A first validation by analyzing the population structure of the European common frog confirmed the model prediction for some sites. For the regulation of pesticides in Europe a risk assessment is required, and for the organisms of the terrestrial habitat a multitude of guidance documents is in place or is currently being developed or improved. The results of the presented research indicate that wild plants, and especially their reproductive flower stage, are highly sensitive and that risks are underestimated. Population recovery of arthropods needs a re-evaluation at the landscape scale, and the addition of amphibian risk assessment to regulation procedures is suggested. However, developing or adapting risk assessment procedures and test systems is a time-consuming task, and therefore the establishment of risk management options is a pragmatic alternative with immediate effects. Artificial wetlands in the agricultural landscape proved to be important foraging sites for bats, and their creation could mitigate negative pesticide effects. The integration of direct and indirect effects in a risk assessment scheme for all organism groups, addressing also the landscape scale and pesticide mixtures, requires a long development time.
The establishment of model landscapes where management options and integrated pest management are applied on a larger scale would allow us to study pesticide effects in a realistic scenario and to develop an approach for the agriculture of the future.
With the emergence of current-generation head-mounted displays (HMDs), virtual reality (VR) is regaining much interest in the field of medical imaging and diagnosis. Room-scale exploration of CT or MRI data in virtual reality feels like an intuitive application. However, in VR, retaining a high frame rate is more critical than for conventional user interaction seated in front of a screen. There is strong scientific evidence suggesting that low frame rates and high latency have a strong influence on the onset of cybersickness. This thesis explores two practical approaches to overcoming the high computational cost of volume rendering for virtual reality. The first exploits coherency properties of the especially costly stereoscopic rendering setup. The main contribution is the development and evaluation of a novel acceleration technique for stereoscopic GPU ray casting. Additionally, an asynchronous rendering approach is pursued to minimize the amount of latency in the system. A selection of image warping techniques has been implemented and evaluated methodically, assessing their applicability to VR volume rendering.
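The per-pixel cost the abstract refers to can be illustrated with a minimal front-to-back ray-marching sketch. The axis-aligned rays, toy transfer function, and volume size below are illustrative assumptions for demonstration, not the thesis's actual GPU implementation.

```python
import numpy as np

def render(volume, n_steps=32):
    """Front-to-back compositing along axis-aligned view rays:
    every pixel accumulates colour and opacity over many samples,
    which is what makes volume rendering per-pixel expensive."""
    h, w, depth = volume.shape
    image = np.zeros((h, w))
    alpha = np.zeros((h, w))
    for k in range(n_steps):
        z = int(k * depth / n_steps)
        sample = volume[:, :, z]             # sample along the view ray
        a = np.clip(sample, 0.0, 1.0) * 0.1  # toy transfer function
        image += (1.0 - alpha) * a * sample  # front-to-back compositing
        alpha += (1.0 - alpha) * a           # accumulated opacity stays < 1
    return image, alpha

volume = np.random.default_rng(0).random((8, 8, 16))
image, alpha = render(volume)
print(image.shape, float(alpha.max()))
```

In a stereoscopic setup this whole loop runs once per eye, which motivates reusing the first eye's result for the second.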
Non-Consumptive Effects of Spiders and Ants: Does Fear Matter in Terrestrial Interaction Webs?
(2014)
Most animals suffer from predators. Besides killing prey, predators can affect prey physiology, morphology and behaviour. Spiders are among the most diverse and frequent predators in terrestrial ecosystems. Our behavioural arena experiments revealed that behavioural changes under spider predation risk are relatively scarce among arthropods. Wood crickets (Nemobius sylvestris), in particular, changed their behaviour in response to cues of various spider species, whereby more common and relatively larger spider species induced stronger antipredator behaviour in the crickets.
Behavioural changes under predation risk are expected to enhance predator avoidance, but they come at a cost. Crickets previously confronted with cues of the nursery web spider (Pisaura mirabilis) were indeed more successful in avoiding predation. Surprisingly, crickets slightly increased food uptake and lost less weight under predation risk, indicating that crickets are able to compensate for short-term costs of predation risk. In a subsequent plant choice experiment, crickets strongly avoided plants bearing spider cues, which in turn reduced the herbivory on the respective plants.
Similar to spiders, ants are ubiquitous predators and can have a strong impact on herbivores, but also on other predators. Juvenile spiders increased their propensity for long-distance dispersal if exposed to ant cues. Thus, spiders use this passive dispersal through the air (ballooning) to avoid ants and colonise new habitats.
In a field experiment, we compared arthropod colonisation between plants bearing cues of the nursery web spider and cue-free plants. We followed herbivory during the experimental period and sampled the arthropod community on the plants. In accordance with the plant choice experiment, herbivory was reduced on plants bearing spider cues. In addition, spider cues led to changes in the arthropod community: smaller spiders and black garden ants (Lasius niger) avoided plants bearing spider cues. In contrast, common red ants (Myrmica rubra) increased the recruitment of workers, possibly to protect their aphids.
Although behavioural changes were relatively rare on filter papers bearing spider cues, more natural experimental setups revealed strong and far-reaching effects of predation risk. We further suggest that risk effects influence the spatial distribution of herbivory rather than reducing overall herbivory, as would be expected if predators simply killed herbivores. Consequently, the relative importance of predation and risk effects is crucial for the way predators affect lower trophic levels.
Organic substances play an essential role in the formation of stable soil structures. In this context, their physico-chemical properties, interactions with mineral soil constituents and soil-water interactions are particularly important. However, the underlying mechanisms contributing to soil particle cementation by swollen organic substances (hydrogels) remain unclear. Up to now, no mechanistic model is available which explains the mechanisms of interparticulate hydrogel swelling and its contribution to soil-water interactions and soil structural stability. This mainly results from the lack of appropriate testing methods to study hydrogel swelling in soil as well as from the difficulties of adapting available methods to the soil/hydrogel system.
In this thesis, 1H nuclear magnetic resonance (NMR) relaxometry was combined with various soil micro- and macrostructural stability testing methods in order to identify the contribution of hydrogel swelling-induced soil-water interactions to the structural stability of water-saturated and unsaturated soils. In the first part, the potentials and limitations of 1H NMR relaxometry for elucidating soil structural stabilization mechanisms and various water populations were investigated. In the second part, 1H NMR relaxometry was combined with rheological measurements of soil to assess, in an isolated manner, the contribution of interparticulate hydrogel swelling and various polymer-clay interactions to soil-water interactions and soil structural stability. Finally, the effects of various organic and mineral soil fractions on soil-water interactions and soil structural stability were assessed in more detail for a natural, agriculturally cultivated soil by soil density fractionation and on the basis of the experience gained from the previous experiments.
The increasing experimental complexity in the course of this thesis made it possible to link physico-chemical properties of interparticulate hydrogel structures with soil structural stability on various scales. The established mechanistic model explains the contribution of interparticulate hydrogels to the structural stability of water-saturated and unsaturated soils: while swollen clay particles reduce soil structural stability by acting as a lubricant between soil particles, interparticulate hydrogel structures increase soil structural stability by forming a flexible polymeric network which interconnects mineral particles more effectively than soil pore or capillary water. It was apparent that soil structural stability increases with increasing viscosity of the interparticulate hydrogel, depending on incubation time, soil texture, soil solution composition and external factors such as moisture dynamics and agricultural management practices. The stabilizing effect of interparticulate hydrogel structures further increases in the presence of clay particles, which is attributed to additional polymer-clay interactions and the incorporation of clay particles into the three-dimensional interparticulate hydrogel network. Furthermore, the simultaneous swelling of clay particles and hydrogel structures results in competition for water and thus in a mutual restriction of their swelling in the interparticle space. Thus, polymer-clay interactions not only increase the viscosity of the interparticulate hydrogel, and thus its ability to stabilize soil structures, but also reduce the swelling of clay particles and consequently their negative effects on soil structural stability. The knowledge of these underlying mechanisms enhances our understanding of the formation of stable soil structures and enables appropriate management practices to maintain a sustainable soil structure.
The additionally outlined limitations and challenges of the mechanistic model highlight areas with potential for optimization and further research.
On-screen interactive presentations have recently gained immense popularity in the domain of attentive interfaces. These attentive screens adapt their behavior according to the user's visual attention. This thesis aims to introduce an application that enables such attentive interfaces to change their behavior not just according to gaze data but also to facial features and expressions. The modern era requires new ways of communication and publication for advertisement. Ads need to be targeted more specifically to people's interests, age, and gender. When advertising, it is important to get a reaction from the user, but not every user is interested in providing feedback. In such a context, more advanced techniques are required to collect users' feedback effortlessly. The main problem this thesis intends to resolve is to apply advanced techniques of gaze and face recognition to collect data about users' reactions towards different ads being played on interactive screens. We aim to create an application that enables attentive screens to detect a person's facial features, expressions, and eye gaze. With eye gaze data we can determine interests, and with facial features, age and gender can be estimated. All this information will help in optimizing the advertisements.
Studies have shown that wastewater treatment plant (WWTP) effluents are the major pathways of organic and inorganic chemicals of anthropogenic use (=micropollutants) into aquatic environments. There, micropollutants can be transferred to ground water bodies - and may finally end up in drinking water - or cause various effects in aquatic organisms like multiple resistances of bacteria. Hence, the upgrading of WWTPs with the aim to reduce the load of those micropollutants is currently under discussion.
Therefore, the primary objective of this thesis was to assess ecotoxicological effects of wastewater ozonation, a tertiary treatment method, using specifically developed toxicity tests with Gammarus fossarum (Koch) at various levels of ecological complexity. Several studies were designed in the laboratory and under semi-field conditions to address this objective. Prior to the investigations with ozone-treated wastewater, the ecotoxicity of secondary treated (=non-ozone-treated) wastewater from the WWTP Wüeri, Switzerland, for the test species was assessed in a four-week experiment. This experiment displayed statistically significant impairments in feeding, assimilation and physiological endpoints related to population development and reproduction. The first experiment investigating ecotoxicological implications of ozone application in wastewater from the same WWTP displayed a preference of G. fossarum for leaf discs conditioned in ozone-treated wastewater when offered together with leaf discs conditioned in non-ozone-treated wastewater. This effect seems to be mainly driven by an alteration in the leaf-associated microbial community. Another series of laboratory experiments, also conducted with wastewater from the WWTP Wüeri treated with ozone at the lab or full scale, revealed significantly increased feeding rates of G. fossarum exposed to ozone-treated wastewater compared to non-ozone-treated wastewater. These laboratory experiments also indicated that any alteration in the organic matrix potentially caused by ozone treatment is not related to the effects on feeding, as this endpoint deviated only negligibly between secondary treated wastewater containing hardly any (micro)pollutants (e.g. pharmaceuticals) and the same wastewater additionally treated with ozone. Moreover, it was shown that shifts in the dissolved organic carbon (DOC) profile do not affect the feeding rate of gammarids.
In situ bioassays conducted in the receiving stream of the WWTP Wüeri confirmed the results of the laboratory experiments by displaying significantly reduced feeding rates of G. fossarum exposed below the WWTP effluent if non-ozone treated wastewater was released. However, at the time the ozonation was operating, no adverse effects in feeding rates were observed below the effluent compared to the unaffected upstream sites. Also population studies in on-site flow-through stream microcosms displayed an increased feeding and a statistically significantly higher population size after ten weeks when exposed to ozone treated wastewater compared to non-ozone treated wastewater.
In conclusion, the present thesis documents that ozonation might be a suitable tool to reduce both the load of micropollutants and the ecotoxicity of wastewaters. Thus, this technology may help to meet the requirements of the Water Framework Directive also under predicted climate change scenarios, which may lead to elevated proportions of wastewater in the receiving stream during summer discharge. However, as ozone application may also produce by-products with a higher toxicity than their parent compounds, the implementation of this technique should be assessed further, both via chemical analysis and via ecotoxicological bioassays.
The adoption of the EU Water Framework Directive (WFD) in 2000 marked the beginning of a new era of European water policy. However, more than a decade later, the majority of European rivers are still failing to meet one of the main objectives of the WFD: the good ecological status. Pesticides are a major stressor for stream ecosystems. This PhD thesis emphasises the need for WFD managers to consider all main agricultural pesticide sources and influencing landscape parameters when setting up River Basin Management Plans and Programmes of Measures. The findings and recommendations of this thesis can help to successfully tackle the risk of pesticide contamination to achieve the WFD objectives.
A total of 663 sites that were situated in the German Federal States of Saxony, Saxony-Anhalt, Thuringia and Hesse were studied (Chapters 3 and 4). In addition to an analysis of the macroinvertebrate data of the governmental WFD monitoring network, a detailed GIS analysis of the main agricultural pesticide sources (arable land and garden allotments as well as wastewater treatment plants (WWTPs)) and landscape elements (riparian buffer strips and forested upstream reaches) was conducted. Based on the results, a screening approach was developed that allows an initial rapid and cost-effective identification of those sites that are potentially affected by pesticide contamination. By using the trait-based bioindicator SPEARpesticides, the insecticidal long-term effects of the WWTP effluents on the structure of the macroinvertebrate community were identified up to at least 1.5 km downstream (in some cases even 3 km) of the WWTPs. The results of the German Saprobic Index revealed that the WWTPs can still be important sources of oxygen-depleting substances. Furthermore, the results indicate that forested upstream reaches and riparian buffer strips at least 5 m in width can be appropriate measures in mitigating the effects and exposure of pesticides.
There are concerns that the future expansion of energy crop cultivation will lead to an increased pesticide contamination of ecosystems in agricultural landscapes. Therefore, the potential of energy crops for pesticide contamination was examined based on an analysis of the development of energy crop cultivation in Germany and a literature search on perennial energy crops (Chapter 5). The results indicate that the future large-scale expansion of energy crop cultivation will not necessarily cause an increase or decrease in the amounts of pesticides that are released into the environment. The potential effects will depend on the future design of the agricultural systems. Instead of creating energy monocultures, annual energy crops should be integrated into the existing food production systems. Financial incentives and further education are needed to encourage the use of sustainable crop rotations, innovative cropping systems and perennial energy crops, which may contribute to crop diversity and generate lower pesticide demands than do intensive farming systems.
Concept for a Knowledge Base on ICT for Governance and Policy Modelling regarding eGovPoliNet
(2013)
Abstract: The EU project eGovPoliNet is engaged in research and development in the field of information and communication technologies (ICT) for governance and policy modelling. Numerous communities pursue similar goals in this field of IT-based strategic decision making and simulation of social problem areas. However, the existing research approaches and results are so far quite fragmented. The aim of eGovPoliNet is to overcome this fragmentation across disciplines and to establish an international, open dialogue by fostering the cooperation between research and practice. This dialogue will advance the discussion and development of various problem areas with the help of researchers from different disciplines, who share knowledge, expertise and best practice supporting policy analysis, modelling and governance. To support this dialogue, eGovPoliNet will provide a knowledge base, whose conceptual development is the subject of this thesis. The knowledge base is to be filled with content from the area of ICT for strategic decision making and social simulation, such as publications, ICT solutions and project descriptions. This content needs to be structured, organised and managed in a way that generates added value, so that the knowledge base is used as a source of accumulated knowledge which consolidates the previously fragmented research and development results in a central location.
The aim of this thesis is the development of a concept for a knowledge base which provides the structure and the necessary functionalities to gather and process knowledge concerning ICT solutions for governance and policy modelling. This knowledge needs to be made available to users, thereby motivating them to contribute to the development and maintenance of the knowledge base.
Software systems have an increasing impact on our daily lives. Many systems process sensitive data or control critical infrastructure, so providing secure software is essential. Such systems are rarely renewed regularly due to the high costs and effort involved. Oftentimes, systems that were planned and implemented to be secure become insecure because their context evolves. These systems are connected to the Internet and therefore constantly subject to new types of attacks. The security requirements of these systems remain unchanged while, for example, the discovery of a vulnerability in an encryption algorithm previously assumed to be secure requires a change of the system design. Some security requirements cannot be checked by the system's design but only at run time. Furthermore, the sudden discovery of a security violation requires an immediate reaction to prevent a system shutdown. Knowledge regarding security best practices, attacks, and mitigations is generally available, yet it is rarely an integrated part of software development and rarely covers evolution.
This thesis examines how the security of long-living software systems can be preserved taking into account the influence of context evolutions. The goal of the proposed approach, S²EC²O, is to recover the security of model-based software systems using co-evolution.
An ontology-based knowledge base is introduced, capable of managing common as well as system-specific knowledge relevant to security. A transformation connects the knowledge base to the UML system model. Context knowledge evolutions are detected by means of semantic differences, knowledge inference, and the detection of inconsistencies in the knowledge base.
A catalog containing rules to manage and recover security requirements uses detected context evolutions to propose potential co-evolutions to the system model which reestablish the compliance with security requirements.
S²EC²O uses security annotations to link models and executable code and provides support for run-time monitoring. The adaptation of running systems is being considered as is round-trip engineering, which integrates insights from the run time into the system model.
S²EC²O is amended by prototypical tool support. This tool is used to show S²EC²O’s applicability based on a case study targeting the medical information system iTrust.
The thesis at hand contributes to the development and maintenance of long-living software systems with regard to their security. The proposed approach will aid security experts: it detects security-relevant changes to the system context, determines the impact on the system's security and facilitates co-evolutions to recover compliance with the security requirements.
Belief revision is the subarea of knowledge representation which studies the dynamics of the epistemic states of an agent. In the classical AGM approach, contraction, as part of belief revision, deals with the removal of beliefs from knowledge bases. This master's thesis presents the study and implementation of concept contraction in the Description Logic EL. Concept contraction deals with the following situation: given two concepts C and D, where C is subsumed by D, how can C be changed so that it is no longer subsumed by D while remaining as similar as possible to the original C? This approach to belief change differs from related work because it deals with contraction at the level of concepts rather than TBoxes and ABoxes in general. The main contribution of the thesis is the implementation of concept contraction. The implementation provides insight into the complexity of contraction in EL, which is tractable since the main inference task in EL is also tractable. The implementation consists of the design of five algorithms that are necessary for concept contraction. The algorithms are described, illustrated with examples, and analyzed in terms of time complexity. Furthermore, we propose a new approach for a selection function adapted to concept contraction. The selection function uses metadata about the concepts in order to select the best ones from an input set. The metadata is modeled in a framework that we have designed, based on standard metadata frameworks. As an important part of concept contraction, the selection function is responsible for selecting the best concepts that are as similar as possible to the original concept C. Lastly, we have successfully implemented concept contraction in Python, and the results are promising.
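To illustrate why subsumption, and hence contraction, is cheap in a restricted setting, the following toy sketch models purely conjunctive EL concepts as sets of atomic concept names. The functions `is_subsumed_by` and `contract`, and the strategy of dropping a single shared conjunct, are hypothetical simplifications for illustration, not the five algorithms of the thesis (existential restrictions are omitted entirely).

```python
def is_subsumed_by(c, d):
    """C ⊑ D for conjunctions of atomic concepts: every
    conjunct of D must appear in C."""
    return d <= c

def contract(c, d):
    """Drop one conjunct shared with D so that C is no longer
    subsumed by D, keeping C as similar as possible."""
    if not is_subsumed_by(c, d) or not d:
        return c
    # removing any single conjunct of D already breaks the subsumption
    victim = next(iter(sorted(d)))
    return c - {victim}

C = frozenset({"Person", "Employee", "Manager"})
D = frozenset({"Person", "Employee"})
C2 = contract(C, D)
print(is_subsumed_by(C, D), is_subsumed_by(C2, D))
```

In this fragment both subsumption and contraction are simple set operations, mirroring the tractability claim for EL.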
Within the field of Business Process Management, business rules are commonly used to model company decision logic and govern allowed company behavior. An exemplary business rule in the financial sector could be: "A customer with a mental condition is not creditworthy". Business rules are usually created and maintained collaboratively and over time. In this setting, modelling errors can occur frequently. A challenging problem in this context is that of inconsistency, i.e., contradictory rules which cannot hold at the same time. For instance, regarding the exemplary rule above, an inconsistency would arise if a (second) modeller entered an additional rule: "A customer with a mental condition is always creditworthy", as the two rules cannot hold at the same time. In this thesis, we investigate how to handle such inconsistencies in business rule bases. In particular, we develop methods and techniques for the detection, analysis and resolution of inconsistencies in business rule bases.
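The kind of contradiction described above can be sketched minimally under the strong simplifying assumption that rules are condition/conclusion pairs and that contradiction means a literal versus its negation. The encoding and the function `find_inconsistent_pairs` are hypothetical illustrations, not the formalisms investigated in the thesis.

```python
def negation_of(literal):
    """Toy negation on string literals: 'x' <-> 'not x'."""
    return literal[4:] if literal.startswith("not ") else "not " + literal

def find_inconsistent_pairs(rules):
    """Return pairs of rules with identical conditions but
    contradictory conclusions, i.e. rules that cannot hold together."""
    pairs = []
    for i, (cond_a, concl_a) in enumerate(rules):
        for cond_b, concl_b in rules[i + 1:]:
            if cond_a == cond_b and concl_b == negation_of(concl_a):
                pairs.append(((cond_a, concl_a), (cond_b, concl_b)))
    return pairs

rules = [
    ("customer has mental condition", "not creditworthy"),
    ("customer has mental condition", "creditworthy"),
]
conflicts = find_inconsistent_pairs(rules)
print(conflicts)
```

A real rule base would need a proper logical language and consistency checking, but the pairwise pattern of detection is the same.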
Development of a Path-Following Control Method for a Model Vehicle with Semi-Trailer
(2009)
Besides the progressive automation of internal goods traffic, there is another important area to consider: the carriage of goods in selected external areas. The use of driverless trucks in logistics centers can improve economic efficiency. Such operation requires precise control procedures so that the trucks drive on predetermined paths. The general aim of this work is the adaptation and evaluation of a path-following control method for articulated vehicles. The differences in kinematic behavior between trucks with a one-axle trailer and semi-trailer vehicles are emphasized, and the characteristic kinematic properties of semi-trailers are considered for the adaptation of a control procedure that was initially designed for trucks with a one-axle trailer. The controller must work in forward and backward movement. The control process is integrated as a closed component into the control software of the model vehicle. To this end, the geometry of the model vehicle is specified and the possible special cases of the control process are identified. The work also documents the most relevant software components of the implemented control process.
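The kinematic difference between tractor and trailer headings that such a controller must account for can be sketched with the standard discrete-time kinematic model of a tractor with an on-axle hitched trailer. The symbol names, wheelbase `L`, hitch distance `d` and all parameter values below are illustrative assumptions, not taken from the thesis.

```python
import math

def step(state, v, delta, L=0.3, d=0.5, dt=0.05):
    """One Euler step of the tractor-trailer kinematics.
    state = (x, y, theta, psi): tractor position and heading theta,
    trailer heading psi; v is the speed, delta the steering angle."""
    x, y, theta, psi = state
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += v / L * math.tan(delta) * dt       # tractor turns
    psi += v / d * math.sin(theta - psi) * dt   # trailer heading lags behind
    return (x, y, theta, psi)

state = (0.0, 0.0, 0.0, 0.0)
for _ in range(100):
    state = step(state, v=0.2, delta=0.1)
print(state)
```

Driving forward with constant positive steering, the trailer heading `psi` follows the tractor heading `theta` with a lag; in reverse the same coupling becomes unstable, which is why backward path following is the hard case.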
Regarding the rapidly growing amount of data produced every year and the increasing acceptance of Enterprise 2.0, enterprises have to care more and more about the management of their data. Content created and stored in an uncoordinated manner can lead to data silos (Williams & Hardy 2011, p. 57), which result in long search times, inaccessible data and, in consequence, monetary losses. The "expanding digital universe" forces enterprises to develop new archiving solutions and records management policies (Gantz et al. 2007, p. 13). Enterprise Content Management (ECM) is the research field that deals with these challenges. It is placed in the scientific context of Enterprise Information Management. This thesis aims to find out to what extent current Enterprise Content Management Systems (ECMS) support these new requirements, especially concerning the archiving of Enterprise 2.0 data. For this purpose, three scenarios were created to evaluate two different kinds of ECMS (one open-source and one proprietary system) chosen on the basis of a short market research. The application of the scenarios reveals that the system vendors actually face the industry's concerns: both tools provide functionality for the archiving of data arising from online collaboration as well as business records management capabilities, but the integration of those topics is not, or only inconsistently, solved. At this point new questions arise, such as "Which data generated in an Enterprise 2.0 is worth being a record?", and should be examined in future research.
The diversity within amphibian communities in cultivated areas in Rwanda and within two selected, taxonomically challenging groups, the genera Ptychadena and Hyperolius, were investigated in this thesis. The amphibian community of an agricultural wetland near Butare in southern Rwanda comprised 15 anuran species. Rarefaction and jackknife analyses corroborated that the complete current species richness of the assemblage had been recorded, and the results of acoustic niche analysis suggested species saturation of the community. Surveys at many other Rwandan localities showed that the species recorded in Butare are widespread in cultivated and pristine wetlands. The species were readily distinguishable using morphological, bioacoustic, and molecular (DNA barcoding) features, but only eight of the 15 species could be assigned unambiguously to nominal species. The remaining seven represented undescribed or currently unrecognized taxa, including three species of Hyperolius, two Phrynobatrachus species, one Ptychadena species, and one species of Amietia. The diversity of the Ridged Frogs in Rwanda was investigated in two studies (Chapters III and IV). Three species of Ptychadena were recorded in wetlands in the catchment of the Nile. They can be distinguished by morphological characters (morphometrics and qualitative features) as well as by their advertisement calls and genetics. The Rwandan species of the P. mascareniensis group was shown to differ from the topotypic population as well as from other genetic lineages in sub-Saharan Africa and an old available name, P. nilotica, was resurrected from synonymy for this lineage. Two further Ptychadena species were identified among voucher specimens from Rwanda deposited in the collection of the RMCA, P. chrysogaster and P. uzungwensis. Morphologically they can be unambiguously distinguished from each other and the three other Rwandan species.
A key based on qualitative morphological characters was developed, which allows unequivocal identification of specimens of all species that have been recorded from Rwanda. DNA was isolated from a Rwandan voucher specimen of P. chrysogaster, and the genetic analysis corroborated the species' distinct status.
A species of Hyperolius collected in the Nyungwe National Park was compared to all other Rwandan species of the genus and to morphologically or genetically similar species from neighbouring countries. Its distinct taxonomic status was justified by morphological, bioacoustic, and molecular evidence and it was described as a new species, H. jackie. A species of the H. nasutus group collected at agricultural sites in Rwanda was described as a new species in the course of a revision of the species of the Hyperolius nasutus group. The group was shown to consist of 15 distinct species which can be distinguished from each other genetically, bioacoustically, and morphologically.
The aerial performance, i.e. parachuting, of the Disc-fingered Reed Frog, Hyperolius discodactylus, was described. It represents a novel observation of a behaviour that has been known from a number of Southeast Asian and Neotropical frog species. Parachuting frogs, including H. discodactylus, exhibit certain morphological characteristics and, while airborne, assume a distinct posture which is best-suited for maneuvering in the air. Another study on the species addressed the validity of the taxon H. alticola which had been considered either a synonym of H. discodactylus or a distinct species. Type material of both taxa was re-examined and the status of H. alticola reassessed using morphological data from historic and new collections, call recordings, and molecular data from animals collected on recent expeditions. A northern and a southern genetic clade were identified, a divide that is weakly supported by diverging morphology of the vouchers from the respective localities. No distinction in advertisement call features could be recovered to support this split and both genetic and morphological differences between the two geographic clades are marginal and not always congruent and more likely reflect population-level variation. Therefore it was concluded that H. alticola is not a valid taxon and should be treated as a synonym of H. discodactylus.
Business rules have become an important tool for ensuring compliance of business processes. However, collections of business rules can contain various conflicting elements, which can lead to a violation of the compliance that is to be achieved. These conflicting elements are therefore a kind of inconsistency, or quasi-inconsistency, in the business rule base. The aim of this thesis is to investigate how such quasi-inconsistencies in business rules can be detected and analyzed. To this end, we develop a comprehensive library which allows applying results from the scientific field of inconsistency measurement to business rule formalisms that are actually used in practice.
Tagging systems are intriguing dynamic systems in which users collaboratively index resources with so-called tags. In order to leverage the full potential of tagging systems, it is important to understand the relationship between the micro-level behavior of the individual users and the macro-level properties of the whole tagging system. In this thesis, we present the Epistemic Dynamic Model, which tries to bridge this gap between micro-level behavior and macro-level properties by developing a theory of tagging systems. The model is based on the assumption that the combined influence of the shared background knowledge of the users and the imitation of tag recommendations is sufficient to explain the emergence of the tag frequency distribution and the vocabulary growth in tagging systems. Both macro-level properties of tagging systems are closely related to the emergence of the shared community vocabulary.

With the help of the Epistemic Dynamic Model, we show that the general shape of the tag frequency distribution and of the vocabulary growth has its origin in the shared background knowledge of the users. Tag recommendations can then be used to selectively influence this general shape. In this thesis, we concentrate especially on studying the influence of recommending a set of popular tags. Recommending popular tags adds a feedback mechanism between the vocabularies of individual users that increases the inter-indexer consistency of the tag assignments. How does this influence the indexing quality in a tagging system? For this purpose, we investigate a methodology for measuring the inter-resource consistency of tag assignments. The inter-resource consistency is an indicator of indexing quality which positively correlates with the precision and recall of query results. It measures the degree to which the tag vectors of indexed resources reflect how users perceive the similarity between resources.
Based on our model, and supported by a user experiment, we show that recommending popular tags decreases the inter-resource consistency in a tagging system. Furthermore, we show that recommending a user's own previously used tags helps to increase the inter-resource consistency. Our measure of inter-resource consistency complements existing measures for the evaluation and comparison of tag recommendation algorithms, moving the focus to evaluating their influence on the indexing quality.
Part-of-speech tagging is the process of assigning words with similar grammatical properties to a part of speech (PoS). For the English language, PoS-tagging algorithms generally reach very high accuracy. This thesis uses these accuracies as a benchmark to assess the classification capabilities of a recently developed neural network model, the graph convolutional network (GCN). The novelty proposed in this thesis is to translate a corpus into a graph that serves as direct input to the GCN. The experiments in this thesis serve as a proof of concept with room for improvement.
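One simple way to translate a corpus into a graph, sketched here under the assumption of a co-occurrence-based construction (the thesis' exact construction may differ), is to connect words that appear within a sliding window:

```python
from collections import defaultdict

# Build a word co-occurrence graph: words become nodes, and an
# undirected edge connects words that co-occur within a fixed window.
# Such an adjacency structure can then be turned into the adjacency
# matrix a GCN consumes.

def corpus_to_graph(sentences, window=2):
    adjacency = defaultdict(set)
    for sentence in sentences:
        tokens = sentence.lower().split()
        for i, w in enumerate(tokens):
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if i != j:
                    adjacency[w].add(tokens[j])
    return adjacency

graph = corpus_to_graph(["the dog barks", "the cat sleeps"])
# "the" is linked to both "dog" and "cat"; all edges are symmetric.
```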
Technical products have become more than practical tools to us. Mobile phones, for example, are a constant companion in daily life. Besides purely pragmatic tasks, they fulfill psychological needs such as relatedness, stimulation, competence, popularity, or security. Their potential for the mediation of positive experience makes interactive products a rich source of pleasure. Research acknowledged this: in parallel to the hedonic/utilitarian model in consumer research, Human-Computer Interaction (HCI) researchers broadened their focus from mere task-fulfillment (i.e., the pragmatic) to a holistic view, encompassing a product's ability for need-fulfillment and positive experience (i.e., the hedonic). Accordingly, many theoretical models of User Experience (UX) acknowledge both dimensions as equally important determinants of a product's appeal: pragmatic attributes (e.g., usability) as well as hedonic attributes (e.g., beauty). In choice situations, however, people often overemphasize the pragmatic, and fail to acknowledge the hedonic. This phenomenon may be explained by justification. Due to their need for justification, people attend to the justifiability of hedonic and pragmatic attributes rather than to their impact on experience. Given that pragmatic attributes directly contribute to task-fulfillment, they are far easier to justify than hedonic attributes. People may then choose the pragmatic over the hedonic, despite a true preference for the hedonic. This can be considered a dilemma, since people choose what is easy to justify and not what they enjoy the most. The present thesis presents a systematic exploration of the notion of a hedonic dilemma in the context of interactive products.
A first set of four studies explored the assumed phenomenon. Study 1 (N = 422) revealed a reluctance to pay for a hedonic attribute compared to a pragmatic attribute. Study 2 (N = 134) demonstrated that people (secretly) prefer a more hedonic product, but justify their choice by spurious pragmatic advantages. Study 3 (N = 118) confronted participants with a trade-off between hedonic and pragmatic quality. Even though the prospect of receiving a hedonic product was related to more positive affect, participants predominantly chose the pragmatic, especially those with a high need for justification. This correlation between product choice and perceived need for justification lent further support to the notion that justification lies at the heart of the dilemma. Study 4 (N = 125) explored affective consequences and justifications provided for hedonic and pragmatic choice. Data on positive affect suggested a true preference for the hedonic - even among those who chose the pragmatic product.
A second set of three studies tested different ways to reduce the dilemma by manipulating justification. Manipulations referred to the justifiability of attributes as well as the general need for justification. Study 5 (N = 129) enhanced the respective justifiability of hedonic and pragmatic choice by ambiguous product information, which could be interpreted according to latent preferences. As expected, enhanced justifiability led to an increase in hedonic but not in pragmatic choice. Study 6 (N = 178) manipulated the justifiability of hedonic choice through product information provided by a "test report", which suggested hedonic attributes as legitimate. Again, hedonic choice increased with increased justifiability. Study 7 (N = 133) reduced the general need for justification by framing a purchase as gratification. A significant positive effect of the gratification frame on purchase rates occurred for a hedonic but not for a pragmatic product.
Altogether, the present studies revealed a desire for hedonic attributes, even in interactive products, which often are still understood as purely pragmatic "tools". But precisely because of this predominance of pragmatic quality, people may hesitate to give in to their desire for hedonic quality in interactive products - at least, as long as they feel a need for justification. The present findings provide an enhanced understanding of the complex consequences of hedonic and pragmatic attributes, and indicate a general necessity to expand the scope of User Experience research to the moment of product choice. Limitations of the present studies, implications for future research as well as practical implications for design and marketing are discussed.
The present work investigates the wetting characteristics of soils with regard to their dependence on environmental parameters such as water content (WC), pH, drying temperature and wetting temperature of wettable and repellent soils from two contrasting anthropogenic sites, the former sewage disposal field Berlin-Buch and the inner-city park Berlin-Tiergarten. The aim of this thesis is to deepen the understanding of the processes and mechanisms leading to changes in soil water repellency (SWR). This helps to gain further insight into the behaviour of soil organic matter (SOM) and to identify ways to prevent or reduce the negative effects of SWR. The first focus of this work is to determine whether chemical reactions are required for wetting repellent samples. This hypothesis was tested by analyzing the time and temperature dependence of sessile drop spreading on wettable and repellent samples. Additionally, diffuse reflectance infrared Fourier transform (DRIFT) spectroscopy was used to determine whether various drying regimes cause changes in the relative abundance of hydrophobic and hydrophilic functional groups in the outer layer of soil particles and whether these changes can be correlated with water content and the degree of SWR. Finally, by artificially altering the pH in dried samples through acidic and alkaline reagents applied in a gaseous state, the influence of pH alone on the degree of SWR was investigated separately from the influence of changes in moisture status. The investigation of the two locations, Buch and Tiergarten, which differ markedly in their respective wetting properties, leads to new insights into the varied manifestations of SWR.
The results of the temperature, water content and pH dependency of SWR at the two contrasting sites led to a hypothetical model of the nature of repellency for each site, which provides an explanation for most of the observations made in this and earlier studies: At the Tiergarten site, wetting characteristics are most likely determined by a micelle-like arrangement of amphiphiles, which depends on the concentration of water-soluble amphiphilic substances, pH and ionic strength in the soil solution. At low pH and at high ionic strength, repulsion forces between hydrophilic charged groups are minimized, allowing their aggregation with outward-oriented hydrophobic molecule moieties. At high pH and low ionic strength, higher repulsion forces between hydrophilic functional groups lead to an aggregation of hydrophobic groups during drying, which results in a layer with outward-oriented hydrophilic moieties on the soil organic matter surface, leading to enhanced wettability. For samples from the Buch site, chemical reactions are necessary for the wetting process. The strong dependence of SWR on water content indicates that hydrolysis-condensation reactions are the controlling mechanisms. Since acid-catalyzed hydrolysis is an equilibrium reaction dependent on water content, an excess of water favours hydrolysis, leading to an increasing number of hydrophilic functional groups. In contrast, water deficiency favours condensation reactions, leading to a reduction of hydrophilic functional groups and thus a reduction of wettability. The results of the present investigation and their comparison with earlier investigations clearly show that SWR is subject to numerous antagonistically and synergistically interacting environmental factors.
The degree of influence, which a single factor exerts on SWR, is site-specific, e.g., it is dependent on special characteristics of mineral constituents and SOM which underlies the influence of climate, soil texture, topography, vegetation and the former and current use of the respective site.
Navigation is a natural way to explore and discover content in a digital environment. Hence, providers of online information systems such as Wikipedia---a free online encyclopedia---are interested in providing navigational support to their users. To this end, an essential task approached in this thesis is the analysis and modeling of navigational user behavior in information networks with the goal of paving the way for the improvement and maintenance of web-based systems. Using large-scale log data from Wikipedia, this thesis first studies information access by contrasting search and navigation as the two main information access paradigms on the Web. Second, this thesis validates and builds upon existing navigational hypotheses to introduce an adaptation of the well-known PageRank algorithm. This adaptation is an improvement of the standard PageRank random surfer navigation model that results in a more "reasonable surfer" by accounting for the visual position of links, the information network regions they lead to, and the textual similarity between the link source and target articles. Finally, using agent-based simulations, this thesis compares user models that have a different knowledge of the network topology in order to investigate the amount and type of network topological information needed for efficient navigation. An evaluation of agents' success on four different networks reveals that in order to navigate efficiently, users require only a small amount of high-quality knowledge of the network topology. Aside from the direct benefits to content ranking provided by the "reasonable surfer" version of PageRank, the empirical insights presented in this thesis may also have an impact on system design decisions and Wikipedia editor guidelines, i.e., for link placement and webpage layout.
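The "reasonable surfer" idea of weighting links, e.g. by visual position or textual similarity, can be sketched as a weighted variant of the PageRank power iteration (the link weights below are invented for illustration):

```python
# Weighted PageRank sketch: instead of following out-links uniformly,
# the surfer follows each link with probability proportional to a
# link-specific weight (e.g. derived from link position or text
# similarity between source and target articles).

def weighted_pagerank(links, damping=0.85, iterations=50):
    """links: {source: {target: weight}}; returns a rank per node."""
    nodes = set(links)
    for targets in links.values():
        nodes.update(targets)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1 - damping) / len(nodes) for n in nodes}
        for src, targets in links.items():
            total = sum(targets.values())
            for tgt, w in targets.items():
                new_rank[tgt] += damping * rank[src] * w / total
        rank = new_rank
    return rank

# A follows its link to B three times as often as the one to C.
links = {"A": {"B": 3.0, "C": 1.0}, "B": {"A": 1.0}, "C": {"A": 1.0}}
rank = weighted_pagerank(links)
# B receives 3/4 of A's endorsement and therefore outranks C.
```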
The Web contains some extremely valuable information; however, often poor quality, inaccurate, irrelevant or fraudulent information can also be found. With the increasing amount of data available, it is becoming more and more difficult to distinguish truth from speculation on the Web. One of the most, if not the most, important criteria used to evaluate data credibility is the information source, i.e., the data origin. Trust in the information source is a valuable currency users have to evaluate such data. Data popularity, recency (or the time of validity), reliability, or vagueness ascribed to the data may also help users to judge the validity and appropriateness of information sources. We call this knowledge derived from the data the provenance of the data. Provenance is an important aspect of the Web. It is essential in identifying the suitability, veracity, and reliability of information, and in deciding whether information is to be trusted, reused, or even integrated with other information sources. Therefore, models and frameworks for representing, managing, and using provenance in the realm of Semantic Web technologies and applications are critically required. This thesis highlights the benefits of the use of provenance in different Web applications and scenarios. In particular, it presents management frameworks for querying and reasoning in the Semantic Web with provenance, as well as a collection of Semantic Web tools that explore provenance information when ranking and updating caches of Web data. To begin, this thesis discusses a highly flexible and generic approach to the treatment of provenance when querying RDF datasets. The approach re-uses existing RDF modeling possibilities in order to represent provenance. It extends SPARQL query processing in such a way that, given a SPARQL query for data, one may request provenance without modifying the query.
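The flavor of provenance-aware querying can be illustrated with a toy quad store, where each triple carries the named graph it originates from (the data and the API here are assumptions of this sketch, not the thesis' actual SPARQL extension):

```python
# Each entry is a quad: (subject, predicate, object, source graph).
quads = [
    ("koblenz", "locatedIn", "germany", "http://example.org/dbpedia"),
    ("koblenz", "population", "114000", "http://example.org/wikidata"),
    ("germany", "partOf", "eu", "http://example.org/dbpedia"),
]

def query_with_provenance(subject=None, predicate=None, obj=None):
    """Return (triple, source) pairs for every quad matching the
    pattern; None acts as a wildcard, as in a SPARQL triple pattern."""
    results = []
    for s, p, o, source in quads:
        if (subject is None or s == subject) and \
           (predicate is None or p == predicate) and \
           (obj is None or o == obj):
            results.append(((s, p, o), source))
    return results

for triple, source in query_with_provenance(subject="koblenz"):
    print(triple, "from", source)
```

In SPARQL proper, the analogous effect is obtained via named graphs and the `GRAPH` keyword; the point of the sketch is only that an answer can be returned together with its origin.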
The use of provenance within SPARQL queries helps users to understand how RDF facts are derived, i.e., it describes the data and the operations used to produce the derived facts. Turning to more expressive Semantic Web data models, an optimized algorithm for reasoning and debugging OWL ontologies with provenance is presented. Typical reasoning tasks over an expressive Description Logic (e.g., using tableau methods to perform consistency checking, instance checking, satisfiability checking, and so on) are in the worst case doubly exponential, and in practice are often likewise very expensive. With the algorithm described in this thesis, however, one can efficiently reason in OWL ontologies with provenance, i.e., provenance is efficiently combined and propagated within the reasoning process. Users can use the derived provenance information to judge the reliability of inferences and to find errors in the ontology. Next, this thesis tackles the problem of providing to Web users the right content at the right time. The challenge is to efficiently rank a stream of messages based on user preferences. Provenance is used to represent preferences, i.e., the user defines his preferences over the messages' popularity, recency, etc. This information is then aggregated to obtain a joint ranking. The aggregation problem is related to the problem of preference aggregation in Social Choice Theory. The traditional problem formulation of preference aggregation assumes a fixed set of preference orders and a fixed set of domain elements (e.g. messages). This work, however, investigates how an aggregated preference order has to be updated when the domain is dynamic, i.e., the aggregation approach ranks messages 'on the fly' as the message passes through the system. Consequently, this thesis presents computational approaches for online preference aggregation that handle the dynamic setting more efficiently than standard ones.
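A drastically simplified "on the fly" aggregation in the spirit of Borda-style scoring might look as follows (the criteria and the incremental data structure are illustrative assumptions; the thesis' algorithms are more refined):

```python
import bisect

# Each criterion (e.g. popularity, recency) scores an incoming message;
# the aggregate order is maintained incrementally as messages stream in,
# instead of re-sorting the whole domain on every arrival.

class OnlineAggregator:
    def __init__(self, criteria):
        self.criteria = criteria  # list of scoring functions
        self.ranked = []          # sorted list of (-score, id, message)

    def add(self, message):
        score = sum(c(message) for c in self.criteria)
        # Negative score so that the best message sorts first; the id
        # breaks ties without ever comparing the message dicts.
        bisect.insort(self.ranked, (-score, message["id"], message))
        return score

    def top(self, k):
        return [m for _, _, m in self.ranked[:k]]

agg = OnlineAggregator([lambda m: m["popularity"], lambda m: m["recency"]])
agg.add({"id": "a", "popularity": 5, "recency": 1})
agg.add({"id": "b", "popularity": 2, "recency": 9})
agg.add({"id": "c", "popularity": 1, "recency": 1})
# top(2) now yields b (score 11) before a (score 6).
```

Insertion via `bisect.insort` costs O(n) per message; the point is merely that the aggregate order is updated per arrival rather than recomputed from scratch.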
Lastly, this thesis addresses the scenario of caching data from the Linked Open Data (LOD) cloud. Data on the LOD cloud changes frequently and applications relying on that data - by pre-fetching data from the Web and storing local copies of it in a cache - need to continually update their caches. In order to make best use of the resources (e.g., network bandwidth for fetching data, and computation time) available, it is vital to choose a good strategy to know when to fetch data from which data source. A strategy to cope with data changes is to check for provenance. Provenance information delivered by LOD sources can denote when the resource on the Web has been changed last. Linked Data applications can benefit from this piece of information since simply checking on it may help users decide which sources need to be updated. For this purpose, this work describes an investigation of the availability and reliability of provenance information in the Linked Data sources. Another strategy for capturing data changes is to exploit provenance in a time-dependent function. Such a function should measure the frequency of the changes of LOD sources. This work describes, therefore, an approach to the analysis of data dynamics, i.e., the analysis of the change behavior of Linked Data sources over time, followed by the investigation of different scheduling update strategies to keep local LOD caches up-to-date. This thesis aims to prove the importance and benefits of the use of provenance in different Web applications and scenarios. The flexibility of the approaches presented, combined with their high scalability, make this thesis a possible building block for the Semantic Web proof layer cake - the layer of provenance knowledge.
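The change-rate-based update scheduling can be sketched as follows, with invented change histories standing in for observations of real LOD sources:

```python
# Estimate each source's change rate from its observed change history
# and spend a limited fetch budget on the most dynamic sources first.
# Source names and change timestamps are invented for illustration.

def change_rate(change_timestamps, observation_span):
    """Observed changes per unit of time."""
    return len(change_timestamps) / observation_span

def schedule_updates(sources, budget):
    """sources: {name: (timestamps, span)}; returns the names to fetch,
    most frequently changing first, up to the fetch budget."""
    rated = sorted(sources, key=lambda name: change_rate(*sources[name]),
                   reverse=True)
    return rated[:budget]

sources = {
    "dbpedia":  ([1, 5, 9, 13, 17], 20),  # changes often
    "geonames": ([4, 18], 20),            # changes rarely
    "lexvo":    ([10], 20),               # almost static
}
print(schedule_updates(sources, budget=2))  # → ['dbpedia', 'geonames']
```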
Sediment transport contributes to the movement of inorganic and organic material in rivers. The construction of a dam interrupts the continuity of this sediment transport through rivers, causing sediments to accumulate within the reservoir. Reservoirs can also act as carbon sinks and methane can be released when organic matter in the sediment is degraded under anoxic conditions. Reservoir sedimentation poses a great threat to the sustainability of reservoirs worldwide, and can emit the potent greenhouse gas methane into the atmosphere. Sediment management measures to rehabilitate silted reservoirs are required to achieve both better water quantity and quality, as well as to mitigate greenhouse gas emissions.
This thesis aims at the improvement of sediment sampling techniques to characterize sediment deposits as a basis for accurate and efficient water jet dredging, and to monitor the dredging efficiency by measuring the sediment concentration. To achieve this, we investigated freeze coring as a method to sample (gas-bearing) sediment in situ. The freeze cores obtained from three reservoirs were scanned using a non-destructive X-ray CT scanning technique. This allows the determination of sediment stratification and the characterization of gas bubbles to quantify methane emissions, and serves as a basis for the identification of specific (i.e., contaminated) sediment layers to be dredged. The results demonstrate the capability of freeze coring as a method for the characterization of (gas-bearing) sediment that overcomes certain limitations of commonly used gravity cores. Even though the core's structure showed coring disturbances related to the freezing process, the general core integrity seems not to have been disturbed. For dredging purposes, we analyzed the impact pressure distribution and spray pattern of submerged cavitating water jets and determined the effects of impinging distances and angles, pump pressures and spray angles. We used an adapted pressure measurement sensing technique to enhance the spatial resolution, which proved to be a comparatively easy-to-use measurement method for an improved understanding of the governing factors of the erosional capacity of cavitating water jets. Based on this data, a multiple linear regression model can be used to predict the impact pressure distribution of those water jets to achieve higher dredging accuracy and efficiency. To determine the dredging operational efficiency, we developed a semi-continuous automated measurement device to measure the sediment concentration of the slurry.
This simple and robust device has lower costs compared to traditional and surrogate sediment concentration measurement technologies, and can be monitored and controlled remotely under a wide range of concentrations and grain sizes, unaffected by entrained gas bubbles.
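The kind of multiple linear regression mentioned above can be sketched generically via the normal equations; the predictor values and the "true" model below are invented for illustration, not the thesis' jet data:

```python
# Ordinary least squares via the normal equations (X^T X) b = X^T y,
# solved with Gaussian elimination; a column of ones is prepended for
# the intercept. Predictors could stand for e.g. impinging distance,
# angle, and pump pressure.

def fit_linear(X, y):
    rows = [[1.0] + list(r) for r in X]
    n = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    for col in range(n):                      # elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n                        # back substitution
    for r in range(n - 1, -1, -1):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, n))) / A[r][r]
    return coeffs

X = [(1, 0), (0, 1), (2, 1), (1, 3), (4, 2)]
y = [2 + 3 * x1 - x2 for x1, x2 in X]  # true model: 2 + 3*x1 - x2
coeffs = fit_linear(X, y)              # recovers [2.0, 3.0, -1.0]
```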
The term "Augmented Reality (AR)" denotes the superposition of additional virtual objects and supplementary information over real images. The joint project Enhanced Reality (ER) aims at a generic AR-system. The ER-project is a cooperation of six different research groups of the Department of Computer Science at the University of Koblenz-Landau. According to Ronald Azuma, an AR-system combines real and virtual environments, where the real and virtual objects are registered in 3-D, and it provides interactivity in real time [Azu97]. Enhanced Reality extends Augmented Reality by requiring the virtual objects to be seamlessly embedded into the real world as photo-realistic objects according to the exact lighting conditions. Furthermore, additional information supplying value-added services may be displayed, and interaction of the user may even be immersive. The short-term goal of the ER-project is the exploration of ER-fundamentals using some specific research scenarios; the long-term goal is the development of a component-based ER-framework for the creation of ER-applications for arbitrary application areas. ER-applications are developed as single-user applications for users who are moving in a real environment and are wearing some kind of visual output device like see-through glasses and some mobile end device. Through these devices the user is able to see reality as it is, but he can also see the virtual objects and the additional information about some value-added service. Furthermore, he might have additional devices whereby he can interact with the available virtual objects. The development of a generic framework for ER-applications requires the definition of generic components which are customizable and composable to build concrete applications, and it requires a homogeneous data model which supports all components equally well. The workgroup "Software Technology" is responsible for this subproject.
This report gives some preliminary results concerning the derivation of a component-based view of ER. There are several augmented reality frameworks like ARVIKA, AMIRE, DWARF, MORGAN, Studierstube and others which offer some support for the development of AR-applications. All of them ease the use of existing subsystems like AR-Toolkit, OpenGL and others, and speed up the development of realistic systems by making efficient use of those subsystems. Consequently, they rely heavily on them.
Wild bees are essential for the pollination of wild and cultivated plants. However, within the
last decades, the increasing intensification of modern agriculture has led to the reduction, fragmentation and degradation of the habitats wild bees need. The resulting loss of pollinators and their pollination services poses an immense challenge to global food production. To support wild bees, the availability of flowering resources is essential. However, the flowering period of each resource is temporally limited and has different effects on pollinators and their pollination, depending on the time of flowering.
Therefore, to efficiently promote and manage wild bee pollinators in agricultural landscapes, we identified species-specific key floral resources of three selected wild bee species and their spatial and temporal availability (CHAPTERS 2, 3 & 4). We examined which habitat types predominantly provide these resources (CHAPTERS 3 & 4). We also investigated whether floral resource maps based on the use of these key resources and their spatial and temporal availability explain the abundance and development of the selected wild bees (CHAPTERS 3 & 4) and pollination (CHAPTER 5) better than habitat maps, which only indirectly account for the availability of floral resources.
For each of the species studied, we were able to identify different key pollen sources, predominantly woody plants in the early season (April/May) and increasingly herbaceous plants in the later season (June/July; CHAPTERS 2, 3 & 4). The open woody semi-natural habitats of our agricultural landscapes provided about 75% of the floral resources for the buff-tailed bumblebees, 60% for the red mason bees, and 55% for the horned mason bees studied, although they accounted for only 3% of the area (CHAPTERS 3 & 4). In addition, fruit orchards provided about 35% of the floral resources for the horned mason bees on 4% of the landscape area (CHAPTER 3). We showed that both mason bee species benefited from the resource availability in the surrounding landscapes (CHAPTER 3). Yet this was not the case for the bumblebees (CHAPTER 4). Instead, the weight gain of their colonies, the number of developed queen cells and their colony survival were higher with increasing proximity to forests. The proximity to forests also had a positive effect on the mason bees studied (CHAPTER 3). In addition, the red mason bees benefited from herbaceous semi-natural habitats. The proportion of built-up areas had a negative effect on the horned mason bees, and the proportion of arable land on the red mason bees. The habitat maps explained horned mason bee abundances just as well as the floral resource maps did, but red mason bee abundances were distinctly better explained by key floral resources. The pollination of field bean increased with higher proportions of early floral resources, whereas synchronously available floral resources caused no measurable reduction in pollination (CHAPTER 5). Habitat maps also explained field bean pollination better than floral resource maps. Here, pollination increased with increasing proportions of built-up areas in the landscapes and decreased with increasing proportions of arable land.
Our results highlight the importance of the spatio-temporal availability of certain key species as resource plants for wild bees in agricultural landscapes. They show that habitat maps are ahead of, or at least equal to, spatio-temporally resolved floral resource maps in predicting wild bee development and pollination. Nevertheless, floral resource maps allow more accurate conclusions about the relationships between key floral resources and the organisms studied. The proximity to forest edges had a positive effect on each of the three wild bee species studied. However, besides pure food availability, other factors seem to co-determine the occurrence of wild bees in agricultural landscapes.
In this report, I describe the results of my investigations into extending the LogAnswer system with user-specific profile information. LogAnswer is a natural-language, open-domain question answering system; that is, it answers questions on arbitrary topics and returns concrete (ideally concise and correct) answers. The system is developed in a joint project of the Artificial Intelligence Research Group of Professor Ulrich Furbach at the University of Koblenz-Landau and the Intelligent Information and Communication Systems (IICS) group of Professor Hermann Helbig at the FernUniversität in Hagen. The motivation for my work was the idea that the answer-finding process can be optimized if the topic area a question addresses can be determined in advance. To this end, I attempted to determine users' areas of interest based on profile information. The Semantic Desktop system NEPOMUK was used to obtain this profile information. NEPOMUK is used to structure all data, documents and information a user has on his or her computer. For this purpose, it uses a so-called Personal Information Model (PIMO) in the form of an ontology. Among other things, this ontology contains a class "Topic", which formed the most important basis for creating the user profiles used in my work. Concretely, the RDF query language SPARQL was used to extract a list of all topics relevant to the user from the ontology. The central idea of my work was to use this profile information to optimize the ranking of answer candidates. For each question posed, LogAnswer extracts up to 200 potentially relevant passages from the German Wikipedia. These passages are ordered on the basis of features (such as lexical overlap between question and passage), because not all candidates can be processed within the available time limit.
My approach aimed to extend this algorithm with user profiles so that answer candidates containing information relevant to the user are ranked higher. To implement this idea, a method was needed to determine whether an answer candidate matches the profile. Since the information contained in a passage usually refers to the overarching topic of the article without explicitly mentioning the article's name, my implementation inspected the article name to determine which topic area a passage provides information about. In addition, the DBpedia ontology, which contains the information of Wikipedia in structured RDF format, was used as an auxiliary resource. With the help of this ontology, it was possible to assign each article to categories, which were then compared with the keywords contained in the profile. To study the effect of the approach on the ranking procedure, several test runs with 200 test questions each were conducted. The first test set consisted of randomly selected questions, tested against my own user profile. This run yielded hardly any usable results, since for only 29 of the tested questions could any answer candidate be associated with the profile at all. Moreover, a potential improvement of the results was observed for only one of these 29 questions, leading to the conclusion that the use of profile data is not suitable for use cases in which the questions show no correlation with the profile employed.
Since the basic assumption of my work was that users primarily ask questions about the areas of interest derivable from their profile, the further test runs were designed to examine exactly this case. For this purpose, 200 test questions from the domain of sports were selected and tested with a profile containing keywords on different kinds of sports. The tests with the sports questions were considerably more informative. Here, too, the results indicated that the approach holds little potential for improving the ranking. A closer look at selected examples showed, however, that integrating profile data can indeed improve the results for certain use cases, such as open questions with more than one correct answer. It was also found that many of the poor results stem from inconsistencies in the DBpedia ontology and from fundamental problems in dealing with natural-language knowledge bases.
The conclusion of my work is that the approach to integrating profile information presented here is not suitable for LogAnswer's current use case, since mainly factual knowledge from very diverse domains is queried and open questions account for only a small share.
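The profile-based re-ranking idea explored in that work can be sketched as follows (the categories, profile keywords, and boost value are invented examples):

```python
# A candidate passage is boosted when the categories of its source
# article overlap with the user's profile keywords; candidates are
# modeled as (score, article_categories) pairs.

def rerank(candidates, profile_keywords, boost=0.5):
    def adjusted(candidate):
        score, categories = candidate
        if set(categories) & set(profile_keywords):
            return score + boost
        return score
    return sorted(candidates, key=adjusted, reverse=True)

profile = {"football", "tennis"}
candidates = [
    (0.8, {"politics"}),
    (0.6, {"football", "sport"}),  # matches the profile → boosted to 1.1
]
result = rerank(candidates, profile)
# The profile-matching candidate now ranks first despite its lower
# base score.
```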
With the Multimedia Metadata Ontology (M3O), we have developed a sophisticated model for representing, among other things, the annotation, decomposition, and provenance of multimedia metadata. The goal of the M3O is to integrate the existing metadata standards and metadata formats rather than replacing them. To this end, the M3O provides the scaffold needed to represent multimedia metadata. Being an abstract model for multimedia metadata, it is not straightforward how to use and specialize the M3O for concrete application requirements and existing metadata formats and metadata standards. In this paper, we present a step-by-step alignment method describing how to integrate and leverage existing multimedia metadata standards and metadata formats in the M3O in order to use them in a concrete application. We demonstrate our approach by integrating three existing metadata models: the Core Ontology on Multimedia (COMM), which is a formalization of the multimedia metadata standard MPEG-7, the Ontology for Media Resource of the W3C, and the widely known industry standard EXIF for image metadata.
Expert-driven business process management is an established means for improving the efficiency of organizational knowledge work. Implicit procedural knowledge in the organization is made explicit by defining processes. This approach is not applicable to individual knowledge work due to its high complexity and variability. However, without explicitly described processes there is no analysis and efficient communication of best practices of individual knowledge work within the organization. In addition, the activities of individual knowledge work cannot be synchronized with the activities of organizational knowledge work. The solution to this problem is the semantic integration of individual knowledge work and organizational knowledge work by means of the pattern-based core ontology strukt. The ontology allows for defining and managing the dynamic tasks of individual knowledge work in a formal way and for synchronizing them with organizational business processes. Using the strukt ontology, we have implemented a prototype application for knowledge workers and have evaluated it in the use case of an architectural firm conducting construction projects.
The status of research on Business Process Management (BPM) recommender systems is not quite clear. Recommenders became widely known during the rise of technological evolution in the past decade, and several BPM recommender systems have emerged since then. However, little research has been conducted in this field: it is not well known how broad the range of technologies used is, nor how they are applied. This master's thesis therefore aims at surveying the existing BPM recommender systems. The recommendations these systems produce come in different shapes. They can be position-based, where an element is to be placed before or after another element, or where a missing link is auto-completed. On the other hand, recommendations can be textual, filling in the labels of the elements. The literature review of BPM recommender systems was conducted following a literature review framework that suggests five consecutive stages. The first stage is defining a scope for the research. The second is conceptualizing the topic by choosing key terms for the literature search. The third is the search stage itself. The fourth suggests choosing analysis features over which the literature is synthesized and compared. Finally, the framework recommends defining a research agenda that describes the reason for the literature review. Following this methodology, this master's thesis surveyed 18 BPM recommender systems. The survey found that there are not many different technologies for implementing the recommenders. It was also found that the majority of the recommenders suggest nodes that are yet to come in the model, which is called forward recommending. The survey also indicated the scarce use of textual recommendations for BPM labels. Finally, 18 recommenders are fewer than expected for a developing field, so the survey concludes that there is a shortage of BPM recommender systems. The results indicate shortages in several aspects of the field of BPM recommender systems; on this basis, this master's thesis recommends future work addressing them.
Recent EU frameworks enforce the implementation of risk mitigation measures for nonpoint-source pesticide pollution in surface waters. Vegetated surface flow treatment systems (VTS) can be a way to mitigate the risk of adverse effects in aquatic ecosystems following unavoidable pollution after rainfall-related runoff events. Studies in experimental wetland cells and vegetated ditch mesocosms with common fungicides, herbicides and insecticides were performed to assess the efficiency of VTS. Comprehensive monitoring of fungicide exposure after rainfall-related runoff events and of the reduction of pesticide concentrations within partially optimised VTS was performed from 2006 to 2009 at five vegetated detention ponds and two vegetated ditches in the wine-growing region of the Southern Palatinate (SW Germany).
The influence of plant density, size-related parameters and pesticide properties on the performance of the experimental devices and the monitored systems was the focus of the analysis. A spatial tool for the prediction of pesticide pollution of surface waters after rainfall-related runoff events was programmed in a geographic information system (GIS). A sophisticated, high-resolution database on the European scale was built for the simulation. With the results of the experiments, the monitoring campaign and further results of the EU Life project ArtWET, mitigation measures were implemented in a georeferenced spatial decision support system. The database for the GIS tools was built with open data. The REXTOX (ratio of exposure to toxicity) risk indicator proposed by the OECD (Organisation for Economic Co-operation and Development) was extended and used for modelling the risk of rainfall-related runoff exposure to pesticides for all agricultural waterbodies on the European scale. The results show good performance of the VTS. The vegetated ditches and wetland cells of the experimental systems showed a very high reduction, of more than 90%, of pesticide concentrations and potential adverse effects. Vegetated ditches and wetland cells performed significantly better than devices without vegetation. Plant density and sorptivity of the pesticide were the variables with the highest explanatory power regarding the response variable, reduction of concentrations. In the experimental vegetated ditches, 65% of the reduction of peak concentrations was explained by plant density and KOC. The monitoring campaign showed that concentrations of the fungicides and potential adverse effects of the mixtures were reduced significantly within the vegetated ditch (median 56%) and detention pond (median 38%) systems. Regression analysis with data from the monitoring campaign identified plant density and size-related properties as explanatory variables for mitigation efficiency (DP: R²=0.57, p<0.001; VD:
R²=0.19, p<0.001). The results of the risk model runs are the input for the second tool, which simulates three risk mitigation measures. VTS as risk mitigation measures are implemented using the results on plant density and size-related performance from the experimental and monitoring studies, supported by additional data from the ArtWET project. Based on the risk tool, simulations can be performed for single crops, selected regions, different pesticide compounds and rainfall events. Costs for the implementation of the mitigation measures are estimated. The experiments and monitoring, focusing on the whole range of pesticides, provide novel information on VTS for pesticide pollution. The monitoring campaign also shows that fungicide pollution may affect surface waters. The tools developed for this study are easy to use and are not only a good basis for further spatial analysis but also useful as decision support for the non-scientific community. On a large scale, the tools can, on the one hand, help to compute the external costs of pesticide use by simulating mitigation costs on three levels; on the other hand, feasible measures mitigating or remediating the effects of nonpoint-source pollution can be identified for implementation. Further study of the risk of adverse effects caused by fungicide pollution and of the long-term performance of optimised VTS is needed.
Colonoscopy is the gold standard for the detection of colorectal polyps that can progress into cancer. In such an examination, physicians search for polyps in endoscopic images so that detected polyps can be removed. To support experts with a computer-aided diagnosis system, the University of Koblenz-Landau is currently researching different methods for automatic detection. As in traditional pattern recognition systems, features are initially extracted and a classifier is trained on such data. Afterwards, unknown endoscopic images can be classified with the previously trained classifier. This thesis concentrates on the extension of the feature extraction module in the existing system. New detection methods are compared to existing techniques. Several features are implemented, incorporating Gray-Level Co-occurrence Matrices, Local Binary Patterns and the Discrete Wavelet Transform. Different modifications of those features are applied and evaluated.
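The texture features named above are standard tools from the pattern recognition literature. As a generic illustration (not code from the thesis), the basic 8-neighbor Local Binary Pattern operator can be sketched in a few lines: each interior pixel is encoded by which of its neighbors are at least as bright, and the histogram of these codes serves as a texture feature vector for the classifier:

```python
import numpy as np

def local_binary_pattern(img):
    """Basic 8-neighbor LBP: every interior pixel receives an 8-bit code
    recording which of its neighbors are at least as bright as it is."""
    # neighbor offsets, ordered clockwise starting at the top-left pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    center = img[1:-1, 1:-1]
    codes = np.zeros(center.shape, dtype=np.int32)
    for bit, (dy, dx) in enumerate(offsets):
        # shifted view of the image aligned with the interior pixels
        neighbor = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbor >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes -- the feature vector."""
    codes = local_binary_pattern(img)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

Practical systems (as in scikit-image) add rotation-invariant and "uniform" variants of the code, but the histogram-of-codes principle is the same.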
Magnetic resonance (MR) tomography is an imaging method that is used to expose the structure and function of tissues and organs in the human body for medical diagnosis. Diffusion-weighted (DW) imaging is a specific MR imaging technique which enables us to gain insight into the connectivity of white matter pathways noninvasively and in vivo. It allows for making predictions about the structure and integrity of those connections. In clinical routine this modality finds application in the planning phase of neurosurgical operations, such as tumor resections. This is especially helpful if the lesion is deeply seated in a functionally important area, where there is a risk of damage. This work reviews the concepts of MR imaging and DW imaging. Generally, at the current resolution of diffusion-weighted data, single white matter axons cannot be resolved; the captured signal rather describes whole fiber bundles. Besides this, different complex fiber configurations often occur in a single voxel, such as crossings, splittings and fannings. For this reason, the main goal is to assist tractography algorithms, which are often confounded in such complex regions. Tractography is a method which uses local information to reconstruct global connectivities, i.e. fiber tracts. In the course of this thesis, existing reconstruction methods such as diffusion tensor imaging (DTI) and q-ball imaging (QBI) are evaluated on synthetically generated data and real human brain data, and the amount of valuable information provided by the individual reconstruction methods and their corresponding limitations are investigated. The output of QBI is the orientation distribution function (ODF), whose local maxima coincide with the underlying fiber architecture. We determine those local maxima. Furthermore, we propose a new voxel-based classification scheme conducted on diffusion tensor metrics.
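Determining the local maxima of an ODF sampled on a set of unit directions can be sketched as follows. This is a generic illustration under stated assumptions, not the thesis' implementation: the angular neighborhood of 25° and the relative noise threshold of 0.5 are hypothetical parameter choices.

```python
import numpy as np

def odf_local_maxima(directions, odf_values, neighbor_angle_deg=25.0,
                     rel_threshold=0.5):
    """Find local maxima of an ODF sampled on unit directions.
    A direction is a maximum if its ODF value is at least that of every
    sampled direction within `neighbor_angle_deg`, and it lies above a
    relative threshold (to suppress spurious noise peaks)."""
    dirs = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    cos_thresh = np.cos(np.radians(neighbor_angle_deg))
    floor = rel_threshold * odf_values.max()
    maxima = []
    for i, v in enumerate(odf_values):
        if v < floor:
            continue
        # absolute value: ODFs are antipodally symmetric, psi(u) = psi(-u)
        cos_ang = np.abs(dirs @ dirs[i])
        neighbors = (cos_ang > cos_thresh) & (np.arange(len(dirs)) != i)
        if np.all(v >= odf_values[neighbors]):
            maxima.append(dirs[i])
    return np.array(maxima)
```

The detected maxima are exactly the candidate fiber directions that a subsequent classifier or tractography step would validate against neighborhood information.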
The main contribution of this work is the combination of voxel-based classification, local maxima from the ODF and global information from a voxel neighborhood, which leads to the development of a global classifier. This classifier validates the detected ODF maxima and enhances them with neighborhood information. Hence, specific asymmetric fibrous architectures can be determined. The outcome of the global classifier is a set of potential tracking directions. Subsequently, a fiber tractography algorithm is designed that integrates along the potential tracking directions and is able to reproduce splitting fiber tracts.
Systemic neonicotinoids are one of the most widely used insecticide classes worldwide. In addition to their use in agriculture, they are increasingly applied on forest trees as a protective measure against insect pests. However, senescent leaves containing neonicotinoids might, inter alia during autumn leaf fall, enter nearby streams. There, the hydrophilic neonicotinoids may be remobilized from leaves to water resulting in waterborne exposure of aquatic non-target organisms. Despite the insensitivity of the standard test species Daphnia magna (Crustacea, Cladocera) toward neonicotinoids, a potential risk for aquatic organisms is evident as many other aquatic invertebrates (in particular insects and amphipods) display adverse effects when exposed to neonicotinoids in the ng/L- to low µg/L-range. In addition to waterborne exposure, in particular leaf-shredding invertebrates (= shredders) might be adversely affected by the introduction of neonicotinoid-contaminated leaves into the aquatic environment since they heavily rely on leaf litter as food source. However, dietary neonicotinoid exposure of aquatic shredders has hardly received any attention from researchers and is not considered during aquatic environmental risk assessment. The primary aim of this thesis is, therefore, (1) to characterize foliar neonicotinoid residues and exposure pathways relevant for aquatic shredders, (2) to investigate ecotoxicological effects of waterborne and dietary exposure on two model shredders, namely Gammarus fossarum (Crustacea, Amphipoda) and Chaetopteryx villosa (Insecta, Trichoptera), and (3) to identify biotic and abiotic factors potentially modulating exposure under field conditions.
During the course of this thesis, ecotoxicologically relevant foliar residues of the neonicotinoids imidacloprid, thiacloprid and acetamiprid were quantified in black alder trees treated at field relevant levels. A worst-case model – developed to simulate imidacloprid water concentrations resulting from an input of contaminated leaves into a stream – predicted only low aqueous imidacloprid concentrations (i.e., ng/L-range). However, the model identified dietary uptake as an additional exposure pathway relevant for shredders up to a few days after the leaves’ introduction into the stream. When test organisms were simultaneously exposed (= combined exposure) to neonicotinoids leaching from leaves into the water and via the consumption of contaminated leaves, adverse effects exceeded those observed under waterborne exposure alone. When exposure pathways were separated using a flow-through system, dietary exposure towards thiacloprid-contaminated leaves caused similar sublethal adverse effects in G. fossarum as observed under waterborne exposure. Moreover, the effect sizes observed under combined exposure were largely predictable using the reference model “independent action”, which assumes different molecular target sites to be affected. Dietary toxicity for shredders might, however, be reduced under field conditions since UV-induced photodegradation and leaching decreased imidacloprid residues in leaves and thereby the toxicity for G. fossarum. In contrast, both shredders were found unable to actively avoid dietary exposure. This thesis thus recommends considering dietary exposure towards systemic insecticides, such as neonicotinoids, already during their registration to safeguard aquatic shredders, associated ecosystem functions (e.g., leaf litter breakdown) and ultimately ecosystem integrity.
Social networks are ubiquitous structures that we generate and enrich every day while connecting with people through social media platforms, emails, and any other type of interaction. While these structures are intangible to us, they carry important information. For instance, the political leaning of our friends can be a proxy to identify our own political preferences. Similarly, the credit score of our friends can be decisive in the approval or rejection of our own loans. This explanatory power is being leveraged in public policy, business decision-making and scientific research because it helps machine learning techniques make accurate predictions. However, these generalizations often benefit the majority of people who shape the general structure of the network, and disadvantage under-represented groups by limiting their resources and opportunities. Therefore, it is crucial to first understand how social networks form and then to verify to what extent their mechanisms of edge formation contribute to reinforcing social inequalities in machine learning algorithms.
To this end, in the first part of this thesis, I propose HopRank and Janus, two methods to characterize the mechanisms of edge formation in real-world undirected social networks. HopRank is a model of information foraging on networks. Its key component is a biased random walker based on transition probabilities between k-hop neighborhoods. Janus is a Bayesian framework that makes it possible to identify and rank plausible hypotheses of edge formation in cases where nodes possess additional information. In the second part of this thesis, I investigate the implications of these mechanisms - which explain edge formation in social networks - for machine learning. Specifically, I study the influence of homophily, preferential attachment, edge density, fraction of minorities, and the directionality of links on both performance and bias of collective classification, and on the visibility of minorities in top-k ranks. My findings demonstrate a strong correlation between network structure and machine learning outcomes. This suggests that systematic discrimination against certain people can be: (i) anticipated by the type of network, and (ii) mitigated by connecting strategically in the network.
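The k-hop-biased walker at the core of HopRank can be sketched as follows. This is a simplified illustration, not the published implementation: the actual model estimates hop-transition probabilities from empirical navigation data, whereas here `hop_bias` is a hypothetical preference table mapping hop distance to an unnormalized weight.

```python
import random
from collections import deque

def khop_neighborhoods(adj, node, max_k):
    """BFS from `node`, grouping the other nodes by hop distance k = 1..max_k."""
    dist = {node: 0}
    hops = {k: [] for k in range(1, max_k + 1)}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        if dist[u] == max_k:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                hops[dist[v]].append(v)
                queue.append(v)
    return hops

def hoprank_step(adj, node, hop_bias, rng=random):
    """One step of a HopRank-style biased walker: first sample a hop
    distance k with probability proportional to hop_bias[k], then jump
    to a uniformly chosen node in that k-hop ring."""
    hops = khop_neighborhoods(adj, node, max_k=max(hop_bias))
    weights = {k: hop_bias[k] for k in hop_bias if hops[k]}
    if not weights:
        return node  # isolated node: stay put
    total = sum(weights.values())
    r = rng.uniform(0, total)
    for k, w in weights.items():
        r -= w
        if r <= 0:
            return rng.choice(hops[k])
    return rng.choice(hops[max(weights)])
```

Iterating `hoprank_step` yields a walk whose stationary distribution reflects the chosen hop bias, which is the quantity the model fits against observed navigation behavior.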
This thesis deals with the verbal categories tense and aspect in the context of analysing the English perfect. The underlying notion of time is examined from the viewpoint of etymology and from the viewpoint of fields of knowledge that lie outside the immediate scope of temporal semantics, e.g. mathematics. The category tense is scrutinised by discussing the concept of Reichenbach tense and the concept of correlation (Giering). The starting point of the discussion of the category verbal aspect is the dichotomy perfective vs. imperfective in the Slavic languages. The main part about the perfect is concerned with the possessive perfect as a cross-linguistic phenomenon (including a comparison of the English and the Slavic perfect) and focuses on the usage and the meaning of the English present perfect. There are three appendices which are an integral part of this dissertation. Appendix A deals with the systematization of English verb forms and their graphical representation. Three different visualizations are presented, two of which are original to this work. Appendix B reproduces the target setting according to which an animated visualization of English infinitives was programmed. Appendix C represents a synopsis of approaches to the English perfect in grammars and textbooks.
The mitral valve is one of four human heart valves. It is located in the left heart and acts as a unidirectional passageway for blood between the left atrium and the left ventricle. A correctly functioning mitral valve prevents a backflow of blood into the pulmonary circulation (lungs) and thus constitutes a vital part of the cardiac cycle. Pathologies of the mitral valve can manifest in a variety of symptoms with severity ranging from chest pain and fatigue to pulmonary edema (fluid accumulation in the tissue and air space of lungs), which may ultimately cause respiratory failure.
Malfunctioning mitral valves can be restored through complex surgical interventions, which greatly benefit from intensive planning and pre-operative analysis. Visualization techniques provide a possibility to enhance such preparation processes and can also facilitate post-operative evaluation. The work at hand extends current research in this field, building upon patient-specific mitral valve segmentations developed at the German Cancer Research Center, which result in triangulated 3D models of the valve surface. The core of this work will be the construction of a 2D-view of these models through global parameterization, a method that can be used to establish a bijective mapping between a planar parameter domain and a surface embedded in higher dimensions.
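One classic instance of such a global parameterization for a disk-topology mesh is Tutte's barycentric embedding: the boundary is pinned to a convex polygon and every interior vertex is placed at the average of its neighbors, which provably yields a bijective planar map. The sketch below is a generic illustration of this idea, not the method implemented in the thesis.

```python
import numpy as np

def tutte_parameterization(vertices_n, edges, boundary):
    """Tutte's barycentric mapping: pin the boundary (given in cyclic
    order) to the unit circle, then solve a linear system placing each
    interior vertex at the centroid of its neighbors."""
    adj = [[] for _ in range(vertices_n)]
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    uv = np.zeros((vertices_n, 2))
    for i, v in enumerate(boundary):
        angle = 2 * np.pi * i / len(boundary)
        uv[v] = [np.cos(angle), np.sin(angle)]
    bset = set(boundary)
    interior = [v for v in range(vertices_n) if v not in bset]
    idx = {v: i for i, v in enumerate(interior)}
    # Laplacian system: deg(v) * uv[v] - sum of interior neighbors = sum
    # of (fixed) boundary neighbor positions
    A = np.zeros((len(interior), len(interior)))
    rhs = np.zeros((len(interior), 2))
    for v in interior:
        A[idx[v], idx[v]] = len(adj[v])
        for w in adj[v]:
            if w in idx:
                A[idx[v], idx[w]] -= 1.0
            else:
                rhs[idx[v]] += uv[w]
    uv[interior] = np.linalg.solve(A, rhs)
    return uv
```

Production parameterizations typically replace the uniform weights with shape-preserving ones (e.g. mean-value or harmonic weights) to reduce distortion, but the pin-boundary-and-solve structure is the same.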
A flat representation of the mitral valve provides physicians with a view of the whole surface at once, similar to a map. This allows assessment of the valve's area and shape without the need for different viewing angles. Parts of the valve that are occluded by geometry in 3D become visible in 2D.
An additional contribution of this work will be the exploration of different visualizations of the 3D and 2D mitral valve representations. Features of the valve can be highlighted by associating them with specified colors, which can for instance directly convey pathology indicators.
Quality and effectiveness of the proposed methods were evaluated through a survey conducted at the Heidelberg University Hospital.
Entrepreneurship plays a vital role in scientific literature and in public debates. Especially in these high-tech and digitized times, it happens more and more frequently that young entrepreneurs with a good idea achieve a breakthrough and build an established company. There is an increasing number of start-ups and a trend towards independence. The economy of a country depends on young entrepreneurs in order to remain competitive internationally. It follows that young entrepreneurs must be encouraged and supported. This support is expressed in various stages of foundation and through various fields of action. In the meantime, there are many offers for start-up support. These networks address different fields of action along the course of a foundation. However, a structured overview of these networks, by which a young founder can orient himself and gain easy access to the offers of the networks, has been missing until now.
This work attempts to present these offers clearly on a map and to categorize and present the commitment in the respective fields of action. In addition to this main objective, the following three key questions are investigated and answered in this work:
1. How can the Entrepreneurship Networks be assigned to the respective fields of action of Entrepreneurship Education?
2. What is the benefit of such a classification for potential entrepreneurs in detail?
3. Are these Entrepreneurship networks missing an important step? Might they improve their offer? Does the value chain cover every need a young entrepreneur might have?
For this purpose, the respective fields of action of the networks are first separated from each other along the founding process and defined individually. Subsequently, a combination of quantitative and qualitative approaches was used to filter and analyze the contents of the networks' websites. The results of this investigation were transformed into a classification.
The aim of this work is to produce a map that clearly displays the existing networks in the world. The map also contains more detailed information and the classification of the networks into the respective fields of action.
Interest in crowdfunding has been increasing in recent years, both from the economy and the scientific community. Besides artists and entrepreneurs, researchers are now also funding their projects through many small contributions from the crowd. However, the perceived use in Germany does not reflect the benefits of a crowdfunding campaign, especially in international comparison. This study investigates this issue by identifying the motives and barriers for crowdfunding in order to formulate recommendations for research institutions to encourage the use of crowdfunding.
By means of a literature review, first insights are gained which are then used to conduct qualitative interviews with eleven researchers who successfully completed a crowdfunding campaign. The results indicate that researchers in Germany use crowdfunding primarily to raise awareness for the subject and the scientific community in general. The initial assumption of the speed of crowdfunding as a motive was contradicted by the experts. The major barriers are the immense effort involved in a campaign and the lack of reputation for the concept of crowdfunding by German scientists. In addition, only subjects and projects with a high public relevance and funding volume of up to five digits are recommended for crowdfunding. Furthermore, the public exposure of the experts during the campaign was identified as an additional barrier.
These findings lead to three recommendations for research institutions to increase the use of crowdfunding: Firstly, universities should raise awareness for the subject of crowdfunding as an additional form of research funding and highlight the benefits of a crowdfunding campaign. Secondly, universities should cooperate with crowdfunding partners and utilize the networking capacities of a university. Lastly, universities should provide support to distribute the workload among interdisciplinary teams in order to enhance the effort-return ratio of a crowdfunding campaign.
The chosen methodology and the scope of the thesis enable further research that might examine the perspective of the universities and the conditions in other countries. In addition, a large-scale quantitative survey is required to validate the identified concepts statistically.
Within aquatic environments, sediment-water interfaces (SWIs) are the most important areas concerning exchange processes between the water body and the sediment. These spatially restricted regions are characterized by steep biogeochemical gradients that determine the speciation and fate of natural or artificial substances. Apart from biologically mediated processes (e.g., burrowing organisms, photosynthesis), the determining exchange processes are diffusion and colloid-mediated transport. Hence, methods are required that capture the fine-scale structures at the boundary layer and distinguish between the different transport pathways. Among emerging substances that will probably reach the aquatic environment, engineered nanomaterials (ENMs) are of great concern due to their increased use in many products and applications. Since they are defined based on their size (<100 nm), they include a variety of different materials behaving differently in the environment. Once released, they will inevitably mix with naturally present colloids (<1 μm), including natural nanomaterials.
With regard to existing methodological gaps concerning the characterization of ENMs (as emerging substances) and the investigation of SWIs (as receiving environmental compartments), the aim of this thesis was to develop, validate and apply suitable analytical tools. The challenges were i) to develop methods that enable a high-resolution and low-invasive sampling of sediment pore water, ii) to develop routine-suitable methods for the characterization of metal-based engineered nanoparticles, and iii) to adopt and optimize size-fractionation approaches for pore water samples of sediment depth profiles in order to obtain size-related information on element distributions at SWIs.
In the first part, an available microprofiling system was combined with a novel micro-sampling system equipped with newly developed sample filtration probes. The system was thoroughly validated and applied to a freshwater sediment, proving its applicability for the automatic sampling of sediment pore waters in parallel to microsensor measurements. Thereby, for the first time, multi-element information for sediment depth profiles was obtained at a millimeter scale that could be directly related to simultaneously measured sediment parameters.
Due to the expected release of ENMs to the environment, the aim was to develop methods that enable the investigation of the fate and transport of ENMs at sediment-water interfaces. Since standardized approaches are still lacking, methods were developed for the determination of the total mass concentration and of the dissolved fraction of (nano)particle suspensions. Thereby, validated, routine-suitable methods were provided, enabling for the first time a routine-suitable determination of these two properties, which are among the most important for the analysis of colloidal systems and are urgently needed as a basis for the development of appropriate (future) risk assessments and regulatory frameworks. On this methodological basis, approaches were developed to distinguish between the dissolved and colloidal fractions of sediment pore waters. This made it possible for the first time to obtain fraction-related element information for sediment depth profiles at a millimeter scale, capturing the fine-scale structures and distinguishing between diffusion and colloid-mediated transport. In addition to the research-oriented parts of this thesis, questions concerning the regulation of ENPs in the case of a release into aquatic systems were addressed in a separate publication (included in the Appendix), discussing the topic against the background of the currently valid German water legislation and the actual state of research.
The STOR project aims at the development of a scientific component system employing models and knowledge for object recognition in images. This interim report elaborates on the requirements for such a component system, structures the application area by identifying a large set of basic operations, and shows how a set of appropriate data structures and components can be derived. A small case study exemplifies the approach.
This thesis introduces fnnlib, a C++ library for recurrent neural network simulations that I developed between October 2009 and March 2010 at Osaka University's Graduate School of Engineering. After covering the theory behind recurrent neural networks, backpropagation through time, recurrent neural networks with parametric bias, continuous-time recurrent neural networks, and echo state networks, the design of the library is explained. All of the classes as well as their interrelationships are presented along with reasons as to why certain design decisions were made. Towards the end of the thesis, a small practical example is shown. Also, fnnlib is compared to other neural network libraries.
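Of the network types covered, the echo state network is the simplest to illustrate. The sketch below is a generic minimal ESN, not the fnnlib API (the class and method names here are hypothetical): the recurrent reservoir is fixed after scaling its spectral radius below one, and only the linear readout is trained, by ridge regression on collected reservoir states.

```python
import numpy as np

class EchoStateNetwork:
    """Minimal echo state network: a fixed random recurrent reservoir
    plus a trainable linear readout."""

    def __init__(self, n_in, n_res, n_out, spectral_radius=0.9, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # scale the reservoir so its spectral radius is below 1, the
        # usual in-practice condition for the echo state property
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        self.W = W
        self.W_out = np.zeros((n_out, n_res))
        self.x = np.zeros(n_res)

    def step(self, u):
        """Update the reservoir state and emit the readout."""
        self.x = np.tanh(self.W_in @ u + self.W @ self.x)
        return self.W_out @ self.x

    def fit_readout(self, inputs, targets, washout=10, ridge=1e-6):
        """Drive the reservoir with the input sequence, discard the first
        `washout` transient states, and solve the readout by ridge
        regression -- in an ESN only W_out is ever trained."""
        states = []
        for u in inputs:
            self.step(u)
            states.append(self.x.copy())
        X = np.array(states[washout:])
        Y = np.array(targets[washout:])
        self.W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]),
                                     X.T @ Y).T
```

This is why ESN training is so cheap compared to backpropagation through time: fitting reduces to one linear solve, which is also what makes the model attractive for a simulation library.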