Year of publication
Document Type
- Master's Thesis (19)
- Part of Periodical (13)
- Bachelor Thesis (11)
- Doctoral Thesis (6)
- Diploma Thesis (4)
Language
- English (53)
Is part of the Bibliography
- no (53)
Keywords
- Internet of Things (4)
- computer clusters (3)
- Beschaffung (2)
- Logistik (2)
- artificial neural networks (2)
- classification (2)
- framework (2)
- parallel algorithms (2)
- social simulation (2)
- Adaptive Services Grid (ASG) (1)
- Agentenorientiertes Software Engineering (1)
- Akzeptanz (1)
- BPM (1)
- BPMS (1)
- Bedarfsanalyse (1)
- Biometric Authentication (1)
- Business Process Management Recommender Systems Survey (1)
- Business Process Modeling (1)
- Business Rule Bases, Inconsistency Measurement (1)
- Bürgerbeteiligung (1)
- COVID-19 (1)
- CSCW (1)
- Case Study Analysis (1)
- Cloud Computing (1)
- CodeBlue (1)
- Collaboration (1)
- Computer Supported Cooperative Work (1)
- Container Entity Modell (1)
- DMN (1)
- Datenaustausch (1)
- Diffusion (1)
- Documents (1)
- E-Partizipation (1)
- E-participation (1)
- EU (1)
- Einkauf (1)
- Empfehlungssystem (1)
- Enterprise 2.0 (1)
- Enterprise Systems (1)
- Europäischer Schadensbericht (1)
- Evaluierung (1)
- Fingerprint Recognition (1)
- Gamification (1)
- Grounded Theory (1)
- Gruppenarbeit (1)
- Health (1)
- Heimarbeit (1)
- IBM Bluemix (1)
- IT Outsourcing (1)
- IT Security (1)
- IT Services (1)
- Information Asset Register (1)
- Information Audit (1)
- Information Capturing Methods (1)
- Information system (1)
- Insurance (1)
- Internet Voting (1)
- Interoperability (1)
- Interoperabilität (1)
- IoT (1)
- Kollaboration (1)
- Multi-Agenten-Systeme (1)
- ODRL (1)
- OWL (1)
- Organizational Change (1)
- PEPPOL (1)
- Pan European Public Procurement OnLine (1)
- Probabilistic finite automata (1)
- RDF (1)
- Recommender System (1)
- Recommender Systems, Business Process Modeling, Literature Review (1)
- STOF Model (1)
- Security (1)
- Semantic Web (1)
- Sensing as a Service (1)
- Service-oriented Architectures (SOA) (1)
- Service-orientierte Architektur (1)
- Sozialwissenschaftliche Simulation (1)
- Speaker Recognition (1)
- Sustainability (1)
- Telearbeit (1)
- Tenneco Automotive (1)
- Transport (1)
- Umfrage (1)
- Umfrage in Koblenz (1)
- VCD (1)
- Verification (1)
- Virtual Company Dossier (1)
- WSN (1)
- Wearables (1)
- acceptance (1)
- adaptive resonance theory (1)
- analytics (1)
- artificial neural networks (1)
- artificial neural networks (1)
- assessment model (1)
- blood analysis (1)
- business intelligence (1)
- categorisation (1)
- change (1)
- collaboration (1)
- cultural dimensions (1)
- data (1)
- diffusion (1)
- digital transformation (1)
- digital workplace (1)
- distributed information systems (1)
- e-service (1)
- e-service quality (1)
- eGovernment (1)
- eSourcing (1)
- enterprise collaboration platforms (1)
- enterprise collaboration systems (1)
- estimation of algorithm efficiency (1)
- evaluation (1)
- governance (1)
- gradient method of training weight coefficients (1)
- groupwork (1)
- hybrid work (1)
- information infrastructure (1)
- information system (1)
- internet of things (1)
- iot development platforms (1)
- logistic (1)
- longitudinal (1)
- mPayments (1)
- mathematical model (1)
- medical care (1)
- methodology (1)
- micro-agent (1)
- minimum self-contained graphs (1)
- mobile health care (1)
- modulares System (1)
- multi-agent systems (1)
- parallel calculations (1)
- regression analysis (1)
- remote work (1)
- requirements analysis (1)
- risk (1)
- social media (1)
- social object (1)
- survey in Koblenz (1)
- virtual goods (1)
- visualization (1)
- web-portal medical e-services (1)
- wireless sensor networks (1)
- work from anywhere (1)
- work from home (1)
Institute
- Institut für Wirtschafts- und Verwaltungsinformatik (53)
Predictive Process Monitoring is becoming more prevalent as an aid for organizations in supporting their operational processes. However, most software applications available today require extensive technical know-how from the operator and are therefore not suitable for most real-world scenarios. This work therefore presents a prototype implementation of a Predictive Process Monitoring dashboard in the form of a web application. The system is based on the PPM Camunda Plugin presented by Bartmann et al. (2021) and allows users to easily create metrics, visualizations to display these metrics, and dashboards in which visualizations can be arranged. A usability test with test users of varying computer skills is conducted to confirm the application's user-friendliness.
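As an illustration of the kind of metric such a dashboard might let users define, the sketch below computes an average case duration from a toy event log. The log format and the function are invented for illustration and are not taken from the PPM Camunda Plugin.

```python
from datetime import datetime

# Hypothetical event log: (case_id, activity, timestamp) tuples, roughly the
# shape a dashboard might receive from a process engine.
event_log = [
    ("c1", "start", datetime(2021, 1, 1, 9, 0)),
    ("c1", "end",   datetime(2021, 1, 1, 9, 30)),
    ("c2", "start", datetime(2021, 1, 1, 9, 10)),
    ("c2", "end",   datetime(2021, 1, 1, 10, 10)),
]

def average_case_duration_minutes(log):
    """Aggregate events per case, then average first-to-last timestamps."""
    cases = {}
    for case_id, _activity, ts in log:
        first, last = cases.get(case_id, (ts, ts))
        cases[case_id] = (min(first, ts), max(last, ts))
    durations = [(last - first).total_seconds() / 60
                 for first, last in cases.values()]
    return sum(durations) / len(durations)

print(average_case_duration_minutes(event_log))  # 45.0
```

A real metric definition would be parameterized through the dashboard UI rather than hard-coded, but the aggregation step looks much like this.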
Remote Working Study 2022
(2022)
The Remote Working Study 2022 focuses on the transition to work from home (WFH) triggered by the stay-at-home directives of 2020. These directives required employees to work from their private premises wherever possible to reduce the transmission of the coronavirus. The study was conducted by the Center for Enterprise Information Research (CEIR) at the University of Koblenz from December 2021 to January 2022.
The objective of the survey is to collect baseline information about organisations’ remote work experiences during and immediately following the COVID-19 lockdowns. The survey was completed by the key persons responsible for the implementation and/or management of the digital workplace in 19 German and Swiss organisations.
The data presented in this report was collected from member organisations of the IndustryConnect initiative. IndustryConnect is a university-industry research programme that is coordinated by researchers from the University of Koblenz. It focuses on research in the areas of the digital workplace and enterprise collaboration technologies, and facilitates the generation of new research insights and the exchange of experiences among user companies.
Advanced Auditing of Inconsistencies in Declarative Process Models using Clustering Algorithms
(2021)
For an organization's business process to be compliant, it is essential to ensure a consistent process. Whether a process is consistent depends on the business rules of that process. If the process adheres to these business rules, then the process is compliant and efficient. For large processes, checking this is quite a challenge, as an inconsistency can quickly lead to a non-functional process, which is a severe problem for organizations. This thesis presents a novel auditing approach for handling inconsistencies from a post-execution perspective. The tool identifies run-time inconsistencies and visualizes them in heatmaps. These plots aim to help modelers observe the most problematic constraints and make the right remodeling decisions. Modelers are assisted by a number of variables that can be set in the tool to obtain different representations of the heatmaps, helping them grasp all perspectives of the problem. The heatmaps sort and show the run-time inconsistency patterns so that modelers can decide which constraints are highly problematic and should be remodeled. The tool can be applied to real-life data sets in reasonable run-time.
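The aggregation behind such heatmaps can be sketched as a trace-by-constraint count matrix; the violation records and constraint names below are invented for illustration and do not come from the thesis's tool.

```python
from collections import Counter

# Hypothetical run-time violations: (trace_id, constraint) pairs, as an
# auditing tool might extract them from an event log.
violations = [
    ("t1", "Response(A,B)"), ("t1", "Response(A,B)"),
    ("t2", "Response(A,B)"), ("t2", "Precedence(C,D)"),
    ("t3", "Precedence(C,D)"),
]

def heatmap_matrix(violations):
    """Build the trace-by-constraint count matrix behind a heatmap plot."""
    counts = Counter(violations)
    traces = sorted({t for t, _ in violations})
    constraints = sorted({c for _, c in violations})
    matrix = [[counts[(t, c)] for c in constraints] for t in traces]
    return matrix, traces, constraints

matrix, traces, constraints = heatmap_matrix(violations)
# Each row is a trace, each column a constraint, each cell a violation count,
# ready to be handed to any plotting library as a heatmap.
```

Sorting the axes makes problematic constraints easy to locate; a real tool would additionally order rows and columns by total violation count.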
Enterprise collaboration platforms are increasingly gaining importance in organisations. Integrating groupware functionality and enterprise social software (ESS), they have substantially been transforming everyday work in organisations. While traditional collaboration systems have been studied in Computer Supported Cooperative Work (CSCW) for many years, the large-scale, infrastructural and heterogeneous nature of enterprise collaboration platforms remains uncharted. Enterprise collaboration platforms are embedded into organisations’ digital workplace and come with a high degree of complexity, ambiguity, and generativity. When introduced, they are empty shells with no pre-determined purposes of use. They afford interpretive flexibility, and thus are shaping and being shaped by and in their social context. Outcomes and benefits emerge and evolve over time in an open-ended process and as the digital platform is designed through use. In order to make the most of the platform and associated continuous digital transformation, organisations have to develop the necessary competencies and capabilities.
Extant literature on enterprise collaboration platforms has proliferated and provides valuable insights on diverse topics, such as implementation strategies, adoption hurdles, or collaboration use cases; however, it tends to disregard their evolvability and the related multiple time frames and settings. Thus, this research aims to identify, investigate, and theorise the ways that enterprise collaboration platforms are changing over time and space and the ways that organisations build digital transformation capabilities. To address this research aim, two different case study types are conducted: (i) an in-depth longitudinal qualitative case study, in which case narratives and visualisations capturing hard-to-summarise complexities in the enterprise collaboration platform evolution are developed, and (ii) multiple-case studies to capture, investigate, and compare cross-case elements that contribute to the shaping of enterprise collaboration platforms in different medium-sized and large organisations from a range of industries. Empirical data is captured and investigated through a multi-method research design (incl. focus groups, surveys, in-depth interviews, literature reviews, qualitative content analysis, descriptive statistics) with shifting units of analysis. The findings reveal unique change routes with unanticipated outcomes and transformations, context-specific change strategies to deal with multiple challenges (e.g. GDPR, works council, developments in the technological field, competing systems, integration of blue-collar workers), co-existing platform uses, and various interacting actors from the immediate setting and broader context. The interpretation draws on information infrastructure (II) as a theoretical lens and related sociotechnical concepts and perspectives (incl. inscriptions, social worlds, biography of artefacts).
Iteratively, a conceptual model of the building of digital transformation capabilities is developed, integrating the insights gained from the study of enterprise collaboration platform change and the developed change-monitoring tools (e.g. the MoBeC framework). It assists researchers and practitioners in understanding the building of digital transformation capabilities from theoretical and practical viewpoints and helps organisations implement the depicted knowledge in their unique digital transformation processes.
Enterprise Collaboration Systems (ECS) have become substantial for computer-mediated communication and collaboration among employees in organisations. As ECS combine features from social media and traditional groupware, a growing number of organisations implement ECS to facilitate collaboration among employees. Consequently, ECS form the core of the digital workplace. Thus, the activity logs of ECS are particularly valuable since they provide a unique opportunity for observing and analysing collaboration in the digital workplace.
Evidence from academia and practice demonstrates that there is no standardised approach for the analysis of ECS logs and that practitioners struggle with various barriers. Because current ECS analytics tools only provide basic features, academics and practitioners cannot leverage the full potential of the activity logs. As ECS activity logs are a valuable source for understanding collaboration in the digital workplace, new methods and metrics for their analysis are required. This dissertation develops Social Collaboration Analytics (SCA) as a method for measuring and analysing collaboration activities in ECS. To address the existing limitations in academia and practice and to contribute a method and structures for applying SCA in practice, this dissertation aims to answer two main research questions:
1. What are the current practices for measuring collaboration activities in Enterprise Collaboration Systems?
2. How can Social Collaboration Analytics be implemented in practice?
By answering the research questions, this dissertation seeks to (1) establish a broad thematic understanding of the research field of SCA and (2) develop SCA as a structured method for analysing activity logs of ECS. As part of the first research question, this dissertation documents the status quo of SCA in the academic literature and in practice. By answering the second research question, this dissertation contributes the SCA framework (SCAF), which guides the practical application of SCA. SCAF is the main contribution of this dissertation. The framework was developed based on findings from an analysis of 86 SCA studies, results from 6 focus groups and results from a survey among 27 ECS user companies. The phases of SCAF were derived from a comparison of established process models for data mining and business intelligence. The eight phases of the framework contain detailed descriptions, working steps, and guiding questions, which provide a step-by-step guide for the application of SCA in practice. Thus, academics and practitioners alike can benefit from using the framework.
The constant evaluation of the research outcomes in focus groups ensures both rigour and relevance. This dissertation employs a qualitative-dominant mixed-methods approach. As part of the university-industry collaboration initiative IndustryConnect, this research has access to more than 30 leading ECS user companies. Being built on a key case study and a series of advanced focus groups with representatives of user companies, this dissertation can draw from unique insights from practice as well as rich data with a longitudinal perspective.
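As a minimal illustration of the kind of analysis an SCA method guides, the sketch below ranks contributors in a toy ECS activity log. The log schema and the metric are assumptions made for illustration, not taken from the dissertation or SCAF.

```python
from collections import Counter

# Hypothetical ECS activity log entries: (user, action, item) records of the
# kind collaboration platforms write for every user interaction.
log = [
    ("alice", "create", "doc1"), ("bob", "comment", "doc1"),
    ("alice", "comment", "doc2"), ("carol", "like", "doc1"),
    ("bob", "create", "doc2"),
]

def top_contributors(log, n=2):
    """Rank users by their number of logged collaboration activities."""
    return Counter(user for user, _, _ in log).most_common(n)

print(top_contributors(log))  # [('alice', 2), ('bob', 2)]
```

Real SCA metrics would distinguish activity types and weight them differently, but each one reduces to an aggregation over such log records.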
Within the field of Business Process Management, business rules are commonly used to model company decision logic and govern allowed company behavior. An exemplary business rule in the financial sector could be, for example: ”A customer with a mental condition is not creditworthy”. Business rules are usually created and maintained collaboratively and over time. In this setting, modelling errors can occur frequently. A challenging problem in this context is that of inconsistency, i.e., contradictory rules which cannot hold at the same time. For instance, regarding the exemplary rule above, an inconsistency would arise if a (second) modeller entered an additional rule: ”A customer with a mental condition is always creditworthy”, as the two rules cannot hold at the same time. In this thesis, we investigate how to handle such inconsistencies in business rule bases. In particular, we develop methods and techniques for the detection, analysis and resolution of inconsistencies in business rule bases.
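A minimal sketch of the detection step, assuming rules are simple condition-conclusion pairs (the formalisms treated in the thesis are richer than this):

```python
# Illustrative rule base in the spirit of the running example; the encoding
# as (condition, conclusion) string pairs is an assumption for this sketch.
rules = [
    ("mental_condition", "not creditworthy"),
    ("mental_condition", "creditworthy"),
    ("high_income", "creditworthy"),
]

def negates(a, b):
    """True if one conclusion is the negation of the other."""
    return a == "not " + b or b == "not " + a

def find_inconsistent_pairs(rules):
    """Return index pairs of rules that fire on the same condition but draw
    contradictory conclusions (a minimal inconsistent subset of size two)."""
    pairs = []
    for i in range(len(rules)):
        for j in range(i + 1, len(rules)):
            (c1, o1), (c2, o2) = rules[i], rules[j]
            if c1 == c2 and negates(o1, o2):
                pairs.append((i, j))
    return pairs

print(find_inconsistent_pairs(rules))  # [(0, 1)]
```

Detection over realistic rule languages additionally has to account for chains of rules that only jointly become contradictory, which is where dedicated reasoning techniques come in.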
The industry standard Decision Model and Notation (DMN) has enabled a new way of formalizing business rules since 2015. Rules are modeled in so-called decision tables, which are defined by input columns and output columns. Furthermore, decisions are arranged in a graph-like structure (the DRD level), which creates dependencies between them. Given an input, the decisions can then be requested by appropriate systems, whereby activated rules produce output for further use. However, modeling mistakes produce erroneous models, and such errors can occur in the decision tables as well as at the DRD level. Following the Design Science Research Methodology, this thesis introduces the implementation of a verification prototype for the detection and resolution of these errors during the modeling phase. The presented basics provide the theoretical foundation needed for the development of the tool. This thesis further presents the architecture of the tool and the implemented verification capabilities. Finally, the created prototype is evaluated.
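One table-level check such a verifier might implement is overlap detection under the UNIQUE hit policy. The single-input table below is invented for illustration and is not taken from the prototype.

```python
# Toy DMN-style decision table with one numeric input, each rule covering a
# half-open interval [low, high). Content is invented for illustration.
table = [
    {"rule": 1, "low": 0,   "high": 50,  "output": "low risk"},
    {"rule": 2, "low": 40,  "high": 100, "output": "high risk"},
    {"rule": 3, "low": 100, "high": 200, "output": "review"},
]

def overlapping_rules(table):
    """Under hit policy UNIQUE, any two rules whose input intervals
    intersect constitute a modeling error the verifier should flag."""
    errors = []
    for i, r1 in enumerate(table):
        for r2 in table[i + 1:]:
            if r1["low"] < r2["high"] and r2["low"] < r1["high"]:
                errors.append((r1["rule"], r2["rule"]))
    return errors

print(overlapping_rules(table))  # [(1, 2)]
```

A full verifier extends this idea to multiple input columns (interval intersection per column) and adds the dual check for gaps, i.e. inputs no rule covers.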
Constructing a business process model manually is a highly complex and error-prone task that takes a lot of time and requires deep insight into the organizational structure, its operations and its business rules. To improve the output of business analysts dealing with this process, researchers have introduced different techniques to support them during construction with helpful recommendations. These supporting recommendation systems vary in what they recommend in the first place as well as in the calculations taking place under the hood to recommend the element that best fits the user's needs. After a broad introduction to the field of business process modeling and its basic recommendation structures, this work takes a closer look at diverse proposals and descriptions published in current literature regarding implementation strategies to effectively and efficiently assist modelers during business process model creation. A critical analysis of the selected literature points out the strengths and weaknesses of the approaches, studies and descriptions presented. As a result, the final concept matrix in this work gives a precise and helpful overview of the key features and recommendation methods used and implemented in previous research studies, pinpointing an entry into future work without the downsides already spotted by fellow researchers.
The Internet of Things is still one of the most relevant topics in economics and research, powered by the increasing demand for innovative services. Cost reductions in the manufacturing of IoT hardware and the development of completely new ways of communication have led to billions of devices being connected to the internet. But in order to master this new IoT landscape, a standardized solution for these challenges must be developed: the IoT Architecture.
This thesis examines the structure, purpose and requirements of IoT Architecture Models in the global IoT landscape and provides an overview of selected ones. For that purpose, a structured literature analysis on this topic is conducted within this thesis, including an analysis of three existing research approaches that try to frame the topic and a tool-supported evaluation of IoT Architecture literature with over 200 accessed documents.
Furthermore, a coding of the literature on 30 different IoT Architecture Models is conducted with the help of the specialised coding tool ATLAS.ti 8. In a final step, these Architecture Models are categorized and compared to each other, showing that the environment of IoT and its Architectures becomes ever more complex the further the research goes.
Business Process Querying (BPQ) is a discipline in the field of Business Process Management which helps experts understand existing process models and accelerates the development of new ones. Its queries can fetch and merge these models, answer questions regarding the underlying process, and conduct compliance checking in return. Many languages have been deployed in this discipline, but two language types are dominant: logic-based languages use temporal logic to verify models as finite state machines, whereas graph-based languages use pattern matching to retrieve subgraphs of model graphs directly. This thesis aims to map the features of each language type to features of the other in order to identify strengths and weaknesses. Exemplarily, the features of Computation Tree Logic (CTL) and the Diagramed Model Query Language (DMQL) are mapped to one another. CTL explores the valid state space and is thus better suited for behavioral querying. Lacking certain structural features and counting mechanisms, it is not appropriate for querying structural properties. In contrast, DMQL issues structural queries, and its patterns can reconstruct any CTL formula. However, they do not always achieve exactly the same semantics: patterns treat conditional flow as sequential flow by ignoring its conditions. As a result, retrieved mappings are invalid process execution sequences, i.e. false positives, in certain scenarios. DMQL can be used for behavioral querying if these are absent or acceptable. In conclusion, both language types have strengths and are specialized for different BPQ use cases, but in certain scenarios graph-based languages can be applied to both. Integrating the evaluation of conditions would remove the need for logic-based languages in BPQ completely.
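The behavioral side of BPQ can be illustrated with a hand-rolled check of the CTL operator EF ("a state satisfying p is reachable") on a toy transition system; states and labels below are invented for illustration.

```python
# Toy transition system: successor lists per state, plus atomic-proposition
# labels. Invented for illustration of CTL-style behavioral querying.
transitions = {"s0": ["s1", "s2"], "s1": ["s3"], "s2": [], "s3": []}
labels = {"s3": {"p"}}

def ef(prop, start):
    """Evaluate EF prop from `start`: depth-first search until some
    reachable state carries the label `prop`."""
    seen, stack = set(), [start]
    while stack:
        s = stack.pop()
        if prop in labels.get(s, set()):
            return True
        if s not in seen:
            seen.add(s)
            stack.extend(transitions.get(s, []))
    return False

print(ef("p", "s0"), ef("p", "s2"))  # True False
```

A graph-based pattern query would instead match a path-shaped subgraph ending in a p-labeled node, which is where the two paradigms meet, and where ignoring edge conditions can produce the false positives discussed above.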
The Internet of Things (IoT) is a fast-growing technological concept which aims to integrate various physical and virtual objects into a global network to enable interaction and communication between those objects (Atzori, Iera and Morabito, 2010). The application possibilities are manifold and may transform society and economy similarly to the adoption of the internet (Chase, 2013). Furthermore, the Internet of Things occupies a central role in the realisation of visionary future concepts such as the Smart City or Smart Healthcare. In addition, the utilisation of this technology promises opportunities for the enhancement of various sustainability aspects, and thus for the transformation to a smarter, more efficient and more conscious handling of natural resources (Maksimovic, 2017). The action principle of sustainability is gaining increasing attention in societal and academic discourse. This is due to the partly harmful consumption and production patterns of the last century (Mcwilliams et al., 2016). Relating to sustainability, the advancing application of IoT technology also poses risks. Following the precautionary principle, these risks should be considered early (Harremoës et al., 2001). Risks of IoT for sustainability include the massive amounts of energy and raw materials which are required for the manufacturing and operation of IoT objects, and furthermore the disposal of those objects (Birkel et al., 2019). The exact relations in the context of IoT and sustainability are insufficiently explored to this point and do not constitute a central element within the discussion of this technology (Behrendt, 2019). Therefore, this thesis aims to develop a comprehensive overview of the relations between IoT and sustainability.
To achieve this aim, this thesis utilises the methodology of Grounded Theory in combination with a comprehensive literature review. The analysed literature primarily consists of research contributions in the field of Information Technology (IT). Based on this literature, aspects, solution approaches, effects and challenges in the context of IoT and sustainability were elaborated. The analysis revealed two central perspectives in this context. IoT for Sustainability (IoT4Sus) describes the utilisation and usage of IoT-generated information to enhance sustainability aspects. In contrast, Sustainability for IoT (Sus4IoT) focuses on sustainability aspects of the applied technology and highlights methods to reduce the negative impacts associated with the manufacturing and operation of IoT. The elaborated aspects and relations were illustrated in the comprehensive CCIS Framework. This framework represents a tool for capturing relevant aspects and relations in this context and thus supports awareness of the link between IoT and sustainability. Furthermore, the framework suggests an action principle to optimise the performance of IoT systems regarding sustainability.
The central contribution of this thesis is the provision of the CCIS Framework and the information it contains regarding the aspects and relations of IoT and sustainability.
The status of Business Process Management (BPM) recommender systems is not quite clear, as research shows. Recommenders became familiar to the world during the rise of technological evolution in the past decade, and several BPM recommender systems have come about since then. However, not a lot of research has been conducted in this field: it is not well known how broad the range of technologies used is, or how they are used. This master's thesis therefore aims to survey the existing BPM recommender systems. The recommendations these systems make come in different shapes. They can be position-based, where an element is to be placed in front of or after another element, or where a missing link is autocompleted. On the other hand, recommendations can be textual, filling in the labels of elements. The literature review for BPM recommender systems took place under the guidance of a literature review framework. The framework suggests five consecutive stages for this purpose. The first stage is defining a scope for the research; the second is conceptualizing the topic by choosing key terms for the literature search. The third stage is the search itself. The fourth stage suggests choosing analysis features over which the literature is to be synthesized and compared. Finally, the framework recommends defining a research agenda describing the motivation for the literature review. Following this methodology, this master's thesis surveyed 18 BPM recommender systems.
The survey found that there are not many different technologies for implementing the recommenders. It also found that the majority of recommenders suggest nodes that are yet to come in the model, which is called forward recommending, and that textual recommendations for BPM labels are scarcely used. Finally, 18 recommenders is fewer than expected for a developing field, so the survey found a shortage in the number of BPM recommender systems. The results indicate shortages in several aspects of the field of BPM recommender systems, and on this basis this master's thesis recommends directions for future work.
Business rules have become an important tool for warranting compliance in business processes. However, a collection of business rules can contain conflicting elements, which can lead to a violation of the compliance to be achieved. These conflicting elements are a kind of inconsistency, or quasi-inconsistency, in the business rule base. The goal of this thesis is to investigate how such quasi-inconsistencies in business rules can be detected and analyzed. To this aim, we develop a comprehensive library which allows results from the scientific field of inconsistency measurement to be applied to business rule formalisms that are actually used in practice.
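As a small illustration of an inconsistency measure, the sketch below computes the MI measure (the number of minimal inconsistent subsets) for a knowledge base of plain literals, where every minimal inconsistent subset is a complementary pair. The encoding is deliberately simplified and is not the library's actual interface.

```python
from itertools import combinations

# Toy knowledge base of literals; "!" marks negation. Invented for
# illustration of the MI inconsistency measure.
kb = ["creditworthy", "!creditworthy", "insured", "!insured", "solvent"]

def mi_measure(kb):
    """Count minimal inconsistent subsets; for literal-only bases these are
    exactly the complementary pairs."""
    return sum(
        1 for f, g in combinations(kb, 2)
        if f == "!" + g or g == "!" + f
    )

print(mi_measure(kb))  # 2
```

For full rule languages, computing minimal inconsistent subsets requires an actual entailment check per candidate subset, which is the computationally hard part such a library has to handle.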
The goal of this thesis is to create a recommender system (RS) for business processes, based on the existing ProM plugin RegPFA. To accomplish this task, firstly an interface must be created that sets up and expands a database receiving probabilistic finite automata (PFA) created by RegPFA in tsml format as input. Secondly, a Java program must be designed that uses said database to recommend the process elements that are most likely to follow a given sequence of process elements.
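The recommendation step itself can be sketched as a lookup in the PFA's transition probabilities. The automaton below is hand-written for illustration rather than produced by RegPFA, and the element names are invented.

```python
# Hand-written transition probabilities of a probabilistic finite automaton:
# current element -> {successor element: probability}. In the thesis these
# would come from RegPFA via the database.
pfa = {
    "check order": {"approve": 0.7, "reject": 0.2, "escalate": 0.1},
    "approve": {"ship": 0.9, "archive": 0.1},
}

def recommend_next(pfa, current, k=2):
    """Return the k most probable successors of the current process element."""
    successors = pfa.get(current, {})
    return sorted(successors, key=successors.get, reverse=True)[:k]

print(recommend_next(pfa, "check order"))  # ['approve', 'reject']
```

A full recommender would condition on the whole observed sequence (the PFA's state), not just the last element, but the ranking step is the same.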
Engineering criminal agents
(2019)
This PhD thesis with the title "Engineering Criminal Agents" demonstrates the interplay of three different research fields captured in the title: at the centre are Engineering and Simulation, both set in relation to the application field of Criminology and the social science aspect of the latter. More precisely, this work intends to show how specific agent-based simulation models can be created using common methods from software engineering.
Agent-based simulation has proven to be a valuable method for social science for decades, and the trend towards increasingly complex simulation models is apparent, not least due to advancing computational and simulation techniques. An important cause of complexity is the inclusion of 'evidence' as the basis of simulation models. Evidence can be provided by various stakeholders, reflecting their different viewpoints on the topic to be modeled.
This poses a particular burden by interrelating the two relevant perspectives on the topic of simulation: on the one hand the user of the simulation model who provides the requirements and is interested in the simulation results, on the other hand the developer of the simulation model who has to program a verified and validated formal model. In order to methodically link these two perspectives, substantial efforts in research and development are needed, where this PhD thesis aims to make a contribution.
The practical results - in terms of software - were achieved by using the multi-faceted approach mentioned above: using methods from software engineering, in order to become able to apply methods from computational social sciences, in order to gain insights into social systems, such as in the internal dynamics of criminal networks.
The PhD thesis shows the research involved to create these practical results, and gives technical details and specifications of the developed software.
The frame for research and development to achieve these results was provided mainly by two research projects: OCOPOMO and GLODERS.
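The agent-based approach can be illustrated with a minimal simulation skeleton: agents repeatedly interact and update a state variable. The "trust" dynamics below are invented for illustration and are not taken from the OCOPOMO or GLODERS models.

```python
import random

# Minimal agent-based simulation skeleton: pairwise interactions that raise
# or lower mutual trust. Parameters and dynamics are invented.
random.seed(42)  # fixed seed for a reproducible run

class Agent:
    def __init__(self, name):
        self.name = name
        self.trust = 0.5

    def interact(self, other):
        # A successful interaction raises mutual trust, a failure lowers it;
        # trust is clamped to [0, 1].
        success = random.random() < 0.6
        delta = 0.05 if success else -0.05
        self.trust = min(1.0, max(0.0, self.trust + delta))
        other.trust = min(1.0, max(0.0, other.trust + delta))

agents = [Agent(f"a{i}") for i in range(4)]
for _step in range(100):
    a, b = random.sample(agents, 2)
    a.interact(b)

print([round(a.trust, 2) for a in agents])
```

Evidence-based models replace the invented interaction rule with behavior derived from stakeholder input, which is exactly where the software-engineering methods discussed above come into play.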
The Internet of Things (IoT) is a concept in which connected physical objects are integrated into the virtual world to become active partakers of businesses and everyday processes (Uckelmann, Harrison and Michahelles, 2011; Shrouf, Ordieres and Miragliotta, 2014). It is expected to have a major impact on businesses (Council, Nic and Intelligence, 2008), but small and medium enterprises’ business models are threatened if they do not adopt the new concept (Sommer, 2015). Thus, this thesis aims to showcase a sample implementation of connected devices in a small enterprise, demonstrating its added benefits for the business.
Design Science Research (DSR) is used to develop a prototype based on a use case provided by a carpentry. The prototype comprises a hardware sensor and a web application which can be used by the wood shop to improve their processes. The thesis documents the iterative process of developing a prototype from the ground up to usable hardware and software.
This contribution provides an example of how IoT can be used and implemented at a small business.
This thesis connects the winemaker's intention of perfect and profitable wine making with an innovative technological application of the Internet of Things. The winemaker's work may thus be supported and enriched, enabling an optimization of the management and planning of his business that was unthinkable until recent years, including close state monitoring of different areas of his vineyard, down to the single grapevine. This thesis shows exemplarily how to measure, transmit, store and make data available, demonstrated "live" with temperature, air humidity and soil humidity values from the vineyard. A modular architecture was designed for the presented system, which allows the use of current sensors as well as of similar low-voltage sensors to be developed in the future.
By using IoT devices in the vineyard, the winemaker advances to a new quality of precision in forecasting, starting from live data of his vineyard. More importantly, the winemaker can take immediate action when unforeseen heavy weather conditions occur; immediate use of current data is enabled by a cloud infrastructure. For this system, an open service infrastructure is employed. In contrast to other published commercial approaches, the described solution is based on open source software.
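The data transmission described above can be sketched as a JSON payload a vineyard node might send to the cloud back end. Field names and the format are assumptions for this sketch, not the thesis's actual wire format.

```python
import json

# Serialize one sensor reading for transmission; the modular hardware design
# would add further low-voltage sensors as extra fields.
def build_payload(node_id, ts_iso, temperature_c, air_humidity_pct,
                  soil_humidity_pct):
    return json.dumps({
        "node": node_id,
        "ts": ts_iso,
        "temperature_c": temperature_c,
        "air_humidity_pct": air_humidity_pct,
        "soil_humidity_pct": soil_humidity_pct,
    })

payload = build_payload("vineyard-07", "2022-07-01T12:00:00Z",
                        24.3, 55.0, 31.5)
print(payload)
```

Over a LoRaWAN link such as TTN, a real node would send a compact binary encoding instead of JSON and leave the JSON representation to the back end, but the logical record is the same.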
As a stand-alone part of this work, a physical prototype for measuring relevant parameters in the vineyard was designed and developed de novo until it fulfilled the set of specifications. The features and requirements for a functioning, autonomously transmitting data collection device were developed and described, and their fulfilment by the prototype device was demonstrated. Through literature research and supporting live interviews with winemakers, the theory and the practical application were synchronized and qualified.
For the development of the prototype, the general principles for developing an electronic device were followed, in particular the Design Science Research development rules and the principles of Quality Function Deployment. As characteristics of the prototype, principles such as the re-use of approved construction and the material price of the device's building blocks were taken into consideration as well (e.g. housing; Arduino; PCB). Parts-reduction principles, complexity reduction and simplified assembly, testing and field service were integrated into the development process through the modular design of the functional vineyard device components, e.g. with partial reference to the innovative electrical cabinet construction system Modular-3.
The software architecture is based on a three-layer design including The Things Network (TTN) infrastructure. The front end is realized as a rich web client using a WordPress plugin; WordPress was chosen for its wide adoption across the web, enabling fast and easy user familiarization. Relevant quality aspects, such as functionality, extensibility, requirements fulfilment, and usability and durability of the device and the software, were tested and discussed.
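LoRaWAN uplinks of the kind a TTN-based vineyard device would send are typically packed into a few bytes and decoded again in a cloud-side payload formatter. The following is a minimal sketch of such a codec; the field layout and scaling factors are illustrative assumptions, not the format actually used by the thesis prototype.

```python
import struct

# Hypothetical compact uplink payload: temperature (0.1 °C steps, signed),
# air humidity and soil humidity (0.5 % steps, unsigned) -- 4 bytes total.
# Layout and scaling are assumptions made for this sketch.

def encode_payload(temp_c, air_hum_pct, soil_hum_pct):
    """Pack three sensor readings into a 4-byte LoRaWAN-friendly payload."""
    return struct.pack(
        ">hBB",
        round(temp_c * 10),      # e.g. 21.4 °C -> 214
        round(air_hum_pct * 2),  # e.g. 55.5 %  -> 111
        round(soil_hum_pct * 2),
    )

def decode_payload(payload):
    """Inverse of encode_payload, as a cloud-side formatter might do it."""
    t, air, soil = struct.unpack(">hBB", payload)
    return {"temp_c": t / 10, "air_hum_pct": air / 2, "soil_hum_pct": soil / 2}
```

For example, `decode_payload(encode_payload(21.4, 55.5, 40.0))` recovers the original readings; keeping the payload at four bytes respects the tight airtime budget of LoRaWAN uplinks.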
The prototype was characterized and successfully tested in the laboratory and in the field under different conditions, in order to measure and analyse how well the selected electronic construction and layout fulfil all requirements.
The solution presented may serve as a basis for future development and application, both in this particular use case and with similar technologies. A prognosis of future work and applications concludes this thesis.
During the last couple of years, the extension of the Internet into the real world, also referred to as the Internet of Things (IoT), has been driven by ongoing digitalization (Mattern and Floerkemeier, 2010; Evans, 2013). One of the most active IoT domains is the personal health ecosystem (Steele and Clarke, 2013). This thesis proposes a gamification framework, supported and enabled by IoT, that brings personal health and IoT together in the context of health insurance. By examining gamification approaches and identifying the role of IoT in them, a conceptual model of a gamification approach was created which indicates where and how IoT is applicable to it. IoT thus acts as an enabler and, furthermore, as an enhancer of gamified activities; the necessity of wearable devices in particular was highlighted. A stakeholder analysis shed light on the respective benefits and concluded that IoT enables two paradigm shifts, for both the insurance and its customers. Taking the results of the examination and the stakeholder analysis as input, these insights were used to develop an IoT-supported gamification framework. The framework comprises a multi-level structure which is meant to guide the process of creating an approach, but also to analyze already existing approaches. Additionally, the developed framework was instantiated on the application Pokémon Go to identify occurring issues and to explain why it failed to retain its customers in the long term. The thesis provides a foundation on which further context-related research can be oriented.
Social Business Documents: An Investigation of their Nature, Structure and Long-term Management
(2018)
Business documents contain valuable information. In order to comply with legal requirements, to serve as organisational knowledge and to prevent risks, they need to be managed. However, changes in the technology with which documents are produced have introduced new kinds of documents and new ways of interacting with them. The Web 2.0 led to the development of Enterprise Collaboration Systems (ECS), which enable employees to use wiki, blog or forum applications for conducting their business. Part of the content produced in ECS can be called Social Business Documents (SBD). Compared to traditional digital documents, SBD are different in their nature and structure: they are, for example, less well-structured and do not follow a strict lifecycle. These characteristics bring along new management challenges. However, the research literature currently lacks investigations into the characteristics of SBD, their peculiarities and their management.
This dissertation uses document theory and documentary practice as theoretical lenses to investigate the new challenges of the long-term management of SBD in ECS. Using an interpretative, exploratory, mixed methods approach, the study comprises two major research parts. First, the nature and structure of Social Business Documents are addressed by analysing them within four different systems using four different modelling techniques each. The findings are used to develop general SBD information models, outlining the basic underlying components, structure, functions and included metadata, as well as a broad range of SBD characteristics. The second phase comprises a focus group, a case study including in-depth interviews, and a questionnaire, all conducted with industry representatives. The focus group identified that the kind of SBD used for specific content and the actual place of storage differ between organisations, and that there are currently almost no management practices for SBD at hand. The case study provided deep insights into general document management activities and investigated requirements, challenges and actions for managing SBD. Finally, the questionnaire consolidated and deepened the previous findings, providing insights into the value of SBD, their current management practices, and management challenges and needs. Although all participating organisations store information worth managing in SBD, most do not address them with management activities, and many challenges remain.
Together, the investigations contribute to both practice and theory. The contribution to practice is summarised in a framework addressing the long-term management of Social Business Documents, which identifies and outlines the requirements for, challenges of, and actions for SBD management, and indicates the dependencies between these aspects. Furthermore, the findings advance theory within documentary practice by discussing the extension of document types to include SBD. Existing problems are outlined along the definitions of records, and the newly possible characteristics of documents emerging through Social Business Documents are taken into account.
Companies try to utilise Knowledge Management (KM) to gain more efficiency and effectiveness in business. The major problem is that most KM projects are rarely, if ever, based on sound analyses or established theories about KM; often there is a big gap between the expectations and the real outcome of such KM initiatives. The research question to be answered is therefore: What challenges arise in KM projects, which KM requirements can be derived from them, and which recommendations support the goal of meeting these requirements? As a theoretical foundation, a set of KM frameworks is examined. Subsequently, KM challenges from the literature are analysed, and best practices from case studies are used to provide recommendations for action on these challenges. The main outcome of this thesis is a best practice guideline, which allows Chief Knowledge Officers (CKOs) and KM project managers to examine the challenges mentioned in this thesis closely and to find a suitable method to master them in an optimal way. This guideline shows that KM can be influenced positively and negatively in a variety of ways, that mastering Knowledge Management in a company is a big and far-reaching venture, and that technology, in particular Information Technology (IT), is only a part of the big picture.
The Internet of Things (IoT) has recently developed from the far-away vision of ubiquitous computing into very tangible endeavors in politics and economy, implemented in expensive preparedness programs. Experts predict considerable changes in business models that organizations need to address in order to respond to competition. Although there is a need to develop strategies for the upcoming transformations, the organizational change literature has not yet turned to the specific change related to this new technology. This work aims at investigating IoT-related organizational change by identifying and classifying different change types. It therefore combines the methodological approach of grounded theory with a discussion and classification of the identified changes, informed by a structured literature review of the organizational change literature. This includes a meta-analysis of case studies using a qualitative, exploratory coding approach to identify categories of organizational change related to the introduction of IoT. Furthermore, a comparison of the identified categories to former technology-related change is provided, using the examples of Electronic Business (e-business), Enterprise Resource Planning (ERP) systems, and Customer Relationship Management (CRM) systems. As its main result, this work develops a comprehensive model of IoT-related business change. The model presents two main themes of change, indicating that personal smart things will transform businesses: users will rely on more personal devices that suggest and schedule actions for them and try to avoid hazards. At the same time, the availability of information in organizations will further increase to a state where information is available ubiquitously, ultimately enabling access to real-time information about objects and persons at any time and from any place. As a secondary result, this work gives an overview of concepts of technology-related organizational change in the academic literature.
This thesis explores the possibilities of probabilistic process modelling for Computer Supported Cooperative Work (CSCW) systems in order to predict the behaviour of the users of such systems. Toward this objective, the applicability, advantages, limitations and challenges of probabilistic modelling are examined in the context of CSCW systems. Finally, as the primary goal, seven models are created and examined to show the feasibility of probabilistic process discovery and of predicting users' behaviour in CSCW systems.
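The core idea of predicting user behaviour from event logs can be sketched with a first-order transition model learned from observed action sequences. This is a deliberate simplification of the probabilistic finite automata the thesis works with (a real PFA also distinguishes states from emitted symbols); action names here are invented for illustration.

```python
from collections import Counter, defaultdict

def train(traces):
    """Learn P(next action | current action) from a list of event traces."""
    counts = defaultdict(Counter)
    for trace in traces:
        for current, nxt in zip(trace, trace[1:]):
            counts[current][nxt] += 1
    model = {}
    for action, c in counts.items():
        total = sum(c.values())
        model[action] = {nxt: n / total for nxt, n in c.items()}
    return model

def predict_next(model, action):
    """Return the most probable follow-up action, or None if unseen."""
    dist = model.get(action)
    return max(dist, key=dist.get) if dist else None

# Illustrative CSCW event logs (action names are hypothetical)
logs = [["login", "edit", "save"], ["login", "edit", "edit", "logout"]]
model = train(logs)
```

With these logs, `predict_next(model, "login")` yields `"edit"`, since every observed "login" was followed by an edit; process discovery in the thesis proceeds from the same kind of transition statistics.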
The extensive literature in the data visualization field indicates that creating effective data visualizations requires the designer to have a large set of skills from different fields (such as computer science, user experience and business expertise). However, there is a lack of guidance about the visualization process itself. This thesis investigates the different processes for creating data visualizations and develops an integrated framework to guide this process, enabling users to create more useful and usable data visualizations. First, existing frameworks in the literature are identified, analyzed and compared. From this analysis, eight views of the visualization process are developed; these views represent the set of activities which should be carried out in the visualization process. A preliminary integrated framework is then developed based on these findings. This new integrated framework is tested in the field of Social Collaboration Analytics on an example from the UniConnect platform. Lastly, the integrated framework is refined and improved based on the test results, with the help of diagrams, visualizations and textual descriptions. The results show that the visualization process is not a waterfall process but an iterative methodology with distinct phases of work, demonstrating how to address the eight views with different levels of stakeholder involvement. The findings form the basis for a visualization process which can be used in future work to develop a fully functional methodology.
With global and distributed project teams being increasingly common, Collaborative Project Management is becoming the prevalent paradigm for work in most organisations. Software has for many years been one of the most used tools for supporting Project Management, and with the focus on Collaborative Project Management, accompanied by the emergence of Enterprise Collaboration Systems (ECS), Collaborative Project Management Software (CPMS) is gaining increased attention. This thesis examines the capabilities of CPMS for the long-term management of information, which includes not only the management of files within these systems but the management of all types of digital business documents, particularly social business documents. Previous research shows that social content in collaboration software is often poorly managed, which poses challenges to meeting performance and conformance objectives in a business. Based on literature research, requirements for the long-term management of information in CPMS are defined, and seven CPMS tools are analysed regarding the content they contain and the functionalities they offer for its long-term management. The study shows that CPMS by and large are not able to meet the long-term information management needs of an organisation on their own, and that only the tools geared towards enterprise customers have sufficient capabilities to support the implementation of an Enterprise Information Management strategy.
The Internet of Things (IoT) is a network of addressable, physical objects that contain embedded sensing, communication and actuating technologies to sense and interact with their environment (Geschickter 2015). Like every novel paradigm, the IoT sparks interest throughout all domains, both in theory and practice, resulting in the development of systems pushing technology to its limits. These limits become apparent when having to manage an increasing number of Things across various contexts. A plethora of IoT architecture proposals have been developed, and prototype products, such as IoT platforms, have been introduced. However, each of these architectures and products applies its very own interpretation of an IoT architecture and its individual components, so that the IoT is currently more an Intranet of Things than an Internet of Things (Zorzi et al. 2010). Thus, this thesis aims to develop a common understanding of the elements forming an IoT architecture and to provide high-level specifications in the form of a Holistic IoT Architecture Framework.
Design Science Research (DSR) is used in this thesis to develop the architecture framework based on the pertinent literature. The development of the Holistic IoT Architecture Framework includes the identification of two new IoT Architecture Perspectives that became apparent during the analysis of the IoT architecture proposals identified in the extant literature. While applying these novel perspectives, the need for a new component for the architecture framework, which was merely implicitly mentioned in the literature, became obvious as well. The components of various IoT architecture proposals as well as the novel component, the Thing Management System, were combined, consolidated and related to each other to develop the Holistic IoT Architecture Framework. Subsequently, it was shown that the specifications of the architecture framework are suitable to guide the implementation of a prototype.
This contribution provides a common understanding of the basic building blocks, actors and relations of an IoT architecture.
The purpose of this research is to examine various existing cloud-based Internet of Things (IoT) development platforms and to evaluate one platform (IBM Watson IoT) in detail using a use case scenario. The IoT is an emerging technology with the vision of interconnecting the virtual world (e.g. clouds, social networks) and the physical world (e.g. devices, cars, fridges, people, animals) through Internet technology. For example, the IoT concept of smart cities, which aims to improve the efficiency and development of business, social and cultural services in the city, can be realised using sensors, actuators, clouds and mobile devices (IEEE, 2015). A sensor (e.g. a temperature sensor) in a building (physical world) can send real-time data to an IoT cloud platform (virtual world), where it can be monitored, stored, analysed, or used to trigger some action (e.g. turning on the cooling system in the building if the temperature exceeds a threshold). Although the IoT creates vast opportunities in different areas (e.g. transportation, healthcare, the manufacturing industry), it also brings challenges such as standardisation, interoperability, scalability, security and privacy. In this research report, IoT concepts and related key issues are discussed.
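The cooling example above amounts to a cloud-side trigger rule evaluated against each incoming reading. The following sketch shows the shape of such a rule in plain Python; the event fields, building name and 30 °C threshold are assumptions for illustration, not specifics of any platform.

```python
# Illustrative cloud-side trigger: react to incoming sensor readings.
# Field names and the 30 °C threshold are assumptions for this sketch.
COOLING_THRESHOLD_C = 30.0

def on_sensor_event(event, actions):
    """Evaluate one incoming reading and append any triggered action."""
    if event["sensor"] == "temperature" and event["value"] > COOLING_THRESHOLD_C:
        actions.append({"device": event["building"], "command": "cooling_on"})

actions = []
for reading in [
    {"sensor": "temperature", "building": "HQ", "value": 24.5},
    {"sensor": "temperature", "building": "HQ", "value": 31.2},
]:
    on_sensor_event(reading, actions)
# only the 31.2 °C reading crosses the threshold and triggers cooling
```

On a real platform such rules are usually expressed in the platform's own rule or flow editor rather than as application code, but the evaluation logic is the same.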
The focus of this research is to compare various cloud-based IoT platforms in order to understand the business and technical features they offer. The cloud-based IoT platforms from IBM, Google, Microsoft, PTC and Amazon have been studied.
To design the research, the Design Science Research (DSR) methodology has been followed, and to model the real-time IoT system, the IoT-A modelling approach has been used.
The comparison of different cloud-based IoT development platforms shows that all of the studied platforms provide basic IoT functionalities such as connecting IoT devices to the cloud-based platform, collecting data from the devices, data storage, and data analytics. However, IBM's IoT platform appears to have an edge over the other platforms studied in this research because of its integrated run-time environment, which also makes it more developer-friendly. Therefore, IBM Watson IoT for Bluemix was selected for further examination of its capabilities. The IBM Watson IoT for Bluemix offerings include analytics, risk management, connectivity and information management. A use case was implemented to assess the capabilities the IBM Watson IoT platform offers, and digital artifacts (i.e. applications) were produced to evaluate IBM's IoT solution. The results show that IBM offers a very scalable, developer- and deployment-friendly IoT platform. Its cognitive, contextual and predictive analytics provide promising functionality that can be used to gain insights from the IoT data transmitted by sensors and other IoT devices.
This thesis provides an overview of current topics and the influence of mobile components on Enterprise Content Management (ECM). Through a literature review, the core topics of enterprise mobility and ECM were identified and projected onto the context of using mobile apps within the ECM environment. An analysis of three ECM systems and their mobile software led to an understanding of the functionalities and capabilities mobile systems provide in the ECM environment. These findings lead to a better understanding of the usage of mobile Enterprise Content Management. The thesis focuses on the most important topics which need to be considered for the usage and adoption of mobile apps in ECM.
We are entering the 26th year since the World Wide Web (WWW) became reality. Since the birth of the WWW in 1990, the Internet, and with it websites, have changed the way businesses compete, reshaping products, services and even entire markets.
Gathering and analysing visitor traffic on websites can therefore provide crucial information for understanding customer behavior and numerous other aspects.
Web Analytics (WA) tools offer a wealth of diverse functionality, which calls for complex decision-making in information management. Website operators implement Web Analytics tools such as Google Analytics to analyse their website, identifying web usage in order to optimise website design and management. The gathered data leads to emergent knowledge, which provides new marketing opportunities and can be used to improve business processes and to understand customer behavior in order to increase profit. Moreover, Web Analytics plays a significant role in measuring performance and has therefore become an important component of web-based environments for making business decisions.
However, many small and medium-sized enterprises try to keep up with the web business competition but do not have the equivalent resources in manpower and knowledge to stand the pace; some therefore forgo Web Analytics entirely.
This research project aims to develop a Web Analytics framework to assist small and medium-sized enterprises (SMEs) in making better use of Web Analytics. By identifying the business requirements of SMEs and connecting them to the functionality of Google Analytics, a Web Analytics framework with accompanying guidelines is developed, which guides SMEs on how to proceed in using Google Analytics to achieve actionable outcomes.
This research examines information audit methodologies and information capturing methods for enterprise social software, which are an elementary part of the audit process. Information auditing lacks a standardized definition and methodology because the scope of the audit process is diversified and depends on the organization undertaking the audit. The benefits of information auditing, and the potential challenges of Enterprise 2.0 the audit can overcome, are comprehensive and provide a major incentive for managers to conduct an audit. Information asset registers, as a starting point for information auditing, do not specifically focus on social software assets. This research project therefore combines asset registers from different areas to create a new register suitable for the requirements of Enterprise 2.0. The necessary adaptations caused by the new character of the assets are minor. The case study applying the asset register for the first time, however, reveals several problematic areas for information auditors completing the register. Rounding off the thesis, a template is developed for setting up new work spaces on enterprise social software systems with appropriate metadata, taking into account the meaningful metadata discovered in the asset register.
In this work a framework is developed that is used to create an evaluation scheme for text processing tools. The evaluation scheme is developed using a model-dependent software evaluation approach, and the focus of the model-dependent part is the text-processing process, which is derived from the Conceptual Analysis Process developed in the GLODERS project. As input data, a German court document is used containing two incidents of extortion racketeering which happened in 2011 and 2012. The evaluation of six different tools shows that one tool offers excellent results for the given dataset when compared to manual results: it is able to identify and visualize relations between concepts without any additional manual work. Other tools also offer good results with minor drawbacks. The biggest drawback for some tools is the unavailability of models for the German language; they can perform automated tasks only on English documents. Nonetheless, some tools can be extended with custom code, which allows users with development experience to apply additional methods.
The aim of this paper is to identify and understand the risks and issues companies are experiencing from the business use of social media and to develop a framework for describing and categorising those social media risks. The goal is to contribute to the evolving theorisation of social media risk and to provide a foundation for the further development of social media risk management strategies and processes. The study findings identify thirty risk types organised into five categories (technical, human, content, compliance and reputational). A risk-chain is used to illustrate the complex interrelated, multi-stakeholder nature of these risks and directions for future work are identified.
Iterative Signing of RDF(S) Graphs, Named Graphs, and OWL Graphs: Formalization and Application
(2013)
When publishing graph data on the web, such as vocabularies using RDF(S) or OWL, one has only limited means to verify the authenticity and integrity of the graph data. Today's approaches require a high signature overhead and do not allow for an iterative signing of graph data. This paper presents a formally defined framework for signing arbitrary graph data provided in RDF(S), Named Graphs, or OWL. Our framework supports signing graph data at different levels of granularity: minimum self-contained graphs (MSG), sets of MSGs, and entire graphs. It supports iterative signing of graph data, e.g., when different parties provide different parts of a common graph, and allows for signing multiple graphs. Both can be done with a constant, low overhead for the signature graph, even when iteratively signing graph data.
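The core idea of iterative graph signing can be illustrated with a toy sketch, which is not the paper's formal framework: canonicalise a triple set so the digest is independent of serialisation order, hash it, and have each party sign the graph together with the previous party's signature so that each step covers all earlier ones. HMAC with a per-party key stands in for a real public-key signature scheme to keep the sketch dependency-free.

```python
import hashlib
import hmac

def canonical_digest(triples):
    """Order-independent digest of an RDF-like triple set (toy canonicalisation)."""
    canon = "\n".join(sorted(" ".join(t) for t in triples))
    return hashlib.sha256(canon.encode()).hexdigest()

def sign(message, key):
    """Stand-in signature: HMAC-SHA256 instead of a public-key scheme."""
    return hmac.new(key, message.encode(), hashlib.sha256).hexdigest()

def iterative_sign(graph, prev_signature, key):
    """Sign a graph together with an earlier party's signature, so each
    signing step covers all previous ones with constant-size overhead."""
    combined = canonical_digest(graph) + (prev_signature or "")
    return sign(combined, key)

# Two parties contribute parts of a common graph (example data)
g1 = [("ex:alice", "ex:knows", "ex:bob")]
g2 = [("ex:bob", "ex:age", '"42"')]
sig1 = iterative_sign(g1, None, b"party-a-key")
sig2 = iterative_sign(g2, sig1, b"party-b-key")
```

Because `sig2` is computed over `sig1`, verifying the second signature also commits to the first party's contribution, while each signature remains a single fixed-size value.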
Virtual Goods + ODRL 2012
(2012)
This is the 10th international workshop on technical, economic, and legal aspects of business models for virtual goods, incorporating the 8th ODRL community group meeting. This year we did not call for completed research results but invited PhD students to present and discuss their ongoing research work. In the traditional international group of virtual goods and ODRL researchers, we discussed PhD research from Belgium, Brazil, and Germany. The topics focused on research questions about rights management on the Internet and e-business stimulation. At the center of rights management stands the concept of a formal policy expression that can be used for human-readable policy transparency as well as for machine-readable support of policy-conformant system behavior, up to automatic policy enforcement. ODRL has proven to be an ideal basis for policy expressions, not only for digital copyrights but also for the more general "Policy Awareness in the World of Virtual Goods". In this sense, policies support the communication of virtual goods, and they are themselves a virtualization of rule-governed behavior.
Cloud Computing is a topic that has gained momentum in the last years. Current studies show that an increasing number of companies is evaluating the promised advantages and considering making use of cloud services. In this paper we investigate the phenomenon of cloud computing and its importance for the operation of ERP systems. We argue that the phenomenon of cloud computing could lead to a decisive change in the way business software is deployed in companies. Our reference framework contains three levels (IaaS, PaaS, SaaS) and clarifies the meaning of public, private and hybrid clouds. The three levels of cloud computing and their impact on ERP systems operation are discussed. From the literature we identify areas for future research and propose a research agenda.
This paper describes results of the simulation of social objects: the dependence of schoolchildren's professional abilities on their personal characteristics. The simulation tool is artificial neural network (ANN) technology. Results of a comparison of the time required for training the ANN and for calculating the weight coefficients with serial and parallel algorithms, respectively, are presented.
An estimation of the number of multiplication and addition operations for training artificial neural networks by means of sequential and parallel algorithms on a computer cluster is carried out, and the efficiency of these algorithms is evaluated. The multilayer perceptron, the Volterra network and the cascade-correlation network are used as artificial neural network structures. Different methods of non-linear programming, both gradient and non-gradient, are used for the calculation of the weight coefficients.
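For the multilayer perceptron case, the kind of operation count the paper estimates can be sketched as a back-of-the-envelope calculation. The "backward pass costs roughly twice the forward pass" factor used below is a common rule of thumb, stated here as an assumption rather than the paper's exact accounting (which also depends on the chosen optimisation method).

```python
def forward_madds(layer_sizes):
    """Multiply-add operations for one forward pass through dense layers
    (biases and activation-function costs are ignored in this sketch)."""
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

def training_madds(layer_sizes, samples, epochs, backward_factor=2):
    """Total multiply-adds for training: forward plus an assumed
    backward-pass cost of backward_factor times the forward pass."""
    per_sample = forward_madds(layer_sizes) * (1 + backward_factor)
    return per_sample * samples * epochs
```

For a small 4-8-3 perceptron, `forward_madds([4, 8, 3])` gives 4·8 + 8·3 = 56 multiply-adds per sample; over 1000 samples and 10 epochs the training estimate is 56 · 3 · 1000 · 10 = 1,680,000. It is this per-sample work that parallel algorithms on a cluster distribute.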
The paper is devoted to the problem of assessing the quality of a medical electronic service. A variety of quality dimensions and factors, as well as methods and models applied in different fields for assessing service quality, are reviewed. The basic aspects, requirements and peculiarities of implementing medical electronic services are investigated. The results of the analysis, together with the set of information models developed for this paper describing the processes of assessing the quality of the electronic service "Booking an appointment with a physician", allowed us to describe the methodology and to state the problem of assessing the quality of this service.
The estimation of various social objects is necessary in different fields of social life, science, education, etc. Such estimation is usually used for forecasting, for evaluating different properties and for other goals in complex man-machine systems. At present this estimation is possible by means of computer and mathematical simulation methods, which involves significant difficulties, such as: the time-distributed process of receiving information about the object; the determination of a corresponding mathematical device and the structure identification of the mathematical model; the approximation of the mathematical model to real data, with generalization and parametric identification of the mathematical model; and the identification of the structure of the links of the real social object. The solution of these problems is impossible without a special intellectual information system which combines different processes and allows predicting the behaviour of such an object. However, most existing information systems address only one special problem. From this point of view, the development of a more general technology for designing such systems is very important. The technology of developing an intellectual information system for estimating and forecasting the professional ability of respondents in the sphere of education is a concrete example of such a technology. Job orientation is necessary and topical in present economic conditions: it helps to solve the problem of the expediency of investments in a certain sphere of education. Scientifically validated combined diagnostic methods of job orientation are necessary to carry out professional selection in higher education establishments. The requirements of modern society are growing, and the earlier developed techniques are unable to meet them sufficiently; all of these techniques lack the ability to account for all necessary professional and personal characteristics.
Therefore, it is necessary to use a system of various tests, and thus to develop new methods of job orientation for entrants. An information model of the job orientation process is needed for this purpose. It would therefore be desirable to have an information system capable of giving recommendations concerning the choice of a trade on the basis of the complex personal characteristics of entrants.
Multi-agent systems are a mature approach to modelling complex software systems by means of Agent-Oriented Software Engineering (AOSE). However, their application is not widely accepted in mainstream software engineering. In parallel, the interdisciplinary field of Agent-based Social Simulation (ABSS) finds increasing recognition beyond the purely academic realm, which is starting to draw attention from the mainstream of agent researchers. This work analyzes factors to improve the uptake of AOSE as well as characteristics which separate the two fields, AOSE and ABSS, in order to understand the gap between them. Based on the efficiency-oriented micro-agent concept of the Otago Agent Platform (OPAL), we have constructed a new, modern and self-contained micro-agent platform called µ². The design takes technological trends into account and integrates representative technologies, such as the functionally inspired JVM language Clojure (with its transactional memory), asynchronous message-passing frameworks, and the mobile application platform Android. The mobile version of the platform shows an innovative approach that allows direct interaction between Android application components and micro-agents by mapping their related internal communication mechanisms. This empowers micro-agents to exploit virtually any capability of mobile devices for intelligent agent-based applications and robotics, or simply to act as a distributed middleware. Additionally, relevant platform components for the support of social simulations are identified and partially implemented. To show the usability of the platform for simulation purposes, an interaction-centric scenario representing group-formation processes in a multi-cultural context is provided. The scenario is based on Hofstede's concept of 'Cultural Dimensions'. It not only confirms the applicability of the platform for simulations but also reveals interesting patterns for culturally augmented in-group and out-group agents.
This explorative research advocates the potential of micro-agents as a powerful general system-modelling mechanism while bridging mobile and desktop systems. The results stimulate future work on the micro-agent concept itself, the suggested platform, and the deeper exploration of mechanisms for seamless interaction of micro-agents with mobile environments. Finally, the further elaboration of the simulation model, as well as its use to augment intelligent agents with cultural aspects, offers promising perspectives for future research.
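The asynchronous message passing between micro-agents described in the abstract can be illustrated with a minimal sketch. All names here are hypothetical and the real µ² platform builds on Clojure and the JVM; this only shows the underlying pattern: each agent owns a mailbox, and sending a message means enqueuing it and returning immediately.

```python
import queue

class MicroAgent:
    """A minimal micro-agent: a mailbox plus a message handler.
    Illustrative only -- the actual µ² platform is JVM/Clojure-based."""
    def __init__(self, name, registry):
        self.name = name
        self.inbox = queue.Queue()
        self.registry = registry     # shared name -> agent lookup
        registry[name] = self

    def send(self, target, payload):
        # Asynchronous message passing: enqueue at the target and return.
        self.registry[target].inbox.put((self.name, payload))

    def handle(self, sender, payload):
        raise NotImplementedError

    def run_once(self):
        # Process the next pending message from the mailbox.
        sender, payload = self.inbox.get()
        return self.handle(sender, payload)

class EchoAgent(MicroAgent):
    def handle(self, sender, payload):
        return f"{self.name} received {payload!r} from {sender}"

registry = {}
a = EchoAgent("sensor-agent", registry)
b = EchoAgent("ui-agent", registry)
a.send("ui-agent", "battery-low")
print(b.run_once())  # -> ui-agent received 'battery-low' from sensor-agent
```

Mapping Android's internal communication onto such mailboxes is what would let application components appear to micro-agents as ordinary message-passing peers.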
Mobile payment has long been a payment option in the market and was predicted to become a widely used payment method. Over the years, however, the market penetration of mPayments has remained relatively low, despite it having all the characteristics required of a convenient payment method. The primary reason cited for this is a lack of customer acceptance, mainly caused by the lack of security perceived by the end-user. Although biometric authentication is not a new technology, it is experiencing a revival in the light of present-day terror threats and increased security requirements in various industries. The application of biometric authentication to mPayments is analysed here, and a suitable biometric authentication method for use with mPayments is recommended. The issue of enrolment and the human and technical factors to be considered are discussed, and the STOF business model is applied to a BiMoP (biometric mPayment) application.
In this bachelor thesis, a back office for the electronic version of the European Accident Statement (Europäischer Schadensbericht) was developed. A mobile client running on a Windows Mobile phone, as well as a police client, had already been created in other theses. These access the back office to obtain data such as car data (make, model, year of construction, and images of a 3D model of the car) for a given licence plate, or the personal data of the respective car owner. The mobile client also sends the accident file to the back office so that the data about an accident can be stored there and processed further. The goal of this thesis was to develop an extensible, modular system that can later be supplemented with additional modules to provide new functions. Each of these modules can store arbitrary data in a database, and can also query and modify that data, without the relational schema of the database having to be changed.
The development of a pan-European public E-Procurement system is an important target of the European Union to enhance the efficiency, transparency and competitiveness of public procurement procedures conducted within the European single market. A great obstacle for cross-border electronic procurement is the heterogeneity of national procurement systems in terms of technical, organizational and legal differences. To overcome this obstacle, the European Commission funds several initiatives that contribute to the aim of achieving interoperability for pan-European public procurement. Pan European Public Procurement OnLine (PEPPOL) is one of these initiatives; it aims at piloting an interoperable pan-European E-Procurement solution to support businesses and public purchasing entities from different member states in conducting their procurement processes electronically.

As interoperability and inter-connection of distributed heterogeneous information systems are the major requirements in the European procurement domain, and the VCD sub-domain in particular, service-oriented architecture (SOA) seems to provide a promising approach to realize such an architecture, as it promotes loose coupling and interoperability. This master thesis therefore discusses the SOA approach and how its concepts, methodologies and technologies can be used for the development of interoperable IT systems for electronic public procurement. This discussion is enhanced through a practical application of the discussed SOA methodologies by conceptualizing and prototyping a sub-system derived from the overall system domain of the Virtual Company Dossier. For that purpose, important aspects of interoperability and related standards and technologies are examined and put into the context of public electronic procurement. Furthermore, the paradigm behind SOA is discussed, including the derivation of a top-down development methodology for service-oriented systems.
This Thesis contributes by reporting on the current state of diffusion of collaboration information technology (CIT). The investigation concludes, with a high degree of certainty, that today we have a "satisfactory" diffusion level of some level-A CITs (mostly e-Mail, distantly followed by Audio Conferencing), and a "dissatisfactory" diffusion level of higher-level CITs (i.e. those requiring significant collaboration and cooperation among users, like Meeting Support Systems, Group Decision Support Systems, etc.). The potential benefits of the latter seem to be far from fully realised due to lack of user acceptance. This conclusion has gradually developed along the research cycle: it was suggested by Empirical Study I, and tested through Empirical Studies II and III. An additional, unplanned and rather interesting finding from this study has been the recognition of a large body of [mostly business] reporting on numerous Web 2.0, user-community-produced collaboration technologies (most of them belonging to the category of "social software") and their metamorphosis from autonomous, "bottom-up" solutions into enterprise-supported infrastructures. Another contribution of this Thesis (again suggested by Empirical Study I, and tested through Empirical Studies II and III) pertains to the "process structure" of CIT diffusion. I have found that collaboration technology has historically diffused following two distinct (interdependent but orthogonal) diffusion paths: top-down (authority-based) and bottom-up. The authority-based diffusion path seems to be characterised by efforts aimed at "imposing" technologies on employees, the primary concern being to make sure that technology seamlessly and easily integrates into the organisational IT infrastructure. The bottom-up diffusion trail, on the other hand, seems to be successful. The contribution of this investigation may be summarised as threefold: 1.
This investigation consolidates most of the findings to date pertaining to CIT adoption and diffusion that have been produced by the CIT research community. Thus, it tells a coherent story of the dynamics of the community's focus and the collective wisdom gathered over a period of (at least) one decade. 2. This work offers a meaningful framework within which to analyse existing knowledge, and indeed extends that knowledge base by identifying persistent problems of collaboration technology acceptance, adoption and diffusion. These problems have been repeatedly observed in practice, though the pattern does not seem to have been recognised and internalised by the community. Many of these problems were observed in cases of CIT use a decade ago, five years ago, and three years ago, and continue to be observed today in structurally the same form, despite what is unarguably "rapid technological development". This gives me reason to believe that at least some of the persistent problems of CIT diffusion can be hypothesised as "determining factors". My contribution here is to identify these factors, discuss them in detail, and thus tackle the theme of CIT diffusion through a structured historical narrative. 3. Through my contribution (2) above, I characterise a "knowledge-action gap" in the field of CIT and illuminate a potential path through which the research community might hope to bridge this gap. The gap may be operationalised as the cognitive distance between CIT "knowledge" and CIT "action".
The thesis at hand evaluates Open Source Business Process Management (BPM) systems in the context of the R4eGov1 project. The provision of concepts and tools to support and enable interoperability in pan-European networks of public administrations is one of the major objectives that R4eGov is aiming at. A strong focus thereby lies on the interoperability of cross-organizational processes from the viewpoints of modeling, execution and monitoring. BPM can increase the effectiveness and efficiency of cross-organizational processes by restructuring them towards the needs of the entities involved. BPM depends on BPM systems that combine technologies for process modeling, business process analysis and execution, along with their integration into adequate runtime environments and rule engines. The evaluation performed within the thesis investigates how far BPM systems can support several requirements of interoperability that have been developed by the R4eGov project. It also analyzes those BPM systems according to generic requirements on BPM and software tools. The investigation is built upon common BPM theories and standards for modeling business processes. It describes the origin and interdependencies of BPM and Workflow Management (WfM), highlighting similarities and differences from the technological and historical perspective. Moreover, it introduces web service standards and technologies that are used to build service-oriented architectures allowing greater flexibility in BPM. In addition, the thesis introduces methods and best practices for evaluating software tools. It contains an evaluation framework for BPM tools based on the software product evaluation standard ISO/IEC 14598. The evaluation framework comprises the definition of an R4eGov scenario and a catalogue of criteria for evaluating a set of selected Open Source BPM systems.
The definition of the catalogue of criteria is built upon generic requirements on BPM systems and those specific to R4eGov. The chosen methods and the core elements of the evaluation framework are applied to the selected BPM systems Intalio BPMS, NetBeans IDE, and JBoss jBPM. Finally, the results of the applied R4eGov scenario and of the applied catalogue of criteria are discussed, highlighting individual strengths and weaknesses of the systems.
Computers, and especially computer networks, have become an important part of our everyday life. Almost every device we use is equipped with a computer or microcontroller. Recent technology has further boosted this development through the miniaturization of microcontrollers, which are used to either process or collect data. Miniature sensors may sense and collect huge amounts of information from nature, from the environment or from our own bodies. To process and distribute the data of these sensors, wireless sensor networks (WSNs) have been developed over the last couple of years. Several microcontrollers are connected over a wireless connection and are able to collect, transmit and process data for various applications. Today, several WSN applications are available, such as environment monitoring, rescue operations, habitat monitoring and smart home applications. The research group of Prof. Elaine Lawrence at the University of Technology, Sydney (UTS) focuses on mobile health care with WSNs. Small sensors are used to collect vital data. This data is sent over the network to be processed at a central device such as a computer, laptop or handheld device. The research group has developed several prototypes for mobile health care. This thesis deals with enhancing and improving the latest prototype, which is based on CodeBlue, a hardware and software framework for medical care.
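The sense-transmit-process pattern of such a health-care WSN can be sketched as follows. The class and method names are hypothetical illustrations of the pattern, not CodeBlue's actual API: sensor nodes sample vital data and a central base station aggregates readings and flags anomalies.

```python
import statistics

class VitalSensorNode:
    """A simulated sensor node that samples a patient's heart rate.
    Illustrative only -- real nodes run on microcontrollers over radio links."""
    def __init__(self, node_id):
        self.node_id = node_id

class BaseStation:
    """Central device (PC, laptop or handheld) that aggregates readings."""
    def __init__(self, alarm_threshold=95):
        self.readings = {}               # node_id -> list of bpm values
        self.alarm_threshold = alarm_threshold

    def receive(self, node_id, bpm):
        # Store each transmitted reading per sending node.
        self.readings.setdefault(node_id, []).append(bpm)

    def mean_rate(self, node_id):
        return statistics.mean(self.readings[node_id])

    def alarms(self):
        # Nodes whose latest reading exceeds the alarm threshold.
        return [n for n, r in self.readings.items()
                if r[-1] > self.alarm_threshold]

station = BaseStation(alarm_threshold=95)
station.receive("patient-1", 80)
station.receive("patient-1", 100)   # elevated -> should raise an alarm
station.receive("patient-2", 70)
print(station.mean_rate("patient-1"), station.alarms())
```

A real deployment adds radio communication, multi-hop routing and energy management, which is exactly where frameworks like CodeBlue come in.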
The internet is becoming more and more important in daily life. Fundamental changes can be observed in the private sector as well as in the public sector. In the course of this, the active involvement of citizens in political planning procedures is increasingly supported electronically. The expectations culminate in the assumption that information and communication technology (ICT) can enhance civic participation and reduce disenchantment with politics. Out of these expectations, many e-participation projects were initiated in Germany. Initiatives were established, e.g. the "Initiative eParticipation", which provided numerous incentives for electronic participation in policy and administration in order to strengthen decision-making processes with internet-supported participation practices. This thesis consists of two major parts. In the first part, definitions of the essential terms are presented, and the position of e-participation within the dimensions of e-business is pointed out. In order to explain e-participation, the basics of classical offline participation are presented. It will be shown that a change is in progress, and not only because of the deployment of ICT. Subsequently, a framework to characterize e-participation is presented. The European Union is encouraging the implementation of e-participation, so the city of Koblenz should be no exception. But what is the current situation in Koblenz? To answer this question, the status quo was examined with the help of a survey among the citizens of Koblenz, which was developed, conducted and evaluated; this constitutes the second major part of the thesis.
Public electronic procurement (eProcurement), and here electronic sourcing (eSourcing) in particular, is almost certainly on the agenda when eGovernment experts meet. Not surprisingly, eProcurement is the first high-impact service to be addressed in the European Union's recent Action Plan. This is mainly due to the fact that public procurement makes up almost 20% of Europe's GDP and therefore holds a huge saving potential. To some extent this potential lies in the common European market, since effective cross-border eSourcing solutions can open many doors, both for buyers and suppliers. To achieve this, systems, processes and tools need to be adaptable and transferable, as well as able to communicate with each other. In a word, they need to be interoperable. In many relevant domains, interoperability has reached a very positive level: standards have been established and workflows put in place. In other domains, however, there is still a long road ahead. It is therefore crucial to define requirements for such interoperable eSourcing systems and to identify the progress made in research and practice.
SOA-Security
(2007)
This paper is part of the ASG project (Adaptive Services Grid) and addresses IT security issues of service-oriented architectures. It defines a service-oriented security concept, explores the SOA security challenge, describes the existing WS-Security standard, and takes a first step toward a survey of best-practice examples. In particular, the ASG middleware platform technology (JBossWS) is analyzed with respect to its ability to handle security functions.
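The core idea behind WS-Security, message-level rather than transport-level protection, can be illustrated with a deliberately simplified sketch: the sender attaches a keyed digest of the SOAP body, and the receiver recomputes and compares it to detect tampering. Note this is an analogue only; real WS-Security uses XML Signature (XML-DSig) and token profiles, not a raw HMAC over the serialized body.

```python
import base64
import hashlib
import hmac

def sign_body(body: str, key: bytes) -> str:
    """Compute a keyed digest over the message body (simplified analogue
    of a WS-Security signature; the real standard uses XML-DSig)."""
    digest = hmac.new(key, body.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

def verify_body(body: str, key: bytes, signature: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign_body(body, key), signature)

key = b"shared-secret"
body = "<soap:Body><getQuote symbol='ASG'/></soap:Body>"
sig = sign_body(body, key)
assert verify_body(body, key, sig)                          # intact message
assert not verify_body(body.replace("ASG", "XXX"), key, sig)  # tampering detected
```

Because the protection travels with the message itself, it survives intermediaries such as routers or brokers, which is precisely why SOA middleware like JBossWS must support it end to end.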