Program in Detail


The I4CS 2024 program flyer

You will find all information and details about the conference in our program flyer.

Download

The program is still subject to change.


Invited Speakers

Have a look at the details on our invited speakers.


Session 1: Quantum Computing

Wednesday, June 12, 2024, 10:30 a.m. - Chair: Leendert W. M. Wienhofen

Frank Phillipson

Leveraging Quantum Technology to Enhance Community Services and Supportive ICT Infrastructure

Abstract: This article explores the transformative potential of quantum technology in community services, emphasising quantum sensing, quantum computing algorithms, and quantum communication. Community services, spanning healthcare, education, and environmental conservation, are crucial for resident well-being. Quantum technology, rooted in principles like superposition and entanglement, is presented as a game-changer. Quantum sensing offers unparalleled precision, benefiting environmental monitoring, traffic management, and healthcare diagnostics. Quantum computing algorithms, leveraging qubits, promise breakthroughs in resource allocation, data analysis, and telecommunications optimisation. Quantum communication, particularly quantum key distribution, ensures secure data transmission, safeguarding sensitive information in fields like finance and healthcare. The article envisions a future where quantum technology optimises community services, fostering data accuracy, speed, and privacy. The collaborative incorporation of quantum technology into ICT infrastructure is crucial for realising these advancements and enhancing community well-being. The recommendations stress enhancing ICT infrastructure for seamless quantum technology integration. Quantum-safe encryption, high-speed communication networks, and quantum-ready data centres are crucial. Collaboration among stakeholders is deemed essential for identifying applications and ensuring comprehensive integration.
Melanie van Dommelen and Frank Phillipson

QUBO Formulation for Sparse Sensor Placement for Classification
Abstract: The demand for facial recognition technology has grown in various sectors over the past decade, but the need for efficient feature selection methods is crucial due to high-dimensional data complexity. This paper explores the potential of quantum computing for Sparse Sensor Placement optimisation (SSPO) in facial image classification. It studies a well-known Filter Approach, based on statistical measures like Pearson correlation and Mutual Information, as it offers computational simplicity and speed. The proposed Quadratic Unconstrained Binary optimisation (QUBO) formulation for SSPO, inspired by the Quadratic Programming Feature Selection approach, aims to select a sparse set of relevant features while minimising redundancy. QUBO formulations can be solved by simulated annealing and by quantum annealing. Two experiments were conducted to compare the QUBO with a machine learning (ML) approach. The results showed that the QUBO approach, utilising simulated annealing, achieved an accuracy between randomly placed sensors and ML-based sensors. The ML algorithm outperformed the QUBO approach, likely due to its ability to capture relevant features more effectively. The QUBO approach’s advantage lies in its much shorter running time. The study suggests potential improvements by using Mutual Information instead of Pearson correlation as a measure of feature relevance. Additionally, it highlights the limitations of quantum annealers’ current connectivity and the need for further advancements in quantum hardware.
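The QUBO-plus-simulated-annealing pipeline described in the abstract can be sketched in a few lines of Python. The matrix below is a made-up toy (negative diagonal entries reward feature relevance, positive off-diagonal entries penalise redundancy between correlated features) and is not the paper's actual formulation:

```python
import math
import random

def qubo_energy(Q, x):
    """Energy x^T Q x for a binary vector x and QUBO matrix Q."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def simulated_annealing(Q, steps=5000, t0=2.0, seed=0):
    """Minimise a QUBO by single-bit-flip simulated annealing."""
    rng = random.Random(seed)
    n = len(Q)
    x = [rng.randint(0, 1) for _ in range(n)]
    e = qubo_energy(Q, x)
    best, best_e = x[:], e
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        i = rng.randrange(n)
        x[i] ^= 1                            # propose a single bit flip
        e_new = qubo_energy(Q, x)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                        # accept the flip
            if e < best_e:
                best, best_e = x[:], e
        else:
            x[i] ^= 1                        # reject: undo the flip
    return best, best_e

# Toy sensor-selection QUBO: features 0 and 1 are relevant but redundant
# with each other; feature 2 is less relevant but independent.
Q = [[-2.0, 1.5, 0.0],
     [1.5, -2.0, 0.0],
     [0.0, 0.0, -1.0]]
x, e = simulated_annealing(Q)
```

On this three-variable toy, the minimum energy of -3 is reached by picking one of the two redundant features plus the independent one, which is exactly the relevance-minus-redundancy trade-off the abstract describes.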

Sebastian Zielinski, Magdalena Benkard, Jonas Nüßlein, Claudia Linnhoff-Popien and Sebastian Feld

SATQUBOLIB: A Python Framework for Creating and Benchmarking (Max)-3SAT QUBOs
Abstract: In this paper, we present an open-source Python framework, called satqubolib. This framework aims to provide all necessary tools for solving (MAX)-3SAT problems on quantum hardware systems via Quadratic Unconstrained Binary Optimization (QUBO). Our framework solves two major issues when solving (MAX)-3SAT instances in the context of quantum computing. Firstly, a common way of solving satisfiability instances with quantum methods is to transform these instances into instances of QUBO, as QUBO is the input format for quantum annealers and the Quantum Approximate Optimization Algorithm (QAOA) on quantum gate systems. Studies have shown that the choice of this transformation can significantly impact the solution quality of quantum hardware systems. Thus, our framework provides thousands of usable QUBO transformations for satisfiability problems. Doing so also enables practitioners from any domain to immediately explore and use quantum techniques as a potential solver for their domain-specific problems, as long as they can be encoded as satisfiability problems. As a second contribution, we created a dataset of 6000 practically hard and satisfiable SAT instances that are also small enough to be solved with current quantum(-hybrid) methods. This dataset enables meaningful benchmarking of new quantum, quantum-hybrid, and classical methods for solving satisfiability problems.
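To give a flavour of what a single 3-SAT-to-QUBO transformation looks like (this is a textbook-style encoding with one ancilla variable, not taken from satqubolib's catalogue), one clause (x1 OR x2 OR x3) can be turned into a quadratic penalty that is zero exactly on satisfying assignments:

```python
from itertools import product

def clause_penalty(x1, x2, x3, w):
    """Quadratic penalty for the clause (x1 OR x2 OR x3) with ancilla w.

    The first part forces w = x1 OR x2 via the standard AND-penalty
    applied to the complements (1-w) = (1-x1)*(1-x2); the second part
    then requires (w OR x3)."""
    a, b, c = 1 - x1, 1 - x2, 1 - w
    and_penalty = a * b - 2 * c * (a + b) + 3 * c   # zero iff c == a*b
    return and_penalty + (1 - w) * (1 - x3)

# Brute-force check: minimising over the ancilla w, the penalty is 0
# exactly for satisfying assignments and >= 1 for the unsatisfying one.
results = {}
for x1, x2, x3 in product((0, 1), repeat=3):
    results[(x1, x2, x3)] = min(clause_penalty(x1, x2, x3, w) for w in (0, 1))
```

Summing such penalties over all clauses yields a QUBO whose ground states are the satisfying assignments; the choice among the many possible encodings is precisely what the paper reports as impacting quantum solution quality.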

Back to Program Overview, Top


Session 2: Pervasive Computing

Wednesday, June 12, 2024, 12:45 p.m. - Chair: Gerald Eichler

Wessel Kraaij, Marloes van der Klauw, Geiske Bouma and Pepijn van Empelen

Better Together – Empowering Citizen Collectives with Community Learning

Abstract: Citizen collectives have the potential to contribute significantly to various societal transitions. Rather than focusing on the individual, collective action improves empowerment and agency, and can contribute to effective collaboration between citizens, policy makers and other institutes. This can be facilitated by enabling efficient monitoring, reflection, and multi-level learning, but an infrastructure is missing to scale bottom-up approaches and bridge the gap with the systems world of government policy makers. We discuss four open problems that need to be addressed in order to create this comprehensive data and learning infrastructure for empowering citizen collectives: 1. providing value and accessibility for all; 2. handling privacy, providing trust and autonomy; 3. enabling community learning from observational data; 4. enabling scaling in order to accelerate and link to the ‘systems world’. We conclude by sketching the research methodology that we plan to use to address these challenges.
Lucie Schmidt and Christian Erfurth

The Future of Aging at Home: A Trend Analysis of Smart Home Innovations
Abstract: This paper presents a comprehensive analysis of the development and potential of smart home technologies to address the challenges of aging at home. By examining current trends, technologies, and their acceptance among the elderly population, the study aims to gain insights into the preferences and needs of seniors regarding intelligent living solutions. Based on a literature review, factors such as usability, security, and privacy are evaluated. The paper identifies key areas where smart home innovations can enhance the independence and quality of life for older individuals and discusses implications for future research and development. The goal is to contribute to the design of age-appropriate smart homes that are not only technologically advanced but also tailored to the specific requirements and desires of this growing user group.
Sabrina Hölzer and Christian Erfurth

Examining Smart Neighborhood Platforms: A Qualitative Exploration of Features and Applications
Abstract: Smart neighborhood platforms have emerged as innovative solutions to address the challenges of urbanization and as smart living approaches to create sustainable, interconnected communities. This study aims to provide a qualitative exploration of multiple smart neighborhood platform initiatives in the German-speaking region. The research examines the diverse characteristics, features and capabilities offered by these platforms, as well as their potential application in various aspects of smart community living. Qualitative research methods are employed to identify key parameters. Additionally, the study investigates the practical application of these platforms, ranging from optimizing resource consumption and improving quality of life to enhancing social interactions and fostering social cohesion and sustainable behaviors. The findings contribute to a deeper understanding of the design and applications of smart neighborhood platforms, providing valuable insights for further research as well as for urban planners, policymakers, and technology developers seeking to create smarter and more livable communities.

Back to Program Overview, Top


Session 3: Information Analysis

Wednesday, June 12, 2024, 2:30 p.m. - Chair: Christian Erfurth

Mario M. Kubek, Shiraj Pokharel and Georg P. Roßrucker

WebMap – Large Language Model-assisted Semantic Link Induction in the Web
Abstract: Carrying out research tasks is only inadequately supported, if not hindered, by current web search engines. This paper therefore proposes functional extensions of WebMap, a semantically induced overlay linking structure on the web to inherently facilitate research activities. These add-ons support the dynamic determination and regrouping of document clusters, the creation of a semantic signpost in the web, and the interactive tracing of topics back to their origins.
Patrick Seidel and Steffen Späthe

Development and validation of AI-driven NLP algorithms for chatbots in Requirement Engineering
Abstract: The present research focused on the use of artificial intelligence (AI) and natural language processing (NLP) techniques in the field of requirements engineering within software development. The primary challenge is the prevention of miscommunication between the customer and the development team, which, in the worst-case scenario, might lead to the premature termination of the project. The aim of this project is to develop a prototype of a chatbot able to evaluate consumer needs and suggest potential requests. The first step comprised a thorough evaluation of the chatbot’s requirements, followed by the development of a prototype. Two transformer models have been developed to classify customer input, and an additional model has been established to generate suitable requests. The classification was obtained by assessing the level of detail of the provided user input using a classification model, as well as by classifying it based on ISO 25010 (quality criteria for software). Both versions utilized the DistilBERT models as their foundation. A GPT-2 model was trained to generate the inquiry. This approach utilized ambiguous user inputs and generated inquiries to get further information. To determine the user’s intention, it was decided to use RASA software to train an intention module. This module will be able to differentiate between a user’s question and their intention to proceed with the acceptance procedure. The initial classification model achieved an accuracy of 0.7033, whereas the second model had an accuracy of 0.2784. Moreover, the output generated by the GPT model varies only to a limited degree. The quality of the model is directly influenced by the quality of the training data. Increasing the number of data points and balancing the classes can help enhance the model quality. Nevertheless, this scientific work presents a fundamental basis for the possible utilization of transformer models in the field of requirement engineering. Further exploration of the application of NLP approaches using transformer models to understand customer requirements has the potential to reduce the failure rate of software development projects.

Sergej Schultenkämper and Frederik Simon Bäumer

Structured Knowledge Extraction for Digital Twins: Leveraging LLMs to Analyze Tweets
Abstract: This paper concentrates on the extraction of pertinent information from unstructured data, specifically analyzing textual content disseminated by users on X/Twitter. The objective is to construct an exhaustive knowledge graph by discerning implicit personal data from tweets. The gleaned information serves to instantiate a digital counterpart and establish a tailored alert mechanism aimed at shielding users from threats such as social engineering or doxing. The study assesses the efficacy of fine-tuning cutting-edge open-source large language models for extracting pertinent triples from tweets. Additionally, it delves into the concept of digital counterparts within the realm of cyber threats and presents relevant works in information extraction. The methodology encompasses data acquisition, relational triple extraction, large language model fine-tuning, and subsequent result evaluation. Leveraging an X/Twitter dataset, the study scrutinizes the challenges inherent in user-generated data. The outcomes underscore the precision of the extracted triples and the discernible personal traits gleaned from tweets.
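In the paper, fine-tuned LLMs perform the triple extraction; the pattern-based stand-in below only illustrates the (subject, predicate, object) target format that such a pipeline produces. The patterns, relation names, and example tweet are invented for illustration:

```python
import re

# Illustrative stand-in for LLM-based extraction: each pattern maps a
# surface form in the tweet to a (subject, predicate) pair, with the
# captured group becoming the object of the triple.
PATTERNS = [
    (re.compile(r"\bI live in ([A-Z]\w+)"), ("user", "lives_in")),
    (re.compile(r"\bI work at ([A-Z]\w+)"), ("user", "works_at")),
]

def extract_triples(tweet):
    """Extract relational (subject, predicate, object) triples from a tweet."""
    triples = []
    for pattern, (subj, pred) in PATTERNS:
        for match in pattern.finditer(tweet):
            triples.append((subj, pred, match.group(1)))
    return triples

tweet = "New job! I work at ExampleCorp and I live in Berlin now."
triples = extract_triples(tweet)
```

Accumulating such triples per user is what instantiates the digital counterpart described in the abstract; the alert mechanism can then flag when the graph reveals enough personal data for doxing or social engineering.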

Back to Program Overview, Top


Session 4: Graphs and Routing

Thursday, June 13, 2024, 8:30 a.m. - Chair: Jörg Roth

Sam Leder and Thijs Laarhoven

Oblivious graph algorithms for solving TSP and VRP using FHE and MPC
Abstract: As the world is starting to realize the necessity and potential of privacy-friendly data processing, various questions need to be answered regarding potential solutions. Especially for the developing field of fully homomorphic encryption (FHE), which aims to offer the best privacy protection at a high computational cost, there are big expectations for future applications, but the question of practicability remains a major issue. Can this technology compete with more mature and efficient technologies like multiparty computation (MPC)? Can current state-of-the-art FHE schemes and libraries be deployed in realistic applications? In this paper we attempt to gain further insights into the current status of FHE as a privacy-enhancing technology, by studying a use case related to route planning in transport and logistics. In this application, a central computing server is tasked with (obliviously) computing a short tour along various destinations (known as the traveling salesman problem, or TSP), without seeing the privacy-sensitive input data of the user, namely which destinations the user wishes to visit. We also study a generalization of this problem with multiple tours being planned, each departing from the same central depot (known as the vehicle routing problem, or VRP). Finally, we aim to assess how solutions for TSP and VRP using FHE compare to solutions using MPC.
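For readers unfamiliar with the underlying problem: in the clear, i.e. without FHE or MPC protection, even a simple heuristic TSP tour fits in a few lines. The oblivious versions studied in the paper must perform steps like the `min` comparison below on encrypted or secret-shared distances, which is where the computational cost arises. The distance matrix here is a made-up example, not data from the paper:

```python
def nearest_neighbour_tour(dist, start=0):
    """Greedy nearest-neighbour heuristic for TSP: from the current stop,
    always travel to the closest unvisited destination, then return to
    the depot. A classical, non-oblivious baseline."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist[last][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    tour.append(start)  # close the tour at the depot
    return tour

def tour_length(dist, tour):
    """Total length of a tour given as a list of consecutive stops."""
    return sum(dist[a][b] for a, b in zip(tour, tour[1:]))

# Hypothetical asymmetric distance matrix for four destinations.
dist = [[0, 2, 9, 10],
        [1, 0, 6, 4],
        [15, 7, 0, 8],
        [6, 3, 12, 0]]
tour = nearest_neighbour_tour(dist)
```

An oblivious variant cannot branch on which destination is closest, since that would leak the user's input; instead every comparison must be evaluated over all candidates under encryption.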
Florian Blauensteiner and Günter Fahrnberger

Route Optimization of an Unmanned Aerial Vehicle beyond Visual Line of Sight
Abstract: Undoubtedly, Unmanned Aerial Vehicles (UAVs), also known as drones, have experienced significant growth in recent decades and will continue to increase in number. Presently, drone operators view the endurance of their devices’ flights as one of the most challenging obstacles. This challenge becomes even more pronounced when UAVs must remain airborne for extended periods without ground contact for recharging or refueling. This contribution further addresses autonomous Beyond Visual Line Of Sight (BVLOS) flight, hydrogen propulsion, and optimized joined wing design, considering both demands and constraints. However, as no recent scientific work adequately addresses all these challenges, this disquisition aims to provide mitigation strategies by proposing prudent routing for an already constructed drone prototype. Findings from an experimental test flight substantiate the intended improvements.
Laurent Hussenet, Cherifa Boucetta and Michel Herbin

Spanning Thread: A Multidimensional Classification Method for Efficient Data Center Management
Abstract: Data originating from diverse sources, including relational and NoSQL databases, web pages, texts, images, recordings, and videos, are expanding in both size and complexity. Navigating through these vast datasets poses significant challenges. As the volume of data grows, the complexity of analysis intensifies. Multidimensional data analysis requires effective organization of the data, and different ranking methods can help achieve this goal. Ranking is a way to create a linear order of the data items that reflects their similarity or importance. The essence of ranking lies in providing a systematic way to traverse the entirety of a dataset, from its inception to its conclusion. In this paper, we introduce a novel method named "Spanning Thread" (ST) for the classification and ranking of multidimensional data. ST aims to establish a meaningful path connecting all data points and starts with a randomly selected data point. Additionally, we present OST (Ordered Spanning Thread), which commences with the minimum virtual data and concludes with the maximum virtual data. Both methods are evaluated using an open dataset, wherein we measure the frequency of class changes to assess their effectiveness.
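One plausible reading of such a thread (a sketch only; the paper's exact linking rule may differ) is a greedy chain that repeatedly jumps from the current point to its nearest unvisited neighbour, producing a linear order. The abstract's evaluation metric, the frequency of class changes along the thread, is easy to compute on the resulting order. The points and labels below are invented:

```python
import math

def spanning_thread(points, start=0):
    """Greedy sketch of a 'Spanning Thread': starting from one point,
    repeatedly link to the nearest unvisited point, yielding a linear
    order over all multidimensional data points."""
    order = [start]
    remaining = set(range(len(points))) - {start}
    while remaining:
        last = order[-1]
        nxt = min(remaining, key=lambda j: math.dist(points[last], points[j]))
        order.append(nxt)
        remaining.remove(nxt)
    return order

def class_changes(order, labels):
    """Count how often the class label changes while walking along the
    thread (the abstract's effectiveness measure; lower is better)."""
    return sum(labels[a] != labels[b] for a, b in zip(order, order[1:]))

# Two invented clusters: class 'A' near the origin, class 'B' near (5, 5).
points = [(0, 0), (0.1, 0.2), (5, 5), (5.1, 4.9), (0.2, 0.0)]
labels = ['A', 'A', 'B', 'B', 'A']
order = spanning_thread(points)
```

A thread that groups each class into one contiguous run changes class only once here, which is what makes the class-change count a natural quality measure for the ordering.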

Back to Program Overview, Top


Session 5: Information Security in Supply Chains

Thursday, June 13, 2024, 10:30 a.m. - Chair: Udo Krieger

Maximilian Greiner, Judith Strussenberg, Andreas Seiler, Stefan Hofbauer, Michael Schuster, Damian Stano, Günter Fahrnberger, Stefan Schauer and Ulrike Lechner

Scared? Prepared? Toward a Ransomware Incident Response Scenario
Abstract: Individuals, organizations, and supply chains must increase their level of preparedness in response to a cyber incident. This article is part of a larger research initiative that designs a process framework, playbooks, serious games, and simulations to prepare for a ransomware incident, with the scenario being common ground. Drawing on a Design Science Research approach, we use 11 interviews, short cases, four real-world cases, and cross-case analysis to design a logistics domain scenario highlighting the business, actor, infrastructure, and threat view. A discussion of the design rationale and the next steps concludes the article.
Tiange Zhao, Ulrike Lechner, Maria Pinto-Albuquerque, Tiago Gasiba and Didem Ongu

COPYCAT: Applying Serious Games in Industry for Defending Supply Chain Attack
Abstract: Serious games have found their application in many cases for improving cybersecurity; one of those cases is building a successful strategy for defending against potential attacks. Our research tries to simulate supply chain attacks involving cloud data by adapting serious game frameworks. In this paper, we present the usage of a serious game called COPYCAT to help the participants raise awareness of supply chain attack threats and build a valid defense strategy against them. COPYCAT is an abbreviation of CONTAIN suPplY Chain ATtack, and CONTAIN is the name of the project that is the larger context of this study. The game originates from CATS (Cloud of Assets and Threats), designed to raise cloud security awareness. In this work, CATS is adapted to new attack scenarios specifically for data-related supply chain attacks, and the adaptation is verified in two game events conducted with practitioners from the industry. This paper marks a milestone in positioning a new game dedicated to incident response. We share the design of our serious game and the results we collected during the game events, and provide insight into the application of serious games for training purposes to raise awareness of data-related supply chain attacks.
Andreas Seiler, Ulrike Lechner, Judith Strussenberg and Stefan Hofbauer

Operation Raven - Design of a Cyber Security Incident Response Game
Abstract: Envisioning a major ransomware incident with its potential consequences might be unpleasant, and preparing for such an incident takes quite an effort. Operation Raven is a serious game designed to facilitate discussion about processes and decisions to detect, contain, and eradicate ransomware. This paper presents the game idea and game material of Operation Raven and the results after two game events. The article reflects on the next steps.

Back to Program Overview, Top  


Session 6: Secure Applications

Thursday, June 13, 2024, 1:45 p.m. - Chair: Ulrike Lechner

Guntur Budi Herwanto, Gerald Quirchmayr, A Min Tjoa, Annisa Ningtyas, Diyah Putri and Anis Fuad

Integrating Contextual Integrity in Privacy Requirements Engineering: A Study Case in Personal E-Health Applications
Abstract: The importance of privacy in personal health care has increased due to the widespread use of technology. Therefore, it has become increasingly relevant to incorporate privacy considerations into these socio-technical systems. This has led to the emergence of the use of privacy engineering in the healthcare context, which is based on the principle of privacy by design. The significance of context is emphasized by the diverse norms and principles inherent in each socio-technical system, especially in healthcare information systems. This paper presents a novel approach to privacy engineering by integrating the concept of contextual integrity into a framework for analyzing privacy threats. Contextual integrity, which considers privacy as context-dependent, serves as the theoretical foundation of this approach. The steps of decision heuristics in contextual integrity are aligned with the workflow of privacy threat analysis to increase the tangibility of contextual integrity and incorporate the knowledge of contextual integrity into the privacy threat analysis. A case study of a personal e-health application is used to demonstrate the methodology’s practical application in the context of privacy protection in healthcare.
Manuel Gerwien, Marcel Großmann and Udo R. Krieger

The System Architecture of a Reliable Telesurgery Service and its Performance Analysis
Abstract: Today, new smart e-health applications are enabled by the very rapid evolution of high-speed networking and cloud computing technologies. This development of new communication and computing techniques as well as the incorporation of machine learning for intelligent image analysis offer new opportunities for classical medical services like telesurgery. In our paper we consider the design of a classical telesurgery system and analyze its architecture. Then we discuss the extension of its functional modules to support remote surgery based on a reliable multi-path communication among its components. We provide an investigation of important key performance indices of such a remote-controlled telesurgery service. To achieve deeper technical insights into the performance of such a distributed architecture and its communication flows, a telesurgery prototype is studied by GNS3 emulations within a virtualized test bed. In this way we investigate the fundamental computing and network performance indices of our prototype.
Marcel Großmann and Noelle Weinmann

Emulation of Denial-of-Service Attacks for Software Defined Networks Accessible on Commodity Hardware with Kathará
Abstract: In the present era, networks have become intricate and may not be comprehensible to passionate freshmen. Fortunately, virtualization simplifies access for individuals interested in learning about network challenges. Nevertheless, replicating numerous variations of Distributed Denial-of-Service (DDoS) attacks remains a tough endeavor. Thus, we suggest employing widely recognized attack techniques utilized through virtualization. Our paper uses the network emulator Kathará as the foundation for constructing networks, and we leverage customized Docker images as network nodes. Additionally, we set up a service called Kathará as a Service (KaaS) to facilitate access for all users of the network emulator. As an example, we create a network by applying a Software-Defined Network (SDN) and by conducting DDoS attacks against it. Our goal is to improve the accessibility of network assaults on next-generation networks for a diverse group of users.

Back to Program Overview, Top 


Session 7: Blockchain and Digital Sovereignty

Friday, June 14, 2024, 9:00 a.m. - Chair: Günter Fahrnberger

Karl Seidenfad, Maximilian Greiner, Jan Biermann, David Dannenberg, Sven Keineke and Ulrike Lechner

Greenhouse Gas Emissions as Commons: A Community Service Approach with Blockchain on the Edge
Abstract: Designing distributed infrastructures for the common good becomes a driver of decarbonization as one of the significant endeavors of this time. This article explores the design principles for a community service approach in developing resilient infrastructure, as well as secure tools for managing greenhouse gas emissions. We introduce the prototype of an information system that utilizes blockchain technology and edge computing for collaborative automation of measuring, reporting, and verifying (MRV) greenhouse gas (GHG) emissions. In addition, the paper discusses the concept and early results of a field trial conducted with industry partners.
Maximilian Greiner, Karl Seidenfad, Andreas Hofmann, Christoph Langewisch and Ulrike Lechner

The Digital Product Passport: Enabling Interoperable Information Flows through Blockchain Consortia for Sustainability
Abstract: Global supply chains face mounting pressure for collaboration and reliable data exchange in an inter-organizational environment, especially in the pursuit of sustainability. The inherent complexities, driven by intricate multi-layered processes and worldwide connectivity, pose significant challenges. Within this article, we present our current research towards a Digital Product Passport enabled by blockchain as a decentralized ecosystem for reliable, traceable, and interoperable data exchange without a central authority. Following a Design Science Research methodology, it explores the convergence of a blockchain consortium and a public blockchain interface, including a discussion of the necessary governance structures. This research promotes a system-independent Digital Product Passport within blockchain-based information systems, ensuring privacy, traceability, transparency, and trust in global supply chains, advancing standardized data structures for a reliable information flow. Additionally, our Digital Product Passport supports companies in providing and verifying sustainability criteria to legislators. Reliable and interoperable information is a key aspect of establishing governance for successfully managed community resources without the intervention of a central authority, enabled by blockchain.
John Bechara and Ulrike Lechner

Digital Sovereignty and Open Source Software - A Discussion Paper
Abstract: Digital sovereignty is an important goal in Germany’s and Europe’s political agendas. To achieve this goal, the IT or OT systems’ design, the life cycle, and the digital ecosystems must be reconsidered. Our research interest is the potential role of open-source software in strengthening digital sovereignty. This idea paper discusses its risks and potential contribution to digital sovereignty. It presents the research idea and a research design.
Michael Hofmeier, Karl Seidenfad, Manfred Hofmeier and Wolfgang Hommel

Web-based Protocol Enabling Distributed Identity Information Networks for Greater Sovereignty
Abstract: This paper presents a design for a Distributed Identity Information Network (DistIN) that can manage digital identities in a decentralized manner while aiming for high security, scalability and sovereignty. This novel approach enables the creation and verification of electronic signatures and offers the functionality of a public key infrastructure. Due to its decentralized nature and flexibility, this system is suitable for various types of organizations, including community services. Common web technologies, state-of-the-art cryptographic algorithms as well as blockchain technology are used. This system design is developed on the basis of universal use cases and validated for its applicability, leading to the web-based DistIN protocol as a result.

Back to Program Overview, Top