Scientific Sessions
Session 1: Swarm robotics and multi-agent systems
Swarm robotics and multi-agent systems are fields within robotics and artificial intelligence that focus on the coordination and collective behavior of multiple autonomous agents or robots working together to achieve complex tasks.
Swarm robotics is inspired by the collective behavior of social animals such as ants, bees, and flocking birds. It involves large groups of relatively simple robots that cooperate using local rules and decentralized control, without a central leader.
Multi-agent systems consist of multiple intelligent agents (which can be robots, software programs, or virtual entities) that interact, cooperate, or compete to solve problems that are difficult for a single agent.
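The short Python sketch below is a minimal illustration of this idea (a toy example of ours, not tied to any platform or library): each agent repeatedly moves toward the centroid of the neighbors it can sense, a purely local rule that produces group aggregation with no central controller.

```python
# Toy decentralized swarm rule: agents aggregate using only local sensing.
import math
import random

SENSING_RADIUS = 2.0  # how far each agent can "see" (arbitrary units)
STEP = 0.1            # fraction of the distance moved per update

def neighbors(agent, agents):
    """Agents within sensing range of `agent`, excluding itself."""
    return [a for a in agents
            if a is not agent
            and math.dist((a["x"], a["y"]), (agent["x"], agent["y"])) < SENSING_RADIUS]

def step(agents):
    """One synchronous update: every agent targets its current local centroid."""
    moves = []
    for agent in agents:
        near = neighbors(agent, agents)
        if near:
            cx = sum(a["x"] for a in near) / len(near)
            cy = sum(a["y"] for a in near) / len(near)
            moves.append((agent, STEP * (cx - agent["x"]), STEP * (cy - agent["y"])))
    for agent, dx, dy in moves:  # apply after computing, so updates are simultaneous
        agent["x"] += dx
        agent["y"] += dy

agents = [{"x": random.uniform(0, 5), "y": random.uniform(0, 5)} for _ in range(20)]
for _ in range(100):
    step(agents)
```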
Session 2: Database management systems
Database Management Systems (DBMS) are crucial software applications that facilitate the creation, management, and manipulation of databases. They provide an organized way to store, retrieve, and manage data, ensuring its integrity, security, and accessibility. DBMSs allow users to perform various operations on the data, such as querying, updating, and reporting, using Structured Query Language (SQL) or other interfaces. They support transaction processing, which ensures that data modifications are performed accurately and reliably, even in the event of system failures. By abstracting the complexities of data storage and management, DBMSs enable efficient data handling and facilitate the development of data-driven applications across various domains, from business and finance to research and healthcare.
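As a small illustration, the sketch below uses Python's built-in sqlite3 module to show SQL querying together with an atomic transaction; the accounts table and amounts are invented for the example.

```python
# SQL querying plus an atomic transaction via Python's built-in sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100.0), ("bob", 50.0)])
conn.commit()

try:
    with conn:  # opens a transaction: commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 30 WHERE name = 'alice'")
        conn.execute("UPDATE accounts SET balance = balance + 30 WHERE name = 'bob'")
except sqlite3.Error:
    print("transfer failed; no partial update was applied")

for row in conn.execute("SELECT name, balance FROM accounts ORDER BY name"):
    print(row)
```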
Session 3: Operating Systems
Operating systems (OS) are fundamental software that manage computer hardware and provide essential services for application programs. They act as an intermediary between users and the computer hardware, ensuring efficient resource allocation and enabling multitasking. Key functions of an OS include managing hardware resources like the CPU, memory, and storage, as well as providing user interfaces, managing files, and coordinating input and output operations. Popular operating systems include Microsoft Windows, macOS, Linux, and Android, each designed to cater to specific needs and environments. The evolution of operating systems has been driven by advances in technology, aiming to enhance performance, security, and user experience.
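A brief sketch of this intermediary role, using only Python's standard library to ask the OS about the resources it manages (CPUs, storage, files); the file name is arbitrary.

```python
# Querying OS-managed resources through Python's standard library.
import os
import shutil

print("logical CPUs:", os.cpu_count())

usage = shutil.disk_usage("/")  # on Windows, use e.g. "C:\\"
print(f"disk: {usage.used / 1e9:.1f} GB used of {usage.total / 1e9:.1f} GB")

# File management: the OS mediates every create, read, and delete.
with open("demo.txt", "w") as f:
    f.write("hello from user space\n")
print("file size:", os.path.getsize("demo.txt"), "bytes")
os.remove("demo.txt")
```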
Session 4: Robotics & Autonomous Systems
Session 5: Quantum error correction
Quantum error correction (QEC) is a set of techniques designed to protect quantum information from errors caused by decoherence, noise, and other quantum disturbances. Since quantum bits (qubits) are extremely fragile, QEC is essential for building reliable and scalable quantum computers.
Quantum error correction encodes a logical qubit into a larger number of physical qubits, spreading information redundantly to detect and correct errors without measuring the quantum information directly (which would collapse the quantum state).
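The toy simulation below illustrates the redundancy idea with the three-qubit bit-flip repetition code. Note that this is a classical stand-in: real QEC extracts error syndromes without reading the encoded data, whereas here we simply take a majority vote over the redundant copies.

```python
# Classical toy simulation of the 3-qubit bit-flip repetition code.
import random

def encode(bit):
    """Encode one logical bit into three physical bits: 0 -> 000, 1 -> 111."""
    return [bit, bit, bit]

def noisy_channel(bits, p_flip=0.1):
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    """Majority vote corrects any single bit-flip error."""
    return int(sum(bits) >= 2)

trials = 100_000
errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f}  (physical rate was 0.1)")
```

With a 10% physical flip rate, the logical error rate drops to roughly 2.8%, since decoding only fails when two or more of the three copies flip.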
Session 6: Quantum Computing
Quantum computing is an advanced field of computing that leverages the principles of quantum mechanics to process information in fundamentally new ways, promising to solve certain problems much faster than classical computers.
Quantum computing represents a paradigm shift with the potential to revolutionize computing, cryptography, and scientific research. While still in early stages, ongoing advances could unlock new capabilities beyond classical limitations.
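For a concrete taste, the sketch below simulates a single qubit's state vector in plain Python (no quantum SDK assumed): a Hadamard gate places the qubit in an equal superposition, giving a 50/50 measurement distribution.

```python
# Minimal single-qubit state-vector simulation.
import math

def apply_hadamard(state):
    """Apply H to a single-qubit state [amp0, amp1]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

state = [1.0, 0.0]             # the |0> state
state = apply_hadamard(state)  # now (|0> + |1>) / sqrt(2)
probs = [abs(amp) ** 2 for amp in state]
print("P(0) =", probs[0], " P(1) =", probs[1])  # 0.5 each
```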
Session 7: Data mining and pattern recognition
Session 8: Data Science & Big Data
Data Science and Big Data are interconnected fields that focus on extracting meaningful insights from large and complex datasets using a combination of statistics, machine learning, and computational tools.
Big Data refers to datasets that are too large, fast-moving, or complex to be processed using traditional tools.
Data Science provides the techniques to derive actionable insights, while Big Data provides the scale and complexity of modern data challenges. Together, they power innovations across sectors, from business intelligence to healthcare and smart cities.
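One pattern behind many big-data tools is streaming aggregation: summarizing a dataset too large for memory one record at a time. The sketch below illustrates it with a simulated sensor stream (the data and figures are invented).

```python
# Streaming aggregation: summarize data without holding it all in memory.
import random

def reading_stream(n):
    """Stand-in for a dataset far larger than RAM (e.g., sensor logs)."""
    for _ in range(n):
        yield random.gauss(20.0, 5.0)

count, total, peak = 0, 0.0, float("-inf")
for value in reading_stream(1_000_000):
    count += 1
    total += value
    peak = max(peak, value)

print(f"n={count}  mean={total / count:.2f}  max={peak:.2f}")
```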
Session 9: Cybersecurity & Privacy
Cybersecurity and privacy are closely related fields focused on protecting digital systems, networks, and information from unauthorized access, misuse, and damage, while ensuring that individuals' personal data is handled responsibly and lawfully.
Cybersecurity involves the strategies, technologies, and practices used to protect digital assets, including software, hardware, and data, from cyber threats such as hacking, viruses, phishing, ransomware, and denial-of-service (DoS) attacks.
Privacy focuses on the responsible collection, use, and sharing of personal data. It ensures individuals have control over how their information is used and that organizations comply with data protection laws.
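As one concrete defensive practice, the sketch below stores passwords as salted, slow hashes using PBKDF2 from Python's standard library rather than as plaintext; the iteration count and example strings are illustrative choices, not a compliance recommendation.

```python
# Salted password hashing with PBKDF2 (standard library only).
import hashlib
import secrets

def hash_password(password):
    salt = secrets.token_bytes(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password, salt, digest):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return secrets.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                         # False
```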
Session 10: Artificial Intelligence (AI) and Machine Learning (ML)
AI refers to the broader concept of machines being able to carry out tasks in a way that we would consider “smart” — such as reasoning, problem-solving, understanding language, and perception.
ML is a subset of AI where systems learn from data and improve their performance over time without being explicitly programmed.
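A minimal example of "learning from data": the sketch below fits a line to a handful of invented points by gradient descent, with no ML library assumed.

```python
# Learning y ~ w*x + b from data by gradient descent on mean squared error.
data = [(1.0, 3.1), (2.0, 4.9), (3.0, 7.2), (4.0, 8.8)]  # roughly y = 2x + 1

w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    grad_w = sum(2 * ((w * x + b) - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * ((w * x + b) - y) for x, y in data) / len(data)
    w -= lr * grad_w  # nudge parameters downhill on the error surface
    b -= lr * grad_b

print(f"learned w={w:.2f}, b={b:.2f}")  # close to w=2, b=1
```

No rule like "multiply x by 2 and add 1" was ever programmed; the parameters emerge from the data, which is the essence of ML.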
Session 11: Multimodal learning
Session 12: Network protocols and simulation
Session 13: 5G/6G and next-generation networking
Session 14: Internet of Things (IoT)
The Internet of Things (IoT) refers to a network of interconnected physical devices—such as sensors, appliances, vehicles, and machines—that collect, exchange, and act on data using the internet, often without human intervention.
IoT is transforming how we interact with the physical world by enabling smarter environments, automated systems, and data-driven decision-making across industries. As it evolves, it will play a central role in future technologies like smart cities, autonomous vehicles, and AI-powered environments.
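The sketch below illustrates the basic IoT device loop: sample a sensor, package the reading, and hand it to a transport. Both read_temperature() and publish() are stand-ins we invented; a real device might use an MQTT or HTTP client instead.

```python
# Illustrative IoT device loop: sense, serialize, publish.
import json
import random
import time

def read_temperature():
    """Stand-in for a hardware sensor driver."""
    return round(random.uniform(18.0, 26.0), 2)

def publish(topic, payload):
    """Stand-in for a network transport (e.g., an MQTT client's publish)."""
    print(f"[{topic}] {payload}")

for _ in range(3):
    message = json.dumps({
        "device_id": "sensor-42",  # hypothetical device name
        "temperature_c": read_temperature(),
        "timestamp": time.time(),
    })
    publish("home/livingroom/temperature", message)
    time.sleep(1)
```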
Session 15: Cloud computing and edge computing
Cloud computing and edge computing are modern computing paradigms that address how data is processed, stored, and accessed across networks—but with different approaches to location, latency, and scalability.
Cloud computing delivers computing services—such as servers, storage, databases, networking, software, and analytics—over the internet (“the cloud”). It allows users to access powerful infrastructure without managing physical hardware.
Edge computing pushes computation and data storage closer to the data source (i.e., at the “edge” of the network), rather than relying solely on centralized cloud servers.
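A small sketch of that pattern: filter and aggregate raw readings at the edge, then upload only a compact summary. The send_to_cloud() function and the readings are placeholders, not a real API.

```python
# Edge pattern: preprocess near the source, ship only a summary upstream.
def send_to_cloud(summary):
    print("uploading summary:", summary)  # placeholder for a network call

raw_readings = [21.3, 21.4, 55.0, 21.2, 21.5, 21.3]  # 55.0 is a sensor glitch

valid = [r for r in raw_readings if 0.0 < r < 50.0]  # filter at the edge
summary = {
    "count": len(valid),
    "mean": sum(valid) / len(valid),
    "min": min(valid),
    "max": max(valid),
}
send_to_cloud(summary)  # a few bytes instead of the full raw stream
```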
Session 16: Networking & Distributed Systems
Networking and Distributed Systems are areas of computer science that focus on how multiple computers communicate and collaborate to perform tasks, share resources, and maintain system reliability—often across geographic distances.
Computer networking involves the design and management of communication systems that allow data to be transferred between devices. It forms the backbone of the internet and most modern computing environments.
Distributed systems consist of multiple interconnected computers that work together as a single system. These systems aim to provide scalability, fault tolerance, and resource sharing across various nodes.
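As a minimal illustration of device-to-device communication, the sketch below runs a TCP echo server and client on localhost using Python's standard socket module.

```python
# Minimal TCP echo server and client on localhost.
import socket
import threading
import time

def echo_server(host="127.0.0.1", port=50007):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(data)  # echo the bytes back

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.2)  # give the server a moment to start (demo only)

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", 50007))
    cli.sendall(b"hello over TCP")
    print(cli.recv(1024).decode())  # "hello over TCP"
```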
Session 17: Formal methods and logic in computing
Session 18: Computational complexity
Computational complexity is a field in theoretical computer science that studies the inherent difficulty of computational problems and classifies them based on the resources required to solve them, such as time (how long a computation takes) and space (how much memory it uses).
Computational complexity provides a deep theoretical framework for understanding the limits of computation. It tells us not only how to solve problems, but whether some problems are even solvable within practical constraints.
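To make this concrete, the sketch below solves the same problem ("do any two numbers in a list sum to a target?") two ways: a quadratic-time double loop versus a linear-time pass that trades extra memory for speed.

```python
# Two algorithms, one problem: O(n^2) time vs O(n) time with O(n) space.
def has_pair_quadratic(nums, target):
    for i in range(len(nums)):             # n iterations
        for j in range(i + 1, len(nums)):  # up to n-1 iterations each
            if nums[i] + nums[j] == target:
                return True
    return False

def has_pair_linear(nums, target):
    seen = set()  # trades memory for time
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False

nums = list(range(2_000))
print(has_pair_quadratic(nums, 3_997), has_pair_linear(nums, 3_997))
```

Doubling the input size roughly quadruples the first function's work but only doubles the second's, which is exactly the kind of distinction complexity analysis formalizes.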
Session 19: Algorithms and data structures
Algorithms and data structures are fundamental concepts in computer science that deal with how data is organized and how computational problems are solved efficiently.
An algorithm is a step-by-step procedure or set of rules designed to perform a specific task or solve a problem. Algorithms are evaluated based on their correctness, efficiency (in terms of time and space), and scalability.
A data structure is a way of organizing and storing data so it can be accessed and modified efficiently. The choice of data structure affects the performance of algorithms.
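A classic pairing of the two ideas: keeping data in a sorted array enables binary search, which runs in O(log n) time instead of a linear scan's O(n). A minimal implementation:

```python
# Binary search over a sorted list: O(log n) lookups.
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid        # found: return its index
        if sorted_items[mid] < target:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1                 # not present

data = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binary_search(data, 23))  # 5
print(binary_search(data, 7))   # -1
```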
Session 20: Theoretical Computer Science
Session 21: Human-robot interaction
Human-Robot Interaction (HRI) is an interdisciplinary field focused on studying, designing, and evaluating the ways humans and robots communicate, collaborate, and coexist safely and effectively.
Human-Robot Interaction is critical to making robots more accessible, trustworthy, and effective partners in various aspects of daily life and work. By improving HRI, we can foster smoother, safer, and more meaningful collaboration between humans and machines.
Session 22: Software Engineering
Software engineering is the systematic application of engineering principles to the design, development, testing, deployment, and maintenance of software systems. It aims to produce high-quality, reliable, and scalable software in a cost-effective and efficient manner.
Session 23: Automated testing and debugging
Session 24: Code generation using AI
Code generation using AI refers to the use of artificial intelligence—particularly machine learning and natural language processing models—to automatically write, complete, or suggest code based on human input or existing code context.
AI code generation systems can take natural language prompts (e.g., “write a function to sort a list”) or partial code snippets and generate functional code in a target programming language. These tools are trained on vast code repositories and can understand patterns, syntax, and even coding best practices.
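Purely as an illustration of the kind of output such a tool might produce for the prompt quoted above, and with no particular product or API implied:

```python
# Example of code an AI assistant might generate for the prompt
# "write a function to sort a list" (illustrative output only).
def sort_list(items, reverse=False):
    """Return a new list with the items sorted in ascending order.

    Args:
        items: An iterable of comparable elements.
        reverse: If True, sort in descending order instead.
    """
    return sorted(items, reverse=reverse)

print(sort_list([3, 1, 2]))                # [1, 2, 3]
print(sort_list([3, 1, 2], reverse=True))  # [3, 2, 1]
```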
Session 25: Human-Computer Interaction (HCI)
Session 26: Federated and distributed learning
Federated learning is a decentralized approach to training machine learning models where data remains on local devices (such as smartphones, IoT devices, or edge servers). Instead of collecting data in a central server, the model is trained locally on each device, and only model updates (e.g., gradients or weights) are sent back to a central server. The server then aggregates these updates to form a global model.
Distributed learning refers to a method of training machine learning models by splitting the computation across multiple machines, often in a data center or cloud environment. This is especially useful for training large-scale models (e.g., deep neural networks) that would be too slow or memory-intensive to train on a single machine.
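The bare-bones sketch below shows a FedAvg-style round in pure Python for a one-parameter model (our own toy construction): each client trains on its private data, and the server averages only the returned weights.

```python
# Toy federated averaging: raw data never leaves the clients.
def local_train(w, local_data, lr=0.01, steps=100):
    """Gradient descent on one client's private (x, y) pairs, fitting y = w*x."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in local_data) / len(local_data)
        w -= lr * grad
    return w

client_datasets = [
    [(1.0, 2.1), (2.0, 4.0)],  # client A's private data (roughly y = 2x)
    [(3.0, 6.2), (4.0, 7.9)],  # client B's private data
]

global_w = 0.0
for round_num in range(5):
    # Each client trains locally; only the resulting weight is shared.
    client_weights = [local_train(global_w, data) for data in client_datasets]
    # The server aggregates the model updates into a new global model.
    global_w = sum(client_weights) / len(client_weights)
    print(f"round {round_num}: w = {global_w:.3f}")
```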