Most talks are scheduled for 60 to 90 minutes, leaving coffee breaks of at least 30 minutes between talks for discussion. The times given here are upper bounds for the talks and lower bounds for the coffee breaks. Missing abstracts will be added to this page as soon as they become available.
The autumn school will officially start on Wednesday, September 30, at 8:00 AM with registration at the front desk. The first talk starts at 9:00 AM.
| | Wednesday | Thursday | Friday |
|---|---|---|---|
| 9:00 - 9:15 | Opening | Goran Frehse: Model Checking of Hybrid Systems | Sven Schewe: Synthesis |
| 9:15 - 10:30 | Werner Damm: On the Nos and Musts in System Design | | |
| 10:30 - 11:00 | Coffee break | Coffee break | Coffee break |
| 11:00 - 12:30 | Kim G. Larsen: Real Time Model Checking, Performance Evaluation, Synthesis and Optimization | André Platzer: Logical Foundations of Cyber-Physical Systems | Ralf Wimmer: Probabilistic Counterexamples |
| 12:30 - 13:30 | Lunch | Lunch | Lunch |
| 13:30 - 15:00 | Ernst-Rüdiger Olderog: Automatic Verification of Combined Specifications | Thomas Sturm: Quantifier Elimination | |
| 15:00 - 15:30 | Coffee break | Coffee break | |
| 15:30 - 17:00 | Andrey Rybalchenko: Horn Constraints for Verification and Synthesis | David Parker: Probabilistic Model Checking and Controller Synthesis | |
| 17:00 - 18:00 | Mani Swaminathan: "Design meets Verification" for Real-Time and Probabilistic Systems | Paolo Marin: Precision of BlackBox Verification Techniques: Hardness and Technology | |
| 19:30 - open end | Social event | | |
The talk gives a brief survey of the research challenges addressed in AVACS, and then focuses on research within the AVACS Research Area on Systems. It relates previous joint research with David Harel on Live Sequence Charts to recent results on Compositional Synthesis developed jointly with Bernd Finkbeiner in the context of AVACS. Live Sequence Charts introduced modalities into protocol specifications. The key concept of hot and cold conditions has been incorporated into a design methodology for contract-based systems engineering, there denoted as weak and strong assumptions. The talk highlights the key methodological value of introducing such assumptions into system design, and presents recent theoretical results on the automatic generation of weak assumptions.
Download slides: Introduction and On the Nos and Musts in System Design
Timed automata and games, priced timed automata, and energy automata have emerged as useful formalisms for modeling real-time and energy-aware systems as found in many embedded and cyber-physical systems. During the last 20 years, the real-time model checker UPPAAL has been developed, allowing for efficient verification of hard timing constraints of timed automata. Moreover, a number of significant branches exist, e.g. UPPAAL CORA, providing efficient support for optimization; UPPAAL TIGA, allowing for automatic synthesis of strategies for given safety and liveness objectives; and UPPAAL SMC, offering a highly scalable statistical model checking engine supporting performance analysis of stochastic hybrid automata. Most recently, the branch UPPAAL STRATEGO has been released, supporting synthesis (using machine learning) and evaluation of near-optimal strategies for stochastic priced timed games. The lecture will review the various branches of UPPAAL and indicate their concerted application to a range of real-time and cyber-physical examples.
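For readers new to the area, here is a deliberately tiny sketch (in Python, not UPPAAL's input language) of symbolic forward reachability for a one-clock timed automaton. The automaton, its names, and its constants are invented for illustration; real tools such as UPPAAL use difference-bound matrices over many clocks and far more refined algorithms.

```python
from collections import deque

INF = float("inf")

# Toy automaton with one clock x: invariants are upper bounds on x per location,
# edges are (source, lower-bound guard on x, reset x?, target).
invariant = {"init": 2.0, "work": 5.0, "goal": INF}
edges = [
    ("init", 1.0, True,  "work"),   # enabled once x >= 1; resets x
    ("work", 3.0, False, "goal"),   # enabled once x >= 3 in "work"
]

def elapse(loc, lo):
    """Time successor: x may grow from lo up to the location's invariant."""
    return (loc, lo, invariant[loc])

def discrete_successors(loc, lo, hi):
    """Edge successors: intersect the clock interval with the guard, then reset."""
    for src, guard, reset, dst in edges:
        new_lo = 0.0 if reset else max(lo, guard)
        if src == loc and max(lo, guard) <= hi and new_lo <= invariant[dst]:
            yield elapse(dst, new_lo)

def reachable(target):
    """Breadth-first exploration over symbolic states (location, clock interval)."""
    start = elapse("init", 0.0)
    seen, frontier = {start}, deque([start])
    while frontier:
        loc, lo, hi = frontier.popleft()
        if loc == target:
            return True
        for succ in discrete_successors(loc, lo, hi):
            if succ not in seen:
                seen.add(succ)
                frontier.append(succ)
    return False

print(reachable("goal"))   # True: "goal" is reachable respecting guards and invariants
```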
We discuss how properties of reactive systems with continuous real-time constraints and complex, possibly infinite data can be automatically verified, exploiting recent advances in semantics, constraint-based model checking, and decision procedures for complex data. The systems are specified in CSP-OZ-DC, a language that integrates concepts from Communicating Sequential Processes (CSP), Object-Z (OZ), and Duration Calculus (DC). Our approach to automatic verification rests on a compositional semantics of this language in terms of Phase-Event-Automata. These are further translated into the input formats of the model checkers ARMC (Abstraction Refinement Model Checker) and SLAB (Slicing Abstraction model checker) as well as the tool H-PILoT (Hierarchical Proving by Instantiation in Local Theory extensions). We demonstrate this with a case study concerning emergency messages in the European Train Control System (ETCS). The talk is based on the project "Beyond Timed Automata" carried out in the Collaborative Research Center AVACS (Automatic Verification and Analysis of Complex Systems) of the Universities of Oldenburg, Freiburg, and Saarbrücken; see also www.avacs.org.
Download slides: Automatic Verification of Combined Specifications
We will show how Horn constraints can be used to describe verification and synthesis problems and how such constraints can be solved efficiently.
Download slides: Horn constraints for Verification and Synthesis
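As a small, self-contained illustration (not taken from the talk): proving the assertion x ≤ 10 for the counter program x := 0; while (x < 10) x := x + 1 amounts to finding an interpretation of an unknown predicate Inv satisfying the following Horn constraints; Inv(x) ≡ x ≤ 10 is one solution, and Horn-constraint solvers search for such interpretations automatically.

```latex
\begin{align*}
&\mathit{Inv}(0) && \text{(initial state } x = 0\text{)}\\
&\mathit{Inv}(x) \;\land\; x < 10 \;\rightarrow\; \mathit{Inv}(x+1) && \text{(loop body)}\\
&\mathit{Inv}(x) \;\land\; x > 10 \;\rightarrow\; \mathit{false} && \text{(safety: the assertion } x \le 10\text{)}
\end{align*}
```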
We illustrate one of the overarching paradigms of AVACS Phase-3, namely "Design Meets Verification", in the context of real-time and probabilistic systems. In particular, we investigate design-level structural transformations that aim at easing the subsequent verification of real-time systems with shared data variables, modelled as networks of extended timed automata (ETA). Our contributions to this end are the following:
(1) We first equip ETA with an operator for layered composition, intermediate between parallel and sequential composition. Under certain non-interference and/or precedence conditions imposed on the structure of the ETA networks, the communication closed layer (CCL) laws and associated partial-order (po-) and (layered) reachability equivalences are shown to hold.
(2) Next, we investigate (under certain cycle conditions on the ETA) the reachability-preserving transformations of separation and flattening, aimed at reducing the number of cycles of the ETA.
(3) We then show that separation and flattening from (2) may be applied together with the CCL laws from (1) in order to restructure ETA networks such that the verification of layered reachability properties is rendered easier.
This interplay of the three structural transformations (separation, flattening, and layering) is demonstrated on an enhanced version of Fischer's real-time mutual exclusion protocol for access to multiple critical sections. We conclude by outlining a probabilistic version of layering, and its application to simplifying the analysis of the protocol of Kushilevitz and Rabin for randomized mutual exclusion.
(These results reflect joint work predominantly with Ernst-Rüdiger Olderog and also with Joost-Pieter Katoen.)
Download slides: "Design meets Verification" for Real-Time and Probabilistic Systems
Hybrid automata combine finite state models with continuous variables that are governed by differential equations. Hybrid automata are used to model systems in a wide range of domains such as automotive control, robotics, electronic circuits, systems biology, and health care. Hybrid systems are difficult to analyze since neighboring states, no matter how close, may exhibit qualitatively different behaviors. Conventional techniques from model-based design, such as numerical simulation, may fail to detect critical behavior. Set-based reachability constructs a cover of all behaviors by exhaustively computing one-step successor states until a fixed point is found. If precise enough, it can show safety of the system as well as provide quantitative measurements of key variables. In general, one-step successors can only be computed approximately, and the computation scales poorly with the number of continuous variables. The approximation error requires particular attention since it can accumulate rapidly, leading to a coarse cover, prohibitive state explosion, or preventing termination. This talk presents major approaches for reachability and discusses their advantages as well as shortcomings.
Download slides: Model Checking of Hybrid Systems
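A small sketch of set-based reachability with interval boxes, in the spirit of the approach described above but grossly simplified: discrete time, one variable, plain interval arithmetic. The dynamics, bounds, and unsafe threshold below are invented for illustration; real tools use far richer set representations (zonotopes, support functions) and handle continuous time.

```python
# Uncertain affine map x_{k+1} = a*x_k + u with a in [A_LO, A_HI], u in [U_LO, U_HI].
A_LO, A_HI = 0.45, 0.55
U_LO, U_HI = -0.1, 0.1
TOL = 1e-9

def post(box):
    """Tight interval image of one step of the uncertain dynamics."""
    lo, hi = box
    corners = [a * x for a in (A_LO, A_HI) for x in (lo, hi)]
    return (min(corners) + U_LO, max(corners) + U_HI)

def reach_cover(init, unsafe_lo, max_steps=100):
    """Accumulate one-step successors until the cover stops growing (fixed point)."""
    cover, frontier = [init], init
    for _ in range(max_steps):
        frontier = post(frontier)
        if frontier[1] >= unsafe_lo:          # over-approximation touches the unsafe set
            return cover + [frontier], False
        if any(c[0] <= frontier[0] + TOL and frontier[1] <= c[1] + TOL for c in cover):
            return cover, True                # fixed point (up to tolerance): cover is invariant
        cover.append(frontier)
    return cover, False                       # no conclusion within the step bound

cover, safe = reach_cover(init=(0.8, 1.0), unsafe_lo=2.0)
print("proved safe" if safe else "not proved safe", "with", len(cover), "boxes")
```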
Logical foundations of cyber-physical systems (CPS) study systems that combine cyber aspects such as communication and computer control with physical aspects such as movement in space. CPS applications abound. Ensuring their correct functioning, however, is a serious challenge. Scientists and engineers need analytic tools to understand and predict the behavior of their systems. That's the key to designing smart and reliable control. This talk identifies a mathematical model for CPS called multi-dynamical systems, i.e. systems characterized by the combination of multiple facets of dynamical systems, including discrete and continuous dynamics, but also uncertainty resolved by nondeterministic, stochastic, and adversarial dynamics. Multi-dynamical systems help us understand CPS better, by viewing them as composed of multiple dynamical aspects, each of which is simpler than the full system. The family of differential dynamic logics surveyed in this talk exploits this compositionality in order to tame the complexity of CPS and enable their analysis. In addition to providing a strong theoretical foundation for CPS, differential dynamic logics have also been instrumental in verifying many applications, including the Airborne Collision Avoidance System ACAS X, the European Train Control System ETCS, several automotive systems, mobile robot navigation with the dynamic window algorithm, and a surgical robotic system for skull-base surgery. The approach is implemented in the theorem prover KeYmaera X.
Download slides: Logical Foundations of Cyber-Physical Systems
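For a flavour of the logic, here is a standard textbook-style differential dynamic logic formula stating that a car which only brakes never passes the stop point m; it is purely illustrative and not one of the case studies listed above. The box modality quantifies over all runs of the hybrid program, which repeatedly sets the acceleration to -b and then follows the ODE while v ≥ 0.

```latex
v^2 \le 2b\,(m - x) \;\land\; b > 0
\;\rightarrow\;
\bigl[\,\bigl( a := -b \,;\; \{ x' = v,\; v' = a \;\&\; v \ge 0 \} \bigr)^{*} \,\bigr]\; x \le m
```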
Effective quantifier elimination procedures are a mathematical tool for equivalently eliminating quantifiers from first-order formulas for certain algebraic theories. This generalizes classical satisfiability modulo theories to quantifier alternations and furthermore to parametric situations. We give a survey of classical quantifier-eliminable theories, negative results, and fundamental proof techniques. One focus will be on virtual substitution and cylindrical algebraic decomposition for nonlinear real arithmetic. These methods, among many others, are implemented in our freely available computer logic system Redlog, which is an established tool in science and engineering.
Download slides: Quantifier Elimination
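A classical example of what such a procedure computes over the reals, with the parameters p and q left free and the quantified variable x eliminated:

```latex
\exists x \,\bigl( x^2 + p\,x + q = 0 \bigr)
\;\Longleftrightarrow\;
p^2 - 4q \ge 0
```

Virtual substitution, for instance, derives the quantifier-free right-hand side by substituting parametric root expressions of the quadratic into the formula.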
Probabilistic model checking is an automated technique to verify whether a probabilistic system satisfies a formally specified quantitative correctness property (e.g. "with probability at least 0.999, the airbag successfully deploys within 20 milliseconds of a crash"). The same techniques can also be used to synthesise a controller (e.g. for a robot or a power management system) which then guarantees satisfaction of a correctness property (e.g. "the minimum probability of patrolling all rooms within 15 minutes is above 0.98"). This talk will give an overview of probabilistic model checking and controller synthesis, including some recent directions such as multi-objective model checking (to investigate trade-offs between several conflicting properties) and permissive strategy synthesis (to generate flexible and robust controllers). I will also describe some applications of these techniques to real-world problems using the probabilistic model checker PRISM.
Download slides: Probabilistic Model Checking and Controller Synthesis
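To make the core computation concrete, here is a minimal sketch of checking an unbounded reachability property P=? [ F goal ] on a small discrete-time Markov chain by solving the linear equation system. The chain, its state names, and its numbers are invented; tools such as PRISM add graph-based precomputation, symbolic data structures, and support for much richer properties and models.

```python
import numpy as np

# States: 0 = start, 1 = retry, 2 = failed (absorbing), 3 = goal (absorbing).
P = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [0.2, 0.0, 0.0, 0.8],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])
goal, absorbing = {3}, {2, 3}

# Unknowns: probability of eventually reaching goal from each non-absorbing state.
unknown = sorted(set(range(len(P))) - absorbing)          # [0, 1]
A = P[np.ix_(unknown, unknown)]                            # transitions among unknowns
b = P[np.ix_(unknown, sorted(goal))].sum(axis=1)           # one-step jumps into goal

# Solve x = A x + b, i.e. (I - A) x = b.
x = np.linalg.solve(np.eye(len(unknown)) - A, b)
for s, p in zip(unknown, x):
    print(f"Pr[ F goal | start in state {s} ] = {p:.4f}")
```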
In this lecture we will give an overview of BMC and Equivalence Checking problems in the case of incomplete discrete designs, also known as Black-Box BMC and Partial Equivalence Checking problems, respectively. We will show how to improve the workflow's scalability by working on the technologies needed to decide these problems when non-trivial topologies are considered.
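For orientation, here is a minimal sketch of plain bounded model checking with an SMT solver (z3's Python API), unrolling an invented toy transition system and searching for a bad state. The black-box setting of the talk additionally requires reasoning about all possible implementations of the missing parts (e.g. via QBF), which this sketch deliberately does not attempt.

```python
from z3 import Int, Solver, Or, sat

def bmc(max_bound=10):
    """Unroll a toy counter system and look for a step where x == 7."""
    solver = Solver()
    x = [Int(f"x_{i}") for i in range(max_bound + 1)]
    solver.add(x[0] == 0)                           # initial state
    for i in range(max_bound):
        # transition relation: the counter increases by 1 or by 2 each step
        solver.add(Or(x[i + 1] == x[i] + 1, x[i + 1] == x[i] + 2))
        solver.push()
        solver.add(x[i + 1] == 7)                   # "bad" state at depth i+1
        if solver.check() == sat:
            model = solver.model()
            trace = [model.eval(v, model_completion=True).as_long() for v in x[:i + 2]]
            return i + 1, trace
        solver.pop()
    return None, None

depth, trace = bmc()
print(f"bad state reachable at depth {depth}: {trace}")
```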
Synthesis of reactive systems is a research direction inspired by Church's realisability problem. It focuses on systems that receive a constant stream of inputs from an environment, and must, for each input, produce some output. Specifically, we are given a logical specification that dictates how the system must react to the inputs, and we must construct a system that satisfies the specification for all possible inputs that the environment could provide. While the verification problem (the validation or refutation of the correctness of such a system) has gained many algorithmic solutions and various successful tools, the synthesis problem has had fewer results. We focus on the classical approach by Pnueli and Rosner, which links synthesis to emptiness games of automata and the treatment of incomplete information. We will discuss why these approaches require determinisation and how the resulting automata can be simplified with respect to their acceptance mechanism. For pleasure, we might look into a connection between complexity classes and the existence of succinct control strategies.
Download slides: Synthesis
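Two tiny specifications over an input r (request) and an output g (grant) illustrate the realisability question at the heart of synthesis; the notation is standard LTL and the examples are illustrative, not taken from the lecture. The first formula is realisable, since a one-state system that grants in every step satisfies it; the second is unrealisable, because the output would have to predict the environment's next input.

```latex
\square\,(r \rightarrow \lozenge\, g)
\qquad\text{vs.}\qquad
\square\,(g \leftrightarrow \mathbf{X}\, r)
```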
For non-probabilistic systems, counterexamples play a crucial role not only for debugging erroneous systems but also for techniques like counterexample-guided abstraction refinement (CEGAR). For digital systems, a single trace leading to an unsafe state suffices to refute a safety property; such a trace is often obtained as a by-product of model checking. For probabilistic safety properties ("The probability of reaching an unsafe state is below 0.01") the situation is more complicated: typically a single trace does not carry enough probability to refute the property. Instead, a potentially very large (or even infinite) set of paths is necessary. The actual reachability probability is computed as the solution of a linear equation system; therefore the model checking algorithms do not yield a counterexample if a property is violated. In this tutorial I will show how to compute and represent counterexamples for probabilistic systems at different levels of abstraction.
Download slides: Probabilistic Counterexamples
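One standard strategy for path-based counterexamples, in the spirit of k-shortest-path methods, is to enumerate paths to the unsafe state in order of decreasing probability until their accumulated probability exceeds the violated bound. A minimal sketch, with an invented Markov chain and bound:

```python
import heapq

# Successor distribution per state; "bad" is the unsafe (absorbing) state.
dtmc = {
    "init": [("a", 0.6), ("b", 0.4)],
    "a":    [("bad", 0.5), ("init", 0.5)],
    "b":    [("bad", 0.3), ("ok", 0.7)],
    "ok":   [("ok", 1.0)],
    "bad":  [("bad", 1.0)],
}

def counterexample(start, target, bound, max_paths=1000):
    """Collect most likely paths until their total probability exceeds `bound`."""
    # heap entries: (-probability, path); negated so heapq pops the most likely path first
    heap = [(-1.0, (start,))]
    chosen, total = [], 0.0
    while heap and len(chosen) < max_paths:
        neg_p, path = heapq.heappop(heap)
        prob, state = -neg_p, path[-1]
        if state == target:
            chosen.append((path, prob))
            total += prob
            if total > bound:
                return chosen, total        # enough evidence to refute "P <= bound [ F bad ]"
            continue
        for succ, p in dtmc[state]:
            if p > 0.0 and succ != state:   # skip self-loops on absorbing states
                heapq.heappush(heap, (-(prob * p), path + (succ,)))
    return chosen, total

paths, total = counterexample("init", "bad", bound=0.4)
for path, prob in paths:
    print(" -> ".join(path), f"(probability {prob:.3f})")
print(f"accumulated probability {total:.3f} > 0.4 refutes the property")
```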