Tutorials
Sunday 16th
Two parallel tutorials per session are planned.

Morning: 9.30-1.00 (break after 1.5 hours)
Lunch: 1.00-2.00

Afternoon: 2.00-5.30 (break after 1.5 hours)

 

Abstracts
  • Tutorial 1
    Speaker: Kim Larsen
    Title: Data structures and Algorithms for Quantitative Verification in UPPAAL

    Abstract: This tutorial will present the tool UPPAAL and the newly emerged branches UPPAAL Cora and UPPAAL Tiga (see www.uppaal.com), focusing on the development of the data structures and algorithms that have been crucial for the efficiency of these tools.

    UPPAAL is a tool for modelling, simulating and verifying real-time systems. The modelling language is based on timed automata, allowing system behaviour to be described as a network of timed automata extended with data variables that may be manipulated by a rich C-like programming language. The verification engine of UPPAAL supports safety, liveness and bounded liveness properties. In the tutorial we will review the notion of zones and their efficient representation and manipulation for the analysis of timed automata models, using difference bounded matrices, minimal constraint representations, and clock difference diagrams. Here it is crucial to perform abstractions taking the maximum constant of the given timed automaton into account. We show that significantly coarser abstractions may be obtained by making this constant location dependent and by distinguishing maximal lower and upper bounds.
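    The zone representation mentioned above can be illustrated with a small sketch. The following is a hedged, minimal difference bounded matrix (DBM) in Python, assuming two clocks plus the reference clock x0 and ignoring the strictness bits a real DBM stores; canonicalization is a Floyd-Warshall shortest-path closure, and emptiness shows up as a negative diagonal entry.

```python
# Entry D[i][j] = c encodes the constraint x_i - x_j <= c, with x0 the
# reference clock that is always 0; INF means "unbounded". Strictness
# bits (< vs <=) are omitted for brevity.
INF = float("inf")

def canonicalize(D):
    """Tighten a DBM with Floyd-Warshall: the tightest bound on
    x_i - x_j is the shortest path from node i to node j."""
    n = len(D)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

def is_empty(D):
    """After canonicalization, an empty zone is exactly one whose
    constraint graph has a negative cycle, i.e. a negative diagonal."""
    return any(D[i][i] < 0 for i in range(len(D)))

# Zone 1 <= x1 <= 3, x2 >= 0 (rows/columns ordered x0, x1, x2).
D = [[0,  -1,  0],
     [3,   0,  INF],
     [INF, INF, 0]]
canonicalize(D)          # derives e.g. x1 - x2 <= 3 from the other bounds
print(is_empty(D))       # → False
```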

    UPPAAL Cora uses priced (or weighted) timed automata as the underlying modelling formalism and has proved particularly useful for optimal scheduling problems. Priced timed automata extend classical timed automata with cost information on locations (and edges), giving the price per time unit for staying in a location. In this way every run of a priced automaton has a global cost. A number of natural optimization problems have been dealt with, both in terms of basic computability and in terms of data structures and algorithms for efficient implementation in practice. Crucial here are priced extensions of the notion of zone, as well as heuristics for guiding and pruning the search taking cost information into account.

    UPPAAL Tiga deals with open systems, i.e. systems interacting with an environment, with the purpose of synthesizing a controller so that the system behaves correctly no matter what the environment does. Formally, UPPAAL Tiga uses timed game automata to model such open systems as a (real-time) game between the "controller" and the "environment". More concretely, edges of a timed game automaton are either "controllable" (describing possible moves of the controller) or "uncontrollable" (describing possible moves of the environment). Though timed games have long been known to be decidable, efficient algorithms have until now been lacking. UPPAAL Tiga offers the first efficient on-the-fly algorithm for synthesizing winning strategies for timed game automata with respect to reachability and safety properties.
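    The fixed-point at the heart of such synthesis can be sketched in an untimed setting. Below is a hedged Python sketch of the classical backward attractor computation for a finite turn-based reachability game; the state names and edges are made up, and all timed aspects (zones, on-the-fly exploration) of the actual UPPAAL Tiga algorithm are omitted.

```python
def winning_states(states, edges, controlled, goal):
    """Backward fixed-point for a turn-based reachability game:
    in a controller-owned state one winning successor suffices,
    in an environment-owned state every successor must be winning."""
    succ = {s: [] for s in states}
    for s, t in edges:
        succ[s].append(t)
    win = set(goal)
    changed = True
    while changed:
        changed = False
        for s in states:
            if s in win:
                continue
            if s in controlled:
                wins = any(t in win for t in succ[s])
            else:
                wins = bool(succ[s]) and all(t in win for t in succ[s])
            if wins:
                win.add(s)
                changed = True
    return win

# Hypothetical 4-state game: the controller owns "a", the goal is {"d"}.
states = {"a", "b", "c", "d"}
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "b"), ("c", "c")]
print(sorted(winning_states(states, edges, controlled={"a"}, goal={"d"})))
# → ['a', 'b', 'd']  ("c" loses: the environment can loop in "c" forever)
```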

    In the tutorial we thus explain in detail the theories, algorithms and data structures underlying UPPAAL, UPPAAL Cora and UPPAAL Tiga that have proven crucial for their efficiency. The tutorial will also include demos of the tools.

     

  • Tutorial 2
    Speakers: Jean-Michel Fourneau and Nihal Pekergin
    Title: Stochastic Bounds and Stochastic Monotonicity: methods, algorithms and applications

    Abstract: We survey several algorithms and applications of stochastic comparison of Discrete Time Markov Chains. These techniques lead to an important reduction in time and memory complexity when one is able to compare the initial rewards with rewards on smaller chains. We also show that stochastic comparison and stochastic monotonicity can provide intervals for rewards when models are not completely known (some parameters are missing). Finally, we can prove some qualitative properties due to the stochastic monotonicity or the level crossing ordering of Markov chains.

    In this tutorial we present some applications of stochastic comparison of Discrete Time Markov Chains. The approach differs from sample path techniques and coupling theorems applied to models, as we only consider Markov chains and algorithms on stochastic matrices. These algorithms build stochastic matrices which finally provide bounds on the rewards. We consider DTMCs, but we also show how the method can be applied to Continuous Time Markov Chains. We consider three typical problems one has to deal with in a Markovian model:

    1. The best-known problem is the size of the state space. Even if the tensor representation provides an efficient way to store the non-zero entries of the transition rate matrix, it is still difficult to solve the model. We are interested in performance measures defined as reward functions on the steady-state or transient distribution, or by first passage times (for instance to a faulty state in dependability modelling). Thus the numerical analysis mainly amounts to computing the steady-state or transient distributions or the fundamental matrix. This is in general difficult because of the memory space and time requirements. Sometimes it is even impossible to store a vector indexed by the states.
    2. There is also a problem related to the parametrization of the model. Finding realistic parameters for a Markovian model may be a very difficult task. Sometimes we know that some transitions exist but we do not know their probabilities; instead, we can only find an interval of probabilities for these transitions. Thus the model is associated with a set of Markov chains. Some models are also described by a set of distributions rather than a single one.
    3. Finally, we sometimes need qualitative properties rather than numerical results. For instance, we may want to prove that the reward we analyze is non-decreasing in a parameter. Numerical analysis does not help here.

    We show in this tutorial that stochastic comparison of Markov chains may lead to an efficient solution for these three problems. When modelling high speed networks or dependable systems, exact values of the performance measures are often not necessary: it suffices to show that the Quality of Service (QoS) requirements we expect are satisfied, and bounding some reward functions is often enough. So we advocate the use of stochastic comparison to prove that QoS requirements are satisfied. We can derive bounds for a family of stochastic matrices, and thus give an interval for rewards on models which are not completely specified. The stochastic monotonicity and the level crossing ordering of Markov chains allow us to prove some qualitative properties of the models.
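    As a concrete illustration of the bounding step, here is a hedged NumPy sketch of the classical Abu-Amsha/Vincent algorithm, which constructs the smallest st-monotone upper bound of a stochastic matrix; the example matrix is made up.

```python
import numpy as np

def st_monotone_upper_bound(P):
    """Smallest st-monotone upper bound Q of a stochastic matrix P
    (Abu-Amsha/Vincent algorithm): work on reversed cumulative sums
    F[i, j] = sum_{k >= j} P[i, k], take running maxima down the rows
    to enforce monotonicity, then difference to recover a matrix."""
    F = np.cumsum(P[:, ::-1], axis=1)[:, ::-1]   # F[i, j] = sum_{k>=j} P[i, k]
    G = np.maximum.accumulate(F, axis=0)         # G[i] = max(G[i-1], F[i])
    Q = G.copy()
    Q[:, :-1] -= G[:, 1:]                        # undo the cumulative sums
    return Q

# A made-up, non-monotone 3-state chain.
P = np.array([[0.2, 0.8, 0.0],
              [0.5, 0.3, 0.2],
              [0.1, 0.2, 0.7]])
Q = st_monotone_upper_bound(P)
print(Q)   # rows still sum to 1; each row of Q dominates P's in the st-order
```

    Rewards that are non-decreasing in the state index can then be bounded from above on the (often much simpler) monotone chain Q instead of P.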

    Jean-Michel Fourneau
    Jean-Michel Fourneau is Professor of Computer Science at the University of Versailles St Quentin, France. He was formerly with Ecole Nationale des Telecommunications, Paris, and with the University of Paris XI Orsay as an Assistant Professor. He graduated in Statistics and Economy from Ecole Nationale de la Statistique et de l'Administration Economique, Paris, and obtained his PhD and his habilitation in Computer Science at Paris XI Orsay in 1987 and 1991 respectively. He is a member of IFIP WG 7.3. His recent research interests are algorithmic performance evaluation, Stochastic Automata Networks, G-networks, stochastic bounds, and applications to high speed networks and all-optical networks.

    Nihal Pekergin
    Nihal Pekergin is Professor of Computer Science at the University of Paris 12, France. She was formerly an Associate Professor at the University of Paris 1. She received her BSc degree in Engineering of Electronics and Communication (1983) and her MSc degree in Computer Science (1986) from Istanbul Technical University. She obtained her PhD from the University of Paris 5 and her habilitation from the University of Versailles-St Quentin, in 1991 and 2001 respectively. Her recent research interests are performance evaluation of discrete event systems, stochastic bounds and probabilistic model checking.

     

  • Tutorial 3
    Speaker: Dirk Husmeier
    Title: Bayesian learning of Bayesian networks with applications in bioinformatics and systems biology

    Abstract: Bayesian networks are a combination of probability theory and graph theory. Graph theory provides a framework to represent complex structures of interacting sets of variables. Probability theory provides a method to infer these structures from observations or measurements in the presence of noise and uncertainty. Many problems in computational molecular biology and bioinformatics, like sequence alignment, molecular evolution, and genetic networks, can be treated as particular instances of the general problem of learning Bayesian networks from data. This tutorial will provide a brief introduction to Bayesian networks and Bayesian learning in part 1. This will be followed by the application of Bayesian networks to problems in systems biology in part 2.

    Part 2: An important problem in systems biology is the reverse engineering of signalling pathways and regulatory networks from postgenomic data. Various reverse engineering methods have been proposed in the literature, and it is important to understand their relative merits and shortcomings. In the first part of my talk, I shall compare the accuracy of reconstructing gene regulatory networks with three different modelling and inference paradigms: (1) relevance networks (RNs): pairwise association scores independent of the remaining network; (2) graphical Gaussian models (GGMs): undirected graphical models with constraint-based inference; and (3) Bayesian networks (BNs): directed graphical models with score-based inference. The evaluation is carried out on the Raf pathway, a cellular signalling network describing the interaction of 11 phosphorylated proteins and phospholipids in human immune system cells. Both laboratory data from cytometry experiments and data simulated from the gold-standard network are used for the evaluation. The study also compares the different effects of passive observations versus active interventions. In the second part of my talk, I shall discuss how biological prior knowledge, related for instance to transcription factor binding locations in promoter regions or partially known signalling pathways from the literature, can be used for the reconstruction of regulatory networks. I shall describe a Bayesian approach to systematically integrate expression data with multiple sources of prior knowledge. Each source is encoded via a separate energy function, from which a prior distribution over network structures is constructed in the form of a Gibbs distribution. The hyperparameters associated with the different sources of prior knowledge, which measure the influence of the respective prior relative to the data, are sampled from the posterior distribution with MCMC. We have evaluated the proposed scheme on the Raf signalling pathway. Our findings suggest that the inclusion of independent prior knowledge significantly improves the network reconstruction accuracy, and that the values of the hyperparameters inferred with this scheme are close to optimal with respect to minimizing the reconstruction error.
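    The shape of such a Gibbs prior over candidate structures can be sketched in a few lines of Python. Everything below is a toy with made-up energies and hyperparameters, only meant to show the construction; in the actual scheme the hyperparameters are sampled with MCMC rather than fixed.

```python
import math

def gibbs_prior(graphs, energies, betas):
    """Gibbs prior P(G) ∝ exp(-sum_s beta_s * E_s(G)): each knowledge
    source s gives an energy E_s(G) (low = good agreement with that
    source) and a hyperparameter beta_s weighing its influence."""
    weights = {
        G: math.exp(-sum(b * E[G] for b, E in zip(betas, energies)))
        for G in graphs
    }
    Z = sum(weights.values())          # partition function (normalisation)
    return {G: w / Z for G, w in weights.items()}

# Three candidate structures and two hypothetical prior sources.
graphs = ["G1", "G2", "G3"]
E_binding = {"G1": 0.0, "G2": 2.0, "G3": 1.0}   # TF-binding agreement (toy)
E_pathway = {"G1": 1.0, "G2": 0.0, "G3": 3.0}   # known-pathway agreement (toy)
prior = gibbs_prior(graphs, [E_binding, E_pathway], betas=[1.0, 0.5])
print(max(prior, key=prior.get))                # → G1 (favoured a priori)
```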

    Dirk Husmeier
    I graduated in Physics (Dipl.-Phys.) at the University of Bochum (Germany) in 1991, and received both my M.Sc. (Information Processing and Neural Networks) and Ph.D. (Applied Mathematics and Neural Computation) degrees from King's College London in 1994 and 1997, respectively.
    After working as a postdoctoral research fellow in the Electrical Engineering Department of Imperial College London from 1997 to 1999, I joined Biomathematics and Statistics Scotland (BioSS) as a research scientist in October 1999. My former research interests included theoretical biophysics, statistical pattern recognition, supervised learning in neural networks, and Bayesian approaches to machine learning.
    My current research activities are in the field of computational biology and bioinformatics, especially focusing on statistical methods for phylogenetics and the reverse engineering of gene regulatory networks.
    Since May 2006, I have been leading the statistical bioinformatics research in BioSS.

     

  • Tutorial 4
    Speakers: Manuela Bujorianu and Holger Hermanns
    Title: Modelling and Verification with Stochastic Hybrid Systems

    Abstract: Hybrid systems have been a topic of intense research in recent years. Deterministic hybrid systems, i.e. systems that involve interactions between continuous and discrete dynamics without probabilities, can capture a large spectrum of behaviours encountered in practice and have proved invaluable in a number of applications. However, they have limitations that make them too "weak" for some specific applications. To overcome these limitations, researchers extended their study of hybrid systems beyond continuous and discrete dynamics to include probabilistic phenomena. This has led to the more general class of Stochastic Hybrid Systems (SHS), which have found applications in, among others, capacity expansion models for the power industry, flexible manufacturing and fault tolerant control, and biological networks.

    In this tutorial, we present some modelling paradigms for SHS. Some background on ordinary/stochastic differential equations necessary for the understanding of these models will be provided.

    There are by now different types of stochastic hybrid models. The main difference between these classes lies in the manner in which randomness enters the process. In some models, the continuous dynamics may be governed by stochastic differential equations, while in others the continuous evolution may be deterministic. Likewise, some models include forced transitions, which take place whenever the continuous state tries to leave a given set; other models only allow transitions to occur spontaneously at random times (at a given, possibly state-dependent, rate); and yet other models allow both types of transitions. Finally, the initial condition and the destinations of discrete transitions may also be given by stochastic kernels on the state space.

    We also define a model checking problem for SHS. In general terms, this problem consists in evaluating whether a given system will reach a certain set during some time horizon, starting from a set of initial conditions. This problem arises, for example, in connection with safety verification problems where the unsafe conditions for the system can be characterised in terms of its state entering some unsafe set. In a stochastic setting, this problem can be formulated as that of estimating the probability that the trajectory of the system remains outside the unsafe set for a given time horizon.
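    In its simplest form, such a probability can be estimated by Monte Carlo simulation. The sketch below is purely illustrative: a one-dimensional Ornstein-Uhlenbeck-style SDE discretised with Euler-Maruyama, a made-up unsafe set |x| >= 2, and no hybrid (discrete) dynamics at all.

```python
import math
import random

def safe_probability(x0=0.0, sigma=0.5, unsafe=2.0, horizon=1.0,
                     dt=0.01, runs=2000, seed=1):
    """Fraction of simulated trajectories of dX = -X dt + sigma dW
    that stay outside the unsafe set {|x| >= unsafe} over the horizon."""
    rng = random.Random(seed)
    steps = int(horizon / dt)
    safe = 0
    for _ in range(runs):
        x = x0
        ok = True
        for _ in range(steps):
            # Euler-Maruyama step: drift -x*dt plus Gaussian noise.
            x += -x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            if abs(x) >= unsafe:     # trajectory entered the unsafe set
                ok = False
                break
        safe += ok
    return safe / runs

print(safe_probability())   # close to 1 for this mild noise level
```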

    Manuela Bujorianu
    Manuela Bujorianu is a postdoctoral researcher in the Formal Methods and Tools group at the University of Twente. Prior to this, she held research positions at the University of Cambridge, England, and the University of Stirling, Scotland. She is one of the first researchers to have investigated SHS and she pioneered the application of formal specification and verification in this area. Currently she is involved in the AiSHA project (http://wwwhome.cs.utwente.nl/~manuela/aisha/index.html), investigating applications of concurrency to stochastic hybrid systems.

    Holger Hermanns
    Holger Hermanns is a professor for dependable systems and software at Saarland University, where he works on computer assisted modelling and verification of embedded and distributed systems. He is also affiliated with the VASY group at INRIA Rhône-Alpes. Before that, he held positions at the University of Twente and the University of Erlangen-Nürnberg. In 2001, the Netherlands Organisation for Scientific Research (NWO) awarded him the "Vernieuwingsimpuls" prize for innovative research.

     

News
Oct 07
QEST'07 photo album available.

Sep 07
Graduate symposium and list of accepted posters

Jun 07
Accepted papers

Accepted tool papers

May 07
Special issue of IEEE TSE


Sponsors

Read about QEST 2007 sponsors