Invited Speakers


2016

Dirk Beyer (University of Passau, Germany)
SMT-Based Verification Algorithms in CPAchecker

After many years of successful development of new algorithms for software model checking, there is a need to consolidate the knowledge about the different algorithms and approaches. This talk gives a coarse overview of four algorithms. We compare the following "schools of thought": bounded model checking, k-induction, lazy predicate abstraction, and lazy abstraction with interpolants. These algorithms are well known and successful in software verification. They have in common that they are based on SMT solving as the back-end technology, using the theories of uninterpreted functions, bit vectors, and floats. All four algorithms are implemented in the verification framework CPAchecker. Thus, we can present an evaluation that compares only the core algorithms, keeping design variables such as the parser front end, the SMT solver, and the theories used in SMT formulas constant. We evaluate the algorithms on a large set of verification tasks and discuss the conclusions.
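
As a toy illustration of the common SMT back end (a minimal sketch only, not CPAchecker code), the following Python fragment performs bounded model checking of a two-line counter program with the z3 solver; the program, the bound, and the property are illustrative assumptions.

    # Bounded model checking sketch (illustrative only, not CPAchecker itself).
    # Program:  x := 0; while (*) x := x + 2;  with the assertion  x != 7.
    # The loop is unrolled K times and the SMT solver is asked whether a
    # state violating the assertion is reachable within the bound.
    from z3 import Int, Or, Solver, sat

    K = 10                                            # unrolling bound (assumption)
    x = [Int(f"x_{i}") for i in range(K + 1)]         # one variable per unrolled step

    s = Solver()
    s.add(x[0] == 0)                                  # initial state
    for i in range(K):
        # each step either executes the loop body or stutters (loop exit)
        s.add(Or(x[i + 1] == x[i] + 2, x[i + 1] == x[i]))
    s.add(Or(*[x[i] == 7 for i in range(K + 1)]))     # negation of the assertion

    print("bug reachable within bound" if s.check() == sat else "safe up to bound", K)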

Dirk Beyer is Professor of Computer Science and has a Research Chair for Software Systems at the University of Passau, Germany. He was an Assistant and Associate Professor at Simon Fraser University, B.C., Canada, and a Postdoctoral Researcher at EPFL in Lausanne, Switzerland (2004-2006) and at the University of California, Berkeley, USA (2003-2004), in the group of Tom Henzinger. Dirk Beyer holds a Dipl.-Inf. degree (1998) and a Dr. rer. nat. degree (2002) in Computer Science from the Brandenburg University of Technology in Cottbus, Germany. In 1998 he was a Software Engineer with Siemens AG, SBS Dept. Major Projects in Dresden, Germany. His research focuses on models, algorithms, and tools for the construction and analysis of reliable software systems. He is the architect, designer, and implementor of several successful tools. For example, CrocoPat is the first efficient interpreter for relational programming, CCVisu is a successful tool for visual clustering, and CPAchecker and BLAST are two well-known and successful software model checkers.

Alexey Khoroshilov (Institute for System Programming of RAS, Russia)
Verification of Operating Systems

The talk presents experience in applying various verification techniques to operating systems, including deductive verification of Linux security modules, software model checking of Linux device drivers, static analysis to detect data races in operating system kernels, model-based testing for compliance with API standards such as LSB and POSIX, and systematic fault injection for run-time testing.

Alexey graduated with honours from the Computer Science Department of Lomonosov Moscow State University in 2001. He received a Ph.D. in Computer Science in 2006 on the formal specification and testing of asynchronous systems. Alexey has worked at the Institute for System Programming of the Russian Academy of Sciences (ISPRAS) since 1999. His research interests include formal methods for software and hardware systems, model-based testing, verification of operating systems, and the design of safety-critical embedded systems. Alexey was the lead architect of the LSB Infrastructure program run jointly by ISPRAS and the Linux Foundation. He is a director of the Linux Verification Center of ISPRAS, where a number of verification techniques are developed and applied to the Linux kernel and other real-time operating systems.

Vartan Padaryan (Institute for System Programming of RAS, Russia)
Protection and Mitigation of Software Bug Exploitation

Automatic exploit generation is demanded not only by black-hat hackers but also by benign developers. Generating a workable exploit is the best way to evaluate the severity of a found defect and to assign the right priority to fixing the bug. Over the past decade, many papers have proposed methods of automatic (or automated) exploit generation using only raw binary code. At the same time, however, software protection mechanisms have developed significantly, aiming to prevent exploitation or to mitigate its consequences. The talk compares exploitation techniques and protection mechanisms in modern software and shows how secure software development utilizes advances in compiler technology.

Vartan Padaryan is a leading researcher at the Compiler Technology Department of the Institute for System Programming of the Russian Academy of Sciences (ISP RAS). He graduated with honors from the Faculty of Computational Mathematics and Cybernetics of Lomonosov Moscow State University in 2000. In 2005 he defended his Ph.D. thesis on «Research and development of estimation methods for scalability and performance of data-parallel programs». He has published over 50 articles. Vartan Padaryan has been working at ISP RAS since 1997; for the last 9 years he has been working on computer security problems. His research interests include compiler technologies, program analysis and reverse engineering, network security, and parallel and distributed programming.

 

2015

Susanne Graf (VERIMAG Laboratory, France)
Knowledge-based Verification and Construction of Distributed and Constrained Systems

Deriving distributed implementations from global specifications has been extensively studied for various application domains, and under various assumptions and constraints. We explore the problem here from the knowledge perspective: a process can decide to execute a local action when it has the knowledge to do so. We discuss typical knowledge atoms, useful for expressing local enabling conditions with respect to different notions of correctness, as well as different means for obtaining knowledge and for representing it locally in an efficient manner. Our goal is to use such a knowledge-based representation of the distribution problem either for deriving distributed implementations automatically from global specifications on which some constraint is enforced -- a difficult problem -- or for improving the efficiency of existing protocols by exploiting local knowledge. We also argue that such a knowledge-based presentation helps in achieving the necessary correctness proofs.
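
As a minimal illustration of the epistemic reading of local enabling conditions (my own sketch with a hypothetical two-process example, not taken from the talk): a process knows that an action is safe exactly when the safety condition holds in every global state consistent with its local observation.

    # Epistemic sketch: process i "knows" P iff P holds in every global state
    # that is indistinguishable from the current one given i's local view.
    from itertools import product

    VALUES = range(3)                                 # each process holds a value 0..2
    GLOBAL_STATES = list(product(VALUES, VALUES))     # all global states of 2 processes

    def safe(state):
        # hypothetical global enabling condition for the local action
        return sum(state) % 2 == 0

    def knows_safe(process, local_value):
        """True iff `safe` holds in all global states compatible with what
        the process can observe, namely its own value only."""
        compatible = [g for g in GLOBAL_STATES if g[process] == local_value]
        return all(safe(g) for g in compatible)

    for v in VALUES:
        print(f"process 0, local value {v}: knows the action is safe -> {knows_safe(0, v)}")
    # All three answers are False: local observation alone never suffices here,
    # so the process must first acquire more knowledge, e.g. by communication.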

Nikolay Pakulin (Institute for System Programming of RAS, Russia)
Standards and Standardization in Software & Software Engineering. What Does it Matter to You?

Standards are something most people in the industry have heard of, but few have seen. Languages such as C and C++, programming interfaces like POSIX, networking like TCP or Web Services, cryptography such as RSA and AES, development processes like change management and quality control... Almost everything around a coder is supported by standards. They are hidden behind components, libraries and tools, but they are there. What are they - the standards? What are they for you? What can you do with them? And what can you do for them? The talk discusses the importance of knowing the standards, the pros and cons of following and breaking them, and the ways you can contribute to them.

Nikolay Shilov (Nazarbayev University, Kazakhstan)
Agent Knowledge and Belief in Distributed Systems

A distributed system is a group of decentralized interacting executors. Communication in a distributed system is said to be fair if any executor that wants to communicate with another one will eventually communicate. A distributed algorithm is the communication protocol implementing a distributed system to solve some task. A multiagent system is a distributed system that consists of agents. An agent is an autonomous rational reactive object (in the object-oriented sense) whose internal state can be characterized in terms of the agent's beliefs, desires, intentions, and rationale. A multiagent algorithm is a distributed algorithm that solves some problem by means of the cooperative work of agents. But from an individual agent's perspective, a multiagent algorithm is a reactive and proactive knowledge/belief-based rational algorithm to achieve the agent's own desires.

An agent's beliefs represent its "knowledge" about itself, other agents, and an "environment"; this "knowledge" may be incomplete, inconsistent, and (even) incorrect; beliefs evolve in reaction to "signals" that the agent gets from the environment (i.e. agents are reactive). An agent's desires represent its long-term aims, obligations, and purposes (which may be conflicting); usually desires do not change during the run of a multiagent algorithm. An agent's intentions are used for short-term planning; this ability to make and change short-term plans is called the agent's proactivity. A rational agent has clear "preferences" (rationale) and always chooses the action (among feasible actions) that leads to the "best" outcome for itself. Every agent is autonomous, i.e. any change of its beliefs, desires, and intentions depends on the agent itself, and a change cannot be decreed from outside.

In the talk we examine (from an epistemic point of view) the knowledge and beliefs of agents that arrive one by one at a resource center to rent, for a while, one of the available desired resources. Available resources are passive and form a cloud; each available resource is lent on demand if there is no race for it and returns to the cloud after use. Agents also form a cloud, but leave it immediately when they rent a desired resource. All agents are rational and can communicate with each other in a P2P manner, negotiate, and flip (change intentions), provided that all flips are rational for the participating agents. The problem is to design a multiagent algorithm which allows each arriving agent sooner or later to rent one of its desired resources.
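
The resource-rental setting can be pictured with a small toy simulation (an illustrative sketch of the problem statement only, not the multiagent algorithm sought in the talk); all names and the greedy renting rule are assumptions.

    # Toy model of the problem setting: agents arrive one by one, free resources
    # form a cloud and are lent on demand; agents that cannot rent a desired
    # resource wait in the agent cloud and retry when a resource is returned.
    resource_cloud = {"printer", "scanner", "plotter"}   # hypothetical resources
    waiting_agents = []                                  # the agent cloud

    class Agent:
        def __init__(self, name, desires):
            self.name = name
            self.desires = set(desires)    # the agent's long-term desires
            self.rented = None

        def try_rent(self):
            available = self.desires & resource_cloud
            if available:                  # rational choice among feasible actions
                self.rented = available.pop()
                resource_cloud.remove(self.rented)
                return True
            return False

    def arrive(agent):
        if not agent.try_rent():
            waiting_agents.append(agent)   # stays in the agent cloud

    def release(agent):
        resource_cloud.add(agent.rented)   # the resource returns to the cloud
        agent.rented = None
        for other in list(waiting_agents): # waiting agents retry (fairness)
            if other.try_rent():
                waiting_agents.remove(other)

    a1, a2, a3 = Agent("a1", {"printer"}), Agent("a2", {"printer"}), Agent("a3", {"scanner"})
    for a in (a1, a2, a3):
        arrive(a)
    print("waiting:", [a.name for a in waiting_agents])   # a2 waits for the printer
    release(a1)                                           # the printer comes back ...
    print("a2 rented:", a2.rented)                        # ... and a2 rents it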

2014

Bertrand Meyer (ETH Zürich, Switzerland)
Automatic Frame Inference

One of the core practical problems in proving the correctness of programs is to take care of specifying and verifying what does not change: the so-called frame properties. Because frame properties are tedious to write, mechanical assistance is necessary. I will present a general convention for specifying frame properties and a framework (developed with A. Kogtenkov) to infer them automatically, based on the “alias calculus”.
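
For illustration, the following sketch shows what a frame ("modifies only") clause promises and how it could be checked naively at run time; this is an assumption-laden toy in Python, not the alias calculus or the inference framework presented in the talk.

    # A frame property lists the parts of the state a routine may change;
    # everything else must be left untouched. This toy decorator checks such
    # a "modifies only" clause by snapshotting the object's fields around a call.
    import copy
    import functools

    def modifies_only(*allowed_fields):
        def wrap(method):
            @functools.wraps(method)
            def checked(self, *args, **kwargs):
                before = copy.deepcopy(vars(self))
                result = method(self, *args, **kwargs)
                for field, old_value in before.items():
                    if field not in allowed_fields:
                        assert vars(self)[field] == old_value, \
                            f"frame violation: {field!r} changed"
                return result
            return checked
        return wrap

    class Account:
        def __init__(self):
            self.balance = 0
            self.owner = "alice"

        @modifies_only("balance")          # the frame clause of deposit
        def deposit(self, amount):
            self.balance += amount         # allowed change
            # self.owner = "bob"           # uncommenting this would be reported

    Account().deposit(10)                  # passes the frame check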

Kostya Serebryany (Google Moscow, Russia)
AddressSanitizer & Friends: Using Compiler Instrumentation to Find Bugs at Run-Time

AddressSanitizer (ASan) is a tool that finds buffer overflows (in the stack, heap, and globals) and use-after-free bugs in C/C++ programs. ThreadSanitizer (TSan) finds data races in C/C++ and Go programs. MemorySanitizer (MSan) is a work-in-progress tool that finds uses of uninitialized memory (C++). These tools are based on compiler instrumentation (LLVM and GCC), which makes them very fast (e.g. ASan incurs just a 2x slowdown). In this talk we will share our experience in large-scale testing using these tools and discuss new related research areas.
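
To give a flavour of the underlying mechanism, here is a conceptual Python model of the redzone-and-shadow idea that ASan's compiler instrumentation relies on; the real tools insert these checks at compile time and use shadow memory at a fixed offset, so everything below (the dictionary shadow, the redzone size, the function names) is a simplification of mine.

    # Conceptual model of the ASan idea: allocations are surrounded by poisoned
    # "redzones", freed memory stays poisoned, and every (instrumented) access
    # first consults shadow state recording which addresses are addressable.
    shadow = {}          # address -> "addressable" | "redzone" | "freed"
    heap = {}            # address -> stored value
    REDZONE = 8          # toy redzone size in bytes
    next_free = 0x1000

    def asan_malloc(size):
        global next_free
        base = next_free + REDZONE
        for a in range(next_free, base):
            shadow[a] = "redzone"                        # left redzone
        for a in range(base, base + size):
            shadow[a] = "addressable"
        for a in range(base + size, base + size + REDZONE):
            shadow[a] = "redzone"                        # right redzone
        next_free = base + size + REDZONE
        return base

    def asan_free(base, size):
        for a in range(base, base + size):
            shadow[a] = "freed"                          # use-after-free is caught

    def checked_store(addr, value):
        # this is the check the instrumented load/store performs before access
        if shadow.get(addr) != "addressable":
            raise RuntimeError(f"ASan-style report: invalid write to {hex(addr)} "
                               f"({shadow.get(addr, 'wild address')})")
        heap[addr] = value

    buf = asan_malloc(4)
    checked_store(buf + 3, 42)         # in bounds: fine
    try:
        checked_store(buf + 4, 42)     # heap-buffer-overflow
    except RuntimeError as report:
        print(report)                  # the overflow is caught by the check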

2013

Yuri Gubanov (Belkasoft, Russia)
How to Find an Elephant in a Haystack. Investigating Digital Crimes

Despite the fact that there are myriad well-meaning ways of using digital devices and the Internet, the world, unfortunately, does not consist exclusively of decent people. More and more crime is moving to the virtual world, and computers and the Internet are becoming instruments for committing crimes. Quite naturally, countermeasures appear, and, moreover, there is an entire branch of science, called digital forensics, devoted to the investigation of crimes committed online or with the use of digital devices. In contrast to ordinary data examination, forensic data analysis is governed by strict legislation, and therefore digital forensic software has to meet special requirements. As the niche is rather narrow, there are not many companies in the world that deliver solutions to the digital investigation market. One of them is the Russian company “Belkasoft”.

Mirko Conrad (The MathWorks GmbH, Germany)
Developing High-Integrity Systems Using Model-Based Design

The increasing utilization of embedded software in application domains such as automotive, aerospace, or railway has resulted in a staggering complexity that has proven difficult to negotiate with conventional design approaches and processes. In addition, more and more projects need to comply with functional safety standards such as ISO 26262, DO-178C, or EN 50128, because modern software-based applications increasingly control or interact with mission-critical or safety-related system components. Development approaches and processes play a decisive role in addressing the complexity, productivity, and quality challenges. Because of its ability to address these challenges, Model-Based Design has been extensively used in various application domains. More recently, companies have begun to consider and adopt Model-Based Design for the development of embedded software for high-integrity applications.

Marek Miłosz (Institute of Computer Science, Lublin University of Technology, Poland)
Model Driven Engineering and WebML

Model Driven Engineering (MDE) is a way to develop software by creating and transforming models at different levels of abstraction and in different areas. The last transformation is code generation. The talk introduces the general idea of MDE, together with a dedicated language (WebML) and a software tool (WebRatio) that implement it in practice for developing web applications.
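
As a minimal illustration of that last transformation step (a sketch of mine, not WebML or WebRatio), the fragment below turns a tiny platform-independent model of an entity into executable Python code.

    # Toy model-to-code transformation: the final MDE step, code generation.
    model = {                                   # hypothetical entity model
        "entity": "Customer",
        "attributes": [("name", "str"), ("email", "str"), ("age", "int")],
    }

    def generate_class(m):
        lines = [f"class {m['entity']}:"]
        params = ", ".join(f"{name}: {typ}" for name, typ in m["attributes"])
        lines.append(f"    def __init__(self, {params}):")
        lines += [f"        self.{name} = {name}" for name, _ in m["attributes"]]
        return "\n".join(lines)

    source = generate_class(model)
    print(source)          # the generated source could be written to a module file
    exec(source)           # ...or loaded directly; the Customer class now exists
    print(Customer("Ada", "ada@example.com", 36).name)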

2012

Peter Gorm Larsen (Aarhus University, Denmark)

We argue that the main challenges to be overcome in developing future generations of IT-enabled products and services lie not so much in the software engineering discipline itself as in the collaborative relationships that software engineers have with other disciplines. We briefly review the need for more emphasis on multi-disciplinary approaches and consider three classes of demanding system: embedded products, systems-of-systems and cyber-physical systems. In each of these areas, we argue that there is a need for engineering with formal semantic bases that enable joint modelling, analysis and simulation of groups of heterogeneous models.

Nadezhda Yarushkina (Ulyanovsk State Technical University, Russia)

Qualitative evaluation and comparison of changes in the indicators of objects of different nature are used by designers, managers, decision makers, and experts to make decisions more soundly. To support such analysis of changes in data tied to particular dates and time intervals, models of fuzzy time series are applied. In this work, a model of fuzzy tendency whose carrier is a fuzzy time series, and its variety — the elementary tendency model — are proposed. The proposed models are applied to the problem of summarizing fuzzy time series in terms of tendencies.
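
As a rough illustration of the idea of elementary tendencies (a sketch only, with made-up membership functions and data, not the model proposed by the author), the fragment below fuzzifies the step-wise changes of an indicator into the linguistic tendencies "fall", "stability", and "growth".

    # Each step-wise change of the series is mapped to a linguistic tendency
    # with a membership degree; the labelled steps summarize the series.
    def triangular(x, a, b, c):
        """Triangular membership function with peak at b and support (a, c)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    TENDENCIES = {                          # membership functions over a change dx
        "fall":      lambda dx: triangular(dx, -10.0, -5.0, 0.0),
        "stability": lambda dx: triangular(dx, -2.0, 0.0, 2.0),
        "growth":    lambda dx: triangular(dx, 0.0, 5.0, 10.0),
    }

    series = [100, 103, 103.5, 99, 98, 104]          # hypothetical indicator values

    for prev, cur in zip(series, series[1:]):
        dx = cur - prev
        label, degree = max(((name, mu(dx)) for name, mu in TENDENCIES.items()),
                            key=lambda pair: pair[1])
        print(f"{prev:7.1f} -> {cur:7.1f}: {label} (membership {degree:.2f})")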

2011

Shihong Huang (Florida Atlantic University, USA)
Capturing the Essence of Software Engineering – A Reflection on Semat Initiative

The purpose of writing this Three Year Vision paper is three-fold. Firstly, it briefly recaps the progress Semat has made thus far; secondly, it lays out the future directions for people working actively within the Semat community; thirdly, it provides the background for seeking funding support from agencies such as the European Community and the like. Funding support is necessary to sustain the ongoing activities of Semat and its growth into a broader community effort, as most people working within Semat are volunteers. As such, the paper may be both too much and too little for the wider supporter base. However, we intend to make our work fully transparent; hence, we publish it widely. We seek feedback and comments from supporters and signatories in order to improve the vision. In this context, other companion papers are being written to better address the specific needs of practitioners, industry, and academia.

Petr Skobelev (Knowledge Genesis, Russia)

Optimal scheduling of loading, warehousing and delivering of goods, picking up and transporting passengers, allocating drivers, pilots, crew members or service engineers to tasks, and ensuring that capacities match demands, is difficult and often impossible to achieve when demand is unpredictable and the operation is frequently interrupted by changes in orders, delays, bad weather, no-shows or resource failures. Real-time schedulers, which can rapidly react to every disruptive event and re-schedule affected resources before the next disruptive event occurs, are the answer. The talk will discuss how Multi-Agent Technologies (MAT) can help in dynamic scheduling and other complex problems, like optimization, pattern discovery, text understanding, etc.
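
A minimal sketch of the reactive idea (my own toy example with hypothetical tasks and a greedy repair rule, not the Knowledge Genesis multi-agent scheduler): a disruptive event re-allocates only the affected tasks instead of rebuilding the whole schedule.

    # Event-driven re-scheduling: only tasks hit by the disruption are repaired.
    schedule = {                       # task -> assigned resource
        "deliver_goods": "truck_1",
        "pick_up_passengers": "truck_2",
        "service_call": "truck_3",
    }
    capacity = {"truck_1": 1, "truck_2": 2, "truck_3": 1}   # remaining free slots

    def on_resource_failure(failed):
        """React to a disruptive event by moving tasks off the failed resource."""
        capacity.pop(failed, None)
        affected = [task for task, res in schedule.items() if res == failed]
        for task in affected:
            candidates = [res for res, free in capacity.items() if free > 0]
            if not candidates:
                schedule[task] = None          # unassigned: must wait for capacity
                continue
            new_res = candidates[0]            # greedy repair of the local conflict
            schedule[task] = new_res
            capacity[new_res] -= 1

    on_resource_failure("truck_1")             # e.g. a breakdown reported in real time
    print(schedule)                            # only deliver_goods was re-assigned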

2010

Tiziana Margaria (University of Potsdam, Germany)
eXtreme Model Driven Design/Engineering

Taming complexity by building models is one of the oldest and best established engineering principles. In software engineering this is complicated by the typically high degree of abstraction, which easily leads to misunderstandings already at requirements analysis time. Additionally, within modern model-driven development approaches, taking care of heterogeneity leads to a variety of aspect- and phase-specific models whose interplay can hardly be overseen even by experts. This is where eXtreme Model Driven Design (XMDD) comes in, with a coherent and user-oriented modeling approach. The backbone of the entire software/system lifecycle is a behavioural model accessible to the users (typically, application experts with little IT background). On the one hand, this model is selectively refined until the IT experts can proceed with a simple service-oriented implementation; on the other hand, it serves the application expert as a continuously available and up-to-date sensor, control, and validation instrument. This way, harmony between the application expert and the IT side is established and coherently maintained, with high benefits especially in application domains with strong evolution needs. The potential of this approach has been proven not only in scenarios from business informatics and bioinformatics, but also in its successful use in distributed, cooperative project-based teaching.

Aleksey Savateyev (Microsoft, USA)
Building Cloud Applications for the Real World

This session will give you an overview of what cloud computing is and what a platform-as-a-service cloud such as the Windows Azure Platform can be used for in creating solutions for real-world scenarios. You will learn about all the major components of the Windows Azure Platform and how they work with each other in a massively scalable architecture. You will find out what others around the world are building in the cloud, what issues they are encountering, and how these issues are getting resolved.

2009

Victor Kuliamin (Institute for System Programming of RAS, Russia)

The talk provides a review of recent advances in hybrid verification techniques, such as model-based testing, extended static analysis, runtime verification, and compound structured testing, and focuses on improvements of the tools supporting these approaches. The review shows that, along with implementing new techniques like counterexample-guided abstraction refinement or directed structured test generation based on constraint resolution, modern verification tools are also becoming more modular and tend to reuse more components from other tools. Thus, the implementation of integrated verification methods that apply a variety of techniques until recently available only in research prototypes is becoming feasible in industrial software development tools.