Alessandra Di Pierro

Main works - abstracts


Journals



Tempus Fugit: How to Plug It (With C. Hankin and H. Wiklicky) Journal of Logic and Algebraic Programming. PDF available.
ABSTRACT
Secret or private information may be leaked to an external attacker through the timing behaviour of a system running untrusted code. After introducing a formalisation of this situation in terms of a confinement property, we present an algorithm that transforms the system into one which is computationally equivalent to it but free of timing leaks.
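
As a toy illustration of the kind of transformation involved (our sketch, not the paper's algorithm; it assumes equal-length strings), consider a comparison whose running time leaks the secret, and a padded version that does the same work on every path:

    import time

    def check(secret, guess):
        # Leaky: returns as soon as a mismatch is found, so the running
        # time reveals how many leading characters of the guess are right.
        for s, g in zip(secret, guess):
            time.sleep(0.001)            # models per-character work
            if s != g:
                return False
        return True

    def check_padded(secret, guess):
        # Transformed: same result, but every run performs the same amount
        # of work, so the timing behaviour no longer depends on the secret.
        ok = True
        for s, g in zip(secret, guess):
            time.sleep(0.001)            # identical work on every path
            ok = ok and (s == g)
        return ok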

Reversible Combinatory Logic (With C. Hankin and H. Wiklicky) Mathematical Structures in Computer Science. PDF available.
ABSTRACT
The λ-calculus is destructive: its main computational mechanism -- beta reduction -- destroys the redex and thus makes it impossible to replay the computational steps. Combinatory logic is a variant of the λ-calculus which maintains irreversibility. Recently, reversible computational models have been studied mainly in the context of quantum computation, as (without measurements) quantum physics is inherently reversible. However, reversibility also fundamentally changes the semantical framework in which classical computation has to be investigated. We describe an implementation of classical combinatory logic into a reversible calculus for which we present an algebraic model based on a generalisation of the notion of group.
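
The information loss is easiest to see on the K combinator, whose reduction K x y -> x discards y. A minimal Python sketch (our illustration, not the paper's reversible calculus) of why the step cannot be undone, and of the standard remedy of keeping the discarded argument as explicit "garbage" output:

    # Reduction of the K combinator: K x y -> x. The argument y is
    # discarded, so the map is not injective and the step cannot be
    # reversed from its result alone.
    def reduce_K(x, y):
        return x

    # A reversible variant must preserve the lost information, e.g. by
    # returning y alongside the result (a sketch of the standard trick,
    # not the calculus defined in the paper):
    def reduce_K_rev(x, y):
        return x, y   # injective: the inputs are recoverable from the output

    assert reduce_K_rev("x", "y") == ("x", "y")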

Quantitative Static Analysis of Distributed Systems (With C. Hankin and H. Wiklicky) Journal of Functional Programming. PDF available.
ABSTRACT
We introduce a quantitative approach to the analysis of distributed systems which relies on a linear operator based network semantics. A typical problem in a distributed setting is how information propagates through a network, and a typical qualitative analysis is concerned with establishing whether some information will eventually be transmitted from one node to another node in the network. The quantitative approach we present allows us to obtain additional information, such as an estimate of the probability that some data is transmitted within a given interval of time. We formalise situations like this using a probabilistic version of a process calculus which is the core of KLAIM, a language for distributed and mobile computing based on interactions through distributed tuple spaces. The analysis we present exploits techniques based on Probabilistic Abstract Interpretation and is characterised by compositional aspects which greatly simplify the inspection of the nodes' interactions and the detection of information propagation through a computer network.
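
As a toy illustration of the kind of quantitative question involved (our example, not the paper's KLAIM-based analysis): if one-step propagation is modelled by a stochastic matrix, the probability that a datum reaches a target node within k steps can be read off the matrix powers.

    import numpy as np

    # Toy 3-node network: row i gives the distribution of where a datum
    # at node i moves in one time step; node 2 is absorbing (the target).
    P = np.array([[0.5, 0.5, 0.0],
                  [0.0, 0.5, 0.5],
                  [0.0, 0.0, 1.0]])

    start = np.array([1.0, 0.0, 0.0])   # datum initially at node 0

    k = 10
    prob_within_k = (start @ np.linalg.matrix_power(P, k))[2]
    print(f"P(reach node 2 within {k} steps) = {prob_within_k:.4f}")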

Probabilistic Lambda-calculus and Quantitative Program Analysis (With C. Hankin and H. Wiklicky) Journal of Logic and Computation. PDF available.
ABSTRACT
We show how the framework of probabilistic abstract interpretation can be applied to statically analyse a probabilistic version of the λ-calculus. The resulting analysis allows for a more speculative use of its outcomes based on the consideration of statistically defined quantities. After introducing a linear operator based semantics for our probabilistic λ-calculus $\Lambda_P$, and reviewing the framework of abstract interpretation and strictness analysis, we demonstrate our technique by constructing a probabilistic (first-order) strictness analysis for $\Lambda_P$.

Measuring the Confinement of Probabilistic Systems (With C. Hankin and H. Wiklicky) Theoretical Computer Science. PDF available.
ABSTRACT
In this paper we lay the semantic basis for a quantitative security analysis of probabilistic systems by introducing notions of approximate confinement based on various process equivalences. We re-cast the operational semantics classically expressed via probabilistic transition systems (PTS) in terms of linear operators and we present a technique for defining approximate semantics as probabilistic abstract interpretations of the PTS semantics. An operator norm is then used to quantify this approximation. This provides a quantitative measure $\varepsilon$ of the indistinguishability of two processes and therefore of their confinement. In this security setting, a statistical interpretation of the quantity $\varepsilon$ is then given, which relates it to the number of tests needed to breach the security of the system.
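
In finite dimensions the construction has a direct numerical reading. A sketch (the matrices are toy examples and the abstraction merging states 0 and 1 is our choice): abstract the linear operators of the two processes via the Moore-Penrose pseudoinverse and take the operator norm of their difference as epsilon.

    import numpy as np

    # Linear-operator (stochastic matrix) semantics of two processes that
    # differ only in how likely they are to move into state 2.
    T1 = np.array([[0.5, 0.0, 0.5],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
    T2 = np.array([[0.9, 0.0, 0.1],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])

    # Probabilistic abstract interpretation: an abstraction K together with
    # its Moore-Penrose pseudoinverse K+; the abstract operator is K+ T K.
    K = np.array([[1.0, 0.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])          # merge states 0 and 1
    Kp = np.linalg.pinv(K)
    A1, A2 = Kp @ T1 @ K, Kp @ T2 @ K

    eps = np.linalg.norm(A1 - A2, 2)    # operator (spectral) norm as epsilon
    print(eps)                          # ~0.283: the abstraction still tells them apart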

Approximate Non-Interference (With C. Hankin and H. Wiklicky) Journal of Computer Security, 12(1):37-81, 2004. Postscript available.
ABSTRACT
We address the problem of characterising the security of a program against unauthorised information flows. Classical approaches are based on non-interference models which depend ultimately on the notion of process equivalence. In these models confidentiality is an absolute property stating the absence of any illegal information flow. We present a model in which the notion of non-interference is approximated in the sense that it allows for some exactly quantified leakage of information. This is characterised via a notion of process similarity which replaces the indistinguishability of processes by a quantitative measure of their behavioural difference. Such a quantity is related to the number of statistical tests needed to distinguish two behaviours. We also present two semantics-based analyses of approximate non-interference and we show that one is a correct abstraction of the other.

A preliminary version of this paper appeared in the proceedings of CSFW 2002 and WITS 2002.


Nondeterminism and Infinite Computations in Constraint Programming. (With F.S. de Boer and C. Palamidessi) Theoretical Computer Science, 151:37-78, 1995. Postscript available.
ABSTRACT
We investigate the semantics of concurrent constraint programming and of various sublanguages, with particular emphasis on nondeterminism and infinite behavior. The aim is to identify the minimal structure which a domain must have in order to capture these two aspects. We show that a notion of observables, obtained by the upward-closure of the results of computations, is relatively easy to model even in the presence of synchronization. By contrast, modeling the exact set of results is problematic, even for the simple sublanguage of constraint logic programming. We show that most of the standard topological techniques fail in capturing this more precise notion of observables. The analysis of these failed attempts leads us to consider a categorical approach.

An algebraic perspective of constraint logic programming. (With F.S. de Boer and C. Palamidessi) Journal of Logic and Computation, 7(1), 1997. Postscript available.
ABSTRACT
We develop a denotational, fully abstract semantics for constraint logic programming (clp) with respect to successful and failed observables. The denotational approach turns out to be very useful for the definition of new operators on the language as the counterpart of abstract operations on the denotational domain. In particular, by defining our domain as a cylindric Heyting algebra, we can exploit, to this end, operations of both cylindric algebras (such as cylindrification) and Heyting algebras (such as implication and negation). The former allows us to generalize the clp language by introducing an explicit hiding operator; the latter allows us to define a notion of negation which extends the classical negation used in Logic Programming. In particular, we show that our notion subsumes both Negation as Failure and Negation as Instantiation.

A preliminary version of this paper, with title A Logical Denotational Semantics for Constraint Logic Programming, appeared in the proceedings of ESOP 94.


Negation as Instantiation. (With M. Martelli and C. Palamidessi.) Information and Computation, 120(2):263-278, 1995. Postscript available.
ABSTRACT
We propose a new negation rule for logic programming which derives existentially closed negative literals, and we define a version of completion for which this rule is sound and complete. The rule is called "Negation As Instantiation" (NAI). According to it, a negated atom succeeds whenever all the branches of the SLD-tree for the atom either fail or instantiate the atom. The set of the atoms whose negation is inferred by the NAI rule is proved equivalent to the complement of $T_c\!\downarrow\!\omega$, where $T_c$ is the immediate consequence operator extended to non-ground atoms (Falaschi et al., 1989). The NAI rule subsumes negation as failure on ground atoms, excludes floundering, and can be efficiently implemented. We formalize this way of handling negation in terms of SLDNI-resolution (SLD-resolution with Negation as Instantiation). Finally, we amalgamate SLDNI-resolution and SLDNF-resolution, thus obtaining a new resolution procedure which is able to treat negative literals with both existentially quantified variables and free variables, and we prove its correctness.
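
In symbols (our paraphrase of the rule as stated above): NAI infers $\exists\neg A$ whenever every branch of the SLD-tree for $\leftarrow A$ either fails or computes an answer $\theta$ such that $A\theta$ is a proper instance of $A$. For example, from the single-clause program $p(a)$, the only branch of the SLD-tree for $\leftarrow p(x)$ instantiates $x$ to $a$, so NAI infers $\exists\neg p(x)$; this is sound with respect to the completion over a universal vocabulary, where terms other than $a$ exist.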

A preliminary version of this paper, with title Negation As Instantiation: a New Rule for the Treatment of Negation in Logic Programming, appeared in the proceedings of ICLP 91.

See also An algebraic perspective of constraint logic programming.


International Conferences



Semantic Abstraction and Quantum Computation (With H. Wiklicky) PDF available.
ABSTRACT
We present a logico-algebraic approach to probabilistic abstract interpretation based on the ortholattice structure of the projective measurement operators in quantum mechanics. On this basis, we present a novel interpretation of quantum measurement as a probabilistic abstraction, showing that the measurement of a physical observable essentially corresponds to a static analysis of the observed property.
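
The correspondence builds on the standard projective measurement rule, recalled here for orientation: for a density operator $\rho$ and projections $\{P_i\}$ with $\sum_i P_i = I$, outcome $i$ is observed with probability $p(i) = \mathrm{tr}(P_i \rho P_i)$, and the state collapses to $P_i \rho P_i / p(i)$. The measurement thus retains only the part of $\rho$ lying in the observed subspaces, much as a static analysis retains only the property of interest.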

Abstract Interpretation for Worst and Average Case Analysis (With C. Hankin and H. Wiklicky) PDF available.
ABSTRACT
We review Wilhelm's work on worst-case execution time (WCET) for hard real-time applications and also recent work on the analysis of soft real-time systems using probabilistic methods. We then present Probabilistic Abstract Interpretation (PAI) as a quantitative variation of the classical approach; PAI aims to provide close approximations -- this should be contrasted with the safe approximations studied in the standard setting. We discuss the relation between PAI and classical Abstract Interpretation as well as average case analysis.

Probabilistic Chemical Abstract Machine and the Expressiveness of Linda Languages (With C. Hankin and H. Wiklicky) PDF available.
ABSTRACT
The Chemical Abstract Machine of Berry and Boudol provides a commonly accepted, uniform framework for describing the operational semantics of various process calculi and languages, such as CCS, the pi-calculus and coordination languages like Linda. In its original form the CHAM is purely non-deterministic and thus only describes which reactions are possible, but not how long it will take (on average) before a certain reaction takes place, nor with what probability. Such quantitative information is, however, often vital for "real world" applications such as systems biology or performance analysis. We propose a probabilistic version of the CHAM. We then define a linear operator semantics for the probabilistic CHAM which exploits a tensor product representation for distributions over possible solutions. Based on this we propose a novel approach towards comparing the expressive power of different calculi via their encoding in the probabilistic CHAM. We illustrate our approach by comparing the expressiveness of various Linda languages.
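
The tensor product representation has a concrete finite-dimensional reading (a toy sketch; the two independent "molecules" are our example): the distribution over a compound solution is the Kronecker product of the component distributions.

    import numpy as np

    # Distributions over the states of two independent "molecules".
    d1 = np.array([0.25, 0.75])
    d2 = np.array([0.5, 0.3, 0.2])

    # The distribution over the compound solution lives in the tensor
    # product space; numerically it is the Kronecker product.
    joint = np.kron(d1, d2)
    print(joint, joint.sum())   # still a probability distribution (sums to 1)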

Noninterference and the Most Powerful Probabilistic Adversary (With A. Aldini) PDF available.
ABSTRACT
Probabilistic noninterference extends the classical possibilistic notion introduced by Goguen and Meseguer in order to capture the information leakage caused by adversaries that set up probabilistic covert channels. In this setting we investigate how to evaluate the observational power of an adversary for the purpose of establishing the maximal security degree of a given system. We introduce three classes of probabilistic adversaries, which represent different observational powers of an adversary, and then we establish, for each such class, properties which state the complexity of effectively computing the most powerful adversary.

On Reversible Combinatory Logic (With C. Hankin and H. Wiklicky) PDF available.
ABSTRACT
The lambda calculus is destructive: its main computational mechanism -- beta reduction -- destroys the redex and thus makes it impossible to replay the computational steps. Recently, reversible computational models have been studied mainly in the context of quantum computation, as (without measurements) quantum physics is inherently reversible. However, reversibility also fundamentally changes the semantical framework in which classical computation has to be investigated. We describe an implementation of classical combinatory logic into a reversible calculus for which we present an algebraic model based on a generalisation of the notion of group.

Probabilistic Linda-based Coordination Languages. (With C. Hankin and H. Wiklicky) PDF available.
ABSTRACT
Coordination languages are intended to simplify the development of complex software systems by separating the coordination aspects of an application from its computational aspects. Coordination refers to the ways in which the independent active pieces of a program (e.g. processes, tasks, threads) communicate and synchronise with each other. We review various approaches to introducing probabilistic or stochastic features into coordination languages. The main objective of such a study is to develop a semantic basis for a quantitative analysis of systems of interconnected or interacting components, which allows us to address not only the functional (qualitative) aspects of a system's behaviour but also its non-functional aspects, typically considered in the realm of performance modelling and evaluation.

Operator Algebras and the Operational Semantics of Probabilistic Languages. (With H. Wiklicky) PDF available.
ABSTRACT
We investigate the construction of linear operators representing the semantics of probabilistic programming languages expressed via probabilistic transition systems. Finite transition relations, corresponding to finite automata, can easily be represented by finite dimensional matrices; for the infinite case we need to consider an appropriate generalisation of matrix algebras. We argue that C*-algebras, or more precisely Approximately Finite (or AF) algebras, provide a sufficiently rich mathematical structure for modelling probabilistic processes. We show how to construct for a given probabilistic language a unique AF algebra A and how to represent the operational semantics of processes within this framework: finite computations correspond directly to operators in A, while infinite processes are represented by elements in the so-called strong closure of this algebra.

Continuous-Time Probabilistic KLAIM. (With C. Hankin and H. Wiklicky) PDF available.
ABSTRACT
We present a probabilistic version of KLAIM, which we call pKLAIM. At the local level, we introduce probabilistic parallel and choice operators. In addition, we use probabilistic allocation environments which associate distributions on physical sites with logical localities. Furthermore, we associate a rate with each node; this determines how often a node is active. An alternative would be to associate a probability with each node, indicating the chance that a process at that node will be selected for execution. We have studied such a variant in detail in an earlier paper.
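
The role of the rates can be pictured as the standard race of exponential clocks (a toy sketch, not pKLAIM's actual semantics; the node names and rates are made up): each node draws a waiting time from an exponential distribution with its own rate, and the node with the smallest draw acts next.

    import random

    rates = {"node_A": 2.0, "node_B": 0.5}   # hypothetical activity rates

    def next_active(rates):
        # Each node's clock fires after an Exp(rate) waiting time;
        # a higher rate means more frequent activity.
        waits = {n: random.expovariate(r) for n, r in rates.items()}
        return min(waits, key=waits.get)

    sample = [next_active(rates) for _ in range(10000)]
    print(sample.count("node_A") / len(sample))  # ~ 2.0 / 2.5 = 0.8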

On Quantitative Analysis of Probabilistic Protocols. (With A. Aldini) PDF available.
ABSTRACT
We advocate the use of approximate noninterference for the security analysis of probabilistic protocols. Our approach relies on a formalisation of the protocol in the setting of a probabilistic process algebra and a notion of process similarity based on weak probabilistic bisimulation. We illustrate this approach by presenting the analysis of a probabilistic nonrepudiation protocol, which allows us to give a quantitative estimate of its degree of fairness.

Probabilistic KLAIM. (With C. Hankin and H. Wiklicky) Postscript available.
ABSTRACT
We introduce a probabilistic extension of KLAIM, where the behaviour of networks and individual nodes is determined by a probabilistic scheduler for processes and probabilistic allocation environments which describe the logical neighbourhood of each node. The resulting language has two variants which are modelled respectively as discrete and continuous time Markov processes. We suggest that Poisson processes are a natural probabilistic model for the coordination of discrete processes asynchronously communicating in continuous time and we use them to define the operational semantics of the continuous time variant. This framework allows for the implementation of networks with independent clocks on each site.

A Quantitative Approach to Noninterference for Probabilistic Systems. (With A. Aldini) Postscript available.
ABSTRACT
We present a technique for measuring the security of a system which relies on a probabilistic process algebraic formalisation of noninterference. We define a mathematical model for this technique which consists of a linear space of processes and linear transformations on them. In this model the measured quantity corresponds to the norm of a suitably defined linear operator associated with the system. The probabilistic model we adopt is reactive in the sense that processes can react to the environment with a probabilistic choice on a set of inputs; it is also generative in the sense that outputs autonomously chosen by the system are governed by a probability distribution. In this setting, noninterference is formulated in terms of a probabilistic notion of weak bisimulation. We show how the probabilistic information in this notion can be used to estimate the maximal information leakage, i.e. the security degree of a system against the most powerful attacker.

Quantitative Relations and Approximate Process Equivalences. (With C. Hankin and H. Wiklicky) Postscript available.
ABSTRACT
We introduce a characterisation of probabilistic transition systems (PTS) in terms of linear operators on some suitably defined vector space representing the set of states. Various notions of process equivalences can then be re-formulated as abstract linear operators related to the concrete PTS semantics via a probabilistic abstract interpretation. These process equivalences can be turned into corresponding approximate notions by identifying processes whose abstract operators "differ" by a given quantity, which can be calculated as the norm of the difference operator. We argue that this number can be given a statistical interpretation in terms of the tests needed to distinguish two behaviours.

Measuring the Confinement of Concurrent Probabilistic Systems. (With C. Hankin and H. Wiklicky) Postscript available.
ABSTRACT
In previous work we have studied the notion of approximate confinement based on I/O observables. In this paper, we generalise this work to consider probabilistic bisimulation (both exact and approximate) in the context of probabilistic transition systems. We also show the relationship between probabilistic bisimulation and probabilistic abstract interpretation; this result parallels Schmidt's result in the classical (non-probabilistic) setting where he relates bisimulation and Galois connections. We conclude with some examples.

Linear Embedding for a Quantitative Comparison of Language Expressiveness. (With A. Brogi and H. Wiklicky) Postscript available.
ABSTRACT
We introduce the notion of linear embedding which refines Shapiro's notion of embedding by recasting it in a linear-space based semantics setting. We use this notion to compare the expressiveness of a class of languages that employ asynchronous communication primitives à la Linda. The adoption of a linear semantics in which the observables of a language are linear operators (matrices) representing the programs' transition graphs allows us to give quantitative estimates of the different expressive power of languages, thus improving previous results in the field.

Probabilistic Abstract Interpretation and Statistical Testing. (With H. Wiklicky) Postscript available.
ABSTRACT
Although generally too weak to guarantee correctness, testing is an indispensable technique for the validation of reactive systems. Abstract Interpretation has been used to overcome some of the problems associated with testing, such as the termination of a program run within an acceptable interval of time. We propose the use of Probabilistic Abstract Interpretation for measuring the error made in the evaluation of the reliability of systems via Monte-Carlo testing. The problem we consider is to determine the probability that a reactive system's response falls within a certain set of possible outputs, i.e. that it passes a certain test. As Monte-Carlo testing of the concrete system is not always feasible, tests can be performed instead on an abstract system. This abstraction introduces a further approximation which must be taken into account in the estimation of the error made by testing. The problem is to find out how much the expectation value (or average) of the abstract system differs from that of the concrete system. We show how to measure this difference by lifting the abstraction to a probabilistic abstract interpretation.
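
A minimal Monte-Carlo reading of the question being asked (our illustration; the system and the pass predicate below are hypothetical): estimate by sampling the probability that a response lands in the accepted set, which is the quantity whose abstract counterpart the paper's analysis controls.

    import random

    def system(x):
        # Hypothetical reactive system with a noisy response.
        return (3 * x + random.gauss(0, 1)) % 7

    def passes(y):
        # The test: does the output fall in the accepted set?
        return y < 5.0

    N = 100_000
    est = sum(passes(system(random.uniform(0, 1))) for _ in range(N)) / N
    print(f"estimated pass probability: {est:.3f}")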

Analysing Approximate Confinement under Uniform Attacks. (With C. Hankin and H. Wiklicky) Postscript available.
ABSTRACT
We are concerned with giving certain guarantees about the security of a system. We identify two kinds of attack: internally scheduled attacks (exemplified by Trojan Horse attacks) and externally scheduled attacks (exemplified by timing attacks). In this paper we focus on the latter. We present a semantic framework for studying such attacks in the context of PCCP, a simple process algebra with a constraint store. We show that a measure of the efficacy of an attacker can be determined by considering its observable behaviour over the "average" store of the system (for some number of steps). We show how to construct an analysis to determine the average store using the technique of probabilistic abstract interpretation.

Approximate Non-Interference. (With C. Hankin and H. Wiklicky) Postscript available.
ABSTRACT
We address the problem of characterising the security of a program against unauthorised information flows. Classical approaches are based on non-interference models which depend ultimately on the notion of process equivalence. In these models confidentiality is an absolute property stating the absence of any illegal information flow. We present a model in which the notion of non-interference is approximated in the sense that it allows for some exactly quantified leakage of information. This is characterised via a notion of process similarity which replaces the indistinguishability of processes by a quantitative measure of their behavioural difference. Such a quantity is related to the number of statistical tests needed to distinguish two behaviours. We also present two semantics-based analyses of approximate non-interference and we show that one is a correct abstraction of the other.

Quantum Constraint Programming. (With H. Wiklicky) Postscript available.
ABSTRACT
Quantum computers are hypothetical machines which can perform many calculations simultaneously, based on quantum-mechanical principles that allow a single bit to exist in many states at once. This enormous potential of quantum computing has attracted substantial interest, especially during the last decade, and initiated a whole new field of research. As a contribution to this research we address the problem of the design of high-level languages for programming quantum computers, and the definition of an appropriate formal semantics for such languages. To this end we consider the Constraint Programming paradigm and we show how computations in this paradigm can be seen as physical processes obeying the laws of quantum mechanics.

Linear Structures for Concurrency in Probabilistic Programming Languages. (With H. Wiklicky) Postscript available.
ABSTRACT
We introduce a semantical model based on operator algebras and we show the suitability of this model to capture both a quantitative version of non-determinism (in the form of a probabilistic choice) and concurrency. We present the model by referring to a generic language which generalises various probabilistic concurrent languages from different programming paradigms. We discuss the relation between concurrency and the commutativity of the resulting semantical domain. In particular, we use Gelfand's representation theorem to relate the semantical models of synchronisation-free and fully concurrent versions of the language. A central aspect of the model we present is that it allows for a unified view of both operational and denotational semantics for a concurrent language.

Probabilistic Confinement in a Declarative Framework. (With H. Wiklicky and C. Hankin) Postscript available.
ABSTRACT
We show how to formulate and investigate security notions in the context of declarative programming. We concentrate on a particular class of security properties, namely the so-called confinement properties. Our reference language is concurrent constraint programming. We use a probabilistic version of this language to highlight, via simple program examples, the difference between probabilistic and non-deterministic confinement as pointed out in the work by Volpano and Smith in the context of imperative languages. The different role played by variables in imperative and constraint programming hinders a direct translation of the notion of confinement into our declarative setting. Therefore, we introduce the notion of "identity confinement", which is more appropriate for constraint languages. Finally, we present an approximating probabilistic semantics which can be used as a basis for the analysis of confinement properties.

Measuring the Precision of Abstract Interpretations. (With H. Wiklicky.) Postscript available.
ABSTRACT
We present a method for approximating the semantics of probabilistic programs for the purpose of constructing semantics-based analyses of various properties. The method is also suited to a probabilistic analysis of classical deterministic programs. The framework resembles the Galois connection based one used in abstract interpretation, the main difference being the choice of linear space structures instead of order-theoretic ones as semantical (concrete and abstract) domains. Linear spaces reflect the quantitative aspects of (probabilistic) computation and are therefore of fundamental importance in the semantics and the semantics-based analysis. The intrinsic quantitative nature of linear spaces makes the method suitable for investigations of the problem of a numerical comparison of abstract interpretations with respect to their precision.

Concurrent Constraint Programming: Towards Probabilistic Abstract Interpretation. (With H. Wiklicky.) Postscript available.
ABSTRACT
We present a method for approximating the semantics of probabilistic programs for the purpose of constructing semantics-based analyses of such programs. The method resembles the one based on Galois connections developed in the Cousot framework for abstract interpretation. The main difference between our approach and the standard theory of abstract interpretation is the choice of linear space structures instead of order-theoretic ones as semantical (concrete and abstract) domains. We show that our method generates "best approximations" according to an appropriate notion of precision defined in terms of a norm. Moreover, if recast in an order-theoretic setting, these approximations are correct in the sense of classical abstract interpretation theory. We use Concurrent Constraint Programming as a reference programming paradigm. The basic concepts and ideas can nevertheless be applied to any other paradigm. The results we present are intended to be the first step towards a general theory of probabilistic abstract interpretation, which re-formulates the classical theory in a setting suitable for quantitative reasoning about programs.
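
The "best approximation" claim has a concrete linear-algebra core, sketched numerically below (the matrices are toy examples of ours): for an abstraction K, the abstract operator built from the Moore-Penrose pseudoinverse is the least-squares closest abstract counterpart of the concrete operator.

    import numpy as np

    Phi = np.array([[0.2, 0.8, 0.0],
                    [0.0, 0.5, 0.5],
                    [0.3, 0.0, 0.7]])     # toy concrete (stochastic) operator

    K = np.array([[1.0, 0.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])            # abstraction: merge states 0 and 1

    Kp = np.linalg.pinv(K)                # Moore-Penrose pseudoinverse
    A = Kp @ Phi @ K                      # induced abstract operator

    # Least-squares optimality: A minimises ||Phi @ K - K @ B|| over all B,
    # so any perturbation of A does strictly worse.
    B = A + 0.01 * np.random.randn(2, 2)
    print(np.linalg.norm(Phi @ K - K @ A),
          np.linalg.norm(Phi @ K - K @ B))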

Randomised Algorithms and Probabilistic Constraint Programming. (With H. Wiklicky.) Postscript available.
ABSTRACT
We propose a declarative implementation of randomised algorithms, which exploits the Constraint Logic Programming (CLP) paradigm. For the high-level formalisation of probabilistic programs expressing such algorithms we actually refer to a generalisation of CLP, namely the Probabilistic Concurrent Constraint Programming (PCCP) language, previously introduced by the authors. This language provides a construct for probabilistic choice which allows us to express randomness in a program. The design of PCCP does not require any additional structure on the underlying constraint system (e.g. fuzzy or belief systems) and therefore also allows a straightforward implementation. We demonstrate the use of this language for implementing randomised algorithms. In particular, we give an extensive treatment of two popular (generic) randomised algorithms, namely Simulated Annealing and Randomised Rounding, and we discuss some instantiations of these algorithms for solving two well-known optimisation problems, namely the travelling salesman (TSP) and the maximum satisfiability (MAX SAT) problems.

Quantitative Observables and Averages in Probabilistic Constraint Programming. (With H. Wiklicky.) Postscript available.
ABSTRACT
We investigate notions of observable behaviour of programs which include quantitative aspects of computation along with the most commonly assumed qualitative ones. We model these notions by means of a transition system where transitions occur with a given probability and an associated 'cost' expressing some complexity measure (e.g. running time or, in general, resource consumption). The addition of these quantities allows for a natural formulation of the average behaviour of a program, whose specification and analysis is particularly important in the study of system performance and reliability. We base our model on the Concurrent Constraint Programming (CCP) paradigm and we argue that it can be an appropriate basis for further developments oriented to the analysis and verification of average properties.

Ergodic Average in Constraint Programming. (With H. Wiklicky.) Postscript available.
ABSTRACT
We discuss the problem of modelling probabilistic properties of constraint programs which express the average of some quantities of interest. A random variable on the set of constraints is used to assign to each constraint a real value representing the 'cost' of that constraint. In this way, we obtain a notion of quantitative observables Q which, although interesting in itself, can be used to define different types of average properties.

A Markov Model for Probabilistic Concurrent Constraint Programming. (With H. Wiklicky.) Postscript available.
ABSTRACT
This paper introduces a new approach towards the semantics of Concurrent Constraint Programming (CCP), which is based on operator algebras. In particular, we show how stochastic matrices can be used for modelling both a quantitative version of nondeterminism (in the form of a probabilistic choice) and synchronisation. We will assume Probabilistic Concurrent Constraint Programming (PCCP) as the reference paradigm. The presented model subsumes CCP as a special case. The model is obtained by a fixpoint construction on a space of linear operators. For the purpose of this paper, we consider only finite constraint systems. This allows us to define our model by using finite dimensional linear operators, i.e. matrices. We show that our model is equivalent to a notion of observables which extends the standard one in CCP (input/output behaviour) by including information about the likelihood of the computed results. The model also captures infinite computations. The adoption of functional-analytic and operator-algebraic methods constitutes the main novelty of this work. It also creates a natural link between (a certain field of) computer science and other disciplines, such as physics, through the theory of stochastic processes.

Implementing Randomised Algorithms in Constraint Logic Programming. (With N. Angelopoulos and H. Wiklicky.) Postscript available.
ABSTRACT
We propose a declarative implementation of randomised algorithms, which exploits the Constraint Logic Programming (CLP) paradigm. For the high-level formalisation of probabilistic programs expressing such algorithms we actually refer to a generalisation of CLP, namely the Probabilistic Concurrent Constraint Programming (PCCP) language, previously introduced at ICCL 98. This language provides a construct for probabilistic choice which allows us to express randomness in a program. PCCP also includes synchronisation and concurrency aspects; however, for the purpose of this work, the (probabilistic) CLP fragment of PCCP is sufficient. We present a meta-interpreter for this language. This is a standard Prolog meta-interpreter, suitably extended so as to deal with probabilistic choice. For the constraint solving, the meta-interpreter exploits existing constraint handling facilities (concretely, those of the SICStus 3#6 system). This is possible because the design of PCCP does not require any additional structure on the underlying constraint system (e.g. fuzzy or belief systems). We demonstrate the use of this system for implementing randomised algorithms. In particular, we give an extensive treatment of two popular (generic) randomised algorithms, namely Simulated Annealing and Randomised Rounding, and we discuss some instantiations of these algorithms for solving two well-known optimisation problems, namely the travelling salesman (TSP) and the maximum satisfiability (MAX SAT) problems.

Probabilistic Concurrent Constraint Programming: Towards a Fully Abstract Model. (With H. Wiklicky.) Postscript available.
ABSTRACT
This paper presents a Banach space based approach towards a denotational semantics of a probabilistic constraint programming language. This language is based on the concurrent constraint programming paradigm, where randomness is introduced by means of a probabilistic choice construct. As a result, we obtain a declarative framework, in which randomised algorithms can be expressed and formalised. The denotational model we present is constructed by using functional-analytical techniques. As an example, the existence of fixed-points is guaranteed by the Brouwer-Schauder Fixed-Point Theorem. A concrete fixed-point construction is also presented which corresponds to a notion of observables capturing the exact results of both finite and infinite computations.

An Operational Semantics for Probabilistic Concurrent Constraint Programming. (With H. Wiklicky.) Postscript available.
ABSTRACT
This paper investigates a probabilistic version of the concurrent constraint programming paradigm (CCP). The aim is to introduce the possibility of formulating so-called "randomised algorithms" within the CCP framework. Unlike common approaches in (imperative) high-level programming languages, which rely on some kind of random() function, we introduce randomness in the very definition of the language by means of a probabilistic choice construct. This allows a program to make stochastic moves during its execution. We call the resulting language Probabilistic Concurrent Constraint Programming (PCCP). We present an operational semantics for PCCP by means of a probabilistic transition system, such that the execution of a PCCP program may be seen as a stochastic process, i.e. as a random walk on the transition graph. The transition probabilities are given explicitly. This semantics captures a notion of observables which combines the results of computations with the probability of those results being computed. This embedding of randomness within the semantics of a well-structured programming paradigm like CCP also aims at providing a sound framework for formalising and reasoning about randomised algorithms and programs. Additionally, we give some examples of PCCP programs and we show how well-known randomised algorithms can be implemented very naturally in the new language.
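
The probabilistic choice construct admits a direct executable reading (a sketch of the idea, not PCCP's concrete syntax): a choice point carries explicit weights, the weights of the branches are normalised before sampling (our reading of the construct described above), and a run is then a random walk on the transition graph.

    import random

    def choose(branches):
        # branches: list of (weight, continuation) pairs, standing in for
        # a probabilistic choice construct; weights are normalised before
        # sampling, so only their ratios matter.
        total = sum(w for w, _ in branches)
        r, acc = random.uniform(0, total), 0.0
        for w, cont in branches:
            acc += w
            if r <= acc:
                return cont()
        return branches[-1][1]()

    # Example: take the first branch with probability 1/3, the second with 2/3.
    result = choose([(1, lambda: "c1"), (2, lambda: "c2")])
    print(result)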

A Banach Space Based Semantics for Probabilistic Concurrent Constraint Programming. (With H. Wiklicky.) Postscript available.
ABSTRACT
The aim of this work is to provide a probabilistic foundation for nondeterministic computations in Concurrent Constraint Programming (CCP). Unlike common approaches in (imperative) high-level programming languages, which rely on some kind of random() function, we introduce randomness in the very definition of the language by means of a probabilistic choice construct. This allows a program to make stochastic moves during its execution. We call the resulting language Probabilistic Concurrent Constraint Programming (PCCP). We present an operational and a denotational semantics for PCCP, both capturing a notion of observables which combines the results of computations with the probability of those results being computed. For the denotational semantics, we consider linear (Banach) structures, and the mathematics behind them, as appropriate candidates for the definition of the domain of denotations. This approach differs from the ones typically adopted in the semantics of constraint programming, which are based on order-theoretic or metric structures. We compare this semantics to other approaches and illustrate it with some examples.

On Probabilistic CCP. (With H. Wiklicky.) Postscript available.
ABSTRACT
This paper investigates a probabilistic version of the concurrent constraint programming paradigm (CCP). The aim is to introduce the possibility of formulating so-called "randomised algorithms" within the CCP framework. Our approach incorporates randomness directly within the (operational) semantics instead of referring to an "external" function or procedure call. We define the operational semantics of probabilistic concurrent constraint programming (PCCP) by means of a probabilistic transition system, such that the execution of a PCCP program may be seen as a stochastic process, i.e. as a random walk on the transition graph. The transition probabilities are given explicitly.

On Negation As Instantiation. (With W. Drabent.) Postscript available.
ABSTRACT
Given a logic program P and a goal G, we introduce a notion which states when an SLD-tree for P and G instantiates a set of variables V with respect to another one, W. We call this notion weak instantiation, as it is a generalization of the instantiation property introduced in a previous work ("Negation As Instantiation"). A negation rule based on instantiation, the so-called Negation As Instantiation rule (NAI), allows for inferring existentially closed negative queries, that is, formulas of the form $\exists\neg Q$, from logic programs.
We show that, by using the new notion, we can infer a larger class of negative queries, namely the class of queries of the form $\forall_W\exists_V\neg Q$ and of the form $\forall_W\exists_V\forall_Z\neg Q$, where $Z$ is the set of the remaining variables of Q.

On Quantified Negative Queries. (With W. Drabent.) Postscript available.
ABSTRACT
Given a logic program P and a goal G, we introduce a notion which states when an SLD-tree for P and G instantiates a set of variables V with respect to another one, W. We call this notion weak instantiation, as it is a generalization of the instantiation property introduced in a previous work ("Negation As Instantiation"). A negation rule based on instantiation, the so-called Negation As Instantiation rule (NAI), allows for inferring existentially closed negative queries, that is, formulas of the form $\exists\neg Q$, from logic programs.
We show that, by using the new notion, we can infer a larger class of negative queries, namely the class of queries of the form $\forall_W\exists_V\neg Q$ and of the form $\forall_W\exists_V\forall_Z\neg Q$, where $Z$ is the set of the remaining variables of Q.



National Conferences

Amalgamating NAF and NAI. Postscript available.
ABSTRACT
We define a new logic language with negation by introducing existential and universal quantifiers in the queries.
Negation is handled by using two rules: the NAF rule for universally quantified negative queries and the NAI rule for existentially quantified negative queries. We formalize this amalgamation in terms of NAIF-resolution (SLD-resolution with Negation As Instantiation and Failure), and we prove its correctness w.r.t. the completion semantics.
We also show how extended programs and goals, i.e. programs and goals whose bodies are arbitrary first-order formulas, can be transformed into programs and goals in the new language and then evaluated using our system.



PhD Thesis

Negation and Infinite Computations in Logic Programming. Postscript available.
ABSTRACT
This thesis investigates some aspects of Logic Programming, namely the problem of inferring negative information from a logic program, the denotational characterization of the meaning of a program, and the study of its infinite computations. Concerning the first problem, attention is focused on the negative information expressed by the existential quantification of the negation of an atom (or a conjunction of atoms) which is not a logical consequence of the theory associated with a given logic program. Operationally, these formulas can be inferred by means of the Negation As Instantiation rule; from a logical point of view, they are shown to be exactly those formulas which are logical consequences of the theory obtained by considering the Clark completion of the given program over a universal vocabulary. The paradigm of Logic Programming is then extended by allowing the use of existential negative expressions in the definition of predicates, and, correspondingly, the classical mechanism of SLD-resolution is also extended so as to deal with the new formulas.
The other two aspects are tackled within the more general framework of constraint logic programming. This language is reformulated in algebraic terms as the free language generated by a BNF grammar. This allows us to define a structured operational semantics, based on a transition system, by means of which a notion of observables is introduced which also includes the results of infinite computations. Then, a compositional semantics based on logical operators is defined, and its full correspondence with the observables is shown. The denotational meaning of a program is obtained by considering the domain of the upward-closed Scott-compact sets of constraints (the Smyth powerdomain of the domain of constraints). It turns out that the well-known Negation As Failure of Logic Programming is just the Heyting negation on this domain.