Semantics of computation
2013
Abstract
In the practice of information extraction, the input data are usually arranged into pattern matrices, and analyzed by the methods of linear algebra and statistics, such as principal component analysis. In some applications, the tacit assumptions of these methods lead to wrong results. The usual reason is that the matrix composition of linear algebra presents information as flowing in waves, whereas it sometimes flows in particles, which seek the shortest paths. This wave-particle duality in computation and information processing was originally observed by Abramsky. In this paper we pursue a particle view of information, formalized in distance spaces, which generalize metric spaces, but are slightly less general than Lawvere's generalized metric spaces. In this framework, the task of extracting the 'principal components' from a given matrix of data boils down to a bicompletion, in the sense of enriched category theory. We describe the bicompletion construction for distance matrices. The practical goal that motivates this research is to develop a method to estimate the hardness of attack constructions in security.
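For orientation, recall the two endpoints of the spectrum mentioned above (standard definitions, not the paper's own axioms): a generalized metric space in Lawvere's sense is a set X with a distance function d : X x X -> [0, infinity] required to satisfy only

\[ d(x,x) = 0, \qquad d(x,z) \le d(x,y) + d(y,z), \]

while a classical metric space moreover requires symmetry d(x,y) = d(y,x), finite distances, and the separation property that d(x,y) = 0 implies x = y. The distance spaces used here lie between these two notions; their precise axioms are spelled out in the paper.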
Abstract
We present a new model of computation, described in terms of monoidal
categories. It conforms to the Church-Turing Thesis and captures the
same computable functions as the standard models. It provides a
succinct categorical interface to most of them, free of their diverse
implementation details, using the ideas and structures that in the
meantime emerged from research in semantics of computation and
programming. The salient feature of the language of monoidal
categories is that it is supported by a sound and complete graphical
formalism, string diagrams, which provide a concrete and intuitive
interface for abstract reasoning about computation. The original
motivation and the ultimate goal of this effort is to provide a
convenient high level programming language for a theory of
computational resources, such as one-way functions and trapdoor
functions, by adopting the methods for hiding the low level
implementation details that emerged from practice. Only a modest first
step towards this ambitious goal is presented.
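As a very small illustration of the kind of categorical interface referred to above (a sketch only, not the paper's actual model), the signature of a strict monoidal category can be written as a Haskell class, with sequential and parallel composition; string diagrams draw the first as plugging boxes together and the second as placing them side by side.

-- Illustration only: the bare interface of a (strict) monoidal category,
-- with identities, sequential composition (>>>) and parallel composition (***).
class Monoidal k where
  idC   :: k a a
  (>>>) :: k a b -> k b c -> k a c
  (***) :: k a b -> k c d -> k (a, c) (b, d)

-- Ordinary functions give the standard example.
instance Monoidal (->) where
  idC     = id
  f >>> g = g . f
  f *** g = \(a, c) -> (f a, g c)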
2009
Abstract
Game theoretic equilibria are mathematical expressions of rationality. Rational agents are used to model not only humans and their software representatives, but also organisms, populations, species and genes, interacting with each other and with the environment. Rational behaviors are achieved not only through conscious reasoning, but also through spontaneous stabilization at equilibrium points.
Formal theories of rationality are usually guided by informal intuitions, which are acquired by observing some concrete economic, biological, or network processes. Treating such processes as instances of computation, we reconstruct and refine some basic notions of equilibrium and rationality from some basic structures of computation.
It is, of course, well known that equilibria arise as fixed points; the point is that the semantics of fixed point computation seems to provide novel methods, algebraic and coalgebraic, for reasoning about them.
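A toy rendering of the fixed-point view (mine, not from the paper): iterate a best-response map until it stops changing; whatever profile the iteration stabilizes at is, by construction, a mutual best response.

-- Toy illustration: an equilibrium as a fixed point of a best-response map,
-- found by naive iteration (assumes the iteration terminates).
fixpoint :: Eq a => (a -> a) -> a -> a
fixpoint f x = let x' = f x in if x' == x then x else fixpoint f x'

-- A hypothetical two-player rule: each player best-responds to the other's
-- last choice, with choices capped at 10.
bestResponse :: (Int, Int) -> (Int, Int)
bestResponse (a, b) = (respond b, respond a)
  where respond c = min 10 ((c + 10) `div` 3)

-- The result is a profile from which neither player gains by deviating,
-- given this (hypothetical) rule.
equilibrium :: (Int, Int)
equilibrium = fixpoint bestResponse (0, 0)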
2007
Abstract
Labelled Markov processes (LMPs) are labelled transition systems in
which each transition has an associated probability. In this paper we
present a universal LMP as the spectrum of a commutative C*-algebra
consisting of formal linear combinations of labelled trees. This
yields a simple trace-tree semantics for LMPs that is fully
abstract with respect to probabilistic bisimilarity. We also consider
LMPs with distinguished entry and exit points as stateful stochastic
relations. This allows us to define a category LMP, with
measurable spaces as objects and LMPs as morphisms. Our main result
in this context is to provide a predicate-transformer duality for
LMP that generalises Kozen's duality for the category SRel
of stochastic relations.
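To fix intuitions, here is a finite-state caricature of an LMP (an illustration only; the paper's LMPs live over general measurable spaces): a state and an action label determine a sub-probability distribution over next states.

-- Illustration only: a finite labelled Markov process.  Each labelled
-- transition is a finite sub-distribution over next states (weights sum to
-- at most 1; the missing mass is the probability of refusing the action).
type Dist s = [(s, Double)]

data LMP s l = LMP
  { states :: [s]
  , step   :: s -> l -> Dist s
  }

-- A two-state example over a single label 'a'.
example :: LMP Int Char
example = LMP
  { states = [0, 1]
  , step   = \s _ -> case s of
      0 -> [(0, 0.5), (1, 0.5)]
      _ -> [(1, 1.0)]
  }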
2006
Abstract
We propose a methodology based on testing as a framework to capture the interactions of a machine represented in a denotational
model and the data it manipulates. Using a connection that models
machines on the one hand, and the data they manipulate on the other,
testing is used to capture the interactions of each with the objects on the
other side: just as the data that are input into a machine can be viewed
as tests that the machine can be subjected to, the machine can be viewed
as a test that can be used to distinguish data. This approach is based on
generalizing from duality theories that are now common in semantics to
logical connections, which are simply contravariant adjunctions. In the
process, it accomplishes much more than simply moving from one side
of a duality to the other; it faithfully represents the interactions that
embody what is happening as the computation proceeds.
Our basic philosophy is that tests can be used as a basis for modeling
interactions, as well as processes and the data on which they operate. In
more abstract terms, tests can be viewed as formulas of process logics,
and testing semantics connects processes and process logics, and assigns
computational meanings to both.
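The symmetry described above can be sketched very concretely (an illustration of the idea only, not the categorical development of the paper): one and the same pairing of machines with data yields two indistinguishability relations, one on data and one on machines.

-- Illustration only: machines test data and data test machines, via a single
-- pairing that runs a machine on a datum and observes the outcome.
type Pairing machine datum outcome = machine -> datum -> outcome

-- Two data items pass the same tests if every available machine produces
-- the same outcome on both.
sameData :: Eq o => Pairing m d o -> [m] -> d -> d -> Bool
sameData run ms d1 d2 = all (\m -> run m d1 == run m d2) ms

-- Dually, two machines pass the same tests if every available datum
-- produces the same outcome in both.
sameMachines :: Eq o => Pairing m d o -> [d] -> m -> m -> Bool
sameMachines run ds m1 m2 = all (\d -> run m1 d == run m2 d) ds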
2004
Abstract
Labelled Markov processes (LMPs) are probabilistic labelled
transition systems. In this paper we present a `universal' LMP
as the Stone-Gelfand-Naimark dual of a C*-algebra consisting of
formal linear combinations of labelled trees. We characterize the
state space of the universal LMP as the set of homomorphisms from
an ordered commutative monoid of labelled trees into the
multiplicative unit interval. This yields a simple semantics for
LMPs which is fully abstract with respect to probabilistic
bisimilarity. We also consider LMPs with entry points and exit
points in the framework of Elgot's iterative theories. We define an
iterative theory of LMPs by specifying its categorical dual: a
category of commutative rings consisting of C*-algebras of trees
and `shapely maps' between them. We find that the basic operations
for composing LMPs have simple definitions in the dual category.
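One way to picture the composition of LMPs with entry and exit points (a drastic simplification, not the dual-category definition of the paper): forget the labels and internal states, keep only the sub-probability of reaching each exit from each entry, and compose by matrix multiplication, as for stochastic relations in the finite case.

import Data.List (transpose)

-- Drastic simplification: a process with m entries and n exits reduced to
-- its m-by-n matrix of entry-to-exit sub-probabilities (row sums <= 1).
type SubStoch = [[Double]]

-- Sequential composition routes the exits of the first process into the
-- entries of the second, i.e. multiplies the two matrices.
compose :: SubStoch -> SubStoch -> SubStoch
compose f g = [ [ sum (zipWith (*) row col) | col <- transpose g ] | row <- f ]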
2001
Abstract
When people perform computations, they routinely monitor their
results, and try to adapt and improve their algorithms when a need
arises. The idea of self-adaptive software is to implement this
common facility of the human mind within the framework of the standard
logical methods of software engineering. The ubiquitous practice of
testing, debugging and improving programs at design time should be
automated, and established as a continuing run time routine.
Technically, the task thus requires combining
functionalities of automated software development tools and of runtime
environments. Such combinations lead not just to challenging
engineering problems, but also to novel theoretical questions. Formal
methods are needed, and the standard techniques do not suffice.
As a first contribution in this direction, we present a
basic mathematical framework suitable for describing self-adaptive
software at a high level of semantical abstraction. A static view
leads to a structure akin to the Chu construction. A dynamic view is
given by a coalgebraic presentation of adaptive transducers.
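For the dynamic view, the underlying coalgebraic notion can be sketched as follows (an illustration only; the adaptive layer described in the paper additionally lets the transducer monitor its results and revise its own step function).

-- Illustration only: a Mealy-style transducer as a coalgebra packaged with
-- its state; the next transducer is computed from what was just observed.
newtype Transducer i o = Transducer { observe :: i -> (o, Transducer i o) }

-- Run a transducer over a finite input stream.
run :: Transducer i o -> [i] -> [o]
run _ []       = []
run t (i : is) = let (o, t') = observe t i in o : run t' is

-- A toy example: echo each input together with how many inputs have been
-- seen so far; the evolving count stands in for the evolving behaviour.
counter :: Int -> Transducer a (Int, a)
counter n = Transducer (\i -> ((n, i), counter (n + 1)))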
Abstract
Simulations between processes can be
understood in terms of coalgebra homomorphisms, with homomorphisms to
the final coalgebra exactly identifying bisimilar processes. The
elements of the final coalgebra are thus natural representatives of
bisimilarity classes, and a denotational semantics of processes can be
developed in a final-coalgebra-enriched category where arrows are
processes, canonically represented.
In the present paper, we describe a general framework for building
final-coalgebra-enriched categories. Every such category is
constructed from a multivariant functor representing a notion of
process, much like Moggi's categories of computations arising from
monads as notions of computation. The "notion of process" functors are
intended to capture different flavors of processes as dynamically
extended computations. These functors may involve a computational
(co)monad, so that a process category in many cases contains an
associated computational category as a retract. We further discuss
categories of resumptions and of hyperfunctions, which are the main
examples of process categories. Very informally, resumptions can be
understood as computations extended in time, whereas hyperfunctions
are extended in space.
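The two main examples mentioned above have well-known direct renderings as Haskell datatypes, shown here only to indicate the shapes involved.

-- A resumption over a notion of computation m: either a finished value, or
-- one m-step that resumes with another resumption -- a computation extended
-- in time.
data Resumption m a = Done a | Step (m (Resumption m a))

-- A hyperfunction from a to b consumes a hyperfunction of the opposite
-- orientation; the two sides run interleaved side by side, which is one way
-- to read "extended in space".
newtype Hyper a b = Hyper { invoke :: Hyper b a -> b }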
2000
Process fusion
Abstract
The technique of build fusion, also
known as deforestation, removes intermediate results in a
composition involving the "build" of an initial (inductive,
finite) data structure, followed by its consumption. Here we show that
it is analogously possible to do process fusion, removing
intermediate final (coinductive, potentially infinite) data
passing between a producer and a consumer.
The key observation leading to our results is the fact that the
Curry-Howard isomorphism, relating types to propositions, programs to
proofs, and sequential composition to cut, extends to the
correspondence of fusion to cut elimination. This
simple idea gives us logical interpretations of the basic methods of
generic and transformational programming. In the present paper, we
provide a logical analysis of the general form of build
fusion over the inductive data types, regular or nested. The
analysis is based on a novel logical interpretation of parametricity
in terms of the paranatural transformations, introduced in
the paper. We extend it to cover process fusion on
coinductive data types.
The results obtained are truly generic, in the sense of applying to
all coinductive (final) data types, including nested ones, and allow a
far wider range of optimizations than previously possible. By the
standard embedding of initial into final data types, the method also applies
to arbitrary initial-final mixtures (e.g., infinitely unfolding trees
of finite lists).
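For readers who know it from functional programming practice, the inductive case discussed above includes the familiar foldr/build rule; a minimal Haskell rendering (an illustration only, the paper treats the general, nested and coinductive cases) is:

{-# LANGUAGE RankNTypes #-}

-- The foldr/build fusion rule:  foldr k z (build g) = g k z.
-- The producer abstracts over the list constructors, so the consumer's
-- operations can be substituted for them and no intermediate list is built.
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- Sum of squares written as consumer-of-producer; after fusion it runs
-- without ever constructing the list of squares.
sumSquares :: Int -> Int
sumSquares n = foldr (+) 0 (build (\cons nil ->
                 foldr (\x r -> cons (x * x) r) nil [1 .. n]))

Process fusion is the dual situation: the producer unfolds a coinductive stream, and fusion removes that stream between producer and consumer.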
1999
Abstract
Parametricity is one of the most effective ways to achieve
compositionality and reuse in software development. Parametric
specifications have been thoroughly analyzed in the algebraic setting
and are by now a standard part of most software development
toolkits. However, an effort towards classifying, specifying and
refining algorithmic theories, rather than mere datatypes, quickly
leads beyond the realm of algebra, and often to full first order
theories. We extend standard semantics of parametric specifications
to this more general setting.
The familiar semantic characterization of parametricity in the
algebraic case is expressed in terms of the free functor, i.e. using
the initial models. In the general case, the initial models may not
exist, and the free functor is not available. Various syntactic,
semantic and abstract definitions of parametricity have been offered,
but their exact relationships are often unclear. Using the methods of
categorical model theory, we establish the equivalence of two well
known, yet so far unrelated definitions of parametricity, one
syntactic, one semantic. Besides providing support for both underlying
views, and a way of aligning the systems based on each of them, the
general analysis and formalism offered here open up several avenues for
future research and applications.
1997
Abstract
We analyse two complementary methods for obtaining categorical models
of process calculi. They allow adding new features to the captured
notions of process and of type respectively. By alternating these two
methods, all the familiar examples, as well as some new interaction
categories, can be derived from basic monoidal categories.
Using the proposed constructions, interaction categories can be built
and analysed without fixing a set of axioms for them. They are thus
approached via specifications, just like algebras are
approached via equations and relations, independently of the
intrinsic characterisation of varieties.
Abstract
Milner's action calculus implements abstraction in monoidal
categories, so that the familiar lambda-calculi, together with the
pi-calculus and Petri nets can be subsumed. Variables are generalised
to names: only a restricted form of substitution is allowed.
In the present paper, the well-known categorical semantics of the
lambda-calculus is generalised to the action calculus. A suitable
functional completeness theorem for symmetric monoidal categories is
proved: we determine the conditions under which the abstraction is
definable. Algebraically, the distinction between the variables and
the names boils down to the distinction between the transcendental and
the algebraic elements. The former lead to polynomial extensions, such
as the ring Z[x], the latter to algebraic extensions, such as Z[i].
Building upon the work of P. Gardner, we introduce action categories,
and show that they are related to the static action calculus exactly as
cartesian closed categories are related to the lambda-calculus.
Natural examples of this structure arise from allegories and cartesian
bicategories. On the other hand, the free algebras for any commutative
Moggi monad form an action category. The general correspondence of
action calculi and Moggi monads will be worked out in a sequel to this
work.
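To spell out the ring-theoretic analogy used above (standard algebra, recalled only for orientation): the polynomial extension Z[x] is free, so for every commutative ring R and every element a of R there is a unique evaluation homomorphism

\[ \mathrm{ev}_a : \mathbb{Z}[x] \longrightarrow R, \qquad \mathrm{ev}_a(x) = a, \]

so the indeterminate x can be instantiated by anything, just as a variable admits arbitrary substitution; whereas Z[i], which is Z[x]/(x^2+1), is generated by an element constrained by the equation i^2 + 1 = 0, so i can only be mapped to elements satisfying that equation, just as a name admits only a restricted form of substitution.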
1996
Abstract
In the preceding paper, we have argued for a representation of
processes taking into account computationally relevant morphisms. It
has been shown that the category of synchronous processes, i.e. modulo
strong bisimilarity, is isomorphic with the category of labelled
irredundant trees.
In the present paper, we extend this representation to asynchronous
processes, namely modulo the weak and the branching bisimulations and
congruences. They are shown to correspond to interesting subcategories
of the category of labelled irredundant trees. The form of the
representatives in the case of branching bisimilarity suggests
possible connections with game theory. An abstract construction of a
category of processes in a general setting is presented in the
appendix.
1995
Abstract
This is a report on a mathematician's effort to understand some
concurrency theory. The starting point is a logical interpretation of
Nielsen and Winskel's account of the basic models of concurrency. Upon
the obtained logical structures, we build a calculus of relations
which yields, when cut down by bisimulations, Abramsky's interaction
category of synchronous processes. It seems that all interaction
categories arise in this way. The obtained presentation uncovers some
of their logical contents and perhaps sheds some more light on the
original idea of processes as relations extended in time.
The sequel of this paper will address the issues of asynchronicity,
preemption, noninterleaving and linear logic in the same setting.
Abstract
Deep categorical analyses of various aspects of concurrency have been
developed, but a uniform categorical treatment of the very first
concepts seems to be hindered by the fact that the existing
representations of processes as bisimilarity classes do not provide an
account of computational morphisms.
In the present paper, we describe a category of processes modulo
strong bisimulations, with the bisimilarity preserving simulations as
morphisms, and show that it is equivalent to the category of labelled
irredundant trees and the label preserving tree morphisms. The
developed representation turns out to be applicable to weaker notions
of process as well. They will be studied in the sequel.