# lp:axiom

Created by VCS imports and last modified
- Get this branch: `bzr branch lp:axiom`
- Owner: VCS imports
- Project: Axiom
- Status: Development

## Import details

Import Status: Failed

This branch is an import of the HEAD branch of the Git repository at git://github.com/daly/axiom.git.

The import has been suspended because it failed 5 or more times in succession.

Last successful import was .

Import started on alnitak and finished, taking 1 hour.
Import started on izar and finished, taking 1 hour.

## Recent revisions

2878. By daly on 2018-06-30

books/bookvolbib fix typo in Grie11

Goal: Axiom Literate Programming

2877. By daly on 2018-06-30

books/bookvol10.1 add chapter on Primality Testing

Goal: Axiom Literate Programming

2876. By daly on 2018-06-30

Goal: Proving Axiom Sane

2875. By daly on 2018-06-30

books/bookvolbib add latex style files

Goal: Axiom Literate Programming

2874. By daly on 2018-06-30

Goal: Proving Axiom Sane

\index{Davenport, James H.}
\begin{chunk}{axiom.bib}
@article{Dave12b,
author = "Davenport, James H.",
title = {{Small Algorithms for Small Systems}},
journal = "ACM Comm. in Computer Algebra",
volume = "46",
number = "1",
year = "2012",
paper = "Dave12b.pdf"
}

\end{chunk}

\index{Stoutemyer, David R.}
\begin{chunk}{axiom.bib}
@article{Stou12,
author = "Stoutemyer, David R.",
title = {{Series Crimes}},
journal = "ACM Comm. in Computer Algebra",
volume = "46",
number = "2",
year = "2012",
abstract =
"Puiseux series are power series in which the exponents can be
fractional and/or negative rational numbers. Several computer algebra
systems have one or more built-in or loadable functions for computing
truncated Puiseux series. Some are generalized to allow coefficients
containing functions of the series variable that are dominated by any
power of that variable, such as logarithms and nested logarithms of
the series variable. Some computer algebra systems also have built-in
or loadable functions that compute infinite Puiseux series.
Unfortunately, there are some little-known pitfalls in
computing Puiseux series. The most serious of these is expansions
within branch cuts or at branch points that are incorrect for some
directions in the complex plane. For example with each series
implementation accessible to you:

Compare the value of $(z^2 + z^3)^{3/2}$ with that of its
truncated series expansion about $z = 0$, approximated at
$z = -0.01$. Does the series converge to a value that is
the negative of the correct value?

Compare the value of $\ln(z^2 + z^3)$ with its truncated series
expansion about $z = 0$, approximated at $z = -0.01 + 0.1i$. Does
the series converge to a value that is incorrect by $2\pi i$?

Compare $\mathrm{arctanh}(-2 + \ln(z)z)$ with its truncated series
expansion about $z = 0$, approximated at $z = -0.01$. Does the series
converge to a value that is incorrect by about $\pi i$?

At the time of this writing, most implementations that accommodate
such series exhibit such errors. This article describes how to avoid
these errors both for manual derivation of series and when
implementing series packages.",
paper = "Stou12.pdf"
}

\end{chunk}
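The branch-cut pitfall in the first exercise above can be checked numerically. The sketch below is ours, not the paper's: it compares the principal-branch value of $(z^2+z^3)^{3/2}$ with the expansion obtained by the (invalid near the negative real axis) factorization $z^3(1+z)^{3/2}$.

```python
def f(z):
    # direct principal-branch evaluation of (z^2 + z^3)^(3/2)
    return (z * z + z * z * z) ** 1.5

def naive_series(z, terms=6):
    # the "obvious" expansion: factor as z**3 * (1 + z)**(3/2) and
    # binomial-expand; the step (z^2)^(3/2) = z^3 is invalid on the
    # negative real axis, where (z^2)^(3/2) = -z^3
    coeff, acc = 1.0, 0.0
    for k in range(terms):
        acc += coeff * z ** k
        coeff *= (1.5 - k) / (k + 1)    # next generalized binomial coefficient
    return z ** 3 * acc

z = complex(-0.01, 0.0)
print(f(z).real > 0, naive_series(z).real < 0)   # True True: opposite signs
```

At $z = -0.01$ the two magnitudes agree to many digits while the signs differ, exactly the failure mode the abstract describes.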

\index{Blair, Fred W.}
\index{Griesmer, James H.}
\index{Harry, Joseph}
\index{Pivovonsky, Mark}
\begin{chunk}{axiom.bib}
@misc{Blai71,
author = "Blair, Fred W. and Griesmer, James H. and Harry, Joseph and
Pivovonsky, Mark",
title = {{Design and Development Document for Lisp on Several S/360
Operating Systems}},
year = "1971",
publisher = "IBM Research",
paper = "Blai71.pdf"
}

\end{chunk}

\index{Alford, W.R.}
\index{Granville, A.}
\index{Pomerance, C.}
\begin{chunk}{axiom.bib}
@misc{Alfo92,
author = "Alford, W.R. and Granville, A. and Pomerance, C.",
title = {{There are Infinitely Many Carmichael Numbers}},
year = "1992",
comment = "Preprint"
}

\end{chunk}

\index{Arnault, F.}
\begin{chunk}{axiom.bib}
@misc{Arna91,
author = "Arnault, F.",
title = {{Le Test de Primalit\'e de Rabin-Miller: Un Nombre Compos\'e
qui le ``passe''}},
comment = "Report 61",
university = "Universite de Poitiers Departement de Mathematiques",
year = "1991"
}

\end{chunk}

\index{Damgard, I.}
\index{Landrock, P.}
\begin{chunk}{axiom.bib}
@inproceedings{Damg91,
author = "Damgard, I. and Landrock, P.",
title = {{Improved Bounds for the Rabin Primality Test}},
booktitle = "Proc. 3rd IMA Conf. on Coding and Cryptography",
year = "1991"
}

\end{chunk}

\index{Davenport, James H.}
\index{Smith, G.C.}
\begin{chunk}{axiom.bib}
@techreport{Dave87,
author = "Davenport, James H. and Smith, G.C.",
title = {{Rabin's Primality Testing Algorithm -- a Group Theory View}},
institution = "University of Bath",
type = "technical report",
number = "87-04",
year = "1987"
}

\end{chunk}

\index{Jaeschke, G.}
\begin{chunk}{axiom.bib}
@misc{Jaes91,
author = "Jaeschke, G.",
title = {{Private Communication}},
comment = "to James Davenport",
year = "1991"
}

\end{chunk}

\index{Leech, J.}
\begin{chunk}{axiom.bib}
@misc{Leec92,
author = "Leech, J.",
title = {{Private Communication}},
comment = "to James Davenport",
year = "1992"
}

\end{chunk}

\index{Koblitz, N.}
\begin{chunk}{axiom.bib}
@book{Kobl87,
author = "Koblitz, N.",
title = {{A Course in Number Theory and Cryptography}},
publisher = "Springer-Verlag",
year = "1987"
}

\end{chunk}

\index{Lenstra Jr., H.W.}
\begin{chunk}{axiom.bib}
@article{Lens81,
author = "Lenstra Jr., H.W.",
title = {{Primality Testing Algorithms}},
journal = "Lecture Notes in Mathematics",
volume = "901",
publisher = "Springer-Verlag",
pages = "243-257",
year = "1981"
}

\end{chunk}

\index{Pomerance, C.}
\index{Selfridge, J.L.}
\index{Wagstaff Jr., S.S.}
\begin{chunk}{axiom.bib}
@article{Pome80,
author = "Pomerance, C. and Selfridge, J.L. and Wagstaff Jr., S.S.",
title = {{The Pseudoprimes up to $25\cdot 10^9$}},
journal = "Math. Comp.",
volume = "35",
pages = "1003-1026",
year = "1980"
}

\end{chunk}

\index{Rabin, M.O.}
\begin{chunk}{axiom.bib}
@article{Rabi80,
author = "Rabin, M.O.",
title = {{Probabilistic Algorithm for Testing Primality}},
journal = "J. Number Theory",
volume = "12",
pages = "128-138",
year = "1980"
}

\end{chunk}
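The strong probable-prime test of Rabin's paper (and the pseudoprime and Carmichael-number literature cited above) is short enough to sketch. This is a standard textbook rendering, not code from any of the cited works.

```python
def miller_rabin(n, bases=(2, 3, 5, 7)):
    """Strong probable-prime test (Rabin 1980). A composite n passes a
    random base with probability < 1/4, so a few bases give high confidence;
    the fixed bases used here are known to be deterministic for n < 3.2e9."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:           # write n - 1 = 2**s * d with d odd
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False        # a witnesses that n is composite
    return True
```

Carmichael numbers such as 561 fool the plain Fermat test for every coprime base, but the strong test rejects them; 2047 is a strong pseudoprime to base 2 alone, which is why several bases are used.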

\index{Steele, Guy Lewis}
\index{Sussman, Gerald Jay}
\begin{chunk}{axiom.bib}
@techreport{Stee78a,
author = "Steele, Guy Lewis and Sussman, Gerald Jay",
title = {{The Art of the Interpreter}},
institution = "MIT",
type = "technical report",
number = "AI Memo No. 453",
year = "1978",
abstract =
"We examine the effects of various language design decisions on the
programming styles available to a user of the language, with
particular emphasis on the ability to incrementally construct
modular systems. At each step we exhibit an interactive
meta-circular interpreter for the language under
consideration. Each new interpreter is the result of an
incremental change to the previous interpreter.

We explore the consequences of various variable binding
disciplines and the introduction of side effects. We find that
dynamic scoping is unsuitable for constructing procedural
abstractions, but has another role as an agent of modularity,
being a structured form of side effect. More general side effects
are also found to be necessary to promote modular style. We find
that the notion of side effect and the notion of equality (object
identity) are mutually constraining; to define one is to define
the other.

The interpreters we exhibit are all written in a simple dialect of
LISP, and all implement LISP-like languages. A subset of these
interpreters constitute a partial historical reconstruction of the
actual evolution of LISP."
}

\end{chunk}
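The scoping distinction the abstract draws can be made concrete with a toy environment-passing interpreter. This Python sketch is ours (the paper's interpreters are in Lisp); the single changed line is which environment a closure's body is evaluated in.

```python
def lookup(env, name):
    # environments are association lists, innermost binding first
    for key, val in env:
        if key == name:
            return val
    raise NameError(name)

def ev(expr, env, dynamic=False):
    """Evaluate a tiny lambda language under lexical or dynamic scoping."""
    if isinstance(expr, int):
        return expr
    if isinstance(expr, str):                     # variable reference
        return lookup(env, expr)
    op = expr[0]
    if op == 'lambda':                            # ('lambda', param, body)
        return ('closure', expr[1], expr[2], env)
    if op == 'let':                               # ('let', name, value, body)
        _, name, value, body = expr
        return ev(body, ((name, ev(value, env, dynamic)),) + env, dynamic)
    fn = ev(expr[0], env, dynamic)                # application: (fn, arg)
    arg = ev(expr[1], env, dynamic)
    _, param, body, closure_env = fn
    # the design decision: extend the closure's environment (lexical)
    # or the caller's current environment (dynamic)
    return ev(body, ((param, arg),) + (env if dynamic else closure_env), dynamic)

# f captures x; the call site rebinds x
prog = ('let', 'x', 1,
        ('let', 'f', ('lambda', 'y', 'x'),
         ('let', 'x', 2,
          ('f', 0))))
print(ev(prog, (), dynamic=False), ev(prog, (), dynamic=True))   # 1 2
```

Under lexical scope `f` sees the `x` in force at its definition; under dynamic scope it sees the caller's rebinding, which is why the abstract calls dynamic scoping a structured form of side effect.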

\index{Constable, Robert L.}
\begin{chunk}{axiom.bib}
@incollection{Cons03,
author = "Constable, Robert L.",
title = {{Naive Computational Type Theory}},
booktitle = "Proof and System Reliability",
publisher = "unknown",
year = "2003",
paper = "Cons03.pdf"
}

\end{chunk}

\index{Waaldijk, Frank}
\begin{chunk}{axiom.bib}
@misc{Waal03,
author = "Waaldijk, Frank",
title = {{On the Foundations of Constructive Mathematics}},
year = "2003",
abstract =
"We discuss the foundations of constructive mathematics, including
recursive mathematics and intuitionism, in relation to classical
mathematics. There are connections with the foundations of physics,
due to the way in which the different branches of mathematics reflect
reality. Many different axioms and their interrelationship are
discussed. We show that there is a fundamental problem in bish
(Bishop’s school of constructive mathematics) with regard to its
current definition of ‘continuous function’. This problem is closely
related to the definition in bish of ‘locally compact’.

Possible approaches to this problem are discussed. Topology seems to
be a key to understanding many issues. We offer several new
simplifying axioms, which can form bridges between the various
branches of constructive mathematics and classical mathematics
(‘reuniting the antipodes’). We give a simplification of basic
intuitionistic theory, especially with regard to so-called ‘bar
induction’. We then plead for a limited number of axiomatic systems,
which differentiate between the various branches of mathematics.
Finally, in the appendix we offer bish an elegant topological
definition of ‘locally compact’, which unlike the current definition
is equivalent to the usual classical and/or intuitionistic definition
in classical and intuitionistic mathematics respectively.",
paper = "Waal03.pdf"
}

\end{chunk}

\index{Farmer, William M.}
\begin{chunk}{axiom.bib}
@article{Farm95b,
author = "Farmer, William M.",
title = {{Reasoning about Partial Functions with the Aid of a
Computer}},
journal = "Erkenntnis",
volume = "43",
number = "3",
pages = "279-294",
year = "1995",
abstract =
"Partial functions are ubiquitous in both mathematics and computer
science. Therefore, it is imperative that the underlying logical
formalism for a general-purpose mechanized mathematics system
provide strong support for reasoning about partial
functions. Unfortunately, the common logical formalisms --
first-order logic, type theory, and set theory -- are usually only
adequate for reasoning about partial functions {\sl in
theory}. However, the approach to partial functions traditionally
employed by mathematicians is quite adequate {\sl in
practice}. This paper shows how the traditional approach to
partial functions can be formalized in a range of formalisms that
includes first-order logic, simple type theory, and
von Neumann-Bernays-Godel set theory. It argues that these new formalisms
allow one to directly reason about partial functions; are based on
natural, well-understood, familiar principles; and can be
effectively implemented in mechanized mathematics systems.",
paper = "Farm95b.pdf",
keywords = "printed"
}

\end{chunk}

\index{Fong, Brendan}
\index{Spivak, David I.}
\begin{chunk}{axiom.bib}
@misc{Fong18,
author = "Fong, Brendan and Spivak, David I.",
title = {{Seven Sketches in Compositionality: An Invitation to
Applied Category Theory}},
year = "2018",
paper = "Fong18.pdf"
}

\end{chunk}

\index{Popov, Nikolaj}
\begin{chunk}{axiom.bib}
@misc{Popo03,
author = "Popov, Nikolaj",
title = {{Verification Using Weakest Precondition Strategy}},
comment = "Talk at Comp. Aided Verification of Information Systems",
year = "2003",
location = "Timisoara, Romania",
abstract =
"We describe the weakest precondition strategy for verifying
programs. This is a method which takes a specification and an
annotated program and generates so-called verification conditions:
mathematical lemmata which have to be proved in order to obtain a
formal correctness proof for the program with respect to its
specification. There are rules for generating the intermediate pre
and post conditions algorithmically. Based on these rules, we have
developed a package for generating verification conditions.",
paper = "Popo03.pdf",
keywords = "printed"
}

\end{chunk}
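For deterministic straight-line code the weakest precondition is just the postcondition composed with the program's state transformer: wp(S, Q) = Q after S. The sketch below illustrates that semantic reading in plain Python; it is our illustration, not the package described in the talk, which works symbolically.

```python
def wp(stmts, post):
    """Weakest precondition of a list of assignments w.r.t. predicate `post`.
    Computed by folding backwards over the program: wp(S1;S2, Q) = wp(S1, wp(S2, Q))."""
    pre = post
    for stmt in reversed(stmts):
        pre = (lambda s, q=pre, f=stmt: q(f(s)))
    return pre

# program: x := x + 1; x := 2 * x      postcondition: x >= 4
program = [
    lambda s: {**s, 'x': s['x'] + 1},
    lambda s: {**s, 'x': 2 * s['x']},
]
post = lambda s: s['x'] >= 4
pre = wp(program, post)        # semantically: 2*(x+1) >= 4, i.e. x >= 1
print(pre({'x': 1}), pre({'x': 0}))   # True False
```

A symbolic generator produces the formula `2*(x+1) >= 4` as a verification condition instead; the semantic version above simply evaluates it state by state.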

\index{Kovacs, Laura}
\index{Jebelean, Tudor}
\begin{chunk}{axiom.bib}
@article{Kova04,
author = "Kovacs, Laura and Jebelean, Tudor",
title = {{Practical Aspects of Imperative Program Verification in
Theorema}},
journal = "Analele Universitatii din Timisoara",
volume = "XXXXIX",
number = "2",
year = "2004",
abstract =
"Approaching the problem of imperative program verification from a
practical point of view has certain implications concerning: the style
of specifications, the programming language which is used, the help
provided to the user for finding appropriate loop invariants, the
theoretical frame used for formal verification, the language used
for expressing generated verification theorems as well as the database
of necessary mathematical knowledge, and finally the proving power,
style and language. The Theorema system (www.theorema.org) has
certain capabilities which make it appropriate for such a practical
approach. Our approach to imperative program verification in Theorema
is based on Hoare logic and the Weakest Precondition Strategy. In
this paper we present some practical aspects of our work, as well as a
novel method for verification of programs that contain loops, namely
a method based on recurrence equation solvers for generating the
necessary loop invariants and termination terms automatically. We
have tested our system with numerous examples, some of which
are presented at the end of the paper.",
paper = "Kova04.pdf",
keywords = "printed"
}

\end{chunk}
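The recurrence-solving idea can be seen on the classic summation loop. The closed forms and the eliminated invariant below are standard; the checking harness is our sketch, not the Theorema implementation.

```python
def trace(n_steps):
    """Execute the loop  s := s + i; i := i + 1  from s = i = 0,
    recording every intermediate state."""
    s, i = 0, 0
    states = [(s, i)]
    for _ in range(n_steps):
        s, i = s + i, i + 1
        states.append((s, i))
    return states

# Solving the recurrences s(n) = s(n-1) + i(n-1), i(n) = i(n-1) + 1 with
# s(0) = i(0) = 0 gives the closed forms s(n) = n*(n-1)/2 and i(n) = n;
# eliminating the loop counter n yields the polynomial invariant
#     2*s - i**2 + i == 0
assert all(2 * s - i * i + i == 0 for s, i in trace(25))
```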

\index{Kovacs, Laura}
\index{Jebelean, Tudor}
\begin{chunk}{axiom.bib}
@misc{Kova04a,
author = "Kovacs, Laura and Jebelean, Tudor",
title = {{Automated Generation of Loop Invariants by Recurrence
Solving in Theorema}},
year = "2004",
abstract =
"Most of the properties established during program verification are
either invariants or depend crucially on invariants. The effectiveness
of automated verification of (imperative) programs is therefore
sensitive to the ease with which invariants, even trivial ones, can be
automatically deduced. We present a method for invariant generation
that relies on combinatorial techniques, namely on recurrence solving
and variable elimination. The effectiveness of the method is
demonstrated on examples.",
paper = "Kova04a.pdf",
keywords = "printed"
}

\end{chunk}

\index{Kovacs, Laura}
\begin{chunk}{axiom.bib}
@phdthesis{Kova07,
author = "Kovacs, Laura",
title = {{Automated Invariant Generation by Algebraic Techniques for
Imperative Program Verification in Theorema}},
school = "Johannes-Kepler University Linz",
year = "2007",
abstract =
"This thesis presents algebraic and combinatorial approaches for
reasoning about imperative loops with assignments, sequencing and
conditionals.

A certain family of loops, called P-solvable, is defined for which
the value of each program variable can be expressed as a polynomial of
the initial values of variables, the loop counter, and some new
variables where there are algebraic dependencies among the new
variables. For such loops, a systematic method is developed for
generating polynomial invariants. Further, if the bodies of these
loops consist only of assignments and conditional branches, and test
conditions in the loop and conditionals are ignored, the method is
shown to be complete for some special cases. By completeness we mean
that it generates a set of polynomials from which, under additional
assumptions for loops with conditional branches, any polynomial
invariant can be derived. Many non-trivial algorithms working on
numbers can be naturally implemented using P-solvable loops.

By combining advanced techniques from algorithmic combinatorics,
symbolic summation, computer algebra and computational logic, a
framework is developed for generating polynomial invariants for
imperative programs operating on numbers.

Exploiting the symbolic manipulation capabilities of the computer
algebra system Mathematica, these techniques are implemented in a new
software package called Aligator. By using several combinatorial
packages developed at RISC, Aligator includes algorithms for solving
special classes of recurrence relations (those that are either
Gosper-summable or C-finite) and generating polynomial dependencies
among algebraic exponential sequences. Using Aligator, a complete set
of polynomial invariants is successfully generated for numerous
imperative programs working on numbers.

The automatically obtained invariant assertions are subsequently used
for proving the partial correctness of programs by generating
appropriate verification conditions as first-order logical formulas.
Based on Hoare logic and the weakest precondition strategy, this
verification process is supported in an imperative verification
environment implemented in the Theorema system. Theorema is
convenient for such an integration given that it is built on top of
the computer algebra system Mathematica and includes automated methods
for theorem proving in predicate logic, domain specific reasoning and
proving by induction.",
paper = "Kova07.pdf"
}

\end{chunk}
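The "polynomial dependencies among exponential sequences" that the thesis exploits have a tiny illustration: in a loop where one variable doubles and another quadruples, the dependency 4^n = (2^n)^2 eliminates the loop counter. The harness below is our sketch of that idea.

```python
def trace(n_steps):
    """Loop  x := 2*x; y := 4*y  starting from x = y = 1."""
    x, y = 1, 1
    states = [(x, y)]
    for _ in range(n_steps):
        x, y = 2 * x, 4 * y
        states.append((x, y))
    return states

# x(n) = 2**n and y(n) = 4**n are exponential sequences; the algebraic
# dependency 4**n == (2**n)**2 eliminates n and yields the polynomial
# loop invariant  y == x**2  of the kind Aligator derives automatically
assert all(y == x * x for x, y in trace(20))
```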

\index{Kovacs, Laura}
\begin{chunk}{axiom.bib}
@inproceedings{Kova08,
author = "Kovacs, Laura",
title = {{Reasoning Algebraically About P-Solvable Loops}},
booktitle = "Int. Conf. on Tools and Algorithms for the Construction
and Analysis of Systems",
pages = "249-264",
year = "2008",
abstract =
"We present a method for generating polynomial invariants for a
subfamily of imperative loops operating on numbers, called the
P-solvable loops. The method uses algorithmic combinatorics and
algebraic techniques. The approach is shown to be complete for some
special cases. By completeness we mean that it generates a set of
polynomial invariants from which, under additional assumptions, any
polynomial invariant can be derived. These techniques are implemented
in a new software package Aligator written in Mathematica and
successfully tried on many programs implementing interesting
algorithms working on numbers.",
paper = "Kova08.pdf",
keywords = "printed"
}

\end{chunk}

\index{Oury, Nicolas}
\index{Swierstra, Wouter}
\begin{chunk}{axiom.bib}
@inproceedings{Oury08,
author = "Oury, Nicolas and Swierstra, Wouter",
title = {{The Power of Pi}},
booktitle = "Int. Conf. on Functional Programming",
year = "2008",
abstract =
"This paper exhibits the power of programming with dependent types by
dint of embedding three domain-specific languages: Cryptol, a
language for cryptographic protocols; a small data description
language; and relational algebra. Each example demonstrates
particular design patterns inherent to dependently-typed programming.
Documenting these techniques paves the way for further research in
domain-specific embedded type systems.",
paper = "Oury08.pdf",
keywords = "printed"
}

\end{chunk}

\index{Graillat, Stef}
\index{Menissier-Morain, Valerie}
\begin{chunk}{axiom.bib}
@inproceedings{Grai07,
author = "Graillat, Stef and Menissier-Morain, Valerie",
title = {{Error-free Transformations in Real and Complex
Floating-point Arithmetic}},
booktitle = "Int. Symp. on Nonlinear Theory and Applications",
year = "2007",
abstract =
"Error-free transformation is a concept that makes it possible to
compute accurate results within a floating point arithmetic. Up to
now, it has only been studied for real floating point arithmetic. In
this short note, we recall the known error-free transformations for
real arithmetic and we propose some new error-free transformations for
complex floating point arithmetic. This will make it possible to
design some new accurate algorithms for summation, dot product and
polynomial evaluation with complex entries.",
paper = "Grai07.pdf"
}

\end{chunk}
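The complex-arithmetic transformations proposed in the paper build on the classical real ones; the basic example is Knuth's TwoSum, sketched here in Python for illustration.

```python
from fractions import Fraction

def two_sum(a, b):
    """Knuth's TwoSum: returns (s, e) with s = fl(a + b) and s + e equal
    to a + b exactly, using only six floating-point operations."""
    s = a + b
    bb = s - a                      # the part of b actually absorbed into s
    e = (a - (s - bb)) + (b - bb)   # what rounding threw away
    return s, e

s, e = two_sum(1e16, 1.0)   # 1.0 is lost in the rounded sum ...
print(s, e)                 # 1e+16 1.0  ... but recovered exactly in e

# exactness check in rational arithmetic: s + e == a + b with no rounding
assert Fraction(s) + Fraction(e) == Fraction(1e16) + Fraction(1.0)
```

Summation, dot product, and polynomial evaluation algorithms of the kind the abstract mentions accumulate these error terms to recover accuracy lost to rounding.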

\index{Noble, James}
\index{Black, Andrew P.}
\index{Bruce, Kim B.}
\index{Homer, Michael}
\index{Miller, Mark S.}
\begin{chunk}{axiom.bib}
@inproceedings{Nobl16,
author = "Noble, James and Black, Andrew P. and Bruce, Kim B. and
Homer, Michael and Miller, Mark S.",
title = {{The Left Hand of Equals}},
booktitle = "Int. Symp. of New Ideas, New Paradigms, and Reflections
on Programming and Software",
publisher = "ACM",
pages = "224-237",
year = "2016",
abstract =
"When is one object equal to another object? While object identity is
fundamental to object-oriented systems, object equality, although
tightly intertwined with identity, is harder to pin down. The
distinction between identity and equality is reflected in
object-oriented languages, almost all of which provide two variants of
'equality', while some provide many more. Programmers can usually
override at least one of these forms of equality, and can always
define their own methods to distinguish their own objects.

This essay takes a reflexive journey through fifty years of identity
and equality in object-oriented languages, and ends somewhere we did
not expect: a 'left-handed' equality relying on trust and grace.",
paper = "Nobl16.pdf",
keywords = "printed"
}

\end{chunk}
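The two variants of "equality" the essay discusses map in Python onto the programmer-overridable `==` and the fixed identity test `is`; a minimal value-object sketch (ours, not the paper's examples, which use Grace and other languages):

```python
class Point:
    """A value object: equality compares coordinates, not object identity."""
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __eq__(self, other):
        return isinstance(other, Point) and (self.x, self.y) == (other.x, other.y)
    def __hash__(self):
        # equal objects must hash equally so they behave in sets and dicts
        return hash((self.x, self.y))

p, q = Point(1, 2), Point(1, 2)
print(p == q, p is q)   # True False: equal by value, distinct identities
```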

\index{Avigad, Jeremy}
\begin{chunk}{axiom.bib}
@article{Avig18a,
author = "Avigad, Jeremy",
title = {{The Mechanization of Mathematics}},
journal = "Notices of the AMS",
volume = "65",
number = "6",
year = "2018",
pages = "681-690",
paper = "Avig18a.pdf",
keywords = "printed"
}

\end{chunk}

\index{Leroy, Xavier}
\begin{chunk}{axiom.bib}
@techreport{Lero90,
author = "Leroy, Xavier",
title = {{The ZINC experiment: An Economical Implementation of the
ML Language}},
type = "technical report",
institution = "Institut National de Recherche en Informatique et en
Automatique",
number = "117",
year = "1990",
abstract =
"This report details the design and implementation of the ZINC
system. This is an implementation of the ML language, intended to
serve as a test field for various extensions of the language, and
for new implementation techniques as well. This system is strongly
oriented toward separate compilation and the production of small,
standalone programs; type safety is ensured by a Modula-2-like
module system. ZINC uses simple, portable techniques, such as
bytecode interpretation; a sophisticated execution model helps
counterbalance the interpretation overhead. Other highlights
include an efficient implementation of records with inclusion
(subtyping).",
paper = "Lero90.pdf",
keywords = "printed"
}

\end{chunk}
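Bytecode interpretation of the kind ZINC uses can be sketched with a toy stack machine. This is an illustration only; ZINC's actual abstract machine and instruction set are far richer than this.

```python
def run(code):
    """Execute a tiny stack-machine bytecode: a list of tuples whose first
    element names the operation."""
    stack, pc = [], 0
    while pc < len(code):
        op = code[pc]
        pc += 1
        if op[0] == 'const':            # push a constant
            stack.append(op[1])
        elif op[0] == 'add':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op[0] == 'mul':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op[0] == 'branch_if_zero': # conditional jump to op[1]
            if stack.pop() == 0:
                pc = op[1]
    return stack.pop()

# compute 2 * 3 + 4
prog = [('const', 2), ('const', 3), ('mul',), ('const', 4), ('add',)]
print(run(prog))   # 10
```

The interpretation overhead such a dispatch loop pays is what the report's "sophisticated execution model" is designed to counterbalance.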

\index{Cohn, Avra Jean}
\begin{chunk}{axiom.bib}
@phdthesis{Cohn79,
author = "Cohn, Avra Jean",
title = {{Machine Assisted Proofs of Recursion Implementation}},
school = "University of Edinburgh",
year = "1979",
abstract =
"Three studies in the machine assisted proof of recursion
implementation are described. The verification system used is
Edinburgh LCF (Logic for Computable Functions). Proofs are generated,
in LCF, in a goal-oriented fashion by the application of strategies
reflecting informal proof plans. LCF is introduced in Chapter 1. We
present three case studies in which proof strategies are developed and
(except in the third) tested in LCF. Chapter 2 contains an account of
the machine generated proofs of three program transformations (from
recursive to iterative function schemata). Two of the examples are
taken from Manna and Waldinger. In each case, the recursion is
implemented by the introduction of a new data type, e.g., a stack or
counter. Some progress is made towards the development of a general
strategy for producing the equivalence proofs of recursive and
iterative function schemata by machine. Chapter 3 is concerned with
the machine generated proof of the correctness of a compiling
algorithm. The formulation, borrowed from Russell, includes a simple
imperative language with a while and conditional construct, and a low
level language of labelled statements, including jumps. We have, in
LCF, formalised his denotational descriptions of the two languages and
performed a proof of the preservation of the semantics under
compilation. In Chapter 4, we express and informally prove the
correctness of a compiling algorithm for a language containing
declarations and calls of recursive procedures. We present a low level
language whose semantics model a standard activation stack
implementation. Certain theoretical difficulties (connected with
recursively defined relations) are discussed, and a proposed proof in
LCF is outlined. The emphasis in this work is less on proving original
theorems, or even automatically finding proofs of known theorems, than
on (i) exhibiting and analysing the underlying structure of proofs,
and of machine proof attempts, and (ii) investigating the nature of
the interaction (between a user and a computer system) required to
generate proofs mechanically; that is, the transition from informal
proof plans to behaviours which cause formal proofs to be performed.",
paper = "Cohn79.pdf"
}

\end{chunk}
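The transformation studied in Chapter 2, implementing recursion by introducing a new data type such as a stack, has a familiar shape. The pair below is our illustration of that shape, not one of the thesis's schemata.

```python
def tree_sum_rec(t):
    """Recursive sum over nested tuples, e.g. ((1, 2), (3, (4, 5)))."""
    if isinstance(t, int):
        return t
    return sum(tree_sum_rec(c) for c in t)

def tree_sum_iter(t):
    """The same function with the recursion replaced by an explicit stack:
    the kind of recursive/iterative pair whose equivalence proofs the
    thesis generates by machine in LCF."""
    stack, total = [t], 0
    while stack:
        node = stack.pop()
        if isinstance(node, int):
            total += node
        else:
            stack.extend(node)
    return total

t = ((1, 2), (3, (4, 5)))
print(tree_sum_rec(t), tree_sum_iter(t))   # 15 15
```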

\index{Russell, Bruce D.}
\begin{chunk}{axiom.bib}
@article{Russ77,
author = "Russell, Bruce D.",
title = {{Implementation Correctness Involving a Language with GOTO
Statements}},
journal = "SIAM Journal of Computing",
volume = "6",
number = "3",
year = "1977",
abstract =
"Two languages, one a simple structured programming language, the
other a simple goto language, are defined. A denotational semantics is
given for each language. An interpreter for the goto language is given
and is proved correct with respect to the denotational semantics. A
compiler from the structured to the goto language is defined and
proved to be a semantically invariant translation of programs. The
proofs are by computational induction.",
paper = "Russ77.pdf",
keywords = "printed"
}

\end{chunk}
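The shape of Russell's result, a structured program and its goto-language translation computing the same function, can be sketched concretely. The hand compilation and the tiny interpreter below are our illustration; the paper's languages and its denotational proof are of course richer.

```python
def run_structured(n):
    """Source program in the structured language:
    while n > 0: acc := acc + n; n := n - 1."""
    acc = 0
    while n > 0:
        acc += n
        n -= 1
    return acc

def run_goto(n):
    """The same program compiled by hand into labelled statements with
    jumps, executed by a small interpreter over a state dictionary."""
    state = {'n': n, 'acc': 0}
    code = [
        ('branch_false', lambda s: s['n'] > 0, 4),          # 0: exit when n <= 0
        ('do', lambda s: {**s, 'acc': s['acc'] + s['n']}),  # 1: acc := acc + n
        ('do', lambda s: {**s, 'n': s['n'] - 1}),           # 2: n := n - 1
        ('goto', 0),                                        # 3: back to the test
    ]
    pc = 0
    while pc < len(code):
        kind = code[pc][0]
        if kind == 'do':
            state = code[pc][1](state)
            pc += 1
        elif kind == 'branch_false':
            pc = code[pc][2] if not code[pc][1](state) else pc + 1
        else:                                               # 'goto'
            pc = code[pc][1]
    return state['acc']

print(run_structured(10), run_goto(10))   # 55 55
```

Proving that this agreement holds for every program, not just every tested input, is what the paper's computational-induction argument delivers.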

\index{Paulson, Lawrence C.}
\begin{chunk}{axiom.bib}
@book{Paul87,
author = "Paulson, Lawrence C.",
title = {{Logic and Computation}},
publisher = "Cambridge University Press",
year = "1987",
isbn = "0-521-34632-0"
}

\end{chunk}

\index{Shoenfield, Joseph R.}
\begin{chunk}{axiom.bib}
@book{Shoe67,
author = "Shoenfield, Joseph R.",
title = {{Mathematical Logic}},
publisher = "Association for Symbolic Logic",
year = "1967",
isbn = "1-56881-135-7"
}

\end{chunk}

\index{Abramsky, S.}
\index{Gabbay, Dov M.}
\index{Maibaum, T.S.E.}
\begin{chunk}{axiom.bib}
@book{Abra92,
author = "Abramsky, S. and Gabbay, Dov M. and Maibaum, T.S.E.",
title = {{Handbook of Logic in Computer Science (Volume 2)}},
publisher = "Oxford Science Publications",
year = "1992",
isbn = "0-19-853761-1"
}

\end{chunk}

\index{Basu, Saugata}
\index{Pollack, Richard}
\index{Roy, Marie-Francoise}
\begin{chunk}{axiom.bib}
@book{Basu10,
author = "Basu, Saugata and Pollack, Richard and Roy, Marie-Francoise",
title = {{Algorithms in Real Algebraic Geometry}},
publisher = "Springer-Verlag",
year = "2010",
isbn = "978-3-642-06964-2"
}

\end{chunk}

\index{Harrison, John}
\begin{chunk}{axiom.bib}
@book{Harr09,
author = "Harrison, John",
title = {{Handbook of Practical Logic and Automated Reasoning}},
publisher = "Cambridge University Press",
year = "2009",
isbn = "978-0-521-89957-4"
}

\end{chunk}

\index{Barendregt, Henk}
\index{Dekkers, Wil}
\index{Statman, Richard}
\begin{chunk}{axiom.bib}
@book{Bare13,
author = "Barendregt, Henk and Dekkers, Wil and Statman, Richard",
title = {{Lambda Calculus with Types}},
publisher = "Cambridge University Press",
year = "2013",
isbn = "978-0-521-76614-2"
}

\end{chunk}

\index{Manna, Zohar}
\index{Waldinger, Richard}
\begin{chunk}{axiom.bib}
@book{Mann85,
author = "Manna, Zohar and Waldinger, Richard",
title = {{The Logical Basis for Computer Programming (Vol 1)}},
publisher = "Addison-Wesley",
year = "1985",
}

\end{chunk}

\index{Pierce, Benjamin C.}
\begin{chunk}{axiom.bib}
@book{Pier02,
author = "Pierce, Benjamin C.",
title = {{Types and Programming Languages}},
publisher = "MIT Press",
year = "2002",
isbn = "978-0-262-16209-8"
}

\end{chunk}

\index{Tarski, Alfred}
\begin{chunk}{axiom.bib}
@book{Tars46,
author = "Tarski, Alfred",
title = {{Introduction to Logic}},
publisher = "Dover",
year = "1946",
isbn = "978-0-486-28462-0"
}

\end{chunk}

\index{Shankar, N.}
\begin{chunk}{axiom.bib}
@book{Shan94,
author = "Shankar, N.",
title = {{Metamathematics, Machines, and Godel's Proof}},
publisher = "Cambridge University Press",
year = "1994",
isbn = "0-521-58533-3"
}

\end{chunk}

\index{Beeson, Michael J.}
\begin{chunk}{axiom.bib}
@book{Bees80,
author = "Beeson, Michael J.",
title = {{Foundations of Constructive Mathematics}},
publisher = "Springer-Verlag",
year = "1980",
isbn = "3-540-12173-0"
}

\end{chunk}

\index{Shieber, Stuart M.}
\index{Schabes, Yves}
\index{Pereira, Fernando C.N.}
\begin{chunk}{axiom.bib}
@article{Shie95,
author = "Shieber, Stuart M. and Schabes, Yves and Pereira, Fernando C.N.",
title = {{Principles and Implementation of Deductive Parsing}},
journal = "J. Logic Programming",
volume = "24",
number = "1-2",
pages = "3-36",
year = "1995",
abstract =
"We present a system for generating parsers based directly on the
metaphor of parsing as deduction. Parsing algorithms can be
represented directly as deduction systems, and a single deduction
engine can interpret such deduction systems so as to implement the
corresponding parser. The method generalizes easily to parsers for
augmented phrase structure formalisms, such as definite-clause
grammars and other logic grammar formalisms, and has been used for
rapid prototyping of parsing algorithms for a variety of formalisms
including variants of tree-adjoining grammars, categorial grammars,
and lexicalized context-free grammars.",
paper = "Shie95.pdf",
keywords = "printed"
}

\end{chunk}
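The parsing-as-deduction metaphor can be sketched with an agenda-driven engine whose items (i, A, j) assert that category A derives words i..j, with CKY completion as the single binary inference rule. This toy is in the spirit of the paper's deduction engine, not its actual Prolog implementation.

```python
def parses(words, lexicon, rules, start='S'):
    """Agenda-driven deductive recognizer for a CNF grammar.
    lexicon: word -> set of categories; rules: (B, C) -> set of A for A -> B C."""
    # axioms: each word proves an item spanning one position
    agenda = [(i, cat, i + 1) for i, w in enumerate(words) for cat in lexicon[w]]
    chart = set()
    while agenda:
        item = agenda.pop()
        if item in chart:
            continue
        chart.add(item)
        i, b, j = item
        for (k, c, l) in list(chart):
            if l == i:    # (k, C, i) + (i, B, j)  =>  (k, A, j) for A -> C B
                agenda += [(k, a, j) for a in rules.get((c, b), ())]
            if k == j:    # (i, B, j) + (j, C, l)  =>  (i, A, l) for A -> B C
                agenda += [(i, a, l) for a in rules.get((b, c), ())]
    return (0, start, len(words)) in chart

lexicon = {'time': {'NP'}, 'flies': {'VP'}}
rules = {('NP', 'VP'): {'S'}}
print(parses(['time', 'flies'], lexicon, rules))   # True
```

Changing the inference rules, not the engine, is what lets the same interpreter realize parsers for the richer formalisms the abstract lists.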

\index{Greve, David A.}
\index{Kaufmann, Matt}
\index{Manolios, Panagiotis}
\index{Moore, J Strother}
\index{Ray, Sandip}
\index{Ruiz-Reina, Jose Luis}
\index{Sumners, Rob}
\index{Vroon, Daron}
\index{Wilding, Matthew}
\begin{chunk}{axiom.bib}
@article{Grev08,
author = "Greve, David A. and Kaufmann, Matt and Manolios, Panagiotis
and Moore, J Strother and Ray, Sandip and Ruiz-Reina, Jose Luis
and Sumners, Rob and Vroon, Daron and Wilding, Matthew",
title = {{Efficient Execution in an Automated Reasoning Environment}},
journal = "Journal of Functional Programming",
volume = "18",
number = "1",
pages = "15-46",
year = "2008",
abstract =
"We describe a method that permits the user of a mechanized
mathematical logic to write elegant logical definitions while allowing
sound and efficient execution. In particular, the features supporting
this method allow the user to install, in a logically sound way,
alternative executable counterparts for logically defined
functions. These alternatives are often much more efficient than the
logically equivalent terms they replace. These features have been
implemented in the ACL2 theorem prover, and we discuss several
applications of the features in ACL2.",
paper = "Grev08.pdf",
keywords = "printed"
}

\end{chunk}
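The idea of attaching an efficient executable counterpart to a clean logical definition can be illustrated outside ACL2. The pair below is our sketch of the concept (in ACL2 this is the `mbe` mechanism, and the equality is proved rather than tested).

```python
def fib_spec(n):
    """The 'logical definition': transparently correct, exponential time."""
    return n if n < 2 else fib_spec(n - 1) + fib_spec(n - 2)

def fib_exec(n):
    """An efficient executable counterpart: linear time, but its agreement
    with fib_spec is now an obligation to discharge."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# the proof obligation, checked here by testing rather than by theorem proving
assert all(fib_spec(n) == fib_exec(n) for n in range(20))
```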

\index{Appel, Andrew W.}
\index{MacQueen, David B.}
\begin{chunk}{axiom.bib}
@inproceedings{Appe91,
author = "Appel, Andrew W. and MacQueen, David B.",
title = {{Standard ML of New Jersey}},
booktitle = "Int. Symp. on Prog. Lang. Implementations and Logic
Programming",
publisher = "Springer-Verlag",
pages = "1-13",
year = "1991",
abstract =
"The Standard ML of New Jersey compiler has been under development for
five years now. We have developed a robust and complete environment
for Standard ML that supports the implementation of large software
systems and generates efficient code. The compiler has also served as
a laboratory for developing novel implementation techniques for a
sophisticated type and module system, continuation based code
generation, efficient pattern matching, and concurrent programming
features.",
paper = "Appe91.pdf",
keywords = "printed"
}

\end{chunk}

\index{Tofte, Mads}
\begin{chunk}{axiom.bib}
@misc{Toft09,
author = "Tofte, Mads",
title = {{Tips for Computer Scientists on Standard ML (Revised)}},
year = "2009",
paper = "Toft09.pdf",
keywords = "printed"
}

\end{chunk}

\index{Graham, Paul}
\begin{chunk}{axiom.bib}
@book{Grah93,
author = "Graham, Paul",
title = {{On Lisp}},
publisher = "Prentice Hall",
year = "1993",
comment = "code in papers/onlisp.lisp",
isbn = "0130305529",
abstract =
"On Lisp is a comprehensive study of advanced Lisp techniques, with
bottom-up programming as the unifying theme. It gives the first
complete description of macros and macro applications. The book also
covers important subjects related to bottom-up programming, including
functional programming, rapid prototyping, interactive development,
and embedded languages. The final chapter takes a deeper look at
object-oriented programming than previous Lisp books, showing the
step-by-step construction of a working model of the Common Lisp Object
System (CLOS).",
paper = "Grah93.pdf"
}

\end{chunk}

\index{Loh, Andres}
\index{McBride, Conor}
\index{Swierstra, Wouter}
\begin{chunk}{axiom.bib}
@misc{Lohx18,
author = "Loh, Andres and McBride, Conor and Swierstra, Wouter",
title = {{Simply Easy! An Implementation of a Dependently Typed
Lambda Calculus}},
year = "2018",
abstract =
"We present an implementation in Haskell of a dependently-typed
lambda calculus that can be used as the core of a programming
language. We show that a dependently-typed lambda calculus is no
more difficult to implement than other typed lambda calculi. In fact,
our implementation is almost as easy as an implementation of the
simply typed lambda calculus, which we emphasize by discussing
the modifications necessary to go from one to the other. We explain
how to add data types and write simple programs in the core
language, and discuss the steps necessary to build a full-fledged
programming language on top of our simple core.",
paper = "Lohx18.pdf",
keywords = "printed"
}

\end{chunk}

\index{Felty, Amy}
\index{Miller, Dale}
\begin{chunk}{axiom.bib}
@techreport{Felt90,
author = "Felty, Amy and Miller, Dale",
title = {{Encoding a Dependent-Type Lambda Calculus in a Logic
Programming Language}},
type = "technical report",
number = "MS-CIS-90-18",
institution = "University of Pennsylvania",
year = "1990",
abstract =
"Various forms of typed $\lambda$-calculi have been proposed as
specification languages for representing wide varieties of object
logics. The logical framework, LF, is an example of such a
dependent-type $\lambda$-calculus. A small subset of intuitionistic
logic with quantification over simply typed $\lambda$-calculus has
also been proposed as a framework for specifying general logics. The
logic of {\sl hereditary Harrop formulas} with quantification at all
non-predicate types, denoted here as $hh^\omega$, is such a
meta-logic that has been implemented in both the Isabelle theorem
prover and the $\lambda$Prolog logic programming language. Both
frameworks provide for specifications of logics in which details
involved with free and bound variable occurrences, substitutions,
eigenvariables, and the scope of assumptions within object logics are
handled correctly and elegantly at the 'meta' level. In this paper, we
show how LF can be encoded into $hh^\omega$ in a direct and natural
way by mapping the typing judgments in LF into propositions in the
logic of $hh^\omega$. This translation establishes a very strong
connection between these two languages: the order of quantification in
an LF signature is exactly the order of a set of $hh^\omega$ clauses,
and the proofs in one system correspond directly to proofs in the
other system. Relating these two languages makes it possible to
provide implementations of proof checkers and theorem provers for
logics specified in LF by using standard logic programming techniques
which can be used to implement $hh^\omega$.",
paper = "Felt90.pdf",
keywords = "printed"
}

\end{chunk}

\index{Altenkirch, Thorsten}
\index{McBride, Conor}
\index{McKinna, James}
\begin{chunk}{axiom.bib}
@misc{Alte05,
author = "Altenkirch, Thorsten and McBride, Conor and McKinna, James",
title = {{Why Dependent Types Matter}},
year = "2005",
abstract =
"We exhibit the rationale behind the design of Epigram, a dependently
typed programming language and interactive program development system,
using refinements of a well known program -- merge sort -- as a running
example. We discuss its relationship with other proposals to introduce
aspects of dependent types into functional programming languages and
sketch some topics for further work in this area.",
paper = "Alte05.pdf",
keywords = "printed"
}

\end{chunk}

\index{Xi, Hongwei}
\index{Pfenning, Frank}
\begin{chunk}{axiom.bib}
@inproceedings{Xixx99,
author = "Xi, Hongwei and Pfenning, Frank",
title = {{Dependent Types in Practical Programming}},
booktitle = "SIGPLAN-SIGACT Symp. on Principles of Programming
Languages",
year = "1999",
publisher = "ACM",
abstract =
"We present an approach to enriching the type system of ML with a
restricted form of dependent types, where type index objects are drawn
from a constraint domain C, leading to the DML(C) language schema.
This allows specification and inference of significantly more precise
type information, facilitating program error detection and compiler
optimization. A major complication resulting from introducing
dependent types is that pure type inference for the enriched system is
no longer possible, but we show that type-checking a sufficiently
annotated program in DML(C) can be reduced to constraint
satisfaction in the constraint domain C. We exhibit the
unobtrusiveness of our approach through practical examples and prove
that DML(C) is conservative over ML. The main contribution of the
paper lies in our language design, including the formulation of
type-checking rules which makes the approach practical. To our
knowledge, no previous type system for a general purpose programming
language such as ML has combined dependent types with features including
datatype declarations, higher-order functions, general recursions,
let-polymorphism, mutable references, and exceptions. In addition,
we have finished a prototype implementation of DML(C) for an integer
constraint domain C, where constraints are linear inequalities (Xi
and Pfenning 1998).",
paper = "Xixx99.pdf",
keywords = "printed"
}

\end{chunk}
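
The reduction of dependent-type checking to constraint satisfaction that Xi and Pfenning describe can be illustrated with a minimal sketch. The names `ListTy`, `check_app`, and `safe_head` are hypothetical illustrations, not DML syntax, and a real checker would decide the linear inequalities symbolically rather than under concrete bindings:

```python
from dataclasses import dataclass

@dataclass
class ListTy:
    """A list type indexed by its length, as in DML over an integer domain."""
    length: str  # a linear expression over index variables, e.g. "n + 1"

def check_app(constraints, arg_ty, bindings):
    """Type-check an application by discharging the function's index
    constraints; here we simply evaluate them under concrete bindings
    for the index variables."""
    env = dict(bindings)
    env["len"] = eval(arg_ty.length, {}, dict(bindings))
    return all(eval(c, {}, env) for c in constraints)

# 'safe_head' demands a non-empty argument: the index constraint len >= 1.
safe_head = ["len >= 1"]

print(check_app(safe_head, ListTy("n + 1"), {"n": 0}))  # length n+1 >= 1: accepted
print(check_app(safe_head, ListTy("n"), {"n": 0}))      # possibly empty: rejected
```

The point of the DML design is that the type checker never needs general type inference: it only emits constraints like `len >= 1` to a decidable solver for the chosen domain C.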

\index{Altenkirch, Thorsten}
\begin{chunk}{axiom.bib}
@misc{Alte04,
author = "Altenkirch, Thorsten",
title = {{$\lambda$-calculus and types}},
comment = "Lecture Notes; APPSEM Spring School 2004",
year = "2004",
paper = "Alte04.pdf",
keywords = "printed"
}

\end{chunk}

\index{Loh, Andres}
\index{McBride, Conor}
\index{Swierstra, Wouter}
\begin{chunk}{axiom.bib}
@article{Lohx01,
author = "Loh, Andres and McBride, Conor and Swierstra, Wouter",
title = {{A Tutorial Implementation of a Dependently Typed Lambda
Calculus}},
journal = "Fundamenta Informaticae",
volume = "XXI",
year = "2001",
pages = "1001-1031",
abstract =
"We present the type rules for a dependently typed core calculus
together with a straight-forward implementation in Haskell. We
explicitly highlight the changes necessary to shift from a
simply-typed lambda calculus to the dependently typed lambda
calculus. We also describe how to extend our core language with data
types and write several small example programs. The article is
accompanied by an executable interpreter and example code that allows
immediate experimentation with the system we describe.",
paper = "Lohx01.pdf",
keywords = "printed"
}

\end{chunk}
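
The baseline from which the tutorial starts, a simply typed lambda calculus checker, is small enough to sketch. This is only an illustration in Python; the paper's actual implementation is in Haskell:

```python
# Terms: ("var", name) | ("lam", name, ty, body) | ("app", fun, arg)
# Types: "int" | ("fun", dom, cod)

def typeof(term, env=None):
    """Return the type of a simply typed lambda term, or raise on error."""
    env = env or {}
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    if tag == "lam":
        _, name, ty, body = term
        return ("fun", ty, typeof(body, {**env, name: ty}))
    if tag == "app":
        fty = typeof(term[1], env)
        aty = typeof(term[2], env)
        if fty[0] != "fun" or fty[1] != aty:
            raise TypeError("ill-typed application")
        return fty[2]
    raise ValueError("unknown term")

ident = ("lam", "x", "int", ("var", "x"))
print(typeof(ident))                                   # ('fun', 'int', 'int')
print(typeof(("app", ident, ("var", "y")), {"y": "int"}))  # 'int'
```

The shift the article highlights is that in the dependent calculus the term and type languages merge, so `typeof` must evaluate terms while checking them; the surrounding structure of the checker changes surprisingly little.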

\index{Roorda, Jan-Willem}
\begin{chunk}{axiom.bib}
@phdthesis{Roor00,
author = "Roorda, Jan-Willem",
title = {{Pure Type Systems for Functional Programming}},
year = "2000",
school = "University of Utrecht",
abstract =
"We present a functional programming language based on Pure Type
Systems (PTSs). We show how we can define such a language by
extending the PTS framework with algebraic data types, case
expressions and definitions. To be able to experiment with our
language we present an implementation of a type checker and an
interpreter for our language. PTSs are well suited as a basis for a
functional programming language because they are at the top of a
hierarchy of increasingly stronger type systems. The concepts of
`existential types', `rank-n polymorphism' and `dependent types' arise
naturally in functional programming languages based on the systems in
this hierarchy. There is no need for ad-hoc extensions to incorporate
these features. The type system of our language is more powerful than
the Hindley-Milner system. We illustrate this fact by giving a number
of meaningful programs that cannot be typed in Haskell but are typable
in our language. A `real world' example of such a program is the
mapping of a specialisation of a Generic Haskell function to a Haskell
function. Unlike the description of the Henk language by Simon Peyton
Jones and Erik Meijer we give a complete formal definition of the type
system and the operational semantics of our language. Another
difference between Henk and our language is that our language is
defined for a large class of Pure Type Systems instead of only for the
systems of the $\lambda$-cube.",
paper = "Roor00.pdf",
keywords = "printed"
}

\end{chunk}

\index{Jones, Simon Peyton}
\index{Meijer, Erik}
\begin{chunk}{axiom.bib}
@misc{Jone97,
author = "Jones, Simon Peyton and Meijer, Erik",
title = {{Henk: A Typed Intermediate Language}},
year = "1997",
abstract =
"There is a growing interest in the use of richly-typed
intermediate languages in sophisticated compilers for
higher-order, typed source languages. These intermediate languages
are typically stratified, involving terms, types, and kinds. As
the sophistication of the type system increases, these three
levels begin to look more and more similar, so an attractive
approach is to use a single syntax, and a single data type in the
compiler, to represent all three.

The theory of so-called {\sl pure type systems} makes precisely
such an identification. This paper describes Henk, a new typed
intermediate language based closely on a particular pure type
system, {\sl the lambda cube}. On the way we give a tutorial
introduction to the lambda cube.",
paper = "Jone97.pdf",
keywords = "printed"
}

\end{chunk}

\index{Morrisett, Greg}
\begin{chunk}{axiom.bib}
@phdthesis{Morr95,
author = "Morrisett, Greg",
title = {{Compiling with Types}},
school = "Carnegie Mellon University",
year = "1995",
comment = "CMU-CS-95-226",
abstract =
"Compilers for monomorphic languages, such as C and Pascal, take
advantage of types to determine data representations, alignment,
calling conventions, and register selection. However, these languages
lack important features including polymorphism, abstract datatypes,
and garbage collection. In contrast, modern programming languages
such as Standard ML (SML), provide all of these features, but existing
implementations fail to take full advantage of types. The result is
that performance of SML code is quite bad when compared to C.

In this thesis, I provide a general framework, called
{\sl type-directed compilation}, that allows compiler writers to
take advantage of types at all stages in compilation. In the
framework, types are used not only to determine efficient
representations and calling conventions, but also to prove the
correctness of the compiler. A key property of type-directed
compilation is that all but the lowest levels of the compiler use
{\sl typed intermediate languages}. An advantage of this approach
is that it provides a means for automatically checking the
integrity of the resulting code.

An important contribution of this work is the development of a new,
statically-typed intermediate language, called $\lambda^{ML}_i$.
This language supports {\sl dynamic type dispatch}, providing a
means to select operations based on types at run time. I show how
to use dynamic type dispatch to support polymorphism, ad-hoc
operators, and garbage collection without having to box or tag
values. This allows compilers for SML to take advantage of
techniques used in C compilers, without sacrificing language
features or separate compilation.

To demonstrate the applicability of my approach, I, along with
others, have constructed a new compiler for SML called TIL that
eliminates most restrictions on the representations of values. The
code produced by TIL is roughly twice as fast as code produced by
the SML/NJ compiler. This is due at least partially to the use
of natural representations, but primarily to the conventional
optimizer which manipulates typed, $\lambda^{ML}_i$ code. TIL
demonstrates that combining type-directed compilation with dynamic
type dispatch yields a superior architecture for compilers of
modern languages.",
paper = "Morr95.pdf"
}

\end{chunk}

\index{Leroy, Xavier}
\begin{chunk}{axiom.bib}
@article{Lero98,
author = "Leroy, Xavier",
title = {{An Overview of Types in Compilation}},
journal = "LNCS",
volume = "1473",
year = "1998",
publisher = "Springer-Verlag",
pages = "1-8",
paper = "Lero98.pdf",
keywords = "printed"
}

\end{chunk}

\index{Lu, Eric}
\begin{chunk}{axiom.bib}
@misc{Luxx16,
author = "Lu, Eric",
title = {{Barendregt's Cube and Programming with Dependent Types}},
paper = "Luxx16.pdf",
keywords = "printed"
}

\end{chunk}

\index{Guallart, Nino}
\begin{chunk}{axiom.bib}
@misc{Gual14,
author = "Guallart, Nino",
title = {{An Overview of Type Theories}},
year = "2014",
abstract =
"Pure type systems arise as a generalisation of simply typed lambda
calculus. The contemporary development of mathematics has renewed
the interest in type theories, as they are not just the object of
mere historical research, but have an active role in the development
of computational science and core mathematics. It is worth exploring
some of them in depth, particularly predicative Martin-Löf’s
intuitionistic type theory and impredicative Coquand’s calculus of
constructions. The logical and philosophical differences and
similarities between them will be studied, showing the relationship
between these type theories and other fields of logic.",
paper = "Gual14.pdf",
keywords = "printed"
}

\end{chunk}

\index{Benini, Marco}
\begin{chunk}{axiom.bib}
@misc{Beni95,
author = "Benini, Marco",
title = {{Barendregt's $\lambda$-Cube in Isabelle}},
year = "1995",
abstract =
"We present an implementation of Barendregt’s $\lambda$-Cube in the
Isabelle proof system. Isabelle provides many facilities for
developing a useful specification and proving environment from the
basic formulation of formal systems. We used those facilities to
provide an environment where the user can describe problems and derive
proofs interactively.",
paper = "Beni95.pdf",
keywords = "printed"
}

\end{chunk}

\index{Hellman, Martin E.}
\begin{chunk}{axiom.bib}
@article{Hell17,
author = "Hellman, Martin E.",
title = {{Cybersecurity, Nuclear Security, Alan Turing, and
Illogical Logic}},
journal = "Communications of the ACM",
volume = "60",
number = "12",
pages = "52-59",
year = "2017",
}

\end{chunk}

\index{Winston, Patrick Henry}
\begin{chunk}{axiom.bib}
@inproceedings{Wins12,
author = "Winston, Patrick Henry",
title = {{The Next 50 Years: A Personal View}},
booktitle = "Biologically Inspired Cognitive Architectures 1",
year = "2012",
pages = "92-99",
publisher = "Elsevier B.V.",
abstract =
"I review history, starting with Turing’s seminal paper, reaching
back ultimately to when our species started to outperform other
primates, searching for the questions that will help us develop a
computational account of human intelligence. I answer that the
right questions are: What’s different between us and the other
primates and what’s the same. I answer the {\sl what’s different}
question by saying that we became symbolic in a way that enabled
story understanding, directed perception, and easy communication,
and other species did not. I argue against Turing’s
reasoning-centered suggestions, offering that reasoning is just a
special case of story understanding. I answer the {\sl what’s the
same} question by noting that our brains are largely engineered in
the same exotic way, with information flowing in all directions at
once. By way of example, I illustrate how these answers can
influence a research program, describing the Genesis system, a
system that works with short summaries of stories, provided in
English, together with low-level {\sl common-sense rules} and
higher-level {\sl concept patterns}, likewise expressed in
English. Genesis answers questions, notes abstract concepts such
as {\sl revenge}, tells stories in a listener-aware way, and fills
in story gaps using precedents. I conclude by suggesting,
optimistically, that a genuine computational theory of human
intelligence will emerge in the next 50 years if we stick to the
right, biologically inspired questions, and work toward
biologically informed models.",
paper = "Wins12.pdf"
}

\end{chunk}

\index{Brooks, Rodney A.}
\begin{chunk}{axiom.bib}
@article{Broo87,
author = "Brooks, Rodney A.",
title = {{Intelligence Without Representation}},
journal = "Artificial Intelligence",
volume = "47",
year = "1991",
pages = "139-159",
abstract =
"Artificial intelligence research has foundered on the issue of
representation. When intelligence is approached in an incremental
manner, with strict reliance on interfacing to the real world through
perception and action, reliance on representation disappears. In this
paper we outline our approach to incrementally building complete
intelligent Creatures. The fundamental decomposition of the
intelligent system is not into independent information processing
units which must interface with each other via representations.
Instead, the intelligent system is decomposed into independent and
parallel activity producers which all interface directly to the world
through perception and action, rather than interface to each other
particularly much. The notions of central and peripheral systems
evaporate; everything is both central and peripheral. Based on these
principles we have built a very successful series of mobile robots
which operate without supervision as Creatures in standard office
environments.",
paper = "Broo87.pdf"
}

\end{chunk}

\index{Van Roy, Peter}
\index{Haridi, Seif}
\begin{chunk}{axiom.bib}
@book{Royx03,
author = "Van Roy, Peter and Haridi, Seif",
title = {{Concepts, Techniques, and Models of Computer Programming}},
year = "2003",
publisher = "MIT Press",
isbn = "978-0262220699",
paper = "Royx03.pdf"
}

\end{chunk}

\index{Boyer, Robert S.}
\index{Moore, J Strother}
\begin{chunk}{axiom.bib}
@article{Boye84,
author = "Boyer, Robert S. and Moore, J Strother",
title = {{Proof-Checking, Theorem-Proving, and Program Verification}},
journal = "Contemporary Mathematics",
volume = "29",
year = "1984",
abstract =
"This article consists of three parts: a tutorial introduction to a
computer program that proves theorems by induction; a brief
description of recent applications of that theorem-prover; and a
discussion of several nontechnical aspects of the problem of building
automatic theorem-provers. The theorem-prover described has proved
theorems such as the uniqueness of prime factorizations, Fermat's
theorem, and the recursive unsolvability of the halting problem.

The article is addressed to those who know nothing about automatic
theorem-proving but would like a glimpse of one such system. This
article definitely does not provide a balanced view of all automatic
theorem-proving, the literature of which is already rather large and
technical. Nor do we describe the details of our theorem-proving
system, but they can be found in the books, articles, and technical
reports that we reference.

In our opinion, progress in automatic theorem-proving is largely a
function of the mathematical ability of those attempting to build such
systems. We encourage good mathematicians to work in the field.",
paper = "Boye84.pdf",
keywords = "printed"
}

\end{chunk}

\index{Lamport, Leslie}
\begin{chunk}{axiom.bib}
@article{Lamp78,
author = "Lamport, Leslie",
title = {{The Specification and Proof of Correctness of Interactive
Programs}},
journal = "LNCS",
volume = "75",
year = "1978",
abstract =
"A method of formally specifying an interactive program is described,
and the Floyd-Hoare assertional proof method is modified to permit one
to prove that a program correctly implements its specification. The
method is illustrated by specifying a program that accepts typed
input, and by proving that a TECO program implements it correctly.",
paper = "Lamp78.pdf",
keywords = "printed"
}

\end{chunk}

\index{Anderson, E.R.}
\index{Belz, F.C.}
\index{Blum, E.K.}
\begin{chunk}{axiom.bib}
@article{Ande78,
author = "Anderson, E.R. and Belz, F.C. and Blum, E.K.",
title = {{Extending an Implementation Language to a Specification Language}},
journal = "LNCS",
volume = "75",
year = "1978",
paper = "Ande78.pdf",
keywords = "printed"
}

\end{chunk}

\index{Felleisen, Matthias}
\begin{chunk}{axiom.bib}
@misc{Tobi11,
author = "Tobin-Hochstadt, Sam and Felleisen, Matthias",
title = {{The Design and Implementation of Typed Scheme: From
Scripts to Programs}},
year = "2011",
abstract =
"When scripts in untyped languages grow into large programs,
maintaining them becomes difficult. A lack of explicit type
annotations in typical scripting languages forces programmers to
(re)discover critical pieces of design information every time they
wish to change a program. This analysis step both slows down the
maintenance process and may even introduce mistakes due to the
violation of undiscovered invariants.

This paper presents Typed Scheme, an explicitly typed extension of PLT
Scheme, an untyped scripting language. Its type system is based on
the novel notion of occurrence typing, which we formalize and
mechanically prove sound. The implementation of Typed Scheme
additionally borrows elements from a range of approaches, including
recursive types, true unions and subtyping, plus polymorphism combined
with a modicum of local inference.

The formulation of occurrence typing naturally leads to a simple and
expressive version of predicates to describe refinement types. A
Typed Scheme program can use these refinement types to keep track of
arbitrary classes of values via the type system. Further, we show how
the Typed Scheme type system, in conjunction with simple recursive
types, is able to encode refinements of existing datatypes, thus
expressing both proposed variations of refinement types.",
paper = "Tobi11.pdf",
keywords = "printed"
}

\end{chunk}
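
The occurrence-typing idea, that a predicate test on a variable refines its union type differently in each branch, can be sketched in miniature. The function `narrow` is a hypothetical illustration (types modelled as sets of base-type names), not Typed Scheme's actual machinery:

```python
def narrow(union, pred_ty, branch):
    """Occurrence typing in miniature: a predicate test refines a union
    type.  In the then-branch the type is intersected with the predicate's
    positive type; in the else-branch that type is subtracted."""
    return union & pred_ty if branch == "then" else union - pred_ty

# x : (U Number String); the test (number? x) narrows each branch:
x_ty = {"Number", "String"}
print(narrow(x_ty, {"Number"}, "then"))  # {'Number'}
print(narrow(x_ty, {"Number"}, "else"))  # {'String'}
```

The formal system additionally tracks which predicate was applied to which variable through arbitrary boolean combinations, which is what makes idiomatic Scheme dispatch patterns typable.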

\index{McCarthy, John}
\index{Painter, James}
\begin{chunk}{axiom.bib}
@inproceedings{Mcca67,
author = "McCarthy, John and Painter, James",
title = {{Correctness of a Compiler for Arithmetic Expressions}},
booktitle = "Proc. Symp. in Applied Mathematics",
publisher = "American Mathematical Society",
year = "1967",
paper = "Mcca67.pdf",
keywords = "printed"
}

\end{chunk}
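
The shape of the correctness theorem, that compiling an expression and running the object code yields the same value as direct evaluation, can be sketched for a toy machine. McCarthy and Painter prove this for a simple accumulator machine; the stack machine and the restriction to constants, variables, and addition below are only an illustration:

```python
def evaluate(e, env):
    """Source semantics: direct evaluation (constants, variables, '+')."""
    if isinstance(e, int):
        return e
    if isinstance(e, str):
        return env[e]
    _, a, b = e
    return evaluate(a, env) + evaluate(b, env)

def compile_expr(e):
    """Compile an expression to code for a tiny stack machine."""
    if isinstance(e, int):
        return [("push", e)]
    if isinstance(e, str):
        return [("load", e)]
    _, a, b = e
    return compile_expr(a) + compile_expr(b) + [("add",)]

def run(code, env):
    """Object-code semantics: execute the stack machine."""
    stack = []
    for ins in code:
        if ins[0] == "push":
            stack.append(ins[1])
        elif ins[0] == "load":
            stack.append(env[ins[1]])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
    return stack[-1]

e, env = ("+", ("+", "x", 1), "y"), {"x": 2, "y": 3}
print(evaluate(e, env), run(compile_expr(e), env))  # both 6
```

The theorem is the statement that `run(compile_expr(e), env) == evaluate(e, env)` for every expression, proved by structural induction on `e`.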

\index{Vuillemin, Jean}
\begin{chunk}{axiom.bib}
@phdthesis{Vuil73,
author = "Vuillemin, Jean",
title = {{Proof Techniques for Recursive Programs}},
school = "Stanford University",
year = "1973",
abstract =
"The concept of least fixed-point of a continuous function can be
considered as the unifying thread of this dissertation.

The connections between fixed-points and recursive programs are
detailed in chapter 2, providing some insights on practical
implementations of recursion. There are two usual characterizations
of the least fixed-point of a continuous function. To the first
characterization, due to Knaster and Tarski, corresponds a class
of proof techniques for programs, as described in Chapter 3. The
other characterization of least fixed points, better known as
Kleene's first recursion theorem, is discussed in Chapter 4. It
has the advantage of being effective and it leads to a wider class
of proof techniques.",
paper = "Vuil73.pdf"
}

\end{chunk}

\begin{chunk}{axiom.bib}
@misc{Byte79,
author = "Byte Publications",
title = {{LISP}},
publisher = "McGraw-Hill",
year = "1979",
paper = "Byte79.pdf"
}

\end{chunk}

\index{Manna, Zohar}
\index{Vuillemin, Jean}
\begin{chunk}{axiom.bib}
@article{Mann72,
author = "Manna, Zohar and Vuillemin, Jean",
title = {{Fixpoint Approach to the Theory of Computation}},
journal = "Communications of the ACM",
volume = "15",
number = "7",
year = "1972",
pages = "828-836",
abstract =
"Following the fixpoint theory of Scott, the semantics of computer
programs are defined in terms of the least fixpoints of recursive
programs. This allows not only the justification of all existing
verification techniques, but also their extension to the handling, in
a uniform manner of various properties of computer programs, including
correctness, termination, and equivalence.",
paper = "Mann72.pdf",
keywords = "printed"
}

\end{chunk}
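
The least-fixpoint view of a recursive program can be made concrete by computing Kleene's chain of approximations, with `None` standing in for the undefined value (bottom). The factorial functional below is a standard illustration, not taken from the paper:

```python
def F(f):
    """The functional whose least fixpoint is factorial; None plays
    the role of 'undefined'."""
    def g(n):
        if n == 0:
            return 1
        prev = f(n - 1)
        return None if prev is None else n * prev
    return g

# Kleene's chain: bottom, F(bottom), F(F(bottom)), ...
approx = lambda n: None          # the everywhere-undefined function
for _ in range(6):
    approx = F(approx)

# The k-th approximation is defined exactly on 0..k-1:
print(approx(4))   # 24
print(approx(6))   # None -- not yet defined at this stage of the chain
```

The least fixpoint is the limit of this chain, and properties such as termination become statements about where in the chain an argument first receives a defined value.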

\index{Paulson, Lawrence C.}
\begin{chunk}{axiom.bib}
@article{Paul83,
author = "Paulson, Lawrence C.",
title = {{A Higher-Order Implementation of Rewriting}},
journal = "Science of Computer Programming",
volume = "3",
pages = "119-149",
year = "1983",
abstract =
"Many automatic theorem-provers rely on rewriting. Using theorems as
rewrite rules helps to simplify the subgoals that arise during a
proof. LCF is an interactive theorem-prover intended for reasoning
about computation. Its implementation of rewriting is presented in
detail. LCF provides a family of rewriting functions, and operators to
combine them. A succession of functions is described, from pattern
matching primitives to the rewriting tool that performs most
inferences in LCF proofs. The design is highly modular. Each function
performs a basic, specific task, such as recognizing a certain form of
tautology. Each operator implements one method of building a rewriting
function from simpler ones. These pieces can be put together in
numerous ways, yielding a variety of rewriting strategies. The
approach involves programming with higher-order functions. Rewriting
functions are data values, produced by computation on other rewriting
functions. The code is in daily use at Cambridge, demonstrating the
practical use of functional programming.",
paper = "Paul83.pdf",
keywords = "printed"
}

\end{chunk}
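
The paper's central device, building rewriting functions from simpler ones with higher-order combinators, can be sketched as follows. The names `first` and `repeat` are illustrative stand-ins for LCF's combinators (ORELSE, REPEAT), and the two one-step rules are made up for the example:

```python
def first(*rules):
    """Combine rewriting functions: return the result of the first rule
    that applies; rules signal failure by returning None (cf. ORELSE)."""
    def rw(term):
        for rule in rules:
            out = rule(term)
            if out is not None:
                return out
        return None
    return rw

def repeat(rule):
    """Apply a rewriting function until it no longer applies (cf. REPEAT)."""
    def rw(term):
        changed = False
        while (out := rule(term)) is not None:
            term, changed = out, True
        return term if changed else None
    return rw

# Two one-step rules on ("op", lhs, rhs) trees:  x + 0 -> x,  x * 1 -> x.
def add_zero(t):
    return t[1] if isinstance(t, tuple) and t[0] == "+" and t[2] == 0 else None

def mul_one(t):
    return t[1] if isinstance(t, tuple) and t[0] == "*" and t[2] == 1 else None

simplify = repeat(first(add_zero, mul_one))
print(simplify(("*", ("+", "x", 0), 1)))  # 'x'
```

Because rewriting functions are ordinary values, strategies like `simplify` are assembled by computation rather than hard-wired into the prover, which is the modularity the abstract emphasizes.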

\index{Avigad, Jeremy}
\index{Holzl, Johannes}
\index{Serafin, Luke}
\begin{chunk}{axiom.bib}
@misc{Avig17c,
author = "Avigad, Jeremy and Holzl, Johannes and Serafin, Luke",
title = {{A Formally Verified Proof of the Central Limit Theorem}},
year = "2017",
abstract =
"We describe a proof of the Central Limit Theorem that has been
formally verified in the Isabelle proof assistant. Our formalization
builds upon and extends Isabelle’s libraries for analysis and
measure-theoretic probability. The proof of the theorem uses
characteristic functions, which are a kind of Fourier transform, to
demonstrate that, under suitable hypotheses, sums of random
variables converge weakly to the standard normal distribution. We
also discuss the libraries and infrastructure that supported the
formalization, and reflect on some of the lessons we have learned
from the effort.",
paper = "Avig17c.pdf",
keywords = "printed"
}

\end{chunk}

\index{Feferman, Solomon}
\begin{chunk}{axiom.bib}
@misc{Fefe95,
author = "Feferman, Solomon",
title = {{Definedness}},
year = "1995",
abstract =
"Questions of definedness are ubiquitous in mathematics. Informally,
these involve reasoning about expressions which may or may not have a
value. This paper surveys work on logics in which such reasoning can
be carried out directly, especially in computational contexts. It
begins with a general logic of 'partial terms', continues with partial
combinatory and lambda calculi, and concludes with an expressively
rich theory of partial functions and polymorphic types, where
termination of functional programs can be established in a natural way.",
paper = "Fefe95.pdf",
keywords = "printed"
}

\end{chunk}

2873. By daly on 2018-06-07

books/bookvol0 fix minor typo to citation

Goal: Axiom Maintenance

2872. By daly on 2018-06-07

books/bookvol0 add Griesmer Obituary

Goal:

2871. By daly on 2018-06-07

Goal: Proving Axiom Correct

\index{Kohlhase, Michael}
\index{De Feo, Luca}
\index{Muller, Dennis}
\index{Pfeiffer, Markus}
\index{Rabe, Florian}
\index{Thiery, Nicolas M.}
\index{Vasilyev, Victor}
\index{Wesing, Tom}
\begin{chunk}{axiom.bib}
@inproceedings{Kohl17,
author = "Kohlhase, Michael and De Feo, Luca and Muller, Dennis and
Pfeiffer, Markus and Rabe, Florian and Thiery, Nicolas M.
and Vasilyev, Victor and Wesing, Tom",
title = {{Knowledge-Based Interoperability for Mathematical Software
Systems}},
booktitle = "7th Int. Conf. on Mathematical Aspects of Computer and
Information Sciences",
publisher = "Springer",
year = "2017",
pages = "195-210",
isbn = "9783319724539",
abstract =
"There is a large ecosystem of mathematical software systems.
Individually, these are optimized for particular domains and
functionalities, and together they cover many needs of practical and
theoretical mathematics. However, each system specializes on one area,
and it remains very difficult to solve problems that need to involve
multiple systems. Some integrations exist, but they are ad hoc and have
scalability and maintainability issues. In particular, there is not
yet an interoperability layer that combines the various systems into a
virtual research environment (VRE) for mathematics.

The OpenDreamKit project aims at building a toolkit for such VREs. It
suggests using a central system-agnostic formalization of mathematics
(Math-in-the-Middle, MitM) as the needed interoperability layer. In
this paper, we conduct the first major case study that instantiates
the MitM paradigm for a concrete domain as well as a concrete set of
systems. Specifically, we integrate GAP, Sage, and Singular to perform
computation in group and ring theory.

Our work involves massive practical efforts, including a novel
formalization of computational group theory, improvements to the
involved software systems, and a novel mediating system that sits at
the center of a star-shaped integration layout between mathematical
software systems.",
paper = "Kohl17.pdf",
keywords = "printed"
}

\end{chunk}

\index{Dehaye, Paul-Olivier}
\index{Iancu, Mihnea}
\index{Kohlhase, Michael}
\index{Konovalov, Alexander}
\index{Lelievre, Samuel}
\index{Muller, Dennis}
\index{Pfeiffer, Markus}
\index{Rabe, Florian}
\index{Thiery, Nicolas M.}
\index{Wiesling, Tom}
\begin{chunk}{axiom.bib}
@inproceedings{Deha16,
author = "Dehaye, Paul-Olivier and Iancu, Mihnea and Kohlhase, Michael
and Konovalov, Alexander and Lelievre, Samuel and
Muller, Dennis and Pfeiffer, Markus and Rabe, Florian and
Thiery, Nicolas M. and Wiesling, Tom",
title = {{Interoperability in the OpenDreamKit project: The
Math-in-the-Middle Approach}},
booktitle = "Intelligent Computer Mathematics",
pages = "117-131",
year = "2016",
isbn = "9783319425467",
publisher = "Springer",
abstract =
"OpenDreamKit - 'Open Digital Research Environment Toolkit for the
Advancement of Mathematics' - is an H2020 EU Research Infrastructure
project that aims at supporting, over the period 2015-2019, the
ecosystem of open-source mathematical software systems. OpenDreamKit
will deliver a flexible toolkit enabling research groups to set up
Virtual Research Environments, customised to meet the varied needs of
research projects in pure mathematics and applications.

An important step in the OpenDreamKit endeavor is to foster the
interoperability between a variety of systems, ranging from
computer algebra systems over mathematical databases to
front-ends. This is the mission of the integration work
package. We report on experiments and future plans with the
Math-in-the-Middle approach. This architecture consists of a
central mathematical ontology that documents the domain and fixes a
joint vocabulary, or even a language, going beyond existing
systems such as OpenMath, combined with specifications of the
functionalities of the various systems. Interaction between
systems can then be enriched by pivoting around this architecture.",
paper = "Deha16.pdf",
keywords = "printed, axiomref"
}

\end{chunk}

\index{de Bruijn, N.G.}
\begin{chunk}{axiom.bib}
@techreport{Brui68a,
author = "de Bruijn, N.G.",
title = {{AUTOMATH, A Language for Mathematics}},
year = "1968",
type = "technical report",
number = "68-WSK-05",
institution = "Technische Hogeschool Eindhoven",
paper = "Brui68a.pdf"
}

\end{chunk}

\index{Farmer, William M.}
\begin{chunk}{axiom.bib}
@article{Farm07a,
author = "Farmer, William M.",
title = {{Chiron: A Multi-Paradigm Logic}},
journal = "Studies in Logic, Grammar and Rhetoric",
volume = "10",
number = "23",
year = "2007",
abstract =
"Chiron is a derivative of von-Neumann-Bernays-G̈odel ( NBG ) set
theory that is intended to be a practical, general-purpose logic for
mechanizing mathematics. It supports several reasoning paradigms by
integrating NBG set theory with elements of type theory, a scheme for
handling undefinedness, and a facility for reasoning about the syntax
of expressions. This paper gives a quick, informal presentation of
the syntax and semantics of Chiron and then discusses some of the
benefits Chiron provides as a multi-paradigm logic.",
paper = "Farm07a.pdf",
keywords = "axiomref, printed"
}

\end{chunk}

\index{Farmer, William M.}
\begin{chunk}{axiom.bib}
@techreport{Farm13,
author = "Farmer, William M.",
title = {{Chiron: A Set Theory with Types, Undefinedness, Quotation,
and Evaluation}},
type = "technical report",
institution = "McMaster University",
number = "SQRL Report No. 38",
year = "2013",
abstract =
"Chiron is a derivative of von-Neumann-Bernays-Godel (NBG) set
theory that is intended to be a practical, general-purpose logic for
mechanizing mathematics. Unlike traditional set theories such as
Zermelo-Fraenkel (ZF) and NBG, Chiron is equipped with a type system,
lambda notation, and definite and indefinite description. The type
system includes a universal type, dependent types, dependent function
types, subtypes, and possibly empty types. Unlike traditional logics
such as first-order logic and simple type theory, Chiron admits
undefined terms that result, for example, from a function applied to
an argument outside its domain or from an improper definite or
indefinite description. The most noteworthy part of Chiron is its
facility for reasoning about the syntax of expressions. Quotation is
used to refer to a set called a construction that represents the
syntactic structure of an expression, and evaluation is used to refer
to the value of the expression that a construction represents. Using
quotation and evaluation, syntactic side conditions, schemas,
syntactic transformations used in deduction and computation rules, and
other such things can be directly expressed in Chiron. This paper
presents the syntax and semantics of Chiron, some definitions and
simple examples illustrating its use, a proof system for Chiron, and a
notion of an interpretation of one theory of Chiron in another.",
paper = "Farm13.pdf"
}

\end{chunk}

\index{Kripke, Saul}
\begin{chunk}{axiom.bib}
@article{Krip75,
author = "Kripke, Saul",
title = {{Outline of a Theory of Truth}},
journal = "Journal of Philosophy",
volume = "72",
number = "19",
year = "1975",
pages = "690-716",
paper = "Krip75.pdf",
keywords = "printed"
}

\end{chunk}

\index{Kleene, Stephen Cole}
\begin{chunk}{axiom.bib}
@book{Klee52,
author = "Kleene, Stephen Cole",
title = {{Introduction to Metamathematics}},
year = "1952",
publisher = "Ishi Press International",
isbn = "978-0923891572",
paper = "Klee52.pdf"
}

\end{chunk}

\index{Smith, Peter}
\begin{chunk}{axiom.bib}
@misc{Smit15,
author = "Smith, Peter",
title = {{Some Big Books on Mathematical Logic}},
year = "2015",
paper = "Smit15.pdf",
keywords = "printed"
}

\end{chunk}

\index{Gonthier, Georges}
\begin{chunk}{axiom.bib}
@article{Gont09a,
author = "Gonthier, Georges",
title = {{Software Engineering for Mathematics}},
journal = "LNCS",
volume = "5625",
year = "2009",
pages = "27-27",
abstract =
"Despite its mathematical origins, progress in computer assisted
reasoning has mostly been driven by applications in computer science,
like hardware or protocol security verification. Paradoxically, it has
yet to gain widespread acceptance in its original domain of
application, mathematics; this is commonly attributed to a 'lack of
libraries': attempts to formalize advanced mathematics get bogged down
in the formalization of an unwieldy large set of basic results.

This problem is actually a symptom of a deeper issue: the main
function of computer proof systems, checking proofs down to their
finest details, is at odds with mathematical practice, which ignores
or defers details in order to apply and combine abstractions in
creative and elegant ways. Mathematical texts commonly leave logically
important parts of proofs as 'exercises to the reader', and are rife
with 'abuses of notation that make mathematics tractable' (according
to Bourbaki). This (essential) flexibility cannot be readily
accommodated by the narrow concept of 'proof library' used by most
proof assistants and based on 19th century first-order logic: a
collection of constants, definitions, and lemmas.

This mismatch is familiar to software engineers, who have been
struggling for the past 50 years to reconcile the flexibility needed
to produce sensible user requirements with the precision needed to
implement them correctly with computer code. Over the last 20 years
objects and components have replaced traditional data and procedure
libraries, partly bridging this gap and making it possible to build
significantly larger computer systems.

These techniques can be implemented in computer proof systems by
exploiting advances in mathematical logic. Higher-order logics allow
the direct manipulation of functions; this can be used to assign
behaviour, such as simplification rules, to symbols, similarly to
objects. Advanced type systems can assign a secondary, contextual
meaning to expressions, using mechanisms such as type classes,
similarly to the metadata in software components. The two can be combined
to perform reflection, where an entire statement gets quoted as
metadata and then proved algorithmically by some decision procedure.

We propose to use a more modest, small-scale form of reflection, to
implement mathematical components. We use the type-derived metadata to
indicate how symbols, definitions and lemmas should be used in other
theories, and functions to implement this usage — roughly, formalizing
some of the exercise section of a textbook. We have successfully
applied this more engineered approach to computer proofs in our
past work on the Four Color Theorem, the Cayley-Hamilton Theorem, and
our ongoing long-term effort on the Odd Order Theorem, which is the
starting point of the proof of the Classification of Finite Simple
Groups (the famous 'monster theorem' whose proof spans 10,000 pages in
400 articles).",
paper = "Gont09a.pdf",
keywords = "printed"
}

\end{chunk}

\index{Carette, Jacques}
\begin{chunk}{axiom.bib}
@article{Care10,
author = "Carette, Jacques",
title = {{Mechanized Mathematics}},
journal = "LNCS",
volume = "6167",
year = "2010",
pages = "157-157",
abstract =
"In the 50 years since McCarthy’s 'Recursive Functions of Symbolic
Expressions and Their Computation by Machine', what have we
learned about the realization of Leibniz’s dream of just being
able to utter 'Calculemus!' (Let us calculate!) when faced with a
mathematical dilemma?

In this talk, I will first present what I see as the most
important lessons from the past which need to be heeded by modern
designers. From the present, I will look at the context in which
computers are used, and derive further requirements. In
particular, now that computers are no longer the exclusive
playground for highly educated scientists, usability is now more
important than ever, and justifiably so.

I will also examine what I see as some principal failings of
current systems, primarily to understand some major mistakes to
avoid. These failings will be analyzed to extract what seems to be
the root mistake, and I will present my favourite solutions.

Furthermore, various technologies have matured since the creation
of many of our systems, and whenever appropriate, these should be
used. For example, our understanding of the structure of
mathematics has significantly increased, yet this is barely
reflected in our libraries. The extreme focus on efficiency by the
computer algebra community, and correctness by the (interactive)
theorem proving community should no longer be considered viable
long term strategies. But how does one effectively bridge that
gap?

I personally find that a number of (programming) language-based
solutions are particularly effective, and I will emphasize
these. Solutions to some of these problems will be illustrated
with code from a prototype of MathScheme 2.0, the system I am
developing with Bill Farmer and our research group.",
paper = "Care10.pdf",
keywords = "printed"
}

\end{chunk}

\index{Zeilberger, Doron}
\begin{chunk}{axiom.bib}
@article{Zeil10,
author = "Zeilberger, Doron",
title = {{Against Rigor}},
journal = "LNCS",
volume = "6167",
year = "2010",
pages = "262-262",
abstract =
"The ancient Greeks gave (western) civilization quite a few
gifts, but we should beware of Greeks bearing gifts. The gifts of theatre
and democracy were definitely good ones, and perhaps even the gift of
philosophy, but the 'gift' of the so-called 'axiomatic method' and the
notion of 'rigorous' proof did much more harm than good. If we want
to maximize Mathematical Knowledge, and its Management, we have to
return to Euclid this dubious gift, and give-up our fanatical insistence
on perfect rigor. Of course, we should not go to the other extreme, of
demanding that everything should be non-rigorous. We should encourage
diversity of proof-styles and rigor levels, and remember that nothing is
absolutely sure in this world, and there does not exist an absolutely
rigorous proof, nor absolute certainty, and 'truth' has many shades and
levels.",
paper = "Zeil10.pdf",
keywords = "printed"
}

\end{chunk}

\index{Pfenning, Frank}
\begin{chunk}{axiom.bib}
@phdthesis{Pfen87,
author = "Pfenning, Frank",
title = {{Proof Transformations in Higher-Order Logic}},
year = "1987",
school = "Carnegie Mellon University",
abstract =
"We investigate the problem of translating between different styles of
proof systems in higher-order logic: analytic proofs which are well
suited for automated theorem proving, and non-analytic deductions
which are well suited for the mathematician. Analytic proofs are
represented as expansion proofs, $H$, a form of the sequent calculus we
define; non-analytic proofs are represented by natural deductions. A
non-deterministic translation algorithm between expansion proofs and
$H$-deductions is presented and its correctness is proven. We also
present an algorithm for translation in the other direction and prove
its correctness. A cut-elimination algorithm for expansion proofs is
given and its partial correctness is proven. Strong termination of
this algorithm remains a conjecture for the full higher-order
system, but is proven for the first-order fragment. We extend the
translations to a non-analytic proof system which contains a primitive
notion of equality, while leaving the notion of expansion proof
unaltered. This is possible, since a non-extensional equality is
definable in our system of type theory. Next we extend analytic and
non-analytic proof systems and the translations between them to
include extensionality. Finally, we show how the methods and notions
used so far apply to the problem of translating expansion proofs into
natural deductions. Much care is taken to specify this translation in
a modular way (through tactics) which leaves a large number of choices
and therefore a lot of room for heuristics to produce elegant natural
deductions. A practically very useful extension, called symmetric
simplification, produces natural deductions which make use of lemmas
and are often much more intuitive than the normal deductions which
would be created by earlier algorithms.",
paper = "Pfen87.pdf"
}

\end{chunk}

\index{Comon, Hubert}
\begin{chunk}{axiom.bib}
@incollection{Como01,
author = "Comon, Hubert",
title = {{Inductionless Induction}},
booktitle = "Handbook of Automated Reasoning",
comment = "Chapter 14",
publisher = "Elsevier",
year = "2001",
paper = "Como01.ps"
}

\end{chunk}

\index{Schmitt, P.H.}
\begin{chunk}{axiom.bib}
@article{Schm86,
author = "Schmitt, P.H.",
title = {{Computational Aspects of Three-Valued Logic}},
journal = "LNCS",
volume = "230",
year = "1986",
abstract =
"This paper investigates a three-valued logic $L_3$, that has been
introduced in the study of natural language semantics. A complete
proof system based on a three-valued analogon of negative resolution
is presented. A subclass of $L_3$ corresponding to Horn clauses in
two-valued logic is defined. Its model theoretic properties are
studied and it is shown to admit a PROLOG-style proof procedure.",
paper = "Schm86.pdf"
}

\end{chunk}
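The connectives of a three-valued logic like the $L_3$ of this abstract can be illustrated with Kleene's strong three-valued tables. The sketch below uses Python's {\tt None} as the third truth value; it is an illustration only, and the connectives of Schmitt's $L_3$ may differ in detail.

```python
# Strong Kleene three-valued logic: True, False, and None
# (None plays the role of the third, "undefined" value).
# Illustrative sketch; Schmitt's L3 may define the connectives differently.

def k_not(a):
    return None if a is None else not a

def k_and(a, b):
    if a is False or b is False:   # False dominates conjunction
        return False
    if a is None or b is None:
        return None
    return True

def k_or(a, b):
    if a is True or b is True:     # True dominates disjunction
        return True
    if a is None or b is None:
        return None
    return False

# Unlike in two-valued logic, "a or not a" need not be True:
print(k_or(None, k_not(None)))   # -> None
print(k_and(False, None))        # -> False
```

Note that a clause-based proof procedure over such a logic must track the third value explicitly, which is what motivates the three-valued analogon of negative resolution in the paper.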

\index{Comon, Hubert}
\index{Lescanne, Pierre}
\begin{chunk}{axiom.bib}
@article{Como89,
author = "Comon, Hubert and Lescanne, Pierre",
title = {{Equational Problems and Disunification}},
journal = "J. Symbolic Computation",
volume = "7",
number = "3-4",
year = "1989",
pages = "371-425",
abstract =
"Roughly speaking, an equational problem is a first order formula
whose only predicate symbol is $=$. We propose some rules for the
transformation of equational problems and study their correctness in
various models. Then, we give completeness results with respect to
some 'simple' problems called solved forms. Such completeness results
still hold when adding some control which moreover ensures
termination. The termination proofs are given for a 'weak' control and
thus hold for the (large) class of algorithms obtained by restricting
the scope of the rules. Finally, it must be noted that a by-product of
our method is a decision procedure for the validity in the Herbrand
Universe of any first order formula with the only predicate symbol $=$.",
paper = "Como89.pdf"
}

\end{chunk}
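The simplest instance of the transformation-to-solved-form idea in this abstract is syntactic unification: a set of equations is rewritten into a solved form $\{x_1 = t_1, \ldots\}$ or shown unsolvable. The sketch below illustrates only that special case (the paper treats arbitrary first-order formulas over $=$, including disequations); the term encoding is ours, and the occurs-check is omitted for brevity.

```python
# Terms: a variable is a string; a compound term is a tuple
# (function_symbol, arg1, ..., argn). Illustrative sketch only.

def is_var(t):
    return isinstance(t, str)

def walk(t, subst):
    # Follow variable bindings to their current value.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(eqs, subst=None):
    """Transform a list of equations into a solved form (a substitution
    dict) or return None on a clash.  No occurs-check is performed."""
    subst = dict(subst or {})
    eqs = list(eqs)
    while eqs:
        s, t = eqs.pop()
        s, t = walk(s, subst), walk(t, subst)
        if s == t:
            continue                              # trivial: delete
        if is_var(s):
            subst[s] = t                          # variable elimination
        elif is_var(t):
            subst[t] = s
        elif s[0] == t[0] and len(s) == len(t):
            eqs.extend(zip(s[1:], t[1:]))         # decomposition
        else:
            return None                           # clash: unsolvable
    return subst

# f(x, g(y)) = f(g(z), g(z))  has solved form  x = g(z), y = z
print(unify([(("f", "x", ("g", "y")), ("f", ("g", "z"), ("g", "z")))]))
```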

\index{Peyton Jones, Simon}
\begin{chunk}{axiom.bib}
@inproceedings{Jone96,
author = "Peyton Jones, Simon",
title = {{Compiling Haskell by Program Transformation: A Report from
the Trenches}},
booktitle = "Proc. European Symposium on Programming",
year = "1996",
publisher = "Springer",
abstract =
"Many compilers do some of their work by means of
correctness-preserving, and hopefully performance-improving, program
transformations. The Glasgow Haskell Compiler (GHC) takes this idea
of 'compilation by transformation' as its war-cry, trying to express
as much as possible of the compilation process in the form of
program transformations.

This paper reports on our practical experience of the
transformational approach to compilation, in the context of a
substantial compiler.",
paper = "Jone96.pdf"
}

\end{chunk}

\index{Hanus, Michael}
\begin{chunk}{axiom.bib}
@article{Hanu90,
author = "Hanus, Michael",
title = {{Compiling Logic Programs with Equality}},
journal = "LNCS",
volume = "456",
year = "1990",
pages = "387-401",
abstract =
"Horn clause logic with equality is an amalgamation of functional and
logic programming languages. A sound and complete operational
semantics for logic programs with equality is based on resolution to
solve literals, and rewriting and narrowing to evaluate functional
expressions. This paper proposes a technique for compiling programs
with these inference rules into programs of a low-level abstract
machine which can be efficiently executed on conventional
architectures. The presented approach is based on an extension of the
Warren abstract machine (WAM). In our approach pure logic programs
without function definitions are compiled in the same way as in the
WAM-approach, and for logic programs with function definitions
particular instructions are generated for occurrences of functions
inside clause bodies. In order to obtain an efficient implementation
of functional computations, a stack of occurrences of function symbols
in goals is managed by the abstract machine. The compiler generates
the necessary instructions for the efficient manipulation of the
occurrence stack from the given equational logic programs.",
paper = "Hanu90.pdf"
}

\end{chunk}

\index{Ballarin, Clemens}
\begin{chunk}{axiom.bib}
@phdthesis{Ball99a,
author = "Ballarin, Clemens",
title = {{Computer Algebra and Theorem Proving}},
school = "Darwin College, University of Cambridge",
year = "1999",
abstract =
"Is the use of computer algebra technology beneficial for
mechanised reasoning in and about mathematical domains? Usually
it is assumed that it is. Many works in this area, however, either
have little reasoning content, or use symbolic computation only to
simplify expressions. In work that has achieved more, the used
methods do not scale up. They trust the computer algebra system
either too much or too little.

Computer algebra systems are not as rigorous as many provers. They
are not logically sound reasoning systems, but collections of
algorithms. We classify soundness problems that occur in computer
algebra systems. While many algorithms and their implementations
are perfectly trustworthy, the semantics of symbols is often
unclear and leads to errors. On the other hand, more robust
approaches to interfacing external reasoners to provers are not
always practical, because the mathematical depth of the proofs on
which computer algebra algorithms are based can be enormous.

Our own approach takes both trustworthiness of the overall system
and efficiency into account. It relies on using only reliable
parts of a computer algebra system, which can be achieved by
choosing a suitable library, and deriving specifications for these
algorithms from their literature.

We design and implement an interface between the prover Isabelle
and the computer algebra library Sumit and use it to prove
non-trivial theorems from coding theory. This is based on the
mechanisation of the algebraic theories of rings and
polynomials. Coding theory is an area where proofs do have a
substantial amount of computational content. Also, it is realistic
to assume that the verification of an encoding or decoding device
could be undertaken in, and indeed, be simplified by, such a
system.

The reason why semantics of symbols is often unclear in current
computer algebra systems is not mathematical difficulty, but the
design of those systems. For Gaussian elimination we show how the
soundness problem can be fixed by a small extension, and without
losing efficiency. This is a prerequisite for the efficient use of
the algorithm in a prover.",
paper = "Ball99a.pdf",
keywords = "axiomref, printed"
}

\end{chunk}

\index{Nederpelt, Rob}
\index{Geuvers, Herman}
\begin{chunk}{axiom.bib}
@book{Nede14,
author = "Nederpelt, Rob and Geuvers, Herman",
title = {{Type Theory and Formal Proof}},
year = "2014",
publisher = "Cambridge University Press",
isbn = "978-1-107-03650-5"
}

\end{chunk}

\index{Robinson, Alan}
\index{Voronkov, Andrei}
\begin{chunk}{axiom.bib}
@book{Robi01,
author = "Robinson, Alan and Voronkov, Andrei",
title = {{Handbook of Automated Reasoning (2 Volumes)}},
year = "2001",
publisher = "MIT Press",
isbn = "0-262-18223-8"
}

\end{chunk}

\index{Barendregt, Henk}
\index{Wiedijk, Freek}
\begin{chunk}{axiom.bib}
@article{Bare05,
author = "Barendregt, Henk and Wiedijk, Freek",
title = {{The Challenge of Computer Mathematics}},
journal = "Phil. Trans. Royal Society",
volume = "363",
pages = "2351-2375",
year = "2005",
abstract =
"Progress in the foundations of mathematics has made it possible to
formulate all thinkable mathematical concepts, algorithms and proofs
in one language and in an impeccable way. This is not in spite of, but
partially based on the famous results of Gödel and Turing. In this way
statements are about mathematical objects and algorithms, proofs show
the correctness of statements and computations, and computations are
dealing with objects and proofs. Interactive computer systems for a
full integration of defining, computing and proving are based on
this. The human defines concepts, constructs algorithms and provides
proofs, while the machine checks that the definitions are well formed
and the proofs and computations are correct. Results formalized so far
demonstrate the feasibility of this ‘computer mathematics’. Also there
are very good applications. The challenge is to make the systems more
mathematician-friendly, by building libraries and tools. The eventual
goal is to help humans to learn, develop, communicate, referee and
apply mathematics.",
paper = "Bare05.pdf",
keywords = "printed"
}

\end{chunk}

\index{Klaeren, Herbert A.}
\begin{chunk}{axiom.bib}
@article{Klae84,
author = "Klaeren, Herbert A.",
title = {{A Constructive Method for Abstract Algebraic Software
Specification}},
journal = "Theoretical Computer Science",
volume = "30",
year = "1984",
pages = "139-204",
abstract =
"A constructive method for abstract algebraic software
specification is presented, where the operations are not
implicitly specified by equations but by an explicit recursion on
the generating operations of an algebra characterizing the
underlying data structure. The underlying algebra itself may be
equationally specified since we cannot assume that all data
structures will correspond to free algebras. This implies that we
distinguish between generating and defined operations and that the
underlying algebra has a mechanism of well-founded decomposition
w.r.t. the generating operations. We show that the explicit
specification of operations using so-called structural recursive
schemata offers advantages over purely equational specifications,
especially concerning the safeness of enrichments, the ease of
semantics description and the separation between the underlying
data structure and the operations defined on it.",
keywords = "printed"
}

\end{chunk}
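The abstract's distinction between generating operations and defined operations given by structural recursive schemata can be sketched concretely. Below, Peano naturals are built from the generators {\tt zero} and {\tt succ}, and {\tt add} is a defined operation introduced by explicit recursion on those generators; the encoding is ours, not Klaeren's notation.

```python
# Generators of the data structure (Peano naturals as nested tuples).
ZERO = ("zero",)

def succ(n):
    return ("succ", n)

# A *defined* operation, given by a structural recursive schema over
# the generators of its first argument:
#   add(zero, n)    = n
#   add(succ(m), n) = succ(add(m, n))
def add(m, n):
    if m[0] == "zero":
        return n
    return succ(add(m[1], n))

def to_int(n):
    # Interpret a Peano numeral as a Python int.
    return 0 if n[0] == "zero" else 1 + to_int(n[1])

two = succ(succ(ZERO))
three = succ(two)
print(to_int(add(two, three)))  # -> 5
```

Because {\tt add} recurses only along the generators, totality and the safeness of the enrichment are immediate, which is the advantage the abstract claims for structural recursive schemata over unrestricted equational specifications.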

\index{Pfenning, Frank}
\begin{chunk}{axiom.bib}
@misc{Pfen04,
author = "Pfenning, Frank",
title = {{Automated Theorem Proving}},
year = "2004",
comment = "Draft of Spring 2004",
keywords = "printed"
}

\end{chunk}

\index{Coquand, Thierry}
\index{Huet, Gerard}
\begin{chunk}{axiom.bib}
@article{Coqu88,
author = "Coquand, Thierry and Huet, Gerard",
title = {{The Calculus of Constructions}},
journal = "Information and Computation",
volume = "76",
year = "1988",
publisher = "Academic Press",
paper = "Coqu88.pdf",
keywords = "printed"
}

\end{chunk}

\index{Okasaki, Chris}
\begin{chunk}{axiom.bib}
@phdthesis{Okas96,
author = "Okasaki, Chris",
title = {{Purely Functional Data Structures}},
school = "Carnegie Mellon University",
year = "1996",
comment = "CMU-CS-96-177",
isbn = "978-0521663502",
abstract =
"When a C programmer needs an efficient data structure for a
particular problem, he or she can often simply look one up in any of
a number of good textbooks or handbooks. Unfortunately, programmers
in functional languages such as Standard ML or Haskell do not have
this luxury. Although some data structures designed for imperative
languages such as C can be quite easily adapted to a functional
setting, most cannot, usually because they depend in crucial ways on
assignments, which are disallowed, or at least discouraged, in
functional languages. To address this imbalance, we describe several
techniques for designing functional data structures, and numerous
original data structures based on these techniques, including multiple
variations of lists, queues, double-ended queues, and heaps, many
supporting more exotic features such as random access or efficient
catenation.

In addition, we expose the fundamental role of lazy evaluation in
amortized functional data structures. Traditional methods of
amortization break down when old versions of a data structure, not
just the most recent, are available for further processing. This
property is known as persistence , and is taken for granted in
functional languages. On the surface, persistence and amortization
appear to be incompatible, but we show how lazy evaluation can be used
to resolve this conflict, yielding amortized data structures that are
efficient even when used persistently. Turning this relationship
between lazy evaluation and amortization around, the notion of
amortization also provides the first practical techniques for
analyzing the time requirements of non-trivial lazy programs.

Finally, our data structures offer numerous hints to programming
language designers, illustrating the utility of combining strict and
lazy evaluation in a single language, and providing non-trivial
examples using polymorphic recursion and higher-order, recursive
modules.",
paper = "Okas96.pdf"
}

\end{chunk}
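The persistence problem the abstract describes can be seen in the classic two-list FIFO queue. The sketch below (ours, in strict Python rather than ML or Haskell) is persistent because every operation returns a new queue; but in a strict language its $O(1)$ amortized bound only holds for single-threaded use, which is exactly the conflict Okasaki shows lazy evaluation resolves.

```python
# A persistent FIFO queue as a pair of immutable tuples (front, back).
# Operations never mutate; old versions remain usable ("persistence").
EMPTY = ((), ())

def snoc(q, x):
    # Enqueue at the back.
    front, back = q
    return (front, (x,) + back)

def head(q):
    front, back = q
    if front:
        return front[0]
    return tuple(reversed(back))[0]   # reverse the back on demand

def tail(q):
    front, back = q
    if front:
        return (front[1:], back)
    rev = tuple(reversed(back))
    return (rev[1:], ())

q1 = snoc(snoc(EMPTY, 1), 2)
q2 = snoc(q1, 3)                  # q1 is still a valid queue
print(head(q1), head(tail(q2)))   # -> 1 2
```

Repeatedly calling {\tt tail} on the *same* saved version re-pays the reversal each time, breaking the amortized bound; Okasaki's banker's queues memoize that reversal via lazy evaluation so each suspension is forced at most once.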

\index{Letouzey, Pierre}
\begin{chunk}{axiom.bib}
@inproceedings{Leto04,
author = "Letouzey, Pierre",
title = {{A New Extraction for Coq}},
booktitle = "Types for Proofs and Programs. TYPES 2002",
publisher = "Springer",
pages = "617-635",
year = "2004",
abstract =
"We present here a new extraction mechanism for the Coq proof
assistant. By extraction, we mean automatic generation of
functional code from Coq proofs, in order to produce certified
programs. In former versions of Coq, the extraction mechanism
suffered several limitations and in particular worked only with a
subset of the language. We first discuss difficulties encountered
and solutions proposed to remove these limitations. Then we give a
proof of correctness for a theoretical model of the new
extraction. Finally we describe the actual implementation.",
paper = "Leto04.pdf",
keywords = "printed"
}

\end{chunk}

\index{Abrams, Marshall D.}
\index{Zelkowitz, Marvin V.}
\begin{chunk}{axiom.bib}
@article{Abra95,
author = "Abrams, Marshall D. and Zelkowitz, Marvin V.",
title = {{Striving for Correctness}},
journal = "Computers and Security",
volume = "14",
year = "1995",
pages = "719-738",
paper = "Abra95.pdf"
}

\end{chunk}

\index{Parnas, David L.}
\begin{chunk}{axiom.bib}
@article{Parn72,
author = "Parnas, David L.",
title = {{A Technique for Software Module Specification with
Examples}},
journal = "CACM",
volume = "15",
number = "5",
year = "1972",
pages = "330-336",
abstract =
"This paper presents an approach to writing specifications for
parts of software systems. The main goal is to provide
specifications sufficiently precise and complete that other pieces
of software can be written to interact with the piece specified
without additional information. The secondary goal is to include
in the specification no more information than necessary to meet
the first goal. The technique is illustrated by means of a variety
of examples from a tutorial system.",
paper = "Parn72.pdf",
keywords = "printed"
}

\end{chunk}

\index{Johnson, C.W.}
\begin{chunk}{axiom.bib}
@article{John96,
author = "Johnson, C.W.",
title = {{Literate Specifications}},
journal = "Software Engineering Journal",
volume = "11",
number = "4",
year = "1996",
pages = "225-237",
paper = "John96.pdf",
abstract =
"The increasing scale and complexity of software is imposing serious
burdens on many industries. Formal notations, such as Z, VDM and
temporal logic, have been developed to address these problems. There
are, however, a number of limitations that restrict the use of
mathematical specifications for large-scale software development. Many
projects take months or years to complete. This creates difficulties
because abstract mathematical requirements cannot easily be used by
new members of a development team to understand past design
decisions. Formal specifications describe what software must do, they
do not explain why it must do it. In order to avoid these limitations,
a literate approach to software engineering is proposed. This
technique integrates a formal specification language and a semi-formal
design rationale. The Z schema calculus is used to represent what a
system must do. In contrast, the Questions, Options and Criteria
notation is used to represent the justifications that lie behind
development decisions. Empirical evidence is presented that suggests
the integration of these techniques provides significant benefits over
previous approaches to mathematical analysis and design techniques. A
range of tools is described that have been developed to support our
literate approach to the specification of large-scale software systems.",
keywords = "printed"
}

\end{chunk}

\index{Finney, Kate}
\begin{chunk}{axiom.bib}
@article{Finn96,
author = "Finney, Kate",
title = {{Mathematical Notation in Formal Specifications: Too
Difficult for the Masses?}},
journal = "IEEE Trans. on Software Engineering",
volume = "22",
number = "2",
year = "1996",
pages = "158-159",
abstract =
"The phrase 'not much mathematics required' can imply a
variety of skill levels. When this phrase is applied to computer
scientists, software engineers, and clients in the area of formal
specification, the word 'much' can be widely misinterpreted with
disastrous consequences. A small experiment in reading
specifications revealed that students already trained in discrete
mathematics and the specification notation performed very poorly;
much worse than could reasonably be expected if formal methods
proponents are to be believed.",
paper = "Finn96.pdf",
keywords = "printed"
}

\end{chunk}

\index{Moore, Andrew P.}
\index{Payne Jr., Charles N.}
\begin{chunk}{axiom.bib}
@inproceedings{Moor96,
author = "Moore, Andrew P. and Payne Jr., Charles N.",
title = {{Increasing Assurance with Literate Programming Techniques}},
booktitle = "Conf. on Computer Assurance COMPASS'96",
publisher = "National Institute of Standards and Technology",
year = "1996",
pages = "187-198",
abstract =
"The assurance argument that a trusted system satisfies its
information security requirements must be convincing, because the
argument supports the accreditation decision to allow the computer to
process classified information in an operational
environment. Assurance is achieved through understanding, but some
evidence that supports the assurance argument can be difficult to
understand. This paper describes a novel application of a technique,
called literate programming [11], that significantly improves the
readability of the assurance argument while maintaining its
consistency with formal specifications that are input to specification
and verification systems. We describe an application of this
technique to a simple example and discuss the lessons learned from
this effort.",
paper = "Moor96.pdf",
keywords = "printed"
}

\end{chunk}

\index{Nissanke, Nimal}
\begin{chunk}{axiom.bib}
@book{Niss99,
author = "Nissanke, Nimal",
title = {{Formal Specification}},
publisher = "Springer",
year = "1999",
isbn = "978-1-85233-002-6",
paper = "Niss99.pdf"
}

\end{chunk}

\index{Vienneau, Robert L.}
\begin{chunk}{axiom.bib}
@misc{Vien93,
author = "Vienneau, Robert L.",
title = {{A Review of Formal Methods}},
year = "1993",
paper = "Vien93.pdf",
keywords = "printed"
}

\end{chunk}

\index{Murthy, S.G.K.}
\index{Sekharam, K. Raja}
\begin{chunk}{axiom.bib}
@article{Murt09,
author = "Murthy, S.G.K. and Sekharam, K. Raja",
title = {{Software Reliability through Theorem Proving}},
journal = "Defence Science Journal",
volume = "59",
number = "3",
year = "2009",
pages = "314-317",
abstract =
"Improving software reliability of mission-critical systems is
widely recognised as one of the major challenges. Early detection
of errors in software requirements, designs and implementation,
need rigorous verification and validation techniques. Several
techniques comprising static and dynamic testing approaches are
used to improve reliability of mission critical software; however
it is hard to balance development time and budget with software
reliability. Particularly using dynamic testing techniques, it is
hard to ensure software reliability, as exhaustive testing is not
possible. On the other hand, formal verification techniques
utilise mathematical logic to prove correctness of the software
based on given specifications, which in turn improves the
reliability of the software. Theorem proving is a powerful formal
verification technique that enhances the software reliability for
mission-critical aerospace applications. This paper discusses the
issues related to software reliability and theorem proving used to
enhance software reliability through formal verification
technique, based on the experiences with STeP tool, using the
conventional and internationally accepted methodologies, models,
theorem proving techniques available in the tool without proposing
a new model.",
paper = "Murt09.pdf",
keywords = "printed"
}

\end{chunk}

\index{Paulson, Lawrence C.}
\begin{chunk}{axiom.bib}
@incollection{Paul90a,
author = "Paulson, Lawrence C.",
title = {{Isabelle: The Next 700 Theorem Provers}},
booktitle = "Logic and Computer Science",
publisher = "Academic Press",
pages = "361-386",
year = "1990",
paper = "Paul90a.pdf"
}

\end{chunk}

\index{Beeson, M.}
\begin{chunk}{axiom.bib}
@article{Bees16,
author = "Beeson, M.",
title = {{Mixing Computations and Proofs}},
journal = "J. of Formalized Reasoning",
volume = "9",
number = "1",
pages = "71-99",
year = "2016",
abstract =
"We examine the relationship between proof and computation in
mathematics, especially in formalized mathematics. We compare the
various approaches to proofs with a significant computational
component, including (i) verifying the algorithms, (ii) verifying the
results of the unverified algorithms, and (iii) trusting an external
computation.",
paper = "Bees16.pdf",
keywords = "printed"
}

\end{chunk}

\index{Beeson, Michael}
\begin{chunk}{axiom.bib}
@misc{Bees14,
author = "Beeson, M.",
title = {{Mixing Computations and Proofs}},
comment = "slides",
year = "2014",
paper = "Bees14.pdf"
}

\end{chunk}

\index{Gunter, Elsa L.}
\begin{chunk}{axiom.bib}
@inproceedings{Gunt89a,
author = "Gunter, Elsa L.",
title = {{Extensions to Logic Programming Motivated by the Construction
of a Generic Theorem Prover}},
booktitle = "Int. Workshop on Extensions of Logic Programming",
publisher = "Springer",
year = "1989",
pages = "223-244",
abstract =
"In this article, we discuss several possible extensions to
traditional logic programming languages. The specific extensions
proposed here fall into two categories: logical extensions and the
addition of constructs to allow for increased control. There is a
unifying theme to the proposed logical extensions, and that is the
scoped introduction of extensions to a programming context. More
specifically, these extensions are the ability to introduce variables
whose scope is limited to the term in which they occur (i.e.
$\lambda$-bound variables within $\lambda$-terms), the ability to introduce into a goal a
fresh constant whose scope is limited to the derivation of that goal,
and the ability to introduce into a goal a program clause whose scope
is limited to the derivation of that goal. The purpose of the
additions for increased control is to facilitate the raising and
handling of failures.

To motivate these various extensions, we have repeatedly appealed to
examples related to the construction of a generic theorem prover. It
is our thesis that this problem domain is specific enough to lend
focus when one is considering various language constructs, and yet
complex enough to encompass many of the general difficulties found in
other areas of symbolic computation.",
paper = "Gunt89a.pdf"
}

\end{chunk}

\index{Paulson, Lawrence C.}
\index{Smith, Andrew W.}
\begin{chunk}{axiom.bib}
@inproceedings{Paul89,
author = "Paulson, Lawrence C. and Smith, Andrew W.",
title = {{Logic Programming, Functional Programming, and
Inductive Definitions}},
booktitle = "Int. Workshop on Extensions of Logic Programming",
publisher = "Springer",
year = "1989",
pages = "283-309",
abstract =
"The unification of logic and functional programming, like the Holy
Grail, is sought by countless people [6, 14]. In reporting our
attempt, we first discuss the motivation. We argue that logic
programming is still immature, compared with functional programming,
because few logic programs are both useful and pure. Functions can
help to purify logic programming, for they can eliminate certain uses
of the cut and can express certain negations positively.

More generally, we suggest that the traditional paradigm -- logic
programming as first-order logic -- is seriously out of step with
practice. We offer an alternative paradigm. We view the logic program
as an inductive definition of sets and relations. This view explains
certain uses of Negation as Failure, and explains why most attempts to
extend PROLOG to larger fragments of first-order logic have not been
successful. It suggests a logic language with functions,
incorporating equational unification.

We have implemented a prototype of this language. It is very slow,
but complete, and appears to be faster than some earlier
implementations of narrowing. Our experiments illustrate the
programming style and shed light on the further development of such
languages.",
paper = "Paul89.pdf"
}

\end{chunk}

\index{Weirich, Jim}
\begin{chunk}{axiom.bib}
@misc{Weir12,
author = "Weirich, Jim",
title = {{Y Not? -- Adventures in Functional Programming}},
year = "2012",
comment = "Explains the Y-Combinator"
}

\end{chunk}
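The subject of Weirich's talk can be illustrated with a short sketch. The following uses the Z combinator (the strict-language variant of Y, since Python evaluates eagerly) to define factorial without explicit self-reference. The names and the Python rendering are illustrative assumptions, not taken from the talk itself.

```python
# Z combinator: the strict-language variant of the Y combinator.
# (Python evaluates eagerly, so the classic Y would recurse forever.)
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Define factorial with no explicit self-reference: 'rec' is supplied
# by Z and stands for the function being defined.
fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
```

Here `Z` ties the recursive knot that a named `def` would otherwise provide.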

\index{Beeson, Michael}
\begin{chunk}{axiom.bib}
@article{Bees06,
author = "Beeson, Michael",
title = {{Mathematical Induction in Otter-Lambda}},
journal = "J. Automated Reasoning",
volume = "36",
number = "4",
pages = "311-344",
abstract =
"Otter-lambda is Otter modified by adding code to implement an
algorithm for lambda unification. Otter is a resolution-based,
clause-language first-order prover that accumulates deduced clauses
and uses strategies to control the deduction and retention of
clauses. This is the first time that such a first-order prover has
been combined in one program with a unification algorithm capable of
instantiating variables to lambda terms to assist in the
deductions. The resulting prover has all the advantages of the
proof-search algorithm of Otter (speed, variety of inference rules,
excellent handling of equality) and also the power of lambda
unification. We illustrate how these capabilities work well together
by using Otter-lambda to find proofs by mathematical induction. Lambda
unification instantiates the induction schema to find a useful
instance of induction, and then Otter's first-order reasoning can be
used to carry out the base case and induction step. If necessary,
induction can be used for those, too. We present and discuss a variety
of examples of inductive proofs found by Otter-lambda: some in pure
Peano arithmetic, some in Peano arithmetic with defined predicates,
some in theories combining algebra and the natural numbers, some
involving algebraic simplification (used in the induction step) by
simplification code from MathXpert, and some involving list induction
instead of numerical induction. These examples demonstrate the
feasibility and usefulness of adding lambda unification to a
first-order prover.",
paper = "Bees06.pdf",
keywords = "printed"
}

\end{chunk}

\index{Barendregt, Henk}
\index{Barendsen, Erik}
\begin{chunk}{axiom.bib}
@article{Bare02,
author = "Barendregt, Henk and Barendsen, Erik",
title = {{Autarkic Computations in Formal Proofs}},
journal = "J. Automated Reasoning",
volume = "28",
pages = "321-336",
year = "2002",
abstract =
"Formal proofs in mathematics and computer science are being studied
because these objects can be verified by a very simple computer
program. An important open problem is whether these formal proofs can
be generated with an effort not much greater than writing a
mathematical paper in, say, LaTeX. Modern systems for proof
development make the formalization of reasoning relatively
easy. However, formalizing computations in such a manner that the
results can be used in formal proofs is not immediate. In this paper
we show how to obtain formal proofs of statements such as Prime(61) in
the context of Peano arithmetic or $(x + 1)(x + 1) = x^2 + 2x + 1$ in
the context of rings. We hope that the method will help bridge the gap
between the efficient systems of computer algebra and the reliable
systems of proof development.",
paper = "Bare02.pdf",
keywords = "printed"
}

\end{chunk}

\index{Beeson, Michael}
\begin{chunk}{axiom.bib}
@article{Bees95,
author = "Beeson, Michael",
title = {{Using Nonstandard Analysis to Ensure the Correctness of
Symbolic Computations}},
journal = "Int. J. of Foundations of Computer Science",
volume = "6",
number = "3",
pages = "299-338",
year = "1995",
abstract =
"Nonstandard analysis has been put to use in a theorem-prover,
where it assists in the analysis of formulae involving limits. The
theorem-prover in question is used in the computer program
Mathpert to ensure the correctness of calculations in
calculus. Although non-standard analysis is widely viewed as
non-constructive, it can alternately be viewed as a method of
reducing logical manipulation (of formulae with quantifiers) to
computation (with rewrite rules). We give a logical theory of
nonstandard analysis which is implemented in Mathpert. We describe
a procedure for 'elimination of infinitesimals' (also implemented
in Mathpert) and prove its correctness.",
paper = "Bees95.pdf",
keywords = "printed"
}

\end{chunk}

\index{Beeson, Michael}
\index{Wiedijk, Freek}
\begin{chunk}{axiom.bib}
@article{Bees02,
author = "Beeson, Michael and Wiedijk, Freek",
title = {{The Meaning of Infinity in Calculus and Computer Algebra
Systems}},
journal = "LNCS",
volume = "2385",
year = "2002",
abstract =
"We use filters of open sets to provide a semantics justifying the
use of infinity in informal limit calculations in calculus, and in
the same kind of calculations in computer algebra. We compare the
behavior of these filters to the way Mathematica behaves when
calculating with infinity.

We stress the need to have a proper semantics for computer algebra
expressions, especially if one wants to use results and methods
from computer algebra in theorem provers. The computer algebra
method under discussion in this paper is the use of rewrite rules
to evaluate limits involving infinity.",
paper = "Bees02.pdf",
keywords = "printed"
}

\end{chunk}

\index{Zimmer, Jurgen}
\index{Dennis, Louise A.}
\begin{chunk}{axiom.bib}
@article{Zimm02,
author = "Zimmer, Jurgen and Dennis, Louise A.",
title = {{Inductive Theorem Proving and Computer Algebra in the
MathWeb Software Bus}},
journal = "LNCS",
volume = "2385",
year = "2002",
abstract =
"Reasoning systems have reached a high degree of maturity in the last
decade. However, even the most successful systems are usually not
general purpose problem solvers but are typically specialised on
problems in a certain domain. The MathWeb Software Bus (MathWeb-SB) is a
system for combining reasoning specialists via a common software bus.
We describe the integration of the $\lambda$-Clam system, a reasoning
specialist for proofs by induction, into the MathWeb-SB. Due to this
integration, $\lambda$-Clam now offers its theorem proving expertise
to other systems in the MathWeb-SB. On the other hand,
$\lambda$-Clam can use the
services of any reasoning specialist already integrated. We focus on
the latter and describe first experiments on proving theorems by
induction using the computational power of the Maple system within
$\lambda$-Clam.",
paper = "Zimm02.pdf",
keywords = "printed"
}

\end{chunk}

\index{Campbell, J.A.}
\begin{chunk}{axiom.bib}
@article{Camp02,
author = "Campbell, J.A.",
title = {{Indefinite Integration as a Testbed for Developments in
Multi-agent Systems}},
journal = "LNCS",
volume = "2385",
year = "2002",
abstract =
"Coordination of multiple autonomous agents to solve problems that
require each of them to contribute their limited expertise in the
construction of a solution is often ensured by the use of numerical
methods such as vote-counting, payoff functions, game theory and
economic criteria. In areas where there are no obvious numerical methods
for agents to use in assessing other agents’ contributions, many
questions still remain open for research. The paper reports a study of
one such area: heuristic indefinite integration in terms of agents with
different single heuristic abilities which must cooperate in finding
indefinite integrals. It examines the reasons for successes and lack
of success in performance, and draws some general conclusions about
the usefulness of indefinite integration as a field for realistic
tests of methods for multi-agent systems where the usefulness of
'economic' criteria is limited. In this connection, the role of
numerical taxonomy is emphasised.",
paper = "Camp02.pdf",
keywords = "printed"
}

\end{chunk}

\index{Wiedijk, Freek}
\begin{chunk}{axiom.bib}
@misc{Wied18,
author = "Wiedijk, Freek",
title = {{Formalizing 100 Theorems}},
year = "2018"
}

\end{chunk}

\index{Wiedijk, Freek}
\begin{chunk}{axiom.bib}
@misc{Wied08,
author = "Wiedijk, Freek",
title = {{Formal Proof -- Getting Started}},
year = "2008",
paper = "Wied08.pdf"
}

\end{chunk}

\index{Kaliszyk, Cezary}
\begin{chunk}{axiom.bib}
@article{Kali08,
author = "Kaliszyk, Cezary",
title = {{Automating Side Conditions in Formalized Partial Functions}},
journal = "LNCS",
volume = "5144",
year = "2008",
abstract =
"Assumptions about the domains of partial functions are necessary in
state-of-the-art proof assistants. On the other hand when
mathematicians write about partial functions they tend not to
explicitly write the side conditions. We present an approach to
formalizing partiality in real and complex analysis in total
frameworks that allows keeping the side conditions hidden from the
user as long as they can be computed and simplified
automatically. This framework simplifies defining and operating on
partial functions in formalized real analysis in HOL Light. Our
framework allows simplifying expressions under partiality conditions
in a proof assistant in a manner that resembles computer algebra
systems.",
paper = "Kali08.pdf"
}

\end{chunk}

\index{Kaliszyk, Cezary}
\index{Wiedijk, Freek}
\begin{chunk}{axiom.bib}
@article{Kali08a,
author = "Kaliszyk, Cezary and Wiedijk, Freek",
title = {{Merging Procedural and Declarative Proof}},
journal = "LNCS",
volume = "5497",
year = "2008",
abstract =
"There are two different styles for writing natural deduction proofs:
the 'Gentzen' style in which a proof is a tree with the conclusion at
the root and the assumptions at the leaves, and the 'Fitch' style
(also called 'flag' style) in which a proof consists of lines that are
grouped together in nested boxes.

In the world of proof assistants these two kinds of natural deduction
correspond to procedural proofs (tactic scripts that work on one or
more subgoals, like those of the Coq, HOL and PVS systems), and
declarative proofs (like those of the Mizar and Isabelle/Isar
languages).

In this paper we give an algorithm for converting tree style proofs to
flag style proofs. We then present a rewrite system that simplifies
the results.

This algorithm can be used to convert arbitrary procedural proofs to
declarative proofs. It does not work on the level of the proof terms
(the basic inferences of the system), but on the level of the
statements that the user sees in the goals when constructing the
proof.

The algorithm from this paper has been implemented in the ProofWeb
interface to Coq. In ProofWeb a proof that is given as a Coq proof
script (even with arbitrary Coq tactics) can be displayed both as a
tree style and as a flag style proof.",
paper = "Kali08a.pdf",
keywords = "printed"
}

\end{chunk}

\index{Akbarpour, Behzad}
\index{Paulson, Lawrence C.}
\begin{chunk}{axiom.bib}
@article{Akba08a,
author = "Akbarpour, Behzad and Paulson, Lawrence C.",
title = {{MetiTarski: An Automatic Prover for the Elementary Functions}},
journal = "LNCS",
volume = "5144",
year = "2008",
abstract =
"Many inequalities involving the functions ln, exp, sin, cos, etc.,
can be proved automatically by MetiTarski: a resolution theorem prover
(Metis) modified to call a decision procedure (QEPCAD) for the theory
of real closed fields. The decision procedure simplifies clauses by
deleting literals that are inconsistent with other algebraic facts,
while deleting as redundant clauses that follow algebraically from
other clauses. MetiTarski includes special code to simplify
arithmetic expressions.",
paper = "Akba08a.pdf"
}

\end{chunk}

\index{Cramer, Marcos}
\begin{chunk}{axiom.bib}
@misc{Cram14,
author = "Cramer, Marcos",
title = {{Modelling the usage of partial functions and undefined
terms using presupposition theory}},
year = "2014",
isbn = "978-1-84890-130-8",
publisher = "College Publications",
pages = "71-88",
abstract =
"We describe how the linguistic theory of presuppositions can be used
to analyse and model the usage of partial functions and undefined
terms in mathematical texts. We compare our account to other accounts
of partial functions and undefined terms, showing how our account
models the actual usage of partial functions and undefined terms more
faithfully than existing accounts. The model described in this paper
has been developed for the Naproche system, a computer system for
proof-checking mathematical texts written in controlled natural
language, and has largely been implemented in this system.",
paper = "Cram14.pdf",
keywords = "printed"
}

\end{chunk}

\index{Freundt, Sebastian}
\index{Horn, Peter}
\index{Konovalov, Alexander}
\index{Linton, Steve}
\index{Roozemond, Dan}
\begin{chunk}{axiom.bib}
@article{Freu08,
author = "Freundt, Sebastian and Horn, Peter and Konovalov, Alexander
and Linton, Steve and Roozemond, Dan",
title = {{Symbolic Computation Software Composability}},
journal = "LNCS",
volume = "5144",
year = "2008",
abstract =
"We present three examples of the composition of Computer Algebra
Systems to illustrate the progress on a composability infrastructure
as part of the SCIEnce (Symbolic Computation Infrastructure for
Europe) project 1 . One of the major results of the project so far is
an OpenMath based protocol called SCSCP (Symbolic Computation Software
Composability Protocol). SCSCP enables the various software packages
for example to exchange mathematical objects, request calculations,
and store and retrieve remote objects, either locally or across the
internet. The three examples show the current state of the GAP, KANT,
and MuPAD software packages, and give a demonstration of exposing
Macaulay using a newly developed framework.",
paper = "Freu08.pdf",
keywords = "printed"
}

\end{chunk}

\index{Kozen, Dexter}
\index{Palsberg, Jens}
\index{Schwartzbach, Michael I.}
\begin{chunk}{axiom.bib}
@article{Koze93,
author = "Kozen, Dexter and Palsberg, Jens and Schwartzbach, Michael I.",
title = {{Efficient Recursive Subtyping}},
journal = "Mathematical Structures in Computer Science",
volume = "5",
number = "1",
pages = "113-125",
year = "1995",
abstract =
"Subtyping in the presence of recursive types for the
$\lambda$-calculus was studied by Amadio and Cardelli in 1991. In that
paper they showed that the problem of deciding whether one recursive
type is a subtype of another is decidable in exponential time. In
this paper we give an $O(n^2)$ algorithm. Our algorithm is based on a
simplification of the definition of the subtype relation, which allows
us to reduce the problem to the emptiness problem for a certain finite
automaton with quadratically many states. It is known that equality
of recursive types and the covariant Böhm order can be decided
efficiently by means of finite automata, since they are just language
equality and language inclusion, respectively. Our results extend the
automata-theoretic approach to handle orderings based on
contravariance.",
paper = "Koze93.pdf"
}

\end{chunk}

\index{Martelli, Alberto}
\index{Montanari, Ugo}
\begin{chunk}{axiom.bib}
@article{Mart82,
author = "Martelli, Alberto and Montanari, Ugo",
title = {{An Efficient Unification Algorithm}},
journal = "ACM TOPLAS",
volume = "4",
number = "2",
pages = "258-282",
year = "1982",
abstract =
"The unification problem in first-order predicate calculus is
described in general terms as the solution of a system of equations,
and a nondeterministic algorithm is given. A new unification
algorithm, characterized by having the acyclicity test efficiently
embedded into it, is derived from the nondeterministic one, and a
PASCAL implementation is given. A comparison with other well-known
unification algorithms shows that the algorithm described here
performs well in all cases.",
paper = "Mart82.pdf"
}

\end{chunk}
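The equational view of unification in the Martelli-Montanari abstract can be sketched briefly. This is a naive substitution-based unifier with the occurs check, not the paper's efficient algorithm; the term representation (tuples for compound terms, `?`-prefixed strings for variables) is an assumption made for illustration.

```python
def walk(t, s):
    # Follow variable bindings in substitution s.
    while isinstance(t, str) and t.startswith("?") and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    # Occurs (acyclicity) check: does variable v appear in term t?
    t = walk(t, s)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, s) for a in t[1:])
    return False

def unify(equations):
    """Solve a system of term equations; return a substitution or None."""
    s = {}
    eqs = list(equations)
    while eqs:
        a, b = eqs.pop()
        a, b = walk(a, s), walk(b, s)
        if a == b:
            continue
        if isinstance(a, str) and a.startswith("?"):
            if occurs(a, b, s):
                return None                 # cyclic binding rejected
            s[a] = b
        elif isinstance(b, str) and b.startswith("?"):
            eqs.append((b, a))              # orient the equation
        elif (isinstance(a, tuple) and isinstance(b, tuple)
              and a[0] == b[0] and len(a) == len(b)):
            eqs.extend(zip(a[1:], b[1:]))   # decompose f(..) = f(..)
        else:
            return None                     # functor clash
    return s
```

For example, unifying `f(?x, g(?y))` with `f(g(a), ?x)` binds `?y` to `a` and `?x` to `g(?y)`.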

\index{Pottier, Francois}
\begin{chunk}{axiom.bib}
@phdthesis{Pott98,
author = "Pottier, Francois",
title = {{Type Inference in the Presence of Subtyping: From Theory
to Practice}},
school = "Universite Paris 7",
year = "1998",
comment = "INRIA Research Report RR-3483",
abstract =
"From a purely theoretical point of view, type inference for a
functional language with parametric polymorphism and subtyping
poses little difficulty. Indeed, it suffices to generalize the
inference algorithm used in the ML language, so as to deal with
type inequalities, rather than equalities. However, the number of
such inequalities is linear in the program size -- whence, from a
practical point of view, a serious efficiency and readability
problem.

To solve this problem, one must simplify the inferred
constraints. So, after studying the logical properties of
subtyping constraints, this work proposes several simplification
algorithms. They combine seamlessly, yielding a homogeneous, fully
formal framework, which directly leads to an efficient
implementation. Although this theoretical study is performed in a
simplified setting, numerous extensions are possible. Thus, this
framework is realistic and should allow a practical appearance of
subtyping in languages with type inference.",
paper = "Pott98.pdf"
}

\end{chunk}

\index{McCarthy, John}
\begin{chunk}{axiom.bib}
@article{Mcca60,
author = "McCarthy, John",
title = {{Recursive Functions of Symbolic Expressions and Their
Computation by Machine, Part I}},
journal = "CACM",
volume = "3",
pages = "184-195",
year = "1960",
paper = "Mcca60.pdf",
keywords = "printed"
}

\end{chunk}

\index{Barendregt, Henk}
\begin{chunk}{axiom.bib}
@incollection{Bare92,
author = "Barendregt, Henk",
title = {{Lambda Calculi with Types}},
booktitle = "Handbook of Logic in Computer Science (vol 2)",
publisher = "Oxford University Press",
year = "1992",
paper = "Bare92.pdf"
}

\end{chunk}

\index{Church, Alonzo}
\begin{chunk}{axiom.bib}
@article{Chur28,
author = "Church, Alonzo",
title = {{On the Law of the Excluded Middle}},
journal = "Bull. of the American Mathematical Society",
volume = "34",
number = "1",
pages = "75-78",
year = "1928",
paper = "Chur28.pdf"
}

\end{chunk}

\index{Mitchell, John}
\begin{chunk}{axiom.bib}
@inproceedings{Mitc84,
author = "Mitchell, John",
title = {{Type Inference and Type Containment}},
booktitle = "Proc. Int. Symp. on Semantics of Data Types",
publisher = "Springer",
isbn = "3-540-13346-1",
year = "1984",
pages = "257-277",
abstract =
"Type inference, the process of assigning types to untyped
expressions, may be motivated by the design of a typed language or
semantical considerations on the meanings of types and expressions.
A typed language GR with polymorphic functions leads to the GR
inference rules. With the addition of an 'oracle' rule for equations
between expressions, the GR rules become complete for a general class
of semantic models of type inference. These inference models are
models of untyped lambda calculus with extra structure similar to
models of the typed language GR. A more specialized set of type
inference rules, the GRS rules, characterize semantic typing when the
functional type $\sigma \rightarrow \tau$ is interpreted as all elements of the model that
map $\sigma$ to $\tau$, and the polymorphic type $\forall t.\sigma$ is interpreted as the
intersection of all $\sigma(\tau)$. Both inference systems may be reformulated
using rules for deducing containments between types. The advantage of
the type inference rules based on containments is that proofs
correspond more closely to the structure of terms.",
paper = "Mitc84.pdf"
}

\end{chunk}

\index{Griesmer, James}
\begin{chunk}{axiom.bib}
@article{Grie11,
author = "Griesmer, James",
title = {{James Griesmer 1929--2011}},
journal = "ACM Communications in Computer Algebra",
volume = "46",
number = "1/2",
year = "2012",
pages = "10-11",
comment = "Obituary",
paper = "Grie11.pdf"
}

\end{chunk}

\index{Daly, Timothy}
\index{Kastner, John}
\index{Mays, Eric}
\begin{chunk}{axiom.bib}
@inproceedings{Daly88,
author = "Daly, Timothy and Kastner, John and Mays, Eric",
title = {{Integrating Rules and Inheritance Networks in a Knowledge-based
Financial Marketing Consultation System}},
booktitle = "Proc. HICSS 21",
volume = "3",
number = "5-8",
pages = "496-500",
year = "1988",
comment = "KROPS",
abstract =
"This paper describes the integrated use of rule-based inference and
an object-centered knowledge representation (inheritance network) in a
financial marketing consultation system. The rules provide a highly
flexible pattern match capability and inference cycle for control. The
inheritance network provides a convenient way to represent the
conceptual structure of the domain. By merging the two techniques, our
financial computation can be shared at the most general level, and
rule inference is carried out at any appropriate level of
generalization. Since domain knowledge is represented independently
from control knowledge, knowledge about a particular problem solving
technique is decoupled from the conditions for its invocation. A large
financial marketing system has been built and examples are given to
show our combined use of rules and inheritance networks.",
paper = "Daly88.pdf"
}

\end{chunk}

2870. By daly on 2018-05-16

Goal: Proving Axiom Correct

\index{Walther, J.S.}
\begin{chunk}{axiom.bib}
@misc{Walt71,
author = "Walther, J.S.",
title = {{A Unified Algorithm for Elementary Functions}},
year = "1971",
abstract =
"This paper describes a single unified algorithm for the calculation
of elementary functions including multiplication, division, sin,
cos, tan, arctan, sinh, cosh, tanh, arctanh, ln, exp and square-root.
The basis for the algorithm is coordinate rotation in a linear,
circular, or hyperbolic coordinate system depending on which function
is to be calculated. The only operations required are shifting,
adding, subtracting and the recall of prestored constants. The
limited domain of convergence of the algorithm is calculated,
leading to a discussion of the modifications required to extend the
domain for floating point calculations.

A hardware floating point processor using the algorithm was built at
Hewlett-Packard Laboratories. The block diagram of the processor, the
microprogram control used for the algorithm, and measures of actual
performance are shown.",
paper = "Walt71.pdf"
}

\end{chunk}
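The coordinate-rotation scheme the abstract describes (CORDIC) can be sketched in a few lines. This is a floating-point model of the circular rotation-mode iteration for illustration only, not Walther's hardware algorithm: where hardware would use arithmetic shifts, the model multiplies by powers of two.

```python
import math

def cordic_sincos(theta, n=32):
    """Rotation-mode CORDIC: returns (sin(theta), cos(theta)).

    Converges only for |theta| <= sum(atan(2**-i)), about 1.74 radians,
    which is the limited domain the abstract mentions.
    """
    angles = [math.atan(2.0 ** -i) for i in range(n)]
    # Pre-divide out the constant gain accumulated by the micro-rotations.
    k = 1.0
    for i in range(n):
        k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = k, 0.0, theta
    for i in range(n):
        d = 1.0 if z >= 0.0 else -1.0          # rotate toward z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return y, x
```

Each iteration uses only adds, subtracts, scaling by `2**-i` (a shift in hardware), and one prestored constant, matching the operations listed in the abstract.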

\index{Friedman, Daniel P.}
\index{Wise, David S.}
\begin{chunk}{axiom.bib}
@techreport{Frie76,
author = "Friedman, Daniel P. and Wise, David S.",
title = {{CONS should not Evaluate its Arguments}},
institution = "Indiana University",
number = "TR44",
year = "1976",
abstract =
"The constructor function which allocates and fills records in
recursive, side-effect-free procedural languages is redefined to be a
non-strict (Vuillemin 1974) elementary operation. Instead of
evaluating its arguments, it builds suspensions of them which are not
coerced until the suspension is accessed by a strict elementary
function. The resulting evaluation procedures are strictly more
powerful than existing schemes for languages such as LISP. The main
results are that Landin's streams are subsumed into McCarthy's LISP
merely by the redefinition of elementary functions, that invocations of
LISP's evaluator can be minimized by redefining the elementary
functions without redefining the interpreter, and as a strong
conjecture, that redefining the elementary functions yields the least
fixed-point semantics for McCarthy's evaluation scheme. This new
insight into the role of construction functions will do much to ease
the interface between recursive programmers and iterative programmers,
as well as the interface between programmers and data structure
designers.",
paper = "Frie76.pdf",
keywords = "printed"
}

\end{chunk}
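The suspension mechanism described in the Friedman-Wise abstract can be modeled directly: `cons` takes unevaluated thunks, and the strict accessors `car`/`cdr` coerce (and memoize) them, which is enough to build Landin-style infinite streams. The helper names below are illustrative, not from the paper.

```python
def delay(thunk):
    """Build a suspension: a memoized, call-by-need wrapper for thunk."""
    cell = {}
    def force():
        if "v" not in cell:
            cell["v"] = thunk()
        return cell["v"]
    return force

def cons(a, d):
    """Non-strict cons: both fields are suspensions, not values."""
    return (a, d)

def car(p):
    return p[0]()   # strict access coerces the suspension

def cdr(p):
    return p[1]()

def integers_from(n):
    """An infinite stream of integers, built without divergence."""
    return cons(delay(lambda: n), delay(lambda: integers_from(n + 1)))

def take(s, k):
    out = []
    for _ in range(k):
        out.append(car(s))
        s = cdr(s)
    return out
```

Nothing past the demanded prefix is ever evaluated, which is the sense in which streams are "subsumed" by redefining the elementary functions.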

\index{Sarma, Gopal}
\index{Hay, Nick J.}
\begin{chunk}{axiom.bib}
@article{Sarm17,
author = "Sarma, Gopal and Hay, Nick J.",
title = {{Robust Computer Algebra, Theorem Proving, and Oracle AI}},
journal = "Informatica",
volume = "41",
number = "3",
year = "2017",
abstract =
"In the context of superintelligent AI systems, the term 'oracle' has
two meanings. One refers to modular systems queried for
domain-specific tasks. Another usage, referring to a class of systems
which may be useful for addressing the value alignment and AI control
problems, is a superintelligent AI system that only answers questions.
The aim of this manuscript is to survey contemporary research problems
related to oracles which align with long-term research goals of AI
safety. We examine existing question answering systems and argue that
their high degree of architectural heterogeneity makes them poor
candidates for rigorous analysis as oracles. On the other hand, we
identify computer algebra systems (CASs) as being primitive examples
of domain-specific oracles for mathematics and argue that efforts to
integrate computer algebra systems with theorem provers, systems which
have largely been developed independent of one another, provide a
concrete set of problems related to the notion of provable safety that
has emerged in the AI safety community. We review approaches to
interfacing CASs with theorem provers, describe well-defined
architectural deficiencies that have been identified with CASs, and
suggest possible lines of research and practical software projects for
scientists interested in AI safety.",
paper = "Sarm17.pdf",
keywords = "printed, axiomref"
}

\end{chunk}

\index{Keller, Chantal}
\begin{chunk}{axiom.bib}
@phdthesis{Kell13,
author = "Keller, C.",
title = {{A Matter of Trust: Skeptical Communication Between Coq and
External Provers}},
school = "Ecole Polytechnique",
year = "2013",
url = "https://www.lri.fr/~keller/Documents-recherche/Publications/thesis13.pdf",
abstract =
"This thesis studies the cooperation between the Coq proof assistant
and external provers through proof witnesses. We concentrate on two
different kinds of provers that can return certificates: first, answers
coming from SAT and SMT solvers can be checked in Coq to increase both
the confidence in these solvers and Coq 's automation; second,
theorems established in interactive provers based on Higher-Order
Logic can be exported to Coq and checked again, in order to offer the
possibility to produce formal developments which mix these two
different logical paradigms. This resulted in two software tools: SMTCoq, a
bi-directional cooperation between Coq and SAT/SMT solvers, and
HOLLIGHTCOQ, a tool importing HOL Light theorems into Coq.

For both tools, we took great care to define a modular and efficient
architecture, based on three clearly separated ingredients: an
embedding of the formalism of the external tool inside Coq which is
carefully translated into Coq terms, a certified checker to establish
the proofs using the certificates and an OCaml preprocessor to transform
proof witnesses coming from different provers into a generic
certificate. This division allows that a change in the format of proof
witnesses only affects the preprocessor, but no proved Coq code.
Another fundamental component for efficiency and modularity is
computational reflection, which exploits the computational power of
Coq to establish generic and small proofs based on the certificates.",
paper = "Kell13.pdf"
}

\end{chunk}

\index{Farmer, William M.}
\begin{chunk}{axiom.bib}
@article{Farm07,
author = "Farmer, William M.",
title = {{Biform Theories in Chiron}},
journal = "LNCS",
volume = "4573",
pages = "66-79",
year = "2007",
abstract =
"An axiomatic theory represents mathematical knowledge declaratively
as a set of axioms. An algorithmic theory represents mathematical
knowledge procedurally as a set of algorithms. A biform theory is
simultaneously an axiomatic theory and an algorithmic theory. It
represents mathematical knowledge both declaratively and procedurally.
Since the algorithms of algorithmic theories manipulate the syntax of
expressions, biform theories -- as well as algorithmic theories -- are
difficult to formalize in a traditional logic without the means to
reason about syntax. Chiron is a derivative of
von-Neumann-Bernays-G̈odel ( NBG ) set theory that is intended to be a
practical, general-purpose logic for mechanizing mathematics. It
includes elements of type theory, a scheme for handling undefinedness,
and a facility for reasoning about the syntax of expressions. It is an
exceptionally well-suited logic for formalizing biform theories. This
paper defines the notion of a biform theory, gives an overview of
Chiron, and illustrates how biform theories can be formalized in Chiron.",
paper = "Farm07.pdf",
keywords = "printed"
}

\end{chunk}

\index{Carette, Jacques}
\index{Farmer, William M.}
\index{Sorge, Volker}
\begin{chunk}{axiom.bib}
@inproceedings{Care07,
author = "Carette, Jacques and Farmer, William M. and Sorge, Volker",
title = {{A Rational Reconstruction of a System for Experimental
Mathematics}},
booktitle = "14th Workshop on Automated Reasoning",
publisher = "unknown",
year = "2007",
abstract =
"In previous papers we described the implementation of a system
which combines mathematical object generation, transformation and
filtering, conjecture generation, proving and disproving for
mathematical discovery in non-associative algebra. While the system
has generated novel, fully verified theorems, their construction
involved a lot of ad hoc communication between disparate systems. In
this paper we carefully reconstruct a specification of a sub-process
of the original system in a framework for trustable communication
between mathematics systems put forth by us. It employs the concept
of biform theories that enables the combined formalisation of the
axiomatic and algorithmic theories behind the generation
process. This allows us to gain a much better understanding of the
original system, and exposes clear generalisation opportunities.",
paper = "Care07.pdf",
keywords = "printed"
}

\end{chunk}

\index{Farmer, William M.}
\begin{chunk}{axiom.bib}
@inproceedings{Farm00,
author = "Farmer, William M.",
title = {{A Proposal for the Development of an Interactive
Mathematics Laboratory for Mathematics Education}},
booktitle = "Workshop on Deduction Systems for Mathematics Education",
pages = "20-25",
year = "2000",
paper = "Farm00.pdf",
keywords = "axiomref, printed"
}

\end{chunk}

\index{Farmer, William M.}
\begin{chunk}{axiom.bib}
@article{Farm04,
author = "Farmer, William M.",
title = {{MKM: A New Interdisciplinary Field of Research}},
journal = "SIGSAM",
volume = "38",
pages = "47-52",
year = "2004",
abstract =
"Mathematical Knowledge Management (MKM) is a new interdisciplinary
field of research in the intersection of mathematics, computer
science, library science, and scientific publishing. Its objective is
to develop new and better ways of managing mathematical knowledge
using sophisticated software tools. Its grand challenge is to create
a universal digital mathematics library (UDML), accessible via the
World-Wide Web, that contains essentially all mathematical knowledge
(intended for the public). The challenges facing MKM are daunting,
but a UDML, even just partially constructed, would transform how
mathematics is learned and practiced.",
paper = "Farm04.pdf",
keywords = "printed, axiomref"
}

\end{chunk}

\index{Farmer, William M.}
\index{Mohrenschildt, Martin v.}
\begin{chunk}{axiom.bib}
@inproceedings{Farm00a,
author = "Farmer, William M. and Mohrenschildt, Martin v.",
title = {{Transformers for Symbolic Computation and Formal Deduction}},
pages = "36-45",
year = "2000",
abstract =
"A transformer is a function that maps expressions to expressions.
Many transformational operators -- such as expression evaluators and
simplifiers, rewrite rules, rules of inference, and decision
procedures -- can be represented by transformers. Computations and
deductions can be formed by applying sound transformers in
sequence. This paper introduces machinery for defining sound
transformers in the context of an axiomatic theory in a formal
logic. The paper is intended to be a first step in a development of an
integrated framework for symbolic computation and formal deduction.",
paper = "Farm00a.pdf",
keywords = "printed"
}

\end{chunk}

\index{Farmer, William M.}
\index{Larjani, Pouya}
\begin{chunk}{axiom.bib}
@misc{Farm14,
author = "Farmer, William M. and Larjani, Pouya",
title = {{Frameworks for Reasoning about Syntax that Utilize
Quotation and Evaluation}},
year = "2014",
abstract =
"It is often useful, if not necessary, to reason about the syntactic
structure of an expression in an interpreted language (i.e., a
language with a semantics). This paper introduces a mathematical
structure called a syntax framework that is intended to be an abstract
model of a system for reasoning about the syntax of an interpreted
language. Like many concrete systems for reasoning about syntax, a
syntax framework contains a mapping of expressions in the
interpreted language to syntactic values that represent the syntactic
structures of the expressions; a language for reasoning about the
syntactic values; a mechanism called quotation to refer to the
syntactic value of an expression; and a mechanism called evaluation to
refer to the value of the expression represented by a syntactic value.
A syntax framework provides a basis for integrating reasoning about
the syntax of the expressions with reasoning about what the
expressions mean. The notion of a syntax framework is used to discuss
how quotation and evaluation can be built into a language and to
define what quasiquotation is. Several examples of syntax frameworks
are presented.",
paper = "Farm14.pdf",
keywords = "printed"
}

\end{chunk}

\index{Carette, Jacques}
\index{Farmer, William M.}
\index{O'Connor, Russell}
\begin{chunk}{axiom.bib}
@misc{Care11c,
author = "Carette, Jacques and Farmer, William M. and O'Connor, Russell",
title = {{MathScheme: Project Description}},
year = "2011",
paper = "Care11c.pdf",
keywords = "printed"
}

\end{chunk}

\index{Carette, Jacques}
\index{Farmer, William M.}
\begin{chunk}{axiom.bib}
@article{Care08,
author = "Carette, Jacques and Farmer, William M.",
title = {{High-Level Theories}},
journal = "LNCS",
volume = "5144",
pages = "232-245",
year = "2008",
abstract =
"We introduce high-level theories in analogy with high-level
programming languages. The basic point is that even though one can
define many theories via simple, low-level axiomatizations, that is
neither an effective nor a comfortable way to work with such theories.
We present an approach which is closer to what users of mathematics
employ, while still being based on formal structures.",
paper = "Care08.pdf",
keywords = "printed"
}

\end{chunk}

\index{Carette, Jacques}
\index{O'Connor, Russell}
\begin{chunk}{axiom.bib}
@article{Care12,
author = "Carette, Jacques and O'Connor, Russell",
title = {{Theory Presentation Combinators}},
journal = "LNCS",
volume = "7362",
year = "2012",
abstract =
"We motivate and give semantics to theory presentation combinators
as the foundational building blocks for a scalable library of
theories. The key observation is that the category of contexts and
fibered categories are the ideal theoretical tools for this
purpose.",
paper = "Care12.pdf",
keywords = "printed"
}

\end{chunk}

\index{Musser, David R.}
\index{Kapur, Deepak}
\begin{chunk}{axiom.bib}
@article{Muss82,
author = "Musser, David R. and Kapur, Deepak",
title = {{Rewrite Rule Theory and Abstract Data Type Analysis}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "77-90",
year = "1982",
paper = "Muss82.pdf",
keywords = "axiomref"
}

\end{chunk}

\index{Lazard, Daniel}
\begin{chunk}{axiom.bib}
@article{Laza82,
author = "Lazard, Daniel",
title = {{Commutative Algebra and Computer Algebra}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "40-48",
year = "1982",
abstract =
"It is well known that computer algebra deals with commutative rings
such as the integers, the rationals, the modular integers and
polynomials over such rings.

On the other hand, commutative algebra is that part of mathematics
which studies commutative rings.

Our aim is to make this link more precise. It will appear that most of
the constructions which appear in classical commutative algebra can be
done explicitly in finite time. However, there are
exceptions. Moreover, most of the known algorithms are intractable:
The problems are generally exponential by themselves, but many
algorithms are over-over-exponential. It needs a lot of work to find
better methods, and to implement them, with the final hope to open a
computation domain larger than this one which is possible by hand.

To illustrate the limits and the possibilities of computerizing
commutative algebra, we describe an algorithm which tests the
primality of a polynomial ideal and we give an example of a single
algebraic extension of fields $K\subset L$ such that there exists an
algorithm of factorization for the polynomials over $K$, but not for
the polynomials over $L$.",
paper = "Laza82.pdf"
}

\end{chunk}

\index{Hearn, Anthony C.}
\begin{chunk}{axiom.bib}
@article{Hear82,
author = "Hearn, Anthony C.",
title = {{REDUCE - A Case Study in Algebra System Development}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "263-272",
year = "1982",
paper = "Hear82.pdf",
keywords = "axiomref"
}

\end{chunk}

\index{Padget, J.A.}
\begin{chunk}{axiom.bib}
@article{Padg82,
author = "Padget, J.A.",
title = {{Escaping from Intermediate Expression Swell: A Continuing Saga}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "256-262",
year = "1982",
abstract =
"The notion of a closed continuation is introduced, coroutines based
on this concept are presented, and function call and return are shown
to be merely a more general form of expression evaluation. Potential
applications for coroutines in algebraic simplification and a
functional dialect of LISP are suggested, by extension, and a specific
example of their use is given in a novel attack on the phenomenon of
intermediate expression swell in polynomial multiplication.",
}

\end{chunk}

\index{Norman, Arthur}
\begin{chunk}{axiom.bib}
@article{Norm82,
author = "Norman, Arthur",
title = {{The Development of a Vector-Based Algebra System}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "237-248",
year = "1982",
paper = "Norm82.pdf"
}

\end{chunk}

\index{Bordoni, L.}
\index{Colagrossi, A.}
\index{Miola, A.}
\begin{chunk}{axiom.bib}
@article{Bord82,
author = "Bordoni, L. and Colagrossi, A. and Miola, A.",
title = {{Linear Algebraic Approach for Computing Polynomial Resultant}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "231-236",
year = "1982",
abstract =
"This paper presents a linear algebraic method for computing the
resultant of two polynomials. This method is based on the
computation of a determinant of order equal to the minimum of the
degrees of the two given polynomials. This method turns out to be
preferable to other known linear algebraic methods both from a
computational point of view and for a total generality respect to
the class of the given polynomials. Some relationships of this
method with the polynomial pseudo-remainder operation are discussed.",
paper = "Bord82.pdf"
}

\end{chunk}

\index{Arnon, Dennis S.}
\index{McCallum, Scott}
\begin{chunk}{axiom.bib}
@article{Arno82a,
author = "Arnon, Dennis S. and McCallum, Scott",
title = {{Cylindrical Algebraic Decomposition by Quantifier Elimination}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "215-222",
year = "1982",
abstract =
"Cylindrical algebraic decompositions were introduced as a major
component of a new quantifier elimination algorithm for elementary
algebra and geometry (G. Collins, 1973). In the present paper we turn
the tables and show that one can use quantifier elimination for
elementary algebra and geometry to obtain a new version of the
cylindrical algebraic decomposition algorithm. A key part of our
result is a theorem, of interest in its own right, that relates the
multiplicities of the roots of a polynomial to their continuity.",
paper = "Arno82a.pdf"
}

\end{chunk}

\index{Collins, George E.}
\begin{chunk}{axiom.bib}
@article{Coll82a,
author = "Collins, George E.",
title = {{Factorization in Cylindrical Algebraic Decomposition - Abstract}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "212-214",
year = "1982",
paper = "Coll82a.pdf"
}

\end{chunk}

\index{Lazard, Daniel}
\begin{chunk}{axiom.bib}
@article{Laza82a,
author = "Lazard, Daniel",
title = {{On Polynomial Factorization}},
journal = "Lecture Notes in Computer Science",
volume = "144",
pages = "144-157",
year = "1982",
abstract =
"We present new algorithms for factoring univariate polynomials
over finite fields. They are variants of the algorithms of Camion
and Cantor-Zassenhaus and differ from them essentially by
computing the primitive idempotents of the algebra $K[X]/f$ before
factoring $f$.

These algorithms are probabilistic in the following sense. The
time of computation depends on random choices, but the validity of
the result does not depend on them. So, worst case complexity,
being infinite, is meaningless and we compute average
complexity. It appears that our algorithms and the Cantor-Zassenhaus
algorithm have the same asymptotic complexity and that they are the best
algorithms actually known; with elementary multiplication and GCD
computation, the Cantor-Zassenhaus algorithm is always better than
ours; with fast multiplication and GCD, it seems that ours is
better, but this fact is not yet proven.

Finally, for factoring polynomials over the integers, it seems
that the best strategy consists in choosing prime moduli as big as
possible in the range where the time of the multiplication does
not depend on the size of the factors (machine word size). An
accurate computation of the involved constants would be needed for
proving this estimation.",
paper = "Laza82a.pdf"
}

\end{chunk}

\index{Strachey, Christopher}
\index{Wadsworth, Christopher}
\begin{chunk}{axiom.bib}
@article{Stra00a,
author = "Strachey, Christopher and Wadsworth, Christopher",
title = {{Continuations: A Mathematical Semantics for Handling Full Jumps}},
journal = "Higher-Order and Symbolic Computation",
volume = "13",
pages = "135-152",
year = "2000",
abstract =
"This paper describes a method of giving the mathematical
semantics of programming languages which include the most general
form of jumps.",
paper = "Stra00a.pdf",
keywords = "printed"
}

\end{chunk}

\index{Kaes, Stefan}
\begin{chunk}{axiom.bib}
@article{Kaes88,
author = "Kaes, Stefan",
title = {{Parametric Overloading in Polymorphic Programming Languages}},
journal = "LNCS",
volume = "300",
pages = "131-144",
year = "1988",
abstract =
"The introduction of unrestricted overloading in languages with type
systems based on implicit parametric polymorphism generally destroys
the principal type property: namely that the type of every expression
can uniformly be represented by a single type expression over some set
of type variables. As a consequence, type inference in the presence
of unrestricted overloading can become an NP-complete problem. In
this paper we define the concept of parametric overloading as a
restricted form of overloading which is easily combined with
principal type property, thereby allowing the design of efficient type
inference algorithms. We present sound type deduction systems, both
for predefined and programmer defined overloading. Finally we state
that parametric overloading can be resolved either statically, at
compile time, or dynamically, during program execution.",
paper = "Kaes88.pdf",
keywords = "printed"
}

\end{chunk}

\index{Kaes, Stefan}
\begin{chunk}{axiom.bib}
@article{Kaes92,
author = "Kaes, Stefan",
title = {{Type Inference in the Presence of Overloading, Subtyping and
Recursive Types}},
journal = "ACM Lisp Pointers",
volume = "5",
number = "1",
year = "1992",
pages = "193-204",
abstract =
"We present a unified approach to type inference in the presence of
overloading and coercions based on the concept of {\sl constrained
types}. We define a generic inference system, show that subtyping and
overloading can be treated as a special instance of this system and
develop a simple algorithm to compute principal types. We prove the
decidability of type inference for the class of {\sl decomposable
predicates} and develop a canonical representation for principal types
based on {\sl most accurate simplifications} of constraint
sets. Finally, we investigate the extension of our techniques to
{\sl recursive types}.",
paper = "Kaes92.pdf",
keywords = "printed"
}

\end{chunk}

\index{Hall, Cordelia V.}
\index{Hammond, Kevin}
\index{Jones, Simon L. Peyton}
\begin{chunk}{axiom.bib}
@article{Hall96,
author = "Hall, Cordelia V. and Hammond, Kevin and Jones, Simon L. Peyton
and Wadler, Philip L.",
title = {{Type Classes in Haskell}},
journal = "Trans. on Programming Languages and Systems",
volume = "18",
number = "2",
pages = "109-138",
year = "1996",
abstract =
"This article defines a set of type inference rules for resolving
overloading introduced by type classes, as used in the functional
programming language Haskell. Programs including type classes are
transformed into ones which may be typed by standard Hindley-Milner
inference rules. In contrast to other work on type classes, the rules
presented here relate directly to Haskell programs. An innovative
aspect of this work is the use of second-order lambda calculus to
record type information in the transformed program.",
paper = "Hall96.pdf",
keywords = "printed"
}

\end{chunk}

\index{Dreyer, Derek}
\index{Harper, Robert}
\index{Chakravarty, Manuel M.T.}
\index{Keller, Gabriele}
\begin{chunk}{axiom.bib}
@inproceedings{Drey07,
author = "Dreyer, Derek and Harper, Robert and Chakravarty, Manuel M.T.
and Keller, Gabriele",
title = {{Modular Type Classes}},
booktitle = "Proc. POPL'07",
pages = "63-70",
year = "2007",
abstract =
"ML modules and Haskell type classes have proven to be highly
effective tools for program structuring. Modules emphasize explicit
configuration of program components and the use of data abstraction.
Type classes emphasize implicit program construction and ad hoc
polymorphism. In this paper, we show how the implicitly-typed
style of type class programming may be supported within the framework
of an explicitly-typed module language by viewing type classes as a
particular mode of use of modules. This view offers a harmonious
integration of modules and type classes, where type class features,
such as class hierarchies and associated types, arise naturally as
uses of existing module-language constructs, such as module
hierarchies and type components. In addition, programmers have
explicit control over which type class instances are available for
use by type inference in a given scope. We formalize our approach as
a Harper-Stone-style elaboration relation, and provide a sound type
inference algorithm as a guide to implementation.",
paper = "Drey07.pdf",
keywords = "printed"
}

\end{chunk}

\index{Wehr, Stefan}
\index{Chakravarty, Manuel M.T.}
\begin{chunk}{axiom.bib}
@article{Wehr08,
author = "Wehr, Stefan and Chakravarty, Manuel M.T.",
title = {{ML Modules and Haskell Type Classes: A Constructive
Comparison}},
journal = "LNCS",
volume = "5356",
pages = "188-204",
year = "2008",
abstract =
"Researchers repeatedly observed that the module system of ML and the
type class mechanism of Haskell are related. So far, this relationship
has received little formal investigation. The work at hand fills this
gap: It introduces type-preserving translations from modules to type
classes and vice versa, which enable a thorough comparison of the two
concepts.",
paper = "Wehr08.pdf",
keywords = "printed"
}

\end{chunk}

\index{Stuckey, Peter J.}
\index{Sulzmann, Martin}
\begin{chunk}{axiom.bib}
@article{Stuc05,
author = "Stuckey, Peter J. and Sulzmann, Martin",
title = {{A Theory of Overloading}},
journal = "ACM Trans. on Programming Languages and Systems",
volume = "27",
number = "6",
pages = "1-54",
year = "2005",
abstract =
"We present a novel approach to allow for overloading of
identifiers in the spirit of type classes. Our approach relies on
the combination of the HM(X) type system framework with Constraint
Handling Rules (CHRs). CHRs are a declarative language for writing
incremental constraint solvers, that provide our scheme with a
form of programmable type language. CHRs allow us to precisely
describe the relationships among overloaded identifiers. Under
some sufficient conditions on the CHRs we achieve decidable type
inference and the semantic meaning of programs is unambiguous. Our
approach provides a common formal basis for many type class
extensions such as multi-parameter type classes and functional
dependencies.",
paper = "Stuc05.pdf",
keywords = "printed"
}

\end{chunk}

\index{Reynolds, J.C.}
\begin{chunk}{axiom.bib}
@article{Reyn85,
author = "Reynolds, J.C.",
title = {{Three Approaches to Type Structure}},
journal = "LNCS",
volume = "185",
year = "1985",
abstract =
"We examine three disparate views of the type structure of
programming languages: Milner's type deduction system and polymorphic
let construct, the theory of subtypes and generic operators, and
the polymorphic or second-order typed lambda calculus. These
approaches are illustrated with a functional language including
product, sum and list constructors. The syntactic behavior of types
is formalized with type inference rules, but their semantics is
treated intuitively.",
paper = "Reyn85.pdf",
keywords = "printed"
}

\end{chunk}

\index{Balsters, Herman}
\index{Fokkinga, Maarten M.}
\begin{chunk}{axiom.bib}
@article{Bals91,
author = "Balsters, Herman and Fokkinga, Maarten M.",
title = {{Subtyping can have a simple semantics}},
journal = "Theoretical Computer Science",
volume = "87",
pages = "81-96",
year = "1991",
abstract =
"Consider a first order typed language, with semantics
$\llbracket~\rrbracket$ for expressions and types. Adding
subtyping means that a partial order $\le$ on types is defined
and that the typing rules are extended to the effect that
expression $e$ has type $\tau$ whenever $e$ has type $\sigma$ and
$\sigma \le \tau$. We show how to adapt the semantics
$\llbracket~\rrbracket$ in a {\sl simple set theoretic way},
obtaining a semantics $\llbracket~\rrbracket'$ that satisfies, in
addition to some obvious requirements, also the property
$\llbracket\sigma\rrbracket' \subseteq \llbracket\tau\rrbracket'$ whenever $\sigma \le \tau$.",
paper = "Bals91.pdf",
keywords = "printed"
}

\end{chunk}

\index{Cardelli, Luca}
\begin{chunk}{axiom.bib}
@article{Card84,
author = "Cardelli, Luca",
title = {{A Semantics of Multiple Inheritance}},
journal = "LNCS",
volume = "173",
year = "1984",
paper = "Card84.pdf",
keywords = "printed"
}

\end{chunk}

\index{Mosses, Peter}
\begin{chunk}{axiom.bib}
@article{Moss84,
author = "Mosses, Peter",
title = {{A Basic Abstract Semantics Algebra}},
journal = "LNCS",
volume = "173",
year = "1984",
abstract =
"It seems that there are some pragmatic advantages in using Abstract
Semantic Algebras (ASAs) instead of X-notation in denotational
semantics. The values of ASAs correspond to 'actions' (or
'processes'), and the operators correspond to primitive ways of
combining actions. There are simple ASAs for the various independent
'facets' of actions: a functional ASA for data-flow, an imperative ASA
for assignments, a declarative ASA for bindings, etc. The aim is to
obtain general ASAs by systematic combination of these simple ASAs.

Here we specify a basic ASA that captures the common features of the
functional, imperative and declarative ASAs -- and highlights their
differences. We discuss the correctness of ASA specifications, and
sketch the proof of the consistency and (limiting) completeness of the
functional ASA, relative to a simple model.

Some familiarity with denotational semantics and algebraic
specifications is assumed.",
paper = "Moss84.pdf"
}

\end{chunk}

\index{Gross, Jason}
\index{Chlipala, Adam}
\begin{chunk}{axiom.bib}
@misc{Gros15,
author = "Gross, Jason and Chlipala, Adam",
title = {{Parsing Parses}},
year = "2015",
abstract =
"We present a functional parser for arbitrary context-free grammars,
together with soundness and completeness proofs, all inside Coq. By
exposing the parser in the right way with parametric polymorphism and
dependent types, we are able to use the parser to prove its own
soundness, and, with a little help from relational parametricity,
prove its own completeness, too. Of particular interest is one strange
instantiation of the type and value parameters: by parsing parse trees
instead of strings, we convince the parser to generate its own
completeness proof. We conclude with highlights of our experiences
iterating through several versions of the Coq development, and some
general lessons about dependently typed programming.",
paper = "Gros15.pdf",
keywords = "printed"
}

\end{chunk}

\index{Wehr, Stefan}
\begin{chunk}{axiom.bib}
@phdthesis{Wehr05,
author = "Wehr, Stefan",
title = {{ML Modules and Haskell Type Classes: A Constructive
Comparison}},
school = "Albert-Ludwigs-Universit\"{a}t",
year = "2005",
abstract =
"Researchers repeatedly observed that the module system of ML and the
type class mechanism of Haskell are related. So far, this relationship
has received little formal investigation. The work at hand fills this
gap: It introduces type-preserving translations from modules to type
classes and vice versa, which enable a thorough comparison of the two
concepts.

The source language of the first translation is a subset of
Standard ML. The target language is Haskell with common extensions
and one new feature, which was developed as part of this work. The
second translation maps a subset of Haskell 98 to ML, with
well-established extensions. I prove that the translations
preserve type correctness and provide implementations for both.

Building on the insights obtained from the translations, I present
a thorough comparison between ML modules and Haskell type
classes. Moreover, I evaluate to what extent the techniques used
in the translations can be exploited for modular programming in
Haskell and for programming with ad-hoc polymorphism in ML.",
paper = "Wehr05.pdf"
}

\end{chunk}

\index{Dreyer, Derek}
\index{Crary, Karl}
\index{Harper, Robert}
\begin{chunk}{axiom.bib}
@article{Drey03,
author = "Dreyer, Derek and Crary, Karl and Harper, Robert",
title = {{A Type System for Higher-Order Modules}},
journal = "ACM SIGPLAN Notices",
volume = "38",
number = "1",
year = "2003",
abstract =
"We present a type theory for higher-order modules that accounts for
many central issues in module system design, including translucency,
applicativity, generativity, and modules as first-class values.
Our type system harmonizes design elements from previous work,
resulting in a simple, economical account of modular programming. The
main unifying principle is the treatment of abstraction mechanisms
as computational effects. Our language is the first to provide a
complete and practical formalization of all of these critical issues
in module system design.",
paper = "Drey03.pdf",
keywords = "printed"
}

\end{chunk}

\index{Crary, Karl}
\index{Harper, Robert}
\index{Puri, Sidd}
\begin{chunk}{axiom.bib}
@inproceedings{Crar99,
author = "Crary, Karl and Harper, Robert and Puri, Sidd",
title = {{What is a Recursive Module?}},
booktitle = "Conf. on Programming Language Design and Implementation",
year = "1999",
abstract =
"A hierarchical module system is an effective tool for structuring
large programs. Strictly hierarchical module systems impose an
acyclic ordering on import dependencies among program units. This
can impede modular programming by forcing mutually-dependent
components to be consolidated into a single module. Recently there
have been several proposals for module systems that admit cyclic
dependencies, but it is not clear how these proposals relate to
one another, nor how one might integrate them into an expressive
module system such as that of ML.

To address this question we provide a type-theoretic analysis of
the notion of a recursive module in the context of a
``phase-distinction'' formalism for higher-order module
systems. We extend this calculus with a recursive module mechanism
and a new form of signature, called a {\sl recursively dependent
signature}, to support the definition of recursive modules. These
extensions are justified by an interpretation in terms of more
primitive language constructs. This interpretation may also serve
as a guide for implementation.",
paper = "Crar99.pdf",
keywords = "printed"
}

\end{chunk}

\index{Dreyer, Derek}
\index{Crary, Karl}
\index{Harper, Robert}
\begin{chunk}{axiom.bib}
@techreport{Drey02,
author = "Dreyer, Derek and Crary, Karl and Harper, Robert",
title = {{A Type System for Higher-Order Modules (Expanded Version)}},
type = "technical report",
institution = "Carnegie Mellon University",
number = "CMU-CS-02-122R",
year = "2002",
abstract =
"We present a type theory for higher-order modules that accounts for
many central issues in module system design, including translucency,
applicativity, generativity, and modules as first-class values.
Our type system harmonizes design elements from previous work,
resulting in a simple, economical account of modular programming. The
main unifying principle is the treatment of abstraction mechanisms
as computational effects. Our language is the first to provide a
complete and practical formalization of all of these critical issues
in module system design.",
paper = "Drey02.pdf"
}

\end{chunk}

\index{Crary, Karl}
\begin{chunk}{axiom.bib}
@techreport{Crar02,
author = "Crary, Karl",
title = {{Toward a Foundational Typed Assembly Language}},
institution = "Carnegie Mellon University",
number = "CMU-CS-02-196",
year = "2002",
abstract =
"We present the design of a typed assembly language called TALT that
supports heterogeneous tuples, disjoint sums, arrays, and a general
account of addressing modes. TALT also implements the von Neumann
model in which programs are stored in memory, and supports relative
addressing. Type safety for execution and for garbage collection are
shown by machine-checkable proofs. TALT is the first formalized typed
assembly language to provide any of these features.",
paper = "Crar02.pdf"
}

\end{chunk}

\index{Mili, Ali}
\index{Aharon, Shir}
\index{Nadkarni, Chaitanya}
\begin{chunk}{axiom.bib}
@article{Mili09,
author = "Mili, Ali and Aharon, Shir and Nadkarni, Chaitanya",
title = {{Mathematics for Reasoning about Loop Functions}},
journal = "Science of Computer Programming",
volume = "79",
year = "2009",
pages = "989-1020",
abstract =
"The criticality of modern software applications, the pervasiveness of
malicious code concerns, the emergence of third-party software
development, and the preponderance of program inspection as a quality
assurance method all place a great premium on the ability to analyze
programs and derive their function in all circumstances of use and all
its functional detail. For C-like programming languages, one of the
most challenging tasks in this endeavor is the derivation of loop
functions. In this paper, we outline the premises of our approach to
this problem, present some mathematical results, and discuss how these
results can be used as a basis for building an automated tool that
derives the function of while loops under some conditions.",
paper = "Mili09.pdf",
keywords = "printed"
}

\end{chunk}

\index{Piroi, Florina}
\index{Buchberger, Bruno}
\index{Rosenkranz, Camelia}
\begin{chunk}{axiom.bib}
@misc{Piro08,
author = "Piroi, Florina and Buchberger, Bruno and Rosenkranz, Camelia",
title = {{Mathematical Journals as Reasoning Agents: Literature Review}},
year = "2008",
abstract =
"This report reviews the literature relevant for the research project
'Math-Agents: Mathematical Journals as Reasoning Agents' proposed by
Bruno Buchberger as a technology transfer project based on the results
of the SFB Project 'Scientific Computing', in particular the project
SFB 1302, 'Theorema'. The project aims at computer−supporting the
refereeing process of mathematical journals by tools that are mainly
based on automated reasoning and also at building up the knowledge
archived in mathematical journals in such a way that they can act as
interactive and active reasoning agents later on. In this report,
we review current mathematical software systems with a focus on the
availability of tools that can contribute to the goals of the Math−
Agents project.",
paper = "Piro08.pdf",
keywords = "printed"
}

\end{chunk}

\index{Sokolowski, Stefan}
\begin{chunk}{axiom.bib}
@article{Soko87,
author = "Sokolowski, Stefan",
title = {{Soundness of Hoare's Logic: An Automated Proof Using LCF}},
journal = "Trans. on Programming Languages and Systems",
volume = "9",
number = "1",
pages = "100-120",
year = "1987",
abstract =
"This paper presents a natural deduction proof of Hoare's logic
carried out by the Edinburgh LCF theorem prover. The emphasis is
on the way Hoare's theory is presented to the LCF, which looks
very much like an exposition of syntax and semantics to human
readers; and on the programmable heuristics (tactics). We also
discuss some problems and possible improvements to the LCF.",
paper = "Soko87.pdf",
keywords = "printed"
}

\end{chunk}

\index{Wadler, Philip}
\begin{chunk}{axiom.bib}
@article{Wadl95,
author = "Wadler, Philip",
title = {{Monads for Functional Programming}},
journal = "LNCS",
volume = "925",
year = "1995",
abstract =
"The use of monads to structure functional programs is
described. Monads provide a convenient framework for simulating
effects found in other languages, such as global state, exception
handling, output, or non-determinism. Three case studies are
looked at in detail: how monads ease the modification of a simple
evaluator; how monads act as the basis of a datatype of arrays
subject to in-place update; and how monads can be used to build
parsers.",
keywords = "printed"
}

\end{chunk}
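The first case study in Wadler's abstract above is how a monad eases the modification of a simple evaluator. A sketch of that idea transposed from Haskell to Python, using a Maybe-style failure monad; the term representation and names are illustrative, not Wadler's:

```python
# A Maybe-style monad: unit injects a value, bind sequences a
# computation that may fail, without nested error checks at each step.

def unit(value):
    return ("Just", value)

def bind(m, f):
    return m if m[0] == "Nothing" else f(m[1])

FAIL = ("Nothing", None)

# Terms: ("con", n) for constants, ("div", t, u) for integer division.
def evaluate(term):
    if term[0] == "con":
        return unit(term[1])
    _, t, u = term
    return bind(evaluate(t),
                lambda a: bind(evaluate(u),
                               lambda b: FAIL if b == 0 else unit(a // b)))

evaluate(("div", ("con", 6), ("con", 2)))   # ("Just", 3)
evaluate(("div", ("con", 1), ("con", 0)))   # ("Nothing", None)
```

Changing the effect (say, to count divisions or log output) means swapping in a different `unit`/`bind` pair while the shape of `evaluate` stays the same, which is the point of the paper.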

\index{Freeman, Tim}
\index{Pfenning, Frank}
\begin{chunk}{axiom.bib}
@inproceedings{Free91,
author = "Freeman, Tim and Pfenning, Frank",
title = {{Refinement Types for ML}},
booktitle = "ACM SIGPLAN PLDI'91",
year = "1991",
abstract =
"We describe a refinement of ML's type system allowing the
specification of recursively defined subtypes of user-defined
datatypes. The resulting system of {\sl refinement types}
preserves desirable properties of ML such as decidability of type
inference, while at the same time allowing more errors to be
detected at compile-time. The type system combines abstract
interpretation with ideas from the intersection type discipline,
but remains closely tied to ML in that refinement types are given
only to programs which are already well-typed in ML.",
paper = "Free91.pdf",
keywords = "printed"
}

\end{chunk}

\index{Zeilberger, Noam}
\begin{chunk}{axiom.bib}
@misc{Zeil16,
author = "Zeilberger, Noam",
title = {{Towards a Mathematical Science of Programming}},
year = "2016"
}

\end{chunk}

\index{Zeilberger, Noam}
\begin{chunk}{axiom.bib}
@inproceedings{Zeil16a,
author = "Zeilberger, Noam",
title = {{Principles of Type Refinement}},
booktitle = "OPLSS 2016",
year = "2016",
paper = "Zeil16a.pdf"
}

\end{chunk}

\index{McCarthy, John}
\begin{chunk}{axiom.bib}
@incollection{Mcca63,
author = "McCarthy, John",
title = {{A Basis for a Mathematical Theory of Computation}},
booktitle = "Computer Programming and Formal Systems",
year = "1963",
paper = "Mcca63.pdf"
}

\end{chunk}

\index{Scott, Dana S.}
\index{Strachey, Christopher}
\begin{chunk}{axiom.bib}
@article{Scot71,
author = "Scott, Dana S. and Strachey, Christopher",
title = {{Towards a Mathematical Semantics for Computer Languages}},
journal = "Proc. Symp. on Computers and Automata",
volume = "21",
year = "1971",
abstract =
"Compilers for high-level languages are generally constructed to
give a complete translation of the programs into machine
language. As machines merely juggle bit patterns, the concepts of
the original language may be lost or at least obscured during this
passage. The purpose of a mathematical semantics is to give a
correct and meaningful correspondence between programs and
mathematical entities in a way that is entirely independent of an
implementation. This plan is illustrated in a very elementary
method with the usual idea of state transformations. The next
section shows why the mathematics of functions has to be modified
to accommodate recursive commands. Section 3 explains the
modification. Section 4 introduces the environments for handling
variables and identifiers and shows how the semantical equations
define equivalence of programs. Section 5 gives an exposition of
the new type of mathematical function spaces that are required for
the semantics of procedures when these are allowed in assignment
statements. The conclusion traces some of the background of the
project and points the way to future work.",
paper = "Scot71.pdf"
}

\end{chunk}
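The Scott-Strachey abstract above illustrates its plan "with the usual idea of state transformations". A minimal sketch of that idea, with illustrative names: each command denotes a function from states to states, sequencing is function composition, and program equivalence is equality of denotations.

```python
# Denotational sketch: [[c]] : State -> State, with states as dicts.

def assign(var, expr):
    """[[var := expr]] maps a state to the same state with var updated."""
    return lambda state: {**state, var: expr(state)}

def seq(c1, c2):
    """[[c1; c2]] = [[c2]] composed with [[c1]]."""
    return lambda state: c2(c1(state))

prog = seq(assign("x", lambda s: 1),
           assign("y", lambda s: s["x"] + 2))

prog({})   # {'x': 1, 'y': 3}
```

The paper's harder material (recursion via modified function spaces, environments, procedures as values) does not fit a sketch this small, but the elementary state-transformation reading is exactly this.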

\index{Mellies, Paul-Andre}
\index{Zeilberger, Noam}
\begin{chunk}{axiom.bib}
@inproceedings{Mell15,
author = "Mellies, Paul-Andre and Zeilberger, Noam",
title = {{Functors are Type Refinement Systems}},
booktitle = "POPL'15",
publisher = "ACM",
year = "2015",
abstract =
"The standard reading of type theory through the lens of category
theory is based on the idea of viewing a type system as a category
of well-typed terms. We propose a basic revision of this reading;
rather than interpreting type systems as categories, we describe
them as {\sl functors} from a category of typing derivations to a
category of underlying terms. Then, turning this around, we
explain how in fact {\sl any} functor gives rise to a generalized
type system, with an abstract notion of type judgment, typing
derivations and typing rules. This leads to a purely categorical
reformulation of various natural classes of type systems as
natural classes of functors.

The main purpose of this paper is to describe the general
framework (which can also be seen as providing a categorical
analysis of {\sl refinement types}), and to present a few
applications. As a larger case study, we revisit Reynolds's paper
``The Meaning of Types'' (2000), showing how the paper's main
results may be reconstructed along these lines.",
paper = "Mell15.pdf",
keywords = "printed"
}

\end{chunk}

\index{Shields, Mark}
\index{Jones, Simon Peyton}
\begin{chunk}{axiom.bib}
@inproceedings{Shie02,
author = "Shields, Mark and Jones, Simon Peyton",
title = {{First-Class Modules for Haskell}},
booktitle = "9th Int. Conf. on Foundations of Object-Oriented Languages",
pages = "28-40",
year = "2002",
abstract =
"Though Haskell's module language is quite weak, its core language
is highly expressive. Indeed, it is tantalisingly close to being
able to express much of the structure traditionally delegated to a
separate module language. However, the encodings are awkward, and
some situations can't be encoded at all.

In this paper we refine Haskell's core language to support
{\sl first-class modules} with many of the features of ML-style
modules. Our proposal cleanly encodes signatures, structures and
functors with the appropriate type abstraction and type sharing,
and supports recursive modules. All of these features work across
compilation units, and interact harmoniously with Haskell's class
system. Coupled with support for staged computation, we believe
our proposal would be an elegant approach to run-time dynamic
linking of structured code.

Our work builds directly upon Jones' work on parameterised
signatures, Odersky and L\"aufer's system of higher-ranked type
annotations, Russo's semantics of ML modules using ordinary
existential and universal quantifications, and Odersky and
Zenger's work on nested types. We motivate the system by examples,
and include a more formal presentation in the appendix.",
paper = "Shie02.pdf",
keywords = "printed"
}

\end{chunk}

\index{Dijkstra, E.W.}
\begin{chunk}{axiom.bib}
@misc{Dijk71,
author = "Dijkstra, E.W.",
title = {{A Short Introduction to the Art of Programming}},
comment = "EWD316",
year = "1971",
paper = "Dijk71.pdf"
}

\end{chunk}

\index{Steele, Guy Lewis}
\index{Sussman, Gerald Jay}
\begin{chunk}{axiom.bib}
@techreport{Stee78a,
author = "Steele, Guy Lewis and Sussman, Gerald Jay",
title = {{The Art of the Interpreter or, The Modularity Complex (Parts
Zero, One, and Two)}},
type = "technical report",
institution = "MIT AI Lab",
number = "AIM-453",
year = "1978",
abstract =
"We examine the effects of various language design decisions on
the programming styles available to a user of the language, with
particular emphasis on the ability to incrementally construct
modular systems. At each step we exhibit an interactive
meta-circular interpreter for the language under consideration.

We explore the consequences of various variable binding
disciplines and the introduction of side effects. We find that
dynamic scoping is unsuitable for constructing procedural
abstractions, but has another role as an agent of modularity,
being a structured form of side effect. More general side effects
are also found to be necessary to promote modular style. We find
that the notion of side effect and the notion of equality (object
identity) are mutually constraining: to define one is to define
the other.

The interpreters we exhibit are all written in a simple dialect of
LISP, and all implement LISP-like languages. A subset of these
interpreters constitute a partial historical reconstruction of the
actual evolution of LISP.",
paper = "Stee78a.pdf",
keywords = "printed"
}

\end{chunk}
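The Steele-Sussman abstract above contrasts variable binding disciplines. A sketch (in Python rather than the paper's Lisp, with illustrative names) of the contrast: under lexical scoping a free variable is looked up in the environment of the procedure's definition, under dynamic scoping in the environment of its caller.

```python
# Environments as linked frames: (frame_dict, enclosing_env_or_None).

def lookup(env, name):
    while env is not None:
        frame, env = env
        if name in frame:
            return frame[name]
    raise NameError(name)

def make_adder_lexical(n, env):
    closure_env = ({"n": n}, env)          # capture the definition env
    return lambda x, _caller_env: x + lookup(closure_env, "n")

def make_adder_dynamic(n, env):
    # Ignore the definition env; consult whoever calls us.
    return lambda x, caller_env: x + lookup(caller_env, "n")

base = ({"n": 100}, None)
lex = make_adder_lexical(1, base)
dyn = make_adder_dynamic(1, base)
lex(5, base)   # 6   -- sees the captured n = 1
dyn(5, base)   # 105 -- sees the caller's n = 100
```

This is the paper's point that dynamic scoping is unsuitable for procedural abstraction: the dynamic adder's behaviour depends on a variable it never mentions in its interface.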

\index{Wadler, Philip}
\index{Findler, Robert Bruce}
\begin{chunk}{axiom.bib}
@inproceedings{Wadl07,
author = "Wadler, Philip and Findler, Robert Bruce",
title = {{Well-Typed Programs Can't Be Blamed}},
booktitle = "Workshop on Scheme and Functional Programming",
year = "2007",
pages = "15-26",
abstract =
"We show how {\sl contracts} with blame fit naturally with recent
work on {\sl hybrid types} and {\sl gradual types}. Unlike hybrid
types or gradual types, we require casts in the source code, in
order to indicate where type errors may occur. Two (perhaps
surprising) aspects of our approach are that refined types can
provide useful static guarantees even in the absence of a theorem
prover, and that type {\sl dynamic} should not be regarded as a
supertype of all other types. We factor the well-known notion of
subtyping into new notions of positive and negative subtyping, and
use these to characterise where positive and negative blame may
arise. Our approach sharpens and clarifies some recent results in
the literature.",
keywords = "printed"
}

\end{chunk}
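The Wadler-Findler abstract above is about contracts with blame. A minimal sketch of the classic construction (names and API are illustrative, not the paper's): a flat contract checks a value and blames the positive party, and a function contract wraps a function, swapping blame labels on the argument side, which is where the notions of positive and negative blame come from.

```python
class ContractError(Exception):
    def __init__(self, party):
        super().__init__(f"contract violated, blame {party}")
        self.party = party

def flat(pred):
    """A first-order contract: the value's supplier (positive party)
    is blamed if the predicate fails."""
    def wrap(value, pos, neg):
        if not pred(value):
            raise ContractError(pos)
        return value
    return wrap

def func(dom, rng):
    """A function contract dom -> rng.  Arguments flow from the caller,
    so the domain check swaps the blame labels."""
    def wrap(f, pos, neg):
        return lambda x: rng(f(dom(x, neg, pos)), pos, neg)
    return wrap

pos_int = flat(lambda n: isinstance(n, int) and n > 0)
inc_contract = func(pos_int, pos_int)

inc = inc_contract(lambda n: n + 1, "server", "client")
inc(3)    # 4
# inc(-1) raises ContractError blaming "client": the caller broke
# the domain contract, not the wrapped function.
```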

\index{Michaelson, Greg}
\begin{chunk}{axiom.bib}
@book{Mich11,
author = "Michaelson, Greg",
title = {{Functional Programming Through Lambda Calculus}},
year = "2011",
publisher = "Dover",
isbn = "978-0-486-47883-8"
}

\end{chunk}

2869. By daly on 2018-05-05

books/breakurl.sty latex package to break long URLs

Goal: Literate Programming
