%TODO: disambiguate specification (ML in-out/pre-post) != declaration (thy)
\title{Design and Prototype for a \\
  TP-based Programming Language\\
  {\Large Draft for an FWF proposal}}

\section{Scientific aspects}

} %10 pages
This proposal aims at the design and development of a prototype
for a programming language, called the ``Theorem-Prover based programming
language'' (TP-PL), which\label{aims}
\begin{enumerate}
\item\label{item-comp-func}\textit{adapts the functional programming fragment of
Isabelle}~\footnote{Some details discussed here specifically apply to
the ``Higher-Order Logic (HOL)'' development of Isabelle.} such that it
becomes appropriate for large software developments in the engineering
sciences, while exploiting the power of Isabelle's function package and
code generator;
\item\label{item-comp-tools}\textit{accompanies Isabelle's deductive
knowledge with computational tools}; for instance, accompanies Isabelle's
Kurzweil-Henstock Gauge Integration in many dimensions with tools to
actually compute integrals; this concerns certified computer algebra;
\item\label{item-comp-interp}\textit{provides an interpreter for the prototyped programming
language} as terms in Isabelle, i.e.\ in the language layer where
Isabelle tools like simplifiers and tools from computer algebra
(Pt.\ref{item-comp-tools}) reside; the interpreter's design will
extend Isabelle's equational semantics to a big-step operational
semantics.
\end{enumerate}
TP-PL shall enhance existing tool support for a working engineer as
shown below:
\begin{figure} [htb]
\caption{\small\it The present tool support is given by CA systems and CA-based
programming languages. CA shall be enhanced by
Pt.\ref{item-comp-tools} above, the PL by Pt.\ref{item-comp-func} and the
integration of TP-based CA into the language by
Pt.\ref{item-comp-interp}.}
\end{figure}
%TODO preliminary formatting
\noindent As a by-product, because domain knowledge is conceptually
required by TP-PL, it would serve engineers directly as well.

A running example will accompany these sections; the example concerns
a standard problem from a textbook on structural engineering:
%cp from ~/material/texts/emat/lucin/vers1/lucas-interp-100614-KEEPgeom.tex
\medskip %\bigskip ... figure --> other page
\textit{\small Determine the bending line of a beam of length $L$,
which consists of homogeneous material, which is clamped on one side
and which is under constant line load $q_0$; see the figure
below.}%on p.\pageref{fig-bend-7-70}
% Hint: Use the constraints on the shear force $V(0)=q_0\cdot L$, on the
% bending moment $M_b(L)=0$ and on the expected bending line $y(0)=0$
% and $y^\prime(0)=0$.
%$y(0)=0$, $y^\prime(0)=0$, $V(0)=q_0\cdot L$, $M_b(L)=0$.
\begin{figure} [htb]
\caption{\small\it A balcony under load.}
\end{figure}
The three points mentioned above will be related to three
respective paragraphs in \S\ref{aims-state}, clarifying the respective
relations to the state of the art. \S\ref{aims-news} will explain how
the project could break new ground scientifically, and
\S\ref{aims-results} will estimate the importance of the expected
results for Formal Methods.

\subsubsection{Aims in relation to the state-of-the-art}\label{aims-state}
This section addresses the three aims from p.\pageref{aims} with three
respective paragraphs, eventually cross-referenced by page.

\paragraph{Adapt the functional programming fragment of Isabelle:}\label{aims-state-fun}
Over the years Isabelle has extended its higher-order logic with
a powerful function package and a code generator, such that we can state:
\textit{Isabelle provides powerful and appropriate prerequisites\\
for the planned programming language TP-PL!}
We just need to adopt Isabelle's existing programming language,
exploit the power of the function package and the code generator, and
extend both mechanisms by modules supporting larger developments in
engineering applications.

Starting from the running example, the problem stated above can be
solved by computations of a program, which could look as follows:
% cp from material/texts/present/110731-wroclaw.tex:
\begin{tabbing}\label{expl-program}
123\=$01$x\=(\=(\=(funs\=(Subproblem (\=\kill
\>$01$  \>{\sc program} bendingLine :: real $\Rightarrow$ real $\Rightarrow$ real $\Rightarrow$ (real $\Rightarrow$ real) $\Rightarrow$ bool list\\
\>$02$  \>{\sc program} bendingLine l q v b rb =\\
\>$03$  \>\>({\tt let}\\
\>$04$  \>\>\>(funs::bool list) {\tt =}\\
\>$05$  \>\>\>\>({\sc program} (Bendingline, BeamUnderLoad, Integrate4times) [q, b, v]);\\
\>$06$  \>\>\>(equs::bool list) {\tt =}\\
\>$07$  \>\>\>\>({\sc program} (Bendingline, Constraints, setConstraints) [funs, rb, l]);\\
\>$08$  \>\>\>(sols::bool list) {\tt =} {\sc solve} equs [$c,c_2,c_3,c_4$];\\
%\>$12$  \>\>\>(sols::bool list) {\tt =}\\
%\>$13$  \>\>\>\>(SubProblem (Real,[linear,equation,system],[])\\
%\>$14$  \>\>\>\>\>[bools equs, reals [c,c\_2,c\_3,c\_4]]);\\
%\>$15$  \>\>\>B {\tt =} {\em Take} ({\tt last} funs);\\
\>$09$  \>\>\>B {\tt =} (({\sc Substitute} sols) {\tt o} ({\sc Simplify} [(bdv, v)] make\_ratpoly\_in)) ({\tt last} funs)\\
%\>$16$  \>\>\>B {\tt =} ((Substitute sols) @@\\
%\>$17$  \>\>\>\>$\;$(Rewrite\_Set\_Inst [(bdv, v)] make\_ratpoly\_in)) B\\
\>$10$  \>\>{\tt in} B)
\end{tabbing}
This program will be referenced subsequently, such that the essential
features become clear step by step.

\subparagraph{\sc\large Adopt Isabelle's existing programming language:} The
executable fragment of Isabelle consists of data-type and function
definitions.  Its usability even suggests that fragment for
introductory courses \cite{nipkow-prog-prove}. HOL is a typed logic
whose type system resembles that of functional programming
languages. Thus there are:
\begin{description}
\item[base types,] in particular \textit{bool}, the type of truth
values, and \textit{nat}, \textit{int} and \textit{complex}, the types of
natural, integer and complex numbers respectively.
\item[type constructors,] which allow to define arbitrary types, from
\textit{set} and \textit{list} to advanced data structures like
trees, red-black trees, etc.
\item[function types,] denoted by $\Rightarrow$, which will be separately
discussed subsequently.
\item[type variables,] denoted by $^\prime a, ^\prime b$ etc., which provide
type polymorphism. Isabelle automatically computes the type of each
variable in a term by use of Hindley-Milner type inference.
\end{description}

\textbf{Terms} are formed as in functional programming by applying
functions to arguments. If $f$ is a function of type
$\tau_1\Rightarrow \tau_2$ and $t$ is a term of type $\tau_1$ then
$f\;t$ is a term of type~$\tau_2$. $t\;::\;\tau$ means that term $t$
has type $\tau$. There are many predefined infix symbols like $+$ and
$\leq$ most of which are overloaded for various types.
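For instance (a minimal illustration): with ${\it Suc}$ of type ${\it
nat} \Rightarrow {\it nat}$, the term ${\it Suc}\;({\it Suc}\;0)$ has
type ${\it nat}$, i.e.\ ${\it Suc}\;({\it Suc}\;0)\;::\;{\it nat}$;
and by overloading, both $1+2\;::\;{\it nat}$ and $1.5+2.5\;::\;{\it
real}$ are well-typed applications of $+$.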

HOL also supports some basic constructs from functional programming:
\begin{tabbing} 123\=\kill
\>$( \; {\tt if} \; b \; {\tt then} \; t_1 \; {\tt else} \; t_2 \;)$\\
\>$( \; {\tt let} \; x=t \; {\tt in} \; u \; )$\\
\>$( \; {\tt case} \; t \; {\tt of} \; {\it pat}_1
  \Rightarrow t_1 \; |\dots| \; {\it pat}_n\Rightarrow t_n \; )$

\end{tabbing}
\noindent \textit{The running example's program uses some of these elements
(marked by {\tt tt-font} on p.\pageref{expl-program}): ${\tt
let}\dots{\tt in}$
in lines $03 \dots 10$, as well as {\tt last} for
lists and {\tt o} for functional (forward) composition in line
$09$. In fact, the whole program is an Isabelle term with specific
function constants like {\sc program}, {\sc Substitute} and {\sc
Simplify} in lines $01$ and $09$ respectively.}

% Terms may also contain $\lambda$-abstractions. For example, $\lambda
% x. \; x$ is the identity function.

\textbf{Formulae} are terms of type \textit{bool}. There are the basic
constants \textit{True} and \textit{False} and the usual logical
connectives (in decreasing order of precedence): $\neg, \land, \lor,
\longrightarrow$.

\textbf{Equality} is available in the form of the infix function $=$ of type $^\prime a \Rightarrow\;^\prime a \Rightarrow {\it bool}$. It also works for formulae, where it means ``if and only if''.

\textbf{Quantifiers} are written $\forall x. \; P$ and $\exists x. \;
P$.  Quantifiers lead to non-executable functions, so functions do not
always correspond to programs, for instance, if comprising \\$(
\;{\it if} \; \exists x.\;P \; {\it then} \; e_1 \; {\it else} \; e_2
\;)$.
\subparagraph{\sc\large Exploit the power of Isabelle's function
package} which can generate functional programs from recursive
functions specified in the HOL logic. The power of the package is
indicated by calling an intermediate layer ``Mini-Haskell''
\cite{haftm-nipkow-code-gen-HRS-10}. The package provides specific
support for modeling iteration: In functional programs iteration is
modeled by recursive function calls; this introduces another class of
{\em partiality}, which results from not well-founded recursion and
which might cause non-termination. And termination needs to be proved
in the envisaged context of theorem proving.

However, such termination proofs are gracefully handled by Isabelle's
function package
\cite{krauss:partial06,krauss-fun-def-10}: Given
the definition of a well-known function by \dots

\begin{tabbing} 123\=$|\;$\=\kill
\>${\it fun}\;{\it fib} :: "{\it nat} \Rightarrow {\it nat}"$\\
\>${\it where}$\\
\> \>$"{\it fib}\;0 = 1"$\\
\>$|$ \> $"{\it fib}\;({\it Suc}\;0) = 1"$\\
\>$|$ \> $"{\it fib}\;({\it Suc}\;({\it Suc}\;n)) = {\it fib}\;n +
{\it fib}\;({\it Suc}\;n)"$
\end{tabbing}

\noindent\dots Isabelle automatically proves (triggered by
\textit{fun}) termination of this function as well as of nested
recursive and of mutual recursive functions.  The termination prover
\cite{nipkow-krauss:07-find-lex-orders,Krauss:07-size-terminat} is
able to solve termination proof obligations fully automatically,
obviating the need for specifying termination relations. The package
automatically introduces inductive domain predicates which accomplish
several kinds of partiality issues. The termination prover can also
link to external provers and include full proof reconstruction, where
all necessary properties are derived inside Isabelle

The function package is not a priori limited to terminating
recursions, but also works on functions which need not terminate on
all input values. Functions requiring a manual termination proof are
defined by \textit{function} (as opposed to \textit{fun}); users have
the choice of either proving termination in terms of the domain predicate, or
specifying a well-founded relation or measure and proving that the
arguments of the recursive calls are decreasing --- which they
typically prefer.
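A minimal sketch of such a definition via \textit{function} with a
subsequent termination proof by a measure could look as follows (the
function ${\it log2}$ and the proof method calls are illustrative
only):
\begin{tabbing} 123\=$|\;$\=\kill
\>${\it function}\;{\it log2} :: "{\it nat} \Rightarrow {\it nat}"$\\
\>${\it where}$\\
\>\>$"{\it log2}\;n = ({\it if}\;n \leq 1\;{\it then}\;0\;{\it else}\;1 + {\it log2}\;(n\;{\it div}\;2))"$\\
\>${\it by}\;{\it pat\_completeness}\;{\it auto}$\\
\>${\it termination}\;{\it by}\;({\it relation}\;"{\it measure}\;(\lambda n.\;n)")\;{\it auto}$
\end{tabbing}
Here the recursive call decreases the argument ($n\;{\it div}\;2 < n$
for $n > 1$), which discharges the termination obligation.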

The package automatically provides appropriate proof rules for
reasoning about defined functions and thereby establishes two
appealing advantages: (1) an equational semantics for the programming
language in terms of higher-order rewrite systems
\cite{haftm-nipkow-code-gen-HRS-10} and (2) prerequisites for code
generation, both of which will be re-used in the proposal for
extending the computational tools on p.\pageref{concepts-computation}
and the interpreter of TP-PL on p.\pageref{concept-interpreter}.

\medskip\textit{The example program on p.\pageref{expl-program}
comprises several such functions: {\sc solve} in line $08$, {\sc
Substitute} and {\sc Simplify} in line $09$ and also {\tt last}, the
latter already defined in Isabelle/HOL.}

\subparagraph{\sc\large Extend the language for large engineering
applications:} The most successful programming languages for applied
mathematics in engineering sciences are based on Computer
Algebra (CA) systems like \cite{progr-mathematica-2012,prog-maple06};
in fact, the vast majority of mathematical engineering software is
implemented in those languages.

%Isabelle now starts to exploit availability of multi-core machines by
%parallel execution which is completely hidden from the user concerned
%with proofs and theory development \cite{parallel-ML}. When designing
%the TP-PL, the question arises how to combine implicit parallelism
%(independent branches of a functional program can be executed in
%parallel) with explicit control of parallel execution by the
%programmer.  \cite{parallel-ML} gives some hints, for instance a
%parallel \textit{map}-function.
% The pD Compiler (1993–1994) In my Ph.D. thesis I developed a compiler
% for a small para-functional programming language that used a new
% technique to generate parallelism from annotated functional
% programs. The target code is efficient PACLIB C code with explicit
% task creation and synchronization constructs.
% TODO Wolfgang Schreiner ?: pD Compiler ``technique to generate
% parallelism from annotated functional programs''

So, when aiming at a programming language usable for engineers, smooth
integration of CA functionality is indispensable. \cite{plmms10} gives
some ideas. This requirement, however, involves a specific class of
challenges addressed in the second paragraph ``accompany Isabelle's
deductive knowledge with computational tools'' on
p.\pageref{concepts-computation}.

\medskip The second extension results from the requirement to  assemble a
multitude of functions. The above mentioned function package can
handle functions one by one, so consistent handling of several
functions in one program requires a substantial extension of present
Isabelle. The respective conceptual challenges are addressed in the
third paragraph ``provide an interpreter for the programming
language'' on p.\pageref{concept-interpreter} and respective
technicalities in \S\ref{meth-integrate}.

A particular requirement for large programs is modularisation. Since
TP-PL shall reside within a logical framework, the conception includes
formal specification for programs.

\textit{A specification for particular input data for the running example's
program could be:\label{expl-spec}
%Florian's version in Haskell format:
\begin{tabbing} 123\=\kill
\>BeamUnderLoad :: (real $\Rightarrow$ real) $\Rightarrow$ real $\Rightarrow$ (real $\Rightarrow$ real)\\
\>BeamUnderLoad $q_0$  L = y\\
\>\>assumes $\;q_0$ is\_integrable $\;\land\;\; L > 0$\\
\>\>yields $y(0) = 0\land y^\prime(0) = 0\:\land\:V(0) = q_0\cdot L\:\land\;M_b(L) = 0$
\end{tabbing}
% cp from material/texts/present/110731-wroclaw.tex:
% \begin{spacing}{0.8}\label{expl-spec}
% {\small
% \begin{eqnarray*}
% {\it in} &:& {\it function}\; q_0, {\it length}\; L \\
% {\it pre} &:& q_0\; {\it is\_integrable} \;\land\; L > 0\\
% {\it out} &:& {\it function}\; y(x) \\
% {\it post} &:& y(0) = 0\land y^\prime(0) = 0\:\land\:V(0) = q_0 . L\:\land\;M_b(L) = 0
% \end{eqnarray*}}
% \end{spacing}
\textit{where $V$ and $M_b$ are constant function symbols in the theory of ``bending lines''; \textit{function} and \textit{length} are functions fixing the arguments' types, and $q_0$ is a constant function of type $\cal{R} \rightarrow \cal{R}$.}}

Sub-programs shall be accompanied by their specifications as well.
There shall also be the possibility to provide several algorithms for
one and the same specification, raising the question of how to
design calls of such sub-programs.

\textit{The example program on p.\pageref{expl-program} shows a
preliminary decision for calls of subprograms: these are not called by
their name, but by the keyword {\sc program} followed by a triple: the
theory, the specification and the algorithm.}

\medskip Specifications as well as algorithms shall be assembled in
appropriate data structures in order to facilitate mechanised
search. Definitions, specifications and algorithms are expected to
develop into consistent bodies of domain-specific knowledge in order to
improve the professionalism of development \cite{db:SW-engIII}. For
software engineering such knowledge seems to be emerging~\footnote{In
Isabelle's ``Archive of Formal Proofs'' there are collections like
``Computer Science $>>$ Security'' with growing lists like ``SIFPL,
VolpanoSmith, HotelKeyCards, RSAPSS, InformationFlowSlicing''.}, and for a few other domains
such knowledge is already mechanised~\cite{db:dom-eng,Hansen94b,Dehbonei&94}.
\cite{plmms10} describes the
use of such knowledge at a future engineer's workbench.

%TOO SPECIFICL domain knowledge
% The development of such explicit knowledge is already prepared by the
% distinction between the two established notions, quality of design and
% quality of conformance. The former concerns the relation between
% ``reality'' and an abstract model, the latter the relation between the
% abstract model and an implementation. Both relations are increasingly
% investigated and tackled with mathematical methods. One of the
% strongest method is proof; thus one might predict increasing
% importance of mathematical proof in this part software technology.

\paragraph{Accompany Isabelle's deductive knowledge with
computational tools:}\label{concepts-computation} Engineers want
concrete solutions computed for concrete problems like the example
problem on p.\pageref{expl-program}. So this proposal sets the
objective that TP-PL shall support calling CA functions, which
in turn set the standards for computational tools.

\textit{In the program for the running example on p.\pageref{expl-program},
the following CA functions are called: {\sc solve} in line $08$ and
{\sc Simplify} in line $09$.}

Since TP-PL is a TP-based language embedded in a logical framework, CA
called by TP-PL needs to integrate into logics. However, mainstream CA
systems, for instance Mathematica and Maple, are weakly founded
(\cite{harr:book} even calls them ``ill-defined'') in terms of
logics. There are various reasons for the mistakes found in mainstream
CA systems: assumptions can be lost, types of expressions can be
forgotten, solutions of an equation can be dropped, etc.
\cite{casproto}. So improvements are urgently required, and the
logical foundations of theorem provers like Isabelle are considered an
appropriate prerequisite for such improvements.
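A classic illustration of lost assumptions (a standard textbook
example, not tied to any particular CA system): the simplification
\begin{displaymath}
\sqrt{x^2} = x
\end{displaymath}
is only correct under the assumption $x \geq 0$; in general
$\sqrt{x^2} = |x|$, and a system that simplifies without recording the
assumption silently computes wrong results for negative $x$.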

\subparagraph{\sc\large Build upon Isabelle's deductive knowledge:}
Isabelle has grown to an extent where application to engineering
sciences (beyond software engineering) comes within reach.  Isabelle's
knowledge developed within HOL primarily addresses topics from
software technology, but also in topics relevant for applied
mathematics the knowledge exceeds what is covered by introductory
courses in engineering studies. Below there is a brief survey of the
knowledge contained in the standard distribution of Isabelle2012~
\footnote{http://} and
in the Archive of Formal Proofs:
\begin{description}
\item[Algebra] is developed from groups, rings and fields up to the
fundamental theorem of algebra in the univariate case and Groebner
bases in the multivariate case.
\item[Linear algebra] with executable operations on matrices of
arbitrary dimensions over ordered semirings, elementary linear algebra
on Euclidean spaces, convexity in real vector spaces up to
Gauss-Jordan Elimination for Matrices.
\item[Analysis] includes semicontinuous and continuous functions,
Cauchy's Mean Theorem and the Cauchy-Schwarz Inequality, MacLaurin and
Taylor series, nth roots and the transcendental functions.
\item[Multivariate analysis] concerning Frechet Derivative, Inner
Product Spaces and the Gradient Derivative, Finite-Dimensional Inner
Product Spaces, Multivariate calculus in Euclidean space,
Kurzweil-Henstock Gauge Integration in many dimensions, up to initial
value problems of Ordinary Differential Equations.
\item[Probability theory] including Binomial Coefficients,
Permutations, Social Choice Theory (Arrow's General Possibility
Theorem, Sen's Liberal Paradox and May's Theorem) and also
Discrete-Time Markov Chains.
\item[Polynomials] in a univariate and in a multivariate version, the
latter addressing monotonicity on strongly normalizing (monotone) algebras.
\item[Number theory] with primality on $\cal N$, congruence, unique
factorization, divisibility in monoids and rings, Residue rings:
Euler's theorem and Wilson's theorem, quotient rings.
\item[Numerals] and their respective arithmetic are all available for
$\cal N, I, Q, R$ and $\cal C$.  For $\cal R$ there are approximations
by floating point numbers.
\end{description} Following the LCF paradigm all the above knowledge
is derived from first principles, from the axioms of HOL. The axioms,
definitions, theorems and proofs are human readable.

\subparagraph{\sc\large Add CA algorithms based on the deductive
knowledge:} The kind of knowledge as described above, however, is very
different from what applied sciences are used to expect from Computer
Algebra (CA) systems. Isabelle's knowledge is proved mechanically (and
thus reliable), but it does not compute solutions. For instance, the
Isabelle distribution contains this \dots
\begin{tabbing} 123\=le\=\kill
\>${\it lemma}$\\
\>\>${\it fixes}\;x::"'a::\{{\it idom},{\it number\_ring}\}"$\\
\>\>${\it shows}\;"x^2*y = x^2 \;\&\; x*y^2 = y^2 \longleftrightarrow
x=1 \;\&\; y=1 \;|\; x=0 \;\&\; y=0"$\\
\>\>${\it by}\;{\it algebra}$
\end{tabbing}
\dots which is not what is required: engineers want to have the solution of
equations {\em computed} and not the other way round: find a solution
(somehow -- by hand?) and then let the computer check the correctness.
So, there is a lot to do: computational tools are required to complement
Isabelle's deductive knowledge.

Adding computational tools is essentially more than just copying the
algorithms from CA: CA lacks logical foundations, and the proposed
project will not relax the established logical rigor when adding
CA functions. For instance, an equation solver must guarantee to have
found {\em all} solutions to a given equation (system).
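For instance, a solver for linear equations $a\cdot x + b = 0$ over
the reals could return the complete solution set and be accompanied by
a correctness and completeness lemma; the following is a sketch in
Isabelle notation, with illustrative names:
\begin{tabbing} 123\=\kill
\>${\it definition}\;{\it solve\_lin} :: "{\it real} \Rightarrow {\it real} \Rightarrow {\it real}\;{\it set}"$\\
\>\>${\it where}\;"{\it solve\_lin}\;a\;b = ({\it if}\;a = 0\;{\it then}\;({\it if}\;b = 0\;{\it then}\;{\it UNIV}\;{\it else}\;\{\})\;{\it else}\;\{-b/a\})"$\\
\>${\it lemma}\;"x \in {\it solve\_lin}\;a\;b \longleftrightarrow a*x + b = 0"$
\end{tabbing}
The lemma states both directions: every returned value solves the
equation, and no solution is missed.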

Some algorithms already available might not be sufficient for engineering
applications.  For instance, Isabelle's development of Gauss-Jordan
elimination states ``Its distinctive feature is succinctness. It is not meant for
large computations.''

\medskip In spite of the challenges mentioned, the proposal is based
on the assumption that implementing a core of CA functions is within reach of
this project:
\begin{itemize}
\item solving equations in $\cal R$ and $\cal C$
\item solving equational systems in $\cal R$ and $\cal C$, linear and non-linear
\item univariate and multivariate differentiation and integration
\item some classes of integral and differential equations
\end{itemize}

\subparagraph{\sc\large Implement CA algorithms using TP-PL:}
As mentioned above,
trustworthiness of CA algorithms requires improvement; such
improvement is addressed by various ways pursued in the domain of TP
as well as in the domain of CA.

This proposal goes its own way: it shall improve
trustworthiness by implementing the CA algorithms by use of the
envisaged TP-PL. This programming language shall have a rigorous
formal semantics embedded into Isabelle's logical framework; this
central issue has already been addressed in the paragraph ``adapt
the functional programming fragment'' on p.\pageref{aims-state-fun} and will be
elaborated in the paragraph ``provide an interpreter for the programming
language'' on p.\pageref{concept-interpreter}.

The interpreter of TP-PL to be introduced in the paragraph below will
support interactive program construction --- with respect to CA
algorithms such support will allow for novel ways of exploring the
implemented algorithms. For instance, the shape of the S-polynomial in
the Groebner Bases method for the equational system shown above might
be instructive even for a working engineer (while the hundreds of
rewrites in the same algorithms surely are not).
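For reference, the S-polynomial of two polynomials $f$ and $g$ with
leading terms ${\it lt}(f)$ and ${\it lt}(g)$ is
\begin{displaymath}
S(f,g) \;=\; \frac{{\rm lcm}({\it lt}(f),{\it lt}(g))}{{\it lt}(f)}\cdot f
\;-\; \frac{{\rm lcm}({\it lt}(f),{\it lt}(g))}{{\it lt}(g)}\cdot g;
\end{displaymath}
its construction cancels the two leading terms against each other and
thereby exposes the next candidate for reduction in Buchberger's
algorithm.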

While the interpreter aims at comprehensibility, transparency and
succinctness, code generation will accomplish efficiency of the code,
including parallel execution.

%DONE 1.1.1.a ---> 1.1.1.b
%The integration entails to
%overcome deficiencies found basically and essentially in CA systems
%and to improve safety and reliability of simplifiers, equation solvers
%and other CA-functionality according to standards of Computer Theorem
%Proving; a specific example is the formalisation of initial value
%problems of
%The general issue requires specific attention and additional efforts
%described in \S\ref{concepts-computation}.

%probably delete the following paragraph: NO EDUCATION in this proposal!
% \subparagraph{\sc\large Implement advanced algorithms in parallel to elementary}
% algorithms --- this aim continues above-mentioned support for
% transparency: many CA algorithms at the state-of-the-art are
% sophisticated to an extent which is immediately comprehensible only
% for a few experts in the world.
% So, in order to make the novel kind of transpacency useful (at least
% with respect to engineering education), in certain cases more
% elementary algorithms might be appropriate and implemented in parallel
% to the advanced algorithms. Of course, being elementary usually trades
% agains generality.

\paragraph{Provide an interpreter for the programming
language:}\label{concept-interpreter} This paragraph explains how to
bring the ``adapted functional programming fragment of Isabelle''
(first paragraph, p.\pageref{aims-state-fun}) together with the ``extension of
Isabelle's deductive knowledge by computational tools'' (second
paragraph, p.\pageref{concepts-computation}) into effect as a prototype
with some essential features for a future workbench of an engineer.

\subparagraph{\sc\large Support interactive program development} such
that its complexity can be tackled by a step-wise approach: by
step-wise refinement from abstract to concrete representations,
refinement of all of theories, specifications and algorithms, and
iterating between them, even if not all are yet complete.

Isabelle/Isar \cite{mw:isar07} supports interactive development well,
as does the conjoined management of theories and (included) SML
\cite{mak:isar-ml-scala}. So the grounds are well laid for the
proposed extensions. The major extensions are concerned with:
\begin{enumerate}
\item assemblage of a multitude of functions (including their
respective termination proofs) into a program
\item\label{check} checking whether the pre-conditions of a sub-program can be
derived from the pre-/post-conditions of previous (sub-)programs, and
proving post-conditions
\item\label{manag} management of theories, specifications and
programs, which becomes an issue since TP-PL separates these
into three specific collections
\end{enumerate}
Ad Pt.\ref{check}: \textit{For instance, when developing line $09$ in
the running example's program on p.\pageref{expl-program}, one would
be interested whether the program's precondition and the preceding
sub-programs' post-conditions can guarantee that {\sc solve} will
provide a single unique solution for the equational system {\it equs}.}

How to adapt Isabelle's mechanisms to accomplish such checks
(Pt.\ref{check}) is the concern of the next paragraph; how
management (Pt.\ref{manag}) is supported is the concern of the subsequent
one.

\subparagraph{\sc\large Extend the interpreter of Isabelle's code
generator:} Isabelle's code generator
\cite{nipkow-bergh-code-gen-02,haftm-nipkow-code-gen-HRS-10} not only
generates code in several languages, but also supports immediate
interpretation of the functions, by use of Isabelle's simplifier
\cite{isa-codegen} and further mechanisms.

Like Isabelle's function package, the code generator only works for
single functions --- so major extensions are required, delving deeply
into Isabelle's concepts and sources, most of which are only
documented by the source code itself. A pleasant exception is
\cite{isar-impl}, which gives some descriptions, for instance of

\cite{haftm-nipkow-code-gen-HRS-10} note that ``the code generator
supports stepwise refinement of both algorithms and data by means of
code lemmas that replace less efficient functions and data by more
efficient ones in a uniform and logically sound way'' --- such
stepwise refinement shall remain in focus when tackling design
issues of interactive program development.

\subparagraph{\sc\large Support management of knowledge} as mentioned
in point (\ref{manag}.) above: The algorithms of the selected
exemplary engineering domain shall all be accompanied by rigorous
formal specifications (like the example on p.\pageref{expl-spec}), and
also the CA algorithms to be developed according to paragraph on
p.\pageref{concepts-computation} shall do likewise. Availability of
specifications also enhances usability, because CA algorithms are more
appropriately retrieved by their respective specifications.

So the proposed prototype shall explore possibilities for retrieval of
specifications and algorithms from large bodies of knowledge and for
comparing them: specifications might have relatives (e.g. a type of a
differential equation), one specification might be met by several
algorithms (being comprehensible/sophisticated or fast/slow or just
different), etc.  \cite{plmms10} gives a preview on such choices at a
future engineer's workbench.

Another design challenge is raised by the question of how to integrate views on
different language levels: (1) the view on the equations created by
the function package on the level of program code (meta language), and
(2) the view on particular values computed during program
interpretation on the level of program values (object language).

\subparagraph{\sc\large Provide the interpreter with a clear
semantics} which would allow to
\begin{itemize}
\item step-wise refine data and algorithms
\item step-wise extend a program
\item verify a program (e.g.\ the example on
p.\pageref{expl-program}) with respect to a specification (e.g.\ the
example on p.\pageref{expl-spec}).
\end{itemize}

\textit{In particular, the example's final context on
p.\pageref{expl-ctxt} should allow to prove the example's
post-condition on p.\pageref{expl-spec}}~\footnote{Ideally such proofs
are automated and hidden behind the scenes.}.

Isabelle's function package and code generator, both rely on a shallow
embedding of programs and a semantics in terms of higher-order rewrite
systems \cite{haftm-nipkow-code-gen-HRS-10}. This ``equational
semantics'' has the advantage of residing within Isabelle's inference
mechanisms and their trusted kernel of code.

Should TP-PL need to leave this realm of reliability, valuable
spadework has been done within Isabelle for this case as well: Jinja
\cite{nipkow-06:java-like} is a complete formal model of a Java-like
language, which includes a formal semantics, type system, virtual
machine model, compiler, and bytecode verifier. The latter has been
re-implemented \cite{imperative-functional-08} based on Haskell's
\cite{jones-haskell-03} imperative specification style, where local
state references and mutable arrays can be dynamically
allocated. \S\ref{meth-language} goes into technical details.

% \subparagraph{\sc\large Regard parallel execution in the semantics:}
% Since several years Isabelle more and more supports parallel execution
% \cite{makarius:isa-scala-jedit} with considerable performance gains
% \cite{Makarius-09:parall-proof}. Hoever, the proof is not yet finally
% established saying that the new mode of execution does not infect the
% system's consistency, in spite of much ongoing work in the field
% \cite{parallel-ML,Wenzel-11:doc-orient,makarius-jEdit-12}.
% The envisaged project shall include investigation of parallel
% execution already in the formal semantics, and probably contributes to
% final clarifications for the parallel execution model of the whole
% Isabelle system.

\paragraph{Proof of concept in an exemplary development in an
engineering domain:} The usability of TP-PL shall be tested in a
field-study in cooperation with a selected institute of an engineering
domain (definitely {\em not} software engineering!) at TU Graz.  This
institute is interested in Formal Methods and provides its
students with a comprehensive education in programming.

Students of this institute who consider a master's thesis with a
major mathematical part shall be invited to accomplish this
mathematics using TP-PL.

The results of their theses are supposed to serve for seminars,
further theses and other educational purposes. The results and
experiences from the field-test are relevant for the respective
engineering domain as well as for the areas of TP and FM.

\subsubsection{How the project breaks new ground
scientifically}\label{aims-news} The project adopts and extends
front-of-the-wave developments extending the computational power of
Isabelle, itself a leading product in the field of deduction and of
mathematics-based, domain-specific knowledge development. The
extension breaks new ground in several ways:

\paragraph{A novel combination of mechanisms} in Isabelle establishes
novel functionality: the function package
\cite{krauss:partial06,krauss-fun-def-10}, the context mechanisms
\cite{isar-impl} and the machinery of code generation
\cite{nipkow-bergh-code-gen-02,haftm-nipkow-code-gen-HRS-10} are
adapted such that programs can be constructed from several functions,
which is not yet possible. This provides a promising basis for
theoretical work on modularisation mechanisms for larger developments.

%extend recently established computational facilities in Isabelle in a
%small but decisive step

\paragraph{Theoretically considered techniques are implemented} for the
first time: insights gained with MiniMaple \cite{mini-maple-12} serve
the logically consistent integration of CA functions and bring
the clarification of a rigorous formal semantics
\cite{TODO-semantics-from-others} within reach, which \dots TODO


\paragraph{Advantages over comparable approaches} seem apparent in the
proposed incremental development in TP-PL, with its integration of
programming and proving and its exploitation of refinement techniques:
\begin{itemize}
\item {\em Advantages in generality} over special purpose languages
like EventML~\cite{eventml-12} \dots TODO \dots
\item {\em Advantages in interactive development} over the work-flow
established for instance in the Focal project~\cite{focal-08}, which
\dots TODO
\item {\em Advantages in integration} over traditional tools in Formal
Methods like the VDM tool box~\cite{vdm-tools-08}, which
\dots TODO
\item {\em Advantages in flexibility and trustability} over
constructive logics like those of Agda or Coq, which integrate deduction
and computation by concept (the Curry-Howard isomorphism) but would
require extending the trusted kernel by comparable extensions.
\end{itemize}
These advantages are expected to outperform all respective alternatives.

\subsubsection{Importance of the expected results for the
discipline}\label{aims-results} The discipline the proposal is
concerned with is Formal Methods (FM) \cite{fm-03}. FM is the approach
to mastering the increasing complexity of technical systems by formal
logics: systems are composed of sub-systems, and thus interfaces are
crucial, in particular the properties of dynamic behaviour.

Driven by FM, domain-specific bodies of mechanised knowledge are
emerging, not only for software technology~\footnote{For instance, the
Archive of Formal Proofs} but also for
other engineering domains \cite{db:dom-eng,Hansen94b,Dehbonei&94} and
others.  So-called ``domain engineering'' is expected to create and
mechanise domain specific knowledge \cite{db:SW-engIII},
i.e. appropriate notions (for instance \textit{beam, bending line} in
the running example on p.\pageref{expl}), concise abstractions like
domain specific predicates (for instance \textit{homogeneous, clamped}
in the example) and models, which will develop from kinds of abstract
datatypes to algebraic structures with specific axioms, definitions,
lemmas and theorems. So one might predict increasing importance of
mathematical proof also in these engineering domains.

In spite of increasing demands, FM are still not widely used in
engineering practice.

\medskip\noindent The proposed project intends to promote wider usage
of FM by the following advancements provided by TP-PL:

\begin{itemize}
\item\textbf{Combination of computation with Formal Methods:}
Programming in the envisaged TP-PL relies on prerequisites from FM and
TP: a specification (in/output, pre/post-condition) must be present,
the notions in the specification must be defined in Isabelle, theorems
used in the program need to be mechanically proved, and calling CA
algorithms in the program involves their respective specifications.
Pt.\ref{code-gen} shall ensure that this combination increases
acceptance of FM.
\item\textbf{Dynamic integration of programming and proving} is
particularly supported by the integration of SML code into the logical
infrastructure in Isabelle/Isar. The expected programming environment
of TP-PL will build upon Isabelle's front-end; the plans for extending
this front-end expect a continuation of the front-end's present line of
development.

The work-flow envisaged by TP-PL comprises step-wise refinement of
datatypes and algorithms, step-wise completion of programs, proving
pre-conditions automatically or interactively, and enhanced usability
in managing large collections of knowledge, in particular the CA
algorithms.
\item\textbf{Integrated management of large knowledge collections:}
The demonstration study planned for one specific domain is expected
to result in a considerably large collection of knowledge. TP-PL
adds specifications (in/output-items, pre/post-condition) to knowledge
traditionally managed by Isabelle, and TP-PL adds algorithms to knowledge
traditionally managed by FM.

So the project stimulates further research on knowledge management for
specifications and algorithms (which, for instance, are presently not
addressed by the OpenMath standard).
\item\textbf{Verified Computer Algebra} results from the planned
implementation of CA algorithms in TP-PL. The selection of CA algorithms
is related to respective knowledge in Isabelle (about Gr\"obner Bases,
multivariate analysis, etc.), which shall be complemented by algorithms
which actually compute solutions.

Of specific importance is the fact that in this way verified Computer
Algebra is made immediately available for Formal Methods.
\item\label{code-gen}\textbf{Motivation of users by code generation}
has been experienced in several academic courses on FM~\footnote{In
the Isabelle mailing list a thread ``[isabelle] A course based on
Isabelle/HOL and some feedback...''  discusses respective
experiences.}. TP-PL will extend the
scope of Isabelle's code generator to programs capable of implementing
algorithms solving realistic engineering problems.

So there is the expectation that this kind of motivation will carry
over to working engineers, such that the benefits from automation in
final coding and other advantages will dominate over the hassle with
logical rigor.
\end{itemize}
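As an illustration of the specifications referred to above, the
equation solving used in the running example might be specified roughly
as follows. The notation is hypothetical: TP-PL's concrete
specification syntax, and all identifiers shown, are themselves design
tasks of the project and not part of any existing Isabelle package.
\begin{verbatim}
(* Hypothetical sketch of a TP-PL specification; all identifiers
   are illustrative, not established syntax. *)
specification solve_system :
  input  : equation list * variable list
  output : solution list
  pre    : independent eqs  /\  length eqs = length vars
  post   : forall s in sols. s solves eqs
\end{verbatim}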

The above variety of particular advancements indicates TP-PL's
substantial contribution to the discipline of Formal Methods.  The
integration of all of these features advances over comparable systems
with related features as discussed in \S\ref{aims-news}.

} %4 pages
A review of leading logical frameworks led to the decision for
Isabelle as the conceptual and technological base for prototyping a
programming language in the context of deduction.

Concerning the conceptual base, a principal question is why the
proposal does {\em not} opt for constructive logics as represented
by Coq~\cite{coq-team-10,coq-logics-93} or
Agda~\cite{agda-wiki,adga-phd-07}. Constructive logics most closely
integrate computation and deduction (due to the Curry-Howard
isomorphism~\cite{TODO-curry-howard}), and this proposal concerns the
integration of computation and deduction. The reason is the following:
TP-PL extends the computation presently available, whereas in Coq or
Agda such an extension would affect the core of these systems, enlarge
the amount of trusted code and thus reduce trust in the resulting
programming language.

With the decision for Isabelle in mind, the prototype shall be
developed by different methods with respect to the three aims from
p.\pageref{aims}, described in the first three sections. The fourth
section \S\ref{meth-usability} concerns a test of concept for the
prototype.

\subsubsection{Design of language, semantics and
interpreter}\label{meth-language}
This section concerns the work of one PhD. The work starts with
research of the state-of-the-art, the major part is design of novel
software concepts, and a modest amount of implementation is included.

\paragraph{Review existing FM tools and programming in TP:}
Theorem provers which support programming shall be reviewed, in
particular Coq, Agda, PVS and ACL2, as well as classical FM tools such
as VDM, the B-method, \dots

\paragraph{Extend Isabelle's function package with modules:}
Isabelle provides all prerequisites for a shallow embedding of the
programming language, TP-PL. The implementation of TP-PL
will be a {\em definitional extension}: it will introduce all definitions
by reducing them to simpler forms that can be processed by existing
means.  The original specification is then {\em derived} from the
primitive definition by an automated proof procedure. Since all
reasoning is performed within the theorem prover, definitional
extensions are conservative by construction, and thus offer a maximum
of safety.
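For illustration, a minimal sketch of such a definitional extension in
current Isabelle/HOL follows. The function name, the lemma name and the
concrete proof method are assumptions for the sake of the example, not
established TP-PL material: a function is introduced via the function
package, and its specification is {\em derived} as a theorem.
\begin{verbatim}
(* Minimal sketch in Isabelle/HOL: a definitional extension via
   the function package; the derived theorem recovers the
   specification. Names and proof method are illustrative only. *)
fun solve_linear :: "real => real => real" where
  "solve_linear a b = - b / a"

lemma solve_linear_correct:  (* derived specification *)
  "a ~= 0 ==> a * solve_linear a b + b = 0"
  by (simp add: field_simps)
\end{verbatim}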

The requirement for large programs assembling a multitude of functions
raises the need to generalise concepts and code of the function
package.  Large programs also require modularisation. As seen from the
state-of-the-art before the start of the project, it is not clear,
whether this kind of {\em definitional extension} can be maintained
for modularisation, i.e. for programs accompanied with a theory, a
specification (in/output-items, pre/post-condition) and an algorithm.

\paragraph{Research and design a semantics for TP-PL:} As a first
approach to a formal semantics for TP-PL we shall take the equational
semantics in terms of higher-order rewrite systems. Relating the
equational semantics to an operational semantics for the interpreter
seems simpler than for an imperative language like Java. In case TP-PL
suggests leaving the equational semantics, spadework by
\cite{nipkow-06:java-like} shall be re-used.

The respective virtual machine model will be adapted to capture the
contexts mentioned above together with an environment:

\begin{tabbing} 123\=$|\;$\=\kill
\>${\it type}\;{\it istate} = {\it ctxt}\times{\it env}$
\end{tabbing}

The design of a big-step semantics can re-use many details from
\cite{imperative-functional-08}, for instance the respective
description in relational form: $(h,h^\prime,r) \in [[c]]$ holds iff
the computation $c$ started on heap $h$ does not generate an exception
and yields the result $r$ and the updated heap $h^\prime$. ``Heap''
refers to Haskell's state monad used for imperatively mutable arrays,
adapted as follows:
\begin{tabbing} 123\=$\mathbf{datatype}\;{\it istate}\;{\it Heap}$xx\=\kill
\>$\mathbf{datatype}\;{\it istate}\;{\it Heap}$\>$= {\it Heap}\;({\it
heap}\Rightarrow({\it istate}+{\it exception})\times{\it heap})$\\
\>${\it heap}\;f$\>$= {\it Heap}\;(\lambda h.\;{\it let}\;(x,h^\prime)=
f\;h\;{\it in}\;({\it Inl}\;x,h^\prime))$\\
\>${\it execute}\;({\it Heap}\;f)$\>$= f$
\end{tabbing}
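From these definitions one can, for instance, immediately read off the
behaviour of ${\it heap}$ under ${\it execute}$ (a simple consequence,
stated here for orientation):
\begin{tabbing} 123\=\kill
\>${\it execute}\;({\it heap}\;f)\;h = ({\it Inl}\;x,\;h^\prime)
\quad\textrm{where}\;(x,h^\prime)=f\;h$
\end{tabbing}
i.e. ${\it heap}$ always yields a regular result ${\it Inl}\;x$ and
never an exception.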

Based on this monad, the big-step semantics is defined as

\begin{tabbing} 123\=$|\;$\=\kill
\>$(h,h^\prime,r) \in [[c]] = (({\it Inl}\;r,h^\prime) = {\it execute}\;c\;h)$
\end{tabbing}

Given this semantics one can prove rules which connect this relation
to the different basic commands. The bytecode verifier is modeled in a
very abstract framework using a semilattice, which hides all the
technical details of the virtual machine. Later, the ``real thing''
can be obtained by instantiation.
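For instance, rules of roughly the following shape should be derivable
for the basic commands ${\it return}$ and bind; this is a sketch, the
exact formulation is to be worked out in the project:
\[
(h,h,x)\in[[{\it return}\;x]]
\qquad\qquad
\frac{(h,h^\prime,x)\in[[c]]\qquad (h^\prime,h^{\prime\prime},r)\in[[f\;x]]}
     {(h,h^{\prime\prime},r)\in[[c\mathbin{>\!\!>\!\!=}f]]}
\]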

The above \textit{istate} will in particular comprise environments
\textit{env$_i$} and (logical) contexts \cite{isar-impl}
\textit{ctxt$_i$}; for step-wise interpretation of the running
example's program these might look as shown on p.\pageref{expl-ctxt}.

\paragraph{Integrate language and interpreter into Isabelle's code:}
This integration is the major task in the proposed project; so this
task is planned to be done in joint efforts with \S\ref{meth-integrate}.

\subsubsection{Research verification of Computer Algebra and implement
CA functions}\label{meth-ca}
This section concerns the work of a second PhD. The work starts with
research of the state-of-the-art, combines and integrates concepts
from TP and CA and involves a modest amount of implementation.

\paragraph{Research state-of-the-art in verification of software and CA:}

FoCal, Axiom, MiniMaple \cite{mini-maple-12}

\paragraph{Review Isabelle's knowledge and select additional CA
functions:}
%TODO redo this paragraph from scratch
Isabelle's deductive knowledge is considered a
sufficient base for developing engineering applications.

However, the only advanced engineering algorithm presently covered by
Isabelle is the Fast Fourier Transform over the complex numbers, and
its inverse (using Vandermonde matrices).

But even elementary-looking CA functions are not trivial: how can one
ensure that no solution is lost in equation solving (in
transcendental equations, in differential equations, etc.)?

\paragraph{Design a structure for related specifications} as they occur
at least in equation solvers: CA systems have an implicit~(!)
hierarchy of types of equations, which is
mechanically searched for a match with the input equation until the
appropriate algorithm for solving the equation is found.

TP-PL will make the specifications explicit, and some kind of
hierarchy will be provided for mechanical search. There are open
questions, for instance how to nicely treat similar equations
(e.g. ``linear equations'') over different domains: integers, reals,
complex numbers and vector spaces.
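A deliberately naive sketch of such an explicit hierarchy follows; all
names are hypothetical and serve only to fix the idea of equation
classes that are searched top-down for a matching solver
specification:
\begin{verbatim}
(* Hypothetical sketch: an explicit hierarchy of equation types;
   a solver specification is found by searching the hierarchy for
   the first class matching the input equation. *)
datatype domain = Int | Real | Complex | VectorSpace

datatype eqn_class =
    Linear domain          (* a*x + b = 0  over domain      *)
  | Polynomial domain nat  (* degree-n polynomial equations *)
  | Transcendental
  | Differential

(* signature only; the matching function is a design task *)
consts matching_solver ::
  "equation => eqn_class list => solver option"
\end{verbatim}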

\paragraph{Distribute implementation in TP-PL to sub-projects:}
The CA functions are implemented using TP-PL, so the respective
sub-projects can start only after the interpreter of TP-PL has
achieved sufficient usability.

The achievement of this usability is the critical point on the
critical path on the time-line to the field-test
\S\ref{meth-usability}.  Thus two teams of students are planned for
years 2/3, one at the institutes of mathematics at TU Graz and one at
RISC Linz.

\subsubsection{Integrate language interpreter and Computer
Algebra}\label{meth-integrate}
This section concerns the work for the Postdoctoral Research Fellow.
The PostDoc must be a former member of the Isabelle developer team,
for the following reasons:

Isabelle is an experimental software system with 1.5 million LoC under
rapid development by a team of 5--10 PhDs and core staff. The code is
the documentation (as usual for well-structured functional code); the
repository provides documentation for dependencies between components
in connection with particular changes in the code. The only reliable
written documentation is \cite{isar-impl}.  Thus becoming able to
contribute high-quality code to Isabelle requires several years; a
novice to Isabelle development would be inefficient in the proposed
project.

\paragraph{Integrate the interpreter into Isabelle:} This integration
requires deeply delving into Isabelle's concepts and sources, for
instance, code generator, contexts, proof state and document model.

In order to give an idea of the interpreter's internal state,
comprising a traditional environment together with logical context,
the following states relate to the program statements in the running
example's program on p.\pageref{expl-program}.
% cp from material/texts/present/100708-plmms.tex
{\footnotesize\it %formulas copied from present/090130-belgrade.tex
\begin{tabbing}
  1234\=$01$xx\=env$_0$ = \{\=ctxt$_0$ $\cup$ 12\=\kill
  \>$01$\>env$_0$ = \{$($l , L$)$, $($ q , $q_0)$, $($ v , x$)$, $($ b , y$\:x)$ ,\\
  \>    \>\>$($rb , $\,[y\:0 = 0\land y^\prime\:0 = 0\:\land\:V\:0 = q_0 . L\:\land\;M_b\:L = 0])$\}\\
  \>    \>ctxt$_0$ = \{ $q_0\; {\it is\_integrable\_in}\; x\;,\; L > 0$ \}\\
  \>$04$\>env$_1$ = env$_0$ $\cup$ \{$($funs , $[V x = c - q_0\cdot x \;, M_b\: x = c_2 + c\cdot x - \frac{q_0}{2}\cdot x^2\;,$\\
  \>    \>\>\>$y^\prime\:x = c_3 - \frac{1}{EI}\cdot(c_2 + \frac{c}{2}\cdot x^2 - \frac{q_0}{6}\cdot x^3) \;,$\\
  \>    \>\>\>$y\:x = c_4 + c_3\cdot x - \frac{1}{EI}\cdot(\frac{c_2}{2}\cdot x^2 + \frac{c}{6}\cdot x^3 - \frac{q_0}{24}\cdot x^4)]$\}\\
  \>    \>ctxt$_1$ = ctxt$_0$ $\cup$ \{ $q_0\; {\it is\_integrable\_in}\; x \;,\; y^{\prime\prime}\:x = -\frac{1}{\mathit{EI}}\cdot M_b\:x \;,$\\
  \>    \>\>\>$y^{\prime\prime\prime}\:x = -\frac{1}{\mathit{EI}}\cdot V\:x  \;,\; y^{\prime v}\:x =  q_1\:x$ \}\\
  \>$07$\>env$_2$ = env$_1$ $\cup$ \{$($equs , $[L\cdot q_0 = c\;,\; 0=c_2 + 2\cdot L\cdot c - \frac{L^2\cdot q_0}{2}\;,$\\
  \>    \>\>\>$0=c_4\;,\; 0=c_3 ])$\}\\
  \>    \>ctxt$_2$ = ctxt$_1$ $\cup$ \{pre-condition skipped , post-condition skipped\}\\
  %\>$14$\>env$_3$ = env$_2$ $\cup$ \{$($sols , $[c=L\cdot q_0\;,\;c_2=\frac{-L^2\cdot q_0}{2}\;,\; c_3=0\;,\;c_4=0])$\}\\
  %\>    \>ctxt$_3$ = ctxt$_2$ $\cup$ \{pre-condition skipped , $\forall s\in \mathit{sols}\,.\; \mathit{Substitute}\;\mathit{sols}\;s\}$\\
  %\>$15$\>env$_4$ = env$_3$ $\cup$ \{$($B , $y\:x = c_4 + c_3\cdot x - \frac{1}{EI}\cdot(\frac{c_2}{2}\cdot x^2 + \frac{c}{6}\cdot x^3 - \frac{q_0}{24}\cdot x^4)$\}\\
  %\>    \>ctxt$_4$ = ctxt$_3$ $\cup$ \{\}\\
  \>$09$\>env$_5$ = env$_4$ $\cup$ \{$($B , $y\:x = (0) + (0)\cdot x - \frac{1}{EI}\cdot(\frac{\left(\frac{-L^2\cdot q_0}{2}\right)}{2}\cdot x^2 + $\\
  \>    \>\>\>$\frac{\left(L\cdot q_0\right)}{6}\cdot x^3 - \frac{q_0}{24}\cdot x^4)$\}\\
  \>    \>ctxt$_5$ = ctxt$_4$ $\cup$ \{\}\\
  \>$10$\>env$_6$ = env$_5$ $\cup$ \{$($B , $y\:x= \frac{q_0\cdot L^2}{ 4\cdot{\it EI}}\cdot x^2
                    -\frac{q_0\cdot L  }{ 6\cdot{\it EI}}\cdot x^3
                    +\frac{q_0         }{24\cdot{\it EI}}\cdot x^4)$\}\\
  \>    \>ctxt$_6$ = ctxt$_5$ $\cup$ \{ $\neg(y\:x\;\mathit{contains}\;\{c,c_2,c_3,c_4\})$ \}
\end{tabbing}
%\>ctxt$_i$ = env$_i$ $\cup$ ctxt$_i$
}
The numbers on the left refer to the numbers in the example program on
p.\pageref{expl-program}.

\paragraph{Supervise integration of CA into Isabelle:}

\paragraph{Extend Isabelle/jEdit as mini-IDE for TP-PL:} As mentioned,
Isabelle is under rapid development; this is particularly true for the
new front-end based on jEdit, which runs on the Java Virtual Machine
and thus opens TP to the world outside Linux, Emacs and single/local
processing.

The conception of TP-PL requires some functionality in addition to
present Isabelle/jEdit: programming in TP-PL involves search and
management of collections of specifications and of algorithms. Also
dynamic views of the interpreter state are planned for debugging.

\paragraph{Assist the project leader} in particular with decisions
related to Isabelle's internals and in cooperation with the institute
engaged in the proof of concept according to \S\ref{meth-usability}.

There are also several supervision tasks: assistance for the students
performing the proof of concept (who can only be expected to
accomplish specifying and programming in TP-PL, but not proving
theorems) as well as supervision of the students engaged in
sub-projects on Computer Algebra according to \S\ref{meth-ca}.

\subsubsection{Proof of concept in a selected exemplary engineering
science}\label{meth-usability} As soon as the language definition has
been clarified according to \S\ref{meth-language}, topics and problems
for programming can be selected. The selection requires cooperation
with \S\ref{meth-ca}: the CA functions required for programming
solutions for the selected problems must be developed; there might be
constraints on the selection from \S\ref{meth-ca}.

\paragraph{Implement engineering problems in TP-PL:} Implementation is
done at the selected institute. In order to make the work efficient, a
team of students shall be gathered, where each individual works
on his or her thesis independently while sharing experience with
development in TP-PL.

The implementation work comprises development of definitions and theorems
(with decisive support in proving), of specifications and of programs.

\paragraph{Compile guidelines for TP-PL programmers} is planned as a
by-product of supervising the students working in the proof of
concept. This task requires coordination with all other tasks
\S\ref{meth-language}, \S\ref{meth-ca} and \S\ref{meth-integrate}.

\subsection{Work plan, time plan and dissemination plan
} %1page
\subsection{Cooperations} %1 page
\section{Human resources} %1 page
\subsection{Scientific qualifications of the scientists involved}
\subsection{Importance of the project for the career development}
\section{Potential additional aspects
} %1 page
\subsection{Implications for other branches of science} All sciences'
formal parts are addressed by Formal Methods (FM) in principle, but as
mentioned in \S\ref{aims-results}, not even the technology oriented
sciences widely use Formal Methods. An adjacent discipline, ``Systems
Modelling'' (SM), tries to bridge the gap between graphic-oriented
methods (traditionally from UML etc.) and more formal methods --- and
also there the former still seem to surpass the latter.

% Software and Systems Modeling
% Volume 1 / 2002 - Volume 11 / 2012
% ... editorial !

Most recent advancements on the graphical side are ``System
Modelers'' based on
CA~\footnote{,}: these provide rendering and
animation such that the engineers' intuition is addressed and
iterations in development can be driven by such intuition.

\medskip However, experiences from FM
demonstrate~\cite{FME-97,FME-96} that formal rigor surpasses human
intuition the more complex systems become. So what seems urgently
needed is
\begin{center}
{\it FM} + SM = {\it F}SM $\;$ ({\it Formal} Systems Modelling)
\end{center}
i.e. underpinning the tools for SM with formal rigor from FM and TP
such that critical features of systems can be rigorously verified.

By extending computational expressiveness while maintaining logical
rigor, the proposed TP-based programming language might
\textit{supply a missing link at a crucial point}: the prototype
shall demonstrate potential to gradually substitute CA-based languages
within all the upcoming tools in (F)SM.

\subsection{Effects that will have implications beyond the field} With
respect to the essential role of Formal Methods for reliable systems,
there is hope for effects on the general attitude towards science and
technology.

Such effects are desirable for experts and developers as well as for
the public. The proposed TP-PL specifically addresses engineers and
will first be applied in engineering science education according to
\S\ref{aims}.  Cost reduction by FM in development of complex systems
is widely acknowledged.

TP-PL goes hand in hand with the development of formal domain knowledge
as shown in \S\ref{aims-results}. Such domain knowledge is developed
{\em preceding} the actual development of systems, crucial systems'
features become evident early, and widespread adoption of FM in
technology standards is desirable.

All this enables the public to become involved earlier in technology
impact assessment --- and the project consortium hopes that TP-PL can
provide the right impulse at the right point at the right time.

\section{Financial aspects
} %1 page
\subsection{Information on the research institution}
%- available personnel not financed by the FWF (usually the project
%  leader and co-applicants);
%- available infrastructure:

\subsection{Information on the support requested}
%- concise justification for the personnel requested (type of
%  position(s), description of nature of work, length and extent of
%  involvement in the project)
%- concise justification for the non-personnel costs (equipment,
%  material, travel and other costs). If equipment is requested,
%  applicants must specify why this does not represent part of the
%  available infrastructure (see also Appendix 2.3).

\section{CVs and publication lists}
\subsection{Wolfgang Schreiner}
\subsection{Franz Winkler}
\subsection{Amine Chaieb}