Trends in Functional Programming Volume 6

by Marko van Eekelen (Editor)



Overview

This book presents the latest research developments in the area of functional programming. The contributions in this volume cover a wide range of topics, from theory and formal aspects of functional programming, through transformational and generic programming, to type checking and the design of new classes of data types.

Not all papers in this book are research papers: the volume also includes project description papers (written at the start of a project) and project evaluation papers (written at the end of a project).

Particular trends in this volume are:

  • software engineering techniques such as metrics and refactoring for high-level programming languages;
  • generation techniques for data type elements as well as for lambda expressions; 
  • analysis techniques for resource consumption when high-level programming languages are used for embedded systems;
  • widening and strengthening of the theoretical foundations.

 


Product Details

ISBN-13: 9781841509907
Publisher: Intellect Books
Publication date: 06/01/2005
Series: Trends in Functional Programming
Sold by: Barnes & Noble
Format: eBook
Pages: 240
File size: 2 MB

About the Author

Dr. Marko van Eekelen is an associate professor in the Security of Systems Department of the Institute for Computing and Information Sciences, Radboud University Nijmegen.

Read an Excerpt

Trends in Functional Programming Volume 6


By Marko van Eekelen

Intellect Ltd

Copyright © 2007 Intellect
All rights reserved.
ISBN: 978-1-84150-990-7



CHAPTER 1

Best Student Paper: A New Approach to One-Pass Transformations

Kevin Millikin


Abstract: We show how to construct a one-pass optimizing transformation by fusing a non-optimizing transformation with an optimization pass. We state the transformation in build form and the optimization pass in cata form, i.e., as a catamorphism; and we use cata/build fusion to combine them. We illustrate the method by fusing Plotkin's call-by-value and call-by-name CPS transformations with a reduction-free normalization function for the λ-calculus, thus obtaining two new one-pass CPS transformations.


1.1 INTRODUCTION

Compiler writers often face a choice between implementing a simple, non-optimizing transformation pass that generates poor code which will require subsequent optimization, and implementing a complex, optimizing transformation pass that avoids generating poor code in the first place. A two-pass strategy is compelling because it is simpler to implement correctly, but its disadvantage is that the intermediate data structures can be large and traversing them unnecessarily can be costly. In a system performing just-in-time compilation or run-time code generation, the costs associated with a two-pass compilation strategy can render it impractical. A one-pass optimizing transformation is compelling because it avoids generating intermediate data structures requiring further optimization, but its disadvantage is that the transformation is more difficult to implement.

The specification of a one-pass transformation is that it is extensionally equal to the composition of a non-optimizing transformation and an optimization pass. A one-pass transformation is not usually constructed this way, however, but is instead constructed as a separate artifact which must then be demonstrated to match its specification. Our approach is to construct one-pass transformations directly, as the fusion of passes via shortcut deforestation [GLJ93, TM95], thus maintaining the explicit connection to both the non-optimizing transformation and the optimization pass.

Shortcut deforestation relies on a simple but powerful program transformation rule known as cata/build fusion. This rule requires both the transformation and optimization passes to be expressed in a stylized form. The first pass, the transformation, must be written as a build, abstracted over the constructors of its input. The second pass, the optimization, must be a catamorphism, defined by compositional recursive descent over its input.

The non-optimizing CPS transformation generates terms that contain administrative redexes which can be optimized away by β-reduction. A one-pass CPS transformation [DF90, DF92] generates terms that do not contain administrative redexes, in a single pass, by contracting these redexes at transformation time. Thus β-reduction is the notion of optimization for the CPS transformation. The normalization function we will use for reduction of CPS terms, however, contracts all β-redexes, not just administrative redexes. In Section 1.6 we describe how to contract only the administrative redexes.

When using a metalanguage to express normalization in the object language, as we do here, the evaluation order of the metalanguage is usually important. However, because CPS terms are insensitive to evaluation order [Plo75], evaluation order is not a concern.


This work. We present a systematic method to construct a one-pass transformation, based on the fusion of a non-optimizing transformation with an optimization pass. We demonstrate the method by constructing new one-pass CPS transformations as the fusion of non-optimizing CPS transformations with a catamorphic normalization function.

The rest of the paper is organized as follows. First, we briefly review catamorphisms, builds, and cata/build fusion in Section 1.2. Then, in Section 1.3 we restate Plotkin's call-by-value CPS transformation [Plo75] with build, and in Section 1.4 we restate a reduction-free normalization function for the untyped λ-calculus to use a catamorphism. We then present a new one-pass CPS transformation obtained by fusion, in Section 1.5. In Section 1.6 we describe how to modify the transformation to contract only the administrative redexes. We compare our new CPS transformation to the one-pass transformation of Danvy and Filinski [DF92] in Section 1.7. In Section 1.8 we repeat the method for Plotkin's call-by-name CPS transformation. We present related work and conclude in Section 1.9.


Prerequisites. The reader should be familiar with reduction in the λ-calculus, and the CPS transformation [Plo75]. Knowledge of functional programming, particularly catamorphisms (i.e., the higher-order function fold) [MFP91] is expected. We use a functional pseudocode that is similar to Haskell.


1.2 CATA/BUILD FUSION FOR λ-TERMS

The familiar datatype of λ-terms is defined by the following context-free grammar (assuming the metavariable x ranges over a set Ident of identifiers):

Term ∋ m ::= var x | lam x m | app m m

A catamorphism [GLJ93, MFP91, TM95] (or fold) over λ-terms captures a common pattern of recursion. It recurs on all subterms and replaces each of the constructors var, lam, and app in a λ-term with functions of the appropriate type. We use the combinator foldλ, with type ∀A. (Ident → A) → (Ident → A → A) → (A → A → A) → Term → A, to construct a catamorphism over λ-terms:

foldλ vr lm ap (var x)     = vr x
foldλ vr lm ap (lam x m)   = lm x (foldλ vr lm ap m)
foldλ vr lm ap (app m0 m1) = ap (foldλ vr lm ap m0) (foldλ vr lm ap m1)

We use the combinator buildλ to systematically construct λ-terms. It takes a polymorphic function f which uses arbitrary functions (of the appropriate types) instead of the λ-term constructors to transform an input into an output, and then applies f to the λ-term constructors, producing a function that transforms an input into a λ-term. It has type ∀A. (∀B. (Ident → B) → (Ident → B → B) → (B → B → B) → A → B) → A → Term:

buildλ f = f var lam app

Cata/build fusion [GLJ93, TM95] is a simple program transformation that fuses a catamorphism with a function that produces its output using build. For λ-terms, cata/build fusion consists of the rewrite rule:

(foldλ vr lm ap) ∘ (buildλ f) ↪ f vr lm ap

The fused function produces its output without constructing intermediate data structures.
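
To make the combinators concrete, here is a minimal Haskell sketch of the datatype and the two combinators. The names Term, foldLam, and buildLam are ours (the paper writes foldλ and buildλ); the sketch assumes GHC's RankNTypes extension.

{-# LANGUAGE RankNTypes #-}

type Ident = String

-- The datatype of λ-terms from the grammar above.
data Term = Var Ident          -- var x
          | Lam Ident Term     -- lam x m
          | App Term Term      -- app m m
  deriving Show

-- Catamorphism: replace each constructor with a function of the
-- appropriate type, recurring on all subterms.
foldLam :: (Ident -> a) -> (Ident -> a -> a) -> (a -> a -> a) -> Term -> a
foldLam vr lm ap = go
  where
    go (Var x)     = vr x
    go (Lam x m)   = lm x (go m)
    go (App m0 m1) = ap (go m0) (go m1)

-- Build: apply a constructor-abstracted producer to the real constructors.
buildLam :: (forall b. (Ident -> b) -> (Ident -> b -> b) -> (b -> b -> b) -> a -> b)
         -> a -> Term
buildLam f = f Var Lam App

-- Cata/build fusion is then the (unchecked) rewrite rule:
--   foldLam vr lm ap . buildLam f  ↪  f vr lm ap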


1.3 THE CALL-BY-VALUE CPS TRANSFORMATION USING BUILD

The non-optimizing call-by-value CPS transformation [Plo75] is given in Figure 1.1. We assume the ability to choose fresh identifiers when needed; the identifiers k, v0, and v1 are chosen fresh.

Fusion with a catamorphic normalization function requires that the transformation is written using build, i.e., parameterized over the constructors used to produce its output. The transformation using build thus constructs a Church encoding of the original output. The non-optimizing transformation in build form is shown in Figure 1.2. As before, the identifiers k, v0, and v1 are chosen fresh.

The transformation is non-optimizing because it produces terms that contain extraneous administrative redexes. Transforming the simple term λx. λy. y x (written with the usual notation for λ-terms) produces the term

[MATHEMATICAL EXPRESSION OMITTED]

containing administrative redexes (for simplicity, we have used only a single continuation identifier).
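
Figure 1.2 itself is not reproduced in this excerpt. The following Haskell sketch (continuing the definitions after Section 1.2) shows one plausible rendering of the transformation in build form. Where the paper chooses genuinely fresh identifiers, we use the fixed names "k", "v0", and "v1" and assume they do not occur in source terms, mirroring the single-continuation-identifier simplification above.

-- Producer for the non-optimizing call-by-value CPS transformation,
-- abstracted over the output constructors.
fCps :: (Ident -> b) -> (Ident -> b -> b) -> (b -> b -> b) -> Term -> b
fCps vr lm ap = go
  where
    -- [[x]] = λk. k x
    go (Var x)     = lm "k" (ap (vr "k") (vr x))
    -- [[λx.m]] = λk. k (λx. [[m]])
    go (Lam x m)   = lm "k" (ap (vr "k") (lm x (go m)))
    -- [[m0 m1]] = λk. [[m0]] (λv0. [[m1]] (λv1. v0 v1 k))
    go (App m0 m1) =
      lm "k" (ap (go m0)
                 (lm "v0" (ap (go m1)
                              (lm "v1" (ap (ap (vr "v0") (vr "v1"))
                                           (vr "k"))))))

-- The non-optimizing transformation is the producer applied to the
-- real constructors.
cbvCps :: Term -> Term
cbvCps = buildLam fCps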


1.4 A CATAMORPHIC NORMALIZATION FUNCTION

Normalization by evaluation (NBE) is a reduction-free approach to normalization that is not based on the transitive closure of a single-step reduction function. Instead, NBE uses a non-standard evaluator to map a term to its denotation in a residualizing model. The residualizing model has the property that a denotation in the model can be reified into a syntactic representation of a term, and that reified terms are in normal form. A reduction-free normalization function is then constructed as the composition of evaluation and reification.

NBE has been used in the typed λ-calculus, combinatory logic, the free monoid, and the untyped λ-calculus [DD98]. We adopt the traditional normalization function for the untyped λ-calculus as our optimizer for CPS terms. We show it in Figure 1.3. Just as in the CPS transformation, we assume the ability to choose a fresh identifier when needed. We note, however, that our approach works equally well with other methods of name generation such as using de Bruijn levels or threading a source of fresh names through the evaluator. We opt against the former approach here because we want to compare our resulting one-pass transformation with existing one-pass transformations. The latter approach goes through without a hitch, but we opt against it here because the extra machinery involved with name generation distracts from the overall example without contributing anything essential.

The normalization function maps terms to their β-normal form. Normal forms are given by the grammar for Norm in Figure 1.3. Elements Val of the residualizing model are either atoms (terms that are not abstractions, given by the grammar for Atom), or else functions from Val to Val.

Environments are somewhat unusual in that the initial environment maps each identifier to itself as an element of the residualizing model, which allows us to handle open terms:

[MATHEMATICAL EXPRESSION OMITTED]

Abstractions denote functions from Val to Val. The recursive function reify (↓) extracts a normal term from an element of the residualizing model. The function apply dispatches on the value of the operator of an application to determine whether to build a residual atom or to apply the function. Normalization is then the composition of evaluation (in the initial environment) followed by reification.

Because the evaluation function is compositional, we can rewrite it as a catamorphism over λ-terms, given in Figure 1.4. The domains of terms, atoms, values, and environments do not change, nor do the auxiliary functions ↓ and apply.
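
Figures 1.3 and 1.4 are likewise omitted from this excerpt. The sketch below reconstructs a catamorphic normalizer in the same Haskell style. Two liberties, both ours: fresh names come from a de Bruijn-level counter threaded through reification (an alternative the paper mentions but opts against for presentation reasons), and the Norm/Atom grammars are conflated with Term for brevity; the prefix "_x" is assumed absent from source identifiers.

-- The residualizing model: an atom (a term-in-waiting, indexed by the
-- next fresh-name level) or a function from Val to Val.
data Val = Atom (Int -> Term)
         | Fun (Val -> Val)

type Env = Ident -> Val

-- Initial environment: each identifier denotes itself, which is what
-- lets the normalizer handle open terms.
env0 :: Env
env0 x = Atom (\_ -> Var x)

-- Constructor-replacement functions for the catamorphic evaluator.
vrE :: Ident -> Env -> Val
vrE x env = env x

lmE :: Ident -> (Env -> Val) -> Env -> Val
lmE x m env = Fun (\v -> m (\y -> if y == x then v else env y))

apE :: (Env -> Val) -> (Env -> Val) -> Env -> Val
apE m0 m1 env = apply (m0 env) (m1 env)

-- apply dispatches on the operator: build a residual atom, or apply.
apply :: Val -> Val -> Val
apply (Fun f)  v = f v
apply (Atom a) v = Atom (\n -> App (a n) (reify n v))

-- Reification extracts a β-normal term from a value, generating a
-- fresh identifier at each binder.
reify :: Int -> Val -> Term
reify n (Atom a) = a n
reify n (Fun f)  = Lam x (reify (n + 1) (f (Atom (\_ -> Var x))))
  where x = "_x" ++ show n

-- The evaluator as a catamorphism, and normalization as evaluation
-- followed by reification.
evalTerm :: Term -> Env -> Val
evalTerm = foldLam vrE lmE apE

norm :: Term -> Term
norm m = reify 0 (evalTerm m env0)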

Using this normalization function to normalize the example term from Section 1.3 produces [MATHEMATICAL EXPRESSION OMITTED], where all the β-redexes have been contracted (and fresh identifiers have been generated for all bound variables).


1.5 A NEW ONE-PASS CALL-BY-VALUE CPS TRANSFORMATION

We fuse the non-optimizing CPS transformation buildλ f : Term → Term of Section 1.3 and the catamorphic evaluation function foldλ vr lm ap : Term → Env → Val of Section 1.4 to produce a one-pass transformation from λ-terms into the residualizing model. This one-pass transformation is simply f vr lm ap : Term → Env → Val. We then extract β-normal forms from the residualizing model by applying to the initial environment and reifying, as before.

Inlining the definitions of f, vr, lm and ap, performing β-reduction, and simplifying environment operations (namely, replacing environment applications that yield a known value with their value and trimming bindings that are known to be unneeded) yields the simplified specification of the one-pass transformation shown in Figure 1.5. The domains of normal terms, atoms, values, and environments as well as the auxiliary functions ↓ and apply are the same as in Figure 1.3.

We have implemented this one-pass transformation in Standard ML and Haskell, letting the type inferencer act as a theorem prover to verify that the transformation returns a β-normal form if it terminates [DRR01].
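
In the same spirit, the toy sketches above compose by cata/build fusion into a one-pass transformation: fusing evalTerm . cbvCps, i.e. foldLam vrE lmE apE . buildLam fCps, yields fCps vrE lmE apE, so no intermediate Term is ever built. (This sketches the idea only, not the paper's simplified specification of Figure 1.5.)

onePassCps :: Term -> Term
onePassCps m = reify 0 (fCps vrE lmE apE m env0)

For example, onePassCps (Lam "x" (Lam "y" (App (Var "y") (Var "x")))) produces the β-normal CPS term directly.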


1.6 SUPPRESSING CONTRACTION OF SOURCE REDEXES

Compared to traditional one-pass CPS transformations, our transformation is overzealous. The normalization function we use contracts all β-redexes; it cannot tell which ones are administrative redexes. Therefore our CPS transformation does not terminate for terms that do not have a β-normal form (e.g., (λx.x x) (λx.x x)). Of course, if we restricted the input to simply-typed λ-terms, then the transformation would always terminate because the corresponding normalization function does.

We can modify the new CPS transformation to contract only the administrative redexes. We modify the datatype of intermediate terms (and the associated catamorphism operator) to contain two types of applications, corresponding to source and administrative redexes. This is an example of a general technique of embedding information known to the first pass in the structure of the intermediate language, for use by the second pass.

Term ∋ m ::= var x | lam x m | app m m | srcapp m m

We then modify the non-optimizing CPS transformation to preserve source applications (by replacing app (var v0) (var v1) with srcapp (var v0) (var v1) in the clause for applications), and we modify the normalization function (to always reify both the operator and the operand of source applications). The datatype of normal forms now includes source redexes:

Norm ∋ n ::= atomN a | lamN x n

Atom ∋ a ::= varN x | appN a n | srcappN n n

The result of fusing the modified call-by-value CPS transformation with the modified normalization function is shown in Figure 1.6. Again, the domains of values and environments, and the auxiliary functions ↓ and apply are the same as in Figure 1.3.
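
In the running Haskell sketch, this modification amounts to one extra constructor, one changed clause in the producer (srcapp in place of the final app in the application clause), and one extra constructor-replacement function that always residualizes. A minimal fragment, with primed names (ours) to keep it separate from the earlier definitions, and with the Norm/Atom split again conflated into the term type:

-- Intermediate terms now distinguish administrative from source
-- applications.
data Term' = Var' Ident
           | Lam' Ident Term'
           | App' Term' Term'        -- administrative application
           | SrcApp' Term' Term'     -- source application

data Val' = Atom' (Int -> Term') | Fun' (Val' -> Val')
type Env' = Ident -> Val'

reify' :: Int -> Val' -> Term'
reify' n (Atom' a) = a n
reify' n (Fun' f)  = Lam' x (reify' (n + 1) (f (Atom' (\_ -> Var' x))))
  where x = "_x" ++ show n

-- Administrative applications may still be contracted ...
apply' :: Val' -> Val' -> Val'
apply' (Fun' f)  v = f v
apply' (Atom' a) v = Atom' (\n -> App' (a n) (reify' n v))

-- ... but source applications always reify both operator and operand,
-- so source redexes are never contracted.
srcApE :: (Env' -> Val') -> (Env' -> Val') -> Env' -> Val'
srcApE m0 m1 env =
  Atom' (\n -> SrcApp' (reify' n (m0 env)) (reify' n (m1 env)))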

Given the term from Section 1.3, the modified transformation produces

[MATHEMATICAL EXPRESSION OMITTED]

(i.e., it does not contract the source redex).


1.7 COMPARISON TO DANVY AND FILINSKI'S ONE-PASS CPS TRANSFORMATION

Danvy and Filinski [DF92] obtained a one-pass CPS transformation by anticipating which administrative redexes would be built and contracting them at transformation time. They introduced a binding-time separation between static and dynamic constructs in the CPS transformation (static constructs are represented here by metalanguage variables, abstractions, and applications; and dynamic constructs by the constructors var, lam, and app). Static β-redexes are contracted at transformation time and dynamic redexes are residualized. We present their transformation in Figure 1.7.
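
Figure 1.7 is not included in this excerpt. For orientation, the following is a standard rendering of a Danvy/Filinski-style one-pass call-by-value CPS transformation in our Haskell sketch, not necessarily identical to the paper's figure: the static continuation is a metalanguage function, contracted at transformation time, while Var, Lam, and App build dynamic, residual syntax. The fixed identifiers "k" and "a" are again assumed absent from source terms.

-- Two-level CPS transformation: kappa is the static continuation.
dfCps :: Term -> (Term -> Term) -> Term
dfCps (Var x)     kappa = kappa (Var x)
dfCps (Lam x m)   kappa =
  kappa (Lam x (Lam "k" (dfCps m (\v -> App (Var "k") v))))
dfCps (App m0 m1) kappa =
  dfCps m0 (\v0 ->
    dfCps m1 (\v1 ->
      App (App v0 v1) (Lam "a" (kappa (Var "a")))))

-- Top level: reify the initial continuation as a dynamic λ.
cpsDF :: Term -> Term
cpsDF m = Lam "k" (dfCps m (\v -> App (Var "k") v))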


(Continues...)

Excerpted from Trends in Functional Programming Volume 6 by Marko van Eekelen. Copyright © 2007 Intellect. Excerpted by permission of Intellect Ltd.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

1 Best Student Paper: A New Approach to One-Pass Transformations          [1]

Kevin Millikin

1.1 Introduction

1.2 Cata/build Fusion for λ-Terms

1.3 The Call-by-value CPS Transformation Using Build

1.4 A Catamorphic Normalization Function

1.5 A New One-Pass Call-by-value CPS Transformation

1.6 Suppressing Contraction of Source Redexes

1.7 Comparison to Danvy and Filinski's One-Pass CPS Transformation

1.8 A New One-Pass Call-by-name CPS Transformation

1.9 Related Work and Conclusion

References

 

2 A Static Checker for Safe Pattern Matching in Haskell          [15]

Neil Mitchell and Colin Runciman

2.1 Introduction

2.2 Reduced Haskell

2.3 A Constraint Language 

2.4 Determining the Constraints

2.5 A Worked Example

2.6 Some Small Examples and a Case Study

2.7 Related Work

2.8 Conclusion and Further Work

References

 

3 Software Metrics: Measuring Haskell          [31]

Chris Ryder, Simon Thompson

3.1 Introduction

3.2 What Can Be Measured

3.3 Validation Methodology

3.4 Results

3.5 Conclusion and Further Work

References

 

4 Type-Specialized Serialization with Sharing          [47]

Martin Elsman

4.1 Introduction

4.2 The Serialization Library 

4.3 Implementation

4.4 Experiments with the MLKit

4.5 Conclusions and Further Work

References

 

5 Logical Relations for Call-by-value Delimited Continuations          [63]

Kenichi Asai

5.1 Introduction

5.2 Preliminaries

5.3 Specializer for Call-by-name λ-Calculus

5.4 Logical Relations for Call-by-value λ-Calculus

5.5 Specializer in CPS

5.6 Specializer in Direct Style

5.7 Interpreter and A-normalizer for Shift and Reset

5.8 Specializer for Shift and Reset

5.9 Type System for Shift and Reset

5.10 Logical Relations for Shift and Reset

5.11 Related Work

5.12 Conclusion

References

 

 6 Epigram Reloaded: A Standalone Typechecker for ETT          [79]

James Chapman, Thorsten Altenkirch, Conor McBride

6.1 Introduction

6.2 Dependent Types and Typechecking

6.3 Epigram and its Elaboration

6.4 ETT Syntax in Haskell

6.5 Checking Types

6.6 From Syntax to Semantics

6.7 Checking Equality

6.8 Related Work

6.9 Conclusions and Further Work

References

 

7 Formalisation of Haskell Refactorings          [95]

Huiqing Li, Simon Thompson

7.1 Introduction

7.2 Related Work

7.3 The λ-Calculus with Letrec (λLETREC)

7.4 The Fundamentals of λLETREC

7.5 Formalisation of Generalizing a Definition

7.6 Formalisation of a Simple Module System λM

7.7 Fundamentals of λM

7.8 Formalisation of Moving a Definition from One Module to Another in λM

7.9 Conclusions and Further Work

References

 

8 Systematic Search for Lambda Expressions          [111]

Susumu Katayama

8.1 Introduction

8.2 Implemented System

8.3 Efficiency Evaluation

8.4 Discussions for Further Improvements

8.5 Conclusions

References

 

9 First-Class Open and Closed Code Fragments          [127]

Morten Rhiger

9.1 Introduction

9.2 Open and Closed Code Fragments

9.3 Syntactic Type Soundness

9.4 Examples

9.5 Related Work

9.6 Conclusions

References

 

10 Comonadic Functional Attribute Evaluation          [145]

Tarmo Uustalu and Varmo Vene

10.1 Introduction

10.2 Comonads and Dataflow Computation

10.3 Comonadic Attribute Evaluation

10.4 Related Work

10.5 Conclusions and Future Work

References

 

11 Generic Generation of the Elements of Data Types          [163]

Pieter Koopman, Rinus Plasmeijer 

11.1 Introduction

11.2 Introduction to Automatic Testing

11.3 Generic Test Data Generation in Previous Work

11.4 Generic Test Data Generation: Basic Approach

11.5 Pseudo-Random Data Generation

11.6 Restricted Data Types

11.7 Related Work

11.8 Conclusion

References

 

12 Extensible Records with Scoped Labels          [179]

Daan Leijen 

12.1 Introduction

12.2 Record operations

12.3 The Types of Records

12.4 Higher-Ranked Impredicative Records

12.5 Type Rules

12.6 Type Inference

12.7 Implementing Records

12.8 Related Work

12.9 Conclusion

References

 

13 Project Start Paper: The Embounded Project          [195]

Kevin Hammond, Roy Dyckhoff, Christian Ferdinand, Reinhold Heckmann, Martin Hofmann, Steffen Jost, Hans-Wolfgang Loidl, Greg Michaelson, Robert Pointon, Norman Scaife, Jocelyn Sérot and Andy Wallace

13.1 Project Overview

13.2 The Hume Language

13.3 Project Work Plan

13.4 The State of the Art in Program Analysis for Real-Time Embedded Systems

13.5 Existing Work by the Consortium

13.6 Conclusions

References

 

14 Project Evaluation Paper: Mobile Resource Guarantees          [211]

Donald Sannella, Martin Hofmann, David Aspinall, Stephen Gilmore, Ian Stark, Lennart Beringer, Hans-Wolfgang Loidl, Kenneth MacKenzie, Alberto Momigliano, Olha Shkaravska

14.1 Introduction

14.2 Project Objectives

14.3 An Infrastructure for Resource Certification

14.4 A PCC Infrastructure for Resources

14.5 Results

References

 

 

 
