The Definition of Standard ML

Paperback (revised edition), $30.00

Overview

Standard ML is a general-purpose programming language designed for large projects. This book provides a formal definition of Standard ML for the benefit of all concerned with the language, including users and implementers. Because computer programs are increasingly required to withstand rigorous analysis, it is all the more important that the language in which they are written be defined with full rigor.

One purpose of a language definition is to establish a theory of meanings upon which the understanding of particular programs may rest. To properly define a programming language, it is necessary to use some form of notation other than a programming language. Given a concern for rigor, mathematical notation is an obvious choice. The authors have defined their semantic objects in mathematical notation that is completely independent of Standard ML. In defining a language one must also define the rules of evaluation precisely--that is, define what meaning results from evaluating any phrase of the language. The definition thus constitutes a formal specification for an implementation. The authors have developed enough of their theory to give sense to their rules of evaluation.

The Definition of Standard ML is the essential point of reference for Standard ML. Since its publication in 1990, the implementation technology of the language has advanced enormously and the number of users has grown. The revised edition includes a number of new features, omits little-used features, and corrects mistakes of definition.
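
For readers unfamiliar with the language, the following is a small illustrative Standard ML fragment, not taken from the book; the identifiers sum and total are arbitrary. It shows the kind of phrase whose meaning the Definition pins down: the static semantics assign sum the type int list -> int, and the rules of evaluation (the dynamic semantics) determine that sum [1, 2, 3] evaluates to 6.

    (* Illustrative example only; names are arbitrary and do not
       appear in the Definition itself. *)
    fun sum nil       = 0                  (* the empty list sums to 0        *)
      | sum (x :: xs) = x + sum xs         (* add the head to the sum of the tail *)

    val total = sum [1, 2, 3]              (* evaluates to 6 : int            *)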


Product Details

ISBN-13: 9780262631815
Publisher: MIT Press
Publication date: 05/21/1997
Series: The MIT Press
Edition description: revised edition
Pages: 132
Product dimensions: 6.80 (w) x 8.80 (h) x 0.40 (d) inches
Age Range: 18 Years

About the Author

David MacQueen is Professor of Computer Science at the University of Chicago.
