Leaders in Computing: Changing the digital world

Overview

This collection of exclusive interviews provides a fascinating insight into the thoughts and ideas of influential figures from the world of IT and computing, including Sir Tim Berners-Lee, Donald Knuth, Linus Torvalds, Jimmy Wales, Grady Booch, Steve Wozniak, Vint Cerf, Karen Spärck Jones and Dame Stephanie Shirley. With representatives from areas as diverse as programming, development, hardware, networks, interface, internet and applications, this collection also provides an excellent overview of important developments in this diverse field over recent years.

Product Details

ISBN-13: 9781780171005
Publisher: BCS, The Chartered Institute for IT
Publication date: 09/30/2011
Series: EBO Series
Sold by: Barnes & Noble
Format: eBook
Pages: 75
File size: 458 KB

Read an Excerpt

CHAPTER 1

THE ART OF COMPUTER PROGRAMMING

Donald Knuth, June 2011

While he was over in the UK for a book tour and lecture series, Professor Donald Knuth, the author of the hugely respected The Art of Computer Programming book series, made time to talk to BCS editor Justin Richards about his life and works.

You're probably best known for your book series The Art of Computer Programming. In 1999, these books were named among the best 12 physical-science monographs of the century by American Scientist. How did these books originally come about and how do you feel about the American Scientist distinction?

The books came about because, in the 60s, when I began, everyone was rediscovering things, because there was no one good source of what was known. And I had enjoyed writing all along – I was involved in newspapers and magazines at school and thought of myself as a writer – and I realised there was a need for someone to get down all the good ideas that had been published and that we were already forgetting.

This was back in the earliest days, when the number of people actually studying computing was probably less than a thousand. I didn't see it as affecting the world, but I still thought it was pretty cool stuff and ought to be organised.

Then I thought about who else could write such a book, and everyone I thought of would probably only mention their own stuff. I was the only guy I knew who hadn't invented anything himself, so I figured I could be neutral and be a spokesman for the other people. And really that was the original motivation.

I started writing the book and, naturally, because I was trying to combine the ideas of many different people, I would see where one person had analysed his method in one way while another, for a competing method, had analysed it another way. So I had to analyse method A according to author B and method B according to author A.

Therefore I ended up creating an original work just by analysing these things and pretty soon I realised there were a whole bunch of interesting scientific approaches here that hadn't been part of my own education that were really coming together. Over and over again I was really seeing this way of thinking as necessary in order to get the story right.

So, to make a long story short, pretty soon I had my own axe to grind too; I started discovering things too and I couldn't be an unbiased writer anymore. However, I still kept to the original idea of trying to summarise everybody's ideas in the fairest, most reasonable way I could.

Now, to be put into that category of one of the best books of the century, that's a little bit embarrassing as they rank me with Einstein and Feynman. I'm not in that league really, I just didn't have as much competition. They had to have a token person in computer science! But still, I worked hard and I think it was necessary to comment on the research so far, but it's a bit like comparing apples to oranges when they chose me to represent computing.

What is it about computer science that drew you to it?

I was born to be a computer scientist – I have a way of organising stuff in my head that just seems to make me a good programmer. I think everybody can learn to use computers, but only about 1 person in every 50 is a geek in the same way as I am. That means we can push the envelope and can resonate with the computer. The way we think helps to make it easier for us to know how to construct a machine.

Why do you think computer science is so important?

Computer scientists are important because of the way they affect communication and, I'm sorry to say it, also finances. Unfortunately, the world measures what my colleagues and I do by how much our work affects Wall Street. I'm jealous of astronomers, for example, because people respect astronomers for doing astronomy because it's interesting just for its own sake. I studied computer science because it's interesting to study computer science.

The term IT doesn't resonate with me so much – it's the science that interests me. To me the IT is very nice, but it's not something that I'm particularly good at. My wife can figure out what these icons mean and what to click before I can, but there are so many scientific challenges in order to get machines to do complicated, sophisticated things. The ideas are subtle, the questions are fascinating. There are many questions I never thought I'd know the answer to, but gradually we've learned how to solve them. For me I would do it even if there was no money in it.

So you have a passion for it?

Yeah, it's like I wake up in the morning thinking I've got to write a program.

Do you have a muse?

Yeah, well some days she talks to me more than others. There was a period when I almost thought there was a muse dictating to me.

In your opinion, what do you think is your greatest achievement in the field of computer sciences?

I guess the first thing I did well at was when I worked on the theory that goes on behind how compilers work. I worked on the theory that underlies algebraic languages and, as I was writing The Art of Computer Programming (Chapter 10), I was describing what everyone else had done, but then I realised that there was a way to bring these things together. I didn't know how to explain that in a book – it was too far out – so I published that theory in a paper, and other people figured out what I meant, and this became the theory of parsing that's used in all algebraic compilers now.
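
[Editor's note: the theory Knuth refers to here is LR parsing, which is far too general to sketch in a few lines. The little hand-written parser below is only meant to give a flavour of what a parser for algebraic expressions does; the grammar, function names and example are our own illustration, not anything from the interview or from Knuth's paper.]

    # A minimal sketch (Python) of parsing an algebraic expression.
    # Knuth's LR(k) theory handles a much larger class of grammars
    # deterministically; this recursive-descent parser is just for flavour.
    import re

    def tokenize(text):
        # Split the input into numbers, operators and parentheses.
        return re.findall(r"\d+|[-+*/()]", text)

    def parse_expr(tokens):
        # expr := term (('+' | '-') term)*
        value = parse_term(tokens)
        while tokens and tokens[0] in "+-":
            op = tokens.pop(0)
            rhs = parse_term(tokens)
            value = value + rhs if op == "+" else value - rhs
        return value

    def parse_term(tokens):
        # term := factor (('*' | '/') factor)*
        value = parse_factor(tokens)
        while tokens and tokens[0] in "*/":
            op = tokens.pop(0)
            rhs = parse_factor(tokens)
            value = value * rhs if op == "*" else value / rhs
        return value

    def parse_factor(tokens):
        # factor := number | '(' expr ')'
        token = tokens.pop(0)
        if token == "(":
            value = parse_expr(tokens)
            tokens.pop(0)  # drop the closing ')'
            return value
        return int(token)

    print(parse_expr(tokenize("2*(3+4)-5")))  # prints 9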

But I feel the biggest thing that I developed was the mathematical approach to comparing algorithms in order to find out how good a method is. I worked out quantitative ways you could say that one program is going to be, say, 2.3 times better than another one, and the mathematics that goes with it; it's called the analysis of algorithms. It's what I'm most proud of – in developing an academic subject – but it's key to the successful use of the machine.

When I came up with this approach, I said to my publishers 'let's rename the book and call it The Analysis of Algorithms' and they said 'we can't, it will never sell!' But that's really what my book is about – it summarises the work of all these people, but it also helps us decide, in a quantitative manner, how good each method is.
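
[Editor's note: a toy example of the kind of quantitative comparison Knuth means. The two search methods and their operation counts are standard textbook facts; the code and the chosen numbers are our own illustration, not from the interview.]

    # Analysis of algorithms in miniature: compare two ways of searching
    # a sorted list of n items by counting comparisons.
    # Sequential search makes about n/2 comparisons on average;
    # binary search makes about log2(n), halving the range each step.
    import math

    def sequential_comparisons(n):
        return n / 2          # scan until the item is found

    def binary_comparisons(n):
        return math.log2(n)   # halve the search range each step

    n = 1_000_000
    ratio = sequential_comparisons(n) / binary_comparisons(n)
    print(f"for n = {n:,}, binary search wins by a factor of about {ratio:,.0f}")
    # for n = 1,000,000 that factor is roughly 25,000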

You've said on your website, in response to the question 'why don't you do email?' – 'Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things.' Can you explain your stance on email and what you meant about being on the bottom of things?

Someone has to not be tweeting all the time; someone has to be thinking about things which need a long attention span and trying to organise material and build up strong foundations instead of rushing off across the frontier. It takes a long time to put out something that has the right style; I have to really think about it, and if I'm going to do it right I have to spend a lot of time focused on it. And I was being treated like an oracle – lots of people from around the world were asking my opinion about whatever – so after 15 years of email I decided that was a lifetime's worth.

A previous Turing Lecture speaker, Grady Booch, was very much an advocate of making coding simpler and, according to a blurb regarding your winning the BBVA Foundation Frontiers of Knowledge Award in the Information and Communication Technologies category, you are too. Can you explain why you think code should be kept simple, compact and intuitively understandable?

I guess you have to go back to Einstein who said 'keep it as simple as possible, but no simpler'. It's an illusion to think there's going to be some sort of 'royal road' and everything is going to be easy to understand, but almost never do I find something that can't be simplified if you have the chance to rethink it again. So every once in a while people have to say 'well, let's chuck everything we have and start over again, in view of what we know now'.

There's a project at Stanford that started a few years ago called the Clean Slate Project that said 'let's figure out a better way to do the internet'. Things just keep getting added on and accumulate and you realise that there's plenty of baggage which there's no reason to have anymore.

It's like the appendix in the human body, there was probably some purpose for it at one time, but not now. So I think there's the potential, although I think maybe it's not possible because the world is so dependent on it, for someone to come along and say 'let's start again with the way programs are loaded into machines'. It's like when Linux came out – that was an attempt at the simplification of operating systems.

Another ideology that you share with Grady Booch is that you have both said that you can appreciate the beauty within coding and programming – what do you mean by that?

I'm thinking of it in several senses of the word 'art', but in general the word 'art' means something that is done by human beings and is not a part of nature. But then there is fine art, which brings aesthetics into it as well.

In many ways beauty is in the eye of the beholder, but you do something because it's elegant and hangs together and is a pleasure to read as well as to write or to see someone else's work; you feel that you've got it and you can take pride in it having achieved certain criteria.

Maybe Grady and I don't agree on the criteria. I mean no two people agree on what's the best kind of music in the world, but musicians certainly know what they like and what they don't like and they know when they've done something well and that's the way I look at a program.

I guess it's down to personal opinion at the end of the day?

Yes, indeed. There's no algorithm that you feed in and say 'isn't this beautiful or what?' Although people did try – there was a book that came out in the 1930s by one of America's greatest mathematicians, George [David] Birkhoff, called Aesthetic Measure. It was filled with all kinds of formulas, and there was a page full of Greek urns with, next to each one, a number saying how beautiful the urn was.

He classified a whole bunch of different design systems; it's kind of interesting, as number two or three in his list of 100 was the swastika – he was a kind of Nazi sympathiser. I guess it has a greater religious significance in Hinduism, if it's reflected left-to-right. I don't believe there's a way to measure it, but he did, and some people have tried.
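
[Editor's note: for the record, the central formula of Birkhoff's book was M = O/C – the 'aesthetic measure' M of an object defined as the ratio of its order O to its complexity C.]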

So no one has written a program to work out the beauty of a program yet?

No, not really, although there's software engineering that tries to do this, because they have to measure something – I don't really know. You know that, as a writer/reporter, you just have to find quantitative numbers to accompany the text – X number of people have died in Cairo; you have to know whether it's 300 or 315, that's part of the news story. Quantifying things adds quality. I try to find reasons for numbers too, but software engineers are trying to measure how good a programmer is; their bosses know better!

I think numbers are there so people can do a mental comparison and think 20 people have died in that event and 50 in that event so, by contrast the latter event must have been worse. But it's like comparing apples with oranges, because when you do something to a number then you can start to play games and make the number high even though the thing isn't right.

You can take education and an educated student and think, well, how are they going to do on this test and out come the books on how to pass this test rather than how to learn science. It's all about how to get a good score on a science test. And that's the problem with these numerical things; they don't always capture the essence of it. Once you have a way to quantify something then, if your goal is to cheat, you'll figure out a way to cheat, when the goal really is to learn.

You've said in the past your work is basically about finding a way to sort out the things that are going to last a long time (in computer science) instead of changing rapidly. What do you mean by that?

Every day I get about one journal in the mail, not including ITNOW (laughs), but including The Computer Journal. About eight of them arrive in my mailbox every week. So there's an enormous amount of material out there and it's good stuff. So how am I going to decide what to put into The Art of Computer Programming?

I try to avoid the stuff that's quickly going to become obsolete and concentrate on the stuff that's going to have lots of applications. I try to find the facts that aren't too hard to learn, that are going to be useful for everybody's toolkit. What should all programmers of the next generation remember? I don't pretend that I'm right about everything, but I try to sort out the ones that stand out to me, that are unforgettable and that our children should remember.

So I guess the building blocks of computer science and not so much all the more transient add-ons which tend to follow?

Yes, but there are still thousands of add-ons that are describable in a couple of paragraphs, and learnable. If something takes 10 pages to describe, then it's very hard to get it into my book. But if something only takes three pages, is intrinsically useful and I can see how it physically fits in with other stuff, then it's more likely to go in. For example, we all learned how to add numbers together when we were young. If you think of all the uses to which that skill has been put, it's incredible – we all use addition every day, in different ways, and will continue to do so. But still you learned about adding – you learned the concept of adding. There are loads of little concepts like that which go into my book and that's what I'm looking for. They haven't all been discovered yet.

Even with adding and computing there's now 'adding without carrying or nim-addition', which is something that was invented in England 100 years ago. It began as a game, which computers can do well, and we could combine this addition with ordinary addition, so one of the things in my new book is to explain to people why we might even want to be teaching fifth graders a new kind of addition because it's turning out to be quite useful. But it's not so simple that you can say 'everything I need to know I learned in kindergarten' – we keep learning little things that help us take giant steps as we go.
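
[Editor's note: nim-addition is ordinary binary addition with every carry discarded, which is exactly the bitwise exclusive-or operation. A few lines – ours, for illustration – make the 'adding without carrying' idea concrete.]

    # Nim-addition: add two numbers in binary, but throw away the carries.
    # That is exactly bitwise XOR, written ^ in Python.
    def nim_add(a, b):
        return a ^ b

    # 5 = 101 and 3 = 011 in binary; adding each column without
    # carrying gives 110 = 6, whereas ordinary addition gives 8.
    print(nim_add(5, 3))  # 6

    # Like ordinary addition it is associative and commutative,
    # and every number is its own negative: nim_add(x, x) == 0.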

In 1999, you were invited to give six public lectures at MIT on the general subject of relations between faith and science. Over a decade on, have your views on the relationship between science and spirituality changed and if so how?

I'm just glad to see that people think there's more to life than things we can understand and it just seems that, at the time I gave those lectures, it was just coming out of the closet, saying 'well, computer science is wonderful, but it's not everything and don't expect me to give you any answers – let me explain why I think it's good to still have some things that are mysterious'.

I think there's a tendency, as we discover more and more science, to think that now we know everything. But as we think about it more and more, we're just getting started, I think. Things are changing incredibly fast, but I can still see that in 100 years' time there's going to be much more to learn. So there's plenty of room for humility, but we have still learned an awful lot of stuff we can be proud of.

I had this invitation to MIT and I thought, well, once in my life, if I ever wanted to reflect on this, this was going to be the time and the place to do it. I don't pretend to be an expert on it; I just didn't think people were spending enough time thinking about it. I was glad to see how many people responded to it.

Were the lectures well attended?

Well, that's the thing – it was standing room only! It was a big lecture hall, too. There were six lectures. After the first one it was carried on Dr Dobb's tele-webcast and it was downloaded an incredible number of times over the next five or six years. So it was definitely meeting some kind of a need. I wasn't necessarily providing the answers, but I was providing some of the questions that I thought were part of our life – why not discuss these things in public? I was very pleasantly surprised at the numbers who came.

A few years back I gave a talk at Google and again it was standing room only, and again it was about this very subject. And it was a 'question and answer' talk like I'm giving for the Turing Lecture. That's the thing I enjoy, somehow responding to what people ask me more than having a canned thing.

(Continues…)


Excerpted from "Leaders in Computing".
Copyright © 2011 British Informatics Society Limited.
Excerpted by permission of BCS The Chartered Institute for IT.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Foreword

The Art of Computer Programming – Donald Knuth

The Mighty Booch – Grady Booch

Talking to Torvalds – Linus Torvalds

Apple to the core – Steve Wozniak

Cerf's up – Vint Cerf

Computing's too important to be left to men – Karen Spärck Jones

Isn't it semantic? – Sir Tim Berners-Lee

Sharing knowledge – Jimmy Wales

Shape the future – Dame Stephanie Shirley
