
Toward a human-centric Web

An interview with Prof. Michael Dertouzos, head of MIT’s prestigious Laboratory for Computer Science.

Prof. Michael Dertouzos is perhaps the most influential person in modern computing. He has headed MIT’s Laboratory for Computer Science (LCS) for more than 25 years, and his legacy is woven into every major computing breakthrough since the early ’70s. His lab has been an incubator for dozens of successful tech companies, from 3Com to Akamai. From Bob Metcalfe to Tim Berners-Lee, he has mentored some of the greatest visionaries of the Information Revolution.

His own views are no less visionary than those of his protégés. Just out last month, his latest book–which goes by the short but sweet title “The Unfinished Revolution: Human-Centric Computers and What They Can Do for Us”–takes a radical approach to the relationship between machines and humans. “For 40 years computers have been shrines to which we pay dutiful homage,” writes Dertouzos. “When something goes wrong, the ‘user’–you and I–feel that if we had somehow behaved better the trouble would not have arisen. But we are not at fault. The trouble lies with our current approach to computing.”

The current approach builds computers that perform their desired functions and forces us to adapt to them. Dertouzos inverts this: rather than making the human adapt to the machine, he starts with human needs and builds machines that serve them. With the human at the center of the design problem, there is no need to slap an interface on the results; the human/machine interface is as natural as human speech and vision.

I talked with Dertouzos recently about the human-centric views that form the basis of his book and of every project in the LCS. I focused my questions on the topic of this month’s issue–Web developments. He spoke candidly in cool yet cheerful tones about the two main LCS Web projects: Haystack, an automatic information organizer; and the Semantic Web, a joint project between the LCS and the World Wide Web Consortium (W3C), which gives meaning to Web content.

These two projects exemplify the LCS approach to computing. While much work remains on both, early results indicate that the systems will have far-reaching implications for the future of both private and public webs. If Dertouzos is right, these and other human-centric projects will help to finish the Information Revolution by automating not just machine computation, but also human/machine interaction.

Mathewson

Your book develops a new model of computing across several applications. How does the model differ from related research in human factors, interface design, and other information-access design work going on today?

Dertouzos

The fundamental difference is this. I view human-centric computing as a total commitment to the human as the starting point. I start with the interface, and then I go down to all the applications. In the approach we have had for the last 40 years, there is a machine that has all this number-crunching power, and then there is an interface that lets us talk to the machine. Nowadays, there are upwards of 3,000 commands in an operating system, and most of them control the interface. This is just cosmetic painting–you paint the trash can hoping that it will not smell as bad. But of course, it smells worse.

You have 3,000 commands and 8,000 APIs, and the result is more complexity, not less. It’s just window painting, a nice little GUI, and much of it only adds to our frustration. In the new approach, you’re not talking to the interface, you’re talking to the machine–it doesn’t need an interface. The entire computer is the interface–this is radically different from what we’re used to.

Mathewson

Our March issue, in which this Q&A will appear, focuses on leading-edge Web developments. You devote a whole chapter to two of them: Haystack, a peer-to-peer information-bundling system, and the Semantic Web, an Internet system for defining meaning between related sites. How do you view these systems complementing each other as they’re implemented in future information systems?

Dertouzos

Right now, links on the Web are in all manner of conceivable relationships. When you go to a site, it could be related to a wide variety of meanings, and often there’s no way to know until you get there if it’s related to what you want. It’s linked not according to meaning, it’s linked the way designers want it–these are the familiar blue links we click every day. And search engines, no matter how sophisticated, are just oriented to our patterns of words, which in themselves carry no meaning.

These systems are similar in one very important respect. Basically, both Haystack and the Semantic Web link information–what I call in the book the red links–according to meaning. They differ in that Haystack primarily deals with finding information in the close world: your hard drive and the hard drives of your family and friends. This is how we find information outside of computing. We look around us and, if we don’t find something, we ask our friends and colleagues.
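To make the idea concrete: a meaning-typed link carries not just a pointer but the relationship itself, so software can follow only the links that match what you are after. The minimal Python sketch below is our illustration of that idea, not Haystack’s actual data model; every name in it is invented.

    from dataclasses import dataclass

    # A plain hyperlink says only "these two things are connected."
    # A meaning-typed ("red") link also records *how* they are connected.
    @dataclass(frozen=True)
    class RedLink:
        source: str    # the linking item, e.g. a document on my hard drive
        target: str    # the linked item
        relation: str  # the meaning of the link: "expense-for", "authored-by", ...

    def find_related(links, item, relation):
        """Follow only the links whose meaning matches what we asked for."""
        return [link.target for link in links
                if link.source == item and link.relation == relation]

    # The close world: my own links first, then links shared by friends and colleagues.
    my_links = [RedLink("trip-report.doc", "flight-receipt.pdf", "expense-for")]
    friend_links = [RedLink("trip-report.doc", "hotel-invoice.pdf", "expense-for")]

    print(find_related(my_links + friend_links, "trip-report.doc", "expense-for"))
    # -> ['flight-receipt.pdf', 'hotel-invoice.pdf']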

Once you’ve asked all your friends, it’s time to look in the Semantic Web with what I call SL, which automatically finds information by matching the synonyms that various groups use for the same thing. In everyday life, the shared vocabulary of a community is terribly important for information access. One company may have its own vocabulary that has to be matched with another company’s in order to get at the meanings of their words. The words themselves might be meaningless to outsiders, but if a system gives meaning to this vocabulary, the information can be gleaned and compared with what the outsiders know.
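The matching Dertouzos describes can be pictured as a published synonym table that maps each organization’s local terms onto shared meanings, so data kept in one vocabulary can be compared with data kept in another. The sketch below is a hypothetical illustration of that step, not the Semantic Web’s actual machinery; the company names, field names, and functions are all invented.

    # Two organizations name the same concept differently. A third party with a
    # stake in interoperating publishes synonym tables that map each local term
    # to a shared meaning.
    acme_to_shared = {"part-no": "part_number", "qty-on-hand": "stock_level"}
    globex_to_shared = {"sku": "part_number", "inventory": "stock_level"}

    def translate(record, vocabulary):
        """Rewrite a record's local field names into the shared meanings."""
        return {vocabulary[field]: value for field, value in record.items()}

    acme_record = {"part-no": "A-100", "qty-on-hand": 42}
    globex_record = {"sku": "A-100", "inventory": 17}

    # Once both sides speak in shared meanings, their data can be compared.
    a = translate(acme_record, acme_to_shared)
    g = translate(globex_record, globex_to_shared)
    print(a["part_number"] == g["part_number"])  # True: same part, two vocabularies
    print(a["stock_level"] + g["stock_level"])   # 59: combined stock across firms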

Mathewson

The problem of assigning global meaning to terms in our complex languages is one that defied the smartest minds of the 20th century. You have problems within languages–terms have several meanings and change nuances constantly. And these problems pale in comparison to the problems between languages–for example, many terms have no counterpart in other languages. How does the Semantic Web get around these problems?

Dertouzos

You don’t have to go all the way up the ladder in an ascent to meaning, as the AI people have tried to do unsuccessfully for decades. Lots of people have tried to shape the world in their rational image–they’re wrong. Lots of people have the aspiration to develop a calculus of human logic–it won’t work. If that’s your goal–basically automating common sense, as Lao Tzu would put it–you’ll never get there. But we can go a long way up the ladder with synonyms.

The important thing to note is that the Semantic Web is just a framework. The actual synonym matching will be developed by third parties–those with a stake in automated communication, like the various automakers I talk about at length in the book. Left to their own devices, people will do what they really like. And given the tremendous incentive for people to develop allied meanings for things, it will happen. It might take some time, but it needs a basis, and the Semantic Web is a promising foundation.

Mathewson

You’ve been working in the old way for a long time at LCS. What made you so focused on breaking from the old way and finishing the revolution?

Dertouzos

The reason I’m so passionate is that great people have failed miserably. After all this failure, I started developing a whole different way of thinking about the problems. For example, we’re building very nice little systems that are very intelligent in limited, narrow fields. Because you have small sets of bins to put data in, you can be very definitive about what goes in the bins. Take speech systems. If you have a vocabulary that is specific to an occupation or activity, you can get very close to machine understanding.

The next step is to stitch all these little intelligent systems together–travel, navigation, traffic, and so on–and you have an intelligent system that can do more than just one thing. A flat collection of a very large number of narrow fields can be much more useful than some attempt to determine global machine understanding. The LCS’s Oxygen program is an example of implicit stitching together of these narrow fields. Oxygen avoids the highfalutin dead-end paths that have plagued AI from the start.
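The stitching Dertouzos describes can be pictured as a dispatcher that routes each request to whichever narrow specialist recognizes its vocabulary. The sketch below is an invented illustration of that pattern, not Oxygen code; the domains, vocabularies, and functions are all assumptions made for the example.

    # Each narrow system understands only its own small vocabulary, where it can
    # be nearly definitive. The "stitching" is a dispatcher that hands a request
    # to whichever specialist recognizes it.
    def travel(query):
        return "booking a flight for: " + query

    def traffic(query):
        return "checking congestion for: " + query

    SPECIALISTS = {
        ("flight", "hotel", "ticket"): travel,
        ("congestion", "route", "highway"): traffic,
    }

    def dispatch(query):
        """Route the query to the first narrow system whose vocabulary matches."""
        words = set(query.lower().split())
        for vocabulary, system in SPECIALISTS.items():
            if words & set(vocabulary):
                return system(query)
        return "no narrow system recognizes this query"

    print(dispatch("book a flight to Boston"))
    print(dispatch("is there congestion downtown?"))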

Mathewson

How do you handle all the success of LCS? I don’t have to recite all the successful technologies and companies that it’s spawned. But when you stop to look at it, it really is impressive.

Dertouzos

I’m wedded to this laboratory. I love what it’s doing, so naturally, I think it’s the best computer science lab in the world. But I don’t think too much about what we’ve been able to do, except to continue a successful formula. I think we’ve gotta stick to our mixture of the forefront of technology and the highly practical. Put it into the salad bowl, and if a concept makes sense, go for it and let it improve the whole.

And I’m really very serious about the human-centered stuff. I absolutely get livid when things don’t work the way they’re supposed to. I just got the latest and greatest Mac G4 and loaded the latest and greatest CodeWarrior on it, expecting to get some real work done. Instead I spent the whole allotted time talking to technical-support people and futzing around with the machine.

It’s not just the inconvenience. It’s really our destiny. We’re going to make use of these things in the 21st century, or they will be our undoing. The way things are going, unless we can turn this around, computers will run our lives rather than the other way around. But I’m optimistic. I think the 21st century will see this ascent toward machines that serve us.

James Mathewson is editorial director of ComputerUser and ComputerUser.com.
