Software: the underbelly of the future

The third installment of a five-part invention.

In this third installment of my columns on the future of computing, I’m going to take a hard look at something soft: software, the underbelly of computing and its point of greatest vulnerability.

Not rocket science?

Since when is “n-tier architecture combined with Web services frameworks in order to provide a seamless distributed application environment” something that mere mortals can master? If this jargon looks like gobbledygook, let me tell you that the underlying truth is much, much worse. The Web is not a friendly place for software developers–not in conception, development, or (especially) actual use. There are so many elements to take into account: clients, servers, databases, XML, HTML, scripts, and more. And the territory is fraught with perils, including broken connections, bad performance, unruly users, and hackers who delight in breaking your software.

Thinking about future technology tells me a great deal about the difficulties that lie ahead for software. Take nanotechnology. Big things are in store for this tiny realm (see the Pursuits columns from December 2001 and January 2002 on ComputerUser.com), but everywhere I look it’s the equipment side that people talk about: the tunneling microscopes, nanotube circuits, and other devices. They simply presume that the software will be there to control and organize this ultra-complex and vastly numerous group of nanites.

On the other hand, I hear great bastions of software development such as IBM bewail the fact that software is getting out of hand: it’s too complicated and too difficult to be reliable. We all know about the problems Microsoft has had, and is still having, in providing secure and reliable operating systems. Yet we have people basically shrugging and assuming that software to manage the potential (and the dangers) of nanotechnology is a “given.” Well, it’s not a given, or a gimme, or even a maybe-get.

Not another crisis

A crisis in software? Such crises are a dime a dozen–not a big deal, right? I’ve been hearing for decades about how hard it is to make good software (and, of course, the complaints go back further than that). The principle of “muddle through” seems to have taken hold long ago, and it is still hanging on for dear life. Among other things, we’ve invented modular programming, object orientation, and Extreme Programming to break the software bottleneck. And yet people are still decrying the lack of progress. That’s probably because programmers continue to avoid “good programming practices” in favor of just getting the job done–quick and dirty, if necessary.

Lord knows I’ve done enough preaching about good software development practices (nine books and a bunch of articles). But from the point of view of successful projects, programming today doesn’t look a whole lot better than it did in the days of assembly languages (it looks different, not better). Recent surveys continue to show that only one out of five or six software projects is considered successful. We can do more with software, but more isn’t necessarily a big improvement–for the user or the developer. Meanwhile, the rush of technological development continues to up the ante on software complexity.

It isn’t that we can’t write good, even great, software. Give a clutch of hotshots a lot of money and a single, relatively well-focused project and watch the code fly! Put another way, we can demo any kind of software–real one-of-a-kind brilliance. The problem lies in replicating good software across many different projects, and especially in dealing with very large, long-term projects. In those cases, even Herculean effort only sometimes succeeds.

So software development, which is intimately linked to the success of most other future technologies, must get its act together before we get too much further down the road. When you evaluate new computer technology (or almost any other kind), be sure to ask, “Where’s the software? How good is the software?” It’s bad enough to have a premier channel of communication (the Internet) mucked up by faulty and poorly designed software security, but when it comes to splicing human genes or unleashing trillions of tiny robots…

Autonomic computing

If you think this diatribe is anomalous (cry wolf, lone columnist), take a gander at the IBM Web site. IBM is too chivalrous to use my kind of language, but the idea is similar: We cannot produce software to manage the kind of complexity we see now (much less in the future) without radical changes in the way hardware and software are designed and the way they function.

What changes? In a nutshell, “Computer, fix thyself.” IBM calls the approach autonomic computing, which is sufficiently esoteric to appeal to academics and digital theoreticians–the target audience of the first round of research. The reason for the research is that we don’t really know how to make hardware and software self-managing and self-maintaining. Heck, we have a hard enough time getting humans to do those jobs. Maybe that’s the point.

The example IBM likes to use for autonomic computing is the human nervous system. Unless there’s a problem, breathing, blood circulation, digestion, and other functions carry on 24/7/365 under nervous-system control but without conscious effort. So it should be with hardware and software systems. The key word here is systems. By software, we’re not talking about your “Quake” game or genealogy program, but larger software applications such as computer and network operating systems. The underlying principle is that systems, particularly software systems, should relieve both programmers and users of much of the work of developing and maintaining software by doing the routine tasks automatically.
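
To make the idea concrete, here is a bare-bones sketch (in Python) of the kind of watch-and-repair loop autonomic computing implies. To be clear, this is my own toy illustration, not IBM’s design; the health probe, the repair action, and the services are all invented.

    import time

    def check_health(service):
        """Report whether a service is responding (a stand-in for whatever
        diagnostics a real system would expose)."""
        return service["responding"]

    def restart(service):
        """The repair action: bring the service back without waking a human."""
        print("restarting", service["name"])
        service["responding"] = True

    def autonomic_loop(services, interval_seconds=5):
        """Keep every service alive automatically, the way the nervous system
        keeps you breathing without conscious effort."""
        while True:
            for service in services:
                if not check_health(service):   # monitor and diagnose
                    restart(service)            # execute the fix
            time.sleep(interval_seconds)

    services = [{"name": "web server", "responding": False},
                {"name": "database", "responding": True}]
    # autonomic_loop(services)  # runs forever; uncomment to watch it self-heal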

I don’t know if autonomic computing is the technology to watch in the near future. I do believe it’s the kind of software technology we must keep an eye on. I would feel a lot better about the concept if somebody would turn its principles on the next big thing in software–Web services.

Web services

A software developer colleague of mine remarked a year ago, “Whatever the future of software, it won’t be office suites.” Now I’m not so sure. Microsoft plans to convert its Office software into Web services, meaning that instead of buying the suite as a single package, you’ll rent or lease it piecemeal. That doesn’t mean renting a big piece like Word whole; rather, the functionality of Word would be broken into 15 or 20 services, such as editing, graphics, HTML formatting, mail merge, and spell checking. You pay (repeatedly) for the services you need. Since Microsoft Office is the No. 1 software application in the world, making it a Web services product will automatically put Web services on the map in a big way.
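
What might renting Word piecemeal look like? Here is a toy sketch, again in Python, of two functions offered as metered services. The names, prices, and behavior are strictly my invention; Microsoft has announced no such interface or pricing.

    class MeteredService:
        """Wrap one rented function and tally a charge on every call."""
        def __init__(self, name, price_per_call, func):
            self.name = name
            self.price_per_call = price_per_call
            self.func = func
            self.charges = 0.0

        def __call__(self, *args):
            self.charges += self.price_per_call  # you pay (repeatedly) per use
            return self.func(*args)

    # Two of the 15 or 20 services a suite might be split into (hypothetical).
    spell_check = MeteredService("spell check", 0.01,
                                 lambda text: [w for w in text.split()
                                               if w.lower() == "teh"])
    mail_merge = MeteredService("mail merge", 0.02,
                                lambda template, name: template.replace("<name>", name))

    print(spell_check("Teh quick brown fox"))    # ['Teh'] flagged as a typo
    print(mail_merge("Dear <name>,", "Reader"))  # Dear Reader,
    print("bill so far: $%.2f" % (spell_check.charges + mail_merge.charges))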

In the future (say, three to five years), vendors like Microsoft, IBM, Oracle, and Sun see Web services as the main delivery system for software. This has many implications: The Internet will become the standard arena for software; if it isn’t on the Net, it’s old hat. Applications of many kinds will be concocted more or less ad hoc by connecting Web services, in theory from many vendors all over the world. To make this work, there must be a payment system, such as micropayments, to handle the jillions of transactions involved when millions of people use multiple Web services. Compatible services must be delivered to the user in a timely fashion by coordinating software. And users will want their services properly validated (bug- and virus-free), with assurances that any data they put through the services is secured and handled reliably.
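
Here is one more sketch to show what that coordinating layer might do: chain services from different vendors into one ad hoc application, tallying a micropayment per call. The services and fees are invented, and a real coordinator would also have to handle the validation, security, and reliability questions raised above, which this sketch ignores.

    def compose(steps):
        """Assemble an ad hoc 'application' from (name, fee, func) service steps."""
        def application(data):
            bill = 0.0
            for _name, fee, func in steps:
                data = func(data)  # each step stands in for one remote service call
                bill += fee        # one micropayment recorded per invocation
            return data, bill
        return application

    # Three hypothetical services, each imagined as coming from a different vendor.
    steps = [
        ("clean whitespace", 0.001, lambda s: " ".join(s.split())),
        ("capitalize",       0.001, lambda s: s.capitalize()),
        ("HTML format",      0.002, lambda s: "<p>" + s + "</p>"),
    ]

    document, bill = compose(steps)("  hello   web services  ")
    print(document)                       # <p>Hello web services</p>
    print("micropayments: $%.3f" % bill)  # micropayments: $0.004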

If this sounds like a simple system to you, a job at NASA awaits. To me, Web services are a perfect example of how the complexity of software will increase by orders of magnitude in the future. Of course, the user isn’t supposed to see the complexity. But behind the scenes, software developers will be confronted with all the complexities and vagaries I mentioned at the outset, so they’d better get to work.
