
The limits of productivity

No technology company wants to admit this, but productivity comes from within the individual.

A recent survey of 200 executives and 500 staff in Britain suggests that, on the whole, British workers are apathetic, unmotivated, and unskilled. The study was sponsored by Hewlett-Packard, which concluded that companies adopting a more “adaptive” approach to technology can turn this problem into a competitive advantage. I confess: trying to understand that one sent my brow into spasms.

Once I figured it out, my forehead regained its smoothness. When you pay for a study like this, you just have to give it positive spin. And this positive spin results more from habit than from the marketer’s mantra, “Turn that frown upside down.” Technology companies have been pitching the productivity gains from their products for so long that it’s unthinkable to them that productivity could rely on factors outside of technology, such as management.

No technology company wants to admit this, but productivity comes from within the individual. You can give Joe Slacker a dual-processor, liquid-cooled, over-clocked behemoth, and he’ll still spend 10 percent of his time actually working. All those clock cycles will sit idle while he makes origami surfers for his wave keyboard. The challenge of the ’00s is not so much giving workers enough computing power, it’s motivating them to use it.

Technology-related productivity gains were legitimate in the ’90s, when motivated workers sat waiting for their computers to refresh their screens. Now, more often than not, computers wait for workers to input some data, any data. Why the change? I don’t mean to state the obvious, but Moore’s Law comes to mind. The number of transistors on a single chip of silicon continues to double every 18 months or so, bringing roughly two-fold gains in processor speed over the same time frame. When your 66MHz 486 has evolved into a 3GHz P4, your productivity can’t keep up with the speed increase. And the processor is only one component on that curve. Memory size and speed, video silicon, bus speed, hard drive speed, networking silicon … the list goes on. The result is computers that run faster than humans can think, let alone work.
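A quick back-of-the-envelope check makes the point. The figures below are illustrative round numbers, not precise release dates or benchmarks, and the 18-month doubling period is the usual shorthand:

    import math

    # Rough sanity check on the 486-to-P4 clock-speed jump, assuming
    # the popular 18-month doubling period. Illustrative figures only.
    old_clock_mhz = 66        # a 486DX2-66 from the early '90s
    new_clock_mhz = 3000      # a 3GHz Pentium 4 from the early '00s
    doubling_period_years = 1.5

    speedup = new_clock_mhz / old_clock_mhz      # about 45x
    doublings = math.log2(speedup)               # about 5.5 doublings
    years = doublings * doubling_period_years    # about 8.3 years

    print(f"{speedup:.0f}x speedup = {doublings:.1f} doublings, "
          f"or roughly {years:.1f} years at one doubling per 18 months")

Eight-odd years of doublings is about the distance from the 486 era to the P4 era. The hardware, in other words, kept its end of the bargain.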

Software companies have consistently tried to close the productivity gap by creating applications that use this speed to automate tasks and reduce the amount of thinking and inputting humans need to do. On the whole, these efforts have been only marginally successful. So-called “intelligent” applications must make a slew of presumptions that can turn out to be false, annoying workers and reducing productivity. For example, certain versions of Microsoft Word replace characters as you type whenever your input looks like a mistake. If you don’t turn off these automated functions, you often have to go back and fix Word’s “corrections.” That is productivity lost, not gained. The benefits of artificial intelligence (AI) are greater in some contexts, but AI is a long way from closing the gap.
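A toy sketch shows why those presumptions backfire. This is not Word’s actual logic, just a hypothetical rule table applied naively:

    # A toy illustration of rule-based "correction" (not Word's actual
    # implementation): the same rule that fixes a typo also fires on
    # perfectly legitimate input.
    CORRECTIONS = {"teh": "the", "adn": "and", "(c)": "©"}

    def autocorrect(text: str) -> str:
        # Naively replace every token that matches a rule.
        return " ".join(CORRECTIONS.get(word, word) for word in text.split())

    print(autocorrect("teh report is due"))      # helpful: "the report is due"
    print(autocorrect("see clause (c) below"))   # harmful: "(c)" becomes "©"

The rule can’t know which meaning the typist intended; it presumes, and some of the time it presumes wrong.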

The less obvious cause of the productivity reversal is the motivation factor. It’s not that workers are less motivated now than they were in the ’90s. In fact, productivity continues to rise at astounding rates. Recent statistics from the U.S. Bureau of Labor Statistics (BLS) show that annual growth in output per hour of non-farm labor climbed steadily from 1.2 percent in 1994 to 4.4 percent in 2003. The quarter-to-quarter numbers fluctuate wildly, but the 2004 figures look promising as well.
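To put those percentages next to Moore’s Law, a rough comparison, using only the growth rates cited above and the same 18-month hardware shorthand as before:

    import math

    # How long does output per hour take to double at the BLS growth
    # rates cited above, versus hardware at one doubling per 18 months?
    for year, annual_growth in ((1994, 0.012), (2003, 0.044)):
        doubling_years = math.log(2) / math.log(1 + annual_growth)
        print(f"At the {year} rate of {annual_growth:.1%}, output per hour "
              f"doubles in about {doubling_years:.0f} years")

    print("Hardware, at one doubling per 18 months, doubles in 1.5 years")

Even at the best recent rate, human output doubles roughly once every 16 years, while the hardware underneath it doubles ten times faster. That is the gap executives keep trying to close with purchase orders.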

So U.S. workers are arguably more motivated now than they were in the early ’90s. And given global competition for jobs, I suspect the same is true of British workers. The problem is management perception. When executives spend a ton of money on technology that’s supposed to magically increase productivity, they want results. When they don’t get the results they expect, they blame the workers. They want the workers to keep pace with Moore’s Law, and most people just can’t. It’s enough for most people to reach a sustainable level of productivity and maintain it. Past that point, technology investment is wasted on them.

The reality is that there’s no artificial substitute for human thought and creativity. You can try to speed up people’s thinking and creative work, but you will eventually hit a limit. Much to the chagrin of many executives, humans are not machines. They have limited mental and creative energy. Push them too hard, and they will need to rest and recoup that energy. Burn them out, and they might never recover their prior productive ways.

By my definition, good management is the process of creating an environment that enables employees to maintain sustainable, high levels of productivity. Technology helps, but many other factors contribute to a productive environment. Recognizing employees’ contributions, giving them generous paid time off, and, most of all, inspiring them with your business’s good cause come to mind as the basics of good management.

Many managers fail to do even the basics. Instead, they find their most productive people and give them more work, while never holding their least productive people accountable. Nothing saps productivity like inequity. When the team leader who never checks her e-mail is promoted to product manager, productivity takes a nosedive. And if you make productive people work more hours than their brains can handle, you will see a decrease in productivity. Push past 50 hours a week for long stretches, and you might do permanent damage. Cancel vacations and make them work weekends and holidays, and you will do permanent damage. They might recover their mental energy, but they will not recover their desire to use it. As a victim of burnout, I can attest to this. And I have several good friends who are either burned out or on the cusp of it. Some of the best and brightest minds in the industry can no longer produce at sustainable, high levels because they are burned out.

Executives in the United States don’t know how good they have it. The average American takes just over two weeks of paid time off per year, counting paid national holidays. The average European takes nearly five weeks. And average U.S. paid time off is shrinking as the number of contractors rises and regular head counts fall. Contractors, myself included, typically do not get paid time off. According to a BLS employer survey of benefits, 25.5 million U.S. private-sector workers do not have paid holidays, and 22.2 million do not have paid vacation. One way for executives to increase measured productivity is to reduce head counts and demand more output from their middle managers. The only way for the middle managers to satisfy these conflicting expectations is to hire more contractors, who look more productive on paper in part because they don’t get paid time off.

Given all this, it is astonishing to hear executives complain that their employees are apathetic, unmotivated, and unskilled. They can’t blame the technology. They can’t blame the workers with a straight face. When assessing blame, they should look in the mirror. By consistently burning out the best and the brightest, they’re turning out a class of middle-aged workers who just want to make origami surfers for their wave keyboards.
