Forget about information overload. Time to deal with total sensory overload.
LOS ANGELES —
I’m at the Electronic Entertainment Expo, or E3, being bombarded by noise of all kinds on all sides. Walking the show floor and actually grasping what’s going on requires a nanosecond attention span, since each booth contains dozens of gargantuan displays, each with surround-sound systems blasting the latest, if not greatest, video game. Despite my penchant for solitude, it is hard not to get excited about this show. Vendors showed up in force and some spent millions on their displays.
I’ll give you my picks of the best and worst of the show in a minute. But first I would like to focus on one company as a microcosm of this explosive industry–nVIDIA. Everyone knows nVIDIA came out of nowhere in the late ’90s to eventually dominate the desktop graphics chip business. But few are aware of the key reasons why nVIDIA is poised to become the next Microsoft of graphics silicon intellectual property (IP). I sat down with Tony Tamasi–general manager of nVIDIA’s desktop graphics processor business–here at the show, to get a better sense of why nVIDIA continues its phenomenal run of success.
I say nVIDIA is an IP company because it is a fabless chip vendor, meaning it designs technology and contracts out the actual fabrication. This is one of the keys to its success–the cost of chip fab plants has soared while the margins on chip fab output have crashed. But good technology design continues to have relatively low costs and high margins. This is why I call nVIDIA the next Microsoft of graphics chip vendors and not the next Intel: Microsoft is an IP company, while Intel is in both IP and manufacturing.
One key ingredient of nVIDIA’s success is contained in its first 30 patents. These patents outline a Universal Driver system that enables every nVIDIA chipset to run on one and only one driver. This is significant for a variety of reasons. It is a boon to users, who get plug-and-play on any card or chip upgrade without loading new drivers. And it’s a boon to developers, who never have to gear their code to a continuously changing driver set. I asked Tamasi if nVIDIA might consider licensing these technologies to other chip and board manufacturers–it would drastically simplify PC lifecycle management. He said nVIDIA hasn’t considered the opportunity but would look into it.
Before the early 2000s, nVIDIA was a minor player in graphics chips behind ATI, Sony, and others. But it made a revolutionary design decision that has catapulted it into the lead in just three years and will help it continue growing market share along its current exponential curve. The decision was made in the late 1990s, when all graphics chipsets were basically application-specific integrated circuits (ASICs). That means the chips were designed to do one thing and to do it well. In this environment, it was extremely difficult for nVIDIA to differentiate itself on anything but ease of use. So its designers changed the game by creating the first true programmable graphics processor–GeForce3, released in 2001.
Processors are different from ASICs in that they are programmable. Programmable graphics logic is a great advantage because it gives the development community far more latitude in what it can do. Rather than conforming their code to someone else’s chip design, developers can now conform the chip to their game design. Along with the programmable logic, nVIDIA created a development environment–Cg–that has quickly become a standard in the industry. In this respect nVIDIA also resembles Microsoft. If you create an environment that all developers must use to be successful, the products that work with those applications will soon dominate, and eventually monopolize, the market. Not only do all high-level game development tools feature Cg, but a growing majority of game developers design games specifically for nVIDIA chips. Look at just about any packaged game and you will find the familiar “The way it’s meant to be played” logo.
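To make the shift concrete, here is a rough sketch of what Cg code looks like–an illustrative fragment shader of my own devising, not code from nVIDIA or any shipping game. It samples a texture and tints it with an arbitrary color, the kind of per-pixel math a fixed-function ASIC could only offer through whatever blend modes its designers hard-wired in.

```cg
// Hypothetical Cg fragment shader (illustrative only).
// On a fixed-function chip, this blend would be limited to
// preset modes; on a programmable GPU, the developer simply
// writes the per-pixel math.
float4 main(float2 uv : TEXCOORD0,         // texture coordinates from the rasterizer
            uniform sampler2D baseTex,     // the game's texture
            uniform float4 tint) : COLOR   // artist-chosen tint color
{
    float4 base = tex2D(baseTex, uv);  // read the texel
    return base * tint;                // arbitrary per-pixel arithmetic
}
```

The C-like syntax is the point: a game programmer who knows C can write effects directly, which is why tool vendors were quick to build Cg support into their pipelines.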
Two big-picture industry trends make nVIDIA’s dominance of this market even more significant. First, as PC technology becomes pervasive throughout businesses and homes, a key ingredient in any company’s success will be how its products network with other devices. nVIDIA has quietly developed leadership in network-enabled graphics logic that will only enhance its position. Second, as display technology becomes ubiquitous, nVIDIA’s products will become ubiquitous. In the last year, it has entered several new markets–mobile, workstation and other more vertical segments–and grabbed share there at the same rate it has in the PC video market.
Perhaps the most significant big-picture trend hit me like a rocket-propelled grenade as I left the solitude of nVIDIA’s interview space and descended the stairs onto the booming South Hall of the Los Angeles Convention Center. In the future, digital entertainment will not be relegated to arcades, living rooms, and gaming centers. It will be everywhere display and sound technology is feasible. As organic light-emitting diodes (OLEDs) follow the same scaling laws that govern silicon devices (think Moore’s Law), displays will become so cheap that people will put them everywhere. And sound will be the natural accompaniment to what’s on the screen. You think information overload is hard to manage? Try dealing with total sensory overload. A lot of folks here sure had trouble with it, as they crowded into the concourses outside the main halls to get away from the booming, buzzing confusion.
Best display: Vivendi Universal’s “Lord of the Rings” movie/game on the same plane. I found myself coming back again and again.
Best new game: “Savage,” a hybrid of first-person action and multiplayer strategy game from iGames.com (not to be confused with iGames.org, the tournament governing body). This game is destined to become an online gaming classic.
Biggest waste of space: Midway Games. What good are live music (“American Idol” first-round rejects) and all that glitz when your games are a poor substitute for EA Sports?
Worst game: Midway’s “MLB Slugfest 2003” is pathetic. The poor graphics are not the only thing about this game that lacks realism. It features players showing each other up, fighting on the basepaths, showboating, etc. It’s enough to make a baseball purist like me spit.
James Mathewson is editor of ComputerUser magazine and ComputerUser.com