Lumbering toward multimedia

Most of the tech news around the PC in the mid-1990s centered on what we called multimedia.

In those days, PC audio was like a bowl of Rice Krispies, full of snaps, crackles and pops. Graphics were often blurry or would get rendered as checkerboards of pixels, and even postage-stamp-sized videos were virtually unplayable. There was excited talk about connecting the PC to the phone to take and manage calls or even run office telephone exchange systems, but it, too, was pie in the sky.

The PC was something of a lumbering klutz when it came to handling what engineers called real-time services like audio and video. Not only was the Windows operating system too slow, but PCs also weren't designed for the kind of millisecond response times such services required. Critics, including Ron Wilson, said the multimedia problem was just another sign of the software bloat that was Windows and the overall lack of sophistication in the PC architecture.

Users were accustomed to phones and TVs that, unlike PCs, handled media smoothly. The top brass at Wintel needed to close the gap ASAP. That was a big part of Andy Grove’s motivation in creating Intel Architecture Labs.

A few dozen chip startups saw the multimedia mess as a huge opportunity for audio and video processors that would someday be in every PC.  EE Times lived to cover such multi-year gambits. We’d profile the players and how they fit into the tech landscape with their competing agendas as they jockeyed to get designed into popular systems.

One of those startups, NVIDIA, was considered a rising star. When its CEO, Jensen Huang, and his team of execs came through Manhasset on a press tour to reveal its graphics processor, Wallace chose someone else, probably Ron or Alex, to take the meeting. I felt jealous but consoled myself that the strangely named company probably wouldn’t make the cut, anyway. I had my eye on a couple other prospects, one of which ultimately went belly up in a financial scandal.

At one point, more than 50 companies were trying to design media processors of some sort for the PC. They would all have to fight Intel, which saw its rightful place as the maker of what would remain the largest and most important chip in the PC. Its x86 was, after all, the CPU, the central processing unit. From Intel's point of view, the PC didn't need expensive new chips; the CPU just needed a new audio, graphics or display block inside it.

Carl Stork shows an early multimedia PC in 1987. 
(Photo: Seattle Post-Intelligencer) 

Jim Pappas’ small team had another agenda. “We weren’t trying to stop them,” he told me. “We were actively enabling and driving the PC’s capabilities so they could succeed.”

For example, not long after PCI went mainstream, Intel's engineers realized it wasn't fast or powerful enough for bleeding-edge graphics chips, so they designed a new and faster path, the Accelerated Graphics Port (AGP).

Pappas was given engineering resources to help four chip companies get out products using AGP. One of them was NVIDIA.

“My team met with folks from NVIDIA and said their technology was good. NVIDIA was brand new at that point, but they managed to produce the first AGP card. They were very aggressive,” Jim said.

Without Intel’s help, NVIDIA might not have risen above the crowd to become the leader in PC graphics, a hyper-competitive field in the day. And without a solid grounding in PC graphics, that startup wouldn’t have had the resources to design its follow-on chips and software that lit the fuse on today’s AI revolution.

While Jim's small group was enabling startups like NVIDIA, larger engineering teams were designing Intel's own graphics chips. Some were basic, enough to support pretty pie charts in PowerPoint; some were much more ambitious.

Over time, Intel integrated versions of its graphics technology into its CPUs. It’s one of the laws of semiconductors: As chips get smaller, faster and cheaper, they suck more and more functionality into themselves. They’re hungry little beasts. So, it was natural that Intel both enabled graphics chip makers and competed fiercely with them.

Fundamentally, Microsoft and Intel shared an agenda. They wanted to grow the PC pie and make it tastier. That meant driving functionality and simplicity up and costs down. Typically, that translated into setting ad hoc standards, like PCI and USB, so the loosely knit PC community moved forward more or less together. And typically, those Wintel standards kept the operating system and the CPU as the most important players.

Of course, Microsoft and Intel engineers didn’t always agree on what features should be added when or how they should be implemented in software versus hardware. The Wintel relationship had a certain “creative tension,” Stork often said, with a sly smile papering over backroom arguments.

Inevitably, a new hardware concept at Intel would require a revision to Windows, or a new idea at Microsoft would need a change in hardware.  “Most of the friction was when we got into each other’s space,” Jim said.

For example, to end the problems with bad Windows drivers, Microsoft set up a testing lab. Every company would have to submit its code to be tested and certified. Intel execs were livid.

“We said, you can’t do this, you will fix quality, but destroy the pace of innovation,” Jim recalled.

Ultimately, Intel hired as many as 200 software developers to help write drivers so passing lab tests would not become a roadblock. “We may have had more people writing drivers than Microsoft did,” Pappas said.

But reporters like me in the trade press saw a bigger picture emerging. Microsoft and Intel were driving the technology and reaping most of the profits, earning the derogatory nickname Wintel. Intel PR coached its execs to respond, whenever they heard the name, "There is no such company."

True, no such company, but the animus was palpable. Michael Slater, the founding editor of The Microprocessor Report, used to say that the only company that likes Microsoft is Intel and vice versa – and they both hate each other. 

Next: Sunny Days, Setting Standards
