GPU socket returns from the dead
AT THE dawn of modern 3D, there was a market slaughter between 3Dfx, Matrox, S3 Graphics, ATI Technologies and a shedload of small players such as Rendition, PowerVR or Number Nine.
There was no Nvidia then - it didn't come into the frame before the Riva 128 and ZX. That was the time when people spoke of Duke Nukem Forever being a hit game for 1998, and of BitBoys making a GPU which would demolish the existing players. Prey was also being discussed as a game which "currently cannot be rendered faster than one frame per minute".
Rendition will also be remembered as the first IT company ever to hold a poem contest. As far as I can recall, the winning rhyme was something along the lines of "Oh Rendition, bring 3D power to my verition". Whatever a verition is. I can't figure out why, but it reminds me of the Radeon song. Anyone have that?
Rendition was in bed with Diamond Multimedia and Creative Labs, both heavyweights that battled against we-build-our-own-products ATI and Matrox. The company's moment of fame was the Verite 1000, a 3D accelerator which was a 2D/3D combo at the time, running GLQuake in - OpenGL. 3Dfx knocked everyone down with VGA (Voodoo Graphics Adaptor), and Rendition started to develop the 2200 and 3000 series. What mattered with the 3000 was, well, something new.
Rendition had the idea of an interchangeable GPU socket which would offer "complete graphic card flexibility". With an SO-DIMM slot on the PCB, all a consumer would need in the future was add-on memory modules of different sizes and graphics chips. Rendition thought AGP 1x and 2x were "it" as far as standards go, and hoped to sell a lot of boards. However, the company was sold to Micron Technology and soon ceased to exist.

What matters about Rendition today is its legacy in the form of the soon-to-be heavily touted "GPU Socket". Both ATI and Nvidia will probably try to tell you that the GPU socket is their own baby, with Nvidia claiming greater flexibility with a GPU socket on a motherboard, while ATI plans to shatter Intel, AMD and AGEIA with its GPGPU goodness. But ATI and GPGPU is a whole different story.
The INQuirer