Just to add my experience to the pile: when I went to college I was able to convince my parents to get me a custom PC from a company called GamePC. Among the specs in 1998:
400Mhz Pentium 2
128MB
Nvidia Riva TNT
3DFX Voodoo2
CDRW (4x4x24 I think)
Syquest SparQ (Awesome, but had major issues)
Internal Zip Drive
Just a ridiculous system for the time. Quake 2 and Starsiege Tribes were really popular in our dorm and that system was just perfect for it. Also popular was burning lots of pirated games, so we'd order CDRs in bulk from this really random site overseas. High quality "gold" CDRs and they were far more reliable than any of the ones you'd find in stores in the US for about half the cost.
Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz (sometimes up to 533mhz, but that was less stable) and started playing around with OSs like Linux and BeOS to actually take advantage of them.
> Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz (sometimes up to 533mhz, but that was less stable) and started playing around with OSs like Linux and BeOS to actually take advantage of them.
Half of HN alive at the time probably had that motherboard - ABIT BP6 was the reason I got my hands on a copy of W2K, and also started playing with Linux.
I'm still bummed that CPU manufacturers basically killed off the third party chipset industry. Maybe it was inevitable when memory controllers moved on-die, but I remember when there was actual competition in the chipset business.
Chipset, not CPU. For example, Nvidia was a well known chipset manufacturer around this time, shortly before memory controllers went on package and 3rd party chipsets died off.
Speak for yourself, mate! Many fun times were had with machines built by each. I have particularly fond memories of the SiS 630 / 730, Via's KT133A, and lots of old ALi, OPTi and ULi gear from the 286, 386, and 486 era.
Had pretty much the same thing... but only one overclocked Celeron to 433. Was an amazing upgrade from my Pentium 133 with a Matrox Millennium, which I somehow used to complete Half-Life in low-FPS agony.
I still have distinct memories of "playing" CS in 640x480 on a PII with the same card, which didn't do 3D at all iirc. 12-15 fps with the software renderer, depending on how many bots you had.
It was a cool board. I didn't technically have one, but I built my dad a W2K server on a BP6. I always wanted to hack on it and overclock with it. But after I handed it over, I wasn't allowed to touch it: "you'll burn up my processors." Since he didn't care about overclocking he had dual P2-400s or maybe 450s. It was a beast. He could run SQL Server and compile Delphi apps so hard.
I got my kicks though with a BF6 and a 300A. Those were the times; at least until the Athlon XPs (AXIA -- anybody?) were released.
The first PC I built myself had an Athlon XP Barton 2500+ and 2x 256MB sticks of DDR-400. It wasn't top of the line by any means but great bang-for-buck in 2003.
Agree -- that dual Celeron setup (often with a Peltier cooler) was suuuper common. I knew so many people who rushed out to get them and run them at 500(?).
It was my second exposure to SMP though: the first was a dual-socket 200MHz Pentium Pro which ran NT 4.0 for the longest time (I still keep that hefty CPU around on my desk for laughs).
I wonder what the equivalent of this would be in college today? Some low-cost graphics card rigs? Or is it more like some Cloudflare-style setups?
I'm slightly confused, how would games of that era benefit from a dual CPU setup?
Old games were decidedly single-threaded and built for a single-core world. It was only in the mid-to-late 2000s that games started to be more optimized for multi-core CPUs. And multi-CPU is even more difficult because there isn't any cache sharing.
It was just single-CPU, but I had the ABIT BH6 with a Celeron 300A, one of the most overclockable setups ever. Mine was stable at 450mhz without any special cooling.
Similar experience, I had a Cyrix PR200 which really underperformed the equivalent Intel CPU.
Convinced my parents to buy a new PC; they organized with a local computer store for me to go in and sit with the tech and actually build the PC. Almost identical specs in 1998: 400MHz Pentium 2, Voodoo 2, no Zip drive, but had a Sound Blaster Live ($500 AUD for this at the time).
I distinctly remember the invoice being $5k AUD in 1998 dollars, which is $10k AUD in 2024 dollars. This was A LOT of money for my parents (~7% of their pretax annual income), and I'm eternally grateful.
I was in grade 8 at the time (middle school equivalent in the USA) and it was the PC I learnt to code on (QBasic -> C -> C++), spent many hours installing Linux and recompiling kernel drivers (learning how to use the command line), used SoftICE to reverse engineer shareware and write keygens (learning x86 assembly), and created Counter-Strike wall hacks by writing MiniGL proxy DLLs (learning OpenGL).
So glad there weren't infinite pools of time-wasting (YouTube, TikTok, etc.) back then, and I was forced to occupy myself with productive learning.
I could share almost exactly the same story. So grateful my parents could afford, and were willing to spend, the money on a nice PC that I entirely monopolised.
Oh man the Celeron A, which was basically a Pentium II with on-die L2 cache. Intel attempted to handicap it by limiting its FSB to 66 MHz, but any half-decent motherboard would allow you to bump that up to 100 MHz so long as you had the rest of the hardware to support it (i.e., PC-100 memory). This resulted in a pretty significant bump in CPU frequency.
That and high speed internet. I played for a couple of years on 28.8K. The day I got a better graphics card was great. No more choppiness. The day I got cable internet was life changing in Tribes (and Tribes 2)!
I think I still have a pic somewhere of the infamous NoFix scolding "LPBs"
I remember when cable internet started showing up... I'd cart my computer to a friend's house once a month for a weekend LAN party and to run updates.
Back then, updates over modem took hours to run; it was kind of crazy considering how many easily exploited bugs existed back then.
> it was kind of crazy considering how many easily exploited bugs existed back then.
Anyone on IRC learned this pretty quick.
I thought my computer was up to date on everything, ran win2k, zone alarm firewall well configured, and someone on IRC said they had a script with all known exploits and I invited them to run it against me… they still managed to crash my computer.
There was a moniker for the few people with high speed back then - LPB - low ping bastards. All those fortunate enough to live in a city with adsl or cable high speed in the early days (or gaming at work or university on the T1)
Interestingly enough, these days it's often an advantage to have high ping, because modern games make client-side hit detection authoritative. With Apex Legends, Respawn uses the argument that playing against laggers but with client-side hit detection makes the bullshit that happens "symmetrical" and they want to keep the game accessible for people with poor connections, but anyone that plays against laggers knows that is absolutely not the case.
I wish modern games would just include a Ping Lock toggle in the matchmaking. "Do not match me with anyone with poor connection quality" (>100 ping, >1% packet loss). With a big fat pop-up warning that it'll increase matchmaking times.
It was deeper than that. That was just the way we were all classified back then: hpb (high), lpb (low), slpb (super-low?). When we got a cable modem in '99, I felt like hot shit leaving the hpb shame behind.
It's too bad that tribes games' learning curve is too steep for people now. Tribes Ascend was pretty well made but died quickly, and Tribes 3 seems to be dead even faster.
Very few people who didn't already play the earlier games have much stomach to figure out how to even move effectively across the map or hit anything moving at high speed, let alone do proper cap routes, chase, etc. I played Tribes Ascend for a while and on most random servers you could play the first 2 minutes as a chaser to verify "yup there is nobody who knows how to move quickly", kill yourself over to chaser, and then end the game in like 4 more minutes when everyone else is just slowly messing around in midfield doing nothing productive. And I wasn't even any good lol, when I went into any semi organized pug I would get destroyed.
vgh! Except that texture transparency worked with glide (voodoo) cards and not with opengl or software rendering. So if you made a custom skin with transparency texture there was a brief time in the Tribes 1.1 to 1.2 era where you could be somewhat invisible to people with voodoo cards (if your skin was in a skin pack that everyone had).
> There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz
Was that the BP6 motherboard from Abit?
I had that board, those processors and used to overclock them too.
Also ran Linux and BeOS on it (though IIRC you had to patch BeOS for SMP support).
Quake 3 ran so smoothly on that machine, even without Q3's experimental SMP support enabled.
That was actually my all time favourite computer, even to this day.
I also had a TNT2 in an earlier machine, but the BP6 machine had a GeForce 3.
Dual 300As overclocked to just over 500Mhz each on a BP6 with Geforce 256 here too! Fastest, smoothest machine I ever used until the M1 MacBook. Quake 3 multiplayer demo ran so fast it gave me motion sickness ^^ Years later I "upgraded" to a 1Ghz Athlon and it felt like a downgrade a lot of the time.
> though IIRC you had to patch BeOS for SMP support
The board might have required a driver or patch, but SMP was BeOS's entire reason for being! The drawing of each window on the screen ran in a separate thread. It was their main selling point.
Reading the BeOS Bible talking about that is quite a throwback:
> As described elsewhere in this book, BeOS uses multiple processors with incredible efficiency. If you'll be running BeOS most of the time, you'll get more bang for your buck by getting two (or more) older processors than by installing one superfast CPU. Last year's 266 MHz CPUs will always be dirt cheap compared to today's 450 MHz CPU. Thus, when running BeOS, you could have 532 MHz for less than the cost of a single 450 MHz processor. The catch is that if you'll be dual-booting into operating systems that won't recognize a second CPU (such as Windows 95/98), you'll end up with half of your processor speed being wasted until you reboot into BeOS. Chances are that once you start using BeOS regularly, you won't want to use anything else, and you won't regret buying a multiprocessor machine.
Lack of SMP was an artificial limitation for the BeOS 5 Personal Edition (I think it was called). The idea being you’d get BeOS for free but you couldn’t use it as a proper multiprocessor workstation without paying for a license.
This was also the same BeOS release that targeted Intel and ran off a virtual disk stored on a Windows FAT32 partition.
Overclocking Celerons... those were the days. Intel binning down a bunch of processors capable of reaching higher clock rates but selling them as a lower-end part was a boon for college students everywhere.
Nvidia RIVA TNT which used the AGP bus on the Intel 440LX mobo.
A whopping 128MB of RAM and an 8GB HDD.
I recall using a program called WinSplit to split the Nvidia driver over several floppy disks on my boss's Win 3.1 machine in the office. I didn't have internet at home and really wanted to play Jedi Knight and Battlezone.
I recall the legendary Celeron being the 300A. It was 300MHz, but was easily overclocked to 450MHz. There were higher clocked versions, but regardless of which CPU you got, they ultimately were only able to overclock to about the same frequencies.
Also, the celerons of that generation did not have unlocked multipliers. The only way to overclock them was to overclock the front side bus, which also controlled memory bandwidth. The "standard" FSB speed was 66MHz. By overclocking a 300MHz CPU to 450MHz, you got a 100MHz memory speed. By overclocking a 366MHz CPU to 466MHz, you "only" got 78MHz of memory bandwidth.
My friend in college had one. Windows 98 didn't support SMP, so he had to run Windows 2000, which was based on Windows NT, and would be the basis for XP. Compatibility with games was sometimes...interesting. Windows ME came out about that time, but was absolute garbage. All of us either stuck with 98SE or experimented with 2k. None of us actually bought it of course...
So the story originally started with the cacheless 266 MHz Celeron. CPUs were delivered as AICs (add-in cards) at the time, with separate cache chips, so to deliver a budget processor, they shipped the same silicon, but without the cache chips added. Removing the cache drastically tanked the performance, especially on integer workloads (typically productivity software), but didn't really affect floating point workloads. However, it had the side benefit of removing the part of the AIC that was most sensitive to overclocking (the cache). It used a 66MHz clock with a fixed 4x multiplier, and upping the clock to 100MHz got the Celeron running at 400MHz, which had performance roughly equivalent to a 266MHz Pentium II with cache for integer workloads, but for games, it was almost as fast as the fastest Pentium II of the time (which topped out at 450MHz).
In order to stop the overclocking, Intel decided to add some cache back to the CPU, but to save money, rather than using cache chips, they stuck a relatively tiny amount of cache directly on the CPU die, and released the now infamous Celeron 300A.
Because the cache was on-die, it could overclock just as well as the previous Celeron, but this time the 300A was faster than the equivalent Pentium II because the on-die cache ran at twice the clock speed of the external caches.
> By overclocking a 366MHz CPU to 466MHz, you "only" got 78MHz of memory bandwidth.
I think the PCI bus probably also typically ran at some fraction of the front-side bus. The common FSB frequencies around those times were 66 or 100 MHz which gave a standard ~33 MHz PCI bus frequency with a multiplier of 1/2 or 1/3. FSB frequencies that weren't close to a multiple of 33 MHz might have caused trouble with some PCI cards. Might have depended on how the motherboard or chipset handled the bus frequencies, too.
Of course the PCI bus should probably always run at 33 MHz but I think I saw it being modified with the FSB speed at least on some motherboards.
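Since the multipliers were locked, the whole overclock really was just FSB arithmetic. Here's a rough back-of-the-envelope sketch of how the FSB propagated to core and PCI clocks (the function and the specific multiplier/divider values are just illustrative of the 300A case; actual boards varied):

    # Sketch: locked-multiplier CPU, so core clock and PCI clock both follow the FSB.
    def clocks(fsb_mhz, multiplier, pci_divider):
        """Return (core MHz, PCI MHz) for a given FSB setting."""
        return fsb_mhz * multiplier, fsb_mhz / pci_divider

    # Celeron 300A: 4.5x multiplier.
    print(clocks(66.6, 4.5, 2))   # ~(300, 33)   stock: PCI in spec
    print(clocks(100.0, 4.5, 3))  # (450, 33.3)  overclocked, PCI still ~33 MHz with a 1/3 divider
    print(clocks(83.3, 4.5, 2))   # ~(375, 42)   odd FSBs pushed PCI out of spec without a 1/3 divider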
It was crazy how fast things moved back then. A friend of mine had a 233MHz P2 with 32MB and a 2D card, and within two years it was a dinosaur, being shown up by machines like yours, 400-450MHz, 3D cards, way more memory....
Ah, a fellow Tribes player. Just so you know, we still play Tribes. Join us! http://playt1.com/ - the community maintains the master server and clients these days. There are good pick-up games on Fridays and weekends.
I love this game; it's also amazing to me how the concept of "skiing" was foreign to me when I first played T1 and T2, and now it's a core game mechanic.
I think the other commenter is right...you're thinking of DVD-R vs DVD+R, possibly even DVD-RW and DVD+RW.
Based on the specs listed, OP was in college just before me or may have overlapped. The big gold CD-R stacks (you could buy them in jewel cases, on spindles, or just gross stacks which were nice and cheap) were a huge thing with my group (who encoded to FLAC & MP3 -V0 and burned audio CDs relentlessly). We felt we were archiving our liberal arts college music library and radio library for the future! Who knows. Some of that "future" is still backed up and on hard disks, and I should migrate them to SSD or tape just on principle.
At that point CD-Rs were cheaper than CD-RWs, and because most archiving/distributing didn't require rewriting (not return-on-investment wise anyway), we just shared programs on CD-R as well. In some ways it was a beautiful technology! Particularly fidelity to a spec everyone tried to bend and break for a profit angle, when honestly, there was no point for many of us using CD-R.
It was truly jaw dropping firing up quake 1 for the first time on 3dfx voodoo1. Double the resolution of software and super smooth framerate, and much better quality texture mapping too. I recall tweaking some setting (gl_flashblend?) so that I could see pretty glowing orbs around my rockets (and strategically, also everybody else's rockets).
It's hard to convey just how revolutionary the original voodoo cards were. There aren't many times in my life where there was a clear line of before and after, but this was one of those times.
Still blows my mind that it was just a flash in the pan. At the time it felt that 3dfx was certainly going to be a dominant force in computing for years. And although they lingered a bit, the whole event was barely over 2 years.
I think everyone understood that having it be 3D-only (and requiring a separate graphics card to do normal 2D desktop graphics) was a half-solution, and 3DFX's long term success would depend on their ability to provide a full 2D/3D solution before existing competitors like NVIDIA, ATI, and Matrox could catch up with their own 3D accelerators.
I remember reading a historical piece where the Voodoo's success was partially luck. At the time the first generation cards were being developed, EDO RAM was super expensive, so most competing designs were hamstrung trying to do things with very little RAM. By luck, EDO RAM prices crashed right as they released, making them far more affordable to manufacture than 3dfx could have reasonably expected. That gave them an early and massive lead with their initial design.
That took a lot longer really as well... I remember seeing SATA SSDs around 2009, paying a massive amount for my 64GB Intel drive (that ate itself just over a year later)... I hated moving/symlinking so much... but, fortunately by the time it died, I could go to 256GB or 512GB (don't quite remember which) for not too much more.
Even then, I was still seeing most Desktops sold with spinning rust for several years later.
They also had the most recognizable unified box art style from all HW makers[1]. When you saw those eyes staring into your soul off the shelves, you knew it was a 3dfx GPU. They also had the best ads. [2] HW vendors today don't have the balls anymore to use designs like that, it's all flat sterile corporate nonsense.
Unless I'm mistaken, those cards were all produced by 3dfx after their acquisition of STB. Regardless, that box art blew my 14 year old mind back in the day.
I think mine went into a computer that we donated to a school, or something. Around 2002 or 2003, my dad and I put together a bunch of working systems out of spare parts and donated them.
Mine was the PCI version of the card. Crazy looking on Ebay how much even the bare card goes for now, let alone when someone has the full boxed set.
I wiped my drive a few times before realizing dropbox didn't back my wallet up. I shrugged it off losing 30 bitcoins worth maybe at best 3 cents each at the time. Hindsight is 20/20 I suppose.
I was the unfortunate owner of an S3 ViRGE card at the time - the (in)famous "3D decelerator". I managed to get Quake running on it, and it looked nice, but was slower than with software rendering...
I had an S3 ViRGE too. It really was a decelerator, and the number of games that actually supported it was minuscule. I managed to run GLQuake, but without any textures - just shades of gray - and even that ran at most a couple of frames per second.
But there was another game - Terminal Velocity - that actually looked a lot better with hardware rendering, although it was still slower than software rendering. So, I would run it with hardware rendering to enjoy flying and then restart the game with software rendering to actually fight the enemies. :)
Same here. I can still vividly remember the experience of loading in with a Voodoo2 for the first time. It was night and day -- mind completely blown. The late '90s really established a new version of the gamer; consoles were fun, but computer gaming was a different world. It made me a junkie for reading about hardware, overclocking and gaming.
Replaying Heretic 2 back in 1998 with my first Voodoo (Banshee) was a borderline otherworldly experience, compared to my first playthrough of the game using software rendering. Nothing has blown my mind the same way since.
I have had three experiences like this in my life:
1. PC Speaker -> Sound Blaster: Most games that I had were instantly upgraded
2. Doom: my first "real" fluid 3D experience, with stairs, etc, compared to maze-like maps in Wolfenstein
3. Software Rendering -> 3dfx (Canopus Pure3D): Transparent water surfaces in Quake (if you re-vis'd the maps), smooth and fluid visuals, it was amazing.
The closest thing to this, in modern gaming, has been the boss fights in Elden Ring: https://i.imgur.com/gzLvsLw.mp4 -- visually, they are quite epic.
Debatable. I always preferred the crisp look of the software renderer to the washed-out GLQuake. Same with Quake 2. I think it's because textures back then were too low resolution, so filtering just makes them look muddy.
It’s also because the VGA signal quality from the 3dfx Voodoo wasn’t very good.
It didn’t have a traditional 2D graphics core at all, so it needed another graphics card for rendering the desktop (any non-accelerated apps really), and this was connected to the Voodoo using VGA passthrough. There was a noticeable image quality loss from this arrangement.
A Matrox card would give you crisp VGA with nice saturation, but the 3D acceleration was nearly worthless. Choices…
I really disagree. There were some nice Matrox cards. They weren't as good at 3d as 3DFX but for the time they really improved gaming. I developed Battlezone on G200. In those days we tried to have everyone have a different graphics card because the companies would just give them to us and we wanted to work with every card.
Matrox had great hardware, but the software drivers took too long to catch up. I was on the OpenGL team and my life's mission was to get Quake running as fast as the G200 and G400 were capable of. We finally caught up and got parity with Nvidia's TNT2, and then bam, they released the GeForce 256 series, and it was curtains for Matrox because their next gen hardware wasn't ready yet.
I agree that the washed-out textures haven’t aged well.
But at the time, not having pixelated textures was the first thing people looked at when judging graphics quality. I remember that being a HUGE selling point of the N64 and something that made the console look almost a generation ahead of the PlayStation and Sega Saturn to kids back then.
Today, I think I prefer the PSX look, though. Maybe with perspective correction to avoid the warped textures of the PlayStation.
Might have also been one of those things that looked better on the 14-15" CRTs of the time vs crisp high-res flat panels of today. They were blurry enough that 640x480 was "high resolution" (I remember not being able to easily see the pixels at 800x600 on a 14" CRT unless I came super close to the monitor).
Even today I think a lot of Doom clones look better (or more nostalgic) with software rendering and texture mapping rather than OpenGL. There's an intensity of saturation to the colors that's lost. Fireblu is never quite so eye burning as when it's in software.
I came here to comment similarly; the lower-resolution, pixelated, software-rendered Quake seems to work well with the textures. They have a bumpmappy, fuzzy feel that gets lost in the sharp-cornered, super-flat, texture-mapped-and-filtered version you got from the 3D accelerators of the time. I guess my brain just adds context to the low-res images.
Before Unreal, I had an S3 ViRGE for 2D and a PowerVR 3D accelerator pair, and I was always flipping between software, ViRGE and PowerVR depending on the game. Which at the time was largely Hexen/Heretic. The PowerVR was higher resolution and clean/sterile but never seemed like a much better experience.
But then there was Unreal, the first game I think was absolutely better on an accelerator (Voodoo2 in my case). It's also pretty much the last of the serious software renderers, and outside of the Voodoos its software lighting/texture mapping/etc. did a better job than any of the previous (affordable) accelerators. Which is why I ended up finally replacing the PowerVR with the Voodoo2. The results were 'unreal'. Some of that might just be bias; I played insane amounts of Doom/etc. but never really got into Quake. Quake just seemed like Doom rehashed to me, so I was off playing Warcraft/Diablo/Hexen/etc.
And frankly, outside of FEAR, I stopped playing first-person shooter games for 20 years; the graphics improvements were so incremental, and I just kept seeing flat low-polygon models everywhere. And I don't think that looks good. Even after all the tessellation/bump mapping/endless tricks, I kept seeing frames where I could play "count how many polygons are onscreen right now" games. It's gotten better the past few years, particularly some of the lighting; at least the screenshots/cut scenes are no longer obviously not in-game rendering. The UE5 demo is slowly becoming reality in actual games, so maybe it's time to revisit a few of them.
You can 'fix' the texture filtering to nearest neighbour in hl by adding the following to userconfig.cfg (should be in a directory called 'valve' in the game's root directory):
gl_texturemode GL_NEAREST_MIPMAP_LINEAR
gl_ansio "0"
gl_round_down "0"
Or just entering those lines in the console, preceded by 'set'
In terms of pixels, it was 4x the resolution. And for fun, one of the window textures (visible in the difficulty choice lobby IIRC) was implemented as a mirror in glquake - IIRC John Carmack said it was so easy to do in OpenGL he did it as a kind of test.
Always thought the original software renderer looked much better. It didn’t have the bilinear filtering, so the textures didn’t look all smooth and ‘washed out’, which suited the environment more imho
I can't speak for the original GLQuake on 3dfx hardware, but on OpenGL-compatible Quake engines (which include modern Quake source ports such as Quakespasm, Ironwail, and vkQuake), bilinear texture filtering is an option that can be turned off.
I play on vkQuake with nearest-neighbor texture filtering, square particles, and the "classic" water warping effect and lighting effects, alongside 8x MSAA, 16x anisotropic filtering, high-resolution widescreen, etc. This keeps the art style consistent with the look of the original Quake, while still allowing for the benefits of hardware 3D acceleration.
For what it's worth, the modern source port for Descent (DXX-Rebirth) makes bilinear filtering optional, too, while using OpenGL and allowing MSAA and stuff. I played through the first two games some time ago and I also found the bilinear filtered textures worse-looking than the blocky ones.
I'd be interested in a RTX-enhanced software renderer. Ie replace the baked lighting with the GI raytracing, but otherwise keep the rest of the software renderer. Have a feeling that could be an awesome blend.
Would be a bit challenging with the palette but should be doable.
Yeah that's what I was thinking. Like do a RT-only pass doing lighting (no textures), then do the software pass using the RT-lighting rather than baked lightmaps.
Latency would be slightly higher but I guess one could implement the important parts[1] of the software renderer on the GPU.
For me it was Carmageddon. I bought it later on an iPad, and it may just have been rose-tinted glasses from being completely blown away back in the day, but the iPad version never seemed quite as crisp...
I dreamt about having the Voodoo but I could not afford it. Went with a Rendition Verite-based one. It was underpowered compared to the Voodoo, but I really consider it the first real GPU as it was a RISC processor.
If I remember correctly, to get transparent water the level also had to be reprocessed through the "vis" program with a transparent-water flag set.
vis did a precalculation of which level segments (the partitions in the binary space partition) could be seen from any other level segment. The end effect was that while GLQuake did have an option for transparent water, the geometry would not draw on the far side, making the effect useless without a bit of extra work. But I have to admit I have no idea if entities (other players) would draw or not.
Adding: the server and client had to both be running vis patched maps to be able to see other players in the water due to the way entity visibility was calculated server-side.
The downside to running vis patched maps on a server is it used slightly more CPU than unpatched maps IIRC. Perhaps someone that ran more servers than I did (I ran two nodes on an Intergraph InterServe with dual P6-200s) could weigh in on what the impact was at scale.
There was also a perverse effect on some games. With a graphics card, your gameplay could be altered and you had to unlearn all the reflexes you built on CPU rendering alone. Moto Racer (1997) was like that. The gameplay with a graphics card was completely different, even trajectories (I assume lag made the CPU accept a little more rounding error).
> I can very clearly remember installing the card and then loading up 2Fort4 in the Team Fortress mod and suddenly being able to see THROUGH THE WATER.
Searching for "2Fort4" in YouTube yielded some interesting videos for people curious what the original Quake Mod version of the map looked like:
As someone who still spends at least 3 hours a week playing 2Fort on the current Team Fortress 2, it's fascinating to see how the core of the map is still basically the same after 20 years.
EDIT: Another video which goes into more detail about the history of each 2fort version, going back to its original inspiration from a Doom 2 level:
Even having a solid dial-up connection with a ~180-185ms ping was a ridiculous advantage when most HPBs were ~350ms, particularly in clan invitationals for Q1CTF. We were playing as LPBs in the dorm at ~45-60ms and 180ms wasn't that much of a concern, aside from sliding around corners more than expected, but at 350ms you were basically shooting predictively at where you assumed they'd be next, not where they 'were'.
Subspace/Continuum also used lag in its gameplay, with players warping to recently exploded spaceships so they could continue to invade. It was an established technique and had to be defended against.
On a very different scale, but I recall playing bzflag decades ago and discovering that I simply could not jump my tank to the second level. My graphics card was so slow that something wasn't working correctly, and no matter how many times I tried from different distances I would almost make it, but not quite.
I used to play games like Starsiege (the mech game) on dialup. With our 250ms pings, your brain just learned to compensate for the delay and click that much earlier to shoot.
But yeah, those lucky people with their DSL modems and 60ms pings would wipe the floor with us.
I loved playing starsiege back in 2000. I had a wired college campus connection, but shared with so many students, my pings would go anywhere from 50 to 500 depending on time of day. Near-timeouts showed the client side prediction code in action, with mechs sliding around and then freezing in place.
The Torque engine (at least Tribes 2) was wild in what it was capable of on dial-up: 32 or 64 players was okay, compared to 8 on Quake 2, depending on your latency and connection speed.
I played in Tribes 1 tournaments ... I never had a computer powerful enough for Tribes 2 – it had a very bumpy launch, remember? A lot of C++ exceptions, etc.
But Tribes 1 was incredible in how smoothly it handled "larger" servers. It would be so chaotic.
When qtest was released, I was there in the IRC channel and one of the first to play.
I remember connecting to someone's machine and just destroying everyone. Afterward I got a message from someone congratulating me, but being incredulous about my ping time being 26ms.
I happened to be on the first floor in the first dorm on campus to get wired internet access, and they had an OC3 dedicated to it. Two months earlier there were 16 of us splitting that line, with public IP addresses. (Going back to dial-up after leaving the dorm was.. difficult).
So I told him, yeah I kinda have a ridiculous internet connection here at school. He was like, "no, you don't understand - it is my machine. I am playing on the server and your ping time is lower than mine!"
Crazy illustration of "nothing happens anymore." 3dfx seemed just as dominant in the 1990s as NVIDIA does today. But from founding to selling to asset sell-off, the company lasted just six years. Meanwhile NVIDIA has been king of the hill since the GeForce was released in 1999, which was 25 years ago.
AMD overtook Nvidia at times in the gaming space. I'd say that Nvidia has been king of the hill since the introduction of CUDA, since that's what really cemented their position in the tech sector.
Pre-AMD acquisition ATI also often had better hardware specs than NVIDIA, but their drivers were so often buggy and terrible. By the time they'd been fixed the reviews were long since done on the initial release versions.
AMD seems to run a better software shop, at least.
The 90s was an absolutely crazy period for PC hardware. So many flash-in-the-pan companies making novel devices and then dying entirely as their niche became obsolete. There used to be tons of display board manufacturers and very few of them survived the introduction of 3D acceleration.
Sometime in the late 2000s, I put my Voodoo card on Craigslist. I got pinged immediately; the buyer told me he'd pay double if I reserved it for him. The cards were precious for keeping some arcade game cabinets running, and with the company no more, my used card was a lifeline. I wanna say it was a golf game like Golden Tee? I was delighted to make the sale and happy to charge him the original (not double) price.
I recall Unreal had an option to switch to the 3dfx card, and IIRC it had some additional features like more colourful lights and such.
Unreal was such a beast back in the day that it completely beats Quake 2 and other contemporary FPSes even in software rendering. TBH it still looks beautiful even by today's standards, if you ignore the low polygon counts.
I'm not a person who cares too much about graphics, even for FPS (I don't really enjoy most modern FPSes except Wolfenstein, which has interesting gameplay), and I'd argue that too much graphical eye candy simply decreases the overall quality of a game, but 3dfx definitely was a huge bang back in the day.
The performance boost also made a significant difference in how well the game played: I remember when the Voodoo 1 came out I had a 100MHz Pentium, and running Quake (in low resolution) was "fine" but ran at like 20-25fps. With a Voodoo that game ran at a smooth 60fps, which made it so much more fun for such a fast-paced game (while also running at a higher resolution with smooth textures and improved lighting effects). It made a huge difference on multiple axes.
The percentage change in resolution you ran the games at was also absolutely mind blowing too.
For the most part we went from running the game at 320x200 or 320x240 to 640x480 in that first round of acceleration. I think in terms of % change it is a bigger leap than anything we've really had since, or certainly after 1920x1080.
So you suddenly had super smooth graphics, much better-looking translucency and effects, and the # of pixels quadrupled or more, and you could just see everything so much more clearly.
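For what it's worth, the raw pixel math behind "quadrupled or more", assuming the common modes of the era:

    # Pixel counts for typical software modes vs. the first-gen accelerated mode.
    low_200 = 320 * 200    # 64,000 pixels
    low_240 = 320 * 240    # 76,800 pixels
    accel   = 640 * 480    # 307,200 pixels
    print(accel / low_200)  # 4.8x
    print(accel / low_240)  # 4.0x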
Yeah that's true. Software rendering at low resolution is not a good sight to look at.
I remember back in 1997, when Quake 2 was just out, I sat in a net bar (where you pay to use a computer) and played an hour of Quake 2 in software rendering. The game was interesting, but I felt a bit sick, half due to the resolution, half due to the almost infinite brownish colour. A girl behind me murmured, "This is not half as fun as Duke Nukem", and yeah, I completely agreed with her.
I think I still agree with her somewhat. Quake 2 is a great game, but Duke3d is a fun one.
Where Quake2 really shined was in multiplayer, especially mods like q2ctf.
Quake2 was released at just the right moment to take advantage of both 3D acceleration and broadband Internet access. Playing a game of q2ctf, in 3D-accelerated 800x600 mode, with 60 ms ping was just fucking amazing.
Unreal had a small menu where you could switch between different renderer backends precisely because of things like different cards having different... Driver quality let's say.
I remember how I was amazed when I got my first 3D card, a Voodoo 2. It was like having an arcade at home.
The 3dfx logo spinning up when you launched a Glide game was something.
Unreal in particular was amazing; I remember as a kid just watching the lighting and the water.
At that time every light in a game had to be colored, just because it could be done. Small rooms with green, red and blue lights moving all over the place, so 90s.
I never had that "Wow" factor again; from there everything felt incremental instead of revolutionary. How an absolute market leader disappeared in 2 years is incredible.
I think i only got the same wow factor the first time i tested a VR headset.
I remember my first five or so paychecks as a teenager scooping ice cream went to a Voodoo3 from CompUSA. I don't even think it had a fan, and I remember being shocked how small the PCI card was, as I'd been accustomed to mostly ISA "daughter boards".
I'm consistently amazed at how massive video cards are today... it really feels like it's often excessive for the sake of being excessive over any real need. I was actually happy to see the depth shrink of the Meshify C case, now I'm disappointed I'm probably going to have to swap the case for a new GPU... it's too hard to find options that fit beyond mid-range, and even then.
Those old cards were way under 50 Watts. Even a "low-end" card now like an Intel B580 (list price $250, inflation adjusted equivalent to about $125 in the late 90s.) is 225 W. Cooling and power circuitry are much more critical now.
In early 2000, I cobbled together a gaming PC from used parts that I bought or traded for. It had a K6-2, a Voodoo 2, and 192 MB of RAM. It was amazing and such an upgrade over my family’s Celeron. The big games were TFC, Counter-Strike, Unreal Tournament, and StarCraft. We LAN’d every weekend. It was heaven.
It kind of boggles my mind how short the lifespans of these companies were during periods of great technological advancement.
3dfx is founded in August 1994. The Sony Playstation came out in Japan in December of 1994.
Its heyday is roughly 1996-2000, less than 4 years. Then it goes bankrupt in 2002, just 8 years after its founding.
Within that time period we go from games like Sonic the Hedgehog 3 (released 1994), to Grand Theft Auto III (released October 2001). Just a massive amount of technological shift in a span of 6-7 years.
Feels like something similar is going on with AI right now. Compare the image-generating DALL-E 1 released in January 2021 to Google's Veo 2 released in January 2025. Just insane progress in 4 years.
Normally, companies that exploded in value at IPO brought lots of competition short term.
I guess that's a thing of the past after the quantitative easing scams and the capital shift that followed. Why fund a competitor if your capital is already riding there? Not many people at the roulette table anymore.
Apparently 3dfx had a contract with Sega for making the Dreamcast GPU, but IPO'd in 1997. As part of the IPO they disclosed the terms of the contract, and the next gen console at Sega was a closely guarded secret at the time to avoid cannibalizing Sega Saturn sales (which were abysmal).
The contract with 3dfx was canceled leading to a lawsuit. Then Sega of America CEO disclosed they were abandoning the Saturn for the next gen console. The console did not release in Japan until November 1998 and in America until September 1999. The CEO of Sega of America Bernie Stolar was fired just days before the release, partially due to these shenanigans, and the Dreamcast as a whole was such a failure it nearly killed the company.
My first 3D card was an Orchid Righteous 3D. It had a mechanical relay in it to switch between 2D and 3D modes, so it made a distinctive click[0] when you loaded Quake3D.
Or, too many times, it didn't, and I had to try again.
I loved playing No One Lives Forever 1&2 on my Voodoo 5 5500. That was the height of my PC building days. Now as a wizened old man, I'm stuck with these Apple Macbook Pros/Airs and they do well enough. But I do miss building my own machines...
FWIW, you can build a fully functional desktop for ~$400 with integrated graphics (that can play most modern games on lower settings), or maybe $600 with a discrete GPU. Less if you go with used parts.
How wizened? If you are close to retiring, maybe you can build a pc and play some games. Keep the brain running, and stay in touch with friends (if they’ll do multiplayer).
My first gpu ever was a voodoo 2 8mb. I remember starting up the original unreal and getting it working. Shortly after we got a cable modem. 12 year old me was having a total blast. ;)
Oh man 3dfx and Matrox, so much nostalgia. I didn't have the money for any of those so they will always stay a legend for me. I think my dad's 586 had an S3 on it. In those times, the only way to find games walkthroughs and cheat codes was through physical magazines and word of mouth. Internet was $$$.
I still remember these massive performance jumps you could get around the turn of the millennium. First it was the Pentium 166 MMX (SIMD integer math), then it was the 3dfx Voodoo, then it was the GeForce 256 (hardware T&L) and the AMD Athlon Thunderbird (just blasting past anything Intel could offer).
Quake2 with a K6-2 and Voodoo2 was a kickass combo. At some point id Software released the 3DNow! patch for Quake2 which yielded yet another 10 FPS... good times.
MMX wasn’t actually that useful. The vectors were only 64 bits wide, you had no float support and the supported operations were kind of uneven... SSE and especially SSE2 were a much bigger leap.
What gave the Pentium MMX the big speed boost (I also remember it being quite significant) was probably the bigger 16KB cache (the Pentium Classic had only 8KB) rather than MMX itself.
I have more memories of my 3dfx Voodoo cards than of any other old hardware. The OpenGL implementation was so buttery smooth that there is simply nothing to compare it to. Quake 2 at 120fps on a 90Hz CRT was just something else entirely. It felt like there was no input latency at all and even with a higher ping of 80-100 in RocketArena it felt smoother than modern shooters on a 144hz panel.
> Voodoo Graphics technology is also the graphics architecture for the 3D media processor chipset that the Company is developing for license to Sega Enterprises, Ltd. ("Sega") for use in Sega's next generation consumer home game console.
Love how there are 605 instances of the word “Sega” in this. Related:
Have we crossed the threshold where more "Graphics Processing Units" are sold for ML than for graphics processing?
I remember thinking it was funny that gaming ultimately subsidized a lot of the major advances in ML for the last decade. We might be flipping to a point where ML subsidizes gaming.
The 'death' of PC computing has been rather exaggerated. Each year hundreds of millions of PCs are still sold, and that's exclusively referring to prepackaged stuff. There's then the increasingly large chunk of people that simply upgrade a frankenputer as necessary. As for gaming Steam has users in the hundreds of millions and continues to regularly grow. And while that is certainly going to encompass most people, I'm sure there are some oddballs out there that game but don't use Steam.
So GPUs sold primarily for ML probably still make up a tiny share of the overall market, but I expect they make up a huge chunk of the market for cards like A100. Gaming hasn't been bleeding edge (in terms of requirements) for a very long time and card prices drop quickly, so there's just no point in spending that sort of money on a card for gaming.
Especially funny to me is how on console-oriented channels everyone is talking about the rise of PC gaming; it never went anywhere.
Using computers, not game consoles, for gaming was all over the place during the 8- and 16-bit home computing days, and the large majority eventually moved to Windows PCs as the other platforms died. That is why Valve has to translate Windows/DirectX if they want to have any games at all on their Steam Deck.
Consoles have been a niche market, at least here in Europe, mainly used by kids until they are old enough not to destroy the family's computer while playing games, given that many families only have one per household. And nowadays that role has probably been taken over by tablets.
To the point that PlayStation/Xbox are now becoming publishing brands, as the exponential growth in console sales has plateaued.
These are very different stats. He was referring to unit sales of GPUs, not $ sales. The A100 is a $8000+ video card and so cards like it are going to dominate in revenue, even if their sales numbers are relatively low. For contrast the most popular card, per the Steam hardware survey, is (inexplicably - probably because of prepackaged kits) the RTX 4060, which is a $300 card.
In 2024 256 million PCs were sold but only 40 million of those were desktops. Excluding the fact that some PCs (hard to say a number but I'd be surprised if it weren't over 40%) are office PCs with crappy GPUs, most laptops also have a bad, integrated GPU.
There's a chance that this year or the next one more GPUs will be sold for AI than for graphics.
Laptops are also desktops, for all practical purposes other than being able to swap components.
There are plenty of games to play, all the way back to the Atari 2600; not everyone is playing the latest version of Call of Duty, Cyberpunk, or whatever tops the AAA charts.
In fact, those integrated GPUs are good enough for WebGL 2.0, and I still haven't seen much in the last 10 years that can top mobile game graphics (done with OpenGL ES), other than demoscene reels from shader competitions.
I'm fairly sure OP was more concerned about modern GPUs being used as TPUs or whatever they're called, than about what graphics circuits the Atari 2600 was using.
Even mid range GPUs are proportionally much more expensive. I built a decent gaming PC with a GTX 760 10 years ago for about $900. These days you'd have to pay double for the same relative performance.
Again, there are plenty of games to choose from besides last-generation AAA games.
I guess some folks might suffer from FOMO, but that doesn't change the fact there are more games to play than most folks can hope to finish in their lifetime that aren't last-generation AAA.
>> This tale brings up many “what ifs.” What if 3dfx had gotten the SEGA Dreamcast contract? Would both companies have been better off? What if 3dfx had continued simply doing chips and not gotten into the board business? Would 3dfx then have brought its products to market more quickly?
What if Atari had continued with its rasterizer chip from 1983's I, Robot? They also had their "mathbox" for geometry transformations since the late 1970s. They were well equipped technically to enter the broader 3D graphics market in some way, but that never happened.
I remember upgrading to a Creative Labs 3dfx Voodoo Banshee and it was actually stunning - I don't think there has been a generational leap quite as apparent as seeing everything before and after that upgrade. I think I had a Matrox card before that and it wasn't even that old. This was on a Celeron 400...
I don't think we'll see a generational leap like that in the future.
Glide was nuts on games that supported it - it was night and day. It took several generations of hardware for DirectX to surpass OpenGL.
Ah, back in the day I really wanted a Voodoo and to be able to play with Glide; instead I was forced to return it to the shop and get it replaced with a Riva TNT, because the PCI slots on the motherboard couldn't talk to the Voodoo.
Quite sad day for me, even if in hindsight it was for the better, given how everything went afterwards.
It felt like I got the lesser product, only because of a little motherboard issue.
I remember getting a Voodoo5 AGP, not knowing at the time that AGP and PCI were different. I couldn't use it for the first couple of months that I had it, and then upgraded the motherboard to one that could. I remember originally running a Gigabyte GA-6BXD with Dual Pentium IIIs, but I don't remember what I upgraded to that let me run the Voodoo5.
The V5 was the largest card I'd ever seen that wasn't a motherboard, and it ran every game I wanted to play for years!
I have one question: wasn't the 3dfx a graphics postprocessor? I thought it didn't render the image in higher quality, but did postprocessing only... I never had the opportunity to own a Voodoo, but later, when I got a decent NVIDIA card, I played Need for Speed 2, which had demo videos "rendered in 3dfx" with snow etc., and my graphics were crisp and snow-free. I tried to look up why my NVIDIA card did not have those effects, and I learned that they were only overlaid on the original image by the 3dfx Voodoo...
The Voodoo was a 3D-only accelerator. It didn’t have a traditional 2D graphics core at all, so you needed another basic video card which plugged into the Voodoo using VGA passthrough. When an accelerated game was launched, the Voodoo took over and replaced the 2D card’s output completely.
That’s probably why you remember it being a post-processor. It didn’t apply effects to the 2D signal, but it needed it for all non-accelerated programs.
3dfx also supported more blending modes than most competing cards at the time. That could be why the snow effect didn’t work on your card.
I was playing around with building retro game VMs with QEMU and PCI passthrough a while back and dusted off my old Canopus Pure3D to try out with a PCI-to-PCIe adapter board. It was kind of amusing: you'd have the Windows desktop running in virt-manager, and when you fired up Unreal Tournament the desktop would just freeze and only then would the card actually output anything.
No it’s definitely a 3D renderer. Glide was a competitor to OpenGL and Direct3D that was proprietary to 3dfx. Don’t remember why the quality was higher.
Yeah, the earliest models are literally 3D only as discussed in the article, they have separate pass through cable for your existing 2D graphics because even just making a flat 2D window isn't viable directly, Glide really wants to render only textured triangles which is fine for Quake but no good for Windows.
I had one of those 3D-only cards on my first computer. I didn't know about the passthrough and got pretty annoyed that my games sucked and never worked with the hardware 3D stack. I don't know if they didn't document it correctly, or if I just missed it. But when some support person finally told me, I was so pumped. I spent a while manually moving the cable to the 3D card when playing a game, until I finally got a passthrough cable.
> While at SGI, Tarolli, Sellers, and Smith had all had some exposure to SGI’s Reality Engine, and with video games (especially the PlayStation) moving toward 3D graphics, all three saw the potential for consumer 3D acceleration.
Were they exposed to early versions of the Playstation? It wasn't publicly released until after 3dFX was formed.
While working at Virgin games in 92 we did see some demos of 3D gaming, but I can't recall who the manufacturer was.
The PlayStation was in development and was released in Japan almost a year before 3dfx released their first card. It's reasonable to assume that someone could have had experience working on the PlayStation graphics system prior to moving over and creating 3dfx. The fact that they were building a prototype system for Sega as well means they were likely involved in that space before. SGI also licensed the CPU to Sony for the PlayStation, and back then the CPU would have done most of the graphics workload for the PlayStation - similar to an APU today, with even less segregation between GPU and CPU.
I recently decided that I wanted to relive the PII/3dfx glory days and went a little bit overboard buying up parts on eBay. I ended up with an Abit BH6, PIII-800, 768MB RAM, dual Voodoo2 cards and an Asus 3800 TNT2.
I remember pining for a 3dfx card, and seeing a second hand Rush based card for sale. 13 year old me bought it. Boy was I disappointed :) I learnt a good lesson that day.
Ended up getting a GeForce a couple of years later. Still wanted a Voodoo 3, but they were a little too expensive.
I had a Diamond Viper V770 Ultra TNT2; I feel like that was a turning point in the 3dfx vs NVIDIA battle (and the subsequent GeForce marked the start of NVIDIA's industry lead).
Sometimes I still don't know how I convinced my parents to buy me certain things. We lived in an extremely rural area and were not at all "well-to-do" but we always had a computer in the house. In my teenage years, this was a Packard Bell Pentium 100. All in the space of a year or so, I somehow managed to convince them to buy me a Canopus Pure3D video card (3dfx Voodoo but with 4 MB texture memory), a big ergonomic keyboard, a Sidewinder joypad, and Tomb Raider.
The Pure3D was the real winner here, it took Quake from "meh" on the average PC to flat-out amazing. Even on dialup, I got quite a lot of mileage out of that setup.
I still am a bit of a 3dfx fanboy. Ended up emailing 3dfx at one stage and got sent a load of posters and case stickers (remember those?).
I had a Voodoo Banshee which was a fairly decent card (not quite as good as a Voodoo 2, but better than a Voodoo Rush as a combined 2D/3D card). Paired to a Pentium P133 - very overkill on the GPU. Ended up using the same card on a AMD K6-2 500 in the end which was a bit more evenly matched.
Then ended up buying a cheap Voodoo 5 5500 after they went under (only paid £50 for it).
Sadly both of them went in a dumpster a long time ago. Wish I'd kept them both. I ended up moving to nVidia cards for a while, then had an ATi Radeon. Nowadays I just run a Macbook Air for my personal machine - life got in the way of much gaming!
My memory is that during the late 90s, whenever a game supported Glide, a 3dfx card would always render more smoothly and with noticeably better textures than NVIDIA and ATI cards, even when benchmarks gave you similar numbers. So we constantly envied the roommates with a Voodoo card.
They didn't mention dual monitor and S-Video output. Having a second display was novel at the time. Using your television as another monitor, even more so.
Author here. I should have been more clear about this in the article, but it was a combination of two different factors. First, the company alienated OEMs by making their own cards. Second, the cards they made were close in price to ATI and Nvidia offerings while being slightly less powerful. To make it worse, they were struggling to produce, and so while they had some interesting hardware in the works, they couldn't get it to market quickly enough.
Nvidia did sprinkle 3dfx's technologies across their products where it made sense, and many of the 3dfx folks continued at Nvidia for some time.
Probably what I'm about to say is unfair because it happened during the last days of 3dfx, but I remember how disappointed my friends and I were when one of us bought a Banshee and tried to run some games. It ran like crap.
Everything we tried ran between "very bad" and "average", certainly not the "wow" we were led to believe from marketing. Then we tried something we had high hopes for: Trespasser, the Jurassic Park game (it would later come to be called the "arm simulator", but we didn't know this back then).
Trespasser ran appallingly bad with the Banshee. It sucked, plain and simple, almost a slide show rather than a game. We were sorely disappointed... with the Banshee.
It turned out much later that Trespasser was a very badly optimized game, and it had been an unfair test of the Banshee because the engine ran poorly on any 3D accelerator on the market.
But the Banshee's reputation was forever ruined for us. We still joke about this.
When the information about Sega was revealed, the whole leadership team should have been on a plane to Japan the next day, all to apologize, bow, and scrape.
And trying to make your own board in that moment, was just an incredible self own.
> And trying to make your own board in that moment, was just an incredible self own.
ATI built their own boards too at the time (e.g. https://en.wikipedia.org/wiki/Radeon_R100_series#/media/File...), so the strategy of wanting to control more of the value chain doesn't sound that misguided to me. Not sure when ATI stopped doing that - was it after they were acquired by AMD in 2006 or before?
The same article also has another photo of a Creative-branded card. Other card brands also definitely marketed cards with ATI chips. I remember I had a R200-based card from some third-party manufacturer. ATI and Nvidia also had reference cards but I don't know if those ended up being sold in the mass market or not.
I worked for an elevator manufacturer that killed somebody; they fucked up the whole apologize-in-Japan thing too. Even if it wasn't actually their fault.
The board I can understand: the guys had worked at SGI before and had seen how much more you could extract from the architecture by having total control over the hardware (minus the CPU, obviously). Essentially building consumer-oriented x86 SGI machines, branching out of the gaming market and challenging workstation vendors. A 64-bit Opteron+Voodoo based Windows machine would have been something to behold. But the Sega thing probably torpedoed the funding that would have been required for them to become independent of graphics card vendors.
They knew it was a matter of time before their advantage eroded. I think what really did them in was DirectX, they stuck with Glide and allowed NVidia to develop a proper implementation of the new Microsoft thingie which was heavily marketed to developers. Their moat became a prison.
Yes, I remember toward the end it seemed like they were just releasing souped up versions of last year's product. I don't know if it was a resources issue or they just didn't expect the market to advance so fast.
Some SCSIs dangling from a NCR/Symbios Logic controller
Some VIA Board I can't exactly recall, except that it supported that NEC VCM and the NCR/Symbios Logic from its BIOS, and ran really well.
A 21" Hitachi Superscan Elite/Supreme? (With all rectangular buttons, including power, not right on the front bezel, but down below that,slightly recessed) doing 1600x1200 at up to 85Hz, without annoying Trinitron wires! :)
Mostly running NetBSD (otherwise FBSD & Gentoo), whose XFree86 or early Xorg made full use of the Voodoo for 2D acceleration: absolutely fluid 'desktop' (mostly KDE3) :)
Just to add my experience to the pile: when I went to college I was able to convince my parents to get me a custom PC from a company called GamePC. Among the specs in 1998:
Just a ridiculous system for the time. Quake 2 and Starsiege Tribes were really popular in our dorm and that system was just perfect for it. Also popular was burning lots of pirated games, so we'd order CDRs in bulk from this really random site overseas. High quality "gold" CDRs and they were far more reliable than any of the ones you'd find in stores in the US for about half the cost.
Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz (sometimes up to 533mhz, but that was less stable) and started playing around with OSs like Linux and BeOS to actually take advantage of them.
edit: corrected the amount of memory
/end reminiscing about a simpler time
> Halfway through my freshman year I decided to swap the motherboard and CPU for a crazy motherboard/CPU combo. There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz (sometimes up to 533mhz, but that was less stable) and started playing around with OSs like Linux and BeOS to actually take advantage of them.
Half of HN alive at the time probably had that motherboard - ABIT BP6 was the reason I got my hands on a copy of W2K, and also started playing with Linux.
I'm still bummed that CPU manufacturers basically killed off the third party chipset industry. Maybe it was inevitable when memory controllers moved on-die, but I remember when there was actual competition in the chipset business.
Like Cyrix and AMD? I don't recall any other mainstream x86 alternatives.
Chipset, not CPU. For example, Nvidia was a well known chipset manufacturer around this time, shortly before memory controllers went on package and 3rd party chipsets died off.
Don't forget the venerable Via, SiS, Chips and Technologies, OPTi, ALi, ULi, etc.
They weren't venerable. They were anything but. They were designed to be a lower BOM on the motherboard and 90% of them were buggy garbage.
Speak for yourself, mate! Many fun times were had with machines built by each. I have particularly fond memories of the SiS 630 / 730, Via's KT133A, and lots of old ALi, OPTi and ULi gear from the 286, 386, and 486 era.
Yes, yes indeed. The nVidia nForce chipset (for AMD) was such a leap ahead because it was fast, flexible and reliable.
For Intel, you just picked an Intel chipset and that was that.
ABIT was such an amazing motherboard manufacturer, thanks for the KT7-A Thunderbird platform <3 R.I.P.
https://en.wikipedia.org/wiki/Universal_Abit
> There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup
I think for a while Intel started labelling the Socket 370 Celeron boxes as "for single CPU systems only", but it was a lie.
Had pretty much the same thing... but only one overclocked Celeron to 433. Was amazing upgrade from my pentium 133 with a Matrox Millenium; which I somehow used to complete Half Life in low FPS agony.
I still have distinct memories of "playing" CS in 640x480 on a PII with the same card, which didn't do 3D at all iirc. 12-15 fps with the software renderer, depending on how many bots you had.
It was a cool board. I didnt technically have one, but I built my dad a W2K server on a BP6. I always wanted to hack on it and overclock with it. But after I handed it over, I wasnt allowed to touch it, "you'll burn up my processors." Since he didn't care about overclocking he had dual P2-400s or maybe 450s. It was a beast. He could run SQLServer and compile Delphi apps so hard.
I got my kicks though with a BF6 and a 300A. Those were the times; atleast until the AthlonXPs (AXIA -- anybody?) were released.
The first PC I built myself had an Athlon XP Barton 2500+ and 2x 256mb sticks of DDR2-400. It wasn't top of the line by any means but great bang-for-buck in 2003.
agree -- that dual celeron setup (often with a peltier cooler) was suuuper common, I knew so many people who rushed out to get them and run at 500? it was my second exposure to SMP though: First was dual socket Pentium Pro 200mhz which ran nt4.0 for the longest time (which I still keep that hefty cpu around on my desk for laughs)
Are you me? I also started with a dual socket PPro that I picked up somewhere cheap (it was decent, though getting outdated).
The only reason (I feel) all this stuff worked was because of how bad the P4 stank at the time, otherwise we'd all have been climbing the clock speed.
I also had a BP6 with two Celeron 400As. Stuck on Win98 for a while until Win2k graphics drivers got good enough for gaming.
To add evidence: also had it.
I wonder if there's something like this today that you'd have in college? Some low-cost graphics card rigs? Or is it more like some cloudflare set-ups today?
I am starting to feel old.
I'm slightly confused, how would games of that era benefit from a dual CPU setup?
Old games were decidedly single-threaded and built for a single-core world. It was only in the mid-to-late 2000s that games started to be more optimized for multi-core CPUs. And multi-CPU is even more difficult because there isn't any cache sharing.
That was the fun thing, they didn't really benefit - you'd have to get them running on Windows 2000 to even have a chance at using that second CPU.
However, once you got that working, you could play your game AND listen to music at the same time! Phenomenal.
Which you definitely got from one of the many legal music streaming services at the time ;)
Well you have to have another core free to run your sick winamp skin, otherwise how will you keep up your K/D?
It was just single-CPU, but I had the ABIT BH6 with a Celeron 300A, one of the most overclockable setups ever. Mine was stable at 450mhz without any special cooling.
Cathode Ray Dude's channel has a great video about the history of this amazing motherboard: https://www.youtube.com/watch?v=UE-k4hYHIDE
Oh yes, fond memories of my BP6. You just felt like you're doing something you're not supposed to, which was fun.
Similar experience, I had a Cyrix PR200 which really underperformed the equivalent Intel CPU.
Convinced my parents to buy a new PC, they organized with a local computer store for me to go in and sit with the tech and actually build the PC. Almost identical specs in 1998: 400Mhz Pentium 2, Voodoo 2, no zip drive, but had a Soundblaster Live ($500 AUD for this at the time).
I distinctly remember the invoice being $5k AUD in 1998 dollars, which is $10k AUD in 2024 dollars. This was A LOT of money for my parents (~7% of their pretax annual income), and I'm eternally grateful.
I was in grade 8 at the time (middle school equivalent in the USA) and it was the PC I learnt to code on (QBasic -> C -> C++), spent many hours installing Linux and re-compiling kernel drivers (learning how to use the command line), used SoftICE to reverse engineer shareware and make keygens (learning x86 assembly), and created Counterstrike wall hacks by writing MiniGL proxy dlls (learning OpenGL).
So glad there wasn't infinity pools of time wasting (YouTube, TikTok, etc) back then, and I was forced to occupy myself with productive learning.
/end reminiscing
I could share almost exactly the same story. So grateful my parents could afford, and were willing to spend, the money on a nice PC that I entirely monopolised.
Oh man, the Celeron A, which was basically a Pentium II with on-die L2 cache. Intel attempted to handicap it by limiting its FSB to 66 MHz, but any half-decent motherboard would allow you to bump that up to 100 MHz so long as you had the rest of the hardware to support it (i.e., PC-100 memory). This resulted in a pretty significant bump in CPU frequency.
> but any half-decent motherboard would allow you to bump that up to 100 MHz so long as you had the rest of the hardware to support it
I think any BX chipset motherboard would do it… but you may have to resort to messing with covering/hotwiring CPU pins.
Tribes was the game I remember that showcased the card best - you had a legitimate competitive advantage if you were running this card.
(also, shazbot)
That and high speed internet. I played for a couple of years on 28.8K. The day I got a better graphics card was great. No more choppiness. The day I got cable internet was life changing in Tribes (and Tribes 2)!
I think I still have a pic somewhere of the infamous NoFix scolding "LPBs"
"Shazbot!" "The enemy is in our base!" "Woohoo!"
I remember when Cable internet started showing up... I'd cart my computer to a friend's house once a month to play LAN party for the weekend and run updates.
Back then, updates over modem took hours to run, it was kind of crazy considering how many easily exploited bugs existed back then.
> it was kind of crazy considering how many easily exploited bugs existed back then.
Anyone on IRC learned this pretty quick.
I thought my computer was up to date on everything, ran win2k, zone alarm firewall well configured, and someone on IRC said they had a script with all known exploits and I invited them to run it against me… they still managed to crash my computer.
Maybe it was an early metasploit?
I learned to never trust a computer.
Nothing like eating a disc to the face while you are lagging in mid-air on a dialup connection. <3
There was a moniker for the few people with high speed back then - LPB - low ping bastards. All those fortunate enough to live in a city with adsl or cable high speed in the early days (or gaming at work or university on the T1)
Interestingly enough, these days it's often an advantage to have high ping, because modern games make client-side hit detection authoritative. With Apex Legends, Respawn uses the argument that playing against laggers but with client-side hit detection makes the bullshit that happens "symmetrical" and they want to keep the game accessible for people with poor connections, but anyone that plays against laggers knows that is absolutely not the case.
I wish modern games would just include a Ping Lock toggle in the matchmaking. "Do not match me with anyone with poor connection quality" (>100 ping, >1% packet loss). With a big fat pop-up warning that it'll increase matchmaking times.
It was deeper than that. That was just the way we were all classified back then: hpb (high), lpb (low), slpb (super-low?). When we got a cable modem in '99, I felt like hot shit leaving the hpb shame behind.
I had something like 2-7ms ping to any server anywhere near Los Angeles for a real long time.
I was also pretty good at a lot of competitive online games, so accusations of botting or other shenanigans got old.
Recently my ping to 1.1.1.1 went down to 12ms and I got excited.
I think I played Tribes 2 for more time than the time I've spent on every other game combined. Years!
It's too bad that tribes games' learning curve is too steep for people now. Tribes Ascend was pretty well made but died quickly, and Tribes 3 seems to be dead even faster.
Very few people who didn't already play the earlier games have much stomach to figure out how to even move effectively across the map or hit anything moving at high speed, let alone do proper cap routes, chase, etc. I played Tribes Ascend for awhile and on most random servers you could play the first 2 minutes as a chaser to verify "yup there is nobody who knows how to move quickly", kill yourself over to chaser, and then end the game in like 4 more minutes when everyone else is just slowly messing around in midfield doing nothing productive. And I wasn't even any good lol, when I went into any semi organized pug I would get destroyed.
vgh! Except that texture transparency worked with glide (voodoo) cards and not with opengl or software rendering. So if you made a custom skin with transparency texture there was a brief time in the Tribes 1.1 to 1.2 era where you could be somewhat invisible to people with voodoo cards (if your skin was in a skin pack that everyone had).
> There was a brief moment where Intel Celerons didn't really prevent you from using them in a dual CPU setup, so I had two 366mhz Celerons overclocked to ~433mhz
Was that the BP6 motherboard from Abit?
I had that board, those processors and used to overclock them too.
Also ran Linux and BeOS on it (though IIRC you had to patch BeOS for SMP support).
Quake 3 ran so smooth on that machine, even without Q3s experimental SMP support enabled.
That was actually my all time favourite computer, even to this day.
I also had a TNT2 in an earlier machine, but the BP6 machine had a GeForce 3.
Dual 300As overclocked to just over 500Mhz each on a BP6 with Geforce 256 here too! Fastest, smoothest machine I ever used until the M1 MacBook. Quake 3 multiplayer demo ran so fast it gave me motion sickness ^^ Years later I "upgraded" to a 1Ghz Athlon and it felt like a downgrade a lot of the time.
> though IIRC you had to patch BeOS for SMP support
The board might have required a driver or patch, but SMP was BeOS's entire reason for being! The drawing of each window on the screen ran in a separate thread. It was their main selling point.
Reading the BeOS Bible talking about that is quite a throwback:
> As described elsewhere in this book, BeOS uses multiple processors with incredible efficiency. If you'll be running BeOS most of the time, you'll get more bang for your buck by getting two (or more) older processors than by installing one superfast CPU. Last year's 266 MHz CPUs will always be dirt cheap compared to today's 450 MHz CPU. Thus, when running BeOS, you could have 532 MHz for less than the cost of a single 450 MHz processor. The catch is that if you'll be dual-booting into operating systems that won't recognize a second CPU (such as Windows 95/98), you'll end up with half of your processor speed being wasted until you reboot into BeOS. Chances are that once you start using BeOS regularly, you won't want to use anything else, and you won't regret buying a multiprocessor machine.
https://birdhouse.org/beos/bible/bos/ch_hardware1.html
Lack of SMP was an artificial limitation for the BeOS 5 Personal Edition (I think it was called). The idea being you’d get BeOS for free but you couldn’t use it as a proper multiprocessor workstation without paying for a license.
This was also the same BeOS release that targeted Intel and ran off a virtual disk stored on a Windows FAT32 partition.
Overclocking Celerons, those were the days. Intel binning down a bunch of processors capable of reaching higher clock rates but selling them as a lower-end part was a boon for college students everywhere.
I basically did the same upgrade to the Abit BP6, dual Celerons and BeOS. That combo was probably the snappiest system I will ever use.
I had something similar!
300Mhz PII - it came in a black cartridge thing.
NVidia RIVA TNT which used the AGP bus on the Intel LX440 mobo.
A whopping 128MB of RAM and an 8GB HDD.
I recall using a program called WinSplit to split the Nvidia driver over several floppy disks on my boss's Win3.1 machine in the office. I didn't have internet at home and really wanted to play Jedi Knight and Battlezone.
I recall the legendary Celeron being the 300A. It was 300MHz, but was easily overclocked to 450MHz. There were higher clocked versions, but regardless of which CPU you got, they ultimately were only able to overclock to about the same frequencies.
Also, the Celerons of that generation did not have unlocked multipliers. The only way to overclock them was to overclock the front side bus, which also controlled the memory clock. The "standard" FSB speed was 66MHz. By overclocking a 300MHz CPU to 450MHz, you got a 100MHz memory speed. By overclocking a 366MHz CPU to ~433MHz, you "only" got a 79MHz memory speed.
My friend in college had one. Windows 98 didn't support SMP, so he had to run Windows 2000, which was based on Windows NT, and would be the basis for XP. Compatibility with games was sometimes...interesting. Windows ME came out about that time, but was absolute garbage. All of us either stuck with 98SE or experimented with 2k. None of us actually bought it of course...
Fun times.
So the story originally started with the cacheless 266 MHz Celeron. CPUs were delivered as AICs (add-in-cards) at the time, with separate cache chips, so to deliver a budget processor, they shipped the same silicon, but without the cache chips added. Removing the cache drastically tanked the performance, especially on integer workloads (typically productivity software), but didn't really affect floating point workloads. However, it had the side benefit of removing the part of the AIC that was most sensitive to over-clocking (the cache). It used a 66MHz clock with a fixed 4x multiplier, and upping the clock to 100MHz got the Celeron running at 400MHz, which had performance roughly equivalent to a 266 MHz Pentium II with cache for integer workloads, but for games, it was almost as fast as the fastest Pentium II of the time (which topped out at 450MHz).
In order to stop the overclocking, Intel decided to add some cache back to the CPU, but to save money, rather than using cache chips, they stuck a relatively tiny amount of cache directly on the CPU die, and released the now infamous Celeron 300A.
Because the cache was on-die, it could overclock just as well as the previous Celeron, but this time the 300A was faster than the equivalent Pentium II, because the on-die cache ran at twice the clock speed of the external cache.
> By overclocking a 366MHz CPU to ~433MHz, you "only" got a 79MHz memory speed.
I think the PCI bus probably also typically ran at some fraction of the front-side bus. The common FSB frequencies around those times were 66 or 100 MHz which gave a standard ~33 MHz PCI bus frequency with a multiplier of 1/2 or 1/3. FSB frequencies that weren't close to a multiple of 33 MHz might have caused trouble with some PCI cards. Might have depended on how the motherboard or chipset handled the bus frequencies, too.
Of course the PCI bus should probably always run at 33 MHz but I think I saw it being modified with the FSB speed at least on some motherboards.
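To make the multiplier arithmetic in the comments above concrete, here's a rough sketch in Python. The numbers are illustrative only; exact FSB steps and PCI dividers varied by board and chipset, and this is not tied to any particular BIOS.

    # Celeron-era overclocking math: core clock = FSB x (locked) multiplier,
    # and the PCI bus runs at FSB divided by a small integer divider.
    def clocks(fsb_mhz, multiplier, pci_divider):
        """Return (core clock in MHz, PCI clock in MHz)."""
        return fsb_mhz * multiplier, fsb_mhz / pci_divider

    # Celeron 300A (4.5x): stock 66 MHz FSB gives ~300 MHz; at 100 MHz FSB you get
    # 450 MHz, and with a 1/3 divider the PCI bus stays at a safe ~33 MHz.
    print(clocks(66.6, 4.5, 2))   # (~300 MHz core, ~33 MHz PCI)
    print(clocks(100.0, 4.5, 3))  # (450 MHz core, ~33 MHz PCI)

    # Celeron 366 (5.5x): ~433 MHz needs roughly a 79 MHz FSB, which pushes PCI
    # to ~39 MHz unless the board offers a 1/3 divider.
    print(clocks(78.7, 5.5, 2))   # (~433 MHz core, ~39 MHz PCI)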
It was crazy how fast things moved back then. A friend of mine had a 233MHz P2 with 32GB and a 2D card, and within two years it was a dinosaur, being shown up by machines like yours, 400-450MHz, 3D cards, way more memory....
I bet it was a 3.2gb hard drive. (Or you mean 32mb of ram, that sounds right too(.
32 megabytes!
Even HDDs were smaller back then.
OMG Tribes, totally forgot that existed.
Ah, a fellow Tribesplayer. Just so you know, we still play tribes. Join us! http://playt1.com/ - the community mantains the master server and clients these days. There are good pick-up games on fri+weekends.
I love this game, it's also amazing to me how the concept of "skiing" was foreign to me when I first played T1 and T2, and now its a core game mechanic.
Man I have to pick this up again. I was a top CTF, Arena, and duel player. I played through college competitively (2008)
Now that takes me back
[dead]
CD+R or CD-R
Big difference.
I think the other commenter is right...you're thinking of DVD-R vs DVD+R, possibly even DVD-RW and DVD+RW.
Based on the specs listed, OP was in college just before me or may have overlapped. The big gold CD-R stacks (you could buy them in jewel cases, on spindles, or just gross stacks which were nice and cheap) were a huge thing with my group (who encoded to FLAC & MP3 -V0 and burned audio CDs relentlessly). We felt we were archiving our liberal arts college music library and radio library for the future! Who knows. Some of that "future" is still backed up and on hard disks, and I should migrate them to SSD or tape just on principle.
At that point CD-Rs were cheaper than CD-RWs, and because most archiving/distributing didn't require rewriting (not return-on-investment wise anyway), we just shared programs on CD-R as well. In some ways it was a beautiful technology! Particularly fidelity to a spec everyone tried to bend and break for a profit angle, when honestly, for many of us there was no point in using anything but CD-R.
Aren’t you confusing that with DVD?
Yes. Wow.
It was truly jaw dropping firing up quake 1 for the first time on 3dfx voodoo1. Double the resolution of software and super smooth framerate, and much better quality texture mapping too. I recall tweaking some setting (gl_flashblend?) so that I could see pretty glowing orbs around my rockets (and strategically, also everybody else's rockets).
It's hard to convey just how revolutionary the original voodoo cards were. There aren't many times in my life where there was a clear line of before and after, but this was one of those times.
Still blows my mind that it was just a flash in the pan. At the time it felt that 3dfx was certainly going to be a dominant force in computing for years. And although they lingered a bit, the whole event was barely over 2 years.
I think everyone understood that having it be 3D-only (and requiring a separate graphics card to do normal 2D desktop graphics) was a half-solution, and 3DFX's long term success would depend on their ability to provide a full 2D/3D solution before existing competitors like NVIDIA, ATI, and Matrox could catch up with their own 3D accelerators.
At various points and machines I had a Voodoo 2 (with the VGA pass-through from a 2D card), a Voodoo Banshee, and a Voodoo 3.
The latter two were 2d+3d in one and well before any real competition.
Their 2D engine was incredible for the time as well, implementing 100% of the Windows GDI functions in hardware.
They really needed to buy Matrox and make multi-monitor 3D a thing.
I remember reading a historical piece arguing the Voodoo's success was partially luck. At the time the first generation cards were being developed, EDO RAM was super expensive, so most competing designs were hamstrung trying to do things with very little RAM. By luck, EDO RAM prices crashed right as they released, making them far more affordable to manufacture than 3dfx could have reasonably expected. That gave them an early and massive lead with their initial design.
I would compare it with the move from HDDs to SSDs — a night and day difference.
That took a lot longer really as well... I remember seeing SATA SSDs around 2009, paying a massive amount for my 64gb Intel drive (that ate itself just over a year later)... I hated moving/symlinking so much... but, fortunately by the time it died, I could go to 256gb or 512gb (don't quite remember which) for not too much more.
Even then, I was still seeing most Desktops sold with spinning rust for several years later.
They also had the most recognizable unified box art style of all HW makers[1]. When you saw those eyes staring into your soul off the shelves, you knew it was a 3dfx GPU. They also had the best ads. [2] HW vendors today don't have the balls anymore to use designs like that, it's all flat sterile corporate nonsense.
[1] https://www.ixbt.com/img/r30/00/02/08/90/boxes.jpg
[2] https://www.reddit.com/r/pcmasterrace/comments/41r1wj/3dfx_w...
Unless I'm mistaken, those cards were all produced by 3dfx after their acquisition of STB. Regardless, that box art blew my 14 year old mind back in the day.
> HW vendors today don't have the balls anymore to use designs like that, it's all flat sterile corporate nonsense.
Not all of them...
https://www.techpowerup.com/333599/yeston-launches-radeon-rx...
https://minixpc.com/products/maxsun-graphics-cards-geforce-r...
[flagged]
That's something I haven't seen in awhile! I remember as a kid staring at those in the store, not being able to afford them.
I sold my used 3DFX Voodoo 5500 with original box in 2015 for about 290EUR. It is probably in a collector's shelf now.
I think mine went into a computer that we donated to a school, or something. Around 2002 or 2003, my dad and I put together a bunch of working systems out of spare parts and donated them.
Mine was the PCI version of the card. Crazy looking on Ebay how much even the bare card goes for now, let alone when someone has the full boxed set.
>I sold my used 3DFX Voodoo 5500 with original box in 2015 for about 290EUR.
Bruh. That's like selling your bitcoins in 2009 for two pizzas.
I wiped my drive a few times before realizing dropbox didn't back my wallet up. I shrugged it off losing 30 bitcoins worth maybe at best 3 cents each at the time. Hindsight is 20/20 I suppose.
I was the unfortunate owner of an S3 ViRGE card at the time - the (in)famous "3D decelerator". I managed to get Quake running on it, and it looked nice, but was slower than with software rendering...
I had an S3 ViRGE too. It really was a decelerator, and the number of games that actually supported it was minuscule. I managed to run GLQuake, but without any textures - just shades of gray - and even that ran at most a couple of frames per second.
But there was another game - Terminal Velocity - that actually looked a lot better with hardware rendering, although it was still slower than software rendering. So, I would run it with hardware rendering to enjoy flying and then restart the game with software rendering to actually fight the enemies. :)
Ahhh, what a piece of shit that was! I spent summer lawnmowing money on that thing.
When I got a real job next summer, bought an AGP Matrox Millenium G200 and after that the NVIDIA GeForce2, and never strayed from NVDA since!
Same here. I can still vividly remember the experience of loading in with a voodoo2 for the first time. It was night and day -- mind completely blown. The late `90s really established a new version of the gamer; consoles were fun, but computer gaming was a different world. It made me a junky for reading about hardware, overclocking and gaming.
Replaying Heretic 2 back in 1998 with my first Voodoo (a Banshee) was a borderline otherworldly experience, compared to my first playthrough of the game using software rendering. Nothing has blown my mind the same way since.
I have had three experiences like this in my life:
The closest thing to this, in modern gaming, has been the boss fights in Elden Ring: https://i.imgur.com/gzLvsLw.mp4 -- visually, they are quite epic.
> much better quality texture mapping too
Debatable. I always preferred the crisp look of the software renderer to the washed-out GLQuake. Same with Quake 2. I think it's because textures back then were too low resolution, so filtering just makes them look muddy.
It’s also because the VGA signal quality from the 3dfx Voodoo wasn’t very good.
It didn’t have a traditional 2D graphics core at all, so it needed another graphics card for rendering the desktop (any non-accelerated apps really), and this was connected to the Voodoo using VGA passthrough. There was a noticeable image quality loss from this arrangement.
A Matrox card would give you crisp VGA with nice saturation, but the 3D acceleration was nearly worthless. Choices…
I really disagree. There were some nice Matrox cards. They weren't as good at 3d as 3DFX but for the time they really improved gaming. I developed Battlezone on G200. In those days we tried to have everyone have a different graphics card because the companies would just give them to us and we wanted to work with every card.
Matrox had great hardware, but the software drivers took too long to catch up. I was on the OpenGL team and my life's mission was to get Quake running as fast as the G200 and G400 was capable of. We finally caught up and got parity with Nvidia's TNT2, and then bam, they released the GeForce 256 series, and it was curtains for Matrox because their next gen hardware wasn't ready yet.
I agree that the washed-out textures haven’t aged well.
But at the time, not having pixelated textures was the first thing people looked at when judging graphics quality. I remember that being a HUGE selling point of the N64 and something that made the console look almost a generation ahead of the PlayStation and Sega Saturn to kids back then.
Today, I think I prefer the PSX look, though. Maybe with perspective-correct texture mapping to avoid the warped textures of the PlayStation.
Might have also been one of those things that looked better on the 14-15" CRTs of the time vs crisp high-res flat panels of today. They were blurry enough that 640x480 was "high resolution" (I remember not being able to easily see the pixels at 800x600 on a 14" CRT unless I came super close to the monitor).
Plus Super Mario 64 was so colorfully saturated!
Even today I think a lot of Doom clones look better (or more nostalgic) with software rendering and texture mapping rather than OpenGL. There's an intensity of saturation to the colors that's lost. Fireblu is never quite so eye burning as when it's in software.
It's low resolution plus the cards only supported bilinear filtering, which turns things into a blurry mess.
Overall it looked better, but a lot of Quake 2 players weren't aware of a lot of the small details that were put into the textures.
I came here to comment similarly: the lower-resolution, pixelated, software-rendered Quake seems to work well with the textures. They have a bumpmappy, fuzzy feel that gets lost in the sharp-cornered, super-flat, texture-mapped-and-filtered version that one got from the 3D accelerators of the time. I guess my brain just adds context to the low-res images.
Before Unreal, I had an S3 ViRGE for 2D and a PowerVR 3D accelerator pair, and I was always flipping between software, ViRGE and PowerVR depending on the game. Which at the time were largely Hexen/Heretic. The PowerVR was higher resolution and clean/sterile, but never seemed like a much better experience.
But then there was Unreal, the first game I think was absolutely better on an accelerator (Voodoo 2 in my case). It's also pretty much the last of the serious software renderers, and outside of the Voodoos it definitely did a better job with software lighting/texture mapping/etc. than any of the previous (affordable) accelerators. Which is why I ended up finally replacing the PowerVR with the Voodoo 2. The results were 'unreal'. Some of that might just be bias; I played insane amounts of Doom/etc. but never really got into Quake. Quake just seemed like Doom rehashed to me, so I was off playing Warcraft/Diablo/Hexen/etc.
And frankly, outside of FEAR, I stopped playing first-person shooter games for 20 years; the graphics improvements were so incremental, I just kept seeing flat low-polygon models everywhere. And I don't think that looks good. Even after all the tessellation/bump mapping/endless tricks, I kept seeing frames where I could play "count how many polygons are onscreen right now" games. It's gotten better the past few years, particularly some of the lighting; at least the screenshots/cut scenes are no longer obviously not in-game rendering. The UE5 demo is slowly becoming reality in actual games, so maybe it's time to revisit a few of them.
Personally I think even Half Life looks better with the software renderer today. Maybe that's just because it's the way I first played it.
You can 'fix' the texture filtering to nearest neighbour in hl by adding the following to userconfig.cfg (should be in a directory called 'valve' in the game's root directory):
gl_texturemode GL_NEAREST_MIPMAP_LINEAR
gl_ansio "0"
gl_round_down "0"
Or just entering those lines in the console, preceded by 'set'.
Definitely one of my great early memories in computing.
I had a similar experience seeing Quake 2 running with the Glide renderer (on a Voodoo 2) for the first time. It was amazing.
In terms of pixels, it was 4x the resolution. And for fun, one of the window textures (visible in the difficulty choice lobby IIRC) was implemented as a mirror in glquake - IIRC John Carmack said it was so easy to do in OpenGL he did it as a kind of test.
Always thought the original software renderer looked much better. It didn’t have the bilinear filtering, so the textures didn’t look all smooth and ‘washed out’, which suited the environment more imho
I can't speak for the original GLQuake on 3dfx hardware, but on OpenGL-compatible Quake engines (which include modern Quake source ports such as Quakespasm, Ironwail, and vkQuake), bilinear texture filtering is an option that can be turned off.
I play on vkQuake with nearest-neighbor texture filtering, square particles, and the "classic" water warping effect and lighting effects, alongside 8x MSAA, 16x anisotropic filtering, high-resolution widescreen, etc. This keeps the art style consistent with the look of the original Quake, while still allowing for the benefits of hardware 3D acceleration.
For what it's worth, the modern source port for Descent (DXX-Rebirth) makes bilinear filtering optional, too, while using OpenGL and allowing MSAA and stuff. I played through the first two games some time ago and I also found the bilinear filtered textures worse-looking than the blocky ones.
Yeah same here. Always preferred the software renderer for Quake and Quake 2. Even over the modern stuff.
The software renderer has this gritty feel that is integral to the art I feel.
That said, the 3dfx was impressive at the time, and I was very jealous of my buddy who got one.
You should check out Devil Daggers on Steam.
Mostly agree, but the RTX version of Quake 2 is very impressive to play.
I'd be interested in a RTX-enhanced software renderer. Ie replace the baked lighting with the GI raytracing, but otherwise keep the rest of the software renderer. Have a feeling that could be an awesome blend.
Would be a bit challenging with the palette but should be doable.
Doable, but not at a playable framerate without hardware.
Yeah that's what I was thinking. Like do a RT-only pass doing lighting (no textures), then do the software pass using the RT-lighting rather than baked lightmaps.
Latency would be slightly higher but I guess one could implement the important parts[1] of the software renderer on the GPU.
[1]: https://fabiensanglard.net/quake2/quake2_software_renderer.p...
I've experienced only three improvements in video games that felt ground-breaking & jaw-dropping:
1. Sprite-based -> 3D sandbox world (in my case: Stunts, F29 Retaliator, Gunship 2000, Wolfenstein 3D)
2. Hardware 3D rendering (I had the NVidia RIVA 128ZX)
3. Fast-paced real-time multiplayer (Delta Force: Black Hawk Down)
The 4th might be the usage of LLMs or similar technology for (mostly-)unattended content generation: NPC dialogue etc.
For me it was Carmageddon. I bought it later on an ipad and it may have just been rose tinted glasses of being completely blown away back in the day but the ipad version never seems quite as crisp...
Carmageddon was a software renderer. Absolutely glorious game though!
Core memory unlocked! Good times
I dreamt about having the Voodoo but I could not afford it. Went with a Rendition Verite based one. It was underpowered compared to the Voodoo, but I really consider it the first real GPU, as it was a RISC processor.
There was a brief period of time where having one of these 3D cards in quake led to a pretty heavy advantage for gamers.
I can very clearly remember installing the card and then loading up 2Fort4 in the Team Fortress mod and suddenly being able to see THROUGH THE WATER.
Sniper's paradise!
If I remember correctly, to get transparent water the level also had to be reprocessed through the "vis" program with a transparent-water flag set.
vis did a precalculation of which level segments (the partitions in the binary space partition) could be seen from any other level segment. The end effect was that while GLQuake did have an option for transparent water, the geometry would not draw on the far side, making the effect useless without a bit of extra work. But I have to admit I have no idea if entities (other players) would draw or not.
update: found this https://quakeone.com/forum/quake-help/general-help/4754-visp...
Apparently there is a no_vis option to run without the visible set optimizations.
Adding: the server and client had to both be running vis patched maps to be able to see other players in the water due to the way entity visibility was calculated server-side.
The downside to running vis patched maps on a server is it used slightly more CPU than unpatched maps IIRC. Perhaps someone that ran more servers than I did (I ran two nodes on an Intergraph InterServe with dual P6-200s) could weigh in on what the impact was at scale.
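For anyone curious what that "vis" precalculation actually stores, here's a small hedged sketch of the idea. The data layout is purely illustrative (not Quake's actual on-disk format, which run-length encodes the rows): the compiler records, for every BSP leaf, a bitset of the other leaves potentially visible from it, and both the renderer and the server just test bits at runtime.

    # Toy PVS (potentially visible set) lookup.
    def leaf_can_see(pvs, from_leaf, to_leaf):
        """True if to_leaf is in the potentially visible set of from_leaf."""
        row = pvs[from_leaf]
        return bool(row[to_leaf // 8] & (1 << (to_leaf % 8)))

    # A "transparent water" recompile sets the bits between above-water and
    # underwater leaves, so the far-side geometry gets drawn and the server
    # sends you the players swimming in it; on stock maps those bits are 0.
    pvs_demo = [bytes([0b00000110]), bytes([0b00000001]), bytes([0b00000000])]
    print(leaf_can_see(pvs_demo, 0, 1))  # True
    print(leaf_can_see(pvs_demo, 0, 2))  # True
    print(leaf_can_see(pvs_demo, 2, 0))  # False

That matches what the thread describes: GLQuake's transparent-water option only helps if the map's visibility data already says those leaves can see each other.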
There was also a perverse effect on some games. With a graphics card, your gameplay could be altered and you had to unlearn all the reflexes you built on CPU rendering alone. Moto Racer (1997) was like that. The gameplay with a graphics card was completely different, even the trajectories (I assume lag made the CPU renderer accept a little bit more rounding error).
Moto Racer was one of the few games that supported the PowerVR card. I had one and it looked so good. Such a fun game.
> I can very clearly remember installing the card and then loading up 2Fort4 in the Team Fortress mod and suddenly being able to see THROUGH THE WATER.
Searching for "2Fort4" in YouTube yielded some interesting videos for people curious what the original Quake Mod version of the map looked like:
https://www.youtube.com/watch?v=bJh36LuKwVQ&pp=ygUGMkZvcnQ0
As someone who still spends at least 3 hours a week playing 2Fort on the current Team Fortress 2, it's fascinating to see how the core of the map is still basically the same after 20 years.
EDIT: Another video which goes into more detail about the history of each 2fort version, going back to its original inspiration from a Doom 2 level:
https://www.youtube.com/watch?v=Tid9QwAOlng&t=375s
Interesting, didn't realize this design was that old. Feels a little like teapot :).
The video also misses that there was a pretty popular 2fort for half life 1.
I spent so much of my early 20's in that map/mod on Q1. I don't think I've had that level of just fun in any game/map/mod since then.
This and the overwhelming advantage conveyed from fast internet connections was hard to really appreciate when you were there.
Even having a solid dial-up connection with a ~180-185ms ping was a ridiculous advantage when most HPBs were ~350ms, particularly in clan invitationals for Q1CTF. We were playing as LPBs in the dorm at ~45-60ms and 180ms wasn't that much of a concern, aside from sliding around corners more than expected, but at 350ms you were basically shooting predictively at where you assumed they'd be next, not where they 'were'.
Subspace/Continuum also used lag in its gameplay, with players warping to recently exploded spaceships so they could continue to invade. It was an established technique and had to be defended against.
Edit: typo
Shout out to any EG players!
On a very different scale, but I recall playing bzflag decades ago and discovering that I simply could not jump my tank to the second level. My graphics card was so slow that something wasn't working correctly, and no matter how many times I tried from different distances I would almost make it, but not quite.
More recent example: In GTA SA in a mission I wasn't able to reach an airplane before it took off unless I lowered the resolution of the game.
You also needed a fast connection to minimize latency. 400 ms on dial-up was common.
I used to play games like Starsiege (the mech game) on dialup. With our 250ms pings, your brain just learned to compensate for the delay and click that much earlier to shoot.
But yeah, those lucky people with their DSL modems and 60ms pings would wipe the floor with us.
Nowadays, everyone has a < 10ms ping.
I loved playing starsiege back in 2000. I had a wired college campus connection, but shared with so many students, my pings would go anywhere from 50 to 500 depending on time of day. Near-timeouts showed the client side prediction code in action, with mechs sliding around and then freezing in place.
Yeah funny to watch the mechs slide, all the kinds of things the young whippersnappers nowadays will never witness.
Plus all the weird behavior with packet loss. You could see each player's ping and PL numbers in the server player list. Those were the days.
The Torque engine (at least in Tribes 2) was wild in terms of what it was capable of on dial-up. 32 or 64 players was okay, compared to 8 on Quake 2, depending on your latency and connection speed.
I played in Tribes 1 tournaments ... I never had a computer powerful enough for Tribes 2 – it had a very bumpy launch, remember? A lot of C++ exceptions, etc.
But Tribes 1 was incredible in how smoothly it handled "larger" servers. It would be so chaotic.
When qtest was released, I was there in the IRC channel and one of the first to play.
I remember connecting to someone's machine and just destroying everyone. Afterward I got a message from someone congratulating me, but being incredulous about my ping time being 26ms.
I happened to be on the first floor in the first dorm on campus to get wired internet access, and they had an OC3 dedicated to it. Two months earlier there were 16 of us splitting that line, with public IP addresses. (Going back to dial-up after leaving the dorm was.. difficult).
So I told him, yeah I kinda have a ridiculous internet connection here at school. He was like, "no, you don't understand - it is my machine. I am playing on the server and your ping time is lower than mine!"
Crazy illustration of "nothing happens anymore." 3dfx seemed just as dominant in the 1990s as NVIDIA does today. But from founding to selling to asset sell-off, the company lasted just six years. Meanwhile NVIDIA has been king of the hill since the GeForce was released in 1999, which was 25 years ago.
AMD overtook Nvidia at times in the gaming space. I'd say that Nvidia has been king of the hill since the introduction of CUDA, since that's what really cemented their position in the tech sector.
Pre-AMD acquisition ATI also often had better hardware specs than NVIDIA, but their drivers were so often buggy and terrible. By the time they'd been fixed the reviews were long since done on the initial release versions.
AMD seems to run a better software shop, at least.
The 90s was an absolutely crazy period for PC hardware. So many flash in the pan companies making a novel devices and then dying entirely as their niche became obsolete. There used to be tons of display board manufacturers and very few of them survived the 3D acceleration introduction.
You'd buy computer magazines and it almost felt like the performance of CPUs and GPUs went up month by month.
Sometime in the late 00s, I put my Voodoo card on Craigslist. I got pinged immediately; the buyer told me he'd pay double if I reserved it for him. The cards were precious for keeping some arcade game cabinets running, and with the company no more, my used card was a lifeline. I wanna say it was a golf game like Golden Tee? I was delighted to make the sale and happy to charge him the original (not double) price.
Yep, my Golden Tee 2005 cabinet uses a Voodoo3 2000 or 3000 (can't remember which off the top of my head).
I'm honestly impressed people have golf game cabinets at home. never saw anyone playing those on the arcades.
I recall Unreal had an option to switch to the 3dfx card, and IIRC it had some additional features, like more colourful lights and such.
Unreal was such a beast back in the day that it completely beats Quake 2 and other contemporary FPSes, even on software rendering. TBH it still looks beautiful even by today's standards, if you ignore the low polygon counts.
I'm not a person who cares too much about graphics, even for FPSes (I don't really enjoy most modern FPSes except Wolfenstein, which has interesting gameplay), and I'd argue that too much graphical eye candy simply decreases the overall quality of a game, but 3dfx definitely was a huge bang back in the day.
The performance boost also made a significant difference in how well the game played: I remember when the Voodoo 1 came out I had a 100MHz Pentium, and running Quake (in low resolution) was "fine" but ran at like 20-25fps. With a Voodoo that game ran at a smooth 60fps, which made it so much more fun for such a fast-paced game (while also running at a higher resolution with smooth textures and improved lighting effects). It made a huge difference on multiple axes.
The percentage change in resolution you ran the games at was also absolutely mind blowing too.
For the most part we went from running games at 320x200 or 320x240 to 640x480 in that first round of acceleration - from roughly 64,000-77,000 pixels per frame to 307,200. I think in terms of % change it is a bigger leap than anything we've really had since, certainly anything after 1920x1080.
So you suddenly had super smooth graphics, much better looking translucency and effects, and the number of pixels quadrupled or more, and you could just see everything so much more clearly.
Yeah that's true. Software rendering at low resolution is not a good sight to look at.
I remember back in 1997, when Quake 2 had just come out, I sat in a net bar (where you pay to use a computer) and played an hour of Quake 2 in software rendering. The game was interesting, but I felt a bit sick, half due to the resolution, half due to the almost infinite brownish colour. A girl behind me murmured, "This is not half as fun as Duke Nukem", and yeah, I completely agreed with her.
I think I still agree with her somewhat. Quake 2 is a great game, but Duke3d is a fun one.
Where Quake2 really shined was in multiplayer, especially mods like q2ctf.
Quake2 was released at just the right moment to take advantage of both 3D acceleration and broadband Internet access. Playing a game of q2ctf, in 3D-accelerated 800x600 mode, with 60 ms ping was just fucking amazing.
Yup it was really a huge difference. FPS really needs 60 FPS and up. Anything less feels clunky.
Playing Diablo II (Resurrected) now on a 2K 165Hz display with a card that can run it that fast makes it real pretty. A lot of love went into that game.
Unreal had a small menu where you could switch between different renderer backends precisely because of things like different cards having different... Driver quality let's say.
Yeah exactly! OldUnreal guys did a lot of work for this classic game. I believe they even have access to the source code.
> I don't really enjoy most of the modern FPS except Wolfenstein, which has interesting gameplay
Which Wolfenstein?
Both the 2009 and the more recent reboot trilogy (old blood, the new order and II).
I remember how amazed I was when I got my first 3D card, a Voodoo 2. It was like having an arcade at home.
The 3dfx logo spinning up when you launched a Glide game was something.
Unreal in particular was amazing; I remember as a kid just watching the lighting and the water.
At that time every light in a game had to be colored, just because it could be done. Small rooms with green, red and blue lights moving all over the place, so 90s.
I never had that "Wow" factor again; from there everything felt incremental instead of revolutionary. How an absolute market leader disappeared in 2 years is incredible.
I think I only got the same wow factor the first time I tested a VR headset.
I remember my first like 5 paychecks when I was a teenager scooping ice cream went to a Voodoo 3 from CompUSA. I don't even think it had a fan, and I remember being shocked at how small the PCI card was, as I'd been accustomed to mostly ISA "daughter boards".
Wow what a trip down memory lane
I'm consistently amazed at how massive video cards are today... it really feels like it's often excessive for the sake of being excessive over any real need. I was actually happy to see the depth shrink of the Meshify C case, now I'm disappointed I'm probably going to have to swap the case for a new GPU... it's too hard to find options that fit beyond mid-range, and even then.
Those old cards were way under 50 Watts. Even a "low-end" card now like an Intel B580 (list price $250, inflation-adjusted equivalent to about $125 in the late 90s) is 225 W. Cooling and power circuitry are much more critical now.
In early 2000, I cobbled together a gaming PC from used parts that I bought or traded for. It had a K6-2, a Voodoo 2, and 192 MB of RAM. It was amazing and such an upgrade over my family’s Celeron. The big games were TFC, Counter-Strike, Unreal Tournament, and StarCraft. We LAN’d every weekend. It was heaven.
It kind of boggles my mind how short the lifespan of these companies were during periods of great technological advancement.
3dfx is founded in August 1994. The Sony Playstation came out in Japan in December of 1994.
Its heyday is roughly 1996-2000, less than 4 years. Then it goes bankrupt in 2002, just 8 years after its founding.
Within that time period we go from games like Sonic the Hedgehog 3 (released 1994), to Grand Theft Auto III (released October 2001). Just a massive amount of technological shift in a span of 6-7 years.
Feels like something is similar going on with AI right now. Compare the image generating DALL-E1 released in January 2021 to Google's Veo 2 released in January 2025. Just insane progress in 4 years.
Normally companies that exploded in value at IPO brought lots of competition short term.
I guess that's a thing of the past after quantitative easing scams and the latter capital shift. Why fund a competitor if your capital is already riding there? Not many people at the roulette anymore.
Apparently 3dfx had a contract with Sega for making the Dreamcast GPU, but IPO'd in 1997. As part of the IPO they disclosed the terms of the contract, and the next gen console at Sega was a closely guarded secret at the time to avoid cannibalizing Sega Saturn sales (which were abysmal).
The contract with 3dfx was canceled, leading to a lawsuit. Then the Sega of America CEO disclosed they were abandoning the Saturn for the next-gen console. The console did not release in Japan until November 1998 and in America until September 1999. The CEO of Sega of America, Bernie Stolar, was fired just days before the US release, partially due to these shenanigans, and the Dreamcast as a whole was such a failure it nearly killed the company.
My first 3D card was an Orchid Righteous 3D. It had a mechanical relay in it to switch between 2D and 3D modes, so it made a distinctive click[0] when you fired up Quake with 3D acceleration.
Or, too many times, it didn't, and I had to try again.
[0] https://riksrandomretro.com/2021/06/07/the-righteous-click
Relevant: This is my lengthy presentation on porting Rogue Squadron 3d from 3dfx glide over to Vulkan.
https://www.youtube.com/watch?v=wcmKy-72_2U
Interesting, thanks for this.
My first Linux experience was trying to get a 3dfx to work.
Also no Internet because I hadn't gotten that far.
I ran back and forth to Steven's house and searched Altavista for the answers.
Good times.
I was an infoseek.com man myself, but we probably still could have been friends.
gkrellm, konqueror, irssi, xmms
of course i had to dual-boot windows so i could still program in vb6 and be a general shit on AOL
That may have even been in the x11amp days.
I loved playing No One Lives Forever 1&2 on my Voodoo 5 5500. That was the height of my PC building days. Now as a wizened old man, I'm stuck with these Apple Macbook Pros/Airs and they do well enough. But I do miss building my own machines...
This was posted on HN the other day. Enjoy!
http://nolfrevival.tk/
FWIW, you can build a fully functional desktop for ~$400 with integrated graphics (that can play most modern games on lower settings), or maybe $600 with a discrete GPU. Less if you go with used parts.
How wizened? If you are close to retiring, maybe you can build a pc and play some games. Keep the brain running, and stay in touch with friends (if they’ll do multiplayer).
My first gpu ever was a voodoo 2 8mb. I remember starting up the original unreal and getting it working. Shortly after we got a cable modem. 12 year old me was having a total blast. ;)
I remember so desperately wanting a 3DFX card and not being able to afford one (I was 13).
So the next best thing I could do was create a fake boot up image that just flashed up showing that the PC had a 3DFX card and their logo.
It made me happy
I saved like 14 months to buy a cd-rw drive. It was a nice Sony, I think 12x/4x. I paid like $400-some dollars for it around 1998 or 1999.
My computer couldn't keep the buffer filled unless I ceased all other activity, killed backgrounded programs, etc. I made a lot of coasters.
Haha I remember that pain now too. A long forgotten memory. Thank you!
Oh man 3dfx and Matrox, so much nostalgia. I didn't have the money for any of those so they will always stay a legend for me. I think my dad's 586 had an S3 on it. In those times, the only way to find games walkthroughs and cheat codes was through physical magazines and word of mouth. Internet was $$$.
I still remember these massive performance jumps you could get around the turn of the millennium. First it was the Pentium 166 MMX (SIMD integer math), then it was the 3dfx Voodoo, then it was the GeForce 256 (hardware T&L) and the AMD Athlon Thunderbird (just blasting past anything Intel could offer).
Quake2 with a K6-2 and Voodoo2 was a kickass combo. At some point id Software released the 3DNow! patch for Quake2 which yielded yet another 10 FPS... good times.
MMX wasn’t actually that useful. The vectors were only 64 bits wide, you had no float support and the supported operations were kind of uneven... SSE and especially SSE2 were a much bigger leap.
I remember upgrading from Cyrix 5x86 to the aforementioned Pentium and suddenly being able to play Carmageddon.
I've been thinking it was due to MMX for almost 30 years!
what gave the pentium mmx the big speed boost (I also remember it being quite significant) was probably the bigger 16kb cache (pentium classic had only 8kb) rather than mmx itself.
Yup. More cache, higher clock speeds, slightly more flexible instruction pairing. Few applications used MMX anyway, I think.
The ad campaign from which the quote "So powerful, it's kind of ridiculous" came:
https://www.youtube.com/watch?v=1NWUqIhB04I
I have more memories of my 3dfx Voodoo cards than of any other old hardware. The OpenGL implementation was so buttery smooth that there is simply nothing to compare it to. Quake 2 at 120fps on a 90Hz CRT was just something else entirely. It felt like there was no input latency at all and even with a higher ping of 80-100 in RocketArena it felt smoother than modern shooters on a 144hz panel.
RocketArena players unite! God I spent an ungodly amount of time playing RA3.
Funny thing is it was probably 40fps but you remember it feeling like 120fps.
The Voodoo 4 4500 could push up to 140fps at 800x600.
https://www.philscomputerlab.com/3dfx-voodoo-shootout-projec...
The IPO S-1 from 1997. Very promising back then. :p https://secfilings.nasdaq.com/filingFrameset.asp?FilingID=46...
> Voodoo Graphics technology is also the graphics architecture for the 3D media processor chipset that the Company is developing for license to Sega Enterprises, Ltd. ("Sega") for use in Sega's next generation consumer home game console.
Love how there are 605 instances of the word “Sega” in this. Related:
https://segaretro.org/Press_release:_1997-07-22:_3Dfx_Intera...
https://segaretro.org/History_of_the_Sega_Dreamcast/Developm...
Have we crossed the threshold where more "Graphics Processing Units" are sold for ML than for graphics processing?
I remember thinking it was funny that gaming ultimately subsidized a lot of the major advances in ML for the last decade. We might be flipping to a point where ML subsidizes gaming.
The 'death' of PC computing has been rather exaggerated. Each year hundreds of millions of PCs are still sold, and that's exclusively referring to prepackaged stuff. There's then the increasingly large chunk of people that simply upgrade a frankenputer as necessary. As for gaming Steam has users in the hundreds of millions and continues to regularly grow. And while that is certainly going to encompass most people, I'm sure there are some oddballs out there that game but don't use Steam.
So GPUs sold primarily for ML probably still make up a tiny share of the overall market, but I expect they make up a huge chunk of the market for cards like A100. Gaming hasn't been bleeding edge (in terms of requirements) for a very long time and card prices drop quickly, so there's just no point in spending that sort of money on a card for gaming.
Especially funny to me is how, on console-oriented channels, everyone is talking about the rise of PC gaming; it never went anywhere.
Using computers, not game consoles, for gaming was everywhere during the 8- and 16-bit home computing days, and the large majority eventually moved to Windows PCs as the other platforms died. That is why Valve has to translate Windows/DirectX if they want to have any games at all on their Steam Deck.
Consoles have been a niche market, at least here in Europe, mainly used by kids until they are old enough not to destroy the family's computer while playing games, given that many families only have one per household. Nowadays that role has probably been taken over by tablets.
To the point that PlayStation/Xbox are now becoming publishing brands, as the exponential growth in console sales has plateaued.
PC gaming isn't going anywhere.
GPUs for data centers make up a vastly larger portion of NVIDIA’s sales.
These are very different stats. He was referring to unit sales of GPUs, not $ sales. The A100 is a $8000+ video card and so cards like it are going to dominate in revenue, even if their sales numbers are relatively low. For contrast the most popular card, per the Steam hardware survey, is (inexplicably - probably because of prepackaged kits) the RTX 4060, which is a $300 card.
It is the stat that matters in business. NVIDIA is now an AI company with a small graphics card side hustle.
4060 was probably the only 4000 series GPU available for a while, too.
I have a 3090 for AI and gaming and I haven't seen a reason to "upgrade" yet. In fact, I might try and get a 3090ti instead.
In 2024, 256 million PCs were sold, but only 40 million of those were desktops. And that's before accounting for the fact that many of those (hard to put a number on it, but I'd be surprised if it weren't over 40%) are office PCs with crappy GPUs; most laptops also have a weak, integrated GPU.
There's a chance that this year or the next one more GPUs will be sold for AI than for graphics.
Laptops are also desktops, for all practical purposes other than being able to swap components.
There are plenty of games and graphics to play, going all the way back to the Atari 2600; not everyone is playing the latest Call of Duty, Cyberpunk, or whatever tops the AAA charts.
In fact, those integrated GPUs are good enough for WebGL 2.0, and I still haven't seen much in the last 10 years that tops mobile game graphics (done with OpenGL ES), other than demoscene reels from shader competitions.
I'm fairly sure OP was more concerned about modern GPUs being used as TPUs or whatever they're called, than about what graphics circuits the Atari 2600 was using.
There is a wide range between the Atari 2600 and what is on Steam that doesn't require an RTX 5090.
The entire thread was about GPU <<sales>>.
Exactly the point: you don't need to buy an RTX 5090 to play games; there are plenty of games around to choose from that don't require an RTX 5090.
Not everyone is hunting GPUs for AI, other than hyperscalers.
Additionally, with the possible exception of Linux users, everyone else is already migrating to NPUs; buying GPUs for AI is fighting the last war.
Even mid range GPUs are proportionally much more expensive. I built a decent gaming PC with a GTX 760 10 years ago for about $900. These days you'd have to pay double for the same relative performance.
Again, there are plenty of games to choose from besides last-generation AAA titles.
I guess some folks might suffer from FOMO, but that doesn't change the fact that there are more non-AAA games to play than most folks could manage to finish in a lifetime.
We hit that a few years ago. That’s when NVIDIA’s stock skyrocketed.
The first time I lost all my money in the stock market was when I went all in on 3dfx stock against nvidia. Ah, the good ol’days!
ouch, I am certain some people did the same for nokia against android/apple :P
anti-matter-millionaires
>> This tale brings up many “what ifs.” What if 3dfx had gotten the SEGA Dreamcast contract? Would both companies have been better off? What if 3dfx had continued simply doing chips and not gotten into the board business? Would 3dfx then have brought its products to market more quickly?
What if Atari had continued with its rasterizer chip from 1983's I, Robot? They also had their "mathbox" for geometry transformations since the late 1970s. They were well equipped technically to enter the broader 3D graphics market in some way, but that never happened.
From what I understand, the chips they were offering SEGA were technically inferior to the PowerVR ones.
I remember upgrading to a creative labs 3dfx voodoo banshee and it was actually stunning - I don't think there has been a generational leap quite as apparent as seeing everything before and after that upgrade. I think I had a matrox card before that and it wasn't even that old. This was on a Celeron 400...
I don't think we'll see a generational leap like that in the future.
Glide was nuts on games that supported it - it was night and day. It took several generations of hardware for DirectX to surpass OpenGL.
Previously discussed: https://news.ycombinator.com/item?id=35026862 (2 years ago, 376 comments, 671 points)
Ah, back in the day I really wanted a Voodoo and the chance to play with Glide; instead I was forced to return it to the shop and get it replaced with a Riva TNT, because the PCI slot on the motherboard couldn't talk to the Voodoo.
Quite a sad day for me, even if in hindsight it was for the better, given how everything went afterwards.
It felt like I got the lesser product, only because of a little motherboard issue.
I remember getting a Voodoo5 AGP, not knowing at the time that AGP and PCI were different. I couldn't use it for the first couple of months that I had it, and then upgraded the motherboard to one that could. I remember originally running a Gigabyte GA-6BXD with Dual Pentium IIIs, but I don't remember what I upgraded to that let me run the Voodoo5.
The V5 was the largest card I'd ever seen that wasn't a motherboard, and it ran every game I wanted to play for years!
I have one question: wasn't the 3dfx a graphics post-processor? I thought it didn't render the image in higher quality, but only did post-processing... I never had the opportunity to own a Voodoo, but later, when I got a decent NVIDIA card, I played Need for Speed 2, which had demo videos "rendered in 3dfx" with snow and other effects, and my graphics were crisp but snow-free. I tried to look up why my NVIDIA didn't have those effects, and I learned that they were overlaid on the original image only by the 3dfx Voodoo...
The Voodoo was a 3D-only accelerator. It didn’t have a traditional 2D graphics core at all, so you needed another basic video card which plugged into the Voodoo using VGA passthrough. When an accelerated game was launched, the Voodoo took over and replaced the 2D card’s output completely.
That’s probably why you remember it being a post-processor. It didn’t apply effects to the 2D signal, but it needed it for all non-accelerated programs.
3dfx also supported more blending modes than most competing cards at the time. That could be why the snow effect didn’t work on your card.
I was playing around with building retro game VMs with QEMU and PCI passthrough a while back and dusted off my old Canopus Pure3D to try out with a PCI-to-PCIe adapter board. It was kind of amusing: you'd have the Windows desktop running in virt-manager, and when you fired up Unreal Tournament the desktop would just freeze, and only then would the card actually output anything.
No it’s definitely a 3D renderer. Glide was a competitor to OpenGL and Direct3D that was proprietary to 3dfx. Don’t remember why the quality was higher.
Yeah, the earliest models are literally 3D-only, as discussed in the article; they have a separate pass-through cable for your existing 2D graphics card because even just making a flat 2D window isn't viable directly. Glide really wants to render only textured triangles, which is fine for Quake but no good for Windows.
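To make that concrete, here is a minimal sketch of what a Glide 2.x program looked like, written from memory rather than against the headers, so treat the exact signatures, constants, and setup as approximate. The point is that there is no windowing or 2D path at all: you open the Voodoo's screen directly (it takes over the monitor through the VGA pass-through) and feed it pre-transformed, screen-space triangles.

    #include <glide.h>  /* 3dfx Glide 2.x: 3D-only, talks straight to the Voodoo */

    int main(void) {
        grGlideInit();
        grSstSelect(0);  /* first (usually only) Voodoo board in the system */

        /* Takes over the monitor via the VGA pass-through; the 2D card's
         * output is simply bypassed while this is open. */
        grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                     GR_COLORFORMAT_ARGB, GR_ORIGIN_UPPER_LEFT, 2, 1);

        /* Glide has no transform stage: vertices arrive already in screen
         * space, so the CPU does all the geometry work. A real program
         * would also configure the color/texture combine units and download
         * textures to the TMU (e.g. via grTexDownloadMipMap). */
        GrVertex a = {0}, b = {0}, c = {0};
        a.x = 100.0f; a.y = 100.0f;
        b.x = 540.0f; b.y = 120.0f;
        c.x = 320.0f; c.y = 400.0f;

        grBufferClear(0x00000000, 0, 0);  /* clear color, alpha, depth */
        grDrawTriangle(&a, &b, &c);
        grBufferSwap(1);                  /* flip on the next vertical retrace */

        grGlideShutdown();
        return 0;
    }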
I had one of those 3D-only cards on my first computer. I didn't know about the passthrough and got pretty annoyed that my games sucked and never worked with the hardware 3D stack. I don't know if they didn't document it correctly, or if I just missed it. But when some support person finally told me, I was so pumped. I spent a while manually moving the cable to the 3D card when playing a game, until I finally got a passthrough cable.
> While at SGI, Tarolli, Sellers, and Smith had all had some exposure to SGI’s Reality Engine, and with video games (especially the PlayStation) moving toward 3D graphics, all three saw the potential for consumer 3D acceleration.
Were they exposed to early versions of the PlayStation? It wasn't publicly released until after 3dfx was formed.
While working at Virgin Games in '92 we did see some demos of 3D gaming, but I can't recall who the manufacturer was.
The PlayStation was in development and was released in Japan almost a year before 3dfx released their first card. It's reasonable to infer that someone could have had experience working on the PlayStation graphics system prior to moving over and creating 3dfx. The fact that they were building a prototype system for Sega as well means they were likely involved in that space before. SGI also licensed the CPU to Sony for the PlayStation, and back then the CPU would have done most of the graphics workload - similar to an APU today, with even less segregation between GPU and CPU.
https://archive.computerhistory.org/resources/access/text/20...
very good background on the founders up to and at the creation of 3dfx... some really interesting stuff in there.
Could a Minecraft-like game run on a 3dfx card? Or is it out of question performance wise?
Too little vram.
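To put rough numbers on that: assuming the common first-generation Voodoo split of 2 MB frame-buffer RAM plus 2 MB texture RAM, and the usual 640x480 16-bit mode, the color and depth buffers alone eat most of their pool before a single texture is considered. A quick back-of-the-envelope check in C:

    #include <stdio.h>

    int main(void) {
        /* Assumed first-gen Voodoo layout: 2 MB frame-buffer RAM + 2 MB texture RAM. */
        const long fb_ram   = 2L * 1024 * 1024;
        const int  w = 640, h = 480;
        const int  bytes_px = 2;                  /* 16-bit color */

        long front = (long)w * h * bytes_px;      /* front color buffer */
        long back  = front;                       /* back color buffer  */
        long depth = front;                       /* 16-bit depth buffer */

        printf("color + depth buffers: %ld KB of %ld KB frame-buffer RAM\n",
               (front + back + depth) / 1024, fb_ram / 1024);
        return 0;
    }

That works out to roughly 1800 KB of 2048 KB, and textures live in their own small pool with a 256x256 size limit, so a chunky, long-draw-distance world would be fighting the hardware the whole way.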
I recently decided that I wanted to relive the PII/3dfx glory days and went a little bit overboard buying up parts on eBay. I ended up with an Abit BH6, PIII-800, 768MB RAM, dual Voodoo2 cards and an Asus 3800 TNT2.
I remember pining for a 3dfx card, and seeing a second hand Rush based card for sale. 13 year old me bought it. Boy was I disappointed :) I learnt a good lesson that day.
Ended up getting a GeForce a couple of years later. Still wanted a Voodoo 3, but they were a little too expensive.
I had a Diamond Viper V770 Ultra TNT2. I feel like that was a turning point in the 3dfx vs NVIDIA battle (and the subsequent GeForce marked the start of NVIDIA's industry lead).
https://www.anandtech.com/show/307
I had that too - it was the last gaming card I ever had, then I became an adult who didn't like games for better or worse
There is a great instrumental chiptune heavy metal track dedicated to 3DFX: https://masterbootrecord.bandcamp.com/track/irq-10-3dfx
Sometimes I still don't know how I convinced my parents to buy me certain things. We lived in an extremely rural area and were not at all "well-to-do" but we always had a computer in the house. In my teenage years, this was a Packard Bell Pentium 100. All in the space of a year or so, I somehow managed to convince them to buy me a Canopus Pure3D video card (3dfx Voodoo but with 4 MB texture memory), a big ergonomic keyboard, a Sidewinder joypad, and Tomb Raider.
The Pure3D was the real winner here, it took Quake from "meh" on the average PC to flat-out amazing. Even on dialup, I got quite a lot of mileage out of that setup.
I still am a bit of a 3dfx fanboy. Ended up emailing 3dfx at one stage and got sent a load of posters and case stickers (remember those?).
I had a Voodoo Banshee which was a fairly decent card (not quite as good as a Voodoo 2, but better than a Voodoo Rush as a combined 2D/3D card). Paired to a Pentium P133 - very overkill on the GPU. Ended up using the same card on a AMD K6-2 500 in the end which was a bit more evenly matched.
Then ended up buying a cheap Voodoo 5 5500 after they went under (only paid £50 for it).
Sadly both of them went in a dumpster a long time ago. Wish I'd kept them both. I ended up moving to nVidia cards for a while, then had an ATi Radeon. Nowadays I just run a Macbook Air for my personal machine - life got in the way of much gaming!
My memory is that during the late 90s, whenever a game supported Glide, a 3dfx card would always render more smoothly and with noticeably better textures than NVIDIA and ATI cards, even when benchmarks gave similar numbers. So we constantly envied the roommates with a Voodoo card.
I remember those days: everything ran locally or through a fat client. You really used the hardware you had. I wish and pray we get back there.
Ah... the memory of PC Gamer
They didn't mention dual monitor and S-Video output. Having a second display was novel at the time. Using your television as another monitor, even more so.
Those SVCD rips looked really good on a television.
I wonder how much that 1 million shares of common nvda stock would be worth now.
Puzzling how they failed so fast. Why?
Author here. I should have been more clear about this in the article, but it was a combination of two different factors. First, the company alienated OEMs by making their own cards. Second, the cards they made were close in price to ATI and Nvidia offerings while being slightly less powerful. To make it worse, they were struggling to produce, and so while they had some interesting hardware in the works, they couldn't get it to market quickly enough.
Nvidia did sprinkle 3dfx's technologies across their products where it made sense, and many of the 3dfx folks continued at Nvidia for some time.
Nice article - thanks!
I remember at the time things were moving somewhat chaotically.
Also puzzling why 3dfx didn’t start to support DirectX in parallel with Glide.
I had a Pentium II 233 with a Voodoo Banshee and it was killer for gaming!
I still have a Diamond Monster Fusion kicking around somewhere in my stash. It was the first GPU I ever bought using my own money.
Obligatory https://vgamuseum.ru/wp-content/gallery/bitching-fast/bitchi...
Probably what I'm about to say is unfair because it happened during the last days of 3dfx, but I remember how disappointed my friends and I were when one of us bought a Banshee and tried to run some games. It ran like crap.
Everything we tried ran between "very bad" and "average", certainly not the "wow" the marketing had led us to expect. Then we tried something we had high hopes for: Trespasser, the Jurassic Park game (it would later come to be called the "arm simulator", but we didn't know this back then).
Trespasser ran appallingly bad with the Banshee. It sucked, plain and simple, almost a slide show rather than a game. We were sorely disappointed... with the Banshee.
It turned out much later that Trespasser was a very badly optimized game, and it had been an unfair test of the Banshee because the engine ran poorly on any 3D accelerator on the market.
But the Banshee's reputation was forever ruined for us. We still joke about this.
The Banshee kind of sucked. Voodoo2 12 MB was king in those days.
When they revealed the information about SEGA, the whole leadership team should have been on a plane to Japan the next day to apologize, bow, and scrape.
And trying to make your own board at that moment was just an incredible self-own.
> And trying to make your own board at that moment was just an incredible self-own.
ATI built their own boards too at the time (e.g. https://en.wikipedia.org/wiki/Radeon_R100_series#/media/File...), so the strategy of wanting to control more of the value chain doesn't sound that misguided to me. Not sure when ATI stopped doing that - was it after they were acquired by AMD in 2006 or before?
The same article also has another photo of a Creative-branded card. Other card brands also definitely marketed cards with ATI chips. I remember I had a R200-based card from some third-party manufacturer. ATI and Nvidia also had reference cards but I don't know if those ended up being sold in the mass market or not.
The problem is, they already had a large group of board providers, and that was a big market advantage for them.
If you listen to the Computer History Museum interview, many of these providers jumped ship.
Also, they bought a board provider that wasn't very successful, and their resulting boards weren't very successful either.
Having encountered some of the 3dfx people involved with that, bowing, scraping or apologizing was not really their m.o.
I worked for an elevator manufacturer that killed somebody; they fucked up the whole apologize-in-Japan thing too. Even if it wasn't actually their fault.
The board decision I can understand: the guys had worked at SGI before and had seen how much more you could extract from the architecture by having total control over the hardware (minus the CPU, obviously). Essentially building consumer-oriented x86 SGI machines, branching out of the gaming market and challenging workstation vendors. A 64-bit Opteron + Voodoo based Windows machine would have been something to behold. But the Sega thing probably torpedoed the funding that would have been required for them to become independent of graphics card vendors.
They knew it was a matter of time before their advantage eroded. I think what really did them in was DirectX, they stuck with Glide and allowed NVidia to develop a proper implementation of the new Microsoft thingie which was heavily marketed to developers. Their moat became a prison.
I think the whole 'moat prison' thing is overrated. They could have released future versions that supported both DirectX and Glide.
They simply didn't focus enough on the next generation chip.
Yes, I remember toward the end it seemed like they were just releasing souped up versions of last year's product. I don't know if it was a resources issue or they just didn't expect the market to advance so fast.
A blast from the past.
Voodoo 3 3000 16MB AGP with blue IBM RAMDAC.
AMD AthlonXP 1600(PR), absolutely stable OC'd to 1800(PR), later to 2100 thanks to undervolting to 1.38V!1!!
3x 512MB (1.5GB!!!) NEC Virtual Channel Memory (VCM) 133MHz SD-RAM DIMMs
Some SCSI drives dangling from an NCR/Symbios Logic controller
Some VIA board I can't exactly recall, except that it supported that NEC VCM and the NCR/Symbios Logic controller from its BIOS, and ran really well.
A 21" Hitachi Superscan Elite/Supreme? (with all-rectangular buttons, including power, not right on the front bezel but down below it, slightly recessed) doing 1600x1200 at up to 85Hz, without annoying Trinitron wires! :)
Mostly running NetBSD (otherwise FreeBSD & Gentoo), whose XFree86 or early Xorg made full use of the Voodoo for 2D acceleration - an absolutely fluid 'desktop' (mostly KDE3) :)
Super-crisp picture thanks to the IBM RAMDAC.
Such Flashback. Much Instagasm