Too Many Chips Spoil the Mix?
May 30, 2002 - Paul Sullivan
Summary: Our tireless editorialist, Paul, examines the state of the PC industry - specifically the dizzying array of choices consumers have in the area of motherboards, CPUs, video cards, and more. Sometimes, freedom of choice means too much of a good thing. Is growth in the industry being stifled by the vast amount of similar PC products?
Introduction (Page 1 of 7)
This article may contain outlandish exaggeration, speech designed to mock, and over-the-top comments designed to evoke emotion. If you don't get that, well, then you don't really get what an editorial is all about. Please feel free to spend a little time acquainting yourself with the concept of OP/ED pieces or, if you just want some peace and quiet, try tuning in to the 24 hour golf channel on your cable TV. If your local station listing does not include the 24 hour golf channel, please contact your cable company and ask for it.
This article may also cause dizziness, nausea and dry mouth, though results are similar to sugar pill. Do not read this article if you are nursing or plan to nurse, have liver or kidney damage or plan to have liver or kidney damage in the future. If you have breathed, are breathing or plan to breathe while reading this article, please consult your doctor before continuing any further into the abyss.
KISS: Keep It Simple, Stupid
I'm a techno-geek from way back, before PCs had hard drives, and one of the knocks that PCs have gotten over the years, perhaps deservedly so, is that they are overly complex. At first, it was the software that most people complained about, because text-based interfaces were not all that easy for people to master, especially given the complex nature of DOS syntax. There were only a few companies that really made PCs, so there was a great deal of uniformity at that time.
Now, some 20+ years later, the same can certainly not be said. It seems that there are more varieties of hardware than we have ever had before! Computers are much more than single-purpose tools; they have evolved into diverse, multi-function devices that allow users to do everything from playing hardcore 3D games, building and listening to large music libraries, burning custom audio and video discs, and watching DVD movies in full Dolby Digital 5.1 sound to responding to a flurry of email and instant messages and surfing the web for hours on end.
What it all really boils down to is that you want the machine you poured all of that hard-earned money into to keep its mouth shut, put its proverbial nose to the grindstone and get the job done. Unless you are a total masochist, you don't want to have to spend endless hours tweaking, adjusting, resetting and researching just to get the thing to work the way it is supposed to. In theory, computer, peripheral and chip manufacturers should have figured that out by now, but if you have paid any attention to the market over the last few years, you know that they are just as clueless as you might expect them to be.
What exactly am I talking about? Put in a little time and read the following pages and I'll try to fill you in...
SIDEBAR: When I was a kid, people told me that KISS, the rock group, was evil and I should not listen to it. Of course, I did it anyway. KISS does not stand for Knights In Satan's Service, by the way. How can anybody think that "Beth" is a song promoting devil worship?
Motherboards (Page 2 of 7)
Intel: Setting The Standard
Back in the day, we had Intel, and it was good. We only had a couple of motherboard chipsets to go with the CPUs that they produced, and there was very little research or effort that needed to go into the buying decision. However, over the years, as CPUs and systems from other manufacturers started coming to market, consumers could no longer rely on Intel as their sole source of solutions.
Intel, of course, would not want to promote motherboard chipsets that supported CPUs other than their own, and nobody can blame them for that. After all, who would want to expend their own corporate resources to help out their fledgling competitors?
That said, Intel eventually started to feel the heat from competition and began pushing the envelope with a wider variety of offerings. Perhaps the most legendary Intel chipset of the modern era has been the 440BX. This chipset was the foundation for the Pentium II/III juggernaut, and to this day it is still one of the most stable, most efficient offerings ever produced. Motherboards based on this chipset are some of the best-selling ever made, and many are still in operation. I know that the Asus P2B family was great, but my favorite board based on the 440BX has been the Abit BE6-II. It still compares well to many new motherboards in terms of bang for the buck, even offering support for 8 IDE devices.
The follow-up to the 440BX, the 815, never really reached its full potential. While heralded by some as a solid step forward, it never played out that way in sales. The next move of major importance was Intel's shift to the Rambus memory architecture. While it may have looked good on paper, Rambus memory was monumentally expensive, and the company's pathetic corporate attitude was a loser from the start. Rambus had this great deal with Intel and wanted to corner the market on the new technology. Instead of trying to compete on a level playing field, it claimed patents covering SDRAM and DDR and told its lawyers to start sending out letters to other memory makers demanding excessive royalties in an effort to drive them out of business. The backlash was serious and immediate.
Consumers balked, memory makers balked, and after Rambus lost a couple of court cases, even Intel started taking a step back from the relationship. Eventually, Rambus got hammered in every court case it prosecuted, and with Intel committing to a DDR alternative, Rambus had no choice but to go low-profile for a while and hope the storm would pass. The 820 chipset was plagued with problems, and it wasn't until the 850 that the Rambus picture improved. To some extent, the storm did pass, and now that Rambus prices have dropped and the 533MHz FSB has been introduced, it is clear that Rambus is at least as fast as, if not faster than, any DDR solution currently available. Still, Intel is pushing a variety of 845 chipset solutions that look very impressive, particularly from a price/performance standpoint. I hope that Intel decides to blow off Rambus completely, not only because Rambus tried to dominate the marketplace with lawyers instead of competitive products, but because there are just too many chipsets with too many bugs coming out of Intel. The new DDR platforms look very promising, and given the hassles with the 820 and other chipsets, it would be better for everybody if Rambus were gone and DDR ruled the day.
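To put some rough numbers behind that claim, here is a quick back-of-the-envelope sketch of peak memory bandwidth. This is my own illustration, not anything from Intel or Rambus; the channel widths and transfer rates are just the figures commonly quoted for these parts:

    # Peak bandwidth in GB/s: bus width (bytes) x transfer rate (MT/s) x channels
    def peak_gb_s(bus_bits, mega_transfers, channels=1):
        return bus_bits / 8 * mega_transfers * channels / 1000.0

    print(peak_gb_s(16, 800, channels=2))   # dual-channel PC800 RDRAM (i850):   3.2
    print(peak_gb_s(16, 1066, channels=2))  # dual-channel PC1066 RDRAM (850E): ~4.3
    print(peak_gb_s(64, 266))               # single-channel DDR266 / PC2100:   ~2.1
    print(peak_gb_s(64, 333))               # single-channel DDR333 / PC2700:   ~2.7

On paper, at least, dual-channel RDRAM still has the raw bandwidth edge over a single channel of DDR, which is the whole point of the comparison.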
SIDEBAR: Back in the day, we had the 8088 and the 8086 and the V20 chip. Remember the V20? You could soup up your 4.77MHz machine and play 3D Pac-Man in brilliant Hercules monochrome just fast enough to make it feel like you were pushing the edge! Oh, how your buddies envied you...
More Motherboards (Page 3 of 7)
AMD and VIA
The most successful competitor to Intel in the CPU arena has been AMD, particularly with the release of its Athlon and Duron processor lines. Of course, the problem with introducing a new CPU is that you need a chipset to support it. Since AMD was not allowed to produce a 100% compatible alternative to Intel that worked on existing Intel chipsets (complicated legal stuff and cross-licensing agreements), it started developing its own alternatives. For a while, AMD was putting out some good, solid chipsets, like the 760 series. However, AMD made it known that it did not want to be in the chipset business and was counting on other companies to step up to the plate and support AMD processors.
VIA is arguably the current master of AMD compatible chipsets, with a huge, sometimes confusing array of products out there. They seem driven to push the envelope with new features and different combinations, and on some level that is very exciting. However, the problem with pushing the envelope is that you end up on the "Bleeding Edge", where there can be a great deal of pain along with the gain.
The issues with the 686B Southbridge, for instance, are very well documented. It seems that Creative Labs products often don't get along with at least some VIA chipsets, and this is one of the most obvious examples. The 686B was also plagued by intermittent IDE bus-mastering issues, or so some have said, and the perception that there was a defect in VIA chipsets has come back to haunt the company time and time again.
Even the KT266 chipset had issues, though most seemed to hinge on performance. The KT266A chipset has turned out to be a huge winner for VIA, but only because it realized the promise that the original KT266 chipset failed to provide. For many potential VIA customers, the question has been: "Which chipset to buy?" The wide array of offerings, due in part to the need to introduce bug-fixed replacement sets, can be confusing to even the most technically oriented users.
For example, the new KT333 set is at times actually slower than the KT266A. How is that possible? They need to tweak timings, latencies and the like to provide an optimal pairing between the Northbridge and the Southbridge, not to mention the fact that these new higher-speed DDR memories are not even officially sanctioned by JEDEC yet. VIA is so far ahead of the curve they have to pave the road as they turn the corner. Maybe if DDR400 gets approved officially, or DDR II is finalized, things will get simpler, but for now, it can be a real mess.
NVIDIA, ATI and SiS
It looks like at least three other players are joining the fray. NVIDIA has the nForce, ATI has some new offerings, and SiS is coming up slowly on the outside of the track. The early buzz on the SiS set is impressive, and the nForce is already entering a second generation of product development that is helping bring NVIDIA the attention and respect it has craved since announcing its intention to enter the market. The problem that I can see at this time, however, is that there are too many offerings. Personally, I don't like the idea of having integrated graphics chipsets on the motherboard. I wish NVIDIA would leave that to Intel. I don't see the need for four different versions of a product, because it seems too confusing when you are trying to make a smart decision. Not all users have the time or inclination to investigate and research every tiny detail necessary to make the best decision. While some may applaud all these choices, I would prefer that these companies focus their efforts on fewer products to help keep it simple and put resources where they can do the most good.
SIDEBAR: I waited a long time before I got an AMD motherboard, and since I have three of my old Vortex 2 sound cards running under Windows 98 SE, I don't seem to have any of the compatibility problems that others had. But my new MSI K7T266 Pro2-RU has had a few issues with its USB 2.0 implementation and with the onboard sound being disabled. I am begging for another BIOS update to see if it fixes the remaining trouble.
Processors (Page 4 of 7)
The Big Two
After all the hype and market manipulation, it is clear that there are only two top-tier players left: Intel and AMD. Intel has been dominating things since the get-go, and it still holds some 70+ percent of the market. Intel has made very few mistakes in its CPU division over the years. The 386 and 486 lines were fantastic, particularly the 486/33, the 486/66 and the clock-multiplied DX4/100. However, even back in the early days of the 486, we started to see the problems with too many confusing options. You had the 486SX without a math coprocessor and the 486DX that had one. You could upgrade the SX to a DX with the right chip, but it could be a pain. You had different sockets that limited which of the 486/33s you could OverDrive to 486/100s, and it was not uncommon to purchase a DX4/100 and find it would not work in your system.
Then, Intel decided to go from a socket system to a Slot 1 system, only to go back to a socket system a few short years later. Personally, I thought the Slot 1 system was fantastic, because you could buy the setup with the chip, heat sink and fan already installed and ready to go. No more messing with ZIF levers, no more cracking chips while trying to push down hard on the socket clip for the heat sink with your screwdriver, no more need to even install your own heat sink. You could purchase a fully integrated setup with a retail warranty and simply slide it into place like you would a video card. But for whatever reason, they opted to go back to sockets, and the upgrade confusion was on yet again. AMD was no better. They followed Intel like mindless sheep, going from Socket 7 to Slot A and back to another socket configuration. What a pain.
On top of all that, you had Intel and AMD each offering two processor lines - one for the low end and one for the high end. But with AMD cutting prices on the high end, there really has been no need for the Duron chip at all lately. Thankfully, AMD is killing off that line and pushing the standard Athlon CPU line into the low end so its 64-bit offering can take over the high end. Intel is still supporting the Celeron, though. The Celeron stuck at a 100MHz front-side bus for some time, but now that the Pentium 4 is moving up to a 533MHz FSB, Intel has decided to push the Celeron up to a 400MHz bus. Add to this the fact that, MHz for MHz, the Pentium 4 chip is slower than Intel's own Pentium II and Pentium III architecture (too deep a pipeline?), and the confusion increases.
AMD knows that the Pentium 4 is slower on a per-MHz basis, so it came up with a new way of naming its chips. While the chip may actually run at 1.8GHz, AMD will call it a 2200+, suggesting that it runs at least as fast as a Pentium 4 2.2GHz model. Confused yet? Oh, and then there are the mobile versions of these CPUs, which use SpeedStep-style technology (AMD calls its version PowerNow!) that ratchets the chip speed up and down to save battery life. I don't know about you, but I want my chips going full-bore all the time. I'll buy extra battery packs.
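For what it's worth, AMD has never published an official formula for those model numbers, but the early Athlon XP ratings track a simple rule of thumb closely enough that you can sketch it in a couple of lines. Treat this as an observation on my part, not AMD's method; the xp_rating helper is just my own illustration:

    # Unofficial rule of thumb: rating is roughly 1.5 x clock in MHz, minus 500
    def xp_rating(clock_mhz):
        return 1.5 * clock_mhz - 500

    print(xp_rating(1800))  # 2200.0 -> sold as the Athlon XP 2200+
    print(xp_rating(1667))  # 2000.5 -> sold as the Athlon XP 2000+

Whether real-world performance actually backs up the number is a whole separate argument.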
Bringing Up The Rear
The other side of the market is no better. I don't even want to get into the whole Transmeta situation. Is it really a processor, or just a chip running "code morphing" software that translates x86 instructions on the fly? It is designed to save energy, but at what cost? It has been more expensive to produce than they thought it would be, and to say that it has underperformed is an understatement. I think this chip design is going to die an ugly death.
Then there is Apple. Like AMD, they put out tests showing PowerPC-optimized versions of Adobe Photoshop clobbering Intel CPUs in certain operations, but the chip runs at a much lower clock speed than Intel and AMD chips do. How is this possible? Well, I really like Apple, but I think they are only showing the tests that run in their favor. If they put up a list of industry-standard benchmarks, I think it would be clear that the PC side gives you much more bang for the buck. There are faster apps than Photoshop too, which is a hog in its own right. Paint Shop Pro wipes the floor with Photoshop, but you can't test it on the Mac. Even though OS X is based upon a Unix core, the graphical bells and whistles drag performance down so much that even the fastest PowerPC chips choke. I think Apple will continue to be a niche product, partially because of the confusion surrounding their CPU architecture and the true performance it offers relative to the Intel/AMD side.
SIDEBAR: AMD is undeniably the "Bang for the Buck" champion, and since the KT266A boards are so stable overall, a great option. Still, this new 533MHz FSB looks pretty darn intriguing on the Intel side. I wonder if I will buy a Pentium 4 system over 2GHz with RDRAM or not. I'm so weak when it comes to computers and money.
Video (Page 5 of 7)
The Big Two
I'm only going to focus on the big two - NVIDIA and ATI. There is already too much confusion here as it is, and I don't want to add to it. I'm going to let the massive array of offerings speak for itself. Below is the current NVIDIA product line (with a quick sketch after the list showing where those memory bandwidth numbers come from). Is there any need for the MX series at all? Come on, the Ti 4200 should be enough. Set it to $149, call it good and kill off the MX series entirely. Two mobile solutions? One too many. In fact, why not drop the 4600 down to $299 or maybe $329 and get rid of the 4400? I think NVIDIA found nirvana when they had the GeForce3 Ti 500 and the GeForce3 Ti 200 and that was all. Simple, easy to differentiate, easy to justify on cost. The GeForce4 line is like carpet bombing an entire city to hit a single building - overkill.
High-End DirectX 8 Parts
- GeForce4 Ti 4600
- Vertices per Second: 136 Million
- Fill Rate: 4.8 Billion AA Samples/s
- Operations per Second: 1.23 Trillion
- Memory Bandwidth: 10.4GB/s
- Maximum Memory: 128MB
- GeForce4 Ti 4400
- Vertices per Second: 125 Million
- Fill Rate: 4.4 Billion AA Samples/s
- Operations per Second: 1.12 Trillion
- Memory Bandwidth: 8.8GB/s
- Maximum Memory: 128MB
- GeForce4 Ti 4200
- Vertices per Second: 113 Million
- Fill Rate: 4.0 Billion AA Samples/s
- Operations per Second: 1.03 Trillion
- Memory Bandwidth: 8GB/s
- Maximum Memory: 128MB
Low-End Parts (Improved GeForce2 Technology)
- GeForce4 MX460
- Fill Rate: 1.2 Billion Texels/s
- Triangles/s: 38 Million
- Memory Bandwidth: 8.8GB/s
- Maximum Memory: 64MB
- GeForce4 MX440
- Fill Rate: 1.1 Billion Texels/s
- Triangles/s: 34 Million
- Memory Bandwidth: 6.4GB/s
- Maximum Memory: 64MB
- GeForce4 MX420
- Fill Rate: 1.0 Billion Texels/s
- Triangles/s: 31 Million
- Memory Bandwidth: 2.7GB/s
- Maximum Memory: 64MB
- GeForce4 440 Go
- Memory Bandwidth: 7GB/s
- Fill Rate: 880M Texels/s
- GPU Core Clock: 220 MHz
- GeForce4 420 Go
- Memory Bandwidth: 3.2GB/s
- Fill Rate: 800M Texels/s
- GPU Core Clock: 200 MHz
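As promised above, here is a quick sketch of where those memory bandwidth figures come from: the math is just bus width times effective memory clock. The clock speeds below are the ones commonly quoted for these boards rather than anything pulled from NVIDIA's page, so treat them as assumptions, and the mem_bandwidth_gb_s helper is my own illustration:

    # Memory bandwidth in GB/s: bus width in bytes x effective memory clock in MHz
    def mem_bandwidth_gb_s(bus_bits, effective_mhz):
        return bus_bits / 8 * effective_mhz / 1000.0

    print(mem_bandwidth_gb_s(128, 650))  # GeForce4 Ti 4600: 10.4
    print(mem_bandwidth_gb_s(128, 550))  # GeForce4 Ti 4400:  8.8
    print(mem_bandwidth_gb_s(128, 500))  # GeForce4 Ti 4200:  8.0
    print(mem_bandwidth_gb_s(128, 400))  # GeForce4 MX440:    6.4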
SIDEBAR: No matter what NVIDIA tells me, there is just no excuse for not including full hardware DVD playback assist in all of their products. Why add it to the cheap MX line and not to the Titanium line? The PCB is already huge, I'm sure you could have squeezed it in there somewhere...
More Video (Page 6 of 7)
As I did with NVIDIA, I'm taking the ATI product line straight from their own web page. It speaks for itself, don't you think? Do you need two All-In-Wonder 8500s? Why not just clock the 8500DV at 275/275, throw in 128MB of RAM and drop the analog entry? What is the reason for the LE line at all? OEMs? Just drop the LE and have the original 8500 64MB for $129, the 8500 128MB for $199 and the AIW 8500 128MB for $299 and call it good. Clock them all the same at 275/275. Do you really need the 7500 and an AIW 7500? Why even bother with the 7200 and 7000 at all? The original Radeon line is no better. Different versions for DDR and SDR? Why bother? In fact, why not just kill off the 7200, 7000 and original Radeon lines completely? Finally, there is the mobile side of ATI's operation. Here I was complaining that NVIDIA had two offerings, when ATI has seven? Seven? Why do you need that many at all? Just put out an 8500 Mobility and a 7500 Mobility and be done with it.
ALL IN WONDER RADEON 8500
ALL IN WONDER RADEON 8500DV
RADEON 8500 128MB
RADEON 8500 64MB
ALL IN WONDER RADEON 7500
RADEON 7000, aka RADEON VE
ALL IN WONDER RADEON
RADEON 64MB DDR
RADEON 64MB SDR
RADEON 32MB DDR
RADEON 32MB SDR
MOBILITY FIRE GL 7800
MOBILITY RADEON 7500
RAGE MOBILITY 128
RAGE LT PRO
SIDEBAR: I have to admit, ATI has really stepped up to the plate when it comes to improving the speed of the drivers for their 8500 series. Yes, they still cheat on anisotropic filtering and probably some other things, but their drivers are getting more stable and more efficient all the time. Maybe by 2003 they will finally get the 8500 where it should have been in the first place (/sarcasm off).
Remainder (Page 7 of 7)
The audio market is awash in junk products, but besides Creative Labs, the only company that seems to have made a dent in card sales has been Hercules. For a while we had Aureal, and I still think that the Vortex 2 was one of the best audio products in many years, but unfortunately, the same bad management that killed off Media Vision killed off Aureal. For my tastes, Hercules has one or two more products than it needs. If it were up to me, I would keep the excellent Game Theater XP and the Fortissimo II and drop the Muse. Then there is Creative Labs. I can understand the Sound Blaster Live! and the Sound Blaster Live! Platinum, but I don't understand why they did not just build full hardware Dolby decoding into the Platinum and call it a day. USB sound is a bandwidth hog, and I would rather not have to sweat it. As for all of the different onboard sound chips, all I can say is, what a waste. I wish Creative had established the Live! as an onboard solution so we did not have to choose between so many DSPs.
I always got fed up with having eight different kinds of SCSI controllers. 25-pin, 50-pin, 68-pin, 80-pin? SCSI-1, SCSI-2, SCSI-3, Ultra? $60 for external cables? Please. SCSI is a prime example of how things can get out of hand. At least with IDE you had the 40-pin connector all the time. Even the 40-pin, 80-conductor cables used for UDMA/66 and above use the same basic connector IDE has had for years. But now they want to move to Serial ATA and screw the whole thing up. What a pain.
FireWire and USB? Heck, USB 2.0 is awesome, and I would rather see FireWire fall by the wayside. Memory for digital cameras? You have Mini-CDs, Microdrives, Sony Memory Sticks, CompactFlash and SmartMedia, just to name the big players. Why not just settle on CompactFlash, with its PC Card-derived interface, and move on? CompactFlash has the clear lead in capacity and value, and it is much more standardized in terms of interface. Microdrives are a good idea, but they are expensive. I think CompactFlash is the best way to go, and I wish that Sony, IBM, Olympus and others would just get together, throw in the towel and adopt CompactFlash as the definitive standard. It sure would make camera shopping much easier.
There you have it. My take on the crazy world of chips. I think we just have too darn many to choose from. I'm all for competition and the exercise of the capitalist system, but can't we just stop arguing over these petty little differences and produce a simplified set of products that combine the best of all of them?
I may rag on Microsoft, but I have to admit, I like having Windows on about every machine I work on when I go to client sites. Trying to get Linux, Mac OS X and Windows talking to each other with the right security protocols in place can be a major pain, as any other consultant can likely attest. When I go to a site and everyone is using FAT32, it makes using Norton Ghost so much easier it is not even funny. I can have Windows loaded up, have the same network protocols in place and push software onto the desktops, all in a matter of a few hours.
But then you have to go and start tweaking things. One section has VIA chipsets, one has Intel, one needs new 4-in-1 drivers, the other doesn't. Some have ATI chips, others Matrox, others NVIDIA. You have to start configuring a drive image that takes all of those into account. If you have Windows NT 4, which some clients still do, it can be a nightmare, because as much as we may whine about Plug and Play, it really helps with things like swapping between different video cards and the like.
Please, manufacturers, take an earful, ok? I love innovation, I love fast computers and I even love tweaking. But what I love most of all, perhaps, is being able to turn on my system and play my games and load my software without having to worry about what goes on behind the scenes. Can't we all just get along?
SIDEBAR: Are you sick and tired of all the confusing options when you go to purchase some hardware? Does anybody actually think that Rogaine is worth the time and effort? Wasn't the Smallville season finale just awesome? Please let us know your thoughts in our Comments Section.