FiringSquad Editors Challenge Round...
More Graphics and CPU Power! Is it necessary and does it improve things?
by: VTwedge (5) | Posted in cluster FiringSquad Editors Challenge Round 1 Prelim 2
Some people will laugh when they first see this article and dismiss it as a joke; others will take a careful look and maybe think twice about whether it has any value.
I am as avid a gamer as the next person reading this article, and just as excited by every new evolution in hardware as any hardware guru, since I frequent more gaming and hardware sites than the number of my past and future girlfriends put together, as far as I can tell! That does not change the fact that I also lead a normal life (insert laugh here), and sometimes my normal self clashes with the hardware geek in me, starting wars between common sense and the need to compete and stay on the bleeding (pocket) edge of hardware advances.
I'll confess: if I had the financial freedom, I would surely buy the latest and greatest of anything that comes out (damn evil hardware-geek self surges up!). But the real question is: does the average user see any benefit from having the latest and greatest?
BRIEF CPU HISTORY
Most of the people here don't remember the CPU wars. I was there... I have owned a PC since I was 10, and I started off with a 386 clocked at 40 MHz with 4 MB of RAM. Yes, I can hear you laughing from here to eternity, but in those long-gone years that was a blazing-fast machine that played everything! At that time Intel was monopolising everything, and AMD was a tick on the rear of the big bad wolf that was Intel. Then, around 1999-2001, the tick started to grow and sting the wolf more and more, and the CPU wars erupted. I admit I was always a fan of the underdog, and in this case the underdog became the champion. AMD started winning the war, broke the gigahertz mark with its Athlon series processors, and held the lead for the next four years, while Intel kept ramping up clock speed trying to catch up. One gigahertz fell, then two gigahertz followed; Intel kept speeding up and hit the thermal wall at 3.73 GHz. AMD continued to improve its architecture, aiming for processor efficiency instead of raw clock speed, and ended up reigning supreme for three years with the Athlon 64. Enter the age of rethinking for Intel, and what do you have now? They made Conroe, known to the masses as the Core 2 Duo, and won back the "fastest processor" moniker.
And there you have it: from single core to dual core, and now on to quad core, and it is not stopping (nor should it, in my opinion).
As for graphics? Much the same history. Roll back 15 years and no one would ask you, "Hey, what graphics card do you have in your PC?" Now? It is just as important as processor speed, at least for those of us who shoot our way from the hell of Doom to the Halo ringworld (in our fantasies, of course). I remember my first graphics card (a whole megabyte of goodness) in the form of a Cirrus Logic 5430 (if I remember right), on a VESA Local Bus interface no less (for those of you who don't know what that is, take a look at Wikipedia). I could see 24-bit colour at 640x480 resolution! Yeah, keep laughing; I'm giggling myself here, remembering a famous person who once said 640 KB of memory should be enough for everybody (props to those who know who that is). For those of you who don't, I will say this: that person makes Windows (pun intended).
So back then we had only 2D accelerators (if you can even call them that). Then things started moving fast, and the first 3D accelerators appeared: the 3dfx Voodoo (4 MB) paradise, then the Voodoo 2 (8 MB and 12 MB versions), dominating Nvidia and (yours truly) ATI, since those cards were dedicated 3D processors that kicked in only when a 3D application ran, while the other two companies used mixed 2D/3D hardware. Resolution didn't increase much, but graphics quality did. Then 3dfx dropped the ball and was bought out by Nvidia (the SLI that users brag about today was a 3dfx trademark, and yes, even in those days you could chain two cards together and scale your performance). Nvidia continued to improve, moving to the GeForce 256, then the GeForce2 and GeForce3 series, and you know how it goes from there: the 6 series, the 7 series, and the beast now known as the 8800 series, carrying a massive 681 million transistors driving 128 unified shaders. ATI followed its own course, from the Rage series through the Radeon line up to the 9000 series, then the X series (300, 600, 700, 800, 850) and its variants (XT, Pro, XT PE), ending up at the X1000 series (300, 600, 800, 900, 950; see a trend here?) and its respective variants, with the XT PE replaced by the XTX. Now the much-awaited R600 chip (enter the second beast) is waiting to enter the arena. So again we went from having a few megabytes of memory on our boards, for graphics alone, to having several hundred megabytes, and make no mistake, gigabytes will follow soon. Sounds similar to the CPU wars, doesn't it?
CURRENT WOES AND INTROSPECTION
Now, if you have read this far, you are probably wondering why the whole history lesson. Honest question for everyone: how many of you average everyday users have noticed drastic performance increases in everyday computer use each time you upgraded over the past four years? I haven't, and I have jumped from an AMD Sempron clocked at 2.4 GHz to an AMD dual-core 5000+ (2x2.6 GHz). To the hard sceptics who have been thinking of asking whether I play games at all: yes I do, quite often in fact. Over the past five years I have used a GeForce 5700, then an X800 GTO that I flashed to an X800 XT Platinum Edition, then an X1950 XTX, and then a CrossFired X1950 XTX rig. The one time I saw a significant difference was jumping from the 5700 (which was crappy, if you ask me, as was the whole Nvidia 5000 series) to the X800. Is my computer fast? Yes it is, and I do play the latest games, but I'm always playing at 1024x768 on a 19-inch CRT monitor. Yes, I enable the optimisations (AA and AF), but I personally can't tell the difference between 4x AA and 16x AA while playing, so what's the point of turning it any higher?
Then we have the 15-year-old kids in the forums saying, "Hey, mine's better than yours 'cause I've got 2x8800s my daddy bought and I can see 50 more fps than you!" Good line, kiddo, but unfortunately I know your eye can't tell 60 fps from 150 fps, because you are built like every other human being on the planet, and you are running a 15-inch monitor, so you probably ripped the results from somewhere just for bragging rights, since your monitor won't even let you play at 1920x1440, which makes your two cards redundant: you don't need that much power at the resolution you are actually playing at! Oh, and another line: "Intel rules, AMD sucks," or "AMD rules, Intel sucks." Right, you must be the resident fanboy who didn't speak up for the past four years, or the five years before those, etc. Who sucks? If I put two systems built three years apart next to each other and have you play on one and then the other, I'll bet my new PC you won't be able to tell me which runs at 2.6 GHz and which at 4 GHz, unless you run a benchmark! No, I am not a hardware hater or anti-progress; on the contrary, I support progress all the way :) But the thing is starting to become ridiculous.
We have gorgeous-looking games today, very immersive, with nice graphics and all those effects... well, I'm sorry to say, but apart from a few notable exceptions, it seems that the higher the quality of the graphics, the lower the quality of the game's content turns out to be. Yes, I like graphics, but if a game is going to use my hardware, it had better spend some CPU cycles providing intelligent content too, so that my brain works as well as my eyes. Yes, I do like the occasional shoot-first-ask-questions-later fest, but I also like immersion, and immersion for me is not just the eye candy! You want to immerse me in your adventure or shooter? Give me a plot I can imagine myself in, not the generic aliens-or-bad-guys-from-the-unknown, shoot-them-all-and-save-the-world-by-riddling-everything-with-bullets formula. I'm tempted to say: please bring back the days of Full Throttle, StarCraft, or Gabriel Knight, instead of whatever unimaginative piece of software is currently in development that will provide a much poorer experience (internal conflict ensues between hardware self and nostalgic self!).
I believe in evolution. Yes, hardware makes even bad software look good (if you turn on that 32x super-mumbo-jumbo multisampling that your card can't run at acceptable levels), but rapid evolution brings mutation, and not all mutations are good (cancer, anyone?). I hate the motto "Hey, let's make a jaw-dropping game that needs a Cray supercomputer, multiple cards, and 150 GB of RAM to run, because we can't be bothered to optimise anything!" :) No thank you; I will go watch The Matrix Revolutions now. I can get it for 2.5 pounds from Sky TV instead of rendering it on a PC that will cost me my life's and afterlife's savings to buy!
So there we have it. We are definitely running faster systems today. We have better graphics cards. So... where's the improvement, apart from the graphics department?
In my opinion there is little, if any. The temptation to go flashy when you know users have faster systems has certainly appealed to the software makers. Spending less time on story and plot and focusing on graphics and visual bling has become a trend in the gaming industry. I guess it's easier to blind us with graphics than with substance. I would give my soul for a visual feast with substance, but for now it appears I get to keep my soul a bit longer. Maybe it will not be so bad in the future, but when I look at Neverwinter Nights 2 bringing my system to its knees (AMD 5000+, CrossFired X1950 XTXs), I'm thinking of hiding under my bed and never coming out again!
I'll stop my ranting here; I have already taken enough of your time. Thank you for reading the whole article, and I hope it makes you think twice about where this trend will lead the industry in the future.
Just for giggles, think about this: of the two major companies in the graphics industry, Nvidia and ATI, one has DirectX 10 hardware out, but the drivers don't work for it under Vista and many users complain in the forums, while the other has the working drivers but no hardware to use them yet! Funny? Yes it is. Now go figure how that came to be!
|14 User Comments • 7 root comments|
| mNewland (1) Mar 06, 2007 - 11:57 am | Edited on Mar 06, 2007 - 12:08 pm|
I too, have recently run into severe battle fatigue on the hardware front. My current box (Water cooled, SLI 7800GTX, 4800+) was tight when it was built a year and a half ago, but is approaching the point where it could use a face lift (sorta the same way a 24 year old could use a face lift).
What's happened in the last year?
- DDR RAM is uncool now. That high-end 2 GB is junk. Is DDR2 going to make a meaningful difference? Nope.
- High-end motherboards jumped from $140-$160 up to $350-$400. Meaningful performance gain: zip (except for a bunch of annoying heatpipes that end up interfering with CPU heatsinks; they look shiny, though).
- Core 2 Duo: nice. I want one. Will I see a difference? Probably not; I'm graphics-limited running everything at 1920x1200.
- Graphics. Ouch. Water cooling SLI now runs $1600.
- Don't forget to junk the old high-end PSU for the 1 kW model!
- Vista, wow Halo 2 at last!
Running down that list just wears me out. Could I do it, yep. Would I be the coolest kid on the block for a month, sure. It just seems like so much for so little. I'm sure when I see Crysis I'll immediately buy all that crap, but for now I bought an xbox 360 and Gears of War. Gears looks and sounds fantastic in the home theater. The console and game cost far less than a single 8800GTX and have already provided many hours of fun. Downside: game pads are the mark of the devil.
Also, VTwedge, it sounds like you've laid down quite a bit of coin on your rig over the years. Make your next upgrade a 24" Dell. That is one tweak that truly will change your gaming experience.
| twophayse (8) Feb 27, 2007 - 04:07 pm|
|» Good topic, poor analysis|
C'mon, this article completely ignores all hardware outside of what you've personally used. The history piece was complete filler; a couple of links to Wikipedia would've sufficed. You made no mention of upgrading to eliminate poor framerates without sacrificing resolution or special effects. Anyone who played FEAR, Oblivion, Splinter Cell, Riddick, Flight Simulator, Mafia, or Black & White upon release can tell you the difference a CPU or GPU upgrade made. Following your logic, imagine going from a Celeron/GeForce2 MX in 2001, to a Barton/Ti4200 in 2003, to a Venice/7600GT in 2006. Imagine playing Mafia on the first system, then on the second. Imagine playing Riddick on the second system, then on the third. Let me tell you, the difference in pure speed is immediately noticeable, not to mention the look when the special effects are enabled. Factor in the many LCD owners who expect to play everything smoothly at native resolutions well above 1024x768, and that three-year upgrade cycle pays even larger dividends in framerate and eye candy. You've either been playing the wrong games, or playing the right games... blind.
| VTwedge (5) Feb 27, 2007 - 09:59 pm | Edited on Feb 27, 2007 - 10:00 pm|
|You completely missed the point; that is not what I meant. What I tried to show was that most games today are flashy but without much content, compared to what games used to be when graphics were less advanced but the content was so much better! As for me playing whatever games I want: to each his own opinion, and whether they are the wrong or the right games is up to each person's preferences.|
| Diabolique (1) Feb 27, 2007 - 08:18 am|
|A very good and interesting article. I say this because you point out the pros and cons of the CPU and GPU industries without, well :), taking sides, if you may :)|
You did a very good job on this, although there's always room for improvement.