
It's here, it's out: AMD's new P4 killer


fawwaz

Recommended Posts

First off, I would like to say that technology never stands still. The only reason you have what you have now is that engineers are always striving for more, for better.

It is inevitable that faster processors will come out, along with better memory, hardware, etc.

So I am tired of hearing...

'My computer is faster than yours!'

Like small children.

So let's say they never got faster, and you were wishing for a faster PC, and there was none. What would you do? Get a life? ;-) (like I have one, especially since I got my 4.2's)

I am running dual P3s at 1 GHz with 512 MB of RAM, and I am sure there is something faster right around the corner. But I came from a P2 at 300 MHz with 128 MB of RAM.

So it is a quantum leap for me. If your PC satisfies your needs, then great. Or, if you just want bragging rights, chuck out a couple of grand every year. smile.gif


*sigh* While the T-birds beat the holy hell outta the P4s in most benches, you also have to realize that the benches the T-birds are winning are not optimized for the P4.

This is basically the same thing that's going on with the GeForce 3. In some benches, the older GeForce 2 Ultra will beat the GeForce 3, but then you realize that the tests being run aren't optimized for the technology the newer hardware uses.

While I hate to say it, 'cause I prefer AMD over Intel, in about a year, when software is out that WILL take advantage of the P4's architecture, everyone will be singing a different tune, because the P4 should handily beat out its competition, unless AMD's next-gen CPUs have more punch to them.

So, to wrap it all up in a nutshell: the P4s are about a year ahead of their time, and so are the GeForce 3s; when they DO meet up with a more appropriate market, the doubters will be silenced.


quote:

*sigh* While the T-birds beat the holy hell outta the P4s in most benches, you also have to realize that the benches the T-birds are winning are not optimized for the P4.

While I hate to say it, 'cause I prefer AMD over Intel, in about a year, when software is out that WILL take advantage of the P4's architecture, everyone will be singing a different tune, because the P4 should handily beat out its competition, unless AMD's next-gen CPUs have more punch to them.

Also realize that those benches are not optimized for the Athlon either (3DNow!). rolleyes.gif

That's what people used to say about the AMD K6-2/K6-III line: because of its horrible FPU, AMD relied on 3DNow! to make up for it.

When you say "taking advantage of the P4's architecture," what you really mean is the P4's SSE2 instructions. Intel largely ignored the FPU in the P4's architecture and is relying on software being optimized for SSE2 to get good performance out of the P4. (Sound familiar? *cough*K6-2*cough*) AMD's next processor, the Palomino, which will be coming out this summer, will also have SSE2 instructions, so..... wink.gif
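To make the "optimized for SSE2" part concrete, here's a rough sketch (mine, not from the thread; the function names are made up) of the same loop written once for the plain scalar FPU and once with SSE2 intrinsics, which process two doubles per instruction:

```cpp
// Sketch: the same array sum written two ways. The first leans on the
// scalar FPU; the second uses SSE2 intrinsics (two doubles per operation),
// which is the kind of rewrite "P4-optimized" software needs.
// Function names are illustrative only.
#include <emmintrin.h>  // SSE2 intrinsics
#include <cstddef>

double sum_scalar(const double* a, std::size_t n) {
    double total = 0.0;
    for (std::size_t i = 0; i < n; ++i)
        total += a[i];                       // one scalar add per element
    return total;
}

double sum_sse2(const double* a, std::size_t n) {
    __m128d acc = _mm_setzero_pd();          // two partial sums in one register
    std::size_t i = 0;
    for (; i + 2 <= n; i += 2)
        acc = _mm_add_pd(acc, _mm_loadu_pd(a + i));  // add two doubles at once
    double partial[2];
    _mm_storeu_pd(partial, acc);
    double total = partial[0] + partial[1];
    for (; i < n; ++i)                       // leftover element, if any
        total += a[i];
    return total;
}
```

Compile with SSE2 enabled (e.g. -msse2 on GCC); the point is simply that software has to be written (or recompiled) this way before the P4's SIMD units do it any good.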


quote:

Originally posted by VIV:

so are the GeForce 3s; when they DO meet up with a more appropriate market, the doubters will be silenced.

People said the same thing about the original GeForce when it came out, with its hardware T&L. People said, "it's just ahead of its time; just wait until software that uses hardware T&L comes out.... then the GF will really shine!" Well.... that was over a year ago, and how many optimized games have come out? About three. Not only that, but the GeForce3 now uses a new programmable T&L...... hehe.... which means the original GF and GF2 are already obsolete, because programmers are only going to be programming for the GF3's programmable T&L.

Intel tried this with MMX.... never caught on. Then AMD tried it with 3DNow!, which works very well..... never really caught on. Nvidia tried it with T&L... hardly got any support. 3dfx tried it with their T-buffer... nada. Intel then tried it with SSE..... never really caught on. Now Nvidia is trying it again with the new type of T&L, and Intel again with SSE2 in the P4... hm... if it does catch on, it also benefits AMD, since they will be using SSE2 too. smile.gif
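The common thread in that list is that none of these extensions do anything until software actually detects them and takes the optimized path. Here's a minimal sketch of that detection step (my own illustration, x86 and GCC/Clang specific via <cpuid.h>, using the standard CPUID feature bits):

```cpp
// Sketch: runtime detection of MMX/SSE/SSE2 and 3DNow!, the kind of check
// software has to do before it can take an optimized code path.
// GCC/Clang on x86 only; bit positions follow the standard CPUID layout.
#include <cpuid.h>
#include <cstdio>

int main() {
    unsigned eax, ebx, ecx, edx;

    // Standard feature flags: CPUID leaf 1, EDX register.
    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        std::printf("MMX : %s\n", (edx & (1u << 23)) ? "yes" : "no");
        std::printf("SSE : %s\n", (edx & (1u << 25)) ? "yes" : "no");
        std::printf("SSE2: %s\n", (edx & (1u << 26)) ? "yes" : "no");
    }

    // AMD extended flags: CPUID leaf 0x80000001, EDX bit 31 = 3DNow!.
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
        std::printf("3DNow!: %s\n", (edx & (1u << 31)) ? "yes" : "no");
    }
    return 0;
}
```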


k27-R: the main difference between the GeForce and the GeForce 3, in this case, is that Nvidia and ATI are THE only major players left in the 3D graphics scene, while back in the day there was ATI, Nvidia, 3dfx, and SORTA Matrox. Back then, every company had its own agenda and its own technologies to push, so game developers were all over the place with what they would add to their games. Both Nvidia and ATI are now striving to make their newer products MUCH more compatible with DirectX 8, even to the point of helping MS along with DX8's development. Since the only new cards out there will be designed primarily for DX8, and will offer a lot more options for effects and ways to create worlds for games to exist in, developers are gonna switch over to these newer technologies, because they want their games to look their best and not be left in their competition's dust.

So basically, we can say the industry is going in this specific direction (the way the GeForce 3 is designed to go) because Nvidia made damn sure that choosing alternate routes would mean more work for the developer (it takes longer to do the same thing on older technologies) and a subpar result compared to the newer methods (graphical techniques on the GeForce 3 make games look a LOT better; just look at DOOM 3).
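For what it's worth, here's a rough sketch (not from the thread, and it assumes the old DirectX 8 SDK headers plus linking against d3d8.lib) of how a game would ask the installed card whether it actually exposes the DX8 programmable pipeline before relying on it:

```cpp
// Sketch only: query Direct3D 8 device caps and report the vertex/pixel
// shader versions the driver exposes. Treat this as illustrative of the
// "is this a DX8-class card?" check, not as production code.
#include <d3d8.h>
#include <cstdio>

int main() {
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS8 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // Shader versions are packed as major/minor in the low bytes.
        std::printf("Vertex shaders: %lu.%lu\n",
                    (caps.VertexShaderVersion >> 8) & 0xFF,
                    caps.VertexShaderVersion & 0xFF);
        std::printf("Pixel shaders : %lu.%lu\n",
                    (caps.PixelShaderVersion >> 8) & 0xFF,
                    caps.PixelShaderVersion & 0xFF);
        // A DX7-class card (GeForce/GeForce2) reports 0.0 here; a GeForce3
        // reports 1.1 or better, which is what "designed for DX8" means.
    }
    d3d->Release();
    return 0;
}
```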


quote:

Originally posted by Speed'nGTP:

So I am tired of hearing...

'My computer is faster than yours!'

Like small children.

My computer is faster than yours. HA HA!!!

JK,

J

------------------

JOIN THE "AMY FOR PRESIDENT" CAMPAIGN. OUR PLATFORM:

ALL YOUR BASE ARE BELONG TO US, MUAHAHAHAHAHAHAH!!

PIII @ 1.08GHz

SuperMicro PIIISCD I820 Mobo

192MB Crucial PC100

Maxtor 40GB 7200rpm ata66

Maxtor 40GB 7200rpm ata100

Generic DVD 4X

Creative 8x4x32x Burner

GeForce256 32mb DDR (orig)

3Com 905B-TX Lan

Netgear RT314 Gateway/Router

Philips Acoustic Edge

425w Power Supply

Triple Boot Me/2K/Whistler Build 2416

Klipsch Pro-Media 4.1's

umm... a floppy.

That's it, for now.

MUAHAHAHAHAHAHAH!!!!!

