It's for the memory. You need a wider bit interface to make full use of the card's memory. For example, a card with 512MB of memory will generally have a 256-bit interface. If it had a 128-bit interface, it would only be able to utilize 256MB of its total 512MB of memory. The bit interface always has to be exactly half the entire memory to make full use of the total memory the GPU has.
Not wanting to start an argument, but that was not my understanding of the function of the memory interface. All of the memory will be used; the interface is, simply put, the bus over which the GPU and the memory communicate. A 256-bit interface can transfer information to and from the GPU at about twice the rate of a 128-bit one, meaning more data can be written to and read from the memory faster. That is why cards with a wider memory interface tend to perform better with options such as anti-aliasing enabled than cards with a narrower one. I look at it like a highway, and the info is the cars: you can get all the cars from point A to point B either way, but it's faster with 4 lanes than with 2.
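To put rough numbers on the highway analogy, here's a minimal sketch of the standard peak-bandwidth arithmetic: bus width in bits divided by 8 gives bytes per transfer, multiplied by the effective memory data rate. The 1600 MHz effective clock below is just an assumed example figure, not the spec of any particular card.

```python
# Back-of-the-envelope peak memory bandwidth:
# (bus width in bits / 8) bytes per transfer, times effective data rate.
def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = effective_clock_mhz * 1e6
    return bytes_per_transfer * transfers_per_second / 1e9

# Same (hypothetical) 1600 MHz effective memory clock on both buses:
for width in (128, 256):
    print(f"{width}-bit bus: {peak_bandwidth_gbs(width, 1600):.1f} GB/s")
# 128-bit bus: 25.6 GB/s
# 256-bit bus: 51.2 GB/s  <- exactly double, like going from 2 lanes to 4
```

Either way the card can address all of its memory; the wider bus just moves the data twice as fast at the same clock.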
And on the topic of this thread, I picked up the XXX version of the 8800 GS that Urza picked up (basically the same card, just clocked higher out of the box), and it is very nice, especially considering the price. Running two in SLI would probably give all the performance anyone needs at the moment.