Page 1 of 1 [ 4 posts ] 

Enigmatic_Oddity
Veteran


Joined: 4 Nov 2005
Age: 39
Gender: Male
Posts: 2,555

09 Apr 2008, 4:06 pm

Out of curiosity: with all the newest CPUs being released these days being multi-core, is it too far a stretch to think that a future technology could be a CPU that has some of its cores act as the GPU, with software emulating what the GPU would normally do? This could be a way for people without graphics cards to play games with decent graphics without having to shell out for extra hardware; they could just buy a software pack that enables the extra functionality.

I realise a software implementation of a graphics processor would be slower than a hardware-based solution, but it could suffice for a lot of budget users wanting something better than current integrated graphics.



computerlove
Veteran


Joined: 10 Jul 2006
Age: 124
Gender: Male
Posts: 5,791

09 Apr 2008, 7:30 pm

Both things are happening: many if not all computers already have an on-board graphics chipset, which takes RAM from the system to allocate for graphics. Intel makes those.
The opposite is also happening. I remember last year that ATI released, or was going to release, a GPU with a CPU included 8O


_________________
One of God's own prototypes. Some kind of high powered mutant never even considered for mass production. Too weird to live, and too rare to die.


Betzalel
Deinonychus


Joined: 22 Feb 2008
Age: 45
Gender: Male
Posts: 317

09 Apr 2008, 8:06 pm

All GPUs are really just specialized CPUs anyway, and there are some graphics boards that do have general-purpose CPUs on them: the Sun XVR-1000 has a MAJC CPU that does the graphics processing.

see http://everything2.com/index.pl?node_id=1889447 for more details.

As far as doing graphics acceleration on the CPU die goes, that's what the MMX, SSE and SSE2 instruction sets were originally intended for: accelerating multimedia functions using the system's CPU. I don't find it unreasonable that some graphics routines might be pushed back onto the system's CPU in the future on lower-end gear.


But you would still need a graphics chip that can push a lot of bandwidth to the screen quickly, and it would still need some rudimentary graphics acceleration primitives in hardware; otherwise, even if the system CPU were really fast, the graphics chip wouldn't be able to push enough pixels to make it really matter.


I think separating the functionality away from the CPU as much as possible is better anyway. It's tempting to do stuff like that in software, but with hardware as nice as it is today, I think it's a better policy to focus on making good graphics cards with their own GPU. You could still push certain 3D functions onto the CPU if the card doesn't provide them, by having the abstraction layer implement optimized software routines for the 3D processing when it doesn't detect hardware support. I would be surprised if DirectX doesn't already do this.



pakled
Veteran


Joined: 12 Nov 2007
Age: 67
Gender: Male
Posts: 7,015

09 Apr 2008, 8:38 pm

If that were true, you'd see AMD buying up ATI and... whoops... they did... ;)