http://www.trustedreviews.com/motherboa ... -DG45ID/p1
I will be using this board very soon and will migrate over to XITE-1 when the time arrives, but I was hoping someone could elaborate a little on the onboard graphics.
This is not targeted at gaming enthusiasts but rather at media centers and smaller setups.
Back when Supermicro's 1U servers were ruling the server market w/ their onboard graphics solutions, I am sure that this would have interfered w/ 32-bit PCI bandwidth, etc. But maybe I could dump my eVGA GeForce PCI-e 16x card and go with the onboard solution.
I could try it out and then use the PCI-e 16x just in case, but I am anxious to hear if anyone here has ever tried out Scope w/ an onboard graphics solution.
Ankyu,
Intel GMA X4500HD Graphics
Re: Intel GMA X4500HD Graphics
I have always used separate peripherals, but with the new CPUs and lower power requirements I am starting to think along those lines. I have already eliminated the need for a high-watt PSU, and the 10K HDDs, along with their heat, are externally located too.
I am using a really old DAW right now that uses an ancient Matrox G450 w/ 16 MB of RAM. It has been running for 7 years and still does screen redraws w/o a problem.
I am thinking that having XITE-1's PCI-e connector as the only connection might have some low-power benefits, and definitely better airflow.
Re: Intel GMA X4500HD Graphics
Integrated graphics should work fine. Just remember that graphics will use system RAM instead of onboard (card) RAM for video...
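If you want to sanity-check how much memory the integrated GPU is claiming from system RAM, a quick sketch like this should work on a Windows box (assuming the stock wmic tool is present; the reported AdapterRAM figure varies by driver and BIOS setting):
[code]
import subprocess

# Query the video controller through WMI (wmic ships with XP/Vista/7).
# AdapterRAM is reported in bytes; with integrated graphics that memory
# is carved out of system RAM rather than living on a separate card.
out = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController", "get", "Name,AdapterRAM"],
    text=True,
)
print(out)
[/code]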
Re: Intel GMA X4500HD Graphics
I will be a Guinea Pig 4 this chore and see if there are benefits.
I thought that whole Northbridge issue was resolved w/ a memory controller using DMA access. This might be a useful test, and one beyond my skills. I can't really trust the builder I use here in LV for a comparison in Scope.
Cool...now I have a reason for GaryB to come and visit again.
Only this time you will stay in a 5th Wheel near the Walker River marshland.....spawning ground for the German Brown Trout. I'll pan fry ya' some fresh trout. I can leave you w/ Dosier my hunting Dog to ward off Indian attacks and drunken Cowboys. They'll hear the Reggae music and come a runnin'.
The whole area is an Alpine paradise, you'll love it. A one hour round trip plane ride for a buck sixty.......a sore dick deal.
Tyan built some phenomenal onboard-graphics boards that many pros used with their dual 64-bit Opteron DAWs. But AMD's on-die memory controller was a far better design than Intel had.
Re: Intel GMA X4500HD Graphics
We hoped to at least have an awesome video processor, but that doesn't appear to be the case, either. With any luck, driver updates can at least improve this situation by improving hi-def deinterlacing and adding some good noise reduction.
Granted, ATI and Nvidia's integrated GPUs are also quite slow, but are still faster than this and deliver better video processing, too. While we're at it, we should mention the terrible video control panel for Intel integrated graphics. It's straight out of 1998 in both design and functionality. If Intel is going to get serious about graphics—especially high-end graphics with the Larrabee project—it's going to need to work on the features and design of its control panel software.
We like the small size, energy efficiency, numerous SATA and USB ports, and HDMI output of this little microATX board. But when it comes to performance, we simply can't recommend it, even for the budget-conscious.
Product: Intel DG45ID Motherboard
Company: Intel
Price: $120 (street)
Pros: Lots of SATA ports, lots of USB ports, HDMI output, 7.1 audio.
Cons: Poor graphics performance, poor video acceleration, mediocre system performance.
Summary: A good feature set can't save this motherboard from its anemic integrated graphics and poor overall performance.
Copyright (c) 2008 Ziff Davis Media Inc. All Rights Reserved.
_______________________________________________________________________________
It appears as though audio could be compromised, according to this review. If it introduces noise and has an HDMI connector, that's not very kosher IMHO. Maybe this little puppy might work.
http://www.evga.com/products/pdf/128-P1-N309-LX.pdf
It appears I could program XITE-1 on a 42" HDTV at home, taking advantage of much larger project windows. I won't need to edit while doing a gig, but for initial programming this would be a welcome addition. Its 2.4" height is definitely low profile too.
My video guru in Las Vegas says the separate bus used for the PCI card would pose no threat to the PCI-e bandwidth needed for XITE-1.
This was made for Vista according to eVGA.
Re: Intel GMA X4500HD Graphics
Personally I would look at the Nvidia 9500GT or 9600GT models from Evga (or Foxconn, BFG or XFX). They don't usually generate much noise or heat in normal 2D operation, but you can always fit them with an aftermarket cooler as well (or the 'silent' options from Gigabyte and Asus). I have an 8800GT (considerably larger and warmer than a 9500/9600GT) that I put an Accelero S1 (rev2) on, and it brought it down from 65C to 40C at idle, and 78C to 45C when actually playing games. There's a clip-on fan option for that which brought temps down even more, and is pretty much still inaudible. I've since replaced it with yet another card but just wanted to share the experience.
As a bonus with these cards you'll get H.264 and other modern codec support (for instance 14% CPU usage instead of 80% of 2 cores when playing back Blu-ray), plus dual outputs with at least 1 dual-link DVI output per card. DVI can be converted to HDMI with a simple converter, though I must say that when I do that I prefer to put the converter in a location where I can give the cable some support, as you wind up with 8 inches of dongle/cable hanging off the back of the computer or monitor (depending on where you put the converter).
Unfortunately ATI hasn't released the low-end 4xx0 line yet (to complement the 4850/4870-class GPUs), or I would recommend that as an alternative as well.
Also I wanted to mention a huge drawback to using HDTVs for computer work: you can rarely hit the native resolution of the TV's panel. The best-case scenario is that (with a modern card) you're able to hit the native resolution of the video processor inside the HDTV.
For instance 720p is theoretically 1280x720, but most HDTV chips present themselves as 1360x768 (roughly a megapixel worth of pixels) and then actually USE 1440x900 LCD panels. For 1080p the HDTV chip will work at 1920x1080 (and reveal this to the computer), but the panels themselves are usually 1920x1200 (same as with PC monitors) or higher. Again this is due to the size of the buffer on the DSP chip, and since it's inserted in line you won't actually ever hit 1920x1200 even if you try to force it (the HDTV chip will downscale the input to 1920x1080 and then upscale it again to fit the full panel).
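To put rough numbers on those resolutions, here's a quick back-of-the-envelope sketch (illustrative only; actual panel and DSP behavior varies by TV model):
[code]
# Pixel counts for the resolutions mentioned above (illustrative only).
modes = {
    "720p signal":      (1280, 720),
    "HDTV chip (720p)": (1360, 768),
    "1440x900 panel":   (1440, 900),
    "1080p signal":     (1920, 1080),
    "1920x1200 panel":  (1920, 1200),
}

for name, (w, h) in modes.items():
    pixels = w * h
    print(f"{name:18s} {w}x{h} = {pixels:,} pixels (~{pixels / 1e6:.2f} MP)")
[/code]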
It's confusing, but with a modern 1080p monitor the slight vertical stretching that occurs when the computer is set to 1920x1080 is not nearly as noticeable to most people as it is to me. For movie/TV work this is acceptable because the HDTV DSP chips usually apply a variety of post-processing techniques, including sharpening. Personally I'm using a 27" computer LCD these days, which saves the eyes as it's the same resolution (1920x1200) as a 24" LCD but with larger pixels. I will upgrade to a higher-res, larger LCD when OLED displaces the fluorescent tubes in the backlighting for LCDs.
Re: Intel GMA X4500HD Graphics
Don't get ASUS video cards. Although their motherboards are usually great, I have had 3 ASUS video cards die, and only 1 other brand (it was an Nvidia own-brand Quadro 4600, at work).
Re: Intel GMA X4500HD Graphics
I agree with that, and I think Gigabyte is rather sucky these days as well. The good companies would be evga/xfx/foxconn/bfg, as they have better-than-average support & lifetime warranties, but unfortunately the good companies don't tend to put aftermarket coolers on until you move up the line of cards (BFG has some Accelero S1 8800GTs & 9800GTXs). With a 9600GT/9500GT used in Windows I don't think it's much of a problem, but the cards do run rather warm with low fan settings.
Re: Intel GMA X4500HD Graphics
That's where everything is heading in 2010. Low power, higher density.
Intel is taking a beating on every review of their graphics accelerator, and in their sales of motherboards which use them. Supermicro, Tyan, and Asus all have some fantastic server boards that use the XGI Z9s with 32 MB of DDR2 memory, which smokes the Intel solution. So it appears that Intel had banked on an in-between solution that nobody wants.
This is good news actually: as Intel continues to fail in this area, by 2010 their new onboard graphics solution should finally be a real alternative to more expensive graphics controllers.
They have already started using their high-quality silicon to enter the SSD market, which makes everyone, even Samsung, nervous. When they put their R & D budget to good use, their designs are tough to beat.
http://www.supermicro.com/products/moth ... 7DCA-L.cfm
Here's a mobo using an XGI Z9 that would be a serious contender to those Dual Quad MacTels. With 6 x 2GB ECC DDR2 and Vista, this board could stream massive content while mastering w/ a large number of quality plugs. Imagine this w/ 2 x Quad Xeons and a 1U XITE-1 rack... Holy Schmolly, Batman.
I actually was contemplating using this, but since I am an addict of DSP, not VST-based apps, it seemed totally unnecessary since XITE-1 is capable of yielding this amount of power all by itself. But for VST-based audio/MIDI apps it could be a sweet option.
Re: Intel GMA X4500HD Graphics
Actually that board uses the 5100 chipset, which is a hobbled (cut-down) version of the 5400 chipset. It's nice that it actually supports ECC DDR2 rather than the (HOT!) FB-DIMMs required by the other Xeon chipsets, but this limits it to dual channels for RAM and results in memory bandwidth issues when you're using 8 cores on 2 sockets.
I've got the Supermicro X7DWA-N, which has the 5400 chipset for 4 channels of RAM and up to a 1600MHz frontside bus. The 1600MHz Xeons were over $1200 each for the cheapest model when I built this, so I'm running 8GB of 667MHz RAM (4 x 2GB FB-DIMM sticks) and E5430 Xeons (1333MHz FSB, 12MB cache, quad-core). The sad thing about Xeons is that Intel changes that platform more than the consumer line, so you're lucky if the Xeons for next year are compatible (even if they're pin-compatible, Intel has a habit of changing VRM specs etc). It's not something I'd recommend to the average user who wants to upgrade every 9-12 months, as the cost is nearly double a normal system. Sure does speed up 3D renders though, and I rarely even see my CPU meter hit 40% when doing audio work.
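To put rough numbers on the dual- vs. quad-channel difference, here's a simple theoretical-peak calculation (a sketch only; it ignores FB-DIMM overhead and real-world efficiency, and assumes standard 64-bit channels):
[code]
# Theoretical peak memory bandwidth (illustrative only).
def peak_bandwidth_gb_s(data_rate_mt_s, bus_width_bits=64, channels=1):
    """data_rate_mt_s: effective transfer rate, e.g. 667 for DDR2-667."""
    return data_rate_mt_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

print("DDR2-667, 2 channels: %.1f GB/s" % peak_bandwidth_gb_s(667, channels=2))
print("DDR2-667, 4 channels: %.1f GB/s" % peak_bandwidth_gb_s(667, channels=4))
[/code]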
Also note that my Scope cards are in my OLDER Xeon system here (Supermicro P4DC6+) which is still netburst/Prestonia (p4 era cores).
- Attachment: X7DWA_spec.jpg (10.9 KiB)
Re: Intel GMA X4500HD Graphics
Don't you hate how ya' have to copy the pic from the crappy datasheet.
That's a hefty EATX ya' got there, Valis. One of the main reasons I want a large-capacity hardware sampler (or even a large-memory-addressing STS5000) is so I can use a cheaper CPU/mobo combo and stop this every-2-year upgrade chasin' I seem to need, only because of VSTi's and sampler apps. I still have an ancient DP DAW w/ the Supermicro P4SCT+II and 2GB of RAM that runs the old Gigastudio 160 2.54, and it still runs great.
The DSPs don't draw any power, and Giga was great back then as it didn't have to please the VST crowd, so it was a lean, mean playback machine. Just a meager P4 3.2 Northwood. Before that it had a PIII Tualatin w/ 512k L2 @ 1.4GHz, and that was upgraded from plain old boredom.
XITE-1 just might be a home run for me, and last for years on a simple design w/o all of this Quadruple Octa Quad nonsense that we have all been suckered into.
I bet that 3D rendering is tons of excitement. An A/V pro buddy of mine who builds great audio DAWs as a hobby uses a serious 3D app on a MacTel dual quad, and his wife bitches at him so much for working w/ his apps that I got him a box of disposable sponge earplugs for his BDay last month.

Re: Intel GMA X4500HD Graphics
I definitely can't see a need for a Xeon box for Xite. I build mine for graphics/3D/programming/audio/gaming/etc, and I prefer a 3-4 year upgrade cycle (boards in this class won't do straight-line benchmarks as well in 3+ years, but the overall 'backbone' of the system bus is much more expandable and runs smoother/has more headroom than a consumer board does).
It wasn't until WinXP that I could actually do all of that under the same OS or on workstation hardware; there was a time when the hardware I ran wouldn't even work properly under the consumer OSes that music & gaming required, so I needed dedicated machines for those tasks. I still have multiple machines in operation, but it's not as critical as it once was.