We've decided to round up the best and brightest motherboards available. And we're not talking Micro ATX, sub-$US100 budgetrino boards here. We reached for the most feature-filled, over-the-top X58 and 890FX boards from the top three mobo vendors.
Want to know how over the top? One board lets you remotely reboot or overclock it using your mobile phone. Another features power connectors usually found only on dual-processor server motherboards. Hell, one has a heat pipe so freaking big some editors here thought it was some sort of new PCI-E add-in card. And one board is so large you'll have to buy a case specifically for its generous dimensions.
So if you're ready to build a machine that will motor you away from those recession doldrums, keep reading, because the best board here will be the one you want in your AMD or Intel machine.
The X58 Reviews
X58: The Final Analysis
With performance essentially equal, it comes down to overall experience.
We know, enthusiasts like to see benchmarks and measurements and numbers. But as we've observed for a long time, performance across boards built on the same chipset rarely sees major variances. That lesson is evident here, where there's no clear performance winner. Each board scored minor victories that were most likely the result of a benchmark's margin of error and/or each board's out-of-the-box overclock. The Gigabyte board, for example, runs its bclock at 134.9MHz, which gives it a slight clock-speed advantage. Still, all the boards are fast.
In the overclocking department, we didn't try to wring each board to its fullest potential manually, as that's dependent on the individual overclocker. We did, however, test how each board handled automatic overclocking. Interestingly, all three applied pretty safe automatic overclocks, taking our 2.8GHz Core i7-930 to the 3.33GHz Core i7-975 Extreme Edition range without fail. Of course, everyone knows that's a pretty wimpy feat. All three companies are simply being realistic. Folks who use the automatic tools will be happy with what they get, but anyone who buys a board designed to boot with frigid liquid nitrogen is going to overclock manually.
So what this comes down to are features and the setup experience. Surprisingly, with the amount of engineering and qualification that goes into the top-tier boards, not everything is perfect. The Gigabyte X58A-UD7 was probably the trickiest. Out of the box, with the latest public BIOS and a retail Core i7-930, the board kept falling back to a 15x multiplier, which made our 2.8GHz chip a 2GHz chip. And no, it wasn't in SpeedStep mode. That won't trip up an enthusiast, but Joe 12-pack might not know he's underclocking a chip. Only manually setting the multiplier to 22x gave us the right clock speed.
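The multiplier fallback matters because a CPU's core clock is just base clock (bclock) times multiplier. A quick sketch of the arithmetic (the 133.33MHz bclock and 21x stock multiplier are the i7-930's nominal values, not measurements from our test board):

```python
# Core clock on LGA1366 CPUs = base clock (bclock) x multiplier.
BCLOCK_MHZ = 133.33  # nominal stock bclock for a Core i7-930

def core_clock_ghz(bclock_mhz: float, multiplier: int) -> float:
    """Effective core clock in GHz for a given bclock and multiplier."""
    return bclock_mhz * multiplier / 1000

stock = core_clock_ghz(BCLOCK_MHZ, 21)     # 21x is the i7-930's nominal stock multiplier
fallback = core_clock_ghz(BCLOCK_MHZ, 15)  # the 15x the board kept falling back to

print(f"stock:    {stock:.2f} GHz")     # 2.80 GHz
print(f"fallback: {fallback:.2f} GHz")  # 2.00 GHz
```

Same board, same chip: the lower multiplier alone silently costs you roughly 800MHz, which is exactly why a novice who never checks the BIOS could run underclocked indefinitely.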
The Gigabyte's ET6 utility also kept tripping Windows 7's UAC prompt on each boot. Another kvetch about the Gigabyte board: It's qualified for tri-SLI and includes a bridge, but you will need a special case to accept the last card. Both MSI's and Asus's tri-SLI configurations should fit in most standard enclosures.
Not that the MSI and Asus boards were without fault. As we noted above, MSI's default power configuration was plain wacky. Requiring a user who has just spent a ton of cash on a top-tier board to enable S3 and tweak two power settings to enable "wake on USB" seems wrong. Granted, at $US300 on the street, MSI's board is the cheapest of the three here. And we do dig the Big Bang's PCI-E layout and surface-mounted controls.
As for the Rampage III, Asus needs to send its north-bridge fan design back to the drawing board. Besides not clearing large coolers, the fan is shrill and prevents you from reaching the top GPU latch with your fingers. And how 'bout another USB header? The other two boards here pack two USB headers for case front-panel ports, but Asus only gives you one.
In the end, though, those are pretty minor complaints. It was a very close competition between Asus's Rampage III Extreme and MSI's Big Bang-XPower, but the RC Bluetooth mode and out-of-the-box flawless setup give the Rampage III Extreme the edge.
The 890FX Reviews
890FX: The Final Analysis
Does AMD have the moxie to run an enthusiast platform?
We have to give credit to the AMD faithful for sticking it out with the underdog, for what has quite frankly been a bumpy ride. Some of the best times date all the way back to the Barton (Socket A/462) era, in which even high-end boards, like the Asus A7N8X and Abit NF7-S, could be yours for about $US100. After that, Socket 939 reigned supreme in Sunnyvale, but not before AMD pissed people off with its stopgap Socket 754 and 940 platforms, both of which had barely left the assembly line before reaching obsolescence.
Here we are on stable ground again. Chalk it up to remarkable engineering or just plain good planning, but while Intel has been busy juggling sockets, AMD has made the most out of its AM2, AM2+ and AM3 platforms with a staggering amount of backward compatibility. The bigger problem for AMD has been the chipset, and most notably, the wonky AHCI support. That's been the case even as recently as 790FX, but AMD appears to have finally figured things out with its 890FX chipset. We ran all three boards in AHCI mode, and while we did run into a single hiccup, we can't definitively blame it on AHCI.
So where does that leave these three enthusiast boards? If we're basing our opinion on performance alone, Gigabyte's GA-890FXA-UD7 walks away with the crown in the narrowest of victories. But for the most part, there really isn't a whole lot that separates these boards in terms of benchmarks other than bragging rights, and even then, is pulling in 155 frames per second in HAWX really worth gloating over with the competition scoring 154.3 (Asus) and 153.3 (MSI)? If it is, then by all means, grab the Gigabyte board and make sure you pick up a chassis that can accommodate XL-ATX form factors while you're at it - you won't be cramming Gigabyte's board into your mid-tower.
We're more enamoured with the Asus Crosshair IV Formula and MSI 890FXA-GD70. The Crosshair IV wins on sex appeal, and again, if you're splitting hairs over benchmarks, then this time the nod goes to Asus. Throw in the gold-plated audio inputs, Q-Connector, SupremeFX X-Fi module and enough fan headers to generate a tornado, and you're left with one helluva mobo. So how did MSI earn a kick-arse award?
To start with, MSI managed to cram one more PCI-E slot onto its board than Asus did and also found room for an IDE port. Sure, the Crosshair IV comes with an additional PCI slot, but now that even soundcards ship in PCI-E form, is anyone even using PCI anymore? Anyone? And while both boards nailed the layout, we like that the SATA ports sit a tad lower on MSI's mobo, keeping them away from overhanging video cards.
Finally, we have to give MSI props for its OC Genie. It took a single button press and a 20-second reboot to supercharge our 955BE by almost 700MHz. And did we mention MSI's board is the least expensive?
Dare to Compare
Maximum PC Reads the Motherboard Tea Leaves
For the record, neither AMD nor Intel has told us squat about what to expect out of their next-generation chipsets. Instead, we sifted the internet rumour mill and grilled motherboard vendors for what little info we could get.
First up, Intel. There have long been rumours of an X68 chipset and new ICH11 south bridge. Those rumours, however, are likely wrong. The chipset is shaping up to be a minor update of the current X58. Expect native SATA 6Gb/s support and possibly more PCI-E lanes. We wouldn't expect USB 3.0, though. Rumoured to be included in ICH11, USB 3.0 is looking more like it won't be integrated until late 2011.
More important for Intel is the P65 chipset. The company badly needs to update the P55 chipset, which is finding itself starved for bandwidth now that SATA 6Gb/s and USB 3.0 components are arriving. Intel is unlikely to maintain compatibility between its upcoming Sandy Bridge CPUs and its current crop of chipsets. There's already talk of an LGA2011 socket for Sandy Bridge that, obviously, is incompatible with LGA1366. The LGA1156 will likely get left out in the cold too.
AMD's plans are a bit murkier and more secretive. With an integrated GPU, AMD's upcoming Fusion chips are almost certain to require a new socket and a new chipset. Of course, the big question is will current AM3 boards support AMD's upcoming Bulldozer core? At this point, we're going to give it a 50/50 chance. AMD has been fairly wonderful at giving its users an upgrade path, so it's likely the company could cash in on some of that good will by introducing an updated socket. We can usually count on Intel to make you buy a new board, though. Sigh.
Don't Expect to Mix AMD with Nvidia Multi-Card Configs
Viewed from the comfort of today's X58 platform, the multi-GPU war seems like a hundred years ago. Unfortunately, AMD users continue to suffer in the ongoing war between ATI and Nvidia.
Today, if you want to run an Nvidia multi-card configuration, you buy an Intel board. If you have an AMD board, your only multi-GPU option is CrossFireX. Of course, it's not that Radeon HD cards are bad; in fact, ATI's resurgence with the award-winning Radeon HD 5870 and 5970 cards has many satisfied AMD users.
But still, as capable as the 890FX boards are, why can't you run Nvidia cards if you want to? Is it a technical problem?
Neither company would say, but we're certain it's not. We've seen a clear pattern where you could run either brand of cards in a multi-card setup on any chipset and with any CPU, provided there is enough PCI-E bandwidth.
To try to shed some light on the subject, we attempted to pry info from AMD and Nvidia as to the reason for the hold-up. Unfortunately, we weren't very successful. Nvidia provided us with a terse response: "We have no plans to support SLI on motherboards using AMD chipsets." And AMD was no better. It's apparently quite happy for its customers who want to run two or more cards to have ATI logos on them.
Board vendors aren't so happy, though. Those we spoke with said they've looked at options to get SLI running on AMD boards, and one even said its engineers have tested hacked SLI on an 890FX board to see if it works - and it works just fine.
Unfortunately, there's nothing to force anyone's hand here. When Nvidia faced a situation where Intel chipset users were choosing CrossFire over SLI, the company caved and started allowing board vendors to include SLI support. With AMD's much smaller market share, it appears that the situation won't change until one company blinks.
Where's Native Support for the New, Faster USB?
Every motherboard we reviewed here features SuperSpeed USB 3.0, but none has true native support. The blame lies with Intel and AMD, because neither has yet added support for the latest USB spec in their respective south-bridge chips.
To get around this limitation, motherboard vendors have tapped discrete USB controllers from such vendors as NEC to hit those super-fast transfer speeds over USB. That's good enough to get a USB 3.0 logo on the motherboard box, but it's not good enough for true enthusiasts who don't want to be limited to a mere two ports.
So why the hold-up? It didn't take this long for USB 2.0, did it? Actually, it practically did. USB 2.0 launched in early 2000, with most mobo vendors integrating NEC chips for USB 2.0 support. It wasn't until two years later, when Intel launched its ICH4 south bridge with the DDR-based 845E, that USB 2.0 became truly integrated.
OK, so maybe we're just being impatient, but we wanted to hear from the chipset makers why the much-requested feature wasn't on tap for this year. AMD's explanation was that it was one of the features that didn't make the priority list when the 890FX (and its accompanying SB850 south bridge) was on the drawing board.
Intel said it's following the game plan it used with USB 2.0: The spec is finalised, discrete controllers are released and integrated into boards, and then when there's enough actual hardware out there that needs it, the company will add native support.
The company also refuted tin-foil hat theories that Intel was intentionally sandbagging USB 3.0 in order to push its upcoming Light Peak optical technology. "Light Peak does not compete with USB 3.0. The first USB 3.0 products started to appear in the market in 2009, with a volume ramp expected to begin in 2010, using discrete controllers," the company told Maximum PC. "We see Light Peak and USB 3.0 as being complementary, as Light Peak enables USB and other I/O protocols to run together on a single, longer cable and at higher speeds in the future. We expect both to exist together in the market and on the same platform at the same time. The Light Peak initiative does not signify any change to Intel's direction on USB 3.0 or any other existing I/O efforts."