If you like to capture your PC gameplay but don't care enough to use serious software like XSplit or OBS, Nvidia's GeForce Experience has some pretty decent gameplay recording — with one big flaw. Up until now, at least: it's finally getting a much-needed update that lets you balance in-game audio against your mic, with separate track recording.
A couple of weeks ago I was braving the big crowds of E3 to meet with the Nvidia team, and while I was ostensibly there to check out Destiny 2 on a PC, what I really wanted to know was what the hell Max-Q Design was. Nvidia announced its new design philosophy back in May, and I'd spent the intervening weeks unable to shake the sense that this was all just a great big marketing ploy — an acknowledgement that Nvidia's most powerful GPUs often end up in great big computing monstrosities.
Nvidia's April Fool's prank was actually pretty funny — a USB drive, "packed with deep learning algorithms", that'll game for you when you're busy or eating pizza. That USB drive — the storage bit, not the crazy AI bit — is real, and you can get one. Here's how.
Buying a gaming monitor has always been a bit like Australian broadband. You could have really nice image quality, 4K and HDR support, a 120Hz or 144Hz refresh rate, or plenty of screen real estate, but you couldn't have it all, especially if you wanted it to be affordable. And even if you're prepared to spend a pretty penny, chances are you'll still have to compromise somewhere.
You couldn't have it all in a gaming monitor. Well, that used to be the case.
When you think gaming and laptops, images of massive, dictionary-thick machines come to mind. Notebook computers purpose-built for PC gaming are only barely portable, but Nvidia wants to change that with a new approach to hardware and software design for laptops called Max-Q, which lets mobile gamers have their cake and easily carry it too.
Gaming laptops have become much, much better over the last couple of years — faster, lighter, more powerful, with better battery life. But they can be better still, Nvidia says: slimmer and lighter again, with smarter energy usage whether they're plugged in or on battery, and — most importantly — quieter while gaming.
For the past five years, Nvidia has been building itself a shiny new headquarters in the heart of Silicon Valley. When it is completed at the end of this year, the 500,000-square-foot structure will house up to 5000 employees across two floors. The site has been specifically designed to encourage collaboration, with large congregational areas, open-plan offices and staircases to enable chance encounters. During GTC 2017, we were given a sneak peek inside the building, which remains a work in progress. Here are the photos.
Announced at GTC 2017, the Tesla V100 is an enterprise-level processor powered by the Volta GV100 GPU: the first chip in the world built with a 12nm FFN process. A single Volta GV100 packs in 21 billion transistors, 5120 CUDA cores, 320 texture units and a 4096-bit HBM2 memory interface with a boost clock speed of 1455MHz. It's equipped with 640 Tensor Cores capable of providing 120 teraflops of tensor operations. (And yes, it will totally play Crysis - one day.)
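As a back-of-envelope check on those numbers: each Volta Tensor Core performs a 4×4×4 matrix multiply-accumulate per clock — 64 multiply-adds, or 128 floating-point operations — per Nvidia's published figures, so the quoted 120 teraflops follows almost directly from the core count and boost clock. A quick sketch of the arithmetic in Python (spec figures from the announcement; the rest is just multiplication):

```python
# Back-of-envelope throughput check for the Tesla V100's quoted figures.
# Spec numbers from Nvidia's Volta announcement at GTC 2017.

TENSOR_CORES = 640
CUDA_CORES = 5120
BOOST_CLOCK_HZ = 1455e6           # 1455MHz boost clock
OPS_PER_TENSOR_CORE = 128         # 4x4x4 FMA = 64 multiply-adds = 128 flops/clock

# Tensor throughput: cores x clock x ops-per-clock
tensor_tflops = TENSOR_CORES * BOOST_CLOCK_HZ * OPS_PER_TENSOR_CORE / 1e12
print(f"Tensor throughput: ~{tensor_tflops:.0f} TFLOPS")   # lands on the quoted ~120

# Ordinary FP32 throughput from the CUDA cores, for comparison
# (each core retires one FMA, i.e. 2 flops, per clock):
fp32_tflops = CUDA_CORES * BOOST_CLOCK_HZ * 2 / 1e12
print(f"FP32 throughput:   ~{fp32_tflops:.1f} TFLOPS")
```

The gap between those two figures — roughly 15 TFLOPS of general-purpose FP32 versus ~120 TFLOPS of tensor operations — is the whole pitch for Tensor Cores: fixed-function matrix hardware for deep learning, not a general speedup.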
Whether you like it or not, autonomous cars are coming - and Nvidia just made it a lot easier for manufacturers to jump on the self-driving bandwagon. The company's Deep Learning Institute (DLI) is offering advanced hands-on courses to aid in the development of autonomous vehicles. Any company with enough expertise and money can now build one. Ulp.
If you have any interest in PC gaming, you've likely heard about two competing technologies from NVIDIA and AMD called G-Sync and FreeSync respectively. Both are designed to eliminate screen tearing, which happens when the frames your video card pumps out arrive out of sync with your monitor's refresh cycle. If you've been looking for a definitive comparison, look no further than this opus from Battle(non)sense.
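To make the tearing mechanism concrete, here's a hypothetical little simulation — the frame times and panel timings are illustrative, not measurements. A fixed-refresh monitor scans out on its own schedule no matter when the GPU finishes; a frame that completes mid-scan-out shows up as a tear line, with the old frame above it and the new one below. Adaptive sync (the idea behind both G-Sync and FreeSync) sidesteps this by holding each refresh until the frame is ready.

```python
# Illustrative sketch of why tearing happens on a fixed-refresh panel.
# All numbers here are made up for demonstration, not benchmarks.

REFRESH_HZ = 60
refresh_period = 1.0 / REFRESH_HZ
scanout_fraction = 0.95   # panel spends ~95% of each cycle drawing; the rest is vblank

# Hypothetical GPU frame-completion times (seconds), an uneven ~75fps.
frame_times = [i * 0.0133 for i in range(1, 20)]

tears = 0
for f in frame_times:
    # Where within the current refresh cycle did this frame land?
    phase = f % refresh_period
    # A buffer swap during active scan-out splits the screen between two
    # frames - that boundary is the visible tear. Swaps during vblank are safe.
    if phase < refresh_period * scanout_fraction:
        tears += 1

print(f"{tears} of {len(frame_times)} frames land mid-scan-out on a fixed 60Hz panel")
# With adaptive sync the monitor refreshes when each frame completes,
# so every swap effectively happens at a refresh boundary and nothing tears.
```

The takeaway matches the article's framing: with frame delivery and refresh running on unrelated clocks, most frames land mid-scan-out, which is why tearing is the default rather than the exception.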
These days, 1080p is so passé. I don't even get out of bed for anything less than 1440p. But 4K, now, that's where it's at. My new TV is 4K, my next monitor will probably be 4K. 4K is the future, for everything from Netflix to gaming. But gaming at 4K requires a gutsy PC, and that means investing in some top-of-the-line hardware. Want to play the latest games at 4K? Nvidia has got you covered with the GeForce GTX 1080 Ti, a graphics card with a significant jump in power from even last year's already-barnstorming GTX 1080.
The PC gaming world has a new king of graphics. Nvidia's new top-of-the-line GeForce GTX 1080 Ti handily beats the $800 GeForce GTX 1080 that we already love, bringing the lion's share of power from the $1600 developer- and supercomputing-friendly Titan X to a slightly more affordable graphics card.
It's almost Time. That's what Nvidia is telling us in preparation for its keynote at GDC 2017, and it doesn't take a genius to work out that it's going to take the opportunity to introduce a new, top-of-the-line consumer graphics card to replace the powerful GTX 1080 — unsurprisingly called the GTX 1080 Ti.
The original Nvidia Shield looked cool and had some neat ideas behind it, but its cost and use of the neglected Android TV operating system left the set-top box/console fusion feeling more like Frankenstein than a legitimate answer to the Roku, PS4 or Xbox One. A major software update and some much-needed changes to the system peripherals have turned the Nvidia Shield into a worthy set-top box choice — especially if you're looking to play back 4K HDR content.
In front of thousands, the pitch sounded good. Bring PC gaming to the hundreds of millions who can't play it, or haven't experienced it before. It's a sensible, reasonable goal for a publicly listed company like NVIDIA to aim at. And the idea of putting a gaming PC in the cloud has a certain logic to it.
Problem is, we've been here before. It didn't work. And even if the streaming technology was sound, it still wouldn't work for Australians.
When I was a kid, self-driving cars were the sci-fi future. They were the stuff of Isaac Asimov's Sally and the Johnny Cab from Total Recall. I didn't actually think that they'd ever happen — the concept itself was a long way from reality, a lot more fi than sci. But smarter brains than mine, with the help of some surprisingly old-school tech, have built cars that can drive on everyday roads.
I took a short trip in one, and it was normal. Normal to the point of being bland — which is what you want from a self-driving car.