NVIDIA CES 2015 Live Blog: Follow The News As It Happened

NVIDIA had a cracking year in 2014. Between a brand new Shield gaming tablet with sweet free games, amazing new laptop graphics chips and the excellent GeForce range, it was an impressive 12 months. So what’s next? We’ll be at NVIDIA’s CES 2015 press conference this afternoon to bring you all the news as it happens!

All times are in AEDT.

12 midday, 5 January
The presser kicks off in earnest at 3pm AEDT. Tune back in then!

2:50pm
And we’re seated! Solid turnout for a sneaky press conference late on day zero of CES 2015!

2:53
What do you want to see from NVIDIA in 2015? A new Shield handheld? Better desktop processors? Let us know in the comments!

2:55
Green is tonight’s theme.

3pm
Right on time, co-founder and CEO Jen-Hsun Huang takes the stage.

3:02
We’re going over the company’s recent achievements, including the Tegra K1 and the Maxwell GPU architecture.

3:04
Meet the Tegra X1! It’s a Maxwell chip shrunk down for mobile, and it’s incredible: an 8-core 64-bit CPU paired with a 256-core Maxwell GPU, able to handle 4K video at 60fps in both H.265 and VP9. Oh my.

3:05
It smashes rivals out of the park on benchmark tests, and manages to sip power.

3:07
The Tegra X1 is able to run the amazing Elemental Unreal Engine 4 demo using just 10W of power. When it was demonstrated two months ago, it needed desktop-class hardware drawing around 300W to run it.

3:08
And now we’re watching that demo. This looks great.

3:10
In the year 2000, the ASCI Red supercomputer needed a million watts to produce 1 teraflop of performance. 15 years later, the Tegra X1 can pump out the same amount with just 10 watts.
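A quick back-of-the-envelope check on those numbers (a rough sketch in Python, using only the figures quoted on stage):

```python
# Rough performance-per-watt comparison using the figures quoted on stage.
TFLOPS = 10**12  # 1 teraflop = 10^12 floating-point operations per second

asci_red_flops_per_watt = 1 * TFLOPS / 1_000_000  # ~1,000,000 W for 1 teraflop
tegra_x1_flops_per_watt = 1 * TFLOPS / 10         # ~10 W for 1 teraflop

print(f"ASCI Red: {asci_red_flops_per_watt:,.0f} FLOPS per watt")   # 1,000,000
print(f"Tegra X1: {tegra_x1_flops_per_watt:,.0f} FLOPS per watt")   # 100,000,000,000
print(f"Improvement: {tegra_x1_flops_per_watt / asci_red_flops_per_watt:,.0f}x")  # 100,000x
```

That works out to a roughly 100,000-fold jump in performance per watt, which is the comparison Huang is driving at.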

3:12
So what’s all that power going to be used for? “Surely not a phone,” Jen-Hsun Huang says. It’s going to power the smart car and all of its displays and computing needs.

3:15
So that Tegra X1 is being put to good use inside the NVIDIA Drive CX: a digital cockpit computer.

Designed for future cars, it’s meant to power a bunch of different high-resolution screens, run a number of virtual machines, and it comes with support for QNX, Linux and Android.

The software suite designers will get access to alongside it is called Drive Studio. “We’ll be able to render Tron-like graphics for your cockpit in the future,” Jen-Hsun Huang says as a demonstration plays.

3:20

These graphics look gorgeous.

3:24

This digital cockpit software is completely amazing, and it’s all being powered by this Mighty Mouse of a chip. Awesome.

3:25

Features include dynamic lighting on navigation maps, customisable lighting cues on rev gauges and fantastic rear-view cameras that give you a bird’s eye view of the car in a single video render.

3:28

We’re getting a demo of NVIDIA’s Drive Studio software replicating different lighting scenarios on various materials in a speedometer. From carbon fibre gauges with red lighting through to porcelain gauges with dimmed blue lighting, NVIDIA’s boffins can simulate the lot on the Drive CX for car designers.

3:31

We’re now talking about the “car of the future” and all the advanced driver-assist sensors it already has today and will gain tomorrow. NVIDIA predicts these sensor technologies will be replaced by smart cameras, with computer vision powering warnings for the driver.

3:37

How’s this for some future?

NVIDIA just announced a new auto-pilot computer system called the NVIDIA Drive PX. It’s packing two Tegra X1 processors, which give it incredible processing capabilities.

Combined, they offer 2.3 teraflops of computing power. The Drive PX can connect to up to 12 HD cameras and process 1.3 billion pixels per second. That way it can work out where you’re going (there’s some quick maths on that figure below).

It’s designed for driverless cars with 4K cameras, amazing displays and smart camera technologies all over them.
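For a sense of scale, here’s what 1.3 billion pixels per second buys you across 12 cameras. The per-camera breakdown is our own illustrative arithmetic, not NVIDIA’s stated camera configuration:

```python
# Back-of-the-envelope pixel budget for the Drive PX's 12 camera inputs.
# The per-camera breakdown is our own illustration, not NVIDIA's spec.
total_pixels_per_second = 1.3e9
cameras = 12

per_camera = total_pixels_per_second / cameras  # ~108 million pixels/s per camera
full_hd_frame = 1920 * 1080                     # ~2.07 million pixels per 1080p frame

print(f"Per-camera budget: {per_camera / 1e6:.0f} megapixels per second")
print(f"Roughly {per_camera / full_hd_frame:.0f} full-HD frames per second on each of the 12 feeds")
```

In other words, that budget is enough to chew through a full-HD feed at around 50 frames per second on every single camera.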

3:40

There’s also a new deep neural network-based vision system that detects and classifies objects as it sees them. It will know what a pedestrian looks like, what a bicycle looks like, what an emergency vehicle looks like and what road signs look like.

It’s about making the car “situationally aware” according to NVIDIA. Which sounds terrifying.

3:48

What NVIDIA’s deep neural network systems are able to do is break images apart, work out exactly what each pixel belongs to, and react accordingly.

For example, if you detect a car in front of you, you’ll drive differently than if you see a school bus with its lights on. The car can tell the difference.
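To illustrate the idea in the simplest possible terms, here’s a toy sketch of how classified objects might map to different driving responses. The class names, confidence threshold and responses are hypothetical examples for illustration only, not NVIDIA’s actual system:

```python
# Toy illustration: map a classifier's output label to a driving response.
# Labels, responses and the confidence threshold are hypothetical examples.
DETECTION_RESPONSES = {
    "car": "maintain normal following distance",
    "school_bus_lights_flashing": "stop and wait for the lights to stop",
    "pedestrian": "slow down and prepare to brake",
    "emergency_vehicle": "give way and pull over",
    "speed_limit_sign": "adjust target speed to the posted limit",
}

def respond_to_detection(label: str, confidence: float, threshold: float = 0.8) -> str:
    """Choose a driving response for a classified object, ignoring low-confidence detections."""
    if confidence < threshold:
        return "no action: detection not confident enough"
    return DETECTION_RESPONSES.get(label, "unknown object: proceed with caution")

print(respond_to_detection("car", 0.95))
print(respond_to_detection("school_bus_lights_flashing", 0.91))
```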

3:50

We’re now taking a look at how NVIDIA has been testing this on real roads. It’s amazing.

These cameras and this deep neural network computer vision system can pick out everything from partially occluded pedestrians through to distant speed limit signs and the colour of traffic signals, and react accordingly.

4:00

It can also get into nitty-gritty sub-classifications: it sees all the cars, but it can sort them into sports cars, trucks (heavy and light) and SUVs.

4:10
Ricky Hudi, Executive VP of Electronics Development at Audi, is up talking about developments in driverless car tech.

4:22
Finally, we’re talking about NVIDIA Surround Vision. It stitches every camera angle around the car together to make a fantastic full-car view.

4:37

And that’s all she wrote, folks! Thanks for joining us for the first live blog of CES 2015. We’ll have some amazing stuff coming up this week, so stay tuned to Gizmodo Australia!


