What’s better than scoring top-of-the-line tech gear? Scoring gold-finished, top-of-the-line tech gear for free!
TNW Deals is giving one lucky winner a brand new iPhone 6 and an iPad Air 2, finished in stunning matte gold. These gorgeous new devices can be yours — all you need to do is enter our super-simple giveaway!
To win, simply visit our offer page and enter your email address — that’s it! You can even increase your chances of winning by sharing the offer with your followers on Twitter. Enter today for a chance to rock Apple’s most beautiful phone and tablet yet.
➤ Get this deal now
Leap Motion releases VR Developer Mount and teases prototype Dragonfly sensor for VR headsets
The hype around virtual reality has reached fever pitch, and Leap Motion, not surprisingly, wants a piece of the action. The company today announced an inexpensive VR Developer Mount for its existing Leap Motion Controller, as well as a prototype sensor which it hopes will be adopted by virtual reality headset makers (hint hint, Oculus VR).
As a quick recap, the Leap Motion Controller lets PC and Mac users control their machines by waving their hands and fingers in front of them. Swiping, pinching, waving and grabbing at thin air is similar in spirit to Microsoft’s two Kinect sensors, although this is a far more portable device.
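For a sense of what developers actually work with, here’s a minimal polling sketch against the v2-era Leap Motion C++ SDK: it reads the latest tracking frame, lists tracked hands and reports swipe gestures. The polling structure and console output are illustrative; a real app would register a Leap::Listener for frame callbacks instead.

```cpp
#include <iostream>
#include "Leap.h"  // Leap Motion C++ SDK

int main() {
    Leap::Controller controller;
    // Gesture recognition is opt-in, per gesture type.
    controller.enableGesture(Leap::Gesture::TYPE_SWIPE);

    // Grab the most recent tracking frame.
    Leap::Frame frame = controller.frame();

    for (Leap::Hand hand : frame.hands()) {
        Leap::Vector pos = hand.palmPosition();  // millimetres from the device
        std::cout << (hand.isLeft() ? "Left" : "Right")
                  << " palm at (" << pos.x << ", " << pos.y << ", " << pos.z << ")\n";
    }

    for (Leap::Gesture gesture : frame.gestures()) {
        if (gesture.type() == Leap::Gesture::TYPE_SWIPE) {
            Leap::SwipeGesture swipe(gesture);
            std::cout << "Swipe at " << swipe.speed() << " mm/s\n";
        }
    }
    return 0;
}
```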
The VR Developer Mount is, therefore, a logical evolution. Rather than sitting on a desk, the Leap Motion Controller can be in line with your eyes, picking up everything in your field of view. For now, this means players will be able to see where their hands are in the real world; according to The Verge, that happens either as an overlay or by quickly switching out of the VR experience.
It’s not hard to imagine a future scenario, however, in which players forgo controllers (such as the PlayStation Move peripheral) and have the VR experience actually track and replicate their hand movements.
Leap Motion says this was possible thanks to a new image API it launched earlier this month, which offers raw infrared imagery. “Because our device’s field of view exceeds that of existing VR displays, you’ll find it can start to track your hands before you even see them.” Leap Motion is also updating its beta SDK with an improved ‘top-down tracking’ mode, as well as Unity and C++ examples.
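Based on the SDK’s documented image API, grabbing that raw infrared feed looks roughly like the sketch below. Image access is a policy the app must request, and the loop simply prints each camera’s resolution and a sample pixel as a sanity check. (A freshly constructed controller may not have connected yet; a real app would wait for the connect callback first.)

```cpp
#include <iostream>
#include "Leap.h"  // Leap Motion C++ SDK

int main() {
    Leap::Controller controller;
    // Raw sensor images are off by default; apps must opt in.
    controller.setPolicy(Leap::Controller::POLICY_IMAGES);

    Leap::Frame frame = controller.frame();
    Leap::ImageList images = frame.images();  // one image per infrared camera

    for (int i = 0; i < images.count(); ++i) {
        Leap::Image image = images[i];
        const unsigned char* pixels = image.data();  // 8-bit infrared brightness
        int centre = (image.height() / 2) * image.width() + image.width() / 2;
        std::cout << "Camera " << i << ": "
                  << image.width() << "x" << image.height()
                  << ", centre pixel " << static_cast<int>(pixels[centre]) << "\n";
    }
    return 0;
}
```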
The VR Developer Mount is available today for $19.99 and lets developers quickly attach and remove the Leap Motion Controller.
Leap Motion also mentioned a prototype sensor it’s working on. Codenamed Dragonfly, the sensor will feature “greater-than-HD image resolution, color and infrared imagery,” as well as a larger field of view, according to the company. “With next-generation ‘mega-sensors’ like this, a Leap Motion device can literally become your eyes into the digital and physical realms – allowing you to seamlessly mix and mash, fade and blend, between virtual environments and the sharpness of the real world,” said David Holz, co-founder and CTO of Leap Motion.
Read Next: Leap Motion’s CEO wants its gesture control in cars, as a software upgrade to track hands nears / Leap Motion takes a big step forward with the public beta of its next-gen software to track hands
NVIDIA’s Tegra X1 will put a teraflop of computing power in your pocket
In 2000, ASCI Red was still the world’s fastest supercomputer, having been the first to break the teraflop barrier. Fifteen years later, NVIDIA has announced the first teraflop mobile processor: the Tegra X1.
The Tegra X1 squeezes that teraflop out of an eight-core 64-bit CPU and a 256-core GPU while drawing less than 15 watts; the ASCI Red needed a 1,600-square-foot installation and 500,000 watts to do the same.
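To put that gap in numbers, here’s a quick back-of-the-envelope calculation from the figures above. It treats both machines as delivering exactly one teraflop, which glosses over benchmark details, but the scale of the efficiency jump is the point:

```cpp
#include <iostream>

int main() {
    const double flops = 1e12;             // one teraflop for both machines
    const double asci_red_watts = 500000;  // ASCI Red's installation
    const double tegra_x1_watts = 15;      // Tegra X1's stated ceiling

    std::cout << "ASCI Red: " << flops / asci_red_watts << " FLOPS per watt\n";
    std::cout << "Tegra X1: " << flops / tegra_x1_watts << " FLOPS per watt\n";
    // Roughly a 33,000x improvement in performance per watt in 15 years.
    std::cout << "Improvement: " << asci_red_watts / tegra_x1_watts << "x\n";
    return 0;
}
```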
The ASCI Red was used to maintain the US’s nuclear arsenal. The Tegra X1 will make it even easier to play really great-looking games on your mobile.
The new processor brings NVIDIA’s Maxwell GPU architecture, which it announced for PCs last year, to mobile. The company promises it will deliver around twice the performance of last year’s Tegra K1 chip.
As well as coming to mobiles and tablets, the Tegra X1 will power NVIDIA’s new car platforms:
Drive CX is a cockpit computer that can power new information and entertainment systems. Using NVIDIA’s Drive Studio software, designers will be able to create digital cockpits with navigation, information and entertainment integrated with driver aids like Surround Vision, which shows a top-down, 360-degree view of the car in real time.
Drive PX, meanwhile, is NVIDIA’s contribution to the development of auto-piloted cars. Powered by dual Tegra X1 processors, the system can use up to 12 high-resolution cameras to power Surround Vision and Auto-Valet features.
There have been a lot of intriguing moves in in-car tech recently, but while we’re waiting for Android Auto and Apple’s CarPlay to make a big splash, NVIDIA could get the jump on both through its partner, Audi.
In the meantime, NVIDIA is showing off the Tegra X1’s capabilities at CES by running Epic’s Unreal Engine 4 demo “Elemental”. Given that most people were gobsmacked when that demo ran on a PS4 back in 2013, it’s a striking measure of how far mobile processors have come.