
CES 2016: Why Is Software the Key to Bringing Augmented Reality to Cars?

While self-driving vehicles are gradually becoming a reality, more and more of today’s cars roll out of the factory with advanced driver assistance systems (ADAS). We are quickly getting used to adaptive cruise control, blind spot monitoring, parking assistance, lane departure warning and many other features that make driving safer and the driver’s job easier. Data from cameras, sensors and V2X infrastructure feeds into these systems, increasing their accuracy and efficiency. ADAS is an important step toward fully autonomous driving, but the ultimate responsibility for decision making still lies with the driver.

(Guest post by Alex Leonov, marketing director, Luxoft Automotive, a QNX technology partner. Originally posted on the QNX Auto Blog.)

The more connected cars become, the more information the average driver is bombarded with while driving. “In 500 feet make a right turn.” “You have an incoming call from Christine.” “You have a new message on Facebook.” “You are over the speed limit.” This may not be much of a distraction under normal conditions. But when driving in hectic city traffic or in a snowstorm, it is critical to keep your eyes on the road while still receiving essential information. The good news is that the technology to remedy this already exists.

Heads up for HUDs

Keeping the driver’s eyes on the road is a priority, and head-up displays (HUDs) can accomplish just that: they project alerts and navigation prompts right onto the windshield, or onto a small transparent combiner plate in front of it. Analysts predict explosive growth for HUDs, with the market reaching close to US$100 billion by 2020. The bulk of today’s HUDs are relatively simple combiner units, but wide-field-of-view HUDs are coming soon.

[Image: projecting alerts onto the windshield]

HUDs are perfect for presenting information in a convenient, natural way and for giving the driver a feeling of being in control. But a HUD is only as good as the information it displays. That is why it is critical to have solid, reliable data processing and decision-making algorithms, running on a dependable OS such as QNX Neutrino, that can prioritize and filter the data. The resulting alerts and prompts must be communicated to the driver in a clear, transparent way.
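To give a flavor of what “prioritize and filter” means in practice, here is a minimal C++ sketch of an alert filter that always surfaces the most urgent item first and quietly drops stale low-priority notifications. The categories, the five-second staleness window and the data layout are illustrative assumptions, not CVNAR’s actual design:

```cpp
#include <chrono>
#include <optional>
#include <queue>
#include <vector>

// Illustrative alert categories, ordered by urgency (an assumption,
// not CVNAR's real taxonomy).
enum class AlertPriority { Critical = 0, Warning = 1, Info = 2 };

struct Alert {
    AlertPriority priority;
    std::chrono::steady_clock::time_point created;
    // ... payload such as text, icon, on-screen position ...
};

struct ByUrgency {
    bool operator()(const Alert& a, const Alert& b) const {
        // Lower enum value = more urgent; priority_queue pops its "largest"
        // element, so invert the comparison to pop critical alerts first.
        return a.priority > b.priority;
    }
};

class AlertFilter {
public:
    void push(Alert a) { queue_.push(std::move(a)); }

    // Return the next alert worth showing, dropping stale low-priority ones.
    std::optional<Alert> next(std::chrono::steady_clock::time_point now) {
        while (!queue_.empty()) {
            Alert a = queue_.top();
            queue_.pop();
            bool stale = (now - a.created) > std::chrono::seconds(5);
            if (a.priority != AlertPriority::Critical && stale)
                continue;  // a speed-limit nag or social notification
                           // from five seconds ago is just noise
            return a;
        }
        return std::nullopt;
    }

private:
    std::priority_queue<Alert, std::vector<Alert>, ByUrgency> queue_;
};
```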

Computer vision, also known as machine vision, is key to processing this endless flow of data. With its human-like image recognition ability, computer vision interprets road scenes while the system fuses data from multiple sources. Add a natural representation of the results in the form of augmented reality, track the driver’s pupils, and you have a completely new level of driver experience – safe and intuitive.
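As a taste of what fusing data from multiple sources involves, the sketch below blends two noisy estimates of the same quantity, say the car’s lateral offset in its lane as seen by the camera and by GPS map matching, weighting each source by how much it is trusted. The sensors, numbers and variances here are hypothetical, and a real system would use a full motion filter rather than this single-shot fusion:

```cpp
#include <cstdio>

// One scalar estimate (e.g., lateral offset within the lane, in meters)
// together with its variance (how much we distrust it).
struct Estimate {
    double value;
    double variance;
};

// Classic inverse-variance fusion: the combined estimate weights each
// source by 1/variance, and the fused variance is smaller than either input.
Estimate fuse(const Estimate& a, const Estimate& b) {
    double wa = 1.0 / a.variance;
    double wb = 1.0 / b.variance;
    return { (wa * a.value + wb * b.value) / (wa + wb),
             1.0 / (wa + wb) };
}

int main() {
    Estimate camera = { 0.42, 0.01 };  // hypothetical camera estimate, precise
    Estimate gps    = { 0.60, 0.25 };  // hypothetical GPS/map estimate, noisier
    Estimate fused  = fuse(camera, gps);
    std::printf("fused offset = %.3f m (variance %.4f)\n",
                fused.value, fused.variance);
}
```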

Next-generation driving experience

At Luxoft, we’ve been working on making this experience a reality. The result is CVNAR, a computer vision and augmented reality solution. CVNAR is a powerful software framework containing mathematical algorithms that process a vast amount of road data in real time to generate intuitive prompts and alerts. CVNAR has built-in algorithms for road and pedestrian detection, vehicle recognition and tracking, lane detection, facade recognition and texture extraction, road sign recognition and parking space search. It performs relative and absolute positioning and easily integrates with navigation, the map database, sensors and other data sources. A unique feature of CVNAR is its extrapolation engine for latency avoidance.
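Luxoft has not published the internals of that extrapolation engine, but the problem it addresses is standard: by the time a processed frame reaches the display, the tracked object has already moved, so the renderer predicts the object’s position a short interval ahead instead of drawing it where it used to be. Here is a minimal constant-velocity sketch; the structures and latency handling are illustrative, not CVNAR code:

```cpp
#include <array>

// A tracked object's 2D position on the image plane plus a timestamp (seconds).
struct Track {
    std::array<double, 2> pos;
    double t;
};

// Predict where the object will be at render time using the velocity implied
// by the last two observations (a constant-velocity model; a real system
// would use a motion filter such as a Kalman filter).
std::array<double, 2> extrapolate(const Track& prev, const Track& curr,
                                  double render_time) {
    double dt = curr.t - prev.t;
    if (dt <= 0.0) return curr.pos;      // no usable motion information
    double lead = render_time - curr.t;  // e.g., ~0.05 s of pipeline latency
    return { curr.pos[0] + (curr.pos[0] - prev.pos[0]) / dt * lead,
             curr.pos[1] + (curr.pos[1] - prev.pos[1]) / dt * lead };
}
```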

[Image: detecting road signs]

CVNAR works perfectly with LCD displays and smart glasses, but it is ultimately built for HUDs. Data from cameras, sensors, the CAN bus and navigation maps is fused and processed into an extendable metadata output that describes all augmented objects. Implementing CVNAR in a vehicle takes only a HUD and an eye-tracking camera: CVNAR tracks the driver’s gaze and adjusts the position of the augmented objects in the driver’s line of sight to make sure they don’t obstruct anything important – all in real time.
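Conceptually, keeping an overlay locked onto a real object means intersecting the line from the driver’s eye to the object with the HUD’s virtual image plane; as the tracked eye moves, so does the intersection point. The simplified sketch below assumes a flat image plane at a fixed depth in the car’s frame, which a real windshield HUD, with its curved optics and calibration, would replace with a proper projection model:

```cpp
#include <optional>

struct Vec3 { double x, y, z; };

// Where to draw an overlay so it lines up with a real-world object:
// intersect the eye->object ray with the HUD's virtual image plane.
// Simplification: the plane is z = plane_z in the car's frame, with the
// driver looking roughly along +z.
std::optional<Vec3> overlayPosition(const Vec3& eye, const Vec3& object,
                                    double plane_z) {
    double dz = object.z - eye.z;
    if (dz <= 0.0) return std::nullopt;  // object is not in front of the eye
    double s = (plane_z - eye.z) / dz;   // parameter along the eye->object ray
    return Vec3{ eye.x + s * (object.x - eye.x),
                 eye.y + s * (object.y - eye.y),
                 plane_z };
}
```

Re-evaluating this every frame with the latest eye position from the tracking camera is what keeps the graphics registered to the world as the driver’s head moves.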

[Image: alerting the driver]

This is not all that CVNAR can do. New car models come packed with infotainment features that take time to learn and memorize. A CVNAR-based smartphone app can help by turning your phone into an interactive guide: point the camera at your dashboard and use augmented prompts to find out more about a particular car function. It works under the hood, too.

Era of a software-defined car

A modern car runs on code as much as it runs on gasoline (or a battery-powered electric motor). Today, it takes over 100 million lines of software code to get a premium car going, and the amount of software necessary keeps expanding. At Luxoft, we are excited about the car’s digital future, and we work every day to help bring it about by developing cutting-edge automotive solutions for leading global vehicle manufacturers.

Luxoft offers a wide range of embedded software development and integration services for in-vehicle infotainment and telematics systems, digital instrument clusters and head-up displays, and has developed user experience (UX) and human machine interface (HMI) technology for millions of vehicles on the road today. We push the technology envelope in areas such as situation-aware HMI, computer vision and augmented reality, while Luxoft’s products, the Populus and Teora UX and HMI design tool chains, power the development of award-winning automotive HMIs and slash time to market.

Software holds the key to the future of cars. It is essential to creating a customized user experience in vehicles. With over-the-air updates, software offers unmatched flexibility and scalability. Finally, it takes safety to the next level with its ability to simulate human-like logic through complex algorithms.
