The annual Consumer Electronics Show in Las Vegas has been growing in importance for the automotive industry over the years. You can hardly fail to notice: this year, as in previous years, the big automakers vie for floor space and attention with the glut of big-screen TVs and other consumer goods. As always, BlackBerry’s QNX will be in the North Hall, proudly in the middle of the big automotive original equipment manufacturers.
Originally posted on the QNX Auto Blog. As part of the run up to CES 2017, we are running a series of blogs to address automotive industry topics that we feel will be prominent at CES in January. Stay tuned throughout the month of December for more.
QNX has an enviable history of bringing concept cars to CES that rival anything on the show floor, with one important difference – ours are not pure flights of fancy; we show technologies that will become reality in the near future.
We started this trend back in 2010 with an LTE-connected Toyota Prius – 18 months before the first commercial LTE deployment in mid-2011. Working with Alcatel-Lucent to provide the experimental network, we were the first to demonstrate Google Maps functionality with local search and an embedded Pandora radio app in a car. Connectivity is standard in many cars today, but in 2010 we demonstrated the future.
The year 2012 brought us a CNET “Best of CES” award for demonstrating cloud-based natural language voice recognition, text-to-speech, and NFC-based one-touch Bluetooth pairing. Simply touching your phone to an NFC reader in the center console automatically paired the phone and car.
In 2013 we got ahead of the trend for ever-larger center stack displays – with detailed 3D maps and voice-recognition keyword spotting, common today in smartphones but appearing in a car for the first time. Simply saying “Hello, Bentley” let you start interacting with the natural-language, cloud-based voice recognition powered by AT&T’s Watson.
And 2014 took us, quite literally, in a different direction, with multi-modal input. In our concept car that year, a 21-inch, horizontally oriented center-stack display stretched across the dash, naturally extending interaction and functionality towards the passenger. Behind the screens, the center stack ran both driver information and vehicle infotainment functions through integration with the instrument cluster. And all of this was seamlessly controlled across the touchscreen, physical buttons, and jog wheel controls.
Not content with that, we foreshadowed even greater integration of advanced driver assistance system (ADAS) warnings for the driver. We warned the driver if local speed limits were exceeded, both visually through the cluster and verbally through text-to-speech, and followed this up in 2015 with a system that recommends an appropriate speed for upcoming curves based upon driving conditions and the radius of the bend.
So, what innovations will we be showcasing in 2017? I’m not allowed to tell you just yet, but in a first (for us), we’ll be showing both future and current technologies.
Building on our products, we will demonstrate how technology can enhance the user experience and increase safety for both drivers and passengers.
These cars are not just “show floor wonders.” Our automotive knowledge enables us to build demonstrations for the real world that can be taken to the road. Our new technologies, though often conceptual, can nevertheless be experienced firsthand.
So, while I can’t specifically reveal what new and exciting technologies we are planning for CES 2017, I can say that the three (yes, three!) cars we are showing all demonstrate cutting-edge technologies that will be available in the very near future. Come take a look!