The annual Consumer Electronics Show (CES) recently wrapped up, and while there were a few standout pieces of gear and technology, the overall feel for mobile app developers was one of evolution rather than revolution. Human-sized drones and smartwatches did not take center stage this year; instead, the spotlight was on tech that pushed the boundaries of interaction and integration. These apps, products and systems further blurred the line between simply commanding a device and developing a working relationship with it.
This year there was a heavy emphasis on smarthome apps. The idea of having your very own “Jarvis” a la Marvel’s Tony Stark is no longer far-flung fiction. Thanks to devices like Google Home, Apple HomeKit and Amazon’s Alexa (which is leading the pack), homeowners can not only physically interact with their home, but talk to it as well. These quasi-AI devices continuously learn the behaviors, preferences and nuances of the home’s residents. Then, by talking to their devices, users can control their various smarthome products, which range from lighting, DVRs and robot vacuums to home security systems, phones and cars. And the list of smart appliances is only growing: manufacturers like Whirlpool, LG and Samsung are already developing and producing products that integrate with smarthome devices and platforms like Alexa.
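At its core, this kind of voice control boils down to mapping a parsed spoken command onto a device action. The sketch below shows one minimal way that dispatch could work; every name in it (the device names, actions and responses) is invented for illustration, and real platforms such as the Alexa Skills Kit define their own request and response formats.

```python
# Hypothetical sketch: routing a parsed voice command to a smart-home
# device action. All device names and responses here are illustrative.
from dataclasses import dataclass

@dataclass
class Command:
    device: str   # e.g. "lights", "vacuum"
    action: str   # e.g. "on", "start"

# Registry mapping (device, action) pairs to handler functions.
HANDLERS = {}

def handler(device, action):
    def register(fn):
        HANDLERS[(device, action)] = fn
        return fn
    return register

@handler("lights", "on")
def lights_on():
    return "Living room lights turned on"

@handler("vacuum", "start")
def start_vacuum():
    return "Robot vacuum started"

def dispatch(cmd: Command) -> str:
    """Look up and run the handler for a command, with a fallback reply."""
    fn = HANDLERS.get((cmd.device, cmd.action))
    if fn is None:
        return f"Sorry, I can't {cmd.action} the {cmd.device}"
    return fn()
```

The registry pattern is what lets new appliances plug in later: a manufacturer integration only needs to register handlers for its own devices.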
With the explosion of the smarthome industry looming, mobile app developers should position themselves to be the first through the door. Big industry names like Google, Apple and Amazon are laying down the infrastructure of the unfolding industry, and developers can choose their own path when it comes to finding their place within it. They can throw in with the manufacturers and develop smart devices that sync with smarthome hubs, or go the app route and develop new ways for users to interact with their existing devices. The possibilities, it would seem, are limitless at this exciting point in time.
Cars, or rather electric cars, were the big-ticket items of this year’s CES. Chrysler, Honda, Toyota and Bosch released new concept cars, and Faraday, which styles itself as Tesla’s biggest competition, unveiled its newest offering. However, it was what drivers would find inside the cars that turned out to be the biggest head turner. BMW showed just how far it can push the digital envelope with its new holographic Heads-Up Display (HUD). Those looking at developing apps and software for the automotive industry should pay close attention to the evolution of this field. If sensors and displays on appliances such as refrigerators, baby monitors and kiosk sign boards are anything to go by, car HUDs will quickly surpass their original purpose of displaying a car’s vital statistics and will likely evolve into some sort of interactive space.
The brains of cars also developed identities of sorts at this year’s CES. Car manufacturers have started to integrate voice control features using AI software from some of the biggest names in the tech industry: BMW chose Microsoft’s Cortana, Ford went with Amazon’s Alexa, and Google Assistant is the default for both Hyundai and Chrysler. These assistants have jumped out of the confines of mobile devices and homes and into vehicles. Drivers can now verbally interact with their cars to raise the volume, answer a phone call, dictate notes and engage navigation controls. Mobile app developers already used to programming against these AI and voice-control platforms now have a new avenue open to them in the automotive industry.
There is a standout among that group of car manufacturers, and that is Ford. Aiming to lead automotive-technology innovation, Ford is pushing integration forward with its SYNC AppLink. This feature allows apps and devices to integrate seamlessly with the car’s on-board computer and systems. Sure, the ability to pair a mobile device to a car has been around for nearly a decade, letting drivers take phone calls and access their device’s music playlists.
However, Ford is looking to do more with SYNC. Take its partnership with Samsung, for example: Gear S2 and S3 users can now receive notifications about their vehicle, including its location and whether its doors are locked, right on their smartwatches. Furthermore, Ford worked with IVOX to develop DriverScore, an app and service that rewards drivers who practice smart and safe driving behaviors with lower insurance rates.
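A service like DriverScore is, at heart, a function from driving events to a number an insurer can price against. The sketch below shows one simple way such a score could be computed; the event names, penalty weights and 0–100 scale are all invented for illustration, as IVOX’s actual model is not public.

```python
# Hypothetical sketch of a usage-based driving score: start from a
# perfect score and subtract a penalty for each risky telematics event.
# Event names and weights are illustrative, not IVOX's real model.
PENALTIES = {
    "hard_brake": 5,
    "rapid_acceleration": 3,
    "speeding": 8,
}

def driver_score(events, base=100):
    """Return a score in [0, base] given a list of event names."""
    score = base
    for event in events:
        score -= PENALTIES.get(event, 0)  # unknown events cost nothing
    return max(score, 0)
```

A real system would also weight by mileage and time of day, but the shape is the same: safer behavior means fewer penalties and a higher score.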
Then there’s navigation. When following driving directions, drivers have always had to choose between the app on their phone and the default software that came with their in-car navigation. Ford is looking to bridge that gap with SYNC. The system will allow any compatible navigation app to display its map, instructions and data on the vehicle’s touchscreen panel over a USB connection. Sygic has put itself on the map as the first navigation app to use the feature.
The innovations brought forward by Ford are only the beginning; they show mobile developers what is currently possible. With SYNC AppLink, Ford created the infrastructure that will let developers build apps specifically tailored for the in-car experience. How much further can the envelope be pushed? What other commands, apps and interactions are possible with the car? Ford has made it possible for these questions to be explored and answered not by the automotive industry, but by mobile app development companies and publishers.
CES wouldn’t be CES without gadgets and gizmos. On the laptop side of things, Chromebooks can now run Android apps, which gives app developers new challenges and new possibilities. The goal for mobile app developers is to adapt their existing apps and build new ones so that the Chromebook experience is not relegated to that of an emulator. The experience should be native and organic, and should feel the same as it does on other devices. The dimensions of the app are not the only thing that needs to change; the entire UI/UX needs to be reassessed.
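Part of that UI/UX reassessment is breakpoint logic: the same app must pick a different layout when it finds itself in a Chromebook-sized window rather than on a phone. The sketch below illustrates the idea in plain Python; the width thresholds and layout names are illustrative assumptions, not any platform’s official values.

```python
# Hypothetical sketch of layout selection by window width, the kind of
# decision an Android app faces on a Chromebook. Thresholds are
# illustrative assumptions, not official platform breakpoints.
def pick_layout(width_dp: int) -> str:
    """Choose a layout name from the window width in density-independent pixels."""
    if width_dp < 600:
        return "single_column"   # phone-style stacked layout
    elif width_dp < 1024:
        return "two_pane"        # tablet: list and detail side by side
    else:
        return "desktop"         # Chromebook window: multi-pane layout
```

The point is that the decision lives in one place, so the rest of the app renders whichever layout is chosen instead of assuming a phone screen.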
Robots! Smart and affordable ones are finally making their way off the drawing board and into our homes. Robots like Pepper and Kuri impressed, as did LG’s Hub Robot, which will retail below the $1,000 mark. But, interestingly enough, it is the children’s toy brand Lego that unveiled perhaps the best way to introduce robots into the average home: Lego Boost. First you build one of five possible robots with, what else, Legos. Then, by attaching a processing block to the creation and using the drag-and-drop app, a user or child can operate the robot. Fun, ingenious and affordable.
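Under a drag-and-drop app like Boost’s, each block the user drags becomes one command in a list that the robot executes in order. The toy interpreter below sketches that idea; the command names and the robot’s simple grid-and-heading model are invented for illustration and are not Lego’s actual protocol.

```python
# Hypothetical sketch of the interpreter behind a drag-and-drop robot
# app: each dragged block is one (command, argument) pair, run in order.
# Command names and the movement model are illustrative assumptions.
def run_program(program):
    """Execute (command, arg) blocks; return final (x, y, heading).

    heading is in degrees: 0 = +y, 90 = +x, 180 = -y, 270 = -x.
    Only turns in multiples of 90 degrees are supported.
    """
    x, y, heading = 0, 0, 0
    moves = {0: (0, 1), 90: (1, 0), 180: (0, -1), 270: (-1, 0)}
    for command, arg in program:
        if command == "forward":
            dx, dy = moves[heading]
            x, y = x + dx * arg, y + dy * arg
        elif command == "turn":
            heading = (heading + arg) % 360
    return x, y, heading
```

For example, the block sequence “forward 2, turn 90, forward 3” walks the robot two units up and three to the right, exactly as a child would read the blocks top to bottom.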
Finally, there’s Nebo, an app built by MyScript and the winner of the CES 2017 Mobile Apps Showdown. No, it will not let you launch rockets or decipher foreign languages at the touch of a button. It has, however, revolutionized the way people take and document notes. Believe it or not, pen and paper is still the preferred method of note-taking. Using MyScript’s Interactive Ink technology, Nebo (currently available for Windows and Apple users only; an Android version is on its way) lets users write their notes with an active pen, automatically captures those pen strokes and turns them into digital documents, which are then sent to the user’s device as an MS Word document or as an email. The same goes for charts, figures, equations, sketches and annotations.