The future of smartphones: A combination of sensors, machine learning and virtual reality

February 24, 2015

This sponsored post is produced in association with Adtile.

Imagine you walk by an Italian restaurant and your phone knows exactly where you are. It knows you love gnocchi and that you recently traveled to Milan. It offers you not simply a coupon, but an immersive experience where you get to explore the restaurant virtually, see what people are eating, and visit the kitchen to see how the food is prepared. Tempting?

Over the last decade, smartphones have evolved from simple phones into portable entertainment centers. We use them to text, watch movies, and keep ourselves occupied. Now smartphones are about to evolve further. Sensor data combined with machine learning and virtual reality will usher in a new wave of engagement, convenience, and utility. Interestingly enough, much of that technology is sitting inside our phones right now.

Your smartphone is smarter than you think

Most people don’t realize how smart their phones actually are or how much they already know about us. Unlike laptops, modern smartphones are packed with dozens of tiny sensors that enable them to collect all kinds of data on who we are, what we are doing, and the world around us.

Accelerometers and gyroscopes are the sensors we hear about the most. These have the potential to collect data on us even when we are not actively using the phone. But most smartphones also have an image sensor, touch sensor, proximity sensor, and up to 30 other sensors, including GPS for location.

New sensors are being developed all the time, and each one opens the door to new possibilities. Chemists at MIT recently developed a smartphone sensor that detects when food has gone bad. Imagine using your phone to check whether the rotisserie chicken you brought home three days ago is still safe to eat.

Sensors make our phones more aware, but sensors themselves only collect the raw data. Putting that data to use requires machine learning. By searching for patterns in the data, intelligent apps can figure out whether you are tall or short, big or small, and even guess your gender. It may sound spooky at first, but less so when you consider how useful such apps could become.
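To make the idea concrete, here is a minimal sketch of how an app might turn raw accelerometer readings into an inference about the user. This is an illustration only, not any company's actual pipeline: real apps train machine-learning models on large datasets, while here hand-picked variance thresholds stand in for a trained classifier.

```python
def classify_activity(samples):
    """Guess 'still', 'walking', or 'running' from accelerometer
    magnitudes (in g). Thresholds here are illustrative, not trained."""
    mean = sum(samples) / len(samples)
    # Variance of the signal is a crude proxy for motion intensity:
    # a resting phone barely deviates from 1 g (gravity alone).
    variance = sum((s - mean) ** 2 for s in samples) / len(samples)
    if variance < 0.01:
        return "still"
    return "walking" if variance < 0.5 else "running"

# Simulated readings: a phone at rest hovers around 1 g.
print(classify_activity([1.0, 1.01, 0.99, 1.0]))          # still
print(classify_activity([1.0, 1.4, 0.6, 1.3, 0.7, 1.2]))  # walking
```

A production system would work the same way in spirit: extract features from the raw sensor stream, then feed them to a model that maps feature patterns to conclusions about the user.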

Apps of the future think on their own

The most intelligent apps will use sensor-based data to provide contextual information. We have seen examples of this already in first-generation fitness apps that track how fast and how far you walk or run. And many apps, such as OpenTable, Uber, and Yelp, rely on GPS to serve information based on our location.

You may already be familiar with Apple’s iBeacon technology: tiny wireless transmitters used widely by retailers, airports, and even MLB and the NFL to deliver finely tuned content to your smartphone based on your location.

Some apps today are even crowdsourcing sensor data for traffic and weather forecasts. Consider how Google gathers smartphone GPS data and sends it back to users as accurate route-time estimates. Another company, PressureNet, is working to pull barometer readings from smartphones to improve weather and climate predictions.
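The crowdsourcing idea can be sketched in a few lines. The example below is hypothetical and greatly simplified (Google's real pipeline is far more sophisticated): it simply pools anonymous speed reports from phones on the same stretch of road into a travel-time estimate.

```python
def estimate_travel_time(segment_km, reported_speeds_kmh):
    """Average anonymous speed reports from phones on a road segment
    into a travel-time estimate, in minutes."""
    avg_speed = sum(reported_speeds_kmh) / len(reported_speeds_kmh)
    return 60 * segment_km / avg_speed

# Five phones report their current speeds on a 12 km stretch:
print(round(estimate_travel_time(12, [55, 60, 48, 52, 65]), 1))  # 12.9
```

The power of the approach is in the aggregation: any single phone's reading is noisy, but averaged across thousands of devices the estimate becomes reliable.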

But tomorrow’s mobile apps will employ sensor information to a far greater extent. These apps will pick up on patterns and routines and learn a user’s preferences over time. “Anyone can collect data. Finding an automated way to create the meaning of that data is paramount,” says Nils Forsblom, the founder of Adtile, a company working on new ways to use machine learning and virtual reality for marketing.

Future apps will usher in a new level of convenience. Instead of asking for input, they will anticipate your needs. Your phone might send calls to voice mail if you are driving or switch into Airplane Mode when it senses a plane moving on the tarmac. An app might hear people talking in a conference room and ask, “Do you want to record the meeting?”

Virtual reality adds a new level of creative engagement

But what happens when you mix sensor data and machine learning with virtual reality? Mobile devices may one day deliver immersive experiences, bringing inanimate objects to life and letting you do things like walk around a sculpture or explore the latest exhibit at a museum.

“The phones of the future might look something like Oculus VR meets iPhone — without the headset,” says Forsblom. Oculus builds headsets that deliver immersive virtual reality, but Forsblom predicts smartphones will offer immersive experiences without a headset at all.

Advertising may no longer interrupt whatever you are doing or reading, but instead take the form of active engagement. You might use your phone as an extension of yourself to walk through a car showroom. If you see something you like, you can use gestures and motions to explore the car in more detail, get more information, or sign up for a test drive.

“In the future, smartphone hardware and software will work in seamless harmony. Future mobile devices will be a mix of invisible apps for utility, entertainment, virtual reality, and gaming. Mobile virtual reality will be the ultimate input-output ‘device’ and creative medium,” says Forsblom.

The next few years will likely see dozens of new apps that use sensors in all kinds of mind-boggling ways. Our smartphones will become more like personal assistants that understand our preferences, habits, likes, and dislikes. And virtual reality has the potential to take that one step further, allowing us to explore places and objects without having to get off the sofa — now that’s convenience.

 
