What If We Looked at the Smartphone Camera as a “Sensor” Instead of a “Digital Camera”

Benedict Evans is one of the smartest people I’ve read on trends happening in the technology industry. It’s been fascinating to read his perspectives on the shift to mobile and how mobile adoption has changed society. Back in August he blew my mind again with a post on the evolution of mobile that redefines how we use the camera on our smartphones and what that means for mobile applications. Here’s an excerpt from that post:

This change in assumptions applies to the sensor itself as much as to the image: rather than thinking of a ‘digital camera’, I’d suggest that one should think about the image sensor as an input method, just like the multi-touch screen. That points not just to new types of content but new interaction models. You started with a touch screen and you can use that for an on-screen keyboard and for interaction models that replicate a mouse model, tapping instead of clicking. But next, you can make the keyboard smarter, or have GIFs instead of letters, and you can swipe and pinch. You go beyond virtualising the input models of an older set of hardware on the new sensor, and move to new input models. The same is true of the image sensor. We started with a camera that takes photos, and built, say, filters or a simple social network onto that, and that can be powerful. We can even take video too. But what if you use the screen itself as the camera – not a viewfinder, but the camera itself? The input can be anything that the sensors can capture, and can be processed in any way that you can write the software.

Everyone has long argued that the smartphone is great as a consumption engine but not as a content creation engine. That’s largely true today, but will it change in the future? I think it’s an extremely powerful idea to think of the camera on your smartphone as a sensor that captures meaningful actions beyond just taking a picture. That concept is going to change the way mobile apps work and how they’re designed.

The same is true when you think about the camera app software on your smartphone. We see that with Snapchat and other apps that have taken what’s essentially a camera app and overlaid filters to add new functionality to an otherwise simple tool.

Now think about this from a healthcare perspective. Could the camera on your smartphone be a window into your health? Could what you capture with the camera offer a window into your daily activities? That would bring health tracking to a whole new level.

I first saw an example of this at a Connected Health Symposium many years ago, where a researcher showed how your cell phone camera could measure your heart rate. I’m not sure of all the technical details, but as I understand it, your appearance subtly changes with each heartbeat, and software can measure that change and thus calculate your heart rate. Pretty amazing stuff, and it’s definitely an example of using your camera as a sensor as opposed to a digital camera.
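For the technically curious, here’s a minimal sketch of how that kind of camera-based measurement could work in principle: blood flow with each heartbeat causes tiny periodic color changes in the skin, so you can average the color of each video frame into a signal and look for the dominant pulse frequency. This is just an illustration of the general idea, not the actual method from that symposium; the function name, frame format, and frequency band are my own assumptions.

```python
import numpy as np

def estimate_heart_rate(frames, fps):
    """Rough heart-rate estimate (BPM) from a stack of video frames.

    frames: numpy array of shape (num_frames, height, width, 3), RGB video
    fps: frames per second of the capture

    Assumes the frames show skin (e.g. a fingertip over the lens),
    where blood flow subtly modulates the green channel.
    """
    # Average the green channel of each frame into a 1-D signal over time
    signal = frames[:, :, :, 1].mean(axis=(1, 2))

    # Remove the constant baseline so the FFT isn't dominated by it
    signal = signal - signal.mean()

    # Look for the dominant frequency in the signal
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)

    # Human heart rates fall roughly between 0.7 and 4 Hz
    # (about 40-240 beats per minute); ignore everything else
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]

    return peak_freq * 60.0  # convert Hz to beats per minute
```

Real apps that do this add much more signal cleaning (motion compensation, band-pass filtering, longer capture windows), but the core idea is the same: the camera is just a light sensor, and software turns its readings into health data.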

Go and read Benedict Evans’ full article to really understand this shift. I think it could have incredible implications for digital health applications.

About the author

John Lynn

John Lynn is the Founder of HealthcareScene.com, a network of leading Healthcare IT resources. The flagship blog, Healthcare IT Today, contains over 13,000 articles, with over half of the articles written by John. These EMR and Healthcare IT related articles have been viewed over 20 million times.

John manages Healthcare IT Central, the leading career Health IT job board. He also organizes the first-of-its-kind conference and community focused on healthcare marketing, the Healthcare and IT Marketing Conference, and a healthcare IT conference, EXPO.health, focused on practical healthcare IT innovation. John is an advisor to multiple healthcare IT companies. John is highly involved in social media, and in addition to his blogs can be found on Twitter: @techguy.
