Smartphones Not Secure Enough For HIPAA Or MU

Like it or not, smartphones have become an important part of clinicians’ professional lives, and that includes accessing secure hospital systems.  Unfortunately, few of these devices meet even half of Meaningful Use or HIPAA requirements, according to ONCHIT.

While the BlackBerry and iPhone do a bit better, most mobile phones sold today meet no more than 40 percent of Meaningful Use Stage 2 or HIPAA standards, at least as they’re configured out of the box.  When manually configured, iPhone and BlackBerry smartphones can reach only about 60 percent compliance, according to a piece in MobiHealthNews.

ONC has released these statistics ahead of planned guidance documents designed to help small- and mid-sized provider groups secure mobile devices on the healthcare grid.  ONC plans to publish its guidance as a series of best practices documents next year.

This is positive news. After all, making best practice models available — such as how to handle “BYOD” situations — is quite necessary. That being said, why must providers wait until next year? I’d argue that providers need best practices for smartphone use immediately, not in several months.

HIT administrators need guidance not only on how to configure the devices adequately, but also on how to tailor data delivery to the device’s limited capabilities, how to keep the devices uncrackable even if lost, and what kind of health data UI works on a smartphone. (Technically, the last item isn’t a security concern, but I think we can all safely assume that if the UI is clumsy, physicians will try to work around it or simply switch to a less secure device.)

Readers, have you had any security concerns arise specifically due to smartphone use? Do you think smartphones are as big a security threat as tablets and laptops?

About the author

Anne Zieger

Anne Zieger is a healthcare journalist who has written about the industry for 30 years. Her work has appeared in all of the leading healthcare industry publications, and she's served as editor in chief of several healthcare B2B sites.


  • Which is precisely why it’s imperative that each mHealth app be secured separately from the device. For instance, PatientKeeper’s physician workflow app (for accessing inpatient info, placing orders, recording charges, etc.), which runs on iPhone/iPad, Android and BlackBerry, does not depend on the native device or OS security features but instead fully encrypts all PHI sent to the device using 256-bit AES. This payload remains encrypted the entire time it is on the device and is decrypted only at display time, using a public/private key mechanism. Beyond encryption, though, a secure application must let each organization customize a myriad of settings (e.g., session timeout, wiping data after X failed password attempts, etc.) to meet that organization’s security and compliance requirements.
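To make the pattern the commenter describes concrete, here is a minimal sketch of encrypting a PHI payload with 256-bit AES so it stays encrypted at rest on the device and is decrypted only at display time. This uses Python’s `cryptography` library; the function names are hypothetical and this is not PatientKeeper’s actual implementation. The public/private key wrapping of the AES key mentioned in the comment is omitted for brevity.

```python
from os import urandom
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_phi(key: bytes, phi: bytes) -> bytes:
    """Encrypt a PHI payload server-side; the nonce is prepended to the ciphertext."""
    nonce = urandom(12)  # unique nonce per message, required by AES-GCM
    return nonce + AESGCM(key).encrypt(nonce, phi, None)

def decrypt_phi(key: bytes, blob: bytes) -> bytes:
    """Decrypt only at display time; the blob stays encrypted at rest on the device."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # 256-bit AES key
blob = encrypt_phi(key, b"Patient: Jane Doe, Rm 402")
assert decrypt_phi(key, blob) == b"Patient: Jane Doe, Rm 402"
```

Because AES-GCM is authenticated, any tampering with the stored blob causes decryption to fail outright rather than yield garbled PHI.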

  • If, as Peter suggests, the app is properly designed to encrypt data and to react properly to enough failed attempts to log in, I would think that would be sufficient to meet requirements. Am I wrong – is there more?

    BTW, I’m also assuming that the only way the device is used for EHR and related purposes is through a controlled app. If there is also email access, it would also need full encryption and login security.

    Both would need to time out after x minutes of nonuse and then lock up.

    And if the device has browser access to EHR, same issues of encryption and lockouts.

    So is there anything else that ONC is complaining about here?
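The checklist in the comments above — encryption, an idle timeout, and a wipe-after-N-failures rule — amounts to a per-organization policy check. Here is a small sketch of what such a check might look like; the field names and thresholds are hypothetical and not drawn from any ONC document.

```python
from dataclasses import dataclass

@dataclass
class DeviceSettings:
    encrypts_phi: bool        # payload encrypted at rest and in transit
    idle_timeout_min: int     # lock after this many minutes of nonuse
    wipe_after_failures: int  # wipe app data after N failed logins (0 = never)

def meets_policy(s: DeviceSettings,
                 max_idle: int = 15,
                 max_failures: int = 10) -> bool:
    """Return True if the device settings satisfy the org's compliance policy."""
    return (s.encrypts_phi
            and 0 < s.idle_timeout_min <= max_idle
            and 0 < s.wipe_after_failures <= max_failures)

assert meets_policy(DeviceSettings(True, 5, 8))
assert not meets_policy(DeviceSettings(True, 60, 8))  # idle timeout too long
```

In practice each app, email client, and browser session on the device would need to pass the same check, which is exactly why the commenter enumerates them separately.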
