As healthcare shifts from volume-based to value-based care, it is essential for healthcare organizations to have access to data analytics that help them make informed decisions. Running these analytics in the cloud gives you the scalability and agility needed to run data analytics applications while alleviating much of the security burden HIPAA places on your application. In this webinar, learn how Caremerge, a cloud-based care coordination platform, is using Amazon Web Services to provide insightful reporting on its application to improve care while reducing costs, and how APN Partner Cloudticity helped Caremerge migrate all-in to AWS. This webinar will walk through lessons learned from the migration process, best practices for architecting a reporting environment on AWS, and prescriptive guidance on securing your analytics and big data technologies for HIPAA compliance.
Migrating Your HIPAA Compliant Healthcare Analytics to AWS
1. Migrating Your HIPAA Compliant
Healthcare Analytics to AWS
Presented by Christopher Crosbie (Amazon Web Services),
Fahad Aziz (Caremerge), and Gerry Miller (Cloudticity)
December 3, 2015
3. Housekeeping
• Everyone will be muted throughout
• Feel free to submit questions via GoToWebinar chat
• The recording and slide deck will be sent to all webinar participants after the event
4. Agenda
• Caremerge
  – Migration to AWS
  – Analytics on AWS
• Cloudticity
  – Healthcare on AWS
• AWS
  – Big Data and Analytics architecture on AWS
11. Big Data
Boundless data in healthcare (patients, conditions, medications, treatments, and outcomes).
The ability to access, manage, connect, and understand this data to create actionable insights is critical for improving care and outcomes.
12. Data Points
• Clinical: Vitals, Medications, Diagnosis, Conditions, Allergies
• Assessments: Mood & Behavior Patterns, Cognitive Behavior, Communication/Hearing Patterns, Vision Patterns, Physical Functioning, Continence, Disease Diagnosis, Oral/Nutritional Status, Activity Patterns
• Observations: Falls, Wandering, Depression, Falls with Injury, Elopement, Depressed, Abusive
• Dimensions of Wellness: Physical, Emotional, Environmental, Spiritual, Vocational, Social, Intellectual, Health Services, Nutritional
• Devices: Sleep Time, Heart Rate, Blood Pressure, Falls Risk, More…
• Basic Health: Height/Weight, Race, Gender, Religion, Veteran, Marital Status, Blood Type
• Quality Measures: ACE/ARB, Beta Blocker, Cholesterol Test, Diuretic, HbA1c, Immunization Influenza, Immunization Pneumonia, Microalbuminuria Test, Spirometry Test, Statin Therapy Test, Tetanus
• Care Transitions: Admission Dates, Admission Source, Transition From, Transition To, 30-Day Re-admission
✓ Observations
✓ Quality Measures
✓ Care Transitions
✓ Clinical
✓ Assessments
✓ Basic Health
✓ Device data
14. Example #1
Identify those at high risk of re-admission:
• Recently discharged from hospital?
• Lost weight significantly?
• Has at least one chronic illness?
• Not sleeping?
• Oxygen levels not stable?
• Had a fall?
• Comparable to other similar cases?
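The re-admission signals on this slide can be thought of as a simple checklist. As a purely illustrative sketch (the field names and threshold below are hypothetical, not Caremerge's actual model), a naive rule-based flag might look like this:

```python
# Naive rule-based re-admission risk flag built from the slide's signals.
# All field names and the threshold are illustrative assumptions.

def readmission_risk_score(resident):
    """Count how many risk signals a resident record triggers."""
    signals = [
        resident.get("recently_discharged", False),
        resident.get("significant_weight_loss", False),
        resident.get("chronic_conditions", 0) >= 1,
        resident.get("poor_sleep", False),
        resident.get("unstable_oxygen", False),
        resident.get("recent_fall", False),
    ]
    return sum(signals)

def is_high_risk(resident, threshold=3):
    """Flag a resident when enough signals fire at once."""
    return readmission_risk_score(resident) >= threshold

resident = {
    "recently_discharged": True,
    "significant_weight_loss": True,
    "chronic_conditions": 2,
    "poor_sleep": False,
    "unstable_oxygen": False,
    "recent_fall": True,
}
print(is_high_risk(resident))  # 4 of 6 signals fire -> True
```

A real system would replace this checklist with comparisons against similar cases, as the last bullet suggests.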
15. Example #2
Identify those whose health is declining:
• Observations: mood, wandering, not talking?
• Attending fewer social events?
• Losing appetite?
• No family connection recently?
• Medication changes?
• Medication interactions?
• Comparable to other similar cases?
16. We are just getting started!
And we need to focus on our application and big data analytics, not worry about infrastructure.
18. A whole lot of firsts
• First patient portal on AWS
• First Meaningful Use Stage II attestation on AWS
• First Health Information Exchange on AWS
You can – and should – run your healthcare applications safely, securely, and cost effectively on Amazon Web Services.
20. AWS BAA Configuration Requirements
• Customers must encrypt ePHI in transit and at rest
• Customers must use EC2 Dedicated Instances for instances processing, storing, or transmitting ePHI
• Customers must record and retain activity related to use of and access to ePHI
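The Dedicated Instances requirement above maps to a specific launch setting. As a minimal sketch, here are boto3-style `run_instances` parameters with dedicated tenancy; the AMI ID is a placeholder, and building the dict makes no AWS call:

```python
# boto3-style EC2 launch parameters reflecting the BAA requirement that
# instances handling ePHI use dedicated tenancy. "ami-EXAMPLE" is a
# placeholder; no AWS API is invoked here.

run_instances_params = {
    "ImageId": "ami-EXAMPLE",               # placeholder AMI
    "InstanceType": "m4.large",
    "MinCount": 1,
    "MaxCount": 1,
    "Placement": {"Tenancy": "dedicated"},  # BAA: Dedicated Instances for ePHI
    "Monitoring": {"Enabled": True},        # detailed monitoring supports auditing
}

# At launch time these would be passed to a boto3 EC2 client:
#   import boto3
#   boto3.client("ec2").run_instances(**run_instances_params)

print(run_instances_params["Placement"]["Tenancy"])
```

The activity-recording requirement is typically met separately, for example with AWS CloudTrail and database audit logs.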
28. Migrating your reporting architecture to AWS
[Architecture diagram: clients connect over HTTPS through Amazon ELB to Auto Scaling groups of EC2 reporting workers in two Availability Zones; an EC2 primary instance handles licensing; Amazon SNS coordinates notifications; the workers connect over SSL to Amazon RDS and Amazon Redshift.]
29. Amazon RDS: Managed SQL Best Practices for HIPAA
• Enable access to the database only from the application tier (using appropriate security group/NACL rules)
• Any data that has the potential to contain PHI should always be encrypted by enabling the encryption option for your Amazon RDS DB (see image on left), or you can use Oracle's TDE
• For encryption of data in transit: MySQL – SSL channel; Oracle – Oracle Native Network Encryption
30. Amazon Redshift Best Practices for HIPAA
• Enable at-rest encryption on your Amazon Redshift cluster
• The customer controls the cluster key, which encrypts the database key for the cluster
• Connect to Amazon Redshift using SSL and set the require_ssl parameter to true
31. Next steps
• Pilot project on AWS
• Determine the right financial factors to drive a production migration
• Leverage the experience of companies like Caremerge who have already migrated to AWS
• Use a Healthcare Competency provider like Cloudticity to maximize your security, value, and chances of success on AWS
As many of us know, when Steve Jobs was being treated for his illness, several doctors, specialists, and pain managers were involved, but they were not talking to each other. To the point that his wife, Laurene, had to invite them to their home and ask them to create a coordinated plan to treat him.
This problem is not just a one-off; we all face it every day. Healthcare professionals and entities do not collaborate and coordinate, and that results in higher costs and inefficient care plans.
He could do that, but you cannot.
When we decided to solve this problem for seniors, we soon realized it's much bigger than we anticipated. A senior is looked after by so many people in a senior living community or home health setting, and they are not in one place, yet they are all making decisions, using technologies like fax, phone, email, and even texts and sticky notes.
Talk about how insecure it is.
We offered them our care coordination platform. For the first time, it brought all the decision makers together to collaborate and make the right decisions at the right time. Everyone started to see the benefits, and Caremerge started to grow its business. More and more senior living communities started to use it, and our user base doubled and tripled every month.
Caremerge forges meaningful connections between providers, families and seniors seeking to improve communication in today's complex healthcare environment.
With a revolutionary, easy-to-use cloud-based coordination platform, Caremerge keeps the entire care team informed and cohesive through an intuitive interface that enables real-time staff interaction, provides families with peace of mind and improves overall senior wellness.
This growth also did something very exciting for us. As senior living communities authorized outside collaborators, they were basically introducing them to Caremerge. Because of this, other healthcare entities like insurers, pharmacies, hospitals, and physician offices also got interested and started signing up, and now we are seeing network growth.
The media took notice: Fox News called us most innovative, Entrepreneur magazine ran a four-page story on us, and Forbes named us one of the top 10 healthcare companies to watch in 2015.
Then the business team came to me and asked, "Do we have the technology and infrastructure to support this growth?"
Every technology has its limitations. When you are on a jet pack, you need to know how high you can fly, how much fuel you have, and what your jet pack's capabilities are. The same is true of any technology, and we were no different.
We knew that we didn't have the infrastructure to manage and plan for the usage growth. We were using a couple of physical servers with a hardware load balancer, at one location.
Was it safe? To some extent.
Was it scalable? Yes, but not overnight.
Was it there to support our growth? No.
This means that, like a jet pack, a technology failure could mean disaster. The Obama administration could get away with healthcare.gov because they had access to unlimited resources and a strong PR engine, but we, and many of us, cannot. Therefore, we needed a plan.
This is when we met Gerry, shared our current and future challenges, and asked for his help doing a comparative analysis of all the options.
(Gerry you can talk about our initial conversations and what you thought about it)
I decided to go and visit my data center. I flew to Louisville, KY, and they took me inside the data center and showed me the servers that were hosting my applications, and I felt jitters. There were a lot of things that could go wrong there. It was all manual: people managing physical servers, network, security, etc.
Every request was handled by a group of people who needed to have some domain knowledge about our setup.
I was not just worried; I was very worried.
Instantly I realized four major challenges that we would face with the current setup. (Talk about each one briefly)
1) Obtain a Business Associate Agreement with AWS
Once you have determined that storing, processing, or transmitting protected health information (PHI) is absolutely necessary, before moving any of this data to AWS infrastructure you must contact AWS and make sure you have all the necessary contracts and a Business Associate Agreement (BAA) in place. These contracts will serve to clarify and limit, as appropriate, the permissible uses and disclosures of protected health information.
All of these services can contribute to the analytics story. Redshift in particular has seen a lot of adoption, especially for reporting with Tableau, but since these four are recent additions to our BAA and carry a little more complexity than the others, we would like to offer some best practices.
We're seeing some interesting trends caused by regulatory changes that are leading to technology changes: tons of data from EHRs and other devices, near-limitless storage, and supercomputers in the sky.
The easiest way to get started with database encryption is to make use of Amazon RDS (MySQL or Oracle engine). To protect your sensitive PHI data, you should consider the following best practices for Amazon RDS:
Access to the database should be enabled only from the application tier (using appropriate security group/NACL rules).
Any data that has the potential to contain PHI should always be encrypted by enabling the encryption option for your Amazon RDS DB instance, as shown in this screenshot. Data that is encrypted at rest includes the underlying storage for a DB instance, its automated backups, read replicas, and snapshots.
For encryption of data in transit, MySQL provides a mechanism to communicate with the DB instance over an SSL channel. Likewise, for Oracle RDS you can configure Oracle Native Network Encryption to encrypt the data as it moves to and from a DB instance.
For encryption of data at rest, you could also make use of Oracle's Transparent Data Encryption (TDE) by setting the appropriate parameter in the Options Group associated with the RDS instance. With this, you can enable both TDE tablespace encryption (encrypts entire application tables) and TDE column encryption (encrypts individual data elements that contain sensitive data) to protect just your PHI data and not have the overhead of encrypting everything.
For additional discussion on Amazon RDS encryption mechanisms, please refer back to the whitepaper.
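Putting the RDS guidance above together, here is a minimal boto3-style sketch: storage encryption enabled and database access restricted to the application tier's security group. All identifiers are placeholders, the dicts are built locally, and no AWS call is made:

```python
# boto3-style RDS launch parameters reflecting the best practices above.
# Identifiers and security group IDs are placeholder assumptions.

create_db_instance_params = {
    "DBInstanceIdentifier": "reporting-db",
    "Engine": "mysql",
    "DBInstanceClass": "db.m4.large",
    "AllocatedStorage": 100,
    "MasterUsername": "admin",
    "StorageEncrypted": True,        # encrypts storage, automated backups,
                                     # read replicas, and snapshots at rest
    "VpcSecurityGroupIds": ["sg-dbtier-EXAMPLE"],
}

# Ingress rule for the database security group: allow MySQL (port 3306)
# only from the application tier's security group, never from 0.0.0.0/0.
db_ingress_rule = {
    "IpProtocol": "tcp",
    "FromPort": 3306,
    "ToPort": 3306,
    "UserIdGroupPairs": [{"GroupId": "sg-apptier-EXAMPLE"}],
}

# In practice these would be passed to boto3 clients:
#   boto3.client("rds").create_db_instance(**create_db_instance_params)
#   boto3.client("ec2").authorize_security_group_ingress(
#       GroupId="sg-dbtier-EXAMPLE", IpPermissions=[db_ingress_rule])
```

Referencing the application tier's security group, rather than an IP range, keeps the rule valid as Auto Scaling adds and removes instances.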
Amazon Redshift provides database encryption for its clusters to help protect data at rest. When customers enable encryption for a cluster, Amazon Redshift encrypts all data, including backups, using hardware-accelerated AES-256 symmetric keys. Amazon Redshift uses a four-tier, key-based architecture for encryption: data encryption keys, a database key, a cluster key, and a master key. The cluster key encrypts the database key for the Amazon Redshift cluster, and customers can use either AWS KMS or AWS CloudHSM to manage the cluster key. Amazon Redshift encryption at rest is consistent with the HHS Guidance at the time of this webinar.
Amazon Redshift also supports Secure Sockets Layer (SSL) connections to encrypt data in transit, along with server certificates so the client can validate the server it connects to. By default, cluster databases accept connections whether or not they use SSL. To configure your cluster to require an SSL connection, set the require_ssl parameter to true in the parameter group that is associated with the cluster.
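As a minimal sketch of the Redshift settings described above, here are boto3-style parameters for an encrypted cluster plus a parameter group that forces SSL. The cluster and group names are placeholders, and no AWS call is made here:

```python
# boto3-style Redshift parameters: encrypted cluster, SSL required.
# Names are illustrative assumptions; the dicts are built locally.

create_cluster_params = {
    "ClusterIdentifier": "reporting-warehouse",
    "NodeType": "dc1.large",
    "MasterUsername": "admin",
    "DBName": "analytics",
    "Encrypted": True,                              # at-rest encryption
    "ClusterParameterGroupName": "hipaa-require-ssl",
}

# The associated parameter group sets require_ssl so the cluster rejects
# plaintext connections.
modify_parameter_group_params = {
    "ParameterGroupName": "hipaa-require-ssl",
    "Parameters": [
        {"ParameterName": "require_ssl", "ParameterValue": "true"},
    ],
}

# In practice these would be passed to a boto3 Redshift client:
#   rs = boto3.client("redshift")
#   rs.modify_cluster_parameter_group(**modify_parameter_group_params)
#   rs.create_cluster(**create_cluster_params)
```

Clients then connect with SSL enabled (for example, `sslmode=verify-full` in a PostgreSQL-compatible driver) and verify the Redshift server certificate.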