By: Kevin Cox, AWS SSA, CCSK, Lead Consultant, Impact Makers
Why would providers, payers, and consumers share Healthcare data?
Healthcare as an industry has been slow to implement data sharing across payers, providers, and patients. Impediments to data sharing have traditionally included regulations (e.g., HIPAA) as well as the use of closed system architectures. However, current industry forces are changing that dynamic. Those forces include:
- The industry adoption of healthcare based on patient value
- Data sharing mandates from the U.S. Centers for Medicare & Medicaid Services (CMS)
- An increased emphasis on business process efficiency
- The ability to leverage cloud capabilities
Value-based care – the healthcare delivery model that compensates healthcare providers based on patient health outcomes – surfaces multiple use cases for sharing data, or interoperability. Patient health outcomes must be measured and become the basis for payment and reimbursement. Physicians, insurance companies, and individual patients all require holistic views of a patient’s medical history in order to deliver and receive optimized healthcare services.
Secure and standardized data sharing creates enormous advantages for healthcare patients, providers, and payers. When automated data transfers replace manual, paper-based data sharing processes, dramatic efficiency gains are realized. Equally important, this same automation greatly improves the visibility and usability of the data for healthcare consumers. Consumer demand for mobile application capabilities has effectively transformed other industries like ecommerce, real estate, entertainment, and delivery services. Firms that were slow to see and embrace this trend struggled as a result. A similar consumer demand transition is taking place in healthcare today, and COVID-19 has increased the velocity of change. Consider the rapid expansion of telemedicine beginning in the spring of 2020 versus the slow rollout before COVID-19. Payers and providers that offer rich mobile capabilities to patients are gaining clear advantages in the healthcare marketplace.
In 2020, the U.S. Centers for Medicare & Medicaid Services (CMS) issued their Interoperability and Patient Access final rule. The Patient Access API and Provider Directory API policies for MA, Medicaid, and CHIP take effect January 1, 2021, but CMS will exercise enforcement discretion and will not enforce these new requirements until July 1, 2021. In practice, this means a compliant API must be running for these items by July 2021. The requirement is to implement and maintain a secure, standards-based API using Health Level 7® (HL7) Fast Healthcare Interoperability Resources® (FHIR) Release 4.0.1.
CMS.gov – Interoperability and Patient Access Final Rule
Healthcare entities are no different from any other business: there is a need to be efficient and competitive. With improved efficiency, patient care can be delivered with more proficiency, less effort is required to collect payments, and less time is wasted on outdated processes. Healthcare has many manual processes based on paper forms, emailing and faxing, and time spent on telephone calls. Streamlining these methods would add clear value, and the method for doing so is data sharing via APIs. APIs are an excellent service to leverage from a cloud provider.
The use of cloud providers has huge advantages for any business, including healthcare. Cloud providers deliver HIPAA-compliant services, sign Business Associate Agreements (BAAs), and offer configuration options that often exceed the capabilities of on-premises systems. Cloud delivers benefits in many areas, such as cost savings, increased security, flexibility, serverless options, redundancy, disaster recovery, economy of scale, innovative services, and geographically distributed services. Impact Makers has developed a cloud overview series here: MOVING TO THE CLOUD – A 2 PART EDUCATIONAL SERIES
Has your organization set up an API that meets these requirements? Many organizations have set up APIs only to discover that the real problem behind the new requirement is data quality issues in the back-end data silos.
The cumbersome methods of the past and a way forward
Healthcare has many manual processes based on documents, paper forms, emailing and faxing, and time spent on telephone calls. This is cumbersome for the organization, the customer, and partners. In recent years, many healthcare organizations set up data exchanges for specific uses to avoid manual document processes. This was a step forward. Many of these data exchanges were set up as point-to-point jobs with data drops and pickups, carrying older ETL (extract, transform, load) output from the data warehouse to partners. The data was put into a flat file format, uploaded to a partner’s SFTP site, and imported into a data silo. Any change in delivery time, passwords, data format, data type, or size caused problems. This was a serial process, not a real-time data exchange.
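To make the brittleness concrete, here is a minimal sketch of the legacy flat-file export step. The field names, delimiter, and claim values are illustrative assumptions, not a real exchange layout; the point is that every detail was a manually agreed contract.

```python
import csv
import io

# Illustrative legacy export: claim records flattened into a pipe-delimited
# file. FIELDS and the delimiter are hypothetical examples; in a real
# exchange every one of these details was a brittle, manually agreed contract.
FIELDS = ["member_id", "claim_id", "service_date", "amount"]

def export_flat_file(claims):
    """Serialize claim dicts into the agreed pipe-delimited flat-file format."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS, delimiter="|")
    writer.writeheader()
    for claim in claims:
        writer.writerow({f: claim.get(f, "") for f in FIELDS})
    return buf.getvalue()

# The resulting file would then be encrypted and uploaded to the partner's
# SFTP server on a fixed schedule; any change to FIELDS, the delimiter, or
# the schedule broke the partner's import job.
claims = [{"member_id": "M001", "claim_id": "C100",
           "service_date": "2021-01-15", "amount": "125.00"}]
flat_file = export_flat_file(claims)
```

Note that nothing in this flow is self-describing: the consumer must already know the column order and delimiter out of band, which is exactly what an API contract replaces.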
Think of the older method of data exchanges and document exchanges as a Cold War-era spy drop. The spy (or client) had to give signals to the handler (the server) before dropping off the data at a specific time (job schedule) in a specific format (data encrypted with manually shared TLS certs) in a specific place (the one SFTP server) using a specific method (upload only). Even spies tried to avoid fax machines!
What happens if the data format changes, the encryption needs to be renegotiated, file space is low, the client needs real time downloads, the SFTP server is down, or the upload job is late? If any of these things happen, that spy drop will not happen correctly and there will be a lot of wasted effort!
Spy data drop package
Newer ways to efficiently share Healthcare data
Application programming interfaces (APIs) are the preferred means to exchange data in the cloud. A typical API can resolve many of the headaches of old-fashioned file transfers. An API is an interface between systems that uses HTTP to obtain data and perform operations on that data in standard formats such as XML and JSON. This method aligns with the CMS requirement for HL7 FHIR formatting. An API is real time, can adjust for data format changes, does not depend on scheduled jobs, can communicate bi-directionally, includes modern security capabilities, can scale, can cache, and acts as an independent and secure interface to data.
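For a sense of what the FHIR R4 JSON named above looks like in practice, here is a minimal Patient resource sketch. The identifier system and all values are illustrative placeholders, not real data or a specific payer's conventions.

```python
import json

# A minimal FHIR R4 Patient resource expressed as a Python dict.
# The identifier system URN and all values are illustrative placeholders.
patient = {
    "resourceType": "Patient",
    "id": "example-001",
    "identifier": [{
        "system": "urn:example:member-id",  # hypothetical identifier system
        "value": "M001",
    }],
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1980-04-12",
}

# A Patient Access API would return this resource as JSON over HTTPS,
# for example in response to GET /Patient/example-001.
body = json.dumps(patient)
parsed = json.loads(body)
```

Because the resource is self-describing (`resourceType`, standard field names, ISO dates), any FHIR-aware client can consume it without the out-of-band format agreements that flat-file exchanges required.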
Sample diagram of an API functionality for Healthcare
The use of an API for sharing data is the way to be more efficient, lower IT costs, and meet CMS requirements. An API does not, however, address underlying data quality. If your back-end EHR or payer system has data inconsistencies, they will propagate through the API. The data may need to be mapped or transformed into a shareable format for the API. Data experts with experience in governance, data mapping, data organization, and data movement will be needed to prepare the data for API usage.
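One way to picture that mapping step: a sketch of a transform from a legacy back-end record into FHIR Patient fields. The legacy field names (`MBR_ID`, `MBR_NAME`, `DOB`) and their formats are assumptions for illustration only; real source layouts vary by system.

```python
# Hypothetical mapping from a legacy back-end record (flat key/value fields,
# names and formats assumed for illustration) into a FHIR R4 Patient resource.
def legacy_to_fhir_patient(record):
    # Legacy systems often store "LAST, FIRST" in one field and dates as
    # MMDDYYYY; both need normalizing before the data is shareable via API.
    last, first = [part.strip() for part in record["MBR_NAME"].split(",")]
    d = record["DOB"]  # e.g. "04121980" (MMDDYYYY)
    birth_date = f"{d[4:]}-{d[0:2]}-{d[2:4]}"  # normalize to YYYY-MM-DD
    return {
        "resourceType": "Patient",
        "identifier": [{"system": "urn:example:member-id",  # placeholder
                        "value": record["MBR_ID"]}],
        "name": [{"family": last, "given": [first]}],
        "birthDate": birth_date,
    }

fhir_patient = legacy_to_fhir_patient(
    {"MBR_ID": "M001", "MBR_NAME": "DOE, JANE", "DOB": "04121980"})
```

Even this tiny transform assumes the source data is consistent; a record with a missing comma or a differently formatted date would fail, which is why the governance and data-quality work comes first.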
Ask yourself these questions:
- Do I trust my data in back-end systems for sharing via API?
- Do I know where the data is accurate or inaccurate?
- Are there differences in data quality between silos or back-end systems?
- Does my data easily map to the CMS API requirements?
Data quality is often the biggest portion of setting up the mandatory API functionality.
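The questions above lend themselves to automated profiling before any API goes live. Here is a minimal sketch of such a check; the field names and rules are illustrative assumptions, not a complete quality framework.

```python
import re

# A minimal data-quality profile run over member records before exposing
# them through an API. Field names and rules are illustrative assumptions.
def profile_records(records):
    """Count records failing simple completeness and format checks."""
    issues = {"missing_member_id": 0, "bad_birth_date": 0}
    date_ok = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # expect YYYY-MM-DD
    for rec in records:
        if not rec.get("member_id"):
            issues["missing_member_id"] += 1
        if not date_ok.match(rec.get("birth_date", "")):
            issues["bad_birth_date"] += 1
    return issues

report = profile_records([
    {"member_id": "M001", "birth_date": "1980-04-12"},   # clean record
    {"member_id": "", "birth_date": "04/12/1980"},       # fails both checks
])
```

Running checks like these per silo makes the differences between back-end systems visible and turns "do I trust my data?" into a measurable answer.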
Continued information on Healthcare Data Interoperability
This series continues with additional posts on API usage and Data Quality for healthcare interoperability. In the API post, we will walk through an example in AWS and show the power of using an API as an intermediary platform between data and requests. In the data quality post, we will review some of the common data quality issues encountered with an API approach. The links will be updated here so you can continue to gain insight into this new method of exchanging healthcare data.
About Impact Makers
Impact Makers is an AWS Advanced Consulting Partner with a specialty in data. We leverage comprehensive and mature data practices to enable customers to take advantage of their data.
Every project has unique elements that must be incorporated into a comprehensive strategy, in addition to the identification and execution of technical work. As an AWS Advanced Consulting Partner, we recognize the importance of secure, reliable, and flexible data sharing. Our comprehensive framework includes the AWS Well-Architected Framework and industry best practices, along with elements like compliance, asset and metadata management, business strategy alignment, service portfolio management, support model definition, service design and deployment, CloudOps, and much more. We work with our customers to deliver and enable strategic business advantage with cloud services.
To learn more, contact us.