What components are required to set up batch data ingestion from Epic EMR to the Azure FHIR API (PaaS)?


I'm trying to create a demo that connects a new Azure FHIR API I created with Epic (EMR) to ingest data in batch mode, but I can't figure out which components/pieces are required to set up that ingestion pipeline. The examples shown in the available videos all involve pumping the data in manually; I want to achieve something like the WebJobs approach. I don't need exact code or a very detailed solution, just a list of the components and how they connect to make this ingestion pipeline work.

1 Answer

Cooper (best answer):

If you are looking for Epic-specific support, I'd recommend reaching out to [email protected] (Epic's free, public support option), joining App Orchard (Epic's paid support option), or working with the healthcare organization's IT team directly.

That said, if you are looking to extract data from Epic and load it into an Azure data warehouse (I'm reading between the lines and guessing at your use case), we've heard of several folks looking to do that. Right now, there is no good FHIR-based method of data warehouse synchronization that Epic supports. There are good non-FHIR options, though: SQL extracts or HL7v2 event-driven messaging. If you are using the Azure FHIR storage option, you'll need a data transform step to convert that extracted data into FHIR resources, but that is a far better option than trying to force an inferior exchange paradigm.
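
To make that "extract plus transform" path concrete, here is a minimal Python sketch of the batch-mode shape the asker described: read rows from a flat file produced by an Epic SQL extract, map each row to a FHIR Patient resource, and POST them to the Azure API for FHIR as a single transaction Bundle. Everything specific here is an assumption for illustration: the extract filename and its columns (mrn, family, given, birth_date), the FHIR service URL, and the Azure AD tenant/client placeholders are all hypothetical and would come from your own Epic extract layout and Azure app registration.

```python
# Hypothetical sketch: batch-load an Epic SQL/CSV extract into Azure API for FHIR.
# URLs, credentials, and column names are placeholders, not real values.
import csv
import requests

FHIR_URL = "https://myfhirservice.azurehealthcareapis.com"  # placeholder service URL
TOKEN_URL = "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token"  # placeholder tenant

def get_token() -> str:
    """Azure AD client-credentials flow; the FHIR service itself is the audience."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": "<client-id>",          # placeholder app registration
        "client_secret": "<client-secret>",  # placeholder secret
        "scope": f"{FHIR_URL}/.default",
    })
    resp.raise_for_status()
    return resp.json()["access_token"]

def row_to_patient(row: dict) -> dict:
    """Map one extract row (assumed columns: mrn, family, given, birth_date) to a FHIR Patient."""
    return {
        "resourceType": "Patient",
        "identifier": [{"system": "urn:oid:example-mrn", "value": row["mrn"]}],
        "name": [{"family": row["family"], "given": [row["given"]]}],
        "birthDate": row["birth_date"],
    }

def build_bundle(patients: list) -> dict:
    """Wrap the resources in a transaction Bundle so the batch commits in one request."""
    return {
        "resourceType": "Bundle",
        "type": "transaction",
        "entry": [
            {"resource": p, "request": {"method": "POST", "url": "Patient"}}
            for p in patients
        ],
    }

def main() -> None:
    with open("epic_patient_extract.csv", newline="") as f:  # hypothetical extract file
        patients = [row_to_patient(row) for row in csv.DictReader(f)]
    token = get_token()
    resp = requests.post(
        FHIR_URL,  # transaction Bundles are POSTed to the FHIR service root
        json=build_bundle(patients),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/fhir+json"},
    )
    resp.raise_for_status()
    print(f"Loaded {len(patients)} patients; server returned {resp.status_code}")

if __name__ == "__main__":
    main()
```

A script like this is exactly the kind of thing you'd run on a schedule as an Azure WebJob (or a Data Factory/Functions activity): the extract lands in a known location, the job transforms it, and the FHIR service receives the batch. The HL7v2 route would replace the CSV-reading step with a message listener, but the transform-and-POST half stays the same.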