Vendor Provided Data

HealthData Talks | Episode 22 | October 20, 2023 | 00:14:07

Show Notes

In this episode of HealthData Talks, host Shannon Larkin and Alli Hysell, Manager of Data Procurement and Delivery at Harmony Healthcare IT, discuss managing incoming and outgoing data for conversions and archives.


Episode Transcript

[00:00:02] Speaker B: Welcome to HealthData Talks, where industry experts offer bite-sized tips and trends for managing legacy data.

[00:00:12] Speaker C: Thank you for joining us. I'm Shannon Larkin with Harmony Healthcare IT. At Harmony, our specialty is working with legacy data. We've been focusing on that in healthcare for the past 17 years. And the thing is, legacy data isn't always the easiest thing to get. There's aging, out-of-production software and servers that can sometimes pose real challenges for technicians. So that's why we've asked Alli Hysell, our Systems Analyst Manager at Harmony, to join us today. So, Alli, thanks for being here.

[00:00:45] Speaker A: Thanks, Shannon. Happy to join today.

[00:00:48] Speaker C: So, I know you're the right person to talk about legacy data acquisition, but to get us started, why don't you give our audience some background on yourself, your role, and your team?

[00:00:59] Speaker A: Yeah, absolutely. So I'm a local here in South Bend. I live here with my husband and our son, who will be two at the end of this month. I've been here at Harmony for a little over five years, and I oversee the Systems Analyst team, which is primarily responsible for all incoming and outgoing data. Our team primarily coordinates with our clients and their vendors on expected deliverables, which includes the format of their data, the timing, and confirmation of overall alignment with the scope.

[00:01:31] Speaker C: Okay, so this is definitely a critical role. It's like the starting point for bringing data into our environment so it can be transformed, converted, stored, right? To fulfill whatever scope of work we've contracted for. Tell us how this works most often. Is your team extracting the data directly from a legacy system? Is the data provided to us from that legacy vendor? Just kind of talk about how we get started.

[00:02:01] Speaker A: Yeah, absolutely.
[00:02:02] Speaker A: So sometimes we are contracted to do the legacy extraction, but primarily we see that we are working with the legacy vendor to provide us the data. And data is typically delivered to us one of three ways, all of which are equally secure. That transfer method mostly depends on the total size of the data that needs to be moved. For any data that is less than 1 TB, we typically recommend utilizing SFTP, which is Secure File Transfer Protocol, typically utilizing an application such as FileZilla, WinSCP, or GoAnywhere. For any data totaling larger than a terabyte, we typically recommend sending an external encrypted USB hard drive.

[00:02:50] Speaker C: Okay, let me stop you right there, because I know Harmony works with some of the largest health systems in the nation, so I would imagine that we're seeing data sizes in excess of a terabyte quite a bit. So can you comment on typical data sizes, maybe, and then maybe expand on the hard drive process a little bit?

[00:03:09] Speaker D: Yeah, absolutely.

[00:03:10] Speaker A: So our data sizes vary to astronomical extremes. Typically, practice management systems we can see in megabytes to gigabytes; those systems are typically on the smaller side. Whereas our acute clinical systems for larger hospitals, those are where we start to see larger sizes, closer to the terabytes. So those could be 10, 50, even up to 200 terabytes worth of data and external images. In terms of our hard drives, we typically utilize those for our larger projects, so our larger terabyte transfers for our data sets. Our go-to brand is Apricorn. Some of the key features for that hard drive include 256-bit AES-XTS hardware encryption, which basically just equates to two layers of added encryption. You are also provided a PIN to unlock the device utilizing an external keypad on the drive.
And one additional security feature includes an unattended auto-lock setting, which locks the drive after a period of inactivity for added security.

[00:04:26] Speaker C: So, just out of curiosity, how are those hard drives shipped? It sounds like the device itself is very secure, but I'm just wondering how the precious protected health information cargo makes its way from the client site to our physical site securely.

[00:04:46] Speaker D: Yeah, so, absolutely.

[00:04:47] Speaker A: So we ship our hard drives in a secure case with a custom insert for each drive and its cabling. We recently made the change in our shipping to ensure the security and durability of our transporting of the physical hardware, but it also helps us reduce our carbon footprint.

[00:05:04] Speaker C: That's interesting. So, I know I got us off track here with the hard drive, but I find that kind of interesting. So you originally mentioned three ways that data gets delivered. What's our last method, then?

[00:05:19] Speaker A: So, our last method is our cloud storage transfer. This is where our clients utilize their existing cloud storage, such as AWS, Azure, or GCP, which is Google Cloud Platform, where they are able to provide us connection credentials to allow us to securely download their data utilizing that connection information. And that's where we're able to securely transfer their data or external images.

[00:05:48] Speaker C: So this method seems really logical to me. Do you see this transfer method happening more often as hospitals transition to the cloud? And if so, is there one platform among the ones you mentioned, the big three, that emerges as more prevalent?

[00:06:07] Speaker D: Yeah, absolutely.

[00:06:08] Speaker A: So a few years back, we ran into cloud transfer requests maybe a few times a year. Nowadays, I see it at least twice a month.
Our larger hospital and enterprise clients are investing in cloud storage and utilizing that storage to their advantage as they look to archive their data. In terms of one cloud storage platform versus another, of the big three, I would say AWS and GCP are probably the top two I see most.

[00:06:42] Speaker C: Good, good. Okay, so we've covered the methods for data delivery. Let's say we have a client that implements a new EHR, and they're ready to archive the legacy systems that that EHR just displaced. Maybe talk to us about the process and when and how your team engages in that project.

[00:07:05] Speaker A: That's a good question. I think about it as kind of a ticking clock. It starts pretty much as soon as the client engages their vendor for a data extraction, and the systems analyst team is often engaged along the way to ensure that the expected extraction deliverable aligns with the appropriate format. So we often help kind of navigate through any technical conversations with the vendor and the client. And this can be anywhere between a few weeks to several months. It really just varies from vendor to vendor.

[00:07:38] Speaker C: So what kinds of things might make that extend from weeks to months?

[00:07:44] Speaker A: That's a good question. So a few things. I think most commonly it's going to depend on the size of the data, so the total size of the discrete data that you're extracting and the total size of your external images. If it's a few megabytes to gigabytes, it may take that four to six weeks. If you're looking at a larger acute clinical system where we're looking at terabytes worth of data, that might run a couple more months, and in some cases it might take three to six months.
You might also need to look at the database platform and size; that will also play a factor, as it determines the overall format of the deliverable, which in turn plays a role in the length of time it's going to take your vendor to actually extract that data. And I would say the most challenging is when the client's data is held in a multi-tenant, vendor-hosted database. That's usually where we see extraction times take the longest, because the vendor often has to perform their work after hours so it does not impact their other tenants.

[00:09:00] Speaker C: So I'm kind of surprised at the time frames. And there is an information blocking provision in the 21st Century Cures Act, right? So even if some of these factors you're mentioning might slow the data acquisition process down, you're ultimately always going to be able to get at the data, right? Because the legacy vendors just have to provide access to the data.

[00:09:26] Speaker A: Yeah, so in some cases we're not always able to have direct access to the data itself, but we're always able to kind of navigate our way around it. In some cases there are what we call proprietary files delivered. In this case, all vendors have different files or versions of an encrypted file which is unique to them. Ultimately those are files that we're not able to work with unless they're unlocked or decrypted. For example, we often work with eClinicalWorks, where they have encrypted progress notes. And if a vendor is unable to provide this XML, which is a type of document, in an unencrypted format, the alternative is to engage our cross-functional team, which is our data automation specialists, who utilize their robotic process automation tools to capture that data on the front end of the application, mimicking the actions of a front-end user.
So ultimately, if a vendor is not able to deliver the data to us or provide us access to the back end, we definitely have the tools to kind of work our way around it.

[00:10:42] Speaker C: So Harmony archives hundreds and hundreds of legacy systems each year. How would you characterize the percentage of data extractions that go, let's say, smoothly versus not smoothly?

[00:10:58] Speaker A: In general, my team works effectively on their day-to-day items. I think the big thing is being proactively engaged to plan and emphasize the importance of communication. A typical day for a systems analyst could include chatting with a client about getting credentialing information for a vendor's SFTP site to transfer the data, or working on a troubleshooting call with a vendor to review an index file that maybe was delivered without header information. So despite some of these challenges, we're always working with our clients on a path forward. At the start of all of our projects, Harmony conducts a technical key decisions and planning call right after the kickoff. At that session, you'll meet one on one with our systems analyst, and we'll confirm the optimal data transfer method and kind of map out the when, where, and who on the data delivery. And I always highly encourage all of our clients to invite their vendor to that meeting.

[00:12:02] Speaker C: That makes sense. What kind of technical background do the systems analysts on your team have? Like, what would your ideal candidate look like if you were hiring your next systems analyst?

[00:12:16] Speaker A: I like this question. So for systems analysts, I often have a mixed bag of experience. I think the ideal candidate has experience with database management and overall data extraction, but can also kind of put on their general IT admin hat and dive into some troubleshooting with an LDAP issue.
We're often dealing with a wide variety of different database platforms, such as DB2, Sybase, Oracle, Postgres, and FoxPro, and that's just to name a few. So this sometimes requires the SA to act as a technical liaison with our project management team, which is crucial when translating technical challenges to our clients for just general next steps.

[00:13:05] Speaker C: Crucial, for sure. I think Harmony is pretty lucky to have you at the helm to manage that team and just the data acquisition process overall, so that the data specialists at Harmony can complete the data conversions and the archives that we're contracting for, and that just keeps patient and employee information safe the entire time. So, Alli, I'm so glad you were able to carve out some time and join us today. Thank you very much.

[00:13:34] Speaker A: Yeah, thank you so much for having me today.

[00:13:37] Speaker C: Yep. I think we learned a lot about the data extraction process and how we're handling it. So to our audience, thank you for joining us, and be sure to tune in next time for another discussion about legacy data in healthcare.

[00:13:52] Speaker B: That's it for this session of HealthData Talks. Check out helpful resources at harmonyhit.com and follow us in your favorite podcast app to catch future episodes. We'll see you next time.
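The size-based rule of thumb described in the episode (under a terabyte, use SFTP; over a terabyte, ship an encrypted external drive) can be sketched as a small helper. This is an illustrative sketch only; the threshold, labels, and function name are assumptions for this example, not Harmony's actual tooling.

```python
# Sketch of the size-based transfer-method choice discussed in the episode.
# The 1 TB threshold and return labels are illustrative assumptions.

ONE_TB = 1_000_000_000_000  # 1 terabyte in bytes (decimal convention)

def recommend_transfer_method(total_bytes: int) -> str:
    """Return a recommended delivery method for a legacy data set."""
    if total_bytes < ONE_TB:
        return "SFTP"  # e.g. via FileZilla, WinSCP, or GoAnywhere
    return "encrypted USB hard drive"  # e.g. an Apricorn drive

print(recommend_transfer_method(50_000_000_000))  # a 50 GB practice-management system
print(recommend_transfer_method(200 * ONE_TB))    # a 200 TB acute clinical system
```

Running the two calls above prints `SFTP` and `encrypted USB hard drive`, matching the practice-management versus acute-clinical split described in the conversation.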
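The cloud storage transfer described in the episode could look something like the following sketch, assuming the client shares read-only credentials to an AWS S3 bucket. The bucket, prefix, and function names are hypothetical; an Azure or GCP transfer would use those platforms' own SDKs analogously, and the download step requires the third-party `boto3` package.

```python
# Hypothetical sketch of a cloud-storage pull from client-provided S3 credentials.
from pathlib import Path

def total_listing_bytes(listing: list[dict]) -> int:
    """Sum object sizes from a bucket listing, e.g. to pick a transfer plan up front."""
    return sum(obj["Size"] for obj in listing)

def download_prefix(bucket: str, prefix: str, dest: str) -> None:
    """Download every object under a prefix (requires the third-party boto3 package)."""
    import boto3  # imported lazily so the sizing helper above has no dependency
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            target = Path(dest) / obj["Key"]
            target.parent.mkdir(parents=True, exist_ok=True)
            s3.download_file(bucket, obj["Key"], str(target))
```

A caller with valid credentials configured might run `download_prefix("client-archive-bucket", "legacy-ehr/", "/data/incoming")`; those names are placeholders, not real endpoints.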
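One troubleshooting scenario mentioned above is an index file delivered without header information. A first-pass check for that can be sketched with Python's standard-library `csv.Sniffer`; its header heuristic is imperfect, so a real review would still involve eyeballing the file, and the column names below are invented for illustration.

```python
# Sketch: flag an index file whose first row looks like data rather than column names.
import csv

def looks_headerless(sample_text: str) -> bool:
    """Return True when csv.Sniffer judges the first row to be data, not a header."""
    return not csv.Sniffer().has_header(sample_text)

with_header = (
    "patient_id,doc_path,doc_date\n"
    "1001,notes/1001.pdf,2019-04-02\n"
    "1002,notes/1002.pdf,2020-11-17\n"
)
no_header = (
    "1001,notes/1001.pdf,2019-04-02\n"
    "1002,notes/1002.pdf,2020-11-17\n"
)

print(looks_headerless(with_header))  # False: first row differs in type from the data
print(looks_headerless(no_header))    # True: first row matches the data rows
```

In practice this would run against only the first few kilobytes of a delivered index file before opening a troubleshooting call with the vendor.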

Other Episodes

Episode 10

August 09, 2022 00:09:21

Use Cases for Historical Records in the ER

In this episode, Shannon Larkin and Dr. Mark Kricheff, Emergency Physician and Clinical Informaticist of Saint Joseph Health System, cover use cases and examples...


Episode 19

June 13, 2023 00:13:23

Security Best Practices

In this episode of HealthData Talks, Shannon Larkin and Nick Cardwell, Director of Cyber Security at Harmony Healthcare IT, cover security best practices for...


Episode 26

August 23, 2024 00:18:15

ONC Reorganization & HTI-2
