Data Upload Quality Control
Hi,
We recently implemented a new Epic application and have begun to notice a negative perception of both the application and Lawson, stemming from information discrepancies between the two systems.
We are hoping to learn about data quality management techniques and how other facilities upload and update information in Lawson.
We currently use the Lawson Add-Ins feature in Excel to upload new information. The upload process itself has worked well for us, but we have concerns about efficiently identifying duplicate information (e.g., descriptions) and ensuring that what we load is truly what we need to load.
We use GHX NuVia as a data cleanser, and it has helped with assigning noun types and UNSPSC or HCPCS codes. However, with a recent batch of description updates that NuVia suggested, we ended up loading the same description for multiple items.
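To make the duplicate problem concrete, below is a rough sketch of the kind of pre-load check we have been picturing: scan the upload workbook for repeated descriptions before running the Add-Ins job. This is just an illustration in Python/pandas; the file name and column names (ItemNumber, Description) are placeholders, not our actual template.

```
import pandas as pd

# Load the upload workbook before running the Lawson Add-Ins job.
# "item_upload.xlsx" and the column names are placeholders -- substitute
# whatever your upload template actually uses.
df = pd.read_excel("item_upload.xlsx")

# Normalize descriptions so trivial differences (case, extra spaces)
# don't hide true duplicates.
df["desc_norm"] = (
    df["Description"]
    .astype(str)
    .str.strip()
    .str.upper()
    .str.replace(r"\s+", " ", regex=True)
)

# Flag every row whose normalized description appears more than once.
dupes = df[df.duplicated(subset="desc_norm", keep=False)]

if dupes.empty:
    print("No duplicate descriptions found; file is clear to load.")
else:
    print(f"{len(dupes)} rows share a description with another row:")
    print(dupes.sort_values("desc_norm")[["ItemNumber", "Description"]])
    # Write the flagged rows out for review before loading.
    dupes.to_excel("duplicate_descriptions_review.xlsx", index=False)
```

The same check could also compare the upload file against an export of the existing item master, so duplicates against items already in Lawson are caught too.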
Any feedback on pre-load data quality checks or post-load audit processes would be greatly appreciated.
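On the post-load side, here is a similar sketch of the audit we are imagining: reconcile the upload file against an export of the item master to confirm every row actually landed and that descriptions match. It assumes the item master can be exported to CSV; again, all file and column names are hypothetical.

```
import pandas as pd

# Post-load audit sketch: reconcile the upload file against an export
# of the Lawson item master. File and column names are assumptions.
loaded = pd.read_excel("item_upload.xlsx")
master = pd.read_csv("item_master_export.csv")

# Left-join the upload rows to the item master on item number;
# rows with no match did not make it into Lawson.
check = loaded.merge(
    master[["ItemNumber", "Description"]],
    on="ItemNumber",
    how="left",
    suffixes=("_loaded", "_master"),
    indicator=True,
)

# Items we tried to load that are absent from the item master.
missing = check[check["_merge"] == "left_only"]

# Items that loaded but carry a different description than we sent
# (compared case-insensitively, ignoring surrounding whitespace).
mismatched = check[
    (check["_merge"] == "both")
    & (check["Description_loaded"].str.strip().str.upper()
       != check["Description_master"].str.strip().str.upper())
]

print(f"{len(missing)} uploaded items not found in the item master")
print(f"{len(mismatched)} items loaded with a different description")
```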
Thanks