Talent Management Data/Configuration/Security Refresh
Legacy Contributor
I know this might not be the best place to ask the question, but I thought I'd give it a try. Does anybody out there do a data/configuration/security refresh in Talent Management (Landmark) that works? If so, do you do it the "Lawson way", or do you have a better option such as a SQL restore and some Lawson command hybrid? We've found a way to do S3 but are struggling with LTM.
ERS Process Improvements - September 2020.pdf
Comments
Legacy Contributor
Hi Carol,
We do both S3 and TM data refreshes on a regular basis. S3 is much simpler than TM, since SQL can be used for S3 but not for TM. I'll share a few details, but Infor Managed Services does the grunt work on these requests, so I don't really know the nuts and bolts.
First of all, as far as I know you have to use the "Lawson Way" in TM because of all the UniqueID joins and other things that the application builds as the data is loaded. Basically, they have to use Infor utils to dump all the data from PROD into the related export files, then they truncate all the tables on the destination environment and use the Infor utils to load the data.
Note that this takes a very long time - we have around 30,000 employee records (current and historical), and it takes close to 30 hours to do the "load" into the destination system, while S3 takes 2-3 hours. We typically use existing backup files as the source.
WARNING - one of the things we learned the hard way is that the data refresh loads everything, including in-process actions, future changes, etc. This sounds OK until you realize that any in-process actions/flows that have email notifications (like Action reminders or Timeout notifications), or any future-dated email changes, will cause emails to be sent from DEV using PROD destination addresses. (Oops!)
Because of this, we have worked with Infor Managed Services to build a "post refresh" process (sketched below) that includes:
- truncating some tables (workunit, actionrequest, and several other related tables)
- truncating and then restoring some tables from the destination backup (PfiConfiguration settings, PfiService, PfiServiceFlowDefinition, PfiFlowDefinition, PfiFlowVersion, CHP, ConfigurationProperty, etc.)
- SQL-updating "current" email addresses (EmployeeContact and Actor) to a single generic address that points to a secure public folder for testing
- running a script that resets any email addresses that are set for future changes
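For illustration only, here is a minimal SQL sketch of that post-refresh cleanup, using the table names mentioned above. The column names, the generic test address, and the (much longer) real truncate list are assumptions about your own environment, so treat this as a starting point rather than the supported procedure:

-- Run against the destination (DEV/test) database after the data load.
-- Drop in-process work so no PROD-era flows can trigger notifications.
TRUNCATE TABLE WorkUnit;
TRUNCATE TABLE ActionRequest;
-- ...plus the other related tables on your truncate list.

-- Point all "current" email addresses at one monitored test mailbox.
-- (EmailAddress is an assumed column name; check your schema.)
UPDATE EmployeeContact
   SET EmailAddress = 'ltm-test@example.com'
 WHERE EmailAddress IS NOT NULL;

UPDATE Actor
   SET EmailAddress = 'ltm-test@example.com'
 WHERE EmailAddress IS NOT NULL;

-- Future-dated address changes still need the separate reset script, and
-- the Pfi* configuration tables are restored from the destination backup.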
Note that this is NOT supported on the Global Support side - they feel that doing anything outside of the system will cause problems. However, there really isn't an alternative because of the downstream impacts.
Unfortunately, I don't have complete details of these post-refresh tasks since we aren't doing it in-house. You'll really have to build it as you go, based on your own business requirements.
Legacy Contributor
Kelly, my main issue is with LTM and its data refresh. We have the "non-Lawson way" for S3 and are now focusing on LTM. It's taking a minimum of 12 hours to do a dbexport on our production system, which gets us out of sync with S3 when we want a "snapshot" of both systems at the same time. Using a dbexport of LTM but a Redgate backup for S3 just isn't working for us. Hope this makes sense. I'm looking for a much more practical way to copy LTM.
Legacy Contributor
I understand your frustration. Unfortunately, there aren't any alternatives that we have found. DB dump/load hasn't worked for us, and we've been told it is a complete no-no because of the way Landmark and the Landmark Applications work.
In our case, we do the exports overnight on a weekend using the "Lawson way", so there is less activity on the system and less mismatch, but we always have some. This drives some of us up the wall (including me), but the mismatch is really a tiny percentage of the data.
aszeglin
How big is your dbexport of the LTM data area? We've managed to clean up some historical data and slim our export down by quite a bit.
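If it helps to compare, one quick way to see where the bulk of a data area lives is to rank tables by allocated size; the biggest tables are usually the best candidates for history cleanup before an export. The sketch below assumes a SQL Server back end (on Oracle, DBA_SEGMENTS gives the same answer):

-- Approximate per-table size, largest first (SQL Server catalog views).
SELECT TOP (20)
       t.name                        AS table_name,
       SUM(a.total_pages) * 8 / 1024 AS size_mb
FROM sys.tables t
JOIN sys.partitions p       ON p.object_id = t.object_id
JOIN sys.allocation_units a ON a.container_id = p.partition_id
GROUP BY t.name
ORDER BY size_mb DESC;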
Legacy Contributor
251 GB
0911060553270289.pdf
Legacy Contributor
We use Oracle RMAN to restore our LTM data from production to test environments. We capture certain tables (test system/server-specific configuration information) before the production data comes over, and then revert that data back (see the sketch below). We also truncate things like work units (which have captured production data like e-mail addresses). The entire process takes us about 4 hours (for LTM data), and using RMAN allows us to set the data to an exact date and time on both S3 and LTM.
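As a rough illustration of that capture-and-revert step, SQL along these lines can bracket the RMAN restore. The stage schema is a placeholder, and the table names are just the ones mentioned in this thread; substitute your own list:

-- Before the restore: stash the test system's own configuration rows.
CREATE TABLE stage.configprop_keep AS
SELECT * FROM ConfigurationProperty;

-- ...RMAN restores/recovers the database to the chosen point in time...

-- After the restore: put the test system's own settings back.
TRUNCATE TABLE ConfigurationProperty;
INSERT INTO ConfigurationProperty
SELECT * FROM stage.configprop_keep;

-- And truncate anything that captured production data, e.g. work units.
TRUNCATE TABLE WorkUnit;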
Legacy Contributor
Do you have a list of tables that you save & replace after the copy?
Do you have a list of tables that you truncate?
So you do not copy the encryption files from production to test?
We have been successful copying data in the past, but it is taking longer and longer.
Legacy Contributor
Here is the document that Infor Managed Services uses for our Data Refresh tasks. It includes the exported files, the post-refresh truncated and restored files, and the other tasks they perform (e.g. resetting email addresses).
Note that we are still finding tables that we need to add - particularly to the truncate list. For example, we recently realized that we need to add all the EmailMessage* files, which aren't on this list, and we currently aren't resetting Candidate email addresses as we are with Employee and Actor.
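If it's useful, a quick catalog query (SQL Server assumed; on Oracle, query ALL_TABLES instead) is one way to make sure every EmailMessage* table actually lands on the truncate list:

-- List every EmailMessage* table so none are missed.
SELECT name
FROM sys.tables
WHERE name LIKE 'EmailMessage%'
ORDER BY name;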
I hope this is helpful. Kelly
Legacy Contributor
Thank you Kelly. Very helpful.