Hello... I'm just a person who knows nothing about this. I don't understand the initial setup for using LN Analytics at all. Isn't the data entered in LN automatically uploaded to the Data Lake? Do I need to manually connect LN data to the Data Lake, and then again to Analytics or Birst? If so, how do I connect them? I really need help... I just want to go home.
You will need to complete the setup in LN (publish data), in ION, and in Birst (Orchestration jobs) to make sure all the data load processes are running.
You can find the steps explained in the LN_Analytics_for_Birst_Compass_Administration_Guide (Document code: lna_ce_lnaag__en-us)
How should I proceed? I read the guide, but I still don't understand. Should I create a flow in DataLake Flow? Would it work if I set it up as LN Application -> DB? But since I'm using the cloud version of LN, I don't have access to the DB port... This might sound like a silly question, but I really feel lost. Could you please explain?
The setup for LN Analytics is quite complex. Please search for KB2026400 in the Infor Customer Portal (the same place where you can open Cases, accessible from Concierge). That KB is your reference point; it contains all the material you need.
First of all, in LN, under Tools -> Integration Tools -> Data Publishing Management -> Data Sets to Publish (ttdpm5105m000), you must have the LNA data sets; they tell LN what to send to the Data Lake. Somewhere in the LN Analytics docs they explain how to import the pre-built LNA data sets into LN. Not difficult, just delicate. The pre-built data sets should be available in the KB quoted above.
In the Data Publishing Parameters (ttdpm5130m000, same menu, a little below), use Data Lake Ingestion for both Initial Load and Changes. This is the new method of publishing to the Data Lake, and to my knowledge it doesn't need any setup in ION except the Data Catalog (= the structure of the published tables). Also, I understand that the ION Messaging Service method is going to be deprecated (in LN CE 2025.10, according to a message I got from our TRN tenant, which is still on that method). I don't know how the other parameters there work, so I suggest you leave them at their defaults.
In ION, you should only need to set up the Data Catalog. In the past, with the ION Messaging Service method, you had to update (or create; in my case it was already there, and I don't know what Infor delivers by default) a specific Data Flow called something like "LN to DL", with just an Application Activity and an Ingest to Data Lake Activity. You had to include the needed documents for the published tables and re-activate the Data Flow. I don't remember whether any further action was required for the Data Catalog.
With the new Data Lake Ingestion method I think the procedure is different but I'm unsure of the details. You should be able to find those in the KB quoted above.
(Indeed, the Data Flow described above shouldn't be necessary anymore: in OneView, in our TRN I see lots of docs named "LN_<table>", which are used in the Data Flow, while in our PRD I see none.)
You then need to go to "Publish Data" in LN (ttdpm5205m000, same group of Sessions in the menu) and publish an Initial Load. Actually, before that you also need to publish the Technical Data, although that is something done once and for all (I think; I never needed to do it myself!).
At this point go to Data Fabric, in Atlas, and check that the data are there in the form of JSON files. Then go to Compass and try to query some small table. If all goes fine, you still need to make sure that LN is going to publish Changes regularly. There is a Session in LN, still in the same menu group, called "Publish Changes" (ttdpm5125m000), where the Data Publishers are listed with their current status. I think the right situation is exactly one publisher in status "Running", and nothing else. There should be a Job in Company 0000 named "AUTO_STARTPUBLISHERS", defined by Infor DevOps under the User "sysadmin", which is scheduled to restart the Data Publishers every hour.
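For example, a quick sanity check in Compass could look something like this (the table name is just a placeholder I made up; look up the real object names in the Data Catalog of your tenant):

    -- Placeholder table name: check the Data Catalog for the real one.
    SELECT *
    FROM tccom100
    LIMIT 10;

If that returns rows, the LN-to-Data-Lake side is basically working.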
This ends the setup from LN to the Data Lake. Next is the setup from the Data Lake to Birst. In the most recent versions of LN Analytics, some Data Lake Views are needed; check the usual KB (I never used them myself yet; see the sketch below for the general idea). Then you should run the specific LNA Workflow of your interest in Birst. There are multiple, depending on what data you want. Try the smallest one, be patient (it may take hours to complete), and see whether you get anything in LN Analytics. If yes, you just need to schedule the Workflow every, say, one day; if not, well, then you need to troubleshoot in Birst, and I'm afraid that's going to take way longer than everything I have described so far.
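Just to give an idea of the shape of such a view, it would be something like this (a made-up sketch, not one of the real LNA views; the actual definitions are delivered as SQL scripts in the KB quoted above, and the view, table, and column names here are placeholders):

    -- Made-up sketch: the real LNA view definitions come from the KB.
    -- View, table, and column names are placeholders.
    CREATE VIEW lna_example_view AS
    SELECT t_bpid, t_nama
    FROM tccom100;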
Please let me know how it goes. I will have to go again through this setup myself as soon as I find the time to upgrade to the most recent edition of LN Analytics.
Enrico
Hello, Enrico.
I thought I had completed all the setup, but the dashboard is still not visible. Do you know how to solve this?
After connecting DataLake in Connect, do I need to link each LN table individually?
Or would it work to create a view for all table JSON files? Currently, I’ve only created views for a few JSON files needed to run the SQL scripts provided.
Hi Enrico,
Can you please share some basic info with us? What environment do you use? Is it CloudSuite?
Regards,
Robert.
Well, for sure you have to do all of the setup. A single missing piece may lead to a complete failure. Troubleshooting your scenario is impossible from a chat, and I don't know whether I would be able to do it even if I could access your environment.
One thing that comes to mind is security: it is in place in LN Analytics, so your user needs access to the LN Company, and everything must be set up properly for security to work.
The only other suggestion I can give you is to proceed step by step. LN Analytics is difficult to reverse engineer, but if you follow the data flow a little, you may be able to figure out at what point your process hangs. That takes a lot of time anyway, and possibly quite a bit of experience.