I am trying to edit the SQL directly in the modeler by clicking on the SQL view, but I get the message "cannot edit in Read-only Editor". I am building a Data Lake query and would like to add some code, e.g. to sum a column, but without editing I am not able to do this. Hope someone can help. Thanks.
Does anyone know if Snowflake can connect directly to the Data Lake, or can it only connect to a Birst source extract/staging table?
For my replication from the Data Lake to a local server I do the following:
* I use an IPA to kick off my replication to the Data Lake.
* Within that IPA I loop, checking for RedAlert (the flag on the replication set stating the replication is still running).
* If the RedAlert flag is not equal to 'true', then I write a…
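The polling pattern described in the steps above can be sketched as follows. This is a minimal, hedged sketch: `get_red_alert_flag` is a hypothetical placeholder for however the IPA reads the replication set's RedAlert flag, not an actual Infor API.

```python
import time

def wait_for_replication(get_red_alert_flag, poll_interval_s=60, timeout_s=3600):
    """Poll the replication-set flag until it is no longer 'true'.

    get_red_alert_flag: callable returning the current RedAlert flag value
    (a hypothetical stand-in for the real lookup). Returns True when the
    replication has finished, False if the timeout is reached first.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if get_red_alert_flag() != "true":  # replication no longer running
            return True
        time.sleep(poll_interval_s)
    return False
```

In practice the "write a…" step from the post would go in place of the `return True`, once the flag clears.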
Hi. I am looking for a way to automate the initiation of tables to the Data Lake. Sometimes we need to delete all information from the Data Lake (for example, the TST tenant after copying one company to another) and reinitiate all tables via EVS002MI or EVS008. Sometimes that is around 180 tables handled manually. That's why maybe someone may have been…
Hello, My organization is working toward building a data catalog and getting serious about data architecture. I would like feedback from Infor customers, staff, and partners about how useful Infor OS might be as the backbone of this new catalog and architecture. Thanks, Nick
Is there a way to clean up variations in the Data Lake, at least for specific tables? On the initial load of a table to my local DB using ION ETL I'm seeing several hundred variations of the same table, causing tens of millions of rows to be updated. On successive updates this is greatly reduced. I am using the QueryAll function…
Hello, I have access to a SQL Server database with transactional data. I am working on a process to upload this to a TB cube. At month end, these records may be updated with new base values. This is captured on premise with a stored procedure that updates a "master" table and increments a column called "VarID" (variation id).…
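The "master table plus incrementing VarID" idea described above can be illustrated with a small sketch. This is not the poster's actual stored procedure: it uses an in-memory SQLite table and made-up column names (`record_id`, `amount`) purely to show the variation-id mechanism.

```python
import sqlite3

# In-memory stand-in for the on-premise "master" table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE master (record_id INTEGER PRIMARY KEY, amount REAL, VarID INTEGER)"
)
conn.execute("INSERT INTO master VALUES (1, 100.0, 1)")

# Month-end restatement: the base value changes, so bump the variation id
# to mark that this record has a new variation to be picked up downstream.
conn.execute(
    "UPDATE master SET amount = ?, VarID = VarID + 1 WHERE record_id = ?",
    (120.0, 1),
)
row = conn.execute("SELECT amount, VarID FROM master WHERE record_id = 1").fetchone()
```

A downstream load can then filter on `VarID` greater than the last value it processed, instead of re-reading the whole table.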
Hi Team, how should we load initial data from M3 to the Data Lake via Stream Pipelines? I couldn't find any documentation anywhere.
When defining an Object Schema in Data Catalog in ION to be used in Data Lake (I'm doing this manually at the moment, not by sample file), it is possible to choose the Data Type for each field. In particular, an option is available to set it to Date Time, and as a sub-option the Date Time Format may be defined. I have tested…
Hello All, I pulled a few ICSW records from the CSD tenant in CSV format and converted the CSV file to JSON format. Then I compressed the JSON file to .zlib format using the deflate compression technique through a Python script, and uploaded the .zlib file to Data Fabric using the Data Fabric Ingestion API (DATAFABRIC/ingestion/v1). Data got…
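The CSV-to-JSON-to-zlib preparation step described above can be sketched like this. The CSV content is made-up sample data, and the actual upload call to the Ingestion API is omitted; note also that `zlib.compress` produces zlib-wrapped deflate, so if the API expects raw deflate, `zlib.compressobj(wbits=-15)` would be needed instead.

```python
import csv, io, json, zlib

# Stand-in for the exported ICSW records (hypothetical sample rows).
csv_text = "item,qty\nICSW-001,5\nICSW-002,3\n"

# CSV -> list of dicts -> JSON bytes.
records = list(csv.DictReader(io.StringIO(csv_text)))
payload = json.dumps(records).encode("utf-8")

# Deflate compression via zlib (zlib-wrapped deflate stream).
compressed = zlib.compress(payload)

# 'compressed' is what would then be sent to the Data Fabric Ingestion
# API (DATAFABRIC/ingestion/v1); the HTTP call itself is omitted here.
```

A quick round-trip check (`zlib.decompress(compressed) == payload`) is a useful sanity test before uploading, since a mismatch between zlib-wrapped and raw deflate is a common cause of ingestion failures with compressed payloads.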