
Dear LeanIX Community,

I am currently looking into the LeanIX Integration Importer. I am able to run it locally via Python to create a locally stored Excel file, transform it, and upload it to LeanIX.

GitHub - leanix-public/leanix-integration-importer: Imports data from Excel via Integration-API

I want to deploy the script so that it runs automatically each night, gets data from a SharePoint folder, and uploads the changes to LeanIX.

I was looking at different options for how to deploy it, e.g. Power Automate, Docker, or locally. Is there a way to execute this use case directly from LeanIX itself, e.g. store the script within LeanIX → trigger daily → get the SharePoint Excel file → upload to LeanIX?

Do you have experience with the Integration Importer? And in which way do you run it? I am not looking at the normal LeanIX import/export function.

Thank you in advance

All the best

Plenty of options, depending on the tech stacks you have access to. You can deploy the data import script with a provider and trigger its execution periodically. As a provider you can use Azure Functions or a similar FaaS, or Power Automate. You can also execute the script periodically from a Docker container with Python.
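To make the nightly run concrete, here is a minimal sketch of such a job in plain Python (stdlib only). The Microsoft Graph URL pattern for downloading a file from a SharePoint document library is the documented one, but the site ID, file path, and token handling are placeholders you would need to fill in for your tenant:

```python
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def sharepoint_download_url(site_id: str, file_path: str) -> str:
    """Build the Microsoft Graph URL that returns the raw file content
    of a document stored in the default drive of a SharePoint site."""
    return f"{GRAPH_BASE}/sites/{site_id}/drive/root:/{file_path}:/content"

def fetch_excel(site_id: str, file_path: str, access_token: str) -> bytes:
    """Download the Excel file via Microsoft Graph (app or delegated token)."""
    req = urllib.request.Request(
        sharepoint_download_url(site_id, file_path),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def nightly_job(site_id: str, file_path: str, access_token: str) -> None:
    """The body a scheduler (cron, Azure Functions timer, ...) would call."""
    excel_bytes = fetch_excel(site_id, file_path, access_token)
    # ... transform the workbook and push it to LeanIX from here ...
```

In a Docker setup you would invoke `nightly_job` from a cron entry inside the container; in Azure Functions you would call it from a timer-triggered function.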

Triggering the script periodically from LeanIX is, afaik, not possible. You can trigger it using a webhook, but that reacts to a LeanIX event, not a timer.

Why aren’t you using LeanIX’s GraphQL API to update the data directly? It would probably be a simpler solution to set up and maintain. 


Hey, first of all, thanks for the answer.

I thought the same, so I guess the way to go is to deploy and trigger the execution of the script within Azure or something similar.

 

I was looking into GraphQL as well, but as far as I understood, it is only possible to retrieve data from LeanIX, right? I would again have to use something external like the Microsoft Graph API.

Ty in advance

 


@cabuc The GraphQL language supports queries (to read data) and mutations (to create and update data). Both of these options are implemented in the LeanIX GraphQL API.

So, you can (and should) use GraphQL to create/update content, especially when the process involves multiple factsheets and/or relationships. GraphQL is also ideal if you are bulk importing data or creating/updating multiple factsheets.
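To illustrate, here is a hedged sketch of what such a mutation could look like when sent from Python. The `updateFactSheet` mutation with a JSON-patch-style `patches` list follows the pattern used by the LeanIX GraphQL API, but treat the factsheet ID, field path, and instance URL as placeholders:

```python
import json

def build_update_mutation(factsheet_id: str, new_name: str) -> dict:
    """Build a GraphQL request body that renames one factsheet
    via the updateFactSheet mutation with a JSON-patch list."""
    mutation = """
    mutation ($id: ID!, $patches: [Patch]!) {
      updateFactSheet(id: $id, patches: $patches) {
        factSheet { id name }
      }
    }
    """
    return {
        "query": mutation,
        "variables": {
            "id": factsheet_id,
            "patches": [{"op": "replace", "path": "/name", "value": new_name}],
        },
    }

# The body would be POSTed as JSON to
# https://<instance>.leanix.net/services/pathfinder/v1/graphql
# with an OAuth2 bearer token (placeholders; adapt to your workspace).
payload = build_update_mutation("00000000-0000-0000-0000-000000000000", "New App Name")
print(json.dumps(payload["variables"]))
```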

If the changes are straightforward (e.g. creating one factsheet or updating fields in one factsheet), you can consider using the simpler LeanIX Pathfinder REST API.
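For such single-factsheet reads and updates, the call could look roughly like this; the endpoint path follows the Pathfinder REST convention, but the instance name, token, and factsheet ID are placeholders:

```python
import json
import urllib.request

def factsheet_url(instance: str, factsheet_id: str) -> str:
    """Resource URL of a single factsheet in the Pathfinder REST API."""
    return f"https://{instance}.leanix.net/services/pathfinder/v1/factSheets/{factsheet_id}"

def get_factsheet(instance: str, factsheet_id: str, token: str) -> dict:
    """Read one factsheet as JSON via a GET request."""
    req = urllib.request.Request(
        factsheet_url(instance, factsheet_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```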

The LeanIX Integration API (the one you are using through the Python script in your original post) requires configuring so-called data processors, which map/transform data to/from the format used in LeanIX (LDIF). This API is designed to integrate LeanIX with external systems, especially when data transformations are required. Based on the description of your use case, you do not need this API. Of course you can use it, but it seems to be a complex solution for a simple problem.
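For context, the data processors consume LDIF documents such as the minimal sketch below. The top-level keys form the standard LDIF envelope, while the connector identifiers and the content entry are made-up placeholders:

```python
# Minimal LDIF envelope as the Integration API expects it (sketch;
# connector identifiers and content entries are placeholders).
ldif = {
    "connectorType": "excel-importer",           # placeholder
    "connectorId": "nightly-sharepoint-import",  # placeholder
    "connectorVersion": "1.0.0",
    "lxVersion": "1.0.0",
    "processingDirection": "inbound",
    "processingMode": "partial",
    "content": [
        {
            "type": "Application",
            "id": "app-001",
            "data": {"name": "Example App", "description": "From Excel row 1"},
        }
    ],
}
```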

