My last post was about how I got the customized data out of the tenant database into Xml files. That tenant database was from a NAV 2016 application.
I have updated the tenant database to Business Central and I need to bring in some of the data from these Xml files.
My first issue was that I needed to make these Xml files available to Business Central. I have been using Azure Blob storage to store files for some years now. I had both AL and C/AL code that was able to connect to the Azure Blob REST API, but that code used DotNet variables, which are no longer an option.
I did some preparation last year, when I asked Microsoft to add some functionality to the BaseApp. Using that BaseApp functionality I was able to redo my Azure Blob AL code as a clean extension.
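To give a rough idea of the approach, here is a minimal sketch of listing blobs in a container with native AL HttpClient, with no DotNet interop. This is not the actual code from the app: the codeunit name and procedure are invented for this example, and a real request must add an Authorization header (Shared Key signature or SAS token) before it is sent.

```al
// Minimal sketch only: lists blobs in a container via the Azure Blob
// REST API using the native AL HttpClient type. Codeunit and procedure
// names are invented for illustration; authorization is omitted.
codeunit 50100 "Azure Blob List Sketch"
{
    procedure ListBlobs(AccountName: Text; ContainerName: Text) BlobListXml: Text
    var
        Client: HttpClient;
        Response: HttpResponseMessage;
        Url: Text;
    begin
        Url := StrSubstNo(
            'https://%1.blob.core.windows.net/%2?restype=container&comp=list',
            AccountName, ContainerName);
        // A real implementation adds an Authorization header here.
        if not Client.Get(Url, Response) then
            Error('Could not reach the Azure Blob service.');
        if not Response.IsSuccessStatusCode() then
            Error('Azure Blob request failed with status %1.', Response.HttpStatusCode());
        // The response body is an XML document listing the blobs.
        Response.Content.ReadAs(BlobListXml);
    end;
}
```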
I also wanted to put the AL code somewhere in a public place for everyone to see. And GitHub is the default code storage place. I created a project for Business Central AL.

I am hoping that this can become the place where code examples for our Business Central community are shared and maintained. If you want to contribute, I can add you to this project, or I can approve your pull request.
I need to write another blog post about that Azure Blob app and the other repositories I have created there. I hope to find time soon.
There is another repository in this project for the Import Tenant Data App. This app has an Azure Blob Connect functionality to utilize the Azure Blob app for data import.
I start by opening the Import Data Source page.

Here I find the Azure Blob Connector that self-registered in the Import Data Source table.

I need to go to Process -> Setup to configure my Azure Blob container access.

The information required can be found in the Azure Portal.

Specify the container where you have uploaded all the Xml files.

Then I searched for Import Project List and created a new import project for the General Ledger. The Import Source for Azure Blob was automatically selected, since that is the only one available.
Now to import the related Xml files into this project.

I get a list of files from the Azure Blob and select the one I need.


The file list will open again if I have more files to import. Close the file list when finished. Back on the Import Project we should now see information from the Xml file.

For each file I need to configure the destination mapping.

If the table exists in my Business Central App then it will be automatically selected.

And I can map fields from the Xml file to the Business Central Table.
There are options to handle different data structures. One is that we can add a transformation rule directly to each field. The other is to use our own custom data upgrade app that subscribes to the events published in this app.
Four events are published: two for each field in the mapping, and two before the database record is updated or inserted.

Based on the information in the publishers we can do any manual data modification required. In my example the creation time was added to each G/L Entry in NAV, but it is added to the G/L Register in Business Central.
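To sketch how such a subscriber could look in a custom data upgrade app (the publisher codeunit name, event name, signature, and field numbers below are assumptions for illustration, not taken from the actual app):

```al
// Illustration only: an event subscriber that adjusts a record before
// it is inserted during the import. The publisher codeunit name
// "Import Project Data", the event OnBeforeInsertRecord, and the field
// numbers are all hypothetical; check the app's actual publishers.
codeunit 50101 "G/L Upgrade Subscriber"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Import Project Data", 'OnBeforeInsertRecord', '', false, false)]
    local procedure HandleOnBeforeInsertRecord(SourceRecRef: RecordRef; var DestinationRecRef: RecordRef)
    begin
        // Only act on the table we care about.
        if DestinationRecRef.Number() <> Database::"G/L Register" then
            exit;
        // Example: move the creation time captured from the NAV G/L Entry
        // over to the G/L Register record in Business Central.
        // Field numbers are placeholders for this sketch.
        DestinationRecRef.Field(50000).Value(SourceRecRef.Field(50001).Value());
    end;
}
```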

From the list of tables we are able to start the data transfer. First we need to make sure that we have the correct configuration for the import. Do we want to commit during the import? Do we want to create missing records in our database?

I select to commit after every 1000 records. If my data transfer stops, then I can resume from that position when I start the data transfer again.
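Commit-and-resume is a common pattern in long-running AL batch jobs. A simplified sketch of the idea follows; the "Import Project" record, its "Last Processed No." field, and the loop condition are invented for this example and are not the app's actual objects.

```al
// Illustrative sketch of batched commits with a stored resume position.
// All names here are hypothetical; the point is the pattern: persist
// the progress marker, then Commit, so a restart can pick up mid-way.
local procedure TransferWithBatchedCommits(var ImportProject: Record "Import Project"; CommitInterval: Integer)
var
    Counter: Integer;
begin
    Counter := ImportProject."Last Processed No."; // resume from last commit
    repeat
        Counter += 1;
        // ... insert or modify the next destination record here ...
        if (Counter mod CommitInterval) = 0 then begin
            ImportProject."Last Processed No." := Counter;
            ImportProject.Modify();
            Commit(); // releases locks and makes the progress durable
        end;
    until Counter >= ImportProject."Total Record Count"; // stand-in for "no more source records"
end;
```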
We have the option to create a task in the job queue to handle the data transfer.

The job queue can handle multiple concurrent transfers, so the import should not take too much time. Looking into the Destination Mapping, we can see the status of the data import.

I will add a few more pictures to give you a better idea of what can be done with this Import Tenant Data app. The AL code is on GitHub for you to browse, improve, and fix.






Hi Gunnar,
do you have the Azure Blob components for NAV 2017?
Not in a deliverable state. You should be able to downgrade this one. Just copy the required C/AL functions into your solution.
nice work, keep up the good work.
Hi Gunnar, nice post.
Does this work on Business Central 15 Cloud as well?
I have tried to make it work but I guess some of the objects are missing in BC15.
Yes, you should be able to compile this app for BC15 and BC16. Perhaps with minor changes.
You will need to remove the file server part and only use Azure Blob if you are targeting SaaS.
Awesome, thanks!
Hey, I have been trying to get your system to work for a project I'm doing, but there are a few things I can't get working.
When importing the file list it gives me an error saying “Azure Blob JSON interface not found” and from what I can see, I am missing a codeunit of some kind.
So my questions are: why are we using JSON if we are importing XML files, and second, am I missing some part of the code or is this an Azure Blob storage setting?
Hope you can help
Hi Viggo
Great to see that someone is using this solution.
You will need the Azure Blob objects from https://github.com/businesscentralal/AzureBlob if you are using the Azure Blob storage.
How is the import of very large XML files working?
I have created an export of custom fields in NAV 2009 in CSV files to make an upgrade easier to do and am looking for a way to import them.
P.S.
The auto login through Gmail or Twitter doesn’t work.
Hi,
I have been using a custom SQL database for bigger databases. Export to SQL instead of Xml files is built into the AdvaniaGIT scripts, and the Import Tenant Data app supports SQL as well. It is much faster.
I have disabled the auto login for security reasons.