My last post was about how I got the customized data out of the tenant database into Xml files. That tenant database was from a NAV 2016 application.
I have updated the tenant database to Business Central and I need to bring in some of the data from these Xml files.
My first issue was that I needed to make these Xml files available to Business Central. I have been using Azure Blob to store files for some years now. I had both AL and C/AL code that was able to connect to the Azure Blob REST Api, but that code used DotNet variables, which are no longer an option.
I did some preparation last year, when I requested Microsoft to add some functionality to the BaseApp. Using that BaseApp functionality I was able to redo my Azure Blob AL code as a clean extension.
I also wanted to put the AL code in a public place for everyone to see, and GitHub is the default place to store code. I created a project for Business Central AL.
I am hoping that this can be the place where code examples for our Business Central community are shared and maintained. If you want to contribute, I can add you to this project or approve your pull request.
I need to write another blog post about that Azure Blob app and the other repositories I have created there. Hope to find time soon.
There is another repository in this project for the Import Tenant Data App. This app has an Azure Blob Connect functionality to utilize the Azure Blob app for data import.
I start by opening the Import Data Source page.
Here I find the Azure Blob Connector that self-registered in the Import Data Source table.
I need to go to Process -> Setup to configure my Azure Blob container access.
Specify the container where you have uploaded all the Xml files.
Then I searched for the Import Project List and created a new import project for the General Ledger. The Import Source for Azure Blob was automatically selected, since that is the only one available.
Now it is time to import the related Xml files into this project.
I get a list of files from the Azure Blob and select the one I need.
The file list will open again if I have more files to import. Close the file list when finished. Back on the Import Project we should now see information from the Xml file.
For each file I need to configure the destination mapping.
If the table exists in my Business Central App then it will be automatically selected.
And I can map fields from the Xml file to the Business Central Table.
There are options to handle different data structures. One is that we can add a transformation rule directly to each field. The other is to use our own custom data upgrade app that subscribes to the events published in this app.
Four events are published: two for each field in the mapping and two before updating or inserting the database record.
Based on the information in the publishers we can do any manual data modification required. In my example the creation time was added to each G/L Entry in NAV, but is added to the G/L Register in Business Central.
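To give an idea of how such a data upgrade app could hook in, here is a minimal sketch of a subscriber Codeunit. The publisher Codeunit name, event name and parameters are assumptions for illustration only; check the actual events published by the Import Tenant Data app.

codeunit 50100 "GL Import Data Upgrade"
{
    // Hypothetical publisher and event signature - for illustration only.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Import Project Data", 'OnBeforeInsertRecord', '', false, false)]
    local procedure OnBeforeInsertGLEntry(var DestinationRecRef: RecordRef)
    begin
        if DestinationRecRef.Number <> Database::"G/L Entry" then
            exit;
        // Custom data modification goes here, for example picking up the
        // NAV creation time and writing it to the G/L Register instead.
    end;
}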
From the list of tables we are able to start the data transfer. First we need to make sure that we have the correct configuration for the import. Do we want to commit during the import, do we want to create missing records in our database?
I select to commit after every 1,000 records. If my data transfer stops, then I can resume from that position when I start the data transfer again.
We have the option to create a task in the job queue to handle the data transfer.
The job queue can handle multiple concurrent transfers so the import should not take too much time. Looking into the Destination Mapping, we can see the status of the data import.
I will add a few more pictures to give you a better idea of what can be done with this Import Tenant Data app. The AL code is on GitHub for you to browse, improve and fix.
The resulting Xml file I can view in any Xml viewer.
This data will help me update my upgrade tables for the Sync-NAVTenant command to complete successfully.
But I wanted more. Since the destination for the data is in an App table, I feel that I have no need for the data in the database during the C/AL upgrade.
I also created the Get-NAVTenanttableNos function that will give me a list of all tables in the tenant database. Using this data I was able to iterate through all the tables and export the structure and data.
This gives me all the data that does not fit in the standard 2018-App, in separate Xml files.
Each file contains only the fields required to import the custom data. As an example, the xml file for table 17, G/L Entry, does not contain all the data from G/L Entries, just the primary key and the custom fields.
In the Xml file I include details for the table and for the fields included in the data set.
Custom tables that do not have any match in the 2018-App database are exported with all fields. With all the custom data stored safely I can execute the Sync-NAVTenant with Force to get the database upgraded.
The next task is in AL, writing the code to import the Xml files to the App tables.
This configuration points to the W1 Business Central OnPrem Docker Image. Now, let’s point to the Spanish one.
And let’s build a container.
Switching the Terminal part to AdvaniaGIT, I see that I am now pulling the Spanish Docker image down to my laptop.
This may take a few minutes…
After the container is ready I start FinSql.exe
Just opening the first table and the properties for the first field, I can verify that I have the Spanish captions installed.
So, let’s export these Spanish captions by selecting all objects except the new trigger codeunits (Business Central only) and selecting to export translation…
Save the export to a TXT file.
Opening this file in Visual Studio Code, we can see that the code page does not match the required UTF-8 format. Here we can also see that we have English in lines with A1033 and Spanish in lines with A1034.
We need to process this file with PowerShell. Executing that script can also take some time…
This script reads the file using the “Oem” code page. This code page is the one FinSql uses for import and export. We read through the file and every line that is identified as Spanish is then added to the output variable. We end by writing that output variable to the same file using the “utf8” code page.
Visual Studio Code should refresh the file automatically.
We need to create a “Translations” folder in the server folder. The default server uses the root Translations folder.
If you have instances then the “Translations” folder needs to be in the Instance.
Since I am running this in a container I may need to create this folder in the container.
Then, copy the updated file to the “Translations” folder.
And make sure it has been put into the correct path.
We need to restart the service instance.
Then in my Web Client I can verify that the Spanish application language is now available.
Using the latest Windows 10 version and the latest version of Docker means that we can now use “Process Isolation” images when running NAV and Business Central.
Without process isolation, containers on Windows 10 require Hyper-V support: inside Hyper-V, a Server Core instance runs as the platform for the processes executed by the container created from the image. With process isolation images, the Windows 10 operating system itself is used as the foundation and the Hyper-V Server Core is not needed. This little fact alone can save up to 4GB of memory usage per container.
In Iceland we add a field to the Customer table (Cust.”ADV Registration No.”). Every business entity in Iceland has a registration number. A company only has one registration number but can have multiple VAT numbers. We already have that registration number field in the Company Information record, but we also add it to Customer, Vendor and Contact records. The Employee social security number equals the registration number for an individual.
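A minimal sketch of how such a field can be added in AL is shown below. The object ID, field number and field length are placeholders; only the field name comes from our actual solution.

tableextension 50101 "ADV Customer" extends Customer
{
    fields
    {
        // Icelandic registration number on the Customer record.
        field(50100; "ADV Registration No."; Code[20])
        {
            Caption = 'Registration No.';
            DataClassification = CustomerContent;
        }
    }
}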
To be able to remove the compile dependency, and therefore the installation dependency I did the following:
Removed the dependency App from app.json
Added a variable to the report
Changed the data set configuration to use this variable
local procedure MyProcedure(var SalesHeader: Record "Sales Header")
begin
    Message('I am pleased that you called.');
end;
What happens now is that the Codeunit only has one instance for each session. When the first sales document is posted, an instance of the Codeunit is created and kept in memory on the server as long as the session is alive.
This will save the resources needed to initialize an instance and tear it down again.
Making sure that our subscriber Codeunits are set to single instance is even more important for subscribers to system events that are frequently executed.
Note that a single instance Codeunit used for subscriptions should not have any global variables, since the global variables are also kept in memory throughout the session lifetime.
Make sure that whatever is executed inside a single instance subscriber Codeunit is executed in a local procedure. The variables inside a local procedure are cleared between every execution, also in a single instance Codeunit.
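Putting these recommendations together, a single instance subscriber Codeunit could look like the sketch below. The OnAfterPostSalesDoc event from the Sales-Post Codeunit is used here only as an example of a frequently executed publisher, and its exact signature may differ between versions; the object ID and names are placeholders.

codeunit 50102 "My Posting Subscriber"
{
    SingleInstance = true;
    // No global variables - they would stay in memory for the whole session.

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterPostSalesDoc', '', false, false)]
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        // Local variables and parameters are cleared between executions,
        // even in a single instance Codeunit.
        Message('I am pleased that you called.');
    end;
}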
On March 11th and 12th I will be teaching a VSCode and Modern NAV Development course. The course will be held from 8:00am-5:00pm each day.
The goal of the workshop is to learn about the new development tool for Business Central (Dynamics NAV), VSCode, GIT source control management and to experience what AL programming is about:
• What makes AL different from C/AL
• How do you build and deploy a new BC feature
• How can I convert my current code into AL
• How to get ready for publishing your IP to AppSource
• How to use GIT for your code
On the Developer track I will host three sessions.
Wednesday, March 13, 2019, 10:15 AM – 11:45 AM, Room: Founders III
DEV75: How to Prepare Your Code for AL
Ready to completely re-think your all-in-one C/AL application? How about we try this: figure out how to split the code into “bricks” by functionality and/or processes, then turn that pile of bricks back into a usable solution. Can you migrate your customer data from the all-in-one C/AL database to the new continuous delivery cycle, replacing C/AL bricks with AL bricks? Let’s find out!
Wednesday, March 13, 2019, 4:00 PM – 5:30 PM, Room: Founders II
DEV78: How I Got my Big Database Upgraded to Business Central
Your database upgrade takes longer than your available downtime window – bit of a problem, right? How about when executing all the upgrade processes on your database will take close to 10 days? Yeah, that’s a big problem. Of course you cannot stop a business for 10 days, but how do you shrink that to fit the 30-hour window over the weekend? You’ll hear the real life story and learn about the tools and methods you can use to streamline your upgrades.
Thursday, March 14, 2019, 8:00 AM – 9:30 AM, Room: Founders III
DEV79: Breaking the Compilation Dependencies
Going to the extension model requires a simple structure to allow multiple extensions to talk to each other without having to put all of them into a compile dependency or into the same extension. Applying the standard API pattern inside the Business Central Service tier will give us the possibility to do all required functionality in a fast and easy way. This session is about explaining this pattern and giving some examples on how we have been using this pattern.
The “Enabled” Codeunit will test for Setup table read permission and whether the “Enabled” flag has been set in the default record.
LOCAL TestEnabled(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
This is how we can make sure that a module is installed and enabled before we start using it or any of the dependent modules.
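In AL, a rough sketch of such an “Enabled” interface Codeunit could look like this. The Setup table, its Enabled field and the helper procedures Initialize and AddVariable are placeholders for illustration; GetAsTempBlob is the helper procedure described later in this post.

codeunit 50103 "My Module Enabled Interface"
{
    TableNo = TempBlob;

    trigger OnRun()
    var
        Setup: Record "My Module Setup"; // placeholder setup table
        JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
        Enabled: Boolean;
    begin
        // Test for Setup table read permission and the Enabled flag in the default record.
        if Setup.ReadPermission then
            if Setup.Get() then
                Enabled := Setup.Enabled;

        // Build the response JSON and hand it back in the TempBlob record.
        JsonInterfaceMgt.Initialize();                    // hypothetical helper
        JsonInterfaceMgt.AddVariable('Enabled', Enabled); // hypothetical helper
        JsonInterfaceMgt.GetAsTempBlob(Rec);
    end;
}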
Table Access Interface
The main module has a standard response table. We map some of the communication responses to this table via Data Exchange Definition. From other modules we like to be able to read the response from the response table.
The response table uses a GUID value for a primary key and has an integer field for the “Data Exchange Entry No.”. From the sub module we ask if a response exists for the current “Data Exchange Entry No.” by calling the interface.
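From the sub module, that call could be sketched like this. The Codeunit name passed in and the helpers other than GetAsTempBlob, InitializeFromTempBlob and ExecuteInterfaceCodeunitIfExists (all described later in this post) are assumptions for illustration.

local procedure ResponseExists(DataExchEntryNo: Integer): Boolean
var
    TempBlob: Record TempBlob;
    JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
begin
    // Build the request JSON with the Data Exchange Entry No. we are asking about.
    JsonInterfaceMgt.Initialize();                                    // hypothetical
    JsonInterfaceMgt.AddVariable('DataExchEntryNo', DataExchEntryNo); // hypothetical
    JsonInterfaceMgt.GetAsTempBlob(TempBlob);

    // Execute the response table interface in the main module by Codeunit name.
    JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists('ADV Response Interface', TempBlob, '');

    // Read the answer from the response JSON.
    JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
    exit(JsonInterfaceMgt.GetVariableBooleanValue('ResponseExists')); // hypothetical
end;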
Some processes can be both automatically and manually executed. For manual execution we like to display a request page on a Report. On that request page we can ask for variables, settings and verify before executing the process.
For automatic processing we have default settings and logic to find the correct variables before starting the process. And since one module should be able to start a process in the other, we use the JSON interface pattern for the processing Codeunit.
We also like to include the “Method” variable to add flexibility to the interface, even if there is only one method in the current implementation.
Reading through the code above we can see that we are also using the JSON interface to pass settings to the Data Exchange Framework. We put the JSON configuration into the “Table Filters” BLOB field in the Data Exchange where we can use it later in the data processing.
From the Report we start the process using the JSON interface.
This pattern is similar to the discovery pattern, where an Event is raised to register possible modules into a temporary table. Example of that is the “OnRegisterServiceConnection” event in Table 1400, Service Connection.
Since we can’t have an Event Subscriber in one module listening to an Event Publisher in another without having compile dependencies, we have come up with a different solution.
We register functionality from the functionality module, and the list of modules is stored in a database table. The table uses a GUID and the Language ID for a primary key, and then the view is filtered by the Language ID to only show one entry for each module.
This pattern gives me a list of possible modules for that given functionality. I can open the Setup Page for that module and I can execute the Interface Codeunit for that module as well. Both the Setup Page ID and the Interface Codeunit ID are object names.
The registration interface uses the Method variable to select the functionality. It can either register a new module or it can execute the method in the modules.
WITH JsonInterfaceMgt DO BEGIN
LOCAL RegisterCollectionApp(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
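An AL sketch of how the registration interface could branch on the Method variable is shown below. The method names, the registration logic and the GetVariableValue helper are assumptions; InitializeFromTempBlob is from the helper Codeunit described later in this post.

codeunit 50104 "My Module Registration"
{
    TableNo = TempBlob;

    trigger OnRun()
    var
        JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
    begin
        JsonInterfaceMgt.InitializeFromTempBlob(Rec);
        // The Method variable in the request JSON selects the functionality.
        case JsonInterfaceMgt.GetVariableValue('Method') of // hypothetical helper
            'Register':
                RegisterModule(JsonInterfaceMgt);
            'Execute':
                ExecuteMethodInModules(JsonInterfaceMgt);
        end;
    end;

    local procedure RegisterModule(var JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.")
    begin
        // Insert or update the module entry in the registration table,
        // keyed by GUID and Language ID as described above.
    end;

    local procedure ExecuteMethodInModules(var JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.")
    begin
        // Loop through the registered modules and run each Interface Codeunit.
    end;
}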
When the Subscriber that needs to respond to this Publisher is in another module, we need to extend the functionality using JSON interfaces.
First, we create a Codeunit within the Publisher module with Subscribers. The parameters in the Subscribers are converted to JSON and passed to the possible subscriber modules using the “ExecuteMethodInApps” function above.
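A sketch of such a bridging Codeunit could look like this. The publisher object, event and parameter, as well as the Initialize and AddVariable helpers and the exact signature of ExecuteMethodInApps, are assumptions for illustration.

codeunit 50105 "My Publisher Bridge"
{
    // Subscribes to a publisher inside this module and forwards the call
    // to the possible subscriber modules as a JSON interface request.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"My Publisher", 'OnSomethingHappened', '', false, false)]
    local procedure ForwardOnSomethingHappened(DocumentNo: Code[20])
    var
        TempBlob: Record TempBlob;
        JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
    begin
        // Convert the event parameters to JSON...
        JsonInterfaceMgt.Initialize();                                 // hypothetical
        JsonInterfaceMgt.AddVariable('Method', 'OnSomethingHappened'); // hypothetical
        JsonInterfaceMgt.AddVariable('DocumentNo', DocumentNo);        // hypothetical
        JsonInterfaceMgt.GetAsTempBlob(TempBlob);
        // ...and pass the request to the registered modules.
        JsonInterfaceMgt.ExecuteMethodInApps(TempBlob);                // signature assumed
    end;
}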
Having standard ways of talking between modules and solutions has opened up a lot of flexibility. We like to keep our solutions as small as possible.
We could mix “Methods” and “Versions” if we at a later time need to be able to extend some of the interfaces. We need to honor the contract we have made for the interfaces. We must not make breaking changes to the interfaces, but we sure can extend them without any problems.
By attaching the JSON Interface Codeunit to the post I hope that you will use this pattern in your solutions. Use the code freely. It is supplied as-is, without any responsibility, obligations or requirements.
We use this BLOB for our JSON data when we send a request to an interface and the interface response is also JSON in that same BLOB field.
For people that have been working with web requests we can say that TempBlob.Blob is used both for RequestStream and for ResponseStream.
TempBlob is only used as a form of Stream. We never use TempBlob to store data. We never do TempBlob.Get() or TempBlob.Insert(). And, even if the name indicates that this is a temporary record, we don’t define the TempBlob Record variable as temporary. There is no need for that since we never do any database call for this record.
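For readers who have not used TempBlob this way, the sketch below shows roughly what writing to and reading from the BLOB field looks like in AL on this platform version; the JSON string is just a placeholder.

local procedure StreamJsonThroughTempBlob()
var
    TempBlob: Record TempBlob;
    OutStr: OutStream;
    InStr: InStream;
    JsonText: Text;
begin
    // Write the request JSON into the BLOB field - the "RequestStream".
    TempBlob.Blob.CreateOutStream(OutStr, TextEncoding::UTF8);
    OutStr.WriteText('{"Success": true}'); // placeholder JSON

    // Read the JSON back out of the BLOB field - the "ResponseStream".
    TempBlob.Blob.CreateInStream(InStr, TextEncoding::UTF8);
    InStr.ReadText(JsonText);

    // No TempBlob.Get() or TempBlob.Insert() - the record never hits the database.
end;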
Interface Helper Codeunit
We use a single Codeunit in all our solutions to prepare both request and response JSON and also to read from the request on the other end.
We have created a Codeunit that includes all the required procedures for the interface communication.
We have three functions to handle the basics;
procedure InitializeFromTempBlob(TempBlob: Record TempBlob)
procedure GetAsTempBlob(var TempBlob: Record TempBlob)
A typical flow of execution is to start by initializing the JSON. Then we add data to that JSON. Before we execute the interface Codeunit we use GetAsTempBlob to write the JSON into TempBlob.Blob. Every Interface Codeunit expects a TempBlob record to be passed to the OnRun() trigger.
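As a sketch, the calling side could therefore look something like this. Initialize and AddVariable are placeholders for whatever the helper exposes; GetAsTempBlob and InitializeFromTempBlob are the procedures listed above, and the interface Codeunit name is the one shown below.

local procedure CallInterface()
var
    TempBlob: Record TempBlob;
    JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
begin
    // Initialize the JSON and add the request data.
    JsonInterfaceMgt.Initialize();                     // hypothetical helper
    JsonInterfaceMgt.AddVariable('Method', 'Process'); // hypothetical helper
    // Write the JSON into TempBlob.Blob and pass it to the interface Codeunit's OnRun trigger.
    JsonInterfaceMgt.GetAsTempBlob(TempBlob);
    Codeunit.Run(Codeunit::"ADV SDS Interface Mgt", TempBlob);
    // The response JSON comes back in the same TempBlob record.
    JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
end;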
codeunit 10008650 "ADV SDS Interface Mgt"
with JsonInterfaceMgt do begin
Inside the Interface Codeunit we initialize the JSON from the passed TempBlob record. At this stage we have access to all the data that was added to the JSON on the request side.
And, since the interface Codeunit will return TempBlob as well, we must make sure to put the response JSON in there before the execution ends.
with JsonInterfaceMgt do begin
The JSON is an array that contains one or more objects. A JSON array is represented with square brackets.
The first object in the JSON array is the variable storage. This is an example of a JSON that passes two variables to the interface Codeunit.
All variables are stored in the XML format, using FORMAT(<variable>,0,9) and evaluated back using EVALUATE(<variable>,<json text value>,9). The JSON can then have multiple record related objects after the variable storage.
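Purely as an illustration of that structure (the property names below are made up and not taken from the actual implementation), a request passing two variables followed by one record object could look like this:

[
  {
    "Method": "Process",
    "DocumentNo": "INV-000123"
  },
  {
    "Customer": {
      "No.": "C10000",
      "Name": "Example Customer"
    }
  }
]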
Adding data to the JSON
We have the following procedures for adding data to the JSON;
I will write a more detailed blog about each of these methods and give examples of how we use them, but for now I will just do a short explanation of their usage.
If we need to pass a reference to a database table we pass the Record ID. Inside the interface Codeunit we can get the database record based on that Record ID. Each Record ID that we add to the JSON is stored with the Table Name and we use either of these two procedures to retrieve the record.
We use Base 64 methods in the JSON. By passing the BLOB to TempBlob.Blob we can use
on the other end to pass binary content, like images or PDFs.
Finally, we have the possibility to add and encrypt values that we place in the JSON. On the other end we can then decrypt the data to be used. This we use extensively when we pass sensitive data to and from our Azure Function.
Calling an interface Codeunit
As promised I will write more detailed blogs with examples. This is the current list of procedures we use to call interfaces;
procedure ExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text)
procedure TryExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text): Boolean
The first two expect a JSON to be passed using TempBlob. The third one we use to check for a simple true/false. We have no request data but we read the ‘Success’ variable from the response JSON.
For some of our functionality we use an Azure Function. We have created our function to read the same JSON structure we use internally. We also expect our Azure Function to respond with the same JSON structure. By doing it that way, we can use the same functions to prepare the request and to read from the response as we do for our internal interfaces.