My last post was about how I got the customized data out of the tenant database into Xml files. That tenant database was from a NAV 2016 application.
I have updated the tenant database to Business Central and I need to bring in some of the data from these Xml files.
My first issue was that I needed to make these Xml files available to Business Central. I have been using Azure Blob to store files for some years now. I had both AL and C/AL code that was able to connect to the Azure Blob REST Api, but that code used DotNet variables, which are no longer an option.
I did some preparation last year, when I requested Microsoft to add some functionality to the BaseApp. Using that BaseApp functionality I was able to redo my Azure Blob AL code as a clean extension.
I also wanted to put the AL code somewhere in a public place for everyone to see. And GitHub is the default code storage place. I created a project for Business Central AL.
I am hoping that this can become the place where code examples for our Business Central community are shared and maintained. If you want to contribute then I can add you to this project, or I can approve your pull request.
I need to write another blog post about that Azure Blob and the other repositories I have created there. Hope to find time soon.
There is another repository in this project for the Import Tenant Data App. This app has an Azure Blob Connect functionality to utilize the Azure Blob app for data import.
I start by opening the Import Data Source page.
Here I find the Azure Blob Connector that has self-registered in the Import Data Source table.
I need to go to Process -> Setup to configure my Azure Blob container access.
The information required can be found in the Azure Portal.
Specify the container where you have uploaded all the Xml files.
Then I searched for Import Project List and created a new import project for the General Ledger. The Import Source for Azure Blob was automatically selected, since that is the only one available.
Now to import the related Xml files into this project
I get a list of files from the Azure Blob and select the one I need.
The file list will open again if I have more files to import. Close the file list when finished. Back on the Import Project we should now see information from the Xml file.
For each file I need to configure the destination mapping.
If the table exists in my Business Central App then it will be automatically selected.
And I can map fields from the Xml file to the Business Central Table.
There are options to handle different data structures. One is that we can add a transformation rule directly to each field. The other is using our own custom data upgrade app that subscribes to the events published in this app.
Four events are published: two for each field in the mapping, and two before updating or inserting the database record.
Based on the information in the publishers we can do any manual data modification required. In my example the creation time was added to each G/L Entry in NAV, but is added to the G/L Register in Business Central.
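To give a rough idea of what such a data upgrade subscriber could look like, here is a minimal AL sketch. The publisher Codeunit name "Import Data Mapping", the event name "OnBeforeInsertDestinationRecord", its signature and the field number are all hypothetical, used only to illustrate the pattern; the actual names are in the Import Tenant Data repository on GitHub.

codeunit 50200 "My Data Upgrade Subscriber"
{
    // Hypothetical publisher and event names - for illustration only.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Import Data Mapping", 'OnBeforeInsertDestinationRecord', '', false, false)]
    local procedure OnBeforeInsertDestinationRecord(var DestinationRecRef: RecordRef)
    var
        CreationDateFldRef: FieldRef;
    begin
        // Example idea: the creation time lived on each G/L Entry in NAV but
        // belongs to the G/L Register in Business Central. A real upgrade app
        // would read the value from the source file data passed by the publisher;
        // here we simply stamp today's date for illustration.
        if DestinationRecRef.Number = Database::"G/L Register" then begin
            CreationDateFldRef := DestinationRecRef.Field(4); // field no. for "Creation Date" - verify against your table
            CreationDateFldRef.Value := Today();
        end;
    end;
}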
From the list of tables we are able to start the data transfer. First we need to make sure that we have the correct configuration for the import. Do we want to commit during the import, do we want to create missing records in our database?
I select to commit after each 1000 records. If my data transfer stops, then I can resume from that position when I start the data transfer again.
We have the option to create a task in the job queue to handle the data transfer.
The job queue can handle multiple concurrent transfers so the import should not take too much time. Looking into the Destination Mapping, we can see the status of the data import.
I will add a few more pictures to give you a better idea of what can be done with this Import Tenant Data app. The AL code is on GitHub for you to browse, improve and fix.
Yesterday I got a question via LinkedIn. I need to add Spanish translation to my W1 instance. How do I do that?
So, let me walk you through that process.
Here is my Business Central setup. It is the Icelandic Docker Container, so I have Icelandic and English. Switching between Icelandic and English works just fine.
Switching to Spanish gives me a mix of Spanish and English.
The Spanish translation for the platform is shipped with the DVD image and automatically installed. So are a lot of other languages.
Icelandic and English are built-in captions in the C/AL code. And even if all these languages are shipped with the platform, they are not shipped with the application.
There is a way to get these application translations from the appropriate release and add them to your application.
This configuration points to the W1 Business Central OnPrem Docker Image. Now, let’s point to the Spanish one.
And let’s build a container.
Switching the Terminal part to AdvaniaGIT, I see that I am now pulling the Spanish Docker image down to my laptop.
This may take a few minutes…
After the container is ready I start FinSql.exe
Just opening the first table and the properties for the first field, I can verify that I have the Spanish captions installed.
So, let’s export these Spanish captions by selecting all objects except the new trigger codeunits (Business Central only) and selecting to export translation…
Save the export to a TXT file.
Opening this file in Visual Studio Code, we can see that the code page does not match the required UTF-8 format. Here we can also see that we have English in lines with A1033 and Spanish in lines with A1034.
We need to process this file with PowerShell. Executing that script can also take some time…
This script reads the file using the “Oem” code page. This code page is the one FinSql uses for import and export. We read through the file and every line that is identified as Spanish is then added to the output variable. We end by writing that output variable back to the same file using the “utf8” code page.
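The script itself is not included in this text; a minimal sketch of what it could look like, where the file path is just a placeholder, is shown here:

$inputFile = 'C:\Temp\SpanishCaptions.txt'   # placeholder path to the exported translation file
$lines = Get-Content -Path $inputFile -Encoding Oem
$output = @()
foreach ($line in $lines) {
    # Keep only the Spanish caption lines, identified by the A1034 language marker
    if ($line -match 'A1034') {
        $output += $line
    }
}
# Write the Spanish lines back to the same file using UTF-8
Set-Content -Path $inputFile -Value $output -Encoding UTF8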
Visual Studio Code should refresh the file automatically.
We need to create a “Translations” folder in the server folder. The default server uses the root Translations folder.
If you have multiple instances then the “Translations” folder needs to be in the instance folder.
Since I am running this in a container I may need to create this folder in the container.
Then, copy the updated file to the “Translations” folder.
And make sure it has been put into the correct path.
We need to restart the service instance.
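When the service runs in a Docker container, these last steps can be done from the host. A sketch along these lines should work; the container name "bcserver", the file name, the service folder path (version folder "130") and the instance name "NAV" are all assumptions that need to match your environment:

docker exec bcserver powershell -Command "New-Item -ItemType Directory -Force -Path 'C:\Program Files\Microsoft Dynamics NAV\130\Service\Translations'"
docker cp 'C:\Temp\SpanishCaptions.txt' 'bcserver:C:\Program Files\Microsoft Dynamics NAV\130\Service\Translations\'
docker exec bcserver powershell -Command "Import-Module 'C:\Program Files\Microsoft Dynamics NAV\130\Service\NavAdminTool.ps1'; Restart-NAVServerInstance -ServerInstance NAV"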
Then in my Web Client I can verify that the Spanish application language is now available.
At Advania we are switching more and more to using the Docker images for Dynamics NAV and Business Central development.
Since version 1809 of Windows 10, and following the latest blog post from Arend-Jan Kauffmann, we are moving to the Docker EE engine instead of the Docker Desktop setup.
Using the latest Windows 10 version and the latest version of Docker means that we can now use “Process Isolation” images when running NAV and Business Central.
Not using process isolation images on Windows 10 requires Hyper-V support. Inside Hyper-V a server core is running as the platform for the processes executed by the container created from the image. If using process isolation images then the Windows 10 operating system is used as foundation and Hyper-V server core is not needed. Just this little fact can save up to 4GB of memory usage by the container.
Freddy Kristiansen announced in this blog that his PowerShell Module, NAVContainerHelper, had support for selecting the proper Docker Image based on the host capabilities.
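If you prefer to be explicit instead of relying on that automatic selection, the New-NavContainer cmdlet also accepts an isolation parameter. A minimal sketch, where the container name, image name and credentials are just examples:

$credential = Get-Credential
New-NavContainer -accept_eula `
    -containerName 'bcdev' `
    -imageName 'mcr.microsoft.com/businesscentral/onprem' `
    -auth NavUserPassword `
    -credential $credential `
    -isolation process   # omit this parameter to let NAVContainerHelper decide based on the host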
We have had some issues with our Windows installations and I wanted to give you a heads up on how these issues were resolved.
First things first, make sure that you are running Windows 10 version 1809 or newer. Execute
winver.exe
from the Windows-R run dialog to get the version displayed.
Optionally, remove the Hyper-V support if you are not using any virtual machines on your host. If you have version 1903 or later I suggest enabling the Hyper-V feature.
Restart your computer as needed.
Start PowerShell ISE as Administrator.
Copy the Option 1: Manual installation script from Arend-Jan‘s blog into the script editor in PowerShell ISE and execute it by pressing F5.
If you have older Docker images downloaded, you should remove them by executing
docker rmi -f (docker images -q)
in your PowerShell ISE prompt.
Now to the problems we have encountered.
The NAVContainerHelper added support for the process isolation images just a few releases ago. Some of our machines had older versions installed and that gave us problems. Execute
Get-Module NAVContainerHelper -ListAvailable
in PowerShell ISE prompt to make sure you have version 0.5.0.5 or newer.
If you have any other versions installed use the File Explorer to delete the “navcontainerhelper” folder from
C:\Program Files (x86)\WindowsPowerShell\Modules
and
C:\Program Files\WindowsPowerShell\Modules
Then execute
Install-Module NAVContainerHelper
in PowerShell ISE prompt to install the latest versions. Verify the installation.
We also had problems downloading the images, getting the error “read tcp 172.16.4.17:56878->204.79.197.219:443: wsarecv: An existing connection was forcibly closed by the remote host.”
My colleague at Advania, Sigurður Gunnlaugsson, figured out that multiple download threads caused network errors.
In the PowerShell ISE prompt execute
Stop-Service docker
dockerd --unregister-service
to remove the docker service. Then re-register the docker service.
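The exact registration command is not included in this text. A likely form, limiting the concurrent downloads to match the network issue described above (the flag value is an assumption), would be:

dockerd --register-service --max-concurrent-downloads 1
Start-Service docker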
When we design and write our code we need to think about performance.
We have been used to thinking about database performance, using FindFirst(), FindSet(), IsEmpty() where appropriate.
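As a quick reminder of what that means in practice, here is a small AL sketch; the table and the logic are only for illustration:

codeunit 50099 "Database Performance Examples"
{
    trigger OnRun()
    var
        Customer: Record Customer;
        OpenCount: Integer;
    begin
        // Use IsEmpty() when you only need to know whether any record exists
        Customer.SetRange(Blocked, Customer.Blocked::All);
        if not Customer.IsEmpty() then
            Message('There are blocked customers.');

        // Use FindSet() when you intend to loop over the records
        Customer.SetRange(Blocked, Customer.Blocked::" ");
        if Customer.FindSet() then
            repeat
                OpenCount += 1; // process each customer here
            until Customer.Next() = 0;
    end;
}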
We also need to think about performance when we create our subscriber Codeunits.
Let’s consider this Codeunit.
codeunit 50100 MySubscriberCodeunit
{
    trigger OnRun()
    begin
    end;

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)]
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;
}
Every time any user posts a sales document this subscriber will be executed.
Executing this subscriber will need to load an instance of this Codeunit into the server memory. After execution the Codeunit instance is trashed.
Creating an instance of this Codeunit and trashing it again for every sales document that is posted is a waste of resources.
So let's change the Codeunit and make it a “Single Instance”.
codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
    end;

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)]
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;
}
What happens now is that the Codeunit only has one instance for each session. When the first sales document is posted, an instance of the Codeunit is created and kept in memory on the server for as long as the session is alive.
This will save the resources needed to initialize an instance and tear it down again.
Making sure that our subscriber Codeunits are set to single instance is even more important for subscribers to system events that are frequently executed.
Note that a single instance Codeunit used for subscription should not have any global variables, since the global variables are also kept in memory throughout the session lifetime.
Make sure that whatever is executed inside a single instance subscriber Codeunit is executed in a local procedure. The variables inside a local procedure are cleared between every execution, also in a single instance Codeunit.
codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
    end;

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)]
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);
    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');
    end;
}
If your custom code executes every time that the subscriber is executed then I am fine with having that code in a local procedure inside the single instance Codeunit.
Still, I would suggest putting the code in another Codeunit, and keeping the subscriber Codeunit as small as possible.
This is even more important if the custom code only executes on a given condition.
An example of a Codeunit that you call from the subscriber Codeunit could be like this.
codeunit 50001 MyCodeCalledFromSubscriber
{
    TableNo = "Sales Header";

    trigger OnRun()
    begin
        ExecuteBusinessLogic(Rec);
    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');
    end;
}
And I change my subscriber Codeunit to only execute this code on a given condition.
codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
    end;

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)]
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);
    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        MyCodeCalledFromSubscriber: Codeunit MyCodeCalledFromSubscriber;
    begin
        if SalesHeader."Document Type" = SalesHeader."Document Type"::Order then
            MyCodeCalledFromSubscriber.Run(SalesHeader);
    end;
}
This pattern makes sure that the execution is as fast as possible and that no unneeded variables are kept in the server memory.
We have several ways of using the JSON interfaces. I will give a few examples with the required C/AL code, using Advania’s Online Banking solution interfaces as examples.
Advania’s Online Banking solution is split into several different modules. The main module has the general framework. Then we have communication modules and functionality modules.
On/Off Question
A communication module should not work if the general framework does not exist or is not enabled for the current company. Hence, I need to ask the On/Off question
This is triggered by calling the solution enabled Codeunit.
IF NOT JsonInterfaceMgt.TryExecuteCodeunitIfExists('ADV Bank Services Enabled Mgt.','') THEN BEGIN
SetupNotification.MESSAGE := NotificationMsg;
SetupNotification.SEND;
END;
The interface function will search for the Codeunit, check for execution permissions and call the Codeunit with an empty request BLOB.
The “Enabled” Codeunit must respond with a “Success” variable of true or false.
[External] TryExecuteCodeunitIfExists(CodeunitName : Text;ErrorIfNotFound : Text) Success : Boolean
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
IF ErrorIfNotFound <> '' THEN
ERROR(ErrorIfNotFound)
ELSE
EXIT;
IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob);
InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');
The “Enabled” Codeunit will test for Setup table read permission and check whether the “Enabled” flag has been set in the default record.
OnRun(VAR Rec : Record TempBlob)
TestEnabled(Rec);
LOCAL TestEnabled(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
Initialize;
AddVariable('Success',IsServiceEnabled);
GetAsTempBlob(TempBlob);
END;
IsServiceEnabled() : Boolean
IF NOT Setup.READPERMISSION THEN EXIT;
EXIT(Setup.GET AND Setup.Enabled);
This is how we can make sure that a module is installed and enabled before we start using it or any of the dependent modules.
Table Access Interface
The main module has a standard response table. We map some of the communication responses to this table via Data Exchange Definition. From other modules we like to be able to read the response from the response table.
The response table uses a GUID value for a primary key and has an integer field for the “Data Exchange Entry No.”. From the sub module we ask if a response exists for the current “Data Exchange Entry No.” by calling the interface.
FindResponse(DataExchEntryNo : Integer) Success : Boolean
WITH JsonInterfaceMgt DO BEGIN
Initialize;
AddVariable('DataExchEntryNo',DataExchEntryNo);
GetAsTempBlob(TempBlob);
ExecuteInterfaceCodeunitIfExists('ADV Bank Serv. Resp. Interface',TempBlob,ResponseInterfaceErr);
InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');
END;
The Interface Codeunit for the response table will filter on the “Data Exchange Entry No.” and return the RecordID for that record if found.
OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
InitializeFromTempBlob(Rec);
GetVariableIntegerValue(DataExchEntryNo,'DataExchEntryNo');
Response.SETRANGE("Data Exch. Entry No.",DataExchEntryNo);
AddVariable('Success',Response.FINDFIRST);
IF Response.FINDFIRST THEN
AddRecordID(Response);
GetAsTempBlob(Rec);
END;
If the response is found we can ask for the value of any field from that record by calling
GetFieldValue(FieldName : Text) FieldValue : Text
WITH JsonInterfaceMgt DO
IF GetRecordByTableName('ADV Bank Service Response',RecRef) THEN
IF DataTypeMgt.FindFieldByName(RecRef,FldRef,FieldName) THEN
IF FORMAT(FldRef.TYPE) = 'BLOB' THEN BEGIN
TempBlob.Blob := FldRef.VALUE;
FieldValue := TempBlob.ReadAsTextWithCRLFLineSeparator();
END ELSE
FieldValue := FORMAT(FldRef.VALUE,0,9);
Processing Interface
Some processes can be both automatically and manually executed. For manual execution we like to display a request page on a Report. On that request page we can ask for variables, settings and verify before executing the process.
For automatic processing we have default settings and logic to find the correct variables before starting the process. And since one module should be able to start a process in another, we use the JSON interface pattern for the processing Codeunit.
We also like to include the “Method” variable to add flexibility to the interface, even if there is only one method in the current implementation.
OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
InitializeFromTempBlob(Rec);
IF NOT GetVariableTextValue(Method,'Method') OR (Method = '') THEN
ERROR(MethodNotFoundErr);
CASE Method OF
'BankAccountProcessing':
BankAccountProcessing(JsonInterfaceMgt);
END;
END;
LOCAL BankAccountProcessing(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
CheckSetup;
CompanyInformation.GET;
WITH JsonInterfaceMgt DO BEGIN
GetVariableTextValue(ClaimExportImportFormatCode, 'ClaimExportImportFormatCode');
GetVariableTextValue(BankAccountNo, 'BankAccountNo');
GetVariableDateValue(StartDate,'StartDate');
GetVariableDateValue(EndDate,'EndDate');
ValidateStartDate;
ValidateEndDate;
ValidateImportFormat;
BankAccount.SETRANGE("No.", BankAccountNo);
ClaimExportImportFormat.GET(ClaimExportImportFormatCode);
Initialize;
AddVariable('BankAccNo',BankAccountNo);
AddVariable('ClaimantID',CompanyInformation."Registration No.");
AddVariable('StartDate',StartDate);
AddVariable('EndDate',EndDate);
GetAsTempBlob(TempBlob);
Window.OPEN(ImportingFromBank);
IF BankAccount.FINDSET THEN REPEAT
DataExchDef.GET(ClaimExportImportFormat."Resp. Data Exch. Def. Code");
DataExch.INIT;
DataExch."Related Record" := BankAccount.RECORDID;
DataExch."Table Filters" := TempBlob.Blob;
DataExch."Data Exch. Def Code" := DataExchDef.Code;
DataExchLineDef.SETRANGE("Data Exch. Def Code",DataExchDef.Code);
DataExchLineDef.FINDFIRST;
DataExch."Data Exch. Line Def Code" := DataExchLineDef.Code;
DataExchDef.TESTFIELD("Ext. Data Handling Codeunit");
CODEUNIT.RUN(DataExchDef."Ext. Data Handling Codeunit",DataExch);
DataExch.INSERT;
IF DataExch.ImportToDataExch(DataExchDef) THEN BEGIN
DataExchMapping.GET(DataExchDef.Code,DataExchLineDef.Code,DATABASE::"ADV Claim Payment Batch Entry");
IF DataExchMapping."Pre-Mapping Codeunit" <> 0 THEN
CODEUNIT.RUN(DataExchMapping."Pre-Mapping Codeunit",DataExch);
DataExchMapping.TESTFIELD("Mapping Codeunit");
CODEUNIT.RUN(DataExchMapping."Mapping Codeunit",DataExch);
IF DataExchMapping."Post-Mapping Codeunit" <> 0 THEN
CODEUNIT.RUN(DataExchMapping."Post-Mapping Codeunit",DataExch);
END;
DataExch.DELETE(TRUE);
UNTIL BankAccount.NEXT = 0;
Window.CLOSE;
END;
Reading through the code above we can see that we are also using the JSON interface to pass settings to the Data Exchange Framework. We put the JSON configuration into the “Table Filters” BLOB field in the Data Exchange where we can use it later in the data processing.
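Later in the processing, the same JSON can be read back from that BLOB field. This is not the actual mapping Codeunit, just a sketch of how the stored settings are read back with the same pattern:

DataExch.CALCFIELDS("Table Filters");
TempBlob.Blob := DataExch."Table Filters";
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(TempBlob);
  GetVariableTextValue(BankAccountNo,'BankAccNo');
  GetVariableTextValue(ClaimantID,'ClaimantID');
END;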
From the Report we start the process using the JSON interface.
Bank Account - OnPreDataItem()
WITH JsonInterfaceMgt DO BEGIN
Initialize;
AddVariable('Method','BankAccountProcessing');
AddVariable('ClaimExportImportFormatCode', ClaimExportImportFormat.Code);
AddVariable('BankAccountNo', BankAccount."No.");
AddVariable('StartDate',StartDate);
AddVariable('EndDate',EndDate);
GetAsTempBlob(TempBlob);
ExecuteInterfaceCodeunitIfExists('ADV Import BCP Interface', TempBlob, '');
END;
The ExecuteInterfaceCodeunitIfExists function will verify that the Interface Codeunit exists and check the permissions before executing.
[External] ExecuteInterfaceCodeunitIfExists(CodeunitName : Text;VAR TempBlob : Record TempBlob;ErrorIfNotFound : Text)
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
IF ErrorIfNotFound <> '' THEN
ERROR(ErrorIfNotFound)
ELSE
EXIT;
IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob)
Extensible Interface
For some tasks it might be simpler to have a single endpoint (Interface Codeunit) for multiple functions. This can be achieved by combining Events and Interfaces.
We start by reading the required parameters from the JSON and then we raise an event for anyone to respond to the request.
OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
InitializeFromTempBlob(Rec);
IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
ERROR(TypeErr);
IF NOT GetVariableTextValue(Method,'Method') THEN
ERROR(MethodErr);
OnInterfaceAccess(InterfaceType,Method,Rec);
END;
LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR TempBlob : Record TempBlob)
We can also pass the JSON Interface Codeunit itself, as it contains the full JSON for the request and will also carry the full JSON for the response.
OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
InitializeFromTempBlob(Rec);
IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
ERROR(TypeErr);
IF NOT GetVariableTextValue(Method,'Method') THEN
ERROR(MethodErr);
OnInterfaceAccess(InterfaceType,Method,JsonInterfaceMgt);
GetAsTempBlob(Rec);
END;
LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
One of the subscribers could look like this
LOCAL [EventSubscriber] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
IF InterfaceType = 'Claim' THEN
CASE Method OF
'Register':
Register(JsonInterfaceMgt);
'Edit':
Edit(JsonInterfaceMgt);
'AddExportImportFormat':
AddExportImportFormat(JsonInterfaceMgt);
'GetSetupCodeunitID':
GetSetupCodeunitID(JsonInterfaceMgt);
'GetDirection':
GetDirection(JsonInterfaceMgt);
'GetServiceUrl':
GetServiceUrl(JsonInterfaceMgt);
'GetExportImportFormat':
GetExportImportFormat(JsonInterfaceMgt);
'GetServiceMethod':
GetServiceMethod(JsonInterfaceMgt);
'ShowAndGetClaimFormat':
ShowAndGetClaimFormat(JsonInterfaceMgt);
'GetDataExchangeDefintionWithAction':
GetDataExchangeDefintionWithAction(JsonInterfaceMgt);
'GetOperationResultForClaimant':
GetOperationResultForClaimant(JsonInterfaceMgt);
'ShowClaimPayment':
ShowClaimPayment(JsonInterfaceMgt)
ELSE
ERROR(MethodErr,Method);
END;
Registration Interface
This pattern is similar to the discovery pattern, where an Event is raised to register possible modules in a temporary table. An example of that is the “OnRegisterServiceConnection” event in Table 1400, Service Connection.
Since we can’t have an Event Subscriber in one module listening to an Event Publisher in another without creating compile dependencies, we have come up with a different solution.
We register functionality from the functionality module, and the list of modules is stored in a database table. The table uses a GUID and the Language ID for a primary key, and the view is then filtered by the Language ID to only show one entry for each module.
This pattern gives me a list of possible modules for that given functionality. I can open the Setup Page for that module and I can execute the Interface Codeunit for that module as well. Both the Setup Page ID and the Interface Codeunit ID are object names.
The registration interface uses the Method variable to select the functionality. It can either register a new module or execute a method in the registered modules.
OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
InitializeFromTempBlob(Rec);
IF NOT GetVariableTextValue(Method,'Method') THEN
ERROR(MethodErr);
CASE Method OF
'Register':
RegisterCollectionApp(JsonInterfaceMgt);
ELSE
ExecuteMethodInApps(Rec);
END;
END;
LOCAL RegisterCollectionApp(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
WITH BankCollectionModule DO BEGIN
JsonInterfaceMgt.GetVariableGUIDValue(ID,'ID');
"Language ID" := GLOBALLANGUAGE();
IF FIND THEN EXIT;
INIT;
JsonInterfaceMgt.GetVariableTextValue(Name,'Name');
JsonInterfaceMgt.GetVariableTextValue("Setup Page ID",'SetupPageID');
JsonInterfaceMgt.GetVariableTextValue("Interface Codeunit ID",'InterfaceCodeunitID');
INSERT;
END;
[External] ExecuteMethodInApps(VAR TempBlob : Record TempBlob)
WITH BankCollectionModule DO BEGIN
SETCURRENTKEY("Interface Codeunit ID");
IF FINDSET THEN REPEAT
JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists("Interface Codeunit ID",TempBlob,'');
SETFILTER("Interface Codeunit ID",'>%1',"Interface Codeunit ID");
UNTIL NEXT = 0;
END;
In the “ExecuteMethodInApps” function I use the filters to make sure to only execute each Interface Codeunit once.
The registration is executed from the Setup & Configuration in the other module.
[External] RegisterCollectionApp()
WITH JsonInterfaceMgt DO BEGIN
Initialize();
AddVariable('Method','Register');
AddVariable('ID',GetCollectionAppID);
AddVariable('Name',ClaimAppName);
AddVariable('SetupPageID','ADV Claim Setup');
AddVariable('InterfaceCodeunitID','ADV Claim Interface Access');
GetAsTempBlob(TempBlob);
ExecuteInterfaceCodeunitIfExists('ADV Bank Collection App Access',TempBlob,'');
END;
Extend functionality using the Registered Modules.
As we have been taught, we should open up our functionality to other modules. This is done by adding Integration Events to our code.
LOCAL [IntegrationEvent] OnBeforePaymentPost(ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR CustLedgEntry : Record "Cust. Ledger Entry";VAR UseClaimPaymentApplication : Boolean;VAR ToAccountType : 'G/L Account,Customer,Vendor,Bank Acco
LOCAL [IntegrationEvent] OnBeforePostGenJnlLine(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR GenJournalLine : Record "Gen. Journal Line";VAR AppliedDocType : Option;VAR AppliedDocNo : Code[20];VAR AppliesToID : Code[50])
When the Subscriber that needs to respond to this Publisher is in another module, we need to extend the functionality using JSON interfaces.
First, we create a Codeunit within the Publisher module with Subscribers. The parameters in the Subscribers are converted to JSON and passed to the possible subscriber modules using the “ExecuteMethodInApps” function above.
LOCAL [EventSubscriber] OnBeforeClaimPaymentInsert(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry")
GetClaimSettings(ClaimPaymentEntry);
LOCAL GetClaimSettings(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry") Success : Boolean
JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method','GetClaimSettings');
JsonInterfaceMgt.AddVariable('ClaimantID',ClaimPaymentEntry."Claimant Registration No.");
JsonInterfaceMgt.AddVariable('ClaimKey',ClaimPaymentEntry."Claim Account No.");
JsonInterfaceMgt.AddVariable('InterestDate',ClaimPaymentEntry."Interest Date");
JsonInterfaceMgt.GetAsTempBlob(TempBlob);
BankCollectionAppAccess.ExecuteMethodInApps(TempBlob);
JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
IF NOT JsonInterfaceMgt.GetVariableBooleanValue(Success,'Success') THEN EXIT;
ClaimPaymentEntry."Batch Code" := GetJsonProperty('BatchCode');
ClaimPaymentEntry."Template Code" := GetJsonProperty('TemplateCode');
ClaimPaymentEntry."Source Code" := GetJsonProperty('SourceCode');
ClaimPaymentEntry."Customer No." := GetJsonProperty('CustomerNo');
ClaimPaymentEntry."Customer Name" := GetJsonProperty('CustomerName');
The module that is extending this functionality will be able to answer these requests and supply the required response.
OnRun(VAR Rec : Record TempBlob)
IF NOT Setup.READPERMISSION THEN EXIT;
Setup.GET;
WITH JsonInterfaceMgt DO BEGIN
InitializeFromTempBlob(Rec);
IF NOT GetVariableTextValue(Method,'Method') THEN
ERROR(MethodErr);
CASE Method OF
'Register':
RegisterCollectionApp();
'GetByCustLedgEntryNo':
ReturnClaimForCustLedgEntryNo(Rec);
'GetCustLedgEntryLinkInfo':
ReturnClaimInfoForCustLedgEntryNo(Rec);
'DisplayCustLedgEntryLinkInfo':
DisplayClaimInfoForCustLedgEntryNo();
'GetClaimSettings':
ReturnClaimSettings(Rec);
'GetClaimTempateSettings':
ReturnClaimTemplateSettings(Rec);
'GetClaimPaymentApplicationID':
ReturnClaimPaymentApplicationID(Rec);
'AddToGenDataRequest':
ReturnGenDataRequest(Rec);
END;
END;
Azure Function
The last example we will show is the Azure Function. Some functionality requires execution in an Azure Function.
By making sure that our Azure Function understands the same JSON format used in our JSON Interface Codeunit we can easily prepare the request and read the response using the same methods.
We have the Azure Function execution in that same JSON Codeunit. Hence, we can easily prepare the request and call the function in a similar way as for the other interfaces.
JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method',ServiceMethod);
JsonInterfaceMgt.AddVariable('Url',ServiceUrl);
JsonInterfaceMgt.AddVariable('Username',Username);
JsonInterfaceMgt.AddEncryptedVariable('Password',Password);
JsonInterfaceMgt.AddVariable('Certificate',CertificateValueAsBase64);
JsonInterfaceMgt.AddVariable('Xml',TempBlob.ReadAsTextWithCRLFLineSeparator);
Success := JsonInterfaceMgt.ExecuteAzureFunction;
IF JsonInterfaceMgt.GetVariableBLOBValue(TempBlob,'Xml') THEN
LogMgt.SetIncoming(TempBlob.ReadAsTextWithCRLFLineSeparator,'xml')
ELSE
LogMgt.SetIncoming(JsonInterfaceMgt.GetJSON,'json');
IF Success THEN
DataExch."File Content" := TempBlob.Blob;
The request JSON is posted to the Azure Function and the result read with a single function.
[External] ExecuteAzureFunction() Success : Boolean
GetAsTempBlob(TempBlob);
IF (NOT GetVariableTextValue(AzureServiceURL,'AzureServiceURL')) OR (AzureServiceURL = '') THEN
AzureServiceURL := 'https://<azurefunction>.azurewebsites.net/api/AzureProxy?code=<some access code>';
OnBeforeExecuteAzureFunction(TempBlob,AzureServiceURL,OmmitWebRequest);
IF NOT OmmitWebRequest THEN BEGIN
HttpWebRequestMgt.Initialize(AzureServiceURL);
HttpWebRequestMgt.DisableUI;
HttpWebRequestMgt.SetMethod('POST');
HttpWebRequestMgt.SetContentType('application/json');
HttpWebRequestMgt.SetReturnType('application/json');
HttpWebRequestMgt.AddBodyBlob(TempBlob);
TempBlob.INIT;
TempBlob.Blob.CREATEINSTREAM(ResponseInStream,TEXTENCODING::UTF8);
IF NOT HttpWebRequestMgt.GetResponse(ResponseInStream,HttpStatusCode,ResponseHeaders) THEN
IF NOT HttpWebRequestMgt.ProcessFaultResponse('http://www.advania.is') THEN BEGIN
Initialize;
AddVariable('Exception',GETLASTERRORTEXT);
EXIT(FALSE);
END;
END;
InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');
We use the “OnBeforeExecuteAzureFunction” event with a manual binding for our Unit Tests.
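A rough sketch of how that manual binding can be used from a test Codeunit follows. The mock Codeunit (with the EventSubscriberInstance property set to Manual), the response it builds and the exact subscriber signature are for illustration only; the signature must match the publisher in the JSON Interface Codeunit:

LOCAL [EventSubscriber] OnBeforeExecuteAzureFunction(VAR TempBlob : Record TempBlob;VAR AzureServiceURL : Text;VAR OmmitWebRequest : Boolean)
// Skip the real web request and hand back a mocked JSON response
OmmitWebRequest := TRUE;
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Success',TRUE);
  GetAsTempBlob(TempBlob);
END;

// In the test Codeunit the mock is bound manually around the code under test:
BINDSUBSCRIPTION(MockAzureFunction);
// ... run the code under test, which now receives the mocked response ...
UNBINDSUBSCRIPTION(MockAzureFunction);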
In the Azure Function we read the request with standard JSON functions
Having standard ways of talking between modules and solutions has opened up a lot of flexibility. We like to keep our solutions as small as possible.
We could mix “Methods” and “Versions” if we at a later time need to extend some of the interfaces. We need to honor the contract we have made for the interfaces. We must not make breaking changes to the interfaces, but we sure can extend them without any problems.
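For example, the caller could add a “Version” variable alongside “Method”, and the interface Codeunit could branch on it while keeping the original behavior as the default. This sketch is not from the actual solution; the “Version” variable and the BankAccountProcessingV2 procedure are hypothetical:

// Caller side
JsonInterfaceMgt.AddVariable('Method','BankAccountProcessing');
JsonInterfaceMgt.AddVariable('Version','2');

// Interface Codeunit side
IF NOT GetVariableTextValue(Version,'Version') THEN
  Version := '1'; // requests without a version keep the original behavior
CASE Version OF
  '1': BankAccountProcessing(JsonInterfaceMgt);
  '2': BankAccountProcessingV2(JsonInterfaceMgt);
END;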
By attaching the JSON Interface Codeunit to the post I hope that you will use this pattern in your solutions. Use the code freely. It is supplied as-is and without any responsibility, obligations or requirements.
The goal of this post is to demo from start to finish the automated build and test of an AL solution for Microsoft Dynamics 365 Business Central.
Setup and configure the build machine
We will create our build machine from a standard Windows 2016 template in Azure.
Docker containers and container images will take a lot of disk space. The data are stored in %ProgramData%\docker
It is obvious that we will not be able to store all of that on the SSD system drive. To solve this I create a 1TB HDD disk in Azure.
After starting the Azure VM and opening the Server Manager to look at the File and Storage Services we can see the new empty disk that needs configuration.
Right click the new drive to create a new volume.
And assign the drive letter
Next go to Add roles and features to add the Containers feature. More information can be found here. We also need to add ‘.NET Framework 3.5 Features’.
I also like to make sure that all Microsoft updates have been installed.
Now I start PowerShell ISE as Administrator.
As Windows Servers are usually configured in a way that prohibits downloads I like to continue the installation task in PowerShell.
To enable all the scripts to be executed we need to change the execution policy for PowerShell scripts. Executing
Set-ExecutionPolicy -ExecutionPolicy Unrestricted
will take care of that.
Confirm with Yes to all.
To make sure that all the following download functions will execute successfully we need to change the TLS configuration with another PowerShell command.
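The command itself is not shown in this text; the usual way to force TLS 1.2 for the current PowerShell session is:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12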
Next, download and install Git. The installer will download the latest version available at the time of this blog post. The only thing I change from the default during the Git setup is the default editor. I like to use Visual Studio Code.
Go ahead and start Visual Studio Code as Administrator.
Add the AdvaniaGIT extension to Visual Studio Code
Install the AdvaniaGIT PowerShell Scripts! We access the commands in Visual Studio Code by pressing Ctrl+Shift+P. From there we type to search for the command ‘Advania: Go!’ and then press Enter when it is selected.
You will get a small notification dialog asking you to switch to the AdvaniaGIT terminal window.
Accept the default path for the installation but select No to the two optional installation options.
We need a development license to work with NAV and Business Central. This license you copy into the ‘C:\AdvaniaGIT\License’ folder. In the ‘GITSettings.json’ file that Visual Studio Code opened during AdvaniaGIT installation we need to point to this license file.
The DockerSettings.json file is also opened during installation, and if you have access to the insider builds you need to update that file.
{
"RepositoryPath": "bcinsider.azurecr.io",
"RepositoryUserName": "User Name from Collaborate",
"RepositoryPassword": "Password from Collaborate",
"ClientFolders": []
}
Save both these configuration files and restart Visual Studio Code. This restart is required to make sure Visual Studio Code recognizes the AdvaniaGIT PowerShell modules.
Let’s open our first GIT repository. We start by opening the NAV 2018 repository. Repositories must have the setup.json file in the root folder to support the AdvaniaGIT functionality.
I need some installation files from the NAV 2018 DVD and I will start by cloning my GitHub NAV 2018 repository. From GitHub I copy the Url to the repository. In Visual Studio Code I open the commands with Ctrl+Shift+P and execute the command ‘Git: Clone’.
I selected the default folder for the local copy and accepted to open the repository folder. Again with Ctrl+Shift+P I start the NAV Installation.
The download will start. The country version we are downloading does not matter at this point. Every country has the same installation files that we require.
This will download NAV and start the installation. I will just cancel the installation and manually install just what I need.
Microsoft SQL Server\sqlncli64
Microsoft SQL Server Management Objects\SQLSysClrTypes
Microsoft Visual C++ 2013\vcredist_x64
Microsoft Visual C++ 2013\vcredist_x86
Microsoft Visual C++ 2017\vcredist_x64
Microsoft Visual Studio 2010 Tools For Office Redist\vstor_redist
To enable the windows authentication for the build containers we need to save the windows credentials. I am running as user “navlightadmin”. I securely save the password for this user by starting a command (Ctrl+Shift+P) and select to save container credentials.
For all the docker container support I like to use the NAV Container Helper from Microsoft. With another command (Ctrl+Shift+P) I install the container helper to the server.
I use Visual Studio Code to update the docker configuration. As pointed out here the default docker configuration file can be found at ‘C:\ProgramData\Docker\config\daemon.json’. If this file does not already exist, it can be created. I update the ‘data-root’ configuration.
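A minimal example of what that configuration could look like; the drive letter and folder name should match the data disk volume created earlier (here assumed to be F:):

{
  "data-root": "F:\\docker"
}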
Now let’s restart the server by typing
Restart-Computer -Force
or manually.
After restart, open Visual Studio Code as Administrator.
Now to verify the installation let’s clone my Business Central repository. Start command (Ctrl+Shift+P) ‘Git: Clone’ and paste in the Url to the repository.
Make sure to have the Integrated Terminal visible and let’s verify the installation by executing a command (Ctrl+Shift+P) ‘Advania: Build NAV Environment’ to build the development environment.
The image download should start…
You should now be able to use the command (Ctrl+Shift+P) ‘Advania: Start Client’, ‘Advania: Start Web Client’, ‘Advania: Start FinSql’ and ‘Advania: Start Debugger’ to verify all the required NAV/BC functionality.
If you are happy with the results you should be able to install the build agent as shown by Soren Klemmensen here.
Until now I have had my G/L Source Names extension in English only.
Now, with the upcoming release of Microsoft Dynamics 365 Business Central, I need to supply more languages. What does a man do when he does not speak the language?
I gave a shout out yesterday on Twitter asking for help with translation. Tobias Fenster reminded me that we have a service to help us with that. I had already tried to work with this service and now it was time to test the service on my G/L Source Names extension.
In my previous posts I had created the Xliff translation files from my old ML properties. I manually translated to my native language; is-IS.
I already got a Danish translation file sent from a colleague.
Before we start; I needed to do a minor update to the AdvaniaGIT tools. Make sure you run “Advania: Go!” to update the PowerShell Script Package. Then restart Visual Studio Code.
Now, let’s prepare the Xliff files in Visual Studio Code. From the last build I have the default GL Source Names.g.xlf file. I executed the action to create Xliff files.
This action will prompt for a selection of language. The selection is from the languages included in the NAV DVD.
After selection the system will prompt for a translation file that is exported from FinSql. This I already showed in a YouTube Video. If you don’t have a file from FinSql you can just cancel this part. If you already have an Xliff file for that language then it will be imported into memory as translation data and then removed.
This method is therefore useful if you want to reuse the Xliff file data after an extension update. All new files will be based on the g.xlf file.
I basically did this action for all 25 languages. I already had the is-IS and da-DK files, so they were updated. Since the source language is en-US, all my en-XX files were automatically translated. All the other languages have the translation state set to “needs-translation”.
All these files I need to upload to the Translation Service. From the Lifecycle Services menu select the Translation Service. This will open the Translation Service Dashboard.
Press + to add a translation request.
I now need to zip and upload the nl-NL file from my Translations folder.
After upload I Submit the translation request
The request will appear on the dashboard with the status Processing. Now I need to wait for the status to change to Completed. Or, create requests for all the other languages and upload files to submit.
When translation has completed I can download the result.
And I have a translation in state “needs-review-translation”.