Add translations to your NAV/BC Server

Yesterday I got a question via LinkedIn: “I need to add a Spanish translation to my W1 instance. How do I do that?”

So, let me walk you through that process.

Here is my Business Central setup. It is the Icelandic Docker Container, so I have Icelandic and English. Switching between Icelandic and English works just fine.

Switching to Spanish gives me a mix of Spanish and English.

The Spanish translation for the platform is shipped with the DVD image and automatically installed. So are a lot of other languages.

Icelandic and English are built-in captions in the C/AL code. And even though all these languages are shipped with the platform, they are not shipped with the application.

There is a way to get these application translations from the appropriate release and add them to your application.

Let’s start in VS Code where I have cloned my Business Central repository from GitHub. I opened the workspace file and also opened “setup.json” from the root folder of my repository.

This configuration points to the W1 Business Central OnPrem Docker Image. Now, let’s point to the Spanish one.

And let’s build a container.


Switching the Terminal part to AdvaniaGIT, I see that I am now pulling the Spanish Docker image down to my laptop.

This may take a few minutes…
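
If you are not using AdvaniaGIT, a similar container can be built directly with the NavContainerHelper module. This is only a sketch; the image name, container name, credentials and license path are placeholders that you need to replace with your own.

# A sketch using the NavContainerHelper module; all names and paths below are placeholders
Import-Module navcontainerhelper

$credential = Get-Credential -Message 'Container admin user'
New-NavContainer -accept_eula `
                 -containerName 'bc-es' `
                 -imageName 'microsoft/bcsandbox:es' `
                 -auth NavUserPassword `
                 -Credential $credential `
                 -licenseFile 'C:\AdvaniaGIT\License\mylicense.flf' `
                 -includeCSide `
                 -doNotExportObjectsToText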

After the container is ready I start FinSql.exe.

Just by opening the first table and the properties of the first field I can verify that I have the Spanish captions installed.

So, let’s export these Spanish captions by selecting all objects except the new trigger Codeunits (Business Central only) and choosing to export the translation…

Save the export to a TXT file.

Opening this file in Visual Studio Code, we can see that the code page does not match the required UTF-8 format. Here we can also see that we have English in the lines marked A1033 and Spanish in the lines marked A1034.

We need to process this file with PowerShell. Executing that script can also take some time…

This script reads the file using the “Oem” code page. This code page is the one FinSql uses for import and export. We read through the file and every line that is identified as Spanish is then added to the output variable. We end by writing that output variable to the same file using the “utf8” code page.

Visual Studio Code should refresh the file automatically.

We need to create a “Translations” folder in the server folder. The default server instance uses the Translations folder in the service root.

If you have separate instances, then the “Translations” folder needs to be inside the instance folder.

Since I am running this in a container I may need to create this folder in the container.

Then, copy the updated file to the “Translations” folder.

And make sure it has been put into the correct path.

We need to restart the service instance.
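
Since I am running this in a container, these steps can also be scripted with the NavContainerHelper module. A rough sketch; the container name and the service folder path are assumptions that depend on your container and platform version.

# A sketch; the container name and the service path are placeholders for your setup
$containerName = 'bc-es'
$translationsFolder = 'C:\Program Files\Microsoft Dynamics NAV\130\Service\Translations'

# Create the Translations folder next to the service tier inside the container
Invoke-ScriptInNavContainer -containerName $containerName -scriptblock {
    param($folder)
    New-Item -Path $folder -ItemType Directory -Force | Out-Null
} -argumentList $translationsFolder

# Copy the processed translation file into the container
Copy-FileToNavContainer -containerName $containerName `
                        -localPath 'C:\AdvaniaGIT\Workspace\es.txt' `
                        -containerPath (Join-Path $translationsFolder 'es.txt')

# Restarting the container also restarts the service instance
Restart-NavContainer -containerName $containerName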

Then in my Web Client I can verify that the Spanish application language is now available.

That is it!

Here is the PowerShell script:

$LanguageFile = Get-Item -Path C:\AdvaniaGIT\Workspace\es.txt

Write-Host "Loading $($LanguageFile.Name)..."
$TranslateFile = Get-Content -Path $LanguageFile.FullName -Encoding Oem
$i = 0
$count = $TranslateFile.Length
$StartTime = Get-Date
foreach ($Line in $TranslateFile) {
    $i++
    $NowTime = Get-Date
    $TimeSpan = New-TimeSpan $StartTime $NowTime
    $percent = $i / $count
    if ($percent -gt 1) 
    {
        $percent = 1
    }
    $remtime = $TimeSpan.TotalSeconds / $percent * (1-$percent)
    if (($i % 100) -eq 0) 
    {
        Write-Progress -Status "Processing $i of $count" -Activity 'Updating Translation...' -PercentComplete ($percent*100) -SecondsRemaining $remtime
    }

    if ($Line -match "A1034") {
        if ($TranslatedFile) {
            $TranslatedFile += $Line + "`r`n"
        } else {
            $TranslatedFile = $Line + "`r`n"
        }
    }
}

Write-Host "Saving $($LanguageFile.Name)..."
Remove-Item -Path $LanguageFile.FullName -Force -ErrorAction SilentlyContinue
Out-File -FilePath $LanguageFile.FullName -Encoding utf8 -InputObject $TranslatedFile -Force

In this post I used both AdvaniaGIT and NAVContainerHelper tools. Good luck.

Event subscription and performance

When we design and write our code we need to think about performance.

We have been used to thinking about database performance, using FindFirst(), FindSet(), IsEmpty() where appropriate.

We also need to think about performance when we create our subscriber Codeunits.

Let’s consider this Codeunit.

codeunit 50100 MySubscriberCodeunit
{
    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;


}

Every time any user posts a sales document this subscriber will be executed.

Executing this subscriber requires loading an instance of this Codeunit into server memory. After execution the Codeunit instance is discarded.

Creating an instance of this Codeunit and discarding it again, for every sales document being posted, is a waste of resources.

So, let’s change the Codeunit and make it a “Single Instance”.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;
    
    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;


}

What happens now is that the Codeunit only has one instance per session. When the first sales document is posted, an instance of the Codeunit is created and kept in memory on the server for as long as the session is alive.

This will save the resources needed to initialize an instance and tear it down again.

Making sure that our subscriber Codeunits are set to single instance is even more important for subscribers to system events that are frequently executed.

Note that a single instance Codeunit used for subscription should not have any global variables, since the global variables are also kept in memory throughout the session lifetime.

Make sure that whatever is executed inside a single instance subscriber Codeunit is executed in a local procedure. The variables inside a local procedure are cleared between every execution, also in a single instance Codeunit.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);

    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');    
    end;

}

If your custom code executes every time that the subscriber is executed then I am fine with having that code in a local procedure inside the single instance Codeunit.

Still, I would suggest putting the code in another Codeunit, and keeping the subscriber Codeunit as small as possible.

This is even more important if the custom code only executes on a given condition.

An example of a Codeunit that you call from the subscriber Codeunit could be like this.

codeunit 50001 MyCodeCalledFromSubscriber
{
    TableNo = "Sales Header";
    
    trigger OnRun()
    begin
        ExecuteBusinessLogic(Rec);
    end;
    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');    
    end;
}

And I change my subscriber Codeunit to only execute this code on a given condition.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);

    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        MyCodeCalledFromSubscriber: Codeunit MyCodeCalledFromSubscriber;
    begin
        if SalesHeader."Document Type" = SalesHeader."Document Type"::Order then
            MyCodeCalledFromSubscriber.Run(SalesHeader);
    end;

}

This pattern makes sure that the execution is as fast as possible and that no unneeded variables populate the server memory.

JSON Interface – examples

We have several ways of using the JSON interfaces. I will give a few examples with the required C/AL code. I will be using Advania’s Online Banking solution interfaces as examples.

Advania’s Online Banking solution is split into several modules. The main module has the general framework. Then we have communication modules and functionality modules.

On/Off Question

A communication module should not work if the general framework does not exist or is not enabled for the current company. Hence, I need to ask the On/Off question.

This is triggered by calling the solution’s “Enabled” Codeunit.

IF NOT JsonInterfaceMgt.TryExecuteCodeunitIfExists('ADV Bank Services Enabled Mgt.','') THEN BEGIN
  SetupNotification.MESSAGE := NotificationMsg;
  SetupNotification.SEND;
END;

The interface function will search for the Codeunit, check for execution permissions and call the Codeunit with an empty request BLOB.

The “Enabled” Codeunit must respond with a “Success” variable of true or false.

[External] TryExecuteCodeunitIfExists(CodeunitName : Text;ErrorIfNotFound : Text) Success : Boolean
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
  IF ErrorIfNotFound <> '' THEN
    ERROR(ErrorIfNotFound)
  ELSE
    EXIT;

IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob);
InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');

The “Enabled” Codeunit will test for read permission on the Setup table and check whether the “Enabled” flag has been set in the default record.

OnRun(VAR Rec : Record TempBlob)
TestEnabled(Rec);

LOCAL TestEnabled(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Success',IsServiceEnabled);
  GetAsTempBlob(TempBlob);
END;

IsServiceEnabled() : Boolean
IF NOT Setup.READPERMISSION THEN EXIT;
EXIT(Setup.GET AND Setup.Enabled);

This is how we can make sure that a module is installed and enabled before we start using it or any of the dependent modules.

Table Access Interface

The main module has a standard response table. We map some of the communication responses to this table via a Data Exchange Definition. From other modules we would like to be able to read the response from the response table.

The response table uses a GUID value for a primary key and has an integer field for the “Data Exchange Entry No.”. From the sub module we ask if a response exists for the current “Data Exchange Entry No.” by calling the interface.

FindResponse(DataExchEntryNo : Integer) Success : Boolean
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('DataExchEntryNo',DataExchEntryNo);
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Bank Serv. Resp. Interface',TempBlob,ResponseInterfaceErr);
  InitializeFromTempBlob(TempBlob);
  GetVariableBooleanValue(Success,'Success');
END;

The Interface Codeunit for the response table will filter on the “Data Exchange Entry No.” and return the RecordID for that record if found.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  GetVariableIntegerValue(DataExchEntryNo,'DataExchEntryNo');
  Response.SETRANGE("Data Exch. Entry No.",DataExchEntryNo);
  AddVariable('Success',Response.FINDFIRST);
  IF Response.FINDFIRST THEN
    AddRecordID(Response);
  GetAsTempBlob(Rec);
END;

If the response is found we can ask for the value of any field from that record by calling

GetFieldValue(FieldName : Text) FieldValue : Text
WITH JsonInterfaceMgt DO
  IF GetRecordByTableName('ADV Bank Service Response',RecRef) THEN
    IF DataTypeMgt.FindFieldByName(RecRef,FldRef,FieldName) THEN
      IF FORMAT(FldRef.TYPE) = 'BLOB' THEN BEGIN
        TempBlob.Blob := FldRef.VALUE;
        FieldValue := TempBlob.ReadAsTextWithCRLFLineSeparator();
      END ELSE
        FieldValue := FORMAT(FldRef.VALUE,0,9);

Processing Interface

Some processes can be executed both automatically and manually. For manual execution we like to display a request page on a Report. On that request page we can ask for variables and settings, and verify them before executing the process.

For automatic processing we have default settings and logic to find the correct variables before starting the process. And since one module should be able to start a process in another, we use the JSON interface pattern for the processing Codeunit.

We also like to include the “Method” variable to add flexibility to the interface, even if there is only one method in the current implementation.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') OR (Method = '') THEN
    ERROR(MethodNotFoundErr);
  CASE Method OF
    'BankAccountProcessing':
      BankAccountProcessing(JsonInterfaceMgt);
  END;
END;

LOCAL BankAccountProcessing(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
CheckSetup;
CompanyInformation.GET;
WITH JsonInterfaceMgt DO BEGIN
  GetVariableTextValue(ClaimExportImportFormatCode, 'ClaimExportImportFormatCode');
  GetVariableTextValue(BankAccountNo, 'BankAccountNo');
  GetVariableDateValue(StartDate,'StartDate');
  GetVariableDateValue(EndDate,'EndDate');

  ValidateStartDate;
  ValidateEndDate;
  ValidateImportFormat;
  BankAccount.SETRANGE("No.", BankAccountNo);
  ClaimExportImportFormat.GET(ClaimExportImportFormatCode);
  Initialize;
  AddVariable('BankAccNo',BankAccountNo);
  AddVariable('ClaimantID',CompanyInformation."Registration No.");
  AddVariable('StartDate',StartDate);
  AddVariable('EndDate',EndDate);
  GetAsTempBlob(TempBlob);
  Window.OPEN(ImportingFromBank);
  IF BankAccount.FINDSET THEN REPEAT
    DataExchDef.GET(ClaimExportImportFormat."Resp. Data Exch. Def. Code");

    DataExch.INIT;
    DataExch."Related Record" := BankAccount.RECORDID;
    DataExch."Table Filters" := TempBlob.Blob;
    DataExch."Data Exch. Def Code" := DataExchDef.Code;
    DataExchLineDef.SETRANGE("Data Exch. Def Code",DataExchDef.Code);
    DataExchLineDef.FINDFIRST;
    DataExch."Data Exch. Line Def Code" := DataExchLineDef.Code;

    DataExchDef.TESTFIELD("Ext. Data Handling Codeunit");
    CODEUNIT.RUN(DataExchDef."Ext. Data Handling Codeunit",DataExch);

    DataExch.INSERT;
    IF DataExch.ImportToDataExch(DataExchDef) THEN BEGIN

      DataExchMapping.GET(DataExchDef.Code,DataExchLineDef.Code,DATABASE::"ADV Claim Payment Batch Entry");

      IF DataExchMapping."Pre-Mapping Codeunit" <> 0 THEN
        CODEUNIT.RUN(DataExchMapping."Pre-Mapping Codeunit",DataExch);

      DataExchMapping.TESTFIELD("Mapping Codeunit");
      CODEUNIT.RUN(DataExchMapping."Mapping Codeunit",DataExch);

      IF DataExchMapping."Post-Mapping Codeunit" <> 0 THEN
        CODEUNIT.RUN(DataExchMapping."Post-Mapping Codeunit",DataExch);
    END;
    DataExch.DELETE(TRUE);
  UNTIL BankAccount.NEXT = 0;
  Window.CLOSE;
END;

Reading through the code above we can see that we are also using the JSON interface to pass settings to the Data Exchange Framework. We put the JSON configuration into the “Table Filters” BLOB field in the Data Exchange where we can use it later in the data processing.

From the Report we start the process using the JSON interface.

Bank Account - OnPreDataItem()
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Method','BankAccountProcessing');
  AddVariable('ClaimExportImportFormatCode', ClaimExportImportFormat.Code);
  AddVariable('BankAccountNo', BankAccount."No.");
  AddVariable('StartDate',StartDate);
  AddVariable('EndDate',EndDate);
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Import BCP Interface', TempBlob, '');
END;

The ExecuteInterfaceCodeunitIfExists function will verify that the Interface Codeunit exists and check the execution permissions before running it.

[External] ExecuteInterfaceCodeunitIfExists(CodeunitName : Text;VAR TempBlob : Record TempBlob;ErrorIfNotFound : Text)
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
  IF ErrorIfNotFound <> '' THEN
    ERROR(ErrorIfNotFound)
  ELSE
    EXIT;

IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob)

Extensible Interface

For some tasks it might be simpler to have a single endpoint (Interface Codeunit) for multiple functions. This can be achieved by combining Events and Interfaces.

We start by reading the required parameters from the JSON and then we raise an event for anyone to respond to the request.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
    ERROR(TypeErr);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  OnInterfaceAccess(InterfaceType,Method,Rec);
END;

LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR TempBlob : Record TempBlob)

We can also pass the JSON Interface Codeunit itself, as it will contain the full JSON for both the request and the response.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
    ERROR(TypeErr);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  OnInterfaceAccess(InterfaceType,Method,JsonInterfaceMgt);
  GetAsTempBlob(Rec);
END;

LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")

One of the subscribers could look like this

LOCAL [EventSubscriber] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
IF InterfaceType = 'Claim' THEN
  CASE Method OF
    'Register':
      Register(JsonInterfaceMgt);
    'Edit':
      Edit(JsonInterfaceMgt);
    'AddExportImportFormat':
      AddExportImportFormat(JsonInterfaceMgt);
    'GetSetupCodeunitID':
      GetSetupCodeunitID(JsonInterfaceMgt);
    'GetDirection':
      GetDirection(JsonInterfaceMgt);
    'GetServiceUrl':
      GetServiceUrl(JsonInterfaceMgt);
    'GetExportImportFormat':
      GetExportImportFormat(JsonInterfaceMgt);
    'GetServiceMethod':
      GetServiceMethod(JsonInterfaceMgt);
    'ShowAndGetClaimFormat':
      ShowAndGetClaimFormat(JsonInterfaceMgt);
    'GetDataExchangeDefintionWithAction':
      GetDataExchangeDefintionWithAction(JsonInterfaceMgt);
    'GetOperationResultForClaimant':
      GetOperationResultForClaimant(JsonInterfaceMgt);
    'ShowClaimPayment':
      ShowClaimPayment(JsonInterfaceMgt)
    ELSE
      ERROR(MethodErr,Method);
  END;

Registration Interface

This pattern is similar to the discovery pattern, where an Event is raised to register possible modules into a temporary table. Example of that is the “OnRegisterServiceConnection” event in Table 1400, Service Connection.

Since we can’t have an Event Subscriber in one module listening to an Event Publisher in another without creating compile dependencies, we have come up with a different solution.

We register functionality from the functionality module, and the list of modules is stored in a database table. The table uses a GUID and the Language ID for a primary key, and the view is then filtered by the Language ID to only show one entry for each module.

This pattern gives me a list of possible modules for a given functionality. I can open the Setup Page for a module and I can execute the Interface Codeunit for that module as well. Both the Setup Page ID and the Interface Codeunit ID are stored as object names.

The registration interface uses the Method variable to select the functionality. It can either register a new module or execute a method in the registered modules.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  CASE Method OF
    'Register':
      RegisterCollectionApp(JsonInterfaceMgt);
    ELSE
      ExecuteMethodInApps(Rec);
  END;
END;

LOCAL RegisterCollectionApp(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
WITH BankCollectionModule DO BEGIN
  JsonInterfaceMgt.GetVariableGUIDValue(ID,'ID');
  "Language ID" := GLOBALLANGUAGE();
  IF FIND THEN EXIT;
  INIT;
  JsonInterfaceMgt.GetVariableTextValue(Name,'Name');
  JsonInterfaceMgt.GetVariableTextValue("Setup Page ID",'SetupPageID');
  JsonInterfaceMgt.GetVariableTextValue("Interface Codeunit ID",'InterfaceCodeunitID');
  INSERT;
END;

[External] ExecuteMethodInApps(VAR TempBlob : Record TempBlob)
WITH BankCollectionModule DO BEGIN
  SETCURRENTKEY("Interface Codeunit ID");
  IF FINDSET THEN REPEAT
    JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists("Interface Codeunit ID",TempBlob,'');
    SETFILTER("Interface Codeunit ID",'>%1',"Interface Codeunit ID");
  UNTIL NEXT = 0;
END;

In the “ExecuteMethodInApps” function I use the filters to make sure that each Interface Codeunit is only executed once.

The registration is executed from the Setup & Configuration in the other module.

[External] RegisterCollectionApp()
WITH JsonInterfaceMgt DO BEGIN
  Initialize();
  AddVariable('Method','Register');
  AddVariable('ID',GetCollectionAppID);
  AddVariable('Name',ClaimAppName);
  AddVariable('SetupPageID','ADV Claim Setup');
  AddVariable('InterfaceCodeunitID','ADV Claim Interface Access');
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Bank Collection App Access',TempBlob,'');
END;

Extend functionality using the Registered Modules.

As we have been taught, we should open our functionality to other modules. This is done by adding Integration Events to our code.

LOCAL [IntegrationEvent] OnBeforePaymentPost(ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR CustLedgEntry : Record "Cust. Ledger Entry";VAR UseClaimPaymentApplication : Boolean;VAR ToAccountType : 'G/L Account,Customer,Vendor,Bank Account')

LOCAL [IntegrationEvent] OnBeforePostGenJnlLine(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR GenJournalLine : Record "Gen. Journal Line";VAR AppliedDocType : Option;VAR AppliedDocNo : Code[20];VAR AppliesToID : Code[50])

When the Subscriber that needs to respond to this Publisher is in another module, we need to extend the functionality using JSON interfaces.

First, we create a Codeunit within the Publisher module with Subscribers. The parameters in the Subscribers are converted to JSON and passed to the possible subscriber modules using the “ExecuteMethodInApps” function above.

LOCAL [EventSubscriber] OnBeforeClaimPaymentInsert(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry")
GetClaimSettings(ClaimPaymentEntry);

LOCAL GetClaimSettings(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry") Success : Boolean
JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method','GetClaimSettings');
JsonInterfaceMgt.AddVariable('ClaimantID',ClaimPaymentEntry."Claimant Registration No.");
JsonInterfaceMgt.AddVariable('ClaimKey',ClaimPaymentEntry."Claim Account No.");
JsonInterfaceMgt.AddVariable('InterestDate',ClaimPaymentEntry."Interest Date");
JsonInterfaceMgt.GetAsTempBlob(TempBlob);
BankCollectionAppAccess.ExecuteMethodInApps(TempBlob);
JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
IF NOT JsonInterfaceMgt.GetVariableBooleanValue(Success,'Success') THEN EXIT;

ClaimPaymentEntry."Batch Code" := GetJsonProperty('BatchCode');
ClaimPaymentEntry."Template Code" := GetJsonProperty('TemplateCode');
ClaimPaymentEntry."Source Code" := GetJsonProperty('SourceCode');
ClaimPaymentEntry."Customer No." := GetJsonProperty('CustomerNo');
ClaimPaymentEntry."Customer Name" := GetJsonProperty('CustomerName');

The module that is extending this functionality will be able to answer these requests and supply the required response.

OnRun(VAR Rec : Record TempBlob)
IF NOT Setup.READPERMISSION THEN EXIT;
Setup.GET;

WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  CASE Method OF
    'Register':
      RegisterCollectionApp();
    'GetByCustLedgEntryNo':
      ReturnClaimForCustLedgEntryNo(Rec);
    'GetCustLedgEntryLinkInfo':
      ReturnClaimInfoForCustLedgEntryNo(Rec);
    'DisplayCustLedgEntryLinkInfo':
      DisplayClaimInfoForCustLedgEntryNo();
    'GetClaimSettings':
      ReturnClaimSettings(Rec);
    'GetClaimTempateSettings':
      ReturnClaimTemplateSettings(Rec);
    'GetClaimPaymentApplicationID':
      ReturnClaimPaymentApplicationID(Rec);
    'AddToGenDataRequest':
      ReturnGenDataRequest(Rec);
  END;
END;

Azure Function

The last example we will show is the Azure Function. Some functionality requires execution in an Azure Function.

By making sure that our Azure Function understands the same JSON format used in our JSON Interface Codeunit we can easily prepare the request and read the response using the same methods.

We have the Azure Function execution in that same JSON Interface Codeunit. Hence, we can easily prepare the request and call the function in a similar way as for the other interfaces.

JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method',ServiceMethod);
JsonInterfaceMgt.AddVariable('Url',ServiceUrl);
JsonInterfaceMgt.AddVariable('Username',Username);
JsonInterfaceMgt.AddEncryptedVariable('Password',Password);
JsonInterfaceMgt.AddVariable('Certificate',CertificateValueAsBase64);
JsonInterfaceMgt.AddVariable('Xml',TempBlob.ReadAsTextWithCRLFLineSeparator);
Success := JsonInterfaceMgt.ExecuteAzureFunction;
IF JsonInterfaceMgt.GetVariableBLOBValue(TempBlob,'Xml') THEN
  LogMgt.SetIncoming(TempBlob.ReadAsTextWithCRLFLineSeparator,'xml')
ELSE
  LogMgt.SetIncoming(JsonInterfaceMgt.GetJSON,'json');
IF Success THEN
  DataExch."File Content" := TempBlob.Blob;

The request JSON is posted to the Azure Function and the result read with a single function.

[External] ExecuteAzureFunction() Success : Boolean
GetAsTempBlob(TempBlob);
IF (NOT GetVariableTextValue(AzureServiceURL,'AzureServiceURL')) OR (AzureServiceURL = '') THEN
  AzureServiceURL := 'https://<azurefunction>.azurewebsites.net/api/AzureProxy?code=<some access code>';

OnBeforeExecuteAzureFunction(TempBlob,AzureServiceURL,OmmitWebRequest);

IF NOT OmmitWebRequest THEN BEGIN
  HttpWebRequestMgt.Initialize(AzureServiceURL);
  HttpWebRequestMgt.DisableUI;
  HttpWebRequestMgt.SetMethod('POST');
  HttpWebRequestMgt.SetContentType('application/json');
  HttpWebRequestMgt.SetReturnType('application/json');
  HttpWebRequestMgt.AddBodyBlob(TempBlob);

  TempBlob.INIT;
  TempBlob.Blob.CREATEINSTREAM(ResponseInStream,TEXTENCODING::UTF8);
  IF NOT HttpWebRequestMgt.GetResponse(ResponseInStream,HttpStatusCode,ResponseHeaders) THEN
    IF NOT HttpWebRequestMgt.ProcessFaultResponse('http://www.advania.is') THEN BEGIN
      Initialize;
      AddVariable('Exception',GETLASTERRORTEXT);
      EXIT(FALSE);
    END;
END;

InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');

We use the “OnBeforeExecuteAzureFunction” event with a manual binding for our Unit Tests.

In the Azure Function we read the request with standard JSON functions.

dynamic data = await req.Content.ReadAsAsync<object>();
Newtonsoft.Json.Linq.JArray jRequestArray = Newtonsoft.Json.Linq.JArray.Parse(data.ToString());
string Method = jRequestArray.First().Value<string>("Method") ?? "Undefined";

Then based on the Method we call each functionality with the request and write the response to the response JSON.

Newtonsoft.Json.Linq.JArray jResponseArray = new Newtonsoft.Json.Linq.JArray();
Newtonsoft.Json.Linq.JObject jResponseObject = new Newtonsoft.Json.Linq.JObject();
try
{            
    switch (Method)
    {
        case "Ping":
            success = true;
            response = "Hello " + (jRequestArray.First().Value<string>("Name") ?? "Undefined") + "!";
            break; 
        case "IOBS2005WSE2.GetAccountStatement":
            xml = jRequestArray.First().Value<string>("Xml") ?? "";
            success = IOBS2005WSE2.Helper.GetAccountStatement(
                jRequestArray.First().Value<string>("Url") ?? "",
                jRequestArray.First().Value<string>("Username") ?? "",
                Decrypt(jRequestArray.First().Value<string>("Password") ?? ""),
                jRequestArray.First().Value<string>("Certificate") ?? "",
                ref xml);
            jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Xml", xml));
            response = "";
            break;
            ...
        default:
            response = "Method not found";
            break;
    }
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Response", response));
}
catch (System.Exception exception)
{
    httpStatusCode = HttpStatusCode.BadRequest;
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Request", request));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Message", exception.Message));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("StackTrace", exception.StackTrace.ToString()));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("InnerException", exception.InnerException.Message));
}

jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Success", success));
jResponseArray.Add(jResponseObject);           
return req.CreateResponse(httpStatusCode, jResponseArray);                        
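
To try this contract outside NAV, the same kind of request can be posted to the function with PowerShell. A small sketch; the URL and access code placeholders match the AzureServiceURL shown earlier, and the Ping method is the one handled in the switch statement above.

# A sketch; replace the function URL and access code with your own
$uri  = 'https://<azurefunction>.azurewebsites.net/api/AzureProxy?code=<some access code>'
$body = '[ { "Method": "Ping", "Name": "NAV" } ]'

# The function expects a JSON array and answers with an array holding Response and Success
$result = Invoke-RestMethod -Method Post -Uri $uri -ContentType 'application/json' -Body $body
$result | Format-List Response, Success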

Conclusion

Having standard ways of talking between modules and solutions has opened up a lot of flexibility. We like to keep our solutions as small as possible.

We could mix “Methods” and “Versions” if we at a later time need to be able to extend some of the interfaces. We need to honor the contract we have made for the interfaces. We must not make breaking changes to the interfaces, but we sure can extend them without any problems.

By attaching the JSON Interface Codeunit to the post I hope that you will use this pattern in your solutions. Use the code freely. It is supplied as-is and without any responsibility, obligations or requirements.

C/AL and AL Side-by-Side Development with AdvaniaGIT

Microsoft supports Side-by-Side development for C/AL and AL.  To start using the Side-by-Side development make sure you have the latest version of AdvaniaGIT add-in for Visual Studio Code and update the PowerShell scripts by using the “Advania: Go!” command.

When the Business Central environment is built use the “Advania: Build C/AL Symbol References for AL” to enable the Side-by-Side development for this environment.  This function will reconfigure the service and execute the Generate Symbol References command for the environment.  From here on everything you change in C/AL on this environment will update the AL Symbol References.
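
Under the hood this roughly corresponds to enabling symbol loading on the service tier and running the generatesymbolreference command with finsql. A sketch of the manual equivalent, assuming a default NAV 2018 setup; the instance name, client path and database name are placeholders.

# A sketch; instance name, client path and database name are placeholders
Set-NAVServerConfiguration -ServerInstance NAV -KeyName 'EnableSymbolLoadingAtServerStartup' -KeyValue 'true'
Restart-NAVServerInstance -ServerInstance NAV

# Generate symbol references for all existing C/AL objects in the database
$finsql = 'C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client\finsql.exe'
Start-Process -FilePath $finsql -Wait `
              -ArgumentList 'Command=generatesymbolreference, Database="CRONUS", ServerName=localhost'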

So let’s try this out.

I converted my C/AL project to an AL project with the steps described in my previous post.  Then I selected to open Visual Studio Code in the AL folder.

In my new Visual Studio Code window I selected to build an environment – the Docker Container.

When AdvaniaGIT builds a container it will install the AL Extension for Visual Studio Code from that Container.  We need to read the output of the environment build.  In this example I am asked to restart Visual Studio Code before reinstalling AL Language.  Note that if you are not asked to restart Visual Studio Code you don’t need to do that.

After restart I can see that the AL Language extension for Visual Studio Code is missing.

To fix this I execute the “Advania: Build NAV Environment” command again.  This time, since the Container is already running, only the NAV license and the AL Extension will be updated.

Restart Visual Studio Code again and we are ready to go.

If we build a new environment for our AL project we must update the environment settings in .vscode\launch.json.  This we can do with a built-in AdvaniaGIT command.

We can verify the environment by executing “Advania: Check NAV Environment”.  Everything should be up and running at this time.

Since we will be using Side-by-Side development for C/AL and AL in this environment we need to enable that by executing “Advania: Build C/AL Symbol References for AL”.

This will take a few minutes to execute.

Don’t worry about the warning.  AdvaniaGIT takes care of restarting the service.  Let’s download AL Symbols and see what happens.

We can see that AL now recognizes the standard symbols, but my custom Codeunit, “IS Soap Proxy Client Mgt.”, is not recognized.  I will tell you more about this Codeunit in my next blog post.

I start FinSql to import the Codeunit “IS Soap Proxy Client Mgt.”

Import the FOB file

Close FinSql and execute the “AL: Download Symbols” again.  We can now see that AL recognizes my C/AL Codeunit.

Now I am good to go.

Using the Translation Service for G/L Source Names

Until now I have had my G/L Source Names extension in English only.

Now, with the upcoming release of Microsoft Dynamics 365 Business Central, I need to supply more languages.  What does a man do when he does not speak the language?

I gave a shout out yesterday on Twitter asking for help with translation.  Tobias Fenster reminded me that we have a service to help us with that.  I had already tried to work with this service and now it was time to test the service on my G/L Source Names extension.

In my previous posts I had created the Xliff translation files from my old ML properties.  I manually translated them to my native language, is-IS.

I already got a Danish translation file sent from a colleague.

Before we start, I needed to do a minor update to the AdvaniaGIT tools.  Make sure you run “Advania: Go!” to update the PowerShell Script Package.  Then restart Visual Studio Code.

Off to the Microsoft Lifecycle Services to utilize the translation service.

Now, let’s prepare the Xliff files in Visual Studio Code.  From the last build I have the default GL Source Names.g.xlf file.  I executed the action to create Xliff files.

This action will prompt for a selection of language.  The selection is from the languages included in the NAV DVD.

After selection the system will prompt for a translation file that is exported from FinSql.  This I already showed in a YouTube Video.  If you don’t have a file from FinSql you can just cancel this part.  If you already have an Xliff file for that language then it will be imported into memory as translation data and then removed.

This method is therefore useful if you want to reuse the Xliff file data after an extension update.  All new files will be based on the g.xlf file.

I basically did this action for all 25 languages.  I already had the is-IS and da-DK files, so they were updated.  Since the source language is en-US all my en-XX files were automatically translated.  All the other languages have the translation state set to “needs-translation”.

<trans-unit id="Table 102911037 - Field 1343707150 - Property 2879900210" size-unit="char" translate="yes" xml:space="preserve">
  <source>Source Name</source>
  <target state="needs-translation"></target>
  <note from="Developer" annotates="general" priority="2" />
  <note from="Xliff Generator" annotates="general" priority="3">Table:O4N GL SN - Field:Source Name</note>
</trans-unit>

All these files I need to upload to the Translation Service.  From the Lifecycle Services menu select the Translation Service.  This will open the Translation Service Dashboard.

Press + to add a translation request.

I now need to zip and upload the nl-NL file from my Translations folder.
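
Zipping each Xliff file can be scripted as well. A small sketch; the Translations folder path is a placeholder for your AL project folder.

# A sketch; the Translations folder path is a placeholder
$translations = 'C:\AdvaniaGIT\Workspace\GLSourceNames\Translations'
Get-ChildItem -Path $translations -Filter *.xlf | ForEach-Object {
    # One zip per language file, ready to upload to the Translation Service
    Compress-Archive -Path $_.FullName -DestinationPath (Join-Path $translations ($_.BaseName + '.zip')) -Force
}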

After upload I Submit the translation request.

The request will appear on the dashboard with the status Processing.  Now I need to wait for the status to change to Completed.  Or, create requests for all the other languages and upload files to submit.

When translation has completed I can download the result.

And I have a translation in state “needs-review-translation”.

<trans-unit id="Table 102911037 - Field 1343707150 - Property 2879900210" translate="yes" xml:space="preserve">
  <source>Source Name</source>
  <target state="needs-review-translation" state-qualifier="mt-suggestion">Bronnaam</target>
  <note from="Xliff Generator" annotates="general" priority="3">Table:O4N GL SN - Field:Source Name</note>
</trans-unit>

Now I just need to complete all languages and push changes to GitHub.

Please, if you can, download your language file and look at the results.

Why do we need Interface Codeunits

And what is an interface Codeunit?

A Codeunit that you can execute with CODEUNIT.RUN to perform a given task is, from my point of view, an interface Codeunit.

An interface Codeunit has a parameter that we put in the CODEUNIT.RUN call.

This parameter is always a table object.

We have multiple examples of this already in the application.  Codeunits 12 and 80 are two of them.  There the parameter is a mixed set of data and settings.  Some of the table fields are business data being pushed into the business logic.  Other fields are settings used to control the business logic.

Table 36, Sales Header, is used as the parameter for Codeunit 80.  Fields like No., Bill-to Customer No., Posting Date and so on are business data.  Fields like Ship, Invoice, Print Posted Documents are settings used to control the business logic but have no meaning as business data.

Every table is then a potential parameter for an interface Codeunit.  Our extension can easily create a table that we use as a parameter table.  A record does not need to be inserted into the table to be passed to the Codeunit.

Let’s look at another scenario.  We know that there is an Interface Codeunit with the name “My Interface Codeunit” but it belongs to an Extension that may or may not be installed in the database.

Here we use the virtual table “CodeUnit Metadata” to look for the Interface Codeunit before execution.

This is all simple and straightforward.  These are things that we have been doing for a number of years.

Using the TempBlob table as a parameter also gives us the flexibility to define a more complex interface for the Codeunit.  The TempBlob table can store complex data in Json or Xml format and pass that to the Codeunit.

Let’s take an example.  We have an extension that extends the discount calculation for Customers and Items.  We would like to ask this extension for the discount a given customer will have for a given Item.  Questions like that we can represent in a Json file.

{
    "CustomerNo": "C10000",
    "ItemNo": "1000"
}

And the question can be coded like this.

The Interface Codeunit could be something like

With a Page that contains a single Text variable (Json) we can turn this into a web service.

That we can use from C# with a code like

var navOdataUrl = new System.Uri("https://nav2018dev.westeurope.cloudapp.azure.com:7048/NAV/OData/Company('CRONUS%20International%20Ltd.')/AlexaRequest?$format=json");
var credentials = new NetworkCredential("navUser", "+lppLBb7OQJxlOfZ7CpboRCDcbmAEoCCJpg7cmAEReQ=");
var handler = new HttpClientHandler { Credentials = credentials };

using (var client = new HttpClient(handler))
{
    var Json = new { CustomerNo = "C10000", ItemNo = "1000" };
    // Use FromObject, since the anonymous type's ToString() is not valid JSON
    JObject JsonRequest = JObject.FromObject(Json);
    JObject requestJson = new JObject();                
    JProperty jProperty = new JProperty("Json", JsonRequest.ToString());
    requestJson.Add(jProperty);
    var requestData = new StringContent(requestJson.ToString(), Encoding.UTF8, "application/json");
    var response = await client.PostAsync(navOdataUrl,requestData);
    dynamic result = await response.Content.ReadAsStringAsync();

    JObject responseJson = JObject.Parse(Convert.ToString(result));
    if (responseJson.TryGetValue("Json", out JToken responseJToken))
    {
        jProperty = responseJson.Property("Json");
        JObject JsonResponse = JObject.Parse(Convert.ToString(jProperty.Value));
        Console.WriteLine(JsonResponse.ToString());
    }
}

This is just scratching the surface of what we can do.  To copy a record to and from Json is easy to do with these functions.

And even if I am showing all this in C/AL there should be no problem in using the new AL in Visual Studio Code to get the same results.

Upgrading my G/L Source Names Extension to AL – step 4 addendum

In the last blog post we completed the translation to our native language.  Since then I have learned that I also need to include translation files for EN-US, EN-GB and EN-CA.

With the latest update of the AdvaniaGIT tools that was an easy task.  I just asked to create Xlf files for these languages and skipped the part where we import the C/AL translation.

I have also been pointed to a new tool that can work with Xlf files.  Multilingual Editor: https://marketplace.visualstudio.com/items?itemName=MultilingualAppToolkit.MultilingualAppToolkitv40

Now I call out to all who are ready to help me with the translation.  Please fork my NAV2018 repository and send me Xlf translation files for your native language.  Or just download one of the translation files and send me your language.

Our next step is to code sign the App file and send it to Microsoft.

Upgrading my G/L Source Names Extension to AL – step 4

We are on a path to upgrade G/L Source Names from version 1 to version 2.  This requires conversion from C/AL to AL, a data upgrade and a number of changes to the AL code.

A complete check list of what you need to have in your AL extension is published by Microsoft here.

Our task for today is to translate the AL project into our native language.

To make this all as easy as I could I added new functionality to the AdvaniaGIT module and VS Code extension.  Make sure to update to the latest release.

To translate an AL project we need to follow the steps described by Microsoft here.

To demonstrate this process I created a video.

 

Upgrading my G/L Source Names Extension to AL – step 3

When upgrading an extension from C/AL to AL (version 1 to version 2) we need to think about the data upgrade process.

In C/AL we needed to add two functions to an extension Codeunit to handle the installation and upgrade.  This I did with Codeunit 70009200.  One function to be executed once for each install.

PROCEDURE OnNavAppUpgradePerDatabase@1();
VAR
  AccessControl@70009200 : Record 2000000053;
BEGIN
  WITH AccessControl DO BEGIN
    SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
    IF FINDSET THEN REPEAT
      AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
    UNTIL NEXT = 0;
  END;
END;

And another function to be executed once for each company in the install database.

PROCEDURE OnNavAppUpgradePerCompany@2();
VAR
  GLSourceNameMgt@70009200 : Codeunit 70009201;
BEGIN
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name Setup");
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name User Setup");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name");

  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Help Resource");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name User Access");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Group Access");

  GLSourceNameMgt.PopulateSourceTable;
  RemoveAssistedSetup;
END;

For each database I add my permission sets to the installation users and for each company I restore the setup data for my extension and populate the lookup table for G/L Source Name.

The methods for install and upgrade have changed in AL for extensions version 2.  Look at the AL documentation from Microsoft for details.

In version 2 I remove these two obsolete functions from my application management Codeunit and need to add two new Codeunits, one for install and another for upgrade.

codeunit 70009207 "O4N GL Source Name Install"
{
    Subtype = Install;
    trigger OnRun();
    begin
    end;

    var
    PermissionSetToSetupGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, S';
    PermissionSetToUpdateGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, E';
    PermissionSetToUserGLSourceNames : TextConst ENU='G/L-SOURCE NAMES';

    
    trigger OnInstallAppPerCompany();
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
    begin
        GLSourceNameMgt.PopulateSourceTable;
        RemoveAssistedSetup;
    end;

    trigger OnInstallAppPerDatabase();
    var
        AccessControl : Record "Access Control";
    begin
        with AccessControl do begin
            SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
            if FINDSET then repeat
                AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
            until NEXT = 0;
        end;
    end;

  local procedure RemoveAssistedSetup();
  var
    AssistedSetup : Record "Assisted Setup";
  begin
    with AssistedSetup do begin
      SETRANGE("Page ID",PAGE::"O4N GL SN Setup Wizard");
      if not ISEMPTY then
        DELETEALL;
    end;
  end;

  local procedure AddUserAccess(AssignToUser : Guid;PermissionSet : Code[20]);
  var
    AccessControl : Record "Access Control";
    AppMgt : Codeunit "O4N GL SN App Mgt.";
    AppGuid : Guid;
  begin
    EVALUATE(AppGuid,AppMgt.GetAppId);
    with AccessControl do begin
      INIT;
      "User Security ID" := AssignToUser;
      "App ID" := AppGuid;
      Scope := Scope::Tenant;
      "Role ID" := PermissionSet;
      if not FIND then
        INSERT(true);
    end;
  end;

}

In the code you can see that this Codeunit is of Subtype=Install.  This code will be executed when installing this extension in a database.

To confirm this I can see that I have the G/L Source Names Permission Sets in the Access Control table.

And my G/L Source Name table also has all required entries.

Uninstalling the extension will not remove this data.  Therefore you need to make sure that the install code is structured in a way that it will work even when reinstalling.  Look at the examples from Microsoft to get a better understanding.

Back to my C/AL extension.  When uninstalling that one the data is moved to archive tables.

Archive tables are handled with the NAVAPP.* commands.  The OnNavAppUpgradePerCompany function shown above handled these archive tables when reinstalling or upgrading.

Basically, since I am keeping the same table structure I can use the same set of commands for my upgrade Codeunit.

codeunit 70009208 "O4N GL SN Upgrade"
{
    Subtype=Upgrade;
    trigger OnRun()
    begin
        
    end;
    
    trigger OnCheckPreconditionsPerCompany()
    begin

    end;

    trigger OnCheckPreconditionsPerDatabase()
    begin

    end;
    
    trigger OnUpgradePerCompany()
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
        archivedVersion : Text;
    begin
        archivedVersion := NAVAPP.GetArchiveVersion();
        if archivedVersion = '1.0.0.1' then begin
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN Setup");
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN User Setup");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN");

            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Help Resource");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN User Access");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Group Access");

            GLSourceNameMgt.PopulateSourceTable;
        end;
    end;

    trigger OnUpgradePerDatabase()
    begin

    end;

    trigger OnValidateUpgradePerCompany()
    begin

    end;

    trigger OnValidateUpgradePerDatabase()
    begin

    end;
    
}

So, time to test how and if this works.

I have my AL folder open in Visual Studio Code and I use the AdvaniaGIT command Build NAV Environment to get the new Docker container up and running.

Then I use Update launch.json with current branch information to update my launch.json server settings.

I like to use the NAV Container Helper from Microsoft to manually work with the container.  I use a command from the AdvaniaGIT module to import the NAV Container Helper module.

The module uses the container name for most of the functions.  The container name can be found by listing the running Docker containers or by asking for the name that matches the server used in launch.json.
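
If you prefer to do this manually, here is a sketch; it assumes the NavContainerHelper module is installed on the host.

# A sketch; assumes the navcontainerhelper module is installed on the host
Import-Module navcontainerhelper

# List the running containers to find the name to use with the *-NavContainer commands
docker ps --format 'table {{.Names}}\t{{.Image}}\t{{.Status}}'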

I need my C/AL extension inside the container so I executed

Copy-FileToNavContainer -containerName jolly_bhabha -localPath C:\NAVManagementWorkFolder\Workspace\GIT\Kappi\NAV2017\Extension1\AppPackage.navx -containerPath c:\run

Then I open PowerShell inside the container

Enter-NavContainer -containerName jolly_bhabha

Import the NAV Administration Module

Welcome to the NAV Container PowerShell prompt

[50AA0018A87F]: PS C:\run> Import-Module 'C:\Program Files\Microsoft Dynamics NAV\110\Service\NavAdminTool.ps1'

Welcome to the Server Admin Tool Shell!

[50AA0018A87F]: PS C:\run>

and I am ready to play.  Install the C/AL extension

Publish-NAVApp -ServerInstance NAV -IdePath 'C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client\finsql.exe' -Path C:\run\AppPackage.navx -SkipVerification

Now I am faced with the fact that I have opened PowerShell inside the container in my AdvaniaGIT terminal.  That means that my AdvaniaGIT commands will execute inside the container, but not on the host.

The simplest way to solve this is to open another instance of Visual Studio Code.  From there I can start the Web Client and complete the install and configuration of my C/AL extension.

I complete the Assisted Setup and do a round trip to G/L Entries to make sure that I have enough data in my tables to verify that the data upgrade is working.

I can verify this by looking into the SQL tables for my extension.  I use PowerShell to uninstall and unpublish my C/AL extension.

Uninstall-NAVApp -ServerInstance NAV -Name "G/L Source Names"
Unpublish-NAVApp -ServerInstance NAV -Name "G/L Source Names"

I can verify that in my SQL database I now have four AppData archive tables.

Pressing F5 in Visual Studio Code will now publish and install the AL extension, even if I have the terminal open inside the container.

The extension is published but can’t be installed because I had previously installed an older version of my extension.  Back in my container PowerShell I will follow the steps as described by Microsoft.

[50AA0018A87F]: PS C:\run> Sync-NAVApp -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
WARNING: Cannot synchronize the extension G/L Source Names because it is already synchronized.
[50AA0018A87F]: PS C:\run> Start-NAVAppDataUpgrade -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
[50AA0018A87F]: PS C:\run> Install-NAVApp -ServerInstance NAV -Tenant Default -Name "G/L Source Names"
WARNING: Cannot install extension G/L Source Names by Objects4NAV 2.0.0.0 for the tenant default because it is already installed.
[50AA0018A87F]: PS C:\run>

My AL extension is published and I have verified in my SQL server that all the data from the C/AL extension has been moved to the AL extension tables and all the archive tables have been removed.

Back in Visual Studio Code I can now use F5 to publish and install the extension again if I need to update, debug and test my extension.

A couple more steps are left that I will do shortly.  Happy coding…

 

Upgrading my G/L Source Names Extension to AL – step 2

So, where is step 1?  Step 1 was converting the C/AL code to AL code.  This we did with AdvaniaGIT, as demonstrated here.

First things first!  I received the following email from Microsoft.

Hello,

The decision has been made by our SLT, that the use of a Prefix or Suffix is now a mandatory requirement. If you are already using this in your app(s), great. If not, you will want to do so.

We are coming across too many collisions between apps in our internal tests during builds and have seen some in live tenants as well. It makes the most sense to make this a requirement now. If you think about it in a live situation, if a customer installs an app before yours and then tries yours but gets collision issues, they may just decide to not even continue on with yours.

Also, I have been made aware that adding a prefix or suffix after you already have a v2 app published can make the process complicated for you. Therefore, since you all have to convert to v2 anyway, now is a good time to add in the prefix/suffix.

The following link provides the guidelines around using it here

If you haven’t reserved your prefix yet, please email me back to reserve one (or more if needed).

Thank you,

Ryan

Since my brand is Objects4NAV.com I asked for O4N as my prefix and got it registered.  Since we got this information from Microsoft, every object that we develop in NAV 2018 now has our company’s prefix in its name.

I started my AL development by opening Visual Studio Code in my repository folder.  I updated my setup.json to match the latest preview build as a Docker container and then selected Build NAV Environment using AdvaniaGIT.

After download and deployment of the container I noticed that the container had a brand new version of the AL Extension for Visual Studio Code.  I looked at the version installed in Visual Studio Code and that was an older version.

I uninstalled the AL Language extension and restarted Visual Studio Code.

As you can see on the screenshot above we now don’t have any AL Language extension installed.  I executed the Build NAV Environment command from AdvaniaGIT to install the extension from the Docker container.  In this case I already had a container assigned to my branch so only three things happened.

  • uidOffset in the container database was updated.  This is recommended for C/AL development.
  • License file is updated in the container database and container service.  The license used is the one configured in branch setup.json or the machine settings GITSettings.json
  • AL Language Extension is copied from the container to the host and installed in Visual Studio Code.

Again, restarting Visual Studio Code to find that the latest version of AL Language Extension has been installed.

I then executed two AdvaniaGIT actions.

  • Update Launch.json with current branch environment.  This will update the host name and the service name in my AL Launch.json file to make sure that my AL project will be interacting with the branch container.
  • Open Visual Studio Code in AL folder.  This will open another instance of Visual Studio Code in the AL folder.

Immediately after Visual Studio Code was opened it asked for symbols and I agreed that we should download them from the container.

Everything is now ready for AL development using the latest build that Microsoft has to offer.

I started Edit – Replace in Files in Visual Studio Code.  All my objects have a name that starts with G/L Source Name.  I used this knowledge to apply the prefix.

By starting with the double quote I make sure to only update the object names and not captions.  All captions start with a single quote.  I got a list of all changes and needed to confirm all changes.

The field name I add to the G/L Entry table does not match this rule so I needed to rename that myself.  Selecting the field name and pressing F2 allows me to rename a field and have Visual Studio Code update all references automatically.

Pressing F5 started my build, publish and debug.

My extension is installed and ready for testing.

There are a few more steps that I need to look into before publishing the new version of G/L Source Names to Dynamics 365.  These steps will appear here in the coming days.  Hope this will be useful to you all.