Add translations to your NAV/BC Server

Yesterday I got a question via LinkedIn: "I need to add a Spanish translation to my W1 instance. How do I do that?"

So, let me walk you through that process.

Here is my Business Central setup. It is the Icelandic Docker Container, so I have Icelandic and English. Switching between Icelandic and English works just fine.

Switching to Spanish gives me a mix of Spanish and English.

The Spanish translation for the platform is shipped with the DVD image and automatically installed. So are a lot of other languages.

Icelandic and English captions are built into the C/AL code. And even though all these other languages are shipped with the platform, they are not shipped with the application.

There is a way to get these application translations from the appropriate release and add them to your application.

Let’s start in VS Code where I have cloned my Business Central repository from GitHub. I opened the workspace file and also opened “setup.json” from the root folder of my repository.

This configuration points to the W1 Business Central OnPrem Docker Image. Now, let’s point to the Spanish one.

And let’s build a container.


Switching the Terminal part to AdvaniaGIT, I see that I am now pulling the Spanish Docker image down to my laptop.

This may take a few minutes…

After the container is ready, I start FinSql.exe.

Just opening the first table and the properties for the first field, I can verify that I have the Spanish captions installed.

So, let’s export these Spanish captions by selecting all objects except the new trigger codeunits (Business Central only) and selecting to export translation…

Save the export to a TXT file.

Opening this file in Visual Studio Code, we can see that the code page does not match the required UTF-8 format. Here we can also see that we have English in lines with A1033 and Spanish in lines with A1034.

We need to process this file with PowerShell. Executing that script can also take some time…

This script reads the file using the “Oem” code page. This code page is the one FinSql uses for import and export. We read through the file, and every line that is identified as Spanish is then added to the output variable. We end by writing that output variable back to the same file using the “utf8” code page.

Visual Studio Code should refresh the file automatically.

We need to create a “Translations” folder in the server folder. The default server uses the root Translations folder.

If you have multiple instances, then the “Translations” folder needs to be in each instance folder.

Since I am running this in a container I may need to create this folder in the container.

Then, copy the updated file to the “Translations” folder.

And make sure it has been put into the correct path.

We need to restart the service instance.
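If the container tooling is available, the folder creation, the file copy and the service restart can all be scripted from the host. Here is a minimal sketch using NAVContainerHelper cmdlets; the container name “bcserver”, the service folder ‘C:\Program Files\Microsoft Dynamics NAV\130\Service’ and the instance name NAV are assumptions you will need to adjust to your own environment, and it assumes the NAV administration cmdlets are available inside the container session.

# Create the Translations folder inside the container (adjust the service folder to your version)
Invoke-ScriptInNavContainer -containerName bcserver -scriptblock {
    New-Item -ItemType Directory -Path 'C:\Program Files\Microsoft Dynamics NAV\130\Service\Translations' -Force
}

# Copy the UTF-8 translation file from the host into the container
Copy-FileToNavContainer -containerName bcserver -localPath 'C:\AdvaniaGIT\Workspace\es.txt' -containerPath 'C:\Program Files\Microsoft Dynamics NAV\130\Service\Translations\es.txt'

# Restart the service instance so it picks up the new translation file
Invoke-ScriptInNavContainer -containerName bcserver -scriptblock {
    Restart-NAVServerInstance -ServerInstance NAV
}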

Then in my Web Client I can verify that the Spanish application language is now available.

That is it!

Here is the PowerShell script

$LanguageFile = Get-Item -Path C:\AdvaniaGIT\Workspace\es.txt

Write-Host "Loading $($LanguageFile.Name)..."
$TranslateFile = Get-Content -Path $LanguageFile.FullName -Encoding Oem
$i = 0
$count = $TranslateFile.Length
$StartTime = Get-Date
foreach ($Line in $TranslateFile) {
    $i++
    $NowTime = Get-Date
    $TimeSpan = New-TimeSpan $StartTime $NowTime
    $percent = $i / $count
    if ($percent -gt 1) 
    {
        $percent = 1
    }
    $remtime = $TimeSpan.TotalSeconds / $percent * (1-$percent)
    if (($i % 100) -eq 0) 
    {
        Write-Progress -Status "Processing $i of $count" -Activity 'Updating Translation...' -PercentComplete ($percent*100) -SecondsRemaining $remtime
    }

    if ($Line -match "A1034") {
        if ($TranslatedFile) {
            $TranslatedFile += $Line + "`r`n"
        } else {
            $TranslatedFile = $Line + "`r`n"
        }
    }
}

Write-Host "Saving $($LanguageFile.Name)..."
Remove-Item -Path $LanguageFile.FullName -Force -ErrorAction SilentlyContinue
Out-File -FilePath $LanguageFile.FullName -Encoding utf8 -InputObject $TranslatedFile -Force

In this post I used both AdvaniaGIT and NAVContainerHelper tools. Good luck.

Business Central Docker on Windows 10

In Advania we are switching more and more to using the Docker images for Dynamics NAV and Business Central development.

Since Windows 10 version 1809, and following the latest blog post from Arend-Jan Kauffmann, we are moving to using the Docker EE engine instead of the Docker Desktop setup.

Using the latest Windows 10 version and the latest version of Docker means that we can now use “Process Isolation” images when running NAV and Business Central.

Without process isolation, running containers on Windows 10 requires Hyper-V support. Inside Hyper-V a Server Core instance runs as the platform for the processes executed by the container created from the image. With process isolation images the Windows 10 operating system itself is used as the foundation and the Hyper-V Server Core layer is not needed. Just this little fact can save up to 4GB of memory usage per container.

Freddy Kristiansen announced in this blog that his PowerShell Module, NAVContainerHelper, had support for selecting the proper Docker Image based on the host capabilities.

We have had some issues with our Windows installations, so I wanted to give you a heads-up on how these issues were resolved.

First things first: make sure that you are running Windows 10 version 1809 or newer. Execute

winver.exe

from the Windows+R run dialog to get the version displayed.

Optionally, remove the Hyper-V support if you are not using any virtual machines on your host. If you have version 1903 or later, I suggest enabling the Hyper-V feature instead.
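If you want to handle this from PowerShell as well, the Hyper-V feature can be toggled with the Windows optional features cmdlets from an elevated prompt. This is just a sketch of one way to do it, not a required step, and a restart is needed afterwards.

# Disable Hyper-V if you are not using virtual machines on the host (pre-1903 scenario)
Disable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All

# ...or enable it on version 1903 or later
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V-All -All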

Restart your computer as needed.

Start PowerShell ISE as Administrator.

Copy the “Option 1: Manual installation” script from Arend-Jan‘s blog into the script editor in PowerShell ISE and execute it by pressing F5.

If you have older Docker images downloaded, you should remove them by executing

docker rmi -f (docker images -q)

in your PowerShell ISE prompt.

Now to the problems we have encountered.

NAVContainerHelper added support for the process isolation images just a few releases ago. Some of our machines had older versions installed and that gave us problems. Execute

Get-Module NAVContainerHelper -ListAvailable

in the PowerShell ISE prompt to make sure you have version 0.5.0.5 or newer.

If you have any older versions installed, use File Explorer to delete the “navcontainerhelper” folder from

 C:\Program Files (x86)\WindowsPowerShell\Modules

and

C:\Program Files\WindowsPowerShell\Modules

Then execute

Install-Module NAVContainerHelper

in the PowerShell ISE prompt to install the latest version. Verify the installation.

We also had problems downloading the images, getting the error “read tcp 172.16.4.17:56878->204.79.197.219:443: wsarecv: An existing connection was forcibly closed by the remote host.”

My colleague at Advania, Sigurður Gunnlaugsson, figured out that multiple download threads caused the network errors.

In the PowerShell ISE prompt, execute

Stop-Service docker
dockerd --unregister-service

to remove the docker service. Then re-register docker service using

dockerd --exec-opt isolation=process --max-concurrent-downloads 1 --register-service
Start-Service docker

in the PowerShell ISE prompt.

This results in only one download thread, and this way our downloads were able to complete.

More details on Docker images for Dynamics NAV and Business Central can be found here.

Waldo’s Blog on Docker Image Tags

AdvaniaGIT and Docker

Tobias Fenster on Docker

Freddy´s Blog

Use references to break compile dependencies

I was looking into a customer App yesterday. That app had a dependency defined in app.json.

I wanted to look at the real requirements for this dependency. I found 1 (one) place in my code where this dependent App was used.

dataitem(PageLoop; "Integer")
{
    DataItemTableView = SORTING (Number) WHERE (Number = CONST (1));
    column(Phone_No_Cust; Cust."Phone No.")
    {
    }
    column(Registration_No_Cust; Cust."ADV Registration No.")
    {
    }
    column(CompanyInfo_Picture; CompanyInfo.Picture)
    {
    }

In Iceland we add a field to the Customer table (Cust.”ADV Registration No.”). Every business entity in Iceland has a registration number. A company has only one registration number but can have multiple VAT numbers. We already have that registration number field in the Company Information record, but we also add it to the Customer, Vendor and Contact records. For an individual, the Employee social security number equals the registration number.

To be able to remove the compile dependency, and therefore the installation dependency I did the following:

Removed the dependency App from app.json

Added a variable to the report

    var
        ADVRegistrationNo: Text;

Changed the data set configuration to use this variable

dataitem(PageLoop; "Integer")
{
    DataItemTableView = SORTING (Number) WHERE (Number = CONST (1));
    column(Phone_No_Cust; Cust."Phone No.")
    {
    }
    column(Registration_No_Cust; ADVRegistrationNo)
    {
    }
    column(CompanyInfo_Picture; CompanyInfo.Picture)
    {
    }

Located the code that fetches the Customer record and added a RecordRef/FieldRef lookup to get the required data

trigger OnAfterGetRecord()
var
    DataTypeMgt: Codeunit "Data Type Management";
    RecRef: RecordRef;
    FldRef: FieldRef;
begin
    Cust.Get("Ship-to Customer No.");
    RecRef.GetTable(Cust);
    if DataTypeMgt.FindFieldByName(RecRef, FldRef, 'ADV Registration No.') then
        ADVRegistrationNo := FldRef.Value();
end;

There are both positive and negative repercussions of these changes.

The positive is that we can now install and uninstall both apps without worrying about the compile dependency.

The negative is that breaking changes to the dependent App will not break the installation of this customer App.

So, what happens if the dependent App is not installed? FindFieldByName will return false and the variable will remain blank text.

Since we have adopted the policy that Microsoft uses (no breaking table changes), this field should just be there.

If the data is required and will break the functionality if not present we can change the code to something like this.


Cust.Get("Ship-to Customer No.");
RecRef.GetTable(Cust);
if DataTypeMgt.FindFieldByName(RecRef, FldRef, 'ADV Registration No.') then
    ADVRegistrationNo := FldRef.Value()
else
    Error('Please install the Advania IS Localization App into this Tenant!')

Event subscription and performance

When we design and write our code we need to think about performance.

We have been used to thinking about database performance, using FindFirst(), FindSet(), IsEmpty() where appropriate.

We also need to think about performance when we create our subscriber Codeunits.

Let’s consider this Codeunit.

codeunit 50100 MySubscriberCodeunit
{
    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;


}

Every time any user posts a sales document this subscriber will be executed.

Executing this subscriber will need to load an instance of this Codeunit into the server memory. After execution the Codeunit instance is trashed.

Initiating an instance of this Codeunit and trashing it again, for every sales document being posted, is a waste of resources.

Let’s change the Codeunit and make it a “Single Instance” Codeunit.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;
    
    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;


}

What happens now is that the Codeunit only has one instance for each session. When the first sales document is posted, an instance of the Codeunit is created and kept in memory on the server for as long as the session is alive.

This will save the resources needed to initialize an instance and tear it down again.

Making sure that our subscriber Codeunits are set to single instance is even more important for subscribers to system events that are frequently executed.

Note that a single instance Codeunit used for subscription should not have any global variables, since the global variables are also kept in memory throughout the session lifetime.

Make sure that whatever is executed inside a single instance subscriber Codeunit is executed in a local procedure. The variables inside a local procedure are cleared between every execution, also in a single instance Codeunit.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);

    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');    
    end;

}

If your custom code executes every time that the subscriber is executed then I am fine with having that code in a local procedure inside the single instance Codeunit.

Still, I would suggest putting the code in another Codeunit, and keeping the subscriber Codeunit as small as possible.

This is even more important if the custom code only executes on a given condition.

An example of a Codeunit that you call from the subscriber Codeunit could be like this.

codeunit 50001 MyCodeCalledFromSubscriber
{
    TableNo = "Sales Header";
    
    trigger OnRun()
    begin
        ExecuteBusinessLogic(Rec);
    end;
    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');    
    end;
}

And I change my subscriber Codeunit to only execute this code on a given condition.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);

    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        MyCodeCalledFromSubscriber: Codeunit MyCodeCalledFromSubscriber;
    begin
        if SalesHeader."Document Type" = SalesHeader."Document Type"::Order then
            MyCodeCalledFromSubscriber.Run(SalesHeader);
    end;

}

This pattern makes sure that the execution is as fast as possible and no unneeded variables are populating the server memory.

AdvaniaGIT: Configure Build and Test using Visual Studio Online

The goal of this post is to demo from start to finish the automated build and test of an AL solution for Microsoft Dynamics 365 Business Central.

Configure build steps

The build steps using AdvaniaGIT are similar to the build steps Soren describes here.

The first two steps and the last one are created automatically.  We create PowerShell steps to be executed in between.

To begin with I like to make sure that the previous build environment is removed.  If everything is configured and working correctly the environment is removed at the end of every build.

Since I am working on daily builds I like to leave the environment active after the build so that I can access the latest environment any time to look at code and configuration changes.

When putting together this build I used some of the information found on Tobias Fenster’s blog and also on Kamil Sacek’s blog.

Remove existing container

Every PowerShell task executes the same basic script with different parameters.  The script path is

C:\AdvaniaGIT\Scripts\Start-CustomAction.ps1

The Arguments are defined for each task.  I always select which script to execute and always add the ‘BuildMode=$true’ setting.

-ScriptName Remove-NavEnvironment.ps1 -BuildSettings @{BuildMode=$true}
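To make it clear how the script path and the arguments come together, the first task effectively runs something like the line below.  The -Repository parameter defaults to the current location and can also be passed explicitly, as shown later in this series.

C:\AdvaniaGIT\Scripts\Start-CustomAction.ps1 -ScriptName Remove-NavEnvironment.ps1 -BuildSettings @{BuildMode=$true}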

Remember that the ‘setup.json’ configuration and the machine configuration are read every time an action is started.

The ‘setup.json’ parameter ‘projectName’ is used as the docker container name.  This first action will look for a container matching the ‘projectName’ and remove it.
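Conceptually, the removal boils down to something like this sketch.  This is illustrative only, not the actual AdvaniaGIT script, and the ‘projectName’ value is hypothetical.

$projectName = 'businesscentral'   # hypothetical 'projectName' value from setup.json
# Remove the container if one with a matching name exists
if (docker ps -a -q --filter "name=$projectName") {
    docker rm -f $projectName
}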

Build and Update Docker Container
-ScriptName Build-NavEnvironment.ps1 -BuildSettings @{dockerMemoryLimit='4G';BuildMode=$true;dockerTestToolkit=$true}

This task will start downloading the docker image that is defined in the ‘dockerImage’ property in ‘setup.json’.

To make sure that docker allocates enough memory for the build use the ‘dockerMemoryLimit’ parameter and allow at least 4G of memory.

The ‘dockerTestToolkit’ parameter is required if the AL Test application uses the standard test toolkit libraries.  The G/L Source Names test application uses some of the standard libraries and therefore, using this parameter, I trigger the toolkit import into my docker container.

Initialize Test Company
-ScriptName Initialize-NAVCompany.ps1 -BuildSettings @{BuildMode=$true}

The daily builds are not shipped with the CRONUS company.  To be able to execute the test using the NAV Client I need to have a company marked for evaluation.

This action will remove the existing company, create a new evaluation company and run the Company Initialize Codeunit.

If you are not running tests then this step is not needed.

Download AL Add-in
-ScriptName Download-ALAddin.ps1 -BuildSettings @{BuildMode=$true}

The build machine will create a folder in ‘C:\AdvaniaGIT\Workspace’ using the ‘branchId’ from ‘setup.json’.  That said, we can only have one build process running for each GIT branch.  I have not found the need to allow multiple concurrent builds for a GIT branch.  

The AL extension for VS Code is downloaded and extracted to the ‘vsix’ folder.

Download AL Symbols
-ScriptName Download-ALSymbols.ps1 -BuildSettings @{BuildMode=$true}

The symbol files (app files) are required to build the AL application.  These two files are downloaded from the docker container into our work space ‘Symbols’ folder.

Build AL Solution
-ScriptName Build-ALSolution.ps1 -BuildSettings @{BuildMode=$true;buildID=$(Build.BuildID)}

The AL app will be compiled and created in the work space ‘out’ folder.  I add the ‘Build.BuildID’ from Visual Studio to my app version in every build.  Remember that the AL solution is in the AL folder on my GIT branch as stated in the machine configuration.  Overwrite that configuration in ‘setup.json’ or with ‘BuildSettings’ parameter if needed.

Copy AL Solution to Symbols
-ScriptName Copy-ALSolutionToSymbols.ps1 -BuildSettings @{BuildMode=$true}

The ‘APP’ file is copied from the work space ‘out’ folder to the work space ‘Symbols’ folder.  This is required since I am building the test app on top of the solution app.  If you are not running tests then you can skip this and the next five tasks.

Build AL Test Solution
-ScriptName Build-ALTestSolution.ps1 -BuildSettings @{BuildMode=$true;buildID=$(Build.BuildID)}

This task is almost identical to the AL solution build task.  Here the machine parameters for the AL solution and the Test solution are combined into a single folder ‘ALTests’ to locate the folder containing the AL test application source code.

The test application ‘APP’ file is copied to the work space ‘out’ folder.

Install AL Extension
-ScriptName Install-ALExtensionsToDocker.ps1 -BuildSettings @{BuildMode=$true}

All the ‘APP’ files from the work space ‘out’ folder are published and installed in the docker container.

Execute AL Test Codeunits
-ScriptName Start-ALTestsExecution.ps1 -BuildSettings @{BuildMode=$true}

This task requires the ‘idRange’ in the ‘app.json’ to be specified.

Every Codeunit in this number range with ‘Subtype=Test’ will be added to the test pipeline.

The server instance is restarted before the tests are executed.  I have asked Microsoft to do one code change to be able to skip this step.

The tests are then executed using the NAV Client that is copied to the build machine and started from there.

Save Test Results
-ScriptName Save-TestResults.ps1 -BuildSettings @{BuildMode=$true;TestResultsPath='TEST-Build_$(Build.BuildID).trx'}

Test results are downloaded from the docker container database and formatted into the VSTest Xml format.  The ‘TestResultsPath’ is relative to the repository path.

Publish Test Results

This task is a built-in Visual Studio task.  The test result files must match the ‘TestResultsPath’ from the previous step.

The ‘$(System.DefaultWorkingDirectory)’ is the repository path.

Copy AL Solution to Artifact folder
-ScriptName Copy-ALSolutionToArtifactStagingDirectory.ps1 -BuildSettings @{BuildMode=$true}

The ‘Artifact’ folder is created in the repository folder and every ‘APP’ in the work space ‘out’ folder is copied to this folder.

Sign AL Solution in Artifact folder
-ScriptName Sign-ArtifactAppPackage.ps1 -BuildSettings @{BuildMode=$true}

The code signing certificate is needed and must be available on the build machine.  The ‘signtool.exe’ to be used, the certificate path and the certificate password are specified in the machine setup ‘GITSettings.json’.

Publish Artifact: App

This is also a Visual Studio task that will upload all files in the ‘Artifact’ sub folder in the GIT repository to the build artifacts.

Remove Container
-ScriptName Remove-NavEnvironment.ps1

In my setup I disable this and the next task.  This is done so that I can work with the results and the development environment after the build.  This task is also configured to be executed even if something fails in the build pipeline.

Remove Branch Work Folder
-ScriptName Remove-BranchWorkFolder.ps1 -BuildSettings @{BuildMode=$true}

The folder created in the work space by adding the branch ID will take up space on the build machine.  This task will remove that folder and everything it contains.

Delete Logs
-ScriptName Delete-OldLogs.ps1 -BuildSettings @{BuildMode=$true}

When AdvaniaGIT executes custom actions a log folder is created.  The log folders are located in ‘C:\AdvaniaGIT\Log’ folder.

When the NAV Client is copied to the host it is stored in a log folder.

Every sub folder that is older than seven days will be removed from the log folder in this task.
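Conceptually the cleanup is something like this sketch; the actual AdvaniaGIT implementation may differ.

# Remove log sub folders that are older than seven days
Get-ChildItem -Path 'C:\AdvaniaGIT\Log' -Directory |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
    Remove-Item -Recurse -Force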

Conclusion

I configured my build machine as described in my previous post.

I use my free subscription to Visual Studio Online to store my source code in GIT.

I installed the build agent on my build server and connected to my Visual Studio Online subscription.

I added the docker repository information and login credentials to the ‘DockerSettings.json’ to be able to build the daily builds.

AdvaniaGIT is accessible on GitHub.

Good luck!

AdvaniaGIT: About the build steps

The goal of this post is to demo from start to finish the automated build and test of an AL solution for Microsoft Dynamics 365 Business Central.

About the build steps

All build steps are executed in the same way.  In the folder ‘C:\AdvaniaGIT\Scripts’ the script ‘Start-CustomAction.ps1’ is executed with parameters.

param
(
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyName=$true)]
    [String]$Repository = (Get-Location).Path,
    [Parameter(Mandatory=$True, ValueFromPipelineByPropertyName=$true)]
    [String]$ScriptName,
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyName=$true)]
    [String]$InAdminMode='$false',
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyName=$true)]
    [String]$Wait='$false',
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyName=$true)]
    [HashTable]$BuildSettings
)

The AdvaniaGIT custom action is executed in the same way from a build machine and from a development machine.

When we created the container in our last post from Visual Studio Code with the command (Ctrl+Shift+P) ‘Build NAV Environment’, Visual Studio Code executed

Start-AdvaniaGITAction -Repository c:\Users\navlightadmin\businesscentral -ScriptName "Build-NavEnvironment.ps1" -Wait $false

From the build task we execute ‘C:\AdvaniaGIT\Scripts\Start-CustomAction.ps1’ with these parameters

-ScriptName Build-NavEnvironment.ps1 -Repository $(System.DefaultWorkingDirectory) -BuildSettings @{BuildMode=$true}

We can see that these commands are almost the same.  We have one additional parameter in the build script to notify the scripts that we are in Build Mode.

Each AdvaniaGIT build or development machine has a ‘C:\AdvaniaGIT\Data\GITSettings.Json’ configuration file.

When the scripts are started this file is read and all the settings imported.  Then the repository setup file is imported.  The default repository setup file is ‘setup.json’ as stated in GIT settings.  If the same parameters are in both the machine settings and in the repository settings then the repository settings are used.

The same structure is used for the ‘BuildSettings’ parameter that can be passed to the custom action.  The build settings will overwrite the same parameter in both the machine settings and the repository settings.
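To illustrate the precedence, a merge of the three layers could be sketched like this.  This is illustrative only and not the actual AdvaniaGIT code.

# Machine settings first, then repository settings, then build settings override everything
$machineSettings    = Get-Content 'C:\AdvaniaGIT\Data\GITSettings.json' -Raw | ConvertFrom-Json
$repositorySettings = Get-Content (Join-Path $Repository 'setup.json') -Raw | ConvertFrom-Json
$settings = @{}
foreach ($property in $machineSettings.PSObject.Properties)    { $settings[$property.Name] = $property.Value }
foreach ($property in $repositorySettings.PSObject.Properties) { $settings[$property.Name] = $property.Value }
if ($BuildSettings) { foreach ($key in $BuildSettings.Keys) { $settings[$key] = $BuildSettings[$key] } }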

The default settings are built around the folder structure that I like to use.  For example, we have our C/AL objects in the ‘Objects’ folder.  Microsoft has their objects in the ‘BaseApp’ folder.  Just by adding the ‘objectsPath’ parameter to the repository settings for the Microsoft repository I can use their structure without problems.

If I want to execute the exact same functionality in Visual Studio Code as I expect to get from my build script, I can add the ‘BuildSettings’ parameter to the command.

Start-AdvaniaGITAction -Repository c:\Users\navlightadmin\businesscentral -ScriptName "Build-NavEnvironment.ps1" -Wait $false -BuildSettings @{BuildMode=$true}

The folder structure

The structure is defined in the settings files.  By default I have the ‘AL’ folder for the main project and the ‘ALTests’ folder for the test project.  An example can be seen in the G/L Source Names repository.

In C/AL we are using deltas and using the build server to merge our solutions into a single solution.  Therefore we have a single repository for a single NAV version and put our solutions and customizations into branches.

In AL this is no longer needed.  We can have a dedicated repository for each solution if we like to, since the scripts will not be doing any merge between branches.

AdvaniaGIT: Setup and configure the build machine

The goal of this post is to demo from start to finish the automated build and test of an AL solution for Microsoft Dynamics 365 Business Central.

Setup and configure the build machine

We will create our build machine from a standard Windows 2016 template in Azure.

Docker containers and container images will take a lot of disk space.  The data is stored in %ProgramData%\docker

It is obvious that we will not be able to store all of that on the system SSD drive.  To solve this I create a 1TB HDD data disk in Azure.

After starting the Azure VM and opening the Server Manager to look at the File and Storage Service we can see the new empty disk that needs configuration.

Right click the new drive to create a new volume.

And assign the drive letter

Next go to Add roles and features to add the Containers feature.  More information can be found here.  We also need to add ‘.NET Framework 3.5 Features’.

I also like to make sure that all Microsoft updates have been installed.

Now I start PowerShell ISE as Administrator.

As Windows Servers are usually configured in a way that prohibits downloads I like to continue the installation task in PowerShell.

To enable all the scripts to be executed we need to change the execution policy for PowerShell scripts.  Executing

Set-ExecutionPolicy -ExecutionPolicy Unrestricted

will take care of that. 

Confirm with Yes to all.

To make sure that all the following download functions will execute successfully we need to change the TLS configuration with another PowerShell command.

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

Let’s download Visual Studio Code!  Use the following PowerShell command

Invoke-WebRequest -Uri https://go.microsoft.com/fwlink/?Linkid=852157 -OutFile "${env:USERPROFILE}\Desktop\VSCodeInstallation.exe"

to download the installation file to your desktop.  Start the installation.  During installation I like to select all available additional tasks.

We also need to download Git.  The following PowerShell command

Invoke-WebRequest -Uri https://github.com/git-for-windows/git/releases/download/v2.18.0.windows.1/Git-2.18.0-64-bit.exe -OutFile "${env:USERPROFILE}\Desktop\GITInstallation.exe"

will download the latest version available at the time of this blog post.  The only thing I change from the defaults during the Git setup is the default editor.  I like to use Visual Studio Code.

Go ahead and start Visual Studio Code as Administrator.

Add the AdvaniaGIT extension to Visual Studio Code

Install the AdvaniaGIT PowerShell Scripts!  We access the commands in Visual Studio Code by pressing Ctrl+Shift+P.  From there we type to search for the command ‘Advania: Go!’ and when it is selected we press Enter.

You will get a small notification dialog asking you to switch to the AdvaniaGIT terminal window.

Accept the default path for the installation but select No to the two optional installation options.

We need a development license to work with NAV and Business Central.  Copy this license into the ‘C:\AdvaniaGIT\License’ folder.  In the ‘GITSettings.json’ file that Visual Studio Code opened during the AdvaniaGIT installation we need to point to this license file.

The DockerSettings.json file is also opened during installation, and if you have access to the insider builds you need to update that file.

{
    "RepositoryPath":  "bcinsider.azurecr.io",
    "RepositoryUserName":  "User Name from Collaborate",
    "RepositoryPassword":  "Password from Collaborate",
    "ClientFolders":  []
}

If not, make sure to leave all settings blank.

{
  "RepositoryPath":  "",
  "RepositoryUserName":  "",
  "RepositoryPassword":  "",
  "ClientFolders":  []
}

Save both these configuration files and restart Visual Studio Code.  This restart is required to make sure Visual Studio Code recognizes the AdvaniaGIT PowerShell modules.

Let’s open our first GIT repository.  We start by opening the NAV 2018 repository.  Repositories must have the setup.json file in the root folder to support the AdvaniaGIT functionality.

I need some installation files from the NAV 2018 DVD and I will start by cloning my GitHub NAV 2018 repository.  From GitHub I copy the Url to the repository.  In Visual Studio Code I open the commands with Ctrl+Shift+P and execute the command ‘Git: Clone’.

I selected the default folder for the local copy and accepted to open the repository folder.  Again with Ctrl+Shift+P I start the NAV Installation.

The download will start.  The country version we are downloading does not matter at this point.  Every country has the same installation files that we require.

This will download NAV and start the installation.  I will just cancel the installation and manually install just what I need.

  • Microsoft SQL Server\sqlncli64
  • Microsoft SQL Server Management Objects\SQLSysClrTypes
  • Microsoft Visual C++ 2013\vcredist_x64
  • Microsoft Visual C++ 2013\vcredist_x86
  • Microsoft Visual C++ 2017\vcredist_x64
  • Microsoft Visual Studio 2010 Tools For Office Redist\vstor_redist

To enable Windows authentication for the build containers we need to save the Windows credentials.  I am running as the user “navlightadmin”.  I securely save the password for this user by starting a command (Ctrl+Shift+P) and selecting to save container credentials.

For all the docker container support I like to use the NAV Container Helper from Microsoft.  With another command (Ctrl+Shift+P) I install the container helper to the server.

To complete the docker installation I execute

Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force
Install-Module -Name DockerMsftProvider -Force
Install-Package -Name docker -ProviderName DockerMsftProvider -Force

in Visual Studio Code Terminal.

We need to point docker to our data storage drive.  Kamil Sacek already pointed this out to us.

I use Visual Studio Code to update the docker configuration.  As pointed out here the default docker configuration file can be found at ‘C:\ProgramData\Docker\config\daemon.json’. If this file does not already exist, it can be created.  I update the ‘data-root’ configuration.
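Staying in PowerShell, the update can be done like this.  I assume here that the data disk got the drive letter F: and that daemon.json does not contain other settings yet; if it does, merge the ‘data-root’ key into the existing file instead of overwriting it.

$daemonConfigPath = 'C:\ProgramData\Docker\config\daemon.json'
New-Item -ItemType Directory -Path (Split-Path $daemonConfigPath) -Force | Out-Null
# Point the docker data root to the data disk
@{ 'data-root' = 'F:\docker' } | ConvertTo-Json | Set-Content -Path $daemonConfigPath -Encoding Ascii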

Now let’s restart the server by typing

Restart-Computer -Force

or manually.

After restart, open Visual Studio Code as Administrator.

Now to verify the installation let’s clone my Business Central repository.  Start command (Ctrl+Shift+P) ‘Git: Clone’ and paste in the Url to the repository.

This repository has a setup.json that points to the Business Central Sandbox.

Make sure to have the Integrated Terminal visible and let’s verify the installation by executing a command (Ctrl+Shift+P) ‘Advania: Build NAV Environment’ to build the development environment.

The image download should start…

You should now be able to use the command (Ctrl+Shift+P) ‘Advania: Start Client’,  ‘Advania: Start Web Client’, ‘Advania: Start FinSql’ and ‘Advania: Start Debugger’ to verify all the required NAV/BC functionality.

If you are happy with the results you should be able to install the build agent as shown by Soren Klemmensen here.

 

C/AL and AL Side-by-Side Development with AdvaniaGIT

Microsoft supports Side-by-Side development for C/AL and AL.  To start using the Side-by-Side development make sure you have the latest version of AdvaniaGIT add-in for Visual Studio Code and update the PowerShell scripts by using the “Advania: Go!” command.

When the Business Central environment is built use the “Advania: Build C/AL Symbol References for AL” to enable the Side-by-Side development for this environment.  This function will reconfigure the service and execute the Generate Symbol References command for the environment.  From here on everything you change in C/AL on this environment will update the AL Symbol References.

So let’s try this out.

I converted my C/AL project to an AL project with the steps described in my previous post.  Then I selected to open Visual Studio Code in the AL folder.

In my new Visual Studio Code window I selected to build an environment – the Docker Container.

When AdvaniaGIT builds a container it will install the AL Extension for Visual Studio Code from that Container.  We need to read the output of the environment build.  In this example I am asked to restart Visual Studio Code before reinstalling AL Language.  Note that if you are not asked to restart Visual Studio Code you don’t need to do that.

After restart I can see that the AL Language extension for Visual Studio Code is missing.

To fix this I execute the “Advania: Build NAV Environment” command again.  This time, since the Container is already running, only the NAV license and the AL Extension will be updated.

Restart Visual Studio Code again and we are ready to go.

If we build a new environment for our AL project we must update the environment settings in .vscode\launch.json.  This we can do with a built-in AdvaniaGIT command.

We can verify the environment by executing “Advania: Check NAV Environment”.  Everything should be up and running at this time.

Since we will be using Side-by-Side development for C/AL and AL in this environment we need to enable that by executing “Advania: Build C/AL Symbol References for AL”.

This will take a few minutes to execute.

Don’t worry about the warning.  AdvaniaGIT takes care of restarting the service.  Let’s download AL Symbols and see what happens.

We can see that AL now recognizes the standard symbols, but my custom Codeunit, “IS Soap Proxy Client Mgt.”, is not recognized.  I will tell you more about this Codeunit in my next blog post.

I start FinSql to import the Codeunit “IS Soap Proxy Client Mgt.”

Import the FOB file

Close FinSql and execute the “AL: Download Symbols” again.  We can now see that AL recognizes my C/AL Codeunit.

Now I am good to go.

Using the Translation Service for G/L Source Names

Until now I have had my G/L Source Names extension in English only.

Now, with the upcoming release of Microsoft Dynamics 365 Business Central, I need to supply more languages.  What does a man do when he does not speak the language?

I gave a shout out yesterday on Twitter asking for help with translation.  Tobias Fenster reminded me that we have a service to help us with that.  I had already tried to work with this service and now it was time to test the service on my G/L Source Names extension.

In my previous posts I had created the Xliff translation files from my old ML properties.  I manually translated them to my native language, is-IS.

I already got a Danish translation file sent from a colleague.

Before we start; I needed to do a minor update to the AdvaniaGIT tools.  Make sure you run “Advania: Go!” to update the PowerShell Script Package.  Then restart Visual Studio Code.

Off to the Microsoft Lifecycle Services to utilize the translation service.

Now, let’s prepare the Xliff files in Visual Studio Code.  From the last build I have the default GL Source Names.g.xlf file.  I executed the action to create Xliff files.

This action will prompt for a selection of language.  The selection is from the languages included in the NAV DVD.

After selection the system will prompt for a translation file that is exported from FinSql.  This I already showed in a YouTube Video.  If you don’t have a file from FinSql you can just cancel this part.  If you already have an Xliff file for that language then it will be imported into memory as translation data and then removed.

This method is therefore useful if you want to reuse the Xliff file data after an extension update.  All new files will be based on the g.xlf file.

I basically did this action for all 25 languages.  I already had the is-IS and da-DK files, so they were updated.  Since the source language is en-US, all my en-XX files were automatically translated.  All the other languages have the translation state set to “needs-translation”.

<trans-unit id="Table 102911037 - Field 1343707150 - Property 2879900210" size-unit="char" translate="yes" xml:space="preserve">
  <source>Source Name</source>
  <target state="needs-translation"></target>
  <note from="Developer" annotates="general" priority="2" />
  <note from="Xliff Generator" annotates="general" priority="3">Table:O4N GL SN - Field:Source Name</note>
</trans-unit>

All these files I need to upload to the Translation Service.  From the Lifecycle Services menu select the Translation Service.  This will open the Translation Service Dashboard.

Press + to add a translation request.

I now need to zip and upload the nl-NL file from my Translations folder.

After the upload I submit the translation request.

The request will appear on the dashboard with the status Processing.  Now I need to wait for the status to change to Completed.  Or, create requests for all the other languages and upload the files to submit.

When translation has completed I can download the result.

And I have a translation in state “needs-review-translation”.

<trans-unit id="Table 102911037 - Field 1343707150 - Property 2879900210" translate="yes" xml:space="preserve">
  <source>Source Name</source>
  <target state="needs-review-translation" state-qualifier="mt-suggestion">Bronnaam</target>
  <note from="Xliff Generator" annotates="general" priority="3">Table:O4N GL SN - Field:Source Name</note>
</trans-unit>

Now I just need to complete all languages and push changes to GitHub.

Please, if you can, download your language file and look at the results.

Why do we need Interface Codeunits

And what is an interface Codeunit?

A Codeunit that you can execute with CODEUNIT.RUN to perform a given task is, from my point of view, an interface Codeunit.

An interface Codeunit has a parameter that we define in the TableNo property of the Codeunit.

This parameter is always a table object.

We have multiple examples of this already in the application.  Codeunits 12 and 80 are two of them.  There the parameter is a mixed set of data and settings.  Some of the table fields are business data being pushed into the business logic.  Other fields are settings used to control the business logic.

Table 36, Sales Header, is used as the parameter for Codeunit 80.  Fields like No., Bill-to Customer No., Posting Date and so on are business data.  Fields like Ship, Invoice, Print Posted Documents are settings used to control the business logic but have no meaning as business data.

Every table is then a potential parameter for an interface Codeunit.  Our extension can easily create a table that we use as a parameter table.  The record does not need to be inserted into the table to be passed to the Codeunit.

Let’s look at another scenario.  We know that there is an Interface Codeunit with the name “My Interface Codeunit”, but it belongs to an Extension that may or may not be installed in the database.

Here we use the virtual table “CodeUnit Metadata” to look for the Interface Codeunit before execution.

This is all simple and straightforward.  Things that we have been doing for a number of years.

Using the TempBlob table as a parameter also gives us the flexibility to define a more complex interface for the Codeunit.  The TempBlob table can store complex data in Json or Xml format and pass that to the Codeunit.

Let’s take an example.  We have an extension that extends the discount calculation for Customers and Items.  We would like to ask this extension for the discount a given customer will get for a given Item.  Questions like that we can represent in a Json document.

{
    "CustomerNo": "C10000",
    "ItemNo": "1000"
}

And the question can be coded like this.

The Interface Codeunit could be something like

With a Page that contains a single Text variable (Json) we can turn this into a web service.

That we can use from C# with code like

var navOdataUrl = new System.Uri("https://nav2018dev.westeurope.cloudapp.azure.com:7048/NAV/OData/Company('CRONUS%20International%20Ltd.')/AlexaRequest?$format=json");
var credentials = new NetworkCredential("navUser", "+lppLBb7OQJxlOfZ7CpboRCDcbmAEoCCJpg7cmAEReQ=");
var handler = new HttpClientHandler { Credentials = credentials };

using (var client = new HttpClient(handler))
{
    var Json = new { CustomerNo = "C10000", ItemNo = "1000" };
    // Json.ToString() on an anonymous type does not produce valid JSON, so build the JObject directly
    JObject JsonRequest = JObject.FromObject(Json);
    JObject requestJson = new JObject();                
    JProperty jProperty = new JProperty("Json", JsonRequest.ToString());
    requestJson.Add(jProperty);
    var requestData = new StringContent(requestJson.ToString(), Encoding.UTF8, "application/json");
    var response = await client.PostAsync(navOdataUrl,requestData);
    dynamic result = await response.Content.ReadAsStringAsync();

    JObject responseJson = JObject.Parse(Convert.ToString(result));
    if (responseJson.TryGetValue("Json", out JToken responseJToken))
    {
        jProperty = responseJson.Property("Json");
        JObject JsonResponse = JObject.Parse(Convert.ToString(jProperty.Value));
        Console.WriteLine(JsonResponse.ToString());
    }
}

This is just scratching the surface of what we can do.  Copying a record to and from Json is easy to do with these functions.

And even though I am showing all this in C/AL, there should be no problem in using the new AL in Visual Studio Code to get the same results.