This configuration points to the W1 Business Central OnPrem Docker Image. Now, let’s point to the Spanish one.
And let’s build a container.
Switching the terminal to AdvaniaGIT, I can see that the Spanish Docker image is now being pulled down to my laptop.
This may take a few minutes…
After the container is ready, I start FinSql.exe.
Just by opening the first table and the properties of its first field, I can verify that I have the Spanish captions installed.
So, let’s export these Spanish captions: select all objects except the new trigger codeunits (Business Central only) and choose to export the translation…
Save the export to a TXT file.
Opening this file in Visual Studio Code, we can see that the code page does not match the required UTF-8 format. Here we can also see that we have English in lines with A1033 and Spanish in lines with A1034.
We need to process this file with PowerShell. Executing that script can also take some time…
This script reads the file using the “Oem” code page, which is the code page FinSql uses for import and export. We read through the file, and every line that is identified as Spanish is then added to the output variable. We finish by writing that output variable back to the same file using the “utf8” code page.
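The full script is not included here, but a minimal sketch of the approach could look like this (the file path is hypothetical, and the actual AdvaniaGIT script may differ in detail):

```powershell
# Hypothetical path to the translation export from FinSql
$FilePath = 'C:\Temp\ESTranslation.txt'

# Read with the "Oem" code page, the one FinSql uses for export
$Content = Get-Content -Path $FilePath -Encoding Oem

$Output = @()
foreach ($Line in $Content) {
    # Keep only the Spanish caption lines (language id A1034)
    if ($Line -match 'A1034') {
        $Output += $Line
    }
}

# Write the result back to the same file using UTF-8
Set-Content -Path $FilePath -Value $Output -Encoding UTF8
```

The key point is the encoding switch: read as Oem, write as UTF-8, since the server-side translation files must be UTF-8.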
Visual Studio Code should refresh the file automatically.
We need to create a “Translations” folder in the server folder. The default server instance uses the root “Translations” folder.
If you have multiple instances, the “Translations” folder needs to be inside each instance folder.
Since I am running this in a container I may need to create this folder in the container.
Then, copy the updated file to the “Translations” folder.
And make sure it has been put into the correct path.
We need to restart the service instance.
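When running in a container, copying the file and restarting the service instance can be sketched like this (container name, paths and instance name are examples only; adjust to your version and setup):

```powershell
# Copy the converted translation file into the container's Translations folder
docker cp 'C:\Temp\ESTranslation.txt' 'bcserver:C:\Program Files\Microsoft Dynamics NAV\110\Service\Translations\'

# Restart the service instance inside the container
docker exec bcserver powershell -Command `
    "Import-Module 'C:\Program Files\Microsoft Dynamics NAV\110\Service\NavAdminTool.ps1'; Restart-NAVServerInstance -ServerInstance NAV"
```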
Then in my Web Client I can verify that the Spanish application language is now available.
Using the latest Windows 10 version and the latest version of Docker means that we can now use “Process Isolation” images when running NAV and Business Central.
Running containers on Windows 10 without process isolation requires Hyper-V support. Inside Hyper-V, a Server Core instance runs as the platform for the processes executed by the container created from the image. With process isolation images, the Windows 10 operating system itself is used as the foundation and the Hyper-V Server Core is not needed. This little fact alone can save up to 4GB of memory usage per container.
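When starting a container manually, process isolation is selected with the `--isolation` switch (the image name below is just an example):

```powershell
# Run a Business Central container with process isolation (no Hyper-V layer)
docker run -e accept_eula=Y --isolation=process mcr.microsoft.com/businesscentral/onprem
```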
local procedure MyProcedure(var SalesHeader: Record "Sales Header")
begin
    Message('I am pleased that you called.');
end;
What happens now is that the Codeunit has only one instance for each session. When the first sales document is posted, an instance of the Codeunit is created and kept in memory on the server for as long as the session is alive.
This will save the resources needed to initialize an instance and tear it down again.
Making sure that our subscriber Codeunits are set to single instance is even more important for subscribers to system events that are frequently executed.
Note that a single instance Codeunit used for subscriptions should not have any global variables, since global variables are also kept in memory throughout the session lifetime.
Make sure that whatever is executed inside a single instance subscriber Codeunit is executed in a local procedure. The variables inside a local procedure are cleared between every execution, also in a single instance Codeunit.
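Putting these points together, a minimal sketch of such a single instance subscriber could look like this in AL (the object number and name are examples only; AL subscribers only need to declare the publisher parameters they use):

```al
codeunit 50100 "My Posting Subscriber"
{
    SingleInstance = true;

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnAfterPostSalesDoc', '', false, false)]
    local procedure OnAfterPostSalesDoc(var SalesHeader: Record "Sales Header")
    begin
        HandlePostedDocument(SalesHeader);
    end;

    // All work happens in local procedures; local variables are cleared
    // between executions, even in a single instance Codeunit.
    local procedure HandlePostedDocument(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;
}
```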
The “Enabled” Codeunit will test for read permission on the Setup table and check whether the “Enabled” flag has been set in the default record.
LOCAL TestEnabled(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
This is how we can make sure that a module is installed and enabled before we start using it or any of the dependent modules.
Table Access Interface
The main module has a standard response table. We map some of the communication responses to this table via Data Exchange Definition. From other modules we like to be able to read the response from the response table.
The response table uses a GUID value for a primary key and has an integer field for the “Data Exchange Entry No.”. From the sub module we ask if a response exists for the current “Data Exchange Entry No.” by calling the interface.
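A sketch of such an interface call from the sub module could look like this in C/AL. The codeunit and helper function names here are illustrative only, not the actual implementation:

```al
LOCAL PROCEDURE ResponseExists(DataExchEntryNo : Integer) : Boolean
VAR
  TempBlob : TEMPORARY Record TempBlob;
  JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.";
BEGIN
  WITH JsonInterfaceMgt DO BEGIN
    // Build the JSON request
    Initialize;
    AddVariable('Method','ResponseExists');
    AddVariable('DataExchEntryNo',FORMAT(DataExchEntryNo));
    GetAsTempBlob(TempBlob);

    // Call the response table interface in the main module
    ExecuteInterfaceCodeunitIfExists('IS Response Interface',TempBlob,'');

    // Read the JSON response
    ReadFromTempBlob(TempBlob);
    EXIT(GetVariableBooleanValue('ResponseExists'));
  END;
END;
```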
Some processes can be both automatically and manually executed. For manual execution we like to display a request page on a Report. On that request page we can ask for variables, settings and verify before executing the process.
For automatic processing we have default settings and logic to find the correct variables before starting the process. And since one module should be able to start a process in the other, we use the JSON interface pattern for the processing Codeunit.
We also like to include the “Method” variable to add flexibility to the interface, even if there is only one method in the current implementation.
Reading through the code above we can see that we are also using the JSON interface to pass settings to the Data Exchange Framework. We put the JSON configuration into the “Table Filters” BLOB field in the Data Exchange where we can use it later in the data processing.
From the Report we start the process using the JSON interface.
This pattern is similar to the discovery pattern, where an Event is raised to register possible modules into a temporary table. Example of that is the “OnRegisterServiceConnection” event in Table 1400, Service Connection.
Since we can’t have an Event Subscriber in one module listening to an Event Publisher in another without creating compile dependencies, we have come up with a different solution.
We register functionality from the functionality module, and the list of modules is stored in a database table. The table uses a GUID and the Language ID for a primary key, and the view is then filtered by the Language ID to show only one entry for each module.
This pattern gives me a list of possible modules for that given functionality. I can open the Setup Page for that module and I can execute the Interface Codeunit for that module as well. Both the Setup Page ID and the Interface Codeunit ID are stored as object names.
The registration interface uses the Method variable to select the functionality. It can either register a new module or it can execute the method in the modules.
WITH JsonInterfaceMgt DO BEGIN
LOCAL RegisterCollectionApp(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
When the Subscriber that needs to respond to this Publisher is in another module, we need to extend the functionality using JSON interfaces.
First, we create a Codeunit within the Publisher module with Subscribers. The parameters in the Subscribers are converted to JSON and passed to the possible subscriber modules using the “ExecuteMethodInApps” function above.
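A sketch of such a bridging Subscriber inside the Publisher module could look like this in C/AL. The event, the record and the helper function names are illustrative; “ExecuteMethodInApps” is the function mentioned above:

```al
[EventSubscriber(Codeunit,80,OnAfterPostSalesDoc)]
LOCAL OnAfterPostSalesDoc(VAR SalesHeader : Record "Sales Header")
VAR
  TempBlob : TEMPORARY Record TempBlob;
  JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.";
BEGIN
  WITH JsonInterfaceMgt DO BEGIN
    // Convert the event parameters to JSON
    Initialize;
    AddVariable('Method','OnAfterPostSalesDoc');
    AddVariable('DocumentNo',SalesHeader."No.");
    GetAsTempBlob(TempBlob);

    // Forward the event as JSON to every registered module
    ExecuteMethodInApps(TempBlob);
  END;
END;
```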
Having standard ways of talking between modules and solutions has opened up for a lot of flexibility. We like to keep our solutions as small as possible.
We could mix “Methods” and “Versions” if, at a later time, we need to extend some of the interfaces. We need to honor the contract we have made for the interfaces: we must not make breaking changes, but we can certainly extend them without any problems.
By attaching the JSON Interface Codeunit to this post, I hope that you will use this pattern in your solutions. Use the code freely. It is supplied as-is and without any responsibility, obligations or requirements.
will download the latest version at the time of this blog post. The only thing I change from default during GIT setup is the default editor. I like to use Visual Studio Code.
Go ahead and start Visual Studio Code as Administrator.
Add the AdvaniaGIT extension to Visual Studio Code
Install the AdvaniaGIT PowerShell Scripts! We access the commands in Visual Studio Code by pressing Ctrl+Shift+P. From there we type to search for the command ‘Advania: Go!’ and when it is selected we press Enter.
You will get a small notification dialog asking you to switch to the AdvaniaGIT terminal window.
Accept the default path for the installation but select No to the two optional installation options.
We need a development license to work with NAV and Business Central. This license you copy into the ‘C:\AdvaniaGIT\License’ folder. In the ‘GITSettings.json’ file that Visual Studio Code opened during AdvaniaGIT installation we need to point to this license file.
The DockerSettings.json file is also opened during installation and if you have access to the insider builds we need to update that file.
"RepositoryUserName": "User Name from Collaborate",
"RepositoryPassword": "Password from Collaborate",
If not, make sure to leave all these settings blank.
Blank Docker Config
Save both these configuration files and restart Visual Studio Code. This restart is required to make sure Visual Studio Code recognizes the AdvaniaGIT PowerShell modules.
Let’s open our first GIT repository. We start by opening the NAV 2018 repository. Repositories must have the setup.json file in the root folder to support the AdvaniaGIT functionality.
I need some installation files from the NAV 2018 DVD and I will start by cloning my GitHub NAV 2018 repository. From GitHub I copy the Url to the repository. In Visual Studio Code I open the commands with Ctrl+Shift+P and execute the command ‘Git: Clone’.
I selected the default folder for the local copy and accepted to open the repository folder. Again with Ctrl+Shift+P I start the NAV Installation.
The download will start. The country version we are downloading does not matter at this point. Every country has the same installation files that we require.
This will download NAV and start the installation. I will just cancel the installation and manually install just what I need.
Microsoft SQL Server\sqlncli64
Microsoft SQL Server Management Objects\SQLSysClrTypes
Microsoft Visual C++ 2013\vcredist_x64
Microsoft Visual C++ 2013\vcredist_x86
Microsoft Visual C++ 2017\vcredist_x64
Microsoft Visual Studio 2010 Tools For Office Redist\vstor_redist
To enable Windows authentication for the build containers we need to save the Windows credentials. I am running as the user “navlightadmin”. I securely save the password for this user by starting a command (Ctrl+Shift+P) and selecting to save container credentials.
I use Visual Studio Code to update the docker configuration. As pointed out here the default docker configuration file can be found at ‘C:\ProgramData\Docker\config\daemon.json’. If this file does not already exist, it can be created. I update the ‘data-root’ configuration.
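For example, to move the Docker data to another drive, the ‘daemon.json’ file could look like this (the drive and folder are just an example):

```json
{
  "data-root": "D:\\Docker"
}
```

Docker needs to be restarted before this change takes effect.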
Now let’s restart the server by typing
After restart, open Visual Studio Code as Administrator.
Make sure to have the Integrated Terminal visible and let’s verify the installation by executing a command (Ctrl+Shift+P) ‘Advania: Build NAV Environment’ to build the development environment.
The image download should start…
You should now be able to use the command (Ctrl+Shift+P) ‘Advania: Start Client’, ‘Advania: Start Web Client’, ‘Advania: Start FinSql’ and ‘Advania: Start Debugger’ to verify all the required NAV/BC functionality.
If you are happy with the results you should be able to install the build agent as shown by Soren Klemmensen here.
Now, with the upcoming release of Microsoft Dynamics 365 Business Central, I need to supply more languages. What does a man do when he does not speak the language?
I gave a shout out yesterday on Twitter asking for help with translation. Tobias Fenster reminded me that we have a service to help us with that. I had already tried to work with this service and now it was time to test the service on my G/L Source Names extension.
In my previous posts I had created the Xliff translation files from my old ML properties. I manually translated them to my native language, is-IS.
I already got a Danish translation file sent from a colleague.
Before we start, I needed to do a minor update to the AdvaniaGIT tools. Make sure you run “Advania: Go!” to update the PowerShell Script Package. Then restart Visual Studio Code.
Now, let’s prepare the Xliff files in Visual Studio Code. From the last build I have the default GL Source Names.g.xlf file. I executed the action to create Xliff files.
This action will prompt for a selection of language. The selection is from the languages included in the NAV DVD.
After the selection, the system will prompt for a translation file exported from FinSql. This I already showed in a YouTube video. If you don’t have a file from FinSql you can just cancel this part. If you already have an Xliff file for that language, it will be imported into memory as translation data and then removed.
This method is therefore useful if you want to reuse the Xliff file data after an extension update. All new files will be based on the g.xlf file.
I basically did this action for all 25 languages. I already had the is-IS and da-DK files, so they were updated. Since the source language is en-US, all my en-XX files were automatically translated. All the other languages have the translation state set to “needs-translation”.
</trans-unit>
<trans-unit id="Table 102911037 - Field 1343707150 - Property 2879900210" size-unit="char" translate="yes" xml:space="preserve">
<note from="Xliff Generator" annotates="general" priority="3">Table:O4N GL SN - Field:Source Name</note>
All these files I need to upload to the Translation Service. From the Lifecycle Services menu select the Translation Service. This will open the Translation Service Dashboard.
Press + to add a translation request.
I now need to zip and upload the nl-NL file from my Translations folder.
After the upload, I submit the translation request.
The request will appear on the dashboard with the status “Processing”. Now I need to wait for the status to change to “Completed”. Or, create requests for all the other languages and upload the files to submit.
When translation has completed I can download the result.
And I have a translation in state “needs-review-translation”.
<trans-unit id="Table 102911037 - Field 1343707150 - Property 2879900210" translate="yes" xml:space="preserve">