Compile Dependencies

As we move away from C/AL and the all-in-one database, we need to redesign how we make one application or functionality available to another.

Microsoft recommends that we build multiple AL extensions rather than putting all our code into a single extension.

My goal is to be able to develop, test and execute an extension regardless of whether other extensions are installed or not.  Some functionality may depend on other extensions.  Still, I don’t want to add compile dependencies between these extensions.

For just over a year we have been designing and implementing methods to solve this task.  In the coming blog entries I will be sharing these methods one by one.  I hope these JSON interface methods will help you solve your tasks as well.

At the last Directions EMEA I gave a talk about the path Advania is on, from a C/AL all-in-one database to multiple AL extensions.  Below is the content of that talk.  Feel free to skip it and wait for the upcoming blogs about the technical solutions.

From the start we have been calling this project Lego®.  I will spend some time on detailing the path we have been on for almost three years.  We have not yet started converting to AL, but it seems that the tooling Microsoft delivers with the fall release of Business Central will be the one we can use to start the conversion from C/AL to AL.  We have done some tests and they are promising.

This new way of doing things has enabled new ways of doing our business and opened a new dialog with our customers.

A brick should be able to stand alone.  The only dependencies allowed are on the standard Microsoft code and on Advania’s Icelandic localization, which we call IS365.  There are ways to make a solution or functionality available in the base application without having to modify any of the standard objects.  A discovery pattern is where the base application makes a call for Assisted Setup, Business Setup or Service Connections.  By responding to these calls, these events, the setup for the customized functionality is made accessible to the user.
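
As an example, a brick can surface its setup in Service Connections with a single event subscriber.  The sketch below is only an outline: the ADV objects are made-up names, and the exact InsertServiceConnection signature may differ between versions.

  // Subscriber in the brick; Table 1400 is the standard Service Connection table
  [EventSubscriber(Table,1400,OnRegisterServiceConnection)]
  LOCAL PROCEDURE HandleOnRegisterServiceConnection(VAR ServiceConnection : Record "Service Connection");
  BEGIN
    IF NOT SMSSetup.GET THEN BEGIN  // "ADV SMS Setup" is a hypothetical setup table
      SMSSetup.INIT;
      SMSSetup.INSERT;
    END;
    ServiceConnection.InsertServiceConnection(
      ServiceConnection,SMSSetup.RECORDID,SMSSetup.TABLECAPTION,'',PAGE::"ADV SMS Setup");
  END;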

The interface pattern that we created is the key here.  Using this interface pattern we are able to break compile dependencies without breaking the connection between individual solutions and functionalities.  I will spend some time on detailing how we have implemented this interface pattern in our solutions.

When Microsoft talks about a slice in their development they are talking about a similar thing as we do when we talk about a brick.  A piece of code that they check in and that needs to work independently.  The main difference is in dependencies and connectivity.  A slice can be dependent on another slice, whereas a brick as we define it can’t have compile dependencies on any other brick.

Object wise, a brick can vary greatly in size and complexity.  Some bricks only have a handful of objects.  Other bricks can have hundreds of objects.  The largest bricks are our ISV solutions.  Still, we have broken most of them down into smaller bricks.  Even if Iceland is small in population, we have most of the same complexity as other countries when it comes to delivering quality software.  We need to supply our customers with solutions to handle all the common business processes.  Icelandic partners have been very independent, and our extensive experience with Business Central and previous versions means that we already have all the solutions for all these business processes.  We need to handle these solutions as products, and we keep all coding in line with our new design principles.

We also have a number of bricks that have only one functionality.  As an example, one brick can download the Post Code list for Iceland to update it in the application.  Another brick can interface with the online banking web services.  A third one can use the Hardware Hub to scan documents or read a serial-connected bar code reader.

These small bricks also have a way to register themselves.  Most of them do so in Service Connections.  Some of them register themselves as added functionality in other bricks.

One example is our Notification App.  That app is used to send notifications to users.  These notifications can be sent both via email and to the built-in user notifications on the role center.  We have another app that has only one functionality.  That one enables sending an SMS via one of the telecommunication service providers in Iceland.  By installing and configuring the SMS app, the notification app is extended and notifications via SMS are enabled.

The SMS app has an interface that enables all other apps to verify its availability and use it to send an SMS.

About three years ago, when we came home from NAV TechDays, we brought Kamil Sacek, an MVP from the Czech Republic, with us.  Kamil had already done some source control management solutions for his company using TFS.  At that time everyone wanted to move to GIT and Microsoft was moving to GIT as well.  We spent a week up in Iceland applying Kamil’s knowledge and his scripts to an in-house GIT source control management system.  We had our preferences and our design principles, we boiled everything together and started our road to source control management.  That is when AdvaniaGIT was born.  AdvaniaGIT is the toolbox that we use in our day to day work.

GIT has repositories and branches within repositories.  GIT has a lot more, but that is where we started.  We started our NAV 2016 work on GIT and decided not to apply this way of working to older releases.  Most of our NAV 2016 work was done this way.  Some of it, of course, was not – but that was not a technical problem; it takes time to update employee mentality and knowledge.

The first branch in a repository is the master branch.  In there we store Microsoft’s base app, the W1 code base.  Looking at the GIT history for the master branch, we can see every CU of every release since NAV 2016.

Since we are one of the favorite countries for NAV, and now Business Central, we also have a Microsoft Icelandic release.  Every CU of that release since NAV 2016 is also available in our IS branch.  The branching structure for C/AL is very different from the branching structure for AL.  For C/AL we have one repository for each NAV and Business Central version.  For AL we have one repository for each app and one for each customer.

As most of you probably realize, there is a gap between the localized version delivered by Microsoft and the one a partner likes to offer to the local market.  The same thing applies in Iceland.  This gap we are closing with multiple bricks.  There is, however, a set of functionality and data that we must add to the base application to be used in multiple bricks.  Every Icelander has a social security number, and every company has a registration number following the same structure.  This field we add to customers, vendors, contacts and all types of bank accounts.

We call this a base branch.  A branch that we base every brick on and compare every brick to.

Every month we receive a new CU.  We always develop on the latest CU.  When a new update is available we update our NAV and Business Central installations and update the object export, both with the W1 release in the master branch and with the Icelandic release in the IS branch.

We then use the standard GIT merge functionality to merge the updates from the IS branch to the IS365 branch, merging all the updates from Microsoft into our own base localization branch.  By updating the IS365 base branch with each new update, every build that is based on that branch will use the updated code automatically.

Every time a developer opens a branch to work on, the first thing that must be done is a GIT merge from the IS365 branch.  By doing this we make sure that we are developing on the latest update and comparing our brick to the current update that is contained in the IS365 branch.  When development is done, all objects are exported back to GIT and the brick can be compared to the base IS365 branch, and the deltas can be extracted and stored.

This is how we worked in releases prior to NAV 2016.  Building a solution for a customer was a lot of work.  We manually brought all the code together into a solution, and if we then wanted to update with a new CU, that also was a lot of work.

Every solution and every functionality installed was very closely linked and we could easily call a function in one solution from another solution.  Life was good and the developer was happy.  The developer had everything available and everything under control.

The problem here is that this was not good business.  Applying an update to customers was a lot of work.  It was very hard to use the same code and the same solution for multiple customers.  Our shelf products needed to be manually added to the solution.  And we had no automated testing to ensure the quality of the code and solution.

Not to mention upgrades.  No one really knew what was being used and what was not.  No one was able to keep track of all the changes done.  We did have some comments in the code.  Some helpful, some not.  Big upgrades are very expensive for both the customer and the partner.

Microsoft also saw this as a problem and started to work on Events and Extensions.  After three years of that work, Extensions are brilliant.  It is very important to see where Microsoft is going, not where they are.  When we see where Microsoft is going we need to act in line with that future.  We could not wait three years for Extensions to be ready and then start working.  The endpoint was clear.  Microsoft base application with Advania Extensions was the way to go.  The road to that solution was not as clear.  That is when our LEGO® method was born.

It was clear that in order to move to Extensions we first needed to break everything apart.  Take all the bricks apart and put them in a pile on the desk.  We started this in NAV 2016.  But in NAV 2016 we played with LEGO® Duplo.  The big bricks.  We put a lot of the functionality in our core app.  Still, we were able to use the method, and we were able to build customer solutions and apply new CUs without too much work.  We already got a lot of repeatability there.  This took a few months, but about six months after the NAV 2016 release we were able to start updating our customers to NAV 2016 solutions built with the brick methods.  We were able to update these customers every month with new CUs.  Still, we did not update all of them.  Remember that I said that was not a technical problem.  Our staff has the same problem as our customers.  We tend to fall back to the way we are used to doing things.

We were still working on NAV 2016 when NAV 2017 was released and we did not move our bricks to NAV 2017.  We just spent a little more time educating our staff and preparing for the small bricks.  When NAV 2018 was released we started full force.  Each brick was a potential Extension, and what used to be in one brick is now in multiple bricks.  We spent some time on the code we had for NAV 2016 to figure out where to break things apart and what to keep together.  Everything we coded in C/AL in NAV 2018 has to follow the same rules that are set for Extension development.  We removed DotNet variables one by one and made sure not to change anything in the standard application.

As mentioned earlier, we have some data requirements that are commonly used in our bricks.  We also have some verification methods, both for registration numbers and bank account numbers.  There are some shared events and some shared functionality.

We did need to use some DotNet variables.  We also needed to add some events to the base application.  When we needed to do this we did it in the base IS365 brick.  And everything we did to break the Extension model we sent to Microsoft.  Through Microsoft’s AL Issues on GitHub we were able to request new events.  Through Microsoft’s cal-open-library on GitHub we were able to submit requests for DotNet wrappers.  So we broke the Extension rules in NAV 2018 only in the base app, knowing that in Business Central Microsoft would deliver everything we needed to get our changes out of the base application.

More complex DotNet usage we moved to Azure Functions.  A simple REST web request from our brick can trigger complex tasks using an Azure Function.  This has worked beautifully for us.

We created a few Codeunits with helper functionality.  Codeunits to handle some common tasks.  As an example, we have one Codeunit that every brick can use both to call interfaces and to create the response from an interface.  We also like to use the built-in Error Messages but needed some added functionality.  It made sense to have that functionality in the base app and make it available to all the bricks.

We call Azure Functions via the base app and we have ways to handle files from Azure Blob, Azure File System and Dropbox via the base app as well. 

We use interfaces to communicate between different bricks.  We can think about an interface like we think about a REST web service, or an Azure Function.  We supply a request JSON string to the interface and the interface response is also a JSON string.

For this we use the JSON Handler Codeunit in the base app, the IS365.

A REST web service has an http endpoint.  Our endpoints are identified by the Codeunit name.  As required by AppSource, we are using the prefix ADV for all objects, fields and functions in our bricks.  Therefore we can be sure that our Codeunit name is unique and we know who we are talking to.  We can think of this name much like a domain name.  We use a domain name lookup to find the correct IP address for the domain and then the request is sent to that IP address.  We follow the same rule with our Codeunit interfaces.  We look up the Codeunit by name, and if it exists and we have the required permissions to use it we will get the object Id.  That object Id we can then use in a simple CODEUNIT.RUN call.

Our JSON Handler Codeunit has methods to execute a Codeunit with or without the JSON string parameter.  The On/Off methods are called without any parameters but all the others are using the TempBlob record where both the JSON request string and response string are stored in the Blob field.
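
Under the hood, the execute-by-name methods do something along these lines.  This is only a sketch: the Codeunit name is a made-up example, and CodeunitMetadata is a Record variable on the virtual CodeUnit Metadata table, with TempBlob already holding the JSON request.

  CodeunitMetadata.SETRANGE(Name,'ADV SMS Interface');  // name lookup, like a DNS query
  IF CodeunitMetadata.FINDFIRST THEN
    InterfaceExecuted := CODEUNIT.RUN(CodeunitMetadata.ID,TempBlob)  // request and response travel in the Blob field
  ELSE
    InterfaceExecuted := FALSE;  // the brick is not installed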

One of our rules is that every brick must have a setup table, a setup page and an interface that can respond to an enabled query.

If we look at the first code line where we ask if the brick is installed, we can see that this is similar to doing a lookup for a domain name. 

The second line, where we ask if the brick is enabled, is simply doing an IP ping.

No matter how complex or simple the brick solution or functionality is, we always require a setup table with an Enabled flag.  This way we can make sure that all logic and all required setup data is available when the user tries to enable the solution.  We can even ping another brick if that one is required before allowing a user to enable the solution.

The C/AL code here is pretty simple.  The Codeunit will throw an error if the brick is not configured and enabled. 
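
The slide code is not reproduced here, but a minimal sketch of the pattern could look like this; the object names are hypothetical and the real implementation has more checks.

  // Caller side: the "domain lookup" followed by the "IP ping"
  CodeunitMetadata.SETRANGE(Name,'ADV SMS Enabled Interface');
  BrickIsInstalled := CodeunitMetadata.FINDFIRST;
  BrickIsEnabled := BrickIsInstalled AND CODEUNIT.RUN(CodeunitMetadata.ID);

  // OnRun trigger of the enabled interface Codeunit in the other brick
  SMSSetup.GET;                 // throws an error if the brick was never configured
  SMSSetup.TESTFIELD(Enabled);  // throws an error if it is configured but not enabled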

The JSON interface is the most used scenario.  Using the JSON Helper Codeunit we can add variables and records, both temporary and non-temporary, to the JSON string.  For records that are temporary, every record in the table is stored in the JSON string, and the table filters and table key are also included.  For database tables we can send both a record Id and field values.

We prepare our request by adding to the json string.  The same thing is done inside the interface Codeunit when the response json string is created.  It starts with the Initialize line.

We read from the request JSON string with the InitializeFromTempBlob procedure.  After that we have easy access to all the information we store in the JSON string.  We applied the same JSON structure to our Azure Functions.  The functions read the request from the JSON string created by this Helper Codeunit, and the response from the Azure Function is read by this same JSON Helper Codeunit.
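
A rough sketch of both sides could look like this; apart from InitializeFromTempBlob, which is mentioned above, the helper method names are placeholders for our internal procedures.

  // Caller side - build the request
  JsonHelper.Initialize;
  JsonHelper.AddVariable('CustomerNo',Customer."No.");
  JsonHelper.AddTempTable('Lines',TempSalesLine);
  JsonHelper.GetAsTempBlob(TempBlob);

  // Interface side - read the request
  JsonHelper.InitializeFromTempBlob(TempBlob);
  CustomerNo := JsonHelper.GetValue('CustomerNo');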

A JSON interface Codeunit will do some work and respond with a result.

There are some cases where we need to trigger an Event in another brick.  We can also enable this with a JSON event interface.  See the sketch below.  In this case the Integration Event is a local procedure inside the same Codeunit.  This Integration Event could just as well be a public procedure in another C/AL or AL object.

We need to supply the required parameters for the Integration Event to execute properly, using the same JSON Helper Codeunit as before.  In this case we are also passing the TempBlob to the Event, but that is not a requirement in our pattern.  An Event Interface is not required to have a response JSON.
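
A sketch of such an event interface Codeunit, again with hypothetical names, could look like this; the OnRun trigger reads the request and raises the Integration Event so that any subscriber can react.

  // OnRun trigger of the event interface Codeunit (TableNo = TempBlob)
  JsonHelper.InitializeFromTempBlob(Rec);
  OnAfterStatementImported(Rec);

  // The event itself - here a local procedure, but it could live in any other object
  [IntegrationEvent(FALSE,FALSE)]
  LOCAL PROCEDURE OnAfterStatementImported(VAR TempBlob : Record TempBlob);
  BEGIN
  END;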

We also use JSON registration interfaces to register a functionality in another brick.  When we have a configuration where the user can select a functionality from a number of lines, we should be able to add to that selection from another brick with a Registration Interface.  In this example we have a new type of bank connection that is available when a brick is installed.  It needs to be available in another brick that has the bank connection framework.  We add this new method for bank connection using the registration interface and include information about the setup page and the assisted setup page.  This information is also in a text format, the page names.  We use the same name lookup instead of focusing on the object ids.

Same thing happens here, we have a list of bank interfaces that can come from multiple bricks, each handling that specific type of bank connection.
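
A registration interface in the framework brick could be sketched like this; the table, fields and JSON keys are made up for illustration.

  // OnRun trigger of the bank connection registration interface (TableNo = TempBlob)
  JsonHelper.InitializeFromTempBlob(Rec);
  BankConnectionMethod.INIT;
  BankConnectionMethod.Code := JsonHelper.GetValue('Code');
  BankConnectionMethod."Interface Codeunit Name" := JsonHelper.GetValue('InterfaceCodeunit');
  BankConnectionMethod."Setup Page Name" := JsonHelper.GetValue('SetupPage');
  BankConnectionMethod."Assisted Setup Page Name" := JsonHelper.GetValue('AssistedSetupPage');
  IF NOT BankConnectionMethod.FIND THEN
    BankConnectionMethod.INSERT;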

Then finally, let’s talk about the Data Exchange Framework.  Whenever we need to transfer data in or out of the application we use the Data Exchange Framework.  When we receive an Xml response inside the JSON response from an Azure Function, we pass that Xml to the Data Exchange Framework to work with.  When we need to create an Xml file to send inside a JSON request to the Azure Function, we also create that Xml file with the Data Exchange Framework.

Another example is in our Payroll solution.  The Payroll solution can import data to be used in the salary calculation.  That import starts a Data Exchange import process.  By doing it that way we can map any type of file to the import structure in our Payroll solution.  We already had this done in NAV 2016 for external time registration solutions.  On the other hand, we also have a time registration solution in NAV, and we used to have an action inside the Payroll that would start a report in the time registration module.  In that time registration report we had both the time registration tables and the payroll tables and we just moved the required data between these two tables.  This we can no longer do, as these two solutions are in two separate bricks.

The solution here was to agree on an Xml format that we can use to pass data from the time registration into the standard payroll import methods.  We can easily specify a Codeunit from the time registration brick in the data exchange definition that is used by the payroll import function.  That time registration Codeunit creates the Xml data that we can import.

Now to the configuration.  In every branch we have one configuration file.  This file is named setup.json and it is a simple JSON file that is easy to edit in VS Code.  Our AdvaniaGIT tools require this configuration in order to pick the correct versions and build the correct environment, both for the developer and on the build server.  Remember that we are still talking about C/AL.  We need to have information about the base branch for each brick.  Every branch must have a unique id.  Similarly, every AL app must have a unique Id.

When the developer starts working the local development environment is built and configured in line with this config.  One config line can be added, and AdvaniaGIT will start a Docker Container instead of using the locally installed NAV or Business Central.  The environment that is built is linked to the branch with the branch id and every action in AdvaniaGIT uses that link to work with the correct development environment.  We store all objects as text files in our branches.  We need this to be able to compare our changed objects to a set of standard objects.

When we start the development work on a new solution, a new functionality or a new customer we need to create a new branch.  After creating the new branch we code in C/AL and export the code changes to GIT.  The code changes are then converted to delta files and used in builds.

Every brick branch also has a setup.json.  This setup.json points to the IS365 as the base branch.  The brick branch has a different object id offset and a different branch id.

A brick branch will contain delta files that describe the changes that brick makes to the IS365 branch.  These changes must comply with the Extension rules to be able to merge without errors along with all the other bricks.

We do a daily build that we call Test Customer where we build the IS365 branch with all the bricks merged into a single solution to make sure that we can always merge all the bricks without errors.

Each customer has its own branch as well.  The customer branch is also based on the IS365 branch.  In this example we are building a solution that we call PUB2018.  This is the solution we use in Advania’s public cloud.  Notice that here we specify a delta branch list.  This is a list of bricks that are to be merged into this customer solution.  The build server will then take care of that.  If there are any customizations for this customer they are done in the same way as we did in every brick and stored as delta files in the customer branch.  The same goes for the customer unit tests; they are also stored in the customer branch.

For the customers that we are implementing NAV 2018 for we do the base build in C/AL.  However, all the customization done for the customer is done in AL.  The AL code is automatically built on the build server and deployed to our cloud services during the update period each night.  A change that we commit will be built instantly by the build server and if successful the new App file will be copied to the cloud servers and applied during the next upgrade window.

Over to the builds.  This is an essential part of the development process.  Nothing is finished until it has gone through build and test.  The build server is configured to start Docker Containers and uses the same AdvaniaGIT functions that the developer uses to build, merge and test the environments.  We store database backups on an ftp server.  These backups are brought in when the build process starts and updated after a successful build.

We have configured build processes that result in FOB files, in AL symbol package files, in AL apps, in database backups and database bacpacs.

Remember the delta branch list that I talked about.  That delta branch list is for the build server to select what to merge during the build.

After a successful build we have the option to deploy the result.  This we have also automated.  The build server can update environments by importing a new fob or by updating the installed AL app.  We can deploy the artifacts both internally in Advania’s cloud and also directly to the customer’s on-premises solution.

We also like to automate the data upgrade when we are planning to update a customer or a solution from one version to another.  In that case we have a multi-tenant setup on the upgrade machine.  We bring in the latest backup of the production database.  We restore that database and remove the old application from the database.  Running on that machine is a multi-tenant NAV instance with the required upgrade code and the required configuration to run long SQL transactions.  We next mount the data database to this application and execute the sync and upgrade tasks.  If successful, we dismount the data database and mount it to another multi-tenant NAV instance.  This NAV instance has the acceptance code and we update that one by automatically importing a fob file from the latest customer build.

By automating this we can execute multiple data upgrades for the customer to verify the process before everything is ready to put the new release into production.

Finally we turn to the hot topic.  We have not started this task yet, but we will soon after we arrive back home.  Business Central is now fully capable of supporting all our needs in AL.  We can, if we like, use DotNet variables for local deployments.  We can select to mix C/AL bricks and AL bricks.  The beauty is that we can now start converting our IP brick by brick.  As we applied the Extension rules in C/AL, both with code changes and naming, we will be able to convert the C/AL code to AL, and with minimal changes we should be able to create the AL app.  An AL app does not extend the standard tables like C/AL did.  It creates companion tables based on the table that is to be extended and the App GUID.  We therefore must move the C/AL data out of the original place and into temporary upgrade tables.  By using record references in our AL install Codeunit we can bring the data into our installed app if there are upgrade tables from the C/AL brick.  Since we are using the same object and field numbers we cannot use the obsolete field option that keeps the C/AL data in place, inaccessible to the user but still accessible by the AL install Codeunit.

We expect that removing the C/AL brick and replacing it with AL will be a simple process when these guidelines are followed.

Here is a short overview of our tools.  We use the Atlassian products: Jira for issues, Bitbucket as the GIT server and Bamboo as the build server.  This is not the only way.  We can use both GitHub and Azure DevOps with the same tools that we have.

We use AdvaniaGIT.  AdvaniaGIT is a set of PowerShell modules and a VS Code extension.  It is available on GitHub, and please, if you can, use it.

AdvaniaGIT: About the build steps

The goal of this post is to demo from start to finish the automated build and test of an AL solution for Microsoft Dynamics 365 Business Central.

About the build steps

All build steps are executed in the same way.  In the folder ‘C:\AdvaniaGIT\Scripts’ the script ‘Start-CustomAction.ps1’ is executed with parameters.

param
(
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyname=$true)]
    [String]$Repository = (Get-Location).Path,
    [Parameter(Mandatory=$True, ValueFromPipelineByPropertyName=$true)]
    [String]$ScriptName,
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyName=$true)]
    [String]$InAdminMode='$false',
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyName=$true)]
    [String]$Wait='$false',
    [Parameter(Mandatory=$False, ValueFromPipelineByPropertyName=$true)]
    [HashTable]$BuildSettings
)

The AdvaniaGIT custom action is executed in the same way from a build machine and from a development machine.

When we created the container in our last post from Visual Studio Code with the command (Ctrl+Shift+P) ‘Build NAV Environment’, Visual Studio Code executed

Start-AdvaniaGITAction -Repository c:\Users\navlightadmin\businesscentral -ScriptName "Build-NavEnvironment.ps1" -Wait $false

From the build task we execute ‘C:\AdvaniaGIT\Scripts\Start-CustomAction.ps1’ with these parameters

-ScriptName Build-NavEnvironment.ps1 -Repository $(System.DefaultWorkingDirectory) -BuildSettings @{BuildMode=$true}

We can see that these commands are almost the same.  We have one additional parameter in the build script to tell the scripts that we are in Build Mode.

Each AdvaniaGIT build or development machine has a ‘C:\AdvaniaGIT\Data\GITSettings.Json’ configuration file.

When the scripts are started this file is read and all the settings imported.  Then the repository setup file is imported.  The default repository setup file is ‘setup.json’ as stated in GIT settings.  If the same parameters are in both the machine settings and in the repository settings then the repository settings are used.

The same structure is used for the ‘BuildSettings’ parameter that can be passed to the custom action.  The build settings will overwrite the same parameter in both the machine settings and the repository settings.

The default settings are built around the folder structure that I like to use.  For example, we have our C/AL objects in the ‘Objects’ folder.  Microsoft has their objects in the ‘BaseApp’ folder.  Just by adding the ‘objectsPath’ parameter to the repository settings for the Microsoft repository I can use their structure without problems.

If I want to execute the exact same functionality in Visual Studio Code as I expect to get from my build script, I can add the ‘BuildSettings’ parameter to the command.

Start-AdvaniaGITAction -Repository c:\Users\navlightadmin\businesscentral -ScriptName "Build-NavEnvironment.ps1" -Wait $false -BuildSettings @{BuildMode=$true}

The folder structure

The structure is defined in settings files.  By default I have the ‘AL’ folder for the main project and the ‘ALTests’ folder for the test project.  An example can be seen in the G/L Source Names repository.

In C/AL we are using deltas and using the build server to merge our solutions to a single solution.  Therefore we have a single repository for a single NAV version and put our solutions and customization into branches.

In AL this is no longer needed.  We can have a dedicated repository for each solution if we like to, since the scripts will not be doing any merge between branches.

AdvaniaGIT: Setup and configure the build machine

The goal of this post is to demo from start to finish the automated build and test of an AL solution for Microsoft Dynamics 365 Business Central.

Setup and configure the build machine

We will create our build machine from a standard Windows 2016 template in Azure.

Docker containers and container images will take a lot of disk space.  The data is stored in %ProgramData%\docker.

It is obvious that we will not be able to store all of this on the system SSD drive.  To solve this I create a 1TB HDD disk in Azure.

After starting the Azure VM and opening the Server Manager to look at the File and Storage Services, we can see the new empty disk that needs configuration.

Right click the new drive to create a new volume.

And assign the drive letter

Next go to Add roles and features to add the Containers feature.  More information can be found here.  We also need to add ‘.NET Framework 3.5 Features’.

I also like to make sure that all Microsoft updates have been installed.

Now I start PowerShell ISE as Administrator.

As Windows Servers are usually configured in a way that prohibits downloads I like to continue the installation task in PowerShell.

To enable all the scripts to be executed we need to change the execution policy for PowerShell scripts.  Executing

Set-ExecutionPolicy -ExecutionPolicy Unrestricted

will take care of that. 

Confirm with Yes to all.

To make sure that all the following download functions will execute successfully we need to change the TLS configuration with another PowerShell command.

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

Let’s download Visual Studio Code!  Use the following PowerShell command

Invoke-WebRequest -Uri https://go.microsoft.com/fwlink/?Linkid=852157 -OutFile "${env:USERPROFILE}\Desktop\VSCodeInstallation.exe"

to download the installation file to your desktop.  Start the installation.  During installation I like to select all available additional tasks.

We also need to download GIT.  The following PowerShell command

Invoke-WebRequest -Uri https://github.com/git-for-windows/git/releases/download/v2.18.0.windows.1/Git-2.18.0-64-bit.exe -OutFile "${env:USERPROFILE}\Desktop\GITInstallation.exe"

will download the latest version at the time of this blog post.  The only thing I change from default during GIT setup is the default editor.  I like to use Visual Studio Code.

Go ahead and start Visual Studio Code as Administrator.

Add the AdvaniaGIT extension to Visual Studio Code

Install the AdvaniaGIT PowerShell Scripts!  We access the commands in Visual Studio Code by pressing Ctrl+Shift+P.  From there we type to search for the command ‘Advania: Go!’ and when it is selected we press Enter.

You will get a small notification dialog asking you to switch to the AdvaniaGIT terminal window.

Accept the default path for the installation but select No to the two optional installation options.

We need a development license to work with NAV and Business Central.  This license you copy into the ‘C:\AdvaniaGIT\License’ folder.  In the ‘GITSettings.json’ file that Visual Studio Code opened during AdvaniaGIT installation we need to point to this license file.

The DockerSettings.json file is also opened during installation, and if you have access to the insider builds you need to update that file.

{
    "RepositoryPath":  "bcinsider.azurecr.io",
    "RepositoryUserName":  "User Name from Collaborate",
    "RepositoryPassword":  "Password from Collaborate",
    "ClientFolders":  []
}

If not, make sure to have all settings blank:

{
  "RepositoryPath":  "",
  "RepositoryUserName":  "",
  "RepositoryPassword":  "",
  "ClientFolders":  []
}

Save both these configuration files and restart Visual Studio Code.  This restart is required to make sure Visual Studio Code recognizes the AdvaniaGIT PowerShell modules.

Let’s open our first GIT repository.  We start by opening the NAV 2018 repository.  Repositories must have the setup.json file in the root folder to support the AdvaniaGIT functionality.

I need some installation files from the NAV 2018 DVD and I will start by cloning my GitHub NAV 2018 repository.  From GitHub I copy the Url to the repository.  In Visual Studio Code I open the commands with Ctrl+Shift+P and execute the command ‘Git: Clone’.

I selected the default folder for the local copy and accepted to open the repository folder.  Again with Ctrl+Shift+P I start the NAV Installation.

The download will start.  The country version we are downloading does not matter at this point.  Every country has the same installation files that we require.

This will download NAV and start the installation.  I will just cancel the installation and manually install just what I need.

  • Microsoft SQL Server\sqlncli64
  • Microsoft SQL Server Management Objects\SQLSysClrTypes
  • Microsoft Visual C++ 2013\vcredist_x64
  • Microsoft Visual C++ 2013\vcredist_x86
  • Microsoft Visual C++ 2017\vcredist_x64
  • Microsoft Visual Studio 2010 Tools For Office Redist\vstor_redist

To enable Windows authentication for the build containers we need to save the Windows credentials.  I am running as user “navlightadmin”.  I securely save the password for this user by starting a command (Ctrl+Shift+P) and selecting to save container credentials.

For all the docker container support I like to use the NAV Container Helper from Microsoft.  With another command (Ctrl+Shift+P) I install the container helper to the server.

To complete the Docker installation I execute

Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force
Install-Module -Name DockerMsftProvider -Force
Install-Package -Name docker -ProviderName DockerMsftProvider -Force

in the Visual Studio Code terminal.

We need to point docker to our data storage drive.  Kamil Sacek already pointed this out to us.

I use Visual Studio Code to update the docker configuration.  As pointed out here the default docker configuration file can be found at ‘C:\ProgramData\Docker\config\daemon.json’. If this file does not already exist, it can be created.  I update the ‘data-root’ configuration.

Now let’s restart the server by typing

Restart-Computer -Force

or manually.

After restart, open Visual Studio Code as Administrator.

Now to verify the installation let’s clone my Business Central repository.  Start command (Ctrl+Shift+P) ‘Git: Clone’ and paste in the Url to the repository.

This repository has a setup.json that points to the Business Central Sandbox.

Make sure to have the Integrated Terminal visible and let’s verify the installation by executing a command (Ctrl+Shift+P) ‘Advania: Build NAV Environment’ to build the development environment.

The image download should start…

You should now be able to use the command (Ctrl+Shift+P) ‘Advania: Start Client’,  ‘Advania: Start Web Client’, ‘Advania: Start FinSql’ and ‘Advania: Start Debugger’ to verify all the required NAV/BC functionality.

If you are happy with the results you should be able to install the build agent as shown by Soren Klemmensen here.


My Soap Service Proxy Codeunit

Up to now we in Advania have been using the method described here on my blog to connect to most of the Soap web services that we needed to integrate with.

The problem with this method is that we have to manage a lot of DLLs.  This has caused some issues and problems.

Another thing is that we are moving to AL.  And in AL we can’t just throw in a custom DLL to do all the work.

In C/AL we can do this with standard DotNet objects:

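        // Build the SOAP envelope and the ws:sendSMS body for the request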
        DOMDoc := DOMDoc.XmlDocument;
        DOMProcessingInstruction := DOMDoc.CreateProcessingInstruction('xml','version="1.0" encoding="utf-8"');
        DOMDoc.AppendChild(DOMProcessingInstruction);
        DOMElement := DOMDoc.CreateElement('soap:Envelope','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement.SetAttribute('xmlns:soap','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement.SetAttribute('xmlns:xsi','http://www.w3.org/2001/XMLSchema-instance');
        DOMElement.SetAttribute('xmlns:xsd','http://www.w3.org/2001/XMLSchema');
        DOMElement.SetAttribute('xmlns:ws','http://ws.msggw.siminn');

        DOMElement2 := DOMDoc.CreateElement('soap:Header','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement.AppendChild(DOMElement2);

        DOMElement2 := DOMDoc.CreateElement('soap:Body','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement3 := DOMDoc.CreateElement('ws:sendSMS','http://ws.msggw.siminn');
        DOMElement4 := DOMDoc.CreateElement('ws:username','http://ws.msggw.siminn');
        DOMElement4.InnerText := SMSSetup."Service User Name";
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:password','http://ws.msggw.siminn');
        DOMElement4.InnerText := SMSSetup."Service Password";
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:source','http://ws.msggw.siminn');
        DOMElement4.InnerText := SMSSetup.Sender;
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:destination','http://ws.msggw.siminn');
        DOMElement4.InnerText := SendTo;
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:text','http://ws.msggw.siminn');
        DOMElement4.InnerText := SendText;
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:encoding','http://ws.msggw.siminn');
        DOMElement4.InnerText := '0';
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:flash','http://ws.msggw.siminn');
        DOMElement4.InnerText := '0';
        DOMElement3.AppendChild(DOMElement4);
        DOMElement2.AppendChild(DOMElement3);
        DOMElement.AppendChild(DOMElement2);
        DOMDoc.AppendChild(DOMElement);

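        // Create the HTTP request and write the SOAP envelope to the request stream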
        HttpWebRequest := HttpWebRequest.Create(SMSSetup."SOAP URL");
        HttpWebRequest.Timeout := 30000;
        HttpWebRequest.UseDefaultCredentials(TRUE);
        HttpWebRequest.Method := 'POST';
        HttpWebRequest.ContentType := 'text/xml; charset=utf-8';
        HttpWebRequest.Accept := 'text/xml';
        HttpWebRequest.Headers.Add('soapAction','urn:sendSMS');
        MemoryStream := HttpWebRequest.GetRequestStream;
        DOMDoc.Save(MemoryStream);
        MemoryStream.Flush;
        MemoryStream.Close;

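        // Send the request with the NAVWebRequest add-in and surface any web exception as an error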
        NAVWebRequest := NAVWebRequest.NAVWebRequest;
        IF NOT NAVWebRequest.doRequest(HttpWebRequest,HttpWebException,HttpWebResponse) THEN
          ERROR(Text003,HttpWebException.Status.ToString,HttpWebException.Message);

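        // Load the SOAP response into an XML document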
        MemoryStream := HttpWebResponse.GetResponseStream;
        DOMResponseDoc := DOMResponseDoc.XmlDocument;
        DOMResponseDoc.Load(MemoryStream);
        MemoryStream.Flush;
        MemoryStream.Close;

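        // Pick the return value from the response using an XML namespace manager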
        ReceivedNameSpaceMgt := ReceivedNameSpaceMgt.XmlNamespaceManager(DOMResponseDoc.NameTable);
        ReceivedNameSpaceMgt.AddNamespace('ns','http://ws.msggw.siminn');
        DOMNode := DOMResponseDoc.SelectSingleNode('//ns:return',ReceivedNameSpaceMgt);

        Response := DOMNode.InnerText;
        Success :=  Response = 'SUCCESS';
        IF ShowResult AND Success THEN
          MESSAGE(Text001)
        ELSE IF ShowResult AND NOT Success THEN
          ERROR(Text005,Response);

AL code can do the same with the built-in AL objects, but that code is not much shorter.

With a custom proxy DLL the code would be

Proxy := Proxy.SMSWS;
Proxy.Url := SMSSetup."SOAP URL";
Response := Proxy.sendSMS(Username,Password,SenderText,SendTo,SendText,'0',FALSE,FALSE,'0');
Success :=  Response = 'SUCCESS';
IF ShowResult AND Success THEN
  MESSAGE(Text001)
ELSE IF ShowResult AND NOT Success THEN
  ERROR(Text005,Response);

With this example we can easily see why we have chosen to create a proxy DLL for most of the Soap services.

I wanted to find a way to make things easier in AL, and I remembered having dealt with C/AL objects by Vjeko some time ago.  I took another look and that code helped me get started.

The result is a Soap Proxy Client Mgt. Codeunit in C/AL that I have sent to Microsoft’s cal-open-library project asking to have this code put into the standard C/AL library.

Using this Codeunit the code will be like this.

  WITH SoapProxyClientMgt DO BEGIN
    CreateSoapProxy(SMSSetup."SOAP URL");
    InitParameters(9);
    SetParameterValue(Username,1);
    SetParameterValue(Password,2);
    SetParameterValue(SenderText,3);
    SetParameterValue(SendTo,4);
    SetParameterValue(SendText,5);
    SetParameterValue('0',6);
    SetParameterValue(FALSE,7);
    SetParameterValue(FALSE,8);
    SetParameterValue('0',9);
    InvokeMethod('SMSWS','sendSMS',TempBlob);
    XmlBuffer.LoadFromText(TempBlob.ReadAsTextWithCRLFLineSeparator);
    IF XmlBuffer.FindNodesByXPath(XmlBuffer,'/string') THEN
      Response := XmlBuffer.Value;
    Success :=  Response = 'SUCCESS';
    IF ShowResult AND Success THEN
      MESSAGE(Text001)
    ELSE IF ShowResult AND NOT Success THEN
      ERROR(Text005,Response);
  END;

What about AL?

For now this C/AL Codeunit is not in the standard CRONUS database.  I need to import the C/AL code and make sure that AL will be able to use that Codeunit.  You can see how to do this in my last blog post.

This C/AL Code will directly convert to AL and is ready to use.

          with SoapProxyClientMgt do begin
            CreateSoapProxy(SMSSetup."SOAP URL");
            InitParameters(9);
            SetParameterValue(Username,1);
            SetParameterValue(Password,2);
            SetParameterValue(SenderText,3);
            SetParameterValue(SendTo,4);
            SetParameterValue(SendText,5);
            SetParameterValue('0',6);
            SetParameterValue(false,7);
            SetParameterValue(false,8);
            SetParameterValue('0',9);
            InvokeMethod('SMSWS','sendSMS',TempBlob);
            XmlBuffer.LoadFromText(TempBlob.ReadAsTextWithCRLFLineSeparator);
            if XmlBuffer.FindNodesByXPath(XmlBuffer,'/string') then                        
              Response := XmlBuffer.Value;        
            Success :=  Response = 'SUCCESS';
            if ShowResult and Success then
              MESSAGE(Text001)
            else if ShowResult and not Success then
              ERROR(Text005,Response);
          end;

More examples on how to use this Proxy Codeunit will follow.  Stay tuned…

C/AL and AL Side-by-Side Development with AdvaniaGIT

Microsoft supports Side-by-Side development for C/AL and AL.  To start using the Side-by-Side development make sure you have the latest version of AdvaniaGIT add-in for Visual Studio Code and update the PowerShell scripts by using the “Advania: Go!” command.

When the Business Central environment is built use the “Advania: Build C/AL Symbol References for AL” to enable the Side-by-Side development for this environment.  This function will reconfigure the service and execute the Generate Symbol References command for the environment.  From here on everything you change in C/AL on this environment will update the AL Symbol References.

So let’s try this out.

I converted my C/AL project to an AL project with the steps described in my previous post.  Then I selected to open Visual Studio Code in the AL folder.

In my new Visual Studio Code window I selected to build an environment – the Docker Container.

When AdvaniaGIT builds a container it will install the AL Extension for Visual Studio Code from that Container.  We need to read the output of the environment build.  In this example I am asked to restart Visual Studio Code before reinstalling AL Language.  Note that if you are not asked to restart Visual Studio Code you don’t need to do that.

After restart I can see that the AL Language extension for Visual Studio Code is missing.

To fix this I execute the “Advania: Build NAV Environment” command again.  This time, since the Container is already running only the NAV license and the AL Extension will be updated.

Restart Visual Studio Code again and we are ready to go.

If we build a new environment for our AL project we must update the environment settings in .vscode\launch.json.  This we can do with a built-in AdvaniaGIT command.

We can verify the environment by executing “Advania: Check NAV Environment”.  Everything should be up and running at this time.

Since we will be using Side-by-Side development for C/AL and AL in this environment we need to enable that by executing “Advania: Build C/AL Symbol References for AL”.

This will take a few minutes to execute.

Don’t worry about the warning.  AdvaniaGIT takes care of restarting the service.  Let’s download AL Symbols and see what happens.

We can see that AL now recognizes the standard symbols, but my custom one, “IS Soap Proxy Client Mgt.”, is not recognized.  I will tell you more about this Codeunit in my next blog post.

I start FinSql to import the Codeunit “IS Soap Proxy Client Mgt.”

Import the FOB file

Close FinSql and execute the “AL: Download Symbols” again.  We can now see that AL recognizes my C/AL Codeunit.

Now I am good to go.

Why do we need Interface Codeunits

And what is an interface Codeunit?

A Codeunit that you can execute with CODEUNIT.RUN to perform a given task is, from my point of view, an interface Codeunit.

An interface Codeunit has a parameter that we define with the TableNo property of the Codeunit.

This parameter is always a table object.

We have multiple examples of this already in the application.  Codeunits 12 and 80 are two of them.  There the parameter is a mixed set of data and settings.  Some of the table fields are business data being pushed into the business logic.  Other fields are settings used to control the business logic.

Table 36, Sales Header, is used as the parameter for Codeunit 80.  Fields like No., Bill-to Customer No., Posting Date and so on are business data.  Fields like Ship, Invoice, Print Posted Documents are settings used to control the business logic but have no meaning as business data.

Every table is then a potential parameter for an interface Codeunit.  Our extension can easily create a table that we use as a parameter table.  A record does not need to be inserted into the table to be passed to the Codeunit.

Let’s look at another scenario.  We know that there is an Interface Codeunit with the name “My Interface Codeunit”, but it belongs to an Extension that may or may not be installed in the database.

Here we use the virtual table “CodeUnit Metadata” to look for the Interface Codeunit before execution.
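
A minimal sketch of that check could look like this; ParameterRec and the message text constant are placeholders.

  CodeunitMetadata.SETRANGE(Name,'My Interface Codeunit');
  IF CodeunitMetadata.FINDFIRST THEN
    CODEUNIT.RUN(CodeunitMetadata.ID,ParameterRec)
  ELSE
    MESSAGE(ExtensionNotInstalledMsg);  // or simply skip the optional functionality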

This is all simple and straightforward.  Things that we have been doing for a number of years.

Using the TempBlob table as a parameter also gives us the flexibility to define a more complex interface for the Codeunit.  The TempBlob table can store complex data in Json or Xml format and pass that to the Codeunit.

Let’s take an example.  We have an extension that extends the discount calculation for Customers and Items.  We would like to ask this extension for the discount a given customer will get for a given Item.  Questions like that we can represent in a Json file.

{
    "CustomerNo": "C10000",
    "ItemNo": "1000"
}

And the question can be coded like this.
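
The original screenshot is not included, so the following is only a sketch of the caller, assuming a JSON helper with placeholder method names and a hypothetical interface Codeunit name.

  JsonHelper.Initialize;
  JsonHelper.AddVariable('CustomerNo','C10000');
  JsonHelper.AddVariable('ItemNo','1000');
  JsonHelper.GetAsTempBlob(TempBlob);
  CodeunitMetadata.SETRANGE(Name,'My Discount Interface');
  IF CodeunitMetadata.FINDFIRST THEN BEGIN
    CODEUNIT.RUN(CodeunitMetadata.ID,TempBlob);
    JsonHelper.InitializeFromTempBlob(TempBlob);
    EVALUATE(DiscountPct,JsonHelper.GetValue('DiscountPct'));
  END;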

The Interface Codeunit could be something like
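
This is again only a sketch, using the same placeholder helper names as above.

  // OnRun trigger, TableNo = TempBlob
  JsonHelper.InitializeFromTempBlob(Rec);
  CustomerNo := JsonHelper.GetValue('CustomerNo');
  ItemNo := JsonHelper.GetValue('ItemNo');
  DiscountPct := CalculateDiscount(CustomerNo,ItemNo);  // the extension's own business logic
  JsonHelper.Initialize;
  JsonHelper.AddVariable('DiscountPct',FORMAT(DiscountPct));
  JsonHelper.GetAsTempBlob(Rec);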

With a Page that contains a single Text variable (Json) we can turn this into a web service.

That we can use from C# with code like

var navOdataUrl = new System.Uri("https://nav2018dev.westeurope.cloudapp.azure.com:7048/NAV/OData/Company('CRONUS%20International%20Ltd.')/AlexaRequest?$format=json");
var credentials = new NetworkCredential("navUser", "+lppLBb7OQJxlOfZ7CpboRCDcbmAEoCCJpg7cmAEReQ=");
var handler = new HttpClientHandler { Credentials = credentials };

using (var client = new HttpClient(handler))
{
    var Json = new { CustomerNo = "C10000", ItemNo = "1000" };
    JObject JsonRequest = JObject.Parse(Json.ToString());
    JObject requestJson = new JObject();                
    JProperty jProperty = new JProperty("Json", JsonRequest.ToString());
    requestJson.Add(jProperty);
    var requestData = new StringContent(requestJson.ToString(), Encoding.UTF8, "application/json");
    var response = await client.PostAsync(navOdataUrl,requestData);
    dynamic result = await response.Content.ReadAsStringAsync();

    JObject responseJson = JObject.Parse(Convert.ToString(result));
    if (responseJson.TryGetValue("Json", out JToken responseJToken))
    {
        jProperty = responseJson.Property("Json");
        JObject JsonResponse = JObject.Parse(Convert.ToString(jProperty.Value));
        Console.WriteLine(JsonResponse.ToString());
    }
}

This is just scratching the surface of what we can do.  Copying a record to and from Json is easy to do with a couple of helper functions.
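
The helper functions themselves are not listed here, but the idea can be outlined with RecordRef and FieldRef; this is only a sketch, not our actual implementation.

  RecRef.GETTABLE(Customer);
  FOR FieldIndex := 1 TO RecRef.FIELDCOUNT DO BEGIN
    FldRef := RecRef.FIELDINDEX(FieldIndex);
    IF FORMAT(FldRef.CLASS) = 'Normal' THEN  // FlowFields and BLOBs need special handling
      JsonHelper.AddVariable(FldRef.NAME,FORMAT(FldRef.VALUE));
  END;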

And even if I am showing all this in C/AL there should be no problem in using the new AL in Visual Studio Code to get the same results.

Upgrading my G/L Source Names Extension to AL – step 3

When upgrading an extension from C/AL to AL (version 1 to version 2) we need to think about the data upgrade process.

In C/AL we needed to add two functions to an extension Codeunit to handle the installation and upgrade.  This I did with Codeunit 70009200.  One function is to be executed once for each installation.

PROCEDURE OnNavAppUpgradePerDatabase@1();
VAR
  AccessControl@70009200 : Record 2000000053;
BEGIN
  WITH AccessControl DO BEGIN
    SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
    IF FINDSET THEN REPEAT
      AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
    UNTIL NEXT = 0;
  END;
END;

And another function to be executed once for each company in the install database.

PROCEDURE OnNavAppUpgradePerCompany@2();
VAR
  GLSourceNameMgt@70009200 : Codeunit 70009201;
BEGIN
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name Setup");
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name User Setup");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name");

  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Help Resource");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name User Access");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Group Access");

  GLSourceNameMgt.PopulateSourceTable;
  RemoveAssistedSetup;
END;

For each database I add my permission sets to the installation users and for each company I restore the setup data for my extension and populate the lookup table for G/L Source Name.

The methods for install and upgrade have changed in AL for extensions version 2.  Look at the AL documentation from Microsoft for details.

In version 2 I remove these two obsolete functions from my application management Codeunit and need to add two new Codeunits, one for install and another for upgrade.

codeunit 70009207 "O4N GL Source Name Install"
{
    Subtype = Install;
    trigger OnRun();
    begin
    end;

    var
    PermissionSetToSetupGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, S';
    PermissionSetToUpdateGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, E';
    PermissionSetToUserGLSourceNames : TextConst ENU='G/L-SOURCE NAMES';

    
    trigger OnInstallAppPerCompany();
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
    begin
        GLSourceNameMgt.PopulateSourceTable;
        RemoveAssistedSetup;
    end;

    trigger OnInstallAppPerDatabase();
    var
        AccessControl : Record "Access Control";
    begin
        with AccessControl do begin
            SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
            if FINDSET then repeat
                AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
            until NEXT = 0;
        end;
    end;

  local procedure RemoveAssistedSetup();
  var
    AssistedSetup : Record "Assisted Setup";
  begin
    with AssistedSetup do begin
      SETRANGE("Page ID",PAGE::"O4N GL SN Setup Wizard");
      if not ISEMPTY then
        DELETEALL;
    end;
  end;

  local procedure AddUserAccess(AssignToUser : Guid;PermissionSet : Code[20]);
  var
    AccessControl : Record "Access Control";
    AppMgt : Codeunit "O4N GL SN App Mgt.";
    AppGuid : Guid;
  begin
    EVALUATE(AppGuid,AppMgt.GetAppId);
    with AccessControl do begin
      INIT;
      "User Security ID" := AssignToUser;
      "App ID" := AppGuid;
      Scope := Scope::Tenant;
      "Role ID" := PermissionSet;
      if not FIND then
        INSERT(true);
    end;
  end;

}

In the code you can see that this Codeunit is of Subtype=Install.  This code will be executed when installing this extension in a database.

To confirm this I can see that I have the G/L Source Names Permission Sets in the Access Control table.

And my G/L Source Name table also has all required entries.

Uninstalling the extension will not remove this data.  Therefore you need to make sure that the install code is structured in a way that it will work even when reinstalling.  Look at the examples from Microsoft to get a better understanding.

Back to my C/AL extension.  When uninstalling that one the data is moved to archive tables.

Archive tables are handled with the NAVAPP.* commands.  The OnNavAppUpgradePerCompany command here on top handled these archive tables when reinstalling or upgrading.

Basically, since I am keeping the same table structure I can use the same set of commands for my upgrade Codeunit.

codeunit 70009208 "O4N GL SN Upgrade"
{
    Subtype=Upgrade;
    trigger OnRun()
    begin
        
    end;
    
    trigger OnCheckPreconditionsPerCompany()
    begin

    end;

    trigger OnCheckPreconditionsPerDatabase()
    begin

    end;
    
    trigger OnUpgradePerCompany()
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
        archivedVersion : Text;
    begin
        archivedVersion := NAVAPP.GetArchiveVersion();
        if archivedVersion = '1.0.0.1' then begin
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN Setup");
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN User Setup");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN");

            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Help Resource");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN User Access");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Group Access");

            GLSourceNameMgt.PopulateSourceTable;
        end;
    end;

    trigger OnUpgradePerDatabase()
    begin

    end;

    trigger OnValidateUpgradePerCompany()
    begin

    end;

    trigger OnValidateUpgradePerDatabase()
    begin

    end;
    
}

So, time to test how and if this works.

I have my AL folder open in Visual Studio Code and I use the AdvaniaGIT command Build NAV Environment to get the new Docker container up and running.

Then I use Update launch.json with current branch information to update my launch.json server settings.

I like to use the NAV Container Helper from Microsoft  to manually work with the container.  I use a command from the AdvaniaGIT module to import the NAV Container Module.

The module uses the container name for most of the functions.  The container name can be found by listing the running Docker containers or by asking for the name that match the server used in launch.json.

I need my C/AL extension inside the container so I executed

Copy-FileToNavContainer -containerName jolly_bhabha -localPath C:\NAVManagementWorkFolder\Workspace\GIT\Kappi\NAV2017\Extension1\AppPackage.navx -containerPath c:\run

Then I open PowerShell inside the container

Enter-NavContainer -containerName jolly_bhabha

Import the NAV Administration Module

Welcome to the NAV Container PowerShell prompt

[50AA0018A87F]: PS C:\run> Import-Module 'C:\Program Files\Microsoft Dynamics NAV\110\Service\NavAdminTool.ps1'

Welcome to the Server Admin Tool Shell!

[50AA0018A87F]: PS C:\run>

and I am ready to play.  Install the C/AL extension

Publish-NAVApp -ServerInstance NAV -IdePath 'C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client\finsql.exe' -Path C:\run\AppPackage.navx -SkipVerification

Now I am faced with the fact that I have opened PowerShell inside the container in my AdvaniaGIT terminal.  That means that my AdvaniaGIT commands will execute inside the container, but not on the host.

The simplest way to solve this is to open another instance of Visual Studio Code.  From there I can start the Web Client and complete the install and configuration of my C/AL extension.

I complete the Assisted Setup and do a round trip to G/L Entries to make sure that I have enough data in my tables to verify that the data upgrade is working.

I can verify this by looking into the SQL tables for my extension.  I use PowerShell to uninstall and unpublish my C/AL extension.

Uninstall-NAVApp -ServerInstance NAV -Name "G/L Source Names"
Unpublish-NAVApp -ServerInstance NAV -Name "G/L Source Names"

I can verify that in my SQL database I now have four AppData archive tables.

Pressing F5 in Visual Studio Code will now publish and install the AL extension, even if I have the terminal open inside the container.

The extension is published but can’t be installed because I had previously installed an older version of my extension.  Back in my container PowerShell I will follow the steps as described by Microsoft.

[50AA0018A87F]: PS C:\run> Sync-NAVApp -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
WARNING: Cannot synchronize the extension G/L Source Names because it is already synchronized.
[50AA0018A87F]: PS C:\run> Start-NAVAppDataUpgrade -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
[50AA0018A87F]: PS C:\run> Install-NAVApp -ServerInstance NAV -Tenant Default -Name "G/L Source Names"
WARNING: Cannot install extension G/L Source Names by Objects4NAV 2.0.0.0 for the tenant default because it is already installed.
[50AA0018A87F]: PS C:\run>

My AL extension is published and I have verified in my SQL server that all the data from the C/AL extension has been moved to the AL extension tables and all the archive tables have been removed.

Back in Visual Studio Code I can now use F5 to publish and install the extension again if I need to update, debug and test my extension.

Couple of more steps left that I will do shortly.  Happy coding…


Don’t worry about DotNet version in C/AL

When using a DotNet data type in NAV C/AL we normally look up a subtype to use.  When we do, the result can be something like

Newtonsoft.Json.Linq.JObject.'Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'

Then what happens when we move this code from NAV 2016 to NAV 2017 or NAV 2018?  The Newtonsoft.Json version is not the same and we will get a compile error!

Just remove the version information from the subtype information.

Newtonsoft.Json.Linq.JObject.'Newtonsoft.Json'

And NAV will find the matching Newtonsoft.Json library you have installed and use it.

This should work for all our DotNet variables.

Using AdvaniaGIT – Convert G/L Source Names to AL

Here we go.

The NAV on Docker environment we just created can be used for the task at hand.  I have an Extension in Dynamics 365 called G/L Source Names.

I need to update this Extension to V2.0 using AL.  In this video I go through the upgrade and conversion process using AdvaniaGIT and Visual Studio Code.

In the first part I copy the deltas from my Dynamics 365 Extension into my work space and I download and prepare the latest release of NAV 2018 Docker Container.

Using our source and modified environments we can build new syntax objects and new syntax deltas. These new syntax deltas are then converted to AL code.


Using AdvaniaGIT in Visual Studio Code

It has become obvious that the future of AL programming is in Visual Studio Code.

Microsoft has made a decision to ship all their releases as Docker Containers.

The result of this is a development machine that does not have any NAV version installed.  I wanted to go through the installation and configuration of a new NAV on Docker development machine.

Here is what I did.

I installed Windows Server 2016 with Containers.  The other option was to use Windows 10 and install Docker as explained here.

After installing and fully updating the operating system I downloaded and installed Visual Studio Code.

After installation Visual Studio Code detects that I need to install Git.

I selected Download Git and was taken to the Git download page.

I downloaded and installed Git with default settings.

To be able to run NAV Development and NAV Client I need to install prerequisite components.  I copied the Prerequisite Components folder from my NAV 2018 DVD and installed some of them…

Let’s hook Visual Studio Code to our NAV 2018 repository and install AdvaniaGIT.  I first make sure to always run Visual Studio Code with administrative privileges.

Now that we have our AdvaniaGIT installed and configured we can start our development.  Let’s start our C/AL classic development.  Where this video ends you can continue development as described in my previous posts on AdvaniaGIT.  AdvaniaGIT also supports NAV 2016 and NAV 2017.

Since we are running NAV 2018 we can and should be using the AL language and the Extension 2.0 model.  Let’s see how to use our repository structure, our already built Docker container and Visual Studio Code to start our first AL project.

So as you can see by watching these short videos it is easy to start developing both in C/AL and AL using AdvaniaGIT and Visual Studio Code.

My next task is to update my G/L Source Names extension to V2.  I will be using these tools for the job.  More to come soon…