The “Enabled” Codeunit will test for read permission on the Setup table and check whether the “Enabled” flag has been set in the default record.
LOCAL TestEnabled(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
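  // The body below is a sketch, not the original code: it assumes the helper
  // procedures Initialize, AddVariable and GetAsTempBlob described later in
  // this post, and a hypothetical IsServiceEnabled function that performs the
  // permission and flag checks.
  Initialize;
  AddVariable('Enabled',IsServiceEnabled);
  GetAsTempBlob(TempBlob);
END;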
This is how we can make sure that a module is installed and enabled before we start using it or any of the dependent modules.
Table Access Interface
The main module has a standard response table. We map some of the communication responses to this table via a Data Exchange Definition. From other modules we want to be able to read the response from the response table.
The response table uses a GUID value for a primary key and has an integer field for the “Data Exchange Entry No.”. From the sub module we ask if a response exists for the current “Data Exchange Entry No.” by calling the interface.
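As an illustration only (the Codeunit name, the variable names and the response getter are assumptions of mine, not taken from the original code), such a lookup through the interface could look like this:

WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Method','GetResponse');  // hypothetical method name
  AddVariable('DataExchEntryNo',DataExchEntryNo);
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Response Interface',TempBlob,'');  // hypothetical Codeunit name
  InitializeFromTempBlob(TempBlob);
  ResponseExists := GetVariableBooleanValue('ResponseExists');  // hypothetical getter
END;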
Some processes can be executed both automatically and manually. For manual execution we like to display a request page on a Report. On that request page we can ask for variables and settings, and verify them before executing the process.
For automatic processing we have default settings and logic to find the correct variables before starting the process. And since one module should be able to start a process in another, we use the JSON interface pattern for the processing Codeunit.
We also like to include the “Method” variable to add flexibility to the interface, even if there is only one method in the current implementation.
Reading through the code above we can see that we are also using the JSON interface to pass settings to the Data Exchange Framework. We put the JSON configuration into the “Table Filters” BLOB field in the Data Exchange where we can use it later in the data processing.
From the Report we start the process using the JSON interface.
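As a sketch (the method name, the Codeunit name and the request page variable are placeholders of mine), the Report could start the process like this from its OnPreReport trigger:

WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Method','ProcessImport');   // hypothetical method name
  AddVariable('PostingDate',PostingDate);  // value from the request page
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Data Processing Interface',TempBlob,'');  // hypothetical Codeunit name
END;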
This pattern is similar to the discovery pattern, where an Event is raised to register possible modules into a temporary table. An example of that is the “OnRegisterServiceConnection” event in Table 1400, Service Connection.
Since we can’t have an Event Subscriber in one module listening to an Event Publisher in another without creating compile dependencies, we have come up with a different solution.
We register functionality from the functionality module, and the list of modules is stored in a database table. The table uses a GUID and the Language ID for a primary key, and the view is then filtered by the Language ID to show only one entry for each module.
This pattern gives me a list of possible modules for a given functionality. I can open the Setup Page for that module and I can execute the Interface Codeunit for that module as well. Both the Setup Page and the Interface Codeunit are referenced by object name.
The registration interface uses the Method variable to select the functionality. It can either register a new module or execute a method in the registered modules.
LOCAL RegisterCollectionApp(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
WITH JsonInterfaceMgt DO BEGIN
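  // Sketch continuation, not the original code: register this module using
  // hypothetical variable names; TempBlob is a local Record variable. Per the
  // pattern above, the Setup Page and the Interface Codeunit are registered
  // by object name.
  Initialize;
  AddVariable('Method','Register');
  AddVariable('SetupPageID','ADV Collection App Setup');
  AddVariable('InterfaceCodeunitID','ADV Collection App Interface');
  GetAsTempBlob(TempBlob);
  ExecuteMethodInApps(TempBlob);
END;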
When the Subscriber that needs to respond to this Publisher is in another module, we need to extend the functionality using JSON interfaces.
First, we create a Codeunit within the Publisher module with Subscribers. The parameters in the Subscribers are converted to JSON and passed to the possible subscriber modules using the “ExecuteMethodInApps” function above.
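A sketch of such a bridging Subscriber (the event and its parameters are made up for illustration; ExecuteMethodInApps is the function referred to above):

// Sketch: a Subscriber inside the Publisher module forwards the event to all
// registered modules as JSON. Event and parameter names are hypothetical.
LOCAL [EventSubscriber] OnAfterPostDocument(DocumentNo : Code[20];PostingDate : Date)
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Method','OnAfterPostDocument');
  AddVariable('DocumentNo',DocumentNo);
  AddVariable('PostingDate',PostingDate);
  GetAsTempBlob(TempBlob);
  ExecuteMethodInApps(TempBlob);
END;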
Having standard ways of talking between modules and solutions has opened up a lot of flexibility. We like to keep our solutions as small as possible.
We could mix “Methods” and “Versions” if we at a later time need to extend some of the interfaces. We need to honor the contract we have made for the interfaces. We must not make breaking changes to them, but we can certainly extend them without any problems.
By attaching the JSON Interface Codeunit to the post I hope that you will use this pattern in your solutions. Use the code freely. It is supplied as-is, without any responsibility, obligations or requirements.
We use this BLOB field for our JSON data when we send a request to an interface, and the interface response is also JSON in that same BLOB field.
For people who have been working with web requests, we can say that TempBlob.Blob is used both as the RequestStream and as the ResponseStream.
TempBlob is only used as a form of Stream. We never use TempBlob to store data. We never do TempBlob.Get() or TempBlob.Insert(). And, even if the name indicates that this is a temporary record, we don’t define the TempBlob Record variable as temporary. There is no need for that since we never do any database call for this record.
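In other words, the BLOB is only a transport. A minimal sketch of that usage (the variable names are mine):

// Write the JSON text into TempBlob.Blob and read it back, using the record
// purely as a stream wrapper - no TempBlob.GET and no TempBlob.INSERT.
TempBlob.Blob.CREATEOUTSTREAM(OutStr,TEXTENCODING::UTF8);
OutStr.WRITETEXT(JsonText);

TempBlob.Blob.CREATEINSTREAM(InStr,TEXTENCODING::UTF8);
InStr.READTEXT(JsonText);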
Interface Helper Codeunit
We use a single Codeunit in all our solutions to prepare both request and response JSON and also to read from the request on the other end.
We have created a Codeunit that includes all the required procedures for the interface communication.
We have three functions to handle the basics:
procedure Initialize()
procedure InitializeFromTempBlob(TempBlob: Record TempBlob)
procedure GetAsTempBlob(var TempBlob: Record TempBlob)
A typical flow of executions is to start by initializing the JSON. Then we add data to that JSON. Before we execute the interface Codeunit we use GetAsTempBlob to write the JSON into TempBlob.Blob. Every Interface Codeunit expects a TempBlob record to be passed to the OnRun() trigger.
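Put together, a request could look like this sketch from the caller's side (the method name is a placeholder of mine; the Codeunit name is the one shown below):

// Sketch of the typical flow: build the JSON, hand it over as TempBlob,
// run the interface Codeunit, then read the response from the same TempBlob.
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Method','Process');   // hypothetical method name
  GetAsTempBlob(TempBlob);
  CODEUNIT.RUN(CODEUNIT::"ADV SDS Interface Mgt",TempBlob);
  InitializeFromTempBlob(TempBlob);  // TempBlob now holds the response JSON
END;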
codeunit10008650"ADV SDS Interface Mgt"
with JsonInterfaceMgt dobegin
Inside the Interface Codeunit we initialize the JSON from the passed TempBlob record. At this stage we have access to all the data that was added to the JSON on the request side.
And, since the interface Codeunit will return TempBlob as well, we must make sure to put the response JSON in there before the execution ends.
with JsonInterfaceMgt do begin
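  // Sketch continuation, not the original code: read the request, do the
  // work, then write the response JSON back into the same TempBlob.
  InitializeFromTempBlob(TempBlob);
  // ... the actual business logic goes here ...
  Initialize;
  AddVariable('Success',TRUE);  // 'Success' is the variable the caller reads
  GetAsTempBlob(TempBlob);
end;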
The JSON is an array that contains one or more objects. A JSON array is represented with square brackets.
The first object in the JSON array is the variable storage, holding the variables passed to and from the interface Codeunit.
All variables are stored in XML format, using FORMAT(<variable>,0,9), and evaluated back using EVALUATE(<variable>,<json text value>,9). The JSON can then have multiple record-related objects after the variable storage.
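A sketch of a JSON that passes two variables to the interface Codeunit (the variable names and values are made up; note the date written in XML format, as FORMAT with format number 9 would produce it):

[
  {
    "Method": "ProcessImport",
    "PostingDate": "2018-03-01"
  }
]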
Adding data to the JSON
We have the following procedures for adding data to the JSON:
I will write a more detailed blog post about each of these methods and give examples of how we use them, but for now I will just give a short explanation of their usage.
If we need to pass a reference to a database table we pass the Record ID. Inside the interface Codeunit we can get the database record based on that Record ID. Each Record ID that we add to the JSON is stored with the Table Name, and we use either of these two procedures to retrieve the record.
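To illustrate the idea (AddRecordID and GetRecordByTableName are hypothetical names for this sketch, not the actual procedures from the Codeunit):

// Sketch: pass a record reference by Record ID on the request side,
AddRecordID(Customer);
// and read it back by table name inside the interface Codeunit.
IF GetRecordByTableName('Customer',RecRef) THEN
  RecRef.SETTABLE(Customer);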
We use Base 64 methods in the JSON. By passing the BLOB to TempBlob.Blob we can use the corresponding Base 64 procedures on each end to pass binary content, like images or PDFs.
Finally, we have the possibility to add encrypted values to the JSON. On the other end we can then decrypt the data before use. We use this extensively when we pass sensitive data to and from our Azure Function.
Calling an interface Codeunit
As promised, I will write more detailed blog posts with examples. This is the current list of procedures we use to call interfaces:
procedure ExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text)
procedure TryExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text): Boolean
The first two expect a JSON to be passed using TempBlob. The third one we use to check for a simple true/false. We have no request data but we read the ‘Success’ variable from the response JSON.
For some of our functionality we use an Azure Function. We have created our function to read the same JSON structure we use internally. We also expect our Azure Function to respond with the same JSON structure. By doing it that way, we can use the same functions to prepare the request and to read the response as we do for our internal interfaces.
For now this C/AL Codeunit is not in the standard CRONUS database. I need to import the C/AL code and make sure that AL will be able to use that Codeunit. You can see how to do this in my last blog post.
This C/AL Code will directly convert to AL and is ready to use.
I wanted to go through the installation and configuration of a new NAV-on-Docker development machine. The result is a development machine that does not have any NAV version installed.
Here is what I did.
I installed Windows Server 2016 with Containers. The other option was to use Windows 10 and install Docker as explained here.
After installing and fully updating the operating system I downloaded and installed Visual Studio Code.
After installation, Visual Studio Code detected that I needed to install Git.
I selected Download Git and was taken to the Git download page.
I downloaded and installed Git with default settings.
To be able to run NAV Development and NAV Client I need to install prerequisite components. I copied the Prerequisite Components folder from my NAV 2018 DVD and installed some of them…
Let’s hook Visual Studio Code to our NAV 2018 repository and install AdvaniaGIT. I first make sure to always run Visual Studio Code with administrative privileges.
Now that we have our AdvaniaGIT installed and configured we can start our development. Let’s start our C/AL classic development. Where this video ends you can continue development as described in my previous posts on AdvaniaGIT. AdvaniaGIT also supports NAV 2016 and NAV 2017.
Since we are running NAV 2018 we can and should be using the AL language and the Extension 2.0 model. Let’s see how to use our repository structure, our already built Docker container and Visual Studio Code to start our first AL project.
So as you can see by watching these short videos it is easy to start developing both in C/AL and AL using AdvaniaGIT and Visual Studio Code.
My next task is to update my G/L Source Names extension to V2. I will be using these tools for the job. More to come soon…
So, you are not the only one in your company doing development, right?
An essential part of being able to develop in C/AL is to have a starting point. That starting point is usually where you left off the last time you did some development. If you are starting a new task, your starting point may just be the localized release from Microsoft.
A starting point in AdvaniaGIT is a database backup. The database backup can contain data, and it should: data to make sure that you as a developer can do some basic testing of the solution you are creating.
AdvaniaGIT has a dedicated folder (C:\AdvaniaGIT\Backup) for the database backups. That is where you should put your backups.
If you are working in a team, and even if you are not, you might not want to flood your local drive with database backups. That is why we configure an FTP server in C:\AdvaniaGIT\Data\GITSetting.json.
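Purely as an illustration, the FTP section of that settings file could look something like this (the key names here are guesses of mine, so check them against the GITSetting.json that ships with AdvaniaGIT):

{
  "ftpServer": "ftp://backups.example.com",
  "ftpUserName": "<your ftp user>",
  "ftpPassword": "<your ftp password>"
}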
We are currently investigating how we can use Docker for deploying NAV. For test purposes we have created a Docker Container Image with the NAV Developer Preview, which you can try out.
Docker Containers are a technology where, instead of virtualizing the entire machine, you only virtualize the services and share resources from the host computer. Read more about it here: https://www.docker.com/what-docker
We can install NAV environments as containers both in Azure and on-premises. We can have multiple NAV versions to work with without having to install them, so there are no conflicts. We can also get access to container images that are still in preview.
Note what Microsoft says: they are investigating. The NAV Container Image service is not public yet. You need authentication to get access. This project has a place on GitHub. To get access please contact Microsoft or send me a line and I will point you in the right direction.
It should be straightforward to install AdvaniaGIT on top of the NAV Developer Preview and start from there. We can also start from the Azure template called “Windows Server 2016 Datacenter – with Containers”.
The local option is to install Docker on our Windows laptop. If you would like to use Docker on your laptop you need to change one setting after installation: you need to switch to Windows containers. Your laptop will most likely restart during installation and configuration of Docker, so make sure to have your work saved.
If you are planning to run a Docker-only environment you don’t need to install NAV. Still, there are prerequisite components that you must install. These components can be found in the NAV DVD folder “Prerequisite Components”:
- From the “Microsoft SQL Server” folder install sqlncli64.msi and ReportBuilder3.msi.
- From the “Microsoft Visual C++ 2013” folder install vcredist_x64.exe.
- From the “Microsoft Visual Studio 2010 Tools for Office Redist” folder install vstor_redist.exe.
- From the “Microsoft Report Viewer” folder install both SQLSysClrTypes.msi and ReportViewer.msi.
You should now be ready for the next step.
So, let’s get started
In your C:\AdvaniaGIT\Data folder, open DockerSettings.json.
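This file holds the information needed to reach the container registry; roughly along these lines (the key names and values below are placeholders of mine, so verify them against the file that ships with AdvaniaGIT):

{
  "RepositoryPath": "<container registry host>",
  "RepositoryUserName": "<registry user name>",
  "RepositoryPassword": "<registry password>"
}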
Licenses are not to be stored in SQL backups used by AdvaniaGIT. When using AdvaniaGIT to create SQL backups the license is removed before creating the backup and reinstalled afterwards.
The first function that is executed after an SQL database restore is a database upgrade with the development environment. This must be done to make sure that the database fits the service version being used. For this database upgrade function to be successful, either make sure that the database does not contain an expired license, or make sure that you have a valid license in the master database.
There are a few ways of doing this. First, there is an option when installing NAV to upload the license.
Secondly, in the development environment you can upload a license, going through Tools and License Information.
But make sure that the database your development environment is connected to does not have the “Save license in database” option set, as shown here, going through File, Database and Alter.
The third option is to use the server administrative shell.
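As a sketch, from the NAV 2018 Administration Shell that could look like this (adjust the instance name and the license path to your setup):

# Import a license file into the NAV server instance, then restart it.
Import-NAVServerLicense -ServerInstance DynamicsNAV110 `
    -LicenseData ([Byte[]](Get-Content -Path 'C:\license.flf' -Encoding Byte))
Restart-NAVServerInstance -ServerInstance DynamicsNAV110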
Most of us are not just starting to work with NAV. But then again, not all of us have been able to apply Source Control Management (SCM) to our daily work.
In the previous posts we have installed and prepared our development environment and we are now ready to start working on our solution in the SCM way.
Our first step is to create a branch for our solution.
Now we should look at the options we have to import our solution into our branch. We can have our solution in different formats.
All these file formats can be imported into our solution branch with the tools that AdvaniaGIT delivers. Let’s start with the SQL backup, 2016-DAA_WineApp.bak. AdvaniaGIT will search for backups using these patterns.