Import Tenant Data

My last post was about how I got the customized data out of the tenant database into Xml files. That tenant database was from a NAV 2016 application.

I have updated the tenant database to Business Central and I need to bring in some of the data from these Xml files.

My first issue was that I needed to make these Xml files available to Business Central. I have been using Azure Blob to store files for some years now. I had both AL and C/AL code that was able to connect to the Azure Blob REST Api, but that code used DotNet variables, which are no longer an option.

I did some preparation last year, when I requested Microsoft to add some functionality to the BaseApp. Using that BaseApp functionality I was able to redo my Azure Blob AL code as a clean extension.

I also wanted to put the AL code in a public place for everyone to see, and GitHub is the default place to host code. I created a project for Business Central AL.

I am hoping that this can be the place where code examples for our Business Central community are shared and maintained. If you want to contribute then I can add you to this project, or I can approve your pull request.

I need to write another blog post about that Azure Blob app and the other repositories I have created there. Hope to find time soon.

There is another repository in this project for the Import Tenant Data App. This app has an Azure Blob Connect functionality to utilize the Azure Blob app for data import.

I start by opening the Import Data Source page.

Here I find the Azure Blob Connector that registered itself in the Import Data Source table.

I need to go to Process -> Setup to configure my Azure Blob container access.

The information required can be found in the Azure Portal.

Specify the container where you have uploaded all the Xml files.

Then I searched for Import Project List and created a new import project for the General Ledger. The Import Source for Azure Blob was automatically selected, since that is the only one available.

Now to import the related Xml files into this project.

I get a list of files from the Azure Blob and select the one I need.

The file list will open again if I have more files to import. Close the file list when finished. Back on the Import Project we should now see information from the Xml file.

For each file I need to configure the destination mapping.

If the table exists in my Business Central App then it will be automatically selected.

And I can map fields from the Xml file to the Business Central Table.

There are options to handle different data structures. One is to add a transformation rule directly to each field. The other is to use our own custom data upgrade app that subscribes to the events published in this app.

Four events are published: two for each field in the mapping and two before updating or inserting the database record.

Based on the information in the publishers we can do any manual data modification required. In my example the creation time was added to each G/L Entry in NAV, but is added to the G/L Register in Business Central.
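
As a sketch of that second option, a custom data upgrade app can subscribe to such an event. The publisher below is only a stand-in I made up to show the shape of the pattern; the actual event names and signatures are in the app on GitHub.

// Hypothetical publisher: stands in for one of the four events the
// Import Tenant Data app raises around the destination record.
codeunit 50130 "Import Mapping Events Sketch"
{
    [IntegrationEvent(false, false)]
    internal procedure OnBeforeInsertDestinationRecord(SourceRecRef: RecordRef; var DestinationRecRef: RecordRef; var Handled: Boolean)
    begin
    end;
}

// Subscriber in our own data upgrade app, reacting before the record is inserted.
codeunit 50131 "My Data Upgrade Sketch"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Import Mapping Events Sketch", 'OnBeforeInsertDestinationRecord', '', false, false)]
    local procedure HandleOnBeforeInsertDestinationRecord(SourceRecRef: RecordRef; var DestinationRecRef: RecordRef; var Handled: Boolean)
    begin
        // Here we could read a value from the source record (for example the
        // creation time that was stored on the NAV G/L Entry) and write it to
        // a different destination field or table before the insert happens.
    end;
}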

From the list of tables we are able to start the data transfer. First we need to make sure that we have the correct configuration for the import. Do we want to commit during the import? Do we want to create missing records in our database?

I select to commit after every 1000 records. If my data transfer stops, then I can resume from that position when I start the data transfer again.

We have the option to create a task in the job queue to handle the data transfer.

The job queue can handle multiple concurrent transfers, so the import should not take too much time. Looking into the Destination Mapping, we can see the status of the data import.

I will add a few more pictures to give you a better idea of what can be done with this import tenant data app. The AL code is on GitHub for you to browse, improve and fix.

Logging your App Activity

It is good practice to have some audit log of what users do in the application. Some versions ago Microsoft introduced the Change Log to log data changes. How about logging an action execution?

One of the built-in solutions in Business Central can be used to solve this. We now have the Activity Log (Table 710).

To use the Activity Log we need to have a record to attach the activity log to. All our Apps have a Setup table that usually has only one record. I like to attach my Activity Log to that record.

To show the Activity Log from that record you can add this action to that record’s page.

action("ActivityLog")
{
    ApplicationArea = All;
    Caption = 'Activity Log';
    Image = Log;
    Promoted = true;
    PromotedCategory = Process;
    PromotedOnly = true;
    Scope = "Page";
    ToolTip = 'See the data activities for this App.';

    trigger OnAction()
    var
        ActivityLog: Record "Activity Log";
    begin
        ActivityLog.ShowEntries(Rec);
    end;
}

The logging part can be something like this.

local procedure LogActivity(ADVUpgradeProjTable: Record "ADV Upgrade Project Table"; Context: Text[30])
var
    ActivityLog: Record "Activity Log";
    Status: Option Success,Failed;
begin
    if ADVUpgradeProject."App Package Id" <> ADVUpgradeProjTable."App Package Id" then begin
        ADVUpgradeProject.SetRange("App Package Id", ADVUpgradeProjTable."App Package Id");
        ADVUpgradeProject.FindFirst();
    end;
    ActivityLog.LogActivity(
        ADVUpgradeProject,
        Status::Success,
        Context,
        StrSubstNo('%1', ADVUpgradeProjTable."Data Upgrade Method"),
        StrSubstNo('%1 (%2)', ADVUpgradeProjTable."App Table Name", ADVUpgradeProjTable."App Table Id"));
end;

We also have the possibility to log details, either from a text value or from an InStream.

ActivityLog.SetDetailedInfoFromText("Text variable");

ActivityLog.SetDetailedInfoFromStream("in Stream");

In Business Central we have information about the execution context. I pass that execution context into the LogActivity. This gives me information on the session that is executing the code.

local procedure GetExecutionContext(): Text[30]
var
    SessionContext: ExecutionContext;
begin
    SessionContext := Session.GetCurrentModuleExecutionContext();
    case SessionContext of
        SessionContext::Install:
            exit(CopyStr(InstallationMsg, 1, 30));
        SessionContext::Upgrade:
            exit(CopyStr(UpgradeMsg, 1, 30));
        SessionContext::Normal:
            exit(CopyStr(UserContextMsg, 1, 30));
    end;
end;

var
    InstallationMsg: Label 'App Installation';
    UpgradeMsg: Label 'App Upgrade';
    UserContextMsg: Label 'Started by user';

Using this logic we can log all execution during install, upgrade and normal user cases. If we need information on the variables we can log them into the detailed information using either JSON or XML.
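
A minimal sketch of that idea in AL, assuming we already have the Activity Log record we just wrote (the procedure and variable names here are mine):

local procedure LogVariablesAsJson(var ActivityLog: Record "Activity Log"; DataUpgradeMethod: Text; RecordCount: Integer)
var
    Variables: JsonObject;
    Details: Text;
begin
    // Collect the variables we want to keep with the log entry.
    Variables.Add('DataUpgradeMethod', DataUpgradeMethod);
    Variables.Add('RecordCount', RecordCount);
    Variables.Add('ExecutedAt', CurrentDateTime());

    // Serialize to text and attach it as detailed information on the entry.
    Variables.WriteTo(Details);
    ActivityLog.SetDetailedInfoFromText(Details);
end;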

JSON Interface – examples

We have several ways of using the JSON interfaces. I will give a few examples with the required C/AL code, using interfaces from Advania’s Online Banking solution.

Advania’s Online Banking solution is split into several modules. The main module has the general framework. Then we have communication modules and functionality modules.

On/Off Question

A communication module should not work if the general framework does not exist or is not enabled for the current company. Hence, I need to ask the On/Off question.

This is triggered by calling the solution enabled Codeunit.

IF NOT JsonInterfaceMgt.TryExecuteCodeunitIfExists('ADV Bank Services Enabled Mgt.','') THEN BEGIN
  SetupNotification.MESSAGE := NotificationMsg;
  SetupNotification.SEND;
END;

The interface function will search for the Codeunit, check for execution permissions and call the Codeunit with an empty request BLOB.

The “Enabled” Codeunit must respond with a “Success” variable of true or false.

[External] TryExecuteCodeunitIfExists(CodeunitName : Text;ErrorIfNotFound : Text) Success : Boolean
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
  IF ErrorIfNotFound <> '' THEN
    ERROR(ErrorIfNotFound)
  ELSE
    EXIT;

IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob);
InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');

The “Enabled” Codeunit will test for Setup table read permission and if the “Enabled” flag has been set in the default record.

OnRun(VAR Rec : Record TempBlob)
TestEnabled(Rec);

LOCAL TestEnabled(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Success',IsServiceEnabled);
  GetAsTempBlob(TempBlob);
END;

IsServiceEnabled() : Boolean
IF NOT Setup.READPERMISSION THEN EXIT;
EXIT(Setup.GET AND Setup.Enabled);

This is how we can make sure that a module is installed and enabled before we start using it or any of the dependent modules.

Table Access Interface

The main module has a standard response table. We map some of the communication responses to this table via Data Exchange Definition. From other modules we like to be able to read the response from the response table.

The response table uses a GUID value for a primary key and has an integer field for the “Data Exchange Entry No.”. From the sub module we ask if a response exists for the current “Data Exchange Entry No.” by calling the interface.

FindResponse(DataExchEntryNo : Integer) Success : Boolean
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('DataExchEntryNo',DataExchEntryNo);
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Bank Serv. Resp. Interface',TempBlob,ResponseInterfaceErr);
  InitializeFromTempBlob(TempBlob);
  GetVariableBooleanValue(Success,'Success');
END;

The Interface Codeunit for the response table will filter on the “Data Exchange Entry No.” and return the RecordID for that record if found.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  GetVariableIntegerValue(DataExchEntryNo,'DataExchEntryNo');
  Response.SETRANGE("Data Exch. Entry No.",DataExchEntryNo);
  AddVariable('Success',Response.FINDFIRST);
  IF Response.FINDFIRST THEN
    AddRecordID(Response);
  GetAsTempBlob(Rec);
END;

If the response is found we can ask for the value of any field from that record by calling

GetFieldValue(FieldName : Text) FieldValue : Text
WITH JsonInterfaceMgt DO
  IF GetRecordByTableName('ADV Bank Service Response',RecRef) THEN
    IF DataTypeMgt.FindFieldByName(RecRef,FldRef,FieldName) THEN
      IF FORMAT(FldRef.TYPE) = 'BLOB' THEN BEGIN
        TempBlob.Blob := FldRef.VALUE;
        FieldValue := TempBlob.ReadAsTextWithCRLFLineSeparator();
      END ELSE
        FieldValue := FORMAT(FldRef.VALUE,0,9);

Processing Interface

Some processes can be both automatically and manually executed. For manual execution we like to display a request page on a Report. On that request page we can ask for variables, settings and verify before executing the process.

For automatic processing we have default settings and logic to find the correct variables before starting the process. And since one module should be able to start a process in the other then we use the JSON interface pattern for the processing Codeunit.

We also like to include the “Method” variable to add flexibility to the interface, even if there is only one method in the current implementation.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') OR (Method = '') THEN
    ERROR(MethodNotFoundErr);
  CASE Method OF
    'BankAccountProcessing':
      BankAccountProcessing(JsonInterfaceMgt);
  END;
END;

LOCAL BankAccountProcessing(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
CheckSetup;
CompanyInformation.GET;
WITH JsonInterfaceMgt DO BEGIN
  GetVariableTextValue(ClaimExportImportFormatCode, 'ClaimExportImportFormatCode');
  GetVariableTextValue(BankAccountNo, 'BankAccountNo');
  GetVariableDateValue(StartDate,'StartDate');
  GetVariableDateValue(EndDate,'EndDate');

  ValidateStartDate;
  ValidateEndDate;
  ValidateImportFormat;
  BankAccount.SETRANGE("No.", BankAccountNo);
  ClaimExportImportFormat.GET(ClaimExportImportFormatCode);
  Initialize;
  AddVariable('BankAccNo',BankAccountNo);
  AddVariable('ClaimantID',CompanyInformation."Registration No.");
  AddVariable('StartDate',StartDate);
  AddVariable('EndDate',EndDate);
  GetAsTempBlob(TempBlob);
  Window.OPEN(ImportingFromBank);
  IF BankAccount.FINDSET THEN REPEAT
    DataExchDef.GET(ClaimExportImportFormat."Resp. Data Exch. Def. Code");

    DataExch.INIT;
    DataExch."Related Record" := BankAccount.RECORDID;
    DataExch."Table Filters" := TempBlob.Blob;
    DataExch."Data Exch. Def Code" := DataExchDef.Code;
    DataExchLineDef.SETRANGE("Data Exch. Def Code",DataExchDef.Code);
    DataExchLineDef.FINDFIRST;
    DataExch."Data Exch. Line Def Code" := DataExchLineDef.Code;

    DataExchDef.TESTFIELD("Ext. Data Handling Codeunit");
    CODEUNIT.RUN(DataExchDef."Ext. Data Handling Codeunit",DataExch);

    DataExch.INSERT;
    IF DataExch.ImportToDataExch(DataExchDef) THEN BEGIN

      DataExchMapping.GET(DataExchDef.Code,DataExchLineDef.Code,DATABASE::"ADV Claim Payment Batch Entry");

      IF DataExchMapping."Pre-Mapping Codeunit" <> 0 THEN
        CODEUNIT.RUN(DataExchMapping."Pre-Mapping Codeunit",DataExch);

      DataExchMapping.TESTFIELD("Mapping Codeunit");
      CODEUNIT.RUN(DataExchMapping."Mapping Codeunit",DataExch);

      IF DataExchMapping."Post-Mapping Codeunit" <> 0 THEN
        CODEUNIT.RUN(DataExchMapping."Post-Mapping Codeunit",DataExch);
    END;
    DataExch.DELETE(TRUE);
  UNTIL BankAccount.NEXT = 0;
  Window.CLOSE;
END;

Reading through the code above we can see that we are also using the JSON interface to pass settings to the Data Exchange Framework. We put the JSON configuration into the “Table Filters” BLOB field in the Data Exchange where we can use it later in the data processing.

From the Report we start the process using the JSON interface.

Bank Account - OnPreDataItem()
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Method','BankAccountProcessing');
  AddVariable('ClaimExportImportFormatCode', ClaimExportImportFormat.Code);
  AddVariable('BankAccountNo', BankAccount."No.");
  AddVariable('StartDate',StartDate);
  AddVariable('EndDate',EndDate);
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Import BCP Interface', TempBlob, '');
END;

The ExecuteInterfaceCodeunitIfExists will also verify that the Interface Codeunit exists and also verify the permissions before executing.

[External] ExecuteInterfaceCodeunitIfExists(CodeunitName : Text;VAR TempBlob : Record TempBlob;ErrorIfNotFound : Text)
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
  IF ErrorIfNotFound <> '' THEN
    ERROR(ErrorIfNotFound)
  ELSE
    EXIT;

IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob)

Extensible Interface

For some tasks it might be simple to have a single endpoint (Interface Codeunit) for multiple functions. This can be achieved by combining Events and Interfaces.

We start by reading the required parameters from the JSON and then we raise an event for anyone to respond to the request.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
    ERROR(TypeErr);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  OnInterfaceAccess(InterfaceType,Method,Rec);
END;

LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR TempBlob : Record TempBlob)

We can also pass the JSON Interface Codeunit itself, as it will contain the full JSON for the request and, afterwards, the full JSON for the response.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
    ERROR(TypeErr);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  OnInterfaceAccess(InterfaceType,Method,JsonInterfaceMgt);
  GetAsTempBlob(Rec);
END;

LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")

One of the subscribers could look like this

LOCAL [EventSubscriber] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
IF InterfaceType = 'Claim' THEN
  CASE Method OF
    'Register':
      Register(JsonInterfaceMgt);
    'Edit':
      Edit(JsonInterfaceMgt);
    'AddExportImportFormat':
      AddExportImportFormat(JsonInterfaceMgt);
    'GetSetupCodeunitID':
      GetSetupCodeunitID(JsonInterfaceMgt);
    'GetDirection':
      GetDirection(JsonInterfaceMgt);
    'GetServiceUrl':
      GetServiceUrl(JsonInterfaceMgt);
    'GetExportImportFormat':
      GetExportImportFormat(JsonInterfaceMgt);
    'GetServiceMethod':
      GetServiceMethod(JsonInterfaceMgt);
    'ShowAndGetClaimFormat':
      ShowAndGetClaimFormat(JsonInterfaceMgt);
    'GetDataExchangeDefintionWithAction':
      GetDataExchangeDefintionWithAction(JsonInterfaceMgt);
    'GetOperationResultForClaimant':
      GetOperationResultForClaimant(JsonInterfaceMgt);
    'ShowClaimPayment':
      ShowClaimPayment(JsonInterfaceMgt)
    ELSE
      ERROR(MethodErr,Method);
  END;

Registration Interface

This pattern is similar to the discovery pattern, where an Event is raised to register possible modules into a temporary table. Example of that is the “OnRegisterServiceConnection” event in Table 1400, Service Connection.

Since we can’t have an Event Subscriber in one module listening to an Event Publisher in another without creating compile dependencies, we have come up with a different solution.

We register functionality from the functionality module, and the list of modules is stored in a database table. The table uses a GUID and the Language ID for a primary key, and the view is then filtered by the Language ID to show only one entry for each module.

This pattern gives me a list of possible modules for that given functionality. I can open the Setup Page for that module and I can execute the Interface Codeunit for that module as well. Both the Setup Page ID and the Interface Codeunit ID are object names.

The registration interface uses the Method variable to select the functionality. It can either register a new module or it can execute the method in the modules.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  CASE Method OF
    'Register':
      RegisterCollectionApp(JsonInterfaceMgt);
    ELSE
      ExecuteMethodInApps(Rec);
  END;
END;

LOCAL RegisterCollectionApp(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
WITH BankCollectionModule DO BEGIN
  JsonInterfaceMgt.GetVariableGUIDValue(ID,'ID');
  "Language ID" := GLOBALLANGUAGE();
  IF FIND THEN EXIT;
  INIT;
  JsonInterfaceMgt.GetVariableTextValue(Name,'Name');
  JsonInterfaceMgt.GetVariableTextValue("Setup Page ID",'SetupPageID');
  JsonInterfaceMgt.GetVariableTextValue("Interface Codeunit ID",'InterfaceCodeunitID');
  INSERT;
END;

[External] ExecuteMethodInApps(VAR TempBlob : Record TempBlob)
WITH BankCollectionModule DO BEGIN
  SETCURRENTKEY("Interface Codeunit ID");
  IF FINDSET THEN REPEAT
    JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists("Interface Codeunit ID",TempBlob,'');
    SETFILTER("Interface Codeunit ID",'>%1',"Interface Codeunit ID");
  UNTIL NEXT = 0;
END;

In the “ExecuteMethodInApps” function I use the filters to make sure to only execute each Interface Codeunit once.

The registration is executed from the Setup & Configuration in the other module.

[External] RegisterCollectionApp()
WITH JsonInterfaceMgt DO BEGIN
  Initialize();
  AddVariable('Method','Register');
  AddVariable('ID',GetCollectionAppID);
  AddVariable('Name',ClaimAppName);
  AddVariable('SetupPageID','ADV Claim Setup');
  AddVariable('InterfaceCodeunitID','ADV Claim Interface Access');
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Bank Collection App Access',TempBlob,'');
END;

Extend functionality using the Registered Modules

As we have been taught we should open our functionality for other modules. This is done by adding Integration Events to our code.

LOCAL [IntegrationEvent] OnBeforePaymentPost(ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR CustLedgEntry : Record "Cust. Ledger Entry";VAR UseClaimPaymentApplication : Boolean;VAR ToAccountType : 'G/L Account,Customer,Vendor,Bank Account')

LOCAL [IntegrationEvent] OnBeforePostGenJnlLine(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR GenJournalLine : Record "Gen. Journal Line";VAR AppliedDocType : Option;VAR AppliedDocNo : Code[20];VAR AppliesToID : Code[50])

When the Subscriber that needs to respond to this Publisher is in another module, we need to extend the functionality using JSON interfaces.

First, we create a Codeunit within the Publisher module with Subscribers. The parameters in the Subscribers are converted to JSON and passed to the possible subscriber modules using the “ExecuteMethodInApps” function above.

LOCAL [EventSubscriber] OnBeforeClaimPaymentInsert(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry")
GetClaimSettings(ClaimPaymentEntry);

LOCAL GetClaimSettings(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry") Success : Boolean
JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method','GetClaimSettings');
JsonInterfaceMgt.AddVariable('ClaimantID',ClaimPaymentEntry."Claimant Registration No.");
JsonInterfaceMgt.AddVariable('ClaimKey',ClaimPaymentEntry."Claim Account No.");
JsonInterfaceMgt.AddVariable('InterestDate',ClaimPaymentEntry."Interest Date");
JsonInterfaceMgt.GetAsTempBlob(TempBlob);
BankCollectionAppAccess.ExecuteMethodInApps(TempBlob);
JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
IF NOT JsonInterfaceMgt.GetVariableBooleanValue(Success,'Success') THEN EXIT;

ClaimPaymentEntry."Batch Code" := GetJsonProperty('BatchCode');
ClaimPaymentEntry."Template Code" := GetJsonProperty('TemplateCode');
ClaimPaymentEntry."Source Code" := GetJsonProperty('SourceCode');
ClaimPaymentEntry."Customer No." := GetJsonProperty('CustomerNo');
ClaimPaymentEntry."Customer Name" := GetJsonProperty('CustomerName');

The module that is extending this functionality will be able to answer these requests and supply the required response.

OnRun(VAR Rec : Record TempBlob)
IF NOT Setup.READPERMISSION THEN EXIT;
Setup.GET;

WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  CASE Method OF
    'Register':
      RegisterCollectionApp();
    'GetByCustLedgEntryNo':
      ReturnClaimForCustLedgEntryNo(Rec);
    'GetCustLedgEntryLinkInfo':
      ReturnClaimInfoForCustLedgEntryNo(Rec);
    'DisplayCustLedgEntryLinkInfo':
      DisplayClaimInfoForCustLedgEntryNo();
    'GetClaimSettings':
      ReturnClaimSettings(Rec);
    'GetClaimTempateSettings':
      ReturnClaimTemplateSettings(Rec);
    'GetClaimPaymentApplicationID':
      ReturnClaimPaymentApplicationID(Rec);
    'AddToGenDataRequest':
      ReturnGenDataRequest(Rec);
  END;
END;

Azure Function

The last example we will show is the Azure Function. Some functionality requires execution in an Azure Function.

By making sure that our Azure Function understands the same JSON format used in our JSON Interface Codeunit we can easily prepare the request and read the response using the same methods.

We have the Azure Function execution in that same JSON Codeunit. Hence, we can easily prepare the request and call the function in a similar way as for the other interfaces.

JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method',ServiceMethod);
JsonInterfaceMgt.AddVariable('Url',ServiceUrl);
JsonInterfaceMgt.AddVariable('Username',Username);
JsonInterfaceMgt.AddEncryptedVariable('Password',Password);
JsonInterfaceMgt.AddVariable('Certificate',CertificateValueAsBase64);
JsonInterfaceMgt.AddVariable('Xml',TempBlob.ReadAsTextWithCRLFLineSeparator);
Success := JsonInterfaceMgt.ExecuteAzureFunction;
IF JsonInterfaceMgt.GetVariableBLOBValue(TempBlob,'Xml') THEN
  LogMgt.SetIncoming(TempBlob.ReadAsTextWithCRLFLineSeparator,'xml')
ELSE
  LogMgt.SetIncoming(JsonInterfaceMgt.GetJSON,'json');
IF Success THEN
  DataExch."File Content" := TempBlob.Blob;

The request JSON is posted to the Azure Function and the result read with a single function.

[External] ExecuteAzureFunction() Success : Boolean
GetAsTempBlob(TempBlob);
IF (NOT GetVariableTextValue(AzureServiceURL,'AzureServiceURL')) OR (AzureServiceURL = '') THEN
  AzureServiceURL := 'https://<azurefunction>.azurewebsites.net/api/AzureProxy?code=<some access code>';

OnBeforeExecuteAzureFunction(TempBlob,AzureServiceURL,OmmitWebRequest);

IF NOT OmmitWebRequest THEN BEGIN
  HttpWebRequestMgt.Initialize(AzureServiceURL);
  HttpWebRequestMgt.DisableUI;
  HttpWebRequestMgt.SetMethod('POST');
  HttpWebRequestMgt.SetContentType('application/json');
  HttpWebRequestMgt.SetReturnType('application/json');
  HttpWebRequestMgt.AddBodyBlob(TempBlob);

  TempBlob.INIT;
  TempBlob.Blob.CREATEINSTREAM(ResponseInStream,TEXTENCODING::UTF8);
  IF NOT HttpWebRequestMgt.GetResponse(ResponseInStream,HttpStatusCode,ResponseHeaders) THEN
    IF NOT HttpWebRequestMgt.ProcessFaultResponse('http://www.advania.is') THEN BEGIN
      Initialize;
      AddVariable('Exception',GETLASTERRORTEXT);
      EXIT(FALSE);
    END;
END;

InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');

We use the “OnBeforeExecuteAzureFunction” event with a manual binding for our Unit Tests.

In the Azure Function we read the request with standard JSON functions

dynamic data = await req.Content.ReadAsAsync<object>();
Newtonsoft.Json.Linq.JArray jRequestArray = Newtonsoft.Json.Linq.JArray.Parse(data.ToString());
string Method = jRequestArray.First().Value<string>("Method") ?? "Undefined";

Then based on the Method we call each functionality with the request and write the response to the response JSON.

Newtonsoft.Json.Linq.JArray jResponseArray = new Newtonsoft.Json.Linq.JArray();
Newtonsoft.Json.Linq.JObject jResponseObject = new Newtonsoft.Json.Linq.JObject();
try
{            
    switch (Method)
    {
        case "Ping":
            success = true;
            response = "Hello " + (jRequestArray.First().Value<string>("Name") ?? "Undefined") + "!";
            break; 
        case "IOBS2005WSE2.GetAccountStatement":
            xml = jRequestArray.First().Value<string>("Xml") ?? "";
            success = IOBS2005WSE2.Helper.GetAccountStatement(
                jRequestArray.First().Value<string>("Url") ?? "",
                jRequestArray.First().Value<string>("Username") ?? "",
                Decrypt(jRequestArray.First().Value<string>("Password") ?? ""),
                jRequestArray.First().Value<string>("Certificate") ?? "",
                ref xml);
            jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Xml", xml));
            response = "";
            break;
            ...
        default:
            response = "Method not found";
            break;
    }
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Response", response));
}
catch (System.Exception exception)
{
    httpStatusCode = HttpStatusCode.BadRequest;
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Request", request));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Message", exception.Message));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("StackTrace", exception.StackTrace.ToString()));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("InnerException", exception.InnerException.Message));
}

jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Success", success));
jResponseArray.Add(jResponseObject);           
return req.CreateResponse(httpStatusCode, jResponseArray);                        

Conclusion

Having standard ways of talking between modules and solutions has opened up a lot of flexibility. We like to keep our solutions as small as possible.

We could mix “Methods” and “Versions” if we at a later time need to extend some of the interfaces. We need to honor the contract we have made for the interfaces. We must not make breaking changes to the interfaces, but we sure can extend them without any problems.

By attaching the JSON Interface Codeunit to the post I hope that you will use this pattern in your solutions. Use the code freely. It is supplied as-is, without any responsibility, obligations or requirements.

JSON Interface – prerequisites

There are two objects we use in all JSON interfaces. We use the TempBlob table and our custom JSON Interface Codeunit.

Abstract

JSON interface uses the same concept as a web service. The endpoint is defined by the Codeunit Name, and the caller always supplies request data (JSON) and expects response data (JSON).

These interface calls are therefore internal to the Business Central (NAV) server and are very fast. All the data is handled in memory only.

We define these interfaces by Endpoints. Some Endpoints have Methods. We call these Endpoints with a JSON. The JSON structure is predefined and every interface respects the same structure.

We have a single Codeunit that knows how to handle this JSON structure. Passing JSON to an interface requires a data container.

Interface Data

TempBlob is table 99008535. The table is simple, but it has a lot of useful procedures.

table 99008535 TempBlob
{
    Caption = 'TempBlob';

    fields
    {
        field(1;"Primary Key";Integer)
        {
            Caption = 'Primary Key';
            DataClassification = SystemMetadata;
        }
        field(2;Blob;BLOB)
        {
            Caption = 'Blob';
            DataClassification = SystemMetadata;
        }
    }

    keys
    {
        key(Key1;"Primary Key")
        {
        }
    }
}

Wikipedia says: A Binary Large OBject (BLOB) is a collection of binary data stored as a single entity in a database management system. Blobs are typically images, audio or other multimedia objects, though sometimes binary executable code is stored as a blob. Database support for blobs is not universal.

We use this BLOB for our JSON data when we send a request to an interface and the interface response is also JSON in that same BLOB field.

For people that have been working with web requests we can say that TempBlob.Blob is used both for RequestStream and for ResponseStream.

TempBlob is only used as a form of Stream. We never use TempBlob to store data. We never do TempBlob.Get() or TempBlob.Insert(). And, even if the name indicates that this is a temporary record, we don’t define the TempBlob Record variable as temporary. There is no need for that since we never do any database call for this record.

Interface Helper Codeunit

We use a single Codeunit in all our solutions to prepare both request and response JSON and also to read from the request on the other end.

We have created a Codeunit that includes all the required procedures for the interface communication.

We have three functions to handle the basics;

  • procedure Initialize()
  • procedure InitializeFromTempBlob(TempBlob: Record TempBlob)
  • procedure GetAsTempBlob(var TempBlob: Record TempBlob)

A typical flow of executions is to start by initializing the JSON. Then we add data to that JSON. Before we execute the interface Codeunit we use GetAsTempBlob to write the JSON into TempBlob.Blob. Every Interface Codeunit expects a TempBlob record to be passed to the OnRun() trigger.

codeunit 10008650 "ADV SDS Interface Mgt"
{
    TableNo = TempBlob;

    trigger OnRun()
    var
        Method: Text;
     begin
        with JsonInterfaceMgt do begin
            InitializeFromTempBlob(Rec);
...

Inside the Interface Codeunit we initialize the JSON from the passed TempBlob record. At this stage we have access to all the data that was added to the JSON on the request side.

And, since the interface Codeunit will return TempBlob as well, we must make sure to put the response JSON in there before the execution ends.

with JsonInterfaceMgt do begin
    Initialize();
    AddVariable('Success', true);
    GetAsTempBlob(Rec);
end;
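
Putting both sides together, the calling side of a round trip typically looks something like this. This is only a sketch; 'My Interface Codeunit' is a placeholder name, and the helper Codeunit is the one described above, here called "IS Json Interface Mgt." as in the earlier examples.

procedure CallInterface() Success: Boolean
var
    TempBlob: Record TempBlob;
    JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
begin
    // Prepare the request JSON and write it into TempBlob.Blob.
    JsonInterfaceMgt.Initialize();
    JsonInterfaceMgt.AddVariable('Method', 'DoSomething');
    JsonInterfaceMgt.GetAsTempBlob(TempBlob);

    // Execute the interface Codeunit, passing TempBlob to its OnRun() trigger.
    JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists('My Interface Codeunit', TempBlob, '');

    // Read the response JSON that the interface Codeunit wrote back into TempBlob.
    JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
    JsonInterfaceMgt.GetVariableBooleanValue(Success, 'Success');
end;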

JSON structure

The JSON is an array that contains one or more objects. A JSON array is represented with square brackets.

[]

The first object in the JSON array is the variable storage. This is an example of a JSON that passes two variables to the interface Codeunit.

[
  {
    "TestVariable": "TestVariableValue",
    "TestVariable2": "TestVariableValue2"
  }
]

All variables are stored in the XML format, using FORMAT(<variable>,0,9) and evaluated back using EVALUATE(<variable>,<json text value>,9). The JSON can then have multiple record related objects after the variable storage.
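
For example, a date stored this way survives the round trip regardless of the user's regional settings (a small sketch):

procedure XmlFormatRoundTrip()
var
    PostingDate: Date;
    ValueAsText: Text;
begin
    PostingDate := 20200131D;
    ValueAsText := Format(PostingDate, 0, 9);   // XML format, gives '2020-01-31'
    Evaluate(PostingDate, ValueAsText, 9);      // reads the same value back
end;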

Adding data to the JSON

We have the following procedures for adding data to the JSON;

  • procedure AddRecordID(Variant: Variant)
  • procedure AddTempTable(TableName: Text; Variant: Variant)
  • procedure AddFilteredTable(TableName: Text; FieldNameFilter: Text; Variant: Variant)
  • procedure AddRecordFields(Variant: Variant)
  • procedure AddVariable(VariableName: Text; Value: Variant)
  • procedure AddEncryptedVariable(VariableName: Text; Value: Text)

I will write a more detailed blog about each of these methods and give examples of how we use them, but for now I will just do a short explanation of their usage.

If we need to pass a reference to a database table we pass the Record ID. Inside the interface Codeunit we can get the database record based on that Record ID. Each Record ID that we add to the JSON is stored with the Table Name, and we use either of these two procedures to retrieve the record.

  • procedure GetRecord(var RecRef: RecordRef): Boolean
  • procedure GetRecordByTableName(TableName: Text; var RecRef: RecordRef): Boolean

If we need to pass more than one record we can pass all records within the current filter and retrieve the result with

  • procedure UpdateFilteredTable(TableName: Text; KeyFieldName: Text; var RecRef: RecordRef): Boolean

A fully populated temporary table with table view and table filters can be passed to the interface Codeunit by adding it to the JSON by name. When we use

  • procedure GetTempTable(TableName: Text; var RecRef: RecordRef): Boolean

in the interface Codeunit to retrieve the temporary table we will get the whole table, not just the filtered content.

We sometimes need to give interface Codeunits access to the record that we are creating, similar to the OnBeforeInsert() system event. If we add the record fields to the JSON we can use

  • procedure GetRecordFields(var RecRef: RecordRef): Boolean

on the other end to retrieve the record and add or alter any field content before returning it back to the caller.
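
A sketch of the calling side of that scenario, using the procedures listed above (the interface Codeunit name is a placeholder):

procedure AskInterfaceToCompleteCustomer(var Customer: Record Customer)
var
    TempBlob: Record TempBlob;
    JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
    RecRef: RecordRef;
begin
    // Pass the fields of the record we are about to insert.
    JsonInterfaceMgt.Initialize();
    JsonInterfaceMgt.AddRecordFields(Customer);
    JsonInterfaceMgt.GetAsTempBlob(TempBlob);
    JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists('My Interface Codeunit', TempBlob, '');

    // Read the possibly altered fields back into the record before inserting it.
    JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
    if JsonInterfaceMgt.GetRecordFields(RecRef) then
        RecRef.SetTable(Customer);
end;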

We have several procedures available to retrieve the variable values that we pass to the interface Codeunit.

  • procedure GetVariableValue(var Value: Variant; VariableName: Text): Boolean
  • procedure GetVariableTextValue(var TextValue: Text; VariableName: Text): Boolean
  • procedure GetVariableBooleanValue(var BooleanValue: Boolean; VariableName: Text): Boolean
  • procedure GetVariableDateValue(var DateValue: Date; VariableName: Text): Boolean
  • procedure GetVariableDateTimeValue(var DateTimeValue: DateTime; VariableName: Text): Boolean
  • procedure GetVariableDecimalValue(var DecimalValue: Decimal; VariableName: Text): Boolean
  • procedure GetVariableIntegerValue(var IntegerValue: Integer; VariableName: Text): Boolean
  • procedure GetVariableGUIDValue(var GuidValue: Guid; VariableName: Text): Boolean
  • procedure GetVariableBLOBValue(var TempBlob: Record TempBlob; VariableName: Text): Boolean
  • procedure GetVariableBLOBValueBase64String(var TempBlob: Record TempBlob; VariableName: Text): Boolean
  • procedure GetEncryptedVariableTextValue(var TextValue: Text; VariableName: Text): Boolean

We use Base 64 methods in the JSON. By passing the BLOB to TempBlob.Blob we can use

TextValue := TempBlob.ToBase64String();

and then

TempBlob.FromBase64String(TextValue);

on the other end to pass a binary content, like images or PDFs.

Finally, we have the possibility to add and encrypt values that we place in the JSON. On the other end we can then decrypt the data to be used. This we use extensively when we pass sensitive data to and from our Azure Function.
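
A short sketch of that round trip, showing both ends in one place for brevity:

procedure PassPasswordEncrypted(Password: Text)
var
    TempBlob: Record TempBlob;
    JsonInterfaceMgt: Codeunit "IS Json Interface Mgt.";
    DecryptedPassword: Text;
begin
    // Sender side: the value is encrypted before it is placed in the JSON.
    JsonInterfaceMgt.Initialize();
    JsonInterfaceMgt.AddEncryptedVariable('Password', Password);
    JsonInterfaceMgt.GetAsTempBlob(TempBlob);

    // Receiving side: decrypt the value only when it is needed.
    JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
    if not JsonInterfaceMgt.GetEncryptedVariableTextValue(DecryptedPassword, 'Password') then
        Error('Password not found in the JSON');
end;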

Calling an interface Codeunit

As promised I will write more detailed blogs with examples. This is the current list of procedures we use to call interfaces;

  • procedure ExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text)
  • procedure TryExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text): Boolean
  • procedure TryExecuteCodeunitIfExists(CodeunitName: Text; ErrorIfNotFound: Text) Success: Boolean
  • procedure ExecuteAzureFunction() Success: Boolean

The first two expect a JSON to be passed using TempBlob. The third one we use to check for a simple true/false. We have no request data but we read the ‘Success’ variable from the response JSON.

For some of our functionality we use an Azure Function. We have created our function to read the same JSON structure we use internally. We also expect our Azure Function to respond with the same JSON structure. By doing it that way, we can use the same functions to prepare the request and to read from the response as we do for our internal interfaces.

My Soap Service Proxy Codeunit

Up to now we in Advania have been using the method described here on my blog to connect to most of the Soap web services that we needed to integrate with.

The problem with this method is that we have to manage a lot of DLLs.  This has caused some issues and problems.

Another thing is that we are moving to AL.  And in AL we can’t just throw in a custom DLL to do all the work.

In C/AL we can do this with standard DotNet objects:

        DOMDoc := DOMDoc.XmlDocument;
        DOMProcessingInstruction := DOMDoc.CreateProcessingInstruction('xml','version="1.0" encoding="utf-8"');
        DOMDoc.AppendChild(DOMProcessingInstruction);
        DOMElement := DOMDoc.CreateElement('soap:Envelope','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement.SetAttribute('xmlns:soap','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement.SetAttribute('xmlns:xsi','http://www.w3.org/2001/XMLSchema-instance');
        DOMElement.SetAttribute('xmlns:xsd','http://www.w3.org/2001/XMLSchema');
        DOMElement.SetAttribute('xmlns:ws','http://ws.msggw.siminn');

        DOMElement2 := DOMDoc.CreateElement('soap:Header','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement.AppendChild(DOMElement2);

        DOMElement2 := DOMDoc.CreateElement('soap:Body','http://schemas.xmlsoap.org/soap/envelope/');
        DOMElement3 := DOMDoc.CreateElement('ws:sendSMS','http://ws.msggw.siminn');
        DOMElement4 := DOMDoc.CreateElement('ws:username','http://ws.msggw.siminn');
        DOMElement4.InnerText := SMSSetup."Service User Name";
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:password','http://ws.msggw.siminn');
        DOMElement4.InnerText := SMSSetup."Service Password";
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:source','http://ws.msggw.siminn');
        DOMElement4.InnerText := SMSSetup.Sender;
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:destination','http://ws.msggw.siminn');
        DOMElement4.InnerText := SendTo;
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:text','http://ws.msggw.siminn');
        DOMElement4.InnerText := SendText;
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:encoding','http://ws.msggw.siminn');
        DOMElement4.InnerText := '0';
        DOMElement3.AppendChild(DOMElement4);
        DOMElement4 := DOMDoc.CreateElement('ws:flash','http://ws.msggw.siminn');
        DOMElement4.InnerText := '0';
        DOMElement3.AppendChild(DOMElement4);
        DOMElement2.AppendChild(DOMElement3);
        DOMElement.AppendChild(DOMElement2);
        DOMDoc.AppendChild(DOMElement);

        HttpWebRequest := HttpWebRequest.Create(SMSSetup."SOAP URL");
        HttpWebRequest.Timeout := 30000;
        HttpWebRequest.UseDefaultCredentials(TRUE);
        HttpWebRequest.Method := 'POST';
        HttpWebRequest.ContentType := 'text/xml; charset=utf-8';
        HttpWebRequest.Accept := 'text/xml';
        HttpWebRequest.Headers.Add('soapAction','urn:sendSMS');
        MemoryStream := HttpWebRequest.GetRequestStream;
        DOMDoc.Save(MemoryStream);
        MemoryStream.Flush;
        MemoryStream.Close;

        NAVWebRequest := NAVWebRequest.NAVWebRequest;
        IF NOT NAVWebRequest.doRequest(HttpWebRequest,HttpWebException,HttpWebResponse) THEN
          ERROR(Text003,HttpWebException.Status.ToString,HttpWebException.Message);

        MemoryStream := HttpWebResponse.GetResponseStream;
        DOMResponseDoc := DOMResponseDoc.XmlDocument;
        DOMResponseDoc.Load(MemoryStream);
        MemoryStream.Flush;
        MemoryStream.Close;

        ReceivedNameSpaceMgt := ReceivedNameSpaceMgt.XmlNamespaceManager(DOMResponseDoc.NameTable);
        ReceivedNameSpaceMgt.AddNamespace('ns','http://ws.msggw.siminn');
        DOMNode := DOMResponseDoc.SelectSingleNode('//ns:return',ReceivedNameSpaceMgt);

        Response := DOMNode.InnerText;
        Success :=  Response = 'SUCCESS';
        IF ShowResult AND Success THEN
          MESSAGE(Text001)
        ELSE IF ShowResult AND NOT Success THEN
          ERROR(Text005,Response);

The same can be done in AL with the built-in AL objects, but that code is not much shorter.
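
As a rough sketch, the web request part could look like this in AL with the built-in HttpClient type. The SOAP envelope is assumed to be ready in RequestBody, and error handling is kept to a minimum.

procedure PostSoapRequest(Url: Text; RequestBody: Text; var ResponseBody: Text): Boolean
var
    Client: HttpClient;
    Content: HttpContent;
    Headers: HttpHeaders;
    ResponseMessage: HttpResponseMessage;
begin
    // Put the SOAP envelope into the request content and set the SOAP headers.
    Content.WriteFrom(RequestBody);
    Content.GetHeaders(Headers);
    Headers.Remove('Content-Type');
    Headers.Add('Content-Type', 'text/xml; charset=utf-8');
    Client.DefaultRequestHeaders.Add('SOAPAction', 'urn:sendSMS');

    // Post the request and hand the raw response back to the caller for parsing.
    if not Client.Post(Url, Content, ResponseMessage) then
        exit(false);
    ResponseMessage.Content.ReadAs(ResponseBody);
    exit(ResponseMessage.IsSuccessStatusCode);
end;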

With a custom proxy DLL the code would be

Proxy := Proxy.SMSWS;
Proxy.Url := SMSSetup."SOAP URL";
Response := Proxy.sendSMS(Username,Password,SenderText,SendTo,SendText,'0',FALSE,FALSE,'0');
Success :=  Response = 'SUCCESS';
IF ShowResult AND Success THEN
  MESSAGE(Text001)
ELSE IF ShowResult AND NOT Success THEN
  ERROR(Text005,Response);

With this example we can easily see why we have chosen to create a proxy DLL for most of the Soap services.

I wanted to find a way to make things easier in AL and I remembered having dealt with C/AL objects by Vjeko from some time ago.  I took another look and that code helped me to get started.

The result is a Soap Proxy Client Mgt. Codeunit in C/AL that I have sent to Microsoft’s cal-open-library project asking to have this code put into the standard C/AL library.

Using this Codeunit the code will be like this.

  WITH SoapProxyClientMgt DO BEGIN
    CreateSoapProxy(SMSSetup."SOAP URL");
    InitParameters(9);
    SetParameterValue(Username,1);
    SetParameterValue(Password,2);
    SetParameterValue(SenderText,3);
    SetParameterValue(SendTo,4);
    SetParameterValue(SendText,5);
    SetParameterValue('0',6);
    SetParameterValue(FALSE,7);
    SetParameterValue(FALSE,8);
    SetParameterValue('0',9);
    InvokeMethod('SMSWS','sendSMS',TempBlob);
    XmlBuffer.LoadFromText(TempBlob.ReadAsTextWithCRLFLineSeparator);
    IF XmlBuffer.FindNodesByXPath(XmlBuffer,'/string') THEN
      Response := XmlBuffer.Value;
    Success :=  Response = 'SUCCESS';
    IF ShowResult AND Success THEN
      MESSAGE(Text001)
    ELSE IF ShowResult AND NOT Success THEN
      ERROR(Text005,Response);
  END;

What about AL?

For now this C/AL Codeunit is not in the standard CRONUS database.  I need to import the C/AL code and make sure that AL will be able to use that Codeunit.  You can see how to do this in my last blog post.

This C/AL Code will directly convert to AL and is ready to use.

          with SoapProxyClientMgt do begin
            CreateSoapProxy(SMSSetup."SOAP URL");
            InitParameters(9);
            SetParameterValue(Username,1);
            SetParameterValue(Password,2);
            SetParameterValue(SenderText,3);
            SetParameterValue(SendTo,4);
            SetParameterValue(SendText,5);
            SetParameterValue('0',6);
            SetParameterValue(false,7);
            SetParameterValue(false,8);
            SetParameterValue('0',9);
            InvokeMethod('SMSWS','sendSMS',TempBlob);
            XmlBuffer.LoadFromText(TempBlob.ReadAsTextWithCRLFLineSeparator);
            if XmlBuffer.FindNodesByXPath(XmlBuffer,'/string') then                        
              Response := XmlBuffer.Value;        
            Success :=  Response = 'SUCCESS';
            if ShowResult and Success then
              MESSAGE(Text001)
            else if ShowResult and not Success then
              ERROR(Text005,Response);
          end;

More examples on how to use this Proxy Codeunit will follow.  Stay tuned…

Using AdvaniaGIT in Visual Studio Code

It has become obvious that the future of AL programming is in Visual Studio Code.

Microsoft has made a decision to ship all their releases as Docker Containers.

The result of this is a development machine that does not have any NAV version installed.  I wanted to go through the installation and configuration of a new NAV on Docker development machine.

Here is what I did.

I installed Windows Server 2016 with Containers.  The other option was to use Windows 10 and install Docker as explained here.

After installing and fully updating the operating system I downloaded and installed Visual Studio Code.

After installation Visual Studio Code detects that I need to install Git.

I selected Download Git and was taken to the Git download page.

I downloaded and installed Git with default settings.

To be able to run NAV Development and NAV Client I need to install prerequisite components.  I copied the Prerequisite Components folder from my NAV 2018 DVD and installed some of them…

Let’s hook Visual Studio Code to our NAV 2018 repository and install AdvaniaGIT.  I first make sure to always run Visual Studio Code with administrative privileges.

Now that we have our AdvaniaGIT installed and configured we can start our development.  Let’s start our C/AL classic development.  Where this video ends you can continue development as described in my previous posts on AdvaniaGIT.  AdvaniaGIT also supports NAV 2016 and NAV 2017.

Since we are running NAV 2018 we can and should be using the AL language and the Extension 2.0 model.  Let’s see how to use our repository structure, our already built Docker container and Visual Studio Code to start our first AL project.

So as you can see by watching these short videos it is easy to start developing both in C/AL and AL using AdvaniaGIT and Visual Studio Code.

My next task is to update my G/L Source Names extension to V2.  I will be using these tools for the job.  More to come soon…

Using AdvaniaGIT – FTP server for teams

So, you are not the only one in your company doing development, right?

An essential part of being able to develop C/AL is to have a starting point.  That starting point is usually where you left off last time you did some development.  If you are starting a task your starting point may just be the localized release from Microsoft.

A starting point in AdvaniaGIT is a database backup.  The database backup can contain data, and it should, so that you as a developer can do some basic testing of the solution you are creating.

AdvaniaGIT has a dedicated folder (C:\AdvaniaGIT\Backup) for the database backups.  That is where you should put your backups.

If you are working in a team, and even if not, you might not want to flood your local drive with database backups.  That is why we configure an FTP server in C:\AdvaniaGIT\Data\GITSettings.json.

{
    "ftpServer":  "ftp://ftp02.hysing.is/",
    "ftpUser":  "ftp_sourcetree",
    "ftpPass":  "*****",
    "licenseFile":  "Advania.flf",
    "workFolder":  "C:\\NAVManagementWorkFolder\\Workspace",
    "patchNoFunction":  "Display-PatchDayNo",
    "defaultDatabaseServer":  "localhost",
    "defaultDatabaseInstance":  "",
    "objectsNotToDelete":  "(14125500..14125600)",
    "sigToolExecutable":  "C:\\Program Files (x86)\\Windows Kits\\10\\bin\\x64\\signtool.exe",
    "codeSigningCertificate":  "CodeSign_Certificate.pfx",
    "codeSigningCertificatePassword":  "****",
    "setupPath":  "setup.json",
    "objectsPath":  "Objects",
    "deltasPath":  "Deltas",
    "reverseDeltasPath":  "ReverseDeltas",
    "extensionPath":  "Extension1",
    "imagesPath":  "Images",
    "screenshotsPath":  "ScreenShots",
    "permissionSetsPath":  "PermissionSets",
    "addinsPath":  "Addins",
    "languagePath":  "Languages",
    "tableDataPath":  "TableDatas",
    "customReportLayoutsPath":  "CustomReportLayouts",
    "webServicesPath":  "WebServices",
    "binaryPath":  "Binaries",
    "testObjectsPath":  "TestObjects",
    "datetimeCulture":  "is-IS",
    "NewSyntaxPrefix":  "NewSyntax",
    "targetPlatform":  "DynamicsNAV",
    "VSCodePath":  "AL"
}

When we start an action to build a NAV development environment, AdvaniaGIT searches for a database backup.

The search covers both C:\AdvaniaGIT\Backup and the root of the FTP server.

The function Get-NAVBackupFilePath locates the desired backup file by searching for these patterns.

    $FilePatterns = @(
        "$($SetupParameters.navRelease)-$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.navVersion)/$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)-$($SetupParameters.navSolution).bak"
        "$($SetupParameters.navRelease)/$($SetupParameters.navVersion)/$($SetupParameters.navSolution).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.navSolution).bak")

The navRelease is the year (2016, 2017, …).  The navVersion is the build (9.0.46045.0, 9.0.46290.0, 10.0.17972.0, …).

The projectName and navSolution parameters are defined in Setup.json (settings file) in every GIT branch.

{
  "branchId": "3ba8870a-0274-4162-8ea2-66e314bb3e34",
  "navSolution":  "IS",
  "storeAllObjects":  "true",
  "navVersion":  "9.0.48992.0",
  "projectName": "ADIS",
  "baseBranch": "IS",
  "uidOffset": "10000200",  
  "objectProperties": "true",
  "datetimeCulture":  "is-IS"
}

Combining these values we can see that the search will be done with these patterns.

    $FilePatterns = @(
        "2016-ADIS.bak",
        "2016/9.0.48992.0/ADIS.bak",
        "2016/ADIS.bak",
        "2016-IS.bak"
        "2016/9.0.48992.0/IS.bak",
        "2016/IS.bak")

And these file patterns are applied both to C:\AdvaniaGIT\Backup and to the FTP server root folder.  Here are screenshots from our FTP server.

Looking into the 2017 folder

And into one of the build folders

My local backup folder is simpler

This should give you some idea on where to store your SQL backup files.


Using AdvaniaGIT – Docker Containers

There is a new kid in town.  His name is Docker.

Microsoft is saying:

We are currently investigating how we can use Docker for deploying NAV. For test purposes we have created a Docker Container Image with the NAV Developer Preview, which you can try out.

Docker Containers is a technology where you, instead of virtualizing the entire machine, only virtualize the services and share resources from the host computer. Read more about it here: https://www.docker.com/what-docker

Read more about how to get started with Docker here: https://docs.docker.com/docker-for-windows/install/

So what does this mean?

We can install NAV environments as containers both in Azure and on premises.  We can have multiple NAV versions to work with without having to install them, so there is no conflict.  We can also get access to container images that are still in preview.

Note what Microsoft says, they are investigating.  The NAV Container Image service is not public yet.  You need authentication to get access.  This project has a place on GitHub.  To get access please contact Microsoft or send me a line and I will point you in the right direction.

The easiest way to get started is to try the NAV Developer Preview template on Azure,  http://aka.ms/navdeveloperpreview.  This will give you a full development environment for NAV Extension2.0 using VS Code.

It should be straightforward to install AdvaniaGIT on top of the NAV Developer Preview and start from there.  We can also start from the Azure template called “Windows Server 2016 Datacenter – with Containers”.

The local option is to install Docker on our Windows laptop.  If you would like to use Docker on your laptop you need to change one setting after installation.  You need to switch to Windows containers.  Your laptop will most likely restart during installation and configuration of Docker so make sure to have your work saved.

If you are planning to run a Docker-Only environment you don’t need to install NAV.  Still there are prerequisite components that you must install.  These components can be found on the NAV DVD folder “Prerequisite Components”.  From the “Microsoft SQL Server” folder install sqlncli64.msi and ReportBuilder3.msi.  From the “Microsoft Visual C++ 2013” folder install vcredist_x64.exe.  From the “Microsoft Visual Studio 2010 Tools for Office Redist” install vstor_redist.exe.  From the “Microsoft Report Viewer” folder install both SQLSysClrTypes.msi and ReportViewer.msi.  You should now be ready for the next step.

So, let’s get started

In your C:\AdvaniaGIT\Data folder open DockerSettings.json

{
  "RepositoryPath": "navdocker.azurecr.io",
  "RepositoryUserName": "7cc3c660-fc3d-41c6-b7dd-dd260148fff7",
  "RepositoryPassword": "access_key_for_the_repository"
}

That’s all.  You are now ready to use Docker with AdvaniaGIT.  Make sure to always have the latest version of AdvaniaGIT installed.

You can even use the “devpreview” build of NAV TENERIFE to do vNext development both in C/AL and AL.

Stay tuned, the AdvaniaGIT journey will continue…

Using AdvaniaGIT – How to configure licenses

When NAV environments are built a development license is uploaded to the development database.

Make sure to save your development licenses in your AdvaniaGIT license folder.

There are two ways to configure which license is used.  First with the “licenseFile” property of GITSettings.json in your AdvaniaGIT data folder.

{
  "ftpServer": "ftp://ftp02.hysing.is/",
  "ftpUser": "ftp_sourcetree",
  "ftpPass": "******",
  "licenseFile": "Advania.flf",
  "workFolder": "C:\\AdvaniaGIT\\Workspace",
  "patchNoFunction": "Display-PatchDayNo",
  "defaultDatabaseServer": "localhost",
  "defaultDatabaseInstance": "",
  "objectsNotToDelete": "(14125500..14125600)",
  "sigToolExecutable": "C:\\Program Files (x86)\\Windows Kits\\10\\bin\\x64\\signtool.exe",
  "codeSigningCertificate": "",
  "codeSigningCertificatePassword": "",
  "setupPath": "setup.json",
  "objectsPath": "Objects",
  "deltasPath": "Deltas",
  "reverseDeltasPath": "ReverseDeltas",
  "extensionPath": "Extension1",
  "imagesPath": "Images",
  "screenshotsPath": "ScreenShots",
  "permissionSetsPath": "PermissionSets",
  "addinsPath": "Addins",
  "languagePath": "Languages",
  "tableDataPath": "TableDatas",
  "customReportLayoutsPath": "CustomReportLayouts",
  "webServicesPath": "WebServices",
  "binaryPath": "Binaries",
  "targetPlatform": "DynamicsNAV",
  "VSCodePath": "AL",
  "NewSyntaxPrefix": "NewSyntax"
}

Just specify the file name in the license folder.

The license configured in GITSettings.json will be used as the default for all builds.  However, as mentioned earlier, every setting in GITSettings.json can be overwritten by the settings in each branch.

The settings file in each branch is named according to the “setupPath” parameter in GITSettings.json.  Let’s take a look at the setup.json file for my G/L Source Names solution.

{
  "branchId": "479e77f3-031a-49fe-bb6a-314464c6a9a8",
  "navVersion": "10.0.15052.0",
  "navSolution": "W1",
  "projectName": "GLSOURCENAMES",
  "baseBranch": "master",
  "storeAllObjects": "false",
  "uidOffset": "70009200",  
  "licenseFile": "Kappi.flf",
  "versionList": "GLSN10.0",
  "objectProperties": "true",
  "datetimeCulture": "is-IS",
  "targetPlatform": "Dynamics365",
  "appId": "479e77f3-031a-49fe-bb6a-314464c6a9a8",
  "appName": "G/L Source Names",
  "appPublisher": "Objects4NAV",
  "appVersion": "1.0.0.1",
  "appCompatibilityId": "",
  "appManifestName": "G/L Source Names",
  "appManifestDescription": "G/L Source Names adds the source name to the G/L Entries page.  Source Name is the customer in sales transaction and the vendor in purchase transactions", 
  "appBriefDescription": "Source Names in G/L Entries",
  "appPrivacyStatement": "http://objects4nav.com/privacy",
  "appEula": "http://objects4nav.com/terms",
  "appHelp": "http://objects4nav.com/glsourcenames",
  "appUrl": "http://objects4nav.com",
  "appIcon": "Logo250x250",  
  "appDependencies":
    [
  
    ],
  "appPrerequisites":
    [
  
    ],
  "permissionSets":
    [
  		{"id": "G/L-SOURCE NAMES",    "description": "Read G/L Source Names"},
  		{"id": "G/L-SOURCE NAMES, E", "description": "Update G/L Source Names"},
  		{"id": "G/L-SOURCE NAMES, S", "description": "Setup G/L Source Names"}
  	],
  "webServices":
    [
    
    ],
  "dotnetAddins":
    [
  
    ],
  "tableDatas":
    [
   
    ]
}

Here I need to use another license file, the one I used when I applied for the object range for my extension.

Licenses are not to be stored in SQL backups used by AdvaniaGIT.  When using AdvaniaGIT to create SQL backups the license is removed before creating the backup and reinstalled afterwards.

The first function that is executed after an SQL database restore is a database upgrade with the development environment.  This must be done to make sure that the database fits the service version being used.  For this database upgrade function to be successful, either make sure that the database does not contain an expired license, or make sure that you have a valid license in the master database.

There are a few ways of doing this.  First, there is an option when installing NAV to upload the license.

Secondly, in the development environment you can upload a license, going through Tools and License Information.

But make sure that the database your development environment is connected to does not have the “Save license in database” option set, as shown here, going through File, Database and Alter.

The third option is to use the server administrative shell.

Hope this helps.  More to come soon.  Stay tuned…