JSON Interface – examples

We have several ways of using the JSON interfaces. I will give a few examples with the required C/AL code, using Advania’s Online Banking solution interfaces as examples.

Advania’s Online Banking solution is split into several modules. The main module has the general framework; then we have communication modules and functionality modules.

On/Off Question

A communication module should not work if the general framework does not exist or is not enabled for the current company. Hence, I need to ask the On/Off question.

This is triggered by calling the solution’s “Enabled” Codeunit.

IF NOT JsonInterfaceMgt.TryExecuteCodeunitIfExists('ADV Bank Services Enabled Mgt.','') THEN BEGIN
  SetupNotification.MESSAGE := NotificationMsg;
  SetupNotification.SEND;
END;

The interface function will search for the Codeunit, check for execution permissions and call the Codeunit with an empty request BLOB.

The “Enabled” Codeunit must respond with a “Success” variable of true or false.

[External] TryExecuteCodeunitIfExists(CodeunitName : Text;ErrorIfNotFound : Text) Success : Boolean
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
  IF ErrorIfNotFound <> '' THEN
    ERROR(ErrorIfNotFound)
  ELSE
    EXIT;

IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob);
InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');

The “Enabled” Codeunit tests for read permission on the Setup table and whether the “Enabled” flag has been set in the default record.

OnRun(VAR Rec : Record TempBlob)
TestEnabled(Rec);

LOCAL TestEnabled(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Success',IsServiceEnabled);
  GetAsTempBlob(TempBlob);
END;

IsServiceEnabled() : Boolean
IF NOT Setup.READPERMISSION THEN EXIT;
EXIT(Setup.GET AND Setup.Enabled);

This is how we can make sure that a module is installed and enabled before we start using it or any of the dependent modules.

Table Access Interface

The main module has a standard response table. We map some of the communication responses to this table via a Data Exchange Definition. From other modules we would like to be able to read the response from this response table.

The response table uses a GUID value for a primary key and has an integer field for the “Data Exchange Entry No.”. From the sub module we ask if a response exists for the current “Data Exchange Entry No.” by calling the interface.

FindResponse(DataExchEntryNo : Integer) Success : Boolean
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('DataExchEntryNo',DataExchEntryNo);
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Bank Serv. Resp. Interface',TempBlob,ResponseInterfaceErr);
  InitializeFromTempBlob(TempBlob);
  GetVariableBooleanValue(Success,'Success');
END;

The Interface Codeunit for the response table will filter on the “Data Exchange Entry No.” and return the RecordID for that record if found.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  GetVariableIntegerValue(DataExchEntryNo,'DataExchEntryNo');
  Response.SETRANGE("Data Exch. Entry No.",DataExchEntryNo);
  AddVariable('Success',Response.FINDFIRST);
  IF Response.FINDFIRST THEN
    AddRecordID(Response);
  GetAsTempBlob(Rec);
END;

If the response is found we can ask for the value of any field from that record by calling

GetFieldValue(FieldName : Text) FieldValue : Text
WITH JsonInterfaceMgt DO
  IF GetRecordByTableName('ADV Bank Service Response',RecRef) THEN
    IF DataTypeMgt.FindFieldByName(RecRef,FldRef,FieldName) THEN
      IF FORMAT(FldRef.TYPE) = 'BLOB' THEN BEGIN
        TempBlob.Blob := FldRef.VALUE;
        FieldValue := TempBlob.ReadAsTextWithCRLFLineSeparator();
      END ELSE
        FieldValue := FORMAT(FldRef.VALUE,0,9);
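As a usage sketch, a sub module could combine the two functions above to read a single field from the found response. The “Status” field name below is a placeholder and not a field from the actual solution.

// Usage sketch - 'Status' is a placeholder field name
IF FindResponse(DataExchEntryNo) THEN
  ResponseStatus := GetFieldValue('Status');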

Processing Interface

Some processes can be executed both automatically and manually. For manual execution we like to display a request page on a Report. On that request page we can ask for variables and settings, and verify them before executing the process.

For automatic processing we have default settings and logic to find the correct variables before starting the process. And since one module should be able to start a process in another, we use the JSON interface pattern for the processing Codeunit.

We also like to include the “Method” variable to add flexibility to the interface, even if there is only one method in the current implementation.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') OR (Method = '') THEN
    ERROR(MethodNotFoundErr);
  CASE Method OF
    'BankAccountProcessing':
      BankAccountProcessing(JsonInterfaceMgt);
  END;
END;

LOCAL BankAccountProcessing(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
CheckSetup;
CompanyInformation.GET;
WITH JsonInterfaceMgt DO BEGIN
  GetVariableTextValue(ClaimExportImportFormatCode, 'ClaimExportImportFormatCode');
  GetVariableTextValue(BankAccountNo, 'BankAccountNo');
  GetVariableDateValue(StartDate,'StartDate');
  GetVariableDateValue(EndDate,'EndDate');

  ValidateStartDate;
  ValidateEndDate;
  ValidateImportFormat;
  BankAccount.SETRANGE("No.", BankAccountNo);
  ClaimExportImportFormat.GET(ClaimExportImportFormatCode);
  Initialize;
  AddVariable('BankAccNo',BankAccountNo);
  AddVariable('ClaimantID',CompanyInformation."Registration No.");
  AddVariable('StartDate',StartDate);
  AddVariable('EndDate',EndDate);
  GetAsTempBlob(TempBlob);
  Window.OPEN(ImportingFromBank);
  IF BankAccount.FINDSET THEN REPEAT
    DataExchDef.GET(ClaimExportImportFormat."Resp. Data Exch. Def. Code");

    DataExch.INIT;
    DataExch."Related Record" := BankAccount.RECORDID;
    DataExch."Table Filters" := TempBlob.Blob;
    DataExch."Data Exch. Def Code" := DataExchDef.Code;
    DataExchLineDef.SETRANGE("Data Exch. Def Code",DataExchDef.Code);
    DataExchLineDef.FINDFIRST;
    DataExch."Data Exch. Line Def Code" := DataExchLineDef.Code;

    DataExchDef.TESTFIELD("Ext. Data Handling Codeunit");
    CODEUNIT.RUN(DataExchDef."Ext. Data Handling Codeunit",DataExch);

    DataExch.INSERT;
    IF DataExch.ImportToDataExch(DataExchDef) THEN BEGIN

      DataExchMapping.GET(DataExchDef.Code,DataExchLineDef.Code,DATABASE::"ADV Claim Payment Batch Entry");

      IF DataExchMapping."Pre-Mapping Codeunit" <> 0 THEN
        CODEUNIT.RUN(DataExchMapping."Pre-Mapping Codeunit",DataExch);

      DataExchMapping.TESTFIELD("Mapping Codeunit");
      CODEUNIT.RUN(DataExchMapping."Mapping Codeunit",DataExch);

      IF DataExchMapping."Post-Mapping Codeunit" <> 0 THEN
        CODEUNIT.RUN(DataExchMapping."Post-Mapping Codeunit",DataExch);
    END;
    DataExch.DELETE(TRUE);
  UNTIL BankAccount.NEXT = 0;
  Window.CLOSE;
END;

Reading through the code above we can see that we are also using the JSON interface to pass settings to the Data Exchange Framework. We put the JSON configuration into the “Table Filters” BLOB field on the Data Exchange record, where we can use it later in the data processing.
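A mapping or pre-mapping Codeunit that receives the Data Exchange record can then initialize the JSON interface from that same BLOB field and read the settings back. The following is only a sketch of that idea using the variable names added above; the function name is an assumption and this is not code from the solution.

LOCAL ReadProcessingSettings(VAR DataExch : Record "Data Exch.")
// Sketch: read the settings back from the "Table Filters" BLOB written above
DataExch.CALCFIELDS("Table Filters");
TempBlob.Blob := DataExch."Table Filters";
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(TempBlob);
  GetVariableTextValue(BankAccNo,'BankAccNo');
  GetVariableTextValue(ClaimantID,'ClaimantID');
  GetVariableDateValue(StartDate,'StartDate');
  GetVariableDateValue(EndDate,'EndDate');
END;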

From the Report we start the process using the JSON interface.

Bank Account - OnPreDataItem()
WITH JsonInterfaceMgt DO BEGIN
  Initialize;
  AddVariable('Method','BankAccountProcessing');
  AddVariable('ClaimExportImportFormatCode', ClaimExportImportFormat.Code);
  AddVariable('BankAccountNo', BankAccount."No.");
  AddVariable('StartDate',StartDate);
  AddVariable('EndDate',EndDate);
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Import BCP Interface', TempBlob, '');
END;

ExecuteInterfaceCodeunitIfExists will also verify that the Interface Codeunit exists and check the execution permissions before running it.

[External] ExecuteInterfaceCodeunitIfExists(CodeunitName : Text;VAR TempBlob : Record TempBlob;ErrorIfNotFound : Text)
Object.SETRANGE(Type,Object.Type::Codeunit);
Object.SETRANGE(Name,CodeunitName);
IF NOT Object.FINDFIRST THEN
  IF ErrorIfNotFound <> '' THEN
    ERROR(ErrorIfNotFound)
  ELSE
    EXIT;

IF NOT HasCodeunitExecuteLicense(Object.ID,ErrorIfNotFound) THEN EXIT;
CODEUNIT.RUN(Object.ID,TempBlob)

Extensible Interface

For some tasks it might be simplest to have a single endpoint (Interface Codeunit) for multiple pieces of functionality. This can be achieved by combining Events and Interfaces.

We start by reading the required parameters from the JSON and then we raise an event for anyone to respond to the request.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
    ERROR(TypeErr);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  OnInterfaceAccess(InterfaceType,Method,Rec);
END;

LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR TempBlob : Record TempBlob)

We can also pass the JSON Interface Codeunit itself, as that instance contains the full JSON for the request and can carry the full JSON for the response.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(InterfaceType,'InterfaceType') THEN
    ERROR(TypeErr);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  OnInterfaceAccess(InterfaceType,Method,JsonInterfaceMgt);
  GetAsTempBlob(Rec);
END;

LOCAL [IntegrationEvent] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")

One of the subscribers could look like this

LOCAL [EventSubscriber] OnInterfaceAccess(InterfaceType : Text;Method : Text;VAR JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
IF InterfaceType = 'Claim' THEN
  CASE Method OF
    'Register':
      Register(JsonInterfaceMgt);
    'Edit':
      Edit(JsonInterfaceMgt);
    'AddExportImportFormat':
      AddExportImportFormat(JsonInterfaceMgt);
    'GetSetupCodeunitID':
      GetSetupCodeunitID(JsonInterfaceMgt);
    'GetDirection':
      GetDirection(JsonInterfaceMgt);
    'GetServiceUrl':
      GetServiceUrl(JsonInterfaceMgt);
    'GetExportImportFormat':
      GetExportImportFormat(JsonInterfaceMgt);
    'GetServiceMethod':
      GetServiceMethod(JsonInterfaceMgt);
    'ShowAndGetClaimFormat':
      ShowAndGetClaimFormat(JsonInterfaceMgt);
    'GetDataExchangeDefintionWithAction':
      GetDataExchangeDefintionWithAction(JsonInterfaceMgt);
    'GetOperationResultForClaimant':
      GetOperationResultForClaimant(JsonInterfaceMgt);
    'ShowClaimPayment':
      ShowClaimPayment(JsonInterfaceMgt)
    ELSE
      ERROR(MethodErr,Method);
  END;

Registration Interface

This pattern is similar to the discovery pattern, where an Event is raised to register possible modules into a temporary table. Example of that is the “OnRegisterServiceConnection” event in Table 1400, Service Connection.

Since we can’t have an Event Subscriber in one module listening to an Event Publisher in another without creating compile dependencies, we have come up with a different solution.

We register functionality from the functionality module, and the list of modules is stored in a database table. The table uses a GUID and the Language ID for a primary key, and the view is then filtered by the Language ID to show only one entry for each module.

This pattern gives me a list of possible modules for the given functionality. I can open the Setup Page for a module and execute its Interface Codeunit as well. Both the Setup Page ID and the Interface Codeunit ID are stored as object names.
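Since both values are stored as object names, the name has to be resolved to an object ID before the Setup Page can be run. A minimal sketch of that lookup, reusing the same Object table filtering as the interface functions above, could look like this:

LOCAL ShowSetupPage()
// Sketch: resolve the stored page name to an object ID and run it
Object.SETRANGE(Type,Object.Type::Page);
Object.SETRANGE(Name,BankCollectionModule."Setup Page ID");
IF Object.FINDFIRST THEN
  PAGE.RUN(Object.ID);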

The registration interface uses the Method variable to select the functionality. It can either register a new module or execute a method in the registered modules.

OnRun(VAR Rec : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  CASE Method OF
    'Register':
      RegisterCollectionApp(JsonInterfaceMgt);
    ELSE
      ExecuteMethodInApps(Rec);
  END;
END;

LOCAL RegisterCollectionApp(JsonInterfaceMgt : Codeunit "IS Json Interface Mgt.")
WITH BankCollectionModule DO BEGIN
  JsonInterfaceMgt.GetVariableGUIDValue(ID,'ID');
  "Language ID" := GLOBALLANGUAGE();
  IF FIND THEN EXIT;
  INIT;
  JsonInterfaceMgt.GetVariableTextValue(Name,'Name');
  JsonInterfaceMgt.GetVariableTextValue("Setup Page ID",'SetupPageID');
  JsonInterfaceMgt.GetVariableTextValue("Interface Codeunit ID",'InterfaceCodeunitID');
  INSERT;
END;

[External] ExecuteMethodInApps(VAR TempBlob : Record TempBlob)
WITH BankCollectionModule DO BEGIN
  SETCURRENTKEY("Interface Codeunit ID");
  IF FINDSET THEN REPEAT
    JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists("Interface Codeunit ID",TempBlob,'');
    SETFILTER("Interface Codeunit ID",'>%1',"Interface Codeunit ID");
  UNTIL NEXT = 0;
END;

In the “ExecuteMethodInApps” function I use filters to make sure each Interface Codeunit is only executed once.

The registration is executed from the Setup & Configuration in the other module.

[External] RegisterCollectionApp()
WITH JsonInterfaceMgt DO BEGIN
  Initialize();
  AddVariable('Method','Register');
  AddVariable('ID',GetCollectionAppID);
  AddVariable('Name',ClaimAppName);
  AddVariable('SetupPageID','ADV Claim Setup');
  AddVariable('InterfaceCodeunitID','ADV Claim Interface Access');
  GetAsTempBlob(TempBlob);
  ExecuteInterfaceCodeunitIfExists('ADV Bank Collection App Access',TempBlob,'');
END;

Extend functionality using the Registered Modules

As we have been taught, we should open up our functionality to other modules. This is done by adding Integration Events to our code.

LOCAL [IntegrationEvent] OnBeforePaymentPost(ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR CustLedgEntry : Record "Cust. Ledger Entry";VAR UseClaimPaymentApplication : Boolean;VAR ToAccountType : 'G/L Account,Customer,Vendor,Bank Acco

LOCAL [IntegrationEvent] OnBeforePostGenJnlLine(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry";VAR GenJournalLine : Record "Gen. Journal Line";VAR AppliedDocType : Option;VAR AppliedDocNo : Code[20];VAR AppliesToID : Code[50])

Where the Subscriber that needs to respond to this Publisher is in another module, we need to extend the functionality using JSON interfaces.

First, we create a Codeunit within the Publisher module with Subscribers. The parameters in the Subscribers are converted to JSON and passed to the possible subscriber modules using the “ExecuteMethodInApps” function above.

LOCAL [EventSubscriber] OnBeforeClaimPaymentInsert(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry")
GetClaimSettings(ClaimPaymentEntry);

LOCAL GetClaimSettings(VAR ClaimPaymentEntry : Record "ADV Claim Payment Batch Entry") Success : Boolean
JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method','GetClaimSettings');
JsonInterfaceMgt.AddVariable('ClaimantID',ClaimPaymentEntry."Claimant Registration No.");
JsonInterfaceMgt.AddVariable('ClaimKey',ClaimPaymentEntry."Claim Account No.");
JsonInterfaceMgt.AddVariable('InterestDate',ClaimPaymentEntry."Interest Date");
JsonInterfaceMgt.GetAsTempBlob(TempBlob);
BankCollectionAppAccess.ExecuteMethodInApps(TempBlob);
JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
IF NOT JsonInterfaceMgt.GetVariableBooleanValue(Success,'Success') THEN EXIT;

ClaimPaymentEntry."Batch Code" := GetJsonProperty('BatchCode');
ClaimPaymentEntry."Template Code" := GetJsonProperty('TemplateCode');
ClaimPaymentEntry."Source Code" := GetJsonProperty('SourceCode');
ClaimPaymentEntry."Customer No." := GetJsonProperty('CustomerNo');
ClaimPaymentEntry."Customer Name" := GetJsonProperty('CustomerName');

The module that is extending this functionality will be able to answer these requests and supply the required response.

OnRun(VAR Rec : Record TempBlob)
IF NOT Setup.READPERMISSION THEN EXIT;
Setup.GET;

WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(Rec);
  IF NOT GetVariableTextValue(Method,'Method') THEN
    ERROR(MethodErr);
  CASE Method OF
    'Register':
      RegisterCollectionApp();
    'GetByCustLedgEntryNo':
      ReturnClaimForCustLedgEntryNo(Rec);
    'GetCustLedgEntryLinkInfo':
      ReturnClaimInfoForCustLedgEntryNo(Rec);
    'DisplayCustLedgEntryLinkInfo':
      DisplayClaimInfoForCustLedgEntryNo();
    'GetClaimSettings':
      ReturnClaimSettings(Rec);
    'GetClaimTempateSettings':
      ReturnClaimTemplateSettings(Rec);
    'GetClaimPaymentApplicationID':
      ReturnClaimPaymentApplicationID(Rec);
    'AddToGenDataRequest':
      ReturnGenDataRequest(Rec);
  END;
END;
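
As a sketch, one of these responder functions, for example ReturnClaimSettings, could read the request variables and write the settings back into the same BLOB. The Setup field names below are placeholders; the variable names match the request built in GetClaimSettings above.

LOCAL ReturnClaimSettings(VAR TempBlob : Record TempBlob)
WITH JsonInterfaceMgt DO BEGIN
  InitializeFromTempBlob(TempBlob);
  GetVariableTextValue(ClaimantID,'ClaimantID');
  GetVariableTextValue(ClaimKey,'ClaimKey');
  Initialize;
  AddVariable('Success',TRUE);
  AddVariable('BatchCode',Setup."Default Batch Code");       // placeholder setup field
  AddVariable('TemplateCode',Setup."Default Template Code"); // placeholder setup field
  GetAsTempBlob(TempBlob);
END;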

Azure Function

The last example we will show is the Azure Function. Some functionality requires execution in an Azure Function.

By making sure that our Azure Function understands the same JSON format used in our JSON Interface Codeunit we can easily prepare the request and read the response using the same methods.

We have the Azure Function execution in that same JSON Interface Codeunit. Hence, we can easily prepare the request and call the function in a similar way as for the other interfaces.

JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Method',ServiceMethod);
JsonInterfaceMgt.AddVariable('Url',ServiceUrl);
JsonInterfaceMgt.AddVariable('Username',Username);
JsonInterfaceMgt.AddEncryptedVariable('Password',Password);
JsonInterfaceMgt.AddVariable('Certificate',CertificateValueAsBase64);
JsonInterfaceMgt.AddVariable('Xml',TempBlob.ReadAsTextWithCRLFLineSeparator);
Success := JsonInterfaceMgt.ExecuteAzureFunction;
IF JsonInterfaceMgt.GetVariableBLOBValue(TempBlob,'Xml') THEN
  LogMgt.SetIncoming(TempBlob.ReadAsTextWithCRLFLineSeparator,'xml')
ELSE
  LogMgt.SetIncoming(JsonInterfaceMgt.GetJSON,'json');
IF Success THEN
  DataExch."File Content" := TempBlob.Blob;

The request JSON is posted to the Azure Function and the result read with a single function.

[External] ExecuteAzureFunction() Success : Boolean
GetAsTempBlob(TempBlob);
IF (NOT GetVariableTextValue(AzureServiceURL,'AzureServiceURL')) OR (AzureServiceURL = '') THEN
  AzureServiceURL := 'https://<azurefunction>.azurewebsites.net/api/AzureProxy?code=<some access code>';

OnBeforeExecuteAzureFunction(TempBlob,AzureServiceURL,OmmitWebRequest);

IF NOT OmmitWebRequest THEN BEGIN
  HttpWebRequestMgt.Initialize(AzureServiceURL);
  HttpWebRequestMgt.DisableUI;
  HttpWebRequestMgt.SetMethod('POST');
  HttpWebRequestMgt.SetContentType('application/json');
  HttpWebRequestMgt.SetReturnType('application/json');
  HttpWebRequestMgt.AddBodyBlob(TempBlob);

  TempBlob.INIT;
  TempBlob.Blob.CREATEINSTREAM(ResponseInStream,TEXTENCODING::UTF8);
  IF NOT HttpWebRequestMgt.GetResponse(ResponseInStream,HttpStatusCode,ResponseHeaders) THEN
    IF NOT HttpWebRequestMgt.ProcessFaultResponse('http://www.advania.is') THEN BEGIN
      Initialize;
      AddVariable('Exception',GETLASTERRORTEXT);
      EXIT(FALSE);
    END;
END;

InitializeFromTempBlob(TempBlob);
GetVariableBooleanValue(Success,'Success');

We use the “OnBeforeExecuteAzureFunction” event with a manual binding for our Unit Tests.
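
A mock subscriber for the Unit Tests could, as a sketch, look like the following. The subscriber Codeunit would have EventSubscriberInstance set to Manual and be bound with BINDSUBSCRIPTION in the test method before the Azure Function is called; the canned response content here is an assumption.

LOCAL [EventSubscriber] OnBeforeExecuteAzureFunction(VAR TempBlob : Record TempBlob;VAR AzureServiceURL : Text;VAR OmmitWebRequest : Boolean)
// Test mock sketch: skip the web request and return a canned JSON response
OmmitWebRequest := TRUE;
JsonInterfaceMgt.Initialize;
JsonInterfaceMgt.AddVariable('Success',TRUE);
JsonInterfaceMgt.GetAsTempBlob(TempBlob);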

In the Azure Function we read the request with standard JSON functions

dynamic data = await req.Content.ReadAsAsync<object>();
Newtonsoft.Json.Linq.JArray jRequestArray = Newtonsoft.Json.Linq.JArray.Parse(data.ToString());
string Method = jRequestArray.First().Value<string>("Method") ?? "Undefined";

Then based on the Method we call each functionality with the request and write the response to the response JSON.

Newtonsoft.Json.Linq.JArray jResponseArray = new Newtonsoft.Json.Linq.JArray();
Newtonsoft.Json.Linq.JObject jResponseObject = new Newtonsoft.Json.Linq.JObject();
try
{            
    switch (Method)
    {
        case "Ping":
            success = true;
            response = "Hello " + (jRequestArray.First().Value<string>("Name") ?? "Undefined") + "!";
            break; 
        case "IOBS2005WSE2.GetAccountStatement":
            xml = jRequestArray.First().Value<string>("Xml") ?? "";
            success = IOBS2005WSE2.Helper.GetAccountStatement(
                jRequestArray.First().Value<string>("Url") ?? "",
                jRequestArray.First().Value<string>("Username") ?? "",
                Decrypt(jRequestArray.First().Value<string>("Password") ?? ""),
                jRequestArray.First().Value<string>("Certificate") ?? "",
                ref xml);
            jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Xml", xml));
            response = "";
            break;
            ...
        default:
            response = "Method not found";
            break;
    }
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Response", response));
}
catch (System.Exception exception)
{
    httpStatusCode = HttpStatusCode.BadRequest;
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Request", request));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Message", exception.Message));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("StackTrace", exception.StackTrace.ToString()));
    jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("InnerException", exception.InnerException.Message));
}

jResponseObject.Add(new Newtonsoft.Json.Linq.JProperty("Success", success));
jResponseArray.Add(jResponseObject);           
return req.CreateResponse(httpStatusCode, jResponseArray);                        

Conclusion

Having standard ways of talking between modules and solutions has opened up a lot of flexibility. We like to keep our solutions as small as possible.

We could mix “Methods” and “Versions” if we at a later time need to extend some of the interfaces. We need to honor the contract we have made for the interfaces: we must not make breaking changes to them, but we can certainly extend them without any problems.

By attaching the JSON Interface Codeunit to the post I hope that you will use this pattern in your solutions. Use the code freely. It is supplied as-is, without any responsibility, obligations or requirements.

The new Job Invoice feature in NAV 2013

As a part of the Punch and Time Sheet system that I developed for NAV 2009, I wrote some added features for the Jobs module.  One of the features was the ability to use the Copy Document functionality to reverse a job invoice.

The new table 1022, Job Planning Line Invoice, and the changed functionality mean that I needed to rewrite this functionality in NAV 2013.  Today I saw two things that I think Microsoft should do to improve the use of this table.

The first improvement is to add a key to the table for “Document Type,Document No.,Line No.”.  All these fields are a part of the primary key, but my experience suggests that the SQL Server will perform better with this key added.  After adding this key I suggest a change in the function DeleteSalesLine in Codeunit 1002, Job Create-Invoice, to utilize this new key.

[code]
DeleteSalesLine(SalesLine : Record "Sales Line")
WITH JobPlanningLineInvoice DO BEGIN
  //#Dynamics.is-
  SETCURRENTKEY("Document Type","Document No.","Line No.");
  //#Dynamics.is+
  CASE SalesLine."Document Type" OF
    SalesLine."Document Type"::Invoice:[/code]

The latter improvement is in the function PostInvoiceContractLine in Codeunit 1001, Job Post-Line. Here Microsoft is using the RENAME function on the Job Planning Line Invoice table. I would suggest a DELETE and an INSERT to replace the RENAME call.

[code]
CASE SalesHeader."Document Type" OF
SalesHeader."Document Type"::Invoice:
IF JobPlanningLineInvoice.GET(JobPlanningLine."Job No.",JobPlanningLine."Job Task No.",JobPlanningLine."Line No.",
JobPlanningLineInvoice."Document Type"::Invoice,SalesHeader."No.",SalesLine."Line No.")
THEN BEGIN
//#Dynamics.is-
// JobPlanningLineInvoice.RENAME(JobPlanningLine."Job No.",JobPlanningLine."Job Task No.",JobPlanningLine."Line No.",
// JobPlanningLineInvoice."Document Type"::"Posted Invoice",SalesLine."Document No.",SalesLine."Line No.");
JobPlanningLineInvoice.DELETE;
JobPlanningLineInvoice."Document Type" := JobPlanningLineInvoice."Document Type"::"Posted Invoice";
JobPlanningLineInvoice."Document No." := SalesLine."Document No.";
JobPlanningLineInvoice."Line No." := SalesLine."Line No.";
//#Dynamics.is+
JobPlanningLineInvoice."Invoiced Date" := SalesHeader."Posting Date";
JobPlanningLineInvoice."Invoiced Amount (LCY)" :=
CalcLineAmountLCY(JobPlanningLine,JobPlanningLineInvoice."Quantity Transferred");
JobPlanningLineInvoice."Invoiced Cost Amount (LCY)" :=
JobPlanningLineInvoice."Quantity Transferred" * JobPlanningLine."Unit Cost (LCY)";
IF JobLedgEntry.FINDLAST THEN
JobPlanningLineInvoice."Job Ledger Entry No." := JobLedgEntry."Entry No." + 1
ELSE
JobPlanningLineInvoice."Job Ledger Entry No." := 1;
//#Dynamics.is-
// JobPlanningLineInvoice.MODIFY;
JobPlanningLineInvoice.INSERT;
//#Dynamics.is+
END;
SalesHeader."Document Type"::"Credit Memo":
IF JobPlanningLineInvoice.GET(JobPlanningLine."Job No.",JobPlanningLine."Job Task No.",JobPlanningLine."Line No.",
JobPlanningLineInvoice."Document Type"::"Credit Memo",SalesHeader."No.",SalesLine."Line No.")
THEN BEGIN
//#Dynamics.is-
// JobPlanningLineInvoice.RENAME(JobPlanningLine."Job No.",JobPlanningLine."Job Task No.",JobPlanningLine."Line No.",
// JobPlanningLineInvoice."Document Type"::"Posted Credit Memo",SalesLine."Document No.",SalesLine."Line No.");
JobPlanningLineInvoice.DELETE;
JobPlanningLineInvoice."Document Type" := JobPlanningLineInvoice."Document Type"::"Posted Credit Memo";
JobPlanningLineInvoice."Document No." := SalesLine."Document No.";
JobPlanningLineInvoice."Line No." := SalesLine."Line No.";
//#Dynamics.is+
JobPlanningLineInvoice."Invoiced Date" := SalesHeader."Posting Date";
JobPlanningLineInvoice."Invoiced Amount (LCY)" :=
CalcLineAmountLCY(JobPlanningLine,JobPlanningLineInvoice."Quantity Transferred");
JobPlanningLineInvoice."Invoiced Cost Amount (LCY)" :=
JobPlanningLineInvoice."Quantity Transferred" * JobPlanningLine."Unit Cost (LCY)";
IF JobLedgEntry.FINDLAST THEN
JobPlanningLineInvoice."Job Ledger Entry No." := JobLedgEntry."Entry No." + 1
ELSE
JobPlanningLineInvoice."Job Ledger Entry No." := 1;
//#Dynamics.is-
// JobPlanningLineInvoice.MODIFY;
JobPlanningLineInvoice.INSERT;
//#Dynamics.is+
END;
END;[/code]The functionality that I added to the Copy Document function is to create a new line in the Job Planning Line Invoice table if the user is copying a posted sales invoice to a credit memo with header included and identical lines. This also means that I have the Job fields in the credit memo lines populated and the Job Planning Line updated. The function that I use is[code]
ReverseJobInvoice(FromSalesInvLine : Record "Sales Invoice Line";VAR ToSalesLine : Record "Sales Line")
WITH JobInvoice DO BEGIN
  SETCURRENTKEY("Document Type","Document No.","Line No.");
  SETRANGE("Document Type","Document Type"::"Posted Invoice");
  SETRANGE("Document No.",FromSalesInvLine."Document No.");
  SETRANGE("Line No.",FromSalesInvLine."Line No.");
  IF FINDFIRST THEN BEGIN
    JobPlanningLine.GET("Job No.","Job Task No.","Job Planning Line No.");
    CreditJobInvoice := JobInvoice;
    CreditJobInvoice."Document Type" := CreditJobInvoice."Document Type"::"Credit Memo";
    CreditJobInvoice."Document No." := ToSalesLine."Document No.";
    CreditJobInvoice."Line No." := ToSalesLine."Line No.";
    CreditJobInvoice."Quantity Transferred" := -ToSalesLine.Quantity;
    CreditJobInvoice."Transferred Date" := TODAY;
    CreditJobInvoice."Invoiced Date" := 0D;
    CreditJobInvoice."Invoiced Amount (LCY)" := 0;
    CreditJobInvoice."Invoiced Cost Amount (LCY)" := 0;
    CreditJobInvoice."Job Ledger Entry No." := 0;
    CreditJobInvoice.INSERT;
    ToSalesLine."Job No." := JobPlanningLine."Job No.";
    ToSalesLine."Job Task No." := JobPlanningLine."Job Task No.";
    ToSalesLine."Job Contract Entry No." := JobPlanningLine."Job Contract Entry No.";
    ToSalesLine."Job Credit Invoice Line" := TRUE;
    ToSalesLine.MODIFY;
    JobPlanningLine.UpdateQtyToTransfer;
    JobPlanningLine.MODIFY;
  END;
END;[/code]

Creating Web Services in NAV 2009

I have been working on NAV Time Registration and the solution is almost ready.  The last step was to create a web service that supports stand-alone punching clocks, both as a .NET C# client and as a NAV client.

The web service is a standard codeunit with functions.  I make sure the functions that are not to be published have the Local property set to Yes.

Using an XMLport in web services requires a few parameter changes.  First, I include the XMLport in the function parameters.  I use a Boolean Return Value as a success flag.

Then in the XMLport properties I change the direction to Export, change the format to XML and select to use the default namespace.
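
As a sketch, the published function could be declared like this; the object and parameter names are assumptions based on the code below, with the XMLport passed by reference and the Boolean success flag as the return value.

[code]// Sketch - object and parameter names are assumptions
GetEmployeeList(ClockID : Code[20];VAR ResponseMessage : Text[250];VAR EmployeeList : XMLport "Employee List") Success : Boolean[/code]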

The C/Side code to answer this web service is.
[code htmlscript="false"]IF NOT ValidateClockID(ClockID,ResponseMessage) THEN BEGIN
  InsertLog(ClockID,Log.GetEmployeeList,FALSE,ResponseMessage,'');
  EXIT(FALSE);
END;

IF NOT CreateEmployeeBuffer(EmployeeBuffer,ResponseMessage) THEN BEGIN
  InsertLog(ClockID,Log.GetEmployeeList,FALSE,ResponseMessage,'');
  EXIT(FALSE);
END;

EmployeeList.SetEmployeeList(PunchClock.Code,EmployeeBuffer);
ResponseMessage := Text033;
InsertLog(ClockID,Log.GetEmployeeList,TRUE,'','');
EXIT(TRUE);[/code]
Then finally I run form 860, Web Services and add a line for this web service to be published.

The next post is a demonstration of how to use this web service.  I used Freddy's multiple service tier post in my development environment to set up NAV services.

 

Data Upgrade Method and Code

I am in the process of upgrading a 5.0 SP1 database to 2009 R2.  That includes updated add-on solutions from Skýrr.  To help me with this process I created a table, a form and a codeunit to handle the upgrade and provide the user with visual process information.

This code stores information on processed tables in the log table and makes sure that if you execute the upgrade process again, only the remaining processes will be executed.  You can simply add another processing codeunit and execute the process again and again.
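
The core of the codeunit, as a rough sketch with made-up log table and field names, is a check against the log table before each processing codeunit is executed:

[code]// Sketch - UpgradeLog and its fields are placeholders
IF NOT UpgradeLog.GET(ProcessCodeunitID) THEN BEGIN
  CODEUNIT.RUN(ProcessCodeunitID);
  UpgradeLog.INIT;
  UpgradeLog."Codeunit ID" := ProcessCodeunitID;
  UpgradeLog."Processed Date" := TODAY;
  UpgradeLog.INSERT;
END;[/code]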

UPGRADE TOOL

Renumbering tool for Dynamics NAV Objects

I needed to renumber objects for the Skyrr Payroll add-on and found an old renumbering tool on Mibuso.  I downloaded and installed it.  I corrected a bug and created an import function for an Excel worksheet.

The Excel worksheet has the object type in column A, the object ID in column B, the object name in column C, the object caption in column D and the new object ID in column E.  The first row is the header row.  I simply copied my object list from the Object Designer and worked out my new IDs in Excel.
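
An import function built on the standard Excel Buffer table could, as a sketch, read that layout roughly like this; the renumbering table and its fields are placeholders:

[code]// Sketch - RenumberLine and its fields are placeholders
ExcelBuffer.OpenBook(FileName,SheetName);
ExcelBuffer.ReadSheet;
IF ExcelBuffer.FINDLAST THEN
  LastRowNo := ExcelBuffer."Row No.";
FOR RowNo := 2 TO LastRowNo DO BEGIN  // row 1 is the header row
  RenumberLine.INIT;
  IF ExcelBuffer.GET(RowNo,1) THEN
    EVALUATE(RenumberLine."Object Type",ExcelBuffer."Cell Value as Text");
  IF ExcelBuffer.GET(RowNo,2) THEN
    EVALUATE(RenumberLine."Object ID",ExcelBuffer."Cell Value as Text");
  IF ExcelBuffer.GET(RowNo,5) THEN
    EVALUATE(RenumberLine."New Object ID",ExcelBuffer."Cell Value as Text");
  RenumberLine.INSERT;
END;[/code]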

After importing this Excel worksheet I have a list of new and old IDs in NAV.

If any of the new IDs is also an old ID, you will need to renumber first from the old ID to a temporary ID and then do another renumbering to the new ID.

I created two sheets in my Excel workbook to solve this.

This renumbering tool does not support Pages and I did not need that support in this case.

Renumbering Tool Source Code

Transferring small amount of data between databases

Most of us have needed to copy data from one Dynamics NAV database to another.  For example the posting setup tables, payment terms, currency, etc.

If the databases are not identical you will not be able to copy and paste the data, and you have to solve this problem with dataports or XMLports.

I offer a solution to this problem.  I have created a form that is able to export and import data based on table and field numbers.  It will import data to a table even if some fields are missing in the destination database.
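
The import side of the form can be sketched with RecordRef and FieldRef, skipping fields that do not exist in the destination table; the variable names here are placeholders:

[code]// Sketch - TableNo, FieldNo and ImportedValue are placeholders
RecRef.OPEN(TableNo);
RecRef.INIT;
IF RecRef.FIELDEXIST(FieldNo) THEN BEGIN
  FldRef := RecRef.FIELD(FieldNo);
  FldRef.VALUE := ImportedValue;  // ImportedValue is a Variant holding the imported field value
END;
RecRef.INSERT;[/code]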

Here is the source code

SQL Maintenance from Dynamics NAV

When running Dynamics NAV on an MS-SQL database you need to execute some maintenance tasks.  They include defragmenting and rebuilding indexes and updating statistics.  By using the SQL Native Client, the Table Information from NAV and the Job Queue, you can automate this maintenance with the Dynamics NAV Application Server.

First you must make sure that automatic update statistics and auto shrink are set to false.

Then you create a codeunit for each task.  Here is an example.

Variables
[code]TableInformation@1100409000 : Record 2000000028;
ADOConnection@1100409004 : Automation "'Microsoft ActiveX Data Objects 2.8 Library'.Connection";
SQLSetup@1100409006 : Record 10017176;
Window@1100409003 : Dialog;
WindowLastUpdated@1100409002 : DateTime;
Counter@1100409001 : Integer;
Total@1100409000 : Integer;
Text001@1100409005 : TextConst 'ENU=Microsoft ADO not found.;ISL=Microsoft ADO finnst ekki.';[/code]
Code for Rebuild Indexes using “DBCC DBREINDEX”
[code]SQLSetup.GET;
SQLSetup.TESTFIELD("SQL Server Name");
SQLSetup.TESTFIELD("SQL Database name");
IF GUIALLOWED THEN
  Window.OPEN('#1####################################\\' +
    '@2@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@');
WindowLastUpdated := CURRENTDATETIME;

IF ISCLEAR(ADOConnection) THEN
  IF NOT CREATE(ADOConnection) THEN
    ERROR(Text001);

IF ADOConnection.State = 1 THEN
  ADOConnection.Close;

ADOConnection.Open(
  STRSUBSTNO(
    'Driver={SQL Native Client};Server=%1;' +
    'Database=%2;Trusted_Connection=yes;',
    SQLSetup."SQL Server Name",
    SQLSetup."SQL Database name"));
ADOConnection.CommandTimeout(0);

WITH TableInformation DO BEGIN
  SETFILTER("No. of Records",'>0');
  SETFILTER("Table No.",'<>150014&<2000000002');
  Total := COUNTAPPROX;
  Counter := 0;
  IF FINDSET THEN REPEAT
    Counter := Counter + 1;
    IF GUIALLOWED THEN
      IF (CURRENTDATETIME - WindowLastUpdated) > 1000 THEN BEGIN
        Window.UPDATE(1,"Table Name");
        Window.UPDATE(2,ROUND(Counter / Total * 10000,1));
        WindowLastUpdated := CURRENTDATETIME;
      END;

    IF "Company Name" <> '' THEN
      ADOConnection.Execute(
        STRSUBSTNO(
          'DBCC DBREINDEX ([%1$%2],'''',90);',
          CONVERTSTR("Company Name",'."\/''','_____'),
          CONVERTSTR("Table Name",'."\/''','_____')))
    ELSE
      ADOConnection.Execute(
        STRSUBSTNO(
          'DBCC DBREINDEX ([%1],'''',90);',
          CONVERTSTR("Table Name",'."\/''','_____')))

  UNTIL NEXT = 0;
END;

ADOConnection.Close;[/code]
This example is using a trusted connection to the SQL Server, that is, a domain user account.  Grant that user the processadmin server role.

To defragment indexes, use “DBCC INDEXDEFRAG”.

Skýrr has an add-on solution that includes all these functions and more, with granule 10,017,160.

Payroll for NAV 2009

I am currently busy designing and writing an upgraded payroll system for Dynamics NAV 2009.  The solution is based on an older version that has been on the market for over ten years.  The new design should be intuitive to use and capable of handling all types of payroll calculation.

Import G/L Journal from Excel

Imagine being able to send your staff an Excel document and get it returned with data for your General Journal.  Data from outside sources is usually available in Excel.  By using this batch, you can import that data into the NAV General Journal.

Clicking on the “Create Template” button will start Excel with a document template.

On each cell in the header line there is a comment that helps with the data format.

Here is a zip file with the source code needed.