Upgrading my G/L Source Names Extension to AL – step 4 addendum

In the last blog post we completed the translation to our native language.  Since then I have learned that I also need to include translation files for EN-US, EN-GB and EN-CA.

With the latest update of the AdvaniaGIT tools that was an easy task.  I just asked it to create Xlf files for these languages and skipped the part where we import the C/AL translation.
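
For reference, an Xlf (XLIFF 1.2) translation file is a plain XML document.  A minimal sketch of what one of these generated files might contain; the file name, ids and the trans-unit shown here are made up for illustration:

```xml
<?xml version="1.0" encoding="utf-8"?>
<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
  <file datatype="xml" source-language="en-US" target-language="en-CA"
        original="G/L Source Names">
    <body>
      <group id="body">
        <!-- One trans-unit per translatable property; id is generated by the compiler -->
        <trans-unit id="Table 123 - Field 456 - Property 789" translate="yes">
          <source>Source Name</source>
          <target>Source Name</target>
        </trans-unit>
      </group>
    </body>
  </file>
</xliff>
```

A translator only needs to fill in the target elements for the target language of the file.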

I have also been pointed to a new tool that can work with Xlf files.  Multilingual Editor: https://marketplace.visualstudio.com/items?itemName=MultilingualAppToolkit.MultilingualAppToolkitv40

Now I call out to all who are ready to help me with the translation.  Please fork my NAV2018 repository and send me Xlf translation files for your native language.  Or just download one of the translation files, translate it and send it back to me.

Our next step is to code sign the App file and send it to Microsoft.

Upgrading my G/L Source Names Extension to AL – step 4

We are on a path to upgrade G/L Source Names from version 1 to version 2.  This requires conversion from C/AL to AL, a data upgrade and a number of changes to the AL code.

A complete checklist of what you need to have in your AL extension is published by Microsoft here.

Our task for today is to translate the AL project into our native language.

To make this all as easy as I could I added new functionality to the AdvaniaGIT module and VS Code extension.  Make sure to update to the latest release.

To translate an AL project we need to follow the steps described by Microsoft here.

To demonstrate this process I created a video.


Upgrading my G/L Source Names Extension to AL – step 3

When upgrading an extension from C/AL to AL (version 1 to version 2) we need to think about the data upgrade process.

In C/AL we needed to add two functions to an extension Codeunit to handle the installation and upgrade.  This I did with Codeunit 70009200.  One function is executed once for each install.

PROCEDURE OnNavAppUpgradePerDatabase@1();
VAR
  AccessControl@70009200 : Record 2000000053;
BEGIN
  WITH AccessControl DO BEGIN
    SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
    IF FINDSET THEN REPEAT
      AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
    UNTIL NEXT = 0;
  END;
END;

And another function to be executed once for each company in the install database.

PROCEDURE OnNavAppUpgradePerCompany@2();
VAR
  GLSourceNameMgt@70009200 : Codeunit 70009201;
BEGIN
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name Setup");
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name User Setup");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name");

  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Help Resource");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name User Access");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Group Access");

  GLSourceNameMgt.PopulateSourceTable;
  RemoveAssistedSetup;
END;

For each database I add my permission sets to the installation users and for each company I restore the setup data for my extension and populate the lookup table for G/L Source Name.

The methods for install and upgrade have changed in AL for extensions version 2.  Look at the AL documentation from Microsoft for details.

In version 2 I removed these two obsolete functions from my application management Codeunit and needed to add two new Codeunits, one for install and another for upgrade.

codeunit 70009207 "O4N GL Source Name Install"
{
    Subtype = Install;
    trigger OnRun();
    begin
    end;

    var
        PermissionSetToSetupGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, S';
        PermissionSetToUpdateGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, E';
        PermissionSetToUserGLSourceNames : TextConst ENU='G/L-SOURCE NAMES';

    
    trigger OnInstallAppPerCompany();
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
    begin
        GLSourceNameMgt.PopulateSourceTable;
        RemoveAssistedSetup;
    end;

    trigger OnInstallAppPerDatabase();
    var
        AccessControl : Record "Access Control";
    begin
        with AccessControl do begin
            SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
            if FINDSET then repeat
                AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
            until NEXT = 0;
        end;
    end;

  local procedure RemoveAssistedSetup();
  var
    AssistedSetup : Record "Assisted Setup";
  begin
    with AssistedSetup do begin
      SETRANGE("Page ID",PAGE::"O4N GL SN Setup Wizard");
      if not ISEMPTY then
        DELETEALL;
    end;
  end;

  local procedure AddUserAccess(AssignToUser : Guid;PermissionSet : Code[20]);
  var
    AccessControl : Record "Access Control";
    AppMgt : Codeunit "O4N GL SN App Mgt.";
    AppGuid : Guid;
  begin
    EVALUATE(AppGuid,AppMgt.GetAppId);
    with AccessControl do begin
      INIT;
      "User Security ID" := AssignToUser;
      "App ID" := AppGuid;
      Scope := Scope::Tenant;
      "Role ID" := PermissionSet;
      if not FIND then
        INSERT(true);
    end;
  end;

}

In the code you can see that this Codeunit has Subtype=Install.  This code will be executed when the extension is installed in a database.

To confirm this I can see that I have the G/L Source Names permission sets in the Access Control table.

And my G/L Source Name table also has all required entries.

Uninstalling the extension will not remove this data.  Therefore you need to make sure that the install code is structured in a way that it will work even when reinstalling.  Look at the examples from Microsoft to get a better understanding.

Back to my C/AL extension.  When that one is uninstalled, its data is moved to archive tables.

Archive tables are handled with the NAVAPP.* commands.  The OnNavAppUpgradePerCompany function shown above handles these archive tables when reinstalling or upgrading.

Basically, since I am keeping the same table structure I can use the same set of commands for my upgrade Codeunit.

codeunit 70009208 "O4N GL SN Upgrade"
{
    Subtype=Upgrade;
    trigger OnRun()
    begin
        
    end;
    
    trigger OnCheckPreconditionsPerCompany()
    begin

    end;

    trigger OnCheckPreconditionsPerDatabase()
    begin

    end;
    
    trigger OnUpgradePerCompany()
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
        archivedVersion : Text;
    begin
        archivedVersion := NAVAPP.GetArchiveVersion();
        if archivedVersion = '1.0.0.1' then begin
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN Setup");
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN User Setup");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN");

            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Help Resource");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN User Access");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Group Access");

            GLSourceNameMgt.PopulateSourceTable;
        end;
    end;

    trigger OnUpgradePerDatabase()
    begin

    end;

    trigger OnValidateUpgradePerCompany()
    begin

    end;

    trigger OnValidateUpgradePerDatabase()
    begin

    end;
    
}

So, time to test how and if this works.

I have my AL folder open in Visual Studio Code and I use the AdvaniaGIT command Build NAV Environment to get the new Docker container up and running.

Then I use Update launch.json with current branch information to update my launch.json server settings.

I like to use the NAV Container Helper from Microsoft to manually work with the container.  I use a command from the AdvaniaGIT module to import the NAV Container module.

The module uses the container name for most of the functions.  The container name can be found by listing the running Docker containers or by asking for the name that matches the server used in launch.json.

I need my C/AL extension inside the container so I executed

Copy-FileToNavContainer -containerName jolly_bhabha -localPath C:\NAVManagementWorkFolder\Workspace\GIT\Kappi\NAV2017\Extension1\AppPackage.navx -containerPath c:\run

Then I open PowerShell inside the container

Enter-NavContainer -containerName jolly_bhabha

Import the NAV Administration Module

Welcome to the NAV Container PowerShell prompt

[50AA0018A87F]: PS C:\run> Import-Module 'C:\Program Files\Microsoft Dynamics NAV\110\Service\NavAdminTool.ps1'

Welcome to the Server Admin Tool Shell!

[50AA0018A87F]: PS C:\run>

and I am ready to play.  Install the C/AL extension

Publish-NAVApp -ServerInstance NAV -IdePath 'C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client\finsql.exe' -Path C:\run\AppPackage.navx -SkipVerification

Now I am faced with the fact that I have opened PowerShell inside the container in my AdvaniaGIT terminal.  That means that my AdvaniaGIT commands will execute inside the container, but not on the host.

The simplest way to solve this is to open another instance of Visual Studio Code.  From there I can start the Web Client and complete the install and configuration of my C/AL extension.

I complete the Assisted Setup and do a round trip to G/L Entries to make sure that I have enough data in my tables to verify that the data upgrade is working.

I can verify this by looking into the SQL tables for my extension.  I use PowerShell to uninstall and unpublish my C/AL extension.

Uninstall-NAVApp -ServerInstance NAV -Name "G/L Source Names"
Unpublish-NAVApp -ServerInstance NAV -Name "G/L Source Names"

I can verify that in my SQL database I now have four AppData archive tables.

Pressing F5 in Visual Studio Code will now publish and install the AL extension, even if I have the terminal open inside the container.

The extension is published but can’t be installed because I had previously installed an older version of my extension.  Back in my container PowerShell I will follow the steps as described by Microsoft.

[50AA0018A87F]: PS C:\run> Sync-NAVApp -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
WARNING: Cannot synchronize the extension G/L Source Names because it is already synchronized.
[50AA0018A87F]: PS C:\run> Start-NAVAppDataUpgrade -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
[50AA0018A87F]: PS C:\run> Install-NAVApp -ServerInstance NAV -Tenant Default -Name "G/L Source Names"
WARNING: Cannot install extension G/L Source Names by Objects4NAV 2.0.0.0 for the tenant default because it is already installed.
[50AA0018A87F]: PS C:\run>

My AL extension is published and I have verified in my SQL server that all the data from the C/AL extension has been moved to the AL extension tables and all the archive tables have been removed.

Back in Visual Studio Code I can now use F5 to publish and install the extension again if I need to update, debug and test my extension.

A couple more steps are left that I will do shortly.  Happy coding…


Upgrading my G/L Source Names Extension to AL – step 2

So, where is step 1?  Step 1 was converting the C/AL code to AL code.  That we did with AdvaniaGIT, as demonstrated here.

First thing first!  I received the following email from Microsoft.

Hello,

The decision has been made by our SLT, that the use of a Prefix or Suffix is now a mandatory requirement. If you are already using this in your app(s), great. If not, you will want to do so.

We are coming across too many collisions between apps in our internal tests during builds and have seen some in live tenants as well. It makes the most sense to make this a requirement now. If you think about it in a live situation, if a customer installs an app before yours and then tries yours but gets collision issues, they may just decide to not even continue on with yours.

Also, I have been made aware that adding a prefix or suffix after you already have a v2 app published can make the process complicated for you. Therefore, since you all have to convert to v2 anyway, now is a good time to add in the prefix/suffix.

The following link provides the guidelines around using it here

If you haven’t reserved your prefix yet, please email me back to reserve one (or more if needed).

Thank you,

Ryan

Since my brand is Objects4NAV.com I asked for O4N as my prefix and got it registered.  Since we got this information from Microsoft, every object that we develop in NAV 2018 now has our company's prefix in its name.

I started my AL development by opening Visual Studio Code in my repository folder.  I updated my setup.json to match the latest preview build as a Docker container and then selected Build NAV Environment using AdvaniaGIT.

After download and deployment of the container I noticed that the container had a brand new version of the AL Language extension for Visual Studio Code.  The version installed on my host was older.

I uninstalled the AL Language extension and restarted Visual Studio Code.

As you can see on the screenshot above, we now don't have any AL Language extension installed.  I executed the Build NAV Environment command from AdvaniaGIT to install the extension from the Docker container.  In this case I already had a container assigned to my branch so only three things happened.

  • uidOffset in the container database was updated.  This is recommended for C/AL development.
  • License file is updated in the container database and container service.  The license used is the one configured in branch setup.json or the machine settings GITSettings.json
  • AL Language Extension is copied from the container to the host and installed in Visual Studio Code.

Again, restarting Visual Studio Code to find that the latest version of AL Language Extension has been installed.

I then executed two AdvaniaGIT actions.

  • Update Launch.json with current branch environment.  This will update the host name and the service name in my AL Launch.json file to make sure that my AL project will be interacting with the branch container.
  • Open Visual Studio Code in AL folder.  This will open another instance of Visual Studio Code in the AL folder.

Immediately after Visual Studio Code was opened it asked for symbols and I agreed that we should download them from the container.

Everything is now ready for AL development using the latest build that Microsoft has to offer.

I started Edit – Replace in Files in Visual Studio Code.  All my objects have names that start with G/L Source Name.  I used this knowledge to apply the prefix.

By starting with the double quote I make sure to only update the object names and not the captions.  All captions start with a single quote.  I got a list of all changes and needed to confirm each one.
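
A sketch of the kind of search and replace used; the exact strings here are illustrative, and some object names may still need shortening afterwards to stay within the object name length limit:

```
Find:    "G/L Source Name
Replace: "O4N G/L Source Name
```

Starting both strings with the double quote limits the matches to object and variable names, since captions are enclosed in single quotes.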

The field name I added to the G/L Entry table does not match this rule so I needed to rename that one myself.  Selecting the field name and pressing F2 allows me to rename a field and have Visual Studio Code update all references automatically.

Pressing F5 started my build, publish and debug.

My extension is installed and ready for testing.

There are a few more steps that I need to look into before publishing the new version of G/L Source Names to Dynamics 365.  These steps will appear here in the coming days.  Hope this will be useful to you all.

Don’t worry about DotNet version in C/AL

When using the DotNet data type in NAV C/AL we normally look up a sub type to use.  When we do, the result can be something like

Newtonsoft.Json.Linq.JObject.'Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'

Then, what happens when we move this code from NAV 2016 to NAV 2017 or NAV 2018?  The Newtonsoft.Json version is not the same and we will get a compile error!

Just remove the version information from the sub type information.

Newtonsoft.Json.Linq.JObject.'Newtonsoft.Json'

And NAV will find the matching Newtonsoft.Json library you have installed and use it.

This should work for all our DotNet variables.

AzureSQL database gives me change tracking error

I just uploaded a SQL bacpac to AzureSQL.  This I have done a number of times.  I connected my service tier to the SQL database and tried to start the service.

This time I got an error.  Looking in the Event Viewer I can see.

The following SQL error was unexpected:
  Change tracking is already enabled for database '2018-IS365'.
  ALTER DATABASE statement failed.
  SQL statement:
  IF NOT EXISTS (SELECT * FROM sys.change_tracking_databases WHERE database_id=DB_ID('2018-IS365')) ALTER DATABASE [2018-IS365] SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 10 MINUTES, AUTO_CLEANUP = ON)

I looked into the SQL database, and sure enough there was a line in sys.change_tracking_databases table.  The problem was that in that table the [database_id] was equal to 48 while

SELECT db_id('2018-IS365')

resulted in 49.  Hence the error and my service tier failing to start.

To remove the change tracking from the database I executed (found here)

DECLARE @SQL NVARCHAR(MAX)='';
SELECT @SQL = @SQL + 'ALTER TABLE ' + s.name + '.[' + t.name + ']' +
 ' Disable Change_tracking;'
FROM sys.change_tracking_tables ct
JOIN sys.tables t
 ON ct.object_id= t.object_id
JOIN sys.schemas s
 ON t.schema_id= s.schema_id;
PRINT @SQL;
EXEC sp_executesql @SQL;

ALTER DATABASE [2018-IS365] SET CHANGE_TRACKING = OFF;

The service tier will take care of turning the change tracking on again when it starts.  You might need to repeat these steps if restarting the service tier.

According to Microsoft a fix is in the pipeline and likely ships in CU2.

Using AdvaniaGIT – Convert G/L Source Names to AL

Here we go.

The NAV on Docker environment we just created can be used for the task at hand.  I have an Extension in Dynamics 365 called G/L Source Names.

I need to update this Extension to V2.0 using AL.  In this video I go through the upgrade and conversion process using AdvaniaGIT and Visual Studio Code.

In the first part I copy the deltas from my Dynamics 365 Extension into my workspace, and I download and prepare the latest release of the NAV 2018 Docker container.

Using our source and modified environments we can build new syntax objects and new syntax deltas. These new syntax deltas are then converted to AL code.


Working with optional NAV table fields

Now that we have entered the Extension era we must take into account that some extensions may or may not be installed at the time of code execution.

You might even have two Extensions that you would like to have share data.

Let’s give an example.

In Iceland we add a new field to the Customer table (18).  That field is named "Registration No." and is used for a 10 digit number that is unique to the individual or company we add as a customer to the system.

My Example Extension can support Icelandic Registration No. if it exists.

Using Codeunit 701, “Data Type Management”, Record Reference and Field Reference we can form the following code.

LOCAL PROCEDURE GetCustomerRegistrationNo@10(Customer@1000 : Record 18) RegistrationNo : Text;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(Customer,RecRef) THEN EXIT('');
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,'Registration No.') THEN EXIT('');
  RegistrationNo := FldRef.VALUE;
END;

Let’s walk through this code…

GetRecordRef will populate the record reference (RecRef) for the given table and return TRUE if successful.
FindFieldByName will populate the field reference (FldRef) for the given record reference and field name and return TRUE if successful.

Call this function with a code like this.

Customer.GET('MYCUSTOMER');
RegistrationNo := GetCustomerRegistrationNo(Customer);

We could create a more generic function.

LOCAL PROCEDURE GetFieldValueAsText@103(RecVariant@1000 : Variant;FieldName@1004 : Text) FieldValue : Text;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(RecVariant,RecRef) THEN EXIT('');
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT('');
  FieldValue := FldRef.VALUE;
END;

This function can be used in more generic ways, like

Customer.GET('MYCUSTOMER');
RegistrationNo := GetFieldValueAsText(Customer,'Registration No.');

Vendor.GET('MYVENDOR');
RegistrationNo := GetFieldValueAsText(Vendor,'Registration No.');

See where I am going with this?

So the other way around…

LOCAL PROCEDURE SetCustomerRegistrationNo@21(VAR Customer@1000 : Record 18;RegistrationNo@1004 : Text) : Boolean;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(Customer,RecRef) THEN EXIT(FALSE);
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,'Registration No.') THEN EXIT(FALSE);
  FldRef.VALUE := RegistrationNo;
  RecRef.SETTABLE(Customer);
  EXIT(TRUE);
END;

And using this with

Customer.GET('MYCUSTOMER');
SetCustomerRegistrationNo(Customer,'1234567890');

More generic versions can be something like this.

PROCEDURE PopulateOptionalField@25(VAR RecordVariant@1000 : Variant;FieldName@1001 : Text;FieldValue@1002 : Variant) : Boolean;
VAR
  RecRef@1004 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT GetRecordRef(RecordVariant,RecRef) THEN EXIT;
  IF NOT FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT;
  FldRef.VALUE := FieldValue;
  RecRef.SETTABLE(RecordVariant);
  EXIT(TRUE);
END;

PROCEDURE ValidateOptionalField@26(VAR RecordVariant@1000 : Variant;FieldName@1001 : Text;FieldValue@1002 : Variant) : Boolean;
VAR
  RecRef@1004 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT GetRecordRef(RecordVariant,RecRef) THEN EXIT;
  IF NOT FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT;
  FldRef.VALIDATE(FieldValue);
  RecRef.SETTABLE(RecordVariant);
  EXIT(TRUE);
END;

To use these functions we first need to copy our record to a variant variable and then back to the record after the function completes.

RecordVariant := Customer;
PopulateOptionalField(RecordVariant,'Registration No.','1102713369');
Customer := RecordVariant;

Or

RecordVariant := Customer;
ValidateOptionalField(RecordVariant,'Registration No.','1102713369');
Customer := RecordVariant;

I have requested Microsoft to add more generic functions to Codeunit 701, “Data Type Management”.  I trust that they will deliver as usual.

That NAV Codeunit is one of my favorites delivered by Microsoft.

Sharing data with multiple tenants

I am upgrading multiple companies to NAV 2016.  I really like the multi tenant setup and use it in most cases.

In NAV we have the option to make a table common with all companies.

[Screenshot: NoDataPerCompany]

This option has been available for all the versions of NAV that I can remember.

Using a multi tenant setup means that you have a dedicated database for each tenant and normally only one company for each tenant.  That makes this option completely useless.

I was running this same multi tenant setup in NAV 2013 R2 and there I solved this issue by modifying the table to be a linked table.

[Screenshot: LinkedObject]

To successfully set up a linked table I need to manually make sure that a table or view with the correct name and the correct layout is present in every tenant database.  That is a job for SQL Server Management Studio (SSMS) and can't be done within NAV.  Doing this also makes upgrades more difficult and can get in the way of a normal synchronization of metadata and tables.
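
For illustration, maintaining such a linked table typically means creating a matching view in each tenant database by hand, along these lines; the database, company and table names here are hypothetical:

```sql
-- Run in each tenant database (hypothetical names).
-- The view must match the name and layout NAV expects for the linked table.
CREATE VIEW [dbo].[CRONUS$National Register] AS
SELECT * FROM [Central NAV Data].[dbo].[National Register];
```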

Moving up to NAV 2016 I wanted to get out of this model and use the External SQL methods now available.

[Screenshot: ExternalTable]

With these properties we can set the table or view name as the ExternalName and the table or view schema as the ExternalSchema.  For each field in the table we can define an ExternalName.  If that is not defined, the normal NAV field name will be used.

[Screenshot: FieldNames]

This option basically opens the door from the NAV Server to any SQL table.  So, how do we get this to work?

I will show you how I moved from the linked table method to the External SQL method.  If you take another look at the properties available for an External SQL table you will see that the DataPerCompany property is not available.  An External SQL table is just a table definition for NAV to use, and with C/AL code you define where to find the external table.  This gives you the flexibility to share the same table with all companies and all tenants, or to select different sources per tenant and/or company.

In Iceland we have a national registry.  That registry holds the registration details for every person and every company.  Some companies buy access to the data from the national registry, keep a local copy and are allowed to do lookups from this data.  Since the data in this table is updated centrally, but every company in every tenant wants access to it, this is a good candidate for the External SQL table method.

I already had the table defined in NAV with needed data.  I was able to find that table with SSMS.

[Screenshot: OriginalTable]

By using this table I did not have to worry about the ExternalName for each column in my table definition since the column names already matched the NAV field names.

I found my application database and used the script engine in SSMS to script the database creation.  I updated the database name to create a central database for my centralized data.  I chose this method to make sure that the new database has the same collation as the NAV application database.

I scripted the National Register table creation and created the table in my centralized database.  Then I combined the INSERT INTO and SELECT FROM scripts to insert the data into my centralized table.

Finally I made sure that the service user running the NAV service had access to my centralized database.  By doing this I can use a trusted connection between the NAV server and the SQL server.

Moving to the NAV development environment and into the table properties.

[Screenshot: NationalRegisterExternalSQL]

The ExternalName and ExternalSchema must match the table I created.  Look at the picture from the SSMS to see “FROM [dbo].[National Register]”.  There you can pick up what you need to specify in these properties.
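
For this example the relevant table properties end up something like this; a sketch, and the values must match the table created above exactly:

```
TableType = ExternalSQL;
ExternalName = National Register;
ExternalSchema = dbo;
```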

When these changes are synchronized to my database NAV will remove the previous National Register table from the NAV database.  That requires a synchronization with force so be careful.

The actual connection to the centralized database must be done in C/AL.  More information is on this MSDN article.

To complete this solution I use several patterns.

I need setup data to point me to the centralized database.  I like to look at this as an external service, so I link the setup to the Service Connections; Waldo calls this the Discovery Event pattern.  I created the following function in a Codeunit to register the service.

[Screenshot: RegisterConnection]

So, if the user has write access to the National Register Setup he will see this service listed in the service connections.

The link to an external database might require me to save a user name and a password.  To successfully do this I apply another pattern for password encryption.  I normally point people to the OCR service setup table and page to find out how to implement these password patterns.

I like to have the Enabled field on my setup tables.  When the service is enabled the user can’t modify the setup data and when trying to enable the service the setup data is verified.  Fields on the setup page are protected by using the EditableByNotEnabled variable.

[Screenshot: EditableByNotEnabled]

I don't think you will find a pattern for this method, but in other details the setup table follows the Singleton pattern.

[Screenshot: NRSetup2]

When the user tries to enable the service I run a series of error tests.  The error testing is done with a combination of the Error Message pattern and the TryFunction pattern.

[Screenshot: TestSetup]

Line 21 does the actual connection test with a TryFunction.

Now, how to connect to the centralized data?

In my setup table I store the database server name and the database name within that server.  With this data I create the connection string.

[Screenshot: RegisterUserConnection]
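
The screenshot does not reproduce as text, so here is a hedged C/AL sketch of what such a registration function can look like; the setup table, its field names and the GetConnectionName helper are assumptions based on this post:

```
LOCAL PROCEDURE RegisterUserConnection@10();
VAR
  NRSetup@1000 : Record "National Register Setup";
  ConnectionString@1001 : Text;
BEGIN
  NRSetup.GET;
  // Build a trusted connection string from the setup fields (hypothetical names)
  ConnectionString :=
    STRSUBSTNO('Data Source=%1;Initial Catalog=%2;Integrated Security=SSPI;',
      NRSetup."Database Server",NRSetup."Database Name");
  IF NOT HASTABLECONNECTION(TABLECONNECTIONTYPE::ExternalSQL,GetConnectionName) THEN
    REGISTERTABLECONNECTION(TABLECONNECTIONTYPE::ExternalSQL,GetConnectionName,ConnectionString);
  // Make this the default connection for ExternalSQL tables in this session
  SETDEFAULTTABLECONNECTION(TABLECONNECTIONTYPE::ExternalSQL,GetConnectionName);
END;
```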

The table connection must have a unique id.  I like to create a function to return variables that are linked to the functionality instead of using text constants directly.

[Screenshot: GetConnectionName]

This combines what I need to do.  With the correct connection string, C/AL registers the connection to my centralized database and sets that connection as the default connection for the user.  Whenever NAV needs to use the data from the National Register, C/AL must have the connection registered.

[Screenshot: CheckOrRegister]

I add a call to this Codeunit from every page and every function that uses the National Register.

[Screenshot: PageInit]

Now back to my TryFunction, I can simply check if I can do a FINDFIRST without an error.

[Screenshot: TryLookup]
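
A sketch of what that TryFunction can look like (the record name is an assumption); FINDFIRST throws an error if the external table cannot be reached, and the TryFunction attribute turns that error into a FALSE return value for the caller:

```
[TryFunction]
LOCAL PROCEDURE TryLookup@11();
VAR
  NationalRegister@1000 : Record "National Register";
BEGIN
  // Fails, and is caught by the caller, when the external connection is broken
  NationalRegister.FINDFIRST;
END;
```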


Using the new FilterPage in NAV 2016

I was a little surprised to not find any information online on the new FilterPage type in Dynamics NAV 2016.

As a part of the new Workflow feature Microsoft built a new generic feature to ask the user for a filter on any record.

[Screenshot: Workflow]

Pressing the Assist-Edit button will open the Dynamic Filter Page.

[Screenshot: DynamicFilterPage]

This view is the same view a NAV user is familiar with when starting reports and batches.

Now to show how to use this new feature.  The best way to show it is usually with an example.

Go to the Chart of Accounts.  Then from the ribbon select G/L Balance by Dimension.  Select a setup similar to the screenshot below and press Show Matrix on the ribbon.

[Screenshot: GLByDimension]

Now you are in a page where you can't filter anything.  You will see all G/L Accounts within the G/L Account Filter selected earlier and all Accounting Periods in columns according to the Matrix Options.  Yes, you have all the normal filter options on the page, but none of them work.

[Screenshot: OriginalMatrix]

So let's see how to use the dynamic FilterPage to give the user a better experience of this feature.

The first challenge: I want a single month comparison in the columns.  Let's compare amounts for January by year.

To do this we need to make a few modifications to Page 408.

Add the global text variable PeriodTableView.

[Screenshot: Page408AddNewGlobal]

When the user changes what to show as columns we need to make sure that the PeriodTableView is empty.

[Screenshot: Page408ClearPeriodTableView]

When the column captions are generated the new PeriodTableView should be used.

[Screenshot: Page408AddSetView]

The same changes need to be applied to the NextRec function.

Two new functions need to be added to ask the user for the filter.

[Screenshot: Page408NewFunctions]
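
The two functions are shown only as a screenshot, but with the FilterPageBuilder data type they can be sketched roughly like this; the function name and details are my reconstruction:

```
LOCAL PROCEDURE LookupPeriodTableView@100();
VAR
  AccountingPeriod@1000 : Record 50;
  FilterPage@1001 : FilterPageBuilder;
BEGIN
  // Offer the Accounting Period table on the dynamic filter page
  FilterPage.ADDTABLE(AccountingPeriod.TABLECAPTION,DATABASE::"Accounting Period");
  // Start from the previously selected view, if any
  IF PeriodTableView <> '' THEN
    FilterPage.SETVIEW(AccountingPeriod.TABLECAPTION,PeriodTableView);
  // RUNMODAL returns TRUE when the user accepts the filter page
  IF FilterPage.RUNMODAL THEN
    PeriodTableView := FilterPage.GETVIEW(AccountingPeriod.TABLECAPTION);
END;
```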

And finally, make these functions available to the user.

[Screenshot: CallingPageView]

The result is that the user can now press the Assist-Edit button and enter a filter for every column option.

[Screenshot: Page408AccountingPeriod]

To attain our goal, let's filter on the month we want to see.

[Screenshot: FilterOnJanuary]

And the result Matrix looks like this.

[Screenshot: MatrixForJanuary]

We could add a filter page to the Matrix page to be able to filter the G/L Accounts using the same methods, and we could add functionality to filter the lines similar to what we did for the columns, but I am not going through that now.

The modified Page 408 is attached.  Good luck.

[Attachment: Page408]