Using AdvaniaGIT – FTP server for teams

So, you are not the only one in your company doing development, right?

An essential part of being able to develop C/AL is to have a starting point.  That starting point is usually where you left off the last time you did some development.  If you are starting a new task, your starting point may simply be the localized release from Microsoft.

A starting point in AdvaniaGIT is a database backup.  The database backup can contain data, and it should: data makes sure that you as a developer can do some basic testing of the solution you are creating.

AdvaniaGIT has a dedicated folder (C:\AdvaniaGIT\Backup) for the database backups.  That is where you should put your backups.

If you are working in a team, and even if you are not, you might not want to flood your local drive with database backups.  That is why we configure an FTP server in C:\AdvaniaGIT\Data\GITSettings.json.

{
    "ftpServer":  "ftp://ftp02.hysing.is/",
    "ftpUser":  "ftp_sourcetree",
    "ftpPass":  "*****",
    "licenseFile":  "Advania.flf",
    "workFolder":  "C:\\NAVManagementWorkFolder\\Workspace",
    "patchNoFunction":  "Display-PatchDayNo",
    "defaultDatabaseServer":  "localhost",
    "defaultDatabaseInstance":  "",
    "objectsNotToDelete":  "(14125500..14125600)",
    "sigToolExecutable":  "C:\\Program Files (x86)\\Windows Kits\\10\\bin\\x64\\signtool.exe",
    "codeSigningCertificate":  "CodeSign_Certificate.pfx",
    "codeSigningCertificatePassword":  "****",
    "setupPath":  "setup.json",
    "objectsPath":  "Objects",
    "deltasPath":  "Deltas",
    "reverseDeltasPath":  "ReverseDeltas",
    "extensionPath":  "Extension1",
    "imagesPath":  "Images",
    "screenshotsPath":  "ScreenShots",
    "permissionSetsPath":  "PermissionSets",
    "addinsPath":  "Addins",
    "languagePath":  "Languages",
    "tableDataPath":  "TableDatas",
    "customReportLayoutsPath":  "CustomReportLayouts",
    "webServicesPath":  "WebServices",
    "binaryPath":  "Binaries",
    "testObjectsPath":  "TestObjects",
    "datetimeCulture":  "is-IS",
    "NewSyntaxPrefix":  "NewSyntax",
    "targetPlatform":  "DynamicsNAV",
    "VSCodePath":  "AL"
}

When we start an action to build a NAV development environment, the AdvaniaGIT tools search for a database backup.

The search covers both C:\AdvaniaGIT\Backup and the root of the FTP server.

The function Get-NAVBackupFilePath locates the desired backup file by searching with these patterns.

    $FilePatterns = @(
        "$($SetupParameters.navRelease)-$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.navVersion)/$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)-$($SetupParameters.navSolution).bak"
        "$($SetupParameters.navRelease)/$($SetupParameters.navVersion)/$($SetupParameters.navSolution).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.navSolution).bak")

The navRelease is the year (2016, 2017, …).  The navVersion is the build (9.0.46045.0, 9.0.46290.0, 10.0.17972.0, …).

The projectName and navSolution parameters are defined in Setup.json (settings file) in every GIT branch.

{
  "branchId": "3ba8870a-0274-4162-8ea2-66e314bb3e34",
  "navSolution":  "IS",
  "storeAllObjects":  "true",
  "navVersion":  "9.0.48992.0",
  "projectName": "ADIS",
  "baseBranch": "IS",
  "uidOffset": "10000200",  
  "objectProperties": "true",
  "datetimeCulture":  "is-IS"
}

Combining these values we can see that the search will be done with these patterns.

    $FilePatterns = @(
        "2016-ADIS.bak",
        "2016/9.0.48992.0/ADIS.bak",
        "2016/ADIS.bak",
        "2016-IS.bak"
        "2016/9.0.48992.0/IS.bak",
        "2016/IS.bak")

And these file patterns are applied both to C:\AdvaniaGIT\Backup and to the FTP server root folder.  Here are screenshots from our FTP server.

Looking into the 2017 folder

And into one of the build folders

My local backup folder is simpler

This should give you some idea of where to store your SQL backup files.

 

 

Using AdvaniaGIT – Bring your solution into SCM

Most of us are not just starting to work with NAV.  But then again, not all of us have been able to apply Source Control Management (SCM) to our daily work.

In the previous posts we have installed and prepared our development environment and we are now ready to start working on our solution in the SCM way.

Our first step is to create a branch for our solution.

Now we should look at the options we have to import our solution into our branch.  We can have our solution in different formats.

All these file formats can be imported into our solution branch with the tools that AdvaniaGIT delivers.  Let’s start with the SQL backup, 2016-DAA_WineApp.bak.  AdvaniaGIT will search for backups using these patterns.

    $FilePatterns = @(
        "$($SetupParameters.navRelease)-$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.navVersion)/$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.projectName).bak",
        "$($SetupParameters.navRelease)-$($SetupParameters.navSolution).bak"
        "$($SetupParameters.navRelease)/$($SetupParameters.navVersion)/$($SetupParameters.navSolution).bak",
        "$($SetupParameters.navRelease)/$($SetupParameters.navSolution).bak")

In this example the values are:

    $FilePatterns = @(
        "2016-WineSolution.bak",
        "2016/9.0.46045.0/WineSolution.bak",
        "2016/WineSolution.bak",
        "2016-NA.bak"
        "2016/9.0.46045.0/NA.bak",
        "2016/NA.bak")

The module first searches the local drive (default = C:\AdvaniaGIT\Backup) and then the ftp server if one is defined in GITSettings.json.

When AdvaniaGIT creates a database backup it is named according to the first pattern, in this case 2016-WineSolution.bak and saved in the default backup location.

The rest of the file types require that the solution branch has already been built, as shown in the first video.

Here we restore from a bacpac.  The bacpac format is, for example, used by Azure SQL.

The Navdata format was created by the NAV development team.  To be able to import from a Navdata file we require a Navdata export with the application included.

Perhaps the most common way is to have a FOB export.

Text exported objects can be imported directly into our GIT branch.

And finally we can update the solution branch from delta files.

Our next task will be to do some development in our solution and commit the changes we make to our GIT server.  Stay tuned…

Using AdvaniaGIT – Create a localization branch

In our previous post we completed the installation of GIT, SourceTree, NAV and the AdvaniaGIT modules.  We also created a GIT repository in Bitbucket.  We chose Bitbucket just because I already had a user configured there.  I also have a user configured in Visual Studio Online, where I store many of my NAV solutions in a private GIT repository.  Public projects I put on GitHub.  As I stated before, we must make sure not to push any of the NAV standard code to public repositories.

Advania has internal GIT servers where we host our solutions.

The choice is yours.  I don’t have any favorite GIT server solution.

In Advania we have created a branch structure.  We create a repository for each NAV version.  The master branch is always the W1 release.  Each commit to the W1 branch contains a CU update.  We always store all objects in every branch.  In solution branches we also store deltas and reverse deltas.

We can access any of the CU’s code from the GIT repository and we can see every code change made from one CU to the next.

We branch master out to each localization.  Each localization follows the same rules as the master branch.  Every CU is available and every code change is traceable.

All the GIT servers have a nice web interface.  There we can easily see all commits, all code and all changes.  This is a list of commits for the master branch.

This is a list of code changes in NAV 2017 CU8.

Let’s go ahead and create our first localization branch.  In the following video I use the AdvaniaGIT functions to download and extract information from the latest CU.  I don’t have a function that will download all updates for a given version.

Our next step will be creating a solution branch based on our localization.  Stay tuned…

Using AdvaniaGIT – Create your first private GIT repository

We have a predefined folder structure in our GIT repository.  Each repository can have multiple branches.  The first branch and the parent to all the rest is “master”.  In our structure we always store released W1 objects in the master branch.

The GIT sub folder structure is defined in GITSettings.json.

"setupPath": "setup.json",
"objectsPath": "Objects",
"deltasPath": "Deltas",
"reverseDeltasPath": "ReverseDeltas",
"extensionPath": "Extension1",
"imagesPath": "Images",
"screenshotsPath": "ScreenShots",
"permissionSetsPath": "PermissionSets",
"addinsPath": "Addins",
"languagePath": "Languages",
"tableDataPath": "TableDatas",
"customReportLayoutsPath": "CustomReportLayouts",
"webServicesPath": "WebServices",
"binaryPath": "Binaries",

We store all the NAV objects in the “Objects” folder.  By all NAV objects I mean everything needed to build a solution from our GIT branch.

The basic rules are

  • Each branch needs a unique id, a GUID.  You can generate a new GUID with any of the online GUID generators.  The branchId parameter is stored in each branch's setup file.
  • Public repositories must not contain exported Microsoft objects.  These repositories must have the storeAllObjects parameter set to false.  These branches are based on standard objects in the Source folder and the Deltas in the repository.
  • Keep your common parameters in Data\GITSettings.json and your branch-specific parameters in Setup.json.

A list of available setup parameters can be found on the AdvaniaGIT wiki.  The wiki is a work in progress.

As we go through each of the Custom Actions we will talk about these parameters and how they are used.

Here is a demo video of me creating a private repository for NAV 2017 solutions.

 

The AdvaniaGIT module links each branch to an installed NAV environment.  This link is stored in Data\BranchSettings.json and is automatically maintained when environments are built and removed.

NAV environments and NAV databases follow naming rules.  The prefix is NAV followed by the main release, then DEV and an automatically incremented number.  An example could be “NAV2017DEV0000008”.  The last increment number is stored in Data\BuildSettings.json and is updated on every environment build.

All installed NAV versions must be defined in Data\NAVVersions.json.  Make sure to specify the correct web client port so that your web client works correctly.  This is an example of my NAV versions.

{
  "Releases": 
	[
		{"mainVersion": "90", "navRelease": "2016", "webClientPort": "8090", "helpServer": "nav90help.advania.is", "helpServerPort": "80"},
		{"mainVersion": "100","navRelease": "2017", "webClientPort": "8080", "helpServer": "nav100help.advania.is", "helpServerPort": "80"}
	]
}

This summer I got the chance to work with Microsoft on vNext.  Microsoft gave me access to a GIT repository with their base application.  The GIT repository contained a ReadMe file and a folder called BaseApp with all the objects.  I just added a setup.json file to the repository and was able to use SourceTree and the custom actions for all the development.  Here is the setup.json file I added.

{
  "branchId": "1901c80b-711a-4564-aec6-ac7684e578cb",
  "navVersion": "10.0.17001.0",
  "navSolution": "W1",
  "projectName": "W1",
  "storeAllObjects": "true",
  "objectProperties": "false",
  "objectsPath": "BaseApp"
}

Our next task will be on creating a branch for our localization and our solution.  Stay tuned…

 

Working with optional NAV table fields

Now that we have entered the Extension era we must take into account that some extensions may or may not be installed at the time of code execution.

You might even have two extensions that you would like to have share data.

Let’s give an example.

In Iceland we add a new field to the Customer table (18).  That field is named “Registration No.” and is used for a 10-digit number that is unique to the individual or company we add as a customer to our system.

My example extension can support the Icelandic Registration No. if it exists.

Using Codeunit 701, “Data Type Management”, together with a RecordRef and a FieldRef, we can write the following code.

LOCAL PROCEDURE GetCustomerRegistrationNo@10(Customer@1000 : Record 18) RegistrationNo : Text;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(Customer,RecRef) THEN EXIT('');
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,'Registration No.') THEN EXIT('');
  RegistrationNo := FldRef.VALUE;
END;

Let’s walk through this code…

GetRecordRef will populate the record reference (RecRef) for the given table and return TRUE if successful.
FindFieldByName will populate the field reference (FldRef) for the given record reference and field name and return TRUE if successful.

Call this function with code like this.

Customer.GET('MYCUSTOMER');
RegistrationNo := GetCustomerRegistrationNo(Customer);

We could create a more generic function.

LOCAL PROCEDURE GetFieldValueAsText@103(RecVariant@1000 : Variant;FieldName@1004 : Text) FieldValue : Text;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(RecVariant,RecRef) THEN EXIT('');
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT('');
  FieldValue := FldRef.VALUE;
END;

This function can be used in more generic ways, like

Customer.GET('MYCUSTOMER');
RegistrationNo := GetFieldValueAsText(Customer,'Registration No.');

Vendor.GET('MYVENDOR');
RegistrationNo := GetFieldValueAsText(Vendor,'Registration No.');

See where I am going with this?

So the other way around…

LOCAL PROCEDURE SetCustomerRegistrationNo@21(VAR Customer@1000 : Record 18;RegistrationNo@1004 : Text) : Boolean;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(Customer,RecRef) THEN EXIT(FALSE);
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,'Registration No.') THEN EXIT(FALSE);
  FldRef.VALUE := RegistrationNo;
  RecRef.SETTABLE(Customer);
  EXIT(TRUE);
END;

And using this with

Customer.GET('MYCUSTOMER');
SetCustomerRegistrationNo(Customer,'1234567890');

More generic versions can be something like this.

PROCEDURE PopulateOptionalField@25(VAR RecordVariant@1000 : Variant;FieldName@1001 : Text;FieldValue@1002 : Variant) : Boolean;
VAR
  RecRef@1004 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT GetRecordRef(RecordVariant,RecRef) THEN EXIT;
  IF NOT FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT;
  FldRef.VALUE := FieldValue;
  RecRef.SETTABLE(RecordVariant);
  EXIT(TRUE);
END;

PROCEDURE ValidateOptionalField@26(VAR RecordVariant@1000 : Variant;FieldName@1001 : Text;FieldValue@1002 : Variant) : Boolean;
VAR
  RecRef@1004 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT GetRecordRef(RecordVariant,RecRef) THEN EXIT;
  IF NOT FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT;
  FldRef.VALIDATE(FieldValue);
  RecRef.SETTABLE(RecordVariant);
  EXIT(TRUE);
END;

To use these functions we first need to copy our record to a variant variable and then back to the record after the function completes.

RecordVariant := Customer;
PopulateOptionalField(RecordVariant,'Registration No.','1102713369');
Customer := RecordVariant;

Or

RecordVariant := Customer;
ValidateOptionalField(RecordVariant,'Registration No.','1102713369');
Customer := RecordVariant;

I have requested Microsoft to add more generic functions to Codeunit 701, “Data Type Management”.  I trust that they will deliver as usual.

That NAV Codeunit is one of my favorite ones delivered by Microsoft.

REST Web Services using Json and requiring authentication

But first…

Registration for NAV TechDays 2017 has opened.  I will do a workshop on web services and Json.  I will be using both C/AL and AL with VS Code in this workshop.

Make sure to register for the conference and if possible go to one or two of the workshops.

Now to the topic.  Yesterday I started to develop an integration solution for bokun.io.  Their API is RESTful and uses Json file formats.  It also requires authentication.

In a project like this I usually start by using the OCR Service Setup from standard NAV as a template: create a setup table and a page.
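Below is a minimal sketch of such a setup table in exported object format.  The object number (60200) and the field names are placeholders I picked for this post; in a real solution you would also consider storing the Secret Key encrypted, like the OCR Service Setup does with its password.

OBJECT Table 60200 Bokun.is Setup
{
  OBJECT-PROPERTIES
  {
    Date=;
    Time=;
    Version List=;
  }
  PROPERTIES
  {
    CaptionML=ENU=Bokun.is Setup;
  }
  FIELDS
  {
    { 1   ;   ;Primary Key         ;Code10         }
    { 2   ;   ;Service URL         ;Text250        }
    { 3   ;   ;Access Key          ;Text80         }
    { 4   ;   ;Secret Key          ;Text80         }
    { 5   ;   ;Enabled             ;Boolean        }
  }
  KEYS
  {
    {    ;Primary Key                             ;Clustered=Yes }
  }
  CODE
  {
    BEGIN
    END.
  }
}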

Looking at the API documentation we can see that we need to use HmacSHA1 with both the Access Key and the Secret Key to authenticate.  In another project I used HmacSHA256 with the Access Key for the Azure API.

The first part of the authentication is the time stamp, created in UTC.  I find it easy to use the DateTime DotNet variable for this.  There are two different formats I needed to use.
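A minimal sketch of that, assuming one API wants a plain 'yyyy-MM-dd HH:mm:ss' time stamp and the other an RFC1123 time stamp; the exact format strings must of course come from each API's documentation.

LOCAL PROCEDURE GetUtcTimeStamps@50(VAR TimeStampText@1001 : Text;VAR Rfc1123TimeStampText@1002 : Text);
VAR
  UtcNow@1000 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.DateTime";
BEGIN
  // The time stamp must be created in UTC
  UtcNow := UtcNow.UtcNow;
  // One format is a plain date-time stamp ...
  TimeStampText := UtcNow.ToString('yyyy-MM-dd HH:mm:ss');
  // ... the other is the RFC1123 ("R") format
  Rfc1123TimeStampText := UtcNow.ToString('R');
END;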

REST services normally just use the GET or POST HTTP methods.  The authentication is usually placed in the request headers.  Here is an example for bokun.is.
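The sketch below shows how such a request could be built.  This is a simplified sketch rather than the actual solution code: the header names (X-Bokun-Date, X-Bokun-AccessKey, X-Bokun-Signature), the string that gets signed and the setup fields come from my reading of the bokun.is API documentation and from the setup table sketch above, so verify them against the current documentation.

LOCAL PROCEDURE CreateBokunRequest@51(HttpMethod@1000 : Text;Path@1001 : Text;VAR HttpWebRequest@1002 : DotNet "'System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Net.HttpWebRequest");
VAR
  BokunSetup@1003 : Record 60200;
  TimeStampText@1004 : Text;
  Rfc1123TimeStampText@1005 : Text;
BEGIN
  BokunSetup.GET;
  // Only the plain time stamp is used for bokun.is
  GetUtcTimeStamps(TimeStampText,Rfc1123TimeStampText);
  HttpWebRequest := HttpWebRequest.Create(BokunSetup."Service URL" + Path);
  HttpWebRequest.Method := HttpMethod;
  HttpWebRequest.ContentType := 'application/json;charset=utf-8';
  // The authentication goes into custom request headers
  HttpWebRequest.Headers.Add('X-Bokun-Date',TimeStampText);
  HttpWebRequest.Headers.Add('X-Bokun-AccessKey',BokunSetup."Access Key");
  HttpWebRequest.Headers.Add('X-Bokun-Signature',
    GetSignature(BokunSetup."Secret Key",TimeStampText + BokunSetup."Access Key" + HttpMethod + Path));
END;

GetUtcTimeStamps is the time stamp function above and GetSignature is described next.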

The GetSignature function creates the HmacSHA1 signature.

The Secret Key string and the signature text are converted to byte arrays.  The Crypto class is constructed with the Secret Key byte array and used to compute a hash of the signature byte array.  That hash is also a byte array, which must be converted to a base64 string.  This gives you the HmacSHA1 signature to use in the request header.
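A minimal sketch of that function, using the standard DotNet classes from mscorlib:

LOCAL PROCEDURE GetSignature@52(SecretKey@1000 : Text;SignatureText@1001 : Text) Signature : Text;
VAR
  Encoding@1002 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Text.Encoding";
  Crypto@1003 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Security.Cryptography.HMACSHA1";
  Convert@1004 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Convert";
  SecretKeyByteArray@1005 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Array";
  SignatureByteArray@1006 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Array";
  HashByteArray@1007 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Array";
BEGIN
  // The Secret Key string and the signature text are converted to byte arrays
  SecretKeyByteArray := Encoding.UTF8.GetBytes(SecretKey);
  SignatureByteArray := Encoding.UTF8.GetBytes(SignatureText);
  // The Crypto class is constructed with the Secret Key byte array ...
  Crypto := Crypto.HMACSHA1(SecretKeyByteArray);
  // ... and used to compute a hash of the signature byte array
  HashByteArray := Crypto.ComputeHash(SignatureByteArray);
  // The hash converted to a base64 string is the signature for the request header
  Signature := Convert.ToBase64String(HashByteArray);
END;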

My Azure project is using HmacSHA256 but the code is similar.

Azure displays the Access Keys in base64 format while bokun.is has a normal string.
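As a sketch, the Azure variant only differs in the crypto class and in how the key bytes are obtained; the base64 Access Key is decoded with Convert.FromBase64String instead of being converted from a plain string.

LOCAL PROCEDURE GetAzureSignature@53(AccessKey@1000 : Text;SignatureText@1001 : Text) Signature : Text;
VAR
  Encoding@1002 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Text.Encoding";
  Crypto@1003 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Security.Cryptography.HMACSHA256";
  Convert@1004 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Convert";
  HashByteArray@1005 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Array";
BEGIN
  // Azure displays the Access Key in base64, so it is decoded to get the key byte array
  Crypto := Crypto.HMACSHA256(Convert.FromBase64String(AccessKey));
  HashByteArray := Crypto.ComputeHash(Encoding.UTF8.GetBytes(SignatureText));
  Signature := Convert.ToBase64String(HashByteArray);
END;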

A little further down the line I chose not to use XML Ports, like I did here, but still convert Json to Xml or Xml to Json.

I use the functions from Codeunit “XML DOM Management” to handle the Xml.  This code should give you the general idea.

OBJECT Codeunit 60201 Bokun.is Data Management
{
  OBJECT-PROPERTIES
  {
    Date=;
    Time=;
    Version List=;
  }
  PROPERTIES
  {
    OnRun=BEGIN
          END;

  }
  CODE
  {
    VAR
      XMLDOMMgt@60200 : Codeunit 6224;

    PROCEDURE ReadCurrencies@1(ResponseString@10035985 : Text;VAR CurrencyBuffer@10035988 : TEMPORARY Record 60201);
    VAR
      XmlDocument@60202 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlDocument";
      ResultXMLNodeList@60201 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlNodeList";
      ResultXMLNode@60200 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlNode";
    BEGIN
      XmlDocument := XmlDocument.XmlDocument;
      XmlDocument.LoadXml(JsonToXml('{"Currency":' + ResponseString + '}'));
      XMLDOMMgt.FindNodes(XmlDocument.DocumentElement,'Currency',ResultXMLNodeList);
      FOREACH ResultXMLNode IN ResultXMLNodeList DO
        ReadCurrency(ResultXMLNode,CurrencyBuffer);
    END;

    LOCAL PROCEDURE ReadCurrency@60205(ResultXMLNode@60201 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlNode";VAR CurrencyBuffer@60200 : TEMPORARY Record 60201);
    BEGIN
      WITH CurrencyBuffer DO BEGIN
        INIT;
        Code := XMLDOMMgt.FindNodeText(ResultXMLNode,'code');
        "Currency Factor" := ToDecimal(XMLDOMMgt.FindNodeText(ResultXMLNode,'rate'));
        Payment := ToBoolean(XMLDOMMgt.FindNodeText(ResultXMLNode,'payment'));
        INSERT;
      END;
    END;

    PROCEDURE ReadActivities@60201(ResponseString@10035985 : Text);
    VAR
      XmlDocument@60202 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlDocument";
      ResultXMLNodeList@60201 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlNodeList";
      ResultXMLNode@60200 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlNode";
    BEGIN
      XmlDocument := XmlDocument.XmlDocument;
      XmlDocument.LoadXml(JsonToXml(ResponseString));
    END;

    PROCEDURE GetActivityRequestJson@10035986(NoOfParticipants@60200 : Integer;StartDate@60201 : Date;EndDate@60202 : Date) Json : Text;
    VAR
      XmlDocument@10035987 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlDocument";
      CreatedXMLNode@10035988 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlNode";
    BEGIN
      XmlDocument := XmlDocument.XmlDocument;
      XMLDOMMgt.AddRootElement(XmlDocument,GetDocumentElementName,CreatedXMLNode);
      IF NoOfParticipants <> 0 THEN
        XMLDOMMgt.AddNode(CreatedXMLNode,'participants',FORMAT(NoOfParticipants,0,9));
      IF StartDate <> 0D THEN
        XMLDOMMgt.AddNode(CreatedXMLNode,'startDate',FORMAT(StartDate,0,9));
      IF EndDate <> 0D THEN
        XMLDOMMgt.AddNode(CreatedXMLNode,'endDate',FORMAT(EndDate,0,9));
      Json := XmlToJson(XmlDocument.OuterXml);
    END;

    PROCEDURE XmlToJson@94(Xml@10035985 : Text) Json : Text;
    VAR
      JsonConvert@10017292 : DotNet "'Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'.Newtonsoft.Json.JsonConvert";
      JsonFormatting@10017296 : DotNet "'Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'.Newtonsoft.Json.Formatting";
      XmlDocument@10017291 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlDocument";
    BEGIN
      XmlDocument := XmlDocument.XmlDocument;
      XmlDocument.LoadXml(Xml);
      Json := JsonConvert.SerializeXmlNode(XmlDocument.DocumentElement,JsonFormatting.Indented,TRUE);
    END;

    PROCEDURE JsonToXml@95(Json@10035985 : Text) Xml : Text;
    VAR
      JsonConvert@10017293 : DotNet "'Newtonsoft.Json, Version=6.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed'.Newtonsoft.Json.JsonConvert";
      XmlDocument@10017291 : DotNet "'System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlDocument";
    BEGIN
      XmlDocument := JsonConvert.DeserializeXmlNode(Json,GetDocumentElementName);
      Xml := XmlDocument.OuterXml;
    END;

    LOCAL PROCEDURE GetDocumentElementName@97() : Text;
    BEGIN
      EXIT('Bokun.is');
    END;

    LOCAL PROCEDURE ToDecimal@98(InnerText@10035985 : Text) Result : Decimal;
    BEGIN
      IF NOT EVALUATE(Result,InnerText,9) THEN EXIT(0);
    END;

    LOCAL PROCEDURE ToInteger@92(InnerText@10035985 : Text) Result : Integer;
    BEGIN
      IF NOT EVALUATE(Result,InnerText,9) THEN EXIT(0);
    END;

    LOCAL PROCEDURE ToBoolean@91(InnerText@10035985 : Text) Result : Boolean;
    BEGIN
      IF NOT EVALUATE(Result,InnerText,9) THEN EXIT(FALSE);
    END;

    LOCAL PROCEDURE ToDate@93(InnerText@10035985 : Text) Result : Date;
    BEGIN
      IF NOT EVALUATE(Result,COPYSTR(InnerText,1,10),9) THEN EXIT(0D);
    END;

    LOCAL PROCEDURE ToDateTime@99(InnerText@10035985 : Text) Result : DateTime;
    BEGIN
      IF NOT EVALUATE(Result,InnerText,9) THEN EXIT(0DT);
    END;

    BEGIN
    END.
  }
}

 

 

Building Assisted Setup for Dynamics 365 for Financials

I had my Assisted Setup wizard up and running on NAV 2017.  Everything looked fine but when the extension was being validated nothing worked.

So, there is a difference between NAV 2017 and Dynamics 365 for Financials.

Remember the session on Design Patterns in NAV 2017 at NAV TechDays 2016?  Microsoft showed what they were planning with regard to assisted setup and manual setup.  This has been implemented in Dynamics 365 for Financials but has not been released for NAV 2017.

One piece of feedback Microsoft got from us MVPs was about the Assisted Setup not using the discovery pattern (you will know what I am talking about after watching the session above).  The Assisted Setup table (1803) in NAV 2017 is used to register all assisted setup pages.  The problem was that the record an extension created in this table was not removed during uninstall.

Now we have a new table, Aggregated Assisted Setup (1808), which is a temporary table using the discovery pattern.  We also have a new discovery pattern for the Manual Setup with another new table, Business Setup (1875).  You can download these new tables from here (NewD365SetupTables) or wait for them to be released in one of the upcoming NAV 2017 CU releases.

Here you can also download the guidelines for the new setup pattern (AssistedSetupGuidelines).  My code looks like this.

The Icon file that I created is 240x240px with foreground (RGB 55 55 55) and background (RGB 250 250 235).

More to come, stay tuned…

My first Dynamics 365 Extension – Approved for publishing

Yes!  I have passed all validation steps and Microsoft will publish my app soon.

These are my marketing validation results.

Marketing Validation_Objects4NAV – GL Source Names, 3.2.2017

Remember to look for this image in AppSource and try out my Extension.

As I promised, all the source code is now available on GitHub.

https://github.com/gunnargestsson/nav2017/tree/GLSourceNames

This concludes my blog series on “My first Dynamics 365 Extension”.  Stay tuned for more information on how to design and publish your extension.  I will have more to share in the coming weeks and months.