Updates to my Object Renumbering Tool

At the end of 2014 I published a renumbering tool for NAV objects.  Using DotNet I was able to increase the renumbering speed for text object files dramatically.

Since then I have been asked if I could upgrade it to work with IDs and Field Numbers.

Now I have.

What’s more, it is also on GitHub.

The process functions are the following:

  • Read Object Lines – Creates renumbering lines based on the objects in the selected object file.
  • Suggest IDs – Suggests new object numbers in the range from 50,000 based on the objects available in the current license.
  • Read from Excel – Reads object renumbering lines from an Excel sheet created with the Write to Excel process.
  • Write to Excel – Writes the current renumbering lines to a new Excel sheet to be managed within Excel and reread into the renumbering lines.
  • Renumber Using Lines – Prompts for a file to read and a new file to save with objects renumbered according to the rules in the renumbering lines.
  • Renumber Using Controls – Prompts for a file to read and a new file to save with objects renumbered according to the rules in the control IDs setup.

I have done some fixes to the renumbering function and have added support for the EventSubscriber.

Go to GitHub to download Page and Table 50000, try it out and submit improvements.

When I am processing an object file I have it open in my text editor.  When I see something to renumber, I update the control ranges and execute the renumbering process, reading from and writing to the same object file.  My editor will reload the file and I can see the results immediately.

 

NAV Http Web Request

In my post about Json and Rest web services I showed how to use the standard Codeunit no. 1297 for web service communication.

Today I was asked to do this in NAV 2015.  I must admit, I forgot that this Codeunit was not available in NAV 2015.

So I made one.

This one has identical functionality to the one delivered with NAV 2016.  To catch and handle the errors I use the NAV Web Request Add-in that I created and published here on my blog.

Now I can easily move that Json code down to NAV 2015.

Download here –> COD1297-NAV2015

My first Dynamics 365 Extension – step by step – seventh step

Help and notification.

To make one thing clear: the help we are used to building for the help server is not yet available for Extensions.  Therefore we must make sure that all the help we anticipate the user will need is available from the product.

Microsoft has added tool tips to most of the fields in the application.  To make sure you follow the requirements, create tool tips and make sure the Application Area property is populated.
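In an exported page object that means each field carries the ToolTipML and ApplicationArea properties, roughly like this (a sketch – the field name and tool tip text are placeholders):

```
{ 2   ;2   ;Field     ;
            ToolTipML=ENU=Specifies the source name of the general ledger entry.;
            ApplicationArea=#Basic,#Suite;
            SourceExpr="Source Name" }
```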

In the previous post about the installation process you can see that a link to the Extension help must be provided.  I did short videos and posted them to YouTube.

With NAV 2017 and Dynamics 365 for Financials Microsoft released a new notification framework.  It is important to use this framework in your Extension.  For example, when a user opens the General Ledger Entries after the G/L Source Names Extension installation, a notification will appear.

And when the user reacts to the notification a setup video will start inside the web client.

The notification disappears and will not be displayed again for this user.

If the administrator has done the Assisted Setup and the user has the required permissions we will show a different notification.

and that notification will play the usage video.

Now, let’s look at how to do this.

There is a new data type for Notification.  You define a variable of type Notification, set the properties and send it on its way.

I start by catching the event when the General Ledger Entries page is opened.  Then, depending on the permissions the user has to the G/L Source Names table, I select between two notifications.

You can have up to three actions added to each notification.  An action must point to a public function in a Codeunit.  That function must have a single parameter of type Notification.
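A minimal sketch of such a notification in C/AL – the Guid, the texts and the Codeunit and function names here are placeholders, not the actual Extension code:

```
LOCAL PROCEDURE SendSetupNotification();
VAR
  SetupNotification : Notification;
BEGIN
  // Reuse the same Guid every time - the notification history is stored by this ID
  SetupNotification.ID := 'd9e65b29-9e8c-4a9f-bf7a-0e8f1f0a9f11'; // placeholder Guid
  SetupNotification.MESSAGE := 'G/L Source Names is not set up yet.';
  SetupNotification.SCOPE := NOTIFICATIONSCOPE::LocalScope; // Global Scope is not yet supported
  // An action must point to a public function in a Codeunit
  SetupNotification.ADDACTION('Watch setup video', CODEUNIT::"Source Names Notification", 'PlaySetupVideo');
  SetupNotification.SEND;
END;

// The action handler takes a single Notification parameter
PROCEDURE PlaySetupVideo(SetupNotification : Notification);
BEGIN
  HYPERLINK(GetSetupVideoUrl); // placeholder helper returning the video URL
END;
```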

The notification ID is a Guid.  Refer to my last post on how to get a new Guid, and keep that Guid for the notification.  The history of the notification – which user has acted on it – is saved by this ID.  Change it and the history will be lost.  Today, all notifications must have Local Scope; Global Scope is not yet supported by the clients.

In the action for both these notifications I start a YouTube video for the user.  If the user is running web client, including phone client and tablet client, the video will be started inside the client.  For other client types I will start the video in the default browser.

That concludes my development.  Next part is to submit my Extension to AppSource.  Stay tuned…


My first Dynamics 365 Extension – step by step – sixth step

Assisted Setup and Permissions.

But first, let’s look at the Extension Management Codeunit.

I want to store the appId in my Extension.  This appId is used in several tables in my tenant.  This appId must always be the same for this solution, even if the solution is updated.  In my last post you can see that I am also setting my appId parameter in the Git repository.  There are a few easy ways to get your appId.  NAV has the function CREATEGUID.  PowerShell has the function New-Guid.  But perhaps the easiest way is to use this online Guid generator.

In OnNavAppUpgradePerDatabase – executed once when the Extension is installed – I want to assign my Extension Setup permission set to the installing user.  This is required to be able to run the Assisted Setup after installation.

In OnNavAppUpgradePerCompany – executed once in every company when the Extension is installed – I want to restore data for some of the Extension tables and delete the rest of the archived data.  Even though I am just using the tables as temporary tables, I still need to define what to do with the non-existing archived data.  In here I also want to remove the persistent Assisted Setup record in every company.

So, why would I like to remove the Assisted Setup record?  Not because I like to make the user execute the setup every time the Extension is updated.  The Extension has its own Setup table.  In that Setup table I store the setup information belonging to this Extension and the Setup data is restored from archive during the installation as you can see above.

In the current release the Assisted Setup record is a database record.  I know that Microsoft intends to change this and use the same discovery pattern that is used in Service Connections.  When that happens the Assisted Setup record will be temporary only.  So, by designing my Extension in this way I make sure that I will still support Assisted Setup after this change.

In NAV 2016 we had the Mini Role Center, page 9022.  Today this Role Center has been updated to fit Dynamics 365 for Financials.

From here the user can access the Extension installation we covered in the last post, and we can also access the Assisted Setup & Tasks.

I add my Assisted Setup by subscribing to the OnOpenPage Event in the above page.

Looking at the Assisted Setup record we can see that it has a media field for an Icon.  NAV setup data includes icons for the standard Assisted Setup items – so must we.

I decided to ship my icons with the code.  Each icon has a dedicated Codeunit.

To fit an icon into a Codeunit we need to convert the binary data to base64 data.  This we can do with the following code.
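A sketch of that conversion, using the ToBase64String and FromBase64String functions on the TempBlob table (available since NAV 2016) and assuming the Icon Media field on the Assisted Setup table:

```
PROCEDURE FileToBase64(FileName : Text) : Text;
VAR
  TempBlob : TEMPORARY Record TempBlob;
  FileManagement : Codeunit "File Management";
BEGIN
  // Read the binary icon file into a BLOB and return it as base64 text
  FileManagement.BLOBImportFromServerFile(TempBlob, FileName);
  EXIT(TempBlob.ToBase64String);
END;

PROCEDURE Base64ToIcon(Base64Data : Text; VAR AssistedSetup : Record "Assisted Setup");
VAR
  TempBlob : TEMPORARY Record TempBlob;
  IconInStream : InStream;
BEGIN
  // Decode the base64 text back to binary and import it into the Media field
  TempBlob.FromBase64String(Base64Data);
  TempBlob.Blob.CREATEINSTREAM(IconInStream);
  AssistedSetup.Icon.IMPORTSTREAM(IconInStream, AssistedSetup.Name);
  AssistedSetup.MODIFY;
END;
```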

This allows me to create a Codeunit like this.

And then code like this to import the icon into my help resources.

Now, back to the Assisted Setup.  I start the setup for my Extension.

This sums up what needs to be done.  The basic setup, the one done with Set Defaults, is to assign permissions to all users based on their current permissions to the tables we build our extension around.

The G/L Source Name lookup table is read from a FlowField in G/L Entries.  We can therefore expect that everyone who has read access to G/L Entries must also have read access to the lookup table.  The data in the lookup table is updated whenever any of the four master tables is updated.  Hence, everyone who has access to modify any of the master tables must have access to modify the lookup table.

The Set Defaults function assigns the Extension permission sets to users and user groups based on the current permissions and that should be enough in most cases.  If the user likes the advanced way that possibility is available.

When the user presses Finish the wizard data and the assigned permission sets are applied to the tenant database.

When I created the Setup Wizard page I started with a copy of the Email Setup Wizard, page 1805.

The model I have created to assign permission sets to users and user groups with a Wizard can easily be modified to fit any Extension.  Remember that all this code will soon be available on my GitHub account.

 

Using REST/Json web services from NAV

One of my most popular blog entries is the one about Json.  I have also had some questions outside this website about this topic.

This week I got a task.  We need to communicate with a payment service that uses REST web services and Json file format.

posapi

I got a document describing the service.  Some methods use GET and some use POST.  Here is how I did this.

In the heart of it all I use Codeunit 1297, “Http Web Request Mgt.”.

getaccesstoken

Every time we talk to this POS API we send an Access Token.  If we don’t have the token in memory (single instance Codeunit), we need to get a new one.  That is what the above code does.
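From memory, the NAV 2016 shape of Codeunit 1297 makes such a call look roughly like this – a sketch, with a placeholder endpoint URL and the TempBlob records assumed temporary:

```
LOCAL PROCEDURE PostRequest(VAR RequestTempBlob : Record TempBlob; VAR ResponseTempBlob : Record TempBlob);
VAR
  HttpWebRequestMgt : Codeunit "Http Web Request Mgt.";
  HttpStatusCode : DotNet HttpStatusCode;
  ResponseHeaders : DotNet NameValueCollection;
  ResponseInStream : InStream;
  ResponseOutStream : OutStream;
BEGIN
  HttpWebRequestMgt.Initialize('https://example.com/posApi/token'); // placeholder URL
  HttpWebRequestMgt.SetMethod('POST');
  HttpWebRequestMgt.SetContentType('application/json');
  HttpWebRequestMgt.SetReturnType('application/json');
  HttpWebRequestMgt.AddBodyBlob(RequestTempBlob);
  HttpWebRequestMgt.GetResponse(ResponseInStream, HttpStatusCode, ResponseHeaders);
  // Copy the response into a TempBlob for further processing
  ResponseTempBlob.Blob.CREATEOUTSTREAM(ResponseOutStream);
  COPYSTREAM(ResponseOutStream, ResponseInStream);
END;
```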

The ParameterMgt Codeunit is what I want to focus on.  You can see that I start by inserting my “Authorization Key” into the RequestBodyBlob.  As usual, I use the TempBlob.Blob to get and set my unstructured data.

setapirequest

The interesting part here is that I use an XMLPort to create the data I need to post to the Api.

apiauthenticatexml

A simple one in this example, but nothing says it can’t be complex.  Then I convert the Xml to Json with a single function.

converttojson

The last TRUE parameter means that the Document Element will be skipped and the Json will look like it is supposed to.
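A conversion like this is typically done with the Newtonsoft Json.NET library.  A sketch of both directions, assuming DotNet variables for Newtonsoft.Json.JsonConvert and Newtonsoft.Json.Formatting:

```
PROCEDURE ConvertXmlToJson(XmlDocument : DotNet XmlDocument; SkipDocumentElement : Boolean) : Text;
VAR
  JsonConvert : DotNet JsonConvert;    // Newtonsoft.Json.JsonConvert
  JsonFormatting : DotNet Formatting;  // Newtonsoft.Json.Formatting
BEGIN
  // The last parameter (omitRootObject) skips the Document Element
  EXIT(JsonConvert.SerializeXmlNode(XmlDocument, JsonFormatting.Indented, SkipDocumentElement));
END;

PROCEDURE ConvertJsonToXml(Json : Text; DocumentElementName : Text; VAR XmlDocument : DotNet XmlDocument);
VAR
  JsonConvert : DotNet JsonConvert;
BEGIN
  // DocumentElementName becomes the root node of the resulting Xml
  XmlDocument := JsonConvert.DeserializeXmlNode(Json, DocumentElementName);
END;
```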

apikey

The REST service response is Json.

token

And to read the Json response we take a look at the GetAccessToken function.

getaccesstokenfunction

Here I start by converting from Json to Xml.

convertfromjson

And make sure my Document Element name is “posApi”.

apiaccesstokenxml

And I have the result.

As you can see from the documentation some of the Json data is more complex.  This method will work nevertheless.

For more complex data I always create tables that match the Json structure.  These tables I use as temporary tables through the whole process, so they don’t need to be licensed tables.  Here is an example where this XMLPORT

getauthorization

will read this Json

getauthorizationjson

I suggest that with our current NAV this is the easiest way to handle REST web services and Json.

 

NAVTechDays 2016

Now, ten days after a successful NAV TechDays 2016, it is time to write a short recap.

NAV TechDays 2016

As usual I met a lot of old friends and managed to gain new ones.  Had a great time.  I hosted workshops on extending the Data Exchange Framework on Tuesday and Wednesday.  It looks like I am the only one in my MVP group to have dug deep into that functionality.  Huge thanks to the people who attended.  I always learn a lot from you guys.

On Thursday morning I had my first NAV TechDays session.  I talked about migrating to events.  I had a blast!

Luc has the video on Mibuso, so if you missed it or just want to relive the moment, go ahead and watch or download.

You can also watch the slides right here.  I have notes on most of my slides to better explain what I was thinking.

On Friday morning Soren presented Source Control.  There he talked about an initiative that Kamil and I started – to make Source Control available for every NAV developer.  Soren jumped on board and we can expect more to join.  On that note, I have created my GitHub account where everyone can download the stuff that I publish.

I have published the presentations and objects to GitHub for everyone to play with.  What you download and do with it is your own responsibility.

p.s. the NAV TechDays photo album is right here to browse through.

Sharing data with multiple tenants

I am upgrading multiple companies to NAV 2016.  I really like the multi-tenant setup and use it in most cases.

In NAV we have the option to make a table common with all companies.

NoDataPerCompany

This option has been available for all the versions of NAV that I can remember.

Using a multi-tenant setup means that you have a dedicated database for each tenant, and normally only one company per tenant.  That makes this option completely useless.

I was running this same multi tenant setup in NAV 2013 R2 and there I solved this issue by modifying the table to be a linked table.

LinkedObject

To successfully setup a linked table I need to manually make sure that the table or view with the correct name and the correct layout is present in every tenant database.  That is a job for the SQL Server Management Studio (SSMS) and can’t be done within NAV.  Doing this also makes upgrades more difficult and can get in the way of a normal synchronization of metadata and tables.

Moving up to NAV 2016 I wanted to get out of this model and use the External SQL methods now available.

ExternalTable

With these properties we can select the table or view name as the ExternalName and the table or view schema as the ExternalSchema.  For all fields in the table we can define an ExternalName.  If that is not defined the normal NAV field name will be used.

FieldNames

This option basically opens the door from the NAV Server to any SQL table.  So, how do we get this to work?

I will show you how I moved from the Linked Table method to the External SQL method.  If you take another look at the properties available for an External SQL table you will see that the DataPerCompany property is not available.  So, an External SQL table is just a table definition for NAV to use, and with C/AL code you can define where to find the external table.  This gives you the flexibility to have the same table for all companies and all tenants, or to select different tables by tenant and/or company.

In Iceland we have a national registry.  That registry holds the registration details for every person and every company.  Some companies buy access to the data from the national registry, keep a local copy and are allowed to do lookups from this data.  Since the data in this table is updated centrally, but every company in every tenant wants access to it, this is a good candidate for the External SQL table method.

I already had the table defined in NAV with needed data.  I was able to find that table with SSMS.

OriginalTable

By using this table I did not have to worry about the ExternalName for each column in my table definition since it already matched the NAV field names.

I found my application database and used the script engine in SSMS to script the database creation.  I updated the database name to create a central database for my centralized data.  I chose this method to make sure that the new database has the same collation as the NAV application database.

I scripted the National Register table creation and created the table in my centralized database.  Then I combined INSERT INTO and SELECT FROM scripts to insert the data into my centralized table.

Finally I made sure that the service user running the NAV service had access to my centralized database.  By doing this I can use a trusted connection between the NAV server and the SQL server.

Moving on to the NAV development environment and into the table properties.

NationalRegisterExternalSQL

The ExternalName and ExternalSchema must match the table I created.  Look at the picture from the SSMS to see “FROM [dbo].[National Register]”.  There you can pick up what you need to specify in these properties.

When these changes are synchronized to my database NAV will remove the previous National Register table from the NAV database.  That requires a synchronization with force so be careful.

The actual connection to the centralized database must be done in C/AL.  More information is on this MSDN article.

To complete this solution I use several patterns.

I need setup data to point me to the centralized database.  I like to look at this as an external service, so I link the setup to the Service Connections; Waldo calls this the Discovery Event Pattern.  I create the following function in a Codeunit to register the service.

RegisterConnection

So, if the user has write access to the National Register Setup he will see this service listed in the service connections.
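In outline, the registration function is an event subscriber on the Service Connection table (1261).  A sketch – the setup table and page names below are placeholders for the Extension’s own objects:

```
[EventSubscriber(Table,1261,OnRegisterServiceConnection)]
LOCAL PROCEDURE HandleRegisterServiceConnection(VAR ServiceConnection : Record "Service Connection");
VAR
  NationalRegisterSetup : Record "National Register Setup"; // placeholder setup table
  RecRef : RecordRef;
BEGIN
  IF NOT NationalRegisterSetup.GET THEN
    NationalRegisterSetup.INIT;
  RecRef.GETTABLE(NationalRegisterSetup);
  // The Enabled field on the setup record drives the service connection status
  IF NationalRegisterSetup.Enabled THEN
    ServiceConnection.Status := ServiceConnection.Status::Enabled
  ELSE
    ServiceConnection.Status := ServiceConnection.Status::Disabled;
  ServiceConnection.InsertServiceConnection(
    ServiceConnection, RecRef.RECORDID, NationalRegisterSetup.TABLECAPTION, '', PAGE::"National Register Setup");
END;
```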

The link to an external database might require me to save a user name and a password.  To successfully do this I apply another pattern for password encryption.  I normally point people to the OCR service setup table and page to find out how to implement these password patterns.

I like to have the Enabled field on my setup tables.  When the service is enabled the user can’t modify the setup data and when trying to enable the service the setup data is verified.  Fields on the setup page are protected by using the EditableByNotEnabled variable.

EditableByNotEnabled

I don’t think you will find a pattern for this method, but in other details the setup table follows the Singleton pattern.

NRSetup2

When the user tries to enable the service I do a series of error tests.  The error testing is done with a combination of the Error Message pattern and the TryFunction pattern.

TestSetup

Line 21 does the actual connection test with a TryFunction.

Now, how to connect to the centralized data?

In my setup table I store the database server name and the database name within that server.  With this data I create the connection string.

RegisterUserConnection

The table connection must have a unique id.  I like to create a function to return variables that are linked to the functionality – not using the text constants.

GetConnectionName

This combines what I need to do.  With the correct connection string, C/AL registers the connection to my centralized database and sets that connection as the default connection for the user.  When NAV needs to use the data from the National Register, C/AL must register the connection.
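Put together, the registration can be sketched like this – the setup table and its server and database name fields are placeholders:

```
PROCEDURE GetConnectionName() : Text;
BEGIN
  // One fixed, unique id for the table connection - returned by a function, not a text constant
  EXIT('NationalRegister');
END;

PROCEDURE RegisterUserConnection();
VAR
  NationalRegisterSetup : Record "National Register Setup"; // placeholder setup table
  ConnectionString : Text;
BEGIN
  NationalRegisterSetup.GET;
  // Trusted connection - the NAV service user must have access to the centralized database
  ConnectionString := STRSUBSTNO(
    'Data Source=%1;Initial Catalog=%2;Integrated Security=SSPI;',
    NationalRegisterSetup."Server Name", NationalRegisterSetup."Database Name");
  IF NOT HASTABLECONNECTION(TABLECONNECTIONTYPE::ExternalSQL, GetConnectionName) THEN
    REGISTERTABLECONNECTION(TABLECONNECTIONTYPE::ExternalSQL, GetConnectionName, ConnectionString);
  // Make this the default connection so the External SQL table reads from it
  SETDEFAULTTABLECONNECTION(TABLECONNECTIONTYPE::ExternalSQL, GetConnectionName);
END;
```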

CheckOrRegister

I add a call to this Codeunit from every page and every function that uses the National Register.

PageInit

Now back to my TryFunction, I can simply check if I can do a FINDFIRST without an error.

TryLookup
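The test can be sketched as a TryFunction that does nothing but a FINDFIRST:

```
[TryFunction]
LOCAL PROCEDURE TryLookupNationalRegister();
VAR
  NationalRegister : Record "National Register";
BEGIN
  // Any failure to reach the external table makes the TryFunction return FALSE
  NationalRegister.FINDFIRST;
END;
```

The caller then simply checks `IF NOT TryLookupNationalRegister THEN` and routes GETLASTERRORTEXT into the Error Message handling.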


Using the new FilterPage in NAV 2016

I was a little surprised to not find any information online on the new FilterPage type in Dynamics NAV 2016.

As a part of the new Workflow feature Microsoft built a new generic feature to ask the user for a filter on any record.

Workflow

Pressing the Assist-Edit button will open the Dynamic Filter Page.

DynamicFilterPage

This view is the same view a NAV user is familiar with when starting reports and batches.

Now, let’s see how to use this new feature.  The best way to show it is usually with an example.

Go to the Chart of Accounts.  Then from the ribbon select G/L Balance by Dimension.  Select a setup similar to the screenshot below and press Show Matrix on the ribbon.

GLByDimension

Now you are in a page where you can’t filter anything.  You will see all G/L Accounts within the G/L Account Filter selected earlier and all Accounting Periods in columns according to the Matrix Options.  Yes, you have all the normal filter options on the page, but none of them work.

OriginalMatrix

So let’s see how to use the Dynamic FilterPage to give the user a better experience of this feature.

The first challenge: I want a single-month comparison in the columns.  Let’s compare amounts for January by year.

To do this we need to make a few modifications to Page 408.

Add the global text variable PeriodTableView.

Page408AddNewGlobal

When the user changes what to show as columns we need to make sure that the PeriodTableView is empty.

Page408ClearPeriodTableView

When the column captions are generated the new PeriodTableView should be used.

Page408AddSetView

The same changes need to be applied to the NextRec function.

Two new functions need to be added to ask the user for the filter.

Page408NewFunctions
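The core of those functions is the new FilterPageBuilder data type.  A sketch, using the PeriodTableView global added earlier (the function name is a placeholder):

```
LOCAL PROCEDURE SetColumnFilter();
VAR
  FilterPage : FilterPageBuilder;
  AccountingPeriod : Record "Accounting Period";
BEGIN
  // Add the record to filter on and restore the previously selected view
  FilterPage.ADDTABLE(AccountingPeriod.TABLECAPTION, DATABASE::"Accounting Period");
  IF PeriodTableView <> '' THEN
    FilterPage.SETVIEW(AccountingPeriod.TABLECAPTION, PeriodTableView);
  // RUNMODAL opens the Dynamic Filter Page and returns TRUE when the user accepts
  IF FilterPage.RUNMODAL THEN
    PeriodTableView := FilterPage.GETVIEW(AccountingPeriod.TABLECAPTION);
END;
```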

And finally, get these functions available for the user.

CallingPageView

The result is that the user can now press the Assist-Edit button and enter a filter for every column option.

Page408AccountingPeriod

To attain our goal, let’s filter on the month we want to see.

FilterOnJanuary

And the result Matrix looks like this.

MatrixForJanuary

We could add a filter page to the Matrix Page to be able to filter on the G/L Accounts using the same methods, and we could add functionality to filter on the lines similar to what we did for the columns, but I am not going through that now.

The modified Page 408 is attached.  Good luck.

Page408

 

Presenting the Data Exchange Framework

Earlier this month Arend-Jan contacted me about being a presenter at the Dutch Dynamics Community NAV Event in March 2016.  Of course I was honored, and after a moment’s thought I accepted.

The following presentation was given twice, to close to 140 people in total.  I had a great time and am thankful for the opportunity.

I promised to share one Codeunit – that general mapping Codeunit – and here it is: Codeunit 60000.

If you download the presentation you can also read my notes.  They should help you better understand the whole story.

 

Data Exchange Framework enhancements

I have made several enhancements to the Data Exchange Framework in NAV 2016.  Previously I wrote about importing text or comma-separated files and about accessing a local data file when creating an xml/json structure.  Both of those enhancements are included in the objects attached to this post.

I have needed to handle JSON files where the actual data is not in the node value but in the node name.  To support this I added a boolean field to the Data Exchange Column Definition table (1223).

ConvertNodeNameToValue

To support this I added the same field to the Data Exchange Field Mapping table (1225) as a flow field, calculating the value from the definition table.

Codeunit 1203 is used to import XML and JSON files into the Data Exchange Field table.  I made two changes to the InsertColumn function.  First, I added a parameter that accepts the node name, and if the above switch is set to true I insert the node name into the column value instead of the node value.  The other change is that instead of finding only the first Data Exchange Column Definition I loop through all of them.  This allows me to create more than one definition based on the same XML node and therefore import into multiple columns from the same node.

To enable this scenario in the Currency Exchange Update Service I made changes to the Data Exchange Field Mapping Buffer table (1265) and the Data Exchange Setup Subform page.  More on that later.

A JSON file without a single root element will cause an error in the standard code.  Inspired by Nikola Kukrika at NAV TechDays 2015, I made changes to Codeunit 1237 utilizing the TryFunction to solve this problem.  To top this off I made a simple change to the Currency Exchange Rate Service Card page (1651) and table (1650) to allow the JSON file type.

Having completed these changes I can now use a web service that delivers JSON in my Currency Exchange Rate Service.

CurrencyLayerJSON

Also inspired by Microsoft I added a transformation type to the Transformation Rule table (1237) to convert the Unix Timestamp (seconds since January 1st 1970 UTC) to a date.

UnixTimestamp
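The conversion itself is a one-liner in C/AL, since adding a number to a DateTime adds milliseconds.  A sketch (the function name is a placeholder):

```
LOCAL PROCEDURE UnixTimestampToDate(UnixTimestamp : BigInteger) : Date;
VAR
  EpochStart : DateTime;
BEGIN
  // Unix timestamps count seconds since 1970-01-01 00:00:00 UTC;
  // DateTime arithmetic in C/AL is in milliseconds
  EpochStart := CREATEDATETIME(DMY2DATE(1,1,1970),000000T);
  EXIT(DT2DATE(EpochStart + UnixTimestamp * 1000));
END;
```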

All objects and deltas are attached.

DEF-Enhancements