Clean up your trail

Boy, have these last months been busy.

I am now working on a management solution for our cloud offering.  This solution brings control of the services and the tenants into a NAV user interface.  I am running PowerShell scripts from NAV (thanks Waldo) and things are looking good so far.

I extended the PowerShell functionality to read an XML response, like Waldo describes here.

One of the things I need to do is maintain files for this management solution: for example a NAVData file, a SQL backup file and a NAV license file.  Whenever I execute a PowerShell script from NAV I write these files to a temporary file path and point PowerShell there.

I can’t be sure that every PowerShell execution succeeds, and I can’t leave the temporary files around; I must delete them.  The solution in my case is to create a single instance Codeunit and use a DotNet List object.

I only needed a server version, but with DotNet this can be extended to handle the client side as well.



So, every time I create a temporary file I add the file to a DotNet List.  Even if the execution fails the temporary file is still in the list.  When the execution finishes all the listed temporary files are removed.  If the execution fails then the next successful execution will also remove the previous temporary file.
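The pattern is easy to mimic outside C/AL.  Here is a minimal Python sketch of the same idea (the function names are mine, purely illustrative): a list that outlives each run registers every temporary file, and a cleanup call removes everything on the list, including leftovers from failed runs.

```python
import os
import tempfile

# A module-level list plays the role of the single instance Codeunit's
# DotNet List: it outlives each individual run.
_pending_files = []

def create_temp_file(data: bytes) -> str:
    """Create a temp file and register it for later cleanup."""
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    _pending_files.append(path)
    return path

def cleanup_temp_files() -> int:
    """Delete every registered file, including leftovers from failed runs."""
    removed = 0
    while _pending_files:
        path = _pending_files.pop()
        if os.path.exists(path):
            os.remove(path)
            removed += 1
    return removed
```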

As you can imagine, this type of Codeunit can be used in many scenarios.  By using a DotNet Dictionary it is easy to store named parameters in one place and retrieve them in another.

For example, if you need to pass an extra parameter to a standard function, adding the parameter to the Dictionary before you execute the function and retrieving it within the function leaves the function footprint unchanged and keeps your customization upgradeable.
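Sketched in Python rather than C/AL, the Dictionary trick looks roughly like this; `post_invoice` and the parameter name are hypothetical stand-ins for a standard function whose signature you must not change.

```python
# A module-level dict plays the role of the single instance Codeunit's
# DotNet Dictionary: set a named parameter before the call, read it inside.
_params = {}

def set_param(name, value):
    _params[name] = value

def get_param(name, default=None):
    # pop so a parameter is only consumed once per call
    return _params.pop(name, default)

def post_invoice(invoice_no):
    """Standard function whose signature must stay unchanged."""
    # The customization reads the extra parameter without a new argument.
    skip_print = get_param("skip_print", False)
    return {"invoice": invoice_no, "printed": not skip_print}

# The caller passes the parameter out-of-band, then calls the unchanged function.
set_param("skip_print", True)
result = post_invoice("INV-001")
```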

User friendly error messages in NAV

One of the most challenging things in programming is proper error handling. This is one of the weaknesses of the standard NAV code. We are now seeing this change with the simplified NAV that Microsoft introduced.

An example of this can be found in report 1306.


Here Microsoft has created a dedicated function for error handling.


This type of error handling will tell the user what the problem is and how to fix that problem.


The flow is: when printing the invoice, NAV checks whether payment information has been entered into the Company Information table. If not, the user is asked whether to update the payment information. If the user agrees, the Company Information page is displayed and the invoice printing continues after that page is closed.

I wanted to adopt this method – perhaps a new error handling pattern – when I write an add-on or a customization.  Here is a video from Mark Brummel in line with this new pattern.

I have added this kind of error handling but in some cases a little bit more is required.  As an example, I am doing a series of tests when the Payment Method Code is validated in a Sales Header.  The problem here is that NAV is in the middle of a transaction when I have that question to ask.

To solve this I use a function that is intended for testing only, so using it is not really supported by Microsoft. Perhaps, if they read this blog they will see a useful application for this function in the standard solution.


Here I have two options for you to consider. Doing an ASSERTERROR ERROR('') will roll back the transaction without stopping the code execution. By doing this before the CONFIRM question I make sure that users are not blocking the database while deciding what to do.
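The idea of rolling back before asking translates to any database client.  Here is a minimal Python sketch with sqlite3 (not NAV code; the table and the confirm dialog are invented for illustration): undo the open transaction first, then ask the user, so no locks are held while the dialog is open.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_header (no TEXT, payment_method TEXT)")
conn.commit()

def confirm(question):
    # Stand-in for NAV's CONFIRM dialog; always answers Yes here.
    return True

# Validation code is already inside a transaction when it discovers
# that required setup is missing.
conn.execute("INSERT INTO sales_header VALUES ('1001', 'BANK')")
setup_missing = True

if setup_missing:
    # The equivalent of ASSERTERROR ERROR(''): undo the open transaction
    # first, so no locks are held while the user decides what to do.
    conn.rollback()
    if confirm("Payment setup is missing. Fix it now?"):
        pass  # the setup page would open here; the user then retries

rows = conn.execute("SELECT COUNT(*) FROM sales_header").fetchone()[0]
```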


Pressing Yes will open the page.


After fixing this error the user can retry the previous action.  I will most likely use code like this.


Use this carefully.

Selection Filter to Clipboard

I have now seen three posts about using Excel to help create a Pipe Filter.  The latest one, from Mohana, is similar to the way I normally do this.  His post is a follow-up to the original video from Kerry Rosvold and an alternative from Mark Brummel.

If this is something you frequently need to do, why not use NAV and skip the Excel part?

Let's imagine what that could look like.


We add a button to the ribbon, select whatever we like from the customer list and click the button.


Now it is just a matter of using the filter. Paste it wherever you need it.


And this is with just a few lines of code.
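For comparison, the core of such a function is just a join over the selected keys with the pipe character, quoting values that contain filter symbols.  A rough Python sketch (the quoting rule here is simplified, not NAV's exact one):

```python
def selection_filter(selected_keys):
    """Build a NAV-style pipe filter string from the selected records' keys."""
    def quote(value):
        # Quote values containing filter characters (simplified rule).
        return "'" + value + "'" if any(c in value for c in "|&()*@<>=.") else value
    return "|".join(quote(k) for k in selected_keys)

flt = selection_filter(["10000", "20000", "C&C 30000"])
```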


Page21 Delta file

New renumbering tool with DotNet power

No, there is no DotNet add-in; this just uses DotNet from C/AL.

I have made updates to this tool.  Read about it here.

There is an old version of a renumbering tool on this blog.  It works in the classic client, but who uses the classic client anymore?

I wanted to upgrade this tool to NAV 2015 (and NAV 2013 R2).  There were a few issues I needed to handle: how to upload a large object file to the server, how to read through millions of lines to find what I was looking for, and how to replace strings in all these lines without having to wait for days.

That is what I did.  One Table and one Page.  The code is all in the Table – not the ideal setup, but I preferred fewer objects over prettier design.  On the other hand, it is not that big an object, so every developer should find their way through the code in there.

I have been sharing objects on this blog and have also started to share and sell objects on  These objects are all in the range from 50.000 to 99.999.  This tool will make it so much easier to import these objects into the database.  Just download the renumbering tool and manually install it into the database and use the tool from there.

A little spoiler: I just used this tool on an object file with just under four million lines and needed to renumber 116 objects.  The tool completed this in two minutes and forty-two seconds.

So, how does it work?


The page has only three columns: source object type, source object ID and the new destination ID.  If you leave the destination ID blank the line will be ignored.  If you need to renumber an object to a number that already exists in the database, start by renumbering the old object before reusing the object ID.

In here we have five functions:

  • Read Object Lines populates the renumbering lines from an object file.
  • Suggest IDs looks for available IDs by searching the license permissions and updates the renumbering lines.
  • Write to Excel exports the renumbering lines to an Excel worksheet.
  • Read from Excel imports the same format from an Excel worksheet.
  • Update Object File applies the changes in the renumbering lines to an object file and saves a new object file.

If you have this tool in the customer database, you can import the object file you want to add to the database, get a suggestion for new IDs, export a new object file and import that one into the database.  This only takes a few seconds.

Let's take a closer look at the code.  Perhaps it will help you in an ongoing task or in the future.

The first thing the tool does is upload the object file to the server's temporary folder.  NAV has a function for this purpose, but it can only handle a limited amount of data.  Four million lines of code will not upload with the standard method, so I had to create another one.


The magic word here is streaming.  This code takes 4 KB of data in each portion and uploads it to the server.  It repeats until done, and I don't think you will find a useful NAV file that cannot be uploaded with this code.
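The chunked-streaming idea itself is language independent.  A small Python sketch of the same loop (4 KB portions, repeat until the source is exhausted):

```python
import io

CHUNK = 4096  # 4 KB portions, as in the upload described above

def stream_copy(src, dst):
    """Copy a stream in fixed-size chunks so the total size never matters."""
    total = 0
    while True:
        chunk = src.read(CHUNK)
        if not chunk:  # an empty read means we are done
            break
        dst.write(chunk)
        total += len(chunk)
    return total

src = io.BytesIO(b"x" * 10_000)
dst = io.BytesIO()
copied = stream_copy(src, dst)
```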

Once the file is on the server the tool loads the whole file into memory.


An array of strings is used to store all the object lines.

This is the starting point.  From this point I can loop through all the lines and do whatever I want.  However, doing that line by line would take forever, so I have applied a few tricks along the way.

When the tool is loading an object file into the renumbering lines it uses the DotNet string function Split to break each line into smaller bits.  When the tool finds an object it searches the string array for the end of that object before continuing, and can therefore skip all lines except the ones needed to build the renumbering lines.

When renumbering, instead of applying each renumbering line to every code line, the tool combines five thousand lines into one DotNet string variable and uses the DotNet function Replace to update the code.  After each chunk of code is updated it is downloaded from the server to the client side and written to the local file system.
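A simplified Python sketch of that chunked replace (illustrative only; the real tool must also guard against one object number being a substring of another, for example by ordering the replacements or matching whole tokens):

```python
def renumber(lines, mapping, chunk_size=5000):
    """Apply object-number replacements to lines in large chunks.

    Joining thousands of lines and running replace on the combined
    string is far faster than replacing line by line.
    """
    out = []
    for start in range(0, len(lines), chunk_size):
        chunk = "\n".join(lines[start:start + chunk_size])
        for old, new in mapping.items():
            chunk = chunk.replace(old, new)
        out.extend(chunk.split("\n"))
    return out

lines = ["OBJECT Table 50000 Foo", "SourceTable=Table50000"]
result = renumber(lines, {"50000": "66000"})
```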


Well, what are you waiting for?  Download the renumbering tool and start using it today.




My first dynamic Windows Client Add-in

I have a task where a user on a Windows Client wants to use his touch screen to select from a limited set of Items.  In NAV 2009 R2 this was a form with buttons and I have been waiting to update this to NAV 2013 R2 until I had finished most of the other tasks for that client.  That time is now.

I attended the session on Directions EMEA where Vjeko presented client extensions.  I also saw a post by Rashed on his mibuso blog about buttons.

So I started the task.  I have done several add-ins for NAV, so I know enough about Visual Studio to feel comfortable using it.  Of course, whenever one starts something like this there is always something new to learn and use.  Searching the internet for help on C# is really effective, and there are a lot of good examples out there.

Like Rashed, I created a panel that hosted all the buttons and text boxes and that panel is the object added to the NAV page.


I wanted to be able to customize all the buttons and text boxes from NAV, so I created default values and nullable values for all my properties.  Then I created a property in the class for every value that I wanted NAV to have access to.


After I created the panel control I wanted a trigger to fire in NAV to tell me that the add-in was ready.  I followed the example Vjeko gave in last Directions but that simply did not work.  I contacted him and that resulted in a new blog on Vjeko’s site.  Instead of triggering an event when creating the control the trigger is now fired after the control is bound to the NAV page.


When I get the trigger fired in NAV I create the buttons with all the properties I need.


The test page I created allows me to select any number of buttons and/or text boxes.  I can customize background and foreground color, font family and font size.  Some of these settings can be applied to each button or text box individually – that is, the ones included in the arrays passed to the CreateButtons function, arrays created from the Item table using these settings.




In the add-in I have created two event handlers to handle the push of a button or a text change.


This is handled in the NAV page with this code.


So when I press a button or change a text I will get a message from NAV.

I also made sure that if I select zero as a size for either the buttons or the text boxes they will not show.



I have adopted a new method of sharing objects and information.  I will publish on my blog how things are done, in case you want to create your own version, and will also supply a version you can download and start using at

Ping your web service for security

Increasingly we are using Web Services to communicate between NAV companies.  In a production environment this communication is usually secured with SSL.  When the communication is sent over the Internet I recommend adding a secure ping before every call to make sure that the correct machines are talking to each other.

The payment service module in NAV 2013 R2 has an encryption functionality.  I am using this functionality every time that I store user names and passwords in the database. This encryption functionality is based on the key located in the Online Service add-in folder.


You can copy the Providerkey.key file from one server to another or use the Payment Service Connection Setup page to upload, download, delete and to create a new key.


Create a function and add it to your web service.



The encrypted string is then decrypted and compared with the original string sent.
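The handshake can be sketched generically.  Here is a Python illustration using an HMAC tag in place of NAV's encryption key; the key and the function names are invented, and only the shape of the exchange matches the ping described above: the caller sends a challenge plus a value derived from it with the shared secret, and the receiver recomputes and compares.

```python
import hashlib
import hmac

SHARED_KEY = b"provider-key"  # stands in for the shared encryption key

def make_ping(challenge):
    """Client side: send the challenge plus a keyed tag computed from it."""
    tag = hmac.new(SHARED_KEY, challenge.encode(), hashlib.sha256).hexdigest()
    return challenge, tag

def verify_ping(challenge, tag):
    """Server side: recompute the tag; only a holder of the same key matches."""
    expected = hmac.new(SHARED_KEY, challenge.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

challenge, tag = make_ping("ping-12345")
ok = verify_ping(challenge, tag)
tampered = verify_ping(challenge, tag[:-1] + ("0" if tag[-1] != "0" else "1"))
```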


EncryptionMgt refers to Codeunit 824.  Another possibility is to add the secret as a parameter to every web service call and skip the preceding ping function.  The code examples can be downloaded from Objects4NAV.


OData in NAV 2013 R2

Open Data Protocol (OData) is a data access protocol initially defined by Microsoft.  NAV 2013 was the first version to support OData.  On the Developer Network, Microsoft has a few walkthroughs aimed at getting us started with both SOAP and OData web services.

To get started with OData, first make sure that the OData service is enabled on your development server.


As you can see in the image above, I also enable NTLM Authentication.  In a production environment OData should use SSL, as described in this walkthrough from Microsoft.

I want to share a few points that I have found out and that are not obvious in the walkthroughs.

When an external application makes an OData request, NAV behaves the same way as a Windows Client would.  The server runs the login routine by executing trigger 1 in Codeunit 1, CompanyOpen, and when the request is finishing the server executes the CompanyClose trigger.  If the request is for a Page object then all the page triggers are executed: OnInit, OnOpenPage and so on.

OData only exposes the fields added to the Page or to the Query, plus the table's primary key fields.  Let's take a closer look at how the Page object works with OData.

In the OnOpenPage trigger and in the SourceTableView property it is possible to add filters.  These filters apply to the OData stream, and you will not be able to insert data into NAV outside of these filters – the same behavior as if you were using the Page in the Windows Client.  The properties Editable, InsertAllowed, ModifyAllowed and DeleteAllowed all work with OData.  Let's for example look at Page 39 – General Journal.  In the OnOpenPage trigger the code filters the table data by a Template Name and a Batch Name.


From OData, the variable OpenedFromBatch will always be False, so the first template and batch for PAGE::"General Journal" will always be selected and new entries will always be inserted into that journal.  This also means that it is not possible to use Page 39 to read journal lines from any journal other than the first one.  Note that fields that are not visible in a Page are still visible and usable through OData.

When creating a new record the code in the OnNewRecord trigger is executed.


This all works fine for the first template and batch.  The AutoSplitKey property is also active, so as long as you are fine with inserting into the default journal you can use this Page for your OData web service.

The easiest way is still to create a new page dedicated to the web service functionality: show the primary key fields in the page and skip the standard OnOpenPage and OnNewRecord code.  I use the OnNewRecord trigger to set default values for the table I am inserting into.

Microsoft's walkthrough shows how to create a new customer, look up a customer and modify a customer.

I have found that I want to add one line to that example:

NAV nav = new NAV(new Uri("http://localhost:7048/DynamicsNAV/OData/Company('CRONUS%20International%20Ltd.')"));
nav.Credentials = CredentialCache.DefaultNetworkCredentials; 
nav.IgnoreResourceNotFoundException = true;

Without IgnoreResourceNotFoundException, the following code example throws an exception if the customer is not found within the given filter.

private static Customer GetCustomer(NAV nav, string customerNo)
{
    var customers = from c in nav.Customer
                    where c.No == customerNo
                    select c;
    foreach (Customer customer in customers)
        return customer;
    return null;
}

By combining a Get function like this with a New or Modify function it is easy to update the existing value for any given table in the database.

private static Boolean AddNewDefaultDimensionCodeValue(NAV nav, int tableNo, string no, string dimensionCode, string dimensionCodeValue)
{
    DefaultDimensions existingDefaultDimension = GetDefaultDimension(nav, tableNo, no, dimensionCode);
    if (existingDefaultDimension == null)
    {
        DefaultDimensions newDefaultDimension = new DefaultDimensions();
        newDefaultDimension.Table_ID = tableNo;
        newDefaultDimension.No = no;
        newDefaultDimension.Dimension_Code = dimensionCode;
        newDefaultDimension.Dimension_Value_Code = dimensionCodeValue;
        nav.AddToDefaultDimensions(newDefaultDimension); // queue the insert
        return true;
    }
    existingDefaultDimension.Dimension_Value_Code = dimensionCodeValue;
    nav.UpdateObject(existingDefaultDimension); // queue the modify
    return false;
}

private static DefaultDimensions GetDefaultDimension(NAV nav, int tableNo, string no, string dimensionCode)
{
    var dimensionValues = from d in nav.DefaultDimensions
                          where d.Table_ID == tableNo && d.No == no && d.Dimension_Code == dimensionCode
                          select d;
    foreach (DefaultDimensions dimensionValue in dimensionValues)
        return dimensionValue;
    return null;
}

Remember that without calling SaveChanges nothing is updated in NAV.

Now go ahead and use OData to integrate NAV with all your external systems and devices.  Good luck.

Localized views removed from NAV 2013 R2

In Dynamics NAV 2009 and older we had the option to maintain views in the database.  Activating it made NAV create a view for each language and every table in the database.

This is useful if you are building BI on top of your SQL database and would like to skip all the translation work.

This however had some drawbacks.

  • Every table in the database is maintained
  • Option values are shown as numbers
  • Booleans are shown as numbers
  • Global dimensions do not have the correct captions
  • Time is shown as DateTime
  • It is not easy to see the difference between a normal date and a closing date

I already discussed this in a post back in 2012.  Because of this I never used this built-in option.  The solution I created back in 2012 has been upgraded and is now ready and working in NAV 2013 R2.


I start by exporting all tables from the Object Designer as a text file.  This text file is imported with the task “Read Option Description”.  All option captions from the import are stored in a table in the database.


Dates are created according to the Accounting Periods.  The BI Dates table keeps multiple fields with information about each date.  By joining this table to your SQL query, or adding it as a dimension to your OLAP cube, you get a lot of options for handling dates and periods.
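To give an idea of the content, here is a Python sketch deriving a few of the attributes such a dates table holds for each date (the field names are illustrative, not the exact BI Dates schema):

```python
from datetime import date

MONTH_NAMES = ("January", "February", "March", "April", "May", "June",
               "July", "August", "September", "October", "November", "December")

def bi_date_row(d):
    """Derive the kind of per-date attributes a BI dates table stores."""
    return {
        "Date": d.isoformat(),
        "Year": d.year,
        "Month": d.month,
        "Month Name": MONTH_NAMES[d.month - 1],
        "Quarter": (d.month - 1) // 3 + 1,
        "Day of Week": d.isoweekday(),          # 1 = Monday
        "Week of Year": d.isocalendar()[1],     # ISO week number
        "Is Working Day": d.isoweekday() <= 5,  # ignores holidays here
    }

row = bi_date_row(date(2014, 7, 14))
```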


The tables required for the views are added to a list.  Remember to add the BI Dates table to the list of tables to be used in your business intelligence tool.


NAV is now ready to create the SQL commands to create the views.  The prefix can be used if the views are to be kept in a separate database, which I suggest you do.  A prefix can for example be “[Demo Database NAV (7-1)].[dbo].”


The file for SQL Management Studio is now ready for execution.

[code lang="sql"]IF EXISTS(SELECT * FROM sys.views WHERE object_id = OBJECT_ID(N'dbo.ENU$Kappi ehf_$Currency'))
DROP VIEW [dbo].[ENU$Kappi ehf_$Currency]
GO
CREATE VIEW [dbo].[ENU$Kappi ehf_$Currency]
AS SELECT
 [Code] As [Code]
,[Last Date Modified] As [Last Date Modified]
,[Last Date Adjusted] As [Last Date Adjusted]
,[Unrealized Gains Acc_] As [Unrealized Gains Acc_]
,[Realized Gains Acc_] As [Realized Gains Acc_]
,[Unrealized Losses Acc_] As [Unrealized Losses Acc_]
,[Realized Losses Acc_] As [Realized Losses Acc_]
,[Invoice Rounding Precision] As [Invoice Rounding Precision]
,[Invoice Rounding Type] As [Invoice Rounding Type]
,[Amount Rounding Precision] As [Amount Rounding Precision]
,[Unit-Amount Rounding Precision] As [Unit-Amount Rounding Precision]
,[Description] As [Description]
,[Amount Decimal Places] As [Amount Decimal Places]
,[Unit-Amount Decimal Places] As [Unit-Amount Decimal Places]
,[Realized G_L Gains Account] As [Realized G_L Gains Account]
,[Realized G_L Losses Account] As [Realized G_L Losses Account]
,[Appln_ Rounding Precision] As [Appln_ Rounding Precision]
,CASE [EMU Currency] WHEN 1 THEN 'Yes' ELSE 'No' END As [EMU Currency]
,[Currency Factor] As [Currency Factor]
,[Residual Gains Account] As [Residual Gains Account]
,[Residual Losses Account] As [Residual Losses Account]
,[Conv_ LCY Rndg_ Debit Acc_] As [Conv_ LCY Rndg_ Debit Acc_]
,[Conv_ LCY Rndg_ Credit Acc_] As [Conv_ LCY Rndg_ Credit Acc_]
,[Max_ VAT Difference Allowed] As [Max_ VAT Difference Allowed]
,[VAT Rounding Type] As [VAT Rounding Type]
,[Payment Tolerance _] As [Payment Tolerance _]
,[Max_ Payment Tolerance Amount] As [Max_ Payment Tolerance Amount]
,[Wage Amount Rounding Precision] As [Wage Amount Rounding Precision]
,[Wage Rate Rounding Precision] As [Wage Rate Rounding Precision]
,[Wage Amount Decimal Places] As [Wage Amount Decimal Places]
,[Wage Rate Decimal Places] As [Wage Rate Decimal Places]
FROM [Kappi ehf_$Currency]
GO

IF EXISTS(SELECT * FROM sys.views WHERE object_id = OBJECT_ID(N'dbo.ENU$Kappi ehf_$BI Dates'))
DROP VIEW [dbo].[ENU$Kappi ehf_$BI Dates]
GO
CREATE VIEW [dbo].[ENU$Kappi ehf_$BI Dates]
AS SELECT
 [Date] As [Date]
,[Date Name] As [Date Name]
,[Year] As [Year]
,[Week] As [Week]
,[Month] As [Month]
,[Month Name] As [Month Name]
,[Day of Week] As [Day of Week]
,[Day Name] As [Day Name]
,[Day of Month] As [Day of Month]
,CASE [Closing Date] WHEN 1 THEN 'Yes' ELSE 'No' END As [Closing Date]
,[SQL Month Name] As [SQL Month Name]
,[SQL Day Name] As [SQL Day Name]
,[Quarter] As [Quarter]
,[Year Month] As [Year Month]
,[Year Month Name] As [Year Month Name]
,[Month Year Name] As [Month Year Name]
,[Quarter Name] As [Quarter Name]
,[VAT Period] As [VAT Period]
,[VAT Period Name] As [VAT Period Name]
,[Sorting Date] As [Sorting Date]
,[HRMS Integer Start] As [HRMS Integer Start]
,[HRMS Integer End] As [HRMS Integer End]
,[Day of Year] As [Day of Year]
,[Day of Half Year] As [Day of Half Year]
,[Day of Quarter] As [Day of Quarter]
,[Day of Accounting] As [Day of Accounting]
,[Half Years] As [Half Years]
,[Half Year of Year] As [Half Year of Year]
,CASE [Is Holiday] WHEN 1 THEN 'Yes' ELSE 'No' END As [Is Holiday]
,CASE [Is Working Day] WHEN 1 THEN 'Yes' ELSE 'No' END As [Is Working Day]
,[Month of Half Year] As [Month of Half Year]
,[Month of Quarter] As [Month of Quarter]
,[Month of Year] As [Month of Year]
,[Quarters of Half Year] As [Quarters of Half Year]
,[Quarters of Year] As [Quarters of Year]
,[Week of Year] As [Week of Year]
,CASE [Is Week Day] WHEN 1 THEN 'Yes' ELSE 'No' END As [Is Week Day]
,[Half Year Name] As [Half Year Name]
,[Week Name] As [Week Name]
,[Fiscal Day] As [Fiscal Date]
,[Fiscal Year] As [Fiscal Year]
,[Fiscal Week] As [Fiscal Week]
,[Fiscal Month] As [Fiscal Month]
,[Fiscal Day of Week] As [Fiscal Day of Week]
,[Fiscal Day of Month] As [Fiscal Day of Month]
,[Fiscal Quarter] As [Fiscal Quarter]
,[Fiscal Day of Year] As [Fiscal Day of Year]
,[Fiscal Day of Half Year] As [Fiscal Day of Half Year]
,[Fiscal Day of Quarter] As [Fiscal Day of Quarter]
,[Fiscal Half Years] As [Fiscal Half Years]
,[Fiscal Half Year of Year] As [Fiscal Half Year of Year]
,[Fiscal Month of Half Year] As [Fiscal Month of Half Year]
,[Fiscal Month of Quarter] As [Fiscal Month of Quarter]
,[Fiscal Month of Year] As [Fiscal Month of Year]
,[Fiscal Quarters of Half Year] As [Fiscal Quarters of Half Year]
,[Fiscal Quarters of Year] As [Fiscal Quarters of Year]
,[Fiscal Week of Half Year] As [Fiscal Week of Half Year]
,[Fiscal Week of Month] As [Fiscal Week of Month]
,[Fiscal Week of Quarter] As [Fiscal Week of Quarter]
,[Fiscal Week of Year] As [Fiscal Week of Year]
,[Fiscal Quarter Name] As [Fiscal Quarter Name]
,[Fiscal Half Year Name] As [Fiscal Half Year Name]
,[Fiscal Week Name] As [Fiscal Week Name]
,[Fiscal Month Name] As [Fiscal Month Name]
FROM [Kappi ehf_$BI Dates]
GO

We now have a localized version of our table data, and all the drawbacks listed above have been fixed.  Now go ahead and build your reports or your OLAP cube on top of these views.  You can use the BI Dates table in the filter area of your reports to make sure that only available periods are selected.

Download objects.


The Next Upgrade Story – Data transfer code generation tool

Guess what, more companies are moving to NAV 2013 R2.

I am about to start a data upgrade for a medium-sized Icelandic company.  Doing a data transfer means comparing the tables of the old version and the new version, and where there is a mismatch you have to take some action.  The data transfer is built on the upgrade from 2009 to 2013 R2 released by Microsoft.

We have the MergeTool that is used to compare and merge code and a lot of other cool things.  I wanted a tool that did something else, so I started designing and writing one.  I am hoping that the result will save me a lot of work in future upgrades.


This tool imports exported object files into a table version.  Then I can compare two table versions and get all the differences listed.  I also import the CRONUS tables for each version to make sure that I am not comparing built-in fields.


I give the comparison an identification and select the source and destination versions.  Then I hit Compare.  Within seconds I get the result.


I have to make decisions on what to do with modified and deleted tables.  I can list all the tables and check the field comparison.


I can select both step 1 and step 2 actions for each table.  For new fields I can select a source field from the source table.  The example I show here is the customer table.  I choose in step 1 to copy all the needed fields to a temporary table.  The tool will find the next available upgrade table automatically.  In step 2 I choose to move the data from the temporary table into the destination version.  I can ask that a new field will get a value from another field in the temporary table.  In this example the Registration No. field has a new ID.

The best part is yet to come.  To really save some work I made the tool create all the needed NAV upgrade objects.  For step 1 I get all the upgrade tables with only the required fields.

[code]OBJECT Table 66001 Temp Customer
{ 1 ; ;No. ;Code20 }
{ 10010400;;Registration No. ;Code20 }
{ 10017241;;Require Purchase Request;Boolean }
{ 10017290;;Freight Paid By ;Option ;OptionString=[Recipient,Prepaid,Senders Account,See Ship-to Address] }
{ ;No. ;Clustered=Yes }

… and so on; one upgrade table is generated for each source table:

OBJECT Table 66002 Temp Cust. Ledger Entry


Next comes the codeunit required to do the data manipulation.

[code]OBJECT Codeunit 66000 Upgrade FurblandanR
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
// Call this function from the top of the Upgrade trigger in Codeunit ID 104045




LOCAL PROCEDURE CopyCustomer@2(VAR StateIndicator@1002 : Record 104037);
Customer@1000 : Record 18;
TempCustomer@1001 : Record 66001;
WITH Customer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
MODIFYALL("Registration No.",'');
MODIFYALL("Require Purchase Request",FALSE);
MODIFYALL("Freight Paid By",0);

LOCAL PROCEDURE CopyCustLedgerEntry@3(VAR StateIndicator@1002 : Record 104037);
CustLedgerEntry@1000 : Record 21;
TempCustLedgerEntry@1001 : Record 66002;
WITH CustLedgerEntry DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
MODIFYALL("Final Due Date",0D);

LOCAL PROCEDURE MoveCashierBankBranch@26(VAR StateIndicator@1002 : Record 104037);
CashierBankBranch@1000 : Record 10010402;
TempCashierBankBranch@1001 : Record 10000202;
WITH CashierBankBranch DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN

LOCAL PROCEDURE ForceFixedAsset@48(VAR StateIndicator@1002 : Record 104037);
FixedAsset@1000 : Record 5600;
WITH FixedAsset DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
MODIFYALL("Straight-Line %",0);
MODIFYALL("Depreciation Starting Date",0D);
MODIFYALL("Biling Contract No.",'');


And to complete step 1 the addition needed to the DeleteDiscontinuedTables trigger.

[code]// Add these lines to the DeleteDiscontinuedTables trigger of Codeunit ID 104002
…// Add these lines to the DeleteDiscontinuedTables trigger of Codeunit ID 104002

Then there is the upgrade codeunit for step 2

[code]OBJECT Codeunit 66001 Upgrade FurblandanR
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
// Call this function from the top of the Upgrade trigger in Codeunit ID 104048
.. END

LOCAL PROCEDURE CopyBankingWebServicesUser@2(VAR StateIndicator@1002 : Record 104037);
TempCashierWebServicesUser@1000 : Record 66022;
BankingWebServicesUser@1001 : Record 10010428;
WITH TempCashierWebServicesUser DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
"BankingWebServicesUser"."Authentication Code" := "Authentication Code";
"BankingWebServicesUser"."User ID" := "User ID";
"BankingWebServicesUser"."Certificate Key Identifier" := "Certificate Key Identifier";
//"BankingWebServicesUser"."User ID Store" := ;
//"BankingWebServicesUser"."Password Store" := ;
//"BankingWebServicesUser"."Certificate Location" := ;
//"BankingWebServicesUser"."Certificate Value" := ;

LOCAL PROCEDURE MoveCustomer@3(VAR StateIndicator@1002 : Record 104037);
TempCustomer@1000 : Record 66001;
Customer@1001 : Record 18;
WITH TempCustomer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
"Customer"."Registration No." := "Registration No.";
//"Customer"."Momentum Claims Active" := ;


And to complete the process the upgrade tables are marked to be deleted.

[code]// Add these lines to the OnRun trigger of Codeunit ID 104003

This tool is in testing, and I am planning to add support for the NAV 2015 upgrade tool.

New Developer Tools for Dynamics NAV 2013 R2

Cumulative update 9 for Dynamics NAV 2013 R2 has been released.  This new version includes a PowerShell tool package for developers.

You can read about these tools in the NAV Team Blog, on Soren’s Blog and on Waldo’s Blog.

With every CU package we get an upgrade for the application.  I have a customer that is already running CU8 and I want to install CU9.  This time I will do that with these new powershell tools.

The first step is to install the new CU9 binaries.  This I will do with the PowerShell scripts from here.  This updates the Windows Client files, Web Client files, server files, Outlook integration and ClickOnce deployment.  On my development machine I needed to copy the RTC client files from the CU9 DVD; the new model tools were not included in the KB RTC folder.

Then I export the Dynamics NAV objects.

I use my CU8 version demo database and export all objects to cu8objects.txt.  I export all objects from the customized database to cu8customizedobjects.txt.  I finally use my CU9 version demo database and export all objects to cu9objects.txt.

Open PowerShell ISE and navigate to the folder with the object files.

Directory: N:\nav\cu9
Mode                LastWriteTime     Length Name
 ----                -------------     ------ ----
 -a---         14.7.2014     09:32  154023304 cu8customizedobjects.txt
 -a---         14.7.2014     09:27  120451925 cu8objects.txt
 -a---         14.7.2014     09:35  120463442 cu9objects.txt

Next I import the new model tools.

Import-Module "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\71\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1" -force

I create a folder called cu8tocu9diff and execute

Compare-NAVApplicationObject -Original .\cu8objects.txt -Modified .\cu9objects.txt -Delta .\cu8tocu9diff
Processed 4036 objects:
 Inserted   0 objects
 Deleted    0 objects
 Changed    64 objects
 Identical  3972 objects
 Failed     0 objects

I see in my cu8tocu9diff folder that I have 4.036 objects and 3.972 of them are not needed.  I deleted all the files from the cu8tocu9diff folder and executed

Compare-NAVApplicationObject -Original .\cu8objects.txt -Modified .\cu9objects.txt -Delta .\cu8tocu9diff | Where-Object CompareResult -eq 'Identical' | foreach {  Remove-Item (Join-Path .\cu8tocu9diff ($_.ObjectType.substring(0,3) + $_.Id + '.delta')) }

The result is that I only have the 64 delta files needed in the cu8tocu9diff folder.

Now I need the same 64 objects from the customized database.  I begin by splitting the exported object file into a new folder.

Split-NAVApplicationObjectFile .\cu8customizedobjects.txt cu8customized

I now have all 4.036 objects in the cu8customized folder.  I delete all unneeded files by executing

Get-ChildItem -Path .\cu8customized | foreach { if (!(Test-Path ((Join-Path .\cu8tocu9diff $_.BaseName) + '.delta'))) { Remove-Item $_.FullName } }

I am now in a position to update the customized objects with the CU9 changes.

Update-NAVApplicationObject -Target .\cu8customized -Delta .\cu8tocu9diff -Result cu9customized.txt -DateTimeProperty FromModified -ModifiedProperty FromModified -VersionListProperty FromModified -DocumentationConflict ModifiedFirst
 Completed the update. Processed 64 application object(s) with a total of 190 individual change(s).
 100,0% of the individual changes were automatically applied during the update.
 Processed 64 application object(s):
Updated      64 objects – with changes in DELTA that were successfully merged with any changes from TARGET
 into RESULT.
 Conflict      0 objects – with changes in both DELTA and TARGET that could only be partially updated.
 The partially updated objects and the corresponding .CONFLICT files are added to RESULT.
 This also includes objects that are deleted in DELTA and changed in TARGET.
 Inserted      0 objects – that exist in DELTA that do not exist in TARGET and are inserted into RESULT.
 Deleted       0 objects – that are deleted in DELTA and exist in TARGET.
 Unchanged     0 objects – that exist in TARGET, are not changed in DELTA, and are copied from TARGET to RESULT.
 Failed        0 objects – that could not be imported, such as an object that is not valid or that contains
 unsupported features.
Processed 190 changes:
Conflict      0 changes
 Updated   100,0% of all changes

When I updated the customized objects with these 64 changes I got objects with a new version list.  The version lists need to be merged, and for that I use a script written by NAV MVP Kamil Sacek.  The script is attached here as the Merge-NAVVersionListString script, and his scripts are available on CodePlex.  After copying the script to my base folder I need to import the new functions.

Import-Module '.\Merge-NAVVersionListString script.ps1'

And to merge the version list for the new objects I use

Get-ChildItem -Path .\cu9customized | foreach { Set-NAVApplicationObjectProperty -Target $_.FullName -VersionListProperty (Merge-NAVVersionListString -source (Get-NAVApplicationObjectProperty -Source $_.FullName).VersionList -target (Get-NAVApplicationObjectProperty -Source (Join-Path .\cu8customized $_.Name)).VersionList) }

I now have 64 updated objects that I can import into my customized database and compile.  First I want to join all these 64 files into one object file.

Join-NAVApplicationObjectFile -Source .\cu9customized -Destination cu9customizeobjects.txt

When I imported the new customized objects I got an error, because I had new fields in tables that I did not have permission to create.  To solve this I begin by importing the URObjects.IS.37221.fob file that is included in the cumulative update package.  I accept the default action so all new fields are merged into the existing tables.

Scripts are available to export and import NAV objects so this whole process can be scripted and automated.