Presenting the Data Exchange Framework

Earlier this month Arend-Jan contacted me about presenting at the Dutch Dynamics Community NAV Event in March 2016.  Of course I was honored, and after a moment's thought I accepted.

The presentation was given twice, to close to 140 people in total.  I had a great time and am thankful for the opportunity.

I promised to share one Codeunit, that general mapping Codeunit, and here it is: Codeunit 60000.

If you download the presentation you can also read my notes.  They should help you better understand the whole story.

 

Data Exchange Framework enhancements

I have made several enhancements to the Data Exchange Framework in NAV 2016.  Previously I wrote about importing a text or comma-separated file and about accessing a local data file when creating an XML/JSON structure.  Both of those enhancements are included in the objects attached to this post.

I have needed to handle JSON files where the actual data is not in the node value but in the node name.  To support this I added a Boolean field to the Data Exchange Column Definition table (1223).

[Image: ConvertNodeNameToValue]

To support this I added the same field to the Data Exchange Field Mapping table (1225) as a FlowField, calculating its value from the definition table.
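
A minimal sketch of what that FlowField definition could look like on table 1225; the field name and number are assumptions:

[code]
// assumed FlowField on table 1225, Data Exch. Field Mapping
{ 60000;  ;Convert Node Name to Value;Boolean;
           FieldClass=FlowField;
           CalcFormula=Lookup("Data Exch. Column Def"."Convert Node Name to Value"
                              WHERE (Data Exch. Def Code=FIELD(Data Exch. Def Code),
                                     Data Exch. Line Def Code=FIELD(Data Exch. Line Def Code),
                                     Column No.=FIELD(Column No.))) }
[/code]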

Codeunit 1203 is used to import XML and JSON files into the Data Exchange Field table.  I made two changes to the InsertColumn function.  First, I added a parameter that accepts the node name, and if the switch above is set to true I insert the node name into the column value instead of the node value.  The other change is that instead of finding only the first Data Exchange Column Definition I loop through all of them.  This allows me to create more than one definition based on the same XML node and therefore import into multiple columns from the same node.
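
Here is a minimal sketch of how the modified InsertColumn could look, reusing the InsertRecXMLField call shown later in this post; the field name Convert Node Name to Value and the parameter names are assumptions:

[code]
LOCAL InsertColumn(Path : Text;VAR LineNo : Integer;NodeName : Text;NodeValue : Text;VAR DataExch : Record "Data Exch.")
// sketch only; the standard code takes just the first matching column definition
DataExchColumnDef.SETRANGE("Data Exch. Def Code",DataExch."Data Exch. Def Code");
DataExchColumnDef.SETRANGE("Data Exch. Line Def Code",DataExchLineDef.Code);
DataExchColumnDef.SETRANGE(Path,Path);
IF DataExchColumnDef.FINDSET THEN
  REPEAT // loop through all definitions so one node can fill several columns
    IF DataExchColumnDef."Convert Node Name to Value" THEN
      DataExchField.InsertRecXMLField(DataExch."Entry No.",LineNo,
        DataExchColumnDef."Column No.",'',NodeName,DataExchLineDef.Code)
    ELSE
      DataExchField.InsertRecXMLField(DataExch."Entry No.",LineNo,
        DataExchColumnDef."Column No.",'',NodeValue,DataExchLineDef.Code);
  UNTIL DataExchColumnDef.NEXT = 0;
[/code]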

To enable this scenario in the Currency Exchange Update Service I made changes to the Data Exchange Field Mapping Buffer table (1265) and the Data Exchange Setup Subform page.  More on that later.

A JSON file without a single root element will cause an error in the standard code.  Inspired by Nikola Kukrika at NAV TechDays 2015, I made changes to Codeunit 1237 utilizing a TryFunction to solve this problem.  To top this off I made a simple change to the Currency Exchange Rate Service Card page (1651) and table (1650) to allow the JSON file type.
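
The idea, as a sketch: wrap the standard conversion in a TryFunction and, if it fails, wrap the JSON content in an artificial root element and retry.  The function names and the text constant here are placeholders, not the actual names in Codeunit 1237:

[code]
PROCEDURE GetJsonStructure(Json : Text)
BEGIN
  IF NOT TryConvertToXml(Json) THEN
    // no single root element; wrap the content and try again
    IF NOT TryConvertToXml('{"root": ' + Json + '}') THEN
      ERROR(InvalidJsonErr); // assumed text constant
END;

[TryFunction]
LOCAL PROCEDURE TryConvertToXml(Json : Text)
BEGIN
  ConvertToXml(Json); // placeholder for the standard JSON-to-XML conversion
END;
[/code]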

Having completed these changes I can now use a web service that delivers JSON in my Currency Exchange Rate Service.

[Image: CurrencyLayerJSON]

Also inspired by Microsoft, I added a transformation type to the Transformation Rule table (1237) to convert a Unix timestamp (seconds since January 1st 1970 UTC) to a date.

[Image: UnixTimestamp]
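
A minimal sketch of the new transformation, assuming a new option value on the Transformation Type field and variables named Value, NewValue and UnixTimestamp; a C/AL Duration is measured in milliseconds, so the seconds are multiplied by 1000 and added to the epoch start:

[code]
"Transformation Type"::"Unix Timestamp" :
  BEGIN
    // Value holds the incoming text, UnixTimestamp is a BigInteger
    IF EVALUATE(UnixTimestamp,Value) THEN
      NewValue := FORMAT(DT2DATE(CREATEDATETIME(DMY2DATE(1,1,1970),0T) + UnixTimestamp * 1000));
  END;
[/code]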

All objects and deltas are attached.

[Attachment: DEF-Enhancements]

NAV 2016 Data Exchange – file import

As promised, it is time to pick more things that I have improved for NAV 2016.  Still on the Data Exchange, and now looking at file import.  We can import a text file, fixed or delimited, with different encodings.

By using Codeunit 1240 for the External Data Handling action we are able to get any text file into the Data Exchange framework.  It will simply prompt the user for a local file name.

Microsoft shipped Codeunit 1241 to read a flat file.  I decided to change that a bit with two enhancements.  First, I wanted to be able to read a delimited file, and second, I wanted to be able to use different encodings.

The Codeunit name is “Fixed File Import” and the name change alone shows the difference.  My version is named “Fixed/Delimited File Import”.

There are settings in the Data Exchange that I wanted to be able to support.

[Image: FileDetails]

The standard Codeunit only reads the MS-DOS file encoding, with this code:

OnRun(VAR Rec : Record "Data Exch.")
"File Content".CREATEINSTREAM(ReadStream);
DataExchDef.GET("Data Exch. Def Code");
LineNo := 1;
REPEAT
  ReadLen := ReadStream.READTEXT(ReadText);
  IF ReadLen > 0 THEN
    ParseLine(ReadText,Rec,LineNo,SkippedLineNo);
UNTIL ReadLen = 0;

I changed this to

WITH DataExchDef DO BEGIN
  GET("Data Exch. Def Code");
  CASE "File Encoding" OF
    "File Encoding"::"MS-DOS" :
      "File Content".CREATEINSTREAM(ReadStream,TEXTENCODING::MSDos);
    "File Encoding"::WINDOWS :
      "File Content".CREATEINSTREAM(ReadStream,TEXTENCODING::Windows);
    "File Encoding"::"UTF-8" :
      "File Content".CREATEINSTREAM(ReadStream,TEXTENCODING::UTF8);
    "File Encoding"::"UTF-16" :
      "File Content".CREATEINSTREAM(ReadStream,TEXTENCODING::UTF16);
  END;
  LineNo := 1;
  REPEAT
    ReadLen := ReadStream.READTEXT(ReadText);
    IF ReadLen > 0 THEN
      ParseLine(ReadText,Rec,LineNo,SkippedLineNo);
  UNTIL ReadLen = 0;
END;

so that it reads the four different text encodings that are supported by the Data Exchange Setup.

The ParseLine function in the standard Codeunit has this simple loop to read the fixed file:

StartPosition := 1;
REPEAT
  DataExchField.InsertRecXMLField(DataExch."Entry No.",LineNo,DataExchColumnDef."Column No.",'',
    COPYSTR(Line,StartPosition,DataExchColumnDef.Length),DataExchLineDef.Code);
  StartPosition += DataExchColumnDef.Length;
UNTIL DataExchColumnDef.NEXT = 0;
LineNo += 1;

Extending this and adding two functions gives me the ability to import both fixed and delimited text files.

StartPosition := 1;
REPEAT
  CASE DataExchDef."File Type" OF
    DataExchDef."File Type"::"Fixed Text" :
      BEGIN
        DataExchField.InsertRecXMLField(DataExch."Entry No.",LineNo,DataExchColumnDef."Column No.",'',
          COPYSTR(Line,StartPosition,DataExchColumnDef.Length),DataExchLineDef.Code);
        StartPosition += DataExchColumnDef.Length;
      END;
    DataExchDef."File Type"::"Variable Text" :
      BEGIN
        DataExchField.InsertRecXMLField(DataExch."Entry No.",LineNo,DataExchColumnDef."Column No.",'',
          ExtractFirstDelimitedPart(Line,GetSeparator),DataExchLineDef.Code);
      END;
    ELSE
      ERROR(FileTypeNotSupported,DataExchDef."File Type");
  END;
UNTIL DataExchColumnDef.NEXT = 0;
LineNo += 1;

LOCAL GetSeparator() Separator : Text[1]
WITH DataExchDef DO
  CASE "Column Separator" OF
    "Column Separator"::Tab :
      BEGIN
        Separator := ' ';   // initialize to length 1 before writing the character code
        Separator[1] := 9;  // TAB
      END;
    "Column Separator"::Semicolon :
      Separator := ';';
    "Column Separator"::Comma :
      Separator := ',';
    "Column Separator"::Space :
      Separator := ' ';
  END;

LOCAL ExtractFirstDelimitedPart(VAR Line : Text;Separator : Text[1]) FirstPart : Text
// local variable: SeparatorPosition : Integer
SeparatorPosition := STRPOS(Line,Separator);
IF SeparatorPosition > 0 THEN BEGIN
  FirstPart := COPYSTR(Line,1,SeparatorPosition - 1);
  IF SeparatorPosition + 1 <= STRLEN(Line) THEN
    Line := COPYSTR(Line,SeparatorPosition + 1)
  ELSE
    Line := '';
END ELSE BEGIN
  FirstPart := Line;
  Line := '';
END;

More to come.  Stay tuned…

 

Series of improvements/enhancements for Dynamics NAV 2016

During my preparation for my Directions sessions I studied the new, enhanced Data Exchange Framework in NAV 2016.  I have done a number of things to enhance the functionality and to enable functionality that the setup options suggest should be available.

Over the coming weeks I plan to publish these things and hope that they will be useful to some of you.  Who knows, Microsoft might even decide that some of these fixes and features should be included in the standard product.

The first thing I want to address is the functionality for getting the XML structure of a file into the Data Exchange Framework.  Go into the definition for the Yahoo Currency Exchange Rate Service and try the “Get File Structure” function.

[Image: DataExchangeGetStructure]

The next page will allow you to enter a URL to the XML file.

[Image: xmlpath]

The problem we face here is that the path must be accessible to the NAV Service Tier.  So, what do you do if you have the XML file on your desktop?

Utilizing the new TryFunction feature, I decided to add a new function to the File Management Codeunit (419).

[Image: getanypath]

Throw any path (a local path, a server path, a URL) at the MakeServerTempFile function and you will receive a server file name that is accessible to the Service Tier.  This function will (see the sketch after the list):

  1. If the path is found on the server, return that path
  2. If the path is found on the Windows client computer, upload the file to the server
  3. Try to download the file with a web client to the server
  4. Ask the user to upload the file
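
Here is a minimal sketch of such a function, assuming it lives in Codeunit 419 where UploadFileSilent, UploadFile and ServerTempFileName already exist; TryDownloadToFile is an assumed TryFunction helper built on System.Net.WebClient, and SelectFileTxt is an assumed text constant:

[code]
PROCEDURE MakeServerTempFile(FilePath : Text) ServerFileName : Text
BEGIN
  IF EXISTS(FilePath) THEN                            // 1. the path is visible to the Service Tier
    EXIT(FilePath);
  ServerFileName := UploadFileSilent(FilePath);       // 2. try the windows client computer
  IF ServerFileName <> '' THEN
    EXIT(ServerFileName);
  IF TryDownloadToFile(FilePath,ServerFileName) THEN  // 3. try to download from a URL
    EXIT(ServerFileName);
  EXIT(UploadFile(SelectFileTxt,''));                 // 4. finally ask the user for the file
END;

[TryFunction]
LOCAL PROCEDURE TryDownloadToFile(Url : Text;VAR ToFile : Text)
VAR
  WebClient : DotNet WebClient; // System.Net.WebClient
BEGIN
  ToFile := ServerTempFileName('tmp');
  WebClient := WebClient.WebClient;
  WebClient.DownloadFile(Url,ToFile); // throws if the URL cannot be downloaded
END;
[/code]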

Add this function to Codeunits 1235 and 1237 to make sure that the Get XML Structure function works as expected.

[Image: getxml]

[Image: getjson]

Building a clean database: removing unlicensed objects

I just got a question from a client:

Gunnar,
Do you have a “King Kong” license that will allow you to delete any object?  It appears our development license does not have the rights to some of the newer LS Retail objects and I need to create a CRONUS database with just our stuff.

Well, I don’t have a “King Kong” license.  That is only for Microsoft.

There is a way to solve this dilemma.  It will take a few steps.

We start with two databases: one with the data we need (LSRetail) and another with the application we need (CRONUS).

After the process is completed the LSRetail database will not be usable as a standalone database, so make a copy if you need one.  A new database, CRONUS_APP, will be created.  When cleaning up it is safe to delete both of these databases.

The following PowerShell script has two options.  Option 1 imports the company data into the CRONUS database at the end; this requires a server instance running on the CRONUS database.  Option 2 creates a new database with SQL Management Studio and merges the CRONUS application and the LSRetail data into that one.

[code lang="powershell"]
$CronusDatabaseName = "CRONUS" # Database with destination Application
$CRONUSServerInstance = "DynamicsNAV80" # Instance for destination Application if using option 1
$LSRetailDatabaseName = "LSRETAIL" # LS Retail Demo Database, database with company data
$EmptyDatabaseName = "CRONUS WITH COMPANYDATA" # Create a new empty database using SQL Management Studio if using option 2
$SQLServerName = "SQL2014"
$SQLServerInstance = "NAVDEMO" # Set blank for default instance

$AppDatabaseName = $CronusDatabaseName + "_APP"
$ServiceAccount = $env:USERDOMAIN + "\" + $env:USERNAME
$ServerInstance = "UPGRADE"
$NavDataFile = (Join-Path $env:TEMP "NAVmerge.navdata")

$SelectOption = "2"

#Export Application from CRONUS Database to Application Database
Export-NAVApplication -DatabaseServer $SQLServerName -DatabaseInstance $SQLServerInstance -DatabaseName $CronusDatabaseName -DestinationDatabaseName $AppDatabaseName -ServiceAccount $ServiceAccount -Force

#Setup a temporary Server Instance for the new database
Get-Credential | New-NAVServerInstance -ServerInstance $ServerInstance -ManagementServicesPort 33555 -ClientServicesPort 33556 -SOAPServicesPort 33557 -ODataServicesPort 33558 -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $AppDatabaseName -ServiceAccount User -Force
Set-NAVServerConfiguration -ServerInstance $ServerInstance -KeyName "Multitenant" -KeyValue "true" -Force
Set-NAVServerInstance -ServerInstance $ServerInstance -Start -Force

#Prepare LSRetailDatabase for new configuration
Remove-NAVApplication -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Force

#Mount and Sync LSRetailDatabase as a tenant
Mount-NAVTenant -ServerInstance $ServerInstance -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Id DEFAULT -OverwriteTenantIdInDatabase -AllowAppDatabaseWrite -Force
Sync-NAVTenant -ServerInstance $ServerInstance -Tenant DEFAULT -Mode ForceSync -Force

if (Test-Path $NavDataFile)
{
Remove-Item -Path $NavDataFile -Force
}

#Option 1, Copy Company data to the original CRONUS database. Requires a service running on the CRONUS database
if ($SelectOption -eq "1")
{
Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -Force
Import-NAVData -ServerInstance $CRONUSServerInstance -FilePath $NavDataFile -AllCompanies -Force
}
#Option 2, Import into the new empty database created by SQL Management Studio
if ($SelectOption -eq "2")
{

Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -IncludeApplication -IncludeApplicationData -IncludeGlobalData -Force
if ($SQLServerInstance -eq "")
{
Import-NAVData -DatabaseServer $SQLServerName -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication

}
else
{
Import-NAVData -DatabaseServer ($SQLServerName + "\" + $SQLServerInstance) -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication
}

}

Set-NAVServerInstance -ServerInstance $ServerInstance -Stop -Force
Remove-NAVServerInstance -ServerInstance $ServerInstance -Force

if (Test-Path $NavDataFile)
{
Remove-Item -Path $NavDataFile -Force
}
[/code]

To walk you through what happens:

  • Application from CRONUS is exported into CRONUS_APP database
  • New Service Instance is created for CRONUS_APP database
  • Service Instance is changed to Multi Tenant and started
  • Application is removed from LSRetail database
  • LSRetail database is mounted as a tenant for CRONUS_APP database
  • LSRetail database structure is force-synched to CRONUS_APP application
  • Data from CRONUS_APP and LSRetail tenant is exported to NAVData file
  • NAVData file is imported into an empty database or the existing CRONUS database

 

OData in NAV 2013 R2

Open Data Protocol (OData) is a data access protocol initially defined by Microsoft.  In NAV 2013 Microsoft first added support for OData.  On the Developer Network Microsoft has a few walk-throughs aimed at getting us started with both SOAP and OData web services.

To get started with OData, first make sure that the OData service is enabled on your development server.

[Image: ODataEnabled]

As you can see in the image above, I also enabled the use of NTLM authentication.  In a production environment OData should use SSL, as described in this walk-through from Microsoft.

I want to share with you a few points that I have found that are not obvious in the walk-throughs.

When an external application makes an OData request, NAV behaves the same way as a Windows Client would.  The server runs the login routine by executing trigger 1 in Codeunit 1, CompanyOpen, and when the request is finishing the server executes the CompanyClose trigger.  If the request is for a Page object then all the page triggers are executed: OnInit, OnOpenPage and so on.

OData will only show the fields added to the Page or to the Query, plus the table's primary key fields.  Let's take a closer look at how the Page object works with OData.

In the OnOpenPage trigger and in the SourceTableView property it is possible to add filters.  These filters will apply to the OData stream and you will not be able to insert data into NAV outside of these filters, the same functionality as if you were using the Page in the Windows Client.  The properties Editable, InsertAllowed, ModifyAllowed and DeleteAllowed all work with OData.  Let's for example look at Page 39, General Journal.  In the OnOpenPage trigger the code filters the table data by a Template Name and a Batch Name.

[Image: Page39]

From OData, the variable OpenedFromBatch will always be False, so the first template and batch for PAGE::"General Journal" will always be selected and new entries will always be inserted into that journal.  This also means that it is not possible to use Page 39 to read the journal lines from any journal other than the first one.  Fields that are not visible in a Page are still visible and usable in OData.

When creating a new record the code in the OnNewRecord trigger is executed.

[Image: OnNewRecord]

This will all work fine for the first template and batch.  The AutoSplitKey property is also active, so as long as you are fine with inserting into the default journal you can use this Page for your OData web service.

The easiest way is still to create a new page dedicated to the web service functionality: show the primary key fields in the page and skip the standard OnOpenPage and OnNewRecord code.  I use the OnNewRecord trigger to put in default values for the table I am inserting into.
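
For illustration, the OnNewRecord trigger of such a dedicated page on the Gen. Journal Line table could be as simple as this sketch; the template and batch names are assumptions:

[code]
OnNewRecord(BelowxRec : Boolean)
BEGIN
  "Journal Template Name" := 'GENERAL';  // assumed journal template
  "Journal Batch Name" := 'WEBSERVICE';  // assumed journal batch
  SetUpNewLine(xRec,0,BelowxRec);        // same table function the standard page uses
END;
[/code]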

The walk-through on the Microsoft web site shows how to create a new customer, look for a customer and modify a customer.

I have found that I want to add one line to that example:

NAV nav = new NAV(new Uri("http://localhost:7048/DynamicsNAV/OData/Company('CRONUS%20International%20Ltd.')"));
nav.Credentials = CredentialCache.DefaultNetworkCredentials; 
nav.IgnoreResourceNotFoundException = true;

Without IgnoreResourceNotFoundException, the following code example will throw an exception if the customer is not found within the given filter.

private static Customer GetCustomer(NAV nav, string customerNo)
{
    var customers = (from c in nav.Customer
                     where c.No == customerNo
                     select c);
    foreach (Customer customer in customers)
        return customer;
    return null;
}

By combining a Get function like this with a New or Modify function it is easy to update the existing value for any given table in the database.

private static Boolean AddNewDefaultDimensionCodeValue(NAV nav, int tableNo, string no, string dimensionCode, string dimensionCodeValue)
{
    DefaultDimensions existingDefaultDimension = GetDefaultDimension(nav, tableNo, no, dimensionCode);
    if (existingDefaultDimension == null)
    {
        DefaultDimensions newDefaultDimension = new DefaultDimensions();
        newDefaultDimension.Table_ID = tableNo;
        newDefaultDimension.No = no;
        newDefaultDimension.Dimension_Code = dimensionCode;
        newDefaultDimension.Dimension_Value_Code = dimensionCodeValue;
        nav.AddToDefaultDimensions(newDefaultDimension);
        nav.SaveChanges();
        return true;
    }
    else
    {
        existingDefaultDimension.Dimension_Value_Code = dimensionCodeValue;
        nav.UpdateObject(existingDefaultDimension);
        nav.SaveChanges();
        return false;
    }
}

private static DefaultDimensions GetDefaultDimension(NAV nav, int tableNo, string no, string dimensionCode)
{
    var dimensionValues = (from d in nav.DefaultDimensions
                           where d.Table_ID == tableNo && d.No == no && d.Dimension_Code == dimensionCode
                           select d);
    foreach (DefaultDimensions dimensionValue in dimensionValues)
        return dimensionValue;
    return null;
}

Remember that without SaveChanges nothing will be updated in NAV.

Now go ahead and use OData to integrate NAV with all your external systems and devices.  Good luck.

Using Views and Linked Tables

There are always cases where you would like to have external data available in Dynamics NAV.  When the data is available to the SQL Server hosting the Dynamics NAV database, it is easy to create a view in the NAV database and a linked table in Dynamics NAV.  The process is:

Create a normal table in Dynamics NAV and save it.  Make sure that you select to use the same table for all companies.

[Image: TableProperties1]

Next, go to SQL Management Studio and build a CREATE TABLE script for this table.  When you write the CREATE VIEW script you must make sure that all fields are identical and have matching collation.  I begin by dropping the table and then building the view.

[code lang="sql"]USE [NAV 2013 R2 Company]
GO

/****** Object: Table [dbo].[National Register] Script Date: 13.9.2013 09:35:01 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

DROP TABLE [dbo].[National Register]
GO

CREATE VIEW [dbo].[National Register] AS SELECT
[No_] COLLATE Icelandic_100_CS_AS AS [No_]
,[Type]
,[Family No_] COLLATE Icelandic_100_CS_AS AS [Family No_]
,[Name] COLLATE Icelandic_100_CS_AS AS [Name]
,[Sorting] COLLATE Icelandic_100_CS_AS AS [Sorting]
,[Community] COLLATE Icelandic_100_CS_AS AS [Community]
,[Gender]
,[Martial Status]
,[Mate No_] COLLATE Icelandic_100_CS_AS AS [Mate No_]
,[Post Code] COLLATE Icelandic_100_CS_AS AS [Post Code]
,[Address] COLLATE Icelandic_100_CS_AS AS [Address]
,[Guardian No_] COLLATE Icelandic_100_CS_AS AS [Guardian No_]
,[Industry Code] COLLATE Icelandic_100_CS_AS AS [Industry Code]
,[Industry] COLLATE Icelandic_100_CS_AS AS [Industry]
,[Address Code] COLLATE Icelandic_100_CS_AS AS [Address Code]
,[identification No_] COLLATE Icelandic_100_CS_AS AS [identification No_]
,[Community Sorting] COLLATE Icelandic_100_CS_AS AS [Community Sorting]
,[Legal Home Code] COLLATE Icelandic_100_CS_AS AS [Legal Home Code]
,[Blocking Code] COLLATE Icelandic_100_CS_AS AS [Blocking Code]
,[Country] COLLATE Icelandic_100_CS_AS AS [Country]
,[Place of Birth] COLLATE Icelandic_100_CS_AS AS [Place of Birth]
,[Date of Birth]
,[Address Code 2] COLLATE Icelandic_100_CS_AS AS [Address Code 2]
,[Last Local Address] COLLATE Icelandic_100_CS_AS AS [Last Local Address]
,[Proxy No_] COLLATE Icelandic_100_CS_AS AS [Proxy No_]
,[Church] COLLATE Icelandic_100_CS_AS AS [Church]
,[Address to] COLLATE Icelandic_100_CS_AS AS [Address to]
,[Status] COLLATE Icelandic_100_CS_AS AS [Status]
,[Deport Date]
,[Post Address] COLLATE Icelandic_100_CS_AS AS [Post Address]
,[c_o Name] COLLATE Icelandic_100_CS_AS AS [c_o Name]
,[c_o Address] COLLATE Icelandic_100_CS_AS AS [c_o Address]
,[c_o Post Code] COLLATE Icelandic_100_CS_AS AS [c_o Post Code]
,[c_o Post Address] COLLATE Icelandic_100_CS_AS AS [c_o Post Address]
,'' COLLATE Icelandic_100_CS_AS AS [VAT No_]
,[Registered By] COLLATE Icelandic_100_CS_AS AS [Registered By]
FROM [NAV 2009 R2 Company].[dbo].[NR Data]
[/code]

The next step is to modify the table properties and set this table to be a linked table.

[Image: TableProperties2]

Now NAV 2013 R2 has access to all the data in the table.  The user running the NAV Service must have read access to the original table.  Make sure that all pages that use this table are read-only, as the data can't be modified through a view.

In NAV 2013 R2 there is an issue that we need to be aware of.  The NAV Server caches the data in these tables, so we can get different results in NAV 2013 R2 than expected.  My contacts tell me that this issue has been fixed in the upcoming NAV vNext, and I created a Microsoft Connect suggestion to have it fixed in NAV 2013 R2.  Until that solution is available, the workaround is to use SELECTLATESTVERSION in your code before using the linked data.  Here is an example function that reads a plain text password from a linked database and compares it to an incoming MD5 password.  This issue does not seem to affect pages; in other words, the linked data displayed in pages looks up to date.

 

[code]
PROCEDURE ValidateLogin@50011(ChurnNo@50000 : Code[10];MD5Password@50001 : Text[80]) : Boolean;
VAR
  ChurnNoRegister@50002 : Record 10017450;
  MD5@50003 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Security.Cryptography.MD5";
  Encoding@50004 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Text.Encoding";
  HashBytes@50005 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Array";
  Data@50009 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Byte";
  StringBuilder@50008 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Text.StringBuilder";
  StringComparer@50006 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.StringComparer";
  Loop@50007 : Integer;
BEGIN
  SELECTLATESTVERSION;
  IF NOT ChurnNoRegister.GET(ChurnNo) THEN EXIT(FALSE);
  IF ChurnNoRegister."Farmer Web Password" = '' THEN EXIT(FALSE);
  MD5 := MD5.Create;
  HashBytes := MD5.ComputeHash(Encoding.UTF8.GetBytes(ChurnNoRegister."Farmer Web Password"));
  StringBuilder := StringBuilder.StringBuilder;
  FOR Loop := 0 TO HashBytes.Length - 1 DO BEGIN
    Data := HashBytes.GetValue(Loop);
    StringBuilder.Append(Data.ToString('x2'));
  END;
  StringComparer := StringComparer.OrdinalIgnoreCase;
  EXIT(StringComparer.Compare(MD5Password,StringBuilder.ToString) = 0);
END;
[/code]

 

Importing and Exporting Data in Microsoft Dynamics NAV 2013 R2 CU 8

Two weeks ago Microsoft released CU 8 for Dynamics NAV 2013 R2.  This update included the long-awaited Import and Export that replaces the old FBK backup in the Classic Client.

There are two ways to do import and export.  One is through the Windows Client

[Image: ImportExport]

and the other is with PowerShell commands Export-NAVData and Import-NAVData.  You can read all about this on the Microsoft Dynamics NAV Team Blog by clicking here.

Compared to the old Classic Client FBK backup this is very fast.  On the other hand, the new import functionality is very strict about the database schema.  It needs to be exactly the same in the databases used for the export and the import.

I was in the process of upgrading a company from NAV 5.0 SP1 and had planned to add the company to an existing NAV 2013 R2 database which already included a few companies.  This was not as easy as you might think.  Here are the issues that I encountered and how they were solved.

After I finished the upgrade I made sure that all objects in the upgraded database were identical to those in the destination database.  I did this by exporting all objects from the destination database into the upgraded database.  I used a merge tool on the exported text object files to make sure.

At this point I was still not able to import the data file and asked Microsoft for help.  I got all the help I needed, and the first thing to verify was that all the tables were identical on the SQL level.  This was done with this SQL command.

[code lang="sql"]select db1.[Object ID] "DB1 Table ID", db1.Name "DB1 Table Name", db1.Hash "DB1 Metadata Hash", db2.[Object ID] "DB2 Table ID", db2.Name "DB2 Table Name", db2.Hash "DB2 Metadata Hash"
from [Destination Database].[dbo].[Object Metadata Snapshot] db1
full outer join [Source Database].[dbo].[Object Metadata Snapshot] db2 on db1.[Object ID] = db2.[Object ID][/code]

I needed to make sure that the “DB1 Metadata Hash” was identical to the “DB2 Metadata Hash” for all tables.  This was not true, even though all objects were identical.  After recompiling all tables in both databases I saw that I now had identical hashes.

This was not enough.  Since I had upgraded the source database from an earlier NAV version, the database still held all the old virtual table snapshots.  We needed to remove them with this SQL command.

[code lang="sql"]delete from [Source Database].[dbo].[Object Metadata Snapshot]
where [Object ID] in
(2000000007, 2000000026, 2000000049, 2000000101, 2000000102, 2000000103, 2000000020, 2000000055, 2000000009, 2000000029, 2000000038, 2000000058, 2000000041, 2000000028, 2000000063, 2000000022, 2000000048, 2000000040, 2000000043, 2000000044, 2000000039, 2000000045)
[/code]

After this I successfully exported and imported the upgraded company into the destination database.  One more thing I noticed: I needed to restart other service instances in order to see the newly imported company.

This new functionality opens a lot of possibilities for development and testing.  With PowerShell scripts we can now easily create a backup of our live company data and import it into our test and/or development environment.

In both the Windows Client and PowerShell it is possible to copy, rename and delete companies.  I know that the import process takes longer than the export process.  I tried the export process on a 500 GB database and it took 2 hours and 27 minutes.  In the Classic Client this would have taken days.

I expect to see this functionality improve in future releases and also expect Microsoft or some of the community leaders to publish scripts to automate a data refresh in a testing environment.

Change table layout and move data

In older versions of NAV I got used to using a report to move data from one table to another.  Since the last update from Microsoft for NAV 2013 R2 I am using the SQL Management Studio more and more.

For example, yesterday, I needed to change a table layout that included a change in the primary key.

I had modified the table in the development database but was unable to import the new table because of the existing data.

I solved this by creating a temporary table with the original layout.  I can choose any table number that my development license allows, since I will not be using the table in NAV.  I designed the original table and used File / Save As to create the new table.

Then I moved over to SQL Management Studio.  There I found the original table and scripted a SELECT statement into a new query window.  Next I found the newly created shadow table and scripted an INSERT statement to the clipboard.  Then I combined these two statements to look like this.

[code lang="sql"]USE [NAV_DATABASE]
GO

INSERT INTO [dbo].[NAV Company$Temp Partner Inbound Mapping]
([IC Partner Code]
,[Responsibility Center]
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code])
SELECT
[IC Partner Code]
,[Responsibility Center]
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code]
FROM [dbo].[NAV Company$IC Partner Inbound Mapping]
GO
DELETE FROM [dbo].[NAV Company$IC Partner Inbound Mapping]
[/code]

At this stage I had deleted all data from the table and was able to import the new, modified table into the production NAV database.  In this case I have two new fields in the new table.  The primary key changed to include a line number, which meant that I needed to create a line number for each row.  This can be done with SQL.

[code lang="sql"]USE [NAV_DATABASE]
GO

INSERT INTO [dbo].[NAV Company$IC Partner Inbound Mapping]
([IC Partner Code]
,[Line No_]
,[Responsibility Center]
,[Ship-to Code]
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code])
SELECT
[IC Partner Code]
,10000 * ROW_NUMBER() OVER(ORDER BY [IC Partner Code]) AS Row
,[Responsibility Center]
,''
,[Ship-to Name]
,[Ship-to Address]
,[Ship-to Address 2]
,[Ship-to City]
,[Inbound Mapping Group Code]
FROM [dbo].[NAV Company$Temp Partner Inbound Mapping]
GO

DELETE FROM [dbo].[NAV Company$Temp Partner Inbound Mapping][/code]

If I had needed to start the row numbering at 10000 for every instance of IC Partner Code, I would have had to use a cursor, iterate through every instance of IC Partner Code from another table, and execute the statement filtered by the IC Partner Code.  I used that approach, for example, in the Azure Backup Script.

I finished this process by deleting the temporary table from the production database.