Localized views removed from NAV 2013 R2

In Dynamics NAV 2009 and older we had an option to maintain views in the database.  Activating this option caused NAV to create a view for every table in the database, for each installed language.

This is a useful thing if you are building BI on top of your SQL and would like to skip all the translation work.

This however had some drawbacks.

  • Views are maintained for every table in the database
  • Option values are shown as numbers
  • Boolean values are shown as numbers
  • Global dimensions do not have the correct captions
  • Time is shown as DateTime
  • It is not easy to see the difference between a normal date and a closing date

I already discussed this in a post back in 2012.  Because of this I never used this built-in option.  The solution I created back in 2012 has been upgraded and is now ready and working in NAV 2013 R2.

SQLViewMenu

I start by exporting all tables from the Object Designer as a text file.  This text file is imported with the task “Read Option Description”.  All option captions from the import are stored in a table in the database.

SQLBIDates

Dates are created according to the Accounting Periods.  The BI Dates table keeps multiple fields with information about the date.  By joining this table to your SQL query or adding it as a dimension to your OLAP cube you have a lot of options to handle dates and periods.
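To illustrate, a query along these lines groups ledger amounts by fiscal period through the BI Dates view.  The G/L Entry view name and the company name here are assumptions taken from my demo data; adjust both to your own setup.

```sql
-- Sketch only: monthly totals per fiscal period, joining a ledger view
-- to the BI Dates view on the posting date.  The G/L Entry view name is
-- an assumption - it only exists if that table was added to the list.
SELECT
 d.[Fiscal Year]
,d.[Fiscal Month]
,SUM(gl.[Amount]) AS [Amount]
FROM [dbo].[ENU$Kappi ehf_$G_L Entry] gl
INNER JOIN [dbo].[ENU$Kappi ehf_$BI Dates] d
 ON d.[Date] = gl.[Posting Date]
GROUP BY d.[Fiscal Year],d.[Fiscal Month]
ORDER BY d.[Fiscal Year],d.[Fiscal Month]
```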

SQLSelectedTables

The tables required for the views are added to a list.  Remember to add the BI Dates table to the list of tables to be used in your business intelligence tool.

SQLCreate

NAV is now ready to create the SQL commands to create the views.  The prefix can be used if the views are to be kept in a separate database, which I suggest you do.  A prefix can for example be “[Demo Database NAV (7-1)].[dbo].”

SQLExportFile

The file for SQL Management Studio is now ready for execution.

[code lang="sql"]IF EXISTS(SELECT * FROM sys.views WHERE object_id = OBJECT_ID(N'dbo.ENU$Kappi ehf_$Currency'))
DROP VIEW [dbo].[ENU$Kappi ehf_$Currency]
GO
CREATE VIEW [dbo].[ENU$Kappi ehf_$Currency]
AS
SELECT
[Code] As [Code]
,[Last Date Modified] As [Last Date Modified]
,[Last Date Adjusted] As [Last Date Adjusted]
,[Unrealized Gains Acc_] As [Unrealized Gains Acc_]
,[Realized Gains Acc_] As [Realized Gains Acc_]
,[Unrealized Losses Acc_] As [Unrealized Losses Acc_]
,[Realized Losses Acc_] As [Realized Losses Acc_]
,[Invoice Rounding Precision] As [Invoice Rounding Precision]
,[Invoice Rounding Type] As [Invoice Rounding Type]
,[Amount Rounding Precision] As [Amount Rounding Precision]
,[Unit-Amount Rounding Precision] As [Unit-Amount Rounding Precision]
,[Description] As [Description]
,[Amount Decimal Places] As [Amount Decimal Places]
,[Unit-Amount Decimal Places] As [Unit-Amount Decimal Places]
,[Realized G_L Gains Account] As [Realized G_L Gains Account]
,[Realized G_L Losses Account] As [Realized G_L Losses Account]
,[Appln_ Rounding Precision] As [Appln_ Rounding Precision]
,CASE [EMU Currency]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [EMU Currency]
,[Currency Factor] As [Currency Factor]
,[Residual Gains Account] As [Residual Gains Account]
,[Residual Losses Account] As [Residual Losses Account]
,[Conv_ LCY Rndg_ Debit Acc_] As [Conv_ LCY Rndg_ Debit Acc_]
,[Conv_ LCY Rndg_ Credit Acc_] As [Conv_ LCY Rndg_ Credit Acc_]
,[Max_ VAT Difference Allowed] As [Max_ VAT Difference Allowed]
,[VAT Rounding Type] As [VAT Rounding Type]
,[Payment Tolerance _] As [Payment Tolerance _]
,[Max_ Payment Tolerance Amount] As [Max_ Payment Tolerance Amount]
,[Wage Amount Rounding Precision] As [Wage Amount Rounding Precision]
,[Wage Rate Rounding Precision] As [Wage Rate Rounding Precision]
,[Wage Amount Decimal Places] As [Wage Amount Decimal Places]
,[Wage Rate Decimal Places] As [Wage Rate Decimal Places]
FROM [Kappi ehf_$Currency]
GO

IF EXISTS(SELECT * FROM sys.views WHERE object_id = OBJECT_ID(N'dbo.ENU$Kappi ehf_$BI Dates'))
DROP VIEW [dbo].[ENU$Kappi ehf_$BI Dates]
GO
CREATE VIEW [dbo].[ENU$Kappi ehf_$BI Dates]
AS
SELECT
[Date] As [Date]
,[Date Name] As [Date Name]
,[Year] As [Year]
,[Week] As [Week]
,[Month] As [Month]
,[Month Name] As [Month Name]
,[Day of Week] As [Day of Week]
,[Day Name] As [Day Name]
,[Day of Month] As [Day of Month]
,CASE [Closing Date]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Closing Date]
,[SQL Month Name] As [SQL Month Name]
,[SQL Day Name] As [SQL Day Name]
,[Quarter] As [Quarter]
,[Year Month] As [Year Month]
,[Year Month Name] As [Year Month Name]
,[Month Year Name] As [Month Year Name]
,[Quarter Name] As [Quarter Name]
,[VAT Period] As [VAT Period]
,[VAT Period Name] As [VAT Period Name]
,[Sorting Date] As [Sorting Date]
,[HRMS Integer Start] As [HRMS Integer Start]
,[HRMS Integer End] As [HRMS Integer End]
,[Day of Year] As [Day of Year]
,[Day of Half Year] As [Day of Half Year]
,[Day of Quarter] As [Day of Quarter]
,[Day of Accounting] As [Day of Accounting]
,[Half Years] As [Half Years]
,[Half Year of Year] As [Half Year of Year]
,CASE [Is Holiday]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Is Holiday]
,CASE [Is Working Day]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Is Working Day]
,[Month of Half Year] As [Month of Half Year]
,[Month of Quarter] As [Month of Quarter]
,[Month of Year] As [Month of Year]
,[Quarters of Half Year] As [Quarters of Half Year]
,[Quarters of Year] As [Quarters of Year]
,[Week of Year] As [Week of Year]
,CASE [Is Week Day]
WHEN 1 THEN 'Yes'
WHEN 0 THEN 'No'
END As [Is Week Day]
,[Half Year Name] As [Half Year Name]
,[Week Name] As [Week Name]
,[Fiscal Day] As [Fiscal Date]
,[Fiscal Year] As [Fiscal Year]
,[Fiscal Week] As [Fiscal Week]
,[Fiscal Month] As [Fiscal Month]
,[Fiscal Day of Week] As [Fiscal Day of Week]
,[Fiscal Day of Month] As [Fiscal Day of Month]
,[Fiscal Quarter] As [Fiscal Quarter]
,[Fiscal Day of Year] As [Fiscal Day of Year]
,[Fiscal Day of Half Year] As [Fiscal Day of Half Year]
,[Fiscal Day of Quarter] As [Fiscal Day of Quarter]
,[Fiscal Half Years] As [Fiscal Half Years]
,[Fiscal Half Year of Year] As [Fiscal Half Year of Year]
,[Fiscal Month of Half Year] As [Fiscal Month of Half Year]
,[Fiscal Month of Quarter] As [Fiscal Month of Quarter]
,[Fiscal Month of Year] As [Fiscal Month of Year]
,[Fiscal Quarters of Half Year] As [Fiscal Quarters of Half Year]
,[Fiscal Quarters of Year] As [Fiscal Quarters of Year]
,[Fiscal Week of Half Year] As [Fiscal Week of Half Year]
,[Fiscal Week of Month] As [Fiscal Week of Month]
,[Fiscal Week of Quarter] As [Fiscal Week of Quarter]
,[Fiscal Week of Year] As [Fiscal Week of Year]
,[Fiscal Quarter Name] As [Fiscal Quarter Name]
,[Fiscal Half Year Name] As [Fiscal Half Year Name]
,[Fiscal Week Name] As [Fiscal Week Name]
,[Fiscal Month Name] As [Fiscal Month Name]
FROM [Kappi ehf_$BI Dates]
GO
[/code]

We now have a localized version of our table data and all the above drawbacks have been fixed.  Now go ahead and build your reports or your OLAP cube on top of these views.  You can use the BI Dates table for your filter area in the reports to make sure that only available periods are selected.

Download objects.

 

The Next Upgrade Story – Data transfer code generation tool

Guess what, more companies are moving to NAV 2013 R2.

I am about to start a data upgrade for a medium-sized Icelandic company.  Doing a data transfer means comparing the tables of the old version and the new version, and wherever there is a mismatch you have to take some action.  The data transfer is built on the upgrade from 2009 to 2013 R2 released by Microsoft.

We have the MergeTool that is used to compare and merge code and do a lot of other cool things.  I wanted a tool that did something else, so I started designing and writing one.  I am hoping that the result will save me a lot of work in future upgrades.

UpgradeMenu

This tool imports exported object files into a table version.  Then I can compare two table versions and get all the differences listed.  I also import the CRONUS tables for each version to make sure that I am not comparing built-in fields.

TableVersions

I give the comparison an identification and select the source and destination versions.  Then I hit Compare.  Within seconds I get the result.

CompareResult3

I have to make decisions on what to do with modified and deleted tables.  I can list all the tables and check the field comparison.

Fields

I can select both step 1 and step 2 actions for each table.  For new fields I can select a source field from the source table.  The example I show here is the customer table.  In step 1 I choose to copy all the needed fields to a temporary table.  The tool automatically finds the next available upgrade table.  In step 2 I choose to move the data from the temporary table into the destination version.  I can specify that a new field gets its value from another field in the temporary table.  In this example the Registration No. field has a new ID.

The best part is yet to come.  To really save some work I make the tool create all the needed NAV upgrade objects.  For step 1 I get all the upgrade tables with only required fields.

[code]OBJECT Table 66001 Temp Customer
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
FIELDS
{
{ 1 ; ;No. ;Code20 }
{ 10010400;;Registration No. ;Code20 }
{ 10017241;;Require Purchase Request;Boolean }
{ 10017290;;Freight Paid By ;Option ;OptionString=[Recipient,Prepaid,Senders Account,See Ship-to Address] }
}
KEYS
{
{ ;No. ;Clustered=Yes }
}
FIELDGROUPS
{
}
CODE
{

BEGIN
{
Dynamics.is Upgrade Table
}
END.
}
}

OBJECT Table 66002 Temp Cust. Ledger Entry
{

[/code]

The codeunit required to make the data manipulation.

[code]OBJECT Codeunit 66000 Upgrade FurblandanR
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
CODE
{
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
BEGIN
// Call this function from the top of the Upgrade trigger in Codeunit ID 104045
CopyCustomer(StateIndicator);
CopyCustLedgerEntry(StateIndicator);

MoveCashierBankBranch(StateIndicator);

ForcePaymentTerms(StateIndicator);
ForceSalesHeader(StateIndicator);

END;

LOCAL PROCEDURE CopyCustomer@2(VAR StateIndicator@1002 : Record 104037);
VAR
Customer@1000 : Record 18;
TempCustomer@1001 : Record 66001;
BEGIN
WITH Customer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
CALCFIELDS("Picture");
TempCustomer.INIT;
TempCustomer.TRANSFERFIELDS(Customer);
TempCustomer.INSERT;
UNTIL NEXT = 0;
MODIFYALL("Registration No.",'');
MODIFYALL("Require Purchase Request",FALSE);
MODIFYALL("Freight Paid By",0);
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE CopyCustLedgerEntry@3(VAR StateIndicator@1002 : Record 104037);
VAR
CustLedgerEntry@1000 : Record 21;
TempCustLedgerEntry@1001 : Record 66002;
BEGIN
WITH CustLedgerEntry DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
TempCustLedgerEntry.INIT;
TempCustLedgerEntry.TRANSFERFIELDS(CustLedgerEntry);
TempCustLedgerEntry.INSERT;
UNTIL NEXT = 0;
MODIFYALL("Final Due Date",0D);
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE MoveCashierBankBranch@26(VAR StateIndicator@1002 : Record 104037);
VAR
CashierBankBranch@1000 : Record 10010402;
TempCashierBankBranch@1001 : Record 10000202;
BEGIN
WITH CashierBankBranch DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
TempCashierBankBranch.INIT;
TempCashierBankBranch.TRANSFERFIELDS(CashierBankBranch);
TempCashierBankBranch.INSERT;
UNTIL NEXT = 0;
DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE ForceFixedAsset@48(VAR StateIndicator@1002 : Record 104037);
VAR
FixedAsset@1000 : Record 5600;
BEGIN
WITH FixedAsset DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDFIRST THEN BEGIN
StateIndicator.Update;
MODIFYALL("Straight-Line %",0);
MODIFYALL("Depreciation Starting Date",0D);
MODIFYALL("Biling Contract No.",'');
END;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;


[/code]

And to complete step 1 the addition needed to the DeleteDiscontinuedTables trigger.

[code]// Add these lines to the DeleteDiscontinuedTables trigger of Codeunit ID 104002
DeleteTable(72000);
DeleteTable(72001);
DeleteTable(72003);
DeleteTable(87400);
…
[/code]

Then there is the upgrade codeunit for step 2

[code]OBJECT Codeunit 66001 Upgrade FurblandanR
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
CODE
{
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
BEGIN
// Call this function from the top of the Upgrade trigger in Codeunit ID 104048
CopyBankingWebServicesUser(StateIndicator);
MoveCustomer(StateIndicator);
…
END;

LOCAL PROCEDURE CopyBankingWebServicesUser@2(VAR StateIndicator@1002 : Record 104037);
VAR
TempCashierWebServicesUser@1000 : Record 66022;
BankingWebServicesUser@1001 : Record 10010428;
BEGIN
WITH TempCashierWebServicesUser DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
"BankingWebServicesUser".INIT;
"BankingWebServicesUser"."Authentication Code" := "Authentication Code";
"BankingWebServicesUser"."User ID" := "User ID";
"BankingWebServicesUser"."Certificate Key Identifier" := "Certificate Key Identifier";
//"BankingWebServicesUser"."User ID Store" := ;
//"BankingWebServicesUser"."Password Store" := ;
//"BankingWebServicesUser"."Certificate Location" := ;
//"BankingWebServicesUser"."Certificate Value" := ;
"BankingWebServicesUser".INSERT;
UNTIL NEXT = 0;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE MoveCustomer@3(VAR StateIndicator@1002 : Record 104037);
VAR
TempCustomer@1000 : Record 66001;
Customer@1001 : Record 18;
BEGIN
WITH TempCustomer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
"Customer".GET("No.");
"Customer"."Registration No." := "Registration No.";
//"Customer"."Momentum Claims Active" := ;
"Customer".MODIFY;
UNTIL NEXT = 0;
DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;


[/code]

And to complete the process the upgrade tables are marked to be deleted.

[code]// Add these lines to the OnRun trigger of Codeunit ID 104003
MarkTable(66001);
MarkTable(66002);
…[/code]

This tool is in testing and I am planning to add support for the NAV 2015 upgrade toolkit.

New Developer Tools for Dynamics NAV 2013 R2

Cumulative update 9 for Dynamics NAV 2013 R2 has been released.  This new version includes a PowerShell tool package for developers.

You can read about these tools in the NAV Team Blog, on Soren’s Blog and on Waldo’s Blog.

With every CU package we get an upgrade for the application.  I have a customer that is already running CU8 and I want to install CU9.  This time I will do that with these new PowerShell tools.

The first step is to install the new CU9 binaries.  This I will do with the PowerShell scripts from here.  This will update the Windows client files, web client files, server files, Outlook integration and ClickOnce deployment.  On my development machine I needed to copy the RTC client files from the CU9 DVD.  The new model tools were not included in the KB RTC folder.

Then to export the Dynamics NAV objects.

I use my CU8 version demo database and export all objects to cu8objects.txt.  I export all objects from the customized database to cu8customizedobjects.txt.  I finally use my CU9 version demo database and export all objects to cu9objects.txt.

Open PowerShell ISE and navigate to the folder with the object files.

dir
Directory: N:\nav\cu9
Mode                LastWriteTime     Length Name
----                -------------     ------ ----
-a---         14.7.2014     09:32  154023304 cu8customizedobjects.txt
-a---         14.7.2014     09:27  120451925 cu8objects.txt
-a---         14.7.2014     09:35  120463442 cu9objects.txt

Next I import the new model tools.

Import-Module "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\71\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1" -force

I create a folder called cu8tocu9diff and execute

Compare-NAVApplicationObject -Original .\cu8objects.txt -Modified .\cu9objects.txt -Delta .\cu8tocu9diff
Processed 4036 objects:
 Inserted   0 objects
 Deleted    0 objects
 Changed    64 objects
 Identical  3972 objects
 Failed     0 objects

I see in my cu8tocu9diff folder that I have 4.036 objects and 3.972 of them are not needed.  I deleted all the files from the cu8tocu9diff folder and executed

Compare-NAVApplicationObject -Original .\cu8objects.txt -Modified .\cu9objects.txt -Delta .\cu8tocu9diff | Where-Object CompareResult -eq 'Identical' | foreach {  Remove-Item (Join-Path .\cu8tocu9diff ($_.ObjectType.substring(0,3) + $_.Id + '.delta')) }

The result is that I only have the 64 delta files needed in the cu8tocu9diff folder.

Now I need the same 64 objects from the customized database.  I begin by splitting the exported object file into a new folder.

Split-NAVApplicationObjectFile .\cu8customizedobjects.txt cu8customized

I now have all 4.036 objects in the cu8customized folder.  I delete all unneeded files by executing

Get-ChildItem -Path .\cu8customized | foreach { if (!(Test-Path ((Join-Path .\cu8tocu9diff $_.BaseName) + '.delta'))) { Remove-Item $_.FullName } }

I am now in a position to update the customized objects with the CU9 changes.

Update-NAVApplicationObject -Target .\cu8customized -Delta .\cu8tocu9diff -Result cu9customized.txt -DateTimeProperty FromModified -ModifiedProperty FromModified -VersionListProperty FromModified -DocumentationConflict ModifiedFirst
Summary:
Completed the update. Processed 64 application object(s) with a total of 190 individual change(s).
100,0% of the individual changes were automatically applied during the update.
Details:
Processed 64 application object(s):
Updated      64 objects - with changes in DELTA that were successfully merged with any changes from TARGET
into RESULT.
Conflict      0 objects - with changes in both DELTA and TARGET that could only be partially updated.
The partially updated objects and the corresponding .CONFLICT files are added to RESULT.
This also includes objects that are deleted in DELTA and changed in TARGET.
Inserted      0 objects - that exist in DELTA that do not exist in TARGET and are inserted into RESULT.
Deleted       0 objects - that are deleted in DELTA and exist in TARGET.
Unchanged     0 objects - that exist in TARGET, are not changed in DELTA, and are copied from TARGET to RESULT.
Failed        0 objects - that could not be imported, such as an object that is not valid or that contains
unsupported features.
Processed 190 changes:
Conflict      0 changes
Updated   100,0% of all changes

When I updated the customized objects with these 64 changes I got objects with a new version list.  The version list needs to be merged, and for that I use a script written by NAV MVP Kamil Sacek.  The script is attached here as the Merge-NAVVersionListString script, and his scripts are available on CodePlex.  After copying the script to my base folder I need to import the new functions.

Import-Module '.\Merge-NAVVersionListString script.ps1'

And to merge the version list for the new objects I use

Get-ChildItem -Path .\cu9customized | foreach { Set-NAVApplicationObjectProperty -Target $_.FullName -VersionListProperty (Merge-NAVVersionListString -source (Get-NAVApplicationObjectProperty -Source $_.FullName).VersionList -target (Get-NAVApplicationObjectProperty -Source (Join-Path .\cu8customized $_.Name)).VersionList) }

I now have 64 updated objects that I can import into my customized database and compile.  First I want to join all these 64 files into one object file.

Join-NAVApplicationObjectFile -Source .\cu9customized -Destination cu9customizeobjects.txt

When I imported the new customized objects I got an error, because there were new fields in tables that I did not have permission to create.  To solve this I begin by importing the URObjects.IS.37221.fob file that is included in the cumulative update package.  I accept the default action so all new fields are merged into the existing tables.

Scripts are available to export and import NAV objects so this whole process can be scripted and automated.
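As a rough sketch, the whole sequence could be wrapped up like this.  It assumes the Export-NAVApplicationObject and Import-NAVApplicationObject scripts from the CU9 development tools package are available; the database names are placeholders and the parameter names may differ between builds.

```powershell
# Sketch only - database names are placeholders and the export/import
# cmdlets come from the CU9 sample scripts; verify their parameters first.
Import-Module "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\71\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1" -Force

# Export all objects from the customized CU8 database
Export-NAVApplicationObject -DatabaseName 'CU8 Customized' -Path .\cu8customizedobjects.txt

# Run the compare, update and merge steps shown above, then
# import the merged result back into the customized database
Import-NAVApplicationObject -DatabaseName 'CU8 Customized' -Path .\cu9customizeobjects.txt
```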

Using Views and Linked Tables

There are always some cases where you would like to have external data available to Dynamics NAV.  In cases where the data is available to the SQL Server hosting the Dynamics NAV database, it is easy to create a view in the NAV database and a linked table in Dynamics NAV.  The process is:

Create a normal table in Dynamics NAV and save it.  Make sure that you select to use the same table for all companies.

TableProperties1

Next go to SQL Management Studio and build a CREATE TABLE script for this table.  When you write the CREATE VIEW script you must make sure that all fields are identical and have matching collation.  I begin by dropping the table and then building the view.

[code lang="sql"]USE [NAV 2013 R2 Company]
GO

/****** Object: Table [dbo].[National Register] Script Date: 13.9.2013 09:35:01 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

DROP TABLE [dbo].[National Register]
GO

CREATE VIEW [dbo].[National Register] AS SELECT
[No_] COLLATE Icelandic_100_CS_AS AS [No_]
,[Type]
,[Family No_] COLLATE Icelandic_100_CS_AS AS [Family No_]
,[Name] COLLATE Icelandic_100_CS_AS AS [Name]
,[Sorting] COLLATE Icelandic_100_CS_AS AS [Sorting]
,[Community] COLLATE Icelandic_100_CS_AS AS [Community]
,[Gender]
,[Martial Status]
,[Mate No_] COLLATE Icelandic_100_CS_AS AS [Mate No_]
,[Post Code] COLLATE Icelandic_100_CS_AS AS [Post Code]
,[Address] COLLATE Icelandic_100_CS_AS AS [Address]
,[Guardian No_] COLLATE Icelandic_100_CS_AS AS [Guardian No_]
,[Industry Code] COLLATE Icelandic_100_CS_AS AS [Industry Code]
,[Industry] COLLATE Icelandic_100_CS_AS AS [Industry]
,[Address Code] COLLATE Icelandic_100_CS_AS AS [Address Code]
,[identification No_] COLLATE Icelandic_100_CS_AS AS [identification No_]
,[Community Sorting] COLLATE Icelandic_100_CS_AS AS [Community Sorting]
,[Legal Home Code] COLLATE Icelandic_100_CS_AS AS [Legal Home Code]
,[Blocking Code] COLLATE Icelandic_100_CS_AS AS [Blocking Code]
,[Country] COLLATE Icelandic_100_CS_AS AS [Country]
,[Place of Birth] COLLATE Icelandic_100_CS_AS AS [Place of Birth]
,[Date of Birth]
,[Address Code 2] COLLATE Icelandic_100_CS_AS AS [Address Code 2]
,[Last Local Address] COLLATE Icelandic_100_CS_AS AS [Last Local Address]
,[Proxy No_] COLLATE Icelandic_100_CS_AS AS [Proxy No_]
,[Church] COLLATE Icelandic_100_CS_AS AS [Church]
,[Address to] COLLATE Icelandic_100_CS_AS AS [Address to]
,[Status] COLLATE Icelandic_100_CS_AS AS [Status]
,[Deport Date]
,[Post Address] COLLATE Icelandic_100_CS_AS AS [Post Address]
,[c_o Name] COLLATE Icelandic_100_CS_AS AS [c_o Name]
,[c_o Address] COLLATE Icelandic_100_CS_AS AS [c_o Address]
,[c_o Post Code] COLLATE Icelandic_100_CS_AS AS [c_o Post Code]
,[c_o Post Address] COLLATE Icelandic_100_CS_AS AS [c_o Post Address]
,'' COLLATE Icelandic_100_CS_AS AS [VAT No_]
,[Registered By] COLLATE Icelandic_100_CS_AS AS [Registered By]
FROM [NAV 2009 R2 Company].[dbo].[NR Data]
[/code]

Next step is to modify the table properties and set this table to be a linked table.
TableProperties2
Now NAV 2013 R2 has access to all the data in the table. The user running the NAV Service must have read access to the original table.  Make sure that all pages that use this table are read-only as the data can’t be modified through a view.

In NAV 2013 R2 there is an issue that we need to be aware of.  The NAV Server caches the data in these tables, so we can get different results in NAV 2013 R2 than expected.  My contacts tell me that this issue has been fixed in the upcoming NAV vNext, and I created a Microsoft Connect suggestion to have this fixed in NAV 2013 R2.  Until that solution is available the workaround is to use SELECTLATESTVERSION in your code before using the linked data.  Here is an example function that reads a plain-text password from a linked database and compares it to an incoming MD5 password.  This issue does not seem to affect pages.  In other words, the linked data displayed in pages looks up to date.

 

[code] PROCEDURE ValidateLogin@50011(ChurnNo@50000 : Code[10];MD5Password@50001 : Text[80]) : Boolean;
VAR
ChurnNoRegister@50002 : Record 10017450;
MD5@50003 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Security.Cryptography.MD5";
Encoding@50004 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Text.Encoding";
HashBytes@50005 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Array";
Data@50009 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Byte";
StringBuilder@50008 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Text.StringBuilder";
StringComparer@50006 : DotNet "'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.StringComparer";
Loop@50007 : Integer;
BEGIN
SELECTLATESTVERSION;
IF NOT ChurnNoRegister.GET(ChurnNo) THEN EXIT(FALSE);
IF ChurnNoRegister."Farmer Web Password" = '' THEN EXIT(FALSE);
MD5 := MD5.Create;
HashBytes := MD5.ComputeHash(Encoding.UTF8.GetBytes(ChurnNoRegister."Farmer Web Password"));
StringBuilder := StringBuilder.StringBuilder;
FOR Loop := 0 TO HashBytes.Length - 1 DO BEGIN
Data := HashBytes.GetValue(Loop);
StringBuilder.Append(Data.ToString('x2'));
END;
StringComparer := StringComparer.OrdinalIgnoreCase;
EXIT(StringComparer.Compare(MD5Password,StringBuilder.ToString) = 0);
END;
[/code]

 

Importing and Exporting Data in Microsoft Dynamics NAV 2013 R2 CU 8

Two weeks ago Microsoft released CU8 for Dynamics NAV 2013 R2.  This update included the long-awaited Import and Export functionality that replaces the old FBK backup in the Classic Client.

There are two ways to do import and export.  One is through the Windows Client

ImportExport

and the other is with PowerShell commands Export-NAVData and Import-NAVData.  You can read all about this on the Microsoft Dynamics NAV Team Blog by clicking here.

Compared to the old Classic Client FBK backup this is very fast.  On the other hand, the new import functionality is very strict about the database schema.  It needs to be exactly the same in the databases used for the export and the import.

I was in the process of upgrading a company from NAV 5.0 SP1 and had planned to add the company to an existing NAV 2013 R2 database which already included a few companies.  This was not as easy as you might think.  Here are the issues that I encountered and how they were solved.

After I finished the upgrade I made sure that all objects in the upgraded database were identical to the destination database.  This I did by exporting all objects from the destination database to the upgraded database.  I used a merge tool on the exported text object file to make sure.

At this point I was not able to import the data file and asked Microsoft for help.  I got all the help I needed, and the first thing I had to verify was that all the tables were identical on the SQL level.  This was done with this SQL command.

[code lang="sql"]select db1.[Object ID] "DB1 Table ID", db1.Name "DB1 Table Name", db1.Hash "DB1 Metadata Hash", db2.[Object ID] "DB2 Table ID", db2.Name "DB2 Table Name", db2.Hash "DB2 Metadata Hash"
from [Destination Database].[dbo].[Object Metadata Snapshot] db1
full outer join [Source Database].[dbo].[Object Metadata Snapshot] db2 on db1.[Object ID] = db2.[Object ID][/code]

I needed to make sure that the “DB1 Metadata Hash” was identical to the “DB2 Metadata Hash” for all tables.  This was not true even though all objects were identical.  After recompiling all tables in both databases I saw that I now had identical hashes.

This was not enough.  Since I upgraded the source database from an earlier NAV version the database still held all the old virtual table snapshots.  We needed to remove them with this SQL command.

[code lang="sql"]delete from [Source Database].[dbo].[Object Metadata Snapshot]
where [Object ID] in
(2000000007, 2000000026, 2000000049, 2000000101, 2000000102, 2000000103, 2000000020, 2000000055, 2000000009, 2000000029, 2000000038, 2000000058, 2000000041, 2000000028, 2000000063, 2000000022, 2000000048, 2000000040, 2000000043, 2000000044, 2000000039, 2000000045)
[/code]

After this I successfully exported and imported the upgraded company into the destination database.  One more thing I noticed was that I needed to restart the other service instances in order to see the newly imported company.

This new functionality opens a lot of possibilities for development and testing.  With PowerShell scripts we can now easily create a backup of our live company data and import it into our test and/or development environment.
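A minimal sketch of such a refresh script, using the Export-NAVData and Import-NAVData cmdlets introduced with CU8.  The service instance and company names are placeholders; check the parameter sets against your own installation before use.

```powershell
# Sketch only - service instance and company names are placeholders.
# Export the company data from the live service instance ...
Export-NAVData -ServerInstance DynamicsNAV71 -CompanyName 'My Company' -FilePath .\MyCompany.navdata

# ... and load it into the test service instance.
Import-NAVData -ServerInstance DynamicsNAV71Test -CompanyName 'My Company' -FilePath .\MyCompany.navdata
```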

In both the Windows Client and in PowerShell it is possible to copy, rename, and delete companies.  I know that the import process takes longer than the export process.  I tried the export process on a 500GB database and it took 2 hours and 27 minutes.  In the Classic Client this would have taken days.

I expect to see this functionality improve in future releases and also expect Microsoft or some of the community leaders to publish scripts to automate a data refresh in a testing environment.