Updates to my Object Renumbering Tool

Back at the end of 2014 I published a renumbering tool for NAV objects.  Using DotNet I was able to dramatically increase the renumbering speed for text object files.

Since then I have been asked if I could upgrade it to work with IDs and Field Numbers.

Now I have.

What’s more, it is also on GitHub.

The Process functions are the following:

  • Read Object Lines – Creates renumbering lines based on the objects in the selected object file.
  • Suggest IDs – Suggests new object numbers starting at 50000 based on the objects available in the current license.
  • Read from Excel – Reads object renumbering lines from an Excel sheet created with the Write to Excel process.
  • Write to Excel – Writes the current renumbering lines to a new Excel sheet to be managed within Excel and reread into the renumbering lines.
  • Renumber Using Lines – Prompts for a file to read and for a new file to save with renumbered objects based on the rules in the renumbering lines.
  • Renumber Using Controls – Prompts for a file to read and for a new file to save with renumbered objects based on the rules in the control IDs setup.

I have made some fixes to the renumbering function and have added support for the EventSubscriber.

Go to GitHub to download Page and Table 50000, try this out and submit improvements.

When I am processing an object file I have it open in my text editor.  When I see something to renumber I update the control ranges and execute the renumbering process, reading and writing to the same object file.  My editor will reload the file and I can see the results immediately.


Inspired by a Microsoft Developer

Yes it happens.

When reading the C/AL code in NAV written by other developers you normally pick up smart ways to do things.  Yesterday I got one.

In my solutions I have been using a function to check if a string is numeric and another function to extract the numeric value from a string.

IsNumeric

All well and good.  Working fine so far.  But if you can write each function with a single line, would that not be better?

NewIsNumeric
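In case the screenshots are hard to read, here is a sketch of what the two single-line functions can look like in C/AL.  This uses DELCHR, which may not be exactly what Microsoft's code does, so treat it as an illustration rather than the original:

[code]NewIsNumeric(String : Text[30]) : Boolean
BEGIN
// TRUE when String contains digits only (note: also TRUE for an empty string)
EXIT(DELCHR(String,'=','0123456789') = '');
END;

NewGetNumericValue(String : Text[30]) : Text[30]
BEGIN
// The inner DELCHR returns the non-digit characters, the outer removes them
EXIT(DELCHR(String,'=',DELCHR(String,'=','0123456789')));
END;[/code]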

Thanks for the inspiration Microsoft.

The Next Upgrade Story – Data transfer code generation tool

Guess what, more companies are moving to NAV 2013 R2.

I am about to start a data upgrade for a medium-sized Icelandic company.  Doing a data transfer means that you need to compare the tables of the old version and the new version, and where there is a mismatch you have to take some action.  The data transfer is built on the upgrade from 2009 to 2013 R2 released by Microsoft.

We have the MergeTool that is used to compare and merge code and do a lot of other cool things.  I wanted a tool that did something else, so I started designing and writing one.  I am hoping that the result will save me a lot of work in future upgrades.

UpgradeMenu

This tool imports exported object files into a table version.  Then I can compare two table versions and get all the differences listed.  I also import the CRONUS tables for each version to make sure that I am not comparing built-in fields.

TableVersions

I give the comparison an identification and select the source and destination versions.  Then I hit Compare.  Within seconds I get the result.

CompareResult3

I have to make decisions on what to do with modified and deleted tables.  I can list all the tables and check the field comparison.

Fields

I can select both step 1 and step 2 actions for each table.  For new fields I can select a source field from the source table.  The example I show here is the customer table.  In step 1 I choose to copy all the needed fields to a temporary table.  The tool will find the next available upgrade table automatically.  In step 2 I choose to move the data from the temporary table into the destination version.  I can specify that a new field gets its value from another field in the temporary table.  In this example the Registration No. field has a new ID.

The best part is yet to come.  To really save some work I make the tool create all the needed NAV upgrade objects.  For step 1 I get all the upgrade tables with only required fields.

[code]OBJECT Table 66001 Temp Customer
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
FIELDS
{
{ 1 ; ;No. ;Code20 }
{ 10010400;;Registration No. ;Code20 }
{ 10017241;;Require Purchase Request;Boolean }
{ 10017290;;Freight Paid By ;Option ;OptionString=[Recipient,Prepaid,Senders Account,See Ship-to Address] }
}
KEYS
{
{ ;No. ;Clustered=Yes }
}
FIELDGROUPS
{
}
CODE
{

BEGIN
{
Dynamics.is Upgrade Table
}
END.
}
}

OBJECT Table 66002 Temp Cust. Ledger Entry
{

[/code]

Then the codeunit required to do the data manipulation.

[code]OBJECT Codeunit 66000 Upgrade FurblandanR
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
CODE
{
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
BEGIN
// Call this function from the top of the Upgrade trigger in Codeunit ID 104045
CopyCustomer(StateIndicator);
CopyCustLedgerEntry(StateIndicator);

MoveCashierBankBranch(StateIndicator);

ForcePaymentTerms(StateIndicator);
ForceSalesHeader(StateIndicator);

END;

LOCAL PROCEDURE CopyCustomer@2(VAR StateIndicator@1002 : Record 104037);
VAR
Customer@1000 : Record 18;
TempCustomer@1001 : Record 66001;
BEGIN
WITH Customer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
CALCFIELDS("Picture");
TempCustomer.INIT;
TempCustomer.TRANSFERFIELDS(Customer);
TempCustomer.INSERT;
UNTIL NEXT = 0;
MODIFYALL("Registration No.",'');
MODIFYALL("Require Purchase Request",FALSE);
MODIFYALL("Freight Paid By",0);
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE CopyCustLedgerEntry@3(VAR StateIndicator@1002 : Record 104037);
VAR
CustLedgerEntry@1000 : Record 21;
TempCustLedgerEntry@1001 : Record 66002;
BEGIN
WITH CustLedgerEntry DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
TempCustLedgerEntry.INIT;
TempCustLedgerEntry.TRANSFERFIELDS(CustLedgerEntry);
TempCustLedgerEntry.INSERT;
UNTIL NEXT = 0;
MODIFYALL("Final Due Date",0D);
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE MoveCashierBankBranch@26(VAR StateIndicator@1002 : Record 104037);
VAR
CashierBankBranch@1000 : Record 10010402;
TempCashierBankBranch@1001 : Record 10000202;
BEGIN
WITH CashierBankBranch DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
TempCashierBankBranch.INIT;
TempCashierBankBranch.TRANSFERFIELDS(CashierBankBranch);
TempCashierBankBranch.INSERT;
UNTIL NEXT = 0;
DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE ForceFixedAsset@48(VAR StateIndicator@1002 : Record 104037);
VAR
FixedAsset@1000 : Record 5600;
BEGIN
WITH FixedAsset DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDFIRST THEN BEGIN
StateIndicator.Update;
MODIFYALL("Straight-Line %",0);
MODIFYALL("Depreciation Starting Date",0D);
MODIFYALL("Biling Contract No.",'');
END;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;


[/code]

And to complete step 1, the addition needed to the DeleteDiscontinuedTables trigger.

[code]// Add these lines to the DeleteDiscontinuedTables trigger of Codeunit ID 104002
DeleteTable(72000);
DeleteTable(72001);
DeleteTable(72003);
DeleteTable(87400);
…
[/code]

Then there is the upgrade codeunit for step 2.

[code]OBJECT Codeunit 66001 Upgrade FurblandanR
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
CODE
{
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
BEGIN
// Call this function from the top of the Upgrade trigger in Codeunit ID 104048
CopyBankingWebServicesUser(StateIndicator);
MoveCustomer(StateIndicator);
...
END;

LOCAL PROCEDURE CopyBankingWebServicesUser@2(VAR StateIndicator@1002 : Record 104037);
VAR
TempCashierWebServicesUser@1000 : Record 66022;
BankingWebServicesUser@1001 : Record 10010428;
BEGIN
WITH TempCashierWebServicesUser DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
"BankingWebServicesUser".INIT;
"BankingWebServicesUser"."Authentication Code" := "Authentication Code";
"BankingWebServicesUser"."User ID" := "User ID";
"BankingWebServicesUser"."Certificate Key Identifier" := "Certificate Key Identifier";
//"BankingWebServicesUser"."User ID Store" := ;
//"BankingWebServicesUser"."Password Store" := ;
//"BankingWebServicesUser"."Certificate Location" := ;
//"BankingWebServicesUser"."Certificate Value" := ;
"BankingWebServicesUser".INSERT;
UNTIL NEXT = 0;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE MoveCustomer@3(VAR StateIndicator@1002 : Record 104037);
VAR
TempCustomer@1000 : Record 66001;
Customer@1001 : Record 18;
BEGIN
WITH TempCustomer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
"Customer".GET("No.");
"Customer"."Registration No." := "Registration No.";
//"Customer"."Momentum Claims Active" := ;
"Customer".MODIFY;
UNTIL NEXT = 0;
DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;


[/code]

And to complete the process the upgrade tables are marked to be deleted.

[code]// Add these lines to the OnRun trigger of Codeunit ID 104003
MarkTable(66001);
MarkTable(66002);
…[/code]

This tool is in testing and I am planning to add support for the NAV 2015 upgrade tools.

Arion Banki currency importer

It is not often that I post a solution that is intended to be used in Iceland only.  Here is one.

Arion Banki is one of the major national banks in Iceland and they want to support Dynamics NAV users.  On their website you will find the currency exchange rates for every working date.  I created an importer that will download the exchange rates from their website and import them into NAV.

There are two ways to do this.  The first is to go into Currencies in the NAV client and click on Import.

ArionBankiImportCurrency

The other way is to add a Job Queue Entry to make this an automatic task.

NewJobQueueEntry

The process will find the last date imported into your system and import all days from and including that date to the current working date.
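In C/AL the date logic can be sketched like this.  ImportRatesForDate is a hypothetical helper standing in for the actual download and insert code:

[code]LOCAL PROCEDURE ImportMissingDates();
VAR
CurrencyExchRate : Record 330; // Currency Exchange Rate
LastDate : Date;
ImportDate : Date;
BEGIN
// Find the latest exchange rate date already in the system
LastDate := 0D;
IF CurrencyExchRate.FINDSET THEN
REPEAT
IF CurrencyExchRate."Starting Date" > LastDate THEN
LastDate := CurrencyExchRate."Starting Date";
UNTIL CurrencyExchRate.NEXT = 0;
IF LastDate = 0D THEN
LastDate := WORKDATE;
// Import every day from and including the last date up to the work date
ImportDate := LastDate;
WHILE ImportDate <= WORKDATE DO BEGIN
ImportRatesForDate(ImportDate); // hypothetical helper: downloads and inserts rates for one day
ImportDate := ImportDate + 1;
END;
END;[/code]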

CurrencyImportNAV2009 CurrencyImporterObjects2013R2



Test and fix dimensions before upgrading to NAV 2013

A coworker at Advania is working on a Business Intelligence solution called Advania Insight.  He is working on the installation for a company that is running NAV.  This company has data from Navigator since 1994.

One of the problems we saw was that some of the dimension codes used were not standard dimensions but, for example, headings or footers.  This did not go too well through the analysis and we needed to fix it.

Manually this is a lot of work, so I wrote some SQL queries to find these errors.  For example, this query will find all entries in the G/L Entry table that have an incorrect value in Global Dimension 1 Code.

[code lang="sql"]USE [Demo Database NAV (7-1)]
GO

SELECT
[Entry No_]
FROM [dbo].[CRONUS Ísland hf_$G_L Entry],
[dbo].[CRONUS Ísland hf_$Dimension Value]
WHERE [Global Dimension 1 Code] = [Code] AND
[Global Dimension No_] = 1 AND
[Dimension Value Type] > 0
GO
[/code]

This query can be used to find entries with missing entries in the Ledger Entry Dimension table.

[code lang="sql"]SELECT [Entry No_]
FROM [dbo].[CRONUS Ísland hf_$G_L Entry] V
WHERE (SELECT COUNT(D.[Dimension Code]) FROM [dbo].[CRONUS Ísland hf_$Ledger Entry Dimension] D
WHERE [Table ID] = 17 AND V.[Entry No_] = D.[Entry No_] AND D.[Dimension Code] = 'DEILD') = 0
AND [Global Dimension 1 Code] <> ''
GO
[/code]

This query can be used to find entries with Global Dimension 1 Code that are missing from the Dimension Value table.

[code lang="sql"]USE [Demo Database NAV (7-1)]
GO

SELECT [Entry No_]
FROM [dbo].[CRONUS Ísland hf_$G_L Entry] V
WHERE (SELECT COUNT(D.[Dimension Code]) FROM [dbo].[CRONUS Ísland hf_$Dimension Value] D
WHERE [Global Dimension 1 Code] = D.[Code] AND D.[Global Dimension No_] = 1) = 0
AND [Global Dimension 1 Code] <> ''
GO[/code]

Looking ahead I saw that it would be a lot of work to manually check all these possibilities, so I decided to write a SQL script.  First, a script that will do the dimension type test on the Ledger Entry Dimension table.

[code lang="sql"]USE [KS Dynamics NAV]
GO

DECLARE @invalididentifierscars varchar(10)
DECLARE @replacestring varchar(10)
SET @invalididentifierscars = (SELECT [invalididentifierchars] FROM [dbo].[$ndo$dbproperty])

DECLARE @varSQL varchar(max)
DECLARE @DimTableName varchar(256)
DECLARE @DimValueTableName varchar(256)
DECLARE @getCompanyNames CURSOR
DECLARE @CompanyName varchar(50)
DECLARE @loop int
CREATE TABLE #TmpLedgerEntryDimCombinations (CompanyName varchar(50), TableID int, TableName varchar(50), GlobalDimCode varchar(20), GlobalDimNo int)
CREATE TABLE #TmpIncorrectDimEntries (CompanyName varchar(50), TableID int, TableName varchar(50), GlobalDimCode varchar(20), GlobalDimNo int, EntryNo int)
SET @getCompanyNames = CURSOR FOR SELECT [Name] FROM [dbo].[Company]
OPEN @getCompanyNames
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @DimTableName = @CompanyName + '$Ledger Entry Dimension'
SET @DimValueTableName = @CompanyName + '$Dimension Value'
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @DimTableName = REPLACE(@DimTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @DimValueTableName = REPLACE(@DimValueTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @varSQL = 'USE [' + DB_NAME() + ']; INSERT INTO #TmpLedgerEntryDimCombinations SELECT DISTINCT ''' + @CompanyName + ''',[Table ID],O.[Name],L.[Dimension Code],[Global Dimension No_]
FROM [dbo].[' + @DimTableName + '] L,[dbo].[' + @DimValueTableName + '] D, [dbo].[Object] O
WHERE D.[Dimension Code] = L.[Dimension Code] AND [Global Dimension No_] > 0 AND O.[Type] = 0 AND O.[ID] = [Table ID]'
EXEC (@varSQL)
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
END
CLOSE @getCompanyNames
DEALLOCATE @getCompanyNames

DECLARE @TableID int
DECLARE @TableName varchar(50)
DECLARE @GlobalDimCode varchar(20)
DECLARE @GlobalDimNo int
DECLARE @DimFieldName varchar(50)
DECLARE @DimFixes CURSOR
SET @DimFixes = CURSOR FOR SELECT CompanyName, TableID, TableName, GlobalDimCode, GlobalDimNo FROM #TmpLedgerEntryDimCombinations
OPEN @DimFixes
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @OrigCompanyName varchar(50)
DECLARE @OrigTableName varchar(50)
SET @OrigCompanyName = @CompanyName
SET @OrigTableName = @TableName
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @CompanyName = REPLACE(@CompanyName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @TableName = REPLACE(@TableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @DimFieldName = '[Global Dimension ' + CAST(@GlobalDimNo as varchar(1)) + ' Code]'
SET @varSQL = 'INSERT INTO #TmpIncorrectDimEntries (CompanyName, TableID, TableName, GlobalDimCode, GlobalDimNo, EntryNo)
SELECT ''' + @OrigCompanyName + ''', ' + CAST(@TableID as varchar(20)) + ',''' + @OrigTableName + ''',''' + @GlobalDimCode + ''',' + CAST(@GlobalDimNo as varchar(1)) + ', [Entry No_]
FROM [dbo].[' + @CompanyName + '$' + @TableName + '], [dbo].[' + @CompanyName + '$Dimension Value]
WHERE ' + @DimFieldName + ' = [Code] AND [Global Dimension No_] = ' + CAST(@GlobalDimNo as varchar(1)) + ' AND [Dimension Value Type] > 0'
EXEC(@varSQL)
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
END
CLOSE @DimFixes;
DEALLOCATE @DimFixes;

SELECT * FROM #TmpLedgerEntryDimCombinations
DROP TABLE #TmpLedgerEntryDimCombinations
SELECT * FROM #TmpIncorrectDimEntries
DROP TABLE #TmpIncorrectDimEntries
GO

[/code]

Then a similar script that will update the Ledger Entry Dimension table according to the values in Global Dimension 1 Code and Global Dimension 2 Code.

[code lang="sql"]USE [KS Dynamics NAV]
GO

DECLARE @invalididentifierscars varchar(10)
DECLARE @replacestring varchar(10)
SET @invalididentifierscars = (SELECT [invalididentifierchars] FROM [dbo].[$ndo$dbproperty])

DECLARE @varSQL varchar(max)
DECLARE @DimTableName varchar(256)
DECLARE @DimValueTableName varchar(256)
DECLARE @getCompanyNames CURSOR
DECLARE @CompanyName varchar(50)
DECLARE @loop int
CREATE TABLE #TmpLedgerEntryDimCombinations (CompanyName varchar(50), TableID int, TableName varchar(50), GlobalDimCode varchar(20), GlobalDimNo int)
SET @getCompanyNames = CURSOR FOR SELECT [Name] FROM [dbo].[Company]
OPEN @getCompanyNames
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @DimTableName = @CompanyName + '$Ledger Entry Dimension'
SET @DimValueTableName = @CompanyName + '$Dimension Value'
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @DimTableName = REPLACE(@DimTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @DimValueTableName = REPLACE(@DimValueTableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @varSQL = 'USE [' + DB_NAME() + ']; INSERT INTO #TmpLedgerEntryDimCombinations SELECT DISTINCT ''' + @CompanyName + ''',[Table ID],O.[Name],L.[Dimension Code],[Global Dimension No_]
FROM [dbo].[' + @DimTableName + '] L,[dbo].[' + @DimValueTableName + '] D, [dbo].[Object] O
WHERE D.[Dimension Code] = L.[Dimension Code] AND [Global Dimension No_] > 0 AND O.[Type] = 0 AND O.[ID] = [Table ID]'
EXEC (@varSQL)
FETCH NEXT FROM @getCompanyNames INTO @CompanyName
END
CLOSE @getCompanyNames
DEALLOCATE @getCompanyNames

DECLARE @TableID int
DECLARE @TableName varchar(50)
DECLARE @GlobalDimCode varchar(20)
DECLARE @GlobalDimNo int
DECLARE @DimFieldName varchar(50)
DECLARE @DimFixes CURSOR
SET @DimFixes = CURSOR FOR SELECT CompanyName, TableID, TableName, GlobalDimCode, GlobalDimNo FROM #TmpLedgerEntryDimCombinations
OPEN @DimFixes
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
WHILE @@FETCH_STATUS = 0
BEGIN
SET @loop = 0
WHILE @loop < LEN(@invalididentifierscars)
BEGIN
SET @loop = @loop + 1
SET @CompanyName = REPLACE(@CompanyName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
SET @TableName = REPLACE(@TableName,SUBSTRING (@invalididentifierscars,@loop,1),'_')
END
SET @DimFieldName = '[Global Dimension ' + CAST(@GlobalDimNo as varchar(1)) + ' Code]'
SET @varSQL = 'DELETE FROM [' + @CompanyName + '$Ledger Entry Dimension]
WHERE [Table ID] = ' + CAST(@TableID as varchar(20)) + ' AND [Dimension Code] = ''' + @GlobalDimCode + ''''
EXEC(@varSQL)
SET @varSQL = 'INSERT INTO [' + @CompanyName + '$Ledger Entry Dimension] ([Table ID],[Entry No_],[Dimension Code],[Dimension Value Code])
SELECT ' + CAST(@TableID as varchar(20)) + ',[Entry No_],''' + @GlobalDimCode + ''',' + @DimFieldName + '
FROM [dbo].[' + @CompanyName + '$' + @TableName + '] E
WHERE ' + @DimFieldName + ' <> '''''
EXEC(@varSQL)
FETCH NEXT FROM @DimFixes INTO @CompanyName, @TableID, @TableName, @GlobalDimCode, @GlobalDimNo
END
CLOSE @DimFixes;
DEALLOCATE @DimFixes;

DROP TABLE #TmpLedgerEntryDimCombinations
GO

[/code]

Reading through this you should see that the basic parts of the two scripts are similar and could be used to build further tests and to cover other dimension tables.  It is important that all this testing is done before upgrading to NAV 2013 or NAV 2013 R2.

There are a lot of tables you will need to consider and do a data check for.

DimensionTables

Some of you have most likely written some kind of dimension test.  Perhaps you can also share them here?

"Could not load type" error in DotNet object

As so often, the solution to a strange error is simple once you know it.  When I copied my add-ins to a new computer I was not able to compile the objects that included DotNet add-ins.

CouldNotLoadType

As this happened when I was installing NAV 2013 R2 for the first time, my first thought was that this was something that had changed between versions, and I sent a support request to Microsoft.

Mohamad Vajid and Duilio Tacconi took the case and tried to reproduce the problem without success.  Yesterday we took a look at the problem together and found the solution.

The add-in was copied from a web site, and because of that the file was blocked by the server operating system.

UnblockDLL

Then today, as I was creating a new website with files that I downloaded from the web, I had the same problem.  That will be the last time.

After you unblock the file, recheck to see that it worked.  If the file is in a protected folder you might need to drag it to your desktop to unblock it and then move it back.

Thank you Mohamad and Duilio for the assistance.

Performance in data upgrade – let's speed things up

I am working on an interesting issue: how to execute a data upgrade from NAV 4.0 SP3 to NAV 2013 on a 300GB database in just one weekend.  My first go at this task was during the summer, and my measurements tell me that I would not make it even in a week.

The database is heavily modified and I need to move a lot of data to temporary tables and then back at the end of the data upgrade, having reached NAV 2013.  Some fields need to be cleared and in some cases a whole table needs to be emptied and even removed.

The standard code in NAV 4.0 SP3 Step 1 is to loop through all entries in the table and initialize the fields, like here:

[code]
WITH PurchLineArchive DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
"Phase Code" := '';
"Task Code" := '';
"Step Code" := '';
MODIFY;
UNTIL NEXT = 0;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;[/code]

A faster way is to use MODIFYALL, like here:

[code]WITH ProdOrderComponent DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
MODIFYALL("Due Date-Time",0DT);
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;[/code]

But I guess it depends on how many fields need to be initialized.  In the first example the whole table is read record by record, handled in NAV and modified.  In the second example the SQL server does all the work and nothing is handled in NAV.  However, in the second example the SQL server needs to read and modify the whole table for each field that has to be initialized.  So there is a threshold where it is better to handle every record and modify it once instead of using MODIFYALL for every field.

In other cases data has to be copied to a temporary table to use later in the upgrade.  This happens if the table structure is changed, if data is used in other tables, or even if a field has a new number or a new data type.  Here is an example for the Customer table.

[code]WITH Customer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
TempCustomer.INIT;
TempCustomer.TRANSFERFIELDS(Customer);
TempCustomer.INSERT;

"Our Account No." := '';
"Mobile Phone" := '';
"EAN-Id.No." := '';
"X-400 address" := '';
"Reports Delivery" := 0;
"Handling fee" := FALSE;
"Shipment request" := '';
"Distribution Phone No." := '';
"Distribution GSM No." := '';
"Distribution E-mail" := '';
"Distribution Contact" := '';
"Billing Blocking Code" := '';
"Responsable To Customer" := '';
"Order Change Notification" := FALSE;
"Statement Address" := '';
"Statement Post Code" := '';
"Statement City" := '';
"Address of Consignee 1" := '';
"Address of Consignee 2" := '';
"Address of Consignee 3" := '';
"Address of Consignee 4" := '';
"Pooled Accounts Collection" := '';
"Block Order Join" := FALSE;
"Missing Order Notification SMS" := '';
"Missing Order Notif. SMS 2" := '';
"Social Security No." := '';
"Electronic Partner – Old" := FALSE;
"Electronic Partner" := FALSE;
"Outstanding POS Sales" := 0;
"Homestead No." := '';
"Churn No." := 0;
"Notification Process Code" := '';
"Queue Priority" := "Queue Priority"::"Very Low";
MODIFY;
UNTIL NEXT = 0;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
[/code]

The first thing I did to speed up my work was to build an Excel document.  With the help of Excel I was able to open the table design in NAV, copy all fields that needed to be cleared into Excel, and have Excel give me the code to initialize the fields.  This saves me some work when customizing the upgrade batch.  A copy of the Excel document is in the attachment at the bottom of this post.

With the change in dimensions, Microsoft faced the fact that the traditional way of doing a data upgrade would not work for the clients.  It would simply be too time consuming.  Their solution was to change parts of the code to build SQL statements and execute them directly on the SQL server.  With this method they were able to reduce the time in their example from 24 hours down to 1.5 hours.

With this in mind I decided to do the same for the data upgrade.  I created an upgrade SQL helper codeunit based on ADO automation to use in the earlier versions.  I will attach it to the post, but here is the code:

[code]OBJECT Codeunit 99980 Upgrade SQL Helper
{
OBJECT-PROPERTIES
{
Date=07.09.13;
Time=18:02:42;
Modified=Yes;
Version List=UPGW16.00.10,MS;
}
PROPERTIES
{
OnRun=BEGIN
END;

}
CODE
{
VAR
ADOConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000514-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Connection";
ADOStream@1100408004 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000566-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Stream";
WShell@1100408007 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{72C24DD5-D70A-438B-8A42-98424B88AFB8}:'Windows Script Host Object Model'.WshShell";
SystemFilesSystem@1100408014 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{0D43FE01-F093-11CF-8940-00A0C9054228}:'Windows Script Host Object Model'.FileSystemObject";
SystemFile@1100408013 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{C7C3F5B5-88A3-11D0-ABCB-00A0C90FFFC0}:'Windows Script Host Object Model'.File";
SystemTextStream@1100408012 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{0BB02EC0-EF49-11CF-8940-00A0C9054228}:'Windows Script Host Object Model'.TextStream";
MyDatabase@1100408011 : Record 2000000048;
MyServer@1100408010 : Record 2000000047;
dbpropFromStr@1100408003 : Text[30];
dbpropToStr@1100408002 : Text[30];
CommandFile@1100408008 : Text[1024];
ResultFile@1100408009 : Text[1024];
CrLf@1100408015 : Text[2];
dbpropertyInitialized@1100408001 : Boolean;
Connected@1100408005 : Boolean;
Text007@1100408006 : TextConst 'ENU=Microsoft ADO not found.;ISL=Microsoft ADO finnst ekki.';

PROCEDURE CreateSQL@1100408007(Qry@1100408000 : Text[1024]);
BEGIN
CLEAR(ADOStream);
CREATE(ADOStream,TRUE,TRUE);
ADOStream.Type := 2;
ADOStream.Open;
ADOStream.WriteText('SET ANSI_NULLS ON;' + CrLf);
ADOStream.WriteText('SET QUOTED_IDENTIFIER ON;' + CrLf);
ADOStream.WriteText(Qry + CrLf);
END;

PROCEDURE AddToSQL@1100408012(Qry@1100408000 : Text[1024]);
BEGIN
ADOStream.WriteText(Qry + CrLf);
END;

PROCEDURE ExecuteSQL@1100408015();
VAR
FileContent@1100408003 : Text[1024];
WinStyle@1100408002 : Integer;
FileSize@1100408004 : Integer;
Wait@1100408001 : Boolean;
Success@1100408000 : Boolean;
BEGIN
Connect;
ADOStream.WriteText(‘;’);
ADOStream.SaveToFile(CommandFile,2);
ADOStream.Close;
Wait := TRUE;
WShell.Run(
STRSUBSTNO(
'sqlcmd -E -S %1 -d %2 -i %3 -o %4',
MyServer."Server Name",
MyDatabase."Database Name",
CommandFile,
ResultFile),WinStyle,Wait);
SystemFile := SystemFilesSystem.GetFile(ResultFile);
FileSize := SystemFile.Size;
IF FileSize > 0 THEN BEGIN
SystemTextStream := SystemFile.OpenAsTextStream;
FileContent := SystemTextStream.ReadAll;
IF STRPOS(FileContent,'affected') = 0 THEN
ERROR(FileContent);
END;
END;

PROCEDURE StringConvert@2(NAVName@1000 : Text[30]) : Text[30];
VAR
i@1001 : Integer;
BEGIN
IF NOT dbpropertyInitialized THEN BEGIN
dbpropertyInitialized := TRUE;
IF GetDBPropertyField('convertidentifiers') = '1' THEN BEGIN
dbpropFromStr := GetDBPropertyField('invalididentifierchars');
FOR i := 1 TO STRLEN(dbpropFromStr) DO
dbpropToStr += '_';
END;
END;

EXIT(CONVERTSTR(NAVName,dbpropFromStr,dbpropToStr));
END;

PROCEDURE GetFullTableNameSQL@8(TableName@1000 : Text[30]) : Text[1024];
BEGIN
EXIT(STRSUBSTNO('[%1$%2]',GetCompanyNameSQL,GetTableNameSQL(TableName)));
END;

PROCEDURE GetCompanyNameSQL@10() : Text[1024];
BEGIN
EXIT(StringConvert(COMPANYNAME));
END;

PROCEDURE GetTableNameSQL@7(TableName@1000 : Text[30]) : Text[1024];
BEGIN
EXIT(StringConvert(TableName));
END;

PROCEDURE GetFieldNameSQL@1100408001(FieldNameNAV@1100408000 : Text[30]) FieldNameSQL : Text[1024];
BEGIN
FieldNameSQL := '[' + StringConvert(FieldNameNAV) + ']';
END;

PROCEDURE GetFieldClearSQL@1100408008(FieldNameNAV@1100408000 : Text[30];NewValue@1100408001 : Text[250];LastValue@1100408002 : Boolean) : Text[1024];
BEGIN
IF LastValue THEN
EXIT(GetFieldNameSQL(FieldNameNAV) + ' = ' + NewValue + ' ')
ELSE
EXIT(GetFieldNameSQL(FieldNameNAV) + ' = ' + NewValue + ',')
END;

LOCAL PROCEDURE GetDBPropertyField@11(FieldName@1000 : Text[30]) : Text[30];
VAR
ADORecordset@1100408002 : Automation "{00000300-0000-0010-8000-00AA006D2EA4} 2.8:{00000535-0000-0010-8000-00AA006D2EA4}:’Microsoft ActiveX Data Objects Recordset 2.8 Library’.Recordset";
ADOFields@1100408001 : Automation "{00000300-0000-0010-8000-00AA006D2EA4} 2.8:{00000564-0000-0010-8000-00AA006D2EA4}:’Microsoft ActiveX Data Objects Recordset 2.8 Library’.Fields";
ADOField@1100408000 : Automation "{00000300-0000-0010-8000-00AA006D2EA4} 2.8:{00000569-0000-0010-8000-00AA006D2EA4}:’Microsoft ActiveX Data Objects Recordset 2.8 Library’.Field";
Qry@1004 : Text[1024];
i@1001 : Integer;
FieldValue@1005 : Text[1024];
BEGIN
IF FieldName = '' THEN
EXIT('');

Qry := STRSUBSTNO('select top 1 [%1] from [$ndo$dbproperty]',FieldName);

Connect;
CREATE(ADORecordset,TRUE,TRUE);
ADORecordset.Open(Qry,ADOConnection);
ADORecordset.MoveFirst;
ADOFields := ADORecordset.Fields;
FieldValue := FORMAT(ADOFields.Item(FieldName).Value);
EXIT(FieldValue);
END;

PROCEDURE GetNullDate@1100408002() : Text[1024];
BEGIN
EXIT('''1753-01-01 00:00:00.000''');
END;

PROCEDURE GetNullTime@1100408003() : Text[1024];
BEGIN
EXIT('''1753-01-01 00:00:00.000''');
END;

PROCEDURE GetNullDateTime@1100408006() : Text[1024];
BEGIN
EXIT('''1753-01-01 00:00:00.000''');
END;

PROCEDURE GetNullString@1100408004() : Text[1024];
BEGIN
EXIT('''''');
END;

PROCEDURE GetNullGuid@1100408014() : Text[1024];
BEGIN
EXIT('''00000000-0000-0000-0000-000000000000''');
END;

PROCEDURE GetNullInt@1100408005() : Text[1024];
BEGIN
EXIT('0');
END;

PROCEDURE GetNull@1100408013() : Text[1024];
BEGIN
EXIT('NULL');
END;

LOCAL PROCEDURE Connect@1100408000();
BEGIN
IF Connected THEN EXIT;

MyServer.SETRANGE("My Server",TRUE);
MyServer.FINDFIRST;
MyDatabase.SETRANGE("My Database",TRUE);
MyDatabase.FINDFIRST;

CLEAR(ADOConnection);
IF NOT CREATE(ADOConnection,TRUE,TRUE) THEN
ERROR(Text007);

IF ADOConnection.State = 1 THEN
ADOConnection.Close;

ADOConnection.Open(
STRSUBSTNO(
'Provider=SQLOLEDB.1;Initial Catalog=%2;Data Source=%1;Trusted_Connection=yes;',
MyServer."Server Name",
MyDatabase."Database Name"));

Connected := ADOConnection.State = 1;
CommandFile := ENVIRON('TEMP') + '\Cmd.sql';
ResultFile := ENVIRON('TEMP') + '\Result.txt';
CrLf[1] := 13;
CrLf[2] := 10;

CLEAR(WShell);
CREATE(WShell,TRUE,TRUE);

CLEAR(SystemFilesSystem);
CREATE(SystemFilesSystem,TRUE,TRUE);
END;

EVENT ADOConnection@1100408000::InfoMessage@0(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::BeginTransComplete@1(TransactionLevel@1100408003 : Integer;pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::CommitTransComplete@3(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::RollbackTransComplete@2(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::WillExecute@4(VAR Source@1100408007 : Text[1024];CursorType@1100408006 : Integer;LockType@1100408005 : Integer;VAR Options@1100408004 : Integer;adStatus@1100408003 : Integer;pCommand@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{B08400BD-F9D1-4D02-B856-71D5DBA123E9}:'Microsoft ActiveX Data Objects 2.8 Library'._Command";pRecordset@1100408001 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000556-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Recordset";pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::ExecuteComplete@5(RecordsAffected@1100408005 : Integer;pError@1100408004 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408003 : Integer;pCommand@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{B08400BD-F9D1-4D02-B856-71D5DBA123E9}:'Microsoft ActiveX Data Objects 2.8 Library'._Command";pRecordset@1100408001 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000556-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Recordset";pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::WillConnect@6(VAR ConnectionString@1100408005 : Text[1024];VAR UserID@1100408004 : Text[1024];VAR Password@1100408003 : Text[1024];VAR Options@1100408002 : Integer;adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::ConnectComplete@7(pError@1100408002 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000500-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'.Error";adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

EVENT ADOConnection@1100408000::Disconnect@8(adStatus@1100408001 : Integer;pConnection@1100408000 : Automation "{2A75196C-D9EB-4129-B803-931327F72D5C} 2.8:{00000550-0000-0010-8000-00AA006D2EA4}:'Microsoft ActiveX Data Objects 2.8 Library'._Connection");
BEGIN
END;

BEGIN
END.
}
}

[/code]

I of course reused some of the code Microsoft ships with NAV 2013, which is the normal way to do things, but I had a problem to solve that did not exist in NAV 2013.  Some of my statements exceeded 1024 characters in length, so I was unable to execute them through ADO.  I had to move to sqlcmd instead.  I use an ADO Stream while building up the SQL command, save the stream's content to a temporary file, execute that file with sqlcmd, and then read the result file to catch any errors.
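As a sketch of that hand-off (outside NAV, in Python; the function names and file locations here are illustrative, not taken from the tool itself), the pattern is: write the statement to a command file in the temp folder, run sqlcmd against it, then read the result file.

```python
import os
import subprocess
import tempfile

def build_sqlcmd_args(server, database, command_file, result_file):
    # sqlcmd flags: -S server, -d database, -E trusted connection,
    # -i input script, -o output file for results and messages.
    return ["sqlcmd", "-S", server, "-d", database, "-E",
            "-i", command_file, "-o", result_file]

def run_statement(server, database, statement):
    # Write the (possibly very long) statement to a temp .sql file,
    # execute it with sqlcmd, and return the result file's text so
    # the caller can scan it for error messages.
    tmp = tempfile.gettempdir()
    command_file = os.path.join(tmp, "Cmd.sql")
    result_file = os.path.join(tmp, "Result.txt")
    with open(command_file, "w", encoding="utf-8") as f:
        f.write(statement)
    subprocess.run(
        build_sqlcmd_args(server, database, command_file, result_file),
        check=True)
    with open(result_file, encoding="utf-8") as f:
        return f.read()
```

This mirrors the CommandFile and ResultFile pair set up in the Connect procedure above.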

Again, I used Excel to help me build the code needed.  So, what is gained by this?

When I need to clear fields in a table, a single SQL statement clears all of them in one pass.  Previously this had to be done field by field, or by looping through every record in the table.

[code]
//Delete data from obsolete fields
DBMgt.CreateSQL(STRSUBSTNO('UPDATE %1 SET ',DBMgt.GetFullTableNameSQL(TABLENAME)));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Pantað magn"),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Pantað í mælieiningu"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME(EDI),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Afgreitt magn"),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Afgreitt í mælieiningu"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Pick No."),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Item Pick Group Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Original Order No."),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Best fyrir"),DBMgt.GetNullDate,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Síðasti söludagur"),DBMgt.GetNullDate,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Appl.-to Job Entry"),DBMgt.GetNullInt,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Phase Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Task Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Step Code"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Job Applies-to ID"),DBMgt.GetNullString,FALSE));
DBMgt.AddToSQL(DBMgt.GetFieldClearSQL(FIELDNAME("Apply and Close (Job)"),DBMgt.GetNullInt,TRUE));
DBMgt.ExecuteSQL;[/code]
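What those helper calls boil down to can be sketched in a few lines of Python (the `build_clear_statement` helper and the bracketed SQL names are illustrative only; the real tool derives them through GetFullTableNameSQL and its siblings):

```python
def build_clear_statement(table, fields):
    # fields: list of (sql_field_name, null_literal) pairs, e.g.
    # ("Pick No_", "''") for a string field or ("EDI", "0") for an
    # integer field. One UPDATE statement clears them all at once.
    assignments = ", ".join(
        "[{0}] = {1}".format(name, literal) for name, literal in fields)
    return "UPDATE [{0}] SET {1}".format(table, assignments)
```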

Similarly I use a single SQL statement to copy all needed fields from the source table to the temporary table.

[code]
// Copy Data to Temporary Tables
DBMgt.CreateSQL(STRSUBSTNO('INSERT INTO %1 (',DBMgt.GetFullTableNameSQL(TempSalesShipmentLine.TABLENAME)));
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Document No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Line No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað magn")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað í mælieiningu")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME(EDI)) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt magn")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt í mælieiningu")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pick No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Item Pick Group Code")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Original Order No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Best fyrir")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Síðasti söludagur")) + ' ' );
DBMgt.AddToSQL(') SELECT ');
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Document No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Line No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað magn")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pantað í mælieiningu")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME(EDI)) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt magn")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Afgreitt í mælieiningu")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Pick No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Item Pick Group Code")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Original Order No.")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Best fyrir")) + ' , ' );
DBMgt.AddToSQL(DBMgt.GetFieldNameSQL(FIELDNAME("Síðasti söludagur")) + ' ' );
DBMgt.AddToSQL(STRSUBSTNO('FROM %1 ',DBMgt.GetFullTableNameSQL(TABLENAME)));
DBMgt.ExecuteSQL;[/code]

In this step I also need to delete large pieces of data.  For example, I clear big log tables in the first step of the upgrade to save time later in the process.  The TRUNCATE TABLE SQL command clears all data in a table and is very quick, but it works differently from the DELETE command: if the table has SumIndexFields in any of its keys, TRUNCATE TABLE will fail.  One option is to first disable all SumIndexFields and then use TRUNCATE TABLE; the other is to let NAV delete the data from tables with SumIndexFields.  Disabling all SumIndexFields requires an isolated command with a COMMIT at the end.  I found that none of my large tables had SumIndexFields, so I used this code.

[code]

TruncateTable(TableID : Integer)
IF OldObject.GET(OldObject.Type::Table,'',TableID) THEN BEGIN
Company.FINDSET;
DBMgt.CreateSQL('');
REPEAT
RecRef.OPEN(TableID,FALSE,Company.Name);
TableKey.SETRANGE(TableNo,TableID);
TableKey.SETFILTER(SumIndexFields,'<>%1','');
TableKey.SETRANGE(MaintainSIFTIndex,TRUE);
TableInformation.SETRANGE("Company Name",Company.Name);
TableInformation.SETRANGE("Table No.",TableID);
IF TableInformation.ISEMPTY OR NOT TableKey.ISEMPTY THEN
RecRef.DELETEALL
ELSE
DBMgt.AddToSQL(
'TRUNCATE TABLE [' +
DBMgt.StringConvert(Company.Name) +
'$' +
DBMgt.GetTableNameSQL(RecRef.NAME) +
'];');
RecRef.CLOSE;
UNTIL Company.NEXT = 0;
DBMgt.ExecuteSQL;
OldObject.DELETE;
END;[/code]

On the other end of the upgrade batch, where I want to use the data from the temporary tables, I also move the code to SQL.  Here I use the NAV 2013 "Upgrade - SQL Mgt." codeunit unchanged to execute a MERGE SQL statement.

[code]
WITH VendPaymInfo DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF NOT ISEMPTY THEN
BEGIN
SQLMgt.ExecuteSQLCommand(
STRSUBSTNO(
'MERGE INTO [%1] T ' +
' USING [%2] S ' +
' ON T.[%3] = S.[%3] ' +
'WHEN MATCHED THEN ' +
' UPDATE ',
SQLMgt.GetFullTableNameSQL(DATABASE::"Banking Vend. Payment Info"),
SQLMgt.GetFullTableNameSQL(DATABASE::"Temp Vend. Payment Info"),
SQLMgt.StringConvert(FIELDNAME("Entry No."))
) +
STRSUBSTNO(
' SET [%1] = S.[%1], ' +
' [%2] = S.[%2]; ',
SQLMgt.StringConvert(FIELDNAME("Bank Branch No.")),
SQLMgt.StringConvert(FIELDNAME("Bank Account No."))
)
);
SQLMgt.ExecuteSQLCommand(
STRSUBSTNO(
'TRUNCATE TABLE [%1];',
SQLMgt.GetFullTableNameSQL(DATABASE::"Temp Vend. Payment Info")
)
);
END;
TempVendPaymInfo.DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;[/code]

Believe me, this is so much faster than the classic method of looping through the records, and even much faster than using the UPDATE SQL statement.  As an example, I watched the SQL server use 3.260.000 B/sec of total disk activity with the loop-and-update method, whereas this number went up to 230.000.000 B/sec with the direct SQL statements.

Prepare to see your SQL Server finally working its socks off.

Upgrade Tools

My Excel Helper to bring opened Excel Documents to front

In one of my projects I use Excel a lot for reporting.  I create the Excel documents with OpenXML and with the help of the Excel Buffer table.  With the click of a button the user creates a beautiful Excel Workbook, but when I open the workbook it is displayed behind the Dynamics NAV client window.

One of the users had been complaining about this, and during the upgrade from NAV 2009 to NAV 2013 he asked me if this would be fixed.  I of course had to fix it, in order to increase the appreciation of the upgrade.

So I looked into this but did not find any .NET object to use.  It looked like I had to call into user32.dll, which is not possible directly in NAV.  Hence a new class DLL was born.

[code language="csharp"] public static class WindowHelper
{
[System.Runtime.InteropServices.DllImport("user32.dll")]
static extern bool SetForegroundWindow(IntPtr hWnd);

public static void ActivateAppByProcessName(string processName)
{
Process[] p = Process.GetProcessesByName(processName);
if (p.Length > 0)
ActivateAppByHandle(p[0].MainWindowHandle);
}

public static void ActivateAppByPartialProcessName(string processName)
{
Process[] processes = Process.GetProcesses();
foreach (Process process in processes)
{
if (process.MainWindowTitle.Contains(processName))
{
ActivateAppByHandle(process.MainWindowHandle);
}
}
}

public static void ActivateAppByFileName(string fileName)
{
string[] fileNameParts = fileName.Split(new Char[] { '.' });
string windowTitle = "";
for (int i = 0; i < fileNameParts.Length - 1; i++)
{
if (i == 0)
{
windowTitle += fileNameParts[i];
}
else
{
windowTitle += "." + fileNameParts[i];
}
}
ActivateAppByPartialProcessName(windowTitle);
}

static void ActivateAppByHandle(System.IntPtr WindowHandle)
{
SetForegroundWindow(WindowHandle);
}
}[/code]

In the same solution I had a problem converting the Excel Workbook to a PDF file. So I also added that support to this class.

[code language="csharp"] public static class ExcelHelper
{
public static void ExportPDF(Microsoft.Office.Interop.Excel.Workbook xlWrkBk, string pdfFileName)
{
xlWrkBk.ExportAsFixedFormat(Microsoft.Office.Interop.Excel.XlFixedFormatType.xlTypePDF, pdfFileName);
}
}[/code]

Now I am able to open Excel in front of the NAV client and also able to convert the Excel Workbook to a PDF file.  In Table 370, Excel Buffer, I add a local variable for this class and add a single line in the code.

The changes I make are below.

[code]
PROCEDURE GiveUserControl@3();
VAR
WindowHelper@50000 : DotNet "'UserHelperClass, Version=1.0.0.0, Culture=neutral, PublicKeyToken=22affbe033b077df'.UserHelperClass.WindowHelper" RUNONCLIENT;
BEGIN
IF NOT ISNULL(XlApp) THEN BEGIN
XlApp.Visible := TRUE;
// Dynamics.is - Bring Excel to front -
WindowHelper.ActivateAppByFileName(XlWrkBk.Name);
// Dynamics.is - Bring Excel to front +
XlApp.UserControl := TRUE;
CLEAR(XlApp);
END;
END;
[/code]

And in Codeunit 424, at the bottom of the ExportData trigger:

[code]
xlApp.Visible := TRUE;
//Dynamics.is - Bring Excel to Front -
WindowHelper.ActivateAppByFileName(FileMgt.GetFileName(FileName));
//Dynamics.is - Bring Excel to Front +[/code]

The class and the changed objects are attached below.

UserHelperClass  ExcelBufferUpdate

Sales & Purchases number series reaching maximum

In one of the larger companies that I have worked with, we ran into a problem with the number series for sales and purchase documents.  When the company started to use Navigator back in 1994, all the number series were initialized.  Then, after some years of usage and upgrades to Financials, Navision and Dynamics NAV, those original number series reached their maximum, and new invoices, credit memos and shipments were no longer the last in the history lists.

To solve this I created a renumbering tool.

RenumberingSalesAndPurchases

I used this tool in NAV 4.00 SP3 and also in NAV 2009 R2.

With this tool I added a number prefix to the old numbers and then changed the number series to create new numbers in the same pattern.  Now, of course, the document numbers are nine digits long and should not be a problem in the future.

The tool is attached below.

SalesPurchDocRenumering

Copy data between companies and/or databases

I am upgrading a solution from NAV 2009 to NAV 2013.  The solution was running in the Role Tailored Client only so the upgrade process was fast and easy.

I wanted to move this customized solution to another database and remove all other customizations from the database.  On the other hand I am upgrading everything in the old database except this customized solution with the standard methods.

After I had the solution upgraded and the new database ready, I started to think about how I would get the data from the old database.  It is a significant amount of data, so I wanted to use native SQL commands to do this.  Since I had around one hundred tables, I felt it would not be practical to copy the data manually, nor to create the copy script by hand.

So, I created a batch report.  The batch creates an INSERT INTO ... SELECT FROM query that is very fast.  It requires the tables to be identical in both databases.

CreateCopyScript

This batch will create a copy script that runs fast…
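The generated statements presumably take the familiar INSERT INTO ... SELECT shape.  This sketch (database, schema and table names are made up for illustration; the real batch reads the field list from the table metadata) assumes both tables live on the same server and have identical layouts, as the text requires:

```python
def build_copy_statement(source_db, target_db, table, fields):
    # Copy one NAV table between two databases on the same server.
    # The statement runs entirely on SQL Server, so no data has to
    # travel through the NAV client.
    cols = ", ".join("[{0}]".format(f) for f in fields)
    return ("INSERT INTO [{0}].dbo.[{2}] ({3}) "
            "SELECT {3} FROM [{1}].dbo.[{2}]").format(
                target_db, source_db, table, cols)
```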

A few things I had to check: when copying BLOB fields from NAV 2009 to NAV 2013 I needed to change the Compressed property to No, and I needed to check the fields to make sure that fields with the AutoIncrement property set to Yes would not be included in the script.
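The AutoIncrement exclusion can be sketched as a simple filter over the field metadata (the dictionary shape here is invented for illustration; the real batch reads the table's field definitions from NAV):

```python
def copyable_fields(fields):
    # Fields flagged AutoIncrement must not appear in the INSERT
    # column list; SQL Server assigns their values on insert.
    return [f["name"] for f in fields if not f.get("autoincrement")]
```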

I started with a NAV 2009 batch that is fairly basic.  I made the NAV 2013 batch smarter, with a direct connection to the SQL server.

Here are both NAV Copy for 2009 and NAV Copy 2013 (2).