Import Tenant Data

My last post was about how I got the customized data out of the tenant database into Xml files. That tenant database was from a NAV 2016 application.

I have updated the tenant database to Business Central and I need to bring in some of the data from these Xml files.

My first issue was that I needed to make these Xml files available to Business Central. I have been using Azure Blob to store files for some years now. I had both AL and C/AL code that was able to connect to the Azure Blob REST Api, but that code used DotNet variables, which are no longer an option.

I did some preparation last year when I asked Microsoft to add some functionality to the BaseApp. Using that BaseApp functionality I was able to redo my Azure Blob AL code as a clean extension.
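Just to illustrate why DotNet is no longer needed for this kind of call, the built-in HttpClient data type can talk to the Blob REST endpoint directly. The following is only a generic sketch that assumes a pre-generated SAS URL; it is not the code from the extension, which uses the BaseApp helpers instead.

codeunit 50100 "Azure Blob Download Sketch"
{
    // Generic illustration only - the object ID and names are made up.
    procedure DownloadBlob(SasUrl: Text): Text
    var
        Client: HttpClient;
        Response: HttpResponseMessage;
        Content: Text;
    begin
        // SasUrl is assumed to be the full blob URL with a SAS token appended
        if not Client.Get(SasUrl, Response) then
            Error('Could not reach the Azure Blob endpoint.');
        if not Response.IsSuccessStatusCode() then
            Error('Azure Blob request failed with status code %1.', Response.HttpStatusCode());
        Response.Content().ReadAs(Content);
        exit(Content);
    end;
}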

I also wanted to put the AL code somewhere public for everyone to see, and GitHub is the default place to store code. I created a project for Business Central AL.

I am hoping that this can become the place where code examples for our Business Central community are shared and maintained. If you want to contribute then I can add you to this project, or I can approve your pull request.

I need to write another blog post about that Azure Blob app and the other repositories I have created there. Hope to find time soon.

There is another repository in this project for the Import Tenant Data app. This app has Azure Blob Connect functionality to utilize the Azure Blob app for data import.

I start by opening the Import Data Source page.

Here I find the Azure Blob Connector that registered itself in the Import Data Source table.

I need to go to Process -> Setup to configure my Azure Blob container access.

The information required can be found in the Azure Portal.

Specify the container where you have uploaded all the Xml files.

Then I search for Import Project List and create a new import project for the General Ledger. The Import Source for Azure Blob is automatically selected, since it is the only one available.

Now to import the related Xml files into this project.

I get a list of files from the Azure Blob and select the one I need.

The file list will open again if I have more files to import. Close the file list when finished. Back on the Import Project we should now see information from the Xml file.

For each file I need to configure the destination mapping.

If the table exists in my Business Central App then it will be automatically selected.

And I can map fields from the Xml file to the Business Central Table.

There are options to handle different data structures. One is that we can add a transformation rule directly to each field. The other is to use our own custom data upgrade app that subscribes to the events published in this app.

Four events are published: two for each field in the mapping and two before the database record is updated or inserted.

Based on the information in the publishers we can do any manual data modification required. In my example the creation time was added to each G/L Entry in NAV, but is added to the G/L Register in Business Central.
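To give an idea of what such a subscriber can look like, here is a minimal publish/subscribe sketch. The object and event names are placeholders, not the actual events published by the Import Tenant Data app, so check the repository for the real signatures.

codeunit 50101 "Import Event Sketch"
{
    // The publisher below is only a placeholder so that the sketch compiles on its
    // own; the real events live in the Import Tenant Data app and have their own
    // names and parameters.
    [IntegrationEvent(false, false)]
    procedure OnBeforeInsertDestinationRecord(var DestinationRecRef: RecordRef; var Handled: Boolean)
    begin
    end;

    // A custom data upgrade app holds a subscriber like this one and adjusts the
    // destination record before it is inserted.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Import Event Sketch", 'OnBeforeInsertDestinationRecord', '', false, false)]
    local procedure MoveCreationTimeToGLRegister(var DestinationRecRef: RecordRef; var Handled: Boolean)
    begin
        // Example of a manual data modification: a creation time that was stored on
        // the G/L Entry in NAV could be routed to the G/L Register here instead of
        // relying on the straight field mapping.
    end;
}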

From the list of tables we are able to start the data transfer. First we need to make sure that we have the correct configuration for the import. Do we want to commit during the import? Do we want to create missing records in our database?

I select to commit after every 1,000 records. If my data transfer stops, then I can resume from that position when I start the data transfer again.

We have the option to create a task in the job queue to handle the data transfer.
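As a rough sketch of how such a task can be put on the job queue, assuming the transfer is carried out by a processing codeunit; the codeunit ID used here is just a placeholder:

codeunit 50102 "Enqueue Import Sketch"
{
    // Sketch only: 50150 stands in for whichever codeunit performs the transfer.
    procedure EnqueueTransfer(ImportProjectCode: Code[20])
    var
        JobQueueEntry: Record "Job Queue Entry";
    begin
        JobQueueEntry.Init();
        JobQueueEntry."Object Type to Run" := JobQueueEntry."Object Type to Run"::Codeunit;
        JobQueueEntry."Object ID to Run" := 50150; // placeholder transfer codeunit
        JobQueueEntry."Parameter String" := ImportProjectCode;
        Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
    end;
}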

The job queue can handle multiple concurrent transfers, so the import should not take too much time. Looking into the Destination Mapping, we can see the status of the data import.

I will add a few more pictures to give you a better idea of what can be done with this Import Tenant Data app. The AL code is on GitHub for you to browse, improve and fix.

Get Table Data from a tenant database

I am working on a data upgrade for a customer. The existing database is from another partner and has tables and fields that my development license does not give me access to.

The database is on NAV 2016 and I am upgrading to Business Central.

So, I added some functions to the AdvaniaGIT PowerShell module.

To get the table definition from the tenant database I use

(Get-NAVTableMetaDataXml -DatabaseName <tenantDatabasename> -TableId 289 -Snapshot).OuterXml | Out-File C:\AdvaniaGIT\Workspace\xml.xml

The resulting Xml file I can view in any Xml viewer.

This data will help me update my upgrade tables for the Sync-NAVTenant command to complete successfully.

But, I wanted more. Since the destination for the data is in an App table I feel that I have no need for the data in the database during the C/AL upgrade.

I also created the Get-NAVTenanttableNos function that will give me a list of all tables in the tenant database. Using this data I was able to iterate through all the tables and export the structure and data.

I created an empty folder and executed

Get-NAVUpgradeTablesData -TenantDatabase <tenantDatabasename> -ApplicationDatabase 2018-App -CompanyName "MyCompany" -Verbose -ExportPath 'C:\AdvaniaGIT\Workspace\MyCompany'

This gives me all the data that does not fit in the standard 2018-App in separate Xml files.

Each file contains only the fields required to import the custom data. As an example, the xml file for table 17, G/L Entry, does not contain all the data from G/L Entries, just the primary key and the custom fields.

In the Xml file I include details for the table and for the fields included in the data set.

Custom tables that do not have any match in the 2018-App database are exported with all fields. With all the custom data stored safely I can execute the Sync-NAVTenant with Force to get the database upgraded.

The next task is in AL, writing the code to import the Xml files to the App tables.
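A very rough first sketch of that AL code could look like the one below. The schema assumed here (a TableNo element plus Record and Field nodes) is only a stand-in for whatever Get-NAVUpgradeTablesData actually produced, so the XPath expressions would need to be adjusted.

codeunit 50103 "Xml Import Sketch"
{
    procedure ImportFile(XmlContent: Text)
    var
        Doc: XmlDocument;
        TableNoNode: XmlNode;
        RecordNodes: XmlNodeList;
        FieldNodes: XmlNodeList;
        RecordNode: XmlNode;
        FieldNode: XmlNode;
        DestRecRef: RecordRef;
        DestFieldRef: FieldRef;
        TableNo: Integer;
        FieldNo: Integer;
        FieldValue: Text;
    begin
        XmlDocument.ReadFrom(XmlContent, Doc);
        Doc.SelectSingleNode('/TableData/TableNo', TableNoNode);
        Evaluate(TableNo, TableNoNode.AsXmlElement().InnerText());
        DestRecRef.Open(TableNo);

        Doc.SelectNodes('/TableData/Record', RecordNodes);
        foreach RecordNode in RecordNodes do begin
            DestRecRef.Init();
            RecordNode.SelectNodes('Field', FieldNodes);
            foreach FieldNode in FieldNodes do begin
                Evaluate(FieldNo, GetChildText(FieldNode, 'No'));
                FieldValue := GetChildText(FieldNode, 'Value');
                DestFieldRef := DestRecRef.Field(FieldNo);
                // only text-like fields are written in this sketch; other field
                // types would need Evaluate into a matching variable first
                if DestFieldRef.Type in [FieldType::Text, FieldType::Code] then
                    DestFieldRef.Value := FieldValue;
            end;
            // a record that already exists in the destination (like a G/L Entry)
            // would instead be read with Get and only the custom fields updated
            DestRecRef.Insert();
        end;
    end;

    local procedure GetChildText(ParentNode: XmlNode; Name: Text): Text
    var
        ChildNode: XmlNode;
    begin
        ParentNode.SelectSingleNode(Name, ChildNode);
        exit(ChildNode.AsXmlElement().InnerText());
    end;
}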

I will be back with the results from that task…

Upgrading my G/L Source Names Extension to AL – step 3

When upgrading an extension from C/AL to AL (version 1 to version 2) we need to think about the data upgrade process.

In C/AL we needed to add two functions to an extension Codeunit to handle the installation and upgrade.  This I did with Codeunit 70009200.  One function to be executed once for each install.

PROCEDURE OnNavAppUpgradePerDatabase@1();
VAR
  AccessControl@70009200 : Record 2000000053;
BEGIN
  WITH AccessControl DO BEGIN
    SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
    IF FINDSET THEN REPEAT
      AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
      AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
    UNTIL NEXT = 0;
  END;
END;

And another function to be executed once for each company in the install database.

PROCEDURE OnNavAppUpgradePerCompany@2();
VAR
  GLSourceNameMgt@70009200 : Codeunit 70009201;
BEGIN
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name Setup");
  NAVAPP.RESTOREARCHIVEDATA(DATABASE::"G/L Source Name User Setup");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name");

  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Help Resource");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name User Access");
  NAVAPP.DELETEARCHIVEDATA(DATABASE::"G/L Source Name Group Access");

  GLSourceNameMgt.PopulateSourceTable;
  RemoveAssistedSetup;
END;

For each database I add my permission sets to the installation users and for each company I restore the setup data for my extension and populate the lookup table for G/L Source Name.

The methods for install and upgrade have changed in AL for extensions version 2.  Look at the AL documentation from Microsoft for details.

In version 2 I remove these two obsolete functions from my application management Codeunit and need to add two new Codeunits, one for install and another for upgrade.

codeunit 70009207 "O4N GL Source Name Install"
{
    Subtype = Install;
    trigger OnRun();
    begin
    end;

    var
    PermissionSetToSetupGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, S';
    PermissionSetToUpdateGLSourceNames : TextConst ENU='G/L-SOURCE NAMES, E';
    PermissionSetToUserGLSourceNames : TextConst ENU='G/L-SOURCE NAMES';

    
    trigger OnInstallAppPerCompany();
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
    begin
        GLSourceNameMgt.PopulateSourceTable;
        RemoveAssistedSetup;
    end;

    trigger OnInstallAppPerDatabase();
    var
        AccessControl : Record "Access Control";
    begin
        with AccessControl do begin
            SETFILTER("Role ID",'%1|%2','SUPER','SECURITY');
            if FINDSET then repeat
                AddUserAccess("User Security ID",PermissionSetToUserGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToUpdateGLSourceNames);
                AddUserAccess("User Security ID",PermissionSetToSetupGLSourceNames);
            until NEXT = 0;
        end;
    end;

  local procedure RemoveAssistedSetup();
  var
    AssistedSetup : Record "Assisted Setup";
  begin
    with AssistedSetup do begin
      SETRANGE("Page ID",PAGE::"O4N GL SN Setup Wizard");
      if not ISEMPTY then
        DELETEALL;
    end;
  end;

  local procedure AddUserAccess(AssignToUser : Guid;PermissionSet : Code[20]);
  var
    AccessControl : Record "Access Control";
    AppMgt : Codeunit "O4N GL SN App Mgt.";
    AppGuid : Guid;
  begin
    EVALUATE(AppGuid,AppMgt.GetAppId);
    with AccessControl do begin
      INIT;
      "User Security ID" := AssignToUser;
      "App ID" := AppGuid;
      Scope := Scope::Tenant;
      "Role ID" := PermissionSet;
      if not FIND then
        INSERT(true);
    end;
  end;

}

In the code you can see that this Codeunit is of Subtype=Install.  This code will be executed when installing this extension in a database.

To confirm this I can see that I have the G/L Source Names Permission Sets in the Access Control table.

And my G/L Source Name table also has all required entries.

Uninstalling the extension will not remove this data.  Therefore you need to make sure that the install code is structured in a way that it will work even when reinstalling.  Look at the examples from Microsoft to get a better understanding.
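As a minimal sketch of what a reinstall-safe structure can look like, the per-company install trigger can guard the population code with an emptiness check.  The object ID below is made up; the table and codeunit are the ones from my extension.

codeunit 50104 "Install Pattern Sketch"
{
    Subtype = Install;

    trigger OnInstallAppPerCompany()
    var
        GLSourceName: Record "O4N GL SN";
        GLSourceNameMgt: Codeunit "O4N GL SN Mgt";
    begin
        // Only populate the lookup table when it is empty, so reinstalling on top
        // of data left behind by an uninstall does not duplicate or overwrite it.
        if GLSourceName.IsEmpty() then
            GLSourceNameMgt.PopulateSourceTable();
    end;
}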

Back to my C/AL extension.  When uninstalling that one the data is moved to archive tables.

Archive tables are handled with the NAVAPP.* commands.  The OnNavAppUpgradePerCompany function shown above handled these archive tables when reinstalling or upgrading.

Basically, since I am keeping the same table structure I can use the same set of commands for my upgrade Codeunit.

codeunit 70009208 "O4N GL SN Upgrade"
{
    Subtype=Upgrade;
    trigger OnRun()
    begin
        
    end;
    
    trigger OnCheckPreconditionsPerCompany()
    begin

    end;

    trigger OnCheckPreconditionsPerDatabase()
    begin

    end;
    
    trigger OnUpgradePerCompany()
    var
        GLSourceNameMgt : Codeunit "O4N GL SN Mgt";
        archivedVersion : Text;
    begin
        archivedVersion := NAVAPP.GetArchiveVersion();
        if archivedVersion = '1.0.0.1' then begin
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN Setup");
            NAVAPP.RESTOREARCHIVEDATA(DATABASE::"O4N GL SN User Setup");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN");

            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Help Resource");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN User Access");
            NAVAPP.DELETEARCHIVEDATA(DATABASE::"O4N GL SN Group Access");

            GLSourceNameMgt.PopulateSourceTable;
        end;
    end;

    trigger OnUpgradePerDatabase()
    begin

    end;

    trigger OnValidateUpgradePerCompany()
    begin

    end;

    trigger OnValidateUpgradePerDatabase()
    begin

    end;
    
}

So, time to test how and if this works.

I have my AL folder open in Visual Studio Code and I use the AdvaniaGIT command Build NAV Environment to get the new Docker container up and running.

Then I use Update launch.json with current branch information to update my launch.json server settings.

I like to use the NAV Container Helper from Microsoft  to manually work with the container.  I use a command from the AdvaniaGIT module to import the NAV Container Module.

The module uses the container name for most of the functions.  The container name can be found by listing the running Docker containers or by asking for the name that matches the server used in launch.json.

I need my C/AL extension inside the container so I executed

Copy-FileToNavContainer -containerName jolly_bhabha -localPath C:\NAVManagementWorkFolder\Workspace\GIT\Kappi\NAV2017\Extension1\AppPackage.navx -containerPath c:\run

Then I open PowerShell inside the container

Enter-NavContainer -containerName jolly_bhabha

Import the NAV Administration Module

Welcome to the NAV Container PowerShell prompt

[50AA0018A87F]: PS C:\run> Import-Module 'C:\Program Files\Microsoft Dynamics NAV\110\Service\NavAdminTool.ps1'

Welcome to the Server Admin Tool Shell!

[50AA0018A87F]: PS C:\run>

and I am ready to play.  Install the C/AL extension

Publish-NAVApp -ServerInstance NAV -IdePath 'C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client\finsql.exe' -Path C:\run\AppPackage.navx -SkipVerification

Now I am faced with the fact that I have opened PowerShell inside the container in my AdvaniaGIT terminal.  That means that my AdvaniaGIT commands will execute inside the container, but not on the host.

The simplest way to solve this is to open another instance of Visual Studio Code.  From there I can start the Web Client and complete the install and configuration of my C/AL extension.

I complete the Assisted Setup and do a round trip to G/L Entries to make sure that I have enough data in my tables to verify that the data upgrade is working.

I can verify this by looking into the SQL tables for my extension.  I use PowerShell to uninstall and unpublish my C/AL extension.

Uninstall-NAVApp -ServerInstance NAV -Name "G/L Source Names"
Unpublish-NAVApp -ServerInstance NAV -Name "G/L Source Names"

I can verify that in my SQL database I now have four AppData archive tables.

Pressing F5 in Visual Studio Code will now publish and install the AL extension, even if I have the terminal open inside the container.

The extension is published but can’t be installed because I had previously installed an older version of my extension.  Back in my container PowerShell I will follow the steps as described by Microsoft.

[50AA0018A87F]: PS C:\run> Sync-NAVApp -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
WARNING: Cannot synchronize the extension G/L Source Names because it is already synchronized.
[50AA0018A87F]: PS C:\run> Start-NAVAppDataUpgrade -ServerInstance NAV -Name "G/L Source Names" -Version 2.0.0.0
[50AA0018A87F]: PS C:\run> Install-NAVApp -ServerInstance NAV -Tenant Default -Name "G/L Source Names"
WARNING: Cannot install extension G/L Source Names by Objects4NAV 2.0.0.0 for the tenant default because it is already installed.
[50AA0018A87F]: PS C:\run>

My AL extension is published and I have verified in my SQL server that all the data from the C/AL extension has been moved to the AL extension tables and all the archive tables have been removed.

Back in Visual Studio Code I can now use F5 to publish and install the extension again if I need to update, debug and test my extension.

A couple more steps are left that I will do shortly.  Happy coding…

 

Building a clean database – remove not licensed objects

I just got a question from a client:

Gunnar,
Do you have a “King Kong” license that will allow you to delete any object?  It appears our development license does not have the rights to some of the newer LS Retail objects and I need to create a CRONUS database with just our stuff.

Well, I don’t have a “King Kong” license.  That is only for Microsoft.

There is a way to solve this dilemma.  It will take a few steps.

We start with two databases: one with the data we need (LSRetail) and another with the application we need (CRONUS).

After the process is completed the LSRetail database will not be usable as a standalone database, so make a copy if you need one.  A new database will be created, CRONUS_APP.  To clean up, it is safe to delete both of these databases.

The following PowerShell script has two options.  Option 1 is to have the company data imported into the CRONUS database in the end.  This option requires a server instance running on the CRONUS database.  Option 2 is to create a new database with SQL Management Studio and merge the CRONUS application and the LSRetail data into that one.

[code lang="powershell"]
$CronusDatabaseName = "CRONUS" # Database with destination Application
$CRONUSServerInstance = "DynamicsNAV80" # Instance for destination Application if using option 1
$LSRetailDatabaseName = "LSRETAIL" # LS Retail Demo Database, database with company data
$EmptyDatabaseName = "CRONUS WITH COMPANYDATA" # Create a new empty database using SQL Management Studio if using option 2
$SQLServerName = "SQL2014"
$SQLServerInstance = "NAVDEMO" # Set blank for default instance

$AppDatabaseName = $CronusDatabaseName + "_APP"
$ServiceAccount = $env:USERDOMAIN + "\" + $env:USERNAME
$ServerInstance = "UPGRADE"
$NavDataFile = (Join-Path $env:TEMP "NAVmerge.navdata")

$SelectOption = "2"

#Export Application from CRONUS Database to Application Database
Export-NAVApplication -DatabaseServer $SQLServerName -DatabaseInstance $SQLServerInstance -DatabaseName $CronusDatabaseName -DestinationDatabaseName $AppDatabaseName -ServiceAccount $ServiceAccount -Force

#Setup a temporary Server Instance for the new database
Get-Credential | New-NAVServerInstance -ServerInstance $ServerInstance -ManagementServicesPort 33555 -ClientServicesPort 33556 -SOAPServicesPort 33557 -ODataServicesPort 33558 -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $AppDatabaseName -ServiceAccount User -Force
Set-NAVServerConfiguration -ServerInstance $ServerInstance -KeyName "Multitenant" -KeyValue "true" -Force
Set-NAVServerInstance -ServerInstance $ServerInstance -Start -Force

#Prepare LSRetailDatabase for new configuration
Remove-NAVApplication -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Force

#Mount and Sync LSRetailDatabase as a tenant
Mount-NAVTenant -ServerInstance $ServerInstance -DatabaseInstance $SQLServerInstance -DatabaseServer $SQLServerName -DatabaseName $LSRetailDatabaseName -Id DEFAULT -OverwriteTenantIdInDatabase -AllowAppDatabaseWrite -Force
Sync-NAVTenant -ServerInstance $ServerInstance -Tenant DEFAULT -Mode ForceSync -Force

if (Test-Path $NavDataFile)
{
Remove-Item -Path $NavDataFile -Force
}

#Option 1, Copy Company data to the original CRONUS database. Requires a service running on the CRONUS database
if ($SelectOption -eq "1")
{
Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -Force
Import-NAVData -ServerInstance $CRONUSServerInstance -FilePath $NavDataFile -AllCompanies -Force
}
#Option 2, Import into the new empty database created by SQL Management Studio
if ($SelectOption -eq "2")
{

Export-NAVData -ServerInstance $ServerInstance -Tenant DEFAULT -AllCompanies -FilePath $NavDataFile -IncludeApplication -IncludeApplicationData -IncludeGlobalData -Force
if ($SQLServerInstance -eq "")
{
Import-NAVData -DatabaseServer $SQLServerName -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication

}
else
{
Import-NAVData -DatabaseServer ($SQLServerName + "\" + $SQLServerInstance) -DatabaseName $EmptyDatabaseName -FilePath $NavDataFile -AllCompanies -IncludeApplicationData -IncludeGlobalData -IncludeApplication
}

}

Set-NAVServerInstance -ServerInstance $ServerInstance -Stop -Force
Remove-NAVServerInstance -ServerInstance $ServerInstance -Force

if (Test-Path $NavDataFile)
{
Remove-Item -Path $NavDataFile -Force
}
[/code]

To walk you through what happens:

  • Application from CRONUS is exported into CRONUS_APP database
  • New Service Instance is created for CRONUS_APP database
  • Service Instance is changed to Multi Tenant and started
  • Application is removed from LSRetail database
  • LSRetail database is mounted as a tenant for CRONUS_APP database
  • LSRetail database structure is force-synched to CRONUS_APP application
  • Data from CRONUS_APP and LSRetail tenant is exported to NAVData file
  • NAVData file is imported into an empty database or the existing CRONUS database

 

Technical upgrade from NAV 2013 R2 to NAV 2015

I just made a backup of a live NAV 2013 R2 database to do an upgrade to NAV 2015.  Since the system is live we have several servers running when the backup is created.

The backup was restored in a new environment and the database opened with a NAV 2015 Development Client.  The database is put in single user mode and converted.  After the conversion a process to upgrade objects starts.  This is where things start to behave in an unwanted way.

[Screenshot: chooseInstance]

The database is in Single User Mode since it is still in the upgrade transaction and no NAV Server instance is connected.  However, in the table “Server Instance” all the old NAV Server instances are listed.  To get through this I needed to press N like a thousand times…

In a test upgrade process like this we need to add another step to the usual upgrade procedure.  After the restore is completed make sure that the tables “Server Instance” and “Active Session” are empty.

After a restore to a new environment it is also good to clean unneeded users from the database.  Only leave what is actually needed.

[Screenshot: users]

In this case NAVLIGHT\srvNAV is the service user.  Also make sure that the Compatibility Level is as new as possible and the Recovery Model is in line with the backup process.

[Screenshot: dbproperty]

Also, when moving from one environment to another the users connected to the old domain are obsolete.  Going to Users and trying to remove them results in an error and the user is only disabled.

[Screenshot: userdelete]

In some cases this could be enough.  It is possible to completely remove the user, just do some housekeeping first.  Find the user in “User Personalization” and Edit the record.  On the Action ribbon we can clean some of the user trails.

[Screenshot: clearpersonalization]

Finally make sure that the record is deleted from “User Personalization”.

After this cleanup work the user can be safely deleted.

 

Give our customers a NAV 2015 demo with their own data in a few steps

Microsoft just released the Cumulative Update 1 for Dynamics NAV 2015.  With this update we no longer need to push our data upgrade through NAV 2013 R2.  As the blog says:

“The cumulative update includes an upgrade toolkit for upgrading a Microsoft Dynamics NAV 2009 R2 or Microsoft Dynamics NAV 2009 SP1 database to Microsoft Dynamics NAV 2015. The upgrade toolkit includes several application objects in FOB files that simplify the upgrade process for those of you coming from Microsoft Dynamics NAV 2009 R2 or Microsoft Dynamics NAV 2009 SP1. For more information, see the attached whitepaper.”

Attached is the new PDF document on how to upgrade a NAV 2009 R2 or NAV 2009 SP1 database to NAV 2015.

The biggest task in the data upgrade from a customized database to a standard one is to handle all the data modifications.  For this I use the Table Compare tool.  I need both the customized version and the old standard version.  Export all tables from the Object Designer to a text file and import into the tool.

[Screenshot: ImportTableData2015]

Also import the 2015 objects in a similar way.  After importing all the table definitions into the tool and selecting the appropriate Base Version for each customized version, go into the Version Comparisons.

[Screenshot: VersionCompare1]

After pressing the Compare button the result is shown.

[Screenshot: VersionCompareResult]

Click on the counters to drill down to the tables.  For all the tables listed as modified or deleted, select the action to force the data transformation.

[Screenshot: ForceTables]

Back on the Version Comparison click the button to Create 2015 Upgrade Code and Save Codeunit Object File.

[Screenshot: SaveUpgradeCodeunit]

 

Now we step over to the data upgrade task.  We follow the guide by Microsoft: prepare the 2009 R2 version, convert with 2013, open with 2015, compile system tables, attach a service, run the schema synchronization and import the 2015 application objects.

Here we stop after step 21 and add our exported upgrade Codeunit by importing and compiling.  Now, run the Sync. Schema for All Tables With Validation.  This should clean all fields that are not a part of the standard NAV.

[Screenshot: Sync]

Remove the custom upgrade Codeunit from the database and continue with step 22, Import Upgrade601800.[Country].fob and so forth…

Continue the Microsoft steps and you should have a 2015 version database ready to use as a demo database for the customer.

p.s. This method will also apply to any upgrade process from 2009 R2 to 2015.  If there is customization in the 2015 database, just use the tool to create temporary tables and data transfer code.

The Next Upgrade Story – Data transfer code generation tool

Guess what, more companies are moving to NAV 2013 R2.

I am about to start a data upgrade for a medium-sized Icelandic company.  Doing a data transfer means that you need to compare the tables of the old version and the new version, and where you have a mismatch you have to take some action.  The data transfer is built on the upgrade from 2009 to 2013 R2 released by Microsoft.

We have the MergeTool that is used to compare and merge code and do a lot of other cool things.  I wanted a tool that did something else so I started designing and writing one.  I am hoping that the result will save me a lot of work in future upgrades.

[Screenshot: UpgradeMenu]

This tool imports exported object files into a table version.  Then I can compare two table versions and get all the differences listed.  I also import the CRONUS tables for each version to make sure that I am not comparing built-in fields.

[Screenshot: TableVersions]

I give the comparison an identification and select the source and destination versions.  Then I hit Compare.  Within seconds I get the result.

[Screenshot: CompareResult3]

I have to make decisions on what to do with modified and deleted tables.  I can list all the tables and check the field comparison.

[Screenshot: Fields]

I can select both step 1 and step 2 actions for each table.  For new fields I can select a source field from the source table.  The example I show here is the customer table.  I choose in step 1 to copy all the needed fields to a temporary table.  The tool will find the next available upgrade table automatically.  In step 2 I choose to move the data from the temporary table into the destination version.  I can ask that a new field will get a value from another field in the temporary table.  In this example the Registration No. field has a new ID.

The best part is yet to come.  To really save some work I make the tool create all the needed NAV upgrade objects.  For step 1 I get all the upgrade tables with only required fields.

[code]OBJECT Table 66001 Temp Customer
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
FIELDS
{
{ 1 ; ;No. ;Code20 }
{ 10010400;;Registration No. ;Code20 }
{ 10017241;;Require Purchase Request;Boolean }
{ 10017290;;Freight Paid By ;Option ;OptionString=[Recipient,Prepaid,Senders Account,See Ship-to Address] }
}
KEYS
{
{ ;No. ;Clustered=Yes }
}
FIELDGROUPS
{
}
CODE
{

BEGIN
{
Dynamics.is Upgrade Table
}
END.
}
}

OBJECT Table 66002 Temp Cust. Ledger Entry
{

[/code]

The codeunit required to do the data manipulation.

[code]OBJECT Codeunit 66000 Upgrade FurblandanR
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
CODE
{
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
BEGIN
// Call this function from the top of the Upgrade trigger in Codeunit ID 104045
CopyCustomer(StateIndicator);
CopyCustLedgerEntry(StateIndicator);

MoveCashierBankBranch(StateIndicator);

ForcePaymentTerms(StateIndicator);
ForceSalesHeader(StateIndicator);

END;

LOCAL PROCEDURE CopyCustomer@2(VAR StateIndicator@1002 : Record 104037);
VAR
Customer@1000 : Record 18;
TempCustomer@1001 : Record 66001;
BEGIN
WITH Customer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
CALCFIELDS("Picture");
TempCustomer.INIT;
TempCustomer.TRANSFERFIELDS(Customer);
TempCustomer.INSERT;
UNTIL NEXT = 0;
MODIFYALL("Registration No.",”);
MODIFYALL("Require Purchase Request",FALSE);
MODIFYALL("Freight Paid By",0);
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE CopyCustLedgerEntry@3(VAR StateIndicator@1002 : Record 104037);
VAR
CustLedgerEntry@1000 : Record 21;
TempCustLedgerEntry@1001 : Record 66002;
BEGIN
WITH CustLedgerEntry DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
TempCustLedgerEntry.INIT;
TempCustLedgerEntry.TRANSFERFIELDS(CustLedgerEntry);
TempCustLedgerEntry.INSERT;
UNTIL NEXT = 0;
MODIFYALL("Final Due Date",0D);
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE MoveCashierBankBranch@26(VAR StateIndicator@1002 : Record 104037);
VAR
CashierBankBranch@1000 : Record 10010402;
TempCashierBankBranch@1001 : Record 10000202;
BEGIN
WITH CashierBankBranch DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
TempCashierBankBranch.INIT;
TempCashierBankBranch.TRANSFERFIELDS(CashierBankBranch);
TempCashierBankBranch.INSERT;
UNTIL NEXT = 0;
DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE ForceFixedAsset@48(VAR StateIndicator@1002 : Record 104037);
VAR
FixedAsset@1000 : Record 5600;
BEGIN
WITH FixedAsset DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDFIRST THEN BEGIN
StateIndicator.Update;
MODIFYALL("Straight-Line %",0);
MODIFYALL("Depreciation Starting Date",0D);
MODIFYALL("Biling Contract No.",”);
END;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;


[/code]

And to complete step 1 the addition needed to the DeleteDiscontinuedTables trigger.

[code]// Add these lines to the DeleteDiscontinuedTables trigger of Codeunit ID 104002
DeleteTable(72000);
DeleteTable(72001);
DeleteTable(72003);
DeleteTable(87400);
…
[/code]

Then there is the upgrade codeunit for step 2

[code]OBJECT Codeunit 66001 Upgrade FurblandanR
{
OBJECT-PROPERTIES
{
Date=11.02.14;
Time=12:11:11;
Version List=Dynamics.is;
}
PROPERTIES
{
}
CODE
{
PROCEDURE Upgrade@1(VAR StateIndicator@1000 : Record 104037);
BEGIN
// Call this function from the top of the Upgrade trigger in Codeunit ID 104048
CopyBankingWebServicesUser(StateIndicator);
MoveCustomer(StateIndicator);
…
END;

LOCAL PROCEDURE CopyBankingWebServicesUser@2(VAR StateIndicator@1002 : Record 104037);
VAR
TempCashierWebServicesUser@1000 : Record 66022;
BankingWebServicesUser@1001 : Record 10010428;
BEGIN
WITH TempCashierWebServicesUser DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
"BankingWebServicesUser".INIT;
"BankingWebServicesUser"."Authentication Code" := "Authentication Code";
"BankingWebServicesUser"."User ID" := "User ID";
"BankingWebServicesUser"."Certificate Key Identifier" := "Certificate Key Identifier";
//"BankingWebServicesUser"."User ID Store" := ;
//"BankingWebServicesUser"."Password Store" := ;
//"BankingWebServicesUser"."Certificate Location" := ;
//"BankingWebServicesUser"."Certificate Value" := ;
"BankingWebServicesUser".INSERT;
UNTIL NEXT = 0;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;

LOCAL PROCEDURE MoveCustomer@3(VAR StateIndicator@1002 : Record 104037);
VAR
TempCustomer@1000 : Record 66001;
Customer@1001 : Record 18;
BEGIN
WITH TempCustomer DO
IF StateIndicator.UpdateTable(TABLENAME) THEN BEGIN
TimeLog.TimeLogInsert(TimeLog,TABLENAME,TRUE);
IF FINDSET(TRUE) THEN
REPEAT
StateIndicator.Update;
"Customer".GET("No.");
"Customer"."Registration No." := "Registration No.";
//"Customer"."Momentum Claims Active" := ;
"Customer".MODIFY;
UNTIL NEXT = 0;
DELETEALL;
TimeLog.TimeLogInsert(TimeLog,TABLENAME,FALSE);
StateIndicator.EndUpdateTable(TABLENAME);
END;
END;


[/code]

And to complete the process the upgrade tables are marked to be deleted.

[code]// Add these lines to the OnRun trigger of Codeunit ID 104003
MarkTable(66001);
MarkTable(66002);
…[/code]

This tool is in testing and I am planning to add support for the NAV 2015 upgrade tool.

Help Microsoft with the upgrade story

Customer Survey: Microsoft Dynamics NAV upgrade

With this survey, we would like to solicit your valuable input in regard to the Microsoft Dynamics NAV upgrade story to better understand your current upgrade situation.

The questionnaire below focuses on upgrading from Microsoft Dynamics NAV 2009 Classic client to the RoleTailored client as well as the upgrade experience in general. The Microsoft Dynamics NAV team is determined to provide you with the best possible service to help you move your business forward and get the most of our new releases.

http://sgiz.mobi/s3/8dd03168613b

The upgrade story – continued

The client went live on the 6th.  This is the task I told you about last September.

The database was 567 GB in size with just a little less than 400 GB used.  It had 1,506,747 sales invoices and 30,438,858 value entries.  Version 4.0 SP3 with a lot of added fields and customization.

When I first ran the upgrade process with the standard upgrade tool it took about ten days.  As you can see there is no way to stop a company for ten days.  The process was rewritten and these are the steps I used.

  • Backup the live database with compression enabled
  • Copy the backup file to the new SQL server
  • Restore the backup to the new server
  • Drop all statistics that were stopping me from modifying tables
  • Run Inventory Adjustment
  • Import and execute 4.0 SP3 -> 2009 R2 Step 1
  • Import 2009 R2 objects
  • Prepare data for Step 2 by importing conversion tables
  • Execute 4.0 SP3 -> 2009 R2 Step 2
  • Import 2009 R2 -> 2013 R2 upgrade objects and execute Step 1
  • Change database compatibility level to 2012
  • Execute SQL commands to drop all statistics that were blocking the database conversion
  • Open the database with the NAV 2013 Development Environment and convert the database
  • Open the database with the NAV 2013 R2 Development Environment and convert the database
  • Compile the system tables
  • Change the SQL Command Timeout for the service to 12:00:000 and start the service
  • Select “No” in the Development Environment option “Prevent data loss from table changes”
  • Import all NAV 2013 R2 objects
  • From PowerShell, execute Sync-NAVTenant
  • Select “Yes” in the Development Environment option “Prevent data loss from table changes”
  • Import 2009 R2 -> 2013 R2 upgrade objects and execute Step 2

There were a few minor steps in between and afterwards that were needed for this customer, but basically this was it.

The good news is: this was completed in just under 23 hours, with an unexpected delay of just over an hour.  This can be done and the client is happy.

A Table Hook to minimize the upgrade work

In line with the methods introduced by Partner Ready Software, I am now upgrading old code to NAV 2013 R2 with the goal of modifying the standard objects as little as possible.

I don’t recall Mark, Waldo, Gary or Vjeko speaking about a Table Hook but I decided to go that way anyway.  As an example I have about twenty new fields for the Customer table.  Instead of creating these fields in table 18 I create a new table that I call Customer Hook.  The Hook Table must include the fields from the primary key and the primary key itself from the Master Table.

[Screenshot: CustomerHook]

The only modification I do in table 18 is a new function

[Screenshot: GetHook]

In a code where I need to access the new custom fields I use this function to get the Table Hook to work with.
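Since the screenshots are not reproduced here, below is a rough sketch of the idea.  It is written in AL syntax for readability; in the original NAV 2013 R2 objects the GetHook function was added directly to table 18 in C/AL, and the object names, numbers and the example field are illustrative only.

table 50110 "Customer Hook"
{
    // The hook table repeats the primary key of the master table and carries the
    // custom fields; the second field is just an example.
    fields
    {
        field(1; "No."; Code[20]) { }
        field(10; "Loyalty Code"; Code[10]) { }
    }
    keys
    {
        key(PK; "No.") { Clustered = true; }
    }
}

tableextension 50110 "Customer Hook Ext" extends Customer
{
    // GetHook returns the hook record for the current customer and creates it on
    // the fly the first time it is requested.
    procedure GetHook(var CustomerHook: Record "Customer Hook")
    begin
        if not CustomerHook.Get("No.") then begin
            CustomerHook.Init();
            CustomerHook."No." := "No.";
            CustomerHook.Insert();
        end;
    end;
}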

In the Hook table I have a few functions.

[Screenshot: HookFunctions]

Next I created a Codeunit to hook to the global triggers in Codeunit 1.  This Codeunit forwards the triggers from Codeunit 1 to the Hook Table.

[Screenshot: DbHookCodeunit]

I need to add four lines to Codeunit 1.

[Screenshot: Codeunit1Changes]

When all this is ready I am able to create or extend the Customer Page.  Here it is easy to mix together fields from the Customer Table and the Customer Hook Table.

[Screenshot: PageFields]

For the fields from the Hook Table I can add a Table Relation and a Validate code.

[Screenshot: CodeValidate]

On the Page I create a local function

[Screenshot: SaveFunction]

and finally make sure that I get the Hook table matching the Customer on the Page.

[Screenshot: GetHookOnPage]