I am working on a data upgrade for a customer. The existing database is from another partner and has tables and fields that my development license does not give me access to.
The database is in NAV 2016 and I am upgrading to Business Central.
So, I added some functions to the AdvaniaGIT PowerShell module.
To get the table definition from the tenant database I use
(Get-NAVTableMetaDataXml -DatabaseName <tenantDatabasename> -TableId 289 -Snapshot).OuterXml | Out-File C:\AdvaniaGIT\Workspace\xml.xml
I can view the resulting Xml file in any Xml viewer.
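For a quick look without leaving the console, the file can also be loaded straight into PowerShell. A minimal sketch, assuming the output path used above (the element names naturally depend on the metadata format):

[xml]$tableMetadata = Get-Content -Raw 'C:\AdvaniaGIT\Workspace\xml.xml'
$tableMetadata.DocumentElement.ChildNodes | Select-Object Name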

This data will help me update my upgrade tables for the Sync-NAVTenant command to complete successfully.
But, I wanted more. Since the destination for the data is an App table, I feel that I have no need for the data in the database during the C/AL upgrade.
I also created the Get-NAVTenanttableNos function that will give me a list of all tables in the tenant database. Using this data I was able to iterate through all the tables and export the structure and data.
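To give an idea of how that loop looks, here is a minimal sketch; I am assuming that Get-NAVTenanttableNos simply returns the table numbers and that it takes a -DatabaseName parameter, with the same <tenantDatabasename> placeholder convention as above.

foreach ($tableNo in Get-NAVTenanttableNos -DatabaseName <tenantDatabasename>) {
    (Get-NAVTableMetaDataXml -DatabaseName <tenantDatabasename> -TableId $tableNo -Snapshot).OuterXml | Out-File "C:\AdvaniaGIT\Workspace\Table$tableNo.xml"
}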
I created an empty folder and executed
Get-NAVUpgradeTablesData -TenantDatabase <tenantDatabasename> -ApplicationDatabase 2018-App -CompanyName "MyCompany" -Verbose -ExportPath 'C:\AdvaniaGIT\Workspace\MyCompany'
This gives me all the data that does not fit in the standard 2018-App, in separate Xml files.
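A quick sanity check after the export is to list the folder and count the files, for example:

Get-ChildItem 'C:\AdvaniaGIT\Workspace\MyCompany' -Filter *.xml | Measure-Object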

Each file contains only the fields required to import the custom data. As an example, the Xml file for table 17, G/L Entry, does not contain all the data from G/L Entries, just the primary key and the custom fields.
In the Xml file I include details for the table and for the fields included in the data set.
Custom tables that do not have any match in the 2018-App database are exported with all fields. With all the custom data stored safely, I can execute Sync-NAVTenant with the Force option to get the database upgraded.
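For reference, the forced synchronization itself is the standard administration cmdlet; the server instance and tenant id below are placeholders:

Sync-NAVTenant -ServerInstance <serverInstance> -Tenant <tenantId> -Mode ForceSync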

The next task is in AL, writing the code to import the Xml files to the App tables.
I will be back with the results from that task…