Import Tenant Data

My last post was about how I got the customized data out of the tenant database into Xml files. That tenant database was from a NAV 2016 application.

I have updated the tenant database to Business Central and I need to bring in some of the data from these Xml files.

My first issue was that I needed to make these Xml files available to Business Central. I have been using Azure Blob to store files for some years now. I had both AL and C/AL code that was able to connect to the Azure Blob REST Api, but that code used DotNet variables, which are no longer an option.

I did some preparation last year when I asked Microsoft to add some functionality to the BaseApp. Using that BaseApp functionality I was able to redo my Azure Blob AL code as a clean extension.

I also wanted to put the AL code in a public place for everyone to see, and GitHub is the default code storage place. I created a project for Business Central AL.

I am hoping that this can become the place where code examples for our Business Central community are shared and maintained. If you want to contribute then I can add you to this project, or I can approve your pull request.

I need to write another blog post about that Azure Blob app and the other repositories I have created there. Hope to find time soon.

There is another repository in this project for the Import Tenant Data App. This app has Azure Blob Connect functionality that utilizes the Azure Blob app for data import.

I start by opening the Import Data Source page.

Here I find the Azure Blob Connector that self-registered in the Import Data Source table.

I need to go to Process -> Setup to configure my Azure Blob container access.

The information required can be found in the Azure Portal.

Specify the container where you have uploaded all the Xml files.

Then I searched for Import Project List and created a new import project for the General Ledger. The Import Source for Azure Blob was automatically selected, since that is the only one available.

Now to import the related Xml files into this project.

I get a list of files from the Azure Blob and select the one I need.

The file list will open again if I have more files to import. Close the file list when finished. Back on the Import Project we should now see information from the Xml file.

For each file I need to configure the destination mapping.

If the table exists in my Business Central App then it will be automatically selected.

And I can map fields from the Xml file to the Business Central Table.

There are options to handle different data structures. One is that we can add a transformation rule directly to each field. The other is to use our own custom data upgrade app that subscribes to the events published in this app.

Four events are published: two for each field in the mapping and two before updating or inserting the database record.

Based on the information in the publishers we can do any manual data modification required. In my example the creation time was stored on each G/L Entry in NAV but is stored on the G/L Register in Business Central.
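To give an idea of what such a subscriber could look like, here is a minimal sketch. The publisher Codeunit name, event name and parameters are placeholders I made up for illustration; the actual names are defined in the Import Tenant Data app on GitHub.

codeunit 50200 "My Data Upgrade Subscriber"
{
    // Sketch only: "Import Project Data" and OnBeforeInsertRecord are hypothetical
    // placeholder names, not the actual publisher defined in the Import Tenant Data app.
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Import Project Data", 'OnBeforeInsertRecord', '', false, false)]
    local procedure HandleBeforeInsertRecord(var DestinationRecRef: RecordRef)
    begin
        // Here we could, for example, move the creation time that was stored on each
        // G/L Entry in NAV over to the G/L Register record in Business Central.
    end;
}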

From the list of tables we are able to start the data transfer. First we need to make sure that we have the correct configuration for the import. Do we want to commit during the import? Do we want to create missing records in our database?

I select to commit after every 1,000 records. If my data transfer stops, then I can resume from that position when I start the data transfer again.

We have the option to create a task in the job queue to handle the data transfer.

The job queue can handle multiple concurrent transfers, so the import should not take too much time. Looking into the Destination Mapping, we can see the status of the data import.

I will add a few more pictures to give you a better idea of what can be done with this Import Tenant Data app. The AL code is on GitHub for you to browse, improve and fix.

Use references to break compile dependencies

I was looking into a customer App yesterday. That app had a dependency defined in app.json.

I wanted to look at the real requirements for this dependency. I found 1 (one) place in my code where this dependent App was used.

dataitem(PageLoop; "Integer")
{
    DataItemTableView = SORTING (Number) WHERE (Number = CONST (1));
    column(Phone_No_Cust; Cust."Phone No.")
    {
    }
    column(Registration_No_Cust; Cust."ADV Registration No.")
    {
    }
    column(CompanyInfo_Picture; CompanyInfo.Picture)
    {
    }

In Iceland we add a field to the Customer table (Cust."ADV Registration No."). Every business entity in Iceland has a registration number. A company only has one registration number but can have multiple VAT numbers. We already have that registration number field in the Company Information record, but we also add it to the Customer, Vendor and Contact records. The Employee social security number equals the registration number for an individual.

To be able to remove the compile dependency, and therefore the installation dependency, I did the following:

Removed the dependency from app.json

Added a variable to the report

    var
        ADVRegistrationNo: Text;

Changed the data set configuration to use this variable

dataitem(PageLoop; "Integer")
{
    DataItemTableView = SORTING (Number) WHERE (Number = CONST (1));
    column(Phone_No_Cust; Cust."Phone No.")
    {
    }
    column(Registration_No_Cust; ADVRegistrationNo)
    {
    }
    column(CompanyInfo_Picture; CompanyInfo.Picture)
    {
    }

Located the code that fetches the Customer record and added a reference-based (RecordRef/FieldRef) way to get the required data

trigger OnAfterGetRecord()
var
    DataTypeMgt: Codeunit "Data Type Management";
    RecRef: RecordRef;
    FldRef: FieldRef;
begin
    Cust.Get("Ship-to Customer No.");
    RecRef.GetTable(Cust);
    if DataTypeMgt.FindFieldByName(RecRef, FldRef, 'ADV Registration No.') then
        ADVRegistrationNo := FldRef.Value();
end;

There are both positive and negative repercussions of these changes.

The positive is that we can now install and uninstall both apps without worrying about the compile dependency.

The negative is that breaking changes to the dependent App does not break the installation of this customer App.

So, what happens if the dependent App is not installed? The FindFieldByName will return false and the variable will be blank text.

Since we have adopted the same policy Microsoft uses, no breaking table changes, this field should just be there.

If the data is required, and its absence would break the functionality, we can change the code to something like this.


Cust.Get("Ship-to Customer No.");
RecRef.GetTable(Cust);
if DataTypeMgt.FindFieldByName(RecRef, FldRef, 'ADV Registration No.') then
    ADVRegistrationNo := FldRef.Value()
else
    Error('Please install the Advania IS Localization App into this Tenant!')

Logging your App Activity

It is good practice to have some audit log of what users do in the application. Some versions ago Microsoft introduced the Change Log to log data changes. How about logging an action execution?

One of the built-in solutions in Business Central can be used to solve this. We now have the Activity Log (Table 710).

To use the Activity Log we need to have a record to attach the activity log to. All our Apps have a Setup table that usually has only one record. I like to attach my Activity Log to that record.

To show the Activity Log from that record you can add this action to that record’s page.

action("ActivityLog")
{
    ApplicationArea = All;
    Caption = 'Activity Log';
    Image = Log;
    Promoted = true;
    PromotedCategory = Process;
    PromotedOnly = true;
    Scope = "Page";
    ToolTip = 'See the data activities for this App.';

    trigger OnAction()
    var
        ActivityLog: Record "Activity Log";
    begin
        ActivityLog.ShowEntries(Rec);
    end;
}

The logging part can be something like this.

local procedure LogActivity(ADVUpgradeProjTable: Record "ADV Upgrade Project Table"; Context: Text[30])
var
    ActivityLog: Record "Activity Log";
    Status: Option Success,Failed;
begin
    // ADVUpgradeProject is a global record variable of this Codeunit (not shown in this snippet).
    if ADVUpgradeProject."App Package Id" <> ADVUpgradeProjTable."App Package Id" then begin
        ADVUpgradeProject.SetRange("App Package Id", ADVUpgradeProjTable."App Package Id");
        ADVUpgradeProject.FindFirst();
    end;
    ActivityLog.LogActivity(
        ADVUpgradeProject,
        Status::Success,
        Context,
        StrSubstNo('%1', ADVUpgradeProjTable."Data Upgrade Method"),
        StrSubstNo('%1 (%2)', ADVUpgradeProjTable."App Table Name", ADVUpgradeProjTable."App Table Id"));
end;

We also have the possibility to log details, both from a text value and from an InStream.

ActivityLog.SetDetailedInfoFromText("Text variable");

ActivityLog.SetDetailedInfoFromStream("in Stream");

In Business Central we have information about the execution context. I pass that execution context into the LogActivity. This gives me information on the session that is executing the code.

local procedure GetExecutionContext(): Text[30]
var
    SessionContext: ExecutionContext;
begin
    SessionContext := Session.GetCurrentModuleExecutionContext();
    case SessionContext of
        SessionContext::Install:
            exit(CopyStr(InstallationMsg, 1, 30));
        SessionContext::Upgrade:
            exit(CopyStr(UpgradeMsg, 1, 30));
        SessionContext::Normal:
            exit(CopyStr(UserContextMsg, 1, 30));
    end;
end;

var
    InstallationMsg: Label 'App Installation';
    UpgradeMsg: Label 'App Upgrade';
    UserContextMsg: Label 'Started by user';

Using this logic we can log all executions during install, upgrade and normal user scenarios. If we need information on the variables, we can log them into the detailed information using either JSON or XML.
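As a sketch, a variation of the LogActivity procedure above could serialize a few values into JSON and attach them as detailed info. The procedure name and the JSON keys are made up for this example; the rest uses the same Activity Log procedures shown above.

local procedure LogActivityWithDetails(ADVUpgradeProjTable: Record "ADV Upgrade Project Table"; Context: Text[30])
var
    ActivityLog: Record "Activity Log";
    Details: JsonObject;
    DetailsText: Text;
    Status: Option Success,Failed;
begin
    // ADVUpgradeProject is the same global record variable used in LogActivity above.
    ActivityLog.LogActivity(
        ADVUpgradeProject,
        Status::Success,
        Context,
        StrSubstNo('%1', ADVUpgradeProjTable."Data Upgrade Method"),
        StrSubstNo('%1 (%2)', ADVUpgradeProjTable."App Table Name", ADVUpgradeProjTable."App Table Id"));

    // Collect the values we want to keep for troubleshooting (example keys only).
    Details.Add('AppTableId', ADVUpgradeProjTable."App Table Id");
    Details.Add('DataUpgradeMethod', Format(ADVUpgradeProjTable."Data Upgrade Method"));
    Details.WriteTo(DetailsText);

    // Attach the JSON text as detailed info on the entry we just logged.
    ActivityLog.SetDetailedInfoFromText(DetailsText);
end;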

Event subscription and performance

When we design and write our code we need to think about performance.

We have been used to thinking about database performance, using FindFirst(), FindSet(), IsEmpty() where appropriate.

We also need to think about performance when we create our subscriber Codeunits.

Let’s consider this Codeunit.

codeunit 50100 MySubscriberCodeunit
{
    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;


}

Every time any user posts a sales document this subscriber will be executed.

Executing this subscriber will need to load an instance of this Codeunit into the server memory. After execution the Codeunit instance is trashed.

Initiating an instance of this Codeunit and trashing it again, for every sales document being posted, is a waste of resources.

Let's change the Codeunit and make it "Single Instance":

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;
    
    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        Message('I am pleased that you called.');
    end;


}

What happens now is that the Codeunit only has one instance per session. When the first sales document is posted, an instance of the Codeunit is created and kept in memory on the server for as long as the session is alive.

This will save the resources needed to initialize an instance and tear it down again.

Making sure that our subscriber Codeunits are set to single instance is even more important for subscribers to system events that are frequently executed.

Note that a single instance Codeunit used for subscription should not have any global variables, since the global variables are also kept in memory throughout the session lifetime.

Make sure that whatever is executed inside a single instance subscriber Codeunit is executed in a local procedure. The variables inside a local procedure are cleared between every execution, also in a single instance Codeunit.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);

    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');    
    end;

}

If your custom code executes every time that the subscriber is executed then I am fine with having that code in a local procedure inside the single instance Codeunit.

Still, I would suggest putting the code in another Codeunit, and keeping the subscriber Codeunit as small as possible.

This is even more important if the custom code only executes on a given condition.

An example of a Codeunit that you call from the subscriber Codeunit could be like this.

codeunit 50001 MyCodeCalledFromSubscriber
{
    TableNo = "Sales Header";
    
    trigger OnRun()
    begin
        ExecuteBusinessLogic(Rec);
    end;
    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        Customer: Record Customer;
    begin
        Message('I am pleased that you called.');    
    end;
}

And I change my subscriber Codeunit to only execute this code on a given condition.

codeunit 50100 MySubscriberCodeunit
{
    SingleInstance = true;

    trigger OnRun()
    begin
        
    end;
    

    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Sales-Post", 'OnBeforePostSalesDoc', '', true, true)] 
    local procedure MyProcedure(var SalesHeader: Record "Sales Header")
    begin
        ExecuteBusinessLogic(SalesHeader);

    end;

    local procedure ExecuteBusinessLogic(SalesHeader: Record "Sales Header")
    var
        MyCodeCalledFromSubscriber: Codeunit MyCodeCalledFromSubscriber;
    begin
        if SalesHeader."Document Type" = SalesHeader."Document Type"::Order then
            MyCodeCalledFromSubscriber.Run(SalesHeader);
    end;

}

This pattern makes sure that the execution is as fast as possible and that no unneeded variables are populating the server memory.

User Group Focus 2019


Dynamics 365 Business Central & NAV  
March 13-14, 2019

On March 11th and 12th I will be teaching a VSCode and Modern NAV Development workshop. This course will be held from 8:00 AM to 5:00 PM each day.

The goal of the workshop is to learn about the new development tool for Business Central (Dynamics NAV), VSCode, GIT source control management, and to experience what AL programming is about:
• What makes AL different from C/AL 
• How do you build and deploy a new BC feature 
• How can I convert my current code into AL 
• How to get ready for publishing your IP to AppSource 
• How to use GIT for your code

On the Developer track I will host three sessions.

Wednesday, March 13, 2019, 10:15 AM – 11:45 AM, Room: Founders III

DEV75: How to Prepare Your Code for AL

Ready to completely re-think your all-in-one C/AL application? How about we try this: figure out how to split the code into “bricks” by functionality and/or processes, then turn that pile of bricks back into a usable solution. Can you migrate your customer data from the all-in-one C/AL database to the new continuous delivery cycle, replacing C/AL bricks with AL bricks? Let’s find out!

Wednesday, March 13, 2019, 4:00 PM – 5:30 PM, Room: Founders II

DEV78: How I Got my Big Database Upgraded to Business Central

Your database upgrade takes longer than your available downtime window – bit of a problem, right? How about when executing all the upgrade processes on your database takes close to 10 days? Yeah, that’s a big problem. Of course you cannot stop a business for 10 days, but how do you shrink that to fit the 30-hour window over the weekend? You’ll hear the real life story and learn about the tools and methods you can use to streamline your upgrades.

Thursday, March 14, 2019, 8:00 AM – 9:30 AM, Room: Founders III

DEV79: Breaking the Compilation Dependencies

Going to the extension model requires a simple structure to allow multiple extensions to talk to each other without having to put all of them into a compile dependency or into the same extension. Applying the standard API pattern inside the Business Central service tier gives us the possibility to implement all required functionality in a fast and easy way. This session is about explaining this pattern and giving some examples of how we have been using it.

JSON Interface – prerequisites

There are two objects we use in all JSON interfaces. We use the TempBlob table and our custom JSON Interface Codeunit.

Abstract

The JSON interface uses the same concept as a web service. The endpoint is defined by the Codeunit Name, and the caller always supplies request data (JSON) and expects response data (JSON).

These interface calls are therefore internal to the Business Central (NAV) server and are very fast. All the data is handled in memory only.

We define these interfaces by Endpoints. Some Endpoints have Methods. We call these Endpoints with a JSON payload. The JSON structure is predefined and every interface respects the same structure.

We have a single Codeunit that knows how to handle this JSON structure. Passing JSON to an interface requires a data container.

Interface Data

TempBlob is table 99008535. The table is simple but it has a lot of useful procedures.

table 99008535 TempBlob
{
    Caption = 'TempBlob';

    fields
    {
        field(1;"Primary Key";Integer)
        {
            Caption = 'Primary Key';
            DataClassification = SystemMetadata;
        }
        field(2;Blob;BLOB)
        {
            Caption = 'Blob';
            DataClassification = SystemMetadata;
        }
    }

    keys
    {
        key(Key1;"Primary Key")
        {
        }
    }
}

Wikipedia says: A Binary Large OBject (BLOB) is a collection of binary data stored as a single entity in a database management system. Blobs are typically images, audio or other multimedia objects, though sometimes binary executable code is stored as a blob. Database support for blobs is not universal.

We use this BLOB for our JSON data when we send a request to an interface and the interface response is also JSON in that same BLOB field.

For people that have been working with web requests we can say that TempBlob.Blob is used both for RequestStream and for ResponseStream.

TempBlob is only used as a form of Stream. We never use TempBlob to store data. We never do TempBlob.Get() or TempBlob.Insert(). And, even if the name indicates that this is a temporary record, we don’t define the TempBlob Record variable as temporary. There is no need for that since we never do any database call for this record.

Interface Helper Codeunit

We use a single Codeunit in all our solutions to prepare both request and response JSON and also to read from the request on the other end.

We have created a Codeunit that includes all the required procedures for the interface communication.

We have three functions to handle the basics;

  • procedure Initialize()
  • procedure InitializeFromTempBlob(TempBlob: Record TempBlob)
  • procedure GetAsTempBlob(var TempBlob: Record TempBlob)

A typical flow of executions is to start by initializing the JSON. Then we add data to that JSON. Before we execute the interface Codeunit we use GetAsTempBlob to write the JSON into TempBlob.Blob. Every Interface Codeunit expects a TempBlob record to be passed to the OnRun() trigger.

codeunit 10008650 "ADV SDS Interface Mgt"
{
    TableNo = TempBlob;

    trigger OnRun()
    var
        Method: Text;
     begin
        with JsonInterfaceMgt do begin
            InitializeFromTempBlob(Rec);
...

Inside the Interface Codeunit we initialize the JSON from the passed TempBlob record. At this stage we have access to all the data that was added to the JSON on the request side.

And, since the interface Codeunit will return TempBlob as well, we must make sure to put the response JSON in there before the execution ends.

with JsonInterfaceMgt do begin
    Initialize();
    AddVariable('Success', true);
    GetAsTempBlob(Rec);
end;

JSON structure

The JSON is an array that contains one or more objects. A JSON array is represented with square brackets.

[]

The first object in the JSON array is the variable storage. This is an example of a JSON that passes two variables to the interface Codeunit.

[
  {
    "TestVariable": "TestVariableValue",
    "TestVariable2": "TestVariableValue2"
  }
]

All variables are stored in the XML format, using FORMAT(<variable>,0,9), and evaluated back using EVALUATE(<variable>,<json text value>,9). The JSON can then have multiple record-related objects after the variable storage.
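As a small illustration of that round trip (the procedure and variable names here are just for the example), a date written with format 9 becomes a culture-independent XML value and can be evaluated back on the other end:

procedure XmlFormatRoundTrip()
var
    PostingDate: Date;
    ParsedDate: Date;
    ValueAsText: Text;
begin
    PostingDate := 20190314D;
    // Format 9 writes the value in XML format, e.g. '2019-03-14'.
    ValueAsText := Format(PostingDate, 0, 9);
    // Evaluate with format 9 reads the XML format back into the variable.
    Evaluate(ParsedDate, ValueAsText, 9);
end;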

Adding data to the JSON

We have the following procedures for adding data to the JSON;

  • procedure AddRecordID(Variant: Variant)
  • procedure AddTempTable(TableName: Text; Variant: Variant)
  • procedure AddFilteredTable(TableName: Text; FieldNameFilter: Text; Variant: Variant)
  • procedure AddRecordFields(Variant: Variant)
  • procedure AddVariable(VariableName: Text; Value: Variant)
  • procedure AddEncryptedVariable(VariableName: Text; Value: Text)

I will write a more detailed blog about each of these methods and give examples of how we use them, but for now I will just do a short explanation of their usage.

If we need to pass a reference to a database table we pass the Record ID. Inside the interface Codeunit we can get the database record based on that Record ID. Each Record ID that we add to the JSON is stored with the Table Name, and we use either of these two procedures to retrieve the record, as sketched after the list.

  • procedure GetRecord(var RecRef: RecordRef): Boolean
  • procedure GetRecordByTableName(TableName: Text; var RecRef: RecordRef): Boolean
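A rough sketch of how that could look, using the procedure names listed above (the helper Codeunit name and the variable names are assumptions for the example):

procedure PassCustomerByRecordId(Customer: Record Customer)
var
    JsonInterfaceMgt: Codeunit "Json Interface Mgt"; // assumed name for the helper Codeunit
    TempBlob: Record TempBlob;
    RecRef: RecordRef;
begin
    // Caller side: store a reference to the customer record in the request JSON.
    JsonInterfaceMgt.Initialize();
    JsonInterfaceMgt.AddRecordID(Customer);
    JsonInterfaceMgt.GetAsTempBlob(TempBlob);

    // Interface side: read the record back from the Record ID stored in the JSON.
    JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
    if JsonInterfaceMgt.GetRecordByTableName('Customer', RecRef) then
        RecRef.SetTable(Customer);
end;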

If we need to pass more than one record we can pass all records inside the current filter and retrieve the result with

  • procedure UpdateFilteredTable(TableName: Text; KeyFieldName: Text; var RecRef: RecordRef): Boolean

A fully populated temporary table with table view and table filters can be passed to the interface Codeunit by adding it to the JSON by name. When we use

  • procedure GetTempTable(TableName: Text; var RecRef: RecordRef): Boolean

in the interface Codeunit to retrieve the temporary table we will get the whole table, not just the filtered content.

We sometimes need to give interface Codeunits access to the record that we are creating, similar to the OnBeforeInsert() system event. If we add the record fields to the JSON we can use

  • procedure GetRecordFields(var RecRef: RecordRef): Boolean

on the other end to retrieve the record and add or alter any field content before returning it back to the caller.

We have several procedures available to retrieve the variable values that we pass to the interface Codeunit.

  • procedure GetVariableValue(var Value: Variant; VariableName: Text): Boolean
  • procedure GetVariableTextValue(var TextValue: Text; VariableName: Text): Boolean
  • procedure GetVariableBooleanValue(var BooleanValue: Boolean; VariableName: Text): Boolean
  • procedure GetVariableDateValue(var DateValue: Date; VariableName: Text): Boolean
  • procedure GetVariableDateTimeValue(var DateTimeValue: DateTime; VariableName: Text): Boolean
  • procedure GetVariableDecimalValue(var DecimalValue: Decimal; VariableName: Text): Boolean
  • procedure GetVariableIntegerValue(var IntegerValue: Integer; VariableName: Text): Boolean
  • procedure GetVariableGUIDValue(var GuidValue: Guid; VariableName: Text): Boolean
  • procedure GetVariableBLOBValue(var TempBlob: Record TempBlob; VariableName: Text): Boolean
  • procedure GetVariableBLOBValueBase64String(var TempBlob: Record TempBlob; VariableName: Text): Boolean
  • procedure GetEncryptedVariableTextValue(var TextValue: Text; VariableName: Text): Boolean

We use Base 64 methods in the JSON. By passing the BLOB to TempBlob.Blob we can use

TextValue := TempBlob.ToBase64String();

and then

TempBlob.FromBase64String(TextValue);

on the other end to pass binary content, like images or PDFs.

Finally, we have the possibility to add and encrypt values that we place in the JSON. On the other end we can then decrypt the data to be used. This we use extensively when we pass sensitive data to and from our Azure Function.

Calling an interface Codeunit

As promised I will write more detailed blogs with examples. This is the current list of procedures we use to call interfaces;

  • procedure ExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text)
  • procedure TryExecuteInterfaceCodeunitIfExists(CodeunitName: Text; var TempBlob: Record TempBlob; ErrorIfNotFound: Text): Boolean
  • procedure TryExecuteCodeunitIfExists(CodeunitName: Text; ErrorIfNotFound: Text) Success: Boolean
  • procedure ExecuteAzureFunction() Success: Boolean

The first two expect a JSON to be passed using TempBlob. The third one we use to check for a simple true/false. We have no request data but we read the ‘Success’ variable from the response JSON.
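To put these together, a call could look roughly like this. The helper Codeunit name and the target interface Codeunit name are placeholders; the procedures are the ones listed above.

procedure CallInterfaceExample() Success: Boolean
var
    JsonInterfaceMgt: Codeunit "Json Interface Mgt"; // assumed name for the helper Codeunit
    TempBlob: Record TempBlob;
begin
    // Prepare the request JSON.
    JsonInterfaceMgt.Initialize();
    JsonInterfaceMgt.AddVariable('TestVariable', 'TestVariableValue');
    JsonInterfaceMgt.GetAsTempBlob(TempBlob);

    // Execute the interface Codeunit by name; it reads TempBlob and writes its response back into it.
    JsonInterfaceMgt.ExecuteInterfaceCodeunitIfExists('My Target Interface', TempBlob, 'Interface Codeunit not found');

    // Read the response JSON.
    JsonInterfaceMgt.InitializeFromTempBlob(TempBlob);
    if JsonInterfaceMgt.GetVariableBooleanValue(Success, 'Success') then;
end;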

For some of our functionality we use an Azure Function. We have created our function to read the same JSON structure we use internally. We also expect our Azure Function to respond with the same JSON structure. By doing it that way, we can use the same functions to prepare the request and to read from the response as we do for our internal interfaces.
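For the Azure Function case, the shape of a call might look like the sketch below. The variable names and the 'Result' key are invented for the example, and I am assuming that ExecuteAzureFunction() sends the JSON prepared in the helper and loads the response back into it.

procedure CallAzureFunctionExample(ApiKey: Text) ResultText: Text
var
    JsonInterfaceMgt: Codeunit "Json Interface Mgt"; // assumed name for the helper Codeunit
begin
    // Prepare the request JSON, encrypting the sensitive value before it leaves the tenant.
    JsonInterfaceMgt.Initialize();
    JsonInterfaceMgt.AddEncryptedVariable('ApiKey', ApiKey);

    // Call the Azure Function; it reads and returns the same JSON structure.
    if JsonInterfaceMgt.ExecuteAzureFunction() then
        if JsonInterfaceMgt.GetVariableTextValue(ResultText, 'Result') then;
end;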

Upgrading my G/L Source Names Extension to AL – step 2

So, where is step 1? Step 1 was converting C/AL code to AL code. This we did with AdvaniaGIT, as demonstrated here.

First things first! I received the following email from Microsoft.

Hello,

The decision has been made by our SLT, that the use of a Prefix or Suffix is now a mandatory requirement. If you are already using this in your app(s), great. If not, you will want to do so.

We are coming across too many collisions between apps in our internal tests during builds and have seen some in live tenants as well. It makes the most sense to make this a requirement now. If you think about it in a live situation, if a customer installs an app before yours and then tries yours but gets collision issues, they may just decide to not even continue on with yours.

Also, I have been made aware that adding a prefix or suffix after you already have a v2 app published can make the process complicated for you. Therefore, since you all have to convert to v2 anyway, now is a good time to add in the prefix/suffix.

The following link provides the guidelines around using it here

If you haven’t reserved your prefix yet, please email me back to reserve one (or more if needed).

Thank you,

Ryan

Since my brand is Objects4NAV.com I asked for 04N as my prefix and got it registered. Since we got this information from Microsoft, every object that we develop in NAV 2018 now has our company's prefix in the name.

I started my AL development by opening Visual Studio Code in my repository folder. I updated my setup.json to match the latest preview build as a Docker container and then selected Build NAV Environment using AdvaniaGIT.

After download and deployment of the container I noticed that the container had a brand new version of the AL Language extension for Visual Studio Code. The version installed in Visual Studio Code was an older one.

I uninstalled the AL Language extension and restarted Visual Studio Code.

As you can see in the screenshot above, we now don't have any AL Language extension installed. I executed the Build NAV Environment command from AdvaniaGIT to install the extension from the Docker container. In this case I already had a container assigned to my branch, so only three things happened.

  • uidOffset in the container database was updated.  This is recommended for C/AL development.
  • The license file is updated in the container database and container service. The license used is the one configured in the branch setup.json or in the machine settings GITSettings.json.
  • AL Language Extension is copied from the container to the host and installed in Visual Studio Code.

Again, I restarted Visual Studio Code to find that the latest version of the AL Language extension had been installed.

I then executed two AdvaniaGIT actions.

  • Update Launch.json with current branch environment.  This will update the host name and the service name in my AL Launch.json file to make sure that my AL project will be interacting with the branch container.
  • Open Visual Studio Code in AL folder.  This will open another instance of Visual Studio Code in the AL folder.

Immediately after Visual Studio Code was opened it asked for symbols, and I agreed to download them from the container.

Everything is now ready for AL development using the latest build that Microsoft has to offer.

I started Edit – Replace in Files in Visual Studio Code. All my objects have a name that starts with G/L Source Name. I used this knowledge to apply the prefix.

By starting the search with a double quote I make sure to update only the object names and not the captions. All captions start with a single quote. I got a list of all changes and needed to confirm each one.

The field name I add to the G/L Entry table does not match this rule, so I needed to rename that myself. Selecting the field name and pressing F2 allows me to rename a field and have Visual Studio Code update all references automatically.

Pressing F5 started my build, publish and debug.

My extension is installed and ready for testing.

There are a few more steps that I need to look into before publishing the new version of G/L Source Names to Dynamics 365.  These steps will appear here in the coming days.  Hope this will be useful to you all.