It is good practice to keep an audit log of what users do in the application. Some versions ago Microsoft introduced the Change Log to log data changes. How about logging an action execution?
One of the built-in solutions in Business Central can be used to solve this: the Activity Log (Table 710).
To use the Activity Log we need a record to attach the activity log entries to. All our apps have a setup table that usually holds only one record. I like to attach my Activity Log to that record.
To show the Activity Log from that record you can add this action to that record’s page.
action("ActivityLog")
{
ApplicationArea = All;
Caption = 'Activity Log';
Image = Log;
Promoted = true;
PromotedCategory = Process;
PromotedOnly = true;
Scope = "Page";
ToolTip = 'See the data activities for this App.';
trigger OnAction()
var
ActivityLog: Record "Activity Log";
begin
ActivityLog.ShowEntries(Rec);
end;
}
The logging part can be something like this.
local procedure LogActivity(ADVUpgradeProjTable: Record "ADV Upgrade Project Table"; Context: Text[30])
var
    ActivityLog: Record "Activity Log";
    Status: Option Success,Failed;
begin
    // ADVUpgradeProject is a global record variable holding the setup record the log entries are attached to
    if ADVUpgradeProject."App Package Id" <> ADVUpgradeProjTable."App Package Id" then begin
        ADVUpgradeProject.SetRange("App Package Id", ADVUpgradeProjTable."App Package Id");
        ADVUpgradeProject.FindFirst();
    end;
    ActivityLog.LogActivity(
        ADVUpgradeProject,
        Status::Success,
        Context,
        StrSubstNo('%1', ADVUpgradeProjTable."Data Upgrade Method"),
        StrSubstNo('%1 (%2)', ADVUpgradeProjTable."App Table Name", ADVUpgradeProjTable."App Table Id"));
end;
We can also log details, either as a text value or from an InStream.
In Business Central we have information about the execution context. I pass that execution context into LogActivity, which gives me information about the session that is executing the code.
local procedure GetExecutionContext(): Text[30]
var
    SessionContext: ExecutionContext;
begin
    SessionContext := Session.GetCurrentModuleExecutionContext();
    case SessionContext of
        SessionContext::Install:
            exit(CopyStr(InstallationMsg, 1, 30));
        SessionContext::Upgrade:
            exit(CopyStr(UpgradeMsg, 1, 30));
        SessionContext::Normal:
            exit(CopyStr(UserContextMsg, 1, 30));
    end;
end;
var
    InstallationMsg: Label 'App Installation';
    UpgradeMsg: Label 'App Upgrade';
    UserContextMsg: Label 'Started by user';
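Putting the pieces together, any action or process in the app can then write to the log with a single call. A minimal sketch, using the same hypothetical table as above:

    // somewhere in the brick's business logic
    LogActivity(ADVUpgradeProjTable, GetExecutionContext());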
Using this logic we can log every execution during install, upgrade and normal user scenarios. If we need information about the variables we can log them into the detailed information using either JSON or XML.
On March 11th and 12th I will be teaching a VSCode and Modern NAV Development workshop. This course will be held from 8:00 am to 5:00 pm each day.
The goal of the workshop is to learn about the new development tool for Business Central (Dynamics NAV), VSCode, GIT source control management and to experience what AL programming is about:
• What makes AL different from C/AL
• How do you build and deploy a new BC feature
• How can I convert my current code into AL
• How to get ready for publishing your IP to AppSource
• How to use GIT for your code
On the Developer track I will host three sessions.
Wednesday, March 13, 2019, 10:15 AM – 11:45 AM, Room: Founders III
DEV75: How to Prepare Your Code for AL (BCUG/NAVUG)
Ready to completely re-think your all-in-one C/AL application? How about we try this: figure out how to split the code into “bricks” by functionality and/or processes, then turn that pile of bricks back into a usable solution. Can you migrate your customer data from the all-in-one C/AL database to the new continuous delivery cycle, replacing C/AL bricks with AL bricks? Let’s find out!
Wednesday, March 13, 2019, 4:00 PM – 5:30 PM, Room: Founders II
DEV78: How I Got my Big Database Upgraded to Business Central
Your database upgrade takes longer than your available downtime window – bit of a problem, right? How about when executing all the upgrade processes on your database will take close to 10 days? Yeah, that’s a big problem. Of course you cannot stop a business for 10 days, but how do you shrink that to fit the 30-hour window over the weekend? You’ll hear the real life story and learn about the tools and methods you can use to streamline your upgrades.
Thursday, March 14, 2019, 8:00 AM – 9:30 AM, Room: Founders III
DEV79: Breaking the Compilation Dependencies
Going to the extension model requires a simple structure that allows multiple extensions to talk to each other without having to put all of them into a compile dependency or into the same extension. Applying the standard API pattern inside the Business Central service tier gives us the possibility to implement all the required functionality in a fast and easy way. This session explains the pattern and gives some examples of how we have been using it.
As we move away from C/AL and the all-in-one database, we need to redesign how we make one application or functionality available to the next.
Microsoft recommends that we build multiple AL extensions, not putting all our code into a single extension.
My goal is to be able to develop, test and execute an extension regardless of whether other extensions are installed. Some functionality may depend on other extensions, but I still don’t want to add compile dependencies between these extensions.
For just over a year we have been designing and implementing methods to solve this task. In the coming blog entries I will share these methods one by one. I hope these JSON interface methods will help you solve your tasks as well.
At the last Directions EMEA I gave a talk about the path Advania is on, from a C/AL all-in-one database to multiple AL extensions. Below is the content of that talk. Feel free to skip it and wait for the upcoming blogs about the technical solutions.
From the start we have been calling this project Lego®. I will spend some time detailing the path we have been on for almost three years. We have not yet started converting to AL, but it seems that the tooling Microsoft delivers with the fall release of Business Central will be the one we can use to start the conversion from C/AL to AL. We have done some tests and they are promising.
This new way of doing things has enabled new ways of doing our business and opened a new dialogue with our customers.
A brick should be able to stand alone. The only dependencies allowed are on the standard Microsoft code and on Advania’s Icelandic localization, which we call IS365. There are ways to make a solution or functionality available in the base application without having to modify any of the standard objects. A discovery pattern is one where the base application raises a call for Assisted Setup, Business Setup or Service Connections. By responding to these calls, these events, the setup for the customized functionality is made accessible to the user.
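As an illustration, a brick can answer the Service Connections discovery event published by table 1400 "Service Connection" in the base application. The event and the InsertServiceConnection helper exist in the standard application; the "ADV Brick Setup" table and page used here are made-up names, so treat this as a sketch rather than our actual code:

    [EventSubscriber(ObjectType::Table, Database::"Service Connection", 'OnRegisterServiceConnection', '', false, false)]
    local procedure RegisterServiceConnection(var ServiceConnection: Record "Service Connection")
    var
        BrickSetup: Record "ADV Brick Setup"; // hypothetical single-record setup table
        RecRef: RecordRef;
    begin
        if not BrickSetup.Get() then begin
            BrickSetup.Init();
            BrickSetup.Insert();
        end;
        RecRef.GetTable(BrickSetup);
        if BrickSetup.Enabled then
            ServiceConnection.Status := ServiceConnection.Status::Enabled
        else
            ServiceConnection.Status := ServiceConnection.Status::Disabled;
        // Makes the brick's setup page show up under Service Connections in the base application
        ServiceConnection.InsertServiceConnection(
            ServiceConnection, RecRef.RecordId, BrickSetup.TableCaption(), '', Page::"ADV Brick Setup Card");
    end;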
The interface pattern that we created is the key here. Using this interface pattern we are able to break compile dependencies without breaking the connection between individual solutions and functionalities. I will spend some time on detailing how we have implemented this interface pattern in our solutions.
When Microsoft talks about a slice in their development they are talking about something similar to what we call a brick: a piece of code that they check in and that needs to work independently. The main difference is in dependencies and connectivity. A slice can be dependent on another slice, while a brick as we define it can’t have compile dependencies on any other brick.
Object-wise, a brick can vary greatly in size and complexity. Some bricks only have a handful of objects, others have hundreds. The largest bricks are our ISV solutions, and still we have broken most of them down into smaller bricks. Even if Iceland is small in population, we face most of the same complexity as other countries when it comes to delivering quality software. We need to supply our customers with solutions that handle all the common business processes. Icelandic partners have been very independent, and our extensive experience with Business Central and previous versions means that we already have solutions for all these business processes. These solutions we need to handle as a product, and we keep all coding in line with our new design principles.
We also have a number of bricks that have only one functionality. As an example, one brick can download the Post Code list for Iceland to update it in the application. Another brick can interface with the online banking web services. A third one can use the Hardware Hub to scan documents or read a serial-connected bar code reader.
These small bricks also have a way to register themselves, most of them in Service Connections. Some of them register themselves as added functionality in other bricks.
One example is our Notification app, which is used to send notifications to users. These notifications can be sent both via email and to the built-in user notifications on the role center. We have another app with only one functionality: it sends an SMS via one of the telecommunication service providers in Iceland. By installing and configuring the SMS app, the Notification app extends its functionality and enables notifications via SMS.
The SMS app has an interface that enables all other apps to verify its availability and use it to send an SMS.
About three years ago, when we came home from NAV TechDays, we brought Kamil Sazek, an MVP from the Czech Republic, with us. Kamil had already done some source control management solutions for his company using TFS. At that time everyone wanted to move to GIT and Microsoft was moving to GIT as well. We spent a week up in Iceland applying Kamil’s knowledge and his scripts to an in-house GIT source control management setup. We had our preferences and our design principles, we boiled everything together and started our road to source control management. That is when AdvaniaGIT was born. AdvaniaGIT is the toolbox that we use in our day-to-day work.
GIT has repositories and branches within repositories. GIT has a lot more, but that is where we started. We started our NAV 2016 work on GIT and decided not to apply this way of working to older releases. Most of our NAV 2016 work was done this way. Some of course was not, but that was not a technical problem; it takes time to update employee mentality and knowledge.
The first branch in a repository is the master branch. In there we store Microsoft’s base app, the W1 code base. Looking at the GIT history for the master branch, we can see every CU of every release since NAV 2016.
Since we are one of the favorite countries for NAV, and now Business Central, we also have a Microsoft Icelandic release. Every CU of that release since NAV 2016 is also available in our IS branch.
The branching structure for C/AL is very different from the branching structure for AL. For C/AL we have one repository for each NAV and Business Central version. For AL we have one repository for each app and one for each customer.
As most of you probably realize, there is a gap between the localized version delivered by Microsoft and the one a partner likes to offer to the local market. The same thing applies in Iceland.
This gap we are closing with multiple bricks. There is, however, a set of functionality and data that we must add to the base application to be used in multiple bricks. Every Icelander has a social security number, and every company has a registration number following the same structure. This field we add to customers, vendors, contacts and all types of bank accounts.
We call this a base branch. A branch that we base every brick on and compare every brick to.
Every month we receive a new CU. We always develop on the latest CU. When a new update is available we update our NAV and Business Central installations and update the object export, both with the W1 release in the master branch and with the Icelandic release in the IS branch.
We then use the standard GIT merge functionality to merge the updates from the IS branch to the IS365 branch, merging all the updates from Microsoft into our own base localization branch. By updating the IS365 base branch with a new update, every build that is based on that branch will use the updated code automatically.
Every time a developer opens a branch to work on, the first thing that must be done is a GIT merge from the IS365 branch. By doing this we make sure that we are developing on the latest update and comparing our brick to the current update contained in the IS365 branch. When development is done, all objects are exported back to GIT, the brick is compared to the base IS365 branch, and the deltas are extracted and stored.
This was us in releases prior to NAV 2016. Building a solution for a customer was a lot of work: manually bringing all the code together into a solution, and then, if we wanted to update with a new CU, that was also a lot of work.
Every solution and every functionality installed was very closely linked, and we could easily call a function in one solution from another solution. Life was good and the developer was happy, having everything available and everything under control.
The problem is that this was not good business. Applying an update to customers was a lot of work. It was very hard to use the same code and the same solution for multiple customers. Our shelf products needed to be manually added to the solution. And we had no automated testing to ensure the quality of the code and solution.
Not to mention upgrades. No one really knew what was being used and what was not. No one was able to keep track of all the changes done. We did have some comments in the code, some helpful, some not. Big upgrades are very expensive, both for the customer and for the partner.
Microsoft also saw this as a problem and started to work on Events and Extensions. After three years of that work, Extensions are brilliant. It is very important to see where Microsoft is going, not where they are. When we see where Microsoft is going we need to align ourselves with that future. We could not wait three years for Extensions to be ready and then start working. The endpoint was clear: Microsoft’s base application with Advania extensions was the way to go. The road to that solution was not as clear. That is when our LEGO® method was born.
It was clear that in order to move to Extensions we first needed to break everything apart. Take all the bricks apart and put them in a pile on the desk. We started this in NAV 2016. But in NAV 2016 we played with LEGO® Duplo, the big bricks. We put a lot of the functionality in our core app. Still, we were able to use the method, and we were able to build customer solutions and apply new CUs without too much work. We already got a lot of repeatability there. This took a few months, but about six months after the NAV 2016 release we were able to start updating our customers to NAV 2016 solutions built with the brick methods. These customers we were able to update every month with new CUs. Still, we did not update all of them. Remember that I said that was not a technical problem. Our staff has the same problem as our customers: we tend to fall back to the way we are used to doing things.
We were still working on NAV 2016 when NAV 2017 was released, and we did not move our bricks to NAV 2017. We just spent a little more time educating our staff and preparing for the small bricks.
When NAV 2018 was released we started full force. Each brick was a potential extension, and what used to be in one brick is now in multiple bricks. We spent some time on the code we had for NAV 2016 to figure out where to break things apart and what to keep together. Everything we coded in C/AL in NAV 2018 had to follow the same rules that are set for extension development. We removed DotNet variables one by one and made sure not to change anything in the standard application.
As mentioned earlier, we have some data requirements that are commonly used in our bricks. We also have some verification methods, both for registration numbers and bank account numbers. There are some shared events and some shared functionality.
We did need to use some DotNet variables. We also needed to add some events to the base application. When we needed to do this, we did it in the base IS365 brick. And everything we did that broke the Extension model we sent to Microsoft. Through Microsoft’s AL Issues on GitHub we were able to request new events. Through Microsoft’s cal-open-library on GitHub we were able to submit requests for DotNet wrappers. So we broke the Extension rules in NAV 2018 only in the base app, knowing that in Business Central Microsoft would deliver everything we needed to get our changes out of the base application.
More complex DotNet usage we moved to Azure Functions. A simple REST web request from our brick can trigger complex tasks using an Azure Function. This has worked beautifully for us.
We created a few Codeunits with helper functionality, Codeunits to handle some common tasks. As an example, we have one Codeunit that every brick can use both to call interfaces and to create the response from an interface. We also like to use the built-in Error Messages but needed some added functionality. It made sense to have that functionality in the base app and make it available to all the bricks.
We call Azure Functions via the base app, and we have ways to handle files from Azure Blob, Azure File System and Dropbox via the base app as well.
We use interfaces to communicate between different bricks. We can think about an interface like we think about a REST web service, or an Azure Function. We supply a request JSON string to the interface and the interface response is also a JSON string.
For this we use the JSON Handler Codeunit in the base app, the IS365.
A REST web service has an http endpoint. Our endpoints are identified by the Codeunit name. As required by AppSource, we are using the prefix ADV for all objects, fields and functions in our bricks. Therefore we can be sure that our Codeunit name is unique and we know who we are talking to.
We can think about this name in a similar way to a domain name. We use a domain name lookup to find the correct IP address for the domain, and then the request is sent to that IP address. We follow the same rule with our Codeunit interfaces. We look up the Codeunit by name, and if it exists and we have the required permissions to use it, we get the object Id. That object Id we can then use in a simple CODEUNIT.RUN call.
Our JSON Handler Codeunit has methods to execute a Codeunit with or without the JSON string parameter. The On/Off methods are called without any parameters, but all the others use the TempBlob record, where both the JSON request string and the response string are stored in the Blob field.
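A rough sketch of what the execute-by-name part of such a helper can look like, written here in AL syntax and assuming the pre-BC14 TempBlob record and the "CodeUnit Metadata" virtual table (the procedure name is made up for illustration):

    procedure RunInterfaceCodeunit(InterfaceCodeunitName: Text; var TempBlob: Record TempBlob temporary): Boolean
    var
        CodeunitMetadata: Record "CodeUnit Metadata";
    begin
        // Look up the interface Codeunit by name, like a domain name lookup
        CodeunitMetadata.SetRange(Name, InterfaceCodeunitName);
        if not CodeunitMetadata.FindFirst() then
            exit(false);
        // The request JSON is already in TempBlob.Blob; the interface writes its response back there
        if not Codeunit.Run(CodeunitMetadata.ID, TempBlob) then
            exit(false);
        exit(true);
    end;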
One of our rules is that every brick must have a setup table, a setup page and an interface that can respond to an enabled query.
If we look at the first code line, where we ask if the brick is installed, we can see that this is similar to doing a lookup for a domain name. The second line, where we ask if the brick is enabled, is simply an IP ping.
No matter how complex or simple the brick’s solution or functionality is, we always require a setup table with an Enabled flag. We can make sure that all logic and all required setup data is available when the user tries to enable the solution. We can even ping another brick if that one is required before allowing a user to enable the solution.
The C/AL code here is pretty simple. The Codeunit will throw an error if the brick is not configured and enabled.
The JSON interface is the most used scenario. Using the JSON Helper Codeunit we can add variables and records, both temporary and non-temporary, to the JSON string. For temporary records, every record in the table is stored in the JSON string, and the table filters and table key are included as well. For database tables we can send both a record Id and field values.
We prepare our request by adding to the JSON string. The same thing is done inside the interface Codeunit when the response JSON string is created. It starts with the Initialize line.
We read from the request JSON string with the InitializeFromTempBlob procedure. After that we have easy access to all the information we store in the JSON string.
We applied the same JSON structure to our Azure Functions. The functions read the request from the JSON string created by this Helper Codeunit, and the response from the Azure Function is read by this same JSON Helper Codeunit.
A JSON interface Codeunit will do some work and respond with a result.
There are some cases where we need to trigger an Event in another brick. We can also enable this with a JSON event interface. See the code. In this case the Integration Event is a local procedure inside the same Codeunit. This Integration Event could just as well be a public procedure in another C/AL or AL object.
We need to supply the required parameters for the Integration Event to execute properly, using the same JSON Helper Codeunit as before. In this case we are also passing the TempBlob to the Event, but that is not a requirement in our pattern. An Event Interface is not required to have a response JSON.
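A minimal sketch of the idea, in AL syntax with made-up names and the pre-BC14 TempBlob record; the event arguments travel in the JSON just like in the other interfaces:

    codeunit 50101 "ADV Something Event Interface"
    {
        TableNo = TempBlob;

        trigger OnRun()
        begin
            // The JSON request in Rec carries the event arguments; subscribers in other bricks react to it
            OnSomethingHappened(Rec);
        end;

        [IntegrationEvent(false, false)]
        local procedure OnSomethingHappened(var TempBlob: Record TempBlob)
        begin
        end;
    }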
We also use JSON registration interfaces to register functionality in another brick. When we have a configuration where the user can select a functionality from a number of lines, we should be able to add to that selection from another brick with a registration interface. In this example we have a new type of bank connection that becomes available when a brick is installed. It needs to be available in another brick that has the bank connection framework. We add this new method for bank connection using the registration interface and include information about the setup page and the assisted setup page. This information is also in text format, the page names. We use the same name lookup instead of focusing on the object Ids.
The same thing happens here: we have a list of bank interfaces that can come from multiple bricks, each handling that specific type of bank connection.
Then finally, let’s talk about the Data Exchange Framework. Whenever we need to transfer data in or out of the application we use the Data Exchange Framework. When we receive an Xml response inside the JSON response from an Azure Function, we pass that Xml to the Data Exchange Framework to work with. When we need to create an Xml file to send inside a JSON request to the Azure Function, we also create that Xml file with the Data Exchange Framework.
Another example is in our Payroll solution. The Payroll solution can import data to be used in the salary calculation. That import starts a Data Exchange import process. By doing it that way we can map any type of file to the import structure in our Payroll solution. We had already done this in NAV 2016 for external time registration solutions. On the other hand, we also have a time registration solution in NAV, and we used to have an action inside the Payroll that would start a report in the time registration module. In that time registration report we had both the time registration tables and the payroll tables, and we just moved the required data between these two tables. This we can no longer do, as these two solutions are in two separate bricks.
The solution here was to agree on an Xml format that we can use to pass data from the time registration into the standard payroll import methods. We can easily specify a Codeunit from the time registration brick in the data exchange definition that is used by the payroll import function. That time registration Codeunit creates the Xml data that we can import.
Now to the configuration. In every branch we have one configuration file. This file is named setup.json and it is a simple JSON file that is easy to edit in VS Code. Our AdvaniaGIT tools require this configuration in order to pick the correct versions and build the correct environment, both for the developer and on the build server. Remember that we are still talking about C/AL. We need to have information about the base branch for each brick. Every branch must have a unique id, similar to how every AL app must have a unique Id.
When the developer starts working, the local development environment is built and configured in line with this config. One config line can be added, and AdvaniaGIT will start a Docker Container instead of using the locally installed NAV or Business Central. The environment that is built is linked to the branch with the branch id, and every action in AdvaniaGIT uses that link to work with the correct development environment. We store all objects as text files in our branches. We need this to be able to compare our changed objects to a set of standard objects.
When we start the development work on a new solution, a new functionality or a new customer, we need to create a new branch. After creating the new branch we code in C/AL and export the code changes to GIT. The code changes are then converted to delta files and used in builds.
Every brick branch also has a setup.json. This setup.json points to IS365 as the base branch. The brick branch has a different object id offset and a different branch id.
A brick branch will contain delta files that describe the changes that brick makes to the IS365 branch. These changes must comply with the Extension rules to be able to merge without errors along with all the other bricks.
We do a daily build that we call Test Customer, where we build the IS365 branch with all the bricks merged into a single solution, to make sure that we can always merge all the bricks without errors.
Each customer has its own branch as well. The customer branch is also based on the IS365 branch. In this example we are building a solution that we call PUB2018. This is the solution we use in Advania’s public cloud. Notice that here we specify a delta branch list. This is a list of bricks that are to be merged into this customer solution. The build server will then take care of that. If there are any customizations for this customer, they are done in the same way as in every brick and stored as delta files in the customer branch. The same goes for the customer unit tests; they are also stored on the customer branch.
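To give an idea of the shape of such a file, here is an illustrative sketch. The property names and values are assumptions for illustration, not the exact AdvaniaGIT schema; the Docker option mentioned above would be one more property in the same file:

    {
      "name": "PUB2018",
      "branchId": "00000000-0000-0000-0000-000000000000",
      "baseBranch": "IS365",
      "objectIdOffset": 10000000,
      "deltaBranchList": [ "ADV Payroll", "ADV Notifications", "ADV SMS" ]
    }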
For the customers we are implementing NAV 2018 for, we do the base build in C/AL. However, all the customization done for the customer is done in AL. The AL code is automatically built on the build server and deployed to our cloud services during the update period each night. A change that we commit will be built instantly by the build server and, if successful, the new app file will be copied to the cloud servers and applied during the next upgrade window.
Over to the builds. This is an essential part of the development process. Nothing is finished until it has gone through build and test. The build server is configured to start Docker Containers and uses the same AdvaniaGIT functions that the developer uses to build, merge and test the environments. We store database backups on an ftp server. These backups are brought in when the build process starts and updated after a successful build.
We have configured build processes that result in FOB files, in AL symbol package files, in AL apps, in database backups and in database bacpacs.
Remember the delta branch list that I talked about? That delta branch list is for the build server to select what to merge during the build.
After a successful build we have the option to deploy the result. This we have also automated. The build server can update environments by importing a new fob or by updating the installed AL app. We can deploy the artifacts both internally in Advania’s cloud and also directly to the customer’s on-premise solution.
We also like to automate the data upgrade when we are planning to update a customer or a solution from one version to another. In that case we have a multi-tenant setup on the upgrade machine. We bring in the latest backup of the production database, restore that database and remove the old application from it. Running on that machine is a multi-tenant NAV instance with the required upgrade code and the required configuration to run long SQL transactions. We next mount the data database to this application and execute the sync and upgrade tasks. If successful, we dismount the data and mount it to another multi-tenant NAV instance. This NAV instance has the acceptance code, and we update that one by automatically importing a fob file from the latest customer build.
By automating this we can execute multiple data upgrades for the customer to verify the process before everything is ready to put the new release into production.
Finally we turn to the hot topic. We have not started this task yet, but we will soon after we arrive back home. Business Central is now fully capable of supporting all our needs in AL. We can, if we like, use DotNet variables for local deployments. We can choose to mix C/AL bricks and AL bricks. The beauty is that we can now start converting our IP brick by brick. As we applied the Extension rules in C/AL, both in code changes and in naming, we will be able to convert the C/AL code to AL and, with minimal changes, we should be able to create the AL app. An AL app does not extend the standard tables like C/AL did. It creates companion tables based on the table that is to be extended and the App GUID. We therefore must move the C/AL data out of the original place and into temporary upgrade tables. By using record references in our AL install Codeunit we can bring the data into our installed app if there are upgrade tables from the C/AL brick. Since we are using the same object and field numbers, we cannot use the obsolete field option that keeps the C/AL data in place, inaccessible to the user but still accessible by the AL install Codeunit.
We expect that removing a C/AL brick and replacing it with AL will be a simple process when these guidelines are followed.
Here is a short overview of our tools. We use the Atlassian products: Jira for issues, Bitbucket as the GIT server and Bamboo as the build server. This is not the only way; we could use both GitHub and Microsoft DevOps with the same tools that we have.
We use AdvaniaGIT. AdvaniaGIT is a set of PowerShell modules and a VS Code extension. It is available on GitHub and please, if you can, use it.
Until now I have had my G/L Source Names extension in English only.
Now, for the upcoming release of Microsoft Dynamics 365 Business Central, I need to supply more languages. What does a man do when he does not speak the language?
I gave a shout out yesterday on Twitter asking for help with translation. Tobias Fenster reminded me that we have a service to help us with that. I had already tried to work with this service and now it was time to test the service on my G/L Source Names extension.
In my previous posts I had created the Xliff translation files from my old ML properties. I manually translated them to my native language, is-IS.
I already got a Danish translation file sent from a colleague.
Before we start, I needed to do a minor update to the AdvaniaGIT tools. Make sure you run “Advania: Go!” to update the PowerShell Script Package, then restart Visual Studio Code.
Now, let’s prepare the Xliff files in Visual Studio Code. From the last build I have the default GL Source Names.g.xlf file. I executed the action to create Xliff files.
This action will prompt for a selection of language. The selection is from the languages included in the NAV DVD.
After the selection, the system will prompt for a translation file exported from FinSql. This I already showed in a YouTube video. If you don’t have a file from FinSql you can just cancel this part. If you already have an Xliff file for that language, it will be imported into memory as translation data and then removed.
This method is therefore useful if you want to reuse the Xliff file data after an extension update. All new files will be based on the g.xlf file.
I basically did this action for all 25 languages. I already had the is-IS and da-DK files, so they were updated. Since the source language is en-US, all my en-XX files were automatically translated. All the other languages have the translation state set to “needs-translation”.
All these files I need to upload to the Translation Service. From the Lifecycle Services menu select the Translation Service. This will open the Translation Service Dashboard.
Press + to add a translation request.
I now need to zip and upload the nl-NL file from my Translations folder.
After the upload I submit the translation request.
The request will appear on the dashboard with the status Processing. Now I need to wait for the status to change to Completed, or create requests for all the other languages and upload files to submit.
When the translation has completed I can download the result.
And I have a translation in state “needs-review-translation”.
A Codeunit that you can execute with CODEUNIT.RUN to perform a given task is, from my point of view, an interface Codeunit.
An interface Codeunit has a parameter record that we put in the Codeunit’s TableNo property. This parameter is always a table object.
We have multiple examples of this already in the application; Codeunits 12 and 80 are two of them. There the parameter is a mixed set of data and settings. Some of the table fields are business data being pushed into the business logic. Other fields are settings used to control the business logic.
Table 36, Sales Header, is used as the parameter for Codeunit 80. Fields like No., Bill-to Customer No., Posting Date and so on are business data. Fields like Ship, Invoice, Print Posted Documents are settings used to control the business logic but have no meaning as business data.
Every table is then a potential parameter for an interface Codeunit. Our extension can easily create a table that we use as a parameter table. The record does not need to be inserted into the table to be passed to the Codeunit.
Let’s look at another scenario. We know that there is an interface Codeunit with the name “My Interface Codeunit”, but it belongs to an extension that may or may not be installed in the database.
Here we use the virtual table “CodeUnit Metadata” to look for the Interface Codeunit before execution.
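The original post showed that lookup as a screenshot. The idea, with CodeunitMetadata declared as a Record variable on the "CodeUnit Metadata" virtual table and MyParameter being the parameter record, is roughly:

    CodeunitMetadata.SETRANGE(Name,'My Interface Codeunit');
    IF CodeunitMetadata.FINDFIRST THEN
      CODEUNIT.RUN(CodeunitMetadata.ID,MyParameter);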
This is all simple and straightforward. Things that we have been doing for a number of years.
Using the TempBlob table as a parameter also gives us flexibility to define a more complex interface for the Codeunit. The TempBlob table can store complex data in Json or Xml format and pass that to the Codeunit.
Let’s take an example. We have an extension that extends the discount calculation for Customers and Items. We would like to ask this extension what discount a given customer will get for a given Item. A question like that we can represent in a Json file.
{
"CustomerNo": "C10000",
"ItemNo": "1000"
}
And the question can be coded like this.
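(The original post showed this as a C/AL screenshot; here is a sketch in AL syntax with made-up object names, using the pre-BC14 TempBlob record and AL’s native Json types.)

    local procedure GetCustomerItemDiscount(CustomerNo: Code[20]; ItemNo: Code[20]): Decimal
    var
        TempBlob: Record TempBlob temporary;
        Request: JsonObject;
        Response: JsonObject;
        JToken: JsonToken;
        OutStr: OutStream;
        InStr: InStream;
    begin
        // Build the question as a JSON object and store it in the Blob field
        Request.Add('CustomerNo', CustomerNo);
        Request.Add('ItemNo', ItemNo);
        TempBlob.Blob.CreateOutStream(OutStr);
        Request.WriteTo(OutStr);

        // Run the interface Codeunit; in real code it is looked up by name first,
        // since the discount extension may or may not be installed
        Codeunit.Run(Codeunit::"ADV Discount Interface", TempBlob);

        // Read the answer written back into the same TempBlob record
        TempBlob.Blob.CreateInStream(InStr);
        Response.ReadFrom(InStr);
        if Response.Get('LineDiscountPct', JToken) then
            exit(JToken.AsValue().AsDecimal());
    end;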
The Interface Codeunit could be something like
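Again a sketch with made-up names; the discount lookup is deliberately simplified to a single Sales Line Discount entry:

    codeunit 50100 "ADV Discount Interface"
    {
        TableNo = TempBlob;

        trigger OnRun()
        var
            SalesLineDiscount: Record "Sales Line Discount";
            Request: JsonObject;
            Response: JsonObject;
            JToken: JsonToken;
            InStr: InStream;
            OutStr: OutStream;
            CustomerNo: Code[20];
            ItemNo: Code[20];
            DiscountPct: Decimal;
        begin
            // Read the question from the Blob field
            Rec.Blob.CreateInStream(InStr);
            Request.ReadFrom(InStr);
            if Request.Get('CustomerNo', JToken) then
                CustomerNo := CopyStr(JToken.AsValue().AsText(), 1, MaxStrLen(CustomerNo));
            if Request.Get('ItemNo', JToken) then
                ItemNo := CopyStr(JToken.AsValue().AsText(), 1, MaxStrLen(ItemNo));

            // Simplified discount lookup: first matching customer/item sales line discount
            SalesLineDiscount.SetRange(Type, SalesLineDiscount.Type::Item);
            SalesLineDiscount.SetRange(Code, ItemNo);
            SalesLineDiscount.SetRange("Sales Type", SalesLineDiscount."Sales Type"::Customer);
            SalesLineDiscount.SetRange("Sales Code", CustomerNo);
            if SalesLineDiscount.FindFirst() then
                DiscountPct := SalesLineDiscount."Line Discount %";

            // Write the answer back into the same TempBlob record
            Response.Add('LineDiscountPct', DiscountPct);
            Clear(Rec.Blob);
            Rec.Blob.CreateOutStream(OutStr);
            Response.WriteTo(OutStr);
        end;
    }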
With a Page that contains a single Text variable (Json) we can turn this into a web service.
That we can use from C# with a code like
// Requires: using System.Net; using System.Net.Http; using System.Text; using Newtonsoft.Json.Linq;
// and must run inside an async method.
var navOdataUrl = new System.Uri("https://nav2018dev.westeurope.cloudapp.azure.com:7048/NAV/OData/Company('CRONUS%20International%20Ltd.')/AlexaRequest?$format=json");
var credentials = new NetworkCredential("navUser", "+lppLBb7OQJxlOfZ7CpboRCDcbmAEoCCJpg7cmAEReQ=");
var handler = new HttpClientHandler { Credentials = credentials };
using (var client = new HttpClient(handler))
{
    var Json = new { CustomerNo = "C10000", ItemNo = "1000" };
    JObject JsonRequest = JObject.FromObject(Json);
    JObject requestJson = new JObject();
    JProperty jProperty = new JProperty("Json", JsonRequest.ToString());
    requestJson.Add(jProperty);
    var requestData = new StringContent(requestJson.ToString(), Encoding.UTF8, "application/json");
    var response = await client.PostAsync(navOdataUrl, requestData);
    dynamic result = await response.Content.ReadAsStringAsync();
    JObject responseJson = JObject.Parse(Convert.ToString(result));
    if (responseJson.TryGetValue("Json", out JToken responseJToken))
    {
        jProperty = responseJson.Property("Json");
        JObject JsonResponse = JObject.Parse(Convert.ToString(jProperty.Value));
        Console.WriteLine(JsonResponse.ToString());
    }
}
This is just scratching the surface of what we can do. To copy a record to and from Json is easy to do with these functions.
And even if I am showing all this in C/AL there should be no problem in using the new AL in Visual Studio Code to get the same results.
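For illustration, here is what a generic record-to-JSON copy could look like in AL; this is a sketch using standard RecordRef and the Data Type Management codeunit, not the author’s actual helper, and it ignores special field types such as BLOBs:

    local procedure RecordToJson(RecVariant: Variant) Json: JsonObject
    var
        DataTypeMgt: Codeunit "Data Type Management";
        RecRef: RecordRef;
        FldRef: FieldRef;
        i: Integer;
    begin
        if not DataTypeMgt.GetRecordRef(RecVariant, RecRef) then
            exit;
        // Copy every normal field as a text value, keyed by its field name
        for i := 1 to RecRef.FieldCount() do begin
            FldRef := RecRef.FieldIndex(i);
            if FldRef.Class() = FieldClass::Normal then
                Json.Add(FldRef.Name(), Format(FldRef.Value()));
        end;
    end;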
Now that we have entered the Extension era we must take into account that some extensions may or may not be installed at the time of code execution.
You might even have two extensions that you would like to share data between.
Let’s give an example.
In Iceland we add a new field to the Customer table (18). That field is named “Registration No.” and is used for a 10 digit number that is unique for the individual or the company we add as a customer to the system.
My Example Extension can support Icelandic Registration No. if it exists.
Using Codeunit 701, “Data Type Management”, Record Reference and Field Reference we can form the following code.
LOCAL PROCEDURE GetCustomerRegistrationNo@10(Customer@1000 : Record 18) RegistrationNo : Text;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(Customer,RecRef) THEN EXIT('');
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,'Registration No.') THEN EXIT('');
  RegistrationNo := FldRef.VALUE;
END;
Let’s walk through this code…
GetRecordRef will populate the record reference (RecRef) for the given table and return TRUE if successful.
FindFieldByName will populate the field reference (FldRef) for the given record reference and field name and return TRUE if successful. The function above can be written in a more generic way, so that it works for any record and any field name:
LOCAL PROCEDURE GetFieldValueAsText@103(RecVariant@1000 : Variant;FieldName@1004 : Text) FieldValue : Text;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(RecVariant,RecRef) THEN EXIT('');
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,'Registration No.') THEN EXIT('');
  FieldValue := FldRef.VALUE;
END;
The same approach can be used to set a field value:
LOCAL PROCEDURE SetCustomerRegistrationNo@21(VAR Customer@1000 : Record 18;RegistrationNo@1004 : Text) : Boolean;
VAR
  DataTypeMgt@1001 : Codeunit 701;
  RecRef@1002 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT DataTypeMgt.GetRecordRef(Customer,RecRef) THEN EXIT(FALSE);
  IF NOT DataTypeMgt.FindFieldByName(RecRef,FldRef,'Registration No.') THEN EXIT(FALSE);
  FldRef.VALUE := RegistrationNo;
  RecRef.SETTABLE(Customer);
  EXIT(TRUE);
END;
These can be generalized further to work on any record (as a Variant), any field name and any value:
PROCEDURE PopulateOptionalField@25(VAR RecordVariant@1000 : Variant;FieldName@1001 : Text;FieldValue@1002 : Variant) : Boolean;
VAR
  RecRef@1004 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT GetRecordRef(RecordVariant,RecRef) THEN EXIT;
  IF NOT FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT;
  FldRef.VALUE := FieldValue;
  RecRef.SETTABLE(RecordVariant);
  EXIT(TRUE);
END;

PROCEDURE ValidateOptionalField@26(VAR RecordVariant@1000 : Variant;FieldName@1001 : Text;FieldValue@1002 : Variant) : Boolean;
VAR
  RecRef@1004 : RecordRef;
  FldRef@1003 : FieldRef;
BEGIN
  IF NOT GetRecordRef(RecordVariant,RecRef) THEN EXIT;
  IF NOT FindFieldByName(RecRef,FldRef,FieldName) THEN EXIT;
  FldRef.VALIDATE(FieldValue);
  RecRef.SETTABLE(RecordVariant);
  EXIT(TRUE);
END;
To use these functions we first need to copy our record into a Variant variable, and then copy it back to the record after the function completes.
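A minimal C/AL sketch of that round trip, assuming the Populate/Validate functions live in codeunit 701 as the listings above suggest; the conversion back uses GetRecordRef and RecordRef.SETTABLE:

    LOCAL PROCEDURE TrySetRegistrationNo@30(VAR Customer@1000 : Record 18;RegistrationNo@1001 : Text) : Boolean;
    VAR
      DataTypeMgt@1002 : Codeunit 701;
      RecVariant@1003 : Variant;
      RecRef@1004 : RecordRef;
    BEGIN
      // Copy the record into a Variant so the generic function can work on it
      RecVariant := Customer;
      IF NOT DataTypeMgt.ValidateOptionalField(RecVariant,'Registration No.',RegistrationNo) THEN
        EXIT(FALSE);
      // Copy the modified Variant back into the record
      DataTypeMgt.GetRecordRef(RecVariant,RecRef);
      RecRef.SETTABLE(Customer);
      EXIT(TRUE);
    END;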
Reply all to this email in case you need any help.
Thank you,
Microsoft AppSource team
If you follow these links you should see that the publishing portal for Dynamics 365 AppSource has been changed.
My App was automatically migrated from Azure MarketPlace to Cloud Partner Portal. Got this email from Microsoft:
Greetings,
Azure Marketplace will be migrating your VM offers to the Cloud Partner Portal – the new and improved portal for publishing your single VM offers and getting valuable insights about your Azure Marketplace business, on Monday, 17th July 2017.
What should you expect during migration?
Migration will be done by Azure marketplace team and there is no action needed from you. Post-migration, you will start managing your single VM offers in the new Cloud Partner Portal.
During the migration, you will be unable to make updates to your offers via the publishing portal. Migration will be completed on the same day.
Will my offer be available on Azure Marketplace during migration?
Yes, migration will not affect your Azure Marketplace listing.
You can log in to the Cloud Partner Portal at https://cloudpartner.azure.com. Documentation about the Cloud Partner Portal and publishing single VM offers can be found here.
And that is not all. I also got this email from Microsoft:
Hello,
The Dynamics 365 for Financials Extension Team wants to inform you of some important information. As of August 1st, 2017, we will begin accepting version 2.0 extensions for validation. Version 2.0 extensions is a much improved experience and represent our future direction for extensions. We will continue to accept v1 extensions until October 1. Please plan accordingly.
If you have a v1 app that is currently in process of validation, or if you have one that is currently published in App Source, you can begin (at your convenience) to convert your v1 app to v2. Refer to the site here for information on how to convert your app. We will work with you as well for the conversion so please direct any questions to d365val@microsoft.com.
I just wanted to inform you that your extension passed validation on US, CA, and GB. We will now get your extension checked in and it will go into the June (Update 7) release.
I am excited to see if users will start to install my app. As more markets open for Dynamics 365 for Financials (D365) I will need to add more languages to my App.
If you can help me with your local language please ping me. The App is available on my GitHub site; https://github.com/gunnargestsson/nav2017/tree/GLSourceNames. If you would like to install this extension or merge the deltas into your solution, again just ping me.
If you are in the process of creating an Extension for Dynamics 365 for Financials you can now request a Financials sandbox environment. I installed this on my local virtual machine and this was an essential part of validating the extension. You will need to sign up for the CTP program, which provides you with a prerelease version of Dynamics 365 for Financials. After you have signed the CTP Agreement, you are directed to a page that contains information about how to download the latest builds and configure a local computer or a Microsoft Azure VM for Dynamics 365 for Financials. If you have questions or feedback regarding this, please send an e-mail to: d365-smb@microsoft.com.
Finally, if you take a look at the source for G/L Source Names you will find a setup.json file. This file has all the information needed for my GIT Source Control. As promised at NAV TechDays 2016, I am releasing AdvaniaGIT as a community project. Stay tuned to Dynamics.is as I will be writing blogs about this project in the coming days and weeks.
Read Object Lines – Creates renumbering lines base on the objects in the selected object file.
Suggest IDs – Suggest new object numbers in the range from 50.000 based on the available objects in the current license.
Read from Excel – Reads object renumbering lines from Excel Sheet created with the Write to Excel process.
Write to Excel – Writes current renumbering lines to a new Excel Sheet to be managed within Excel and reread into the renumbering lines.
Renumber Using Lines – Prompts for a file to read and for a new file to save with renumbered objects based on the rules in the renumbering lines.
Renumber Using Controls – Prompts for a file to read and for a new file to save with renumbered objects based on the rules in the control IDs setup.
I have done some fixes to the renumbering function and have added support for the EventSubscriber.
Go to GitHub to download Page and Table 50000, try this out and submit improvements.
When I am processing an object file I have it open in my text editor. When I see something to renumber I update the control ranges and execute the renumbering process, reading and writing to the same object file. My editor will reload the file and I can see the results immediately.
I am now in the validation of my app. Unfortunately I don’t have enough time to make this go any faster so please be patient.
This process will take time. Here is an example of the expected time frame.
As you may have read in my previous post, I created a few videos. This is the first comment from Microsoft about these videos.
The obvious information from here is that I can’t use Dynamics NAV anywhere in my Dynamics 365 for Financials extension.
I asked and this is the solution they gave me.
If you add &aid=fin to the end of your Web Client URL it will add the Dynamics 365 shell.
When I start my web client the normal Url is “http://tfw107131:8080/NAV2017DEV0000077/WebClient/”. By adding the suggested switch to the Url and starting the web client with “http://tfw107131:8080/NAV2017DEV0000077/WebClient/?aid=fin” I will get the Dynamics 365 shell.
The parameter part of the Url starts with a question mark (?), and each additional parameter is separated by an ampersand (&). As an example, I have opened the Customer List and the Url says: “http://tfw107131:8080/NAV2017DEV0000077/WebClient/?aid=fin&bookmark=27%3bEgAAAAJ7CDAAMQAxADIAMQAyADEAMg%3d%3d&page=22&company=CRONUS International Ltd.&dc=0”. The parameter order does not matter; the “aid=fin” can be placed anywhere in the parameter part of the Url.
Now I am back at creating screen shots, videos and documentation. All with the new Dynamics 365 for Financials look and feel.