
There is no question that in our world, data is king. Still, for many of us the problem is that we have so much data, in so many different systems and formats, that we quickly lose track of it and never get the full potential out of it. In this blog series, I will cover some of the tools that Microsoft has made available to us in Azure for handling data. I will also cover some of the basics of how to structure your data so you can use it most effectively.

Sending Data from Business Central

In this post, I will start by going through a dynamic way to send data from Business Central to Azure Blob Storage, which we will work with later in the series. For this setup we are going to need an Azure Function, an Azure storage account, and some AL code. What we are going to create is a setup where, if a configured field on a record is modified, a generic JSON file is created in Azure Blob Storage. The same result could be obtained using webhooks in Business Central, which you can read about here, but that would require us to create an API page for each record type that we wish to export.

We are going to need four tables. The first is a setup table where we can store our Azure Function URL, along with a Boolean to turn the module on. My table and page look as follows:

table 50103 "DEF Data Export Setup"
{
    Caption = 'Data Export Setup';
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Code"; Code[10])
        {
            Caption = 'Code';
        }
        field(2; "Azure Function URL"; Text[250])
        {
            Caption = 'Azure Function URL';
        }
        field(3; "Azure Function Token"; Text[250])
        {
            Caption = 'Azure Function Token';
        }
        field(4; "Enable Send on Triggers"; Boolean)
        {
            Caption = 'Enable Send on Triggers';
        }
    }
    keys
    {
        key(PK; "Code")
        {
            Clustered = true;
        }
    }
}

page 50101 "DEF Data Export Setup"
{
    ApplicationArea = All;
    Caption = 'Data Export Setup';
    PageType = Card;
    SourceTable = "DEF Data Export Setup";
    UsageCategory = Administration;

    layout
    {
        area(content)
        {
            group(General)
            {
                Caption = 'General';

                field("Azure Function URL"; Rec."Azure Function URL")
                {
                    ToolTip = 'Specifies the value of the Azure Function URL field.';
                }
                field("Azure Function Token"; Rec."Azure Function Token")
                {
                    ToolTip = 'Specifies the value of the Azure Function Token field.';
                }
                field("Enable Send on Triggers"; Rec."Enable Send on Triggers")
                {
                    ToolTip = 'Specifies the value of the Enable Send on Triggers field.';
                }
            }
        }
    }

    trigger OnOpenPage()
    begin
        if not Rec.Get() then
            Rec.Insert();
    end;
}

Next, we need a way to set up which records and which fields should trigger a data export. For this purpose, I have created two tables, “Data Export Mapping Header” and “Data Export Mapping Lines”:

table 50101 "DEFData Export Mapping Header"
{
    Caption = 'Data Export Mapping Header';
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Table No."; Integer)
        {
            Caption = 'Table No.';
            DataClassification = CustomerContent;
            TableRelation = AllObj."Object ID" where("Object Type" = const(Table));
        }
        field(2; "Table Name"; Text[250])
        {
            Caption = 'Table Name';
            FieldClass = FlowField;
            CalcFormula = lookup(AllObj."Object Name" where("Object ID" = field("Table No.")));
            Editable = false;
        }
        field(3; "Json Array Name"; Text[250])
        {
            Caption = 'Json Array Name';
            DataClassification = CustomerContent;
        }
        field(4; "Json Object Name"; Text[250])
        {
            Caption = 'Json Object Name';
            DataClassification = CustomerContent;
        }
    }
    keys
    {
        key(PK; "Table No.")
        {
            Clustered = true;
        }
    }
}

table 50102 "DEF Data Export Mapping Lines"
{
    Caption = 'Data Export Mapping Lines';
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Field No."; Integer)
        {
            Caption = 'Field No.';
            DataClassification = CustomerContent;
            TableRelation = Field."No." where(TableNo = field("Table No."));
        }
        field(2; "Field Name"; Text[250])
        {
            Caption = 'Field Name';
            Editable = false;
            FieldClass = FlowField;
            CalcFormula = lookup(Field.FieldName where(TableNo = field("Table No."), "No." = field("Field No.")));

        }
        field(3; "Table No."; Integer)
        {
            Caption = 'Table No.';
            DataClassification = CustomerContent;
            Editable = false;
        }
    }
    keys
    {
        key(PK; "Field No.", "Table No.")
        {
            Clustered = true;
        }
    }
}

I have also created three pages. The first is a sub-page for the lines:

page 50105 "DEF Data Export Mapping Lines"
{
    ApplicationArea = All;
    Caption = 'Data Export Mapping Lines';
    PageType = ListPart;
    SourceTable = "DEF Data Export Mapping Lines";

    layout
    {
        area(content)
        {
            repeater(General)
            {
                field("Field No."; Rec."Field No.")
                {
                    ToolTip = 'Specifies the value of the Field No. field.';
                }
                field("Field Name"; Rec."Field Name")
                {
                    ToolTip = 'Specifies the value of the Field Name field.';
                }
            }
        }
    }
}

A card page:

page 50106 "DEF Data Export Mapping Card"
{
    ApplicationArea = All;
    Caption = 'Data Export Mapping Card';
    PageType = Card;
    SourceTable = "DEFData Export Mapping Header";
    UsageCategory = Administration;

    layout
    {
        area(content)
        {
            group(General)
            {
                Caption = 'General';

                field("Table No."; Rec."Table No.")
                {
                    ToolTip = 'Specifies the value of the Table No. field.';
                }
                field("Table Name"; Rec."Table Name")
                {
                    ToolTip = 'Specifies the value of the Table Name field.';
                }
                field("Json Array Name"; Rec."Json Array Name")
                {
                    ToolTip = 'Specifies the value of the Json Array Name field.';
                }
                field("Json Object Name"; Rec."Json Object Name")
                {
                    ToolTip = 'Specifies the value of the Json Object Name field.';
                }
            }
            part(lines; "DEF Data Export Mapping Lines")
            {
                SubPageLink = "Table No." = field("Table No.");
            }
        }
    }
}

And a list page:

page 50103 "DEFData Export Mapping List"
{
    ApplicationArea = All;
    Caption = 'Data Export Mapping List';
    PageType = List;
    SourceTable = "DEFData Export Mapping Header";
    UsageCategory = Administration;
    CardPageId = "DEF Data Export Mapping Card";
    Editable = false;

    layout
    {
        area(content)
        {
            repeater(General)
            {
                field("Table No."; Rec."Table No.")
                {
                    ToolTip = 'Specifies the value of the Table No. field.';
                }
                field("Table Name"; Rec."Table Name")
                {
                    ToolTip = 'Specifies the value of the Table Name field.';
                }
                field("Json Array Name"; Rec."Json Array Name")
                {
                    ToolTip = 'Specifies the value of the Json Array Name field.';
                }
                field("Json Object Name"; Rec."Json Object Name")
                {
                    ToolTip = 'Specifies the value of the Json Object Name field.';
                }
            }
        }
    }
}

The idea is that you choose which table and fields should trigger an export, which fields you wish to export, and what the names of your JSON objects and arrays should be once the data has been exported.
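
To make this concrete, here is a minimal sketch of seeding such a mapping from code instead of through the pages. It assumes the standard Customer table (table 18), where field 1 is “No.”, field 2 is “Name”, and field 6 is “Address 2”, and it roughly matches the mapping behind the Customer output shown at the end of this post:

procedure SeedCustomerMapping()
var
    MappingHeader: Record "DEFData Export Mapping Header";
    MappingLine: Record "DEF Data Export Mapping Lines";
begin
    // One header per table; the names control the JSON structure of the export.
    MappingHeader.Init();
    MappingHeader."Table No." := Database::Customer;
    MappingHeader."Json Array Name" := 'Customers';
    MappingHeader."Json Object Name" := 'Customer';
    if MappingHeader.Insert() then;

    // One line per field to export; a change in any of these triggers an export.
    MappingLine.Init();
    MappingLine."Table No." := Database::Customer;
    MappingLine."Field No." := 1; // "No."
    if MappingLine.Insert() then;

    MappingLine."Field No." := 2; // "Name"
    if MappingLine.Insert() then;

    MappingLine."Field No." := 6; // "Address 2"
    if MappingLine.Insert() then;
end;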

Next, we will need a log table and a page. The log table is used as a buffer for our job queue:

table 50104 "DEFData Export Log"
{
    Caption = 'Data Export Log';
    DataClassification = CustomerContent;

    fields
    {
        field(1; ID; Guid)
        {
            Caption = 'ID';
        }
        field(2; "Table ID"; Integer)
        {
            Caption = 'Table ID';
        }
        field(3; "RecordID"; RecordId)
        {
            Caption = 'RecordID';
        }
        field(4; SysId; Guid)
        {
            Caption = 'Sys ID';
        }
    }
    keys
    {
        key(PK; ID)
        {
            Clustered = true;
        }
    }
}

page 50102 "DEFData Export Log"
{
    ApplicationArea = All;
    Caption = 'Data Export Log';
    PageType = List;
    SourceTable = "DEFData Export Log";
    UsageCategory = Administration;
    Editable = false;

    layout
    {
        area(content)
        {
            repeater(General)
            {
                field(ID; Rec.ID)
                {
                    ToolTip = 'Specifies the value of the ID field.';
                }
                field("RecordID"; Rec."RecordID")
                {
                    ToolTip = 'Specifies the value of the RecordID field.';
                }
                field(SysId; Rec.SysId)
                {
                    ToolTip = 'Specifies the value of the Sys ID field.';
                }
                field("Table ID"; Rec."Table ID")
                {
                    ToolTip = 'Specifies the value of the Table ID field.';
                }
            }
        }
    }
}

Next, we need our codeunit, which is where all the magic happens:

codeunit 50104 "DEF JsonExport"
{
    trigger OnRun()
    var
        ExportLog: Record "DEFData Export Log";
        RecordRef: RecordRef;
    begin
        if ExportLog.FindSet() then
            repeat
                RecordRef.Open(ExportLog."Table ID");
                RecordRef.GetBySystemId(ExportLog.SysId);
                Rec2Json(RecordRef);
                SendJSonToAzureFunction(GetJsonString(RecordRef, ExportLog.SysId));
                ExportLog.Delete();
                RecordRef.Close();
            until ExportLog.Next() = 0;
    end;

    procedure Rec2Json(Rec: Variant)
    var
        ExportMappingHeaderSetup: Record "DEFData Export Mapping Header";
        ExportMappingSetupLines: Record "DEF Data Export Mapping Lines";
        FieldRef: FieldRef;
        JObject: JsonObject;
        RecJObject: JsonObject;
        RecordRef: RecordRef;
        i: Integer;
        OutString: Text;
    begin
        RecordRef.GetTable(Rec);
        if ExportMappingHeaderSetup.Get(RecordRef.Number) then begin
            ExportMappingSetupLines.SetRange("Table No.", ExportMappingHeaderSetup."Table No.");
            if ExportMappingSetupLines.FindSet() then
                repeat
                    FieldRef := RecordRef.FieldIndex(ExportMappingSetupLines."Field No.");
                    JObject.Add(DelChr(FieldRef.Name(), '=', ' /.-*+'), FieldToJsonValue(FieldRef));
                until ExportMappingSetupLines.Next() = 0;
            RecJObject.Add(ExportMappingHeaderSetup."Json Object Name", JObject);
        end else begin
            for i := 1 to RecordRef.FieldCount do begin
                FieldRef := RecordRef.FieldIndex(i);
                JObject.Add(DelChr(FieldRef.Name(), '=', ' /.-*+'), FieldToJsonValue(FieldRef));
            end;
            RecJObject.Add(DelChr(RecordRef.Name(), '=', ' /.-*+'), JObject);
        end;

        JArray.Add(RecJObject);
    end;

    procedure GetJsonString(Rec: Variant; SysId: Guid): Text;
    var
        ExportMappingHeaderSetup: Record "DEFData Export Mapping Header";
        Result: Text;
        RecordRef: RecordRef;
    begin
        RecordRef.GetTable(Rec);
        if ExportMappingHeaderSetup.Get(RecordRef.Number) then
            ResultJObject.Add(ExportMappingHeaderSetup."Json Array Name", JArray)
        else
            ResultJObject.Add(DelChr(RecordRef.Name(), '=', ' /.-*+') + 's', JArray);
        ResultJObject.Add('Type', RecordRef.Name);
        ResultJObject.Add('Id', format(SysId));
        ResultJObject.WriteTo(Result);
        Clear(ResultJObject);
        Clear(JArray);
        exit(Result);
    end;

    procedure SendJSonToAzureFunction(payload: Text)
    var
        DataExportSetup: Record "DEF Data Export Setup";
        AzureFunctions: Codeunit "Azure Functions";
        AzureFunctionsAuthentication: Codeunit "Azure Functions Authentication";
        AzureFunctionsResponse: Codeunit "Azure Functions Response";
        IAzureFunctionsAuthentication: Interface "Azure Functions Authentication";
        Response: HttpResponseMessage;
    begin
        DataExportSetup.Get();
        IAzureFunctionsAuthentication := AzureFunctionsAuthentication.CreateCodeAuth(DataExportSetup."Azure Function URL", DataExportSetup."Azure Function Token");
        AzureFunctionsResponse := AzureFunctions.SendPostRequest(IAzureFunctionsAuthentication, payload, 'application/json');
        AzureFunctionsResponse.GetHttpResponse(Response);
        if not Response.IsSuccessStatusCode() then
            Error(Response.ReasonPhrase);
    end;

    local procedure FieldToJsonValue(FieldRef: FieldRef): JsonValue
    var
        FieldValue: JsonValue;
        BoolValue: Boolean;
        IntValue: Integer;
        DecimalValue: Decimal;
        DateValue: Date;
        TimeValue: Time;
        DateTimeValue: DateTime;
        DurationValue: Duration;
        BigIntegerValue: BigInteger;
        GuidValue: Guid;
        RecordRefField: RecordRef;
    begin
        if (FieldRef.Class() = FieldClass::FlowField) then
            FieldRef.CalcField();


        case FieldRef.Type() of
            FieldType::Boolean:
                begin
                    BoolValue := FieldRef.Value();
                    FieldValue.SetValue(BoolValue);
                end;
            FieldType::Integer:
                begin
                    IntValue := FieldRef.Value();
                    FieldValue.SetValue(IntValue);
                end;
            FieldType::Decimal:
                begin
                    DecimalValue := FieldRef.Value();
                    FieldValue.SetValue(DecimalValue);
                end;
            FieldType::Date:
                begin
                    DateValue := FieldRef.Value();
                    FieldValue.SetValue(DateValue);
                end;
            FieldType::Time:
                begin
                    TimeValue := FieldRef.Value();
                    FieldValue.SetValue(TimeValue);
                end;
            FieldType::DateTime:
                begin
                    DateTimeValue := FieldRef.Value();
                    FieldValue.SetValue(DateTimeValue);
                end;
            FieldType::Duration:
                begin
                    DurationValue := FieldRef.Value();
                    FieldValue.SetValue(DurationValue);
                end;
            FieldType::BigInteger:
                begin
                    BigIntegerValue := FieldRef.Value();
                    FieldValue.SetValue(BigIntegerValue);
                end;
            FieldType::Guid:
                begin
                    GuidValue := FieldRef.Value();
                    FieldValue.SetValue(GuidValue);
                end;
            else
                FieldValue.SetValue(Format(FieldRef.Value()));
        end;

        exit(FieldValue);
    end;

    var
        ResultJObject: JsonObject;
        JArray: JsonArray;
}

First off, I would like to give a shout-out to Ivan Singleton for his blog post about turning any record in Business Central into a JSON object; I have been heavily inspired by it. The last thing we need is a codeunit that subscribes to the global modify trigger.

codeunit 50101 "DEF GlobalSubscriber"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Global Triggers", 'OnDatabaseModify', '', false, false)]
    local procedure OnGlobalModify(RecRef: RecordRef)
    var
        DataExportSetup: Record "DEF Data Export Setup";
        ExportLog: Record "DEFData Export Log";
        ExportMappingHeaderSetup: Record "DEFData Export Mapping Header";
        ExportMappingSetupLines: Record "DEF Data Export Mapping Lines";
        Fieldef: FieldRef;
        xFieldef: FieldRef;
        xRecRef: RecordRef;
    begin
        if not DataExportSetup.Get() then
            exit;
        if not DataExportSetup."Enable Send on Triggers" then
            exit;
        if not ExportMappingHeaderSetup.Get(RecRef.Number) then
            exit;

        ExportMappingSetupLines.SetRange("Table No.", RecRef.Number);
        xRecRef.Open(RecRef.Number, false, RecRef.CurrentCompany);
        xRecRef.Get(RecRef.RecordId);
        if ExportMappingSetupLines.FindSet() then
            repeat
                Fieldef := RecRef.Field(ExportMappingSetupLines."Field No.");
                xFieldef := xRecRef.Field(ExportMappingSetupLines."Field No.");
                if xFieldef.Value <> Fieldef.Value then begin
                    Fieldef := RecRef.Field(RecRef.SystemIdNo);
                    ExportLog.SetRange(SysId, Fieldef.Value);
                    if ExportLog.FindFirst() then
                        exit;
                    ExportLog.ID := CreateGuid();
                    ExportLog.RecordID := RecRef.RecordId;
                    Fieldef := RecRef.Field(RecRef.SystemIdNo);
                    ExportLog.SysId := Fieldef.Value;
                    ExportLog."Table ID" := RecRef.Number;
                    ExportLog.Insert();
                end;
            until ExportMappingSetupLines.Next() = 0;
        xRecRef.Close();

    end;


    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Global Triggers", 'GetDatabaseTableTriggerSetup', '', false, false)]
    local procedure GetDatabaseTableTriggerSetup(TableId: Integer; var OnDatabaseInsert: Boolean; var OnDatabaseModify: Boolean; var OnDatabaseDelete: Boolean)
    var
        ExportMappingHeaderSetup: Record "DEFData Export Mapping Header";
    begin
        if ExportMappingHeaderSetup.Get(TableId) then begin
            OnDatabaseInsert := true;
            OnDatabaseModify := true;
            OnDatabaseDelete := true;
        end;

    end;
}


And that is it on the Business Central side. You will now need to set up codeunit 50104 to run in a job queue as often as you wish to send data.
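
If you prefer to create the job queue entry from code rather than through the Job Queue Entries page, a minimal sketch, assuming the standard job queue objects, could look like this:

procedure ScheduleJsonExport()
var
    JobQueueEntry: Record "Job Queue Entry";
begin
    JobQueueEntry.Init();
    JobQueueEntry.Validate("Object Type to Run", JobQueueEntry."Object Type to Run"::Codeunit);
    JobQueueEntry.Validate("Object ID to Run", Codeunit::"DEF JsonExport");
    JobQueueEntry."Recurring Job" := true;
    JobQueueEntry."No. of Minutes between Runs" := 5; // send queued records every 5 minutes
    // The standard "Job Queue - Enqueue" codeunit inserts the entry and sets it to Ready.
    Codeunit.Run(Codeunit::"Job Queue - Enqueue", JobQueueEntry);
end;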

The next thing we need to do is create our Azure Blob Storage. To do so, head over to portal.azure.com and search for Storage accounts.

Create a new storage account. Once it has been created, go to Containers and create a new container; mine is called frombc.

Now start Visual Studio and create an HTTP-triggered Azure Function.

Once the Azure Function has been created, go to Tools -> NuGet Package Manager -> Manage NuGet Packages for Solution, and make sure the following NuGet packages are installed: Azure.Storage.Blobs, Microsoft.Azure.WebJobs.Extensions.Storage, and Microsoft.NET.Sdk.Functions.

Next, replace the code in your Azure Function with the following:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Azure.Storage.Blobs;
using Newtonsoft.Json;
using System.Collections.Generic;

namespace UploadToBlob
{
    public static class BlobManagement
    {
        [FunctionName(nameof(UploadToBlob))]
        public static async Task<IActionResult> UploadToBlob(
            [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            string requestBody = String.Empty;
            using (StreamReader streamReader = new StreamReader(req.Body))
            {
                requestBody = await streamReader.ReadToEndAsync();
            }

            dynamic data = JsonConvert.DeserializeObject(requestBody);
            string folder = data.Type;
            string id = data.Id;

            string Connection = Environment.GetEnvironmentVariable("AzureWebJobsStorage");
            string containerName = Environment.GetEnvironmentVariable("ContainerName");
            var blobClient = new BlobContainerClient(Connection, containerName);
            var blob = blobClient.GetBlobClient(folder + "/"+id+".json");
            var stream = new MemoryStream();
            var writer = new StreamWriter(stream);
            writer.Write(requestBody);
            writer.Flush();
            stream.Position = 0;
            await blob.UploadAsync(stream,true);
            return new OkObjectResult("file uploaded successfully");
        }
    }
}

What this code does is take a POST request, read the body, create a folder in your blob storage named after whatever is in the Type element, and then create a JSON file, named with the GUID from the Id element, containing the payload of the POST request. If you wish to try this out locally, you will have to open the file local.settings.json and change it to the following:

{
    "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "{{Connection string to blob}}",
    "ContainerName": "frombc",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}

To get the connection string for your blob storage, go to your storage account in the Azure portal, open Access keys, and copy the connection string.

Once you publish your Azure Function to Azure, you must change some configuration settings: set AzureWebJobsStorage to your blob storage connection string, and add your ContainerName.

And that is it: your Business Central will now start sending data to your Azure Blob Storage, and every time a new record type is sent, a new folder will be created.

Inside these folders, you will find the files with the data that you have set up in Business Central.

My Customer file contains:

{
    "Customers": [
        {
            "Customer": {
                "No": "10000",
                "Name": "Kontorcentralen A/S d",
                "Address2": ""
            }
        }
    ],
    "Type": "Customer"
}

Because that is what I set up in my mapping in Business Central.
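
If you ever need to push a record on demand instead of waiting for the modify trigger and the job queue, you can call the export codeunit directly. A minimal sketch using the objects above:

procedure ExportCustomerNow(CustomerNo: Code[20])
var
    Customer: Record Customer;
    JsonExport: Codeunit "DEF JsonExport";
begin
    Customer.Get(CustomerNo);
    // Build the JSON for this single record and send it to the Azure Function.
    JsonExport.Rec2Json(Customer);
    JsonExport.SendJSonToAzureFunction(JsonExport.GetJsonString(Customer, Customer.SystemId));
end;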

You should, of course, modify the code so that it fits your needs. Maybe you do not want to send the record every time a field is changed, but only when you perform a certain action, or only once a day; that is entirely up to you. Also, if you look at the codeunit JsonExport, you will see that in the procedure Rec2Json, running it on a record that has no mapping set up exports all the fields in the table, which could be used if you wish to keep a complete backup of your tables in another system. That was it for this post. In the next post, we are going to take a look at what we can do with our data once we have it in Azure. Until next time, stay safe.
