
Webhooks in Microsoft Dynamics 365 Business Central are a powerful tool for real-time integration, but they come with a significant challenge that can overwhelm your systems with unnecessary notifications. In this post, we’ll explore the core problem with Business Central’s webhook implementation and provide practical solutions to make them more efficient and targeted.

The Problem: Over-Notification in Business Central Webhooks

Business Central’s webhook system has a fundamental limitation that affects most integration scenarios: webhooks are triggered on any modification made to a record, regardless of which specific fields were changed. This creates a cascade of unnecessary notifications that can quickly become problematic.

Here’s the scenario: You’ve created an API page that exposes fields X and Y from a table because these are the only fields your subscribing system cares about. However, when field Z (which isn’t even included in your API page) gets updated, Business Central still triggers a webhook notification to all subscribers. Your integration system receives the notification, processes it, calls the API to get the “updated” data, only to find that nothing relevant has actually changed.

This behavior leads to several issues:

  • Excessive API calls: Your systems make unnecessary requests to Business Central
  • Increased processing overhead: Resources are wasted processing irrelevant changes
  • Higher costs: More API calls mean higher usage costs and potential throttling
  • Reduced system performance: Unnecessary processing can slow down your integration workflows
  • Scaling challenges: As your data volume grows, the noise-to-signal ratio becomes increasingly problematic

The root cause is that Business Central’s webhook system operates at the table level rather than the field level, making it impossible to subscribe only to changes in specific fields that matter to your integration.

The Solution: Implementing a Smart Log Table Approach

The most effective way to solve this over-notification problem is to implement a custom logging mechanism that gives you fine-grained control over when notifications are sent. This approach involves creating a dedicated log table and using Business Central’s event system to capture only the changes you care about.

Setting Up the Log Table

First, create a log table with the following essential fields:

table 50100 "Integration Change Log"
{
    DataClassification = SystemMetadata;

    fields
    {
        field(1; "Entry No."; Integer)
        {
            AutoIncrement = true;
        }
        field(2; "Change Type"; Enum "Integration Change Type")
        {
            // Requires a companion enum with values: Insert, Modify, Delete
        }
        field(3; "Table ID"; Integer)
        {
        }
        field(4; "Record System ID"; Guid)
        {
        }
        field(5; "Changed DateTime"; DateTime)
        {
        }
        field(6; "Field Caption"; Text[100])
        {
            // Optional: to track which specific field changed
        }
    }

    keys
    {
        key(PK; "Entry No.")
        {
            Clustered = true;
        }
    }
}
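
A note on the design: storing the record's SystemId rather than its primary key keeps the log generic. Any table can be referenced with the same fields, and the SystemId can be used directly in an OData call to fetch the changed record, as the integration flow later in this post shows.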

Implementing Field-Level Change Tracking

The key to this approach is writing targeted code that only logs changes to the fields you care about. You have several options for implementing this:

Option 1: Global Table Triggers

Use the OnAfterModifyEvent and similar global table events to check whether the specific fields have changed:

[EventSubscriber(ObjectType::Table, Database::Customer, 'OnAfterModifyEvent', '', false, false)]
local procedure OnAfterCustomerModify(var Rec: Record Customer; var xRec: Record Customer; RunTrigger: Boolean)
begin
    if Rec.IsTemporary() then
        exit;
    // Only log when a field exposed through the API actually changed.
    // Caveat: for changes made purely from code, xRec may hold the same
    // values as Rec, so verify this works for your modification paths.
    if (Rec.Name <> xRec.Name) or (Rec."E-Mail" <> xRec."E-Mail") then
        LogIntegrationChange(Database::Customer, Rec.SystemId, "Integration Change Type"::Modify);
end;

Option 2: Field-Specific Events

Subscribe to field validation triggers for more granular control:

[EventSubscriber(ObjectType::Table, Database::Customer, 'OnAfterValidateEvent', 'Name', false, false)]
local procedure OnAfterValidateCustomerName(var Rec: Record Customer; var xRec: Record Customer; CurrFieldNo: Integer)
begin
    // Fires only when the field is validated (UI entry or Validate() calls),
    // not when the field is assigned directly in code
    LogIntegrationChange(Database::Customer, Rec.SystemId, "Integration Change Type"::Modify);
end;

Creating the API and Webhook Subscription

Once your log table is populated with relevant changes, create an API page for it:

page 50101 "Integration Change Log API"
{
    PageType = API;
    APIPublisher = 'yourcompany';
    APIGroup = 'integration';
    APIVersion = 'v1.0';
    SourceTable = "Integration Change Log";
    DelayedInsert = true;
    EntityName = 'integrationChangeLog';
    EntitySetName = 'integrationChangeLogs';
    ODataKeyFields = SystemId;

    layout
    {
        area(Content)
        {
            repeater(Records)
            {
                field(systemId; Rec.SystemId) { }
                field(entryNo; Rec."Entry No.") { }
                field(changeType; Rec."Change Type") { }
                field(tableId; Rec."Table ID") { }
                field(recordSystemId; Rec."Record System ID") { }
                field(changedDateTime; Rec."Changed DateTime") { }
            }
        }
    }
}

Now subscribers can set up webhooks on this log table instead of the original data tables. When field X or Y changes in your customer table, the log table gets updated, triggering a focused notification.
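
For reference, registering such a subscription is a plain API call against the subscriptions endpoint. Below is a minimal sketch, assuming you already have an OAuth bearer token; the base URL placeholders, company ID, and handler URL are illustrative assumptions:

using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

// Minimal sketch: register a webhook subscription against the log API.
// The base URL placeholders, company ID, and notification URL are assumptions.
var baseUrl = "https://api.businesscentral.dynamics.com/v2.0/{tenant}/{environment}";
var accessToken = "<OAuth bearer token>";

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

var body = JsonSerializer.Serialize(new
{
    // Subscribe to the log API, not the underlying data tables
    resource = "api/yourcompany/integration/v1.0/companies({companyId})/integrationChangeLogs",
    notificationUrl = "https://yourhandler.example.com/api/webhook"
});

var response = await http.PostAsync(
    $"{baseUrl}/api/v1.0/subscriptions",
    new StringContent(body, Encoding.UTF8, "application/json"));
response.EnsureSuccessStatusCode();

When the subscription is created (and on each renewal), Business Central first calls the notification URL with a validationToken query parameter and expects that token echoed back in the response body, so the receiving endpoint must implement this handshake; the sketch in the next section shows it.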

The Integration Flow

The improved flow works like this:

  1. A relevant field (X or Y) is modified in Business Central
  2. Your field-specific event handler writes a record to the log table
  3. The webhook subscription on the log table triggers a notification
  4. The subscribing system receives the notification and calls the log API
  5. Based on the log record, the system knows exactly which table and record to query
  6. The system makes a targeted API call using the SystemID from the log record

This approach dramatically reduces unnecessary notifications while providing clear context about what actually changed.
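
To make the flow concrete, here is a minimal subscriber-side sketch (ASP.NET Core minimal API; the route and the two helpers are hypothetical and shown as stubs):

// Minimal subscriber-side sketch of the flow above (ASP.NET Core minimal API).
var app = WebApplication.CreateBuilder(args).Build();

app.MapPost("/api/webhook", async (HttpRequest request) =>
{
    // Handshake: when a subscription is created or renewed, Business Central
    // sends a validationToken that must be echoed back as plain text
    if (request.Query.TryGetValue("validationToken", out var token))
        return Results.Content(token.ToString(), "text/plain");

    // Steps 4-6: read the unprocessed log entries, then fetch exactly the
    // records they reference
    foreach (var entry in await FetchNewLogEntriesAsync())
        await FetchChangedRecordAsync(entry.TableId, entry.RecordSystemId);

    return Results.Ok();
});

app.Run();

// Hypothetical helper stubs: the first would query the integrationChangeLogs API
// (e.g. filtered on entryNo greater than the last processed entry), the second
// would fetch a single record by its SystemId from the relevant data API.
static Task<List<LogEntry>> FetchNewLogEntriesAsync() => Task.FromResult(new List<LogEntry>());
static Task FetchChangedRecordAsync(int tableId, Guid recordSystemId) => Task.CompletedTask;

record LogEntry(int TableId, Guid RecordSystemId);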

Handling Webhooks Efficiently on the Subscriber Side

Even with improved webhook targeting, you may still face challenges on the subscriber side, especially during periods of high activity. Here are strategies to handle webhook notifications more efficiently:

Deduplication Strategy

When multiple notifications arrive in quick succession, many might reference the same record. Implement deduplication logic in your webhook handler:

public async Task ProcessWebhookNotifications(List<WebhookNotification> notifications)
{
    // Extract unique SystemIDs from all notifications
    var uniqueSystemIds = notifications
        .Select(n => n.SystemId)
        .Distinct()
        .ToList();
    
    // Process each unique record only once
    foreach (var systemId in uniqueSystemIds)
    {
        await ProcessSingleRecord(systemId);
    }
}
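
One caveat on the model above: WebhookNotification is the subscriber's own representation. Business Central's actual notification payload carries a resource URL and a changeType per entry rather than a bare SystemId, so in practice you would typically parse the SystemId out of the resource path before deduplicating.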

Batch API Calls

Instead of making individual API calls for each notification, batch them together using OData filters:

// Instead of multiple individual calls, make one filtered call.
// Note: Business Central's APIs use OData v4, where GUID literals are unquoted
// and field names follow the API page's camelCase naming (systemId, not SystemId).
var filter = string.Join(" or ", uniqueSystemIds.Select(id => $"systemId eq {id}"));
// Cap the number of IDs per call: very long filters can exceed URL length limits
var batchUrl = $"{baseApiUrl}?$filter={Uri.EscapeDataString(filter)}";
var batchResult = await httpClient.GetAsync(batchUrl);

Azure Service Bus Buffering

For high-volume scenarios, consider implementing a buffering strategy using Azure Service Bus:

  1. Immediate Storage: When webhooks arrive, immediately store them in an Azure Service Bus queue or topic (the enqueue side is sketched after the batch processor below)
  2. Batch Processing: Set up a scheduled process (every 10-20 minutes) to process accumulated messages
  3. Deduplication: During batch processing, deduplicate based on SystemID and timestamp
  4. Efficient Retrieval: Make batched API calls to Business Central with all unique records

A sketch of the batch-processing side (steps 2 through 4), assuming the Azure.Messaging.ServiceBus client library and a subscriber-defined WebhookNotification model:

public async Task ProcessServiceBusMessages()
{
    var messages = await serviceBusReceiver.ReceiveMessagesAsync(maxMessages: 100);

    // Deserialize the payload from each message body, then keep only the
    // latest notification per SystemID
    var uniqueRecords = messages
        .Select(m => m.Body.ToObjectFromJson<WebhookNotification>())
        .GroupBy(n => n.SystemId)
        .Select(g => g.OrderByDescending(n => n.Timestamp).First())
        .ToList();

    await ProcessBatchedRecords(uniqueRecords);

    // Complete the messages so they are not delivered again
    foreach (var message in messages)
        await serviceBusReceiver.CompleteMessageAsync(message);
}
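
The immediate-storage side (step 1) should do as little as possible: accept the notification and get it onto the queue. A minimal sketch, again using Azure.Messaging.ServiceBus (the class and queue name are illustrative assumptions):

using Azure.Messaging.ServiceBus;

// Enqueue-side sketch: forward the raw notification to Service Bus immediately
// so the webhook endpoint responds fast; processing happens later in batches.
public class WebhookBufferingHandler
{
    private readonly ServiceBusSender sender;

    public WebhookBufferingHandler(ServiceBusClient client)
    {
        sender = client.CreateSender("integration-webhooks"); // assumed queue name
    }

    public async Task BufferNotificationAsync(string rawNotificationJson)
    {
        var message = new ServiceBusMessage(rawNotificationJson)
        {
            ContentType = "application/json"
        };
        await sender.SendMessageAsync(message);
    }
}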

Throttling and Rate Limiting

Implement intelligent throttling to respect Business Central’s API limits:

  • Use exponential backoff when encountering rate limits
  • Implement circuit breaker patterns for failed requests (a minimal sketch follows this list)
  • Monitor API usage and adjust batch sizes accordingly
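
As a sketch of the circuit-breaker idea from the list above (the threshold and cool-down values are arbitrary assumptions; a production system might reach for a library such as Polly instead):

// Minimal circuit breaker: after several consecutive failures, stop calling
// Business Central for a cool-down period instead of piling on more requests.
public class SimpleCircuitBreaker
{
    private const int FailureThreshold = 5;                                   // assumed
    private static readonly TimeSpan BreakDuration = TimeSpan.FromMinutes(2); // assumed
    private int consecutiveFailures;
    private DateTime openUntilUtc = DateTime.MinValue;

    public async Task<T> ExecuteAsync<T>(Func<Task<T>> action)
    {
        if (DateTime.UtcNow < openUntilUtc)
            throw new InvalidOperationException("Circuit is open; skipping call.");

        try
        {
            var result = await action();
            consecutiveFailures = 0; // a success closes the circuit again
            return result;
        }
        catch
        {
            if (++consecutiveFailures >= FailureThreshold)
                openUntilUtc = DateTime.UtcNow.Add(BreakDuration);
            throw;
        }
    }
}

Wrapping each Business Central call in ExecuteAsync turns a flood of failing requests into a single short-circuit until the cool-down passes.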

Error Handling and Retry Logic

Build robust error handling into your webhook processing:

public async Task<bool> ProcessWithRetry(string systemId, int maxRetries = 3)
{
    for (int attempt = 1; attempt <= maxRetries; attempt++)
    {
        try
        {
            await ProcessRecord(systemId);
            return true;
        }
        // Requires .NET 5+, where HttpRequestException exposes the status code;
        // matching on the code is more reliable than parsing the message text
        catch (HttpRequestException ex) when (ex.StatusCode == HttpStatusCode.TooManyRequests)
        {
            // Rate limited: exponential backoff (2, 4, 8... seconds)
            await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
        }
        catch (Exception)
        {
            if (attempt == maxRetries) throw;
            await Task.Delay(TimeSpan.FromSeconds(attempt * 2));
        }
    }
    return false;
}

Conclusion

Business Central’s webhook system, while powerful, requires thoughtful implementation to avoid the pitfall of over-notification. By implementing a custom log table approach, you gain precise control over when notifications are sent, ensuring that your integration systems only process truly relevant changes.

The combination of field-level change tracking in Business Central and intelligent webhook handling on the subscriber side creates a robust, efficient integration architecture. This approach not only reduces unnecessary API calls and processing overhead but also improves the overall reliability and performance of your integrations.

Key takeaways for implementing efficient Business Central webhooks:

  • Be selective: Only log and notify on changes that matter to your integration
  • Batch processing: Group notifications and API calls to reduce overhead
  • Implement deduplication: Avoid processing the same record multiple times
  • Use buffering: Consider Azure Service Bus for high-volume scenarios
  • Build in resilience: Implement proper error handling and retry logic

By following these patterns, you’ll transform noisy, inefficient webhook integrations into streamlined, purposeful data synchronization systems that scale with your business needs. The initial investment in setting up this architecture pays dividends in reduced costs, improved performance, and more maintainable integration code.

Remember, the goal isn’t just to receive notifications—it’s to receive the right notifications at the right time, processed in the most efficient way possible.
