“What if I could replicate a table to a system running on an entirely different database, written in a different language, on an entirely different OS, seamlessly?”
I wondered what I could do with webhooks that wasn’t just a standard use case. This post isn’t a how-to for creating custom API pages in AL or for how webhooks work; other people have done that, and they are way better at explaining it than me.
The flow I wanted:
Add a generic Business Central AL extension, which exposes the installed APIs and the Field table. This is the main extension; it will not expose any data tables itself, but will be used to handle the communication.
Add an extension with custom API pages for the tables needing replication. (In this example it will also contain the table, but normally it would not.)
The server (called CopyCat) will call the main extension (ALleyCat) and get a list of all API pages within a specific APIGroup (BadSector) and their table structure.
CopyCat will connect to a secondary database and create tables.
CopyCat will copy all records.
CopyCat will subscribe via webhooks for further changes.
Each webhook response is just a record ID and the change type, e.g. Updated or Deleted. So CopyCat will either request the new/updated record or delete it if needed. It will keep a list of new or modified records in an array and only request a predefined number of records per second, so the main service tier won’t be overloaded.
Periodically request table changes in BC; if new fields are detected, they are added to the table.
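The throttling part of the flow above could be sketched like this (hypothetical CopyCat-side C#; all class and method names here are made up for illustration):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch: webhook notifications are queued, and a background
// loop only fetches a fixed number of records per second from the BC
// service tier, so it never gets overloaded.
public class ChangeQueue
{
    private readonly ConcurrentQueue<(string RecordId, string ChangeType)> _pending =
        new ConcurrentQueue<(string RecordId, string ChangeType)>();
    private readonly int _maxRequestsPerSecond;

    public ChangeQueue(int maxRequestsPerSecond) => _maxRequestsPerSecond = maxRequestsPerSecond;

    // Called for every incoming webhook notification.
    public void Enqueue(string recordId, string changeType) => _pending.Enqueue((recordId, changeType));

    // Background loop: drain at most N changes per second.
    public async Task ProcessAsync(Func<string, string, Task> applyChange, CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            for (int i = 0; i < _maxRequestsPerSecond && _pending.TryDequeue(out var change); i++)
                await applyChange(change.RecordId, change.ChangeType); // fetch or delete
            await Task.Delay(1000, token);
        }
    }
}
```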
This is the tale of how a small idea became a week-long project, but an informative/instructive one.
Just a heads up, this post isn’t how I normally make them, so it may contain really bad jokes and stupid references. Plus it will actually contain a “why”, which goes against my blog’s motto. 🙂
A few weeks ago, with NavTechDays approaching rapidly, I started wondering: “How many people are attending this year, and from which countries?” I had multiple reasons for this.
First, how fast do I have to get to the halls to get the seats with a table for my laptop?
Secondly, how many do I have to outwit to get ahead in the queues for the beer? 🙂
Easy, there is a fairly comprehensive list of attendees on the official NavTechDays website. I couldn’t just copy/paste, as the country info was located in a tag, and everyone knows the Germans drink like madmen. So I googled “.net webscraper”, grabbed the first hit I got, fired up a Business Central dev environment and started coding.
Let the coding begin
The .NET library I found was HTML Agility Pack. The documentation was a bit sparse, but that didn’t frighten me. The example on the front page even made it look super easy.
I thought: “The attendees are listed in a table; I just need to find each line and read them one by one for each page.” I opened the page in Chrome, entered Developer mode (F12) and browsed through the HTML until I found the data I needed.
There they were, table row after table row <TR>. Great, just right-click the first one and copy the XPath selector for it.
This gave me the following XPath selector. For more info on XPath, check out this page.
I removed the last part of it and validated it in Chrome. Just press Ctrl+F and paste the XPath, and Chrome will show you the resulting number of nodes. Here it showed me that I would get 100 lines on this page. Perfect, there are 100 attendees per page.
Enter this into the code:
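Something along these lines (a sketch using HTML Agility Pack; the XPath here is illustrative, not the exact selector from the site):

```csharp
using System;
using HtmlAgilityPack;

// Load the attendee page and select every table row.
var web = new HtmlWeb();
HtmlDocument doc = web.Load("https://www.navtechdays.com/2018/attendees/last_name");

// SelectNodes returns null when nothing matches, so guard for that.
var rows = doc.DocumentNode.SelectNodes("//table//tr");
if (rows != null)
    foreach (HtmlNode row in rows)
        Console.WriteLine(row.InnerText.Trim());
```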
It took me a while before I remembered the old IEnumerable way of doing things.
But no matter what I did, I never got more than the first attendee. I thought my Enumerable code was bugged, or my XPath was wrong; I wasted most of a night trying to figure out where I had messed up. Finally I said to myself: “The code works, the problem must be elsewhere.”
The plot thickens
I opened https://validator.w3.org and entered the attendees URL. It said: 113 errors. A bit high, but not unusual. But when I browsed through the list I suddenly realized why my code wouldn’t work.
So my code was fine; the webscraper simply didn’t know how to handle the extra tag and just stopped, which is why I kept getting just one line.
Fixing the HTML
I searched for an HTML sanitizer on NuGet and found this. I added it to the code, but alas, I couldn’t get the variable initialized. I tried every trick I had to make those parameters work. The code just kept getting more and more bloated.
Azure functions to the rescue
I wondered if I had the syntax wrong, so I tried it 1-to-1 in C# (Visual Studio 2017) and it worked. Well, I had to learn Azure Functions some day anyway, so this was as good a day as any.
Long story short. (Way, way too long a story.)
I won’t turn this into an Azure Functions tutorial; there are a lot of them on the web, and the ones making them have way more experience than me.
But some of the things I had to wrestle with were:
IActionResult not wanting to return text/xml.
My lack of knowledge regarding ASP.NET MVC.
My Azure Function not wanting to start up. I had to delete it and upload it again (same code).
I ended up making this. A bit rough around the edges, but it works.
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class FixNavTechDays
{
    [FunctionName("FixNavTechDays")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req)
    {
        string page = req.Query["Page"];
        // Need this to fetch the webpage.
        var client = new WebClient();
        // Setup sanitizer and allow class to remain in HTML.
        var sanitizer = new Ganss.XSS.HtmlSanitizer();
        sanitizer.AllowedAttributes.Add("class");
        string URL = "https://www.navtechdays.com/2018/attendees/last_name";
        if (page != "0")
            URL = URL + "/P" + page;
        // Create easy to use response with valid HTML.
        string sanitized = sanitizer.Sanitize(client.DownloadString(URL));
        return new HttpResponseMessage(System.Net.HttpStatusCode.OK)
        {
            Content = new StringContent(sanitized, Encoding.Default, @"text/html")
        };
    }
}
It takes a parameter and translates it into a URL, with which it then requests the page and sanitizes it.
The final C/AL
With this over, I finally had some data I could iterate. I changed the XPath a bit, so I got each column of each row one by one. I could have used a specific XPath for each column, but I had already used way more time on this post than I had hoped. 🙂 So I ended up doing it quick and dirty.
There were a few things I had to fix as well: the strings had a lot of padding, and the country was only mentioned in an attribute. So I had to use a bit of String.Trim and GetAttributeValue.
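The cleanup boils down to something like this (a sketch; the attribute name and sample markup are assumptions, not the site's actual HTML):

```csharp
using System;
using HtmlAgilityPack;

// Sample markup standing in for one attendee cell; the "title"
// attribute holding the country is an assumption for illustration.
var doc = new HtmlDocument();
doc.LoadHtml("<td><span title=\"Denmark\">  John Doe  </span></td>");

HtmlNode span = doc.DocumentNode.SelectSingleNode("//span");
// Strip the padding from the visible text...
string name = span.InnerText.Trim();
// ...and read the country from the attribute it was hiding in.
string country = span.GetAttributeValue("title", string.Empty);
Console.WriteLine($"{name} ({country})"); // John Doe (Denmark)
```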
(Link to the entire C/AL and Azure Function code at the end.)
And did you think I would just post the table data? Nope, I installed Power BI Desktop, added the OData connector for Business Central (Local) and did a bit of data beautifying.
If you need one phrase to help minimize the queue at the bar, just yell (warning: I haven’t used my German seriously in about 20 years):
“Die Leute am Continia Stand sagen Deutsche sind scheiße Fußballer.” (“The people at the Continia stand say Germans are shit at football.”)
I was just making the last touch-ups for the next posts when I noticed that there aren’t that many examples of how to use custom assemblies in AL code.
With the release of the full version of Business Central on the 1st of October, some info on this topic is more and more relevant. So this is just a quick post based on my MQTT client example.
This is basically just the same code, though some things have been cut out to slim it down a bit. Everything related to sending has been removed.
I used txt2al and renamed a few of my vars, as a few of them overlapped with new keywords in AL.
Why didn’t I make it from scratch in AL? Two reasons. First, txt2al is just a great tool and it works almost flawlessly. Secondly, VS Code is a terrible tool for writing AL code which contains DotNet. There is almost no help whatsoever when using DotNet. So until this is fixed I would recommend using C/SIDE and then converting the code afterwards.
The following has been shown elsewhere, but I just want to reiterate it. To use custom assemblies you need to specify which directory they are in and then define them afterwards.
For example, I have mine in a subdirectory of my project folder called .netpackages.
This is then referred to in the .vscode\settings.json file like this.
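A minimal version of that setting would look roughly like this (assuming the .netpackages subdirectory from above):

```json
{
    "al.assemblyProbingPaths": [
        "./.netpackages"
    ]
}
```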
With this done, it is now possible to use the assemblies in the project, but first we need to refer to them like this.
This is where C/SIDE comes in handy, as these lines are not possible to find in VS Code. You just have to know what to write; not impossible, just really annoying. The same can be said about DotNet triggers; you just have to know them, as there is no autocomplete. So as you can see, DotNet development isn’t really implemented in AL yet.
E.g. it is expected that you just know the trigger name and all its parameters, like the first line in the code below. No help.
Summary: if you want to do DotNet in AL, just add the variables and triggers in C/SIDE and then do a txt2al conversion. You will thank yourself afterwards.
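For illustration, a DotNet declaration and event trigger look roughly like this (a sketch based on my MQTT example; the type names are from the M2Mqtt library, and the exact syntax may differ from my actual code):

```
// Declare the custom assembly and the types you need from it.
dotnet
{
    assembly(M2Mqtt)
    {
        type(uPLibrary.Networking.M2Mqtt.MqttClient; MqttClient) { }
        type(uPLibrary.Networking.M2Mqtt.Messages.MqttMsgPublishEventArgs; MqttMsgPublishEventArgs) { }
    }
}

codeunit 50100 "MQTT Subscriber"
{
    var
        Client: DotNet MqttClient;

    // No autocomplete will help you here - you have to know the
    // trigger name and its parameters up front.
    trigger Client::MqttMsgPublishReceived(sender: Variant; e: DotNet MqttMsgPublishEventArgs)
    begin
    end;
}
```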
UPDATE 07-07-2018: Now with SMS functionality added.
Sometimes you just want to push a notification or send an SMS the cheap way. Here is how you could do it in Dynamics NAV 2018 via C/AL with the use of Pushbullet, “JSON Buffer”, “JSON Management” and “HTTP Web Request Mgt”. This can pretty easily be converted to AL for use in Business Central.
In this example I will use Pushbullet – a cross-platform notification system which you can read more about here.
A quick summary of what you will see:
GET and POST calls to an HTTP web service from C/AL.
Import JSON using Table 1236 JSON Buffer.
Exporting/generating JSON using Codeunit 5459 JSON Management.
Sending Pushbullet notifications from C/AL.
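Under the hood, the C/AL is just making one HTTP call against the Pushbullet API. In C# terms it boils down to something like this (a sketch; you need your own access token from pushbullet.com):

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Sketch of the raw HTTP call behind a Pushbullet "note" push:
// POST /v2/pushes with the token in the Access-Token header.
public static class PushbulletDemo
{
    public static async Task SendNoteAsync(string accessToken, string title, string body)
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Access-Token", accessToken);
        string json = $"{{\"type\":\"note\",\"title\":\"{title}\",\"body\":\"{body}\"}}";
        var content = new StringContent(json, Encoding.UTF8, "application/json");
        HttpResponseMessage response = await client.PostAsync(
            "https://api.pushbullet.com/v2/pushes", content);
        response.EnsureSuccessStatusCode();
    }
}
```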
The Pushbullet API is well documented, compared to JSON Buffer and JSON Management (*Wink wink* Microsoft). Most of this is based on a lot of trial and error, sprinkled with Google.
So you have a problem, and you have found a package on NuGet that might just fix it for you in no time. Where to start?
Let’s presume I want a QR generator: I want to create custom QR codes for documents or items. It could be a link to the data sheet, and you want it on your company web page or the item card for easy access for technicians.
They might find the item on a PC but need the drawings later at the machine. So they find it and scan the code, which triggers a download of the PDF data sheet, so they have it offline. There are multiple uses for this. I am not here for the why, just the how.
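As an example of such a package, the QRCoder NuGet package can do this in a few lines (a sketch; the URL is made up, and the API shown is from QRCoder's documentation):

```csharp
using System.IO;
using QRCoder;

// Turn a (hypothetical) data sheet URL into a QR code PNG.
var generator = new QRCodeGenerator();
QRCodeData data = generator.CreateQrCode(
    "https://example.com/datasheet.pdf", QRCodeGenerator.ECCLevel.Q);
var pngQr = new PngByteQRCode(data);
byte[] pngBytes = pngQr.GetGraphic(20); // 20 pixels per module
File.WriteAllBytes("qr.png", pngBytes);
```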
What is MQTT (Message Queuing Telemetry Transport)? Well, I could write up and down about just that subject, but other people have done that before and are probably better at explaining it than me. Google it (heck, use Bing if you are feeling lucky today).
Lazy? Okay then. It is a small-footprint protocol on top of TCP/IP used for sending information. It is mostly used on embedded devices (IoT) where CPU power is limited. You cannot have an Intel octa-core powerhouse when it has to run for a year on a coin-cell battery.
REST is not an option; it is way too resource-heavy for simple sensor data.