Exploring AI-Assisted Development: My Journey with Aider, ChatGPT, and Claude

I have been experimenting with using AI assistants to develop a notification framework for Business Central. My objective was not to create a functional system but to investigate the potential of AI-assisted development in our field. I have extensively used Aider in JavaScript/TypeScript, .NET, and C++. The second iteration of my version of Wine for BC4Ubuntu is 98% written by Aider.

At the end of this post, I’ve linked the repository where the prompt used for Aider, along with the resulting code, is stored. I didn’t want it to create pages, just the initial boilerplate code.

The AI Assistants: My Digital Collaborators
For this experiment, I chose to work with three AI tools:

  • Aider: An AI-powered coding assistant (https://github.com/paul-gauthier/aider) specifically designed to help with coding tasks.
  • ChatGPT-4o: OpenAI’s large language model, known for its versatility in generating text and assisting with a wide range of tasks.
  • Claude 3.5 Sonnet: Anthropic’s AI assistant, capable of engaging in complex discussions and providing detailed explanations.

The Process: A Dance of Human and Machine Intelligence
Setting the Stage with ChatGPT

I began by asking ChatGPT to create a detailed description of a notification framework for Business Central. The AI generated a comprehensive outline, including entity relationships, data flow, and implementation considerations. This provided a solid foundation for the project, offering a level of detail that would have taken significant time to produce manually.

Coding with Aider and Claude

With our blueprint, I turned to Aider and Claude to assist with the AL programming. Here’s where things got interesting. Aider excelled at generating boilerplate code and suggesting basic structures for our entities.
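
To give a feel for what that boilerplate looked like, here is a minimal sketch of the kind of AL entity Aider would scaffold. The object ID and field names are purely illustrative; the actual prompt and generated code are in the repository linked at the end of this post.

table 50100 "Notification Entry"
{
    DataClassification = CustomerContent;

    fields
    {
        field(1; "Entry No."; Integer)
        {
            AutoIncrement = true;
        }
        field(2; "Recipient User ID"; Code[50]) { }
        field(3; "Notification Message"; Text[250]) { }
        field(4; Status; Option)
        {
            OptionMembers = New,Sent,Failed;
        }
        field(5; "Created At"; DateTime) { }
    }

    keys
    {
        key(PK; "Entry No.")
        {
            Clustered = true;
        }
    }
}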

    Reflections: The Promises and Pitfalls of AI-Assisted Development

    The Good

    • Efficiency: The AI assistants dramatically sped up certain aspects of development, particularly in generating initial code structures and documentation.
    • Creativity Boost: Discussing ideas with the AIs often led to novel solutions I might not have considered.
    • Continuous Learning: The explanations and suggestions provided by the AIs, especially Claude, were like having a knowledgeable mentor always available.

    The Challenges

    • Accuracy Concerns: While impressive, the AI-generated code wasn’t always perfect. Careful review was necessary to catch and correct errors.
    • Context Limitations: The AIs sometimes struggled with Business Central-specific nuances that weren’t part of their training data.
    • Over-reliance Risk: It was tempting to rely too heavily on AI suggestions, potentially stifling my problem-solving skills.

    The Unexpected

    • Pair Programming Feel: Working with the AIs often felt like pair programming, providing a collaborative atmosphere even when working alone.
    • Expanded Perspective: The AIs sometimes approached problems from unexpected angles, challenging my preconceptions and broadening my thinking.

    Lessons Learned: Best Practices for AI-Assisted Development

    • Use AI as a Tool, Not a Replacement: AI assistants are powerful aids, but they shouldn’t replace human judgment and expertise.
    • Leverage for Learning: Use AI explanations to deepen your understanding of concepts and best practices.
    • Iterate and Refine: Use AI to quickly prototype ideas, but be prepared to refine and optimize the results.
    • Stay Informed: Keep up with the rapidly evolving capabilities of AI tools in the development space.

    Looking Ahead: The Future of AI in Business Central Development

    This experiment has convinced me that AI-assisted development has a bright future in the Business Central ecosystem. As these tools evolve, I anticipate they will become indispensable for tasks like:

    • Rapid prototyping of new features
    • Automated code reviews and optimization
    • Enhanced debugging and error detection
    • More intuitive and context-aware code completion

    However, the key to success will be learning to work symbiotically with AI, combining machine efficiency with human creativity and domain expertise.

    I’ve shared the results of this experiment, including the notification framework code, on GitHub: https://github.com/SShadowS/Aider.ALNotify

    I invite you to explore it and share your thoughts.

    Flame graphs of BC call stacks

    Quick note: Yeah, I’m back. A lot happened, got married, had a kid, new job. You know, the usual.

    Now that we have that out of the way: this is just a quick write-up, as someone most of you know will be showing it in action at a session at BCTechDays.

    If you want graphs like the following of your BC call stacks, call my web service and it will convert them to SVG formatted flame graphs.

    The web service is open to everyone in the following regions and will stay free forever (write to me if you are blocked and I will whitelist you):
    Europe, North and South America, plus Oceania.

    The web service: https://github.com/SShadowS/AL-Flamegraph

    Example BC App for calling web service: https://github.com/SShadowS/AL-FlamegraphApp
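
    The call itself is just an ordinary HTTP POST from AL. Below is a minimal sketch of that call; the endpoint URL is a placeholder and the exact request format is defined by the AL-FlamegraphApp example above, so treat this only as an outline.

    codeunit 50130 "Flame Graph Client"
    {
        procedure ConvertToSvg(CallStackText: Text): Text
        var
            Client: HttpClient;
            Content: HttpContent;
            Response: HttpResponseMessage;
            Svg: Text;
        begin
            // Send the collected BC call stacks as the request body.
            Content.WriteFrom(CallStackText);

            // Placeholder URL - see the AL-FlamegraphApp repository for the real endpoint and payload.
            Client.Post('https://<flamegraph-service>/convert', Content, Response);

            // The service answers with an SVG formatted flame graph.
            Response.Content.ReadAs(Svg);
            exit(Svg);
        end;
    }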

    A more detailed post will follow next week after TechDays, and it will include a guide for using Pyroscope.

    Lazy replication of tables with NodeJS, Custom APIs and Webhooks in Business Central (Part 1)

    “What if I could replicate a table to a system with an entirely different database, written in a different language, even on an entirely different OS?”

    I wondered what I could do with webhooks that wasn’t just a standard use case. This post isn’t a how-to for creating custom API pages in AL or an explanation of how webhooks work; other people have covered that, and they are way better at explaining it than me.

    The flow I wanted:

    1. Add a generic Business Central AL extension which exposes the installed APIs and the Field table. This is the main extension; it will not expose any data tables itself, but will handle the communication.
    2. Add an extension with custom API pages for the tables that need replication (a sketch of such a page follows after this list). In this example it also contains the table itself, but normally it would not.
    3. The server (called CopyCat) will call the main extension (ALleyCat) and get a list of all API pages within a specific APIGroup (BadSector), along with their table structure.
    4. CopyCat will connect to a secondary database and create the tables.
    5. CopyCat will copy all records.
    6. CopyCat will subscribe via webhooks to further changes.
    7. Each webhook notification is just a record ID and the change type, e.g. Updated or Deleted, so CopyCat will either request the new/updated record or delete it as needed. It will keep a list of new or modified records in an array and only request a predefined number of records per second, so the main service tier won’t be overloaded.
    8. Periodically request the table definitions from BC; if new fields are detected, they are added to the replicated table.
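
    To make step 2 a bit more concrete, a custom API page for a table you want replicated looks roughly like the sketch below. The object ID, publisher and fields are illustrative; the APIGroup is what CopyCat uses to discover the pages.

    page 50110 "Repl. Customer API"
    {
        PageType = API;
        APIPublisher = 'demo';
        APIGroup = 'BadSector';
        APIVersion = 'v1.0';
        EntityName = 'replCustomer';
        EntitySetName = 'replCustomers';
        SourceTable = Customer;
        DelayedInsert = true;

        layout
        {
            area(Content)
            {
                repeater(Records)
                {
                    field(no; "No.") { }
                    field(name; Name) { }
                }
            }
        }
    }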
    Continue reading “Lazy replication of tables with NodeJS, Custom APIs and Webhooks in Business Central (Part 1)”

    Why I had to fix NavTechDays

    Statistics

    This is the tale of how a small idea became a week-long, but informative and instructive, project.

    Just a heads up: this post isn’t written the way I normally make them, so it may contain really bad jokes and stupid references. Plus it will actually contain a “why”, which goes against my blog’s motto. 🙂

    A few weeks ago, with NavTechDays approaching rapidly, I started wondering:
    “How many are attending this year, and from which countries?”
    There were multiple reasons for this.


    First, how fast do I have to get to the halls to get the seats with a table for my laptop?

    Secondly, how many people do I have to outwit to get ahead in the queue for the beer? 🙂

    Easy: there is a fairly comprehensive list of attendees on the official NavTechDays website. I couldn’t just copy/paste, as the country info was hidden in a tag, and everyone knows the Germans drink like madmen. So I googled “.net webscraper”, grabbed the first hit, fired up a Business Central dev environment and started coding.

    Let the coding begin

    The .NET library I found was HTML Agility Pack. The documentation was a bit sparse, but that didn’t frighten me.
    The example on the front page even made it look super easy.

    https://html-agility-pack.net/

    When using the web example, it ended up being very similar in C/AL (check this blog post on how to use NuGet packages):
    HAPHtmlWeb := HAPHtmlWeb.HtmlWeb;
    HAPHtmlDoc := HAPHtmlWeb.Load(URL);

    By using https://www.navtechdays.com/2018/attendees/last_name/ as the URL, I now had the entire page searchable.

    I thought: “The attendees are listed in a table, I just need to find each line and read them one by one for each page.”
    I opened the page in Chrome, entered Developer mode (F12) and browsed through the HTML until I found the data I needed.

    Developer mode

    There they were, table row after table row (<TR>). Great, just right-click the first one and get the XPath selector for it.

    Xpath selector

    This gave me the following XPath selector. For more info on XPath, check out this page.

    //*[@id="contentBox"]/div[2]/table/tbody/tr[1] 

    I removed the last [1] and validated it in Chrome. Just press Ctrl+F and paste the XPath, and Chrome will show you the number of matching nodes. Here it showed me that I would get 100 lines on this page. Perfect, there are 100 attendees per page.

    Easy…

    So I entered this XPath into the code.

    Next bump in the road.

    It took me a while before I remembered the old IEnumerable way of doing things.

    So far, so good.

    But no matter what I did, I never got more than the first attendee. I thought my IEnumerable code was bugged, or my XPath was wrong; I wasted most of a night trying to figure out where I had messed up.
    Finally I said to myself: “The code works, the problem must be elsewhere.”

    The plot thickens

    I opened https://validator.w3.org and entered the attendees URL. It said: 113 errors. A bit high, but not unusual. But when I browsed through the list I suddenly realized why my code wouldn’t work.

    Line after line with this error:
    Orphan </div>

    So my code was fine; the web scraper simply didn’t know how to handle the extra closing tag and just stopped, which is why I kept getting only one line.

    Fixing the HTML

    I searched for an HTML sanitizer on NuGet and found this one. I added it to the code, but alas, I couldn’t get the variable initialized. I tried every trick I had to construct its parameters, and the code just kept getting more and more bloated.

    Azure functions to the rescue

    I wondered if I had the syntax wrong, so I tried it 1-to-1 in C# (Visual Studio 2017) and it worked.
    Well, I had to learn Azure Functions some day anyway, so this was as good a day as any.

    Long story short. (Way, way too long a story.)

    I won’t turn this into an Azure Functions tutorial; there are a lot of them on the web, and the people making them have way more experience than me.

    But some of the things I had to wrestle with were: 

    • IActionResult not wanting to return text/xml.
    • My lack of knowledge regarding ASP.NET MVC.
    • My Azure Function not wanting to start up. I had to delete it and upload it again (same code).

    I ended up making this. A bit rough around the edges, but it works.

    using System.Net;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.Extensions.Logging;

    namespace FixNavTechDays
    {
        public static class FixNavTechDays
        {
            [FunctionName("FixNavTechDays")]
            public static async Task<HttpResponseMessage> Run(
                [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
                ILogger log)
            {
                string page = req.Query["Page"];

                // Need this to fetch the web page.
                var client = new WebClient();

                // Set up the sanitizer and allow the class attribute to remain in the HTML.
                var sanitizer = new Ganss.XSS.HtmlSanitizer();
                sanitizer.AllowedAttributes.Add("class");

                // Page 0 (or no page) means the first attendee page, otherwise append /P<page>.
                string URL = "https://www.navtechdays.com/2018/attendees/last_name";
                if (!string.IsNullOrEmpty(page) && page != "0")
                {
                    URL = URL + "/P" + page;
                }

                // Create an easy-to-use response with valid HTML.
                string sanitized = sanitizer.Sanitize(client.DownloadString(URL));

                return new HttpResponseMessage(HttpStatusCode.OK)
                {
                    Content = new StringContent(sanitized, Encoding.Default, "text/html")
                };
            }
        }
    }

    It takes a Page parameter, translates it into a URL, requests that page and then sanitizes the result.

    The final C/AL

    With this out of the way I finally had some data I could iterate over. I changed the XPath a bit, so I got each column of each row one by one. I could have used a specific XPath for each column, but I had already spent way more time on this post than I had hoped. 🙂
    So I ended up doing it quick and dirty.

    Yep, that is the real API key. The API will be up until NavTechDays 2018 is over. Check it out.

    There were also a few things I had to fix: the strings had a lot of padding, and the country was only mentioned in an attribute, so I had to use a bit of String.Trim and GetAttributeValue.

    (Link to the entire C/AL and Azure Function code at the end.)

    The Result

    And did you think I would just post the table data?
    Nope, I installed Power BI Desktop, added the OData connector for Business Central (Local) and did a bit of data beautifying.

    The Germans rule.

    Apparently RelateIT brought an army this year.

    TL;DR

    If you need one phrase to help minimize the queue at the bar, just yell (warning: I haven’t used my German seriously in about 20 years):

    “Die Leute am Continia Stand sagen Deutsche sind scheiße Fußballer.”

    C/AL Code: https://www.dropbox.com/s/3c3uka3lncceeeh/WebScraper.txt?dl=1

    Custom DotNet assemblies in AL

    I was just making the last touch-ups for the next posts when I noticed that there are not that many examples of how to use custom assemblies in AL code.

    With the release of the full version of Business Central on the 1st of October, some info on this topic is more and more relevant. So this is just a quick post based on my MQTT client example.

    This is basically the same code, though some things have been cut out to slim it down a bit. Everything related to sending has been removed.

    I used txt2al and renamed a few of my variables, as a few of them collided with new keywords in AL.

    Why didn’t I make it from scratch in AL? Two reasons. First, txt2al is just a great tool and it works almost flawlessly. Secondly, VS Code is a terrible tool for starting to write AL code that contains DotNet; there is almost no help whatsoever when using DotNet. So until this is fixed I would recommend using C/SIDE and then converting the code afterwards.

    The following has been shown elsewhere, but I just want to reiterate it: to use custom assemblies, you need to specify which directory they are in and then declare them afterwards.

    Here, for example, I have mine in a subdirectory of my project folder called .netpackages.

    This is then referred to in the .vscode\settings.json file via the "al.assemblyProbingPaths" setting, for example "al.assemblyProbingPaths": ["./.netpackages"].

    With that done it is now possible to use the assemblies in the project, but first we need to declare them in the AL code, as shown below.
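
    In AL that means a dotnet declaration block in one of the objects. The general shape is sketched below, using the well-known M2Mqtt client library as an example; substitute the assembly name and full type names of the DLLs you actually dropped into .netpackages.

    dotnet
    {
        // The assembly name must match the DLL in the probing path.
        assembly(M2Mqtt)
        {
            // type(<full .NET type name>; <alias used for AL variables>)
            type(uPLibrary.Networking.M2Mqtt.MqttClient; MqttClient) { }
        }
    }

    After this, the alias can be used as a variable type, e.g. Client: DotNet MqttClient;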

    This is where C/SIDE comes in handy, as these lines are not discoverable in VS Code. You just have to know what to write; not impossible, just really annoying.
    The same can be said about DotNet triggers: you just have to know them, as there is no autocomplete.
    So, as you can see, DotNet development isn’t really fully supported in AL yet.

    E.g. it is expected that you simply know the trigger name and all its parameters, like the first line of the trigger code in the example project linked below. No help whatsoever.

    Summary:
    So if you want to do DotNet in AL, just add the variables and triggers in C/SIDE and then run txt2al. You will thank yourself afterwards.

    The entire project can be found here https://github.com/SShadowS/AL-MQTT-example

    Pushbullet + C/AL + JSON Buffer = Easy notifications on multiple platforms

    Pushbullet logos

    UPDATE 07-07-2018: Now with SMS functionality added.

    Sometimes you just want to push a notification or send an SMS the cheap way. Here is how you could do it in Dynamics NAV 2018 via C/AL with the use of Pushbullet, “JSON Buffer”, “JSON Management” and “HTTP Web Request Mgt”. This can pretty easily be converted to AL for use in Business Central.

    In this example I will use Pushbullet – a cross-platform notification system which you can read more about here.

    A quick summary of what you will see:

    • GET and POST calls to an HTTP web service from C/AL.
    • Importing JSON using Table 1236 JSON Buffer.
    • Exporting/generating JSON using Codeunit 5459 JSON Management.
    • Sending Pushbullet notifications from C/AL.

    The Pushbullet API is well documented, compared to JSON Buffer and JSON Management (*Wink wink* Microsoft). Most of this is based on a lot of trial and error, sprinkled with Google.
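
    To give a taste of the AL conversion mentioned above, the push itself boils down to a single authenticated POST against the documented /v2/pushes endpoint. This is only a rough sketch (object ID and names are made up, error handling is omitted, and the access token comes from your Pushbullet account settings):

    codeunit 50120 "Pushbullet Demo"
    {
        procedure SendNote(AccessToken: Text; NoteTitle: Text; NoteBody: Text)
        var
            Client: HttpClient;
            Content: HttpContent;
            Headers: HttpHeaders;
            Response: HttpResponseMessage;
            Payload: JsonObject;
            PayloadText: Text;
        begin
            // Build the JSON body: {"type":"note","title":...,"body":...}
            Payload.Add('type', 'note');
            Payload.Add('title', NoteTitle);
            Payload.Add('body', NoteBody);
            Payload.WriteTo(PayloadText);

            // Pushbullet expects a JSON content type.
            Content.WriteFrom(PayloadText);
            Content.GetHeaders(Headers);
            Headers.Remove('Content-Type');
            Headers.Add('Content-Type', 'application/json');

            // Authenticate with the account access token and send the push.
            Client.DefaultRequestHeaders.Add('Access-Token', AccessToken);
            Client.Post('https://api.pushbullet.com/v2/pushes', Content, Response);
        end;
    }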

    Continue reading “Pushbullet + C/AL + JSON Buffer = Easy notifications on multiple platforms”

    How to begin using .Net in C/AL?

    So you have a problem, and you have found a package on NuGet that might just fix it for you in no time. Where to start?

    Let’s presume I want a QR generator: I want to create custom QR codes for documents or items. It could be a link to the data sheet, and you want it on your company web page or on the Item card for easy access for technicians.

    They might find the item on a PC but need the drawings later at the machine. So they find it and scan the code, which triggers a download of the PDF data sheet, so they have it offline. There are multiple uses for this. I am not here for the why, just the how.

    Continue reading “How to begin using .Net in C/AL?”

    NAV as a MQTT client

    Borrowed from https://pagefault.blog/2017/03/02/using-local-mqtt-broker-for-cloud-and-interprocess-communication/

    What is MQTT (Message Queuing Telemetry Transport)? Well, I could write up and down about that subject, but other people have done it before and are probably better at explaining it than me. Google it (heck, use Bing if you are feeling lucky today).

    Lazy? Okay then. It is a small-footprint protocol on top of TCP/IP, used for sending information. It is mostly used on embedded devices (IoT) where CPU power is limited; you cannot have an Intel octa-core powerhouse when it has to run for a year on a coin-cell battery.

    REST is not an option; it is way too resource-heavy for simple sensor data.

    Continue reading “NAV as a MQTT client”