Rethinking LinterCop: Preparing for the Next Chapter

Maybe you’ve noticed a slowdown in the development of LinterCop lately, but don’t worry! I’m still just as enthusiastic about Code Analysis as ever and will definitely keep pushing this forward in the future.

If you’re not familiar with LinterCop, you can find it here: BusinessCentral.LinterCop.
Stefan Maroń and I also did a webinar that walks through what it does and why it exists.

The main reason for this slowdown is simple: I believe our current setup doesn’t scale well in the long run. To keep things sustainable, we need to take a step back and rethink a few things that haven’t been working as smoothly as I’d like. These are the four main topics I’ll dive into in more detail below.

  • Tight coupling with the AL language
  • GitHub releases versus NuGet packages
  • The “Swiss Army Knife” problem
  • Integration with DevOps

Tight coupling with the AL language

Right now, we’re generating nearly 50 assets for every LinterCop release, because each one is compiled against a specific AL Language build. The AL version you use in VS Code, your CI pipeline, or anywhere else must exactly match the version the LinterCop .dll was built against.

We originally based everything on the VS Code Marketplace versions, since LinterCop started as a companion for VS Code. But over time, we’ve expanded into GitHub and Azure DevOps pipelines, which introduced new challenges: those environments often ship AL Language versions, coming from artifacts and/or Docker containers, that differ from the Marketplace ones. For now, we’ve worked around this by downloading the AL Language extension from the Marketplace during pipeline builds and compiling against that.
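That workaround can be sketched as a small shell script. The endpoint below is the VS Code Marketplace’s public VSIX download URL; the version number is a placeholder, so substitute the AL build your pipeline actually targets:

```shell
#!/usr/bin/env sh
# Sketch of the pipeline workaround: fetch a specific AL Language VSIX
# from the VS Code Marketplace.
PUBLISHER="ms-dynamics-smb"   # publisher of the AL Language extension
EXTENSION="al"
VERSION="14.0.0"              # placeholder: pin the build you compile against

URL="https://marketplace.visualstudio.com/_apis/public/gallery/publishers/${PUBLISHER}/vsextensions/${EXTENSION}/${VERSION}/vspackage"
echo "$URL"

# A .vsix is a zip archive; the compiler and analyzers live under extension/bin/.
# Uncomment to actually download and extract:
# curl -fSL -o al.vsix "$URL"
# unzip -q al.vsix -d al-extension
```

From there, the build points the compiler’s analyzer option at the matching LinterCop .dll placed next to the extracted extension.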

So the result is that we’re tightly coupled to Microsoft’s AL Language, and unfortunately, there are breaking changes between versions. I’ve spoken with the Microsoft team, and I completely understand why: maintaining backward compatibility with third-party analyzers would slow down their own development. It’s a trade-off that makes sense to me. Maybe in the future, when the AL Language is evolving at a slower pace, compatibility will become easier. It does mean, however, that we, as developers of third-party code analyzers, need to handle these differences ourselves and stay aligned with each new AL Language release.

To make things even more interesting, a new source for the AL Language recently appeared: Microsoft.Dynamics.BusinessCentral.Development.Tools. It follows a different versioning and release cycle than the Marketplace version, which means we now have to support two separate sources, further increasing the number of assets we need to produce for every release.

GitHub releases versus NuGet packages

Currently we’re distributing all artifacts through GitHub release assets. That mostly works fine, but it’s not without problems. One recurring issue is the rate limit for unauthenticated requests, which CI pipelines can hit quickly; see “Updated rate limits for unauthenticated requests” on the GitHub Changelog.
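One mitigation, until distribution changes, is to authenticate those requests in CI, since authenticated calls get a much higher rate limit. A minimal sketch (dry run only, nothing is actually fetched; the token variable name is whatever your CI provides):

```shell
#!/usr/bin/env sh
# Sketch: build a curl invocation for fetching LinterCop release metadata,
# attaching a token when one is available so CI runs don't burn the
# unauthenticated GitHub API rate limit. Dry run: the command is printed,
# not executed.
REPO="StefanMaron/BusinessCentral.LinterCop"
API="https://api.github.com/repos/${REPO}/releases/latest"

set -- curl -fsSL
if [ -n "${GITHUB_TOKEN:-}" ]; then
  # Authenticated requests have a far higher rate limit than anonymous ones.
  set -- "$@" -H "Authorization: Bearer ${GITHUB_TOKEN}"
fi
set -- "$@" "$API"

CMD=$(printf '%s ' "$@")
echo "$CMD"
```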

One upside of our current setup is that we can add or update assets after an initial release. There’s an automated GitHub task that checks for new releases of the AL Language and automatically builds and attaches new assets to the existing LinterCop release. Most of the time this works great, and the gap between Microsoft releasing a new AL Language version and LinterCop publishing the matching assets is usually small enough that it doesn’t cause much inconvenience.
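For readers curious what such a task looks like, here is a hypothetical sketch of a scheduled GitHub Actions workflow; the step names and helper scripts are illustrative, not the actual LinterCop workflow:

```yaml
# Hypothetical sketch, not the real LinterCop automation.
name: build-new-al-assets
on:
  schedule:
    - cron: "0 * * * *"   # check hourly for new AL Language releases
jobs:
  check-and-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check the Marketplace for a new AL Language version
        run: ./scripts/check-al-version.sh   # hypothetical helper
      - name: Build and attach assets to the existing release
        run: ./scripts/attach-assets.sh      # hypothetical helper
```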

In the long run, I’d prefer to move towards publishing LinterCop as a NuGet package. That would make distribution cleaner and more reliable. The challenge, though, is that it’s not as simple as just zipping all the .dll files into a single package and calling it a day 🤔.

If we move to NuGet, we’ll need to rethink our strategy. Once a NuGet package is published, its contents can’t be modified. So every time a new version of the AL Language is released, we’d have to create a new LinterCop package version just to update the artifacts. That quickly becomes confusing, especially when considering pre-releases. For example, a new AL Language pre-release might trigger a new NuGet version of LinterCop, even though nothing about the analyzer itself has changed. Meanwhile, users working with the current AL Language version wouldn’t actually need that update at all.

The “Swiss Army Knife” problem

The goal for this year was to pass the magic number of 100 rules in LinterCop, but I have to admit, we’re probably not going to make it. As the number of rules keeps growing, so does the feedback on what kind of rules should (and shouldn’t) be part of LinterCop.

That’s where the discussion started: should LinterCop only cover AL Language syntax and best practices, or should it also include implementation-related rules that touch on Business Central’s base application and business logic? This question came up almost a year ago in this discussion:

I feel like rules that are more like implementation recommendations like this one and not strictly related to the Language itself would be a good fit for a second analyzer, what do you think?

That’s where the seed was planted for a broader idea: multiple analyzers, each focused on its own domain.

Imagine a FormattingCop for layout and style rules, a TestCop for test-project-specific rules, or even more specialized analyzers later on. This approach would give users more flexibility. If you’re not interested in formatting rules, you simply don’t enable that analyzer. It would also make it easier to introduce new rules, possibly even with default severity levels like “Warning” depending on the analyzer’s focus.
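If that split happens, selecting analyzers per project could stay as simple as today’s VS Code configuration. Here is a sketch of a hypothetical settings.json, assuming the al.codeAnalyzers setting keeps its current shape (FormattingCop and TestCop don’t exist yet; the names are illustrative):

```json
{
  // Hypothetical: enable only the analyzers this project cares about.
  "al.codeAnalyzers": [
    "${CodeCop}",
    "${analyzerFolder}BusinessCentral.LinterCop.dll",
    "${analyzerFolder}BusinessCentral.TestCop.dll"
  ]
}
```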

The hard part, of course, will be defining where one analyzer ends and another begins. But I think this direction makes sense long-term. And if you have ideas or opinions on how to structure these different analyzers, I’d love to hear them!

Integration with DevOps

Starting from AL Language version 16.0, we’ve run into a new issue: In pre-release versions of the AL Language, LinterCop isn’t being loaded anymore.

Microsoft did a great job migrating from .NET Standard 2.0 to .NET 8.0, and with a few small adjustments we quickly had LinterCop running again. All tests were green and everything looked fine. But then we noticed that in CI pipelines on Azure DevOps and GitHub, LinterCop wasn’t being registered at all.

At that point, we were mostly in the dark about what was going on. To our surprise (in the best possible way), Emil Damsbo from the Microsoft Business Central product team reached out with a detailed explanation and a proposed solution. That kind of proactive engagement, reaching out, explaining the underlying cause, and suggesting a fix, really shows the strength of this community and is one of the reasons I enjoy working in the Business Central ecosystem so much.

Unfortunately, we’re a bit stuck between a rock and a hard place here. Adopting the proposed solution would break the VS Code integration — and probably a few other integrations as well. It’s an issue we created ourselves within LinterCop, and it wouldn’t be fair to expect other tools to adapt to it. The best way forward is to remove the tight coupling altogether.
That will not only reduce the number of artifacts we need to maintain but also make it possible to drop the file-renaming workaround entirely. You can read more about it in detail in issue #1149.

The Next Chapter

With all this in mind, it’s clear we need to make some fundamental changes to LinterCop, and they will introduce breaking changes. That’s why I’m considering a LinterCop 2.0 approach: a complete rebuild from the ground up. This will have a big impact, since it affects not only the code analyzer itself but also the integrations around it, such as VS Code and DevOps.

But in the long run, it will remove many of the current limitations and open the door for real improvements, like proper NuGet distribution, multiple specialized analyzers, and a more maintainable overall structure.

This marks the start of that journey and I’ll keep sharing updates as LinterCop 2.0 takes shape.
