
The licensing issue: Christmas present from Microsoft

It’s that time of year again! December, one of the busiest months of the year. Everybody is trying to achieve targets, finish some projects and grab a bonus, all while decorating their homes, shopping for Christmas trees and writing postcards for friends & family.

Microsoft chose exactly this time to secretly slip us an early gift. Almost nobody noticed. 

One of my customers is currently running Dynamics NAV 2018 with a number of C/Side customizations. We’ve already cleaned up a lot of stock objects by switching to events; the next step forward is the move from C/AL to AL, and I managed to convince them that next year, extensions are the way to go!

In order to prepare for this, I started refactoring and componentizing older customizations, so they can become independent extensions which can be switched on and off as required. While doing so I ran into a “little” problem:

What it says is that the license doesn’t allow the object LocationCardExt to be published, because the Page Extension needs a free object in the licensed range. I had the same error on my Table Extensions. So here I was, at a customer who has used their license right up to the last object in the custom range, planning to move custom fields from, for example, the Sales and Purchase tables into multiple extensions.

36-37-38-39-110-111-112-113-114-115-5107-5108: the Sales objects alone are 12 tables, so if I had to build 5 custom extensions, that would result in 60 additional objects. Probably at least half of those tables have modifications on two pages (a card and a list), so I’d need 90 pages on top, and this is ONLY sales: €10K in extra objects would be spent easily.

Well, fuck. Excuse my French, but this is quite inconvenient to say the least. I silently panicked and contacted Arend-Jan (one of the three wise men).

And I got lucky! He pointed me to a document on object ranges in Business Central which also has a passage about Dynamics NAV 2018:

When implemented with Dynamics NAV 2018 or Dynamics 365 Business Central On-Premise, partner hosted or Azure IAAS:

The classic C/AL objects in this range needs to be purchased from the Dynamics Pricelist when implemented on premise, partner hosted or Azure IAAS. They are developed in the traditional way using C/Side.

New from Business Central Fall 2018 Cumulative Update 1 (planned for November) and NAV 2018 CU 12 (planned for December)

The AL extension (PageExtension, TableExtension) objects developed in Visual Studio Code and stored in the 50.000 – 99.999 range which extends objects to which you have modify permissions in your development license are free of charge. (For ex. When you want to extend the Customer table from the base app, you are required to develop an extension object and assign a unique object ID to it.) Regular AL objects (Table, page, codeunit, report,…) needs to be purchased through Dynamics pricelist.

Yes, you read that right: Microsoft said “free of charge”!

From Business Central Fall 2018 CU1 and Dynamics NAV 2018 CU12, it’s possible to use the full 50.000-99.999 range for these Page Extensions and Table Extensions, so it looks as if it will solve this problem. 

In other news: Cumulative Update 12 for Microsoft Dynamics NAV 2018 has been released.  Today.

This blog does come with a little warning: At the time of writing this, I couldn’t find a docker image for NAV 2018 CU12 yet, but as soon as I have the chance I’ll test with a couple of different licenses, and report back here exactly what is possible and what isn’t.

For now: An early Merry Christmas to all of you!

Escaping the transaction scope (and other good reasons for beer)

Yes, beer! It’s Friday evening and we’re celebrating: The Dynamics Tailor, last year’s three-day decision to become an entrepreneur, is still alive and kicking!
Yes, the first anniversary of The Dynamics Tailor has passed, last Wednesday to be exact.

So what happened this year? A lot! We’ve managed to pull a number of businesses onto a recent version of Dynamics NAV, worked on extensions, and I’ve also solved some security puzzles in one of my projects. Currently, I’m working on a re-implementation and a customization project, and serving as an interim application manager.

I haven’t had large failures, and I’ve learned a lot: I’m looking forward to next year!

Escaping the transaction scope

This week, I had another customization request that was not so easy to fix:

“During sales order release, sales shipment and warehouse shipment, we want to perform a number of extra validations. If one of these validations fails, all changes should be rolled back. However, can we catch all validation errors (also the stock ones), and log them into a table, so process owners can either fix the issue or decide whether the custom validation can be ignored (approve the order)?”

It sounds a lot easier than it is: I didn’t want to modify any stock code (the customer is using NAV 2018; we can’t fully switch to extensions yet, but we want to build all customizations ready to be converted in the future).
In this situation, I had no choice but to somehow save records after the transaction has started, but before an error is thrown, because afterwards I would only be able to retrieve the last error text. And that’s easier said than done!
As you probably already know, Business Central starts a transaction after the first write command (INSERT, MODIFY, RENAME, DELETE), and performs a COMMIT either when execution ends, or when you force the program to perform a COMMIT.
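To illustrate, here is a minimal C/AL sketch of that implicit transaction scope (the customer number and error message are made up):

```
// Implicit transaction in C/AL: it begins at the first write command and is
// only committed when execution ends without errors.
Customer.GET('10000');        // no transaction yet: reads don't start one
Customer.Name := 'New Name';
Customer.MODIFY(TRUE);        // first write command: the transaction starts here
// ... more validations and writes ...
ERROR('Validation failed');   // rolls back everything since the MODIFY
```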

Putting a COMMIT in between would solve the problem, but would probably give me 15 new problems: it might cause inconsistent data. I’ve always been very careful with COMMIT, but since we’ve started using events and developing extensions, I’ve basically stopped using it altogether, except in completely isolated code. It’s simply too dangerous when you cannot control exactly what happens before your COMMIT is executed.

Actually, the solution is simply to keep my error-log write out of the transaction scope: the whole posting transaction will then be rolled back when an error occurs, but my log will still be saved. Dynamics NAV and Business Central provide an expensive but functional solution, and this is how to code it:

STARTSESSION(SessionID, CODEUNIT::"MyValidationHandler", COMPANYNAME, ValidationLogRecord);

STARTSESSION runs a codeunit in a separate, non-GUI session, and is therefore outside the scope of your current transaction. I prepared my record before passing it to the new session, and it’s only written in a new session if an error is expected; this keeps the session open as short as possible and starts as few sessions as necessary.
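For completeness, a sketch of what the codeunit on the receiving end might look like (the codeunit and table names are my own invention; the real design will differ):

```
// Codeunit "MyValidationHandler" (TableNo = "Validation Log Entry").
// Its OnRun trigger receives the prepared log record and inserts it inside
// the background session, i.e. in a transaction of its own that survives a
// rollback in the calling session.
OnRun(VAR Rec : Record "Validation Log Entry")
BEGIN
  Rec.INSERT(TRUE);
END;
```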

Also, I timed the duration of this process, calling a codeunit that simply performs a Record.INSERT(TRUE); and then closes the session. It clocked in at 13 ms on a slow development server – not fast, but acceptable for something that only occurs a few times a day.
Good to know: performing a STARTSESSION will not cost you an extra licensed user. The session is started from the same system with the same named user, so it doesn’t count.

Again: It’s expensive, but it solves the problem!

Presenting at a Dutch Dynamics Community event!


Today, I had the honor to present for the first time in a long while, at a Dutch Dynamics Community event in Woerden.

I’m still too hyped to sleep, so here’s a little summary on what happened, and another promise kept.

Presenting to an audience that’s probably far more experienced than I am is quite far out of my comfort zone, so I was seriously nervous! And making matters worse, I had to do the same session twice. To cut a long story short, I don’t think I made an impression as relaxed as this guy…

…but eventually, I did have a good time, and hope I managed to inspire some people to start using the discussed technology.

A short summary of the session (translated from Dutch):

8. How up to date are your development knowledge and skills?
Speaker: Kevin O’Garro (Freelance Dynamics NAV Specialist)
Tags: Technical, NAV

At several projects I’ve been involved in over the past months, I noticed how little is being done with modern techniques that, really, aren’t that new anymore. I’m referring to things like events, delta merges and test automation.

In this session I want to give an update on these “new” techniques and show how easily they can be applied. And of course I’d like to discuss with you how you can pick this up fairly easily yourself: not only in new projects, but also in old ones.

To motivate my choice of subject: I’ve had to force myself a number of times to avoid lapsing into old habits and to stay innovative. As a software consultant or developer, it’s sometimes difficult to force yourself to investigate new technology thoroughly enough to use it in projects. I’m relatively young (in my thirties) and a quick learner; most probably there are a lot of developers who struggle even more, or have stopped trying.
Since there’s a lot to gain from newer technology, my presentation is a small recap of what has been pioneered by people like Waldo, Luc van Vugt, Arend-Jan Kaufmann, Vjekoslav Babic and a lot of other NAV enthusiasts/specialists/MVPs, all in order to tickle the curiosity of my audience.

At the end of my second session, we went a little out of scope and had a look at what’s possible when you strictly follow coding guidelines and use tools already available: we converted NAV 2016 customizations into AL code, which can be compiled as a V2 extension in Visual Studio Code!

Maybe it’s the adrenaline, but I really had an awesome evening.

To my guests: Thanks for not only listening, but also engaging!

Last but not least, you can find a (slightly modified version of the) presentation here, including generated objects & files, extra links and a remark that came up during the second session:

Download the presentation (PDF)

Download the demo files (ZIP)

Downloading the ZIP might cause your antivirus to protest a little. This is caused by the fact that the file contains a PowerShell script and a .cmd-file: Nothing to worry about, it’s part of the session.

PS: Thanks to Dutch Dynamics Community for the opportunity!

Above: Me, demonstrating something about an ancient version of Dynamics NAV. Image by Vincent van Rens for Dutch Dynamics Community.

Dynamics NAV 2017 & Application Areas

Recently, I was working on a Dynamics NAV 2017 implementation at a customer, when I noticed a lot of the customizations were suddenly not visible. It took me a while to figure out what had happened; when I checked the Company Information page I found Application Area Experience set to “Basic”.

Microsoft rightfully removed the button from newer on-premises versions (in 2018 it’s only visible in a SaaS environment), because I’m sure a lot of people have made this mistake (even a few MVPs have blogged about it!).

The culprit was the person responsible for the project; he had been fiddling around in the administration menus and accidentally changed this option, thinking he could always change it back (WRONG!).

Employees at this company do have a sense of humour, and I got a little carried away on blocking the functionality…

If this happens to you, it’s quite easy to fix: Open table 9178 Application Area Setup, remove the record, and all is back to normal.
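If you’d rather script the fix than open the table by hand, the same thing in a throwaway C/AL codeunit would look something like this (a sketch; the variable name is mine):

```
// ApplicationAreaSetup : Record "Application Area Setup" (table 9178).
// Removing all records restores the default experience, making everything
// visible again.
ApplicationAreaSetup.DELETEALL;
```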

Efficiency in E-Commerce & the story of my feet

Due to the time-consuming nature of translating, this item is only available in English.

Before anybody asks: EU 49, UK 14.5. This blog is as much about my personal frustration (of crappy webshops, not being able to find a decent variety of shoes, and more), as it is about recent developments in our E-Commerce world.

The past few years, parcel services have been all over the news in the Netherlands: we had news about the volume of delivery traffic causing traffic jams in our cities, PostNL (our oldest parcel service) has been getting rid of as much payroll as possible and started hiring independent parcel runners, and recently there’s been a lot of criticism on how little these “entrepreneurs” get paid.

Alongside at least monthly news about a rapidly changing parcel world, there has been a steady flow of smaller news items pointing at another huge disadvantage of our growing E-Commerce market: fewer large deliveries to stores means more small deliveries. The result: our urban traffic jams are growing year over year, and our environment is suffering.

You can imagine my surprise when I heard this news in the morning. In short: Kitty Koelemeijer (of Nyenrode Business University) states that cross-border parcel services are too expensive, and that E-Commerce could be stimulated by creating more transparency in the sector in order to apply market pressure to the rates: on a total of 370 billion euro per year, consumers could save up to 12 billion per year in parcel costs.

I bought new shoes this week, and men who share my shoe size will probably know this is not an easy thing to do. Yes, we have a few shops in Rotterdam that go up to 50/51, yet I never feel like going there: prices are fairly steep and there’s a maximum of two brands and four pairs to choose from. A total waste of my time. No, I buy my shoes online, where it’s quick and simple. Two or three pairs at once, and I know exactly where to go. The choice in brands is still disappointing (heck, I even tried personally stalking Guillaume Philibert in order to obtain a pair of his nice Filling Pieces in a size made for real men), but hey, I manage to put something on my feet.

So I started thinking: What’s important to me when I buy these shoes? Would this measly 3 percent discount make me buy quicker or add an extra pair? Don’t think so. Would it have for other fairly recent orders? Not at all. So why do I buy where I buy?

Because of this:

  • Availability. Shoes in my size are hard to find, some other stuff I order is only available through a few specific webshops.
  • Speed. I want my stuff and I prefer to have it yesterday. Now is also fine, try not to make it tomorrow.
  • Accuracy. I want to order what I need. My hate for having to use returns processes exceeds my hate for the sum of my hate for highway gas station toilets, biking to work in the rain, empty phone batteries and unspecified syntax error messages.
  • Ease. I don’t have much time. Don’t make me login: Take my zipcode + house no. and my e-mail, make me pay, and let’s get it over with.
  • Trust. If I order, I want to be sure I get my stuff on time. Oh, and I prefer being sure I actually get my stuff, too.
  • Price. I want to buy for cheap, but only after the requirements above are met.

You’ll probably be saying “these are YOUR preferences, YOU don’t represent the whole European market!” by now.

However, I feel I’m quite close to an average customer. The two companies I’ve had the best experience with in the past year were CoolBlue and Bax Music – both are showing turnover growth that’s near insane, and both are winners in very difficult markets.


Let’s look up some E-Commerce numbers. I recently found an article about the average turnover per order in the EU, and although I cannot find it again now, I did find some stats from the US: the average E-Commerce order value there is around 80 US dollars, and it has hovered around that mark since 2012.

PostNord (the Swedish-Danish postal group) had their marketing department put together a wonderful 44-page analysis of the European E-Commerce market in 2014, which gives us even more valuable information: clothing and footwear were the number 1 product category, keeping books (2) and home electronics (3) at a safe distance. And although clothes and footwear have the large advantage that they barely ever break in the mail, they nearly always have to fit.

One of the bigger players in E-Commerce in the Netherlands is Zalando. Zalando’s managing director, Rubin Ritter, once told Die Welt that their return rate is around the 50% mark. FIFTY percent. Zalando is quite transparent: their annual report is easily found through Google, and it shows a fulfilment cost ratio of 25.9%. How much of their revenue is being spent on returns? 10%?

Let’s not jump to conclusions right now: I’m an ERP specialist, not an E-Commerce consultant, so you won’t hear me shouting that reducing returns is the holy grail of E-Commerce. However, the facts I found show that a lot of resources go to waste facilitating returns. From personal experience, I’ve learned that I keep coming back to webshops that manage to get me exactly what I want, where and when I want it; not to the webshops with the best returns strategy.

My two cents: The technology to get to know your customer already exists. You can have your customer return, instead of your product.

A Dynamics NAV database… on a Linux server?

Let’s go ahead and admit it: I just have a thing for open source software. FreeBSD and Ubuntu Linux really tickle my fancy. Free, lightweight, fully customizable, versatile, yet world-renowned for stability. This “thing” used to be totally useless when your daily work is all about Microsoft. Recently, things have changed… Because Microsoft released SQL Server 2017 for Linux!

Probably, you’re already wondering why I wrote this article, or even why I tried running SQL Server on Linux. Yes, I know, who cares about on premise installations since we have Azure and Docker?

Don’t ask me “why”, but I just had to give it a go: A Dynamics NAV 2018 database, on Microsoft SQL Server, on Linux. Two (or three?) worlds working together in peace. It’s happening! And if this isn’t enough motivation, well, then there’s always this little beauty:

  • Because we can! *evil laugh*

Step 1: Setting up the Linux Server

If your world is made up entirely of Microsoft, setting up a Linux server doesn’t seem the most straightforward thing to do. But actually it is, so here’s a (detailed but quick) crash course for installing Ubuntu Linux Server:

  • Go to this link to download Ubuntu Server. Unless you need cutting-edge functionality, download the LTS version. Like certain Windows Server editions, this release has Long Term Support (five years), so if you keep this Linux server running, the 16.04 version will keep you happy until April 2021.
  • If you’re planning to run Ubuntu on a physical machine, google “unetbootin” (it’s somewhere on SourceForge) to quickly copy the image you just downloaded to a USB stick + make it bootable.
  • If you’re running a hypervisor (Hyper-V or VMWare for example), just mount the Ubuntu image on a new system and run it. Concerning resources, I’d start with 3GB RAM and 2 cores if you’re limited – Ubuntu Server doesn’t have a GUI and is very lightweight out of the box.
  • You’ll find the setup is very straightforward. By the way, I usually skip keyboard detection and just select “US International”.
  • Go easy on yourself: Install “SSH” right from the setup tool. This is your “Remote Desktop Connection” for Linux. Slight difference: Your Linux has no GUI, so this is a tool to access the command prompt. If you already skipped through the screen that gives you this option: No problem, you can do it afterwards from the command prompt.

So, you’re probably looking at an installed Linux server with a login prompt now. After you log in, it will probably tell you it needs some updates. You can install these updates by running the package tool as administrator with the following command (run sudo apt-get update first if the package lists haven’t been refreshed yet):

sudo apt-get upgrade

After entering this, you will have to enter your password again. “sudo” means “superuser do”, which is the equivalent of running something as administrator in Windows. The rest is not important now. Fun fact: unless you’re very unlucky, you won’t have to reboot the machine after the updates have been installed…

If you forgot to install SSH earlier, just run this command to get that settled:

sudo apt-get install -y openssh-server

It’s been a while since I last did this manually, but SSH should work right after installation.

Step 2: Tools to access the machine from your Windows box

To make our Linux-box with SSH useable from outside, we need two tools:

  • PuTTY: Use this to access the command prompt from your Windows machine. You can run ifconfig (with an F instead of a P) from the command prompt to see the IP address of your Linux machine.
  • WinSCP: Convenient tool which looks like a mix between Windows Explorer and good old Norton Commander, but allows you to access files on your Linux machine over the SSH connection. (yes there’s also Bitvise SSH, this is personal preference)

Step 3: Installing Microsoft SQL Server on Linux

This is where the mixing starts. Let’s first tell Ubuntu where to get SQL Server, by running the following commands (either through PuTTY or directly on the prompt):

wget -qO- https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

sudo add-apt-repository "$(wget -qO- https://packages.microsoft.com/config/ubuntu/16.04/mssql-server-2017.list)"

sudo apt-get update

Background info: Ubuntu Linux has package repositories, a bit like the Windows “Store”. What we did here: add Microsoft’s signing key to the system (so it trusts Microsoft’s repository), add the repository to Ubuntu’s list of package sources, and then refresh the information on available packages. Next commands:

sudo apt-get install -y mssql-server

sudo /opt/mssql/bin/mssql-conf setup

These last steps will look different from what you’re used to, but the questions will feel very familiar once they appear. Let’s keep it simple, and also open up port 1433 for all incoming connections in the firewall:

sudo ufw allow 1433/tcp

Congratulations: you’ve just installed and set up SQL Server on your Linux machine!

Step 4: Preparing your database

If all went well, you should now have a running SQL Server, and you should be able to login to the server from your workstation using Microsoft SQL Server Management Studio.

The folder structure is a little different on Linux, but if you copy a SQL backup to your server through WinSCP, you can locate it in SSMS:


Once the database is there, you should be able to connect through the classic “Development Environment”.
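If you prefer scripting over clicking through SSMS, the restore can also be done with plain T-SQL. A sketch, assuming you copied the backup to /var/opt/mssql/backup; the database and logical file names are made up, so check yours first with RESTORE FILELISTONLY:

```sql
-- Assumed paths and logical file names: verify with RESTORE FILELISTONLY first
RESTORE DATABASE [NAV2018]
FROM DISK = N'/var/opt/mssql/backup/NAV2018.bak'
WITH MOVE N'NAV2018_Data' TO N'/var/opt/mssql/data/NAV2018_Data.mdf',
     MOVE N'NAV2018_Log'  TO N'/var/opt/mssql/data/NAV2018_Log.ldf';
```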

Step 5: Connecting the Microsoft Dynamics NAV Service Tier

Then, suddenly, setting up the Dynamics NAV service tier turns out not to be so straightforward.

Yes, you probably already noticed it in steps 3 and 4: SQL Server Authentication is back. Not a good thing in my opinion, but the Linux machine simply isn’t a domain member, and we’ll have to live with that for now.

The difficulty of using SQL Server Authentication with the Dynamics NAV service tier is that you’ll need an encryption key to secure traffic between the service tier and the database server: normally this is handled by the domain. For production environments I’d always use a certificate provided by an official authority, but if you’re simply testing, I’d say use makecert.exe to generate one yourself.
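For reference, generating such a self-signed test certificate could look something like this (makecert.exe ships with the Windows SDK; the subject name is an assumption, and this is strictly for test environments):

```
REM Self-signed, exportable test certificate in the local machine store
makecert -r -pe -n "CN=NAVServiceTier" -ss My -sr localmachine -sky exchange NavTest.cer
```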

After generating the certificate, you’ll need to install this to the service tier. Ashwini Tripathi has written a very helpful blog on how to get this running. Here’s another Dynamics NAV Administration (Power)shell shortcut:

$Credential = (New-Object PSCredential -ArgumentList 'sa',(ConvertTo-SecureString -AsPlainText -Force 'MYPASSWORD'))

Import-NAVEncryptionKey -ApplicationDatabaseServer LINUXSERVERHOSTNAME -ApplicationDatabaseCredentials $Credential -ApplicationDatabaseName DATABASENAME -KeyPath 'KEYPATH.key' -Force

Where all the caps need to be replaced with your own values, of course.

Step 6: Enjoy!

After completing these steps, your Dynamics NAV clients and even the classic development environment should now be able to connect to the database server on Linux:

For now, I’ll be using this database server to do some development on 2018 and possibly also on 2013; if I have any news on this, you’ll read it here.

If you have any issues, questions or remarks, let me know!

Add some colour to your Dynamics NAV links!

When developing a custom add-on for Dynamics NAV, working with a DTAP or DTP setup can ensure stability in your production environment on one end, and the freedom to develop without limitations on the other. However, for key users, consultants and developers, DTAP environments come with a risk: you might find yourself testing in production, or entering live data into a development, test or acceptance database by accident. Although I would never miss the opportunity to make fun of you if you did, I must admit it happens to me too 🙂

Even without accidents happening, with multiple instances of the same program open in the taskbar, all the buttons can get very confusing.

In order to minimize these risks within our company, we defined colours for every step: Red is development, green is test and blue is production (we don’t have an acceptance database at the moment). We used the system indicator (in Company Information) to show these colours in the program, too, but the taskbar problem was still there.

After some fiddling, we now have a fix for this:

Aside from being nice and colourful, this is quite easy to create and deploy too. The steps I took (skip to #3 if you want to choose different icons; there are plenty available for free):

#1 Get the NAV image as an .ico file

I used a small freeware program called NirSoft IconsExtract to get the .ico file from the Dynamics NAV executable. The executable to search is Microsoft.Dynamics.Nav.Client.exe, in the RoleTailoredClient folder under Program Files (x86).

#2 Change the colour of the icon

Another free tool was used to change the colour of the icon: GNU Image Manipulation Program (better known as GIMP), which is a very powerful image editor (open source and free!).

When you open the icon in this tool, you’ll see the layers on the right side of your screen. What I did: select the top layer, then click Colors > Map > Color Exchange. Choose Color From and Color To (you can do this quickly with the droplet tool), play a bit with the threshold until the preview looks about right, and click OK. Then hide the current layer, select the next layer and press Ctrl + F (repeat the previous Color Exchange).

When you’re finished, make all layers visible again, click File > Export As… and choose Microsoft Windows Icon (.ico). You can ignore the warning if you’re running Windows 8.1 or 10 (they read icons with compressed layers without any problems).

#3 Creating desktop links that use your new icons

It’s NAV, and we want to deploy easily… so we use PowerShell for this! First, we need to set some variables and copy the icons to the Dynamics NAV folder:

Useful to know: I placed the icons in the public desktop folder. If you want them in personal folders, use $Home\Desktop instead.

Then, we ask the Windows Scripting Host to create an icon for us (copy this part for every environment you wish to link to):

Of course, we can also create a link on the desktop to the standard NAV environment (this uses the config file, and is essentially equal to the link on the Start Menu):
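The original code screenshots didn’t survive the move to this site, but the shortcut-creation part can be sketched roughly like this (paths, server name and instance name are all assumptions; adjust to your environment):

```powershell
# Assumed icon and client locations - adjust to your installation
$IconDir   = "C:\ProgramData\NavIcons"
$NavClient = "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\110\RoleTailored Client\Microsoft.Dynamics.Nav.Client.exe"

# Create a coloured shortcut on the public desktop via Windows Scripting Host
$WshShell = New-Object -ComObject WScript.Shell
$Shortcut = $WshShell.CreateShortcut("$env:Public\Desktop\NAV Development.lnk")
$Shortcut.TargetPath   = $NavClient
$Shortcut.Arguments    = '"DynamicsNAV://navserver:7046/DEV/"'   # assumed server/instance
$Shortcut.IconLocation = "$IconDir\NAV_Dev_Red.ico"
$Shortcut.Save()
```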

If you wish to change the behavior of your NAV client with command-line arguments, you can add them to the $Shortcut.Arguments property. The way the link is configured now, everything except running the standard NAV environment will give you this security notice:


Dreaming about C/AL

Small note: when moving this article to dynamicstailor.eu, I decided to shorten it by about 50%, by removing a few wishes that Microsoft has granted us in the meantime, or is about to grant. Two of these were a better editor and/or Visual Studio integration, and version control (any!).

I’ve been programming in Dynamics NAV since 2013, but I wrote my first code over 20 years earlier. It was 1993, I was still living with my parents and I hadn’t even met my first girlfriend yet. My father came home with a white box saying “Microsoft Visual Basic 3”. Before I knew it, I was determined to build my own games instead of saving money to buy the newest titles.

A few disks and a big stack of books: coding in Visual Basic was so incredibly simple, a 10 (and a half!) year old could get going within weeks!

A few years later, I switched to VB6, and after that to VB.NET. Both steps were a lot more complex, but once you grasp the concepts, it’s suddenly possible to write much more efficient code.


When I started working at a Microsoft Dynamics NAV partner in 2013, I had my first encounter with C/AL. Twenty years later, yet I felt like I was in a playground again. C/AL is incredibly versatile, incredibly simple, and incredibly fast (in terms of coding).

With the release of Dynamics NAV 2016, things got even better: suddenly there’s TRY…CATCH functionality (although Vjeko already blogged about its dark side; I’ll get back to this later), and then there’s that editor that finally moved past Notepad level! Then we suddenly got VS Code, Extensions, AL… but I’m still missing a few things.

Transaction Management

Somewhere in 2009, I did a project where a lot of Transact-SQL programming was involved, and learned to work with database triggers and transactions. You can imagine my surprise when an NAV developer told me that COMMIT exists in NAV, but using it is asking for a death sentence. The surprise got even bigger when I heard there is no BEGIN TRAN!

By now, I’ve learned how to use COMMIT, but to this date, I still try to stay away frOMMIT.

Concerning transaction management, I don’t have any ideas on how to implement it exactly, but it would be very cool to control database writes the way we can on SQL level.

Data Object datatype

I’ve seen so many developers bump their proverbial nose on this one, and there’s such a simple solution. Take a look at tables 36, 37, 110, 111, 112, 113, 114, 115, 5107, 5108, 5126 and I’m probably forgetting a few: The famous Sales-tables. These have a lot in common, for example:

  • They all use fields Document Type (an Option) and (Document) No. (Code 20) in their primary key.
  • All the lines tables have a Description field (Text 30).

Now, let’s implement a few functions that use this standard (Document) No. field:
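The original code sample didn’t survive the move, but such a function would look roughly like this (the function body is a placeholder of my own):

```
PROCEDURE DoSomething(DocumentType : Option; DocumentNo : Code[20]);
BEGIN
  // Do something useful with the document here.
  // Note that the Code[20] length silently mirrors today's "No." field
  // definition - exactly where the trouble starts if it's ever widened.
END;
```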

All working fine! We can build similar functions all through the application, and use them without any problems.

Now… let’s imagine Microsoft is getting complaints about the size of the Document No. field. They decide to make it a Code 25 instead of a Code 20.

Still nothing wrong: your code will compile, and if you also work at the customer using this code, you won’t hear anyone complain either. Right until the moment someone actually uses those extra characters: you will see the famous “Overflow under type conversion of Text to Text”.

Fixing this is a matter of switching on the debugger, tracing exactly where this field is passed between functions, then editing the length of the variable in the function parameters. Maybe you’ll need to do some data recovery (if someone else threw that dreadful COMMIT in the code above yours…), but usually nothing terrible happens.

My point is that it can be solved, by Microsoft, fairly easily!

There. Fixed. It’s right below the MenuSuite, and it’s a Data Object.

What it does is simple, and could work well in two ways. I’ll explain both in a few steps.

Table-based Data Reference Object

After creating table 36, we create a Data Reference Object for the primary key: Document Type and Document No. This Data Reference Object “remembers” that it’s based on T36, and will continue to mirror vital properties of the fields. As soon as this exists, it’s possible to reference the Data Reference Object for function parameters, as the sweet spot between a Record parameter and a load of hand-built parameters. Doing so gives us at least three advantages:

  • We should be able to call the function without first preparing a record, for example: DoSomething('Sales Order','SO00001');
  • It should be possible to reference option values that come from the referenced object, hence are always the same as in the table;
  • If someone decides to change the design of any of the Sales tables’ fields, the code should not compile until all Data Reference Objects are updated as well;
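A sketch of how such a signature might look. To be clear: this is invented syntax, nothing like “DataRef” exists in C/AL today; it only illustrates the proposal.

```
// Imagined syntax: the parameters reference a Data Reference Object
// instead of repeating Option / Code 20 by hand in every function.
PROCEDURE DoSomething@1(DocumentType@1000 : DataRef "Sales Document Key"."Document Type";
                        DocumentNo@1001 : DataRef "Sales Document Key"."Document No.");
BEGIN
  // If T36's "No." ever becomes a Code 25, these parameters would follow
  // automatically, and mismatching callers would fail at compile time.
END;
```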

This would already be a big advantage. However, there are other ways:

Data (Reference) Object based Tables

One of the first things I learned about databases (in university) was the objectives of normalization; third normal form quickly became our holy grail. Let’s turn this thought process around for a second, and go back as far toward the flat table as possible. Let’s also ignore the record size limit. Dynamics NAV could probably do with no more than 50 tables. Don’t worry, my next suggestion won’t be to switch to NoSQL and forget about relational databases altogether ;-)

The next step is where it gets interesting: Let’s separate our flat tables from our table structure: We now have ourselves a data model.

Now let’s take the Sales data model (which contains everything from Customers through Sales Headers, Sales Lines, Sales Prices etcetera), and build a new Sales Header table by simply selecting which fields from our data model should be there. We can now build the Sales Line table by selecting a few different fields. It gets even more interesting if we look at the Sales Price table: that should be in the Item data model as well. Maybe it’s a good idea to reference from one data model to another as well: table relations will be easy (if not automatically generated), and we’ll have infinite integrity!

Okay, cool, but do we need this?

Well, no. We don’t need it. Then again, we didn’t need a Web Client, but people use it every day… And if you’d think nobody ever changes field properties, think again. Or take a good look at that Description field I mentioned earlier. It’s not a Text 30 anymore, it’s a Text 50…

NAV Three-Tier Performance and Load Balancing

Disclaimer: This article is not meant to be a best-practice guide! It’s my first publication on LinkedIn, in which I’ve tried to summarize and share some of the information I found scattered through the interwebs. Also, I hope to gain tips & tricks from you. I still don’t have a solid answer on questions found below (although, admittedly, I haven’t directly consulted Microsoft). More advice is always welcome!

In 2008, Microsoft shocked the Navision world by introducing Dynamics NAV 2009. It gave us a new three-tier architecture that has brought a lot of advantages. I started using NAV 2009R2 Classic in 2012 and was a bit skeptical about all that “newness”, but after working with it for a year (mainly 2013, 2013R2 and 2015), I can wholeheartedly say it’s a huge improvement.

Even though it’s a really mature environment, I’m having difficulties finding knowledge on the topic, especially concerning the RTC client and its service tiers. For example, after consulting a few Microsoft partners, consultants and developers, and asking them whether the middle tier load-balances by itself or whether I should set up a certain number of middle tiers, I didn’t find the clear, well-documented answers I am used to in the NAV community.

What I’d like to tune, and why

The few large Dynamics NAV three-tier environments I’ve seen had a few things in common:

  • Defining “large”: 100+ concurrent users in one database, with all or most users working in one company;
  • Database server and service tier(s) on dedicated servers;
  • A fairly well-performing database server (fast direct SQL queries);
  • Low CPU usage and low memory usage on the service tiers; two to three service tiers for over 100 users;
  • Mediocre to fair performance in the client.

Of course, people can live and work with fair performance, but I’d prefer the user to be the limiting factor! Therefore I started looking into how to performance-tweak two of these environments.

What screws do I turn to make it go faster?

Even though I asked quite a few people, I found only a few things to look at:

  • SQL Server: As in older versions of Dynamics NAV, the database can make or break performance in an environment. That said, if you were happy with performance before migrating from the Classic client to the RTC, it wouldn’t be the first place I’d look for performance gains: in a three-tier environment the middle tier acts as a buffer between clients and the database, so the load on the database is lower than when clients connect to the DB directly.
  • Page Design: If your NAV environment is customized, it is worth the hassle to invest time in your page design. I won’t go into detail here, but lots of fields in list pages, FactBoxes and pages with subpages accessing multiple tables will slow clients down. Of course, users can switch FactBoxes off by themselves to speed the system up… but they can also switch them back on to slow it down, if they need the information!
  • Network Protocol: This shouldn’t affect performance on fast LAN networks, but when connecting through a slow network or VPN, fixing the Network Protocol to Sockets should improve performance. Note, however, that this is the protocol used by the service tier to communicate with the database. I didn’t notice any improvement in my test environment.
  • Metadata Provider Cache Size: In Classic environments, one would set the Object Cache (KB) as high as possible to improve client performance. This is a similar setting, but now NAV objects are cached on the service tier. The default setting is 150 (objects), and I really don’t know why it is this low – a dedicated middle-tier server should have at least 16GB of RAM, so why not use it? Because I didn’t configure different service tiers for different departments (so basically everybody uses all objects), I decided to match the cache size to the number of objects in the database: 5000. Then I started the service tier and opened all pages, reports, XMLPorts etc. I could think of. This resulted in only about 50MB of extra memory used on the server, and it positively impacted performance by a truckload. I’d advise anyone to try upping this parameter, while of course carefully watching memory load on the server.
  • Data Cache Size: Sets the amount of data cached on the service tier (in-memory). Can be convenient to lower the load on the database server. The setting is an exponent of two: the standard setting is 9, which equals 512 MB; 8 = 256 MB, 10 = 1024 MB, 11 = 2048 MB, etcetera.
  • Compression Threshold: This might be an interesting setting. It determines how much data the service tier will send to the client in one go without compression. By default, this is set to 64 kilobytes. I didn’t manage to find any information on how this data is structured and how the service tier behaves around this setting: will it prefer sending a compressed dataset that is larger than the screen? Or does it limit the dataset sent to 64 KB even when there are more records within the filter? It might be worthwhile experimenting with this setting if either your network bandwidth is limited (decrease the threshold) or your processor power on the middle-tier server is limited (increase it to make sure the server doesn’t spend valuable time compressing data). Although the environment I’m testing in is not really limited either way, I am testing whether settings of 24 and 512 are noticeable on different service tiers within the same environment.
  • Multiple service tiers: Although I still don’t know where the sweet spot is in the number of users per service tier, it’s understandable that there ís a sweet spot. I’ve had advice ranging from 25 users all the way up to 100 users per service tier, and have guesstimated the sweet spot to be about 30 users. This way, all these users will manage to cache all objects in a short time, so NAV will “wake up” quickly in the morning. Multiple services then make up our “manual multi-threading”.
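For reference, the settings above live in the service tier’s CustomSettings.config. A sketch of the relevant fragment with the values discussed; verify the exact key spellings against your own installation before editing:

```xml
<!-- Fragment of CustomSettings.config on the middle tier (NAV 2013+). -->
<appSettings>
  <add key="MetadataProviderCacheSize" value="5000" /> <!-- default: 150 objects -->
  <add key="DataCacheSize" value="10" />               <!-- exponent: 2^10 = 1024 MB -->
  <add key="CompressionThreshold" value="64" />        <!-- kilobytes -->
</appSettings>
```

Remember to restart the service tier after changing these values.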

So, how do I balance this load?

Let’s keep it simple: In most environments, setting up different links during client install will divide users over multiple service tiers. Problem solved?
Yes, it does work. However, in case of maintenance or stability problems, it will be difficult to redirect these users to one of the working service tiers. It isn’t flexible!

After searching for days (okay, hours) and not finding anything useful, I decided to give up and see if I could think of a solution myself. What I built is a simple command-line executable launcher that randomly chooses one of the pre-defined service tiers and then runs NAV with those parameters. Although it’s quick and dirty, it does provide me with basic load balancing, and gives me the flexibility to quickly switch off one of the service tiers, or even one of the servers. If I know I’ll have to restart a server for maintenance, I’ll modify the settings XML and then ask users to quit and restart NAV whenever they prefer, for example anywhere within the next hour. If you’d like to try or use this tool, get it for free here.
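The launcher itself is a Windows executable, but the core of the idea fits in a few lines of shell. A sketch only: the server and instance names, and the client path in the comment, are all made up for illustration.

```shell
#!/usr/bin/env bash
# Pick a random service tier from a pre-defined list, then launch the
# RTC client against it. Names below are invented for this example.
TIERS=("NAVSRV01/DynamicsNAV" "NAVSRV02/DynamicsNAV" "NAVSRV03/DynamicsNAV")
PICK=${TIERS[RANDOM % ${#TIERS[@]}]}
echo "Connecting to DynamicsNAV://$PICK/"
# On a real client machine this would be something like:
# "C:\Program Files (x86)\Microsoft Dynamics NAV\RoleTailored Client\Microsoft.Dynamics.Nav.Client.exe" "DynamicsNAV://$PICK/"
```

Reading the tier list from a settings file instead of hard-coding it is what makes it possible to take a tier out of rotation without touching every client.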

Future ideas for this little tool would be to build in true load balancing (checking the database for active sessions per service tier), (de-)activating service tiers from a GUI and – if I figure out a way to do it – switching active users from one service tier to another. Although it’s fun to build, I’d like to know first: am I the only one running into this problem? Does anyone have a better solution?

Well, that’s all folks! I hope there’s more info in the NAV community about tuning the three-tier environment – if you find mistakes in my essay, or have any good tips please let me know.
