Category Archives: Technical

Docker Series: Docker on Windows Server 2019

Microsoft's most modern server OS has some technical advantages over its predecessor, and together with Docker it makes for a very versatile platform to develop large apps, and of course add-ons for Business Central On-Premise. If you're looking to quickly get a BC on Docker environment up and running, read on!

Step 1: What to get?

  • A computer with sufficient resources. Read this blog if you need more information on hardware.
  • You might want a hypervisor (VMware ESXi or Microsoft Hyper-V Server), but it is not necessary. Installing a hypervisor is outside the scope of this tutorial, but hypervisors are referenced here and there.
  • Windows Server 2019 Standard (or Datacenter, but not Essentials).

Step 2: Let’s get some software!

Of course, some of you can open MSDN and download Windows Server right there, but not everyone has a subscription, and Microsoft’s server licenses aren’t cheap.

However, if you go to Microsoft's EvalCenter, you can download an ISO for Windows Server 2019 that can be used for 180 days: More than enough to take a good look at Docker Enterprise.

Once you're confident you want to keep going, you don't even have to reinstall; you can convert your evaluation to a licensed and activated Windows using the dism tool and your license key.
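
As a rough sketch of that conversion (assuming the Standard edition; the product key below is a placeholder for your own key), run the following from an elevated prompt:

# Check which edition the machine is currently running
DISM /Online /Get-CurrentEdition

# Convert the evaluation to a licensed edition (replace the placeholder key with your own)
DISM /Online /Set-Edition:ServerStandard /ProductKey:XXXXX-XXXXX-XXXXX-XXXXX-XXXXX /AcceptEula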

Make sure you get the Standard or Datacenter Edition.

There’s also a version called Windows Server 2019 Essentials, which I think is similar to the Small Business Server of the past. This version is missing a feature that’s essential for Docker to work: The Containers feature.

Adding the Containers feature to Windows Server 2019: This feature is available in the Standard and Datacenter editions, but not in the Essentials edition.


Step 3: Install Windows

Duh 🙂 Not much to do here, just make sure all updates are installed.

Once you run the Windows installer, it will ask you if you want a GUI or not; keep in mind the GUI will use some memory. Personally, I still think it can be convenient, although PowerShell is just as powerful.

Also good to know: You can find ready-to-go Windows Server 2019 environments on Azure. Might save you some time!

Step 4: Setting up Docker

In order to run Docker, we need the aforementioned Containers feature on our Windows installation. There's a separate command to install it, but Docker guru Tobias Fenster tipped me that installing Docker already does this for you (thanks!).

So let’s install Docker! You can do this through the GUI, or you can run the following PowerShell commands (remember to run PowerShell as Administrator!):

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force

Install-Package -Name docker -ProviderName DockerMsftProvider

These commands also let Windows know where to find the Docker repository, and install the product. After this, reboot the server.
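
You can do that from the same PowerShell session, and once the machine is back up, verify the installation (standard cmdlets; the Windows service is simply called docker):

# Reboot now
Restart-Computer -Force

# After the reboot: check the service and the engine
Get-Service docker
docker version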

Step 5: Preparing Networking

After your server is up again, it might be a good idea to think about what kind of networking you'd like to use. If you use a laptop to access your server, that laptop is probably connected through WiFi and receives an IP address through DHCP.

Is your server on the same physical network (not on Azure, but physically in your office)? Then you might want to consider letting your Docker containers grab an IP address through DHCP as well. This will make them available immediately once they're up and running.

By default, Docker sets up containers so that they're only available internally (as seen from the machine on which Docker runs). Inconvenient – we don't want to do our development work on the Windows Server, but on our daily workstations.

Creating the network

Docker offers various networking drivers for different purposes. For our dev machine, the "transparent" driver is the most important one. We can create a transparent network named "MyNetwork" from PowerShell:

docker network create -d transparent MyNetwork 
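
To verify that the network exists (and to inspect which adapter and subnet it ended up with), the standard Docker CLI commands do the job:

docker network ls

docker network inspect MyNetwork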

All is prepared now… unless you couldn't wait, skipped a few steps, and already created a container, of course 😉 You can still connect such containers to any new network. Since I lack patience, let's restart the container too:

docker network connect MyNetwork MyContainer

docker stop MyContainer

docker start MyContainer

Yes, for some reason I do this manually, in two commands. There is also a "restart" command, but somehow I feel less in control when I use it, because I never see the container in its stopped state.

Anyway, let’s check if everything is up again:

docker ps -a
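
To see which IP address the container actually picked up, you can simply ask the container itself (assuming the container name from the commands above):

docker exec MyContainer ipconfig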

You should now have a DHCP address on this container. But if you don’t…

Promiscuous Mode

Setting a virtual switch to Promiscuous Mode on VMware allows the VMs on that switch to see all traffic passing through it. On a machine that's merely used for testing/development purposes, it will make your life a lot easier getting DHCP to your VMs.

This should be considered a security hazard: Don't do this unless only you and people you trust have administrator permissions on your VMs.

If you use Microsoft Hyper-V Server, it has a similar feature: It's called MAC Address Spoofing. There's some more information on enabling it here.
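
On Hyper-V, enabling it takes one line of PowerShell on the host (a sketch; "MyDockerHost" is a placeholder for the name of your VM):

Set-VMNetworkAdapter -VMName "MyDockerHost" -MacAddressSpoofing On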

Step 6: Install NavContainerHelper

Not necessary, but it's so practical – trust me, you'll want NavContainerHelper. Installation is done from PowerShell, and is very easy:

Install-Module NavContainerHelper

Yes, that’s all. Just wait, and it will be installed. You can find more information on this tool here.
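
The module is updated frequently, so if you installed it a while ago, updating is just as simple (standard PowerShellGet cmdlet):

Update-Module NavContainerHelper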

Step 7: Creating a container with NAVContainerHelper

Since we're far past the TL;DR point, let's just start with the one and only (okay, almost) command you need. It sets up your container:

New-NavContainer -accept_eula -containerName "freddy" -auth NavUserPassword -imageName "mcr.microsoft.com/businesscentral/onprem" -updateHosts -additionalParameters @("--network=MyNetwork")

The system will now ask you for a username and password: These are the credentials you'll use to log in to your NAV environment later. The sa password of the SQL Server is also set to the password you enter here. And once the command finishes, remember to keep the information it shows you in the output (the location of the VSIX file, etc.).

Right, while your PowerShell window is currently working hard, showing you all kinds of GUIDs, pulling FS layers, downloading files and verifying checksums, it might be a good time to analyze what we asked the tool to do here:

New-NavContainer is the command to start a new NAV container.

-accept_eula saves you from typing "y" once. Not very useful, until you start scripting your container builds (I might blog on this later, in which case the link will pop up here)

-containerName “freddy” defines the Docker-name of the container. If your network uses DNS, your container should be reachable through freddy.domain/NAV/ once it’s up and running

-auth NavUserPassword tells the system to setup the container with NavUserPassword authentication. Since I also work with engineers that don’t have a domain account, I prefer this setting, but I think Windows Authentication is also possible.

-imageName "mcr.microsoft.com/businesscentral/onprem" tells Docker exactly which image to get from which repository. There used to be Docker images in the microsoft/ repository as well, but as far as I know everything should have moved to mcr.microsoft.com by now. For more information on what's available, check out this blog by Waldo.

-updateHosts is an option that updates the hosts-file on your system, to allow for “manual DNS”. Necessary if you want to use:

-additionalParameters @("--network=MyNetwork") is a very useful option of NavContainerHelper: the possibility to pass standard Docker parameters straight through. In this case, it connects this container to the network we created in Step 5.
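
Once the container is up, NavContainerHelper also offers some small helpers to inspect the result; for example, to check which IP address your container received (assuming the container name from above):

Get-NavContainerIpAddress -containerName "freddy"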

Step 8: Useful Commands

There’s so much more to write, more than enough to fill an extra blog on how to manage a Docker machine, how to automate certain tasks, and how to fix issues… However, to help get you started, I want to limit myself to two more small subjects:

docker logs

Since I started playing with Docker, I have been constantly saving my PowerShell sessions (commands and output). One command that comes back a lot is "docker logs":

docker logs MyContainer

This command shows you all logs produced since the start of your container. It can be very helpful when you have an issue to fix.
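
On a container that has been running for a while, the full log gets long. The standard Docker flags help; for example, to show only the last 100 lines, with timestamps:

docker logs --tail 100 --timestamps MyContainer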

Import-NavContainerLicense

If you're a developer, you'll want your own license in the container. All you need to provide is your container name and the location of the license file (on the Docker machine).
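
A minimal sketch, assuming the container from Step 7 and a hypothetical path to the license file on the Docker host:

Import-NavContainerLicense -containerName "freddy" -licenseFile "C:\temp\mylicense.flf"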

Step 9: Good luck!

If you have any questions, or if you ran into something I forgot to mention here, feel free to ask/let me know!


An application object of type x is already declared

Yes, it’s a Visual Studio Code error, and it has been bugging me for days. No, that’s too mild. It has been driving me nuts! Of course, the message is clear, and you’d think the solution is simple. The only problem is: The application object has not already been declared. There’s only one.

The Problem

A quick explanation of what’s happening:

  1. I start a new project (usually through AL: Go!), fix the launch.json and download symbols.
  2. Develop away… my smallest project has two codeunits, two page extensions and one table extension.
  3. Everything is fine (No “Problems”) right up until I press F5 (Start Debugging) for the first time.
  4. After testing whatever it is I want to test, I continue developing and suddenly the messages start popping up. In this case, I have 5 of them:
Tab-Ext60500.TabExtObject.al
-- An application object of type 'TableExtension' with ID '60500' is already declared AL(AL0264) [1,16]
-- An application object of type 'TableExtension' with name 'TabExtObject' is already declared AL(AL0197) [1,22]
-- The extension object 'TabExtObject' cannot be declared. Another extension for target 'Item' or the target itself is already declared in this module. AL(AL0334) [1,22]
-- A field with name 'TabExtObjectField1' is already defined AL(AL0205) [6,21]
-- A field with name 'TabExtObjectField2' is already defined AL(AL0205) [17,21]
  5. After seeing these errors, just accept, modify whatever you need to modify, press F5 (Start Debugging).
  6. You'd expect the compiler to also crash on problems like the above, but it doesn't. It compiles and runs flawlessly.

On this project, I can more or less live with this issue. It's just 5 errors, and if something shows up that I do want to see, I'll see it. But on another project I was developing on, I had 1146 problems. If that flips to 1147, or 1148, I'll have no clue where to look. Annoying, to put it mildly.

The Solution

So, after visiting a Dynamics event last evening (well… a few hours ago), I had some fresh motivation to figure out the cause of this, and started trying to reproduce the issue on a brand-new laptop with the "Hello, World!" app. That clean machine led me to the answer, because I suddenly found myself developing from the local Documents folder.

It’s dead simple: UNC paths!

To prevent data loss (or worse), we have a policy to not store anything on our laptops, but to keep everything on "the network" – including source code. As a result, our projects all live in \\server.domain\Dynamics Tailor BV\projects\projectname\

Visual Studio Code seems to be okay with this, but it clearly isn't. I solved the issue on my own machine by simply mapping the network share to a drive letter, and it's working like a charm.
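
For reference, mapping the share to a drive letter can be done straight from PowerShell (a sketch; the drive letter is an example, the path is ours):

# Map the UNC path to P: as a persistent Windows mapped drive
New-PSDrive -Name "P" -PSProvider FileSystem -Root "\\server.domain\Dynamics Tailor BV\projects" -Persist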

Well… just one thing left to say:


Docker Series: What hardware?

If you’re planning to run a “heavy” application like Docker to develop for Business Central (or NAV), it’s useful to know what hardware to have. Here’s an overview of what you need.


Lost In Extensions: SQL migration & Modern Development

After having a fairly smooth Friday, I was in my car driving to pick up my son when a migration consultant (let’s just call him Mike) called about a mutual customer. Mike told me something I don’t want to hear on Friday around 17:30:

We've just run another conversion to prepare for testing next week, but somehow we're missing records in the Item table. About 300, to be exact. It looks as if the service tier isn't refreshing.

After discussing the problem for a while, and not figuring out what was going on, I promised to take a look after the usual family business. I just did.

At this customer we periodically migrate data from the old live environment to the acceptance environment (using SQL): some tables are wiped every run, but for performance reasons, others are migrated through incremental updates.

So I logged in, restarted all services and then had a quick look in the client. With a filter on our item, it returned an empty list – no other filters on the table, so the item clearly didn’t exist. So I tried to create it… and NAV told me that it already existed!

Right. Next, I checked the usual things that go wrong in SQL conversions: lowercase characters in code fields, dates or times with the wrong value, etcetera (even though the behaviour of the client didn't match up). As expected, there was nothing visually "off".

Not wanting to waste time, I decided to run a trace, and quickly found out what was going on:

A few weeks ago, I built an extension containing a TableExtension object that extends the Item table, so now we have two Item tables:

  • The main table: CRONUZ_EU$Item
  • The companion table(s): CRONUZ_EU$Item$[random GUID]

The NAV/BC engine always generates its SQL statements with the companion table joined in, so there will never be a record in one of the companion tables that's not in the main table, nor will the opposite situation exist.

Mike's original procedures update the CRONUZ_EU$Item table, but not the companion tables. However, the NAV/BC engine generates a SQL statement with an INNER JOIN, not an OUTER JOIN. The result: SQL will only return records whose primary key is found in both tables.
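
If you ever need to find the affected records, a diagnostic query along these lines can help. This is a sketch only: the server name, database name, and the GUID in the companion table name are placeholders you need to replace with your own values (Invoke-Sqlcmd ships with the SqlServer PowerShell module):

# List Item records that have no matching row in the companion table
Invoke-Sqlcmd -ServerInstance "MySqlServer" -Database "MyNavDb" -Query @'
SELECT m.[No_]
FROM [CRONUZ_EU$Item] m
LEFT JOIN [CRONUZ_EU$Item$<app GUID>] c ON c.[No_] = m.[No_]
WHERE c.[No_] IS NULL
'@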

After adding all the missing records, of course, the problem was solved immediately. Uninstalling the app with the -DoNotSaveData option might also have helped.


The licensing issue: Christmas present from Microsoft

It’s that time of year again! December, one of the busiest months of the year. Everybody is trying to achieve targets, finish some projects and grab a bonus, all while decorating their homes, shopping for Christmas trees and writing postcards for friends & family.

Microsoft chose exactly this time to secretly slip us an early gift. Almost nobody noticed. 

One of my customers is currently running Dynamics NAV 2018 with a number of C/Side customizations. We've already cleaned up a lot of stock objects by switching to events; the next step into the future is the switch from C/AL to AL, and I managed to convince them that next year, extensions are the way to go!

In order to prepare for this, I started refactoring and componentizing older customizations, so they can become independent extensions which can be switched on and off as required. While doing so I ran into a “little” problem:

What the error says is that the license doesn't allow object LocationCardExt to be published, because the Page Extension needs a free object in the licensed range. I got the same error on my Table Extensions. So there I was, at a customer that has used its license right up to the latest object in the custom range, planning to move custom fields from – for example – the Sales and Purchase tables into multiple extensions.

36-37-38-39-110-111-112-113-114-115-5107-5108: The Sales objects alone are 12 tables, so if I had to build 5 custom extensions, this would result in 60 additional table objects. Probably at least half of those tables have modifications on two pages (a card and a list), so I'd also need 90 pages – and this is ONLY sales. €10K in extra objects would be spent easily.

Well, fuck. Excuse my French, but this is quite inconvenient to say the least. I silently panicked and contacted Arend-Jan (one of the three wise men).

And I got lucky! He pointed me to a document on object ranges in Business Central which also has a passage about Dynamics NAV 2018:

When implemented with Dynamics NAV 2018 or Dynamics 365 Business Central On-Premise, partner hosted or Azure IAAS:

The classic C/AL objects in this range needs to be purchased from the Dynamics Pricelist when implemented on premise, partner hosted or Azure IAAS. They are developed in the traditional way using C/Side.

New from Business Central Fall 2018 Cumulative Update 1 (planned for November) and NAV 2018 CU 12 (planned for December)

The AL extension (PageExtension, TableExtension) objects developed in Visual Studio Code and stored in the 50.000 – 99.999 range which extends objects to which you have modify permissions in your development license are free of charge. (For ex. When you want to extend the Customer table from the base app, you are required to develop an extension object and assign a unique object ID to it.) Regular AL objects (Table, page, codeunit, report, …) needs to be purchased through Dynamics pricelist.

Yes, you read that right: Microsoft said “free of charge”!

From Business Central Fall 2018 CU1 and Dynamics NAV 2018 CU12, it's possible to use the full 50.000-99.999 range for these Page Extensions and Table Extensions, so it looks as if that will solve this problem.

In other news: Cumulative Update 12 for Microsoft Dynamics NAV 2018 has been released.  Today.

This blog does come with a little warning: At the time of writing, I couldn't find a Docker image for NAV 2018 CU12 yet, but as soon as I have the chance I'll test with a couple of different licenses, and report back here exactly what is possible and what isn't.

For now: An early Merry Christmas to all of you!


Escaping the transaction scope (and other good reasons for beer)

Yes, beer! It's Friday evening and we're celebrating. The Dynamics Tailor, last year's three-day decision to become an entrepreneur, is still alive and kicking!
Yes, the first anniversary of The Dynamics Tailor has passed, last Wednesday to be exact.

So what happened this year? A lot! We've managed to pull a number of businesses onto a recent version of Dynamics NAV, worked on extensions, and I've also had some security puzzles in one of my projects. Currently, I'm working on a re-implementation, on a customization project, and as interim application manager.

I haven’t had large failures, and I’ve learned a lot: I’m looking forward to next year!

Escaping the transaction scope

This week, I had another customization request that was not so easy to implement:

“During sales order release, sales shipment and warehouse shipment, we want to perform a number of extra validations. If one of these validations fails, all changes should be rolled back. However, can we catch all validation errors (also the stock ones) and log them into a table, so process owners can either fix the issue or decide whether the custom validation can be ignored (approve the order)?”

It sounds a lot easier than it is: I didn't want to modify any stock code (the customer is using NAV 2018; we can't fully switch to extensions yet, but we want to build all customizations so they're ready to be converted in the future).
In this situation, I had no choice but to somehow save records after the transaction has started, but before an error is thrown – otherwise I'd only be able to retrieve the last error text afterwards. And that's easier said than done!
As you probably already know, Business Central starts a transaction on the first write command (INSERT, MODIFY, RENAME, DELETE), and performs a COMMIT either when execution ends, or when you explicitly force one.

Putting a COMMIT in between would solve the problem, but would probably give me 15 new ones: It might cause inconsistent data. I've always been very careful with COMMIT, but since we've started using events and developing extensions, I've basically stopped using it altogether, except in completely isolated code. It's simply too dangerous when you cannot control exactly what happens before your COMMIT is executed.

Actually, the solution is to simply keep my error log transaction out of the transaction scope: the whole posting transaction will then be rolled back when an error occurs, but my log will still be saved. Dynamics NAV and Business Central provide an expensive, but functional solution, and this is how to code it:

STARTSESSION(SessionID, CODEUNIT::"MyValidationHandler", COMPANYNAME, ValidationLogRecord);

STARTSESSION runs a codeunit in a separate non-GUI session, which therefore falls outside the scope of your current transaction. I prepare my record before passing it to the new session, and it's only written in a new session if an error is expected; this keeps the session open as short as possible, and starts as few sessions as necessary.

Also, I timed the duration of this process, calling a codeunit that simply performs a Record.INSERT(TRUE); and then closes the session. It clocked in at 13 ms on a slow development server – not fast, but acceptable for something that only occurs a few times a day.
Good to know: performing a STARTSESSION will not cost you an extra licensed user – the session is started from the same system and with the same named user, so it doesn't count.

Again: It’s expensive, but it solves the problem!