Some of you might already know, but I’m quite crazy about bikes. Over the past months, I started giving customers a discount if they are within a defined range and can grant me access to a shower after I arrive in the morning. I prefer biking over my company car!
At NAVTechDays
Yesterday, I biked from Rotterdam to NAVTechDays! It took me approximately four hours to get to Antwerp – a bit longer than expected, since I had to change a front tyre (of course, a flat, especially on this trip!), and because I was cycling on unknown territory.
This morning, Luc van Dyck (organizer of NAVTechDays) surprised me with a wonderful donation of €3000 to Trees for All, and a chance to show my vehicle on stage during the KeyNote!
I’m sure that, alongside his donation, this exposure will boost contributions, so if you’re looking for a place to donate, please visit Facebook or Geef.nl!
If I have movie footage of the velomobile moment, I’ll add it here 🙂
In the meanwhile, don’t forget to check out BE O Bottle for the most practical bottle while cycling in a velomobile, and of course Velomobiel.nl for more information on my Velomobiel Quest!
Before the trip
Recently, I was challenged to bike to NAVTechDays 2019, and long story short: I will. We all want a better environment, don’t we?
From my starting point in Rotterdam, the total trip will be about 210 km (to and from NAVTechDays). My bike is a human-powered vehicle (no speed-pedelec), so you’ll probably understand I need lots of motivation and – even more important – fresh air!
Trees generate air, so I’m raising money for Trees for All! Your contribution will make an impact, whether you donate a lot or a little. Anything helps. Do you want to join me in supporting a good cause?
More information about why, how, what and when will be added soon – we’re in the process of recording some vlogs. In the meanwhile: Thanks in advance for your support!
Every year, ten billion trees are lost, with disastrous consequences for people, animals and the climate. That’s why we invest in trees. How? We plant trees, protect and restore existing forest, and raise awareness about the need for trees. Close to home, but also further away through sustainable forestry projects in developing countries. You can help us by planting trees or compensating CO2.
Click here to donate through Facebook. Please drop me a line if you don’t have Facebook!
Microsoft’s most modern server OS has some technical advantages over its predecessor, and together with Docker it makes for a very versatile platform to develop large apps, and of course add-ons for Business Central On-Premise. If you’re looking to quickly get a BC-on-Docker environment up and running, read on!
Step 1: What to get?
A computer with sufficient resources. Read this blog if you need more information on hardware.
You might want a hypervisor (VMWare ESXi or Microsoft Hyper-V Server), but it is not necessary. The installation of a hypervisor is skipped in this tutorial, but the hypervisor is referenced here and there.
Windows Server 2019 Standard (or Datacenter, but not Essentials).
Step 2: Let’s get some software!
Of course, some of you can open MSDN and download Windows Server right there, but not everyone has a subscription, and Microsoft’s server licenses aren’t cheap.
However, if you go to Microsoft’s EvalCenter, you can download an ISO for Windows Server 2019 which can be used for 180 days: More than enough to take a good look at Docker Enterprise.
Once you’re confident you want to keep going you don’t even have to reinstall; you can convert your evaluation to a licensed and activated Windows using the dism tool and your license key.
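The conversion itself is a one-liner with dism. A sketch, run from an elevated prompt – the product key placeholder must be replaced with your own key, and the edition name must match the license you bought:

```powershell
# Convert the evaluation install to a licensed, activated edition.
# Use ServerDatacenter instead of ServerStandard if that's your license.
dism /online /Set-Edition:ServerStandard /ProductKey:XXXXX-XXXXX-XXXXX-XXXXX-XXXXX /AcceptEula

# A reboot is required to complete the conversion.
```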
Make sure you get the Standard or Datacenter Edition.
There’s also a version called Windows Server 2019 Essentials, which I think is similar to the Small Business Server of the past. This version is missing a feature that’s essential for Docker to work: The Containers feature.
Step 3: Install Windows
Duh 🙂 Not much to do here, just make sure all updates are installed.
Once you run the Windows installer, it will ask you if you want a GUI or not; keep in mind the GUI will use some memory. Personally, I still think it can be convenient, although PowerShell is just as powerful.
Also good to know: you can find ready-to-go Windows Server 2019 environments on Azure. It might save you some time!
Step 4: Setting up Docker
In order to run Docker, we need the aforementioned Containers feature on our Windows installation. There’s a separate command to install it, but Docker guru Tobias Fenster tipped me that installing Docker already does this for you (thanks!).
So let’s install Docker! You can do this through the GUI, or you can run the following PowerShell commands (remember to run PowerShell as Administrator!):
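A sketch of what these commands typically look like – the DockerMsftProvider module is what tells Windows where to find the Docker repository:

```powershell
# Register the Docker repository with Windows (run PowerShell as Administrator)
Install-Module -Name DockerMsftProvider -Repository PSGallery -Force

# Install the Docker engine from that repository
Install-Package -Name docker -ProviderName DockerMsftProvider -Force

# A reboot is required before the Docker service can start
Restart-Computer
```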
These commands also let Windows know where to find the Docker repository, and install the product. After this, reboot the server.
Step 5: Preparing Networking
After your server is up again, it might be a good idea to think about what networking you’d like to use. If you use a laptop to access your server, that laptop is probably connected through WiFi and receiving an IP-address through DHCP.
Is your server on the same physical network (not on Azure, but physically in your office)? Then you might want to consider letting your Docker containers also grab an IP-address through DHCP. This will make them available immediately once they’re up and running.
By default, Docker sets up containers so they are only reachable internally (as seen from the machine on which Docker runs). Inconvenient – we don’t want to do our development work on the Windows Server, but on our daily workstations.
Creating the network
Docker contains various networking drivers for different purposes. For our dev-machine, the “transparent” driver is most important. We can create a transparent network with name “MyNetwork” from PowerShell:
docker network create -d transparent MyNetwork
All is prepared now… unless you couldn’t wait, skipped a few steps, and already created a container, of course 😉 You can still connect such containers to any new network. Since I lack patience, let’s restart the container too:
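Assuming your container is named MyContainer (substitute your own name), connecting it and restarting it could look like this:

```powershell
# Attach the existing container to the newly created transparent network
docker network connect MyNetwork MyContainer

# "Restart" manually, in two commands
docker stop MyContainer
docker start MyContainer
```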
Yes, for some reason I do this manually, in two commands. There is also a “restart”-command, but somehow I feel less in control when I use this, because I didn’t see the stopped container.
Anyway, let’s check if everything is up again:
docker ps -a
You should now have a DHCP address on this container. But if you don’t…
Setting a virtual network to Promiscuous Mode on VMware allows this network to see all traffic passing through the switch. On a machine that’s merely used for testing/development purposes, it will make getting DHCP to your VMs a lot easier.
This should be considered a security hazard: Don’t do this unless only you and your trustees have administrator permissions on your VMs.
If you use Microsoft Hyper-V Server, it offers a similar feature called MAC Address Spoofing. There’s some more information on enabling it here.
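On Hyper-V, enabling it from PowerShell could look like this – the VM name is an example, use your own:

```powershell
# Allow the VM's virtual NIC to send traffic with other MAC addresses,
# so DHCP replies destined for container MACs reach the VM
Get-VMNetworkAdapter -VMName "DockerHost" | Set-VMNetworkAdapter -MacAddressSpoofing On
```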
Step 6: Install NAVContainerHelper
Not necessary, but it’s so practical – trust me, you’ll want NAVContainerHelper. Installing can be done from PowerShell, and is very easy:
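For reference, a minimal sketch of the install:

```powershell
# Install NavContainerHelper from the PowerShell Gallery (run as Administrator)
Install-Module -Name navcontainerhelper -Force
```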
Yes, that’s all. Just wait, and it will be installed. You can find more information on this tool here.
Step 7: Creating a container with NAVContainerHelper
Since we’re far past TL;DR point, let’s just start with the one and only (okay, almost) command you need. It sets up your container:
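Reconstructed from the parameter walkthrough further down, the command looks roughly like this (container name, image and network match the examples used in this post):

```powershell
# Create a new Business Central container attached to our transparent network
New-NavContainer -accept_eula `
    -containerName "freddy" `
    -auth NavUserPassword `
    -imageName "mcr.microsoft.com/businesscentral/onprem" `
    -updateHosts `
    -additionalParameters @("--network=MyNetwork")
```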
The system will now ask you for a username and password: these are the credentials you’ll use to log in to your NAV environment later. The password you set here is also the sa password of the SQL Server. When it finishes, remember to keep the information it shows in the output (the location of the VSIX file, etc.).
Right, while your PowerShell window is currently working hard, showing you all kinds of GUIDs, pulling FS layers, downloading files and verifying checksums, it might be a good time to analyze what we asked the tool to do here:
New-NavContainer is the command to start a new NAV container.
-accept_eula saves you from typing “y” once. Not very useful until you start scripting your container builds (I might blog on this later, in which case the link will pop up here)
-containerName “freddy” defines the Docker-name of the container. If your network uses DNS, your container should be reachable through freddy.domain/NAV/ once it’s up and running
-auth NavUserPassword tells the system to set up the container with NavUserPassword authentication. Since I also work with engineers who don’t have a domain account, I prefer this setting, but I think Windows Authentication is also possible.
-imageName “mcr.microsoft.com/businesscentral/onprem” tells Docker exactly what image to get from which repository. There used to be Docker images on the microsoft/ repository also, but as far as I know everything should be moved to mcr.microsoft.com by now. For more information on what’s available, check out this blog by Waldo.
-updateHosts is an option that updates the hosts-file on your system, to allow for “manual DNS”. Necessary if you want to use:
-additionalParameters @(“–network=MyNetwork”) A very useful option of NavContainerHelper: The possibility to pass through standard docker parameters. In this case, it links this container to the network we created in Step 5.
Step 8: Useful Commands
There’s so much more to write, more than enough to fill an extra blog on how to manage a Docker machine, how to automate certain tasks, and how to fix issues… However, to help get you started, I want to limit myself to two more small subjects:
Since I started playing with Docker, I have been constantly saving my PowerShell sessions (commands and output). One that comes back a lot is “docker logs”:
docker logs MyContainer
This command will show you all logs that have been produced since the start of your container. It can be very helpful when you have an issue to fix.
If you’re a developer, you’ll want your own license in the container. All you need to add is your containerName and the location of the license file (on the Docker machine).
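A sketch of the NavContainerHelper command for this – container name and license path are examples:

```powershell
# Import your developer license into the container
# (the license file path is a location on the Docker host)
Import-NavContainerLicense -containerName "freddy" -licenseFile "C:\temp\dev.flf"
```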
Yes, it’s a Visual Studio Code error, and it has been bugging me for days. No, that’s too mild. It has been driving me nuts! Of course, the message is clear, and you’d think the solution is simple. The only problem is: The application object has not already been declared. There’s only one.
A quick explanation of what’s happening:
I start a new project (usually through AL: Go!), fix the launch.json and download symbols.
Develop away… my smallest project has two codeunits, two page extensions and one table extension.
Everything is fine (No “Problems”) right up until I press F5 (Start Debugging) for the first time.
After testing whatever it is I want to test, I continue developing and suddenly the messages start popping up. In this case, I have 5 of them:
-- An application object of type 'TableExtension' with ID '60500' is already declared AL(AL0264) [1,16]
-- An application object of type 'TableExtension' with name 'TabExtObject' is already declared AL(AL0197) [1,22]
-- The extension object 'TabExtObject' cannot be declared. Another extension for target 'Item' or the target itself is already declared in this module. AL [1,22]
-- A field with name 'TabExtObjectField1' is already defined AL(AL0205) [6,21]
-- A field with name 'TabExtObjectField2' is already defined AL(AL0205) [17,21]
After seeing these errors, just accept them, modify whatever you need to modify, and press F5 (Start Debugging) again.
You’d expect the compiler to also choke on problems like the above, but it doesn’t. It compiles and runs flawlessly.
On this project, I can more or less handle this issue. It’s just 5 errors and if something shows up that I do want to see, I’ll see it. On one of our other projects I was developing on, I have 1146 problems. If this flips to 1147, or 1148, I’ll have no clue where to look. Annoying, to put it mildly.
So, after visiting a Dynamics event last evening (well… a few hours ago), I had some new motivation to figure out the cause of this, and started trying to reproduce the issue on a brand-new laptop with the “Hello, World!” app. This clean machine led me to find it, because I suddenly found myself developing from the local Documents folder.
It’s dead simple: UNC paths!
To prevent data loss (or worse), we have a policy to not store anything on our laptops but keep everything on “the network” – including the source code. As a result, our projects all live in \\server.domain\Dynamics Tailor BV\projects\projectname\
Visual Studio Code seems to be okay with UNC paths, but it clearly isn’t. I solved the issue on my own machine by simply mapping the network share to a drive letter, and it’s working like a charm.
Yes, a blog about NAV/BC on Docker! “Docker” seems to be battling “VS Code” for the blockchain-status of the NAV/Business Central world. Everybody seems to be talking about it. I’ve been using it for about a year now, and rest assured, I’ve struggled. Multiple times. Often enough to share what I’ve learned.
The mere fact that you’re reading this probably means you either want to know if Docker is for you, or you want to know where to begin. Please don’t expect a full length step-by-step manual of everything there is to know about Docker; A lot has already been written by people who know much more about Docker than I do. I do intend to get you to the right information and add what’s missing in my opinion.
So let’s start with some important blog links:
Freddy Kristiansen – this guy is the heart of NAV on Docker at Microsoft. Has written so much about the subject that it’s sometimes difficult to find the right blog 🙂
Tobias Fenster – CTO at Axians Infoma, but seems to have fallen in love with Docker. In my opinion he knows just about everything there is to know about Docker; if you have a chance to visit one of his sessions, it’s worth it.
David Markus’ revelations about Docker – Cloud Architect at X-Talent, but also a Docker nerd. This link is not really a blog, but a presentation about Docker that’ll give you a clear overview of what Docker is and how it works, and will do so fast thanks to some neat navigation.
Do you need Docker?
Docker makes it possible to set up an isolated container with Business Central or NAV ready to go within minutes. The limiting factor is literally your internet connection, and you can run as many different versions alongside each other as you want.
If you regularly work on multiple NAV/Business Central on Premise databases, maybe even with different versions, and outside of a live environment, then yes, you do.
I run Docker for:
Development and automated testing of Business Central extensions, both for AppSource and on Premise environments;
Development of customizations for customers (I have a Cronus on the right version, extensions and localization for every customer);
Testing of functionality in different versions of NAV/BC
Docker can be used for much more than this, for example running various applications on live servers; I’m only focusing on NAV/BC consulting here.
Which version do I need?
Most “getting started with Docker” blogs will refer you to www.docker.com and tell you to download and install Docker. If you follow this advice, you end up here:
So which version do you need?!
If you already know what you’re installing on, it’s actually quite simple:
If you run Windows 10, choose Docker Desktop. You can click the link to download and install Docker on your system.
If you intend to run Docker off a Windows Server, you want Docker Enterprise. This cannot be installed from here, an install manual will appear here soon.
Do you have a choice?
Then the version you should get depends on what you want to do. Oversimplified, it’s like this:
Are you planning to run Docker for you personally? For example on a laptop? Then choose Windows 10 and get Docker Desktop. It’s free, and can do most of the stuff the enterprise version can too.
Do you want to share your development environment with other developers? Are you looking for a cloud-like experience when connecting to your NAV/BC? Do you have powerful hardware (6+ cores, 24GB+ RAM)? Then Docker Enterprise might be for you. In my experience, it’s more stable and scales better. The biggest disadvantage would be the higher cost.
What advantages does Docker have over running VMs?
Let’s imagine you’re running a laptop with a modern six-core processor and 16GB of RAM. For Dynamics NAV 2018 or Business Central on premise development, you will need a stable and fast Windows 10 environment with Visual Studio Code, a browser, some Office programs, etcetera. Okay, I sometimes also use finsql (with UI) and SQL Server Management Studio – some people call me old-fashioned 😉 For all this software you’ll need at least 8GB of RAM to run comfortably.
Then we’ll create one VM running the most recent Business Central on premise; you’ll need to install Windows 10 (or download a 12GB Windows image), you’ll probably want to run some updates, maybe install SQL Server, install Business Central (maybe including the demo database and SQL Server Express), set up the service tier, configure the firewall to allow traffic into your machine – and I’m probably forgetting a lot here.
How long will all this take? An hour at least. You’ll probably also want at least one dedicated core assigned to this machine, and at least 4GB of RAM, to get something vaguely resembling “performance” out of this NAV setup.
Now let’s add customers on 4 different versions to the mix; you’ll either:
Need to create multiple VMs (in which case I’m hoping you made a copy of that freshly installed and updated Windows 10 VM earlier, before setting up NAV…), or;
Have to fiddle with your running installs to install a second version alongside the first one, at the risk of ending up with nothing working at all.
Let’s compare this to a Docker install: admittedly, both Docker Desktop and Docker Enterprise took me a fair share of time to get going, but once you do have it running, setting up a new development environment will take you one command and less than five minutes of waiting. Which is awesome!
Another advantage of using Docker is that it’s a lot less resource-hungry than a VM. It is not only possible to run a fully functional developer’s Windows environment with 4 simultaneously running containers, it will also deliver more than reasonable performance. If you’re a raw-numbers person: David Markus ran a benchmark for his revelations-on-Docker presentation. The results:
Doesn’t it have any disadvantages?
Well, yes it does. Sometimes, Docker is just a bit of a motherDocker. Google is not always your friend here either; Googling Docker issues will also give you a lot of solutions that apply only to Linux containers and/or Docker running on Linux, and solutions for problems that work only on *insert older, newer or different version of Windows*.
You will probably learn a good bit about networking, but it’s well worth it. And concerning Google, I hope this blog series will be part of the solution.
Yet another blog that was inspired by a phone call: yesterday, a “new” Dynamics NAV (AL) developer who had to venture into good old C/Side called me about an error message when starting the program.
“An unknown language was selected (8192).” came up right after clicking the icon, after which the program would close.
Since it took me a while to remember what it meant, I decided to put it in a quick blog, for once and for all.
Let’s keep it short: This message doesn’t have much to do with your Dynamics NAV setup. It’s very easy to cause, and also very easy to fix, and has to do with Windows regional settings (being a mix of English and Dutch, but this might also happen with other languages).
The solution is in Control Panel > Region:
Change this setting to something more usual, and you’re done:
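If you prefer PowerShell over Control Panel, an equivalent fix could look like this – the locale is an example, pick whichever single locale you want your formats to follow:

```powershell
# Set consistent regional formats for the current user,
# instead of a mix of two languages
Set-Culture -CultureInfo en-US
```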
After this, your development environment will work again.
After having a fairly smooth Friday, I was in my car driving to pick up my son when a migration consultant (let’s just call him Mike) called about a mutual customer. Mike told me something I don’t want to hear on Friday around 17:30:
We’ve just run another conversion to prepare for testing next week, but somehow we’re missing records in the Item table. About 300, to be exact. It looks as if the service tier isn’t refreshing.
After discussing the problem for a while, and not figuring out what was going on, I promised to take a look after the usual family business. I just did.
At this customer we periodically migrate data from the old live environment to the acceptance environment (using SQL): some tables are wiped every run, but for performance reasons, others are migrated through incremental updates.
So I logged in, restarted all services and then had a quick look in the client. With a filter on our item, it returned an empty list – no other filters on the table, so the item clearly didn’t exist. So I tried to create it… and NAV told me that it already existed!
Right. Next I checked the usual things that go wrong in SQL conversions: lowercase characters in code fields, dates or times with the wrong value, etcetera (even though the client’s behaviour didn’t match up). As expected, nothing was visually “off”.
Not wanting to waste time, I decided to run a trace, and quickly found out what was going on:
A few weeks ago, I built an extension containing a TableExtension object that extends the Item table, so now we have two Item tables:
The main table: CRONUZ_EU$Item
The companion table(s): CRONUZ_EU$Item$[random GUID]
The NAV/BC engine always generates its SQL statements with the companion table joined in, so there will never be a record in one of the companion tables that’s not in the main table, nor will the opposite situation exist.
Mike’s original procedures update the CRONUZ_EU$Item table, but not the companion tables. However, the NAV/BC engine generates a SQL statement with an INNER JOIN, not an OUTER JOIN. The result: SQL will only return records if a record with the given primary key is found in both tables.
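To illustrate – the statement names below are simplified, the real companion table name ends in a GUID – the generated query is conceptually similar to:

```sql
-- Records missing from the companion table silently disappear
-- from the result set, because of the INNER JOIN
SELECT i.*, ext.*
FROM [CRONUZ_EU$Item] i
JOIN [CRONUZ_EU$Item$ext] ext
  ON ext.[No_] = i.[No_]
```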
After adding all the missing records, the problem was of course solved immediately. However, uninstalling the app with the option -DoNotSaveData might also have helped.
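That alternative could be sketched like this – server instance and app name are examples, run from the NAV/BC administration shell:

```powershell
# Uninstall the extension without keeping its data,
# so the companion tables are dropped along with it
Uninstall-NAVApp -ServerInstance BC140 -Name "My Extension" -DoNotSaveData
```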
It’s that time of year again! December, one of the busiest months of the year. Everybody is trying to achieve targets, finish some projects and grab a bonus, all while decorating their homes, shopping for Christmas trees and writing postcards for friends & family.
Microsoft chose exactly this time to secretly slip us an early gift. Almost nobody noticed.
One of my customers is currently running Dynamics NAV 2018 with a number of C/Side customizations. We’ve already cleaned a lot of stock objects by switching to events; the next step forward is the switch from C/AL to AL, and I managed to convince them that next year, Extensions are the way to go!
In order to prepare for this, I started refactoring and componentizing older customizations, so they can become independent extensions which can be switched on and off as required. While doing so I ran into a “little” problem:
What it says is that the license doesn’t allow object LocationCardExt to be published, because the Page Extension needs a free object in the licensed range. I got the same error on my Table Extensions. So here I was, at a customer that had used their license right up to the latest object in the custom range, planning to move custom fields from – for example – the Sales and Purchase tables to multiple extensions.
36-37-38-39-110-111-112-113-114-115-5107-5108: the Sales objects alone would be 12 tables, so if I had to build 5 custom extensions, this would result in 60 additional objects. At least half of those tables have modifications on two pages (a card and a list), so I’d also need 90 pages – and this is ONLY sales. €10K in extra objects would be easily spent.
Well, fuck. Excuse my French, but this is quite inconvenient to say the least. I silently panicked and contacted Arend-Jan (one of the three wise men).
When implemented with Dynamics NAV 2018 or Dynamics 365 Business Central On-Premise, partner hosted or Azure IAAS:
The classic C/AL objects in this range need to be purchased from the Dynamics Pricelist when implemented on premise, partner hosted or Azure IAAS. They are developed in the traditional way using C/Side.
New from Business Central Fall 2018 Cumulative Update 1 (planned for November) and NAV 2018 CU 12 (planned for December)
The AL extension objects (PageExtension, TableExtension) developed in Visual Studio Code and stored in the 50.000 – 99.999 range, which extend objects to which you have modify permissions in your development license, are free of charge. (For example: when you want to extend the Customer table from the base app, you are required to develop an extension object and assign a unique object ID to it.) Regular AL objects (Table, Page, Codeunit, Report, …) need to be purchased through the Dynamics pricelist.
Yes, you read that right: Microsoft said “free of charge”!
From Business Central Fall 2018 CU1 and Dynamics NAV 2018 CU12, it’s possible to use the full 50.000-99.999 range for these Page Extensions and Table Extensions, so it looks as if it will solve this problem.
This blog does come with a little warning: At the time of writing this, I couldn’t find a docker image for NAV 2018 CU12 yet, but as soon as I have the chance I’ll test with a couple of different licenses, and report back here exactly what is possible and what isn’t.
Yes, beer! It’s Friday evening and we’re celebrating. The Dynamics Tailor – last year’s three-day decision to become an entrepreneur – is still alive and kicking!
Yes, the first anniversary of The Dynamics Tailor has passed, last Wednesday to be exact.
So what happened this year? A lot! We’ve managed to pull a number of businesses into a recent version of Dynamics NAV, worked on extensions, and I’ve also had some security puzzles in one of my projects. Currently, I’m working on a re-implementation, a customization project and working as interim application manager.
I haven’t had large failures, and I’ve learned a lot: I’m looking forward to next year!
Escaping the transaction scope
This week, I had another customization request that was not so easy to fix:
“During sales order release, sales shipment and warehouse shipment, we want to perform a number of extra validations. If one of these validations fails, all changes should be rolled back. However, can we catch all validation errors (also the stock ones) and log them into a table, so process owners can either fix the issue or decide whether the custom validation can be ignored (approve the order)?”
It sounds a lot easier than it is: I didn’t want to modify any stock code (the customer is using NAV 2018; we can’t fully switch to extensions yet, but we want to build all customizations ready to be converted in the future).
In this situation, I had no choice but to somehow save records after the transaction is started, but before an error is thrown – otherwise I would only be able to capture the last error text. And that’s easier said than done!
As you probably already know, Business Central starts a transaction after the first write command (INSERT, MODIFY, RENAME, DELETE), and performs a COMMIT either when execution ends, or when you force the program to perform a COMMIT.
Putting a COMMIT in between would solve the problem, but will probably give me 15 new problems: It might cause inconsistent data. I’ve always been very careful with COMMIT, but since we’ve started using events and developing extensions, I’ve basically stopped using them altogether, unless in completely isolated code. It’s simply too dangerous when you cannot control exactly what’s happening before your COMMIT is being executed.
Actually, the solution is to simply keep my error-log transaction out of the transaction scope; the whole posting transaction will then be rolled back when an error occurs, but my log will still be saved. Dynamics NAV and Business Central provide an expensive, but functional solution, and this is how to code it:
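A minimal sketch of the pattern in C/AL – object and procedure names are mine, invented for illustration:

```
// Caller: write the log record outside the current transaction scope
PROCEDURE LogValidationError(VAR ErrorLogEntry : Record "Error Log Entry");
VAR
  SessionId : Integer;
BEGIN
  // Runs codeunit "Write Error Log" in a separate non-GUI session;
  // its INSERT commits independently of our transaction,
  // so it survives the rollback of the posting transaction
  STARTSESSION(SessionId, CODEUNIT::"Write Error Log", COMPANYNAME, ErrorLogEntry);
END;

// Codeunit "Write Error Log", OnRun(VAR Rec : Record "Error Log Entry")
BEGIN
  Rec.INSERT(TRUE);
END;
```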
STARTSESSION runs a codeunit in a separate non-GUI session, which is therefore out of the scope of your current transaction. I prepare my record before passing it to the new session, and it’s only written in a new session if an error is expected; this keeps the session open as briefly as possible, and starts as few sessions as necessary.
Also, I timed the duration of this process, calling a codeunit that simply performs a Record.INSERT(TRUE); and then closes the session. It clocked in at 13 ms on a slow development server – not fast, but acceptable for something that only occurs a few times a day.
Good to know is that performing a STARTSESSION will not cause you to use an extra licensed user – the session is started from the same system and with the same named user, so it doesn’t count.