Category Archives: Technical

Docker Series: Docker on Windows Server 2019

Microsoft's most modern server OS has some technical advantages over its predecessor, and together with Docker it makes for a very versatile platform for developing large apps and, of course, add-ons for Business Central On-Premise. If you're looking to quickly get a BC on Docker environment up and running, read on!

Step 1: What to get?

  • A computer with sufficient resources. Read this blog if you need more information on hardware.
  • You might want a hypervisor (VMware ESXi or Microsoft Hyper-V Server), but it is not necessary. The installation of a hypervisor is skipped in this tutorial, but the hypervisor is referenced here and there.
  • Windows Server 2019 Standard (or Datacenter, but not Essentials).

Step 2: Let’s get some software!

Of course, some of you can open MSDN and download Windows Server right there, but not everyone has a subscription, and Microsoft’s server licenses aren’t cheap.

However, if you go to Microsoft’s EvalCenter, you can download an ISO for Windows Server 2019 which can be used for 180 days: More than enough to take a good look at Docker Enterprise.

Once you’re confident you want to keep going you don’t even have to reinstall; you can convert your evaluation to a licensed and activated Windows using the dism tool and your license key.
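As a rough sketch of what that conversion looks like (the edition name and product key below are placeholders; check DISM's output for the editions your install actually accepts), run from an elevated prompt:

DISM /Online /Get-TargetEditions

DISM /Online /Set-Edition:ServerStandard /ProductKey:XXXXX-XXXXX-XXXXX-XXXXX-XXXXX /AcceptEula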

Make sure you get the Standard or Datacenter Edition.

There’s also a version called Windows Server 2019 Essentials, which I think is similar to the Small Business Server of the past. This version is missing a feature that’s essential for Docker to work: The Containers feature.

Adding the Containers feature to Windows Server 2019: This feature is available in the Standard and Datacenter editions, but not in the Essentials edition.


Step 3: Install Windows

Duh 🙂 Not much to do here, just make sure all updates are installed.

Once you run the Windows installer, it will ask you if you want a GUI or not; keep in mind the GUI will use some memory. Personally, I still think it can be convenient, although PowerShell is just as powerful.

Also good to know: You can find ready to go Windows Server 2019 environments on Azure. Might save you some time!

Step 4: Setting up Docker

In order to run Docker, we need the aforementioned Containers feature on our Windows installation. There's a separate command to install this, but Docker-guru Tobias Fenster tipped me that installing Docker already does this for you (thanks!).
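For reference, in case you ever do want to add the feature by hand, this should be the one-liner (followed by a reboot):

Install-WindowsFeature -Name Containers

Restart-Computer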

So let’s install Docker! You can do this through the GUI, or you can run the following PowerShell commands (remember to run PowerShell as Administrator!):

Install-Module -Name DockerMsftProvider -Repository PSGallery -Force

Install-Package -Name docker -ProviderName DockerMsftProvider -Force

The first command lets Windows know where to find the Docker repository; the second installs the product itself. After this, reboot the server.
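After the reboot, a quick sanity check doesn't hurt; these standard commands confirm the Docker service is running and the engine responds:

Get-Service docker

docker version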

Step 5: Preparing Networking

After your server is up again, it might be a good idea to think about what networking you’d like to use. If you use a laptop to access your server, that laptop is probably connected through WiFi and receiving an IP-address through DHCP.

Is your server on the same physical network (not on Azure, but physically in your office)? Then you might want to consider letting your Docker containers also grab an IP-address through DHCP. This will make them available immediately once they’re up and running.

By default, Docker sets up containers so that they are only reachable from the machine Docker runs on. Inconvenient – we don't want to do our development work on the Windows Server itself, but on our daily workstations.

Creating the network

Docker contains various networking drivers for different purposes. For our dev-machine, the “transparent” driver is most important. We can create a transparent network with name “MyNetwork” from PowerShell:

docker network create -d transparent MyNetwork 
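If you want to double-check the result, listing the networks and inspecting the new one will show you the driver and settings:

docker network ls

docker network inspect MyNetwork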

All is prepared now… unless you couldn’t wait, skipped a few steps, and already created a container, of course 😉 You can still connect these containers to any new network. Since I have a lack of patience, let’s restart the container too:

docker network connect MyNetwork MyContainer

docker stop MyContainer

docker start MyContainer

Yes, for some reason I do this manually, in two commands. There is also a "restart" command, but somehow I feel less in control when I use it, because I never see the stopped container.

Anyway, let’s check if everything is up again:

docker ps -a

You should now have a DHCP address on this container. But if you don’t…

Promiscuous Mode

Setting a virtual network to Promiscuous Mode on VMware allows the VMs on that network to see all traffic passing through the switch. On a machine that's merely being used for testing/development purposes, it will make getting DHCP addresses to your containers a lot easier.

This should be considered a security hazard: Don't do this unless only you and people you trust have administrator permissions on your VMs.

If you use Microsoft Hyper-V Server, this contains a similar feature: It’s called MAC Address Spoofing. There’s some more information on enabling it here.
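If you prefer PowerShell over the Hyper-V Manager GUI, this sketch should do it – run it on the Hyper-V host and replace "DockerHost" with the name of your own VM:

Get-VMNetworkAdapter -VMName "DockerHost" | Set-VMNetworkAdapter -MacAddressSpoofing On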

Step 6: Install NAVContainerHelper

Not necessary, but it’s so practical – trust me, you’ll want NAVContainerHelper. Installing can be done from PowerShell, and is very easy:

Install-Module NavContainerHelper

Yes, that’s all. Just wait, and it will be installed. You can find more information on this tool here.

Step 7: Creating a container with NAVContainerHelper

Since we're far past the TL;DR point, let's just start with the one and only (okay, almost) command you need. It sets up your container:

New-NavContainer -accept_eula -containerName "freddy" -auth NavUserPassword -imageName "mcr.microsoft.com/businesscentral/onprem" -updateHosts -additionalParameters @("--network=MyNetwork")

The system will now ask you for a username and password: These are the credentials you'll use to log in to your NAV environment later, and the password you set here is also used as the sa password of the SQL Server. When the command finishes, remember to keep the information it shows you in the output (the location of the VSIX file, and so on).

Right, while your PowerShell window is currently working hard, showing you all kinds of GUIDs, pulling FS layers, downloading files and verifying checksums, it might be a good time to analyze what we asked the tool to do here:

New-NavContainer is the command to start a new NAV container.

-accept_eula saves you from typing "y" once. Not very useful, until you start scripting your container builds (I might blog on this later, in which case the link will pop up here)

-containerName "freddy" defines the Docker name of the container. If your network uses DNS, your container should be reachable through freddy.domain/NAV/ once it's up and running

-auth NavUserPassword tells the system to set up the container with NavUserPassword authentication. Since I also work with engineers that don't have a domain account, I prefer this setting, but I think Windows Authentication is also possible.

-imageName "mcr.microsoft.com/businesscentral/onprem" tells Docker exactly what image to get from which repository. There used to be Docker images on the microsoft/ repository as well, but as far as I know everything should have moved to mcr.microsoft.com by now. For more information on what's available, check out this blog by Waldo.

-updateHosts is an option that updates the hosts file on your system, to allow for "manual DNS". Necessary if you want to use:

-additionalParameters @("--network=MyNetwork") A very useful option of NavContainerHelper: The possibility to pass through standard Docker parameters. In this case, it links this container to the network we created in Step 5.
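Once the container is running, you can verify it actually picked up an address on MyNetwork. The docker inspect template is standard Docker; NavContainerHelper also ships a helper for this (Get-NavContainerIpAddress, if I remember the name correctly):

docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}} {{end}}' freddy

Get-NavContainerIpAddress -containerName freddy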

Step 8: Useful Commands

There’s so much more to write, more than enough to fill an extra blog on how to manage a Docker machine, how to automate certain tasks, and how to fix issues… However, to help get you started, I want to limit myself to two more small subjects:

docker logs

Since I started playing with Docker, I have been constantly saving my PowerShell sessions (commands and output). One command that comes back a lot is "docker logs":

docker logs MyContainer

This command will show you all logs that have been produced from the start of your container. Can be very helpful if you have an issue to fix.
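Two variations I use a lot – both standard docker options – are limiting the output to the last lines, and following the log live while the container starts up:

docker logs --tail 100 MyContainer

docker logs -f MyContainer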

Import-NavContainerLicense

If you’re a developer, you’ll want your own license in the container. All you need to add is your containerName and the location of the license file (on the Docker machine).
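A minimal example, assuming you've already copied the license file to C:\temp on the Docker host (check Get-Help Import-NavContainerLicense if the parameter names differ in your version):

Import-NavContainerLicense -containerName "freddy" -licenseFile "C:\temp\mylicense.flf"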

Step 9: Good luck!

If you have any questions, or if you ran into something I forgot to mention here, feel free to ask/let me know!


An application object of type x is already declared

Yes, it’s a Visual Studio Code error, and it has been bugging me for days. No, that’s too mild. It has been driving me nuts! Of course, the message is clear, and you’d think the solution is simple. The only problem is: The application object has not already been declared. There’s only one.

The Problem

A quick explanation of what’s happening:

  1. I start a new project (usually through AL: Go!), fix the launch.json and download symbols.
  2. Develop away… my smallest project has two codeunits, two page extensions and one table extension.
  3. Everything is fine (No “Problems”) right up until I press F5 (Start Debugging) for the first time.
  4. After testing whatever it is I want to test, I continue developing and suddenly the messages start popping up. In this case, I have 5 of them:
Tab-Ext60500.TabExtObject.al
-- An application object of type 'TableExtension' with ID '60500' is already declared AL(AL0264) [1,16]
-- An application object of type 'TableExtension' with name 'TabExtObject' is already declared AL(AL0197) [1,22]
-- The extension object 'TabExtObject' cannot be declared. Another extension for target 'Item' or the target itself is already declared in this module. AL[0334] [1,22]
-- A field with name 'TabExtObjectField1' is already defined AL(AL0205) [6,21]
-- A field with name 'TabExtObjectField2' is already defined AL(AL0205) [17,21]
  5. After seeing these errors, just accept, modify whatever you need to modify, press F5 (Start Debugging).
  6. You’d expect the compiler to also crash on problems like the above, but it doesn’t. It compiles and runs flawlessly.

On this project, I can more or less handle this issue. It’s just 5 errors, and if something shows up that I do want to see, I’ll see it. On one of the other projects I was developing, I have 1146 problems. If this flips to 1147, or 1148, I’ll have no clue where to look. Annoying, to put it mildly.

The Solution

So, after visiting a Dynamics event last evening (well… a few hours ago), I had some new motivation to figure out the cause of this, and started trying to reproduce the issue on a brand new laptop, with the “Hello, World!” app. That clean machine is what led me to the answer, because on it I was suddenly developing from the local Documents folder.

It’s dead simple: UNC paths!

To prevent data loss (or worse) we have a policy to not store anything on our laptops but keep everything on “the network”. Also the source code. As a result, our projects are all in \\server.domain\Dynamics Tailor BV\projects\projectname\

Visual Studio Code seems to be okay with this, but it clearly isn’t. I solved the issue on my own machine by simply mapping the network share to a drive letter, and it’s working like a charm.
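For completeness, a sketch of that mapping in PowerShell – the drive letter and share path are just examples, use your own:

New-PSDrive -Name "P" -PSProvider FileSystem -Root "\\server.domain\Dynamics Tailor BV\projects" -Persist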

Well… just one thing left to say:


Docker Series: What hardware?

If you’re planning to run a “heavy” application like Docker to develop for Business Central (or NAV), it’s useful to know what hardware to have. Here’s an overview of what you need.


Docker Series: Getting started

Yes, a blog about NAV/BC on Docker! “Docker” seems to be battling “VS Code” for the blockchain-status of the NAV/Business Central world. Everybody seems to be talking about it. I’ve been using it for about a year now, and rest assured, I’ve struggled. Multiple times. Often enough to share what I’ve learned.

The mere fact that you’re reading this probably means you either want to know if Docker is for you, or you want to know where to begin. Please don’t expect a full-length step-by-step manual of everything there is to know about Docker; a lot has already been written by people who know much more about Docker than I do. I do intend to get you to the right information and add what’s missing in my opinion.

So let’s start with some important blog links:

Freddy Kristiansen – this guy is the heart of NAV on Docker at Microsoft. He has written so much about the subject that it’s sometimes difficult to find the right blog 🙂

Tobias Fenster – CTO at Axians Infoma, but seems to have fallen in love with Docker. In my opinion he knows just about everything there is to know about Docker; if you have a chance to visit one of his sessions, it’s worth it.

David Markus’ revelations about Docker – Cloud Architect at X-Talent, but also a Docker nerd. This link is not really a blog, but a presentation about Docker that’ll give you a clear overview of what Docker is and how it works, and will do so fast thanks to some neat navigation.

Do you need Docker?

Docker makes it possible to set up an isolated container with Business Central or NAV ready to go within minutes. The limiting factor is literally your internet connection, and you can run as many different versions alongside each other as you want.

If you regularly work on multiple NAV/Business Central on Premise databases, maybe even with different versions, and outside of a live environment, then yes, you do.

I run Docker for:

  • Development and automated testing of Business Central extensions, both for AppSource and on Premise environments;
  • Development of customizations for customers (I have a Cronus on the right version, extensions and localization for every customer);
  • Testing of functionality in different versions of NAV/BC

Docker can be used for much more than this, for example running various applications on live servers; I’m only focusing on NAV/BC consulting here.

Which version do I need?

Most “getting started with Docker” blogs will refer you to www.docker.com and tell you to download and install Docker. If you follow this advice, you end up here:

So which version do you need?!

If you already know what you’re installing on, it’s actually quite simple:

  • If you run Windows 10, choose Docker Desktop. You can click the link to download and install Docker on your system.
  • If you intend to run Docker off a Windows Server, you want Docker Enterprise. This cannot be installed from that page; an install manual will appear here soon.

Do you have a choice?

Then the version you should get depends on what you want to do. Oversimplified, it’s like this:

  • Are you planning to run Docker just for yourself? For example on a laptop? Then choose Windows 10 and get Docker Desktop. It’s free, and can do most of the stuff the enterprise version can too.
  • Do you want to share your development environment with other developers? Are you looking for a cloud-like experience when connecting to your NAV/BC? Do you have powerful hardware (6+ cores, 24GB+ RAM)? Then Docker Enterprise might be for you. In my experience, it’s more stable and scales better. The biggest disadvantage would be the higher cost.

What advantages does Docker have over running VMs?

Let’s imagine you’re running a laptop with a modern six core processor and 16GB of RAM. For Dynamics NAV 2018 or Business Central on premise development, you will need a stable and fast Windows 10 environment with Visual Studio Code, a browser, some Office programs etcetera. Okay, I sometimes also use finsql (with UI) and SQL Server Management Studio – some people call me old-fashioned 😉 For all this software you’ll need at least 8GB of RAM to run comfortably.

Then we’ll create one VM running the most recent Business Central on premise; you’ll need to install Windows 10 (or download a 12GB Windows image), probably you’ll want to run some updates, maybe install SQL Server, install Business Central (maybe including the demo database and SQL Server Express), set up the service tier, configure the firewall to allow traffic into your machine, and I’m probably forgetting a lot here.

How long will all this take? An hour at least. You’ll probably also want at least one dedicated core assigned to this machine, and at least 4GB of RAM to get something vaguely resembling “performance” out of this NAV setup.

Now let’s add customers on 4 different versions to the mix; you’ll either:

  • Need to create multiple VMs (in which case I’m hoping you made a copy of that freshly installed and updated Windows 10 VM earlier, before setting up NAV…), or;
  • Have to fiddle with your running installs to install a second version alongside the first one, at the risk of ending up with nothing working at all.

Let’s compare this to a Docker install: Admittedly, both Docker Desktop and Docker Enterprise took me a fair share of time to get going, but once you do have it running, setting up a new development environment will take you one command and less than five minutes of waiting. Which is awesome!

Another advantage of Docker is that it’s a lot less resource-hungry than a VM. Not only is it possible to run a fully functional developer’s Windows environment with 4 containers running simultaneously, it will also deliver more than reasonable performance. If you’re a raw numbers person, David Markus did a test for his revelations on Docker presentation. The results:

Doesn’t it have any disadvantages?

Well, yes it does. Sometimes, Docker is just a bit of a motherDocker. Google is not always your friend here either; Googling Docker issues will also give you a lot of solutions that apply only to Linux containers and/or Docker running on Linux, and solutions for problems that work only on *insert older, newer or different version of Windows*.

You will probably learn a good bit about networking, but it’s well worth it. And concerning Google, I hope this blog series will be part of the solution.

Let’s get this party started!


An unknown language was selected (8192)

Yet another blog that was inspired by a phone call: Yesterday, a “new” Dynamics NAV (AL) developer who had to venture into good old C/Side called me about getting an error message when starting the program.

“An unknown language was selected (8192).” came up right after clicking the icon, after which the program would close.

Since it took me a while to remember what it meant, I decided to put it in a quick blog, for once and for all.

Let’s keep it short: This message doesn’t have much to do with your Dynamics NAV setup. It’s very easy to cause, and also very easy to fix, and has to do with Windows regional settings (in my case a mix of English and Dutch, but this might also happen with other languages).

The solution is in Control Panel > Region:

Change this setting to something more usual, and you’re done:

After this, your development environment will work again.


Lost In Extensions: SQL migration & Modern Development

After having a fairly smooth Friday, I was in my car driving to pick up my son when a migration consultant (let’s just call him Mike) called about a mutual customer. Mike told me something I don’t want to hear on Friday around 17:30:

We’ve just run another conversion to prepare for testing next week, but somehow we’re missing records in the Item table. About 300, to be exact. It looks as if the service tier isn’t refreshing.

After discussing the problem for a while, and not figuring out what was going on, I promised to take a look after the usual family business. I just did.

At this customer we periodically migrate data from the old live environment to the acceptance environment (using SQL): some tables are wiped every run, but for performance reasons, others are migrated through incremental updates.

So I logged in, restarted all services and then had a quick look in the client. With a filter on our item, it returned an empty list – no other filters on the table, so the item clearly didn’t exist. So I tried to create it… and NAV told me that it already existed!

Right. Next to check were the usual things that go wrong when doing SQL conversions: lowercase characters in code fields, dates or times with the wrong value, etcetera (even though the behaviour of the client didn’t match up). As expected, there was nothing “off” visually.

Not wanting to waste time, I decided to run a trace, and quickly found out what was going on:

A few weeks ago, I built an extension containing a TableExtension object that extends the Item table, so now we have two Item tables:

  • The main table: CRONUZ_EU$Item
  • The companion table(s): CRONUZ_EU$Item$[random GUID]

The NAV/BC engine always generates its SQL statements with the companion table joined in, so there will never be a record in one of the companion tables that’s not in the main table, nor will the opposite situation exist.

Mike’s original procedures update the CRONUZ_EU$Item table, but not the companion tables. However, the NAV/BC engine generates a SQL statement with a JOIN, not an OUTER JOIN. The result: SQL will return records only if a record with the given primary key is found in both tables.
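If you ever need to hunt down these “half” records yourself, a query along these lines will list them – this is just a sketch using Invoke-Sqlcmd from the SqlServer module, and the server name, database name and companion table GUID are placeholders for your own:

Invoke-Sqlcmd -ServerInstance "MYSQLSERVER" -Database "MYNAVDATABASE" -Query @'
-- Items that exist in the main table but have no matching companion record
SELECT i.[No_]
FROM [CRONUZ_EU$Item] i
LEFT JOIN [CRONUZ_EU$Item$00000000-0000-0000-0000-000000000000] c ON c.[No_] = i.[No_]
WHERE c.[No_] IS NULL
'@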

After adding all the missing records, of course, the problem was solved immediately. However, uninstalling the app with option -DoNotSaveData might also help.


The licensing issue: Christmas present from Microsoft

It’s that time of year again! December, one of the busiest months of the year. Everybody is trying to achieve targets, finish some projects and grab a bonus, all while decorating their homes, shopping for Christmas trees and writing postcards for friends & family.

Microsoft chose exactly this time to secretly slip us an early gift. Almost nobody noticed. 

One of my customers is currently running Dynamics NAV 2018 with a number of C/Side customizations. We’ve already cleaned a lot of stock objects by switching to events; the next step forward into the future is the switch from C/AL to AL, and I managed to convince them that next year, Extensions are the way to go!

In order to prepare for this, I started refactoring and componentizing older customizations, so they can become independent extensions which can be switched on and off as required. While doing so I ran into a “little” problem:

What it says is that the license doesn’t allow object LocationCardExt to be published, because the Page Extension needs a free object in the licensed range. I had the same error on my Table Extensions. So here I was, at a customer who has used their license right up to the latest object in the custom range, planning to move custom fields from – for example – the Sales and Purchase tables to multiple extensions.

36-37-38-39-110-111-112-113-114-115-5107-5108: Only the Sales objects would be 12 tables, so if I had to build 5 custom extensions, this would result in 60 additional objects. Probably, at least half of those tables have modifications on two pages (a card and a list), so I’d need 90 pages, and this is ONLY sales – €10K in extra objects would be easily spent.

Well, fuck. Excuse my French, but this is quite inconvenient to say the least. I silently panicked and contacted Arend-Jan (one of the three wise men).

And I got lucky! He pointed me to a document on object ranges in Business Central which also has a passage about Dynamics NAV 2018:

When implemented with Dynamics NAV 2018 or Dynamics 365 Business Central On-Premise, partner hosted or Azure IAAS:

The classic C/AL objects in this range needs to be purchased from the Dynamics Pricelist when implemented on premise, partner hosted or Azure IAAS. They are developed in the traditional way using C/Side.

New from Business Central Fall 2018 Cumulative Update 1 (planned for November) and NAV 2018 CU 12 (planned for December)

The AL extension (PageExtension, TableExtension) objects developed in Visual Studio Code and stored in the 50.000 – 99.999 range which extends objects to which you have modify permissions in your development license are free of charge. (For ex. When you want to extend the Customer table from the base app, you are required to develop an extension object and assign a unique object ID to it). Regular AL objects (Table, page, codeunit, report,…) needs to be purchased through Dynamics pricelist.

Yes, you read that right: Microsoft said “free of charge”!

From Business Central Fall 2018 CU1 and Dynamics NAV 2018 CU12, it’s possible to use the full 50.000-99.999 range for these Page Extensions and Table Extensions, so it looks as if it will solve this problem. 

In other news: Cumulative Update 12 for Microsoft Dynamics NAV 2018 has been released.  Today.

This blog does come with a little warning: At the time of writing this, I couldn’t find a docker image for NAV 2018 CU12 yet, but as soon as I have the chance I’ll test with a couple of different licenses, and report back here exactly what is possible and what isn’t.

For now: An early Merry Christmas to all of you!


Escaping the transaction scope (and other good reasons for beer)

Yes, beer! It’s Friday evening and we’re celebrating. The Dynamics Tailor, last year’s three-day decision to become an entrepreneur, is still alive and kicking!
Yes, the first anniversary of The Dynamics Tailor has passed, last Wednesday to be exact.

So what happened this year? A lot! We’ve managed to pull a number of businesses into a recent version of Dynamics NAV, worked on extensions, and I’ve also had some security puzzles in one of my projects. Currently, I’m working on a re-implementation and a customization project, and serving as an interim application manager.

I haven’t had large failures, and I’ve learned a lot: I’m looking forward to next year!

Escaping the transaction scope

This week, I had another customization request that was not so easy to fix:

“During sales order release, sales shipment and warehouse shipment, we want to perform a number of extra validations. If one of these validations fails, all changes should be rolled back. However, can we catch all validation errors (also the stock ones), and log them into a table, so process owners can either fix the issue or decide whether the custom validation can be ignored (approve the order)?”

It sounds a lot easier than it is: I didn’t want to modify any stock code (the customer is using NAV 2018, we can’t fully switch to extensions yet, but we want to build all customizations ready to be converted in the future).
In this situation, I had no choice but to somehow save records after the transaction had started, but before an error is thrown; if I waited for the error, I would only be able to get the last error (text). And that’s easier said than done!
As you probably already know, Business Central starts a transaction after the first write command (INSERT, MODIFY, RENAME, DELETE), and performs a COMMIT either when execution ends, or when you force the program to perform a COMMIT.

Putting a COMMIT in between would solve the problem, but would probably give me 15 new problems: It might cause inconsistent data. I’ve always been very careful with COMMIT, but since we’ve started using events and developing extensions, I’ve basically stopped using them altogether, except in completely isolated code. It’s simply too dangerous when you cannot control exactly what’s happening before your COMMIT is being executed.

Actually, the solution would be to simply keep my error log transaction out of the transaction scope; the whole posting transaction will then be rolled back when an error occurs, but my log would still be saved. Dynamics NAV and Business Central provide an expensive, but functional solution, and this is how to code it:

STARTSESSION(SessionID, CODEUNIT::"MyValidationHandler", COMPANYNAME, ValidationLogRecord);

STARTSESSION runs a codeunit in a separate, non-GUI session, and is therefore outside the scope of your current transaction. I prepare my record before passing it to the new session, and it’s only written in a new session if an error is expected; this keeps the session open as briefly as possible and starts as few sessions as necessary.

Also, I timed the duration of this process, calling a codeunit that simply performs a Record.INSERT(TRUE); and then closes the session. It clocked in at 13 ms on a slow development server – not fast, but acceptable for something that only occurs a few times a day.
Good to know is that performing a STARTSESSION will not cause you to use an extra licensed user – the session is started from the same system and with the same named user, so it doesn’t count.

Again: It’s expensive, but it solves the problem!


A Dynamics NAV database… on a Linux server?

Let’s go ahead and admit it: I just have a thing for open source software. FreeBSD and Ubuntu Linux really tickle my fancy. Free, lightweight, fully customizable, versatile, yet world-renowned for stability. This “thing” used to be totally useless when your daily work is all about Microsoft. Recently, things have changed… Because Microsoft released SQL Server 2017 for Linux!

Probably, you’re already wondering why I wrote this article, or even why I tried running SQL Server on Linux. Yes, I know, who cares about on premise installations since we have Azure and Docker?

Don’t ask me “why”, but I just had to give it a go: A Dynamics NAV 2018 database, on Microsoft SQL Server, on Linux. Two (or three?) worlds working together in peace. It’s happening! And if this isn’t enough motivation, well, then there’s always this little beauty:

  • Because we can! *evil laugh*

Step 1: Setting up the Linux Server

If your world is entirely made up of Microsoft products, setting up a Linux server doesn’t seem the most straightforward thing to do. But actually, it is, so here’s a (detailed but quick) crash course for installing an Ubuntu Linux server:

  • Go to this link to download Ubuntu Server. Unless you need cutting edge functionality, download the LTS version. Like with Windows Server editions, this release has Long Term Support (five years), so if you keep this Linux server running you’ll be happy until April 2021 (for the 16.04 version).
  • If you’re planning to run Ubuntu on a physical machine, google “unetbootin” (it’s somewhere on SourceForge) to quickly copy the image you just downloaded to a USB stick + make it bootable.
  • If you’re running a hypervisor (Hyper-V or VMware for example), just mount the Ubuntu image on a new system and run it. Concerning resources, I’d start with 3GB RAM and 2 cores if you’re limited – Ubuntu Server doesn’t have a GUI and is very lightweight out of the box.
  • You’ll find the setup is very straightforward. By the way, I usually skip keyboard detection and just select “US International”.
  • Go easy on yourself: Install “SSH” right from the setup tool. This is your “Remote Desktop Connection” for Linux. Slight difference: Your Linux has no GUI, so this is a tool to access the command prompt. If you already skipped through the screen that gives you this option: No problem, you can do it afterwards from the command prompt.

So, probably you’re looking at an installed Linux Server with a login-prompt now. After you login it’ll probably tell you it needs some updates. You can easily install these updates by running the tool to do so as administrator with the following command:

sudo apt-get upgrade

After entering this, you will have to enter your password again. “sudo” means “superuser do”, which is equal to running stuff as administrator in Windows. The rest is not important now. Fun fact: Unless you’re very unlucky, you won’t have to reboot the machine after the updates have installed…

If you forgot to install SSH earlier, just run this command to get that settled:

sudo apt-get install -y openssh-server

The last time I did this manually was a while ago, but it should work right after installation.

Step 2: Tools to access the machine from your Windows box

To make our Linux box with SSH usable from outside, we need two tools:

  • PuTTY: Use this to access the command prompt from your Windows machine. You can run ifconfig (with an F instead of a P) from the command prompt to see the IP address of your Linux machine.
  • WinSCP: Convenient tool which looks like a mix between Windows Explorer and good old Norton Commander, but allows you to access files on your Linux machine over the SSH connection. (yes there’s also Bitvise SSH, this is personal preference)

Step 3: Installing Microsoft SQL Server on Linux

This is where the mixing starts. Let’s first tell Ubuntu where to get SQL Server, by running the following commands (either through PuTTY or directly on the prompt):

wget -qO- https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

sudo add-apt-repository "$(wget -qO- https://packages.microsoft.com/config/ubuntu/16.04/mssql-server-2017.list)"

sudo apt-get update

Background info: Ubuntu Linux has a “Store”, just like Windows. What we did here is: add the Microsoft signing key to the system (so it trusts Microsoft’s repository), add the repository to the list of repositories in Ubuntu, and then get information on available packages for the system. Next commands:

sudo apt-get install -y mssql-server

sudo /opt/mssql/bin/mssql-conf setup

These last steps will look different from what you’re used to, but will probably feel very familiar once you see what is being asked. Let’s keep it simple, and also open up port 1433 for all incoming connections in the firewall:

sudo ufw allow 1433/tcp

Congratulations: You’ve just installed and setup SQL Server on your Linux machine!

Step 4: Preparing your database

If all went well, you should now have a running SQL Server, and you should be able to log in to the server from your workstation using Microsoft SQL Server Management Studio.

The folder structure is a little different in Linux, but if you copy an SQL backup to your server through WinSCP, you can locate it in SSMS:

 

Once the database is there, you should be able to connect through the classic “Development Environment”.

Step 5: Connecting the Microsoft Dynamics NAV Service Tier

But suddenly, setting up the Dynamics NAV Service Tier is not so straightforward.

Yes, you probably already noticed this in steps 3 and 4: SQL Server Authentication is back. Not a good thing in my opinion, but the Linux machine is simply not a domain member and we’ll have to live with this for now.

The difficulty with using SQL Server Authentication in the Dynamics NAV service tier is that you’ll need an encryption key to protect the database credentials the service tier uses: Normally this is handled by the domain. For production environments I’d always use a certificate provided by an official authority, but if you’re simply testing stuff, I’d say use makecert.exe to generate one yourself.
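As a side note: on a recent Windows machine you don’t even need makecert anymore; a self-signed certificate for test purposes can be generated straight from PowerShell (the DNS name is of course a placeholder):

New-SelfSignedCertificate -DnsName "navservicetier.mydomain.local" -CertStoreLocation "Cert:\LocalMachine\My"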

After generating the certificate, you’ll need to install it on the service tier. Ashwini Tripathi has written a very helpful blog on how to get this running. Here’s another Dynamics NAV Administration (Power)Shell shortcut:

$Credential = (New-Object PSCredential -ArgumentList 'sa',(ConvertTo-SecureString -AsPlainText -Force 'MYPASSWORD'))

Import-NAVEncryptionKey -ApplicationDatabaseServer LINUXSERVERHOSTNAME -ApplicationDatabaseCredentials $Credential -ApplicationDatabaseName DATABASENAME -KeyPath 'KEYPATH.key' -Force

Where all the capitalized placeholders need to be replaced with your own values, of course.

Step 6: Enjoy!

After completing these steps, your Dynamics NAV clients and even the classic development environment should now be able to start on the Linux database server:

For now, I’ll be using this database server to do some development on 2018 and possibly also on 2013; if I have any news on this, you’ll read it here.

If you have any issues, questions or remarks, let me know!


Add some colour to your Dynamics NAV links!

When developing a custom add-on for Dynamics NAV, working with a DTAP or DTP roadmap can ensure both stability in your production environment on one end, and the freedom to develop without limitations on the other. However, for key-users, consultants and developers, DTAP environments come with a risk: You might find yourself testing in live, or entering live data into a development, test or acceptance database by accident. Although I would never miss the opportunity to make fun of you if you did, I must admit it happens to me too 🙂

Even without accidents happening, with multiple instances of the same program open in the taskbar, all the buttons can get very confusing.

In order to minimize these risks within our company, we defined colours for every step: Red is development, green is test and blue is production (we don’t have an acceptance database at the moment). We used the system indicator (in Company Information) to show these colours in the program, too, but the taskbar problem was still there.

After some fiddling, we now have a fix for this:

Aside from being nice and colourful, this is quite easy to create and deploy too. The steps I took (skip to #3 if you want to choose different icons – there’s a lot to find for free):

#1 Get the NAV image as an .ico file

I used a small freeware program called NirSoft IconsExtract to get the .ico file from the Dynamics NAV executable. The executable to search is Microsoft.Dynamics.Nav.Client.exe, in the RoleTailoredClient folder under Program Files (x86).

#2 Change the colour of the icon

Another free tool was used to change the colour of the icon: GNU Image Manipulation Program (better known as GIMP), which is a very powerful image editor (open source and free!).

When opening the icon in this tool, on the right side of your screen you’ll see the layers. What I did was select the top layer, then click Colors > Map > Color Exchange. Choose Color From and Color To (you can do this quickly with the droplet tool), play a bit with the threshold until the example looks about right and click OK. Then hide the current layer, select the next layer and press Ctrl + F (command to redo the previous Color Exchange).

When you’re finished, make all layers visible again and click File > Export As… > Microsoft Windows Icon (.ico). You can ignore the warning if you’re running Windows 8.1 or 10 (it will read your icon with compressed layers without any problems).

#3 Creating desktop links that use your new icons

It’s NAV, and we want to deploy easily… so we use PowerShell for this! First, we need to set some variables and copy the icons to the Dynamics NAV folder:
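The original screenshot with this part of the script isn’t reproduced here, but a minimal sketch could look like this – the version folder, share path and icon file names are assumptions, adjust them to your own setup:

# Folder of the NAV Windows client and the share holding the coloured icons
$ClientFolder = "C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailoredClient"
$IconSource = "\\server.domain\Dynamics Tailor BV\deploy\icons"

# Copy the coloured .ico files next to the client executable
Copy-Item -Path "$IconSource\NAV_Dev.ico","$IconSource\NAV_Test.ico","$IconSource\NAV_Prod.ico" -Destination $ClientFolder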

Useful to know here: I used the public desktop folder to place the icons. When you want these in personal folders, use $Home\Desktop

Then, we ask the Windows Scripting Host to create an icon for us (copy this part for every environment you wish to link to):
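Again as a sketch (server, instance and file names are placeholders), creating one coloured shortcut on the public desktop through the Windows Scripting Host looks roughly like this:

# Create a shortcut on the public desktop that points at the development environment
$Shell = New-Object -ComObject WScript.Shell
$Shortcut = $Shell.CreateShortcut("C:\Users\Public\Desktop\NAV Development.lnk")
$Shortcut.TargetPath = "$ClientFolder\Microsoft.Dynamics.Nav.Client.exe"
$Shortcut.Arguments = '"DynamicsNAV://navserver:7046/DynamicsNAV110-DEV/"'
$Shortcut.IconLocation = "$ClientFolder\NAV_Dev.ico"
$Shortcut.Save()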

Of course, we can also create a link on the desktop to the standard NAV environment (this uses the config file, and is essentially equal to the link on the Start Menu):
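A sketch of that one too – no arguments, so the client simply reads its regular ClientUserSettings.config, and we keep the original icon:

$Shell = New-Object -ComObject WScript.Shell
$Shortcut = $Shell.CreateShortcut("C:\Users\Public\Desktop\Dynamics NAV 2018.lnk")
$Shortcut.TargetPath = "$ClientFolder\Microsoft.Dynamics.Nav.Client.exe"
$Shortcut.Save()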

If you wish to change the behavior of your NAV client with command line arguments, you can add them in the $Shortcut.Arguments parameter. The way the link is configured now, everything except running the standard NAV environment will give you this security notice:

 

