The Dovetail Blog

Dovetail helps IKEA take a bite of the Big Apple

Planning Studios are a new type of retail outlet that IKEA is opening across the globe.

The Manhattan studio is the first Planning Studio in the U.S., and up to 30 more are planned.

Customers can bring measurements of their space, or a list of the household items they’re struggling to organise, into the store. There the customer can work with an IKEA expert to come up with the best solution.

The Manhattan store features inspirational room settings focused on helping customers discover products and solutions suited for city living. It is where New Yorkers can get one-on-one help with kitchen design, bedroom projects, small space living solutions and so much more. Purchases made at the Planning Studio are conveniently delivered to customers’ homes.

Dovetail is providing a critical component of this new service: the system that lets customers make bookings, and manages the sales process for coworkers.

We’re helping ensure that busy New Yorkers get the service they need in the heart of the city.

Putting food safety first with FSAI

The Food Safety Authority of Ireland (FSAI) today launched a new system that allows Food Business Operators to notify the FSAI when they place certain types of products on the Irish market.


The new system allows FSAI to assess compliance of new foodstuffs against European regulations for food labelling, maximum levels of vitamins and minerals, banned substances, and other laws.

The solution was designed and developed by Dovetail to simplify the complex workflows that are required to assess a notification. It provides regulation checks for Foods for Special Medical Purposes, Infant Food, Follow-on Food, and Supplements, and it will be extended to cover other regulations in the future.

The system is used by FSAI staff, Environmental Health Officers and Food Business Operators throughout the country.

Clear skies for World Rugby travel coordinators

Organising group travel can be a tortuous affair. But when you have to do it as much as World Rugby does, it becomes a major undertaking.

As of today, life is a bit easier for travel coordinators in World Rugby. They can now use a custom solution we developed to meet their unique requirements.


Dovetail's noble quest

I just came across this quote attributed to Alfred North Whitehead (mathematician and philosopher, 1861-1947):


"It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments."

I really like this. The emphasis is mine by the way.

Firstly, it graphically establishes the basis on which Larry Wall (a famous programmer) thinks laziness is a virtue in developers.

Secondly, Dovetail automates important business processes for its clients. Based on this, can we claim to be advancing human civilization no less, every single day? I like the sound of that! :)

Integrating ArcGIS Online maps for CIS

Construction Information Services (CIS) is the leading supplier of All-Ireland Construction Leads, and it recently launched a new mapping feature for its flagship CIS Online product.

This new feature allows CIS customers to see the exact location of project leads.

The system uses the ArcGIS Online platform, provided by ESRI Ireland, to securely store and display spatial data for each project.

Dovetail had a central role in developing this functionality and was responsible for integrating the ArcGIS platform with CIS Online and backoffice features. The project included:

  • Real-time synchronisation of CIS project leads with the ArcGIS platform using the ArcGIS Online API
  • Adding secure maps to CIS Online and to backend researcher systems
  • Secure integration of an ArcGIS Online application into CIS Online
  • Using the ArcGIS Online JavaScript API to project Irish grid coordinates to longitude and latitude
  • Backend functionality to assist the research team in verifying the location of project leads

CIS Ireland is one of Dovetail’s most established clients and it celebrates its 45th birthday this year.

“We have trusted Dovetail with our critical systems since 2007. Dovetail’s philosophy is based on ‘partnership’. They consistently deliver high quality solutions for our business and I have no hesitation in recommending them.”

- Tom Moloney, Managing Director.

Selenium not shutting down when running automated protractor tests with TeamCity. Fixed.

Over the last couple of weeks Kit and I have been working on integrating automated tests into the QuickDBD application (a great tool, you should try it if you haven’t). The challenge was to make our tests work with gulp and TeamCity, so that they run automatically every time we deploy a new release. QuickDBD is an Angular application, so we decided to use Protractor as our testing framework.

This nice tutorial showed us all the required steps, but we had an issue where our tests started on TeamCity but never stopped, leaving the process running indefinitely on the server. I couldn’t find a solution to this problem anywhere online, so I’m using this post to share ours with the world.

As mentioned in the tutorial, in order to integrate Protractor with gulp you need to install the gulp-angular-protractor package. This package automates everything you need to do in order to run Protractor, which includes starting and stopping the Selenium server, and that was actually where the problem lay.

When you configure your gulp task with gulp-angular-protractor it will look similar to this:
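The following is a sketch of such a task; the task name, file globs and config file name are illustrative rather than taken from the original project:

```javascript
var gulp = require('gulp');
var gulpProtractorAngular = require('gulp-angular-protractor');

// Run the end-to-end tests; gulp-angular-protractor starts and stops
// the Selenium server for us because autoStartStopServer is true.
gulp.task('e2e-tests', function (callback) {
    gulp.src(['./e2e/**/*.spec.js'])
        .pipe(gulpProtractorAngular({
            configFile: 'protractor.conf.js',
            debug: false,
            autoStartStopServer: true
        }))
        .on('error', function (e) {
            throw e;
        })
        .on('end', callback);
});
```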

The autoStartStopServer: true setting tells gulp-angular-protractor that it should handle the starting and stopping of the Selenium server. But the stopping part wasn’t working.

In previous versions of Selenium, the following URL was used to shut down the Selenium server (assuming you are running Selenium on the default port, 4444):
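For reference, the legacy shutdown endpoint looked like this (worth verifying against your own Selenium version):

```
http://localhost:4444/selenium-server/driver/?cmd=shutDownSeleniumServer
```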


gulp-angular-protractor was still requesting this URL to shut down the Selenium server, but since Selenium Server 3.0 this URL has, for some reason, been removed.

The solution we found was to tell gulp-angular-protractor to run a previous version of the Selenium server, which still contained the shutdown URL, by configuring the gulp task in the following way:
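A sketch of that configuration, pinning an older Selenium standalone release. The webDriverUpdate/webDriverStart option names and the 2.53.1 version number are assumptions here; check them against gulp-angular-protractor’s documentation for your version:

```javascript
gulp.task('e2e-tests', function (callback) {
    gulp.src(['./e2e/**/*.spec.js'])
        .pipe(gulpProtractorAngular({
            configFile: 'protractor.conf.js',
            autoStartStopServer: true,
            // Pin a pre-3.0 Selenium standalone that still has the shutdown URL
            webDriverUpdate: { args: ['--versions.standalone=2.53.1'] },
            webDriverStart: { args: ['--versions.standalone=2.53.1'] }
        }))
        .on('error', function (e) {
            throw e;
        })
        .on('end', callback);
});
```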

I hope this helps whoever is going through the same issue.

Angular, timezones and Datetimepickers

Recently we developed an Angular application for one of our multinational clients to help them manage accommodation, transfers and flight times. The app had to deal not only with dates but also times, so we looked into many different date and time pickers created specifically for Angular. We ended up selecting angular-bootstrap-datetimepicker.

At first everything went fine, but during development the daylight saving period started here in Ireland, and we noticed that the dates stored on the server started to differ from what we were picking in the app by one hour.

JavaScript is somewhat confusing in how it parses dates, times and timezones, and browsers handle some things in different ways. What was happening was this: when the datepicker parsed the picked date into a JavaScript object, it also added local timezone information, e.g. 19/05/2017 11:18:00 +1 hour instead of 19/05/2017 10:18:00, which was the actual set of “numbers” picked in the datepicker. When this was sent to the backend, the server parsed the date object without the timezone information, so the date stored on the server was one hour out.

We didn’t want the application to be timezone aware: we wanted users to pick a date on the datepicker and store the exact “numbers” that were picked. No smart conversions or anything like that.
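To illustrate the underlying problem in plain JavaScript (no libraries; the date values here are just an example):

```javascript
// The "numbers" the user picked in the datepicker: 19/05/2017 10:18
// (month is zero-based in JavaScript, so May is 4).
const picked = { year: 2017, month: 4, day: 19, hour: 10, minute: 18 };

// new Date(...) interprets these numbers in the *local* timezone, so the
// instant it represents shifts by the UTC offset (e.g. +1h during Irish
// summer time) once it is serialised for the server.
const local = new Date(picked.year, picked.month, picked.day, picked.hour, picked.minute);

// Date.UTC(...) keeps the picked numbers as-is, which is the behaviour we wanted.
const utc = new Date(Date.UTC(picked.year, picked.month, picked.day, picked.hour, picked.minute));
console.log(utc.toISOString()); // "2017-05-19T10:18:00.000Z"
```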

The datepicker already provided support for Moment.js (which practically everyone recommends if you are dealing with dates in JavaScript), so all we had to do was ensure that dates set via the picker were in the UTC timezone.

In order to achieve this behaviour we added moment-timezone, which gave us support for dealing with timezones, and then changed the code of angular-bootstrap-datetimepicker in the following way:

In the datetimepicker.js file, in the setTime function, one line after case ‘moment’ we added a call to moment.tz.setDefault("Etc/UTC") in order to force the timezone to UTC every time we set a date.
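In sketch form the change looks like this (the surrounding code is paraphrased for illustration, not the library’s exact source):

```javascript
// datetimepicker.js, inside setTime - added line marked below
switch (modelType) {
    case 'moment':
        moment.tz.setDefault('Etc/UTC'); // added: every new moment is created in UTC
        // ... existing code that builds the moment from the picked values ...
        break;
    // ... other model types ...
}
```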

By telling the datepicker to use Moment we started to have issues with the Angular date filter, which simply stopped working, so we added another library called angular-moment to deal with that. The code for the input that shows the date ended up like this:
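Using angular-moment’s amDateFormat filter, the binding looked something like this (the model name and format string are illustrative):

```html
<input type="text" readonly
       value="{{vm.selectedDate | amDateFormat:'DD/MM/YYYY HH:mm'}}" />
```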


Further action: doing some more testing, we discovered that by using moment.utc() we don’t really need to add moment-timezone or call setDefault("Etc/UTC") at all. We have not given this a huge amount of testing, but the code would look like this:
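A sketch of the moment.utc() variant (again paraphrased rather than the library’s exact source):

```javascript
// datetimepicker.js, inside setTime - no moment-timezone dependency needed
case 'moment':
    // moment.utc(...) keeps the picked "numbers" but flags them as UTC
    selectedDate = moment.utc(selectedDate.toArray());
    break;
```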


Windows Server IIS TLS 1.2 Security for Realex Payments

Realex Payments recently announced that it is ending support for TLS versions 1.0 and 1.1, and began sending emails letting its customers know of the change.

I have written this guide to help people apply security best practices on a Windows server running IIS, which should also address the new Realex security requirements.

Please note that in order for the changes to take effect you will need to restart your server.


This guide is only for servers running Windows and IIS.

Your web applications (websites) will also need an SSL certificate.

Step 1: Download IIS Crypto 2.0

Go to Nartac and download IISCrypto.exe to your server.

Step 2: Run IIS Crypto 2.0

Run the executable you just downloaded. It is a portable program so it doesn't install anything. The program should display a screen similar to the one shown here.

Step 3: Click the Best Practices Button

On the screen, click the "Best Practices" button on the bottom left, or select the options you want. The window should then look like the screen below. Once you are happy with the selected tick boxes, click the "Apply" button.

Step 4: Restart your Server

After you click "Apply" you will need to reboot your server. IIS Crypto will prompt you to do this (it will not reboot the server for you).

Step 5: Check your server at Qualys SSL Labs

Once your server and IIS have come back online, check your server's rating: enter the URL of the site or the IP address of the server and have Qualys SSL Labs test it. You will want at least an A rating for your server. If you do not get an A rating, review your server's security settings and re-run the SSL report.


I hope this helps anyone who may want to update their server's security.

Using Let's Encrypt to add an SSL certificate to your Umbraco site

Lately I learned of a new tool, Let's Encrypt, which obtains and installs an SSL certificate for you automatically and renews it every 3 months.

I was eager to try out this new service on one of our Umbraco sites; however, there was an issue when I tried to run the program.

Let's Encrypt adds a folder called ".well-known" to the root of the site. It then uses this folder to verify the site and issue an SSL certificate. When you attempt to do this on an Umbraco site you will be given an error which says something along the lines of "Let's Encrypt cannot access this folder".

In order to get the SSL certificate issued and installed, you will need to modify the web.config of your Umbraco site, changing the umbracoReservedPaths setting from:

<add key="umbracoReservedPaths" value="~/umbraco,~/install/" />

to:

<add key="umbracoReservedPaths" value="~/umbraco,~/install/,~/.well-known/" />

Re-run the Let's Encrypt program and the SSL certificate should then be issued and installed for your Umbraco site.

Note: this will also work for Azure-hosted Umbraco sites using the Kudu Let's Encrypt site extension.

Invalidating AWS Cloudfront cache using Octopus Deploy


I recently wrote about how we submitted our first Octopus Deploy template to their online library, for deploying .NET web apps to AWS Elastic Beanstalk.

This time we needed to automate AWS Cloudfront cache invalidation. It turns out there are a few different ways to achieve this: from the AWS console, by making a REST request, or by using the AWS CLI tool.

Since authenticating against the AWS REST API is more complex than we feel is necessary within an Octopus Deploy step, we decided to go with the AWS CLI approach (it's much easier to authenticate).

One more GitHub pull request and one more Octopus Deploy step template in their library, in the hope it might help someone in need. :)

The PowerShell script that does the hard work in the background of the template is the following (just fill in the AWS configuration variables): 

# AWS credentials profile name (should be unique)
# Used to store your AWS credentials to: ~/.aws/
$CredentialsProfileName = ""

# AWS Cloudfront Region
$Region = ""

# AWS Cloudfront Distribution Id
$DistributionId = ""

# AWS Access Key
$AccessKey = ""

# AWS Secret Key
$SecretKey = ""

# Space-delimited list of paths to invalidate.
# For example: /index.html /images/*
$InvalidationPaths = ""

Write-Host "Setting up AWS profile environment"
aws configure set aws_access_key_id $AccessKey --profile $CredentialsProfileName
aws configure set aws_secret_access_key $SecretKey --profile $CredentialsProfileName
aws configure set default.region $Region --profile $CredentialsProfileName
aws configure set preview.cloudfront true --profile $CredentialsProfileName

Write-Host "Initiating AWS cloudfront invalidation of the following paths:"
Write-Host $InvalidationPaths
aws cloudfront create-invalidation --profile $CredentialsProfileName --distribution-id $DistributionId --paths $InvalidationPaths

Write-Host "Please note that it may take up to 15-20 minutes for AWS to complete the cloudfront cache invalidation"

The script uses a named profile for AWS credentials. If you don't want to use profiles, you can remove those bits from the script, but then you might have to set up credentials again every time you switch to a different project.