The Dovetail Blog

Integrating ArcGIS Online maps for CIS

Construction Information Services (CIS) is the leading supplier of All-Ireland Construction Leads, and it recently launched a new mapping feature for its flagship CIS Online product.

This new feature allows CIS customers to see the exact location of project leads.

The system uses the ArcGIS Online platform, provided by Esri Ireland, to securely store and display spatial data for each project.

Dovetail played a central role in developing this functionality and was responsible for integrating the ArcGIS platform with CIS Online and its back-office features. The project included:

  • Real-time synchronisation of CIS project leads with the ArcGIS platform using the ArcGIS Online API
  • Adding secure maps to CIS Online and to backend researcher systems
  • Secure integration of an ArcGIS Online application into CIS Online
  • Using the ArcGIS Online JavaScript API to project Irish Grid coordinates to longitude and latitude (see the sketch after this list)
  • Backend functionality to help the research team verify the location of project leads
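
To give a flavour of the coordinate work, here is a minimal sketch of projecting an Irish Grid easting/northing to WGS84 longitude/latitude in the browser. It assumes the ArcGIS API for JavaScript 4.x and its esri/geometry/projection module; the coordinates and the Irish Grid wkid (29903) are illustrative, not CIS's actual code.

require([
  "esri/geometry/projection",
  "esri/geometry/Point",
  "esri/geometry/SpatialReference"
], function (projection, Point, SpatialReference) {
  // Load the client-side projection engine before projecting
  projection.load().then(function () {
    // Hypothetical easting/northing on the Irish Grid (EPSG:29903)
    var irishGridPoint = new Point({
      x: 315904,
      y: 234671,
      spatialReference: new SpatialReference({ wkid: 29903 })
    });
    // Project to WGS84 (EPSG:4326) to obtain longitude/latitude
    var wgs84Point = projection.project(irishGridPoint, new SpatialReference({ wkid: 4326 }));
    console.log(wgs84Point.longitude, wgs84Point.latitude);
  });
});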

CIS Ireland is one of Dovetail’s most established clients and it celebrates its 45th birthday this year.

"We have trusted Dovetail with our critical systems since 2007. Dovetail’s philosophy is based on “partnership”. They consistently deliver high quality solutions for our business and I have no hesitation in recommending them.”

- Tom Moloney, Managing Director.


Selenium not shutting down when running automated Protractor tests with TeamCity. Fixed.

Over the last couple of weeks Kit and I have been working out how to integrate automated tests into the QuickDBD application (a great tool; try it if you haven't). The challenge was to make our tests work with gulp and TeamCity, so that they run automatically every time we deploy a new release. QuickDBD is an Angular application, so we chose Protractor as our testing framework.

This nice tutorial showed us all the required steps, but we had an issue where our tests started on TeamCity but never stopped, leaving the process running indefinitely on the server. I couldn't find a solution to this problem anywhere on the internet, so I'm using this post to share ours with the world.

As mentioned in the tutorial, in order to integrate Protractor with gulp you need to install the gulp-angular-protractor package. This package automates everything you need to do to run Protractor, including starting and stopping the Selenium server, and that is exactly where the problem lay.

When you configure your gulp task with gulp-angular-protractor, it will look something like this (a sketch following the package's documented usage; the task name and file paths are illustrative):
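
var gulp = require('gulp');
var gulpProtractorAngular = require('gulp-angular-protractor');

// Run the Protractor specs, letting the plugin manage the Selenium server
gulp.task('test', function (callback) {
    gulp
        .src(['tests/*.spec.js'])
        .pipe(gulpProtractorAngular({
            configFile: 'protractor.conf.js',
            debug: false,
            autoStartStopServer: true
        }))
        .on('error', function (e) {
            console.log(e);
        })
        .on('end', callback);
});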

The autoStartStopServer: true setting tells gulp-angular-protractor to handle starting and stopping the Selenium server itself. But the stopping part wasn't working.

In previous versions of Selenium, the following URL was used to shut down the Selenium server (assuming Selenium runs on the default port 4444):

http://localhost:4444/selenium-server/driver/?cmd=shutDownSeleniumServer

The gulp-angular-protractor plugin was trying to shut down the Selenium server via this URL, but as of Selenium server 3.0 the URL was, for some reason, removed.

The solution we found was to tell gulp-angular-protractor to run a previous version of the Selenium server, one which still contained the shutdown URL, by configuring the gulp task along these lines (the option names follow the package docs, and the pinned version is illustrative):
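
// Same task as before, but pinning webdriver-manager to a pre-3.0
// Selenium standalone which still exposes the shutdown URL
// (the exact version pinned here is illustrative)
gulp.task('test', function (callback) {
    gulp
        .src(['tests/*.spec.js'])
        .pipe(gulpProtractorAngular({
            configFile: 'protractor.conf.js',
            debug: false,
            autoStartStopServer: true,
            webDriverUpdate: { args: ['--versions.standalone', '2.53.1'] },
            webDriverStart: { args: ['--versions.standalone', '2.53.1'] }
        }))
        .on('error', function (e) {
            console.log(e);
        })
        .on('end', callback);
});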

I hope this helps whoever is going through the same issue.


Angular, timezones and Datetimepickers

Recently we developed an Angular application for one of our multinational clients to help them manage accommodation, transfers and flight times. The app had to deal not only with dates but also times, and we looked into many different date and time pickers created specifically for Angular. We ended up selecting angular-bootstrap-datetimepicker.

At first everything went fine, but during development the daylight saving period started here in Ireland and we noticed that the dates being stored on the server differed by one hour from what we were picking in the app.

JavaScript is somewhat confusing in how it parses dates, times and timezones, and browsers even handle some things differently. What was happening was that when the datepicker parsed the picked date into a JavaScript object, it also added local time information, e.g. 19/05/2017 11:18:00 (+1 hour) instead of 19/05/2017 10:18:00, which were the actual “numbers” picked in the datepicker. When this was sent to the backend, the server parsed the date object without timezone information, so the date stored on the server was one hour wrong.

We didn’t want the application to be timezone aware; we wanted users to pick a date in the datepicker and have exactly the “numbers” they picked stored. No smart conversions or anything like that.

The datepicker already provided support for moment.js (which practically the whole world recommends if you are dealing with dates in JavaScript), so all we had to do was ensure that any date set through the picker was in the UTC timezone.

To achieve this behaviour we added moment-timezone, which gave us support for dealing with timezones, and then changed the code of angular-bootstrap-datetimepicker in the following way:

In the datetimepicker.js file, in the setTime function, one line after case 'moment' we added moment.tz.setDefault("Etc/UTC") to force the timezone to UTC every time a date is set.
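
As a standalone illustration of the effect (assuming moment and moment-timezone are loaded):

// Force every new moment to default to UTC
moment.tz.setDefault("Etc/UTC");

// A date built from the picker's "numbers" now keeps those numbers:
var picked = moment("19/05/2017 10:18", "DD/MM/YYYY HH:mm");
console.log(picked.format()); // 2017-05-19T10:18:00+00:00, no local-offset shift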

By telling the datepicker to use moment we started to have issues with the Angular date filter, which simply stopped working, so we added another library called angular-moment to deal with that. The code for the input that shows the date ended up along these lines (a sketch; the model name and display format are illustrative):
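
<!-- Sketch: show the stored UTC value with angular-moment's filters;
     the model name and display format are illustrative -->
<input type="text" readonly
       value="{{ vm.selectedDate | amUtc | amDateFormat:'DD/MM/YYYY HH:mm' }}" />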

  

Further action: doing some more testing we discovered that by using moment.utc() we don't actually need to add moment-timezone or do the setDefault("Etc/UTC") thing at all. We have not given this a huge amount of testing, but the code would look something like this (a sketch of the idea rather than the picker's actual internals):
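
// Sketch: parse the picked "numbers" directly as UTC, with no offset applied
// (in practice this change goes where the picker builds its moment instance)
var selected = moment.utc("19/05/2017 10:18", "DD/MM/YYYY HH:mm");
console.log(selected.toISOString()); // 2017-05-19T10:18:00.000Z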

 


Windows Server IIS TLS 1.2 Security for Realex Payments

Realex Payments recently announced that it is ending support for TLS versions 1.0 and 1.1, and has begun emailing customers to let them know of the change.

I have written this guide to help people apply security best practices on a Windows server running IIS, which should also address the new Realex security requirements.

Please note that in order for the changes to take effect you will need to restart your server.

Preconditions

This guide is only for servers running Windows and IIS. 

Your web applications (web sites) will also need an SSL certificate.

Step 1: Download IIS Crypto 2.0

Go to Nartac and download IISCrypto.exe to your server.

Step 2: Run IIS Crypto 2.0

Run the executable you just downloaded. It is a portable program, so it doesn't install anything. The program should display a screen similar to the one shown here.

Step 3: Click the Best Practices Button

On the screen, click the "Best Practices" button on the bottom left, or select the options you want; the window should then look like the screen below. Once you are happy with the selected tick boxes, click the "Apply" button.
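
Under the hood, IIS Crypto applies these choices by editing the SCHANNEL registry keys. If you ever need to verify or script the server-side TLS 1.2 setting yourself, a PowerShell sketch like the following (using the standard SCHANNEL paths) does the equivalent for that one protocol:

# Enable TLS 1.2 for server connections via the SCHANNEL registry
# (run as Administrator; a reboot is still required afterwards)
$path = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Server'
New-Item -Path $path -Force | Out-Null
New-ItemProperty -Path $path -Name 'Enabled' -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $path -Name 'DisabledByDefault' -Value 0 -PropertyType DWord -Force | Out-Null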

Step 4: Restart your Server

After you click "Apply" you will need to reboot your server. IIS Crypto will tell you to do this (it will not reboot the server for you).

Step 5: Check your server at Qualys SSL Labs

Once your server and IIS have come back online, you should check your server's rating. Enter the URL of the site or the IP address of the server and let Qualys SSL Labs test it. You will want at least an A rating for your server. If you do not get an A rating, review your server's security settings and re-run the SSL report.

 

I hope this helps anyone who may want to update their server's security.


Using Let's Encrypt to add an SSL certificate to your Umbraco site

Lately I learned of a new tool, Let's Encrypt, which obtains and installs an SSL certificate for you automatically and renews it every 3 months.

I was eager to try out this new service on one of our Umbraco sites; however, there was an issue when I tried to run the program.

Let's Encrypt adds a folder called ".well-known" to the root of the site, and then uses this folder to verify the site and issue an SSL certificate. When you attempt this on an Umbraco site you will get an error which says something along the lines of "Let's Encrypt cannot access this folder".

In order to get the SSL certificate issued and installed you will need to modify the web.config of your Umbraco site as below.

Replace

<add key="umbracoReservedPaths" value="~/umbraco,~/install/" />

With

<add key="umbracoReservedPaths" value="~/umbraco,~/install/,~/.well-known/" />

Re-run the Let's Encrypt program and the SSL certificate should then be issued and installed for your Umbraco site.

Note: this will also work for Azure-hosted Umbraco sites using the Kudu Let's Encrypt site extension.


Invalidating AWS Cloudfront cache using Octopus Deploy

AWS + Octopus Deploy

I recently wrote about how we submitted our first Octopus Deploy template to their online library, for deploying .Net web apps to AWS Elastic Beanstalk using Octopus Deploy.

This time we needed to automate AWS CloudFront cache invalidation. It turns out there are a few different ways to achieve this: from the AWS console, by making a REST request, or by using the AWS CLI tool.

Since authenticating against the AWS REST API is a bit more complex than feels necessary within an Octopus Deploy step, we decided to go with the AWS CLI approach (it's much easier to authenticate).

One more GitHub pull request and one more Octopus Deploy step template in their library, in the hope it might find someone in need. :)

The PowerShell script that does the hard work in the background of the template is the following (just fill in the AWS configuration variables): 

# AWS credentials profile name (should be unique)
# Used to store your AWS credentials to: ~/.aws/
$CredentialsProfileName = ""

# AWS CloudFront Region
$Region = ""

# AWS CloudFront Distribution Id
$DistributionId = ""

# AWS Access Key
$AccessKey = ""

# AWS Secret Key
$SecretKey = ""

# Space-delimited list of paths to invalidate.
# For example: /index.html /images/*
$InvalidationPaths = ""


Write-Host "Setting up AWS profile environment"
aws configure set aws_access_key_id $AccessKey --profile $CredentialsProfileName
aws configure set aws_secret_access_key $SecretKey --profile $CredentialsProfileName
aws configure set default.region $Region --profile $CredentialsProfileName
aws configure set preview.cloudfront true --profile $CredentialsProfileName

Write-Host "Initiating AWS cloudfront invalidation of the following paths:"
Write-Host $InvalidationPaths
aws cloudfront create-invalidation --profile $CredentialsProfileName --distribution-id $DistributionId --paths $InvalidationPaths

Write-Host "Please note that it may take up to 15-20 minutes for AWS to complete the cloudfront cache invalidation"

The script uses a named profile for the AWS credentials. If you don't want to use profiles, you can remove those bits from the script, but then you might have to set up credentials again for a different project every time.
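
For example, a profile-free variant could rely on the AWS CLI's standard environment variables instead (a sketch reusing the same variables as above):

# Sketch: skip the named profile and use environment variables,
# which the AWS CLI picks up automatically
$env:AWS_ACCESS_KEY_ID = $AccessKey
$env:AWS_SECRET_ACCESS_KEY = $SecretKey
$env:AWS_DEFAULT_REGION = $Region

aws cloudfront create-invalidation --distribution-id $DistributionId --paths $InvalidationPaths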

Cheers.

 


Deploying .Net web apps to AWS Elastic Beanstalk using Octopus Deploy

AWS + Octopus Deploy

Happy New Year! Here is a small 2017 present from Dovetail to everyone.

We normally use Azure to host the apps we make. The whole build-and-deploy-with-a-single-click process using TeamCity and Octopus Deploy is in place, and it's trivial for us to add new projects to this pipeline.

Recently, however, one of our clients wanted to host the .Net web app we're building for them on Amazon Web Services (AWS), because that's where the rest of their infrastructure is. Naturally, not many people host their .Net apps on AWS, because MS Azure feels like a more natural fit. This meant it was a bit harder to find a fast and easy way to automate the deployment process to AWS through Octopus Deploy.

Anyway, we found this somewhat half-baked solution (thanks!) on GitHub, made a few modifications, wrapped it up in a nice Octopus step template and made a pull request to the Octopus library. :)

The template got accepted and can now be obtained from their library.

We hope it might help someone else and save them some time in setting the whole thing up.

Also, here are some more resources about deploying .Net apps to AWS which we found interesting.

codeproject.com - AWS deployment with octopus deploy
AWS docs - awsdeploy.exe tool
Octopus discussions - AWS elastic beanstalk
Octopus discussions - Modifying machines in environments to support AWS autoscaling
Octopus discussions - AWS beanstalk deployment using octopus deploy


Using the SQL geometry type to find shapes near to, or intersecting, a lat/lng point

This short blog post provides two SQL stored procedures which use the SQL geometry data type to work out how a lat/lng point relates to spatial shapes at the DB level.

What this means is: you provide a lat/lng, and the DB returns either all the shapes the point intersects or the shapes nearest to that point.

You could also use the geography type for this, and the code should end up only slightly more complex, but we didn't have the need for that so we used the geometry type.

We used these to identify company branches for a certain location on the map. The geometry data is stored in the SpatialData field (of geometry type), and we return the BranchId and BranchName, but you should obviously modify that to your needs.

-- Finds intersecting branches
-- Accepts lat and lng
CREATE PROC SP_GET_INTERSECTING_BRANCHES
  @lat FLOAT,
  @lng FLOAT
AS
BEGIN
  DECLARE @point GEOMETRY
  SET @point = GEOMETRY::Point(@lng, @lat, 4326)
  SELECT BranchId, BranchName FROM [dbo].[Branch]
  WHERE @point.STIntersects(SpatialData) = 1
END
GO


-- Finds the nearest branches
-- Accepts lat, lng and the amount of matching rows to return
CREATE PROC SP_GET_CLOSEST_BRANCHES
  @lat FLOAT,
  @lng FLOAT,
  @amount INTEGER
AS
BEGIN
  DECLARE @point GEOMETRY
  SET @point = GEOMETRY::Point(@lng, @lat, 4326)
  SELECT TOP (@amount) BranchId, BranchName, @point.STDistance(SpatialData) AS Distance FROM [dbo].[Branch]
  ORDER BY @point.STDistance(SpatialData)
END
GO
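
For example, called with hypothetical Dublin city-centre coordinates:

-- Coordinates and row count are illustrative
EXEC SP_GET_INTERSECTING_BRANCHES @lat = 53.3498, @lng = -6.2603

EXEC SP_GET_CLOSEST_BRANCHES @lat = 53.3498, @lng = -6.2603, @amount = 5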

Hope it helps. Cheers!


Dovetailers earn Microsoft Certifications

Congratulations are in order as the following Dovetailers passed their Microsoft Certification exams.

Tomás and Murilo passed 70-461: Querying Microsoft SQL Server

Fabrizio and Kit passed 70-483: Programming in C#

John and Mossy passed 70-532: Developing Microsoft Azure Solutions

Dovetail Values - Clarity, Partnership, Craftsmanship, Commerciality, Progression.

Progression is one of Dovetail's core values and we promote constant learning and improvement. In the fast-moving technical sector, no one can afford to sit still and we are already planning next year's Progression Goals. 

 


Studying for Microsoft Exam 70-461

This month some of my fellow Dovetailers and I took Microsoft Certification exams.

I took the exam 70-461: Querying Microsoft SQL Server 2012/2014.

With that in mind, I thought I would share some thoughts on my exam preparation and on some of the resources I used. I must say that I was not very proficient with SQL before studying for this exam, but with the right amount of preparation and study I passed and earned my certificate.

Below is a list of what I did while preparing for the exam:

  • Study from different resources: books, video courses and practice exams.
  • Find which kind of study material suits you best.
  • Study over a period of about 4 months.
  • Do at least 30 minutes of study every day.
  • Try to study in the morning; I found it hard to study in the evenings.
  • Take practice exams.
  • Study the exam objectives.
  • Once you feel ready, take the exam.

Below is a list of the study material I used and my thoughts on each:

Books

The book I studied with was "Training Kit (Exam 70-461): Querying Microsoft SQL Server 2012" by Dejan Sarka, Itzik Ben-Gan, and Ron Talmage.

This is the official book for the exam. It is a detailed book and covers all the exam objectives. It also covers more than the exam objectives, all of which was interesting and will prepare me for the next SQL exam. The book runs to around 700 pages.

The book comes with a free practice test, but this was not as good as the practice exams provided by Measure Up and Transcender.

Video Courses (70-461)

Pluralsight: The course was a bit too short and the instructor does not go through the topics thoroughly. I felt it works best as an introduction to the exam.

Joes 2 Pros: Very good material; the instructor goes into every topic in detail and provides labs for practice as well. The website is kinda clunky but the videos are good.

CBT Nuggets: An alternative to Joes 2 Pros; it does not go into the same depth, but it does cover a lot of the topics and provides a lab for you to practise with. The free trial only lasts 7 days.

YouTube: There is a SQL Server tutorial playlist which covers more topics than what's on the 70-461, but it is a good free resource.

Practice Exams

Measure Up: The interface did not properly format the SQL, so it can be quite difficult to read. Apart from that, the exams were useful.

Transcender: I really like Transcender; the interface is good and they also provide flashcards, which were helpful when trying to understand an exam topic. I found this to be a really useful study tool. The exams were as good as Measure Up's, and the questions asked were similar in structure to those in the real exam.

I hope you find this blog post helpful in preparing for your 70-461 exam.

Good Luck!