Realex Payments recently announced that it will be ending support for TLS versions 1.0 and 1.1, and has begun emailing customers to let them know of this change.
I have written this guide to help people apply security best practices on a Windows server running IIS, which should address the new Realex security requirements.
Please note that in order for the changes to take effect you will need to restart your server.
This guide is only for servers running Windows and IIS.
Your Web Applications (Web Sites) will also need an SSL certificate.
Step 1: Download IIS Crypto 2.0
Go to Nartac and download IISCrypto.exe to your server.
Step 2: Run IIS Crypto 2.0
Run the executable you just downloaded. It is a portable program so it doesn't install anything. The program should display a screen similar to the one shown here.
Step 3: Click the Best Practices Button
On the screen click the "Best Practices" button on the bottom left, or select the options you want. The window should then look like the screen below. Once you are happy with the selected tick boxes, click the "Apply" button.
Step 4: Restart your Server
After you click "Apply" you will need to reboot your server. IIS Crypto will tell you to do this (it will not reboot the server for you).
Step 5: Check your server at Qualys SSL Labs
Once your server and IIS have come back online you will need to check the rating of your server. Enter the URL of the site or the IP address of the server and have Qualys SSL Labs test your server. You will want to get at least an A rating for your server. If you do not get an A rating you will need to review your server's security settings and re-run the SSL report.
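As a quick sanity check alongside the SSL Labs report, you can confirm the same policy from the client side. This is a minimal sketch in Python (purely illustrative; the server itself is configured through IIS Crypto) showing a client context that, like the hardened server, refuses TLS 1.0 and 1.1:

```python
import ssl

# Build a client context that enforces TLS 1.2 as a minimum, mirroring
# the server-side policy this guide applies with IIS Crypto.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Any handshake made with this context will fail against a server that
# only offers TLS 1.0 or 1.1.
print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # True
```

Wrapping a socket with this context and connecting to your site is an easy way to prove the old protocols are really gone after the reboot.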
I hope this helps anyone who may want to update their server's security.
Congratulations are in order as the following Dovetailers passed their Microsoft Certification exams.
Tomás and Murilo passed 70-461: Querying Microsoft SQL Server
Fabrizio and Kit passed 70-483: Programming in C#
John and Mossy passed 70-532: Developing Microsoft Azure Solutions
Progression is one of Dovetail's core values and we promote constant learning and improvement. In the fast-moving technical sector, no one can afford to sit still and we are already planning next year's Progression Goals.
This month some of my fellow Dovetailers and I took Microsoft Certification exams.
I took the exam 70-461: Querying Microsoft SQL Server 2012/2014.
With that in mind, I thought I might share some thoughts on my exam preparation and on some of the resources I used. I must say that I was not very proficient with SQL before studying for this exam, but with the right amount of preparation and study I passed and earned my certificate.
Below is a list of what I did while preparing for the exam:
- Study from different resources: books, video courses and practice exams.
- Find the kind of study material that suits you best.
- Study over a period of about 4 months.
- Study for at least 30 minutes every day.
- Try to study in the morning. I found it hard to study in the evenings.
- Take practice exams.
- Study the exam objectives.
- Once you feel ready, take the exam.
Below is a list of the study material I used, along with my thoughts on each:
The book I studied with was "Training Kit (Exam 70-461): Querying Microsoft SQL Server 2012 by Dejan Sarka, Itzik Ben-Gan, and Ron Talmage".
This is the official book for the exam. It is a detailed book of around 700 pages and covers all the exam objectives. It also covers more than the exam objectives, all of which was interesting and will help prepare me for the next SQL exam.
The book comes with a free practice test, but this was not as good as the practice exams provided by Measure Up and Transcender.
Video Courses (70-461)
Pluralsight: The course was a bit too short and the instructor does not go through the topics thoroughly. I felt it worked best as an introduction to the exam.
Joes 2 Pros: Very good material; the instructor goes into every topic in detail and provides labs for practice as well. The website is a little clunky but the videos are good.
CBT Nuggets: An alternative to Joes 2 Pros. It does not go into the same depth, but it does cover a lot of the topics and provides a lab for you to practice with. The free trial only lasts 7 days.
YouTube: There is a SQL Server tutorial playlist which covers more topics than what's on the 70-461, but it is a good free resource.
Measure Up: The interface did not properly format the SQL, so it can be quite difficult to read. Apart from that, the exams were useful.
Transcender: I really like Transcender; the interface is good and they also provide flashcards, which were helpful when I needed to understand a particular exam topic. I found this to be a really useful tool when studying. The exams were as good as Measure Up's and the questions asked were of a similar structure to those in the real exam.
I hope you find this blog post helpful in preparing for your 70-461 exam.
Last week Dovetail exhibited at the IoT World Conference held in the Dublin Convention Centre.
It was a really interesting event with over 200 speakers and 150 exhibitors. The startup area was particularly interesting with a wide variety of new businesses showing their wares. With my background in mechanical engineering I was particularly taken with this strain gauge built with nanoparticles.
At the Dovetail stand we demonstrated the system we developed for Novaerus, which drew a lot of attention.
Despite how this picture looks, we didn't actually have a Martin Wallace mannequin. This was the real article, I think he just froze up for a second :)
At the start of every project I place a brief but concerted focus on what to call the system under development.
Why is a good name important?
- It promotes clear communication between stakeholders, and clarity is a Dovetail core value. I worry when a generic term like “the system” is used in a meeting - inevitably somebody is left wondering “Which system exactly?”
- It gives the nascent software system its own identity. This helps stakeholders to engage with the project even though it may still be abstract to them. They can visualise the solution better when it has a name, leading to more creativity and thorough analysis.
So what makes a good name? Here are my suggestions:
- It should be unique rather than generic. If it stands out a little it helps give the new system its own personality.
- It should be a single word, so short that it never occurs to anyone to abbreviate it in speech or writing. This promotes consistent use by being the easiest way to refer to the new system.
- Its pronunciation should be unambiguous. This removes the fear of saying it "wrong", another barrier to universal adoption.
- Don't try to describe the project in its name. You will probably end up with something cumbersome. The name will also be prone to irrelevance as the project grows and evolves.
- The meaning of the word really doesn't matter, so don’t sweat about it too much. Of course it can be a nifty acronym or something related to the project, but it can also just be a word that sounds good. Like a child, the project will grow into its name, everyone will get used to it, and eventually you won't be able to imagine any other name sounding right.
- Don't worry about the permanence of the name. You’re just choosing something for internal use by stakeholders. If the system is launched to a wider audience you can give it a public-facing name at that time, and it will probably be better than anything you think up at this stage.
- Do get buy-in from key stakeholders. Your goal is universal adoption: people find this surprisingly easy when their boss loves the name!
Here are some good examples of actual Dovetail projects:
HARPS was a neat acronym we laboured over when the project started years ago, but nobody remembers what it means now. Hermes is a project for a sports body, so we named it after the Greek god associated with sport. Athena was a seemingly random suggestion by a client after I shared my guidelines above.
As for the last two: when we’re stuck we just pick a bird’s name. It works every time, showing how unimportant the actual word is!
For the last couple of months we've been working on a side-project here in Dovetail. Martin and Trevor wanted a tool to quickly draw/prototype database diagrams by typing. So, we're happy to announce QuickDBD! We decided to wrap it in a shiny design and make it a little product which we hope others will find useful as well. In time, if there is enough demand we'll expand the feature set. If you have any ideas or suggestions, please let us know on our roadmap Trello board.
In the process of making QuickDBD, a lot of cool, interesting technologies were used and no programming languages were harmed! We used AngularJS, TypeScript, JointJS (for diagram rendering - an awesome library!), Karma and Jasmine (for testing), Angular Material and SASS on the front-end, .NET Web API, xUnit and MS SQL on the back-end, and we automated our build-test-deploy pipeline with bower, gulp, TeamCity, Octopus Deploy and Azure. A very interesting journey!
We hope you like QuickDBD as much as we do. If you have any feedback, please let us know!
I'm jotting down some notable tech news we've been discussing internally (in our slack #techtalk channel) this week.
We use New Relic on a number of applications; it's a great tool for highlighting performance issues. Microsoft has always been somewhat in that game, but their new offering, built into Azure, is called "Application Insights". It looks to be a direct competitor to New Relic. It also has logging and a query engine to go with it, so it may be aiming at cloud logging providers (like Log Entries) too. https://azure.microsoft.com/en-us/documentation/articles/app-insights-overview/.
Trevor uses a mac (boo!), and we're a Microsoft development house. At times he struggles to find the right tools to work in a primarily windows environment, and he usually resorts to a virtual machine or RDP. We recently found this tool called Wagon (https://www.wagonhq.com/), and Trevor has been using it and enjoying it. Wagon is built on Electron, another tool we have been keeping an eye on lately. Fabrizio is especially enamored by it.
Apparently we care about API versioning. I'm not sure, but other people care about it more than me: Your API versioning is wrong, which is why I decided to do it 3 different wrong ways.
VHS won! But only barely. https://www.theguardian.com/technology/2015/nov/10/betamax-dead-long-live-vhs-sony-end-prodution, http://news.sky.com/story/remember-vcrs-production-to-end-as-sales-slump-10509632.
Lastly, John found this. Have we gone too far?
This week, Irish Rail launched the Online Payments facility for Fixed Payment Notices (which are penalties for fare evasion and other infringements).
The Dovetail-developed system allows passengers to pay a Fixed Payment Notice online. It is mobile-friendly, so customers can pay on their mobile, tablet, laptop or desktop computer.
The system is built using ASP.Net, C#, CSS and HTML5 and it is integrated with the Irish Rail Fixed Payment Notice Management system (a version of the Standard Fare Backoffice Management System which Dovetail previously developed for Dublin Bus).
Our work with Irish Rail, LUAS and Dublin Bus is all part of Dovetail's continued involvement with the transport sector.
The following article appeared in the February 2016 edition of Rail Brief, the Irish Rail staff magazine. You can view the PDF here.
John and Martin with the Irish Rail Team in Connolly Station.
A FINE NEW SYSTEM
In 2015 there were 9,606 Fixed Payment Notices issued. There was a 22% increase in the number of Fixed Payment Notices issued in 2014 compared with 2013, and this upward trend continues, putting more pressure on the system in use. As a source of revenue for us, it is critical that there is an intelligent information system to ensure detailed reporting and timely payment of fines.
Main Triggers for the New System
1. Two separate systems existed, one for DART and one for Intercity
The back office was using two disparate systems, Access and InfoPath, as the Intercity & Commuter (ICCN) and DART each had their own. This meant inputters were moving between systems with differing designs. These disparate systems continued when the RPU was centralised, meaning that the Head of Revenue Protection and the Revenue Protection & Prosecutions Manager had to interrogate each system separately and add the results together. Often they had to physically count original fines for statistics purposes, as the original system didn't allow for any meaningful interrogation. Another issue with the existing system was that there was no single view of a person: a person could have a fine on the DART database, but the ICCN database had no visibility of it.
2. Everything was manually typed
Prior to the new system, everything was manually typed; for example, there were no drop-down boxes with lists of stations, Revenue Protection Officers' names, train times or routes. This led to poor quality data, as typing and spelling errors could occur due to the high volumes to be input.
3. Inconsistent design between forms and databases
The fields on the screen and the form didn't match. As a result, it slowed down the speed of inputting, as everything on the screen had to be matched to the form field for input. This contributed to a growing backlog and, as a consequence, reminder notices were at times late going out to customers. This type of backlog can be very demotivating for an employee - no matter how hard the team worked, there seemed no end to it!
4. Databases were not built for high volume
There were over 38,000 records on the databases, which were not built for high volume, and as a result crashes often happened. Up to eight people could have been inputting at any one time and the input may not have updated correctly. The consequence was that a letter could go out to someone who had already paid a fine.
Leading the Change
Roger Tobin, Head of Revenue Protection, has been leading the change project with support from Dave Cannon, Revenue and Prosecutions Manager, and Shauna Fitzsimmons on the systems side. The back office team have also supported the change process. The team worked with David Bettles (Information Systems), Keith Faherty (Online Manager), Group IT and Customer First in specifying and clarifying the system requirements before Dovetail could commence their work.
Communications and Training
The team had been briefed on the full extent of the system change. These briefings were supported by the Customer First, People and Communications Lead, Linda Allen, and were made by Dave Cannon and Shauna Fitzsimmons. A training test system was set up by Dovetail to ensure all the team were comfortable with the system before it launched. They all found the system to be very straightforward and could really appreciate its benefits. Dovetail, as the systems supplier, facilitated the training for all involved. They also provided systems support to ensure the team were fully supported in the 'go live' and beyond. Brian Quinn, Business Process Lead, documented the new processes arising from the implementation of the new system. This was to ensure there was no ambiguity in the implementation and that the process in place was the optimal one.
Phase 2 Online Payment Facility
Work is currently ongoing in setting up an online payment facility, with a go-live expected in February 2016. Currently there are limitations on payment options, as a customer can only pay during office hours, Monday to Friday 9am to 5pm. There will be huge benefits for customers in being able to pay online at any time, as the back office team had received complaints from people who wanted to pay but couldn't get through. This will also mean a reduction in phone calls to the office, allowing employees to allocate their time to the key tasks of managing repeat offenders, analysing areas to target and managing files for maximum court prosecutions.
Phase 3 Customer First
Customer First is currently looking at electronic solutions to make the RPU more efficient. Currently, Revenue Protection Officers write out Fixed Payment Notices (FPNs) by hand; portable devices would mean real-time inputting, and there will be real benefits in their adoption.
Benefits of the New Dovetail System
One of the biggest benefits for the team is the removal of the backlog. All their hard work has significantly contributed to this. Other benefits include:
1. One single view of a ‘customer’
The new system can highlight fraudulent persons or repeat offenders, and is able to supply lists of fraud or repeat offenders across both previous systems. This allows for a more intelligent type of reporting and more successful prosecutions.
2. Better targeting of fare evasion
It allows the RPU team to more intelligently target times and services where fare evasion is above average. The system allows them to interrogate information by multiple fields, e.g. by station, by time, by ticket type, by day of the week and by any other field stored. The new system has all the information in one place, reducing the dependence on physical files.
3. One single system in place and customisation of screens
There is now one single system in place for all the back office team, capturing all Railway Undertaking fine data. All screens were customised for ease of use for the inputter. The new screens mirror the FPN form and follow the fields of the form as they appear on the page.
4. Template letters created for all scenarios
Template files for all types of letter have been supplied to the new system and can be generated automatically.
5. Preloaded lists and drop down boxes
The new system has all these lists preloaded, along with the actual timetable. It also has an address link with Google Maps, eliminating the need for freeform typing.
6. Appeals process standardised
The time spent on appeals has reduced, as the appeals process has been standardised and appeals are now done via email, with the attachment added to the system.
7. Flexible to change
The new system is more flexible to change. It allows the addition of new routes, times and officers, and allows the addition or amendment of any fields.
Sorry for the click bait title - I just wanted to share this interesting screenshot.
Below is a screenshot overview of a Dovetail tender document. I was interested to see how many images we use: 70 in this case, which is pretty typical of a Dovetail proposal. The pictures include such things as examples of previous work, suggested approaches for the project under discussion, UML diagrams and some of our corporate bona fides.
The images aid clear communication (one of our corporate values) and they also break the text up to make the document more approachable. And the proof is in the pudding - we won this particular tender :)
So if a picture really is worth a thousand words then this tender document contains 70,000 due to the images alone, and 17,523 that we wrote, giving a whopping total of 87,523!
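For the terminally curious, that back-of-the-envelope arithmetic can be sketched in a few lines of Python (all numbers are from the post itself):

```python
images = 70               # images in the tender document
words_per_picture = 1000  # "a picture is worth a thousand words"
written_words = 17523     # words we actually wrote

total = images * words_per_picture + written_words
print(total)  # 87523
```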
Here at Dovetail we love Team City and Visual Studio.
We recently updated our Team City configuration to allow projects to be built using Visual Studio 2015, C# 6, and to use the latest Nuget package manager.
In doing so, we discovered a very peculiar setting deep within Team City that caused one of our projects to break both on build and once deployed.
The Build Failures
After updating, we ran our build and the compiler threw an error saying that it could not find a specific version of a NuGet package. For example, our packages.config within Visual Studio specified that we use NuGet to install Newtonsoft.Json version 7. However, Team City reported that the project needed Newtonsoft.Json version 8.
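For context, packages.config is a small XML file that pins the exact version of each package a project uses. A minimal sketch (in Python, for illustration; the exact version number here is made up) shows how those pins can be read, which is the behaviour the problem setting bypasses:

```python
import xml.etree.ElementTree as ET

# A minimal packages.config; the package id matches the post, but the
# exact version and target framework are illustrative.
packages_config = """<packages>
  <package id="Newtonsoft.Json" version="7.0.1" targetFramework="net45" />
</packages>"""

# Read the pinned version for each package id.
root = ET.fromstring(packages_config)
pinned = {p.get("id"): p.get("version") for p in root.findall("package")}
print(pinned)  # {'Newtonsoft.Json': '7.0.1'}
```

A normal NuGet restore honours these pinned versions; the "update packages" setting described below ignores them and fetches the latest of everything.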
We made the decision to update all affected NuGet packages to the latest versions, pushed our project, and Team City built it successfully.
The Deploy Failure
Looking back at our Octopus Deploy package, we found that the jQuery file we were referencing and pushing to our repository was not there any more. However, we did see the latest version of the jQuery min file: our file was being removed and replaced with the latest jQuery min version.
The Update Package Setting
We soon found the setting buried deep inside the Team City "build steps" screens:
Within the NuGet Installer build step is a setting which, when turned on, updates all your packages. This sounds great in theory, but when you run into build and deploy issues it will cause headaches.
The text underneath states "Uses the NuGet update command to update all packages under solution. Package versions and constraints are taken from packages.config files". Whether this is a bug in Team City or not, this text seems very vague for an "Update Packages" function.
Be careful: when this checkbox is checked, Team City will not read the packages.config version numbers; instead it will download the latest version of every package.
Update: Team City have come back to us and they're going to update the explanatory text on this checkbox to make it clearer.