2016

Using the SQL geometry type to find shapes near or intersecting a lat/lng point

This short blog post provides two SQL stored procedures that use the SQL geometry data type to relate a lat/lng point to spatial shapes at the database level.

What this means is: you provide a lat/lng and the DB returns either all the shapes the point intersects or, in the other case, the shapes nearest to that point.

You could also use the geography type for this - the code should end up only slightly more complex - but we didn't have the need for it, so we used the geometry type.

We used these to identify company branches for a certain location on the map. The geometry data is stored in the SpatialData field (of geometry type), and we're returning BranchId and BranchName, but you should obviously adapt that to your needs.

-- Finds intersecting branches
-- Accepts lat and lng
CREATE PROC SP_GET_INTERSECTING_BRANCHES
  @lat FLOAT,
  @lng FLOAT
AS
BEGIN
  DECLARE @point GEOMETRY
  SET @point = GEOMETRY::Point(@lng, @lat, 4326)
  SELECT BranchId, BranchName FROM [dbo].[Branch]
  WHERE @point.STIntersects(SpatialData) = 1
END
GO


-- Finds the nearest branches
-- Accepts lat, lng and the amount of matching rows to return
CREATE PROC SP_GET_CLOSEST_BRANCHES
  @lat FLOAT,
  @lng FLOAT,
  @amount INTEGER
AS
BEGIN
  DECLARE @point GEOMETRY
  SET @point = GEOMETRY::Point(@lng, @lat, 4326)
  SELECT TOP (@amount) BranchId, BranchName, @point.STDistance(SpatialData) AS Distance FROM [dbo].[Branch]
  ORDER BY @point.STDistance(SpatialData)
END
GO
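For reference, here's a quick sketch of how you'd call the procedures (the coordinates below are just illustrative):

```sql
-- Example calls; the coordinates are illustrative (roughly Dublin city centre)
EXEC SP_GET_INTERSECTING_BRANCHES @lat = 53.3498, @lng = -6.2603;
EXEC SP_GET_CLOSEST_BRANCHES @lat = 53.3498, @lng = -6.2603, @amount = 5;
```

Note that GEOMETRY::Point(x, y, SRID) takes the longitude (x) first, which is why the procedures pass @lng before @lat. Also, STDistance on the geometry type works on the planar coordinates, so for SRID 4326 data the Distance column is in degrees rather than metres - fine for ordering by proximity, but switch to geography if you need real-world distances.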

Hope it helps. Cheers!


Dovetailers earn Microsoft Certifications

Congratulations are in order as the following Dovetailers passed their Microsoft Certification exams.

Tomás and Murilo passed 70-461: Querying Microsoft SQL Server

Fabrizio and Kit passed 70-483: Programming in C#

John and Mossy passed 70-532: Developing Microsoft Azure Solutions

Dovetail Values - Clarity, Partnership, Craftsmanship, Commerciality, Progression.

Progression is one of Dovetail's core values and we promote constant learning and improvement. In the fast-moving technical sector, no one can afford to sit still and we are already planning next year's Progression Goals. 

 


Studying for Microsoft Exam 70-461

This month some of my fellow Dovetailers and I took Microsoft Certification exams.

I took the exam 70-461: Querying Microsoft SQL Server 2012/2014.

With that in mind, I thought I'd share some thoughts on my exam preparation and on some of the resources I used. I must say that I was not very proficient with SQL before studying for this exam, but with the right amount of preparation and study I passed and earned my certificate.

Below is a list of what I did while preparing for the exam:

  • Study from a variety of resources: books, video courses and practice exams.
  • Find out which kind of study material suits you best.
  • Study over a period of about 4 months.
  • Do at least 30 minutes of study every day.
  • Try to study in the morning; I found it hard to study in the evenings.
  • Take practice exams.
  • Study the exam objectives.
  • Once you feel ready, take the exam.

Below is a list of the study material I used, along with my thoughts on each.

Books

The book I studied with was "Training Kit (Exam 70-461): Querying Microsoft SQL Server 2012" by Dejan Sarka, Itzik Ben-Gan, and Ron Talmage.

This is the official book for the exam. It is detailed and covers all the exam objectives. It also goes beyond the exam objectives, all of which was interesting and will help prepare me for the next SQL exam. The book runs to around 700 pages.

The book comes with a free practice test, but this was not as good as the practice exams provided by Measure Up and Transcender.

Video Courses (70-461)

Pluralsight: The course was a bit too short and the instructor does not go through the topics thoroughly. I felt it works best as an introduction to the exam.

Joes 2 Pros: Very good material; the instructor goes into every topic in detail and provides labs for practice as well. The website is a bit clunky but the videos are good.

CBT Nuggets: An alternative to Joes 2 Pros. It does not go into the same depth, but it does cover a lot of the topics and provides a lab for you to practice with. The free trial only lasts 7 days.

YouTube: There is a SQL Server tutorial playlist which covers more topics than what's on the 70-461, but it's a good free resource.

Practice Exams

Measure Up: The interface did not properly format the SQL, so it could be quite difficult to read. Apart from that, the exams were useful.

Transcender: I really liked Transcender. The interface is good and they also provide flashcards, which were helpful when trying to understand an exam topic. I found it a really useful study tool. The exams were as good as Measure Up's, and the questions asked were similar in structure to those in the real exam.

I hope you find this blog post helpful in preparing for your 70-461 exam.

Good Luck!

 


Worldpay: remote host closed connection during handshake

Around 4pm yesterday one of our clients began receiving error notifications from Worldpay.  

The message was:

Our systems have detected that your callback has failed.

This callback failure means we were unable to pass information
to your server about the following transaction:

Transaction ID: 1111111111
Cart ID: 1111111111111
Installation ID: 1111111

Error reported: Callback to: https://example.com: failed CAUSED BY Remote host closed connection during handshake
Server Reference: 11111-11-1111:callbackFailureEmail-11111:11111111-11-11

Also, if you usually return a response page for us to display to the Shopper within the time allowed (1 minute), this will not have been displayed.

Googling the error "Remote host closed connection during handshake" suggests that the message relates to how the requesting service handles SSL certificates.

We hadn't changed the client's SSL cert for over a year.  We had not deployed any recent software updates for the client, and we could see that multiple other payment processors, used by this system, were connecting to our server without issue.  There were no errors in our server's Event Log or in the app's Logentries records.

We contacted Worldpay support, who were very helpful.  They told us that SSL certs are cached on their systems, and can be cached for a long time (i.e. over a year).   They also said their systems can't handle SNI.  

So what seems to have happened is that Worldpay's certificate cache was refreshed yesterday around 4pm. Our client's year-old certificate, which is served using SNI, was loaded by Worldpay, and all subsequent connections from Worldpay failed.

Options to fix this include (a) getting a new non-SNI certificate and (b) changing the callback URL to use HTTP.

Hopefully this post will assist if someone else experiences this issue.


Dovetail at the IoT World Conference

Last week Dovetail exhibited at the IoT World Conference held in the Dublin Convention Centre.

It was a really interesting event with over 200 speakers and 150 exhibitors. The startup area was particularly interesting, with a wide variety of new businesses showing their wares. With my background in mechanical engineering, I was particularly taken with this strain gauge built with nanoparticles.

 

nano strain gauge

 

At the Dovetail stand we demonstrated the system we developed for Novaerus, which drew a lot of attention.

Despite how this picture looks, we didn't actually have a Martin Wallace mannequin. This was the genuine article; I think he just froze up for a second :)


Here's what's in a name

Project names

 

At the start of every project I place a brief but concerted focus on what to call the system under development.

Why is a good name important?

  1. It promotes clear communication between stakeholders, and clarity is a Dovetail core value. I worry when a generic term like “the system” is used in a meeting - inevitably somebody is left wondering “Which system exactly?”
  2. It gives the nascent software system its own identity. This helps stakeholders to engage with the project even though it may still be abstract to them. They can visualise the solution better when it has a name, leading to more creativity and thorough analysis.

So what makes a good name? Here are my suggestions:

  • It should be unique rather than generic. If it stands out a little it helps give the new system its own personality.
  • It should be a single word, so short that it never occurs to anyone to abbreviate it in speech or writing. This promotes consistent use by being the easiest way to refer to the new system.
  • Its pronunciation should be unambiguous. This removes the fear of saying it "wrong", another barrier to universal adoption.
  • Don't try to describe the project in its name. You will probably end up with something cumbersome. The name will also be prone to irrelevance as the project grows and evolves.
  • The meaning of the word really doesn't matter, so don’t sweat about it too much. Of course it can be a nifty acronym or something related to the project, but it can also just be a word that sounds good. Like a child, the project will grow into its name, everyone will get used to it, and eventually you won't be able to imagine any other name sounding right.
  • Don't worry about the permanence of the name. You’re just choosing something for internal use by stakeholders. If the system is launched to a wider audience you can give it a public-facing name at that time, and it will probably be better than anything you think up at this stage.
  • Do get buy-in from key stakeholders. Your goal is universal adoption: people find this surprisingly easy when their boss loves the name!

Here are some good examples of actual Dovetail projects:

  • HARPS
  • Hermes
  • Athena
  • Seagull
  • Osprey

HARPS was a neat acronym we laboured over when the project started years ago, but nobody remembers what it means now. Hermes is a project for a sports body, so we named it after the Greek god associated with sport. Athena was a seemingly random suggestion by a client after I shared my guidelines above.

As for the last two: when we’re stuck we just pick a bird’s name. It works every time, showing how unimportant the actual word is!

 


Custom JavaScript parser vs Jison - Our experience

 

We recently announced QuickDBD, a simple product we made for drawing database diagrams by typing. If you take a look at the QuickDBD app you'll see it converts source code into a diagram. What we needed to make this work was obviously a parser.

After a bit of research on how to approach this problem, we knew that we would have to either use an existing parser generator or build a custom parser ourselves. After narrowing the choices down, PEG.js and Jison emerged as the two most popular JavaScript parser generators at the moment. Of the two, Jison seemed to have a slightly bigger community - a few more GitHub followers, more StackOverflow questions and slightly better documentation. It seemed like the better bet, so we decided to spend some time playing with it and trying to make it parse the QuickDBD syntax.

We managed to get it parsing the first version of our syntax from a few months back pretty fast. But since the language we came up with for QuickDBD is closer to a data description language than to what most people would consider a programming language, we started hitting bumps in the road pretty quickly as well. We soon ran into multiple edge cases we couldn't handle with Jison alone, which meant overriding Jison behaviour and injecting custom bits of JavaScript into it.

That kind of felt pretty messy so we talked a bit about it and made a decision to go with our own custom JavaScript parser for several reasons:

  • we would have complete control over how the parser works
  • everyone here is very well versed in JS
  • Jison was new to everyone and there is a bit of a learning curve in being able to do stuff with it efficiently
  • it felt as if we were fighting Jison to make it do something it wasn't designed for, rather than it being a great tool that would empower us to do things better and faster
  • a couple of times it was pretty hard to find information on how to do something with Jison, so we had to fall back to reading its source code to figure things out
  • it didn't feel like the right tool for the job

We however did pick up some ideas from trying it out and I believe it made the custom parser we came up with that much better. We wrote a parser that's fairly small, fast and easy to read, expand and fix - which is ultimately what we needed.
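To give a flavour of the approach (the syntax and names here are made up for illustration, not QuickDBD's actual grammar), a hand-rolled line-based parser can be as simple as:

```javascript
// Minimal sketch of a hand-rolled, line-based schema parser.
// Hypothetical syntax: an unindented line names a table, and each
// indented line beneath it adds a "name type" column to that table.
function parseSchema(source) {
  const tables = [];
  let current = null;
  for (const rawLine of source.split('\n')) {
    const line = rawLine.trimEnd();
    if (!line.trim()) continue;              // skip blank lines
    if (!/^\s/.test(line)) {                 // unindented: start a new table
      current = { name: line.trim(), columns: [] };
      tables.push(current);
    } else if (current) {                    // indented: column on current table
      const [name, type] = line.trim().split(/\s+/);
      current.columns.push({ name, type: type || 'unknown' });
    }
  }
  return tables;
}
```

A real parser obviously needs error reporting with line numbers and far more validation, but this shape - split the input into lines, classify each line, and build up a result object - is essentially the structure we ended up with.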

I still think Jison is a great tool, but it just wasn't a very good fit for our needs. If you're considering using it, perhaps try it out on a smaller subset of your language's features first and see how you like it before committing. You can always go back to writing something custom after you've tried it out.

I also recommend you read this very good parser generators vs custom parsers SO thread with pros and cons for both sides.

Hope this helped!


Hello QuickDBD!

Quick Database Diagrams

For the last couple of months we've been working on a side-project here in Dovetail. Martin and Trevor wanted a tool to quickly draw/prototype database diagrams by typing. So, we're happy to announce QuickDBD! We decided to wrap it in a shiny design and make it a little product which we hope others will find useful as well. In time, if there is enough demand we'll expand the feature set. If you have any ideas or suggestions, please let us know on our roadmap Trello board.

In the process of making QuickDBD, a lot of cool, interesting technologies were used and no programming languages were harmed! We used things such as AngularJS, TypeScript, JointJS (for diagram rendering - awesome library!), Karma and Jasmine (for testing), Angular Material and SASS on the front-end, .NET Web API, xUnit and MS SQL on the back-end, and we automated our build-test-deploy pipeline with bower, gulp, TeamCity, Octopus Deploy and Azure. A very interesting journey!

We hope you like QuickDBD as much as we do. If you have any feedback, please let us know!


Integrating Karma code coverage with TeamCity

To unit test our Angular apps we use the Karma test runner and the Jasmine testing framework. Locally we run these tests using a gulp script that takes care of the whole app building process. To ensure nothing is broken before publishing the app to production, we run our tests during the continuous integration process using TeamCity.

This post assumes you already have a gulp testing process in place and won't cover that part. It also assumes a working TeamCity setup. It will only help you integrate Karma with TeamCity as an additional build step, so that you end up with something like this in your TeamCity:

Number of passed/failed tests:

The code coverage tab:

There are a few requirements before we can make this work. To help you better understand our setup, here is a sample project structure that we have:

The first thing to do is ensure you have the following npm packages installed and that they are saved in your package.json file:

"karma": "^0.13.22",
"karma-chrome-launcher": "^1.0.1",
"karma-coverage": "^1.1.1",
"karma-jasmine": "^1.0.2",
"karma-phantomjs-launcher": "^1.0.0",
"karma-teamcity-reporter": "^1.0.0",

Next ensure that you have the following set up in your karma.conf.js:

  • "coverage" and "teamcity" in the reporters list
  • "PhantomJS" in your browsers list
  • singleRun set to true
  • our coverageReporter configuration looks like this (this part is pretty important):
coverageReporter: {
  dir: 'coverage',
  reporters: [
    { type: 'html', subdir: 'html' }
  ]
}
  • set the preprocessors configuration to something like this:
'path/to/code/you/want/to/tests/*': ["coverage"]
  • NOTE: we do not have the plugins property set up
  • the rest of the options are pretty much standard - add/remove what you need
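Pulling the bullet points above together, a karma.conf.js skeleton might look like this (the file paths and patterns are placeholders for your own project):

```javascript
// Illustrative karma.conf.js combining the settings described above.
// "files" and preprocessor paths are placeholders - adjust to your project.
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine'],
    files: [
      'app/**/*.js',
      'tests/**/*.spec.js'
    ],
    reporters: ['coverage', 'teamcity'],
    browsers: ['PhantomJS'],
    singleRun: true,
    coverageReporter: {
      dir: 'coverage',
      reporters: [
        { type: 'html', subdir: 'html' }
      ]
    },
    preprocessors: {
      'app/**/*.js': ['coverage']
    }
    // note: no "plugins" property - Karma picks up installed karma-* packages automatically
  });
};
```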

Now that this is all set up, go to your TeamCity. This is essentially what our client-side build process looks like:

The step that is the main interest of this post is the "Run Karma Tests" step. Here is how we have it set up (create a Command Line step):

This is a slightly modified version of what the Karma documentation recommends. The difference is that we force the use of the local Karma module and specify the configuration as a command line param, like this:

node node_modules/karma/bin/karma start karma.conf.js

The last piece of the puzzle is setting up the coverage artifact. Go to the General Configuration Settings of your project in TeamCity and add an additional coverage artifact path (the second line):

The important bit (it's simply where our coverage html files are located):

Project.WebApp/coverage/html/** => coverage.zip

Go back and see how we have the coverage/html folder in our project structure; it is set up by the coverageReporter property in karma.conf.js. This artifact path will take all the files from the coverage/html folder and compress them into a coverage.zip archive. After the build process finishes, TeamCity will (if it is able to find the coverage.zip archive inside the artifacts folder) automatically import it as code coverage for the project, and you will be able to navigate to the "Code Coverage" tab for that specific build. If any tests fail, this will also fail the whole step, stop the build and prevent it from ending up in production.

Hope this helps. Cheers! :)


Visual Studio 2015 real-time CSS editing

I was working on some updates to MenuCal this week. This morning, while completing some CSS styling on a new form, I discovered that the CSS was being updated in real time in Chrome as I made changes in Visual Studio. This is a huge improvement to my workflow, as I like to style and preview as I go. I was able to drag my CSS editor over to another screen and work away while the styles in Chrome updated instantly. No more hitting save and refreshing the browser. Thank you, Visual Studio 2015! I've seen other tools do this for quite some time, but it's nice to see it in the IDE I use every day.

But let's not get too excited. I mean, who uses plain old CSS anymore? We've been using SASS on new projects and unfortunately this lovely little feature is not available out of the box for SASS. I'll take a look around at VS plugins that might do it, and report back if I find an elegant solution.

Update 2nd August, 2016: I tested out one of our projects with SCSS and Sassy Studio. While it's not as elegant as the live CSS preview, it does detect the CSS changes after they are compiled, and the browser updates the CSS.

