Saturday, 28 May 2016

Getting dotCover to report in TeamCity via command line parameters

This is a series of articles about one of my favourite subjects: Continuous Integration and Continuous Delivery. In this article I will focus on the development step, which includes the build process, the packaging of artifacts, the testing and, finally, the reporting of code coverage, all from TeamCity.

I will base this solution on one of my .NET projects, as the integration there is tight and easy to set up and use. I faced some challenges when setting up NUnit 3.x and dotCover in TeamCity, and this post tries to alleviate that pain with a detailed walkthrough of the setup and configuration.

In further articles I will go into detail on the remaining items that make up the continuous delivery pipeline.

One of the most important aspects of any solution is that, if you want to get away from spaghetti code, you need proper controls in place such as unit tests, integration tests, etc. This for me is a must-have in any project. If you can't answer the question "What's your % of code coverage?" with something other than 0, then you are in for trouble, as you are probably maintaining something close to spaghetti code. This is not about your ability as a coder; it only relates to the maintainability and scalability of the code.

I will use one of my projects from GitHub for this article:
This project contains a basic library with my MapReduce approach, a console application that runs the library and a unit test project that tries to cover as much code as possible.

Here is my TeamCity project for this GitHub solution with the artifacts.
Artifacts:
As you can see, the project gets built automatically via TeamCity and the configuration of the project is as follows:

1. VCS Root:

2. Build Step for Visual Studio:

3. Generation of Artifacts:

As you can see, setting up the build process is quite simple. Just create your project in TeamCity, point your VCS root to GitHub, create a build step for Visual Studio (pointing it to the .sln file) and add the location of the binaries to be picked up by TeamCity and published as artifacts.

So far so good, right? Now we need to go one step further. My project has a console app and a unit test project, and I need to build both before running the tests. To achieve this I will piggyback on MSBuild.

Here is the structure of my project:
Now I need to build the CountingWordsConsole and MapReduce.Tests projects together before running the tests. Here is my MSBuild file to build my solution:
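In essence it just builds the two project files and drops the output into the shared ReleaseFolder property. A minimal sketch of such a build script (the project file names, extensions and the local fallback folder are assumptions based on the solution layout) looks like this:

<?xml version="1.0" encoding="utf-8"?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build" ToolsVersion="14.0">
  <PropertyGroup>
    <!-- ReleaseFolder is passed in by TeamCity as the system.ReleaseFolder parameter;
         the fallback value below only applies when building locally. -->
    <ReleaseFolder Condition="'$(ReleaseFolder)' == ''">$(MSBuildProjectDirectory)\Release</ReleaseFolder>
  </PropertyGroup>
  <ItemGroup>
    <Projects Include="CountingWordsConsole\CountingWordsConsole.csproj" />
    <Projects Include="MapReduce.Tests\MapReduce.Tests.csproj" />
  </ItemGroup>
  <Target Name="Build">
    <!-- Build both projects and drop the binaries in the shared release folder. -->
    <MSBuild Projects="@(Projects)" Properties="Configuration=Release;OutDir=$(ReleaseFolder)\" />
  </Target>
</Project>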

And here is the configuration in TeamCity:
I got rid of the previous Visual Studio build step and this time I'm creating an MSBuild step that targets all my project files.

You will also notice that in the MSBuild file there is a property shared between TeamCity and MSBuild called ReleaseFolder. This property is set up in TeamCity with the folder where my projects will be built. It will help us later to identify where our binaries are and to pick them up in TeamCity as artifacts.
Notice that in the MSBuild file this property is referenced as $(ReleaseFolder), whereas in TeamCity it is %system.ReleaseFolder%.

Now that we have the project up and running via MsBuild, it's time to set up NUnit and dotCover.

Setting up NUnit and dotCover

1. Get the latest NUnit.
You can get the latest NUnit 3.2.1 from here. Download the .msi file and run a typical installation. This will leave the files in the following folder:

  • C:\Program Files (x86)\NUnit.org\nunit-console\
Add a new build step in TeamCity and configure it to run NUnit 3.x. As soon as you run the project you will get the following error:

This version of NUnit 3 is not a release version and is not compatible with TeamCity. Please update NUnit to a newer release version.


I even tried NUnit 3.0 RC and NUnit 3.0, but the error never went away. How do we fix this? Via the command line.

To make things a bit more exciting, I will configure dotCover directly, as it will be the one launching NUnit. dotCover ships with TeamCity by default ("C:\TeamCity\buildAgent\tools\dotCover") and you only need to add some configuration files to your project to be able to run dotCover from the command line.

The other aspect to cover later on is how to publish the dotCover report in TeamCity, but I will get to it.

2. Using dotCover
Using dotCover is really simple: just call dotCover with the cover argument and the .xml config file and that's it. Through the configuration file, dotCover will call NUnit and report the tests back to TeamCity. Here is my configuration file:
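A minimal sketch of such a coverage.xml, assuming the default NUnit 3 console path and the test assembly location used above, would be something like this (the --teamcity switch makes NUnit emit its own test service messages):

<?xml version="1.0" encoding="utf-8"?>
<!-- Run with: dotCover.exe cover coverage.xml (paths below are assumptions; adjust to your layout). -->
<CoverageParams>
  <TargetExecutable>C:\Program Files (x86)\NUnit.org\nunit-console\nunit3-console.exe</TargetExecutable>
  <TargetArguments>MapReduce.Tests.dll --teamcity</TargetArguments>
  <TargetWorkingDir>C:\MapReduce\MapReduce.Tests</TargetWorkingDir>
  <Output>C:\MapReduce\MapReduce.Tests\output.dcvr</Output>
</CoverageParams>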

Now configure the build step to use dotCover:
Once you run this step, you will see that TeamCity reports the tests back automatically:
Now we are almost done. The last part is the tricky one and the answer to the title of this article, "Getting dotCover to report in TeamCity via command line parameters". If you check the coverage.xml file, you will see that it produces a report called output.dcvr, and now we need to tell TeamCity that the file is there to be picked up.

3. Using service messages.
The common way to report things back to TeamCity is via service messages. To do this, you just need to write this message in the command line:

##teamcity[importData type='dotNetCoverage' tool='dotcover' path='C:\MapReduce\MapReduce.Tests\output.dcvr']

So what I did was build a lightweight console application that just writes that message to the standard output:
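The language doesn't really matter here: TeamCity scans the output of every build step for service messages, so anything that prints that line to stdout works. A minimal sketch in Delphi, with the report path hard-coded, could be:

program ReportCoverage;
{$APPTYPE CONSOLE}
begin
  // TeamCity parses stdout for service messages; this one tells it to import the dotCover report.
  Writeln('##teamcity[importData type=''dotNetCoverage'' tool=''dotcover'' ' +
    'path=''C:\MapReduce\MapReduce.Tests\output.dcvr'']');
end.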

Now place the executable in the root of your project so it can be called from TeamCity. Add a third build step in your project and use the following arguments:
Run the project in TeamCity and voilà!:

Now you can see the coverage of my unit tests against my source code (>75%) and inspect the report further through TeamCity:

I can see that most of my classes are covered at 100%, which gives me enough confidence to keep modifying the project, knowing that if I break something the tests will tell me straight away.

I hope you find this useful. It's quite a long and tedious post, and it assumes you already know your way around TeamCity.
Jordi

Monday, 23 May 2016

Deploying a PARSE server to Heroku

Heroku is a PaaS (Platform as a Service) that enables developers to build and run applications entirely in the cloud. If you followed my previous post about PARSE server, you will see that we are just one step away from moving things from a self-hosted environment to the cloud. Heroku offers a free tier with 1 web and 1 worker (2 dynos: smart, lightweight containers), but they can't run 24/7 (they must sleep 6 hours in every 24-hour period). For any additional dyno you'll have to pay $7/month (there are also some considerations in terms of memory that can increase your final bill).

Deploy PARSE server to Heroku.
There are two alternatives to deploy PARSE server to Heroku. The first one is the easy one: we just press the deploy button on the PARSE server example page, which leads us to the Heroku page with the details we need to set up our system. To create a deploy button for your own repo, follow this tutorial.

The second one involves a bit more configuration and I would definitely recommend not pursuing it. I spent an afternoon configuring git and Heroku correctly just to achieve the same result as pressing a button.

Once you press the deploy button from PARSE server example, you'll be prompted with the following details:

Add your AppName and set the runtime selection. You will see that an additional add-on is automatically added to the app for a MongoDB database.


To finalise this step, the system will ask you to add your credit card details. There is no charge here, as the deployment is free and the MongoDB sandbox is also free, but this is one of the requirements.

Now the app is deployed and it will appear as active under Heroku:


It's now time to test the application. You can use curl, JavaScript or any other language to test the connectivity with Parse. In my case I will use Delphi, as it is my preferred language and I already have the libraries for it.

Here is my code:
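Roughly, it boils down to a single authenticated POST against the classes endpoint. A stripped-down sketch using TIdHTTP (the Heroku host name, app id and JSON payload are placeholders; error handling omitted) looks like this:

program TestParseOnHeroku;

{$APPTYPE CONSOLE}

uses
  System.Classes, System.SysUtils, IdHTTP, IdSSLOpenSSL;

var
  Http: TIdHTTP;
  Ssl: TIdSSLIOHandlerSocketOpenSSL;
  Body: TStringStream;
begin
  Http := TIdHTTP.Create(nil);
  Ssl := TIdSSLIOHandlerSocketOpenSSL.Create(nil);
  // Any JSON payload will do for the test; deviceName is just an example field.
  Body := TStringStream.Create('{"deviceName":"test-device"}', TEncoding.UTF8);
  try
    Http.IOHandler := Ssl;
    Http.Request.ContentType := 'application/json';
    Http.Request.CustomHeaders.Values['X-Parse-Application-Id'] := 'yourAppId';
    // POST against the classes endpoint of the Parse server mounted under /parse on Heroku.
    Writeln(Http.Post('https://your-app-name.herokuapp.com/parse/classes/Instances', Body));
  finally
    Body.Free;
    Ssl.Free;
    Http.Free;
  end;
end.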

As you can see, this will call the remote instance of my Parse server and add a new item under the Instances class.

To check that everything has worked correctly, I can go to the remote MongoDB instance and check out the results:

And the records that have been inserted:

Now I can happily connect my devices to my remote Parse server and remove the old dependencies on Parse.com.

The second option to produce the same results involves using the Heroku Toolbelt and the MongoDB add-on. For this, get the latest Heroku Toolbelt from here. I'm testing everything from a Win7 machine, so I downloaded the installer for Windows. Then follow the steps from Heroku to deploy the application through the command line via heroku and git, and you'll achieve the same results. Remember to configure your SSH key before pushing anything through git.
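The rough sequence of commands, assuming the free mongolab sandbox add-on and a placeholder app name, looks something like this:

heroku login
git clone https://github.com/ParsePlatform/parse-server-example.git
cd parse-server-example
heroku create your-app-name
heroku addons:create mongolab:sandbox
git push heroku master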

In conclusion, PaaS will shape the future of DevOps. Companies will rely more on cloud-based platforms for their operations, as they will have fewer worries in terms of load balancing, VPNs, clustering, key rotation and complex security policies. DevOps tools have matured a lot, and with this landscape constantly changing you can learn some valuable skills along the way.

There are also some other deployment solutions that I would like to try, such as Docker, Chef, Puppet, Ansible and Capistrano. Hopefully I will have the time to test them soon!

Sunday, 22 May 2016

Creating your own self-hosted PARSE server

As you might be aware, the PARSE service is shutting down at the beginning of 2017. It was a really great solution to quickly spin up the infrastructure needed to build an app. I still have some of my apps relying on that infrastructure and I think it's time to move to my own self-hosted PARSE server (for local development). Here you will find a quick guide through the initial configuration required to have your PARSE server up and running in no time.

PARSE is actually really easy to set up. You just need a NoSQL DB (MongoDB in this case) and the latest source code from GitHub to create your own self-hosted PARSE instance. My sandbox is a Windows 2012 R2 machine and I will guide you through the required steps on it.

The prerequisites are as follows:

  • Node 4.3
  • MongoDB version 2.6.x or 3.0.x
  • Python 2.x (for Windows users, 2.7.1 is the required version)

Installing MongoDB.
First you'll need MongoDB. Download the latest version of MongoDB from here. In my case I installed version 3.2.6 for Windows Server 2008 R2 and later with SSL support.

Install the complete version and, once finished, go to the MongoDB folder and run the mongod.exe command:
Notice that I had to run the command twice, as the folder that MongoDB needs (C:\data\db) was not there. I created the folder manually and then ran the executable again. Now MongoDB is up and running, waiting for connections on port 27017.
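In short, getting it going boils down to the following (the install path assumes a default MongoDB 3.2 setup):

rem Create the data directory MongoDB expects, then start the server.
md C:\data\db
"C:\Program Files\MongoDB\Server\3.2\bin\mongod.exe"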

Installing Node.js.
The next step is to download and install Node.js. You can get the latest version here. I will install version 4.4.4 LTS, which is the version recommended for most users. Download the installer and keep clicking Next until it is completely installed on your machine.

Installing Python.
The required version for Windows is 2.7.1. In my case I will install 2.7.11, as it's the version recommended by Python. You can get the latest installer from here. Once the setup is completed you will have to reboot your machine to make sure the configuration is updated.

Downloading PARSE server example.
Head to the PARSE server source code on GitHub and download the PARSE server example from here. Unzip the file under C:\Parse, for example. Once everything is in place, open the Node.js command prompt with admin rights and run npm install in the C:\Parse folder:


During the installation I hit a small bump with the following error message:
MSBUILD : error MSB4132: The tools version "2.0" is unrecognized. Available tools versions are "14.0", "4.0".

To overcome this issue, just install the MSBuild Tools from VS2013. This can be found here. Now we are ready to start our PARSE server.

Starting PARSE server.
Remember to start MongoDB first, as the server was rebooted after installing Python. MongoDB should show that it's waiting for connections. Here are some logs when the system is up and running:

Configure PARSE:
Open the file index.js under C:\Parse and edit the appId and masterKey settings.

In my case, I set the appId to 'thunderParse' and the masterKey to a random base64 string.
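The relevant part of index.js then looks roughly like this (the masterKey below is obviously a placeholder, and only the options touched here are shown):

// Excerpt from index.js in parse-server-example.
var api = new ParseServer({
  databaseURI: process.env.DATABASE_URI || 'mongodb://localhost:27017/dev',
  cloud: process.env.CLOUD_CODE_MAIN || __dirname + '/cloud/main.js',
  appId: process.env.APP_ID || 'thunderParse',
  masterKey: process.env.MASTER_KEY || 'replace-me-with-a-long-base64-string',
  serverURL: process.env.SERVER_URL || 'http://localhost:1337/parse'
});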

Run PARSE server via npm:
Once configured, run npm start from the same command line.

Now Parse is up and running and ready for requests.

Configuring and Starting PARSE Dashboard.
Now that our PARSE server is running, we can configure and start the PARSE dashboard, which is a web-based application to manage our PARSE apps.

To get the PARSE dashboard, you only need to run the following npm commands:
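Something along these lines does the trick (the master key and server URL must match what you configured in index.js):

npm install -g parse-dashboard
parse-dashboard --appId thunderParse --masterKey yourMasterKey --serverURL "http://localhost:1337/parse" --appName thunderParse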

Now we can go to localhost:4040/apps and we will see our application described there:

Testing our self-hosted PARSE server.
Now it's time to test our self-hosted PARSE server. To test it, we can use curl or any other mechanism we like. I've used curl and Delphi, as I already had some libraries prepared for it. Here is the command with curl:
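A test call along these lines creates one object (the class name and field are just examples, and the server is assumed to be on the default port 1337):

curl -X POST ^
  -H "X-Parse-Application-Id: thunderParse" ^
  -H "Content-Type: application/json" ^
  -d "{\"deviceName\":\"test\"}" ^
  http://localhost:1337/parse/classes/Instances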
And here with Delphi:

And here are the results of the App writing to Parse server:

The next step is to bring this online and host it somewhere like Heroku, AWS, Azure or Scalingo.

I hope you find this useful.
Jordi

Saturday, 19 March 2016

Testing new Windows 10 components of Delphi 10 Seattle

It's been a while since I got my Delphi 10 Seattle and I really love its new Windows 10 components and styles. I have already played with it and I can say that these new components are really slick and will make your apps really stand out. I've tried the TSplitView and the ActivityIndicator, which are great for full Windows 10 integration.

Here you can see some screenshots of one of the applications I have created with it and the source code is available on GitHub:

Initial Screen:

Notice the home button that will control the split view.

Showing the controls:

Controls are hidden/displayed when the home button is pressed, and each of those actions selects a page on the TPageControl component. I'm still looking for a solution where I don't have to show the tabs on the TPageControl, but I'll get to it later on. I think that, as of now, I have a pretty interesting solution that runs fine on my Surface under Windows 10 and it totally looks awesome.
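Toggling the pane itself is a one-liner on the split view. A sketch of the home button's OnClick handler (the form and control names are assumptions) could be:

procedure TMainForm.HomeButtonClick(Sender: TObject);
begin
  // TSplitView exposes Opened plus Open/Close, so the hamburger button just toggles it.
  if SplitView1.Opened then
    SplitView1.Close
  else
    SplitView1.Open;
end;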

Animation Controls:

I've just added one of the animation controls with some simple processing. There is nothing fancy here; just updating the progress regularly will do. If I wanted to make it better I would put the work in a thread for smoother refreshing.
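Usage is as simple as switching the indicator's Animate property around the work being done; a rough sketch (the work routine and control names are hypothetical):

procedure TMainForm.StartButtonClick(Sender: TObject);
begin
  // Spin the indicator while the work runs; for smoother refreshing the work should live in a thread.
  ActivityIndicator1.Animate := True;
  try
    DoSomeLongRunningWork;  // hypothetical processing routine
  finally
    ActivityIndicator1.Animate := False;
  end;
end;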

I also tried the TToggleSwitch, but it was flickering at runtime, so I will report the problem to Embarcadero later on. The new style works really well, but I found that SHBrowseForFolder does not render it correctly:


The source code of this little utility can be found here:

Executables can be downloaded here:

This small application will allow you to:
  • Perform a search on a particular folder and file extension.
  • Replace the content of each file with some other content (really handy when you need to replace a variable in source code files, for example).
  • Delete the files.
  • Search for empty folders and delete them (really handy when your SVN complains about checking in empty folders).

Let me know if you have any questions.

Jordi


Sunday, 20 December 2015

Sending REST API messages with Delphi to Parse.com

Parse.com is one of the most interesting cloud-based solutions I have found so far, and, most importantly, basic accounts are free (up to 30 requests/second; after that is when you have to pay). Using Parse.com I can keep track of my applications through the Analytics feature. This is an easy way of knowing how many installs you have running, and it also provides cloud storage to add some more details, etc. I'm using this module for my project "Flickr Photo Analytics", as I currently have over a thousand downloads but I don't have a clue whether the app is being used or not. This is where the Analytics feature of Parse.com comes in really handy.

Embarcadero has its own AppAnalytics, but I just wanted something that I could handle on my own with an easy setup. I have been using Parse for a while for a few mobile applications and now I've decided to extend it to my Windows desktop applications.

The idea is very simple. Every time the application is used, it sends a notification to Parse.com via the REST API. Parse.com provides SDKs for different languages, e.g. Java, JavaScript, .NET, etc., but unfortunately not for Delphi. That is why I decided to write about it, as I haven't seen anything on the Internet and it could be interesting for many users out there.

To use it in Delphi, we can call the REST API via the TIdHTTP component. On their website they provide the syntax using curl, and here is a working translation for Delphi:

Curl:
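From the Parse REST documentation, tracking a custom event (AppOpened here) boils down to a POST like this (the keys are placeholders):

curl -X POST ^
  -H "X-Parse-Application-Id: yourAppId" ^
  -H "X-Parse-REST-API-Key: yourRestApiKey" ^
  -H "Content-Type: application/json" ^
  -d "{}" ^
  https://api.parse.com/1/events/AppOpened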


Delphi:
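A hedged translation to Delphi using TIdHTTP could look like this (the keys are placeholders and error handling is omitted):

// Assumes System.Classes, System.SysUtils, IdHTTP and IdSSLOpenSSL are in the uses clause.
procedure TrackAppOpened;
var
  Http: TIdHTTP;
  Ssl: TIdSSLIOHandlerSocketOpenSSL;
  Body: TStringStream;
begin
  Http := TIdHTTP.Create(nil);
  Ssl := TIdSSLIOHandlerSocketOpenSSL.Create(nil);
  Body := TStringStream.Create('{}', TEncoding.UTF8);  // the event call needs no payload
  try
    Http.IOHandler := Ssl;
    Http.Request.ContentType := 'application/json';
    Http.Request.CustomHeaders.Values['X-Parse-Application-Id'] := 'yourAppId';
    Http.Request.CustomHeaders.Values['X-Parse-REST-API-Key'] := 'yourRestApiKey';
    // Each call registers one "AppOpened" event in the Parse Analytics dashboard.
    Http.Post('https://api.parse.com/1/events/AppOpened', Body);
  finally
    Body.Free;
    Ssl.Free;
    Http.Free;
  end;
end;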

With just this little piece of code, the Analytics feature of Parse will start tracking your installs. It's not real-time tracking, so it takes a while to appear on the dashboard. Once you have your applications there, you can review the stats:


As I'm not expecting any traffic at all in my app, it is really a good solution and it serves my needs. I also have a small Android app that pulls data from there, so I can read some of the stats published by my app wherever I go. In the end it all comes down to connectivity and what's the best and easiest way to achieve it. Using this approach didn't take me long at all.

I hope it can be useful to the community.

PS: If you know of any better solution out there, please do let me know!

Jordi