Although time cockpit is a standard product, it was designed from the beginning to be easily customizable. In fact, we at software architects adapt time cockpit to the needs of our customers every day. Most of our customers have well-established processes or tools in place that drive their business. Those processes or tools will not be replaced just because a new time tracking tool like time cockpit is introduced (although sometimes there is a good deal of process optimization and data consolidation). On the other hand, time cockpit cannot live on its own in larger companies. It relies on base data that is already stored in other systems.
When we start a time cockpit customization project with a customer there are certain questions we always ask. Some of the questions are:
- What is the scope you are tracking your time on (project, task, customer...)?
- How is your company structured and what roles are there?
- What reports do you need to run your business?
- And, perhaps most importantly: What upstream/downstream systems do you have that exchange data with time cockpit?
To integrate data from upstream systems, we usually implement small pieces of source code that read data from e.g. a SQL database or a web service and import it into time cockpit. However, recently we implemented two imports that were a bit more challenging:
- a push-style import of work items from Visual Studio Online using web hooks
- a hybrid connection to an on-premises Microsoft Dynamics NAV installation
This month we focus on the first scenario. Next month we will describe the hybrid connection with Dynamics NAV.
Microsoft Azure Web Jobs
To perform imports or exports periodically (from e.g. Visual Studio Online or a SQL database), we usually implement so-called Azure WebJobs. Web jobs are similar to scheduled tasks in Windows, except that they run in the cloud in Microsoft Azure. The nice thing is that they are (at the time of this writing) completely free. The only thing you need to do as a developer is to create your own Microsoft Azure subscription and add a web app.
In the free web app you can add your web jobs. For our integration scenarios we implement simple console applications (*.exe) that are uploaded to Microsoft Azure. However, you can upload various other executable resources (see the list of supported formats).
When uploading your console applications you can choose from three different schedules. You can
- run your web job on demand
- run your web job on a specific schedule (see the sketch below)
- run your web job continuously
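For the scheduled variant, one way to express the schedule of a triggered web job is a settings.job file placed next to the uploaded executable. The file name and the six-field CRON format (seconds first) are Kudu conventions, not something specific to our scenario; depending on how you deploy, the Azure portal or the Visual Studio publish wizard can configure the schedule as well. A minimal sketch:
{
  "schedule": "0 0 */3 * * *"
}
The expression above would run the job at the start of every third hour, which matches the kind of schedule we typically use for pull-based imports.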
A Push Scenario with Web Hooks
For most of our customers we implement web jobs that run on a schedule. They fetch data from a given source and import it into time cockpit with our OData API. The web jobs are scheduled to run e.g. once per day or every three hours, but not continuously.
One of our new customers uses Visual Studio Online (VSO) heavily. They use VSO to e.g. manage the work items of multiple scrum teams. VSO offers a REST API that can be used to read work items, so it would not be a problem to write a web job that fetches new work items from VSO e.g. every few hours. However, such a polling approach has two drawbacks:
- A VSO user is needed to connect and read work items. Time cockpit would need to manage the user credentials in a secure way.
- In the worst case it would take the full configured polling interval before a new work item is imported. That is not very convenient if users want to quickly create their time records for newly created work items.
One possible solution is using so-called web hooks. According to Wikipedia, a web hook is ...
"... a method of augmenting or altering the behavior of a web page, or web application, with custom callbacks. These callbacks may be maintained, modified, and managed by third-party users and developers who may not necessarily be affiliated with the originating website or application."
But how can a web hook be used to implement a push import scenario? In VSO you can define various web hooks (web hooks are actually called service hooks in VSO) which post a JSON message to a given message queue (in our scenario an Azure Storage Queue) whenever a certain event happens in VSO. The events we are interested in for our integration scenario are Work item created and Work item updated.
Configuring a Web Hook
The following screenshot shows the configuration of the Work item created web hook. The important settings are:
- Storage account name: Specifies the name of the Azure storage account that contains the message queue you want the web hook to post the messages to.
- Storage account key: Specifies the key of the Azure storage account that contains the message queue you want the web hook to post the messages to.
- Queue name: Name of the message queue the web hook posts the messages to.
And that's basically it. Whenever a work item is created in VSO, the web hook posts a JSON message like the following to the message queue:
{
  "id":"416a5d89-9a7d-4116-a476-e1b53b0a7989",
  "eventType":"workitem.created",
  "publisherId":"tfs",
  "message":null,
  "detailedMessage":null,
  "resource":{
    "id":99,
    "rev":2,
    "fields":{
      "System.AreaPath":"customer.TimeCockpit",
      "System.TeamProject":"customer.TimeCockpit",
      "System.IterationPath":"customer.TimeCockpit",
      "System.WorkItemType":"Product Backlog Item",
      "System.State":"New",
      "System.Reason":"New backlog item",
      "System.CreatedDate":"2015-03-25T14:13:33.74Z",
      "System.CreatedBy":"Alexander Huber <alexander.huber@software-architects.at>",
      "System.ChangedDate":"2015-03-25T14:13:34.247Z",
      "System.ChangedBy":"Alexander Huber <alexander.huber@software-architects.at>",
      "System.Title":"US2",
      "Microsoft.VSTS.Common.BacklogPriority":999968378.0,
      "WEF_6FEC0B85A7B44F0A9E5C593CFB2B9923_Kanban.Column":"New",
      "WEF_6FEC0B85A7B44F0A9E5C593CFB2B9923_Kanban.Column.Done":false
    },
    "_links":{
      "self":{
        "href":"https://timecockpit.visualstudio.com/DefaultCollection/_apis/wit/workItems/99"
      },
      "workItemUpdates":{
        "href":"https://timecockpit.visualstudio.com/DefaultCollection/_apis/wit/workItems/99/updates"
      },
      "workItemRevisions":{
        "href":"https://timecockpit.visualstudio.com/DefaultCollection/_apis/wit/workItems/99/revisions"
      },
      "workItemHistory":{
        "href":"https://timecockpit.visualstudio.com/DefaultCollection/_apis/wit/workItems/99/history"
      },
      "html":{
        "href":"https://timecockpit.visualstudio.com/web/wi.aspx?pcguid=788898a5-293e-4f98-95ec-76745e3546b0&id=99"
      },
      "workItemType":{
        "href":"https://timecockpit.visualstudio.com/DefaultCollection/5433c2ed-09b4-4c16-93e1-08b440c6f9b1/_apis/wit/workItemTypes/Product%20Backlog%20Item"
      },
      "fields":{
        "href":"https://timecockpit.visualstudio.com/DefaultCollection/_apis/wit/fields"
      }
    },
    "url":"https://timecockpit.visualstudio.com/DefaultCollection/_apis/wit/workItems/99"
  },
  "resourceVersion":"1.0",
  "resourceContainers":{
    "collection":{
      "id":"788898a5-293e-4f98-95ec-76745e3546b0"
    },
    "account":{
      "id":"95d4d3b2-7815-4e93-a7be-b8f353767e88"
    },
    "project":{
      "id":"5433c2ed-09b4-4c16-93e1-08b440c6f9b1"
    }
  },
  "createdDate":"2015-03-25T14:13:33.5133467Z"
}
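If you want to verify that the service hook actually delivers messages before you wire up any processing, you can peek at the queue with the Azure Storage client library. The following is a minimal sketch (assuming the WindowsAzure.Storage package; the queue name vso-work-item matches the QueueTrigger attribute we use later in the web job):
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

static class QueueCheck
{
    // Peeks at the next message in the queue without removing it,
    // so the web job described below can still process it.
    public static void PeekNextMessage(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var queueClient = account.CreateCloudQueueClient();
        var queue = queueClient.GetQueueReference("vso-work-item");

        var message = queue.PeekMessage();
        Console.WriteLine(message == null ? "Queue is empty" : message.AsString);
    }
}
The connection string is the same account name/key combination that you configured in the VSO service hook.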
Processing a Web Hook Message
As mentioned earlier, there are different types of web jobs. To process the message that was posted by the web hook we use a web job that runs continuously. Whenever the web job discovers a new entry in the message queue, it picks up the message and processes it.
As for the console application that is continuously run by the web job, we want to share three code snippets that help you get started. First, you need to configure the credentials that are used to connect to the Microsoft Azure Storage account that contains the queue receiving the web hook messages. The credentials are configured in your App.config file and may look like the following:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <connectionStrings>
    <!-- The format of the connection string is "DefaultEndpointsProtocol=https;AccountName=NAME;AccountKey=KEY" -->
    <!-- For local execution, the value can be set either in this config file or through environment variables -->
    <add name="AzureWebJobsDashboard" connectionString="DefaultEndpointsProtocol=https;AccountName=XXX;AccountKey=XXX" />
    <add name="AzureWebJobsStorage" connectionString="DefaultEndpointsProtocol=https;AccountName=XXX;AccountKey=XXX" />
  </connectionStrings>
  ...
</configuration>
Next, we need to tell the web job that it should listen for changes in the message queue and run the actual functions that process the message. The heavy lifting is done by the JobHost. The JobHost is configured in the Main method of the console application that we use to process the web hook messages and uses the connection strings from our App.config file.
using Microsoft.Azure.WebJobs;

class Program
{
    static void Main()
    {
        // The JobHost reads the AzureWebJobsStorage and AzureWebJobsDashboard
        // connection strings from App.config.
        var config = new JobHostConfiguration();
        config.Queues.BatchSize = 1; // no concurrent invocation

        // RunAndBlock keeps the console application alive so the continuously
        // running web job can listen for new queue messages.
        var host = new JobHost(config);
        host.RunAndBlock();
    }
}
Last, we need to tell the JobHost which method to run when a new JSON message is discovered in the message queue. You can name the function that does the processing whatever you like, but it needs to fulfill the following two requirements:
- The method that processes the message needs to be static (at the time of writing).
- The method that processes the message needs to have a parameter that is decorated with the QueueTrigger attribute.
// Requires the Microsoft.Azure.WebJobs and Newtonsoft.Json.Linq namespaces
// (and System.IO for the TextWriter log parameter).
public static void ProcessQueueMessage([QueueTrigger("vso-work-item")] string message, TextWriter log)
{
    var command = JObject.Parse(message);
    if (command.Property("eventType") == null)
    {
        Console.Error.WriteLine("Could not determine event type for message: {0}", message);
        return;
    }

    var eventType = (string)command["eventType"];
    switch (eventType)
    {
        case "workitem.created":
            CreateWorkItem(command, log);
            break;
        case "workitem.updated":
            UpdateWorkItem(command, log);
            break;
        default:
            Console.Error.WriteLine("Unexpected event type {1} for message: {0}", message, eventType);
            break;
    }
}
The above listing shows some sample code that can be used to process a VSO web hook message. The most important part of the code is the QueueTrigger attribute. As mentioned earlier, it tells the JobHost to execute the method whenever a new message is discovered in the message queue. Also note that the attribute tells the JobHost which queue to listen to.
Of course, the processing of the JSON message depends on the given use case. In our integration scenario, we parse the JSON message and decide whether it represents a create or an update. If the message represents a create operation, we create a corresponding task in time cockpit. If the message represents an update operation, we update the given task in time cockpit with the values from VSO.
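To give an idea of what CreateWorkItem could look like, here is a minimal sketch that maps a few work item fields and posts them to an OData endpoint via HTTP. The endpoint URL, the entity name APP_Task, the property names, and the authorization header are illustrative assumptions for this sketch, not the actual time cockpit data model; they have to be adapted to your account.
// Minimal sketch only: endpoint, entity and property names, and credentials
// are placeholders. Requires System, System.IO, System.Net.Http, System.Text
// and Newtonsoft.Json.Linq.
private static void CreateWorkItem(JObject command, TextWriter log)
{
    var fields = (JObject)command["resource"]["fields"];

    // Map the VSO work item to a (hypothetical) task entity.
    var task = new JObject();
    task["APP_Code"] = (string)command["resource"]["id"];
    task["APP_Description"] = (string)fields["System.Title"];

    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders.Add("Authorization", "Basic ...");

        var content = new StringContent(task.ToString(), Encoding.UTF8, "application/json");
        var response = client.PostAsync("https://<your-odata-endpoint>/APP_Task", content).Result;
        response.EnsureSuccessStatusCode();

        log.WriteLine("Created task for VSO work item {0}", command["resource"]["id"]);
    }
}
UpdateWorkItem would look similar, except that it would first look up the existing task (e.g. by the work item id) and then issue an update instead of a create.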