Steal Focus

Monday 21 January 2013

StealFocus Forecast - ability to delete cloud services not on a white list

Following on from previous posts about the StealFocus Forecast application, the application can now delete deployments of Azure hosted services that are not on a white list.

Continuing the theme of the previous posts, the Forecast application is about controlling your costs for cloud services.

So far the application can create or delete hosted services on a schedule (deleting services when they are not used saves you money), delete storage entities on a schedule (deleting unused storage saves you money) and horizontally scale hosted services in or out on a schedule (scaling in hosted services during periods of low demand also saves you money).

Now the application will find any hosted service deployments that are not on a white list and delete them. This also presents a cost saving, as it protects you from unplanned growth in the number of hosted services you are running.

You can configure the white list as follows:


        <whiteList 
          pollingIntervalInMinutes="60"
          includeDeploymentDeleteServices="true"
          includeDeploymentCreateServices="true" 
          includeHorizontalScaleServices="true">
          <service name="myAzureServiceName1" />
          <service name="myAzureServiceName2" />
        </whiteList>


The attributes ("includeHorizontalScaleServices" etc) indicate whether those other configured services should be automatically included in the white list (the configuration is truncated for brevity and is missing those additional configuration elements).

The application will look for hosted services attached to the configured subscription(s). The subscription configuration is also excluded for brevity.
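
As a rough illustration of the white list behaviour, the logic amounts to listing the hosted services in the subscription and deleting the deployments of any service whose name is not on the white list. The sketch below is illustrative only; the IAzureManagementClient wrapper and its members are hypothetical, not the actual Forecast code or the real Service Management API:


    using System;
    using System.Collections.Generic;

    public class WhiteListEnforcer
    {
        private readonly ICollection<string> whiteListedServiceNames;

        public WhiteListEnforcer(IEnumerable<string> whiteListedServiceNames)
        {
            // Service names are compared case-insensitively.
            this.whiteListedServiceNames = new HashSet<string>(whiteListedServiceNames, StringComparer.OrdinalIgnoreCase);
        }

        public void DeleteDeploymentsNotOnWhiteList(IAzureManagementClient client)
        {
            foreach (string serviceName in client.ListHostedServiceNames())
            {
                if (!this.whiteListedServiceNames.Contains(serviceName))
                {
                    // Not on the white list - delete any deployments it holds.
                    client.DeleteDeployments(serviceName);
                }
            }
        }
    }

    // Hypothetical abstraction over the Windows Azure Service Management API.
    public interface IAzureManagementClient
    {
        IEnumerable<string> ListHostedServiceNames();

        void DeleteDeployments(string hostedServiceName);
    }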

An example of a full configuration file is included with the application download.

For more information look at the home page on GitHub.

Monday 14 January 2013

StealFocus Forecast - ability to horizontally scale cloud services on a schedule

Following on from my previous post about the StealFocus Forecast application, the application can now horizontally scale a configured number of Azure applications on a schedule.

Just to recap, "Forecast" is an application that can be installed as a Windows service and will perform various operations on your Windows Azure applications according to the configuration. For example it can create or delete hosted services on a schedule. The principal driver for this is to save money, being able to remove hosted services when they are not being used and spin them up later (e.g. test environments) can be a big money saver.

A similar principle applies to horizontally scaling your application on a schedule. You can configure the Forecast application to scale out your application(s) during a certain time period (when there is anticipated high demand) and then scale back in later (during quiet periods). Minimising the number of running instances also presents an opportunity for cost savings.

To horizontally scale a Windows Azure application on a schedule, it is configured as follows:


  <scheduledHorizontalScale
    id="myArbitraryId"
    subscriptionConfigurationId="myArbitraryAzureSubscriptionName"
    serviceName="myAzureServiceName"
    deploymentSlot="Production"
    treatWarningsAsError="true"
    mode="Auto"
    pollingIntervalInMinutes="30">
    <horizontalScales>
      <horizontalScale roleName="myAzureServiceRoleName1" instanceCount="2" />
      <horizontalScale roleName="myAzureServiceRoleName2" instanceCount="4" />
    </horizontalScales>
    <schedules>
      <schedule scheduleDefinitionName="BusinessHours" />
    </schedules>
  </scheduledHorizontalScale>


You can have multiple entries (one for each service) and multiple schedules for each individual service.
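
Under the covers, the horizontal scale for a role is expressed as the "Instances count" in the service configuration (.cscfg). Below is a minimal sketch of how that value might be rewritten before the configuration is submitted; it is illustrative only, not the actual Forecast code, and the file path and role name are just examples:


    using System.Linq;
    using System.Xml.Linq;

    public static class ServiceConfigurationScaler
    {
        // Namespace used by Windows Azure service configuration (.cscfg) files.
        private static readonly XNamespace Ns =
            "http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration";

        public static void SetInstanceCount(string configurationFilePath, string roleName, int instanceCount)
        {
            XDocument document = XDocument.Load(configurationFilePath);

            // Find the role and rewrite its Instances count.
            XElement role = document
                .Descendants(Ns + "Role")
                .Single(r => (string)r.Attribute("name") == roleName);
            role.Element(Ns + "Instances").SetAttributeValue("count", instanceCount);

            document.Save(configurationFilePath);
        }
    }


For example, ServiceConfigurationScaler.SetInstanceCount(@"C:\PathTo\MyPackageConfiguration.cscfg", "myAzureServiceRoleName1", 2) would set that role to two instances, matching the first horizontalScale entry above.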

A future feature will scale the application on demand, according to performance metrics.

For more information look at the home page on GitHub.

Monday 17 December 2012

Diagnosing problems with Process Explorer


I recently upgraded an instance of Team Foundation Server 2010 to the 2012 version. The core upgrade process went very smoothly. The problem came with my customised build templates, which used a custom activity written in C#. I was expecting one error, which I tried to fix before even attempting a build. This first error was that my custom activity was written against the TFS 2010 client assemblies:

Microsoft.TeamFoundation.Build.Client, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
Microsoft.TeamFoundation.Build.Common, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
Microsoft.TeamFoundation.Client, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
Microsoft.TeamFoundation.Common, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a
Microsoft.TeamFoundation.WorkItemTracking.Client, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a

These had to be changed to the TFS 2012 (Version 11.0.0.0) assemblies. So I upgraded and rebuilt the assembly holding my custom activity.

However, running a build after upgrading my custom activity still gave me an error:

TF215097: An error occurred while initializing a build for build definition \[Team Project Name]\[My Build Name]: Cannot create unknown type '{clr-namespace:StealFocus.TfsExtensions.Workflow.Activities;assembly=StealFocus.TfsExtensions}UpdateBuildNumber'.

For some reason my "UpdateBuildNumber" class could not be created. That type could not even be found.

Time to poke around and see what was happening.

After checking some obvious things (that all dependencies were available on the Build Agent, checking the Fusion log, etc.), I used Process Explorer (http://technet.microsoft.com/en-gb/sysinternals/bb896653.aspx) to look at the "TFSBuildServiceHost.exe" process and see if my assembly was even loading.

Starting up Process Explorer, I got a list of all processes on the local machine. I knew I was looking for "TFSBuildServiceHost.exe", as that is the executable for the Team Build service.


I could look at the detail of "TFSBuildServiceHost.exe", including all loaded .NET assemblies.


Looking down the list of loaded assemblies, my "StealFocus.TfsExtensions.dll" (which holds my custom activity) was not loaded at all, so it looked like there was quite a fundamental problem.

The thought occurred to me that I had compiled my upgraded custom assembly against .NET 4.5, while the TFS Build Agent was perhaps a .NET 4.0 application. This would mean the TFS Build Agent would be unable to load my assembly; as a rule of thumb, a .NET application cannot load assemblies built against a later .NET Framework version. As an aside, the CLR version shown as "v4.0.30319.17929" in the information above is misleading: .NET Framework 4.5 still uses CLR v4.0, so a CLR version of "v4.0.30319.17929" does not tell you whether the process is a .NET 4.0 or .NET 4.5 application.
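
If you want to check what an assembly actually targets, a quick (illustrative) way is to read its TargetFrameworkAttribute, since ImageRuntimeVersion only reports the CLR version; the file path below is just an example:


    using System;
    using System.Linq;
    using System.Reflection;
    using System.Runtime.Versioning;

    public static class AssemblyTargetCheck
    {
        public static void Main()
        {
            // Path is illustrative.
            Assembly assembly = Assembly.LoadFrom(@"C:\PathTo\StealFocus.TfsExtensions.dll");

            // Reports "v4.0.30319" for both .NET 4.0 and .NET 4.5 assemblies.
            Console.WriteLine("CLR version: " + assembly.ImageRuntimeVersion);

            // The TargetFrameworkAttribute (written by the compiler for 4.x targets)
            // distinguishes ".NETFramework,Version=v4.0" from ".NETFramework,Version=v4.5".
            TargetFrameworkAttribute targetFramework = assembly
                .GetCustomAttributes(typeof(TargetFrameworkAttribute), false)
                .Cast<TargetFrameworkAttribute>()
                .FirstOrDefault();
            Console.WriteLine("Target framework: " + (targetFramework == null ? "(not specified)" : targetFramework.FrameworkName));
        }
    }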

So I rebuilt my upgraded assembly against .NET 4.0 and the build completed successfully. Looking into the detail with Process Explorer once more, I saw the following:


My assembly ("StealFocus.TfsExtensions.dll") had indeed loaded.

Problem solved.

So Process Explorer was a very useful tool to see what was happening under the hood. It showed me that my assembly was not loaded, which triggered my thought process about .NET Framework versions.

Monday 10 December 2012

Cloud "Forecast"?


It's worth re-capping a couple of points from my previous post about operational cost reduction using Windows Azure. A key benefit of using the cloud is the ability to quickly and easily tear down environments, including test environments. If you are not using a test environment, simply delete it and recreate it later when you need it. This can give you very large cost reductions.

With this in mind, I have created an application that will automatically tear down (or create) cloud resources on a schedule. The application can be run as a console application or installed as a Windows service.

At the moment you can:
  • Delete "Hosted Services" on a schedule.
  • Create "Hosted Services" on a schedule.
  • Delete Storage Tables on a schedule.
  • Delete Storage Blob Containers on a schedule.
Additional features will be added in time.

You can see the GitHub project here: https://github.com/StealFocus/Forecast

An example configuration is as follows:


  <stealFocusForecastConfiguration 
    xmlns="urn:StealFocus.Forecast.Configuration">
    <scheduleDefinitions>
      <scheduleDefinition name="MorningOutOfHours">
        <days>
          <day name="Monday" startTime="00:00:00" endTime="07:59:59" />
          <day name="Tuesday" startTime="00:00:00" endTime="07:59:59" />
          <day name="Wednesday" startTime="00:00:00" endTime="07:59:59" />
          <day name="Thursday" startTime="00:00:00" endTime="07:59:59" />
          <day name="Friday" startTime="00:00:00" endTime="07:59:59" />
        </days>
      </scheduleDefinition>
      <scheduleDefinition name="BusinessHours">
        <days>
          <day name="Monday" startTime="08:00:00" endTime="18:00:00" />
          <day name="Tuesday" startTime="08:00:00" endTime="18:00:00" />
          <day name="Wednesday" startTime="08:00:00" endTime="18:00:00" />
          <day name="Thursday" startTime="08:00:00" endTime="18:00:00" />
          <day name="Friday" startTime="08:00:00" endTime="18:00:00" />
        </days>
      </scheduleDefinition>
      <scheduleDefinition name="EveningOutOfHours">
        <days>
          <day name="Monday" startTime="18:00:01" endTime="23:59:59" />
          <day name="Tuesday" startTime="18:00:01" endTime="23:59:59" />
          <day name="Wednesday" startTime="18:00:01" endTime="23:59:59" />
          <day name="Thursday" startTime="18:00:01" endTime="23:59:59" />
          <day name="Friday" startTime="18:00:01" endTime="23:59:59" />
        </days>
      </scheduleDefinition>
      <scheduleDefinition name="Weekend">
        <days>
          <day name="Saturday" startTime="00:00:00" endTime="23:59:59" />
          <day name="Sunday" startTime="00:00:00" endTime="23:59:59" />
        </days>
      </scheduleDefinition>
    </scheduleDefinitions>
    <windowsAzure>
      <subscriptions>
        <subscription
          id="myArbitraryAzureSubscriptionName"
          subscriptionId="GUID"
          certificateThumbprint="0000000000000000000000000000000000000000" />
      </subscriptions>
      <hostedService>
        <packages>
          <package
            id="myArbitraryPackageName"
            storageAccountName="myAzureStorageAccountName"
            containerName="MyContainer"
            blobName="MyPackage.cspkg" />
        </packages>
        <deploymentDeletes>
          <deploymentDelete
            serviceName="myAzureServiceName"
            pollingIntervalInMinutes="10"
            subscriptionConfigurationId="myArbitraryAzureSubscriptionName">
            <deploymentSlots>
              <deploymentSlot name="Staging" />
              <deploymentSlot name="Production" />
            </deploymentSlots>
            <schedules>
              <schedule scheduleDefinitionName="MorningOutOfHours" />
              <schedule scheduleDefinitionName="EveningOutOfHours" />
              <schedule scheduleDefinitionName="Weekend" />
            </schedules>
          </deploymentDelete>
        </deploymentDeletes>
        <deploymentCreates>
          <deploymentCreate
            subscriptionConfigurationId="myArbitraryAzureSubscriptionName"
            windowsAzurePackageId="myArbitraryPackageName"
            serviceName="myAzureServiceName"
            deploymentName="MyName"
            deploymentLabel="MyLabel"
            deploymentSlot="Production"
            packageConfigurationFilePath="C:\PathTo\MyPackageConfiguration.cscfg"
            treatWarningsAsError="true"
            startDeployment="true"
            pollingIntervalInMinutes="10">
            <schedules>
              <schedule scheduleDefinitionName="BusinessHours" />
            </schedules>
          </deploymentCreate>
        </deploymentCreates>
      </hostedService>
      <storageService>
        <storageAccounts>
          <storageAccount
            storageAccountName="myStorageAccountName"
            storageAccountKey="myStorageAccountKey" />
        </storageAccounts>
        <tableDeletes>
          <tableDelete
            id="myArbitraryTableDeleteId"
            storageAccountName="myStorageAccountName"
            pollingIntervalInMinutes="60">
            <storageTables>
              <storageTable tableName="myAzureTableName1" />
              <storageTable tableName="myAzureTableName2" />
            </storageTables>
            <schedules>
              <schedule scheduleDefinitionName="MorningOutOfHours" />
              <schedule scheduleDefinitionName="EveningOutOfHours" />
              <schedule scheduleDefinitionName="Weekend" />
            </schedules>
          </tableDelete>
        </tableDeletes>
        <blobContainerDeletes>
          <blobContainerDelete
            id="myArbitraryBlobContainerDeleteId"
            storageAccountName="myStorageAccountName"
            pollingIntervalInMinutes="60">
            <blobContainers>
              <blobContainer blobContainerName="myAzureBlobContainerName1" />
              <blobContainer blobContainerName="myAzureBlobContainerName2" />
            </blobContainers>
            <schedules>
              <schedule scheduleDefinitionName="MorningOutOfHours" />
              <schedule scheduleDefinitionName="EveningOutOfHours" />
              <schedule scheduleDefinitionName="Weekend" />
            </schedules>
          </blobContainerDelete>
        </blobContainerDeletes>
      </storageService>
    </windowsAzure>
  </stealFocusForecastConfiguration>
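
To illustrate how the scheduleDefinition elements above might be evaluated, here is a minimal sketch that checks whether the current time falls within one of the configured day/time windows. The types and member names are illustrative, not the actual Forecast implementation:


    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class ScheduleDay
    {
        public DayOfWeek Day { get; set; }

        public TimeSpan StartTime { get; set; }

        public TimeSpan EndTime { get; set; }
    }

    public class ScheduleDefinition
    {
        public string Name { get; set; }

        public IList<ScheduleDay> Days { get; set; }

        // True when "now" falls inside one of the configured day/time windows,
        // e.g. Monday 08:00:00-18:00:00 for the "BusinessHours" definition.
        public bool IsInWindow(DateTime now)
        {
            return this.Days.Any(d =>
                d.Day == now.DayOfWeek
                && now.TimeOfDay >= d.StartTime
                && now.TimeOfDay <= d.EndTime);
        }
    }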


Monday 3 December 2012

Amazon Web Services


“Every day this year, on average, AWS added the same server capacity to its public cloud as it took to run the Amazon.com retail business back in 2003, when it had $5.2bn in revenues. Amazon had "a whole lot of servers" back then in 2003…

This time last year, AWS execs were bragging that the public cloud, on average through 2011, was adding enough server capacity each day to run Amazon.com when it was a $2.76bn business in 2000.” - http://www.theregister.co.uk/2012/11/29/amazon_aws_update_jassy/

Wow.

Monday 26 November 2012

IDisposable and object initialisers

Background...

The "using" statement is just syntax trick. The resulting compiled code from a "using" statement is actually a try-catch statement.

This...


    public class MyClass
    {
        public void MyMethod()
        {
            using (MyDisposableClass myDisposableClass = new MyDisposableClass())
            {
                myDisposableClass.DoSomething();
            }
        }
    }


...is converted by the compiler to this...


    public class MyClass
    {
        public void MyMethod()
        {
            MyDisposableClass myDisposableClass = new MyDisposableClass();
            try
            {
                myDisposableClass.DoSomething();
            }
            finally
            {
                if (myDisposableClass != null)
                {
                    myDisposableClass.Dispose();
                }
            }
        }
    }


Problem...

Consider the following code:

    public class MyClass
    {
        public void MyMethod()
        {
            using (MyDisposableClass myDisposableClass = new MyDisposableClass
                {
                    MyProperty = string.Empty
                })
            {
                myDisposableClass.DoSomething();
            }
        }
    }


This can cause a problem.

The problem is caused by the object initialiser. Remember, the object initialiser is just syntactic sugar. "Under the covers" (in the actual compiled code), the object is created as normal with its constructor and the properties are then assigned afterwards.

So the use of the object initialiser means the object is created, and its properties assigned, outside the scope of the using statement and therefore outside the try-finally statement generated by the compiler. Should there be an error during creation or initialisation (for example, a property setter throwing), "Dispose" will never be called. If Dispose is never called, the object will not be disposed in a timely manner and you may leak resources.
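
The object initialiser version compiles to something roughly like the following; the compiler-generated temporary local is what FxCop refers to in the error below:


    public class MyClass
    {
        public void MyMethod()
        {
            // The compiler introduces a temporary local (reported by FxCop as
            // '<>g__initLocal0') and assigns the property on it *before* the
            // try-finally is entered.
            MyDisposableClass temp = new MyDisposableClass();
            temp.MyProperty = string.Empty; // if this throws, Dispose is never called
            MyDisposableClass myDisposableClass = temp;
            try
            {
                myDisposableClass.DoSomething();
            }
            finally
            {
                if (myDisposableClass != null)
                {
                    myDisposableClass.Dispose();
                }
            }
        }
    }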

FxCop will advise you of the problem with an error like the following:

CA2000 : Microsoft.Reliability : In method 'MyClass.MyMethod()', object '<>g__initLocal0' is not disposed along all exception paths. Call System.IDisposable.Dispose on object '<>g__initLocal0' before all references to it are out of scope.

The correct code is as follows:


    public class MyClass
    {
        public void MyMethod()
        {
            using (MyDisposableClass myDisposableClass = new MyDisposableClass())
            {
                myDisposableClass.MyProperty = string.Empty;
                myDisposableClass.DoSomething();
            }
        }
    }


The lesson is not to use object initialisers in combination with "using" statements.

Monday 19 November 2012

Enterprise debt

Following on from the earlier post on Technical Debt, there exists a similar concept for the enterprise - "Enterprise Debt".

Enterprise debt has the same parallels with financial debt. You may have inefficient business processes, duplicated business processes, manual processes or a myriad of other problems with your Enterprise Architecture. These problems constitute a tax on your business, but the costs may be justified. Duplicated business processes might allow different business units to start or move in parallel. Inefficient or manual business processes might be justified because you have not had time to improve or automate them, yet they perform a critical function. You might be happy, in the short or medium term, to pay the interest on these debts to keep your enterprise functioning.

Of course, as before, you must take care not to go bankrupt. Should your enterprise debt exceed your capacity to pay down the interest, your business is finished.

One of the goals of an Enterprise Architecture initiative should be to identify your enterprise debt and seek to reduce it. An ongoing Enterprise Architecture program should also identify new enterprise debt and ensure that it is justified, i.e. that the "tax" it incurs brings a larger long-term benefit.

Monday 12 November 2012

Technical debt

Technical debt is a way of describing some of the costs you can incur when building software. For example, when implementing a feature in a system you generally have two options.

  1. Implement the feature quickly, knowing that it will make future changes to the system harder.
  2. Implement the feature with more care (using time/effort) resulting in a better design and more robust implementation. This makes subsequent changes to the system easier.
There is a clear trade-off here, and you can liken technical debt to financial debt.

When incurring technical debt, as with financial debt, you will have to pay "interest". In the example above, the interest you pay on quickly implementing the feature is the additional cost required to implement subsequent features. You can continue to pay the interest each time you add a feature or you can repay the principal by going back and implementing the original feature properly.

Note that technical debt, like financial debt, is sometimes useful. In the same way that you likely couldn't buy a house without going into debt, sometimes you might struggle to deliver an application or system without taking on some technical debt first. Examples of useful technical debt are as follows...
  • Quickly building features so that users or business sponsors can see functionality.
  • Quickly building features to attract more customers or satisfy existing ones.
  • Quickly building features (or a whole product!) so that you get first mover advantage over competitors.
In most circumstances the driver for taking on technical debt is time: for example, you have to deliver the software by a certain date.

However, like with financial debt, you must take care not to go bankrupt. Should the interest payments exceed your capacity to repay them, you are finished.

Should you be spending all your effort on simply paying down the interest on your debts, you will be unable to move your project or business forward. At this point your project or business will fail.

There are many examples of technical debt. Not having an automated deployment mechanism is a common one. Not having one saves you the time it takes to implement it; the trade-off is that it costs you time/effort on each occasion you deploy your application, because you have to do the deployment manually. You can either invest the time to implement an automated deployment and then experience a very low cost for subsequent deployments, or continue with a fixed cost for each release. Note that if the application has a short lifespan, and you will only deploy it a few times, then it is simply not worth investing in the automated deployment mechanism. Continuing the financial debt parallel, this is similar to renting a car versus buying a car. The automated deployment is like buying the car: a large up-front cost giving a low cost for subsequent uses. Manual deployments are like renting a car: zero up-front cost but a fixed cost for each use.

Like most things in software, it's a matter of weighing up the costs and benefits. Taking on technical debt can be justified, but it should be clearly acknowledged when it is. In the vast majority of cases there should be a plan to pay off the debt.

Tuesday 6 November 2012

SignalR included in ASP.NET Fall 2012 Update

Just what it says on the tin: the SignalR framework is included in the ASP.NET Fall 2012 Update.


It's really great to see Microsoft embracing open source components, giving them first-class support (SignalR Hub classes have their own Visual Studio templates) and delivering them via NuGet.

And if you've not yet heard of SignalR, you really need to check it out.

Monday 5 November 2012

Base 64 encoding strings

Encoding strings to base 64 is super trivial...

The code.


    using System;
    using System.Text;
 
    public static class StringExtensions
    {
        public static string Base64Encode(this string text)
        {
            byte[] bytes = new UTF8Encoding().GetBytes(text);
            string base64EncodedText = Convert.ToBase64String(bytes);
            return base64EncodedText;
        }
 
        public static string Base64Decode(this string base64EncodedText)
        {
            byte[] bytes = Convert.FromBase64String(base64EncodedText);
            string text = new UTF8Encoding().GetString(bytes);
            return text;
        }
    }


The test.


    using Microsoft.VisualStudio.TestTools.UnitTesting;
 
    [TestClass]
    public class StringExtensionsTests
    {
        [TestMethod]
        public void TestEncodeAndDecode()
        {
            const string OriginalText = "originalText";
            string encodedText = OriginalText.Base64Encode();
            Assert.AreNotEqual(OriginalText, encodedText, "The encoded text was the same as the original plain text.");
            string decodedText = encodedText.Base64Decode();
            Assert.AreEqual(OriginalText, decodedText, "The decoded text was not the same as the original plain text.");
        }
    }


And that is it.

Not sure why there aren't such methods in the Base Class Libraries (BCL).
