Windows Azure error “There was an error attaching the debugger to the IIS worker process for URL ‘http://127.255.0.0:82/’…”

Since I last rebuilt my development machine I haven’t had a need to even look at web development, let alone Windows Azure.  The last time I had the “opportunity” to develop anything using Windows Azure was with version 1.3.  At the time version 1.4 was still in beta, and I couldn’t seem to install it successfully.

Lucky me, I was added to a project using Windows Azure, so I installed version 1.6 along with the Windows Azure Platform Training Kit – November Update and decided to make a quick run through some of the training kit to see how things worked.

Much to my chagrin, attempting to run the training projects only ever resulted in this dialog:

[Dialog: “There was an error attaching the debugger to the IIS worker process for URL ‘http://127.255.0.0:82/’…”]

The first thing that popped out at me was the IP address ‘127.255.0.0’.  I looked through the project properties to figure out where it came from, to no avail.  I then unloaded the projects and searched the raw project and solution files; nothing there either.  Pinging the address did succeed, so I checked the hosts file in ‘%windir%\System32\drivers\etc’.  Nope, it wasn’t there either.

Searching the Internet (hmmm, I wonder if the term ‘Binging’ will be added to the dictionary?) for the error message gave me a whole lot of nothing.  Refining my search to just the IP address sent me off on a tangent through blogs about Windows Azure v1.5 and the need to add entries to the hosts file, although they were helpful in explaining where the IP address comes from: starting with SDK 1.5, each instance in any role gets its own IP address in the compute emulator.  For more information on that see http://blogs.msdn.com/b/avkashchauhan/archive/2011/09/16/whats-new-in-windows-azure-sdk-1-5-each-instance-in-any-role-gets-its-own-ip-address-to-match-compute-emulator-close-the-cloud-environment.aspx.

Finally *bonk* I got the idea to look in the Windows Event Log.  It should have been the first place I looked, but I guess I hadn’t drunk enough coffee yet to think straight.

I found two errors:

ISAPI Filter ‘C:\Windows\Microsoft.NET\Framework\v4.0.30319\\aspnet_filter.dll’ could not be loaded due to a configuration problem. The current configuration only supports loading images built for a AMD64 processor architecture. The data field contains the error number. To learn more about this issue, including how to troubleshooting this kind of processor architecture mismatch error, see http://go.microsoft.com/fwlink/?LinkId=29349.

and

Could not load all ISAPI filters for site ‘DEPLOYMENT16(11).WINDOWSAZUREPROJECT1.GUESTBOOK_WEBROLE_IN_0_WEB’.  Therefore site startup aborted.

That certainly gave me a good clue.  Why I need ‘Enable 32-bit applications’ on the application pool, I have no idea, since I’m compiling as ‘Any CPU’.  Compiling as x64 results in the same errors, and compiling as x86 fails because I’m running on an x64 box, producing this dialog:

[Error dialog screenshot]

Every time the compute emulator starts, it creates a new application pool with ‘Enable 32-bit applications’ set to false, and when it shuts down it removes the application pool, so manually resetting the value doesn’t help.  Searching around turned up http://blogs.msdn.com/b/zxue/archive/2011/10/31/enabling-support-for-32-bit-iis-applications-in-windows-azure.aspx.  Adding a startup task to set the IIS default to allow 32-bit applications solved my problems; a sketch of the task is below.  It really only needs to run once, but I just leave it in the project in case.
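
For reference, here is roughly what that startup task looks like; this is a sketch based on the post linked above, and the script name is my own.  The task gets registered in ServiceDefinition.csdef inside the web role and must run elevated:

<Startup>
	<Task commandLine="EnableIIS32BitApps.cmd" executionContext="elevated" taskType="simple" />
</Startup>

The script itself flips the server-wide application pool default, so every application pool the compute emulator creates afterward inherits the setting:

REM EnableIIS32BitApps.cmd - make new application pools allow 32-bit applications
%windir%\System32\inetsrv\appcmd set config -section:applicationPools -applicationPoolDefaults.enable32BitAppOnWin64:true
exit /b 0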

Azure 101 – Billing (when 1 minute equals 5 hours and 2 might equal 10)

I thought I’d start a blog series all about Azure – things I’ve learned while getting a few web sites up and running in the (Microsoft) cloud.  First up – billing.  Microsoft offers a “free” 90-day trial to get your feet wet, but BE CAREFUL: it may not stay free, especially when you quickly go over your quotas and start getting charged.

A lot of this information is available deep in the dark depths of the Azure SDK help, and you might just tell me to RTFM, but even a Microsoft support engineer admitted to me:  “there is WAY too much documentation out there, it’s impossible to find anything”.  My thoughts exactly, and that is the reason for this quick post summarizing the information.

There are multiple “billing meters” used to charge you for your Azure usage:  Compute Time, Database Size, Storage Amount, Storage Transactions, Data Transfer (in and out), and a couple of others.  I will focus on the two we have found to be the most misunderstood, and also the meters whose free quota you will most probably exceed if you don’t know what to watch for:  Compute Time and Storage Transactions.

Rule #1 – Compute Hours are NOT just the time your site is WORKING on a request

My first assumption was that if my site is hosted up there, but it’s not being hit very often, I won’t incur many (any?) compute hours.  Turns out, compute hours are a measurement of how many hours your site (more correctly your instances) exist on the Azure servers.  Back when I was starting with Azure, I wrote a super simple ASP.NET MVC “Hello World” app and published it to Azure – super easy.  I first published to staging, incremented the instance count to 2 just to see how easy that was, and then published a new version to production.  Out there, I set the instance count to 3, and learned how to “flip the VIP”, switching staging with production in (not exactly) the blink of an eye.

Fast forward 5 days, and we get our first billing email, saying that we’ve reached the trial period’s limit of 700 COMPUTE HOURS!  Seriously?  3 instances in production, 2 in staging, but nobody is hitting these sites; nobody would know they’re out there (for that matter, the staging one has a GUID in the URL, so it’s not discoverable by accident).  Long story short, after further reading, I found that a compute hour is any portion of an hour that any of your instances (staging or production) are deployed to the cloud!

A couple more little gotchas:

  • You get charged the first hour(s) in the first minute your app gets deployed.  When the clock strikes the top of the hour, the next hour starts.  So if you deploy a single instance at 10:50 AM, you get charged 1 hour for the first 10 minutes, and then at 11:00 the second hour is charged.  If you’re unlucky enough to deploy 5 instances in the 59th minute of an hour, you will be billed for 10 hours in 2 minutes of human time.
  • Each time you deploy your project, a new clock is started.  If you are iteratively trying something out and deploy 3 versions in an hour, you get hit for 3 hours (assuming single instance on a small VM).
  • A “VM size factor” is used to multiply the hours for the larger VMs.  A small instance (counts as 1 core) is a value of 1, medium 2, large 4, extra large 8.

To summarize the compute hours – here is the formula:

# of instances (each web and worker role instance counts individually, for both staging and production) * VM size factor * number of partial clock hours
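
Plugging my Hello World story into that formula shows how quickly it adds up.  Five small instances (3 in production, 2 in staging) existed around the clock for roughly 5 days:

5 instances * 1 (small VM size factor) * 120 clock hours = 600 compute hours

Add the extra partial hours billed for every redeploy along the way, and a site nobody ever visited blew through the trial’s 700-hour quota in under a week.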

Rule #2 – Turn off diagnostics before you publish to Azure

This one is really buried.  There is a setting in the properties of each Azure role that lets you turn diagnostics on or off – it’s ON by default.  Diagnostics are captured by Azure on the local VM and then persisted to YOUR storage account VERY FREQUENTLY if you have this turned on.  The problem is, each time the logs are saved to your storage account, you get hit with storage transactions.  In those first 5 days of my Hello World app, we were averaging over 7,000 transactions per day.  When you only get 50,000 transactions in the trial period, it’s easy to see how you can go over.  The MS support engineer told me the default is that the logs are persisted EVERY MILLISECOND.  I have trouble believing that, but that’s what he said.  It’s “really fast,” whatever it is.

Turn OFF diagnostics unless you need them, and if you need them, throttle down the period after which they’re persisted to storage.

Right-click each role in your Azure project and choose Properties.  On the Configuration tab, clear the “Enable Diagnostics” checkbox.


If you DO need logging enabled, there are ways to control exactly what data gets logged and the period at which it’s written to your storage account.  You can either use code in your OnStart method (a sketch is below) or place a configuration file in your storage account to control the settings.  I plan on writing a post about the configuration-based method soon.
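
To give you an idea, here is a minimal sketch of the OnStart approach using the SDK 1.x diagnostics API (the Warning filter and 30-minute period are example values, not recommendations):

using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
	public override bool OnStart()
	{
		// Start from the defaults, then throttle the transfer schedule way down.
		DiagnosticMonitorConfiguration config = DiagnosticMonitor.GetDefaultInitialConfiguration();

		// Only ship Warning and above, and only every 30 minutes.
		config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Warning;
		config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(30);

		// The connection string name comes from the Diagnostics plug-in import.
		DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);

		return base.OnStart();
	}
}

Each scheduled transfer is what generates the storage transactions, so a longer period translates directly into fewer transactions on your bill.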

Bottom Line

I think the bottom line with all this Azure billing is: watch your bill (you can view it online through the portal) like a hawk, even if you have a simple app and are just in the “free” trial period.  Always know what unit of measure you’re dealing with:  human minutes, or Azure hours.  :)

Stream HTML5 Video from Azure Blob Storage

I’ve been looking into what needs to be done to stream videos hosted in Microsoft Azure’s Blob storage with the HTML5 <video> control. Once I got over a few little hiccups, the process was very straightforward.

The Process

I followed the Video On The Web section of the online book Dive Into HTML5 by Mark Pilgrim for the overall process. It provides a couple of nice pictures showing which browser supports which codec. I was disappointed that there isn’t one codec supported by all the mainstream browsers, so to support them all you have to provide at least two encodings.

I decided that I would take my video and encode it in both the H.264 and OGG formats. NOTE: the document above is a bit old, but its steps give a good idea of what settings accomplish the encoding. Once the two versions were ready, I created a simple .html page and hosted it in my local IIS just to verify that I had done the encoding correctly. Below is what I had.

<!DOCTYPE html>
<html>
<body>
<h1>Test</h1>
<video width="576" height="320" controls>
	<source src="tbbt_s1e1.mp4"  type='video/mp4; codecs="avc1.42e01e, mp4a.40.2"'>
	<source src="tbbt_s1e1.ogv"  type='video/ogg; codecs="theora, vorbis"'>
</video>
</body>
</html>

The HTML5 <video> control is really straightforward to use. The controls attribute tells the browser to display its prebuilt playback controls, and there are two ways to specify the source video file(s). The first uses only one file, so if the browser doesn’t support its codec, you can’t watch. The second, which I followed, allows multiple video files so the browser can choose the codec it supports.

When I viewed the page, the video player controls showed up, but the videos didn’t play. The important thing I had missed was making sure IIS had the correct MIME type defined for the videos (see the “MIME Types Rear Their Ugly Head” section of the document above). After adding the following MIME types at the root IIS node, the videos played.

Extension – MIME type
.ogv – video/ogg
.mp4 – video/mp4
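
If you would rather keep the mappings with the site instead of setting them at the server level, the equivalent web.config entries look something like this (a sketch of the same two mappings):

<configuration>
	<system.webServer>
		<staticContent>
			<mimeMap fileExtension=".ogv" mimeType="video/ogg" />
			<mimeMap fileExtension=".mp4" mimeType="video/mp4" />
		</staticContent>
	</system.webServer>
</configuration>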

 
With the videos functioning, I then moved on to hosting the video files in Azure Development Blob Storage and pointing to those files. I had previously created a little application to upload files to Blob storage, so I used it to upload the files instead of one of the Azure file explorer applications.

I then pointed the .html file above at the Azure URLs and once again got the player with the videos not working.

<!DOCTYPE html>
<html>
<body>
<h1>Test</h1>
<video width="576" height="320" controls>
	<source src="http://127.0.0.1:10000/devstoreaccount1/videocontainer/c625cff5-6f03-4e89-a8b3-43f29b97f14b.mp4"  type='video/mp4; codecs="avc1.42e01e, mp4a.40.2"'>
	<source src="http://127.0.0.1:10000/devstoreaccount1/videocontainer/4e3f4531-e3d2-407f-8a78-bea8d540d537.ogv"  type='video/ogg; codecs="theora, vorbis"'>
</video>
</body>
</html>

Turns out I ran into the exact same MIME type issue. My application didn’t set the Content Type property of each blob to the correct type, so Azure wouldn’t serve the files in a way the browsers could stream, although they could still be downloaded. To quickly change this I used Azure Storage Explorer to point at my Development Storage and change the Content Type of each of the files to its respective MIME type from the table above; the code-based fix is sketched below.
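
The longer-term fix is to set the property from code at upload time. With the SDK 1.x StorageClient library, correcting an existing blob looks something like this (a sketch; the container and blob names come from the URLs above):

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Point at development storage and fix one blob's content type.
CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("videocontainer");

CloudBlob blob = container.GetBlobReference("c625cff5-6f03-4e89-a8b3-43f29b97f14b.mp4");
blob.Properties.ContentType = "video/mp4";
blob.SetProperties(); // persists the new content type to the blob service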

I then browsed to the .html file again and the videos played!

Codec Issues
When encoding the video to the H.264 and OGG formats I ran into a few issues. First I used HandBrake and Firefogg (requires Firefox), respectively, to encode the files, but while HandBrake worked, the file Firefogg produced had no audio when played in Firefox. I then switched to ffmpeg2theora hoping that would fix the issue, but it didn’t. I then tried encoding for the WebM codec, but that still didn’t work. In a final attempt, I took the H.264 (.mp4) file that HandBrake output (since it was working great) and used Miro Video Converter to convert it to an OGG (.ogv) file. This new version worked in Firefox, so what I thought was an issue with Firefox or my hosting turned out to be an encoding problem.

I also ran into an issue viewing the H.264 version in IE9, but it turned out to be a Compatibility View issue that I documented here.

I still have a ton to learn about Azure Storage, HTML5 Video and Codecs, but I think this was a good start.