About the Author

Bryan Coon has been a Lead Software Engineer with InterKnowlogy since April 2010. His role at InterKnowlogy is to create solid software with his team using the latest technologies and tools, including .NET, C#, WPF, Silverlight, WCF, and whatever else can help create great software. Before arriving at his new home, Bryan spent over 12 years writing scientific software for the San Diego life science industry. His roles have included management, technical leadership and, of course, lots of coding. He has written large-scale projects in a variety of languages, such as Perl, Java, C, C++ and C#. Some of his previous projects include forensic analysis software using mtDNA and STRs for Abbott Molecular Diagnostics, genotyping and methylation MALDI-TOF analysis for Sequenom Inc., and an artificial lymph node simulation for the La Jolla Institute of Immunology and DARPA. Bryan has a Bachelor of Science in Biopsychology from the University of California, Santa Barbara, and worked for over 5 years in Immunology and Virology at The Scripps Research Institute.

The Kinect SDK and Xbox: Part Deux

So in my previous post on this topic, I made an effort to discover some details about who, exactly, can become an Xbox developer and have access to the Kinect API. The information is not readily available, and the most I could find was from this page.

There it states that to apply to the Xbox 360 Registered Developer Program,  you need to send an email to xboxrdb@microsoft.com with your contact info and a game concept.  If the proposal is of interest, you will be contacted.

Cool! So I applied, describing who we were and what we wanted to do (which would be of great benefit to all mankind, by the way). I figured that they would probably ignore it if it wasn’t interesting, but if they responded I could glean some information about what their entry requirements were. And maybe… just MAYBE we would get in!

I figured it would take some time for them to respond so I prepared for a wait…

But to my surprise, I had a response the next day! I think I may have hit a nerve, though; here’s what I got back:

“Thank you for your interest in developing for the Xbox 360 and Kinect for Xbox 360.

The Xbox 360 Registered Developers Program works exclusively with professional game developers. It is only open to qualified professional game development companies with a wealth of previous professional industry experience, financial backing and solid game concepts. As such, your inquiry lies far outside the scope of this program.”

Well!  So not only did they just pooh-pooh the request, they managed to sound a bit snooty about it as well.   Excellent.

But anyway, what can we learn from this?

  1. To be in the Xbox 360 RDP you need to be a professional game developer
  2. You must have a wealth of previous professional experience
  3. You must have lots of money (how do they know whether we do or not?)
  4. You must have a solid GAME concept.

That’s the interesting part to me- GAME concept.  I take this to mean that unless it’s a game, you cannot be on the Xbox 360 platform.  Healthcare app to help people?  Nope.  App to assist in the home for the elderly?  Not likely.

But I won’t cry about this. I think the SDK is improving rapidly. With a commercial version set for Q1 of 2012 and active development, it would appear to have a bright future indeed.

Ah well, back to my Kinect SDK development!

The Kinect SDK and Xbox: Part 1

No code today, just some general musings and observations on Kinect. I have been playing around with the Kinect SDK for quite a while now, creating lots of fun applications for prototypes, research and just general noodling during our much valued ‘RECESS’ time. One of the things that always pops up during conversations with clients or colleagues is ‘How cool is that! We need to make an Avatar and drive it with the Kinect!’. And I agree, that would be cool.

It turns out that XNA Game Studio has its own avatar API, and can even generate a random avatar for you. Getting the avatar to move with your Kinect data is not trivial, but not impossible either: basically, you map the Kinect SkeletonData Joints to the corresponding avatar joints or positions. Several people have given this a go, with good success… to a point (a simple ball joint example here). The problem lies in the data that comes from the SDK. When it loses track of a joint, or when one joint passes in front of another, the data coming out is just all over the place. If you stand behind a chair, your avatar or skeleton will show your legs flopping around completely akimbo. Not a pretty sight.

That leads into what really interests me: the features I see in Xbox Kinect that are not available in the SDK. For example, the Xbox version has higher fidelity and can track more joints. And even more interesting to me is that the avatars themselves have joint restrictions, so even if the data is garbage, the avatar never moves to a completely bizarre position.

What I would really like is to create an avatar with XNA Game Studio and drive it with the data from the Kinect SDK, with this joint restriction applied. And then maybe, with some changes, publish my application to an Xbox, using the Kinect on that platform as well. Well, it seems you can’t get there from here: while XNA Game Studio will allow you (with the proper licensing) to publish games to both PC and Xbox, the Kinect packages are obviously different and incompatible.

So just to explore it, what does it take to become an Xbox Kinect developer? Well, I can’t say I have the conclusive answer on this. Not being part of that community, I found the information difficult to obtain. From the Microsoft site, you need to be part of the Xbox 360 Registered Developers Program. And to become a partner in that program you need to email them with a concept and wait a bit. And then sign an NDA. And then wait a bit more. And that’s if you do not have a Development Account Manager. If you do, then I guess you already know what to do.

I want information. How much, exactly, does it cost? Are there any smaller Kinect for Xbox developers out there? Or is it restricted to big game developers? What equipment and kits are needed? How difficult is it to port a game from the PC with the SDK to Xbox? Judging from the blast of media from Microsoft regarding the Kinect, I would have thought this information would have been easier to access. Ah well, these are the questions I hope to have answered in Part 2!

Prism 4, MEF and the CompositionInitializer

So I have been using MEF (Managed Extensibility Framework) for a while now on most of my WPF projects. Lately, I have been getting a lot of use out of Prism 4.0 as well, utilizing its built-in MEF support via the MefBootstrapper.

One of the things that I always found a bit troublesome, however, was what to do when you need a non-exported class to satisfy its imports from MEF. This came up on a recent project: I was using the IEventAggregator from Prism and thought it would be useful to have it in classes that do not come from MEF, i.e. classes that are newed up as the business logic dictates. Now, the true MEF heads may say, ‘Nooooo! That’s not what MEF is for!’ But I thought I would see how it worked anyway.

To solve this, one approach was to pass the IEventAggregator to my class through the constructor, but this didn’t seem too elegant. What if I need other imports besides the IEventAggregator? Then I would potentially need to pass more and more stuff to the constructor.
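
For reference, that constructor approach would look something like this (MyClass is just an illustrative name, the same one used in the snippet below):

public class MyClass
{
    private readonly IEventAggregator _eventAggregator;

    // Works, but every new dependency means yet another parameter that
    // whoever news up MyClass has to supply.
    public MyClass( IEventAggregator eventAggregator )
    {
        _eventAggregator = eventAggregator;
    }
}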

What I really needed was some way to tell the class that it has imports to satisfy, and to have it check the already existing container to satisfy them. This is where the CompositionInitializer comes in. One problem: the CompositionInitializer was designed for Silverlight, and my application is WPF. But no worries, Glenn Block had anticipated my need and kindly written a WPF CompositionInitializer, to be found here. Thanks Glenn. In addition, I found lots of other great info on Reed Copsey’s site. Thanks Reed.

So I called CompositionInitializer.SatisfyImports in the constructor of my non-exported class:

// No export here
public class MyClass
{
	public MyClass()
	{
		CompositionInitializer.SatisfyImports( this );
	}

	[Import]
	private IEventAggregator EventAggregator { get; set; }
}

But when I ran the code, even though I had a valid import for my IEventAggregator, none of my messages were being received. A bit of googling brought me here. Aha! Simply placing my CompositionInitializer.SatisfyImports( this ) in the constructor is insufficient. How, after all, does the CompositionInitializer know what container you are using? I was so used to MEF magic that it hadn’t occurred to me that CompositionInitializer needs to have its container set.

I added the following line to my bootstrapper, in a location that gets called after CreateShell():

CompositionHost.Initialize( Container );

With that, my CompositionInitializer was armed with the appropriate container, and everything worked great in the constructor. Now I was able to new up an instance of any class and satisfy its imports against my Container.
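
For context, here is roughly where that call can live; this is just a sketch of a Prism 4 MefBootstrapper, assuming an exported Shell window and the CompositionHost class from Glenn Block's WPF CompositionInitializer:

public class MyBootstrapper : MefBootstrapper
{
    protected override DependencyObject CreateShell()
    {
        return Container.GetExportedValue<Shell>();
    }

    protected override void InitializeShell()
    {
        base.InitializeShell();

        // InitializeShell runs after CreateShell, so the container is fully built.
        // Hand it to the WPF CompositionInitializer so that later calls to
        // CompositionInitializer.SatisfyImports( this ) resolve against the same container.
        CompositionHost.Initialize( Container );

        Application.Current.MainWindow = (Window)Shell;
        Application.Current.MainWindow.Show();
    }
}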

Simple Kinect Joint Smoothing

I was recently working on a project that used the Microsoft Kinect SDK. The goal here was to have a user’s hand drive a cursor on a large screen, and allow them to navigate around by using a hover-to-click model. One thing that became immediately apparent was that the data coming from the device was very, very jumpy.

While this might not be an issue when you are trying to see if a user is waving their arms around or is standing or sitting, it is a real problem when you are trying to track fine motor movements. For example, tracking a single joint such as the right hand and using it to position a screen element is so troublesome that it is almost unusable.

From empirical observation and examination of the data, the problems intensify as one joint (say, the hand) moves in front of another. This makes sense, as the sensor is trying to determine which joint is which and tends to flip-flop between the two. This makes for a classic GIGO situation. The Kinect runtime does have some smoothing built in:

_kinectRuntime.SkeletonEngine.TransformSmooth = true;
var parameters = new TransformSmoothParameters
{
    Smoothing = 0.75f,
    Correction = 0.0f,
    Prediction = 0.0f,
    JitterRadius = 0.02f,
    MaxDeviationRadius = 0.04f
};
_kinectRuntime.SkeletonEngine.SmoothParameters = parameters;

But this seemed to have minimal effect. I decided that I needed something more substantial to control the x,y data points. The thing that I found interesting is that this is a complex problem, seemingly too complex for my non-math background. But even so, there is a relatively simple approach: a nice weighted average. I played with both a straight weighted average and an exponential average. The idea was that if I could smooth the data while keeping the lag low enough, it would significantly improve the user experience. Here’s what I did:

A nice simple Exponential average:

public double ExponentialMovingAverage( double[] data, double baseValue )
{
    double numerator = 0;
    double denominator = 0;

    // Seed value: the plain average of the window.
    double average = data.Sum();
    average /= data.Length;

    // The newest sample (last in the array) gets a weight of 1, and each
    // older sample is discounted by another factor of baseValue.
    for ( int i = 0; i < data.Length; ++i )
    {
        numerator += data[i] * Math.Pow( baseValue, data.Length - i - 1 );
        denominator += Math.Pow( baseValue, data.Length - i - 1 );
    }

    // The seed average gets the smallest weight of all.
    numerator += average * Math.Pow( baseValue, data.Length );
    denominator += Math.Pow( baseValue, data.Length );

    return numerator / denominator;
}

And a weighted average:

public double WeightedAverage( double[] data, double[] weights )
{
    // Each sample needs a matching weight; Double.MinValue acts as an error sentinel.
    if ( data.Length != weights.Length )
    {
        return Double.MinValue;
    }

    double weightedAverage = data.Select( ( t, i ) => t * weights[i] ).Sum();

    return weightedAverage / weights.Sum();
}
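
For what it's worth, here is a usage sketch with made-up numbers, where later samples are newer and therefore get the larger weights:

// Hypothetical window of five samples; the most recent one carries the most weight.
double[] samples = { 0.42, 0.44, 0.41, 0.47, 0.45 };
double[] weights = { 1, 2, 3, 4, 5 };

double smoothed = WeightedAverage( samples, weights );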

The exponential average, in my opinion, was better: more smoothing and jitter control with less lag. Good! Here’s how I used it:

private readonly Queue<double> _weightedX = new Queue<double>();
private readonly Queue<double> _weightedY = new Queue<double>();

// For each frame, smooth the current (scaled) joint:
Point point = ExponentialWeightedAvg( scaledJoint );

private Point ExponentialWeightedAvg( Joint joint )
{
    _weightedX.Enqueue( joint.Position.X );
    _weightedY.Enqueue( joint.Position.Y );

    if ( _weightedX.Count > Settings.Default.Smoothing )
    {
        _weightedX.Dequeue();
        _weightedY.Dequeue();
    }

    double x = ExponentialMovingAverage( _weightedX.ToArray(), 0.9 );
    double y = ExponentialMovingAverage( _weightedY.ToArray(), 0.9 );

    return new Point( x, y );
}

Note: The scaledJoint comes from the Kinect SkeletonFrameReady event handler, after using the ScaleTo() extension method from the Coding4Fun Kinect Toolkit.
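
For context, that step looks something like this inside the handler; 640x480 is just an example target size, and skeleton is whatever tracked SkeletonData you are already handling:

// Scale the right hand joint into screen coordinates, then smooth it.
Joint scaledJoint = skeleton.Joints[JointID.HandRight].ScaleTo( 640, 480 );
Point point = ExponentialWeightedAvg( scaledJoint );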

Once you have your Point, you can use it to place your nicely smoothed cursor (in my case, an image of a hand on a canvas) in the right location on every frame from the Kinect, and have it be nice and stable. I found that using just a few points (5-7) was enough to smooth and reduce jitter.
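
The placement itself is plain WPF; here is a sketch, assuming handImage is an Image already added as a child of the canvas and the point has been scaled to canvas coordinates:

// Center the hand image on the smoothed point for the current frame.
private static void PositionCursor( Image handImage, Point point )
{
    Canvas.SetLeft( handImage, point.X - handImage.ActualWidth / 2 );
    Canvas.SetTop( handImage, point.Y - handImage.ActualHeight / 2 );
}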

How to crash a WCF Service

As part of a recent project, I needed to create a WCF service, host it in IIS and then access it from a remote client. The client in this case implemented an interface that would allow it to connect to other sites as well, with the concrete implementation of the interface dealing with all the data.

There were common DTOs being used on the client side, and since I had complete control over both the client and server side code, I thought I would try to use RIA’s DomainService in a non-RIA app, to pull entities from the database, convert them to the common DTO type and then send them across the wire to the client.

Everything was off to a great start and then I noticed that after one particular (or so I thought) request, the WCF service was completely locked up.  I had to either recompile my service or restart IIS with iisreset to get things working again.

The errors were quite painful to track down.  Data was returned to the client just fine on the first call- but subsequent calls failed with the following exception:

System.ServiceModel.CommunicationException.

After enabling WCF Tracing (http://msdn.microsoft.com/en-us/library/ms733025.aspx) I saw the following error:

The InnerException message was ‘Type ‘System.Globalization.GregorianCalendar’ with data contract name ‘GregorianCalendar:http://schemas.datacontract.org/2004/07/System.Globalization’ is not expected. Add any types not known statically to the list of known types – for example, by using the KnownTypeAttribute attribute or by adding them to the list of known types passed to DataContractSerializer.’. Please see InnerException for more details.

After some head scratching, concluding the error wasn’t informative, scouring the net etc., etc., I found the following post:

http://howevangotburned.wordpress.com/2007/12/06/type-fidelity-across-the-wire-in-wcf/

Turns out you can’t send members declared as abstract base types over WCF without telling the serializer about the concrete types! Whoops. In one of my objects, I was sending a CultureInfo object, which has a Calendar member. And while that member is declared as the abstract Calendar type, the concrete implementation being sent was a GregorianCalendar.

In my case, the fix would have been pretty easy: use the KnownType attribute, as in this example:

[DataContract] 
[KnownType( typeof( CircleType ) )] 
[KnownType( typeof( TriangleType ) )] 
public class CompanyLogo2 
{ 
    [DataMember] 
    private Shape ShapeOfLogo; 
    [DataMember] 
    private int ColorOfLogo; 
}
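
Applied to my actual problem, the equivalent would presumably be to declare the concrete calendar type up front; here is a sketch, with ICompanyService and CompanyInfo as illustrative names:

// Sketch: tell the serializer about the concrete calendar it will encounter.
// ServiceKnownType on the contract (or KnownType on the data contract) both work.
[ServiceContract]
[ServiceKnownType( typeof( System.Globalization.GregorianCalendar ) )]
public interface ICompanyService
{
    [OperationContract]
    CompanyInfo GetCompanyInfo();
}

[DataContract]
public class CompanyInfo
{
    // Declared as CultureInfo, whose Calendar member is typed as the abstract
    // Calendar class but is a GregorianCalendar at runtime.
    [DataMember]
    public System.Globalization.CultureInfo Culture { get; set; }
}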

But in other cases, like with System.Type, the concrete runtime types are internal and you are basically out of luck.

So I ended up NOT sending my object, but rather just sending generated Entity objects, as these seemed to be safer than what I had been putting together, which kept crashing WCF.

My Windows Service Won’t Start! (And Proxy Woes)

On a recent project, my colleagues and I were asked to write an application that had quite a few moving parts to it, one of which was a pretty standard Windows service. In our WiX installer configuration, we specified that the NetworkService account should be used to start up the service.

We tested this configuration on multiple systems, both real and virtual, at our local site, and everything worked just as we expected. Awesome, ship it!

The trouble started once we delivered the build to our client’s QA department. On their systems, on their test network, we saw the following error message when the installer reached the point where it tried to start the service:

Error 1920. Service OurService (OurService) failed to start. Verify that you have sufficient privileges to start system services.

What the heck?  And then the real head scratching began. 

Normally, when a Windows service fails to start with a message like this, it is due to a genuine permissions problem: either the service startup account or the user running the install has insufficient privileges. So we first checked that the service install folders had the correct permissions, that the user running the installer had Admin privileges, etc. Everything seemed to be in order.

Then, we tried to change the service startup user to the account of the QA person doing the testing- and it worked!  Aha!  We thought, we are on the trail now.  So we created a new user, and gave that user the exact permissions the QA person’s account had.  And the service failed to start. 

After much subsequent testing, we established the following list of facts:

1. The service would start only with the QA account.

2. The service would start if the LAN was unplugged (!?).

3. The service would start if the test system was on a different network.

4. The QA system was using a proxy; disabling or enabling it had no impact on the install (hmm!).

So now things were getting truly perplexing. To make things even more interesting, the QA user, in a last-ditch effort to troubleshoot the issue, was running software to monitor the active ports, and reported that our service was trying to communicate with an external web address! Since there is no code in our service that talks to anything outside the network, and in fact our service wasn’t even running, this made no sense at all.

This was the best clue we had, though, and kudos to that QA person for giving it a go. Looking at the addresses being contacted, we noticed two: one was for Akamai, the other for Microsoft. One is a caching service and the other is… well, Microsoft.

And what was the request to the Microsoft site?  GET /pki/crl/products/CSPCA.crl.

A little googling and we realized what was going on.

http://softwareblog.morlok.net/tag/crl/ (a nice explanation of the way the CRL works)

Our service uses a lot of .NET 3.5 assemblies, and Microsoft has been digitally signing their assemblies for quite some time. As part of the service starting, .NET of course loads these signed assemblies and needs to check the Certificate Revocation List (CRL) to make sure that all is okay with each assembly. With the LAN unplugged, .NET is smart enough to know that no network interface is available, and the service starts. But behind the proxy, the network interface is up yet unable to reach the CRL site, so the CRL request times out after 15 seconds.

But a Windows service needs to start promptly, and 15 seconds is too long for it to just sit there waiting, so it puked with the 1920 error before the check could complete.

The fix? Thankfully very simple! Adding these three lines (inside the root <configuration> element of the service’s .config file) disables the check for the CRL (in .NET 2.0 and later only):

<runtime>
    <generatePublisherEvidence enabled="false"/>
</runtime>

Such a simple fix!  What took so long to find it?  There were several contributing factors:

1. Unfamiliarity with this issue beforehand (and if you have run into it once, you won’t forget!)

2. Our unfamiliarity with the customer testing environment

3. The customer’s unfamiliarity with their own environment

4. Overly complex testing systems

5. Distance between the customer and ourselves

These all played a role in the time it took to find the REAL source of the problem rather than chasing ghosts.

Hopefully this will help someone else out there avoid this pitfall.