# Friday, February 05, 2010

A while back I wrote a blog post about DataSets and why you shouldn’t use them on service boundaries. The fundamental issues are:

  1. No non-.NET client has any idea what the data you are sending them looks like
  2. Hidden, non-essential data is passed up and down the wire
  3. Your client gets coupled to the shape of your data (usually a product of the Data Access Layer)

So when I saw that Entity Framework 4.0 supports Self-Tracking Entities (STEs) I was interested to see how they would work – after all, automated change tracking is one of the reasons people wanted to use DataSets in service contracts. The idea is that as you manipulate its state, the object itself tracks the changes to its properties and whether it has been newly created or marked for deletion. It does this by implementing an interface called IObjectWithChangeTracker, which is then used by the ObjectContext to work out what needs to be done in terms of persistence. There is an extension method called ApplyChanges which does the heavy lifting.
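To make this concrete, the tracking contract the T4 template generates looks roughly like this (a sketch from memory of the generated code, so member names may differ slightly in your own output):

```csharp
// Sketch of the interface the STE T4 template generates.
// The ChangeTracker carries the entity's state (Added, Modified,
// Deleted, Unchanged) plus the original values of changed properties,
// and ApplyChanges on the service side reads it back out.
public interface IObjectWithChangeTracker
{
    ObjectChangeTracker ChangeTracker { get; }
}
```

The key point is that this state travels with the entity across the wire as part of the data contract.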

The Entity Framework team has released a T4 template to generate these STEs from an EDMX file, and the nice thing is that the generated entities themselves have no dependency on the Entity Framework. Only the generated context class has this dependency and so, the story goes, the client needs to know nothing about Entity Framework – only the service does. The client, and the entities, remain ignorant of the persistence model. For this reason STEs have been touted as a powerful tool in n-tier architectures.

All of this seems almost too good to be true … and unfortunately it is.

To understand the issue with STEs we have to remember two of the main goals of service-based systems:

  1. Support heterogeneous systems – the service ecosystem is not bound to one technology
  2. Decoupling – to ensure changes in a service do not cascade to consumers of the service for technical reasons (there may of course be business reasons why we might want a change in a service to affect the consumers, such as changes in law or legislation)

So to understand the problem with STEs we need to look at the generated code. Here’s the model I’m using:

(EDMX model diagram)

Now let’s look at the T4 template-generated code for, say, the OrderLine:

[DataContract(IsReference = true)]
[KnownType(typeof(Order))]
public partial class OrderLine: IObjectWithChangeTracker, INotifyPropertyChanged
{
    #region Primitive Properties
 
    [DataMember]
    public int id
    {
        get { return _id; }
        set
        {
            if (_id != value)
            {
                ChangeTracker.RecordOriginalValue("id", _id);
                _id = value;
                OnPropertyChanged("id");
            }
        }
    }
    private int _id;
 
    // more details elided for clarity
}

A few things to notice: firstly, this is a DataContract and therefore designed to be used in WCF contracts – that is its intent; secondly, a fair bit of work takes place inside the generated property setters. Each setter checks whether the value has actually changed, records the old value in the ChangeTracker and then calls OnPropertyChanged to raise the PropertyChanged event (defined on INotifyPropertyChanged). Let’s have a look inside OnPropertyChanged:

protected virtual void OnPropertyChanged(String propertyName)
{
     if (ChangeTracker.State != ObjectState.Added && ChangeTracker.State != ObjectState.Deleted)
     {
         ChangeTracker.State = ObjectState.Modified;
     }
     if (_propertyChanged != null)
     {
         _propertyChanged(this, new PropertyChangedEventArgs(propertyName));
     }
}

OK, so a bit more than just raising the event: it also marks the object as modified in the ChangeTracker. The ChangeTracker state is how the STE serializes its self-tracked changes, and it is this data that the ApplyChanges extension method uses to work out what has changed. The thing to remember here is that the change tracking is performed by code generated into the property setters.

Well, the T4 template has done its work, so we create our service contract using these conveniently generated types and the client uses the metadata and Add Service Reference to build its proxy code. It gets an OrderLine from the service, updates the quantity and sends it back. The service calls ApplyChanges on the context, then saves the changes and … nothing changes in the database. What on earth went wrong?
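The service-side update path I'm describing boils down to something like this (a hedged sketch: the context and operation names are mine, but ApplyChanges and SaveChanges are the real STE/ObjectContext calls):

```csharp
// Sketch of a service operation persisting an STE sent back by the client.
// "MyEntities" and "UpdateOrderLine" are illustrative names.
public void UpdateOrderLine(OrderLine line)
{
    using (var ctx = new MyEntities())
    {
        // ApplyChanges is generated alongside the STE context: it reads the
        // entity's ChangeTracker and replays the recorded changes into the context
        ctx.OrderLines.ApplyChanges(line);

        // If the client never marked the entity Modified, this persists nothing
        ctx.SaveChanges();
    }
}
```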

At this point we have to step back and think about what the types we use in service contracts are actually doing. They are nothing more than serialization helpers – they bridge the object world to the XML one. Metadata generation uses the type definition and the attribute annotations to generate a schema (XSD) description of the data in the class. Notice we’re only talking about data – there is no concept of behavior. And this is the problem: when the Add Service Reference code generation takes place it is based on the schema in the service metadata, so the objects *look* right; it's just that the all-important code in the property setters is missing. You can change the state of entities in the client and the service will never be able to work out whether the state has changed – so changes don’t get propagated to the database.
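For contrast, the setter in the client-side proxy that Add Service Reference generates is, in essence, just this (an illustrative sketch; the actual generated code differs cosmetically):

```csharp
// Client-side proxy property (sketch): the schema only describes data,
// so the regenerated setter carries none of the change-tracking code
public int id
{
    get { return this.idField; }
    set
    {
        this.idField = value;
        this.RaisePropertyChanged("id"); // no ChangeTracker.RecordOriginalValue
    }
}
private int idField;
```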

There is a workaround for this problem. You take the generated STEs and put them in a separate assembly which you give to the client. Now the client has all of the change tracking code available to it, changes get tracked and the service can work out what has changed and persist it.

But what have we just done? We have forced the client to be .NET. Not only that, we’ve probably compiled against .NET 4.0 and so we are requiring the client to be .NET 4.0 as well – we might as well have taken a dependency on Entity Framework 4.0 in the client at this point. In addition, changes I make to the EDMX file will be reflected in the T4-generated code – I have coupled my client to my data access layer. Compare this with the problems of using DataSets on service boundaries again: we’re pretty much back where we started, although the data being transmitted is more controlled.

So STEs at first glance look very attractive, but in service terms their effect on the consumer is much the same as using DataSets. What is the solution? Well, we’re back with our old friends DTOs and AutoMapper. To produce real decoupling and allow heterogeneous environments we have to explicitly model the *data* being passed at service boundaries. How this is patched into our data access layer is up to the service. Entity Framework 4.0 certainly improves matters here over version 1.0, as we can use POCOs, which aid the testability and flexibility of our service.
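A minimal sketch of that DTO approach (the DTO shape and mapping class are mine; the static CreateMap/Map calls are AutoMapper's classic API):

```csharp
// The explicit contract type: this, not the entity, is what goes on the wire
[DataContract]
public class OrderLineDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public int Quantity { get; set; }
}

public static class OrderLineMapping
{
    // One-off configuration, typically at service start-up
    public static void Configure()
    {
        Mapper.CreateMap<OrderLine, OrderLineDto>();
    }

    // In the service: map the entity to the DTO before returning it
    public static OrderLineDto ToDto(OrderLine line)
    {
        return Mapper.Map<OrderLine, OrderLineDto>(line);
    }
}
```

The schema generated from OrderLineDto describes exactly the data the service chooses to expose, and nothing about how it is persisted.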

.NET | EF4 | WCF
Friday, February 05, 2010 12:48:08 PM (GMT Standard Time, UTC+00:00)
# Thursday, February 04, 2010

I’ve just noticed that Andy has blogged about the drop-in clinic we are running at DevWeek 2010. All of the Rock Solid Knowledge guys will be at the conference, so come over and let us help solve your design and coding problems – whether it be WCF, Workflow, multithreading, WPF, Silverlight, ASP.NET MVC, design patterns or whatever you are currently wrestling with. We’re also doing a bunch of talks (detailed on our Conferences page).

DrRockman

.NET | RSK | WCF | WF | WF4
Thursday, February 04, 2010 1:19:44 PM (GMT Standard Time, UTC+00:00)
# Monday, January 18, 2010

I’ve just realised that DevWeek 2010 is on the horizon. DevWeek is one of my favourite conferences – lots of varied talks, vendor independence, and I get to hang out with lots of friends for the week.

This year I’m doing a preconference talk and two sessions:

A Day of .NET 4.0
Mon 15th March 2010
WORKSHOP REF: M1
.NET 4.0 is a major release of .NET, including the first big changes to the .NET runtime since 2.0. In this series of talks we will look at the big changes in this release in C#, multithreading, data access and workflow.
The best-known change in C# is the introduction of dynamic typing. However, there have been other changes that may affect your lives as developers more, such as support for generic variance and for named and optional parameters. There is a whole new library for multithreading, PFx, that not only assists you in parallelizing algorithms but in fact replaces the existing multithreading libraries with a single, unified, powerful API.
Entity Framework has grown up – the initial release, although gaining some popularity, missed features that made it truly usable in large-scale systems. The new release, version 4.0, introduces a number of features, such as self-tracking objects, that assist using Entity Framework in n-tier applications.
Finally, workflow gets a total rewrite to allow the engine to be used far more widely, having overcome limitations in the WF 3.5 API.
This workshop will take you through all of these major changes, and we’ll also talk about some of the other smaller but important ones along the way.

An Introduction to Windows Workflow Foundation 4.0
Tues 16th March 2010
.NET 4.0 introduces a new version of Windows Workflow Foundation. This new version is a total rewrite of the version introduced with .NET 3.0. In this talk we look at what Microsoft are trying to achieve with WF 4.0, why they felt it necessary to rewrite rather than modify the codebase, and what new features are available in the new library. Along the way we will be looking at the new designer, declarative workflows, asynchronous processing, sequential and flowchart workflows and how workflow’s automated persistence works.

Creating Workflow-based WCF Services
Tues 16th March 2010
There are very good reasons for using a workflow to implement a WCF service: workflows can provide a clear platform for service composition (using a number of building block services to generate a functionally richer service); workflow can manage long running stateful services without having to write your own plumbing to achieve this. The latest version of Workflow, 4.0, introduces a declarative model for authoring workflows and new activities for message based interaction. In addition we have a framework for flexible message correlation that allows the contents of a message to determine which workflow the message is destined for. In this session we will look at how you can consume services from workflow and expose the resulting workflow itself as a service.

As well as these, you may well see me popping up as a guest code-monkey in the other Rock Solid Knowledge sessions. Hope to see you there.

.NET | WCF | WF
Monday, January 18, 2010 2:52:19 PM (GMT Standard Time, UTC+00:00)
# Thursday, January 07, 2010

The Routing Service is a new feature of WCF 4.0 (shipping with Visual Studio 2010). It is an out-of-the-box SOAP Intermediary able to perform protocol bridging, multicast, failover and data dependent routing. I’ve just uploaded a new screencast walking through setting up the Routing Service and showing a simple example of protocol bridging (the client sends messages over HTTP and the service receives them over NetTcp). This screencast is one of a series I will be recording about the Routing Service. You can find the screencast here.
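The bridging scenario in the screencast comes down to configuration along these lines (a sketch only – the addresses, filter and endpoint names are illustrative; the RoutingService type and IRequestReplyRouter contract are the real WCF 4.0 ones):

```xml
<system.serviceModel>
  <services>
    <!-- the router: clients talk basicHttpBinding to this endpoint -->
    <service name="System.ServiceModel.Routing.RoutingService"
             behaviorConfiguration="routing">
      <endpoint address="http://localhost:8000/router"
                binding="basicHttpBinding"
                contract="System.ServiceModel.Routing.IRequestReplyRouter" />
    </service>
  </services>
  <behaviors>
    <serviceBehaviors>
      <behavior name="routing">
        <routing filterTableName="routes" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
  <routing>
    <filters>
      <!-- MatchAll: every message is forwarded to the target endpoint -->
      <filter name="all" filterType="MatchAll" />
    </filters>
    <filterTables>
      <filterTable name="routes">
        <add filterName="all" endpointName="realService" />
      </filterTable>
    </filterTables>
  </routing>
  <client>
    <!-- the router forwards over NetTcp to the real service -->
    <endpoint name="realService"
              address="net.tcp://localhost:9000/service"
              binding="netTcpBinding"
              contract="*" />
  </client>
</system.serviceModel>
```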

.NET | RSK | WCF
Thursday, January 07, 2010 1:24:26 PM (GMT Standard Time, UTC+00:00)
# Thursday, December 17, 2009

Seeing as this has changed completely from WF 3.5, I thought I’d post a quick blog entry describing how to run a workflow declared in a XAML file.

You may have heard that the WF 4.0 default authoring model is now XAML. However, the Visual Studio 2010 workflow projects store the XAML as a resource in the binary rather than as a text file. So if you want to deploy your workflows as XAML text files, how do you run them? In .NET 3.5 you could pass the workflow runtime an XmlReader pointing at the XAML file, but in WF 4.0 there is no WorkflowRuntime class. It turns out you need to load the XAML slightly indirectly, by creating a special activity called a DynamicActivity.

DynamicActivity has an Implementation member that points to a Func<Activity> delegate – in other words you wire up a method that returns an activity (the workflow). Here’s an example:

static void Main(string[] args)
{
  DynamicActivity dyn = new DynamicActivity();
  
  // this line wires up the workflow creator method
  dyn.Implementation = CreateWorkflow;
 
  WorkflowInvoker.Invoke(dyn);
}
 
static Activity CreateWorkflow()
{
  // we use the new XamlServices utility class to deserialize the XAML
  Activity act = (Activity)XamlServices.Load(@"..\..\workflow1.xaml");
 
  return act;
}
.NET | WF | WF4
Thursday, December 17, 2009 4:27:07 PM (GMT Standard Time, UTC+00:00)
# Monday, November 02, 2009

Thanks to everyone who attended Guerrilla.NET last week. As promised, the demos are now online and you can find them here:

http://rocksolidknowledge.blob.core.windows.net/demos/GNET261009.zip

Monday, November 02, 2009 9:24:49 AM (GMT Standard Time, UTC+00:00)
# Friday, October 02, 2009

I just got back from speaking at Software Architect 2009. I had a great time at the conference and thanks to everyone who attended my sessions. As promised the slides and demos are now available on the Rock Solid Knowledge website and you can get them from the Conferences page.

.NET | Azure | REST | RSK | WCF | WF
Friday, October 02, 2009 10:12:18 PM (GMT Daylight Time, UTC+01:00)

I had a lucky coincidence while speaking at Software Architect 2009: four hours before I gave a talk on contract-first design with WCF, the WSCF Blue guys released version 1.0 of their tool. Contract-first design is the most robust way to build services that potentially have to exist in a heterogeneous environment – it essentially means you start from the WSDL, which means everyone can consume the service because no implementation details leak onto the wire. The major stumbling block in the WCF world has been that there was no tool that could take a WSDL document and generate the skeleton of a service implementing the defined contract. Finally we have a released tool that does exactly this, in the shape of WSCF Blue. In fact it goes further than that: because writing WSDL by hand is too much like heavy lifting for many people, WSCF Blue can also take one or more schemas and run a wizard to build a WSDL document defining a contract based on the messages in those schemas.

WSCF Blue solves a problem that has existed since WCF 3.0, and the great thing is that it’s free, being released on CodePlex.

Enjoy!

.NET | WCF
Friday, October 02, 2009 10:06:53 PM (GMT Daylight Time, UTC+01:00)
# Thursday, October 01, 2009

I used to be a C# MVP a couple of years ago. When I took on the role of CTO of DevelopMentor I found I had far less time to devote to community-based activity, and so I lost my MVP status. I stepped down as DevelopMentor’s CTO back in January and suddenly found I had time again, so I can often be found hanging out in the MSDN WCF forum, speaking at conferences, and so on. Consequently, I have just heard that I’ve been awarded MVP status for Connected Systems :-) – nice to be back in the fold.

.NET | Life
Thursday, October 01, 2009 4:22:59 PM (GMT Daylight Time, UTC+01:00)
# Saturday, September 26, 2009

Thanks to everyone who came to my sessions at BASTA! It was my first time at that conference and I had a great time. Here are the slides and demos:

What’s New in Workflow 4.0

Contract First Development with WCF

Generics, Extension Methods and Lambdas – Beyond List<T>

.NET | WCF | WF
Saturday, September 26, 2009 9:24:56 AM (GMT Daylight Time, UTC+01:00)