Expose your objects over HTTP with minimal coding

Quite often we build systems that use web services to expose data or business methods for client applications to consume. In a classic three-tier scenario it may be either the data layer exposed through a web service as a data service, or the business layer exposed through a web service as a business service. Both are perfectly valid candidates for exposure through a web service, depending on our needs and requirements. But if web services already exist, why build a custom mechanism and use that instead of web services? The following section lists some of the things I don't like when coding standard Microsoft web services.

Points I don't like when using Microsoft Web Services.

What I do not like when coding a web service with the standard .NET Framework approach is the feeling of redundant coding I get. Surely you've felt the same at some point, especially when all we are doing is writing pass-through functions to the business or data layer.

Further to that, I end up having to maintain a mess of web service references in my client application: I need to add one reference for each asmx page created. Moreover, when creating a web service reference, the .NET Framework generates a custom proxy to access that web service. This proxy also re-defines any complex types (e.g. typed datasets or custom classes) returned through the web service. What I don't like about this is that even though two separate asmx files may expose the same type, each web service reference added will redefine that type in its own copy of the proxy. This leaves the client application with multiple definitions of what is essentially the same data type.

Automagical web services :) What would I like to see instead?

  • I would like to be able to switch between 'web service' mode and 'non web service' mode easily, with just a simple switch in the client application.
  • I do not want to create redundant code in web methods just so that I can expose my data or business objects over HTTP. The code is already in those layers; I just want the ability to consume them over an HTTP connection.
  • I would just like to add a 'magic' attribute to my business/data classes that would mean 'this object can be invoked over HTTP'.
  • I would like to be able to consume my business or data objects seamlessly, with no requirements to add tons of web references on my client application.

For these reasons, I thought it would be worthwhile to find a method of dynamically exposing my data or business objects over an HTTP channel. I'd like to share a solution that addresses the above issues, but as in most cases there are trade-offs.

What's the trade-off, though? The trade-off is that this is not web services, and because of that, the custom HTTP invocation protocol is not interoperable with non-.NET platforms.

If interoperability is a requirement in your application, then this post does not offer any solutions to the above points. If, however, interoperability with other platforms is not a requirement, then you may find this helpful. Furthermore, this solution requires that your base business or data objects inherit from ContextBoundObject, but this requirement could be overcome in a different implementation.

Method Interception and how this solution works.

Sample solution structure

This solution is based on the following points:

  • Create an object and intercept all method calls to that object
  • Determine whether we are in 'web service mode'
  • If we are not in 'web service mode', just call the underlying data object's method and return its value
  • If we are in 'web service mode', marshal the call across to a custom HTTP handler; the handler creates the data object via reflection, executes the method, and returns the return value (if any) along with any output parameters

The key point of this architecture is the ability to intercept method calls to the object. Once we can intercept calls, we can decide whether to route each request to the underlying object directly, or over HTTP to our handler. The handler is generic and does not require any code modifications; it does, however, require a reference to the dll that contains our DAL classes.

Thomas Danecker has an excellent post about how to intercept methods with a custom proxy.
http://tdanecker.blogspot.com/2007/09/interception-with-proxies.html

I like that solution because it uses only classes from the .NET Framework. What I don't like about it is the requirement it imposes on the proxied types to derive from ContextBoundObject.

Oren has a nice short post where he compares 7 different options when it comes to method interception, with pros and cons of each:
http://www.ayende.com/Blog/archive/2007/07/02/7-Approaches-for-AOP-in-.Net.aspx


This solution uses interception with native Microsoft .Net classes as per Thomas's example.

Ok, now some code!

Consider the following code 'framework code': something you write once, and all your applications can consume it, suddenly gaining the ability to flip between web service mode and non web service mode transparently.

Our sample application consists of a DAL, a BLL, a UI and a test project with some test cases. The solution also includes the Dynamic Service itself. So, let's see some of this in action and how it all fits together. For the purpose of this test, our UI is a console application. This is what it would typically look like.

Listing #1:

using System;
using System.Text;
using DynamicService.Library;
using DynamicService.DAL;

namespace DynamicService.UI
{
    class Program
    {
        static void Main(string[] args)
        {

            using (WebServiceSettingsScope settingScope =
                    new WebServiceSettingsScope(
                        "http://localhost:4040/Dataservice.ashx?db=TestDb", "", "", 3000))
            {
                TestDal dal = new TestDal();
                int i = dal.ReturnInputParameter(10);
                Console.WriteLine(i);
            }

            Console.ReadKey();
        }
    }
}


And this is what our TestDal class would look like (listing #2):

using System;
using System.Text;
using System.Data;
using DynamicService.Library;

namespace DynamicService.DAL
{
    [HttpRemotable]
    public class TestDal : ContextBoundObject
    {
        public TestDal()
        {
            //This is how the dal knows its connection string, for example...
            //the code below could be added in a base dal class that would allow the derived data objects
            //to know about the database they are talking to (useful for multi database scenarios)
            DbConnectionStringSettings connectionStringSettings =
                ConnectionSettingsScope.ActiveConnectionSettings as DbConnectionStringSettings;

            string connectionString;
            if (connectionStringSettings != null)
                connectionString = connectionStringSettings.ConnectionString;
        }

        public int ReturnInputParameter(int param)
        {
            return param;
        }
    }
}

 

Well, this should get us going! So, the solution proposes that objects that can be invoked remotely over HTTP are decorated with an HttpRemotable attribute. They must also derive from ContextBoundObject so that their methods can be intercepted by the appropriate proxy and routed accordingly, either to the object directly, or to our generic handler.

Listing #1 shows that the remotable object, in this case our TestDal, should be instantiated within a ConnectionSettingsScope. There are two types of scope, both deriving from ConnectionSettingsScope: DbConnectionSettingsScope and WebServiceSettingsScope.

Depending on the scope within which the TestDal object is created, it will be wrapped with either a ServiceProxy or an InterceptorProxy. The ServiceProxy routes requests to the service defined in the WebServiceSettingsScope. If the object is created within a DbConnectionSettingsScope, the InterceptorProxy simply forwards the requests on to the underlying object within the same application domain.
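
To illustrate how little has to change, this is roughly what the client code from Listing #1 looks like in 'non web service mode'. Note that the DbConnectionSettingsScope constructor arguments shown here are only indicative; the exact signature is in the sample solution.

using System;
using DynamicService.DAL;
using DynamicService.Library;

namespace DynamicService.UI
{
    class LocalModeProgram
    {
        static void Main(string[] args)
        {
            //Same client code, but wrapped in a DbConnectionSettingsScope: the
            //HttpRemotable attribute will now create an InterceptorProxy and the
            //call executes in-process, with no HTTP involved.
            using (DbConnectionSettingsScope settingScope = new DbConnectionSettingsScope(
                    "Server=.;Database=TestDb;Integrated Security=SSPI;")) //indicative arguments
            {
                TestDal dal = new TestDal();
                int i = dal.ReturnInputParameter(10);
                Console.WriteLine(i);
            }

            Console.ReadKey();
        }
    }
}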

How does this work behind the scenes? The decision of which Proxy is created is made within the HttpRemotableAttribute.

 

using System;
using System.Text;
using System.Runtime.Remoting.Proxies;

namespace DynamicService.Library
{
    public class HttpRemotableAttribute : ProxyAttribute
    {
        public override MarshalByRefObject CreateInstance(Type serverType)
        {
            if (!ConnectionSettingsScope.Exists)
                throw new Exception("Error creating serviced object: "  + serverType.FullName + "\r\n\r\n You must define the ServiceScope to create a ServicedObject\r\ne.g. using(ServiceScope svcScope = new svcScope(...)) \r\n{ //object instantiation } \r\n\r\n");

            RealProxy proxy = null;

            //if we are using a remote invocation then create a service proxy 
            if (ConnectionSettingsScope.UseRemoteInvocation)
            {
                WebServiceSettings webServiceSettings = ConnectionSettingsScope.ActiveConnectionSettings as WebServiceSettings;
                if (webServiceSettings == null)
                    throw new Exception("Unexpected error: the ConnectionScope.ActiveConnectionSettings is null.");

                proxy = new ServiceProxy(serverType, webServiceSettings);
            }
            else
            {
                //just create a dumb proxy that will forward on the requests locally
                proxy = new InterceptorProxy(serverType);
            }

            MarshalByRefObject transparentProxy = (MarshalByRefObject)proxy.GetTransparentProxy();
            return transparentProxy;
        }
    }
}

 

HttpRemotable is a special attribute that inherits from ProxyAttribute. What makes it special is the CreateInstance method, which is called when we try to create an instance of an object that derives from ContextBoundObject and is decorated with a ProxyAttribute-derived class such as HttpRemotable. It determines the type of proxy to create depending on the active ConnectionSettings on our ConnectionSettings stack.
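
To make the local branch more concrete, here is a rough sketch of what a pass-through proxy along the lines of the InterceptorProxy looks like. This is a simplified illustration of the standard ContextBoundObject/RealProxy pattern rather than the actual class from the sample solution; the ServiceProxy follows the same shape, but instead of executing the message in-process it packages the call into a ServiceRequest and posts it to the handler.

using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Activation;
using System.Runtime.Remoting.Messaging;
using System.Runtime.Remoting.Proxies;
using System.Runtime.Remoting.Services; // EnterpriseServicesHelper (System.EnterpriseServices.dll)

namespace DynamicService.Library
{
    //Simplified pass-through proxy: every call made on the transparent proxy
    //ends up in Invoke, where we can decide what to do with it.
    public class InterceptorProxy : RealProxy
    {
        private MarshalByRefObject _target;

        public InterceptorProxy(Type serverType) : base(serverType) { }

        public override IMessage Invoke(IMessage msg)
        {
            IConstructionCallMessage ctorCall = msg as IConstructionCallMessage;
            if (ctorCall != null)
            {
                //The very first intercepted message is the constructor call:
                //build the real server object behind this proxy and hand the
                //transparent proxy back to the caller.
                InitializeServerObject(ctorCall);
                _target = GetUnwrappedServer();
                return EnterpriseServicesHelper.CreateConstructionReturnMessage(
                    ctorCall, (MarshalByRefObject)GetTransparentProxy());
            }

            //Any other message is a method call; in 'non web service mode' we
            //simply execute it against the underlying object in-process.
            IMethodCallMessage methodCall = (IMethodCallMessage)msg;
            return RemotingServices.ExecuteMessage(_target, methodCall);
        }
    }
}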

What is this ConnectionSettingsScope that I keep referring to?

 

using System;
using System.Text;
using System.Threading;
using System.Collections.Generic; 
using DynamicService.Library;

namespace DynamicService.Library
{
    public class ConnectionSettingsScope : IDisposable
    {
        [ThreadStatic] //Each thread will have its own copy of a ServiceScope stack
        private static Stack<ConnectionSettings> _ConnectionSettings;

        protected static Stack<ConnectionSettings> ConnectionSettingsStack
        {
            get
            {
                //this is thread safe, ConnectionSettings is a ThreadStatic variable.
                if (_ConnectionSettings == null)
                    _ConnectionSettings = new Stack<ConnectionSettings>();
                return _ConnectionSettings;
            }
        }

        public ConnectionSettingsScope(ConnectionSettings ConnectionSettings)
        {
            ConnectionSettingsStack.Push(ConnectionSettings);
        }

        static ConnectionSettingsScope() { }

        public ConnectionSettingsScope() { }

        public static bool UseRemoteInvocation
        {
            get { return ActiveConnectionSettings is WebServiceSettings; }
        }

        public static bool Exists
        {
            get { return (ConnectionSettingsStack != null) && (ConnectionSettingsStack.Count > 0); }
        }

        public static ConnectionSettings ActiveConnectionSettings
        {
            get { return GetCurrentThreadConnectionSettings(); }
        }

        private static ConnectionSettings GetCurrentThreadConnectionSettings()
        {
            if (ConnectionSettingsStack.Count > 0)
                return ConnectionSettingsStack.Peek();
            else
                return null;
        }

        // Dispose() calls Dispose(true)
        public void Dispose()
        {
            Dispose(true);
            GC.SuppressFinalize(this);
        }

        // The bulk of the clean-up code is implemented in Dispose(bool)
        protected virtual void Dispose(bool disposing)
        {
            if (disposing)
            {
                // free managed resources
                ConnectionSettingsStack.Pop(); // Remove the latest server url from the threadstatic stack
            }
        }
    }
}

Maybe I need to explain the above code snippet a little. ConnectionSettingsScope contains a [ThreadStatic] stack of ConnectionSettings. Each time we create a new ConnectionSettingsScope object, it pushes the latest ConnectionSettings onto that thread-static stack ([ThreadStatic] means the stack is static at the thread level; each thread has its own ConnectionSettings stack). Each time we create an HttpRemotable object, the settings used for it are taken from the most recently created ConnectionSettingsScope.
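
Because the settings live on a per-thread stack, scopes can also be nested and the innermost scope wins, since ActiveConnectionSettings simply peeks at the top of the stack. A quick sketch of client code (again, the scope constructor arguments are indicative only):

using (new WebServiceSettingsScope("http://localhost:4040/Dataservice.ashx?db=TestDb", "", "", 3000))
{
    TestDal remoteDal = new TestDal();      //routed over HTTP to the handler

    using (new DbConnectionSettingsScope("Server=.;Database=TestDb;Integrated Security=SSPI;"))
    {
        TestDal localDal = new TestDal();   //executes in-process against the local database
    }

    TestDal remoteAgain = new TestDal();    //back to HTTP once the inner scope is disposed
}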

 

DataService.ashx explained

So what happens on the other side of the fence? What happens when we create our HttpRemotable object within a WebServiceSettingsScope?
All method calls are intercepted by the ServiceProxy, and a ServiceRequest is sent over to our service handler. The ServiceRequest describes what we are trying to do: it contains the fully qualified name of the underlying real type that has been proxied, the method we want to execute, and all the parameters passed to the method.

This information is used by the handler to instantiate the object via reflection and call the appropriate method. When method execution completes, the handler constructs a ServiceResponse to send back to the client. This response contains the return value along with the values of any output parameters.
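
At its heart the handler is little more than a reflection dispatch. The sketch below is a simplified illustration of that core, not the actual handler code: the real handler also wraps the call in a DbConnectionSettingsScope, performs proper overload resolution and gzips the ServiceResponse, and the class and namespace names here are mine.

using System;
using System.Reflection;

namespace DynamicService.Service
{
    //Simplified version of what the generic handler does with an incoming request.
    public static class ReflectionDispatcher
    {
        public static object Execute(string assemblyQualifiedTypeName, string methodName,
                                     object[] parameters, out object[] parametersAfterCall)
        {
            //Recreate the proxied type from the fully qualified name carried in the request.
            Type targetType = Type.GetType(assemblyQualifiedTypeName, true);
            object target = Activator.CreateInstance(targetType);

            //Resolve the method; the real handler also takes the parameter types
            //from the request into account to pick the right overload.
            MethodInfo method = targetType.GetMethod(methodName);
            if (method == null)
                throw new MissingMethodException(targetType.FullName, methodName);

            //Invoke the method. The parameters array is updated in place for
            //ref/out parameters, which is how their values end up in the ServiceResponse.
            object returnValue = method.Invoke(target, parameters);
            parametersAfterCall = parameters;
            return returnValue;
        }
    }
}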

The actual connection string to the database is stored on the server side. To allow for a multi-database scenario, the handler is called in the format DataService.ashx?db=ConnectionStringKeyName

ConnectionStringKeyName is the key of the connection string entry in web.config for the database we want to use.


All objects are created within a DbConnectionSettingsScope on the service handler side, so they point to the database in the selected connection string entry of web.config, as described above.
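
Resolving that connection string inside the handler boils down to something like the following sketch (the helper class and its name are illustrative, not part of the sample solution):

using System.Configuration;
using System.Web;

namespace DynamicService.Service
{
    public static class ConnectionStringResolver
    {
        //Look up the web.config connection string named by the ?db= parameter,
        //e.g. DataService.ashx?db=TestDb picks the 'TestDb' entry.
        public static string Resolve(HttpContext context)
        {
            string key = context.Request.QueryString["db"];
            ConnectionStringSettings entry = ConfigurationManager.ConnectionStrings[key];
            if (entry == null)
                throw new ConfigurationErrorsException(
                    "No connectionStrings entry named '" + key + "' was found in web.config.");
            return entry.ConnectionString;
        }
    }
}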

The DataService.ashx project must reference the dlls that contain the types to be remotely invoked; in this case a single reference to our DAL will suffice. Furthermore, when publishing this web project we must ensure that the referenced dll is also in place.
I can see some of you already thinking that adding references is exactly what I wanted to avoid. True to some extent, but what I wanted to avoid was the volume of references: with classic web services I would typically have created one separate asmx file per data object that I wanted to expose.

 

The Service Handler supports:

  • Methods with void result
  • Methods with parameters whose values are serialisable
  • Graceful handling of exceptions and returning them to the client
  • Methods with output parameters
  • Overloaded methods with different parameter signatures.
  • ServiceResponses are gzipped by default to minimise data traffic

The Service Handler does not support:

  • Methods whose parameter values are not serialisable
  • Generic methods


Testing and Performance

As mentioned above, the generic service handler uses reflection to reconstruct the underlying object being invoked. To assess whether this means a serious performance hit, I created a few tests that benchmark it against equivalent invocations of a Microsoft .NET web service.

In my tests the performance was broadly similar, and in some cases the custom service handler calls outperformed the equivalent web service calls.
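
The benchmarks themselves are nothing fancy; one of them is roughly shaped like the snippet below, with the equivalent web service test simply swapping the body of the loop for a call on a generated asmx proxy (the namespace and class name here are indicative only):

using System;
using System.Diagnostics;
using DynamicService.DAL;
using DynamicService.Library;

namespace DynamicService.Tests
{
    class ServiceHandlerBenchmark
    {
        static void Main()
        {
            const int iterations = 100;
            Stopwatch watch = Stopwatch.StartNew();

            using (new WebServiceSettingsScope(
                    "http://localhost:4040/Dataservice.ashx?db=TestDb", "", "", 3000))
            {
                TestDal dal = new TestDal();
                for (int i = 0; i < iterations; i++)
                    dal.ReturnInputParameter(i); //each call goes through the generic handler
            }

            watch.Stop();
            Console.WriteLine("{0} calls in {1} ms", iterations, watch.ElapsedMilliseconds);
        }
    }
}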

The following image contains the various test cases that validate the features supported by the generic Service Handler.

[Image: test cases validating the features supported by the generic service handler]

Summing it all up

This is a solution you may consider, so long as cross-platform interoperability is not a concern for your application. The remotable objects should only have stateless, atomic operations invoked on them, just as would be the case with web services; if you require the object to stay alive on the remote tier, then Remoting is probably a better fit. This solution allows you to switch between web service mode and non web service mode by changing literally one line of code in your application, and it minimises redundant coding in a web service tier by letting you expose your objects over an HTTP channel simply by decorating them with one attribute and deriving from ContextBoundObject. I hope this helps you shave some development time off your projects!

 

Suggestions, complaints, errors? I will be very happy to hear your opinion and feedback on this. There is a Visual Studio 2008 solution file that can be downloaded from here to get you started. Happy coding!

 

Sample solution for this post: DynamicService.zip

Comments (2) -

3/24/2008 2:34:37 PM #

Bart Czernicki

You do realize you can do this with WCF 3.5 in a lot less code and not waste time writing all the "plumbing" code you wrote.

I don't really have anything negative to say about it, it's just that I would prefer to spend time on the "creative business logic" rather than plumbing code.

Bart Czernicki

3/24/2008 4:07:49 PM #

Anastasiosyal

Hello Bart and thank you for your comment. I do not have a thorough knowledge of WCF 3.5 but I do have the impression that trying to accomplish something similar on an existing code base seems a little bit more involved.

Here is a pointer url for a WCF3.5 example:
blogs.conchango.com/.../rest-from-wcf-3-5.aspx

The author needs to define service contracts, define data contracts, and then enter a fair amount of configuration in the config file to define the service endpoints and bindings. In contrast, all you have to do in this solution is decorate the class you want to invoke remotely with HttpRemotable and derive from ContextBoundObject. Comparing the two approaches, I find it easier to 'web service'-enable an existing application for HTTP calls using this method than to go through all the steps needed to implement it in WCF 3.5.

In this post I went on to explain the inner workings of this solution. Explaining how it works behind the scenes may have made it sound a little more complicated than what is actually required to consume it.


Without doubt WCF is more robust, but when it comes to coding I see this as a simpler and faster solution for simpler scenarios. I guess it all comes down to the application requirements (time, speed, interoperability, code simplicity); we just have to weigh it up and choose what suits us best. However, implementing WCF 3.5 might be simpler than I have come to think! If you have any pointers to links that show how to implement this in WCF 3.5 in a minimum number of steps, it would be much appreciated :)

Thanks
Anastasiosyal




