Tuesday, 18 February 2014

Traversing paged results using Massive data access wrapper

Massive, a library created by the brilliant developer Rob Conery (also known as Jon Skeet's frequent humorous side-kick), has a lot of nice features. Its best features are ease of use and dynamic appeal. It is also fast and lightweight. After testing Massive, I wanted to create an extension method to traverse paged results returned from Massive. The code below is tested against the AdventureWorks2012 database. First off, install Massive. I have created a simple console project to test this out and then used the Library Package Manager to install Massive:

Install-Package Massive -Version 1.1.0 

Remember to add the -Version switch above. Then install the AdventureWorks 2012 database from here: http://msftdbprodsamples.codeplex.com/releases/view/55330. After installing it on your developer PC, add a connection string against the AdventureWorks 2012 database, for example by using Data Explorer or Server Explorer in Visual Studio, and name the connection string in your app.config or web.config "adventureworks2012". Define a model entity for the Production.Product table next:

    public class Products : DynamicModel
    {
        public Products()
            : base("adventureworks2012", "Production.Product", "productid")
        {

        }
    }

To add a table entity in Massive, specify the connection string name, then the name of the table and then the name of the primary key column. If you omit the name of the connection string, the first connection string found is used. Next, the code for the extension method, which works against this class and any other class inheriting from DynamicModel:

     public static class MassiveExtensions
    {

        public static void MassivePagedAction(this DynamicModel massiveObject, int pageSize = 20, int startPageIndex = 1, 
            int endPageIndex = -1, Action<int, dynamic> pageAction = null, Action<dynamic> itemAction = null)
        {

            var pages = new Dictionary<int, dynamic>(); 
            var firstPage = massiveObject.Paged(currentPage: startPageIndex, pageSize: pageSize);
            pages.Add(startPageIndex, firstPage);
            int endIndex = endPageIndex == -1 ? firstPage.TotalPages : Math.Min(firstPage.TotalPages, endPageIndex); 
            
            for (int currentPageIndex = startPageIndex + 1; currentPageIndex <= endIndex; currentPageIndex++)
            {
                var currentPage = massiveObject.Paged(currentPage: currentPageIndex, pageSize: pageSize);
                pages.Add(currentPageIndex, currentPage); 
            }

            foreach (var keyValuePair in pages)
            {
                if (pageAction != null)
                    pageAction(keyValuePair.Key, keyValuePair.Value);
                foreach (var item in keyValuePair.Value.Items)
                {
                    if (itemAction != null)
                        itemAction(item); 
                }
            }

        }

    }


The extension method puts the paged results in a dictionary, where the key is the page index and the value is the dynamic object that Massive returns, containing the paged data. There are several optional parameters to this extension method. The page size defaults to 20 if not specified. The start page index defaults to 1, which is the first page in Massive. Rob Conery should perhaps have chosen the page index to be 0-based, as this is the convention in C#, but querying for page index 0 gives empty results. The end page index defaults to -1, which means that all pages are returned until there are no more pages. If you set endPageIndex to a value less than firstPage.TotalPages inside the extension method, i.e. the actual total number of pages in the database, only pages up to endPageIndex are returned. To get a single page, pass the same value for the start and end page index. It is also possible to pass in a page action and an item action here. They default to null, but obviously at least an item action should be set. A variant of the method that accepts a Func, for example to project each item and return the results to the caller, is also interesting (a sketch follows after the sample output below). Using the extension method is shown next:

//GET PAGED RESULTS OF THE PRODUCTS TABLE USING A PAGE SIZE OF FIVE. GET ENTIRE TABLE.

            DynamicModel table = new Products();
          
            table.MassivePagedAction(pageSize:5,pageAction: (indx, page) => 
             Console.WriteLine("\nProducts in page # {0}:\n", indx), 
             itemAction: item => Console.WriteLine(item.Name)); 


//SAMPLE OUTPUT:


------ Test started: Assembly: TestMassive.exe ------


Products in page # 1:

Adjustable Race
Bearing Ball
BB Ball Bearing
Headset Ball Bearings
Blade

Products in page # 2:

LL Crankarm
ML Crankarm
HL Crankarm
Chainring Bolts
Chainring Nut

Products in page # 3:

..

//This resulted in 101 pages - There are 504 products in the AdventureWorks 2012 database (primarily mountain bikes and clothes+equipment)

Products in page # 101:

HL Bottom Bracket
Road-750 Black, 44
Road-750 Black, 48
Road-750 Black, 52

//Final page contains only four items as expected
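
As a minimal sketch of the Func-based variant mentioned above, items can be projected and returned to the caller. The MassivePagedSelect name and the projection approach are my own assumptions here, building on the MassivePagedAction method shown earlier:

    public static class MassiveProjectionExtensions
    {

        //Hypothetical variant: reuses MassivePagedAction above and projects every item
        //through a Func, returning the projected results to the caller
        public static List<TResult> MassivePagedSelect<TResult>(this DynamicModel massiveObject,
            Func<dynamic, TResult> selector, int pageSize = 20, int startPageIndex = 1, int endPageIndex = -1)
        {
            var results = new List<TResult>();
            massiveObject.MassivePagedAction(pageSize, startPageIndex, endPageIndex,
                itemAction: item => results.Add(selector(item)));
            return results;
        }

    }

//Usage (hypothetical): collect all product names, paging through the table five rows at a time

            List<string> productNames = table.MassivePagedSelect(item => (string)item.Name, pageSize: 5);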

To retrieve a paged result and work with it afterwards, another extension method can be used:

        public static Dictionary<int, List<ExpandoObject>> MassivePagedRetrieval(this DynamicModel massiveObject, int pageSize = 20, int startPageIndex = 1,
            int endPageIndex = -1)
        {

            var pages = new Dictionary<int, dynamic>();
            var pagedResult = new Dictionary<int, List<ExpandoObject>>(); 
            var firstPage = massiveObject.Paged(currentPage: startPageIndex, pageSize: pageSize);
            pages.Add(startPageIndex, firstPage);
            int endIndex = endPageIndex == -1 ? firstPage.TotalPages : Math.Min(firstPage.TotalPages, endPageIndex);

            for (int currentPageIndex = startPageIndex + 1; currentPageIndex <= endIndex; currentPageIndex++)
            {
                var currentPage = massiveObject.Paged(currentPage: currentPageIndex, pageSize: pageSize);
                pages.Add(currentPageIndex, currentPage);
            }

            foreach (var keyValuePair in pages)
            {
                List<ExpandoObject> items = new List<ExpandoObject>(); 
                foreach (var item in keyValuePair.Value.Items)
                {
                    items.Add(item); 
                }
                pagedResult[keyValuePair.Key] = items;
            }

            return pagedResult;
        }

To use this extension method, use:

     Dictionary<int, List<ExpandoObject>> pagedResult = table.MassivePagedRetrieval(pageSize: 10); 
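
The pages can then be iterated like this, treating each ExpandoObject as a dynamic object (a minimal sketch using the table instance from earlier):

            foreach (var page in pagedResult)
            {
                Console.WriteLine("Products in page # {0}:", page.Key);
                foreach (dynamic product in page.Value)
                {
                    Console.WriteLine(product.Name); //Name is resolved dynamically against the ExpandoObject
                }
            }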

The key is the page index again, and a list of ExpandoObject instances is returned for each page (or key) in the dictionary. ExpandoObject is a dynamic object. To inspect the resulting data in Visual Studio, use the debugger and expand the Dynamic View node when looking at the result at a breakpoint in your code. This article has focused on paged data, as Massive can often be used to perform paged data retrieval in, for example, ASP.NET MVC-based solutions, but other kinds of solutions can use this data access wrapper as well. The requirement is .NET 4.0, since the System.Dynamic namespace is used. Massive supports most of the features of more complex Object Relational Mappers, such as Entity Framework, and its better speed and ease of use should be tempting. Rob Conery has additional information on the Massive GitHub page.

There is a trade-off with Massive too: its dynamic nature is also its Achilles heel. If you rename fields in the database, you most likely also have to update your code, and since everything is dynamic, chances are that such errors are only detected at runtime, out in production. This is easier to avoid with a strongly typed ORM such as Entity Framework. Sadly, Entity Framework and many other ORMs are also slow. The following code shows how it is possible to retrieve a joined result across the product and product category tables (joining via the product subcategory table). As you can see, one has to pass in SQL to get the data out.

var productsWithCategories = 
table.Query("SELECT p.Name, pc.Name As CategoryName FROM Production.Product p INNER JOIN Production.ProductCategory pc ON p.ProductSubCategoryId = pc.ProductCategoryId");

foreach (var ppc in productsWithCategories)
{
 Console.WriteLine("Product Name: {0}, Product category name: {1}", ppc.Name, ppc.CategoryName);
}

//RESULTING OUTPUT: 

Product Name: Road-150 Red, 62, Product category name: Bikes
Product Name: Road-150 Red, 44, Product category name: Bikes
Product Name: Road-150 Red, 48, Product category name: Bikes
Product Name: Road-150 Red, 52, Product category name: Bikes

..

This gives a quick way to access the database, but you also lose IntelliSense (auto completion), strong static typing and compile-time type checking. When returning results from the database in a service, it is possible to return the data as ExpandoObjects, as shown in the MassivePagedRetrieval extension method above. In ASP.NET MVC, the model can bind to this data. This results in fewer application layers and less coding, but at the same time a risk for those pesky runtime errors out in production. Massive is very quick, and for web developers this looks very promising. Using ElasticObject or similar techniques, the ExpandoObjects can be converted to either XML or, better, JSON, and processed using jQuery and JavaScript (a small sketch of the JSON conversion follows at the end of this post). Whether Massive catches on, or developers end up preferring some other lightweight ORM, remains to be seen; this is a moving field. What is sure is that the days of heavyweight ORMs like Entity Framework will not be bright unless their performance catches up with these lighter frameworks. At the same time, it is hard to say whether this actually results in shorter time to market (TTM) for developers, as they lose IntelliSense and other good features that Entity Framework supports.

Test out Massive and see if it matches your needs, and use the extension methods above to get started with paged result retrieval if they look interesting. Massive of course also supports inserts, updates and deletes, and bulk updates are supported as well. In addition, validations and callbacks can be put in the class that inherits from DynamicModel. Massive also has specialized functions, such as the standard aggregation methods and convenience methods like Find, defined in DynamicModel. Lastly, Massive is heavily based on the System.Dynamic namespace in .NET 4.0 and newer framework versions, and DynamicModel itself inherits from System.Dynamic.DynamicObject.
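
As mentioned above, the ExpandoObjects can be converted to JSON. A rough sketch of one way to do this is shown below; it uses Json.NET (the Newtonsoft.Json NuGet package) rather than ElasticObject, which is an assumption of mine and just one of several options:

using Newtonsoft.Json; //install-package Newtonsoft.Json

            //Serialize the paged result from MassivePagedRetrieval to JSON,
            //for example to hand it over to jQuery/JavaScript on the client
            Dictionary<int, List<ExpandoObject>> pagedResult = table.MassivePagedRetrieval(pageSize: 10);
            string json = JsonConvert.SerializeObject(pagedResult, Formatting.Indented);
            Console.WriteLine(json);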

Monday, 17 February 2014

Using ElasticObject to parse arbitrarily large XML documents into object graphs

ElasticObject is a great implementation of a DynamicObject that allows the developer to work with an arbitrarily large XML document through an object graph. The code below accesses the Norwegian weather forecast service Yr.no to get some forecast data for the location in Norway where I grew up, where a nearby weather station automatically collects meteorological data such as wind, temperature, pressure, wind direction, humidity and so on. The data is available, among other formats, as XML to download, and ElasticObject can be used to handle the XML. When handling XML, it is possible to create an XSD from sample XML data, for example in Visual Studio through the XML -> Create Schema option after opening the sample XML file. Using the Visual Studio command prompt, it is possible to generate a class from the XSD with the command:

xsd /classes myfile.xsd

This generates a file called myfile.cs, containing classes that can be used for XML serialization.
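
As a minimal sketch, deserializing with such a generated class could look like the following; the class name weatherdata and the file name varsel.xml are hypothetical and depend on your schema:

using System.IO;
using System.Xml.Serialization;

            //Hypothetical: assumes xsd /classes generated a root class named weatherdata
            var serializer = new XmlSerializer(typeof(weatherdata));
            using (var reader = new StreamReader("varsel.xml"))
            {
                var forecast = (weatherdata)serializer.Deserialize(reader);
                //work with the strongly typed object graph here
            }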

As the sketch above indicates, the generated class can be used to deserialize the XML into an object. Often this is a preferred strategy, since one gets a strongly typed object (graph) to work with. ElasticObject is a dynamic object, and allows the developer to avoid having to generate a serialization class before deserializing the received XML. Sometimes this also allows changes in the XML to occur without affecting the code. Often it is only a limited part of the XML tree the developer needs to retrieve for further processing, and not having to generate classes from the XML schema is also convenient. Still, the developer needs to refer more often to the XML to understand the structure of the object which is created. IntelliSense support for ElasticObject is rather poor, so the developer may need to query the object graph in the Immediate Window to find the right "path" to the information in the XML document. Similar techniques using System.Xml.Linq and XPath queries can also be used (a comparison sketch is included at the end of this post). To start using ElasticObject, first install the NuGet package in Visual Studio by typing the following command in the Package Manager Console:

PM> install-package AmazedSaint.ElasticObject
Installing 'AmazedSaint.ElasticObject 1.2.0'.
Successfully installed 'AmazedSaint.ElasticObject 1.2.0'.
Adding 'AmazedSaint.ElasticObject 1.2.0' to TestElasticObject.
Successfully added 'AmazedSaint.ElasticObject 1.2.0' to TestElasticObject.

The following code then illustrates the use:

using AmazedSaint.Elastic;
using System;
using System.IO;
using System.Net;
using System.Xml.Linq;

namespace TestElasticObject
{
    class Program
    {

        static void Main(string[] args)
        {
            var wc = new WebClient();
          
            using (StreamReader sr = new StreamReader(wc.OpenRead(@"http://www.yr.no/stad/Norge/Nord-Tr%C3%B8ndelag/Steinkjer/S%C3%B8ndre%20Egge/varsel.xml")))
            {
                var data = sr.ReadToEnd();
                IterateForecasts(data);          
            }

            Console.WriteLine("Press any key to continue ...");
            Console.ReadKey(); 

        }

        private static void IterateForecasts(string data)
        {
            dynamic weatherdata = XElement.Parse(data).ToElastic();
            foreach (var node in weatherdata.forecast.text.location[null])
            {
                Console.WriteLine(string.Format("Fra-Til: {0} - {1}", node.from, node.to)); 
                Console.WriteLine(~node.body);
                Console.WriteLine();
            }
        }

    }
}


The code above uses WebClient to download the target XML data, then uses a StreamReader to read the XML into a string. Using XElement and the ToElastic extension method in the AmazedSaint.Elastic namespace, the XML is then stored in a dynamic variable which can be worked on. One important gotcha here is how to drill down into the object graph of the ElasticObject. I could not figure this out from the documentation, which only contains simpler examples, but found the answer on Stack Overflow: to drill further down into the object graph than its immediate child node, type the path to the element that contains the data using dot-separated syntax, and use the null index ([null]) to get to the child elements that contain the data to process - for example to output. When traversing the child elements in the foreach loop, each child element is stored in the loop variable node, and it is then possible to get to the attributes of the node. To get the value inside the element, use the tilde (~) operator. ElasticObject implements some operators to make it easier to work with XML and object graphs. For example, to cast an ElasticObject to XML, the > operator can be used:

            dynamic store = new ElasticObject("Store");
            store.Name = "Acme Store";
            store.Location.Address = "West Avenue, Heaven Street Road, LA";
            store.Products.Count = 2;
            store.Owner.FirstName = "Jack";
            store.Owner.SecondName = "Reacher";
            store.Owner <<= "this is some internal content for owner";

            var p1 = store.Products.Product();
            p1.Name = "Acme Floor Cleaner";
            p1.Price = 20;

            var p2 = store.Products.Product();
            p2.Name = "Acme Bun";
            p2.Price = 22; 

            XElement el = store > FormatType.Xml; 
            System.Console.WriteLine(el.ToString());

The ElasticObject is defined dynamically, and the <<= operator is used to set the value inside the property, which will be shown when converting the ElasticObject to XML. In addition, the code above shows how to create child elements, as the Products property contains two child Product objects, which will become Product XML elements. The > operator is used to "pipe" the object into an XElement variable, which is the ElasticObject object graph converted to XML. To go the other way, the ToElastic() extension method converts an XML document or XElement into an ElasticObject - the object graph again. Using ElasticObject, it is possible to work with XML as object graphs, and since it is dynamic, it is not necessary to create new types, such as serialization classes. The ElasticObject source code is available on GitHub here: https://github.com/amazedsaint/ElasticObject
Its creator is Anoop Madhusudanan (amazedsaint on GitHub), and it was originally a project hosted on CodePlex. ElasticObject should be considered as an alternative when working with XML and object graphs. It is very tempting to avoid having to create new types to work with the XML and instead work with a dynamic object that can be extended and changed. It should also be efficient. An alternative can be to use anonymous types, but why go the hard way when one can go the easy - elastic way?
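
For comparison, and as mentioned earlier in this post, plain LINQ to XML (System.Xml.Linq) can do the same traversal without ElasticObject. The sketch below reuses the data string from the program above; the element and attribute names are inferred from the yr.no document structure:

        private static void IterateForecastsWithLinqToXml(string data)
        {
            //Plain LINQ to XML version of IterateForecasts (element names assumed from the yr.no XML)
            var weatherdata = XElement.Parse(data);
            var location = weatherdata.Element("forecast").Element("text").Element("location");
            foreach (var node in location.Elements())
            {
                Console.WriteLine("Fra-Til: {0} - {1}", (string)node.Attribute("from"), (string)node.Attribute("to"));
                Console.WriteLine((string)node.Element("body"));
                Console.WriteLine();
            }
        }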

IoC container example

This article will present an IoC container that can resolve concrete types via interfaces or instances recursively. It should not be used in production code without improved loop detection and error handling, but can in some scenarios be used if one needs a very simple (and fast) IoC container.

The code is heavily based on Jon Skeet's walkthrough of an IoC container ("IoC container on the fly").

First the code of the IoC container itself:



using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

namespace TestComposedIoCContainer
{
    
    public class CompositionContainer
    {

        private readonly Dictionary<Type, Func<object>> providers = new Dictionary<Type, Func<object>>();

        private readonly object providersLocker = new object(); 


        public void Bind<TKey, TConcrete>() where TConcrete : TKey
        {
            lock (providersLocker)
            {
                providers[typeof(TKey)] = () => ResolveByType(typeof(TConcrete));
            }
        }

        public void Bind<T>(T instance)
        {
            lock (providersLocker)
            {
                providers[typeof(T)] = () => instance;
            }
        }

        private object ResolveByType(Type type)
        {
            var constructors = type.GetConstructors();
            if (constructors.Length > 0) //GetConstructors() returns an empty array, not null, when there are no public constructors
            {
                ConstructorInfo cInfo;
                cInfo = constructors.Count() == 1 ? constructors.Single() : 
                    constructors.Where(c => 
                    c.GetCustomAttributes(typeof(ImportConstructorAttribute), false).Length > 0).FirstOrDefault(); 
                if (cInfo == null)
                    throw new Exception(GetUsageMessage(type));
                var arguments = cInfo.GetParameters().Select(p => Resolve(p.ParameterType)).ToArray();
                return cInfo.Invoke(arguments);
            }
            else
            {
                var instanceField = type.GetField("Instance");
                if (instanceField != null)
                    return instanceField.GetValue(null);
                else
                    throw new Exception(GetUsageMessage(type));
            }
        }

        private static string GetUsageMessage(Type type)
        {
            return "Could not resolve a type implementing " + type.Name 
            + " - it must be registered through Bind to the composition container and either contain a single constructor or one constructor " 
            + "decorated with ImportContructor attribute or a field named Instance";
        }

        internal TKey Resolve<TKey>()
        {
            return (TKey)Resolve(typeof(TKey)); 
        }

        internal object Resolve(Type type)
        {
            Func<object> provider;
            if (providers.TryGetValue(type, out provider))
                return provider();
            else
                return ResolveByType(type);
        }

    }

}
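
Before walking through the details, here is a minimal usage sketch of the two Bind overloads, using the calculator types defined further below in this post; binding an instance means that exact instance is always handed out:

            var container = new CompositionContainer();
            container.Bind<ICalculator, Calculator>();   //resolved through its [ImportConstructor] constructor
            container.Bind<IMultiplier, Multiplier>();   //resolved through its single public constructor
            container.Bind<IAdder>(new Adder());         //this exact instance is always returned

            var calculator = container.Resolve<ICalculator>();
            int sum = calculator.Add(3, 5); //8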


The IoC container contains a dictionary with a Type as the key, which is usually either the interface or the concrete type registered through the Bind calls to the container, and a function expression as the value, which either returns an instance or resolves an instance through the container. When resolving a type, there can be multiple constructors in the class. I have extended Jon Skeet's code a bit: by decorating a constructor with the ImportConstructor attribute, it is possible to specify which constructor should be the inversion of control constructor. Pass in all dependencies that must be resolved in that constructor. If the imported type, or "part" to use a phrase from MEF, has no public constructors that should be used, a public static field called "Instance" can be used instead (a sketch follows after the attribute definition below). This is there to support singleton patterns or similar. I have chosen to add locking when performing binds, to make registration thread safe. Resolving instances is not made thread safe, as this could cause a lot of locking; usually a composition container is set up in a single thread by registering the bindings, and then multiple threads will possibly access the IoC container afterwards. It is possible to make the CompositionContainer itself a singleton, which is usually what one wants - I have posted a generic singleton implementation on my blog that can be used to support this. The import constructor attribute is very simple:

using System;

namespace TestComposedIoCContainer
{
    
    [AttributeUsage(AttributeTargets.Constructor)]
    public class ImportConstructorAttribute : Attribute
    {
    }
}
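
To illustrate the "Instance" field fallback mentioned above, here is a hypothetical singleton (the ILogger and ConsoleLogger names are made up for this sketch). Since the type has no public constructors, ResolveByType falls back to the public static Instance field:

    public interface ILogger
    {
        void Log(string message);
    }

    //Hypothetical singleton: no public constructors, so the container uses the Instance field
    public sealed class ConsoleLogger : ILogger
    {
        public static readonly ConsoleLogger Instance = new ConsoleLogger();

        private ConsoleLogger()
        {
        }

        public void Log(string message)
        {
            Console.WriteLine(message);
        }
    }

//Usage:

            var container = new CompositionContainer();
            container.Bind<ILogger, ConsoleLogger>();
            container.Resolve<ILogger>().Log("Resolved via the Instance field");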

A unit test to display the use of this composition container, followed by the class definitions:

using NUnit.Framework;
using System;

namespace TestComposedIoCContainer
{
   
    [TestFixture]
    public class Program
    {

        [Test]
        public void MainTest()
        {
            var container = new CompositionContainer();
            container.Bind<ICalculator, Calculator>();
            container.Bind<IAdder, Adder>();
            container.Bind<IMultiplier, Multiplier>(); 

            var calculator = container.Resolve<ICalculator>();
            Console.WriteLine("Calculator resolved!");
            int resultAdd = calculator.Add(3, 5);
            Console.WriteLine("3 + 5 = {0}", resultAdd);

            int resultMult = calculator.Multiply(4, 8);
            Console.WriteLine("4 * 8 = {0}", resultMult);
        }

        public static void Main(string[] args)
        {

        }

    }

}


//CLASS DEFINITIONS

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace TestComposedIoCContainer
{
    
    public class Calculator : ICalculator
    {

        private IAdder adder;

        private IMultiplier multiplier;

        public Calculator()
        {

        }

        [ImportConstructor]
        public Calculator(IAdder adder, IMultiplier multiplier)
        {
            this.adder = adder;
            this.multiplier = multiplier; 
        }

        public int Add(int x, int y)
        {
            return adder.Add(x, y); 
        }

        public int Multiply(int x, int y)
        {
            return multiplier.Multiply(x, y); 
        }

    }

}


namespace TestComposedIoCContainer
{
    
    public interface ICalculator
    {

        int Add(int x, int y);

        int Multiply(int x, int y); 

    }

}


namespace TestComposedIoCContainer
{
    
    public class Adder : IAdder
    {

        public int Add(int x, int y)
        {
            return x + y; 
        }
    }

}


namespace TestComposedIoCContainer
{
    
    public interface IAdder
    {

        int Add(int x, int y);

    }

}


namespace TestComposedIoCContainer
{
    
    public interface IMultiplier
    {

        int Multiply(int x, int y);

    }

}


namespace TestComposedIoCContainer
{
    
    public class Multiplier : IMultiplier
    {

        public Multiplier()
        {

        }

        public int Multiply(int x, int y)
        {
            return x * y;            
        }

    }

}

//RESULT OF RUNNING NUNIT UNIT TEST ABOVE: 

------ Test started: Assembly: TestComposedIoCContainer.exe ------

Calculator resolved!
3 + 5 = 8
4 * 8 = 32

1 passed, 0 failed, 0 skipped, took 0,46 seconds (NUnit 2.6.2).



The code above should only be used for simple IoC scenarios. The IoC container relies heavily on reflection, which is not the quickest way of resolving objects. The container can resolve arbitrarily complex object graphs when the bindings are registered correctly, but there is a weakness here: the container does not do loop detection very well. For example, if a "part" or "component" in the composition container imports another part, and that part in turn resolves the first part or has some other circular relation, infinite recursion will occur. At the same time, many IoC frameworks are poor at detecting such errors anyway and will also recurse infinitely here. If you have simple needs for IoC resolution, the code above can be used in simple scenarios, but most production code should instead choose among the many professional IoC frameworks out there.