Wednesday, 5 March 2014

Entity Framework Code First Generic Entity store operations

This article will discuss how generic entity store operations against a data context can be used with EF Code First. The code is tested with EF 6. First, define an interface with the commonly used generic entity store operations:

using System;
using System.Data.Entity;

namespace TestEFCodeFirst1
{

    public interface IRepo : IDisposable
    {

        T Insert<T>(T item, bool saveNow) where T : class;

        T Update<T>(T item, bool saveNow) where T : class;

        T Delete<T>(T item, bool saveNow) where T : class;

        int Save(); 

    }

}

Then implement this interface:

using System.Data.Entity;

namespace TestEFCodeFirst1
{
    
    public class Repo<TContext> : IRepo
        where TContext : DbContext, new()
    {

        public Repo()
        {
            Context = new TContext(); 
        }

        protected TContext Context { get; private set; }

        #region IRepo Members

        public T Insert<T>(T item, bool saveNow) where T : class
        {
            Context.Entry(item).State = EntityState.Added;
            if (saveNow)
                Context.SaveChanges();
            return item; 
        }

        public T Update<T>(T item, bool saveNow) where T : class
        {
            Context.Entry(item).State = EntityState.Modified;
            if (saveNow)
                Context.SaveChanges();
            return item; 
        }

        public T Delete<T>(T item, bool saveNow) where T : class
        {
            Context.Entry(item).State = EntityState.Deleted;
            if (saveNow)
                Context.SaveChanges();
            return item; 
        }

        public void Dispose()
        {
            Context.Dispose(); 
        }

        public int Save()
        {
            return Context.SaveChanges(); 
        }

        #endregion
    }

}

Next, create a simple DbContext to test out:

  
public class DemoDbContext : DbContext
{

    public DemoDbContext() : base("EFCodeFirstDemoDb1")
    {
    }

    public DbSet<Student> Students { get; set; }

}
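The Student entity referenced by this context is not shown in the original listing. A minimal sketch of it, where the Id key follows EF Code First convention and the remaining properties are assumed from the usage further down, could look like this:

public class Student
{
    // Id becomes the primary key by EF Code First convention
    public int Id { get; set; }

    public string Name { get; set; }

    public DateTime DateAdded { get; set; }

    public DateTime LastMod { get; set; }
}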


An example of using this db context in a derived repo class is shown next:

public class DemoRepo : Repo<DemoDbContext>
{

}

The next code shows how to use this repository:

using System;
using System.Data.Entity;

namespace TestEFCodeFirst1
{
    class Program
    {
        static void Main(string[] args)
        {

            Database.SetInitializer(new Init()); 

            using (var repo = new DemoRepo())
            {
                repo.Insert(new Student { Name = "Hardy", DateAdded = DateTime.Now,
                LastMod = DateTime.Now}, true);

                repo.Insert(new Student
                {
                    Name = "Joey",
                    DateAdded = DateTime.Now,
                    LastMod = DateTime.Now
                }, true);                

            }
        }

    }
}
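Only Insert is exercised above; Update, Delete and a deferred Save work the same way. A small sketch, where the Id values are hypothetical and assume the students already exist in the database:

using (var repo = new DemoRepo())
{
    // Mark a detached entity as Modified; the key must be set, and all properties are written back
    repo.Update(new Student { Id = 1, Name = "Hardy Jr.", DateAdded = DateTime.Now, LastMod = DateTime.Now }, false);

    // Mark another entity as Deleted, still without saving
    repo.Delete(new Student { Id = 2 }, false);

    // One SaveChanges call persists both operations
    repo.Save();
}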


To automatically patch your database when the model changes, enable automatic migrations:

In the Package Manager Console, type Enable-Migrations. In the generated configuration class, set AutomaticMigrationsEnabled to true.

namespace TestEFCodeFirst1.Migrations
{
    using System;
    using System.Data.Entity;
    using System.Data.Entity.Migrations;
    using System.Linq;

    internal sealed class Configuration : DbMigrationsConfiguration<TestEFCodeFirst1.DemoDbContext>
    {
        public Configuration()
        {
            AutomaticMigrationsEnabled = true;
        }

        protected override void Seed(TestEFCodeFirst1.DemoDbContext context)
        {
            //  This method will be called after migrating to the latest version.

            //  You can use the DbSet<T>.AddOrUpdate() helper extension method 
            //  to avoid creating duplicate seed data. E.g.
            //
            //    context.People.AddOrUpdate(
            //      p => p.FullName,
            //      new Person { FullName = "Andrew Peters" },
            //      new Person { FullName = "Brice Lambson" },
            //      new Person { FullName = "Rowan Miller" }
            //    );
            //
        }
    }
}


Add an Init class:

using TestEFCodeFirst1.Migrations;

namespace TestEFCodeFirst1
{

    internal class Init : System.Data.Entity.MigrateDatabaseToLatestVersion<DemoDbContext, Configuration>
    {

    }
}

This Init class is what is passed to Database.SetInitializer(new Init()) in the Main method further up; it tells EF to migrate the database to the latest version when the context is first used.

Tuesday, 18 February 2014

Performing slow work outside the WPF UI thread and updating the WPF GUI afterwards

Developers using WPF are very familiar with the problem of doing slow work on the UI thread and then trying to update the GUI afterwards. This is not only a problem in WPF, but also in Windows Forms and even web applications. If the UI thread is kept busy with a heavy, synchronous calculation, other work such as resizing the window and even repainting the UI will suffer, and the result is a non-responsive GUI that the user cannot work with. The solution is to do the work in another thread or threads and update the GUI afterwards. To make this easier, I list an extension method below. It is an extension method on the Dispatcher, which will usually be this.Dispatcher in code-behind or Dispatcher.CurrentDispatcher in MVVM scenarios. The caller supplies a function that does the thread work and returns a value of type T (this can be a single object or a list of objects, for example), and a GUI update action that receives the object of type T that the thread work produced. The GUI update action specifies what shall be done in the GUI, or in the ViewModel in MVVM scenarios, while the thread work function specifies how the object of type T is retrieved. It should be possible to use this extension method both in code-behind and MVVM scenarios.

using System;
using System.Threading;
using System.Windows.Threading;

public static class DispatcherUtil
{

    public static void AsyncWorkAndUIThreadUpdate<T>(this Dispatcher currentDispatcher, Func<T> threadWork, Action<T> guiUpdate)
    {
        ThreadPool.QueueUserWorkItem(delegate(object state)
        {
            // Run the slow work on a thread pool thread
            T resultAfterThreadWork = threadWork();

            // Marshal the result back to the UI thread and update the GUI there
            currentDispatcher.BeginInvoke(DispatcherPriority.Normal,
                new Action<T>(result => guiUpdate(result)), resultAfterThreadWork);
        });
    }

}

Next, an example of using this:

  private void Button_Click(object sender, RoutedEventArgs e)
        {

            this.Dispatcher.AsyncWorkAndUIThreadUpdate(() => FooService.GetSomeFooData(),
                s => btnFoo.Content = s); 

        }
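From a view model, the same call can be made via Dispatcher.CurrentDispatcher, as mentioned above. A small sketch, assuming the method below is invoked on the UI thread (for example from a command) and that FooText is a hypothetical property that raises PropertyChanged:

public void LoadFooData()
{
    // Dispatcher.CurrentDispatcher is the UI dispatcher when this runs on the UI thread
    Dispatcher.CurrentDispatcher.AsyncWorkAndUIThreadUpdate(
        () => FooService.GetSomeFooData(),  // slow work on a thread pool thread
        s => FooText = s);                  // back on the UI thread; FooText raises PropertyChanged
}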

The FooService is very simple and simulates a resource intensive calculation that takes about 5 seconds:

public static class FooService
    {

        public static string GetSomeFooData()
        {
            Thread.Sleep(5000);
            return "Hello world! "  + DateTime.Now.ToString(); 
        }

    }

I use ThreadPool.QueueUserWorkItem here. Of course this is a very simple example, and in complex applications one needs to watch out for state problems in the UI. When it is not certain when the calculation finishes, you should perhaps consider avoiding a refresh of the UI if the UI has changed in the meantime, especially in more complex MVVM component-based WPF user interfaces, based on for example Prism. At the same time, locking up the UI is not good either. I have only tested the code above in a simple scenario, but expect it to work also with more complex WPF UIs. There are two golden rules when it comes to threading in WPF:
1. Never access UI elements from another thread than the UI thread.
2. Do not do slow work on the UI thread, but in another thread.
Most of the difficulty in doing off-thread async work in WPF comes down to updating the UI after the calculations are done in the other thread(s). With this extension method, hopefully WPF developers will find that easier.

Traversing paged results using Massive data access wrapper

Massive, a library created by the brilliant developer Rob Conery (Jon Skeet's frequent humorous side-kick), has a lot of nice features. Its best features are ease of use and dynamic appeal. It is also fast and lightweight. After testing Massive, I wanted to create an extension method to traverse paged results returned from Massive. The code below is tested against the AdventureWorks2012 database. First off, install Massive. I have created a simple Console Project to test this out, and then use the Library Package Manager to install Massive:

Install-Package Massive -Version 1.1.0 

Remember to add the -Version switch above. Then install the AdventureWorks 2012 database from here: http://msftdbprodsamples.codeplex.com/releases/view/55330 After installing it on your developer PC, add a connection string against the AdventureWorks 2012 database; use the Data Explorer or Server Explorer in Visual Studio. Rename the connection string in your app.config or web.config to "adventureworks2012". Define a model entity for the Production.Product table next:

    public class Products : DynamicModel
    {
        public Products()
            : base("adventureworks2012", "Production.Product", "productid")
        {

        }
    }

To add a table entity in Massive, specify the connection string name, then the name of the table and then the primary key column. If you omit the name of the connection string, the first connection string found is used. Next, the code for an extension method that works against this class and any other class inheriting from DynamicModel:

     public static class MassiveExtensions
    {

        public static void MassivePagedAction(this DynamicModel massiveObject, int pageSize = 20, int startPageIndex = 1, 
            int endPageIndex = -1, Action<int, dynamic> pageAction = null, Action<dynamic> itemAction = null)
        {

            var pages = new Dictionary<int, dynamic>(); 
            var firstPage = massiveObject.Paged(currentPage: startPageIndex, pageSize: pageSize);
            pages.Add(startPageIndex, firstPage);
            int endIndex = endPageIndex == -1 ? firstPage.TotalPages : Math.Min(firstPage.TotalPages, endPageIndex); 
            
            for (int currentPageIndex = startPageIndex + 1; currentPageIndex <= endIndex; currentPageIndex++)
            {
                var currentPage = massiveObject.Paged(currentPage: currentPageIndex, pageSize: pageSize);
                pages.Add(currentPageIndex, currentPage); 
            }

            foreach (var keyValuePair in pages)
            {
                if (pageAction != null)
                    pageAction(keyValuePair.Key, keyValuePair.Value);
                foreach (var item in keyValuePair.Value.Items)
                {
                    if (itemAction != null)
                        itemAction(item); 
                }
            }

        }

    }


The extension method puts the paged results in a dictionary, where the key is the page index and the value is the dynamic object that Massive returns, containing the paged data. There are numerous optional parameters to this extension method. The page size defaults to 20 if not specified. The start page index is 1, which is also the default value. Rob Conery should perhaps have chosen the page index to be 0-based, as this is the standard in C#, but querying for page index 0 will give empty results. The end page index defaults to -1, which means that all pages will be returned until there are no more pages. If you set the endPageIndex to a value less than firstPage.TotalPages inside the extension method, i.e. the actual total number of pages in the database, only pages up to the endPageIndex are returned. To get a single page, pass in the same value for the start and end page index. It is also possible to pass in page actions and item actions; they default to null, but obviously at least an item action should be set. A variant of the method above that allows a Func to be passed in, for example to return results to the caller, would also be interesting. Using the extension method above is shown next:

//GET PAGES RESULTS OF THE PRODUCTS TABLE USING A PAGE SIZE OF FIVE. GET ENTIRE TABLE.

            DynamicModel table = new Products();
          
            table.MassivePagedAction(pageSize:5,pageAction: (indx, page) => 
             Console.WriteLine("\nProducts in page # {0}:\n", indx), 
             itemAction: item => Console.WriteLine(item.Name)); 


//SAMPLE OUTPUT:


------ Test started: Assembly: TestMassive.exe ------


Products in page # 1:

Adjustable Race
Bearing Ball
BB Ball Bearing
Headset Ball Bearings
Blade

Products in page # 2:

LL Crankarm
ML Crankarm
HL Crankarm
Chainring Bolts
Chainring Nut

Products in page # 3:

..

//This resulted in 101 pages - There are 504 products in the AdventureWorks 2012 database (primarily mountain bikes and clothes+equipment)

Products in page # 101:

HL Bottom Bracket
Road-750 Black, 44
Road-750 Black, 48
Road-750 Black, 52

//Final page contains only four items as expected

To get a paged result and work with this paged result, another extension method can be used:

  public static Dictionary<int,List<ExpandoObject>> MassivePagedRetrieval(this DynamicModel massiveObject, int pageSize = 20, int startPageIndex = 1,
           int endPageIndex = -1)
        {

            var pages = new Dictionary<int, dynamic>();
            var pagedResult = new Dictionary<int, List<ExpandoObject>>(); 
            var firstPage = massiveObject.Paged(currentPage: startPageIndex, pageSize: pageSize);
            pages.Add(startPageIndex, firstPage);
            int endIndex = endPageIndex == -1 ? firstPage.TotalPages : Math.Min(firstPage.TotalPages, endPageIndex);

            for (int currentPageIndex = startPageIndex + 1; currentPageIndex <= endIndex; currentPageIndex++)
            {
                var currentPage = massiveObject.Paged(currentPage: currentPageIndex, pageSize: pageSize);
                pages.Add(currentPageIndex, currentPage);
            }

            foreach (var keyValuePair in pages)
            {
                List<ExpandoObject> items = new List<ExpandoObject>(); 
                foreach (var item in keyValuePair.Value.Items)
                {
                    items.Add(item); 
                }
                pagedResult[keyValuePair.Key] = items;
            }

            return pagedResult;
        }

To use this extension method, use:

     Dictionary<int, List<ExpandoObject>> pagedResult = table.MassivePagedRetrieval(pageSize: 10); 
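The returned dictionary can then be traversed like any other dictionary; a minimal sketch, where each ExpandoObject is viewed through dynamic to read the Name column of the Products table:

foreach (var page in pagedResult)
{
    Console.WriteLine("\nProducts in page # {0}:\n", page.Key);
    foreach (dynamic product in page.Value)   // ExpandoObject accessed via dynamic
        Console.WriteLine(product.Name);
}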

The key is the page index again, and a list of ExpandoObject objects is returned for each page (or key) in the dictionary. ExpandoObject is a dynamic object. To inspect the resulting data in Visual Studio, use the debugger and choose Dynamic View when looking at the result at breakpoints in your code. This article has focused on paged data, as Massive can often be used to perform paged data retrieval in for example ASP.NET MVC-based solutions, but other solutions can also use this data access wrapper. The requirement is .NET 4.0, as the System.Dynamic namespace is used. Massive supports most of the features of more complex Object Relational Mappers, such as Entity Framework. Its better speed and ease of use should be tempting. Rob Conery has additional information here: Massive GitHub page There is a trade-off with Massive too; its dynamic nature is also its Achilles heel. If you rename fields in the database, you most likely must also update your code, and since it is dynamic, chances are that errors will only occur and be detected at runtime, out in production. This is easier to avoid with strongly typed ORMs such as Entity Framework. Sadly, Entity Framework and many other ORMs are also slow. The following code shows how it is possible to retrieve a result from two tables. As you can see, one has to pass in SQL to output the data.

var productsWithCategories = 
table.Query("SELECT p.Name, pc.Name As CategoryName FROM Production.Product p INNER JOIN Production.ProductCategory pc ON p.ProductSubCategoryId = pc.ProductCategoryId");

foreach (var ppc in productsWithCategories)
{
 Console.WriteLine("Product Name: {0}, Product category name: {1}", ppc.Name, ppc.CategoryName);
}

//RESULTING OUTPUT: 

Product Name: Road-150 Red, 62, Product category name: Components
Product Name: Road-150 Red, 44, Product category name: Components
Product Name: Road-150 Red, 48, Product category name: Components
Product Name: Road-150 Red, 52, Product category name: Components

..

This results in a quick way to access the database, but you also lose IntelliSense (auto completion), strong static typing and compile-time type checking. When returning results from the database in a service, it is possible to return the data as ExpandoObjects, as shown in the extension method MassivePagedRetrieval above. If one uses ASP.NET MVC, the Model can bind to this data. This results in fewer application layers and less coding, but at the same time a risk of those pesky runtime errors out in production. Massive is very quick, and for web developers this looks very promising. Using ElasticObject or similar techniques, the ExpandoObjects can be converted to either XML or, better, JSON, and processed using jQuery and JavaScript. Whether Massive catches on, or some other lightweight ORM is preferred by developers, remains to be seen. This is a moving field. What is sure is that the days of heavyweight ORMs like Entity Framework will not be bright if their performance does not catch up with these lighter ORM frameworks. At the same time, it is hard to say that this will actually result in a shorter time to market (TTM) for developers, as they lose IntelliSense and other good features that Entity Framework supports. Test out Massive and see if it matches your needs. Use the extension methods above to get started with paged result retrieval, if they look interesting. Massive of course also supports inserts, updates and deletes. In addition, validations and callbacks can be put in the class that inherits from DynamicModel. Massive also has specialized functions such as the standard aggregation methods, and convenience methods such as Find that are defined in DynamicModel. Lastly, Massive is heavily based on the System.Dynamic namespace in .NET 4.0 and newer framework versions; DynamicModel itself inherits from System.Dynamic.DynamicObject. Bulk updates are also supported in Massive.
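As a small illustration of the JSON route mentioned above, the paged result from MassivePagedRetrieval can be serialized directly; a minimal sketch, assuming the Newtonsoft Json.NET package is installed (Install-Package Newtonsoft.Json) and that a using Newtonsoft.Json; directive is present:

// ExpandoObject implements IDictionary<string, object>, so Json.NET serializes each item as a JSON object
Dictionary<int, List<ExpandoObject>> pagedResult = table.MassivePagedRetrieval(pageSize: 10);
string json = JsonConvert.SerializeObject(pagedResult, Formatting.Indented);
Console.WriteLine(json);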