Thursday, 17 January 2013

Using stored procedures with strong return types in Entity Framework 5.0.0

This article explains how to import stored procedures into a Visual Studio MVC 4 application and use them with strongly typed parameters and return values. An ADO.NET Entity Data Model will be used to represent the database.

MVC 4 uses Entity Framework to map the database to the entity model. The database mapping is done automatically by adding a new ADO.NET Entity Data Model component to the project. The following picture shows the scaffolded files holding the POCO classes and the context: "TestModel.Context.cs" holds the database context derived from the DbContext class, and "TestModel.tt" holds the model files containing the POCO classes.


Image 1: Entity data model used for testing
A stored procedure does not have to be present when the ADO.NET model is added to the project. It can be created and imported later in the development process. In that case the database model has to be refreshed before the stored procedure becomes available for import.

To refresh the model, go to View -> Other Windows -> Entity Data Model Browser, find the desired model in the Model Browser, right-click it and select "Update Model from Database". There you can edit which parts of the database the model encompasses. This can be used to add newly created stored procedures to the model.

Image 2: Refreshing the model to include the newly created stored procedure
After this is done, the stored procedure can be imported. Right-click on the model diagram and select Add New -> Function Import. The "Add Function Import" window appears. Enter the desired function name, select the stored procedure you want to add and choose the return type. The stored procedure can be set to return any scalar or complex type, including entity data (for instance, a filtered subset of entities). If the return type is set incorrectly, the data retrieval will raise an invalid cast exception.

Image 4: Stored procedure import window
The stored procedure can now be called using the database context object, and the result can be queried using LINQ or some other querying method.

Entities _context = new Entities();
var data = _context.MyStoredProcedure("Author name");
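
The returned ObjectResult<Author> can be enumerated like any other sequence, so LINQ to Objects can be applied to it directly. A minimal, hypothetical example (it assumes the Author entity exposes an AuthorLastName property and that System.Linq is in scope):

//
// Filter and sort the stored procedure result in memory;
// AuthorLastName is an assumed property of the Author entity.
var filtered = data.Where(a => a.AuthorLastName.StartsWith("K"))
                   .OrderBy(a => a.AuthorLastName)
                   .ToList();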


Scaffolded stored procedure call:

public virtual ObjectResult<Author> MyStoredProcedure(string authorName)
{
    var authorNameParameter = authorName != null ?
        new ObjectParameter("AuthorName", authorName) :
        new ObjectParameter("AuthorName", typeof(string));

    return ((IObjectContextAdapter)this)
        .ObjectContext
        .ExecuteFunction<Author>("MyStoredProcedure", authorNameParameter);
}


Sources and further reading:
  1. Reflecting changes made to stored procedure in Entity Framework-generated complex type
  2. ADO.Net Entity Framework: How to Retrieve data using Stored procedure step by step

Thursday, 13 December 2012

ASP.NET MVC 4 - Repository dependency injection with Ninject




Primary Goals

The first question that comes to mind is why bother with so many layers of code. Here are the major reasons I can think of:

Maintainability and extensibility through separation of concerns

The Repository pattern enables us to separate the implementation of data access from the data that the repositories provide.
I really like this definition:



Repositories are the single point where we hand off and fetch objects. It is also the boundary where communication with the storage starts and ends.
from Patterns of Enterprise Application Architecture, page 332.

MVC Architectural Pattern - Model/View/Controller

Code quality through Testability

The Mock Repository pattern can be used for creating mock/fake repositories. Repository mocks are useful in various ways: they enable friendly Test Driven Development, allow testing only a part of the solution, provide a DB-disconnected mode in case we do not have a DB, and make it possible to work on Views and Controllers without a data store.

Independent Unit testing for Controllers and Repositories.

Productivity through Loose coupling

Being able to work independently in a friendly TDD environment can boost the development speed significantly. Loose coupling can be obtained with IoC and DI. We can inject dependencies manually or use a Dependency Injection container such as Ninject.
There is a long list of dependency injection containers; other well-known ones are Unity, StructureMap, Castle Windsor and Autofac.


Solution structure

The solution is structured into five projects: one ASP.NET MVC 4 project, one MS unit test project and three class libraries. Repository contains all the "real" implementations that manipulate and expose data. MoqRepository contains fake methods together with fake/mock data. Model contains the model classes. The MVC project contains the Controllers and Views. A rough sketch of the repository interface and a fake implementation follows the project list below.

ASP.Net MVC Project
  • Controller - just routes the calls and has no business logic inside
  • View - Generated with MVCScaffolding
Repository Project
  • Repository classes - use the DB and expose data. The EF Code First strategy has been used to generate the data access layer.
  • Repository interfaces - define the methods to be called from the Controller 
Mock Repository Project
  • Mock repository classes - fake repository implementation. Exposes data hard-coded inside the class.
Model project
  • Model classes - used to define the DB tables and strongly typed views.
Unit test project
  • Test repositories - unit tests for "real" repositories
  • Test controllers - unit tests for Controllers.
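
As a rough sketch, the repository interface and the fake repository might look like the following. IRepositoryMethods<T> is the interface bound in the Ninject configuration shown later; its exact members, and the Person properties used here, are illustrative assumptions rather than the code from the GitHub example (System.Collections.Generic and System.Linq are assumed to be in scope).

public interface IRepositoryMethods<T> where T : class
{
    IQueryable<T> GetAll();
    T Find(int id);
    void InsertOrUpdate(T entity);
    void Delete(int id);
    void Save();
}

// Fake repository exposing data hard-coded inside the class, which allows
// working on Views, Controllers and tests without a real data store.
// Person is assumed to have Id and Name properties.
public class MoqPersonRepository : IRepositoryMethods<Person>
{
    private readonly List<Person> _people = new List<Person>
    {
        new Person { Id = 1, Name = "Ada" },
        new Person { Id = 2, Name = "Charles" }
    };

    public IQueryable<Person> GetAll() { return _people.AsQueryable(); }
    public Person Find(int id) { return _people.FirstOrDefault(p => p.Id == id); }
    public void InsertOrUpdate(Person entity) { /* nothing to persist in the fake */ }
    public void Delete(int id) { _people.RemoveAll(p => p.Id == id); }
    public void Save() { /* nothing to save */ }
}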

Ninject usage

I found that the easiest way to get Ninject working is through NuGet. You can install it manually, but it will take more steps to finish. There is no version for MVC 4 yet, but the MVC 3 version works smoothly.

Open the solution

Open the Package Manager Console

  • Leave Package source set to the official NuGet package source.
  • For Default project select the ASP.NET MVC project.
  • Type in:

Install-Package Ninject.MVC3
  • It will install all the dependencies and add the needed assembly references. The result should look like this (ignore the version numbers):

Successfully installed 'Ninject 3.0.1.10'.
Successfully installed 'WebActivator 1.5.2'.
Successfully installed 'Ninject.MVC3 3.0.0.6'.
Successfully added 'Ninject 3.0.1.10' to XXX.
Successfully added 'WebActivator 1.5.2' to XXX.
Successfully added 'Ninject.Web.Common 3.0.0.7' to XXX.
Successfully added 'Ninject.MVC3 3.0.0.6' to XXX.

In the App_Start folder of the ASP.NET MVC project you will find a new file, NinjectWebCommon.cs. All you have to do in this class is register the services that are going to be used in the Controllers:


/// <summary>
/// Load your modules or register your services here!
/// Uncomment the first Bind call to use the "real" repository implementation
/// </summary>
/// <param name="kernel">The kernel.</param>
private static void RegisterServices(IKernel kernel)
{
    kernel.Load(Assembly.GetExecutingAssembly());
    //kernel.Bind<IRepositoryMethods<Person>>().To<PersonRepository>();
    kernel.Bind<IRepositoryMethods<Person>>().To<MoqPersonRepository>();
}

This loads the Ninject bindings; the bound repository can then be injected into a Controller, as in the following example:

private readonly IRepositoryMethods<Person> _peopleRepository;

public PeopleController(IRepositoryMethods<Person> peopleRepository)
{
    _peopleRepository = peopleRepository;
}
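
A controller action can then simply delegate to the injected repository, for example (GetAll() is the assumed repository member from the sketch earlier in this post):

public ActionResult Index()
{
    // The controller only routes the call; the repository does the data work.
    return View(_peopleRepository.GetAll().ToList());
}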

GitHub Example:

ASP.Net MVC4 with Ninject

Must read:

IoC, DI, DI versions
Martin Fowler Articles - Injection

Repository pattern
Patrik LÖWENDAHL Blog
Martin Fowler on Repository pattern

Repository mocking
Mock a DB repository using Moq

ASP.Net MVC
Official ASP.Net MVC pages
Scott Hanselman Blog

Ninject
GitHub open source project
Ninject WIKI ToC
Ninject.MVC3 Documentation
Extensions


Tuesday, 11 December 2012

Entity Framework Migrations Tool Basics

Introduction

Entity Framework Migrations is used for changing the database design with the code-first approach. The code-first approach enables a developer to first design the model by writing the POCO classes and then to create the database from the model. The problem comes with the desire to upgrade the model: when the model is upgraded, the database has to be altered as well to reflect the changed model.

The Migrations tool deals with this problem by keeping track of all the changes to the model. The initial model image is created upon the developer's request. After modifying the model, the developer can request the creation of a differential migration image. The database can then be automatically altered to conform to any of the stored migration images.

Migrations is only used to automatically alter the database. The model has to be altered by hand or by using some other method or tool.

The problem definition

The developer has created the model and the database using the code-first approach. During the development phase the need arose to alter the model. The developer then altered the model and started the application. The application failed with the following error:

The model backing the 'CustomDbContext' context has changed since the database was created. Consider using Code First Migrations to update the database (http://go.microsoft.com/fwlink/?LinkId=238269).

Enabling Migrations and creating initial model image

It is assumed the solution has an implemented class deriving from DbContext and implemented POCO classes describing a desired model. The first step is to set up the Migrations tool in the Visual Studio:
  1. In Solution Explorer, find the project containing the class deriving from DbContext and set it as the Default project in the Package Manager Console
  2. Start the Package Manager Console and enter "Enable-Migrations" in the prompt:

PM> Enable-Migrations

This creates the Migrations folder in the selected project. This folder contains the following files:

  • The initial model image in the "InitialCreate" file
  • The Configuration file used to define the desired context behaviour and seeding (a sketch of this class follows the list)
  • All the differential images, which contain the information how to migrate from the previous migration to the current one and back
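
For reference, the generated Configuration class typically looks roughly like the sketch below. CustomDbContext is the context name taken from the error message above; the exact generated content depends on the Entity Framework version.

using System.Data.Entity.Migrations;

internal sealed class Configuration : DbMigrationsConfiguration<CustomDbContext>
{
    public Configuration()
    {
        // Only explicit Add-Migration / Update-Database calls alter the database.
        AutomaticMigrationsEnabled = false;
    }

    protected override void Seed(CustomDbContext context)
    {
        // Seed data can be added here; it runs after every Update-Database call.
    }
}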

Adding Migrations

When the initial model image is created, the developer is free to change the model. When the model is changed to the desired specification, it is necessary to add another migration. This is done by calling the "Add-Migration" command in the Package Manager Console:

PM> Add-Migration MigrationName

The result is a differential migration named "MigrationName" added to the Migrations directory. It contains two methods, Up() and Down(), which instruct the Migrations tool how to change the database to the current migration and how to change it back to the previous one. This enables the developer to freely jump between migrations and use the desired one.
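
A hedged sketch of what such a generated migration class might look like, assuming the model change was adding a nullable MiddleName column to a People table (the table and column names are purely illustrative):

using System.Data.Entity.Migrations;

public partial class MigrationName : DbMigration
{
    public override void Up()
    {
        // Moves the database forward to this migration.
        AddColumn("dbo.People", "MiddleName", c => c.String());
    }

    public override void Down()
    {
        // Reverts the database to the previous migration.
        DropColumn("dbo.People", "MiddleName");
    }
}
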
Reflecting the model to the database

When the new migration is created, the database can be automatically modified. This is done using the "Update-Database" command in the Package Manager Console:

PM> Update-Database


After a successful call the database is updated to the latest migration.

Migrating to the specific version

The database can be updated to any available migration, including downgrading to some previous migration. This can be accomplished by specifying the "TargetMigration" parameter:

PM> Update-Database -TargetMigration:"MigrationName"


To revert all the changes and go back to the initial database design, use the following command:

PM> Update-Database -TargetMigration:$InitialDatabase


It is important to note that after the database migration the model has to be altered by hand to conform to the database design; the database design and the model design must be the same.

Getting a SQL script

The Migrations folder can be shared between developers using TFS, and they can easily modify their databases using the previously described techniques. The problem arises when the deployment database needs to be altered. In that case a SQL script can be generated and run on the deployment database to alter its design.

PM> Update-Database -Script -SourceMigration:$InitialDatabase -TargetMigration:"MigrationName"

If the TargetMigration is not specified, the latest migration is used as the target. The result is a SQL script which can be stored, moved and executed on a target database.

Conclusion

The Entity Framework Migrations tool can be used to automatically alter the database to conform to model changes. It is easy to install and use. When the model is changed by hand or with a tool, the database can be updated with only two calls to the tool.

Sources and additional reading

  1. EF 4.3 Code-Based Migrations Walkthrough
  2. Entity Framework Code First Migrations

Entity Framework Code First data loading strategies

Introduction

This post describes three different loading strategies available when using the Entity Framework Code First approach: lazy loading, eager loading and explicit loading. The loading types will be described using the following simple model. The Author and Book classes are related with a one-to-many relation (an author can write many books; a book has one author). The same applies to the relation between Publisher and Book.

public class Book
{
    [Key]
    public long BookId { get; set; }
    public long AuthorId { get; set; }
    public long PublisherId { get; set; }
    public string Title { get; set; }
    public DateTime DatePublished { get; set; }

    public virtual Author Author { get; set; }
    public virtual Publisher Publisher { get; set; }
}

public class Author
{
    [Key]
    public long AuthorId { get; set; }
    public string AuthorFirstName { get; set; }
    public string AuthorLastName { get; set; }
    public DateTime AuthorDoB { get; set; }

    public virtual ICollection<Book> Books { get; set; }
}

public class Publisher
{
    [Key]
    public long PublisherId { get; set; }
    public string PublisherName { get; set; }

    public virtual ICollection<Book> Books { get; set; }
}

Database creation error inside the transaction scope

One of the benefits of the Code First approach is automatic database creation if the database does not exist. When the derived DbContext class accesses the database, it first checks whether the database exists. If it does not, the DbContext will try to create it using the model information. The problem appears if this happens inside a transaction scope: when the DbContext tries to create the database inside the transaction scope, the following error occurs.

CREATE DATABASE statement not allowed within multi-statement transaction.

One of the solutions to this problem is to call the database creation routine before entering the transaction:

DataContext.Database.CreateIfNotExists();
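
Put together, a minimal sketch of the workaround looks like this (BookStoreContext and its Authors set are the ones used in the examples below; System.Transactions, System.Linq and System.Data.Entity are assumed to be referenced):

//
// Make sure the database exists before any transaction is opened
using (var context = new BookStoreContext())
{
    context.Database.CreateIfNotExists();
}

using (var ts = new TransactionScope())
using (var context = new BookStoreContext())
{
    // Regular work with the context; no CREATE DATABASE is attempted in here
    Author firstAuthor = context.Authors.FirstOrDefault();
    ts.Complete();
}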

Lazy Loading

Lazy loading enables the application to load data from the database when the need arises. The benefit of this strategy is its simple usage. The drawback is a new call to the database for each missing object, which uses up network bandwidth and slows down the execution (each call to the database takes time).

This approach works by using dynamic proxy classes. When a model object is requested from the database, a dynamic proxy instance is returned instead. It can then access the database to load additional objects on demand.

Lazy loading is enabled by default. It can be turned off using the LazyLoadingEnabled property:

public BookStoreContext()
    : base()
{
    this.Configuration.LazyLoadingEnabled = false;
}


It should be noted that turning off lazy loading does not mean eager loading is used by default (no related data will be implicitly loaded); it only means that no lazy loading will be performed. In that case the developer is left with the options of eager loading or explicit loading.

Lazy loading can also be turned off for a particular relation or property by removing the "virtual" keyword. Example:

//
// The collection will get populated with Book objects when the author's books
// get requested
public virtual ICollection<Book> Books { get; set; }

//
// The collection will stay empty when the author's books get requested
public ICollection<Book> Books { get; set; }


The next example shows how to use lazy loading.

protected void Page_Load(object sender, EventArgs e)
{
    using (TransactionScope ts = new TransactionScope())
    {
        using (BookStoreContext c = new BookStoreContext())
        {
            Author a = c.Authors.FirstOrDefault<Author>();

            // The book list will be filled with the author's books,
            // although no explicit call was made to fetch them
            List<Book> b = a.Books.ToList<Book>();
        }
        ts.Complete();
    }
}

Eager loading

Eager loading is used to retrieve the desired data together with some or all of its related data. The developer can use the Include method to include the related data and LINQ to define the wanted records. The benefit of this approach is that all the needed data is fetched by a single call to the server, but the developer has to explicitly select the related data the database should return.

The Include call uses the names of the navigation properties that define the relations. In our case, some examples are "Books" for the collection of books defined in Author and Publisher, and "Author" for the relation to a book's author.

First example - No relational data is loaded

The example retrieves the author data, but no data regarding the author's books is returned; with lazy loading turned off, the list stays empty.

Author a = c.Authors.FirstOrDefault<Author>();
List<Book> b = a.Books.ToList<Book>();

Second example - Loading the relation to books

The second example uses the eager loading Include call to include the data from the related table; the list is filled with the author's books. The first form uses the relation name as a string, the second an equivalent lambda expression.

c.Authors.Include("Books").FirstOrDefault<Author>();
List<Book> b = a.Books.ToList<Book>();

c.Authors.Include(l => l.Books).FirstOrDefault<Author>();
List<Book> b = a.Books.ToList<Book>();


Two namespaces are needed for the approach using lambdas to work:

using System.Data.Entity;
using System.Linq;


The Include call can only include data from related tables; otherwise an exception is thrown.

Third example - Loading the second-level relation

The third example shows how to eagerly retrieve data across several levels of relations. As we already know, the Authors table is related to the Books table, and the Books table is related to the Publishers table.

c.Authors.Include("Books.Publisher").FirstOrDefault<Author>();
List<Book> b = a.Books.ToList<Book>();


This code loads the first author including all his books and the book's publisher for each one of those books.

Fourth example - Multiple relations need multiple defined paths

Each Include defines a path to one related table. The following example shows how to define two paths for two tables related to Books.

c.Books.Include("Author").Include("Publisher").FirstOrDefault<Book>();

This approach can be automated in a repository pattern to include the desired table relations for a given table. The following function shows how this might be implemented.

class BookRepository : IBookRepository
{
    //
    // ...
    //
   
    public IQueryable<Book> AllIncluding(params Expression<Func<Book, object>>[] includeProperties)
    {
        IQueryable<Book> query = DataContext.Books;
        foreach (var includeProperty in includeProperties)
        {
            query = query.Include(includeProperty);
        }
        return query;
    }

    //
    // ...
    //   
}


And the call to AllIncluding:

BookRepository bookRepository = new BookRepository();
var books = bookRepository.AllIncluding(book => book.Author, book => book.Publisher);

Explicit loading

Explicit loading loads only the data from a single dataset. In that way it is similar to lazy loading, but its explicit nature helps the developer control the number of calls to the database. Each call to the Load method loads the data from a single dataset (excluding its related data), which means each Load call makes one call to the database.

//
// Explicitly load the whole Books DbSet
c.Books.Load();

//
// or
//

//
// Get the first book using the Linq-to-SQL
Book book = c.Books.FirstOrDefault();

//
// Explicitly load the book's references using Entity Framework
// explicit load
c.Entry(book).Reference("Author").Load();
c.Entry(book).Reference("Publisher").Load();

Conclusion

Many developers turn off lazy loading just to prevent an enormous number of implicit calls to the database. Although Entity Framework uses caching to temporarily store the retrieved results, it is considered much better practice to explicitly control and optimize database access using the eager and explicit loading approaches.

Sources and further reading

  1. Using DbContext in EF 4.1 Part 6: Loading Related Entities
  2. [Entity Framework] Using Include with lambda expressions

Monday, 10 December 2012