Ang3lFir3 – Life as a Code Poet

January 26, 2011

Introducing Chewie for nuget

Filed under: .NET, C#, Microsoft, technology, Uncategorized, web development — ang3lfir3 @ 4:45 pm

So anyone following my friends and co-workers Jeff Schumacher (@codereflection), Adron Hall (@adronbh) and Bobby Johnson (@NotMyself) is aware that today we [ok really just them, I was busy on a different, less awesome project] started giving nuget a serious trial run to manage our extensive list of dependencies, a lot of which are not available on nuget at the moment.

This prompted a few blog posts from the guys here and here.

We kinda came up with the idea that we really needed something along the lines of bundler for ruby gems. So I started with the simplest version possible and Chewie was born. (All credit for the name goes to @NotMyself.) At the moment it's a basic file and a single line of powershell goodness.

check it out and contribute @ github


January 25, 2011

ExpandoObject and Views

This sample was spawned by a comment made by my friend and co-worker Bobby Johnson (@NotMyself). I can't remember exactly what he said but it had something to do with dynamic and views. (Maybe he will tell us in the comments 🙂 )

Let me first say that many of the projects at work are built on MonoRail, which is often considered to be the predecessor to ASP.Net MVC, and as such we don't get a huge amount of exposure to the new toys in ASP.Net MVC immediately.

With that out of the way, I was getting prepared to write this post and I just happened to notice that ASP.Net MVC views in .Net 4.0 projects seem to all be of type System.Web.Mvc.ViewPage<dynamic>, which got me super excited! This makes things even easier! (more…)
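For a taste of what that enables, here is a minimal sketch of a controller shaping a view model on the fly with ExpandoObject and handing it to a dynamic view. The controller and property names (WidgetsController, Name, Price) are invented for illustration, not from the post:

```csharp
using System.Dynamic;
using System.Web.Mvc;

public class WidgetsController : Controller
{
    public ActionResult Details()
    {
        // ExpandoObject members spring into existence on assignment
        dynamic model = new ExpandoObject();
        model.Name = "Widget";
        model.Price = 9.99m;

        // a ViewPage<dynamic> view can then read Model.Name and Model.Price
        // without a strongly typed view model class ever being declared
        return View(model);
    }
}
```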

January 24, 2011

Refactoring MVP to MVC the slow way : pt1 Extracting Services

Filed under: .NET, ASP.Net MVC, C#, Patterns, technology, Uncategorized, web development — ang3lfir3 @ 9:58 pm

In this series I would like to examine the process my team and I have been undertaking on our current project. One of the aspects of working on legacy applications is that when you dive in you often see patterns that are not easily testable. The current application my team is working on was built many years ago and implements MVP (model view presenter), which at the time was a useful pattern for developing testable ASP.Net webforms applications. In the years since then many new patterns and frameworks have emerged, and it is our desire to move the application from MVP to MVC (model view controller).

This series assumes that you are familiar with a few patterns and concepts namely the following:

  • Dependency Injection
  • Inversion of Control
  • SOLID principles
  • MVP
  • MVC
  • TDD
  • Mocks/Stubs
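As a preview of where the series is headed, extracting a service out of a presenter typically looks something like the sketch below. All of the names here (IOrderService, OrderService, OrderPresenter) are illustrative only, not taken from our actual codebase:

```csharp
// the extracted service interface that the presenter (and, later, the
// MVC controller) depends on; registered with an IoC container
public interface IOrderService
{
    decimal CalculateTotal(int orderId);
}

public class OrderService : IOrderService
{
    public decimal CalculateTotal(int orderId)
    {
        // domain logic that used to live inside the presenter; now it can
        // be unit tested in isolation and shared by MVP and MVC alike
        return 0m; // placeholder for the real calculation
    }
}

public class OrderPresenter
{
    private readonly IOrderService _orders;

    // the dependency is injected rather than newed up inside the presenter
    public OrderPresenter(IOrderService orders)
    {
        _orders = orders;
    }
}
```

The payoff is that when a page is finally converted to MVC, the controller takes the same IOrderService and the presenter is simply deleted.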


October 19, 2010

I can haz tests?

Filed under: .NET, BDD, C#, Patterns, TDD, Uncategorized, web development — ang3lfir3 @ 8:47 pm

So our friend David Burela is back at it again with Developer Blog Banter #2: How do you test your applications?


How do you organize your tests? Do you separate your unit tests, integration tests and UI tests into separate projects? Do you do anything specific to keep track of your tests? What naming conventions do you use? Do you run them before a check in or is that what the build server is for?
If you are not testing, then how would you like to test your apps if given the opportunity?

This post is my response to the above question.

I tend to have a Specs assembly for each major component of a project; these usually include some form of integration tests using in-memory SQLite databases. The first thing most people will think is that I am probably mixing integration tests with my specs/unit tests and getting everything all mishmashed together. That is probably a fair observation. Since I find myself almost exclusively using MSpec for testing, I see no real reason to separate the tests into any grouping other than by their system components.

An example set of test assemblies would be:


I did learn a few tricks from my friend @cbilson that I really liked and continue to use. One is naming the actual files in the test assemblies after the feature or part of the system we are testing. So for a group of specs that test the calculation of a bunch of distribution dates for a retirement fund based on some frequency (monthly, quarterly, annually, etc.), the name of the file would be "creating_distribution_dates_for_funds.cs". This name is also used for the namespace that all the tests live in, since MSpec tests are each a separate class. Groups of related tests can be found quickly, and the naming helps others who come onto the project find the tests that describe how something works.
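A sketch of what such a file might contain follows. The Fund and Frequency types are minimal stand-ins invented so the spec compiles; only the file/namespace naming convention is the point:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Machine.Specifications;

// minimal stand-in domain types; the real objects are obviously richer
public enum Frequency { Monthly, Quarterly, Annually }

public class Fund
{
    private readonly Frequency _frequency;
    public Fund(Frequency frequency) { _frequency = frequency; }

    public IEnumerable<DateTime> GetDistributionDates(int year)
    {
        var count = _frequency == Frequency.Monthly ? 12
                  : _frequency == Frequency.Quarterly ? 4 : 1;
        for (var i = 0; i < count; i++)
            yield return new DateTime(year, i * (12 / count) + 1, 1);
    }
}

// file: creating_distribution_dates_for_funds.cs -- the file name doubles
// as the namespace for every spec class inside it
namespace creating_distribution_dates_for_funds
{
    [Subject("distribution dates")]
    public class when_a_fund_distributes_quarterly
    {
        static Fund fund;
        static IEnumerable<DateTime> dates;

        Establish context = () => fund = new Fund(Frequency.Quarterly);

        Because of = () => dates = fund.GetDistributionDates(2010);

        It should_create_four_dates = () => dates.Count().ShouldEqual(4);
    }
}
```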

Okay, so about those in-memory database integration tests. This is another thing that @cbilson and I worked on together (ok, mostly him, but I helped). It's certainly not a new idea (I got it from some blog post of Ayende's that I read), but it was a major breakthrough in helping us move quickly with testing and be extremely accurate. We were able to not only have nice _FAST_ tests for mappings in nHibernate, but also to test queries quickly and accurately. Having this ability helps a lot when you want to test with not only the database for your application but also test versions of other databases you may need to access (most of our apps use at least 3-4 databases). This can make repository testing a no-brainer and helps eliminate the kludgy methods people have had to use in the past. @NotMyself, @codereflection and I have even gone so far as to integrate NBuilder into the process for some scenarios, making tests clean, expressive and to the point. I'm getting sidetracked I think…. testing is exciting stuff, damnit!
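Our actual helper isn't shown here, but the general shape of the trick, as I remember it from that era of Fluent NHibernate, is roughly the sketch below. CategoryMap just stands in for any mapping class in your assembly:

```csharp
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

public class InMemoryDatabaseTest
{
    public ISession CreateSession()
    {
        Configuration configuration = null;

        var factory = Fluently.Configure()
            .Database(SQLiteConfiguration.Standard.InMemory().ShowSql())
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<CategoryMap>())
            .ExposeConfiguration(cfg => configuration = cfg)
            .BuildSessionFactory();

        var session = factory.OpenSession();

        // the schema must be exported over the session's own connection:
        // an in-memory SQLite database vanishes when its connection closes
        new SchemaExport(configuration)
            .Execute(false, true, false, session.Connection, null);

        return session;
    }
}
```

Each test gets a fresh, empty, real database in milliseconds, which is what makes mapping and query tests fast enough to run constantly.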

Of course all our tests are run on the CI server and we "always" run them all before committing…. right? 😉

I would like to one day soon find a really elegant way to add some more integration testing, maybe even at the UI level, into the process. As this is very painful and hard to maintain today, we do our best to test as much as we can. There is no substitute for great, comprehensive QA; we just hope we make their jobs a lot easier by building well designed, rock solid software… that works.

Hope this covers the question and hope someone finds some value in it.

See also my response to the first Developer Blog Banter : My Technology Stack

March 18, 2009

Fluent NHibernate new style mappings – powerful semantics

Filed under: C#, Fluent NHibernate, NHibernate, ORM, Uncategorized — ang3lfir3 @ 5:56 pm

So today I updated to the latest build of Fluent NHibernate. As any of you who might have done the same have discovered, there are some breaking changes. I wasn't sure I liked the new class-based conventions at first, especially since it wasn't immediately clear how to tackle altering my mappings. Then it dawned on me how powerful the conventions would end up being while also promoting DRY.

as @jagregory said:

"brevity was sacrificed for power in this case."

This can be seen in the example below which is a self referencing Parent Child relationship.

The description of the relationship is:

Categories can have one parent or none.

Categories can have many or no children.

The parent Category is always found in a property called “Parent”.

The children are always found in a property called “Children”.


Original Mapping:

public class CategoryMap : ClassMap<Category>
{
    public CategoryMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
        References(x => x.Parent).TheColumnIs("parent_id").Cascade.All();
        HasManyToMany(x => x.Products).Inverse();
        HasMany(x => x.Children).WithKeyColumn("parent_id").Cascade.All().Inverse();
    }
}

**Note: These aren't exactly "Conventions", but it turned out that 'WithKeyColumn' got dropped and I had to look for a better way. The new style conventions offered that even over older convention styles.


The new style Convention Mappings :

The Convention classes below create a convention that reads like:

"For a HasMany, when the child type matches the type of the root and the name of the property is 'Children', then use the column 'parent_id' as the KeyColumn. For a Reference, when the child type matches the type of the root and the name of the property is 'Parent', then set its reference ColumnName to 'parent_id'."

public class SelfReferencingHasManyConvention : IHasManyConvention
{
    public bool Accept(IOneToManyPart target)
    {
        return target.Member.ReflectedType == target.EntityType && target.Member.Name == "Children";
    }

    public void Apply(IOneToManyPart target)
    {
        // body reconstructed from the description above: use 'parent_id' as the key column
        target.KeyColumnNames.Add("parent_id");
    }
}

public class SelfReferencingReferenceConvention : IReferenceConvention
{
    public bool Accept(IManyToOnePart target)
    {
        return target.Property.ReflectedType == target.EntityType && target.Property.Name == "Parent";
    }

    public void Apply(IManyToOnePart target)
    {
        // body reconstructed from the description above: point the reference at 'parent_id'
        target.ColumnName("parent_id");
    }
}

The Mapping after the convention :

Clean and clear, the conventions themselves are not cluttering the Mapping. More importantly the conventions help me stay DRY.

public class CategoryMap : ClassMap<Category>
{
    public CategoryMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
        References(x => x.Parent).Cascade.All();
        HasManyToMany(x => x.Products).Inverse();
        HasMany(x => x.Children).Cascade.All().Inverse();
    }
}

Adding Mappings to my Persistence Model

You can see that adding the conventions was pretty easy and straightforward. Note that this applies because I am using the PersistenceModel approach.

public class DataModel : PersistenceModel
{
    public DataModel()
    {
        AddMapping(new ProductMap());
        AddMapping(new CategoryMap());
    }
}


March 26, 2008

Developing InfoPath 2007 Solution == Most Painful experience in my life

The title says it all!!! Well, almost. For the last few days I have been banging my head on InfoPath 2007 as a platform for developing a solution. There are a great many painful experiences in InfoPath 2007 as a development platform that I was already expecting (it is, after all, an Office product, thus making it inherently painful)¹ but I wasn't expecting to have it just randomly crash VS any time I write a little code.

What I am trying to do, and hopefully someone smarter than me can tell me how, is to incorporate a custom dll into an InfoPath 2007 Template project in VS2005 AND be able to use that dll to do work on the Template (the library contains validation functions specific to us and I need it to be reusable). Now you might be thinking life was easy here, but let me throw in a few wrenches…

The forms are used on machines that are NOT connected to our network. The Templates are published to machines that are in the field. No SharePoint, no Forms Server, just templates and an mdb with the values for the drop down menus in it. Yup… each template comes with its very own personal copy of the database, which contains nothing more than tables with values and labels for drop down lists. At a later date the data in the .xml files will be uploaded to a Database after being collected.

Hopefully I have made it clear how these templates are being used (not my idea, so I can't provide any justifications).

So how does one use InfoPath 2007 to publish a template containing custom validation routines (complex enough that they need to be written in C#) that use a shared library and access fields on the form? WITHOUT THE WHOLE THING EXPLODING!!!! There are no examples that I can find anywhere of even developing with InfoPath in this sort of manner. I am not finding anything related to my issues on the team blog either.

Maybe I am missing something or simply just don't get it…. whatever the case… hopefully someone can explain this to me…. cuz right now… I'm drowning.


1) Office applications in general always seem to have the goofiest APIs and worst documentation. If I just want to do something once and not become an expert in <Office_application_X_Development />, the pain is almost unbearable. It should be easy, guys…. no really, it should!
