
Does the DevForce Object Mapper work with individual metadata files (csdl,ssdl,msl) instead of an EDMX file?

pk55
Senior Member
Joined: 22-Jul-2009
Location: CA
Posts: 105
Posted: 30-Sep-2009 at 1:57pm
I'm trying to create multiple data models from a single, very large database. I need a couple of tables in both models, so I've been looking at the "CSDL Using Namespace" technique, where you work with the individual CSDL, MSL, and SSDL metadata files separately (since "Using" isn't supported in the EDMX designer file): you define the common types in one CSDL file and then refer to them from the other CSDL files. You then have to generate the C# types for each model using EdmGen.exe and refer to both CSDL files in the metadata connection string. This info comes from: http://blogs.msdn.com/adonet/archive/2008/11/25/working-with-large-models-in-entity-framework-part-2.aspx
 
For example, with two data models that both refer to a common type, you'd end up with 9 files:
 
model1.csdl, model1.ssdl, model1.msl, model1.cs
common.csdl
model2.csdl, model2.ssdl, model2.msl, model2.cs
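Under that technique, the runtime finds the loose files through the metadata keyword of the EntityClient connection string, with the shared CSDL listed alongside the model's own files. A sketch for model1, assuming the file names above (the server and database names are hypothetical):

```xml
<!-- app.config fragment; Data Source and Initial Catalog are hypothetical -->
<connectionStrings>
  <add name="Model1Entities"
       connectionString="metadata=.\model1.csdl|.\common.csdl|.\model1.ssdl|.\model1.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=MyDb;Integrated Security=True&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>
```

model2's connection string would list model2's files plus the same common.csdl, which is how both models resolve the shared types.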
 
The problem is that you don't end up with an EDMX file, and I can't figure out how to tell the DevForce Object Mapper to work with the Entity Framework when there isn't a designer file.
 
Is this supported?
kimj
IdeaBlade
Joined: 09-May-2007
Posts: 1391
Posted: 06-Oct-2009 at 4:33pm
Sorry, loose metadata files aren't currently supported in the Object Mapper, but we are looking at this as a possible feature enhancement.
AdamC
Newbie
Joined: 04-Feb-2008
Location: United States
Posts: 20
Posted: 27-Oct-2009 at 1:26pm
We are in the same boat. My hope is that the "large model" issues mentioned in Part I and Part II of the ADO.NET team blog will be addressed in the upcoming EF4/VS 2010 release. I just posted this on the ADO.NET blog site today:

Will any of the "large model" issues mentioned in Part I and Part II be addressed in the upcoming EF4 release?  There is a lot of good information about some of the new features, but can you let us know if there is any movement towards making EF more friendly to large models?  Or at least if there will be a less manual approach to splitting models in EF4? 

My team is in the process of migrating an application with LOTS of tables. So far we only have ~80 tables in EF (out of 1,000+), but we will add more tables to the model as we add functionality. Unfortunately, we couldn't add all of the tables to EF due to performance problems and other issues mentioned in these blog posts.

We tried breaking up the model manually a few months ago, but unfortunately we encountered issues similar to pk55's. We also ran into a few DevForce-related limitations. However, if/when Microsoft provides better out-of-the-box support for large models and for splitting large models, I'm sure DevForce will take advantage of it.

Anyway, a year has passed since we first started working with Microsoft EF and we are getting concerned about its ability to handle our large database.  If anyone from IdeaBlade has any inside information from the ADO.NET team, would you be willing to share?  :-)





WardBell
IdeaBlade
Joined: 31-Mar-2009
Location: Emeryville, CA
Posts: 338
Posted: 28-Oct-2009 at 2:51pm
I am putting out feelers about EF v.4 and large models. I don't know if I'll have anything to disclose; worse, I'm not sure they've done anything on this front.
 
Kim has answered on the multiple-CSDL idea. It is not something we'll do any time soon.
 
I hasten to add that the EF development approach that introduced the multiple-CSDL suggestion is highly precarious. I personally would be uncomfortable with it; it is fragile and difficult to maintain.
 
I still lean toward several smaller EF models with clearly defined boundaries. Sure, your models may map to some of the same tables. Doing so could yield two distinct "Customer" CLR entity types, for example, each with the same class name and shape. This duplication shouldn't be a big hassle from the code gen perspective. The challenge is how to provide common custom business logic for "Customer" across the two models. In other words, when it comes to adding application-specific, business behavior to the Customer class ... and you want it to be the same for all Customer entities in each sub-domain, how do you manage that?
 
You can. It's not ideal but it's doable. For a start, keep each domain model in its own project.  In our example, you create a single Customer partial class file with the extra common logic and put it in, say, the Model A project. The Model B project points to it with a shortcut (the same way a Silverlight project's "Customer" class points with a shortcut to the Customer class file in the Web project).
 
You then cross-compile the file into projects A and B yielding distinct CLR Customer types that are structurally and logically identical. Use compiler directives in the shared code file to give the Model A "Customer" a different namespace than Model B's namespace to keep the type definitions from colliding.
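The shared-file trick can be sketched like this (the conditional symbol, namespaces, and the CreditLimit property are hypothetical; Model A's project would define MODEL_A in its build settings, Model B's would not):

```csharp
// Customer.Custom.cs: authored once in the Model A project and added to
// the Model B project as a file link ("Add as Link"), so the same source
// compiles into both assemblies.
#if MODEL_A
namespace MyCompany.ModelA
#else
namespace MyCompany.ModelB
#endif
{
    public partial class Customer
    {
        // Common business logic, compiled into both CLR Customer types.
        public bool IsPreferred
        {
            get { return CreditLimit > 10000m; }
        }
    }
}
```

Because the namespace differs per project, the two Customer types never collide, yet their custom behavior comes from a single source file.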
 
All of this assumes that you can define clear sub-domain boundaries. Modules using Model A don't need Model B entities and modules using Model B don't need Model A entities. You may need to interface across modules (as when you are talking about the same conceptual "Customer"); you do this with what is known as an "anti-corruption" layer. It can be as simple as coordinating events that convey the shared CustomerId value in the event payload.
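The event-based handoff can be as small as this sketch (all type and member names are hypothetical):

```csharp
// Module A raises an event carrying only the shared CustomerId value;
// Module B handles it and loads its own ModelB Customer independently,
// so neither module ever touches the other's entity types.
public class CustomerSelectedEventArgs : System.EventArgs
{
    public int CustomerId { get; private set; }

    public CustomerSelectedEventArgs(int customerId)
    {
        CustomerId = customerId;
    }
}

public class ModuleA
{
    public event System.EventHandler<CustomerSelectedEventArgs> CustomerSelected;

    protected void OnCustomerSelected(int customerId)
    {
        var handler = CustomerSelected;
        if (handler != null)
            handler(this, new CustomerSelectedEventArgs(customerId));
    }
}
```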
 
I happen to think this is more than an expedient; it's good design.
 
Domain Driven Design architects would argue more rigorously that the Model A Customer and the Model B Customer are the same in-name-only; they really are distinct concepts and should not only be distinct types ... they should have distinct storage as well!  DDD folks would put them in different tables in different databases!
 
This may strike you as a tad impractical <grin/>.
 
I think the separate domain and model approach with shared storage is a reasonable compromise. That's why I outlined the shared table / shared code approach above.
 
I admit I've made this case before and met stiff resistance. AdamC knows exactly what I mean.
 
If you believe that, in your application, it is essential to retain complete flexibility (any entity can reach any other entity at any time, in any application module), and that there is no way to design sub-models, then I doubt anything short of a large-model solution will work for you. You will have to decide, then, if EF is right for you.
 
It may not be. You may need a different ORM technology. DevForce supports a POCO approach that is open to EF alternatives. You could write your application with NHibernate, for example. You would still get the DevForce client experience, which is no small benefit: n-tier, caching, offline, LINQ, validation ... all of these are still yours.
 
Unfortunately, you will be taking on many of the tedious mapping duties that you expected EF (and DevForce) to cover; with a 1,000-table database to map, you may not like the POCO route.
 
I don't know of any technology solution to this problem that is hiding in the wings ... not from Microsoft or anyone else.
 
This is not the happy report I want to give. It is an honest one.
AdamC
Newbie
Joined: 04-Feb-2008
Location: United States
Posts: 20
Posted: 28-Oct-2009 at 4:15pm
Thanks, Ward. I appreciate your thoughtful and in-depth response. After V1 of our product is released at the end of the year, we plan to create smaller models and will attempt to implement the solution you suggested. We'll see how things fall out, and I'll keep you posted. My hope was that Microsoft would be making this task easier for us in V4 of EF. I would love to see a working version of the approach you suggested, so if IdeaBlade has any code to share it would be much appreciated. This may help other customers and potential customers see how they can work with large databases.

I agree with you that developers should strive toward creating smaller EF models with clearly defined boundaries. We have made a huge effort to follow just about every best practice known to man in V1 of our software. :-) However, I don't think Microsoft placed a "large model" limitation in EF to enforce this smaller-model design practice. In other words, it is my belief that Microsoft didn't take large models into account and unintentionally forced people to use a manual, more difficult-to-maintain workaround.

By the way, I think you misinterpreted my "stiff resistance".  I do not resist the idea of a smaller model approach.  As a matter of fact, in my company's current "legacy" application we have smaller sub-domains and do not have any issues.  The resistance you detected had to do with how much time and effort we will need to spend to get this approach to work.  Shouldn't EF work the same regardless of whether or not people choose to create one large model or divide it into smaller models?  In my view this is a deficiency that will hurt Microsoft's EF offering (and possibly DevForce EF) in the long term.  I hope not, because we really love the DevForce EF product and support.  But my fear is that many larger, enterprise software companies with legacy, 1,000+ table databases may shy away from EF -- companies that I think would really benefit from DevForce.
WardBell
IdeaBlade
Joined: 31-Mar-2009
Location: Emeryville, CA
Posts: 338
Posted: 28-Oct-2009 at 5:12pm
Hi Adam -
 
I completely agree that Microsoft missed the boat on large models. They didn't even think about it and, as far as I can tell, it is not a priority today either.
 
They didn't ignore big models "for our own good" either. I would prefer that YOU, the developer, make your choice independent of what I or any other dreamer/architect think is the wise course.
 
We didn't make it our focus either. We are doing a few things in a coming release to shrink the maximum generated file size (that's a problem for many Visual Studio installs). But we haven't taken up the challenge of helping you finesse EF's limitations.
 
Thanks for the clarification on my "stiff resistance" jibe. Your "resistance" was always practical. It's easy for DDD architects to talk about domain boundaries and separation. It takes a lot of work to get that right and the tooling to support you is just not there. I absolutely agree with you: "EF [should] work the same regardless of whether or not people choose to create one large model or divide it into smaller models".   I hope they address this in v.4; I have my antennae up for news of this.
 
We want that "big model" business! I see great value in your suggestion: "a working version of the approach you suggested."
 
No promises, but I'll give it a try when I get a chance. I got a jump on it in the last hour and built a little example structured exactly as I described above. I can report that it compiles ... which means it works, right?
 
We'll see how far I can take it in my spare time.
 
Meanwhile, early word is "no change" for EF 4; I'm going to go to the horse's mouths for confirmation.
AdamC
Newbie
Joined: 04-Feb-2008
Location: United States
Posts: 20
Posted: 29-Oct-2009 at 6:20am
Thanks again, Ward; I'm in complete agreement. I really appreciate the efforts you and IdeaBlade are making to assist in this area. Like I mentioned, we hope to use the approach you suggested in the first quarter of next year, so any sample code or guidance IdeaBlade can provide between now and then will be much appreciated!
WardBell
IdeaBlade
Joined: 31-Mar-2009
Location: Emeryville, CA
Posts: 338
Posted: 13-Nov-2009 at 6:53pm
Finally got my essay on Large EF Models done and posted, along with sample MS Test application and a video. Find it at http://www.ideablade.com/WardsCorner/#largemodels .
pk55
Senior Member
Joined: 22-Jul-2009
Location: CA
Posts: 105
Posted: 19-Nov-2009 at 4:28pm
I've read your essay and watched your video and appreciate the effort.
 
I agree that not bringing static reference data in as entities makes sense when it can be treated purely as enums (while keeping it in the database for RI or reporting purposes). That can certainly reduce the overall model size, and since you typically have to code for changes to that type of data anyway, having to update the enum isn't a big deal.
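For concreteness, that idea looks like this sketch (the lookup table name and values are hypothetical): the table stays in the database for referential integrity and reporting, but the model exposes it as an enum rather than a mapped entity.

```csharp
// Mirrors a hypothetical OrderStatus lookup table; the integer values
// match the table's primary keys, so foreign-key columns map directly
// without pulling the table into the entity model.
public enum OrderStatus
{
    Pending = 1,
    Shipped = 2,
    Delivered = 3,
    Cancelled = 4
}
```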
 
You didn't address this in the essay, but how would you use a common DevForce ancestor class for ModelA.Product and ModelB.Product? I'm not talking about linking the partial classes for Product; I'm talking about an injected base type (or a hierarchy of base types) where the common behavior and properties live, since Product may share methods with other types of entities that have common behavior. I don't think there's any way the Object Mapper can use an injected base type from another namespace as a common ancestor. This ancestor class could have descendants in many EDMX files. Even now, using a single namespace for multiple data models that come together as a single IdeaBlade Domain Model, I have to manually edit the injected base type declaration in the CSDL so that a common ancestor can be used for descendants in different models. Someday, in my dubious spare time, I'll write up a bug report about that :-) but it has something to do with a hierarchy of ancestors like:
           Entity
               GrandParent
                      Parent
                           Child
but this needs to be declared as:
           Entity
               GrandParent
           Entity
               Parent
           Entity
               Child
and then I manually edit the XML to set the entity's IdeaBlade adornedBaseType to be Child > Parent. But I digress....
 
Also, your technique of copying data from one model to the other doesn't address doing that with a PassThroughESQLQuery, since you can't cross data-model boundaries with ESQL queries in EF even when the physical tables are in the same database. That leads you to duplicate more and more entities in each model that needs to get data from, or navigate to, entities in other models. And ESQL is the only way I can create on-the-fly navigation to JOIN entities that I don't want to permanently link together (for various reasons beyond this post).
 
If EF can't handle large numbers of entities in a single model, then it should allow for breaking those models up. That's why I asked about supporting loose metadata files (using a common CSDL). I agree that the EF team's implementation of that is incredibly kludgy (and badly documented), and the fact that it rules out the designer is just crazy. They really need to find a way to let the designer work with multiple EDMX models for the same database so that you could create the associations between entities. Maybe if they actually used a real database with really large numbers of entities, instead of some contrived example, they might see why it's important to fix this.
 
As an aside, I've taken to using the pre-generated Views using T4 templates and that definitely speeds up performance.
 
 
 
 
WardBell
IdeaBlade
Joined: 31-Mar-2009
Location: Emeryville, CA
Posts: 338
Posted: 20-Nov-2009 at 3:44pm
Hi Paul
 
I don't know what problem you are encountering with nested base classes. I think I know what you might mean ... I vaguely remember that it was once difficult to declare nested base classes in the OM, but I thought we fixed that over a year ago. Please submit an example and a bug report if you think you can repro this. I will get it fixed.
 
There should be no problem defining and sharing a base class that you've written to hold behavior used across entities in your models. Such a base class does not have to have any particular namespace. I can define an assembly (MyBaseModel) with all kinds of goodies in it. Of course, MyBaseModel.ProductBase must ultimately inherit from Entity. If it inherits from any other entity type along the way (one that is particular to a model), you are stuck. In that case, you play the same game with the base class (including namespace swapping in the partial classes) that you do with Product in my example.
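A sketch of such a shared base class (the assembly, member, and class names are hypothetical; the DevForce Entity base class is referenced here without namespace qualification):

```csharp
// In the MyBaseModel assembly: behavior shared by Product-like entities
// across models. ProductBase inherits directly from the DevForce Entity
// base class, never from a model-specific entity type, so any model's
// entities can descend from it.
public abstract class ProductBase : Entity
{
    // Hypothetical common behavior available to every descendant.
    public virtual bool IsDiscontinued()
    {
        return false;
    }
}
```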
 
You've given me another soap box to stand on and shout "Don't build models with deep class hierarchies."
 
Remember this important principle: favor composition over inheritance. You should be adding behavior by injecting it, not by subclassing. Think "Strategy Pattern". This heuristic applies everywhere, not just in entity models. Your example provokes in me a deep sense of despair. Please don't do what I see you describing here. P.S.: beware of the performance implications of model inheritance as well. TPH, TPC, and TPT are great ways to get in trouble with ANY ORM; use them sparingly.
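In code, "favor composition over inheritance" means something like this sketch (the interface and member names are hypothetical): inject a strategy object instead of adding another layer to the class hierarchy.

```csharp
// Strategy Pattern sketch: pricing behavior is injected into Product
// rather than baked into a Product subclass, so it can be swapped at
// runtime without touching the entity's inheritance chain.
public interface IPricingStrategy
{
    decimal PriceFor(Product product);
}

public partial class Product
{
    // Assigned at runtime; different modules can supply different strategies.
    public IPricingStrategy PricingStrategy { get; set; }

    public decimal CurrentPrice()
    {
        return PricingStrategy.PriceFor(this);
    }
}
```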
 
I am not dodging the issue. I repeat: the technology should not be imposing a design choice. If we have a problem, we'll fix it. But we are in no hurry to facilitate "bad" design either.
 
I did not understand your statement about "using a PassThroughESQLQuery since you can't cross data model boundaries using ESQL queries in EF even if the physical tables are in the same database." Where was the lack of clarity? I am unaware of a difference between my use of LINQ and the use of ESQL query in composing data to transfer across model boundaries. The approach should be the same as the one I demonstrated in my Product Projection example. What did I miss?
 
How can you say: "ESQL is the only way I can create on-the-fly navigation to JOIN entities that I don't want to permanently link together"?  You can perform joins in LINQ with DevForce. I hope you can appreciate my distress when someone posts a throw-away comment - "you can't do X" - and doesn't bother to substantiate the claim. I respond more respectfully and effectively when you either give me the example or say "I don't know how to do X".
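For the record, an ad-hoc LINQ join needs no mapped association between the entity types. A minimal sketch (the entity manager variable and the entity and property names are hypothetical):

```csharp
// Joins Orders to Products on a foreign-key value even though no
// navigation property permanently links the two entity types.
var query =
    from o in manager.Orders
    join p in manager.Products on o.ProductId equals p.Id
    select new { o.OrderDate, ProductName = p.Name };
```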