
Question on correct way of using DevForce on a website

Printed From: IdeaBlade
Category: DevForce
Forum Name: DevForce Classic
Forum Description: For .NET 2.0
URL: http://www.ideablade.com/forum/forum_posts.asp?TID=1490
Printed Date: 23-Apr-2025 at 1:08pm


Topic: Question on correct way of using DevForce on a website
Posted By: JulianBenjamin
Subject: Question on correct way of using DevForce on a website
Date Posted: 23-Sep-2009 at 1:04pm
I have a technical design question about the best way to use a PersistenceManager on a data-bound website.
 
Is it better to create a new PersistenceManager per user session and use it for all data calls, so that one user's save doesn't inadvertently commit someone else's in-progress changes (as would happen if I called SaveChanges on a shared manager while another user was in the middle of an entry)?  Or should I use a single global PersistenceManager for the application, so that everyone sees the changes others make?
 
Also, when calling SaveChanges, is it best to pass in the list of entities you know you've changed, or just call the method with no parameters and have it save all changes that are pending?



Replies:
Posted By: GregD
Date Posted: 23-Sep-2009 at 3:06pm
Most would use a PersistenceManager per user session, both because of the thread-safety issue you mention and because of concurrency issues. For example: user A gets Jane Doe's record on his web page; user B then changes Jane Doe's last name; user A changes something else and submits, causing his original (now stale) LastName value to overwrite user B's change in the cache.

It is also possible to instantiate, use, and destroy a PersistenceManager for each submit. This is less performant, but more scalable, than a PM per user session.
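
In code, the per-submit pattern looks roughly like this (the page handler and the fetch step are hypothetical; only the PersistenceManager constructor and SaveChanges come from DevForce):

    // using IdeaBlade.Persistence;
    // Hypothetical button handler: the PersistenceManager lives only for
    // the duration of this postback, so nothing is shared between users.
    protected void SaveButton_Click(object sender, EventArgs e)
    {
        PersistenceManager pm = new PersistenceManager(true); // connect immediately
        // ... fetch the entities this request needs and apply the user's edits ...
        pm.SaveChanges(); // persist everything pending in this PM's cache
        // pm is not stored anywhere; it is gone when the request ends
    }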

>> when calling SaveChanges, is it best to pass in the list of entities you know you've changed, or just call the method with no parameters and have it save all changes that are pending?

Generally best to save all changed entities. This relieves you of the burden of managing data integrity issues that would arise if your partial save failed to include all necessary parts of a conceptual entity that spans multiple related physical ones (e.g., an Order with its line items).
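
To make the contrast concrete (Order/OrderDetail are hypothetical entity types, and the exact overload for the partial save may differ in your version):

    // Recommended: let the PM save every pending add/modify/delete together.
    pm.SaveChanges();

    // Risky partial save: if the list omits the new OrderDetail rows,
    // the Order is persisted without its line items.
    // EntityList orderOnly = ...;  // contains the Order but not its details
    // pm.SaveChanges(orderOnly);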



Posted By: Murray
Date Posted: 24-Feb-2010 at 3:10pm

I think there is a case for an 'Application' Persistence Manager and a 'Session' PM as well.

Would the choice depend on the role of the user? For instance, a site admin would want to see the most up-to-date version of the data, so an application PM would be the best option. A user who is just browsing and making simple submits would be better off with a session PM. Note that an Application Pool recycle will destroy the InProc session.

I find that keeping a session PersistenceManager current with all database changes is quite tricky.
 
We already have Session["PersistenceManager"] = new PersistenceManager(true); in Global.asax Session_Start.
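
Spelled out, that looks like this (the page-level accessor is just a convenience wrapper I'd add, not part of DevForce):

    // Global.asax: one PersistenceManager per user session
    void Session_Start(object sender, EventArgs e)
    {
        Session["PersistenceManager"] = new PersistenceManager(true);
    }

    // In a page (or base page class), a convenience accessor:
    protected PersistenceManager PM
    {
        get { return (PersistenceManager)Session["PersistenceManager"]; }
    }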
 
My question is how to instantiate an 'Application' PersistenceManager. Is that possible in a web app? I am using DF Classic.


Posted By: WardBell
Date Posted: 26-Feb-2010 at 5:43pm
Never use a Global/Application Persistence Manager!
 
PersistenceManager is not thread safe and very bad things will happen if you try to share entities across sessions.
 
A session-level PersistenceManager that is held in server memory between requests is technically fine, although you should question whether, and to what degree, that is scalable enough for your use. We do show this approach in our ASP.NET demonstrations ... but you should really understand the issues and its appropriateness for your application before mimicking it.
 
I quite understand the desire to cache stable reference entities on the middle tier rather than keep going to the data tier for them. The collection of US States is not likely to change for some time.
 
This can be done, but you have to use EntityCacheState and write thread-safe code (especially if the reference entities MIGHT change between middle-tier recyclings). This topic is advanced and should only be considered once you have demonstrated with metrics that (a) you are putting overwhelming pressure on the data tier with repeated requests for the same entities and (b) the database pressure would be substantially relieved if these entities were cached in the middle tier.
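
If you do go down this road, the shape of it is roughly this (a sketch only; the CacheStateManager method names are from memory and must be verified against your DF Classic assemblies):

    // using IdeaBlade.Persistence;
    // Build the reference-entity snapshot once, park it in Application state,
    // and let each per-session/per-request PM merge it instead of hitting the db.
    private EntityCacheState GetReferenceCache()
    {
        EntityCacheState ecs = (EntityCacheState)Application["RefCache"];
        if (ecs == null)
        {
            Application.Lock(); // serialize the one-time build across threads
            try
            {
                ecs = (EntityCacheState)Application["RefCache"];
                if (ecs == null) // double-checked: only one thread queries the db
                {
                    PersistenceManager loader = new PersistenceManager(true);
                    // ... query the stable reference entities (e.g., the US States) ...
                    ecs = loader.CacheStateManager.GetCacheState();
                    Application["RefCache"] = ecs;
                }
            }
            finally { Application.UnLock(); }
        }
        return ecs;
    }

    // In each session/request PM:
    // pm.CacheStateManager.RestoreCacheState(GetReferenceCache());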
 
These propositions seem true on paper; in practice they rarely are.
 
We can take a crack at options ... once someone shows me measurements that justify the effort. I'm not trying to hide anything; I've just seen too many folks chase this rabbit down the hole at the expense of adding business value to the application. My $0.02.


