Check out this article. Unbelievable. These guys need to be in college (but can't afford it).
http://www.wired.com/wired/archive/13.04/robot.html?pg=1&topic=(none)&topic_set=(none)
How often do we code something like this:
    For Each O In ObjectsCollection
        SharedFunction.Processing(O)
        ' do some stuff
        SharedFunction.WriteToDb(O)
    Next

A little background
The SharedFunction.Processing call does some SQL reads. I'm using the SQLHelper from Microsoft.
Problem
The problem is that with the above code, after about 120 objects I get an error: "Connection Timeout" or "Max Pool Size reached."
The Cause
The loop is so tight that garbage collection never gets a chance to clean up the leftover connections from Processing and WriteToDb, so the pool fills up.
Solution
The solution I used was to force a garbage collection. Then everything is hunky-dory.
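A minimal sketch of the workaround, reusing the SharedFunction names from above (the every-100-objects threshold and the MyObject type are illustrative choices, not from the original code):

```vb
Dim count As Integer = 0
For Each O As MyObject In ObjectsCollection
    SharedFunction.Processing(O)   ' does some SQL reads via SQLHelper
    ' do some stuff
    SharedFunction.WriteToDb(O)

    ' Periodically force a collection so finalized connections
    ' are released back to the pool before it hits its limit.
    count += 1
    If count Mod 100 = 0 Then
        GC.Collect()
        GC.WaitForPendingFinalizers()
    End If
Next
```

Collecting every N objects rather than every iteration keeps the cost of the forced collection from dominating the loop.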
Currently there is a "global" of sorts that carries the SQL connection string. I didn't want to mess with a global connection, as that's just asking for problems. However, an alternate solution could be to refactor SharedFunction.X into instance functions, instantiate a connection in the instance, and reuse it across those functions.
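That alternate refactor would look roughly like this (a sketch only; apart from Processing and WriteToDb, the class and member names are hypothetical, not from the original code):

```vb
Imports System.Data.SqlClient

Public Class ObjectProcessor
    Implements IDisposable

    Private ReadOnly _conn As SqlConnection

    Public Sub New(ByVal connectionString As String)
        ' One connection, opened once and reused by every call.
        _conn = New SqlConnection(connectionString)
        _conn.Open()
    End Sub

    ' Instance versions of the shared routines, sharing _conn.
    Public Sub Processing(ByVal o As MyObject)
        ' SQL reads against _conn ...
    End Sub

    Public Sub WriteToDb(ByVal o As MyObject)
        ' SQL writes against _conn ...
    End Sub

    Public Sub Dispose() Implements IDisposable.Dispose
        _conn.Close()
    End Sub
End Class
```

The caller would create one ObjectProcessor for the whole loop and Dispose it afterward, so only a single pooled connection is ever in play.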
After looking at this some more, the alternate solution really isn't that nice to implement. It would require modifying two Business Logic Layer code modules and their associated Data Access Layers. I'd rather not worry about carrying a connection around and instead stay focused on what I'm really trying to accomplish.