Bug 331199 - XSDParser causes OutOfMemoryError
Summary: XSDParser causes OutOfMemoryError
Status: CLOSED DUPLICATE of bug 278853
Alias: None
Product: EMF
Classification: Modeling
Component: XSD
Version: unspecified
Hardware: PC Windows XP
Importance: P3 normal
Target Milestone: ---
Assignee: Ed Merks CLA
QA Contact:
URL:
Whiteboard:
Keywords:
Depends on:
Blocks:
 
Reported: 2010-11-26 07:16 EST by Wojciech Galanciak CLA
Modified: 2023-01-12 11:53 EST
CC List: 1 user

See Also:


Attachments

Description Wojciech Galanciak CLA 2010-11-26 07:16:10 EST
The adopter uses XSDParser to parse XSDs as part of their component deployment process. We notice a steady increase in the size of userDataMap after each deployment. We suspect that there are strong references (direct or indirect) to the key objects stored somewhere on the adopter's side. However, even when we run a simple test that reads the same file 1000 times (and stores strong references), we observe one map entry for each node, multiplied by 1000. In other words, userDataMap keeps 1000 entries for the same node. Is that correct behaviour? If it is, why is userDataMap a static field?

Build is 200705141058.
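
A minimal sketch of the kind of test described above, assuming the no-argument XSDParser constructor and the parse(String)/getSchema() methods of org.eclipse.xsd.util.XSDParser; the schema path is a hypothetical placeholder:

    import java.util.ArrayList;
    import java.util.List;

    import org.eclipse.xsd.XSDSchema;
    import org.eclipse.xsd.util.XSDParser;

    public class XSDParserLeakTest
    {
      public static void main(String[] args)
      {
        // Hypothetical path; substitute any schema file available locally.
        String uri = "file:///C:/schemas/sample.xsd";

        // Holding the parsed schemas keeps their DOM nodes strongly reachable,
        // so any map entries keyed on those nodes cannot be collected.
        List<XSDSchema> schemas = new ArrayList<XSDSchema>();
        for (int i = 0; i < 1000; ++i)
        {
          XSDParser parser = new XSDParser();
          parser.parse(uri);
          schemas.add(parser.getSchema());
        }
        System.out.println("Parsed " + schemas.size() + " copies of the same schema");
      }
    }

Because each iteration produces a fresh DOM, each iteration can add fresh node-keyed entries, which is the growth the test observes while the references are held.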
Comment 1 Ed Merks CLA 2010-11-26 10:17:30 EST

*** This bug has been marked as a duplicate of bug 278853 ***
Comment 2 Wojciech Galanciak CLA 2010-11-30 09:51:38 EST
I have tested this fix (bug 278853) in my build and unfortunately it does not solve the problem. I would say the nature of the problem is different. I would expect userDataMap to hold one entry per unique node, but in practice, if I parse the same file two times, I get two entries for the same node. The reason is the type of the nodes: in this case it is org.apache.xerces.dom.ElementNSImpl, which does not implement an equals method, so nodes are compared by reference. Correct me if I am wrong, but I think that in this case we lose the main advantage of having this map as a static field.
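
The reference-comparison point can be illustrated without the XSD plug-in at all, using the JDK's built-in DOM parser (which returns a different Element implementation than Xerces' ElementNSImpl, but behaves the same way for equals); the file name is a made-up placeholder:

    import java.io.File;
    import java.util.Map;
    import java.util.WeakHashMap;

    import javax.xml.parsers.DocumentBuilderFactory;

    import org.w3c.dom.Element;

    public class NodeIdentityDemo
    {
      public static void main(String[] args) throws Exception
      {
        // Hypothetical file; any well-formed XML or XSD will do.
        File file = new File("sample.xsd");
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();

        // Two parses of the same file produce two distinct root Element objects.
        Element first = factory.newDocumentBuilder().parse(file).getDocumentElement();
        Element second = factory.newDocumentBuilder().parse(file).getDocumentElement();

        // DOM nodes do not override equals, so a node-keyed map collects
        // one entry per parse even though the content is identical.
        Map<Element, String> userData = new WeakHashMap<Element, String>();
        userData.put(first, "first parse");
        userData.put(second, "second parse");

        System.out.println(first.equals(second)); // false: different instances
        System.out.println(userData.size());      // 2
      }
    }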
Comment 3 Ed Merks CLA 2010-11-30 12:08:24 EST
The DOM nodes are keys in the weak hash map.  If these keys appear to be leaked, it's not because of the map itself.  They'll stay in the map as long as something else keeps them from being garbage collected.  I don't expect ElementImpl to do structural equality testing.  Different instances are expected to be unequal and to lead to more map entries, but the "old" entries should be collected when they become garbage.  The map is static because one needs to be able to map from a Node to an object in the model without storing that data on the node itself.

You should be spending your time looking for what else is keeping the older DOMs in memory.  The weak hash map's keys cannot be the culprit.
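
A minimal sketch of the weak-key behaviour described above, using a plain java.util.WeakHashMap with a placeholder key and value (note that garbage collection timing is not guaranteed):

    import java.util.WeakHashMap;

    public class WeakKeyDemo
    {
      public static void main(String[] args) throws Exception
      {
        WeakHashMap<Object, String> userDataMap = new WeakHashMap<Object, String>();

        Object key = new Object();
        userDataMap.put(key, "model object");
        System.out.println(userDataMap.size()); // 1 while 'key' is strongly reachable

        // Drop the only strong reference; the entry becomes eligible for collection.
        key = null;
        System.gc();
        Thread.sleep(100); // give the collector a chance; timing is not guaranteed
        System.out.println(userDataMap.size()); // typically 0 once the key is collected
      }
    }

As long as something else holds the DOM nodes (for example, retained XSDSchema instances or retained Documents), the corresponding entries stay in the map, which matches the observation in the original report where strong references were kept on purpose.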