| Summary: | XSDParser causes OutOfMemoryError | | |
|---|---|---|---|
| Product: | [Modeling] EMF | Reporter: | Wojciech Galanciak <wojciech.galanciak> |
| Component: | XSD | Assignee: | Ed Merks <Ed.Merks> |
| Status: | CLOSED DUPLICATE | QA Contact: | |
| Severity: | normal | | |
| Priority: | P3 | CC: | krzysztof.daniel |
| Version: | unspecified | | |
| Target Milestone: | --- | | |
| Hardware: | PC | | |
| OS: | Windows XP | | |
| Whiteboard: | | | |
Description
Wojciech Galanciak
*** This bug has been marked as a duplicate of bug 278853 ***

I have tested this fix (bug 278853) in my build and unfortunately it does not solve the problem. I would say the nature of the problem is different. I assumed that userDataMap would hold one entry per unique node, but in practice, if I parse the same file twice, I get two entries for the same node. The reason is the type of the nodes: in this case it is org.apache.xerces.dom.ElementNSImpl, which does not implement an equals method, so nodes are compared by reference. Correct me if I am wrong, but I think that in this case we lose the main advantage of having this map as a static field.

The DOM nodes are keys to the weak hash map. If these keys appear to be leaks, it's not because of the map itself: they'll stay in the map only as long as something else keeps them from being garbage collected. I don't expect ElementImpl to do structural equality testing. Different instances are expected to be unequal and therefore lead to more map entries, but the "old" entries should be collected once they become garbage. The map is static because one needs to be able to map from a Node to an object in the model without storing that data on the node itself. You should be spending your time looking for whatever else is keeping the older DOMs in memory. The weak hash map's keys cannot be the culprit.
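To make the exchange concrete, here is a minimal, self-contained Java sketch, not EMF or XSD code; the class FakeNode and the field name userDataMap are stand-ins chosen for illustration. It shows both points being argued: keys that do not override equals produce one map entry per instance, and WeakHashMap entries disappear once nothing else references the keys.

```java
import java.util.Map;
import java.util.WeakHashMap;

public class WeakHashMapSketch {

    // Stand-in for org.apache.xerces.dom.ElementNSImpl: identity-based
    // equals/hashCode, just like a DOM node instance.
    static final class FakeNode {
        final String name;
        FakeNode(String name) { this.name = name; }
    }

    // Stand-in for the static userDataMap discussed in the report.
    static final Map<FakeNode, String> userDataMap = new WeakHashMap<>();

    public static void main(String[] args) throws InterruptedException {
        FakeNode first = new FakeNode("root");
        FakeNode second = new FakeNode("root"); // "same" element, parsed again

        userDataMap.put(first, "data from first parse");
        userDataMap.put(second, "data from second parse");

        // Two entries, because the keys are compared by reference.
        System.out.println("entries after two parses: " + userDataMap.size());

        // Drop the strong references; only the map's weak references remain.
        first = null;
        second = null;

        // Encourage collection (not guaranteed by the JVM, but usually enough here).
        for (int i = 0; i < 5 && !userDataMap.isEmpty(); i++) {
            System.gc();
            Thread.sleep(100);
        }

        // The entries vanish once nothing else holds the keys, which is the
        // reply's point: a real leak means something else still references the DOM.
        System.out.println("entries after GC: " + userDataMap.size());
    }
}
```

If the entries survived the GC loop in a real heap dump, that would point to a strong reference held elsewhere, which is exactly what the reply suggests investigating rather than the map itself.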