This project is read-only.

BinaryPropertyDeserializer: Duplicate Keys being added to PropertyCache


I've hit an issue while developing with SharpSerializer, but I need help from someone more familiar with its use cases to decide how to fix it.

Basically, I get an "An item with the same key has already been added." ArgumentException during the property deserialize operation, on this line:
_propertyCache.Add(referenceId, referenceProperty);
As far as I can tell, the only difference between the ReferenceTargetProperty already in the cache and the new one being inserted is the 'Count' value, which is 2 for the cached entry and 1 for the new one.

Points to note:

1) The data structure is successfully deserialized the first time through the deserializer; the issue is only hit on the second call, so stale cache state definitely feels like the core problem.

2) The data structure is an IDictionary<string,object> where a value could itself be a nested dictionary (to no more than 3-4 levels deep).

3) The propertyName value is null (I'm guessing because the data structure is declared simply as nested dictionaries rather than as custom classes).
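For reference, the structure from note 2 can be built with plain nested generic dictionaries; the keys and values below are made up, purely to illustrate the shape:

```csharp
using System;
using System.Collections.Generic;

static class NestedDictionarySample
{
    // Builds a Dictionary<string, object> whose values may themselves be
    // dictionaries, nested a few levels deep (as in note 2 above).
    public static Dictionary<string, object> Build()
    {
        return new Dictionary<string, object>
        {
            ["name"] = "root",
            ["child"] = new Dictionary<string, object>
            {
                ["id"] = 42,
                ["grandchild"] = new Dictionary<string, object>
                {
                    ["leaf"] = true
                }
            }
        };
    }
}
```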


1) Should there be some kind of check around the above code to catch this? If so, should it simply increment the Count value, or do something more clever?

2) Are there any inherent problems with using un-named nested generic dictionaries like this?
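As a sketch of what a check for question 1 might look like: the following uses simplified stand-ins for SharpSerializer's internals (the class shape, the Count field, and the method name CacheOrReuse are all assumptions, not the library's actual API). On a duplicate referenceId it reuses the cached entry and bumps its Count instead of letting Add throw.

```csharp
using System;
using System.Collections.Generic;

// Simplified stand-in for SharpSerializer's ReferenceTargetProperty
// (shape is an assumption for illustration only).
class ReferenceTargetProperty
{
    public int Count;   // how many times this reference target has been seen
}

class DeserializerSketch
{
    readonly Dictionary<int, ReferenceTargetProperty> _propertyCache =
        new Dictionary<int, ReferenceTargetProperty>();

    // Guarded version of the failing _propertyCache.Add(referenceId, referenceProperty):
    // on a duplicate key, reuse the cached entry and increment its Count
    // rather than throwing an ArgumentException.
    public ReferenceTargetProperty CacheOrReuse(
        int referenceId, ReferenceTargetProperty referenceProperty)
    {
        ReferenceTargetProperty cached;
        if (_propertyCache.TryGetValue(referenceId, out cached))
        {
            cached.Count++;
            return cached;
        }
        _propertyCache.Add(referenceId, referenceProperty);
        return referenceProperty;
    }
}
```

Whether incrementing Count is actually correct depends on how the cache is consumed later in the deserializer, which is exactly the use-case question raised above.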

Apart from this issue, I'm pretty happy with SharpSerializer so far - and happy to contribute with a little bit of guidance...

Thanks :)


keyboarder wrote Jun 19, 2014 at 2:22 PM

To fix this, I clear the dictionary in the entry method:
   public Property Deserialize()
This eliminates the exception, and should fix the problem in the most general case. I don't see a need to keep the cached values beyond the current call to Deserialize(). If I'm mistaken, then more work needs to be done to provide a context for _propertyCache so that it isn't full of bogus values.
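The effect of the Clear() can be shown with a stripped-down stand-in (a sketch only; the real Deserialize() body and cache types are elided here, and DeserializerDemo is not the actual class):

```csharp
using System;
using System.Collections.Generic;

// Stripped-down stand-in for the deserializer: the cache field persists
// across calls, so it must be reset on entry to each Deserialize() call.
class DeserializerDemo
{
    readonly Dictionary<int, string> _propertyCache = new Dictionary<int, string>();

    public string Deserialize()
    {
        _propertyCache.Clear();          // keyboarder's fix: reset state on entry
        _propertyCache.Add(1, "root");   // stand-in for caching a reference target
        return _propertyCache[1];
    }
}
```

Comment out the Clear() and the second call to Deserialize() fails with the same "An item with the same key has already been added." ArgumentException on the Add.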

SemCharlie wrote Jun 19, 2014 at 10:15 PM

That makes sense - I didn't delve deeply enough to understand the use cases, so I'll go with the Clear().

Thanks Keyboarder! :)