XmlSerializer and 0x don’t match

03/11/2009 16:13

I was happily using the XmlSerializer to serialize and deserialize some XML documents at work when I came across a rather annoying “feature”. The deserializer is unable to deserialize integers that are specified in a hexadecimal format with a 0x prefix.

The easy solution would be to just use decimal numbers, since the XML files are generated anyway. From a human point of view, however, hexadecimal values are preferable because they are more readable (specific ranges are used, etc.).

My very simple solution is to hide the actual properties from the serializer and add some extra properties that handle the conversion.

    [Serializable]
    public struct SDODescription
    {
        private ushort m_index;

        [XmlIgnore]
        public ushort Index
        {
            get { return m_index; }
            set { m_index = value; }
        }

        [XmlAttribute("index")]
        public string IndexString
        {
            get { return String.Format("0x{0:X}", Index); }
            set { Index = Convert.ToUInt16(value, 16); }
        }
    }
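To make the round trip concrete, here is a minimal sketch (the struct is repeated in condensed form so the snippet compiles on its own; the value 0x1018 is just an illustrative index):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

public struct SDODescription
{
    [XmlIgnore]
    public ushort Index { get; set; }

    [XmlAttribute("index")]
    public string IndexString
    {
        get { return String.Format("0x{0:X}", Index); }
        set { Index = Convert.ToUInt16(value, 16); }
    }
}

class Program
{
    static void Main()
    {
        var serializer = new XmlSerializer(typeof(SDODescription));
        var sdo = new SDODescription { Index = 0x1018 };

        // Serialize: the attribute is written as index="0x1018".
        var writer = new StringWriter();
        serializer.Serialize(writer, sdo);
        Console.WriteLine(writer.ToString());

        // Deserialize the same document; the 0x prefix no longer trips it up,
        // because the conversion happens in the IndexString setter.
        var reader = new StringReader(writer.ToString());
        var roundTripped = (SDODescription)serializer.Deserialize(reader);
        Console.WriteLine(roundTripped.Index); // 4120, i.e. 0x1018
    }
}
```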

At the moment I see 2 downsides to this solution:

  • Extra members are exposed that are (and should be) used only by the serializer.
  • The format is either hexadecimal or decimal, but not both. The setter could check whether a 0x prefix is present, but the getter still has to commit to one output format either way.
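The second point can be softened on the input side. As a sketch, a hypothetical helper (not part of the original code) could accept both spellings, leaving only the getter's output format as a fixed choice:

```csharp
using System;

// Hypothetical helper: accepts both "0x1018" (hex) and "4120" (decimal).
public static class IndexParser
{
    public static ushort Parse(string value)
    {
        return value.StartsWith("0x", StringComparison.OrdinalIgnoreCase)
            ? Convert.ToUInt16(value, 16)  // base 16 also tolerates the 0x prefix
            : Convert.ToUInt16(value, 10);
    }
}
```

The IndexString setter would then call IndexParser.Parse(value) instead of Convert.ToUInt16 directly.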



