JSON/XML data representation considering ADL constraints

Hi all, today we ran into a case when sharing observations (for instance, Blood Pressure) between different systems.

In the archetype, the systolic and diastolic precision is limited to zero, which means the DV_QUANTITY.magnitude is effectively an integer.

Due to the internal data representation and its serialization to JSON or XML, when creating a systolic value of “110” we end up seeing “110.0” in the JSON or XML.

My question is: should precision affect how the RM is serialized to JSON/XML? That is, if precision = 0, should we end up with “110” as the value in the JSON/XML, or is it OK to have “110.0” and precision = 0 in the data ***?

*** that is, the DV_QUANTITY.precision attribute, which is optional.
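
To make this concrete, here is a minimal Python sketch of why the trailing “.0” appears (the field names are simplified, not a full RM serialisation):

```python
import json

# Simplified, hypothetical DV_QUANTITY instance; "magnitude" is held as a float.
systolic = {"magnitude": 110.0, "units": "mm[Hg]", "precision": 0}

# Most JSON libraries emit the trailing ".0" for a float, regardless of
# the precision attribute:
print(json.dumps(systolic))
# {"magnitude": 110.0, "units": "mm[Hg]", "precision": 0}

# Only if the in-memory value is an integer does the "110" form come out:
systolic["magnitude"] = 110
print(json.dumps(systolic))
# {"magnitude": 110, "units": "mm[Hg]", "precision": 0}
```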

Also, what should happen if DV_QUANTITY.precision is not in the data and we only have the magnitude “110.0”, but looking at the archetype we see the constraint doesn’t allow decimal values? Is that data valid?

Technically, it’s a real number with an integral value in this case. I don’t think it should matter if it is serialised as 110 or 110.0, since the natural promotion of Integer → Real should take place somewhere along the line, and the right in-memory structure will result.
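
For example, in Python both textual forms land on the same in-memory value after that promotion:

```python
import json

# "110" and "110.0" parse to the same Real value:
assert float("110") == float("110.0") == 110.0

# So a reader that promotes Integer -> Real on parse builds the same
# in-memory structure from either wire form:
a = json.loads('{"magnitude": 110}')["magnitude"]
b = json.loads('{"magnitude": 110.0}')["magnitude"]
assert a == b
```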

1 Like

My first impression is that we should just let the serialization do its job in its own natural way, and avoid interference and extra openEHR requirements.
It is, after all, just data that lives outside the system; ‘importing’ it should however validate it. Otherwise we’ll end up like the FHIR case, which requires a float like 10.000 in JSON to be sent as-is (with trailing zeros), which is just not possible (in the libraries that I used) and does not make any sense (in my opinion). But I’m also curious about how other people see this issue…
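
For illustration, Python’s standard json module normalises the literal 10.000 on parse, so the trailing zeros cannot be reproduced on output:

```python
import json

# The parser turns the literal "10.000" into a plain float...
value = json.loads("10.000")
print(value)              # 10.0

# ...so re-serialising cannot restore the original trailing zeros:
print(json.dumps(value))  # 10.0
```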

2 Likes

I can’t see any issues from a clinical safety PoV with “110.0” and precision = 0 in the serialisation, as long as it is resolved in-memory.

1 Like

Thank you all for the clarification.

@ian.mcnicoll since DV_QUANTITY.precision is optional, sometimes we don’t have that precision = 0 in the RM instance, but the constraint is in the corresponding archetype.

Another related clarification, in a different piece of the architecture: I guess for persistence we also depend on the representation of DV_QUANTITY.magnitude at the database level being a float/real, so it might also store the data with a “.0”, even if the precision is zero in the archetype and there is no DV_QUANTITY.precision = 0 stored alongside the magnitude.

Another case I can think of is when the data contains “123.5” and the archetype has precision = 0 for that value. Does that value violate the constraint?

thank you all!

Thanks Pablo,

Makes sense. I can’t think of any reason why 123.0 with a precision of 0 would be a problem, though I’d expect it to be without the .0 in memory.

OTOH, “123.5” with precision = 0 is, or should be, a validation error, as I understand it.
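
A minimal sketch of what that validation could look like, assuming the archetype constraint has already been resolved to an integer precision (validate_magnitude is a hypothetical helper, not an openEHR API):

```python
def validate_magnitude(magnitude: float, precision: int) -> bool:
    """True if `magnitude` carries no digits beyond `precision` decimal places."""
    return round(magnitude, precision) == magnitude

assert validate_magnitude(110.0, 0)      # fine: integral value
assert not validate_magnitude(123.5, 0)  # violates precision = 0
```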

1 Like

Since the data type used for DV_QUANTITY.magnitude is a float/real in serialization formats, databases and code, I doubt the in-memory representation can actually be an integer. I guess the only control point is when the data is displayed in a GUI; for the rest, I doubt it can be avoided.

My concern, besides data validation, is that using a float representation at many levels could lead to misinterpretation of the data at the program level: a computed “111.0” might not be exactly “111” in memory or in the DB, so when comparing them we will see a difference (internally, the computed value might be 110.9999…).

And that could lead to risks if programmers don’t take that into account.
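
The classic illustration of that trap, in Python:

```python
import math

# Arithmetic on floats does not always produce the literal you expect,
# so exact equality comparisons can fail:
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# Comparing with a tolerance avoids the trap:
print(math.isclose(0.1 + 0.2, 0.3))  # True
```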

Computer data representation, meh…

3 Likes