Hello, I read in the DDS spec the statement: "Some elements in the returned collection may not have valid data. If the instance_state in the SampleInfo is NOT_ALIVE_DISPOSED or NOT_ALIVE_NO_WRITERS, then the last sample for that instance in the collection, that is, the one whose SampleInfo has sample_rank==0, does not contain valid data."
Now imagine the sequence Write / Dispose occurs and the user asks to read only one sample (with states ANY, ANY, ANY). Does the specification mean the implementation is supposed to return NO_DATA, since the previous constraint cannot be met?
Indeed, if the implementation returns one sample, how can one distinguish a dispose from a write (both leave the instance NOT_ALIVE, and one of the samples contains invalid data, but there is no way to know which)?
The DDS specification is a little ambiguous on this subject. From the quote you mentioned (taken from section 2.1.2.5.3.8) you might conclude that a dispose always results in a dummy sample (i.e. a sample with no data) being inserted, possibly replacing an older valid sample in case of history=KEEP_LAST with depth=1.
However, section 2.1.2.5.1.4 of the same DDS spec states the following:
"The actual set of scenarios under which the middleware returns DataSamples containing no Data is implementation dependent. The application can distinguish whether a particular DataSample has data by examining the value of the valid_data flag. If this flag is set to TRUE, then the DataSample contains valid Data. If the flag is set to FALSE, the DataSample contains no Data."
So here it is stated that the implementer is free to choose under which circumstances to insert dummy samples.
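Because vendors may differ here, portable application code should always consult the valid_data flag before touching any non-key fields. The pattern can be sketched as follows; note that Sample and SampleInfo below are hypothetical stand-ins (their field names follow the spec's SampleInfo, but this is not a real DDS binding):

```python
from dataclasses import dataclass

# Hypothetical stand-ins for a DDS language binding: the field names mirror
# the spec's SampleInfo (valid_data, instance_state), but this is NOT a real
# vendor API -- just enough structure to show the portable check.
ALIVE = "ALIVE"
NOT_ALIVE_DISPOSED = "NOT_ALIVE_DISPOSED"

@dataclass
class SampleInfo:
    valid_data: bool
    instance_state: str

@dataclass
class Sample:
    key: int        # key fields are meaningful even for dummy samples
    payload: str    # non-key field: only meaningful when valid_data is True

def process(samples, infos):
    """Describe each (sample, info) pair, touching non-key fields only
    when valid_data says they may be interpreted."""
    events = []
    for sample, info in zip(samples, infos):
        if info.valid_data:
            events.append(f"data for key {sample.key}: {sample.payload}")
        else:
            # Only the key fields and the states may be used here.
            events.append(f"key {sample.key} became {info.instance_state}")
    return events
```

For example, feeding it one valid sample and one dummy sample for the same key produces one data event and one state-change event, regardless of which scenarios the middleware chose for inserting dummies.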
OpenSplice DDS only inserts a dummy sample when an instance state change occurs for an instance whose samples have already been taken from the DataReader. In your scenario (a write followed by a dispose) there are two possible outcomes:

* The sample has already been taken by the DataReader before the dispose arrives.
* The sample is still available when the dispose arrives.
In the first case, a dummy sample will be inserted whose view_state, sample_state, and instance_state are set to NOT_NEW, NOT_READ, and NOT_ALIVE_DISPOSED respectively. You can see that it is a dummy sample because the corresponding SampleInfo has its valid_data field set to FALSE.
In the second case the previous sample remains available, but its instance_state is set to NOT_ALIVE_DISPOSED. You can see that it is a normal sample (not a dummy) because the corresponding SampleInfo has its valid_data field set to TRUE.
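The two outcomes can be illustrated with a toy model of a single instance with a KEEP_LAST depth-1 history. This is a simplified simulation of the behaviour described above, not OpenSplice code; all names are invented:

```python
from dataclasses import dataclass, field
from typing import List, Optional

ALIVE = "ALIVE"
NOT_ALIVE_DISPOSED = "NOT_ALIVE_DISPOSED"

@dataclass
class HistSample:
    payload: Optional[str]   # None for dummy samples
    valid_data: bool
    instance_state: str

@dataclass
class ToyInstance:
    """One keyed instance with a KEEP_LAST depth-1 history (toy model)."""
    history: List[HistSample] = field(default_factory=list)

    def write(self, payload: str) -> None:
        # Depth 1: a new write replaces whatever is in the history.
        self.history = [HistSample(payload, True, ALIVE)]

    def take(self) -> List[HistSample]:
        taken, self.history = self.history, []
        return taken

    def dispose(self) -> None:
        if self.history:
            # Outcome 2: a sample is still present -> keep its data and
            # only flip the instance state.
            self.history[-1].instance_state = NOT_ALIVE_DISPOSED
        else:
            # Outcome 1: the history was already taken -> insert a dummy
            # sample so the state change can still be observed.
            self.history = [HistSample(None, False, NOT_ALIVE_DISPOSED)]
```

Write / take / dispose then yields a dummy sample (valid_data FALSE), while write / dispose / take yields the original data with its instance_state changed to NOT_ALIVE_DISPOSED.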
Other vendors may make different decisions here, but the valid_data flag will always tell you whether you are allowed to interpret the non-key fields of a sample. Hope that answers your question.
To complete the statements above: even when only dummy samples are available in a DataReader, this is still considered data being available (a sample whose valid_data is FALSE still counts as data in that case).
That means that the DATA_AVAILABLE status may become TRUE because of such samples, and that when you read or take them you will get a return code of RETCODE_OK. RETCODE_NO_DATA will only be returned when the read or take operation returns a sequence with 0 elements.
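In other words, the return code depends only on whether the returned sequence is empty, not on the valid_data flags inside it. A minimal sketch with an invented toy_read function (not a vendor API):

```python
RETCODE_OK = "RETCODE_OK"
RETCODE_NO_DATA = "RETCODE_NO_DATA"

def toy_read(history):
    """Return (retcode, samples). Dummy samples (valid_data False) still
    count as data, so only an empty history yields RETCODE_NO_DATA."""
    if not history:
        return RETCODE_NO_DATA, []
    return RETCODE_OK, list(history)

# A history holding only a dummy sample, modelled as (valid_data, payload):
dummy_only = [(False, None)]
```

Calling toy_read(dummy_only) returns RETCODE_OK even though every sample in it is a dummy; only toy_read([]) returns RETCODE_NO_DATA.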