The difference in how multiple pathologists answer the same question on the same slide is called interobserver variability. It is often studied because it is important to establish how reliably certain features can be scored - for example, whether pathologists agree that a case is high or low grade.
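As an illustration of how such variability can be quantified (this is general statistics, not a description of Slide Score's internals), a common measure of agreement between two raters is Cohen's kappa, which corrects the raw agreement rate for agreement expected by chance. A minimal sketch:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters scoring the same cases."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of cases where both raters gave the same label
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement by chance, from each rater's label frequencies
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(count_a[label] / n * count_b[label] / n
              for label in set(ratings_a) | set(ratings_b))
    return (p_o - p_e) / (1 - p_e)

# Two pathologists grading six cases as high or low grade
kappa = cohens_kappa(
    ["high", "high", "low", "low", "high", "low"],
    ["high", "low",  "low", "low", "high", "low"],
)
```

A kappa of 1 means perfect agreement, 0 means agreement no better than chance; with more than two raters, generalizations such as Fleiss' kappa are typically used instead.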
Slide Score gives you an overview of how pathologists scored each slide, and pathologists can receive personal feedback on how they compare to the majority opinion.
Slide Score was initially created to support a large interobserver variability study in breast cancer (manuscript in review).
It has some unique features to support this type of research:
- Participants can receive an email with a link that logs them into the system and lets them continue where they left off
- Each pathologist sees the slides they have to score in a unique order, to minimize scoring biases
- A study can be configured so that changing submitted scores, or even going back to an already scored slide, is impossible
- Slides can have their original names and slide labels hidden
- The available functionality can be limited so that even pathologists with minimal digital pathology experience can easily navigate the system and finish scoring their whole set
- After the study concludes, pathologists can receive personalized feedback on how much they agreed with their colleagues and where they disagreed
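How Slide Score computes this feedback internally is not documented here; as a hypothetical sketch, per-pathologist agreement with the majority opinion could be derived from the collected scores like this (the data layout and function name are assumptions for illustration):

```python
from collections import Counter

def majority_agreement(scores):
    """scores: {slide_id: {pathologist: label}}.
    Returns each pathologist's fraction of slides where their score
    matched the majority opinion; slides with a tied vote are skipped."""
    agreed, counted = Counter(), Counter()
    for by_rater in scores.values():
        tally = Counter(by_rater.values())
        top_label, top_votes = tally.most_common(1)[0]
        # Skip slides with no clear majority (two or more labels tied for first)
        if list(tally.values()).count(top_votes) > 1:
            continue
        for rater, label in by_rater.items():
            counted[rater] += 1
            agreed[rater] += (label == top_label)
    return {rater: agreed[rater] / counted[rater] for rater in counted}

# Three pathologists grading three slides
feedback = majority_agreement({
    "slide1": {"A": "high", "B": "high", "C": "low"},
    "slide2": {"A": "low",  "B": "low",  "C": "low"},
    "slide3": {"A": "high", "B": "low",  "C": "low"},
})
```

In this toy example pathologist B matches the majority on every slide, while A and C each disagree once; reporting the specific slides where a pathologist diverged would follow the same per-slide comparison.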