We present an interactive detection model to improve the cell annotation workflow for diffuse gastric cancer.
The model combines image and user inputs and is trained to detect three cell types in diffuse gastric cancer histology.
We measure multi-class cell detection performance with the per-class F1 score and show that it increases with the number of user input clicks.
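For reference, the per-class F1 score follows its standard definition: for each cell class $c$, it is the harmonic mean of the class-specific detection precision and recall (the notation below is illustrative, not the paper's own):
\[
\mathrm{F1}_c = \frac{2\,P_c R_c}{P_c + R_c},
\qquad
P_c = \frac{\mathrm{TP}_c}{\mathrm{TP}_c + \mathrm{FP}_c},
\qquad
R_c = \frac{\mathrm{TP}_c}{\mathrm{TP}_c + \mathrm{FN}_c}.
\]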
Moreover, we show that the proposed interactive annotation approach substantially reduces the number of user actions needed for complete image annotation, achieving a 17\% reduction in the multi-class case.
Future work will implement an iterative approach that filters out recurring false positives to further improve performance.