
Figure 2:

Tokenization with knowledge distillation: the CNN decoder first takes the class token and the information tokens as input to generate segmentations that match the ground truth. The decoder then takes the distillation token with the information tokens as input for segmentation, aiming to match the output of the teacher models.
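The caption describes two decoder passes sharing the same information tokens: one conditioned on the class token and supervised by the ground truth, and one conditioned on the distillation token and supervised by the teacher output. The sketch below illustrates that idea only; the module names, token fusion by addition, shapes, and loss choices are assumptions for illustration, not the authors' implementation.

```python
# Minimal PyTorch sketch of the two decoder passes (illustrative assumptions throughout).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TokenCNNDecoder(nn.Module):
    """Toy CNN decoder: reshapes tokens into a feature map and upsamples to a segmentation map."""

    def __init__(self, embed_dim=256, grid_size=16, num_classes=2):
        super().__init__()
        self.grid_size = grid_size
        self.decode = nn.Sequential(
            nn.Conv2d(embed_dim, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(64, num_classes, kernel_size=1),
        )

    def forward(self, special_token, info_tokens):
        # special_token: (B, 1, C) class or distillation token
        # info_tokens:   (B, N, C) information (patch) tokens, N = grid_size**2
        # Fuse the special token into the information tokens by broadcast addition
        # (an assumption; the paper may combine them differently).
        tokens = info_tokens + special_token
        B, N, C = tokens.shape
        fmap = tokens.transpose(1, 2).reshape(B, C, self.grid_size, self.grid_size)
        return self.decode(fmap)  # (B, num_classes, H, W) segmentation logits


def distillation_losses(decoder, cls_tok, dist_tok, info_toks, ground_truth, teacher_logits):
    """Pass 1: class token output is trained against the ground truth.
    Pass 2: distillation token output is trained to match the teacher models' output."""
    student_gt = decoder(cls_tok, info_toks)
    loss_gt = F.cross_entropy(student_gt, ground_truth)

    student_kd = decoder(dist_tok, info_toks)
    loss_kd = F.kl_div(
        F.log_softmax(student_kd, dim=1),
        F.softmax(teacher_logits, dim=1),
        reduction="batchmean",
    )
    return loss_gt, loss_kd
```

A training loop would typically sum the two losses (possibly with a weighting factor) and backpropagate through the shared decoder and tokens.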