Presenter

Qiankai Wang, MASc candidate in Systems Design Engineering

Abstract

Virtual staining has emerged as a promising alternative to traditional histological staining techniques, offering reagent-free, non-destructive generation of diagnostic-quality images from label-free modalities. However, two critical challenges hinder its broader adoption: the lack of domain-specific evaluation metrics and the inefficiency of large-scale whole slide image (WSI) preprocessing. Conventional image quality metrics such as SSIM and LPIPS fail to capture the diagnostic relevance of histological structures, while most WSI slicing tools are not optimized for high-throughput virtual staining pipelines.

This thesis addresses both challenges. First, we develop a multi-threaded WSI slicing framework tailored for OME-TIFF images, enabling scalable and efficient patch extraction with tile-aware indexing, thread-safe file I/O, and optional in-memory caching. Our method achieves a 6–10× speedup over traditional serial approaches while maintaining minimal memory overhead.
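The parallel patch-extraction idea above can be sketched roughly as follows. This is an illustrative toy, not the thesis implementation: it slices a synthetic in-memory array rather than a real OME-TIFF, and the patch size, worker count, and grid layout are assumptions. Because each worker reads a disjoint region, no locking is needed here; a real OME-TIFF reader would additionally need thread-safe file I/O and tile-aware indexing.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def slice_wsi(image, patch_size=256, workers=4):
    """Extract non-overlapping patches from a WSI-like array in parallel."""
    h, w = image.shape[:2]
    # Index of every full patch's top-left corner (tile-aware in spirit).
    coords = [(y, x)
              for y in range(0, h - patch_size + 1, patch_size)
              for x in range(0, w - patch_size + 1, patch_size)]

    def extract(yx):
        y, x = yx
        return yx, image[y:y + patch_size, x:x + patch_size].copy()

    # Threads suffice here: the per-patch work is memory-bound copying,
    # and disjoint reads need no synchronization.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(extract, coords))

# Toy 1024x1024 "slide": yields a 4x4 grid of 256x256 patches.
wsi = np.arange(1024 * 1024, dtype=np.uint32).reshape(1024, 1024)
patches = slice_wsi(wsi)
print(len(patches))  # 16
```

In a production pipeline the `image[y:y+p, x:x+p]` read would be replaced by a tiled OME-TIFF read, which is where the tile-aware index pays off: each patch touches only the tiles it overlaps.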

Second, we propose PaPIS (Pathological Perceptual Image Similarity), a full-reference, pathology-aware image quality metric. PaPIS leverages deep features extracted from a pretrained cell morphology segmentation model and incorporates Retinex-based feature decomposition to evaluate structural and perceptual fidelity from a diagnostic perspective. Experimental results show that PaPIS correlates more strongly with histological quality than traditional metrics do.
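The exact PaPIS formulation and its pretrained feature extractor are not given in this abstract, so the sketch below illustrates only the general pattern: a Retinex-style decomposition into illumination and reflectance, followed by a full-reference distance on the decomposed component. The single-scale box-filter illumination estimate and the reflectance-MSE score are stand-in assumptions; the actual metric compares deep features from a cell morphology segmentation model.

```python
import numpy as np

def box_blur(img, k=31):
    """Separable box filter (k must be odd) as a cheap illumination estimate."""
    kernel = np.ones(k) / k
    pad = k // 2
    blur_1d = lambda v: np.convolve(np.pad(v, pad, mode="edge"), kernel, mode="valid")
    out = np.apply_along_axis(blur_1d, 1, img)   # blur rows
    return np.apply_along_axis(blur_1d, 0, out)  # then columns

def retinex_decompose(img, k=31):
    """Single-scale Retinex: split an image into illumination and reflectance."""
    img = img.astype(np.float64) + 1e-6          # avoid log(0)
    illumination = box_blur(img, k)
    reflectance = np.log(img) - np.log(illumination)
    return illumination, reflectance

def feature_distance(ref, test, k=31):
    """Toy full-reference score: MSE between reflectance maps (0 = identical)."""
    _, r_ref = retinex_decompose(ref, k)
    _, r_test = retinex_decompose(test, k)
    return float(np.mean((r_ref - r_test) ** 2))
```

Comparing reflectance rather than raw intensity makes the score less sensitive to stain-intensity shifts, which is one plausible motivation for the Retinex step in a pathology-aware metric.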

Finally, we integrate PaPIS as a perceptual loss in a modified CycleGAN model for virtual staining, demonstrating improved visual realism and pathology alignment in both patch-wise and whole-slide outputs. Together, our contributions provide a robust foundation for scalable, pathology-aware virtual staining pipelines.
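Integrating a perceptual metric into CycleGAN training typically means adding a weighted term to the generator objective. The combiner below is a minimal sketch under that assumption; the weight values and argument names are illustrative, not taken from the thesis.

```python
def generator_loss(adv, cycle, perceptual, lam_cyc=10.0, lam_p=1.0):
    """Combined CycleGAN-style generator objective with a perceptual term.

    adv:        adversarial loss from the discriminator
    cycle:      cycle-consistency reconstruction loss
    perceptual: PaPIS-style distance between the virtually stained output
                and the reference stain
    lam_cyc / lam_p are illustrative weights, not values from the thesis.
    """
    return adv + lam_cyc * cycle + lam_p * perceptual
```

During training the perceptual term is computed per batch on generator outputs, so the metric must be differentiable end to end when used as a loss.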

Attending this seminar will count towards the graduate student seminar attendance milestone!

Join on Teams