My most recent publications are shown here; a full list is also available.
@article{Akbar2019,
title = {Automated and Manual Quantification of Tumour Cellularity in Digital Slides for Tumour Burden Assessment},
author = {Shazia Akbar and Mohammad Peikari and Sherine Salama and Azadeh Yazdan Panah and Sharon Nofech-Mozes and Anne L. Martel},
url = {http://www.nature.com/articles/s41598-019-50568-4},
year = {2019},
date = {2019-01-01},
journal = {Scientific Reports},
abstract = {The residual cancer burden index is an important quantitative measure used for assessing treatment response following neoadjuvant therapy for breast cancer. It has been shown to be predictive of overall survival and is composed of two key metrics: qualitative assessment of lymph nodes and the percentage of invasive or in-situ tumour cellularity (TC) in the tumour bed (TB). Currently, TC is assessed by eye-balling routine histopathology slides, estimating the proportion of tumour cells within the TB. With advances in the production of digitized slides and the increasing availability of slide scanners in pathology laboratories, there is potential to measure TC using automated algorithms with greater precision and accuracy. We describe two methods for automated TC scoring: 1) a traditional approach to image analysis development whereby we mimic the pathologists' workflow, and 2) a recent development in artificial intelligence in which features are learned automatically in deep neural networks using image data alone. We show strong agreement between automated and manual analysis of digital slides. Agreement between our trained deep neural networks and experts in this study (0.82) approaches the inter-rater agreement between pathologists (0.89). We also reveal properties that are captured when we apply deep neural networks to whole-slide images, and discuss the potential of using such visualisations to improve upon TC assessment in the future.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
@article{Akbar2018b,
title = {The transition module: A method for preventing overfitting in convolutional neural networks},
author = {Shazia Akbar and Mohammad Peikari and Sherine Salama and Sharon Nofech-Mozes and Anne L. Martel},
url = {https://www.tandfonline.com/doi/abs/10.1080/21681163.2018.1427148},
year = {2019},
date = {2019-01-01},
journal = {Computer Methods in Biomechanics and Biomedical Engineering: Imaging \& Visualization},
volume = {7},
abstract = {Digital pathology has advanced substantially over the last decade with the adoption of slide scanners in pathology labs. The use of digital slides to analyse diseases at the microscopic level is both cost-effective and efficient. Identifying complex tumour patterns in digital slides is a challenging problem but holds significant importance for tumour burden assessment, grading and many other pathological assessments in cancer research. The use of convolutional neural networks (CNNs) to analyse such complex images has been well adopted in digital pathology. However, in recent years, the architecture of CNNs has evolved with the introduction of inception modules, which have shown great promise for classification tasks. In this paper, we propose a modified `transition' module which encourages generalisation in a deep learning framework with few training samples. In the transition module, filters of varying sizes are used to encourage class-specific filters at multiple spatial resolutions, followed by global average pooling. We demonstrate the performance of the transition module in AlexNet and ZFNet for classifying breast tumours in two independent datasets of scanned histology sections; the inclusion of the transition module in these CNNs improved performance.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
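As an illustration for readers, the "filters of varying sizes followed by global average pooling" idea in the transition module can be sketched with plain NumPy: each filter size produces a feature map at a different spatial resolution, and global average pooling collapses each map to one value per filter. This is a minimal sketch with assumed shapes, random stand-in filters and filter sizes; it is not the authors' implementation.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive valid-mode 2-D convolution (illustrative only)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def transition_module(image, kernel_sizes=(3, 5, 7)):
    """Apply one filter per kernel size, then global average pooling.

    Returns one pooled value per filter, concatenated into a feature
    vector, mimicking the 'varying filter sizes + global average
    pooling' structure described in the abstract.
    """
    rng = np.random.default_rng(0)
    features = []
    for k in kernel_sizes:
        kernel = rng.standard_normal((k, k))  # stand-in for a learned filter
        fmap = conv2d_valid(image, kernel)    # resolution depends on k
        features.append(fmap.mean())          # global average pooling
    return np.array(features)

image = np.ones((32, 32))
vec = transition_module(image)
print(vec.shape)  # one pooled value per filter size
```

In a trained network the per-filter pooled values would feed a classifier directly, which is what removes the large fully connected layers that tend to overfit with few training samples.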
@conference{Akbar2018a,
title = {Cluster-based learning from weakly labeled bags in digital pathology},
author = {Shazia Akbar and Anne L. Martel},
url = {https://arxiv.org/abs/1812.00884},
year = {2018},
date = {2018-01-01},
booktitle = {Machine Learning for Health Workshop, NeurIPS 2018},
abstract = {To alleviate the burden of gathering detailed expert annotations when training deep neural networks, we propose a weakly supervised learning approach to recognize metastases in microscopic images of breast lymph nodes. We describe an alternative training loss which clusters weakly labeled bags in latent space to inform relevance of patch-instances during training of a convolutional neural network. We evaluate our method on the Camelyon dataset which contains high-resolution digital slides of breast lymph nodes, where labels are provided at the image-level and only subsets of patches are made available during training.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
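The cluster-based weighting idea above can be sketched as follows: bag-level embeddings are clustered in latent space, and each patch-instance's contribution is weighted by its proximity to its bag's cluster centre. Everything here (mean-pooled bag embeddings, plain k-means, the exponential weighting) is an illustrative assumption, not the authors' exact training loss.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Tiny k-means for the sketch (no library dependency)."""
    rng = np.random.default_rng(seed)
    centres = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(points[:, None] - centres[None], axis=2), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centres[c] = points[labels == c].mean(axis=0)
    return centres, labels

def instance_weights(bags, k=2):
    """Weight each instance by proximity to its bag's cluster centre.

    bags: list of (n_i, d) arrays of patch features.
    Returns one (n_i,) weight array per bag, each summing to 1.
    """
    embeddings = np.stack([b.mean(axis=0) for b in bags])  # bag-level latent
    centres, labels = kmeans(embeddings, k)
    weights = []
    for bag, lab in zip(bags, labels):
        d = np.linalg.norm(bag - centres[lab], axis=1)
        w = np.exp(-d)                  # closer instances count more
        weights.append(w / w.sum())
    return weights

rng = np.random.default_rng(1)
bags = [rng.standard_normal((5, 8)) + m for m in (0.0, 0.0, 3.0)]
ws = instance_weights(bags)
print([w.shape for w in ws])
```

In a training loop these weights would scale each patch's loss term, so that patches deemed irrelevant to their bag's weak label contribute less to the gradient.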
@conference{Akbar2018c,
title = {Determining tumor cellularity in digital slides using ResNet},
author = {Shazia Akbar and Mohammad Peikari and Sherine Salama and Sharon Nofech-Mozes and Anne L. Martel},
url = {https://doi.org/10.1117/12.2292813},
year = {2018},
date = {2018-01-01},
booktitle = {SPIE Medical Imaging},
abstract = {The residual cancer burden index is a powerful prognostic factor which is used to measure neoadjuvant therapy response in invasive breast cancers. Tumor cellularity is one component of the residual cancer burden index and is currently measured manually through eyeballing. As such it is subject to inter- and intra-observer variability and is currently restricted to discrete values. We propose a method for automatically determining tumor cellularity in digital slides using deep learning techniques. We train a series of ResNet architectures to output both discrete and continuous values and compare our outcomes with scores acquired manually by an expert pathologist. Our configurations were validated on a dataset of image patches extracted from digital slides, each containing various degrees of tumor cellularity. Results showed that, in the case of discrete values, our models were able to distinguish between regions-of-interest containing tumor and healthy cells with a test accuracy of over 97%. Overall, we achieved 76% accuracy over four predefined tumor cellularity classes (no tumor; low, medium and high tumor cellularity). When computing tumor cellularity scores on a continuous scale, ResNet showed good correlations with manually-identified scores, showing potential for computing reproducible scores consistent with expert opinion using deep learning techniques.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
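To make the discrete-versus-continuous distinction above concrete, a continuous cellularity score in [0, 1] can be binned into the four classes the abstract mentions (no tumor; low, medium and high). The thresholds below are illustrative assumptions; the paper does not state them here.

```python
def cellularity_class(score: float) -> str:
    """Bin a continuous tumor-cellularity score into four classes.

    Thresholds (0.33, 0.66) are illustrative assumptions only.
    """
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must lie in [0, 1]")
    if score == 0.0:
        return "no tumor"
    if score <= 0.33:
        return "low"
    if score <= 0.66:
        return "medium"
    return "high"

print([cellularity_class(s) for s in (0.0, 0.2, 0.5, 0.9)])
# → ['no tumor', 'low', 'medium', 'high']
```

A regression model predicting the continuous score can thus be evaluated against either target: correlation with expert scores directly, or classification accuracy after binning.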
@workshop{Akbar2017a,
title = {Transitioning between convolutional and fully connected layers in neural networks},
author = {Shazia Akbar and Mohammad Peikari and Sherine Salama and Sharon Nofech-Mozes and Anne L. Martel},
url = {https://arxiv.org/abs/1707.05743},
year = {2017},
date = {2017-01-01},
booktitle = {International Workshop on Deep Learning in Medical Image Analysis, MICCAI 2017},
abstract = {Digital pathology has advanced substantially over the last decade; however, tumor localization continues to be a challenging problem due to highly complex patterns and textures in the underlying tissue bed. The use of convolutional neural networks (CNNs) to analyze such complex images has been well adopted in digital pathology. However, in recent years, the architecture of CNNs has evolved with the introduction of inception modules, which have shown great promise for classification tasks. In this paper, we propose a modified "transition" module which learns global average pooling layers from filters of varying sizes to encourage class-specific filters at multiple spatial resolutions. We demonstrate the performance of the transition module in AlexNet and ZFNet for classifying breast tumors in two independent datasets of scanned histology sections; in both, the transition module improved classification performance.},
keywords = {},
pubstate = {published},
tppubtype = {workshop}
}