Automatic Cell Counting With YOLOv5: A Fluorescence Microscopy Approach
Summary
This paper is not about microplastics; it studies automatic cell counting using the YOLOv5 deep learning model applied to fluorescence microscopy images, achieving high accuracy for laboratory cell detection.
Counting cells in a Neubauer chamber on microbiological culture plates is a laborious task that depends on technical experience. As a result, efforts have been made to advance computer vision-based approaches, increasing efficiency and reliability through quantitative analysis of microorganisms and calculation of their characteristics, biomass concentration, and biological activity. However, the variability that persists in these processes remains a challenge. In this work, we propose a solution that adopts a YOLOv5 network model for automatic cell recognition and counting, in a case study of laboratory cell detection using images from a CytoSMART Exact FL microscope. For this purpose, a dataset of 21 expert-labeled cell images was created, along with an additional Sperm DetectionV dataset of 1024 images for transfer learning. The model was trained on this dataset starting from YOLOv5 weights pretrained on the Sperm DetectionV database. A laboratory test was also performed to confirm the viability of the results. Compared to YOLOv4, the YOLOv5 model achieved accuracy, precision, recall, and F1 scores of 92%, 84%, 91%, and 87%, respectively. The YOLOv5 algorithm was also used for cell counting and compared against an existing segmentation-based U-Net and OpenCV implementation. In conclusion, the proposed model successfully recognizes and counts the different types of cells present in the laboratory.
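The abstract describes the counting pipeline only in prose; a minimal sketch of the same idea, using the public ultralytics/yolov5 torch.hub API and tallying detections as the per-image cell count, might look as follows. The weights file `best.pt` and the confidence threshold are illustrative assumptions, not artifacts published with the paper.

```python
# Minimal sketch of YOLOv5-based cell counting, assuming weights fine-tuned
# as described in the paper (pretrained on Sperm DetectionV, then on the
# 21-image expert-labeled cell dataset). 'best.pt' and the 0.25 confidence
# threshold are illustrative assumptions, not values from the paper.
import torch

# Load a custom-trained YOLOv5 model via the public torch.hub entry point.
model = torch.hub.load("ultralytics/yolov5", "custom", path="best.pt")
model.conf = 0.25  # assumed detection confidence threshold

def count_cells(image_path: str) -> int:
    """Run detection on one microscope image and return the cell count."""
    results = model(image_path)      # inference on a single image
    detections = results.xyxy[0]     # tensor rows: [x1, y1, x2, y2, conf, cls]
    return int(detections.shape[0])  # one detection box per counted cell

if __name__ == "__main__":
    print(count_cells("cytosmart_frame.png"))
```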
More Papers Like This
Label-free identification of microplastics in human cells: dark-field microscopy and deep learning study
Researchers developed a label-free method to identify microplastics inside living human cells using enhanced dark-field microscopy combined with deep learning, achieving high classification accuracy for polystyrene microparticles differing only in pigmentation.
A Handy Open-Source Application Based on Computer Vision and Machine Learning Algorithms to Count and Classify Microplastics
An open-source computer vision application was developed to automatically count and classify microplastics in microscopy images, achieving accuracy comparable to manual counting while processing samples orders of magnitude faster. It offers the scientific community a free tool to reduce the bottleneck of tedious visual microplastic enumeration.
Improved detection and counting performance of microplastics in common carp whole blood by an attention-guided deep learning method
Researchers developed an attention-guided deep learning method called Attention-YOLO to improve automated detection and counting of microplastic polystyrene particles in common carp whole blood samples imaged by bright-field microscopy. The system incorporated a channel attention mechanism into the feature extraction network and was trained on a custom dataset of particles in various colors, improving detection accuracy over standard YOLO approaches for high-throughput toxicity studies.
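The summary names a channel attention mechanism but not its exact form; a squeeze-and-excitation (SE) style block is a common realization of channel attention and can serve as a hedged PyTorch sketch. The reduction ratio and the placement in the backbone are assumptions, not details from the Attention-YOLO paper.

```python
# Illustrative squeeze-and-excitation (SE) channel attention block in
# PyTorch. The paper's exact "Attention-YOLO" module is not given in the
# summary; this is a generic channel attention sketch, with the reduction
# ratio (16) chosen as an assumption.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: global context per channel
        self.fc = nn.Sequential(             # excitation: per-channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # reweight feature channels by learned importance

# Usage: wrap a backbone feature map before it enters the detection head.
feat = torch.randn(2, 256, 40, 40)
print(ChannelAttention(256)(feat).shape)  # torch.Size([2, 256, 40, 40])
```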
Efficient Microplastic Detection in Water Using ResNet50 and Fluorescence Imaging
Researchers applied a ResNet50 deep learning model to fluorescence microscopy images of water samples, achieving high-accuracy classification of microplastics and demonstrating that deep learning can efficiently automate microplastic identification from microscopy data.
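The summary gives no implementation details; a conventional torchvision transfer-learning setup for ResNet50, with a hypothetical two-class head (microplastic vs. background) and assumed hyperparameters, would look roughly like this.

```python
# Sketch of ResNet50 transfer learning for fluorescence-image
# classification with torchvision. The two-class setup and the
# hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 2)  # assumed: 2 output classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed LR
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of 224x224 RGB images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```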
Development of Microplastics Detector and Quantifier Utilizing Deep Learning Based Algorithm
Researchers developed a microplastics detector and quantifier using deep learning-based image analysis, training a neural network to identify and count microplastic particles in microscopic images. The system achieved high accuracy and offers a faster, more objective alternative to manual counting.