Original Articles
Ahmed R, Nova TT, Jarin TF, Hossain MM, Shams K, Rashid MRA (2025) Real-time monitoring of oyster mushroom cultivation using CCTV and attention-enhanced ShuffleNet-based explainable AI techniques. Smart Agric Technol 101571. https://doi.org/10.1016/j.atech.2025.101571
Carion N, Gustafson L, Hu YT, Debnath S, Hu R, Suris D, Ryali C, Alwala KV, Khedr H, Huang A, Lei J, Ma T, Guo B, Kalla A, Marks M, Greer J, Wang M, Sun P, Rädle R, Afouras T, Mavroudi E, et al. (2025) SAM 3: Segment Anything with Concepts. arXiv preprint arXiv:2511.16719. https://doi.org/10.48550/arXiv.2511.16719
Charisis C, Nuwayhid S, Argyropoulos D (2025) A novel Mask R-CNN-based tracking pipeline for oyster mushroom cluster growth monitoring in time-lapse image datasets. Comput Electron Agric 237:110590. https://doi.org/10.1016/j.compag.2025.110590
Choi YW, Kim NE, Paudel B, Kim HT (2022) Strawberry pests and diseases detection technique optimized for symptoms using deep learning algorithm. J Bio-Env Con 31:255-263. (in Korean) https://doi.org/10.12791/KSBEC.2022.31.3.255
Gao P, Geng S, Zhang R, Ma T, Fang R, Zhang Y, Li H, Qiao Y (2024) CLIP-Adapter: better vision-language models with feature adapters. Int J Comput Vis 132:581-595. https://doi.org/10.1007/s11263-023-01891-x
Ghazal S, Munir A, Qureshi WS (2024) Computer vision in smart agriculture and precision farming: Techniques and applications. Artif Intell Agric 13:64-83. https://doi.org/10.1016/j.aiia.2024.06.004
Guragain DP, Shrestha B, Bajracharya I, Sharma N, Ghimire D, Bhat SA (2025) OysterMushNet: A deep learning-based mobile application for real-time mushroom disease detection and agronomist support. Smart Agric Technol 101735. https://doi.org/10.1016/j.atech.2025.101735
10.1016/j.atech.2025.101735He H (2023) Segment_anything_annotator. GitHub repository. https://github.com/haochenheheda/segment-anything-annotator
He K, Gkioxari G, Dollár P, Girshick R (2017) Mask R-CNN. Proc IEEE Int Conf Comput Vis 2961-2969. https://doi.org/10.1109/ICCV.2017.322
Ji W, Li J, Bi Q, Liu T, Li W, Cheng L (2024) Segment Anything Is Not Always Perfect: An Investigation of SAM on Different Real-world Applications. Mach Intell Res 21:1071-1090. https://doi.org/10.1007/s11633-023-1385-0
Kamilaris A, Prenafeta-Boldú FX (2018) Deep learning in agriculture: A survey. Comput Electron Agric 147:70-90. https://doi.org/10.1016/j.compag.2018.02.016
Kirillov A, Mintun E, Ravi N, Mao H, Rolland C, Gustafson L, Xiao T, Whitehead S, Berg AC, Lo WY, Dollár P, Girshick R (2023) Segment Anything. Proc IEEE/CVF Int Conf Comput Vis 4015-4026. https://doi.org/10.1109/ICCV51070.2023.00371
Koirala A, Walsh KB, Wang Z, McCarthy C (2019) Deep learning - Method overview and review of use for fruit detection and yield estimation. Comput Electron Agric 162:219-234. https://doi.org/10.1016/j.compag.2019.04.017
10.1016/j.compag.2019.04.017Korea Institute of Science and Technology Information (KISTI) (2019) Development of optimal growth model for precision cultivation of oyster mushroom and a standardization of oyster mushroom house. National Research Report. TRKO201900015947.
Lee CJ, Park HS, Lee EJ, Kong WS, Yu BK (2019) Analysis of growth environment by smart farm cultivation of oyster mushroom ‘Chunchu No 2’. J Mushrooms 17:119-125. https://doi.org/10.14480/JM.2019.17.3.119
Lee SH, Yu BK, Kim HJ, Yun NK, Jung JC (2015) Technology for improving the uniformity of the environment in the oyster mushroom cultivation house by using multi-layered shelves. J Bio-Env Con 24:128-133. (in Korean) https://doi.org/10.12791/KSBEC.2015.24.2.128
Li B, Weinberger KQ, Belongie S, Koltun V, Ranftl R (2022) Language-driven Semantic Segmentation. arXiv preprint arXiv:2201.03546. https://doi.org/10.48550/arXiv.2201.03546
Li Y, Wang D, Yuan C, Li H, Hu J (2023) Enhancing Agricultural Image Segmentation with an Agricultural Segment Anything Model Adapter. Sensors 23:7884. https://doi.org/10.3390/s23187884
Li Z, Guo R, Li M, Chen Y, Li G (2020) A review of computer vision technologies for plant phenotyping. Comput Electron Agric 176:105672. https://doi.org/10.1016/j.compag.2020.105672
Lin TY, Maire M, Belongie S, Hays J, Perona P, Ramanan D, Dollar P, Zitnick CL (2014) Microsoft COCO: Common objects in context. Proc Eur Conf Comput Vis pp 740-755. https://doi.org/10.1007/978-3-319-10602-1_48
Liu S, Zeng Z, Ren T, Li F, Zhang H, Yang J, Zhang L (2024) Grounding DINO: Marrying DINO with Grounded Pre-training for Open-Set Object Detection. Eur Conf Comput Vis 38-55. https://doi.org/10.1007/978-3-031-72970-6_3
Ma X, Wu Q, Zhao X, Zhang X, Pun MO, Huang B (2024) SAM-Assisted remote sensing imagery semantic segmentation with object and boundary constraints. IEEE Trans Geosci Remote Sens 62:1-16. https://doi.org/10.1109/TGRS.2024.3443420
Mazurowski MA, Dong H, Gu H, Yang J, Konz N, Zhang Y (2023) Segment anything model for medical image analysis: An experimental study. Med Image Anal 89:102918. https://doi.org/10.1016/j.media.2023.102918
Mumuni F, Mumuni A (2024) Segment Anything Model for automated image data annotation: empirical studies using text prompts from Grounding DINO. arXiv preprint arXiv:2406.19057. https://doi.org/10.48550/arXiv.2406.19057
Radford A, Kim JW, Hallacy C, Ramesh A, Goh G, Agarwal S, Sutskever I (2021) Learning Transferable Visual Models From Natural Language Supervision. Int Conf Mach Learn 8748-8763. https://doi.org/10.48550/arXiv.2103.00020
Rao Y, Zhao W, Chen G, Tang Y, Zhu Z, Huang G, Lu J (2022) DenseCLIP: Language-Guided Dense Prediction with Context-Aware Prompting. Proc IEEE/CVF Conf Comput Vis Pattern Recognit 18082-18091. https://doi.org/10.1109/CVPR52688.2022.01755
Redmon J, Divvala S, Girshick R, Farhadi A (2016) You Only Look Once: Unified, Real-Time Object Detection. Proc IEEE Conf Comput Vis Pattern Recognit 779-788. https://doi.org/10.1109/CVPR.2016.91
Royse DJ, Baars J, Tan Q (2017) Current overview of mushroom production in the world. In: Zied DC, Pardo-Giménez A (eds) Edible and medicinal mushrooms: Technology and applications. Wiley, West Sussex, UK, pp 5-13. https://doi.org/10.1002/9781119149446.ch2
Russell BC, Torralba A, Murphy KP, Freeman WT (2008) LabelMe: a database and web-based tool for image annotation. Int J Comput Vis 77:157-173. https://doi.org/10.1007/s11263-007-0090-8
Sánchez C (2010) Cultivation of Pleurotus ostreatus and other edible mushrooms. Appl Microbiol Biotechnol 85:1321-1337. https://doi.org/10.1007/s00253-009-2343-7
Silva C, Costa D, Costa J, Ribeiro B (2024) Data annotation quality in smart farming industry. Prod Manuf Res 12:2377253. https://doi.org/10.1080/21693277.2024.2377253
Singh R, Bidese R, Dhakal K, Sornapudi S (2025) Few-Shot Adaptation of Grounding DINO for Agricultural Domain. Proc Comput Vis Pattern Recognit Conf 5332-5342. https://doi.org/10.1109/CVPRW67362.2025.00530
Taupa GJ, Villarica MV, Vinluan AA (2024) Enhancing oyster mushroom cultivation with solar-powered IoT and machine learning: Predicting harvest readiness. J Inf Syst Eng Manag 10:5551. https://doi.org/10.52783/jisem.v10i33s.5551
Wosner O, Farjon G, Bar-Hillel A (2021) Object detection in agricultural contexts: A multiple resolution benchmark and comparison to human. Comput Electron Agric 189:106404. https://doi.org/10.1016/j.compag.2021.106404
Yin S, Xi Y, Zhang X, Sun C, Mao Q (2025) Foundation Models in Agriculture: A Comprehensive Review. Agriculture 15:847. https://doi.org/10.3390/agriculture15080847
Yoon S, Shin M, Kim JH, Jeong HJ, Park J, Ahn TI (2024) Computer Vision Approach for Phenotypic Characterization of Horticultural Crops. J Bio-Env Con 33:63-70. https://doi.org/10.12791/KSBEC.2024.33.1.063
Zhong Y, Yang J, Zhang P, Li C, Codella N, Li LH, Zhou L, Dai X, Yuan L, Li Y, Gao J (2022) RegionCLIP: Region-based language-image pretraining. Proc IEEE/CVF Conf Comput Vis Pattern Recognit 16793-16803. https://doi.org/10.1109/CVPR52688.2022.01629
- Publisher: The Korean Society for Bio-Environment Control
- Publisher (Korean): (사)한국생물환경조절학회
- Journal Title: Journal of Bio-Environment Control
- Journal Title (Korean): 생물환경조절학회지
- Volume: 35
- No: 1
- Pages: 26-39
- Received Date: 2026-01-04
- Revised Date: 2026-01-14
- Accepted Date: 2026-01-22
- DOI: https://doi.org/10.12791/KSBEC.2026.35.1.026

