Article Data

  • Views 1080
  • Downloads 181

Original Research

Open Access

Neural Network Detection and Segmentation of Mental Foramen in Panoramic Imaging

  • Lazar Kats1,*,
  • Marilena Vered2,
  • Sigalit Blumer3,
  • Eytan Kats4

1Department of Oral Pathology, Oral Medicine and Maxillofacial Imaging, School of Dental Medicine, Tel Aviv University, Tel Aviv, Israel

2Institute of Pathology, The Chaim Sheba Medical Center, Tel Hashomer, Ramat Gan, Israel

3Department of Pediatric Dentistry, School of Dental Medicine, Tel Aviv University, Tel Aviv, Israel

4Freelance algorithm developer, Haifa, Israel

DOI: 10.17796/1053-4625-44.3.6 Vol. 44, Issue 3, May 2020, pp. 168-173

Published: 01 May 2020

*Corresponding Author(s): Lazar Kats E-mail: lazarkat@tauex.tau.ac.il

Abstract

Objective: To apply deep learning to a small dataset of panoramic images for the detection and segmentation of the mental foramen (MF). Study design: We used an in-house dataset created within the School of Dental Medicine, Tel Aviv University. The dataset contained 112 randomly chosen, anonymized digital panoramic X-ray images with corresponding segmentations of the MF. To solve the segmentation task we used a single fully convolutional neural network based on U-Net, as well as a cascade architecture. 70% of the data were randomly chosen for training, 15% for validation, and accuracy was tested on the remaining 15%. The model was trained on an NVIDIA GeForce GTX 1080 GPU. SPSS software, version 17.0 (Chicago, IL, USA) was used for the statistical analysis. The study was approved by the ethics committee of Tel Aviv University. Results: The best results of the Dice similarity coefficient (DSC), precision, recall, MF-wise true positive rate (MFTPR) and MF-wise false positive rate (MFFPR) in single networks were 49.51%, 71.13%, 68.24%, 87.81% and 14.08%, respectively. The cascade of networks showed better results than single networks in recall and MFTPR (88.83% and 93.75%, respectively), while DSC and precision achieved the lowest values (31.77% and 23.92%, respectively). Conclusions: U-Net, one of the most widely used neural network architectures for biomedical applications, was effectively applied in this study. Methods based on deep learning are extremely important for automatic detection and segmentation in radiology and require further development.
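The pixel-wise evaluation metrics reported in the abstract (DSC, precision, recall) can be computed directly from binary segmentation masks. The sketch below is an illustrative implementation under common definitions, not the authors' code; the function name, array layout, and the small epsilon guard are assumptions:

```python
import numpy as np

def segmentation_metrics(pred, truth, eps=1e-8):
    """Compute Dice similarity coefficient (DSC), precision, and recall
    for binary segmentation masks (1 = mental foramen, 0 = background)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.logical_and(pred, truth).sum()    # correctly predicted MF pixels
    fp = np.logical_and(pred, ~truth).sum()   # background predicted as MF
    fn = np.logical_and(~pred, truth).sum()   # MF pixels that were missed
    dsc = 2 * tp / (2 * tp + fp + fn + eps)   # DSC = 2*TP / (2*TP + FP + FN)
    precision = tp / (tp + fp + eps)
    recall = tp / (tp + fn + eps)
    return dsc, precision, recall

# Toy example: 2x3 masks overlapping in two pixels
pred = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 0, 0], [0, 1, 1]])
dsc, precision, recall = segmentation_metrics(pred, truth)
```

Note that DSC is the harmonic mean of precision and recall, which is why a cascade tuned for high recall (as in the results above) can still score a low DSC when precision drops.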


Keywords

Neural network; Mental foramen; Panoramic imaging; Detection; Segmentation

Cite and Share

Lazar Kats, Marilena Vered, Sigalit Blumer, Eytan Kats. Neural Network Detection and Segmentation of Mental Foramen in Panoramic Imaging. Journal of Clinical Pediatric Dentistry. 2020;44(3):168-173.

References

1. Laher AE, Wells M, Motara F, Kramer E, Moolla M, Mahomed Z. Finding the mental foramen. Surg Radiol Anat 38: 469-76, 2016.

2. Smith MH, Lung KE. Nerve injuries after dental injection: a review of the literature. J Can Dent Assoc 72: 559-564, 2006.

3. Greenstein G, Tarnow D. The mental foramen and nerve: clinical and anatomical factors related to dental implant placement: a literature review. J Periodontol 77: 1933-43, 2006.

4. Chong BS, Gohil K, Pawar R, Makdissi J. Anatomical relationship between mental foramen, mandibular teeth and risk of nerve injury with endodontic treatment. Clin Oral Investig 21: 381-7, 2017.

5. Laher AE, Wells M, Motara F, Kramer E, Moolla M, Mahomed Z. Finding the mental foramen. Surg Radiol Anat 38: 469-76, 2016.

6. Iwanaga J, Watanabe K, Saga T, Tabira Y, Kitashima S, Kusukawa J, Yamaki K. Accessory mental foramina and nerves: application to periodontal, periapical, and implant surgery. Clin Anat 29: 493–501, 2016.

7. Borghesi A, Pezzotti S, Nocivelli G, Maroldi R. Five mental foramina in the same mandible: CBCT findings of an unusual anatomical variant. Surg Radiol Anat 40: 635-40, 2018.

8. Rahpeyma A, Khajehahmadi S. Accessory Mental Foramen and Maxillofacial Surgery. J Craniofac Surg 29: 216-7, 2018.

9. Muinelo-Lorenzo J, Suárez-Quintanilla JA, Fernández-Alonso A, Varela-Mallou J, Suárez-Cunqueiro MM. Anatomical characteristics and visibility of mental foramen and accessory mental foramen: Panoramic radiography vs. cone beam CT. Med Oral Patol Oral Cir Bucal 20: 707-14, 2015.

10. Moro A, Abe S, Yokomizo N, Kobayashi Y, Ono T, Takeda T. Topographical distribution of neurovascular canals and foramens in the mandible: avoiding complications resulting from their injury during oral surgical procedures. Heliyon 4: 00812, 2018.

11. Charron O, Lallement A, Jarnet D, Noblet V, Clavier JB, Meyer P. Automatic detection and segmentation of brain metastases on multimodal MR images with a deep convolutional neural network. Comput Biol Med 95: 43–54, 2018.

12. van Ginneken B. Fifty years of computer analysis in chest imaging: rulebased, machine learning, deep learning. Radiol Phys Technol 10: 23-32, 2017.

13. Kooi T, Litjens G, van Ginneken B, Gubern-Mérida A, Sánchez CI, Mann R, den Heeten A, Karssemeijer N. Large scale deep learning for computer aided detection of mammographic lesions. Med Image Anal 35: 303-12, 2017.

14. Zhang K, Wu J, Chen H, Lyu P. An effective teeth recognition method using label tree with cascade network structure. Comput Med Imaging Graph 68: 61-70, 2018.

15. Miki Y, Muramatsu C, Hayashi T, Zhou X, Hara T, Katsumata A, et al. Classification of teeth in cone-beam CT using deep convolutional neural network. Comput Biol Med 80: 24–9, 2017.

16. Chen H, Zhang K, Lyu P, Li H, Zhang L, Wu J, Lee CH. A deep learning approach to automatic teeth detection and numbering based on object detection in dental periapical films. Sci Rep doi: 10.1038/s41598-019-40414-y, 2019.

17. Tuzoff DV, Tuzova LN, Bornstein MM, Krasnov AS, Kharchenko MA, Nikolenko SI, Sveshnikov MM, Bednenko GB. Tooth detection and numbering in panoramic radiographs using convolutional neural networks. Dentomaxillofac Radiol doi:10.1259/dmfr.20180051, 2019.

18. Lee JH, Kim DH, Jeong SN, Choi SH. Detection and diagnosis of dental caries using a deep learning-based convolutional neural network algorithm. J Dent 77: 106-11, 2018.

19. Lee JH, Kim DH, Jeong SN, Choi SH. Diagnosis and prediction of periodontally compromised teeth using a deep learning-based convolutional neural network algorithm. J Periodontal Implant Sci 48: 114–23, 2018.

20. Krois J, Ekert T, Meinhold L, Golla T, Kharbot B, Wittemeier A, Dörfer C, Schwendicke F. Deep Learning for the Radiographic Detection of Periodontal Bone Loss. Sci Rep doi: 10.1038/s41598-019-44839-3, 2019.

21. Lee JS, Adhikari S, Liu L, Jeong HG, Kim H, Yoon SJ. Osteoporosis detection in panoramic radiographs using a deep convolutional neural network-based computer-assisted diagnosis system: a preliminary study. Dentomaxillofac Radiol 13: 20170344, 2018.

22. Kats L, Vered M, Zlotogorski-Hurvitz A, Harpaz I. Atherosclerotic carotid plaque on panoramic radiographs: neural network detection. Int J Comput Dent 22: 163-9, 2019.

23. Kim DW, Kim H, Nam W, Kim HJ, Cha IH. Machine learning to predict the occurrence of bisphosphonate-related osteonecrosis of the jaw associated with dental extraction: A preliminary report. Bone 116: 207-14, 2018.

24. Murata M, Ariji Y, Ohashi Y, Kawai T, Fukuda M, Funakoshi T, Kise Y, Nozawa M, Katsumata A, Fujita H, Ariji E. Deep-learning classification using convolutional neural network for evaluation of maxillary sinusitis on panoramic radiography. Oral Radiol 35: 301-7, 2019.

25. Ribera NT, de Dumast P, Yatabe M, Ruellas A, Ioshida M, Paniagua B, Styner M, Gonçalves JR, Bianchi J, Cevidanes L, Prieto JC. Shape variation analyzer: a classifier for temporomandibular joint damaged by osteoarthritis. Proc SPIE Int Soc Opt Eng doi: 10.1117/12.2506018, 2019.

26. de Dumast P, Mirabel C, Cevidanes L, Ruellas A, Yatabe M, Ioshida M, Ribera NT, Michoud L, Gomes L, Huang C, Zhu H, Muniz L, Shoukri B, Paniagua B, Styner M, Pieper S, Budin F, Vimort J-B, Pascal L, et al. A web-based system for neural network based classification in temporomandibular joint osteoarthritis. Comput Med Imaging Graph 67:45–54, 2018.

27. Ekert T, Krois J, Meinhold L, Elhennawy K, Emara R, Golla T, Schwendicke F. Deep Learning for the Radiographic Detection of Apical Lesions. J Endod 45: 917-22, 2019.

28. Ariji Y, Yanashita Y, Kutsuna S, Muramatsu C, Fukuda M, Kise Y, Nozawa M, Kuwada C, Fujita H, Katsumata A, Ariji E. Automatic detection and classification of radiolucent lesions in the mandible on panoramic radiographs using a deep learning object detection technique. Oral Surg Oral Med Oral Pathol Oral Radiol 128: 424-30, 2019.

29. Poedjiastoeti W, Suebnukarn S. Application of Convolutional Neural Network in the Diagnosis of Jaw Tumors. Health Inform Res 24: 236-41, 2018.

30. Ariji Y, Sugita Y, Nagao T, Nakayama A, Fukuda M, Kise Y, Nozawa M, Nishiyama M, Katumata A, Ariji E. CT evaluation of extranodal extension of cervical lymph node metastases in patients with oral squamous cell carcinoma using deep learning classification. Oral Radiol doi: 10.1007/s11282-019-00391-4, 2019.

31. Ronneberger O, Fischer P, Brox T. (2015) U-Net: Convolutional Networks for Biomedical Image Segmentation. In: Navab N., Hornegger J., Wells W., Frangi A. (eds) Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015. Lecture Notes in Computer Science, vol 9351. Springer, Cham. MICCAI 2015.

32. Brosch T, Tang LY, Youngjin Yoo, Li DK, Traboulsee A, Tam R. Deep 3D Convolutional Encoder Networks With Shortcuts for Multiscale Feature Integration Applied to Multiple Sclerosis Lesion Segmentation. IEEE Trans Med Imaging 35: 1229-39, 2016.

33. Nair V, Hinton GE. Rectified linear units improve restricted Boltzmann machines. In: Proceedings of the 27th International Conference on Machine Learning (ICML'10); 2010 June 21-24; Haifa, Israel. USA: Omnipress; p. 807-14, 2010.

34. Dice L. Measures of the amount of ecologic association between species. Ecology 26: 297-302, 1945.

35. Tversky A. Features of similarity. Psychol Rev 84: 327-52, 1977.

36. Ronneberger O, Fischer P, Brox T. Dental X-ray Image Segmentation using a U-shaped Deep Convolutional Network. https://arxiv.org/abs/1505.04597, 2015.

37. Salehi SS, Erdogmus D, Gholipour A. Tversky loss function for image segmentation using 3D fully convolutional deep networks. https://arxiv.org/abs/1706.05721, 2017.

38. Hwang JJ, Jung YH, Cho BH, Heo MS. An overview of deep learning in the field of dentistry. Imaging Sci Dent 49: 1-7, 2019.


