Graduation Research (Final Version)
BY
Zahraa Salih Hadi
Abbas Hamza Yasir
Khayrat Ahmed Zaghir
Ruqayya Ahmed Abdul-Amir
Sura Nazar
Supervised By
Dr. Ghassan Majid Jasim
2023-2024
﴿ والراسخون في العلم يقولون آمنا به كلٌّ من عند ربنا ۗ وما يذكر إلا أولو الألباب ﴾
"But those firm in knowledge say: we believe in it; all of it is from our Lord. And none will take heed except those of understanding." (Qur'an 3:7)
Signature :
Name: Ghassan Majid Jasim (Supervisor)
Date: /04/2024
DEDICATION
To my mother, the spirit of compassion and warmth, who shone like the morning sun in my life, with her tenderness and sacrifice making every moment a unique experience.
And my father, the rock of steadfastness and generosity, inspired me with the
courage and determination to continue my path despite the harshness of life.
You have all my respect and appreciation, for you are the compass by which I direct my steps towards success and excellence.
ACKNOWLEDGEMENT
First of all, praise be to God Almighty, and thanks to Him for His showers of blessings throughout our research work, which allowed us to complete this research successfully.
We would like to express our deep and sincere gratitude to Dr. Ghassan Majid Al-Jubouri, University of Al-Qadisiyah, College of Dentistry, for giving us the opportunity to conduct this research and for providing invaluable guidance throughout. We have been inspired by his dynamism, vision, sincerity and motivation. He taught us the methodology of conducting research and of presenting the research work as clearly as possible. It was a great honor and a great privilege to work and study under his supervision, and we are very grateful for what he gave us. We would also like to thank him for his friendship, sympathy and sense of humor. We are deeply grateful to our parents for their love, prayers, care, and sacrifice in educating us and preparing us for our future. Special thanks go to our friends for the keen interest they have shown in the successful completion of this research.
List of Contents
Subject I
Holy Quran II
Dedication IV
Acknowledgement V
List of Contents VI
List of Figures IX
List of Tables XI
List of Abbreviations XI
Abstract XII
CHAPTER ONE
Introduction
1. Introduction 2
CHAPTER TWO
Literature Review
2. Literature Review 9
2.1 Classification Of AI 10
2.1.1 ANN 12
2.1.2 CNN 13
2.2 AI application in health care 14
1. Image Analysis 15
3. Patient Interaction 18
2.3.1 AI in Orthodontics 19
1. Cephalometric Analysis 20
Caries Detection 24
2.3.3 AI In Endodontic 26
2. Root Morphology 29
4. Retreatment Predictions 32
5. Pulpal Diagnosis 32
7. Microrobots 36
8. Other uses of AI in Endodontics 37
2. In Implant Dentistry 41
3. In Robotic Surgery 43
2.3.5 AI In Prosthodontics 46
1. Application Of AI In Prosthodontics 47
CHAPTER THREE
Discussion
Discussion 56
CHAPTER FOUR
References 60
LIST OF FIGURES
3-2  AI and data-driven applications are situated all along the patient's dental journey.  14
4-2  The figure shows an example of analyzed OPG and the outcomes given by the software.  15
9-2  Example of how AI Technology can Illuminate a Black, White and Grey Radiograph with Quantified, Precision Outputs for Doctor Communication and Patient Education  31
LIST OF ABBREVIATIONS
AI Artificial Intelligence
DL Deep Learning
ML Machine Learning
NN Neural Network
Abstract
Artificial Intelligence (AI) has emerged as a transformative force in dentistry, revolutionizing various aspects of patient care, diagnostics, and treatment planning. The integration of AI technologies in dentistry offers efficient solutions, ranging from image analysis for radiographic interpretation to personalized treatment recommendations.
to this test is the Turing Test [6]. It proceeds along the following lines: a human evaluator attempts to differentiate between natural-language communications from a human test taker and from a machine. The scenario assumes that the evaluator is aware that the communication is taking place between a human and a machine, and that the evaluator, the human test taker, and the machine are physically isolated from one another [7]. The interaction between the human test taker and the machine is restricted to written text, specifically keyboard input, rather than spoken language. The purpose of the test is to assess only the machine's logical question-answering capabilities, excluding any evaluation of its speech-interpretation abilities [8]. If the human evaluator is unable to differentiate between the human test taker and the machine, the machine may be said to have passed the Turing Test and is referred to as possessing "machine intelligence" [9].
Between 1957 and 1974, the field of AI experienced rapid expansion due to the increasing availability and computational capabilities of computers, as well as advancements in AI algorithms. One example is ELIZA, a computer program capable of carrying on natural-language conversations and responding to problems through text [12].
error rate of 2.25%, which is lower than the human top-5 error rate of 5.1% [18].
Over the past two decades, there has been a growing interest in the application of digital data processing technology in the domains of medicine and dentistry [22].
There is dental management software available on the market that uses artificial intelligence (AI) to collect and store patient data. At present, artificial intelligence can be employed to create comprehensive virtual databases that are readily accessible [25].
AI can be incorporated into dental imaging systems to detect even the most minute aberrations that are imperceptible to the human eye. Due to this exceptional capability, it can effortlessly be utilized for precise identification of cephalometric landmarks [28].
CHAPTER TWO
LITERATURE REVIEW
Big data refers to datasets so large or complex that they cannot be handled by normal data-processing software. Big data is key to the power of technologies like large language models (LLMs), whose training data consists of enormous quantities of text.
Deep learning is a type of machine learning approach. Deep learning models are based on neural networks. They consist of a series of layers (which is why they are called "deep") that gradually transform raw data (e.g., pixels) into a more abstract form (e.g., an image of a face), allowing the model to understand it on a higher level.
Machine learning (ML) is a field of research focused on how machines can "learn" (i.e., become capable of more and more advanced tasks). It involves the use of algorithms interacting with training data to make autonomous decisions and predictions.
2.1 Classification of AI
There are many approaches whereby AI can be achieved; different types
of AI can achieve different tasks, and researchers have created different AI
classification methods.
AI is a generic term for all non-human intelligence. As Figure (1-2) shows, AI can be further classified as weak AI and strong AI. Weak AI, also called narrow AI, uses a program trained to solve single or specific tasks. The AI of today is mostly weak AI. Examples include reinforcement learning, e.g., AlphaGo and automated manipulation robots; natural language processing, e.g., Google Translate and the Amazon chat robot; computer vision, e.g., Tesla Autopilot and face recognition; and data mining, e.g., market customer analysis and personalised content recommendation on social media [29].
Figure (1-2), Schematic diagram of the relationship between AI, strong AI,
weak AI, expert-based systems, machine learning, deep learning and neural
network (NN). [30]
Strong AI, also known as Artificial General Intelligence (AGI), encompasses AI systems that possess the same level of intelligence and capabilities as humans. These systems have their own consciousness, awareness, and behavior, which are as adaptable and flexible as those of humans [30]. The objective of strong AI is to develop a versatile algorithm capable of making judgments across several domains. Research on strong artificial intelligence must exercise extreme caution due to the ethical concerns and the inherent risks it poses. Accordingly, there have been no significant applications of strong artificial intelligence thus far.
ML and expert systems are distinct subsets of weak AI. Table (1-2) demonstrates that machine learning (ML) can be categorized into supervised, semi-supervised, and unsupervised learning approaches, according to the underlying theory. Supervised learning involves the use of labeled datasets to train algorithms, with these labeled datasets serving as the "supervisor" for the algorithm. The algorithm acquires knowledge from the labeled input and analyzes and recognizes the shared characteristics of the labeled input in order to make predictions about unlabeled input. Supervised learning encompasses several algorithms such as k-nearest neighbors, logistic regression, random forest, and support-vector machine [31]. In contrast, unsupervised learning autonomously identifies the different characteristics of unlabeled data. Semi-supervised learning is a method that falls between supervised and unsupervised learning; it involves using a small set of labeled data together with a larger set of unlabeled data for training [32].
2.1.1 ANN
2.1.2 CNN
CNN is a type of deep learning model mainly used for image recognition and generation. The main difference between an ANN and a CNN is that a CNN contains convolution layers, in addition to the pooling layers and fully connected layers in the hidden layers. Convolution layers generate feature maps of the input data using convolution kernels, which are convolved across the entire input image. Weight sharing by convolution reduces the complexity of the model. Each group of convolution layers is usually followed by a pooling layer, which reduces the dimension of the feature maps for further feature extraction. The fully connected layer is used after the convolution and pooling layers. As the name indicates, it connects to all activated neurons in the previous layer and transforms the 2D feature maps into a 1D feature vector.
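The convolution, pooling, and flattening steps just described can be sketched in plain NumPy. This is a minimal illustration with one hand-written 2x2 kernel on a toy 6x6 image; real CNNs learn many kernels and stack many such layers.

```python
import numpy as np

def conv2d(img, kernel):
    """'Valid' 2D convolution (implemented as cross-correlation, as in most CNN
    libraries): the same kernel weights are shared at every image position."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Reduce feature-map dimensions by keeping the max of each size x size block."""
    h, w = fmap.shape
    trimmed = fmap[:h - h % size, :w - w % size]   # drop rows/cols that do not fit
    return trimmed.reshape(h // size, size, w // size, size).max(axis=(1, 3))

img = np.arange(36, dtype=float).reshape(6, 6)    # toy 6x6 "image"
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])      # one shared 2x2 kernel
fmap = conv2d(img, kernel)                        # 5x5 feature map
pooled = max_pool(fmap)                           # 2x2 map after 2x2 max pooling
flat = pooled.ravel()                             # flatten 2D map to a 1D feature vector
print(fmap.shape, pooled.shape, flat.shape)       # (5, 5) (2, 2) (4,)
```

The flattened `flat` vector is what a fully connected layer would then consume, exactly as described in the paragraph above.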
Figure (3-2), AI and data-driven applications are situated all along the patient's dental journey.
1. Image Analysis
Figure (4-2), The figure shows an example of analyzed OPG and the outcomes given by the software [30].
2. Data synthesis and prediction
The field of clinical dentistry produces substantial volumes of data on a daily basis. In addition to images, a wide range of clinical data, historical data, claims, treatment data, and other diagnostic test results are also accessible. Due to the frequent and regular visits of numerous patients to dentists, there exists a substantial amount of longitudinal data on a wide segment of the population, particularly in many affluent nations. Presently, this data is frequently gathered in separate data repositories, remaining concealed from collective utilization and proving challenging to evaluate collectively when assessing a patient's symptoms or predicting future oral and dental diseases.
Utilizing this data and harnessing it to gain a deeper comprehension of the patient, including their risk profile and requirements, represents another potential benefit of AI. Contemporary methodologies for accessing and consolidating data, such as through dashboards, strive to present practitioners with the available data in a more comprehensive and practical manner. Enabling a comprehensive perspective of each patient, taking into account their unique medical history and potential dangers, is anticipated to enhance both the quality and efficiency of healthcare, hence diminishing the necessity for repetitive and expensive evaluations.
Utilizing data collected through multiple visits and over a span of years, such as images, clinical evaluations, historical records, medication information, and information on systemic conditions, can help to address the current approach of treating all patients in the same way. This will facilitate a shift towards a more individualized, accurate, and proactive approach to healthcare. In addition, there will be a growing utilization of data provided by patients, such as information on their eating habits or dental brushing behavior, which will be collected through apps and mobile devices. Patients will also play a more active role in the care process, not only as data contributors but also through the use of applications and constant support in the "virtual" practice.
Therefore, AI will also facilitate a more inclusive dental care experience. The outcome is referred to as P4 dentistry, which encompasses a more individualized, accurate, proactive, and collaborative approach to dental care (Flores et al., 2013) [30].
Another area of untapped data aggregation is the utilization of implanted
and wearable sensors. The advancing biomedical sensor technology allows for
the utilization of small and nano-sized sensors, which can be employed to
monitor saliva and indirectly assess mouth health or detect diseases.
Furthermore, the utilization of saliva diagnostics and its associated data can also help evaluate various human ailments, due to the strong correlation between saliva and general well-being (Cui et al., 2022) [31].
Figure (5-2) , (A) The classification (mark as yes or no) of the lesion on
the periapical radiograph is schematized. (B) The segmentation (orange stain-
ing) of the lesion on the periapical radiograph is schematized. (C) The detec-
tion (blue box) of the lesion on the periapical radiograph is schematized [32].
3. Interacting with patients
Since the integration of data is a primary objective of data-driven healthcare and artificial intelligence (AI) tools for health, it is probable that there will be an increased opportunity for ongoing engagement with patients. Deviating from the commonly used "one-stop" method in dental care could enable the direct treatment of oral diseases at their source, which is inside individuals' daily lives.
2.3 The function of artificial intelligence in certain dental specializations
2.3.1 Artificial Intelligence in the field of Orthodontics
There has been a substantial rise in the application of artificial intelligence technologies in orthodontics in recent years. Artificial intelligence can be applied to nearly every aspect of the orthodontic workflow. Artificial intelligence (AI) assists in the diagnosis and planning of orthodontic treatments [33]. The data is inputted into the system, where advanced algorithms and AI software are utilized to forecast tooth movements and anticipate the eventual results of the treatment [16]. AI implementation decreases expenses, expedites the process of diagnosing and treating patients, and perhaps removes the necessity for human labor (Figure 6-2). Artificial intelligence (AI) [34] offers "a way to get sharper prediction from data" [35,36] by simultaneously analysing all the different variables present in a malocclusion. This capacity offers the potential to assist the practitioner in obtaining the most favourable outcome when treating a malocclusion [37].
1. Cephalometric Analysis
Cephalometric analysis involves the identification of radiographic landmarks, followed by the measurement of different lengths, angles, and ratios [38]. It is mostly utilized for three purposes [39].
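As a minimal illustration of how an angle is measured once landmarks are identified, the sketch below computes an SNA-style angle (the angle at nasion between sella and A point) from 2D landmark coordinates. The coordinate values are hypothetical, invented only for demonstration.

```python
import math

def angle_at(vertex, p1, p2):
    """Angle in degrees at `vertex`, formed by the rays toward p1 and p2."""
    a1 = math.atan2(p1[1] - vertex[1], p1[0] - vertex[0])
    a2 = math.atan2(p2[1] - vertex[1], p2[0] - vertex[0])
    deg = abs(math.degrees(a1 - a2)) % 360
    return min(deg, 360 - deg)          # always report the smaller angle

# Hypothetical landmark coordinates (pixels) on a lateral cephalogram.
sella   = (100.0, 200.0)
nasion  = (180.0, 200.0)
a_point = (175.0, 120.0)

sna = angle_at(nasion, sella, a_point)  # SNA: angle S-N-A
print(round(sna, 1))                    # ~86.4 degrees
```

Lengths and ratios follow the same pattern: once an AI model has located the landmarks, the downstream measurements are simple geometry.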
Figure (7-2): Cephalometric tracing done using Dolphin Imaging tech-
nology [32].
errors in the decision step and can provide 80 to 90% accuracy when making an orthodontic extraction decision (Jung and Kim, 2016; Xie et al., 2010) [44].
Study | Type of data | Algorithm | Dataset size (training/testing) | Accuracy | Sensitivity | Specificity | AUC | Other performances
Vertical root fracture detection | Panoramic radiography | CNN | 240/60 | – | 0.75 | – | – | Precision: 0.93; F1: 0.83
Apical lesion detection | CBCT images | CNN | 16/4 | – | 0.93 | 0.88 | – | PPV: 0.87; NPV: 0.93
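All of the metrics reported in the table above derive from the four confusion-matrix counts. The sketch below shows the standard formulas; the counts are hypothetical, chosen only so that a precision of 0.93 and a sensitivity of 0.75 reproduce the F1 of 0.83 seen in the first row (they are not the studies' raw data).

```python
def metrics(tp, fp, tn, fn):
    """Standard diagnostic-test metrics from confusion-matrix counts."""
    precision = tp / (tp + fp)               # PPV
    recall = tp / (tp + fn)                  # sensitivity
    return {
        "accuracy":    (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": recall,
        "specificity": tn / (tn + fp),
        "ppv":         precision,
        "npv":         tn / (tn + fn),
        "f1":          2 * precision * recall / (precision + recall),
    }

# Hypothetical confusion matrix: 75 true positives, 6 false positives,
# 94 true negatives, 25 false negatives.
m = metrics(tp=75, fp=6, tn=94, fn=25)
print(round(m["f1"], 2))   # 0.83
```

Note that F1 is fully determined by precision and sensitivity, which is why the pairing of 0.93 and 0.75 with an F1 of 0.83 is internally consistent.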
Caries Detection
Smart microchips enable the assessment of patients' food intake, dietary activity, and oral pH levels, whether in a juvenile, adult, or geriatric case. Consequently, they aid in evaluating the activity of dental decay and recognizing early or incipient stages of tooth decay, and then allow cases to be managed by modifying dietary habits or preventing further tooth decay [54-56].
of the state of their teeth, hence encouraging the adoption of oral hygiene
measures.
Integrating smart chips into tooth decay detection is consistent with the
broader trend of utilizing technology to enhance dental diagnosis, optimize
treatment results, and encourage oral hygiene.
Figure (8-2) , AI-enhanced Image Which Shows Models Applied for
Caries Identification and Quantification (Yellow) and Enamel (Red) [32].
The ability to identify the DEJ allows for representation of carious lesion classification as well as progression or resolution in accordance with evidence-based literature.
2.3.3 AI In Endodontics
AI exhibited high levels of accuracy and precision when it came to detecting, diagnosing, and predicting diseases in the field of endodontics. Artificial intelligence (AI) has the potential to enhance the accuracy and effectiveness of diagnosing and treating endodontic conditions, hence improving the overall success rates of endodontic treatments.
An advanced three-dimensional (3D) imaging technique is widely utilized by around 80% of endodontists to effectively identify and treat root canal diseases.
An attempt has been made to enhance the detection capability of radiographic techniques by transitioning from conventional radiography to digital imaging and digital image enhancement [62].
X-ray and CBCT image analysis assist in the detection of a VRF, a diagnosis that can be difficult to establish. Inadequate diagnosis can lead to unnecessary surgical operations or tooth extraction.
The clinician often faces a diagnostic challenge due to the clinical presentation and the low sensitivity of diagnostic imaging in detecting vertical root fractures [63]. CBCT imaging showed superior efficacy in detecting vertical root fractures (VRFs) in teeth without fillings, whereas radiography exhibited somewhat higher accuracy in identifying VRFs in teeth that had undergone root canal treatment.
2. Root Morphology
The successful outcome of a nonsurgical endodontic treatment relies
heavily on the ability to accurately identify the various root canal systems.
In the past, the diagnosis of this condition was typically made by periapical X-rays and CBCT image analysis [67].
CBCT and electronic apex locators have recently been used as modern tools for detecting the apical foramen [76]. The electronic apex locator, most frequently used in clinics to measure root canal length, was developed over time using multiple techniques. The root canal treatment prognosis can only be guaranteed when the instrumentation ends at the apical constriction of the root [77,78].
The ANN diagnosis method helps to improve the diagnosis and results in a better radiographic determination of working length. Further, in a wide range of clinical circumstances, ANNs are used as a judgement system [79].
Figure (9-2) , Example of how AI Technology can Illuminate a Black, White
and Grey Radiograph with Quantified, Precision Outputs for Doctor Commu-
nication and Patient Education [79].
4. Retreatment Predictions
The system's strength lies in its ability to predict the outcome of retreatment with a high degree of accuracy. The effectiveness of the system was solely dependent on the quality of the data obtained, which posed a constraint.
Case-based reasoning (CBR) is the systematic process of generating solutions to problems based on previous experiences with similar challenges. By retrieving comparable cases, crucial knowledge and information can be assimilated. The issue of variations and the presence of diverse procedures can result in system heterogeneity [81]. In order to attain greater responsiveness, selectivity, and precision, future research should take into account the diversity of human approaches and may need to raise sample numbers [82].
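The retrieval step of case-based reasoning can be sketched as a nearest-case lookup: encode each past case as a feature vector and return the stored case most similar to the new one. The two-entry case base and its feature encoding below are invented purely for illustration.

```python
import math

def retrieve(cases, query, top_n=1):
    """Return the stored case(s) most similar to the query (cosine similarity)."""
    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.hypot(*u) * math.hypot(*v))
    return sorted(cases, key=lambda c: cos(c["features"], query), reverse=True)[:top_n]

# Hypothetical case base: each past retreatment stored as a small feature vector.
case_base = [
    {"id": "case-1", "features": (0.9, 0.1), "outcome": "success"},
    {"id": "case-2", "features": (0.2, 0.8), "outcome": "failure"},
]

best = retrieve(case_base, (0.8, 0.2))[0]   # a new case resembling case-1
print(best["outcome"])                      # success
```

The retrieved case's outcome then informs the prediction for the new case; the quality of such a system depends entirely on the coverage and quality of the stored cases, echoing the constraint noted above.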
5. Pulpal Diagnosis
Figure (11-2), Screenshot of a periapical radiograph that includes an endodontically treated tooth and has been analyzed by their AI algorithms [86].
Figure (12-2), Periapical Radiolucency on an Image without AI Definition
[86].
7. Micro-robots
Figure (14-2) , Endodontic microrobots used to increase the quality
of endodontic treatment and to reduce errors during the procedure
[87].
emphasizing significant methodology, results, and the consequences of these
investigations for the wider field of endodontics.
Enhancing the classification's effectiveness would likely enable the utilization of these models as diagnostic instruments, hence aiding experts in their therapy determinations. Extracting impacted third molars is a procedure frequently performed by maxillofacial surgeons, oral surgeons, and dentists.
with classification or text production algorithms, enabling the automated creation of a radiological report.
Oral cavity carcinomas are the most prevalent tumors in the upper aerodigestive tract, and their incidence rate is on the rise. Early intervention typically leads to few surgical complications, but a delayed diagnosis might result in a lower survival rate and significant functional and cosmetic consequences [99]. Identifying pre-malignant lesions at an early stage helps prevent them from developing into cancer, but this task is challenging for individuals without specialized knowledge. Certain machine learning tools enable identification using autofluorescence measurement [100] or photography [101].
2. In Implant Dentistry
E. Soft Tissue Surgery: AI-powered robotic systems aid in soft tissue surgeries, such as frenectomy or soft tissue grafting. These systems can precisely manipulate delicate soft tissues while minimizing trauma and accelerating healing, leading to improved patient comfort and outcomes.
Figure (16-2), Robotic surgery, in which human body motion and human intelligence are simulated [112].
2.3.5 AI in prosthodontics
theless, progress in the scanning domain has resulted in its application in the production of full dentures and intraoral scans for maxillofacial purposes. Margin identification in fixed prosthodontics was accomplished through the utilization of artificial intelligence after an intraoral scan. CAD/CAM, an abbreviation for "computer-aided design/computer-aided manufacturing", is employed in the fabrication of both permanent and removable dental prostheses [119,120]. This technique utilizes data from numerous real crowns to offer an optimal crown design suitable for various situations. In modern times, various branches of dentistry have employed digital tools to assist patients in achieving the aesthetically pleasing smiles they desire. These tools encompass 3D face tracking and cost-effective virtual 3D data hybrids, such as fragmented cone beam computed tomography (CBCT), intraoral scans, and face
scans. The virtualization of a patient's anatomy is a necessary step in any therapeutic action aimed at altering their smile. The early smile designs were produced by creating basic sheet drawings from two-dimensional printed images of patients [121,122]. The integration of AI in prosthetic dentistry has resulted
in numerous innovative possibilities. This includes the ability to generate occlusal morphology in crown contemplation, even in cases of wear or fracture, as well as programmed teeth setting for dentures and automatic framework designs for removable dental prostheses. Additionally, AI serves as an educational tool, providing guidance to students at various levels of education, from new students to postgraduates. It also offers support to less-experienced undergraduate students in their professional development [123,124].
Figure (18-2), CAD/CAM and AI employed in the fabrication of both permanent and removable dental prostheses [123].
Figure (19-2) , Virtual reality simulation (VRS) technology can be used to
simulate the facial profiles post treatment [124].
Tomita et al. [137] conducted a study that was similar to previous work. They used a standard epoxy model and compared the digitalization of plaster models created from alginate and silicone-based impressions, which are commonly used in dentistry. They used a unique coordinate-system-based approach, and their findings supported those of previous studies. They also found that intraoral scanning processes tended to result in smaller measurements compared to the set reference, indicating negative deviations. In the same study, indirect digitization showed positive deviations.
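The negative and positive deviations described above are simply signed differences between a scanned measurement and the reference model. A minimal sketch, using hypothetical millimetre values invented for illustration:

```python
def deviations(measured, reference):
    """Signed deviations of scanned measurements from a reference:
    negative means the scan under-measured, positive means over-measured."""
    return [m - r for m, r in zip(measured, reference)]

reference = [10.00, 8.50, 12.30]   # hypothetical reference distances (mm)
intraoral = [9.95, 8.46, 12.27]    # direct (intraoral) digitization
indirect  = [10.04, 8.53, 12.33]   # indirect digitization via plaster model

print(all(d < 0 for d in deviations(intraoral, reference)))   # True: negative deviations
print(all(d > 0 for d in deviations(indirect, reference)))    # True: positive deviations
```

Keeping the sign of each deviation (rather than only its magnitude) is what lets such studies distinguish systematic under-measurement from over-measurement.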
The data obtained may vary in quality depending on the shape and surface features of the prosthetic materials found in the neighboring and opposing teeth. In addition, a recent study examined the internal fitting of a prosthesis using an intraoral scanner [143]. However, there is a lack of research in the current literature regarding the impact of the relationship between external fitting, such as proximal and occlusal contact, and the surrounding material (tooth or restoration) from a clinical standpoint. This gap in knowledge leaves us uncertain about the potential effects of material in the immediate vicinity.
The objective of this study was to assess the precision of digital impressions of crowns made from different materials (polymethyl methacrylate, zirconia, gold, and cobalt-chromium alloy) utilizing either an intraoral scanner or a traditional impression acquisition approach.
CHAPTER THREE
DISCUSSION
CHAPTER FOUR
CONCLUSION AND RECOMMENDATIONS
We Recommend:
REFERENCES
the Auspices of the UNESCO. Paris, France: Eolss. (2012).
https://www.eolss.net
9. Weizenbaum J. ELIZA—a computer program for the study of natural
language communication between man and machine. Commun ACM.
(1966) 9(1):36–45. doi: 10.1145/365153.365168
10.Hendler J. Avoiding another AI winter. IEEE Intell Syst. (2008)
23(02):2–4. doi: 10.1109/MIS.2008.20
11. Schmidhuber J. Deep learning. Scholarpedia. (2015) 10(11):32832.
doi: 10.4249/ scholarpedia.32832
12. Liebowitz J. Expert systems: a short introduction. Eng Fract Mech.
(1995) 50(5–6):601–7. doi: 10.1016/0013-7944(94)E0047-K
13. McDermott JP. R1: an expert in the computer systems domain. AAAI Conference on Artificial Intelligence (1980).
14.Krizhevsky A, Sutskever I, Hinton GE. Imagenet classification with
deep convolutional neural networks. Commun ACM. (2017) 60(6):84–
90. doi: 10.1145/3065386
15. Russakovsky O, Deng J, Su H, Krause J, Satheesh S, Ma S, et al. ImageNet large scale visual recognition challenge. Int J Comput Vis. (2015) 115(3):211–52. doi: 10.1007/s11263-015-0816-y
16. Campbell M, Hoane Jr AJ, Hsu F-h. Deep blue. Artif Intell. (2002)
134(1–2):57–83. doi: 10.1016/S0004-3702(01)00129-1
17.Chao X, Kou G, Li T, Peng Y. Jie ke versus AlphaGo: a ranking ap-
proach using decision making method for large-scale data with incom-
plete information. Eur J Oper Res. (2018) 265(1):239–47. doi:
10.1016/j.ejor.2017.07.030
18. OpenAI. ChatGPT: optimizing language models for dialogue. Available at: https://openai.com/blog/chatgpt/ (accessed 7 February 2023).
19.Fang G, Chow MC, Ho JD, He Z, Wang K, Ng T, et al. Soft robotic
manipulator for intraoperative MRI-guided transoral laser micro-
surgery. Sci Robot. (2021) 6(57): eabg5575. doi: 10.1126/scirobotic-
s.abg5575
20. Flowers JC. Strong and weak AI: deweyan considerations. AAAI
Spring symposium: towards conscious AI systems (2019).
21.Hastie T, Tibshirani R, Friedman J. Overview of supervised learning.
In: Hastie T, Tibshirani R, Friedman J, editors. The elements of statisti-
cal learning. New York, NY, USA: Springer (2009). p. 9–41. doi:
10.1007/978-0-387-84858-7_2
22. Ray S. A quick review of machine learning algorithms. International
conference on machine learning, big data, cloud and parallel computing
(COMITCon); 2019 14–16 Feb (2019).
23.Hastie T, Tibshirani R, Friedman J. Unsupervised learning. In: Hastie
T, Tibshirani R, Friedman J, editors The elements of statistical learn-
ing. New York, NY, USA: Springer (2009). p. 485–585.
https://doi.org/10.1007/978-0-387-84858-7_14
24.Zhu X, Goldberg AB. Introduction to semi-supervised learning. Synth
Lect Artif Intell Mach Learn. (2009) 3(1):1–130. doi: 10.1007/978-3-
031-01548-9
25.Zhou Z-H. A brief introduction to weakly supervised learning. Natl Sci
Rev. (2017) 5(1):44–53. doi: 10.1093/nsr/nwx106
26.Agatonovic-Kustrin S, Beresford R. Basic concepts of artificial neural
network (ANN) modeling and its application in pharmaceutical research. J Pharm Biomed Anal. (2000) 22(5):717–27. doi: 10.1016/S0731-7085(99)00272-1
27.LeCun Y, Bengio Y, Hinton G. Deep learning. Nature. (2015)
521(7553):436–44. doi: 10.1038/nature14539
28.Nam CS. Neuroergonomics: Principles and practice. Gewerbestrasse,
Switzerland: Springer Nature (2020). doi: 10.1007/978-3-030-34784-0
29.Kosan E, Krois J, Wingenfeld K, Deuter CE, Gaudin R, Schwendicke
F. 2022. Patients' perspectives on artificial intelligence in dentistry: A
controlled study. J Clin Med. 11(8).
30. Flores M, Glusman G, Brogaard K, Price ND, Hood L. 2013. P4
medicine: How systems medicine will transform the healthcare sector
and society. Personalized medicine. 10(6):565-576.
31. Cui Y, Yang M, Zhu J, Zhang H, Duan Z, Wang S, Liao Z, Liu W.
2022. Developments in diagnostic applications of saliva in human or-
gan diseases. Medicine in Novel Technology and Devices. 13:100115.
32. Duane B, Taylor T, Stahl-Timmins W, Hyland J, Mackie P, Pollard A.
2014. Carbon mitigation, patient choice and cost reduction-triple bot-
tom line optimisation for health care planning. Public Health.
128(10):920-924.
33.Xie X, Wang L, Wang A. Artificial Neural Network Modeling for De-
ciding if Extractions Are Necessary Prior to Orthodontic Treatment [In-
ternet]. Vol. 80, The Angle Orthodontist. 2010. p. 262–6. Available
from: http://dx.doi.org/10.2319/111608-588.1.
34. Mackin N, Sims-Williams JH, Stephens CD. Artificial intelligence in
the dental surgery: an orthodontic expert system, a dental tool of tomorrow. Dent Update [Internet]. 1991 Oct;18(8):341–3. Available from: https://www.ncbi.nlm.nih.gov/pubmed/1810794.
35. Gelfand AE, Hills SE, Racine-Poon A, Smith AF. Illustration of
bayesian inference in normal data models using gibbs sampling. J Am
Stat Assoc 1990;85:972-85.
36. Rohrer B. End-to-end Machine Learning Library; 2020.
37. Croskerry P. The importance of cognitive errors in diagnosis and
strategies to minimize them. Acad Med 2003;78:775-80.
38. Thanathornwong B. Bayesian-based decision support system for as-
sessing the needs for orthodontic treatment. Healthc Inform Res
2018;24:22-8.
39. S. Ö. Arik, B. Ibragimov, and L. Xing, “Fully automated quantitative
cephalometry using convolutional neural networks,” Journal of Medical
Imaging, vol. 4, no. 1, p. 014501, 2017.
40. M. K. Alam and A. A. Alfawzan, “Evaluation of Sella Turcica bridg-
ing and morphology in different types of cleft patients,” Frontiers in
Cell and Developmental Biology, vol. 8, p. 656, 2020.
41. K. J. Dreyer and J. R. Geis, “When machines think: radiology’s next
frontier,” Radiology, vol. 285, no. 3, pp. 713–718, 2017.
42. R. Leonardi, D. Giordano, F. Maiorana, and C. Spampinato, “Automatic cephalometric analysis,” The Angle Orthodontist, vol. 78, no. 1, pp. 145–151, 2008.
43. A. Richardson, “A comparison of traditional and computerized meth-
ods of cephalometric analysis,” The European Journal of Orthodontics,
vol. 3, no. 1, pp. 15–20, 1981.
44. Miller, R., Dijkman, D., Riolo, M., Moyers, R. Graphic computerization of cephalometric data. 1971.
45. Huang Y-P, Lee S-Y. An Effective and Reliable Methodology for Deep Machine Learning Application in Caries Detection. medRxiv (2021).
46. Fukuda M, Inamoto K, Shibata N, Ariji Y, Yanashita Y, Kutsuna S, et
al. Evaluation of an artificial intelligence system for detecting vertical
root fracture on panoramic radiography. Oral Radiol. (2020)
36(4):337–43. doi: 10.1007/s11282-019-00409-x
47. Vadlamani R. Application of machine learning technologies for detec-
tion of proximal lesions in intraoral digital images: in vitro study.
Louisville, Kentucky, USA: University of Louisville (2020). doi:
10.18297/etd/3519
48. Setzer FC, Shi KJ, Zhang Z, Yan H, Yoon H, Mupparapu M, et al. Ar-
tificial intelligence for the computer-aided detection of periapical le-
sions in cone-beam computed tomographic images. J Endod. (2020)
46(7):987–93. doi: 10.1016/j.joen.2020.03.025
49. Jaiswal P, Bhirud S. Study and analysis of an approach towards the
classification of tooth wear in dentistry using machine learning tech-
nique. IEEE International conference on technology, research, and in-
novation for betterment of society (TRIBES) (2021). IEEE.
50. Shetty H, Shetty S, Kakade A, Shetty A, Karobari MI, Pawar AM, et al. Three-dimensional semi-automated volumetric assessment of the pulp space of teeth following regenerative dental procedures. Sci Rep. (2021) 11(1):21914. doi: 10.1038/s41598-021-01489-8
51. Lee J-H, Kim D-H, Jeong S-N, Choi S-H. Detection and diagnosis of
dental caries using a deep learning-based convolutional neural network
algorithm. J Dent. (2018) 77:106–11. doi: 10.1016/j.jdent.2018.07.015
52. Kühnisch J, Meyer O, Hesenius M, Hickel R, Gruhn V. Caries detection on intraoral images using artificial intelligence. J Dent Res. (2021) 101(2). doi: 10.1177/00220345211032524.
53. Schwendicke F, Rossi J, Göstemeyer G, Elhennawy K, Cantu A, Gaudin R, et al. Cost-effectiveness of artificial intelligence for proximal caries detection. J Dent Res. (2021) 100(4):369–76. doi: 10.1177/0022034520972335
54. Duane B, Taylor T, Stahl-Timmins W, Hyland J, Mackie P, Pollard A. 2014. Carbon mitigation, patient choice and cost reduction-triple bottom line optimisation for health care planning. Public Health. 128(10):920-924.
55. Jose J, P. A, Subbaiyan H. Different Treatment Modalities followed by Dental Practitioners for Ellis Class 2 Fracture – A Questionnaire-based Survey [Internet]. Vol. 14, The Open Dentistry Journal. 2020. p. 59–65. Available from: http://dx.doi.org/10.2174/1874210602014010059
56. Ramesh S, Teja K, Priya V. Regulation of matrix metalloproteinase-3 gene expression in inflammation: A molecular study [Internet]. Vol. 21, Journal of Conservative Dentistry. 2018. p. 592. Available from: http://dx.doi.org/10.4103/jcd.jcd_154_18
57. I. Tsesis, E. Rosen, A. Tamse, S. Taschieri, and A. Kfir, “Diagnosis of vertical root fractures in endodontically treated teeth based on clinical and radiographic indices: a systematic review,” Journal of Endodontics, vol. 36, no. 9, pp. 1455–1458, 2010.
58. M. Fukuda, K. Inamoto, N. Shibata et al., “Evaluation of an artificial intelligence system for detecting vertical root fracture on panoramic radiography,” Oral Radiology, vol. 36, no. 4, pp. 337–343, 2020.
59. D. Prithviraj, H. Balla, R. Vashisht, K. Regish, and P. Suresh, “An overview of management of root fractures,” Kathmandu University Medical Journal, vol. 12, no. 3, pp. 222–230, 2015.
60. S. Kositbowornchai, S. Plermkamon, and T. Tangkosol, “Performance of an artificial neural network for vertical root fracture detection: an ex vivo study,” Dental Traumatology, vol. 29, no. 2, pp. 151–155, 2013.
61. S. Talwar, S. Utneja, R. R. Nawal, A. Kaushik, D. Srivastava, and S. S.
Oberoy, “Role of cone-beam computed tomography in diagnosis of
vertical root fractures: a systematic review and meta-analysis,” Journal
of Endodontics, vol. 42, no. 1, pp. 12–24, 2016.
62. N. Boreak, “Effectiveness of artificial intelligence applications de-
signed for endodontic diagnosis, decision-making, and prediction of
prognosis: a systematic review,” The Journal of Contemporary Dental
Practice, vol. 21, no. 8, pp. 926–934, 2020.
63. V. Nagendrababu, A. Aminoshariae, and J. Kulild, “Artificial intelligence in endodontics: current applications and future directions,” Journal of Endodontics, vol. 47, no. 9, pp. 1352–1357, 2021.
64. T. Hiraiwa, Y. Ariji, M. Fukuda et al., “A deep-learning artificial intelligence system for assessment of root morphology of the mandibular first molar on panoramic radiography,” Dentomaxillofacial Radiology, vol. 48, no. 3, p. 20180218, 2019.
65. Z. S. Madani, N. Mehraban, E. Moudi, and A. Bijani, “Root and canal morphology of mandibular molars in a selected Iranian population using cone-beam computed tomography,” Iranian Endodontic Journal, vol. 12, no. 2, pp. 143–148, 2017.
66. S. Rahimi, H. Mokhtari, B. Ranjkesh et al., “Prevalence of extra roots in permanent mandibular first molars in Iranian population: a CBCT analysis,” Iranian Endodontic Journal, vol. 12, no. 1, pp. 70–73, 2017.
67. Y. Xue, R. Zhang, Y. Deng, K. Chen, and T. Jiang, “A preliminary examination of the diagnostic value of deep learning in hip osteoarthritis,” PLoS One, vol. 12, no. 6, article e0178992, 2017.
68. X. Wang, W. Yang, J. Weinreb et al., “Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning,” Scientific Reports, vol. 7, no. 1, pp. 1–8, 2017.
69. T. Hiraiwa, Y. Ariji, M. Fukuda et al., “A deep-learning artificial intelligence system for assessment of root morphology of the mandibular first molar on panoramic radiography,” Dentomaxillofacial Radiology, vol. 48, no. 3, p. 20180218, 2019.
70. Y. Xue, R. Zhang, Y. Deng, K. Chen, and T. Jiang, “A preliminary examination of the diagnostic value of deep learning in hip osteoarthritis,” PLoS One, vol. 12, no. 6, article e0178992, 2017.
71. X. Wang, W. Yang, J. Weinreb et al., “Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning,” Scientific Reports, vol. 7, no. 1, pp. 1–8, 2017.
72. P. Chhetri, N. O. Devi, and K. Dem Lepcha, “Artificial intelligence in dentistry,” Journal of Clinical Research and Community, vol. 1, no. 1, 2021.
73. J. L. Gutmann and J. E. Leonard, “Problem solving in endodontic working-length determination,” Compendium of Continuing Education in Dentistry, vol. 16, no. 3, 1995.
74. M. Saghiri, K. Asgar, K. Boukani et al., “A new approach for locating the minor apical foramen using an artificial neural network,” International Endodontic Journal, vol. 45, no. 3, pp. 257–265, 2012.
75. N. Boreak, “Effectiveness of artificial intelligence applications designed for endodontic diagnosis, decision-making, and prediction of prognosis: a systematic review,” The Journal of Contemporary Dental Practice, vol. 21, no. 8, pp. 926–934, 2020.
76. M. Gordon and N. Chandler, “Electronic apex locators,” International Endodontic Journal, vol. 37, no. 7, pp. 425–437, 2004.
77. X. Qiao, Z. Zhang, and X. Chen, “Multifrequency impedance method based on neural network for root canal length measurement,” Applied Sciences, vol. 10, no. 21, p. 7430, 2020.
78. M. Joseph, “Clinical success of two working length determination techniques: a randomized controlled trial,” in Madha Dental College and Hospital, Chennai, 2019.
79. L. Campo, I. J. Aliaga, J. F. De Paz et al., “Retreatment predictions in odontology by means of CBR systems,” Computational Intelligence and Neuroscience, vol. 2016, 2016.
80. D. Gu, C. Liang, and H. Zhao, “A case-based reasoning system based on weighted heterogeneous value distance metric for breast cancer diagnosis,” Artificial Intelligence in Medicine, vol. 77, pp. 31–47, 2017.
81. V. Nagendrababu, A. Aminoshariae, and J. Kulild, “Artificial intelligence in endodontics: current applications and future directions,” Journal of Endodontics, vol. 47, no. 9, pp. 1352–1357, 2021.
82. Tumbelaka BY, Oscandar F, Baihaki FN, Sitam S, Rukmo MJSEJ. Identification of pulpitis at dental X-ray periapical radiography based on edge detection, texture description and artificial neural networks. 2014;4(3):115–21.
83. Zheng L, Wang H, Mei L, Chen Q, Zhang Y, Zhang H. Artificial intelligence in digital cariology: a new tool for the diagnosis of deep caries and pulpitis using convolutional neural networks. Ann Transl Med. 2021;9(9):763.
84.
85. Okada K, Rysavy S, Flores A, Linguraru MG. Noninvasive differential diagnosis of dental periapical lesions in cone-beam CT scans. Med Phys. 2015;42(4):1653–65.
86. Orhan K, Bayrakdar IS, Ezhov M, Kravtsov A, Özyürek T. Evaluation of artificial intelligence for detecting periapical pathosis on cone-beam computed tomography scans. Int Endod J. 2020;53(5):680–9.
87. Kong Y, Posada-Quintero HF, Tran H, Talati A, Acquista TJ, Chen IP, Chon KH. Differentiating between stress- and EPT-induced electrodermal activity during dental examination. Comput Biol Med. 2023:155.
88. Ramezanzade S, Dascalu TL, Ibragimov B, Bakhshandeh A, Bjørndal L. Prediction of pulp exposure before caries excavation using artificial intelligence: Deep learning-based image data versus standard dental radiographs. J Dent. 2023:138.
89. Choi S, Choi J, Peters OA, Peters CI. Design of an interactive system for access cavity assessment: A novel feedback tool for preclinical endodontics. Eur J Dent Educ. 2023.
90. Suárez A, Díaz-Flores García V, Algar J, Gómez Sánchez M, Llorente de Pedro M, Freire Y. Unveiling the ChatGPT phenomenon: Evaluating the consistency and accuracy of endodontic question answers. Int Endod J. 2023.
91. Farajollahi M, Modaberi A. Can ChatGPT pass the "Iranian Endodontics Specialist Board" exam? Iran Endod J. 2023;18(3):192.
92. Yang H, Jo E, Kim HJ, Cha I-H, Jung Y-S, Nam W, et al. Deep learn-
ing for automated detection of cyst and tumors of the jaw in panoramic
radiographs. J Clin Med 2020;9:1–14.
https://doi.org/10.3390/jcm9061839.
93. Poedjiastoeti W, Suebnukarn S. Application of Convolutional Neural
Network in the Diagnosis of Jaw Tumors. Healthc Inform Res
2018;24:236–41. https://doi.org/10.4258/hir.2018.24.3.236.
94. Effiom OA, Ogundana OM, Akinshipo AO, Akintoye SO. Ameloblas-
toma: current etiopathological concepts and management. Oral Dis
2018;24:307–16. https://doi.org/10.1111/odi.12646.
95. Orhan K, Bayrakdar IS, Ezhov M, Kravtsov A, Özyürek T. Evaluation of artificial intelligence for detecting periapical pathosis on cone-beam computed tomography scans. Int Endod J 2020;53:680–9. https://doi.org/10.1111/iej.13265.
96. Fukuda M, Ariji Y, Kise Y, Nozawa M, Kuwada C, Funakoshi T, et al. Comparison of 3 deep learning neural networks for classifying the relationship between the mandibular third molar and the mandibular canal on panoramic radiographs. Oral Surg Oral Med Oral Pathol Oral Radiol 2020;130:336–43. https://doi.org/10.1016/j.oooo.2020.04.005.
97. Zhang W, Li J, Li Z-B, Li Z. Predicting postoperative facial swelling following impacted mandibular third molars extraction by using artificial neural networks evaluation. Sci Rep 2018;8:12281. https://doi.org/10.1038/s41598-018-29934-1.
98. Revilla-León M, Gómez-Polo M, Vyas S, Barmak BA, Galluci GO, Att W, et al. Artificial intelligence applications in implant dentistry: A systematic review. J Prosthet Dent 2021:S0022-3913(21)00272-9. https://doi.org/10.1016/j.prosdent.2021.05.008.
99. Gómez I, Seoane J, Varela-Centelles P, Diz P, Takkouche B. Is diag-
nostic delay related to advanced-stage oral cancer? A meta-analysis.
Eur J Oral Sci 2009;117:541–6. https://doi.org/10.1111/j.1600-
0722.2009.00672.x.
100. van Staveren HJ, van Veen RL, Speelman OC, Witjes MJ, Star
WM, Roodenburg JL. Classification of clinical autofluorescence spec-
tra of oral leukoplakia using an artificial neural network: a pilot study.
Oral Oncol 2000;36:286–93. https://doi.org/10.1016/s1368-
8375(00)00004-x.
101. Shamim MZ, Syed S, Shiblee M, Usman M, Zaidi M, Ahmad Z,
et al. Detecting benign and precancerous tongue lesions using deep
72
convolutional neural networks for early signs of oral cancer. Basic Clin
Pharmacol Toxicol 2019;125:184–5.
102. Fu Q, Chen Y, Li Z, Jing Q, Hu C, Liu H, et al. A deep learning
algorithm for detection of oral cavity squamous cell carcinoma from
photographic images: A retrospective study. EClinicalMedicine
2020;27:100558. https://doi.org/10.1016/j.eclinm.2020.100558
103. Tomita H, Yamashiro T, Heianna J, Nakasone T, Kimura Y,
Mimura H, et al. Nodal-based radiomics analysis for identifying cervi-
cal lymph node metastasis at levels I and II in patients with oral squa-
mous cell carcinoma using contrast-enhanced computed tomography.
Eur Radiol 2021. https://doi.org/10.1007/s00330-021-07758-4.
104. Napel S, Mu W, Jardim-Perassi BV, Aerts HJWL, Gillies RJ.
Quantitative imaging of cancer in the postgenomic era:
Radio(geno)mics, deep learning, and habitats. Cancer 2018;124:4633–
49. https://doi.org/10.1002/cncr.31630.
105. Pan X, Zhang T, Yang Q, Yang D, Rwigema J-C, Qi XS. Sur-
vival prediction for oral tongue cancer patients via probabilistic genetic
algorithm optimized neural network models. Br J Radiol
2020;93:20190825. https://doi.org/10.1259/bjr.20190825
106. Alabi RO, Elmusrati M, Sawazaki-Calone I, Kowalski LP,
Haglund C, Coletta RD, et al. Comparison of supervised machine learn-
ing classification techniques in prediction of locoregional recurrences
in early oral tongue cancer. Int J Med Inf 2020;136.
https://doi.org/10.1016/j.ijmedinf.2019.104068
107. Tighe D, Thomas AJ, Hills A, Quadros R. Validating a risk strat-
ification tool for audit of early outcome after operations for squamous
cell carcinoma of the head and neck. Br J Oral Maxillofac Surg
2019;57:873–9. https://doi.org/10.1016/j.bjoms.2019.07.008
108. Poedjiastoeti W, Suebnukarn S. Application of Convolutional
Neural Network in the Diagnosis of Jaw Tumors. Healthc Inform Res
2018;24:236–41. https://doi.org/10.4258/hir.2018.24.3.236.
109. Yang H, Jo E, Kim HJ, Cha I-H, Jung Y-S, Nam W, et al. Deep
learning for automated detection of cyst and tumors of the jaw in
panoramic radiographs. J Clin Med 2020;9:1–14.
https://doi.org/10.3390/jcm9061839.
110. Orhan K, Bayrakdar IS, Ezhov M, Kravtsov A, Özyürek T. Eval-
uation of artificial intelligence for detecting periapical pathosis on
cone-beam computed tomography scans. Int Endod J 2020;53:680–9.
https://doi.org/10.1111/iej.13265.
111. Poedjiastoeti W, Suebnukarn S. Application of Convolutional
Neural Network in the Diagnosis of Jaw Tumors. Healthc Inform Res
2018;24:236–41. https://doi.org/10.4258/hir.2018.24.3.236.
112. Kikuchi H, Ikeda M, Araki K. Evaluation of a virtual reality sim-
ulation system for porcelain fused to metal crown preparation at Tokyo
Medical and Dental University. J Dent Educ [Internet]. 2013
Jun;77(6):782–92. Available from: https://www.ncbi.nlm.nih.gov/pubmed/23740915.
113. Sims-Williams JH, Brown ID, Matthewman A, Stephens CD. A
computer-controlled expert system for orthodontic advice [Internet].
Vol. 163, British Dental Journal. 1987. p. 161–6. Available from:
http://dx.doi.org/10.1038/sj.bdj.4806228.
114. Baliga MS. Artificial intelligence - The next frontier in pedi-
atric dentistry. J Indian Soc Pedod Prev Dent [Internet]. 2019
Oct;37(4):315. Available from:
http://dx.doi.org/10.4103/JISPPD.JISPPD_319_19.
115. Rajput, G.; Ahmed, S.; Chaturvedi, S.; Addas, M.K.; Bhagat,
T.V.; Gurumurthy, V.; Alqahtani, S.M.; Alobaid, M.A.; Alsubaiy, E.F.;
Gupta, K.; et al. Comparison of Microleakage in Nanocomposite and
Amalgam as a Crown Foundation Material Luted with Different Luting
Cements under CAD-CAM Milled Metal Crowns: An In Vitro Micro-
scopic Study. Polymers 2022, 14, 2609.
116. Abouzeid, H.L.; Chaturvedi, S.; Abdelaziz, K.M.; Alzahrani,
F.A.; AlQarni, A.A.S.; Alqahtani, N.M. Role of Robotics and Artificial
Intelligence in Oral Health and Preventive Dentistry-Knowledge, Per-
ception and Attitude of Dentists. Oral Health Prev. Dent. 2021, 19,
353–363.
117. Amornvit, P.; Rokaya, D.; Sanohkan, S. Comparison of Accuracy of Current Ten Intraoral Scanners. Biomed Res. Int. 2021, 2021, 2673040.
118. Cabanes-Gumbau, G.; Palma, J.C.; Kois, J.C.; Revilla-León, M. Transferring the tooth preparation finish line on intraoral digital scans to dental software programs: A dental technique. J. Prosthet. Dent. 2022, S0022-3913, 00582-5.
119. Rekow, E.D. Digital dentistry: The new state of the art—Is it disruptive or destructive? Dent. Mater. 2020, 36, 9–24.
120. Jreige, C.S.; Kimura, R.N.; Segundo, Â.R.T.C.; Coachman, C.; Sesma, N. Esthetic treatment planning with digital animation of the smile dynamics: A technique to create a 4-dimensional virtual patient. J. Prosthet. Dent. 2022, 128, 130–138.
121. Rekow, E.D. Digital dentistry: The new state of the art—Is it disruptive or destructive? Dent. Mater. 2020, 36, 9–24.
122. Delgado, A.H.S.; Sauro, S.; Lima, A.F.; Loguercio, A.D.; Della Bona, A.; Mazzoni, A.; Collares, F.M.; Staxrud, F.; Ferracane, J.; Tsoi, J.; et al. Risk of bias tool and guideline to support reporting of pre-clinical Dental Materials Research and assessment of Systematic Reviews. J. Dent. 2022, 127, 104350.
123. Kirubarajan, A.; Young, D.; Khan, S.; Crasto, N.; Sobel, M.; Sussman, D. Artificial Intelligence and Surgical Education: A systematic scoping review of interventions. J. Surg. Educ. 2022, 79, 500–515.
124. Afrashtehfar, K.I.; Alnakeb, N.A.; Assery, M.K. Accuracy of intraoral scanners versus traditional impressions: A Rapid Umbrella Review. J. Evid. Based Dent. Pract. 2022, 22, 101719.
125. The Glossary of Prosthodontic Terms: Ninth Edition. J. Prosthet. Dent. 2017, 117, e1–e105.
126. Persson, A.S.; Andersson, M.; Oden, A.; Sandborgh-Englund, G. Computer aided analysis of digitized dental stone replicas by dental CAD/CAM technology. Dent. Mater. 2008, 24, 1123–1130.
127. Berrendero, S.; Salido, M.P.; Ferreiroa, A.; Valverde, A.; Pradies, G. Comparative study of all-ceramic crowns obtained from conventional and digital impressions: Clinical findings. Clin. Oral Investig. 2019, 23, 1745–1751.
128. Su, T.S.; Sun, J. Comparison of repeatability between intraoral digital scanner and extraoral digital scanner: An in-vitro study. J. Prosthodont. Res. 2015, 59, 236–242.
129. Ahrberg, D.; Lauer, H.C.; Ahrberg, M.; Weigl, P. Evaluation of fit and efficiency of CAD/CAM fabricated all-ceramic restorations based on direct and indirect digitalization: A double-blinded, randomized clinical trial. Clin. Oral Investig. 2016, 20, 291–300.
130. Sim, J.Y.; Jang, Y.; Kim, W.C.; Kim, H.Y.; Lee, D.H.; Kim, J.H. Comparing the accuracy (trueness and precision) of models of fixed dental prostheses fabricated by digital and conventional workflows. J. Prosthodont. Res. 2019, 63, 25–30.
131. Yilmaz, H.; Aydin, M.N. Digital versus conventional impression method in children: Comfort, preference and time. Int. J. Paediatr. Dent. 2019, 29, 728–735.
132. Cattoni, F.; Tete, G.; Calloni, A.M.; Manazza, F.; Gastaldi, G.; Cappare, P. Milled versus moulded mock-ups based on the superimposition of 3D meshes from digital oral impressions: A comparative in vitro study in the aesthetic area. BMC Oral Health, 230.
133. Kim, J.E.; Park, Y.B.; Shim, J.S.; Moon, H.S. The Impact of Metal Artifacts Within Cone Beam Computed Tomography Data on the Accuracy of Computer-Based Implant Surgery: An In Vitro Study. Int. J. Oral Maxillofac. Implant. 2019, 34, 585–594.
134. Ender, A.; Zimmermann, M.; Mehl, A. Accuracy of complete-
and partial-arch impressions of actual intraoral scanning systems in
vitro. Int. J. Comput. Dent. 2019, 22, 11–19.
135. Malik, J.; Rodriguez, J.; Weisbloom, M.; Petridis, H. Comparison of Accuracy Between a Conventional and Two Digital Intraoral Impression Techniques. Int. J. Prosthodont. 2018, 31, 107–113.
136. Guth, J.F.; Runkel, C.; Beuer, F.; Stimmelmayr, M.; Edelhoff, D.; Keul, C. Accuracy of five intraoral scanners compared to indirect digitalization. Clin. Oral Investig. 2017, 21, 1445–1455.
137. Tomita, Y.; Uechi, J.; Konno, M.; Sasamoto, S.; Iijima, M.; Mizoguchi, I. Accuracy of digital models generated by conventional impression/plaster-model methods and intraoral scanning. Dent. Mater. J. 2018, 37, 628–633.
138. Gonzalez de Villaumbrosia, P.; Martinez-Rus, F.; Garcia-Orejas, A.; Salido, M.P.; Pradies, G. In vitro comparison of the accuracy (trueness and precision) of six extraoral dental scanners with different scanning technologies. J. Prosthet. Dent. 2016, 116, 550-543.e541.
139. Li, H.; Lyu, P.; Wang, Y.; Sun, Y. Influence of object translucency on the scanning accuracy of a powder-free intraoral scanner: A laboratory study. J. Prosthet. Dent. 2017, 117, 93–101.
140. Bocklet, C.; Renne, W.; Mennito, A.; Bacro, T.; Latham, J.; Evans, Z.; Ludlow, M.; Kelly, A.; Nash, J. Effect of scan substrates on accuracy of 7 intraoral digital impression systems using human maxilla model. Orthod. Craniofac. Res. 2019, 22 (Suppl. 1), 168–174.
141. Son, S.A.; Kim, J.H.; Seo, D.G.; Park, J.K. Influence of different inlay configurations and distance from the adjacent tooth on the accuracy of an intraoral scan. J. Prosthet. Dent. 2021.
142. Parize, H.; Dias Corpa Tardelli, J.; Bohner, L.; Sesma, N.; Muglia, V.A.; Candido Dos Reis, A. Digital versus conventional workflow for the fabrication of physical casts for fixed prosthodontics: A systematic review of accuracy. J. Prosthet. Dent. 2021.
143. Hasanzade, M.; Aminikhah, M.; Afrashtehfar, K.I.; Alikhasi, M. Marginal and internal adaptation of single crowns and fixed dental prostheses by using digital and conventional workflows: A systematic review and meta-analysis. J. Prosthet. Dent. 2020.
Republic of Iraq
Ministry of higher education and scientific research
University of Al-Qadisiyah
College of Dentistry
BY
Zahraa Salih Hadi
Abbas Hamza Yasir
Khayrat Ahmed Zaghir
Ruqayya Ahmed Abdul-Amir
Sura Nazar
Supervised By
Dr. Ghassan Majid Jasim
2023-2024
Abstract

Artificial intelligence (AI) has emerged as a transformative force in dentistry, revolutionizing various aspects of patient care, diagnosis, and treatment planning. The integration of AI technologies into dentistry offers effective solutions, ranging from image analysis for radiographic interpretation to personalized treatment recommendations.

The use of machine learning algorithms, such as convolutional neural networks (CNNs) and deep learning models, enhances diagnostic accuracy and enables predictive analytics for oral health outcomes. Beyond improving clinical workflow, AI applications in dentistry contribute to enhanced patient experiences and more personalized healthcare interventions.