Evaluating the impact of AI-generated educational content on patient understanding and anxiety in endodontics and restorative dentistry: a comparative study

Abstract

Background

Effective patient education is critical in enhancing treatment outcomes and reducing anxiety in dental procedures. This study compares the effectiveness of AI-generated educational materials with traditional methods in improving patient comprehension and reducing anxiety during endodontic and restorative dental treatments.

Methods

A cross-sectional, comparative study was conducted with 100 participants undergoing restorative or endodontic procedures. Patients were randomized into two groups: those receiving AI-generated instructional materials (via ChatGPT) and those receiving traditional education (verbal explanations and pamphlets). Baseline knowledge and post-intervention knowledge retention were assessed using structured tests. Patient perceptions of clarity, usefulness, comprehensiveness, trust, and anxiety were measured using Likert-scale surveys. Three dental experts evaluated the educational content for accuracy and suitability. Statistical analysis included t-tests and Cohen’s kappa to measure inter-rater reliability.

Results

AI-generated materials significantly outperformed traditional methods in all measured dimensions, including clarity (4.42 vs. 3.25), usefulness (4.63 vs. 3.50), comprehensiveness (4.50 vs. 3.29), trust (4.00 vs. 2.96), and anxiety reduction (mean anxiety score: 2.63 vs. 3.38, p < 0.001). Pre- and post-intervention knowledge assessments revealed substantial knowledge improvement in the AI group. Expert evaluations confirmed the accuracy and suitability of AI-generated materials, with high inter-rater reliability (κ = 0.75, p < 0.001).

Conclusions

AI-generated educational materials demonstrate superior effectiveness in improving patient comprehension and reducing anxiety compared to traditional methods. Their integration into dental practice could enhance patient satisfaction and streamline the educational process, particularly for complex or anxiety-inducing procedures. Future research should explore their application in diverse dental specialties and assess long-term impacts on patient behavior and clinical outcomes.

Introduction

The use of artificial intelligence (AI) in healthcare has transformed clinical practice, particularly in personalized treatment suggestions and diagnostic support [1]. A major development is the use of AI tools like ChatGPT to help educate patients and improve communication with healthcare providers [2]. In dentistry, successful treatment often depends on how well patients understand their treatment and postoperative care, and AI is starting to play a role in improving that education [3].

Dental anxiety is a well-documented phenomenon that significantly impacts treatment outcomes. Patients experiencing high levels of anxiety are more likely to avoid or delay necessary treatments, leading to compromised oral health and increased complications [4]. Anxious patients may also find it harder to understand important treatment details, risks, and next steps, which can make them less likely to continue with their care [5]. Addressing these challenges requires effective communication between dentists and patients; however, conventional approaches, including written materials and verbal explanations, frequently prove inadequate [6].

Traditional educational tools often fail to meet patient needs, leading to growing interest in AI-generated solutions. For instance, recent studies suggest that AI can provide personalized, thorough, and easily comprehensible information tailored to each patient’s needs [7]. In orthodontics, research by Vassis et al. demonstrated that patients found AI-generated materials less intimidating and more informative compared to traditional printed resources. This resulted in higher satisfaction levels and better preparation for treatment [8]. However, there remains limited evidence regarding the efficacy of AI-generated content in other dental specialties, particularly endodontics and restorative dentistry.

Endodontic procedures are associated with heightened patient anxiety due to perceived complexity and potential complications. These might include infection, tooth fracture, or the need for retreatment [9, 10]. It is essential for patients to understand these risks in order to achieve the best possible results. AI-generated educational materials present an opportunity to deliver clear, concise, and customized information. This has the potential to improve both patient satisfaction and adherence to treatment plans [11]. Automating this process could alleviate some of the workload for dental practitioners, allowing them to focus more on direct clinical care [12].

This study evaluates how AI-generated educational content influences patient comprehension, anxiety levels, and overall experience during restorative and endodontic procedures. Expert evaluations of the accuracy and comprehensiveness of the AI-generated content were also performed to determine its reliability as a teaching tool in clinical environments. The null hypothesis was that there would be no significant difference in patient comprehension, anxiety levels, or satisfaction between patients educated using traditional methods and those educated using AI-generated content.

By exploring the intersection of AI and patient education, this study contributes to the growing body of literature on the application of AI in dentistry. It addresses the pressing need for robust and reliable educational materials, especially as digital healthcare technologies continue to shape clinical practice [13].

Methodology

Study design and participants

This study utilized a randomized, comparative interventional design to evaluate the impact of AI-generated educational content versus traditional education methods on patient understanding and anxiety. Participants were randomly assigned to either the intervention group (AI-generated content) or the control group (traditional education methods). Baseline measurements were taken before the intervention, and post-intervention assessments were conducted to evaluate changes in patient comprehension and anxiety levels.

Based on a pilot study by Vassis et al. [8], the sample size was established as 50 participants in each group (AI-generated vs. traditional information), for a total of 100 individuals. At a 5% significance level, this sample size provided 80% power to detect significant changes in patient satisfaction and comprehension.
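
For transparency, the calculation below is a minimal sketch, in Python with the statsmodels package, of how such a two-sample power analysis can be reproduced. It is not the study's own computation, and because the protocol does not report the assumed effect size, the value used here is an illustrative assumption.

```python
# Minimal sketch (not the study's own calculation): power analysis for two
# independent groups of 50 at alpha = 0.05. The effect size is an assumption.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Smallest standardized difference (Cohen's d) detectable with 80% power
# when comparing two groups of 50 participants at a 5% significance level.
detectable_d = analysis.solve_power(nobs1=50, alpha=0.05, power=0.80,
                                    ratio=1.0, alternative='two-sided')
print(f"Minimum detectable effect size: d = {detectable_d:.2f}")  # ~0.57

# Conversely: participants needed per group for an assumed effect of d = 0.57.
n_per_group = analysis.solve_power(effect_size=0.57, alpha=0.05, power=0.80,
                                   ratio=1.0, alternative='two-sided')
print(f"Required sample size per group: {n_per_group:.0f}")       # ~50
```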

Inclusion criteria

  • Patients aged 18 years or older undergoing restorative or endodontic dental procedures (e.g., dental fillings or root canal therapy).

  • Patients classified as ASA I or II according to the American Society of Anesthesiologists (ASA) Physical Status Classification System.

  • Patients proficient in English, assessed using a standardized language proficiency screening test administered during the informed consent process. This test included simple questions about dental procedures and general health concepts to ensure participants could understand written and verbal instructions.

Exclusion criteria

  • Patients classified as ASA III or higher due to significant systemic conditions that may affect their ability to comprehend educational materials.

  • Patients with cognitive impairments or disabilities that hinder their understanding of the provided information.

  • Patients who had previously received detailed education about the procedure in question.

Participants were recruited from patients scheduled for endodontic or restorative treatment. Recruitment was conducted over three months, following patients’ initial treatment consultations. Ethical approval was granted by the institutional review board (IRB) of Fatima Jinnah Dental College, and each participant provided written informed consent before enrollment.

Randomization

To ensure balanced representation across groups, randomization was performed using a computer-generated sequence stratified by procedure type (e.g., dental fillings or root canal therapy). Participants were randomly assigned to one of two groups:

  • Group A: Patients received AI-generated educational materials, produced with ChatGPT and tailored to their specific procedure (endodontic or restorative).

  • Group B: Patients received conventional educational materials from the attending dentist, in the form of printed pamphlets and verbal explanations.

Experts assessing the instructional materials were blinded to the content’s source (conventional or AI-generated) in order to reduce bias.
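
The sketch below illustrates one way such a computer-generated, stratified allocation sequence could be produced in Python; the permuted block size of four and the patient identifiers are illustrative assumptions rather than details reported in the protocol.

```python
# Minimal sketch of stratified randomization (assumed block size of 4;
# not the study's actual allocation script).
import random

def stratified_block_randomization(patients, block_size=4, seed=2024):
    """patients: list of (patient_id, procedure_type) tuples.
    Returns {patient_id: 'A' or 'B'}, balanced within each procedure stratum."""
    rng = random.Random(seed)
    allocation = {}
    strata = {}
    for pid, procedure in patients:              # group patients by stratum
        strata.setdefault(procedure, []).append(pid)
    for procedure, ids in strata.items():
        assignments = []
        while len(assignments) < len(ids):       # permuted blocks of A's and B's
            block = ['A', 'B'] * (block_size // 2)
            rng.shuffle(block)
            assignments.extend(block)
        for pid, group in zip(ids, assignments):
            allocation[pid] = group
    return allocation

# Illustrative usage with hypothetical patient identifiers:
roster = [(f"P{i:03d}", "endodontic" if i % 2 else "restorative")
          for i in range(1, 101)]
groups = stratified_block_randomization(roster)
print(sum(g == 'A' for g in groups.values()), "patients assigned to Group A")
```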

Baseline knowledge assessment

Before receiving the educational materials, participants took a baseline knowledge test assessing their familiarity with dental procedures and their comprehension of oral health concepts. This pre-intervention assessment ensured that the baseline knowledge of the two groups was similar, and the baseline data were used to confirm that the groups had comparable levels of prior knowledge.

Intervention (Educational Material)

  • Group A (AI-Generated Content): Patients in this group received educational materials generated by ChatGPT and customized for their specific procedure (e.g., dental filling or root canal therapy). The content included the following standardized sections:

    • Description of the procedure.

    • Risks and complications.

    • Postoperative care.

    • Frequently asked questions (FAQs).

  • Group B (Traditional Education Group): This group received verbal information and printed pamphlets, provided by the clinical staff.

Both the AI-generated and the conventional materials covered similar information, ensuring consistency in how the content was presented. The language was adjusted for accessibility to accommodate patients with varying literacy levels.

Validation of AI-generated content

The AI-generated educational materials underwent a rigorous validation process to ensure accuracy, comprehensiveness, and cultural relevance. This process included the following steps:

  • Expert Review:

    A panel of three dental experts (endodontists and restorative dentists) independently reviewed the AI-generated content for accuracy, comprehensiveness, and suitability for patient education. Disagreements among experts were resolved through consensus discussions. The high inter-rater reliability (Cohen’s kappa = 0.75, p < 0.001) confirmed the consistency and reliability of the expert evaluations.

  • Pilot Testing:

    A pilot test was conducted with 10 patients to assess the clarity and cultural appropriateness of the materials. Feedback from these participants was used to refine the content, simplify language, and address any ambiguities. This step ensured that the AI-generated materials were accessible and relevant to the target audience.

  • Quality Assurance Protocols:

    To minimize the risk of misinformation, all AI-generated content underwent additional quality assurance checks. A multidisciplinary team, including dentists, educators, and communication specialists, reviewed the materials for factual accuracy and clarity. Disclaimers were included in patient materials stating that the AI-generated content serves as a supplementary resource and should be verified by healthcare professionals. Patients were encouraged to ask questions during consultations to clarify any ambiguities.

Knowledge retention assessment

Participants completed a post-intervention knowledge test after the educational session. This test assessed specific factual information about the procedure, such as its risks, benefits, and aftercare. Pre- and post-intervention results were compared to assess knowledge improvement.

Survey tool validation

The survey tool was validated in a preliminary study with ten participants to ensure that it was relevant and clear. Question content and scale design were refined based on feedback from this pilot test. Internal consistency, measured using Cronbach’s alpha, showed good reliability (α = 0.82).
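
For illustration, the sketch below shows how Cronbach’s alpha is computed from a participants-by-items matrix of Likert ratings; the simulated data are placeholders, not the pilot responses.

```python
# Minimal sketch: Cronbach's alpha for a (participants x items) matrix of
# Likert ratings. The data below are simulated, not the study's pilot data.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = responses.shape[1]                         # number of survey items
    item_vars = responses.var(axis=0, ddof=1)      # variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 10 participants x 5 items, sharing a common
# latent component so that the items are positively correlated.
rng = np.random.default_rng(0)
latent = rng.normal(0.0, 1.0, size=(10, 1))
pilot = np.clip(np.round(3 + latent + rng.normal(0, 0.6, size=(10, 5))), 1, 5)
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")
```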

Data collection

Two types of data were collected using this instrument:

  1. Patient Survey: Following receipt of the educational materials, patients completed a 5-point Likert-scale survey (1–5) that evaluated the following:

    • Clarity of the information.

    • Content usefulness.

    • Comprehensiveness of the information.

    • Trust in the instructional data.

    • Level of procedure-related anxiety.

  2. Expert Evaluation: Three dental experts (endodontists and restorative dentists) independently evaluated the accuracy, comprehensiveness, and suitability of the educational content. To maintain consistency, disagreements in expert ratings were resolved by consensus.

Outcome measures

Primary outcomes

  • Likert-scale (1–5) responses indicating the patient’s level of satisfaction with the educational material.

  • Knowledge assessments before and after the intervention, with an emphasis on comprehension and retention.

Secondary outcomes

  • Expert evaluations of the accuracy, comprehensiveness, and suitability of the educational material.

  • Patients’ procedure-related anxiety levels, based on survey responses.

Statistical analysis

Responses for each group were summarized using means and standard deviations. Knowledge retention was assessed by comparing the results of the knowledge tests taken before and after the intervention. Independent t-tests were used to compare mean scores on survey items (e.g., usefulness and clarity) between groups.

The use of t-tests for the Likert-scale (1–5) data was supported by Shapiro-Wilk test results, which indicated a normal distribution for all dimensions (p > 0.05). Non-parametric alternatives were considered but deemed unnecessary given the normality of the data.

To ensure consistency in expert ratings, inter-rater reliability was measured using Cohen’s kappa (κ = 0.75, p < 0.001). The statistical software SPSS version 26 was used for all analyses.
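
The analyses were run in SPSS; purely as an illustration, the sketch below reproduces the same three steps (Shapiro-Wilk normality check, independent-samples t-test, Cohen’s kappa) in Python with SciPy and scikit-learn on simulated data. Note that Cohen’s kappa is defined for pairs of raters, so with three experts the pairwise kappas would be computed and summarized.

```python
# Minimal sketch of the reported analysis pipeline on simulated data
# (the study itself used SPSS v26; values here are illustrative only).
import numpy as np
from scipy import stats
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(42)

# Hypothetical clarity ratings (1-5) for the two groups of 50 participants.
ai_group = rng.normal(4.4, 0.5, 50).clip(1, 5)
traditional_group = rng.normal(3.3, 0.6, 50).clip(1, 5)

# 1. Shapiro-Wilk normality check (p > 0.05 supports using a t-test).
for name, scores in [("AI", ai_group), ("Traditional", traditional_group)]:
    w, p = stats.shapiro(scores)
    print(f"Shapiro-Wilk ({name}): W = {w:.3f}, p = {p:.3f}")

# 2. Independent-samples t-test comparing mean scores between groups.
t, p = stats.ttest_ind(ai_group, traditional_group)
print(f"Independent t-test: t = {t:.2f}, p = {p:.4g}")

# 3. Cohen's kappa for agreement between two (hypothetical) expert raters.
rater_1 = [5, 4, 5, 5, 4, 5, 4, 5]
rater_2 = [5, 4, 4, 5, 4, 5, 4, 5]
print(f"Cohen's kappa: {cohen_kappa_score(rater_1, rater_2):.2f}")
```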

Results

Participant demographics and baseline characteristics

Of the 100 participants, fifty were randomly assigned to the AI-generated education group (Group A) and fifty to the traditional education group (Group B). Stratified randomization by procedure type (e.g., dental fillings or root canal therapy) ensured balanced representation across groups.

Participants in the two groups were similar in baseline anxiety scores, age, gender, and educational attainment. The pre-intervention survey confirmed the success of the randomization, showing no significant between-group differences in baseline knowledge or anxiety levels (p > 0.05).

Effectiveness of AI-generated versus traditional educational content

Post-intervention surveys were used to evaluate patients’ perceptions of clarity, usefulness, comprehensiveness, trust, and anxiety. The AI group consistently outperformed the traditional group in every dimension.

The mean scores for both groups, together with p-values from independent t-tests, are shown in Table 1. All dimensions showed statistically significant differences in scores (p < 0.001).

Subgroup analyses

Subgroup analyses were performed to explore potential differences in outcomes by demographic factors such as age, gender, and educational level. While no statistically significant subgroup differences were observed (p > 0.05), there was a trend for younger patients and those with higher education levels to show slightly greater improvements in comprehension scores than older patients and those with lower education levels. These findings highlight the need for tailored educational strategies to address varying patient needs.

Sensitivity analysis

A sensitivity analysis was performed to evaluate the robustness of the results. Excluding participants with baseline anxiety scores above 4.0 did not alter the significance of the observed differences in anxiety reduction (p < 0.001). Similarly, adjusting for educational level and age in a multivariable model confirmed the robustness of the findings. Reducing the sample size by 10% and 20% also maintained statistical significance, indicating the reliability of the conclusions.

To further validate the findings, a non-parametric Mann-Whitney U test was performed for Likert-scale responses, yielding results consistent with the primary analysis (p < 0.001), confirming the robustness of the conclusions.
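
A minimal sketch of this non-parametric check, again on simulated rather than study data:

```python
# Minimal sketch: Mann-Whitney U test on simulated Likert responses. The
# test makes no normality assumption, so agreement with the t-test results
# supports the robustness of the primary analysis.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
ai_scores = rng.integers(4, 6, 50)            # hypothetical ratings of 4-5
traditional_scores = rng.integers(2, 5, 50)   # hypothetical ratings of 2-4

u, p = mannwhitneyu(ai_scores, traditional_scores, alternative='two-sided')
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4g}")
```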

Table 1 Comparison of mean scores across five dimensions

Clarity, usefulness, comprehensiveness, and trust

The mean scores are graphically compared in a bar chart (Fig. 1), which shows that AI-generated educational content is consistently more comprehensive, clear, and helpful than conventional approaches. Clarity (4.42 vs. 3.25), usefulness (4.63 vs. 3.50), comprehensiveness (4.50 vs. 3.29), and trust (4.00 vs. 2.96) were all markedly greater for the AI group than for the traditional group. These results demonstrate how well AI-generated content meets the educational needs of patients.

Fig. 1 Comparison of AI and traditional educational content

Reduction in anxiety levels

To evaluate the effectiveness of the educational materials, patient anxiety was measured both before and after the intervention. Mean anxiety in the AI group decreased significantly from 3.50 to 2.63 (p < 0.001, Cohen’s d = 1.2), indicating a large effect size. The traditional group, in contrast, showed only a minor, non-significant decline from 3.42 to 3.38 (p > 0.05, Cohen’s d = 0.1).
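
Cohen’s d is commonly computed as the mean change divided by a pooled standard deviation; the sketch below illustrates that convention (which may differ in detail from the study’s own calculation) on hypothetical pre- and post-intervention anxiety scores.

```python
# Minimal sketch: Cohen's d as mean change divided by the pooled standard
# deviation of the pre- and post-intervention scores (one common convention;
# the data below are hypothetical, not the trial measurements).
import numpy as np

def cohens_d(pre: np.ndarray, post: np.ndarray) -> float:
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
    return (pre.mean() - post.mean()) / pooled_sd

rng = np.random.default_rng(1)
pre_anxiety = rng.normal(3.5, 0.7, 50).clip(1, 5)
post_anxiety = rng.normal(2.6, 0.7, 50).clip(1, 5)
print(f"Cohen's d: {cohens_d(pre_anxiety, post_anxiety):.2f}")  # roughly 1.2
```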

A bar chart (Fig. 2) illustrates the differences in mean anxiety reduction between the two groups, while a boxplot (Fig. 3) shows the distribution of anxiety levels before and after the intervention.

Fig. 2 Pre- and post-intervention anxiety levels

Fig. 3 Boxplot showing the distribution of pre- and post-intervention anxiety levels

Expert evaluations

Three dental specialists assessed the educational materials for accuracy, comprehensiveness, and suitability for patient education. AI-generated content was scored higher in every category, with mean ratings of 4.70 for accuracy, 4.60 for comprehensiveness, and 4.80 for suitability.

These expert assessments are shown in Fig. 4, which highlights the evident superiority of AI-generated content. The reliability of these findings was further validated using Cohen’s kappa, which verified the high level of agreement among the experts (κ = 0.75, p < 0.001).

Fig. 4 Expert ratings of educational content

Patient preferences for educational methods

Patients were asked to express their preference for the type of educational information they received. Among them, 65% preferred AI-generated materials, 25% favored traditional methods, and 10% expressed no preference.

A pie chart (Fig. 5) summarizes these findings, underscoring the strong preference for AI-generated content.

Fig. 5 Patient preferences for educational methods

Discussion

This study demonstrates the potential of AI-generated patient education to increase comprehension and lower anxiety compared with traditional methods. AI-generated materials consistently outperformed traditional approaches in clarity, usefulness, comprehensiveness, and trust [3]. The findings indicate that AI can serve as a valuable tool to enhance patient education, particularly in restorative and endodontic dentistry.

This study emphasizes the benefits of AI-generated educational materials, yet it is crucial to place these findings within the wider framework of patient education strategies. Traditional methods, such as verbal explanations and printed pamphlets, have long been the cornerstone of dentist-patient communication. However, these methods frequently fail to meet the specific needs of individual patients, especially in complex treatments such as root canal therapy or dental restorations [14]. Content produced by AI effectively overcomes these challenges by delivering personalized, clear, and thorough information customized to match each patient’s level of understanding.

Clarity and comprehensiveness

The increased clarity and comprehensiveness scores of the AI group demonstrate the effectiveness of AI technologies in describing complex dental procedures in a manner that is both detailed and easily understood by patients. AI-generated materials solve some of the drawbacks of conventional educational approaches by customizing information to each individual’s needs. These results are consistent with other research in the healthcare field, which has demonstrated that AI-enhanced patient education improves the delivery and understanding of information [15, 16].

The strong inter-rater reliability among dental specialists (Cohen’s kappa = 0.75, p < 0.001) highlights the accuracy and thoroughness of the AI-generated content and supports its authenticity. Incorporating AI-generated material into clinical environments requires this reliability to ensure consistency and credibility.

Trust in AI-generated content

The accuracy and consistency of the information offered are perhaps the main reasons why patients trust AI-generated content. Most conventional methods depend on verbal explanations, which may not cover all the essential elements of a procedure and are vulnerable to variation between practitioners [3]. AI tools, in contrast, ensure that important topics are addressed in a consistent and effective way, which makes patients more confident.

This outcome contradicts prior studies suggesting that patients placed greater trust in human practitioners than in AI systems [17]. This disparity might be a reflection of the increasing acceptability of AI in healthcare, especially when it is used to enhance human expertise rather than replace it.

Reduction in anxiety levels

The significant decrease in anxiety among patients who received AI-generated education is among the study’s most convincing results. The traditional group’s mean anxiety levels hardly changed (3.42 to 3.38, p > 0.05), whereas the AI group’s decreased significantly from pre- to post-intervention (3.50 to 2.63, p < 0.001).

The null hypothesis, which proposed no significant difference in patient comprehension, anxiety levels, or satisfaction between the groups, was rejected. The results demonstrated statistically significant improvements in all measured dimensions for the AI group compared to the traditional group (Table 1).

This result emphasizes how crucial thorough and understandable information is in easing patients’ anxieties over dental procedures. Reducing ambiguity through well-structured information has been shown to reduce anxiety and improve patient satisfaction in various medical fields [18].

Implications for endodontics and restorative dentistry

AI tools could substantially influence patient education in dental settings because of their ability to provide accurate and understandable information. AI-generated information enhances patient comprehension and reduces concerns about treatments such as root canals and dental restorations, procedures during which patient anxiety and misunderstandings are frequent [11, 19]. Because of this dual impact of enhancing comprehension and minimizing anxiety, AI is seen as a valuable adjunct to traditional methods, with the potential to ultimately improve patient satisfaction and treatment outcomes [4].

Previous studies have emphasized the importance of culturally responsive teaching and adapting instructional practices to meet diverse patient needs [20]. AI tools can complement these strategies by dynamically adjusting content based on patient feedback and preferences. For instance, integrating multimedia elements such as videos or infographics into AI-generated materials could further enhance engagement. This could be particularly useful for patients with lower literacy levels or visual learning preferences [21]. Future research should explore the synergistic potential of combining AI technologies with traditional methods to optimize patient education outcomes.

Cost-effectiveness

Incorporating AI-generated educational content into dental practice improves efficiency and reduces costs by streamlining patient education. Traditional methods, such as printed pamphlets and prolonged chairside explanations, incur material expenses (e.g., printing, storage) and consume clinician time, averaging 15–20 min per patient [22]. In contrast, AI-generated content offers a standardized, scalable solution, reducing material costs by 30–40% and cutting clinician time spent on explanations by 50% [23, 24]. For instance, a study by Dutta et al. demonstrated that AI tools reduced per-patient education costs from 8.50 (traditional) to 3.20 (AI) while maintaining quality [23].

Schwendicke et al. found that AI-assisted caries detection was cost-effective, improving diagnostic accuracy while reducing long-term treatment expenses. Their randomized trial demonstrated that AI integration resulted in more efficient resource allocation, hence enhancing the financial feasibility of AI in standard dental treatment [25].

However, initial investments in AI integration (e.g., software licensing, staff training) must be weighed against long-term savings. Future research should conduct cost-benefit analyses to evaluate workflow optimization and efficiency gains in patient care, thereby confirming the economic feasibility of AI in routine dental care [24].

Limitations

While this study provides statistically significant results, several limitations should be acknowledged to contextualize the findings and guide future research.

  1. Single-Institution Design: The study was conducted at a single institution, which may limit the generalizability of the results. Differences in patient demographics, institutional practices, and regional healthcare systems could affect the applicability of these findings to broader populations. Multi-center studies are needed to validate the results across diverse settings.

  2. Sample Size Constraints: Although the sample size was adequate for this exploratory investigation, a larger cohort would strengthen the statistical power and enhance the reliability of the conclusions. Future studies should aim for more extensive recruitment to ensure broader representativeness.

  3. Lack of Long-Term Assessment: The investigation did not assess the long-term retention of knowledge or the lasting impact of AI-generated instructions on compliance following treatment. Further research is necessary to determine whether AI-enhanced education leads to sustained behavioral changes or improved clinical outcomes over time.

  4. Language and Accessibility Barriers: The study included only English-speaking participants, potentially excluding non-English-speaking and underserved populations, particularly in rural or low-resource settings. This limitation raises ethical concerns about equitable access to AI-based educational tools, as it could inadvertently marginalize linguistically diverse groups [26]. To address this, future studies should prioritize the development of multilingual AI platforms and incorporate community-engaged approaches to ensure inclusive participation.

By explicitly addressing these limitations, we hope to encourage more rigorous and inclusive research in this evolving field.

Future directions

Future research could build on these results by investigating the use of AI-generated education in other dental disciplines, such as periodontology or orthodontics, to assess the generalizability of the findings. Combining AI tools with multimedia elements, such as images and videos, could further improve patient education by simplifying and visualizing difficult procedures.

Future research should also examine whether AI-generated instruction is especially advantageous for procedures associated with elevated patient anxiety, such as root canal therapy or surgical extractions. Stratifying participants by procedural anxiety level would help determine whether AI-generated content reduces anxiety in high-stress situations more effectively than traditional teaching techniques. Moreover, research that incorporates patient feedback and dentists’ assessments of AI-generated information would address ethical issues and refine these tools for clinical application.

This study primarily focused on the effectiveness of textual AI-generated content. The integration of multimedia elements, such as videos and infographics, holds significant promise for enhancing patient engagement and understanding [27]. Multimedia materials could particularly benefit patients with lower literacy levels or those who prefer visual learning styles. Future studies should explore the development and efficacy of multimedia AI platforms in dental education as they have the potential to make complex procedures more accessible and less intimidating.

Conclusion

This study shows that AI-generated patient education substantially enhances the clarity, usefulness, and comprehensiveness of information while lowering patient anxiety related to dental procedures. The results demonstrate a strong patient preference for AI-generated content and, through expert assessments, confirm its accuracy and suitability.

As AI evolves and is integrated into patient education and communication strategies, it has the potential to transform dental care. AI-generated tools can be a useful supplement to conventional approaches by increasing patient comprehension and satisfaction, paving the way for better-informed, more confident patients and, ultimately, improved clinical outcomes.

Data availability

The datasets generated and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Abbreviations

AI: Artificial Intelligence

LLM: Large Language Model

IRB: Institutional Review Board

References

  1. Karalis VD. The integration of artificial intelligence into clinical practice. Appl Biosci. 2024;3(1):14–44.

  2. Alasker A, Alsalamah S, Alshathri N, Almansour N, Alsalamah F, Alghafees M, AlKhamees M, Alsaikhan B. Performance of large Language models (LLMs) in providing prostate cancer information. BMC Urol. 2024;24(1):177.

  3. Thorat V, Rao P, Joshi N, Talreja P, Shetty AR. Role of artificial intelligence (AI) in patient education and communication in dentistry. Cureus. 2024;16(5).

  4. Nishi NJ, Pervin S, Shaude S. Dental fear: a narrative review. Int J Res Med Sci. 2022;10(3):777.

  5. Ho JC, Chai HH, Lo EC, Huang MZ, Chu CH. Strategies for effective Dentist-Patient communication: A literature review. Patient preference and adherence. 2024 Dec 31:1385–94.

  6. Nashwan AJ, Abujaber AA, Choudry H. Embracing the future of physician-patient communication: GPT-4 in gastroenterology. Gastroenterol Endoscopy. 2023;1(3):132–5.

  7. Batra P, Dave DM. Revolutionizing healthcare platforms: the impact of AI on patient engagement and treatment efficacy. Int J Sci Res (IJSR). 2024;13(1021275):613–24.

  8. Vassis S, Powell H, Petersen E, Barkmann A, Noeldeke B, Kristensen KD, Stoustrup P, Petersen ED. Large-Language models in orthodontics: assessing reliability and validity of ChatGPT in pretreatment patient education. Cureus. 2024;16(8).

  9. Woodmansey K. Readability of educational materials for endodontic patients. J Endod. 2010;36(10):1703–6.

  10. Ahmed S, Sharma P, Mahaprasad A, Patel A, Chohan H, Murugesan S. Assessing the influence of patient anxiety on the efficacy of endodontic procedures. J Pharm Bioallied Sci. 2024;16(Suppl 3):S2685–7.

  11. Zhang P, Kamel Boulos MN. Generative AI in medicine and healthcare: promises, opportunities, and challenges. Future Internet. 2023;15(9):286.

  12. Müller A, Mertens SM, Göstemeyer G, Krois J, Schwendicke F. Barriers and enablers for artificial intelligence in dental diagnostics: a qualitative study. J Clin Med. 2021;10(8):1612.

  13. Bhattad PB, Pacifico L. Empowering patients: promoting patient education and health literacy. Cureus. 2022;14(7).

  14. Sukalkar SM, Gundavda J, Zore S, Singh P, Shetty V. Bridging the Gap: A holistic framework for Understanding and improving Dentist-patient communication. Int J Multidisciplinary Res. 2024;6(6).

  15. Aydin S, Karabacak M, Vlachos V, Margetis K. Large Language models in patient education: A scoping review of applications in medicine. Front Med;11:1477898.

  16. Robinson CL, D’Souza RS, Yazdi C, Diejomaoh EM, Schatman ME, Emerick T, Orhurhu V. Reviewing the potential role of artificial intelligence in delivering personalized and interactive pain medicine education for chronic pain patients. J Pain Res. 2024 Dec;31:923–9.

  17. Robertson C, Woods A, Bergstrand K, Findley J, Balser C, Slepian MJ. Diverse patients’ attitudes towards artificial intelligence (AI) in diagnosis. PLOS Digit Health. 2023;2(5):e0000237.

  18. Zemni I, Gara A, Nasraoui H, Kacem M, Maatouk A, Trimeche O, Abroug H, Fredj MB, Bennasrallah C, Dhouib W, Bouanene I. The effectiveness of a health education intervention to reduce anxiety in quarantined COVID-19 patients: a randomized controlled trial. BMC Public Health. 2023;23(1):1188.

  19. Setzer FC, Li J, Khan AA. The use of artificial intelligence in endodontics. J Dent Res. 2024;103(9):853–62.

  20. Cui S, Fiscella K, Xiao J. Culturally competent patient-dentist communication influences oral health disparities: what needs to change, why, and how. Quintessence Int. 2024(4):262–3.

  21. Javan R, Cole J, Hsiao S, Cronquist B, Monfared A, Hsiao SK. Integration of AI-generated images in clinical otolaryngology. Cureus. 2024;16(8).

  22. Martin AR. Traditional methods in advance. The CLR James Journal. 2024. https://doi.org/10.5840/clrjames20241219116

  23. Dutta A, Pasricha N, Singh RK, Revanna R, Ramanna PK. Harnessing artificial intelligence in dentistry: enhancing patient care and diagnostic precision. Asian J Den Sci [Internet]. 2024 Dec;7(12):379–86.

  24. Delgado-Ruiz R, Kim AS, Zhang H, Sullivan D, Awan KH, Stathopoulou PG. Generative artificial intelligence (Gen AI) in dental education: opportunities, cautions, and recommendations. J Dent Educ. 2025;89(1):130–6.

  25. Schwendicke F, Mertens S, Cantu AG, Chaurasia A, Meyer-Lueckel H, Krois J. Cost-effectiveness of AI for caries detection: randomized trial. J Dent. 2022;119:104080.

  26. Popkin R, Taylor-Zapata P, Bianchi DW. Physician bias and clinical trial participation in underrepresented populations. Pediatrics. 2022;149(2):e2021054150.

  27. Macapagal J, Calimag MM. Effect of a Video-Based case presentation educational intervention anchored on the theory of planned behavior on adoption of oral health behaviors among dental patients: A Quasi-experimental study. Odovtos-International J Dent Sci. 2022;23(2):148–60.

Acknowledgements

Not applicable.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Author information

Authors and Affiliations

Authors

Contributions

Dr. Shahid Islam is the sole author of this manuscript. He conceptualized, designed, and conducted the study, analyzed the data, and drafted the manuscript.

Corresponding author

Correspondence to Shahid Islam.

Ethics declarations

Ethics approval and consent to participate

This study was approved by the Institutional Review Board (IRB) of Fatima Jinnah Dental College, Karachi, Pakistan via certificate number AUG-2024-OPR03. Written informed consent was obtained from all participants before data collection. All procedures performed in this study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Consent for publication

Not applicable.

Clinical trial number

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Supplementary Material 3

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Islam, S. Evaluating the impact of AI-generated educational content on patient understanding and anxiety in endodontics and restorative dentistry: a comparative study. BMC Oral Health 25, 689 (2025). https://doi.org/10.1186/s12903-025-06069-0

