Hossein Ghayoumi Zadeh, Mostafa Danaeian, Ali Fayazi , Farshad Namdari, Sayed Mohammad Mostafavi Isfahani ,
Volume 76, Issue 1 (April 2018)
Abstract
Background: One common complication of diabetes is diabetic retinopathy, which, if not diagnosed and treated in time, leads to blindness. Retinal image analysis is currently used to diagnose retinopathy. In this study, a model of hierarchical self-organizing neural networks is presented for the detection and classification of retinal lesions in diabetic patients.
Methods: This retrospective cross-sectional study was conducted from December to February 2015 at the AJA University of Medical Sciences, Tehran. The study was performed on the MESSIDOR database, which includes 1200 images of the posterior pole of the eye. Retinal images are classified into 3 categories: mild, moderate and severe. A system consisting of a new hybrid self-organizing map (SOM) classifier is presented for the detection of retinal lesions. The proposed system comprises rapid preprocessing, extraction of lesion features, and finally a classification model. The preprocessing consists of three processes: primary separation of target lesions, separation of the optic disc, and separation of blood vessels from the retina. The second step is the collection of features based on various descriptors, such as morphology, color, light intensity, and moments. The classification uses a model of hierarchical self-organizing maps, named HSOM, proposed to accelerate and increase the accuracy of lesion classification given the high volume of information produced by feature extraction.
Results: The sensitivity, specificity and accuracy of the proposed model for the classification of diabetic retinopathy lesions were 98.9%, 96.77% and 97.87%, respectively.
Conclusion: Cases of diabetes with hypertension are constantly increasing, and one of the main adverse effects of this disease involves the eyes. In this respect, the diagnosis of retinopathy, namely the identification of exudates, microaneurysms and bleeding, is of particular importance. The results show that the proposed model is able to detect lesions in diabetic retinopathy images and classify them with acceptable accuracy. In addition, the results suggest that this method performs acceptably compared to other methods.
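As an aside, the sensitivity, specificity and accuracy figures reported above are standard confusion-matrix metrics. The sketch below shows how they are computed; the counts are invented for illustration and are not the MESSIDOR results.

```python
# Illustrative only: deriving sensitivity, specificity and accuracy
# from confusion-matrix counts. The counts below are hypothetical.
def classification_metrics(tp, tn, fp, fn):
    sensitivity = tp / (tp + fn)            # true positive rate
    specificity = tn / (tn + fp)            # true negative rate
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy

sens, spec, acc = classification_metrics(tp=90, tn=60, fp=2, fn=1)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} accuracy={acc:.3f}")
```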
Glareh Koochakpoor, Firoozeh Hosseini-Esfahani , Maryam Sadat Daneshpour , Parvin Mirmiran , Fereidoun Azizi ,
Volume 76, Issue 10 (January 2019)
Abstract
Background: There are contradictions in the role of genetic variations and food group intake on metabolic syndrome (MetS). This study was aimed at examining the interaction between food groups and CCND2 rs11063069, ZNT8 rs13266634 and MC4R rs12970134 polymorphisms, regarding MetS and its components.
Methods: In this matched nested case-control study (2006-2014), the data of 1634 (817 pairs) case and controls were selected among participants of the Tehran Lipid and Glucose Study (TLGS). The cases and controls were matched by age, sex and number of follow-up years. Dietary intakes were assessed using a valid and reliable food frequency questionnaire. Polymorphisms were genotyped.
Results: A significant interaction was observed between rs12970134 and green vegetable, red meat, and soft drink intakes, in relation to the risk of low high density lipoprotein cholesterol (HDL-C), high triglyceride (TG) and high fasting blood glucose (FBG), respectively (P<0.05). The consumption of vegetables altered the effect of rs11063069 on MetS. Among G allele carriers, being in the highest quartile of vegetable intake was associated with a decreased risk of MetS, compared to those in the lowest quartile (P=0.007), but this trend was not observed in AA genotype carriers. There was also a significant interaction between rs13266634 and salty snack and fish intakes, in relation to the risk of abdominal obesity (P<0.05). Increasing salty snack intake among CT+TT genotype carriers increased the odds ratio of abdominal obesity, while in CC genotype carriers this increase was not observed. A significant interaction was also observed between rs11063069 and other vegetable, red-yellow vegetable and fruit intakes, respectively, regarding the risk of high FBG, low HDL-C and high blood pressure (P<0.05).
Conclusion: The present study demonstrates an interaction between food groups and MC4R, ZNT8 and CCND2 polymorphisms. To reduce the risk of MetS, high-risk allele carriers of rs12970134 should avoid red meat consumption, while high-risk allele carriers of rs11063069 and rs13266634 should consume more vegetables and fish.
Zahra Esfandiari, Mohammad Reza Marasi , Fatemeh Estaki , Vahid Sanati , Elnaz Panahi , Nader Akbari , Roya Alsadat Madani, Jila Mosberian Tanha ,
Volume 77, Issue 1 (April 2019)
Abstract
Background: Nutrition education and the introduction of procedures for choosing healthier food play an important role in reducing the rate of non-communicable diseases. The traffic light on food labelling shows the amounts of risk factors for non-communicable diseases, such as energy, salt, sugar, fat and trans fatty acids. The level of risk is presented through three colors, red, yellow and green, which signal risky, cautious and safe use of the food. The objective of this study was to evaluate the influence of education on the knowledge, attitudes and practices of Isfahan University of Medical Sciences students toward the traffic light on food labeling.
Methods: This project was an empirical study performed by random sampling of 379 students from nine schools of Isfahan University of Medical Sciences from January 2017 to March 2018. The knowledge, attitudes and practices of students toward the traffic light were assessed by a self-administered, structured questionnaire. Education was delivered face to face with the use of a pamphlet. Three to six months later, the questionnaires were completed again by the students to determine knowledge, attitudes and practices. Descriptive statistics were calculated using SPSS and reported as mean±SD. A paired t-test was performed to assess the influence of education on the total scores of knowledge, attitudes and practices in the test-retest. P values less than 0.05 were considered statistically significant.
Results: Before education, the average scores for knowledge, attitude and practice were 1.12±0.84, 14.44±4 and 2.25±2.2, respectively. After education, the scores increased to 11.72±0.75, 18.67±3.18 and 17.69±4.7, respectively. A significant difference was observed in the knowledge, attitude and practice scores of students before and after education (P<0.05).
Conclusion: Education about the traffic light played a significant role in improving the knowledge, attitudes and, to some extent, practices of students in selecting healthier food.
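The test-retest comparison described above rests on the paired t statistic. A minimal sketch of its computation follows; the score vectors are hypothetical, not the study's raw data.

```python
import math

# Minimal sketch of the paired t-test used to compare pre/post scores.
# Input lists hold each participant's score before and after education.
def paired_t(before, after):
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    # sample variance of the differences (n - 1 in the denominator)
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)   # t statistic with n - 1 df
```

In practice one would look up (or compute) the p-value for this statistic against the t distribution with n - 1 degrees of freedom.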
Ali Mohammad Mosadeghrad , Parvaneh Isfahani ,
Volume 77, Issue 6 (September 2019)
Abstract
Background: Unnecessary patient admission to a hospital refers to the hospitalization of a patient without clinical indications and criteria. Various factors related to the patient (e.g., age, disease severity, payment method, and admission route and time), the physician, and the hospital and its facilities and diagnostic technologies affect unnecessary patient admission to a hospital. Unnecessary hospitalization increases nosocomial infections, morbidity and mortality, and decreases patient satisfaction and hospital productivity. This study aimed to measure unnecessary patient admissions in hospitals in Iran.
Methods: This study was conducted using a systematic review and meta-analysis at Tehran University of Medical Sciences in August 2019. Seven electronic databases were searched and evaluated for original research papers published between March 2006 and 2018 on patients’ unnecessary admission to a hospital. Finally, 12 articles were selected and analyzed using comprehensive meta-analysis software.
Results: All studies used the appropriateness evaluation protocol (AEP) for assessing patients’ unnecessary hospitalization. Overall, 2.7% of hospital admissions were rated as inappropriate and unnecessary (95% CI: 1.5-4.9%). The highest rate of unnecessary admissions, 11.8%, was in a teaching hospital in Meshginshahr city in 2016 (95% CI: 8.8-15.8%), and the lowest, 0.3%, was in a teaching hospital in Yasuj city in 2016 (95% CI: 0-3.6%). Unnecessary patient admission in public hospitals was higher than in private hospitals. A statistically significant correlation was observed between unnecessary patient admission and sample size (P<0.05).
Conclusion: The rate of unnecessary hospital admission in Iran is low. However, hospital resources are wasted due to unnecessary admissions. Expanding the primary health care network, reducing hospital beds, introducing an effective and efficient patient referral system, using a fixed provider payment method, and promoting residential and social services care at macro level, and establishing utilization management committee, using the appropriateness evaluation protocol, establishing short-stay units, and implementing quality management strategies at the hospital level are useful strategies for reducing avoidable hospital admissions.
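The pooled prevalence reported above comes from meta-analysis software. As a rough illustration of the underlying idea, the sketch below pools study proportions by inverse-variance weighting on the logit scale; this is a simplified fixed-effect analogue and assumes made-up study counts, not the 12 studies analyzed here (which likely used a random-effects model).

```python
import math

# Simplified sketch: inverse-variance pooling of proportions on the
# logit scale. Input is a list of (events, total) pairs per study.
def pooled_proportion(events_totals):
    num = den = 0.0
    for events, total in events_totals:
        # 0.5 continuity correction guards against zero cells
        e, t = events + 0.5, total + 1.0
        logit = math.log(e / (t - e))
        var = 1.0 / e + 1.0 / (t - e)   # approximate logit variance
        w = 1.0 / var                   # inverse-variance weight
        num += w * logit
        den += w
    pooled_logit = num / den
    return 1.0 / (1.0 + math.exp(-pooled_logit))  # back-transform
```

A real random-effects analysis would additionally estimate between-study heterogeneity (e.g. via DerSimonian-Laird) before weighting.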
Sadegh Norouzi , Fateme Esfandiarpour , Ali Shakouri Rad , Nasim Kiani Yousefzadeh , Zeinab Helalat , Reza Salehi , Mehrnoosh Amin , Farzam Farahmand ,
Volume 77, Issue 8 (November 2019)
Abstract
Background: The amount of anterior tibial translation during rehabilitation exercises is a key factor in organizing the exercise regimen after anterior cruciate ligament injury. Excessive anterior tibial translation could increase the magnitude of tension imposed on injured and reconstructed anterior cruciate ligaments. Forward lunge and open-kinetic knee extension exercises are commonly used in anterior cruciate ligament rehabilitation. However, there is insufficient data about the amount of anterior tibial translation in the eccentric and concentric phases of these exercises. This study compared the amount of anterior tibial translation in the eccentric and concentric phases of the lunge and seated knee extension in anterior cruciate ligament deficient and intact knees.
Methods: Using a non-probability sampling method, 14 men with unilateral anterior cruciate ligament rupture were selected for participation in this cross-sectional study. Participants were recruited from the university’s physiotherapy clinics. A uni-plane fluoroscope was used to image the knee joint while participants performed the forward lunge and open-kinetic knee extension exercises with the intact and injured legs in random order. Fluoroscopy imaging was performed in the radiology center at Sina Hospital, Tehran, Iran, from September 2013 to February 2014. A two-factor mixed ANOVA was used to analyze the data.
Results: There were no significant differences in the anterior tibial translation between the limbs and contraction phases during the lunge exercise. During open-kinetic knee extension, the anterior tibial translation in anterior cruciate ligament deficient knees was significantly greater than that of healthy knees at 0° (P=0.007). The anterior tibial translation in the eccentric phase of open-kinetic knee extension at flexion angles of 0° (P=0.049) and 15° (P=0.024) was significantly greater than that in the concentric phase.
Conclusion: In the lunge exercise, the amount of anterior tibial translation was similar between the eccentric and concentric phases and between the intact and anterior cruciate ligament deficient knees. However, during the open-kinetic knee extension exercise, anterior tibial translation in the eccentric phase was greater than in the concentric phase, and in the anterior cruciate ligament deficient knees it was greater than in the intact knees, at 0-15° angles.
Ali Mohammad Mosadeghrad , Parvaneh Isfahani, Taraneh Yousefinezhadi,
Volume 78, Issue 4 (July 2020)
Abstract
Background: Medical errors are mistakes committed by healthcare professionals through errors of omission, errors in planning, or errors in the execution of a planned healthcare action, whether or not harmful to the patient. Medical errors in hospitals increase morbidity and mortality and decrease patient satisfaction and hospital productivity. This study aimed to determine the prevalence of medical errors in Iranian hospitals.
Methods: This study was conducted using systematic review and meta-analysis approaches. All articles written in English and Persian on the prevalence of medical errors in Iranian hospitals up to March 2019 were searched in Web of Science, PubMed, Elsevier, Scopus, Magiran, IranMedex and Scientific Information Database (SID) databases, and Google and Google Scholar search engines. In addition, reference lists of the retrieved papers were hand-searched. A total of 9 studies matching the inclusion criteria were identified, reviewed, and analyzed using comprehensive meta-analysis software.
Results: The prevalence of medical errors was reported in 9 studies, with rates ranging from 0.06% to 42%. Most studies (67%) used reporting forms completed by hospital employees to determine the prevalence of medical errors. Only three studies collected data by reviewing patients’ medical records. Accordingly, the overall prevalence of medical errors in Iran's hospitals based on the nine published articles was 0.01% (95% CI: 0-0.01%) during 2008 to 2017. The highest medical error rate, 2.1% (95% CI: 1.4-2.7%), was recorded in a hospital in Shiraz in 2012. A statistically significant correlation was observed between medical errors and sample size (P<0.05).
Conclusion: The reported prevalence of medical errors in Iran is low. It is strongly recommended to use more advanced and valid methods such as occurrence reporting, screening, and the global trigger tool for examining medical errors in Iranian hospitals. Providing adequate education and training to patients and employees, simplifying and standardizing hospital processes, enhancing hospital information systems, improving communication, promoting a safety culture, improving employees’ welfare and satisfaction, and implementing quality management strategies are useful for reducing medical errors.
Elham Hoseinnezhad Zarghani, Ghazale Geraily, Mahbod Esfahani, Mostafa Farzin,
Volume 78, Issue 7 (October 2020)
Abstract
Background: Total body irradiation (TBI) is a technique commonly used as part of the patient conditioning regimen before bone marrow transplantation (BMT). The purpose of this study is to introduce and implement a practical TBI technique on a human-like phantom at Imam Khomeini Hospital in Tehran.
Methods: The present experimental study was conducted from October 2016 to November 2017 to implement the TBI technique at the Cancer Institute of Imam Khomeini Hospital in Tehran. For this purpose, percentage depth dose and dose rate were measured under TBI conditions (i.e. SSD=310 cm, field size=40×40 cm², gantry angle=90°, and collimator angle=45°) in a homogeneous phantom. Gafchromic EBT3 films were used to measure the absorbed dose in different areas of the human-like phantom at the levels of the head, neck, thyroid, lung, umbilicus, pelvis, thigh, knee and leg. Phantom irradiation was performed in parallel-opposed anterior-posterior geometry using an 18 MV photon beam produced by a Varian 2100C/D. Cerrobend blocks were used for lung protection. After analyzing the exposed films with ImageJ software, the dose uniformity was calculated.
Results: Dose distribution uniformity was in the range of -1.01% to +11.82% relative to the prescribed dose at the umbilicus. The difference between the calculated and measured dose at the umbilicus level was -2.73%. The radiation dose absorbed by the lung with blocks was 127.53 cGy in one fraction, which results in 756.18 cGy over six fractions.
Conclusion: The implemented technique obtained the acceptable ±10% dose uniformity in most body regions. The dosing accuracy was within the acceptable range. The lungs' dose was reduced to the desired level using lung shields. This technique is a simple and cost-effective method that does not require complicated dosimetric techniques. Based on the obtained results, the proposed technique meets the necessary conditions for implementation in Imam Khomeini Hospital in Tehran.
Zahra Esfandiari, Fatemeh Amani, Meraj Pourhossein, Hedayat Hosseini,
Volume 78, Issue 12 (March 2021)
Abstract
The development of industry and technology, changes in agriculture, trade and global travel, and the adaptation of microorganisms are important factors in the occurrence of emerging diseases. Currently, the world is facing a pandemic caused by an emerging virus, the novel coronavirus (COVID-19), which appeared in 2020. Within six months, this disease infected more than one million people worldwide and killed more than five hundred thousand. COVID-19 causes death in patients with respiratory problems of varying severity. Fever, soreness, dry cough, shortness of breath, runny nose, and nasal congestion were observed in coronavirus-infected individuals, with fever among the most common symptoms. Other unusual signs, such as diarrhea and nausea, were also reported. Bats in China were first identified as the host of the novel coronavirus. Therefore, identifying the initial route of transmission of the novel coronavirus is necessary to prevent its occurrence and widespread distribution. The virus enters the human body through respiratory particles as well as by touching surfaces contaminated by nasal, mouth and eye secretions. Viruses are obligate intracellular pathogens that need host cells to survive. These microorganisms cannot proliferate in foods and require live cells to exist. Food can, however, act as a carrier of viruses to the consumer. There have been no reports of novel coronavirus transmission through food. Nevertheless, it is important to observe the principles of health and safety by assuming that the virus could spread through food contamination. Given the presence and proliferation of the novel coronavirus in the gastrointestinal tract, the aerosolization of this microorganism from feces, and the possibility of its re-transmission to people from various environmental sources, the most important priority is to remove the virus from food environments.
It is also important to update the methods of disinfecting surfaces, especially high-touch areas, as well as personal hygiene practices. Therefore, it is recommended to educate staff about managing the novel coronavirus and improving health guidelines. Furthermore, keeping distance and washing hands are priorities in different food-related environments.
Zohreh Ghoreishi, Ali Esfahani, Shima Asgarzad, Laleh Payahoo, Fatemeh Hajizadeh-Sharafabad ,
Volume 79, Issue 10 (January 2022)
Abstract
Background: Among all types of cancer, pancreatic cancer has a poor prognosis, with 5-year survival below 10%. In theory, alcohol intake may be a modifiable risk factor for pancreatic cancer due to its role in multiple carcinogenic and metabolic signaling pathways. In addition, alcohol consumption may lead to chronic pancreatitis, which is an underlying cause of pancreatic cancer. However, little is known about whether this factor is associated with pancreatic cancer. This study aimed to systematically review the cohort studies investigating the possible link between alcohol consumption and the morbidity or mortality of pancreatic cancer.
Methods: This study was carried out based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). All cohort studies that assessed the association between alcohol intake and the risk of pancreatic cancer or death were included in this systematic review, without language restriction. Electronic databases including PubMed, Web of Science, Scopus, and Google Scholar were searched using the keywords "pancreatic cancer" and "alcohol" and similar terms, from 1990 to April 2021, to find the cohort studies.
Results: A total of 858 articles were identified, of which 806 were excluded; the full texts of 52 papers were evaluated for eligibility. Eventually, 22 articles were eligible and included in this study. Many of the articles assessed the impact of low to moderate alcohol intake. A comprehensive review of these studies showed that low to moderate alcohol consumption had a non-significant correlation with pancreatic cancer, while high alcohol consumption was significantly associated with the risk of pancreatic cancer or death. The results also revealed that high liquor consumption was associated with a higher risk of pancreatic cancer. Nevertheless, the follow-up durations in most of these studies were shorter than the time needed for pancreatic cancer to develop.
Conclusion: Long-term heavy alcohol drinking can increase the morbidity and mortality of pancreatic cancer. Given that several genetic and environmental variations are involved in the pathogenesis of this cancer, these differences should be controlled simultaneously to determine the net effect of alcohol drinking on pancreatic cancer.
Ali Mohammad Mosadeghrad , Parvaneh Isfahani ,
Volume 79, Issue 12 (March 2022)
Abstract
Mohammad Nasr Esfahani , Aref Javari, Farhad Heydari, Majid Javari,
Volume 80, Issue 4 (July 2022)
Abstract
Background: Previous studies have shown that several factors affect the outcome of cardiopulmonary resuscitation. In this study, we evaluated the factors associated with the outcome of resuscitation in in-hospital cardiac arrest (IHCA) patients.
Methods: This cross-sectional study with non-probability sampling was performed on patients with in-hospital cardiopulmonary arrest between 2015 and 2020 in the emergency department (ED) of Al-Zahra Hospital, Isfahan, Iran. Data were collected from medical records to describe patient characteristics, arrest profile, and survival details. Factors associated with the dependent variable were examined using logistic regression.
Results: Among 848 in-hospital cardiopulmonary arrests, 18 patients (2.1%) survived and were discharged from the hospital. The mean age of patients was 62.74±21.17 years; 583 (68.8%) were male and 265 (31.2%) were female. The mean age of patients with successful resuscitation was 62.33±21.79 years (range 6 to 116 years) and of those with unsuccessful resuscitation 61.58±21.20 years (range 1 month to 108 years). The rate of unsuccessful resuscitation increased with increasing age (P=0.04) and with the presence of an underlying disease (P=0.01). Among the services, emergency medicine performed the most resuscitations, 633 (57.3%), of which 22.9% achieved return of spontaneous circulation (ROSC). In the anesthesia service, both of the 2 resuscitations performed were successful. In the general surgery service, 36.5% of 63 resuscitations were successful, and the success rate in the neurosurgery service was 32.4% of 102 resuscitations. The duration of resuscitation was also important: the average time in successful resuscitations was 18.98 minutes, compared with 39.20 minutes in unsuccessful ones. The maximum and minimum times for successful resuscitations were 63 and 1 minutes, and for unsuccessful resuscitations 60 and 10 minutes.
Conclusion: The results showed that several factors were influential in cardiopulmonary resuscitation. Increasing age and underlying disease reduced the success of cardiopulmonary resuscitation.
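The analysis described above fits a logistic regression of outcome on patient factors. The sketch below is a minimal, self-contained illustration of that model fitted by gradient descent; the data, predictor coding and hyperparameters are invented, and a real analysis would use standard statistical software.

```python
import math

# Hypothetical sketch: logistic regression of resuscitation outcome
# (1 = survival/ROSC, 0 = failure) on patient predictors, fitted by
# plain batch gradient descent on the mean log-loss.
def fit_logistic(X, y, lr=0.1, steps=2000):
    n_feat = len(X[0])
    w = [0.0] * n_feat
    b = 0.0
    for _ in range(steps):
        gw = [0.0] * n_feat
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            err = p - yi                     # gradient of log-loss wrt z
            for j in range(n_feat):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b
```

The fitted coefficient for a predictor (e.g. age) can be exponentiated to give an odds ratio, which is how such associations are usually reported.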
Majid Zamani, Masoudeh Babakhanian , Farhad Heydari , Mohammad Nasr-Esfahani , Mohammad Mahdi Zarezadeh ,
Volume 80, Issue 7 (October 2022)
Abstract
Background: In addition to heart disease, the ECG also changes in non-cardiac conditions, which, due to this similarity, can lead to misdiagnosis of heart disease. ECG changes in brain lesions, such as ischemic and hemorrhagic strokes and brain trauma, have been studied in many articles, but the effect of brain midline shift on ECG changes has not. In this study, we examined these changes.
Methods: This is a prospective cross-sectional descriptive study. Patients with brain tumors who were referred to Al-Zahra and Kashani hospitals in Isfahan from April 2019 to March 2021 were selected. Patients with a history of heart disease, patients receiving medications that cause ECG changes, patients with ECG changes due to non-cardiac and non-cerebral causes, and individuals under 15 years of age were not included in the study. Patients whose ECG changes were due to electrolyte disturbances or acute heart problems were also excluded. After obtaining informed consent from patients, a CT scan or brain MRI was taken and patients were divided into two groups, with and without midline shift. The ECG was then taken and ECG changes (T wave, ST segment, QTc interval, QRS prolongation) were compared between the two groups of brain tumors with and without midline shift.
Results: A total of 136 patients were included in the study; 69 were in the group without midline shift and 67 in the midline shift group. In the midline shift group, 3% of patients had ST segment changes and 23.9% had T wave changes, versus 1.4% and 10.1%, respectively, in the group without midline shift. The mean QTc interval in the groups without and with midline shift was 338.26 (±28.438) and 388.66 (±37.855) ms, respectively, and the mean QRS duration was 86.09 (±9.88) ms in the group without midline shift and 94.63 (±12.83) ms in the midline shift group.
Conclusion: Brain midline shifts can cause QRS widening, QTc interval prolongation, and T-wave changes in patients' ECGs.
Ali Mohammad Mosadeghrad, Parvaneh Isfahani,
Volume 82, Issue 11 (February 2025)
Abstract
Zakieh Vahedian Ardakani , Mehran Zarei-Ghanavati , Hamid Riazi-Esfahani , Seyed Mehdi Tabatabaei , Mohammad Reza Mehrabi Bahar, Sadegh Ghafarian, Ahmad Masoomi,
Volume 83, Issue 1 (April 2025)
Abstract
Artificial intelligence (AI) has emerged as a transformative force in modern medicine, with ophthalmology standing at the forefront of its clinical integration. Among ophthalmic disorders, glaucoma—a leading cause of irreversible blindness worldwide—presents unique opportunities and challenges for AI-based solutions due to its chronic, progressive nature and reliance on multimodal data, including structural and functional assessments. This review article offers a comprehensive synthesis of the current and emerging roles of AI in the detection, monitoring, and management of glaucoma. AI algorithms, particularly deep learning and machine learning models, have demonstrated exceptional capabilities in interpreting fundus photographs, optical coherence tomography (OCT) images, and visual field data to identify glaucomatous damage. These systems often approach or even exceed the diagnostic performance of human experts. Moreover, AI has shown significant promise in facilitating large-scale population-based screening, improving early detection rates, and addressing disparities in access to subspecialty care, particularly in low-resource and remote settings. In the monitoring of disease progression, AI tools are being developed to detect subtle structural or functional changes over time, predict future visual outcomes, and support more precise and individualized treatment decisions. Despite these advancements, the widespread clinical adoption of AI in glaucoma care faces several critical barriers. Key limitations include poor generalizability of models across diverse populations, imaging devices, and clinical settings; scarcity of well-annotated, high-quality, and demographically representative datasets; and a lack of transparency and interpretability in algorithmic decision-making—commonly referred to as the “black box” problem. 
Ethical concerns, regulatory uncertainty, integration challenges within existing healthcare infrastructures, and medico-legal accountability also require thoughtful resolution before AI can be reliably deployed in clinical practice. This review critically evaluates the strengths, limitations, and real-world potential of AI technologies in glaucoma. It provides clinicians, researchers, and healthcare policymakers with a balanced and up-to-date perspective, highlighting promising avenues for future research, including explainable AI, federated learning, multi-modal data integration, and longitudinal validation studies. By fostering a deeper understanding of both the opportunities and challenges associated with AI, this article aims to guide the responsible, equitable, and evidence-based integration of AI into comprehensive glaucoma care.