Roghayyeh Borji , Mohammad Reza Khatami, Mohammad Reza Abbasi , Alipasha Meysamie , Khosro Barkhordari , Farah Ayatollah Esfahani, Mina Pashang, Laleh Ghadirian ,
Volume 71, Issue 12 (March 2014)
Abstract
Background: Mortality after Coronary Artery Bypass Graft (CABG) surgery is more common in patients with chronic renal failure than in the general population. This study evaluates the impact of prophylactic dialysis on decreasing mortality and morbidity in non-dialysis-dependent patients with renal failure after CABG surgery.
Methods: In this study, fifty non-dialysis-dependent patients suffering from renal failure who needed CABG were selected by convenience sampling. They were then allocated to prophylactic dialysis (n=20) and no prophylactic dialysis (n=30) groups using a randomized block design. Exclusion criteria were age under 18 years and undergoing CABG for the second time. Mortality and complications such as acute renal failure, cerebrovascular accident and atrial arrhythmias were compared between the two groups after CABG. All cardiac surgeries were performed in a single centre through a median sternotomy. A P value less than 0.05 was considered significant.
Results: The mean age of patients was 65.3±9.9 years. The patients included 16% (n=8) women and 84% (n=42) men, with 20 patients in the intervention group and 30 in the control group. Baseline characteristics were similar in the two groups. Comparison between the intervention and control groups after surgery showed no difference in mortality (P=0.14), acute renal failure (P=0.4), cerebrovascular accidents (P=1), atrial arrhythmias (P=0.3), reoperation due to bleeding (P=1), need for dialysis (P=0.14), rehospitalization (P=1), duration of ventilator use (P=0.4), duration of hospitalization (P=0.11), length of stay in the Intensive Care Unit (P=0.4) or deep sternal infection (P=0.7).
Conclusion: According to the results of this study, prophylactic dialysis before CABG has no significant effect on mortality or other complications, with the exception of lung complications, in non-dialysis-dependent patients with renal failure.
Azadeh Meamarian , Shayesteh Ashrafi Esfahani , Shahrokh Mehrpisheh , Atoosa Mahdavi Saeedi , Kamran Aghakhani ,
Volume 73, Issue 3 (June 2015)
Abstract
Background: The relationship of the base of the appendix to the cecum remains constant, whereas the tip can be found in a retrocecal, pelvic, subcecal, preileal, or right pericolic position. These anatomic considerations have significant clinical importance in the context of acute appendicitis. Knowledge of the correct anatomical position of the appendix may facilitate an accurate diagnosis of appendicitis and contribute to a better prognosis and earlier treatment. The present study aimed to determine the anatomical location of the appendix in Iranian cadavers.
Methods: This descriptive cross-sectional study was conducted on 200 cadavers who were referred to the Forensic Center of Tehran from March to September 2013. The data including age, sex, weight, and appendix length and position were collected and analyzed using SPSS software, version 16 (SPSS, Inc., Chicago, IL, USA).
Results: In the present study, 200 cadavers were evaluated, of which 173 (86.5%) were male and 26 (13%) female, and the mean age was 39.96±16.31 (SD) years. The mean wall thickness of the appendix was 9.78±16.31 (SD) cm. The mean appendix length was 9.86±1.79 (SD) cm in men and 9.30±1.56 (SD) cm in women. The appendix was long in 20 cadavers (10%), short in 3 cadavers (1.5%), and of moderate length in 177 cadavers (88.5%). The appendix position was posterior in 120 (60%), ectopic in 32 (16%), and pelvic in 48 (24%) cadavers.
Conclusion: The majority of appendices examined in the present study were in the posterior (retrocecal) position. Given the varying position of the appendix across populations and races, knowledge of its position in a given population is necessary for early diagnosis and treatment and for fewer complications of related disease.
Zahra Esfandiari , Mohammad Jalali , Leila Safaeian, J Scott Weese ,
Volume 74, Issue 5 (August 2016)
Abstract
Clostridium difficile (C. difficile) is an important cause of gastrointestinal disease, driven by irrational antibiotic prescription and antimicrobial resistance. In the past, this bacterium was regarded as an agent of hospital infection, termed "hospital-acquired Clostridium difficile infection". This infection is a major cause of morbidity and mortality internationally. In recent years, however, a change in the epidemiology of the infection has been observed: people who had neither taken antibiotics nor had any contact with the healthcare system were hospitalized with the infection, now termed "community-associated Clostridium difficile infection". Furthermore, hypervirulent strains of C. difficile have been identified outside healthcare facilities in sources such as the environment, animals and food products. To date, the role of C. difficile as a zoonotic agent or foodborne pathogen has not been confirmed. Nevertheless, attention should be paid to susceptible individuals, such as pregnant women, the elderly and children, regarding the consumption of food products contaminated with C. difficile spores and the possible resulting infection. For this purpose, guidelines or prevention strategies against transmission of the bacterium in the community as well as in healthcare facilities are important. In this review, the history and risk factors of the disease and reports of infection inside and outside healthcare facilities in Iran are discussed. Finally, based on the isolation of C. difficile with genetic profiles in Iran that differ from international ribotypes, we hypothesize that native strains may cause infection in the community and in healthcare facilities. This hypothesis highlights the significance of regional differences in the epidemiology and microbiology of the disease.
In addition, given current reports of irrational antibiotic prescription in our country, C. difficile infection appears to be increasing, yet no continuous monitoring is in place for its supervision in Iran. Confirming these hypotheses requires careful and continuous assessment, together with comprehensive examination of the molecular epidemiology of the disease by health-related organizations in Iran.
Homa Mohseni Kouchesfahani , Somayeh Ebrahimi-Barough , Jafar Ai , Azam Rahimi ,
Volume 74, Issue 12 (March 2017)
Abstract
Background: The small molecule purmorphamine (PMA) is an agonist of the Smoothened protein in the Sonic hedgehog (Shh) signaling pathway. The effect of purmorphamine on the differentiation of mesenchymal cells into bone tissue has been studied previously. Shh promotes neural differentiation, and the differentiated cells express specific neural markers. Neurofilament (NF) and choline acetyltransferase (ChAT) are specific markers of motor neurons, and their expression in differentiated cells indicates conversion into motor neurons. The aim of this study was to evaluate the ability of PMA to differentiate human endometrial stem cells (hEnSCs) into motor neurons.
Methods: This analytical study was done in a Tehran University of Medical Sciences laboratory in September 2015. hEnSCs were enzymatically extracted from endometrial tissue. After the third passage, flow cytometry was performed for mesenchymal stem cell markers. The mesenchymal stem cells were divided into control and differentiation groups. DMEM/F12 with 10% FBS was added to the culture medium of the control group, and the differentiation group was treated with differentiation medium containing N2, PMA, DMEM/F12, FBS, B27, IBMX, 2ME, FGF2, RA and BDNF. After 21 days, immunocytochemistry (ICC) was performed for the expression of NF and ChAT proteins, and real-time PCR analysis for the expression of neural markers such as NF, ChAT and Nestin, and GFAP (as a glial marker), at the mRNA level.
Results: The flow cytometry analysis showed that hEnSCs were positive for the mesenchymal markers CD90, CD105 and CD146 and negative for the endothelial marker CD31 and the hematopoietic marker CD34. The immunocytochemistry and real-time PCR results showed that the cells treated with PMA expressed the motor neuron markers NF and ChAT.
Conclusion: According to the results of this study, it can be concluded that the small molecule PMA has the potential to induce differentiation of hEnSCs into neural cells, specifically motor neurons, by activating the Shh signaling pathway.
Nasim Ghazavi-Khorasgani , Elham Janghorban-Laricheh , Marziyeh Tavalaee , Dina Zohrabi , Homayon Abbasi , Mohammad Hossein Nasr- Esfahani,
Volume 75, Issue 6 (September 2017)
Abstract
Background: Post-acrosomal sheath WW domain-binding protein (PAWP) is a sperm factor related to oocyte activation and is expressed in elongating spermatids. The effects of elevated testicular temperature in infertile men with varicocele on semen quality, sperm DNA damage, and gene and protein expression have been reported previously. In this study, expression of PAWP at the RNA and protein levels, as well as sperm DNA damage, was compared between fertile men and infertile men with varicocele.
Methods: In this case-control study, semen samples were collected from 35 infertile men with varicocele (grade II & III) and 20 fertile men referred to Isfahan Fertility and Infertility Center from January 2016 to September 2016. Ejaculated semen was obtained by masturbation into a sterile plastic container after 3-5 days of abstinence and was allowed to liquefy at room temperature. Briefly, sperm concentration, motility and morphology were evaluated using a sperm counting chamber (Sperm meter; Sperm Processor, Aurangabad, India), Computer-Assisted Semen Analysis (CASA, Video Test, Ltd: version Sperm 2.1, Russia) and Papanicolaou staining, respectively. In addition, DNA fragmentation was assessed, and expression of PAWP at the RNA and protein levels was evaluated by real-time PCR and Western blot, respectively. Microsoft Excel 2013 (Microsoft Corp., Redmond, WA, USA) and the Statistical Package for the Social Sciences (SPSS) were used to analyze the data. The independent-samples t-test was used to compare means between groups. Differences with P<0.05 were considered statistically significant.
Results: Mean semen volume, sperm concentration, percentage of sperm motility, and expression of PAWP at both the RNA (P=0.0001) and protein (P=0.03) levels were significantly lower in infertile men with varicocele than in fertile men. In addition, the mean percentages of abnormal sperm morphology and sperm DNA fragmentation were significantly higher in infertile men with varicocele than in fertile men (P<0.05).
Conclusion: Expression of PAWP, a protein involved in spermatogenesis and fertilization, is decreased in infertile men with varicocele. In addition, sperm DNA integrity was disrupted in these individuals, which can be considered a main etiology of infertility.
Sajad Shafai , Elham Moslemi , Mehdi Mohammadi , Kasra Esfahani , Amir Izadi ,
Volume 75, Issue 10 (January 2018)
Abstract
Background: Prostate cancer is one of the most common diseases affecting men. Although prostate cancer is not fatal in most cases, identifying factors useful for early diagnosis and treatment is essential. Research has shown that KLK2 combined with PSA can be a good biomarker for diagnosing prostate cancer. In prostate cancer, expression of the KLK2 gene increases, which can be used as a biomarker. The aim of this study was to assess KLK2 gene expression as a potential factor in prostate cancer diagnosis.
Methods: In this case-control study, 50 urine samples from prostate cancer patients and 50 urine samples from normal individuals referred to Mehr Hospital of Tehran (from December 2014 to February 2016) were obtained and stored in the central research laboratory of Shahid Beheshti University of Medical Sciences, Tehran, until testing. The ages of the sampled individuals ranged from 46 to 71 years. RNA was extracted from the samples, and cDNA was synthesized using M-MuLV enzyme, oligo(dT) and random hexamer primers. KLK2-specific primers were designed with Primer Express software, version 3.0 (Applied Biosystems, Foster City, CA, USA), and KLK2 gene expression was evaluated using the ΔΔCt method.
Results: Compared with the normal samples, the mean increase in KLK2 gene expression was 2.32 in patients younger than 50 years and 5.79 in patients older than 50 years (P<0.0001). In addition, with respect to the Gleason grading system (GS), patients with GS6 had the lowest gene expression (3.40) and patients with GS8 the highest (10.74) in comparison with the normal group (P<0.0001).
Conclusion: Expression of the KLK2 gene is higher in people with prostate cancer than in healthy individuals. According to the results, KLK2 can be considered a useful factor in prostate cancer, whose expression is associated with progression and development of the disease.
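The ΔΔCt relative-quantification step described in the methods can be sketched as below. This is only an illustrative sketch of the standard Livak method, with invented Ct values and a hypothetical reference gene; it is not the study's actual code or data.

```python
# Hedged sketch of the 2^-ΔΔCt relative-quantification method.
# All Ct values below are illustrative, not taken from the study.

def fold_change(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Livak 2^-ΔΔCt fold change of a target gene (e.g. KLK2) relative
    to a reference gene, comparing a case sample with a control sample."""
    d_ct_case = ct_target_case - ct_ref_case   # ΔCt in the patient sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl   # ΔCt in the normal sample
    dd_ct = d_ct_case - d_ct_ctrl              # ΔΔCt
    return 2 ** (-dd_ct)

# Example: the target amplifies 2 cycles earlier (relative to the
# reference) in the patient sample -> 4-fold higher expression.
print(fold_change(24.0, 18.0, 26.0, 18.0))  # -> 4.0
```

Each one-cycle decrease in ΔΔCt corresponds to a doubling of relative expression, which is why fold changes like the 5.79 reported below come out of fractional Ct differences.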
Hossein Ghayoumi Zadeh, Mostafa Danaeian, Ali Fayazi , Farshad Namdari, Sayed Mohammad Mostafavi Isfahani ,
Volume 76, Issue 1 (April 2018)
Abstract
Background: A common complication of diabetes is diabetic retinopathy, which, if not diagnosed and treated in time, leads to blindness. Retinal image analysis is currently used to diagnose retinopathy. In this study, a model of hierarchical self-organizing neural networks is presented for the detection and classification of retinal lesions in diabetic patients.
Methods: This retrospective cross-sectional study was conducted from December to February 2015 at the AJA University of Medical Sciences, Tehran. The study used the MESSIDOR database, which includes 1200 images of the posterior pole of the eye. Retinal images were classified into three categories: mild, moderate and severe. A system based on a new hybrid SOM classifier is presented for the detection of retinal lesions. The proposed system comprises rapid preprocessing, extraction of lesion features, and finally a classification model. The preprocessing consists of three processes: primary separation of target lesions, separation of the optic disc, and separation of blood vessels from the retina. The second step collects features based on various descriptors, such as morphology, color, light intensity, and moments. The classification uses a model of hierarchical self-organizing networks, named HSOM, proposed to accelerate and increase the accuracy of lesion classification given the high volume of information produced by feature extraction.
Results: The sensitivity, specificity and accuracy of the proposed model for the classification of diabetic retinopathy lesions were 98.9%, 96.77% and 97.87%, respectively.
Conclusion: Cases of diabetes with hypertension are constantly increasing, and one of the main adverse effects of this disease involves the eyes. In this respect, the diagnosis of retinopathy, namely the identification of exudates, microaneurysms and bleeding, is of particular importance. The results show that the proposed model is able to detect lesions in diabetic retinopathy images and classify them with acceptable accuracy. In addition, the results suggest that this method performs acceptably compared with other methods.
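The three reported metrics follow from a binary confusion matrix in the standard way. The sketch below is generic, with invented counts, and is not the study's evaluation code.

```python
# Generic sketch of how sensitivity, specificity and accuracy are
# computed from a binary confusion matrix. The counts are illustrative,
# not taken from the study.

def metrics(tp, fp, tn, fn):
    """Classification metrics from true/false positive/negative counts."""
    sensitivity = tp / (tp + fn)              # true-positive rate
    specificity = tn / (tn + fp)              # true-negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

sens, spec, acc = metrics(tp=90, fp=3, tn=97, fn=10)
print(sens, spec, acc)  # -> 0.9 0.97 0.935
```

Note that accuracy alone can mislead on imbalanced data, which is why papers of this kind report all three figures.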
Glareh Koochakpoor, Firoozeh Hosseini-Esfahani , Maryam Sadat Daneshpour , Parvin Mirmiran , Fereidoun Azizi ,
Volume 76, Issue 10 (January 2019)
Abstract
Background: There are contradictions in the role of genetic variations and food group intake on metabolic syndrome (MetS). This study was aimed at examining the interaction between food groups and CCND2 rs11063069, ZNT8 rs13266634 and MC4R rs12970134 polymorphisms, regarding MetS and its components.
Methods: In this matched nested case-control study (2006-2014), 1634 participants (817 case-control pairs) were selected from the Tehran Lipid and Glucose Study (TLGS). Cases and controls were matched by age, sex and number of follow-up years. Dietary intakes were assessed using a valid and reliable food frequency questionnaire, and the polymorphisms were genotyped.
Results: A significant interaction was observed between rs12970134 and green vegetable, red meat, and soft drink intakes in relation to the risk of low high-density lipoprotein cholesterol (HDL-C), high triglyceride (TG) and high fasting blood glucose (FBG), respectively (P<0.05). Vegetable consumption modified the effect of rs11063069 on MetS: among G allele carriers, those in the highest quartile of vegetable intake had a decreased risk of MetS compared with those in the lowest quartile (P=0.007), but this trend was not observed in AA genotype carriers. There was also a significant interaction between rs13266634 and salty snack and fish intakes in relation to the risk of abdominal obesity (P<0.05): increasing salty snack intake increased the odds of abdominal obesity in CT+TT genotype carriers, but not in CC genotype carriers. A significant interaction was also observed between rs11063069 and other vegetables, red-yellow vegetables, and fruit intakes, respectively, regarding the risk of high FBG, low HDL-C and high blood pressure (P<0.05).
Conclusion: The present study demonstrates interactions between food groups and the MC4R, ZNT8 and CCND2 polymorphisms. To reduce the risk of MetS, high-risk allele carriers of rs12970134 should avoid red meat consumption, while high-risk allele carriers of rs11063069 and rs13266634 should consume vegetables and fish.
Zahra Esfandiari, Mohammad Reza Marasi , Fatemeh Estaki , Vahid Sanati , Elnaz Panahi , Nader Akbari , Roya Alsadat Madani, Jila Mosberian Tanha ,
Volume 77, Issue 1 (April 2019)
Abstract
Background: Nutrition education and procedures for choosing healthier food play an important role in reducing the rate of non-communicable diseases. The amounts of risk factors for non-communicable diseases, such as energy, salt, sugar, fat and trans fatty acids, are shown on the traffic light of food labelling. The risk status is presented in three colors, red, yellow and green, signalling risky, cautious and safe use of the food. The objective of this study was to evaluate the influence of education on the knowledge, attitudes and practices of Isfahan University of Medical Sciences students toward the traffic light on food labeling.
Methods: This empirical study was performed by random sampling of 379 students from nine schools of Isfahan University of Medical Sciences from January 2017 to March 2018. The knowledge, attitudes and practices of students toward the traffic light were assessed by a self-administered, structured questionnaire. Education was performed face to face with the use of a pamphlet. Three to six months later, the questionnaires were filled out again by the students to determine knowledge, attitude and practice. Descriptive statistics were calculated using SPSS and reported as mean±SD. A paired t-test was performed to assess the influence of education on the total scores of knowledge, attitudes and practices in the test-retest. P<0.05 was considered statistically significant.
Results: Before education, the mean scores for knowledge, attitude and practice were 1.12±0.84, 14.44±4 and 2.25±2.2, respectively. After education, the scores increased to 11.72±0.75, 18.67±3.18 and 17.69±4.7. A significant difference was observed in the knowledge, attitude and practice scores of students before and after education (P<0.05).
Conclusion: Education about the traffic light had a significant role in improving the knowledge, attitudes and, to some extent, the practices of students in selecting healthier food.
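The paired t-test used in the test-retest design can be sketched as below; the scores are invented for illustration and do not reproduce the study's data.

```python
# Hedged sketch of a paired (matched-samples) t-test, the method the
# study uses to compare scores before and after education. The score
# lists below are invented, not the study's data.
import math

def paired_t(before, after):
    """Return the paired t statistic for two matched samples."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample variance of the paired differences (n - 1 denominator)
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

before = [1, 2, 1, 0, 2, 1]       # hypothetical pre-education scores
after = [11, 12, 10, 12, 11, 12]  # hypothetical post-education scores
print(round(paired_t(before, after), 2))  # -> 21.3
```

A t statistic this far above the critical value (about 2.57 for 5 degrees of freedom at α=0.05) corresponds to the kind of highly significant pre/post difference the results report.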
Ali Mohammad Mosadeghrad , Parvaneh Isfahani ,
Volume 77, Issue 6 (September 2019)
Abstract
Background: Unnecessary patient admission refers to the hospitalization of a patient without clinical indications and criteria. Various factors related to the patient (e.g., age, disease severity, payment method, and admission route and time), the physician, and the hospital and its facilities and diagnostic technologies affect unnecessary patient admission. Unnecessary hospitalization increases nosocomial infections, morbidity and mortality, and decreases patient satisfaction and hospital productivity. This study aimed to measure unnecessary patient admissions in hospitals in Iran.
Methods: This systematic review and meta-analysis was conducted at Tehran University of Medical Sciences in August 2019. Seven electronic databases were searched for original research papers published between March 2006 and 2018 on patients' unnecessary admission to hospital. Finally, 12 articles were selected and analyzed using comprehensive meta-analysis software.
Results: All studies used the Appropriateness Evaluation Protocol (AEP) to assess unnecessary hospitalization. Overall, 2.7% of hospital admissions were rated as inappropriate and unnecessary (95% CI: 1.5-4.9%). The highest rate of unnecessary admissions, 11.8% (95% CI: 8.8-15.8%), was found in a teaching hospital in Meshginshahr city in 2016, and the lowest, 0.3% (95% CI: 0-3.6%), in a teaching hospital in Yasuj city in 2016. Unnecessary patient admission was higher in public hospitals than in private hospitals. A statistically significant correlation was observed between unnecessary patient admission and sample size (P<0.05).
Conclusion: The rate of unnecessary hospital admission in Iran is low; however, hospital resources are still wasted due to unnecessary admissions. At the macro level, expanding the primary health care network, reducing hospital beds, introducing an effective and efficient patient referral system, using a fixed provider payment method, and promoting residential and social care services, and at the hospital level, establishing a utilization management committee, using the Appropriateness Evaluation Protocol, establishing short-stay units, and implementing quality management strategies, are useful for reducing avoidable hospital admissions.
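As an illustration of how per-study rates like these are combined, the sketch below pools prevalence estimates by fixed-effect inverse-variance weighting on the logit scale. It is a simplified stand-in with invented study counts; the review itself used dedicated comprehensive meta-analysis software, which supports more elaborate (e.g. random-effects) models.

```python
# Hedged sketch of inverse-variance pooling of prevalences on the logit
# scale (fixed-effect, no zero-event correction). Study counts are
# invented, not taken from the meta-analysis.
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def pooled_prevalence(studies):
    """studies: list of (events, n) pairs. Returns the pooled proportion."""
    num = den = 0.0
    for events, n in studies:
        p = events / n
        var = 1 / events + 1 / (n - events)  # variance of logit(p)
        w = 1 / var                          # inverse-variance weight
        num += w * logit(p)
        den += w
    return inv_logit(num / den)

# Three hypothetical studies of inappropriate admissions:
studies = [(12, 400), (5, 300), (30, 900)]
print(round(pooled_prevalence(studies), 4))
```

Larger studies get larger weights, which is one reason a pooled estimate can sit nearer the rates of the biggest studies than a naive average of percentages would.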
Sadegh Norouzi , Fateme Esfandiarpour , Ali Shakouri Rad , Nasim Kiani Yousefzadeh , Zeinab Helalat , Reza Salehi , Mehrnoosh Amin , Farzam Farahmand ,
Volume 77, Issue 8 (November 2019)
Abstract
Background: The amount of anterior tibial translation during rehabilitation exercises is a key factor in organizing exercise regimen after anterior cruciate ligament injury. Excessive anterior tibial translation could increase the magnitude of tension imposed on injured and reconstructed anterior cruciate ligament knees. Forward lunge and open-kinetic knee extension exercises are commonly used in anterior cruciate ligament rehabilitation. However, there is insufficient data about the amount of anterior tibial translation in the eccentric and concentric phases of these exercises. This study compared the amount of anterior tibial translation in the eccentric and concentric phase of the lunge and seated knee extension in anterior cruciate ligament deficient and intact knees.
Methods: Using a non-probability sampling method, 14 men with unilateral anterior cruciate ligament rupture were selected for participation in this cross-sectional study. Participants were recruited from the university's physiotherapy clinics. A uni-planar fluoroscope was used to image the knee joint while participants performed the forward lunge and open-kinetic knee extension exercises with the intact and injured legs in random order. Fluoroscopy imaging was performed in the radiology center at Sina Hospital, Tehran, Iran, from September 2013 to February 2014. A two-factor mixed ANOVA was used to analyze the data.
Results: There were no significant differences in the anterior tibial translation between the limbs and contraction phases during the lunge exercise. During open-kinetic knee extension, the anterior tibial translation in anterior cruciate ligament deficient knees was significantly more than that of healthy knees at 0⁰ (P=0.007). The anterior tibial translation in the eccentric phase of open-kinetic knee extension at flexion angles of 0⁰ (P=0.049) and 15⁰ (P=0.024) was significantly greater than that in the concentric phase.
Conclusion: In the lunge exercise, the amount of anterior tibial translation was similar between the eccentric and concentric phases and between the intact and anterior cruciate ligament deficient knees. During the open-kinetic knee extension exercise, however, anterior tibial translation in the eccentric phase was greater than in the concentric phase, and in the anterior cruciate ligament deficient knees it was greater than in the intact knees, at 0-15⁰ flexion angles.
Ali Mohammad Mosadeghrad , Parvaneh Isfahani, Taraneh Yousefinezhadi,
Volume 78, Issue 4 (July 2020)
Abstract
Background: Medical errors are errors committed by healthcare professionals through omission, flaws in planning, or faulty execution of a planned healthcare action, whether or not harmful to the patient. Medical error in hospitals increases morbidity and mortality and decreases patient satisfaction and hospital productivity. This study aimed to determine the prevalence of medical errors in Iranian hospitals.
Methods: This study was conducted using systematic review and meta-analysis approaches. All articles written in English and Persian on the prevalence of medical errors in Iranian hospitals up to March 2019 were searched in Web of Science, PubMed, Elsevier, Scopus, Magiran, IranMedex and Scientific Information Database (SID) databases, and Google and Google Scholar search engines. In addition, reference lists of the retrieved papers were hand-searched. A total of 9 studies matching the inclusion criteria were identified, reviewed, and analyzed using comprehensive meta-analysis software.
Results: The prevalence of medical errors was reported in 9 studies, with rates ranging from 0.06% to 42%. Most studies (67%) determined the prevalence of medical errors using reporting forms completed by hospital employees; only three studies collected data by reviewing patients' medical records. The overall prevalence of medical error in Iran's hospitals based on the nine published articles was 0.01% (95% CI: 0-0.01%) during 2008 to 2017. The highest medical error rate, 2.1% (95% CI: 1.4-2.7%), was recorded in a hospital in Shiraz in 2012. A statistically significant correlation was observed between medical errors and sample size (P<0.05).
Conclusion: The reported prevalence of medical error in Iran is low. It is strongly recommended to use more advanced and valid methods, such as occurrence reporting, screening, and the global trigger tool, for examining medical errors in Iranian hospitals. Providing adequate education and training to patients and employees, simplifying and standardizing hospital processes, enhancing hospital information systems, improving communication, promoting a safety culture, improving employees' welfare and satisfaction, and implementing quality management strategies are useful for reducing medical errors.
Elham Hoseinnezhad Zarghani, Ghazale Geraily, Mahbod Esfahani, Mostafa Farzin,
Volume 78, Issue 7 (October 2020)
Abstract
Background: Total body irradiation (TBI) is a technique that is commonly used as a part of the patient conditioning regimen before the bone marrow transplant (BMT). The purpose of this study is to introduce and implement a reasonable TBI technique on the human-like phantom in Imam Khomeini Hospital in Tehran.
Methods: The present experimental study was conducted from October 2016 to November 2017 to implement the TBI technique at the Cancer Institute of Imam Khomeini Hospital in Tehran. Percentage depth dose and dose rate were measured under TBI conditions (i.e., SSD=310 cm, field size=40×40 cm2, gantry angle=90°, and collimator angle=45°) in a homogeneous phantom. Gafchromic EBT3 films were used to measure the absorbed dose in different areas of the human-like phantom at the levels of the head, neck, thyroid, lung, umbilicus, pelvis, thigh, knee and leg. Phantom irradiation was performed in parallel-opposed anterior-posterior geometry using an 18 MV photon beam produced by a Varian 2100C/D. Cerrobend blocks were used for lung protection. After the exposed films were analyzed with ImageJ software, the dose uniformity was calculated.
Results: Dose distribution uniformity ranged from -1.01% to +11.82% relative to the dose prescribed at the umbilicus. The difference between the calculated and measured dose at the umbilicus level was -2.73%. The radiation dose absorbed by the lung with blocks was 127.53 cGy per fraction, amounting to 765.18 cGy over six fractions.
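The uniformity figures above are percent deviations of each region's measured dose from the dose prescribed at the umbilicus. The sketch below shows that calculation with invented doses; it is not the study's film-analysis code.

```python
# Hedged sketch of the dose-uniformity figure: percent deviation of a
# region's measured dose from the prescribed (umbilicus) dose.
# All doses below are invented, in cGy, not the study's measurements.

def percent_deviation(measured, prescribed):
    """Signed percent deviation from the prescribed dose."""
    return 100 * (measured - prescribed) / prescribed

prescribed = 200.0  # hypothetical prescribed dose at the umbilicus
regions = {"head": 198.0, "neck": 210.0, "lung": 190.0}
for name, dose in regions.items():
    print(name, round(percent_deviation(dose, prescribed), 2))
```

A region passes the ±10% criterion mentioned in the conclusion when this signed deviation falls between -10 and +10.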
Conclusion: The implemented technique achieved acceptable dose uniformity (±10%) in most body regions, and the dosing accuracy was within the acceptable range. The lungs' dose was reduced to the desired level using lung shields. This technique is a simple and cost-effective method that does not require complicated dosimetric techniques. Given these results, the proposed technique meets the necessary conditions for implementation in Imam Khomeini Hospital in Tehran.
Zahra Esfandiari, Fatemeh Amani, Meraj Pourhossein, Hedayat Hosseini,
Volume 78, Issue 12 (March 2021)
Abstract
The development of industry and technology, changes in agriculture, trade and global travel, and the adaptation of microorganisms are important factors in the emergence of new diseases. Currently, the world is facing a pandemic of the disease caused by a novel coronavirus (COVID-19), which emerged in 2020. Within six months, the disease had infected more than one million people worldwide and caused the deaths of more than five hundred thousand. COVID-19 causes respiratory problems of varying severity that can lead to death. Fever, soreness, dry cough, shortness of breath, runny nose and nasal congestion were observed in infected individuals, with fever among the most common symptoms. Other, unusual signs such as diarrhea and nausea have also been reported. The bat was first proposed as the host of the novel coronavirus in China. Therefore, identifying the initial route of transmission of the novel coronavirus is necessary to prevent its occurrence and widespread distribution. The virus enters the human body through respiratory particles as well as by touching surfaces contaminated by nasal, mouth and eye secretions. Viruses are obligate intracellular pathogens that need host cells to survive; they cannot proliferate in foods and require live cells for existence. Food can, however, act as a carrier of viruses to the consumer. There have been no reports of novel coronavirus transmission through food, but it is important to observe health and safety principles on the assumption that the virus could spread via food contamination. Given the presence and proliferation of the novel coronavirus in the gastrointestinal tract, the aerosolization of this microorganism from feces, and the possibility of re-transmission to people from various environmental sources, the most important priority is to remove the virus from food environments.
It is also important to update methods of disinfecting surfaces, especially high-touch areas, as well as personal hygiene practices. Staff should therefore be educated about managing the novel coronavirus and improved health guidelines. Furthermore, keeping distance and washing hands are priorities in different food-related environments.
Zohreh Ghoreishi, Ali Esfahani, Shima Asgarzad, Laleh Payahoo, Fatemeh Hajizadeh-Sharafabad,
Volume 79, Issue 10 (January 2022)
Abstract
Background: Among all cancers, pancreatic cancer has a poor prognosis, with a 5-year survival below 10%. In theory, alcohol intake may be a modifiable risk factor for pancreatic cancer because of its role in multiple carcinogenic and metabolic signaling pathways. In addition, alcohol consumption may lead to chronic pancreatitis, an underlying cause of pancreatic cancer. However, little is known about whether alcohol intake is in fact associated with pancreatic cancer. This study aimed to systematically review the cohort studies investigating the possible link between alcohol consumption and the morbidity or mortality of pancreatic cancer.
Methods: This study was carried out according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). All cohort studies assessing the association between alcohol intake and the risk of pancreatic cancer or death were included in this systematic review, without language restriction. The electronic databases PubMed, Web of Science, Scopus and Google Scholar were searched for cohort studies published from 1990 to April 2021, using the keywords "pancreatic cancer" and "alcohol" and related terms.
Results: A total of 858 articles were identified, of which 806 were excluded; the full texts of 52 papers were assessed for eligibility, and 22 articles were ultimately included in this study. Many of the articles assessed the effects of low to moderate alcohol intake. A comprehensive review of these studies showed that low to moderate alcohol consumption had no significant correlation with pancreatic cancer, whereas high alcohol consumption was significantly associated with the risk of pancreatic cancer or death. The results also revealed that high liquor consumption was associated with a higher risk of pancreatic cancer. Nevertheless, the follow-up periods in most of these studies were shorter than the time needed for pancreatic cancer to develop.
Conclusion: Long-term heavy alcohol drinking can increase the morbidity and mortality of pancreatic cancer. Given that several genetic and environmental factors are involved in the pathogenesis of this cancer, these variables should be controlled simultaneously to determine the net effect of alcohol drinking on pancreatic cancer.
Ali Mohammad Mosadeghrad, Parvaneh Isfahani,
Volume 79, Issue 12 (March 2022)
Abstract
Mohammad Nasr Esfahani, Aref Javari, Farhad Heydari, Majid Javari,
Volume 80, Issue 4 (July 2022)
Abstract
Background: Previous studies have shown that several factors affect the outcome of cardiopulmonary resuscitation. In this study, we evaluated the factors associated with the outcome of resuscitation in patients with in-hospital cardiac arrest (IHCA).
Methods: This cross-sectional study with non-probability sampling was performed on patients who suffered in-hospital cardiopulmonary arrest between 2015 and 2020 in the emergency department (ED) of Al-Zahra Hospital, Isfahan, Iran. Data were collected from medical records to describe patient characteristics, arrest profiles and survival details. Factors associated with the dependent variable were examined using logistic regression.
Results: Of 848 in-hospital cardiopulmonary arrests, 18 patients (2.1%) survived and were discharged from the hospital. The mean age of patients was 62.74±21.17 years; 583 (68.8%) were male and 265 (31.2%) were female. The mean age of patients with successful and unsuccessful resuscitation was 62.33±21.79 years (range 6 to 116 years) and 61.58±21.20 years (range 1 month to 108 years), respectively. The rate of unsuccessful resuscitation increased with age (P=0.04) and with the presence of an underlying disease (P=0.01). Among the services, emergency medicine performed the most resuscitations (633; 57.3%), of which 22.9% achieved return of spontaneous circulation (ROSC). In the anesthesia service, both of the 2 resuscitations performed were successful; in the general surgery service, 36.5% of 63 resuscitations were successful; and in the neurosurgery service, 32.4% of 102 resuscitations were successful. The duration of resuscitation also differed markedly: the mean time was 18.98 minutes for successful resuscitations and 39.20 minutes for unsuccessful ones. The maximum and minimum times were 63 and 1 minutes for successful resuscitations and 60 and 10 minutes for unsuccessful resuscitations.
Conclusion: The results showed that several factors influence the outcome of cardiopulmonary resuscitation. Increasing age and the presence of an underlying disease reduced the success of cardiopulmonary resuscitation.
Majid Zamani, Masoudeh Babakhanian, Farhad Heydari, Mohammad Nasr-Esfahani, Mohammad Mahdi Zarezadeh,
Volume 80, Issue 7 (October 2022)
Abstract
Background: The ECG changes not only in heart disease but also in non-cardiac conditions, and the similarity of these changes can lead to the misdiagnosis of heart disease. ECG changes in brain lesions such as ischemic and hemorrhagic strokes and brain trauma have been studied in many articles, but the effect of brain midline shift on the ECG has not. In this study, we examined these changes.
Methods: This is a prospective cross-sectional descriptive study. Patients with brain tumors who were referred to Al-Zahra and Kashani hospitals in Isfahan from April 2019 to March 2021 were selected. Patients with a history of heart disease, patients receiving medications that cause ECG changes, patients with ECG changes due to non-cardiac and cerebral causes, and individuals under 15 years of age were not included in the study. Patients whose ECG changes were due to electrolyte disturbances or acute heart problems were also excluded from the study. After obtaining informed consent from patients, a CT scan or brain MRI was taken and patients were divided into two groups with and without midline shift. Then the ECG was taken and ECG changes (T wave, ST segment, QTc Interval, QRS prolongation) were compared in two groups of brain tumors with and without midline shift.
Results: 136 patients were included in the study: 69 in the group without midline shift and 67 in the midline shift group. In the midline shift group, 3% of patients had ST-segment changes and 23.9% had T-wave changes, versus 1.4% and 10.1%, respectively, in the group without midline shift. The mean QTc interval in the groups without and with midline shift was 338.26 (±28.438) and 388.66 (±37.855) ms, respectively, and the mean QRS duration was 86.09 (±9.88) ms in the group without midline shift and 94.63 (±12.83) ms in the midline shift group.
Conclusion: Brain midline shifts can cause QRS widening, QTc interval prolongation, and T-wave changes in patients' ECGs.
Ali Mohammad Mosadeghrad, Parvaneh Isfahani,
Volume 82, Issue 11 (February 2025)
Abstract
Zakieh Vahedian Ardakani, Mehran Zarei-Ghanavati, Hamid Riazi-Esfahani, Seyed Mehdi Tabatabaei, Mohammad Reza Mehrabi Bahar, Sadegh Ghafarian, Ahmad Masoomi,
Volume 83, Issue 1 (April 2025)
Abstract
Artificial intelligence (AI) has emerged as a transformative force in modern medicine, with ophthalmology standing at the forefront of its clinical integration. Among ophthalmic disorders, glaucoma—a leading cause of irreversible blindness worldwide—presents unique opportunities and challenges for AI-based solutions due to its chronic, progressive nature and reliance on multimodal data, including structural and functional assessments. This review article offers a comprehensive synthesis of the current and emerging roles of AI in the detection, monitoring, and management of glaucoma. AI algorithms, particularly deep learning and machine learning models, have demonstrated exceptional capabilities in interpreting fundus photographs, optical coherence tomography (OCT) images, and visual field data to identify glaucomatous damage. These systems often approach or even exceed the diagnostic performance of human experts. Moreover, AI has shown significant promise in facilitating large-scale population-based screening, improving early detection rates, and addressing disparities in access to subspecialty care, particularly in low-resource and remote settings. In the monitoring of disease progression, AI tools are being developed to detect subtle structural or functional changes over time, predict future visual outcomes, and support more precise and individualized treatment decisions. Despite these advancements, the widespread clinical adoption of AI in glaucoma care faces several critical barriers. Key limitations include poor generalizability of models across diverse populations, imaging devices, and clinical settings; scarcity of well-annotated, high-quality, and demographically representative datasets; and a lack of transparency and interpretability in algorithmic decision-making—commonly referred to as the “black box” problem. 
Ethical concerns, regulatory uncertainty, integration challenges within existing healthcare infrastructures, and medico-legal accountability also require thoughtful resolution before AI can be reliably deployed in clinical practice. This review critically evaluates the strengths, limitations, and real-world potential of AI technologies in glaucoma. It provides clinicians, researchers, and healthcare policymakers with a balanced and up-to-date perspective, highlighting promising avenues for future research, including explainable AI, federated learning, multi-modal data integration, and longitudinal validation studies. By fostering a deeper understanding of both the opportunities and challenges associated with AI, this article aims to guide the responsible, equitable, and evidence-based integration of AI into comprehensive glaucoma care.