Research Article
Open Access
Prospective Study on Functional Outcome of CTEV By Ponseti Method of Cast Application
GB Chandan,
Keerthesh HD,
Abdul Khader
Pages 303 - 307

Abstract
Congenital talipes equinovarus (CTEV), also known as clubfoot, is a complex congenital deformity of the musculoskeletal tissues below the knee seen in newborns. It is a dysplasia of all musculoskeletal structures of the foot, presenting with equinus, cavus, varus and adduction. The aims of the study are to achieve a normal-looking, plantigrade, mobile foot and avoid permanent disability; to assess the functional outcome of clubfoot treated by the Ponseti method of cast application; to assess the time, rate and duration of correction of the deformity; and to assess differences in follow-up adherence by socioeconomic status and education level of the parents. The Ponseti method of conservative treatment, with or without tenotomy, is the preferred treatment of clubfoot. Methodology: This is a hospital-based prospective interventional study including all cases presenting to the CTEV clinic of the orthopaedics outpatient department at a secondary health care centre. The target population is children from birth to 3 months of age with congenital idiopathic clubfoot attending the CTEV clinic. We studied 150 feet with idiopathic clubfoot managed by the Ponseti method. Results: Of the 150 feet, 138 had an excellent functional outcome (Pirani score 0-0.5), 9 had a good outcome (Pirani score 0.5-1.0) and 3 had a satisfactory outcome (Pirani score >1). The mean number of casts used for effective treatment was 5.37 (standard deviation 0.74), and the mean follow-up was 9.52 months (standard deviation 2.0031). Conclusion: The Ponseti method of clubfoot correction is safe, effective and affordable; it significantly reduces the need for invasive surgical procedures and provides a painless, plantigrade, cosmetically acceptable, functional foot with minimal complications. This method of treatment is effective in both developed and developing countries.
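For readers reproducing the outcome grading, here is a minimal Python sketch of how final Pirani scores map onto the outcome categories reported above (excellent 0-0.5, good >0.5-1.0, satisfactory >1); the function name and example values are illustrative, not taken from the study data.

def grade_outcome(pirani_score: float) -> str:
    """Map a final Pirani score to the outcome category used in the abstract."""
    if pirani_score <= 0.5:
        return "excellent"    # Pirani 0-0.5
    elif pirani_score <= 1.0:
        return "good"         # Pirani >0.5-1.0
    return "satisfactory"     # Pirani >1

print([grade_outcome(s) for s in (0.0, 0.5, 1.0, 1.5)])  # example scores only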
Research Article
Open Access
Evaluation of hepatic lesions by triple phase Multidetector Computed Tomography Scan
Pushpendra Singh Gaharwar,
Nirmal Kumar Mittal,
Premlata Chouhan,
Monika Singh Parihar,
Prakash Singh Baghel,
Neetika Singh,
Prateek Mittal
Pages 298 - 302

Abstract
Background & Methods: The aim of the study is to Evaluation of hepatic lesions by triple phase multidetector computed tomography scan. This prospective observational study was carried out in the Department of Radio-diagnosis, Amaltas Institute of Medical Sciences and Hospital, Dewas, Madhya Pradesh, a tertiary care hospital catering to patients from central India. Results: Pattern across phases, during the arterial phase, lesions exhibited diverse enhancement: strong hyperenhancement (16 patients, 17.02%), arterial hyperenhancement + scar (13 patients, 13.83%), rim enhancement (6 patients, 6.38%), peripheral nodular enhancement (12 patients, 12.77%), mild to no enhancement (13 patients, 13.83%), no enhancement (11 patients, 11.70%), and peripheral rim/hypovascular pattern (14 patients, 14.89%). In the venous phase, enhancement patterns included isoattenuation/washout (16 patients, 17.02%), progressive enhancement (13 patients, 13.83%), hypodense/heterogeneous (14 patients, 14.89%), partial fill-in (12 patients, 12.77%), no enhancement (11 patients, 11.70%), and washout (9 patients, 9.57%). The delayed phase featured iso/minimal washout (16 patients, 17.02%), persistent hypodensity (14 patients, 14.89%), scar enhancement (13 patients, 13.83%), marked enhancement (13 patients, 13.83%), centripetal fill-in (12 patients, 12.77%), and hypodensity (15 patients, 15.96%). These dynamic-phase observations facilitate lesion characterization by highlighting vascular behavior. Conclusion: MDCT reliably characterizes lesion morphology, density, enhancement pattern, and anatomical distribution. These findings support the integration of MDCT as a frontline imaging modality for hepatic lesion evaluation, enabling accurate characterization, guiding biopsy decisions, and facilitating surgical or interventional planning. However, caution is warranted in interpreting atypical lesions, as occasional false positives and false negatives may occur, necessitating histopathological confirmation in suspicious or inconclusive cases.
Research Article
Open Access
To evaluate pelvic organ prolapse by standardized POP-Q classification in preoperative and postoperative patients who are undergoing vaginal hysterectomy
Shreya Mohane,
Preeti Jain,
Ritu Sharda,
Shailaya Sonakiya
Pages 294 - 297

Abstract
Background & Methods: the aim of the study is to evaluate pelvic organ prolapse by standardized pop q classification in preoperative and postoperative patients who are undergoing vaginal hysterectomy. The source of data for this study is patients referred to the Department of Obstetrics andgynaecology, Amaltas Institute of Medical Sciences, Dewas for vaginal hysterectomy. Patients who were found to evaluate POP by POPQ system were studied. This consists of 67patients with POP detected on USG between July 2023 to August2025. Results: In the existing study, 3% had stage I, 34.3% had stage 2, 43.3% had stage 3 and 19.4% had stage 4. Most common stage was stage 3. In the existing study, 1.5% was successful repaired. 28.4% had stage 0, 19.4% had stage 0 to1, 50.8% had stage 1. Conclusion: The highest incidence of these conditions increases with age. Vaginal delivery, compared to LSCS, weakens the perineum regardless of the number of deliveries. The most common symptom is SCOV, with the hymen as the fixed reference point.
Research Article
Open Access
A comparative study of Induction of Labor with Cerviprime Vs Misoprostol
Ritika Dhanora,
Preeti Jain,
Ritu Sharda
Pages 289 - 293

Abstract
Background & Methods: The aim of the study is to compare study of Induction of Labor with Cerviprime Vs Misoprostol. The study was conducted in the Department of Obstetrics and Gynaecology, Amaltas Institute of Medical Sciences, Dewas (M.P.), which serves as a tertiary care referral center. Results: Out of 108 participants, 23 underwent induction due to gestational diabetes mellitus (GDM), comprising 21.3% of the total. Oligohydramnios accounted for 21 cases or 19.4%. Pregnancy-induced hypertension (PIH) was the indication in another 23 participants, again representing 21.3%. A postdated pregnancy led to induction in 26 cases or 24.1%, while premature rupture of membranes (PROM) was reported in 15 participants and accounts for 13.9%. The cumulative percentages for these categories were 21.3%, 40.7%, 62.0%, 86.1% and 100.0%, respectively. Postdated pregnancy was the most frequent reason for induction, closely followed by GDM and PIH. Together, these three indications accounted for nearly two-thirds of all inductions. Conclusion: Clinical implications arise from these findings. Cerviprime emerged as a preferred agent for induction where rapid, predictable response and fewer complications are desirable. It provided better outcomes without increasing intervention rates or exposing patients to excessive risk. Misoprostol remained effective but showed increased uterine activity and fetal compromise; selection requires careful consideration of individual patient profiles.
Research Article
Open Access
Comparative Study of Ilioinguinal Nerve Preservation versus Neurectomy on Chronic Groin Pain in Inguinal Hernioplasty
Deepika A Walmiki,
Rajashekhar T Patil,
Prashant Dhannur,
Niveda BR
Pages 283 - 288

Abstract
Introduction: Chronic postoperative groin pain (CPGP) is a frequent and debilitating complication following inguinal hernioplasty. Ilioinguinal nerve entrapment or irritation is a key factor in its development. The optimal management of this nerve during Lichtenstein repair remains controversial. This study compares ilioinguinal nerve preservation versus prophylactic neurectomy in terms of postoperative pain, sensory disturbances, and patient satisfaction. Methods: This prospective interventional study included 100 patients undergoing Lichtenstein hernioplasty at a tertiary care center. Patients were randomized into two groups: Group A (n = 50), with ilioinguinal nerve preservation, and Group B (n = 50), with prophylactic neurectomy. Postoperative pain was assessed using the Visual Analog Scale (VAS) at day 1, 1 month, 3 months, and 6 months. Sensory disturbances were evaluated through light touch and pinprick testing. Patient satisfaction was recorded at six months. Statistical analysis was performed using t-tests and chi-square tests, with p < 0.05 considered significant. Results: At six months, chronic pain was significantly lower in the neurectomy group (8.3%) compared to the nerve preservation group (19.2%) (p = 0.04). While hypoesthesia was initially more common in the neurectomy group (37.5% vs. 26.9% at 1 month), the difference diminished by six months (16.6% vs. 11.5%, p = 0.41). Patient satisfaction was higher in the neurectomy group (76% reporting "Excellent" vs. 60% in the preservation group). Conclusion: Prophylactic ilioinguinal neurectomy significantly reduces chronic groin pain after Lichtenstein hernioplasty without causing persistent sensory deficits. Given the impact of chronic pain on quality of life, routine neurectomy should be considered, with proper preoperative counseling.
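A hedged sketch of the chi-square comparison named in the methods, applied to the six-month chronic-pain rates: the abstract reports proportions (8.3% vs 19.2%), so the cell counts below (4/48 and 10/52) are an illustrative reconstruction rather than the authors' raw data, and scipy stands in for whatever statistics package was actually used.

from scipy.stats import chi2_contingency

#            chronic pain   no chronic pain
table = [[4, 44],    # neurectomy group (illustrative counts, ~8.3%)
         [10, 42]]   # nerve-preservation group (illustrative counts, ~19.2%)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")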
Research Article
Open Access
To Evaluate Diffuse Lung Disease by High Resolution Computed Tomography
Neetika Singh,
Nirmal Kumar Mittal,
Prakash Singh Baghel,
Pushpendra Singh Gaharwar,
Manvendra Singh Shaktawat,
Premlata Chouhan
Pages 278 - 282

Abstract
Background & Methods: The aim of the study is to Evaluate Diffuse Lung Disease by high resolution computed Tomography conducted at Department of Radio-diagnosis in a tertiary care hospital rural central India (Amaltas Institute of Medical Sciences, Dewas). All male and female patients referred for high resolution computed tomography lung above 18-year age of rural central India. Results: A comparative analysis of HRCT imaging patterns between smokers and non- smokers. Discrete consolidations, ground-glass opacities, cystic changes, and emphysematous changes were notably more prevalent among smokers, indicating a greater extent of parenchymal destruction and airway involvement likely attributable to smoking-related pathology. Smokers also showed a higher frequency of mediastinal lymphadenopathy and inter-septal thickening, suggesting associated inflammatory or fibrotic responses. Conclusion: HRCT enabled precise classification of diffuse lung diseases into idiopathic interstitial pneumonias—most notably Usual Interstitial Pneumonia (UIP)—as well as other categories such as granulomatous diseases like sarcoidosis, diffuse lung diseases of known causes including RB- ILD and HP, and rare disorders such as Lymphangioleiomyomatosis (LAM).
Research Article
Open Access
Prospective study to Correlate Neonatal Thrombocytopenia with Neonatal Sepsis
Anushka Jat,
Neha Kakani,
Ravleen Kaur Sabarwal
Pages 274 - 277

Abstract
Background & Methods: The aim of the study is to Correlate Neonatal Thrombocytopenia with Neonatal Sepsis. Detailed clinical examination was done for all neonates who enrol in the study. Blood sample was taken in all neonates for sepsis screen, blood culture. The area from where blood culture was taken is cleaned and prepared with an antibacterial solution; samples are taken from the venous route. Results: A variety of hematological parameters with some exhibiting more variability (e.g., WBC count, CRP) and others being more consistent (e.g., platelet count). The mean values reflect typical blood profile characteristics, but the standard deviations highlight the range and variation in these measures within the sample group. Conclusion: In our study, we found that in neonatal sepsis, more than half of the neonates developed thrombocytopenia and 20% developed severe thrombocytopenia. It was independently associated with high CRP. To conclude, it is critical to screen for and treat thrombocytopenia in all babies brought to the NICU, including those who appear to be at low risk, because the incidence and mortality linked with this illness are significant. Because sepsis is still a common cause of newborn thrombocytopenia and the severity of thrombocytopenia in sepsis varies from mild to moderate to severe, the fact that thrombocytopenia is present in more than half of sepsis cases with positive blood cultures indicates the severity of the condition.
Research Article
Open Access
To Evaluate and Correlate the Role of Ultrasonography with Fine Needle Aspiration Cytology in Patients with Thyroid Lesions
Prakash Singh Baghel,
Nirmal Kumar Mittal,
Neetika Singh,
Pushpendra Singh Gaharwar,
Premlata Chouhan
Pages 270 - 273

Abstract
Background & Methods: The aim of the study is to evaluate and correlate the role of ultrasonography with fine needle aspiration cytology in patient of thyroid lesions. The study was performed in the Department of Radio-diagnosis, Amaltas Institute of Medical Sciences Aims Dewas to evaluate the sonographic findings of thyroid lesions with FNAC fine needle aspiration cytology correlation in the diagnosis of Thyroid disorder. Results: Out of 44 patients, 34.09% had < 5 mm nodule size, 15.91% had 5mm -1 cm5mm -1 cm and 50% had >1 cm nodule size. 44% had Homogeneous whereas 56% had Heterogeenous type of echotexture of thyroid parenchyma. Out of 50 patients, 44% had Thyroiditis, 26% had colloid goiter, 14% had mng, 8% had adenomatous nodule, 6% had mng with thyroiditis and 2% had medullary carcinoma. Conclusion: The presence of hypoechoic nodules, taller than wider with puntate calcification and presence of vascularity is highly specificity for malignant thyroid lesion.
Research Article
Open Access
To Study Clinical and Biochemical Profile and its Correlation with Hepatitis B Virus DNA Titre Among Hepatitis B Positive Patients in Relation to Disease Severity
Satish Singh Gurjar,
Himanshu Rana,
Hardik Rathwa,
Rupal Dosi,
Harikrushna Prajapati
Pages 265 - 269

Abstract
Background & Methods: The aim of the study is to study clinical and biochemical profile and its correlation with hepatitis B virus DNA titre among hepatitis B positive patients in disease severity. Patients who attended medicine inpatient and outpatient department who was HbsAg Positive. Results: In general, physical examination icterus was most common sign, which was present in 27 patients, followed by pallor, present in 13 patients. In these patients mean HBV was higher. There is non-significant relation between general physical examination and mean HBV. Conclusion: In general, physical examination icterus was most common sign, which was present in 27(67.5%) patients, followed by pallor which was present in 13(32.5%) patients. there is no significant relation between general physical examination and mean HBV the chi square statistic is not significant.
Research Article
Open Access
Comparative Evaluation of Nebulized Dexmedetomidine and Nebulized Ropivacaine for Attenuating Hemodynamic Stress Response to Laryngoscopy and Intubation in Elective Surgeries: A Prospective Randomized Clinical Study
Shweta Bhati,
Rangit Priyakar Pandey,
Ratanpal Singh
Pages 257 - 264

Abstract
Background: Laryngoscopy and endotracheal intubation are potent noxious stimuli that trigger significant hemodynamic responses due to sympathetic nervous system activation. Various pharmacological agents have been employed to blunt this response, including α2-agonists like dexmedetomidine and local anesthetics such as ropivacaine. This study aimed to compare the efficacy and safety of nebulized dexmedetomidine versus ropivacaine in attenuating the intubation-induced pressor response in patients undergoing elective surgeries under general anesthesia. Materials and Methods: This prospective, randomized, comparative study was conducted on 100 adult patients (ASA Grade I–II, aged 20–60 years) scheduled for elective surgeries requiring general anesthesia with endotracheal intubation. Patients were randomized into two groups: Group D received nebulized dexmedetomidine, and Group R received nebulized ropivacaine, 15 minutes before induction. Hemodynamic parameters (heart rate, SBP, DBP, MAP) were recorded at baseline, post-nebulization, post-induction, and at 1, 3, 5, 10, 15, and 20 minutes post-intubation. Recovery profiles, Ramsay sedation scores, VAS pain scores, oxygen requirement, intraoperative anesthetic consumption, blood loss, adverse events, and patient satisfaction were also assessed. Statistical analysis was performed using t-tests and chi-square tests; p < 0.05 was considered significant. Results: Baseline demographic and clinical parameters were comparable between the groups. Group D showed significantly greater attenuation of hemodynamic responses post-intubation, particularly in heart rate and MAP (p < 0.01). Time to extubation and response was shorter in Group D (p < 0.05), with higher postoperative sedation scores and significantly lower pain scores at multiple intervals. Group D also demonstrated reduced postoperative oxygen requirements and intraoperative consumption of propofol, vecuronium, and isoflurane (p < 0.05). Although adverse effects were minimal and similar in both groups, Group D had significantly less intraoperative blood loss. Patient satisfaction scores were comparable. Conclusion: Nebulized dexmedetomidine is superior to ropivacaine in blunting the hemodynamic stress response to laryngoscopy and intubation, enhancing sedation and analgesia, reducing anesthetic requirements and oxygen demand, and minimizing blood loss, all without increasing adverse effects. It offers a clinically advantageous, non-invasive option for improving perioperative stability in elective surgical patients.
Research Article
Open Access
Comparative Study of Dexmedetomidine Versus Midazolam for Sedation in Regional Anesthesia
Uma Raveendaran,
Ratanpal Singh,
Rangit Priyakar Pandey
Pages 249 - 256

Abstract
Background: Sedation during regional anesthesia (RA) plays a vital role in enhancing patient comfort, procedural efficiency, and safety. Midazolam, a widely used benzodiazepine, is effective but associated with dose-dependent respiratory depression and delayed recovery. Dexmedetomidine, a selective alpha-2 adrenergic agonist, offers sedative and analgesic-sparing effects with minimal respiratory compromise. This study aimed to compare dexmedetomidine and midazolam in terms of sedative efficacy, hemodynamic and respiratory stability, recovery profile, adverse events, and satisfaction outcomes in patients undergoing surgery under RA. Materials and Methods: This prospective, randomized, double-blind study was conducted at Rajshree Medical Research Institute, Bareilly (U.P.), India, over six months. A total of 100 ASA I–II patients, aged 18–65 years, scheduled for elective surgeries under regional anesthesia, were randomized into two equal groups. Group D received dexmedetomidine (1 µg/kg loading + 0.2–0.7 µg/kg/hr infusion), and Group M received midazolam (0.05 mg/kg loading + 0.02–0.1 mg/kg/hr infusion). Sedation was assessed using the Ramsay Sedation Scale (RSS), with continuous monitoring of hemodynamic and respiratory parameters. Key endpoints included onset and recovery times, sedation depth, adverse events, and satisfaction scores. Results: Both agents achieved effective sedation. Midazolam had a significantly faster onset (6.3 ± 1.9 vs. 8.1 ± 2.2 min, p < 0.001), but dexmedetomidine provided significantly faster recovery (17.8 ± 4.2 vs. 26.1 ± 5.1 min, p < 0.001) and more stable sedation with fewer additional dose requirements. Dexmedetomidine showed better hemodynamic control, with lower heart rates and MAP values (p < 0.001), and superior respiratory safety, with no desaturation events compared to 12% in the midazolam group (p = 0.027). The incidence of adverse events was significantly lower in Group D (18%) than in Group M (40%) (p = 0.012). Patient and surgeon satisfaction scores were significantly higher in the dexmedetomidine group (p < 0.001). Conclusion: Dexmedetomidine is a safe and effective alternative to midazolam for procedural sedation in regional anesthesia. It offers superior respiratory safety, better hemodynamic stability, faster recovery, fewer adverse events, and higher satisfaction levels. Dexmedetomidine should be considered the preferred sedative agent, especially in RA settings requiring early discharge and preserved respiratory function.
Research Article
Open Access
Evaluation of Biofilm Formation in Bacterial Isolates from Cardiac Valve Infections
Nitesh K. Patel,
Vidhi Patel,
Sangita Vasava
Pages 245 - 248

Abstract
Background: Cardiac valve infections, notably infective endocarditis (IE), are severe conditions often associated with high morbidity and mortality. One of the major pathogenic mechanisms contributing to treatment failure in IE is bacterial biofilm formation, which provides a protective niche for microorganisms, leading to antimicrobial resistance and recurrent infections. This study aims to evaluate the prevalence and degree of biofilm formation among bacterial isolates from cardiac valve infections. Material and Methods: This retrospective study was conducted on bacterial isolates recovered from cardiac valve tissue specimens of patients undergoing valve replacement surgery for infective endocarditis. Standard microbiological techniques were employed for bacterial identification. Biofilm-forming ability was assessed using the tissue culture plate (TCP) method and categorized as strong, moderate, weak, or non-biofilm producers based on optical density measurements. Results: Out of the total isolates (n = 72), 25% were identified as strong biofilm producers, 40.3% as moderate, 20.8% as weak, and 13.9% as non-biofilm producers. Staphylococcus aureus and coagulase-negative staphylococci accounted for the majority of strong biofilm producers. A notable proportion of Enterococcus faecalis and Gram-negative bacilli also exhibited moderate to strong biofilm-forming capacity. Biofilm-producing strains demonstrated higher antimicrobial resistance patterns compared to non-biofilm producers. Conclusion: The high prevalence of biofilm-forming bacteria in cardiac valve infections highlights the clinical importance of incorporating biofilm assessment in routine microbiological work-up. Recognizing the biofilm-forming potential of pathogens may guide more effective therapeutic decisions and surgical interventions in the management of infective endocarditis.
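A minimal sketch of the optical-density grading described in the methods. The paper's exact cutoffs are not stated in the abstract, so the thresholds below follow the commonly used Stepanović convention relative to a negative-control cutoff ODc; treat them as an assumption.

def classify_biofilm(od: float, odc: float) -> str:
    """Grade biofilm production from a tissue-culture-plate OD reading."""
    if od <= odc:
        return "non-producer"
    elif od <= 2 * odc:
        return "weak"
    elif od <= 4 * odc:
        return "moderate"
    return "strong"

print(classify_biofilm(od=0.90, odc=0.12))  # example values -> "strong"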
Research Article
Open Access
Comparison of Severity of Dry Eye in Type 1 Diabetics with Those in Type 2 Diabetics
Aayushmann Singh,
Sophiya Chaudhary,
Chetanya Prakash Gupta,
Saloni Balan
Pages 239 - 244

Abstract
Introduction: Dry eye disease is a complex, multifactorial ocular surface disorder in which a vicious cycle of loss of tear-film homeostasis, persistent inflammation, hyperosmolarity and neurosensory disturbance plays an important role. Aim: To compare the severity of dry eye in type 1 diabetics with that in type 2 diabetics. Methodology: This observational study was conducted over 18 months in the outpatient departments of Ophthalmology and Medicine at Mahatma Gandhi Medical College and Hospital, Jaipur. Results: In this study, dry eye was more prevalent among patients with type 2 diabetes (84.3%) compared to type 1 (15.7%). Patients with higher HbA1c levels (>6.5%) had increased severity of dry eye, indicating a significant association between poor glycemic control and dry eye disease. Conclusion: Dry eye disease is significantly more prevalent and severe in diabetic patients, underscoring the need for regular ocular surface evaluation in this population.
Research Article
Open Access
Effectiveness of School-Based Nutritional Interventions in Reducing Childhood Obesity: A Community-Based Controlled Study
Abhisar Rohila,
Azba Mohamed Ayaz Shaikh,
Disha Thapa
Pages 234 - 238

Abstract
Background: Childhood obesity has become a global public health concern, with increasing prevalence in both urban and rural settings. School-based nutritional interventions have been proposed as effective strategies to reduce obesity rates among children through early education and behavioral modification. This study aimed to evaluate the effectiveness of a comprehensive school-based nutritional intervention program on reducing obesity prevalence among school-aged children in a semi-urban community. Materials and Methods: A total of 400 children aged 6–12 years were enrolled and randomly divided into an intervention group (n=200) and a control group (n=200). The intervention group received weekly nutritional education sessions, monitored school lunches aligned with dietary guidelines, and parental workshops. The control group followed the regular school curriculum without intervention. Anthropometric parameters including Body Mass Index (BMI), waist circumference, and dietary habits were recorded at baseline and at the end of the study. Data were analyzed using paired t-tests and chi-square tests, with a significance level set at p<0.05. Results: At baseline, mean BMI in the intervention group was 21.4 ± 2.5, which significantly reduced to 19.8 ± 2.1 after 12 months (p<0.001). In contrast, the control group showed a non-significant change from 21.3 ± 2.6 to 21.0 ± 2.4 (p=0.12). The prevalence of obesity in the intervention group decreased from 18.5% to 10.2%, while in the control group, it remained relatively unchanged (17.9% to 16.8%). Dietary assessment revealed a significant improvement in healthy food consumption and a reduction in high-calorie snack intake in the intervention group (p<0.01). Conclusion: School-based nutritional interventions are effective in reducing obesity and promoting healthier dietary behaviors among children. Integration of structured nutritional education into school curricula, coupled with parental involvement, can serve as a sustainable approach to combating childhood obesity in community settings.
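A small Python sketch of the paired t-test named in the methods (baseline vs 12-month BMI within the intervention arm), run on synthetic values purely for illustration; the study itself used its own dataset and software.

import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
bmi_baseline = rng.normal(21.4, 2.5, 200)                # synthetic baseline BMI
bmi_followup = bmi_baseline - rng.normal(1.6, 1.0, 200)  # synthetic 12-month BMI

t, p = ttest_rel(bmi_baseline, bmi_followup)
print(f"paired t = {t:.2f}, p = {p:.2e}")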
Research Article
Open Access
Assessment of Immunization Coverage and Its Determinants Among Under-Five Children in Urban Slums: A Cross-Sectional Analysis
Punit Patel,
Pulluri Sadanandam,
Manang Khakharia
Pages 223 - 227

Abstract
Background: Immunization remains one of the most cost-effective public health strategies to reduce childhood morbidity and mortality. However, children residing in urban slums are often under-immunized due to various socio-economic and systemic barriers. This study aimed to assess the immunization coverage and identify its determinants among under-five children living in urban slum areas. Materials and Methods: A community-based cross-sectional study was conducted at selected urban slums. A total of 450 children aged 12–59 months were enrolled using stratified random sampling. Data on immunization status were collected using a semi-structured questionnaire and validated by checking immunization cards wherever available. Factors influencing immunization were assessed using multivariate logistic regression. Statistical analysis was performed using SPSS version 26.0, and a p-value <0.05 was considered statistically significant. Results: Out of 450 children, 298 (66.2%) were fully immunized, 110 (24.4%) were partially immunized, and 42 (9.3%) were not immunized. The coverage for BCG, OPV3, and measles vaccine was 91.8%, 85.1%, and 78.3% respectively. Maternal education (AOR: 2.7, 95% CI: 1.6–4.5), institutional delivery (AOR: 3.2, 95% CI: 1.9–5.3), possession of immunization card (AOR: 4.1, 95% CI: 2.3–7.0), and antenatal care visits >3 (AOR: 2.9, 95% CI: 1.7–4.9) were found to be significantly associated with full immunization status (p<0.05 for all). Conclusion: Despite the availability of immunization services, a significant proportion of children in urban slums remain partially or non-immunized. Maternal education, healthcare access during pregnancy, and awareness play crucial roles in improving coverage. Strengthening health education, increasing community outreach, and improving record-keeping may enhance immunization rates in underserved populations.
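An illustrative sketch of the multivariate logistic-regression step that yields adjusted odds ratios. The study used SPSS; this statsmodels version on a synthetic data frame (with assumed binary column names) only demonstrates how AORs and 95% CIs are computed.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 450
df = pd.DataFrame({
    "maternal_education": rng.integers(0, 2, n),
    "institutional_delivery": rng.integers(0, 2, n),
    "has_card": rng.integers(0, 2, n),
    "anc_gt3": rng.integers(0, 2, n),
})
# synthetic outcome loosely tied to the predictors, for demonstration only
lin = (-0.8 + 1.0 * df.maternal_education + 1.2 * df.institutional_delivery
       + 1.4 * df.has_card + 1.1 * df.anc_gt3)
df["full_immunization"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

model = smf.logit("full_immunization ~ maternal_education + institutional_delivery"
                  " + has_card + anc_gt3", data=df).fit()
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% confidence intervals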
Research Article
Open Access
Evaluating the Effectiveness of Telemedicine in Managing Chronic Diseases in Rural Primary Care Settings: A Multi-Center Cohort Study
Shreyans Amrutlal Amin,
Vishruti Sanjaybhai Dhanani,
Jimit Pankajbhai Prajapati
Pages 218 - 222

Abstract
Background: Chronic diseases such as hypertension, diabetes, and chronic obstructive pulmonary disease (COPD) pose a substantial burden on rural populations due to limited access to specialized care. Telemedicine has emerged as a potential solution to bridge healthcare gaps by enabling remote consultation and monitoring. However, evidence on its effectiveness in managing chronic diseases in rural primary care settings remains limited and inconsistent. Materials and Methods: This multi-center cohort study was conducted across five rural primary health centers in India. A total of 600 adult patients with at least one diagnosed chronic condition (hypertension, diabetes, or COPD) were enrolled. Participants were divided into two cohorts: the telemedicine group (n = 300), which received regular virtual consultations and digital monitoring, and the standard care group (n = 300), which followed conventional in-person visits. Clinical outcomes, patient satisfaction scores, and frequency of emergency visits were assessed at baseline and after 12 months. Data were analyzed using SPSS v25.0, and p < 0.05 was considered statistically significant. Results: After 12 months, the telemedicine group showed significant improvement in disease control markers compared to the standard care group. Mean HbA1c levels in diabetic patients decreased from 8.7 ± 1.2% to 6.9 ± 0.9% in the telemedicine group versus 8.6 ± 1.3% to 7.8 ± 1.1% in standard care (p < 0.001). Systolic blood pressure among hypertensive patients reduced by an average of 14.2 mmHg in the telemedicine group versus 6.5 mmHg in the control group (p = 0.002). The frequency of emergency visits declined by 28.3% in the telemedicine cohort (p = 0.015). Patient satisfaction scores were significantly higher in the telemedicine group (4.6 ± 0.4 vs. 3.9 ± 0.6 on a 5-point Likert scale, p < 0.001). Conclusion: Telemedicine significantly improved chronic disease management outcomes in rural primary care settings, demonstrating enhanced disease control, reduced emergency visits, and greater patient satisfaction. Integrating telehealth into rural health infrastructure could substantially alleviate access barriers and improve long-term outcomes for patients with chronic illnesses.
Research Article
Open Access
Assessment of Mental Health Screening Tools in General Practice for Early Detection of Depression and Anxiety Disorders: A Diagnostic Accuracy Study
Shreyans Amrutlal Amin,
Jimit Pankajbhai Prajapati,
Vishruti Sanjaybhai Dhanani
Pages 213 - 217

Abstract
Background: Early identification of depression and anxiety disorders in primary care is critical to improve patient outcomes and reduce the burden of mental illness. Mental health screening tools like PHQ-9 and GAD-7 are widely used in general practice; however, their diagnostic accuracy and utility in diverse clinical settings require further evaluation. Materials and Methods: This diagnostic accuracy study was conducted over 12 months in a general practice clinic. A total of 300 adult patients aged 18–65 years were screened using PHQ-9 (for depression) and GAD-7 (for anxiety). The reference standard was a structured clinical interview based on DSM-5 criteria conducted by a clinical psychologist. Sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated for each tool. Results: Out of 300 participants, 180 (60%) were female and 120 (40%) were male. The prevalence of clinically diagnosed depression was 38%, while anxiety disorders were present in 32%. PHQ-9 showed a sensitivity of 89.5%, specificity of 78.2%, PPV of 74.2%, and NPV of 91.2% for depression. GAD-7 demonstrated a sensitivity of 84.3%, specificity of 81.7%, PPV of 70.4%, and NPV of 90.8% for anxiety disorders. Receiver operating characteristic (ROC) curves for both tools showed area under the curve (AUC) values of 0.87 for PHQ-9 and 0.85 for GAD-7. Conclusion: Both PHQ-9 and GAD-7 are effective, easy-to-administer tools for the early detection of depression and anxiety in general practice. Their high sensitivity and NPV make them particularly useful for screening purposes, supporting their integration into routine primary care mental health assessments.
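A worked sketch of the diagnostic-accuracy formulas behind the reported figures (sensitivity, specificity, PPV, NPV). The confusion-matrix counts below are illustrative reconstructions consistent with roughly 38% prevalence in 300 patients, not the study's raw data.

def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int):
    """Compute the four screening metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# example: 114 depressed and 186 non-depressed patients (illustrative counts)
sens, spec, ppv, npv = diagnostic_metrics(tp=102, fp=41, fn=12, tn=145)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")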
Research Article
Open Access
Comparative Study of Postoperative Pain Between Open Lichtenstein Repair and Laparoscopic Totally Extraperitoneal Repair for The Treatment of Inguinal Hernia
Vineet F Chauhan,
Smit Patel
Pages 207 - 212

Abstract
Background: Postoperative pain following hernioplasty is a common problem that can hinder early return to normal activities. Lichtenstein repair is considered the preferred approach for open inguinal hernia repair. From the early 1990s, the laparoscopic approach to hernia repair gained popularity, with laparoscopic surgeons advocating less postoperative pain and earlier return to normal activities. This prospective randomised study was done to compare early postoperative pain and chronic pain between open Lichtenstein repair and laparoscopic hernia repair for uncomplicated inguinal hernias. Objective: To compare open Lichtenstein repair and laparoscopic mesh repair for uncomplicated inguinal hernias in terms of early postoperative pain and chronic pain. Methods: This is a prospective randomised study of 80 patients with uncomplicated inguinal hernia who presented to the Surgery Department, Civil Hospital Ahmedabad, between January 2022 and January 2024. Patients were randomly divided into two groups: 40 patients underwent open Lichtenstein repair (group A) while 40 patients underwent totally extraperitoneal repair (TEP) (group B). Postoperative pain intensity was assessed by VAS score, and follow-up for 1 year was done for assessment of chronic pain. Permission of the ethics committee was taken. Results: A total of 80 patients with inguinal hernia were studied. From postoperative day 1 to day 14, pain was significantly greater in the open Lichtenstein group. However, there was no difference in pain scores after 14 days up to the 1-year follow-up. Conclusion: Laparoscopic repair causes less pain in the early postoperative period, but chronic pain is similar in both groups. This suggests that laparoscopic hernia repair may allow earlier return to work owing to less pain in the early postoperative period.
Research Article
Open Access
Assessment of Sleep Quality and Its Correlation with Glycemic Control in Patients with Type 2 Diabetes Mellitus: A Prospective Cohort Study
Hetvi Dhavalbhai Patel,
Bhumi Khushalbhai Parmar,
Shivam Dhavalkumar Patel
Pages 203 - 206

Abstract
Background: Sleep disturbances are increasingly recognized as a potential risk factor affecting glycemic control in patients with type 2 diabetes mellitus (T2DM). Poor sleep quality can alter hormonal balance and insulin sensitivity, thereby exacerbating hyperglycemia. This study aimed to assess the quality of sleep and its association with glycemic control in individuals with T2DM. Materials and Methods: This prospective cohort study was conducted on 150 patients with diagnosed T2DM attending the outpatient department of a tertiary care center over a period of 12 months. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI), and glycemic control was evaluated through fasting plasma glucose (FPG), postprandial glucose (PPG), and glycated hemoglobin (HbA1c) levels at baseline and at 6-month follow-up. Participants were divided into two groups based on PSQI scores: good sleepers (PSQI ≤ 5) and poor sleepers (PSQI > 5). Statistical analysis was performed using Pearson’s correlation and independent t-test. Results: Out of 150 participants, 88 (58.7%) were identified as poor sleepers. The mean HbA1c in poor sleepers was significantly higher (8.3 ± 1.1%) compared to good sleepers (7.2 ± 0.9%, p < 0.01). A moderate positive correlation was observed between PSQI scores and HbA1c levels (r = 0.42, p < 0.01). Additionally, poor sleepers showed higher mean FPG (152 ± 18 mg/dL) and PPG (215 ± 25 mg/dL) compared to good sleepers (FPG: 134 ± 16 mg/dL; PPG: 192 ± 22 mg/dL). Conclusion: This study demonstrates a significant association between poor sleep quality and suboptimal glycemic control in patients with T2DM. Incorporating sleep assessment into routine diabetes management may help optimize metabolic outcomes.
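A brief Python sketch of the two analyses named in the methods, Pearson correlation between PSQI and HbA1c and an independent t-test between good and poor sleepers, using synthetic data purely to show the computation; scipy stands in for the study's statistics software.

import numpy as np
from scipy.stats import pearsonr, ttest_ind

rng = np.random.default_rng(2)
psqi = rng.integers(1, 15, 150)                      # synthetic PSQI scores
hba1c = 6.5 + 0.12 * psqi + rng.normal(0, 0.8, 150)  # loosely correlated HbA1c

r, p = pearsonr(psqi, hba1c)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")

good = hba1c[psqi <= 5]  # good sleepers (PSQI <= 5)
poor = hba1c[psqi > 5]   # poor sleepers (PSQI > 5)
t, p = ttest_ind(poor, good)
print(f"t = {t:.2f}, p = {p:.4f}")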
Research Article
Open Access
Correlation Between Subclinical Inflammatory Markers and Early Vascular Changes in Asymptomatic Young Adults with Cardiovascular Risk Factors
Jay Hirenbhai Shilu,
Nisarg Vipulbhai Rajyaguru,
Kruti Yogeshbhai Mahavar
Pages 198 - 202

Abstract
Background: Cardiovascular disease (CVD) often has a long asymptomatic phase before clinical manifestations appear. Subclinical inflammation and early vascular alterations such as increased carotid intima-media thickness (CIMT) and arterial stiffness are considered early indicators of atherosclerosis. Identifying these markers in young adults with cardiovascular risk factors (CVRFs) may help in early intervention and prevention strategies. Materials and Methods: A cross-sectional observational study was conducted on 120 asymptomatic young adults aged 20–35 years with at least one CVRF (e.g., family history, obesity, smoking, hypertension, or dyslipidemia). Serum levels of high-sensitivity C-reactive protein (hs-CRP), interleukin-6 (IL-6), and fibrinogen were measured as inflammatory markers. Vascular changes were assessed using carotid ultrasonography to measure CIMT and pulse wave velocity (PWV) for arterial stiffness. Statistical analyses included Pearson correlation and multiple linear regression to examine associations between inflammatory markers and vascular parameters. Results: The mean hs-CRP level was 2.8 ± 1.2 mg/L, IL-6 was 3.5 ± 1.0 pg/mL, and fibrinogen was 345 ± 50 mg/dL. CIMT averaged 0.63 ± 0.07 mm, and PWV was 7.2 ± 1.1 m/s. Significant positive correlations were observed between hs-CRP and CIMT (r = 0.45, p < 0.01), IL-6 and PWV (r = 0.39, p < 0.01), and fibrinogen and CIMT (r = 0.33, p < 0.05). Regression analysis indicated hs-CRP as an independent predictor of CIMT (p = 0.004). Conclusion: This study highlights a strong association between subclinical inflammatory markers and early vascular changes in young adults with CVRFs. Early identification of these markers may allow timely lifestyle or pharmacological interventions to prevent progression to overt CVD.
Research Article
Open Access
Evaluation of Serum Uric Acid as a Predictive Marker for Isolated Diastolic Hypertension in Prehypertensive Adults: A Longitudinal Study
Trusha Pansuriya,
Kruti Yogeshbhai Mahavar,
Azazkhan Umarali Nagori
Pages 194 - 197

Abstract
Background: Isolated diastolic hypertension (IDH) is an early manifestation of vascular dysfunction, particularly in younger adults. Serum uric acid (SUA) has been implicated in the pathophysiology of hypertension, yet its potential as a predictive biomarker for the development of IDH in prehypertensive individuals remains underexplored. This study aimed to evaluate the longitudinal association between SUA levels and progression to IDH in prehypertensive adults. Materials and Methods: A prospective longitudinal study was conducted among 320 prehypertensive individuals aged 25–45 years, recruited from the outpatient department of general medicine at a tertiary care hospital. Baseline SUA levels were measured using an enzymatic colorimetric assay. Participants were followed for 24 months, with blood pressure monitored at 6-month intervals. IDH was defined as diastolic BP ≥ 90 mmHg with systolic BP < 140 mmHg. Data were analyzed using Cox proportional hazards regression, adjusting for confounders including BMI, age, sex, and lifestyle factors. Results: Out of 320 participants, 68 (21.3%) developed IDH over two years. Mean baseline SUA was significantly higher in those who progressed to IDH (6.4 ± 1.1 mg/dL) compared to those who did not (5.3 ± 0.9 mg/dL, p < 0.001). A SUA threshold of >5.8 mg/dL predicted IDH with 76.5% sensitivity and 70.2% specificity (AUC = 0.78). Multivariate analysis revealed that elevated SUA was independently associated with higher risk of developing IDH (adjusted HR: 2.15; 95% CI: 1.41–3.27; p = 0.002). Conclusion: Elevated serum uric acid levels are significantly associated with increased risk of isolated diastolic hypertension among prehypertensive adults. Monitoring SUA may aid in early identification of individuals at risk for progression to overt hypertension.
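A hedged sketch of the Cox proportional-hazards step described in the methods (risk of progression to IDH as a function of baseline serum uric acid, adjusted for BMI, age and sex). It uses the lifelines package on synthetic placeholder data; column names and values are assumptions, not study data.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 320
df = pd.DataFrame({
    "sua": rng.normal(5.6, 1.0, n),   # baseline serum uric acid (mg/dL), synthetic
    "bmi": rng.normal(25.0, 3.0, n),
    "age": rng.integers(25, 46, n),
    "male": rng.integers(0, 2, n),
})
df["months"] = rng.uniform(6, 24, n)                            # follow-up time
df["idh"] = rng.binomial(1, 1 / (1 + np.exp(-(df.sua - 5.8))))  # synthetic event flag

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="idh")
cph.print_summary()  # hazard ratios with 95% confidence intervals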
Research Article
Open Access
Prevalence of Subclinical Bacteremia in Patients with Prosthetic Heart Valves: A Cross-sectional Study
Amar C Sajjan,
Nitesh K Patel,
Pratiksha Sojitra
Pages 189 - 193

Abstract
Introduction: Prosthetic heart valves are known to carry a long-term risk of infective endocarditis, often preceded by episodes of subclinical bacteremia. Identifying asymptomatic bacteremia in such patients is crucial to preventing complications. This study aimed to determine the prevalence of subclinical bacteremia among patients with prosthetic heart valves and assess its microbiological profile and associated risk factors. Material and Methods: A cross-sectional study was conducted at a tertiary care hospital over 18 months, enrolling 120 patients with prosthetic heart valves who were clinically asymptomatic. Detailed clinical and demographic data were recorded. Peripheral venous blood samples were collected aseptically and processed using automated blood culture systems. Positive cultures were further analyzed to identify the organisms and their antimicrobial susceptibility patterns. Results: Subclinical bacteremia was detected in 12 out of 120 patients (10%). The most frequently isolated organisms were coagulase-negative Staphylococcus species (50%), Streptococcus viridans (33.3%), and Enterococcus faecalis (16.7%). A higher prevalence was observed among patients with a history of valve replacement within the past year and those with comorbid diabetes mellitus (p < 0.05). No patients showed clinical signs of infective endocarditis during the study period. Conclusion: A significant proportion of patients with prosthetic heart valves may harbor subclinical bacteremia despite the absence of symptoms. Early detection, particularly in individuals with recent valve surgery or diabetes, can aid in timely intervention to prevent progression to prosthetic valve endocarditis. Periodic surveillance using blood cultures and consideration of prophylactic measures may be warranted in high-risk patients
Research Article
Open Access
A Study of Neutrophil to Lymphocyte ratio in COPD patients
Ashwin Songara,
Rohit Mukherjee,
Vibha Singh
Pages 184 - 188

Abstract
Background & Methods: The aim of the study is to assess the correlation of Neutrophil to lymphocyte ratio with BODE index in stable COPD and to compare the mean NLR values in patients with stable COPD and those with exacerbation of COPD. Results: The relationship between neutrophil-to-lymphocyte ratio (NLR) and BODE Index categories was investigated, and the results are presented. A statistically significant difference in NLR across BODE Index categories was observed (Kruskal-Wallis H = 16.555, p < 0.001). The mean NLR increased with increasing BODE Index category, ranging from 3.07 (SD = 2.00) in the <5 category to 6.05 (SD = 2.21) in the ≥7 category. The mean ranks also showed a corresponding increase, from 28.91 in the <5 category to 54.86 in the ≥7 category, indicating that higher disease severity (as reflected by a higher BODE Index) is associated with higher NLR. A comparison of neutrophil-to-lymphocyte ratio (NLR) between patients with and without exacerbation revealed a statistically significant difference (Mann-Whitney U statistic = 613.00, p < 0.001). Patients experiencing exacerbation (n = 32) had a significantly higher mean NLR (5.56, SD = 2.06) compared to those without exacerbation (n = 70), who had a mean NLR of 3.79 (SD = 2.30). This difference is further supported by the mean rank values, with the exacerbation group exhibiting a mean rank of 67.34 and the non-exacerbation group a mean rank of 44.26, and sum of ranks of 2155.00 and 3098.00 respectively, indicating a clear elevation of NLR during exacerbation events. Conclusion: This study demonstrated a significant positive correlation between the Neutrophil-to-Lymphocyte Ratio (NLR) and the BODE index in the studied COPD patient cohort, indicating an association between NLR and overall disease severity. A significant difference in mean NLR was observed between patient groups, with markedly higher values found in individuals experiencing acute exacerbations compared to those in a stable state.
Research Article
Open Access
Assessment of Silent Atherosclerosis and Its Association with Carotid Intima-Media Thickness in Apparently Healthy Young Adults: A Cross-Sectional Study
Dhruv Parekh,
Priyanka Ruparel
Pages 179 - 183

Abstract
Background: Silent atherosclerosis is an asymptomatic precursor to cardiovascular disease and may begin early in life, often remaining undetected in young, apparently healthy individuals. Carotid intima-media thickness (CIMT) is a reliable surrogate marker for early atherosclerotic changes and can be used to assess subclinical vascular damage. This study aimed to evaluate the prevalence of silent atherosclerosis using CIMT measurements and its association with various cardiometabolic parameters in apparently healthy young adults. Materials and Methods: A total of 200 apparently healthy individuals aged 18–35 years were enrolled after obtaining informed consent. Exclusion criteria included known cardiovascular disease, diabetes, or hypertension. Participants underwent detailed clinical evaluation, anthropometric measurements, and laboratory investigations, including fasting lipid profile, fasting blood glucose, and hs-CRP. CIMT was measured bilaterally using high-resolution B-mode ultrasonography. A CIMT value >0.8 mm was considered indicative of subclinical atherosclerosis. Results: The mean age of the participants was 26.3 ± 4.1 years, with 52% being male. The prevalence of increased CIMT (>0.8 mm) was found in 18.5% of subjects. Individuals with elevated CIMT had significantly higher BMI (26.7 ± 2.8 vs. 23.9 ± 2.6 kg/m², p < 0.001), total cholesterol (212.6 ± 31.4 vs. 178.5 ± 28.7 mg/dL, p = 0.002), LDL-C (134.9 ± 22.5 vs. 102.3 ± 19.1 mg/dL, p = 0.004), and hs-CRP levels (3.1 ± 1.2 vs. 1.5 ± 0.7 mg/L, p < 0.001). A positive correlation was observed between CIMT and both LDL-C (r = 0.43, p < 0.01) and hs-CRP (r = 0.39, p < 0.01). Conclusion: A significant proportion of apparently healthy young adults show early signs of atherosclerosis, as indicated by increased CIMT. Elevated BMI, LDL-C, and hs-CRP were strongly associated with subclinical atherosclerotic changes. Early screening and lifestyle interventions in this population may help mitigate future cardiovascular risk.
Research Article
Open Access
Comparative Evaluation of Maternal and Fetal Outcomes in Induction of Labor Using Foley Catheter Versus Misoprostol in Term Pregnancies: A Prospective Randomized Controlled Trial
Mori Yashvantsinh,
Paramar Heena,
Tadha Ketan,
Solanki Lomabhai
Pages 174 - 178

Abstract
Background: Induction of labor (IOL) is a common obstetric intervention, often necessitated in term pregnancies due to various maternal and fetal indications. Two widely used methods for cervical ripening are mechanical (e.g., Foley catheter) and pharmacological (e.g., misoprostol). This study aimed to compare maternal and fetal outcomes associated with Foley catheter versus misoprostol for labor induction in term pregnancies. Materials and Methods: A total of 200 pregnant women at ≥37 weeks gestation with singleton, cephalic presentation requiring labor induction were enrolled and randomly allocated into two groups: Group A (Foley catheter, n=100) and Group B (Misoprostol, n=100). Primary outcomes included time from induction to delivery, mode of delivery, and Bishop score improvement. Secondary outcomes assessed maternal complications (e.g., hyperstimulation, PPH) and neonatal outcomes (Apgar scores, NICU admissions). Data were analyzed using SPSS v25.0 with significance set at p<0.05. Results: Mean induction-to-delivery interval was significantly shorter in Group B (9.4 ± 2.1 hours) compared to Group A (12.6 ± 3.5 hours) (p<0.001). Vaginal delivery was achieved in 82% of women in Group B and 74% in Group A (p=0.19). Uterine hyperstimulation occurred in 12% of Group B versus 2% in Group A (p=0.01). NICU admissions were slightly higher in Group B (14%) than Group A (10%), but not statistically significant (p=0.42). No significant difference was observed in postpartum hemorrhage or Apgar scores between groups. Conclusion: Misoprostol demonstrated a shorter induction-to-delivery time and slightly higher vaginal delivery rate, but was associated with increased risk of uterine hyperstimulation. Foley catheter remains a safer option with fewer maternal side effects, though slightly less effective in rapid induction.
Research Article
Open Access
Impact of Duration of Diabetes Mellitus on the Physiological Decline of Sensory and Motor Nerve Conduction Velocities: A Comparative Study
Parthkumar Harsukhlal Kaneriya,
HirenKumar Kantilal Sitapara,
Mahimn Amit Joshi
Pages 170 - 173

Abstract
Background: Diabetes mellitus (DM) is a chronic metabolic disorder known to cause progressive neuropathic complications. The duration of diabetes is a critical determinant in the severity of nerve dysfunction. This study aims to evaluate the impact of diabetes duration on sensory and motor nerve conduction velocities (NCVs), comparing recently diagnosed patients with those with longstanding disease. Materials and Methods: A cross-sectional, comparative study was conducted involving 60 type 2 diabetic patients. Group A included 30 individuals with a diabetes duration of less than 5 years, and Group B comprised 30 individuals with diabetes for more than 10 years. Standard nerve conduction studies were performed on the median, ulnar, peroneal, and sural nerves using a digital electromyography system. Motor nerve conduction velocity (MNCV) and sensory nerve conduction velocity (SNCV) values were recorded and compared between the two groups. Results: The mean MNCV of the median nerve in Group A was 52.3 ± 3.1 m/s, while in Group B it was 43.6 ± 4.7 m/s. The ulnar nerve MNCV showed a similar decline from 53.7 ± 2.8 m/s in Group A to 45.1 ± 3.9 m/s in Group B. SNCV of the sural nerve was 47.8 ± 2.4 m/s in Group A and 38.9 ± 3.2 m/s in Group B. All observed differences between the groups were statistically significant (p < 0.01). Conclusion: The study demonstrates a significant reduction in both sensory and motor nerve conduction velocities in patients with a longer duration of diabetes. These findings reinforce the importance of early glycemic control to prevent or delay diabetic neuropathy.
Research Article
Open Access
Evaluation of Artificial Intelligence-Based Risk Scoring for Early Sepsis Detection in Emergency Department Admissions
Mahimn Amit Joshi,
Parthkumar Harsukhlal Kaneriya,
Janvi Bhanjibhai Panchotiya
Pages 166 - 169

Abstract
Background: Early identification of sepsis in emergency department (ED) patients remains a clinical challenge due to its nonspecific presentation. Traditional scoring systems often lack sensitivity or are time-consuming. Artificial intelligence (AI)-based risk scoring tools offer a promising alternative for real-time prediction by processing vast clinical data rapidly and accurately. Materials and Methods: A retrospective observational study was conducted on 2,000 patients admitted to the ED of a tertiary care hospital over 12 months. An AI-based sepsis risk scoring model was developed using machine learning algorithms trained on vital signs, laboratory results, and demographic data. The AI model was evaluated against the conventional Systemic Inflammatory Response Syndrome (SIRS) criteria and the Quick Sequential Organ Failure Assessment (qSOFA) score. Performance was assessed based on sensitivity, specificity, positive predictive value (PPV), and area under the receiver operating characteristic curve (AUROC). Results: Out of 2,000 patients, 300 (15%) were diagnosed with sepsis within 48 hours of admission. The AI model demonstrated superior performance with an AUROC of 0.92, compared to 0.78 for SIRS and 0.74 for qSOFA. Sensitivity and specificity for the AI model were 88% and 85%, respectively, while PPV was 68%. In contrast, SIRS showed a sensitivity of 72%, specificity of 62%, and PPV of 45%; qSOFA achieved 66% sensitivity, 70% specificity, and 49% PPV. Conclusion: AI-based risk scoring significantly improves early sepsis detection in the ED setting compared to traditional scoring methods. Its implementation could support timely clinical decisions, potentially improving patient outcomes and reducing mortality. Further prospective validation is recommended.
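A compact sketch of how the reported discrimination metrics can be computed for any risk score: AUROC plus sensitivity, specificity and PPV at a chosen cut-off. The labels and scores below are synthetic stand-ins for the AI model's output, and scikit-learn is assumed as the toolkit.

import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(5)
y_true = rng.binomial(1, 0.15, 2000)                              # ~15% sepsis prevalence
risk = np.clip(0.3 * y_true + rng.normal(0.3, 0.15, 2000), 0, 1)  # synthetic risk scores

print(f"AUROC = {roc_auc_score(y_true, risk):.2f}")

y_pred = (risk >= 0.5).astype(int)                                # example decision cut-off
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.1%}, specificity = {tn / (tn + fp):.1%}, "
      f"PPV = {tp / (tp + fp):.1%}")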
Research Article
Open Access
Impact of Polypharmacy on Hospital Readmission Rates in Elderly Patients with Multimorbidity
Bhumi K. Parmar,
Hetvi Dhavalbhai Patel,
Sachin Parekh
Pages 162 - 165

Abstract
Background: Polypharmacy, commonly defined as the concurrent use of five or more medications, is a frequent phenomenon in elderly patients with multimorbidity. While intended to manage multiple chronic conditions, polypharmacy is associated with adverse drug reactions, medication non-adherence, and increased healthcare utilization. Hospital readmission, a critical indicator of healthcare quality, may be significantly impacted by polypharmacy in this vulnerable population. Materials and Methods: A retrospective observational study was conducted involving 300 elderly patients (aged ≥65 years) admitted to a tertiary care hospital over a 12-month period. Inclusion criteria were the presence of at least two chronic diseases and discharge to home following hospitalization. Patients were grouped based on the number of prescribed medications at discharge: non-polypharmacy (<5 drugs), moderate polypharmacy (5–9 drugs), and excessive polypharmacy (≥10 drugs). The 30-day readmission rate was recorded for each group. Data were analyzed using chi-square tests and logistic regression to assess the association between polypharmacy levels and readmission risk. Results: Among the study cohort, 60 patients (20%) had non-polypharmacy, 150 (50%) had moderate polypharmacy, and 90 (30%) had excessive polypharmacy. The 30-day readmission rates were 10%, 22%, and 38% in the non-, moderate-, and excessive polypharmacy groups, respectively (p<0.01). Logistic regression indicated that excessive polypharmacy was significantly associated with higher odds of readmission (OR: 3.5; 95% CI: 1.8–6.9) after adjusting for age, comorbidities, and functional status. Conclusion: Polypharmacy, especially at excessive levels, is strongly linked to increased hospital readmission rates in elderly patients with multimorbidity. Optimizing medication regimens and enhancing medication review at discharge may help reduce readmissions and improve clinical outcomes in this population.
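A short sketch of the chi-square comparison of 30-day readmission across the three polypharmacy groups. The cell counts are reconstructed from the percentages quoted above (10% of 60, 22% of 150, 38% of 90) and are therefore approximate; scipy is used here in place of the study's statistics package.

from scipy.stats import chi2_contingency

#            readmitted   not readmitted
table = [[6, 54],    # non-polypharmacy (<5 drugs), ~10% of 60
         [33, 117],  # moderate polypharmacy (5-9 drugs), ~22% of 150
         [34, 56]]   # excessive polypharmacy (>=10 drugs), ~38% of 90

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")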
Research Article
Open Access
Impact of Lifestyle Counseling on Glycemic Control in Type 2 Diabetes Patients Managed in General Practice
Aratiben Kishorbhai Dhandhaliya,
Bharatkumar Mansinhbhai Chaudhari,
Kinjal Karshanbhai Gohel
Pages 158 - 161

Abstract
Background: Type 2 diabetes mellitus (T2DM) is a chronic metabolic disorder that requires continuous lifestyle management to achieve optimal glycemic control. General practice settings provide a crucial platform for delivering lifestyle counseling, yet its impact on glycemic outcomes remains underutilized and inconsistently evaluated. This study aimed to assess the effect of structured lifestyle counseling on glycemic control among T2DM patients in general practice. Materials and Methods: A prospective interventional study was conducted in four general practice clinics over a period of six months. A total of 200 patients with diagnosed T2DM were enrolled and divided into two groups: intervention group (n = 100) receiving structured lifestyle counseling and control group (n = 100) receiving standard care. The counseling sessions focused on diet, physical activity, smoking cessation, and stress management and were delivered monthly by trained healthcare providers. Glycemic control was assessed using HbA1c levels at baseline and at the end of six months. Data were analyzed using SPSS v26, and paired t-tests and chi-square tests were applied where appropriate (p < 0.05 considered statistically significant). Results: At baseline, the mean HbA1c levels were 8.4% ± 1.1 in the intervention group and 8.3% ± 1.2 in the control group (p = 0.67). After six months, the intervention group showed a significant reduction in HbA1c to 7.2% ± 0.9, compared to 8.1% ± 1.1 in the control group (p < 0.001). Additionally, 62% of the patients in the intervention group achieved HbA1c <7.0%, while only 28% did so in the control group. Improvements in dietary adherence and physical activity levels were significantly higher in the intervention group (p < 0.01). Conclusion: Structured lifestyle counseling in general practice significantly improves glycemic control among patients with type 2 diabetes. Integrating behavioral interventions into routine primary care can enhance long-term diabetes outcomes and reduce the burden of complications.
Research Article
Open Access
A Cross-Sectional Study on Evaluation of Knowledge and Attitude toward Basic Life Support and Cardiopulmonary Resuscitation Among First Year Medical Graduates
Deep Parsotambhai Antala,
Param Jagani,
Yash Raval
Pages 154 - 157

View PDF
Abstract
Background: Basic Life Support (BLS) and Cardiopulmonary Resuscitation (CPR) are essential emergency interventions that can significantly improve survival in cardiac arrest cases. Early training of medical students in these life-saving procedures is vital to build confidence and preparedness. This study aimed to assess the knowledge and attitude toward BLS and CPR among first-year medical students. Materials and Methods: A cross-sectional survey was conducted among 45 first-year MBBS students from a tertiary medical college using a pre-validated, self-administered questionnaire. The tool included 15 knowledge-based questions and 10 attitude-related statements. Descriptive statistics were used to calculate means and percentages, while chi-square tests were applied to identify associations between knowledge and attitude scores. Results: Out of 45 participants, 43 completed the survey (response rate: 95.5%). The mean knowledge score was 8.3 ± 2.5 (out of 15), with only 18 students (approximately 42%) demonstrating adequate knowledge (score ≥10). Meanwhile, 35 students (81%) showed a positive attitude toward learning and performing BLS/CPR. Prior exposure to BLS training was reported by 12 students (28%) and was significantly associated with higher knowledge scores (p < 0.01). No significant gender differences were noted in either knowledge or attitude scores. Conclusion: The study highlights a moderate level of knowledge but a generally positive attitude toward BLS and CPR among first-year medical students. These findings emphasize the importance of integrating formal BLS/CPR training early in the medical curriculum to bridge knowledge gaps and foster skill development.
Research Article
Open Access
Cardiac reverse modelling outcomes following aortic valve replacement in isolated severe aortic stenosis in patients with preoperative normal LV function vs. LV dysfunction: A single centre study
Deepak R. Sridhar,
Madhur Kumar,
Dipti Ranjan Dhar,
Aadarsha Rai,
Anubhav Gupta
Pages 147 - 153

View PDF
Abstract
Background: Aortic stenosis (AS) is associated with significant cardiac remodelling. LV dilatation and, eventually, concentric left ventricular hypertrophy ultimately result in LV dysfunction. Patients with AS exhibit symptoms such as syncope, angina and dyspnoea, with dyspnoea developing last and indicating underlying LV dysfunction secondary to cardiac remodelling. Surgical aortic valve replacement (AVR) has been shown to be associated with reverse cardiac remodelling and a reduction in left ventricular dimensions, thus improving quality of life. However, outcomes of AVR in patients with pre-existing LV dysfunction need to be evaluated. Materials and Methods: This retrospective study was conducted at Safdarjung Hospital, New Delhi. Thirty patients who underwent surgical aortic valve replacement for isolated severe aortic stenosis over a 2-year period were selected. Fifteen patients had a left ventricular ejection fraction ≥45% and 15 patients had an ejection fraction <45%. Data regarding preoperative and postoperative LV function and left ventricular dimensions (end-systolic and end-diastolic) were collected by transthoracic echocardiography, then analysed and compared. Results: The predominant aortic valve pathology was calcification in both the normal EF group (40%) and the low EF group (40%). Mean postoperative change in LV ejection fraction was -2% in the normal EF group and +3% in the low EF group (p < 0.001). Left ventricular end-systolic dimensions showed a greater reduction in patients with normal LV function as compared to patients with low EF (-6.1 vs. -3.5, p < 0.001). There was no statistically significant difference when postoperative reduction in LV end-diastolic dimension was compared (-6.7 vs. -6.1, p = 0.518). Two patients in the low EF group died during the follow-up period, while no mortality was observed in the normal EF group (p = 0.147). Conclusion: The results show that, while there was a greater reduction in LV end-systolic dimension in the group with normal EF, there was nevertheless an overall reduction in left ventricular end-systolic and end-diastolic dimensions and an overall improvement in EF, with no statistically significant difference in mortality between the two groups, indicating the beneficial role of surgical AVR in patients with low EF as well.
Research Article
Open Access
Pulmonary Adaptations to Sprint and Endurance Training: A Cross-sectional Comparative Study
Santosh Rohidas Bokre,
Dhanashri - Joshi Rahate,
Anupam Suhas Khare,
Akshaya Meher
Pages 142 - 146

View PDF
Abstract
Background: Pulmonary function is a significant predictor of athletic ability, particularly in track athletes. Lung volumes and capacities respond differently to aerobic and anaerobic training programs. Aerobic mechanisms, primarily used by long-distance runners, improve endurance and oxygen utilization, whereas sprinters rely mostly on anaerobic metabolism and emphasize short bursts of power and energy. Objective: To compare pulmonary function test (PFT) values in sprinters and long-distance runners and to evaluate the influence of different patterns of training on respiratory function. Materials and Methods: This cross-sectional study included 100 trained male athletes aged 18-25 years, divided into two groups: 50 sprinters and 50 long-distance runners. Pulmonary functions were assessed using computerized spirometry, measuring Forced Vital Capacity (FVC), Forced Expiratory Volume in 1 second (FEV1), FEV1/FVC ratio, Peak Expiratory Flow Rate (PEFR), and Maximum Voluntary Ventilation (MVV). Standardized protocols were followed as per American Thoracic Society (ATS) guidelines. Results and Analysis: Distance runners exhibited significantly higher values of FVC, FEV1, PEFR, and MVV compared to sprinters (p < 0.05), indicating higher pulmonary efficiency. FEV1/FVC ratios were similar between the two groups. Higher values in endurance athletes suggest greater ventilatory muscle endurance and alveolar-capillary diffusion capacity. Conclusion: Endurance running conditions the pulmonary system more than sprint running. Aerobic endurance exercise correlates with better lung capacity, inspiratory muscle performance, and functional ventilation. Sprint-trained (anaerobic) athletes may also benefit from incorporating elements of endurance training to optimize overall respiratory function.
Research Article
Open Access
Assessment of Improvement in Clinical Symptoms of Weakness and Muscle Cramps in Type II Diabetics Receiving SGLT2 Inhibitors and Its Correlation with Serum Magnesium Levels
Vasant Shrivastava,
Gaurav Shrivastava,
Samrin Sheikh
Pages 138 - 141

View PDF
Abstract
Background: SGLT2 inhibitors consistently alter serum electrolyte levels in the form of slight increases in serum magnesium. Type II diabetes itself causes increased urinary loss of Mg through poor tubular Mg resorption. This study aimed to analyze the role played by SGLT2 inhibitors in improving symptoms of generalized weakness and muscle cramps in type II diabetes and its association with increases in serum magnesium levels. Materials and Methods: Seventy-six hospital OPD patients were enrolled in the study and split into two groups. Informed consent was obtained before starting the study. One group received SGLT2 inhibitors and the other group received oral hypoglycemic agents (OHA). Follow-up was done in two phases over a total duration of 16 weeks. Results: The mean age of patients on SGLT2 inhibitors was 55 ± 9.5 years, while the mean age of patients on OHA drugs was 50 ± 5.3 years. In the SGLT2 intervention arm, thirteen patients (34.2%) had both weakness and cramps, 20 patients (52.6%) had only weakness, and 5 patients (13.1%) had only cramps at the beginning of the study. Serum Mg levels were normal in 26 patients (68.4%) and low in 12 patients (31.5%). Conclusion: Comparison of the two groups suggests that the use of SGLT2 inhibitors reduces symptoms of weakness and cramps; however, serum magnesium levels alone may not serve as the most effective measure for correlating this improvement.
Research Article
Open Access
To Correlate Clinical Manifestations with MRI Findings in Children with Cerebral Palsy
Shehbaz Khan,
Shivani Pitale
Pages 133 - 137

View PDF
Abstract
Background & Methods: The aim of the study is to correlate clinical manifestations with MRI findings in children with cerebral palsy. MRI is a useful tool for elucidating the etiology and pathogenesis of abnormal brain development resulting from antenatal, perinatal and neonatal injury. Recently, MR imaging has also been used to detect fetal brain damage and to determine the anatomical pattern and likely timing of the brain lesion. Results: White matter injury was the most common finding (51%), followed by a normal MRI (27%). The chi-square statistic was 50.6833 with a p-value < 0.00001; the result is significant at p < 0.05. Conclusion: According to the reviewed studies, most children with cerebral palsy have abnormal neuroradiological findings. MRI plays a significant and valuable role in revealing the pathologic basis of CP and showed strong correlations with clinical findings in term- and preterm-born children. It also has high potential to detect the type, extent, and possible timing of brain damage in children with CP.
Research Article
Open Access
Evaluating the Impact of a Structured Mentorship Program on Medical Students: A Cross-Sectional Feedback Analysis
Shafique Ahmed Mundewadi,
Santosh Madhao Kayande,
Monica Suresh Yunati
Pages 127 - 132

View PDF
Abstract
Background: Mentorship programs have emerged as crucial support structures in medical education, aiding student transition and professional development. Objectives: To evaluate the effectiveness of a structured mentorship program based on student feedback and suggest improvements. Materials and Methods: A cross-sectional observational study was conducted among first-year MBBS students enrolled in a structured mentorship program, of whom 78 provided feedback. Feedback was collected through a validated questionnaire covering eight domains: mentor-mentee interaction, time allocation, guidance provided, accessibility, communication, overall satisfaction, benefits derived, and suggestions for improvement. Descriptive statistics and thematic analysis were applied. Results and Analysis: Out of 100 students, 78 responded to the feedback form (response rate: 78%). The majority reported high satisfaction with mentor-mentee interaction, time allocation, and relationship boundaries maintained by mentors (85.9%), meeting frequency (71.78%), and mentor-mentee meeting productivity (69.23%). 33.33% of students stated that the most beneficial change they experienced due to the mentorship program was improved self-confidence and stress relief. Some students suggested increasing the meeting frequency and providing more academic guidance. Thematic analysis identified recurring themes such as enhanced confidence, clarity in career goals, and reduced academic stress. Conclusion: The mentorship program was well received, positively influencing the academic and emotional development of medical students. Structured implementation and regular feedback are crucial to improving mentorship quality.
Research Article
Open Access
Exploring the Role of Oxidative Stress Biomarkers in Breast Cancer Diagnosis and Prognosis in Madhya Pradesh
Amarsingh Hazari,
Shreya Nighoskar
Pages 118 - 126

View PDF
Abstract
Background: Breast cancer is one of the most prevalent and fatal cancers among women worldwide. It is a complex disease driven by genetic, environmental, and lifestyle factors. Recent studies highlight oxidative stress as a major factor in breast cancer progression. This study investigates the potential of oxidative stress biomarkers—specifically malondialdehyde (MDA) and 8-hydroxy-2'-deoxyguanosine (8-OHdG)—in diagnosing and predicting prognosis in breast cancer patients in Madhya Pradesh, India. Objectives: To explore the association between oxidative stress biomarkers and clinical parameters such as tumor size, stage, metastasis, and treatment response in breast cancer patients. Methods: A cross-sectional study was conducted with 66 participants: 33 breast cancer patients and 33 healthy controls. Blood and tumor tissue samples were collected from participants, and oxidative stress biomarkers (MDA and 8-OHdG) and antioxidant enzyme activities (SOD, catalase, GPx) were measured. Statistical analyses were performed to assess correlations between biomarkers and clinical characteristics. Results: Significantly higher levels of MDA and 8-OHdG were found in breast cancer patients compared to healthy controls. Elevated oxidative stress biomarkers correlated with larger tumor sizes, advanced cancer stages, and metastasis. Antioxidant enzyme activity (SOD, catalase, GPx) was reduced in tumor tissues, suggesting compromised antioxidant defenses. Conclusion: Oxidative stress biomarkers such as MDA and 8-OHdG have potential as diagnostic and prognostic tools for breast cancer. Their correlation with tumor size, stage, and metastasis emphasizes their importance in assessing disease progression, particularly in regions with limited healthcare infrastructure
Research Article
Open Access
A Prospective Observational Study on the Incidence and Predictors of Difficult Airway in Patients Undergoing Elective Surgeries Under General Anesthesia
Jigneshkumar D. Patel,
Prasant Devendrabhai Chaudhary,
Amitkumar Abhaykumar Chandak
Pages 113 - 117

View PDF
Abstract
Background: Difficult airway management during general anesthesia poses a significant risk of morbidity and mortality. While several risk factors have been identified, a comprehensive prospective evaluation of incidence and predictors in elective surgical patients is warranted to improve patient safety. Methods: A total of 500 adult patients scheduled for elective surgical procedures under general anesthesia were enrolled. Pre-anesthetic airway assessments, including Mallampati score, thyromental distance, neck circumference, and mouth opening, were recorded. During anesthesia, intubation difficulty was assessed and graded using the Cormack-Lehane classification. Logistic regression analysis was performed to identify independent predictors of difficult airway. Results: The overall incidence of difficult airway was 6.5% (n=78). Univariate analysis revealed significant associations between difficult airway and Mallampati score ≥ 3 (OR 4.2, 95% CI 2.5-7.1, p<0.001), thyromental distance < 6 cm (OR 3.1, 95% CI 1.8-5.3, p<0.001), neck circumference > 40 cm (OR 2.8, 95% CI 1.6-4.9, p<0.001), and a history of obstructive sleep apnea (OR 5.5, 95% CI 2.9-10.4, p<0.001). Multivariate logistic regression identified Mallampati score ≥ 3 (Adjusted OR 3.8, 95% CI 2.2-6.5, p<0.001) and a history of obstructive sleep apnea (Adjusted OR 4.9, 95% CI 2.5-9.6, p<0.001) as independent predictors of difficult airway. Conclusion: Difficult airway occurs in a notable proportion of patients undergoing elective surgery. Preoperative assessment of Mallampati score and screening for obstructive sleep apnea are crucial for identifying patients at increased risk of difficult airway and implementing appropriate airway management strategies to improve patient safety.
Research Article
Open Access
Prevention of Deaths from Sudden Cardiac Arrest by Early Screening and Diagnosis
Paidi Suresh,
Shaik rajah Arif,
K Vigneswar Reddy,
Vuyyuru Sasank,
Datla V Krishnam raju
Pages 109 - 112

View PDF
Abstract
Background: Sudden cardiac arrest (SCA) remains one of the leading causes of mortality globally, often occurring without prior symptoms. Early screening and diagnosis of cardiac anomalies can significantly reduce mortality. This study aimed to evaluate the effectiveness of early diagnostic tools in preventing deaths due to SCA in a population-based cohort. Materials and Methods: A prospective observational study was conducted over 6 months involving 100 participants aged 20–60 years, selected from general outpatient clinics based on family history of cardiac disease or symptoms such as palpitations or syncope. Each participant underwent ECG, echocardiography, serum cardiac biomarkers, and risk stratification using the Framingham Risk Score. Those with positive findings were referred for cardiologist evaluation and follow-up. Mortality and incidence of cardiac events were monitored. Results: Out of 100 participants, 24% showed ECG abnormalities, 18% had reduced ejection fraction on echocardiography, and 12% had elevated troponin-I levels. Overall, 30 individuals were classified as high-risk for SCA. Preventive interventions including lifestyle modification, medical management (beta-blockers, statins), and ICD placement (in 3%) were initiated. No cases of SCA-related mortality were reported during the follow-up period. Participant compliance with screening was 92% (Table 1), and early diagnosis was significantly associated with reduced event incidence (p < 0.05) (Table 2). Conclusion: Early screening and diagnosis significantly aid in identifying individuals at risk for SCA and allow for timely preventive measures, thereby reducing mortality. Implementation of structured screening protocols in outpatient settings can be a vital public health strategy.
Research Article
Open Access
To evaluate vitamin D level in diagnosed tuberculosis cases in Central India
Vishal Patidar,
Shiv Kumar Pandey,
Ranjana Verma
Pages 105 - 108

View PDF
Abstract
Background & Methods: The aim of the study is to evaluate the levels of Vitamin D in patients diagnosed with TB, both pulmonary and extrapulmonary, in Central India. The participants for this study included adult patients diagnosed with tuberculosis, both pulmonary and extrapulmonary, who attended the outpatient or inpatient departments of Respiratory Medicine.
Results: TB cases had a substantially higher risk of Vitamin D deficiency than healthy controls. The odds of having Vitamin D deficiency were 3.83 times higher in TB patients than in healthy individuals. This association was statistically significant, with a p-value of 0.001 and a 95% confidence interval ranging from 1.746 to 8.402. Conclusion: The findings of the study revealed that Vitamin D deficiency was significantly more prevalent among TB patients (85.7%) compared to healthy controls (61%), with a statistically significant p-value of 0.001. Furthermore, the severity of Vitamin D deficiency was greater among TB cases, with fewer individuals achieving desirable Vitamin D levels compared to controls, and this difference was also statistically significant (p < 0.001).
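The reported odds ratio can be sanity-checked from the two deficiency prevalences quoted in the conclusion (85.7% in TB cases vs. 61% in controls). The short sketch below reproduces the point estimate only; without the raw group sizes the confidence interval cannot be recomputed.

```python
# Checking the reported odds ratio from the deficiency prevalences given in
# the abstract (85.7% in TB cases vs. 61% in controls). Group sizes are not
# reported here, so only the point estimate (not the 95% CI) can be verified.
p_tb, p_ctrl = 0.857, 0.61

odds_tb = p_tb / (1 - p_tb)        # ~5.99
odds_ctrl = p_ctrl / (1 - p_ctrl)  # ~1.56
odds_ratio = odds_tb / odds_ctrl
print(f"OR = {odds_ratio:.2f}")    # ~3.83, matching the reported value
```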
Research Article
Open Access
Prenatal Prediction of Fetal Respiratory Distress by Measuring Intrauterine Fetal Pulmonary Artery Doppler Indices
Alka Agrawal,
Prem. S. Tripathi,
Chandrajeet Yadav,
Priyal Chouhan,
Navdeep Kaur,
Ananya Kalantri,
Pragya Verma
Pages 94 - 104

View PDF
Abstract
Background: Neonatal Respiratory Distress Syndrome (RDS) is a leading cause of neonatal morbidity and mortality, especially among preterm infants. It primarily results from insufficient pulmonary surfactant production, leading to alveolar collapse and impaired gas exchange. Early and accurate prediction of fetal lung maturity (FLM) is essential for obstetricians to plan timely interventions, such as antenatal corticosteroid administration and appropriate delivery timing, to reduce the risk of RDS. While traditional methods like amniocentesis remain the gold standard for FLM assessment, their invasive nature limits widespread use. Recent advances have highlighted fetal pulmonary artery Doppler, particularly the Acceleration Time to Ejection Time (AT/ET) ratio, as a promising non-invasive alternative for predicting lung maturity. Aim: This study aimed to evaluate the role of fetal pulmonary artery Doppler indices, especially the AT/ET ratio, in predicting fetal lung maturity and correlating these findings with the incidence of neonatal respiratory distress syndrome. Methods: A prospective observational study was conducted on 170 third-trimester pregnant women undergoing routine antenatal screening at a tertiary care center in India. Fetal main pulmonary artery Doppler parameters—including S/D ratio, Pulsatility Index (PI), Resistive Index (RI), Peak Systolic Velocity (PSV), and AT/ET ratio—were measured. Neonatal outcomes, particularly the development of RDS, were documented and analyzed in relation to the antenatal Doppler findings. Results: Of the 170 neonates, 39 (22.9%) developed RDS, with a significantly higher incidence among preterm deliveries (81.8% before 37 weeks). The AT/ET ratio showed strong predictive value for RDS, with a cutoff of 0.30 yielding sensitivity of 92.3%, specificity of 91.6%, positive predictive value of 76.6%, and negative predictive value of 97.6%. Other Doppler parameters such as PSV, PI, and RI did not demonstrate statistically significant associations with RDS. Conclusion: The fetal pulmonary artery AT/ET ratio is a reliable, non-invasive Doppler marker for predicting neonatal RDS. Incorporating this parameter into routine third-trimester ultrasound assessments may aid in early identification of at-risk fetuses and guide timely perinatal interventions to improve neonatal respiratory outcomes.
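For readers who want to see how the reported diagnostic indices fit together, the 2×2 table implied by the abstract (170 neonates, 39 with RDS) can be reconstructed approximately and the indices recomputed. The counts in the sketch below are back-calculated and rounded, so they are illustrative rather than the authors' raw data.

```python
# Reconstructed 2x2 table for the AT/ET 0.30 cutoff, back-calculated from
# the abstract (170 neonates, 39 with RDS, sensitivity 92.3%, specificity
# 91.6%). Counts are rounded to whole neonates and purely illustrative.
tp, fn = 36, 3     # RDS cases flagged / missed by the cutoff
tn, fp = 120, 11   # non-RDS cases correctly / incorrectly flagged

sensitivity = tp / (tp + fn)   # ~0.923
specificity = tn / (tn + fp)   # ~0.916
ppv = tp / (tp + fp)           # ~0.766
npv = tn / (tn + fn)           # ~0.976
print(f"Sens {sensitivity:.1%}, Spec {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```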
Research Article
Open Access
Evaluation of Optic Nerve Sheath Diameter by Ultrasound in Patients with Non-Traumatic Causes of Raised Intracranial Pressure
Alka Agrawal,
Archana Bhatnagar,
Gaddameedi Sudha Chandana,
Ayushi Bansal,
N. Anil Kumar Reddy,
Poonam Joshi
Pages 86 - 93

View PDF
Abstract
Background: Raised intracranial pressure (ICP) is a life-threatening condition requiring urgent detection. Invasive methods for monitoring ICP are accurate but carry risks and are not always feasible. The optic nerve sheath diameter (ONSD), measured by ultrasound, has emerged as a quick, non-invasive tool to assess ICP, but most studies have focused on traumatic brain injury, and there are limited data on non-traumatic cases. The aim was to evaluate the diagnostic accuracy of ultrasound-measured ONSD in detecting raised ICP in patients with non-traumatic neurological conditions. Materials and Methods: A cross-sectional study was conducted on 100 patients with suspected raised ICP due to non-traumatic neurological conditions. ONSD was measured bilaterally using point-of-care ultrasound and compared with CT and MRI findings. ROC analysis was used to determine optimal cutoff values, sensitivity, and specificity. Results: Mean ONSD was significantly higher in patients with raised ICP (right: 5.75 mm; left: 5.80 mm) compared to those with normal ICP (right: 5.36 mm; left: 5.48 mm), with p-values of 0.01 and 0.03 respectively. Cutoff values of 5.71 mm (right) and 5.83 mm (left) yielded high sensitivity (90.7%) and moderate specificity (68–76%). Overall diagnostic accuracy was 80–81%. Conclusion: Ultrasound of the optic nerve sheath is a sensitive and rapid method to detect raised ICP in non-traumatic neurological emergencies. It can serve as a useful bedside screening tool, especially in settings where immediate imaging or invasive monitoring is not available.
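The ROC-based cutoff selection described in the methods follows a standard pattern: compute the ROC curve of ONSD against the imaging reference standard, then pick the threshold that maximizes Youden's J. The sketch below illustrates that workflow with invented ONSD values and labels; none of the numbers are from the study.

```python
# Minimal sketch of ROC-based cutoff selection for ONSD using Youden's J.
# The ONSD values and labels below are invented purely for illustration.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

onsd_mm = np.array([5.1, 5.3, 5.4, 5.5, 5.6, 5.7, 5.8, 5.9, 6.0, 6.2])
raised_icp = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])  # 1 = raised ICP on CT/MRI

fpr, tpr, thresholds = roc_curve(raised_icp, onsd_mm)
best = np.argmax(tpr - fpr)                      # Youden's J = sens + spec - 1
print(f"AUC = {roc_auc_score(raised_icp, onsd_mm):.2f}")
print(f"Optimal cutoff ~ {thresholds[best]:.2f} mm "
      f"(sensitivity {tpr[best]:.0%}, specificity {1 - fpr[best]:.0%})")
```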
Research Article
Open Access
Systematic Review: The Impact of Poor Sleep Quality on Psychological and Physical Health
Sneha K,
Vandana Daulatabad,
Prafull K,
Manoj Gundeti
Pages 79 - 85

View PDF
Abstract
Background: Poor sleep quality has emerged as a significant public health concern, increasingly associated with a wide array of psychological and physical health impairments. This systematic review aims to evaluate the cumulative evidence on how disrupted or inadequate sleep affects mental well-being, cognitive functioning, cardiovascular health, immune response, and metabolic regulation. Databases including PubMed, Scopus, and PsycINFO were searched for peer-reviewed articles from 2000 to 2024 using keywords such as "sleep quality," "insomnia," "mental health," "cardiovascular disease," and "immune dysfunction." A total of 103 studies met inclusion criteria, including longitudinal studies, randomized controlled trials, and meta-analyses. The results indicate a strong and consistent correlation between sleep disturbances and increased risk for depression, anxiety, cognitive impairment, and suicidality. Physiologically, sleep deprivation contributes to elevated cortisol levels, disrupted circadian rhythm, and decreased neurogenesis, impacting brain plasticity and emotional regulation. Physical health outcomes linked to poor sleep include heightened blood pressure, impaired glucose tolerance, weight gain, increased systemic inflammation, and compromised immune surveillance, all of which elevate susceptibility to chronic conditions such as hypertension, Type 2 diabetes, cardiovascular diseases, and autoimmune disorders. Emerging evidence also highlights sleep disruption as a potential modifiable risk factor in neurodegenerative diseases such as Alzheimer’s and Parkinson’s. Mechanistic insights suggest that the effects of poor sleep are mediated by dysregulation of the hypothalamic-pituitary-adrenal (HPA) axis, sympathetic nervous system overactivation, alterations in melatonin secretion, and imbalances in pro-inflammatory cytokines such as IL-6 and TNF-α. Furthermore, societal factors like shift work, digital screen exposure, and psychosocial stress exacerbate sleep disturbances across age groups. The findings emphasize the critical need for early diagnosis and interventions targeting sleep hygiene, behavioral therapies, and public health awareness to mitigate the widespread consequences of poor sleep. Future studies must focus on longitudinal analyses and tailored interventions to establish causality and improve clinical outcomes through optimized sleep health.
Research Article
Open Access
Screening and Risk Factor Analysis of Depression Among Elderly Patients Attending General Practice
Agam Kamaleshkumar Patel,
Utsav Dharmendrabhai Patel,
Shubham Mehulkumar Gohil
Pages 75 - 78

View PDF
Abstract
Background: Depression is a prevalent but underdiagnosed condition among the elderly, often masked by somatic complaints or attributed to aging. Early identification in general practice settings can improve outcomes and quality of life. This study aimed to screen for depression and analyze associated risk factors among elderly patients attending general practice clinics. Materials and Methods: A total of 300 patients aged 60 years and above were recruited using systematic random sampling. Depression was assessed using the Geriatric Depression Scale – Short Form (GDS-15), and scores ≥5 were considered indicative of depression. Data on demographic characteristics, medical history, social support, physical activity, and cognitive status were collected through structured interviews. Statistical analysis was performed using SPSS v26, with chi-square tests and logistic regression applied to identify significant risk factors (p < 0.05). Results: Out of 300 participants (mean age: 68.4 ± 5.9 years; 55.3% female), 92 patients (30.7%) screened positive for depression. Significant risk factors included living alone (OR = 2.1, 95% CI: 1.2–3.7, p = 0.006), presence of chronic illness (OR = 2.6, 95% CI: 1.5–4.5, p = 0.001), reduced physical activity (OR = 1.9, 95% CI: 1.1–3.3, p = 0.021), and cognitive impairment (OR = 2.3, 95% CI: 1.3–4.0, p = 0.004). No significant association was found with gender or educational status. Conclusion: Nearly one-third of elderly patients attending general practice clinics showed signs of depression. Risk factors such as social isolation, comorbidities, sedentary lifestyle, and cognitive decline were significantly associated. Regular mental health screening in primary care and targeted interventions may enhance early detection and management of geriatric depression.
Research Article
Open Access
Assessment of Antibiotic Prescribing Patterns for Upper Respiratory Tract Infections in General Practice Clinics
Shubham Mehulkumar Gohil,
Utsav Dharmendrabhai Patel,
Agam Kamaleshkumar Patel
Pages 71 - 74

View PDF
Abstract
Background: Upper respiratory tract infections (URTIs) are one of the most frequent reasons for patient visits in general practice, yet they are predominantly viral in origin and self-limiting. Despite this, antibiotics are often overprescribed, contributing to antimicrobial resistance (AMR). This study aimed to evaluate the patterns of antibiotic prescribing for URTIs in general practice clinics and assess their appropriateness in relation to clinical guidelines. Materials and Methods: Medical records of 500 patients diagnosed with URTIs were randomly selected and reviewed. Data collected included patient demographics, diagnosis, symptoms, antibiotic prescribed (if any), and adherence to standard treatment guidelines. Appropriateness of antibiotic use was assessed using national clinical protocols. Statistical analysis was performed using SPSS v26, with frequencies, percentages, and chi-square tests applied for categorical variables. Results: Among the 500 URTI cases analyzed (mean age 34.2 ± 12.8 years; 56% female), antibiotics were prescribed in 325 cases (65%). Of these, only 102 prescriptions (31.4%) were deemed appropriate according to clinical guidelines. The most commonly prescribed antibiotics were amoxicillin (41.5%), azithromycin (26.8%), and cefixime (12.3%). Fever and purulent nasal discharge were significantly associated with higher rates of antibiotic prescriptions (p < 0.05). Prescribing rates were higher in patients aged 18–40 years (72%) compared to older adults (52%). Conclusion: The study highlights a high rate of inappropriate antibiotic prescribing for URTIs in general practice, emphasizing the urgent need for antimicrobial stewardship and guideline-based interventions. Educating physicians and monitoring prescriptions could significantly reduce misuse and combat rising antibiotic resistance.
Research Article
Open Access
Semen Quality in Hilly and Plain Regions of India: A Comparative Study
Deepika Pandey,
Mohit Pradhan,
Afreen Fatima
Pages 66 - 70

View PDF
Abstract
Background: Geographic and environmental factors such as altitude and ambient hypoxia may influence male reproductive health by altering semen quality. This study aimed to compare semen parameters between healthy adult males residing in hilly and plain regions of India. Materials and Methods: A cross-sectional observational study was conducted over a 12-month period involving 200 men aged 21–45 years, with 100 participants each from hilly regions (≥1,500 meters above sea level) and plain regions (≤300 meters). Participants were recruited from two tertiary care centers. Semen samples were collected after 2–7 days of abstinence and analyzed as per World Health Organization (WHO) 2021 guidelines. Seminal parameters including volume, concentration, total count, motility, vitality, and morphology were compared between the groups. Statistical analysis was performed using SPSS v26.0, with p < 0.05 considered significant. Results: Baseline characteristics such as age, BMI, socioeconomic status, and duration of residence were comparable between the two groups. Semen volume, sperm concentration, total sperm count, progressive motility, vitality, and normal morphology were all significantly lower in the hilly region group compared to the plain region group (p < 0.05 for all). Seminal pH showed no significant difference. Conclusion: Men residing in hilly regions demonstrated significantly poorer semen quality across multiple parameters compared to those in plains, suggesting a potential negative impact of chronic high-altitude exposure—possibly due to hypobaric hypoxia and oxidative stress. These findings highlight the influence of environmental altitude on male fertility and warrant further investigation with biomarker and longitudinal studies.
Research Article
Open Access
Evaluation of Telemedicine Utilization and Patient Satisfaction in Primary Care During Post-COVID Era
Poojankumar Vishnubhai Patel,
Jaykishan Jitendrabhai Patel,
Arth Chandrashekhar Patel
Pages 62 - 65

View PDF
Abstract
Background: The COVID-19 pandemic significantly accelerated the adoption of telemedicine in primary care. While this shift addressed immediate healthcare access challenges, the post-COVID era demands a thorough evaluation of telemedicine's continued use and its acceptance by patients. Understanding patient satisfaction and utilization patterns is essential to integrate telemedicine effectively in routine primary care services. Materials and Methods: A cross-sectional descriptive study was conducted in five primary healthcare centers across urban and semi-urban areas. A total of 400 adult patients who had at least one telemedicine consultation during the study period were recruited using stratified random sampling. Data were collected through a structured questionnaire assessing telemedicine usage frequency, satisfaction levels (via a 5-point Likert scale), ease of access, communication quality, and perceived effectiveness. Statistical analysis was done using SPSS v26, with results expressed as mean ± SD, frequency, and percentage. Chi-square test was applied to determine associations between demographic factors and satisfaction scores (p < 0.05 considered significant). Results: Out of 400 participants (mean age: 42.3 ± 13.5 years; 58% female), 72.5% reported using telemedicine services more than once in the last year. Overall satisfaction was high, with 78.2% rating their experience as either "good" or "very good." Ease of scheduling (82.3%), time saved (88.6%), and quality of physician interaction (74.5%) were the most positively rated domains. However, 21.4% expressed concerns regarding lack of physical examination and diagnostic limitations. A significant association was observed between higher satisfaction and age group 30–50 years (p = 0.03). Conclusion: Telemedicine continues to be a well-utilized and highly satisfactory mode of healthcare delivery in the post-COVID era, particularly valued for its convenience and efficiency. Further integration with in-person services may optimize its utility and address remaining patient concerns.
Research Article
Open Access
A Cross-Sectional Study on Self-Medication Practices Among Medical Students
Poojankumar Vishnubhai Patel,
Jaykishan Jitendrabhai Patel,
Arth Chandrashekhar Patel
Pages 58 - 61

View PDF
Abstract
Background: Self-medication is a common practice among medical students due to their increased access to drug-related knowledge and medications. While it may provide quick relief for minor illnesses, inappropriate self-medication can lead to adverse drug reactions, masking of severe conditions, and development of antimicrobial resistance. Materials and Methods: A descriptive cross-sectional study was conducted among 350 undergraduate medical students from a tertiary medical college over a period of three months. Participants were selected using stratified random sampling across all years of study. Data were collected through a pre-tested, structured questionnaire covering demographic details, frequency and reasons for self-medication, types of drugs used, and sources of drug information. Results: Out of 350 participants, 276 (78.9%) reported practicing self-medication in the past six months. The most common ailments for which students self-medicated included headache (64.1%), common cold (48.5%), and gastrointestinal issues (31.2%). The most frequently used drug classes were analgesics (72.8%), antipyretics (65.6%), and antibiotics (29.7%). The primary reasons for self-medication were perceived minor illness (63.4%) and prior experience with similar symptoms (52.1%). Senior students (4th and final year) showed significantly higher self-medication rates than junior students (p < 0.05). Conclusion: Self-medication is highly prevalent among medical students, especially for minor ailments. While medical knowledge contributes to this behavior, it also underscores the need for better regulatory education and awareness about the risks associated with unsupervised medication use.
Research Article
Open Access
Evaluating Triglyceride-Glucose Index in predicting Renal Function in Population
Jagdeep Singh,
Nabeel Ahmed Hashmi,
Rohit Dubepuria
Pages 54 - 57

View PDF
Abstract
Background: The triglyceride-glucose (TyG) index, a surrogate marker for insulin resistance, has gained attention as a potential indicator of metabolic and cardiovascular risk. Recent evidence suggests a possible association between TyG index and renal function decline. This study aimed to evaluate the predictive value of the TyG index in assessing renal function in a general population cohort. Materials and Methods: A cross-sectional analytical study was conducted on 200 adults aged 30–65 years attending a tertiary care hospital. Participants were categorized based on estimated glomerular filtration rate (eGFR) into normal renal function (eGFR ≥90 mL/min/1.73 m²), mildly decreased (60–89), and moderately to severely decreased (<60). Fasting blood samples were analyzed for triglycerides and glucose to calculate the TyG index using the formula: ln [fasting TG (mg/dL) × fasting glucose (mg/dL)/2]. eGFR was estimated using the CKD-EPI formula. Statistical analysis included Pearson correlation and multivariate regression to evaluate the association between TyG index and eGFR. Results: Among 200 participants (mean age: 48.6 ± 9.1 years; 54% male), the mean TyG index was 8.82 ± 0.51. A significant inverse correlation was observed between TyG index and eGFR (r = -0.43, p < 0.001). Multivariate analysis adjusted for age, sex, BMI, and hypertension showed that a higher TyG index was independently associated with reduced eGFR (β = -2.12, p = 0.004). Participants in the highest TyG tertile had a 3.1-fold increased risk of moderate-to-severe renal dysfunction compared to the lowest tertile. Conclusion: The TyG index is significantly associated with renal function decline and may serve as a simple, cost-effective marker for early detection of individuals at risk for chronic kidney disease. Its integration into routine metabolic screening may enhance preventive strategies.
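The TyG formula quoted in the methods is straightforward to implement. The sketch below codes it directly; the example input values (TG 150 mg/dL, glucose 100 mg/dL) are arbitrary and simply chosen to land near the cohort mean TyG of 8.82. The CKD-EPI eGFR step is not reproduced here.

```python
import math

def tyg_index(fasting_tg_mg_dl: float, fasting_glucose_mg_dl: float) -> float:
    """TyG index as defined in the abstract: ln[TG (mg/dL) x glucose (mg/dL) / 2]."""
    return math.log(fasting_tg_mg_dl * fasting_glucose_mg_dl / 2)

# Example: TG 150 mg/dL and glucose 100 mg/dL give a TyG index of ~8.92,
# close to the cohort mean of 8.82 reported in the abstract.
print(round(tyg_index(150, 100), 2))
```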
Research Article
Open Access
Study Of Different Types of Respiratory Abnormalities in Stable Chronic Heart Failure Patients
Abhishek Kumar Verma,
R.K Nath,
Ajay Kumar Sharma,
Puneet Aggarwal
Pages 47 - 53

View PDF
Abstract
Introduction: In chronic heart failure, respiratory abnormality can result from pulmonary congestion or from coexisting chronic obstructive pulmonary disease. We aimed to study the different types of respiratory abnormalities in chronic heart failure patients. Methods: Patients aged >18 years with LVEF <40% were included in the study. Symptoms, smoking status, comorbidities, and treatment history were recorded for all patients. Patients with chronic kidney disease stage IV/V, decompensation of heart failure within the previous 6 months, NYHA class IV, or a prior history of lung disorders on treatment were excluded from the study. Patients underwent detailed clinical examination and routine blood investigations. Symptoms of dyspnea, palpitation or fatigue were recorded as per the NYHA functional classification. All patients underwent electrocardiography, transthoracic 2D echocardiography and pulmonary function testing (PFT). Results: One hundred patients were studied, 65% male and 35% female. Mean left ventricular ejection fraction (LVEF) was 32.89 (±3.75)%. PFT revealed respiratory abnormality in 58 (58%) subjects, with 42 (42%) having normal spirometry. A restrictive pattern was seen in 42 (42%), obstructive in 6 (6%) and mixed pattern in 10 (10%) subjects. In NYHA III, the percentages of patients with obstructive (11.1%) and mixed (37.1%) patterns were greater than in NYHA I or II (p < 0.001). In NYHA III, 48.1% of patients had severe or moderately severe respiratory abnormality, compared with none in NYHA I or II (p < 0.001). There was a positive and significant correlation between LVEF and FEV1 (Pearson's coefficient 0.253, p = 0.01) and between LVEF and FVC (Pearson's coefficient 0.209, p = 0.03). Conclusion: Both obstructive and restrictive forms of respiratory abnormality are common in patients with chronic heart failure. As the functional class of dyspnea progresses, the severity of respiratory abnormality worsens, with the respiratory pattern shifting from normal to restrictive and further to obstructive and mixed patterns.
Research Article
Open Access
Bridging the Diagnostic Gap: A Retrospective Study on Missed and late Diagnoses of Congenital Heart Disease in Neonates and Children
Dr. Harish Jadhav,
Dr. Prashant Weekey,
Dr. Kamran Dalwai,
Dr. Vedashree Deshpande
Pages 41 - 46

View PDF
Abstract
Introduction: Congenital heart disease (CHD) is the most prevalent congenital defect in children, and prompt treatment and better outcomes depend on early detection. However, many cases are diagnosed late, especially in resource-limited settings. Aims: To evaluate the clinical profiles and determine the risk factors associated with delayed detection of congenital heart disease in children attending a tertiary cardiac care center. Materials & Methods: This was a hospital-based cross-sectional observational study conducted at a tertiary cardiac care center. A total of 1991 pediatric patients diagnosed with CHD were enrolled. Data regarding age at diagnosis, clinical presentation, family history, birth term, consanguinity, and other relevant demographic and clinical factors were collected and analyzed. Patients were categorized into delayed and non-delayed diagnosis groups based on the age at diagnosis, and comparisons were made to identify associated risk factors. Results: Acyanotic CHD was found in 396 cases (56.1%) and cyanotic CHD in 310 cases (43.9%) of the 706 patients with delayed diagnosis. Only 192 cases (14.9%) of the 1285 individuals without delayed diagnosis had cyanotic CHD, whereas the majority (1093 cases; 85.1%) had acyanotic CHD. There was a statistically significant association between the type of CHD and delayed diagnosis (p < 0.00001). Conclusion: This study leads us to conclude that the diagnostic gap in children's congenital heart disease (CHD) is influenced by age, gender, parental engagement, religion, geography, and type of CHD.
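Because the abstract gives the full 2×2 breakdown of CHD type by diagnostic delay, the reported significance test can be reproduced directly, as in the sketch below (Python, scipy).

```python
# Chi-square test of CHD type (acyanotic vs. cyanotic) against delayed vs.
# non-delayed diagnosis, using the counts given in the abstract.
from scipy.stats import chi2_contingency

#                acyanotic  cyanotic
table = [[396, 310],     # delayed diagnosis (n = 706)
         [1093, 192]]    # non-delayed diagnosis (n = 1285)

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p:.2e}")  # p far below 0.00001, as reported
```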
Research Article
Open Access
Evaluation of Ultrasound Appearance of the Deltoid Muscle to Predict Type 2 Diabetes Mellitus as a Potential Non-Invasive Screening Test
Alka Agrawal,
Amit Shankhwar,
Mohd Faraz khan,
Girish Parashar,
Komal Dhayal,
Shubham Bilgaiyan
Pages 35 - 40

View PDF
Abstract
Background: The prevalence of type 2 diabetes mellitus (T2DM) is projected to rise, with a significant portion of cases remaining undiagnosed. Early diagnosis is vital to reduce complications and healthcare costs. Concurrently, musculoskeletal ultrasound (US) is a widely used, non-invasive, and cost-effective imaging modality. The shoulder deltoid muscle has been reported to appear unusually bright, or hyperechogenic, on US in patients with T2DM and prediabetes, a pattern distinct from that seen in obese but non-diabetic individuals. Objectives: To evaluate the ultrasound appearance of the deltoid muscle as a potential non-invasive screening test for type 2 diabetes mellitus. Method: This was a hospital-based, time-bound, cross-sectional observational study conducted in the Department of Radiodiagnosis, M.G.M. Medical College & M.Y. Hospital, Indore, Madhya Pradesh, India. A total of 200 patients referred to the Department of Radiodiagnosis for ultrasound evaluation of the deltoid muscle were studied. Results: In this study, obese patients with type II diabetes mellitus had a mean age of 60.17 years, BMI of 38.1 kg/m², deltoid-to-humeral cortex ratio of 0.6, and HbA1c of 7.7%. The deltoid muscle appeared hyperechoic on ultrasound. The p-value (<0.002) indicated a statistically significant correlation. Non-obese diabetics (mean age 65 years) showed similar hyperechoic findings, with a mean BMI of 25 kg/m², ratio of 0.54, and HbA1c of 6.7% (p < 0.001). In contrast, both non-diabetic groups displayed hypoechoic muscles with lower ratios, BMI, and HbA1c values. These findings confirm ultrasound's reliability in detecting echogenic changes linked to diabetes. Conclusion: This could prove especially beneficial for screening in underserved and underrepresented communities, as well as in developing countries. Earlier diagnosis and therefore earlier treatment may prevent or reduce the devastating complications of T2DM and help mitigate a portion of the enormous disease-associated healthcare economic burden.
Research Article
Open Access
Comparative Evaluation of Formalin and Alcohol-Based Fixatives on Tissue Morphology and Immunostaining in Routine Histopathology
Mayur Ramabhai Chaudhari,
Manthan Purohit,
Keyurkumar Gulabbhai Patel
Pages 31 - 34

View PDF
Abstract
Background: Fixation plays a pivotal role in preserving tissue morphology and antigenicity for routine histopathological examination and immunohistochemical (IHC) studies. Formalin, a widely used fixative, offers excellent morphological preservation but may reduce antigen retrieval efficacy. Alcohol-based fixatives have been proposed as alternatives, potentially improving IHC staining quality while preserving morphology. This study aims to compare the effects of formalin and alcohol-based fixatives on tissue architecture and immunostaining characteristics. Materials and Methods: In this comparative study, 60 tissue samples (30 each from liver and lymph node biopsies) were collected and divided equally for fixation in 10% neutral buffered formalin (NBF) and alcohol-based fixative (70% ethanol-methanol-acetic acid mixture). Following standard tissue processing, hematoxylin and eosin (H&E) staining was performed to assess morphological integrity. Immunohistochemistry was conducted using cytokeratin and CD3 markers. Morphological parameters such as nuclear detail, cytoplasmic clarity, and tissue shrinkage were scored on a 0–3 scale. IHC staining intensity and background staining were evaluated semi-quantitatively. Results: Formalin-fixed tissues demonstrated superior nuclear and cytoplasmic detail (mean score: 2.7 ± 0.3) compared to alcohol-fixed tissues (mean score: 2.3 ± 0.4). However, alcohol-based fixatives showed enhanced antigen preservation, with stronger IHC staining intensity for cytokeratin (3+ in 86.6% vs. 2+ in 63.3%) and CD3 (3+ in 83.3% vs. 2+ in 66.6%) compared to formalin-fixed counterparts (p < 0.05). Background staining was more prominent in formalin-fixed sections. Conclusion: While formalin remains the gold standard for morphological preservation, alcohol-based fixatives provide superior antigenicity for IHC applications. A balanced fixative approach or dual-fixation protocols may optimize both histological and immunohistochemical outcomes in routine diagnostics.
Research Article
Open Access
Comparative Effects of High-Intensity Interval Training and Moderate-Intensity Continuous Training on Cardiovascular Function in Sedentary Adults
Ankit Kumar Dinesh Bhai Jag Aniya,
Krutika Ramesh bhai Ninama,
Nirav Maheshbhai Patel,
Kirtan Bharat bhai Patel
Pages 26 - 30

View PDF
Abstract
Background: Sedentary behavior significantly contributes to cardiovascular risk. While both High-Intensity Interval Training (HIIT) and Moderate-Intensity Continuous Training (MICT) are recommended for cardiovascular health, comparative evidence on their efficacy in sedentary adults remains limited. This study aimed to compare the effects of HIIT and MICT on cardiovascular function in sedentary individuals over an 8-week training program. Materials and Methods: A randomized controlled trial was conducted involving 60 sedentary adults (aged 25–45 years), equally divided into HIIT (n=30) and MICT (n=30) groups. The HIIT group underwent 4 × 4-minute intervals at 85–95% of maximum heart rate (HRmax) interspersed with 3 minutes of active recovery, three times a week. The MICT group performed continuous exercise at 60–70% HRmax for 45 minutes, three times a week. Cardiovascular parameters including resting heart rate (RHR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and VO₂max were measured at baseline and post-intervention. Results: Both groups showed significant improvements in cardiovascular metrics post-intervention (p < 0.05). The HIIT group demonstrated a greater reduction in RHR (−9.6 ± 2.1 bpm vs. −5.3 ± 1.8 bpm), SBP (−11.4 ± 3.2 mmHg vs. −6.7 ± 2.9 mmHg), and DBP (−7.2 ± 2.5 mmHg vs. −4.1 ± 2.3 mmHg) compared to the MICT group. VO₂max increased significantly in both groups, with a more pronounced increase in the HIIT group (5.1 ± 1.3 mL/kg/min vs. 3.2 ± 1.0 mL/kg/min, p = 0.001). Conclusion: Both HIIT and MICT improved cardiovascular function in sedentary adults, but HIIT was more effective in enhancing VO₂max and reducing blood pressure and resting heart rate. HIIT may serve as a time-efficient alternative to traditional aerobic training for cardiovascular health in sedentary populations.
Research Article
Open Access
Impact of Resistance Versus Aerobic Exercise on Inflammatory Biomarkers in Patients with Type 2 Diabetes Mellitus
Janvi Bhanji bhai Panchotiya,
Piyush Kumar Harsukhlal Kaneriya,
Vishvesh Kirit bhai Lakhani,
Darshil Rajesh bhai Korat
Pages 22 - 25

View PDF
Abstract
Background: Type 2 Diabetes Mellitus (T2DM) is associated with chronic low-grade inflammation, contributing to disease progression and complications. Exercise, particularly resistance and aerobic training, has been shown to modulate inflammatory markers. This study aimed to compare the impact of resistance versus aerobic exercise on selected inflammatory biomarkers in individuals with T2DM. Materials and Methods: A randomized controlled trial was conducted involving 60 participants diagnosed with T2DM aged between 40–60 years. Participants were randomly assigned into two groups: Resistance Exercise Group (REG, n=30) and Aerobic Exercise Group (AEG, n=30). The intervention lasted 12 weeks, with sessions thrice weekly. Serum levels of C-reactive protein (CRP), interleukin-6 (IL-6), and tumor necrosis factor-alpha (TNF-α) were measured at baseline and post-intervention using ELISA kits. Statistical analysis was performed using paired and unpaired t-tests; a p-value <0.05 was considered statistically significant. Results: Post-intervention, both groups showed significant reductions in inflammatory markers. In the REG, CRP decreased from 5.8 ± 1.2 mg/L to 3.4 ± 1.0 mg/L, IL-6 from 8.2 ± 1.5 pg/mL to 5.1 ± 1.1 pg/mL, and TNF-α from 6.9 ± 1.3 pg/mL to 4.6 ± 1.0 pg/mL. In the AEG, CRP decreased from 5.6 ± 1.3 mg/L to 4.0 ± 1.1 mg/L, IL-6 from 8.0 ± 1.4 pg/mL to 6.0 ± 1.2 pg/mL, and TNF-α from 6.8 ± 1.4 pg/mL to 5.2 ± 1.1 pg/mL. The REG demonstrated a more significant reduction in IL-6 and TNF-α levels compared to AEG (p<0.05), whereas CRP reduction was similar in both groups (p>0.05). Conclusion: Both resistance and aerobic exercise significantly reduced inflammatory markers in patients with T2DM. However, resistance training yielded a more pronounced improvement in IL-6 and TNF-α levels, suggesting it may be more effective in attenuating systemic inflammation in this population.
Research Article
Open Access
Effectiveness of Digital Cognitive Behavioral Therapy in Managing Adolescent Depression: A Randomized Controlled Trial
Swati Kumari,
Meeta Devaji Thakor,
Shubham Ajaybhai Chauhan
Pages 18 - 21

View PDF
Abstract
Background: Adolescent depression is a growing global concern, often underdiagnosed and undertreated. Cognitive Behavioral Therapy (CBT) is a well-established psychological intervention; however, its digital adaptation—Digital Cognitive Behavioral Therapy (dCBT)—offers the potential for greater accessibility and engagement. This study aimed to evaluate the effectiveness of dCBT in reducing depressive symptoms among adolescents compared to standard care. Materials and Methods: A randomized controlled trial was conducted involving 120 adolescents aged 13–18 years diagnosed with moderate to severe depression using the PHQ-9 scale. Participants were randomly allocated into two groups: the intervention group (n = 60) received an 8-week dCBT program via a mobile application, while the control group (n = 60) received standard school counseling. Depression severity was assessed at baseline and post-intervention using PHQ-9 and Beck Depression Inventory-II (BDI-II). Secondary outcomes included changes in anxiety (GAD-7) and quality of life (KIDSCREEN-27). Data were analyzed using paired and unpaired t-tests, with a significance level set at p < 0.05. Results: The intervention group showed a statistically significant reduction in PHQ-9 scores from a mean of 15.2 ± 2.3 at baseline to 7.8 ± 1.9 post-intervention (p < 0.001), whereas the control group showed a smaller reduction from 14.9 ± 2.6 to 12.4 ± 2.1 (p = 0.04). Similarly, BDI-II scores in the dCBT group dropped from 27.1 ± 4.8 to 13.5 ± 3.2. Improvement in anxiety and quality of life scores was also significantly greater in the intervention group compared to controls (p < 0.01). Conclusion: Digital CBT is a highly effective intervention for reducing depressive symptoms in adolescents, outperforming standard counseling in both primary and secondary outcomes. Its scalability and accessibility make it a promising tool for broader implementation in school and community mental health settings.
Research Article
Open Access
To Study the Role of Ultrasound Elastography in Characterization of Breast Lesions in A Tertiary Care Centre
Dr. Alka Agrawal,
Dr. Rakesh Vijayvargiya,
Dr Poonam Joshi,
Dr. Sonia Moses,
Dr. G. Sudha Chandana,
Dr. N. Anil Kumar Reddy
Pages 9 - 17

View PDF
Abstract
Background: Breast cancer remains a leading cause of cancer-related mortality among women globally. Early and accurate detection through imaging plays a crucial role in improving prognosis and reducing mortality. Objective: To assess the spectrum of breast pathologies using sonography and strain elastography, classify lesions using the BI-RADS system, and compare the diagnostic performance of sono-elastography with histopathology. Methods: In this cross-sectional study conducted at a tertiary care hospital in India, 77 female patients presenting with palpable breast lumps underwent B-mode grayscale ultrasonography, color Doppler, and real-time strain elastography. Lesions were categorized according to BI-RADS criteria and subsequently correlated with histopathological findings. Data analysis was performed using SPSS software. Results: The mean age of participants was 41.4 ± 14 years. Breast lumps were the most common presenting symptom (71%), predominantly located in the upper outer quadrant (49%) with nearly equal distribution between breasts. Ultrasound BI-RADS classified 43 (55.8%) lesions as malignant; histopathology confirmed 40 of these as malignant and 3 as benign. Strain elastography indicated benign features in 31 cases (elastography score 1–3, strain ratio <3.1, E/B ratio <1), with histopathology confirming 30 benign and 1 malignant lesion. Among 46 lesions with malignant elastography scores (4–5, strain ratio >3.1, E/B ratio >1), 44 were malignant and 2 benign on histology. Conclusion: Strain elastography enhances the diagnostic accuracy of breast ultrasound, with a sensitivity of 95.6%, specificity of 96.7%, and overall accuracy of 96.1%. This surpasses conventional ultrasound, which demonstrated a sensitivity of 93.02%, specificity of 85.2%, and accuracy of 89.6%. Integrating strain elastography with B-mode ultrasonography improves early detection of malignancy and may reduce the need for unnecessary biopsies.
Research Article
Open Access
Primary Varicose Veins of Lower Limbs with Special Reference to Its Management – A Hospital Based Study
Firoz Ahmed,
Dipak Choudhury,
Devid Hazarika,
Ranjan Chandra Baruah
Pages 1 - 8

View PDF
Abstract
Background: Primary varicose veins are a common chronic vascular issue that often requires surgery, but there’s still a lack of epidemiological and management data from well-resourced settings. Methods: We carried out a prospective observational study at Assam Medical College & Hospital from June 2020 to May 2021, involving 40 patients diagnosed with primary varicose veins. We analyzed demographic, clinical, and duplex ultrasonography data. The surgical procedures included saphenofemoral flush junction ligation (SFJL) with or without great saphenous vein (GSV) stripping, perforator ligation (PL), and saphenopopliteal junction ligation (SPJL). We looked at outcomes such as recurrence rates, complications, and the length of hospital stays. Results: The majority of patients were male (80%, with a male-to-female ratio of 4:1), and most were aged between 31 and 40 years (30%). Right limb involvement was quite common (47.5%). The main symptoms reported were dilated veins (30%) and leg pain (20%). Duplex ultrasonography showed that 60% of patients had combined superficial-perforator incompetence, with GSV involvement in 57.5%. Notably, SFJL combined with GSV stripping and PL (which accounted for 42.5% of the procedures) significantly lowered recurrence rates compared to non-stripping methods (6% vs. 75%; p<0.05). Complications were observed in 32.5% of cases (with delayed healing in 38.4%), but these were not linked to the stripping procedure (p=0.44). The average hospital stay was 12.7 days (SD 3.8), and this was not influenced by the extent of stripping (p=0.81). Conclusion: GSV stripping during SFJL significantly decreases the chances of recurrence without extending hospital stays or increasing complications. Using duplex mapping is essential for accurate surgical planning, especially in cases with complex hemodynamic presentations