Clinical and Preventive Nutritional Science
Student and Alumni Research Abstracts

Graduates of the Program have written theses on a variety of topics; some have been published or are in the process of publication.


Grocery Shopping Habits and Interest in Fruit and Vegetable Related Point-of-Purchase Educational Interventions: A Needs Assessment

Ashley Cully, 2014

Background: Supermarket point-of-purchase (POP) nutrition education may lead to increased fruit and vegetable consumption. Objective: To determine current fruit and vegetable purchasing practices among Marlton, New Jersey ShopRite shoppers, their interest in POP interventions about fruits and vegetables, and the association between when (time of day and day of the week) shoppers shop and their interest in POP interventions. Design/Subjects: A descriptive, internet-based, electronic cross-sectional survey using a convenience sample. Statistics: Descriptive statistics and frequency distributions were reported for demographics, fruit and vegetable purchasing habits, and interest in POP interventions. Pearson chi-square and Fisher's Exact tests were used to explore the relationship between when shoppers shopped and their interest in POP interventions. Results: The study sample (n=69) had a mean age of 48.5 years; 39.6% were college graduates; 36.6% had annual household incomes of > $100,000. The sample was mostly female (89.1%) and white (63.8%), with a mean of 2.8 people per household. The types of POP interventions shoppers were most interested in were "demonstration tables" (n=44, 91.7%), "individual counseling with the Registered Dietitian" (n=39, 83.0%) and "informational signs throughout the store" (n=32, 66.7%). The most common times shoppers shopped were in the afternoon (n=30, 46.2%) and on the weekends (n=31, 49.2%), particularly Sundays (n=18, 28.6%). No statistical relationship was found between when shoppers shopped and the types of POP interventions they were interested in. Conclusions: While no significant relationship was found between when shoppers shopped and the type of POP interventions they were interested in, this study indicates that shoppers are interested in POP interventions regardless of when they shop.
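For illustration, the Pearson chi-square test of independence described in the abstract can be sketched as follows; the function name and contingency counts are hypothetical, not the study's data.

```python
# Hypothetical sketch of a Pearson chi-square test of independence for a
# 2x2 table (e.g., weekday vs. weekend shoppers by interest in a POP
# intervention). Counts are illustrative only.

def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Illustrative counts: rows = weekday/weekend, columns = interested/not.
observed = [[20, 10], [25, 14]]
stat = chi_square_2x2(observed)
```

Fisher's Exact test, also used in this study, replaces the chi-square approximation when expected cell counts are small.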


The Relationship Between Nutritional Status and Clinical Outcomes in Critically Ill Children

Lori Bechard, 2014

Critically ill children in pediatric intensive care units (PICU) are at risk for poor health outcomes, including hospital-acquired infections, prolonged hospitalizations, and mortality. Extremes in nutritional status during childhood are associated with morbidity and mortality. Body mass index Z-score (BMI Z-score), based on World Health Organization reference data, is a feasible assessment indicator of childhood nutritional status in a variety of settings. The purpose of this study was to determine the impact of nutritional status on selected clinical outcomes in an international population of mechanically ventilated children in PICUs. In the combined cohort (N=1622) of two data collection efforts in 2009 and 2011, 17.9% of subjects were underweight (BMI Z score < -2), 54.2% were normal weight (BMI Z score > -2 and <1), 14.5% were overweight (BMI Z score > 1 and < 2), and 13.4% were obese (BMI Z score > 2). Mortality, hospital-acquired infections, hospital length of stay, and ventilator-free days (VFD) were evaluated using multivariate analysis techniques. Patient age, PICU size and location, admission type, and diagnosis were significantly associated with nutritional status category and used as covariates in the analyses. Compared to normal weight children, the odds ratio for mortality was significantly higher in underweight (OR 1.64, p=.001), overweight (OR 1.67, p=.01), and obese children (OR 1.72, p=.02), and the odds ratio for hospital-acquired infections was significantly higher in underweight (OR 1.79, p=.008), overweight (OR 1.34, p=.02), and obese children (OR 1.50, p=.01). Hazard ratios for hospital discharge were significantly lower among underweight (HR 0.71, p<.001) and obese (HR 0.81, p=.02) children compared to normal weight children. The odds of achieving one or more VFD were significantly lower in underweight (OR 0.61, p<.001), overweight (OR 0.80, p<.001), and obese children (OR 0.51, p<.001) compared to normal weight children. 
Underweight was associated with 1.1 (p<.001), 1.4 (p<.001) and 0.78 (p=.03) fewer VFD than normal weight, overweight, and obese children, respectively. The nutritional status of critically ill children has a significant impact on morbidity and mortality. Longitudinal investigations of the relationships between nutritional status and clinical outcomes during PICU admissions are warranted.
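The BMI Z-score categories used above can be expressed as a small classifier; exact handling of the boundary values -2, 1, and 2 is an assumption here, since the abstract's strict inequalities leave the cutpoints themselves ambiguous.

```python
# Minimal sketch of the WHO-based BMI Z-score categories described above:
# underweight < -2, normal -2 to 1, overweight 1 to 2, obese > 2.
# Boundary handling at the cutpoints is an assumption.

def bmi_z_category(z):
    """Map a BMI Z-score to the nutritional status categories above."""
    if z < -2:
        return "underweight"
    if z <= 1:
        return "normal weight"
    if z <= 2:
        return "overweight"
    return "obese"
```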


Amyotrophic Lateral Sclerosis: The relationship between percentage of body weight change and the Amyotrophic Lateral Sclerosis Functional Rating Scale - Revised. A Pilot Study.

Daniel Greenwood, 2014

Objective: To explore the relationship between percentage of body weight change and functional decline using the Amyotrophic Lateral Sclerosis Functional Rating Scale – Revised (ALSFRS-R) in veterans with Amyotrophic Lateral Sclerosis (ALS) from baseline through nine months of follow-up. Problem Statement: ALS is a progressive neurological disease with a high risk of malnutrition. Weight loss is associated with shorter survival. Emerging research indicates that patients with ALS who are obese have longer survival and a slower decline in ALSFRS-R score. This study explored the relationship between the percentage of body weight change and the ALSFRS-R total score change from initial visit to three, six, and nine-month follow-up appointments in veterans who initiated ALS specialty care at the Minneapolis Veterans Affairs Medical Center between January 1, 2010 and October 1, 2012. Methods: A retrospective electronic medical record review of adult veterans with ALS was performed at the Minneapolis Veterans Affairs Medical Center. One hundred and eight veterans' records were reviewed; 68 veterans (63.0%) met inclusion criteria, having initiated care between January 1, 2010 and October 1, 2012 with a diagnosis of ALS. Data Analysis: A Pearson's Product Moment Correlation was utilized to explore the relationships between percentage of body weight change and ALSFRS-R total score change from initial visit to three, six, and nine months. Significance & Conclusion: Patients were predominantly male (n=65, 95.6%) with a mean age of 68.2 years. Over the nine months of study, 38.2% (n=26) of patients had a feeding tube placed and 30.2% of patients (n=19) expired. The mean body weight of the population increased from initial visit (n=68, mean=181.9 pounds) to nine months (n=26, mean=190.7 pounds) and the mean body mass index (BMI) increased from initial visit (n=67, mean=26.2 kg/m2) to nine months (n=26, mean=27.4 kg/m2).
However, this change was largely reflective of a greater proportion of under and normal weight patients expiring as the mean percentage of body weight change from baseline to three, six, and nine months was -2.5%, -2.6%, and -2.9%, respectively. No deaths occurred in obese patients (n=11), while 66.7% of deaths (11 of 19) occurred in patients with an initial BMI in the underweight or normal weight categories. The mean ALSFRS-R total score and ALSFRS-R subcomponent scores declined from initial visit to three, six, and nine months. There was no significant relationship between percentage of body weight change and change in the ALSFRS-R total score from baseline to three months (p=0.31), six months (p=0.36), and nine months (p=0.25) among surviving patients in this study. Future research should investigate the mechanisms of survival advantage with obesity in patients with ALS.


Change in Knowledge and Behaviors among Registered Dietitians Following a Tailored Educational Intervention in Evidence-Based Practice: A Randomized Controlled Trial.

E Annelie M. Vogt, 2014

Background: Implementation of evidence-based practice (EBP) is essential for patient safety, quality of care, cost, and reimbursement of services. Objective: To measure the impact of an EBP intervention among registered dietitians (RDs). Design: Prospective randomized controlled trial with concealed group allocation. Population/Setting: An intervention (EBP-ED) (n=22) and a comparison (WAIT) group (n=35). Intervention: Tailored web-based interactive modules with online support. Main Outcome Measures: Change in knowledge and clinical practice behaviors of EBP. Statistical Analyses: Pearson's chi-square, independent t-test, repeated measures analyses of variance and covariance. Results: Post-intervention, there was a significant increase in knowledge means for "interpret statistical results" in the EBP-ED group compared to the WAIT group (p=0.002). A positive correlation between current stage of change and the total clinical practice behavior score (r=0.51, p<0.0001) was established at baseline. At follow-up, a significant large relationship was found between current stage of change and the total clinical practice behavior score in the EBP-ED group (F=4.72, p=0.02, r2=0.58). Post-hoc analyses suggested significantly higher means for those who were taking steps towards implementation compared to those who were not in the EBP-ED group (mean difference=10.57 ±3.67, p=0.029). When controlling for current stage of change, a significant weak interaction was demonstrated for the total knowledge score from baseline to follow-up between the study groups (F(1,52)=12.17, p=0.001, partial eta squared η2=0.19). Marginal means estimates suggested a 0.24 higher knowledge score in the EBP-ED group. Conclusions: The ability to interpret statistical results improved significantly in the EBP-ED group post-intervention. Motivation for change is an important factor for adopting knowledge into clinical practice.
When motivation was controlled for in repeated measures analyses, a significantly higher knowledge score was demonstrated in the intervention group across time.


Impact of Providing a Combination Lipid Emulsion Compared to a Standard, Soybean Oil Lipid Emulsion in Children Receiving Parenteral Nutrition: a Systematic Review and Meta-analysis

Kristen Lawler Finn, 2014

Background: Soybean oil lipid emulsion may compromise immune function and promote hepatic damage due to its composition of long chain fatty acids, phytosterols, high proportion of ω-6 fatty acids, and low α-tocopherol levels. Combination lipid emulsions have been developed using medium chain triglyceride oil, fish oil, and/or olive oil, which provide adequate essential fatty acids, a smaller concentration of ω-6 fatty acids, and lower levels of phytosterols. The purpose of this systematic review is to determine if combination lipid emulsions have a more favorable impact on bilirubin levels, triglyceride levels, and incidence of infection compared to soybean oil lipid emulsions in children receiving parenteral nutrition. Methods: This study comprises a systematic review of published studies. Data were sufficient and homogenous to conduct meta-analysis for total bilirubin and infection. Results: Nine studies met the inclusion criteria. Meta-analysis showed that combination lipid emulsion decreased total bilirubin by a mean difference of 2.09 mg/dL (95% CI -4.42, 0.24) compared with soybean oil lipid emulsion, although the result was not statistically significant (P=0.08). Meta-analysis revealed no statistically significant difference in incidence of infection between the combination lipid emulsion and the soybean oil lipid emulsion groups (P=0.846). None of the four studies that included triglyceride as an outcome detected a significant difference in triglyceride levels between the combination lipid emulsion and soybean oil lipid emulsion groups. Conclusion: There is inadequate evidence that combination lipid emulsions offer any benefit regarding bilirubin levels, triglyceride levels, or incidence of infection compared to soybean oil lipid emulsions.
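The pooled estimate above is the kind produced by an inverse-variance meta-analysis; a minimal fixed-effect sketch follows, with invented study-level inputs and an assumed normal-approximation 95% CI.

```python
import math

# Hedged sketch of an inverse-variance, fixed-effect pooled mean
# difference with a normal-approximation 95% CI, the kind of calculation
# behind the pooled bilirubin estimate above. Inputs are invented.

def pooled_mean_difference(effects):
    """effects: list of (mean_difference, standard_error) per study."""
    weights = [1.0 / se ** 2 for _md, se in effects]
    pooled = sum(w * md for (md, _se), w in zip(effects, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Hypothetical per-study mean bilirubin differences (mg/dL) and SEs.
md, ci = pooled_mean_difference([(-2.5, 1.2), (-1.8, 0.9), (-2.0, 1.5)])
```

A random-effects model, which adds a between-study variance term to each weight, would be the usual alternative when studies are heterogeneous.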


Associations between Tooth Loss/Prostheses and Nutritional Status in Older Adults: A Systematic Review

Rena Zelig, 2014

Objectives: This study examined the associations between missing teeth, with or without prostheses, and nutritional status in community-dwelling older adults using the Mini Nutritional Assessment (MNA) as an indicator of malnutrition risk and Body Mass Index (BMI) as an indicator of weight status. Design: Systematic Review (SR) of human studies. Methodology: Medline, CINAHL and Cochrane Libraries were searched to identify articles published between January 2000 and April 2014. Studies were systematically selected using predefined eligibility criteria. Data were abstracted and synthesized in narrative and summary tables, risk-of-bias assessments were performed, and PRISMA guidelines were followed. Results: Of the 22 studies meeting inclusion criteria, five explored MNA, 15 examined BMI and one study explored both BMI and MNA. Five of eight studies that assessed MNA score identified significant associations between MNA score and tooth loss. MNA scores were significantly lower in those with fewer teeth or limited occlusion as compared to those with more teeth and/or more posterior occluding teeth pairs. In individuals with missing teeth/limited occlusion, MNA scores increased significantly following the provision of dentures. Eight of 15 studies identified significant associations between BMI and tooth loss. Individuals who were missing more teeth and/or had limited occlusion were more likely to be underweight or obese than those with more teeth and/or better occlusion. Studies differed in their design and in definitions of exposure and outcome variables. Conclusions: Of the 22 studies reviewed, 13 support associations between missing teeth, teeth replaced with prostheses, weight status and malnutrition risk; individuals with fewer teeth and poorer occlusion were at increased risk of non-normal weight status and malnutrition. Additional high quality research is warranted to better substantiate a causal relationship between dental and nutritional status.


The Relationship Between Health Related Quality of Life and Clinical Measures Among Participants of the LIFT-UP 12-Week Worksite Wellness Intervention At Rutgers University

Monica Cicchini, 2014

Background: As the number of overweight and obese individuals increases, worksite wellness programs are becoming more crucial. The goal of these programs is to achieve the national agendas to promote weight management, reduce cardiometabolic disease risk and improve health related quality of life (HRQOL), which is associated with cardiometabolic disease risk factors and overall productivity. Design/Subjects: This study was a retrospective, exploratory analysis of the LIFT UP worksite wellness intervention at Rutgers University. Of the 136 participants who attended the baseline (BL) appointment between January 1, 2013 and June 7, 2013, 56.6% (n=77) completed post-intervention follow-up (completers) within 15 weeks of the BL visit and before August 31, 2013. Statistics: Clinical measures and HRQOL were measured at BL and after the 12-week intervention. Relationships between change in weight, change in waist circumference (WC), and change in each of the CDC HRQOL-14 Core Healthy Days measures were analyzed. Results: The completers (n=77) were primarily female (89.6%, n=69). Their mean age was 48.7 years (SD=11.7). Paired t tests indicated statistically significant differences in weight (p<0.001), WC (p=0.001), BMI (p<0.001), blood glucose level (p<0.001), and systolic (p=0.001) and diastolic blood pressure (p<0.001) from BL to post-intervention follow-up. Wilcoxon signed rank tests indicated statistically significant differences in self-rated health status (p ≤ 0.001), mentally unhealthy days (p = 0.003), and summary index score (p = 0.006) from BL to post-intervention follow-up. Spearman's Rank correlation indicated that there were no statistically significant relationships between change in weight or percent weight loss and each CDC HRQOL-14 Core measure. There was also no statistically significant relationship between change in WC and each CDC HRQOL-14 Core measure.
Conclusions: Participants in a worksite wellness program for overweight employees at a large, public university in New Jersey achieved statistically significant improvement in weight, WC, blood glucose levels, systolic blood pressure, diastolic blood pressure, mentally unhealthy days, activity limitation days and a summary index score from BL to follow up. Insights from these analyses may support further use of LIFT UP as a worksite wellness program within Rutgers University.


Impact of Participating in an Introductory Nutrition Course on Dietary Habits of Undergraduate Students

Jill Englett, 2014

Background: College students strive to develop independence and autonomy, often choosing to establish habits that differ from those developed during childhood and presenting an opportunity to positively influence dietary habits. The purpose of this study was to evaluate the impact of participating in an introductory nutrition course on dietary habits of undergraduate students at a rural university in Alabama. Design/Subjects: A retrospective review of the data from two three-day food records (pre- and post-semester) collected on 96 undergraduate students participating in an introductory nutrition course at a rural university in Alabama. Statistics: Descriptive statistics were used to depict demographic characteristics of participants and to define beginning and ending intake of fruit, vegetable, and dairy servings. Wilcoxon Signed-Rank tests were used to evaluate the differences between beginning and ending intake of fruit, vegetable, and dairy servings. Mann Whitney U tests were used to examine the difference in intake of fruit, vegetable, and dairy servings between traditional and nontraditional students and between students taking the hybrid versus online class. Results: The study sample (n=96) was 88% female, with a mean age of 23.6 years. The majority (78%) of students were traditional students (n=75). Based on body mass index, 58% of the students were normal weight, while 35% were either overweight or obese. The initial and final average daily fruit, vegetable, combined fruit and vegetable, and dairy intake servings were below the national recommendations. Intake of fruit servings (p<0.001), combined fruit and vegetable servings (p=0.001) and dairy servings (p<0.010) significantly increased, while there was no significant change (p=0.294) in vegetable servings from the beginning to the end of the semester.
Conclusions: This study demonstrated that participating in a college nutrition class provides an opportunity to have a positive impact on fruit, combined fruit and vegetable and dairy intake.


Effectiveness of ArmyMOVE! Versus Traditional Nutrition Education and Counseling for Overweight Soldiers

Marybeth Salgueiro, 2011

Objective: To compare the effectiveness of the military's ArmyMOVE! and Weigh to Stay weight management programs. Design: Randomized, prospective clinical outcomes trial. Subjects: Active-duty, overweight or obese soldiers, age 19-57 years, stationed at Fort Sam Houston, TX and Fort Bliss, TX and referred for nutrition counseling between April and August 2010. Intervention: A total of 102 service members (78 men and 24 women), ages 19-57 years old, were randomized to attend either the ArmyMOVE! (AM) weight management program or the Weigh to Stay (WTS) program. Both programs included two sessions with a Registered Dietitian. The AM program utilized a facilitated, group support format and incorporated motivational interviewing and problem solving; the WTS program was delivered in a classroom format using a traditional lecture-based presentation. Main Outcome Measures: Of the original sample (n = 102), 57 subjects returned for outcome measures. Body weight, percent body fat, waist circumference, body mass index, stage of change, and fruit and vegetable intake were measured at baseline and at week 12. Statistical Analysis: Descriptive statistics, independent t-tests, Chi square and repeated measures ANOVA were used. A priori alpha was set at p ≤ 0.05. Results: No statistically significant treatment effects were observed. Within-subjects main effects were significant for decreases in weight (p<0.001), waist circumference (p = 0.002), body mass index (p < 0.001), and percent body fat (p < 0.001). No significant within-subject effects were observed for fruit and vegetable intake (p = 0.55). Conclusion: Independent of group assignment, soldiers were able to improve body composition in 12 weeks.


A Descriptive Study of the UMDNJ-SHRP Coordinated Program Alumni Perceived Level of Professional Preparedness Based on Commission on Accreditation for Dietetic Education 2008 Eligibility Requirements and Accreditation Standards

Sylvia Villarreal, 2011

Objective: To describe the demographic and professional characteristics of the University of Medicine and Dentistry of New Jersey Coordinated Program in Dietetics alumni and their perceived level of professional preparedness for entry-level practice. Design/Methodology/Subjects: A descriptive web-based survey was emailed to all alumni of the University of Medicine and Dentistry of New Jersey, School of Health Related Professions (UMDNJ-SHRP) Coordinated Program (CP) in Dietetics who graduated between January 2001 and May 2010 (N=88). Survey respondents were emailed a link to the survey via SurveyMonkey (http://www.surveymonkey.com). Statistical analysis performed: SPSS V 17.0 was used for analysis. Descriptive statistics and frequency distributions were reported. Results: Of the 88 total email invitations that were sent, 41 (48.2%) alumni responded. The results of this study demonstrated that the UMDNJ-SHRP CP alumni who graduated between January 2001 and May 2010 perceived themselves to be adequately prepared or very well prepared for 1) professional practice expectations, 2) clinical and customer services, 3) management and use of resources, and 4) scientific and evidence-based practice as they relate to entry-level dietetic practice. Conclusion: Based on the results of this study, the majority of alumni felt adequately or very well prepared for select entry-level practice competencies.


An Outcomes Assessment of Clinical Nutrition Expert Training by University of Medicine and Dentistry of New Jersey with University of Shizuoka

Akiko Ichimasa, 2011

In Japan, due to revisions in the Japanese dietitian regulations and amendments to the universal medical insurance system, the roles and responsibilities of hospital registered dietitians have been rapidly changing. The University of Medicine and Dentistry of New Jersey (UMDNJ) and University of Shizuoka (U of S) have a collaborative program, funded by the Japanese Global Center of Excellence (COE) program, to develop clinicians with research abilities through Clinical Nutrition Expert Training. The purpose of this research was to explore the perceived benefits and preferred learning strategies of the participants and how the program has affected their practice and teaching. The perceived needs and preferred methods for future education and training through the collaborative partnership were also explored. A mixed-method design study was conducted to assess outcomes of the international collaborative clinical nutrition expert training. The sample comprised all available University of Shizuoka faculty and their doctoral students (n=12) who participated in the training between 2004 and 2009, and the dean of the department at the University of Shizuoka (n=1). A quantitative online survey collected demographic characteristics, which were analyzed using descriptive statistics and frequency distributions. A qualitative semi-structured in-person interview was conducted in the participants' native language in Japan to explore the outcomes of the training. Data from the interviews were audio-recorded and analyzed using content analysis. This program provided benefits in the awareness of the need to improve Japanese dietetics practice, knowledge, skills, motivation, and opportunity. Various other outcomes of the expert training were also identified. Some participants changed their clinical practice or teaching materials after participating in the program.
Other participants have distributed the knowledge that they gained from the training to their students and colleagues including dietitians, school teachers, and faculty members.


Screening for Malnutrition on an Acute Care of Elders Hospital Unit with the Mini Nutritional Assessment - Short Form

Sandra Kuserk, 2011

Background: Malnutrition is often overlooked in hospitalized older adults. There is a need for an efficient, validated and reliable nutrition screening tool in order to trigger intervention by a Registered Dietitian (RD). Objective: The aim was to determine the sensitivity and specificity of the Mini Nutritional Assessment-Short Form (MNA-SF), Crozer-Chester Medical Center Nutrition Screening Tool (CCMC-NST-original), a modified version of the Crozer-Chester Medical Center Nutrition Screening Tool (CCMC-NST-modified) and the Malnutrition Screening Tool (MST). The level of agreement between these tools was also assessed. For the purposes of this study, the Comprehensive Nutrition Assessment (CNA) by an RD was considered the gold standard against which the screening tools were compared. Design: This was a retrospective, cross-sectional analysis of data collected at a teaching hospital in Upland, Pennsylvania. Subjects: Seventy-seven patients from one Acute Care of Elders unit who met inclusion criteria. Statistical analyses performed: A Kappa statistic was performed to determine the level of agreement between the MNA-SF, CCMC-NST-original, CCMC-NST-modified and MST. Sensitivity and specificity were determined by comparing the MNA-SF, CCMC-NST-original, CCMC-NST-modified and MST with the gold standard of CNA by an RD. Results: The MNA-SF was the best tool for predicting malnutrition (sensitivity=.97, specificity=.77) while minimizing false-negatives (3.3%). The MST was the most specific tool (sensitivity=.72, specificity=.82) but it had the highest rate of false-negatives (28.3%). The CCMC-NST-original was a sensitive screening tool (sensitivity=.95, specificity=.18) but had a high number of false-positives (82.4%). The level of agreement between the CCMC-NST-original and the MNA-SF was not significant (K=.143, p=.077), nor was the level of agreement between the CCMC-NST-original and the MST (K=.055, p=.275).
Conclusions: The current nutrition screening tool at Crozer-Chester Medical Center (the CCMC-NST-original) has a poor level of agreement with other validated nutrition screening tools. The MNA-SF was the most sensitive screening tool for identifying malnourished older adult patients in an acute care setting while minimizing false-negatives.
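The sensitivity and specificity figures above come from comparing each tool's result against the gold-standard assessment; a minimal sketch of that calculation follows, with hypothetical counts rather than the study's data.

```python
# Minimal sketch of sensitivity and specificity against a gold-standard
# assessment (here, the CNA by an RD), as used to compare the screening
# tools above. The example counts are hypothetical.

def sensitivity_specificity(tp, fp, fn, tn):
    """tp/fp/fn/tn: true/false positives and negatives vs. the gold standard."""
    sensitivity = tp / (tp + fn)  # proportion of malnourished patients flagged
    specificity = tn / (tn + fp)  # proportion of well-nourished patients cleared
    return sensitivity, specificity

# Hypothetical screen: 29 true positives, 1 false negative, etc.
sens, spec = sensitivity_specificity(tp=29, fp=11, fn=1, tn=36)
```

A highly sensitive tool minimizes false-negatives (missed malnutrition), while a highly specific tool minimizes false-positives (unnecessary RD referrals).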


Utilizing the Vanderbilt Head and Neck Symptom Survey to Assess the Impact of Symptom Burden on Oral Intake and Weight Change over Time in Head and Neck Cancer Patients Post Concurrent Chemoradiation

Heidi Ganzer, 2011

Background: Concurrent chemoradiation (CCR) for the treatment of head and neck cancer (HNC) places patients at risk for malnutrition related to symptom burden. Following CCR, symptom burden may persist, impacting weight and oral energy and protein intake. Objective: To explore relationships between select symptom burden scores, energy and protein intake, and weight change over time among patients with HNC who had completed CCR. Design: Prospective, cross-sectional study performed within two comprehensive cancer care centers. Subjects: Forty-three adults with HNC met inclusion criteria. Statistical analyses performed: A sensitivity power analysis was utilized. Pearson's correlations and partial correlations were utilized to explore relationships between select symptom burden scores, energy and protein intake, and weight change over time. Results: Participants were predominately Caucasian (n=42, 97.7%) and male (n=35, 81.4%) with a mean age of 60.14 years (range 31-80 years). Ninety-one percent (n=39) had Stage III or IV disease. Eighty-six percent (n=37) had a feeding tube (FT) placed. The mean weight loss from diagnosis to treatment completion was 7.91% ± 4.06. Participants using only a FT for nutrition had the highest intake of energy and protein (mean 2367 calories, 102.92 g protein). Within the mid-recovery stage, significant inverse relationships were found between oral protein intake and dry mouth and mucosal sensitivity (r= -.818, p=.012; r= -.726, p=.032, respectively). After controlling for weight change, significant inverse relationships were found in the mid-recovery stage between oral energy intake and dry mouth and mucosal sensitivity (r= -.740, p=.046; r= -.751, p=.043, respectively). Significant inverse relationships were also found between oral protein intake and dry mouth and mucosal sensitivity (r= -.835, p=.019; r= -.726, p=.033, respectively). Conclusions: Dry mouth and mucosal sensitivity significantly impacted oral energy and protein intake post-CCR in the mid-recovery stage.
Weight loss occurred from diagnosis to treatment completion and continued over time. Emphasis on symptom burden, specifically dry mouth and mucosal sensitivity, with regard to oral intake and weight post-CCR should be standard of care.
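The partial correlations reported above (oral intake vs. symptom scores, controlling for weight change) follow the standard first-order formula; the helper functions and any example values below are assumptions, not study data.

```python
import math

# Sketch of Pearson's r and the first-order partial correlation used to
# relate oral intake to symptom scores while controlling for weight
# change. Function names are assumptions for illustration.

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_r(r_xy, r_xz, r_yz):
    """Correlation of x and y after removing the linear effect of z."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))
```

When the control variable z is uncorrelated with both x and y, the partial correlation reduces to the ordinary Pearson correlation.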


Comparing Food and Food Service Satisfaction of Residents in Long-Term Care Facilities that Provide Meals in Resident-Centered Care Neighborhoods or Meals in a Traditional Nursing Home Setting

Phyllis Famularo, 2009

ABSTRACT OBJECTIVES: The purpose of this outcomes study was to determine the impact of decentralized meal service on resident-oriented outcomes, specifically food and food service satisfaction in long-term care facilities, especially with regard to the resident-centered care philosophy of resident choice. Family member satisfaction with food and food service and its relationship to the long-term care resident was also examined. DESIGN: Residents in two long-term care facilities, one providing decentralized meal service in resident-centered care neighborhoods and one providing meals in a traditional nursing home setting, were selected based on inclusion criteria to complete an interviewer-administered 28-item food and food service satisfaction survey which encompassed four domains. Family members of residents who participated in the satisfaction surveys were mailed a family member food satisfaction survey employed by Sodexo. SETTING: Two long-term care facilities in New York State whose foodservice operations are managed by Sodexo were selected to participate in the study. One facility, Masonic Care Community, Utica, New York, provided decentralized meal service in resident-centered care neighborhoods and the second facility, Ozanam Hall Nursing Home of Queens, Bayside, New York, provided meals in a traditional nursing home setting. PARTICIPANTS: Ninety-one long-term care residents participated in the study, with 46 residents from the Neighborhood dining facility and 45 from the Traditional dining facility. Residents were 65 years of age or older and had resided in their current long-term care facility for at least 30 days. Forty-one family members responded to the family satisfaction survey; 27 from the Neighborhood dining facility and 14 from the Traditional dining facility. MEASUREMENTS: Mean scores were calculated for the 28 satisfaction questions on the resident surveys and mean domain scores were calculated for each of the four domains.
Higher scores on the four-point scale denoted greater satisfaction. Demographic, co-morbidity, diet and feeding characteristics were collected for each resident who participated in the study to determine whether there were any differences between the study facilities. Mean scores were calculated for the nine questions on the family survey and results were descriptively compared to the results of the resident satisfaction surveys. MAJOR RESULTS: Gender was the only demographic variable that was statistically different between the two study facilities, and a sub-analysis resulted in no significant gender differences when domain scores were analyzed. Residents in both study facilities were satisfied with the food services they received, with all four domain mean scores averaging over three points on the four-point scale. When the individual survey questions were analyzed, only one of the 28 questions on the resident satisfaction survey was statistically significant between the two long-term care facilities investigated ("Get take-out food for me if I want it"). However, when the mean scores of the four domains were analyzed, two of the domain scores were significantly different. Residents in the Traditional dining facility scored significantly higher in the Exercising Choice domain, and residents of the Neighborhood dining facility scored significantly higher in the Providing Food Service domain. Family members of the Traditional dining facility rated overall satisfaction, taste of food, appearance of food, portion size, menu options and variety of food significantly higher than those of the Neighborhood dining facility; however, the Neighborhood dining facility had a much higher family response rate, 58.7% compared to 31.1%. CONCLUSIONS: Residents of long-term care facilities were generally satisfied with the meal services they received irrespective of type and location of meal service.
While there were some differences noted in the domain scores, choice of meals was important to many of the residents surveyed. Residents who received decentralized meal service in resident-centered care neighborhoods tended to be more vocal if they had food concerns. Family members in the traditional nursing home setting expressed greater satisfaction with food services; however, they had limited observation of the actual meal service due to the space constraints of a traditional nursing home.


Effect of a Dietitian-Managed Bone Algorithm on Serum Phosphorus Level in Maintenance Hemodialysis Patients

Deborah Blair, 2011

Objective: This study examined the effectiveness of a dietitian (RD)-managed bone metabolism algorithm compared to non-RD management on serum phosphorus (PO4) and related clinical outcomes [corrected serum calcium (cCa), intact parathyroid hormone (iPTH), incidence of parathyroidectomy] among in-center maintenance hemodialysis (MHD) patients. Design and Setting: The study was an 18-month retrospective medical record review of adult MHD patients (n=252) at five out-patient dialysis centers in western Massachusetts and Connecticut before and after the change in management of a comprehensive bone metabolism treatment algorithm (IV vitamin D, phosphate binding medication, calcimimetic) from non-RD to RD. Timepoints representing 3-month averages during the non-RD (11/08–7/09) and RD (8/09–4/10) managed periods of the same algorithm were used for analyses. Comparisons of outcomes at similar calendar timepoints [i.e., non-RD managed timepoint 2 (2/09–4/09) and RD managed timepoint 6 (2/10–4/10)] were performed considering potential demographic and clinical confounders. Results: Serum PO4 was lower on average during the RD managed timepoints (range = 5.16–5.19 mg/dL) compared to non-RD managed (range = 5.24–5.37 mg/dL), though the difference between timepoints 2 and 6 was not statistically significant (F=0.108, p=0.74) after controlling for age, dietary intake (enPCR), and dialysis adequacy (eKt/V). Mean cCa (range = 8.71–8.79 mg/dL) was similar throughout the study, and the difference between serum iPTH at timepoint 6 (363.0 ± 296.8 pg/mL; mean ± standard deviation) and timepoint 2 (319.8 ± 251.5 pg/mL) was nonsignificant after controlling for age. There were fewer parathyroidectomies during the RD managed period (0.8%) compared to the non-RD managed period (1.6%). Conclusions: Study results demonstrate that RDs may be equally effective as non-RDs in bone metabolism algorithm management with respect to serum PO4, cCa and iPTH control in MHD patients. 
Further research is needed to prospectively evaluate the effect of RD management on these bone mineral outcomes.


Perceptions, Attitudes, Knowledge and Clinical Use of Evidence-Based Practice among US Registered Dietitians: A Prospective Descriptive Pilot Study

E Annelie M Vogt, 2011

Background: Evidence-based practice (EBP) has evolved out of a drive for patient safety, yet little research is published regarding the EBP of registered dietitians (RDs). Objective: The objective of this study was to describe the perceptions, attitudes and knowledge (PAK) and clinical use regarding EBP of US credentialed RDs. Design: The design was a prospective descriptive internet-based survey. Study Population: The study population consisted of a randomized sample of 2,500 US credentialed RDs who spent at least 20% of their time per week in clinical practice. Statistical Analyses: SPSS® v19.0 was used for analyses of descriptive statistics. Results: The mean age of the respondents was 43.6 years; most respondents were female (n=171, 96.1%) and white (n=157, 88.2%), worked full time (n=136, 76.8%) and were members of the American Dietetic Association (ADA) (n=123, 62.1%). The most frequent use of the ADA Evidence Analysis Library (EAL) was yearly (n=79, 41.4%). Access to databases mainly occurred via work (n=167, 84.3%) or home (n=128, 64.6%) PC or laptop. Lack of access to mentors, lack of formal training and time constraints were primary barriers. Over one-half of respondents (n=112, 56.6%) had access to databases, but used evidence-based resources (n=100, 52.6%) or read professional journals (n=90, 48.1%) less than once a month. Respondents considered EBP to be valuable (n=146, 75.3%) and relevant to practice (n=135, 68.9%). Increased frequency of use of resources (F=11.97, p<0.0001) and reading of professional journals (F=6.34, p<0.0001) were associated with increased PAK scores. Conclusions: Results indicated that RDs had good access to databases but most used resources less than once a month. Perceptions and attitudes were higher than levels of knowledge of EBP, and respondents lacked mentors and time for implementation. Increased frequency of use of resources was significantly associated with increased PAK of EBP. 
Educational strategies are needed to increase knowledge and appraisal skills of EBP.


The Likelihood of Utilizing Weight Management Resources by UMDNJ Faculty and Staff

Stephanie Macaluso, 2011

Background: Studies have shown improvements in the eating habits, physical activity, and weight status of employed adults participating in worksite wellness programs, but it is unknown which combination of factors best predicts the likelihood of interest in utilization of weight management resources in the workplace. Objective: This study sought to determine which combination of demographic characteristics, opinions and interests toward weight management, and behavioral factors were significant predictors of interest in utilization of weight management resources at the University of Medicine and Dentistry of New Jersey (UMDNJ), based on the weight management resources survey completed by faculty and staff in 2008. Design: A retrospective exploratory study was conducted based on previously collected survey data from the Perceived Needs, Interests, and Practices in Weight Management of UMDNJ Faculty, Staff, and Students study. Subjects: Faculty and staff responses (n=432) were used out of the 612 survey responses. Statistical Analyses: Descriptive statistics were reported for demographic characteristics, opinions toward weight management, interests in weight management, and behavioral factors. Chi-square analyses were used to determine the relationships between demographic characteristics, interests in weight management, and behavioral factors and the likelihood of utilizing individual diet counseling and the likelihood of utilizing either an internet-based or campus-based weight loss program. Binary logistic regression was completed to determine significant predictors of interest in utilization of individual diet counseling and either an internet-based or campus-based weight loss program. Results: Seventy-seven percent of respondents were staff and 22.7% were faculty. Seventy-six percent of the sample was female; respondents had a mean age of 43.82 years and a mean BMI of 27.64, and 51% had a graduate degree. 
Seventy-nine percent reported that healthy eating, 84.5% that attaining or maintaining a healthy weight, and 69.6% that regular physical activity were important to them. Seventy-seven percent were interested in healthy dining options and 60% were interested in individual diet counseling. Almost seventy percent reported eating at least 3 oz equivalents of whole grains daily and 57.6% reported eating at least 1½–2 cups of fruit per day. Significant predictors of the likelihood of utilization of individual diet counseling included an obese BMI, age, interest in a nutrition and healthy eating class, and utilization of a campus-based weight program. Significant predictors of the likelihood of utilization of either weight loss program included an obese BMI, having a bachelor's degree, interest in a nutrition and healthy eating class, interest in individual diet counseling, and utilization of individual diet counseling. Conclusions: A three-factor model including demographics, interests in weight management, and behavioral factors was significantly better at predicting interest in utilization of individual diet counseling and interest in utilization of either an internet-based or campus-based weight loss program than a two-factor model including only demographics and behavioral factors.


Knowledge, Risk Factors, and Behaviors Associated with Lower-Limb Complications in Patients with Diabetes on Hemodialysis

Mona Therrien, 2012

Objective: To identify the risk factors for lower-limb complications, knowledge of diabetes self-management, and foot care behaviors in a sample of hemodialysis patients with diabetes. Design/Subjects: A prospective, cross-sectional pilot study using a convenience sample of patients with diabetes on hemodialysis was conducted. Participants completed a survey on diabetes knowledge and foot care behaviors. Demographic and clinical data were gathered from the electronic medical record. Statistical Analysis: Pearson's correlation and Kendall's tau were used to test the relationships among knowledge, risk factors and foot care behaviors, with p ≤ 0.05 considered significant. Results: The mean age of the 79 participants was 65.6 years; 55.7% were female and 96.2% were Caucasian. Participants had had diabetes for a mean of 22.4 years. Neuropathy was diagnosed in 60.3% of participants; 34.2% had a previous foot ulcer, 33.3% had peripheral vascular disease (PVD), 24.4% had an amputation, and 25.6% had foot deformities. Participants scored a mean of 78.2% on the Diabetes Knowledge Test, and the mean number of foot care behaviors performed was 1.61 out of 4. Knowledge deficits included the carbohydrate content of foods and A1c testing. Foot care behaviors, including foot moisturizing and toenail trimming, were not performed according to recommended guidelines. Positive correlations were found between dialysis vintage and the following risk factors: PVD (r=0.283, p=0.012), neuropathy (r=0.244, p=0.031), foot ulcer (r=0.233, p=0.040), and amputation (r=0.324, p=0.004). Conclusions: Risk factors and knowledge gaps regarding diabetes and foot care behaviors were identified. Interventions are warranted to bridge knowledge gaps in diabetes self-management and improve the performance of foot care behaviors.


Overweight and Obesity Screening Knowledge and Attitudes of Pre- and Postdoctoral Dental Students at Rutgers University

Kelly Kleckner, 2014

Background: The Institute of Medicine recommends that health professional training programs include instruction in the prevention, screening, diagnosis, and treatment of overweight and obesity. Objective: To study the knowledge and attitudes of dental students towards overweight and obesity screening in the dental setting. Design/Subjects: Prospective, internet-based cross-sectional survey of all dental students enrolled in the Rutgers School of Dental Medicine. Statistics: Associations between knowledge and attitude and year enrolled were analyzed using the Mann-Whitney test. Spearman's rho was used to test the relationships between attitude and knowledge, sex, and self-perceived weight status. Results: Forty-nine of 494 students responded to the survey; the sample was 95.3% predoctoral students, 55.8% white, and 65.1% female, with a mean age of 26.1 years; 88.4% identified themselves as normal weight. The mean knowledge score was 7.53 out of 10 questions (75.3%). There was no significant relationship between knowledge score and year enrolled (p=0.329). Respondent attitude scores reflected negative attitudes toward weight screening [mean=25.19, SD=7.30 (neutral score=30)]. Although the relationship was not significant, students in their clinical years had more positive attitudes regarding overweight and obesity screening in the dental clinic (mean=27.23, SD=8.94) than students in their preclinical years (mean=23.88, SD=6.62). Conclusions: Respondent dental students with clinical experience had more positive attitudes towards weight screening in the dental clinic than students in their preclinical years. Further research is needed with a larger sample. Additional education and research may be needed to change knowledge and attitudes.


Perceptions of Clinical Registered Dietitians about Advanced Practice Residencies

Cheryl Marsland, 2012

Objective: To identify the perceptions of clinical dietitians (RDs) regarding the need to complete, prerequisites for, and components and competencies of Advanced Practice Residencies (APRs). Design/Subjects: An electronic survey was e-mailed to a random sample of 8,508 beyond-entry-level RDs. RDs working >50% of their time in clinical dietetics and nutrition practice were included. Component and competency statements were rated using a Likert agreement scale. Statistics: SPSS 17.0 was used for data analyses (alpha: p ≤ 0.05). Descriptive and inferential statistics were used to report data. Results: Of 1,735 respondents (20.4% response rate), 57.7% were in clinical practice. The mean age and years in practice were 46.6±10.7 and 18.4±10.0 years, respectively. Fifty-two percent had a graduate degree; of those, 58.5% held a pre-professional graduate degree and 41.5% held a post-professional graduate degree. Thirty-nine percent of respondents were interested in completing an APR; 55.7% agreed an APR should be required for advanced-practice RDs. Fifty-seven percent thought APRs should be free-standing programs and 48.9% thought APRs should be 3 months or longer in length. Respondents thought prerequisites for an APR should include a minimum number of years of RD experience (83.3%) and a post-professional degree (36.1%). RDs agreed that expert-level mentorship from RD clinicians (93.8%) and other healthcare professionals (78.3%) should be included in APRs. For competencies that should be performed on medically complex patients at an advanced level, more than 90% of respondents agreed the competencies should address advanced-level communication with patients, families, and healthcare professionals; nutrition focused physical examination; and all components of the nutrition care process and model. 
No significant correlation was found between the survey question "completion of an APR should be one of the requirements for an AP credential" and years in practice (r = 0.661, p = 0.018). A significant inverse correlation indicated that respondents with fewer years of practice as an RD were more interested in completing an APR than respondents with more years of practice (r = -0.291, p < 0.001). Respondents with a graduate degree were significantly more likely to agree (χ2 = 7.913, p = 0.005) that completion of an APR should be one of the requirements for an advanced practice credential in clinical dietetics than those without a graduate degree. Implications for Practice: These data provide a needs assessment for the development of clinically focused APRs.


Can Stages of Change Predict Clinical Outcomes?

Ka Yee Phoebe Ho, 2012

Objective: To determine whether baseline stages of change (SOC) can predict clinical outcomes in a 12-week university-based worksite wellness program. Methods: A secondary analysis of data was conducted (n=137). Participants met with a registered dietitian for data collection, counseling, and education. The predictor was baseline SOC. Outcome measures were weight, body fat, and waist circumference (WC). Results: A general linear model with repeated measures indicated that changes in the three outcome measures did not differ statistically by baseline SOC group over time. Within all subjects, significant changes were observed in weight, body fat, and WC at 12 weeks and in WC at 52 weeks. Conclusion: Baseline SOC was not a predictor of changes in weight, body fat, and WC at 12 and 52 weeks. A larger sample with evenly distributed baseline SOC groups is desirable for future research.


A Food Guidance System and Dietary Patterns of Collegiate-Level Soccer and Cross-Country Athletes: A Pilot Study

Karen Gibson, 2012

Objectives: To determine the impact of using a new food guidance system, the Meal Builder©, compared to usual practice on the dietary patterns of collegiate athletes. Methods: An eight-week, prospective, non-randomized study of collegiate cross-country running and soccer athletes assigned to either usual practice or the intervention (Meal Builder©). Primary outcomes were carbohydrate intake/kg, protein intake/kg and calorie intake/kg. Secondary outcome measures were fruit and vegetable servings/day. Repeated measures analysis of variance was used to examine differences between groups on outcome measures, with a priori alpha < .05. Results: Thirty-nine athletes in the intervention group (5 cross-country; 34 soccer) and 31 in the usual practice group (15 cross-country; 16 soccer) completed the study. There were no significant interaction effects between time and group assignment for any of the primary or secondary outcome measures. Positive trends in dietary intake were identified in the intervention group. Mean changes for the intervention and usual practice groups, respectively, were: +0.4 and -0.3 g of carbohydrate/kg; +0.05 and -0.06 g of protein/kg; +3.2 and -2.6 kcal/kg; +0.7 and +0.1 servings of fruit/day; and +0.01 and -0.46 servings of vegetables/day. Conclusion: As positive dietary trends were noted in the intervention group, a larger sample followed over a longer study period would better evaluate the effectiveness of this food guidance system approach.


Burnout and Positive Self-Care Practices of Dental Students at the New Jersey Dental School

Mark Keiserman, 2012

Background: Student well-being and coping strategies to manage stress develop through the study years and into clinical practice. To date there is a paucity of published research regarding the association between dental student burnout and measures of student positive self-care practices such as dietary intake, physical activity, and weight management. Objective: The purpose of this study was to measure the prevalence of high risk of burnout and selected positive self-care practices, and to explore relationships between high risk of dental student burnout and student self-care practices. Methods: Prospective, cross-sectional, self-administered internet-based survey of all 410 students at the New Jersey Dental School (NJDS) of the University of Medicine and Dentistry of New Jersey (UMDNJ) enrolled for the Fall 2011 term. Chi-square, Fisher's Exact Tests, paired t-tests, and point biserial correlation were used. Alpha was set at p<0.05. Results: Among the 85 respondents (20.7% response rate), the mean age was 26.5 years, 63.0% were female, and 50.6% were racial/ethnic minority group members. A healthy BMI (18.5–24.9 kg/m2) was reported currently and one year ago by 70.1% and 71.6% of students, respectively. High risk of burnout, depressive symptoms, and poor physical health which impaired daily activities were reported by 80.7%, 54.2%, and 33.7% of students, respectively. Burnout was associated only with BMI (Fisher's Exact Test, p<0.020), depression (χ2 = 13.90, p<0.001), and physical activity impairments (χ2 = 6.70, p<0.01). Conclusions: The results of this survey found that participating students do not exhibit behaviors consistent with the Dietary Guidelines for Americans, 2010, and are at high risk of burnout, depression, and poor physical quality of life. 
Further research investigating the prevalence of dental student distress, as manifested by burnout, depression, and poor physical quality of life, and student practice of the positive self-care behaviors which may form the basis for professional practice is indicated.


Impact of Oral Health Assessment Training of Dietitians on Implementing New Knowledge in their Practice: A Pilot Study

Shelly Rachman Elbaum, 2012

Background: Oral health can impact nutrition status in the elderly residing in long-term care facilities (LTCFs). Nutrition-focused physical assessment (NFPA) includes examination of the oral cavity, assessment of cranial nerves, and dysphagia screening. NFPA training is not currently included in the scope of practice or undergraduate training of Israeli RDs. Objective: The objective of this study was to examine the impact of an oral health training program (OHTP) on documented patient care practices as reported by participating Israeli RDs approximately three months after training completion. Design: This study is a sub-study of a pre-/post-test design and an educational outcomes study focused on self-reported changes in the clinical practice skills of participating RDs, who served as their own controls. Subjects: A convenience sample of 32 licensed, full-time Israeli RDs approved and employed by the Israeli Ministry of Health in several LTCFs. Statistical Analyses: SPSS® v19.0 was used to compare pre- and post-training assessment practices. Chi-square analyses were used to examine change in participants' documented patient care practices at two time points. A priori alpha was set at p ≤ 0.05. Results: All participants were female with a mean age of 37.2 years (SD=8.5), a mean of 9.2 years (SD=5.8) of clinical practice, and 6.2 years (SD=4.2) of experience in long-term care. Of the 660 assessments used for analyses, 339 were pre-training and 321 were post-training. RDs frequently used information from medical records instead of conducting their own assessments in the pre-training period when compared to post-training data. Other significant findings of the comparison data demonstrated that RDs completed most assessment tasks included in the oral health assessments (n=282; 96.9%), identified more findings as non-normal (n=126; 43.9%) and described those findings using appropriate medical terminology. 
At three months after training, RDs were significantly more likely to conduct and document oral assessment tasks across all exam components tested (p<0.001) and were 5.7 times more likely to refer a patient to other allied health care professionals (OR = 5.7; p<0.001). Conclusion: This study demonstrated that the self-reported clinical practice patterns of Israeli RDs employed by the Israeli Ministry of Health in the long-term care setting improved significantly within three months of completing OHTP. Further research on the impact of OHTP on Israeli RDs practicing in LTCFs is warranted.


The Study of Feeding Tube Use in Head and Neck Cancer Patients for Whom Chemoradiation Is the Primary Mode of Treatment

Sherri Lewis, 2012

Background: Patients undergoing chemoradiation (CRT) for stage III/IV head and neck cancer (HNC) are at risk for nutrition-related comorbidities. The benefits of using a prophylactic feeding tube (PFT) are undetermined. Design/Subjects: A retrospective, exploratory study of 147 patients with stage III/IV HNC who completed CRT at the James A. Haley Veterans Hospital in Tampa, Florida was conducted. Statistics: Relationships between prophylactic feeding tube (PFT) use and weight change, emergency department (ED) visits, hospital admissions (HA) and the proportion of chemotherapy cycles completed were analyzed using repeated measures ANCOVA and Mann-Whitney U tests. The Pearson chi-square test was used to evaluate the relationship between the use of a PFT and FT dependency. Results: The study sample (n=147) was 87.8% white and 98% male, with a mean age of 61.8 years. Twenty-one percent received a PFT before CRT, 30.5% had a reactive feeding tube (RFT) placed during CRT and 47% did not receive a feeding tube (FT). Patients with a PFT had significantly less weight loss during CRT compared to those with a RFT or no FT after controlling for pre-treatment percent weight change (p<0.001). Patients with a PFT had significantly less weight loss from diagnosis to 12 months post CRT compared to patients without a PFT after controlling for pre-treatment weight change (p=0.005). Patients had 166 nutrition-related (dehydration, dysphagia, odynophagia, mucositis or feeding tube related) ED visits and 122 nutrition-related HA from day 1 of CRT to 12 months post CRT. Patients with a PFT had significantly fewer nutrition-related ED visits and nutrition-related HA (both p<0.001) and higher proportions of chemotherapy cycles completed compared to those without a PFT (p<0.001). Patients with a PFT were more likely to be 100% FT dependent than patients without a PFT at the completion of CRT (p<0.001) and at 3 and 6 months post CRT (p=0.002). 
By 12 months post CRT, there was no relationship between the use of a PFT and 100% FT dependency (p=0.827); 80% of all patients were taking 100% of nutrition orally and 14.7% were taking nutrition orally with supplemental tube feeding. Conclusions: Patients with stage III/IV HNC treated with CRT experienced less weight loss, fewer ED visits and HA, and completed chemotherapy at higher rates when a PFT was used compared to those without a PFT. The majority of patients were able to resume an oral diet by 12 months post CRT regardless of feeding tube status.


Nutrition Focused Physical Assessment (NFPA) Teaching Practices in Accreditation Council for Education in Nutrition and Dietetics (ACEND)-Accredited Dietetic Internships and Coordinated Programs (DIs/CPs)

Jennifer Kidd, 2012

Background: While nutrition focused physical assessment (NFPA) is integral to the Nutrition Care Process, there are no Accreditation Council for Education in Nutrition and Dietetics (ACEND) competencies requiring Dietetic Internships and Coordinated Programs (DIs/CPs) to teach students and interns to perform physical assessment. Objective: To explore DI/CP NFPA teaching practices and perceived barriers and enablers to teaching NFPA skills. Design: Cross-sectional, prospective, web-based survey. Subjects: ACEND-accredited DI/CP directors or their designees. Statistical Analysis: Descriptive statistics were used to report NFPA skills taught and barriers and enablers to teaching NFPA. Results: Ninety-one (32.8%) ACEND-accredited DI/CP directors or their designees participated in the survey regarding NFPA teaching practices in DIs and CPs. The NFPA skills most frequently taught were related to anthropometrics and body composition: weight (82.0%), height (79.8%), waist circumference (64.0%), skinfold caliper use (55.1%), and bioelectrical impedance analysis (45.5%). Of the non-anthropometric or body composition NFPA skills cited, skin assessment (64.4%) and peripheral edema assessment (62.1%) were the most frequently taught. Interest in teaching NFPA among program educators was cited as an enabler (46.6%) to teaching NFPA. Cited barriers to teaching NFPA skills included the availability of educators trained in NFPA (56.2%), the availability of multi-media resources (52.1%), and the NFPA experience of supervised practice preceptors (50.7%). Conclusions: The results indicate that programs most frequently teach students to perform anthropometric measurements. Future research may address barriers, including the need for comprehensive NFPA training, preceptors experienced in NFPA, and multi-media resources, in order to increase the frequency of NFPA skills taught to students and interns.


Trends in Nutrition Care Activity Involvement, Distinguishing the Entry-Level RD from RDs in Practice up to Four and Five Years: A Secondary Analysis of the 2010 Entry-Level Dietetics Practice Audit

Erin Andersen, 2012

Objective: To explore relationships between years of practice and involvement in nutrition care activities. Design/Subjects: This study was a secondary analysis of data from the Commission on Dietetic Registration's 2010 Entry-Level Dietetics Practice Audit. RDs were stratified by years of practice: entry-level RDs, RDs in practice up to four years, and RDs in practice up to five years. Relationships between years of practice and the highest type of involvement ("assist others," "perform myself," or "supervise/manage") within 18 nutrition care activities were explored. Statistics: Demographic characteristics and perceived risk of involvement in nutrition care activities were reported using descriptive statistics. The relationships between years of practice and involvement (type and frequency) in nutrition care activities were tested using chi-square. Results: Respondents (N=3,116) had a mean age of 31.1±7.4 years; 42.5% had graduate degrees, and 31.2% worked in acute care. A significant relationship was identified between years of practice and type of involvement for 15 of the nutrition care activities. For counseling on end-of-life issues related to nutrition and hydration, more entry-level RDs reported "perform myself" than RDs in practice up to four and five years (χ2=19.99, p=0.001). More RDs in practice up to five years reported "supervise/manage" for the activity of performing nutrition focused physical examinations than RDs in practice up to four years and entry-level RDs (χ2=12.01, p=0.017). More RDs in practice up to five years reported "supervise/manage" for the activity of recommending intravenous or parenteral nutrition therapies than RDs in practice up to four years and entry-level RDs (χ2=30.81, p<0.001). Significant relationships were also identified between years of practice and frequency of involvement for 11 of the nutrition care activities; more entry-level RDs had daily involvement than RDs in practice up to four and five years. 
Application/Conclusion: A greater proportion of RDs in practice up to four and five years supervised/managed nutrition care activities than entry-level RDs, while more entry-level RDs were involved daily in nutrition care activities.


Usage Patterns of the Standards of Practice and Standards of Professional Performance in Nutrition Support by Registered Dietitian Members of the American Society for Parenteral and Enteral Nutrition and Dietitians in Nutrition Support

Christine Gaylor, 2012

Objective: To determine usage patterns and barriers and enablers to use of the 2007 Standards of Practice and Standards of Professional Performance for Dietitians in Nutrition Support (SOP/SOPP) by Registered Dietitian (RD) members of the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.) and the Dietitians in Nutrition Support Dietetic Practice Group (DNS DPG) of the Academy of Nutrition and Dietetics. Design/Subjects: A prospective, internet-based survey was emailed to RD members of A.S.P.E.N. and DNS who saw at least one nutrition support (NS) patient weekly or supervised at least one RD who saw at least one NS patient weekly (n=3,769). Descriptive statistics were used to report demographic and professional characteristics, use, and barriers/enablers to SOP/SOPP use. Results: Of 893 usable responses (23.7%), 81.6% were clinicians; 76.5% were hospital-based with a mean of 13.5 years (SD=15.8) in NS practice. Eighty-seven percent reported having read the SOP/SOPP (59.6% in part, 28.0% in entirety). Respondents used the SOP/SOPP to: guide practice (94.6%), assess clinical practice (83.3%), teach dietetic students (68.4%) and develop NS policies and procedures (66.4%). When respondents were asked to identify whether items were barriers, enablers or neither, the most frequently reported barriers to SOP/SOPP use were: length (33.5%), training (14.0%) and access to the SOP/SOPP (10.9%). Desire to advance current practice (69.1%), desire to practice autonomously (49.4%) and experience in NS (25.6%) were enablers to SOP/SOPP use identified by respondents. Conclusions: The primary uses of the SOP/SOPP by RDs in NS were to guide and assess clinical practice. Future research should address approaches to minimize barriers to SOP/SOPP use.


Developing a Predictive Energy Equation for Maintenance Hemodialysis Patients

Wai Yin Ho, 2012

Background: Optimal nutritional care requires an accurate estimation of energy expenditure (EE). Determinants of EE in maintenance hemodialysis (MHD) patients have not been clearly elucidated. Objective: This retrospective, cross-sectional study explored potential predictors of measured resting energy expenditure (mREE) among MHD patients. Methods: Data from adult MHD patients (n=67) at Vanderbilt University Medical Center were analyzed using Pearson's correlation and multivariate linear regression procedures to examine demographic, anthropometric, clinical, and laboratory data as predictors of mREE. Results: The mean age of the sample was 47 ± 13 years; 75.6% (n=50) were African American, 92.5% (n=62) were non-Hispanic, and 73.1% (n=49) were male. Lean body mass (LBM) (r=0.598), albumin (r=0.483), age (r=-0.455), weight (r=0.455), serum creatinine (r=0.355), height (r=0.313), BMI (r=0.275), sex (r=0.272), C-reactive protein (CRP) (r=0.265), and fat mass (r=0.256) were all significantly correlated with mREE (p<0.05). After screening for multicollinearity, the strongest predictive model (R2=0.489) of mREE included LBM, albumin, age, and CRP. Two practical models (R2=0.460, R2=0.451) were derived for clinical utility, each including age, sex, weight, albumin, and either CRP or creatinine. Conclusions: Findings indicated that the inclusion of clinical parameters produced a better predictive energy model in the MHD population than the combination of demographic and anthropometric characteristics alone. However, the best predictive model in this sample explained less than 50% of the variance in mREE. Further research with a larger sample size is needed to identify the best combination of predictors of energy expenditure in the MHD population.


Needs Assessment of Graduate Nutrition Programs in the United States

Brenda Harvey, 2012

Objective: To conduct a needs assessment of U.S. graduate nutrition programs in order to determine the differences between programs offered exclusively to registered dietitians (RDs) and those that do not require the RD credential. Design: A descriptive, cross-sectional, prospective needs assessment utilizing an internet-based survey administered through SurveyMonkey was conducted over 10 weeks between October 2, 2011 and December 6, 2011. Subjects: Directors of graduate degree programs listed in the 2011-2012 Accreditation Council for Education in Nutrition and Dietetics (ACEND) directory were invited to participate. Statistical Analysis: Frequency distributions were used to report program characteristics. Fisher's exact tests were used to explore the relationships between programs offered exclusively to RDs and those offered to non-RDs with regard to program characteristics. Data analyses were conducted using SPSS v19; the p value was set at ≤ 0.05. Results: Of the 99 program directors invited to participate, 53 usable surveys (53.5%) were submitted. Nutrition was the most frequently reported master's degree program offered (n=40, 76.9%). Medical Nutrition Therapy/Human Nutrition was the most commonly offered major or emphasis (n=24, 60.0%). The most frequently reported admission requirement was a Bachelor of Science degree (n=47, 92.2%). Eighty-one percent (n=39, 81.3%) of programs reported that no credentials were required for admission, and eight programs/program tracks (16.7%) reported that the RD credential was required for admission. Thirty-seven percent (n=14, 36.8%) of respondents indicated that a thesis was required for graduation; however, 85.7% (n=24) reported that completion of a thesis was an optional graduation requirement. The primary educational platform used was the in-person classroom (n=40, 95.2%).
No statistically significant differences were found between programs which required the RD credential and those which did not with regard to requiring a bachelor's degree for admission, completion of a thesis for graduation, or courses offered in nutrition focused physical exam. Conclusions: The results of this needs assessment describe the similarities and differences in admission and graduation requirements, as well as curricular and program characteristics, among U.S. graduate nutrition programs which offer degrees in nutrition and dietetics. There were no statistically significant differences between programs offered exclusively to RDs and those that did not require the RD credential. There are no educational requirements for graduate education in nutrition and dietetics in the United States, as graduate degrees are not required for entry into practice. Further studies may need to be conducted to explore these programs' specific curricular components and how they prepare graduates for the workplace.


Participation in the Active Intervention Phase of a Worksite Wellness Program: Changes in Cardiovascular Health Outcomes

Lauren Kolesa, 2012

Objective: To examine changes in AHA cardiovascular health metrics, Framingham risk scores and waist circumference values following participation in the active intervention phase of a 26-week worksite wellness program at an academic health sciences center. Design/Methodology/Subjects: A retrospective within-group pre-post intervention design was utilized with employees and students who had a BMI ≥ 25 kg/m2 (N=74) and completed the active intervention phase (12±2 weeks) of a 26-week worksite wellness program at the University of Medicine and Dentistry of New Jersey between 2010 and 2011. Participants attended individual counseling sessions with the registered dietitian (RD) at baseline and following the active intervention phase, and had access to weekly online education sessions, a wellness blog, in-person weigh-ins, and RD guidance via email and telephone. Outcome measures completed at baseline and following the active intervention phase included AHA cardiovascular health metrics, Framingham risk scores for CHD and waist circumference (WC) values. Descriptive and inferential statistical analyses were employed. The a priori alpha level was set at 0.05. Results: Participants had statistically significant reductions in body mass index (mean reduction=0.6 kg/m2, p<0.001, paired t-test), systolic blood pressure (mean reduction=3.4 mmHg, p=0.013, paired t-test), total cholesterol (mean reduction=9.0 mg/dL, p=0.004, paired t-test), glucose (mean reduction=5.9 mg/dL, p=0.048, paired t-test) and WC (mean reduction=2.5 cm, p<0.001, paired t-test) following the active intervention phase. The prevalence of ideal cardiovascular health in all five health metrics, including blood pressure, total cholesterol, glucose, physical activity, and smoking status, increased from 1.5% (n=1) at baseline to 6.1% (n=4) after the active intervention phase.
Conclusions: There were improvements in participants' AHA cardiovascular health metrics following the active intervention phase of a worksite wellness program, as well as an increase in the prevalence of ideal cardiovascular health. Further research with a larger sample size, control group and longer intervention period is needed.


The Performance and Perceived Value of Registered Dietitians Working in Long Term Care on the Nutrition Focused Physical Examination (NFPE) of the Head, Neck and Oral Cavity

Heather Miceli, 2012

Background: To explore to what extent Registered Dietitians (RDs) in Long Term Care (LTC) perform the Nutrition Focused Physical Examination (NFPE) of the head, neck, and oral cavity, and their perceived value of its conduct as it relates to the nutrition assessment of the LTC resident. Objective: To determine the performance and perceived value of the NFPE of the head, neck, and oral cavity among RDs in LTC. Design: This was a prospective web-based survey. Subjects: A list of 331 email addresses was obtained to recruit RDs in LTC employed by Sodexo and Genesis HealthCare. Statistical Analysis: Descriptive statistics were used to report the performance and perceived value of the NFPE components of the head, neck, and oral cavity. Results: A total of 112 (33.8%) respondents completed the survey and met the inclusion criteria of being an RD, working or consulting in an LTC setting and assessing the nutrition status of LTC residents. The majority of the respondents were female (98.2%, n=109) with a mean of 11.5 years in clinical practice in LTC. The majority of RDs in LTC worked in a skilled nursing facility (96.4%, n=106) and 62.6% (n=67) reported a Bachelor's degree as their highest level of education. The NFPE components most frequently performed independently included the evaluation of the presence of dentures (24.2%, n=27), overall dentition (23.2%, n=26) and wetness of the oral cavity (20.2%, n=22). The NFPE components not performed or used as part of the RDs' nutrition practice in LTC included the assessment of the cranial nerves (70.6%, n=77). Twenty-four percent (n=27) of the respondents reported conducting dysphagia screening. The majority of the respondents (64.3%, n=72) reported using the NFPE information recorded by another healthcare professional's examination rather than performing the examination themselves (23.2%, n=26).
The conduct of the intraoral examination (94.5%, n=104) and dysphagia screen (87.7%, n=93) were perceived as most valuable to the nutrition assessment of the LTC resident, whereas the extraoral examination (37.7%, n=40) and cranial nerve screen (48.1%, n=50) were not perceived as valuable. Conclusions: The majority of the respondents do not independently perform components of the NFPE of the head, neck, and oral cavity. Those who do perform some components of the intraoral examination (presence of dentures, overall dentition, and wetness of the oral cavity) and find them valuable to the nutrition assessment of the LTC resident.


The Relationship Between Changes in Clinical Outcomes and Changes in Health Related Quality of Life of University Faculty and Staff Who Participate in a Worksite Wellness Program (Live Well!) Through Program Midpoint (12 Weeks)

Elizabeth Silverthorne, 2012

Objective: To determine the relationship between changes in clinical measures (waist circumference, body weight, percent body fat, and physical activity score as measured by the International Physical Activity Questionnaire [IPAQ]) and changes in health-related quality of life (HRQOL) measures after 12 weeks of a worksite intervention. Methods: This study analyzed data from the first 12 weeks of a 26-week worksite wellness program. The sample comprised 157 faculty/staff from four university campuses. Results: Spearman's rank correlation coefficient revealed significant relationships between reductions in waist circumference and body weight and improvement in self-rated health after the 12-week worksite intervention. Conclusions: Significant clinical improvements in waist circumference, body weight, and total IPAQ score, as well as HRQOL improvements, were demonstrated after the 12-week worksite wellness program. Furthermore, there were significant relationships between reductions in waist circumference and body weight and changes in self-rated health.


Improving enteral delivery through the adoption of the “Feed Early Enteral Diet adequately for Maximum Effect (FEED ME)” protocol in a surgical trauma ICU – a quality improvement review

Beth Taylor, 2014

Background: Despite the research supporting adequate enteral nutrition (EN) in ICU patients, underfeeding is still common. This quality improvement (QI) project was done to determine the effect of “volume-based” feeding on the adequacy of EN delivery and provision of calories and protein in a surgical/trauma ICU (STICU). Materials and Methods: Mechanically ventilated STICU patients (n=111) fed at least 72 hours after achieving their target goal of EN during their first week of admission were reviewed retrospectively in a QI project. Data were obtained pre- (n=54) and post- (n=56) initiation of a “volume-based” feeding protocol (FEED ME: Feed Early Enteral Diet adequately for Maximum Effect). Descriptive statistics were used; statistical significance was set at P < 0.05. Results: The proportion of EN volume and calories delivered increased significantly (rate-based 63% ± 20% vs FEED ME 89% ± 9%, P < 0.0001), as did grams of protein per kilogram actual body weight (1.13 ± 0.29 to 1.26 ± 0.37, P=0.036) using the FEED ME protocol. Groups were similar in patient demographics, clinical characteristics and nutrition practices. Only slightly more diarrhea (rate-based = 0, FEED ME = 6, P = 0.046) was noted in gastric-fed patients. The incidence of GRV > 350 ml (rate-based 20 vs. FEED ME 11 episodes, P=0.34) and emesis (5 vs. 2 episodes, P=0.22) were similar. Conclusion: A change in the standard of practice to an EN volume-based feeding approach in a STICU led to a significant improvement in the adequacy of calories and protein delivered, with only a slight increase in diarrhea.


The Impact of a 9-Week Interactive Internet-Based Nutrition Education Program on Nutrition Knowledge, Dietary Behaviors and Self-Efficacy of Collegiate Athletes

Christine Karpinski, 2012

Improving the nutrition knowledge and sub-optimal dietary behaviors of collegiate athletes depends upon the athletes obtaining accurate information from credible sources through effective nutrition education. The purpose of this study was to evaluate the impact of an interactive Internet-based nutrition education program on the sports nutrition knowledge, dietary behaviors, and self-efficacy of collegiate athletes. Three outcome measures, nutrition knowledge, dietary behaviors and self-efficacy, were prospectively measured at baseline and after the intervention in 76 National Collegiate Athletic Association Division II student-athletes. The student-athletes were randomly assigned to participate in an interactive Internet-based sports nutrition education program (intervention group) or to be given access to static online nutrition fact sheets (comparison group). All participants completed a self-reported questionnaire on dietary behaviors, nutrition knowledge and self-efficacy at baseline and at the end of the intervention. Height, weight, participation, and program satisfaction were assessed at baseline and at the end of the intervention. Chi-square and independent-samples t-tests were conducted to compare differences between groups at baseline and at the end of the intervention for demographic characteristics, summary scores, and item analyses. A two-way, repeated-measures analysis of variance was conducted for each of the three main outcome variables to assess the significance of change over time. Statistical significance was set at an a priori alpha of p ≤ 0.05. The changes in total nutrition knowledge and dietary behavior scores were significantly greater in athletes participating in the intervention group compared to the comparison group (p = 0.002 and p = 0.006, respectively, repeated-measures ANOVA). There was no significant difference between the two groups over time for self-efficacy.
Nutrition knowledge, dietary behaviors and self-efficacy scores increased for all participants from baseline to the end of the study (p ≤ 0.001, repeated-measures ANOVA), regardless of group assignment. The 9-week interactive, Internet-based intervention had a positive impact on the nutrition knowledge and dietary behaviors of this sample of athletes. The outcomes of this study can be used to establish a model that can be further refined to meet the unique needs of a wide variety of elite athletes and empirically tested in the future.


Effect of Lactobacillus rhamnosus GG, LGG®, and Bifidobacterium animalis Subspecies lactis, BB-12®, on Health-Related Quality of Life in College Students with Upper Respiratory Infections

Tracey Smith, 2011

College students are susceptible to upper respiratory infections (URIs) due to inadequate sleep, psychological stress and close living quarters. Duration and severity of URI symptoms, and functional impairment, impact health-related quality of life (HRQL). Certain probiotic strains modulate immune function and may positively impact HRQL during URIs. This study assessed the effect of probiotics on HRQL outcomes (i.e., self-reported duration, symptom severity and functional impairment) during URIs in college students. Secondary outcomes included incidence of URIs, and missed school and work days due to URIs. The study was conducted January-May 2011. Subjects (N = 231) were college students living on-campus in residence halls at Framingham State University (Framingham, MA), and were randomized to receive placebo (n = 117) or probiotic-containing candy (1 billion CFU per gram of Lactobacillus rhamnosus GG, LGG®, and Bifidobacterium animalis ssp. lactis, BB-12®; n = 114) for 12 weeks. All subjects completed the Wisconsin Upper Respiratory Symptom Survey-21 to assess duration and severity (symptoms and functional status) of URIs and documented missed work/school days due to URIs. Between-group differences were analyzed using chi-square or Mann-Whitney U tests, and significance was set at p ≤ 0.05. Final analysis included 198 subjects (placebo, n = 97; probiotics, n = 101). Median duration of URIs was significantly shorter by 2 days, and median severity score was significantly lower by 34%, in the probiotics group compared to the placebo group. Incidence of URIs and missed work days were not different between groups; however, the probiotics group missed significantly fewer school days (mean difference = 0.2 days) compared to the placebo group. LGG® and BB-12® may be beneficial for mitigating decrements in HRQL, and minimizing missed school days, during URIs in college students.
More research is warranted regarding mechanisms of action associated with these findings and the cost-benefit of prophylactic supplementation.


The Relationships Between Changes in Anthropometric Measures and Health Related Quality of Life Among University Employees Enrolled in a Worksite Wellness Program

Jillian Wanik, 2012

Objective: The primary aims of the current study were to explore changes in anthropometric measures and changes in health related quality of life (HRQOL) core measures, and the relationships between these changes, among participants who completed the Live Well active intervention phase. Design: The study was a retrospective, secondary analysis of Live Well program participant data at the University of Medicine and Dentistry of New Jersey (UMDNJ). Subjects: Participants (n=103) in the Live Well program who completed the active intervention phase within 10-14 weeks between September 2010 and May 2012. Results: Among participants who completed the active intervention phase, there were statistically significant decreases in weight, waist circumference (WC) and BMI, and significant percent weight loss (p=0.001), compared to baseline. After completion of the active intervention phase, participants had significant improvements in self-rated health (p = 0.002). The proportion of participants who reported an excellent rating for self-rated health status increased from 5% at baseline to 9.9%. Overall, 38% of participants had an improvement in their self-reported health status, while 47% had no loss or gain in health status and 15% reported a decreased health status. The numbers of physically unhealthy days and activity limitation days reported by participants decreased significantly after the active intervention phase (p=0.020 for physically unhealthy days and p=0.005 for activity limitations). The post-active-intervention number of mentally unhealthy days reported by participants and the summary index of unhealthy days over the past 30 days had decreases that were not statistically significant (mentally unhealthy days -1.2 days and summary index of unhealthy days -2.4 days compared to baseline) but may be clinically relevant. In this study, some HRQOL changes (self-rated health, mentally unhealthy days and activity limitation days) were significantly related to changes in percent weight lost.
Conclusion: This study provides further support for worksite wellness interventions. The results indicate that for participants who completed a University-based wellness program, decreases in weight and WC were associated with improvements in several HRQOL scores. Significant relationships were seen between percent weight lost and several HRQOL core measures. The varying significance in the relationships between changes in anthropometrics and improvements in HRQOL scores associated with a worksite wellness intervention warrants further exploration.


The Impact of Feeding Route on Clinical Outcomes in Critically Ill Neurologically Injured Patients

Delara Saran, 2012

Background: Neurologically injured patients are considered at greater risk for gastrointestinal (GI) complications and enteral nutrition (EN) intolerance, and research related to the impact of feeding route on clinical outcomes in this patient population is mixed. Objective: To explore the differences in clinical outcomes, including whether EN was interrupted due to GI complications, duration of mechanical ventilation, and intensive care unit (ICU) and hospital length of stay (LOS), between gastric and small bowel feeding in critically ill, neurologically injured patients. Design: This was a retrospective, secondary analysis of data from the International Nutrition Survey (INS) and baseline data from the Enhanced Protein-Energy Provision via the Enteral Route in Critically Ill Patients (PEP uP) study. Data were collected from ICU admission up to a maximum of 12 days and clinical outcomes were collected at 60 days. Participants/setting: Eligible patients had neurological injury, were admitted to the ICU for at least three days, were mechanically ventilated within 48 hours, received at least three days of EN and no more than two days of parenteral nutrition, and did not change feeding tube types during the study period. Of the 13,096 patients included in the INS and PEP uP between 2007 and 2011, 1691 (12.9%) met the study inclusion criteria. Of these patients, 84.1% (n=1422) received gastric feeding, 5.2% (n=88) received small bowel feeding and 10.7% (n=181) were excluded as they switched feeding tube types. Statistical analyses: Whether EN was interrupted was analyzed between the two feeding routes using Pearson's Chi-square. Time-to-event outcomes were analyzed for 60-day survivors (n=1416) using a Cox proportional hazards model, controlling for differences in baseline characteristics. Results: A significantly greater proportion of patients who received gastric feeding had EN interrupted due to GI complications compared to small bowel feeding (19.6% vs 4.7%, P = 0.015).
There were no statistically significant differences in the duration of mechanical ventilation, ICU LOS or hospital LOS between the two feeding groups after adjustment for baseline characteristics, though this could be a type 2 error due to unobserved heterogeneity in the model. Conclusions: The results of this study suggest that compared to small bowel feeding, patients receiving gastric feeding are five times more likely to have EN interrupted due to GI complications. However, small bowel feeding may not improve clinical outcomes including the duration of mechanical ventilation and LOS.


Use of Intravenous Fat Emulsions in Critically Ill Patients: A Secondary Analysis of an International Observational Multi-center Study

Christina Edmunds, 2012

Background: Commercially available intravenous fat emulsions (IVFEs) used as a source of fat in parenteral nutrition (PN) include soybean oil, medium-chain triglyceride (MCT) oil, olive oil, and pure fish oil or fish oil-containing IVFEs. Objective: To examine the differences in clinical outcomes, including duration of mechanical ventilation, intensive care unit (ICU) and hospital length of stay (LOS), among patients who received lipid-free PN and different IVFE types in PN. Design: This was a retrospective secondary analysis of data from the International Nutrition Survey (INS). Demographic and clinical data were collected prospectively for up to 12 days, or until death or discharge from the ICU, whichever came first. Clinical outcomes were recorded at 60 days following ICU admission. Participants/setting: Eligible adult patients were admitted to the ICU for ≥ 72 hours, mechanically ventilated within 48 hours of ICU admission, received exclusive PN for ≥ 5 days, and did not change IVFE type during the data collection period. Statistical analyses: IVFE group demographic and clinical characteristics were compared using one-way ANOVA for continuous variables and chi-square for categorical variables. A Cox proportional hazards model was used to examine differences between IVFE groups while controlling for differences in baseline patient characteristics for each clinical outcome. Results: Of the 12,585 patients included in the INS, 451 (3.6%) met study inclusion criteria and 380 (3.0%) survived to be discharged from the hospital by day 60. Of the 60-day survivors, 42.1% (n=160) received soybean oil, while the remainder received either lipid-free PN (15.0%, n=57) or alternative IVFEs (43%, n=124). There were significant differences among IVFE groups with regard to admission category, presence of acute respiratory distress syndrome, body mass index, use of propofol for ≥ 6 hours and mean daily calories from PN. There was no difference in outcomes when comparing soybean oil-based IVFE to lipid-free PN.
However, when compared to lipid-free PN, patients who received olive oil or fish oil had a 1.6 and 1.9 times higher hazard rate for discharge from the ICU, respectively (Wald=5.063, p=0.024 and Wald=4.535, p=0.033, respectively). When alternative IVFEs were compared to soybean oil, MCT and olive oil resulted in a 1.5 times higher hazard rate for discharge from the ICU (Wald=4.709, p=0.03 and Wald=6.302, p=0.012, respectively) and fish oil resulted in a 1.9 times higher hazard rate for discharge from the ICU (Wald=5.444, p=0.02). There were no significant differences in hospital LOS or duration of mechanical ventilation among IVFE groups. Conclusions: Use of MCT, olive oil, and fish oil IVFEs was superior to soybean oil IVFE with regard to rate of discharge from the ICU. No differences were found between patients who received lipid-free PN versus soybean oil IVFE in PN.


The Impact of Standardized Feeding Guidelines on Enteral Nutrition Administration, Growth Outcomes and Bone and Liver Health in a Neonatal Intensive Care Unit

Theresa Loomis, 2013

Objectives: The objectives of this study were to determine if implementation of standardized feeding guidelines in a neonatal intensive care unit had an effect on 1) the timeliness of initiating enteral feedings, 2) the achievement of full, fortified enteral feedings, 3) the length of time to return to birth weight, 4) the change in weight Z-score from birth to day of life 30 and 5) the presence of metabolic bone disease (MBD) and cholestasis. Methods: This was a retrospective chart review of infants <32 weeks gestation and <1500 grams at birth who received enteral nutrition either via traditional care (TC) or via standardized feeding guidelines (SFG). The outcomes of the study were the day of life the first enteral feedings were started, the day of life full, fortified enteral feedings were established, the day of life the infant returned to birth weight, the change in weight Z-score from birth to day of life 30, and the presence of MBD and cholestasis. Results: There were 128 infants in the TC group and 125 infants in the SFG group. There was an improvement in all of the measured outcomes, but these changes were not statistically significant. Conclusion: This study provides preliminary evidence that SFG have an effect on enteral nutrition administration, growth and morbidity for preterm infants. Although the findings were not statistically significant, they are clinically relevant. Positive trends were found for administration of enteral nutrition as well as the presence of metabolic bone disease and cholestasis.


Comparison of Standardized Patients and Real Patients as an Experiential Teaching Strategy in a Nutrition Counseling Course for Dietetic Students

Vicki Schwartz, 2014

Objectives: To compare the quality of communication and behavioral change skills among dietetic students having two nutrition encounters with either a real patient or a standardized patient. Methods: A retrospective analysis of video recordings (n=138) containing nutrition encounters of dietetic students (n=75) meeting with a standardized patient (SP) or a real patient (RP). Trained raters evaluated communication skills with the 28-question Calgary-Cambridge Observation Guide (CCOG) and skills promoting behavior change with the 11-question Behavior Change Counseling Index (BECCI) tool. Results: Using the CCOG, there was a significantly greater score in the SP group for the category of “Gathering Information” in encounter one (p = 0.020). There were good to excellent ratings in all categories of the CCOG and in the BECCI scores for both the SP and RP groups at both encounters. There were no significant differences in change scores from encounter one to encounter two between groups. Conclusions: Encounters with SPs and RPs are both effective strategies for dietetic students to demonstrate their communication and behavior change skills. Practice Implications: Utilizing SPs is an effective experiential strategy for nutrition counseling curricula.


Enteral Nutrition Support Practices in the Intensive Care Units at National University Hospital in Singapore: A Retrospective Descriptive Study

Cherie Chun Yan Tong, 2013

Background: Early and adequate enteral nutrition (EN) is beneficial to critically ill patients. Objective: To describe the EN support practices provided to adult intensive care unit (ICU) patients at National University Hospital (NUH), Singapore. Design/Participants: A retrospective study of patients admitted to the ICUs between February 20 and May 31, 2012 who received solely EN during the first two weeks of ICU stay. Statistical Analyses: Descriptive statistics were used to describe demographic and clinical characteristics. The data for timing of EN initiation were compared with the 2009 American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.) Guidelines (Chi-square goodness-of-fit test) and among the four ICUs (Kruskal-Wallis test). Results: The study sample (n=168) consisted of 61.3% Chinese, 62.5% male and 52.4% medical ICU patients, with 34.5% having a respiratory diagnosis and a mean age of 63.3 years. The mean timing of EN initiation was 19.6 hours. A high proportion of patients (92.3%) received EN within 48 hours. Since not all patients started EN within 48 hours, there was a significant difference between the observed proportion (92.3%) and the expected proportion (100%) meeting the 2009 A.S.P.E.N. Guidelines (χ2=1665.1, p<0.001). Nasogastric (78.0%) and orogastric (22.0%) tubes were the most common EN routes. The overall mean daily enteral calorie and protein intakes were 778.2 kcal/day (13.2 kcal/kg/day) and 29.4 g/day (0.5 g/kg/day). The mean gastric residual volume (GRV) was 150.1 ml, and the highest frequency of GRVs causing EN interruptions (51.4%) was in the range of 100-199 ml. There were significant differences in the mean timing of EN initiation among the ICUs (χ2=15.74, p=0.001); the mean timing of EN initiation in the medical ICU and the cardio-thoracic ICU was 14.6 hours and 39.0 hours respectively. Conclusions: The majority of patients had early EN initiation, but EN feedings were interrupted at low GRV thresholds.
Despite early EN initiation, calorie and protein delivery plateaued between ICU Day 3 and Day 14; the daily enteral calorie and protein intakes were between 896.5 kcal/day and 1304.9 kcal/day, and 33.8 g/day and 51.2 g/day respectively. These findings will help to develop strategies to improve EN support delivery in the NUH ICUs.


Intravenous Trace Elements (Copper, Selenium and Zinc) and Clinical Outcomes in a Population of U.S. Patients with Burn Injury: Results of a Retrospective Study

Susan Stankrob, 2012

Objective: To examine the effects of intravenous copper (Cu), selenium (Se) and zinc (Zn) on the clinical outcomes of ventilator-associated pneumonia (VAP), mortality, intensive care unit length of stay normalized for burn size (ICU LOSN) and wound healing in patients with moderate to severe burn injury. Methods: This retrospective observational study extracted data from the Descriptive Study of Burn Nutrition database of the United States Army Institute of Surgical Research burn center. Patients in the intravenous trace element (IVTE) group (n=65) received 4 mg Cu, 500 mcg Se and 40 mg Zn for a duration of 14-21 days based upon initial burn size, and were compared to 25 historical controls (NO IVTE). Clinical outcomes were compared between groups. Results: IVTE supplementation began 2.95±2.85 (median=2.0, range 0-16) days after admission. The IVTE group was 5.25 (95% CI: 1.22-22.54, p=0.026) times more likely to be diagnosed with VAP during the first 30 days of admission compared to the NO IVTE group. There were no differences in 30-day mortality (p=0.342), ICU LOSN (p=0.327) or days per percent wound healed (p=0.282) between groups. Conclusions: IVTEs did not improve clinical outcomes. The delay between admission and supplementation may have contributed to the lack of efficacy among patients in this sample. Further research is needed to examine the optimal timing, dosage and duration of IVTE supplementation.


Detecting Malnutrition in Cancer Patients Using the Academy-A.S.P.E.N. Malnutrition Framework Compared to the Patient-Generated Subjective Global Assessment Tool

Mary Marian, 2014

Background: In oncology, a variety of nutrition assessment approaches that include both subjective and objective measures have been used to diagnose malnutrition. The aim of the present study was to explore the ability of the Academy of Nutrition and Dietetics (Academy)/American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.) malnutrition diagnostic framework (MDF) to detect malnutrition in participants beginning or receiving chemotherapy for the treatment of cancer, in comparison to the Patient-Generated Subjective Global Assessment (PG-SGA) tool. Materials and Methods: Twenty-five cancer patients aged 18 or older beginning or undergoing chemotherapy for treatment at ambulatory care oncology clinics were recruited for this prospective study. Nutritional status was assessed using the Academy/A.S.P.E.N. MDF and the PG-SGA tool. Data regarding demographic and clinical characteristics were collected, and nutritional status as classified by the two methods was compared using the kappa statistic. Results: According to the MDF within the context of chronic illness, 24% of participants were classified as severely malnourished while 4% were classified as moderately malnourished. In comparison, 20% were classified as moderately malnourished using the PG-SGA. This difference in the prevalence of malnutrition using the two methods was significant (P < 0.001). Conclusions: This study found a significant difference between the Academy/A.S.P.E.N. MDF and the PG-SGA in detecting malnutrition. More subjects were diagnosed with malnutrition using the MDF. The results of this study suggest the Academy/A.S.P.E.N. MDF is feasible for detecting malnutrition in oncology patients undergoing therapy.
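The kappa statistic named in this abstract measures agreement between two classification methods beyond what chance alone would produce. A minimal sketch of Cohen's kappa, using an invented 2x2 agreement table rather than the study's data:

```python
# Hypothetical agreement table between two malnutrition assessment methods
# (rows: method A malnourished / not malnourished; columns: method B).
# These counts are invented for illustration only.
table = [[10, 4],
         [3, 8]]

def cohens_kappa(t):
    """Cohen's kappa for a square agreement table (list of lists)."""
    n = sum(sum(row) for row in t)
    # Observed agreement: proportion of cases on the diagonal
    observed = sum(t[i][i] for i in range(len(t))) / n
    # Expected chance agreement, from the row and column marginal totals
    expected = sum(
        (sum(t[i]) / n) * (sum(row[i] for row in t) / n)
        for i in range(len(t))
    )
    return (observed - expected) / (1 - expected)

kappa = cohens_kappa(table)  # → about 0.44 for the counts above
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, which is why the study reports kappa rather than raw percent agreement when comparing the MDF and PG-SGA classifications.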


Dietitians in Nephrology Care: Knowledge of and Patient Care Practices Regarding Oral Health and Disease

Cynthia Pike, 2013

Background: Oral health screening is part of the nutrition focused physical exam (NFPE); therefore, it is important for Registered Dietitians (RDs) in nephrology care to have an understanding of oral health. Objective: To identify knowledge, from a score on a series of questions related to oral health and disease, and select patient care practices among RD members of the Academy of Nutrition and Dietetics Renal Practice Group (RPG) or National Kidney Foundation Council on Renal Nutrition (CRN). Design: Descriptive, prospective internet-based survey. Participants/Setting: RD members of the RPG/CRN who work in the ambulatory/outpatient setting with CKD patients. An email invitation was sent to 2614 members of the RPG/CRN, yielding a response rate of 18.8% (n=492 usable surveys). Statistical Analysis Performed: SPSS 20.0 was used to report descriptive statistics on demographic and professional characteristics, knowledge questions and patient care practices. Pearson's product-moment correlation and Spearman's rho were used to test the hypotheses. Results: The survey sample was 97.5% female with a mean age of 47.9 years. The median numbers of years in clinical practice and in clinical practice in nephrology care were 20 and 10 years, respectively; 22.6% (n=93) reported having the CSR credential. Eighty-seven percent (n=416) reported dialysis patients as their primary patient population. The overall knowledge score for participants was 65.9% (7.9 out of 12 correct). A weak relationship was found between knowledge score and years in clinical practice in nephrology care (p=0.008). The majority of respondents did not report regularly performing the select patient care practices. Inverse relationships were found between knowledge score and the reported frequency of evaluating patients' medications for risk of causing xerostomia (p<0.001), addressing xerostomia as part of diet/nutrition counseling (p<0.001) and discussing the relationship between blood sugar control and oral health with patients/clients with diabetes (p<0.001). Conclusions: Respondents were more likely to correctly answer questions related to oral structure/function than questions on diet and cariogenicity. Although many respondents answered individual knowledge questions correctly, the majority did not regularly perform the select patient care practices. More research is needed to evaluate whether additional education and training impacts practice.


Exploring the Influence of the Doctorate of Clinical Nutrition Program on Graduates' Personal & Professional Achievements

Amanda DeGennaro, 2013

Clinical doctorates are available in healthcare disciplines as either entry-level or post-professional degrees. Graduates with an advanced-practice clinical doctorate have the opportunity to gain advanced knowledge and skills, which can lead to professional advancement, job promotion, improved salary compensation, and higher self-confidence. The University of Medicine and Dentistry of New Jersey (UMDNJ) offers a Doctorate of Clinical Nutrition (DCN) for Registered Dietitians (RDs) as a post-professional clinical doctorate and is currently the only program of its kind in the country. The purpose of this research was to describe the professional and personal outcomes of UMDNJ DCN alumni, explore the benefits associated with their advanced degree and determine why they chose to pursue it. In addition, an evaluation of the program was included. The study design was qualitative, using content analysis. Purposive sampling was used to identify subjects: DCN graduates between 2007 and September 2011 (n=28), with the exception of one DCN graduate who is a full-time UMDNJ faculty member. Ten graduates enrolled in the study. The study used a semi-structured interview with an interview guide to collect demographic data and descriptive narrative data. The interviews were audio-recorded, transcribed verbatim, and analyzed using content analysis. NVivo, a qualitative research software package, was used to analyze the unstructured information. Graduates' perceptions of the program upon admission were confirmed after graduation; they received what they expected from the program, including positive professional and personal outcomes. Graduates praised the flexibility of the program and the online format. In addition to gaining more scholarship opportunities, research skills and opportunities, and a higher level of knowledge, graduates felt a personal satisfaction and confidence attributed to their cutting-edge knowledge. Additional themes were explored, such as suggested ways to improve the program and characteristics of DCN graduates.


Impact of an Online Education Module on Dietitian Attitudes and Knowledge Regarding Recommending and Ordering Multivitamin/Multivitamin-Mineral Supplements: A Pilot Project

Elizabeth daSilva, 2013

Background: Multivitamin/multivitamin-mineral supplements (MV/MVM) are commonly used, yet there is a paucity of research describing Canadian dietitians' (RDs') attitudes and knowledge regarding recommending and ordering these products for their patients. Objective: To determine the attitudes and knowledge of RDs about recommending and ordering MV/MVM prior to and following an online education module. Design: A one-group pre-test/post-test intervention. Participants: Fraser Health Authority (British Columbia, Canada) clinical RDs (n=189) were recruited by email and were eligible to participate if they worked with adults, regardless of clinical practice setting. Intervention: An online narrated PowerPoint presentation and electronic resources were used as the educational intervention. Main Outcome Measures: After undergoing external review for face and content validity, an attitude questionnaire and knowledge test were administered pre- and post-intervention. The attitude questionnaire utilized a five-point Likert scale (a higher value reflected a more positive attitude) with a maximum summative score of 30 points. The knowledge test was worth a maximum of 15 points. Statistical Analyses: Changes in attitudes and knowledge were analyzed by the Wilcoxon signed-rank test and the dependent t-test, respectively. Results: Of the 123 eligible RDs, 74 (60.2%) were recruited, 57 (77.0%) completed the study and 55 (96.5%) were included in the final analyses. The pre- and post-attitude questionnaires were internally reliable (alpha 0.83 and 0.86, respectively). Summative attitude scores were significantly higher on the second questionnaire compared to the first (t=92.5, p<0.001). There was a significant increase (mean change=1.44 ± 1.49; t(54)=7.16, p<0.001) in the proportion of correctly answered knowledge questions from pre-test (78.0% ± 10.0%) to post-test (87.4% ± 6.0%). Conclusions: Canadian RDs' attitudes and knowledge test performance regarding ordering and recommending MVM supplements improved after the online intervention. The high recruitment and completion rates suggest this was an effective continuing education strategy.
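The dependent (paired) t-test reported above compares each participant's pre- and post-test scores directly. A minimal sketch of the t statistic for paired data follows; the scores are invented for illustration and are not the study's raw data.

```python
import math

def paired_t(pre, post):
    """t statistic for paired pre/post scores:
    t = mean(differences) / (sd(differences) / sqrt(n))."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))  # sample SD
    return mean / (sd / math.sqrt(n))

# Hypothetical knowledge scores out of 15 (NOT the study's data)
pre  = [10, 11, 9, 12, 10, 11]
post = [12, 12, 11, 13, 12, 12]
t = paired_t(pre, post)
```

A large positive t with n-1 degrees of freedom yields a small p-value, as in the study's t(54)=7.16; statistical packages then look up p from the t distribution.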


Nutrition Practices Needs Assessment at Nine Head Start Centers in New Jersey

Liz Hecker, 2013

Objective: To examine the nutrition practices and policies at nine Leaguers Head Start centers in Union and Essex Counties in New Jersey; to describe the nutritional status and demographic characteristics of the children ages 3-5 attending the centers; and to study the relationship between the nutrition practices and weight status of the children attending the centers. Design/Subjects: A retrospective, cross-sectional secondary analysis. Nine Leaguers Head Start centers and the 700 three- to five-year-old preschool-aged children enrolled at the centers were included in this study. Statistics: SPSS 20 was used to analyze descriptive statistics: age, sex, race, height, weight, BMI, BMI z-scores and weight status categories. Pearson's product-moment correlation analyses were conducted to examine the relationship between BMI z-scores and the EPAO total and sub-area scores for nutrition practices, for each Leaguers center and for the combined centers. Results: Data from 339 males and 361 females between 36 and 71 months of age were included in the study. The majority of the children were Black (59.3%) and non-Hispanic (68.3%). Seventeen percent of the students were overweight and 19.7% were obese. A statistically significant negative relationship (r=-0.104, p=0.006) was found between BMI z-score and total score for nutrition practices for the sample (n=700). One sub-area, Staff behavior nutrition, had a significant negative relationship (r=-0.735, p=0.024) across the nine Leaguers centers. For the total sample (n=700), significant negative relationships were seen between BMI z-scores and Staff behavior nutrition (r=-0.092, p=0.014), Nutrition environment (r=-0.101, p=0.008), Beverages (r=-0.087, p=0.022) and Nutrition training and education (r=-0.108, p=0.004). Conclusion: The prevalence of overweight and obese children at Leaguers Head Start centers is higher than national prevalence rates for children of the same age. Statistically significant negative relationships were shown between BMI z-scores and four nutrition sub-areas (Nutrition training and education, Beverages, Nutrition environment and Staff behavior) and between BMI z-scores and EPAO total score. This needs assessment suggests that centers with higher EPAO total and sub-area scores in these areas tend to have children with lower BMI z-scores.
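The negative correlations above come from Pearson's product-moment coefficient, which ranges from -1 to 1. A short sketch of the computation follows; the center-level numbers are hypothetical stand-ins, not the study's data.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient:
    covariance of x and y divided by the product of their spreads."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical center-level values (NOT the study's data): EPAO nutrition
# score vs mean BMI z-score; a negative r means higher EPAO scores track
# lower BMI z-scores.
epao = [8.0, 9.5, 10.0, 11.5, 12.0, 13.5]
bmiz = [0.9, 0.8, 0.6, 0.5, 0.4, 0.2]
r = pearson_r(epao, bmiz)
```

Note the study's correlations of roughly -0.1 are weak despite being statistically significant at n=700; significance reflects sample size as much as effect size.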


Changes in Patient-Care Practices of Israeli Dietitians Who Participated in an Oral Health Training Program on Nutrition-Focused Physical Assessment of the Head, Neck, and Oral Cavity at Six Months Post-Training

Alma Vega, 2013

Objective: To explore changes in practices regarding performance of nutrition focused physical assessment (NFPA) skills three and six months after completion of an oral health training program (OHTP). Design: A secondary analysis of data collected in a prospective study exploring the changes in documented practices of Israeli dietitians after an OHTP on the use of NFPA skills. The dietitians used a data collection tool to assess patients and document the findings of their assessments. Participants/Setting: A sample of 32 Israeli dietitians working full-time in Ministry of Health long-term care facilities (LTCFs) was enrolled in the original study. Statistical Analyses: Chi-square and Fisher's exact test analyses were used to compare changes in practices from baseline to three and six months post-training, and from three to six months post-training. Results: All participants were female, with a mean age of 37.0 years (SD=8.9); they had practiced as dietitians for 9.8 (SD=5.9) years, and in Israeli LTCFs for 6.4 (SD=3.9) years. During the baseline period, the dietitians relied on the medical record to extract patient information. From baseline to three months post-OHTP, there were statistically significant increases in the proportion of dietitians who documented performed assessments (p<0.001), as compared to using the medical record or not assessing the component. Significant increases were found for all assessment components tested (p<0.001), referrals (p<0.001), and diet changes (p=0.005) from baseline to six months post-OHTP. No changes were found from three to six months post-OHTP in how the dietitians performed the majority of the assessment components tested, referrals, or recommended diet changes (p=0.136 to p=0.710). Conclusion: Significant increases were found in the documented NFPA practices performed by the dietitians after the OHTP. The dietitians retained their skills from three to six months post-OHTP. These findings support the benefit of training in NFPA and oral screening for Israeli dietitians working in Ministry of Health LTCFs.
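The chi-square comparisons above test whether the proportion of dietitians documenting assessments differs across time points. As a rough sketch, Pearson's chi-square statistic can be computed from a contingency table of observed counts against the counts expected if the proportions were equal; the table below is hypothetical, not the study's data.

```python
def chi2_stat(table):
    """Pearson chi-square statistic for an r x c contingency table:
    sum over cells of (observed - expected)^2 / expected."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    return sum(
        (table[i][j] - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
        for i in range(len(table)) for j in range(len(table[0]))
    )

# Hypothetical counts (NOT the study's data): dietitians documenting a
# performed assessment vs relying on the chart, at baseline and 3 months.
table = [[5, 27],    # baseline:  performed, chart-only
         [26, 6]]    # 3 months:  performed, chart-only
x2 = chi2_stat(table)
```

A large statistic relative to the chi-square distribution with (rows-1)(cols-1) degrees of freedom gives a small p-value; for small expected counts, Fisher's exact test (also used in the study) is preferred.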


Comparison of Educational Interventions on Registered Dietitians' Research Outcome Constructs

Carrie Benton King, 2012

Many registered dietitians (RDs) report a lack of confidence in their research knowledge and skills, which can be a barrier to their research involvement. The primary purpose of this online study was to measure the effect of a 14-week social cognitive career theory (SCCT)-based educational intervention, combined with an existing evidence-based practice (EBP) continuing professional education (CPE) program, on RD research outcome constructs (research self-efficacy, research outcome expectations, research interest, research involvement), as compared to the EBP CPE program alone and to a no-treatment control group. A random list of 7550 RDs who self-reported clinical nutrition as their primary practice area was obtained from the Commission on Dietetic Registration for the randomized controlled trial. Respondents to the invitation e-mail (n=1762) were asked to complete a screening survey; 620 RDs met all eligibility criteria and enrolled in the study. Subjects (N=620) were randomly assigned to three study groups. Repeated-measures analyses of covariance (ANCOVAs) were used to assess the effects on the study groups and the change in each of the research outcome constructs over time. There were significant interaction effects between group assignment and research self-efficacy (p<0.001) and research involvement (p=0.005) over time, reflecting an increase in research self-efficacy (p=0.001) and a decrease in research involvement (p=0.004) at post-intervention. There were no significant effects for research outcome expectations or research interest over time. The similar, significant increase in research self-efficacy in both the standard-plus (p=0.037) and standard (p=0.048) groups, compared to the control group, occurred post-intervention, was maintained at three-month follow-up, and is a critical first step toward ultimately promoting change in the other research outcome constructs. The similarity of the change suggests that the more labor-intensive role of the instructor in the standard-plus group's CPE program may not be necessary to promote a significant change in research self-efficacy. The findings of this study add to the published literature on the research self-efficacy of health professionals and may be useful for guiding future educational efforts to promote RD research involvement.


Change in RD Knowledge Score After Completion of a Web-based Educational Intervention on Application of the IDNT

Lisa Wellnitz, 2013

Background: Recent research has demonstrated that the International Dietetics and Nutrition Terminology (IDNT) is underutilized within the dietetics profession. There is also a paucity of resources on application of the IDNT in pediatric nutrition care settings; this lack of resources, combined with underutilization of the IDNT, illustrates a need for education on this subject in pediatric nutrition care settings. Objective: To determine practices regarding, and changes in knowledge about, the IDNT among Pediatric Nutrition Practice Group (PNPG) members who completed a web-based educational intervention on the application of the IDNT in pediatric nutrition care settings. Methods: A prospective one-group pre-/post-test design was used. Email invitations were sent to all PNPG members (n=3036) as an e-blast, inviting prospective participants to complete a pre-test, followed by a web-based module on application of the IDNT in pediatric nutrition care settings, and a post-test. Frequency distributions, Fisher's exact test, paired t-tests, and point-biserial correlation were used for analyses. Results: Twenty-one PNPG Registered Dietitian (RD) participants completed the course module and both the pre- and post-tests; 28 participants completed the demographic portion of the pre-test. Most participants (78.6%, n=15) reported use of an electronic health record (EHR) for patient/client documentation, and 75.0% (n=21) reported use of the Nutrition Care Process (NCP) as a framework for providing nutrition care. Seventy-nine percent (78.6%, n=22) reported using one or more components of the IDNT for medical record documentation; 53.6% (n=15) reported using all four IDNT components for documentation. Following the intervention, knowledge scores increased significantly (mean increase=2.19 points out of a maximum 20 points, SD=1.75, t=5.73, p<0.001). Conclusion: The results demonstrated that among PNPG members who completed the intervention, knowledge of the IDNT improved significantly. Further research is needed with a larger sample and a control group.


A Needs Assessment of the Physical Activity Environment of Nine Head Start Centers in New Jersey

Traci Fauerbach, 2013

Background: Pediatric obesity is a leading public health concern that disproportionately affects low-income and minority children. Physical activity in preschoolers has been shown to provide protective benefits by slowing increases in adiposity. With 63% of preschool-aged children enrolled in a form of preprimary education, childcare centers are a logical place to focus obesity prevention strategies. In order to focus program changes, centers must first understand their baseline physical activity practices and environment. Objective: To conduct a needs assessment of the physical activity environment and practices of nine Leaguers Inc. Head Start centers, to describe the growth status of the children enrolled in these centers, and to identify any relationship between childcare center physical activity practices and the growth status of enrolled children. Design: A retrospective, cross-sectional study. Subjects: 700 preschoolers from nine Leaguers Inc. Head Start centers. Statistical Analyses Performed: Pearson's product-moment correlations were used to explore the relationships between the Environment and Policy Assessment and Observation (EPAO) Instrument Sedentary Environment, Sedentary Opportunity and Total Physical Activity Scores and BMI z-score. Results: The sample consisted of 700 children with a mean age of 49 months (range 36-71 months). Fifty-one percent of the children were female; the majority were black (59.3%) and/or non-Hispanic (68.3%). Of the children, 2.6% were categorized as severely underweight, 1.3% as underweight, 59.4% as normal weight, 17% as overweight, and 19.7% as obese. Mean EPAO sub-category scores for the centers were 9.63 for Active Opportunity, 12.59 for Sedentary Opportunity, 8.15 for Sedentary Environment, 12.06 for Portable Play Equipment, 9.44 for Fixed Play Equipment, 13.78 for Staff Behavior-Physical Activity, 5.00 for Physical Activity Training and Education, and 10.00 for Physical Activity Policy. The mean EPAO Total Physical Activity Score for the centers was 10.08 out of a possible 20. There was a small-to-moderate negative relationship between BMI z-score and Sedentary Opportunity Score that did not reach significance (r=-0.279, p=0.467, n=9). No significant correlation was found between BMI z-score and Sedentary Environment Score (r=0.00, p=1, n=9) or Total Physical Activity Score (r=0.21, p=0.958, n=9). Conclusions: There are identifiable areas of the nine Leaguers Inc. Head Start centers' physical activity practices that could be improved to better support obesity prevention. Physical activity policies need to be implemented that define and support active and inactive time, screen viewing, the play environment, and physical activity education. Provisions for additional fixed and portable play equipment should also be explored.


National Kidney Foundation-Council on Renal Nutrition Second National Research Question Collaborative Study (SNRQ): A Comparison of the Dietary Intake of Women with CKD Stage 5 on Maintenance Dialysis with a Non-CKD Cohort from the Women's Health Initiative Dietary Modification (WHI-DM) Trial

Mona Therrien, 2013

Objective: To compare the characteristics and dietary intake of SNRQ participants to the WHI-DM Trial group and to compare dietary intake of both groups to relevant reference norms. Design: A secondary analysis of data collected from the SNRQ and from the WHI-DM Trial. Setting/Subjects: SNRQ participants were adult women on dialysis (N=248) from U.S. dialysis facilities. WHI-DM Trial participants (N=48,836) were post-menopausal, 50-79 year-old women, from 40 U.S. clinical centers. Methodology: The one-sample t-test, Chi-Square and Wilcoxon signed-rank test were used to compare the SNRQ participants to the WHI-DM group and to compare dietary intake of both to nutrition reference norms. Differences were considered significant at a two-tailed p<0.01. Main Outcome Measure: Dietary intake was defined as dietary energy intake (DEI), dietary protein intake (DPI), fiber, fat, saturated fat, sodium, potassium, phosphorus, fruits, and vegetables. Results: Characteristics including age, race, weight, educational level, and cardiovascular disease differed between the SNRQ and WHI-DM groups (p<0.001). SNRQ participants had lower DEI, DPI, fiber, fat, saturated fat, potassium, sodium, phosphorus, fruit, and vegetable intake than WHI-DM women (p<0.001). Dietary intake of SNRQ hemodialysis (HD) and peritoneal dialysis (PD) patients differed significantly from reference norms (p<0.001) except for phosphorus intake in PD patients (p=0.03). WHI-DM women had higher intakes of fat, saturated fat, and lower intakes of fiber, fruit, and vegetables than recommended in reference norms for the general population. Conclusion: Dietary intake differed significantly between SNRQ participants and the WHI-DM group. Dietary intake of the SNRQ participants, except for phosphorus intake in PD patients, differed significantly from relevant reference norms. Keywords: dietary, intake, kidney, women, dialysis.
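The one-sample t-test used above checks whether a group's mean intake differs from a fixed reference norm. A minimal sketch follows; the intake values and the 1.2 g/kg reference are illustrative assumptions, not SNRQ data.

```python
import math

def one_sample_t(sample, mu0):
    """t statistic testing whether the sample mean differs from a
    reference value mu0: t = (mean - mu0) / (sd / sqrt(n))."""
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))  # sample SD
    return (mean - mu0) / (sd / math.sqrt(n))

# Hypothetical daily protein intakes in g/kg (NOT the study's data),
# compared against an assumed reference norm of 1.2 g/kg.
dpi = [0.9, 1.0, 0.8, 1.1, 0.95, 1.05, 0.85, 1.0]
t = one_sample_t(dpi, 1.2)
```

A strongly negative t here would indicate intake below the norm, mirroring the pattern the study reports for dialysis patients; the p-value comes from the t distribution with n-1 degrees of freedom.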


Frequency of Preoperative Nutrition Counseling by a Registered Dietitian Is Related to Less Food Intolerance and Incidence of Dehydration and Greater Percent Excess Body Weight Loss in Postoperative Bariatric Surgery Patients

John Bock, 2013

Background: Food intolerance (FI) and dehydration are common complications experienced by patients after bariatric surgery and can lead to emergency department visits and readmissions. Objective: To examine the relationship between the number of preoperative nutrition counseling sessions with an RD and the incidence of FI, dehydration and percent excess body weight loss (%EBWL) after bariatric surgery. Design: Retrospective chart analysis of patients who underwent bariatric surgery between August 2010 and December 2012. Participants/Setting: 160 patients who attended at least one preoperative and one postoperative office visit within a private medical nutrition therapy practice in NJ. Intervention: Patients were categorized into two groups: the mandatory clearance group (MCG) (n=67) included patients who attended one office visit preoperatively, and the frequently followed group (FFG) (n=93) included patients who attended two or more preoperative visits for further nutrition counseling. Main Outcome Measures: FI, incidence of dehydration and %EBWL. Statistical Analyses Performed: Bivariate analyses assessed baseline group equivalence. Logistic and linear regressions were used to evaluate differences between groups after controlling for confounders. Results: The sample was 80% female (n=128), with a mean age of 44.9±11.3 years and a mean BMI of 45.8±7.7. The MCG experienced more FI (56.7%) at the initial postoperative visit compared to the FFG (20.4%) (p<0.001), and was 12 times more likely to have FI than the FFG (p<0.001) after controlling for confounders. The FFG also achieved 4.5% greater %EBWL by the first postoperative visit compared to the MCG (p=0.005). There were no differences in the incidence of dehydration between groups. Conclusions: Patients who saw an RD more than once preoperatively had significantly less FI and lost more weight in the first month postoperatively than patients who came in only once prior to surgery. This study supports the need for more frequent preoperative RD counseling to prevent adverse outcomes.


Assessment of Weight Stigma in Dental Pre-Doctoral Students Using a Fat-Phobia Scale

Heather Cunningham, 2013

Background: Weight stigma is the concept that obese individuals are devalued in society because of their weight. Health care professionals can possess degrees of weight bias. Dental professionals are becoming more involved in health care screenings, including body mass index (BMI) calculation and interpretation. Weight bias assessment tools, such as the Fat Phobia Scale, are used to determine attitudes about weight. Objective: To determine the level of fat phobia in pre-doctoral dental students and its relationships with year in dental school and self-classified BMI. Design/Subjects: All pre-doctoral dental students (N=410) enrolled in the UMDNJ-NJDS were emailed a Terminology Used to Describe Weight survey. Statistics: Descriptive statistics were used to analyze demographic characteristics. Relationships between fat phobia score (FPS) and year in dental school and self-classified weight status were tested using one-way ANOVA. Post-hoc analyses were conducted to determine the possibility of type II errors for the relationships between FPS and both year in dental school and self-classified weight status. Results: Of the 37 responses (11.1%), the mean age was 26.3±3.33 years. The sample was mostly male (68.6%) and white (48.6%). The mean composite fat phobia score was 3.9±0.48. One-way ANOVA found no significant relationship between year in dental school and FPS (p=0.34) or between self-classified BMI and FPS (p=0.38). No significant relationships were found between FPS and demographic characteristics, attitudes about weight or computed BMI. First-year dental students had a mean fat phobia score of 3.7±0.49; fourth-year students had a mean score of 4.1±0.54. Conclusions: Respondents possessed moderate fat phobia, and mean scores were higher with increasing year in dental school, although this trend did not reach statistical significance. Future research with a larger sample is needed to determine whether fat phobia truly increases with year in dental school.
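The one-way ANOVA above compares mean fat phobia scores across groups by contrasting between-group and within-group variance. A minimal sketch of the F statistic follows; the scores by year are invented for illustration, not this study's data.

```python
def anova_f(groups):
    """One-way ANOVA F statistic: mean square between groups
    divided by mean square within groups."""
    all_x = [x for g in groups for x in g]
    n, k = len(all_x), len(groups)
    grand = sum(all_x) / n
    # between-group sum of squares: group means vs grand mean
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-group sum of squares: observations vs their group mean
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical Fat Phobia Scale scores by dental school year
# (NOT the study's data)
year1 = [3.5, 3.7, 3.9]
year4 = [4.0, 4.2, 4.1]
f = anova_f([year1, year4])
```

With only two groups, F is the square of the independent t statistic; the p-value comes from the F distribution with (k-1, n-k) degrees of freedom.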


Oral Health and Disease Knowledge of Children's Hospital Association (CHA) Dietitians in Pediatric Care

Parvathy Raman, 2013

Background: Oral health screening as part of the nutrition-focused physical examination is an integral component of a nutrition assessment. Therefore, it is important for Registered Dietitians (RDs) in pediatric care to have an understanding of oral health. Objective: To identify knowledge, from a score on a series of questions related to oral health and disease as related to nutritional care, among RD members of the Children's Hospital Association (CHA). Design: Descriptive, prospective internet-based survey. Participants/Setting: RD members who provide or supervise clinical practice at member facilities of the CHA. An email blast was sent to all members within the CHA network. Twenty-four members agreed to receive the invitation to participate, with a response rate of 62.5% (n=15). Results: The survey sample (n=14) was 100% female with a mean age of 39.6 years. The median number of years in clinical practice in pediatric care was 10.9 years; 21.4% (n=3) reported having the CSP credential. Fifty-seven percent of participants reported working in an acute care setting (ambulatory care/clinic/outpatient setting). The overall knowledge score for participants was 74.8% (11.2 out of a possible 15 correct). There was no significant relationship between knowledge score and years as an RD in pediatric practice (p=0.295), nor between knowledge score and type of practice setting [F(4,9)=1.633, p=0.248]. Ninety-three percent of participants correctly answered the question regarding the appearance of normal gingiva; however, when asked what they would do upon seeing oral lesions, 29% of participants answered that they would do nothing because it is not their job. Conclusion: Respondents answered correctly on questions related to preventive oral health, tooth-friendly between-meal snacks and strategies to decrease post-meal caries risk among adolescents. A large number reported not regularly performing oral health-related patient care practices.


The Relationship Between Early Nutritional Status and Pulmonary Function in Pediatric Cystic Fibrosis Patients

Allison Gomes, 2013

Background: Individuals with cystic fibrosis (CF) have increased nutritional needs and progressive loss of lung function. Achieving a weight-for-age percentile (WAP) and/or a body mass index-for-age percentile (BMIP) above the 50th percentile has been associated with higher pulmonary function. Objective: To assess the relationship between early nutritional status and later pulmonary function in a single-center pediatric CF population. Design/Subjects: A retrospective chart review of 25 patients with CF born between 1989 and 1996 and treated at the Central Connecticut CF Center (CCCFC) was completed. Nutritional status was measured during the fourth year of life, as nutritional status peaks at this age based on data from the Cystic Fibrosis Foundation (CFF) Patient Registry from 2011. Pulmonary status was measured during the seventeenth year of life. Statistics: Summary statistics were computed to describe the study sample. The relationship between early nutritional status (WAP and BMIP) and pulmonary function during the seventeenth year of life was analyzed using independent t-tests. Results: During the fourth year of life, 48% (n=12) of participants achieved a WAP at or above the 50th percentile and 44% (n=11) achieved a BMIP at or above the 50th percentile. There was no difference in mean FEV1% for either nutritional measure, although for both WAP and BMIP, those at or above the 50th percentile had an FEV1% above 90%. There was a statistically significant relationship between age at diagnosis and FEV1%: diagnosis within the first month of life was associated with a higher FEV1% compared to diagnosis after the first month of life (p=0.032). Conclusions: These findings underscore the importance of early interventions on long-term outcomes for patients with CF. A larger sample may have the power to detect relationships between nutritional status and pulmonary function; meanwhile, this study provides evidence of an association between diagnosis within the first month of life and better pulmonary function later in life.


Food Security, Family Influenced Behaviors, and Child Weight Status in Preschool Age Children Attending the Leaguers Centers Head Start Program

Amy Mack, 2013

Background: There are disparately high rates of both obesity and food insecurity among ethnic minority populations and those of low socioeconomic status. It has been observed that a significant proportion of the preschool age children attending the Leaguers Head Start Centers in the Newark, New Jersey area are overweight or obese. Objective: To describe the household food security status and family-influenced nutrition, sleep, and television viewing behaviors of preschool age children participating in the Leaguers Centers Head Start program, and to evaluate for differences in these factors between those children classified as overweight/obese and those of normal weight. Design/Subjects: A retrospective, descriptive study with analysis of cross-sectional data that was collected to assess Leaguers Head Start Centers nutrition and physical activity practices as well as parenting practices. Fifty-two parents/caregivers of children ages 3-5 years attending the Leaguers Head Start Centers volunteered to participate in an on-line survey. Statistics: To examine for differences in household food security, family influenced nutrition and health behaviors between families with an overweight/obese child and those with a normal weight child, the Pearsons chi-square test, Fishers exact test, t-test, and Mann Whitney U were utilized. Results: Of the study sample, 17.3% (n=9) were classified as overweight and 28.8% (n=15) as obese, with child gender almost equally distributed (51.9% male, 48.1% female). A majority of the sample reported race as black or African American (75.6%, n=39). Slightly more than one half of the sample was classified as fully food secure (55.8%), and 19.2% (n=10) with low food security and 15.4% (n=8) with very low food security. Of the family nutrition behaviors, family meal habits were scored the highest at 74.71%, and parent modeled nutrition behaviors the lowest at 44.83%. 
Most parents rated their child's sleep as very good (65.4%) or fairly good (30.8%), despite only 23.1% of the children receiving at least 10 hours of sleep on a usual night. More than one half of the children in this sample had a TV in their bedroom (63.5%, n=33), and almost half (46.2%, n=24) of the sample reported that their child watches more than 2 hours of television every day of the week. There were no significant differences found between overweight/obese children and normal weight children for the presence of household food insecurity (p = 0.246), family meal habits (p = 0.392), parent-modeled nutrition behaviors (p = 0.339), having a TV in the child's bedroom (p = 0.146), or child sleep adequacy (p = 0.578). Conclusions: Forty-six percent of this sample was either overweight or obese, and 34.6% were classified with low or very low food security. Neither household food security nor family-influenced nutrition and health behaviors were found to differ significantly between overweight/obese children and those of normal weight status. However, overweight/obese children were 2.5 times more likely to have a TV in the bedroom and half as likely to report any degree of food insecurity.


Exploring Approaches to Practice of Advanced-Practice Registered Dietitians within the Clinical Setting

Joanna Otis, 2014

Studies over the past 25 years have tried to identify essential educational and professional attributes, practice tasks, and approaches of the advanced-practice registered dietitian (APRD). Although multiple models have been proposed, including a model of advanced clinical nutrition practice by Brody et al. in 2012, the APRD's approaches to practice have yet to be comprehensively explored. The purpose of this research was to explore the approaches to practice among entry-level RDs (ELRDs) and APRDs, to identify differences in practice approaches, and to assess the consistency of fit between the APRD's approaches to practice and the Approach to Practice subcomponent of Brody's proposed model. Random samples of ELRDs and beyond-entry-level RDs were screened for study eligibility. Those eligible were recruited into the study and scheduled for an interview. Thirteen ELRDs and 15 APRDs were interviewed. A qualitative content analysis technique was used; subjects participated in semi-structured interviews utilizing the Critical Incident Technique (CIT). Subjects were asked to provide a descriptive narrative of a memorable patient encounter, including its context, their actions, and the outcome of the situation. Interviews were audio-recorded, transcribed, and analyzed by content analysis. In comparison to ELRDs, APRDs were more adaptable and intuitive and practiced with greater autonomy. APRDs more often provided mentorship and accepted responsibility for their patient outcomes. Additionally, the APRDs were found to have Approaches to Patient Care, Approaches to Professional Practice, and Practice Values consistent with those identified in the Approaches to Practice component of Brody's proposed model of advanced clinical nutrition practice. Future research should focus on exploring the APRD's approach to practice using alternative research methods, such as direct observation, to see if this study's findings hold true.


Knowledge and Performance of Dysphagia Risk Screening among Registered Dietitians in Clinical Practice

Stephani Johnson, 2014

Background: Dysphagia risk screening is an important component of the nutrition-focused physical assessment (NFPA). Objective: To identify knowledge and performance of dysphagia risk screening skills among RDs in clinical practice. Design: Descriptive, prospective internet-based survey. Participants/Setting: A random sample of 3,669 US RDs was emailed the survey; 310 (8.4%) completed the survey and met study inclusion criteria. Statistical Analyses Performed: Descriptive statistics were reported for demographic and professional characteristics, knowledge questions, and performance of dysphagia risk screening skills. The relationships between knowledge and performance of dysphagia risk screening were explored using ANOVA, the Kruskal-Wallis test, and Spearman's correlation. Results: The participants had worked in clinical practice for a mean of 13.9 years. Their overall knowledge score was 66.4% (11.3 out of 17 questions). For 12 of the 20 dysphagia risk screening skills, the most frequently reported practice was using information obtained from the medical record or other healthcare professionals in the nutrition assessment. Sixty-three percent (n=166) reported performing or using information for >10 skills. Twelve percent (n=31) of participants did not perform or use information for any skills. Participants who performed or used information about select dysphagia risk screening skills had significantly higher mean knowledge scores than those who did not perform the skills. Participants with prior training and/or education performed or used information for more skills than those without (p=0.030). Conclusion: Prior training and/or education was related to increased performance of the dysphagia risk screening skills. RDs' knowledge of dysphagia risk screening may be suboptimal, and RDs more often use information about the skills from other sources in their nutrition assessments than perform the skills themselves.


Confirming Essential Attributes for the Advanced Practice Registered Dietitian in Clinical Nutrition

Melissa McCormack, 2014

Background: This study was a follow-up of a 2009-2010 Delphi study of advanced level practice (ALP) in clinical nutrition. The study sought to provide evidence to support advanced practice attributes and a proposed ALP model in clinical nutrition. Design/Subjects: A mixed-mode survey was administered to a sample of advanced practice registered dietitians (APRDs) who met pre-determined eligibility criteria. Eighty-five essential attribute statements were rated from advanced to novice practice on a Likert-type scale. Statistics: The median rating, IQR, and frequencies were reported for each statement. Median ratings were grouped by practice level. Results: Seventy-nine percent of eligible RDs (n=137, 78.7%) responded to the ALP survey. Respondents rated 44 (of 85) statements as advanced that were previously rated as essential APRD attributes in the Delphi study and confirmed 14 of 21 components of the proposed model of ALP in clinical nutrition within three dimensions. Seventy-seven percent of characteristics (n=30, 76.9%) were rated as advanced within the Professional Knowledge, Abilities and Skills dimension. Within the Approach to Practice dimension, 10.7% (n=3) of statements were rated as advanced level practice. Seventy-three percent (n=27, 73%) of attribute statements within the dimension of Roles and Relationships were rated as advanced practice. Conclusion: The survey supported various aspects of ALP in clinical nutrition, including that APRDs have an advanced education and participate in aspects of research. Additionally, APRDs in clinical nutrition practice with autonomy; are educators and leaders; contribute to organizational systems and functions; are role models to those within the profession and to other professionals; and have a broad and diverse network of professional colleagues.


Radiation Therapy for Breast Cancer and Body Weight: A Pilot Study

Mary Marian, 2013

Background: Weight gain following a breast cancer diagnosis is common and associated with increased breast cancer-specific and overall mortality. Weight gain during treatment is common, particularly with chemotherapy; less is known about radiation therapy. Objective: To determine if women experienced weight changes during radiation therapy for breast cancer. The relationship between body weight and fatigue grade was also explored. Design: This retrospective study examined demographic, clinical, and treatment characteristics and whether changes in body weight and fatigue grade occurred during radiation therapy for stage 0-IIIC breast cancer. Subjects: Fifty-eight electronic medical records were reviewed, with 48 (82.8%) women meeting study eligibility requirements. Statistical Analyses Performed: Descriptive statistics were reported for demographic, clinical, and treatment-related characteristics. The paired t-test evaluated mean change in weight, and the Wilcoxon signed-rank test evaluated change in fatigue grade from baseline to treatment completion. Results: The sample (n=48) was primarily Caucasian (81.3%) with a mean age of 63 years. Mean baseline body mass index (BMI) was 30.1 kg/m2 (SD = 7.5), with a baseline mean body weight of 175.6 pounds (79.8 kg) [SD = 45.3 lbs. (20.6 kg)]. There was no statistically significant change in body weight from baseline to the end of treatment [t(47) = .477, p=0.64]. A statistically significant increase in fatigue grade over the course of treatment was found (Z = -4.84, p<0.001), with a mean change in fatigue grade of 0.85. Conclusions: There was no change in mean body weight during radiation therapy for stage 0-IIIC breast cancer. There was a statistically significant increase in fatigue from baseline to the completion of treatment, but no relationship was found between body weight and fatigue grade.


Preoperative fasting practices at a US teaching hospital: A retrospective descriptive study

Kate Willcutts, 2014

Background: The American Society of Anesthesiology (ASA) guidelines for fasting prior to elective surgeries are six hours for solid foods and two hours for clear liquids. The common practice of nil per os (NPO) at midnight prior to surgery is based on tradition rather than evidence. Prolonged fasting times are associated with iatrogenic malnutrition and reduced patient satisfaction. Objective: The purpose of this research was to assess the preoperative fasting practices at a large, tertiary care, teaching hospital as compared to the ASA guidelines. Design/participants/setting: A retrospective chart review of adult patients (n=99) who had elective surgery and were admitted to the hospital at least the night prior to surgery. Statistical analyses: Descriptive statistics were used to describe demographic and clinical characteristics. The Wilcoxon rank sum test was used to compare ordered fasting time with the ASA guidelines. Results: The time between the active NPO order for solids and for clear liquids and the time of surgery was a mean of 11.4 ± 3.1 hours and 11.3 ± 3.0 hours, respectively (range = 7.4-18.2). The mean difference between the ASA guidelines and the ordered pre-operative fasting time was 5.4 hours for solids (95% CI = 4.8-6.0) (P=0.029) and 9.3 hours for clear liquids (95% CI = 8.7-9.9) (P=0.019).
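The gap between ordered and guideline fasting times reported here is simple arithmetic over the ASA limits. The sketch below is an illustration only, not the study's analysis code; the guideline limits and mean ordered times are taken directly from the abstract.

```python
# ASA preoperative fasting limits (hours), as stated in the abstract.
ASA_LIMIT_H = {"solids": 6.0, "clear_liquids": 2.0}

def excess_fasting_h(ordered_mean_h: float, item: str) -> float:
    """Hours of ordered fasting beyond the ASA guideline for the given item."""
    return round(ordered_mean_h - ASA_LIMIT_H[item], 1)

# Mean ordered fasting times reported in the abstract:
print(excess_fasting_h(11.4, "solids"))         # 5.4 (matches the reported mean difference)
print(excess_fasting_h(11.3, "clear_liquids"))  # 9.3
```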


A Study of Dietary Intake and the Relationship to Neutrophil Engraftment Among Outpatient Hematopoietic Stem Cell Transplant Patients with Multiple Myeloma

Joy Heimgartner, 2014

Background: At Mayo Clinic, autologous hematopoietic stem cell transplant (HSCT) patients with multiple myeloma are treated on an outpatient basis. Current literature suggests that calorie and protein needs following HSCT exceed basal requirements, despite a lack of evidence that this improves outcomes. The actual calorie and protein intakes of patients undergoing outpatient HSCT have not previously been published. Objective: To describe the calorie and protein intake of patients with multiple myeloma receiving outpatient autologous HSCT at Mayo Clinic, Rochester, Minnesota, and to explore relationships between intake and neutrophil engraftment. Design/Participants: A retrospective study of adult patients with multiple myeloma who received HSCT as an outpatient procedure between January 2010 and December 2012 and who consumed only an oral ad libitum diet. Statistical Analyses: Descriptive statistics were used to describe the demographic and clinical characteristics. A sensitivity power analysis was utilized. Pearson's correlations were utilized to explore relationships between calorie and protein intake and engraftment. Results: The study sample (n=230) was predominantly male (n=129, 56.1%) with a mean age of 60.6 years (range 35-75 years). At the time of transplant, 77.8% (n=179) had a BMI classified as either overweight or obese. The mean calorie intake of the sample was 1530.4±452.1 kcal per day (18.8±6.3 kcal/kg/day), and the mean protein intake was 52.7±20.0 grams per day (0.65±0.26 g/kg/day). Mean time to neutrophil engraftment was 15.1±2.5 days (range 10-23 days). There were statistically significant but very weak positive correlations between both calorie and protein intake and neutrophil engraftment timing. Conclusions: Compared to the general population, study subjects had a higher prevalence of overweight or obesity. 
A majority of patients did not meet estimated basal energy or protein requirements; however, there was no correlation between calorie or protein intake and engraftment timing that could be considered clinically relevant. 
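The per-kilogram intake figures in this abstract follow directly from daily intake divided by body weight. The sketch below is a hedged illustration rather than the study's code; the 81.4-kg weight is a hypothetical value chosen so the results line up with the sample means reported in the abstract.

```python
def per_kg(daily_amount: float, weight_kg: float) -> float:
    """Normalize a daily intake (kcal or grams) to body weight."""
    return round(daily_amount / weight_kg, 2)

# Hypothetical 81.4-kg patient at the sample's mean daily intakes:
print(per_kg(1530.4, 81.4))  # 18.8 kcal/kg/day
print(per_kg(52.7, 81.4))    # 0.65 g protein/kg/day
```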


Knowledge of and Readiness for Foodborne and Waterborne Bioterrorism of Hospital-based Food Service Directors

Effie Akerland, 2002

Objective: This study assessed the knowledge of and readiness for foodborne and waterborne bioterrorism of hospital-based food service directors and determined relationships among select demographic factors. Design: A nationwide structured explanatory mail survey was designed and pilot-tested for hospital-based food service directors. The survey was divided into three parts: the first section included questions regarding knowledge of foodborne and waterborne bioterrorism, the second section included questions related to disaster readiness, and the last section asked for demographic information. A reminder postcard was mailed two weeks after the initial mailing to non-respondents; a second survey was mailed to non-respondents one month after the postcard mailing. Subjects: Surveys were mailed to a random sample of 500 hospital-based food service directors affiliated with the American Society of Healthcare Food Service Administrators living in the United States. Statistical Analyses Performed: Data were analyzed using JMP-IN software version 4.0. Frequencies, means, medians, standard deviations, Wilcoxon rank sum tests, Pearson chi-square tests, and linear regression analyses were performed. Results: Overall, 41.6% (n=208) of surveys were usable. Almost half (n=99, 47.59%) of respondents had a minimal level of knowledge regarding bioterrorism. One third of respondents had minimal scores on readiness (n=79, 37.98%). Using an alternative scoring tool that excluded questions about adding annual and orientation bioterrorism training for employees post-9/11, one third of respondents had moderate scores on readiness (n=80, 38.46%). There was no significant relationship between food service director knowledge of bioterrorism and the respondents' age, years of experience, geographic location, gender, level of education, or credentials. 
There were no significant relationships between food service director readiness for bioterrorism and respondents' age, years of experience, level of education, or credentials using either readiness score. Respondents living inside the 100-mile radius of NYC, Somerset County, and Washington, DC had significantly higher readiness scores (6-item and 8-item, p=.04) than those living outside the 100-mile radius of all three crash sites of 9/11. Females living within the 100-mile radius of all three crash sites had significantly greater readiness scores (p = .01) than males. For the population as a whole, males had significantly higher readiness scores than females (p =.009). Application/Conclusions: Results of this pilot study indicate that additional research is needed on topics pertaining to knowledge of and readiness for foodborne and waterborne bioterrorism among food service directors. Results of this survey clearly indicate a need for greater education of food service directors and food service employees on topics pertaining to foodborne and waterborne bioterrorism. Foodborne bioterrorism has occurred in the USA, and future large-scale attacks are plausible. It is imperative that the profession is both ready and aware.


A randomized study of glutamine versus placebo administered to patients who receive Taxol chemotherapy at an outpatient cancer treatment center in Northern New Jersey.

Tamar Albert, 2002

Objective: To determine if there are any differences in symptoms and incidence of neuropathy reported between subjects taking glutamine versus placebo. Design: A randomized, double-blinded, prospective, placebo-controlled clinical trial at an outpatient clinic in Northern New Jersey. Subjects were consented prior to starting Taxol dosed every 3 weeks and consumed 10 grams of glutamine or placebo three times daily (T.I.D.) from day 1 through day 5 of treatment, repeated over three treatment periods. Subjects/setting: A total of 10 subjects met eligibility criteria; 9 completed cycle 1, and 6 subjects completed all three cycles. A general pain score was collected for the first 5 days of all three cycles; the general pain score was a sum of subjective neuropathy scores for each day. Six subjects were in the placebo group and three were in the glutamine group. Results: Overall, the treatment group appeared to have lower general pain scores than the placebo group in all 3 cycles. All three cycles appeared to have a similar pattern, in that the general pain score was highest from days 4 to 5 of the cycle. The results are very tentative because the study was underpowered. Conclusions: There was some indication that glutamine helped to reduce the general pain score. The general pattern in all 3 cycles was that the general pain score increased on days 4-5 of each cycle. Overall, the general pain score was higher in the placebo group than the glutamine group on those days. A study with a larger sample size is needed to explore this further.


A survey of obesity management practices of pediatricians in New Jersey.

Michelle Allen, 2002

The purpose of this study was to examine obesity management practices, and variables influencing those practices, of members of the NJ Chapter of the American Academy of Pediatrics (1246 pediatricians). Frequency distributions, chi-square, Wilcoxon rank sum (JMP-IN 4.04), and logistic regression (SAS 8.0) were used with significance set at p=.05. A total of 424 usable surveys were returned. The three primary reported methods to determine overweight or obesity included wt-for-ht (75%), wt-for-age (62%), and visual determination (52%); 27% reported using BMI-for-age. Over two thirds of respondents indicated that, when a child is overweight or obese, wt is discussed with the parent and/or child; family, diet, and physical activity histories are taken; diet and physical activity counseling is provided; and patients are referred to RDs. Approximately 50% of respondents reported screening overweight or obese patients for Type 2 Diabetes Mellitus (DM). The top 3 reported barriers to enhancing obesity management in practice were patient compliance (87%), time (75%), and patient interest (51%). The top 3 reported strategies to expand obesity management were better education materials (78%), a seminar on pediatric obesity (65%), and better reimbursement (61%). Respondents identified medical journals (84%), seminars (36%), and conferrals with RDs (35%) as primary sources of nutrition information. Patient compliance had a significant impact on taking a diet history (p=0.0011), providing diet counseling (p=0.0043), and screening patients for Type 2 DM (p=0.0494); patient interest had a significant impact on use of BMI to determine overweight or obesity (p=0.0117). Nutrition information sources had a significant influence on select obesity management practices. Less than 30% of NJ pediatricians use BMI to assess overweight or obesity in patients; the majority of respondents view RDs as a credible source of nutrition information. 
Future studies should explore pediatricians' diet and physical activity histories and counseling practices.


Relationship between indirect calorimetry using different ventilator systems as compared to the Harris Benedict predictive equation.

Natalie Amato

Critically ill intensive care unit (ICU) patients have specific energy requirements to optimize recovery and improve overall outcome. Two methods of assessing resting energy expenditure (REE) are indirect calorimetry (IC) and predictive equations. This study measured REE by IC using three ventilator systems and compared these results to predicted REE. Thirty ICU patients at a university teaching hospital were enrolled and divided into three equal groups of 10. Group 1 subjects' REE measurements were performed while on the Servo 300 and Puritan Bennett 840 ventilator systems; Group 2 subjects were on the Puritan Bennett 840 and Servo i ventilator systems; Group 3 subjects were on the Servo i and Servo 300 ventilator systems. Predicted REE was estimated in two ways: 1) the Harris Benedict (HB) equation with an injury factor of 1.3 (STD IF) and 2) HB with a disease-specific injury factor (DS IF). Paired t-tests compared the differences between measured and predicted REE. There was a within-subject difference of 113 ± 46 kilocalories in measured REE between Group 1 ventilator systems (p=0.04). When predictive equations were analyzed separately, DS IF was significantly higher than measured REE for the total population (p=0.0008) and for the subjects in Groups 2 (p=0.01) and 3 (p=0.02). Data suggest that predictive equations may not capture the unique metabolic demands of the ICU patient, and IC remains the gold standard for assessing REE among the critically ill. When IC is not available, the STD IF is a reasonable estimate of REE for ICU patients. Funding disclosure: DNS Researchers Award; American Dietetic Association, Dietitians in Nutrition Support Dietetic Practice Group.
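The predictive approach compared against indirect calorimetry here can be sketched from the original Harris-Benedict equations with a multiplicative injury factor. This is a generic illustration, not the study's implementation, and the example patient is hypothetical.

```python
def harris_benedict_ree(sex: str, weight_kg: float, height_cm: float, age_yr: float) -> float:
    """Basal resting energy expenditure (kcal/day), original Harris-Benedict equations."""
    if sex == "male":
        return 66.4730 + 13.7516 * weight_kg + 5.0033 * height_cm - 6.7550 * age_yr
    return 655.0955 + 9.5634 * weight_kg + 1.8496 * height_cm - 4.6756 * age_yr

def predicted_ree(sex: str, weight_kg: float, height_cm: float, age_yr: float,
                  injury_factor: float = 1.3) -> float:
    """HB basal REE scaled by an injury factor (1.3 corresponds to the study's STD IF)."""
    return harris_benedict_ree(sex, weight_kg, height_cm, age_yr) * injury_factor

# Hypothetical 60-year-old, 80-kg, 175-cm male ICU patient:
print(round(harris_benedict_ree("male", 80, 175, 60)))  # 1637 kcal/day basal
print(round(predicted_ree("male", 80, 175, 60)))        # 2128 kcal/day with STD IF
```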


The relationships between physical activity level and changes in anthropometric measures among participants in a university worksite wellness program

Jillian Wanik, 2014

Purpose: To explore the relationships between physical activity (PA) level and changes in anthropometrics among participants in a worksite wellness program (WWP). Design: A retrospective cohort. Setting: A large Northeastern university. Subjects: 64 overweight or obese employees (89% female, 50% non-white, 69% obese) who returned at both 12 and 26 weeks. Intervention: A registered dietitian (RD) provided individualized assessment and counseling at baseline, followed by a 12-week education intervention and individual appointments with the RD at 12 and 26 weeks. Measures: Participants completed health and demographic questionnaires. Anthropometric measures were obtained at each time point. The International Physical Activity Questionnaire-short form was used to classify PA as < or >150 minutes per week (min/wk) and to calculate median min/wk and MET-min/wk. Analysis: Repeated measures general linear model and non-parametric tests were used. Results: At 12 and 26 weeks, participants experienced significant decreases in weight (p=0.001); additionally, waist circumference (p=0.004) and abdominal obesity (p=0.001) decreased significantly in females only. Physical activity >150 min/wk (n=21) was associated with continued weight loss (p=0.03) and decreases in body fat percentage (p=0.02) between 12 and 26 weeks, whereas PA < 150 min/wk was associated with weight and body fat percentage re-gain during the same time points. Conclusion: Among females in a WWP, higher levels of PA were associated with avoiding weight and body fat re-gain following successful loss.
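The MET-min/wk measure used here is derived by weighting each activity's weekly minutes with a standard MET value (3.3 walking, 4.0 moderate, 8.0 vigorous, per the IPAQ short-form scoring protocol). The sketch below uses a hypothetical participant and is an illustration, not the study's code.

```python
# Standard IPAQ short-form MET weights.
MET = {"walking": 3.3, "moderate": 4.0, "vigorous": 8.0}

def total_min_per_week(activity: dict) -> float:
    """Sum of min/day * days/wk across reported activities."""
    return sum(mins * days for mins, days in activity.values())

def met_min_per_week(activity: dict) -> float:
    """Sum of MET * min/day * days/wk across reported activities."""
    return sum(MET[kind] * mins * days for kind, (mins, days) in activity.items())

# Hypothetical participant: walks 30 min on 5 days, moderate PA 20 min on 2 days.
person = {"walking": (30, 5), "moderate": (20, 2)}
print(total_min_per_week(person))          # 190 -> the >150 min/wk group
print(round(met_min_per_week(person), 1))  # 655.0 MET-min/wk
```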


Nutrition education in health professions: A survey of program directors

Jane Barracato, 1998

Objective: The objectives of this study were to determine the perceived needs, curriculum recommendations, and expected competencies in nutrition of the graduates of dental (DMD), physician assistant (PA), nurse practitioner (NP), and midwifery (CNM) programs as reported by their program directors. Subjects: Directors of nurse practitioner programs (n=149), dental schools (n=54), certified nurse midwifery programs (n=42), and physician assistant programs (n=95) in the United States. Design: A four-page nutrition education survey was sent to all program directors of dental, physician assistant, nurse practitioner, and midwifery programs nationwide, requesting information regarding their perceptions, recommendations, and expected competency level in nutrition for graduates of their programs. A reminder postcard and then another complete mailing were sent to those who did not respond to the initial mailing. Statistical analysis: All data analysis was performed using JMP IN software (SAS, 1997). Frequencies and nonparametric statistics were most frequently used to analyze the data. The Wilcoxon rank sum test and Pearson's chi-square were used to determine the significance of differences among program directors' perceptions. Results: The overall response rate was 81.2% (n=276). Perceived need for competence in nutrition varied by program. The majority of PA, CNM, and NP program directors had the same perceptions of competency in nutrition for their graduates. DMD schools differed significantly in perceived need to know how to counsel on a modified diet and how and when to refer to the registered dietitian compared with the other disciplines. At least three quarters of the program directors stated that their graduates were able to independently screen and assess the nutritional status of their patients. About half of the directors stated that their graduates were able to counsel patients on modified diets. 
Time was selected by the majority of programs as the most important factor that would enhance the provision of nutrition education. Computer-based programs were the most frequently requested education material to enhance the nutrition education component of their curricula. Significant differences were also detected among program directors' expected levels of competency in nutrition for graduates of their programs. Conclusion: Program directors indicated that their graduates need to be competent in many areas of nutrition. However, a small amount of time (10-19 hrs) is dedicated to nutrition education in the entire curricula of the surveyed programs, which may not be adequate for graduates to be fully competent in nutrition. More time needs to be allocated to nutrition in the curricula of the surveyed programs in order for graduates to be competent in nutrition and to provide quality, comprehensive care to patients in their future practice as health care professionals.


Impact of continuing medical education on appropriateness of physician parenteral & enteral order writing.

Lisa Dispensa, 1997

Malnutrition is highly prevalent among hospitalized patients and is associated with negative health outcomes. Timely and appropriate nutrition intervention is important to prevent nutritional depletion among hospitalized patients. Identification of appropriate candidates for nutrition support (NS) therapy and optimal feeding strategies are challenging steps in the proper delivery of NS. The purpose of the study was to assist physicians in the proper identification of candidates who require NS, and promote enteral feeding in a timely manner. A NS continuing medical education (CME) program was provided to physicians at a 206-bed community hospital to determine the impact of CME on appropriateness of physician parenteral and enteral order writing. Data was collected for one month before and after the CME program via medical chart review on all adult patients who received NS therapy. To identify patients who were candidates for and did not receive NS therapy in the pre CME time period, a sample population was randomly drawn from all adult admissions during the study month (n=100 of 600 admissions). In the post CME time period, patients who were candidates for and did not receive NS therapy were identified by the clinical dietitian staff and referred to the principal investigator (n=43 of 900 admissions). Fisher's exact test was used with an alpha level preset at 0.05. Twenty-three and 37 patients received NS in the pre and post CME months, representing 3.83% (of 600) and 4.11% (of 900), respectively (p=0.7880). In the pre CME population, 11 (59%) patients inappropriately received parenteral nutrition (PN). This was not statistically different when compared to the post CME population in which 13 (52%) patients inappropriately received PN (p=0.5567). The average number of days PN was delayed was 13 and 9.5 in the pre and post populations, respectively (p=0.5567). 
In the pre and post CME populations, 13 (56%) and 21 (52%) patients were candidates to receive enteral nutrition (EN) therapy, respectively. In the pre CME population, 2 (15%) of 13 patients appropriately received EN. In the post CME population, 7 (33%) of 21 patients appropriately received EN. There was no significant difference between the two groups (p=0.4267). There was no statistical difference found in the timeliness of the initiation of EN between groups. The proportion of the sample identified as being at risk for malnutrition (pre CME) was statistically different from the proportion of the population (post CME) identified at risk, therefore these groups were not comparable. CME did not impact the appropriateness of physician parenteral and enteral order writing. This may be a result of the small sample size in each group and/or the limitation of the CME program to a single, one-hour session. However, there were a greater number of patients in the post CME group who received NS therapy and an increase in the number of patients who received enteral feeding. In addition, NS was initiated earlier (1.5 days earlier) in the post CME group. This may be due to an increased awareness among physicians of the need to identify malnutrition and begin nutrition intervention early. This study identified the need for standardization of care for the patients who require NS therapy. Physicians need to document the reason for initiation of NS, the mode of feeding chosen, and the expected length of therapy. A proactive approach to nutrition intervention must begin with clear documentation of the care plan.


Immediate gastric feeding in the critically ill trauma patient: continuous versus bolus method

Jennifer Bridenbaugh, 1998

Introduction: The objective of this study was to investigate immediate gastric enteral feeding in the critically ill patient and determine if the method of delivery, continuous (C) vs. bolus (B), had an impact on the tolerance and effectiveness of immediate gastric enteral nutrition. Methods: The study design was a prospective, randomized, controlled clinical trial. Eligible patients were randomized to receive either bolus or continuous feeding upon admission to the Surgical Intensive Care Unit (SICU). Enteral nutrition was initiated within 18 hours of injury. Results: Six hundred and three patients were admitted to the SICU during an eleven-month period (May 1997 through April 1998). Thirty-six (39%) were consented, randomized, and enrolled. Of the enrolled subjects, 22 were randomized to receive continuous feeding and 14 to receive bolus feeding. There was a significant difference in the mean percent of calorie needs met between the bolus and continuous deliveries (day 7: C = 51.5% ± 42.3; B = 87.1% ± 21.5; p = 0.03, Wilcoxon rank sum). Comparisons of total calories and protein delivered, as well as mean percent of protein needs met, were not statistically significant (p > 0.05, Wilcoxon rank sum). There were notable positive trends in nutrient delivery for the bolus method. No significant differences in complications were identified for either feeding method (p > 0.05, Fisher's exact test). Conclusions: Immediate gastric feeding is safe in the critically ill population. Bolus feeding may be advantageous in meeting a greater percent of caloric needs sooner after injury. In view of the emphasis on early enteral nutrition in the critically ill, gastric delivery should be considered as the first mode of feeding.


The effectiveness of dysphagia screening by a registered dietitian on the determination of dysphagia risk.

Rebecca Brody, 1997

The objective of this study was to examine the ability of the Registered Dietitian (RD) to identify patients at risk of dysphagia and make appropriate diet/feeding recommendations as compared to the Speech-Language Pathologist (SLP). Predictors of dysphagia risk were also determined. Thirty-four patients admitted during a two-month period to a neuroscience unit at an urban teaching hospital were analyzed prospectively. The RD and SLP screened subjects independently, through questioning and/or mealtime observation, for signs and symptoms of dysphagia. Presence of dysphagia risk and diet/feeding recommendations were determined. Kappa statistics demonstrated moderate agreement (0.61) between the RD's and SLP's determinations of dysphagia risk (>0.7 = strong agreement, 0.4 to 0.7 = moderate agreement, and <0.4 = weak agreement). The RD predicted the ability of the patient to consume an oral diet with strong agreement (1.0); various diet consistencies with moderate agreement (0.61); and the need for a liquid restriction with strong agreement (1.0). The most significant screening indices for prediction of dysphagia risk (p < .05) were age (p = 0.0181), history of dysphagia (p = 0.0428), difficulty swallowing solids (p = 0.0007), observed facial weakness (p < 0.001), and a wet or hoarse voice (p = 0.007). Self-reported screening variables significantly related to dysphagia risk included drooling of liquids (p = 0.009) and solids (p = 0.0080), facial weakness (p = 0.0006), wet or hoarse voice (p = 0.0010), and prolonged eating time (p = 0.0157). This study supported the concept that the RD can effectively identify and manage patients with dysphagia. Screening for dysphagia can be implemented as part of standard nutritional assessments and may aid in decreasing dysphagia-related complications.
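The kappa statistic reported above measures agreement between two raters beyond what chance alone would produce. A minimal sketch of the calculation, using the interpretation thresholds quoted in the abstract (the ratings below are illustrative, not the study's data):

```python
# Cohen's kappa for paired categorical ratings (e.g., RD vs. SLP
# dysphagia-risk determinations). Data are illustrative only.

def cohens_kappa(rater_a, rater_b):
    """(observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

def interpret(kappa):
    """Thresholds used in the abstract: >0.7 strong, 0.4-0.7 moderate, <0.4 weak."""
    if kappa > 0.7:
        return "strong"
    return "moderate" if kappa >= 0.4 else "weak"

rd  = ["risk", "risk", "no", "no", "risk", "no", "no", "risk", "no", "no"]
slp = ["risk", "no",   "no", "no", "risk", "no", "no", "risk", "no", "risk"]
k = cohens_kappa(rd, slp)
print(round(k, 2), interpret(k))
```

Note that 8/10 raw agreement here yields a kappa of only about 0.58, because both raters say "no" often enough that much of the agreement is expected by chance.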


Superintendents' attitudes, policies, and practices regarding school food and nutrition services.

Andrea Brounstein, 2004

Objective: To determine New Jersey public school district superintendents' attitudes, policies, and practices regarding school food and nutrition services, and whether the use of qualified nutrition professionals by New Jersey public school districts has any impact on decisions made. Design: Mail survey to the population of New Jersey district superintendents, 3 mailings. Subjects/Setting: New Jersey district superintendents, or others representing the district who completed the survey (n=587; response rate: 251/587 = 42.76%). Of the 251 surveys returned, 247 were usable. Statistical Analysis Performed: Descriptive statistics examined the frequency of specific attitudes, practices, and policies. Correlation coefficients were used to analyze the relationships between these variables. χ² analysis, Fisher's Exact tests, and ANOVA were used to analyze the relationships between each of the outcome variables and use of a qualified nutrition professional. Results: One-half (50.20%) of the districts that responded reported using a qualified nutrition professional. The use of a qualified nutrition professional showed benefit in selling healthier items but had little effect on policy-making decisions and attitudes. NJ public school districts had a very positive attitude regarding the importance of having a district-wide food and nutrition policy (89% were in support); however, only 34.82% of districts reported currently having a nutrition-related policy. Over three-fourths (81.82%) of responding districts had access to one or more competitive food venues; only 39% of participants supported having school policies to help reduce the number of overweight or obese students. Applications/Conclusions: This study supports the need for districts to implement individual district-wide nutrition-related policies and to employ more qualified nutrition professionals to assist in policy development at local and state levels.
This study's findings should be used as a baseline needs assessment for state agencies to help guide each district to create and enforce stricter food and nutrition-related policies and practices.


Registered Dietitians working in clinical positions have access to sources of empowerment

Linda Buckley, 2002

Objective: To determine Registered Dietitians' perceptions of their access to sources of empowerment in the workplace and any demographic factors that may be significantly related to their perceptions of empowerment. Design: This study was based on the conceptual framework of Kanter's theory of organizational power. The Conditions of Work Effectiveness Questionnaire, developed by Chandler and revised by Laschinger, was used to measure perceived access to sources of power: opportunity, information, support, and resources. Demographic questions were added to the questionnaire to determine factors that may affect perceptions of empowerment. Subjects: Subjects consisted of clinical RDs employed by ARAMARK, working in acute care settings. Letters were sent to the clinical nutrition managers of all of ARAMARK's acute care facilities asking for their assistance in distributing questionnaires to the RDs working in clinical positions in their departments. Responses were received from 96 managers, and a total of 362 surveys were mailed to them to distribute. Statistical analyses: The data were analyzed using descriptive statistics. One-way ANOVA was used to identify differences in empowerment scores based on level of education, years in practice, work setting, and age. Significance was set at p=.05. Results: Usable questionnaires were received from 230 RDs (92%). On a 5-point scale, scores for opportunity (mean±standard deviation [SD]=3.96±0.7), access to information (mean±SD=3.38±1.0), access to support (mean±SD=3.39±0.9), and access to resources (mean±SD=2.79±0.6) suggest that RDs perceived themselves to have moderate access to sources of empowerment. Respondents with the longest time in practice (p=.0006) and those in the oldest age categories (p=.033) had statistically significantly higher access-to-information scores, and their overall empowerment scores were also significantly higher than those in the other categories.
Conclusions/applications: Registered dietitians working in clinical positions can have access to sources of empowerment in the workplace. The structures within the organizations they work in, the administration, and the dietitians themselves all have the potential to enhance their empowerment. These findings will be shared with clinical nutrition managers and the RD respondents who requested the information.


The Knowledge, Attitudes, Personal Health Care Practices and Patient Care Practices of UMDNJ Faculty Members Regarding Weight Management

Donna Castellano, 2005

Objective: Health care professionals' (HCPs') practices toward obesity treatment may be critical in helping to reduce the international epidemic of obesity. This study aimed to establish the knowledge, attitudes, personal health care practices, and patient care practices of various health-related professions, and the relationships of these variables regarding the provision of weight management to patients. Design: A survey was mailed to a random sample of 778 full-time faculty members of the University of Medicine and Dentistry of New Jersey during the academic year 2004. Statistical Analysis: Descriptive statistics of knowledge, attitude, personal health care, and patient care practices scores were analyzed using SPSS (Version II) software with significance set at alpha=.05. Pearson product-moment correlation coefficients were used to compare relationships among the variables; an independent t-test was used to compare mean total knowledge scores between faculty of the clinical vs. academic settings; one-way ANOVA was used to compare mean scores among the various health professions. Results: The survey yielded a 45.4% (n=353) usable response rate. More than 75% of faculty could not identify the clinical definition of obesity based on BMI, and 25% did not know that high waist circumference is associated with increased disease risk. There were no significant differences in total knowledge scores between clinical and academic faculty (p=.11). Overall, faculty members agreed that treating overweight and obesity is important and disagreed that treatment of overweight and obesity is futile. They held negative attitudes regarding patients' motivation and ability to lose weight, and agreed that time constraints prevent adequate counseling. Faculty members' personal health care and patient care practices were inconsistent when compared with federal guidelines.
Common reasons for faculty members not providing education or referral for weight management were that it was "not their responsibility" or "outside of their specialty." There were weak but significant correlations between faculty members' knowledge (p<.01), attitudes (p=.03), and personal health care practices (p<.01) and their patient care practices. No significant differences were found in mean scores for knowledge (p=.12), attitudes (p=.19), or personal health care practices (p=.93) among the various health care professionals surveyed in this study. Dentists had significantly lower patient care practice scores when compared with physicians (p=.02) and nursing professionals (p=.03). Conclusions: Overall, higher knowledge, more positive attitudes, and better personal health care practices were associated with better adherence to evidence-based patient care practices for weight management. Efforts should be made to improve all HCPs' knowledge of screening techniques, including use of BMI and waist circumference, and to educate them to take responsibility for weight management within their own patient populations. Improving HCPs' attitudes and personal health care practices might also lead to improvements in patient care practices related to weight management.


A comparison of feeding tube material and the incidence of sinusitis in adult open heart surgery ICU patients.

Nicole D'Andrea, 1999

Adult open heart surgery patients requiring nasogastric tube feeding were evaluated prospectively for the incidence of sinusitis. The nasogastric feeding tube material differed while the bore size was controlled (14 French). A total of 13 patients were enrolled in the study: 8 patients with a 14 French polyurethane nasogastric feeding tube and 5 patients with a 14 French polyvinylchloride nasogastric feeding tube. Duration of tube feeding ranged from 2 to 18 days in the polyurethane group (total tube feeding days=69) and 2 to 17 days in the polyvinylchloride group (total tube feeding days=33). All patients were tracked daily for signs and symptoms of sinusitis (including fever, headache, nasal discharge, oral secretions, purulent drainage, and sinus tenderness). Any patient suspected of having acquired sinusitis was to be evaluated by a head CT scan. One patient in the polyvinylchloride group underwent a CT scan, which revealed sinusitis. This finding was not significant using chi-square analysis (p = 0.2199). There was no statistically significant difference between the two tube types when evaluating sinusitis signs and symptoms on a per-patient basis using chi-square analysis (p=0.7264). There were significantly greater signs and symptoms of sinusitis in the polyvinylchloride group (p=0.0008, logistic regression) when analyzing signs and symptoms based on each tube feeding day. The severity of signs and symptoms of sinusitis increased as tube feeding duration increased, using chi-square analysis (p=.0076). The polyurethane nasogastric feeding tube was well tolerated by patients, with minimal complications and minimal sinusitis signs and symptoms, indicating that, compared to polyvinylchloride, it was the superior feeding tube material.


Resting Energy Expenditure of Overweight Women Determined by Three Methods

Kimberly Gottesman, 2015

Background: In clinical practice, determination of resting energy expenditure (REE) is important when conducting nutrition assessments, recommending nutrition interventions, and monitoring adequacy of oral intake. Measuring REE is not always practical, so clinicians rely on predictive energy equations (PEE) to estimate nutritional needs. Design/Subjects: A feasibility study was conducted to assess REE estimated by 25 kilocalories/kilogram (kcal/kg) compared to REE measured by MedGem® (MG) and Quark RMR® indirect calorimetry (QIC). A comparison between REE measured by MG and QIC was also made. Nine overweight women, ages 26-60 years, who were participants in a worksite wellness program were enrolled. Statistics: Resting energy expenditure determined by PEE, MG, and QIC were compared. Level of significance (p<0.05) was tested using Wilcoxon Signed Ranks. Bland-Altman plots, with ±10% bias lines added, assessed the limits of agreement among the three REE values. Results: There was no significant difference in REE determined by MG and QIC (p=0.07). There were significant differences in REE when comparing PEE to QIC (p<0.01) and MG (p<0.01). With ±10% bias lines added, Bland-Altman plots revealed that agreement between REE measured by QIC and MG was 67%, and agreement between REE measured by QIC and estimated by PEE was 78%. Conclusions: In this sample, REE measured by MG was comparable to REE measured by QIC. Resting energy expenditure estimated by PEE was higher than REE measured by QIC and MG. Therefore, the use of the 25 kcal/kg equation in overweight women should be reconsidered and further research conducted.
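The ±10% agreement criterion above can be read as: for each subject, the between-method difference must fall within 10% of the pairwise mean. A minimal sketch of that calculation; all weights and REE values below are illustrative assumptions, not the study's measurements:

```python
# Share of subjects whose between-method REE difference falls within
# ±10% of the pairwise mean (the Bland-Altman bias-line criterion
# described in the abstract). All values are illustrative.

def pct_within_10pct(method_a, method_b):
    """Percent of paired values agreeing within ±10% of their mean."""
    within = 0
    for a, b in zip(method_a, method_b):
        mean = (a + b) / 2
        if abs(a - b) <= 0.10 * mean:
            within += 1
    return 100 * within / len(method_a)

# 25 kcal/kg predictive estimate vs. a hypothetical measured REE (kcal/day)
weights_kg = [70, 82, 95, 77, 88, 101, 73, 90, 85]
pee = [25 * w for w in weights_kg]                             # predicted
qic = [1650, 1900, 2100, 1750, 2050, 2300, 1700, 2150, 1980]   # "measured"

print(f"{pct_within_10pct(pee, qic):.0f}% of subjects within ±10%")
```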


The relationship between nutrition knowledge, diet intake and body composition in adults with HIV at The University Hospital, Infectious Disease (ID) Clinic - University of Medicine and Dentistry of New Jersey (UMDNJ) in Newark, NJ.

Stephanie Dimecurio

Purpose: The purpose of this study was to examine the relationship between nutrition knowledge, diet intake, and body composition (weight, Body Mass Index, Body Cell Mass, and Fat Mass) in adults with HIV at the ID Clinic at UMDNJ in Newark, NJ. Hypothesis: Nutrition knowledge has a positive influence on the diet intake of adults with HIV attending the UH-ID clinic at UMDNJ, Newark, NJ. Study Design: A prospective study in an Infectious Disease practice in an urban inner-city hospital. Method: All eligible clients with HIV, with or without AIDS, seen by the Registered Dietitian (RD) were asked to participate in this study. Informed consent was obtained from each subject. All participants were seen by the RD, gave a 24-hour recall, took a 14-question Nutrition Knowledge Quiz, and completed a Bioelectrical Impedance Analysis (BIA) test. Actual weight, Body Mass Index (BMI), usual weight, % of usual weight, % Body Cell Mass, and % Fat Mass were measured to reflect body composition. Food models were used during the 24-hour recall for a better assessment of serving sizes. Results: Eighty-six subjects were enrolled in the study. The mean score on the Nutrition Knowledge Quiz was 67.0%. The fats/sweets and meat groups were adequately consumed by more than 85% of the population. Less than 40% of the population met the minimum recommendations for the milk, fruit, and vegetable groups. More than 50% of the population met <90% of their calorie needs as well as <90% of their protein needs; only 23%, however, met >90% of both calorie and protein needs. Twenty-five percent of the population were at <90% of their usual weight, and 34% had <90% of ideal BCM. Diet intake, reported as adequate or inadequate according to the Food Guide Pyramid, as well as percent of estimated needs met, was inversely correlated with Nutrition Knowledge score. There was no relationship between nutrition knowledge and body composition. There was a significant relationship between lower BCM and lower intakes in women.
Conclusion: Higher nutrition knowledge scores did not result in higher-quality or higher-quantity diets, nor did they result in improved body composition in adults with HIV at the ID Clinic in Newark, NJ.


Factors influencing nutrition counseling for clients with type II diabetes mellitus.

Diedre Ellard, 1997

Objective: To gather data relating to the client's perception of not only the nutrition counseling provided but also the perceived need for diet change, and to identify variables which may impact these. Design: A retrospective, descriptive study was conducted via mailed survey to clients of a diabetes center in a central New Jersey teaching hospital. The total population of clients referred to the diabetes center from September 1995 to May 1996 was studied. Subjects: 29 participants responded by mail to the survey. Results: Pearson's Chi-Square, the Wilcoxon non-parametric rank sum test, and rank correlation were used to analyze the data. No statistically significant differences relating to teaching environment, site of teaching, age, gender, or insulin use were observed to influence the client's perception of nutrition counseling or the perceived need for diet change. However, statistical significance was detected between responses for some of the survey statements for the variables of teaching environment, site of teaching, gender, and insulin use. Applications: As principal providers of nutrition counseling, dietitians are challenged not only to individualize meal plans and integrate them into total diabetes care, but also to develop teaching strategies which enable clients to appropriately apply their nutrition knowledge to diabetes-related problems. These findings suggest that the variables of teaching environment, site of teaching, gender, and insulin use should be considered in the development of these teaching strategies.


The Express Select® concept of patient food service delivery systems increases patient satisfaction, therapeutic and tray accuracy, and is cost neutral for food and labor costs.

David Folio, 2000

Objective: A customer service initiative known as Express Select® was implemented by the Wood Company food service management company at Brandywine Hospital in Coatesville, PA and at Lancaster General Hospital in Lancaster, PA, 190-bed and 506-bed acute care hospitals, respectively. Express Select® converts a traditional meal delivery system to a spoken-menu concept. The purpose of this study was to compare the Express Select® system to a traditional food delivery system. Design: This study compared the two meal delivery systems' impact on patient satisfaction, therapeutic and tray accuracy, food cost, and labor cost. Patient satisfaction variables were gathered by interviewing patients; the sample size was the equivalent of a one-day census. Therapeutic accuracy data compared the patient tray to the physician's diet order. Tray accuracy assessed whether items on the patient tray matched ordered items. Patient food and labor costs were calculated for the four weeks of the full month prior to implementation and the four weeks of the full month post-implementation. Results: There was a significant increase in food taste, server courtesy, receipt of food ordered, and overall satisfaction at Brandywine Hospital. There was a significant increase in all patient satisfaction categories at Lancaster General Hospital with the implementation of Express Select®. There was a significant increase in therapeutic accuracy and tray accuracy at both hospitals with the implementation of Express Select®. Food and labor costs decreased slightly with implementation, but not significantly. Conclusion: The Express Select® program clearly demonstrated an ability to increase patient satisfaction, therapeutic accuracy, and tray accuracy without increasing food and labor costs in two acute care trauma hospitals in Pennsylvania.


A survey of Certified Diabetes Educators to evaluate applied diabetes knowledge.

Colleen Fossett, 1999

Diabetes is a serious metabolic disease that affects 16 million people. Education is the cornerstone of treatment to delay and prevent the devastating complications that may occur. Certified Diabetes Educators (CDEs) are specialists in the field of diabetes education. The majority of CDEs are Registered Dietitians (RDs) and Registered Nurses (RNs). The objective of this study was to determine if there was a significant difference in the application of diabetes knowledge between RD CDEs and RN CDEs. A 20-item questionnaire was mailed to 319 members of the American Association of Diabetes Educators (AADE) who were CDEs. Questions from the survey were replicated from the AADE core curriculum posttest. Fifty percent of the questions were typically known and taught by RD CDEs, and 50% were typically known and taught by RN CDEs. Questions on the survey represented diabetes knowledge in the domains of nutrition, pharmacology, hyper/hypoglycemia, and complications. Of the 319 surveys that were mailed, there was a 59.8% response rate (n = 191). The RD CDEs had a 54% (n = 79) usable response rate, and the RN CDEs had a 45.9% (n = 67) usable response rate. There was not a significant difference in the total scores of the RD CDEs and RN CDEs on the twenty-question survey (p = .18); however, there were statistically significant differences when the questions were filtered to specific domains. RD CDEs scored significantly higher (p = .004) than the RN CDEs on questions in the nutrition domain, but significantly lower than the RN CDEs on questions in the pharmacology domain (p = .013) and in the complications domain (p = .005). The findings of this study suggest that RD CDEs and RN CDEs have a broad range of diabetes knowledge. Analysis of the mean scores of RD CDEs and RN CDEs within domains suggests the need for an expansion of knowledge beyond the traditional roles of nurse and dietitian.
The attainment of this knowledge may be encouraged by cross-training, up-skilling, and continuing education seminars. A CDE who is multiskilled may appear more competent in the eyes of the patient and more flexible and versatile to the facility in which they are employed. Future research includes additional studies on CDEs and the impact they have on the health care industry, as well as the difference in diabetes knowledge between CDEs and non-CDEs.


Influence of resting energy expenditure and body mass index on weight loss after Roux-en-Y gastric bypass

Laura Trento, 2001

Obesity is a growing epidemic. Surgical treatment for severe obesity has evolved over several decades. Weight loss after bariatric surgery varies, depending on operative procedure and pre-surgical factors. Altered resting energy expenditure (REE) and severity of obesity may influence weight loss outcome. Medical records of 192 patients (79% female) were reviewed for data on operative weight (op wt), BMI, measured REE (MREE), and follow-up weight. Patients had previously undergone Roux-en-Y gastric bypass between May 1996 and March 2000 at a bariatric surgical program in northern New Jersey. Weight loss outcome was measured as change in BMI and % excess weight loss (%EWL). Preoperative summary statistics revealed a mean age of 39 years, op wt of 143 kg, BMI of 50.8 kg/m2, MREE of 2387 calories, and % predicted REE (%PREE) of 105%. The majority of patients (74%) had a normal metabolic rate (%PREE 85-115%); only 7 patients (4.2%) were hypometabolic (%PREE < 85%). Thirty-four percent (n=65) of patients were super obese (BMI 50-59 kg/m2) and 15% (n=29) were super/super obese (BMI >60 kg/m2). Postoperative follow-up rate was 32% (n=76) and 29% (n=29) at one and two years, respectively. Mean BMI and %EWL were 36.4 kg/m2 and 52% at one year, and 32.5 kg/m2 and 66% at two years, respectively. Postoperative BMI decreased over time for all patient groups. There were no weight loss failures (follow-up weight greater than initial weight). There was no difference in %EWL based on preoperative %PREE. Patients with preoperative BMI >50 kg/m2 had lower %EWL over time. Although the data suggest that super obese patients have less favorable weight loss, the majority of patients lost greater than 50% EWL at two years, and heavier patients may require longer follow-up. Analysis of outcome should also evaluate improvement in co-morbid conditions and quality of life. More research is necessary to determine whether altered REE impacts weight loss outcome.
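The outcome measures above, change in BMI and %EWL, can be sketched as simple arithmetic. Excess weight is conventionally operative weight minus ideal body weight; the ideal-weight and height figures below are illustrative assumptions, not study data:

```python
# Bariatric weight-loss outcome measures named in the abstract.
# The patient values are hypothetical, chosen only for illustration.

def pct_ewl(op_wt_kg, follow_up_wt_kg, ideal_wt_kg):
    """%EWL = weight lost / (operative weight - ideal weight) * 100."""
    excess = op_wt_kg - ideal_wt_kg
    return 100 * (op_wt_kg - follow_up_wt_kg) / excess

def bmi(wt_kg, ht_m):
    """Body mass index in kg/m2."""
    return wt_kg / ht_m ** 2

# e.g., a patient at the cohort's mean operative weight of 143 kg, with an
# assumed 68 kg ideal weight and 1.68 m height, down to 104 kg at follow-up
print(round(pct_ewl(143, 104, 68), 1))  # %EWL after losing 39 kg
print(round(bmi(104, 1.68), 1))         # follow-up BMI
```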


The Impact of Dysphagia Screening by a Registered Dietitian on Identification of Dysphagia Risk and Appropriateness of Diet Order in Acute Stroke Patients

Maureen Huhmann, 2002

Objective: To assess the level of agreement regarding determination of dysphagia risk and diet order between the RD and the SLP (Speech-Language Pathologist) in hospitalized stroke patients, and to determine the predictors of dysphagia that the RD can use to effectively identify individuals in this population. Design: Prospective single-blinded study of patients admitted to the hospital with the diagnosis of stroke. The RD used a Dysphagia Screening Tool; results were compared to an SLP bedside swallowing evaluation to assess level of agreement on dysphagia risk and diet recommendations. Subjects/Setting: A convenience sample of 32 adult patients admitted to the Stroke Team at Raritan Bay Medical Center with the diagnosis of acute stroke from July 2002 to January 2003. Statistical Analyses: Demographic data, nutrition risk, incidence of dysphagia indicators, and MD and RN documentation of dysphagia risk factors were compared utilizing frequency distributions (JMP-In software, alpha set at p=0.05). The kappa statistic was used to assess agreement on dysphagia risk, diet consistency, and liquid restrictions. Logistic regression was used to identify the best predictors of dysphagia risk. Results: The RD identified 40.6% of patients (n=13) and the SLP identified 31.3% of patients (n=10) at dysphagia risk. There was excellent agreement (k=0.80) on determination of dysphagia risk, perfect agreement on oral diet (PO vs. NPO), excellent agreement on liquid diet (k=0.83), and very good agreement on solid diet (k=0.79) orders. An abbreviated RD screening tool was designed as a result of the logistic regression analysis. Applications/Conclusions: This study demonstrated that the RD can effectively identify dysphagia risk and has a role in the screening of acute stroke patients for dysphagia. Upskilling of RDs to include dysphagia screening as part of standard nutritional care would expand the role of the RD as a member of the multidisciplinary stroke management team.


Herbal supplement use in the elderly

Christy Kaloukian, 2001

OBJECTIVE: To investigate herbal supplement use in the elderly and determine whether socioeconomic status, level of education, and perceived health status are predictors of use. DESIGN: A survey on herbal supplement use was distributed at four Senior Centers in Camden County, New Jersey. SUBJECTS: One hundred ninety-nine individuals aged 65 and older attending Senior Nutrition Centers in Camden County. STATISTICAL ANALYSIS: Frequency distributions, Chi-Square analysis, logistic regression, and odds ratios were performed using the JMP-IN statistical analysis program. RESULTS: Forty (20.8%) of the respondents reported using herbal supplements. Of those reporting use, twenty-four (60%) were female and 16 (40%) were male. Sixteen (40%) indicated use of more than one herb (range = 1-4 herbs). The herbs most commonly used were garlic (38.1%), ginkgo biloba (14.3%), cranberry (12.7%), saw palmetto (12.7%), ginseng (6.3%), and echinacea (6.3%). The primary reason for use was ensuring good health. The majority of herbs (76.2%) were used daily. Higher educational levels and higher income levels were both found to be significant, independent predictors of herbal supplement use (p=.010 and p=.025, respectively). The majority (71.8%) of herbal supplement users reported their health status as excellent, very good, or good. Ethnicity, prescription medication use, and gender were not predictors of herbal supplement use in this study. Socioeconomic status, perceived health status, and education level did not have a significant effect on herbal supplement use when all three factors were combined. CONCLUSIONS: Twenty percent of elderly subjects reported using herbal supplements. The primary reason for use was ensuring good health. The majority of herbs were used daily. Higher educational levels and higher income levels were both found to be significant, independent predictors of herbal supplement use.
IMPLICATIONS: As the population continues to age and herbal supplement use increases, the healthcare community must be prepared to educate geriatric patients on the risks and benefits of use. Healthcare providers must begin asking questions regarding herbal supplement use in the elderly. RDs should be knowledgeable about herbal supplement products and provide accurate information to patients who choose to incorporate herbs into their diets. In addition, pharmacists need to provide information to older individuals advising them of the potential risks of combining herbal supplements and prescription medications.


A retrospective review of the relationship between the initiation of and the adequacy of enteral nutrition support in mechanical ventilation weaning.

Cynthia Ann Kwiatkowski, 1997

OBJECTIVE: This study examined the relationship between the initiation and the adequacy of enteral nutrition support, the ability to meet established weaning parameters, and the total number of days requiring mechanical ventilation in patients admitted to the medical ICU. DESIGN: A retrospective chart review for the period from March '95 through February '96 was conducted. Only non-surgical patients (excluding feeding tube or tracheostomy placement) who were ventilated for three or more days were included. SUBJECTS: Fifty-two patients met eligibility requirements. The mean age was 68.65 years, with a range from 25 to 91 years. STATISTICAL ANALYSIS: Statistical analyses were carried out using analysis of variance, Spearman rho, nonparametric measures of association, the Wilcoxon rank test, and one-way ANOVA t-tests for continuous data, and chi-square tests for categorical data, on JMP Start Statistics Systems (SAS Institute Inc.). RESULTS: There was a nonsignificant (p=0.1344) but moderate correlation (R=0.48) between the adequacy of the enteral nutrition regimen and the ability to meet established weaning parameters. No correlation was identified between the day enteral support was started and either the ability to meet established weaning parameters or the total number of days requiring mechanical ventilation. A significant (p<0.0001) and moderate correlation (R=0.73) between the adequacy of enteral feeding and the total number of vent days was found: as the adequacy of the enteral support regimen increased, so did the total number of vent days. Virtually no correlation (p=0.17; R=0.19) was found between patients with and without enteral nutrition support on the ability to meet weaning parameters. A Fisher's exact test (p=0.6625) did not support the conclusion that feeding or not feeding made a difference in mortality, likely due to the small sample size.
CONCLUSIONS: Multiple factors affect the ability to initiate enteral nutrition support and to extubate the mechanically ventilated patient. In this study, it was not possible to demonstrate that patients who had early enteral nutrition support could be successfully weaned from the respirator sooner than patients not fed. A statistically significant association was observed: as the adequacy of the nutrition regimen increased, the total number of vent days increased, and patients who were fed were intubated longer than patients who were not fed. The degree of disease severity, post-extubation survival, incidence of complications, and length of hospitalization of the two groups are unknown. There is a need for further study to determine the effect of early, aggressive nutritional therapy vs. no nutritional support on morbidity and mortality. Improvements need to be made in identifying the ventilated patients who would most benefit from early nutrition support.


The relationship between two predictive resting energy expenditure (REE) equations and measured REE for ventilated adult open heart surgery patients in the early postoperative period in the ICU.

Susan Macia, 2000

Predicting energy needs in critically ill open heart surgery patients can be difficult because of uncertainties about the influence of multiple factors. Understanding these components is important to prevent over- and underfeeding, each of which limits optimal outcomes. Indirect calorimetry measures energy expenditure while accounting for the multiple factors that influence it. However, many facilities do not have indirect calorimetry and rely on predictive equations to estimate energy needs. The purpose of the study was to determine whether two predictive equations accurately estimate the energy expenditure of postoperative ventilated open heart surgery patients as compared to measurements using indirect calorimetry. It was hypothesized that predictive energy equations do not accurately predict resting energy expenditure in this population. A prospective trial was conducted in eight mechanically ventilated open heart surgery patients on postoperative day three or four. The four women and four men [mean (±SE) age 71.1 ± 11.1 years] underwent measurement of REE using indirect calorimetry by metabolic cart (SensorMedics) [mean (±SE) 1441 ± 156 calories/day]. Their REE was estimated by the Harris-Benedict equation multiplied by a stress factor (1.2) plus a factor for pyrexia [mean (±SE) 1652 ± 142 calories/day] and by the empirical equation 25 calories/kg [mean (±SE) 1549 ± 100 calories/day]. Results showed that the Harris-Benedict equation with stress factor(s) overestimated mean measured REE, while 25 calories/kg was not significantly different from mean measured REE (P = .246). Preoperative weight, or weight adjusted for obesity (Metabolically Active Weight, MAW), was used in the predictive equations, as is common in clinical practice. 
Multiple linear regression revealed that an equation with the factors MAW (P = .001), gender (P = .003) and height in cm (P = .004) accounts for 96% of the variability in measured REE (adjusted R2 = .96), with the single most predictive factor for measured REE being MAW (adjusted R2 = .71). The results of this study suggest that 25 calories/kg of MAW is more predictive of mean measured REE in the population studied. The study should be repeated in a larger population to detect the influence of other factors, such as surgical intervention and medications, on energy expenditure.
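As a quick illustration (not drawn from the thesis data), the two predictive approaches compared above can be sketched in a few lines of Python. The Harris-Benedict coefficients are the standard published ones; the example patient values and the 1.2 stress factor are hypothetical stand-ins:

```python
# Sketch of the two predictive REE approaches compared in the study.
# Coefficients are the standard Harris-Benedict (1919) equations; the
# example patient below is hypothetical, not a study subject.

def harris_benedict(weight_kg, height_cm, age_yr, sex):
    """Basal/resting energy expenditure (kcal/day), Harris-Benedict."""
    if sex == "male":
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

def predicted_ree(weight_kg, height_cm, age_yr, sex, stress_factor=1.2):
    """Harris-Benedict multiplied by a stress factor, as in the study."""
    return harris_benedict(weight_kg, height_cm, age_yr, sex) * stress_factor

def empirical_ree(weight_kg, kcal_per_kg=25):
    """Simple weight-based estimate (25 kcal/kg of MAW)."""
    return weight_kg * kcal_per_kg

# Hypothetical 71-year-old male, 70 kg, 170 cm
print(round(predicted_ree(70, 170, 71, "male")))  # Harris-Benedict x 1.2
print(round(empirical_ree(70)))                   # 25 kcal/kg
```

In practice the weight passed in would be the preoperative or metabolically active weight, per the study's convention.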


The relationship between the use of physical assessment competencies in practice and the methods of instruction of these competencies by RDs in professional practice.

Tami Mackle, 2002

The purpose of this study was to examine how individuals who have completed one of two physical assessment (PA) programs use the knowledge and skills learned in practice, and whether the method of instruction had an effect on the use of those skills. Surveys were mailed to 891 individuals, all of whom completed either a video-based PA program or a hands-on seminar PA course between 1996 and 2000. Data were collected on area of practice, workplace setting, use of PA competencies, the program in which competencies were learned, how competencies are used, influences on their use in practice, and demographic variables. Chi-square analysis (JMP-IN 3.2.6) and stepwise logistic regression (SAS 8.0) were used for analysis, with p set at .05. Four hundred and seven usable surveys were analyzed. Sixty percent of respondents worked in a clinical setting. Four of the five most used competencies were similar between the two programs: assessment of peripheral edema, dysphagia screening, skin assessment, and bowel sounds. More RDs were using PA competency information in clinical assessment but did not perform the competencies independently. Respondents with the CDE (p=.0359) and CNSD (p=.0215) credentials were more likely to use select PA competencies. Almost 50% of respondents (n=153) reported that confidence enhanced their use of PA competencies, and 52% (n=159) reported that time was a barrier to using PA competencies in clinical practice. There were no significant differences in the use of PA competencies in practice between respondents who completed the video-based program and those who completed the in-person program. Although not statistically significant, there appeared to be greater use of PA competencies by those who had received additional training and those who had completed the in-person program. Future studies on PA competency use should focus on the effectiveness of learning strategies and the time available to perform competencies in practice. 
It would also be beneficial to learn more about the "other" group, who performed PA competencies often, and where those competencies were being taught.


Should protein be included in the calorie calculations of parenteral and enteral nutrition?

Lou Manera, 1998

Purpose: The purpose of this study was to determine the practices and beliefs of a sample of physicians and registered dietitians (RDs) concerning the inclusion or exclusion of protein from the total calorie calculation of nutrition support regimens. Methods: Survey design. The population was a nationwide, randomly selected sample of 400 physicians from the Society of Critical Care Medicine (SCCM) and 200 dietitians from the American Dietetic Association (ADA) practice group Dietitians in Nutrition Support (DNS). Major Results: 1. Total usable response rate = 23% (n=135): RDs 38% (n=75); physicians 15% (n=60). 2. Forty-seven percent of physicians and 72% of RDs believe protein should be included in the calorie calculations of parenteral nutrition (PN). 3. Forty-seven percent of physicians and 84% of RDs believe protein should be included in the calorie calculations of enteral nutrition (EN). 4. Case studies showed that practitioners included protein in their calculations at percentages similar to their beliefs. Major Conclusions: 1. There is a significant difference in how the physicians and RDs responded in this study with regard to the inclusion or exclusion of protein in nutrition support: physicians were undecided, and dietitians favored inclusion of protein. 2. Based on the case study comparison, physicians and RDs were consistent between their practices and their beliefs about protein inclusion/exclusion.
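To see why the inclusion question matters arithmetically, here is a brief sketch using the conventional fuel values for parenteral macronutrients (dextrose 3.4 kcal/g, lipid roughly 10 kcal/g, amino acids 4 kcal/g); the regimen amounts below are invented for illustration, not drawn from the study:

```python
# Hypothetical parenteral regimen illustrating how including or excluding
# protein (amino acids) changes the calorie total. Fuel values are the
# conventional ones used in nutrition support; gram amounts are made up.

dextrose_g, lipid_g, amino_acids_g = 250, 50, 80

nonprotein_kcal = dextrose_g * 3.4 + lipid_g * 10   # 850 + 500 = 1350
protein_kcal = amino_acids_g * 4                    # 320
total_kcal = nonprotein_kcal + protein_kcal         # 1670

# Excluding protein understates the regimen by ~19% in this example
print(nonprotein_kcal, total_kcal)
```

The roughly 300-calorie gap between the two totals is the practical stake of the belief difference the survey documents.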


What are the usage patterns and reasons for use of dietary supplements in community dwelling elders (over 65 years) with and without Alzheimer's dementia?

Electra Moses, 2002

One hundred twenty-four community dwelling elders over 65 years, with and without Alzheimer's dementia, living in New Jersey were surveyed via telephone to determine their usage patterns of dietary supplements. Seventy-nine women (63.7%) and 45 men (36.3%), with ages ranging from 65 to 95 years, participated in the study. There was no significant difference in age between the subjects with and without dementia. Almost one third (n=37, 29.8%) had dementia, and 87 (70.2%) did not. The majority of subjects reported use of dietary supplements (78.4%, n=29 of participants with dementia; 72.4%, n=63 of participants without dementia). Vitamin E (dementia: n=23, 38.3%; non-dementia: n=37, 61.7%), vitamin C (dementia: n=13, 39.4%; non-dementia: n=20, 60.6%) and multivitamins (dementia: n=22, 37.3%; non-dementia: n=37, 62.7%) were the vitamin supplements most frequently reported by the population. Individuals with dementia used more vitamin E than those without dementia and used significantly larger doses of vitamin E (p=0.004). Calcium (n=30, 24.2%) was the most frequently reported mineral supplement. Other dietary supplements reported were Ginkgo biloba (n=3, 3.3%), garlic (n=5, 4.0%), glucosamine (n=7, 5.6%) and coenzyme Q10 (n=7, 5.6%). The data suggest that community dwelling elders over 65 years with and without Alzheimer's dementia living in New Jersey use a variety of dietary supplements. Usage patterns of vitamin and mineral supplements are similar for individuals with and without dementia, with the exception of vitamin E.


Testing the Reliability of United States Knowledge and Skill Dysphagia Risk Screening Survey Questions in a Sample of Canadian Registered Dietitians

Tracy Lister, 2015

Background: Reliable questions are needed for assessing registered dietitian (RD) dysphagia risk screening knowledge and skill competency. Objective: To determine the reliability of the Dysphagia Risk Screening Survey knowledge and skill questions with a sample of Canadian RDs. Design: Prospective internet survey administered twice within two weeks. Participants/Setting: A convenience sample of Canadian RDs was obtained by recruiting through a listserv. Statistical Analyses Performed: Descriptive statistics were reported for demographic and professional characteristics. Question reliability was determined utilizing McNemar's test, a kappa statistic and a weighted kappa statistic. Results: Twenty RDs completed both surveys and met the inclusion criteria of working in clinical practice with adults. Thirty percent (n=6) of the respondents spent >50.0% of their time working with patients with dysphagia, 50.0% (n=10) worked in long-term care and 65.0% (n=13) had received prior training and/or education on dysphagia risk screening. There was no significant difference (p<0.05) in responses for 16 of the 17 knowledge questions utilizing McNemar's test. Using a kappa statistic, the frequency of responses to three knowledge questions had good agreement (k=0.60-0.75). Using a weighted kappa statistic, the frequency of responses to three of the 20 skill questions had excellent agreement (kw>0.75), five had good agreement (kw=0.60-0.75) and 12 had poor agreement (kw<0.60). Conclusion: Three knowledge and eight skill questions had good or excellent agreement when comparing responses across the two surveys completed by this sample of Canadian RDs. Additional question testing is required before the questions are utilized to assess the dysphagia risk screening knowledge and skill competency of RDs.
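For readers unfamiliar with the agreement statistics used above, a minimal sketch of the unweighted kappa calculation follows; the paired test-retest responses are invented, and the 0.60-0.75 "good agreement" band mirrors the thresholds quoted in the abstract:

```python
# Minimal sketch of the unweighted (Cohen's) kappa used to gauge
# test-retest agreement between two survey administrations.
# The paired responses below are invented for illustration.
from collections import Counter

def cohens_kappa(pairs):
    """pairs: list of (first_response, second_response) tuples."""
    n = len(pairs)
    observed = sum(a == b for a, b in pairs) / n          # raw agreement
    first = Counter(a for a, _ in pairs)
    second = Counter(b for _, b in pairs)
    # chance agreement from the marginal response frequencies
    expected = sum(first[c] * second[c] for c in first) / (n * n)
    return (observed - expected) / (1 - expected)

# 20 hypothetical yes/no answers given twice, two weeks apart
pairs = [("y", "y")] * 12 + [("n", "n")] * 5 + [("y", "n")] * 2 + [("n", "y")]
k = cohens_kappa(pairs)
print(round(k, 2))  # values of 0.60-0.75 would count as "good agreement" here
```

The weighted kappa used for the skill questions extends this by crediting partial agreement between ordered response categories, rather than scoring each mismatch as zero.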


Identifying Nursing Home Residents At Risk For Weight Loss Using The Minimum Data Set

Susan Musilu, 2005

Objective: The purpose of this study was to assess the relationship between weight change and the ability to self-feed, the number of medications taken and oral problems among nursing home residents over a six-month period from admission, using data from the Minimum Data Set (MDS) completed for each resident. Design: Retrospective study; all data were collected from each participant's completed MDS on admission and at three and six months post admission. Subjects: One hundred nursing home residents 65 years of age or older. Statistical Analysis: Frequency distributions with means and standard deviations were calculated for continuous variables. Independent t-tests were used to examine differences among residents. Correlation coefficients and chi-square analyses were also used. Results: Between admission and six months, self-feeders gained significantly more weight (p=0.003) (mean = 2.4 lbs, SD±12) than non-self-feeders, who lost a mean of 5.9 lbs (SD±14). Between three and six months, those with the ability to self-feed gained significantly (p=0.05) more weight (1.6 lbs, SD±5.4) than those without the ability to self-feed, who lost a mean of 1.3 lbs (SD±7.6). Between admission and six months, the change in BMI among self-feeders was significantly (p=0.009) higher (mean = 0.33, SD±2.1) than the change in BMI (-0.94, SD±2.4) among non-self-feeders. Between three and six months, self-feeders' change in BMI (mean = 0.2, SD±1.0) was significantly higher (p=0.04) than the change in BMI (mean = -0.2, SD±1.3) among non-self-feeders. Those with the ability to self-feed (n=71, 71.0%) had significantly fewer (p<0.001) oral problems at three months than those without the ability to self-feed. At six months, 14% of the residents (n=14) did not have the ability to self-feed and experienced oral problems; those with the ability to self-feed had significantly fewer (p=0.001) oral problems than those without it. 
At three months, there was a moderate correlation (r = -.43, p<0.005) between the ability to self-feed and oral problems. At six months post admission, there was a moderate correlation (r = .35, p<0.005) between the ability to self-feed and oral problems. Between admission and six months, there was a significant moderate correlation (r = -.30, p = 0.003) between change in weight and the ability to self-feed. There was also a significant but weak correlation (r = -.26, p = 0.009) between change in BMI and the ability to self-feed. Conclusion: The study found a significant difference in mean weight change between residents with the ability to self-feed and those without it. Those with the ability to self-feed gained weight, whereas those who required assistance with meals lost weight. Residents with oral problems lost more weight than those without oral problems, and residents who required assistance with meals had significantly more oral problems than those who did not. This demonstrates that some of the items in the MDS can be useful in identifying residents at risk for weight loss, so that early intervention can be initiated to reduce weight loss.


The relationship between dietary folate and mean corpuscular volume (MCV) levels following increased folate fortification.

Andrea Nepa, 1999

INTRODUCTION: The role of the B-vitamin folate in reducing the risk of neural tube defects (NTDs), cardiovascular disease and stroke is now being recognized by the medical community. As a result, it is currently proposed that the Recommended Dietary Allowance (RDA) for folate be doubled to 400 mcg/day (based on the new Dietary Reference Intake (DRI) guidelines). The Food and Drug Administration (FDA) recently mandated the fortification of grains and cereals with folate in order to decrease the incidence of NTDs resulting from folate deficiency. A known consequence of folate deficiency is macrocytic anemia, which is correlated with elevated MCV levels. This study examined the relationship between usual dietary folate intake and serum MCV levels in hospitalized elderly patients. The purpose was to determine if serum MCV levels could be used to identify individuals who may have folate deficiency (MCV is a readily available index in the hospital setting). HYPOTHESIS: MCV levels will be normal in individuals who meet or exceed the present (1989) RDA for folate in their diets on an average daily basis; MCV levels will be elevated in individuals who consume less than 50% of the RDA for folate on an average basis. DESIGN & METHODS: Forty-four elderly (>65 years old) inpatients at a southern New Jersey community hospital were recruited from 8/98 to 11/98 if they consented to participate (42 were included). Individuals who had altered MCV levels for reasons other than folate deficiency were excluded. Individuals with known vitamin B12 deficiency or risk factors for vitamin B12 deficiency were also excluded, as vitamin B12 deficiency can likewise result in elevated MCV. MCV level was obtained from the medical record. All subjects completed a Food Frequency Questionnaire (FFQ) administered by the principal investigator. The FFQ was computer analyzed utilizing Berkeley Nutrition Services. 
Data were generated to determine the folate content of each subject's usual diet (including supplements), the major sources of folate in the diet and the average number of servings from fruits, vegetables and grains. RESULTS: Of the 42 subjects (20 female, 22 male), 2 were African American and 40 were Caucasian. All were independent-living, non-institutionalized elders ranging in age from 65 to 86 (mean age 75). Cereals and grains made the greatest contribution to folate intake. The average daily intake of fruit and vegetable servings (not including potato) was 3.8; of grains and cereals, 2.8 servings; and of sweets, 2 servings. Total folate intake from both food and supplements averaged 655 mcg/day. Mean folate intake from food only was 465 mcg/day, twice the RDA (currently 200 mcg/day for men and 180 mcg/day for women). All subjects met at least 100% of the RDA for folate with food alone, and approximately half met the RDI of 400 mcg/day for folate by diet alone. Nineteen subjects used folic acid-containing supplements. MCV levels for all subjects ranged from 82.3 to 104.9 fL (mean = 91 fL; normal range = 81-99 fL). Three subjects had elevated MCV levels (>99 fL). No significant difference was found between the mean MCV concentrations of supplement and non-supplement users (p=0.76) based on Student's t-test. The mean MCV difference between the low RDI and high RDI groups was not statistically significant (p=0.16) based on Student's t-test. The correlation between MCV and total folate intake was not statistically significant (p=0.41) among all subjects based on the Spearman rank test. DISCUSSION: All subjects in this study population achieved adequate dietary intake of folate in terms of meeting the present RDA. Grains and cereals contributed more than any other food to total folate intake, indicating that folate fortification may have contributed to better folate intake. 
(Note that some cereals were already fortified prior to the FDA mandate, but the FDA has allowed an increase in folate fortification of cereals per serving.) All subjects were living independently prior to hospital admission, which may account for the relatively good dietary folate intake among all subjects. Also, patients who were sicker or older may have been more likely to decline to participate secondary to fatigue or feeling ill, which may suggest increased risk of malnutrition among non-participants. Although the FFQ is well suited to hospitalized patients because it estimates average nutrient intake prior to admission, it cannot include all foods eaten by all individuals, and it relies on the subject estimating portion sizes and frequency of intake. Moreover, the bioavailability of folate was not accounted for in this study. Among the subjects with elevated MCV, the subject with the highest MCV (104.9 fL) had folate intake that exceeded the RDA but not the RDI; this subject had serum folate and vitamin B12 levels within normal limits. In clinical practice, MCV levels much higher than the normal range may be indicative of folate or vitamin B12 deficiency. In this study there was no significant relationship between MCV and total folate intake. However, only 3 subjects had elevated MCV levels, which is not a large enough sample to detect a significant relationship between elevated MCV and folate intake. Most patients with high MCV levels were excluded during the screening process because of known factors that affect MCV levels independent of folate nutriture. The majority of subjects had normal MCV levels and good folate intake. MCV has a normal range and does not decrease as folate intake increases; rather, low folate status is known to result in abnormally high MCV. Because all subjects had folate intake above the RDA, one would not expect to see a relationship between MCV and folate, as both variables were within the normal range. 
No subject had folate intake below the RDA, so it was not possible to assess whether low intake results in an altered MCV level. CONCLUSION: These findings reveal that the subjects in this study, all hospitalized free-living elders, were well-nourished in terms of meeting at least the RDA for folate through diet. Nevertheless, there was room for improvement in increasing folate intake to the level of the RDI. It is anticipated that this could be achieved by increasing intake of beans as well as fruits, vegetables and grains, to be more consistent with the Food Guide Pyramid. This study did not demonstrate that folate intake can be predicted by MCV, since most of the subjects had normal MCV and all had good folate intake. In the future it would be useful to examine the relationship between folate intake and elevated MCV specifically, to more accurately assess whether elevated MCV can predict poor folate intake. Assessment of individuals with low folate intake in relation to MCV would also be useful in determining this relationship. In addition, it would be interesting to measure the dietary intake of folate among institutionalized elderly. These measures would serve as a basis for an easy and inexpensive screening tool to detect deficiency of this important nutrient.


The twenty-four-hour nutrition screen: where one contract food service company is in the process.

MaryBeth Ostrowski, 1998

Objective: To identify twenty-four-hour screening methods utilized by Marriott Management Services (MMS) acute-care hospitals in the Northeast and to determine whether any methods facilitate a higher percent achievement of the screen. Design: A descriptive study conducted via a mail survey to all MMS Clinical Nutrition Managers (CNMs) in the Northeast (Maine to Virginia). The entire population was included in the study. Subjects: 78 CNMs responded to the mail survey, a 90.7% total response rate. Results: The JMP IN statistical software package was used to analyze the data; frequencies, mean values, and Pearson's chi-square were used. Frequency data revealed that the RN (82.1%) is the clinician most often responsible for completing the 24-hour nutrition screen, with the RN's initial assessment being the most common mechanism (89.6%) and place of documentation (57.4%) in responding facilities. The RD completed the screen 38.8% of the time, most often utilizing chart review (46.3%) and contact with the patient or significant other (42.3%), and documented most often in the diet kardex (36.8%) and the RN initial assessment (21.1%). The diet technician (DT) completed the screen in 19.4% of responding facilities, most often utilizing chart review (76.6%) and computer data review (23.1%), and documented most often in the diet kardex (45.5%) and the RN initial assessment (27.3%). Reported compliance with the screening procedure had a mean of 79.8%. No statistically significant differences were found between screening compliance and the person completing the screen, the mechanism used for screening, the place of screening documentation, or the most frequently used screening parameters. Applications: Nutrition screening is an important mechanism for the identification of patients at nutritional risk in acute-care settings. 
The findings of this study suggest that within MMS acute-care institutions in the Northeast, this screening is primarily completed by the RN. Nutrition education of nursing professionals is essential for appropriate completion of the nutrition screen, as well as for referral to the RD, the nutrition expert, for nutrition assessment, evaluation, and monitoring.


Expectations of the New Jersey Dietetic Association, 2003-2004, Current and Dropped Members

Marie Sackowitz, 2004

A telephone survey was conducted of all individuals who did not renew their NJDA membership in 2003, and their responses were compared to those of an equal number of renewed members. Of the 295 dropped members, 105 were surveyed, plus 22 who renewed late. Statistical analysis was performed using SAS software, Version 8.2 (SAS Institute, Cary, NC). Demographic questions were analyzed using frequencies, means and standard deviations. Expectation questions were analyzed using chi-square, Student's t-tests and ANOVA for comparisons. Results supported the hypothesis that differences existed between groups. Overall satisfaction with the Association was rated on a 1 to 5 Likert scale. Members' satisfaction with ADA's ample sources of information received the top score (4.19), and ADA's positions on nutrition topics ranked second highest (3.95). Dropped members gave the lowest scores to satisfaction with dues (2.52) and the opportunity to gain leadership skills from NJDA/ADA (2.57). Significantly more members renewed if they were involved in meetings (60% vs. 33%), attended FNCE/NJDA (60% vs. 32%), joined DPGs (68% vs. 22%), served on dietetic boards (80% vs. 15%) or used Internet resources (68% vs. 23%) than dropped members. Students (60%), DTRs (62%), and members born in Generation X (54%) and Y (72%) dropped their membership in 2003; this may be partially due to the 2002 FNCE in Philadelphia. The study suggests that NJDA/ADA needs to engage the younger generations, DTRs and students in Association activities to keep the momentum growing for the dietetic professional association and to solidify the association for the future.


Increased Physical Activity Leads to Improved Health-Related Quality of Life among Employees Enrolled in a 12 Week Worksite Wellness Program

Stephanie Macaluso, 2015

Objective: To determine the relationship between changes in physical activity (PA) and health-related quality of life among university employees who enrolled in a worksite wellness program (WWP). Methods: The study was an interim analysis of data collected in a WWP. The sample consisted of 64 participants who completed 12 and 26 week follow-up appointments. Results: Self-reported anxiety days significantly decreased from baseline to week 12. There were positive trends in self-rated health, vitality days, and summative unhealthy days from baseline to week 26. Among those with a self-reported history of hypertension (HTN), there was an inverse correlation between PA and summative physically and mentally unhealthy days at week 12. Conclusions: Among participants in this WWP with HTN, as PA increased there was a significant decrease in summative physically and mentally unhealthy days at week 12.


Impact of a six-hour nutrition education program on nutrition knowledge of pediatric physician residents

Rachel Teneralli, 2004

Objective: To determine the impact of a six-hour nutrition education program on the nutrition knowledge of pediatric physician residents. Design: Prospective cohort study utilizing a nutrition education program for pediatric residents at an urban teaching hospital. One-hour lectures were taught at noon conferences for six consecutive weeks and posted on the residency's website. A 40-question multiple-choice exam assessed pre- and post-intervention knowledge. Subjects/Settings: A convenience sample of 35 physician residents of a pediatric residency program in New Jersey for academic year 2003-2004. Statistical Analyses: An analysis of variance (ANOVA) assessed post-intervention knowledge differences among the residency years. Because significant differences existed at pre-intervention, an analysis of covariance (ANCOVA) was conducted post-intervention. Scheffe post-hoc analysis determined where the differences between the years occurred. Paired t-tests analyzed the change in knowledge for each residency year. Repeated-measures ANOVA investigated whether there was a significant increase in nutrition knowledge over time for all residency years combined. Correlations between attendance and test scores were examined. Results: Test scores were significantly different between the residency years at pre-intervention (p=0.0013) and at post-intervention (p=0.025). At pre-intervention, first-year scores were significantly lower than second- and third-year scores (p=0.05), and at post-intervention, third-year adjusted mean scores were significantly higher than second-year scores (p=0.05). There was a significant change in mean test scores for all residency years combined (p=0.004), as well as a change in knowledge for each residency year. There was no relationship between attendance and test scores. Applications/Conclusion: The study reaffirms the benefit of incorporating nutrition education into pediatric residency training and supports the need for a formalized nutrition education program. 
The results were not directly related to attendance; however, a significant improvement in nutrition knowledge was demonstrated for all participants. The study suggests that traditional noon conferences may not be the most effective way, or the sole contributor, to increasing nutrition knowledge.


Taste sensitivity to 6-n-propylthiouracil and its effect on food preferences

Natalie Ullrich, 2000

Taste is the predominant reason for selecting food items. Studies in taste genetics have shown that individuals vary in taste sensitivity, which can influence food preferences. Laboratory studies using 6-n-propylthiouracil (PROP) have identified individuals who are sensitive to its bitter compounds. Approximately 70% of adult Caucasians in North America are able to taste PROP (tasters), whereas 30% find it tasteless (non-tasters). The bitter quality of PROP is similar to the bitter compounds found in certain foods, such as cruciferous vegetables and certain fruits. Studies have shown that PROP non-tasters are less sensitive to the hotness of chili peppers, hot sauces and caffeine. Few studies have examined PROP taste sensitivity and its effect on food preference in adults. The objective of this study was to examine the food preference differences between PROP tasters and non-tasters, and how gender influences those preferences, using a food preference survey. The standard method of classifying PROP taster status was used, along with the completion of a food preference survey. A total of 219 adult men and women were recruited for the study. Statistical analysis was performed using SAS (1990) with a significance level of p<0.05. Chi-square analysis was used to examine the 65 individual foods and the different varieties of select foods. Individual foods were categorized into food groups (fruits, vegetables, condiments and alcohol), and factor analysis was used to create subgroups within each food category based on "liking". For each participant the total number of foods "liked" was calculated and analyzed using Friedman's two-way ANOVA to examine the number of foods liked within each subgroup based on taster status and gender. A significant difference as a function of gender was found in the alcohol and condiment subgroups. 
Males liked olives (black, green, Greek) (p=0.03), hot condiments (hot pepper, salsa, hot sauce) (p=0.01), and hard liquors (gin/bourbon) (p=0.01) more than females. Non-tasters preferred red/white wine and beer (p=0.03) more than tasters, and male non-tasters preferred all types of condiments (p=0.02) more than male tasters. Women liked vegetables (turnips, greens, endive) (p=0.03) more than men, independent of taster status. These data suggest that a relationship exists between gender, PROP taster status and food preferences for strong and hot tasting foods. Specifically, men's taster status can influence the acceptance of certain types of foods and can be used to understand male dietary patterns.


Professional Practices of Registered Dietitians Regarding Complementary & Alternative Medicine use for Individuals with HIV/AIDS

Alpa Vyas, 2004

Use of complementary and alternative medicine (CAM) is prevalent among individuals with HIV/AIDS. There is limited scientific research on CAM use in HIV/AIDS, and at present no evidence-based guidelines exist for health professionals to use when working with individuals with HIV/AIDS. The purpose of this study was to determine the professional practices of RDs regarding CAM for individuals with HIV/AIDS and to identify the demographic characteristics of RDs that affect those practices. A survey was mailed to all members (N=655) of the HIV/AIDS dietetic practice group; 249 usable RD surveys were returned. Approximately 70% of RDs recommended CAM in patient care. RDs with formal training in CAM, those personally using CAM, and those with a higher level of education (Master's or Doctoral degree, completed or in progress) were significantly (p < .014) more likely to recommend CAM in patient care. There were no correlations between recommendations for CAM and age or years in professional practice. The most frequently recommended CAM therapies were exercise (96%), calcium (90%), B-complex vitamins (86%), vitamin C (84%), and flaxseed (68%). Over 40% of respondents had no recommendation or opinion about garlic or St. John's Wort, the herbals with the most evidence warranting caution with HIV medications. Over 70% had no recommendation or opinion on other herbals commonly used by individuals with HIV/AIDS, and 60% recommended CAM therapies outside the realm of biological-based therapies. The results of this study suggest the need to incorporate scientifically sound CAM education into dietetics curricula at all levels, and that RDs working with the HIV/AIDS population in particular need to be cognizant of the CAM therapies commonly used by their patient population and to educate themselves accordingly to be effective healthcare providers.


Early Oral Feeding as Compared to Traditional Timing of Oral Feeding After Upper Gastrointestinal Surgery: A Systematic Review and Meta-Analysis

Kate Willcutts, 2015

Objective: To compare the effects of early oral feeding to traditional (or late) timing of oral feeding after upper gastrointestinal (GI) surgery on clinical outcomes. Background: Early postoperative oral feeding is becoming more common, particularly as part of multimodal or fast-track protocols. However, concerns remain about the safety of early oral feeding after upper GI surgery. Methods: Comprehensive literature searches were conducted across five databases from January 1980 until June 2015 without language restriction. Risk of bias of included studies was appraised, and random-effects model meta-analyses were performed to synthesize the outcomes of anastomotic leaks, pneumonia, nasogastric tube (NGT) reinsertion, reoperation, readmissions and mortality. Results: Fifteen studies comprising 2112 adult patients met all the inclusion criteria. Mean hospital stay was significantly shorter in the early fed (EF) group than in the late fed (LF) group (weighted mean difference = -1.72 days; 95% CI, -2.20 to -1.25; P < 0.01). Postoperative length of stay was also significantly shorter (weighted mean difference = -1.44 days; 95% CI, -2.20 to -0.68; P < 0.01). There was no significant difference in risk of anastomotic leak, pneumonia, NGT reinsertion, reoperation, readmission or mortality in the RCTs. The pooled RCT and non-RCT results, however, showed a significantly lower risk of pneumonia in EF as compared to LF (OR = 0.6; 95% CI 0.41-0.89; P = 0.01). Conclusions: Early postoperative oral feeding as compared to traditional (or late) timing is associated with shorter LOS and is not associated with an increase in clinically relevant complications.
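The random-effects pooling of weighted mean differences described above can be illustrated with a short sketch. The per-study differences and standard errors below are hypothetical illustration values, not data from the review; the code implements standard DerSimonian-Laird pooling:

```python
import math

def pool_random_effects(diffs, ses):
    """DerSimonian-Laird random-effects pooling of per-study mean differences.
    diffs: mean difference from each study; ses: their standard errors."""
    w = [1.0 / se**2 for se in ses]                      # inverse-variance weights
    fixed = sum(wi * d for wi, d in zip(w, diffs)) / sum(w)
    # Cochran's Q and between-study variance tau^2
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, diffs))
    df = len(diffs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    w_star = [1.0 / (se**2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * d for wi, d in zip(w_star, diffs)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-study differences in hospital stay (days) and their SEs
pooled, (lo, hi) = pool_random_effects([-1.5, -2.0, -1.2], [0.4, 0.5, 0.6])
```

A pooled estimate whose 95% CI excludes zero, as in the length-of-stay results above, is read as a significant difference between groups.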


The Knowledge, Personal Use and Professional Practice of Registered Dietitians Nationwide Regarding Soy

Liana Weitz, 2001

Objective: To examine the knowledge, personal practices and professional recommendations of registered dietitians regarding the use of soy. Design: A survey was developed to collect descriptive data on the knowledge and the personal and professional practices regarding soy of Registered Dietitians who are members of the American Dietetic Association. A pilot was conducted with clinical RDs. Knowledge of soy was assessed via a nine-question quiz. Personal and professional practices were obtained from food frequency lists. Reasons for use and professional advice given were collected. Demographic data were obtained for comparisons. Subjects/setting: The survey was mailed in June 2000 to 500 RDs randomly selected from the ADA database of individuals selecting the clinical area of membership. Usable data were collected from 286 (56.2%) respondents. Statistical analyses: Descriptive analyses of knowledge and of personal and professional practice regarding soy were done using JMP-IN (Version 3.26); significance was set at alpha < 0.05. Analyses of level of knowledge versus personal practice and of level of knowledge versus professional practice were done with logistic regression. Chi-square was used to analyze the relationship between personal and professional practice. Logistic regression was used to analyze the relationship among all three variables. Results: Respondents had a moderate overall knowledge score for soy (mean = 4.21, range -3 to +9). Respondents had a high level of knowledge of the potential health benefits of soy (mean = 2.44, range 0-3) but a low level of knowledge of sources of isoflavones (mean = 0.95, range 0-3) and of cooking/preparation (mean = 0.82, range 0-3). Half of the respondents (50%, n = 144) reported use of soy/soy products. Two-thirds of respondents (68.29%, n = 140) indicated that they provide professional advice regarding soy to patients.
Respondents who had a higher level of knowledge were more likely to use soy products (p = 0.0006, logistic regression, whole-model test) and to advise patients on the use of soy (p = 0.002, logistic regression, whole-model test). Individuals who used soy products were more likely to advise patients on the use of soy products (p = 0.009, Pearson's Chi-square). There were no significant relationships among the independent variables of knowledge and personal practice with professional practice when these variables were combined (p = 0.37, logistic regression). Applications/conclusions: Respondents had a high level of knowledge of the health benefits of soy but were not familiar with cooking/preparation issues or sources of isoflavones in soy products. Personal practices influenced professional practices. Based on the responses of a representative sample of clinical dietitians, there is a need for further education of RDs regarding sources of isoflavones and the ability to translate this knowledge into practical methods for consumer use. The ADA, as the professional organization for food and nutrition experts, is the most appropriate organization to consider providing future continuing education regarding soy to RDs.
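The Pearson chi-square comparison of personal and professional practice above is a 2×2 test of association. A minimal sketch follows; the counts are hypothetical illustration values, not the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]], e.g. rows = uses soy (yes/no),
    columns = advises patients on soy (yes/no)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts: soy users vs. non-users, cross-classified by
# whether they advise patients on soy
stat = chi_square_2x2(90, 54, 50, 92)
# With 1 df, a statistic above 3.84 is significant at alpha = 0.05
```

The shortcut formula n(ad - bc)² / [(a+b)(c+d)(a+c)(b+d)] is algebraically identical to summing (observed - expected)²/expected over the four cells.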


Effect of individual versus group diabetes outpatient education on weight change, hemoglobin A1c and clinical laboratory values in adults with type 2 diabetes mellitus 

Linda Vero, 2007

Objective: To compare the effectiveness of delivering diabetes nutrition education in a group versus an individual setting on patient outcomes. Design: Retrospective chart review. Participants/setting: Medical records for 160 patients with type 2 diabetes mellitus (T2DM) with a BMI ≥ 20 kg/m² who completed a six-week Diabetes Self-Management Education (DSME) program at an American Diabetes Association (ADA) Recognized Program. Intervention: Participants received either individual or group nutrition education in the DSME program. Main outcome measures: Clinical outcomes were measured at baseline and at six weeks for weight and BMI, and at baseline and at five to six months for HbA1c, total cholesterol, LDL-cholesterol (LDL-C), HDL-cholesterol (HDL-C) and triglyceride (TG) levels. Statistical analysis performed: Chi-square tests were used to determine differences between the two groups (individual vs. group nutrition education) for demographic and other characteristics at baseline. Independent sample t-tests were used to analyze change in weight from baseline to six weeks, and change in HbA1c and clinical laboratory values from baseline to five to six months, between the two groups. Change in clinical outcomes between the two groups was analyzed using 2x2 split-plot ANOVA. All analyses were performed using SPSS version 13.0. Alpha was set a priori at 0.05. Results: There were no statistically significant differences between the two groups for change in weight, BMI, HbA1c level and other clinical laboratory values. Participants in the DSME program lost a total of 2.47 ± 2.54 kg (5.4 ± 5.58 pounds) from baseline to six weeks. Participants in the DSME program had a mean reduction in HbA1c, total cholesterol, LDL-C and TG of 1.33%, 18.7 mg/dl, 9.76 mg/dl, and 55.24 mg/dl respectively, while HDL-C increased by a mean of 3.93 mg/dl from baseline to five to six months.
Conclusions: The DSME program was effective in improving HbA1c, total cholesterol, HDL-C, LDL-C and TG levels from baseline to five to six months and body weight and BMI from baseline to six weeks. Both types of nutrition education delivery methods (individual and group) are effective in improving outcomes. Applications: Diabetes educators can investigate various educational approaches to improve patient outcomes.


A survey of ON DPG RDs: Perceived level of practice as compared to the ON DPG SOP & SOPP and the Bradley Model

Kristina DiStefano, 2009

Background: There is considerable interest throughout the profession of dietetics in expanding opportunities for advanced-level Registered Dietitians (RDs) to include more specialized practice areas and functions such as practice-area-specific nutrition diagnosis, evidence-based practice, outcomes monitoring, order writing and the application/conduct of research in the practice setting. However, levels of dietetic practice are not universally defined in dietetics. Objective: To explore the relationships among demographic characteristics and perceived levels of practice (generalist, specialist and advanced) among RDs who are members of the Oncology Nutrition Dietetic Practice Group (ON DPG), in relation to their levels of practice according to the Standards of Practice (SOP) and Standards of Professional Performance (SOPP) specialty and advanced-level practice activities for RDs in oncology nutrition care and the Bradley model of advanced-level practice (ALP). Design: Members of the ON DPG during the 2008-2009 membership year were sent an e-mail invitation with an embedded link to an Internet-based survey and up to two follow-up e-mails over a six-week period. The Internet-based survey was designed to obtain demographic characteristics and to identify perceived levels of practice in overall dietetics and in oncology nutrition practice, the level of performance of the SOP/SOPP specialty and advanced-level practice activities, and specific criteria of Bradley's 1993 ALP model. Subjects: ON DPG members with valid e-mail addresses (n = 1417) who identified themselves as an RD, met the Commission on Dietetic Registration (CDR) definition for Oncology Nutrition Dietitian and completed an Internet-based survey; 25.3% (n = 358) of the sample met these criteria.
Statistical Analyses: Frequency distributions, χ², and independent t-tests (a priori alpha set at 0.05) were used to analyze demographic characteristics, the relationships between perceived levels of practice in oncology nutrition and the level of performance of oncology nutrition SOP and SOPP practice activities, and the relationships between perceived level of overall dietetic practice and specific criteria of Bradley's ALP model. Results: RD members of the ON DPG who perceived themselves as advanced-level practitioners (ALPs) in oncology nutrition selected significantly more (independent samples t-test, t = -2.794, p < 0.006) SOP/SOPP specialty-level practice activities (mean = 42.18, SD = 13.44) than respondents who perceived themselves as specialty-level practitioners (SLPs) (mean = 37.08, SD = 11.95) or generalist-level practitioners (GLPs) (mean = 28.78, SD = 11.21). RD members of the ON DPG who perceived themselves as ALPs in oncology nutrition selected significantly more (independent samples t-test, t = -3.781, p < 0.0001) SOP/SOPP advanced-level practice activities (mean = 21.00, SD = 8.18) than respondents who perceived themselves as SLPs (mean = 16.75, SD = 7.42) or GLPs (mean = 12.39, SD = 6.87). Among RD ON DPG members who perceived themselves as ALPs in overall dietetic practice, 38.8% (n = 33) met all four of the ALP criteria of Bradley's model explored within the parameters of this study. Applications/Conclusions: These data will help the ON DPG gain a better understanding of its members by exploring how their perceived levels of practice relate to their level of performance of SOP/SOPP specialty and advanced-level practice activities, which can help guide the development of continuing education programs and outcomes research to optimize the level of practice and performance among members.
Evaluation of the SOP/SOPP practice activities performed by ON DPG members who perceived themselves as ALPs and met the four ALP criteria of Bradley's model may offer perspective on what constitutes advanced-level oncology nutrition practice.


To determine the physician indications for total parenteral nutrition (TPN) initiation, the agreement between physician indications and American Society for Parenteral and Enteral Nutrition (ASPEN) Guidelines, the relationship between clinical characteristics and agreement with the ASPEN Guidelines, and the cost of TPN formula incurred when physician indications did not agree with the Guidelines

Kristen Finn, 2008

Objective: To determine the physician indications for total parenteral nutrition (TPN) initiation, the agreement between physician indications and American Society for Parenteral and Enteral Nutrition (ASPEN) Guidelines, the relationship between clinical characteristics and whether or not the patient had physician indications in agreement with the ASPEN Guidelines, and the cost of TPN formula incurred when physician indications did not agree with the ASPEN Guidelines. Design: This was a retrospective design using a medical record review. Three investigators independently determined if physician indications were in agreement with the ASPEN Guidelines. Subjects: The sample included 103 adult patients initiated on TPN over a six-month period in an urban acute care teaching hospital. Statistical Analysis: Descriptive statistics were reported for clinical characteristics and physician indications. Chi-square tests of independence and point-biserial correlation coefficients were conducted to determine the relationship between clinical characteristics and the agreement between physician indications and the ASPEN Guidelines. Results: The majority of patients (n=73, 70.87%) had physician indications for TPN initiation in agreement with the ASPEN Guidelines. The most common physician indication among patients with physician indications in agreement with the guidelines was a non-functional GI tract when SNS was indicated (n=55, 75.34%). The most common physician indication among patients with physician indications not in agreement with the guidelines was failure to thrive/poor oral intake (n=11, 36.67%). No significant relationships were found between clinical characteristics and whether or not the patient had physician indications in agreement with the ASPEN Guidelines. The estimated total cost of TPN formula among patients with physician indications not in agreement with the ASPEN Guidelines was $12,200.
Conclusion: Although the majority of patients in this study had physician indications for TPN initiation that were in agreement with the ASPEN Guidelines, noncompliance with guidelines resulted in avoidable healthcare costs.


Registered Dietitians' Perceived Benefits of Master's Degree Programs

Jody Koutz, 2008

Objective: To determine the differences in perceived benefits of master's degrees between those who obtained the degree prior to or concurrently with becoming an RD and those who returned for a master's degree after obtaining the RD credential. Design: This study used a prospective web-based survey administered through SurveyMonkey (41). RDs were sent an e-mail with an embedded link to an electronic survey designed to obtain demographic and education characteristics as well as perceived benefits of master's degree education. Subjects: The CDR provided 2000 random e-mail addresses of RDs holding current registration. A second sample of RD e-mail addresses was obtained by contacting program directors of master's degree programs that require the RD credential. Of the valid e-mail addresses obtained (n=1846), 30.0% (n=553) completed the survey. Statistical Analyses: Descriptive statistics, Pearson's Chi-square and Fisher's exact test were used to test the relationships between demographic/education characteristics and perceived benefits of master's degrees. Results: A master's degree was completed by 60.2% (n=333) of respondents who completed the survey. Of those with a master's degree as their highest level of education, nearly one quarter (n=82, 24.6%) completed the degree before becoming an RD, nearly one quarter (n=80, 24.0%) completed a combined master's degree with a DI/CP, and over half (n=171, 51.4%) obtained the RD credential before completing the master's degree. Key benefits that were similar among the groups included confidence in the ability to perform job functions and qualification for more positions. Respondents who had obtained the RD credential before completing a master's degree identified the greatest proportion of benefits compared to those obtaining the degree either before the RD credential or as a component of a DI/CP.
Applications/Conclusions: There are differences in the perceived benefits of a master's degree based on when the degree was obtained. Key findings from this study suggest that obtaining the RD credential before completing a master's degree produces greater perceived benefit than completing the degree either before or concurrently with the RD credential. Master's degrees tailored to those already holding the RD credential provide the greatest benefit.


Educational Needs Assessment for Registered Dietitians Interested in Advanced Clinical Nutrition

Carmen Tatum, 2007

Objective: To examine the perceived needs for clinical nutrition education and reasons for or against pursuing graduate degrees among Registered Dietitians in the U.S. Design: An electronic survey was sent to participants via e-mail with an embedded link to the survey. The survey contained 27 multiple-choice questions regarding perceived needs for advanced clinical nutrition education, reasons for or against pursuing graduate degrees, and demographic and career-related questions. Subjects: A random sample of 1000 RDs in the U.S. and a convenience sample of 166 RDs from the UMDNJ-SHRP Graduate Programs in Clinical Nutrition (GPCN) were sent the electronic survey. The usable response rate was 39.7% (n=425). Respondents were stratified into three groups for comparison: RDs without graduate degrees (Group A); RDs with entry-level graduate degrees (Group B); and RDs with or pursuing post-professional graduate degrees (Group C). Statistical Analyses: Descriptive statistics, Pearson Chi-square tests, and Pearson Product Moment Correlation Coefficients were used to test relationships between demographic characteristics and perceived clinical nutrition education needs among the three groups. The a priori alpha level was set at 0.05. Results: There were significant differences in perceived needs for advanced clinical nutrition education and in demographic characteristics among the three groups of RDs. Significantly more RDs with post-professional graduate degrees than RDs in the other groups indicated perceived needs for the following course titles: Applied Physiology (p=0.006), Clinical Management (p=0.028), Applied Clinical Research (p<0.001), and Outcomes Research (p<0.001). There was a positive correlation between age and perceived interest in didactic courses (p=0.01) and a supervised practice experience (p=0.022), and between years as an RD and perceived interest in didactic courses (p=0.04) and a supervised practice experience (p=0.004).
Application/Conclusion: Major findings in this study suggest that RDs at different points in their educational careers have differences in demographic and career-related characteristics as well as differences in perceived needs for advanced clinical nutrition education.


Characteristics and Roles in Ethical Decision-Making Among Registered Dietitians who Participate on a Bioethics Committee or as a Bioethics Consultant 

Britta Brown, 2008

Objective: To explore the relationships among demographic and education characteristics and roles in ethical decision-making, in relation to level of practice, among RDs who participate on a bioethics committee or as a bioethics consultant. Design: RDs who agreed to participate in this mixed-methods study and who met inclusion criteria were sent an e-mail with an embedded link to an electronic survey and were scheduled to complete a 15-20 minute semi-structured telephone interview. The electronic survey and telephone interview were designed to obtain demographic and education characteristics, roles in ethical decision-making, and level of practice. Subjects: RDs who self-identified to the American Dietetic Association's Ethics Committee as having experience as a bioethics committee member or consultant (n = 42) were contacted by telephone to determine their interest in and eligibility for participating in this study. The usable response rate for participants completing both the electronic survey and the telephone interview was 47.6% (n = 20). Statistical Analyses: Descriptive statistics and Fisher's exact test were used to test relationships between demographic and education characteristics and roles in ethical decision-making, in relation to level of practice. Qualitative themes were derived from coding telephone interview transcripts. Results: Twenty-five percent (n = 5) of subjects met 100% of Bradley's advanced-level practice characteristics, and an additional 45.0% (n = 9) met 70-99% of this model's characteristics. All subjects (n = 20) had more than eight years of professional experience and 65% (n = 13) had earned at least a Master's degree. Fourteen types of roles for RD involvement in ethical decision-making were identified during the telephone interviews, and based on survey data, ≥ 75% (n = 15) of subjects were involved in eight different types of ethical situations in their practice.
Applications/Conclusions: Key findings from this study suggest that RDs involved as bioethics committee members or consultants achieve many of Bradley's (32) advanced-level practice characteristics. Involvement in bioethics may represent an opportunity for specialty or advanced-level dietetics practice, but requires additional research.


A Survey of Current Job Functions of Renal Dietitians

Bonnie Thelan, 2007

Objective: To determine the current practice patterns of renal dietitians, including what job functions renal dietitians were performing, what barriers they felt impacted their professional practice, and demographic information. Design: This was a prospective study using an electronic Survey of Job Functions of Renal Dietitians. Subjects/setting: Surveys were e-mailed to 2,566 dietitians from the Renal Dietitians Dietetic Practice Group of the American Dietetic Association and the Council on Renal Nutrition of the National Kidney Foundation; 816 dietitians completed the survey and 747 responses were usable (29.1%). Statistical analyses performed: Descriptive statistics, frequency distributions, Chi-square and Fisher's exact tests were conducted. Chi-square was performed to identify relationships between job functions respondents reported performing, demographic characteristics and perceived barriers to dietitian practice. Fisher's exact test was used when greater than 20% of cells had fewer than five responses. Results: Respondents' frequency of performing job functions was documented. Job functions dietitians performed were affected by demographic characteristics and perceived barriers to professional practice. Dietitians with more than 10 years of renal experience were significantly more likely to perform a greater number of job functions, such as evaluating urea kinetic modeling (χ² = 32.95, p < .001) or evaluating dialysis adequacy (χ² = 24.16, p < .001), than those with less than 5 years of renal dietitian experience. Younger dietitians (24-35 years of age) were significantly less likely to frequently recommend a weight change (χ² = 8.23, p = .041), recommend a workup for inflammation (χ² = 16.96, p = .001), or recommend a dyslipidemia management plan (χ² = 16.42, p = .001) than dietitians in the older age groups.
Dietitians who worked in an outpatient facility were significantly more likely to frequently prescribe a renal diet (χ² = 13.39, p < .001), recommend renal vitamins (χ² = 9.81, p = .002) or evaluate IDWG (χ² = 32.24, p < .001) than those who did not work in an outpatient facility. Respondents generally disagreed that barriers affected their ability to practice. The majority of dietitians disagreed (n=650, 87.1%) that they had limited knowledge of bone management. Three-quarters of dietitians disagreed (n=563, 75.9%) that nursing was unsupportive. Dietitians with less than 5 years of dietitian experience were significantly more likely to agree that limited knowledge of bone management (χ² = 25.13, p < .001) or limited knowledge of dialysis adequacy (χ² = 40.17, p < .001) was a barrier to practice than those with more years of dietitian experience. Conclusions: This study documents the frequency and percent of dietitians performing job functions related to renal dietetics. The results document the variability in the role of the renal dietitian and strongly suggest that there are differing levels of practice within renal dietetics. Additional data on the impact of dietitian-to-patient ratio on job functions and frequencies are warranted. Future studies are needed to assist in defining these levels of practice.


The relationship between economic status, BMI, and body composition of third, fourth and fifth grade students enrolled at baseline in the CATCH study. 

Theresa Johnson, 2007

Objective: To explore the relationship between economic status, BMI, and body composition of third, fourth and fifth grade students enrolled at baseline in the CATCH study. Design: This study was a retrospective, secondary analysis of selected baseline data (November 2005 to March 2006) from the CATCH intervention study. These data included height, weight, skinfold thickness, and selected demographic data (gender, grade, and participation level in the school lunch program). The CATCH study is a three-year Healthy Schools Healthy Kids grant-funded intervention study with baseline and outcomes data collected by Troy University School of Nursing. Subjects: A convenience sample of children of both genders enrolled in third, fourth, and fifth grade classes in underserved school districts in the Alabama counties of Pike, Bullock, and Barbour (a control county in the CATCH study). A total of 394 children from nine schools in these rural or semi-rural districts participated in the study. Forty-one percent were male (n = 163, 41.4%) and 59% were female (n = 213, 58.6%). Statistical Analyses: Descriptive statistics (n and percent, mean, median, standard deviation, and range where appropriate) were used to display economic status, BMI z-score, and body composition of the sample. Independent samples t-test was used to determine if there was a significant difference in BMI z-scores and body composition between the group with known economic status and the group whose economic status was not known. Point-biserial correlation was used to test the relationship between economic status of third, fourth and fifth graders in the CATCH intervention study, BMI (reported as a z-score), and body composition. The a priori alpha level was set at 0.05. Results: There were no significant relationships between economic status and BMI z-score for the sample or subgroups (r = 0.076, p = 0.26, n = 222).
There was no significant relationship between economic status and body composition for the sample (r = -0.076, p = 0.26; n = 221); however, there were relationships between economic status and skinfold scores in females (r = -0.182, p = 0.04; n = 125), fifth graders (r = -0.29, p = 0.02; n = 66), third grade males (r = 0.43, p = 0.01, n = 34), fifth grade females (r = -0.36, p = 0.035, n = 34), and among Barbour county students (r = -0.40, p = 0.001, n = 79). Application/Conclusion: The findings of this study do not support the notion that lower economic status is positively correlated with overweight. There are, however, age and gender differences noted in the findings. The relationship between economic status and body mass index and body composition in children is complex, exerting varying degrees of influence subject to variables such as growth spurts and degree of poverty. Interventions for battling the pediatric obesity epidemic should not discount economic factors.
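The point-biserial correlation used above relates a dichotomous variable (e.g. an economic-status indicator) to a continuous one (e.g. BMI z-score); it is simply Pearson's r computed with a 0/1 variable. A minimal sketch with invented values, not the study's data:

```python
import math

def point_biserial(binary, continuous):
    """Point-biserial correlation: Pearson's r between a 0/1 variable
    and a continuous variable."""
    n = len(binary)
    mx = sum(binary) / n
    my = sum(continuous) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(binary, continuous))
    sx = math.sqrt(sum((x - mx) ** 2 for x in binary))
    sy = math.sqrt(sum((y - my) ** 2 for y in continuous))
    return cov / (sx * sy)

# Hypothetical data: 0/1 economic-status indicator vs. BMI z-scores
r = point_biserial([0, 0, 1, 1], [1.0, 2.0, 3.0, 4.0])
```

The sign of r indicates direction (here, whether the group coded 1 tends to have higher values), and its magnitude is interpreted like any Pearson correlation.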


Dietary Supplement Use and Disclosure Among Individuals with Diabetes

Michelle Szebalskie, 2004

Objective: The purpose of this study was to determine dietary supplement practices among individuals with diabetes and to explore the relationships between self-reported disclosure to physician(s) and the reasons for disclosure. Design/Method: A self-administered survey was distributed from August 2003 through January 2004. One hundred thirty adult subjects, including 72 (55.4%) males and 58 (44.6%) females ranging in age from 27 to 80 years, were recruited at group classes at two outpatient diabetes centers located in central NJ. Results: Dietary supplements were used by 60.8% of participants. Multivitamins (64.6%) were the most common dietary supplement used. General health and well-being (58.7%) was the most common reason for dietary supplement use. There were significant relationships between supplement use and race/ethnicity (p=.007) and educational level (p=.013). Women were more likely than men to use supplements for bone health (p=.08), joint pain (p=.09), and memory (p=.10). Supplement use was disclosed to the physician by 67.1% of supplement users. The primary reason for disclosure of supplement use was "I thought it was important for my doctor to know" (31.5%), followed by "My doctor asked" (27%). Similarly, "I did not think it was important for my doctor to know" (45.5%) and "My doctor did not ask" (36.4%) were the primary reasons for nondisclosure. Conclusions/Applications: More people are using dietary supplements and disclosing their use to their physicians than previously reported in the literature. Supplements were primarily used for wellness and health promotion. More information is needed on the prevalence and practices of dietary supplement use among individuals with diabetes. This and previous studies suggest that the patient's perception of what the physician needs to know, and whether or not the physician asked, were the primary reasons for disclosure.
This study supports the need for more education of physicians and other health care providers regarding open communication about CAM and dietary supplements with their patients.


The Effect of an Education Program on Carbohydrate Counting Knowledge of Inpatient Acute Care Registered Nurses Caring for Pediatric Patients with Type 1 Diabetes at a Community Teaching Hospital

Lauren Dorman, 2008

Objective: To examine the impact of a carbohydrate counting education program on the knowledge of Registered Nurses (RNs) caring for pediatric patients with type 1 diabetes. Design: A prospective cohort study utilizing a carbohydrate counting education program for pediatric RNs at a community teaching hospital was conducted. The education program included 45 minutes of lecture based on American Diabetes Association (ADA) content on carbohydrate counting. A 15-question knowledge-based pretest with six demographic questions assessed pre-intervention knowledge. The posttest was administered two weeks after the education program and contained the same 15 knowledge-based questions and four questions on the benefit and impact of the carbohydrate counting program. Subjects: Twenty-two RNs working on the pediatric unit at Monmouth Medical Center participated in the study. Statistical Analyses: Descriptive statistics were used to evaluate demographic and educational characteristics. A paired samples t-test evaluated the change in score from pretest to posttest. Pearson Product Moment Correlations evaluated relationships between demographic characteristics, pretest score and change in knowledge score. Independent t-tests and one-way ANOVAs assessed differences in the level of knowledge before the program and in the change in knowledge from pretest to posttest among grouped demographic characteristics. Due to the nature of this small pilot study, the a priori alpha level was set at 0.10. Results: The mean pretest and posttest scores were 8.41 and 13.82, respectively. Scores increased significantly (mean change = -5.41, t = -17.67, p ≤ 0.001). There were weak, non-significant relationships (r = ±0.00 to ±0.30, p > .10) between demographic characteristics such as age, years as an RN, years as a pediatric RN, and prior carbohydrate counting education when compared to pretest score and change in score.
No significant differences in pretest scores or change in score were found among academic qualifications (F = 0.70, p = 0.63) or nursing certifications (F = 1.03, p = 0.40). Application/Conclusion: This pilot study demonstrated that a carbohydrate counting education intervention for pediatric RNs significantly increased knowledge. Major findings in this study reflect a need for further research regarding pediatric RNs' knowledge of carbohydrate counting. Although nursing standards identify that RNs should be competent in this content area, they were not at the time of the pretest.
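The pretest-to-posttest comparison described above is a paired-samples t-test, where each nurse serves as her own control. A minimal sketch in Python with scipy, using invented scores rather than the study's data:

```python
from scipy import stats

# Hypothetical pretest/posttest knowledge scores (0-15 scale); these
# numbers are invented for illustration and are not the study's data.
pretest = [8, 9, 7, 10, 8, 9, 8, 7, 9, 10]
posttest = [13, 14, 12, 15, 14, 13, 14, 12, 14, 15]

# Paired-samples t-test: tests whether the mean within-subject change
# from pretest to posttest differs from zero.
t_stat, p_value = stats.ttest_rel(pretest, posttest)
mean_change = sum(b - a for a, b in zip(pretest, posttest)) / len(pretest)
```

Pairing is the reason the study used this test rather than an independent t-test: it removes between-nurse variability from the comparison.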


Impact of Nutrition Education in a Supermarket on the Purchases of Fruit and Vegetables

Natalie Menza, 2009

Objective: To determine the effects of two types of supermarket nutrition education interventions with adults on fruit and vegetable purchases, measured by analysis of supermarket sales data. Design: This study used prospective email-based communication to invite ShopRite Price Plus card customers to take part in the research. An electronic survey was used to collect demographic information. Customers were divided into two intervention groups: one that received written education material and one that received a dietitian-led supermarket tour. Sales information on fruit and vegetable purchases was collected for four weeks pre- and post-intervention in both intervention groups and compared using the supermarket's electronic scan data. Subjects: A sample of ShopRite Price Plus card holders in three counties in New Jersey who fell into the top 10% of customers by dollars spent. Ninety customers participated in this study. Seventy-seven percent were female, 52% had a college or graduate degree, 43% had an income of $75,000 or more, and 81% were Caucasian. Statistical Analysis: Descriptive statistics (mean, median, standard deviation, and range) and frequency distributions (n and percent) were used to display demographic information. An independent t-test was used to measure differences between intervention groups, and a paired t-test was used to measure change within each intervention group. Pearson product moment correlation coefficients and Point Biserial Correlation were used to measure the relationship between demographic characteristics and change in fruit and vegetable purchases. Results: Customers who attended the supermarket tour had a statistically significant increase (p = 0.05) in the amount of fruits and vegetables they purchased from pre- to post-intervention compared with customers who received written education material only. 
In addition, customers in the supermarket tour group had an increase in purchase amounts for individual fruit and vegetable categories (fresh, frozen, canned, dried, and 100% juice) from pre- to post-intervention; however, this change was not significant. In the supermarket tour group, a significant positive correlation was seen between the amount of fruits and vegetables purchased and age (p = .047). No significant relationships were found between the number of people in the household, gender, education level, income, or race and the change in the total number of fruit and vegetable purchases for either intervention group. Applications/Conclusions: Supermarket-based nutrition interventions are more effective than written education material alone at increasing the amount of fruits and vegetables purchased. Supermarket interventions allow education to be delivered at the point of purchase, where people make a large number of food purchasing decisions. Furthermore, this type of intervention can be measured by sales data, shown to be a more reliable method of collecting diet history than self-reported data. Fruits and vegetables are one important area of focus, but supermarket-based interventions allow the opportunity to educate on a multitude of topics in nutrition.


Nutrition-Related Alterations and Medication Adherence in HIV/AIDS

Caryl Moscardini White, 2004

Adherence to HAART has been directly related to achieving and maintaining viral suppression. There is emerging interest in how nutrition-related alterations may contribute to poor adherence. The purpose of this study was to evaluate the relationship between nutrition-related alterations (general symptoms, body habitus changes, and metabolic abnormalities) and self-reported medication adherence in HIV-positive men who have sex with men (MSM) in an urban internal medicine private practice. Seventy MSM were asked to complete a questionnaire regarding nutrition-related alterations they had experienced and their medication adherence. The RD interviewed the patients concerning nutrition-related alterations. There were no statistically significant findings between RD-reported nutrition-related alterations and self-reported medication adherence. However, statistical significance was found (p=0.05) between patient-perceived central adiposity and self-reported medication adherence. Overall self-reported adherence was 45.71% (n=32). Of the general symptoms, patients reported significantly more than the RD for weight loss (p=0.001), nausea (p=0.05), and diarrhea (p=0.001). Of the body habitus changes, patients reported significantly more than the RD for thinning of the hips and buttocks (p=0.006) and increased fat in the abdomen (p=0.0001). There were no significant differences in RD and patient report of metabolic abnormalities. Although there was no statistical significance, there were trends that may have clinical significance for studying the relationship between nutrition-related alterations and adherence. Additional studies are warranted to evaluate methods for determining the nutrition diagnosis of nutrition-related alterations in comparison to self-report.


Relationship of Restorative Training and Nutrition-Related Outcomes Among Institutionalized Elderly

Elizabeth Tschiffely, 2005

Learning Outcome: To identify whether restorative training influenced nutrition-related outcomes. Restorative training is a nursing program funded under the Omnibus Budget Reconciliation Act (OBRA) that encourages functional independence in the institutionalized elderly. Limited research is available on the effects of these programs and their long-term outcomes, but research is ongoing. Further, no research has been documented that examines the relationship between restorative training programs and nutrition outcome measures, such as weight. This research project examined the relationship between restorative training and nutrition outcome measures, including weight, feeding ability, and quality of life, over a sixteen-week period. Minimum Data Set (MDS) data previously captured by the nursing home were evaluated at baseline and 16 weeks for feeding ability, quality of life, and overall change in care needs for 40 participants who were enrolled in another approved study at the nursing home. Weights were gathered from the weight books on the unit where each participant lived. The Wilcoxon Signed Ranks Test compared baseline and week 16 measures of weight and quality of life. Results revealed significant correlations between BMI and feeding ability, but no significant differences were found between these measures and quality of life during this period. This pilot study is the first of its kind to evaluate the relationship between restorative training and nutrition outcomes. Further research, including larger studies over longer periods of time, is warranted to further evaluate the impact of restorative training on nutrition and other outcomes.


Is Physiological Health Status and Diet Compliance Altered when Metformin (Glucophage) Replaces Insulin in the Treatment of NIDDM Individuals?

Ann Marie McDade, 1996

The purpose of this study was to examine whether physiological health status and diet and exercise compliance are altered when Metformin (Glucophage) replaces insulin in the treatment of individuals with Non-Insulin-Dependent Diabetes Mellitus (NIDDM). A retrospective chart review was compiled from 37 adult NIDDM patients started on metformin drug therapy from May 1995 to November 1995 at the Princeton division of the Joslin Center for Diabetes who were previously on insulin and sulfonylurea drug therapy or insulin therapy alone. Physiological health status variables (weight, BMI, HbA1C) were examined six months before commencement of metformin therapy, at the start of metformin drug therapy, and six months later. Compliance with diet and exercise was also examined at the same time intervals. Descriptive statistics were performed with the JMP IN program (SAS Institute, Inc.). One hundred percent of patients were able to stop using insulin and attain beneficial changes in blood glucose control with metformin in combination with a sulfonylurea. Results showed that the difference in weight between the two groups was -6.304 lbs (p<0.0031), the difference in BMI was -1.058 (p<0.0030), and the difference in HbA1C was -1.016% (p<0.0039). There were trends in the group that was always diet compliant: this group experienced more weight loss than the group that was never compliant with the diet. Exercise compliance did not correlate with the changes seen in weight or HbA1C. The results demonstrated that metformin can be successfully given to patients previously on insulin and produce weight loss and decreases in BMI and HbA1C. Future studies need to address the effects of this regimen on cholesterol and triglycerides.


Changes in Cardiometabolic Risk Factors among Women in a Worksite Wellness Program

Raghda Ghussen Alraei, 2015

Objective: The primary aims were to assess changes in cardiometabolic risk factors, including weight, BMI, waist circumference (WC), percent body fat, systolic and diastolic blood pressure (SBP and DBP), surveillance hypertension, and the 10-year Framingham risk score, among female participants in a university-based worksite wellness program (WWP) at 12 and 26 weeks from baseline. The relationships between percent weight change and changes in SBP and DBP were also explored. Study Design: An interim analysis of data from the LIFT-UP (Lifestyle Intervention For Total health – a University Program) WWP. Data were analyzed using repeated measures analysis of variance, McNemar's test, and Pearson's correlations. Population: The sample included 57 female participants who completed 12- and 26-week follow-up appointments within 10-14 weeks and 24-28 weeks from baseline, by July 31, 2014. Results: The mean weight change at 26 weeks was -3.68 ± 7.52 (SD) lbs. There were significant reductions in weight (p < 0.001), BMI (p < 0.001), and WC (p < 0.001) at 12 and 26 weeks from baseline. SBP and DBP decreased significantly from baseline to 12 weeks (p = 0.018 and p = 0.007, respectively). The prevalence of surveillance hypertension declined from baseline to 12 weeks (p = 0.039). No significant relationships were found between percent weight change and changes in SBP and DBP at 12 and 26 weeks. Conclusions: Women enrolled in this WWP experienced improvements in several cardiometabolic risk factors at 12 and 26 weeks from baseline.


Is there a Predictive Relationship Between Food Consumption Patterns, Demographic Factors, and Sodium Intake in an Urban Outpatient Congestive Heart Failure Population?

Bernadette Wykpisz, 1999

Introduction: Congestive heart failure (CHF) is identified as one of the five leading hospital diagnoses among adults age 65 and older. Nonpharmacological therapies such as modification of a patient's lifestyle and dietary sodium restriction have been used for the treatment of this disease. Since there is no cure for heart failure, strategies to maximize health behaviors that improve dietary compliance have potential for improving outcomes for this costly disease. Efforts to understand the relationship between food consumption patterns, population demographics, and sodium intake in this specific population may lead to improved efficacy of patient care and lower hospitalization rates. Objective: The purpose of this research study was to investigate the relationship between food consumption patterns, individual sodium intake, and population demographics in an urban outpatient CHF population. Identifying a relationship between these areas provides information useful for developing an effective intervention that may improve compliance with dietary sodium restriction and lead to improved efficacy of patient care. Methods: The study was conducted in an outpatient heart failure clinic on adult clients at a university-affiliated, tertiary care medical center. Subjects completed a nutrition questionnaire, provided a 24-hour food recall to the principal investigator, and completed a three-day food record. Demographic characteristics were obtained from the medical chart. Results: The mean age of the subjects was 64 years, and 73% were male. The majority of participants were disabled (63%) and reported incomes of less than $10,000 per year. All 67 participants completed the nutrition questionnaire and 24-hour food recall, but only 29 subjects (43%) completed the three-day food record. Eighty percent of subjects were classified in the same sodium group regardless of the method used to collect the data. 
A Kappa statistic (0.62) indicated that the two methods used to collect sodium intake data had good reproducibility. Adding salt during cooking was significantly associated (p=.012) with higher sodium intake. The results support the importance of removing salt from cooking to help decrease the sodium intake of this specific population. Subjects who consumed more than three-fourths of their meals outside of the home had significantly (p<.0001) higher sodium intake values than those who prepared their meals at home. Eating food purchased or prepared outside of the home was positively associated (p=.0004) with sodium intake, indicating subjects were 38 times more likely (odds ratio = 38) to have an excessive sodium intake when meals were eaten outside of the home. Participants with higher incomes had significantly (p=.0018) higher sodium intakes. Conclusion: This study provides information useful to the registered dietitian in identifying factors associated with high or low risk of inappropriate sodium intake when evaluating food consumption patterns. Both methods used to collect sodium intake were consistent and comparable to each other. Participants willing to complete the three-day food record also had lower sodium intakes and may be more willing to continue to change their lifestyle for a positive outcome. Since the majority of the population was male with an average income of less than $10,000 per year, these research findings need to be tested in a larger population with various income levels. The findings can be used as a building block for future research on food consumption patterns, especially eating outside of the home, in the CHF population.
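The reproducibility claim above rests on Cohen's kappa, which measures agreement between two classification methods beyond what chance alone would produce. A minimal pure-Python sketch with invented classifications (the study itself reported kappa = 0.62):

```python
# Sketch of Cohen's kappa for two sodium-classification methods; the
# labels below are invented for illustration, not the study's data.
def cohen_kappa(a, b):
    """Chance-corrected agreement between two raters on categorical labels."""
    n = len(a)
    labels = sorted(set(a) | set(b))
    # Observed agreement: proportion of subjects classified identically.
    p_o = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement if the two methods were independent.
    p_e = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

recall_group = ["high", "high", "low", "low", "high", "low", "low", "high"]
record_group = ["high", "low", "low", "low", "high", "low", "high", "high"]
kappa = cohen_kappa(recall_group, record_group)
```

Values around 0.6-0.8 are conventionally read as "substantial" agreement, which is why the study's kappa of 0.62 supports using either collection method.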


The Impact of Medical Nutrition Therapy Provided by a Registered Dietitian on Nutrition Parameters and Functional Status in Weight Losing Cancer Patients in an Ambulatory Care Center

Maria Plant, 2006

Objective: The purpose of this study was to examine the effects of Medical Nutrition Therapy (MNT) provided by a Registered Dietitian (RD) on nutritional parameters and functional status in weight-losing cancer patients in an ambulatory cancer care center. Design: This study was a retrospective analysis of nutrition outcomes among adult cancer patients seen at the Cancer Institute of New Jersey (CINJ). Changes in weight, Body Mass Index (BMI), and performance status, as measured by the Eastern Cooperative Oncology Group Performance Status (ECOG PS) scale, were obtained from existing patient records at baseline, 6, and 12 weeks. Participants and Setting: This study included 55 adult cancer patients referred to the RD at the CINJ for MNT due to unintentional weight loss from July 1, 2004 to September 1, 2005 who met with the RD at least two times during the 12-week MNT period. Intervention: All participants received MNT by the RD based on the ADA's MNT protocols for medical and radiation oncology. Weight loss data obtained from this study were compared against a reference norm available in the current literature. Main Outcome Measures: Primary outcome measures were change in weight, BMI, and ECOG PS score. Statistical Analysis: Mean weight and BMI at baseline, six, and 12 weeks were examined using repeated measures one-way analysis of variance (ANOVA). Nonparametric analysis of ECOG scores at each time period was conducted using Friedman's Test. Mean weight change from baseline to 12 weeks (mean = 1.6 lbs) was compared to the mean weight change from baseline to 12 weeks among participants receiving usual care in a study by Isenring et al (9) (mean = 10.34) using a one-sample t-test. Results: There was no significant change in weight, BMI, or ECOG PS score over the 12-week MNT period, supporting a stabilization of weight and a preservation of performance status with provision of MNT. 
Participants in this study lost significantly less weight than those in the control group in Isenring et al's study (p < 0.0001). Conclusion: Individualized MNT provided by the RD improved outcomes in this group of weight-losing cancer patients by reducing weight loss and stabilizing performance status. This study supports the provision of MNT by an RD according to the ADA protocols (14) for weight-losing patients with cancer.
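The comparison against the published reference norm is a one-sample t-test: the cohort's mean weight change is tested against a fixed value taken from the literature. A hedged sketch in Python with scipy, using invented numbers rather than the study's records:

```python
from scipy import stats

# Hypothetical 12-week weight changes (lbs) for a small cohort; these
# values are invented for illustration and are not the study's data.
weight_change_lbs = [-1.0, -2.5, 0.5, -3.0, -1.5, -0.5, -2.0, -1.8, -2.2, -1.0]

# Fixed reference value standing in for a published usual-care mean
# (an assumed number, chosen only to make the sketch concrete).
reference_mean = -10.34

# One-sample t-test: does the cohort mean differ from the reference?
t_stat, p_value = stats.ttest_1samp(weight_change_lbs, reference_mean)
```

Because the reference group's data are not available subject by subject, only its published mean can enter the test, which is exactly the situation a one-sample t-test is designed for.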


The Effects of Frequency and Duration of Feeding on Stooling Patterns and Growth of Healthy Breastfed Infants During the First Two Weeks of Life

Sarah Bell, 1999

Exclusive breastfeeding is recognized by many healthcare professionals as the superior way to feed infants. The American Academy of Pediatrics recommends exclusive breastfeeding for at least six months in order for infants and mothers to receive optimal health benefits. While breastfeeding initiation rates in the United States have increased, recent data suggest that only twenty percent of all infants are still breastfed at five to six months of age. Mothers who experience difficulty at the onset of breastfeeding are more likely to discontinue nursing their infants. Information that helps prepare mothers for this experience may reduce early abandonment of breastfeeding. Mothers (n=18) and their newborns (n=18) were recruited from the labor and delivery unit at Mercy Fitzgerald Hospital in Darby, Pennsylvania. Information on nursing and stooling patterns of exclusively breastfed infants was obtained through detailed data collection by their mothers for the first two weeks of life. The data on frequency of nursing, frequency of stooling, and duration of nursing were similar to observations found in other studies. On average, infants (n=18) nursed 9.1 +/- 4.9 times per day (median 8.9 times per day). Nursing sessions lasted 26.0 +/- 29.0 minutes per feeding (median 23.2 minutes per feeding) or 185.0 +/- 248.5 minutes per day (median 202.7 minutes per day). Infants (n=13) stooled on average 4.9 +/- 3.8 times per day (median 4.9 times per day). Comparisons were made between frequency and duration of nursing (as individual variables and combined into one variable) and frequency of stooling, as well as between frequency and duration of nursing (as individual variables and combined into one variable) and weight change at two weeks of age. 
No statistically significant relationships were noted between frequency of nursing and frequency of stooling (p=0.701, r = -0.118); total nursing duration and frequency of stooling (p=0.868, r = -0.051); frequency of breastfeeding and weight change (p=0.162, r = -0.380); daily mean nursing duration and weight change (p=0.342, r = -0.254); or total nursing duration and weight change (p=0.565, r = -0.162). Based on observations from this study, breastfeeding and stooling patterns are highly variable and cannot be used as an accurate measure of adequate breast milk intake or growth. However, findings from this study may help healthcare professionals educate new mothers about breastfeeding and support longer breastfeeding durations.


Evaluation of a Pediatric Nutrition Screening Tool and Identification of the Best Predictors of Nutrition Risk

Megan Johnston Mullin, 2003

Objective: The purpose of this study was to identify whether the RN nutrition-screening tool currently in use at CHOP can accurately identify patients at nutrition risk and to determine the nutrition risk factors that can be used by the RN for nutrition screening. Design: A convenience sample of pediatric patients admitted to the intensive care unit over an eleven-day period was enrolled in the study. The RN was responsible for completing the pediatric nutrition-screening tool within 24 hours of admission. The RD assessment was based on the ASPEN guidelines for determining nutrition risk among pediatric patients along with subjective clinical experience. Subjects: Seventy-six infants, children, and young adults between the ages of 28 weeks gestation and 20 years were enrolled in the study from August 5 to August 16, 2002. Statistical Analyses Performed: Data were analyzed using JMP-IN software version 4.0. Frequency distributions, Chi-square tests, Fisher's Exact tests, partition analyses, and logistic regression models were employed. Results: The majority of patients were males (n=47, 61.8%) of Caucasian (n=47, 63.5%) descent. The mean age was 83.16 months (6 years and 7 months) with a standard deviation of +/- 74 months (6 years). The RD identified significantly more patients at nutrition risk (n=51, 67.1%) than the RN (n=17, 22%) (p=0.03, Fisher's exact test). The level of agreement between RD and RN determination of nutrition risk was 16%, indicating a negligible degree of agreement. The logistic regression model with the highest efficiency (AUC = 99.0%) in predicting nutrition risk was consistent with the ASPEN guidelines for 11 nutrition risk factors: history of poor appetite, NPO, use of tube feeding, increased metabolic requirements, weight for…


Relationship Between Subjective Global Assessment and Measurements of Abdominal Obesity and Body Composition in Patients on Maintenance Hemodialysis

Andrea Rothschild, 2016

The 7-Point Subjective Global Assessment (SGA) is a valid and reliable tool used to assess the nutritional status of patients on maintenance hemodialysis (MHD). However, the SGA does not evaluate an overweight/obese condition as part of its assessment. The aim of this study was to explore the relationship between SGA and measurements of obesity [BMI, conicity index (Ci), waist circumference] and body composition [fat free mass, fat mass]. An interim analysis was performed for 136 patients on MHD who participated in a multi-site, cross-sectional study within the Northeastern region of the US. Spearman correlation coefficients, scatter plots, and cross tabulations were used to analyze the relationships between the overall SGA score and measurements of body composition and obesity. Statistical significance was set at p < 0.05. Participants had a mean age of 55.2 years (SD = 12.1), were largely African American/Black (84.6%) and male (61.8%), and had a mean dialysis vintage of 62.5 (SD = 74.7) months. Based on the overall SGA score, all participants were categorized as either well-nourished (SGA score = 6-7; 47.8%) or moderately malnourished (SGA score = 3-5; 52.2%). No participants were classified as severely malnourished (SGA score = 1-2). Over one-third of overweight (42.2%) and obese (37.5%) participants were classified as well nourished. A moderate inverse correlation (r = -0.314, p < 0.0001) was found between the overall SGA score and the Ci, demonstrating that a higher Ci was associated with a lower SGA score, i.e., greater nutritional risk.
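The inverse SGA-Ci relationship above was assessed with a Spearman rank correlation, which captures monotonic (not necessarily linear) association and suits ordinal scores like the 7-point SGA. An illustrative Python sketch with scipy, using invented values rather than the study's data:

```python
from scipy import stats

# Hypothetical overall SGA scores (1-7, higher = better nourished) and
# conicity index values; invented for illustration, not study data.
sga_scores = [7, 6, 6, 5, 4, 3, 5, 7, 4, 3]
conicity = [1.15, 1.20, 1.18, 1.30, 1.35, 1.40, 1.28, 1.12, 1.33, 1.42]

# Spearman correlation: ranks both variables, then correlates the ranks.
rho, p_value = stats.spearmanr(sga_scores, conicity)
# A negative rho mirrors the study's finding: higher Ci, lower SGA score.
```

Ranking makes the statistic insensitive to the exact spacing between SGA categories, which a Pearson correlation would treat as meaningful.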


Baseline Data from a Pediatric Weight Management Program: Eating Patterns and Physical Activity Patterns of Participants Compared to Family Members

Maura Bruno, 2006

Objective: This study compared the eating and physical activity patterns of overweight children to those of family members. Subjects/Design: Eating and physical activity patterns were identified for 25 female and 20 male children entering the Healthy LIFE® Pediatric Weight Management Program in Livingston, NJ and for each family member living in the home with the participating child. Participant and family demographic characteristics were analyzed and compared to eating and activity pattern responses. Statistical Analysis Performed: Frequency distributions were used to describe participant and family member behaviors. Spearman's Rank Correlation Coefficient was used to measure the strength of the relationship between the eating and activity behaviors of the participants and the family members. Demographic data were compared to responses describing eating and activity patterns using separate chi-square tests of independence. Results: Older participants were more likely to have a higher body fat percentage than younger participants (p=0.032, chi-square). Participants with a higher percentage of body fat were more likely (p=0.01, chi-square) to report a lower parental income. A direct correlation existed between the eating patterns of the participant and those of the third and fourth siblings (p=0.013 and p=0.010, respectively). Correlation existed between the physical activity pattern of the participant and the sibling group (p=0.028) as well as the biological parent group (p=0.004) and the family group (p=0.003). Conclusions: Correlation exists between the physical activity patterns of the overweight child and those of family members. Correlation exists between the eating patterns of the overweight child and siblings within a larger family. Changing behavior for the overweight child is a process that should involve the entire family with a focus on healthy eating and increased physical activity.


Awareness and Opinions by UMDNJ-SHRP Dietetic Alumni of the Potential Future System for Educating and Credentialing Dietetic Practitioners

Jennifer Armao, 2006

The purpose of this study was to examine the awareness and opinions of UMDNJ-SHRP dietetic alumni on the potential future system for educating and credentialing dietetic practitioners. The American Dietetic Association (ADA) appointed a Dietetic Education Task Force (DETF), which proposed a set of recommendations to enhance the profession. This study examined the response of Registered Dietitians who are graduates of the UMDNJ-SHRP dietetics program to five of these recommendations (n=221). The response rate for the study was 59%. Awareness of the ADA considering changing the requirement for entry to the profession to a graduate degree was reported by 62.6% (n = 82) of the respondents. The demographic characteristics of the respondents closely matched those of the members of the ADA. Five questions related to the DETF recommendations were rated on a five-point Likert scale. Respondents disagreed (79.2%, n = 103) that more students would choose to major in dietetics if a graduate degree were required. Respondents agreed (75.6%, n=99) that there is a need for advanced practice programs to be nationally accredited. Respondents also agreed (56%, n=54) that elevating academic requirements will have a positive impact on how other health professionals perceive the RD. Respondents disagreed (47%, n=61) with the recommendation that the DTR credential be eliminated. Respondents agreed (71%, n=93) with the concept of integrating the supervised practice with the academic curriculum to improve dietetic education. Future research should include larger studies conducted by the ADA among its members to examine awareness and opinions of the potential future system for educating and credentialing registered dietitians. This research provides valuable information on the UMDNJ-SHRP dietetic program alumni's viewpoint of the DETF recommendations.


Research Involvement Among R.D. members of the New Jersey Dietetic Association

Susan Genovese, 2006

Objective: This was a cross-sectional, descriptive study designed to measure registered dietitians' research involvement and its relationship to key antecedent factors among a randomly selected sample of 500 Registered Dietitian (RD) members of the New Jersey Dietetic Association (NJDA) for the 2005-2006 membership year. Design: This study was designed to replicate research previously done by Byham-Gray and colleagues in 2003. Subjects: 500 Registered Dietitian (RD) members of the New Jersey Dietetic Association (NJDA) for the 2005-2006 membership year. Statistical Analysis: Descriptive statistical analysis consisted of frequency distributions for categorical variables and means, standard deviations, and ranges for continuous data. Chi-square analyses were used to describe the nonparametric data, as well as Pearson's correlation coefficients. Results: Findings indicated that higher total Research Scores were associated with RDs who had completed more years of education (r=0.125, p=0.022), route of education (r=0.167, p=0.004), had taken research courses (r=0.263, p<0.001), read research articles more frequently (r=0.362, p<0.001), had earned advanced-level certifications (r=0.152, p=0.008), worked full-time (r=0.191, p=0.001), had worked in their field for >20 years (r=0.134, p=0.017), belonged to more than one professional organization (r=0.240, p<0.001), and were involved with their local affiliate (r=0.230, p<0.001). Results indicated that dietitians' research involvement is largely determined by their education and training, work experience, and professional member involvement. Conclusions: This study identified a need for the integration of specific research techniques and changes in dietetics curricula so that practitioners are encouraged to participate in research activities.


Change in Self Reported Gastrointestinal Symptom Severity and Frequency following a Registered Dietitian Counseling Session for the Management of Irritable Bowel Syndrome

Eileen Heffernan

Objective: To describe the demographic and clinical characteristics of patients with IBS seen in a private nutrition practice and the change in IBS symptom severity and frequency following registered dietitian (RD) counseling. The utility of the Gastrointestinal Symptom Scale (GSS) for documenting change over time was also evaluated. Design/Participants/Setting: This study was a retrospective chart review of 18 patients with IBS seen in a private nutrition practice in Texas from January 2012 to September 2015. All participants completed pre- and post-intervention GSS questionnaires to assess their symptom severity and frequency before and after RD intervention. Statistical Analyses: Descriptive statistics were used to describe demographic and clinical characteristics of participants and the 14 IBS symptoms on the GSS. The distributions of differences in symptom severity and frequency were analyzed using the Wilcoxon Signed-Rank Test. Results: Eighteen participants were included in this study. There were statistically significant changes in symptom severity for 7 of the 14 symptoms and in the total symptom severity score from pre- to post-RD counseling (p=0.002). The largest differences were found for passing gas (p<0.0001), bloating (p=0.001), and abdominal pain (p=0.003). Conclusions: Participants in this study experienced a reduction in symptom severity and in the number of symptoms reported after RD intervention. The GSS shows promise for assessing change in IBS symptom severity over time.
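Because GSS severity ratings are ordinal, the pre/post change above was tested with the Wilcoxon signed-rank test rather than a paired t-test: it ranks the paired differences instead of assuming they are normally distributed. A minimal Python sketch with scipy, using invented paired scores rather than the study's charts:

```python
from scipy import stats

# Hypothetical pre/post symptom severity ratings (0-4 scale) for twelve
# patients; invented for illustration, not the study's data.
pre = [4, 3, 4, 2, 3, 4, 3, 2, 4, 3, 4, 3]
post = [2, 1, 2, 1, 1, 2, 2, 1, 2, 2, 1, 1]

# Wilcoxon signed-rank test: tests whether the paired differences
# are symmetrically distributed around zero.
stat, p_value = stats.wilcoxon(pre, post)
```

With a sample of 18 and skewed ordinal differences, this nonparametric choice is more defensible than a t-test on the raw scores.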


Level of Participation in the Dietetic Practice Based Research Network among members of the American Dietetic Association

Melissa Jewel, 2006

Learning Outcome: To determine the relationship between demographic characteristics of RDs and reasons for participation or lack of participation in the Dietetics Practice Based Research Network (DPBRN) at each level. Design: Descriptive study using an online survey asking participants for demographic information and reasons for participation and non-participation in practice-based research and the American Dietetic Association's (ADA) DPBRN. Subjects/Setting: Sample composed of 1026 active RD members of the ADA in one of three levels of participation in the DPBRN. Level One was composed of 700 randomly selected RD members who neither expressed interest in the DPBRN nor enrolled in it. Level Two was composed of 199 ADA members who subscribed to the DPBRN but did not complete the descriptive portion of the study. Level Three was composed of 127 ADA members who subscribed to the DPBRN and subsequently completed the descriptive portion of the survey. Statistics: Frequency distributions were used to describe demographic characteristics based on level of involvement in the DPBRN and to describe how many participants participated or did not participate at each level and for what reasons. Chi-square analyses and Fisher's Exact tests were used to explore the relationship between the collected demographic characteristics and reasons for participation or non-participation. Results: The survey yielded an overall 21.8 percent response rate. The largest response rate came from respondents with >20 years of experience (41.6%), while the lowest came from RDs with 3-5 years' experience (9.6%). The majority of respondents (62.7%) had no current involvement in research, while only 23.8 percent had no past research experience. Level Three participants were significantly less likely to report barriers including lack of research skills, lack of practice guidelines or protocols, and lack of research mentors. 
The more past or current research experience the respondents had, the more likely they were to know how to design and conduct practice based research. Conclusions: Results indicate that RDs feel that barriers do exist preventing their participation in practice based research. Further research is necessary to determine how ADA can help RDs overcome these barriers, thereby enhancing the Dietetic Practice Based Research Network and increasing participation in the network.


Personal Beliefs and Practices and Professional Practices Regarding Herbal Medicine Among the Full Time Faculty of the Newark Based Schools of the University of Medicine and Dentistry of New Jersey

Kelly Dougherty, 1999

Background: Use of alternative and herbal medicine is increasing among the general population in the United States. The use of herbal medicine by health care professionals has not been documented. Health care professionals tend to base personal and professional practices on scientific research and facts. This population has interactions with the public and health care students. Therefore personal and professional practices will impact these audiences. Objective: To identify current personal beliefs and practices and the extent of professional herbal medicine practices (clinical, academic and research) among the full time Newark based faculty members of the University of Medicine and Dentistry of New Jersey (UMDNJ) and to determine whether there is a relationship between personal and professional practices regarding herbal medicine. Methods: A survey was distributed to all full time Newark based faculty members of UMDNJ via interoffice mail. Nine hundred and four surveys were sent. The survey addressed personal beliefs and practices and professional practices of the faculty members regarding herbal medicine. Demographic variables were also included. Results: The overall response rate was 51%, with 29.6% of the population reporting personal use of herbal medicine, 22% of the population who provide patient care recommend herbal medicine, 9% of those involved in teaching include herbal medicine in coursework and 1.1% of the population are involved in research on herbs. Respondents who use herbal medicine were more likely to recommend herbs to patients (p<.0001); likewise, users of herbal medicine were more likely to teach students about herbal medicine (p=.001). Conclusions: This study provides information on the prevalence of personal and professional practice regarding herbal medicine among full time Newark based faculty of UMDNJ. 
The results support the hypothesis that personal beliefs and practices do impact professional practices within the clinical and academic settings of the university. Personal use was greater than twice the usage reported in the general population and the percentage of respondents recommending herbs (22%) is less than the percentage of their patients who are requesting herbs (27%). Slightly more than 50% of the study population are interested in continuing education regarding herbal medicine. Education and training of the faculty will ensure that patient needs are met as well as the educational needs of the students who will be the future health care providers.


Prevalence, Demographic Characteristics, Usage and Disclosure Patterns of Dietary Supplements Among Cancer Patients

Alicia Michaux, 2006

Objective: To determine prevalence, usage patterns, reasons for use and disclosure patterns of dietary supplements among cancer patients at the Cancer Institute of New Jersey (CINJ). Design: Preliminary analysis of the dietary supplement usage patterns and demographic portions of the data from a mail study in which 2,782 individuals who sought consultation or treatment at a state suburban cancer institute were randomly selected. Participants were contacted twice. Subjects/setting: Participants included cancer patients at the CINJ greater than 18 years of age, English speaking, and willing and able to complete the survey. Statistical analyses: Frequency distributions, chi-square, Fisher's exact, independent t-tests, means, medians, ranges, and standard deviations were performed using SPSS 13.0 with alpha set at .05. Results: The response rate was 23% (n=628); the usable rate was 62% (n=386). Eighty-nine (23%) participants used a dietary supplement after cancer diagnosis. The most common dietary supplements listed were green tea (n=14), multivitamin/mineral (n=9), vitamin E (n=5), vitamin C (n=5) and selenium (n=4). The most common reason for use was to improve the immune system. The most common response regarding disclosure was "encouraged me to continue using"; the most common reason for nondisclosure was "he/she never asked." The only other health care provider to whom dietary supplement users consistently disclosed was their primary care physician. Conclusions: Dietary supplement use is prevalent in this cancer community and consistent with the current literature. There is a relationship between cancer type and frequency of dietary supplement use, and between cancer type and the type of supplement used. Applications: Members of the multidisciplinary cancer team should be cognizant of dietary supplement use among cancer patients and recognize the most frequently reported types, reasons for use and disclosure patterns.
Oncologists as well as other healthcare providers should open a dialogue with cancer patients to elicit information about their dietary supplement usage during and after cancer treatment.


Factors Predictive of Not Passing the Navy's Biannual Physical Fitness Assessment

Paul Allen, 2009

Learning Outcome: Identify descriptive characteristics of active duty U.S. Navy personnel related to passing the Physical Fitness Assessment (PFA). Learning Code: 4010 Objective: Determine the relationship of characteristics to the number of PFA cycles needed to pass after a failure for body composition, physical readiness, or both components. Design/Subjects: Retrospective analysis of cycles to pass a PFA after a failure using a random sample of 1767 sailors who failed the Spring 2004 PFA, excluding pregnant, deployed, or medically waived sailors, who had sequential recorded measurements ending with passing the PFA or the Fall 2006 PFA. Method: Retrospective analysis of characteristics by failure type, analyzed using descriptive statistics, chi-square, and regression analysis with SPSS (v16) and an a priori alpha of 0.05. Results: The majority were male (81.3%, n = 1436), 20-24 years old (36.1%, n = 638), rank E-5 (41.9%, n = 740), with a BMI of 25.0-29.9 kg/m² (53.2%, n = 940). Older (χ² = 11.22, p = 0.004), male (χ² = 10.63, p = 0.031), and obese (χ² = 172.00, p < 0.0005) body composition failures and obese (χ² = 77.07, p < 0.0005) combination failures had a significant relationship to the number of cycles to pass. A higher BMI (p < 0.0005) was associated with not passing within three PFA cycles after a body composition failure and after a combination failure. Conclusion: Sailors who were male, older, and with a higher BMI at the time of failure required more cycles to pass the PFA. Sailors with higher BMIs at the time of failure were also more likely not to pass within three PFA cycles.
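Several of the analyses above rely on the chi-square test of independence for categorical data. As a minimal sketch (the 2×2 counts below are hypothetical, not the study's data), the statistic can be computed directly from a contingency table and compared against the df=1 critical value of 3.841 at alpha = 0.05:

```python
def chi_square(table):
    """Pearson chi-square statistic for a 2D contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = BMI < 30 / BMI >= 30,
# columns = passed within 3 cycles / did not pass
table = [[220, 60], [150, 120]]
stat = chi_square(table)
# df = (2-1)*(2-1) = 1; critical value at alpha = 0.05 is 3.841
print(f"chi-square = {stat:.2f}, significant: {stat > 3.841}")
```

With a statistic this far above the critical value, the association between BMI category and passing would be judged significant in these hypothetical data.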


Changes in Systolic Blood Pressure, Diastolic Blood Pressure, and Health Related Quality of Life in Female Employees With Surveillance HTN

Noor Brifkani

Background: Hypertension (HTN) affects one in three adults in the United States (US). Employees spend one-third of their time at work, making the workplace an ideal environment for interventions. Design: The primary aims of this interim analysis of Lifestyle Intervention For Total Health–a University Program (LIFT UP) were to examine changes from baseline (BL) to the 12-week post-intervention follow-up in systolic (SBP) and diastolic blood pressure (DBP) and Health Related Quality of Life (HRQOL) among female employee participants with surveillance HTN (SHTN). Participants met with the registered dietitian (RD) at BL and at 12 weeks. The RD conducted all measurements and provided individualized nutrition and physical activity counseling to participants at both visits. Results: Compared to those without SHTN (n=156), those with SHTN (n=154) who returned for follow-up (n=91) were older (p<0.001) and had a higher waist circumference (p<0.001), body mass index (p<0.001), and SBP/DBP (p<0.001 for both) at BL. The mean age of participants with SHTN was 51 years; 43.5% were White Non-Hispanic. Those with SHTN had decreases in SBP/DBP (both p<0.001), mentally unhealthy days (p=0.007, decreased by >50%) and the summary index of unhealthy days (p=0.016, 50% decrease) from BL to 12 weeks. The relationships between changes in SBP/DBP and HRQOL were not statistically significant. Conclusion: Participants with SHTN who returned for follow-up had significant improvements in SBP, DBP, and HRQOL after the 12-week intervention. Further research with a control group is needed to determine whether these changes resulted solely from program participation.


Dietary Supplement Use Among High School Students in a Central New Jersey School District

Andrea Strum, 2008

Objective: To determine demographic characteristics, knowledge about, reasons for use, and patterns of use of currently popular dietary supplements among high school students in a central New Jersey school district. Design: The survey was a prospective design using an in-person, paper and pencil survey. The survey contained 32 multiple-choice and four true or false questions and was administered during junior and senior physical education classes. Subjects: A sample population of 868 high school students (459 juniors and 409 seniors) was invited to participate in this study. Of the 868 students, 320 students returned the required assent and consent forms. Of the 320 eligible students, 286 students were present on the survey day to receive a survey. The usable response rate was 32.49% (n = 282). Statistical Analyses: Descriptive statistics were used to evaluate the demographic characteristics, patterns of use, reasons for use, and knowledge of popular dietary supplements. Independent samples t-tests were used to examine the relationships between knowledge and patterns of use with demographic characteristics. To determine relationships between each season of sport with knowledge and patterns of use, one-way ANOVAs were used. Pearson product moment correlation was used to determine the relationship between the number of sports played by each student and the number of dietary supplements each student used daily. Chi-square tests and cross tabulations were used to determine the relationship between demographic characteristics and reasons for use. A priori alpha level was set at 0.05. Results: Nearly all respondents (98.23%, n = 277) reported using dietary supplements at least once. Almost all respondents (93.26%, n = 263) reported having used caffeine in the form of a beverage. Nearly one quarter of respondents (22.66%, n = 63) used dietary supplements to look better. 
Of those who reported having used dietary supplements, 11.48% (n = 31) cited their parents as the primary sources of influence and 21.11% (n = 57) cited Internet/TV/radio/magazine advertisements as their main sources of information. Only 7.17% (n = 20) answered all four knowledge questions correctly. There were no significant differences between demographic characteristics and knowledge. There were statistically significant differences between gender and weight-related reasons for use (p < 0.0001) and health-related reasons for use (p < 0.0001). Males (45.74%, n = 43) used supplements for weight gain/increased strength more than females (5.62%, n = 10); females (26.97%, n = 48) used supplements for weight loss/decreased body fat more than males (15.96%, n = 15). Males (28.72%, n = 27) used supplements to improve sports performance more than females (7.87%, n = 14); females (18.54%, n = 33) used supplements to stay healthy more than males (14.89%, n = 14). There was also a statistically significant difference between gender and the number of dietary supplements used daily (t = 4.306, p < 0.0001). Females (mean = 1.55, SD = 1.240) used fewer dietary supplements than males (mean = 2.92, SD = 3.971). Application/Conclusion: Major findings in this study suggest that high school students use dietary supplements and are in need of education on dietary supplement regulations, risks and benefits of dietary supplement use, and credible sources of influence and information on dietary supplements.


Relationships between self-reported and desired body weight, calculated current and desired body mass index (BMI), and perceived current and desired body image, in the University of Virginia at Wise (UVA-Wise) Freshman

Cindy Atwell, 2009

Objective: Determine relationships between self-reported and desired body weight, calculated current and desired body mass index (BMI), and perceived current and desired body image, in University of Virginia at Wise (UVA-Wise) freshmen. Design: Descriptive survey with a cross-sectional convenience sample. The self-administered, anonymous, one-time, in-class questionnaire obtained demographic characteristics, self-reported height, weight, desired weight, and perceived current and desired body image. Subjects: Freshman students ≥18 years old attending the Fall 2008 Freshman seminar at UVA-Wise, n=139. Statistics: Descriptive statistics, correlation coefficients, and chi-square tests were used to explore relationships using SPSS version 16. Results: Of the 139 completed questionnaires, 86% (n=120) were usable. Of the respondents, 75.6% were Caucasian and 52.5% were male. Age ranged from 18 to 25 years (mean 18.43). There was a strong, significant positive association between current and desired body weight (males r=0.684, p<0.0005; females r=0.750, p<0.0005), calculated current and desired BMI (males r=0.641, p<0.0005; females r=0.630, p<0.0005), and current and desired image (males r=0.541, p<0.0005; females r=0.728, p<0.0005). The heavier the current weight and BMI, the heavier the desired weight and BMI; the larger the current image, the larger the desired image. There was a significant association between the categories of the current and desired calculated BMI (males only, χ²=14.466, p<0.0005), and between the BMI categories of the current and desired images (males χ²=9.817, p<0.002; females Fisher's exact p=0.008). The heavier the BMI category of the current calculated BMI, or of the current image, the heavier the BMI category of the desired calculated BMI, or image.
Conclusions/Applications: Data indicated that the heavier the individuals, the heavier the desired weight and the larger the desired images were, which may perpetuate the prevalence of overweight and obesity in 18- to 25-year-old students. Education programs are needed to provide weight and health information to students. Clinicians may need education on this issue of weight and image misperception.


The Relationship between Weight Change and Change in Health Related Outcomes in a 12-Week Worksite Wellness Lifestyle Management Program

Barbara Scardefield, 2008

Learning Outcome: To identify the relationship between weight change and changes in health related outcome measures after a 12-week worksite wellness program. Text: Participants in this retrospective analysis of a 12-week intervention study included adult employees (BMI ≥ 25 kg/m²) enrolled in a worksite wellness program in an urban academic health sciences center. In the prospective intervention on which this study was based, participants met with an RD three times (baseline, weeks 6 and 12), attended up to 12 consecutive weekly (in-person or via Web-CT) classes and had their weights measured weekly. Anthropometric measurements, blood pressure and finger stick values for total cholesterol, HDL cholesterol and glucose were measured, and 24-hour dietary recalls and the International Physical Activity Questionnaire (IPAQ) were completed at baseline and week 12. Data were entered into and analyzed using SPSS v15.0 (alpha=0.05); paired t-tests, Pearson Product Moment Correlations and chi-square analyses were used. Of the 104 enrolled, 88 participants completed the program (84.62%); the majority were female (93.2%, n=82) with a mean age of 47.82 ± 9.62 years and an equal number of Black (n=41, 46.6%) and Caucasian (n=41, 46.6%) participants. After the intervention participants experienced significant improvements in weight (t=7.13, p<0.0001), BMI (t=-6.73, p<0.0001), body fat percent (t=-5.03, p<0.0001), waist circumference (WC) (t=11.06, p<0.0001), waist-hip ratio (WHR) for women (t=2.67, p=0.009), male Framingham points (t=-2.59, p=0.061), IPAQ score (t=2.87, p=0.006), calories (t=-5.01, p<0.0001), carbohydrate (t=-4.03, p<0.0001), fat (t=-5.32, p<0.0001), protein (t=-2.35, p=0.021), saturated fat (t=-5.20, p<0.0001), cholesterol (t=-2.83, p=0.006) and sodium (t=-3.38, p=0.001). Significant declines in grain intake (t=-4.98, p<0.0001) and significant increases in vegetable (t=2.00, p=0.048) and fruit intake (t=2.52, p=0.014) were noted.
There were significant correlations between weight change and changes in the following: BMI (r=0.98, p<0.0001), percent body fat (r=0.25, p=0.021), WC (r=0.45,p<0.0001), diastolic (r=0.38, p<0.0001) and systolic (r=0.33, p=0.002) blood pressure, Framingham points for men (r=0.96, p=0.01), grain intake (r=0.28, p=0.01) and fruit intake (r=-0.23, p=0.034). Short-term positive changes in weight and other health related outcomes could be achieved in the university setting with a combined group and individual 12-week intervention program. Weight change is significantly related to several health related outcome measures.


Obesity Prevalence, Perceived Needs and Preferred Modalities for Nutrition Education Among Parent/Child Dyads in an Urban Pediatric Primary Care Clinic: An Exploratory Needs Assessment

Rachael Brauer, 2009

Objective: To examine the weight status of parent/child dyads attending an inner-city pediatric primary care clinic and explore parental perceived needs and preferred modalities for pediatric weight management nutrition education. Methodology/Design/Subjects: A prospective, descriptive needs assessment was conducted utilizing parental surveys to collect data on perceived needs and preferred modalities for nutrition education and measurement of the weight status of parent/child dyads. Parents with children aged two to 12 years seen in an urban teaching hospital pediatric clinic were asked to participate in a self-administered survey and have weights measured by the clinic staff. Results: Of the 256 parents approached, 240 (93.8%) usable surveys were returned. Thirty-three percent (n=81) of children were overweight or obese; 34.9% (n=83) of parents identified themselves as overweight or obese. Poor agreement was found between parental perception and actual child weight status (κ=0.03, p=0.003, kappa statistic). One-half (n=113, 50.0%) of parents felt an RD was the best source for nutrition education; over 80% were interested in counseling sessions with an RD. Significantly more self-identified overweight parents preferred receiving education that limited direct interaction with healthcare professionals, such as handouts (p<0.001, chi-square), internet resources (p=0.01, chi-square), and videos (p=0.04, chi-square), than their normal weight or obese counterparts. Parents of overweight children had similar preferences. Applications/Conclusions: Parental weight status may affect perceived needs and preferred modalities for nutrition education. Poor agreement between parental perception of child weight status and actual child weight status suggests a strong need for identification and discussion of child weight concerns with parents by healthcare professionals.
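The kappa statistic used above measures agreement beyond chance between parental perception and measured weight status. A minimal sketch of Cohen's kappa follows; the 2×2 agreement table is hypothetical, chosen to illustrate how raw agreement can be high while chance-corrected agreement is near zero, as in the reported κ=0.03:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rater A in rows, rater B in columns)."""
    n = sum(sum(row) for row in table)
    # observed proportion of agreement: the diagonal cells
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    # expected agreement by chance, from the marginal proportions
    row_marginals = [sum(row) / n for row in table]
    col_marginals = [sum(col) / n for col in zip(*table)]
    p_expected = sum(r * c for r, c in zip(row_marginals, col_marginals))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical counts: rows = parental perception (normal, overweight),
# columns = measured status (normal, overweight)
table = [[150, 70], [10, 10]]
kappa = cohens_kappa(table)
print(f"kappa = {kappa:.3f}")  # near zero despite ~67% raw agreement
```

Here most overweight children are perceived as normal weight, so agreement barely exceeds what the marginal frequencies alone would produce.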


Registered Dietitians' attitudes, knowledge base and practice habits regarding obesity

Kate Chiller, 2008

Objective: To describe Registered Dietitians' (RDs') attitudes towards, knowledge of and professional practices regarding obesity. Design/Methods: This was a prospective email survey sent to a random sample of 700 RDs provided by the Commission on Dietetic Registration (CDR). The survey included demographic questions and five-point Likert-type scales to evaluate RDs' attitudes towards and professional practices about obesity. Knowledge of obesity was assessed through True/False questions; SPSS v15.0 was used for analysis. Results: Of the 165 usable surveys returned (23.57%), most respondents were Caucasian females; the mean time as an RD was 14.39 years. Psychological problems (n=107, 66.05%) and overeating (n=141, 86.50%) were the most frequently reported causes of obesity. Over half (n=110, 67.38%) of the respondents agreed that they feel competent in providing weight loss interventions to obese individuals. Most (n=121, 74.23%) respondents agreed/strongly agreed that RDs should be role models by maintaining a normal body weight. Amongst respondents, 67.90% described themselves as average body weight and 22.84% described themselves as overweight. The mean BMI for the respondents was 22.15. On a scale of 1-10, the mean knowledge score was 7.69. Popular sources of obesity information chosen by respondents were previous experience (n=99, 61.49%) and evidence based guidelines (n=123, 76.40%). The most commonly used treatments included recommending increased physical activity (n=160, 98.16%) and referring individuals to outpatient nutritional counseling (n=128, 79.01%). A Spearman rank correlation coefficient was used to assess relationships between causes of and interventions for the treatment of obesity as scored by respondents.
Statistically significant relationships included a strong relationship (r=0.521, p<0.0005) between physical inactivity as a cause and physical activity as a treatment; similarly, respondents who identified poor nutritional knowledge as a cause of obesity also identified dieting with RD counseling as an effective treatment, demonstrating a strong and significant relationship (r=0.435, p<0.0005). A chi-square analysis was done to determine the relationship between the calculated BMI and the self-described body weight of respondents. The results indicated an association between calculated BMI and self-described body weight, demonstrating that, despite some variation, respondents overall were able to correctly describe their body weight according to the appropriate BMI category. Conclusion: The results of the study demonstrated that respondents were able to correctly describe their body weight according to the appropriate BMI category. Respondents are also confident in providing weight loss interventions. The majority used evidence based guidelines to help make practice decisions regarding nutritional interventions for obesity, demonstrating the importance of developing evidence based guidelines specifically for the obese population.


The Effect of a Three-Hour Educational Program on Breastfeeding Knowledge of Pediatric Residents at a Community Teaching Hospital

Elizabeth Toolan, 2007

A prospective cohort study utilizing a three-hour nutrition education program for pediatric residents at a community teaching hospital was conducted; a convenience sample of 36 pediatric physician residents was used. Three one-hour lectures were taught over a three-week period at scheduled noon conferences. A 15-item multiple-choice exam assessed pre- and post-intervention knowledge. SPSS v14.0 was used for data entry/analysis; alpha=0.05. Frequency distributions and Pearson correlations were used to analyze demographic data. Independent and paired samples t-tests and one-way analysis of variance were used to analyze mean changes in knowledge and to determine differences in knowledge by key demographic characteristics, respectively. Pearson correlation coefficients were used to explore relationships between co-variates. Scores increased significantly from pretest to posttest for all residents (t=-5.06, p=0.001) and for first year (t=-2.21, p=0.047), second year (t=-5.16, p=0.001) and third year (t=-2.23, p=0.05) residents. Second year residents had the greatest increase in scores (+2.41 points). There was no relationship between demographic characteristics and pre-test knowledge. There was a very weak correlation between age and change in knowledge (r=-0.191, p=0.278). Attendance did not have a significant impact on change in knowledge when analyzed as attendance at all three lectures (n=11, 31%, t=-1.587, p=0.346) or at no lectures (n=5, 14%, t=0.366, p=0.346). This pilot study demonstrated that a breastfeeding educational intervention for pediatric residents increased breastfeeding knowledge. It supports the need for such education in a residency program. The results were not impacted by in-person attendance at the interventions, suggesting that traditional noon conferences may not be the most effective way to increase breastfeeding knowledge.
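The pre/post knowledge comparisons above use paired-samples t-tests. A minimal sketch of the paired t statistic follows; the six pre/post exam scores are hypothetical, not the residents' data:

```python
def paired_t(pre, post):
    """Paired-samples t statistic for post-minus-pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # sample standard deviation of the differences (n-1 denominator)
    sd = (sum((d - mean_d) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean_d / (sd / n ** 0.5)

# Hypothetical exam scores (out of 15) before and after the lectures
pre = [8, 9, 10, 7, 11, 9]
post = [11, 10, 12, 10, 13, 11]
t = paired_t(pre, post)
print(f"t = {t:.2f}")
```

The resulting t would be compared against a t distribution with n-1 degrees of freedom; a statistical package such as SPSS reports the corresponding p-value directly.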


Nutrition Focused Physical Examination in the Professional Practice of Registered Dietitians

Susan Stankorb, 2009

Objective: To determine RDs' use of nutrition focused physical examination (NFPE) skills and factors influencing use. Methodology/Design/Subjects: A prospective, descriptive internet-based survey was designed. An email with an embedded link to the survey was sent to a random sample of 2,447 active RD email addresses (obtained from the 2008 Commission on Dietetic Registration database). Chi-square analyses were used to explore relationships between demographic characteristics, factors influencing use, and NFPE practices. Results: Of the usable responses (n=367, 15.8%), 55.2% (n=197) of respondents had or were pursuing graduate degrees, 32.7% (n=120) had a specialty credential and 34.6% (n=121) had received training in NFPE. The mean age of respondents was 41.1±11.5 years. The NFPE skills RDs performed independently most frequently included measurement of weight (59.7%, n=194), height (54.8%, n=178), and waist circumference (29.2%, n=91), and skin assessment (22.3%, n=70). For 13 of the 20 skills listed, the most frequently reported response was that respondents do not perform or use the skill as a portion of nutrition assessment, including cranial nerve examination (89.1%, n=271), lymph-node examination (89.7%, n=270) and bioelectrical impedance analysis (78.8%, n=242). Factors respondents reported as limiting use of NFPE included lack of prior education (42.4%, n=133) and training (43.5%, n=136). Those who received prior NFPE training were significantly more likely to measure waist circumference and to conduct bioelectrical impedance analysis and cranial nerve examination than those without (p<0.0001 each, chi-square analyses). Conclusions: In this representative sample of RDs, less than 50% had prior NFPE training. Training was significantly related to performance; hence an emphasis on expanded NFPE training in dietetics is needed.


Comparison of predictive energy equations among individuals on maintenance hemodialysis

Ellis Avery Morrow, 2014

The Maintenance Hemodialysis Equation (MHDE) is a novel predictive energy equation developed to estimate resting energy expenditure (REE) for patients treated by maintenance hemodialysis (MHD). This study compared the MHDE to the Harris Benedict Equation (HBE), Mifflin St. Jeor equation (MSJE), World Health Organization/Food and Agricultural Organization of the United Nations/United Nations University equation (WHOE) and a newly developed dialysis specific equation by Vilar et al. (VE). Using a retrospective chart review design, the level of agreement between the estimated REE (eREE) derived from the MHDE was compared to eREE from the HBE, MSJE, WHOE and VE using Bland Altman plots with ±200 calorie upper and lower limits of agreement and the intraclass correlation coefficient (ICC). All data were obtained from 134 electronic and paper medical records of individuals on MHD during the month of June 2014. Estimated REE for each equation was calculated using SPSS software. There was agreement between the eREE from the MHDE-CR and the HBE, MSJE-P, WHOE and VE equations, with 75% of estimates within the ±200 calorie limits of acceptability. There were significant, strong positive correlations (P<0.001) between the eREE from the MHDE-CR and the HBE (ICC=0.887), the MSJE-P (ICC=0.897), the WHOE (ICC=0.845) and the VE (ICC=0.845). There was limited agreement between the MHDE and the other equations among individuals with heights ≥72 inches, BMIs ≥35, and ages <40 years. There was agreement between the MHDE and other commonly used predictive energy equations among individuals receiving MHD; however, this agreement was not consistent across all variables (age, height, BMI). Future research should investigate the bias and precision of the available predictive energy equations by comparing them against indirect calorimetry.
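The MHDE itself is not reproduced here, but two of the comparison equations are standard and can be sketched directly. Below is a minimal illustration of the original Harris-Benedict and Mifflin-St Jeor equations, with an agreement check in the spirit of the study's ±200 calorie criterion; the example subject is hypothetical:

```python
def harris_benedict(sex, wt_kg, ht_cm, age):
    """Original Harris-Benedict REE (kcal/day)."""
    if sex == "M":
        return 66.47 + 13.75 * wt_kg + 5.003 * ht_cm - 6.755 * age
    return 655.1 + 9.563 * wt_kg + 1.850 * ht_cm - 4.676 * age

def mifflin_st_jeor(sex, wt_kg, ht_cm, age):
    """Mifflin-St Jeor REE (kcal/day)."""
    base = 10 * wt_kg + 6.25 * ht_cm - 5 * age
    return base + 5 if sex == "M" else base - 161

# Hypothetical subject: 55-year-old man, 80 kg, 175 cm
hbe = harris_benedict("M", 80, 175, 55)
msje = mifflin_st_jeor("M", 80, 175, 55)
# agreement check analogous to the study's +/-200 calorie limits
print(f"HBE = {hbe:.0f}, MSJE = {msje:.0f}, within 200 kcal: {abs(hbe - msje) <= 200}")
```

For this subject the two standard equations agree within 200 kcal/day, though as the study shows, agreement between equations can break down in subgroups defined by age, height, or BMI.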


Predictive Energy Expenditure Equations for Critically Ill Burn Patients: Which One Works?

Kate Breznak, 2008

Objective: To determine which of the following five equations: the B-B modification of the Fleisch Standards, the Curreri formula, the Fleisch Standards with Wilmore Stress Factors, the Harris Benedict with Long variation, and the Ireton-Jones equation provides the least biased and most precise estimation of energy expenditure in relation to measured energy expenditure (MEE) post burn injury. Design: Secondary retrospective analysis of demographic characteristics and measured energy expenditure (MEE) data on burn patients admitted to an urban teaching hospital burn center between January 2002 and December 2006 who had their energy expenditure measured by indirect calorimetry (IC) at least once during their stay. Subjects: Mechanically ventilated burn patients aged 15 years and older. A total of 218 indirect calorimetry studies were analyzed for 98 patients. Statistical Analysis: Descriptive statistics, Pearson Product Moment Correlation Coefficients, and simple linear regression were used to evaluate the relationship between the estimation equations and MEE. Bland-Altman Limits of Agreement Analysis was used to evaluate the bias and precision of each estimation equation in relation to MEE. A priori alpha was set at 0.05. SPSS 15.0 was used for analysis. Results: Of the 98 patients included in this study, the mean age was 51.59 ± 18.96 years, the mean body mass index (BMI) was 25.95 ± 7.55, and 57% (n = 56) were male. The mean burn size was 25.38 ± 20.86% total body surface area. The mean MEE was 2011.54 ± 54 calories for the 218 IC studies. The lowest estimated energy expenditure came from the B-B modification of the Fleisch Standards (2170.78 ± 376.14 calories) and the highest from the Harris Benedict with Long variation (3600.53 ± 628.43 calories). There was a significant, strong positive correlation (Pearson Product Moment Correlation Coefficient analysis) between MEE and the B-B modification of the Fleisch Standards (r = .676, p < 0.001).
Simple linear regression demonstrated that the B-B modification of the Fleisch Standards accounted for 46.1% (R² = .461, p < 0.001) and the Curreri formula for 24.4% (R² = .244, p < 0.001) of the variance in MEE. The B-B modification of the Fleisch Standards was the least biased (mean difference -159.25 kcal/day) and most precise (standard deviation of the bias 390.19 kcal/day) in relation to MEE. Application/Conclusion: In the absence of indirect calorimetry, the B-B modification of the Fleisch Standards provides the best estimate of MEE in the burn population. None of the formulas were unbiased or precise based on the Bland-Altman Limits of Agreement Analysis. Adjustments should be made based on clinical judgment and careful assessment to individualize the energy prescription provided to each patient.
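Bias and precision in the sense used above are the mean and standard deviation of the estimated-minus-measured differences, from which the Bland-Altman 95% limits of agreement follow. A minimal sketch; the paired kcal/day values are hypothetical, not the study's data:

```python
def bland_altman(estimated, measured):
    """Bias (mean difference), precision (SD of differences), and 95% limits of agreement."""
    diffs = [e - m for e, m in zip(estimated, measured)]
    n = len(diffs)
    bias = sum(diffs) / n
    # sample SD of the differences = "precision" in Bland-Altman terms
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, sd, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired values (kcal/day)
measured = [1800, 2100, 2500, 1950, 2300]
estimated = [1900, 2000, 2700, 2050, 2250]
bias, sd, loa = bland_altman(estimated, measured)
print(f"bias = {bias:.1f} kcal/day, precision (SD) = {sd:.1f}, LOA = ({loa[0]:.1f}, {loa[1]:.1f})")
```

An unbiased, precise equation would show a bias near zero with narrow limits of agreement; a full Bland-Altman analysis also plots each difference against the pair's mean to check whether the error varies with energy expenditure.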


Effect of the Probiotic Lactobacillus rhamnosus GG on Antibiotic Associated Diarrhea in Critically Ill Patients

Laura Laski, 2008

Objective: To determine the effect of co-administration of Lactobacillus rhamnosus GG (LGG) with antibiotics on stool output in a Medical Intensive Care Unit. Design: This was a three-phase study. Phase one was a nurse education session with a pre- and post-test on standard documentation of stool output using the King Stool Chart. Phases two and three were retrospective chart reviews to evaluate stool output in patients who received antibiotics with or without LGG. Sample: The sample for the first phase, the nurse education session, consisted of all nurses who work in the Medical Intensive Care Unit; 78.75% (n=63) of the 80 nurses who work in the unit completed the education sessions. The sample for the retrospective chart review was all patients who were admitted to the Medical Intensive Care Unit between September 1, 2007 and February 20, 2008. Fifty-one charts were included in the sample for phase two, the antibiotics-only group. Ten charts were included in the sample for phase three, the LGG with antibiotics group. Statistical Analysis: Statistical analysis was completed using the Statistical Package for the Social Sciences, version 15.0. A priori alpha was set at p=0.05. A paired-samples t-test was used to determine the difference in nurse knowledge between the pre- and post-test scores in the education session. For the chart reviews in phases two and three, Fisher's exact tests were used to describe relationships in diarrhea development and demographic and patient characteristics between the two groups. Independent-samples t-tests were used to compare means in the percentage of patient days with diarrhea and daily diarrhea scores between the groups. Results: There was an increase in nurse knowledge in the classification of stool output from the pre- to the post-test (p<0.005).
From the chart reviews, there was no difference in the incidence of diarrhea development, the mean daily diarrhea score, or the percentage of patient days with diarrhea between patients who did or did not receive LGG with antibiotics. Conclusions/Applications: Nurses' knowledge increased following the education intervention. There was no statistically significant difference in incidence of diarrhea development, daily diarrhea score, or percentage of patient days with diarrhea between patients who did or did not receive LGG concurrently with antibiotics. A large trial using LGG in the critically ill population is needed to determine whether this substance has an effect on decreasing stool output in critically ill adults on antibiotics.
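Fisher's exact test, used here for the small two-group comparisons, computes an exact p-value for a 2x2 table from the hypergeometric distribution. A minimal pure-Python sketch of the two-sided test, shown only to illustrate the method (the counts below are hypothetical, not the study's):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]],
    summing hypergeometric probabilities of tables at least as extreme."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def p_table(x):  # probability of a table with x in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs + 1e-12)

p = fisher_exact_2x2(8, 2, 1, 5)  # hypothetical counts; p is about 0.035
```

Because the test enumerates all possible tables with the same margins, it remains valid for the small cell counts (e.g., n=10 in the LGG group) where a chi-square approximation would break down.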


Comparison of Maternal and Neonatal Outcomes of Women with Gestational Diabetes Mellitus (GDM) who receive Medical Nutrition Therapy at a Suburban Health Facility to Maternal and Neonatal Outcomes of Women with Gestational Diabetes Mellitus and Controlled

Margaret Giordano, 2009

Background: Uncontrolled gestational diabetes mellitus (GDM) raises health risks for mother and fetus. Medical Nutrition Therapy (MNT) for GDM can help women with GDM better control third-trimester blood glucose (BG) levels and reduce adverse perinatal outcomes. Objective: To compare adverse maternal and neonatal outcomes of two distinct groups of women with GDM (BG controlled and BG not controlled) who received MNT at Heritage Valley Health System (HVHS), a suburban health facility, and to compare outcomes of the HVHS groups to similar groups in the 2007 Gonzales-Quintero et al. study (GQS). Design/subjects/setting: A retrospective study was completed for 194 women with GDM who received MNT for GDM at HVHS, March 2003-May 2007. Intervention: Replicating GQS, the HVHS study selected 94 women with GDM who had documented fasting BG and 1-h or 2-h postprandial BG, and separated them into two groups using American College of Obstetricians and Gynecologists (ACOG) BG criteria. Statistical Analysis: Power analysis using G*Power determined an effect size of 0.59 based on 94 subjects, power 0.80 and alpha 0.05. Descriptive statistics were compiled using the Statistical Package for the Social Sciences (SPSS) version 16.0. Inferential statistical analyses employed the t-test for independent samples and the z-test for proportions. Results: Comparing the HVHS and GQS BG-controlled groups, there was no significant difference in mean birth weights, but a favorably significant difference in the proportion of cesarean deliveries. Comparing the HVHS BG-controlled and BG not-controlled groups, there were significant differences in mean birth weights and in the proportions of cesarean deliveries and infants having one or more adverse perinatal outcomes, similar to GQS findings. Conclusions: Similar to GQS, the HVHS study demonstrated that women with GDM achieving ACOG-criteria BG control can significantly lower their risk of incurring serious adverse maternal and neonatal outcomes.
These findings should reinforce for practitioners the importance of managing late-gestation BG and the utility of MNT.


The Impact of a Four Hour Pediatric Nutrition Education Program on Pediatric Nutrition Knowledge of First, Second, and Third Year Pediatric Residents

Pinkin Panchal, 2009

Background: Pediatricians require a solid understanding of pediatric nutrition in order to provide education and counseling to their patients and families regarding nutrition-related topics. Objective: To determine the impact of a four-hour nutrition education program on the pediatric nutrition knowledge of pediatric residents and to explore the relationships between change in knowledge and demographic characteristics. Design: A prospective pre-test/post-test design was employed. The program involved completion of a 20-question multiple-choice pre-test, including nine demographic characteristic questions, followed by four one-hour education sessions on four objectives: infant nutrition, preschool nutrition, school-age nutrition and adolescent nutrition. The education sessions were given at resident noon conferences at a teaching hospital and were also available for download via podcast. A post-test was given two weeks after the completion of the education program. Subjects/Setting: A convenience sample of 34 first-, second-, and third-year pediatric residents was enrolled in the study at an urban teaching hospital for academic year 2008-2009. Statistical Analyses: A paired-samples t-test was used to determine change in nutrition knowledge from pre- to post-test for total score and by educational objective. One-way analysis of variance (ANOVA) assessed the differences among residency years. Pearson product-moment correlations and independent-samples t-tests were used to assess relationships between change in scores and demographic characteristics. Results: The mean total pre-test score for all residency years was 12.18. Although post-test scores improved (12.50), they did not change significantly (p=0.36) for the total sample. There was a significant change in knowledge for objective 4 (adolescent nutrition) for the total sample (p=0.02) and for first-year residents (p=0.03). No significant difference in total change in score among years of residency was seen.
There were no significant relationships between demographic characteristics and change in nutrition knowledge. Conclusion: Pediatric residents have inadequate knowledge of pediatric nutrition. Further studies incorporating a larger sample, a greater number of hours of pediatric nutrition education and validated questions may further enhance resident learning.
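The change-in-knowledge analyses here, and in several other abstracts on this page, rest on the paired-samples t-test: the test statistic is the mean of the paired differences divided by its standard error. A minimal sketch on hypothetical pre/post scores (the p-value would then be read from a t distribution with df degrees of freedom):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Paired-samples t statistic and degrees of freedom:
    t = mean(d) / (sd(d) / sqrt(n)) for the paired differences d."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1

# Hypothetical quiz scores for five residents, pre and post education:
pre = [10, 12, 11, 13, 12]
post = [12, 13, 11, 15, 14]
t, df = paired_t(pre, post)  # t = 3.5 on 4 degrees of freedom
```

Pairing each resident with their own baseline removes between-resident variability, which is why this design can detect a modest knowledge gain that an unpaired comparison would miss.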


Changes in quality of life (QOL), metabolic parameters, and BMI percentiles for children and adolescents with Type 1 Diabetes Mellitus (T1DM) prior to and post initiation of insulin pump therapy

Rian Sutherland, 2009

Objective: The primary objectives of this study were to describe changes in quality of life (QOL), metabolic parameters, and BMI percentiles for children and adolescents with type 1 diabetes mellitus (T1DM) prior to and post initiation of insulin pump therapy. Methods: This mixed-methods research design consisted of a retrospective chart review of metabolic parameters and BMI percentiles before and after insulin pump therapy and an in-person interview with each child and family about changes in QOL since insulin pump therapy. Thirty-five children and adolescents (ages 5-20 years) with T1DM and utilizing insulin pump therapy were identified in a western Kentucky endocrinology practice from January 2007-July 2008. Results: Twenty-seven patients were enrolled and gave consent/assent. Six themes emerged from the interviews indicating insulin pump therapy had a positive impact on QOL. These included improved blood sugar control, school and social life, ease of insulin delivery, more flexible meal times, and increased independence; 26 participants stated insulin pump therapy was easier than injections. The differences in the number of DKA events, moderate and severe hypoglycemic events, HgbA1c values, and BMI percentiles were not statistically significant; however, there was a statistically significant decrease in the number of mild hypoglycemic events post initiation of insulin pump therapy compared to before pump therapy (p<0.05). Conclusions: This study demonstrated that insulin pump therapy had a positive impact on QOL and diabetes care for this group of children and adolescents. Further research is needed to explore QOL in relation to changes in metabolic parameters and BMI percentiles.


Research Interest and Research Involvement Among US Registered Dietitians

Melinda Boyd, 2015

Background: Assessment of research involvement among Registered Dietitians (RDs) has been described as a continuum with four levels. Research is important to advancing the profession of dietetics. Objective: To determine relationships between US RDs' research interest and involvement using the Interest in Research Questionnaire (IRQ) and Dietitian Research Involvement Survey (DRIS) scores, and to explore relationships between select professional characteristics and research interest and involvement. Design/participants/setting: A secondary analysis of data from a clinical trial of 580 US RDs in clinical practice. Statistical Analyses: Descriptive statistics and frequencies were reported for professional characteristics and research interest and involvement. Spearman's correlation was used to analyze relationships between variables. Results: Respondents had mean IRQ and DRIS scores of 55.0 out of 80.0 points and 23.5 out of 60.0 points, respectively. Frequency of reading research (r=0.298), highest degree earned at the time of completing the survey (r=0.172), and highest degree earned at the time of becoming an RD (r=0.137) were each positively correlated with IRQ scores. A positive correlation was found between research interest and involvement (r=0.435). Conclusions: RDs in clinical practice in the US are more likely to read professional materials and be involved with research if they are interested in research.


An assessment of physical therapists' attitudes, knowledge, and practice approaches regarding obese individuals

Suzanne Sack, 2008

Objective: To determine the attitudes, knowledge, and practice approaches of physical therapists (PTs) regarding the obese population and to explore the relationship between their attitudes and knowledge. Design: This study was a prospective paper mail survey. Physical therapists were sent a survey designed to obtain demographic characteristics as well as attitudes, knowledge, and practice approaches regarding obesity. The survey consisted of 31 multiple-choice, fill-in, Likert-scale, and true/false questions. Subjects: The American Physical Therapy Association provided 1000 randomly selected mailing addresses of physical therapists who were current members. Of the deliverable addresses obtained (n=998), 34.7% (n=345) completed the survey. Statistical Analyses: Descriptive statistics were used to summarize PT attitudes, knowledge, and practice approaches regarding obesity. The Pearson product-moment correlation and Spearman rank correlation coefficient were used to test the relationships between attitudes and knowledge. The a priori alpha level was set at 0.05. Results: The PTs believed that physical inactivity (92.75%, n=320) and overeating (78.49%, n=270) are the most important causative factors of obesity and that diet and exercise are the most effective treatments of obesity. When treating obese individuals, the respondents frequently recommend exercising more (87.37%, n=263), but rarely recommend making changes in nutritional habits or refer to other health care disciplines. Respondents reported neutral scores for their attitudes regarding obesity. The mean knowledge score was 6.74 out of 10. There were no significant correlations found between the respondents' knowledge and attitude scores regarding adjectives used to describe obese individuals. A significant correlation (r=0.133, p=0.043) was found between the respondents' knowledge score and attitudes regarding statements about obesity.
An inverse correlation was seen between the respondents' age and knowledge score (r=-0.195, p<0.0005) and between years in practice and knowledge score (r=-0.216, p<0.0005). Applications/Conclusions: Key findings from this study suggest that PTs have neutral attitudes towards obese individuals. PTs appropriately indicated that physical inactivity and poor nutritional habits contribute to obesity. Changes in practice patterns are warranted to improve PTs' referral systems in order to assist in the health care team approach to the treatment of obesity. Further education is needed in physical therapy programs to enhance PTs' knowledge of obesity.


The Impact of an enteral nutrition continuing education program for intensive care unit nurses on their knowledge of enteral nutrition and the enteral nutrition protocol at an acute care, community, level-two trauma hospital

Lisa Silberman, 2009

Objective: To determine the impact of an enteral nutrition (EN) continuing education (CE) program for intensive care unit (ICU) nurses (RNs) on their knowledge of EN. Design: A prospective pre/post-test design was used to evaluate the effectiveness of a 30-minute EN CE program on EN knowledge. The pretest consisted of 16 knowledge-based questions, representing four educational objectives, and eight demographic characteristic questions. Posttests, using the same 16 knowledge-based questions and three questions regarding the impact of the EN CE program, were distributed two weeks after the EN CE program. Subjects: Forty-six ICU nurses employed at an acute care, community, level-two trauma hospital. Statistical Analyses: Descriptive statistics were used to analyze demographic and educational characteristics and pre/post-test scores. A paired-samples t-test evaluated change in knowledge overall and by each educational objective. Pearson product-moment and point-biserial correlations assessed relationships between change in knowledge and demographic and educational variables. Independent-samples t-tests and one-way ANOVA analyzed mean differences between change in knowledge and categorical educational characteristics. The a priori alpha was set at 0.05. Results: Mean pretest score=8.89 (SD=2.37); mean posttest score=11.65 (SD=2.60). Scores increased significantly (mean change=2.76, p<0.0005) from pretest to posttest. Each of the four educational objective scores increased significantly: objective one, determine the rationale for use of EN support and implementation of the EN protocol (mean=0.85, p<0.0005); objective two, identify appropriate formula selection and equipment (mean=0.61, p<0.0005); objective three, identify the proper techniques for initiation and administration of EN (mean=0.83, p<0.0005); objective four, identify and implement the guidelines to properly monitor EN (mean=0.48, p=0.011).
There were weak, non-significant relationships between change in knowledge and demographic and educational characteristics. Application/Conclusion: ICU nurses' knowledge of EN improved significantly after the EN CE program. Further research regarding the most effective teaching methods for RN CE programs and the application of EN knowledge to practice is needed.


Knowledge, Perceptions and Practices of Registered Dietitians in the Dietetic Practice Group, Consultant Dietitians in Health Care Facilities, Regarding the American Dietetic Association's Standardized Language to Document the Nutrition Care Process

Therese Regan, 2009

Background: The American Dietetic Association (ADA) developed the Nutrition Care Process (NCP) to improve the consistency and quality of care provided by the dietetic practitioner and the predictability of patient/client outcomes. There is limited information regarding Registered Dietitians' (RDs') knowledge, perceptions and practices of ADA's standardized language (SL) to document the NCP among members employed in non-acute care settings. Members of the Dietetic Practice Group (DPG) Consultant Dietitians in Health Care Facilities (CD-HCF) were surveyed and represented a sample of dietitians, the majority of whom work in long-term care. Objective: To investigate the knowledge, perceptions and practices of the 2006-2007 RD members of the DPG CD-HCF regarding ADA's SL to document the NCP. Design: A prospective Internet-based survey consisting of 30 questions was developed and emailed to the 2006-2007 RD members of the DPG CD-HCF (n=4049). The RDs' knowledge, perceptions and practices were measured. The survey was not validated. Subjects: Participants who opened the survey (n=1179) were eligible to answer the demographic questions. A smaller subset (n=321) completed all knowledge, perception and practice questions. Statistical analyses: Descriptive statistics, Pearson correlation, point-biserial correlations and binary logistic regression. Results: Only a weak correlation was found between knowledge and perceptions (r=0.177, p<0.001). Respondents who used SL were significantly more likely to have higher SL Knowledge Scores (r=0.225, p<0.0001) and significantly more likely to have higher SL Perception Scores (r=0.384, p<0.0001). There was no significant interaction effect between SL Knowledge Scores and SL Perception Scores on RDs' SL practice (p=0.763). Nearly 70% (n=553) of the surveyed RDs did not use ADA's SL in their patient care practice. Conclusions: Knowledge and perception influenced use of ADA's SL.
Research is necessary to determine how state- and federally-mandated language requirements coordinate with use of SL in the long-term care setting. Continuing education programs regarding ADA's SL should strive for small, incremental changes, providing skills and hands-on training specific to the long-term care setting.


Perceived Needs, Interests, and Practices in Weight Management of UMDNJ Faculty, Staff, and Students

Laurice Wong, 2009

Background: National health objectives set by Healthy People 2010 support the development of weight management programs in the workplace to help combat the growing obesity epidemic. Objective: The primary aims of the study were to identify the perceived needs and interests of UMDNJ faculty, staff, and students for weight management resources, and to explore personal and professional practices. Design/Subjects: A systematic random sample of 4000 faculty, staff, and student e-mail addresses from the 2008-09 academic year was obtained. Using a prospective internet-based survey, weight management needs, interests, and practices were collected. Statistical Analysis: All data were downloaded and analyzed using SPSS v16.0; alpha was set at p ≤ 0.05. Descriptive and inferential statistics were used. Results: Of the valid e-mails distributed (n=3921), there was a 15.6% response rate (n=612). More than 50.0% of respondents were staff (n=332, 54.6%), followed by students (n=180, 29.4%) and faculty (n=98, 16.0%). The mean calculated BMI was 26.6 kg/m2 (SD=6, range=15.5-51.2 kg/m2). Staff had the greatest frequency of obesity (n=111, 33.5%). One half of respondents (n=308, 50.3%) felt it necessary and important to have weight management resources available at UMDNJ. The weight management resources most frequently identified as necessary were an on-site fitness center (n=481, 79.5%) and access to healthy dining options or diet foods (n=468, 78.5%). Among respondents who provide patient care, those who were obese were significantly more likely not to screen for overweight/obesity (Pearson χ2=8.927, p=0.012) and more likely not to provide weight management counseling (Pearson χ2=8.885, p=0.012) than those who were normal weight or overweight. Application/Conclusions: UMDNJ respondents are interested in having weight management resources at work. The study provides a baseline needs assessment for work-based weight management resources.
Healthcare professionals need to be educated regarding weight screening, counseling, and referral of patients. Further research should be conducted to examine weight status, personal, and professional practices of various healthcare disciplines.
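The Pearson chi-square values reported in the abstract above test independence in a contingency table (e.g., obesity status by screening practice): each cell's observed count is compared with the count expected under independence. A minimal sketch of the statistic on a hypothetical 2x2 table (the p-value would then come from a chi-square distribution with (r-1)(c-1) degrees of freedom):

```python
def chi_square(table):
    """Pearson chi-square statistic: sum of (observed - expected)^2 / expected
    over all cells, where expected = row_total * col_total / n."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    return sum((obs - rt * ct / n) ** 2 / (rt * ct / n)
               for row, rt in zip(table, row_tot)
               for obs, ct in zip(row, col_tot))

# Hypothetical counts: rows = obese / not obese, cols = screens / does not:
stat = chi_square([[10, 20], [20, 10]])  # about 6.67 on 1 degree of freedom
```

The larger the statistic, the further the observed cell counts sit from what independence would predict; a statistic near zero, as for a perfectly balanced table, indicates no association.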


Primary Practice Area and Selection of Nutrition Diagnostic Labels and Their Defining Characteristics

Yalaka Totton, 2007

Background As part of the Nutrition Care Process, the newly developed nutrition diagnostic labels should be agreed upon by registered dietitians (RDs) in all areas of clinical practice. Research examining the use of the labels and their defining characteristics by RDs in various practice areas is helpful in determining how applicable the labels are in each area. Objective To examine existing data on the selection of nutrition diagnostic labels and ratings of defining characteristics in relation to primary practice areas (geriatrics, pediatrics, nutrition support, diabetes care, general clinical practice, and renal). Design Retrospective analyses were performed utilizing data from a previous web-based case scenario study (1) on the relationship between level of practice and reliability of nutrition diagnostic labels. Subjects/setting In the original study, RDs were recruited from ADA's email database and from identified experts. In this study, RD respondents were grouped according to their self-selected primary practice areas. Sample sizes differ due to variable responses to questions in the original study (N=273 for the case scenario responses; N=286 for the rating of defining characteristics responses). Statistical analyses performed Descriptive statistics, chi-square and t-tests were used (p = 0.05). Results Significant differences between the mean number of nutrition diagnostic labels selected by primary practice area were found in only five of 72 (7%) t-test comparisons. A significantly (χ2 = 4.173, p=0.041) higher percentage of RDs in the nutrition support area selected hypermetabolism in response to the Acute Care Burn Injury case. Additionally, a significantly (χ2 = 4.64, p=0.031) lower number of RDs in geriatrics selected involuntary weight loss in response to the Long-term Care case.
Compared to other areas, the geriatrics, renal, and general clinical practice areas rated three different major defining characteristics significantly higher (p < 0.05) under the diagnosis not ready for lifestyle change. The diabetes area rated two defining characteristics significantly higher for inconsistent carbohydrate intake. There were no significant differences in average ratings by the nutrition support and pediatrics areas. Conclusions Regardless of primary practice area, RDs were fairly consistent in their use of nutrition diagnostic labels and ratings of major defining characteristics. The few differences that were found may be due to RD experience with the nutrition problems more commonly encountered in their primary area of practice. Prospective studies in this area of research are indicated.


The Relationship Among Years Post-RD, Demographic Characteristics and Level and Frequency of Involvement in Selected Job Activities

Yuko Horio-Scheld, 2008

Objective: To examine the performance of Registered Dietitians (RDs) who responded to the 2005 Dietetics Practice Audit in selected job activities at years one, three, and five following receipt of the RD credential. Design: This study was a retrospective secondary analysis of selected data from the 2005 CDR Dietetics Practice Audit, focusing on three practice-area job activity classifications: general, conducting research, and providing nutrition care to individuals. Subjects: RDs in their first, third, and fifth years since becoming an RD who responded to the 2005 CDR Dietetics Practice Audit and who held a dietetics-related position(s). Statistical analysis: SPSS 15.0 was used for data analysis. The a priori alpha level was set at p ≤ 0.05. Descriptive statistics, chi-square, one-way ANOVA and the Tukey test were used to explore relationships between level and frequency of involvement, years since becoming an RD and demographic characteristics. Results: Approximately 34% of RDs (n=450) had a graduate degree(s), 11% (n=145) were enrolled in a degree program, 15% (n=191) entered dietetics as a second career, and 59% (n=754) worked in clinical practice as their primary position. RDs at year one (n=315, 73.4%) were more involved in recommending nutrition status lab tests than RDs at year five (n=286, 63.7%) (χ2=13.687, p=0.033). RDs at year one (n=297, 69.2%) were more involved in evaluating clients' overall health status than RDs at year five (n=274, 61.3%) (χ2=12.568, p=0.050). Frequency of involvement in these activities did not change significantly at years one, three or five. Greater than 90% were not involved in research activities except reviewing research literature; there were no statistically significant differences in level of involvement among the three year groups for research activities.
Years of dietetics-related work experience for RDs at the "perform myself" level was approximately 0.6±0.2 years longer than for those at the "no involvement" level for the "develop research proposals" activity (F=2.786, p=0.040). Application/Conclusion: The results of this study suggest that level and frequency of performance in selected dietetics job activities vary subtly with years in practice.


The Effect of Telephone-Directed Intervention using Medical Nutrition Therapy on Clinical Outcomes in Post-Surgery Roux-en Y Gastric Bypass Patients

Sue Benson-Davies, 2008

Background: An increasing number of rural post-Roux-en-Y gastric bypass (RYGB) surgery patients have limited access to post-surgery support systems that provide reinforcement for behavior modification and weight gain prevention. A telephone-directed intervention (TDI) was explored to increase further weight loss by promoting diet and exercise behavior adherence. Objective: To investigate whether a 12-week TDI protocol using medical nutrition therapy (MNT) could increase a patient's ability to achieve personal weight and behavioral goals. Design: A prospective randomized controlled trial was conducted. Subjects/setting: Eighty-six post-RYGB patients who had surgery at the Spearfish Regional Surgery Center in Spearfish, South Dakota, between May 2003 and January 2007 were invited by postal mail to participate in the study. The inclusion criteria consisted of patients who were at least 12 months post-RYGB and weight stable. Consented study participants were randomized into a control or intervention group. Intervention: Demographic characteristics, comorbidities, treatment characteristics and a readiness-to-change behavior survey were collected at a baseline clinic appointment for all study participants. Each study participant determined one diet and one exercise behavior goal that he or she was willing to implement during the 12-week study with the intent of losing more weight. The control group received usual care, while the intervention group received 20 minutes of MNT via telephone at weeks two, four, six, eight and 10 of the intervention. Study completion data were collected during week 12 at a final clinic appointment. Main Outcomes Measured: Weight loss and diet and exercise goal achievement were measured. Statistical Analysis: Independent t-tests, Pearson's chi-square and Fisher's exact test were performed using SPSS v15.0. Results: Twenty-four percent (n=21) of 86 patients met the inclusion criteria and were consented for study participation.
Ninety-five percent (n=20) completed the study. The number of registered dietitian visits at six months post-surgery differed significantly (p=0.03) between the intervention (80%; n=8) and control (27%; n=3) groups at baseline, while the other demographics, comorbidities and treatment characteristics did not differ significantly. The intervention group achieved their diet goals five days per week during weeks five (p=0.042), six (p=0.007) and seven (p=0.02), compared to three days per week for the control group. The control group exercised five days during week two (p=0.008) of the intervention, compared to two days of exercise for the intervention group. Weight change between the control (83.02±26.0 kg; n=10) and intervention (98.71±24.4 kg; n=10) groups at the end of the study was not significant (p=0.171). Conclusions: TDIs may be a highly applicable, accessible and feasible method of delivering MNT for weight loss support in a rural post-RYGB population.


Clinical outcomes of medical nutrition therapy implemented by a Licensed Dietitian to Hispanic female patients with type 2 diabetes mellitus living in Puerto Rico

Ivonne Anglero, 2009

Objective: To evaluate the effect of Medical Nutrition Therapy (MNT) implemented by a licensed dietitian on clinical outcomes in females with type 2 diabetes mellitus (T2DM) living in Puerto Rico (PR) at 12 weeks post-intervention compared to a non-participatory group. Methods/Design/Population: This was a retrospective chart review, conducted in an ambulatory clinic in San Juan, PR, of adult females who did and did not receive MNT. An analysis of 140 medical records compared the demographics, disease severity, treatment characteristics and comorbidities of the participatory (n=70) and non-participatory (n=70) groups at baseline and 12 weeks post-intervention. Descriptive and inferential statistics (ANOVA and ANCOVA) were used. Results: At baseline, no statistically significant differences between the groups were noted for age (t=-0.608, p=0.544), weight (t=1.398, p=0.164), and mean duration of T2DM diagnosis (t=-1.565, p=0.120). A statistically significant difference was observed at baseline between groups for total cholesterol (t=2.369, p=0.019) and the use of medications for T2DM (χ2=12.933, p=0.004). After adjusting for diabetes medications and total cholesterol at baseline, the participatory group had significant reductions in fasting plasma glucose (F=8.235, p=0.005), hemoglobin A1C (F=16.489, p<0.0005), weight (F=9.772, p=0.002) and BMI (F=10.509, p=0.001) at the end of the 12-week period. At study conclusion, the participatory group had significantly lower Framingham Risk Scores (F=5.729, p=0.210) compared to the non-participatory group. Conclusion: The study findings supported the value of MNT rendered by licensed dietitians in achieving treatment goals for females with T2DM living in PR. To our knowledge, this is the first research to investigate the effect of MNT on clinical outcomes in PR females with T2DM.


The Effect of Peer Nutrition Counseling in a Health and Wellness 100-Level Course on Anthropometric, Dietary, and Physical Activity Outcomes

Nicole Clark, 2008

Objective: To evaluate the effect of two one-hour individualized peer nutrition counseling sessions on non-nutrition majors' anthropometric measurements (weight, waist circumference, BMI), dietary intake (consumption of fruits, vegetables, dietary fat, calories and fiber) and physical activity patterns (frequency, duration, and type). The primary outcome was change in fruit and vegetable intake from baseline (week 4 of the semester) to study completion (week 13 of the semester) in the treatment group. Participants: The subjects were a convenience sample of 81 non-nutrition majors, 40 in the treatment group and 41 in the control group, enrolled in an introductory health and wellness course at a state-run university in western Pennsylvania. The majority of the participants were 19-year-old Caucasian freshman females living in the dormitories and majoring in business. Methods: A 24-hour recall was used to collect baseline and study-completion dietary and physical activity information. The study lasted 12 weeks, with the treatment group receiving two individualized one-on-one peer nutrition counseling sessions from senior-level nutrition majors. In the first peer counseling session, a primary goal addressing a Healthy People 2010 nutrition and physical activity topic was developed and goal achievement techniques were discussed. The second peer counseling session focused on goal refinement. Results: There was a significant increase in fruit intake (F=6.621, p=0.003) and vegetable intake (F=6.237, p=0.015) in the treatment group, and there was an increase in the treatment group's duration (F=4.261, p=0.042) and frequency of physical activity (F=4.977, p=0.029), indicating that peer counseling can affect dietary and physical activity patterns. There was no significant change noted in anthropometric outcomes.
Conclusion: Peer nutrition and activity counseling that focuses on goal setting may be a cost-effective way to change college-age students' dietary and activity patterns.


Enteral Nutrition in the ICU: Can Intakes Be Improved and Does it Matter?

Angela MacDonald, 2008

Background: The importance of early and adequate enteral nutrition (EN) has not been clearly established in the medical population. The purpose of this study was to investigate the impact of an educational intervention on EN administration practices and patient outcomes in a medical intensive care unit (MICU). Methods: This retrospective study evaluated two groups of 51 patients before (Phase I) and after (Phase III) the educational intervention (Phase II). Pre- and post-tests were used to evaluate the change in knowledge from the educational intervention, which emphasized the importance of early and adequate EN. Data collected in Phases I and III evaluated EN initiation, advancement, adequacy of intakes and holding practices, as well as time on the ventilator, length of stay (LOS) in the MICU and hospital, and discharge status. Results: There was a significant change in knowledge resulting from the educational intervention (p<0.001). The participants in Phase III were administered EN for significantly fewer hours (p=0.035), received more calories on day 2 of EN (p=0.032), and received more calories and protein per day in the MICU than Phase I participants (p<0.001). There was a significant decrease in MICU LOS of 2.2 days (p=0.007) and hospital LOS of 4.9 days (p=0.008) in Phase III. Conclusion: The change in nutritional knowledge on the importance of early and adequate EN resulted in significantly more EN being administered earlier in the hospital course and per day in the MICU. This may have contributed to the decreased LOS in the MICU and hospital.


Evaluation of clinical outcomes of participants in the UMDNJ worksite wellness program

Felicia Stoler, 2008

Objective: To measure the effect of the UMDNJ Worksite Wellness Program on clinical outcomes from baseline to 12 weeks and at intermediate follow-up, and to determine differences in demographic characteristics and clinical outcomes between those who did and did not return for intermediate follow-up after the 12-week program. Design: A retrospective, secondary analysis of data collected through the UMDNJ Worksite Wellness Program. The data included demographic information and history of chronic disease. The clinical outcomes examined at baseline, 12 weeks and intermediate follow-up included body weight, BMI, waist and hip circumferences, systolic and diastolic blood pressure, percent body fat, total cholesterol and non-fasting glucose. Subjects/Setting: There were 166 subjects in the original cohort, with 117 completing the 12-week intervention. For the purpose of this study, only female participants from groups one through three were analyzed (N=82). Of this sample, 54 volunteered to return for intermediate follow-up and 24 did not. Statistical analyses: Frequency distributions and summary statistics were used to display the categorical and continuous data. Inferential statistics included split-plot 2x2 ANOVA, independent samples t-tests, and Pearson product moment correlation coefficients. The a priori alpha was set at p=0.05. Results: There was a significant interaction between group assignment and time for hip circumference (F=10.199, p=0.002) only. However, all subjects improved at the end of the 12-week intervention with reductions in clinical outcome measures: body weight (F=41.195, p<0.0005), BMI (F=41.988, p<0.0005), systolic blood pressure (F=8.21, p=0.005), diastolic blood pressure (F=6.051, p=0.016), waist circumference (F=58.066, p<0.0005), hip circumference (F=71.039, p<0.0005) and percent body fat (F=7.946, p=0.006).
Decreases in weight at week 12 were positively correlated with BMI (r=0.997, p=0.01), waist circumference (r=0.388, p=0.01), and hip circumference (r=0.522, p=0.01). Those who returned for intermediate follow-up had a lower mean body weight at baseline (190.8 ± 44.7 lbs) and at 12 weeks (187.7 ± 44.6 lbs), and they sustained the changes in the clinical outcome measures at intermediate follow-up. Conclusions: The worksite wellness program, which integrated nutrition and physical activity behavior modification through both group and individual sessions, had a positive impact on reducing clinical outcome measures during the 12-week intervention. Those who returned for intermediate follow-up sustained the clinical outcome levels from the end of the program, demonstrating maintenance of change rather than recidivism after participation without continued intervention.


Utilization of the standardized language of dietetics in clinical practice.

Deborah Hutcheson, 2007

In 2003, the American Dietetic Association (ADA) implemented the Nutrition Care Process (NCP) and Model. The second step of the NCP, the diagnostic step, includes the development of a nutrition diagnostic taxonomy in a standardized language. This step includes terms that are grouped into four distinct components: a nutrition diagnostic label, a definition of the nutrition diagnostic label, etiology (cause/contributing risk factors), and signs/symptoms (defining characteristics). Research is essential to confirm that the diagnoses and their defining characteristics accurately represent the observable data and are consistently used in practice by practitioners across patient populations. Little research to date has examined the actual use of the diagnostic labels and defining characteristics in practice sites. This pilot study was a retrospective descriptive study of the use of the ADA diagnostic labels and their associated defining characteristics (ADADFCs) by dietitians over a six-month period in a 264-bed acute care community hospital in southwestern Pennsylvania. Among the 183 complete chart notes, the three most often used diagnostic labels were: Inadequate Oral Food/Beverage Intake (48.4%, n = 1221), Involuntary Weight Loss (12.8%, n = 322), and Underweight (10.2%, n = 258). These three labels accounted for 71.4% (n = 1801) of all diagnostic labels used. Three of eight (37.5%) ADADFCs for Inadequate Oral Food/Beverage Intake, five of 16 (31.3%) ADADFCs for Underweight, and four of ten (40%) ADADFCs for Involuntary Weight Loss met a moderate level of agreement or higher. Both critical clusters (use > 80%) and critical defining characteristics (use > 95%) of ADADFCs were identified. This pilot research may be used as baseline data for future comparative studies validating the diagnostic components of the Nutrition Care Process (NCP) in practice.


Hypocaloric Enteral Nutrition Support in Mechanically Ventilated Patients in an MICU

Jennifer Tomesko, 2008

Objective: The goal was to determine the effect a hypocaloric enteral nutrition (EN) regimen had on critically ill patients' clinical outcomes (i.e., mean blood glucose values, antibiotic hours) and economic outcomes (hours in the MICU, hours on mechanical ventilation) throughout their medical intensive care unit (MICU) length of stay (LOS). Design: A retrospective chart review of 108 medical records; 54 in the hypocaloric and 54 in the eucaloric group. Setting/Subjects: The study was conducted at an urban teaching hospital and limited to adult patients aged 18 years or older receiving EN and mechanical ventilation for at least 48 hours from June 2004 through September 2006. Statistical analyses performed: Results were analyzed using independent samples t-tests, Chi-square analyses, and Pearson correlations. All data were analyzed using the Statistical Package for the Social Sciences, SPSS (version 14.0). Statistical significance was set with an a priori alpha level of 0.05. Results: The hypocaloric group received a significantly (t = -12.879, p < 0.0005) lower amount of total kcals/kg (19.18 ± 3.40) than the eucaloric group (30.04 ± 5.18). The actual body weight (t = 1.433, p < 0.0005), utilized body weight (t = 4.978, p < 0.0005), and BMI (t = 4.978, p < 0.001) of the hypocaloric group were significantly greater than those of the eucaloric group. The hypocaloric group had a significantly (x2 = 12.611, p < 0.0005) greater frequency of patients receiving insulin infusions (46.3%, n = 25) than the eucaloric group (14.8%, n = 8). A significant, although weak, correlation was found between insulin infusions and hours on antibiotics (r=-0.293, p=0.003), MICU LOS (r=-0.225, p=0.019), and hours on mechanical ventilation (r=-0.282, p=0.003). The hypocaloric group had a lower mean daily blood glucose (140.28 ± 24.94) than the eucaloric group (149 ± 34.88) (t = -1.59, p = 0.115).
The eucaloric group had fewer hours on antibiotics (F=0.035, p = 0.852), hours in the MICU (F=0.055, p = 0.814), and hours on mechanical ventilation (F=0.483, p = 0.468) than the hypocaloric group. Conclusion: Hypocaloric EN support did not result in any statistically significant differences in patient outcomes between the hypocaloric and eucaloric groups. Although not statistically significant, the results may have been clinically important. The definition of hypocaloric requires further exploration since it is currently defined by a broad range.


Self-Reported weight status, co-morbidities and follow-up characteristics - Results of a survey of post-bariatric surgery patients

Lynn Monahan Couch, 2008

Objective: To examine the relationship between self-reported weight status, the severity of co-morbidities and follow-up characteristics among adult patients who underwent bariatric surgery and returned a mailed post-surgery survey. Design: A retrospective, descriptive secondary analysis of a closed data set from a cross-sectional post-bariatric survey. The survey contained 20 multiple-choice questions regarding self-reported weight status, prevalence and severity of co-morbidities and characteristics of follow-up care, as well as questions on demographic characteristics. Subjects: The study sample consisted of 331 of 1,252 eligible adult bariatric patients (26.4% response rate) who responded to a July 2006 post-surgery survey and who had undergone bariatric surgery at the Christiana Care Weight Management Center bariatric surgery program at the Preventive Medicine and Rehabilitation Institute (PMRI) in Wilmington, DE between October 2001 and June 2006. Statistical Analyses: Descriptive statistics were used to summarize data. Chi-square Tests of Independence and Fisher's Exact Tests as relevant, independent samples t-tests and Pearson Product Moment Correlation Coefficients were used to explore relationships between demographic characteristics, self-reported weight status, prevalence and severity of co-morbidities and follow-up characteristics among the study sample. The a priori alpha level was set at 0.05. Results: The study sample was predominantly female (85.8%, n = 284) with a mean age of 47.9 ± 9.4 years. Those reporting an improvement in dyslipidemia (t = -3.541, p < 0.0005) and obstructive sleep apnea (t = -3.994, p < 0.0005) symptoms post-surgery lost significantly more weight than respondents who did not report an improvement. Follow-up with the Registered Dietitian was significantly associated with improved symptoms of dyslipidemia (x2 = 6.821, p = 0.009).
Respondents who desired follow-up care had lost significantly less weight (t = 2.345, p = 0.020) than those not desiring follow-up care. Those who attended support groups lost significantly more weight (t = 2.088, p = 0.038) than those who did not. Significantly more women than men reported lack of health insurance (p = 0.009) and follow-up with their surgeon (x2 = 2.287, p = 0.038) as reasons for not participating in follow-up care. Significantly more men than women did not follow up because they felt fine and felt they did not need follow-up (x2 = 9.070, p = 0.003). Respondents who followed up with the RD (t = 2.577, p = 0.011) or who stated they were too busy to return for follow-up care (t = 2.219, p = 0.029) were significantly younger. Application/Conclusion: This study demonstrated that those who attended bariatric support groups lost more weight and that those who lost less weight strongly desired to participate in follow-up services. Gender and age differences also affected follow-up practices. These findings support the need for additional studies examining bariatric patients' follow-up practices, perceived needs and possible barriers to follow-up care, particularly as they influence weight-loss status and co-morbidities. Further research should target patient beliefs, attitudes and barriers to follow-up, as this information may be useful in developing strategies to improve adherence to follow-up programs and patient outcomes.


Impact of an Oral Health Assessment education program for nursing staff on nurses' knowledge and practice regarding completion of oral health assessments

Nancy Munoz, 2008

Design: The study used a pre- and post-test design with an education intervention to measure nursing knowledge regarding oral health assessments in the elderly, along with a retrospective chart review to assess changes in patient-care practices before and after the education intervention. Subjects: Licensed nurses in three skilled nursing facilities (SNF) responsible for completing the NA and MDS assessments (N=35). Statistical Analysis: Nurses' knowledge was described via pre- and post-test scores (mean, standard deviation and range) and analyzed using a paired t-test. Change in practice was described using frequency distributions and chi-square. SPSS 15.0 was used for data entry and analysis, with a priori alpha set at < 0.05. Results: Nursing knowledge increased significantly from pre- to post-test: mean scores rose from 9.63 (pre-intervention) to 12.03 (post-intervention) (t=-4.04, p<0.005). Post-intervention, a greater number of NAs were complete (34.70%, n=52) compared to pre-intervention (26.70%, n=40). The number of dental referrals increased significantly (p=0.011) from 0.00% (n=0) pre-intervention to 38.5% (n=5) post-intervention. There were significantly (x2=10.25, p=0.001) more medical records with all variables related to patient-care practices completed in the post-intervention phase of the study. Application and Conclusion: The results of this study support that providing nurses with education on how to perform oral health assessments can increase nurses' knowledge and improve patient-care practices for completing the oral health assessments of the institutionalized elderly.
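As an aside for readers unfamiliar with the paired t-test used in the abstract above: the statistic compares pre- and post-scores within the same subjects. A minimal sketch in Python, using hypothetical scores rather than the study's data:

```python
import math

def paired_t(pre, post):
    # Paired t statistic: mean of the within-subject differences
    # divided by the standard error of that mean.
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical pre/post knowledge scores (the study's raw data are not reproduced here).
pre = [9, 10, 8, 11, 10]
post = [12, 13, 11, 12, 12]
print(round(paired_t(pre, post), 3))  # → 6.0
```

A statistics package such as SPSS, which the study used, additionally converts t to a p-value using the t distribution with n-1 degrees of freedom.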


Comparative effectiveness study of an online and face-to-face, college-level course (Nutrition in the Lifespan) in two New Jersey high schools

Kyle Thompson, 2014

Background: There is a dearth of high-quality comparative studies examining outcomes of online vs. face-to-face classes in high school populations. Objective: The objective of this research was to compare course grades and standardized examination scores between students enrolled in either an online or face-to-face version of a nutrition course offered for potential college credit within the Rutgers Health Science Careers program. Design: This was a retrospective feasibility study. Participants/setting: The participants were 32 high school students of traditional age who enrolled in either an online (n = 19) or face-to-face (n = 13) version of “Nutrition in the Lifespan”. The setting was two suburban New Jersey high schools during the spring semester of 2014, with one school offering the online course and the other offering the face-to-face course. Intervention: Identical content was delivered to both groups of students by either an online teacher or a face-to-face teacher. Main outcome measures: Outcome measures were course letter grades and standardized examination scores. Statistical analyses performed: A Mann-Whitney U test was used to evaluate comparative data. Results: Course grades were significantly higher for the face-to-face group (p < 0.001). There was no significant difference between groups for standardized examination scores (p = 0.097). The number of students receiving college credit for the course was significantly higher in the face-to-face group (p = 0.021). Conclusions: Further research utilizing larger group sizes is needed to determine comparative learning outcomes between online and face-to-face versions of “Nutrition in the Lifespan”.
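The Mann-Whitney U test named above compares two groups by ranking the pooled scores rather than comparing means, which suits ordinal outcomes such as letter grades. A minimal sketch of the U statistic, using hypothetical grades rather than the study's data:

```python
def mann_whitney_u(x, y):
    # U statistic for group x: rank-sum of x (midranks for ties)
    # minus the minimum possible rank-sum n1*(n1+1)/2.
    combined = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    values = [v for v, _ in combined]
    rank_sum_x = 0.0
    pos = 0
    while pos < len(values):
        j = pos
        while j + 1 < len(values) and values[j + 1] == values[pos]:
            j += 1  # extend over a run of tied values
        midrank = (pos + j + 2) / 2  # average of the 1-based ranks pos+1 .. j+1
        rank_sum_x += sum(midrank for k in range(pos, j + 1) if combined[k][1] == 0)
        pos = j + 1
    n1 = len(x)
    return rank_sum_x - n1 * (n1 + 1) / 2

# Hypothetical numeric course grades for the two delivery formats (not the study's data).
face_to_face = [92, 88, 85]
online = [70, 75, 80]
print(mann_whitney_u(face_to_face, online))  # → 9.0
```

A statistics package then converts U to a p-value; here every face-to-face grade exceeds every online grade, so U equals its maximum, n1·n2 = 9.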


MAHI (Mobile Access to Health Information): An Integrative Technology Approach for Enhancing Diabetes Self-Care and Quality of Life

Patricia Davidson, 2008

Objective: To evaluate the effect of the type of monitoring [interactive technology featuring a mobile tracking system (MAHI) versus traditional] on change in Diabetes Self-Care Behaviors (diet, exercise, self-blood glucose monitoring) and Diabetes Quality of Life (DQL) score. Design: A prospective, randomized cohort (five separate and consecutive cohorts) outcomes research study that examined changes in participants' Diabetes Quality of Life (DQL) score and Diabetes Self-Care Behaviors (diet adherence, exercise frequency, self-blood glucose monitoring frequency) in the context of traditional versus interactive technology monitoring. The primary outcome measure was the change in total DQL score; the secondary outcome measure was the change in Diabetes Self-Care Behaviors. Subjects/Setting: Forty-nine participants (29 male, 20 female) with either type 1 or type 2 diabetes, ranging in age from 18 to 65, who were participating in a six-week American Diabetes Association-recognized diabetes education program in northern New Jersey between March 15 and August 30, 2007; 25 participants were in the interactive technology group and 24 in the traditional monitoring group. Statistical analyses: Comparisons using independent samples t-tests, chi-square, or Fisher's Exact test were completed for between-group analyses to evaluate equivalence between groups at the start and completion of the education program. A split-plot analysis of variance was performed to investigate the effect of the type of intervention on DQL and diabetes self-care behaviors (exercise and monitoring frequency). An a priori alpha level of 0.10 was set for statistical significance. All analyses used the Statistical Package for the Social Sciences, SPSS (version 14.0).
Results: There were no significant differences between groups (interactive and traditional) for gender (x2 = 0.556, p=0.456) or comorbidities (hypertension x2 = 0.010, p=0.460; coronary artery disease x2 = 0.987, p=0.160; neuropathy x2 = 1.04, p=0.154; high-risk waist circumference x2 = 2.62, p=0.135) at the initial assessment. Individuals in the interactive technology group (n=25) demonstrated a significantly higher level of diet adherence than those in the traditional group (n=24) (Fisher's Exact test p=0.004). Independent of group, there was a significant improvement within subjects from the first educational session to the last in DQL scores (F=23.86, p<0.0001), exercise frequency (F=42.35, p<0.0001) and self-blood glucose monitoring frequency (F=88.14, p<0.0001). There were no significant interaction effects of the type of monitoring for any of the outcome variables [DQL (F=0.027, p=0.871), exercise frequency (F=0.364, p=0.550), self-blood glucose monitoring frequency (F=0.007, p=0.935)]. Applications/conclusions: The results suggest that both groups improved significantly in DQL and diabetes self-care behaviors, and that the interactive technology group additionally improved in dietary adherence. Further research is needed to evaluate whether the additional information provided by this technology (images of meals, nutrition fact labels, real-time reflection by the patient) can further the clinician's ability to develop and promote the self-care behaviors of the patient.


A Descriptive Study of Patients at a Northern California Primary Care Medical Practice: A Descriptive Pilot Study

Cheryl Peters, 2009

Objective: To describe the demographic characteristics, anthropometric measures, medical diagnoses, and cardiometabolic risk factors (CmRFs), and to explore the relationships between the demographic characteristics and the CmRFs, of adult patients seeking care at a northern California medical clinic. Design: A retrospective chart review of randomly selected medical records from the Nutrition and Lifestyle Medical Clinic (NLMC) patient population with an initial visit between May 15, 2006 and August 15, 2008. Subjects: The study sample included 366 electronic medical records (EMRs) from patients aged 19 and older. The majority were female (70.5%, n=258), Caucasian (92.9%, n=340), and had >16 years of education (61.2%, n=224); half were >55 years of age (50%, n=183). Statistical Analyses: Frequency distributions, descriptive statistics, Chi-square analyses, and independent samples t-tests were used to describe the patient profile of demographic characteristics, anthropometric measures, medical diagnoses, and CmRFs. The a priori alpha level was set at p=0.05. Results: The most prevalent medical diagnoses were hypercholesterolemia (66.9%, n=245) and hypertension (51.9%, n=190). The >55 age group was most likely to have at-risk levels of abdominal obesity (x2=4.184, p=0.041), systolic blood pressure (x2=3.161, p<0.0001), diastolic blood pressure (x2=4.856, p=0.028), total cholesterol (x2=2.201, p<0.0001), low-density lipoprotein (x2=8.218, p=0.004), fasting blood glucose (x2=1.050, p=0.001), and triglycerides (x2=8.522, p=0.004). Males were significantly more likely to have at-risk levels of systolic blood pressure (x2=5.419, p=0.020) and triglycerides (x2=6.522, p=0.011). Non-Caucasians were significantly more likely to have at-risk levels of high-density lipoprotein (x2=5.214, p=0.022) and triglycerides (x2=8.041, p=0.005).
Application/Conclusion: The findings of this study demonstrate the use of an EMR in a primary care setting to identify patient profiles and screen for risk factors. Future resources for interventions at the NLMC should be directed toward female patients >55 years of age and those with diagnoses of hypercholesterolemia, hypertension, and abdominal obesity.
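The Chi-square analyses reported above test whether two categorical variables (for example, sex and at-risk triglyceride status) are independent. A minimal sketch of the Pearson chi-square statistic for a 2x2 table, using hypothetical counts rather than the study's data:

```python
def chi_square_2x2(table):
    # Pearson chi-square for a 2x2 contingency table (no continuity correction):
    # sum over cells of (observed - expected)^2 / expected,
    # where expected = row_total * column_total / grand_total.
    (a, b), (c, d) = table
    n = a + b + c + d
    rows, cols = (a + b, c + d), (a + c, b + d)
    obs = [(a, b), (c, d)]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = rows[i] * cols[j] / n
            chi2 += (obs[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical counts: at-risk vs. normal triglycerides by sex (not the study's data).
table = [[30, 20],   # male: at-risk, normal
         [15, 35]]   # female: at-risk, normal
print(round(chi_square_2x2(table), 3))  # → 9.091
```

The statistic is then compared against the chi-square distribution with (rows-1)(cols-1) = 1 degree of freedom to obtain the p-value.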


Nutrition Support Practice in Critically Ill Patients - A Retrospective Descriptive Pilot Study

Caroline Kiss, 2009

Objective: To describe the nutrition support provided to critically ill patients with a length of stay of greater than 72 hours, and to explore the relationship between route of nutrition support and timing of nutrition support initiation, energy delivery, and protein delivery. Design: A retrospective review of patients admitted to the Intensive Care Unit (ICU) from July 2006 to June 2008 was performed. Patients remaining >72 hours in the ICU and receiving nutrition support were eligible. Data collection included demographic and clinical characteristics of patients and nutrition support practices. Subjects: A convenience sample of eligible patients was used to collect data from the electronic medical chart. Of the 3,915 patients admitted within the timeframe, 276 (7.0%) stayed >72 hours in the ICU, and 133 (3.4%) received nutrition support. Statistical Analyses: Descriptive statistics (number and percent for categorical variables; mean, median, standard deviation, and range for continuous variables) were used to display demographic and clinical characteristics and nutrition support practices. The Chi-square test was used to assess the relationship between type of admission clinic and route of nutrition support. One-way ANOVA was used to determine whether there was a significant difference between the route of nutrition support and timing of nutrition support initiation and mean daily energy and protein delivery. For significant differences among the three groups, Tukey's Honestly Significant Difference (HSD) test was used for post hoc pairwise comparison. The a priori alpha level was set at 0.05. Results: Sixty-five percent of patients were male, the mean age of all patients was 66.6 years, and the majority of patients were admitted from surgery (71.4%). The most common diagnosis category was gastrointestinal disease (42.1%), followed by respiratory failure (32.8%).
The mean severity of illness score as measured by the Simplified Acute Physiologic Score II was 39.3; 63.2% of patients required mechanical ventilation, and mean ICU length of stay was eight days. Mean body weight was 76.5 kg (n = 86), and mean body mass index was 27.0 kg/m2. Initiation of nutrition support occurred 40.6 hours (SD = 31.2) after admission to the ICU, and support was maintained for 5.9 days (SD = 5.4). The route of nutrition support was enteral nutrition (EN) in 38.3%, parenteral nutrition (PN) in 40.6%, and both EN and PN in 21.1%. For EN, gastric feeding was used in 86.1%, with an isocaloric or energy-dense standard formula in most patients (81.2%). All patients on PN received an industrially manufactured all-in-one formulation delivered via a central venous catheter. Patients received a mean daily amount of energy and protein of 861.7 kcal (SD = 462.8) and 37 g (SD = 20.5), respectively. There was a significant relationship between type of admission clinic and route of nutrition support (x2 = 59.5, p<0.0001): medical patients were more likely to receive EN, in contrast to surgical patients, who were more likely to receive PN. No significant differences between route of nutrition support and timing of initiation of nutrition support were found (F = 0.638, p = 0.530). There was a significant mean difference in route of nutrition support and delivered daily energy (F = 6.758, p = 0.002) and protein (F = 13.201, p<0.0001). Implications: Several nutrition support practices that may be improved were identified, specifically the high rate of PN use and the inadequate delivery of energy and protein. The results of this study provide a rationale for quality improvement of nutrition support delivery in this ICU.
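The one-way ANOVA used above compares mean daily energy delivery across the three routes of nutrition support. A minimal sketch of the F statistic (between-group mean square over within-group mean square), using hypothetical energy values rather than the study's data:

```python
def one_way_f(groups):
    # One-way ANOVA F statistic: variance of group means around the grand mean
    # (between-group) relative to the pooled variance within groups.
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((v - m) ** 2 for g, m in zip(groups, means) for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical mean daily energy (kcal) by route: EN, PN, and EN+PN (not the study's data).
en, pn, both = [800, 900, 850], [1000, 1100, 1050], [600, 700, 650]
print(one_way_f([en, pn, both]))  # → 48.0
```

When F is significant, a post hoc procedure such as Tukey's HSD (as in the study) identifies which pairs of routes differ.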


The effect of a college wellness program on key clinical outcomes

Jane Ziegler, 2007

Objective: To measure the impact of the HealthyU program on health behaviors, anthropometric changes and blood pressure changes in HealthyU participants compared to a non-participatory control group. Design: A prospective outcomes research study evaluating a wellness program by collecting data with a longitudinal survey on personal/demographic, behavioral and environmental characteristics/factors and examining how they were associated with clinical outcomes over a 12-week period during the first semester of the freshman year. The main clinical outcomes examined were changes in weight, BMI, waist and hip circumferences, waist-to-hip ratio, percentage of body fat, pounds of body fat, pounds of lean body mass, blood pressure, expended exercise calories and intake of fruits and vegetables. Subjects/Setting: A total of 73 female participants, with 35 in the HealthyU group and 38 in the non-participatory control group. All participants were freshmen, 18 years of age, attending Cedar Crest College in Allentown, Pennsylvania. Statistical analyses: Results were compared using descriptive statistics for age, and Pearson's chi-square or Fisher's Exact Test was used for ethnicity, smoking status, and presence of psychological issues. Paired t-tests were used to assess differences in clinical outcomes within groups. One-way ANOVA assessed differences in blood pressure between groups, and one-way ANCOVA assessed differences between groups for the other clinical outcomes. Point-biserial correlations were used to assess the relationships between weight change at 12 weeks and the personal, environmental and behavioral constructs. Results: From baseline to 12 weeks there were no significant differences between groups for the clinical outcomes of weight (p=.355), BMI (p=.459), waist-to-hip ratio (p=.397), percentage of body fat (p=.109), pounds of body fat (p=.259), and pounds of lean body mass (p=.695).
There were no significant differences between groups at 12 weeks for systolic blood pressure (p=.460) or diastolic blood pressure (p=.087). In the HealthyU group there was a significant increase at 12 weeks in expended exercise calories (p=.04) and in intake of fruit and vegetable servings (p<.0005). Applications/conclusions: No significant difference was seen in the clinical outcomes between the HealthyU group and the non-participatory control group; however, the HealthyU group did have a significant increase in expended exercise calories and in servings of fruits and vegetables. None of the health behaviors emphasized by the HealthyU program made a significant difference in the clinical outcomes, but the rate of weight gain was slower than what had been found in similar studies. Future research should focus on wellness behaviors that influence healthy lifestyle changes in the college-aged population.


The Relationship Between Orofacial Pain Diagnosis and Nutrition Status: A Pilot Study

Julie Morgan, 2004

Objectives: To determine the relationship between orofacial pain (OFP) diagnoses, self-reported pain/difficulty with biting, chewing and mouth opening, and nutrition status. Methods: The population consisted of a convenience sample of 80 adult patients seen in the New Jersey Dental School (NJDS) Orofacial Pain Clinic between June 1, 2005 and December 31, 2006 who underwent nutrition screens. Weight was measured and Visual Analog Scales (VAS) were used to assess perceived pain/difficulty with biting, chewing and mouth opening. Data entry and statistical analysis were completed using SPSS version 13. The a priori alpha level was set at p<.05. Results: The majority of subjects were women (n=73, 91.3%). OFP diagnoses were grouped into two sub-categories: musculoskeletal OFP (MP) (n=63, 78.8%), which included myofascial and temporomandibular joint pain; and neuropathic pain (NP) (n=17, 21.3%), which included trigeminal neuralgia and burning mouth syndrome. Point-biserial correlations revealed a significant correlation between OFP diagnoses and weight change over the previous 12 months (r=.252, p=.026). Subjects with MP (mean=+4, SD=9.58) were significantly more likely than subjects with NP (mean=-2.19, SD=10.34) to gain weight over the previous 12 months. These results were further supported by independent samples t-tests (t=2.266, p=.026). No significant correlations were found between OFP diagnoses and BMI or weight change over the previous six months. A higher percentage of subjects with NP were overweight/obese (63.7%, n=11) compared to those with MP (35.3%, n=26). Point-biserial correlations revealed significant moderate correlations between OFP diagnoses and VAS scores for biting (r=.419, p<.0005) and mouth opening (r=.466, p<.0005), but not between OFP diagnosis and VAS scores for chewing (r=.206, p=.069).
Subjects with MP had significantly greater VAS scores, when analyzed using independent samples t-tests, for biting (t=4.052, p<.0001, mean MP=5.01, mean NP=1.53) and mouth opening (t=4.646, p<.0001, mean MP=4.94, mean NP=.76), but not for chewing (t=1.843, p=.069, mean MP=4.6, mean NP=2.88). Conclusions: These results suggest that adult subjects with MP have significantly greater pain and/or difficulty with biting and mouth opening, are less likely to be overweight or obese, but gain more weight over the previous 12 months than do subjects with NP. This pilot study supports the need for further research using a nutrition assessment tool and a larger sample size to explore relationships between OFP diagnoses and nutrition status, degree of pain experienced, or nutrient intake.
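The point-biserial correlations reported above are ordinary Pearson correlations in which one variable is a dichotomous group code (here, MP vs. NP diagnosis). A minimal sketch, using hypothetical VAS scores rather than the study's data:

```python
import math

def point_biserial(group, scores):
    # Point-biserial r: (mean difference between the two groups / overall SD)
    # scaled by sqrt(p*q), where p and q are the group proportions.
    # Algebraically identical to Pearson r with a 0/1 group variable.
    n = len(scores)
    g1 = [s for g, s in zip(group, scores) if g == 1]
    g0 = [s for g, s in zip(group, scores) if g == 0]
    mean_all = sum(scores) / n
    sd = math.sqrt(sum((s - mean_all) ** 2 for s in scores) / n)  # population SD
    p, q = len(g1) / n, len(g0) / n
    return (sum(g1) / len(g1) - sum(g0) / len(g0)) / sd * math.sqrt(p * q)

# Hypothetical VAS scores coded by diagnosis group (1 = MP, 0 = NP); not the study's data.
diagnosis = [1, 1, 0, 0]
vas = [5, 7, 1, 3]
print(round(point_biserial(diagnosis, vas), 4))  # → 0.8944
```

Because the two formulations are equivalent, the same significance test as for Pearson r applies.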


Is Oral Disadvantage Predictive of Nutrition Risk?

Diane Rigassio-Radler, 2004

Oral and systemic diseases can affect dietary intake, and poor nutrition can impact oral integrity. The goal of this research was to explore factors predictive of nutrition risk, with the objective of identifying constructs to use in a nutrition risk screening tool for a dental clinic population. Methods: Data were collected by chart review for health behaviors of tobacco use, frequency of sugar consumption, and physical or financial limitations to optimal dietary intake; physical examination of oral variables including number of decayed and missing teeth, natural occluding tooth surfaces, prostheses, and soft tissue lesions; interview to ascertain presence of chronic diseases (including hypertension, diabetes, and immunosuppressive disorders), 24-hour recall, and weight history; and measurement of height and weight for calculation of body mass index and percent weight change. The outcome variable, 'nutrition risk', was determined by the expert clinical opinion of two registered dietitians. A convenience sample of 241 English-speaking adults scheduled for oral health care at the dental school clinic was obtained. Descriptive statistics, Pearson's product moment correlation, and logistic regression analyses were done with SPSS 11.5 and SAS 8.1, with a priori alpha set at 0.05. Results: Mean age was 49.02 (SD=16.05) years; 53.66% (n=132) were female. Of 23 variables considered, 12 were predictive of nutrition risk: immunosuppressive disorder, diabetes, dysphagia, soft tissue lesions, difficulty shopping for or preparing food, financial limitation in purchasing food, new caries, high sugar intake, unintentional weight change, history of eating disorder, difficulty with mastication, and number of natural teeth. The regression model correctly classified 87.55% (n=211) of the subjects.
The model based on regression was converted to a proposed screening tool concept for clinical practice that correctly classified 84.23% and had a false positive rate of 14.01% and a false negative rate of 26.47%. Additional research is warranted to test the reliability and validity of a screening tool created from these constructs in adult dental clinic populations. This research was supported by an American Dietetic Association Foundation fellowship.
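The screening-tool error rates quoted above (correct classification, false-positive rate, false-negative rate) follow directly from a confusion matrix of tool results against the dietitians' expert judgment. A minimal sketch with hypothetical counts, not the study's data:

```python
def screening_metrics(tp, fp, tn, fn):
    # Standard screening metrics from a confusion matrix:
    # tp/fn = at-risk subjects the tool caught/missed;
    # tn/fp = not-at-risk subjects classified correctly/incorrectly.
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,               # overall correct classification
        "false_positive_rate": fp / (fp + tn),       # not-at-risk flagged as at-risk
        "false_negative_rate": fn / (fn + tp),       # at-risk subjects missed
    }

# Hypothetical counts (not the study's data):
# 40 true positives, 10 false positives, 80 true negatives, 20 false negatives.
m = screening_metrics(tp=40, fp=10, tn=80, fn=20)
print(round(m["accuracy"], 3),
      round(m["false_positive_rate"], 3),
      round(m["false_negative_rate"], 3))  # → 0.8 0.111 0.333
```

For a screening tool, the false-negative rate is usually the critical figure, since a missed at-risk patient receives no referral.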


A Comparison of Parent Reported Feeding-Related Habits of Children Diagnosed with Autistic Spectrum Disorder and Children who are Typically Developing

Sarah Malliet-Moyna, 2005

Objective: The pediatric nutritionist is responsible for assessing the adequacy of the child's diet, parental knowledge of nutrition, and cultural and environmental influences that affect the child's nutrition status. This study aimed to identify the eating-related habits and examine the differences between the eating-related habits of children with Autistic Spectrum Disorders (ASD) and typically developing children to increase awareness and knowledge of the impact this diagnosis may have on daily eating and mealtimes. Design: A survey questionnaire administered to a convenience sample of parents of children diagnosed with autistic spectrum disorders and typically developing children. The sample of children diagnosed with autistic spectrum disorder came from two sites, Jawonio Inc. and PRIMETIME for KIDS. The parents of typically developing children came from a private pediatric practice, Hackensack Pediatrics. Statistical Analysis: Descriptive statistics of demographics, early health and eating habits, and current health and eating habits including problem eating habits were analyzed using SPSS (Version 12) software in the ASD group. The typically developing group was eliminated because of a low response rate. A secondary analysis of the problem eating habits in the ASD population was performed comparing this autistic sample to a historical control group of typical children using questions of the same wording and content. Z scores were calculated with a priori alpha of 0.10, with z scores of > 1.66 indicating significance. Results: The survey of the typically developing population yielded a 2.5% (n=5) response rate. This population was eliminated from further analysis. In the ASD population the survey yielded a 54.16% (n=26) usable response rate. A majority were male and 19.2% (n=5) were female. The mean age was 4.44 (SD 2.39) years. The majority (96.1%) were diagnosed with Autism and 12% were diagnosed with PDD-NOS. 
Mean age of diagnosis was 28.84 (SD 9.74) months. BMI for nutritional status was calculated using the parent-reported measures. Both height and weight were provided for 16 of 26 participants. One quarter were considered acceptable weight, 12.5% were at risk for underweight, 50% were considered overweight, and 12.5% were considered at risk for overweight. Of the early health and eating habits in the first six months of life, more than one quarter had constipation and just under 20% had diarrhea. An overwhelming majority received breast milk via bottle or breast. Of the 25 who responded to difficulties in achieving feeding skills in the first year of life, the highest prevalence reported was for difficulty transitioning to solids and difficulty drinking from an open cup with assistance. For current health and eating habits, an overwhelming majority of children had fairly regular snack/mealtimes, ate at the table, self-fed, and used utensils. Over one quarter of the parents/caregivers indicated their child had food allergies, over one third currently had constipation, and close to one quarter currently had diarrhea. Over one quarter had tried alternative nutritional therapy, with elimination diets being the most common therapy tried. One quarter had tried the gluten free/casein free diet. Over one quarter were eliminating foods/food groups, with the majority of eliminated foods being dairy, gluten, and sugar/starches. Gastrointestinal reasons were the predominant reasons given for eliminating foods. A majority of the children had an overall restrictive intake. A majority of the children were reported to demand foods, with the primary foods being yogurt, chicken nuggets, fries, juice and fruit. Regarding parental perception of the child's weight status, a majority reported their children were adequate weight, with less than 20% perceived as underweight or overweight. 
Examining parental concerns related to their child's eating, a majority were concerned about their child's diet and would change their eating habits if they could, and more than half found feeding their child stressful. For problem eating habits, picky eating, eating only a few foods, unwillingness to try new foods, insistence on the same foods, ritual/routine at the table, food cravings for carbohydrates, and pica were the most common current problem eating habits. Texture and looks were the primary factors influencing pickiness. When results for this study's ASD group were compared to the historical control group of typical children for the following current problem eating habits: insistence on ritual/routine at the table, unwillingness to try new foods, and eating non-food items, the groups were significantly different (p < 0.10). Looking at the characteristics of those who reported having tried alternative nutritional therapy, the majority had tried the gluten free/casein free diet, were eliminating foods, and reported a past or current history of constipation and/or diarrhea. Comparing BMI for nutritional status with parental perception of weight status, 44% of parental perceptions of weight status correctly matched the BMI for nutritional status, including over one quarter of the children who were considered overweight. Only three parents did not have any concerns about their child's diet, and two of these children were among the only three diagnosed with PDD-NOS. Conclusions: This study provided preliminary data for the pediatric dietitian regarding the eating-related habits of children with Autistic Spectrum Disorder including problem eating habits, gastrointestinal problems, use of alternative nutritional therapies, BMI for nutritional status, and parental concerns regarding eating. 
Efforts should be focused on obtaining accurate anthropometrics, exploring the use of alternative nutritional therapy, and understanding the significant stress that feeding and eating-related habits can cause in the daily lives of these families before providing nutrition suggestions, recommendations, and dietary changes. Recommendations must be based on sound science and evidence-based medicine so they support growth and development and are realistic and attainable.
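The comparison to the historical control group described above is a two-sample z test of proportions. A minimal sketch of that test follows; the counts shown are hypothetical and do not come from this study.

```python
from math import sqrt

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical counts: 20 of 26 in the study sample vs 30 of 100 controls
z = two_proportion_z(20, 26, 30, 100)
print(abs(z) > 1.66)  # significant at the study's z > 1.66 threshold: True
```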


Nutrition Risk and Perceived Health Status in the Elderly

Mitsoo Nanavaty, 1996

The healthcare revolution focuses on minimizing costs of care while improving outcomes in high risk populations. Prevention and anticipation of health problems in the elderly become an issue of primary importance. Poor nutritional status leads to poor health outcomes, increased cost of care and longer lengths of hospitalization. In the high risk population, functional ability, physical and mental health, social support and economic factors have been identified as having an impact on nutritional health. As part of a senior citizen survey study assessing alcohol and tobacco use in the elderly, the purpose of this study was to determine the relationship between nutrition risk and self-perceived social support, mental health and physical function in the community dwelling elderly. A mail survey was conducted on a statewide random sample of NJ elders (age >65). General health, physical and social function and mental health were evaluated by the 36-item health survey (RAND SF-36), and social support was evaluated using the MOS Social Support Survey. Nutrition risk was assessed using the Nutrition Screening Initiative Checklist. Data were analyzed using JMP statistical software (SAS Institute, Cary, NC). Bivariate analysis was done using the Wilcoxon/Kruskal-Wallis test. Results showed a significant relationship between all the variables studied and nutrition risk (p<.0001). Regression analyses identified general health (p<.0001), social support (p<.0001) and physical function (p=.01) as most strongly associated with nutrition risk. We hope that this will help guide future efforts in the development and implementation of intervention protocols in this high risk population.


Delivery of the Enteral Nutrition Prescription and Incidence of Feeding Intolerance in Critically Ill Patients on Volume-based Enteral Nutrition

Susan Roberts, 2014

Background: Enteral nutrition (EN) delivery to patients in the intensive care unit (ICU) is frequently below the amount prescribed. Volume-based enteral nutrition (VBEN) is one strategy to minimize underfeeding. Objective: The purpose of this research was to determine if VBEN led to EN delivery that met the EN target (≥ 65% of the prescribed amount) for the first week in the ICU and was well-tolerated in ICU patients. Also, differences in EN-related outcomes were evaluated between those who met the EN target and those who did not. Design/participants/setting: A retrospective chart review of adult mechanically ventilated (MV) ICU patients (N = 117) who required an ICU stay of ≥ 3 days, MV for ≥ 2 days, and received VBEN for ≥ 3 days during the first week in the ICU at Baylor University Medical Center. Statistical analyses: Descriptive statistics were used to describe demographic, clinical and enteral feeding regimen characteristics. The Mann-Whitney test was used to analyze for differences in EN-related outcomes for those who did and did not meet the EN target. Results: Patients received a mean of 67.0 ± 0.2% of their EN target and 61.5% (n = 72) of patients met the EN target of receiving ≥ 65%. The percent of ICU days with hyperglycemia was similar for those who met the EN target versus those who did not (21.7 ± 27.3% versus 21.0 ± 27.7%, p = 0.857). Only two patients in the group that did not meet the EN target had an elevated gastric residual volume. Conclusions: The study results indicate VBEN is an effective strategy for delivery of an EN target of ≥ 65% of the amount prescribed and is well-tolerated in ICU patients during the first week in the ICU. Randomized controlled trials are needed to confirm these findings.


Nutrition Related Information in Primetime Television Commercials

Daria Mintz, 2002

As the prevalence of overweight and obesity increases (1-3), there is an urgent need to develop novel approaches to treatment and prevention. Fifty to 60% of adults consider television (TV) to be their primary source of entertainment (4). The average household's TV is in use for more than seven hours per day (5). More hours of TV viewing are positively correlated with increased body weight (6-8). Health care professionals are calling for actions that promote increased awareness of TV's impact on health, especially that of children and adolescents, and changes in TV content to promote their well-being. In their editorial "Caloric Imbalance and Public Health Policy," Koplan and Dietz stated "the time has come to develop a national comprehensive obesity prevention strategy that incorporates educational, behavioral, and environmental components analogous to those already in place for tobacco use" (9). The American Academy of Pediatrics (AAP) (10) recognizes TV's impact on children and adolescents with regard to issues concerning violence, sexual activity, illicit drugs, alcohol, body concept and self-image, nutrition, dieting, and obesity. The AAP encourages its members to work with parents, schools, government, and the entertainment industry to improve TV content, to raise awareness of TV's potential harm to children, and to encourage TV watching habits that reduce health risks. The AAP (10) also supports the Children's Television Act of 1990, which designates three hours of educational and informational broadcasting for children per week and restricts commercial advertising time during children's programming. A primary prevention approach to the problem of childhood and adult obesity requires that those in a position to educate and influence families about lifestyle habits such as TV watching are aware of the programming and commercial content that influence consumers' nutrition beliefs and food choices. 
The aims of this study are to compare the nutrition information in TV commercials viewed by children with that viewed by adults, and to contribute to the body of knowledge regarding nutrition-related information broadcast in TV commercials.


Who are the RDs of the Diabetes Care and Education Dietetic Practice Group and What Functions are they Performing? Results from the 2002-2003 Membership Survey

Dana Mercadante Green, 2004

Results of the Diabetes Care and Education (DCE) dietetic practice group's 2002-2003 membership survey were used for this study. The purpose of this study was to examine current professional practices of RDs in the DCE dietetic practice group (DPG), analyzed by level of practice (entry-level, specialty practice, or advanced level) and compared to demographic characteristics. Separate chi-square and Fisher's exact tests were performed to determine if educational background, professional expertise, place of employment and patient population were related to practice level. Sixteen functions performed by 1,232 RD members were categorized into level of practice according to standards of practice and analyzed to determine at what level of practice RDs in the DCE DPG are functioning. Levels of practice were also compared to demographic characteristics. The results of the DCE membership survey indicated that 61 (5%) RDs were functioning as entry-level practitioners, 851 (69.1%) as specialty practice level practitioners and 320 (25.1%) as advanced level practitioners. Advanced level practitioners performed significantly more functions than entry-level practitioners: entry-level practitioners performed a mean of 2.4 functions, specialty practice 7.7, and advanced level 11.6. The results of this survey further delineated that RDs function at various levels of practice and that credentials for RDs increased as level of practice advanced. In contrast, there was a small percentage difference in degree level across the three levels of practice, which brings into question whether an advanced degree is needed to perform as an advanced level practitioner. Future research should include additional advanced level functions outlined in the standards of practice of advanced diabetes practitioners and collection of the frequencies of functions performed to further define the practice patterns of RDs at different levels.


An Exploration of the Quality of Life and The Meaning of Food in Adults Living With Home Total Parenteral Nutrition

Marian Winkler, 2008

Home total parenteral nutrition (TPN) provides life sustaining intravenous nutrition therapy for individuals who have impaired gastrointestinal function because of intestinal failure. The purpose of this study was to define quality of life (QOL), describe how home TPN influences QOL, and explore the meaning of food and eating from the perspective of home TPN dependent adults. This qualitative study, using content and interpretative phenomenological analysis, included data from semi-structured telephone interviews with 24 self-selected adults receiving home TPN because of short bowel syndrome or pseudo-obstruction. Participants were stratified by length of time on home TPN into four subgroups: less than 2 years, 2 to 5 years, 5 to 10 years, and greater than 10 years. Models of health-related QOL, technology dependency and QOL, and factors influencing experiences of food and food intake in heart failure served as conceptual frameworks for this study. Analysis was also conducted using a chronic illness framework. The overarching theme that emerged from the data was the meaning of home TPN as a lifeline and nutritional safety net. There were five subthemes: definition of QOL, benefits of home TPN outweigh burdens of technology, the meaning of food, achieving normalcy in life, and discrepancies between expectations and reality. Quality of life was defined by participants as how much one enjoys life; being happy, satisfied or content with life; and being able to do what you want to do when you want to do it. Underlying disease state, magnitude of diarrhea or ostomy output, and length of time on home TPN influenced how QOL was perceived. Lifestyle adaptation was influenced by how structured or flexible an individual was with the TPN infusion schedule. Eating for survival, health benefits, and socialization emerged as three dimensions illustrating how food and eating influenced QOL. Participants expressed a strong desire to achieve normalcy in life. 
The challenge for health care professionals is to balance provision of the information necessary to facilitate a person's adaptation to technology and home TPN dependency while promoting hope and optimism for the future.


Impact of Evidence-Based Practice Web-Based Modules for Members of the Dietitians in Nutrition Support and Renal Dietitians Dietetic Practice Groups on their Perceptions, Attitudes and Knowledge of Evidence-Based Practice

Anna Parker, 2007

Objective: The objective of this study was to assess changes in evidence-based practice (EBP) perceptions, attitudes and knowledge after completion of five web-based course modules. In addition, the relationship between these changes and demographic characteristics was evaluated among registered dietitians (RDs) who are members of the Dietitians in Nutrition Support (DNS) and/or Renal Dietitians (RPG) Dietetic Practice Groups of the American Dietetic Association (ADA) in the United States. Design: This descriptive study used the Dietetics Evidence-Based Practice Questionnaire. Subjects/setting: Of the 1006 potential participants, 374 (37%) completed the pre-web-based training questionnaire and 155 (15.4%) completed the post-web-based training questionnaire. Statistical analyses performed: Descriptive statistics, paired t-tests, one-way ANOVA and correlation coefficients were used to evaluate the data. Paired t-tests were performed to identify changes in perceptions, attitudes and knowledge after completion of the five web-based course modules on EBP. One-way ANOVA was conducted to examine the relationship between these changes and demographic characteristics. The Pearson Product Moment Correlation Coefficient was used to determine if there was a relationship between the perception/attitude score and the knowledge score. A priori alpha of 0.05 was considered statistically significant. Results: The mean perception/attitude score increased significantly (t = -7.008, p < 0.0005) from pre-web-based training (39.5 ± 4.0) to post-web-based training (41.8 ± 3.9). The mean knowledge score increased significantly (t = -18.2, p < 0.0005) from pre-web-based training (29.4 ± 9.2) to post-web-based training (41.5 ± 6.5). 
Using one-way ANOVA, level of education (F = 3.99, p = 0.01), specialty certification (F = 5.54, p = 0.02), type of specialty certification (F = 3.87, p = 0.02), and prior evidence analysis training (F = 10.53, p = 0.001) were significantly related to knowledge scores. Conclusions: The results of the study indicated that after completion of the web-based modules, perception/attitude and knowledge scores for EBP increased significantly. The majority of participants agreed that practicing an evidence-based approach improves patient care, and they were interested in using EBP in the care of patients. Level of education, specialty certification, type of specialty certification and prior evidence analysis training were associated with higher knowledge scores. This study identified a need to provide education on EBP principles to integrate research evidence into clinical practice.


Screening for Nutritional Risk in Adult Patients Admitted to an Urban Acute Care Teaching Hospital

Tiash Sinha, 2007

Objective: To identify compliance with the screening process and the level of agreement on presenting symptoms between the documentation of nursing and physicians upon admission to an urban acute care teaching hospital in Massachusetts (MA). Design: Retrospective design using a chart review of medical records of adult patients. Subjects/Setting: Three hundred and ninety patient records were included from the inpatient care units of the hospital. Main outcome measures: The level of agreement between the same presenting symptoms documented by nursing and the physician related to nutritional risk of patients admitted to the hospital, as documented on the nutritional/metabolic component of the Nursing Assessment Form (NAF) and in the physician admitting note. Statistical analyses performed: All analyses were performed using SPSS. Separate Z tests of a single proportion were performed for each presenting symptom, with p-values obtained from the Z table. Results: Of the 390 medical records, 358 (91.8%) had the NAF fully completed. Thirty-two (8.2%) either did not have a NAF or the NAF was blank. Of the 358 medical records that had a completed NAF, 337 (94%) had a completed nutritional/metabolic component and twenty-one (6%) had a blank or incomplete nutritional/metabolic section of the NAF. The top three presenting symptoms documented by both physicians and nursing that were in agreement were nausea/vomiting/poor oral intake in 61% (n=41) of patients, liver disease in 80% (n=4) and diarrhea in 71% (n=20); those not in agreement were weight loss (45%, n=15), coronary artery disease (39%, n=26), and diabetes (39%, n=27). Conclusion: The physicians were more likely to document weight loss, coronary artery disease, congestive heart failure, diabetes and renal disease, while RNs were more likely to document gastrointestinal symptoms and poor appetite. 
Application: Training sessions to help physicians and nurses understand nutritional risk factors might help capture more patients at high nutritional risk.


Comparison of Two Standardized Nutrition Terminologies

James Brewer, 2008

Objective: The objective of this pilot study was to identify the most frequently utilized Nutrition Injury-Specific Nutrition Diagnostic Codes (NI-SNDCs) by RDs over a one-year period. The level of agreement between the top seven NI-SNDCs and the ADA diagnostic codes and defining characteristics chosen from the same nutrition assessment data was assessed. Design: This study was a retrospective descriptive comparative study of the top seven utilized NI-SNDCs found within comprehensive nutrition assessments over a one-year period. Nutrition assessment data from those comprehensive nutrition assessment notes were used to select ADA diagnostic terms and defining characteristics. Subjects/participants/setting: Comprehensive nutrition assessments were selected from all RD notes entitled Nutrition Assessment over a one-year period, May 2006 to May 2007. The qualifying components of a comprehensive nutrition assessment include diet and diet history; height, weight and weight history; lab data; medical diagnosis; past medical history; and prescriptive drug data. The top seven NI-SNDCs were required to have occurred more than 21 times during this one-year period, totaling 210 charts used for this study. Statistical analyses performed: The level of agreement between the NI-SNDCs and defining characteristics and the ADA's diagnostic terms and defining characteristics was analyzed using frequency distributions (n and %). The SPSS 15.0 crosstab feature was used to analyze the % agreement between each nutrition diagnostic term and defining characteristic. Results: The % agreement between the NI-SNDCs (n=210) and the ADA diagnostic terms was <50%. The % agreement between the NI-SNDC defining characteristics and the ADA defining characteristics was 36%. There was a low level of agreement between these two standardized nutrition languages. Overall, RDs using both standardized languages focused on different nutrition problems from the same assessment data. 
Application/conclusions: The patterns of nutrition diagnoses and defining characteristics from both of these standardized nutrition languages may be useful in future refinements of one standardized language. The discovery of weak agreement from this pilot study may inspire future research to delineate why the RDs focused on different nutrition problems from the same nutrition assessment data.


The Effect of Diabetic Ketoacidosis Insulin Management on Time Required to Achieve Tolerance to Oral Intake and Cumulative Length of Stay Among Pediatric Patients with Type 1 Diabetes Mellitus During Acute Care Hospitalization

William Palumbo, 2007

Objective: To determine whether type of treatment modality, multiple daily injections (MDI) versus continuous subcutaneous insulin infusion (CSII) using insulin pump therapy, was more effective in time required to achieve tolerance to oral intake and in overall reduction of the cumulative length of stay (LOS) among pediatric patients aged <21 years after resolution of diabetic ketoacidosis (DKA), when adjusted for patient demographic characteristics, treatment characteristics and presence of infection. Design: A retrospective chart review. Eighty-three medical records were reviewed; 51 from the MDI treatment modality and 32 from the CSII treatment modality. Setting/Subjects: The study was conducted at an urban community-based teaching hospital, limited to pediatric patients with type 1 diabetes mellitus (T1DM) hospitalized with DKA from January 1, 2003 to March 31, 2006 who were managed either with MDI or CSII upon resolution of DKA. Statistical analyses performed: Results were analyzed using independent t-tests, Chi square and stepwise multiple regression analysis. All data were analyzed using the Statistical Package for the Social Sciences, SPSS (version 13.0). Statistical significance was set with a priori alpha level of 0.05. Results: Patients receiving CSII were significantly older (174.03 ± 54.5 months) than patients receiving MDI (150.25 ± 61.6 months) (t=-3.044, p=.003). Patients receiving CSII (n=31) had a significantly longer duration of time diagnosed with T1DM (74.45 ± 51.6 months) than patients receiving MDI (n=51) (24.45 ± 30.6 months) (t=-5.508, p<.001). Gender representation within the two modalities was equivalent, as 33 (64.7%) females and 18 (35.3%) males received MDI treatment and 21 (65.6%) females and 11 (34.4%) males received CSII treatment (χ²=.007, p=.932). Conclusion: The study suggests the importance of adhering to an established DKA protocol in effectively treating new onset and previously managed patients with T1DM and DKA. 
Patients receiving MDI treatment took significantly longer to attain DKA resolution and had a significantly longer length of stay after DKA resolved than patients receiving CSII. No significant difference in time required to achieve tolerance to oral intake was found in patients receiving MDI treatment versus CSII treatment. The number of patients receiving sliding scale insulin therapy after discontinuation of the DKA protocol was significantly greater in patients receiving MDI therapy compared to patients receiving CSII.


Vitamin B6 Status During Restoration of Nutritional Autonomy in Adults Post Intestinal Transplantation

Laura Matarese, 2007

Intestinal transplantation is indicated to restore nutritional autonomy in patients with irreversible intestinal failure. With successful implantation and subsequent adaptation of the transplanted graft, establishment of nutritional autonomy and long-term rehabilitation is achievable with early discontinuation of parenteral nutrition (PN) and resumption of an unrestricted oral diet. Intestinal transplant recipients are able to eat normally, but the degree to which specific micronutrients are absorbed and processed and the incidence of micronutrient deficiencies is unknown. The purpose of this study was to determine the incidence, severity, timing of occurrence, and factors associated with vitamin B6 deficiency (as defined by serum vitamin B6 levels) in adults undergoing intestinal transplantation. Serum pyridoxal 5'-phosphate (the active form of vitamin B6) and total homocysteine levels were measured prospectively over the course of 18 months in a cohort of 29 adult intestinal and multivisceral recipients at defined time points throughout the perioperative phase. For continuous variables (age of donor, age of recipient, cold ischemia time, timing of nutrition interventions, vitamin B6 and homocysteine levels) mean, median and standard deviation were calculated. The incidence, severity and timing of occurrence of vitamin B6 deficiency were determined using frequency distributions. Factors associated with the development of vitamin B6 deficiency were analyzed using independent samples t tests or Fisher's exact tests. Levene's Test for Equality of Variances was computed prior to the independent samples t tests to check for homogeneity of variances. Differences in outcome variables (B6 levels before and after TPN was discontinued, B6 levels before and after achievement of nutritional autonomy) were evaluated using independent samples t-tests. The strength of the relationship between continuous variables (B6 and homocysteine) was evaluated using the Pearson correlation coefficient. 
The ability of the UPMC supplementation protocol to correct the B6 levels was evaluated using paired sample t-tests. Statistical significance was set with a priori alpha of 0.05. As a group, 93.3% (n=27) of the population exhibited serum vitamin B6 deficiency at some time during the perioperative period. At postoperative week 4, the incidence of B6 deficiency (<3.3 ng/ml) was 55.2% (n=16), with severe deficiency (<2.5 ng/ml) at 48.3% (n=14). The deficiency occurred early in the perioperative phase, within 31 days of transplantation, and within one week of discontinuation of parenteral nutrition. The deficiency was unrelated to age, gender, cold ischemia time, type of allograft received or homocysteine levels. Vitamin B6 repletion initiated with a single intravenous dose followed by a daily oral dose (50 mg) resulted in full restoration of serum vitamin B6 levels for all patients within one month (p<0.05, paired sample t-test). Vitamin B6 deficiency is common in patients after intestinal and multivisceral transplantation despite clinical achievement of nutritional autonomy. This novel discovery will impact clinical practice. Accordingly, vitamin B6 levels should be monitored, and deficiency should be suspected and promptly corrected in patients with unexplained musculoskeletal and neurological symptoms.


The impact of an individualized meal plan on diabetes management outcomes at a DSME Center

Karen Horan, 2006

This retrospective study evaluated the impact of an individualized meal plan (IMP) approach on patient outcomes at one accredited Diabetes Self-Management Education (DSME) program. Medical record data were collected from 90 patients with type 2 diabetes mellitus seen at an outpatient diabetes center with a BMI ≥25 kg/m2 and who were controlled by diet, exercise, and/or oral agents. Patients received either nutrition education including an IMP or nutrition education only (w/o IMP) in the DSME program. Time for education was similar for both groups. Clinical outcomes (HbA1c levels, weights, and BMIs), scores of self-evaluated behavior change goals, diabetes medication usage, and carbohydrate (CHO) counting knowledge assessment were measured at baseline and completion of the six to nine month DSME program. SPSS was used for data analysis with alpha set at 0.05. Clinical outcomes were analyzed using a 2×2 factorial analysis of variance. Self-evaluated behavior change goals were analyzed using Chi square and Wilcoxon rank sum tests. CHO counting assessment and diabetes medication usage were analyzed using Chi square. No statistically significant differences were found between groups for change in HbA1c levels, weights, BMIs, self-evaluated behavior change goals, diabetes medication usage, or CHO counting assessment. Combining the two groups, the DSME program produced a 1.2% decrease in HbA1c (p<0.0001), a 5.0 lb weight loss (p<0.0001) and a 0.75 kg/m2 decrease in BMI (p<0.0001). Diabetes medication usage decreased by 13% in the combined group (p<0.0001). Results support that DSME programs are effective in improving select clinical outcomes regardless of method of nutrition education.


The Impact of Medical Nutrition Therapy by a Registered Dietitian on Clinical and Patient Oriented Outcomes in Cancer Patients

Maureen Huhmann, 2008

Health related quality of life (HRQOL) has become important in measuring the impact of disease and treatment on daily functioning and perception of well-being. A relationship has been established between weight loss and HRQOL in oncology patients, but there are few data relating body composition to HRQOL. The aims of this study were to examine the interaction between body composition, nutrition status, and HRQOL and to examine the effect of medical nutrition therapy (MNT) provided by a registered dietitian on these outcomes. Due to limited sample size, the analyses are limited to descriptions of eight subjects enrolled in a randomized controlled trial of MNT in individuals receiving chemotherapy for pancreatic, lung, and advanced stage prostate cancer. All patients experienced progression of disease while enrolled. The mean body weight of the subjects decreased minimally from baseline to three months (-1.36 ± 2.32 Kg) and continued to decrease in the subsequent three months (-1.38 ± 6.25 Kg). This indicated a slowing of the weight loss reported in the six months prior to study entry (-8.48 ± 2.67 Kg) in all but one subject. Weight loss was reflected primarily as a loss of lean body mass based on BIA analyses. Weight gain in individual subjects was principally a gain of fat mass. HRQOL as measured by the mean Functional Assessment of Cancer Therapy - General (FACT-G) score deteriorated during the first three months of the study (-9.25 ± 10.31 points) and continued to deteriorate in the next three months (-16.67 ± 31.57 points). This deterioration was paralleled by a mean increase in symptom distress score at three months (2.00 ± 3.83) and six months (6.67 ± 18.50). The data presented here highlight the relationship between decreases in weight, nutrition status, and HRQOL in individuals receiving chemotherapy for advanced cancer.


Impact of two pulmonary enteral formulations on nutritional indices and economic outcomes

Deborah Cohen, 2008

Objective: To determine whether there were differences in nutritional indices (enteral formula volume, fat intake, protein intake, energy intake), risk factors (demographics, risk factors, co-morbidities), and economic outcomes (ICU days, ventilator hours) between mechanically ventilated ICU patients with respiratory failure who received a standard pulmonary enteral formulation and those who received a specialized pulmonary enteral formulation enriched with EPA and GLA. Design: Data on 19 subjects were collected from closed medical records (January 2005 through February 2007) and data from 31 subjects were obtained from medical records in the ICU (March 2007 through March 2008). Setting/Subjects: The study was conducted at a community hospital in Southeast Missouri, limited to adult subjects aged 18 years or older with respiratory failure, admitted to the ICU and on mechanical ventilation from January 2005 through March 2008. Statistical analyses performed: Results were analyzed using independent samples t-tests, Chi-square or Fisher's Exact tests, Pearson correlations, one-way analysis of variance, and one-way analysis of covariance. All data were analyzed using the Statistical Package for the Social Sciences, SPSS (version 14.0). Statistical significance was set at an a priori alpha level of 0.05. Results: Subjects who were administered the specialized pulmonary enteral formula received significantly more enteral formula volume (t=2.654, p=0.011), total fat (t=4.533, p<0.0005), and total energy (t=2.463, p=0.017) than the subjects who received the standard pulmonary enteral formula. The subjects on the specialized pulmonary enteral formula received significantly more protein (t=2.654, p=0.011) from a modular protein supplement than those on the standard enteral formula. More subjects on the specialized pulmonary enteral formula received propofol, but the difference was not significant (p=0.382).
When controlling for the effects of enteral formula volume, total fat intake, and total energy intake as confounding variables, there were no significant differences between the enteral formula groups in ICU days (F=0.608, p=0.440) or hours spent on mechanical ventilation (F=0.691, p=0.410). There were strong, positive, significant (p<0.0005) correlations among enteral formula volume, total protein intake, total fat intake, and total energy intake. Moderate correlations were found between ICU days and enteral formula volume (r=0.486, p<0.0005), total protein intake (r=0.391, p=0.005), total fat intake (r=0.383, p=0.006), total energy intake (r=0.405, p=0.004), and GI intolerance (r=0.465, p=0.001). Moderate correlations were also found between ventilator hours and enteral formula volume (r=0.453, p=0.001), total protein intake (r=0.417, p=0.003), and total energy intake (r=0.383, p=0.006). Conclusion: Use of a specialized pulmonary enteral formulation did not result in any statistically significant effects on economic outcomes. The subjects who were on the specialized formula did receive more enteral formula volume and more total calories, which may be clinically important. Further clinical studies are needed to address the impact a specialized pulmonary enteral formula has on ICU stay and ventilator hours.
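The Pearson correlations reported above quantify the linear association between intake variables and outcomes. A self-contained sketch of the coefficient itself (the volumes and ICU days below are invented for illustration, not study data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient:
    covariance of x and y scaled by their standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: daily enteral formula volume (mL) vs ICU days
volume   = [900, 1200, 1500, 1100, 1600, 1300]
icu_days = [4, 6, 9, 5, 10, 7]
r = pearson_r(volume, icu_days)
```

An r near ±1 indicates a strong linear relationship; the study's "moderate" correlations sit around 0.4-0.5.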


Resting energy expenditure in overweight and obese adults: agreement between indirect calorimetry and predictive formulas.

Jane Ziegler, 2005

Objective. To determine the limits of agreement between resting energy expenditure (REE) measured using the MedGem® and five predictive equations in overweight and obese adults. Design. A retrospective chart review in which REE measured with the MedGem® was compared to five predictive equations commonly used to calculate energy needs. Subjects/setting. A convenience sample of 100 medical records of overweight and obese Caucasian adults enrolled in an ambulatory care weight management program. Main outcome measure. A calculated REE from a predictive equation within the limits of agreement with measured REE. Statistical analyses performed. Data were analyzed using SPSS® version 13.0. Alpha was set at 0.05. Limits of agreement were defined as a bias within ±150 calories and a precision within ±200 calories. Results. Four equations had acceptable bias and precision: the female Harris-Benedict (HB) equation for Class I obesity (bias = +40.23, precision = ±197.6), the female Mifflin-St. Jeor (MSJ) for Class I obesity (bias = -43.05, precision = ±190.12), the female WHO equation for age > 60 years, Class I obesity (bias = -120.72, precision = ±178.90), and the female WHO equation for age > 60 years, Class II obesity (bias = -34.75, precision = ±63.13). Conclusions. This study demonstrated the lack of reliability of predictive equations in estimating caloric requirements in overweight and obese adults compared to indirect calorimetry. Applications. Continued research is necessary to determine an accurate formula to estimate energy needs for use in the overweight and obese population.
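The bias/precision criteria described above can be sketched Bland-Altman style. This is an illustrative Python sketch, not the study's code, assuming "bias" means the mean of (predicted - measured) differences and "precision" means 1.96 times their standard deviation; the REE values are hypothetical:

```python
from statistics import mean, stdev

def agreement(measured, predicted, bias_limit=150, precision_limit=200):
    """Bland-Altman-style agreement check (assumed interpretation):
    bias = mean(predicted - measured); precision = 1.96 * SD of the
    differences. Returns (bias, precision, within study-style limits?)."""
    diffs = [p - m for m, p in zip(measured, predicted)]
    bias = mean(diffs)
    precision = 1.96 * stdev(diffs)
    ok = abs(bias) <= bias_limit and precision <= precision_limit
    return bias, precision, ok

# Hypothetical REE values (kcal/day): indirect calorimetry vs an equation
measured  = [1650, 1720, 1800, 1580, 1900]
predicted = [1700, 1690, 1850, 1620, 1880]
bias, precision, ok = agreement(measured, predicted)
```

With these invented numbers the equation would fall within the cutoffs; with real clinical data, equations often fail the precision bound even when mean bias looks small.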


Group versus individual appointments and their impact on kidney disease progression of type 2 diabetes patients who attend an ambulatory care facility

Norma Gonzales, 2007

Objective: To compare and evaluate the effectiveness of group sessions and individual appointments for diabetes management of adults ≥18 years with type 2 DM by measuring glycemic control, chronic kidney disease progression, and, in a distinct though overlapping population, patient satisfaction. Text: This study was a retrospective outcomes study of 128 ambulatory care charts from patients assigned to either a Shared Medical Appointment (SMA) group (n=65) or an Individual Appointment (IA) group (n=63). The main outcome measures analyzed were glycohemoglobin (HbA1C), creatinine (CR), and estimated glomerular filtration rate (eGFR). Patient satisfaction was measured with a survey in a different but related sample (N=177). Subjects were limited to patients who attended either type of visit for at least three (3) visits over a period of six (6) months between January 2004 and December 2005. Data were analyzed using SPSS v13.0; Chi-square, independent and dependent sample t-tests, and 2×2 split-plot ANOVA were used. From baseline to six (6) months there was no significant difference between groups in the clinical outcomes HbA1C (6.78% ± 0.83 in the SMA group and 6.89% ± 1.12 in the IA group) (p=0.208), CR (1.05 mg ± 0.44 in the SMA group and 1.22 mg ± 0.63 in the IA group) (p=0.084), and eGFR (79.53 ml/min/1.73m2 in the SMA group and 71.85 ml/min/1.73m2 in the IA group) (p=0.104). Patient satisfaction was significantly higher in the SMA group (4.64 ± 0.532) than in the IA group (4.50 ± 0.725) (p=0.002). This study suggests that group sessions are widely accepted by patients and may be as effective as individual appointments in controlling glycemia and kidney disease progression in patients with type 2 DM.


Evaluation of Physical Assessment Knowledge and Skills in Dietetic Internships and Coordinated Programs

Deborah Cohen, 2006

Objective: To determine what is taught regarding physical assessment (PA), how it is taught, and how it is evaluated in US accredited dietetic internships (DI) and coordinated programs (CP) with a general, nutrition therapy, or combined emphasis. Design: 193 surveys were mailed to program directors of dietetic internships and coordinated programs. Participants were asked which PA components were taught and how these components were evaluated in their program. Information was also obtained regarding barriers to teaching and evaluating PA. Statistical analysis: All data were entered directly into the Statistical Package for the Social Sciences (version 13.0). Descriptive statistics (number, percent) were used to analyze all variables collected. Results: 92 surveys (47.7%) were returned. The most frequently reported PA components taught were body composition (66%), skin assessment (53%), and vital signs (50%). RDs were the primary faculty reported to teach these components. The primary method used to teach PA was lecture in a classroom setting (88.3%), followed by observation in the clinical setting (50%). The most common method used for evaluating PA skills was written tests in the classroom setting (47.4%), followed by actual-patient observation by faculty in the clinical setting (42.1%). Barriers to teaching PA included lack of trained faculty, resources, interest among faculty, knowledge among faculty, and time in the curriculum. Conclusions: Most DI and CP programs do teach PA to their students and evaluate it; however, the majority of these programs are not evaluating their students' competence in PA skills.


Reliability of nutrition diagnostic labels when used by registered dietitians at three levels of practice.

Pam Charney, 2006

Objective: To determine the reliability of nutrition diagnostic labels as used by Registered Dietitians (RDs) when diagnosing nutrition problems in select clinical cases. Text: The American Dietetic Association (ADA) developed nutrition diagnostic labels to assist RDs in implementation of the Nutrition Care Process (NCP). The focus of this research was to determine the reliability of nutrition diagnostic labels as used by RDs in diagnosing nutrition problems and to determine reliability across entry-level (0-18 months following registration), beyond-entry-level (3-7 years following registration), and expert RDs (nominated by leaders within ADA). There were 279 RDs who participated (110 entry level, 113 beyond entry level, and 56 experts). A three-group comparison study was conducted using an Internet-based scenario format. Following completion of a continuing education module on nutrition diagnoses, participants selected diagnostic labels from a list of 60 labels for six clinical scenarios, each having two parts. Percent agreement for all participants as well as for each subgroup was determined for every diagnostic label in each scenario. Chi-square analysis was used for each label with greater than 60% agreement to determine whether selection of labels differed across levels of practice. There were 34 nutrition diagnostic labels that reached greater than 60% agreement by all participants. Of these, only one had a significant between-group difference in percent agreement (more experts than entry-level RDs selected physical inactivity; p = 0.005). Complete adoption of the NCP requires use of a reliable terminology. We were able to demonstrate good-to-excellent agreement in selection of nutrition diagnostic labels.
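Percent agreement of the kind reported above is a simple proportion: the share of participants whose selected labels for a scenario include a given label. A minimal illustrative sketch (the label names and picks are hypothetical, not from the study):

```python
def percent_agreement(selections, label):
    """Share of participants whose selected labels include `label`."""
    hits = sum(1 for chosen in selections if label in chosen)
    return 100.0 * hits / len(selections)

# Hypothetical: labels each of 5 RDs picked for one clinical scenario
picks = [
    {"inadequate energy intake", "physical inactivity"},
    {"inadequate energy intake"},
    {"physical inactivity"},
    {"inadequate energy intake", "swallowing difficulty"},
    {"inadequate energy intake"},
]
agree = percent_agreement(picks, "inadequate energy intake")
# labels above the study's 60% threshold would count toward reliability
```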


Relationship of Method of Delivery, Race and Health Characteristics on Health Related Outcomes following a 12 Week Worksite Wellness Program

Karen Cunningham, 2009

Background: Worksite wellness programs have the potential to impact a large, diverse population. The effectiveness of Internet vs. In-person delivery of wellness programs has not been well studied. Objectives: To determine the effect of method of delivery on changes in health related outcomes following a 12 week intervention delivered either In-person or via the Internet, and to identify relationships between changes in outcome measures and baseline demographic or health characteristics. Design: A retrospective secondary analysis of select data collected in the Wellness in the UMDNJ Workplace - Lifestyle Management Program. Paired samples t-tests and the sign test for matched pairs were used to compare baseline to 12 week values of select outcome measures, including anthropometric, biochemical, blood pressure, quality of life, and physical activity measures, for each arm and the total. Independent samples t-tests were used to compare changes in health outcome measures between delivery method arms and between groups based on race or health characteristics. Changes in physical activity were examined as changes to median and interquartile range. Subjects/Setting: Convenience sample of 113 overweight/obese adult employees at an urban academic health sciences center in NJ who completed a 12 week intervention. Intervention: A worksite wellness program which included individualized meal plans, meetings with an RD, and a lifestyle management program presented in 12 weekly sessions either In-person (classroom setting) or via the Internet, each designed to allow interaction with the RD and other participants. Main outcome measures: From baseline to 12 weeks, there were significant decreases in weight, BMI, and waist circumference for both the In-person and Internet arms of the study (p < 0.001). Although not statistically significant, mean weight loss was somewhat higher for participants who were white, obese at baseline, or in the Internet arm of the study.
Physical activity increased from baseline to 12 weeks for both the In-person and Internet arms of the study. Greater median increases in physical activity were noted in the In-person arm and for white participants. Baseline health characteristics were related to changes in several outcome measures, including quality of life measures, that merit further investigation to provide optimal interventions to a diverse population.


Vitamin D [25(OH)D] and its relationship to risk factors for cardiovascular disease in maintenance hemodialysis patients

Deborah Blair, 2009

Objective: This study examined the relationship between serum vitamin D [25(OH)D] levels and risk factors for cardiovascular disease (CVD) in maintenance hemodialysis (MHD) patients, specifically diastolic blood pressure (DBP), systolic blood pressure (SBP), pulse pressure (PP), serum total cholesterol (TC), and serum triglycerides (TG), pre- and post-supplementation with ergocalciferol. Design: Data were collected retrospectively through a medical record review of MHD patients for the six-month period following introduction of a clinical protocol in April 2006 which includes measuring serum 25(OH)D and supplementing patients with ergocalciferol (50,000 IU weekly × 24 weeks) if deficient (<40 ng/mL). Subjects: The sample consisted of adult MHD patients (n = 344) at five of the Fresenius Medical Care (FMC) out-patient dialysis centers in western Massachusetts who met the delimitations of having a serum 25(OH)D drawn between April and May 2006 and who were not previously prescribed ergocalciferol. Statistical Analyses: Descriptive statistics (means, standard deviations, and ranges for continuous variables; percentages for categorical variables) were used to describe demographic and treatment characteristics and bone mineral parameters at baseline for the study group as a whole, and at baseline and six months for subjects prescribed ergocalciferol. Pearson's correlation coefficients were used to examine the relationships between these factors and mean serum 25(OH)D at baseline, and to explore the relationship between the change in serum 25(OH)D from baseline to 6 months and the change in CVD risk factors during the same period. Alpha was set a priori at 0.05.
Results: Although no statistically significant relationships were found between risk factors for CVD and serum 25(OH)D at baseline for the group as a whole (21.0 ± 13.5 ng/mL, mean ± SD) or for patients with 25(OH)D <40 ng/mL (18.4 ± 9.0 ng/mL, n = 318), a significant inverse association (r = -.159, p = .047) was demonstrated for patients with baseline serum 25(OH)D <30 ng/mL (15.7 ± 6.4, n = 156) and PP >65 mm Hg (83.0 ± 14.7 mm Hg). For patients achieving improvement in serum 25(OH)D from <30 ng/mL at baseline to >70 ng/mL at follow-up (n = 21), a significant inverse association was seen with change in PP (r = -.437, p = .048) and SBP (r = -.438, p = .047). Similarly, for patients not on anti-hypertensive medications at baseline (n = 45), serum 25(OH)D was inversely associated with mean PP (r = -.327, p = .028), which decreased (71.4, 65.5, and 51.6 mm Hg) as serum 25(OH)D tertile increased (<13.6, 13.7-23.3, and >23.3 ng/mL) (p = .044). Implications: Improving serum 25(OH)D may aid in lowering pulse pressure and systolic blood pressure in MHD patients who are vitamin D deficient (i.e., <30 ng/mL). More research is needed to determine optimal serum 25(OH)D levels and supplementation strategies.
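The tertile grouping above can be mimicked with the cut-points the abstract reports (<13.6, 13.7-23.3, >23.3 ng/mL). The helper name and sample values below are illustrative only:

```python
def vitd_tertile(ng_ml, cutoffs=(13.6, 23.3)):
    """Assign a serum 25(OH)D value (ng/mL) to one of the study's
    reported tertiles: <13.6, 13.7-23.3, and >23.3 ng/mL."""
    low, high = cutoffs
    if ng_ml < low:
        return 1
    return 2 if ng_ml <= high else 3

# Hypothetical patient values, grouped for a tertile comparison
groups = [vitd_tertile(v) for v in (9.8, 15.7, 24.1, 40.2)]
```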


A Descriptive Retrospective Study of Patients with Type 2 Diabetes at a Community Health Center in South Carolina

Lorna Shelton Beck, 2009

Background: Diabetes education can improve the health outcomes of people with diabetes. In order for patients to be empowered to take charge of their health and manage their diabetes, they require education on self-management. Objectives: To determine the frequency of patient participation in the Center's diabetes education classes and what demographic characteristics may have influenced attendance. Design: This was a retrospective study utilizing information from the Center's diabetes class attendance log and the medical records of patients with type 2 diabetes. The study analyzed the relationship between the frequency of attendance at the classes and the demographic characteristics of the patients. Subjects/Setting: 104 registered patients of the Center, ≥18 years old with type 2 diabetes, who received care during 2007. Patients were divided into two groups: those who attended diabetes education classes and those who did not attend. Statistical analyses performed: Frequency distributions were used to display the demographic characteristics. The relationships between each of five demographic characteristics (independent variables) and attendance at one or more classes (dependent variable) were determined using independent samples t-tests and chi-square tests. Results: Of the people who attended one or more classes, there was no statistically significant difference in the number of classes attended when examined by gender (t = -0.212, p = 0.833), race (t = -0.319, p = 0.751), or economic status (t = -0.509, p = 0.613). Conclusions: The study demonstrated no relationship between attendance at the diabetes education classes and the five demographic characteristics.


Nutrition Focused Physical Examination (NFPE) Practices of Registered Dietitian Nutritionists that have Completed an In-Person NFPE Course

Susan Desjardins, 2016

Background: Nutrition focused physical examination (NFPE) is a fundamental component of nutrition assessment. The objective was to explore NFPE conduct among Registered Dietitian Nutritionists (RDNs) in clinical practice who completed the Rutgers School of Health Related Professions NFPE course between 1996-2015. Methods: Prospective, cross-sectional, mixed-mode survey. A convenience sample of 317 RDNs was sent an electronic or paper survey; 96 (30.3%) survey completers met the inclusion criteria of working in clinical practice. Descriptive statistics were used to analyze NFPE conduct and factors that enhance or limit NFPE conduct. The Mann-Whitney test and Spearman's correlation analyzed relationships between NFPE conduct and professional characteristics. Results: Respondents had a mean of 15.1 years of clinical practice experience. For 7 of 24 NFPE components, the most frequent response was "conduct exam independently." Examination of muscle (69.5%), fat loss (63.2%), hair/nails (67.4%), weight (62.1%), peripheral edema (54.2%), skin (53.2%), and height (53.1%) were most frequently reported as conducted independently. Time (71.3%) and workload (71.7%) were the primary limitations to NFPE conduct. Respondents with additional NFPE training conducted a significantly greater number of NFPE components independently (p=0.002) than those without additional training. There were no relationships between the number of NFPE components conducted independently and professional characteristics. Conclusion: Findings revealed that additional NFPE training was related to increased independent conduct of NFPE components. Further research with a larger sample of RDNs with NFPE training is needed to determine the findings' generalizability.


Identification of Bariatric Nutrition Care Practices and Demographic and Professional Characteristics of Registered Dietitians in Bariatric Nutrition Care

Kimberly Wical, 2009

Objective: To identify the nutrition care practices and the demographic and professional characteristics of Registered Dietitians (RDs) providing bariatric nutrition care, and to explore relationships between professional characteristics and agreement with guidelines collectively published by the American Association of Clinical Endocrinologists, The Obesity Society, and the American Society for Metabolic and Bariatric Surgery (AACE/TOS/ASMBS). Design: Descriptive survey design with purposive sampling via an electronic survey. Subjects: RDs associated with Center of Excellence (COE) practices recognized by the ASMBS. Statistical Analysis: Descriptive statistics with frequency distributions of characteristics; Pearson's product-moment correlations and Fisher's exact tests were used to explore relationships between professional characteristics and agreement with the AACE/TOS/ASMBS guidelines. Results: Of 383 COE facilities, 137 RDs (35.8%) provided names and email addresses to receive the survey invitation. Of the 137 RDs, 79.5% (n=109) responded to the electronic survey. The majority of respondents (n=97, 88.2%) were female, had a Bachelor's degree (n=58, 55.2%), and had a mean age of 37 years (SD=11.1 years). Roux-en-Y gastric bypass surgery was reported (n=102, 93.6%) as the surgery type most often seen for nutrition counseling, followed by adjustable gastric banding (n=90, 84.1%). There were variations found in recommendations for preoperative diet, pre-surgery weight loss recommendations, frequency of nutrition counseling, and nutrition assessment for weight and method of calorie, protein, and fluid needs. The majority of RDs made recommendations in agreement with the AACE/TOS/ASMBS micronutrient guidelines for multivitamin/mineral supplements (n=105, 100%), calcium citrate (n=69, 65.7%), vitamin D (n=48, 54.5%), sublingual vitamin B12 (n=65, 75.6%), intramuscular vitamin B12 (n=42, 85.7%), and intranasal vitamin B12 (n=23, 82.1%).
Additional variations were found in supplement recommendations, including chewable and pill forms of multivitamin and mineral supplements and protein supplements, as well as in postoperative introduction of foods for meats, soft vegetables and fruits, raw vegetables and fruits, and breads, rice, and pasta. There was a weak, non-significant correlation between professional characteristics and RDs' specific vitamin and mineral recommendations according to the AACE/TOS/ASMBS guidelines. Conclusion: Variability was found in the nutrition practices of RDs in bariatric nutrition, which further justifies the need for outcome studies leading to consistent best practices.


Level of Research Involvement in Practice-Based Research Activities among Clinical Preceptors of the UMDNJ Dietetic Internship

Angelina Nagel, 2009

Objective: The aim of this study was to determine registered dietitians' (RDs') level of research involvement (by creating a research score) and to determine whether demographic characteristics predicted research involvement. Design: This cross-sectional, descriptive study used the Dietitian Research Involvement Survey and applied the Tailored Design Method for administering online surveys. Subjects/Setting: This study surveyed 187 RDs among the preceptors of the UMDNJ dietetic internship, drawn from a consecutively numbered roster of alumni and current, active graduate and doctoral students, randomly selected using a lottery method. Statistical Analysis Performed: Descriptive statistics included frequency distributions for categorical variables and means, standard deviations, and ranges (minimums and maximums) for continuous data. Bivariate relationships were explored between demographic characteristics and level of research involvement using Pearson's correlation coefficients, one-way Analysis of Variance (ANOVA), and independent samples t-tests. An a priori significance level was set at p < 0.10. Results: Total research scores were significantly higher for dietetic internship RD preceptors who had completed or were in progress of obtaining their doctoral degree (F=8.576, p≤0.0005), earned a specialty board certification (t=-4.32, p≤0.0005), or who were in alternative settings for both primary area of practice (F=3.324, p=0.042) and type of rotation precepted (F=3.054, p=0.054). Those individuals who scored highest within the alternative settings were in Community/Public Health/Government Agency, Food Manufacturer/Distributor/Retailer, and Private Practice (Consultation and Business). RDs most frequently practiced and scored highest at Level 1 of the research continuum (mean=10.97, SD±2.17), with the least practice activity occurring at Level 4 of the continuum (mean=3.75, SD±1.89).
Conclusions: Involvement by RDs in practice-based research activities was largely determined, using a modified research continuum, by their ability to incorporate an evidence-based approach and their level of education, area of practice, type of rotation precepted and certifications held. Very few RDs are involved in conducting practice-based research. The results identified a need for further exploration of practice-based research activities by RDs with a broader sample of dietitians.


Usage Patterns of Nutrition-Focused Physical Assessment by the Registered Dietitian Following Completion of a Nutrition-Focused Physical Assessment Program

Rene Brand, 2010

Background: Nutrition-Focused Physical Assessment (NFPA) is an integral component of the Nutrition Care Process. However, there is a paucity of research on the current NFPA practices of Registered Dietitians (RDs) and the effects of training on their use of NFPA. Objective: To determine the NFPA skills utilized by RDs, the factors that influence their use of NFPA, and whether completing NFPA training changes their usage patterns of NFPA. Design: Prospective internet-based survey sent to a convenience sample of 259 RDs via SurveyMonkey. Subjects: All RDs who had completed the University of Medicine and Dentistry of New Jersey (UMDNJ) Applied Physiology Course or the Dietitians in Nutrition Support Practice Group (DNS) Advanced Skills Workshop from 2004 through 2009 were invited to participate. One hundred and twenty (46.7%) respondents met the inclusion criteria; 110 surveys were usable. Statistical analysis: SPSS 17.0 was used for analyses. Descriptive and inferential statistics (chi-square and Pearson correlation coefficient) were used to test the relationships between demographic characteristics, factors influencing use, and the professional practices of NFPA (α=0.05). Results: The survey sample consisted of 69 UMDNJ and 41 DNS participants with a mean of 12.7 (±8.0 SD) years holding the RD credential; 56.4% were primarily in clinical nutrition acute care, 41.7% in the critical care/nutrition support practice area, and 61.3% had a specialty certification/advanced practice credential (SP/APC). For four of the 17 skills surveyed (height and weight measurement, skin and nail and peripheral edema assessments, and extra-oral exam), the most frequent response was "perform independently." Dysphagia screening, intra-oral exam, pulse/heart rate, and blood pressure measurement had the most frequent response of "do not perform the skill but use in assessment."
At least 14% of respondents reported an increase in their use of each NFPA skill after completing an NFPA program; 50% reported an increase in the use of skin and nail and peripheral edema assessments. Time available (n=49, 73.1%) and workload (n=47, 70.1%) were identified as factors limiting their use of NFPA. RDs with an SP/APC were more likely to perform the abdominal exam independently than those without one (p=0.029). The total number of barriers reported had a weak inverse relationship to the total number of NFPA skills performed (r= -0.273, p=0.004). Conclusions: All NFPA skills explored had a self-reported increase in use after completion of an NFPA program. Further research on optimal training methods and increased availability of NFPA training programs for RDs are recommended for more widespread use of NFPA.


Demographic factors related to importance of personal weight management and perceived need for weight management resources among UMDNJ faculty and staff.

Christina Hussey, 2009

Abstract: Demographic factors related to importance of personal weight management and perceived need for weight management resources among UMDNJ faculty and staff. Objective: The primary aims of this study were to explore the relationships between the demographic characteristics of UMDNJ faculty and staff survey respondents and their perceived needs for weight management resources at the worksite and their self-reported importance of personal weight management. Design/Subjects: This was a cross-sectional retrospective secondary analysis of data from the survey sent to UMDNJ faculty, staff and students in 2008. The sample included 431 faculty and staff from the five campuses, one hospital and eight schools of UMDNJ who responded to the 2008 survey on perceived needs and interests in weight management, personal and professional weight management practices and weight status. Statistical Analysis: Data was analyzed using SPSS 17. Descriptive and inferential statistics were used. Results: The majority of respondents were female (n=331, 76.8%), white, non-Hispanic (n=255, 59.9%), had greater than a baccalaureate degree (n=220, 51.2%), and were in an executive, administrative, managerial, professional/non-faculty position (n=228, 53.4%). The age of respondents ranged from 18-74 years (mean=43.7, SD=11.22). The mean BMI was 27.62 kg/m2, range = 16.14kg/m2 to 51.21 kg/m2. Of the respondents, 30.1% (n=128) were calculated to be overweight; 53.4% (n=229) perceived their weight status to be overweight; 29% (n=123) were calculated to be obese and 11.9% (n=51) perceived themselves to be obese. Non-whites, those who perceived themselves as overweight or obese, and those calculated to be overweight or obese were significantly more likely to rate having weight management resources as important (Pearson Ç2 = 8.731, p = 0.033, Pearson Ç2 = 21.033, p = 0.002; Pearson Ç2 = 22.990, p = 0.001 respectively). 
Faculty and those with greater than a baccalaureate degree were significantly more likely to rate attaining and/or maintaining a healthy weight as important (Pearson χ2 = 9.192, p = 0.010; Pearson χ2 = 7.800, p = 0.020, respectively), and healthy eating as important (Pearson χ2 = 12.458, p = 0.002; Pearson χ2 = 12.602, p = 0.002). Faculty were also more likely to rate physical activity as important (Pearson χ2 = 9.993, p = 0.007). Age had a weak but significant positive correlation with importance of healthy eating (r=0.096, p=0.050) and with importance of physical activity (r=0.142, p=0.004). Conclusion: An interest in and need for weight management resources by UMDNJ employees was identified. Additional efforts to provide weight management options should be considered by UMDNJ. Healthy dining options and an on-site fitness center were the two highest rated resources.
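Many of the analyses summarized in these abstracts, including the one above, rest on the Pearson chi-square test of independence. As an illustrative sketch only, with an invented 2x2 table (race by importance rating) rather than the study's actual counts, the test can be run with SciPy:

```python
# Hypothetical example of a Pearson chi-square test of independence,
# like those reported in the abstract above. The counts are invented
# for illustration, not taken from the study.
import numpy as np
from scipy.stats import chi2_contingency

# rows: white / non-white; columns: rated resources important / not
table = np.array([[120, 135],
                  [110,  66]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.4f}")
```

Note that `chi2_contingency` applies Yates' continuity correction by default for 2x2 tables; pass `correction=False` to obtain the uncorrected statistic some packages report.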


Changes in Body Weight Among Military Casualties Following Thermal Injury

Marybeth Salguero, 2009

Abstract: Objective: To examine weight change from pre-injury through 12 months rehabilitation following thermal injury in active duty military casualties admitted to the U.S. Army Institute of Surgical Research, and the relationship between weight changes and clinical characteristics. Design: A retrospective, descriptive pilot study. Electronic medical records were reviewed. Time points of interest included: intensive care unit (ICU) admission to ICU discharge; ICU discharge to hospital discharge; hospital discharge to six months rehabilitation; six months rehabilitation to 12 months rehabilitation; and overall, from pre-injury to 12 months rehabilitation. Subjects: The records of 106 military casualties who were admitted to the burn ICU between January 1, 2005 and December 31, 2007 and met the inclusion criteria were reviewed. Statistical analyses: Descriptive statistics, paired t-tests, repeated measures ANOVA, Pearson product-moment correlation, point biserial correlation and multiple regression were used. An a priori alpha of 0.05 was considered statistically significant. Results: Statistically significant weight changes between each of the time periods were found (p < 0.001). Patients lost weight between ICU admission and ICU discharge (8.7 ± 12.7 kg) and between ICU discharge and hospital discharge (8.5 ± 10.5 kg). Weight gain occurred between hospital discharge and six months rehabilitation (9.0 ± 9.9 kg), between six months rehabilitation and 12 months rehabilitation (7.2 ± 6.7 kg) and overall, between pre-injury and 12 months rehabilitation (8.8 ± 10.4 kg). Among the subset of patients (n=59) with documented weights at each time point, significant differences were found between weights at each time point, F (3.8, 223.1) = 35.6, p < 0.001.
Statistically significant associations were found between weight change in the ICU and %TBSA, ISS, ventilator days, infection, and ICU length of stay (p < 0.05); between weight change from hospital discharge to six months rehabilitation and ISS, antipsychotic prescription and hospital length of stay (p < 0.05); and between weight change from pre-injury to 12 months rehabilitation and diagnosis of PTSD (p < 0.05). On multivariate analysis, predictors of weight change in each time period varied. Conclusion: The results of this pilot study suggest that significant weight changes occur in military burn casualties from pre-injury through 12 months rehabilitation and that the predictors of weight change vary depending on the time period. Further research is warranted to more fully describe associations of weight change with clinical characteristics, to assess the effects on body composition and to determine appropriate interventions.


Lifestyle Intervention and Glucose Monitoring in Women with Impaired Glucose Tolerance in Pregnancy Receiving Care at a Community Health Center

Diana Gonzales-Pacheco, 2009

Abstract: Background/Aims: To evaluate the effect of lifestyle intervention and self-monitoring of blood glucose on maternal and neonatal outcomes in women with impaired glucose tolerance in pregnancy. Methods: This was a non-randomized retrospective cohort study of adult women with an abnormal glucose screen and one abnormal value on a 100 gram oral glucose tolerance test. The treatment group (n=41) received lifestyle intervention and the control group (n=54) received usual prenatal care. Categorical maternal and neonatal outcomes were compared between groups using chi-square analysis. The a priori alpha level was set at 0.05. A case series (n=5) is reported on select patient outcomes. Results: Baseline characteristics were equivalent between groups. The rate of cesarean section delivery was not significantly different (p = 0.477) between the treatment (n=2, 4.9%) and control group (n=4, 7.4%). The rate of LGA (> 90th percentile) was not significantly different (p=0.613) between the treatment (n=5, 9.8%) and control group (n=5, 9.4%). The rate of macrosomic (birth weight > 4000 grams) infants was not significantly different (p=0.642) between the treatment (n=3, 7.3%) and control group (n=4, 7.5%). In the case series treatment group, three participants required medication to achieve glycemic control. Conclusions: Although no significant differences in maternal and neonatal outcomes were demonstrated between groups, the prospective treatment group required clinically relevant interventions to maintain glycemic control, indicating the importance of treating IGT in pregnancy in women with risk factors for gestational diabetes.


Nutrition Focused Physical Examination Practices of RD Members of The Oncology Nutrition Dietetic Practice Group and Board Certified Specialists in Oncology Nutrition

Jessica Iannotta, 2010

Nutrition focused physical exam (NFPE) is a component of the standards of practice and professional performance for oncology nutrition. The purpose of this prospective internet-based survey was to determine the relationships between NFPE practices and professional and demographic characteristics among RD members of the ONDPG and/or CSOs. Twenty-six percent (n=378) of the 1450 RDs surveyed completed the survey and met inclusion criteria. A priori alpha was set at 0.05; descriptive and inferential statistics were used. The survey sample was 98.2% female with a mean of 14.3±11.4 years of experience; 53.9% (n=179) had or were pursuing graduate degrees, 57.1% (n=100) were CSOs and 35.3% (n=130) had received training in NFPE. The skills most frequently performed independently by respondents were weight measurement (n=195, 53.6%), height measurement (n=141, 38.7%), PG-SGA performance (n=100, 28.2%), EN/PN access evaluation (n=78, 22.1%), and skin assessment (n=71, 20.0%). Workload was identified as a limiting factor by 72.9% (n=253) of respondents, in addition to lack of prior education, training, and time availability. There was no statistically significant relationship between level of education and number of NFPE skills performed. Those who received prior NFPE training performed significantly more NFPE skills independently (U=10144.5, p≤0.0001) and with assistance (U=10985.5, p≤0.0001). Those who were CSOs performed significantly more NFPE skills independently (U=12167.5, p=0.045) and with assistance (U=12193.5, p=0.028). RDs with prior training in NFPE and those with the CSO credential performed significantly more NFPE skills independently. Further research in both general nutrition practice and oncology nutrition is warranted to justify the need for additional training and education in NFPE.


The effects of nutritional goals on weight status among very low birth weight and extremely low birth weight infants

Gina Cunha, 2010

Objective: To evaluate growth changes in very low birth weight (VLBW) and extremely low birth weight (ELBW) infants meeting three nutrition goals compared to those not meeting nutrition goals. Methods/Design: Retrospective chart review of ELBW and VLBW infants from a level III neonatal unit, born appropriate for gestational age and meeting inclusion and exclusion criteria. Data were analyzed using SPSS v17.0. Frequency distributions reported demographic and clinical characteristics and nutrition goals. T-tests were used to analyze the change in z-scores and differences between ELBW and VLBW infants. A repeated measures factorial ANOVA was used to analyze the effect of nutrition goal achievement and weight group on growth change at each time period. Results: Of the 110 infant records reviewed, 29 were ELBW (26.4%) and 81 were VLBW (73.6%) infants. The mean weight-for-age z-score at birth was -0.03 for both groups. For the total sample, the day of life 28 z-score was -0.97 and the discharge z-score was -0.99. For the total sample, 44.5% (n=49) met all three nutrition goals; 24.1% (n=7) of ELBW infants and 51.9% (n=42) of VLBW infants met all three goals. No significant differences in growth were found between ELBW and VLBW infants, even after controlling for nutrition goal achievement. Conclusions: ELBW and VLBW infants had comparable growth even after controlling for nutrition goal achievement. Nutrition goals may need to be more aggressive to promote weight gain closer to intrauterine growth rates, though, in this study, the rate of extrauterine growth restriction at discharge was lower than that observed in most studies.


The Impact of an Educational Intervention for Continuing Education on Applying the Nutrition Care Process and Model and the International Dietetics and Nutrition Terminology for Registered Dietitians Practicing in the Area of Long Term Care

Rena Zelig, 2010

Objective: To determine the change in knowledge and attitudes of RDs in Long Term Care (LTC) regarding the use of the Nutrition Care Process and Model (NCPM) and the International Dietetics and Nutrition Terminology (IDNT) in the LTC setting following an education intervention. Design/Methodology/Subjects: A prospective one-group pre-post test design was used with RDs employed by four U.S. LTC health care services companies (N=422). Email invitations were sent to regional Directors of Nutrition Services and forwarded to prospective participants, inviting them to complete a pre-test, followed by a web-based module on the use of the NCPM and IDNT in LTC, and a post-test. SPSS v17.0 was used for analyses. Results: Sixty RD participants (14.2% of the 422 invited; adequate to achieve statistical power) completed the course module and the pre- and post-tests. Following the intervention, knowledge scores increased significantly (mean increase=2.40 points out of a maximum 17 points, SD=1.99, t=9.33, p<0.01, paired-samples t-test), as did attitude scores (mean increase=4.73 points out of a maximum 50 points, SD=6.04, t=6.07, p<0.01, paired-samples t-test). Knowledge scores increased significantly more for those who did not use the NCPM or the IDNT as compared to those who currently used the NCPM (t=2.53, p=0.01) or IDNT (U=218.50, p=0.02, Mann-Whitney U). Participants with no prior IDNT education had a significantly greater change in attitude than those with prior education (U=209.50, p=0.02, Mann-Whitney U). Conclusions: Educational interventions targeted to the LTC practice setting can improve IDNT and NCPM knowledge and attitudes as measured by pre-post tests. Further research with a larger sample size is needed.
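The pre/post comparisons above used paired-samples t-tests and Mann-Whitney U tests. A minimal sketch of that analysis pattern, using invented scores and an invented subgroup split rather than the study's data:

```python
# Hypothetical pre/post analysis: paired t-test on overall score
# change, then a Mann-Whitney U test comparing change between two
# subgroups. All scores are simulated for illustration.
import numpy as np
from scipy.stats import ttest_rel, mannwhitneyu

rng = np.random.default_rng(0)
pre = rng.integers(5, 12, size=60)                        # pre-test (max 17)
post = np.clip(pre + rng.integers(0, 5, size=60), 0, 17)  # post-test

t, p_paired = ttest_rel(post, pre)  # did knowledge scores increase?
print(f"paired t = {t:.2f}, p = {p_paired:.4f}")

change = post - pre
group_a = change[:30]   # e.g., no prior IDNT education (invented split)
group_b = change[30:]   # e.g., prior IDNT education
u, p_u = mannwhitneyu(group_a, group_b)
print(f"Mann-Whitney U = {u:.1f}, p = {p_u:.4f}")
```

The parametric paired t-test suits interval-scale score changes; the Mann-Whitney U test is the usual nonparametric choice when comparing two independent groups without assuming normality, which matches the mixed reporting in the abstract.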


A Delphi Study: Using Job Functions to Assist in Defining Levels of Practice for Dietitians Practicing in Nephrology Care

Daniel Pieloch, 2010

Objective: To identify job functions of Registered Dietitians (RDs) practicing in nephrology care and to explore whether agreement exists between these job functions and levels of practice defined by the American Dietetic Association (ADA). Design: The Delphi technique was utilized to administer two rounds of electronic surveys with a panel of experts over 14 weeks between October 2008 and January 2009. Subjects: Members of the National Kidney Foundation Council on Renal Nutrition (NKF-CRN) executive committee and board members of the ADA Renal Practice Group (ADA-RPG) from the years 2005 to 2008 and the 2008 members of the Nephrology Standards of Practice/Standards of Professional Performance (SOP/SOPP) Committee were invited to participate. Statistics: Frequency distributions (n and %). Consensus was defined as 60% agreement or greater of the expert panel. Results: Of the 70 experts solicited for participation, 20 (29%) returned usable surveys in the first Delphi round; 75% (n = 15) of these experts completed the second Delphi round. Of the 37 job functions, 34 (92%) were accepted by consensus and 3 job functions (8%) were not accepted and were modified for the second Delphi round. Two new job functions were identified in the first Delphi round. Thirty-eight percent (n = 15) of the job functions (n = 39) were classified with a level of practice (Generalist, Specialty, or Advanced). Seven job functions were classified as Generalist (18%), eight as Specialty (21%), and none (0%) as Advanced. Applications/Conclusions: Agreement exists between job functions of RDs practicing in nephrology care and levels of practice defined by the ADA. Because the majority of job functions were not classified with a level of practice, this survey demonstrates the challenges that RDs in nephrology care have in determining their level of practice and stresses the need for clearer and better defined pathways to determine their practice level.


Childhood Obesity: An Exploratory Needs Assessment of Self-Identified Parental Needs and Preferences for Pediatric Weight Management Nutrition Education

Carly DeGrood, 2010

Objective: To identify the weight status and demographics of children and their parents/guardians who attended a Midwestern pediatric primary care clinic and to explore their relationships with parent/guardian needs and preferences for pediatric weight management nutrition education. Design: An exploratory needs assessment using a mail-out survey between July 1, 2009 and October 19, 2009. Subjects/setting: Children aged two to 11 years who presented to a primary care clinic and their parents/guardians who completed a survey and consented to review of their medical records. Statistical analysis: Descriptive and inferential statistics were used to explore the relationships between parent/guardian needs and preferences for nutrition education and the child's BMI percentile, the parent/guardian's BMI and the parent/guardian's perception of their child's weight status. The a priori alpha level was set at 0.05. Results: The subjects (n = 191, 9.4%) were primarily non-Hispanic/Latino and White. Registered dietitians (n = 138, 80.2%) were the preferred provider for nutrition education. An increased child BMI percentile and parent/guardian BMI were significantly related to the parents' desire to have a healthier body, interest in a family weight management program, and specific nutrition topics such as planning physical activity for the whole family, preventing obesity-related medical problems and providing correct serving sizes for their child. Conclusions: Identifying parental needs and preferences for nutrition education and their relationships with both child and parent/guardian weight status will help with program development and clinical practices to prevent and treat childhood overweight and obesity.


Risk of Malnutrition in Older Adults Who Receive Care at an Urban Dental Clinic

Melissa Stump, 2016

Malnutrition among older adults is a growing concern due to the associated increased risk for chronic disease. Screening using validated tools can promote early identification of risk for malnutrition. The use of a self-administered, validated nutrition screening tool such as the Self-MNA (Self-administered Mini Nutritional Assessment) facilitates such screening of older adults. The aim of this study was to identify the prevalence of malnutrition or risk for malnutrition among adults age 65 years and older who came for care to the Rutgers School of Dental Medicine (RSDM) clinics, using the Self-MNA over a 6-month period. A retrospective chart review study design was used; 75 patient charts were reviewed to obtain Self-MNA data. The mean age of patients was 71.9±5.5 years; 52.0% (n=39) were male. Thirty-seven percent were identified as being at risk for malnutrition (n=21, 28.0%) or malnourished (n=7, 9.3%) based on the Self-MNA. All who were identified as malnourished had a BMI >23 kg/m2. The Self-MNA can be easily implemented in a clinic setting. The dental clinic is a feasible location to include screening for malnutrition, as it may promote early nutrition intervention, which fosters comprehensive care.


The Effect of Peer Health Education and Peer Health Coaching on Clinical Outcomes: Health Measures, Nutrition Intake and Physical Activity in College Students

Deborah Hutcheson, 2010

Objective: To evaluate peer health education (PHE) compared to peer health coaching (PHC) on clinical outcomes: nutrition intake (consumption of calories, fiber servings, fruit and vegetable servings, and percentage of calories from fat); physical activity (frequency, duration, type); health measures (waist circumference, blood pressure, body weight, calculated BMI, frame size, body composition); and intermediate outcomes (stages of change and attainment of goals), given demographic characteristics (age, gender, class level, location of residence, ethnicity, and major), in a retrospective study over 12 weeks. Participants: A convenience sample of non-nutrition students from the School of Health and Rehabilitation Sciences (SHRS) at the University of Pittsburgh (n = 40) was recruited to participate as the PHE group for 12 weeks (September–December 2008). A second convenience sample of SHRS non-nutrition students (n = 40) was recruited to participate as the PHC group for 12 weeks (January–April 2009). Both groups participated as part of the practice labs for courses in the Coordinated Master's Program in Dietetics. Methods: The PHE group received one individual health education session (Fall) and the PHC group (Spring) received three individual health coaching sessions. Health measures, dietary intake and physical activity outcome measures, stages of change and health goals were recorded pre- and post-intervention. Results were compared using descriptive statistics for demographics and for clinical and intermediate outcome measures. Independent t-tests, Fisher's Exact test and Pearson's Chi Square test of independence were utilized to determine equivalence between the PHE and PHC groups at baseline. Significant differences in gender between groups at baseline were identified; therefore, Pearson's correlations were performed to screen gender as a potential confounding variable.
A one-way analysis of covariance (ANCOVA) was used for variables without significant baseline differences but with a correlation with the covariate gender, and a repeated measures analysis of covariance (ANCOVA) was used for variables with both significant baseline differences and a correlation with the covariate gender. Independent t-tests or Mann-Whitney U tests were conducted to compare clinical outcome variables without significant baseline differences or a significant correlation with the covariate gender. Correlation matrices with combined groups were used to evaluate the potential relationships between the clinical outcome variables and intermediate outcome variables over time. Results: No between-group differences were found from baseline to 12 weeks. A significant medium negative correlation between weight change and health measures percent goal attainment (r = -0.299, p = 0.007) was found, indicating a decrease in weight at a higher percentage of goal attainment. There was a significant medium positive correlation between type of physical activity (active or sedentary) and percent physical activity goal attainment (r = 0.351, p = 0.001), indicating that those at a more rigorous type or level of physical activity met a higher percentage of their goal for physical activity. A significant weak positive correlation was found between fruit intake and stages of change (r = 0.224, p = 0.046) and vegetable intake and stages of change (r = 0.245, p = 0.028), indicating that servings of fruit and vegetable intake increased as stages of change advanced from pre-action to action stages. Conclusions: Exposure to peer health educators and coaches is beneficial in increasing fruit and vegetable consumption in college students and motivating movement from pre-action to action stages of change.


The Effect of a Telephone Directed Intervention On Behavioral and Clinical Outcomes in Type 2 Diabetes

Candy Croft, 2010

Purpose: The purpose of this study was to evaluate the effect on clinical and behavioral outcomes of routine telephone directed intervention as follow-up to diabetes self-management education (DSME) in clients who completed a DSME program at Patrick Air Force Base, Florida. Methods: Twenty subjects with type 2 diabetes who completed the DSME program were recruited to participate in a telephone directed intervention (TDI) or usual care control group (CG). The intervention was designed to provide follow-up care while meeting the needs of adult learners. It included six brief telephone counseling sessions over 12 weeks immediately following the DSME program. The telephone calls were scripted utilizing open-ended questions to encourage subject-researcher interaction and individualization of care. Clinical and behavioral measures were evaluated. Clinical measures included change in A1C, fasting blood glucose (FBG), body mass index (BMI) and lipid levels. Behavioral measures were changes in goals based on the American Association of Diabetes Educators (AADE) core self-care behaviors of healthy eating and being active. A case history series approach was used to describe the researcher-subject interaction. Results: An analysis of the case series indicated subjects found the intervention helped them improve self-management skills. Comments made by subjects indicated increased self-efficacy in diabetes self-management. Though they did not reach statistical significance, improvements in clinical outcomes did occur, including a lowering of A1C in both groups. The TDI group showed a greater decline (7.3% to 6.4%) than the CG (7.4% to 7%). There was also a decline in weight and BMI in most members of both groups. The BMI in the CG declined from 33.7 ± 4.7 to 29.9 ± 3.5, while that in the TDI group declined from 33.9 ± 7.4 to 33.7 ± 5.4.
Conclusions: Despite an inability to show differences between the groups clinically, the analysis of cases indicates that telephone directed intervention is a promising format for follow-up diabetes education, as people with diabetes indicated improved skills and confidence in diabetes self-management. Telephone directed intervention should be considered as a method of increasing the availability of follow-up diabetes counseling and support. However, further research is required to evaluate the effect on clinical and behavioral outcomes as well as the appropriate timing, frequency and format of the calls.


The Relationship between the Timing of Initiation of Enteral Feeding and Total Number of Ventilator Days in Critically Ill Mechanically Ventilated Patients

Jessica Brown, 2010

Objective: To examine the relationship between the timing of initiation of enteral nutrition and the total number of days on mechanical ventilation. Additionally, the relationship between the timing of initiation of enteral nutrition and clinical characteristics, including age, gender, BMI, APACHE score, intensive care unit, and admission diagnosis, was examined. Design: A retrospective chart review, using electronic medical records of mechanically ventilated patients in intensive care at a level one trauma center. Subjects: Adult intensive care patients, 18 years of age or older, who required mechanical ventilation for greater than 72 hours and received enteral nutrition during the course of mechanical ventilation. Statistical Analysis: Frequency distributions and descriptive statistics were used to report demographic and clinical characteristics of the subjects. Pearson correlation was used to examine the relationship between the timing of initiation of enteral nutrition and BMI, APACHE score, and total number of ventilator days. Two separate one-way ANOVAs were used to evaluate the differences in mean timing of initiation of enteral nutrition across both the admission unit and the admission diagnosis variables. The a priori alpha level was set at 0.05. Results: Medical records were reviewed for 294 subjects, 163 of whom met inclusion criteria. There was no significant relationship found between timing of initiation of enteral nutrition and total number of ventilator days (p = 0.741) when evaluated as a continuous variable. There was a significant difference in mean timing of initiation of enteral nutrition, when evaluated as enteral feeding initiation categories, and total number of ventilator days (p=0.042). Subjects who were started on enteral nutrition within 24 hours of intubation had significantly fewer mean ventilator days than those started on enteral nutrition 24.1-48 hours after intubation (p=0.020).
There was no significant difference found in timing of initiation of enteral nutrition by gender (p=0.667). There was a significant difference in mean timing of enteral nutrition initiation by BMI category (p=0.031). Normal weight subjects were started on enteral nutrition significantly sooner than both underweight subjects (p=0.012) and obese subjects (p=0.033). There was no significant difference found in mean timing of initiation of enteral nutrition by intensive care unit (p=0.307). A significant relationship was found between mean timing of initiation of enteral nutrition and admission diagnosis (p=0.008). Results of post hoc analysis demonstrated that subjects with trauma diagnoses were started on enteral nutrition sooner than subjects with cardiovascular (p=0.013), neurologic (p=0.023), metabolic/endocrine (p=0.004) and gastrointestinal (p=0.001) diagnoses. Subjects with a respiratory diagnosis were started on enteral nutrition sooner than subjects with gastrointestinal (p=0.006) and metabolic/endocrine (p=0.03) diagnoses. Applications: Significant differences in mean timing of initiation of enteral nutrition by total number of ventilator days, BMI, and diagnosis category suggest that further investigation into the timing of initiation of enteral nutrition and improved patient outcomes is warranted. Further research may be conducted to determine the appropriate timing of initiation of enteral nutrition in mechanically ventilated intensive care patients.
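The diagnosis-group comparison above is a one-way ANOVA followed by post hoc pairwise tests. A hedged sketch of that workflow with SciPy, using invented times-to-initiation rather than the study's records (a simple Bonferroni correction stands in here; the abstract does not state which post hoc procedure was used):

```python
# Hypothetical one-way ANOVA across three admission-diagnosis groups,
# with Bonferroni-corrected pairwise t-tests as a simple post hoc step.
# Hours-to-initiation values are simulated, not the study's data.
import numpy as np
from scipy.stats import f_oneway, ttest_ind

rng = np.random.default_rng(1)
groups = {
    "trauma":      rng.normal(20, 6, 50),  # hours to EN initiation
    "respiratory": rng.normal(26, 6, 40),
    "GI":          rng.normal(34, 8, 30),
}

f_stat, p = f_oneway(*groups.values())
print(f"F = {f_stat:.2f}, p = {p:.4f}")

pairs = [("trauma", "respiratory"), ("trauma", "GI"), ("respiratory", "GI")]
for a, b in pairs:
    _, p_pair = ttest_ind(groups[a], groups[b])
    p_adj = min(p_pair * len(pairs), 1.0)  # Bonferroni adjustment
    print(f"{a} vs {b}: adjusted p = {p_adj:.4f}")
```

The omnibus F test only establishes that some group means differ; the pairwise step identifies which pairs, with the multiple-comparison adjustment keeping the family-wise error rate at the nominal alpha.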


A Retrospective Study of the Relationships Between Demographic Characteristics and Anthropometric Measurements of Participants in a Randomized Controlled Weight Management Trial

Carol Ann Kaminski, 2010

Objective: To explore the relationships between demographic characteristics and anthropometric measurements of overweight older adults who were enrolled in a randomized controlled trial on weight management at the New Jersey Institute for Successful Aging (NJISA). Design/Methodology/Subjects: A retrospective study design which utilized a closed data set from a randomized controlled trial that measured the impact of providing medical nutrition therapy on weight, blood pressure, percent body fat, and eating and exercise habits among obese/overweight older adults treated at the NJISA over short-term (12 weeks) and long-term (26 weeks) time periods. SPSS v17.0 was used for analyses, with the a priori alpha level set at 0.05. Results: Of the total participants (n = 37), females represented 64.9% of the sample. The mean age for all participants was 73.7 years. Subjects who were overweight were significantly older (75.9 vs. 71.5 years) than those who were obese (t = 2.3, p = 0.03). Subjects who did not live alone had a significantly higher fat free mass (124.9 vs. 107.1) (t = -2.1, p = 0.045) and larger waist circumference (42.3 vs. 39.1) than those who did live alone (t = -2.5, p = 0.02). Subjects who were on combination therapy had a significantly larger waist circumference (43.7 vs. 40.0) than those who were on hypertension medication (t = -2.06, p = 0.05). Those who previously attempted to lose weight had a significantly higher percent body fat (38.6 vs. 29.7) than those who did not (t = -2.18, p = 0.04). Conclusions: Waist circumference, as an indicator of abdominal obesity, should be used in conjunction with BMI for assessment of obesity-related disease risk. Further research with a larger sample size is needed to develop a current reference standard of ideal weight for older adults using sex, age, and stature.


Relationship of hypovitaminosis D to Framingham risk score

Delores Truesdell, 2010

Background: Epidemiologic studies have suggested associations between vitamin D status and coronary heart disease (CHD) risk. Objective: The study purpose was to determine whether vitamin D status was predictive of CHD risk, using the Framingham risk score (FRS), in overweight Caucasian postmenopausal women. Design: This was a cross-sectional study of the baseline data collected in a clinical trial. Vitamin D status was determined by serum 25-hydroxyvitamin D [25(OH)D]. Usual dietary intake was assessed with a three-day food record (two weekdays and one weekend day). Education level, medical history, physical activity level, systolic blood pressure and cholesterol concentrations were obtained. Season was determined based on the date of blood draw. Results: The study sample comprised 178 women, aged 42 to 67 years (mean 55.71±4.33 years). The mean serum 25(OH)D level was 65.28±27.51 nmol/L (range 6.98–147.56 nmol/L). Fourteen percent (n=25) had 25(OH)D concentrations <37.5 nmol/L, indicating vitamin D deficiency. The predictive model explained 12.0% of the variance in 25(OH)D concentrations. In hierarchical regression, FRS was predicted by education level only. No association was observed between 25(OH)D levels and FRS (p = 0.981). Conclusions: Serum 25(OH)D concentrations were inversely associated with body mass index. Given the physiologic importance of vitamin D, further investigations aimed at determining the effect of obesity and heart disease on vitamin D requirements are warranted.


The Impact of the Coordinated Approach to Child Health (CATCH) on Clinical and Health Behavioral Outcomes in Rural, Underserved Elementary School Children

Theresa Johnson, 2010

OBJECTIVE: To evaluate the impact of a multi-component school-based health intervention. DESIGN: Secondary analysis of data from the Coordinated Approach to Child Health (CATCH) intervention study. SETTING: Rural, underserved elementary schools in Pike, Bullock, and Barbour counties in Alabama. PARTICIPANTS: Children (n=788) from nine schools (seven intervention and two control schools). INTERVENTION: Three-year CATCH intervention (nutrition education, optimization of the school food environment, and enhanced physical education). MAIN OUTCOME MEASURES: Dependent variables: BMI, BMI z-score, blood glucose, blood cholesterol, blood pressure, 24-hour recall of beverages, and time spent in moderate to vigorous physical activity (MVPA), non-MVPA, and sedentary activity. Independent variable: CATCH participation. ANALYSIS: Descriptive statistics, independent samples t-test, Chi Square, one-way ANOVA, odds ratio, Spearman's r. A priori alpha level: 0.05. RESULTS: CATCH students were less likely to report drinking soda (χ2 = 10.09, p = 0.001), more likely to report drinking milk (χ2 = 7.75, p = 0.005) and more likely to spend time in MVPA (χ2 = 9.52, p = 0.002) and non-MVPA (χ2 = 19.17, p < 0.001) compared to non-CATCH students. There were no differences between groups for BMI (p = 0.127), BMI z-score (p = 0.099), or skinfolds (p = 0.104). There were significant differences between groups for glucose (F = 62.56, p < 0.005), diastolic blood pressure (F = 14.24, p < 0.005), and systolic blood pressure (F = 3.59, p = 0.028), but these differences were not clinically significant as group means were within acceptable ranges for all three of these variables. There was a weak, positive association (r = 0.198, p < 0.001) between participation in the USDA free/reduced school lunch program and increasing BMI z-score. CONCLUSIONS AND IMPLICATIONS: CATCH was successful in improving health behaviors, but these improvements did not affect the prevalence of overweight in this setting.
There were no significant differences between groups for measures of adiposity. Participation in the USDA free/reduced school lunch program was associated with higher BMI z-score, which underscores the need for intervention in this population.
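As an illustrative aside (not part of the thesis), the chi-square comparisons reported above can be sketched in a few lines. The counts and the `chi_square` helper below are hypothetical, not the CATCH study's data or software:

```python
# Illustrative sketch of a Pearson chi-square test of independence, the kind
# of comparison reported above (e.g., soda drinking by CATCH vs. non-CATCH).
# The counts and the helper are hypothetical, not the study's data or code.

def chi_square(table):
    """Pearson chi-square statistic for an r x c table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n  # under independence
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table: rows = CATCH / control, columns = soda yes / no.
print(round(chi_square([[10, 20], [20, 10]]), 3))  # 6.667
```

The statistic is then compared against a chi-square distribution with (rows-1)(columns-1) degrees of freedom to obtain the p-values quoted in the abstract.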


Board Certified Specialists in Oncology Nutrition (CSOs’) Knowledge of Oral Complications of Cancer Therapies that Affect Food/Fluid Intake and Nutrition Status

Mary Beth Dunbar, 2014

Background: Oral complications associated with cancers and their treatments can impact functional ability to eat and overall nutrition status. Board Certified Specialists in Oncology Nutrition (CSOs) provide medical nutrition therapy (MNT) to patients with cancer and therefore should be aware of the oral complications that can occur during treatment. Objective: To determine knowledge of oral complications of cancer therapies that affect food/fluid intake and nutrition status among RDs who were CSOs in 2013, and to explore relationships between knowledge score and years as an RD in clinical and oncology practice. Design: Cross-sectional descriptive study using an internet-based survey. Participants/setting: RDs who were CSOs in 2013 with email addresses on file with the Commission on Dietetic Registration (CDR). An email invitation was sent to 622 CSO RDs, yielding a response rate of 38.4% (n=239). Statistical analyses performed: Descriptive statistics were used to report demographic and professional characteristics and knowledge questions. Spearman's correlation was used to test the relationships between knowledge score and years in clinical and oncology practice. Results: The survey sample was 93.7% female with a mean age of 42.0 years. The mean numbers of years as an RD in clinical and oncology practice were 14.6 and 9.0, respectively. Forty-five percent (n=104) reported their highest degree was a master's degree; 13.6% (n=31) reported they had received training in oral health screening as part of nutrition-focused physical assessment (NFPA) and/or nutrition-focused physical examination (NFPE). Fifty-nine percent (n=136) reported ambulatory care as their current employment setting. The overall knowledge score for participants was 87.8% (14.0 questions correct out of a possible 16.0). A weak, positive, significant relationship was found between knowledge score and years as an RD in oncology care (r=0.149, p=0.024). 
Conclusions: More than 75% of the respondents answered 13 of the 16 knowledge questions correctly. Respondents scored lowest on questions related to chemotherapy-induced mucositis and folic acid deficiency associated with methotrexate. Further research is needed among CSOs to determine gaps in knowledge regarding oral complications of cancer therapies. Research to identify appropriate education and training on oral health screening as part of NFPE, and obstacles preventing NFPE among CSOs, should also be performed. In addition, the knowledge portion of the survey tool should be further tested for reliability.


The Relationship between Subjective Global Assessment and C-Reactive Protein among Patients on Maintenance Hemodialysis

Melissa Kirchner, 2014

Background: Malnutrition and inflammation are contributing factors to protein energy wasting (PEW) for patients on maintenance hemodialysis (MHD) and are associated with high mortality rates. Subjective Global Assessment (SGA) is a validated tool to assess nutritional status among patients on MHD; serum C-reactive protein (CRP) is used to assess inflammatory status in patients with chronic kidney disease (CKD). It is uncertain how SGA and CRP are related because recent studies have yielded inconsistent results. Objective: To explore the relationship between serum CRP level and nutritional status as determined by the seven-point SGA rating (overall and by individual component) among adult patients with CKD stage five on MHD. Design/participants/setting: This study was a cross-sectional, secondary analysis of data among adult patients with CKD stage five receiving MHD at three research institutions in the Northeast region of the United States. The participants had data collected on clinical and demographic characteristics, SGA rating (overall and by individual component) and serum CRP level. Statistical analyses: Descriptive statistics were used to describe demographic and clinical characteristics, serum CRP level and SGA rating. Point biserial correlation was used to examine the relationship between overall SGA rating dichotomized into two categories (well-nourished and malnourished) and CRP level. Spearman's correlation was used to examine the relationship between each individual component of SGA on a seven-point ordinal scale and serum CRP level. Results: In total, 65 participants were included in the descriptive analyses and 46 participants were included in the final analyses. Inflammation (serum CRP ≥ 10 mg/L) was present in 10.9% of participants and malnutrition (SGA rating 1-5) was present in 42.6% of participants. 
No participants had severe malnutrition (SGA rating 1-2). There was little to no relationship between overall SGA category and serum CRP level (r=0.008, p=0.960), or between individual components of SGA and serum CRP level, including weight status (r=-0.085, p=0.573), dietary intake (r=0.188, p=0.210), gastrointestinal symptoms (r=0.257, p=0.085), functional capacity (r=-0.035, p=0.819), disease state and comorbidities (r=0.089, p=0.577), and physical exam (r=-0.034, p=0.822). Conclusions: There is no significant linear relationship between SGA rating (overall and by individual component) and serum CRP level. Participants considered to be well nourished by the SGA may have acute or chronic inflammation, and its impact on nutritional status remains unknown. Future studies with a greater percentage of participants with severe malnutrition are warranted to examine the relationship between serum CRP level and nutritional status across all categories.
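As an illustrative aside (not part of the thesis), Spearman's rank correlation of the kind used above can be sketched as the Pearson correlation of the ranks. All data and helper names below are hypothetical:

```python
# Illustrative sketch of Spearman's rank correlation, as used above to relate
# SGA component scores to serum CRP. Data and helper names are hypothetical.

def _ranks(values):
    """1-based average ranks; tied values share the mean of their ranks."""
    ordered = sorted(values)
    return [ordered.index(v) + (ordered.count(v) + 1) / 2 for v in values]

def spearman(x, y):
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var_x = sum((a - mx) ** 2 for a in rx)
    var_y = sum((b - my) ** 2 for b in ry)
    return cov / (var_x * var_y) ** 0.5  # Pearson correlation of the ranks

# Hypothetical SGA dietary-intake scores vs. serum CRP (mg/L):
print(round(spearman([6, 5, 7, 4, 6], [3.1, 8.0, 1.2, 12.5, 4.0]), 3))  # -0.975
```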


Goal Achievement of Participants in an American Association of Diabetes Educators Community Based National Diabetes Prevention Program

Tracy Bruen, 2014

Background: Risk factors for diabetes such as obesity and physical inactivity threaten to burden Tennessee's health care system if current trends are not reversed. Objective: To determine whether participants enrolled in a community-based implementation of the American Association of Diabetes Educators (AADE) grant-funded National Diabetes Prevention Program (NDPP) at Williamson Medical Center (WMC) can be successful in achieving behavior changes associated with reduced risk for the development of type 2 diabetes, and to compare differences between two cohorts (singles and couples) in meeting program goals. Design: A retrospective review of data from participants enrolled between June 8, 2013 and October 30, 2013 in WMC's AADE NDPP. Study Population: All adult participants (n=39) who met the original program criteria for the WMC AADE NDPP between June 8, 2013 and October 30, 2013. Statistical Analyses: Descriptive statistics were reported for demographic characteristics, anthropometric measurements, behavior change characteristics, physical activity and participation characteristics. The Wilcoxon signed-rank test was used to determine differences from baseline to completion of the 16-week Core phase in participants' weight, BMI, and reported minutes of physical activity per week. The Mann-Whitney U test was used to determine differences in mean percent weight change and mean percent change in weekly minutes of physical activity between the two cohorts. Results: Overall, 61.5% (n=24) of participants met program weight loss goals of 5-7% of initial weight and 64.1% (n=25) of participants met program goals of 150 or more reported weekly minutes of physical activity. Participants in the program significantly reduced weight by a mean of 9.1 lbs. (p<0.001), reduced BMI by 1.4 kg/m2 (p<0.001) and increased reported weekly minutes of physical activity by a mean of 112.2 minutes (p<0.001). 
There was no significant difference between groups (singles and couples) in percent increase in physical activity (U=123.0, z= -0.2, p = 0.913) or in percent weight lost (U =150, z= -0.4, p = 0.730) from baseline to the end of the 16-week Core phase of the program. Conclusions: Most participants, regardless of cohort type, achieved weight loss and physical activity goals by the end of the Core phase of the program. Lifestyle behavior change can be successful regardless of whether participants enroll with a significant social support person. A community-based implementation of a national lifestyle behavior change program can successfully reduce risk factors associated with the development of diabetes among those who participate.
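As an illustrative aside (not part of the thesis), the Mann-Whitney U comparison above can be sketched by counting ordered pairs across the two cohorts. The percent-weight-change numbers below are hypothetical, not the WMC program data:

```python
# Illustrative sketch of the Mann-Whitney U comparison described above
# (percent weight change, singles vs. couples). All numbers are hypothetical.

def mann_whitney_u(x, y):
    """Smaller of the two U statistics; ties between samples count 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return min(u, len(x) * len(y) - u)

singles = [-6.2, -5.1, -7.4, -4.8]  # hypothetical percent weight change
couples = [-5.9, -6.8, -4.5, -5.5]
print(mann_whitney_u(singles, couples))  # 7.0
```

The U statistic is then referred to tables (or a normal approximation yielding the z values quoted above) to obtain a p-value.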


The Assessment of Nutritional Status in One Outpatient Maintenance Hemodialysis Center by Subjective Global Assessment, Malnutrition Inflammation Score, and a Comprehensive Nutrition Assessment: a Cross-Sectional, Prospective Study to Determine the Level of Agreement

Melissa Prest, 2010

Background: In maintenance hemodialysis (MHD) patients, scored nutrition assessments such as the Subjective Global Assessment (SGA) or the Malnutrition Inflammation Score (MIS) are recommended to be performed in addition to a comprehensive nutrition assessment (CNA). Objective: To determine the level of agreement between the SGA, the MIS with two variations of scoring based on Chan et al and Ho et al, and a CNA. For the purposes of this study the CNA was considered the gold standard against which the assessments were compared. Design: This was a prospective, cross-sectional study performed in one urban outpatient MHD center. Subjects: Fifty-seven MHD patients from one MHD center who met inclusion criteria were recruited to participate in this study. Statistical analyses performed: Effect size was set at 0.5, alpha error probability was set at 0.05, and power (1-β error probability) was set at 0.9. Descriptive and non-parametric statistics were performed to examine the relationships between the three assessments and demographic characteristics. A kappa statistic was performed to determine the level of agreement between the SGA, the two variations of the MIS, and the CNA. Results: The participants were predominantly black (68.4%, n = 39) and male (66.6%, n = 38). The mean age of participants was 54.26 ± 14.72 years; mean dialysis vintage was 7.68 ± 5.37 years, and 96.4% (n = 55) of the participants had one to two comorbidities. The mean assessment score on the SGA was 6.26 ± 1.28, with 71.9% (n = 41) of the participants rated as mild/well-nourished; the mean assessment score on the MIS was 5.84 ± 3.94, with 73.6% (n = 42) of the participants rated as normal/mild malnutrition; and 61.4% (n = 35) of participants were rated as well-nourished by the CNA. There was a good level of agreement between the SGA and the MIS-Ho version (K = 0.462) and a substantial level of agreement between the SGA and the MIS-Chan version (K = 0.688). 
There was a substantial level of agreement between the SGA and the CNA (K = 0.685), the MIS-Ho and the CNA (K = 0.653), and the MIS-Chan and the CNA (K = 0.722). Sensitivity and specificity were similar between the SGA (sensitivity = 0.68, specificity = 0.97) and the MIS-Chan (sensitivity = 0.68, specificity = 1.00). Comparatively, the MIS-Ho had higher sensitivity and lower specificity (sensitivity = 1.00, specificity = 0.70). Conclusions: The majority of participants in this study were well-nourished. All assessments had substantial agreement with the CNA. The MIS-Ho was the most sensitive assessment and best at identifying malnourished patients. The SGA and MIS-Chan were the most specific assessments and best at identifying well-nourished patients.
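As an illustrative aside (not part of the thesis), the kappa agreement statistic reported above corrects observed agreement for the agreement expected by chance from each rater's marginal frequencies. The ratings below are hypothetical:

```python
# Illustrative sketch of the (unweighted) Cohen's kappa agreement statistic
# used above. The ratings are hypothetical, not the study's assessments.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                     for c in set(counts_a) | set(counts_b))
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical: SGA vs. CNA classifications for four patients.
sga = ['well', 'well', 'well', 'mal']
cna = ['well', 'well', 'mal', 'mal']
print(cohens_kappa(sga, cna))  # 0.5
```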


The Components of Subjective Global Assessment (SGA) and Their Ability to Predict Overall SGA Score in Stage Five Chronic Kidney Disease Patients on Maintenance Hemodialysis

Tammy Drasher

Background: Valid nutrition assessment techniques are essential for the care of stage five chronic kidney disease patients on maintenance hemodialysis. Objective: To identify which combination of Subjective Global Assessment (SGA) components best predicts nutritional status based on overall SGA score. Design: A retrospective, cross-sectional medical record review. Participants/Setting: 132 adult maintenance hemodialysis patients at one outpatient dialysis center from 2001-2009 with SGA performed by one registered dietitian (RD) trained in SGA procedure. Statistical analyses performed: Spearman rank correlation coefficients determined the relationship between SGA components and overall SGA score, while binary logistic regression analysis identified which components were most predictive of overall SGA score. SPSS 17.0 was used with an a priori alpha level of 0.05. Results: The SGA components of dietary intake, gastrointestinal symptoms, disease state/co-morbidity and physical exam for subcutaneous fat loss and muscle wasting provided the most efficient prediction of overall SGA score (χ² = 156.472, p < 0.001). The SGA components of weight change (Spearman's rho = 0.388, p < 0.001) and functional capacity (rho = 0.576, p < 0.001) were also correlated with overall SGA score. Only two subjects presented with edema thought to be related to nutritional status; therefore, this component was excluded from analysis. Conclusions: A combination of the SGA components of dietary intake, gastrointestinal symptoms, disease state/co-morbidity, and physical exam predicts nutritional status based on overall SGA score. Further research with a demographically diverse and larger study sample, with SGA conducted by multiple health practitioners, is warranted.


Breastfeeding Clinical Practices, Knowledge, and Attitudes of Neonatal Dietitians: An Exploratory Survey

Katina Langley

Objective: To describe the neonatal registered dietitian's (RD) role in breastfeeding promotion and support in the Neonatal Intensive Care Unit (NICU) through examination of breastfeeding clinical practices, level of breastfeeding knowledge, attitudes toward breastfeeding, and demographic characteristics. In addition, relationships between breastfeeding clinical practices, knowledge, and attitudes were investigated. Design: This was a prospective, exploratory web-based survey study. A survey invitation letter and link, which requested the participation of the neonatal RD(s), was emailed to NICU medical directors identified in the American Academy of Pediatrics (AAP) 2009 US NICU Directory. Subjects: The target population was registered dietitians practicing in a NICU in the US. The AAP 2009 NICU Directory provided valid email addresses for 894 NICU medical directors for survey distribution to the neonatal RD. Missing email addresses were obtained by telephone, resulting in 47 email addresses to contact the neonatal RD directly. Successful distribution of 845 emails resulted in 143 usable surveys. Statistical analyses performed: SPSS 17.0 was used for data analyses. Descriptive statistics and the Pearson product moment correlation test were used to describe and identify relationships between breastfeeding clinical practices, breastfeeding knowledge, and breastfeeding attitudes. Cronbach's alpha was used to determine internal consistency of breastfeeding clinical practices to justify the composite score. A priori alpha was set at p ≤ 0.05. Results: The majority of survey participants were female (n=136, 95.8%) and white (n=125, 88.0%), and 50.0% (n=71) held an advanced degree. One-third (n=47, 33.1%) of respondents had breastfeeding education in the last two years, averaging 17.5 hours. Breastfeeding clinical practices were measured on a frequency scale. The mean composite score for breastfeeding clinical practices was 20.0 (n=143) out of 50.0 possible points. 
More than three-quarters of the participants had never taught a breastfeeding mother how to use a breast pump (n=112, 78.3%) and never taught a new mother breastfeeding techniques (n=108, 75.5%). The most frequently completed clinical practice was encouraging a new mother to provide breastmilk for her baby (n=51, 35.5%). The mean breastfeeding knowledge score was 13.7 (n=143) out of 20.0 possible points, a percentage score of 69%. Breastfeeding attitude was measured on an agreement scale. Respondents scored a mean of 17.2 (n=143) out of 20.0; the higher the score, the more positive the attitude toward breastfeeding. When examining relationships, it was found that respondents who practiced breastfeeding support more frequently had higher knowledge scores (r=0.355, p<0.001) and higher attitude scores (r=0.338, p<0.001). A weak relationship was found between breastfeeding knowledge and breastfeeding attitude (r=0.260, p=0.002). Conclusions: In this study, neonatal RDs had positive attitudes toward breastfeeding but were deficient in their breastfeeding support and knowledge. The results of this study provide a rationale for more lactation education and training programs for neonatal registered dietitians.


The International Dietetics and Nutrition Terminology (IDNT): Factors Associated with the Degree of Use in Dietetics Practice

Aikaterina Galeos, 2010

Background: The Nutrition Care Process (NCP) was introduced to the dietetic profession in 2003. The International Dietetics and Nutrition Terminology (IDNT), first published in 2004, is used to describe conditions at each step of the NCP. While the ADA continues to encourage the use of the IDNT in practice settings, many RDs and DTRs have yet to use this terminology in practice. Objective: The purpose of this study was to determine the degree of IDNT use among RD members of the Clinical Nutrition Managers Dietetic Practice Group (CNM DPG), and to identify personal and organizational factors associated with the degree of IDNT use in dietetic practice settings. Design/Subjects/Setting: A prospective internet-based survey was e-mailed to RD members of the CNM DPG (N=454). Survey participants were e-mailed a link to the survey through SurveyMonkey (www.surveymonkey.com). Statistical analysis performed: SPSS v. 17.0 was used for analysis. Internal consistency of the variables representing each theoretical factor was determined using Cronbach's alpha. Spearman's rho correlation was used to determine relationships between each factor and each step of the NCP. Descriptive statistics and frequency distributions were used to summarize the continuous and categorical variables. Results: The results of this study demonstrated that two personal factors (Relative advantage and Compatibility) and two organizational factors (Organizational slack and Innovation champion) were positively related to the degree of IDNT use in dietetic practice settings. Of the 1,718 total e-mail invitations that were sent and accepted, 454 (26.4%) responded and 375 (21.8%) usable surveys were included in final analyses. The four factors identified as positively related to the degree of IDNT use displayed high Cronbach's alphas: 0.813 (Relative advantage), 0.829 (Compatibility), 0.767 (Organizational slack) and 0.700 (Innovation champion). 
Statistically significant (p<0.001), moderate positive correlations were found between Relative advantage and the use of each step of the NCP. There were statistically significant weak to moderate correlations (p<0.001) between the Compatibility factor and each step of the NCP. There were statistically significant moderate to strong correlations between the Organizational slack factor and the use of the NCP at each step (p<0.001). There were statistically significant moderate to strong correlations between the Innovation champion factor and each step of the NCP (p<0.001). Conclusions: This was the first study to identify four theoretical factors (Relative advantage, Compatibility, Innovation champion and Organizational slack), all of which have a strong influence on IDNT use. Future research with a larger population of RDs is needed to replicate the findings of this study. Recognition of these factors and well-thought-out, planned implementation strategies may facilitate and promote the use of the IDNT in dietetic practice settings.
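As an illustrative aside (not part of the thesis), the Cronbach's alpha values reported above measure how consistently a factor's survey items move together. The item responses below are hypothetical:

```python
# Illustrative sketch of Cronbach's alpha, the internal-consistency measure
# reported above for each factor scale. Item responses are hypothetical.

def variance(values):  # population variance
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def cronbach_alpha(items):
    """items: one list of scores per item, aligned by respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    sum_item_var = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - sum_item_var / variance(totals))

# Two perfectly consistent items answered by three respondents:
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0
```

Alphas near the 0.7-0.8 range reported above are conventionally taken to justify summing the items into a composite factor score.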


Comparison Between the 2000 CDC and 2006/2007 WHO Growth Charts for the Prevalence of Overweight and Obesity in Children Ages 2-12 Years: A Descriptive, Secondary Analysis

Tina Vaz, 2010

Objective: The purpose of this study was to examine the differences in weight status classification of children ages two to 12 years using the 2000 CDC and the 2006/2007 WHO growth charts. Design: A descriptive secondary analysis of demographic and anthropometric characteristics from data collected in 2008 and 2009. Subjects: The sample consisted of data collected in 2008 on 240 children ages two to 12 years from an urban hospital-based pediatric clinic in New Jersey and data collected in 2009 on 166 children ages two to 11 years from a suburban pediatric clinic in Wisconsin. Methods and Statistical Analysis: Demographic and anthropometric data were obtained from the 2008 and 2009 databases. BMI z-score was calculated and plotted based on the 2000 CDC and 2006/2007 WHO growth charts using Epi Info and WHO AnthroPlus software, respectively. Frequency distributions (n, %) were used to report the children's demographic characteristics and the prevalence of weight categories based on both growth charts. Paired Student's t tests and Wilcoxon nonparametric tests were conducted to test for significant differences in BMI z-scores between the two growth charts. The Sign test was used to compare the proportions of subjects for all weight status categories between the 2000 CDC and 2006/2007 WHO growth charts. The McNemar test was used to compare the proportion of subjects classified as healthy weight, overweight and obese between the two growth charts. The a priori alpha level was set at 0.05. Results: Most of the children were African American (n=215, 61.8%) and 87.8% were non-Hispanic (n=330). The mean BMI z-scores for all subjects were statistically significantly lower (t= -18.04, p<0.001) when based on the 2000 CDC growth chart compared to the 2006/2007 WHO growth chart. 
There were significantly fewer children classified as healthy weight (WHO n=234, 57.9% versus CDC n=250, 61.9%) (p=0.024) but significantly more children classified as overweight (WHO n=97, 24.0% versus CDC n=61, 15.1%) (p<0.001) when based on the 2006/2007 WHO growth charts as compared to the 2000 CDC growth charts. No significant difference in the prevalence of obesity was found between the two growth charts. The 2006/2007 WHO growth charts classified more children in a higher weight category compared to the 2000 CDC growth chart, except for obesity. Applications/Conclusions: The high prevalence rates for overweight and obesity in the US and in this sample underscore the need for early identification of overweight and obesity. The study findings provide evidence for the use of the 2006/2007 WHO growth charts as tools for early identification of pediatric overweight. Clinicians should be aware that the magnitude of differences in prevalence for overweight and obesity will vary depending on which growth chart is used, choice of growth indicator, age and population sampled.
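As an illustrative aside (not part of the thesis), the McNemar test used above compares paired classifications by looking only at the discordant pairs, children classified overweight under one chart but not the other. The counts below are hypothetical:

```python
# Illustrative sketch of the McNemar test used above to compare paired
# weight-status classifications (CDC vs. WHO charts for the same children).
# Counts are hypothetical; this is the uncorrected version of the statistic.

def mcnemar_statistic(b, c):
    """Chi-square from the discordant cells: b = WHO-only, c = CDC-only."""
    return (b - c) ** 2 / (b + c)

# Hypothetical: 10 children flagged overweight only by WHO, 2 only by CDC.
print(round(mcnemar_statistic(10, 2), 3))  # 5.333
```

The statistic is compared against a chi-square distribution with one degree of freedom; concordant pairs (classified the same way by both charts) do not enter the calculation.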


Identifying Components of Advanced-Level Practice in Clinical Nutrition Practice: A Delphi Study

Rebecca Brody, 2010

The dietetics profession lacks a comprehensive definition of advanced-level practice (ALP). Using a three-round Delphi study, expert consensus was sought on four dimensions of ALP that define Advanced Practice Registered Dietitians (APRDs) in clinical nutrition practice. Purposive sampling identified 117 APRDs who met ALP criteria. In Round 1, experts rated the essentiality of statements regarding professional knowledge, abilities, and skills; approaches to practice; and roles and relationships on a 7-point ordinal scale, and generated open-ended practice activity statements reflecting ALP. Median ratings of 1.0-3.0 were defined as essential, 4.0 as neutral and 5.0-7.0 as non-essential. In Rounds 2 and 3, experts rated statements not reaching consensus by evaluating their previous responses, the group median rating and comments. Consensus was reached when the interquartile range of the responses to a statement was < 2.0. Eighty-five experts enrolled (72.6%); 76 (89.4%) completed all rounds. In total, 233 statements were rated, with 100% achieving consensus; 211 (90.6%) were essential to APRD clinical practice. Essential knowledge, abilities and skills components included having a master's degree, completing advanced practice residency programs, research coursework and advanced continuing education. Having at least 8 years of experience; clinical nutrition knowledge/expertise, including expertise in nutrition science; specialization; participation in research activities; and skills in technology and communication were essential practice experience components. Scholarly work and informal recognition were essential. Consensus was achieved on roles relating to patient care and leadership and on broad, diverse, and geographically dispersed relationships. Experts identified essential approaches to practice that focus on patient care and professional practice. 
Critical thinking, intuition and using a systematic approach were highly essential, as were values encompassing professional growth and service to patients. Essential APRD practice activities were identified within nutrition assessment, diagnosis, intervention, and monitoring/evaluation. APRDs provide nutrition care to complex patients by applying advanced knowledge/expertise and advanced interviewing, education, and counseling strategies. Care is patient-centered and approached in a comprehensive yet discriminating manner. Communication with patients and the healthcare team is a priority. A model of ALP in clinical nutrition was proposed depicting the requisite attributes and activities within four dimensions of professional practice.
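As an illustrative aside (not part of the thesis), the Delphi decision rules described above are simple enough to state as code: consensus when the interquartile range of the ratings is below 2.0, and essentiality bands read off the median. The ratings below are hypothetical, and medians falling between the published bands are treated as non-essential in this sketch:

```python
# Illustrative sketch of the Delphi decision rules described above:
# consensus when the interquartile range (IQR) of ratings is < 2.0;
# median 1.0-3.0 = essential, 4.0 = neutral, 5.0-7.0 = non-essential.
# Ratings are hypothetical; medians between the published bands are
# treated as non-essential here.

from statistics import median, quantiles

def classify_statement(ratings):
    q1, _, q3 = quantiles(ratings, n=4, method='inclusive')
    has_consensus = (q3 - q1) < 2.0
    med = median(ratings)
    if med <= 3.0:
        essentiality = 'essential'
    elif med == 4.0:
        essentiality = 'neutral'
    else:
        essentiality = 'non-essential'
    return has_consensus, essentiality

# Seven hypothetical expert ratings on the 7-point scale:
print(classify_statement([1, 1, 2, 2, 2, 3, 3]))  # (True, 'essential')
```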


Knowledge, Attitudes and Professional Practices of Nurses within the Veterans Administration (VA) towards Overweight and Obese Veterans

Valerie Adegunleye, 2010

Authors: Valerie K. Adegunleye, Diane Radler, J. Scott Parrott. Title: Knowledge, attitudes and professional practices of nurses within the Veterans Administration (VA) towards overweight and obese Veterans. Learning Outcome: To describe nursing staff's knowledge, attitudes and professional practices towards overweight and obese Veterans. Objective: To identify relationships between demographic characteristics, knowledge, attitudes and professional practices of VA nurses towards overweight and obese Veterans. Design & Methods: An internet-based survey was sent to nurses at two VA Medical Centers. The survey included demographic questions, 5-point Likert scales and 7-point differential scales to evaluate attitudes towards and professional practices related to obesity. Knowledge was assessed through true-or-false questions. Subjects: A convenience sample of nursing staff working in two VA facilities. Statistical analyses: Descriptive statistics were used for demographics, knowledge, attitudes and professional practices. Spearman's correlation and chi-square analyses were used to explore the relationships between variables. Results: E-mails were sent to 772 potential participants; 157 (20.34%) returned a usable survey. The majority were female (89.33%) and Caucasian (60.66%). More than half (58.60%) incorrectly answered "BMI more accurately reflects risks for cardiovascular disease than waist circumference," nearly half (47.77%) incorrectly answered "MOVE! [VA weight management program] is appropriate for all Veterans," and less than half (49.34%) indicated frequently/always recommending MOVE!. Respondents who frequently or always recommended MOVE! felt that restaurant dining (r=0.260, p=0.001), food accessibility (r=0.212, p=0.009) and repeated dieting (r=0.237, p=0.004) were important causes of obesity and that physical activity/exercise was an effective treatment (r=0.286, p<0.001). 
There were no significant relationships between professional practices and BMI, educational level or knowledge. Application/conclusions: VA nurses may benefit from education regarding BMI, waist circumference, the components of MOVE!, and appropriate referrals of Veterans to the MOVE! weight management program.


Effect of Medical Nutrition Therapy (MNT) on the Clinical Outcomes of Dyslipidemia

Jim Brewer, 2010

Title: Effect of Medical Nutrition Therapy (MNT) on the Clinical Outcomes of Dyslipidemia. Authors: Brewer, W. James; Byham-Gray, Laura; Denmark, Robert. Objective: To determine whether U.S. Veterans who received Medical Nutrition Therapy (MNT) from a Registered Dietitian (RD) over a six-month period have better clinical outcomes than those who received a one-time Traditional Care (TC) nutrition program. Design: A retrospective chart review of two separate nutrition interventions. One hundred seventy-three ambulatory care patients older than 19 years with LDL-C levels above 100 mg/dl were included. Subjects: From 519 reviewed charts, 173 eligible patient charts were selected within an ambulatory care outpatient setting. Intervention: The MNT intervention group received nutritional counseling for dyslipidemia from an RD two to eight times over a six-month period in addition to usual care provided by the physician. The control group received traditional care from their physician and a one-time, 60-minute nutrition program. Main Outcome Measures: The outcome measures included the plasma lipid profile (total cholesterol, LDL-C, HDL-C, VLDL-C, TG), systolic and diastolic blood pressure and Body Mass Index (BMI). Statistical Analysis: Due to the lack of random assignment, descriptive statistics were used to analyze equivalence at baseline. Independent t-test, Fisher's Exact test and Pearson chi-square test of independence were used to assess equivalence between the TC group and the MNT group. Correlations were used to identify potential covariates. Repeated measures analysis of variance (ANOVA) was used to compare the two groups over three points in time (baseline, 3 months and 6 months). Results: Compared with the one-visit, 60-minute education provided to the TC group, the MNT group voluntarily received a mean of 3.22 visits, equating to a mean of 95.59 minutes of nutrition counseling per patient. 
The MNT group showed significantly greater improvements than the TC group in total cholesterol at three months (decreased 7.43% vs. 4.26%, p<0.0005) and at six months (decreased 12.05% vs. 4.77%, p<0.0005) and in low density lipoprotein cholesterol (LDL-C) at three months (decreased 11.80% vs. 7.74%, p<0.0005) and six months (decreased 20.01% vs. 8.23%, p<0.0005). Weight status and blood pressure were not significantly different between groups at baseline and did not change significantly at three and six months. Compared to the TC group, the MNT group at baseline showed lower use of cholesterol-lowering medications, higher BMI and younger age. Further analysis revealed these differences did not have a significant effect on clinical outcomes. Applications/Conclusions: MNT significantly improved clinical outcomes over TC among Veterans; these findings should be replicated in a larger sample to determine whether this protocol should be extended to all VA ambulatory care outpatient settings.


Impact of a Peer-led, Social Cognitive Theory (SCT)-based Diet and Physical Activity Healthy Body Program on Anthropometrics and SCT Construct Scores of First-Year College Students

Lynn Monahan Couch, 2010

Background: Overweight and obesity in the United States have reached epidemic proportions in adolescents and young adults. The transition from high school to college is a critical period for weight management and health promotion interventions, as it is characterized as a time to establish independence through developing beliefs, support systems and lifestyle behaviors. Objective: Among first-year college students, to determine the impact of a 12-week, peer-led Healthy Body Program (HBP) on anthropometrics and Social Cognitive Theory (SCT)-based diet and physical activity-related construct scores compared to a no-intervention control (NIC) group. The relationships between changes in anthropometric measures and changes in SCT scores were also examined. Design: A prospective, randomized, controlled design. Participants/setting: Seventy-three first-semester college students volunteered for the study at West Chester University. Of those, 57 completed the study and were included in the analyses (n = 31 in HBP, n = 26 in NIC). Intervention: Participants were randomized into either the HBP or NIC group. The HBP group received weekly health bulletins and counseling sessions with peer educators, with goal setting and self-monitoring. Main outcome measures: Diet and physical activity SCT construct scores (self-regulation, social support, self-efficacy, outcome expectations) at baseline and 12 weeks post-intervention, and anthropometric measurements (e.g., weight, percent body fat, percent lean body mass and waist circumference). Relationships between changes in anthropometric measures and changes in SCT construct scores in the HBP group were also examined. Statistical analyses performed: Repeated measures analysis of variance (ANOVA) was used to examine differences between the groups on outcome measures. Pearson's correlations were used to examine relationships between changes in anthropometrics and changes in SCT construct scores. 
Results: HBP participants exhibited an increase in self-regulation to decrease calories and fat (F=4.708, p=0.034) and in dispelling negative outcome expectations for being physically active (F=5.123, p=0.028). Significant increases were seen, independent of group, in other self-regulatory behaviors [decreasing calories and fat (F=13.364, p=0.001) and planning and tracking diet (F=11.336, p=0.001)] and in self-efficacy behaviors [planning and tracking diet (F=5.864, p=0.019) and increasing fiber, fruit and vegetable intake (F=4.493, p=0.039)]. Independent of group assignment, weight (2.46 lbs., F=17.945, p<0.001), percent body fat (1.03%, F=10.605, p=0.002) and waist circumference (1.05 inches, F=29.413, p<0.001) increased significantly, with a decrease in percent lean body mass (1.05%, F=10.605, p=0.002). There was a significant inverse relationship (r= -.357, p=0.049) between changes in percent body fat and changes in negative outcome expectations for healthy eating. Conclusions: The results suggest that participation in a 12-week, SCT-based, peer-led intervention increases diet self-regulation and dispels negative outcome expectations about being physically active in first-semester college students compared to a no-intervention control group. Self-regulation may be an important component of diet-based interventions for college students. Twelve weeks may be insufficient time for program-related anthropometric changes. Further studies are needed to explore the impact of the intervention over the long term.


The Utility of the Pediatric Assessment Scale for Severe Feeding Problems

Sayuri Asano, 2010

Pediatric feeding disorders can develop in both healthy and chronically ill children, and may affect normal developmental milestones. Interdisciplinary feeding team (IFT) management is considered an optimal treatment to achieve feeding progress in these children. The Pediatric Assessment Scale for Severe Feeding Problems (PASSFP) is a parental questionnaire and a tool to assess the severity of feeding-related issues. This study examined the utility of the PASSFP to measure feeding progress among 10 children with feeding difficulties. Feeding therapy interventions were carried out over two consecutive outpatient clinic visits within six months at a clinic that offered interdisciplinary feeding therapy. The PASSFP score and children's growth parameters (z-score for weight-for-length, weight-for-height, or BMI) were collected at both visits to explore the relationship between changes in PASSFP scores and growth over the intervention time period. A demographic questionnaire was administered at the second visit to examine whether any factors might affect feeding progress. Among the 10 subjects, nine completed the study within three months, eight increased their PASSFP scores, and all 10 maintained adequate z-scores. Case examination findings suggest that the PASSFP, compared to standard clinical observations of progress, appears to be useful for measuring feeding progress within a shorter time frame than the tool has typically been used. Increases in the PASSFP score appear to parallel appropriate physical growth. Despite a small sample size, this study provides preliminary support for the use of the PASSFP tool over shorter intervals of feeding progress due to interdisciplinary feeding team therapy. A tool that can measure changes in feeding progress over shorter intervention intervals is useful because IFT typically occurs within a three-month time period.
The PASSFP may help clinicians measure feeding progress to evaluate interventions, and may also provide outcome evaluation for medical insurance providers to recognize the effect of and need for interdisciplinary feeding treatment, subsequently improving the quality of life of children with feeding disorders and their families.


The Predictability of the Mediterranean Diet Score on Cardiovascular Disease Risk

Cheryl Thomas-Perters, 2010

Purpose: The study purpose was to determine the ability of the Mediterranean diet score (MED55 diet score) to predict cardiovascular disease (CVD) risk as calculated by D'Agostino's Framingham Risk Score (FRS) in patients at the Nutrition and Lifestyle Medical Clinic (NLMC), a primary care practice in Northern California. Design: This retrospective study examined the ability of the MED55 diet score to predict CVD risk using the FRS, using cross-sectional data collected from electronic medical records (N=223) between May 15, 2006 and March 15, 2010, of adults aged 18 to 80 years who met inclusion criteria. Dietary intake and health habits (physical activity, tobacco use) were assessed using self-reported data on the NLMC health habit intake questionnaire at the patient's last follow-up visit. Confounders known to affect CVD were controlled for (race group, body mass index, family history, lipid-lowering or anti-hypertensive prescription (Rx), >1 CVD risk factor, number of visits, duration of care, and time receiving care). Systolic blood pressure and cholesterol concentrations were obtained. Results: The mean age was 61.44±11.38 years, ranging from 30-80 years. The mean MED55 diet score was 40.87±5.35 (0-55 point diet score), with higher scores considered more adherent to the Mediterranean diet pattern. The sample mean FRS was 11.40±5.23, with males having a higher mean FRS than females (13.72±4.42 and 10.06±5.20, respectively), which corresponds to a 10-year CVD risk of 18.55% and 8.70% for males and females, respectively. Using hierarchical regression, CVD risk as measured by the FRS was predicted by race group (p=0.009), lipid-lowering prescription medication (p<0.001), >1 CVD risk factor (p<0.001) and time receiving care (r2=0.016, p=0.036). No association between MED55 and FRS (r2=0.012, p=0.068) was observed. Conclusion: The MED55 diet score did not predict CVD risk as calculated by the FRS in the patient sample at NLMC.
Further research is needed to explore, modify and validate the MED55 in a U.S. population, and to determine whether the MED55 diet score can predict FRS in a primary care setting. Keywords: cardiovascular disease; coronary heart disease; Framingham risk score; Mediterranean diet pattern; Mediterranean diet score; cardiovascular disease risk factors; systolic blood pressure; total cholesterol; high-density lipoprotein; cross-sectional study


U.S. Obstetrics-Gynecology Resident Demographic and Professional Characteristics, Knowledge and Professional Practice Regarding Vitamin and Mineral Supplementation for Pregnancy Following Roux-en-Y Gastric Bypass

Lyssa Lamport, 2010

Background: Obstetrics-gynecology residents need to be aware of the importance of preventing potential nutrient deficiencies via vitamin and mineral supplementation for Roux-en-Y gastric bypass patients who become pregnant. Objective: To determine the relationship between U.S. obstetrics-gynecology resident demographic and professional characteristics, knowledge and professional practice regarding vitamin and mineral supplementation for pregnancy following Roux-en-Y gastric bypass. Design: This was a prospective internet-based survey sent to 247 obstetrics-gynecology program directors via email invitation, which included a link to the survey on the SurveyMonkey site with the request to forward it to their obstetrics-gynecology residents for completion. The survey included questions regarding demographic and professional characteristics and true/false, yes/no and multiple choice questions related to knowledge of and professional practices regarding vitamin and mineral supplementation for women who have undergone Roux-en-Y, vitamin and mineral supplementation for pregnancy, and vitamin and mineral supplementation for women who have undergone Roux-en-Y and become pregnant. Subjects: Of the 4943 U.S. obstetrics-gynecology residents sent the survey, the response rate was 5% (n=230). One hundred ninety-six (85.2%) of the 230 individuals who entered the survey completed both the demographic and professional characteristic questions and the knowledge quiz. For the purposes of data analysis, only those 196 responses were used. Statistical analyses performed: Descriptive and inferential statistics were computed using SPSS 17 to examine the relationship between U.S. obstetrics-gynecology resident demographic and professional characteristics, knowledge and professional practice regarding vitamin and mineral supplementation for pregnancy following Roux-en-Y gastric bypass. A priori alpha was set at p ≤ 0.05.
Results: The mean score for the knowledge quiz was 8.5 out of a possible 12. There was a direct positive relationship between knowledge of the recommended vitamin and mineral supplements for pregnant women who have previously undergone Roux-en-Y gastric bypass and prescription practices for a prenatal vitamin with DHA (p < 0.001, Fisher's Exact Test), vitamin D (p = 0.009, Fisher's Exact Test) and iron (p = 0.018, Fisher's Exact Test). No relationship was found between gender, age or residency year and knowledge of vitamin and mineral supplementation needs, or between knowledge of the need for calcium and vitamin B12 and prescription practices. Conclusions: Obstetrics-gynecology residents are knowledgeable about the vitamin and mineral supplements needed for pregnancy and for gastric bypass independently, but may not be knowledgeable about the vitamin and mineral supplements needed for pregnant women who have previously undergone Roux-en-Y gastric bypass. Future research is needed to explore how obstetrics-gynecology residents' vitamin and mineral knowledge and prescription practices can improve after receiving education and training in the treatment of pregnant women who have previously undergone Roux-en-Y gastric bypass.


Use of the Subjective Global Assessment to Predict Health Related Quality of Life in Chronic Kidney Disease Stage 5 Patients on Maintenance Hemodialysis

Linda Vero, 2010

Objective: The study purpose was to determine whether a subjective global assessment (SGA) score was predictive of health related quality of life (HRQoL) in stage 5 chronic kidney disease (CKD) patients on maintenance hemodialysis (MHD). Design and Setting: This was a cross-sectional secondary data analysis of maintenance hemodialysis patients receiving therapy three times per week at dialysis centers located in the United States, Canada or New Zealand. Nutritional status was assessed using the 7-point SGA. HRQoL was determined using the Medical Outcomes Study 36-item Short Form (SF-36). Results: The study sample consisted of 94 participants, males (n=47) and females (n=47), with a mean age of 64.93±12.96 years. The mean SGA score was 5.78±1.1. Participants had a mean HRQoL physical health score of 36.53±9.30 at six months, indicating a low or worse physical health state. The HRQoL mental health summary score was within the normal range (50.50±11.12). After controlling for confounders in the hierarchical regression models, the SGA score predicted HRQoL physical health (r2=.124), which was statistically significant (p=.012). No association was found between the SGA score and HRQoL mental health (p=.925). Conclusions: The SGA is a significant predictor of HRQoL physical health. Given that decreased HRQoL in persons on maintenance hemodialysis is associated with mortality, complications, and reduced compliance with treatment, the SGA can be a cost-effective screening tool to help identify persons on dialysis with lower HRQoL related to physical health. Key Words: subjective global assessment, quality of life, chronic kidney disease, KDOQI clinical nutrition practice guidelines, malnutrition


The Eating Experience in Long Term Survivors of Head and Neck Cancer: A Mixed Methods Study

Heidi Ganzer, 2014

Purpose: This study explored the eating experience in long-term survivors of head and neck cancer (HNC) ≥ 3 years post concurrent chemoradiation (CCR). Quality of life (QOL) and the meanings and perceptions survivors had related to the eating experience were explored. Methods: Purposive sampling was utilized; 10 long-term survivors of HNC participated in the study. A mixed-methods approach was used; exploratory qualitative research using content analysis and summary statistics were used to describe demographics, clinical characteristics and Vanderbilt Head and Neck Symptom Survey version 2.0 (VHNSS 2.0) scores. Results: Four categories (psychological, social impact, functional status, and the current eating experience) containing 15 subthemes and one overarching theme (adaptation) emerged. Current health status, QOL and QOL related to eating were viewed favorably even though treatment and its late effects impacted participants' daily lives. Adaptation and maladaptation in regard to food choice, and downplaying of symptoms, were recognized. Interviews as well as VHNSS 2.0 scores indicated that xerostomia, mucosal sensitivity, swallowing difficulty, length of time required to eat and dysgeusia remained problematic. Conclusion: Psychological, functional and social losses associated with eating were identified. Participants modified or avoided foods that were challenging, yet reported enjoyment with eating. Challenges with eating were downplayed. Due to the potential negative nutritional and social implications of avoiding specific foods/food groups, standard of care in long-term survivors of HNC should include assessment of the eating experience and functional challenges. Nutrition professionals can help patients optimize dietary intake and the eating experience.


The Impact of the Implementation of a Nutrition Support Algorithm on Nutrition Care Outcomes in the Intensive Care Unit

Caroline Kiss, 2010

Objective: The aim of this study was to assess the impact of implementing a nutrition support algorithm, in an ICU without a designated dietitian, on nutrition care practices and outcomes. Methods: The retrospective study included data collection on two cohorts of critically ill patients, prior to (N = 112) and following implementation of a nutrition support algorithm. Medical records of patients admitted to the medical-surgical intensive care unit (ICU) for more than 72 hours and receiving nutrition support therapy were reviewed. A nutrition support algorithm was developed and implemented as an operational version of the guidelines published by the Society of Critical Care Medicine (SCCM) and the American Society for Parenteral and Enteral Nutrition (ASPEN). To assess the impact of this implementation strategy, nutrition care practices and outcomes were assessed in the pre- and post-implementation groups. Results: There were significant differences between the pre- and post-implementation groups in the mean delivery of total energy: 909±444 kcal/day versus 1097±420 kcal/day (p = 0.023). When energy was normalized to kilograms of body weight per ICU day, daily energy delivery increased by 23%, from 13.3±6.2 kcal to 16.5±7 kcal, from the pre- to the post-implementation group (p = 0.012). The adequacy of the energy target at day 4 was 69.8±36.3% in the pre-implementation group and 89.1±45.8% in the post-implementation group (p = 0.012). For patients staying at least seven days in the ICU, the cumulative energy deficit decreased from a moderate deficit in the pre-implementation group (n = 26, -5664±3613 kcal) to no deficit (n = 25, -2972±2420 kcal) in the post-implementation group (p = 0.011). There were also significant differences between the pre- and post-implementation groups in the mean delivery of protein per ICU day (35±17.9 g versus 59.1±27.3 g; p < 0.001). When normalized to kilograms of body weight, daily protein delivery increased by 80%, from 0.5±0.3 g/kg to 0.9±0.5 g/kg (p < 0.001).
In patients staying at least four days in the ICU, the amount of delivered energy and protein was related to their target requirements. In the pre-implementation group, patients received 48.3±21.5% of their protein target, whereas in the post-implementation group patients received 93.0±54.7% (p < 0.001). There were no significant changes in the route of nutrition support or in the time of enteral nutrition initiation after admission. Conclusion: Implementation of a nutrition support algorithm resulted in improved energy and protein delivery. To achieve the further nutrition care outcomes recommended by practice guidelines, the regular inclusion of a dietitian or a nutrition support team might be necessary.


Identifying Factors Related to Weight Loss of Nursing Home Individuals Using the Minimum Data Set (MDS) 2.0

Dana Ailor, 2011

Objective: To identify factors associated with unintentional weight loss among nursing home residents using the Minimum Data Set (MDS) 2.0. Design/Methodology/Subjects: A retrospective cohort design was used for residents at three nursing home facilities in Florida (N=140). MDS data such as feeding ability, oral problems, height, weight, and medications were collected at three time periods (admission, three months post admission, and six months post admission), along with diagnoses upon admission only. SPSS version 17.0 was used for analyses. An a priori power analysis indicated that a sample size of n=191 would have been sufficient for a chi-square analysis at alpha=0.05. Frequency distributions and the chi-square test of independence were used for data analyses. Results: One hundred and forty residents 65 years or older met inclusion criteria; 84% were female, with a mean age of 83.4 years and a mean admission BMI of 25.7 kg/m2. Subjects who had a neurological diagnosis upon admission (n=65) were more likely (χ²=7.89, p=0.048) to be of normal weight (n=24, 36.9%), overweight (n=17, 26.2%), or obese (n=15, 23.1%) than to be underweight (n=9, 13.8%). Subjects who were prescribed diuretics (n=43) were more likely (χ²=12.91, p=0.005) to be obese upon admission (n=18, 41.9%) than to fall in any other BMI category. At three months, those who lost weight (n=26) were more likely (χ²=7.00, p=0.008) to be prescribed an antidepressant (n=20, 76.9%) than not (n=6, 23.1%). Those who had gained weight (n=31) were less likely (χ²=7.27, p=0.007) to have an antidepressant prescription (n=10, 32.3%) than not (n=21, 67.7%). Conclusions: This study demonstrated that MDS items such as feeding ability, oral problems, swallowing problems, and medications prescribed may be useful in identifying residents at risk for weight loss. The results underscore the importance of early interventions for residents at risk for unintentional weight loss.
Further research with a larger sample size is needed.


Does Dietitian Intervention Impact Progression to Type 2 Diabetes in Patients with Pre-Diabetes?

Brenda Braslow, 2011

Background: Registered dietitians (RDs) can have a role in preventing progression to type 2 diabetes (T2DM). Objective: To investigate the relationship between RD intervention and progression to T2DM in adults with pre-diabetes (pre-DM). Methods: Medical records of adults with pre-DM from 14 clinics were reviewed retrospectively. RD intervention included an individual session or a Preventing Diabetes Class. The number of RD interventions, the number of months from pre-DM to T2DM, and the relationship between RD intervention and progression to T2DM were analyzed; alpha ≤ 0.05. Results: The mean age of patients in this sample (n=302) was 63.4 years (SD=11.5); mean baseline BMI and fasting glucose were 31.1 (SD=6.0) and 112.7 mg/dl (6.3 mmol/l) (SD=6.0), respectively. Twenty-nine percent (n=86) of the patients had an RD intervention. Of patients receiving RD intervention, 30% (n=24) progressed to T2DM. The mean number of RD interventions was 1.3. The majority of all patients (n=215, 71.2%) did not progress to T2DM in five years. BMI and fasting glucose were significant predictors of progression to T2DM (logistic regression), p=0.04 and p≤0.01, respectively. RD intervention was not a significant predictor of progression to T2DM in this small sample (logistic regression), p=0.78. While not statistically significant (t(85)=-1.50, p=0.14), among patients who progressed to T2DM, those without RD intervention (n=64, 73.6%) progressed 6.1 months sooner than those with RD intervention (n=23, 26.4%). Conclusions: Further research is needed with a larger sample and more frequent RD interventions.


Exploring the Relationship Between Professional Characteristics and Perceived Level of Practice and Practice Activities Among Registered Dietitian Members of the Oncology Dietetic Practice Group

Marissa Ciorciari, 2011

Objective: To explore relationships between demographic and professional characteristics and perceived level of practice (LOP) in oncology nutrition care (ONC), and to identify relationships between respondents' highest degree obtained and how specialty and advanced-level practice activities were performed among RD members of the Oncology Nutrition Dietetic Practice Group (ON DPG). Design: This was a retrospective secondary analysis of a survey completed in 2008. Participants/setting: RD members of the ON DPG who completed an internet-based survey and met the ON DPG definition of oncology nutrition dietitian (n=358). Statistical Analyses: Frequency distributions, χ², one-way ANOVA, and Kruskal-Wallis tests (a priori alpha set at 0.05) were used to analyze demographic characteristics, professional characteristics, and the relationships between perceived LOP in oncology nutrition and professional characteristics. χ² was used to analyze the relationships between highest degree obtained and the most frequently reported SOP and SOPP in ONC specialty and advanced activities. Results: Respondents with graduate degrees were significantly more likely to practice at the advanced level compared to those without graduate degrees (χ²(2)=13.93, p=0.001). No association was found between the timing of completing a master's degree and LOP in ONC. Post-hoc analysis using the Mann-Whitney U test indicated that ALPs had significantly more years with the RD credential compared to GLPs and SLPs (p<0.001). RDs with graduate degrees were significantly more likely to have supervisory duties for the select specialty SLP and ALP activities examined. Conclusions: This study provided an understanding of how professional characteristics influence activities in oncology nutrition in the face of an expanding LOP in oncology nutrition. Respondents with graduate degrees were more likely to have supervisory SLP and ALP duties than those without graduate degrees.
Further research is needed to examine how other professional characteristics impact functions performed by oncology RDs, and to identify appropriate resources and education programs for professional growth and development.


The Effect of Medical Nutrition Therapy in Patients with Pre-diabetes Participating in a Randomized Controlled Clinical Research Trial

Anna Parker, 2011

Background: Prior studies have provided evidence that lifestyle change prevents or delays the occurrence of type 2 diabetes mellitus (T2DM). The challenge is to translate research evidence for T2DM prevention into the healthcare setting. Objective: The study investigated the effect of medical nutrition therapy (MNT), as compared to usual care, on fasting plasma glucose values, HbA1c, lipid levels, and Diabetes Risk Score from baseline to the end of a 12-week intervention in overweight or obese adults with pre-diabetes, given demographic characteristics, clinical characteristics and co-morbidities. Design: Prospective, randomized, parallel-group study among 58 subjects with impaired fasting plasma glucose or an HbA1c of 5.7-6.4%, recruited between April and September 2010 and participating in a 12-week intervention. Main outcome measures: Fasting plasma glucose, Diabetes Risk Score, HbA1c, and lipid levels. Statistical Analyses: A factorial repeated measures ANOVA was used to make comparisons between the two groups (the MNT and usual care groups) and two measures of time (baseline and twelve weeks post intervention). Data analysis was performed using the Statistical Package for the Social Sciences (SPSS for Windows, Rel: 17.0. Chicago: SPSS Inc.). Using a factorial repeated measures ANOVA, an effect size of 0.25, and testing at an alpha of 0.05, this study had 95% power to detect a difference in outcome measurements with a sample size of 54 patients (27 per group). Results: There was a significant interaction for group assignment and HbA1c (F=7.18, p=0.01), with the MNT group experiencing significantly lower HbA1c levels than the usual care group. By the end of the 12-week intervention, the MNT group mean HbA1c decreased to 5.79%, while the usual care group mean HbA1c increased to 5.99%. There was a significant interaction for group assignment and Diabetes Risk Score (F=13.43, p=0.001).
The Diabetes Risk Score for the MNT group decreased from 17.24 ± 3.96 to 14.69 ± 3.88, as compared to the usual care group score, which went from 16.92 ± 4.87 to 16.64 ± 4.92. Regardless of group assignment, both groups experienced a reduction in fasting plasma glucose (F=3.96, p=0.03) and total cholesterol (F=4.56, p=0.04). No significant differences were reported for HDL-cholesterol, LDL-cholesterol, and triglycerides. Conclusions: The study demonstrated that individualized MNT was effective in decreasing HbA1c, fasting plasma glucose, total cholesterol, and Diabetes Risk Score in adults with pre-diabetes over a 12-week intervention period. It seems reasonable to conclude that the beneficial medical outcomes from MNT may prevent or delay the onset of T2DM.


The Relationship Between the Mid-Upper Arm Circumference to Occipitofrontal Circumference Ratio and Standard Anthropometric Measurements

Sarah Vermilyea, 2011

Objective: Obtaining accurate weight and length or height measurements of critically ill children can be difficult in the PICU setting. The purpose of this study was to explore the relationship between the mid-upper arm circumference (MAC) to occipitofrontal circumference (OFC) ratio and standard anthropometric measures (weight-for-age, length-for-age, weight-for-length, and BMI-for-age z-scores) among children aged three months corrected age to four years admitted to the pediatric intensive care unit (PICU) in an urban pediatric hospital. Design: A retrospective, secondary analysis of cross-sectional data collected from a single center between March 25 and August 19, 2010. Subjects: A consecutive sample of children admitted to the medical, surgical, or cardiac PICUs who were enrolled in the original study and met the inclusion criteria for this study. Statistical Analysis: Descriptive statistics were used to report demographic characteristics and anthropometric measurements. Pearson's product-moment correlation coefficient was used to test the relationships between the MAC/OFC ratio and weight-for-age, length-for-age, weight-for-length, and BMI-for-age z-scores, and the agreement between the MAC/OFC ratio categories and weight-for-length or BMI-for-age categories was tested using a kappa statistic. Pearson's chi-square was used to test the relationship between the MAC/OFC ratio and the percent of goal rate of weight change; sensitivity and specificity analyses were used to describe misclassifications by the MAC/OFC ratio. Results: Of the 108 patient records, the mean age was 19.30 (SD = 13.02) months; 55.56% were female. Of participants aged three to 36 months (n = 88), 27.27% were underweight and 2.27% were overweight according to weight-for-length classification; of children aged greater than 36 through 48 months (n = 15), 13.33% were underweight, 80.00% were healthy, and 6.67% were obese using BMI-for-age classification.
The mean MAC/OFC ratio was 0.33 (SD = 0.03); 26.85% had protein-calorie malnutrition according to the Kanawati et al. MAC/OFC ratio classification. There was a strong, significant correlation between the MAC/OFC ratio and weight-for-length z-scores (r = 0.720, p < 0.001). The MAC/OFC ratio accurately classified 85.23% of participants in this age group. There was a moderate, significant correlation between the MAC/OFC ratio and weight-for-age z-scores (r = 0.491, p < 0.001). Applications/Conclusions: This study of standard anthropometric measurements and the MAC/OFC ratio provided preliminary data supporting a relationship between weight-for-length z-scores and the MAC/OFC ratio in critically ill children aged three months corrected age to 36 months, as well as between weight-for-age z-scores and the MAC/OFC ratio in critically ill children aged three months corrected age to 48 months. Further research is needed to test this relationship.


Registered Dietitians' Roles and Influencing Factors in Decision-Making Processes for PEG Placement in the Elderly

Maria Szeto, 2011

Objective: To explore RDs' roles in the decision-making processes for PEG placement, describe the ethical climate of their workplace, and examine the relationship between these two constructs. Design/Subjects: This is an exploratory, descriptive study using a web-based survey. RDs providing clinical dietetic services in complex continuing care and long-term care settings in Ontario, Canada were recruited. Statistics: Descriptive statistics were used to report roles, ethical climate and professional characteristics. Pearson's and non-parametric correlations were used to examine the relationships between RDs' patients/families-related roles and ethical climate with respect to physicians, and between RDs' patients/families-related roles and professional characteristics. The p value was set at 0.05. Results: Sixty-seven RDs met inclusion criteria. The majority, 96.9% (n=64), thought RDs had a role in decision-making processes. Almost 50% (n=28, 49.1%) of RDs were always involved in two patients/families-related roles: identifying relevant nutrition issues and discussing feeding options and alternatives. A moderate to strong positive relationship was found between RDs' roles with patients/families and the ethical climate in relation to physicians in their workplace (r=.321, p=0.016). There was a moderate to strong positive relationship between RDs' roles and adequacy of knowledge (r=.465, p<0.001) and adequacy of skills (r=.520, p<0.001). There was a strong positive relationship between RDs' roles and their role satisfaction (r=.554, p<0.001). Conclusion: A positive working relationship with physicians, knowledge, skills and role satisfaction significantly increase RDs' role involvement with patients/families in decision-making processes for PEG placement in the elderly.


An Examination of Grit and Leadership in Current Students of the Graduate Programs in Clinical Nutrition at Rutgers-The State University of New Jersey

Meredith Allen, 2016

Background: Grit is a measure of perseverance and passion for long-term goals. Both grit and leadership are important, desired attributes of the current students in the Rutgers Graduate Programs in Clinical Nutrition (GPCN). Objective: To explore demographics and grit and leadership scores of the current students in the GPCN using the 8-Item Grit Scale and the Leadership Practices Inventory tool. Design: This prospective, cross-sectional pilot study was conducted over an eight-week period using mixed methods (electronic communication and paper survey). Participants: All students in the Rutgers GPCN from the MSCN and DCN programs were invited to participate (N=84). Statistical analyses performed: Descriptive statistics were reported for the Leadership Practices Inventory, the 8-Item Grit Scale, and professional and demographic characteristics. Results: The response rate for GPCN students was 62.4% (N=54). GPCN students (N=54) had a mean age of 36.8 ± 9.7 years and 96.0% were female. Respondents had a grit score of 3.8 ± 0.5 (out of a possible 5). The highest-scoring leadership competency (out of a possible score range of 6-60) was Enable Others to Act, with a mean score of 48.6 ± 5.2. The lowest-scoring leadership competency was Inspire a Shared Vision, with a score of 42.5 ± 9.5. Conclusions: The study had a good response rate compared to similar literature. The mean grit score of GPCN students was 3.8 ± 0.5. The highest-scoring leadership competency for the GPCN students, using the Leadership Practices Inventory tool, was Enable Others to Act. Both tools were feasible to use in this population.


Practices in Weight Management and History of Chronic Diseases of UMDNJ Faculty and Staff

Ancy George, 2011

Background: One of the Healthy People 2020 objectives is to increase the proportion of worksites that offer nutrition or weight management classes or counseling, and to reduce the proportion of adults who are obese. Methods/Design: The objective was to identify the weight status, perceived needs, interest in weight management resources, and chronic disease history of employees at an academic health sciences university in 2010. A prospective survey utilizing paper and electronic modes was sent to a stratified random sample of 2000 employees at UMDNJ. SPSS 17.0 was used for data analyses (alpha: p≤0.05). Results: The overall response rate was 35.7% (n=623). Employee mean Body Mass Index was 27.0 kg/m2 (SD=5.8, range=15.9-59.0 kg/m2); 57.5% (n=345) of respondents were overweight or obese. Staff were significantly more obese than faculty (χ2=16.479, p<0.001). The most common chronic disease was dyslipidemia (n=176, 28.3%). Obese respondents were more likely to have one or more chronic diseases compared to other weight categories (χ2=16.958, p<0.001). More than 39% of respondents (n=247) were interested in having weight management resources available at the university, with healthy dining options and an onsite fitness center the most frequently requested resources. Forty-six percent (n=292) of respondents reported themselves as physically active for ≥30 minutes most days of the week. Obese respondents (n=71, 51.4%) were more likely to think that weight management resources were necessary compared to other weight categories (χ2=32.787, p<0.001). Conclusion/Application: The 57.5% prevalence of overweight/obesity in this university employee sample was less than the prevalence of overweight/obesity in US adults (68.0%). The study provides a baseline needs assessment for worksite weight management resources, with greater interest from obese respondents than any other weight category. 
Future research is needed to explore the potential barriers for university employees to engage in regular physical activity, and the most appropriate worksite interventions to reduce obesity.


Comparability of Response Rates of Paper and Electronic Surveys on Weight Status, Personal Weight Management Practices, and Chronic Diseases History among UMDNJ Faculty and Staff

Bhavna Aneja, 2011

Background: Health surveys in the workplace are an important part of epidemiological research, needs assessment, and health promotion. Surveys can be sent and received through internet/electronic media or traditional paper-based survey methods. Objective: To compare the response rates to a survey delivered via paper and electronically that was utilized to collect information on weight status, personal weight management practices, and chronic disease history among University of Medicine and Dentistry of New Jersey (UMDNJ) faculty and staff. Design/Subjects: A prospective survey utilizing paper and electronic survey modes was sent to a stratified random sample of 2000 UMDNJ faculty and staff. A modified Dillman's Tailored Design method was used and up to four contacts were made with the study participants. The fourth contact was made in the alternate mode with the non-responders. Statistical Analysis: SPSS 17.0 was used for data analyses (alpha: p≤0.05). Descriptive and inferential statistics were used. Results: The overall response rate was 35.7% (n=623). The response rate to the paper survey in the first three mailings was 36.6% vs. 13.0% for the electronic survey. The response rate to the non-respondent paper survey was 16.6% versus 5.4% for the non-respondent electronic survey. There was a significant association between mode of survey delivery and whether participants responded to the survey (χ2=137.64, p<0.001). Participants were 3.9 times more likely to respond to the paper survey as compared to the electronic survey. There was a significant association between mode of survey delivery and whether participants responded to the non-respondent survey (χ2=39.46, p<0.001). Participants were 3.3 times more likely to respond to the non-respondent paper survey as compared to the non-respondent electronic survey. There was no significant association between mode of survey delivery and participants in terms of age, BMI, and educational level. 
When race was controlled for, UMDNJ faculty were 1.7 times more likely to respond to the paper survey versus the electronic survey compared to UMDNJ staff. When UMDNJ position was controlled for, Whites were 1.7 times more likely to respond to the electronic survey versus the paper survey as compared to all other races. Application/Conclusions: The current study demonstrates that UMDNJ faculty and staff in general were more likely to respond to the paper survey than to the electronic survey. Conducting the survey in both electronic and paper modes can be an important method to increase the response rate of a study. Also, increasing the number of mailings and follow-up reminders, especially with the paper mode, can be key to increasing the response rate of future studies.
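
The "times more likely" figures reported above can be illustrated with a small sketch of a 2x2 response-table analysis. This is not the study's own computation; the counts below are hypothetical, assuming 1000 invitations per mode and chosen only so the row proportions match the reported 36.6% vs. 13.0% response rates:

```python
# Hypothetical 2x2 table: rows = survey mode, columns = responded yes/no.
# Counts are illustrative, not the study's data.
paper_yes, paper_no = 366, 634   # 36.6% of 1000 assumed paper invitations
elec_yes, elec_no = 130, 870     # 13.0% of 1000 assumed electronic invitations

def chi_square_2x2(a, b, c, d):
    # Pearson chi-square statistic for a 2x2 table (no continuity correction)
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def odds_ratio(a, b, c, d):
    # odds of responding by paper divided by odds of responding electronically
    return (a / b) / (c / d)

chi2 = chi_square_2x2(paper_yes, paper_no, elec_yes, elec_no)
or_ = odds_ratio(paper_yes, paper_no, elec_yes, elec_no)
print(f"chi2={chi2:.1f}, OR={or_:.2f}")
```

With these illustrative counts, the odds ratio comes out near the 3.9 reported above, showing how such a figure follows directly from a 2x2 mode-by-response table.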


Nutrition Focused Physical Examination Practices of RD Members of the Renal Practice Group and/or the Council on Renal Nutrition

Lynn Munson, 2011

Background: Nutrition focused physical examination is a component of the standards of practice and professional performance for nephrology nutrition and is included in the National Kidney Foundation (NKF) Kidney Disease Outcomes Quality Initiative Clinical Nutrition Guidelines. Objective: To determine the relationships between NFPE practices and demographic and professional characteristics of RD members of the ADA Renal Practice Group and the NKF Council on Renal Nutrition and assess barriers to NFPE practice. Design: This was a prospective Internet-based survey sent to 2768 members of RPG and CRN. Subjects: A list of 3405 email addresses of RD members of RPG and CRN was obtained. A total of 817 (29.8%) RDs completed the survey and met the inclusion criteria of providing care to at least one dialysis patient/week in ambulatory care/outpatient settings. Statistical Analysis: SPSS 17.0 was used for analysis. Descriptive and inferential statistics were performed to examine relationships between NFPE performance and demographic and professional characteristics. A priori alpha was set at p < 0.05. Results: The survey sample was 96.8% female with a mean of 19.8 ± 11.3 years of clinical experience. More than half (53.6%, n=382) had or were pursuing graduate degrees and 15.9% (n=115) had Certified Specialist in Renal credentials. The majority (64.6%, n=466) worked in for-profit dialysis centers. NFPE skills most frequently performed independently included visual assessment of muscle and fat stores, height and weight measurement, skin assessment, and assessment of peripheral edema. Time availability was identified as a strong barrier by 41.5% (n=298) of respondents. Conclusions: The five NFPE skills reported most frequently as being performed independently were visual assessment of muscle and fat stores, height measurement, weight measurement, skin assessment, and assessment of peripheral edema. Time availability, workload, and privacy/space were barriers to NFPE performance. 
Training enhances NFPE performance and is particularly effective when a mentor is available.


The Dietary Protein and Energy Intake of People with Chronic Kidney Disease from Population Estimates Within the National Health and Nutrition Examination Survey

Linda Moore, 2011

Objective: To describe the dietary protein intake (DPI) and energy intake (DEI) of people with chronic kidney disease (CKD) compared to people without CKD. Methods: NHANES 2001-2008 was used to identify CKD and dietary intake. Complex survey analyses were used to report population estimates of CKD, and DPI and DEI at each stage of CKD. Comparison to adults with No CKD was performed with complex survey design analysis using ANOVA. One-sample t-tests compared intake to recommended values. Results: Of 41,658 NHANES participants, 16,872 (40.5%) were ≥20 years of age and had evaluable data for CKD staging and dietary intake. For DPI, using those with No CKD as the comparator (mean±SE, 1.34±0.01 g/kg/d), lower DPI was reported by adults at stage 2 CKD (1.27±0.03 g/kg/d, p=0.0008), stage 3 CKD (1.14±0.2 g/kg/d, p<0.0001), and stages 4 or 5 CKD (not receiving dialysis; 1.04±0.05 g/kg/d, p<0.0001). DEI was also lower in those with CKD: No CKD (35.6±0.7 kcal/kg/d), stage 2 CKD (32.8±0.8 kcal/kg/d, p<0.0001), stage 3 CKD (29.2±0.5 kcal/kg/d, p<0.0001), and stages 4 or 5 CKD (not receiving dialysis; 26.3±0.8 kcal/kg/d, p<0.0001). Seventy percent of those with No CKD, 60% of those at stages 1-3, and 50% of those at stages 4 and 5 (not yet receiving dialysis) had DPI above recommended values from the Dietary Reference Intakes (DRI) and the Kidney Disease Outcomes Quality Initiative (KDOQI). Ten percent of those with No CKD and 20% of those at stages 2-5 (not yet receiving dialysis) had inadequate DPI according to the DRI and KDOQI. Conclusions: The DPI and DEI of US adults with CKD were significantly different from those of adults without CKD. DPI decreased with each progressive stage of CKD. Dietitians and physicians should consider this information when treating people with CKD.


The Impact of a 12-week Worksite Lifestyle Management Program on Health Related Quality of Life

Maura Bruno, 2010

Objective: To determine the effectiveness of a 12-week workplace intervention program (WIP) focused on weight loss and reduction of cardiovascular risk factors on health-related quality of life (HRQOL), and the effect of delivery method on outcomes. Methods: A retrospective analysis of data collected in a 12-week trial comparing in-person (IP) and Internet-based (IB) interventions to identify the impact on HRQOL utilizing the CDC HRQOL-14 questionnaire. Results: Repeated-measures ANOVA indicated no significant intervention effect for HRQOL by group assignment. Within subjects, a significant main effect was noted for improvement in the summative index of unhealthy days, sleeplessness days, and vitality days at weeks 12 and 26. At week 26, a significant main effect was found for improved mentally unhealthy days and depression days. Conclusions: Improvement in HRQOL following a 12-week university-based WIP can occur independent of method of delivery (IP vs. IB).


Screening Practices of Dentist Members of the New Jersey Dental Association Regarding Eating Disorders

Rosemarie Scarpa, 2011

Background: Dentists may be the first to note the oral manifestations of eating disorders, facilitating early diagnosis and intervention and reduction of oral and systemic complications. It is important for dentists to identify oral and physical manifestations of eating disorders, manage oral manifestations, and refer patients to appropriate health care professionals for management. Objective: To explore screening practices, demographic characteristics, and knowledge regarding eating disorders among dentists who are members of the New Jersey Dental Association (NJDA), and to identify relationships among these variables. Design: The study was prospective and descriptive and utilized an Internet-based survey sent to dentist members of the NJDA. Participants: Active general or pediatric dentist members of the NJDA in 2010-11 who responded to an email survey. Statistical analyses performed: Descriptive and inferential statistical analyses were conducted to assess the relationships between the screening practice of looking for oral manifestations of eating disorders and gender, age, number of years in professional dental practice, and knowledge score. A priori alpha level was set at 0.05. Results: The sample was 64.8% male (n=81) and respondents were general (n=108, 85.7%) or pediatric dentists (n=15, 11.9%). The mean age was 50.63 years. The mean number of years in practice was 23.22. Most respondents had not completed academic or continuing dental education courses (n=104, 88.1%) or clinical training (n=112, 93.3%) in eating disorder identification and management. The mean knowledge score was 29.94 of 44. Most respondents looked for oral manifestations of eating disorders (n=122, 87.8%) and referred patients to their primary care physician when referring for eating disorder management (n=97, 89.0%). 
No relationships were found between the screening practice of looking for oral manifestations of eating disorders and gender, age, number of years in professional dental practice, or knowledge score regarding eating disorders. Conclusions: Most respondents look for oral manifestations of eating disorders and discuss suspected or known oral pathologies suggestive of eating disorders with their patients, but they may not recognize all of these manifestations or may be unsure of their etiology. Respondents were more likely to correctly answer questions on the diagnostic criteria for anorexia nervosa and bulimia nervosa than questions regarding their oral and physical manifestations. When respondents refer patients with suspected or known oral pathologies suggestive of eating disorders, they largely refer to the patient's primary care physician, with the majority indicating they do so because the primary care physician should be the health professional to make the referral.


A Retrospective, Longitudinal Evaluation of Nutritional Status in a Cohort of Childhood Cancer Survivors

Nancy Sacks, 2014

Background: Children with cancer experience suboptimal weight gain and linear growth during therapy. Survivors may demonstrate atypical growth patterns after therapy completion. Objective: To describe nutritional status using weight-for-age Z-score (WFAZ), height-for-age Z-score (HFAZ), and categories of nutritional status at specified time points; and to analyze changes in nutritional status between diagnosis (T1) and five years from diagnosis (T2). Design/Subjects: Retrospective pilot study evaluating childhood cancer survivors (brain tumor: n=61; other solid tumor: n=61). Statistics: WFAZ and HFAZ were analyzed between T1 and T2 using the paired-samples t-test. Nutrition category (underweight, normal, overweight, obese) was compared between T1 and T2 with the Fisher's Exact Test. Results: The study sample (n=122) was 82.8% Caucasian and 53.3% male, with a mean age at diagnosis of 77.4 months (range 0.04-235.1). There was a statistically significant decrease in mean HFAZ in the entire cohort at T1 (-0.22±1.29) vs. T2 (-0.55±1.13), p=0.010, and in the brain tumor cohort at T1 (-0.13±1.22) vs. T2 (-0.77±1.03), p=0.004. There was a statistically significant difference between the nutrition category at T1 and T2 in the entire cohort (p<0.0001), the brain tumor cohort (p<0.05), and the other solid tumor cohort (p<0.05). The percentage of subjects classified as overweight/obese increased from T1 to T2 (7.6%/13.3% to 11.8%/14.7%, respectively). Thirty-five percent of survivors were overweight or obese at the initial survivorship visit. Conclusions: This pilot study demonstrates the importance of evaluating WFAZ, HFAZ, and categories of nutritional status. The mean WFAZ and HFAZ decreased at the end of therapy, indicating suboptimal weight gain and linear growth. Trends were observed for increases in both WFAZ and HFAZ at five years from diagnosis, but HFAZ did not return to Z-scores comparable to diagnosis. 
Further research is needed to better understand changes in weight and height to develop proactive strategies that maximize growth.
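
For readers unfamiliar with growth Z-scores such as WFAZ and HFAZ, the sketch below shows the LMS transformation used by CDC/WHO growth references to convert a measurement into a Z-score and percentile. The L, M, and S values here are hypothetical placeholders for one age/sex stratum, not actual reference parameters:

```python
import math

def lms_zscore(x, L, M, S):
    # LMS transformation used by CDC/WHO growth references:
    # Z = ((x/M)**L - 1) / (L*S) when L != 0, else ln(x/M)/S
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

def percentile(z):
    # standard normal CDF, for converting a Z-score to a percentile
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical LMS parameters for weight-for-age at a single age/sex
# (illustrative only -- not actual CDC reference values)
L_, M_, S_ = -0.35, 16.0, 0.11
weight_kg = 17.8
z = lms_zscore(weight_kg, L_, M_, S_)
print(f"WFAZ={z:.2f}, percentile={100 * percentile(z):.1f}")
```

A child whose measurement equals the median M gets a Z-score of zero; cut-points on the resulting percentile (e.g., 85th, 95th) are what drive the underweight/normal/overweight/obese categories compared in the study.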


Investigating the relationship between measured resting energy expenditure and steady state among patients on maintenance hemodialysis

Laura Olejnik, 2014

Indirect calorimetry (IC) is the gold standard for determining measured resting energy expenditure (mREE) in the clinical practice setting. Current practice guidelines recommend a 30-minute test for meeting steady state (SS) limits. An SS interval was defined as a coefficient of variation for VO2 and VCO2 of ≤10% within a predetermined time. Achieving a 30-minute SS interval may be difficult in the maintenance hemodialysis (MHD) population, as participants may become uncomfortable during IC testing either due to the test itself or secondary to their current health status. The aim of this study was to explore whether a shortened SS interval (e.g., ≤5 minutes) was of acceptable limits for bias and precision. The levels of agreement between mREE at 5 minutes of SS and each studied time interval (30, 10, 4, 3 and 2 minutes) were determined by combining paired-samples t-tests and Bland-Altman analysis. Statistical significance was set at p < 0.05. A secondary analysis of existing data was completed on 53 patients on MHD who participated in a multi-site, cross-sectional study within the Northeastern region of the United States. There was a non-significant difference in mREE at 30 minutes of SS (p=0.49), 10 minutes of SS (p=0.96), 4 minutes of SS (p=0.65), and 2 minutes of SS (p=0.29) when compared to the 5-minute SS interval. In current publications, mREE calculated by IC within the MHD population varies by 5-18% when compared to control subjects. Based on this range, the limit of agreement was set at ±10% of the mREE of the longest SS time interval (30, 10 and 5 minutes) within each comparison SS pair. For 30, 10, 4, 3 and 2 minutes of SS, 95.7%, 94.7%, 97.9%, 97.9% and 97.9% of measurements, respectively, met the limits of agreement. There was minimal bias between 10 and 4 minutes of SS compared to 5 minutes of SS. 
Utilizing an abbreviated SS protocol of 5 minutes appears to provide accurate mREE measurements in patients on MHD while minimizing IC testing time as well as patient burden.
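
The steady-state criterion described above (coefficient of variation of both VO2 and VCO2 at or below 10% within a window) can be sketched in a few lines. The minute-by-minute readings below are invented for illustration, not study data:

```python
import statistics

def cv(values):
    # coefficient of variation, expressed as a percentage of the mean
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def is_steady_state(vo2, vco2, threshold=10.0):
    # steady state: CV of both VO2 and VCO2 at or below the threshold
    return cv(vo2) <= threshold and cv(vco2) <= threshold

# Illustrative minute-by-minute gas-exchange readings (mL/min), not study data
vo2 = [251, 248, 255, 250, 249]
vco2 = [210, 208, 213, 209, 211]
print(is_steady_state(vo2, vco2))
```

For this stable 5-minute window both CVs fall well under 10%, so the window qualifies as steady state; a window with large swings in VO2 would not.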


Rutgers Biomedical and Health Sciences School of Health Related Professions: Outcomes Assessment of the Alumni of the Master of Science and Doctorate of Clinical Nutrition Programs

Erin Kenny, 2014

Background: Periodic evaluation of the outcomes of academic programs is important to assess if the programs are meeting program goals and outcomes. It is important to explore and identify alumni research-related activities and their perceived level of preparedness to meet program competencies. Objective: To identify outcomes of Rutgers School of Health Related Professions (SHRP) Master of Science in Clinical Nutrition (MSCN) and Doctorate of Clinical Nutrition (DCN) programs regarding perceived preparedness to meet programmatic competencies and post-graduation scholarship research activities. Design: Descriptive, cross-sectional, mailed survey. Participants/Setting: All Rutgers SHRP MSCN and DCN alumni. Of the 141 MSCN alumni mailed the survey, 72 (51.0%) completed it, and of the 39 DCN alumni mailed the survey, 24 (61.5%) completed it. Statistical Analyses Performed: Descriptive statistics were reported for demographic characteristics, alumni perceived preparedness to meet competencies, and research-related activities. The relationship between the specific research competency "Design and conduct dietetics/nutrition research in a variety of settings" (for both MSCN and DCN programs) and the number of research activities completed by MSCN and DCN alumni was explored using the Spearman's correlation coefficient. Results: The mean number of years in practice was 14.3 years for the MSCN and 21.8 years for the DCN respondents. Sixty-seven percent (67.7%) of MSCN respondents are "RD/RDN staff" and 33.3% of DCN respondents are "Faculty/Educators". Sixty-eight percent (68.7%) of MSCN and 54.2% of DCN respondents were in clinical practice. The most frequently reported clinical practice area was "acute care community hospital-inpatient" (49.0%) for MSCN and "private practice" (30.8%) for DCN. MSCN respondents rated being "very prepared" for all competencies except two, and DCN respondents rated being "very prepared" for all competencies except one. 
Fifty-eight percent (58.3%) of MSCN and 95.8% of DCN respondents were involved in one or more research activities. The most frequently reported MSCN research activities were: co-investigator in unfunded research, collaboration in study design, and data collection. The most frequently reported DCN research activities were: collaboration in study design, data collection, and mentoring students. Conclusion: All alumni felt very prepared to meet most program competencies. More than one half of MSCN and DCN alumni are in clinical practice. They are involved in research-related activities, publications, and presentation of research; DCN alumni are involved in a greater range of research activities compared to MSCN alumni.


Rutgers Biomedical and Health Sciences SHRP Students Enrolled in the MSCN and DCN Programs: Self-Reported Frequency of Performing Beyond Entry-Level and Advanced-Level Practice Tasks

Jaime Avila, 2015

Background: There is limited published research regarding practice activities of registered dietitians at beyond entry-level (BEL) and advanced levels of practice (ALP). Understanding how clinical practice varies across the levels is important for developing BEL and ALP education and training programs. Objective: To survey students enrolled in graduate-level programs regarding their frequency of performance of BEL and ALP tasks from the 2013 Commission on Dietetic Registration Advanced Clinical Dietetics Practice Audit. Design: A cross-sectional, mailed paper-based survey. Respondents reported how often they performed 56 practice tasks. Participants: Master of Science in Clinical Nutrition (MSCN) and Doctor of Clinical Nutrition (DCN) students from a single university working in clinical practice. Statistics: Frequency distributions and Fisher's Exact Test were used to summarize survey results and compare practice activities by program. Results: Of the 63 respondents, 71.4% (n=45) met the inclusion criteria; 89.0% (n=26) MSCN and 55.9% (n=19) DCN respondents. MSCN students performed more daily nutrition care tasks while DCN respondents performed more research and scholarship-based tasks. Of the activities performed on a daily/weekly basis, 36.8% (n=7) of DCN respondents communicate research findings, 63.2% (n=12) evaluate published research, 36.8% (n=7) analyze data from nutrition care research, and 42.1% (n=8) utilize systematic methods to obtain published evidence to answer clinical questions. Conclusions: Consistent with prior research, those in clinical practice with master's degrees perform more research-based practice tasks on a daily/weekly basis compared to those without a master's degree. These findings may help establish competencies for graduate programs geared toward BEL and/or ALP RDs.


Validation of a Practice-Based Research Involvement Survey for Registered Dietitian Nutritionists in Clinical Practice

Maria Plant, 2015

Background: Research is critical to the advancement of the dietetics profession and evidence-based practice. A measurement tool that is sensitive to research activities in clinical practice is essential to accurately measure the research involvement of registered dietitian nutritionists (RDNs) in this setting. Objective: The aim of the study was to validate the Practice-Based Dietitian Research Involvement Survey (PBDRIS) in a sample of RDNs working in clinical practice. Participants/setting: A randomly selected sample of 1,500 RDNs from the Commission on Dietetic Registration (CDR) membership database who indicated clinical practice as their primary focus were asked to participate in the study. The final sample included 79 respondents (5.4% response rate). Of these 79 respondents, 37 completed the retest (46.8% response rate). Intervention: Eligible RDNs were invited to complete the PBDRIS via an online survey delivery system. Main outcome measure: The main outcome of interest for the study was validation of the PBDRIS. Statistical analyses performed: The validity of the questionnaire was assessed by content experts using a content validity index. Reliability and internal consistency were examined using Cronbach's α coefficient and mean inter-item correlation. Reproducibility was measured between test and retest phases using Spearman's correlation. Results: Content validation of the PBDRIS was conducted, resulting in a content validity index of 0.90 for the total tool. Cronbach's α for the total PBDRIS was 0.87, reflecting good reliability. Cronbach's α ranged from 0.53-0.81 for individual research levels. Mean inter-item correlations were optimal for all research levels (0.24-0.41). Corrected item-total correlation scores for all items on the PBDRIS were greater than or equal to 0.30, suggesting overall adequate item correlation. Spearman's correlation coefficients ranged from 0.34-0.72, reflecting tool reliability and reproducibility. 
Conclusion: The PBDRIS is a valid and reliable tool for measuring research involvement among RDNs working in clinical practice.
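
As a minimal sketch of the internal-consistency statistic reported above, Cronbach's α can be computed from the item variances and the variance of respondents' total scores. The item responses below are invented for illustration, not PBDRIS data:

```python
import statistics

def cronbach_alpha(items):
    # items: one list of scores per survey item (same respondent order)
    # alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    k = len(items)
    item_vars = sum(statistics.variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative responses to 3 items from 5 respondents (hypothetical data)
items = [[4, 5, 3, 4, 2],
         [4, 4, 3, 5, 2],
         [5, 5, 2, 4, 1]]
alpha = cronbach_alpha(items)
print(f"alpha={alpha:.2f}")
```

When items move together across respondents, as in this toy data, α approaches 1; values around 0.87 like the total-tool figure above are conventionally read as good reliability.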


Comparison of measured resting energy expenditure (mREE) from a metabolic cart to a portable handheld device in maintenance hemodialysis patients: a feasibility study.

Miriam Alavi, 2015

Background: Compared to traditional indirect calorimetry, a portable handheld device for measuring the REE of patients on dialysis may be more convenient, and may provide a more reliable estimate of an individual's energy needs than the predictive formulas currently used in clinical practice. The purpose of this study was to determine the level of agreement in REE as measured by a metabolic cart and a portable handheld device among patients on MHD. Design/Subjects: This study involved medically stable, English-speaking MHD patients, mean age 60.6 ± 10.2 years, 70.6% male, mean BMI 30.0 ± 7.1 kg/m2. The most common etiology of CKD was hypertension (41.2%). Measured REE (mREE) from both devices was completed in 16 participants. Statistics: The level of agreement between the mREE from the portable handheld device and the mREE from the metabolic cart was analyzed using paired-samples t-tests and Bland-Altman analysis. Statistical significance was set at p<0.05. Individual mREE measurements were considered to be in agreement if the difference was within ±10%, or 147.9 kcal. Results: There was agreement between the mREE from the portable device and from the metabolic cart, with 68.8% of measurements falling within the band of acceptability. There was no statistically significant difference in the mean scores of the mREE from the two devices (p=0.759). The majority of participants (94.1%) found the portable handheld device to be comfortable. Conclusions: There was agreement between the mREE from the two devices among individuals receiving MHD. The portable handheld device may be a reasonable option for clinicians to use in the assessment of energy requirements for patients on MHD, as it was acceptable to participants, is less expensive, and requires minimal time to obtain results compared to the metabolic cart. Further research is needed to evaluate the accuracy of the portable device for measuring REE in this population.
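
A simplified sketch of this kind of agreement analysis: compute per-pair differences, the mean bias, and the share of pairs whose difference falls within a ±10% band around the reference mean. The paired readings below are hypothetical, not the study's measurements:

```python
import statistics

def agreement(ref, test, band_pct=10.0):
    # Bland-Altman-style summary: per-pair differences, mean bias,
    # and fraction of pairs within +/- band_pct of the reference mean
    diffs = [t - r for r, t in zip(ref, test)]
    bias = statistics.mean(diffs)
    band = band_pct / 100.0 * statistics.mean(ref)
    within = sum(1 for d in diffs if abs(d) <= band) / len(diffs)
    return bias, band, within

# Illustrative paired mREE readings in kcal/day (hypothetical, not study data)
cart = [1480, 1510, 1395, 1620, 1550]
handheld = [1455, 1540, 1410, 1580, 1600]
bias, band, within = agreement(cart, handheld)
print(f"bias={bias:.1f} kcal, band=±{band:.1f} kcal, {100 * within:.0f}% within")
```

A small mean bias with most pairs inside the band is what "agreement" means here; a full Bland-Altman analysis would also plot each difference against the pair mean and report limits of agreement.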


Determining the Reliability of a Nutrition-Focused Physical Assessment Knowledge Test of the Head, Neck and Oral Cavity among Clinical RD Preceptors

Christine DeSouza, 2016

Background: A reliable tool is needed to identify registered dietitians' (RDs') knowledge of nutrition-focused physical assessment (NFPA) of the head, neck, and oral cavity. Objective: To determine the reliability of a knowledge test of NFPA of the head, neck, and oral cavity and compare scores between RDs with and without prior NFPA training. Design: Prospective, internet-based, test/retest design. Participants/Setting: A convenience sample of clinical RD preceptors from Rutgers, The State University of New Jersey School of Health Related Professions Department of Nutritional Sciences for the 2014-2015 academic year. Intervention: The test was administered twice over seven weeks. Tests were matched from each time point for test/retest analysis. Using Test I only, scores from RDs with and without prior training in NFPA were compared to determine construct validity. Statistical Analyses Performed: Descriptive statistics were used to describe scores at each time point. Cohen's kappa coefficient was used to test the reliability of individual questions, and the Wilcoxon signed rank test was used to determine if the differences between median scores at each time point were statistically significant. The strength of the association between scores at each time point was analyzed with Spearman's rank correlation coefficient. Results: Eight RDs completed both tests and met criteria to be included in the test/retest analysis. Two individual questions had perfect agreement (κ=1.00, p=0.01) and 10 questions had fair to moderate agreement (κ between 0.21 and 0.60) between test and retest. A good association (ρ=0.718) and no statistically significant difference were found between scores at each time point (p>0.05), meaning the tests performed similarly. There were not enough valid cases of trained RDs to determine construct validity. 
Conclusion: Twelve individual questions demonstrated fair to excellent agreement, indicating good reliability; however, the tool requires further testing with a larger sample to determine the reliability of the tool as a whole.


Rutgers Biomedical and Health Sciences School of Health Related Professions: 2015 Outcome Assessment Survey of the Alumni of the Rutgers University Coordinated Program in Dietetics

Staci Walden, 2016

Background: Evaluation of dietetic education program outcomes provides information to assess the degree to which a program meets its goals and objectives. Exploration of alumni perceptions of preparedness to meet program competencies can help to provide insight into the educational experience from the student's perspective. Objective: To explore perceptions of alumni of the Rutgers University Coordinated Program (Rutgers CP) in Dietetics on their preparedness to meet Accreditation Council for Education in Nutrition and Dietetics (ACEND) program competencies upon graduation from the program, and to describe their professional and scholarly achievements since graduating from the Rutgers CP. Design: The study was a descriptive, cross-sectional, mailed survey. Respondents completed a Likert-scale survey concerning their perceived preparation by the Rutgers CP to perform program competencies upon graduation. Participants: All alumni of the Rutgers School of Health Related Professions (SHRP) Coordinated Program in Dietetics were invited to participate. Statistical Analyses: Descriptive statistics were conducted using SPSS version 21. Results: Of the 191 alumni, 187 were sent postal surveys. The majority of respondents were female (93%), and 75% had completed the joint Rutgers University/Thomas Edison State College (RU/TESC) track. The median age of respondents was 45.0 years (SD = 9.3). The most frequently identified current job title (66.7% of alumni) was RDN. The most frequently reported employment settings were acute care (33.9%) and Long Term Care, Extended Care, or Assisted Living (LTC/EC/AL) (32.1%). More than one half (57.8%) of the respondents had participated in at least one scholarship activity. Sixty-four percent (64.2%) of entry-level competencies in the clinical domain, and 66.6% of entry-level competencies in the combined clinical and foodservice domain, were rated at 80% or greater in the 'very prepared' and 'prepared' categories. 
None of the competencies in the foodservice domain were rated at 80% or greater in the categories of ‘very prepared’ and ‘prepared’. Conclusions: Alumni respondents rated clinical competencies for entry-level practice higher in perceptions of preparedness than competencies associated with foodservice. Alumni are engaged in scholarship and professional activities in the field of dietetics.


The Relationship Between the Response on Nutrition Specific Questions of the Kidney Disease Quality of Life (KDQOL) Survey and Nutrition-Related Parameters and Fluid Status of Adult Maintenance Hemodialysis Patients

Amy Kaminski, 2016

To determine the relationship between patients' self-reported perceptions regarding their dietary and fluid restrictions and their overall nutritional status, we explored the specific diet and fluid questions on the KDQOL annual survey in relation to nutrition-related parameters (albumin, normalized protein catabolic rate, phosphorus, potassium) and fluid status (measured by average interdialytic weight gains; IDWGs) among adult patients receiving in-center MHD. A retrospective chart review (N=50) was conducted between January 2014-June 2015. Spearman's correlation coefficients were used to analyze relationships between patients' KDQOL responses to diet and fluid restriction questions, mean KDQOL component scores, nutrition-related parameters, average IDWGs, and clinical characteristics. Summary statistics were used to report the mean KDQOL component scores. An a priori alpha was set at the 0.05 level. Sixty percent of the sample were male, 62% were African American, and the mean dialysis vintage was 6 years (SD=15.1). The Effects of Kidney Disease mean component score was 74.8 (SD=20.9), with the mean Mental Component Score (MCS) and Physical Component Score (PCS) being 52.2 (SD=8.9) and 42.2 (SD=10.3), respectively. While non-significant, patients who were less bothered by their diet had a higher albumin level (r=-0.1, p=0.6), whereas patients who had higher IDWGs were more bothered by their fluid restriction (r=0.2, p=0.1). Interestingly, patients who had a higher mean dialysis vintage were significantly less bothered by their fluid restriction (r=0.4, p=0.007) but scored significantly lower on the PCS (r=-0.3, p=0.02). While this sample reported higher mean KDQOL scores than cited in the literature, this study identified that key questions/components of the KDQOL can identify patients who may need additional support and encouragement with maintaining optimal health and nutritional status.


The Relationship between Social Support and Diet Quality

Renee Pieroth, 2016

Background: Diet quality is part of a public health strategy to attain and maintain a healthy body weight, optimize overall health, and reduce the risk of chronic disease. Greater social support has been associated with better physical and mental health. The relationship between social support and diet quality is not well understood. Objective: The purpose of this research is to assess the relationship between social support, using the Rees social support index (SSI), and diet quality, using the Healthy Eating Index-2010 (HEI-2010), among U.S. adults aged 40 years and older. Design/participants: This study was a cross-sectional analysis of data from the 2007-2008 National Health and Nutrition Examination Survey (NHANES) (N = 3243). Social support was examined using the SSI (range 0-5), calculated as the sum of five dichotomized variables. The HEI-2010 was calculated from the average of two 24-hour dietary recalls. Statistical analyses: SAS survey procedures were used to incorporate the appropriate sample design weights. Unweighted frequencies are reported along with weighted means and confidence intervals (CI) or standard errors (SE). The total and component HEI-2010 scores were compared among the six SSI groups, with additional models comparing three SSI categories of low, moderate, and high social support and controlling for gender, age, race/ethnicity, income level, and education level. Results: The mean HEI-2010 score for those with low SSI (n=210) was 51.1 (95% CI: 48.0, 56.0) compared to 56.7 (95% CI: 55.0, 58.3) for those with high SSI (n=1725), p < 0.0001. After adjustment for demographic variables, an overall trend was seen in which an increasing SSI score was associated with an increasing mean total HEI-2010 score (p = 0.05).
When stratified by gender and adjusted for other demographics, moderate and high SSIs (mean total HEI-2010 55.5, SE 0.9 and 55.3, SE 1.0, respectively) were associated with higher diet quality compared to low SSI in men (mean total HEI-2010 49.6, SE 1.6), p = 0.009. There was no significant difference in diet quality among SSI groups in females (p = 0.28). Conclusions: This study suggests that there may be a positive relationship between social support and overall diet quality among middle-aged and older adults in the U.S., and that it may differ by gender.
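The SSI described above is a simple sum of five dichotomized indicators, grouped into low, moderate, and high categories. A sketch of that scoring, noting that the category cutoffs below are assumptions for illustration (the abstract does not state the boundaries):

```python
def ssi_score(supports):
    """supports: five 0/1 indicators of social support.
    Returns (score 0-5, category). The low/moderate/high cutoffs
    used here are hypothetical, not taken from the thesis."""
    if len(supports) != 5:
        raise ValueError("SSI is the sum of exactly five indicators")
    score = sum(supports)
    if score <= 1:
        category = "low"
    elif score <= 3:
        category = "moderate"
    else:
        category = "high"
    return score, category
```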


Predictors of Preoperative Weight Loss Achievement in Adult Bariatric Surgery Candidates Following a Low Calorie Diet for Four Weeks

Deborah Hutcheon, 2016

Background: Achieving program-mandated preoperative weight loss poses a challenge for many bariatric surgery candidates. No systematic method exists to identify at-risk patients early in preoperative care. Objectives: This study sought to explore predictors of preoperative weight loss achievement and to develop a treatment algorithm for guiding clinical decision-making. Setting: Greenville Health System, South Carolina. Methods: A retrospective chart review was conducted for 378 patients who followed a program-mandated low calorie diet (LCD) for four weeks to achieve ≥8% excess body weight loss (EBWL). Associations between weight loss achievement and patient demographic, nutrition, psychological, clinical, anthropometric, and treatment characteristics documented at five preoperative evaluation events were analyzed using logistic regression. Results: During the LCD, 62.7% of patients achieved ≥8% EBWL. Independent predictors of achievement (all p<0.05) were male sex (OR 2.31, 95% CI 1.21-4.42), Caucasian race (OR 2.45, 95% CI 1.38-4.34), body mass index (BMI) at surgeon evaluation (50.0-59.9 kg/m2: OR 0.44, 95% CI 0.20-0.97; ≥60 kg/m2: OR 0.15, 95% CI 0.05-0.42), number of comorbidities (OR 0.83, 95% CI 0.74-0.93), hypertension diagnosis (OR 2.42, 95% CI 1.42-4.13), pre-diet weight change (OR 1.08, 95% CI 1.01-1.16), and time between surgeon evaluation and preoperative LCD initiation (61-90 days: OR 0.46, 95% CI 0.23-0.93).
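Odds ratios with 95% confidence intervals, as reported above, are conventionally obtained by exponentiating logistic regression coefficients and their Wald limits. A sketch of that back-transformation (illustrative only; not tied to the study’s actual model or data):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Turn a logistic regression coefficient (beta) and its standard
    error (se) into an odds ratio with a Wald confidence interval.
    z=1.96 gives the conventional 95% interval."""
    point = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return point, lower, upper
```

For example, a coefficient of 0 (no association) maps to an odds ratio of 1.0, with an interval that straddles 1.0 whenever the standard error is nonzero.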


The Relationships between Fat Free Mass (FFM) Body Composition Estimates (using Three Validated Equations) and Select Patient Characteristics from a Cohort of Patients on Maintenance Hemodialysis in the Northeastern US Region

AnDre' Blanks, 2016

Background: Body composition estimates are influenced by the relationship between FFM, fat mass, fluid status, and patient characteristics in individuals on maintenance hemodialysis (MHD). Objective: The purpose of this study was to examine relationships among FFM estimates using three validated equations in the context of patient characteristics for an MHD cohort. Design: This secondary analysis of a multi-site cross-sectional study identified patient characteristics (e.g., interdialytic weight gain (IDWG) and fluid overload). FFM was calculated using three equations (i.e., Kushner, Lukaski, and Segal). Relationships between FFM as estimated by the three equations and the patient characteristics were determined using statistical correlation tests. Results: A total of 133 participants were included in most analyses. The mean age was 55.4 ± 12.3 years, 39.1% (n=52) of participants were obese, and 31.8% (n=41) were fluid overloaded. Mean FFM estimates were: Kushner, 54.3±12.0 kg; Lukaski, 52.6±11.3 kg; and Segal, 56.0±12.3 kg. A significant correlation was found between IDWG and FFM (Kushner, r=0.428, p<0.001; Lukaski, r=0.409, p<0.001; Segal, r=0.493, p<0.001), as well as between fluid overload and FFM (Kushner, r=0.483, p<0.001; Lukaski, r=0.453, p<0.001; Segal, r=0.335, p<0.001). Conclusion: Participants with higher FFM measures had higher IDWGs and were more likely to be fluid overloaded. Factors contributing to outliers were related to fluid overload, IDWG, fat mass, and how the three validated equations account for fluid overload and fat mass.


Nutrition focused physical examination knowledge score changes following completion of a computer-assisted instruction module among an international cohort of dietetic students

Jillian Redgate, 2016

Introduction: Lack of education and training are barriers to use of nutrition focused physical examination (NFPE). Computer-assisted instruction (CAI) can be used for NFPE education globally. The purpose of this study was to assess changes in knowledge scores of an international cohort of dietetic students enrolled in an online NFPE CAI module. We hypothesized that there would be a significant increase in scores from before to after completion of the CAI for students in each school. Methods: Participants from the Rutgers University Coordinated Program in Dietetics in the United States (U.S.) and Tel-Hai College in Israel enrolled in an eight-week online NFPE CAI module. The module included live virtual classroom sessions, multi-media synchronous and asynchronous presentations, online discussion forums, and case studies. Changes in pre/post-test scores using a 48-item multiple-choice exam given before and after completion of the module were analyzed retrospectively using Wilcoxon signed rank tests. Results: For the 21 U.S. participants who completed the pre- and post-tests, there was a significant increase in knowledge scores from a mean of 29.2 (SD=4.47) at pre-test to a mean of 39.4 (SD=2.91) at post-test (p<0.001). There was a significant increase in scores for the 26 Israeli participants from a mean of 24.0 (SD=4.37) on the pre-test to a mean of 47.3 (SD=1.15) on the post-test (p<0.001). Post-test scores were higher for extra- and intra-oral and dysphagia screening content areas compared to introduction to NFPE and malnutrition screening areas. Conclusion: All participants enrolled in this CAI on NFPE exhibited increased knowledge scores from before to after completion of the module. The findings support the need for larger controlled studies to test the effectiveness of using CAI to provide NFPE education and training to dietetic students. This research was funded by the Rutgers University Centers for Global Advancement and International Affairs.


Comparison of a Handheld Indirect Calorimetry Device and Predictive Energy Equations among Individuals on Maintenance Hemodialysis

Ellis Avery Morrow, 2016

Practical methods for determining resting energy expenditure (REE) among individuals on maintenance hemodialysis (MHD) are needed due to the limitations of indirect calorimetry (IC). Two disease-specific predictive energy equations (PEEs) have been developed for this metabolically complex population. The aim of this study was to compare estimated REE (eREE) by PEEs to measured REE (mREE) with a handheld indirect calorimetry device (HICD). A prospective pilot study of adults on MHD (N=40) was conducted at two dialysis clinics in Houston and Texas City, Texas. Measured REE by an HICD was compared to eREE determined by six PEEs using Bland-Altman analysis with a band of acceptable agreement of ±10% of the group mean mREE. Paired t-tests and the intraclass correlation coefficient were also used to compare the alternate methods of measuring REE. A priori alpha was set at P < .05. The mean (±SD) age was 56.7±12.9 years, 52.5% (n=21) were female, and 85% (n=34) were African American. BMI ranged from 18.1-47.1 kg/m2; 67.5% were overweight (BMI ≥25 kg/m2) and 50% were obese (BMI ≥30 kg/m2). The Maintenance Hemodialysis Equation-Creatinine version (MHDE-CR) was the most accurate PEE, with 52.5% of values within the band of acceptable agreement, followed by the Mifflin-St. Jeor Equation and the Vilar et al. Equation at 45.0% and 42.5%, respectively. The MHDE-CR was more accurate and precise than the other PEEs evaluated. Further research is needed to validate the MHDE-CR and other practical methods for determining REE among individuals on MHD.
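The accuracy criterion used above counts estimates falling within ±10% of the group mean measured REE. A small sketch of that agreement calculation (the REE values below are hypothetical, not the study’s data):

```python
def pct_within_band(measured, estimated, band_frac=0.10):
    """Percent of paired estimates whose difference from the measured
    value falls within ±band_frac of the group mean of `measured`.
    Mirrors a Bland-Altman-style acceptable-agreement band."""
    if len(measured) != len(estimated) or not measured:
        raise ValueError("need equal-length, non-empty lists")
    mean_measured = sum(measured) / len(measured)
    band = band_frac * mean_measured
    hits = sum(1 for m, e in zip(measured, estimated) if abs(e - m) <= band)
    return 100.0 * hits / len(measured)
```

With a group mean mREE of 1250 kcal, the band is ±125 kcal, so an estimate 200 kcal off its measured pair falls outside it.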


Impact of Vegetable Exposure-Garden Grown Intervention (VEGGI) on Vegetable Consumption of Undergraduate Students

Jill Goode Englett, 2016

Objective: To explore the impact of the Vegetable Exposure-Garden Grown Intervention (VEGGI) on vegetable intake and diet quality of college students enrolled in an in-class section of an undergraduate nutrition course (intervention group) compared to online sections without exposure to the intervention (comparison group). Participants: Students enrolled in an undergraduate nutrition course during the spring 2016 semester. Methods: A 17-week, quasi-experimental study compared the pre- versus post-course dietary intake of students enrolled in in-class (n=14) and online (n=46) sections of an undergraduate nutrition course. Results: An ANCOVA showed no significant difference in the change in vegetable intake between groups; however, the comparison group significantly increased its vegetable intake by 0.42 cups per day. Participating in a nutrition course positively impacted participants’ dietary intake of kcalories, added sugar, and protein. Conclusions: Though VEGGI did not have a direct impact on vegetable intake at the current intensity, nutrition education positively impacts diet quality.


Comparison of Subjective Global Assessment and Protein Energy Wasting Score to Nutrition Evaluations conducted by Registered Dietitian Nutritionists in identifying risk of Protein Energy Wasting in Maintenance Hemodialysis Patients

Simon Siu-Man Sum, 2016

Objective: The study compared the 7-point Subjective Global Assessment (SGA) and the Protein Energy Wasting (PEW) Score with Nutrition Evaluations (NutrE) conducted by registered dietitian nutritionists (RDNs) in identifying PEW risk in stage five chronic kidney disease (CKD) patients on maintenance hemodialysis (MHD). Design and Methods: This study was a secondary analysis of a cross-sectional study entitled “Development and Validation of a Predictive Energy Equation in Hemodialysis”. PEW risk identified by the 7-point SGA and the PEW Score was compared against the NutrE conducted by RDNs through data examination from the original study (reference standard). Subjects: A total of 133 patients were included in the analysis. Main Outcome Measures: The sensitivity, specificity, positive and negative predictive values (PPV and NPV), and positive and negative likelihood ratios (PLR and NLR) of both scoring tools were calculated against the reference standard. Results: The patients were predominantly African American (n=112, 84.2%), non-Hispanic (n=101, 75.9%), and male (n=80, 60.2%). Both the 7-point SGA (sensitivity = 78.6%, specificity = 59.1%, PPV = 33.9%, NPV = 91.2%, PLR = 1.9, NLR = 0.4) and the PEW Score (sensitivity = 100%, specificity = 28.6%, PPV = 27.2%, NPV = 100%, PLR = 1.4, NLR = 0) were more sensitive than specific in identifying PEW risk. The 7-point SGA may miss 21.4% of patients who have PEW and falsely identify 40.9% of patients who do not have PEW. The PEW Score identified PEW risk in all patients, but 71.4% of patients so identified may not have PEW risk. Conclusions: Both the 7-point SGA and the PEW Score could identify PEW risk; the 7-point SGA was more specific and the PEW Score more sensitive. Both scoring tools can be used with clinical confidence to identify patients who are not at PEW risk.
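The screening statistics reported above all follow from a 2×2 table of the scoring tool against the reference standard. A sketch of those calculations (the cell counts in the example are hypothetical, not the study’s data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from 2x2 cell counts:
    tp/fp/fn/tn = true/false positives and negatives versus a
    reference standard. Returns proportions (not percentages)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        # likelihood ratios; guard the degenerate denominators
        "plr": sensitivity / (1 - specificity) if specificity < 1 else float("inf"),
        "nlr": (1 - sensitivity) / specificity if specificity > 0 else float("inf"),
    }
```

A tool with perfect sensitivity (fn = 0) has NLR = 0, matching the pattern the PEW Score shows above.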


Calorie and Protein Intake of Very Low Birth Weight Infants When the Nutrient Composition of Breast Milk is Measured by Human Milk Analysis

Melanie Newkirk, 2016

Purpose: Prematurity and its associated morbidity underscore the importance of meeting the targeted energy and protein requirements in VLBW infants. It is established that human milk (HM) is ideal for enteral feeding and donor breast milk (DBM) is used when mother’s own milk (MOM) is not available. Various fortification strategies are available to increase the calories and protein in HM; however, when the baseline nutritional content is unknown, it is not possible to determine if target goals are met. Use of a breast milk analyzer facilitates accurate determination of the calorie and macronutrient content of HM in order to reach the recommended calorie and protein goals. The purpose of this study was to compare the actual calorie and protein intake on days VLBW infants were fed either fortified MOM or DBM relative to established nutritional recommendations and to determine if there were differences between the groups. Methods: A retrospective medical record review was conducted in 31 VLBW infants receiving MOM or DBM. Calorie and protein composition was measured from a batch of HM prepared for a 24-hour feeding period using the Calais HM Analyzer. Data collected included milk source (MOM or DBM), type and amounts of fortifiers added, volume of enteral feedings, and weight of the infant, so that total enteral calorie and protein intake could be measured. Enteral feeding days of MOM and DBM were compared to each other and to established nutritional recommendations (110-135 kcals/kg/d and 3.5-4.5 grams protein/kg/d). Statistical analysis included the independent samples t-test, and a p-value < 0.05 was considered statistically significant. Results: A total of 145 days of enteral feedings were included, 78 (53.8%) from DBM and 67 (46.2%) from MOM. All HM was blindly fortified to at least 24 kcals/ounce with additional protein added.
In the DBM group, 18 of 78 days (23.1%), and in the MOM group, 12 of 67 days (17.9%), were fortified to an assumed concentration of 26 kcals/ounce. Energy and protein content of DBM and MOM prior to and after addition of fortifiers was similar. Fluid intake from enteral feedings was significantly higher when receiving DBM compared to MOM, 150.6 ± 7.6 ml/kg vs 146.8 ± 11.3 ml/kg (p=0.016). DBM feedings provided 110.1 ± 9.0 kcals/kg compared to 113.0 ± 21 kcals/kg from MOM feedings (p=0.275). Mean protein intake was also similar, 4.1 ± 0.5 g/kg/d on DBM days vs 4.0 ± 0.5 g/kg/d on MOM days (p=0.162). On days of HM enteral feedings, 46 of 78 DBM days (59.0%) and 30 of 67 MOM days (44.8%) were below the minimum established calorie needs of 110 kcals/kg. Protein goals of 3.5 g protein/kg were not met on 8 of 78 DBM days (10.3%) and 8 of 67 MOM days (11.9%). Conclusions: DBM in VLBW infants provides nutrient intake comparable to MOM but with a higher volume of enteral feedings. MOM and DBM feedings met protein needs with use of a high protein fortifier. The calorie content of MOM and DBM before and after fortification was lower than is assumed in clinical practice. Both types of HM may fail to meet energy needs with standard fortification at typically prescribed feeding volumes. A subset of study patients required higher calorie fortification to meet minimum energy needs, and this warrants further investigation related to growth outcomes.
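Converting a measured milk composition into per-kilogram intake, as this workflow implies, is straightforward arithmetic. A sketch under assumed inputs (1 US fl oz ≈ 29.57 mL; the feeding values in the example are hypothetical, not taken from the study):

```python
FL_OZ_ML = 29.57  # approximate mL per US fluid ounce

def enteral_intake_per_kg(volume_ml, kcal_per_oz, protein_g_per_dl, weight_kg):
    """Daily energy (kcal/kg) and protein (g/kg) from a 24-h feeding
    batch, given the measured caloric density (kcal/oz) and protein
    concentration (g per 100 mL) of the milk."""
    kcal = volume_ml / FL_OZ_ML * kcal_per_oz
    protein_g = volume_ml / 100.0 * protein_g_per_dl
    return kcal / weight_kg, protein_g / weight_kg
```

For a hypothetical 1.5 kg infant fed 225 mL of 24 kcal/oz milk at 2.8 g protein/dL, both results land inside the 110-135 kcal/kg/d and 3.5-4.5 g/kg/d targets cited above.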


Nutrition Specialists Have a Positive Impact on Weight Status Outcomes in Multi-Component Pediatric Weight Management Interventions: A Systematic Review and Meta-analysis

Kyle Thompson, 2016

Nutrition specialists are considered key members of multi-component pediatric weight management (PWM) intervention teams, but to date their contribution has not been quantified. The purpose of this systematic review and meta-analysis was to measure the effects of the nutrition specialist on PWM outcomes including BMI, BMI z-score, and waist circumference. A comprehensive search of controlled trials completed between May 2012 and December 2015 was conducted across the PubMed, SCOPUS, CINAHL, and Cochrane databases. Studies included overweight and/or obese patients from 6-18 years of age receiving outpatient weight management treatment. Data extraction was performed using a standardized tool, and was merged with a pre-existing database housed within the 2015 PWM Project of the Academy of Nutrition and Dietetics Evidence Analysis Library. Meta-analyses were conducted by provider type at selected time points. Meta-regression analyses evaluated significant differences from the reference category for the various provider types. Meta-regression analysis indicated smaller increases in BMI over time for the nutrition specialist-only category compared with the neither-nutrition-specialist-nor-behavioralist category (reference), and this difference was significant at the 3 to <6 months and 1-year to 2-years time points (p = 0.001 and 0.05, respectively). The nutrition specialist-only condition resulted in larger reductions in BMI z-score than the behavioralist-only, combined nutrition specialist and behavioralist, and reference categories. Meta-regression showed that the difference in BMI z-score between the nutrition specialist-only category and the reference category was significant at the 3 to <6 months, 6 months to <1 year, and 1-year to 2-year time points (p = 0.01, 0.05, and 0.01, respectively). Evidence indicated that longer-term PWM outcomes were better when a nutrition specialist was involved in delivering care.
To the best of the investigators’ knowledge, this research constitutes the first meta-analysis to quantify the importance of utilizing a nutrition specialist to deliver nutritional components of multi-component PWM interventions.


Relationships Between Dietary Energy and Protein Intake and Nutrition Specific Quality of Life in Individuals on Maintenance Hemodialysis

Malki Waldman, 2017

The National Kidney Foundation Kidney Disease Outcomes Quality Initiative (KDOQI) guidelines recommend 30-35 kcal/kg and 1.2 g protein/kg daily to provide optimal nutrition in a population at risk for protein energy wasting. Due to its association with morbidity and mortality, health related quality of life is assessed for individuals on maintenance hemodialysis (MHD), but using tools that minimally address nutrition. The Nutrition Specific Quality of Life (NSQOL) tool measures perceptions regarding how dialysis interferes with eating behaviors. This study explored relationships between dietary energy intake (DEI) and dietary protein intake (DPI) and NSQOL, and described food group servings among individuals on MHD using a 24-hour diet recall collected on a treatment day. A cross-sectional, secondary analysis using data collected from 144 individuals enrolled in the Development and Validation of a Predictive Energy Equation in Hemodialysis study was conducted. Spearman’s correlation coefficients were used to evaluate relationships. Participants had a mean (±SD) age of 55.9 ± 11.8 years and were mostly African American/Black (84.0%) and male (59.0%). The mean DEI was 17.7 kcal/kg and the mean DPI was 0.7 g/kg. Out of 15 points, the mean NSQOL composite score was 9.5 ± 3.8. A small positive correlation was found between NSQOL and the mean DEI (r = 0.198, p <0.01) and DPI (r = 0.241, p <0.01). The food group servings most frequently consumed were animal protein and grains. In this sample, DEI and DPI reported on a treatment day were lower than KDOQI recommendations and were associated with NSQOL. Future research should include dietary intake data collected from non-hemodialysis treatment days to further determine how NSQOL may impact overall eating behaviors.
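Comparing a recall against the KDOQI targets cited above reduces to per-kilogram arithmetic. A sketch (the intake values in the example are hypothetical; the 30 kcal/kg and 1.2 g/kg thresholds are the lower bounds stated in the abstract):

```python
def kdoqi_gap(total_kcal, total_protein_g, weight_kg,
              kcal_floor=30.0, protein_floor=1.2):
    """Per-kg dietary energy intake (DEI) and protein intake (DPI)
    from 24-h recall totals, plus flags for whether each meets the
    KDOQI minimums (30-35 kcal/kg, 1.2 g protein/kg daily)."""
    dei = total_kcal / weight_kg
    dpi = total_protein_g / weight_kg
    return dei, dpi, dei >= kcal_floor, dpi >= protein_floor
```

An 80 kg patient reporting 1400 kcal and 56 g protein on a treatment day sits well below both floors, much like the sample means reported above.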


Changes in Body Composition Measurements and Weight among Women who Completed a Six Week Food and Fitness Program for Weight Loss

Tamara Giles, 2017

Learning Outcome: To understand body composition outcomes of women in a short-term diet and exercise program. Background: The link between changes in body composition measurements and short-term weight loss programs that combine both lower calories and defined exercise is unclear. Objective: This study explored changes in waist and hip circumference, body fat, and weight among women who completed a six-week weight loss program that provided pre-prepared meals and a defined exercise protocol. The relationship between age and change in weight was also assessed. Methods: A retrospective record review was conducted of 216 adult women who completed the six-week program between January 2013 and May 2016. The participants received 3 freshly prepared meals and 2 snacks each day that met their individual calorie need. The participants also completed twice-weekly resistance training for 20 minutes per session. Results: Mean age was 46.1±11.7 years with a baseline BMI of 28.5±5.1 kg/m2. Body composition measurements decreased after 6 weeks; mean change in weight was -8.3±4.4 pounds, body fat -4.3±2.2%, waist circumference -2.0±1.4 inches, and hip circumference -1.5±0.9 inches. There was a statistically significant weak, positive correlation between age and change in weight (r=0.13, p=0.049); as age increased, weight reduction increased. Conclusion: Women who completed this six-week program that provided all food and a defined exercise program showed significant improvements in body composition measurements.


Performance of Advanced and Beyond Entry Level Practice Tasks of Rutgers’ Students and Alumni Compared to Advanced Practice Registered Dietitian Nutritionists

Michelle Romano, 2017

Background: Advanced practice (AP) and beyond entry level (BEL) clinical nutrition tasks have been identified by the Commission on Dietetic Registration’s (CDR) Advanced Level Clinical Practice Audit (ALCPA). Rutgers’ Graduate Programs in Clinical Nutrition (GPCN) offer advanced practice curricula. The study aim was to identify the frequency of performance of AP and AP/BEL practice tasks of Rutgers GPCN students and alumni, and to compare their frequency of task performance to a national sample of AP registered dietitian nutritionists identified in the ALCPA. Methods: The study was a secondary analysis of data from the CDR ALCPA and cross-sectional surveys of GPCN Doctor of Clinical Nutrition (DCN) students and Master of Science and DCN alumni. Data included frequency of task performance from respondents with an advanced degree and >20% work time in clinical practice. The percentage of task involvement was compared between the two samples by the Rao-Scott chi-square test. Results: For 11 of 16 AP tasks and 12 of the 38 AP/BEL tasks, the proportion of task involvement by the GPCN sample (n=84) was significantly greater than the ALCPA sample (n=1330) (p≤0.05). The AP tasks included: evaluating published literature to determine applicability to a practice setting, conducting in-depth nutrition focused physical examination, leading an interdisciplinary team in designing nutrition programs, analyzing safety aspects of practice, developing strategic plans, and tasks related to scientific inquiry. Conclusions: Frequency of task involvement for GPCN alumni and DCN students from a single university was significantly greater for the majority of CDR’s AP tasks compared to the ALCPA sample. This may be related to the programs’ focus on advanced practice clinical nutrition.


The Perceived Value of the Certified Nutrition Support Clinician® Credential by Healthcare Practitioners Who Hold the Credential

Jaclyn Kassoff, 2017

Background: Many healthcare practitioners attain specialist certifications that exemplify specialty knowledge in a focused area of practice. The Certified Nutrition Support Clinician® (CNSC®) credential, administered by the National Board of Nutrition Support Certification (NBNSC), is available to registered dietitians, registered nurses, physicians, physician assistants, and registered pharmacists. Individuals with this credential have demonstrated that they possess the knowledge necessary to provide safe nutrition support care. It is unknown how individuals who choose to certify value having the credential. Purpose: The purpose of this study was to determine the perceived intrinsic and extrinsic value of having the CNSC® credential among healthcare practitioners with the credential. Methods: A full analysis of a cross-sectional electronic survey of healthcare practitioners who hold the CNSC® credential is reported. The survey comprised the Perceived Value of Certification Tool (PVCT©), used with permission from the Competency and Credentialing Institute, and demographic and clinical characteristic questions. The PVCT© is a validated tool consisting of 18 Likert-type items addressing the intrinsic (12 items) and extrinsic (6 items) value of holding a credential. The survey was administered via an online survey platform; an email invitation and two subsequent reminders were sent to individuals with the CNSC® credential as of April 2016. Data are summarized using descriptive statistics. Responses of ‘no opinion’ to any PVCT© item were imputed using the individual’s mean rating across completed items. Total, intrinsic, and extrinsic value scores are presented as mean ± standard deviation; total percent agreement is reported as frequencies by collapsing the strongly agree and agree categories. Results: Of the 4419 CNSC® credential holders emailed the survey, 20.6% (n = 909) responded and one survey was excluded, for a total of 908 (20.5%) usable surveys.
The mean age of respondents was 42.0±11.5 years. The majority were registered dietitians (95.5%) and female (94.4%) with either a bachelor’s (47.3%) or master’s degree (47.6%). Fifty-seven percent of respondents spend more than 51% of their weekly work time providing nutrition support care; 44.5% do so in university or academic affiliated hospitals and 37.0% in community hospitals. The sample reported working in nutrition support as a healthcare professional for a mean of 13.0±9.0 years, most frequently in clinical care (89.5%), with adult populations (79.4%), in critical care settings (54.6%). Respondents have held the CNSC® credential for a mean of 8.3±7.4 years; 45.5% are in their initial certification cycle while 34.8% have recertified at least twice. Respondents’ total perceived value score was 60.7±8.2 out of a possible score of 72. Perceived intrinsic and extrinsic value scores were 42.2±5.6 (out of 48) and 18.5±3.4 (out of 24), respectively. Eleven of 12 intrinsic value items had over 90% total agreement. The items with the most agreement included: validates specialized knowledge, enhances feeling of personal accomplishment, and provides personal satisfaction, all above 98% total agreement. Extrinsic value items ranged from 51.0% agreement (increases salary) to 95.7% agreement (promotes recognition from peers). Conclusions: Healthcare professionals who are CNSC® credentialed reported high perceived intrinsic and extrinsic value for the credential. The majority of respondents are registered dietitians; thus the results are not generalizable across professions. These findings will assist the NBNSC in developing strategies to target items with lower perceived value.


Changes in cardiometabolic health among female employees with clinically relevant weight loss in the LIFT UP! worksite wellness program

Michelle Reed, 2017

Learning Outcome: To compare changes in cardiometabolic health from baseline to 26 weeks in female employees participating in a university-based worksite wellness program (WWP), in those with and without clinically relevant weight loss (> 5% body weight). Background: Moderate weight loss of > 5% has been shown to improve blood pressure (BP), total cholesterol, and blood glucose levels and is considered clinically relevant weight loss (CRWL). Design: This was a retrospective interim analysis of data from a university-based WWP. Participants were included if they were female and completed baseline and 26-week appointments between January 2013 and June 2016. Statistical analyses: Descriptive statistics were used to analyze participant characteristics at each time point. Paired-sample and independent-sample t-tests, McNemar tests, Wilcoxon signed rank tests, and chi-square tests were used to analyze changes and between-group differences. Results: Of the 166 participants, the mean age was 50.0±11.0 years; mean BMI at baseline was 34.2±6.6 kg/m2; 33.7% (n=22) reported a history of hypertension (HTN). The mean weight loss (n=166) at 26 weeks was 2.4±4.5% (p<0.001); 21.1% (n=35) experienced CRWL. Those with CRWL had significantly greater reductions in weight (p<0.001), BMI (p<0.001), waist circumference (p<0.001), systolic BP (p=0.028), and total cholesterol (p=0.016) than those without CRWL. Those with CRWL experienced a reduction in diastolic BP (-5.0±8.3 mmHg) more than double that of participants without CRWL (-2.2±8.3 mmHg). Conclusions: In this university WWP, weight loss of > 5% body weight was associated with significantly greater improvements in cardiometabolic health. Future research is needed to evaluate the impact of CRWL on BP in those with and without HTN, given the large proportion of participants reporting HTN at baseline.
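Clinically relevant weight loss as defined above (> 5% of baseline body weight) can be flagged with simple arithmetic. A sketch (weights hypothetical):

```python
def pct_weight_change(baseline_kg, current_kg):
    """Percent of baseline body weight lost (positive = loss)."""
    return 100.0 * (baseline_kg - current_kg) / baseline_kg

def is_crwl(baseline_kg, current_kg, threshold_pct=5.0):
    """True when weight loss exceeds the clinically relevant
    threshold (> 5% of baseline, per the definition above)."""
    return pct_weight_change(baseline_kg, current_kg) > threshold_pct
```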


Change in knowledge scores of pre-professional dietetic students following completion of a nutrition-focused physical examination computer-assisted instruction module

Erika Breitfeller, 2017

Background: Nutrition-focused physical examination (NFPE) is an integral component of nutrition assessment; competency in NFPE is included in the 2017 Dietetic Education standards. The objective of this study was to assess the change in knowledge scores of dietetic students enrolled in a computer-assisted instruction (CAI) module on NFPE. Design/Subjects: Retrospective analyses of the pre/post-test scores of students (N=31) from the Rutgers University Coordinated Program (n=18) and Dietetic Internship (n=13) enrolled in a six-week CAI module, “The Nuts and Bolts of NFPE for Dietetic Practice.” The module was housed on the Moodle learning platform. Live virtual classes and simulated practice evaluations were conducted using an Adobe Connect virtual classroom. A multiple-choice test was used to assess knowledge scores at baseline, immediately after module completion, and six weeks later. Statistics: Statistical analyses were conducted in SPSS 23.0. Changes in knowledge scores over time were assessed with Friedman’s ANOVA. Post-hoc analyses were conducted to assess when the significant change occurred. Results: There was a statistically significant increase in mean knowledge score from pre-test to first (63.17% to 72.38%, p=0.002) and second post-test (63.17% to 75.00%, p=0.001). There was no statistically significant change in knowledge score from first to second post-test (p=0.132). Conclusions: Findings revealed that, for the total sample, knowledge scores increased from baseline to immediately following completion of the CAI module and were retained at six weeks post-intervention. Future research with a larger cohort and a control group is needed to assess the effectiveness of the CAI module in teaching NFPE to pre-professional dietetic students.
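Friedman’s ANOVA, used above for the three repeated knowledge scores, ranks each subject’s scores across time points and sums the ranks per time point. A pure-Python sketch of the uncorrected chi-square statistic (no tie correction beyond average ranks; the data are hypothetical):

```python
def friedman_statistic(scores):
    """scores: list of per-subject score lists, one entry per time
    point (e.g. [pre, post1, post2]). Returns the Friedman chi-square
    statistic: 12/(n*k*(k+1)) * sum(Rj^2) - 3*n*(k+1)."""
    n = len(scores)
    k = len(scores[0])
    rank_sums = [0.0] * k
    for row in scores:
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            # tied values within a row share their average rank
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + j) / 2.0 + 1.0
            for t in range(i, j + 1):
                ranks[order[t]] = avg
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)
```

If every subject’s scores increase at every time point (k=3), the statistic reaches its maximum of 2n; identical scores across time points give 0.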


The relationship between prevalence of risk for malnutrition and tooth loss among patients aged 65 and older who came to a university based dental school clinic

Gloria Verdino, 2017

Objective: The aim of this study was to determine the prevalence of malnutrition and risk for malnutrition among community-dwelling older adults who came for care to an urban dental school clinic. The relationship between Self-Administered Mini Nutritional Assessment (Self-MNA) score and the number of natural or restored teeth was also examined. Design: A retrospective, cross-sectional analysis of new patients who came for care in a 13-month period. Statistical analyses: Descriptive statistics were used for Self-MNA scores and a Spearman’s correlation coefficient for inferential analyses. Results: Of the 119 older adults (mean age 72.6 years, 48.7% male) who completed the Self-MNA questions, 23% (n=27) were at risk for malnutrition and 5.0% (n=6) were malnourished based on Self-MNA score. Among patients with malnutrition, 83.3% (n=5) reported moderate to severe decline in food intake and weight loss > 7 pounds in the past three months, 83.3% (n=5) reported recent severe illness or stress, and 66.7% (n=4) experienced dementia or depression. All of those with malnutrition (n=6) had a BMI ≥ 23 kg/m2. The mean number of teeth decreased from 17.4 in those with a normal nutritional status to 14.4 in those with malnutrition, although this positive relationship was not statistically significant. Conclusion: Patients with malnutrition had a higher incidence of weight loss, decreased intake, depression/dementia, and severe illness than those with normal nutritional status. Future research should employ a larger sample to study the relationships between the number of natural or restored teeth and nutritional status in older adults.


The Relationships Between Sitting Time, Waist Circumference and Blood Pressure in Women Enrolled in the Rutgers’ LIFT UP Weight Management Program

Jennifer Dalton, 2017

Objective: To explore the relationships between sitting time, waist circumference, and blood pressure at baseline in female employees enrolled in a university workplace wellness program (WWP). Methods/Design: This was a secondary analysis of baseline data from female employees enrolled in a university WWP (2013-2016). The relationships between sitting time, waist circumference (WC) and blood pressure were analyzed using Spearman’s correlation coefficients. Results: Of the 381 female employees with baseline data, the mean age was 48.4 ± 10.7 (range=22.0-73.0) years. The mean WC was reflective of abdominal obesity (99.7 ± 13.0, range=73.0-146.5 cm); 60.9% were either overweight (n=111, 29.1%) or class 1 obese (n=121, 31.8%) and 50.9% (n=194) presented with surveillance hypertension. The mean sitting time for the sample was 600 minutes (10 hours) over a 24-hour day (range 120.0-1080.0 minutes/day). There was a significant weak positive relationship between sitting time and WC (r=0.212, p<0.001), indicating that as sitting time increased, WC increased. Although a trend toward a linear relationship was identified, the relationships between sitting time and systolic (r=0.089, p=0.083) or diastolic blood pressure (r=0.096, p=0.09) were not significant. Conclusions: In this sample of female university employees who were overweight/obese, as sitting time increased, waist circumference increased. These findings support the concept that sitting time is a cardiometabolic risk factor. Further research with a larger sample is needed to analyze relationships between WC and blood pressure in women in the workplace.


The relationships between dental caries in primary teeth and weight status of patients who had a comprehensive dental exam at the Pediatric Dental Clinic at Rutgers School of Dental Medicine

Deborah Salvatore, 2017

Background: There is heterogeneity in published research regarding associations between weight status and dental caries in children. The primary aim of this study was to explore the relationships between child weight status and dental caries in primary teeth in patients between the ages of 2-5 years seen in an urban dental school clinic. Methods: This was a cross-sectional study using data from the electronic dental records of 186 children. Weight status was assessed using Body Mass Index (BMI) category and BMI Z-score; dental caries in primary teeth was assessed based on the nomenclature: decayed, filled surfaces (dfs) and decayed, filled teeth (dft). Results: The mean age of the children was 43.9 months; 54.2% (n=65) were Black/African American; 40.9% were overweight or obese (n=76). Mean dfs and dft values were 9.8 and 3.7, respectively. After excluding children with no caries, mean dfs and dft values were 16.65 and 6.14, respectively. No relationship was found between dental caries experience in primary teeth and weight status for the total sample. Analyses of racial subgroups revealed a weak significant negative relationship between dft and BMI Z-scores (r=-0.27, p=0.02) in Black/African American children and a weak significant positive relationship between dfs and BMI Z-scores (r=0.28, p=0.05) in White children, suggesting that race may impact these relationships. Conclusions: In this heterogeneous sample there was no relationship between weight status and dental caries; however, among White children, dfs increased with increasing BMI Z-scores, while lower dft values were seen with higher BMI Z-scores among Black/African American children. Future research examining these relationships in racial cohorts in a larger sample is needed.


Changes in Sitting Time and Health-Related Quality of Life in Adult Female University Employees Enrolled In a Worksite Wellness Program

Nicole Buonamassa, 2017

Background: Worksite wellness programs (WWPs) are growing in popularity in the workplace and may help reduce sedentary behavior and improve health-related quality of life (HRQOL). Objective: The primary objective of this study was to examine changes in sitting time and HRQOL, and the relationship between changes in sitting time and changes in mentally unhealthy days (MUD) over time, in overweight and obese female university employees. Participants/Setting: Female adult employees of Rutgers University who completed appointments at baseline, 12, and 26 weeks were eligible. Statistical Analyses: Demographic characteristics were assessed using descriptive statistics in an interim analysis of previously collected data. Friedman’s ANOVA was used to analyze changes in sitting time and HRQOL. Spearman’s correlation was used to determine the relationship between changes in sitting time and changes in MUD over time. Results: From a sample of 378 LIFT UP participants, 118 (31.2%) were included in this study. Study participants had a mean age of 49.2 years (SD=11.2) and a baseline BMI of 33.6 kg/m2 (SD=6.1). Eighty-four percent of participants were white, non-Hispanic (50.8%, n=60) or black, non-Hispanic (33.9%, n=40). Statistically significant decreases in sitting time (-53.3 min/day; p<0.001) were reported from baseline to 26 weeks. The Wilcoxon Signed Rank test revealed that the significant change in sitting time occurred between baseline and 12 weeks (-40.8 minutes/day; p=0.002) and was maintained over 26 weeks (-53.3 minutes/day; p<0.001). Changes in HRQOL were not statistically significant. No correlation was found between changes in sitting time and changes in MUD. 
Conclusions: Statistically significant decreases in sitting time may be possible for adult female university employees enrolled in a WWP; however, because sitting times at 26 weeks remained higher than those of the general population despite these decreases, interventions to interrupt sitting time may be beneficial in future programs.


Is Nutrition Specific Quality of Life Associated with Nutritional Status?

Sarah Feasel-Aklilu, 2017

Objective: The study purpose was to explore the relationship between nutritional status, as measured by Subjective Global Assessment (SGA), and Health Related Quality of Life (HRQoL), measured using the Nutrition Specific Quality of Life (NS-QoL) tool, among participants on maintenance hemodialysis (MHD). The study aim was to determine if NS-QoL may be an adjuvant tool for detecting changes in nutritional status among patients on MHD. Design, Setting and Subjects: This is a cross-sectional, secondary analysis of data from a multi-center study. Participants were adult (>18 years) men and women on MHD (n=145) recruited from three institutions in the Northeastern United States. Methods: Statistical tests were conducted to determine the relationships between key demographic characteristics (age, sex, dialysis vintage, race and ethnicity) and SGA and NS-QoL. Spearman’s correlation examined the relationship between the independent variable, SGA, and the dependent variable, NS-QoL. A univariate general linear model was conducted to adjust for confounding variables. Main outcome measure: The relationship between overall SGA score and composite NS-QoL score. Results: The sample consisted of 85 men (58.6%), with a mean age of 55.3 ± 11.9 years, who were largely African American (84.1%) and non-Hispanic (77.2%). The mean SGA score was 5.5 ± 1.0 and the mean NS-QoL composite score was 9.51 ± 3.77. No key demographic characteristics had a statistically significant relationship with SGA, while sex (p<0.001) and race (p=0.015) both had statistically significant relationships with NS-QoL. After adjusting for race and sex, NS-QoL score was positively correlated with SGA composite score (p=0.042); as NS-QoL score increased, so did the SGA score. Conclusion: The current study found a positive linear correlation between NS-QoL composite score and SGA, as well as between five SGA subcomponent scores and NS-QoL. 
These findings indicate that NS-QoL can complement the SGA to provide information about a patient’s nutritional status.


Changes in Nutrition Focused Physical Examination Knowledge of Undergraduate Nutrition and Dietetic Students Enrolled in a Computer Assisted Instruction Module

Wendy Medunick, 2018

Objective: Nutrition focused physical examination (NFPE) is a fundamental component of nutrition assessment and essential for entry-level dietetics practice. The objective was to analyze changes in NFPE knowledge among three cohorts of undergraduate nutrition and dietetic students following completion of a computer assisted instruction (CAI) module. We hypothesized total knowledge scores would increase significantly from pre to first post-test for the total sample and there would be no significant difference in knowledge scores from first to second post-test for the 2016 and 2017 cohorts. Methods: Retrospective analyses of pre/post-test scores from undergraduate nutrition and dietetic students (n=59) who completed a 6-week multi-media CAI module on NFPE during the 2015, 2016 and 2017 spring semesters at a northeastern university were conducted. The module included orofacial examination, dysphagia screening, assessment of fat and muscle wasting and malnutrition identification. A 48-item multiple choice test was constructed and given before and immediately following the CAI to all cohorts. In 2016 and 2017, a second post-test was added 6-weeks after the first post-test to assess knowledge retention. Paired t-tests and repeated-measures ANOVA were used to analyze change in knowledge scores. Results: Of the 59 students who completed the CAI, total knowledge scores increased significantly (60.8% to 82.7%, p<0.001) from pre to first post-test for the total sample. For cohorts 2016 and 2017, total knowledge scores increased significantly (60.8% to 85.8%, p<0.001) from pre to second post-test and first to second post-test (83.1% to 85.8%, p=0.020). Conclusions: Students who completed a 6-week CAI module on NFPE significantly increased their knowledge scores immediately following the module. The 2016 and 2017 cohorts demonstrated further increases in knowledge scores over time. 
Future research using a control group and a traditional in-person instruction group is needed to determine the efficacy of CAI to teach NFPE.


Volume-based versus Rate-based Enteral Nutrition in the Intensive Care Unit: Impact on Nutrition Delivery and Glycemic Control

Susan Roberts, 2018

Background: Underfeeding with enteral nutrition (EN) is prevalent in intensive care units (ICUs) and associated with negative outcomes. This study evaluated the impact of volume-based EN (VBEN) versus rate-based EN (RBEN) on delivery of prescribed calories and protein, and on glycemic control. Methods: This retrospective study included adult patients requiring mechanical ventilation within 48 hours of ICU admission and with a RBEN (n=85) or VBEN (n=86) order for ≥3 consecutive days during the first 12 ICU days. Results: Patients on VBEN, versus RBEN, received more of their prescribed calories (RBEN, 67.6%; VBEN, 79.6%; P<.001) and protein (RBEN, 68.6%; VBEN, 79.3%; P<.001). Multiple linear regression analyses confirmed VBEN was significantly associated with an 8.9% increase in calories (P=.002) and a 7.7% increase in protein (P=.004) received, after adjusting for age, APACHE II score, duration of and initiation day for EN, and ICU admission location. Presence of hyperglycemia (P=.40) and glycemic variability (GV) (P=.99) were not different between the two groups. After adjusting for age, body mass index, diabetes history, primary diagnosis and percent of days on corticosteroids, glycemic control outcomes (presence of hyperglycemia, P=.27; GV, P=.67) remained unrelated to EN order type in multivariable regression models. Conclusion: VBEN, compared to RBEN, was associated with increased calorie and protein delivery without adversely affecting glycemic control. These results suggest VBEN is an effective, safe strategy to enhance EN delivery in the ICU.


The Relationships between Physical Activity and Cardiometabolic Risk Factors Among Women Participating in a University-Based Worksite Wellness Program

Kimberly Gottesman

Objective: Associations between changes in physical activity (PA) and cardiometabolic risk factors among women with overweight/obesity enrolled in a university-based worksite wellness program (WWP) were examined. Methods: Data from 173 women who completed a 26-week WWP were analyzed retrospectively. Participants completed diet and PA assessments and received individualized diet/lifestyle counseling at baseline, and 12 and 26-weeks thereafter. Anthropometrics, blood pressure and serum cholesterol were measured; PA was self-reported using the International Physical Activity Questionnaire-short form at each visit. Results: Significant improvements in anthropometrics (P<0.001), blood pressure (P<0.001), total cholesterol (P=0.014), and PA (P=0.007) were found at 26-weeks. In adjusted linear regression models, a 10-metabolic-equivalent-minute increase in PA was associated with 0.01% corresponding decreases in weight and waist circumference. Conclusions: Among women who completed this WWP, increased PA was associated with reductions in anthropometric measures.


Characteristics and drivers of the registered dietitian nutritionist’s sustained involvement in clinical research activities: A mixed methods study

Melinda Boyd, 2018

Background: Evidence-based practice is the foundation for clinical dietetics. Research contributions by registered dietitian nutritionists (RDNs) are an important part of developing the profession. Research involvement has been studied in RDNs, but little is known about the drivers of participation in research while working clinically. Objective: To explore the characteristics of established RDN clinician researchers, determine their level of research involvement, and identify key drivers contributing to their continued success. Design: We utilized a convergent parallel mixed methods study design. Research involvement was examined using the Practice-Based Dietitian Research Involvement Survey (PBDRIS). Workplace support was assessed using the Research Capacity and Culture (RCC) survey. These validated survey tools were combined to create the Clinician Researcher Survey (CRS). Semi-structured interviews were used to investigate key themes among established RDN clinician researchers who were active at higher levels of research involvement. Participants/Setting: Practicing RDN authors in the United States who published research in 2015-2016 (n=29) in select clinically relevant journals were surveyed on research involvement and key characteristics. Drivers for continued research involvement were identified. A subsample (n=10) participated in semi-structured interviews. Results: Respondents were white (95.8%) females (100%) holding a graduate degree (95.8%), with a mean age of 47.5 ± 12.5 years and 23.0 ± 12.3 years of experience in dietetics. Research involvement scores from the PBDRIS ranged from 60.0 to 97.5%, corresponding to levels 3 and 4 of the research continuum. Only one median rating on the RCC, for the team domain, fell below an adequate score (<5.00). Interviews with established RDN clinician researchers identified exposure, curiosity, and dedication as three overarching themes, with eight subthemes, that signified drivers of continued involvement. 
Conclusion: Higher levels of confidence and skill in research were achieved among RDN clinician researchers who had an available mentor, support from their workplace environment, and personal drive. Key Words: registered dietitian nutritionist, research involvement, clinician researcher, mixed methods


Characteristics and drivers of the registered dietitian nutritionist’s sustained involvement in clinical research activities: A mixed methods study

Stephanie Gall, 2018

Background: Evidence-based practice is the foundation for clinical dietetics. Research contributions by registered dietitian nutritionists (RDNs) are an important part of developing the profession. Research involvement has been studied in RDNs, but little is known about the drivers of participation in research while working clinically. Objective: To explore the characteristics of established RDN clinician researchers, determine their level of research involvement, and identify key drivers contributing to their continued success. Design: We utilized a convergent parallel mixed methods study design. Research involvement was examined using the Practice-Based Dietitian Research Involvement Survey (PBDRIS). Workplace support was assessed using the Research Capacity and Culture (RCC) survey. These validated survey tools were combined to create the Clinician Researcher Survey (CRS). Semi-structured interviews were used to investigate key themes among established RDN clinician researchers who were active at higher levels of research involvement. Participants/Setting: Practicing RDN authors in the United States who published research in 2015-2016 (n=29) in select clinically relevant journals were surveyed on research involvement and key characteristics. Drivers for continued research involvement were identified. A subsample (n=10) participated in semi-structured interviews. Results: Respondents were white (95.8%) females (100%) holding a graduate degree (95.8%), with a mean age of 47.5 ± 12.5 years and 23.0 ± 12.3 years of experience in dietetics. Research involvement scores from the PBDRIS ranged from 60.0 to 97.5%, corresponding to levels 3 and 4 of the research continuum. Only one median rating on the RCC, for the team domain, fell below an adequate score (<5.00). Interviews with established RDN clinician researchers identified exposure, curiosity, and dedication as three overarching themes, with eight subthemes, that signified drivers of continued involvement. 
Conclusion: Higher levels of confidence and skill in research were achieved among RDN clinician researchers who had an available mentor, support from their workplace environment, and personal drive. Key Words: registered dietitian nutritionist, research involvement, clinician researcher, mixed methods


The Perceived Value of the Advanced Practice Certification in Clinical Nutrition by Registered Dietitian/Nutritionists Who Hold the Credential and Potentially Eligible Candidates

Jami Baltz, 2018

Background: The purpose of this study was to determine the perceived value of the RDN-AP certification among RDN-APs and RDNs who were potentially eligible to take the advanced practice credentialing exam. Design/Subjects: A web-based survey including the Perceived Value of Certification Tool (PVCT©) was emailed to 37 RDN-APs and 11,150 RDNs who were potentially eligible to take the RDN-AP exam as of September 2017. Statistics: Likert scale responses obtained from the PVCT© were analyzed as a total value score, two sub-scales (intrinsic value and extrinsic value), and percent agreement. The relationships between the total perceived value score, number of years in clinical nutrition practice, and the number of post-RDN clinical nutrition related credentials were also evaluated. Data were analyzed using descriptive statistics and correlations. Results: Respondents’ (N=138; RDN-APs, n=15; potentially eligible candidates, n=123) median age was 43 years, with a median of 16 years of clinical practice. The median value scores were: total, 54.0 out of 72; intrinsic, 37.9 out of 48; and extrinsic, 17.0 out of 24. The mean total percent agreement for all value statements was 74.6% (RDN-APs, 87.8%; potentially eligible candidates, 72.9%). The intrinsic value statement “personal accomplishment” had the highest percent agreement (87.0%), while the extrinsic value statement “increases salary” had the lowest (36.2%). RDN-APs had a significantly higher median intrinsic value score (45.3) (p=0.002) and total perceived value score (61.0) (p=0.010) than potentially eligible candidates (36.0 and 54.0, respectively). There was a moderate positive correlation (r=0.334, p=0.002) between the total perceived value score and the number of post-RDN credentials for the total sample. Conclusions: Respondents agreed the RDN-AP credential held value; RDN-APs had a higher perceived value than potentially eligible candidates. 
Respondents with one or more post-RDN credential had a higher total perceived value of the RDN-AP credential than those with none.


Exploring the Differences Between Early and Traditional Diet Advancement in Postoperative Feeding Outcomes in Patients With an Ileostomy or Colostomy

Sabrina Toledano

Objective: To assess differences in postoperative feeding outcomes between early and traditional diet advancement in patients who had an ileostomy or colostomy creation. We hypothesized that patients who received early diet advancement would pass flatus and produce their first ostomy output earlier than patients who received traditional diet advancement. Methods: At a U.S. tertiary care hospital, data from patients who underwent a new ileostomy or colostomy creation from June 1, 2013 to April 30, 2017 were extracted from a prospectively maintained institutional outcomes database. Patients who received early diet advancement (postoperative day 0 or 1) were compared to those who received traditional diet advancement (postoperative day 2 or later) on demographics, preoperative risk factors (BMI, primary diagnosis) and operative features (surgical approach, type of ostomy). The postoperative feeding outcomes included time to first flatus and time to first ostomy output. Mann-Whitney U tests determined bivariate differences in postoperative feeding outcomes between the diet advancement groups. Poisson regression was used to adjust for unequal baseline characteristics. Results: Data from 255 patients were included; 204 (80.0%) received early diet advancement and 51 (20.0%) had traditional diet advancement. The mean age was 56.5 years (total sample), with slightly more males (52.5%) than females (47.5%). Cancer was the most frequently reported primary diagnosis (42.4%), with no difference between groups. Time to first flatus and time to first ostomy output were significantly shorter in the early compared to the traditional diet advancement group (median difference of 1 day for both, p<0.001). 
Group differences remained significant after adjusting for confounders (American Society for Anesthesiology Physical Status Classification System, surgical approach, type of ostomy, and type of resection) for both time to first flatus (β=1.32, p=0.01) and time to first ostomy output (β=1.41, p<0.001). Conclusions: Early diet advancement is associated with earlier return of flatus and first ostomy output compared to traditional diet advancement after the creation of an ileostomy or colostomy. Keywords: ileostomy, colostomy, early diet, traditional diet, flatus, ostomy output


Self-reported physical activity, anthropometric measures and health-related quality of life measures in female University employees participating in one of two worksite wellness weight management studies: An interim analysis

Samantha Nuzio, 2018

Background: Physical activity (PA) and health-related quality of life (HRQoL) may be related; prior research has shown that participants who increase PA can also improve HRQoL. However, results are inconsistent, and these relationships are not commonly explored in the worksite setting. Introducing PA as part of a worksite wellness program (WWP) may improve HRQoL. Objective: To compare PA, HRQoL, and anthropometric outcomes of two WWPs, LIFTUP and THEBEST. Participants/Setting: Female employees at Rutgers University with overweight or obesity who completed the baseline and 12-week appointments of the LIFTUP and THEBEST protocols on the Central New Jersey campuses. Outcomes: PA, HRQoL, weight, waist circumference, and body fat variables were analyzed within and between groups at baseline and 12 weeks later. Statistical Analyses: Demographic characteristics were assessed using descriptive statistics. Inferential statistics were used to analyze within- and between-sample differences in PA, HRQoL and anthropometrics. Results: The mean age of all participants (n=103) was 47.51 years; 74.80% (n=77) of the total sample reported White race/ethnicity. PA improved significantly from baseline to week 12 in LIFTUP participants (582.88 MET-minutes/week) (p<0.01); there were significant differences between groups in changes in PA at week 12 (p=0.03), with LIFTUP showing a greater increase compared to THEBEST. A weak inverse correlation was found between change in PA and change in physically unhealthy days in LIFTUP participants (r=-0.27, p=0.04). Conclusions: Greater improvements in PA were seen in the participants without a PA monitor compared to those who were provided a monitor. In these same participants, as PA increased, physically unhealthy days decreased. Further research is needed in order to generalize results, as the sample was small.


Health Professions Students’ Attitudes towards Interprofessional Education

Jessie Sullivan, 2018

Objective: To examine health professions students’ attitudes towards interprofessional education (IPE) using the interprofessional relationships and interaction subscales of the University of West England Interprofessional Questionnaire (UWE-IQ). Methods/Design: This was a retrospective cross-sectional study. Data were collected via electronic survey from health professions students who attended IPE case conference events in November 2014 or 2016. Health professions students’ attitudes towards IPE were explored and findings across health profession programs were examined. Data were analyzed using descriptive and inferential statistics. A priori alpha was set at p = 0.05. Results: Data from 83.6% (n=437) of surveys were utilized. The majority of students’ attitudes towards interprofessional relationships were positive (n=372, 86.5%). The majority of students’ attitudes towards interprofessional interaction were negative (n=220, 50.3%). There were no significant differences in interprofessional relationships [F(9, 420)=1.023, p=0.421] or interaction [F(9, 427)=1.414, p=0.179] domain scores across health profession programs. Significant differences were found in the interprofessional relationships (p=0.006) and interaction (p=0.003) domain scores between sexes. Females had significantly less positive views (mean=31.18±3.68) towards interprofessional relationships than males (mean=32.26±3.77), and females had significantly more negative views (mean=22.40±5.66) towards interprofessional interactions than males (mean=24.37±6.00). Applications/Conclusions: Health professions students have positive attitudes towards interprofessional relationships and negative attitudes towards interprofessional interactions. Significant differences in IPE attitudes were not found across health profession programs, however differences existed between sexes. Females had lower mean scores in both the interprofessional relationships and interaction domains compared to males. 
Additional research should be conducted to determine why differences in IPE attitudes exist between sexes.


Changes in Food Intake Patterns of Rutgers Employees Participating in a Worksite Wellness Program

Alexandra Musarra, 2018

Background: Research on weight loss interventions focusing on changes in dietary behaviors has been published, but there is limited evidence measuring these relationships in the workplace setting. Objective: To evaluate the relationships between changes in food intake patterns and changes in weight, waist circumference and body mass index (BMI) among female employee participants in a worksite wellness program (WWP). Design: A retrospective study design. Participants/setting: Female employee participants in Lifestyle Intervention For Total health - a University Program (LIFT UP) between October 2013 and June 2017. Intervention: A 12-week active intervention of a university-based WWP at Rutgers University. Main outcomes measured: Changes in weight, waist circumference, BMI and daily food intake patterns (fruit, vegetable, added sugar and sugar-sweetened beverages) from baseline to 12 and 26 weeks and between 12 and 26 weeks. Statistical analyses: Repeated measures ANOVA was used to analyze changes in anthropometric measurements and daily food intake patterns; Spearman’s correlation coefficient was used to analyze relationships between changes in anthropometric measurements and changes in food intake patterns. Results: The mean age of the sample was 50.0 ± 10.0 years; 61.6% were white (n=125). There was a statistically significant weak, negative correlation between changes in vegetable intake and changes in weight and BMI from baseline to 12 weeks (r=-0.26, p=0.01 and r=-0.28, p=0.003, respectively) and a weak, positive correlation between changes in added sugar intake and changes in weight and BMI from baseline to 12 weeks (r=0.26, p=0.01 and r=0.27, p=0.01, respectively). Conclusions: In this university-based WWP, increases in vegetable intake were associated with decreases in body weight and BMI measurements over the 12-week active intervention. 
Similarly, decreases in added sugar intake were associated with decreases in body weight and BMI measurements over the same time period. Implementation of WWPs including lifestyle counseling and diet education may improve dietary behaviors and anthropometric measures in females with overweight or obesity. However, randomized controlled trials are needed to determine causality.