Saturday

E-Portfolio for mid-term report on my SONAR FLC project

As part of my participation in this year-long faculty learning community (FLC), I prepared a mid-term e-portfolio detailing my project plans. Here it is; may you find it useful and/or interesting:

Slayter - SONAR FLC Portfolio

Monday

On linking course objectives to student grades - an experiment

Slayter - November 16, 2009 - Faculty Meeting 2

On using "evidence-based practice" as the lens for the new iteration of my course

Evidence-based practice is a jargony term bandied about by students, professors and social work practitioners alike - and we often mean different things by it. Some people take it to mean "research proves that this intervention works," some take it to mean "collecting data about your intervention to show whether it works," and some, like me, embrace the idea that "evidence-based practice is a process of consideration."

During the course of planning for and running this fall's SONAR FLC, I have also been working with our School of Social Work's field education department (each student has at least one internship as part of their graduate work) to think about how to bring our field work instructors under the same umbrella re: what EBP is...especially now that we will introduce this framework to our students.  We need to bridge the gap between academic experience/thinking and community-based social work practice...

We use this definition in our conversations "with the field":
 
Definition: Evidence-based practice (EBP) is a process that can aid decision making based on the best scientific evidence, as opposed to operating from gut feelings, tradition, prior education, or even a particular helping professional's experience. Helping professionals (physicians, nurses, counselors, social workers, dentists, etc.) can use EBP to pose specific questions of vital significance to their clients/patients, then search electronically, quickly and efficiently, for the current best evidence to answer each question. Evidence-based practice is possible now because of the increased speed, accessibility, and utility of electronic access to evidence (assuming access to that evidence, the ability to interpret it - and the time to do this work).

Much of my understanding of EBP comes from Gibbs, who states that evidence-based practice represents both an ideology and a method. The ideology springs from the ethical principle that clients deserve to be provided with the most effective interventions possible. The method of EBP is the way we go about finding and then implementing those interventions (see, e.g., manuals on EBP methods by Gibbs, 2003; Cournoyer, 2003; and Rubin, 2007).
 
Evidence-based practice represents the practitioner's commitment to use all means possible to locate the best (most effective) evidence for any given problem at all points of planning and contact with clients. This pursuit of the best knowledge includes extensive computer searches, as described by Gibbs and Gambrill (2002). Evidence-based practice is an enormous challenge to practitioners because the methods of locating the most effective interventions go beyond, or are more rigorous than, even those of empirically-based practice.

Points to go over with the group:

-History of EBP and evidence-supported instruments/interventions in medical research vs. how EBP is discussed in broader social work circles
-Gibbs' step-by-step method of considering evidence for use in practice - "evidence-based practice" as a process

Step 1. Develop A Question.

Step 2. Find the Evidence.

Step 3. Analyze the Evidence.

Step 4. Combine the Evidence with Your Understanding of the Client and Situation.

Step 5. Application to Practice.

Step 6. Monitor and Evaluate Results.

-Here, there is a big discussion to be had about the debate over qualitative vs. quantitative evidence

Discussion on challenges in the field re: implementation of an EBP process:

-The problem of accessing and interpreting evidence (note: I think we should work towards getting library database access for field instructors and offering a refresher course on how to interpret evidence)
-The problem of time to do this work in the current agency environment
 
So, much of this overlaps with the new approach to the old course...and this is the basis for the lens/framework I will be using in the new course.


Wednesday

On teaching about the process of research...research as a second language

Most research courses that I have taken focus on teaching about the process of research (see graphic) - and expect what I refer to as "research as a second language" to be picked up along the way. I have decided to take a different approach in my own teaching, in great part to attempt to reduce student anxiety by acknowledging from the start that research is a second language.

Research as a second language:
The importance of demystification and translation in teaching social work research

Elspeth Slayter, Ph.D., M.S.W., School of Social Work, Salem State University

This essay addresses my evolving pedagogical philosophy for teaching research methods to graduate-level social work students, particularly first-generation graduate students with some work experience. Over the past two years, I have observed my students entering class with emotions ranging from anxiety to disdain with respect to their need to take this required course. Since my own experience as a somewhat recalcitrant graduate social work research student a decade ago, I have valued the importance of a humorous, supportive and down-to-earth teaching approach in research courses. While my students do not always share my humor, they do respond to my efforts to "demystify" what research consists of, how it is created and what it really means. Key to this effort is a constant "translation" process that renders the language of research in more accessible and familiar terms that students can embrace while developing basic skills in both the interpretation and creation of research. As the semester unfolds, my efforts to help students translate between their worlds of work and the language of research culminate in the end product of "demystification."

Engagement: "Start where the client is": Across the country, social work instructors face the challenge of teaching applied research to an often resistant and fearful audience (Green, Bretzin, Leininger, & Stauffer, 2001). Much of my work involves taking a supportive, "cheerleading/coaching" approach at the start of the semester, especially during the first class. Beginning with my first contact with students (in the classroom or on a coffee break), I am mindful of the social work adage "start where the client is." Specifically, my first step involves assessing both my students' feelings about the course (as an engagement tool) and their knowledge level (in order to pitch my lecture at the right level). Generally, students indicate that they are "anxious," "fearful" or "upset" about the course. Moving into my second step, I use these responses to frame what comes next by talking about the measurement of feelings in research (e.g., "How do you measure anxiety? Here's how several researchers have 'operationalized' this concept. Now that's a big word; let's break it down. See, you get this stuff!"). Key to this process is the use of humor to quell anxiety, bust stereotypes and begin to help students build their competency in research methods. Often, my third step involves a group conversation, working with each student in front of the class to think about the ways in which research impacts their lives, even if they do not conceptualize it as such. As the anxious shoulders drop and the furrowed brows relax a bit, I move into my fourth step, the provision of a detailed narrative of the syllabus and course requirements, focusing on how each topic relates to social work practice. Finally, I spend a significant amount of time in the first two weeks helping students to identify areas of passion - both positive ("What do you love about social work practice?") and negative ("What bothers you most about your field of practice?").

Translation: Making it real (and keeping it real): Much of the pedagogical literature on teaching this course to social work students supports the use of applied projects that "make research real" (Jackson & Sedehi, 1998; Sells & Smith, 1997). The primary way in which I prepare to "make research real" for my students involves the conduct of both short-term projects (e.g., 30-minute ethnographic observations, assessments of clinical scales used in agency settings) and semester-long research projects (e.g., research proposal development and/or implementation) focused on students' experiences in their field placements. These assignments are key to the process of "translation," and they allow students to understand "the language of research" in their own terms. I also find that students do better on these tasks if they are not functioning on their own, which supports my belief in the power of cooperative learning (Garrett, 1998). Further, the use of short-term projects allows me to "prime the pump" for class discussions. Throughout the semester, my students generally comment that while "it was a pain" to conduct short-term projects each week, they felt that they had engaged with the material in a more in-depth manner, and that they achieved more mastery of the concepts taught. I often include small collaborative group exercises in each class, which allow for the development of confidence and the ability to compare notes. Each class session ends with each group talking about their goals and work plan for the coming week, allowing students to share ideas and problem-solving strategies.

Demystification: A process that unfolds slowly: While much of the remainder of the semester involves recycling the techniques described above, I am always careful to conduct regular "temperature checks" related to how students are feeling. Often, this involves asking students to reflect on what they have learned since the start of the semester. Alternately, this can involve something as simple as supporting students in the process of navigating course materials, discussions and assignments by using humor in asking questions such as "OK, so on a scale of 1-5, how clear was that? Clear as mud? Clear as the blue sky? Or somewhere in between? You may feel anxious about understanding this now, but remember how you were able to finally understand the difference between independent and dependent variables? You can do this." I have also begun to institute the use of non-graded self-assessment techniques (both in class and via WebCT) to gauge mastery of basic concepts and areas in which additional instructional support might be needed. Finally, while research may not be completely demystified in my course, I end the semester with an informal wrap-up assessment targeted at showing students just what it is they have learned.


References

Garrett, K. (1998). Cooperative learning in social work research courses: Helping students help one another. Journal of Social Work Education, 34(2).

Green, R., Bretzin, A., Leininger, C., & Stauffer, R. (2001). Research learning attributes of graduate students in social work, psychology and business. Journal of Social Work Education, 37(2).

Jackson, A., & Sedehi, J. (1998). Homevisiting: Teaching direct practice skills through a research project. Journal of Social Work Education, 34(2).

Sells, S., & Smith, T. (1997). Teaching ethnographic research methods in social work: A model course. Journal of Social Work Education, 33(1).

Beginning to think about how to measure student engagement - quantitatively

Once our FLC got on the same page about the need to measure process measures, outcome measures, or both related to student engagement, we had a good discussion about what constitutes "engagement."

The idea of "engagement" is near and dear to my heart - as it was central to my dissertation research on engagement in substance abuse treatment among people with intellectual disabilities.  In that project, I operationalized engagement to include a measure of the proportion of expected visits attended within a certain timeframe after beginning, or in the parlance of health services researchers, intiating, substance abuse treatment.  In applying this thinking to engaging graduate students in required research courses, I realized that while attendance is part of the picture - it is not all.  As with my dissertation research, no study is perfect - but in thinking about how to measure engagement in this project, I went back to the literature on student engagement. 

The memo (more accurately, a list of links/items on engagement from others) pasted in below is something I prepared for our FLC members for discussion on this topic after going back to the literature. We had a wide-ranging discussion of how we might use each of the approaches listed in this memo to measure our own students' engagement in our required research courses. We also talked about some of the individual questions posed by the NSSE and how we might tweak them. My own approach, informed by reviewing all of this material, resulted in this draft survey instrument (click here; you will need to enter data in order to see the entire survey). In designing this survey, I used the best practice of designing "table shells" ahead of time, to visualize what my data would look like once collected, in order to make sure that my measurement approaches were testable in the ways I needed them to be (see tables here).
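For anyone unfamiliar with table shells, here is a minimal sketch of the idea (in Python/pandas, with hypothetical row and column labels rather than my actual survey categories); the point is simply that the empty table exists before any data do:

# A minimal "table shell": the table layout is fixed before data
# collection so you can check that your measures will actually fill
# it. The row and column labels below are hypothetical placeholders.
import pandas as pd

shell = pd.DataFrame(
    data="",  # cells stay empty until data are collected
    index=["Low engagement", "Medium engagement", "High engagement"],
    columns=["Section A", "Section B"],
)
shell.index.name = "Self-reported engagement"
print(shell)

Sketching the shell first forces the question: will my survey items actually yield numbers that belong in these cells?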

MEMORANDUM
November 17, 2010
To:          SONAR FLC Members
From:    Elspeth
Re:         Approaches to measuring student engagement
Clinical measurements of engagement: 
Attendance at X out of X sections (rate)
Measures of student engagement:
Graduate National Survey of Student Engagement (NSSE)
This survey is designed to measure the degree of graduate student engagement in college activities that correlate to student learning and personal development. This survey includes questions on class participation, academic rigor, amount of effort put into class work and overall experience.

Reports: 2007/08 Graduate National Survey of Student Engagement Summary.pdf

Table 1: In your postbaccalaureate experience at CSU Stanislaus, about how often have you done each of the following? (percent agreement)

a) Asked questions in class or contributed to class discussions - 50.0%
b) Made a class presentation - 44.6%
c) Prepared two or more drafts of a paper or assignment before turning it in - 31.9%
d) Worked on a paper or project that required integrating ideas or information from various sources - 54.3%
e) Included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments - 42.4%
f) Came to class without completing readings or assignments - 6.5%
g) Worked with other students on projects during class - 33.3%
h) Worked with classmates outside of class to prepare class assignments - 25.3%
i) Put together ideas or concepts from different courses when completing assignments or during class discussions - 36.7%
j) Tutored or taught other students (paid or voluntary) - 13.2%
k) Participated in a community-based project (e.g., service learning) as part of a regular course - 6.6%
l) Used an electronic medium (listserv, chat group, Internet, instant messaging, etc.) to discuss or complete an assignment - 46.7%
m) Used e-mail to communicate with an instructor - 49.5%
n) Discussed grades or assignments with an instructor - 28.9%
o) Talked about career plans with a faculty member or advisor - 19.6%
p) Discussed ideas from your readings or classes with faculty - 23.1%
q) Received prompt written or oral feedback from faculty on your academic performance - 27.2%
r) Worked harder than you thought you could to meet an instructor's standards or expectations - 20.7%
s) Worked with faculty members on activities other than course work (committees, orientation, student life activities, etc.) - 9.8%
t) Discussed ideas from your readings or classes with others outside of class (students, family members, co-workers, etc.) - 34.8%
u) Had serious conversations with students of a different race or ethnicity than your own - 34.8%
v) Had serious conversations with students who are very different from you in terms of their religious beliefs, political opinions, or personal values - 32.6%

Table 2: How often has your coursework emphasized your engagement in the following mental activities? (percent agreement)

a) Memorizing facts, ideas, or methods from your courses and readings so you can repeat them in pretty much the same form - 16.3%
b) Analyzing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components - 43.5%
c) Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships - 39.1%
d) Making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions - 40.2%
e) Applying theories or concepts to practical problems or in new situations



ERIC Identifier:  ED482269
Publication Date: 2003-09-00
Author: Chapman, Elaine
Source: ERIC Clearinghouse on Assessment and Evaluation
Assessing Student Engagement Rates. ERIC Digest.  
Given the emphasis placed on levels of academic achievement in schools, the way in which students acquire knowledge through the learning process has become a primary concern. Several studies have highlighted the significant role that affective factors can play in learning (e.g., Mathewson, 1994; Wigfield, 1997), placing particular emphasis on student engagement. This Digest defines student engagement and describes various methods used to measure it, both in empirical research studies and at the classroom level.
"WHAT IS STUDENT ENGAGEMENT?"
Early studies of student engagement often focused on time-on-task behaviors (e.g., Fisher, et al., 1980; Brophy, 1983). More recently, however, other definitions have appeared in the literature. Student engagement has been used to depict students' willingness to participate in routine school activities, such as attending classes, submitting required work, and following teachers' directions in class. For example, Natriello (1984) defined student engagement as "participating in the activities offered as part of the school program" (p. 14). Negative indicators of engagement in this study included unexcused absences from classes, cheating on tests, and damaging school property.
Another definition focuses on more subtle cognitive, behavioral, and affective indicators of student engagement in specific learning tasks. This orientation is reflected well in the definition offered by Skinner & Belmont (1993):
Children who are engaged show sustained behavioral involvement in learning activities accompanied by a positive emotional tone. They select tasks at the border of their competencies, initiate action when given the opportunity, and exert intense effort and concentration in the implementation of learning tasks; they show generally positive emotions during ongoing action, including enthusiasm, optimism, curiosity, and interest.
The opposite of engagement is disaffection. Disaffected children are passive, do not try hard, and give up easily in the face of challenges [they can] be bored, depressed, anxious, or even angry about their presence in the classroom; they can be withdrawn from learning opportunities or even rebellious towards teachers and classmates. (p. 572)
From a different perspective, Pintrich and De Groot (1990) associated engagement levels with students' use of cognitive, meta-cognitive and self-regulatory strategies to monitor and guide their learning processes. In this view, student engagement is viewed as motivated behavior, apparent from the kinds of cognitive strategies students choose to use (e.g., simple or "surface" processing strategies such as rehearsal versus "deeper" processing strategies such as elaboration), and from their willingness to persist with difficult tasks by regulating their own learning behavior.
Use of cognitive and meta-cognitive strategies (e.g., "I went back over things I didn't understand" and "I tried to figure out how today's work fit with what I had learned before") may be taken to indicate active task engagement, while use of shallow strategies (e.g., "I skipped the hard parts") may be taken to indicate superficial engagement (Meece, Blumenfeld, & Hoyle, 1988).
"HOW IS STUDENT ENGAGEMENT MEASURED?"
The most common way that student engagement is measured is through information reported by the students themselves. Other methods include checklists and rating scales completed by teachers, observations, work sample analyses, and case studies. Each of these methods is described briefly below.
"Self-Reports." Students may be asked to complete surveys or questionnaires 
regarding their level of task engagement. Items relating to the cognitive aspects of engagement often ask students to report on factors such as their attention versus distraction during class, the mental effort they expend on these tasks (e.g., to integrate new concepts with previous knowledge), and task persistence (e.g., their reaction to perceived failure to comprehend the course material). Students can also be asked to report on their response levels during class time (e.g., making verbal responses within group discussions, looking for distractions, and engaging in non-academic social interaction) as an index of behavioral task engagement. Affective engagement questions typically ask students to rate their interest in and emotional reactions to learning tasks on indices such as choice of activities (e.g., selection of more versus less challenging tasks), the desire to know more about particular topics, and feelings of stimulation or excitement in beginning new projects.
In addition to asking the question of whether students are engaged in learning tasks, self-report measures can provide some indication of why this is the case. Research into achievement goal orientations, for example, has indicated positive relationships between task or mastery goals, which reflect a desire for knowledge or skill acquisition, and students' use of effective learning strategies (e.g., Covington, 2000). Studies have also demonstrated positive relationships between students' perceived learning control and adaptive learning processes (e.g., Strickland, 1989; Thompson et al., 1998).
"Checklists and Rating Scales." In addition to student self-report measures, a few studies have used summative rating scales to measure student engagement levels. For example, the teacher report scales used by Skinner & Belmont (1993) asked teachers to assess their students' willingness to participate in school tasks (i.e., effort, attention, and persistence during the initiation and execution of learning activities, such as "When faced with a difficult problem, this student doesn't try"), as well as their emotional reactions to these tasks (i.e., interest versus boredom, happiness versus sadness, anxiety and anger, such as "When in class, this student seems happy"). The Teacher Questionnaire on Student Motivation to Read developed by Sweet, Guthrie, & Ng (1996) asks teachers to report on factors relating to student engagement rates, such as activities (e.g., enjoys reading about favorite activities), autonomy (e.g., knows how to choose a book he or she would want to read), and individual factors (e.g., is easily distracted while reading).
"Direct Observations." Although self-report scales are widely used, the validity of the data yielded by these measures will vary considerably with students' abilities to accurately assess their own cognitions, behaviors, and affective responses (Assor & Connell, 1992). Direct observations are often used to confirm students' reported levels of engagement in learning tasks. A number of established protocols are available in this area (e.g., Ellett & Chauvin, 1991). Most of these observational studies have used some form of momentary time sampling system. In these methods, the observer records whether a behavior was present or absent at the moment that the time interval ends or else during a specific time period.
In classwide observations, approximately 5 minutes of observational data can generally be collected on each target student per lesson. Thus, a 30-minute observation period would allow observations of approximately 5 target students, with 6 to 7 sessions being required to observe a full class. In addition, to obtain a representative sample of students' behavior over the full course of a lesson, observations are generally rotated across students so that each student is observed continuously for only one minute at a time.
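[An aside from me, not part of the Digest: the rotation logic described above is easy to make concrete. Here is a minimal sketch in Python, with hypothetical student names, of a one-minute round-robin observation schedule for a 30-minute period.]

# Sketch of the momentary time-sampling rotation described above:
# each target student is observed for one minute at a time, rotating
# through the group across a 30-minute period. Names are hypothetical.

def rotation_schedule(students, period_minutes=30):
    """Assign each one-minute interval to a student, round-robin."""
    return [(minute, students[minute % len(students)])
            for minute in range(period_minutes)]

targets = ["Student A", "Student B", "Student C", "Student D", "Student E"]
for minute, student in rotation_schedule(targets):
    print(f"Minute {minute + 1:2d}: observe {student}")

# With 5 targets in 30 minutes, each student accrues about 6 one-minute
# intervals; observing a full class of ~30 students at this rate would
# take the 6-7 sessions the Digest describes.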
"Work Sample Analyses." Evidence of higher-order problem-solving and metacognitive learning strategies can be gathered from sources such as student projects, portfolios, performances, exhibitions, and learning journals or logs (e.g., Royer, Cisero, & Carlo, 1993; Wolf, et al., 1990). The efficacy of these methods hinges on the use of suitably structured tasks and scoring rubrics. For example, a rubric to assess the application of higher-order thinking skills in a student portfolio might include criteria for evidence of problem-solving, planning, and self-evaluation in the work. A number of formal and informal protocols for assessing students' self-regulated learning strategies also incorporate components that focus on metacognitive skills (e.g., Pintrich & DeGroot, 1990; Ward & Traweek, 1993). The Metacognitive Knowledge Monitoring Assessment and the Assessment of Cognitive Monitoring Effectiveness are more targeted measures suitable for use in classroom situations and with demonstrated sound psychometric properties in empirical evaluations (Osborne, 2001).
"Focused Case Studies
From a different perspective, Pintrich and & De Groot (1990) associated engagement levels with students' use of cognitive, meta-cognitive and self-regulatory strategies to monitor and guide their learning processes. In this view, student engagement is viewed as motivated behavior apparent from the kinds of cognitive strategies students choose to use (e.g., simple or "surface" processing strategies such as rehearsal versus "deeper" processing strategies such as elaboration), and by their willingness to persist with difficult tasks by regulating their own learning behavior.
Use of cognitive and meta-cognitive strategies (e.g., I went back over things I didn't
understand" and "I tried to figure out how today's work fit with what I had learned
before") may be taken to indicate active task engagement, while use of shallow
strategies (e.g., "I skipped the hard parts") may be taken to indicate superficial
engagement (Meece, Blumefield, and Hoyle, 1988).

"Checklists and Rating Scales." In addition to student self-report measures, a few studies have used summative rating scales to measure student engagement levels. For example, the teacher report scales used by Skinner & Belmont (1993) asked teachers to assess their students' willingness to participate in school tasks (i.e., effort, attention, and persistence during the initiation and execution of learning activities, such as "When faced with a difficult problem, this student doesn't try"), as well as their emotional reactions to these tasks (i.e., interest versus boredom, happiness versus sadness, anxiety and anger, such as "When in class, this student seems happy"). The Teacher Questionnaire on Student Motivation to Read developed by Sweet, Guthrie, & Ng (1996) asks teachers to report on factors relating to student engagement rates, such as activities (e.g., enjoys reading about favorite activities), autonomy (e.g., knows how to choose a book he or she would want to read), and individual factors (e.g., is easily distracted while reading).

"Focused Case Studies." When the focus of an investigation is restricted to a small
group of target students, it is often more useful to collect detailed descriptive accounts of engagement rates. Case studies allow researchers to address questions of student engagement inductively by recording details about students in interaction with other people and objects within classrooms. These accounts should describe both students' behaviors and the classroom contexts in which they occur. This might include, for example, the behavior of peers, direct antecedents to the target student's behaviors (e.g., teacher directions), as well as the student's response and the observed consequences of that response (e.g., reactions from teachers or peers). Case studies generally attempt to place observations of engagement within the total context of the classroom and/or school, and are concerned as much with the processes associated with engagement as they are in depicting engagement levels.

REFERENCES

Assor, A., & Connell, J.P. (1992). The validity of students' self-reports as measures of performance-affecting self-appraisals. In D.H. Schunk & J. Meece (Eds.), Student Perceptions in the Classroom (pp. 25-46). Hillsdale, NJ: Lawrence Erlbaum.

Brophy, J. (1983). Conceptualizing student motivation. Educational Psychologist, 18, 200-215.

Covington, M. (2000). Goal theory, motivation, and school achievement: An integrative review. Annual Review of Psychology, 51, 171-200.

Ellett, C.D., & Chauvin, E. (1991). Development, validity, and reliability of a new generation of assessments of effective teaching and learning: Future directions for the study of learning environments. Journal of Classroom Interaction, 26(2), 25-36.

Fisher, C., Berliner, D., Filby, N., Marliave, R., Cahen, L., & Dishaw, M. (1980). Teaching behaviors, academic learning time, and student achievement: An overview. In C. Denham & A. Lieberman (Eds.), Time to Learn. Washington, D.C.: National Institute of Education.

Mathewson, G.C. (1994). Model of attitude influence upon reading and learning to read. In R.B. Ruddell & H. Singer (Eds.), Theoretical Models and Processes of Reading (3rd ed., pp. 1131-1161). Newark, DE: International Reading Association.

Meece, J.L., Blumenfeld, P.C., & Hoyle, R.H. (1988). Students' goal orientations and cognitive engagement in classroom activities. Journal of Educational Psychology, 80(4), 514-523.

Natriello, G. (1984). Problems in the evaluation of students and student disengagement from secondary schools. Journal of Research and Development in Education, 17, 14-24.

Osborne, J. (2001). Assessing metacognition in the classroom: The assessment of cognition monitoring effectiveness. Unpublished manuscript, Department of Educational Psychology, University of Oklahoma.

Pintrich, P.R., & De Groot, E.V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33-40.

Royer, J.M., Cisero, C.A., & Carlo, M.S. (1993). Techniques and procedures for assessing cognitive skills. Review of Educational Research, 63(2), 201-243.

Skinner, E.A., & Belmont, M.J. (1993). Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. Journal of Educational Psychology, 85(4), 571-581.

Strickland, B.R. (1989). Internal-external control expectancies: From contingency to creativity. American Psychologist, 44(1), 1-12.

Sweet, A.P., Guthrie, J.T., & Ng, M. (1996). Teacher Perceptions and Students' Motivation To Read (Reading Research Report No. 69). Athens, GA: National Reading Research Center.

Thompson, M., Kaslow, N.J., Weiss, B., & Nolen-Hoeksema, S. (1998). Children's Attributional Style Questionnaire revised: Psychometric examination. Psychological Assessment, 10(2), 166-170.

Ward, L., & Traweek, D. (1993). Application of a metacognitive strategy to assessment, intervention, and consultation: A think-aloud technique. Journal of School Psychology, 31, 469-485.

Wigfield, A. (1997). Reading engagement: A rationale for theory and teaching. In J.T. Guthrie & A. Wigfield (Eds.), Reading Engagement: Motivating Readers Through Integrated Instruction. Newark, DE: International Reading Association.

Wolf, D., Bixby, J., Glenn, J., & Gardner, H. (1990). To use their minds well: Investigating new forms of student assessment. Review of Research in Education, 17, 31-74.

On learning about autoethnography and my end of semester exercise

Two of my FLC colleagues introduced the topic of autoethnography as a qualitative research method. I learned that this method is used in the occupational therapy program to create a capstone project for all master's students. Although articles were distributed, I could not locate abstracts for these documents in the SSU library database system, hence the lack of links. My understanding is that this method is used to assist students in tracking where they have come from, where they are now, and where they hope to go in their profession - with some focus on how they may use research in the future as occupational therapists.
The idea of this reflective exercise makes sense to me. However, our discussions of autoethnography itself (spanning two of our sessions) leave me with many questions about this method.

I do not "buy" this as a valid social science research method.  Therefore, I realized that I would not teach this method (we have a hard enough time gaining acceptance for pure qualitative research with an N of 20-30, for example) but would consider this type of approach to having students reflect on past, present and future re: views about research.  In some ways, this informed one of the questions on my final reflective exercize, which is planned to be something like this:

INSTRUCTIONS TO STUDENTS:  Please write your answers to these questions before coming into class.  We will share our thoughts with the group one question at a time.

What have you learned this semester?

What have you re-learned this semester?

What do you wish you had learned this semester?

What were you surprised you had learned this semester?

How will you approach "managing" your own social location in research/evaluation in social work settings?

How do your views about research and social work practice NOW differ from the start of the semester?

What do you envision the challenges of implementing evidence-based practice will be once "in the field?"

On my own community-based participatory research project and how it has informed my thoughts about teaching

My own community-based participatory research project has been the subtextual fodder for my entire project.  Here is a bit about it:
Research partners: The research team includes Beth Flanzbaum, MSW, LICSW, Assistant Director of Outpatient Services and MSW Internship Coordinator at Youth Opportunities Upheld (Y.O.U.), Inc., and Elspeth Slayter, MSW, PhD, Assistant Professor at Salem State College School of Social Work (SSC SSW). Y.O.U., Inc. is a behavioral health organization providing a range of social, psychological, and preventive programs for at-risk youth with developmental disabilities and their families in Worcester, Massachusetts. In 2007, Y.O.U., Inc. served 1,428 clients aged 15-22, comprising 76 percent of its total client population. Nineteen percent of the agency's client population is Latino/a, with another 25 percent from other diverse ethnicities representing an increasing immigrant population in Worcester, including a large population of African immigrants. As a social service agency, Y.O.U., Inc. is concerned with the need for graduate mental health practitioners who, upon entering its workforce, are optimally skilled at assessing and addressing risk and promoting resilience among youth with developmental disabilities. As a regional leader in Massachusetts' new court-mandated systems-of-care initiative for low-income youth, the Children's Behavioral Health Initiative, Y.O.U., Inc. recognizes that its future access to a competent workforce depends on developing and evaluating practice-relevant, field-based education for mental health professionals in the present. As a School of Social Work, SSC SSW is committed to efforts to increase academic and field-based content on practice with a range of people with disabilities.

Brief description of partnership: Through their ongoing partnership, Y.O.U., Inc. and SSC SSW share the goal of risk prevention and resiliency promotion among youth with developmental disabilities who are at risk of psychological and personal distress (including suicidality), youth violence and family turmoil. Together, the entities have a long-standing partnership that includes both the training of master's-level social workers and the ongoing conduct of collaborative research and evaluation on suicidality among Latino/a clients and on clinical practice with populations with disabilities. Recently, this team identified the agency's need to develop better clinical practice approaches for the range of work engaged in with youth with developmental disabilities in a multi-service setting.

Research Context:  Over the past year, youth with developmental disabilities have been identified as an under-served population within the Worcester, Massachusetts community, and within Y.O.U., Inc. in particular.  Despite the nascent development of clinical practice approaches for people (mostly adults) with developmental disabilities, much less is known about the application of existing best practices to this population (Beail, 2003).  For youth who are eligible for services through state-funded mental retardation agencies, some specialized services are available.  However, for youth with developmental disabilities who fall between the cracks, community-based service agencies are the option – yet they rarely have the expertise to conduct this work in a way that will promote client resilience and prevent risk.  Given the prevalence of developmental disabilities, especially among youth served by the agency through their involvement with Massachusetts’ Children’s Behavioral Health Initiative, and given the agency’s particular role in service provision to a range of ethnically diverse communities in Worcester, Massachusetts, development of better approaches to clinical services provision for this population is a significant need.

Research Aims: The study to be advanced via participation in the Institute has three aims. First, to develop a culturally competent and developmentally appropriate set of clinical practice protocols for youth with developmental disabilities aged 15-22 in partnership with community partners. Second, to assess the implementation of these clinical practice protocols in a multiservice agency serving ethnically diverse youth with developmental disabilities aged 15-22. Third, to evaluate the effectiveness of these clinical practice protocols in fostering resilience and reducing risk among youth with developmental disabilities aged 15-22.

Study Design:  The study is designed to assess the development and use of clinical practice models with at-risk youth with developmental disabilities aged 15-22.  Focusing both on the collection of qualitative and quantitative process and outcome data, this prospective, longitudinal study will include two overlapping phases over a two-year period.  First, after using a literature-informed process in the development of culturally-competent and developmentally appropriate clinical practice models for this population, the implementation of these models will be studied.  The implementation study will qualitatively and quantitatively track the experiences of Y.O.U., Inc. graduate interns, their supervisors and all of the agency’s clinical staff in order to create a continuous quality improvement process about the models in application to client work.  Second, client process and outcome measures related to risk prevention and resiliency promotion will be tracked via both the use of case record review and qualitative interviews with clinicians.  While the clients themselves will not be interviewed as part of this study, measures related to psychological and personal distress (including suicidality), youth violence and family turmoil will be tracked through use of data from the Treatment Outcome Package (TOP) – a behavioral health outcome measurement tool used by Y.O.U., Inc. (Kraus, Seligman, & Jordan, 2005; Behavioral Health Laboratories, 2008).

Study Population: The proposed study will involve two study populations. First, Y.O.U., Inc. graduate interns, their supervisors and all of the agency's clinical staff will be part of both the implementation study and the client process and outcomes study. Second, youth with developmental disabilities aged 15-22 will comprise the second study population through the use of the secondary data sources discussed above. At present, the agency provides services and supports to youth aged 10-22 in outpatient, home-based, residential, and school-based settings, as well as therapeutic day treatment - including to a range of youth with developmental disabilities.

Expertise of the Research Team: Beth Flanzbaum, MSW, LICSW, has been Assistant Director of Outpatient Services and MSW Internship Coordinator at Y.O.U., Inc. since 1998 and holds academic affiliations with four MSW programs. She has chaired several of the agency's quality management committees and been directly involved in teaching and supervising the agency's interns. She is skilled in the use and application of the TOP as an assessment and utilization review instrument. Ms. Flanzbaum is a licensed clinical social worker and has over 25 years of direct practice experience with children and families. Elspeth Slayter, MSW, Ph.D., is an Assistant Professor at Salem State College School of Social Work. Dr. Slayter conducts behavioral health services research focused on people with disabilities (Slayter, 2010a; Slayter, 2010b; Slayter, 2009; Slayter & Steenrod, 2009; Slayter, 2008; Slayter, 2007). Additionally, Dr. Slayter holds particular expertise both in partnering with social service agencies on applied research design and data collection approaches and in working with secondary data from administrative records to investigate challenges to creating and sustaining effective behavioral health service delivery systems. Dr. Slayter's seven years of direct practice experience with at-risk youth with developmental disabilities involved in child welfare, juvenile or criminal court proceedings in New York City also inform this study. Ms. Flanzbaum and Dr. Slayter are currently conducting a needs assessment related to clinical practice with youth with developmental disabilities who are involved with Y.O.U., Inc.

Expected public health and policy relevance:  Given increasing community participation among youth with developmental disabilities (a central disability policy goal), it is particularly important to prepare mental health clinicians who are able to both prevent risk and promote resilience among this population – a population often excluded from empirical studies in an effort to obtain an ideal sample (Humphries and Weisner 2000).  Focusing on the development of these clinical skills for use among at-risk youth with developmental disabilities of diverse ethnicities in a clinical setting, this project will advance clinical knowledge for practice focused on an underserved population.  Additionally, this study will forward a model for increasing the preparedness of graduate-level mental health clinicians to engage in resilience promotion and risk prevention with this vulnerable population of youth with developmental disabilities.


References

Beail, N. (2003). What works for people with mental retardation? Critical commentary on cognitive-behavioral and psychodynamic psychotherapy research. Mental Retardation, 41(6), 468-472.

Behavioral Health Laboratories, Inc. (2008). The Treatment Outcome Package. BHL Products and Services. Retrieved February 18, 2008 from http://www.bhealthlabs.com/products/tools-test.cfm.

Gustaffson, C., Öjehagen, A., Hansson, L., Sandlund, M., Nyström, M., Glad, J., Cruce, G., Jonsson, A., & Fredriksson, M. (2009). Effects of psychosocial interventions for people with intellectual disabilities and mental health problems. Research on Social Work Practice, 19(3), 281-290.

Humphries, K., & Weisner, C. (2000). Use of exclusion criteria in selecting research subjects and its effect on the generalizability of alcohol treatment outcome studies. American Journal of Psychiatry, 157, 588-594.

Kraus, D.R., Seligman, D.A., & Jordan, J.R. (2005). Validation of a behavioral health treatment outcome and assessment tool designed for naturalistic settings: The Treatment Outcome Package. Journal of Clinical Psychology, 61(3), 285-314.

National Institutes of Health. (2008). Program announcement: Research on psychopathology in intellectual disabilities (mental retardation) (R01). PA-09-150. Retrieved April 1, 2009 from http://grants.nih.gov/grants/guide/pa-files/PA-09-150.html

Slayter, E. (2010a). Triply diagnosed: Access to substance abuse treatment for Medicaid beneficiaries with intellectual disabilities and serious mental illness. Health and Social Work. Volume and issue to be announced.

Slayter, E. (2010b). Not immune: Access to substance abuse treatment for youth with intellectual disabilities. Journal of Disability Policy Studies. Volume and issue to be announced.

Slayter, E. (2009). The experience of intimate partner violence among women with and without disabilities: Data-driven implications for social work practice. Journal of Maltreatment, Aggression and Trauma, 18(2), 182-199.

Slayter, E., & Steenrod, S. (2009). Addressing alcohol and drug addiction among people with mental retardation: A need for cross-system collaboration. Journal of Social Work Practice in the Addictions, 9(1), 71-90.

Slayter, E. (2008). Understanding and overcoming barriers to substance abuse treatment access among people with mental retardation. Journal of Social Work in Disability and Rehabilitation, 7(2), 63-80.

Slayter, E. (2007). Balancing risk management with the dignity of risk: A case management framework for people with mental retardation and substance abuse. Families in Society: The Journal of Contemporary Human Services, 88(4), 651-659.

Sturmey, P. (2006). Against psychotherapy with people who have mental retardation: In response to the responses. Mental Retardation, 44(1), 71-74.

On social location and how it can inform teaching social work research for reflective practice

Some of our FLC group discussions highlighted the importance of considering social justice as a motivator for students to engage in required research courses. In other words, we want to encourage students to realize that research can promote social justice goals - either methodologically (through participation via a participatory action research design or member-checking in pure qualitative research, for example) or in terms of the topics studied.
One aspect of this conversation included the need to talk with our students about their own "social location" in relation to the clients they see (social work, occupational therapy) or the students they teach (education). Recognizing the hierarchies that can exist in research projects (not to mention in our own research classrooms) is a vital step in the research process - and because this conversation runs parallel to students' true interest in being in school (presumably to work with people), it may also function as an engagement mechanism. This discussion was also augmented by a review of two papers on assessing the quality of qualitative research (and how researchers need to address their own biases in data analysis); click here for one exemplar article shared by one of my FLC colleagues.

I include this graphic as an illustration of how one might think about talking about social location with one's students. 

I think that I will do this as part of a middle-of-semester reflective exercise to think about how social location may impact data collection (during the "challenges of data collection" lecture) - but also to get students thinking about the gap between the research and practice worlds.

Although individual pedagogical frames of reference for teaching research may differ across our faculty, we all engage in a process that Donald Schön would refer to as reflective practice in the teaching and learning process. Embracing the ecological framework, we encourage students to learn by engaging directly in the process of research in their concentration area (e.g., children and family services, older adults/end of life, health and mental health). During the foundation year, in Research in Social Work, students are, for example, learning about ethnography by doing a brief community-based observation or building research synthesis skills through the conduct of a structured literature review. In the concentration year, while in Research Laboratory I, students delve into the nuts and bolts of research design by learning as they engage in the process of setting up all aspects of their community-based qualitative and quantitative studies. Similarly, during Research Laboratory II, students continue to learn through the reflective process of collecting, analyzing and interpreting their data - all in the context of a reflective classroom. Throughout this reflective process of learning, we ground students' learning and experiences in the context of our profession's debate about both the creation and use of evidence-based practice.

How the focus of our FLC changed - and stayed the same

The graphic to the left reflects the participatory action research (PAR) process as I understand it - where community partners and university partners are equals in co-creating research projects that are mutually beneficial. Elements of the PAR process are very similar to community-based participatory research (CBPR). For more information on CBPR, click here.


When proposing this FLC, I was interested in weaving true CBPR principles into the design of the projects FLC members would engage in. My proposal (see below) was full of CBPR language and philosophy. My main goal was for the group to center its discussions on bringing the community into the classroom and vice versa. Here is the blurb from my original proposal for this FLC:

The proposed faculty learning community (FLC) will focus on the promotion of student engagement in required research-related courses in the human services – something that often presents a challenge to faculty members in such courses.  For students seeking training in social work, education, occupational therapy, criminal justice or nursing, the goal is often not to foster a research career – but rather to ‘work with people.’  However, as faculty members, we know that the skills learned in statistics and research methods courses are often vital to the development of professionals who can think critically about their work in light of the existing research in their professional arena, engage in program evaluation or interpret basic statistical findings. Given widespread commentary on the research-to-practice gap in the human services, the proposed FLC is designed to begin to bridge the gap between student engagement in research-related courses and the need for human services graduates with basic research-related competencies.  Participants will explore the ways in which they can enliven students’ experiences in the classrooms through the development of curricula that include short and longer-term, hands-on projects.  Drawing on human services literature about bringing the community into the classroom – and the classroom into the community, this FLC will explore the uses of a range of tools, from short, community-based assignments for weekly discussions to the development of community-based participatory research projects aimed at program evaluation, implementation studies or treatment fidelity assessments. 

Application guidelines for potential FLC participants were as follows:

Application process

What will you ask potential participants to submit as an application and how will you make decisions about selecting from among potential applicants? Potential participants in the proposed FLC will be asked to address the following topics in their application:

1. Current teaching philosophy, specifically for research-related courses;

2. Description of any required community involvement among students in their Department or School;

3. Previous work with community partners in either a service or research-related capacity;

4. Description of the course they plan to work on as part of the proposed FLC (with a syllabus reflective of the current, or traditional, approach to teaching the course); and

5. A short essay on why they are inspired to engage in a change process at this particular point in their teaching career, including commentary on the joys and challenges of teaching research-related courses in their professional discipline.



While a community focus remained a central tenet of our FLC's discussions, the group decided not to move in this direction. To some extent, this was a good thing - people realized that more groundwork needed to be laid in order to do true, good CBPR, much less to add students (new to the concept of CBPR, at that) into the mix. So, as I wrestled with my role as facilitator at the start of this process, I felt better as we moved in more of a group-driven direction as the weeks went on - a parallel process to the decision to move away from "true" CBPR applications to our courses.


As for my own proposed project - the re-framing of an introductory-level research methods course for graduate students in social work, the conceptualization of which began, to be honest, before the FLC process - I had originally intended to co-teach with a community agency partner who was "research savvy." The goal would be to help students propose research projects that were truly relevant to our co-teacher's agency - and feasible for completion in the second and third required research courses in our curriculum. While we currently operate under the same approach vis-a-vis the main assignment being a research proposal, many proposals end up quite disconnected from the concepts of utility to the field and feasibility...


This idea of co-teaching, however, soon proved to be pie-in-the-sky. I realized that the timing of the course and its weekly meeting schedule - as well as the problem of payment for a co-teacher (not currently allowed in our contractual structure) - were major impediments to this, in retrospect, rather grandiose plan.

My second idea happened by chance after talking with my FLC members about the CBPR projects I am working on out in Worcester, MA with Youth Opportunities Upheld, Inc. (a.k.a. YOU, Inc.). This project is separate from my teaching, and a summary of that work can be found either in this blog here or in my own blog here (offering two options in case there are linkage issues). Given that our graduate program has a number of students from the Central MA region (given the lack of affordable, public social work programs in the center of the state), a seed was planted by an FLC colleague who suggested locating the class in question for my FLC project at YOU, Inc., for proximity to my community research partner. While my YOU, Inc. partner was game for this idea, the logistics of offering a course on such short notice - much less the liability issues, the memorandum-of-understanding process and computer lab access issues - made this impossible to implement for Spring 2011, or Fall 2011 for that matter. However, this plan is still in the works for the following academic year. So, yet again, another pie-in-the-sky (or perhaps more accurately pie-in-my-face) moment.


My third idea was to shift to the evidence-based practice focus discussed elsewhere in this portfolio (click here for a definition). The idea now became to involve a community agency partner with "research savvy" in key lectures, and perhaps to have that partner provide feedback to students about their major projects at the end of the semester. Planned end-of-semester projects will focus on assessing whether a student's field placement does or does not use evidence-based practice approaches to service delivery - and on recommending approaches based on the existing literature. When vetted with colleagues at the School of Social Work, this seemed much more feasible, although we then received the news that external speaker honoraria had been cut. I still feel I can find a way forward with these ideas, though.


So while I had intended to do full-on CBPR, I am very happy with the idea of an intermediate step involving community participation and response to students. This is about "making it real" to students vis-a-vis how they will and can use research as social work practitioners. There will be a time for full-on CBPR in our classes; we just have to keep building.