Evidence Options for Faculty Development Centers

Program Participation
• Number of programs by type
• Number of participants by program
• Total number of unique participants
• Demographics and program of participants
• Number of applicants for certain programs
• Number of interested faculty who cannot attend

Program Satisfaction
• Program event feedback
• Interviews with participating and non-participating faculty
• Feedback from center staff on perceived effectiveness of programs
• Number of returning faculty for programming

Faculty Learn From Programming
• Survey on teaching techniques and knowledge (Felder & Brent, 2010) and evidence-based teaching approaches
• Needs assessment
• Interview/survey on beliefs about teaching

Application of Programming
• Classroom observation
• Course evaluations
• Faculty self-report, of self or of self compared to peers (interviews, surveys; Way et al., 2002)

Impact on Teaching Culture
• Faculty survey
• Interviews with faculty, students, and/or stakeholders

Impact on Student Learning
• Student approaches to studying (before and after a course with faculty who received instructional training; Gibbs & Coffey, 2004)
• Retention rates
• Artifacts from courses, for breadth and depth of knowledge
• Student surveys

Evaluation types modified from Marbach-Ad et al. (2015).

Data collected by CATLR are shown in bold.

Program Theory & Logic Models Our original approach to evaluating the Center’s impact was to triangulate data. Once diving more into the literature, formalizing a program theory seemed equally important. Doing so forced a close look at the literature and data to evidence each assumption one frequently makes about participants of faculty development centers ultimately changing teaching techniques and in turn impacting student learning (Hacsi, 2000; Wilder Research, 2009; Woodard, 2013). Faculty and Course Evaluations

Logic model evidence sources: faculty and course evaluations; student perception of intellectual growth; teaching culture (faculty survey).

Logic models are an effective tool for identifying program inputs, activities, and outcomes (Armstrong & Barsion, 2006). They can be used to ensure that activities align with the desired outcomes (and with the ways in which those outcomes can be evidenced). Program theory requires an articulation of the underlying assumptions of the logic model (i.e., it looks at how and why the desired change is expected to come about; Woodard, 2013).

Using Multiple Methods to Assess Professional Development Center
Jennifer Lehmann, Susan Chang, Cigdem Talgar, Michael Fried

Introduction

Collecting evidence to "prove" that faculty development centers have made a significant impact on teaching practices in higher education has been a persistent issue (Chen, Kelley, & Haggar, 2013; Fink, 2013; Kucsera & Svinicki, 2010; Meyer & Murrell, 2014). The majority of centers have assessed their impact using event participation and evaluations (Meyer & Murrell, 2014), which cannot speak to the actual impact on teaching practices or show a change in the teaching culture at the institution (Fink, 2013). Ideally, a center would document its effectiveness using an approach that directly observes changes in teaching practices and the associated depth or breadth of student learning. However, the time, expertise, and resources needed to evaluate a center in a way that isolates the effect of teaching effectiveness on student learning can take away from the time and resources needed to provide faculty development (Chism, 1998). To demonstrate the impact that our programming has had on both the institution's culture of teaching (i.e., teaching attitudes) and on teaching practices, we examined multiple data sources that the institution was already collecting: (1) data collected by Institutional Research (surveys of faculty and students); and (2) student course evaluations. Through program theory, we examined both the literature and these data to evidence the impact the Center is making on campus.

Evidence of Our Center’s Impact

Faculty Development Center Program Theory

Participation and Program Satisfaction

If CATLR exists, then it can provide faculty development activities to faculty.

Figure: Participated in Teaching Enhancement Workshops (% faculty self-reporting): 46% in 2010-11 and 58% in 2013-14.
Source: Faculty Survey (46% response rate); may include training outside the Center.

Faculty Perspective on Teaching Culture

If faculty participate in programming, then they gain awareness, knowledge, and beliefs about teaching approaches.

To what extent does the University provide...

Faculty development participation: increased motivation, self-awareness, and enthusiasm for teaching (Sarikaya, 2012); increased motivation to use proven teaching strategies (Felder & Brent, 2010).

Figure: % of faculty rating each area as significant or very significant, 2010-11 vs. 2013-14, for faculty development support for teaching, faculty development support for research, and publicity for achievement in teaching (reported values across the two years: 70%, 77%, 84%, 90%, 92%, 95%).
Source: Faculty Survey (46% response rate)

Faculty and Course Evaluations

If faculty change their awareness, knowledge, and/or beliefs about teaching approaches, then they will change their teaching practices.

Faculty development participation leads to: long-term impact on faculty, the learning environment, and the culture of the institution (Sarikaya et al., 2010); faculty engaging in instructional development scholarship (Felder & Brent, 2010); and increases in teaching practices, as measured both by observations from trained external evaluators and by faculty self-reports (D'Eon et al., 2008). However, some research has found minimal impact on behavior (Hacsi, 2000; Weiss, 2000).

Overall Rating of Instructor's Teaching Effectiveness (course evaluation means)
• Term means: Spring 2010, 4.3; Spring 2013, 4.3; Spring 2016, 4.4
• Spring 2016 means by participation in Center events: non-participating, 4.3; 1-2 events, 4.5; 3+ events, 4.5

In addition to looking at the overall rating of teaching effectiveness, we examined: I found the course intellectually challenging; I learned a lot in this course; Instructor used class time effectively; Instructor fairly evaluated my performance; Out-of-class assignments and fieldwork helped me learn; Lectures helped me learn; and In-class discussions and activities helped me learn.

Student Satisfaction with Academic Excellence (Student Survey)

If faculty change teaching practices, then they will impact student learning. Faculty development programming: improved student satisfaction and retention (Arum & Roksa, 2011, as cited in Lavis et al., 2016); increased student outcomes in medical school (Sullivan, 2005, as cited in Woodard, 2013).

Figure: Mean satisfaction (1-7 scale) in 2012, 2014, and 2016: "There is a commitment to academic excellence on this campus" (5.61, 5.77, 5.8; ***) and "I am able to experience intellectual growth here" (5.79, 5.88, 5.92; *).
Source: Noel Levitz Student Survey (18% response rate)

Looking at all of these data elements, each unpacking a different level of evaluation, we can start to see an overall picture of the University's teaching culture and teaching practices. While it would have been ideal to also incorporate feedback from interviews with faculty and their students, these data elements are a helpful place to start, particularly since these data were already being collected but had not yet been shared with the Center.

Challenges

Data source: Participation and Satisfaction
• Conceptual challenges: Only helps with improving programming; doesn't document success or impact of workshops on teaching or learning.
• Implementation challenges: Faculty may provide non-critical feedback (Van Note Chism & Szabo, 1998). In support of this, our Center hit a ceiling effect, with little gain from feedback.

Data source: Faculty Survey
• Conceptual challenges: Limited by the population that was surveyed (for us, that meant only full-time faculty).
• Implementation challenges: If asking about specific teaching techniques, limited by (a) faculty's understanding of what the techniques are and (b) response bias (responding how they think the administration wants them to report).

Data source: Course Evaluations
• Conceptual challenges: Limited by student interpretation. When implementing a new teaching technique, ratings may initially go down (before ultimately improving).
• Implementation challenges: Had to jump through hoops to get access to the data. The amount of raw data was so vast that there were endless ways to select faculty/courses to include (course type, level of participation in the Center, type of faculty). Data was captured slightly differently each year, requiring significant restructuring.

Data source: Interviews
• Conceptual challenges: The IRB was concerned that faculty participating in the Center, in the interest of keeping a positive partnership, might feel obligated both to participate in and to respond positively to the interviews.
• Implementation challenges: Had to be strategic in recruiting faculty and in the way questions were worded so as to minimize the conceptual challenges. Took much longer to create questions and get IRB approval than anticipated.

Data source: Student Surveys
• Conceptual challenges: Limited by student interpretation. Not a direct measurement of student learning or teaching practices.
• Implementation challenges: May be limited by response rates.

General challenges
• Varying levels of programming (engagement, depth, duration, topic) and types of participants (faculty, co-op coordinators, graduate students, part-time faculty).
• Data already being collected ≠ readily available data: approval to have access to the data (and other logistical hurdles); structure of the data (variables not included, originally offered in PDF, flat data structure).

Future Options:

• Evaluate at the program level rather than the Center as a whole (since programs have differing goals, inputs, participants, and perhaps outcomes).
• Continue to look at the readily available data in multiple ways. Looking at the data by college, department, or type of faculty may allow for a better understanding of where faculty at various levels are excelling and where there is opportunity to provide more or different types of programming.
• We did not fully unpack the qualitative data from the course evaluations. This may be an opportunity to identify faculty development needs (particularly within a specific college or department).
• Logic models for specific Center programs.
• Add direct questions to faculty and student surveys sent by IR.
• If the ultimate goal is to impact student learning, we need to assess the breadth and depth of knowledge learned using artifacts from courses.

References:

Chen, W., Kelley, B., & Haggar, F. (2013). Assessing faculty development programs: Outcome-based evaluation. Journal on Centers for Teaching and Learning, 5, 107-119.
D'Eon, M., Sadownik, L., Harrison, A., & Nation, J. (2008). Using self-assessments to detect workshop success: Do they work? American Journal of Evaluation.
Felder, R. M., & Brent, R. (2010). The National Effective Teaching Institute: Assessment of impact and implications for faculty development. Journal of Engineering Education, 99(2), 121-134.
Fink, L. D. (2013). Innovative ways of assessing faculty development. New Directions for Teaching and Learning, 2013(133), 47-59.
Gibbs, G., & Coffey, M. (2004). The impact of training of university teachers on their teaching skills, their approach to teaching and the approach to learning of their students. Active Learning in Higher Education, 5(1), 87-100.
Hacsi, T. A. (2000). Using program theory to replicate successful programs. New Directions for Evaluation, 2000(87), 71-78.
Hines, S. R. (2015). Setting the groundwork for quality faculty development evaluation: A five-step approach. The Journal of Faculty Development, 29(1), 5-12.
Kucsera, J. V., & Svinicki, M. (2010). Rigorous evaluations of faculty development programs. The Journal of Faculty Development, 24(2), 5-18.
Lavis, C. C., Williams, K. A., Fallin, J., Barnes, P. K., Fishback, S. J., & Thien, S. (2016). Assessing a faculty development program for the adoption of brain-based learning strategies. The Journal of Faculty Development, 30(1), 57-70.
Marbach-Ad, G., Egan, L. C., & Thompson, K. V. (2015). Evaluating the effectiveness of a teaching and learning center. In A Discipline-Based Teaching and Learning Center (pp. 185-221). Springer International Publishing.
Meyer, K. A., & Murrell, V. S. (2014). A national survey of faculty development evaluation outcome measures and procedures. Online Learning Journal, 18(3).
Sarikaya, O., Kalaca, S., Yeğen, B. Ç., & Cali, S. (2010). The impact of a faculty development program: Evaluation based on the self-assessment of medical educators from preclinical and clinical disciplines. Advances in Physiology Education, 34(2), 35-40.
Van Note Chism, N., & Szabo, B. (1998). How faculty development programs evaluate their services. Journal of Staff, Program & Organization Development, 15(2), 55-62.
Way, D. G., Carlson, V. M., & Piliero, S. C. (2002). Evaluating teaching workshops: Beyond the satisfaction survey. To Improve the Academy: Resources for Faculty, Instructional, and Organizational Development, 20, 94-106.
Weiss, C. H. (2000). Which links in which theories shall we evaluate? New Directions for Evaluation, 2000(87), 35-45.
