Midterm Accreditation Report Fall 2012

American River College

AMERICAN RIVER COLLEGE MIDTERM REPORT

Submitted by AMERICAN RIVER COLLEGE 4700 College Oak Drive Sacramento CA 95841

To Accrediting Commission for Community and Junior Colleges Western Association of Schools and Colleges

July 2012

TABLE OF CONTENTS

Certification of the Midterm Report
Statement of Report Preparation
Response to Team Recommendations and Commission Action Letter
    Recommendation 1
    Recommendation 2
    Recommendation 3
    Recommendation 4

Response to Self-Identified Issues
Appendices
    A. Improvements in Student Learning as a Result of ARC’s SLO Assessment Process: Report on Cohort I, 2007-2008 vs. 2010-2011
    B. Addendum: Guidelines for departments completing ARC SLO Action Plans responding to SLO student self-assessment survey results containing a statistically significant negative deviation
    C. Instructions to Accompany ARC Faculty Designed Assessment Plan Entry Template: 2nd cycle
    D. Resolution F 09-04: Social Relativity Act
    E. Student Satisfaction Survey
    F. ARC ESL Center Diagnostic Placement Exam and Post Test

AMERICAN RIVER COLLEGE

MIDTERM ACCREDITATION REPORT


Statement of Report Preparation

Mindful of the principle that accreditation is a process and not an event, American River College began work on the midterm accreditation report by organizing three aspects of preparation:

• In summer 2010, starting the process of gathering the evidence to document the college’s progress in completing the four recommendations identified in the evaluation report by the ACCJC’s visiting team and the eight planning agenda identified in the college’s self-study

• In fall 2011, establishing the shared governance body having responsibility for overseeing report preparation

• In spring 2012, writing the report sections and assuring sufficient time for comprehensive reviews of draft 0 by subject matter experts, draft 1 by the entire college community, draft 2 by the shared governance leaders, and draft 3 by District leadership.

Gathering Evidence to Support Progress in Completing the Recommendations & Planning Agenda

The accreditation liaison officer and IT research assistant II organized the system for evidence gathering based on:

• The accreditation standards addressed by the recommendations and planning agenda

• The faculty and staff having primary responsibility for the recommendations and planning agenda

• The steps to complete the work of meeting the recommendations and planning agenda

• The evidence needed to document work completed

• The time frame required for completing the work or the estimated deadline for completing the work.

The ALO periodically reported to the President’s Executive Staff (PES) on the status of the college’s work completed on the accreditation team’s recommendations and the college’s planning agenda and the compilation of evidence [0.1]. Evidence documents were digitally stored in a shared drive folder maintained by the IT research assistant II.

Establishing the Shared Governance Body for Overseeing Midterm Report Preparation

At its first summer planning meeting in June 2011, members of the President’s Executive Staff reviewed the status of the work being completed on the 2009 ACCJC accreditation team’s recommendations and the college’s planning agenda and discussed the concept of establishing a shared governance body responsible for overseeing preparation of the midterm report and, looking toward the 2015 comprehensive visit, for preparing the self-evaluation.

In fall 2011, on behalf of the PES, the ALO proposed to the Planning Coordination Council the formation of the college’s Accreditation Oversight Committee (AOC) to serve two functions; the text below is excerpted from the proposal:

• Following the accreditation visit and prior to submitting the midterm report, the AOC will receive updates from the accreditation liaison officer on the college’s progress toward meeting the ACCJC’s recommendations and the college’s planning agenda. The AOC will meet once a semester during this period (fall 2011-spring 2012).

• In the three years prior to the accreditation visit, the AOC will serve as the accreditation steering committee and oversee the progress of the college’s evaluation study. The accreditation steering committee will meet once monthly (fall 2012-spring 2015).

To assure college-wide representation on the AOC, the proposal specified the membership of the AOC in both its manifestations:

• Faculty: the Academic Senate president, the chairs of the Curriculum Committee and the SLO Assessment Committee, and the faculty co-chair

• Classified staff: the Classified Senate president

• Associated Student Body: the ASB president

• Managers: the President’s Executive Staff*

• Resource staff: to be called upon as needed, research staff preparing demographics and organizing the evidence files; the graphics designer; the distance education coordinator; other faculty or staff identified by PES and the other members of the AOC.

After successful first and second readings in November and December 2011, the Planning Coordination Council approved the proposal to establish the college’s Accreditation Oversight Committee.

_____________________________ * The President’s Executive Staff (PES) includes the accreditation liaison officer.


Writing the Report and Assuring Sufficient Time for Comprehensive Review

The ALO convened the first meeting of the Accreditation Oversight Committee on 9 March 2012. At that time, the AOC approved the proposed schedule for preparing the midterm report:

Draft              Reviewing Period       Responsible for Reviewing
Preliminary draft  February-early March   Recommendations: subject matter experts; recommendations and planning agenda: PES
Draft 0            9-20 March             Accreditation Oversight Committee
Draft 1            23 March-10 April      College community
Draft 2            16-23 April            Accreditation Oversight Committee
Draft 3            27 April-12 May        Chancellor Harris
Draft 4            30 May-13 June         Governing Board

As summarized above, preparation of report drafts, synthesis of feedback for subsequent drafts, and reviews followed this order:

• Reviews of the preliminary, ALO-compiled draft sections addressing the accreditation team’s four individual recommendations and the college’s eight specific planning agenda were completed by subject matter experts and the PES in early February and March 2012

• Draft 0 was reviewed by the members of the AOC over a two-week period in March

• Draft 1 was reviewed online by the college community over a two and a half week period in March and April

• Draft 2 was reviewed by the AOC over a one-week period in April

• Draft 3 was submitted to the District office for review prior to submission to the governing board in May

• Draft 4 was approved by the governing board at its meeting on 13 June 2012.

The 2009 review by the Commission’s accreditation team confirmed that the college adheres to all eligibility requirements, standards, and policies of the Accrediting Commission for Community and Junior Colleges. This report contains a description and evaluation – i.e., a narrative analysis – and the supporting evidence to demonstrate the college’s response to the team’s four recommendations and a summary describing the effort to complete the work of resolving the college’s eight self-identified issues.


Reference

0.1 ACCJC Recommendations and ARC Planning Agenda, updated 2 December 2011


Response to Team Recommendations and Commission Action Letter

Recommendation 1: The ACCJC SLO rubric speaks to authentic assessment and the full engagement of faculty and staff; therefore, to demonstrate SLO success at the proficiency stage by the Commission’s 2012 deadline, the team recommends that American River College identify a formal process to review the quality of its assessment tools and to ensure part-time faculty participation in the assessment of SLOs (II.A.1.c, I.B.4, I.B.5).

The SLO Assessment Committee has responsibility for overseeing the college’s assessment process for course, program, and institutional student learning outcomes. It assists departments with their assessment activities; reviews assessment instruments and protocols, action (response) plans, and progress or implementation reports; reviews SLOs of non-instructional programs; provides two-way communication with staff concerning issues, programs, and opportunities relating to SLOs; and monitors the outcomes assessment portion of the college’s program review process [1.1]. The discussion below summarizes the committee’s efforts to identify and implement a formal process to review the quality of the college’s assessment tools and to assure adjunct faculty participation in the assessment of SLOs.

Authentic Assessment and the Formal Process for Reviewing the Quality of Assessment Tools

Description

Since 2007-2008, when the college began its first three-year assessment cycle, the SLO Assessment Committee has led the development, evaluation, and improvement of two types of assessment instruments for course-level assessment: the student self-assessment, completed through student self-assessment surveys, and the faculty designed assessment, completed through faculty designed assessment instruments. In addition, the committee has overseen the assessment process for program and institutional SLOs (ISLOs).

The student self-assessment consists of research office-deployed surveys for all courses taught during the semester. Over the last five years, completion rates for the student self-assessment are 74.86 percent for hard copy surveys distributed in classes and 50.55 percent for online surveys. SLO student self-assessment surveys are one form of evidence for measuring achievement of student learning in a given course. Authenticity of the student self-assessment relies on the college’s research efforts: the faculty research coordinator has responsibility for using a statistical procedure to determine if one or more SLOs are rated more negatively when compared with most of the SLOs in the given course; i.e., the faculty research coordinator uses a repeated measures analysis of variance (ANOVA) procedure and a p<.01 standard to analyze whether a statistically significant negative deviation exists. Published research indicates that the absolute rating on student self-assessments is not a valid indicator of achievement (i.e., student learning cannot be inferred simply because students strongly agree or disagree) because the respondents are prone to response biases such as overconfidence; however, these response biases do not create selectively lower ratings for a particular SLO. Because of this pattern, the faculty research coordinator’s procedure detects selectively low ratings. Irrespective of rating, a statistically significant negative deviation indicates that action may be needed to improve student learning, as determined by the expertise of the discipline faculty.

Authenticity of the student self-assessment process is validated through two steps. First, each fall the faculty research coordinator analyzes the results of the student self-assessment to identify whether the number of negative deviations has decreased, thus validating that improvement in student learning has occurred. Second, the results of the analysis are reported to the SLO Assessment Committee as part of its re-evaluation and, as appropriate, revision of the student self-assessment process. While revision of the student self-assessment process itself has not been necessary, the committee’s dialogue with faculty has indicated the need to improve the guidelines for completing SLO action plans; accordingly, in spring 2012, the committee recommended revisions to the guidelines, which were subsequently accepted by the Academic Senate (see Appendices A and B).

The faculty designed assessment consists of assessments created by discipline faculty. In completing their faculty-designed assessment plans, departments must evaluate and then indicate how their assessment tools meet the definition of authentic assessment. Faculty are reminded in the instructions for completing the entry template for the faculty-designed assessment plan that authenticity of assessment requires students to demonstrate the ability to apply their knowledge to real-world tasks or problems approximating those in the workplace or venues other than the classroom.
Further, for their proposed assessment instruments, faculty must (a) demonstrate how the proposed assessment tool maps to the SLOs being assessed, (b) set benchmarks for students to achieve for each SLO, and (c) identify the measure that demonstrates learning. Having provided this information, faculty submit their proposed assessment tools and assessment plans for technical review by the SLO Assessment Committee.

Using the “ARC Faculty Designed Assessment Plan Evaluation Form 2nd Cycle” to document the results of its review, the committee examines the assessment tools for alignment with the SLOs chosen for assessment and assures that the resulting assessment processes provide accurate data. Sample questions considered by the committee in its review include (a) when in the semester each course SLO will be assessed, (b) how comprehensively each course SLO will be assessed, (c) how each course SLO will be assessed, and (d) how achievement will be determined for each course SLO assessed. As part of the review process, faculty may be asked to revise their assessment tools and procedures to assure that assessments are deployed using a consistent methodology in all sections, i.e., consistency of directions, time allowed, time at which the assessment is deployed, and student expectation of earning credit (see Appendix C).
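The idea behind the negative-deviation screen described above can be illustrated with a small sketch. The survey data, the function, and the paired t-style test below are hypothetical simplifications for illustration only: the college’s actual procedure is a repeated measures ANOVA at p<.01 run by the faculty research coordinator, and the critical value used here is only a rough stand-in for that threshold.

```python
import statistics

# Hypothetical survey data: each row is one student's self-ratings
# (1-5 scale) on the course's SLOs; column j corresponds to SLO j.
ratings = [
    [4, 5, 4, 2],
    [5, 4, 4, 2],
    [4, 4, 5, 3],
    [5, 5, 4, 2],
    [4, 5, 5, 3],
    [5, 4, 4, 2],
]

def negative_deviations(rows, t_crit=3.0):
    """Flag SLOs rated selectively lower than the course's other SLOs.

    A simplified stand-in for the repeated measures ANOVA described in
    the text: for each SLO, compute each student's rating on that SLO
    minus the mean of the student's other ratings, then test whether the
    mean difference is reliably negative. t_crit is a rough proxy for
    the p < .01 standard at small sample sizes.
    """
    n_slo = len(rows[0])
    flagged = []
    for j in range(n_slo):
        # Within-student contrast: SLO j vs. the rest of the course.
        diffs = [row[j] - statistics.mean(row[:j] + row[j + 1:]) for row in rows]
        sd = statistics.stdev(diffs)
        if sd == 0:
            continue
        t = statistics.mean(diffs) / (sd / len(diffs) ** 0.5)
        # Only selectively *low* ratings (negative deviations) merit action.
        if t < -t_crit:
            flagged.append(j)
    return flagged

print(negative_deviations(ratings))  # → [3]
```

Because the contrast is computed within each student, uniform response biases such as overconfidence shift all of a student’s ratings together and cancel out of the differences, which is why only a selectively low SLO is flagged.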


Evaluation

For the five years that the college’s three-year assessment cycle has been in place, the tables below summarize the results of the SLO Assessment Committee’s review of the faculty-designed assessment plans and action plans.

Faculty Designed Assessment Plans

Year        Cycle              Initially Approved  Revisions  Initially Rejected  Initially Missing
2007-2008   Cycle 1: 1st year  35.00%              60.00%     0.00%               5.00%
2008-2009   Cycle 1: 2nd year  80.95%              9.52%      4.76%               5.00%
2009-2010   Cycle 1: 3rd year  13.33%              73.33%     13.33%              0.00%
2010-2011   Cycle 2: 1st year  68.42%              31.58%     0.00%               0.00%
2011-2012   Cycle 2: 2nd year  44.00%              52.00%     4.00%               0.00%

In the first five years of the college’s SLO assessment process history, after an initial rejection rate of 0 percent, the pattern for the initial rejection rate and for revisions peaks in the third year of the first cycle; in the first year of the second cycle, initial rejections decrease to 0 percent and initial approvals increase to 68.42 percent. The overall improvement in the pattern of assessment plans not rejected in the first review¹ coincides with three factors in the SLO Assessment Committee’s revision of the assessment process in 2010-2011:

• Providing an updated template for the faculty assessment plan

• Issuing clearer instructions for completing the plan

• Implementing a revised technical review process.

Modeled on the review process used by the Curriculum Committee, the revised tech review process has the advantage of allowing closer scrutiny and discussion of the proposed assessment plans by small work groups whose reports to the full committee then become the basis for informed dialogue and the recommendations of the committee. Following the recommendations of the committee, the faculty research coordinator, SLO assessment coordinator and classified IT research assistant II assist the departments whose assessment plans were recommended for revision to revise their plans for the full committee’s next review.

_____________________________
¹ The 2011-2012 decrease to 44 percent in the initially-approved assessments, the increase in the revision rate to 52 percent, and the increase to 4 percent in the rejection rate resulted from the committee’s rejection of one department’s assessment instrument: that instrument was rejected because it repeated the previous assessment cycle and did not include SLOs. The department responded to the initial rejection by creating a new assessment tool, which the SLO Assessment Committee has since accepted.


Action Plans

Year        Cycle              Initially Approved  Revisions  Initially Rejected  Initially Missing
2007-2008   Cycle 1: 1st year  n/a                 n/a        n/a                 n/a
2008-2009   Cycle 1: 2nd year  16.67%              72.22%     0.00%               11.11%
2009-2010   Cycle 1: 3rd year  52.94%              41.18%     5.88%               0.00%
2010-2011   Cycle 2: 1st year  75.00%              18.75%     6.25%               0.00%
2011-2012   Cycle 2: 2nd year  38.10%              52.38%     9.52%               0.00%

The 2010-2011 results show a similarly positive effect: the proportion of initially approved action plans reached an all-time high, to date, of 75 percent. The 22 percent improvement in the plans initially approved can be explained in part by the first-time, pilot-basis use of a statistical procedure to help faculty understand when a statistically significant negative deviation in the student self-assessment surveys is present. Specifically, the statistical deviation information makes clear that a statistically significant negative deviation at the .01 level would occur by chance less than one time in 100 and therefore merits action. This statistical criterion presents two advantages: (a) enabling faculty to place a greater focus on the faculty designed assessment and (b) allowing the committee to have a more effective and systematic review process.

Since fall 2007, when the SLO Assessment Committee began oversight of the college’s assessment process, deployed the first student self-assessment (indirect), and provided training and support to help faculty with (a) designing the faculty designed assessments (direct), (b) understanding the results of both assessments, and (c) creating the action plans and action plan implementation reports, the committee has simultaneously supported the college’s assessment processes and reviewed the results of these processes to identify those aspects needing improvement – essentially, assessing the assessment process.

The college has fully implemented the visiting team’s recommendation that American River College identify a formal process to review the quality of its assessment tools.


Ensure Part-Time Faculty Participation in SLO Assessment

Description

The SLO Assessment Committee recognizes that authentic assessment requires the full engagement of the faculty and staff. To assure the adjunct faculty’s participation in and contribution to the assessment process, section 12 of the “Instructions to Accompany [the] ARC Faculty Designed Assessment Plan Entry Template” reminds departments that “adjunct faculty should, at a minimum be allowed to review and provide feedback on assessment tools to be used in their classes” and to “document any and all involvement of adjunct faculty in the process.” Since the beginning of the college’s first three-year assessment cycle, training has consistently emphasized the importance of the adjunct faculty’s participation, and the templates for the assessment plan and action plan include the department chairs’ verification that adjunct colleagues have participated in the assessment dialogue for their departments.

Evaluation

To date in this second year of the second cycle of assessment, 76 percent of departments in Cohort I and 93 percent of departments in Cohort II have documented the involvement of their adjunct faculty colleagues. In all cases where adjunct faculty were not involved, either no adjunct faculty were in the department (e.g., Speech Language Pathology Assistant is a one-person program) or no adjunct faculty were teaching the course being assessed. In a further development supporting adjunct participation in the college’s assessment process, the SLO Assessment Committee’s proposal to add three adjunct representatives to its membership was approved by the Academic Senate and unanimously approved by the Planning Coordination Council at meetings in March 2012.

The college has fully implemented the visiting team’s recommendation to assure part-time faculty participation in the assessment of SLOs.


Reference

1.1 Student Learning Outcomes/Assessment Committee Web Page, http://inside.arc.losrios.edu/committees/assessment_slo.html


Recommendation 2: The College recognizes that student development needs for leadership, active participation in shared governance, and awareness and tolerance for diversity are paramount at American River College. In order to improve, the team recommends that efforts, such as conflict resolution training, Interest Based Alliance training, and the Community and Diversity Center Initiative, be established for students and institutionalized to address these critical student needs (I.A.1, II.B.1, II.B.3.b, II.B.3.d).

To organize the effort for accomplishing this recommendation and the planning agenda identified in the college’s 2009 self-study, the college recognized that the efforts supporting the second recommendation on meeting student development needs, including conflict resolution training, are related both to the ACCJC’s fourth recommendation on encouraging student participation in shared governance and to the college’s fifth planning agenda on performing a comprehensive review of Campus Life programs. This section of the midterm report focuses on (a) the Campus Life-organized training on conflict resolution and Interest Based Alliance and (b) the efforts made through the Community and Diversity Center Initiative.
Description

In response to the accreditation team’s recommendation, the Campus Life-organized training on conflict resolution and Interest Based Alliance work has included the following:

As stipulated by the college’s fifth planning agenda, “In consultation with faculty, staff, and student representatives, Student Services will perform a comprehensive review of Campus Life programs during the 2008-2009 academic years to ensure that these programs encourage personal and civic responsibility and development of the student beyond the classroom.” Subsequently, Campus Life’s comprehensive 2008-2009 program review identified training topics for the Campus Life staff to include in a series of student leadership summits; the topics addressed the training needs specified in both the second and fourth recommendations of the accreditation team [2.1]. Goals in developing the schedule for the student leadership summits included:

• Fostering team building skills

• Discovering individual strengths

• Communication and conflict management

• Student self-governance.

Supporting these goals were Interest Based Alliance (IBA) training; team building exercises at an off-campus facility that integrated team recreational challenges with opportunities to recognize how the dynamics of trust and shared risk support the work of the team; information on parliamentary procedure; and decision making and conflict management activities. To assure stronger mastery of the information covered, the first and second days’ agendas included time for reflecting on the concepts learned and for previewing their connection to the next day’s work. As framed in the goals and objectives statements for the 2010, 2011, and 2012 student leadership summits, the intended learning outcomes for the several days of training included [2.2, 2.3, 2.4]:

• Build a sense of community and teamwork among the leaders of the Associated Student Body and the Club and Events Board (CAEB, formerly the Inter-Club Council)

• Embrace and communicate the principles of respect, integrity, accountability and responsibility

• Actively collaborate and build meaningful relationships to achieve positive change

• Lead by positive example and commitment.

Evidence of the effectiveness of student leaders’ having achieved these learning outcomes can be observed in the collaboration between the Associated Student Body (ASB) and CAEB. Examples of this collaboration are the special, themed Club Day events sponsored by CAEB; these events focus on diverse student groups within the campus community. Additional evidence can be found in ASB-authored resolutions supporting an inclusive campus community (see Social Relativity Act F 09-04 in Appendix D).

In response to the accreditation team’s recommendation, the Community and Diversity Center Initiative’s work has included the following:

Based on assessment of the college’s programs, services, and environment of mutual respect, understanding, and tolerance, a plan was developed to realize the vision and mission of the Community and Diversity Center Initiative (CDCI), launched in spring 2009. To fulfill that vision, the CDCI has focused the last three years on establishing itself with a physical space and a web presence. The goal is to foster a respectful, inclusive, and culturally responsive community through supporting access and representation for all community members, creating a welcoming and diverse campus climate, providing education and training, and developing ways to recognize and affirm those who support and further community-building and diversity. The CDCI, staffed by a 0.20 release time director and by volunteers, has focused on:


1. Facilitating activities:

• Workshops and special events open to students, faculty, and staff for personal and professional development

• Effectiveness surveys

• CDCI’s Ally program

• Campus events

• Fostering space for open, responsible dialogue and peer mentoring

• Ongoing development for the CDCI staff and recruitment of new student practitioners.

2. Organizing resources:

• The campus connection point

• Bias incidence protocol (pending approval)

• Speakers’ bureau (in process)

• Community referrals

• Academic book exchange

• Diversity book and video library

• Appropriate web resources

• Teaching aids

• Curriculum and curriculum development.

The CDCI works to develop understanding and appreciation of diversity through workshop opportunities that focus on diversity while simultaneously nurturing leadership skills. From March 2009 through December 2011, the CDCI offered over 65 events, of which 40 were designed for students, included student participants, or were facilitated by students. Topics included “Diversity and Student Empowerment,” “Diversity and Me: Student Leadership training,” and “Privilege 101.” The CDCI has also developed a conversation series called “Community in Conversation” to introduce students, faculty, staff, and administrators to a variety of topics on diversity [2.5, 2.6]. Students have been trained to participate on the facilitating team for these workshops. Evaluations of these workshops have included such comments as, “Diversity is self-evident in all aspects of social interaction in ARC,” “I learned about other ethnicities and religions,” “That I am in fact in an unaware group majority,” “I learned to listen attentively and respect people’s opinions,” and “That I do make judgments wether [sic] I realize it or not” [2.7, 2.8, 2.9].


The CDCI’s philosophy is that diversity belongs to the entire college community, and beyond the efforts of the CDCI, the college has offered many opportunities to celebrate and learn about mutual respect, understanding, responsiveness, and the celebration of differences. A few of the many examples include sponsorship by student organizations of “Diversity Days,” “Beaver Week,” and “Club Day”; study breaks at “The Spot”; the gerontology department’s monthly workshops; Campus Life’s webinar on facilitation skills for diversity sessions and student leadership summits; the Equity Committee’s “Multicultural Week”; Umoja’s learning community; and the MESA program [2.5].

Evaluation

The college has worked on three fronts to address the accreditation team’s second and fourth recommendations and the college’s fifth planning agenda:

• The topics presented by the Campus Life office at the three leadership summits of July 2010, August 2011, and January 2012 cumulatively addressed student development needs concerning leadership: i.e., the summits began with team building and how to work together, moved on to working with others outside the team, and, most recently, looked at building organizational infrastructure and assuring continuity; learning activities at the summits focused on exploring the importance of the student voice in shared governance (self), teaching individual and group accountability and responsibility (self and other), and building positive campus collaborations (other). Campus Life will continue to present the leadership summits to help meet student development needs for leadership; e.g., the next leadership summit is scheduled for August 15, 2012.

• The Community and Diversity Center Initiative has provided college-wide educational opportunities that emphasize diversity, equity, and inclusion and promote civil discourse.

• The Planning Coordination Council has identified and implemented processes for encouraging greater ASB participation in the college’s shared governance processes (this point is fully addressed in the midterm report section on the accreditation team’s fourth recommendation and the college’s fifth planning agenda).

The college has fully implemented the visiting team’s recommendation to institutionalize the framework of leadership training and shared governance processes to meet the critical student development needs identified in the visiting team’s second and fourth recommendations and the college’s fifth planning agenda. The college fully meets the standards I.A.1, II.B.1, II.B.3.b, II.B.3.d.


References

2.1 Student Development Instructional Program Review 2008-2009, April 2010
2.2 Student Leadership Summit 2010: Goals and Objectives, July 30, 2010
2.3 Student Leadership Summit 2011: Goals and Objectives, July 21, 2011
2.4 Agenda, Student Leadership 2012, January 21, 2012
2.5 Community and Diversity Center March 2012 Report to the President, March 2012
2.6 Community and Diversity Center Workshops, March 2012
2.7 Community and Diversity Center Workshop Evaluation Results, October 4, 2011
2.8 Community and Diversity Center Workshop Evaluation Results, October 6, 2011
2.9 Community and Diversity Center Workshop Evaluation Results, November 1, 2011


Recommendation 3: In order to improve, the team recommends that the assessment of the services and the sequencing of the modules provided by the learning resource centers be formalized and systematic with support from the Research office (II.C.2).

The college recognizes that formalized and systematic assessment of the services in the Learning Resource Center (LRC) and the modules provided in the LRC will result in two outcomes:

• Improvement of the support services provided by the LRC

• Assurance that the modules through which the support services are sequenced and delivered provide the most effective assistance to students using LRC services.

In responding to this third recommendation, the college followed this sequence of actions:

1. Identified the services provided to students in the college’s Learning Resource Center
2. Through responses to a survey, asked the coordinators of those services to identify:
   a) The format used to provide the service to students, including whether services have been or are currently in module format
   b) The type of evaluation(s) used to assess and improve the quality of the services
   c) The intervals at which the evaluation(s) are completed.
3. After compiling the data described in steps 1 and 2, identified the appropriate actions, if any, for implementing the recommendation.

Description

Services provided to students in the college’s Learning Resource Center

Eight services are provided to students in the LRC:

• Tutoring services, which include individual tutoring (drop-in and scheduled) and tutoring through the Beacon program
• Open computer lab for students doing instructionally related work
• ESL center
• Foreign languages lab
• Reading center
• Reading across the disciplines (RAD)
• Writing center
• Writing across the curriculum (WAC).


To the survey question on whether modules had been or were currently the format for providing their services to students, the coordinators responded [3.1]:

1. Two services, tutoring and the open computer lab, indicated that their support is offered on a drop-in basis and cannot be provided in module format.
2. Six services indicated that their services have been or are currently provided in module format:
   a) Three of the services, those of the writing center, WAC, and RAD, had previously been offered in modules; however, as a result of the effort (begun well ahead of the ACCJC evaluators’ 2009 visit) to improve productivity, all three services were revised to lecture course format: the writing center and WAC in fall 2010, and RAD in spring 2010.
   b) The fourth service, the reading center, indicated that all of its services are offered in lecture format with content organized in module units, a change made in 2011.
   c) The two services that continue to offer their curricula in module format are the ESL center and the foreign language lab.
      • The ESL center offers ESL Center: Listening Skills in ESL (ESLL 97), ESL Center: Reading Skills in ESL (ESLR 97), ESL Center: Writing Skills in ESL (ESLW 97), ESL Center: Integrated Topics in ESL (ESL 97), and ESL Center: Skills in ESL (ESL 181).
      • The foreign language lab offers the course Foreign Language Lab: Integrated Topics in Spanish (SPAN 131), supporting SPAN 300; in addition, the lab offers drop-in assistance for all levels of Spanish.

The course outlines for the curricula offered by both the ESL center and the foreign language lab have been reviewed by the Academic Senate’s Curriculum Committee through its regular curriculum process. Each course outline has been approved by the Curriculum Committee with a minimum of five SLOs identified for each course.

The type of evaluation(s) used to assess and improve the quality of the services

All eight services evaluate the quality of their programs and services to students. The specific categories of assessment instruments and data used to identify program improvements are summarized below:


• Institutional research data and program review
Excluding the open computer lab service, the LRC-housed services receive summary data from the “Academic Support Programs” reports provided by the research office. This data provides, for each support service, a detailed look at enrollment, success rates, demographics, and, for those support services tied to specific instructional courses, success in all other courses associated with the specific support services. Because all eight services are included in the program reviews for the instructional departments of which the services are a part, all eight services receive the institutional research data supporting the program review process. For example, because they are housed in the English area, program reviews for the writing center and WAC are included in the program review for the English department, and the program reviews for the reading center and RAD are included in the program review for the reading department; similarly, the programs for the ESL and foreign language labs are included in the program reviews for those respective departments.

• SLO student self-assessment data
Five of the services participate in the three-year SLO assessment process: tutoring services, the reading center, the writing center, WAC, and RAD. The SLO Assessment Committee reviews and approves the assessment instruments and action plans for these services. Because the LRC’s foreign language lab provides supplemental instruction, its SLO assessment is captured in the curriculum SLOs for the courses supported by the lab’s supplemental instruction.



• Program assessment data
The ESL center administers a two- to four-page intake test and a one- to three-page post-module test to all ESL center students. Analysis of the tests allows ESL faculty to identify where module sequencing should be adjusted to improve student learning; for example, several modules are rewritten each semester, and additional materials are developed to address specific student needs identified through the tests.



• Student surveys
All eight services housed in the LRC administer surveys to their students at the completion of their work with the services, and faculty use information compiled from these surveys to evaluate and improve their programs. Starting in spring 2009, the reading and writing centers and the WAC and RAD programs began using a common student satisfaction survey administered when students complete their work with the services.

Examples of the student satisfaction survey and the ESL intake and post-module tests are compiled in Appendices E and F.


The intervals at which the evaluation(s) are completed

Each of the eight services indicated that evaluations of their programs are completed at least every semester because of the commitment to survey the students participating in these support programs. In addition, the reading and writing centers, WAC, RAD, and the foreign language and ESL centers are included in evaluations and assessments every three and six years, respectively, as part of their departments’ SLO and program review processes. For example, SPAN 131 supports SPAN 401, 402, and 411 and is assessed through those courses.

Evaluation

The results of the analysis described above affirm that even before the fall 2009 ACCJC accreditation evaluators’ visit:

1. The eight support services housed in the Learning Resource Center were already formally and systematically participating in assessment processes to identify and implement improvements to their programs
2. These processes include the use of institutional research data
3. At the time of the visit, six of the services were module-based
4. As a result of discussions begun before the 2009 visit and a desire to improve productivity, four of the six module-based services were already moving from module- to lecture-based formats and have since completed that transition
5. The two services that continue to use a module-based format do so because the effectiveness of their programs is supported by data compiled through (a) SLO assessment data for the courses they support, (b) program assessments composed of intake and post-diagnostic tests, (c) student surveys, (d) program review, and (e) institutional research.

Prior to the 2009 visit, the services offered in the Learning Resource Center were already observing formalized and systematic assessment, and the departments offering services in modules had already begun to revise their services to lecture format. For the two services that continue to offer modules, assessment remains systematic and formalized. The college has fully implemented the visiting team’s recommendation to assess systematically and formally the services provided by the Learning Resource Center and to assure that the modules supporting these services are sequenced to provide the most effective assistance to students using LRC services. The college fully meets Standard II.C.2.


Reference
3.1 LRC Support Services Survey Results: Spring 2012


Recommendation 4: In order to improve, it is recommended that student participation in the shared governance processes and committees be actively encouraged to ensure the student voice is not lost. This recommendation was recognized in the Self Study, is detailed as a planning agenda, and was spoken to during the team visit (IV.A.1, IV.A.2.a, IV.A.3).

As referenced in the visiting team’s fourth recommendation to actively encourage student participation in the shared governance processes, American River College had identified, as one of the eight planning agendas in its self study, the Planning Coordination Council’s responsibility to develop procedures to assist the Associated Student Body in improving the current level of student participation on the college’s standing committees.

Description

The college has encouraged student participation in shared governance through the following efforts by the Planning Coordination Council (PCC), the college’s shared governance body:

1. To support the developing awareness of the need to encourage ASB participation in the college’s shared governance processes, the PCC added in spring 2010 a new section to the template for the reports completed by the standing committee chairs at the end of each academic year. The new section asked committee chairs to state how many ASB reps were appointed to their committees and to report on the frequency (i.e., not at all, sometimes, most of the time, all of the time) of attendance by the ASB reps attending committee meetings [4.1].

2. On August 4, 2011, the chair of the Planning Coordination Council, the college president, and the Academic and Classified Senates’ presidents participated in a panel discussion at the 2011 ARC student leadership summit. The panel discussion focused on the college’s shared governance processes and included discussion and a Q&A session on the definition of shared governance and the role of the college constituencies in assuring the effectiveness of the shared governance processes at the college. During this segment of the training, the constituency leaders discussed with the ASB leaders the concept of shared governance, including the Title 5 authority and District policy on which shared governance is based, and ARC’s experience of collegial and collaborative decision making based on shared governance processes. The chair of the Planning Coordination Council described the function of the Planning Coordination Council as the college’s shared governance body, identified the 10 standing committees, listing each of the committees’ specific responsibilities, and enumerated the number of ASB seats on each committee: a total of 13 student positions among the ten standing committees.²

3. At its December 2011 and February 2012 meetings, the Planning Coordination Council discussed the topic, “Participatory Governance: best practices supporting ASB participation.” In December, the PCC reviewed the results of a survey sent to standing committee chairs in November; the survey requested (a) specific information about ASB representatives’ participation in standing committee meetings and (b) the chairs’ recommendations for encouraging greater participation at meetings by ASB representatives [4.2].

4. A survey created for the ASB leadership was prepared for distribution at the ASB’s January 2012 student leadership summit; questions in the survey sought the students’ perspective on the value of shared governance and requested their suggestions for improving students’ participation in the college’s shared governance processes [4.3].

5. At the PCC’s meeting in February 2012, the results of the responses to the ASB leadership survey were shared; it was also announced that the ASB president had been invited to meet with the PCC chair and the IT research assistant II overseeing the surveys to discuss the recommendations that emerged from both survey efforts.

A summary of the information compiled from responses to both surveys is shown below.

• The standing committee chairs’ survey indicated that 72.4 percent of the committee seats for ASB student representatives had been filled; however, the frequency of attendance ranged from attending every meeting to never attending.

• In the students’ survey, 100 percent of the respondents strongly agreed that having student representation on college committees is important.

• From both surveys, two common actions were suggested for improving student participation on standing committees:
  o Sending student representatives reminders of each meeting that include the agenda and supporting documents, a standard practice of the standing committee chairs
  o Offering a clear orientation that covers the functions of the committee. [Note: orientation of the membership is a function undertaken at the standing committees’ first meetings.]

Additionally, the student survey suggested two other actions to increase awareness of and encourage ASB participation in shared governance: (a) include information about ASB positions on standing committees at Welcome Day and Club Day and (b) advertise to the student population the opportunities to participate in shared governance.

² At its meeting in September 2011, the PCC approved elevating the Basic Skills Initiative work group to a standing committee, to which the ASB appoints one student representative.


The table below summarizes the survey results [4.4].

| Suggested actions to improve ASB participation | Standing Committee Chairs | ASB/CEAB Representatives |
|---|---|---|
| Email meeting reminders to student representatives, and include agenda and supporting documents | ✓ | ✓ |
| Provide clear orientation on the committees’ work, members’ responsibilities, and the meeting schedule | ✓ | ✓ |
| Include information about standing committee opportunities at Welcome Day and Club Day | | ✓ |
| Advertise the opportunities to participate in shared governance to the student population | | ✓ |

Complementing the efforts of the Planning Coordination Council, the Campus Life staff has planned and presented leadership training for ASB officers and leaders of the Club and Events Board. The summer leadership summits in 2010 and 2011 included presentations by college constituency leaders discussing their contributions to the college’s shared governance processes; topics included, for example, Robert’s Rules of Order, encouraging participation in meetings, and working with other constituency groups. Across the summits held July 30-August 1, 2010, August 3-4, 2011, and January 10-11, 2012, a combined total of approximately 45 students participated. Panel discussions, team building exercises, and opportunities for reflection supported the themes of the leadership summits:

• In July 2010, the theme was communication and how to work as a team; activities focused on recognizing individual strengths and valuing the contributions of the individual to the efforts of the group as a whole; Interest-Based Alliance training was featured in this first leadership summit.

• In August 2011, the theme was working effectively with others outside one’s own team; activities focused on communication, conflict management, and solving problems when a group must communicate and work with outside groups to obtain the resources it needs.

• In January 2012, the theme was building organizational infrastructure and working toward continuity; activities focused on procedures, succession planning, documenting processes to avoid reinventing the wheel, setting realistic goals and following established timelines, and improving processes through evaluation and re-evaluation.


Supported by the Campus Life staff, the Associated Student Body’s efforts to assure the student voice is not lost include:

• In 2010-2011, the Associated Student Body initiated work on a resolution to incentivize student participation on shared governance committees; although discussion and revision of the resolution are ongoing, the draft document serves as evidence of the value that student leaders continue to place on the student voice in shared governance.

• In the transition between the 2010-2011 and 2011-2012 academic years, the Associated Student Body formally added shared governance reports to its regular meeting agenda, thereby formalizing discussion about shared governance at Student Senate meetings.

Evaluation

On February 22, 2012, the Associated Student Body president met with the IT research assistant II, who had compiled both of the surveys on which the recommendations for increasing student participation in shared governance are based, and with the chair of the Planning Coordination Council. Their discussion affirmed the suggestions identified by the surveys in which the Planning Coordination Council and the Associated Student Body leadership had participated. Further, their discussion identified these four additional recommendations to achieve continuity and sustained commitment to ASB participation in the college’s shared governance processes:

a) Coordinating a College Hour training for students on student government and how to participate in student government and shared governance
b) Advertising the ASB’s open seats on the standing committees on the digital marquee within the first two weeks of every semester
c) Hosting a PCC shared governance information table at both Welcome Day and Club Day
d) Providing a template for ASB representatives to submit written reports summarizing standing committee meetings.

The college has fully implemented the accreditation team’s recommendation to actively encourage student participation in shared governance processes and assure that the student voice is not lost. The college fully meets Standards IV.A.1, IV.A.2.a, and IV.A.3.


References
4.1 Standing Committee Annual Report for PCC, Spring 2010
4.2 Personal communication, email to PCC members from Jane de Leon, “Request for your Input: ASB’s participation on your standing committees,” November 21, 2011
4.3 ASB Shared Governance Student Survey, January 2012
4.4 Participatory Governance: best practices supporting ASB participation, December 5, 2011


Response to Self-Identified Issues

Item 1
Standard: I.B.1
College Self-Study 2009 Planning Agenda: In 2009-2010, the Planning Coordination Council will develop procedures to assist the Student Association to improve the current level of student participation on the college’s standing committees.
Completion Date: Spring 2012

2012 Status Update
Recognizing the work begun by the Campus Life staff in 2008-2009 to complete its comprehensive program review, including identifying what was needed to enhance the student government experience, PCC began its work on this planning agenda in 2010-2011 and has completed the following:
• Added to the standing committees’ 2010-2011 end-of-year reports a new section asking how many of the committee’s ASB representatives were appointed and how often the ASB reps attended meetings
• Deployed surveys to standing committee chairs in November 2011 and to ASB leadership in January 2012 to request best practices for supporting ASB participation
• Discussed the results of the surveys in December 2011 and February 2012.

On 22 February 2012, the ASB president, PCC chair, and staff researcher met to discuss the specific suggestions that emerged from the surveys. The two best practices identified in both the chairs’ and students’ surveys were: a) present a clear orientation and expectation of meeting times and b) e-mail meeting reminders to students (N.B.: both are standard practices of the committee chairs); also, students requested that information on the standing committees and opportunities to participate in shared governance be made available during Welcome Day and Club Day [5.1]. Further, the discussion of the survey information resulted in agreement to undertake these additional actions aimed at achieving continuity and sustained commitment to ASB participation in the college’s shared governance processes: a) coordinating a College Hour training for students on student government and how to participate in student government and shared governance; b) within the first two weeks of every semester, advertising on the digital marquees the ASB’s open slots in the standing committees; c) staffing by representatives of the PCC, the college’s shared governance group, of a shared governance information table at both Welcome Day and Club Day; d) providing a template for ASB reps to submit written reports summarizing standing committee meetings.


Item 2
Standard: I.B.6
College Self-Study 2009 Planning Agenda: In the spirit of continuous quality improvement, during 2009-2010 the Planning Coordination Council will study the operational procedures supporting program review and the EMP. To ensure the completeness of the study, the PCC will seek direct involvement from the users themselves as well as from their constituency leaders.
Completion Date: Spring 2010 through Spring 2012

2012 Status Update
To assure continuous quality improvement of the college’s operational procedures supporting program review and the EMP, the IT research assistant II who serves as a permanent resource to the Planning Coordination Council:
• Administers annually in April an evaluation survey to faculty and staff who have participated in program review and the EMP
• Presents the results of the evaluation survey to PCC the following fall.

From 2009-2010 through 2011-2012, the college made these improvements to the program review process:
• Developed the planning implementation report
• Added a section to the end-of-year reports asking standing committee chairs to document their actions responding to program review planning implications related to their committees’ functions; e.g., PD committees were asked to explain how they responded to resource requests for professional development
• Recorded the program review presentations for use in training workshops
• Established faculty mentors to support members of the program review cohort in completing the program review process
• Scheduled training workshops using Outlook reminders sent at the beginning of the semester
• Formed the Academic Senate’s Program Review Subcommittee (formerly the program review task force)
• Created the program review council to receive program review presentations; members include constituency leaders, chairs of the Curriculum and SLO Assessment Committees, the deans, AVPs, the PCC chair, the research resource staff member, and the President’s Executive Staff
• Refined the planning implementation report and created the follow-up report to facilitate “closing the loop”; i.e., effective spring 2010, a program review planning implications summary is distributed to the Planning Coordination Council, the college’s shared governance group, and standing committee chairs and deans are asked to document their actions supporting resource requests; this compilation is then posted for college-wide distribution [5.2]
• Revised the templates for the end-of-year reports to align more closely with the planning implementation report.


Item 2 (continued)
Standard: I.B.6
College Self-Study 2009 Planning Agenda: In the spirit of continuous quality improvement, during 2009-2010 the Planning Coordination Council will study the operational procedures supporting program review and the EMP. To ensure the completeness of the study, the PCC will seek direct involvement from the users themselves as well as from their constituency leaders. (cont.)
Completion Date: Spring 2010 through Spring 2012

2012 Status Update
Improvements made to the EMP:
2010-2011
• In training workshops, demonstrated the connection between the SLO assessment process and the EMP
2011-2012
• Updated training to describe more fully the planning process, including the connection between program review and the EMP
• Scheduled drop-in labs to supplement EMP training workshops
• Met with deans to discuss the EMP process and facilitate inclusion of their suggested changes to the EMP process
• Met with faculty and classified staff representatives to gather feedback and suggestions for improvement
• Met with college deans and associate vice presidents in May 2012 to receive suggestions on improving the EMP process.

Item 3
Standard: II.A.1.c
College Self-Study 2009 Planning Agenda: The Student Learning Outcomes Assessment Committee will propose a process during fall 2009 for assessing institutional student learning outcomes. The proposal will be discussed by the college’s governance groups (Academic Senate, Classified Senate, Associated Student Body, and managers) and modified as necessary during spring 2010.
Completion Date: Fall 2009

2012 Status Update
Members of the SLO Assessment Committee met in June 2009 to create both the student self-assessment and the faculty designed assessment processes for assessing institutional student learning outcomes (ISLOs). The assessment process mirrors the faculty designed assessment and student self-assessment process used for assessing course-level SLOs and operates as an ongoing process on a two-year cycle.

To create the student assessment process, members of the committee looked at the CCSSE survey instrument and chose the specific questions most closely aligned with ARC’s ISLOs. The committee then created the survey on which faculty and staff (a) ranked ISLOs in terms of how much their actions indirectly or directly support students’ ability to achieve the ISLOs and (b) identified a single ISLO representing ARC’s greatest challenge. Discussion of the activity supporting the faculty and staff assessment was proposed.

During the faculty and staff assessment process at the spring 2010 convocation, participants were asked to identify effective ways to measure ISLOs and how programs contributed to student achievement of the ISLOs.

The ISLO assessment process was discussed at meetings of the Academic Senate in August 2009 and the PCC on 2 November 2009.


Item 4
Standard: II.A.6.c
College Self-Study 2009 Planning Agenda: During the 2009-2010 academic year, the public information officer will distribute guidelines for program brochures to department chairs and deans to ensure that such brochures remain accurate and up to date.
Completion Date: Fall 2010

2012 Status Update
The document “Guidelines for ARC Brochures & Special Publications” was created in 2010 and distributed to department chairs and deans in fall 2010. The guidelines are posted in the ARCDocs database and are accessible from the following link: http://arcdocs.arc.losrios.edu/results.aspx?n=Brochures&s0=f&s1=t. A copy of the guidelines is sent to new deans on starting their assignments.

Item 5
Standard: II.B.3.b
College Self-Study 2009 Planning Agenda: In consultation with faculty, staff, and student representatives, Student Services will perform a comprehensive review of Campus Life programs during the 2009-2010 academic year to ensure that these programs encourage personal and civic responsibility and development of the student beyond the classroom.
Completion Date: Spring 2010 through Spring 2012

2012 Status Update
Student Services completed a comprehensive program review in 2008-2009 that resulted in Campus Life’s new programs to meet student development needs beyond the classroom. Because the concerns of this planning agenda comprehend those of the ACCJC’s second recommendation on institutionalizing training to meet student development needs, the summary of the college’s response to that recommendation is also shared here. The college has institutionalized training to meet student development needs through the following:
• The topics presented by the Campus Life office at the three student leadership summits of July 2010, August 2011, and January 2012 build cumulatively to address student development needs for leadership: the summits began with teambuilding and how to work together, moved on to working with others outside the team, and, most recently, looked at building organizational infrastructure and assuring continuity.
• The Community and Diversity Center Initiative has provided college-wide educational opportunities that emphasize diversity, equity, and inclusion and promote civil discourse.

Item 6
Standard: II.B.3.d
College Self-Study 2009 Planning Agenda: The faculty coordinator of the Community and Diversity Center Initiative will assess the college’s programs during the 2009-2010 academic year and report to the president on the steps that may be necessary to enhance the institution’s environment of mutual respect, understanding, and tolerance.
Completion Date: Spring 2009 through Spring 2012

2012 Status Update
The CDCI works to develop understanding and appreciation of diversity through workshop opportunities that focus on diversity while simultaneously nurturing leadership skills. From March 2009 through December 2011, the CDCI offered over 65 events, of which 40 were designed for students, included student participants, or were facilitated by students. Topics included “Diversity and Student Empowerment,” “Diversity and Me: Student Leadership training,” and “Privilege 101.” The CDCI has also developed a conversation series called “Community in Conversation” to introduce students, faculty, staff, and administrators to a variety of topics on diversity. Students have been trained and participate as part of the facilitating team for these workshops.


Standard

Item

continued

6 II.B.3.d

The faculty coordinator of the Community and Diversity Center Initiative will assess the college’s programs during the 2009-2010 academic years and report to the president on the steps that may be necessary to enhance the institution’s environment of mutual respect, understanding, and tolerance. (cont.)

College Self-Study 2009 Planning Agenda Spring 2009 through Spring 2012 continued

Completion Date

 





Institutionalizing funding for a full-time, rotating CDCI director and classified support Include in CDCI staffing employees from the Instruction and Student Services areas Institutionalize funding for resources and programming opportunities Include cultural competence in learning outcomes assessment, curriculum proposals, teaching institutes, interview committee and department chair training, basic skills grants and programs, tutor and peer advisor training [5.3].

To enhance the institution’s environment of mutual respect, understanding, and tolerance and allow the CDCI to function more effectively within the college organization, the current CDCI director recommends the following future actions resulting from her assessment of the college’s programs, as called for in the planning agenda:

The CDCI’s philosophy is that diversity belongs to the entire college community, and beyond the efforts of the CDCI, the college has offered many opportunities to celebrate and learn about mutual respect, understanding, mutual responsiveness, and celebrating differences. A few of the many examples include sponsorship by student organizations of “Diversity Days,” “Beaver Week,” and “Club Day”; study breaks at “The Spot”; the gerontology department’s monthly workshops; Campus Life’s webinar on facilitation skills for diversity sessions and student leadership summits; the Equity Committee’s “Multicultural Week”; Umoja’s learning community; and the MESA program.

Evaluations of these workshops have included such comments as, “I learned that ‘flying under the radar’ isn’t the best method to learn in a social setting,” “Diversity is fun,” and “I learned that I shouldn’t keep controversial opinions to myself when the time is appropriate” [5.3, 5.4, 5.5, 5.6].

Standard III.A.4.c, Item 7

College Self-Study 2009 Planning Agenda: The college's Research Office will undertake its own survey of employee satisfaction during the 2009-2010 academic year (a) to identify more specifically the particular issues raised by the District survey as these relate to American River College employees and (b) to help the college identify opportunities to resolve those issues.

Completion Date: Not applicable

2012 Status Update: Upon further study after the fall 2009 visit, this planning agenda was found not to be applicable, so it was not undertaken. During follow-up discussion at meetings of the President's Executive Staff and with constituency leaders, it was determined that this planning agenda for III.A.4.c was not accurately identified as a matter for the college to pursue for the following reasons:

- Concerns identified by the District's "How are we doing?" LRCCD spring 2008 employee satisfaction survey can only be addressed at the District level, because these concerns are the purview of collective bargaining agreements and the District's Human Resources office (e.g., benefits, the promotion process).
- Actual levels of dissatisfaction by constituency groups within ARC could not be identified, because the issues in the 2008 District survey of employee satisfaction, though identified by college, were presented only by constituency (i.e., faculty, classified staff, managers) across the District.

Standard IV.A.2.a, Item 8

College Self-Study 2009 Planning Agenda: See the planning agenda for Standard I.B.1.

Completion Date: Spring 2012

2012 Status Update: The 2009 self study identified the similarity of concern that was the focus of the planning agendas supporting Standards I.B.1 and IV.A.2.a. As described earlier in this summary of the planning agenda, the work undertaken for Standard I.B.1 also supports Standard IV.A.2.a.

References
5.1 Participatory Governance: Best Practices Supporting ASB Participation, December 5, 2011 (also referenced as 4.4)
5.2 Program Review Follow-Up: Follow-Up on Resource Requests Identified at Program Review Presentations, Cohorts 2009-2010 and 2010-2011, February 21, 2011
5.3 Community and Diversity Center March 2012 Report to the President, March 2012 (also referenced as 2.5)
5.4 Community and Diversity Center Workshop Evaluation Results, October 4, 2011 (also referenced as 2.7)
5.5 Community and Diversity Center Workshop Evaluation Results, October 6, 2011 (also referenced as 2.8)
5.6 Community and Diversity Center Workshop Evaluation Results, November 1, 2011 (also referenced as 2.9)


APPENDICES

Appendix A Improvements in Student Learning as a Result of ARC’s SLO Assessment Process: Report on Cohort I, 2007-2008 vs. 2010-2011...........................................................................49

Appendix B Addendum: Guidelines for departments completing ARC SLO Action Plans responding to SLO student self-assessment survey results containing a statistically significant negative deviation .....................................................................................................................................55

Appendix C Instructions to Accompany ARC Faculty Designed Assessment Plan Entry Template: 2nd Cycle....................................................................................................................................59

Appendix D Resolution F 09-04: Social Relativity Act .................................................................................65

Appendix E Student Satisfaction Survey .......................................................................................................69

Appendix F ARC ESL Center Diagnostic Placement Exam and Post Test ...................................................81


APPENDIX A Improvements in Student Learning as a Result of ARC’s SLO Assessment Process: Report on Cohort I, 2007-2008 vs. 2010-2011

Yuj Shimizu, PhD August 22, 2011


Improvements in Student Learning as a Result of ARC’s SLO Assessment Process: Report on Cohort I, 2007-2008 vs. 2010-2011 Yuj Shimizu, PhD, Department of Planning, Research, Technology, and Professional Development, American River College, 08/22/2011

ANALYSIS I

Statistical analyses were conducted on student self-assessment SLO data for 450 different courses (aggregated over sections and instructors) for all 18 departments in Cohort I to determine whether improvements in student learning had occurred as a result of ARC's SLO assessment process. Both the first (2007-2008) and second (2010-2011) cycles were analyzed.

Analysis I was conducted in two stages. First, the courses were analyzed to determine whether a statistically significant deviation was present for each course. Such a deviation would indicate that the SLOs were not evenly rated. If the analysis returned a significant result, the data pattern was subsequently examined to determine whether, in particular, a negative deviation was present (see Figure 1). A significant negative deviation is a clear indicator that student success on a particular student learning outcome may need to be improved through corrective action.

[Figure 1. Example of a negative deviation.]

Importantly, a reduction in the percentage of surveyed courses that contained a significant negative deviation from the first cycle to the second cycle would represent initial evidence of improvements in student learning outcomes produced within ARC’s SLO process.
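The two-stage procedure described above can be sketched in code. This is a minimal illustration, not ARC's actual analysis program: the repeated-measures ANOVA and the p < .01 standard are documented in the report (see Appendix B), but the post-hoc rule for labeling a deviation "negative" is not published, so the flagging criterion below (an SLO mean falling below the mean of the remaining SLOs) is an assumption.

```python
import numpy as np
from scipy import stats

def negative_deviation(ratings, alpha=0.01):
    """Two-stage check on an (n_students x k_slos) matrix of survey ratings.

    Stage 1: one-way repeated-measures ANOVA across SLOs, treating each
    student as the repeated factor.  Stage 2: if significant, flag SLOs
    whose mean is below the mean of the remaining SLOs (assumed rule).
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    slo_means = x.mean(axis=0)
    subj_means = x.mean(axis=1)

    # Partition the total sum of squares: SLO effect, subject effect, error.
    ss_slo = n * ((slo_means - grand) ** 2).sum()
    ss_subj = k * ((subj_means - grand) ** 2).sum()
    ss_total = ((x - grand) ** 2).sum()
    ss_err = ss_total - ss_slo - ss_subj

    df_slo, df_err = k - 1, (n - 1) * (k - 1)
    f = (ss_slo / df_slo) / (ss_err / df_err)
    p = stats.f.sf(f, df_slo, df_err)       # upper-tail p-value of the F statistic

    flagged = []
    if p < alpha:                           # stage 2 only after a significant ANOVA
        for j in range(k):
            others = np.delete(slo_means, j).mean()
            if slo_means[j] < others:       # assumed "negative deviation" rule
                flagged.append(j)
    return p, flagged
```

For example, a course whose fourth SLO is rated about two points lower than the other three would return a p-value far below .01 with only that SLO flagged.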


Results

In the first cycle (2007-2008), 26.32% of surveyed courses contained a significant negative deviation. In the second cycle (2010-2011), only 16.84% of surveyed courses contained a significant negative deviation (see Figure 2). Therefore, the percentage of courses containing a significant negative deviation dropped by 9.48 percentage points, which amounts to a 36% relative improvement (reduction) in the second cycle compared to the first.

[Figure 2. Cohort I: percentage of courses containing significant negative deviations, 2007-2008 (1st cycle) vs. 2010-2011 (2nd cycle).]
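The distinction between the absolute drop (in percentage points) and the relative improvement can be verified with a line of arithmetic; the values below simply restate the figures from the text:

```python
first, second = 26.32, 16.84            # % of courses with a significant negative deviation

absolute_drop = first - second          # measured in percentage points
relative_drop = absolute_drop / first * 100   # reduction relative to the first cycle

print(round(absolute_drop, 2))          # 9.48 percentage points
print(round(relative_drop))             # 36 (% relative improvement)
```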

At the department level, the vast majority of departments (70%) experienced improvements compared to their first cycle. These results represent ARC's initial evidence of improvements in student learning outcomes at the course level produced through the student self-assessment portion of our college's SLO assessment model.

ANALYSIS II

A follow-up analysis was conducted (at the behest of J. Gamber) to determine whether the specific courses in which departments had taken action to improve student learning (as indicated by "action completed" on their implementation reports) contained fewer significant negative deviations in 2010-2011 than in 2007-2008, thereby indicating improvement in student learning specifically for courses in which actions had been taken.


Results

In 2007-2008, prior to the corrective actions, 37.5% of the courses in which departments would ultimately take action contained a significant negative deviation.* Following the corrective actions, only 25% of these courses contained a significant negative deviation (see Figure 3). This change represents a 12.5 percentage-point decrease and a 33% relative improvement from the first to the second cycle.

[Figure 3. Cohort I: percentage of courses containing significant negative deviations that were acted upon after the 1st cycle, 2007-2008 (1st cycle) vs. 2010-2011 (2nd cycle).]

Therefore, the results of Analysis II represent ARC's initial evidence of improvements in student learning outcomes at the course level produced specifically through actions that were taken toward achieving that goal. These data indicate a direct association between corrective actions taken and subsequent improvements in student learning.

In summary, after Cohort I went through its initial re-assessment, solid and substantial evidence of improvements in student learning was observed, both broadly across courses and departments as a whole and specifically for courses in which actions were taken in an attempt to improve student learning. These results also provide important validation for the student self-assessment stream of ARC's two-pronged approach to SLO assessment.

*Side note: One might wonder why only 37.5% (rather than 100%) of the courses acted upon contained a significant negative deviation to begin with, when negative deviations are supposed to be the indicator for enacting corrective action. The answer is that at the outset of this process, the college did not have the aid of a statistical test or standard by which to identify courses containing statistically significant negative deviations. This drawback within our processes was corrected in 2009-2010 with the adoption of the repeated-measures ANOVA statistical procedure for identifying significant deviations and the adoption of the p < .01 standard. As such, starting with Cohort III (2009-2010), this starting percentage is expected to be at or near 100%.


APPENDIX B Addendum: Guidelines for departments completing ARC SLO Action Plans responding to SLO student self-assessment survey results containing a statistically significant negative deviation

Student Learning Outcomes Assessment Committee August 2011


Addendum: Guidelines for departments completing ARC SLO Action Plans responding to SLO student self-assessment survey results containing a statistically significant negative deviation

SLO student self-assessment survey results are one form of evidence regarding achievement of student learning outcomes for a given course. These results are analyzed using a statistical procedure to determine if one or more SLOs are rated more negatively compared to the majority of SLOs for a given course (i.e., if a statistically significant negative deviation exists [1]). Such data indicate that action may be warranted to improve achievement for the lower-rated SLO(s) (see figure below).

Previous research has shown that the absolute rating on these types of surveys (e.g., all ratings being generally positive, or generally negative) is not a valid indicator of achievement (Dunning, Heath, & Suls, 2004), as such ratings are prone to many response biases, such as overconfidence. However, these response biases would not create selectively lower ratings for a particular SLO. Our statistical procedure is designed to detect these selectively lower ratings. Therefore, if a statistically significant negative deviation is detected, action may be warranted to improve achievement for the lower-rated SLO(s).

There could, however, be a variety of reasons, specific to the discipline or situation, why no action might be taken. For example:

- The SLO is known to be more difficult because (insert discipline-specific rationale), and its rating relative to the other SLOs is acceptable in the department's opinion
- The SLO's relative rating is in line with industry/agency standards (cite industry/agency)
- There exists contrasting evidence that the SLO is being met satisfactorily as measured by (describe measure)
- An unexpected situation, specific to a course/semester, led to the lower rating(s) (describe situation)

[The above list is not exhaustive; there could be other valid reasons.]

However, if a program presents responses such as those stated below, then the SLO Assessment Committee will require further explanation from the program:

- The deviation is not significant
- No action will be taken (without providing a discipline- or situation-specific rationale)
- The ratings are generally on the positive side (such as in the figure above); therefore, achievement is satisfactory. (As mentioned above, this response would not be defensible in light of known biases that can affect the overall positioning of all ratings.)

References

Dunning, D., Heath, C., & Suls, J. (2004). Flawed self-assessment: Implications for health, education, and the workplace. Psychological Science in the Public Interest, 5, 69-106.

                                                            

[1] Analyses are conducted using the repeated-measures analysis of variance (ANOVA) statistical procedure and a standard of p < .01.


APPENDIX C Instructions to Accompany ARC Faculty Designed Assessment Plan Entry Template: 2nd Cycle Student Learning Outcomes Assessment Committee August 2010


Faculty Designed Assessment Plan Instructions Sept 8, 2010

Instructions to Accompany ARC Faculty Designed Assessment Plan Entry Template: 2nd Cycle

[For use by departments prior to conducting the Faculty Designed (Direct) Assessment]

The following instructions, guidelines, and suggestions are intended to provide departments with detailed guidance as they complete the Faculty Designed Assessment Plan Entry Template. The numbers below correspond to the numbered questions on the template.

1. Comparison to previous assessment cycle
- Please refer to your previous SLO assessment action plan (the Research Office can provide a copy).
- In cases where attempts were made to improve student learning (i.e., actions were listed in your action plan), individual SLOs should be reassessed.
- Departments are encouraged to expand the scope of assessment where possible, though time and resource limitations must be considered.
- In situations where previous investments of time need not be duplicated (e.g., you can reuse a previously developed assessment tool), departments should strongly consider conducting additional assessments.
- Departments with multiple course designators should consider assessing a course in a new designator.

2. Course name
- Please provide the course number and full course name (e.g., GEOG 300, Physical Geography).

3. Rationale for choosing this course
- As with previous assessments, courses should be chosen for their relative impact on your students and for their connection to your curriculum. This might include courses with high enrollment, capstone courses, gatekeeper courses, and/or key courses required for a degree or certificate.
- Courses should not be chosen simply because an assessment tool already exists and/or because an instructor offers to do the work on behalf of the department.

4. SLOs to be assessed
- Approved guidelines do not mandate that all SLOs for a course be assessed, or that those assessed be assessed simultaneously.
- There will understandably be a trade-off between the time required to deploy and analyze the assessment tool (e.g., multiple-choice tests vs. scoring essays using a shared rubric) and the number of SLOs that can be practically assessed.
- You are required to assess at least one SLO of record (directly from SOCRATES), though you may also assess one or more draft SLOs as agreed upon by your department.

5. Timing of assessment
- For each SLO assessed, please list the semester, year, and approximate timing during the semester when the assessment will take place (e.g., at the end of the Spring 2011 semester, or approximately halfway through the Fall 2009 semester).
- Typically, assessments would be expected to occur at the end of the semester, though they could occur earlier if all materials related to a particular SLO have already been covered.

6. Breadth of assessment


- Please list the number of sections assessed out of total sections taught, or the percentage of students assessed per section.
- Approved guidelines do not mandate any minimum criteria for number of sections or expected sample size. The actual number chosen should reflect a balance of practical considerations (i.e., what is feasible) and ideal benchmarks (e.g., what kind of participation and extent of data would be viewed by the instructors as being reliable and valid).
- The Research Office can assist departments with applying a statistically valid sampling protocol where assessing 100% of students in all sections is not possible.
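One common way to size such a sample, sketched below, is Cochran's formula for estimating a proportion within a margin of error, with a finite-population correction for the known course enrollment. This is an illustrative assumption about what a "statistically valid sampling protocol" might look like, not the Research Office's documented method.

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size to estimate a proportion within +/- `margin` at the
    confidence level implied by `z` (1.96 -> 95%), using Cochran's formula
    with finite-population correction; p=0.5 is the conservative choice."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2     # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)          # finite-population correction
    return math.ceil(n)

# Hypothetical example: a course with 120 students enrolled across all sections.
print(sample_size(120))   # 92
```

In practice, the required sample shrinks sharply for small enrollments, which is why the correction matters for individual courses.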

7. The assessment tool(s)
- Examples include but are not limited to the following: using a shared rubric to score an essay, a portfolio, a performance, an exhibition, a demonstration, a case analysis, or a business plan; using a shared skills checklist aligned with SLOs to assess specific abilities (e.g., nursing skills, welding skills, culinary skills); using a common test (e.g., a multiple-choice test created jointly by instructors, or a standardized test used by a program's professional association); aligning classroom assessment techniques (CATs) already in use with course SLOs and using them jointly with other instructors.
- Where possible, use the underlined terminology above in your description.
- Please note that, in contrast with previous practice, departments are now required to submit a fairly complete draft of all assessment tools (quiz, exam, assignment, rubric, checklist, etc.) with their assessment plan.
- Where an assessment tool assesses multiple SLOs, the draft must clearly indicate which SLOs correspond to which portions of the assessment (e.g., each question in a multiple-choice test must indicate the SLO it is assessing). This is NOT required of the version deployed to students.
- Rubrics must include detailed criteria for assigning particular scores (i.e., numeric scores or scores of "excellent," "good," "fair," "poor," etc. must be accompanied by explanations and/or examples of the sort of performance that warrants a particular score).
- To the extent possible and feasible, the assessment should be deployed using a consistent methodology in all sections. This includes being consistent with instructions provided to students, time allowed, timing of the assessment during the semester, and student expectations of earning credit.

8. Authentic assessment
- The SLO Assessment Committee seeks to promote the importance of "authentic assessment." In general, such assessment requires students to apply knowledge as opposed to memorizing, restating, and/or reiterating information.
- Specifically, authentic assessment requires students to:
  - perform "real world" tasks which approximate those found in the workplace or other nonclassroom venues, and/or
  - apply critical thinking to address problems, and/or
  - use acquired knowledge to address problems.
- Carefully evaluate your assessment tool (and revise where necessary) to ensure that it meets these important criteria.

9. Goals and/or minimum acceptable score
- Where goals and/or minimum acceptable scores differ from SLO to SLO, this question must be answered separately for each SLO.
- Option A: The department submits BOTH a minimum acceptable score AND a goal.


  - Example: A department might choose a score of 3 (out of 5) on a rubric as the minimum acceptable score for a particular SLO. The same department might then establish the goal that 70% of students will score a 3 or higher.
  - Note that, in Option A, the goal should be stated as the percentage of students who are expected to score at or above the minimum acceptable score.
- Option B: The department states one average overall score, for ALL students assessed, which they consider to be a reasonable goal.
  - Example: For each SLO, the department might choose an average score of 3.5 on a scale of 0-5, or 75/100, as their goal.
  - Note that, in Option B, the goal is NOT stated as a percentage, but rather as an average score for all students assessed.
- These goals and/or minimum acceptable scores are NOT binding on departments, but rather are intended to facilitate future analysis, interpretation, and planning.
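The two options can be sketched as small calculations. This is an illustration of the arithmetic implied by the guidelines, not an official tool; the score list is hypothetical.

```python
def option_a(scores, minimum=3, goal_pct=70):
    """Option A: percentage of students at/above the minimum acceptable
    score, compared against a goal (e.g., 70% scoring 3 of 5 or higher)."""
    pct = 100 * sum(s >= minimum for s in scores) / len(scores)
    return pct, pct >= goal_pct

def option_b(scores, goal_avg=3.5):
    """Option B: one average score for ALL students assessed, compared
    against a goal average (e.g., 3.5 on a 0-5 rubric scale)."""
    avg = sum(scores) / len(scores)
    return avg, avg >= goal_avg

rubric_scores = [4, 3, 5, 2, 3, 4, 1, 3, 5, 4]   # hypothetical rubric scores (0-5)
print(option_a(rubric_scores))   # (80.0, True): 8 of 10 scored 3 or higher
print(option_b(rubric_scores))   # (3.4, False): average falls just short of 3.5
```

Note that the same set of scores can meet an Option A goal while missing an Option B goal, which is one reason the guidelines ask departments to state which option they are using.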

10. Method of assessment design
- Approved guidelines strongly encourage collaboration among all instructors, full time and adjunct, who teach the course being assessed.
- Indicate the methods of communication used (e.g., department meetings, subcommittee meetings, office and hallway discussions, emails, phone calls, blogs, etc.).
- Briefly describe the process of seeking and incorporating input.
- Also include, where possible, the names of all faculty involved (denote adjuncts where possible). This may prove extremely useful during future assessment cycles.

11. Method of grading/scoring
- Approved guidelines strongly encourage, where appropriate, collaboration among all instructors, full time and adjunct, who teach the course being assessed.
- If the grading will be strictly objective, indicate the name(s) of the person or persons who will likely do the grading and the roles to be played by each (e.g., Jed Smythe will run the scantron forms through the machine and calculate and assemble the data).
- If the grading will be to some degree subjective, briefly describe the procedures to be followed and indicate the name(s) of the person or persons who will likely be evaluating student work (e.g., John Aubert, Cathie Browning, and Yuj Shimizu will each be grading the same anonymous student essays, with an average of their three scores assigned as the final score on each essay; they will conduct a norming session prior to assigning scores).
- The Research Office is available to assist with development of a statistically appropriate grading methodology.

12. Adjunct faculty involvement
- The accrediting standards speak to the full engagement of faculty in the SLO assessment process.
- Where possible, adjunct faculty should, at a minimum, be allowed to review and provide feedback on assessment tools to be used in their classes.
- Please document any and all involvement of adjunct faculty in the process.

13. Dialog
- It is important that this process involve multiple faculty in your department, rather than one person (e.g., the chair) doing the work on behalf of others.

14. Evidence of dialog
- Evidence of dialogue may be required for accreditation purposes.


APPENDIX D Resolution F 09-04: Social Relativity Act American River College Associated Student Body September 9, 2009


APPENDIX E Student Satisfaction Survey American River College Research Office Spring 2011


Student Satisfaction Survey, Fall 2011 (responses shown as count and percent)

Please select the support service you are CURRENTLY working in right now.
  Reading Center: 11 (1.36 %)
  Writing Center: 14 (1.73 %)
  Reading Across the Disciplines (RAD): 569 (70.16 %)
  Writing Across the Curriculum (WAC): 168 (20.72 %)
  Writing Center at NATOMAS: 47 (5.80 %)
  Writing Across the Curriculum (WAC) at NATOMAS: 2 (0.25 %)
  Total Responses: 811 (100 %)

In addition to enrolling in the ___ this semester, I am also registered for (choose all that apply):
  Reading Center: 41 (4.60 %)
  Writing Center: 42 (4.71 %)
  Reading Across the Disciplines (RAD): 249 (27.95 %)
  Writing Across the Curriculum (WAC): 217 (24.35 %)
  English As a Second Language (ESL): 9 (1.01 %)
  None of the above: 325 (36.48 %)
  I do not understand the question: 5 (0.56 %)
  I don't know about these other support services: 3 (0.34 %)
  Total Responses: 891 (100 %)

I heard about the program from (mark all that apply).
  Flyer: 82 (7.74 %)
  Brochure: 45 (4.25 %)
  Counselor: 106 (10.00 %)
  Instructor: 622 (58.68 %)
  Friend: 82 (7.74 %)
  Enrolled from a previous semester: 76 (7.17 %)
  Other: 47 (4.43 %)
  Total Responses: 1060 (100 %)

I was able to get help with my Reading Center module when I needed it.
  Strongly Agree: 4 (36.36 %)
  Agree: 7 (63.64 %)
  Total Responses: 11 (100 %)

The conferences held with my instructor were helpful.
  Strongly Agree: 5 (45.45 %)
  Agree: 5 (45.45 %)
  Neutral: 1 (9.09 %)
  Total Responses: 11 (100 %)

The Reading Center staff was knowledgeable and helpful.
  Strongly Agree: 6 (54.55 %)
  Agree: 5 (45.45 %)
  Total Responses: 11 (100 %)

The Reading Center staff understood my needs and answered my questions clearly.
  Strongly Agree: 6 (54.55 %)
  Agree: 5 (45.45 %)
  Total Responses: 11 (100 %)

The Reading Center program helped me to apply reading or study strategies covered in my module.
  Strongly Agree: 7 (63.64 %)
  Agree: 4 (36.36 %)
  Total Responses: 11 (100 %)

The Reading Center program helped me to understand and remember the main ideas and details in my reading materials.
  Strongly Agree: 5 (45.45 %)
  Agree: 6 (54.55 %)
  Total Responses: 11 (100 %)

The Reading Center program has helped me increase my vocabulary.
  Strongly Agree: 2 (18.18 %)
  Agree: 5 (45.45 %)
  Neutral: 3 (27.27 %)
  Not part of my module: 1 (9.09 %)
  Total Responses: 11 (100 %)

The Reading Center program has helped me develop specific reading skills such as previewing and annotating.
  Strongly Agree: 6 (54.55 %)
  Agree: 4 (36.36 %)
  Not part of my module: 1 (9.09 %)
  Total Responses: 11 (100 %)

The Reading Center program has helped me to make accurate inferences.
  Strongly Agree: 2 (18.18 %)
  Agree: 8 (72.73 %)
  Not part of my module: 1 (9.09 %)
  Total Responses: 11 (100 %)

I was able to get help with my Writing Center module when I needed it.
  Strongly Agree: 45 (73.77 %)
  Agree: 15 (24.59 %)
  Don't Know: 1 (1.64 %)
  Total Responses: 61 (100 %)

Using the Writing Center program helped me improve my grades.
  Strongly Agree: 30 (49.18 %)
  Agree: 21 (34.43 %)
  Neutral: 6 (9.84 %)
  Disagree: 1 (1.64 %)
  Don't Know: 3 (4.92 %)
  Total Responses: 61 (100 %)

The Writing Center staff was knowledgeable and helpful.
  Strongly Agree: 47 (78.33 %)
  Agree: 11 (18.33 %)
  Neutral: 1 (1.67 %)
  Don't Know: 1 (1.67 %)
  Total Responses: 60 (100 %)

The Writing Center staff understood my needs and answered my questions clearly.
  Strongly Agree: 47 (77.05 %)
  Agree: 10 (16.39 %)
  Neutral: 2 (3.28 %)
  Don't Know: 1 (1.64 %)
  ---Choose one below---: 1 (1.64 %)
  Total Responses: 61 (100 %)

The Writing Center program helped me apply the writing terms and ideas covered by my module(s):
  Strongly Agree: 40 (65.57 %)
  Agree: 19 (31.15 %)
  Neutral: 1 (1.64 %)
  Don't Know: 1 (1.64 %)
  Total Responses: 61 (100 %)

The Writing Center program helped me write clear and correct sentences.
  Strongly Agree: 36 (59.02 %)
  Agree: 21 (34.43 %)
  Neutral: 2 (3.28 %)
  Don't Know: 1 (1.64 %)
  ---Choose one below---: 1 (1.64 %)
  Total Responses: 61 (100 %)

The Writing Center program helped me evaluate and revise my own writing by applying the skills learned in my module(s):
  Strongly Agree: 36 (59.02 %)
  Agree: 22 (36.07 %)
  Neutral: 2 (3.28 %)
  Don't Know: 1 (1.64 %)
  Total Responses: 61 (100 %)

The Writing Center program helped me learn and use writing skills necessary for academic success.
  Strongly Agree: 37 (60.66 %)
  Agree: 20 (32.79 %)
  Neutral: 3 (4.92 %)
  Don't Know: 1 (1.64 %)
  Total Responses: 61 (100 %)

The Natomas Writing Center is open during hours when I need help.
  Strongly Agree: 29 (59.18 %)
  Agree: 10 (20.41 %)
  Neutral: 7 (14.29 %)
  Disagree: 1 (2.04 %)
  Strongly Disagree: 1 (2.04 %)
  Don't Know: 1 (2.04 %)
  Total Responses: 49 (100 %)

Did you complete the RAD program?
  Yes: 566 (99.47 %)
  No: 3 (0.53 %)
  Total Responses: 569 (100 %)

I was able to schedule the appointment that best fit my schedule.
  Strongly Agree: 357 (63.07 %)
  Agree: 135 (23.85 %)
  Neutral: 51 (9.01 %)
  Disagree: 16 (2.83 %)
  Strongly Disagree: 4 (0.71 %)
  Don't Know: 3 (0.53 %)
  Total Responses: 566 (100 %)

The program will help me (or has helped me) succeed in another college class:
  Strongly Agree: 493 (60.79 %)
  Agree: 248 (30.58 %)
  Neutral: 51 (6.29 %)
  Disagree: 2 (0.25 %)
  Strongly Disagree: 3 (0.37 %)
  Don't Know: 10 (1.23 %)
  ---Choose one below---: 4 (0.49 %)
  Total Responses: 811 (100 %)

The work I did in RAD addressed the difficulties I was having in my class.
  Strongly Agree: 265 (46.90 %)
  Agree: 200 (35.40 %)
  Neutral: 86 (15.22 %)
  Disagree: 5 (0.88 %)
  Strongly Disagree: 6 (1.06 %)
  Don't Know: 1 (0.18 %)
  ---Choose one below---: 2 (0.35 %)
  Total Responses: 565 (100 %)

I feel RAD helped me do better on my exams.
  Strongly Agree: 244 (43.11 %)
  Agree: 203 (35.87 %)
  Neutral: 104 (18.37 %)
  Disagree: 4 (0.71 %)
  Strongly Disagree: 5 (0.88 %)
  Don't Know: 3 (0.53 %)
  ---Choose one below---: 3 (0.53 %)
  Total Responses: 566 (100 %)

I feel RAD helped me get better organized and improved the way I spent my study time.
  Strongly Agree: 312 (55.12 %)
  Agree: 199 (35.16 %)
  Neutral: 48 (8.48 %)
  Disagree: 3 (0.53 %)
  Strongly Disagree: 3 (0.53 %)
  ---Choose one below---: 1 (0.18 %)
  Total Responses: 566 (100 %)

Because of the work I did in RAD I feel I better understand how to approach all my classes in the future.
  Strongly Agree: 293 (51.77 %)
  Agree: 227 (40.11 %)
  Neutral: 43 (7.60 %)
  Disagree: 1 (0.18 %)
  Strongly Disagree: 1 (0.18 %)
  Don't Know: 1 (0.18 %)
  Total Responses: 566 (100 %)

My main reason for coming to RAD was to get extra credit.
  Strongly Agree: 88 (15.55 %)
  Agree: 93 (16.43 %)
  Neutral: 143 (25.27 %)
  Disagree: 147 (25.97 %)
  Strongly Disagree: 90 (15.90 %)
  Don't Know: 2 (0.35 %)
  ---Choose one below---: 3 (0.53 %)
  Total Responses: 566 (100 %)


The transferable units I earned in RAD are important to me.
  Strongly Agree            170    30.04 %
  Agree                     177    31.27 %
  Neutral                   151    26.68 %
  Disagree                   42     7.42 %
  Strongly Disagree          22     3.89 %
  Don't Know                  4     0.71 %
  Total Responses           566   100 %

Which of the following best describes why you dropped RAD?
  I dropped the class I was getting help with      1    33.33 %
  Other                                            2    66.67 %
  Total Responses                                  3   100 %

I will consider taking RAD in the future.
  Strongly Agree            218    38.31 %
  Agree                     173    30.40 %
  Neutral                    77    13.53 %
  Disagree                   14     2.46 %
  Strongly Disagree           8     1.41 %
  Don't Know                 25     4.39 %
  ---Choose one below---     54     9.49 %
  Total Responses           569   100 %

Which of the following best describes why you signed up for RAD in the first place?
  Extra credit                                                  122    21.52 %
  I wanted help with my class/reading                           152    26.81 %
  I was having difficulty understanding the ideas in my class    71    12.52 %
  I needed the units                                             16     2.82 %
  My counselor sent me                                            4     0.71 %
  My instructor told me I had to                                165    29.10 %
  Other                                                          37     6.53 %
  Total Responses                                               567   100 %


The WAC program helped me apply strategies to improve written academic assignments.
  Strongly Agree             98    57.99 %
  Agree                      63    37.28 %
  Neutral                     7     4.14 %
  Don't Know                  1     0.59 %
  Total Responses           169   100 %

I was able to find time in the WAC schedule to get help with my assignments when I needed it.
  Strongly Agree             81    47.65 %
  Agree                      61    35.88 %
  Neutral                    16     9.41 %
  Disagree                    9     5.29 %
  Strongly Disagree           1     0.59 %
  Don't Know                  1     0.59 %
  ---Choose one below---      1     0.59 %
  Total Responses           170   100 %

Using the WAC program helped me improve my grades in my classes this semester.
  Strongly Agree             83    48.82 %
  Agree                      66    38.82 %
  Neutral                    13     7.65 %
  Disagree                    2     1.18 %
  Don't Know                  5     2.94 %
  ---Choose one below---      1     0.59 %
  Total Responses           170   100 %

The faculty and staff in this program were able to understand my needs and able to answer my questions:
  Strongly Agree            115    67.65 %
  Agree                      48    28.24 %
  Neutral                     5     2.94 %
  Don't Know                  1     0.59 %
  ---Choose one below---      1     0.59 %
  Total Responses           170   100 %


The WAC program helped me write or revise written academic assignments.
  Strongly Agree            104    61.18 %
  Agree                      60    35.29 %
  Neutral                     6     3.53 %
  Total Responses           170   100 %

The WAC program helped me review, discuss and apply writing skills that were appropriate for my individual needs.
  Strongly Agree            109    64.12 %
  Agree                      53    31.18 %
  Neutral                     8     4.71 %
  Total Responses           170   100 %

I would sign up for the program again.
  Strongly Agree            378    46.61 %
  Agree                     256    31.57 %
  Neutral                   118    14.55 %
  Disagree                   18     2.22 %
  Strongly Disagree          11     1.36 %
  Don't Know                 25     3.08 %
  ---Choose one below---      5     0.62 %
  Total Responses           811   100 %

I would recommend this program to a friend.
  Strongly Agree                     557    68.68 %
  Agree                              204    25.15 %
  Neutral                             46     5.67 %
  Disagree                             1     0.12 %
  Strongly Disagree                    2     0.25 %
  I do not understand the question     1     0.12 %
  Total Responses                    811   100 %

APPENDIX F
ARC ESL Center Diagnostic Placement Exam and Post Test
American River College ESL Department


American River College
4700 College Oak Drive
Sacramento, CA 95841
(916) 484-8011
www.arc.losrios.edu
Los Rios Community College District
