Intersegmental Chief Academic Officers

Goucher College

Minutes of the October 31, 2003 Meeting

Welcome

 

Co-chair Michael Curry welcomed attendees to Goucher College and reviewed the group’s mission, as adopted at the April 2003 meeting.  He introduced co-chair Ann Smith; all attendees introduced themselves.

 

Approval of Minutes

 

A motion was made to approve the minutes of the April 11, 2003 meeting.  All present were in favor.

 

MHEC Student Learning Outcomes Assessment Workgroup

 

Dr. Michael Keller of the MHEC staff reported on the ongoing work of this group, which was convened by the Secretary of MHEC.  A copy of his report was distributed and is attached.  ICAO members raised concerns about potential MHEC requirements for key indicators and benchmarks.  Dr. Keller indicated that he anticipates the reporting will focus on descriptive analysis of what is actually happening.  Concerns were also voiced about recent discussion at Middle States regarding ‘heading off creative federalism.’

 

A second report is due from the workgroup in the coming months.

 

Reauthorization of the Higher Ed Act

 

Dr. Lynn Gangone introduced Susan Hattan, Senior Consultant with the National Association of Independent Colleges and Universities.  Ms. Hattan provided a historical perspective on this legislative process, which occurs every six years.  Over the years, the focus of the reauthorization battles has changed:

·        1992 focus = integrity, high default rates, ‘rip-offs’

·        1998 focus = core principles, access

·        2004 probable focus = outcomes, standards, Title IV funding.  How much control over institutions should Title IV funding give the government?

 

Ms. Hattan suggested that several issues will be central to this year’s deliberations, including the cost of college, transfer of credit, intellectual diversity, learning outcomes, accreditation (including distance education), and minority achievement gaps.  Measures on teacher education, graduate education, and international education have already passed.  She anticipates that the timeline for legislative action will roughly be: heated debate on important issues by March 2004, a draft by July 2004, and a final bill in September 2004.

 


Reports from Discipline Groups

 

Arts and Humanities

Gena Glickman reported that the group has met and has three recommendations.

            1) This group is concerned primarily with Arts and asks that a second group be appointed to deal with Humanities issues;

             2) A best practices in fine arts assessment conference is proposed for spring 2004; and

             3) The group will be surveying institutions regarding whether a fine arts requirement is included within institutional general education programs.

 

Biological Sciences

 

Virginia Guilford reported that the group has met and is recommending a minor revision to the statement.  A copy of suggested changes was distributed.

 

English

 

Steve Horvath reported that the group has met and will be circulating a report via email shortly.

 

All discipline group reporters indicated that low meeting attendance was a problem for their groups.

 

New Charges for Discipline Groups

 

Topic deferred to next meeting due to time constraints.

 

Other Business

 

  • ICAO follow-up on issues raised in this meeting:

            MHEC Student Learning Outcomes Assessment Workgroup

1)      Review the workgroup’s interim draft

2)      At the next ICAO meeting, report on outcomes assessment issues raised at the December Middle States Association meeting

3)      Document our group’s reaction to both of the above.

 

            Reauthorization of the Higher Ed Act

1)      Teri Hollander will circulate a brief she prepared on the topic

2)      We should all stay informed on the issues. (A good source of information is the National Association of Independent Colleges and Universities newsletter, available at www.naicu.edu.)

3)      Small groups of 2-3 members should monitor the critical issues. The co-chairs will circulate a sign-up list at the next meeting.

4)      Next meeting we should begin to document our group’s reactions to the issues.

 

  • Future Meetings

      December 19 – CCBC-Catonsville

      February 27 – University of Maryland University College

      May 7 – Prince George’s Community College

 

Ann Smith

Recorder


MARYLAND HIGHER EDUCATION COMMISSION

Progress Reports on Student Learning Outcomes Assessment

Reporting Guidelines

 

 

Background

 

As part of the State’s performance accountability process prior to 1996, Maryland’s public colleges and universities had to develop a plan for the assessment of undergraduate student learning outcomes and to submit annual progress reports to the Commission.  When the Commission adopted the system of benchmarked indicators for accountability in 1996, the campuses assumed responsibility for monitoring student learning outcomes.  However, the Commission reserved the option of requesting periodic reports from the public campuses on this subject.

 

Agreement was reached with the Commission’s Segmental Advisory Council that the public campuses would provide the Commission with a report on their progress in improving student learning, instructional effectiveness, and curricula every three years beginning in 1998. 

 

When the Commission received the 2001 student learning outcomes assessment reports, it asked the Secretary of Higher Education to convene an intersegmental workgroup for the purpose of identifying standard ways of measuring the progress made in the education outcomes of students and developing a mechanism for reporting this information.  While past assessment reports have provided a good overview of the processes in which the public campuses have engaged to improve student learning, only a limited amount of information has been shared about the impact that these efforts are having on undergraduates.  The next cycle of reports needs to focus on the results of assessment.

 

Within the next few years, greater attention is likely to be given to the results of assessment activities as key stakeholders inquire about the quality of learning that is taking place in college.  Accreditation organizations are asking campuses to provide information about the outcomes of assessment efforts.  There is impetus at the federal level to require colleges and universities to test students and monitor their progress, similar to the mandates imposed on K-12 education.  There is recognition in Maryland of the growing interest in this area.  At the 2002 Governor’s Conference on Higher Education, there was a consensus that assessment of student learning is not an optional activity.

 

Content of Progress Reports

 

The workgroup that drafted the guidelines for the next round of reports included faculty and staff from the public two- and four-year institutions who have considerable experience in the area of undergraduate learning outcomes assessment.  A representative of the Middle States Commission on Higher Education also served on the workgroup.

 

These reports, which are due to the Commission on August 2, 2004, will focus on each of the five competencies related to general education and essential skills that are used in Middle States’ accreditation process: written and oral communication, scientific and quantitative reasoning, critical analysis and reasoning, technological competency, and information literacy.

 

For each of these competencies, campuses must address the following questions:

 

1.      What is the definition used for this competency at your institution?

2.      What direct or indirect measures, methods, instruments, and/or analyses are used to assess this competency?  Accompanying these guidelines is a list of widely recognized and accepted techniques for assessment in general education.  Institutions are not restricted to these approaches, but they must provide a rationale, supported by the literature in the assessment field, for any alternatives that they include in their report.

3.      At what level(s) does assessment for this competency occur – course, program, and/or institutional?

4.      Are results available for one or more of the assessment activities related to this competency?  If so, please describe the results for each assessment activity, providing statistical data as appropriate and an explanation of the extent to which the outcome demonstrates that students have achieved college-level proficiency in the competency area.  If results are not available for particular assessment activities, please indicate whether your institution intends to produce and release these outcomes in the future and its timetable for accomplishing this task.

5.      Have the results of each of the assessment activities related to this competency been used to enhance teaching and learning at your institution?  If so, please describe the manner in which the assessment findings have contributed to these improvements.  If there are reasons that your institution has not yet used the assessment results to strengthen teaching and learning, please provide an explanation.

 

If an institution has no current activities in one or more of the competencies, it must acknowledge this in its report and discuss whether plans have been or will be developed to assess these areas in the future.

 

Structure of Progress Reports

 

The reports should be organized by competency, with a separate section devoted to each of the areas in which assessment activity is under way.  There is no limit on the length of the reports, but parsimony is encouraged.  Every report, regardless of its length, must include an executive summary of no more than five pages.  The Commission staff will prepare a document containing these summaries, unedited, along with its own analysis, which will draw on the full reports.


TECHNIQUES FOR MEASUREMENT OF STUDENT LEARNING OUTCOMES

--COMPETENCIES RELATED TO GENERAL EDUCATION--

 


Direct Measures.  Those that provide clear and compelling evidence of what students are learning.

 

·        Course-embedded assessments, including written work and presentations scored using a rubric.

·        Scores on locally designed tests and competency exams accompanied by test “blueprints” describing what is being assessed.

·        Score gains between entry and exit on tests, competency exams and writing samples.

·        Ratings of student skills in the context of class activities, projects and discussions.

·        Portfolios of student work.

·        Scores on nationally normed instruments, notably CAAP (ACT), Academic Profile (ETS), and Tasks in Critical Thinking.

 

Indirect Measures. Those that provide signs that students are probably learning, but it is less clear exactly what they are learning.

 

·        Grades on assignments in general education courses not accompanied by a rubric or scoring guide.

·        Student grades or passing rates in general education courses.

·        Student evaluations and ratings of the knowledge and skills they have gained in general education courses.

·        Student or graduate satisfaction with their learning in general education competencies, collected through surveys, exit interviews or focus groups.

·        Results of nationally normed surveys, notably CIRP Freshman Survey, CSXQ Survey (Indiana University), College Student Survey (HERI), CSEQ (Indiana University), and NSSE (Indiana University).