Michael Kelley’s April 29, 2013, editorial, “Can We Talk About the MLS?,” and the 157 comments posted to that article so far prompted us to consider accountability for the American Library Association’s (ALA) accreditation of graduate programs in library and information science. We first turned to ALA’s Standards for Accreditation [PDF]:
“Accreditation assures the educational community, the general public, and other agencies or organizations that an institution or program (a) has clearly defined and educationally appropriate objectives expressed as student learning outcomes, (b) maintains conditions under which achievement of objectives can reasonably be expected, (c) is in fact accomplishing objectives substantially, and (d) can be expected to continue to do so. Accreditation serves as a mechanism for quality assessment and quality enhancement with quality defined as the effective utilization of resources to achieve appropriate educational objectives and student learning outcomes.”
Citizens expect that medical and law schools meet academic and professional standards and that their graduates are qualified to serve the public with knowledge and skill. This concept of accountability is now spreading at the state level, where public colleges and universities are being required to report on student-centered indicators such as graduation and retention rates to qualify for full funding. The National Conference of State Legislatures provides information on how states have allocated funding for higher education institutions based on performance indicators.
The ALA Standards emphasize what programs must accomplish in terms of strategic planning and student learning outcomes. ALA does not dictate what those outcomes should be, nor does it specify any particular courses that must be offered in an MLIS program. So, what does it mean to be a graduate of an ALA-accredited program? Past competency debates offered philosophical positions that could not be tested without nationally based comprehensive examinations to certify library and information professionals.
Each MLIS program determines its own strategic planning goals and its own student learning outcomes. This approach protects the academic freedom of the faculty to test new ideas and create new methodologies. Nonetheless, the Standards are clear about what programs are expected to accomplish:
“Within the context of these Standards each program is judged on the degree to which it attains its objectives. In accord with the mission of the school, clearly defined, publicly stated, and regularly reviewed program goals and objectives form the essential frame of reference for meaningful external and internal evaluation. The evaluation of program goals and objectives involves those served: students, faculty, employers, alumni, and other constituents.”
But this approach does not necessarily meet the Standards’ intent of providing a coherent and consistent body of knowledge to be mastered by MLIS graduates:
“…the curriculum provides … for the study of theory, principles, practice, and values necessary for the provision of service in libraries and information agencies and in other contexts.” [Standards, p. 9]
Under the current Standards, the 63 accredited programs at 58 institutions define their curricular objectives and course offerings independently of one another. Individual LIS programs define acceptable student learning outcomes at the local level, combining course outcomes into program outcomes. However, ALA does not then combine those program outcomes across institutions.
If the program outcomes were merged and evaluated, ALA could report on the common knowledge and skills needed by library and information professionals across all accredited programs. This, in turn, could define what it means to be a graduate of an ALA-accredited program. Then, if graduates obtain professional positions and perform satisfactorily, this lends credibility both to the program’s accreditation and to ALA’s articulation of nationally based outcomes for library and information professionals.
ALA’s Committee on Accreditation (COA) uses three means to evaluate programs:
- Program Presentations, in which the school measures itself against the Standards in a self-study;
- External Review Panel reports, in which a visiting review team measures the school and its program presentation against the Standards; and,
- Annual statistical and biennial narrative reports from each program, monitored by COA over time.
It is not our intent to change the first two of these processes, though we propose that public accountability would be improved if Program Presentations and External Review Panel reports were made freely available. We do, however, challenge the kinds of data collected and evaluated, with the aim of improving accountability.
The Association for Library and Information Science Education (ALISE) has systematically collected statistical information on MLIS programs from ALA-accredited institutions since 1980. COA has lent its name to this ALISE effort because it uses a small portion of the collected data in its review of MLIS programs. The data used by COA cover three general areas: faculty numbers; student enrollment and degrees granted, including diversity; and income/expenditures. Recently, the ALA Office of Accreditation’s PRISM report identified these data elements.
Trend data are useful for assessing program changes and consistency over time; significant changes in program resources might be questioned. We propose adding statistical information that would make program trends more transparent to interpret and provide additional evidence of public accountability. Institutional and program data are often used for three purposes: to compare an MLIS program to (1) a specific standard, (2) another program, and (3) itself from one year to the next.
Currently, COA relies on the third purpose above, using trend data over time for each program. But public accountability might, for example, reveal the need for prospective students to compare one program’s graduation or placement rate with another’s.
The accreditation of a program also provides a de facto benchmark for other programs. The current Standards may lead to a devolution in quality because indicators are established locally without reference to any national benchmarks. It is in the interest of ALA and COA to have the Standards linked to identifiable data that are respected and valued by member institutions, prospective students, alumni of programs, and the general public.
Individual program statistics are now available on the ALA website, but it is our assessment that the measures fall short of providing a meaningful picture of a program’s adherence to the Standards. The movement toward increased public accountability in higher education gives students and the general public a right to know factual data about the programs accredited by organizations such as ALA. The National Center for Education Statistics (NCES) currently reports student loan default rates by college. Even campus crime statistics are publicly available.
COA can expect that measures of accountability will increase in the future, and it would be productive to address them now rather than later. We offer new indicators to provide a better picture of an MLIS program’s ability to meet the intent of the current Standards, which state:
“Prospective students, employers recruiting professional staff, and the general public concerned about the quality of library and information services have the right to know whether a given program of education is of good standing. By identifying those programs meeting recognized Standards, the Committee offers a means of quality control in the professional staffing of library and information services.”
Public accountability measures can initially focus on indicators of interest to prospective students and employers. These might include:
- Applicants to the MLIS program: number applied, admitted, rejected, enrolled
- Qualification for admission for those enrolled: average GRE, TOEFL, and undergraduate GPA; percent holding other graduate degrees and percent waived from meeting each admission criterion
- Number and percent of: (a) students enrolled exclusively in campus, online, and hybrid courses; (b) courses offered as campus, online, and hybrid courses
- Number and percent of matriculated students who graduate (retention)
- Average student time in months to completion of master’s degree
- Percent graduating in 1, 2, 3, 4, 5, 6, 7, or more years
- Placement: number and percent of graduates who report full-time employment within one year of receiving master’s degree and average length of time it takes to obtain full-time employment
- Average number of students per course section for campus, online, and hybrid courses
- Average number of students taught by full-time faculty members and part-time faculty members per section
- Percent of students taught by full-time faculty members and part-time faculty members
It is important that the statistics collected provide some indication of the academic rigor or quality of a program. At this time, the Standards are liminal rather than aspirational, providing only a threshold that must be reached. According to the Standards, quality is defined as “the effective utilization of resources to achieve appropriate educational objectives and student learning outcomes.”
Academic preparedness is difficult to assess, and major universities often rely on standardized test scores as an indicator of academic rigor in the admissions process. GRE and TOEFL scores are indicators of success in graduate school, and the Educational Testing Service (ETS) produces reports depicting the relationship between GRE scores and performance in graduate programs.
Not all ALA-accredited master’s programs require standardized test scores for admission, which could place them at a disadvantage when compared to the business, law, and other academic programs in their universities. A number of universities appear to have allowed their LIS departments to eliminate the GRE requirement for MLIS applicants, which may suggest that these programs have changed their admission standards. The validity of the GRE as an unbiased and strong predictor of success for graduate students is firmly established beyond ETS, and it might appear to the general public that dropping this requirement is related to concern over admitting applicants with low scores. For example, one ALA-accredited program informs applicants that they need not submit GRE scores, letters of recommendation, an essay, or a résumé, since the program does not consider them in the MLIS admissions process.
Prospective students and the general public should be informed of how admission decisions are made based on past student performance and other indicators. The availability of retention and placement data is also a matter of public concern for professional degree programs. There is tremendous diversity in the design and structure of MLIS programs across universities: there are programs with over 2,000 students and those with fewer than 100; some are totally online, while others offer instruction only on campus; some might be regarded as library centric, others as information focused.
These differences among accredited MLIS programs can be significant, but each program must adhere to the same set of ALA Standards if it wishes to be accredited. We endorse such diversity but also note that the student experience at such different programs may need to be captured and shared, since MLIS graduates receive the same accredited degree. It might be productive for ALA to sponsor social media opportunities where students can publicly share their experiences applying to and attending accredited programs. This information could enhance the public’s understanding of the quality of individual programs. Recent reactions in the comments appended to Kelley’s LJ editorial suggest that students hold a variety of views about the worth of earning an MLIS degree and its value in seeking gainful employment.
Conclusion
Our proposal for change does not address other potential indicators. These might include the need for a critical mass of faculty to cover the LIS knowledge base, measures of faculty scholarly productivity, tenurability, measures of institutional support for faculty, and adequate program administration. We hope that others will suggest ways to improve accountability for ALA-accredited MLIS programs.
Some programs require students to create portfolios demonstrating the student learning outcomes achieved across courses, including internships. These portfolios can also be shared with prospective employers. Assuming employers have the time to review applicants’ portfolios, this shifts the basis of review from the reputation of the program and university attended to scrutiny of the applicant’s course work. Our recommendations suggest that certain markers are important in distinguishing one MLIS program from another and that employers might favor graduates of programs with known indicators of quality.
Of paramount importance is the availability of consistent, reliable, and transparent data. We are advancing the position that these data should be expanded to include additional indicators. These statistical indicators, the Program Presentation, and the External Review Panel report should be available on ALA’s website as a central source the public can consult when seeking information about accredited MLIS programs.
Dan O’Connor is an Associate Professor at Rutgers University, NJ, in the Department of Library and Information Science. Phil Mulvaney is Library Director Emeritus from Northern State University in South Dakota. Both authors served on ALA’s Committee on Accreditation (COA) until 2012 when they resigned from the committee stating that COA needed a more comprehensive code of ethics to address areas such as conflicts of interest. This article does not address those specific issues.