Clearinghouse on Assessment and Evaluation




From the ERIC database

Vocational Education Performance Standards. ERIC Digest No. 96.

Imel, Susan

A broad consensus for improving vocational education programs has emerged from the educational reform movement aimed at raising the performance of U.S. students in academic subjects. Pending legislation for the reauthorization of the Carl D. Perkins Vocational Education Act (House Bill HR 7 and Senate Bill 1109) mandates the development of performance indicators or standards. In the literature, the terms educational indicator, quality indicator, outcome indicator, performance standard, and performance measure are used interchangeably, and there is general agreement that indicators or standards are single or composite statistics that reveal something about the performance or health of an educational system (Asche 1990). According to Asche, "quality or performance indicators have suddenly become the nation's barometer of education wellness" (pp. 3-4).

A number of options for establishing vocational education performance standards are currently under consideration. This ERIC Digest examines some vocational education experiences with outcome measures, describes proposed approaches, and enumerates potential issues and challenges in establishing performance standards for vocational education.

During the last decade, state and local programs have developed experience with defining and measuring performance-oriented outcomes of vocational education. Three of the most frequently used outcome measures are labor market, learning, and access. Although each of these has its strengths and weaknesses, they reflect the broadly accepted definition of the principal objectives of vocational training--the preparation of individuals for productive and gainful employment--as well as a primary policy concern with program access (Hoachlander, Choy, and Brown 1989; Office of Technology Assessment 1989).

Indicators of labor market performance of vocational graduates--the traditional standards by which the effectiveness of vocational education and employment training have been measured--include job placement, earnings, and duration of employment and unemployment. Although economic indicators have been widely used as a measure of vocational education's effectiveness, there are problems with the traditional methods of collecting the data. Information provided by program graduates has questionable validity, and there is also potential for bias in data provided by school personnel. Using state wage records as a means of collecting more accurate information has been proposed as a way to address some of these concerns (Hoachlander, Choy, and Brown 1989; OTA 1989).

Even if labor market outcome information could be collected in a valid, unbiased, and accurate manner, many vocational educators would object to such indicators being the sole measure of program effectiveness on the following grounds: (1) adopting placement as the primary criterion ignores the multiple goals of vocational education; (2) a large number of economic and personal factors beyond the control of the vocational education system determine the employment of students; (3) a narrow focus on placement encourages programs to admit only those who can be placed and to concentrate on coaching in job placement and interview skills at the expense of vocational skills; and (4) placement rates and other economic indicators measure the gross effect of participation (total placements, total earnings) rather than the net effect (the difference between labor market outcomes that occurred when students participated in vocational education versus what would have occurred had these programs not existed) (ibid.).

Learning outcomes--what individuals learn in school--are at least as important an indicator of program quality as labor market outcomes. In addition, vocational educators have much more control over what and how much students learn than they do over what happens to them in the labor market.

Although there are several ways to measure learning outcomes, the most common method employed in vocational education is occupational competency testing designed to assess mastery of skills (tasks) and knowledge found in specific jobs. Even though competency testing has not played a role in federal vocational education policy, states have been quite active in developing and using competency tests (Hoachlander, Choy, and Brown 1989).

Occupational competency tests provide important indications of program effectiveness, but many vocational educators do not think they should be used as the sole basis for performance standards for the following reasons: test scores often reflect economic and social factors in addition to abilities developed through education and training; schools may coach students on test-taking strategies and on specific test items at the expense of teaching the skills and knowledge purportedly measured by the test; tests do not necessarily indicate how a person would perform at work but instead measure the upper limit of what an individual can do; and at a time when there is a growing consensus that more broadly applicable generic skills are needed in the workplace, use of competency tests could encourage emphasis on highly specialized skills (OTA 1989).

Because federal policy has emphasized access to vocational programs for women, minorities, and students with special needs, outcomes related to access are also under consideration as performance indicators. Traditional efforts to monitor access outcomes in vocational education have focused on monitoring the numbers of different types of students enrolled in particular programs relative to their numbers in the larger population. However, these efforts have suffered from two major flaws (Hoachlander, Choy, and Brown 1989).

First, when enrollments are used to monitor access, attention is deflected from the primary policy concern that these students acquire the skills necessary to compete effectively in the labor market. Second, monitoring access has traditionally relied on program-based data collection rather than course-based efforts, and the result has been a "series of inaccurate, inconsistent, and generally chaotic attempts to arbitrarily assign students to vocational education programs" (ibid., p. 44). To correct these problems, Hoachlander, Choy, and Brown (1989) suggest tying access into performance-based policies as well as relying on periodic collection of student transcripts, which would permit longitudinal analysis, to monitor access.

None of the outcome measures currently in use appears to be sufficient for judging the quality of vocational education programs when used singly. However, some combination of labor market outcomes, learner outcomes, and access outcomes seems to hold promise for developing measurable standards of performance. States that either have implemented or have under development systems to measure performance vary in their approaches, especially in terms of the level of specificity of the indicators, the level at which the data are collected and used, and the primary purpose of the system (Asche 1990).

Illinois is currently pilot testing a technically advanced system that uses quality indicators to link planning, evaluation, and program improvement through data that are collected and analyzed at either the local or regional level. The system uses six indicators for measuring outcomes: placement and continuing education, enrollment, employer satisfaction, student satisfaction, employability skills attainment, and cost. Although the primary focus of the Illinois system is program improvement, it appears to satisfy most of the requirements in the pending federal legislation (Asche 1990).

Copa and Scholl (1983) report on efforts to verify and develop a set of indicators for use in measuring the outcomes of vocational education in Minnesota. A method of verification was used that took into consideration current social and economic issues, vocational education as a part of education more broadly conceived, and critical questions useful in examining any potential indicator. Use of this verification process resulted in the selection of four vocational education indicators that appeared to have adequate data available to support continued development: number of graduates employed, number of graduates employed in occupations related to program, employer's satisfaction with the quality of the graduate's work, and program cost. Although these indicators are heavily oriented toward labor market outcomes, the process described is useful for those developing systems of measuring vocational outcomes.

Vocational education is a multifaceted system with a diverse clientele and multiple goals, and it exists in a complex policy environment. Developing and implementing a system of performance standards for vocational education therefore requires difficult decisions about what performance to assess, how to hold programs accountable, and what actions should follow from assessment results.

Apling (1989) describes a set of potential problems that any performance standards system for vocational education needs to address: the impact of performance standards on those whom the program serves, with the danger that individuals most needing services will be the least likely to be served; the influence of performance standards on the types of training provided, with a danger that effective--but long-term and expensive--services will be discouraged in favor of short-term and inexpensive approaches; the difficulty in meeting multiple standards, some of which may not be compatible; the problem of adjusting standards for programs in different labor markets or serving different types of clients; and the difficulty of setting minimum standards (p. 18).

Apling also identifies questions that must be resolved before a performance standards system for vocational education can be implemented: What performance is assessed? Whose performance is assessed? Who uses performance information, and how do they use it? And what results from meeting, not meeting, or exceeding standards?

Potentially, any system of performance standards developed for use in vocational education can have both positive and negative effects. Asche (1990) cautions vocational educators to be aware that indicators tend to measure quantity, not quality; are designed to serve policymakers, not educators; aim for parsimony, not complexity; tend to reflect those things that can be easily measured; and, unless there is reasonable agreement on what educational goals should be, may shape the curriculum (p. 5). On the other hand, Asche also points out that a performance indicator system can have positive aspects: locally developed indicators can provide opportunities for school-based improvement and the development of shared goals and values, indicators can be useful in monitoring policies and practice and improving schools, and indicators offer an opportunity for vocational education to be included in educational reforms (p. 6).

Apling, R. N. "Vocational Education Performance Standards." Washington, DC: Congressional Research Service, The Library of Congress, July 1989. (ED 309 320).

Asche, M. "Standards and Measures of Performance: Indicators of Quality for Virginia Vocational Education Programs." Paper prepared for the teleconference "Preparing a Competent Work Force through Indicators of Quality for Vocational Education." Blacksburg: Division of Vocational and Technical Education, Virginia Polytechnic Institute and State University, 1990.

Copa, G., and Scholl, S. "Developing Vocational Education Indicators: Some Steps in Moving from Selection to Use." St. Paul: Minnesota Research and Development Center for Vocational Education, University of Minnesota, September 1983. (ED 235 385).

Hoachlander, E. G.; Choy, S. P.; and Brown, C. L. "Performance-Based Policy Options for Postsecondary Vocational Education: A Feasibility Study." National Assessment of Vocational Education Background Paper. Berkeley, CA: MPR Associates, March 1989.

Office of Technology Assessment. "Performance Standards for Secondary School Vocational Education: Background Paper." Washington, DC: Office of Technology Assessment, April 1989. (ED 313 591).

This ERIC Digest was developed in 1990 by Susan Imel, ERIC Clearinghouse on Adult, Career, and Vocational Education, with funding from the Office of Educational Research and Improvement, U.S. Department of Education under Contract No. RI88062005. The opinions expressed in this report do not necessarily reflect the position or policies of OERI or the Department of Education.

Title: Vocational Education Performance Standards. ERIC Digest No. 96.
Author: Imel, Susan
Publication Year: 1990
Document Identifier: ERIC Document Reproduction Service No. ED 318 914
Document Type: ERIC Product (071); ERIC Digests (Selected) (073)
Target Audience: Administrators and Practitioners

Descriptors: Access to Education; * Accountability; * Competence; * Educational Quality; Employment Level; Job Skills; Occupational Tests; * Outcomes of Education; Performance; Postsecondary Education; * Program Evaluation; Secondary Education; * Standards; Vocational Education

Identifiers: *Carl D Perkins Vocational Education Act 1984; ERIC Digests

