
Focus On Basics

Volume 3, Issue B ::: June 1999

Nationwide Accountability: The National Reporting System

by Barbara Garner
The data reporting system for Adult Basic Education has been redesigned. A pilot test of the new system finds both successes and stumbling blocks ahead.


In the mid-1990s, federal and state administrators of the US adult basic education system felt growing pressure to document that participation in literacy programs leads to positive results. The Office of Vocational and Adult Education (OVAE), working with state Directors of Adult Basic Education, embarked on an effort to improve the National Reporting System (NRS) and link it more closely to program accountability and improvement. With the passage of the Workforce Investment Act of 1998, the redesigned NRS becomes even more important.

The Workforce Investment Act (WIA) legislation replaces the former Adult Education Act, which was in place for 33 years. Title II of WIA, called the "Adult Education and Family Literacy Act," addresses adult education. Title II gives priority to more intensive, higher-quality services rather than rewarding programs for the number of students served. It also puts a much greater emphasis on learner outcomes, and therefore on accurate measurement and reporting (Balliro & Bickerton, 1999).

Mike Dean, of OVAE, is responsible for overseeing the NRS redesign project, which is being implemented by the Pelavin Research Center of the American Institutes for Research, in Washington, D.C. The new system is not dramatically different from what was in place before, Dean explains. "In reality, the system has been tracking many post-program outcomes, as well as outcomes within the program. The difference is that there is a new value on collecting and reporting performance data. Before, from the federal level, if someone couldn't track people, there were no real consequences."

He continues, "Now they have to [track people]. As a result, we have to train people on the methodologies they are going to need to collect, report, and analyze this information. We're trying to demonstrate the value that this information can have, at the local level for program improvement, for teachers, for staff development, for information about instruction. One of the challenges is to educate practitioners as to how important it [reliable outcome data] is, why it's important, and how it can improve programs."

Finer Gradations

The NRS project revised the educational functioning levels that programs report for students, as well as the written descriptions of those levels. The new system has a finer gradation of measurement. For example, in the old system, Level One was grade levels zero to 2.9 on the Test of Adult Basic Education (TABE) and Level Two was 3.0 to 5.9. The new levels are zero to 1.9, then 2.0 to 3.9 (Condelli & Kutner, 1997). The Knox County Adult Basic Education Department, Knoxville, TN, participated in a pilot test of the new levels. Knoxville staff member Bill Walker reports that the finer gradation "allows us to show progression within some lower levels. The pilot allowed us to claim more successes. It used to be, if a student began at 0 and went to 2.1 in a one-year period, it wasn't successful because he didn't progress to the next level. Under the new groupings, that student will be seen as a successful completer of beginning literacy."

Robbie Thomas, director of the Queen City Vocational Center, Cincinnati, OH, another program that piloted the NRS, concurs: "I felt like the [new] breakdown made sense. The written descriptions and the testing matched. Instructors felt pretty comfortable with them. Having the greater gradation is a positive thing."

Follow-Up Difficult

WIA also requires programs to track and report outcomes such as placement, retention, and completion of post-secondary education or training, and unsubsidized employment or career advancement. These outcomes occur after learners leave a program and are notoriously hard to track. The programs participating in the pilot tested a system for gathering this information that involved calling former students.

Knoxville's Walker explains, "Our charge [in the pilot] was to select 200 learners who had dropped out of our programs six months prior and to contact them to administer a survey of why they dropped out and what benefits they got from adult ed. We did this to determine the effectiveness of telephone polling to gather information for learner outcomes that are mandated in the Workforce Investment Act." Programs participating in the pilot were successful in reaching, on average, 23% of the people they called. The response rate varied from program to program, ranging from a high of 35% to a low of 11%.

"I believe that some of the core indicators of performance [literacy gain] in the Workforce Investment Act will be very easy for adult educators to evaluate. I have misgivings about the core indicators of performance in regards to retention or completion of post-secondary education or training and unsubsidized employment, and I have misgivings about our ability to collect these and other reliable and numerous data by telephone," says Walker.

The difficulty the participating programs had in contacting learners was not surprising. Adult basic education programs have trouble following up on learners after they have left their programs because of the learner population's transience and reticence and a lack of program resources. The same phenomena make research on adult literacy learner outcomes extremely difficult (Beder, 1999). It's a tricky challenge: showing evidence of the impact of participation in adult basic education requires substantial resources, which may not be forthcoming until the evidence is produced.

Thomas corroborated the difficulty of obtaining follow-up data: "We serve about 3,500 students a year. The pilot - a telephone survey - was looking for information on 200 students who had withdrawn. Going into it, I thought that 400 calls would produce the results. We called 536 people. Of the people we called, 28% completed the survey, 5% refused, 37% had invalid phone numbers, 30% had no answer, line busy, or we left messages." That works out to roughly 150 completed surveys, short of the 200 the pilot sought.

Thomas thinks that if students were told upon entry into the program that follow-up was part of the process, the response rate would improve, though she admits that this is just conjecture. She also feels that having instructors make the calls, rather than the instructional aides she did use, might have yielded a higher response rate. "I feel that follow up is something we need to do. It was a new experience for us. It will be time consuming - it took 51 hours to make the phone calls - an added dimension in terms of work load and information. But, how can you show that you're making a difference unless you follow up on students?"

The performance measures just tested will go into effect for the program year beginning July 1, 2000. Training and technical assistance to states on reporting requirements will begin in the summer of 1999. For more information, contact Larry Condelli of Pelavin Research Associates at (202) 944-5331, Mike Dean at (202) 205-9294, or visit the web site at http://www.air-dc.org/nrs.

References

Balliro, L. & Bickerton, B. (1999). "The Workforce Investment Act and New Multi-Year RFP." Bright Ideas, Vol. 8, No. 4.

Beder, H. (1999). The Outcomes and Impacts of Adult Literacy Education in the United States. Cambridge, MA: NCSALL.

Condelli, L. & Kutner, M. (1997). Developing a National Outcome Reporting System for the Adult Education Program. Washington, DC: US Department of Education, Office of Vocational and Adult Education, Division of Adult Education and Literacy.



Educating Lawmakers

by David Rosen

What can practitioner and adult learner leaders do when policymakers do not understand that adult literacy education must be a legislative priority? How can we help them to take more responsibility for this? Legislators have become supporters, even activists, after learning directly from students or program graduates how much they have gained from adult literacy education programs. Learners and practitioners can visit legislators. Legislators learn how students have gained confidence, and how they and their families have improved their economic situation, literacy skills, and health. A particularly effective strategy used in Massachusetts involves having adults who are waiting for basic skills classes send postcards to their legislators. The large number of people who must wait months or years for these critical services is a measure of constituent dissatisfaction that captures lawmakers' attention and helps them become accountable.

In Pennsylvania, practitioners who learned about the Massachusetts postcards have planned a different strategy: asking students to send cards to legislators when they have earned their GED or other adult diploma, thanking them for a public investment that has paid off. Another effective strategy is a coordinated campaign in which legislators are invited to visit programs and talk with students. In Massachusetts a few years ago, a legislator who was skeptical about the value of adult literacy visited two programs in one week and turned around: he is now an ardent adult literacy advocate. With all these efforts, practitioners can be more effective if they work together, meet regularly, set an annual agenda, communicate frequently with others in the field, and dig in for the long haul.


About the Author

David Rosen is a member of the Massachusetts Coalition for Adult Education Public Policy Committee and Moderator of the National Literacy Advocacy electronic list. He has been an adult literacy activist in Massachusetts for 15 years.
