
A REPORT ON IMPROVING AND DEVELOPING
AN EVALUATION PROGRAM


PREPARED FOR

THE RESEARCH DIVISION
THE NATIONAL ENDOWMENT FOR THE ARTS
WASHINGTON, D.C. 20506
PURCHASE ORDER NO. C80-441


UNIVERSITY ASSOCIATES, INC.
MAY 20, 1980


University Associates, Inc.
475 L'Enfant Plaza, West • Suite 2100 • Washington, D.C. 20024 • Telephone 202/554-4710






PREFACE 



In February, 1980, the Evaluation Unit within the Policy and Planning organization was discontinued as a distinct entity, and its responsibilities were assigned to the Research Division.

In March, 1980, Harold Horowitz, Director, Research Division, inquired about my interest in working with him during the planning period as he developed a systematic evaluation study program. Since evaluation study approaches have been a long-term interest of mine through work along this line for the U.S. Office of Education and, earlier, the National Science Foundation, I enthusiastically accepted. We agreed that in 25 days I would review completed and on-going evaluation studies and make recommendations for improvement.

I have read stacks of reports, documents and guides. Possibly, I became involved in substance and content in greater depth than necessary. Program directors and other NEA officials freely discussed evaluation problems with me, and I'm sure that when they saw my interest, they willingly extended their discussions to include the excitement of their activities. Also, I have held discussions with individuals outside the NEA staff who are knowledgeable about evaluation in the arts. Mainly, in this project I have sought ideas from others, talked about my experiences, and in this report have included my biases. The Research staff have generously given me as much time as I needed. Bill Potter's help has been invaluable, as has been the assistance of Candice Parrish and Maryann Gerard. Thanks to everybody concerned.

Early in my work I became convinced that the assignment could be expanded, and that more would be gained by doing so than by concentrating all of my time and attention on completed and on-going studies. I discussed with Mr. Horowitz my desire to range farther and consider some factors in launching a viable system of evaluation studies. Section 1 covers that aspect of my thinking. Its purpose is to state the many areas that should be included in a comprehensive evaluation program. The details for the parts are not presented, and consideration has not been given to scaling the overall plan to mesh with the resources likely to be available. Section 2 deals with on-going and completed studies. The discussions that I have held with Harold, and will hold in the near future, will be a main contribution. This report is directed to him for his consideration.



Denzel D. Smith 
Senior Consultant 
University Associates, Inc. 






Section 1 



The Setting 

It is not the purpose of this report to reinvent the term evaluation, enumerate technical strategies, or debate the effectiveness of particular evaluation methodologies. Evaluation of some kind happens, as it always has, for any program. We attach different values to the ranges of information available for decision-making situations. Some leaders feel, for example, that the realities of political imperatives preclude the use of impact evaluation. However, the timing seems right to seek an improved vantage point for understanding the extent to which the Endowment is meeting stated goals.

Similar statements are expressed regularly concerning the planning function. Yet the commitment to effective planning for the Endowment has been made. To many people, the neatness of the planning exercise is not considered very important in the decision-making process. To them the exercise simply provides better copy for presentations, for example, budget presentations. Also, precisely how information on program effectiveness and impact can be used to facilitate management decisions is not well communicated, and it is often not believed when communicated.









The Endowment is committed to extensive planning -- short range, middle range and long range. I have discerned, however, that it appears less committed to developing evaluation in a systematic manner as a means of better understanding progress toward achieving goals.

Structure and Substance 

What, then, should be the structure and substance of an organizational unit with responsibility for providing important evaluation information?

First, all levels within an agency should accept responsibilities for demonstrating progress toward meeting their objectives and the agency's goals. This is a cooperative venture. As program planning begins, attention should be given to stating objectives; the main strategies should be named and some indicators identified that show progress. A key factor is that evaluation requirements should be identified during early planning phases.

The evaluation staff of the Research Division should have the competence to provide technical assistance to program leaders in the form of training when requested and advice on how achievement might be considered. The function should be cast at the program level in terms of facilitating the effectiveness of a program, and at the agency level in terms of facilitating support toward goal attainment. The supportive service would not duplicate other services or assume the responsibilities of other offices in preparing program objectives or identifying measures of progress.

The Planning and Budget Division, of course, has primary responsibilities in developing and implementing the planning function for budget considerations. The Office of Program Coordination has the important role in planning at the program level. At the Council level, it is said that planning is the responsibility of all -- the Council, the panels, the program directors. With this in mind, in order for a network of meaningful information to flow, input on how progress is to be measured must be provided. Unless evaluation considerations are taken into account at all stages of planning by these groups, two results can be expected. First, planning will become a narrow channel providing a script for budget planners. Second, evaluation studies will continue to be descriptive studies without a purpose or program application. Staff specializing in evaluation approaches should be involved. How the competence should be developed, or where personnel are stationed, should not cause administrative irritation. Certainly the Research Division must develop technical competence in evaluation methods.

Second, postponing the identification of measures and indicators showing objective and goal attainment tends to minimize the role of evaluation in the planning/management evaluation scheme.






Improved planning capability and increased competence in 
research and evaluation are evident under the Office of the 
Deputy Chairman for Policy and Planning. Equally visible is 
an increased interest in planning among program leaders and 
the Office of Program Coordination under the Deputy Chairman 
for Programs. These developments lead to the expectation that the role of evaluation will receive more prominence than it has previously attracted -- not less.

Third, the substance of evaluation programs is dependent 
upon a good network that identifies issues. Top management, 
the Council, the panels, and the program directors could be 
more effective in seeking evaluation capabilities that are 
problem-solving oriented at the policy level. Research and
evaluation studies should be expected to provide assistance 
in understanding the work of the Endowment. To date, 
expectations seem especially low as far as evaluation studies 
are concerned. For a variety of reasons, the Research 
Division has been relegated to being an entity of its own. 
The role of evaluation should not be considered as a mere 
add-on. The research and evaluation programs should be 
in the mainstream and the function oriented toward issues 
and problems in goal attainment for the Endowment. 

Fourth, the Endowment may not have identified and communicated its data requirements. There are information sources within the programs and at the general administrative level. Some program directors feel that too many data collections are incomplete, contain inaccuracies, and are not current. Others believe their needs are met by supporting non-agency data gathering groups. Some program leaders develop their own capabilities in data collection. Internally, there are data processing developments to capitalize on application information. And, of course, there are the usual fiscal, budget, and grant administration data collection operations. There are contracts for developing the format for gathering data, e.g., the National Information Systems Project. Doubtless, there are other programs underway. Any program, especially a research and evaluation program, is dependent upon reliable statistical information. That, however, is not the most important consideration. The basic question is to determine what information is required to meet the needs of the Endowment.

In 1977 the Research Division accepted a very good final report on a feasibility study for an economic data program on the condition of the arts and cultural organizations.* Recommendations from this report have been followed in research studies. But it does not appear that the report was used to define data requirements for the Endowment. To clarify and integrate the thinking about information requirements, it is suggested that a working group be formed under the direction of the Deputy Chairman/Policy and Planning with representation from all data users. The Research Division could provide the coordination and assist in identifying linkages among the many data requirements. It is necessary to emphasize that evaluation supportive of Endowment policies will be suspect until these data acquisition and data use areas are better understood.

* Grant No. RQO-22-3N, October 31, 1977, by the Graduate School of Public Administration, New York University.

Fifth, in implementing evaluation studies in a systematic 
fashion, there should be less separation between research 
and evaluation studies. For example, a program of research 
can be responsive to program needs as well as emphasizing the 
impact analysis requirements for Endowment policy planning 
and decision making. Also, the designs for evaluation 
studies can be of high quality in terms of technical 
considerations and problem solution. 

The term evaluation is disturbing to many managers and leaders for reasons stated so often that they are not repeated
here. Emphasis should be placed on analysis of data and 
interpretation of information. The requirement is that 
researchers and evaluation specialists work with leaders in 
defining issues and purposes that require solution. The 
results need to be important to members with responsibility 
for policy issues as well as to program leaders. 






Sixth, program leaders conduct self-evaluation and internal evaluation studies differently, with varying quality standards. Some programs will have a stated impact requirement, for example, the Office of Partnerships, State Programs. Some programs may not have explicit evaluation statements. Nevertheless, self-evaluation and internal program analysis studies at the program level should be encouraged. This approach tends to be less threatening and is useful in improving program effectiveness.

Seventh, a plan for intensive program review should be 
implemented to supplement the continuing data collection and 
evaluation efforts of each program. 

Such a plan would provide an in-depth review of a few programs each year, based on available resources. Innovative arrangements could be devised using, for example, cross-cutting themes such as fellowships, touring, etc. A schedule indicating a review every three years, for example, would encourage program staff to be aware of and prepare for such a review. In this way the network communicating results on issues enhances the chances for timely use of such results in policy discussions.

Eighth, provision should be made for discrete, once-only evaluation studies. This is necessary to meet crises. It is unrealistic to believe such needs will not occur. Even with a first-rate systematic evaluation approach, "crisis" responses will be requested. However, a research and evaluation unit should not be forced to expend many resources on "one-shot" studies.

What does the above say about the development of a 
systematic program in evaluation? 

1. Blend the research program and the evaluation requirements 
so that the purpose of these activities can be communicated 
in terms of assistance in solving Endowment policy issues 
and problems. 

2. Data requirements for the Endowment should be clarified
and defined. The Research Division, as a member of an
Endowment-wide working group, should play a coordinating
role.

3. In support of all groups with responsibilities in planning 
at the program level, develop a capability for technical 
assistance in the phase of planning dealing with objectives 
and identifying measures of objective attainment. This 
would be a complementary role that would contribute to 
improved working relations.

4. Seek the opportunity to provide technical and professional assistance to groups responsible for short and long range planning, from the standpoint of laying the groundwork for impact studies to be conducted at later dates. In this way the gap between meeting program objectives and meeting Endowment goals may have a better chance of being bridged.

5. Develop a systematic multi-year program of evaluation 
research that is directed toward Endowment policy and 
decision making. 

6. Develop a scheme for periodic intensive external evaluation of each program on a set schedule. For example, resources might allow intensive study of five programs per year, to be repeated every three years. Innovative arrangements could be devised using cross-cutting themes, e.g., fellowships, touring, etc.

7. Reserve resources for a few special, one-of-a-kind evaluations each year, undertaken in response to important, pressing requirements.






Section 2 



A requirement in the work order for consulting services 
is to review completed as well as on-going evaluation studies 
to assess procedures and circumstances affecting successful 
completion of the projects. These studies have been reviewed 
and reactions have been communicated in discussions with 
members of the Research Division staff. Also, discussions 
about the uses of the results of evaluation studies have been 
held with many program directors, other NEA officials and 
several individuals outside NEA who are knowledgeable about 
arts administration. 

The negative aspects noted in reviewing completed evaluation studies could be listed, but that approach has not been productive in this situation, simply because conditions are different now than previously, and the capabilities within the Research Division and throughout the Endowment are much improved in relation to the period when the Evaluation Unit functioned as a separate entity. It would be interesting to search out usable evaluation information and impact findings and trace their uses. Tracing uses is an interesting technique but is very time consuming and was not pursued. Informal discussions on uses of findings of completed evaluation studies were not productive either.






The Evaluation Report 

1. The final reports are too long and difficult to understand. This statement is a common one, and expected. The reasons for the lengthy presentations in reports are many. Sometimes the sponsor desires a wide distribution of a report and often encourages long programmatic descriptions. Evaluation reports should be directed toward a stated user with well-defined purposes, and the contractor should not be in doubt about this.

2. Evaluation reports too often describe activities. The analysis, assessment and interpretation become secondary. Program staff need good descriptions of project activities to meet a variety of requirements. But evaluation studies may not be the best source of description of program activities. Too, there may be confusion in terminology in some situations. Simply describing all activity is not assessing its effectiveness, its quality, or its impact. There are two considerations here. (A) More precise language should be written in the solicitations. Questions listed in solicitations too often encourage description. (B) The language of evaluation is not common, not standard among the program staff and the researchers. The Research Division should provide a common vocabulary as a means of improving internal communication. This seems like an opportune time to establish a common technical language in research and evaluation.






3. The format for evaluation reports may be confusing to a contractor and may dull the interest of a targeted reader. For example, in several reports the executive summary is just another report, and the importance of outstanding impacts becomes less distinguishable. It would be better to direct the contractor to deal with significant findings. The Research Division should assume responsibility for preparing the action document when the report is forwarded to a target administrator. In this way the issues can be restated, the policy problem that had been the basis for the evaluation study can be laid out, and the action that should be taken can be stated by the competent professional within NEA. All of the above should be performed within the time frame previously established by the administrator responsible for action on the issue.

The Solicitation 

These paragraphs are directed toward improvement in solicitations and do not enumerate the many good elements in the format in use.

1. The most important improvement should occur prior to preparing and issuing the solicitation. The policy issues that suggest an evaluation study should be documented. The present solicitations state questions to be answered, but the statement of issues and the problems to be solved may not be evident. What policy issue is being discussed?

The name of the individual or group, e.g., a Congressman, wanting an evaluation of a program may be known to the staff, but this may not be documented. Statements on issues, a statement of problems to be solved, who wants the evaluation, and what action is required are items to be accepted and approved. Agreement on these topics would help to understand the needs of top management, i.e., is the evaluation study an expedient dealing with funding or is it intended to clarify the role of the Endowment?
2. The substantive content of the solicitation would profit from more critical analysis. The suggested questions to be answered in a solicitation should not be a long shopping list. The staff should discriminate between information that is required, and why, and information that would be "nice to know."

The Design of Evaluation Studies 

There are few comments concerning the design of completed and on-going evaluation studies. The methods of sampling and the approaches used vary in quality, as expected. As the NEA staff gain more experience and acquire more technical competency in evaluation methods, data analyses and interpretation will improve. Even so, the turning point in using the results of evaluation studies will be the astuteness of the design for evaluation studies. There is no reason to believe that questionnaire surveys will be discontinued, but there will not be many questions that ask a respondent simply whether or not the grant funds received were useful. Also, for many years NEA will depend upon secondary sources for data for use in analysis, but the responsible NEA staff will have a better understanding of the accuracy and comprehensiveness of the data collections available. Too, the quality of evaluation proposals should improve, as there are several research groups around the nation with good track records in evaluation research and eager for business. Be more aggressive in selection.

Monitoring a Contractor 

The important work of the staff is done up front — the 
pre-solicitation definitions, the staff decisions on acceptable 
possible methodologies, the clarity and tightness of the 
solicitation, and the definiteness of design during contract 
negotiations. Assistance to the contractor in implementing 
the study continues, of course, but after the up-front work 
is completed, it is better to stay out of the way, to keep 
informed and helpful so that the need for intervention is 
unlikely. The research staff is competent, and further discussion of that point is unnecessary. However, because evaluation results are feared by some, and there are those who feel that evaluation studies have little value, a review of the philosophy and strategies in monitoring is suggested. For example, it isn't useful to complain about the use of a data source after the study is completed. It doesn't do much good to ask a contractor for monthly statements of progress if the material is not used except for administrative protection.

Finally, the evaluation studies to date may have been used in more ways than have been suggested. For example, information may be used for general purposes in preparing program reports. The fault is that the results of evaluation studies have had limited acceptance by the Endowment staff and have been given a place of low value as aids in solving problems. Now, it can be hoped, the thrust will be different.