
EDITORS' NOTES

This volume of New Directions for Evaluation, devoted to applications and discussion of data collection and analysis methodology, introduces social network analysis (SNA) and its application to evaluation practice. The chapter authors briefly describe the methodology, provide an overview of the measures that comprise some of a social network analyst's tools, and illustrate the application of the method to four evaluation case studies. The final chapters include a personal account of current SNA use by a government agency and suggestions for future use of SNA in evaluation practice.

SNA has a long and complex history that we barely touch on in this volume. Our intent is to provide an introduction to the methodology that might pique the curiosity of a variety of readers. Network analysis methodology can be extremely quantitative and technical. One of the criticisms of the field of SNA is that the bulk of the work is academic and does not bring forth simple or practical applications. Few texts within the field of network analysis provide both an understanding of the concept and clear directions and processes for applying the methodology. There are currently no known books that specifically apply network analysis to program evaluation. We believe that this volume can help bridge the gap between the development of a methodology and its use within a specific discipline such as program evaluation.

SNA is based on a way of viewing evaluation through the lens of relationships, compared to traditional methodologies that can be categorized as attribute focused. The application of SNA is not determined by the context of the evaluation or by the subject area or field, such as whether the evaluation is formative or summative, value free, utilization focused, in-house or external, in the area of education reform, about online collaborations, or a measure of the quality of management. The use of SNA is determined by questions about the nature of the relationships within networks relative to a specific project, program, or initiative, whether those relationships are between people, organizations, events, or ideas. SNA can provide evaluators with tools for identifying and exploring the structures that form or are formed by networks, whether they are formal or informal, static or changing. The emphasis in this volume is on highlighting the conceptual framework and differentiating SNA from traditional methods.

The listing of the editors in this volume is alphabetical and represents equal contributions by both.

NEW DIRECTIONS FOR EVALUATION, no. 107, Fall 2005 © Wiley Periodicals, Inc. Published online in Wiley InterScience (www.interscience.wiley.com) DOI: 10.1002/ev.156


Because this is an introduction for the field of program evaluation, we are writing for a broad range of readers. The volume will be differentially applicable for four levels of practitioners. First are evaluators who want to increase their understanding of SNA and the processes and procedures for using the methodology to answer specific evaluation questions. We provide enough detail on the measures illustrated in the case studies to give this level of practitioner a conceptual framework and procedural steps for including SNA within an evaluation project. We do not, however, intend this volume to be a practical manual; rather, it is a step forward in clarifying the conceptual framework for SNA, which differs from that of traditional methodologies, and an opening of the discussion for taking the next steps. To implement SNA within an evaluation design, the evaluator would need further skills: to develop questions appropriate for the data, to align SNA measures to evaluation questions based on the theoretical framework of the evaluation study and the measure, and to be proficient at using one of the software packages available for SNA or at creating the necessary algorithms with other statistical packages.

The second group of practitioners consists of evaluators who are interested in understanding more about the choice of questions appropriate for SNA and the alignment of a question to a specific measure. As we and the chapter authors explore the methodology further in Chapter Three and in the case studies, practitioners can see how SNA questions provide a different view of evaluation and how this view relates to more traditional evaluation questions.

The third set of practitioners comprises those who might want to explore connections between an SNA conceptual framework and other theoretical frameworks, such as systems perspectives, complexity, and the communication and other networks that may define an evaluation project. These areas are touched on in Chapter Three's exploration of the methodology and in the final chapters' discussion of the potential for future applicability in evaluation practice.

In the fourth group are practitioners who want a broader view of a methodology but are not currently interested in pursuing any specific applications. We have written this volume so that evaluation practitioners can explore the methodology without feeling they are reading a textbook. We acknowledge that this volume is, metaphorically, the tip of the iceberg in the applications of SNA to evaluation practice.

We begin by clarifying and defining terms, setting the parameters of the evaluation practice that we are addressing, and discussing what determines a methodology. Chapter Two briefly reviews the history and development of SNA. SNA is not a new methodology, and glimpsing its historical roots provides a clue as to its complexity, its theoretical foundations, and the current interest in its applications.


Chapter Three illustrates and explores the most common measures used in SNA. Here Maryann Durland presents two themes. First, in the application of SNA for evaluation purposes, the choice of a measure is aligned with the theory of the relationship under study. Understanding the basic structure of the measures and the levels of analysis provides a framework for an evaluator to determine whether a measure is appropriate for a particular evaluation question. Second, many of the algorithms and analysis tools in SNA are also used in traditional methods of data analysis. Durland highlights in Chapter Three that although the mathematics and statistics are similar, the interpretations are conceptually different. Chapter Three is not meant to be a manual for immediate application but rather a framework for understanding the concepts and language of SNA, and it will guide the evaluator through the overall language and conceptual framework in reading the case studies that follow.

The next four chapters are case studies that illustrate how SNA was used to answer specific evaluation questions. The authors of the fourth chapter, Susan Kochan and Charles Teddlie, used SNA in an early evaluation of the communication patterns in a high school study as part of a larger evaluation. Their chapter illustrates two important points: how SNA can fit into an evaluation design and how the technology and methodology of doing SNA have advanced and improved over the processes of mapping a network by hand. When this evaluation was first conducted, the tools for doing SNA were not readily available to researchers or evaluators. Most of the computer tools were complex and mainframe based, so individuals familiar with the methods created maps by hand and calculated the measures with mainframe programs. This chapter clarifies the differences between hand-drawn maps and those calculated by algorithms. The purpose of maps, as illustrations of calculations, and the alignment of maps to the measures are topics of an ongoing conversation within the SNA community.

Chapter Five offers the network analysis of a program for the developmentally disabled. Kimberly Fredericks presents the findings from the network analysis of a demonstration program and compares them with the findings of a comprehensive program evaluation. This case analysis was conducted by interviewing key stakeholders at evaluation implementation sites across one state. The chapter illustrates how SNA, with its analysis of networks of key actors, was used to understand an evaluation question. It shows the complexity of decision making and how multiple ways of viewing help to clarify the parameters of the decision and guide decision-making processes.

In Chapter Six, Sandra Birk shares an example of using SNA to answer one complex evaluation question with forty-seven individual networks. It is an instance of a highly specific evaluation focus: the evaluation of capacity, defined as the ability to conduct the next generation of research. In this context, capacity refers to the individuals who have specific knowledge in any of the critical areas. This evaluation reveals how the patterns of networks illustrate differential levels of capacity, demonstrating how findings often support intuitive information that has not been previously identified and how SNA can provide formative information. One result from the evaluation was as simple as a manager realizing it was necessary to go back and meet individuals on work teams who had been identified by others as important.

Chapter Seven, by Maryann Durland, is an evaluation of the integration of two large departments. The merger was an initiative planned to increase capacity and efficiency. The integration had been well planned and included moving individuals around from one of four locations; all of the participants had had extensive preparation, training, and time to meet prior to the integration. The evaluation was conducted to examine the extent to which the two departments were integrated and working on common projects.

We end with visions and suggestions for the future. Chapter Eight, by David Introcaso, is a reaction to and summary of a current application of SNA to evaluation practice. Chapter Nine presents our views on the potential of the methodology for application in evaluation practice. We hope our discussion in this chapter will open the way for further conversation on the methodology, its application to evaluation practice, and the next steps for evaluation. It is our intention that this volume both inform and spark a new area of interest for evaluators. We look forward to further discussions, debates, and sharing of application details about SNA with other evaluation practitioners.

Maryann M. Durland
Kimberly A. Fredericks
Editors

MARYANN M. DURLAND is an independent consultant specializing in evaluation and the applications of social network analysis.

KIMBERLY A. FREDERICKS is assistant professor of public administration and policy in the Department of Political Science, Indiana State University.
