Relationships Between Quantitative Measures of Evaluation Plan and Program Model Quality and a Qualitative Measure of Participant Perceptions of an Evaluation Capacity Building Approach

Jennifer Urban, Marissa Burgermaster, Thomas Archibald, Alyssa Byrne

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

Despite a heightened emphasis on building evaluation capacity and evaluation quality, there is a lack of tools available to identify high-quality evaluation. In the context of testing the Systems Evaluation Protocol (SEP), quantitative rubrics were designed and tested to assess the quality of evaluation plans and models. Interview data were also collected and analyzed using a priori codes. A mixed methods approach was used to synthesize quantitative and qualitative data and explore trends. Consistencies between data types were found for attitude and capacity, and disconnects were found for knowledge, cyberinfrastructure, time, and quality. This approach to data integration represents a novel way to tap the generative potential of divergence that arises when different methods produce contradictory results.

Original language: English
Pages (from-to): 154-177
Number of pages: 24
Journal: Journal of Mixed Methods Research
ISSN: 1558-6898
Publisher: SAGE Publications Ltd
Volume: 9
Issue number: 2
DOI: 10.1177/1558689813516388
State: Published - 15 Apr 2015

Keywords

  • evaluation capacity building
  • evaluation plan quality
  • logic model quality
  • mixed methods
  • systems evaluation

Cite this


Urban, Jennifer; Burgermaster, Marissa; Archibald, Thomas; Byrne, Alyssa. Relationships Between Quantitative Measures of Evaluation Plan and Program Model Quality and a Qualitative Measure of Participant Perceptions of an Evaluation Capacity Building Approach. In: Journal of Mixed Methods Research, Vol. 9, No. 2, 15 Apr 2015, p. 154-177. https://doi.org/10.1177/1558689813516388
