The Creative Side of Evaluation: What I Love the Most
Talking about evaluation with colleagues brings up a range of reactions: some don't know what evaluation is, some avoid the subject because it feels too technical and complex, and others think all I do is stare at spreadsheets and data all day long. And they are not entirely wrong. A big part of my job is indeed complex, and it does involve looking at data from multiple perspectives and for multiple purposes.
But there is a creative part to evaluation that activates and engages my curious mind (bordering on obsession), and that is something that really energizes me about this career.
Recently, I have been creating evaluation tools to help federal program managers improve their evaluation efforts. By evaluation tools, I mean frameworks and structures that help people see the story behind their work, as well as AI tools that remove the burden of paperwork and free up time for the actual evaluation process — but that is a topic for another blog post. In this one, I want to talk a little bit about a recent project, full of creativity, that I loved being involved in.
Just this week, I drove from the Greater Tampa Bay Area to Daytona Beach, on the opposite coast of the Sunshine State. The purpose of this trip was to unveil a Parent and Family Engagement (PFE) Evaluation Tool at the ECTAC (East Coast Technical Assistance Center) Member Meeting.
From Requirement to Reflection
To give a little context (in case you are outside of the federal programs world), parent and family engagement is one of the biggest requirements in federal programs under the Elementary and Secondary Education Act (1965), as amended by the Every Student Succeeds Act (2015). Why? Because both the law and those of us in education understand the value of engaging parents and family members in their children's learning process. This is especially relevant for Title I, Part A, a federal program designed to support the academic achievement of students from disadvantaged backgrounds.
Wisely, as part of grant requirements, districts must allocate a portion of Title I funds to support parent and family engagement. And because this is such an important investment, there is an evaluation component attached to it to ensure that these activities are meaningful and effective.
That’s where my role came in!
ECTAC district members from different counties across Florida voluntarily joined efforts to develop an evaluation tool for PFE activities. With the guidance and facilitation of ECTAC leadership, we met to brainstorm, draft, and refine the structure of what eventually became the Parent and Family Engagement Evaluation Tool. This was not just a compliance checklist. It was a shared effort to create something practical and reflective that helps districts and schools evaluate the content and effectiveness of their engagement activities, not just document their completion.
Collaborative Creativity
The best part of this process was how collaborative it was. Every district brought a different lens: some focused on researching evidence-based practices, others on providing the PFE context to ensure the tool aligned with real needs, and another group outlined the specific steps to guide the evaluation plan. Together, we identified common indicators, defined what "content and effectiveness" means in the context of our project, and connected those indicators to outcomes that districts and schools could actually see and measure.
My focus was on taking all that collective knowledge and expertise and turning it into an evaluation framework. Throughout this process, I found myself designing, visualizing, and connecting dots across concepts and expected outcomes. Using strong research as a foundation, I developed a theory of change that became the backbone for a dynamic evaluation framework. I call it dynamic because it goes beyond a static checklist or single data source. It brings together quantitative and qualitative components to capture both measurable results and the lived experiences behind them. It supports descriptive, comparative, and explanatory analysis, allowing users to explore what happened, how it compares to expectations, and why it worked (or didn’t).
Most importantly, the framework is flexible. It can be implemented at the school or district level depending on available data, time, and skillset. Whether a school has access to robust survey data or relies on narrative reflections, the framework adjusts, meeting teams where they are while still keeping the focus on learning and improvement.
Why Creativity Belongs in Evaluation
Working on this project reminded me that creativity is at the heart of evaluation. Behind every framework, logic model, or rubric is a design process that requires imagination, empathy, and vision. Creativity in evaluation means thinking about how people will use the tools we build and how data will lead to understanding, not just reporting. It’s about designing structures that simplify complexity without losing meaning.
When I’m building evaluation tools, I often think of them as bridges because I am connecting the technical side of research with the practical realities of classrooms, families, and schools. The creative process is what makes those bridges usable. It helps transform abstract indicators into stories of change that educators and community members can recognize and connect with.
And that, to me, is one of the great strengths of evaluation: when done thoughtfully, it becomes both analytical and artistic, structured and intuitive. Creativity is what turns data into meaning and measurement into impact.
Some pictures of the PFE Evaluation Tool Project team: