This week, the eminently useful Brigada Today newsletter (in addition to drawing attention to GMI’s Agency Web Review) posted an item from someone looking for a self-assessment survey for local church missions programs, recalling that ACMC used to offer one.
Responses were subsequently posted by Marti Wade (who does great work with Missions Catalyst), noting that the tool is still available via Pioneers, which assumed ACMC’s assets when it folded, and by David M of Propempo International, which also offers a version of the tool. A snapshot of an excerpt from the ACMC/Pioneers version appears above.
Seeing the ACMC survey brought back a memory from a 2003 project that GMI did for ACMC. We surveyed 189 ACMC member churches to understand the status of church mission programs as well as their needs and goals. The survey included each of the twelve questions from the self-assessment grid.
Subsequently, we did statistical modeling to determine whether, and to what degree, various church missions program elements were associated with growth in missions sending and with missions budget as a proportion of overall church budget.
Unfortunately, most of the correlations were not statistically significant, and those that were significant were negatively correlated—meaning churches that rated their mission program highly (placing greater priority on the various dimensions) tended to demonstrate less growth in sending or lower relative financial commitment.
How could this be?
It turns out that this is a fairly common outcome of self-assessment exercises. In short, teams with excellent performance also tend to have high standards, and their vision for growth frequently leads them to be more self-critical than lower-performing teams, which often have lower standards.
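The mechanism can be illustrated with a toy simulation (purely hypothetical numbers, not the 2003 data): if stronger teams also hold themselves to proportionally higher standards, the ratings they report can correlate negatively with their actual growth, even though true quality drives growth.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 189  # same size as the ACMC sample, but entirely simulated

# Hypothetical model: true program quality drives sending growth.
true_quality = rng.normal(0.0, 1.0, n)
growth = 0.6 * true_quality + rng.normal(0.0, 0.8, n)

# Stronger teams set a higher bar, so the rating they report is
# quality measured against their own (higher) standard.
standards = 1.5 * true_quality + rng.normal(0.0, 0.5, n)
self_rating = true_quality - standards + rng.normal(0.0, 0.5, n)

# Correlation between self-rating and growth comes out negative:
# the best teams grade themselves hardest.
r = np.corrcoef(self_rating, growth)[0, 1]
print(r < 0)
```

The specific coefficients are arbitrary; the point is only that once standards rise faster than quality, self-ratings invert as a performance signal.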
So, am I discouraging local churches from using the Mission Assessment Tool? Not at all. I encourage churches to download it and use it as the basis for discussion; it can be a great discussion starter for vision building, clarifying core values and identifying priorities for development. For the reason described above, you may find that team members differ on where the program stands, or on where the opportunities for growth lie.
But when program evaluation is the goal, it helps to have outside eyes provide the perspective. Those well equipped to offer feedback on a church’s mission program are:
1. Those served by or in partnership with the mission team, such as missionaries who may have other supporting churches (their responses must be kept anonymous), and/or
2. Outside consultants who work with many church mission programs and have a valid basis of comparison.
Meanwhile, at the 30,000-foot level, researchers, missiologists and consultants are eager to discover the key differences between high-performing church mission teams and others. The statistical modeling sought to answer the question: What program elements are the most common outflows (or drivers) of increased financial/sending commitment: Better mission education? Better worker training? Greater emphasis on strategy? More local mission involvement? This is where self-assessment bias—seen across a sample of 189 churches—becomes a problem.
One helpful approach is to focus on relative data. Were we to re-examine the analysis today, I would be inclined to transform the raw data into relative performance rankings (each church’s perception of its relative strengths and weaknesses). This compensates for differing standards of excellence by looking at each church’s priorities.
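As a sketch of that transformation (with made-up ratings, not the study data), each church’s raw scores can be converted to within-church ranks, so a tough grader and a generous grader become comparable:

```python
import numpy as np
from scipy.stats import rankdata

# Hypothetical 1-6 commitment-level ratings for three churches on
# four of the twelve program elements.
elements = ["education", "training", "strategy", "local involvement"]
raw = np.array([
    [6, 5, 6, 5],  # generous grader
    [3, 2, 4, 2],  # tough grader with a similar profile
    [5, 5, 3, 4],
])

# Rank each church's elements against its own ratings (ties get the
# average rank), so differing standards of excellence cancel out.
ranks = np.apply_along_axis(rankdata, 1, raw)
print(ranks)
```

On the raw scale the second church looks uniformly weaker than the first; on the ranked scale both surface strategy as a relative strength, which is the kind of signal the modeling would need.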
Self-evaluation bias can also be reduced by developing assessment tools with response scales/categories that are so observably objective that they cannot easily be fudged. The ACMC tool uses descriptions for each commitment level that are intended to be objectively observable, but in some cases they are subject to interpretation, or a higher-level condition may be met while a lower-level condition goes unfulfilled. In the 2003 study we gave respondents specific instructions to proceed through each scale a step at a time, stopping at the lowest unmet condition. However, such an instruction may not have been enough to overcome the desire of some respondents to affirm their church’s mission program with high marks.
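That step-at-a-time instruction amounts to Guttman-style scoring, which could be enforced in software rather than left to respondent discipline. A minimal sketch (a hypothetical helper, not part of the actual tool):

```python
def stepped_score(conditions_met):
    """Credit levels only up to the first unmet condition,
    mirroring the step-at-a-time survey instruction."""
    score = 0
    for met in conditions_met:
        if not met:
            break  # stop at the lowest unmet condition
        score += 1
    return score

# A church meeting levels 1-2 but not 3 scores 2, even though the
# level-4 condition happens to be satisfied.
print(stepped_score([True, True, False, True]))  # prints 2
```

Scoring this way in the instrument itself, rather than asking respondents to self-police, removes one avenue for grade inflation.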
This issue also points to the importance of testing assessment tools for validity among a pilot sample of respondents—with results compared to an objective measure of excellence.
Take care with self-assessment. After all, scripture itself warns us in Jeremiah 17:9 that “the heart is more deceitful than all else and is desperately sick; who can understand it?”