Category Archives: Research Blog

Using research to help create the future

This week the International Association of Missionary Aviation has its annual meeting in Idaho.  GMI board member Jon Lewis is the plenary speaker.

Five years ago, GMI played a significant role in IAMA’s annual meeting, as we presented the results of a multi-year research project looking at the present and future of mission aviation.  The idea behind the FlightPlan project was that global mission was trending away from an emphasis on overcoming physical barriers and moving toward an emphasis on overcoming political, cultural and religious barriers.  In such a world, what might be the appropriate – or potential – roles for the people and tools of the mission aviation community?

A cornerstone of GMI’s 184-page report was a set of seven prospective “models” for ministry.  These models emerged from an analysis of conditions and needs in mission and in general aviation – but also by looking at innovative enterprises in sectors that are “near neighbors” to mission aviation:

  • Organizations on the fringes of the mission aviation sector, such as Wings of Hope, a non-sectarian group that was later nominated for the Nobel Peace Prize.
  • Commercial entities that complement or parallel mission aviation, such as air taxi service and fractional jet ownership.
  • Organizations in the supply chain of mission aviation, such as aviator training schools and small-aircraft developers like Quest Aircraft.
  • Organizations that deal in similar activities to those of mission aviation, such as the global logistics industry and the UN’s World Food Programme.
  • Organizations that could be viewed as competitors to mission aviation, such as the Aga Khan Development Network, which sponsors community air service in the spiritual “Tension Belt” of Africa.

This near-neighbor approach is a systematic, intentional way of developing viable new models for business or ministry.  We didn’t think this up on our own; we borrowed the concept from Kim and Mauborgne, the authors of Blue Ocean Strategy (who are said to have built on ideas from Clayton Christensen and others).  They say:

The process of discovering and creating blue oceans [new models and markets] is not about predicting or preempting industry trends.  Nor is it a trial-and-error process of implementing wild new business ideas that happen to come across managers’ minds or intuition.  Rather, managers are engaged in a structured process of reordering market realities in a fundamentally new way.  (pp. 79-80)

That quote captures the kernel of using research in strategic planning.  Are you engaged in that process?

For the FlightPlan project, we used the process to identify seven models that organizations could use to challenge their strategic thinking and focus their strategic planning.  These included:

The Agile Provider: In a world where change is constant, this provider (or network) is ready for anything – broad and acute needs, short- and long-term deployment, people, skills and/or cargo. The Agile Provider has the resources, processes, flexibility and drive to deliver many resources in many places, on many scales, for many purposes – with the ultimate purpose of representing Christ to the world.

The Nation Developer: This knowledgeable organization assists in the development of transportation and communications infrastructure in nations that have a combination of spiritual needs and capacity-development needs, with focus on nations that have not been open to traditional forms of Christian witness.

The Field Opener: Because places remain where Christian workers – and thus the gospel – face physical barriers that prevent or delay access to God’s word and the Church, this provider (or network of providers) efficiently develops air access in remote areas, paving the way (sometimes literally) for others who are good spiritual and financial stewards of the access provided.

The Tribal Advocate: As tribal peoples direct the development and application of technologies to meet their current and future needs, knowledgeable Christian individuals and groups assist and advocate for them, honoring their decisions and partnering with them in carrying out their priorities and achieving their goals.

The Microaviator: A missionary, church or national church planter who uses his or her own plane as a personal vehicle to get from place to place quickly and safely, or who hires an air taxi service to do so. Microaviators typically use very small planes to do their work. They consider themselves missionaries first, aviators second. Microaviators may also include churches that charter business aircraft to transport short-term teams.

The Business Creator: An enterprise that uses business-as-mission strategies to establish aviation-related commerce, jobs and influence in cities and villages in less-reached areas. Independently, or in partnership with nationals, the Business Creator improves livelihoods for believers and unbelievers, builds goodwill, sets a positive example through faith and lifestyle, and creates natural evangelism opportunities.

The Resource Broker: This provider obtains, enhances and deploys valuable time and technology resources for aviation as they become available. The Resource Broker skillfully identifies, evaluates and capitalizes on resources that may be donated, loaned, salvaged, purchased at auction, etc. This low-cost, high-value approach enables the deployment of resources at an affordable cost for end users. Known for good stewardship and the ability to utilize resources that do not easily fit into traditional suppliers’ systems, the Resource Broker monitors aviation needs and opportunities to determine the best way to deploy resources.

Click here to learn more about the FlightPlan research project and to download an executive summary of the research.

There are many ways to do futures planning and scenario research.  A good link to many resources for ministries is Jay Gary’s Christian Futures Network.

I know of at least two mission organizations that have done their own futures research with a high level of skill and intentionality.  One is Mission Aviation Fellowship, whose former COO David Bochman did such a project as part of his doctoral dissertation.  Another is Pioneers.  To our knowledge, neither project has been published.

What about you?  If your agency is interested in researching possible and preferable futures for your organization, let us know – GMI Research Services will be glad to help.

Four rewards, four challenges in rebranding

A few days ago I participated in a panel discussion at the Evangelical Press Association conference here in Colorado.  Jon Hirst of Generous Mind moderated; other panelists included Keith Brock of The CSK Group design firm and Phil O’Day, who is less than two weeks away from the public launch of CAM International’s rebranding to Camino Global.

My fellow panelists offered some great ideas for the audience – and the audience did its share, too, with some great questions and comments.  Here are four rewards and four challenges of the rebranding process I’ll remember from the session:

Reward of Rebranding 1: When everyone buys in.  Keith told a story of working with a hotel chain on its rebranding process.  Several months later, while staying at one of the hotels, he asked a desk clerk about what the brand meant to him.  Keith was delighted to hear the clerk enthusiastically talk about the hotel’s emphasis on making memorable moments for guests – demonstrating a core objective identified in the rebranding process.

Reward of Rebranding 2: Better “elevator conversations.”  Phil mentioned how quickly people – including prospective missionaries – assess their interest in an organization.  Representatives of CAM International typically had to begin discussions by talking about the past (by answering “What does CAM stand for?”) rather than describing the agency’s vision for the future.  With the rebranding, reps can make much better use of their first 30 seconds.

Reward of Rebranding 3: Alignment between internal identity and external image.  Some people feel that emphasizing marketing communications is inappropriate for those doing God’s work.  I couldn’t disagree more.  Effective communication is about people receiving a message in the way that the sender intended.  Rebranding requires a commitment to knowing what your message is – and to understanding (and measuring) how audiences receive that message.  It’s not about flash and cool; rather, it’s about others sharing our understanding of ourselves.

Reward of Rebranding 4: Learning from those who have gone before.  While mission organizations do compete with one another for recruits, in the end they are working toward the same purposes and therefore often cooperate.  Phil mentioned that he spoke to several organizations that shared their experiences about rebranding: Crossworld, WorldVenture, Christar and others.  I also spoke to other organizations when GMI was first considering rebranding, and what they shared was very helpful.

Challenge of Rebranding 1: Considering what to do with valuable elements of the existing brand.  Keith mentioned this, which resonated with me.  A key GMI asset has always been the www.gmi.org website, which has strong search engine optimization thanks to links from many other mission sites.  GMI’s consideration of a name change revolved around options that would enable retention of the acronym.  In the end, we opted not to change the name, but instead to emphasize a new tagline that elevates research alongside mapping – and to feature gmi.org as a secondary logo.

Challenge of Rebranding 2: How – and how long – to engage in dialogue with those who oppose the change.  Phil mentioned that it is important to allow constituents to express their views and to let them know that they are being heard.  You can’t ignore or dismiss them.  (I know of a mission organization that fully reversed its brand change because the field staff refused to use it.)  However, at some point you have to agree to disagree and move on, working to sell the majority on the concept.

Challenge of Rebranding 3: How to address sub-brands.  One question came from someone who manages a sub-brand of a large organization that is phasing in a new brand.  Keith responded by talking about the importance of having an intentional strategy for how – and how much – to tie sub-brands together.  Depending on your needs and objectives, you may want much, little or no unifying elements across sub-brands.  He mentioned his work with Focus on the Family and its spinoff organization CitizenLink (formerly Focus on the Family Action).  Both organizations are tied to the same mission, but the original brand is functionally nurturing and the newer brand is functionally confrontational (my word, not Keith’s).  In Focus’ case, decreasing the perceived association between the two brands was useful for both.

Challenge of Rebranding 4: How to communicate effectively, not extravagantly.  Getting the word out to constituents about the change is important.  However, non-profits, and especially mission organizations, run the risk of overdoing communications.  Most people understand that brands have value, but that value only ties indirectly to mission fulfillment.  I mentioned a conversation this week with a woman who supports a missionary through an organization that recently rebranded.  After receiving multiple letters and glossy brochures from the agency, she began to wonder how well the administrative portion of her gifts was being spent.

If your mission agency is looking to rebrand, I recommend that you connect with Jon or Phil about their experiences (Jon helped direct HCJB’s rebranding to HCJB Global a few years ago); contact Keith about full-service strategy and creative; or contact GMI for ideas on researching your identity and image.

Meanwhile, let us know: What challenges and rewards have you experienced in rebranding?

Visual projects need visual research. Case Study: GMI logo.

Are you using stories visually?  If so – or if not – check out the Visual Story Network to discover the power of visual stories.

Which brings us to visual research.  When the output is visual, it helps if the input is, too.  Rather than using words to tell a designer what her work should look like, visual research shows the concepts and elements that can be easily adapted into visual communication.

There is a lot of psychological theory underlying representative aspects of visual design.  While it helps to know why something works, sometimes it is sufficient just to produce something that works.

Some forms of visual research are highly sophisticated; others are accessible and usable for almost anyone.  For an example of the latter, check out Visual Explorer.

Some simple, informal visual research – nearly a decade old – turned out to be influential in the design of the current GMI logo.  As you read the story, think about ways that you could apply visual research.

For many years, GMI used this logo:

To some, it said, “We aspire to be the IBM of the mission world.”  To me, it said, “The world as obscured by a Venetian blind.”

We sometimes paired it with the tagline “Helping the Church See.”  I guess the logo could represent our helping the Church to see God’s world by opening the Venetian blinds of ignorance.  But no one wants to be told he or she is ignorant.  And GMI’s technical skills go well beyond adjusting window treatments.  Why not take the blinds down completely, open the window and climb through?

In 2010, after at least a decade of talking about it, GMI finally took the initiative to rebrand.

There’s a tangential story that I’ll mostly skip over about the debate over a potential name change to something other than Global Mapping International.  In the end, GMI opted to emphasize its well-known acronym, paired with the tagline: Strategic mission research and mapping. 

We had done some simple research on the GMI brand back in 2003.  The first step was interviewing staff to get their input on the personality characteristics of GMI.  In words.  Some wanted to talk about what GMI did, but we fought to stay focused on who GMI was in terms of personality and values.  It would have been good to include some of our resource users and other stakeholders as well, but we were less interested at that point in external image and more interested in internal identity.

Eventually, eleven themes emerged through the frequent mention of related words:

  • trustworthy
  • informed
  • supportive
  • innovative
  • stimulating
  • adaptable
  • engaged
  • pragmatic
  • accessible
  • courageous
  • compassionate

We revisited this list during the design process, using a simple online survey of the board and staff to prioritize the characteristics; the list above reflects that priority order.

But the input was still purely verbal.  The designer took the old logo and the personality elements, then went to work on a new concept, seeking to retain a connection to elements of the old logo.  Here was the first draft I saw:

I liked the way that a data/technology element was incorporated, but I felt that the image offered little warmth and used a bulky font.  It communicated “trustworthy,” perhaps, but definitely not “accessible,” “adaptable” or “compassionate.”

I was one of many who were given an opportunity to offer feedback on the design.  In addition to the comment above, I mentioned a piece of visual research that we had done in early 2003 to follow up on the personality characteristics that we had identified.

Our visual exercise involved a few hundred logos, some from ministries and some from commercial organizations, printed and strewn across a conference room table.  Staff members were asked to select logos that captured each of the personality characteristics identified in the interviews.  I analyzed the selected logos both by characteristic and as a group, looking for patterns or trends that might help to capture the full set of personality traits.

Visual elements that popped up frequently included silhouettes, question marks and a particular combination of colors.

I wrote to the design team:

When we did our visual tests way back when, the colors most often paired with blue were gold and black.  Would like to see a treatment that incorporates those – perhaps using them in the map border and in a scattering of the data ovals.

Unfortunately, I was living overseas and did not have a copy of the research “report” with the visual input to show the designer what I was talking about.  The document was only three or four pages — a few paragraphs of analysis interspersed with selected images taped onto the paper.  It existed only in hard copy; I never got around to scanning it.

With input from many people, the designer had to choose which ideas to incorporate.  The second attempt seemed to be a step back:

One issue was the skewed rendering of the continents relative to the oval representing the globe.

In my response, besides noting that issue, I asked the design team to locate the 2003 visual research report and try at least one design using the blue, black and gold combination.  I also wrote:

I would also like to see us at least consider rendering the GMI in lowercase in the main logo.

This was also supported by the visual research.  I mentioned three reasons for trying lowercase:

  1. To help communicate Adaptable and Accessible – hopefully not at the expense of Trustworthy and Supportive.  After all, if Intel and AT&T can use lowercase text in their logos…;
  2. To reflect a sense of servanthood and credit-sharing that has been a hallmark of GMI’s work in partnership with others; and
  3. To create room to give a nod to Innovation and Stimulation by stylizing or coloring the dot in the “i.”

The next version revealed that the designer was listening:

Now I thought we were getting somewhere.  In his email delivering the artwork, the designer mentioned reviewing the visual research (and admitted that he chose to substitute gray for black — which makes sense for high-tech applications).

This concept received positive feedback from almost everyone, so the process moved to refinement.  Several options and variations were considered, along with various color combinations.  As it turned out, the winning logo was indeed a combination of blue, black and gold.

Why those colors?  The standard color psychology interpretation holds that blue conveys trust and dependability, black reflects strength and authority (plus clarity on a white background), and gold implies wisdom, along with generosity of time and spirit.  Deep blue and gold are also complementary colors on the color wheel.

I also felt that the logo represents a simple story about what GMI does and why: GMI brings forth insights from information to support others in bringing the light of the Gospel to a dark world.

To go further, GMI produces and is engaged with data, organizing and interpreting it to draw out points of insight.  The data is drawn from and often describes the world.  The insight helps advance the work of revealing God’s light to the world.  Our work is supportive, so lowercase “gmi” is at the bottom.

I didn’t expect to get a logo that told a story, but I think the power of telling that story will be meaningful in communicating GMI’s work.

The logo may seem a bit busy, but then, so are we – so even its weakness fits.

Sometimes research projects have little impact on decision making for various reasons – changing conditions, political or financial considerations, a leader’s preference, or something else.  But in this instance, a simple visual research project spoke into the project in meaningful ways.  (It was much later when I noticed that the continents in the world map appear in silhouette, another theme from the research.)

Think about ways that you may be able to gather image-based information for your visual projects.

Simple Survey Idea #4: Don’t give the answers away

Do you ever “give away” answers in your surveys?  I’m talking about subtle (and not-so-subtle) signals that can lead to bias.  Here are a few errors to avoid:

Pandering

Several weeks ago I refinanced my house using an online lender.  All ended well, but there were a few glitches along the way – a key email with documents attached was apparently lost and I had to prompt the company to follow up with the underwriter.

The day after closing I received the following survey invitation from the mortgage processor:

Subject: I so appreciate YOU! Please help if you can :) I am so close to being # 1 in the company having “GREATs”…

Thank you so much for being an amazing customer to work with. I greatly appreciate all your help to get your loan taken care of. I hope that you feel I have given you “GREAT” customer service. My managers would love to hear what you think about my performance as your processor. If you do not mind, please take 1 minute to fill out the 1 question survey to help me out. We are always looking for “GREATs.”

Apparently customer-service ratings at that company are used in compensating or rewarding mortgage officers.  That’s fine.  But the question it raises is: Why would the company – which cares enough about satisfaction to tie it to rewards – let the person being evaluated pander for good ratings in the survey invitation?

You may have seen a more subtle form of this:

Thanks for coming to the SuperDuper Missions Conference.  Weren’t the speakers and worship music great?  Plus, over 300 people responded to the challenge to give or go.  Hopefully you were as blessed as I was.

Say, I would love to get your feedback to help us make future conferences even better!  Here’s a link to the survey…

It can be hard to contain enthusiasm when asking for post-event feedback – especially if you sent out several enthusiastic pre-event emails.  But if you want honest input, commit to avoiding remarks that suggest how the event should be evaluated (or how you would evaluate the event).

It Must Be Important Because They’re Asking About It

Most people have encountered surveys with leading questions, designed to confirm and publicize high levels of support for a position on an issue.  Like this:

Are you in favor of programs that offer microloans to lift women in developing countries out of the cycle of poverty with dignity through sustainable small businesses, with local peer-accountability networks to ensure loan repayment?

Even if you have read articles about recent studies suggesting that the link between microfinance and poverty reduction is tenuous or non-existent, you might be hard-pressed to answer “no” to the question as worded.

But there are other, more subtle ways that organizations can “suggest” certain responses.  Telling people in the survey invitation that the survey is about microloans can encourage people to overstate their interest in that topic (as well as leading to response bias in which interested people are more likely to respond at all).  Better to say that the survey is about strategies for poverty reduction or (broader still) addressing key areas of human need in the developing world.

This lets you gauge interest in your issue by mixing it in with several related issues, like this:

From the following list, please select up to three programs that you have been involved in, or would consider becoming involved in:

__ Well-digging programs to help provide a consistent healthy water supply

__ Community health education programs to teach villagers basic hygiene

__ Microloan programs to help women grow sustainable small businesses

__ Literacy programs to help kids and adults gain life and career skills

__ Legal advocacy and awareness to stem human trafficking

__ Theological education programs to equip first-generation church leaders

__ Sponsorship programs to sustain the education and nurture of at-risk kids

The rest of the survey can be about microloans.  But before tipping your hand, you learn about interest in that issue relative to other issues — and even the correlation of interest among issues.  Plus, you can use survey logic to excuse non-interested people from follow-up questions that don’t apply to them.
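For those building a survey in their own software rather than a hosted tool, the skip logic described above amounts to a simple branch on the checkbox answers. A minimal sketch, with hypothetical section and program names:

```python
# Hypothetical skip logic: route respondents past the microloan battery
# unless they expressed interest in that program area.
def next_section(selected_programs):
    """Return the next survey section based on checkbox selections."""
    if "microloans" in selected_programs:
        return "microloan_followup"  # the detailed question battery
    return "closing_questions"       # skip questions that don't apply

print(next_section({"well_digging", "microloans"}))  # microloan_followup
print(next_section({"literacy"}))                    # closing_questions
```

Hosted tools such as Survey Monkey let you configure the same branching through their question-logic settings instead of code.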

You can go even further to mask your interest in the survey issue, even while asking several questions specific to that issue.  Before starting the battery of questions about microloans, include a statement like this:

“Next, one of the above topic areas will be selected for a series of follow-up questions.”

The statement is truthful and adheres to research ethics — it does not say that the topic will be randomly selected. But it leaves open the possibility that those who sponsored the survey may be interested in several types of programs, not just microloans, encouraging greater honesty in responses.

Unnecessary Survey Branding

However, these approaches still won’t work if the survey invitation is sent from someone at “Microcredit Charitable Enterprises” and the survey is emblazoned with the charity’s logo.  There are many good reasons to brand a survey to your constituents, starting with an improved response rate.  But sometimes, branding can be counterproductive.

If objective input is key, consider using an outside research provider to avoid tipping your hand, especially since research ethics require researchers to disclose who is collecting the data.

Allowing Everything to Be “Extremely Important”

Another way that researchers can “give away” answers is by letting people rate the importance of various items independently.  Take this question, for instance:

In selecting a child-sponsorship program, how important to you are the following items?  Please answer on a scale of 1 to 5, where 1 is “Not at All Important” and 5 is “Extremely Important”:

1    2    3    4    5   Sponsor’s ability to write to and visit the child

1    2    3    4    5   Receiving regular updates from the child

1    2    3    4    5   On-site monitoring of the child’s care/progress

1    2    3    4    5   Written policies regarding how children are selected

1    2    3    4    5   Annual reporting of how your money was used

All of those are important!  The question practically begs respondents to give each item a 5.  Will that information help the agency?  Maybe for external communication, but not in deciding which areas to promote or strengthen.

Instead, consider this alternative:

In selecting a child-sponsorship program, how would you prioritize the following items?  Distribute a total of 100 points across the five items.

Or

Please order the following five elements of a child-sponsorship program according to their relative importance, from 1 “most important” to 5 “least important.”  You can use each number only once.

In most cases, relative-value questions will produce much more useful data.
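The difference shows up as soon as you tabulate the results: independent ratings cluster at the top of the scale, while constant-sum allocations force differentiation. A small sketch with hypothetical numbers (not from any real survey) makes the contrast visible:

```python
# Compare independent 1-5 ratings with constant-sum (100-point) allocations
# for the same five child-sponsorship program elements. All data hypothetical.

items = ["write/visit", "updates", "monitoring", "selection policy", "reporting"]

# Independent ratings: nearly everything is "extremely important"
ratings = [
    [5, 5, 4, 5, 5],
    [5, 4, 5, 5, 4],
    [4, 5, 5, 5, 5],
]

# Constant-sum: each respondent distributes exactly 100 points
allocations = [
    [40, 25, 20, 5, 10],
    [30, 30, 25, 5, 10],
    [50, 20, 15, 5, 10],
]

def mean_by_item(rows):
    """Average each column (item) across respondents."""
    n = len(rows)
    return [round(sum(r[i] for r in rows) / n, 1) for i in range(len(rows[0]))]

rating_means = mean_by_item(ratings)  # barely separates the items
alloc_means = mean_by_item(allocations)  # produces a clear ranking

for name, r, a in zip(items, rating_means, alloc_means):
    print(f"{name:18s} rating={r:>4}  points={a:>5}")
```

In this made-up data the ratings span less than half a scale point, while the point allocations spread across a 35-point range, which is exactly the kind of separation an agency needs to decide what to promote or strengthen.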

Are there other ways that you have seen surveys “give away” answers to respondents?   Or avoid doing so?  Let us know about your experiences and ideas.

Back to College – a week @biolau for #bmc2012

I’ve been blessed to assist several student mission conferences with evaluations and planning research.  But until last week, I’d never actually attended one in person.  Not as a student.  Not as an adult.

So, my impressions of the Biola Missions Conference 2012 come unjaded – but also with no basis for comparison.  Take them as you will.

Here’s what stood out to me:

  1. The amazing level of creativity and production values of the young adults.  I have often heard people talk about the sophistication of today’s young people in regard to media.  They have highly developed filters, they are tough to impress, etc.  But that’s about their reaction to content produced by others.  Last week the content that students created themselves was on display.  And I was impressed.
    Dance, drama, visual art, poetry, film, music, oratory, cuisine, fashion – creativity seemed to be everywhere, all the time, always expressing a passion for God and for mission.

    It’s one thing to attend a professionally produced plenary session with creative stage lighting, choreographed dance numbers, moving testimonies, etc.  But upon leaving the assembly hall, I walked past sidewalk artists in the process of creating scripture-based chalk drawings, then past elaborately decorated booths advocating and educating about peoples in unreached lands.  I caught the aroma of African food in the air as I walked to the end of campus.  There, the line stretched around the building for Global Awareness, a series of interactive role plays in which participants might find themselves in the midst of a Somali hostage crisis or a Chinese house church.

    Now, wowing a guy who barely falls on the fringe of Gen X may be a low bar.  So, check out these examples of their work and tell me what you think.

    Biola is close to Hollywood, and I met at least two students who hope to shine their light in the entertainment industry after college.  That may have had something to do with the emphasis on art and production.  But I’m still amazed that the whole thing was pulled together in the spare time of people taking a full load of classes.

  2. A student-run event has its glitches – but it also has great educational value.  I encountered half a dozen bumps that would never have happened at a professionally run event, but each of them represented a lesson in project management that won’t soon be forgotten.  And the event staff were accessible and eager to address issues when they arose.  They don’t deserve a complaining spirit from me.

    So, I’ll mention only one incident, and that just to share the smile it brought.  During the Welcome Tea for mission agency reps, a student leader said, “For those of you staying on campus, I hope you remembered to bring bedding.”  Note for next year’s conference: consider including that detail in the pre-event info packet.  As it turned out, my student host (and I think many others) provided sheets and pillows for their missionary guests, generously trading their beds for four nights on the floor.
  3. Staying in the dorms is a bit of an adventure for a 40-something guy, what with students’ odd hours, having to climb over a bookshelf to get into the loft, and sharing a bathroom with 30 people.  Still, I would do it again in a heartbeat – and not just to save money on a hotel room – because several of my best conversations with students took place there.  It was definitely a shot in the arm to interact with talented, God-fearing men with big dreams and passion for the Kingdom.

If any Biola students or others considering long-term cross-cultural service are reading this post, I’d be honored if you would be willing to give your opinions a few times a year on mission-related issues.  Sign up for our Mission Research Panel here.

It’s also a great time for college students who want to apply their skills, passion and creativity in mission research for a few months to apply for a GMI internship.  Send your resume to info@gmi.org.

Looking forward to attending and speaking at Biola again next year!

Simple Survey Idea #3: Give Something Away

When you do a survey, you are asking people for their time and their opinions.  People are increasingly aware of the value of both.

With that in mind, it is a good practice – even among those who already know and trust you – to give something away in appreciation for their input.  Doing so will bless people and build goodwill.  It will also improve your response rate (and therefore the quality of your data).  And it will make them more inclined to participate in future surveys.

“But we don’t have the budget to give anything away,” I sometimes hear people say.  I say, “If you can’t find something to give away, you’re not trying very hard.”  You don’t need budget – there are lots of ways to give survey responders something for free.

The first thing you should give them is a short survey.  That may be a topic for another Simple Survey Ideas post, but it’s so important that it always warrants mentioning.

You can give people access to the survey results, a good idea if your responders are peers/stakeholders and you know they will be interested in what you are learning.  Depending on your survey software, it may cost you some time to format and email results out to those who responded.  But if you use an online package like Survey Monkey, you can set up options to automatically show the survey results to date upon completion of the survey.  That “Instant Results” feature is even available on the free Survey Monkey package.

Of course, the first few people who respond won’t get a very complete picture, so you might also want to send people a link to the full set of responses once you complete and close the survey.  This option is available in all of Survey Monkey’s paid subscription plans.

Quick aside: the advantages that you get with the online services’ paid plans (unlimited responses, survey logic, ability to download data, HTTPS security) make them well worth the cost (vs. the free plan) for almost any survey.  Even if you are just doing a one-off survey, you should still sign up for a month and then cancel the subscription when you’re done.  Your survey is worth the $24 investment.

Another useful offering is free information of interest to respondents.  I use this with virtually every survey I do.  You can almost always find an article, ebook, presentation or video related to your survey topic or to a common interest of the survey audience.  Even if you don’t produce content, you can always find something free on the Internet to direct people to.

In this way, you can say in your survey invitation, “Everyone who completes the survey will receive a free ‘Top 10’ list of resources about _____.”  It doesn’t matter that the list is out there on the Internet for anyone to find – linking people to it is delivering value.  With Survey Monkey, the option to redirect survey finishers to the website of your choice only comes with annual plans.  So, you may need the workaround of embedding your own link on the last page of the survey, so responders can get to your resource.  At the risk of going beyond “simple,” try something like this:

Thanks for completing our survey.  Before clicking “Done,” click <a href="http://www.yoursite.org/" target="_blank">this link</a> to open a new window with the free resource we promised.

Be careful that the resource will be of interest to nearly everyone that you invite.  Giveaways that appeal only to a certain segment of your audience will lead to response bias.

Should you ask permission of the content provider in advance?  It’s a good idea but not required – groups that offer free content on the web typically want people to find that content.  You benefit them by linking to their site.  Groups that provide many free mission-related resources include the World Evangelical Alliance and the U.S. Center for World Mission.

A quick-response incentive promises resources to the first X number of responders.  This can be a good idea if you have a limited number of tangible resources to give away – and especially if you need responses quickly.

A related incentive is the sweepstakes prize offer, where respondents are randomly selected to receive a prize – usually something with significant value.  Many researchers use a combination of a free something-for-everyone resource with a high-value sweepstakes prize for a few randomly selected winners.

I like sweepstakes offers – they are fun and they work to generate response.  But you have to be responsible with them – some laws apply (see a quick overview here and know that this post does not constitute legal advice).  If you go this route, make sure that everyone who responds has an equal chance to win – even those who don’t meet the criteria for responding to your survey, because nothing ruins a good survey like people lying to qualify for a prize.  Also clearly communicate what and how many prizes you are giving away, who is eligible, how and how often people can enter, when the drawing will take place, how winners will be notified, the approximate likelihood of winning, and any geographic or residency limitations.

That sounds like a lot, but consider that the following covers all of it without sounding too much like the legal disclaimer lingo in a car dealer’s radio ad:

“You and up to 400 others who complete the survey by March 31 will qualify for a random prize drawing for one of 10 copies of the Operation World DVD.  One entry allowed per survey link.  In April GMI will inform the 10 winners by email – they will need a valid U.S. mailing address to receive their DVD.  Not valid where prohibited by law.”

How to manage a random drawing without hundreds of slips of paper and a huge hat?  Discover the RAND function in Excel.  Very handy – be sure to sort, save and print results for your records.
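If you prefer code to a spreadsheet, the same randomized-list approach can be sketched in a few lines of Python.  Everything here – the respondent list, the prize counts, the seed value – is illustrative; in practice you would export the addresses from your survey tool.

```python
import random

# Hypothetical respondent list; in practice, export the email
# addresses of survey completers from your survey tool.
respondents = [f"person{i}@example.org" for i in range(1, 401)]

random.seed(2024)  # fix the seed so the drawing is reproducible
order = random.sample(respondents, len(respondents))  # shuffled copy

winners = order[:10]    # the first 10 on the randomized list win
backups = order[10:15]  # next in line if a prize goes unclaimed

print(winners)
```

Sorting on a RAND column in Excel accomplishes the same shuffle; either way, save and print the full randomized order for your records.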

Also make sure to give away everything you promise.  If some people don’t claim their prize by a given date, move on to the next people on your randomized list.

Prize giveaways appeal to most people, but it is not unusual for those in ministry circles to steer clear of them because of their similarity to gambling and games of chance.  Before launching a contest, be sure your organization’s leadership knows about it.  If you run into concerns, one alternative is to allow or encourage winners to donate their prize to charity.

Some survey sponsors use a charitable donation as the incentive itself, which carries real appeal for respondents.  One commercial firm I worked with leads off its surveys with a question like this:

In appreciation for your opinion, our firm will be distributing charitable donations totaling $1000.  From the following list, please select the charitable organization that you would like your portion of the donation to go toward:

__ Organization A

__ Organization B

etc.

If your group is a charitable organization, you can use a list of projects instead.  This works well if you can (truthfully) mention that an individual donor has put up the gift money to be distributed in this manner.

A final tip that applies to any gift or incentive that you offer: don’t position it as the primary reason to respond – especially in the subject line of an invitation email.  Not only do words like “prize” and “win” tend to trigger spam filters, but leading with the gift offer sends a message to invitees that you view the exercise as a transaction (or worse, that you think they are primarily motivated by greed).

Instead, keep the focus on the importance of the survey topic and the value of the person’s opinion – then mention the gift or prize.  As a survey sponsor, your identity should be that of a listener asking people for the favor of their input and offering them the opportunity for involvement – plus a gift as a token of your appreciation – rather than as a purchaser of people’s opinions.

 

Simple Survey Idea #2: Send a Reminder

I talk with lots of people who design and field their own web surveys.  It amazes me how many have never considered sending a reminder out to those they have invited — even to people who are known well by the person doing the survey.

People are often very willing to help, but they are busy and working through lots of messages, and survey invitations are easy to set aside until later.  One reminder is often helpful.  I almost always send at least one reminder out to survey invitees.  In some cases, I will send out a second reminder.  In rare cases, a third.

Why send a reminder at all?  Perhaps it goes without saying, but more data usually equals better-quality information.  Better statistical accuracy is part of that: most people understand that a sample of 300 yields a tighter margin of error than a sample of 100.

But in most cases, response bias will be a bigger threat to the quality of your data than statistical error from sample size.  Consider your sample of 300 responses.  Did you generate those from 400 invitations (a 75% response rate) or 4,000 invitations (a 7.5% response rate)?  The former would give you much greater confidence that those you heard from accurately reflect the larger group that you invited.

What is a “good” response rate?  It can vary widely depending on your relationship to the people invited (as well as how interesting and long the survey is, but that’s a topic for another post).  Domestic staff/employee surveys often generate a response of 85 percent or more.  However, for internationally distributed missionary staff, a response of 60 percent is healthy.  For audiences with an established interest in your work (event attenders, network members), a 35-percent response is decent.  For other audiences, expect something lower.  One online survey supplier’s analysis of nearly 200 surveys indicated a median response rate of 26 percent.

So, do reminders substantially increase response to surveys?  Absolutely.  Online survey provider Vovici blogs, “Following up survey invitations with reminders is the most dramatic way to improve your response rate.”  They show results from one survey where the response rate rose from 14 percent to 23, 28 and 33 percent after subsequent reminders.

My experience has been similar.  I find that survey invitations and reminders have something like a “half-life” effect.  If your initial invitation generates X responses, you can expect a first reminder to produce an additional .50X responses, a second reminder .25X responses, and so on.
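This half-life rule is easy to turn into a back-of-envelope projection.  The Python sketch below uses illustrative numbers (120 initial responses, two reminders); it simply applies the halving pattern described above.

```python
# Rough "half-life" model: each reminder yields about half the
# responses of the previous contact.  Numbers are illustrative.
initial = 120   # responses to the first invitation
contacts = 3    # one invitation plus two reminders

total = 0.0
yield_per_contact = []
for i in range(contacts):
    r = initial * 0.5 ** i  # X, 0.5X, 0.25X, ...
    yield_per_contact.append(r)
    total += r

print(yield_per_contact)  # [120.0, 60.0, 30.0]
print(total)              # 210.0
```

In this example the two reminders together add 75 percent more responses than the invitation alone produced – which is why skipping reminders is such a costly habit.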

I disagree with survey provider Zoomerang’s suggestion of sending a series of three reminders — especially if the audience is people you know — but I do agree with their statement, “Think of your first email reminder as a favor, not an annoyance.”  I recommend sending at least one reminder for virtually any survey, with a second reminder only if you feel that your response rate is troublesome and you need that extra .25X of input.

At least Zoomerang provides a sample reminder template you can use.  I agree that you should keep reminders short — shorter than the original invitation.  With any invitation or reminder, you will do well to keep the survey link “above the fold” (to use a phrase from old-time print journalism), meaning that it should be visible to readers without their having to scroll down through your message.

I also find it very helpful to use a list manager when sending survey reminders.  Most online providers have an option to send only to those members of your invitation list who haven’t responded.  Not only does this keep you from annoying those who already responded, but you can word the reminder much more directly (and personally, with customized name fields).  So, instead of saying:

“Dear friend — If you haven’t already responded to our survey, please do so today.”

You can say:

“Dear Zach — I notice that you haven’t responded to our survey yet.  No problem, I’m sure you’re busy.  But it would be great to get your input today.  Here’s the link.”

Take care in using the above approach — if you have promised anonymity (not just confidentiality), as in an employee survey, opt for the generic reminder.

When to send a reminder?  If your schedule is not pressing, send a reminder out 5-10 days after the previous contact.  I recommend varying the time of day and week in order to connect with different kinds of people.  If I sent the initial invitation on a Monday morning, I might send the reminder the following Wednesday afternoon.

 

Simple Survey Idea #1: Keep Survey Language Simple

I am working on a web survey for a group of people in India.  Smart folks, many of them technology savvy.  And they speak English — but often not as their first language.

Some surveys should be translated or fielded in multiple languages.  For many surveys, though, English is sufficient.  But what kind of English?

My default mode is to use more complicated English than is needed.  The more I work with multilingual people around the world, the more I realize the value of keeping language simple, especially with surveys and interview questions.

The good news is that there are tools out there that can help.  Here is a site that lets you paste in text and compare it to one of many collections of simple English words.  It shows which words are not considered simple.

http://www.online-utility.org/english/simple_basic_helper.jsp 

With many international audiences it is a good idea to test your language before sending out a survey.

I put the above portion of this post into the site and found out that the following words were not included in a somewhat large collection of 15,000 simple words: web, savvy, and multilingual.  With smaller collections, many more words miss the cut, including realize, compare and survey.

Try it out — you have nothing to lose but complexity.
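For a rough offline version of the same check, you can compare your text against any word list you have on hand.  The sketch below uses a tiny, made-up stand-in for a simple-English list (the real tool compares against collections of up to 15,000 words).

```python
import re

# A tiny illustrative stand-in for a simple-English word list.
SIMPLE_WORDS = {
    "i", "am", "working", "on", "a", "for", "group", "of", "people",
    "in", "smart", "many", "them", "and", "they", "speak", "english",
    "but", "often", "not", "as", "their", "first", "language",
}

def flag_hard_words(text):
    """Return the words in the text that are not on the simple list."""
    words = re.findall(r"[a-z]+", text.lower())
    return sorted({w for w in words if w not in SIMPLE_WORDS})

sample = "I am working on a web survey for a group of people in India."
print(flag_hard_words(sample))  # → ['india', 'survey', 'web']
```

Proper nouns like “India” will always be flagged, of course; the words to rewrite are the unnecessarily complex ones like “survey” substitutes you can actually simplify.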

It’s Winter: Time to Make Snowballs!

A lot of mission researchers are interested in studying people who aren’t easy to get to.  They may be unknown in number, difficult to access, suspicious of outsiders, etc.

This makes random sampling virtually impossible.  Unfortunately, a random sample is an assumption or requirement of many statistical tests.

So, if you’re doing research with underground believers or people exploited in human trafficking, you can’t just go to SSI and rent a sample of 1500 people to call or email.

When you need a sample from a hard-to-reach population, make a snowball!

Snowball sampling – a more memorable name for the formal term “respondent-driven sampling” – is a means of getting to a reasonably large sample through referrals.  You find some people who meet your criteria and who trust you enough to answer your questions, then ask them if there are other people like them that they could introduce you to.

In each interview, you ask for referrals – and pretty soon the snowball effect kicks in and you have a large sample.

For years this approach was avoided by “serious” researchers because, well, the sample it produces just isn’t random.  Your friends are probably more like you than the average person, so talking to you and your friends isn’t a great way to get a handle on your community.

But, as with six degrees of separation, the further you go from your original “seeds,” the broader the perspective.  And in recent years, formulas have been developed that substantially reduce the bias inherent in snowball samples – opening up this method to “respectable” researchers.

How to do it?  Some researchers simply throw out the first two or three generations of data, then keep everything else, relying on three degrees of separation.  Not a bad rule of thumb.

For more serious researchers, there is free software available to help you weight the data and prevent you from having to discard the input of the nice people who got your snowball started.  Douglas Heckathorn is a Cornell professor who developed the algorithm (while doing research among drug users to help combat the spread of HIV) and helped bring snowball sampling back from the hinterlands of researcher scorn.  You can read more about his method here and download the software here.
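Mechanically, a snowball sample is just a referral chain with a “wave” number (degrees of separation from the seeds) attached to each person.  Here is a minimal Python sketch; the toy network and the `get_referrals` function are stand-ins for real interviews and introductions, and the wave-2 cutoff follows the rule of thumb above.

```python
from collections import deque

def snowball(seeds, get_referrals, max_n=500):
    """Collect a snowball sample, tagging each person with a wave
    number (degrees of separation from the seeds)."""
    sample = {s: 0 for s in seeds}  # person -> wave number
    queue = deque(seeds)
    while queue and len(sample) < max_n:
        person = queue.popleft()
        for ref in get_referrals(person):
            if ref not in sample:
                sample[ref] = sample[person] + 1
                queue.append(ref)
    return sample

# Toy referral network standing in for actual interviews.
network = {"A": ["B", "C"], "B": ["D"], "C": ["D", "E"],
           "D": ["F"], "E": [], "F": []}
sample = snowball(["A"], lambda p: network.get(p, []))

# Rule of thumb from above: drop the first couple of waves to
# reduce seed bias, keeping respondents at wave 2 or beyond.
kept = {p: w for p, w in sample.items() if w >= 2}
print(sample)
print(kept)
```

Heckathorn’s method is more sophisticated – it weights responses rather than discarding the early waves – but tracking the wave number per respondent, as above, is the prerequisite for either approach.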

Suddenly, you need not settle for a handful of isolated snowflakes, nor for a skewed snowdrift of opinion (via an unscientific poll of your social media friends).  Instead, you can craft a chain of referrals into a statistically representative snowman.

Meanwhile, if the sample you need is one of North American field missionaries or North Americans seriously considering long-term cross-cultural service, you should consider renting one of GMI’s mission research panels.  Email us for details.

Analyzing open-ended questions: A bright future for word clouds

Commercial survey research firms usually charge clients significantly extra to include “open-ended” questions in a survey.  They tend to be messy and time-consuming.  Traditionally, analysts would read through a selection of responses to create categories of frequent or typical responses, then read back through all of the responses to categorize them.

For publishing in a peer-reviewed journal, multiple people would categorize responses independently, then work together to create a synthesized coding scheme to limit bias.

Most qualitative text-analysis software still requires you to manually “code” responses.

Despite all that work, open-ended questions remain important in exploratory and qualitative research – and frequently satisfying for survey respondents looking for an opportunity to say what is on their minds, unhindered by structured response categories.

But the tag-cloud age has been a blessing to those without the time and money to do full, traditional analysis of text responses.  Graphics with words sized by frequency of use enable analysts to quickly get a sense of the nature of open-ended responses.

New editions of survey software – even budget packages like Survey Monkey – include cloud-creating tools to help users understand open-ended responses at a glance, without all the coding work.

Even those doing traditional coding enjoy working with clouds, which help analysts to quickly create an initial set of codes.

If your survey package doesn’t have cloud-generating capacity, no problem.  Wordle is a free site that lets you create art-like word clouds.  The clouds in the previous post were created using Wordle.  It’s a terrific, easy-to-use site that lets you paste in your text – our data came straight from a spreadsheet – and generate a word cloud with one click.  It automatically removes common words and allows you to choose the overall cloud shape, color scheme, font and orientation of the words.  We chose to illustrate the top 100 terms for each question.  Wordle lets you save and use your clouds however you want to.
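Under the hood, a word cloud is just a word-frequency count with common words removed.  If you want the counts themselves (say, as a starting point for coding), a few lines of Python reproduce the core step; the responses and the stop-word list below are made up for illustration.

```python
import re
from collections import Counter

# Minimal word-frequency count: the raw material of a word cloud.
# STOPWORDS is a tiny illustrative list; tools like Wordle filter
# common words automatically.
STOPWORDS = {"the", "to", "and", "a", "of", "in", "is", "for", "we"}

responses = [
    "We need more prayer support and better training",
    "Training for new staff is the biggest need",
    "More prayer and more funding for training",
]

words = re.findall(r"[a-z]+", " ".join(responses).lower())
counts = Counter(w for w in words if w not in STOPWORDS)
print(counts.most_common(3))
```

The `most_common` list is exactly what a cloud visualizes: in this toy data, “training” and “more” would render largest.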

I really like the tool’s artistic quality.  Wordle clouds almost beg to be shown to others – and they motivate those who see them, too.  My daughter, upon first seeing Wordle, immediately had a vision for a sign to promote a bake sale.  A few descriptive terms later, she had created a beautiful graphic to draw people’s attention.

This is where research moves from information to influence.  Imagine asking your constituents about their needs – or your organization’s impact – then printing a graphic of their responses to hang in your office as a reminder and motivator to staff.  Unlike a research report, which may or may not get read before being filed away (or worse!), word cloud art can keep research right in front of your team.  The graphic format makes the information more memorable as well.

Researchers, meanwhile, can compare and contrast different audience segments, as I did in the word cloud below.

What applications can you think of for word clouds?