Stakeholder engagement is the heart of any successful community development effort. If you’ve ever worked for a community organization or been involved in a community project, you’ve probably spent a lot of time attending local meetings and public hearings. Face-to-face dialogue is key to building relationships, gathering input, and rallying support for your cause.
But what if you need to gather input from hundreds of stakeholders dispersed across a large geographic region? With limited resources and no way to meet face-to-face, how do you get quality feedback from the wide range of people affected by your project?
At Cornerstone Partnership, we’re often grappling with these questions. As a national peer network for long-term affordable homeownership programs, we have nearly 900 members whom we rely on to tell us what’s happening on the ground, so we can build resources, tools, and programming that help our members do their work better.
This spring, we launched a national effort to develop “Stewardship Standards” for the affordable homeownership industry. Our goal was to gain insights into best practices that affordable homeownership programs were undertaking across the country and to translate those insights into standards. We needed to generate large-scale awareness, obtain quality feedback, and build community among our members.
The good news is that we had a strong baseline of content: we had assembled nearly 60 sample standards and practices from our library of research and tools. The bad news is that we then had to gather feedback on dense content spanning nearly 15 pages of text.
While we’re only halfway through our project, we’ve already tried a range of approaches. Read on for the tools and techniques we’ve used, along with insights into building more, and better, online stakeholder participation.
Online Focus Groups
Approach: We held online working sessions with an average of 10 participants per session. The participants discussed the standards using the Adobe Connect web conferencing platform, which features online chat and polls to supplement discussion. Participants completed an online survey prior to attending the calls, so the facilitator had insight about the participants and their first reactions to the standards. We’ve held eight work sessions with over 100 participants.
Effectiveness: The moderated discussion format allowed us to probe areas where participants disagreed and to understand varying perspectives. The Adobe Connect platform is great for facilitating discussion and offers many tools to work with. Participants also reported that they enjoyed hearing from their peers. But comments can stray off-topic, and preparing and developing materials for these moderated discussions is time-intensive.
Tips: Use a skilled facilitator and practice with the technology before going live. Use a pre-survey or interview prior to the session to gather knowledge about the group.
Online Surveys
Approach: Prior to conducting the focus groups, we asked participants to complete a brief (10- to 15-minute) online survey that would be used to prompt discussion in the focus groups. Participants were given a set of standards and were asked if each individual standard should be considered “optimal,” “necessary,” or “dropped” from consideration. Approximately 95 percent of participants took time to complete the surveys.
Effectiveness: Generally low cost to develop and administer, the survey allowed us to ask specific questions and easily tally results. We also obtained quality comments in our open-ended questions. The disadvantages of this approach are that participants may misinterpret questions and lack the opportunity to hear others’ opinions.
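For teams building their own tallying step, the counting described above can be sketched in a few lines of Python. This is a minimal illustration with made-up ratings, not our actual survey data or tooling; the standard names and responses are hypothetical.

```python
from collections import Counter

# Hypothetical responses: each participant rated each standard as
# "optimal", "necessary", or "dropped" (the categories from our survey).
responses = {
    "Standard 1": ["optimal", "optimal", "necessary", "dropped"],
    "Standard 2": ["necessary", "necessary", "optimal", "necessary"],
}

for standard, ratings in responses.items():
    tally = Counter(ratings)          # count ratings per category
    total = len(ratings)
    summary = ", ".join(
        f"{category}: {count}/{total}"
        for category, count in tally.most_common()
    )
    print(f"{standard} -> {summary}")
```

Running this prints each standard with its rating breakdown, making it easy to spot which standards participants broadly agreed should be kept or dropped.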
Tips: Give yourself enough time to send out reminder notices to participants. And, when asking open-ended questions, be as specific as possible to get quality responses.
Collaborative Writing Platform
Approach: We posted our standards on CommentPress, an open source WordPress plug-in that lets readers comment on a document paragraph by paragraph. We built a “Standards Room” where we posted synthesized comments from the working sessions so readers could quickly review the main points about each standard, and we highlighted additional questions for consideration. We received 35 comments from 14 participants over three months and generated approximately 160 unique visits, with visitors averaging four pages per visit and two to three minutes per page.
Effectiveness: We felt mixed about this platform’s effectiveness. The format was useful for generating awareness and letting participants scan the content quickly, and participants provided high-quality comments on the work. However, few commented on the additional questions we highlighted for each standard, and it was resource-intensive to set up the platform and to keep reminding participants to respond to those questions. When we surveyed participants about the platform, they said it was fairly easy to use overall, but several noted that it could be confusing at first and that they weren’t sure whether their comments had been submitted.
Still, it’s been interesting to test this new approach to online engagement, and we look forward to learning about other online, collaborative feedback tools. Finding better ways to solicit online input matters for many industries, especially the public sector. We’ve been inspired to see how the federal government has partnered with Cornell University to develop the Regulation Room, a multiyear study of how online technologies can encourage civic engagement in federal rule-making. They’ve conducted extensive research, experimenting with the technology, moderation practices, and instructions needed to gather high-quality comments. Check out their site to participate in the Consumer Financial Protection Bureau’s effort to gather input on new federal rules about consumer debt collection practices, or see how they collected input to modify residential mortgage regulations.
Tips: Before starting any online community, be sure that you have staff resources available to moderate it (e.g., welcoming new members, responding to comments and questions).
So this is what we’ve tested so far. Does online stakeholder engagement play a role in your work? If so, let us know if you have any tools or techniques you can share with us!