How to Draw Conclusions from Your Online Community
We are living in a remarkable time. As technology develops at an accelerating rate, we no longer have to guess what our members are thinking. We can draw meaningful conclusions from online community data that is pouring in every single day.
You can capture data from many different sources, including website usage, email outreach, and of course, your online community.
That said, a common mistake people make is to draw conclusions from anecdotal evidence. For example, perhaps one person commented in the community that they loved your annual conference and it could not have been better. Should their one voice become the representative of your entire member population?
This goes the other way as well. Although negative feedback should always be taken seriously, it is important that you gain a larger perspective before making any significant organizational decisions in response.
So what next? How can you feel confident that your planned move is the right one to make?
You can apply the scientific method. By running an experiment with a reasonable sample size, you can figure out how to draw conclusions from your online community that will impact your effectiveness. Let’s take a look at the five steps, and how you might apply them:
Step 1: Ask a Question
In this very first step, you are identifying the area you would like to explore. Your online community is an excellent place to home in on assumptions worth challenging, since you are likely receiving feedback there frequently.
As an example, let’s say that a member tells you that your association’s email communications are so formal that they are off-putting. Could this be affecting your bottom line? You can find out.
Step 2: Do Background Research
You can conduct this research in many ways. How do other organizations communicate with their customers? At this point you may research similar associations, but other types of organizations as well. Now, what about your open and click rates? How do they compare to industry standards? Have your association's emails always been written so formally, or is this a recent change? If recent, why? How do current open and click rates compare to the past?
There are all sorts of questions you can begin to ask to help you draw conclusions.
Step 3: Form a Hypothesis
Do you remember “If/then” statements? At this point, you put together the pieces of your research to form a hypothesis, which is essentially a prediction of how a change would impact the outcome.
In this case: “If email communications are written in a less formal tone, then our average click rate will increase.”
Note: sometimes your hypotheses will not be cut and dried. For instance, "less formal" cannot easily be quantified, but that is where your judgment comes in.
Step 4: Conduct an Experiment
This is the most exciting step, as you now get to test the hypothesis that you’ve been working so hard to develop. Say hello to A/B testing because it is your new best friend.
A/B testing means producing two pieces that are identical except for the single variable you are testing. You then send the two versions to audiences that match in demographics and behavior, and see which performs better. Many email and marketing platforms have A/B testing built in, which makes this step easy.
It is absolutely crucial, and thus worth highlighting once again, that you change only one variable. Otherwise, you cannot isolate it as the cause of any difference in outcome.
Let's say you choose to send out two versions of the same email inviting members to a webinar you are conducting. One is written in your usual formal tone, while the other uses more conversational language.
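If you want to run the split yourself rather than rely on a platform's built-in tooling, a random assignment keeps the two groups comparable on average. Here is a minimal sketch; the member list and group sizes are hypothetical, and a fixed seed is used only so the split is reproducible:

```python
import random

def split_audience(members, seed=42):
    """Randomly split a member list into two equal-sized groups (A and B).

    A random split keeps the two audiences comparable in demographics
    and behavior on average, which is what the A/B test requires.
    """
    shuffled = list(members)
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Hypothetical member IDs for illustration.
members = [f"member_{i}" for i in range(1000)]
group_a, group_b = split_audience(members)
# group_a receives the formal email; group_b the conversational one.
```

Everything else about the two emails stays identical, so any difference in click rate can be attributed to the tone.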
Step 5: Analyze Results
How did the emails perform? Was there a significant difference in click rate? Keep your sample size in mind. If 10,000 people open your emails and there is a major difference in click rate, your hypothesis is strongly supported. If only 10 people open your email, be wary: you may need to run the experiment several times before you can draw meaningful conclusions.
Further, make sure to keep asking questions. Didn’t get the results you were hoping for? It doesn’t necessarily mean your hypothesis was incorrect. Perhaps the language you used was too informal, for instance. You can always keep digging.
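To put "significant difference" on firmer footing, you can run a simple statistical test on the results. The sketch below applies a standard two-proportion z-test to the click rates, using only Python's standard library; the click and open counts are made up for illustration:

```python
from math import sqrt, erf

def click_rate_z_test(clicks_a, opens_a, clicks_b, opens_b):
    """Two-proportion z-test comparing click rates of versions A and B.

    Returns (difference in rates, two-sided p-value). A small p-value
    (e.g. below 0.05) suggests the difference is unlikely to be chance.
    """
    p_a = clicks_a / opens_a
    p_b = clicks_b / opens_b
    pooled = (clicks_a + clicks_b) / (opens_a + opens_b)
    se = sqrt(pooled * (1 - pooled) * (1 / opens_a + 1 / opens_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, built with erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Illustrative numbers: 5,000 opens per version.
diff, p = click_rate_z_test(clicks_a=400, opens_a=5000,
                            clicks_b=475, opens_b=5000)
```

With a large sample like this, even a modest lift in click rate can be statistically significant; with only a handful of opens, the same lift would not be, which is exactly the sample-size caution above.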
Congratulations! You have now performed a well thought-out experiment that allows you to draw meaningful conclusions from your online community. The whole process may feel a little removed, but think about the implications. For instance, perhaps you learned that your click rates increased by 5% when you started to use different language to describe the exact same content that you were providing. That is a big deal.
Alternatively, you may have run the experiment and determined that your initial hypothesis was incorrect. That is equally important knowledge, as you potentially avoided going down the wrong path for your organization.
Good luck on your scientific endeavors, and remember, test for one hypothesis at a time!
We created rasa.io because we fundamentally believe that up to now, the approach most associations take to online community building has been far too narrow. Networking, resource sharing, and Q/A are just a part of the online community experience, so we created a platform that puts member engagement where it belongs: at the heart of your association's community.