I read my friend Enomoto-san’s blog, enmt.info (in Japanese), on ‘building hypotheses for growth hacking’, and thought I’d share his view and my quick thoughts on it. The post describes his experience with people asking him ‘how do you build a hypothesis?’ or even ‘is my hypothesis really a hypothesis?’. Marketers who already have many tests running and have built that testing culture are in great shape, but for many others it can be challenging, especially now that data is widely available and tools let marketers act quicker than ever.
From a digital analytics expert’s standpoint, it is common to look at the ‘facts’, ‘get ideas’, and synthesize those inputs into a hypothesis to test. But that is a broad prescription to digest, because it leaves so much flexibility in how you acquire the facts or the ideas.
Apparently, what works for Enomoto-san is actually something that has worked for me as well.
That is to find the ‘Bright Spot’.
It is very common to home in on a problem and just drill down on it, focusing on what’s NOT working and fixing it. In many situations, after testing whether your idea fixes that problem, it is hard to iterate further. However, if you find a version of the test that actually improves performance, that is a ‘Bright Spot’: an idea you can apply and iterate on further.
Let’s say you tested a Call-to-Action (CTA) that increased the conversion rate. You can now build on top of that learning because it gives you a hint of what worked or resonated with your consumers. You can also apply and scale that learning of ‘what worked’ to somewhere else where there is a problem.
For example, if you find that consumers search the internet using a specific term rather than the term you use as a business, go test it. I recently tested a CTA using the wording consumers prefer on search engines, and the conversion rate increased. That gave me more ideas for scaling the test and applying the methodology elsewhere.
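Before treating a result like that as a Bright Spot, it helps to check that the lift is unlikely to be noise. Here is a minimal sketch of a two-proportion z-test using only the Python standard library; the conversion counts and sample sizes are hypothetical, not numbers from the test described above.

```python
# Hypothetical CTA test: did variant B's conversion rate beat control A
# by more than chance? Two-proportion z-test, stdlib only.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (lift, p_value) comparing variant B against control A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))       # two-sided test
    return p_b - p_a, p_value

# Control CTA:          120 conversions / 4,000 visits (3.0%)
# Consumer-wording CTA: 168 conversions / 4,000 visits (4.2%)
lift, p = two_proportion_z(120, 4000, 168, 4000)
print(f"lift={lift:.3f}, p={p:.4f}")
```

If the p-value is small (conventionally below 0.05), the improvement is worth treating as a Bright Spot to iterate on rather than a lucky sample.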
People’s brains stall when they face problems and try to solve them by stacking up a list of problems to solve
Marketers like to focus on what’s not working and try to solve those problems, so they may build a to-do or test list from a stack of identified issues. That approach of piling up issues and grinding through them generates a lot of stress and demands a lot of stamina from your brain. Even if you’re a rock star and can deal with it, your team may not be able to keep up with you.
Like investing, it can be a great idea to invest in small ideas that work, little by little, and watch the positive results and returns on those investments stack up over time.
So when you look at your data and see some bright spots where things are working well, try reproducing those ideas when building your test hypothesis. It makes your job more fun and less stressful.
Finding good from the bad can bring insights
I think this idea of focusing on what worked is a great way to tackle problems differently, while many people stay focused on finding problems in what’s not working. The same idea can be applied to data analysis as well.
For example, if you see some pages with high bounce rates, use a Voice of the Customer survey tool to find out whether those are good bounces or bad bounces. You can learn from the differences in the data and start applying what’s working to test the areas that are not. This approach has yielded a lot of great insights for me recently. Hopefully, it brings new thinking to your work, too.
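To make the good-bounce vs. bad-bounce split concrete, here is a small sketch under an assumed setup: bounced sessions have been joined to answers from a survey question like "Did you find what you were looking for?". The pages and responses are entirely made up for illustration.

```python
# Hypothetical data: each record is (page, found_what_they_needed)
# for a session that bounced, taken from a Voice of the Customer survey.
from collections import Counter

bounced_survey = [
    ("/pricing", True), ("/pricing", False), ("/pricing", True),
    ("/blog/post-1", True), ("/blog/post-1", True), ("/blog/post-1", True),
    ("/signup", False), ("/signup", False), ("/signup", True),
]

def good_bounce_rate(records):
    """Per page, the share of bounces where the visitor still got what they came for."""
    totals, good = Counter(), Counter()
    for page, found in records:
        totals[page] += 1
        good[page] += found  # True counts as 1, False as 0
    return {page: good[page] / totals[page] for page in totals}

for page, rate in sorted(good_bounce_rate(bounced_survey).items()):
    print(f"{page}: {rate:.0%} good bounces")
```

A page like the blog post above (all good bounces) is a Bright Spot: visitors got their answer and left satisfied. A page with mostly bad bounces is where the lessons from the good pages are worth testing.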
Enjoy building your hypotheses using data! And if you have any questions, please feel free to reach out.