We recognize that there are many opinions about what will or will not improve a website or process. That’s why we test!
We are data-driven marketers and we expect that you are engaging White Peak for conversion rate optimization (CRO) because you prefer data-driven answers over guesswork and opinions.
White Peak will always use a structured and systematic approach to improving the performance of your website. Our approach is informed by experience and insights — specifically, analytics and user feedback. Each step of the way, we take into account your website’s unique objectives and needs (KPIs).
White Peak’s Conversion Rate Optimization process is: Analyze, Learn, Repeat.
We believe that our experience with the research and prioritization phases of CRO sets us apart from competitors and in-house resources that may be involved with CRO for the first time. Therefore, in the spirit of transparency, below is what you can expect from White Peak during those phases.
To begin optimization, we need to know what your users are doing and, when possible, why. Before we begin optimizing and testing the targeted pages, we do research that provides the data-driven insights that will guide our strategy toward the goals we are trying to achieve with your site or landing page.
Here is a summary of the seven-step process we use at White Peak to collect the data that informs our testing plan:
- Heuristic Analysis
- Technical Analysis
- Web Analytics Analysis
- User Intent Analysis, where appropriate
- Mouse Tracking Analysis
- Qualitative Research
- User Testing, where appropriate
Heuristic analysis is about as close as we get to “best practices.” However, even with years of experience, we still can’t tell exactly what will work best on your website. While experience guides and informs our work, it doesn’t function as truth. That’s why we have a process.
When doing heuristic analysis, we evaluate each page against a consistent set of criteria.
Web analytics analysis is next in our process. To start, we need to make sure everything is working properly. This includes fully configuring Google Analytics to get actionable data from Goals, Segments, and Events.
User intent analysis may be conducted by asking site visitors, in their own words, to describe what they are looking for and why they want the information. We use polls and surveys to obtain this data, typically while a visitor is on your site. User intent analysis can also inform us of the language we should be using and the selling points we should highlight during testing.
Next is mouse tracking analysis, which includes heat maps, scroll maps, click maps, form analytics, and user session replays. This analysis helps us understand how people experience a page and adds a layer of context for much of the other data we see.
Qualitative research is an important part of our process as well because it tells you what quantitative analysis misses. Don’t underestimate this step. It should be conducted with rigor because it will provide information no less valuable than your Google Analytics data.
Finally, there is user testing. The concept is pretty straightforward: when, in our judgment, it is appropriate or necessary, we will observe actual people using and interacting with your website while they comment on their thought process out loud.
There are many frameworks for prioritizing CRO work. When we have gone through our seven-step research process, we will have found a variety of issues – some of them severe, some minor. We then put each issue into one of these five buckets:
- Test. (This bucket is where we place items to be tested)
- Fix. (This can involve fixing, adding, or improving tag or event handling in the analytics configuration)
- Hypothesize. (This is where we have found a page or process that needs to be improved but it will involve multiple tests to define the specific improvement)
- Get it Done. (Here is the bucket for “no-brainers” that shouldn’t require testing at all because they are so obvious)
- Investigate. (If an item is in this bucket, we need to do further analysis for one reason or another)
Once we have put all of the ideas into buckets, we rank them from 1 to 5 (1 = minor issue, 5 = critically important). Two criteria matter more than others when giving a score:
- Ease of implementation (time/complexity/risk). Sometimes the data tells us to build a feature, but it will take months to do it. So it is not something we would start with.
- Opportunity (a subjective judgment of how big a lift Zywave might get).
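The bucketing-and-scoring step above can be sketched in a few lines of code. This is an illustrative model only: the issue names, the 1–5 scores, and the weighting that combines opportunity with ease are hypothetical, not White Peak’s actual formula.

```python
# Hypothetical sketch of the prioritization step: each issue gets a
# bucket plus 1-5 scores for opportunity and ease of implementation.
issues = [
    {"name": "Unclear CTA copy",     "bucket": "Test",        "opportunity": 4, "ease": 5},
    {"name": "Broken event tag",     "bucket": "Fix",         "opportunity": 3, "ease": 4},
    {"name": "Checkout flow rework", "bucket": "Hypothesize", "opportunity": 5, "ease": 1},
]

def priority(issue):
    # Favor high-opportunity items that are also easy to ship;
    # break ties in favor of the bigger opportunity.
    return (issue["opportunity"] + issue["ease"], issue["opportunity"])

roadmap = sorted(issues, key=priority, reverse=True)
for rank, issue in enumerate(roadmap, 1):
    print(f'{rank}. [{issue["bucket"]}] {issue["name"]}')
```

Note how the big checkout rework scores high on opportunity but sinks in the ranking because of its implementation cost, which matches the “ease of implementation” caveat above.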
Our team will then create a spreadsheet based on our disciplined, experienced analysis, and you will have a prioritized testing roadmap more rigorous than most of your competitors will have. We then work with your team to execute our experiments.
White Peak uses A/B testing when executing our CRO plan on your site. The tools we use employ “multi-armed bandit” algorithms, harnessing machine learning to optimize your site. This means we can test as many variants as you want. Your test will automatically adapt to favor the variant with the best conversion rate.
As with traditional A/B testing, our team can monitor the number of views and conversions each variant gets. The main difference with multi-armed bandit testing, however, is that a variant with a higher conversion rate than the others receives more traffic.
The idea is that the variant with the highest conversion rate is the winning version and it’s automatically shown most of the time. As a test runs, our system will reconsider other variants. There is no need to constantly check your results, but you can if you need to. We can generate reports to provide the visibility you need.
As your test runs, new variants can be added to try out new ideas. Alternatively, you can leave the test running on its own. Our tool’s machine learning algorithms will adaptively pick the optimal version to show your visitors.
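To make the “higher conversion rate gets more traffic” behavior concrete, here is a minimal sketch of one common multi-armed bandit strategy, Thompson sampling. It is not the algorithm inside any particular testing tool; the variant names and the simulated true conversion rates are hypothetical.

```python
import random

def choose_variant(stats):
    """Thompson sampling: draw a plausible conversion rate for each
    variant from a Beta posterior and serve the one with the highest draw."""
    best_name, best_sample = None, -1.0
    for name, (conversions, views) in stats.items():
        # Beta(1 + conversions, 1 + non-conversions) posterior over the rate
        sample = random.betavariate(1 + conversions, 1 + views - conversions)
        if sample > best_sample:
            best_name, best_sample = name, sample
    return best_name

# Hypothetical "true" conversion rates used only to simulate visitors
true_rates = {"A": 0.05, "B": 0.10}
stats = {name: (0, 0) for name in true_rates}  # (conversions, views)

random.seed(42)
for _ in range(5000):
    variant = choose_variant(stats)
    conversions, views = stats[variant]
    converted = random.random() < true_rates[variant]
    stats[variant] = (conversions + converted, views + 1)

# Over time, the higher-converting variant receives most of the traffic,
# while the weaker variant is still occasionally "reconsidered."
```

Because each variant keeps a small chance of being sampled, the test keeps revisiting weaker variants exactly as described above, rather than cutting them off the moment they fall behind.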