Saturday 21 September 2019

Startup diary: You can optimise your business and it will make a big difference - probably the difference between life and death

Research: Effective use of A/B testing can help firms boost user engagement. Stock image

Richard Rodger, Voxgig founder

User engagement is the key to making your software-as-a-service business work. It's the big theme of the second half of the year for Voxgig. Last week, I wrote about how we'll collect data to validate that our users are engaged. That's the first step.

The second step is to improve user engagement to drive increased revenue. That's where a little bit of maths comes into play, in the form of something known in the trade as 'A/B testing'.

You've put up your website. You've got your design looking professional and you've built the content for your product pages.

You've got the first version of your product working and you've got some happy customers.

Is the job done? Do you just ramp up the marketing spend now? Is that how you grow the business? No. You have to optimise all the user interactions in your system.

How do you know if the benefits and features that you are highlighting on your product pages are the ones that will really resonate with potential customers?

What about your homepage; does your tagline and initial message really hook people into learning more? Does your content marketing convert readers into users at the rate you need?

Inside your product, which features are the ones people actually value and use? Are your customers struggling with a given feature because your documentation is a little obtuse?

There are so many ways you can optimise an online business that it can be hard to know where to start.

But the great thing is that you can optimise your business and it will make a big difference - probably the difference between life and death in the end. OK, so how do we do it? We use A/B testing.

Here's the idea: take two different versions of something - let's say it's the opening message on your home page.

Deliver one message to 50pc of your visitors (that's version A) and another message to the other 50pc (that's version B). Do this for a week to make sure you collect lots of data.

Now, just like a political poll, you've measured the popularity of the messages. If one message gets 10pc more sign-ups than the other, you know which one to use. Now rinse and repeat for all areas of your website.
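To make the mechanics concrete, here is a minimal sketch in Python of what a 50/50 split looks like under the hood. The names and numbers are illustrative, not taken from any particular testing service.

    import random

    # Illustrative 50/50 split test; all names here are hypothetical.
    visitors = {"A": 0, "B": 0}
    signups = {"A": 0, "B": 0}

    def record_visit(signed_up):
        # Assign each visitor to message A or B with equal probability.
        variant = "A" if random.random() < 0.5 else "B"
        visitors[variant] += 1
        if signed_up:
            signups[variant] += 1

    def conversion_rate(variant):
        # Sign-ups divided by visitors for that variant.
        return signups[variant] / visitors[variant] if visitors[variant] else 0.0

After a week, compare conversion_rate("A") against conversion_rate("B") and the better message wins - provided the gap is big enough to be meaningful, which is where the maths comes in.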

When I worked in e-commerce in the early 2000s, you had to build this sort of thing yourself. And you had to get the mathematics right, or else you ended up fooling yourself.

These days, it's all much easier. You can use services like Boxever, Optimizely, Swrve, and many others, to get the job done.

These services all have different approaches and pricing levels, so you are spoilt for choice when it comes to picking the right one for your business.

The first reason to use one of these services is to make sure your data gets the proper statistical treatment.

You are effectively running scientific experiments. It's easy to let our biases and internal agendas twist the results. That's no good for the company.

You need hard scientific facts backed by proper statistical analysis. You'll be expending resources to change your website and your product, and you need to know you'll get a return on that expenditure - don't forget, you'll also have to explain it to your investors.
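For the curious, the standard treatment of a test like this is a two-proportion z-test. Here is a rough sketch, assuming the Python statsmodels library and a week of invented numbers:

    from statsmodels.stats.proportion import proportions_ztest

    # Hypothetical week of data: sign-ups and visitors per variant.
    signups = [120, 145]
    visitors = [2400, 2380]

    # Two-proportion z-test: is the gap in conversion rates real,
    # or just noise?
    z_stat, p_value = proportions_ztest(count=signups, nobs=visitors)

    if p_value < 0.05:
        print(f"Significant (p={p_value:.4f}) - trust the winner.")
    else:
        print(f"Not significant (p={p_value:.4f}) - keep collecting data.")

A good service runs this sort of check for you automatically; the point is that someone, or something, has to run it.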

The second reason is speed of execution. Having built a custom A/B testing system, I can tell you it's a complex piece of engineering, and easy to get wrong.

You could end up thinking version A is fabulous for getting customers, but due to a glitch in your code, it's actually version B (no, this never happened to me, oh no, not me sir!).
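One common safeguard against exactly this kind of glitch - and this is a general technique, not how any particular service works internally - is to make the assignment deterministic, so the same user always lands in the same bucket and a redeploy cannot silently swap the labels:

    import hashlib

    def variant_for(user_id):
        # Hash the user ID so the assignment is stable across visits
        # and deployments; an even digest means A, odd means B.
        digest = hashlib.sha256(user_id.encode()).hexdigest()
        return "A" if int(digest, 16) % 2 == 0 else "B"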

Like all powerful tools, A/B testing also has some nasty pitfalls.

Even the specialist services cannot protect you from your own enthusiasm. You should insist that your marketing team pays attention to these pitfalls and does its research.

Let's look at some of the common mistakes just to give you a feel for them.

Beware of seasonality in your business. Perhaps the summer is the wrong time for certain tests? December is always going to be different.

You can't just run any old A/B test any old time; it's surprising how many people forget this.

Beware of running too many tests at the same time. You'll be tempted because there are so many places you can see that need improvement.

The problem is that you won't have enough visitors to generate enough data.
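How much is enough? As a rough illustration - again using the statsmodels library, with invented conversion rates - you can estimate the required sample size before you start:

    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    # Hypothetical: we convert 5pc of visitors today and want to be
    # able to detect an improvement to 6pc.
    effect = proportion_effectsize(0.05, 0.06)

    # Visitors needed per variant at the conventional 5pc
    # false-positive rate and 80pc power.
    n = NormalIndPower().solve_power(effect_size=effect, alpha=0.05, power=0.8)
    print(f"Need roughly {int(n)} visitors per variant.")

Split your traffic across half a dozen simultaneous tests and you can see how quickly the numbers stop adding up.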

Your chosen service will display appropriate warnings about statistical significance, and you'll ignore them anyway in your excitement.

This often happens after the first few successful runs that generate big returns - everybody falls in love with A/B testing and goes overboard.

Finally, beware of treating all problems like a nail, now that you have a big hammer.

Don't stop listening to actual customers and what they are telling you. This data is qualitative, certainly, but it also tells you about very valuable changes that you need to make. Always listen to customers.

Metrics: this week, we have 79 open issues and 176 closed issues.

Richard Rodger is the founder of Voxgig. He is a former co-founder of Nearform, a technology consultancy firm based in Waterford
