Tuesday 19 March 2019

Listen to your users and accept good and bad demo feedback

Startup diary

Peer pressure: When you get almost universal feedback from a cohort of users providing reviews via product demonstrations you must act on their comments

Richard Rodger

If you're a regular reader of the Startup Diary, then you'll know we're in the middle of running our first set of hands-on demonstrations with our first batch of users. These are conference speakers who want to use our system to help them plan and co-ordinate their speaking engagements. How have these demos been going?

Let's rewind a little. What is the strategic approach here? We've just launched our new set of features for speakers. Because these features are new, they are buggy, badly designed, and definitely not a good fit for our customers.

To think anything else is to suffer from a dangerous self-delusion. Hubris has killed many a startup.

The strategic solution to this problem is not very hard in principle - just listen to your customers. In practice, you need to do this in a structured way.

The structure we use is the 'cohort'. We invite small groups of new users onto the system in successive batches, each over a different time period.

The Software-as-a-Service business has lots of internal jargon, and 'cohort' is one such word. In the general case, it means a set of users grouped by a calendar interval. You might break your users up into batches based on the month that they joined your service.

Cohorts are very useful for tracking the long-term behaviour of your users, and whether they stick around long enough to make you any money.
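
For the more technically minded, here is a minimal sketch of the idea in Python, grouping users into monthly cohorts by signup date (the user records and field names are hypothetical, purely for illustration):

    from collections import defaultdict
    from datetime import date

    # Hypothetical user records: (name, signup_date)
    users = [
        ("alice", date(2019, 1, 14)),
        ("bob", date(2019, 1, 28)),
        ("carol", date(2019, 2, 3)),
    ]

    # Group users into cohorts keyed by the month they joined.
    cohorts = defaultdict(list)
    for name, signed_up in users:
        cohorts[signed_up.strftime("%Y-%m")].append(name)

    # cohorts == {"2019-01": ["alice", "bob"], "2019-02": ["carol"]}

You can then track each monthly group separately over time - retention, revenue, feedback - rather than lumping all users together.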

In the launch phase, cohorts serve a related purpose - helping to achieve product market fit.

Each version of your product will have flaws and issues, and many of its features will miss what the market actually needs. Even if you are your own customer (I am a conference speaker), you will still get this wrong.

The only way to find out what really resonates with your users is to let them loose, watch them in action, and ask for feedback.

We've now completed our first cohort of demo users, and the results have been good and bad, which is to be expected.

First, the good news: every demo so far has resulted in a commitment to use the system, or actual use. The basic feature set that we have built is actually useful. That's an important hurdle to clear. We haven't just been demoing to speakers, but also to conference organisers and exhibitors.

We have commitments to migrate data over to our system from competitors, and we have initial adoption.

This is great. In addition to our trial clients, this gives us a great deal of confidence that we can answer the famous question "you might have a gap in the market, but do you have a market in the gap?" with a definitive 'yes'.

We've had lots of feature requests too, which again demonstrates user engagement - they actually care enough to want things to improve.

We've also had users telling us they will be using the system for specific events over the next six months, and they need specific features to make this happen.

This is wonderful and terrifying at the same time - our lives just got harder. That's something people don't often say about startups: most of the time, the next level of success just means things get harder for the team. You are constantly on the back foot.

Issues that were acceptable in a Minimum Viable Product are now critical issues in production use. Be careful what you wish for.

The bad news is that our graphical design is... not great. In the words of one user, it needs to "go up a level".

We sort of knew this, so it is not a surprise. When you get almost universal feedback from a cohort of users, it's time to act.

We'll need to prioritise improving the graphical design. This is not a simple matter as we also have to worry about new features for those upcoming conferences.

And unlike many software projects, where the customer has to put up with delays, conferences happen on a given date no matter what - the deadline cannot be moved, and the system must be up.

You might be tempted to say we should have got a better design done in the first place. You'd be right, but your observation would not be that useful. We knew that already, but chose not to focus on that area.

The user feedback is a direct consequence of a decision taken last year.

It still feels like the right decision - we don't have users leaving because of the design. The underlying features still speak for themselves. This is a classic startup decision.

You have limited resources and you have to make hard choices, and the wrong ones can kill you.

In the case of the design, we decided to use a standard user interface toolkit that provides a boring but practical and consistent user interface.

The design has no real personality. But it works, and it allowed us to focus on functionality.

Clearly we have reached the end of the line with this approach.

The cohort strategy also allows you to avoid annoying potential customers too much. Since we know we need design work, we have slowed down the demos to new users. There is no point getting the same feedback again and again.

This is a very hard thing to do. We know we can improve our user numbers right now, and we know we'll have smaller user numbers in this half of the year as a result of holding off a little, but we believe it will mean higher numbers in the second half of the year.

People care about design, even if only viscerally, and it is a competitive differentiator - just ask Apple.

I'm going to introduce a new set of reporting numbers to this diary.

To date we have built up our database of conferences and speakers manually through direct market research. Over time, we intend for our users to add this data to the system as they publish the details of their own events and talks.

But you have to kick-start the process, which is one reason we chose a conference search engine as our MVP.

In our search engine at the moment, which is focused on technology conferences, we have 2,184 conferences, 6,129 speakers, 4,932 exhibitors, and 931 venues.

This is probably not more than about 15pc of the total number of public technology conferences.

Marketing update: speakers newsletter - 5,664 subscribers, open rate of 12pc. EventProfs newsletter - 354 subscribers, open rate of 41pc. The podcast is at 41 downloads.

Richard Rodger is the founder of Voxgig. He is a former co-founder of Nearform, a technology consultancy firm based in Waterford.
