How Two Questions Can Change Your Product’s Growth – Chatbots Magazine


At Poncho, we obsess over data. Not just weather data, but internal metrics about our engagement, performance, and growth. We focus critically on sessions, daily retention, and user churn. When we look at these metrics, whether per platform, per user cohort, or in aggregate, we can see how effective our product is with our users.

Explaining to a new user who and what Poncho is can be a challenge. He is a cat; he tells you the weather; he's humorous; he sends forecasts twice a day; he's from Brooklyn; he doesn't speak other languages; he's fallible; he hates charts & graphs; he loves pizza. That's a lot to convey to a user, whether they've tapped "Get Started" in Messenger or "Install" in the App Store. To convey all of that to a new user, we have to continually evaluate our onboarding experience.

Fewer questions

Over the past two months, we’ve been A/B testing a new onboarding experience on our Facebook Messenger, Kik, and Viber bots. The onboarding experience in those bots has changed somewhat, but the flow has generally been the same since our launch at F8 in 2016.

Our onboarding originally consisted of sending a GIF, an introduction message, and a barrage of at least five questions before a user could even experience the bot. Subtle messaging content and conversational nudges helped a large number of users complete the onboarding experience, but we noticed many users would try to ask their own questions during the experience. Many wanted to quickly start chatting with Poncho. And looking at the user cohorts, a significant number of those users were non-native English speakers, which led to higher onboarding abandonment rates.

We needed to refresh the onboarding experience to reduce friction and complications. After much deliberation and thought, we settled on a radical hypothesis: we only need to ask two questions. We whittled down the entire experience to ask only these two: 1) their location, and 2) whether they want to customize their experience.
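The flow above can be sketched as a tiny function. This is a hypothetical illustration, not Poncho's actual code; the function and field names are assumptions.

```python
def onboard(ask):
    """Run a two-question onboarding.

    `ask` is a callable that sends a prompt to the user and
    returns their reply as a string.
    """
    profile = {}
    # Question 1: location, the only thing required to send a forecast.
    profile["location"] = ask("Where are you located?")
    # Question 2: opt in to customization; everyone else keeps defaults.
    reply = ask("Want to customize your forecasts? (yes/no)")
    profile["customize"] = reply.strip().lower().startswith("y")
    return profile
```

Simulating a user with canned answers, `onboard(lambda q: next(iter_of_answers))` returns a profile dict the bot can act on immediately.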

We called this "Two-Question Onboarding."

Aside from getting a location from the user to provide weather information, we start a new user with a set of defaults: an 8:00am morning forecast, a 6:00pm evening forecast, the default temperature format for their location, severe weather alerts, etc. Only in the customization experience do we ask further questions about changing these preferences, showing examples of what these forecasts and notifications look like.
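Those defaults might be represented as a simple preferences record. The keys and values below are assumptions for illustration; only the defaults themselves come from the article.

```python
# Illustrative default preferences for a new user (names are assumed).
DEFAULT_PREFS = {
    "morning_forecast": "08:00",
    "evening_forecast": "18:00",
    "temperature_unit": None,   # derived from the user's location
    "severe_weather_alerts": True,
}

def default_prefs_for(location_uses_fahrenheit):
    """Copy the defaults, filling in the unit from the location."""
    prefs = dict(DEFAULT_PREFS)
    prefs["temperature_unit"] = "F" if location_uses_fahrenheit else "C"
    return prefs
```

The point of the design is that every field already has a sensible value, so customization becomes optional rather than a gate in front of the product.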

The current results of our hypothesis have surpassed our expectations.

Stats from Mixpanel showing a 20%+ increase in day 7 retention after two months of testing & improvements.

The new experience saw our recent day 7 retention of around 60% in Dec ’17 skyrocket to over 80% with the new onboarding. Our onboarding completion rate is now over 75%, and our daily unique sessions are climbing more than ever.

Our focus on improvement led us to test something different while keeping to our intention of having a simple and informational onboarding experience. The results show that we’ve continued to improve upon an already engaging, sticky product by improving how we communicate to users who & what Poncho is.

Learning by testing

The “Two-Question Onboarding" experience today is not the one we started testing back in December. We’ve worked hard over the past two months rewriting content, testing flow changes, removing unnecessary friction points from the experience, and observing cohorts of users to determine effectiveness.

Poncho introducing himself in user’s native language & English.

One example of unintentional friction was how we conversed with non-native English users. The “Two-Question Onboarding" was actually three questions for users whose profile locale did not indicate English. In the introduction to those users, we would send both a welcome message in their native language and an English version stating that Poncho can only converse in English.

We initially asked those users if they wanted to “Continue in English?", not thinking how challenging this question might be for many people. For anyone who is not fluent or native in a language, such a question can cause them to doubt their ability to converse effectively, whether with a human or a chatbot. This question caused so much friction that the completion rate among non-native English speakers dropped below the existing completion rate.

By removing the question and altering the dual introduction messages to be friendly & informative rather than confrontational, we saw onboarding completion rates reach above the past levels. Overall, we’ve learned that we should be more considerate on how we frame and ask certain questions.
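One way to implement the fix is to drive the dual intro purely off the profile locale, so no extra question is needed. This is a hedged sketch: the greeting strings, locale format, and function names are assumptions, not Poncho's implementation.

```python
# Hypothetical sketch: choose intro messages from the profile locale
# instead of asking "Continue in English?".
GREETINGS = {            # assumed sample translations
    "es": "¡Hola! Soy Poncho.",
    "nl": "Hoi! Ik ben Poncho.",
}
ENGLISH_INTRO = "Hi! I'm Poncho. I chat in English, and I'm glad you're here."

def intro_messages(locale):
    """Return the ordered list of intro messages for a locale like 'nl_NL'."""
    lang = locale.split("_")[0].lower()
    native = GREETINGS.get(lang)
    # Non-English locales get a friendly native greeting plus the
    # informative English intro; English locales get one message.
    return [native, ENGLISH_INTRO] if native else [ENGLISH_INTRO]
```

The user never has to answer a question about language; the bot simply states, warmly, how it works.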

Side-by-side comparison of onboarding intro graphics. Just as much can be communicated with less time and smaller file size.

Another learning with “Two-Question Onboarding" was how we visually engaged with the user. Our first message to a user was roughly a 1MB GIF. Anyone in a developed country on a good WiFi or cellular network might not notice a delay, but the reality is that even 1MB is a heavy cost for users. That 1MB GIF needs to be cached by the platform, loaded by the app from the platform’s caching system, and then the user has to wait for it to load & watch the few seconds of animation.

As an experiment, we replaced the GIF with a set of static images, randomly choosing one to present to the user. These showed Poncho saying hello, whether in bed, at home, or wherever. We included a smile, a phone, and elements of his life (umbrella/weather, pizza, and such). Ultimately these new static images loaded faster and drew less attention away from the rest of the introduction content. The static image effectively conveyed a visual sense of Poncho and his character without the wait.
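The random selection itself is trivial. A sketch under assumed asset names (the file paths are invented for illustration):

```python
import random

# Illustrative: swap one heavy GIF for several small static intro images.
INTRO_IMAGES = [                 # hypothetical asset paths
    "poncho_hello_bed.png",
    "poncho_hello_home.png",
    "poncho_hello_pizza.png",
]

def pick_intro_image(rng=random):
    """Pick one static intro image at random to send as the first message."""
    return rng.choice(INTRO_IMAGES)
```

Rotating among several small images keeps the intro feeling fresh across users while every variant stays far cheaper to deliver than the original GIF.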

Why these metrics?

Someone recently asked why we focus so much on day 7 retention and session metrics instead of weekly/monthly retention, time per session, or conversation length.

To be frank, we’re a daily service that sends forecasts and other notifications. Brevity is the balance between great and annoying. We know that the drop-off rate of new users is highest between days 0 and 3. By day 7, we’ll know whether a user has stuck around, as we see only a subtle drop-off from day 7 onward.

With that said, the metrics that are important to us won’t be important to other types of chat bots and apps.

What’s next for Poncho?

The Poncho team has been gearing up for a brand-new, fully conversational experience on Slack. We’re working to make Poncho the best virtual co-worker by combining our award-winning experiences with new engagements for the workplace setting. Sign up here to get notified when we launch.

We are also beta testing a Developer API for partners that want to integrate the Poncho experience into their platforms and apps. Join The Daily Mail and others in bringing the Poncho weather experience to your users. More information available here.

Greg Leuch is head of product at Poncho. He’s also a mentor for TheVentury’s ELEVATE chatbot accelerator. You can reach him on Twitter, Facebook, LinkedIn, or email.