Francesco Bovoli: The Tech-flavored Product Guy

Apr 15, 2020 | Resources

Francesco Bovoli is the CPO/COO at Emoticast, a UK-based tech startup whose apps generate GIFs with embedded music for mobile messaging. Francesco has wide experience at large organizations such as Accenture and has also spearheaded start-ups such as IdeaPlane. We met Francesco to discuss user growth, optimizing experiments, and the secret to increasing opt-in rates for push notifications. Here’s what we found out…

Getting into product management

Interviewer: Tell us a bit about your product journey

Francesco: I used to say, “I am a product-flavoured tech guy.” As I get older, I code less and less, so at Emoticast I’d say, “I am a tech-flavoured product guy.” I’m actually a computer engineer. I did my Master’s in computer engineering, and in my first startup, as CTO, I was actually writing code in the late ‘90s with what was back then the leading edge: Linux, Apache, MySQL, and PHP. It was brand new, and you could make decent money with that.

Link to full audio interview [here]

Interviewer: You’ve been speaking about user growth a lot. What are your thoughts about user growth? How did you approach it?

Francesco: When I joined TuneMoji, the app was already live with a solid user base. It takes a long time to actually fine-tune the format. The original idea of TuneMoji came from “tune” plus “emoji”: it was like stickers, with sound bolted on at the back. And that format simply didn’t work, which is pretty normal at start-ups, in my experience. The reason some of the labels and all of the publishers partner with us is that we are nimble innovators.

“We settled for weekly active users, as monthly would have been too slow”

The startup mantra is that once you find a decent product-market fit, you need to start growing. So step one was to identify what it means to grow. For us, user engagement meant clips sent, clips watched, and active users: it is all about how engaging the product is for users. And we settled for weekly active users, as monthly would have been too slow.

User Experiments and Frequency

Interviewer: When you say slow, you mean you are not reacting fast enough?

Francesco: Yes. Everybody says that if you really want to go fast, you need to run experiments fast and also see the results fast. So, if the metric you measure spans 30 days, after 3 days your experiment has only affected 10 percent of the measurement window. Your needle is always moving very slowly, which is probably OK for an enterprise customer: in an enterprise, the typical churn you want is less than 10 percent per year. In the consumer space, you have 80 percent churn after the first week. After a week, 80 percent of users are gone, and you have new ones coming in. So a monthly time frame is not good enough; it isn’t fast enough.
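The window-size arithmetic above can be sketched quickly (illustrative numbers only, nothing from TuneMoji’s actual systems):

```python
# Illustrative only: what share of a rolling-window metric an
# experiment can have influenced after running for `days_live` days.
def fraction_affected(days_live: int, window_days: int) -> float:
    """Share of the metric window that falls after the experiment launch."""
    return min(days_live / window_days, 1.0)

# A 30-day (monthly) window reacts slowly to a change...
print(fraction_affected(3, 30))  # 0.1 -> only 10% of a monthly metric
# ...while a 7-day (weekly) window moves much faster.
print(fraction_affected(3, 7))   # ~0.43 of a weekly metric
```

This is the intuition behind choosing weekly over monthly active users: the shorter the window, the sooner an experiment dominates the metric.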

Interviewer: And also there are some seasonal aspects… on Mondays and Fridays people are in different moods…

Francesco: There are days when there is a lot of traffic, and such a day is really bad if you want to run an experiment. But is it bad because of the traffic? We went for weekly, since it’s the window that makes the most sense.

Interviewer: How do you balance the amount of activity you need to do to run the experiment successfully? Because I am sure there are some product tweaks that you do from one experiment to the other. How do you manage the lead times in getting those product tweaks in place?

Francesco: It chiefly depends on the experiment. In some experiments, you can go live for an hour and you kind of know it’s working. For others, not so much. In the beginning, it depends on how you structure the experiments and the type of experiments. For example, when we started the push notification experiments, there were a few where we knew we had a winner straight away, because literally you start a push notification and you see your opt-in rate by the second day. On the other hand, if you ask people to review the app, that takes time, because it’s not an app that gets thousands of reviews per day. One of the big problems I faced was how to run experiments faster and in a solid way. How do you know that the experiment is conclusive, and how do you run more than one experiment at once?

That’s where we ended up adopting buckets, a term introduced to me by a friend called Ivailo Jordanov, a brilliant growth guy based here in London whom we consulted during my gig at Karhoo. Basically, he suggested dividing our entire user base into segments, assigning every user to a “bucket”. Then, when running an experiment, you assign the users from some buckets to variant A and the users from the remaining buckets to variant B, which gives you a couple of important results.
The first is that you can run a three-way ABC test rather than a two-way AB test.
The other is that it allows you to run several tests in parallel.
Say you want to test forcing users to log in with Facebook at the same time as you are experimenting with retention by sending push notifications. You could find that the users logging in with Facebook have higher retention: is it because they are logging in through Facebook, or is it because of the push notifications? If you have multiple buckets of users, you can run both tests in parallel and still read the results of the two tests independently, for example by assigning each test its own set of buckets.
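A minimal sketch of the bucketing idea, assuming a deterministic hash and ten buckets (the bucket count, hash choice, and names below are my illustration, not TuneMoji’s actual implementation):

```python
import hashlib

NUM_BUCKETS = 10  # assumed bucket count; the interview doesn't specify one

def bucket_for(user_id: str) -> int:
    """Deterministically map a user to a bucket via a stable hash,
    so the same user always lands in the same bucket."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

# Hypothetical parallel tests: disjoint bucket sets mean each test's
# results can be read independently of the other.
FACEBOOK_LOGIN_TEST = {0, 1, 2}     # variant buckets for the login test
PUSH_NOTIFICATION_TEST = {5, 6, 7}  # variant buckets for the push test

def assignments(user_id: str) -> dict:
    """Which variant(s) this user sees across the parallel tests."""
    b = bucket_for(user_id)
    return {
        "bucket": b,
        "facebook_login_variant": b in FACEBOOK_LOGIN_TEST,
        "push_notification_variant": b in PUSH_NOTIFICATION_TEST,
    }
```

Because the two tests draw on disjoint buckets, a retention lift in the push-notification buckets cannot be an artifact of the Facebook-login change, which is exactly the confound described above.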

[Diagram: customer journey]

Push notifications and Engagement strategy

Interviewer: I remember you speaking about the push notifications part. You have a long learning history around push notifications. Maybe you can introduce us to that.

Francesco: Happy to. There was quite a lot of learning; some parts are even embarrassing, which you would probably laugh at.
It’s imperative that you ask the user for permission to enable push notifications at the right time, because if you get the timing wrong, the user can decline, which basically means the user’s gone: on iOS you can effectively only ask the user once. So what we were doing initially was to wait until the user had used the product at least three times and watched at least five clips. Then we knew that they understood the value of the product, and only then would we ask whether we could send push notifications, which sounded really smart.
Initially, the number of people who opted in to push notifications was 10 percent, while our expectation was more around 60 percent. So I started digging around on the Internet and found an article that explained quite in depth the different ways to ask users for permission. The simplest is called the Blitzkrieg approach, which essentially means asking the user straight away, as soon as they open the app. If you do that, said the article, about 30 percent of users opt in.
So we thought, “Hang on a second, we thought we were so smart because we were waiting for the user to open the app five times and watch five clips and so on, and that gave us a 10% conversion rate. But the article says that if you ask straight away, you get 30%?!”
The explanation really was quite simple. The average churn rate for a mobile app is around 80 percent in the first couple of days. If you wait two days to ask for permission to send push notifications, 80 percent of the users are already gone. If push notifications are critical for your retention, you can’t wait five days to ask the user. It’s got to be sooner.
So we did the obvious thing: we switched to asking the users straightaway. Long story short, that single change increased our opt-in rate from 10 percent to 30 percent overnight, and then the push notifications actually started to work. 
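As a back-of-envelope check of those numbers: the overall opt-in rate is roughly the share of installs still around when the prompt appears, multiplied by the opt-in rate among those who actually see it. The 50 percent survivor opt-in figure below is an assumed value chosen to reproduce the observed 10 percent:

```python
def effective_opt_in(survival_rate: float, prompt_opt_in: float) -> float:
    """Share of ALL installs that end up opted in to push notifications."""
    return survival_rate * prompt_opt_in

# Ask straight away: every install sees the prompt, ~30% accept.
immediate = effective_opt_in(1.0, 0.30)

# Wait a couple of days: ~80% have churned, so only 20% ever see the
# prompt. Even if half of those survivors accept (assumed figure),
# the overall rate lands at the ~10% observed in the interview.
delayed = effective_opt_in(0.20, 0.50)

print(immediate, delayed)  # 0.3 0.1
```

The delayed prompt can have a *higher* acceptance rate among those who see it and still yield far fewer opted-in users overall, which is the counterintuitive part of the story.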

“Important Question: Have you asked your users why they didn’t engage?”

Because then the users actually started receiving notifications. That was silly learning number one; I just wish somebody had told me sooner!
The idea to start experimenting with push notifications came during a strategic offsite focused on growth. It was moderated by my good friend Michele Battelli, now at Improbable. He asked the obvious question: “Have you asked your users why they didn’t engage?” Well, at that point we had not, because back then we had no login, no tracking, nothing. Nothing at all. We just did not know how to ask them!
So we decided to try on Android and sent a push notification to all the users who still had the app but had not used it in the last 7 days. That turned out to be still a few thousand people, which gave us enough replies to dive into the data and have a look.
The thing that really stood out was that if I were to calculate a net promoter score from those ratings out of 10, the score would have been 9. Hang on a second, that is really good. But why were we getting an NPS of 9 from the users who churned? That seemed wrong.
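For reference, a standard Net Promoter Score is computed from 0–10 survey ratings as the percentage of promoters (9–10) minus the percentage of detractors (0–6); a minimal sketch:

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on the usual -100..+100 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Mostly high ratings from churned users -> a strongly positive score.
print(nps([10, 9, 9, 8, 7, 10]))
```

A strongly positive score from churned users is exactly the paradox described here: they liked the product, they just forgot it existed.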

“I just love your product. But I forgot that I had it.  You guys should remind me more often” – User

How is it possible that someone who stopped using your product would recommend it? Going through the comments, we spotted a guy who said, “I just love your product. But I forgot that I had it. You guys should remind me more often.” I felt really stupid.
It’s literally the user begging you, “Remind me that I’ve got the product.” And we weren’t sending any reminders at all. So we started from the most basic: ‘Hey, we miss you, please come back’, which is probably the worst possible notification you can come up with, but one we could do straight away. So we tried. Retention actually increased overnight, another couple of percentage points, with that obnoxious notification. We knew that was a winner, and obviously from there it was a matter of weeks of work fine-tuning the messaging.

Interviewer: So looks like you had been over-defensive, thinking, “I don’t want to disturb my user”.

Francesco: Absolutely

Interviewer: You could do a lot more than what you thought…
Francesco: Sometimes your users actually want to be reminded. It was a matter of finding the right way to do it without annoying them: how many notifications they can take, what the right format is, and so on.

Interviewer: So you guys A/B tested the copy as well?

Francesco: We were running four to five tests in parallel. Today’s analytics platforms allow you to just flip a switch, type the message, and send it. That gave us really good insight. Currently, we’re doing fairly frequent reminders, designed to add value. Rather than telling the user, “What a great day it is today. Please come back,” we tell them, “Hey, this is the best clip of the day that you haven’t seen yet, and we think you would like it because we know what you like; and by the way, this friend of yours likes the same thing, so maybe you want to send it to him.”

It’s really important to add value to the user by surfacing the content they really want to see, and you have to have the data, and a smart way of using it, to actually do something that adds value to them.

Abhishek Bagalkot

