
Patterns of Guessing

Two weeks ago I published a web-based game, “Guess the Pattern.” As I noted in the newsletter that week, I instrumented the game so I could analyze patterns of play. Every time you loaded the page, clicked on the level slider, picked a color, made a complete guess of all five colors, or completed a game, the web app sent a message back to the server. The Rails side of the app stored those messages in a table appropriately named player_events. (Note that it did not store anything from the HTTP headers. There is no database record of browsers or IP addresses or anything else that is not in-game activity.)
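
For the curious, the event-recording path doesn't need to be elaborate. Here's a sketch of roughly what it looks like; I've simplified, and the column names are illustrative rather than the exact schema:

```ruby
# Sketch of the event-recording endpoint (simplified;
# column names are illustrative, not the exact schema).
class PlayerEventsController < ApplicationController
  def create
    PlayerEvent.create!(
      player_id:  params[:player_id],   # anonymous id from the cookie
      event_type: params[:event_type],  # e.g. "page_load", "guess", "game_complete"
      detail:     params[:detail]       # event-specific data, like the chosen feedback level
    )
    # Deliberately nothing from request.headers: no browser, no IP address.
    head :no_content
  end
end
```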

Yesterday I spent some time sifting through the data. Here’s what I learned.

As of yesterday the game saw 333 unique players as measured by a generated anonymous id stored in a cookie. It’s worth noting that this approach could be over- or under-counting players. People might play on multiple devices or share their device. But for the purposes of this analysis, assuming there’s a 1:1 relationship between players and ids is good enough.

Of the 333 players, the vast majority (~70%) visited the site and left without clicking on anything. However, of the ~30% who interacted with the game, ~90% made at least one full guess, and just over 90% of those made at least two guesses. Overall, ~60% of players who interacted with the app at all completed at least one game. So there was a steep drop-off from visiting to playing, but a more gradual tapering after that. If you think of activity in the game like a conversion funnel, it looks like this:

[Image: Activity funnel for players in the game, showing a steep decline from the top of the funnel to the bottom.]
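
If you're wondering how I sliced the data, the funnel falls out of a handful of distinct counts over player_events. Something along these lines, where the event_type values are illustrative stand-ins for the real event names:

```ruby
# Illustrative funnel queries over player_events; event names are stand-ins.
visited    = PlayerEvent.distinct.count(:player_id)

interacted = PlayerEvent.where.not(event_type: "page_load")
                        .distinct.count(:player_id)

guessed    = PlayerEvent.where(event_type: "guess")
                        .distinct.count(:player_id)

completed  = PlayerEvent.where(event_type: "game_complete")
                        .distinct.count(:player_id)
```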

Given that it's a game about feedback, I did expect that players would twiddle the feedback lever. The data supports that hypothesis: 79 players changed the feedback setting two times or more. That's a full 80% of the players who did more than just visit the page.
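
Finding the twiddlers is a grouped count with a HAVING clause; again, the event name is an illustrative stand-in:

```ruby
# Players who changed the feedback setting two or more times.
twiddlers = PlayerEvent.where(event_type: "feedback_change")
                       .group(:player_id)
                       .having("COUNT(*) >= 2")
                       .count   # => { player_id => change_count, ... }
twiddlers.size  # => 79
```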

I did have one hypothesis that I am pleased to say the data refuted.

The day after I published the game, I regretted making the default feedback setting the lowest possible, the one where it just tells you “probably wrong.” I fretted that players would arrive at the site, try to play the game, and immediately be turned off by the useless feedback. Turns out I need not have worried. There was only one player who made a single guess without changing the feedback level (and thus saw “probably wrong”), and then left. The other 87 players who made at least one guess changed the feedback level at least once.
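
That check is just a set difference between two cohorts of player ids, something like:

```ruby
# Guessers who never touched the feedback level (event names illustrative).
guessers = PlayerEvent.where(event_type: "guess").distinct.pluck(:player_id)
changers = PlayerEvent.where(event_type: "feedback_change").distinct.pluck(:player_id)
(guessers - changers).size  # => 1
```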

Although the low feedback setting did not appear to cause players to bail on the game, the game still wasn't “sticky.” The timestamps on activity suggest that the vast majority of the active players played the game for a bit, but then left and did not come back.

(Of course it is possible that they did come back, but cleared their cookies and were counted as new players. However, given the overall patterns in the data I think that's unlikely.)

In fact, only about a dozen players were active an hour or more after their first visit to the site. Only ten of those played across multiple days. If you are one of those ten, many thanks for enjoying the game! You're in a rarefied 3%.
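
Both of those measures fall out of each player's first and last event timestamps. A sketch, using the standard Rails created_at column (depending on your database adapter, the aggregate timestamps may come back as strings and need a cast):

```ruby
# First and last event timestamps per player.
spans = PlayerEvent.group(:player_id)
                   .pluck(:player_id,
                          Arel.sql("MIN(created_at)"),
                          Arel.sql("MAX(created_at)"))

hour_or_more = spans.count { |_id, first, last| last - first >= 3600 }
multi_day    = spans.count { |_id, first, last| first.to_date != last.to_date }
```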

The game also did not go viral. The bursts of first-time players were clustered around my social media activity; after each post, new arrivals tapered off rapidly.

[Image: New players by day, with spikes on the days I posted about the game on social media.]

There was one surprise waiting for me in the data. Five players won at least one game on the lowest possible feedback setting; one won three low-feedback games in succession. A single blind guess has only a 1-in-720 chance of being right (6 x 5 x 4 x 3 x 2 = 720 possible answers), so even allowing for several guesses per game, three low-feedback wins in a row is astronomically unlikely. This wasn't just luck.

My first theory was that players cranked up the feedback level early in the game, then ratcheted it down at the end. Indeed, some had. But not all. Some hadn’t changed the feedback level at all during a game. I knew it was possible to use developer tools to find the solution in browser memory, but I hadn’t tried to do so myself. Inspired by this finding, I went looking. Within 15 minutes I too had cheated my way to victory (and re-learned some important lessons about the visibility of anything on the client side of a web app).

Speaking of re-learning lessons, I mistakenly reported on Twitter that 34 folks had tried to set the feedback level higher than it could go. Turns out I was doing so many SQL queries so quickly that I confused myself. I wasn’t looking at data values; I was looking at counts. WHOOPSIE! When I re-examined the data, I discovered that there are no server-side records of anyone at all having set the feedback level to anything other than the defined values 1-3.
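
The fix was to look at the actual stored values instead of row counts, which is a one-liner (column name illustrative):

```ruby
# Distinct feedback levels that were actually recorded.
PlayerEvent.where(event_type: "feedback_change")
           .distinct
           .pluck(:feedback_level)
# => [1, 2, 3]  -- only the defined values, nothing out of range
```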

So what’s the final verdict? If folks didn’t come back to play the game again after their first visit or forward the link along to others, is it a failure?

Not exactly. Given the lack of traction on the game, it is certainly not something I'm going to invest much more in at this time. However, I learned a ton by instrumenting it, publishing it, and analyzing the data. I'll put those lessons to good use in the future. And perhaps some day I'll come back to this game to see if I can tweak it to improve its stickiness.

Before I come back to the feedback game, I have plans for the simulation. Last week I published a video of Davis and me playing the unnamed bug simulation. Now we're working on a developer version. Watch this space for news: I'm hoping we'll have something ready for play testing in the coming weeks, and I'll be putting out the call for play testers here.

Stay Curious,

Elisabeth