
Redefining "Social" in Puzzle Games: How Research Split Into Two Strategies

  • Feb 24
  • 5 min read

The Problem

The puzzle game I had been working on had social features, but engagement was low. The team's strategy seemed straightforward: get players to add more friends, because players engage players and keep each other retained, right? Right?


But they'd never validated two fundamental assumptions:

  1. Do players even want "social" features in the puzzle game?

  2. What does "social" actually mean in this context?


Most of my initial conversations in this regard went something like this:


"This worked for other games, it'll work for ours"


I pushed the team to get more clarity before investing in features or initiatives targeted at friend acquisition and engagement.


First, what does social mean for our players, and do they even want it?


Second, does having more friends actually lead to more social interaction or engagement?


The Research

I led a multi-phase research program mixing qualitative, quantitative, and hybrid methods to unpack social barriers and drivers. What started as one study became a web of interconnected research, each phase building on the last. A good problem to have, if you ask me.

1: Segmentation Validation (2 weeks)

Analyzed behavioral data and player attitudes across segments to test the friend-count hypothesis.

Method: Behavioral Data Analysis + Attitudinal Survey

Sample: Analyzed existing player segments based on friend count (categorized by the team's definitions: low social, mid social, high social). Surveyed players across the segments to understand social attitudes, behaviors, and feature engagement.


The finding: Friend count didn't equate to social engagement. Some of the users with the highest social activity had the lowest friend counts: a small, tight-knit circle, if you will.


Friends can keep you in the game, but the value of additional connections plateaus after a point; quality of connection seemed to matter more than quantity.


This was surprising and not intuitive at first; my own research on Social Exploration in multiplayer games did not share the same findings. In a puzzle game? Different context, different dynamics.


We had a lightbulb moment: your friend list can be huge, but you don't interact with everyone, and engagement flattens over time. Friend count was a vanity metric, and chasing it might not be the best use of our time.
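To make the "quality over quantity" finding concrete, here's a minimal sketch of what behavior-based segmentation could look like instead of friend-count buckets. All field names and thresholds are hypothetical, invented for illustration; they are not the team's actual segmentation logic.

```python
# Hypothetical sketch: segment on observed social interactions per active week,
# not on friend-list size. Thresholds and field names are invented.
def social_segment(player):
    """Bucket a player by interaction rate rather than friend count."""
    rate = player["social_interactions"] / max(player["active_weeks"], 1)
    if rate >= 10:
        return "high social"
    if rate >= 2:
        return "mid social"
    return "low social"

players = [
    # Big friend list, almost no interaction: the "vanity metric" case.
    {"id": "a", "friends": 120, "social_interactions": 3,  "active_weeks": 6},
    # Tiny, tight-knit circle with constant interaction.
    {"id": "b", "friends": 4,   "social_interactions": 70, "active_weeks": 5},
]
for p in players:
    print(p["id"], social_segment(p))  # a -> low social, b -> high social
```

Note how player "a" would have been labeled "high social" under a friend-count definition, while the behavioral definition flips both labels.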

2: Testing "If It Works There, It'll Work Here" (3 weeks)


Method: Competitive analysis + cross-game audience comparison research


Approach:

  • Analyzed team features in competitor games (what they offered, how they worked, and in what contexts they succeeded)

  • Conducted comparative research on player behaviors, motivations, and session patterns across games in the same category and team-defined competitors.


The team wanted to build an array of features, including guilds, because "it works so well in other games."


Do we understand why those features work in those contexts? And do those contexts match ours?


I ran competitive analysis to understand what features existed and why they worked elsewhere, then conducted audience comparison research to test whether our players' habits and motivations matched the contexts where guilds succeeded.

They didn't. Our players had different play patterns, session expectations, and reasons for playing.


This approach, combining competitive analysis with audience research to challenge "it works elsewhere" assumptions, was later presented at King's internal UX conference as a framework for evaluating competitor features.

3: Barriers & Drivers Deep Dive (6 weeks)

Combined qualitative and quantitative research to map the social landscape.


Method: Mixed methods - qualitative interviews + quantitative survey


Sample:

  • Qualitative: In-depth interviews (n = 30) across player segments (split across engagement, social, and spending habits)

  • Quantitative: Survey (n = 1500) to validate themes at scale


Barriers identified:

  • Outside game: Privacy concerns, perceived artificiality, shame or guilt about time invested, limited understanding of what "social" could mean in gaming, and a sense of having nothing to talk about, really.

  • Inside game: Too much hassle, unclear benefits, fragmented experience, reward loops that felt unfulfilling and forced


Drivers identified:

  • Curiosity: Seeing stats, comparing progress

  • Status: Achievement, competition, leaderboards

  • Support: Belonging, community, mutual benefit

  • Belonging: Finding "players like me"


The Insight


First, for this audience, social was more than active interaction; it was also about passive presence.


The team had been chasing active social: guilds, messages, life requests, teams. But players didn't want only coordination. They wanted to feel present alongside others without direct engagement: seeing stats, comparing progress, sharing journeys.


This distinction clicked with the team. It explained why multiplayer-style social features weren't landing, and it unlocked ideation paths for both active and passive social.


Second, players of a game played by millions at any given moment thought they were the only ones still playing it. Yet there was genuine curiosity, not a lack of it. Communities existed on social media, but not within the game.


What level are people on?

How many attempts did it take the average player to pass this?

What's the high score on this level?

Do people in my neighborhood play this?


The desire to connect around the game existed. What was missing was visibility into the community beyond friend requests, messages, and in-app chats.


"Social" Reframed and Redefined


"Social" meant something fundamentally different in a puzzle game than in multiplayer games. And it meant different things inside vs. outside the game itself. This became the team's guiding principle.


Some barriers appeared to require product solutions (in-game mechanics like co-op events, trading, better reward loops).


Others required marketing approaches (creating buzz, community moments, making the invisible community visible, creating security and brand trust).


I split the findings across teams: product got mechanics insights first, marketing got visibility strategies first. Then I connected the teams, because the data suggested we would strike gold by working together toward a solution.


4: KANO and Concept Validation

Based on Phase 3 insights, we ran several ideation and brainstorming workshops, developed ideas, concepts, and prototypes, and moved on to validating and testing them.


Method

  • KANO Analysis (n = 1500): Evaluated potential social features to identify table stakes vs. delighters

  • Diary Study (n = 15): I conducted a diary study to concept-test Crushable early, validating engagement patterns and habit creation and surfacing onboarding challenges.
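For readers unfamiliar with KANO analysis: each respondent answers a functional question ("How would you feel if the feature were present?") and a dysfunctional one ("...if it were absent?"), and the answer pair maps to a category via the standard Kano evaluation table. Below is a minimal sketch of that classification; the survey responses are hypothetical and only illustrate how a "delighter" (Attractive) pattern emerges.

```python
from collections import Counter

# Standard Kano evaluation table: (functional, dysfunctional) -> category.
# Answers use the classic 5-point scale, indexed 1..5:
# 1 = Like, 2 = Expect it, 3 = Neutral, 4 = Can live with it, 5 = Dislike.
KANO_TABLE = {
    1: {1: "Questionable", 2: "Attractive", 3: "Attractive", 4: "Attractive", 5: "One-dimensional"},
    2: {1: "Reverse", 2: "Indifferent", 3: "Indifferent", 4: "Indifferent", 5: "Must-be"},
    3: {1: "Reverse", 2: "Indifferent", 3: "Indifferent", 4: "Indifferent", 5: "Must-be"},
    4: {1: "Reverse", 2: "Indifferent", 3: "Indifferent", 4: "Indifferent", 5: "Must-be"},
    5: {1: "Reverse", 2: "Reverse", 3: "Reverse", 4: "Reverse", 5: "Questionable"},
}

def classify_feature(responses):
    """Return the modal Kano category across (functional, dysfunctional) answer pairs."""
    categories = Counter(KANO_TABLE[f][d] for f, d in responses)
    return categories.most_common(1)[0][0]

# Hypothetical responses for a passive-social feature: most respondents would
# *like* having it (1) but are *neutral* without it (3) -> a delighter.
recap_responses = [(1, 3), (1, 3), (1, 4), (2, 3), (1, 3)]
print(classify_feature(recap_responses))  # -> Attractive
```

The same table also separates table stakes ("Must-be": players dislike its absence but merely expect its presence) from delighters, which is exactly the distinction the analysis above was used for.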


The KANO analysis during quant validation made a strong case for "Unwrapped", a personalized review feature addressing curiosity and reinforcing the invisible community (millions have been in your shoes). Passive social, no pressh!



The Outcome:


Segmentation Redesign: The team updated social segmentation beyond friend count to segments based on actual social behavior and motivations.


Marketing Path (Launched)


"Unwrapped" launched: a yearly review that gives players a personalized summary of their gameplay (levels passed, candies collected, boosters used, highlights). It drove high social sharing rates and a measurable lift in player re-engagement.


It directly addressed the perception of isolation, showing players they were never alone.






"Crushable" launched as a strategic social moment in players' daily routines, creating shared experiences and buzz.


Both became tentpole initiatives for the marketing team.


Product Path (In Development): The product team explored an updated Teams feature based on preliminary research insights. Early reception surfaced iteration needs, and the feature continues to develop.

Why This Mattered


This research challenged fundamental assumptions the team had operated on.


By redefining how the team understood and measured social behavior, the research:

  • Prevented investment in the wrong solution (just adding more friends)

  • Overhauled the segmentation: segments were rebuilt around friend count plus actual interactions, through a Data Science and UXR partnership that identified meaningful metrics for an intuitive segmentation.

  • Uncovered a paradox: millions of players felt alone

  • Split the work into two strategic, collaborative paths addressing different dimensions of the problem

  • Led to two successful marketing launches (Candy Unwrapped, Crushable)

  • Informed ongoing product iteration (Teams feature)

My learnings


Through countless meetings, workshops, and artifacts, this project taught me the power of reframing.

Challenge how the team sees the problem, and the right solutions become obvious.


The team shifted from "How do we make players more social?" to "What do players actually need from social in this context?"


That reframe led to an intuitive segmentation strategy (not vanity metrics) and data-backed decisions, launched two successful products, and created a framework the company still uses, shared widely at King's internal UX conference.


Most importantly, I learned that research doesn't just answer questions. Sometimes it helps teams ask better ones.



©2022 by Arpita Chandra
