
Advanced survey techniques: what we use at Typeform and why it works for us

We interviewed Typeform Data Analysts Joran van Heugten and Francesco Mora to take a deeper dive into the world of survey analysis. Jump into the minds of our internal experts and walk away with proven techniques you can employ on your next project.

Composing a strong survey is hard. 

There, we said it. And not to toot our own horn, but we’re pretty much the OG of great surveys. We know just how much work goes into getting them right. 

Because here’s the thing: You need to know exactly what information you want to collect before you send your survey out. Otherwise, by the time you realize you should have asked a different question, that ship has already sailed.

In this guide, we break down: 

  • How to create more effective surveys 

  • What makes survey data different from other kinds

  • When you should run a survey

  • What to do if your qualitative data tells a different story than your quantitative data

Survey data is brilliant—but quirky

Even data experts struggle with collecting and analyzing survey data. There are a few reasons for that: 

Strong questions are tricky to craft

Survey data is only ever as good as your initial questions. If you ask bad questions, you get bad data. 

For instance, let’s say you ask people, “How do you feel when you wake up on the weekends and on Mondays?” 

Chances are, most of us feel pretty peppy on Sunday morning…and maybe less so on Monday. But because you lumped two different time frames into a single question, your data will be all over the place. You won’t have usable information about your respondents.

Weird things (like question order) affect data in unexpected ways 

As data analysts, we don’t always anticipate how odd people are. For instance, they’re far more likely to complete your form if you ask them for their email address upfront. Why? Honestly, we have no idea. But it’s true! 

Want to learn more about how to increase the number of respondents you get with each survey? Check out these research-driven tips to capture more leads with typeforms.

Writing good surveys takes psychological insight

To quote conversational analyst Elizabeth Stokoe, “Good answers are built from good questions. Every single time.” And good questions take a solid understanding of behavioral psychology. 

For example, let’s think about trust. People are more likely to complete a survey if you use direct, transparent language. Typeforms with to-the-point wording on the Welcome Screen (e.g., “Join our newsletter”) get 5% more responses than those that start with a question (e.g., “Are you interested in our exclusive newsletter?”).

Or, consider the widely observed psychological phenomenon of “mirroring.” People tend to mimic each other in social situations without even realizing it. This behavior plays out in surveys, too. For instance, if you use upbeat wording in your survey, people are more likely to respond positively. 

Here’s how we sidestep these challenges and build surveys that give us the rich, nuanced data we need.

Four steps to an advanced survey strategy

If you want to gather juicy insights about your customers and prospects, work your way through these four steps:

Step 1: Plan

When surveys go wrong, this step is usually the culprit. If you don’t plan effectively, you neglect to ask essential questions. Here are a few tips for planning your surveys: 

Begin with the end in mind 

The key to great survey results is starting with a clear hypothesis. Every question you ask should get you closer to confirming or rejecting that hypothesis.

For example, if you want to know if people like a new feature, you need to make sure that: 

a) The people you survey have tried the new feature. You’d be amazed how many people forget this part! 

b) You don’t ask leading questions like, “How much do you love [feature]?” These questions skew your data since they guide your respondents to respond a certain way.  

c) People can tell you what they want to tell you, not just what you want to know. If you provide a multiple-choice range, include an “Other” option or an open-text box to let people share their thoughts. This allows them to provide input even if they don’t fall into one of the neat buckets on your multiple-choice list.

Step 2: Build 

If you’re just getting started with surveys, check out Survey School 1: Forms & Questions. But if you’re ready to level up, here are a few best practices: 

Take full advantage of the potential for rich, qualitative data.

Surveys aren’t just about asking yes/no questions. They’re also a great tool for learning how your customers and respondents feel—the subjective, emotional stuff that’s hard to capture any other way. 

You can use surveys to spot patterns and correlations, understand the why behind a particular sentiment, and gather recommendations to enrich your product and marketing strategies. 

So don’t limit yourself to the basics. Get a little creative. 

For instance, you might ask a long-form text question instead of a “rate your experience from 1 to 5” question. Then, use AI tools to run a sentiment or text analysis to spot patterns in the answers. You may get a more nuanced and useful data set. 
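
If you want to experiment with this yourself, here’s a rough idea of what that analysis can look like. This is just a minimal Python sketch using NLTK’s open-source VADER sentiment scorer; the CSV file and column names are made up for illustration, and your own export and tooling will differ.

```python
# Minimal sketch: bucket open-text survey responses by sentiment with VADER.
# "survey_responses.csv" and "open_text_answer" are hypothetical names.
import csv
from collections import Counter

import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

buckets = Counter()
with open("survey_responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        text = (row.get("open_text_answer") or "").strip()
        if not text:
            continue
        # compound score runs from -1 (very negative) to +1 (very positive)
        score = analyzer.polarity_scores(text)["compound"]
        if score >= 0.05:
            buckets["positive"] += 1
        elif score <= -0.05:
            buckets["negative"] += 1
        else:
            buckets["neutral"] += 1

print(buckets)  # tallies of positive / neutral / negative responses
```

A lexicon-based scorer like VADER is crude, but it’s often enough to see whether the open-text answers lean positive or negative before you read them in depth.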

Want a few more tips for asking better survey questions? Check out Survey School 2: How to maximize survey response rates.

Check your bias 

If you want usable data, you can’t steer your respondents toward the answers you hope to see. Don’t ask people, “How excited are you to work with us?” (or the equivalent). Instead, try: 

  • Use the Randomize feature in Typeform, which mixes the order of the options in a multiple-choice block. Otherwise, your data may skew toward the selections at the top because some people won’t read the complete list.

  • Ask the same question in a few different ways to compensate for any single poorly worded question. Say you ask respondents a yes/no question like, “Is feature X helpful?” Then ask a separate multiple-choice question like, “Have you used feature X for: Option 1, Option 2, Option 3, Other.”

  • Use neutral language to frame your questions. Typeform has an “Optimize text” feature that automatically suggests better wording so you don’t accidentally influence your respondents’ answers. Here’s a guide to getting started with AI in Typeform.

Step 3: Distribute 

Once your survey is ready to go, you need to distribute it to your target audience.

Make sure you’re sending it to the right people 

Pause to consider your recipient list before hitting send. It’s tempting to drum up as many respondents as possible, but more data is not always better data. Instead, target a representative sample, like current customers who fit your ideal customer profile (ICP), prospects who entered your funnel in the past 60 days, or customers who recently churned. The more focused the group, the more helpful the data. 

Give your survey purpose and context 

If you want people to respond to your survey, you need to provide some context and clarify the “what’s in it for them?” 

Consider this example from Quartz. They originally promoted their survey with a rather generic, two-sentence pitch.

They saw an underwhelming 0.41% click-through rate. 

But then they promoted it differently via their newsletter email, and the new wording and timing made a big difference in their results.

“We used a more personal tone, provided some context, and led with a more answerable question (‘Hey, I know the answer to this!’). Plus, we sent it on a Saturday when people actually had the time to do the requested action,” said Mia Mabanta, Director of Marketing at Quartz. “We saw a 4.48% click-through.” That’s an 11x improvement from their first distribution effort.

Use incentives strategically 

From a statistical point of view, the bigger the sample size, the better—assuming, as we said earlier, that your sample is representative of your target market. You usually can’t learn much or draw sound conclusions from six survey responses, although it’s probably still better than nothing. 
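
To put a rough number on that intuition, here’s a quick back-of-the-envelope calculation using the standard margin-of-error formula for a proportion (plain statistics, not a Typeform feature). Notice how slowly the uncertainty shrinks as the sample grows:

```python
# The 95% margin of error for an observed proportion shrinks roughly with the
# square root of the sample size, which is why six responses rarely prove anything.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an observed proportion p with n responses."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (6, 30, 100, 400):
    print(f"n = {n:>3}: ±{margin_of_error(n):.1%}")
# n =   6: ±40.0%
# n =  30: ±17.9%
# n = 100: ±9.8%
# n = 400: ±4.9%
```

Even at 100 responses, any single percentage you report could be off by close to ten points in either direction.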

Consider offering an incentive for people to complete a survey for you. Though, think carefully about how you do this. For instance, if you offer a cash reward for completing a survey, there’s a good chance that people who’ve never used your product will respond just for the payout. 

Instead, link your incentive to your product or service. Think: 

  • A discount on complementary products or services

  • A free month of your subscription service 

  • Acknowledgment in your blog or newsletter 

  • Early access to new features 

  • A helpful resource, like a free guide or tutorial series

Step 4: Collect and analyze

Your responses are in, and you have enough respondents to draw statistically significant conclusions. A word of caution: Don’t dive right into the analysis. Instead, make sure you:

Clean your data thoroughly 

You’ve heard the expression “garbage in, garbage out,” right? Survey data is no exception. We see too many people get enthusiastic and jump right into analyzing what they’ve collected. Then they start looking at the results and say, “Oh, wait. We made these three questions mandatory, and 92% of people just chose the first option for all three of them.” 

Don’t be that person. Check your data and remove outliers. We’ve got a guide for how to do it here: Survey School 4: Survey Analysis 101.
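
If your responses live in a spreadsheet export, even a few lines of code can catch the worst offenders before you start analyzing. Here’s a minimal sketch using pandas; the file and column names are hypothetical, and the check simply flags respondents who picked the first option on every mandatory question, like in the example above.

```python
# Minimal cleaning sketch: flag "straight-liners" who chose the first option
# on every mandatory multiple-choice question. Column names are hypothetical.
import pandas as pd

df = pd.read_csv("survey_export.csv")

mandatory_cols = ["q1_choice", "q2_choice", "q3_choice"]
first_option = "Option 1"

# True for respondents who gave the first-listed answer to all three questions
straight_lined = (df[mandatory_cols] == first_option).all(axis=1)

print(f"{straight_lined.mean():.0%} of respondents straight-lined the mandatory questions")

clean_df = df[~straight_lined]   # keep a cleaned copy for analysis
flagged_df = df[straight_lined]  # set the flagged rows aside for review
```

From there, you can review the flagged rows by hand instead of silently dropping them.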

Remain neutral when assessing answers

Carry the neutrality you practiced when crafting your questions into this phase, too. If you skew your analysis to prove your assumptions, you’re not doing yourself or your business any favors. It might be a good idea to bring in a neutral third party to lead your data analysis. If that’s not possible, try asking yourself questions to reduce your bias, like:

  • What did I assume before I looked at this data? 

  • What evidence might disprove my assumption? 

  • What do I see here that I didn’t expect? 

When qualitative and quantitative data don’t see eye to eye

This happens more than you might expect—what your customers say and what you see them doing just don’t line up. For instance, you might think a feature is popular because you see so many customers using it. But when surveyed, they tell you they hate that feature. 

It’s a bit like a person who hates their car, but still drives it every day. There’s a big discrepancy between the sentiment and the behavior. 

So, what do you do here? 

In most cases, we recommend using the qualitative data you gather from your survey as a broad overview. It provides a framework that shows you the right questions to ask in future behavioral analysis. Then, use quantitative data (like usage data or A/B testing) or additional qualitative research to get into deeper analysis. 

To get to the bottom of the scenario above, here’s what you’d need to ask: 

a) Is it actually true that everyone hates that feature, or did I skew the results because of how I framed the question? Was it just that “I hate the feature” was at the top of the multiple-choice list? 

b) If they do hate the feature, why? This might be a good opportunity to do some in-depth interviews with customers.

c) Why does everyone use that feature if they hate it? Would they be happier with your product if there were an alternative way of doing the task?

The answers to these questions might prompt another survey, or you could consider A/B testing your user interface (UI) to determine the issue.
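
If you do go the A/B testing route, reading the result doesn’t require fancy tooling. Here’s a hedged sketch of a standard two-proportion z-test comparing task-completion rates between the current UI and an alternative flow; the numbers are made up purely for illustration.

```python
# Two-proportion z-test on task-completion rates for UI variant A vs. B.
# The counts below are invented for illustration only.
from math import erf, sqrt

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Return (z, two-sided p-value) for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal-approximation p-value
    return z, p_value

z, p = two_proportion_z_test(success_a=180, n_a=400, success_b=228, n_b=400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value suggests the difference in completion rates is unlikely to be noise; a large one means you probably need more traffic or a bolder change before drawing conclusions.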

To get better at surveys, do more surveys 

As with so much in life, practice makes…better. To create more effective surveys, you need to launch more of them.

Eventually, you develop a process you can refine over time. You learn the questions that bear the most fruit and the wording you should avoid. You discover the best ways to get more responses and how to create surveys people are eager to answer.  

If you’d like help, our Survey School series on this Tips resources page is a great place to start. We walk you through the basics of creating surveys, asking better questions, analyzing your results, and finding helpful, compliant tech to do all of this. Take a look.
