How to set up a post-module survey (with sample questions)

The best feedback you'll ever get from a buyer isn't an Instagram DM, and it isn't the post-course 1:1 either. It's the survey they fill out right after finishing a module, while the content is still fresh and the impressions are concrete.

Here's something many creators don't know: the difference between a creator with a 15% completion rate and one with 65% isn't content quality. It's that the second creator always knows where buyers get stuck, because they collect surveys and then adapt.

In this guide I'll show you how to set up a post-module survey in audienced, what questions to ask, and how to turn the results into concrete improvements.

Difference between a survey and a quiz

In audienced these are two separate lesson types.

  • Quiz — knowledge check, has correct and wrong answers, a score, a pass threshold.
  • Survey — opinion gathering, no right answers, no score, ideal for feedback.

A survey is ungraded by design. The buyer fills it in, you see the answers, nobody is labelled "wrong". That's why people answer more honestly.

Why a per-module survey, not a post-course survey

A lot of creators make the mistake of putting one survey at the very end of the course. Problems:

  • The buyer is already out of the flow, motivation to answer drops.
  • Memory of module 1 is fuzzy after 4 weeks.
  • You can't course-correct anymore — the survey arrives when the course is done.

A survey after each module (or every other module):

  • The buyer is still warm, answers more honestly.
  • Memory is fresh.
  • You spot patterns early and fix lessons in current modules before the whole course ends.

Step 1: add a survey as the last lesson in the module

In the course editor open the module you want a survey for. Click Add lesson → pick type Survey.

Set:

  • Title: "Feedback: Module 1 — Instagram basics".
  • Description: "Before the next module, give me a minute of your time. Your answers directly shape the next modules."
  • Mandatory completion: yes or no. If yes, the user can't reach module 2 until they complete it. Recommended for the main feedback survey, not every module.
  • Anonymity: optional — whether answers are tied to the user or anonymous.

Step 2: add questions

audienced surveys support the same types as quizzes, just without correct answers:

  • Single choice.
  • Multi-select.
  • Likert scale (1–5, 1–10).
  • Open-ended.
  • NPS (Net Promoter Score) — 0–10 with automatic segmentation.

My recommended 5-question "standard" template per module

Q1: Likert 1–5

How clear were the concepts in this module? 1 = very unclear, 5 = completely clear

Q2: Likert 1–5

How useful do you find what you learned? 1 = not useful, 5 = extremely useful

Q3: Single choice

Which lesson was most valuable to you? [list of module lessons]

Q4: Single choice

Which lesson was unclear or didn't work? [list of module lessons + "All were clear"]

Q5: Open-ended

What would you add or change in this module?

Five questions, about 90 seconds to fill out, and the key information is on the table.
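If you like keeping the template somewhere reusable, here's a minimal data sketch of the five questions above. This is not an audienced API or import format — the field names are my own shorthand for a personal checklist:

```python
# Illustrative sketch of the 5-question module-feedback template.
# Field names ("type", "text", "options") are my own shorthand,
# not an audienced data format.
MODULE_SURVEY_TEMPLATE = [
    {"type": "likert_1_5", "text": "How clear were the concepts in this module?"},
    {"type": "likert_1_5", "text": "How useful do you find what you learned?"},
    {"type": "single_choice",
     "text": "Which lesson was most valuable to you?",
     "options": "<list of module lessons>"},
    {"type": "single_choice",
     "text": "Which lesson was unclear or didn't work?",
     "options": "<list of module lessons + 'All were clear'>"},
    {"type": "open_ended", "text": "What would you add or change in this module?"},
]

for q in MODULE_SURVEY_TEMPLATE:
    print(q["type"], "-", q["text"])
```

Swap in your own lesson lists per module; the question wording stays the same course-wide so results are comparable between modules.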

NPS question — every third module

After every third module (or mid-course, or at the end) add:

How likely are you to recommend this course to a friend with similar needs? 0 (never) — 10 (definitely)

NPS result:

  • 9–10 = Promoters (actively recommend).
  • 7–8 = Passives (satisfied, but won't actively recommend).
  • 0–6 = Detractors (dissatisfied, potentially negative word of mouth).

NPS = % promoters − % detractors. For an online course:

  • Below 30 — needs major improvement.
  • 30–50 — average.
  • Above 50 — excellent.
  • Above 70 — elite.
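The formula and the bands above are simple enough to sanity-check yourself if you ever export raw scores. A minimal sketch in Python, assuming you have a plain list of 0–10 ratings (audienced computes this for you in the admin, so this is purely illustrative):

```python
def nps(scores):
    """Net Promoter Score from a list of 0-10 ratings:
    % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses yet")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def nps_band(score):
    """Rough interpretation bands for an online course."""
    if score > 70:
        return "elite"
    if score > 50:
        return "excellent"
    if score >= 30:
        return "average"
    return "needs major improvement"

# 10 responses: 6 promoters, 2 passives, 2 detractors
scores = [10, 9, 9, 8, 7, 10, 6, 9, 10, 3]
print(nps(scores), "->", nps_band(nps(scores)))  # 40 -> average
```

Note that passives (7–8) count toward the total but toward neither side, which is why a course full of "satisfied but not enthusiastic" buyers can still land at an NPS near zero.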

Step 3: review results in admin

In Courses → your course → Surveys you see per survey:

  • Number of responses.
  • Average score for Likert questions.
  • Distribution for single choice (pie chart).
  • List of open answers.
  • NPS calculation.
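These aggregates are all the admin computes for you, but they're easy to recompute if you export raw responses and want to slice them differently. A minimal sketch, assuming each response is a dict (the field names here are assumptions, not audienced export columns):

```python
from collections import Counter
from statistics import mean

# Hypothetical raw export: one dict per respondent.
# Keys ("clarity", "usefulness", "best_lesson") are illustrative only.
responses = [
    {"clarity": 5, "usefulness": 4, "best_lesson": "3.1"},
    {"clarity": 4, "usefulness": 3, "best_lesson": "3.1"},
    {"clarity": 3, "usefulness": 3, "best_lesson": "3.2"},
]

# Average score for a Likert question
avg_clarity = round(mean(r["clarity"] for r in responses), 1)

# Distribution for a single-choice question
distribution = Counter(r["best_lesson"] for r in responses)

print("avg clarity:", avg_clarity)           # 4.0
print("best lesson votes:", dict(distribution))
```

The same pattern works for comparing cohorts: filter the list first, then rerun the aggregates.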

An example of what I do with results from Laura's course, Zadnja dieta:

Module 3 survey shows an average usefulness of 3.2/5, while other modules sit at 4.5+. Open answers keep saying: "Lesson 3.2 is too fast, too much info at once."

Action: Laura records a longer replacement video for lesson 3.2, split into two parts. The following month, module 3's average usefulness jumps to 4.4.

Without the survey, she wouldn't have known. Nobody mentioned it in DMs; they were in the flow and simply skipped ahead.

What to do with NPS segments

Promoters (NPS 9–10)

  • Send a testimonial ask: "Thanks for the 9/10. Would you write 2 sentences for our page?"
  • Affiliate invite: "Your natural network would appreciate this. You'd earn 30% commission on referrals."
  • Upsell: "Your next step is [premium programme]. Promoters get 20% off."

Passives (NPS 7–8)

  • Ask why not a 10: "Thanks. What would need to be different for a 10?"
  • Act on the feedback if you see a pattern.

Detractors (NPS 0–6)

  • Immediate personal DM or email: "I can see this wasn't what you hoped for. Could I get 10 minutes with you to understand?"
  • Don't try to convince. Listen and either fix the content or offer a refund. Both build reputation.

The audienced option to filter users by NPS score lets you target each segment in a single click.

Common mistakes when building surveys

Too many questions

A 15-question survey has a 20% completion rate. A 5-question one has 85%. Less is more.

Closed questions without "Other"

Always offer "Other: ________". That's where the most valuable insights hide.

Leading questions

"How much do you love this incredible lesson?" → biased. "How useful was it?" → neutral.

Surveying too often

After every module is too much. Every other module (or after key course phases) is the sweet spot.

Ignoring the results

Collecting without acting is wasted time. Set yourself a rhythm: once a month, 30 minutes of reviewing results, pick 1–2 concrete improvements.

Frequently asked questions

How many people actually fill in the survey?

With good copy and a mandatory flag: 70–85%. Without mandatory: 30–50%. A post-course survey without incentive: 15–25%.

Should I offer an incentive?

You can (discount on the next product, exclusive bonus content), but you risk getting "all 5s" answers without substance. I recommend no — clearly communicate how the results shape the course instead.

How long should the survey be?

60–120 seconds to complete. 5–7 questions.

Should I run surveys on free courses (freebies) too?

Yes. One of the most valuable sources of insight for your main paid product.

Does audienced offer predefined survey templates?

Yes. In the Template library we have 5 proven templates: Module feedback, End-of-course, NPS, Onboarding check, Pre-launch survey.

Can I see survey results per customer group?

Yes. In the admin you filter by signup date, paid plan, challenge cohort. Useful when comparing different course versions.

What about qualitative answers — do I have to read them all?

No. Read the first 20 by hand to catch patterns. Then at higher volumes use AI — audienced has an AI summary feature that summarises 1,000 answers into 10 key themes.

Closing thoughts

A post-module survey isn't an admin function. It's the most direct channel from the buyer's head into your decisions.

Creators who grow don't build courses on gut feeling. They ship a first version, collect surveys, iterate. Version 2 is better. Version 3 dominates. Without surveys, you stop at version 1.

If your current course doesn't have a single module survey, I suggest adding one this week. 20 minutes of work, disproportionate impact.

Try audienced 14 days free