When new users sign up, they’re not looking to “fill out a form”; they’re trying to get something done.
They’re not just exploring; they’re evaluating. They’re asking: “Is this product right for me?”
The faster we understand this, the faster we can help them succeed. That means asking the right questions at the right moment.
At involve.me, we built a dynamic onboarding survey for our product. And what started as a simple data-collection step has evolved into a personalized entry point that helps us match users with the right use cases, shorten time-to-value, and improve segmentation and automation.
All without overwhelming users.
This article shows how we structured our onboarding survey, what we learned from A/B testing it, and how to avoid common pitfalls. If you’re looking for new ways to onboard your users and predict customer retention, this is a playbook you can adopt today.
The Goal of the Onboarding Survey
There are two main goals for giving an onboarding survey to new SaaS users.
First, it helps personalize the user's experience by gathering insights into their needs and profile. And that’s why we built a smart onboarding survey directly into our product experience at involve.me. It doesn’t just collect data; it kickstarts personalized journeys and gives us insight into what they need before they ask.
Second, it provides the SaaS company with valuable data that can influence product and marketing decisions and predict customer retention based on the survey responses.

One of the questions in our onboarding survey
Let me walk you through the onboarding survey that involve.me’s new customers see when they create a free account and access the editor for the first time. The survey consists of three questions:
1. What is your industry?
2. What’s the size of your company?
3. What goal would you like to achieve with involve.me?
Answers to these questions help our SaaS team with the following:
1. Personalize Customer Experience
Based on the answers to the questions about the customer's industry and current needs (whether it's lead generation or product recommendations), we offer a set of templates (forms, quizzes, surveys, calculators) that directly address those needs. For users who want more flexibility or have very specific requirements, we also offer the option to build a funnel from scratch or generate one with AI.
While new users can explore the templates on their own, it's always better to provide some guidance. According to statistics from 2024, a poor onboarding experience in the software industry results in a user drop-off rate of 40-60% after sign-up. Redirecting new customers to relevant content and features can significantly increase retention rates.
“76% of customers are likely to continue using a service if they have a positive onboarding experience.”
How did we create such personalization in the product?
To create a pop-up survey with onboarding questions, we used our own product: the involve.me funnel builder.
Instead of having the "Thank You" page as the final step, we set a custom URL for each specific industry. This means that participants are redirected to the configured URL upon completing the survey. For projects with multiple outcomes, we set different URLs for each outcome, ensuring tailored content for each new user:

involve.me's custom redirect
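If you want to replicate this pattern outside of a funnel builder, the underlying logic is just a lookup from survey outcome to destination URL. Here is a minimal sketch of that idea in Python; the outcome names and URLs are hypothetical placeholders, not our actual configuration:

```python
# Map each survey outcome (e.g., the user's industry) to a tailored
# destination URL. Outcome names and URLs here are illustrative only.
OUTCOME_REDIRECTS = {
    "marketing_agency": "https://app.example.com/templates/marketing",
    "e_commerce": "https://app.example.com/templates/ecommerce",
    "education": "https://app.example.com/templates/education",
}
DEFAULT_REDIRECT = "https://app.example.com/templates"

def redirect_url(outcome: str) -> str:
    """Return the tailored URL for a survey outcome, with a safe fallback."""
    return OUTCOME_REDIRECTS.get(outcome, DEFAULT_REDIRECT)
```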
2. Improve the Marketing Strategy
Learning about our customers' industries and goals helped our marketing team identify the best strategies to provide more value and attract more users to create accounts on involve.me.
We began by analyzing survey responses and comparing them to customer histories. This helped us determine which customers, from which industries, stayed with us the longest and who fit our Ideal Customer Profile (ICP). Based on this data, we expanded our template library, wrote blog articles featuring our solutions (form and survey templates serving different goals), and created industry-specific landing pages.
While our SaaS product is customizable and can be used across all industries, we created personalized content for our ICPs from different sectors. We used this approach to show our customers how creatively they could use involve.me.
One example is the Social Media Hashtag Generator template for marketing agencies (it's a customizable template; get it here):
3. Feature Planning
involve.me is a feature-rich and multi-industry product. The data from the onboarding survey, where customers share how they plan to use the product, helps our SaaS team prioritize which features to develop next.
For example, when we discovered that many customers use involve.me specifically for lead generation, we introduced email validation options. These options help ensure high-quality leads by preventing participants from submitting invalid emails.
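To illustrate what email validation can look like under the hood, here is a hedged sketch of a two-step check: a syntax test plus a DNS lookup to confirm the domain can actually receive mail. This is not involve.me's actual implementation, and it relies on the third-party dnspython package:

```python
import re
import dns.resolver   # third-party: pip install dnspython
import dns.exception

# A pragmatic (not RFC-complete) pattern to reject obviously bad syntax.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def looks_deliverable(email: str) -> bool:
    """Reject emails with invalid syntax or a domain that has no MX records."""
    if not EMAIL_RE.match(email):
        return False
    domain = email.rsplit("@", 1)[1]
    try:
        # If the domain publishes at least one MX record, it can receive mail.
        return len(dns.resolver.resolve(domain, "MX")) > 0
    except dns.exception.DNSException:
        return False
```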
From Guesswork to Growth: A/B Testing Our Onboarding Survey
We didn’t get our onboarding survey right on the first try. However, through A/B testing, automation, and key lessons, we turned a simple survey into a scalable growth lever.
Before settling on our current onboarding survey, we tested several variations, including copy, question order, and the number of steps. This section outlines how we used A/B testing to determine what worked and what didn't.
Business & User Context
➡️ Why did we decide to revisit the onboarding survey at all?
We aimed to increase the onboarding survey's conversion rate and shorten its time to completion, while also improving user activation and customer retention.
➡️ What specific user or revenue problem were we seeing (e.g., drop-off rate, low completion, poor targeting)?
We had launched the previous version of the onboarding survey a year earlier. Although we saw improvements after that launch, the numbers plateaued over the following months: the survey's conversion rate and drop-off rate were flat, and the subscription rate was constant.
This was the moment we started to consider trying something new.
Hypothesis & Success Metrics
➡️ What did we learn from our previous onboarding surveys?
Our onboarding survey has evolved since the very beginning of the product. Initially, the survey had six questions, each on a separate page. When we decided to shorten the survey to just three relevant questions, we saw an 11% increase in responses.
This change made sense, as new customers need to first get to know the product and decide if it’s right for them before sharing a lot of information. More established customers who like the product and have used it extensively are more willing to share insights that can influence future features.
Before, our onboarding survey covered the whole screen, and some users left the product without even trying it rather than complete the survey first. Now the survey is integrated into the product: users can see the involve.me dashboard behind the pop-up, which helps them understand that they are already in the product.

pop-up survey in the product
Based on our latest refinements of the onboarding survey, we wanted to test what we could improve to help users experience the value more quickly.
Our core hypothesis was that if we offered users more ways to create a first funnel closer to their needs, they would experience the product's value faster, subscribe sooner, and stay with us longer.
➡️ Which KPIs did we declare in advance as “north-star”?
Our north-star KPI was subscription rate.
A subscription means onboarding was successful: all our users start with a free account, and if the product matches their needs, they become paying customers.
This KPI is actionable, directly linked to growth, and accurately measures whether onboarding successfully sets user expectations and value.
Experiment Design
We used involve.me's own A/B testing feature to test the onboarding survey. Since we had already validated the questions, we only tested the last step of the survey, which lets users choose how they want to start their funnel.
➡️ What exact changes separate variants A and B in terms of copy, number of questions, layout, and logic jumps?
Variation A focused on the possibility of starting with templates that were aligned with the intention set at the beginning of the survey.
Variation B offered the possibility of starting with a template or from scratch.
(In both variations, users could use AI to build their funnels.)
At involve.me, A/B tests are configured with a 50/50 traffic allocation, using random assignment at the point of user entry. No audience segmentation or pre-filtering is applied, ensuring that all users are exposed to either variant A or B under identical eligibility criteria.
This guarantees statistically valid comparisons across the entire user population.
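For teams building assignment themselves, a common way to get a stable 50/50 random split is to hash the user ID, so the same user always lands in the same variant across sessions. A minimal sketch of that approach (not involve.me's internal code):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "onboarding-final-step") -> str:
    """Deterministically bucket a user 50/50 into variant A or B.

    Hashing (experiment + user_id) keeps assignment stable across sessions
    and independent between different experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Example: the same user always gets the same variant.
print(assign_variant("user_12345"))
```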
➡️ How long did the test need to run to hit sample-size targets, and what calendar events could bias results?
We ran the A/B test for two months, as we also wanted to observe how users behaved after their first month, even though it took only one month to hit the sample-size targets.
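If you prefer to compute the required sample size up front rather than watch the dashboard, the standard two-proportion power calculation does the job. Here is a sketch using statsmodels; the 20% baseline rate and the 3-point expected lift are illustrative assumptions, not our actual figures:

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.20   # illustrative baseline conversion rate (assumption)
expected = 0.23   # illustrative rate we hope the variant reaches (assumption)

effect_size = proportion_effectsize(expected, baseline)  # Cohen's h
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,        # 5% false-positive rate
    power=0.80,        # 80% chance of detecting the lift if it exists
    alternative="two-sided",
)
print(f"~{n_per_variant:.0f} users needed per variant")
```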
Implementation & QA
➡️ Which tools did we use for bucketing and data capture (e.g., Optimizely, GTM, in-house)?
Once again, we stuck to what involve.me offers: its built-in analytics dashboard, which is easy to read and reflects changes in real time.
After each month, we used the AI insights feature from involve.me analytics, which helped us quickly understand how both variations were performing.

The data displayed in this screenshot is not representative of our onboarding survey
Then, we exported the data for more in-depth insights and to make further correlations.
Note: You can export and download reports as XLS (Excel) or CSV files. The XLS file contains detailed data about every participant, with all questions and answers in a separate sheet called "Detailed Report." In contrast, the CSV file only contains basic participant data, including personal information and key metrics.
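Once exported, the data is easy to crunch programmatically. A minimal sketch using pandas, assuming hypothetical participant_id, variant, and completed columns (the actual columns in your export depend on your project setup):

```python
import pandas as pd

# Column names below are hypothetical; adjust them to match your export.
df = pd.read_csv("onboarding_survey_export.csv")

summary = (
    df.groupby("variant")
      .agg(participants=("participant_id", "count"),
           completion_rate=("completed", "mean"))
)
print(summary)
```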
Data Analysis
➡️ How did we handle segmentation and multiple-comparison corrections?
The platform provides a comprehensive view that allows you to compare the performance of both versions directly and observe the differences in user interaction with each version.
While involve.me offers a pretty good A/B testing feature to compare two versions of a funnel against each other and determine which one performs better, the feature is obviously not as advanced as dedicated A/B testing tools.
For instance, the tool does not currently allow setting segmentation and making multiple-comparison corrections. The A/B tests offer a 50/50 traffic allocation, using random assignment.
➡️ What secondary insights surfaced (e.g., impact by user cohort, device type)?
Secondary insights are also limited. Once the test is running, you can track and analyze the following metrics: Visits, Starts, Leads, Submissions, Completion Rate, and Avg. Duration.
Business Impact & Next Steps
➡️ How does the lift (or lack thereof) translate into annualized revenue, retention, or other strategic gains?
Interestingly, in the first month both variations performed almost identically. However, when we examined the subscription rate and how users interacted with the product, we found that one variation performed slightly better.
After the second month we had a winner, and the same variation continued to gain ground.
➡️ What follow-up tests or feature roll-outs are planned based on this outcome?
As next steps, we aim to enhance our multi-channel onboarding process to work more smoothly with the onboarding survey.
From a feature perspective, the AI features helped users speed up funnel creation.
Lessons
➡️ What surprised us most, and what prior assumptions were challenged?
We realized that starting from scratch (from a blank canvas) or with AI outperformed the option to start with one of the suggested templates.
While our templates cover a wide range of use cases and can technically save a lot of creation/design time, we noticed that some users have a very specific idea in mind. For them, selecting one of the two template options and editing it would often mean more work than creating the funnel from scratch or having the AI generate it.
➡️ What can other SaaS teams replace or avoid?
The 360-degree approach we employed, combining the onboarding survey with cross-checked metrics, was key to determining the winning version without fixating on vanity metrics. After we picked the winning variation, the conversion rate improved even further.
We encourage other teams to keep these three angles in mind (a small sketch after the list shows how to combine them):
Onboarding survey metrics (conversion rate, completion rate, drop-offs)
Business KPIs (activation rate, retention, churn, MRR)
Customers' KPIs (the performance of the funnel: active funnels, submissions, completion rates, visitors/leads/submissions)
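One way to put this cross-check into practice is to join the per-variant survey metrics with the downstream business KPIs before declaring a winner. The sketch below illustrates the idea; all column names and figures are hypothetical:

```python
import pandas as pd

# Hypothetical per-variant rollups: survey metrics from the A/B export,
# business KPIs from your billing/analytics data, keyed by variant.
survey = pd.DataFrame({
    "variant": ["A", "B"],
    "completion_rate": [0.74, 0.72],      # illustrative numbers
})
business = pd.DataFrame({
    "variant": ["A", "B"],
    "subscription_rate": [0.041, 0.048],  # the north-star KPI (illustrative)
})

combined = survey.merge(business, on="variant")
# Pick the winner on the north-star KPI, not on survey metrics alone.
print(combined.sort_values("subscription_rate", ascending=False))
```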
How We Embedded the Onboarding Survey into the Product
We created the onboarding survey using the involve.me tool. Other SaaS companies can similarly create a survey with their branding elements and get a unique embed code that can be added to their website or product. It's possible to control the survey's trigger and timing within the pop-up settings in the involve.me editor:

An example of setting up survey triggers
To create a similar survey for your customers, you can start from one of our pre-designed onboarding survey templates:
Onboarding Survey Integrations
We integrated the onboarding survey with Brevo, our CRM tool, to enhance our marketing automation. Based on the survey responses, we send out a personalized welcome email a few minutes after signup, featuring three recommended templates. This immediate personalization helps new users feel understood right from the start.
A few days later, we follow up with another email promoting blog articles related to the customer's industry. This strategy keeps users engaged and helps them see the value in our service.
An example of the lead nurturing campaign (in Brevo) based on the survey responses
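involve.me provides this as a native integration, but if you were wiring it up yourself, the core step is upserting a contact with the survey answers as attributes, so the CRM's automations can branch on them. Here is a sketch against Brevo's v3 contacts API; the attribute names are hypothetical and would need to be created in your Brevo account first:

```python
import os
import requests

BREVO_API_KEY = os.environ["BREVO_API_KEY"]

def sync_survey_response(email: str, industry: str, company_size: str, goal: str) -> None:
    """Upsert a Brevo contact with survey answers stored as attributes.

    Brevo automations can then branch on these attributes to send the
    personalized welcome email. INDUSTRY, COMPANY_SIZE, and GOAL are
    hypothetical attribute names, not a predefined Brevo schema.
    """
    resp = requests.post(
        "https://api.brevo.com/v3/contacts",
        headers={"api-key": BREVO_API_KEY, "content-type": "application/json"},
        json={
            "email": email,
            "updateEnabled": True,  # update the contact if it already exists
            "attributes": {
                "INDUSTRY": industry,
                "COMPANY_SIZE": company_size,
                "GOAL": goal,
            },
        },
        timeout=10,
    )
    resp.raise_for_status()
```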
Recommendations for Other SaaS Companies
Drawing from our experience at involve.me, here are concise recommendations for other SaaS companies looking to optimize their onboarding process and use surveys effectively:
Set clear goals: Use onboarding surveys to personalize user experiences. Analyze survey responses to identify Ideal Customer Profiles (ICPs) and create targeted blog content and features that cater to specific customer segments.
Optimize survey structure: Keep surveys short and integrate them into the product interface. Your team can reduce the number of questions by implementing conditional logic, so respondents only see questions relevant to their previous answers (see the sketch at the end of this article).
Prioritize feature development: Base feature prioritization on user feedback gathered through surveys to address user needs.
Learn from mistakes: Analyze user feedback and response rates to enhance the user experience. It's possible that (in your case) adding more questions to the survey could improve personalization. Consider the needs of new customers and adjust the survey accordingly.
Automate with integrations: Connect surveys with CRM tools to enable automated, personalized follow-ups that boost user engagement. You can also connect surveys with Excel, Notion, Airtable, and other tools to ensure all team members have access to the data.
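To make the conditional-logic recommendation concrete, here is a minimal sketch of the branching idea: each question can declare which earlier answer unlocks it, so respondents only see what applies to them. The questions, ids, and conditions are invented for illustration:

```python
# Each question may carry a condition: (question_id, required_answer).
# Questions, ids, and conditions here are illustrative only.
QUESTIONS = [
    {"id": "goal", "text": "What goal would you like to achieve?",
     "condition": None},
    {"id": "lead_volume", "text": "How many leads per month do you target?",
     "condition": ("goal", "lead_generation")},
    {"id": "catalog_size", "text": "How many products do you recommend?",
     "condition": ("goal", "product_recommendation")},
]

def visible_questions(answers: dict) -> list:
    """Return only the questions whose condition is met by earlier answers."""
    shown = []
    for q in QUESTIONS:
        cond = q["condition"]
        if cond is None or answers.get(cond[0]) == cond[1]:
            shown.append(q)
    return shown

# Example: a user who picked lead generation sees the lead-volume follow-up.
print([q["id"] for q in visible_questions({"goal": "lead_generation"})])
```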