In my last post, we discussed the fundamentals of Net Promoter Score (NPS) and The Why behind NPS. Today, we want to share specific, tangible examples of what this looks like at SendGrid. When we launched our revamped Net Promoter System in April, here’s what we did.

Starting point

We thought specifically about each of the following themes and optimized the process to ensure it supported listening to, learning from, and acting on feedback from our customers in order to improve how we serve them:

- Survey execution (listening)
- Verbatim process (learning)
- Closed loop feedback process (learning and acting)
- Data analysis (learning)
- Action (acting)

The key to supporting the new process we launched this April was to begin with the end in mind. We needed a survey process that would a) capture a lot of data, b) allow us to close the loop with customers in near real time, and c) let us analyze and report on the data. We chose the tools and approach described below because they were flexible and powerful enough to support the new processes and mostly relied on tools we already employed. This worked really well for us, but it may not work for everybody.

Survey execution

We emailed our customers the survey, leveraging a beautiful and simple survey design from Knak.io. We triggered reminder emails to customers who had not responded, but sent no more than three emails per customer to avoid making them feel spammed. And, of course, we A/B tested subject lines and content variations to optimize toward what our customers responded to most. The survey data flowed into our Salesforce CRM, which allowed us to append other customer data for reporting and analysis, and to build workflows in a tool our go-to-market teams use daily. In addition to the email surveys, we created landing pages to engage with our loyal fans and deliver a segmented message to passives and detractors. We surveyed all paid customers and a random sample of free customers to establish a new baseline score.
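As a refresher on how that baseline score is computed: respondents scoring 9–10 are promoters, 0–6 are detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch (the function name is illustrative, not part of our actual pipeline):

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 survey responses.

    Promoters score 9-10, detractors 0-6 (7-8 are passives); NPS is the
    percentage of promoters minus the percentage of detractors,
    so it ranges from -100 to 100.
    """
    if not scores:
        raise ValueError("no survey responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Example: 6 promoters, 2 passives, 2 detractors out of 10 responses
print(nps([10, 9, 9, 10, 9, 9, 8, 7, 3, 5]))  # → 40
```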
As they say, “you can’t manage what you can’t measure.” We kept the survey extremely simple and pure, eschewing extra questions, as you can see below.

Verbatim process

First, “verbatims” are NPS jargon for the text comment a customer leaves in response to the question “What is the primary reason for the score you gave us?” The real customer comment above is an example of a verbatim. This data flowed into Salesforce, which allowed us to do customer reporting for analysis and alerting for our customer success teams. We created a daily Salesforce report which we pasted into Google Sheets for verbatim tagging, post-survey analysis, and to support part of the closed loop process. A small team read each customer comment and labeled it with a category. Afterward, we shared verbatims with the primary response teams for follow-up.

Closed loop feedback process

We designed our closed loop feedback process to support our 10% improvement goal and to hear both positive and constructive feedback, subject to the resources we were able to allocate to this brand new process. We began by reaching out by phone to a third of the customers who responded to the survey, attempting to make contact within 48 hours of each customer’s survey submission. For this part of the process alone, that meant mobilizing roughly a quarter of the company; you can see how important our customers are to us from that fact alone! We strategically chose to close the loop with all large customers, along with paid customers who gave us an 8, 6, 2, 1, or 0. Both Salesforce alerts and Google Sheets helped us keep this process running smoothly. As a side note, we hope to get to a place someday where we can reach out to every single respondent. Teams across the company, including Customer Success, Product, UI/UX, Support, and Marketing, reached out to customers each day.
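The selection rule above (every large customer, plus paid customers who scored 8, 6, 2, 1, or 0) is simple enough to sketch. Note that the field names below are hypothetical; our real workflow lives in Salesforce alerts and Google Sheets, not in code like this:

```python
FOLLOW_UP_SCORES = {8, 6, 2, 1, 0}  # paid-customer scores chosen for outreach

def needs_outreach(customer):
    """Return True if a survey respondent should get a closing-the-loop call.

    `customer` is a dict with hypothetical fields:
    'tier'  -- 'large', 'paid', or 'free'
    'score' -- the 0-10 NPS response
    """
    if customer["tier"] == "large":
        return True  # we close the loop with every large customer
    return customer["tier"] == "paid" and customer["score"] in FOLLOW_UP_SCORES

print(needs_outreach({"tier": "paid", "score": 6}))   # → True
print(needs_outreach({"tier": "free", "score": 0}))   # → False
```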
In addition, our Senior Leadership Team reached out to almost 100 customers and will more than triple that effort during our next survey. There are two purposes to closing the loop: 1) to learn more (e.g., what’s the root cause behind the verbatim?) and 2) to connect with the customer and say “thank you for the feedback,” or simply “sorry” when necessary. It meant the world to many customers that we would actually follow up on their comments.

Data analysis (learning)

This was one of the highlights for me. As a company with 120,000 customers of all shapes and sizes (using multiple products, in over 90 countries, in dozens of industries, from startup to enterprise, etc.), each with verbatims explaining why they gave us the score they did, there was no shortage of ways to cut the data. We analyzed tens of thousands of data points, pulled out insights, and summarized what we learned. I then presented the data and insights to our leadership team and company-wide. The data analysis and the closed loop feedback process together help us learn in two powerful ways. First, the entire company can learn from the key insights generated in data analysis (e.g., how happy are customers using product A vs. product B and why, or how do the scores of our recent and long-time customers compare and why?). Second, front line employees and senior leaders can learn from the customer interactions they have when they close the loop and attempt to learn the root causes of feedback. Side note: some may think this sounds either overly complicated or rudimentary. Yes, I’m saying that analyzing thousands of responses in Google Sheets was actually fairly smooth and manageable, thanks to the efforts of all of the Gridders who supported this process. It took a large team, but it was the right thing to do if we wanted humans to talk to customers at scale. The flexibility this provided for data analysis was a highlight.
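Cutting the data by segment, as described above (product A vs. product B, recent vs. long-time customers), boils down to grouping responses by a field and computing NPS per group. A small sketch under assumed field names (we did this in Google Sheets, not Python):

```python
from collections import defaultdict

def nps(scores):
    """NPS from a list of 0-10 scores: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def nps_by_segment(responses, key):
    """Group survey responses by a segment field and compute NPS per group.

    `responses` is a list of dicts with a 'score' plus hypothetical segment
    fields such as 'product', 'country', or 'tenure'.
    """
    groups = defaultdict(list)
    for r in responses:
        groups[r[key]].append(r["score"])
    return {segment: nps(scores) for segment, scores in groups.items()}

responses = [
    {"product": "A", "score": 10},
    {"product": "A", "score": 9},
    {"product": "B", "score": 6},
    {"product": "B", "score": 8},
]
print(nps_by_segment(responses, "product"))  # → {'A': 100, 'B': -50}
```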
However, this method isn’t for everybody, which is why great software providers (and our customers!) such as Promoter.io and CustomerGauge exist.

Action (acting, obviously!)

We took the areas that were bright spots (literally hundreds and hundreds of customers said something like “it just works” and gave us an average score of 9.5) and began leveraging this language as we speak with customers and prospects. We also took the learnings about areas to improve and integrated them into product and UI/UX planning, along with service offering planning. There were some quick fixes we addressed, and other areas we identified will take some time, as our Product team very thoughtfully approaches how we build products.

What’s next: how we’re optimizing

There’s no doubt we have room to improve in our Net Promoter System. We certainly haven’t figured everything out, but I do believe we have the fundamentals right. Here’s what we’re optimizing:

- We’re still incorporating learnings into product and service roadmaps; some of these things take time.
- As a data-driven company, we’re more deeply integrating NPS into our decision-making DNA.
- I’m working to ensure the cool stories that surface during the NPS process are shared with everyone at SendGrid, because they inspire us and touch on the human element that brings us to work with a smile every day. We’re doing a good job, but we can do even better.
- We’re implementing even deeper leadership engagement with NPS. Each of our senior leaders will speak with at least a dozen customers during October.
- We’re going to add new elements to the analysis, including time series components.
- Eventually, we’ll move to a more ongoing survey cadence.

Closing advice

We hope sharing our methodology has helped those of you who are going through a similar exercise. I can attest to the fact that working for a company that really cares about its customers is life-giving and a career-defining experience.
My hope is that SendGrid, our customers, and our SaaS peers continue to focus on The Why and The Fundamentals. Our work to satisfy our customers is never complete, but NPS allows us to better compete as our customers’ world and our industry change quickly.