<style>
  .block-quote { width: 240px; padding: 10px; margin: 5px; float: right; border: 5px solid #eee; font-size: 18px; font-weight: bold; color: #888; }
  .block-author { font-weight: normal; color: #ccc; }
</style>

<p>Verbatims from open-ended survey questions are a rich source of insight for market researchers, and a great way for your survey to tell you something you didn’t already know. But surveys often don't include them, because analyzing text responses has historically been a hassle.</p>

<p>What if coding text verbatims were fun and easy? Would we ask open-ended questions more often? Might we learn more of what the market is often very willing to tell us?</p>

<p>If you have a current survey with text verbatim responses, let us know. We're running a study you might be interested in... <!--more--></p>

<h2 id="a-typical-survey">A typical survey</h2>

<p>Look at nearly any quant market research survey. If it is like most, all or nearly all of the questions are closed-ended:</p>

<ul>
  <li>Check all that apply...</li>
  <li>Rate the following...</li>
  <li>Rank these items…</li>
</ul>

<p>OK, maybe there are some <em>numeric</em> open ends. And maybe there are a couple of “Other (specify)<strong>____</strong>” items. But still.</p>

<p>Now go back and look at the questions clients ask in a typical RFP:</p>

<ul>
  <li>“Why do customers purchase …”</li>
  <li>“What are the top strengths…”</li>
</ul>

<p>Product managers and marketers, the people who depend on insights from the research, tend to ask open-ended questions. On survey platforms where product managers design their own surveys (e.g., SurveyMonkey), you tend to see a lot of open-ended questions. But if the analyst on the hook for the analysis is involved in survey design, verbatims typically disappear quickly.</p>

<p>This is because analyzing verbatim text responses has historically been a pain.</p>

<div class="block-quote">
  "I avoid text open ends because they're a lot of work, not because they're not valuable."
  <span class="block-author">-- Senior market researcher</span>
</div>

<h2 id="quotes">Quotes</h2>

<p>A simple way to use text verbatims is to scan through them and pull out a few great quotes to add “color” to the final report, “Zagat review” style. That’s easy enough.</p>

<h2 id="codes">Codes</h2>

<p>More systematically, coding verbatims is a great way to make quantitative sense of the responses. You can summarize them, crosstab them, and look for correlations.</p>

<p>But coding verbatims is a major hassle. So much so that surveys rarely elicit open-ended text responses, and even then only sparingly.</p>

<h2 id="text-analytics">Text Analytics</h2>

<p>There are a number of software platforms that promise automated “text analytics” based on sentiment or keywords. But automated text analysis misses much of the nuance. For instance, in a recent survey, medical device patients said the reason they like a product is that it connects to their cell phone, with quotes like: “It works with my cell phone, so I can travel.” “It works with my cell phone, so I can cancel my landline.”</p>

<p>Both mention “cell phone,” but they give completely different reasons -- one promises mobility, the other economics. APIs for text analytics and “sentiment” analysis often miss nuances like these that are essential to product marketing insights.</p>
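<p>To make that concrete, here is a minimal, purely illustrative sketch of keyword-based coding in Python. The two quotes are the verbatims above; the keyword rule is our own stand-in, not any particular vendor’s API. It files both responses under the same “cell phone” code, losing the mobility-versus-economics distinction a human coder would catch:</p>

<pre><code># Illustrative only: a naive keyword "coder" of the kind described above.
verbatims = [
    "It works with my cell phone, so I can travel.",
    "It works with my cell phone, so I can cancel my landline.",
]

# Hypothetical keyword-to-code rule; real platforms are more elaborate,
# but the failure mode is the same.
keyword_codes = {"cell phone": "Connects to cell phone"}

def keyword_code(text):
    """Return the first code whose keyword appears in the response."""
    lowered = text.lower()
    for keyword, code in keyword_codes.items():
        if keyword in lowered:
            return code
    return "Uncoded"

for v in verbatims:
    print(f"{keyword_code(v)}: {v}")
# Both responses print "Connects to cell phone" -- the mobility vs.
# economics distinction is lost.
</code></pre>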
<p>If verbatims were jelly beans, automated text analytics might sort them by color, but still mix “lime” and “peppermint.”</p>

<div class="block-quote" style="float: left">
  Automated analytics might sort jelly beans by color, but a human can tell “lime” from “peppermint.”
</div>

<p>In most surveys there are dozens, hundreds, or maybe a couple thousand responses -- few enough that a person could look at them all, and would probably want to. We just want it to be fun and easy.</p>

<h2 id="outsourced-coding">Outsourced coding</h2>

<p>Coding can be outsourced, and there are professionals and even firms who do nothing but this. But that’s pretty expensive, and reserved for a very few questions, on a very few surveys, by a very few firms. Otherwise, in our world, it’s up to you, the market researcher.</p>

<h1 id="text-verbatims-in-protobi">Text Verbatims in Protobi</h1>

<p>Protobi now offers a <a href="./text-verbatims">Verbatim Coding Widget</a> that dramatically streamlines the process of creating, refining, and analyzing codes for text responses (as well as more advanced widgets such as the <a href="http://help.protobi.com/adminaccess/reformat-tool-beta">reformat tool (beta)</a>).</p>

<p>If you just want the coding done for you, Protobi can work with external professional coding partners, typically within a week and for $0.25 to $0.50 per response. Contact <a href="mailto:support@protobi.com">support@protobi.com</a> for a quote.</p>

<h2 id="current-research">Current research</h2>

<p>If you have a current survey with text verbatim responses, let us know. We're running a study to crowdsource verbatim coding and answer some interesting questions:</p>

<ul>
  <li>How long does it take people to code a question?</li>
  <li>Can our crowd code verbatims as well as the study's own professional analyst?</li>
  <li>How consistent and reliable are codes across people?</li>
  <li>Who comes closest to matching the researcher’s own codes?</li>
  <li>Do other coders identify categories or distinctions that the researcher finds useful?</li>
  <li>How does the set of categories emerge and coalesce as the number of coded items grows?</li>
  <li>How many items do you need to categorize before you’ve basically got it all?</li>
</ul>
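<p>Several of these questions come down to measuring agreement between coders. As a rough sketch of one standard way to quantify that (our assumption, not necessarily the measure this study will use), here is how percent agreement and Cohen’s kappa could be computed for two coders who labeled the same verbatims; the codes below are hypothetical:</p>

<pre><code>from collections import Counter

# Hypothetical codes assigned by two coders to the same ten verbatims.
coder_a = ["Mobility", "Economics", "Mobility", "Ease of use", "Economics",
           "Mobility", "Economics", "Ease of use", "Mobility", "Economics"]
coder_b = ["Mobility", "Economics", "Economics", "Ease of use", "Economics",
           "Mobility", "Mobility", "Ease of use", "Mobility", "Economics"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Chance agreement: the probability both coders assign the same code at
# random, given how often each coder uses each code.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.0%}, Cohen's kappa: {kappa:.2f}")
</code></pre>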