Canvs Surveys V5 is here!
V5 of Canvs Surveys is live and we couldn't be more excited to share its massive enhancements. In this release we've introduced the ability to add Nets so that Codes can be rolled up into a hierarchical structure. Additionally, our Data Science team has developed new technology to produce more helpful and intuitive Autocodes. Using Codes and Nets on your surveys will help you derive answers to your questions quickly and efficiently.
Nets and Codes for Canvs Surveys
Canvs Surveys now has the ability to show the hierarchical relationship between Codes and Nets. After updating automatically assigned codes to a satisfactory level, you can achieve an even higher level of analysis by rolling up Codes into Nets. You'll be able to create, manage, edit and visualize Nets alongside Codes.
Updated Treemap
When you click on the Codes tab, there will be two sub-tabs named Nets and Codes. In the Nets view, you'll be greeted with all available Nets.
When you click on a specific Net, you'll be taken to a view of the Codes belonging to that Net.
By going back to All Nets and clicking on the Codes tab, you will see a view of all Codes belonging to the Survey regardless of the Net.
Within the treemap, a forward slash is used to differentiate between a Net and a Code ("Net Name" / "Code Name"). As the tiles get smaller, the platform gives visual priority to the Net name when viewing the Nets treemap and to the Code name when viewing the Codes treemap.
Our Data Science team has done tremendous work bringing additional Autocodes functionality to the platform (more on this below!). You will see some Nets and Codes created automatically and 100% algorithmically, so we've created a way to differentiate between "System Generated" and "User Generated" Nets/Codes. In the case of "System Generated" Nets or Codes, the tooltip will have an "AUTO" indication along with a short message.
Autocodes
Autocodes aim to empower you to summarize the ideas behind survey responses as quickly as possible. Hand coding has always been time-consuming and cumbersome. In this release, Canvs Surveys can code a considerable percentage of Open Ends with high accuracy – completely algorithmically and automatically.
After Canvs processing, each Open End verbatim response is tagged with a set of {emotion} and {topic} tags. Verbatims that share the same combination of {emotion} and {topic} are clustered into the same Autocode.
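Conceptually, this grouping step can be sketched as follows. This is an illustrative sketch only, not the Canvs implementation: the function name, tuple shape, and tag values are all hypothetical.

```python
from collections import defaultdict

def autocode(tagged_verbatims):
    """Group verbatims that share the same {emotion} + {topic} combination.

    `tagged_verbatims` is a list of (text, emotions, topics) tuples; the
    tag names below are made up for illustration.
    """
    clusters = defaultdict(list)
    for text, emotions, topics in tagged_verbatims:
        # The combination of tags acts as the cluster key.
        key = (frozenset(emotions), frozenset(topics))
        clusters[key].append(text)
    return clusters

clusters = autocode([
    ("Loved the acting", {"enjoyment"}, {"acting"}),
    ("The acting was great", {"enjoyment"}, {"acting"}),
    ("Too long and boring", {"boredom"}, {"length"}),
])
# The first two responses share the same tags, so they fall into one Autocode.
```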
As a part of our Autocoding feature, we've also developed "precodes". Precodes have two distinct functions:
First, precodes summarize ideas that are not tied to a specific topic, like a General Emotion (Enjoyment, Dislike, Indifference, Boredom, etc.), as well as high-level concepts that may be tied to multiple or varied topics/emotions, like Intent (Would Buy, Wouldn't Buy, etc.), Content (Kid Friendly, Nostalgic), Other Options (Better than Others, Prefer Others), Recommendations, and more.
Second, precodes categorize open-ends that do not provide valuable information (No Answer, Unsure, Need More Information, Likely Spam) as well as those that may require attention from the user (Long-Winded, Survey Issue).
NOTE: If you have your own code frames that you'd like to make sure we always look for, please reach out to support@canvs.ai. We'll work with our Data Science team to make sure your unique code frames are available automatically.
Currently, we have implemented the following automatic Nets and Codes:
No Answer
No Answer (n/a, no comment)
Unsure (i don't know, not sure)
Need More Information (don't know enough about that, want to read reviews before I decide)
Flagged
Likely Spam (a;dlskfj, ????, -----)
Survey Issue (I don't understand the question, etc)
Long-Winded (responses over 500 characters)
Overall
General Enjoyment (it was good, no issues)
General Dislike (not my type of movie, not for me, didn't like it)
General Indifference (meh, I don't care, could go either way)
General Boredom (it was boring, nothing unique)
General Confusion (it was confusing, I didn't understand it)
General Sadness (made me cry, it looks sad)
General Mixed Emotion (I loved the suspense, but I hated how sad it made me)
Nothing (responses when asking questions like "what would you change" or "what did you like")
Everything (responses when asking questions like "what would you change" or "what did you like")
Funny (it made me laugh)
Not Funny (thought it would be funny, but it was not)
Not Scary (it wasn't as scary as I expected it to be)
Recommendations
Would Recommend
Wouldn't Recommend
Intent
Would Watch (i will definitely watch this, i am going to go watch this, i always turn this show on)
Might Watch (depends on what else is on at that time, might watch it if i'm free)
Wait to Watch (wait to stream it, I'll watch it on DVD instead)
Don't Watch in Theaters (I don't see movies in theaters, I won't go to the movies anymore)
Would Buy (I would buy this, I always buy)
Might Buy (I would consider buying it if, I'd maybe buy)
Wouldn't Buy (I would never buy, I don't buy)
Other Options
Other Options - General (there are other movies out, depends on what other shows are on)
Prefer Others (Would rather see other movies, I prefer to use products from another company)
Better Than Others (This bank is the best bank in New York, better than all other brands of toothpaste)
Cost
Price Conscious (when respondent is considering price, but does not explicitly state how they feel about it)
Expensive (too expensive, the rates are too high)
Affordable (great prices, price is right, not too much money)
Content
Kid Friendly (great for kids, family friendly movie)
Not Kid Friendly (too violent for kids, can't take my kids)
Nostalgic (always watched it growing up, remember it from my childhood)
Issue
Issue Resolved (the tech solved my problem)
Issue Not Resolved (support couldn't fix the issue with my phone)
Time
Time Conscious (too busy, if my schedule allows)
Took Too Long (wait was too long, shouldn't take this long)
Didn’t Take Too Long (was quick, efficient)
Communication
Hard to Understand (didn't speak clearly, i couldn't understand what he was saying)
Easy to Understand (gave clear instructions, wasn't complicated to understand)
Didn’t Hear Back (they said they'd call and didn't, call dropped and no call back, never got the email)
Service
Friendly (agent was friendly)
Not Friendly (spoke in a rude tone, was impatient)
Competent (agent was knowledgeable, easily knew what the problem was)
Incompetent (support couldn't figure out what the problem was, the company should hire competent staff)
Managing Codes
To better match the natural workflow of a coder, we've added a new Manage Codes option to the Action Bar.
Once you've reviewed the automatic Precodes and Autocodes and added any new Codes to your Open Ends, you can group those Codes into Nets and make any needed edits.
The Manage Codes modal allows users to:
Create a new Net
Merge two Codes into one
Remove Codes from a Net
Drag and Drop one or more Codes into a new or existing Net
Undo previous actions (before hitting Apply)
Rename a Net or Code by double-clicking into the title and typing
Jump to the Edit Codes modal from the Manage Codes modal and continue editing there.
You can differentiate between Nets and Codes in the list by the naming format mentioned above: Net names are bold, followed by a forward slash, and all Codes belonging to a Net are indented underneath.
With the new Manage Codes modal, you have more advanced ways of filtering which Open Ends you see on the right-hand side. To the left of the list of Nets and Codes, you can toggle to Include or Exclude what you see. Click once to Include, twice to Exclude, and a third time to reset the filtering.
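The three-click filter toggle behaves like a small state machine. As a sketch (state names are illustrative, not part of the Canvs product):

```python
# Each click advances the filter one step and the cycle wraps around:
# reset -> include -> exclude -> reset.
FILTER_STATES = ["reset", "include", "exclude"]

def next_filter_state(current):
    """Advance the filter by one click."""
    return FILTER_STATES[(FILTER_STATES.index(current) + 1) % len(FILTER_STATES)]

state = "reset"
clicks = []
for _ in range(3):
    state = next_filter_state(state)
    clicks.append(state)
# Three clicks walk through include, exclude, and back to reset.
```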
Canvs Surveys Content Page Updates
Canvs Surveys Ranker
We've made updates to the Ranker on the Codes tab across all of its views (List, Horizontal Bar Chart, and Word Cloud). You'll now see a dropdown with all available Nets in the Survey to choose from, and you can Select / Deselect All for further filtering of the Open Ends list.
The Ranker now makes a clear distinction between System Generated and User Generated Nets or Codes with an Auto icon. On the List view, you also have the ability to multi-select in the dropdown to filter the Open Ends list in more ways. The multi-select feature is not available on the Horizontal Bar Chart or the Word Cloud.
Open Ended Response List
We've updated the Codes iconography to show Nets, following the same naming format ("Net Name" / "Code Name").
Updated Exports
We've adjusted our current exports to accommodate Nets in Canvs Surveys. The Survey Export contains two new tabs: Code Rankings and Responses with Codes.
In the Code Rankings tab, you'll see tables with each Net alongside the Codes belonging to that Net, as well as additional metrics.
In the Responses with Codes tab, you'll see all Open Ends listed, with the Net name they belong to in the columns to the right.
Other Data Science Updates
Mapping to other Emotion Frameworks
Canvs Surveys now has Paul Ekman's and Robert Plutchik's Emotion Frameworks built into the User Interface. This allows you to change filters so that you can easily decide whether you want to use the 42 Canvs Emotions, Paul Ekman's Six Universal Emotions, or Robert Plutchik's Wheel of Emotions.
Our efforts to translate modern dialogue and the Canvs Emotion Framework to Plutchik's and Ekman's Emotion Frameworks were guided by Dr. Michel Tuan Pham from the Emotions Lab at Columbia University. The Emotions Lab consists of PhD students studying the intersection of emotion and behavior.
Find out more here.
Cleaning Up Topics
Single word topics that summarize less than 1% of the total amount of Open Ends will be removed from the treemap, making the final data display less noisy.
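The pruning rule above can be sketched in a few lines. This is a hypothetical illustration of the stated 1% rule, not the actual Canvs pipeline; the function and variable names are made up.

```python
def prune_topics(topic_counts, total_open_ends, threshold=0.01):
    """Drop single-word topics that summarize less than `threshold` of all
    Open Ends, per the rule described above."""
    kept = {}
    for topic, count in topic_counts.items():
        is_single_word = len(topic.split()) == 1
        if is_single_word and count / total_open_ends < threshold:
            continue  # too rare to display: removed from the treemap
        kept[topic] = count
    return kept

counts = {"acting": 120, "plot twists": 3, "popcorn": 4}
pruned = prune_topics(counts, total_open_ends=1000)
# "popcorn" (single word, 0.4%) is dropped; "plot twists" survives because
# the rule applies only to single-word topics.
```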
Question Based Analysis
Canvs is now "question aware" and will produce Topics, Emotions, and Codes based on each question independently. This avoids situations where Topics, Emotions, and Codes appear, but have no results when a specific question is selected.
If you have any questions or would like to customize your organization's Autocodes, please reach out to support@canvs.ai. We'd love to help!