
Pairing Analytics With Qualitative Methods to Understand the WHY

Rudimentary analytics can be valuable to understand WHAT your customers and prospects do. However, the true value from analytics comes from marrying that with the WHY - and more importantly, overcoming the WHY NOT. In this session, Analytics Demystified Senior Partner Michele Kiss will discuss quantitative and qualitative techniques analysts can leverage to get more insight into customer behavior. (Psychologist’s armchair not included.)


Pairing Analytics With Qualitative Methods to Understand the WHY

  1. 1. From Bean-Counter to Data Psychologist: Using Analytics to Understand the WHY Senior Partner, Analytics Demystified Michele Kiss @michelejkiss
  2. 2. @michelejkiss About Me I’m originally Australian (not that you can tell anymore) In my ‘free time’ I teach Les Mills Group Fitness classes (like BodyPump, CXWORX, RPM) I have an unnatural obsession with my pets – a 9-year-old cat, Bella, and a dorky labradoodle named Fenway
  3. 3. @michelejkiss Analytics Demystified I am a Senior Partner at Analytics Demystified. If you have been in this industry for even a relatively short amount of time, it’s likely you have heard of Eric Peterson, John Lovett, Adam Greco (etc.)
  4. 4. @michelejkiss We literally wrote the book...s on digital analytics When I say that Demystified WROTE the book on digital analytics, that’s not hyperbole. We literally did.
  5. 5. @michelejkiss So today we’re going to talk about what we sometimes forget about, in our highly quantified world: the WHY. What lies behind the numbers?
  6. 6. Typical Analytics Think of a typical analytics report.
  7. 7. Typical Analytics “What happened?” The information included tells us WHAT happened. Site traffic went up or down, these traffic sources shifted, conversion rate declined.
  8. 8. @michelejkiss Typical Analytics +8% Click-Through Rate! -18% Click-Through Rate! No Significant Difference! Which variation performed better? Even if you take an arguably “more visual” analytics report than a standard “charts and tables” report, the ultimate message doesn’t change. Which variation performed better? But not why.
  9. 9. @michelejkiss Typical Analytics [Path analysis diagram: Select Brand → Select Style → Select Features → Detailed Product Page → Exit Site, annotated with step-through, back-tracking and exit percentages. Legend: gray arrows show “eventual conversion” (may involve pages in between steps); yellow arrows show the immediate next page. User behaviour indicates some may be confused (e.g. going back) or not finding what they’re looking for (abandoning the path).] Even if you look at data in great detail, it’s still telling us WHAT happened, not why. More is needed to understand why people behaved the way they did.
  10. 10. @michelejkiss Typical Analytics [Diagram detail: Select Brand → Select Style, with a share of users going back to Select Brand] Take ONE behavior… users who select a brand, then select a style… then go back to selecting a brand.
  11. 11. @michelejkiss Leaves a sensible analyst thinking… but why?
  12. 12. @michelejkiss Hmmm… For example, a client of mine recently conducted an A/B test. This was a properly created and split test of a Home Page hero. The call to action drives users down a particular path, and one variation is found to have an impact.
  13. 13. @michelejkiss Hmmm… However, on a totally unrelated path, a significant result is also found. If you trust the numbers at face value, you might assume that the test affected this conversion. However, if you stop and assess this critically, you might wonder – how can a hero image shown pages earlier, that contains NO links to this path, truly affect conversion to it? In other words… you have the results, but you wonder… WHY?
  14. 14. That’s not to say that this would be better
  15. 15. “My Mom says blue is the better choice.”
  16. 16. @michelejkiss [Diagram: questions – What are people doing? Which performs better? Why do / don’t our users do “X”? – mapped to methods: Web Analytics, Social Analytics, Surveys, A/B/MV Testing, User Testing, Session Replay, Voice of Customer] To truly understand behaviour, we need to answer a number of questions, and this involves a number of different methods.
  17. 17. @michelejkiss Essentially, we need to play a little armchair psychologist.
  18. 18. The Plural of Anecdote is not Data However, that doesn’t mean treating qualitative data as somehow more powerful than quantitative data. Rather, we want to harness multiple methods for getting a more complete picture.
  19. 19. Art Science After all, analytics is understanding human behavior (and we are complicated!)… that’s nothing if not an art AND science combined.
  20. 20. @michelejkiss The Power of Data How pervasive is the problem? Commentary How do people feel about it? There is immense power in being able to use quantitative data to understand the magnitude of a problem, together with qualitative data to unearth the feelings and motivations that lie behind that action.
  21. 21. Which is more persuasive?
  22. 22. Only 18% progress through Step 1 of our Signup Flow [Funnel: Step 1 – 100%, Step 2 – 18%, Step 3 – 7%, Complete – 6%]
  23. 23. [Same funnel: Step 1 – 100%, Step 2 – 18%, Step 3 – 7%, Complete – 6%] “Why are you asking me for my birthdate and street address to download a whitepaper?!” – Actual customer feedback based on Step 1 of our form
  24. 24. These methods are pervasive in other fields But we don’t use them nearly enough
  25. 25. @michelejkiss Take the medical field – one we normally think of as being “all quantified”: lab tests and MRIs.
  26. 26. @michelejkiss Medical Empirical Tests Patient-Reported History However, the medical field actually combines data from tests together with patient-reported history and symptoms. Diagnosis can’t be based purely on one or the other.
  27. 27. @michelejkiss Medical In a season three episode of House, the team deals with a car-crash patient who has CIPA, a rare inherited disorder of the nervous system that prevents the sensation of pain, heat, and cold. Life with CIPA means constantly checking your mouth to make sure you didn’t bite it, your teeth to ensure none of them have fallen out or broken, your eyes to make sure you didn’t scratch a cornea, your limbs to make sure you haven’t broken a bone… even setting an alarm just to remind you to go to the bathroom. It also means that when you go to the hospital and there’s something wrong, you are NO help to the diagnosis, because you literally can’t tell the doctor what hurts. In this case, the absence of this more “qualitative” data is literally a deal-breaker, one that makes quantitative analysis of test results more akin to finding a needle in a haystack.
  28. 28. @michelejkiss Journalism Statistics Quotes Journalists do this without missing a beat. After all, an article that just cited a bunch of numbers at you (especially in prose form, where it’s hard for people to process numbers!) wouldn’t sway anyone. But when statistics, speaking to the prevalence of the issue in question, are combined with quotes that bring the story “to life”, you’ve got a much more persuasive article.
  29. 29. @michelejkiss Legal “I want to tell you a story. I'm going to ask you all to close your eyes while I tell you the story. This is a story about a little girl walking home from the grocery store one sunny afternoon. I want you to picture this little girl. Suddenly a truck races up. Two men jump out and grab her. They drag her into a nearby field and they tie her up and they rip her clothes from her body. Now they climb on…. Can you see her? Her raped, beaten, broken body. Can you see her? I want you to picture that little girl. Now imagine she's white.” Evidence Argument The legal field too ties hard evidence together with narrative, via the opening and closing arguments of a trial.
  30. 30. @michelejkiss Marketing Customer Numbers Testimonials Marketing does this constantly – using customer numbers and observed impacts on business to persuade, but also testimonials, quotes and case studies.
  31. 31. @michelejkiss Health “Feel” Measures Even in health and fitness. There are a lot of ways you can track your progress to a fitness goal. For example, losing weight, or measuring muscle quality. But you can also just go by “feel” – “my clothes feel looser” (even if you haven’t lost a pound!)
  32. 32. So let’s say you’re sold.
  33. 33. How can you better sell your findings?
  34. 34. @michelejkiss Surveys
  35. 35. @michelejkiss Surveys [A/B test: Control vs. Redesign, with a winner declared] But why?
  36. 36. @michelejkiss Surveys [The same A/B test – Control vs. Redesign – paired with a survey: “What are you hoping to find on this website?” Responses: Product Information, Pricing, Support, Other] Run surveys in tandem with A/B tests, and capture the test variation alongside the survey results so you can compare across groups. (Even something as simple as “what were you hoping to find?” might, when you compare variations, help you directionally understand the difference between the groups.) Make sure you’re actually capturing which test version ties to the survey responses! You could do that by passing the test ID to your survey tool, by passing the survey ID to your testing tool, or by integrating the two in your web analytics tool.
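If you do capture which variation each respondent saw (for example, by passing the test ID into the survey tool as a hidden field), comparing stated intent across groups takes only a few lines. The sketch below is a minimal, hypothetical example in Python/pandas – the file name and the `variation` / `intent` columns are assumptions standing in for whatever your survey export actually contains.

```python
# Sketch: comparing "What are you hoping to find?" responses by A/B test variation.
# Assumes a hypothetical CSV export in which each survey response already carries
# the test variation ID (e.g. passed to the survey tool as a hidden field).
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.read_csv("survey_responses.csv")  # hypothetical columns: variation, intent

# Cross-tab of stated intent by test variation
table = pd.crosstab(responses["variation"], responses["intent"])
print(table)

# A chi-square test gives a rough read on whether the intent mix differs by variation
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.4f}")
```

A significant result here doesn’t explain the WHY on its own, but it tells you the groups’ stated intent genuinely differs and is worth digging into.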
  37. 37. @michelejkiss Exit Surveys You’d probably have to have been living under a rock to have never seen an exit survey…
  38. 38. @michelejkiss Post-Conversion Surveys For customers who did convert, what pushed them over the edge?
  39. 39. @michelejkiss Fall Out Surveys Why’d you leave me?! On the flip side…
  40. 40. @michelejkiss Task Success Surveys
  41. 41. @michelejkiss Monitor Satisfaction Scores “Net Promoter Score”, “Word of Mouth Index” and similar
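As a reminder of the arithmetic behind one of these scores: Net Promoter Score is simply the percentage of promoters (9-10) minus the percentage of detractors (0-6). A tiny sketch, assuming a hypothetical export with an `nps_score` column:

```python
# Sketch: computing a Net Promoter Score from 0-10 survey responses.
# NPS = % promoters (9-10) minus % detractors (0-6); file and column names are assumptions.
import pandas as pd

scores = pd.read_csv("nps_responses.csv")["nps_score"]

promoters = (scores >= 9).mean()
detractors = (scores <= 6).mean()
nps = (promoters - detractors) * 100

print(f"NPS: {nps:.0f}")
```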
  42. 42. @michelejkiss User-Initiated Feedback Tools like Opinion Lab make it possible for users to give feedback when THEY choose, rather than irritating users with a pop-up
  43. 43. You can guess all day at why, but it’s easier to just ask…
  44. 44. @michelejkiss What Should I Ask? Avinash’s “Three Questions” • What was the purpose of your visit? • Did you accomplish what you came for? • (If not) Why not?
  45. 45. @michelejkiss What Should I Ask? • Would you recommend us to a friend?
  46. 46. @michelejkiss What Should I Ask? • Would you recommend us to a friend? • How would you rate your visit today?
  47. 47. @michelejkiss What Should I Ask? • Would you recommend us to a friend? • How would you rate your visit today? • Did anything frustrate you on your visit today? (If so, please explain)
  48. 48. @michelejkiss What Should I Ask? • Would you recommend us to a friend? • How would you rate your visit today? • Did anything frustrate you on your visit today? (If so, please explain) • What did you like best/least about your visit today?
  49. 49. @michelejkiss Keep In Mind No one is willing to fill this in:
  50. 50. @michelejkiss Keep In Mind Your goal is to get qualitative information (Lean towards free-form, open-ended questions) Yes, it’s way easier to “count” and present questions where customers simply picked “Yes” or “No” or rated on a scale of 1-10, but here you’re actually looking for their own voice, since this is to supplement the wealth of quantitative data that you already have.
  51. 51. @michelejkiss User Testing
  52. 52. @michelejkiss User Testing [Diagram: Formal User Testing and Remote User Testing, covering Observation, Focus Groups, Eye tracking, Task completion, Card sorting]
  53. 53. @michelejkiss Focus Groups A researcher gathers a group of people, asks questions, and uses their discussions and interactions to better understand the user perspective on the research question. Funnily enough, I was part of a focus group back when Adobe first acquired Omniture. It lasted about two hours, and they asked us a lot of questions about the perceptions of the two companies. I also very clearly recall them asking for our opinion on various ways to rename the product. And being miffed that they didn’t accept my “none of the above”! E.g. They had “Adobe SiteCatalyst, powered by Omniture” or “SiteCatalyst, powered by Adobe” etc. I wanted “Omniture SiteCatalyst, powered by Adobe.” The researcher will normally try to find people in the right demographic for what they are looking for. For example, in my case, they were looking for Omniture users. Keep in mind… a focus group shouldn’t be your “source of truth” – it’s NEVER going to be a representative sample! Not only that, but what people are willing to share in a focus group is not necessarily going to be their honest truth. It’s an artificial environment and that must be kept in mind. (That’s why it’s good to use focus groups to complement quantitative data but not assume they can stand 100% on their own.)
  54. 54. @michelejkiss Observation & Task Completion This is probably what you typically think of when you think of user testing – watching users actually interact with your product. Observation can be done in person, via recording, or fully remotely. This is often used with specific task completion exercises (rather than having the user just muddle about). Task completion studies give the user specific things to do. These can be guided or unguided, and good researchers will structure it so that (for example) the order of the tasks is randomized across participants, to avoid fatigue, primacy/recency effects, etc. If you want to get a good example of how these go, check out … (next slide)
  55. 55. @michelejkiss Want To See It In Action? Usertesting.com has a sample video that will show a remote, task-based user test. Really helpful to see if you’ve never gotten to observe user testing before
  56. 56. @michelejkiss Observation Testing Example Example: a tire company. Viewing a tire gave you an option to “Add”… which took you to something that looked like a cart but was instead a quote. Analytics alone would not have revealed that users were confused as to why they couldn’t purchase, and didn’t actually realize that they had to print off the quote and take it into the store!
  57. 57. @michelejkiss Observation Testing Example This allowed them to change the call to action to “Get Quote”, and to add clarification on the quote itself that it was a quote only, with “these are the next steps to actually purchase these tires.”
  58. 58. @michelejkiss Card Sorting Card sorting exercises are often used to help understand how users “group” different pieces of content or information, and can be useful to help inform Information Architecture. For example, what things do users think should be under a “Products & Services” menu? What should be under the “About Us” menu? Card sorting can be open or closed. Open: users can bucket the items into any categories they want. Closed: users are given the categories and need to assign each item.
  59. 59. @michelejkiss PrototypeTesting One of the advantages of user testing over (or in conjunction with) A/B testing is that you can test concepts before they have been fully developed. This might include testing early working versions, but it could be as basic as doing task completion studies on wireframes or sketches. I recently participated in this with a company called Whistle, who make a doggie activity monitor. I did an exercise where they had printed screenshots of different screens of their new app experience. I was assigned a task, and on each screen I was asked a question about what I would do next, in order to complete the task. Doing this quickly revealed to them that some of the CTAs they had named weren’t intuitive to the user (me.) And what’s even cooler is that the new version of the app just updated on my phone, so I can see it actually come to life!
  60. 60. @michelejkiss Prototype Testing Example + Survey A client of mine wanted to create a non-branded lifestyle app. They recruited a group of beta testers to use the app for four weeks, and combined Google Analytics tracking of in-app behaviour with surveys throughout the beta. Together this allowed them to understand what people actually did (especially vs. what they SAID they did!), how they used the app, whether they would continue to use it, and also how it affected their perception of the brand. The coupled data was incredibly powerful – features that users claimed were important to them were in fact not used. This allowed them to conduct follow-up research to understand whether this was just a case of people not being honest about what mattered, or whether it was poor app design and users couldn’t find the features!
  61. 61. @michelejkiss So How Can I You already have a job. You’re an analyst. You’re not going to go out there and become a fully dedicated UX expert. However, you should be thinking about how you can get insight from these types of research.
  62. 62. @michelejkiss Sit In On UserTests One option: If your organization has a user experience group that runs regular user tests, take the time to sit in on them from time to time. You can do so in the room, or often remotely.
  63. 63. @michelejkiss RunYour Own You don’t have to “take over” user testing in your organization There’s no reason that the analytics team can’t use user testing to inform test or analysis ideas
  64. 64. @michelejkiss No Funding?
  65. 65. @michelejkiss DIY UserTesting Ask your family and friends for help! Ask them to perform behaviours on your site or app, and record or take notes. You can easily use software like Camtasia to record their on-screen actions for analysis later. If you have a user experience team, ask for their help to properly structure tasks and questions.
  66. 66. You could also have a little fun with it…
  67. 67. @michelejkiss
  68. 68. #justsayin
  69. 69. @michelejkiss Don’t Forget… Think of Your Audience User testing isn’t about getting a bunch of random people off the street. Think about WHO you are talking to and hearing from. If you care about how to improve loyalty, it’s worth talking to your loyal customers. If you want to improve the first-time experience, you’ll want a totally different segment. Analyzing the right segment is just as important here as segmentation is in analytics.
  70. 70. @michelejkiss Session Replay Lots of vendors (Decibel Insight, ClickTale, ForeSee, Tealeaf and more) offer the ability to replay users’ sessions. This data is also typically aggregated into heatmaps (since watching tons of sessions can be time-consuming!). While you won’t get the commentary that typically accompanies user testing, it can be enlightening to see what users actually DO when they interact with your site.
  71. 71. @michelejkiss Customer Service Data Most companies have SOME kind of customer feedback process, whether it be notes from phone calls, a general comment email address, or a comment form. Consider reaching out to your customer service team to see how you can work together to get access to this information. Keep in mind – you should be working with these people already! Oftentimes they bear the wrath of angry customers when (for example) A/B tests are not functioning properly. So you want them to be in the loop on what’s happening on your site or app, and likewise they can share valuable information with you about what customers are saying!
  72. 72. @michelejkiss Social Listening Another option to get qualitative feedback is to leverage social listening. Even if this isn’t a formal program, with a proper listening tool, it’s not hard to review comments on your brand’s Facebook page, or read through mentions on Twitter.
  73. 73. @michelejkiss Personas Claire •  Health-conscious working mom, a runner and cyclist •  Wants to make nutritional choices for her family •  Loves using technology to make her life easier Another way you can tie quantitative and qualitative together is to work with your user experience team (or agency) to create analytics segments based on their personas. Obviously not every item in the persona will be possible to capture, but use the data you have available as best you can.
  74. 74. @michelejkiss Personas Persona •  Health-conscious working mom, a runner and cyclist •  Wants to make nutritional choices for her family •  Loves using technology to make her life easier Online •  Online hours: Post-8PM (after kids are in bed) •  Frequent visits to nutrition & recipe content •  Logs in to save recipes, or sends them to email Brainstorm what behaviours you think “Claire” would perform on your website or in your app, based on her persona. Validate those with your UX team and, ideally, with the “Claires” that your UX team talks to (e.g. what content on your site interests them? Make sure it aligns with what you have included in your segment). This is also a great way of interesting your UX team in your web analytics data – when they can view how people behave by their personas, they’re likely to better understand the data.
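Once you’ve brainstormed (and validated) those behaviours, a persona-style segment can be approximated in whatever analytics data you have. Here is a rough sketch of flagging “Claire-like” sessions in a hypothetical session export – the column names and thresholds are illustrative assumptions, not any particular tool’s schema:

```python
# Sketch: flagging sessions that look like the hypothetical "Claire" persona.
# Column names (hour_of_day, content_category, logged_in) are assumptions about
# your own session export, not any specific analytics tool's schema.
import pandas as pd

sessions = pd.read_csv("sessions.csv")

claire_like = sessions[
    (sessions["hour_of_day"] >= 20)                                   # visits after 8 PM
    & (sessions["content_category"].isin(["nutrition", "recipes"]))   # nutrition & recipe content
    & (sessions["logged_in"] == True)                                 # saves recipes / sends to email
]

print(f"{len(claire_like) / len(sessions):.1%} of sessions match the 'Claire' segment")
```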
  75. 75. @michelejkiss Be your user You’d be surprised how many analysts take shortcuts in actually experiencing the user experience. For example, troubleshooting your mobile form… from a desktop computer with User Agent switching in place to mimic a mobile device. You can’t truly gauge the user experience unless you do so in the same way as your user!
  76. 76. @michelejkiss So now, the WHEN
  77. 77. @michelejkiss When planning an analysis Qualitative data can be used to help inform analysis ideas. For example, something you might not have even realized is an issue for your users might come out from customer service feedback, or from on-site surveys or user testing. Analysis will then help you quantify how big a problem it is, and testing will allow you to optimize to improve the experience.
  78. 78. @michelejkiss While running tests Running surveys alongside your A/B tests can shed valuable light on why certain variations might perform differently. User testing (even while live!) of your A/B versions can add commentary to why people are behaving the way they are.
  79. 79. @michelejkiss When looking for opportunities Tell me if this sounds familiar: You’ve been running an optimization program for a few years. You’ve already tested all the easy stuff. And you’re tapped out for ideas. Why not let your users decide what you’re missing?
  80. 80. @michelejkiss During a redesign User feedback is critical when you are trying to redesign an experience. While you might have specific ideas about what you want to do, qualitative feedback can provide critical insight into alternate approaches that might actually work better for your users. You can use this at all stages of the process – before any ideas are even thrown around, while you’re designing (especially user testing to try out the concepts you’re discussing) and up until you launch. (Then, hopefully at the same time as you A/B test your new designs against the existing site!)
  81. 81. @michelejkiss Pssssst… You aren’t limited to just testing your own site!
  82. 82. @michelejkiss Test AgainstYour Competition You can user-test any public site. Why not test the same feature on your site vs. your competitor’s? (For example.)
  83. 83. @michelejkiss What can we do?
  84. 84. 1. Work with other teams If you have teams dedicated to gathering qualitative insights (e.g. user experience teams, customer service or consumer feedback, survey or research), start by trying to work with them to leverage what they’re already doing. (You don’t need to reinvent the wheel within your department…) Besides, while their work will inform yours, your work can also help give insight into theirs.
  85. 85. @michelejkiss Bring the teams together If you are a manager or in a position to control location, co-locating your analytics and qualitative teams (like user experience or consumer feedback) can help build cooperation and sharing between the two teams.
  86. 86. @michelejkiss Encourage a dialogue For example, at Demystified we use Slack for internal communication. Using a discussion tool like Slack can be valuable to help different departments discuss and debate ideas.
  87. 87. @michelejkiss Share Findings Working on an analysis? Just wrapped up something insightful that might be of use? Meet with the team to walk them through it, and see if they have any qualitative information that would lend greater insight to what you are working on.
  88. 88. @michelejkiss Allow the time… While adding qualitative insights to analytics work can be critical to better understanding how your users behave, if your analytics team simply don’t have time, this helps no one. (It will just make them feel like there is one more thing they’re failing at!) Managers need to account for the additional effort by allowing additional time for results and analysis, and prioritizing appropriately.
  89. 89. @michelejkiss Share Access Often, companies have access to many different tools. However, access isn’t available to everyone (and even if it is, people don’t necessarily have the training they would need to use them.) If you are going to have your analysts leveraging additional data sources, it’s critical they have easy access to the information, the necessary training to feel comfortable using the tools, and people to ask questions to.
  90. 90. @michelejkiss Where Possible, Integrate Data [Chart: Conversion Rate by Net Promoter Score – Detractors (0-6): 1.70%, Passives (7-8): 1.72%, Promoters (9-10): 1.85%] Many survey tools offer the ability to integrate their result data into your web analytics data. This allows you to segment and understand on-site behavior in light of survey responses. For example, does Net Promoter Score correlate with conversion rate? Do people who said they could NOT complete their task tend to fill in your contact form more or less? We found this extremely useful at a previous company, where we actually ran surveys on behalf of advertisers. While the survey was itself revenue-generating, we obviously didn’t want the survey to hurt overall revenue, since that could lead to a bigger hit for the business. Even at its most basic, we were able to examine the exit rates of users who were presented with, and completed, the survey, to see if this was sending users OFF the site. (It turns out, it was.) This was then used to put strict rules in place about the frequency and targeting of surveys, to allow the overall survey product to be revenue-positive by limiting the impact on other businesses.
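If your survey tool can pass a respondent or visitor ID back into your analytics data (or vice versa), the join itself is straightforward. A minimal sketch, assuming hypothetical exports that share a `visitor_id`:

```python
# Sketch: joining survey responses back to web analytics sessions to look at
# conversion rate by Net Promoter Score band. The column names and the shared
# visitor_id are assumptions; most survey and analytics tools let you pass some
# respondent/visitor identifier between them.
import pandas as pd

surveys = pd.read_csv("nps_responses.csv")     # assumed columns: visitor_id, nps_score
sessions = pd.read_csv("sessions.csv")         # assumed columns: visitor_id, converted (0/1)

merged = sessions.merge(surveys, on="visitor_id", how="inner")

# Standard NPS bands
bands = pd.cut(merged["nps_score"], bins=[-1, 6, 8, 10],
               labels=["Detractors (0-6)", "Passives (7-8)", "Promoters (9-10)"])

print(merged.groupby(bands)["converted"].mean().map("{:.2%}".format))
```

From there, the same merged data can be segmented by task completion, exit rate, or anything else you capture.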
  91. 91. 2. Invest If you don’t have teams or tools in place to start leveraging more qualitative data for your analysis, you’ll need to budget for it. This doesn’t have to be piles of money… You can conduct inexpensive user testing online for minimal investment, and there are plenty of free survey solutions out there. Consider using a tag manager to allow the team to easily deploy surveys. (GTM and Adobe DTM are both free!!) There ARE free options you can leverage. And use the results you get from those to sell the need to invest more in this area.
  92. 92. @michelejkiss What can I do? So what if it’s just you? You don’t have a manager who is willing to spear-head this effort, but you know it could add real value to the work you do?
  93. 93. @michelejkiss You’ll have to go it alone… Free solutions will be your saving grace: Free surveys, free tag managers. Run ad-hoc user tests, even if it’s only on your family and friends. Use trial versions of technologies to prove out the value for a particular analysis.
  94. 94. @michelejkiss Don’t be above bribes! If there ARE people in your organization who can help, but you just don’t have the official “clout” to demand their help, don’t overlook the power of bribery. Taking someone to lunch and getting their thoughts or help, or bribing another team with treats, can be surprisingly successful.
  95. 95. @michelejkiss Your plan of attack So, what can you do now?
  96. 96. @michelejkiss Your plan of attack • Run a survey alongside your next A/B test
  97. 97. @michelejkiss Your plan of attack • Run a survey alongside your next A/B test • Do a simple ‘user test’ (family or friend!) on your next analysis topic
  98. 98. @michelejkiss Your plan of attack • Run a survey alongside your next A/B test • Do a simple ‘user test’ (family or friend!) on your next analysis topic • Make friends with UX + Customer Service
  99. 99. But wait… How do I present this stuff?
  100. 100. Use “Quotes”
  101. 101. “I did not find the software, and did not want to browse through hundreds of undescribed options” – Actual customer feedback after failing to find software. Only 11% of users who search for downloads successfully download software
  102. 102. @michelejkiss Share themes Word clouds can be a useful way of giving the gist of the overall feedback, without your audience needing to actually read everything and without having to do complicated sentiment analysis. Critical themes and words will stand out. While these don’t replace concrete data, they can add insight and help “round out” the picture.
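If you want to roll your own rather than rely on a survey tool’s built-in view, the open-source `wordcloud` package for Python makes this a few lines of work. A quick sketch (the file name and `comment` column are assumptions about your export):

```python
# Sketch: a quick word cloud from open-ended survey responses, using the
# third-party `wordcloud` package (pip install wordcloud).
import pandas as pd
import matplotlib.pyplot as plt
from wordcloud import WordCloud

responses = pd.read_csv("survey_responses.csv")
text = " ".join(responses["comment"].dropna())

cloud = WordCloud(width=800, height=400, background_color="white").generate(text)

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```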
  103. 103. @michelejkiss Chart Feedback [Quadrant chart: feedback plotted by sentiment (positive to negative) and customer plan ($10 Plan, $5 Plan, Free Plan)] You can chart feedback on two axes: how positive vs. negative each piece of feedback was, and, if you know a second variable (like what plan the customer was on), add that on the second axis so you have a quadrant.
  104. 104. @michelejkiss Chart Feedback This will allow you to focus on the positive, high-value feedback
  105. 105. @michelejkiss Chart Feedback Or the negative, low-value feedback
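A chart like this is easy to mock up once each comment has a sentiment score and a plan value attached (whether scored by hand or by a sentiment model). A hypothetical matplotlib sketch – the column names are illustrative only:

```python
# Sketch: plotting feedback on two axes – sentiment vs. customer plan value –
# so positive, high-value comments stand out. Scores are assumed to have been
# assigned already (e.g. manually, or via a sentiment model of your choosing).
import pandas as pd
import matplotlib.pyplot as plt

feedback = pd.read_csv("scored_feedback.csv")   # assumed columns: sentiment (-1..1), plan_value (0, 5, 10)

plt.scatter(feedback["sentiment"], feedback["plan_value"], alpha=0.5)
plt.axvline(0, color="grey", linewidth=0.8)     # divide negative from positive
plt.xlabel("Sentiment (negative → positive)")
plt.ylabel("Plan value ($/month)")
plt.title("Feedback by sentiment and plan value")
plt.show()
```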
  106. 106. @michelejkiss Group Feedback [Chart: “What is the main purpose of your visit?” – responses grouped into categories such as Read About Email Product, Read About the Company, Checkout Pricing, See Admin Options, Get Support, Get Multi-User Support, Learn About Cloud Sync, Learn About Your Security, Preview Calendar App, Learn About Phone, Other] Grouping feedback into categories can make it easier to digest, and easier to overlay with web analytics data to draw comparisons between what people said they came for and what they actually did.
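Free-text “purpose of visit” answers can be bucketed with something as simple as keyword rules before you compare them against observed behaviour. A rough sketch, with an entirely hypothetical keyword map and column names:

```python
# Sketch: bucketing free-text "purpose of visit" answers into rough categories
# with simple keyword rules, so they can be compared against observed behaviour.
# The keyword map and column names are illustrative assumptions only.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")   # assumed column: purpose (free text)

keyword_map = {
    "pricing": "Checkout Pricing",
    "price": "Checkout Pricing",
    "support": "Get Support",
    "help": "Get Support",
    "security": "Learn About Your Security",
    "sync": "Learn About Cloud Sync",
}

def categorize(text: str) -> str:
    text = str(text).lower()
    for keyword, category in keyword_map.items():
        if keyword in text:
            return category
    return "Other"

responses["category"] = responses["purpose"].apply(categorize)
print(responses["category"].value_counts(normalize=True).map("{:.1%}".format))
```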
  107. 107. Keep in mind… Qualitative data is not better or worse than quantitative.
  108. 108. It’s just different. And gives us a different picture.
  109. 109. The Hawthorne Effect After all, there’s an entire theory around how our behaviour changes when we are observed, referred to as the Hawthorne effect. Certain research methods like user testing might be more affected by this than, say, web analytics is. For this reason, combining what someone says they are interested in with how they actually behaved can be immensely valuable, as their behaviour might have changed just by being asked.
  110. 110. @michelejkiss The moral of the story: The more data you can use to understand the bigger picture – not limiting yourself to just one tool – the better your analysis will be at uncovering the motives behind your users’ actions, and the better your marketing can be at addressing those motives and concerns.
  111. 111. Michele Kiss Senior Partner Analytics Demystified
