AS.480.642.81 – CSR Campaigns
SPRING 2022
Module 11 Lecture Transcript: Monitoring and Evaluation

This week you’ll complete the building blocks of your CSR campaign’s communication plan, focused on monitoring and evaluation.

[Slide 2 – Iterate]

Developing a campaign is an iterative process. You will learn more and more about your campaign as you develop it. For example, you learn more about your objectives when you develop your strategies, because developing and describing your strategies reveals whether an objective will help you achieve your goal based on the needs of your audience. The same realization occurs when you develop your tactics.

[Slide 3 – Sketch and final design by Paul Cézanne]

Iteration is how creative teams work. Think about how a designer begins with a concept, sketches out the idea, and shares the rough draft with a creative director, who provides feedback and illuminates opportunities to improve it. My point in inviting you to learn about iteration is to encourage you to enjoy the process of experimentation and refinement of your work. You want to develop the most creative, effective, and strategic campaign possible. Please don’t try to do all of that in your first drafts!

Stop the lecture here and watch the video uploaded to Module 11 called “Iterative Process” to see how designers use iteration to improve their craft.

[Slide 4 – The matrix]

OK, now we’ll move into the core content for this week: measurement and evaluation. Here you see our trusty communication matrix, and you’ll notice that evaluation is the last part of the process. But you also know from writing your objectives that we “begin with the end in mind,” as Stephen Covey was fond of saying. What that means is that we think about measurement when we set our objectives.

[Slide 5 – Evaluation]

Evaluation criteria are the desired results as stipulated in the objectives.
Evaluation tools are the methods we use to gather the data.

[Slide 6 – Matrix applied example]

Pages 196–197 in your reading show several examples of how to align your criteria and tool with each objective. In this example, a regional bank conducted research showing that brand loyalty depends on the quality of customer service each person experiences. Therefore, the bank’s communications team set four objectives to improve the customer experience.

Note: Customer experience projects often form “cross-functional teams,” composed of leaders from a variety of organizational functions (such as marketing, HR, and communication), in order to create shared objectives that improve different aspects of the customer experience.

Here we see how the communications team has determined its criteria and measurement tool for Objective #1.

[Slide 7: Lush Cosmetics]

Now we’ll look at our sample campaign plan for Lush Cosmetics. Here you see the measurement criteria and tools for the Lush Cosmetics CSR campaign we’ve followed each week.

[Slide 8: Social media measurement]

Page 195 in your reading walks through HubSpot’s five social media measurement tiers.

[Slide 9: Social media tips]

Additionally, page 199 in your reading gives you some specific strategies for measuring social media and digital marketing strategies.

[Slide 10: Your homework/building block]

• Describe your measurement and evaluation approach for each objective.
• At minimum, show alignment across your objective, measurement criteria, and measurement tool.

[Slide 11: Additional information about measurement]

There are a number of approaches to measurement and evaluation. For a more comprehensive and rigorous approach, here is the first of two resources. Note: this level of rigor is not required for your campaign plan.
However, if you’re feeling ambitious and want to integrate some of these tools, you are welcome to. Just be careful to be consistent and purposeful.

This is the framework developed by the International Association for the Measurement and Evaluation of Communication, or AMEC. AMEC’s method is captured in this graphic. It is comprehensive, and well suited for public relations, social marketing, and influence campaigns. Note that the AMEC method makes an appropriate and accurate distinction between goals and objectives. You can adopt a modified version of this approach for your campaign plan. But remember that this is optional! If you do follow this approach, you will focus on out-takes, outputs, outcomes, and impact, as appropriate.

[Slide 12: Out-takes]

Out-takes include things like customer feedback surveys, feedback forms collected during or at the conclusion of an event, and your observations as a participant in the activity or event.

[Slide 13: Outputs]

Outputs are metrics associated with specific communication activities or products.

[Slide 14: Outcomes]

Outcomes are metrics that address longer-term effects on your stakeholders.

[Slide 15: Impacts]

Impacts represent the cumulative effect of implementing your strategies to achieve your client’s CSR enterprise goals. In sum, AMEC offers a standardized method of evaluation that works with complex and long-term communication campaigns.

[Slide 16: Flash cards]

As an additional – but not required – resource to strengthen your knowledge of measurement and evaluation, there is a set of flash cards in the Module 11 folder. The flash cards show an approach to monitoring and evaluation that is more commonly designed into technical assistance, public diplomacy, and behavioral change programs. You are not required to use these flash cards, but you may if you choose.
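If it helps to see AMEC’s four levels side by side, here is a minimal sketch in Python that sorts sample metrics into the levels described above. The specific metric names are invented for illustration and are not drawn from AMEC’s materials:

```python
# A minimal sketch of AMEC's four measurement levels applied to a
# hypothetical CSR campaign. All metric names are illustrative only.
amec_levels = {
    "out-takes": ["event feedback forms", "post-event survey comments"],
    "outputs":   ["press releases distributed", "social posts published"],
    "outcomes":  ["shift in stakeholder attitudes", "repeat purchase rate"],
    "impact":    ["progress toward the client's CSR enterprise goals"],
}

def classify(metric: str) -> str:
    """Return the AMEC level a metric belongs to, or 'unclassified'."""
    for level, metrics in amec_levels.items():
        if metric in metrics:
            return level
    return "unclassified"

print(classify("press releases distributed"))  # outputs
print(classify("repeat purchase rate"))        # outcomes
```

The point of the structure is simply that each metric should live at exactly one level, which keeps an evaluation plan from mixing activity counts (outputs) with stakeholder effects (outcomes).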
[Slide 17: Flash card #3: Logic frames]

I want to draw your attention especially to card number 3, which covers logic frames. A logic frame depicts the program interventions by specifying inputs, activities, outputs, outcomes, and impacts in a sequential series.

[Slide 18: Flash card #10: Indicators]

Also take a look at flash card number 10, which distinguishes indicators from outcomes.

[Slide 19: Homework]

The resources I’ve shared offer just a sliver of the information available on monitoring and evaluation, which has become its own specialty within the communication field as well as within program and project management. Regarding the building block for monitoring and evaluation, what I’m looking for is a one- to two-page Word document in which you present and justify your approach to monitoring and evaluation. If you designed SMART objectives, then metrics should naturally follow. The monitoring and evaluation section of your communication plan presents those metrics and explains how you would gather and use the information collected against them. This section assures your decision-makers that you have a valid and evidence-based approach to measure progress, use resources wisely, and course-correct as needed. At minimum, show alignment across your objective, measurement criteria, and measurement tool. Alternatively, you may follow the AMEC Integrated Evaluation Framework, and you may also draw insights from the flash cards.

[Final slide]

That concludes our examination of measurement. I look forward to reviewing your building blocks this week.

U.S. Copyright Law (Title 17 of the U.S. Code) governs the reproduction and redistribution of copyrighted material. Downloading this document for the purpose of redistribution is prohibited.

PLANNING FOR PUBLIC RELATIONS AND MARKETING, 6TH EDITION
LAURIE J.
WILSON, APR, Fellow PRSA, Brigham Young University
JOSEPH D. OGDEN, Brigham Young University
Kendall Hunt Publishing Company, www.kendallhunt.com
Cover images used under license from Shutterstock, Inc.
Send all inquiries to: 4050 Westmark Drive, Dubuque, IA 52004-1840
Copyright 1995, 1997, 2000 by Laurie J. Wilson
Copyright 2004, 2008, 2015 by Kendall Hunt Publishing Company
ISBN 978-1-4652-9774-7
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the copyright owner. Printed in the United States of America.

Chapter 12: Communications measurement and evaluation (p. 195)

What we should focus on as our organization’s marketing, public relations and advertising communication specialists is setting objectives that are measured in terms of results. We also need to justify budget expenditures in terms of results and determine program effectiveness in terms of results.

Evaluation of program effectiveness simply measures whether our tactics succeeded in delivering the right motivational messages to the right publics and caused them to act. Action should bring the results needed to meet the objectives. Were attitudes, opinions and behaviors changed? Did those changes produce the desired outcome and satisfy the goal within the allocated budget? Evaluation that does not measure end results simply cannot stand the test of today’s organizational managers. And communication professionals who cannot demonstrate that their efforts produce the desired outcomes within acceptable expenditures are themselves expendable.

Pamela Vaughan, HubSpot’s lead blog strategist, identifies the top five social media ROI metrics. In examining those metrics, we see validated Katie Paine’s statement at the beginning of this chapter. Measuring ROI of social media differs little from measuring ROI of traditional media. The tools may be different, but the criteria or metrics are essentially the same. According to Vaughan, we should:

First, measure reach — How far does your message spread?
Second, measure traffic — Does your reach generate traffic to websites or other places, virtual or physical, where you market your product or idea?
Third, measure conversion of traffic to leads — How much traffic is converted to interest or leads?
Fourth, measure conversion of leads to customers — How many of the leads are becoming customers or supporters?
Fifth, compare conversion rates of different tactics — Which tactics had the highest conversion rate of traffic to leads and of leads to customers?

Essentially, Vaughan is suggesting we measure whether the tactics we are using are getting the right messages to the key publics and motivating them to act so we achieve results. Regardless of the channels we use — traditional media, social media or any other channels and tactics — we must measure the effectiveness of our communication. Any other criterion or metric is meaningless.

Evaluation is actually relatively easy if it is planned from the beginning of a campaign using the Strategic Planning Matrix. Good evaluation owes a lot to good objectives. If the objectives are written as outcomes to be accomplished in order to reach the goal, then the evaluation will be results-oriented. Two steps must be considered in evaluating any plan. First, by what criteria should we judge success (or what are the metrics)? Second, what are the best tools to measure those criteria?

Evaluation criteria

It is particularly important in this era of “big data” to set clear objectives that then become the metrics or evaluation criteria by which we measure success or results. According to Ash Ashutosh, CEO of Actifio, a provider of data management software,

EVALUATION CRITERIA: Metrics or standards set to measure success.
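Vaughan’s five measures form a simple funnel, and the conversion rates in steps three through five are straightforward ratios. A minimal sketch, with all figures and tactic names invented for illustration:

```python
# A minimal funnel sketch for Vaughan's five social media ROI measures.
# All numbers and tactic names below are invented for illustration.
reach = 50_000    # step 1: people the message spread to
traffic = 2_500   # step 2: visits generated by that reach
leads = 250       # step 3: visitors converted to leads
customers = 25    # step 4: leads converted to customers

traffic_to_leads = leads / traffic      # fraction of traffic that becomes leads
leads_to_customers = customers / leads  # fraction of leads that become customers

print(f"traffic-to-leads rate:   {traffic_to_leads:.0%}")    # 10%
print(f"leads-to-customers rate: {leads_to_customers:.0%}")  # 10%

# Step 5: compare tactics by their (traffic-to-leads, leads-to-customers) rates.
tactics = {"tactic A": (0.10, 0.10), "tactic B": (0.06, 0.15)}
best_for_leads = max(tactics, key=lambda t: tactics[t][0])
print("highest traffic-to-leads rate:", best_for_leads)  # tactic A
```

Note that the comparison in step five can rank tactics differently depending on which conversion rate you weight: in this made-up example, tactic A converts more traffic to leads, while tactic B converts more leads to customers.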
Evaluation criteria and tools

A regional banking institution’s research shows that while the public perceives it is financially strong, well-managed and safe, brand loyalty even in the financial industry is dependent upon perceptions of the quality of customer service and the involvement of the organization in its local communities. The bank implemented a campaign highlighting community relations efforts and improved customer service. It had four objectives, each of which becomes a criterion to measure success.

Objective one: Improve the bank’s overall customer service ratings from 4.8 on a seven-point scale to 5.8 within six months (21 percent increase).
Criteria: Customer service ratings are 5.8 or higher on June 1, 2015 (six months after the campaign begins).
Tool: We’ll plan to use the bank’s automated email survey system to measure customer satisfaction ratings two weeks before and two weeks after June 1, 2015, and take the aggregate score. As an additional step, we will monitor monthly customer satisfaction scores to gauge our progress during the campaign.

Objective two: Improve the public perception of the bank as customer-service oriented from 40 percent using that descriptor to 60 percent using that descriptor within one year (50 percent increase).
Criteria: Sixty percent of customers will feel the bank is customer-service oriented on Jan. 1, 2016 (one year after the campaign begins).
Tool: Replicate the values perception survey upon which the campaign was built in June 2015 to measure progress toward the objective, and the first week of January 2016 to determine if the objective was met.

Objective three: Raise awareness of the bank’s local contributions to the community to 60 percent within six months.
Criteria: Sixty percent of customers will know about at least one of the bank’s contributions to the community on June 1, 2015.
Tool: Add an unaided recall question to the values perception survey upon which the campaign was built. Determine progress toward the objective with the interim survey in June 2015 and a final survey the first week of January 2016.

Objective four: Maintain a 94 percent customer retention rate during 2015 and a 95 percent retention rate for the four years after that (through 2019).
Criteria: The bank loses less than 6 percent of its current customers in 2015 and less than 5 percent of its customer base each year from 2016 to 2019.
Tool: Use the bank’s customer records to determine what percent of current customers remain each year. The number of customers on the first day of January each year will serve as the benchmark.

“Organizations must find smarter data management approaches that enable them to effectively corral and optimize their data.” One of the best ways to do this is to clearly define success in terms of specific, measurable objectives, so it becomes clear what data is relevant.

Criteria are automatically determined when objectives are set. Objectives are designed to provide direction to planning and to identify the results that define success. Clients and managers will judge success by the criteria (objectives) you have set. In this step of your plan, restate your objectives in terms of success, and designate an appropriate method for measuring each one, including a date. For example, if one of your objectives is to increase name recognition of your client from 30 to 80 percent, the metric for success would be written, “Achieve 80 percent name recognition of the client’s name among key publics by June 30, 2015.” The successful achievement of all campaign objectives should result in the accomplishment of the goal, which may or may not be directly measurable.
If you have followed the planning matrix, accomplishing the overall goal will signify to management that you have achieved success in all three standards identified above. You can justify the expenditure because you reached your goal within the proposed budget. You demonstrate effectiveness because your strategies and tactics combined to accomplish the goal. And you met the campaign objectives, which resulted in the accomplishment of the goal.

Make sure to establish meaningful measures of success. Message exposure doesn’t mean message receipt. Always keep in mind that behavior is the ultimate measure.

In addition to evaluating campaign results, you should look at the effectiveness of different parts of your plan, including how well strategies and tactics performed. You should also evaluate your own performance: your professionalism, creativity and ability to direct or implement a communication effort. You can add evaluation factors that specifically address your success and effectiveness in community relations, media relations or some other skill area. While media placement is not a measure of whether a public received and acted upon a message, it is still a factor to be evaluated within the context of effective strategies and tactics. Only through honest self-evaluation will you improve your skills. What did you do well? What could you have done better? Where do you need more training or experience? These are primarily internal measures and do not usually become part of the formal campaign put together using the Strategic Planning Matrix. But they are, nonetheless, important.

Converting your objectives to evaluation criteria is your primary evaluation of results. Additional criteria that address your team’s specific capability and expertise are highly useful secondary criteria to measure your effectiveness and improve your performance.
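The objective-to-criterion-to-tool alignment the chapter keeps returning to can be sketched as a small record. The field names and the `percent_increase` helper below are my own, not the book’s; the example values follow the chapter’s name-recognition illustration and the bank example:

```python
# A minimal sketch of the objective -> criterion -> tool alignment.
# Field names and the helper are illustrative; example text paraphrases
# the chapter's 30% -> 80% name-recognition example.
from dataclasses import dataclass

@dataclass
class EvaluationRow:
    objective: str  # the SMART objective, written as an outcome
    criterion: str  # the objective restated as a measurable success condition
    tool: str       # the research method used to gather the data

row = EvaluationRow(
    objective="Increase client name recognition from 30 to 80 percent "
              "among key publics by June 30, 2015.",
    criterion="Achieve 80 percent name recognition of the client's name "
              "among key publics by June 30, 2015.",
    tool="Random, statistically viable telephone survey of the key public "
         "population, June 28-30, 2015, compared against the Jan. 1-3 "
         "benchmark survey (30 percent recognition).",
)

def percent_increase(start: float, end: float) -> float:
    """Relative increase, e.g. 30 -> 80 is about 167 percent."""
    return (end - start) / start * 100

print(round(percent_increase(30, 80)))    # 167
print(round(percent_increase(4.8, 5.8)))  # 21 (matches the bank's Objective one)
print(round(percent_increase(40, 60)))    # 50 (matches the bank's Objective two)
```

The arithmetic also confirms the percentages quoted in the bank example: 4.8 to 5.8 on a seven-point scale is a 21 percent relative increase, and 40 to 60 percent is a 50 percent relative increase.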
Evaluation tools

EVALUATION TOOLS: Methods used to gather the data needed to assess whether evaluation criteria were met.

Each objective must be converted to an evaluation criterion or metric, and each criterion must be measurable by an evaluation tool. Measurement tools are essentially research tools. They are the same kinds of methodologies used in research, but they focus on outcomes. They include surveys, sales measures, vote counts, dollars raised or saved, legislative bills passed or failed and hundreds of other concrete outcomes. The rules of research apply in evaluation. Sound methodology will give you not only credibility but also reliable and valid data on which to base future efforts.

Typically, evaluation measurements require a benchmark measurement before the program begins, during the program or both. Without adequate planning for the evaluation process, the benchmarks are often not taken before the campaign starts, resulting in no data for comparison. Unless you know where you started, you cannot determine how far you’ve come.

Although measurement tools are essentially the same as research methods, many research organizations have specialized in evaluative methods. It would be wise to access the websites and newsletters on evaluation and measurement produced by specialty firms like Paine Publishing or Cision. While evaluation tools for some objectives may be obvious, others may require complicated formulas that would, for example, combine measures of sales, media placements and social media referrals in some kind of sliding scale that measures the effect of communications, marketing and customer engagement on product sales.

Clearly articulated evaluation tools must include the source of information and how it will be obtained. Include all necessary tasks when describing the evaluation tool for each criterion.
If you are measuring the criteria mentioned above, your evaluation tool would read something like this: “Conduct a random, statistically viable telephone survey of the key public population June 28 to 30, 2015, to determine what percent recognize the client’s name.” This data could then be compared to the survey conducted at the beginning of the campaign, Jan. 1 to 3, 2015, which indicated 30 percent name recognition for the client.

Adding evaluation tools to calendars and budgets

The evaluation process necessitates reviewing your calendar and budget to ensure that all evaluation tools are scheduled and costs estimated. You can designate a separate section of the calendar and budget to specifically address the planned evaluation. A wiser choice might be to include evaluation as part of the planned strategies and tactics for each public. Only