By aggregating existing platform transparency reports and adding policy-level granularity, the new document creates a common framework that enables advertisers to assess progress against brand safety for each platform member of GARM. The new framework also drives simplicity and focus, and highlights the use of best-practice methodology.
The GARM Aggregated Measurement Report is built around four key questions marketers can use to assess progress over time. The report is consistent with the common framework used to define harmful content not suitable for advertising and introduces aggregated reporting. Together, the common definitions and aggregated reporting deliver consistency in well-established practices while advancing best practices into industry standards.
Ultimately, the report provides a common and focused framework for advertising industry stakeholders to make more informed decisions about their advertising investment.
Highlights from the latest data show that more than eight in ten of the 3.3 billion pieces of content removed across the platforms participating in the report come from three leading categories – Spam, Adult & Explicit Content, and Hate Speech & Acts of Aggression.
The data also shows growth in action taken on Hate Speech & Acts of Aggression across platforms. GARM platforms have reported increased enforcement activity and impact, with significant progress by YouTube in the number of account removals, Facebook in the reduction of prevalence, and Twitter in the number of pieces of content removed.
These initial improvements have occurred amid an increased reliance on automated content moderation to help manage blocking and reinstatements due to COVID-19 disruptions that resulted in moderation teams working with limited capacity.
GARM is a cross-industry initiative founded and led by the World Federation of Advertisers (WFA) and supported by other trade bodies, including the Association of National Advertisers (ANA), Incorporated Society of British Advertisers (ISBA) and the American Association of Advertising Agencies (4A’s).
“We have built on our agreed definitions to produce a detailed database of the progress that’s being made on reducing harmful content and the potential for monetization across the digital platforms. The collaboration between advertisers, agencies and platforms has been very constructive and we now have common ground to drive even greater progress for the benefit of society, marketers and the long-term health of the digital ecosystem,” said Stephan Loerke, CEO of the WFA.
The report follows nine months of collaborative workshops between major advertisers, agencies and key global platforms working together as one of GARM’s Working Groups. It brings together, for the first time, data in a single, agreed location around four core questions and eight authorised metrics agreed as critical to tracking progress on brand safety.
The Aggregated Measurement Report provides a simple and transparent framework based around four core questions that advertisers can use to understand how well the platforms are enforcing their policies in the context of the brand safety floor:
• How safe is the platform for consumers? The prevalence of harmful content will be reported as the number of views of harmful content as a percentage of all views of content.
• How safe is the platform for advertisers? The incidence of advertising appearing in the context of harmful content will be reported as the number of ad impressions on harmful content as a percentage of all ad impressions. For newsfeed environments, the overall consumer prevalence measure above will be reported.
• How effective is the platform at enforcing its safety policy? This will be reported as the total number of pieces of harmful content removed and the number of times that content has been viewed.
• How responsive is the platform at correcting mistakes? This will be reported as the total number of appeals made by users and the number of reinstatements made by platforms.
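As an illustrative sketch only, the four core metrics above reduce to simple ratios and counts. The figures and variable names below are hypothetical, invented for demonstration; they are not actual GARM data, and GARM's official methodology is defined by the report itself.

```python
def pct(part: int, whole: int) -> float:
    """Express part as a percentage of whole, guarding against division by zero."""
    return 100.0 * part / whole if whole else 0.0

# Hypothetical raw counts for one reporting period (all numbers invented)
views_total = 1_000_000_000       # all content views on the platform
views_harmful = 250_000           # views of content that violated policy
impressions_total = 50_000_000    # all ad impressions served
impressions_on_harmful = 4_000    # ad impressions adjacent to harmful content
pieces_removed = 12_000_000       # pieces of harmful content removed
appeals = 300_000                 # user appeals against removals
reinstatements = 45_000           # removals reversed after appeal

# Q1: consumer safety -- prevalence of harmful content among all views
prevalence = pct(views_harmful, views_total)

# Q2: advertiser safety -- incidence of ads appearing on harmful content
ad_incidence = pct(impressions_on_harmful, impressions_total)

# Q3 and Q4 are reported as absolute counts rather than ratios
print(f"Prevalence: {prevalence:.4f}% of views")
print(f"Ad incidence: {ad_incidence:.4f}% of impressions")
print(f"Removed: {pieces_removed:,} pieces; "
      f"appeals: {appeals:,}; reinstated: {reinstatements:,}")
```

Note that for newsfeed environments the report substitutes the overall prevalence measure (Q1) for the ad-incidence measure (Q2), as described above.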
Independent oversight and measurement are critical to the GARM initiative, helping create accountability on the challenge of harmful content. The report enables each member to ask: How are we progressing collectively? How are we progressing individually? How are we tackling each of these topic areas?
Today’s report includes data self-reported by Facebook, Instagram, Pinterest, Snap, TikTok, Twitter and YouTube. The full data set can be downloaded here. Twitch, which only joined GARM in March, will join the reporting process for the next report, due later this year.
GARM Working Groups continue to work on other areas of focus, including better adjacency controls for brands, and hope to announce further initiatives later in the year.
Raja Rajamannar, Chief Marketing and Communications Officer, Mastercard and WFA President: “This report is great progress for our joint efforts, bringing together consistent and reliable data that marketers can depend on. It establishes common and collective benchmarks that reinforce our goals and help brand leaders, organizations and agencies make sure we keep media environments safe and secure.”
Carolyn Everson, VP, Global Business Group, Facebook: “In 2018, we started Facebook’s transparency report to help people understand how we’re doing at enforcing our policies. However, we recognize marketers need to be able to have a single report to understand the industry’s progress through the lens of a common language and framework. The GARM Aggregated Measurement Report is a big step forward to help simplify these reports for marketers.”
Debbie Weinstein, Vice President, Global Solutions, YouTube: “The Aggregated Measurement Report is a great example of GARM delivering on its mission to bring the industry together to improve the safety, trust and sustainability of digital media. It is our hope that the report helps advertisers more easily assess the progress platforms like YouTube are making in this critical area.”
Sarah Personette, VP, Global Client Solutions, Twitter: “From its conception, GARM has fostered an open and honest exchange of ideas to solve critical problems that will help drive positive global impact. Twitter believes in the power of a public and open conversation, and our ongoing work with GARM further reinforces our enduring commitment to provide transparency into the work we are doing to support the health of the public conversation.”