
Form PX14A6G Meta Platforms, Inc. Filed by: AS YOU SOW

April 22, 2022 4:47 PM EDT

 

Main Post Office, P.O. Box 751
Berkeley, CA 94704
www.asyousow.org

   

BUILDING A SAFE, JUST, AND SUSTAINABLE WORLD SINCE 1992

 

 

Notice of Exempt Solicitation Pursuant to Rule 14a-103

 

Name of the Registrant: Meta Platforms Inc (FB)
Name of persons relying on exemption: As You Sow
Address of persons relying on exemption: Main Post Office, P.O. Box 751, Berkeley, CA 94704

 

Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule, but is made voluntarily in the interest of public disclosure and consideration of these important issues.

 

 

 

Meta Platforms Inc (Formerly Facebook) (FB)
Vote Yes: Item #8 – Report on Community Standards Enforcement

Annual Meeting: May 25, 2022

 

CONTACT: Andrew Behar | [email protected]

 

 

THE RESOLUTION

 

RESOLVED: Shareholders request the Board, at reasonable expense and excluding proprietary or legally privileged information, prepare a report analyzing why the enforcement of “Community Standards” as described in the “Transparency Center” has proven ineffective at controlling the dissemination of user content that contains or promotes hate speech, disinformation, or content that incites violence and/or harm to public health or personal safety.

 

SUPPORTING STATEMENT: Proponent suggests the report include, in Board and management discretion:

 

·A quantitative and qualitative assessment by an external, independent panel of qualified computer scientists of the effectiveness of Meta’s algorithms to locate and eliminate content that violates the Community Standards

 

·An assessment of the effectiveness of Meta’s staff and contractors in locating and eliminating content that violated the Community Standards

 

·An examination of benefits to users and impact to revenue if the Company would voluntarily follow existing legal frameworks established for broadcast networks (e.g. laws forbidding child pornography and rules governing political ads)

 

·An analysis of the benefits of the Company continuing to conduct technology impact assessments focused on how Meta’s platforms affect society.

 

This report should cover each of Meta’s major products, including Facebook, Messenger, Instagram, WhatsApp, and any other app that reaches over 100 million users.

 

   
 

 

     

2022 Proxy Memo

Meta Platforms, Inc. | Report on Community Standards Enforcement

 

 

SUMMARY

 

Over the past several years, Meta Platforms, and Facebook in particular, has been criticized for proliferating false and divisive content, including political advertisements containing deliberate lies and mistruths,1 the continued propagation of hate speech and extremist groups,2 mental health harms arising from the use of Meta’s platforms,3 and anti-immigrant violence around the world.4

 

Meta has been the subject of widespread public backlash and, as a result, has seen the value of its shares drop during some of its recent scandals. For example, the Company’s stock dropped nearly 5% after whistleblower Frances Haugen publicly stated that Facebook’s algorithms push misinformation onto users, and that Facebook executives were aware of the platform’s negative effects on young users.5

 

Meta’s lack of progress in enforcing its Community Standards could have detrimental long-term effects for shareholders and outside stakeholders. Meta has a responsibility to its investors to ensure its sites are safe and remain well-respected. An analysis of past and current efforts to mitigate misinformation, and of the efficacy of those practices, is a vital step in this process.

 

The Proposal asks for a report analyzing why the enforcement of Community Standards as described in the Company’s Transparency Center has proven ineffective at controlling the dissemination of user content that promotes hate speech, disinformation, or content that incites violence and/or harm to public health or personal safety.

 

RATIONALE FOR A YES VOTE

 

1.The proliferation of harmful information on Meta’s sites not only leads to negative public perception and reputational risks, but importantly, leads to real world impact.

 

2.Meta does not provide sufficient analysis to permit its shareholders to assess why the enforcement of “Community Standards” as described in the “Transparency Center” has proven to be ineffective.

 

3.External trends, such as regulation and advertiser retaliation, could pose long-term risks to Meta.

 

DISCUSSION

 

1.The proliferation of harmful information on Meta not only leads to negative public perception and reputational risks, but importantly, leads to real world impact.

 

 

_____________________________

 

1 https://www.washingtonpost.com/technology/2019/10/10/facebook-policy-political-speech-lets-politicians-lie-ads/

2 https://www.dailydot.com/debug/hate-speech-facebook/

3 https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739

4 https://www.dw.com/en/new-study-shows-afd-facebook-posts-spur-anti-refugee-attacks/a-41972992

5 See, https://www.cnbc.com/2021/10/04/facebook-whistleblower-reveals-identity-ahead-of-60-minutes-interview.html; See also, https://www.cnbc.com/2021/10/04/facebook-shares-drop-5percent-after-site-outage-and-whistleblower-interview.html

 


One of the most important elements of success for social media companies is the preservation of user trust. When trust is broken, the resulting reputational damage can have a lasting negative impact on the goals of maintaining a strong user base and strong advertising partnerships. According to Ernst & Young, without trust, media companies cannot sustain the value of their brand or drive the subscription and advertising revenues that are their lifeblood.6 The most effective way for social media companies to fight false news is to support employees and make the investments necessary to monitor content and react quickly.

 

Yet Meta continues to ignore the concerns of its employees and the findings of internal reviews about the ineffectiveness of its current programs. For example, internal documents obtained by the New York Times show the degree to which Facebook knew of extremist movements and groups on its site that were trying to polarize American voters before the 2020 election.7 In a postmortem internal report on Facebook’s handling of the 2020 election, employees wrote that Facebook’s election misinformation rules left too many gray areas. As a result, posts that “could be construed as reasonable doubts about election processes” were not removed because they did not violate the letter of those rules. Those posts then created an environment that contributed to social instability, the report said.8

 

Damage to Meta’s reputation also decreases employee morale, which could increase risk to the Company and shareholders. According to a report by the New York Times, “about half [of employees] felt that Facebook was having a positive impact on the world, down from roughly three-quarters earlier this year.”9 Employee trust in leadership, and employees’ intent to remain at the company, decreased as well. Such a significant decrease in employee satisfaction could be harmful for stockholders: research shows a strong correlation between company morale and productivity, which could in turn affect profits.10

 

Decisions made by Meta across its platforms can lead to real world harm. The Company’s plans to allow end-to-end encryption for all private messaging has raised concern about the potential for easier spread of child pornography and exploitation materials and less accountability by purveyors. Its decision to delay such encryption through 2023 to assess a range of concerns is to be applauded. Facebook also stopped one program intended to protect children owing to fears that a new EU privacy directive rendered it unlawful, but the US National Center for Missing and Exploited Children argued that Facebook has drawn the wrong conclusion, as well as pointing to continued efforts from companies including Google and Microsoft.11

 

 

_____________________________

 

6 https://www.ey.com/en_us/forensic-integrity-services/how-media-organizations-can-get-real-and-confront-fake-news

7 https://www.nytimes.com/2021/10/22/technology/facebook-election-misinformation.html?referringSource=articleShare

8 https://www.nytimes.com/2021/10/22/technology/facebook-election-misinformation.html?referringSource=articleShare

9 https://www.nytimes.com/2020/11/24/technology/facebook-election-misinformation.html

10 https://www.researchgate.net/publication/309494308_Correlation_of_Morale_Productivity_and_Profit_in_Organizations

11 https://www.theguardian.com/technology/2021/jan/21/facebook-admits-encryption-will-harm-efforts-to-prevent-child-exploitation

 


A University of Warwick study found a direct correlation between social media posts by the far-right Alternative for Germany (AfD) party and hate crimes in Germany, to the extent that localized internet outages actually resulted in a reduction in hate crimes in some areas.12 Data collected from AfD’s Facebook and Twitter accounts found that “right-wing anti-refugee sentiment…predicts violent crimes against refugees in otherwise similar municipalities with higher social media usage.”13

 

There is major investor concern that Meta is not taking its role as a significant media presence seriously. For example, Facebook’s chief operating officer, Sheryl Sandberg, denied that the site had a major role in the January 6 U.S. Capitol riot, stating, “I think these events were largely organized on platforms that don’t have our abilities to stop hate, don’t have our standards and don’t have our transparency.” Her remarks came even as reporters continued to catalogue Facebook groups with tens of thousands of members that were openly planning events with slogans such as “If they won’t hear us, they will fear us: Occupy Congress.”14 Twitter, by contrast, has admitted that its platform played a part in the January 6 violence, with its head of policy, Nick Pickles, stating, “My colleagues were shocked watching the events in the Capitol and I think it’s impossible for anyone to look at that and not think ‘Did we play a part in this?’ We have to conclude: yes.”15

 

Reputational damage to Meta increases long-term risk for shareholders. Corporations, individuals, and employees are signaling that change within the Company must occur. Given the ability of false news and divisiveness to cause widespread harm to the Meta brand and create risk for shareholders, a report on the effectiveness of mitigation practices is warranted.

 

2.Meta does not provide sufficient analysis to permit its shareholders to assess why the enforcement of “Community Standards” as described in the “Transparency Center” has proven to be ineffective.

 

The Proposal asks for an evaluation as to why the current enforcement strategies of Meta’s Community Standards are ineffective. Meta’s Transparency Center releases quarterly Community Standard Enforcement Reports. For the Fourth Quarter 2021, the report found that harmful content on Facebook and Instagram “remained relatively consistent.”16 Despite reports on the number of posts Meta “takes action” on, more information is needed on how this data affects policy and strategy decisions within the company, and whether these decisions lead to improvements.

 

A NewVantage Partners survey of Fortune 1000 senior executives found that highly data-driven organizations are three times more likely to report significant improvements in decision-making quality than those who rely less on data.17 Given the importance of data-driven decision-making, shareholders deserve a clear understanding not only of how Meta is fighting misinformation, but of how effective those programs are. Decisions should be made by analyzing outcomes, and the effectiveness of the programs, not merely by counting the programs launched, the actions taken, or the pieces of content removed.

 

 

_____________________________

 

12 https://www.dw.com/en/new-study-shows-afd-facebook-posts-spur-anti-refugee-attacks/a-41972992

13 https://www.dw.com/en/new-study-shows-afd-facebook-posts-spur-anti-refugee-attacks/a-41972992

14 https://www.theguardian.com/technology/2021/jan/21/facebook-admits-encryption-will-harm-efforts-to-prevent-child-exploitation

15 https://www.theguardian.com/technology/2021/jan/21/facebook-admits-encryption-will-harm-efforts-to-prevent-child-exploitation

16 https://about.fb.com/news/2022/03/community-standards-enforcement-report-q4-2021/

17 https://online.hbs.edu/blog/post/data-driven-decision-making

 


These asks reflect recommendations laid out in a 2019 report that Facebook chartered. Facebook created the Data Transparency Advisory Group, in collaboration with the Justice Collaboratory at Yale Law School, to assess Facebook’s Community Standards Enforcement Report (CSER) and provide recommendations on how to improve reporting practices.18 It recommended Facebook:

 

·Release accuracy rates for both human and automated decisions;19
·Explore ways of relating prevalence metrics to real-world harm (e.g., is an increase in prevalence of hate speech posts correlated with an increase in ethnic violence in the region; or an increase in removals of hate speech posts correlated with a decrease in ethnic violence?);20
·Break out actioned content measures by type of action taken (e.g., content taken down, content covered with warning, account disabled);21
·Explore ways to enhance bottom-up (as opposed to top-down) governance to include more user participation; and
·For the sake of transparency, explore ways of releasing anonymized and aggregated versions of the data upon which the metrics in the CSER are based. This would allow external researchers to verify Facebook’s representations.22

 

The report notes that, “We did not conduct a sufficiently detailed audit to assess how accurately Facebook implements this [enforcement] process, or whether the performance reported in the CSER is accurate.”23 This highlights the importance of the Proposal’s request to have a report fully drafted by external and independent experts. A third-party report on the effectiveness of enforcement policies would be instrumental in ensuring that our Company is doing everything in its power to limit the proliferation of divisive or untrue speech.

 

3.External trends, such as regulation and advertiser retaliation, could pose long-term risks to Meta.

 

“What could really hurt Facebook is the long-term effect of its perceived reputation and the association with being viewed as a publisher of ‘hate speech’ and other inappropriate content,” said Stephen Hahn-Griffiths, the executive vice president of the public opinion analysis company RepTrak.24


A whistleblower complaint filed with the SEC argues that the Company has failed to adequately warn investors about the material risks of dangerous and criminal behavior, terrorist content, hate speech, and misinformation on its sites.25 This resulted in Congressional hearings in 2021 on Facebook’s mismanagement of these scandals. This could lead to renewed sentiment from both the public and government that Facebook’s policies need to be examined, and perhaps regulated.

 

 

_____________________________

 

18 https://law.yale.edu/yls-today/news/facebook-data-transparency-advisory-group-releases-final-report

19 https://law.yale.edu/sites/default/files/area/center/justice/document/dtag_report_5.22.2019.pdf p.8

20 https://law.yale.edu/sites/default/files/area/center/justice/document/dtag_report_5.22.2019.pdf p.9

21 https://law.yale.edu/sites/default/files/area/center/justice/document/dtag_report_5.22.2019.pdf p.10

22 https://law.yale.edu/sites/default/files/area/center/justice/document/dtag_report_5.22.2019.pdf p.10

23 https://law.yale.edu/sites/default/files/area/center/justice/document/dtag_report_5.22.2019.pdf p.8

24 https://www.nytimes.com/2020/08/01/business/media/facebook-boycott.html

25 https://www.washingtonpost.com/technology/2021/10/22/facebook-new-whistleblower-complaint/

 


Facebook has historically been protected from liability for hate speech and false information by Section 230 of the Communications Decency Act. The Act, passed in 1996, shields internet companies from liability for content on their sites by providing that they are not to be treated as the publisher of that information, but merely as a platform for it. However, many legislators have been pushing to repeal the Act. Members of Congress from both political parties, including Republican Senator Ted Cruz of Texas and Democratic House Speaker Nancy Pelosi of California, have suggested that the Act could be rolled back.26 President Joe Biden spoke on this issue, saying, “the idea…is that Section 230 should be revoked, immediately should be revoked, number one. For Zuckerberg and other platforms.”27 Given that amending or repealing Section 230 has become a more realistic possibility, it would be wise to understand the effectiveness of current practices and to ensure mitigation strategies are working.

 

Shareholders would benefit from an examination of how users and revenue would be impacted if the Company voluntarily followed existing legal frameworks for broadcast networks. This would include following laws forbidding child pornography and rules governing political ads.

 

Such analysis could provide greater clarity on how improvements to its Community Standards enforcement could improve Meta’s standing in the eyes of the private sector. Facebook has already seen advertisers retaliate over its content policies. In 2020, a month-long advertising boycott by many major corporations, unified under the hashtag #StopHateForProfit, signaled that companies were unhappy with Facebook’s handling of hate speech. Many other companies, such as Target, Nike, and Hershey, have intentionally reduced their advertising spending on Facebook.28 While the official boycott ended, several top advertisers, including Verizon, Coca-Cola, Clorox, and HP, continued their boycotts. As recently as October 2021, Patagonia has maintained its advertising boycott.29

 

Given that advertising revenue continues to make up the bulk of Facebook’s income,30 the Company should examine how aligning with political advertising regulations and effectively enforcing its Community Standards would impact its bottom line. Meta notes in its opposition statement that it spent approximately $5 billion on safety and security in 2021 alone. Given these costs, shareholders require greater transparency on the effectiveness of Meta’s current actions and on the costs and benefits of additional or different actions.

 

 

RESPONSE TO META BOARD OF DIRECTORS’ STATEMENT IN OPPOSITION

 

In Meta’s Opposition Statement, the company notes, “We also publish quarterly Community Standards Enforcement Reports that disclose how we are doing at enforcing our content policies on Facebook and Instagram. In these quarterly reports, we disclose the prevalence of several types of violating content on Facebook and Instagram, the amount of this content we took action on, the amount of content we found proactively before people reported it, the amount of content we actioned that people appealed, and the amount of content we restored after removing. In addition, we hold conference calls with the media after we issue each of these quarterly reports to provide further clarity on our actions.”

 

 

_____________________________

 

26 https://www.nytimes.com/2020/05/28/business/section-230-internet-speech.html

27 https://www.theverge.com/2020/1/17/21070403/joe-biden-president-election-section-230-communications-decency-act-revoke

28 https://fortune.com/2020/11/07/facebook-ad-boycott-big-brands-lego-clorox-verizon-microsoft-hp/

29 https://www.cnn.com/2021/10/28/business/patagonia-ceo-facebook-boycott/index.html

30 https://www.nytimes.com/2021/07/28/business/facebook-q2-earnings.html

 


While investors are pleased to see this progress, it does not fully respond to the requests of this Proposal. Shareholders would like to see an external, independent report not only on the metrics Meta lays out above, but on the overall effectiveness of both its technical and human enforcement strategies. The Company notes that, outside of its reports and other public disclosures, it collaborates with global experts to improve its services. Again, this is a welcome strategy; but given Meta’s reputation for lacking transparency, and the continuing real-world harms associated with its platforms, an independent report by a panel of experts is needed to restore the trust of both investors and the public.

 

Meta states, “To strengthen our commitment in enforcing our policies, we have over 40,000 people working on safety and security issues, including over 15,000 people who review content in more than 70 languages in more than 20 locations across the world. We also spent approximately $5 billion on safety and security in 2021 alone.” Given the resources Meta is allotting to safety and security issues, it is important for investors to have a better understanding of how effectively this money is being spent toward the goal of ending hate speech, human harm, and disinformation associated with Meta’s platforms.

 

CONCLUSION

 

Vote “Yes” on this Shareholder Proposal Regarding Report on Community Standards Enforcement

 

By supporting this proposal, shareholders underscore the importance of Meta assessing the success or failure of its efforts to manage misinformation and divisiveness on its platform. A failure to stem such posts creates significant risk of reputational damage and potential loss of customers and revenue. As Meta continues to work against hate speech and misinformation on its platform, this assessment could guide future decisions at Meta and promote a positive reputation, minimizing the long-term risk for investors.

 

--

 

For questions, please contact Andrew Behar, As You Sow, [email protected]

 

THE FOREGOING INFORMATION MAY BE DISSEMINATED TO SHAREHOLDERS VIA TELEPHONE, U.S. MAIL, E-MAIL, CERTAIN WEBSITES AND CERTAIN SOCIAL MEDIA VENUES, AND SHOULD NOT BE CONSTRUED AS INVESTMENT ADVICE OR AS A SOLICITATION OF AUTHORITY TO VOTE YOUR PROXY. THE COST OF DISSEMINATING THE FOREGOING INFORMATION TO SHAREHOLDERS IS BEING BORNE ENTIRELY BY ONE OR MORE OF THE CO-FILERS. PROXY CARDS WILL NOT BE ACCEPTED BY ANY CO-FILER. PLEASE DO NOT SEND YOUR PROXY TO ANY CO-FILER. TO VOTE YOUR PROXY, PLEASE FOLLOW THE INSTRUCTIONS ON YOUR PROXY CARD.

 

 
