Form PX14A6G Meta Platforms, Inc. Filed by: Shareholder Commons

April 15, 2022 5:30 PM EDT


United States Securities and Exchange Commission
Washington, D.C. 20549




Pursuant to Rule 14a-103








Name of the Registrant: Meta Platforms Inc.


Name of persons relying on exemption: The Shareholder Commons, Inc.


Address of persons relying on exemption: PO Box 7545, Wilmington, Delaware 19803-7545


Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule but is made voluntarily in the interest of public disclosure and consideration of these important issues.










The Shareholder Commons urges you to vote “FOR” Proposal 7 on the proxy, the shareholder proposal requesting that the Board of Meta Platforms Inc. (“Meta” or the “Company”) commission and publish a report on the risks created by its practice of prioritizing financial performance over mitigating the harm it poses to the economy. The Proposal also requests that the report address any threat this prioritization poses to its diversified shareholders’ portfolios.


The Shareholder Commons is a non-profit advocate for diversified shareholders that works with investors to stop portfolio companies from prioritizing company finances when doing so threatens the value of investors’ diversified portfolios.







A. The Proposal


With more than 3.5 billion regular users of its communications platforms, Meta is possibly the most vital communications link among people on the planet. With its unparalleled reach, it can be a truly positive force for good, connecting people and providing critical information. At the same time, the sheer magnitude of its connectivity puts Meta at risk of being a dangerous force that enables bullying, criminal enterprise, genocide, and harmfully false content.


Meta is a for-profit company that understandably focuses on cash flows and profits, and recent news reports suggest that its management believes the “rules of the game” require that it favor Meta business interests when they clash with public interests.


The Proposal asks for a report on the risks inherent in this prioritization of financial returns over economic risk and how such prioritization may threaten the interests of shareholders who rely on a thriving economy to support their diversified portfolios:


RESOLVED, shareholders ask that the board commission and disclose a report on (1) risks created by Company business practices that prioritize internal financial return over healthy social and environmental systems and (2) the manner in which such risks threaten the returns of its diversified shareholders who rely on a productive economy to support their investment portfolios.




Voting “FOR” Proposal 7 does not constitute a criticism of Meta’s business decisions. The requested report will help diversified shareholders understand how Meta strikes the balance on their behalf between its own share value and investors’ broader economic interests.


The report will help shareholders understand the true cost of Company decisions that can significantly influence mental health, violence around the globe, climate change, vaccine utility, and social stability, among other systemic issues. Without the report, diversified investors and the fiduciaries who vote on their behalf will not have the information they need to understand the effects Company decisions may have on shareholders’ investment portfolios.


B. The Proposal will help Meta and its shareholders navigate the difficult balance between optimizing Meta’s financial performance and limiting practices that threaten the economy


Meta’s Family of Apps (Facebook, Instagram, Messenger, and WhatsApp) are used by 3.59 billion people monthly.1 This connection to nearly half the world’s population provides extremely remunerative business opportunities, but also creates an enormous responsibility. Meta can connect its users and provide information in ways that truly improve lives. At the same time, however, the power of its reach creates potential for serious harm, including genocide, criminality, sabotaging and stifling public discourse, and undermining civil society.




1 Company’s 2022 Annual Report on Form 10-K, available at







Recognizing its responsibility, Meta has tried to address abuses on its platforms while maintaining its role as a public square for diverse viewpoints. However, as with any for-profit business enterprise, Meta’s ambition to improve its impact may face perceived constraints to the extent such improvements threaten profitability. For example, managers seeking to maximize financial returns might resist reducing the flow of engaging but harmful false information to avoid reducing traffic on its platforms. Alternatively, managers might refuse to carry unpopular viewpoints that are important to include in the public discourse if advertisers threaten to pull business because of Meta’s association with such viewpoints.


If Meta management seeks to optimize the financial value of the Company for shareholders, they may feel constrained to trade off positive impact for profits. Meta appears to have made this trade-off in multiple instances, continuing to employ policies and algorithms that boost financial returns but threaten the broader economy, as the following examples demonstrate:


1. Meta shut down its team that prioritized people over profits


In October of 2021, Time ran an investigative piece entitled, “How Facebook Forced a Reckoning by Shutting down the Team that Put People ahead of Profits.”2 The story detailed the work of Meta’s civic engagement team and its attempts to limit harmful social impact from algorithms used to drive more traffic (and thus more revenue). Among its conclusions:


But for many of the Facebook employees who had worked on the team, including a veteran product manager from Iowa named Frances Haugen, the message was clear: Facebook no longer wanted to concentrate power in a team whose priority was to put people ahead of profits…


Facebook’s focus on increasing user engagement, which ultimately drives ad revenue and staves off competition, [Haugen] argued, may keep users coming back to the site day after day—but also systematically boosts content that is polarizing, misinformative and angry, and which can send users down dark rabbit holes of political extremism or, in the case of teen girls, body dysmorphia and eating disorders.


One former member of the team told Time that before it was dissolved, “The team prioritized societal good over Facebook good. It was a team that really cared about the ways to address societal problems first and foremost. It was not a team that was dedicated to contributing to Facebook’s bottom line.”


The Time story was based in part on testimony from Haugen, who reviewed numerous internal documents and testified before the United States Congress, explaining,




2 Billy Perrigo, “How Facebook Forced a Reckoning by Shutting down the Team that Put People ahead of Profits,” Time (October 7, 2021), available at







The company’s leadership knows ways to make Facebook and Instagram safer and won’t make the necessary changes because they put their immense profits before people… This is not simply a matter of some social media users being angry or unstable. Facebook became a $1 trillion company by paying for its profits with our safety, including the safety of our children.3


2. Meta policies create fertile fields for various forms of misinformation and harmful content


Meta uses algorithms that dictate the content its users see. These algorithms are designed to create user traffic and engagement, which allow Meta to increase its advertising revenues. As Meta explains, substantially all its revenue comes from advertising, and that revenue is at risk from “decreases in user engagement” and “inability to continue to increase user access to and engagement with our products.”4


This engagement with Meta platforms affects users’ perceptions, and these perceptions affect social institutions and the ability of the global community to address potentially catastrophic threats.


a. Climate change


Climate change and humans’ contribution to it presents a clear example of the sort of subject matter about which damaging misinformation and disinformation are shared widely across Meta’s influential platforms, deeply skewing users’ understanding of an existential threat. As one expert bluntly stated:


Facebook is becoming the last bastion of climate denial.5


One recent report quantified the problem:


[We find] an average range of between 818,000 and 1.36 million views of climate misinformation every day. Just 3.6% of this content has been fact checked. This report also finds that Facebook continues to directly receive thousands of dollars while placing climate misinformation on its advertising platform. This issue in particular has an easy fix, was raised over a year ago, and yet nothing has been done. …




3 United States Senate Committee on Commerce, Science and Transportation, Sub-Committee on Consumer Protection, Product Safety, and Data Security, “Statement of Frances Haugen,” (October 4, 2021), available at

4 Company’s 2021 Annual Report filed on Form 10-K, available at

5 Friends of the Earth press release, “New Facebook study: 99 percent of climate disinformation goes unchecked,” (September 16, 2021), available at







[W]hile Facebook is not the only social platform without an unambiguous policy on climate misinformation, it is the largest, and as such represents one of the biggest, if not the biggest, threat to climate action in the months and years ahead.6


b. COVID-19 vaccines


Information about the current pandemic presents more global risks. Indeed, Company personnel believe that COVID-19 content on Meta’s platforms is harmful, as the following quotes from internal documents concerning COVID-19 vaccination show:


·We know that COVID vaccine hesitancy has the potential to cause severe societal harm…


·Vaccine hesitancy in comments is rampant.


·Our ability to detect vaccine-hesitant comments is bad in English, and basically non-existent elsewhere.7


These documents were disclosed in a Wall Street Journal series on Meta that called attention to its internal divisions over the need to address the harm implicit in its business model. In the article that addressed vaccine misinformation, the reporters concluded that Meta’s business model itself was harmful:


The vaccine documents are part of a collection of internal communications reviewed by the Journal that offer an unparalleled picture of how Facebook is acutely aware that the products and systems central to its business success routinely fail and cause harm.8


c. Body image


Internal documents also show an awareness within Meta that its Instagram platform created severe psychological risks for teens:


For the past three years, Facebook has been conducting studies into how its photo-sharing app affects its millions of young users. Repeatedly, the company’s researchers found that Instagram is harmful for a sizable percentage of them, most notably teenage girls.


“We make body image issues worse for one in three teen girls,” said one slide from 2019, summarizing research about teen girls who experience the issues.




6 Stop Funding Heat, “In Denial - Facebook’s Growing Friendship with Climate Misinformation,” (November 2021), available at

7 Sam Schechner, Jeff Horwitz, and Emily Glazer, “How Facebook Hobbled Mark Zuckerberg’s Bid to Get America Vaccinated,” The Wall Street Journal (September 17, 2021), available at

8 Id.







“Teens blame Instagram for increases in the rate of anxiety and depression,” said another slide. “This reaction was unprompted and consistent across all groups.” …


Expanding its base of young users is vital to the company’s more than $100 billion in annual revenue, and it doesn’t want to jeopardize their engagement with the platform.9


3. Meta moves slowly to address criminal behavior


Another story in the Wall Street Journal series described Facebook’s weak response to its own employees’ reports of human trafficking and drug cartel activity on the platform:


Employees flagged that human traffickers in the Middle East used the site to lure women into abusive employment situations in which they were treated like slaves or forced to perform sex work. They warned that armed groups in Ethiopia used the site to incite violence against ethnic minorities. They sent alerts to their bosses on organ selling, pornography and government action against political dissent, according to the documents. …


When problems have surfaced publicly, Facebook has said it addressed them by taking down offending posts. But it hasn’t fixed the systems that allowed offenders to repeat the bad behavior. Instead, priority is given to retaining users, helping business partners and at times placating authoritarian governments, whose support Facebook sometimes needs to operate within their borders, the documents show.


Facebook treats harm in developing countries as “simply the cost of doing business” in those places, said Brian Boland, a former Facebook vice president who oversaw partnerships with internet providers in Africa and Asia before resigning at the end of last year.10


While these human rights violations are themselves horrendous, they also sap the global economy of productivity over time, as human potential is wasted and the networks of trust that undergird a healthy economy are compromised. As an example, the article detailed how Meta allowed its platforms to be used as tools of a drug cartel in Mexico, threatening the rule of law that buttresses a healthy economy:




9 Georgia Wells, Jeff Horwitz, and Deepa Seetharaman, “Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show,” The Wall Street Journal (September 14, 2021), available at

10 Justin Scheck, Newley Purnell, and Jeff Horwitz, “Facebook Employees Flag Drug Cartels and Human Traffickers. The Company’s Response Is Weak, Documents Show,” The Wall Street Journal (September 16, 2021), available at







The ex-cop and his team untangled the Jalisco New Generation Cartel’s online network by examining posts on Facebook and Instagram, as well as private messages on those platforms, according to the documents…


The team identified key individuals, tracked payments they made to hit men and discovered how they were recruiting poor teenagers to attend hit-man training camps…


… The former cop recommended the company improve its follow-through to ensure bans on designated groups are enforced and seek to better understand cartel activity.


Facebook didn’t fully remove the cartel from its sites.


The investigation team asked another Facebook unit tasked with coordinating different divisions to look at ways to make sure a ban on the cartel could be enforced. That wasn’t done effectively either, according to the documents, because the team assigned the job didn’t follow up.


On Jan. 13, nine days after the report was circulated internally, the first post appeared on a new CJNG Instagram account: A video of a person with a gold pistol shooting a young man in the head while blood spurts from his neck. The next post is a photo of a beaten man tied to a chair; the one after that is a trash bag full of severed hands.


The article reports that Meta has been similarly lax about enabling ethnic cleansing:


In Ethiopia, armed groups have used Facebook to incite violence. The company’s internal communications show it doesn’t have enough employees who speak some of the relevant languages to help monitor the situation. For some languages, Facebook also failed to build automated systems, called classifiers, that could weed out the worst abuses. Artificial-intelligence systems that form the backbone of Facebook’s enforcement don’t cover most of the languages used on the site. …


In a December planning document, a Facebook team wrote that the risk of bad consequences in Ethiopia was dire, and that “most of our great integrity work over the last 2 years doesn’t work in much of the world.” It said in some high-risk places like Ethiopia, “Our classifiers don’t work, and we’re largely blind to problems on our site.”


Groups associated with the Ethiopian government and state media posted inciting comments on Facebook against the Tigrayan minority, calling them “hyenas” and “a cancer.” Posts accusing Tigrayans of crimes such as money laundering were going viral, and some people on the site said the Tigrayans should be wiped out.


As the conflict escalated, Secretary of State Antony Blinken labeled the violence “ethnic cleansing.”







4. To preserve user engagement, Meta declines to change its algorithm to prevent negative political activity on its platforms


Another article from the Wall Street Journal series explained how a change in the Facebook algorithm drove more negative posting, but Meta chose not to implement a full solution because it wanted to preserve its business.11 The article explained:


The 2018 algorithm change affected Facebook’s central feature, the News Feed… It accounts for the majority of time Facebook’s nearly three billion users spend on the platform. The company sells that user attention to advertisers, both on Facebook and its sister platform Instagram, accounting for nearly all of its $86 billion in revenue last year.


The change led to harsher discourse:


In Poland, the changes made political debate on the platform nastier…


“One party’s social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80% negative, explicitly as a function of the change to the algorithm,” wrote two Facebook researchers in an April 2019 internal report.


[Political parties in Central and eastern Europe] now have an incentive, [a political scientist] said, to create posts that rack up comments and shares—often by tapping into anger—to get exposure in users’ feeds.


The issue extends to Western Europe and Asia as well:


The Facebook researchers wrote in their report that in Spain, political parties run sophisticated operations to make Facebook posts travel as far and fast as possible.


“They have learnt that harsh attacks on their opponents net the highest engagement,” they wrote. …


Facebook researchers wrote in their internal report that they heard similar complaints from parties in Taiwan and India.


When employees figured out how to tweak the algorithm to address the negative impacts, the change was vetoed because it would reduce traffic, the Company’s stock in trade:




11 Keach Hagey and Jeff Horwitz, “Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead.” The Wall Street Journal (September 15, 2021), available at







Early tests showed how reducing [an] aspect of the algorithm for civic and health information helped reduce the proliferation of false content. Facebook made the change for those categories in the spring of 2020.


When Ms. Stepanov presented Mr. Zuckerberg with the integrity team’s proposal to expand that change beyond civic and health content—and a few countries such as Ethiopia and Myanmar where changes were already being made—Mr. Zuckerberg said he didn’t want to pursue it if it reduced user engagement, according to the documents.


In other words, Meta was forced to decide between its profits and having a positive impact on world politics, and the former won.


C. Meta’s diversified shareholders would benefit if it stopped prioritizing its own financial value over broader economic health


1. Investors must diversify to optimize their portfolios


It is commonly understood that investors are best served by diversifying their portfolios.12 Diversification allows investors to reap the increased returns available from risky securities while greatly reducing that risk.13 Due to this modern understanding of investing, many (if not most) of Meta’s shareholders are diversified, or serve diversified clients or beneficiaries and own Meta shares as part of an asset-diversification strategy.


2. A diversified portfolio’s performance largely depends on overall market return


Once a portfolio is diversified, the most important factor determining return will not be how the companies in that portfolio perform relative to other companies (“alpha”), but rather how the market performs as a whole (“beta”).


In other words, the financial return to such diversified investors depends chiefly on the performance of the market, not the performance of individual companies. As one work describes this, “[a]ccording to widely accepted research, alpha is about one-tenth as important as beta[, which] drives some 91 percent of the average portfolio’s return.”14
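The statistical logic behind this point can be illustrated with a toy simulation (purely illustrative and not part of the filing; the volatility figures and the function name `simulate_portfolio_returns` are assumed for the example). If each stock’s return is the market return plus its own company-specific noise, equal-weight diversification averages the company-specific noise away, leaving market (beta) risk as the dominant driver of the portfolio’s outcome:

```python
import random
import statistics

random.seed(0)

def simulate_portfolio_returns(n_stocks, n_periods=2000,
                               market_vol=0.04, idio_vol=0.08):
    """Each stock's return = common market return + its own
    idiosyncratic noise. Returns the period-by-period returns of
    an equal-weight portfolio of n_stocks holdings."""
    returns = []
    for _ in range(n_periods):
        market = random.gauss(0, market_vol)
        port = sum(market + random.gauss(0, idio_vol)
                   for _ in range(n_stocks)) / n_stocks
        returns.append(port)
    return returns

# As n grows, portfolio volatility falls toward the market's own
# volatility (0.04 here): idiosyncratic risk diversifies away,
# market-wide (beta) risk does not.
for n in (1, 10, 100):
    vol = statistics.stdev(simulate_portfolio_returns(n))
    print(f"{n:>3} stocks: portfolio volatility ~ {vol:.4f}")
```

Under these assumed numbers, a single stock carries roughly twice the volatility of a broadly diversified portfolio, and what remains in the diversified portfolio is almost entirely market risk, which is why economy-wide harm passes straight through to diversified shareholders.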




12 See generally, Burton G. Malkiel, A Random Walk Down Wall Street (2015).

13 Id. 

14 Stephen Davis, Jon Lukomnik, and David Pitt-Watson, What They Do with Your Money, Yale University Press (2016).







3. Costs companies impose on the economy heavily influence beta


Over long time periods, beta is influenced chiefly by the performance of the economy itself,15 and diversified portfolios rise and fall with GDP or other indicators of the economy’s intrinsic value. As the legendary investor Warren Buffett puts it, GDP is the “best single measure” for broad market valuations.16


But the external costs created by companies pursuing profits can burden the economy. For example, fomenting conflict has a cost: “War and other forms of armed conflict should be considered a major impediment to the economic development of low-income countries, many of which are beset by ethnic and religious strife.”17 Failure to mitigate rising temperatures will be costly as well: the world’s largest reinsurer found that if increases in atmospheric carbon concentration stay on the current trajectory, rather than aligning with the Paris Accords, GDP could be 10 percent lower by 2050.18 More immediately, the difference between an efficient response to COVID-19 and an inefficient one could create a $9 trillion swing in GDP.19


Thus, if Meta increases its own bottom line with policies that foment armed conflict or limit society’s ability to address environmental risks or pandemics, the profits may be inconsequential to diversified shareholders compared to the added costs the economy bears. Economists have long recognized that profit-seeking firms will not account for costs they impose on others, and there are many profitable strategies that harm stakeholders, society, and the environment.20 When the economy suffers from these “externalized” costs, so do diversified shareholders. Meta’s diversified shareholders will internalize many of the costs Meta imposes on the economy, as shown in Figure 1.




15 Principles for Responsible Investment & UNEP Finance Initiative, “Universal Ownership: Why Environmental Externalities Matter to Institutional Investors,” Appendix IV,

16 Warren Buffett and Carol Loomis, “Warren Buffett on the Stock Market,” Fortune Magazine (December 10, 2001), available at

17 Clifford Thies and Christopher Baum, “The Effect of War on Economic Growth,” Cato Journal (Winter 2020), available at

18 Swiss Re Institute, “The Economics of Climate Change: No Action Not an Option,” (April 2021) (Up to 9.7% loss of global GDP by mid-century if temperature increase rises on current trajectory rather than Paris Accords goal), available at

19 Ruchir Agarwal and Gita Gopinath, “A Proposal to End the COVID-19 Pandemic,” IMF Staff Discussion Note (May 19, 2021), available at

20 See, e.g., Kaushik Basu, Beyond the Invisible Hand: Groundwork for a New Economics, Princeton University Press (2011), p.10 (explaining the First Fundamental Theorem of Welfare Economics as the strict conditions (including the absence of externalities) under which competition for profit produces optimal social outcomes).









Figure 1


The Proposal seeks information that will allow those investors to understand how Meta’s choices are affecting their portfolios.


It is important to note that Mark Zuckerberg, whose ownership of high voting shares gives him working control over Meta, is not diversified; his family’s wealth is concentrated in Meta shares, giving him a conflict of interest in matters where the interests of a concentrated shareholder would clash with the interests of a diversified shareholder.


D. In its opposition statement, Meta either fails to understand or refuses to acknowledge the point of the Proposal


Meta’s statement opposing the Proposal says, “We believe that protecting our community is more important than maximizing our profits.” This statement obviously contradicts the evidence laid out above. Meta’s decision-making structure is designed to enhance the value of Meta shares, not community safety or diversified shareholder returns.







Meta argues the report is unnecessary because it has already “made significant investments in our safety and security efforts,” and “spent approximately $5 billion on safety and security in 2021 alone.” But the question is not what Meta spent, but rather how it decides how much to spend. After all, that $5 billion pales in comparison to the $44.81 billion Meta spent repurchasing shares in 2021, or the additional $48 billion in cash balance with which Meta ended the year.21


The report will help diversified shareholders understand whether the economy and their diversified portfolios could be better protected if more of that cash were spent on translators for all the languages in which Meta platforms are used, more fact-checking, or better systems to filter out recruiting by abusive employers or incitement to violence against ethnic groups, for example.


Meta also argues the report is unnecessary because:


In 2017, we began our efforts to prioritize meaningful social interactions, and deprioritize other items like viral videos for our users by changing our News Feed rankings. In 2018 we further revised our News Feed rankings to prioritize posts from friends and family as our research suggested that people derive more meaningful conversations and experiences when they engage with people that they know. Additionally, beginning in October 2020, we turned off political ad spending in the U.S. for approximately four months to reduce opportunities for confusion or abuse ahead of and following the U.S. presidential election.


But the documents reported in the press show that Meta’s own employees knew the News Feed changes were detrimental to public discourse. As The Wall Street Journal reported, Mr. Zuckerberg rejected fixes that might reduce user engagement.22


Highlighting this divergence of interests between Meta and diversified shareholders is not an indictment of Meta’s Board or management. It is simply an honest description of the current state of affairs. If Meta refuses to acknowledge these plain facts, there is little chance it will be able to work constructively with its shareholders to strike the right balance.


E. Why you should vote “FOR” Proposal 7


Voting “FOR” the Proposal will signal to Meta that shareholders want to understand whether the Company is putting the global economy (and thus their diversified portfolios) at risk to improve Meta’s financial performance.




·Meta is the largest social media company in the world. The influence of its platforms is well known, and involves questions not only of politics, but of data privacy, public health, and other issues. Activity on social media platforms can threaten the social fabric upon which thriving economies depend.





22 See supra n.11.







·Press reports of internal documents demonstrate that Meta prioritizes user traffic and engagement over preventing negative economic impact.


·While Meta may increase its internal rate of return by externalizing costs, its diversified shareholders will ultimately pay these costs in the rest of their portfolios.


·Meta’s controlling shareholder and other decision makers—who are heavily compensated in equity—do not share the same broad market risk as Meta’s diversified shareholders.


·The Proposal only asks for an analysis, not a change in practice. Any trade-offs of economy-wide risk for narrow Company financial gain must be explained, so that shareholders can reach informed views about Meta’s balance between promoting internal financial return and maintaining the economic health that supports their diversified portfolios.




Please vote “FOR” Item 7.


By voting “FOR” Item 7, shareholders can urge Meta to account directly for the external impacts of its operations, which directly affect diversified shareholders’ returns. Such a report will help the Meta Board and management serve the needs of shareholders authentically while avoiding the dangers that a narrow focus on financial return poses to shareholders and others.


The Shareholder Commons urges you to vote “FOR” Proposal 7 on the proxy, the Shareholder Proposal requesting a report on cost externalization at the Meta Platforms Inc. Annual Meeting on May 25, 2022.


For questions regarding the Meta Platforms Inc. Proposal submitted by H.E.S.T. Australia Ltd, Trustee of Health Employees Superannuation Trust Australia, please contact Sara E. Murphy of The Shareholder Commons at +1.202.578.0261 or via email at [email protected].











