
Form PX14A6G Alphabet Inc. Filed by: LORING WOLCOTT & COOLIDGE FIDUCIARY ADVISORS LLP/MA

May 11, 2020 3:13 PM

 

Notice of Exempt Solicitation

NAME OF REGISTRANT: Alphabet Inc.

NAME OF PERSONS RELYING ON EXEMPTION: Loring, Wolcott & Coolidge Fiduciary Advisors, LLP

ADDRESS OF PERSON RELYING ON EXEMPTION: 230 Congress Street, Boston, MA 02110

WRITTEN MATERIALS: The attached written materials are submitted pursuant to Rule 14a-6(g)(1) (the “Rule”) promulgated under the Securities Exchange Act of 1934, in connection with a proxy proposal to be voted on at the Registrant’s 2020 Annual Meeting.

 

 

Dear Alphabet Inc. Shareholders:

 

The proponents of Proxy Item 7 are long-term holders of Alphabet Inc. (“Alphabet” or “the Company”) Class A shares; therefore, our interests are well aligned with those of the Company. In that spirit, we recommend that all shareholders support Proxy Item 7, which requests that the Board of Directors (“the Board”) establish a committee to oversee the Company’s human rights risks (a “Human Rights Risk Oversight Committee” or “the Committee”) to help anticipate and oversee management of the adverse human rights and societal impacts associated with Alphabet’s technologies.

 

Support for a Board committee to oversee Alphabet’s human rights risks is warranted because:

 

1. Human rights risks are embedded in Alphabet’s business model: Through its ubiquitous footprint, Alphabet has unique power and influence within our society, but at the same time poses distinct risks to core human rights such as the rights to privacy, political participation, freedom of expression, health, and non-discrimination. These risks are embedded in its products and services, which have extensive access to user data, and could threaten its business model.
2. Failure to manage these impacts could result in significant business risk: Alphabet’s success depends on retaining the trust of its users, advertisers, employees, investors, and the public. Yet it faces regulatory, legal, human capital, and reputational risks (the last probably most significant given its reliance on advertising revenues) if these are not sufficiently managed.
3. The Company’s actions to address these risks are currently insufficient: Both users and advertisers must trust that the Company has appropriate policies and practices in place to manage human rights risks, but thus far the Company has failed to demonstrate that it has a coherent approach to managing these risks or to ensuring that the Board is providing oversight.
4. The Board is not providing sufficient oversight to meet investors’ expectations: The adoption of this committee would help important stakeholders, including shareholders and the Company’s own employees, retain (or regain) trust by demonstrating that genuine efforts are being made to oversee potential human rights risks at the board level.

 

The proponents believe Alphabet has neither (1) comprehensive, company-wide policies, processes and due diligence systems; nor (2) Board-level oversight necessary to sufficiently address the human rights risks associated with the Company’s technologies. Please see Tables 1 and 2 in the annex for a detailed analysis of the relevant policies, practices, and processes highlighted in Alphabet’s Opposition Statement in the Proxy and our recommendations for improvement.

 

This proposal is submitted by a broad coalition of shareholders, demonstrating the significant level of shareholder interest in seeing the Company take action on this issue. There are four lead filers spanning the globe: Hermes Investment Management in the UK, the Sustainability Group of Loring, Wolcott & Coolidge in the USA, NEI Investments in Canada, and Robeco in the Netherlands. In addition, a broad group of co-filers from North America, the UK, Europe, and Asia has lent its support, including Aviva Investors, AXA Investment Managers, Boston Common Asset Management, Church Commissioners for England, Church of England Pensions Board, de Pury Pictet Turrettini & Co Ltd, and MP Pension, among others. This is an action we have taken collectively, following a private letter sent to the Company in November 2019, signed by over 80 institutional investors representing nearly $10 trillion in assets under management and advice, who requested a dialogue with Alphabet on these issues. The Company denied this request.

 

We urge you to indicate your support by marking “FOR” on ITEM 7.

 

The views expressed are those of the authors and Loring, Wolcott & Coolidge Fiduciary Advisors, LLP as of the date referenced and are subject to change at any time based on market or other conditions. These views are not intended to be a forecast of future events or a guarantee of future results. These views may not be relied upon as investment advice. The information provided in this material should not be considered a recommendation to buy or sell any of the securities mentioned. It should not be assumed that investments in such securities have been or will be profitable. This piece is for informational purposes and should not be construed as a research report.

 

NOTE: This is not a solicitation of authority to vote your proxy. Please DO NOT send us your proxy card; LWCFA is not able to vote your proxies, nor does this communication contemplate such an event. LWCFA urges shareholders to vote FOR Proxy Item Number 7 following the instructions provided on the management’s proxy mailing.

 


 

Alphabet’s business

 

Alphabet is one of the most influential entities of our time. Its size and reach are staggering: in 2020, Alphabet became one of only a few companies with a trillion-dollar market capitalization.i

 

Alphabet is many enterprises rolled up into one. Google Chrome is the world’s dominant browser, making it the de facto gateway to the entire web, while Google’s Android is the world’s biggest mobile operating system, with over 2.5 billion monthly active Android devices reported in 2019.ii YouTube is the largest video platform in the world and has nearly two billion users each month.iii Alphabet’s cloud computing offerings include GSuite and Google Maps in addition to its file-sharing platforms. Through Nest, the Company offers smart doorbells, thermostats, lighting, and home security. For advertisers, it offers an array of analytical tools to reach Alphabet’s billions of users with precision. It also provides systems software, including the AI-powered Google Assistant, which has 500 million users across 90 countries,iv as well as consumer hardware products like the Chromebook and Pixel smartphones and tablets. Beyond the Google empire, the Company’s “Other Bets” include Verily, engaged in healthcare research, Calico, focused on prolonging human life, AI leader DeepMind, and autonomous vehicle-maker Waymo, among others which operate outside the spotlight.v

 

While the Company’s products and technologies are diverse and sprawling, they are connected by a mission to “organize the world’s information and make it universally accessible”vi through the collection, storage, control, and analysis of data from every moment of our everyday lives. In March 2020, Ruth Porat, Alphabet’s CFO, said that users want technologies which “are always with you… in a non-obtrusive way…” and “able to go from surface-to-surface, to phone to home to living room to cars to watch.”vii The ubiquity of Alphabet’s technologies in our lives, and the potential human rights risks this raises for the Company and individuals, is likely to increase significantly. For example, in March 2020, a General Data Protection Regulation (GDPR) complaint was lodged with the Irish Data Protection Commission (DPC), Google’s lead GDPR regulator in Europe, alleging that Google’s “privacy policy tying” allows it to cross-use the mass of data it has acquired across its products.viii The control and cross-analysis of this vast trail of private information across markets gives rise to significant ethical and human rights responsibilities, which Proponents believe require a world-leading human rights response from the Company.

 

While the products and technologies are vast and disparate, Alphabet’s revenues are centralized: the annual revenue generated by Google, Alphabet’s largest subsidiary, is chiefly derived from advertising across its platforms, which accounted for more than 83% of Alphabet’s 2019 total revenue.ix YouTube, Google’s fastest-growing segment, saw ad revenue increase 36% year over year, ahead of Google Search, the world’s most widely used search engine, which grew 15%. In comparison, Alphabet’s revenues from its Other Bets grew 11% year over year.x If Alphabet fails to manage its human rights-related risks, it could face a breakdown of trust among users and advertisers, a dangerous proposition for a company that derives such a large majority of its revenues from advertising. As a result, it is imperative that Alphabet have systems and structures in place to ensure it is comprehensively overseeing these risks at the Board level, the only body with direct responsibility to shareholders that is empowered to oversee all of Alphabet, not just one unit or another.

 

Human Rights Risks Embedded in Alphabet’s Business Model

 

Alphabet’s technologies, products, and services have transformed our daily lives and the global economy. As a result, Alphabet’s internal decisions can have far-reaching consequences for individuals and society, ranging from the tremendous amount of data it gathers across its technologies and the inherent privacy risks this poses, to facilitating the spread of hate speech and violence, to advertising policies that could result in political manipulation and small-group targeting. Additionally, there are specific controversies rooted in the strategic direction of the Company and its relationship with Artificial Intelligence (“AI”) such as the controversial Project Maven, a partnership with the U.S. military, and Project Dragonfly, a censored search engine in China, which have caused concern among investors, regulators, users, and even employees within Alphabet. As a result, it is clear that Alphabet faces significant human rights risks. A more detailed discussion of these risks is provided in Annex 2.

 


 

It is also clear that Alphabet—like all companies—has the responsibility to respect human rights. In 2011, governments around the world came together in the UN Human Rights Council to unanimously adopt the UN Guiding Principles on Business and Human Rights (UNGPs), which establish that all companies, including Alphabet, have a responsibility to respect human rights, including civil, political, economic, social, cultural, and labor rights.xi

 

According to the United Nations High Commissioner for Human Rights:

 

“In order to meet their responsibility to respect human rights, business enterprises should have in place policies and processes appropriate to their size and circumstances, including:

 

(a) A policy commitment to meet their responsibility to respect human rights;

 

(b) A human rights due diligence process to identify, prevent, mitigate and account for how they address their impacts on human rights;

 

(c) Processes to enable the remediation of any adverse human rights impacts they cause or to which they contribute.”xii

 

Currently, Alphabet falls short of these expectations.

 

Examples of adverse impacts and due diligence gaps

 

Alphabet’s potential adverse impacts on human rights are multiple and diverse, and include, but are not limited to:

 

-Right to Privacy (Art. 12, UDHR): Through its technologies and relationships, Alphabet collects troves of sensitive personal data—such as health, biometric, and real-time location data—and has deployed listening features on Google Home products that were undisclosed at the time of launch.xiii For example, in Project Nightingale, Google collected the personal medical data of up to 50 million Americans from one of the largest healthcare providers in the US without informing patients. This data included individual names and medical histories, all of which could be accessed by Google staff.xiv Beyond the obvious harms that could ensue if such data were compromised or leaked, Alphabet’s opaque disclosure regarding how its subsidiaries—including those doing potentially controversial and sensitive work, like Verily and Calico—handle users’ personal information necessitates a company-wide response. Moreover, the way Alphabet addresses law enforcement requests for broadly defined searches (e.g. geofence warrants, where technology companies disclose anonymized location information for all devices in a specific area at a specific time)xv raises urgent questions about the privacy, due process, and civil rights of the billions of individuals who could be affected.xvi The scope and nature of the information on individuals that the Company possesses, together with its technology, creates and amplifies human rights risks that would not otherwise be present.

 

-Right to Political Participation (Art. 21, UDHR): While Google has sought to limit micro-targeting of political ads, or the ability to target—or even manipulate—specific groups or individuals, and to curtail the spread of disinformation, political advertisers can still target ads by using age, gender, and zip code.xvii Google provides no evidence that it conducts human rights due diligence on its targeted advertising practices.xviii

 

-Right to Freedom of Opinion, Expression, and Information (Art. 19, UDHR): The right to freedom of information is jeopardized by micro-targeting and profiling with limited transparency into the algorithm that puts people into targeted buckets. In addition, Google’s now-dormant “Project Dragonfly” sought to launch a search engine that would be compatible with China’s state censorship provisions.

 

-Right to Health (Art. 25, UDHR): Multiple sources have documented health and safety impacts on YouTube content moderators, especially mental health issues arising from their work.xix YouTube appears to recognize these risks and reportedly requires contractors to sign a release acknowledging that content moderation may result in Post-Traumatic Stress Disorder.xx

 


 

-Right to Equality & Non-Discrimination (Art. 2, Art. 7, UDHR), Right to Life & Security (Art. 3, UDHR): The proliferation of hate speech and acts of violence remains a risk for YouTube. Moreover, according to some studies, Google’s search algorithms, embedded across its many platforms, can contribute to discrimination and exacerbate bias. For example, Google’s search engines have been shown to systematically discriminate against women and people of color.xxi

 

Human Rights Risks Pose Significant Material Business Risks to Alphabet

 

Alphabet’s business model presents inherent material risks, including regulatory, reputational and human capital risks. The Sustainability Accounting Standards Board’s (SASB) Materiality Mapxxii identifies data security as likely to affect the financial condition or operating performance of technology companies in particular. Similarly, SASB is also carrying out a research project to explore the topic of content moderation on Internet platforms, to determine whether establishing standardized accounting metrics in this area may be warranted.xxiii

 

Reputational risks

 

Alphabet faces significant financial risks if it fails to maintain the trust of users—a concerning prospect for a company dependent on advertising revenues. According to Pew Research, 81% of Americans believe they have little control over data collected about them by companies, and the same share believe that the risks of this data collection outweigh the benefits.xxiv Further, 79% of Americans report being somewhat to very concerned about how this data is being used.xxv By failing to effectively address these concerns and by failing to implement clear oversight to keep these risks from materializing further, the Company is opening itself to significant commercial risk should a competitor win that trust instead. With more than 83% of the Company’s revenues in 2019 (~$135 billion out of Alphabet’s total revenues of ~$161 billion) coming from online advertising services such as Google AdWords and Google AdSense,xxvi losing the trust of users risks significantly undermining the value of the platform to advertisers. In specific cases, brands have already shown a willingness to suspend their advertising relationships with the Company if their advertising appears alongside controversial content.xxvii For example, a 2019 New York Times headline summed up the situation: “Advertisers Boycott YouTube after Pedophiles Swarm Comments on Videos of Children.”xxviii

 

Regulatory risks

 

In recent years, there have been a number of regulatory developments targeting aspects of Alphabet’s activities that expose the company to significant regulatory risks. For a detailed discussion of the regulatory risks Alphabet faces in the US and internationally, please see Annex 2.

 

Human capital risks

 

The battle for talent in the tech world is fierce. Like many tech companies, Alphabet has faced recruitment and retention challenges, and will likely face more if it fails to maintain the trust of employees. For example, Project Dragonfly caused significant backlash from employees who were alarmed at being associated with a tool that essentially enables state surveillance. Some 1,400 employees signed an open letter; some reported being retaliated against for airing their opposition publicly, and others subsequently quit.xxix In addition, thousands joined a letter protesting the Company’s involvement with Project Maven, a military contract that called into question the boundaries of weaponizing AI.xxx Signaling rising dissatisfaction among Google’s employees, more than 20,000 of the Company’s workers joined a walkout in November 2018 demanding that the Company address issues including its use of private arbitration in sexual harassment cases, greater disclosure of employee salaries and compensation, the addition of an employee representative to the Board, and the appointment of a Chief Diversity Officer reporting directly to the Board.xxxi

 


 

Company actions to address these risks are insufficient

 

Because the above risks are relevant to every Alphabet subsidiary and technology, and inherent in the Company’s business model, it is imperative that a commitment to human rights be codified at the highest level of the Company for the purposes of oversight and accountability. In the tables found in Annex 1, we use Alphabet’s statement of opposition to this proposal to demonstrate that the Company’s patchwork of internal bodies lacks the transparency, as well as the overarching responsibility, needed to oversee Alphabet’s vast human rights-related risk exposure. The proponents believe anything short of an enterprise-wide human rights risk management approach leaves the Company’s sprawling activities vulnerable to material business risk that the Company’s current configuration does not address.

 

Investors have the responsibility to engage on these risks

 

Given the financial risks to the Company and shareholders’ own commitments to conduct human rights due diligence under the Guiding Principles, we have a responsibility to ensure the company is overseeing such risks at the highest level.

 

In particular, European Union regulation requires European investors such as banks, pension funds, and insurers to disclose risks to people and the planet in their investments and to publish the actions they are taking to prevent harm. This requirement comes into effect at the end of this year and will apply in 2021. As a result, reporting on due diligence policies and activities is mandatory for all investment institutions with over 500 staff, and smaller investors must comply or explain their rationale for exemption.xxxii

 

The lead proponents have repeatedly requested meaningful dialogue with the Company to learn more about its internal efforts to address these issues—including through an investor letter supported by over 80 signatories representing nearly $10 trillion in assets under management—yet those requests have largely been ignored by the Company. In addition, the quarterly ESG calls organized by the Company are not sufficiently informative for our due diligence purposes. Over the past year, these calls have been shortened from one hour to 30 minutes, and the most recent call no longer offered a live Q&A.

 

Board oversight is INADEQUATE

 

While the Board is accountable to investors, it is unclear to Proponents which criteria and processes are used to determine when and how the Board becomes involved in overseeing human rights risks, or whether it has sufficient time and expertise to manage these specific risks.

 

According to Ranking Digital Rights, “For the third year in a row, Google continued to lag behind its peers in the Governance category, disclosing less about its governance and oversight over human rights issues than other members of GNI [Global Network Initiative].”xxxiii In general, Proponents believe that Alphabet lags its international peers, including major global companies such as Microsoft,xxxiv Intel,xxxv Unilever,xxxvi and Nestlé,xxxvii all of which demonstrate comprehensive governance approaches for overseeing human rights risks. At each of these companies, there is a clear reporting line for escalating human rights issues and accountability from senior executives with the relevant expertise, a basic expectation that Alphabet fails to meet.

 


 

Benefits of a Dedicated Board Committee

 

As shareholders, entrusted with the responsible stewardship of our investments, we believe a Human Rights Risk Oversight Committee—drawing on internal and external experts—would be best positioned to oversee human rights risks in a way that protects the Company and its investors, and respects the rights of individuals in the Company’s operations and throughout its value chain. Additionally, we would expect that a stand-alone Board committee would have its own public charter, providing investors with clarity regarding the Board’s mandate to oversee the risks and potential human rights implications associated with its technologies and operations. Likewise, while the Board already has the ability to draw on internal and external experts, we would expect such a commitment to help identify and prioritize resources to fill an existing expertise gap in the Board’s current composition.

 

Annex 1: Proponent rebuttal to Alphabet’s Statement of Opposition

 

The proponents believe Alphabet’s statement of opposition—which is intended to assure investors that such a proposal is unnecessary—actually confirms the identified gaps and deficiencies. It makes clear the Company has neither (1) appropriate, company-wide policies, processes, and due diligence systems; nor (2) the Board-level oversight necessary to sufficiently address the human rights risks associated with the Company’s technologies. The following tables break down the weaknesses in policy and oversight demonstrated in the Company’s statement of opposition and outline opportunities to address these concerns.

 

TABLE 1: Alphabet’s policies, processes, and practices to date demonstrate an insufficient response to and management of human rights risks. Each entry below presents the Problem, the relevant excerpt from Alphabet’s Statement of Opposition, and our Recommendations.
Problem: Alphabet clearly states its efforts to respect human rights are not company-wide, but rather only applicable to Google.

Statement of Opposition: “Across Google, we are guided by internationally recognized human rights standards.”

Recommendation: The Statement of Opposition refers to Google, not Alphabet; its commitments therefore do not apply to entities such as Calico, Waymo, and Verily that fall outside the Google umbrella. To uphold its human rights responsibilities, Alphabet is expected to adopt and embed a broad-based Human Rights Policy that is applicable across all of Alphabet’s activities and subsidiaries, and among all business relationships, including suppliers, contractors, and end-users, as well as business and government clients.
Problem: The Company’s human rights policy falls short of international human rights standards and best practice.

Statement of Opposition: “We are committed to respecting the rights enumerated in the Universal Declaration of Human Rights and its implementing treaties, as well as upholding the standards established in the United Nations Guiding Principles on Business and Human Rights and in the Global Network Initiative (GNI) Principles…”

Recommendation: Google’s commitment to human rights does not appear to align with the UN Guiding Principles, as it does not:

1. Reference the minimum human rights standards outlined in the International Bill of Human Rights and the ILO Core Conventions;

2. Clarify whether it has been approved at the most senior level of the company;

3. Provide information on the Company’s human rights governance or lay out a plan for embedding this commitment into the business and carrying out due diligence;

4. Define Google’s salient human rights issues;

5. Stipulate Google’s human rights expectations of personnel, business partners, and other business relationships linked to its operations, products, or services;

6. Refer to a commitment to adopt, or provide evidence for the existence of, effective grievance mechanisms to enable access to remedy.

These gaps should be addressed in an Alphabet-wide commitment.

Problem: Board and management oversight of human rights risks is vague.

Statement of Opposition: “Senior management oversees the implementation of the human rights and GNI Principles at Google and provides quarterly updates to our Board of Directors on relevant issues. Dedicated personnel are focused on product, jurisdiction, and functional areas and are responsible for the day-to-day operations of protecting users and meeting Google’s human rights obligations.”

Recommendation: To assure investors and the public that Alphabet is effectively overseeing and managing these risks, the Company should clarify who at the senior management level is responsible for overseeing not only the GNI Principles but all human rights risk management. Further information is also necessary on how business functions coordinate to ensure the GNI Principles and broader respect for human rights are embedded across the Company. Finally, further information is needed on the processes and criteria for sharing and escalating human rights risks and impacts to the Board, to help investors assess its ability to foresee emerging risks and hold management accountable for its handling of human rights risks.

 


 

Problem: There are gaps in YouTube policies that continue to enable the proliferation of hate speech.

Statement of Opposition: “On content quality, one of the most complex and constantly evolving areas we deal with is hate speech. In 2019, YouTube took a close look at its approach towards hateful content in consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights, and free speech. Based on those learnings, YouTube made several updates to its hate speech policy…”

Recommendation: While YouTube has adopted a Hate Speech Policy,xxxviii it is limited to what happens on the website, meaning YouTube does not investigate the further context necessary to understand the behavior of the group or individual behind an account.xxxix In addition to potential adverse impacts on freedom of expression, this approach to hate speech has resulted in decisions that continue to enable the proliferation of hate, e.g. by deleting a video without reviewing the channel or by demonetizing channels while leaving videos online. This approach differs from that of Twitter,xl which takes contextual factors into consideration, including by engaging the individuals who report an abuse when determining whether to remove content.
Problem: Google’s limits on disinformation and political ads do not go far enough to protect against human rights risks.

Statement of Opposition: “Similarly, our ongoing work on information integrity led to the release of a white paper in 2019 detailing our work to tackle the intentional spread of disinformation — across Google Search, Google News, YouTube, and its advertising systems.”

Recommendation: The White Paper is a positive step. However, given that Google’s influence and business model both make it ripe for disinformation, it is concerning that at times it is silent on how best to address the inherent conflicts that arise. Political advertising is one way disinformation can spread on Google’s platforms. While Google has sought to limit the micro-targeting of political ads (the ability to target, or even manipulate, specific groups or individuals) and the spread of disinformation, political advertisers can still target ads by using age, gender, and zip code.xli The political advertising transparency report discloses the amount spent per political advertiser and the targeted geographical regions in the UK, US, European Union, and India, but fails to disclose how the targeting of small groups works.xlii In addition, political advertisers are still able to buy ad space for auction on Google’s ad exchange using their own targeting data and technology from other companies.xliii Google provides no evidence that it conducts human rights due diligence on its targeted advertising practices.xliv While it does disclose some information about how its rules are enforced, the disclosures are insufficient to determine the efficacy of these rules. It does not disclose information about its processes for enforcing political advertising rules nor about the outcomes of those processes.xlv
Problem: Google’s AI Principles (“the Principles”),xlvi while a good start, must go further and be applied at the enterprise level.

Statement of Opposition: “Google also publicly released AI principles that actively govern its research and product development …we have established an AI Principles review process that is designed to assess new projects, products, and deals, and we embed human rights due diligence, including human rights impact assessments as appropriate, as part of that process. The review structure is composed of a diverse and inclusive group of Googlers, including senior executives, user researchers, social scientists, ethicists, human rights specialists, policy and privacy advisors, legal experts, and senior experts from a variety of other disciplines.”

Recommendation: Google has avoided publicly committing to uphold international human rights standards in how it develops and uses algorithmic systems.xlvii The push for ethical AI as set out in the AI Principles is not grounded in human rights norms, and the Principles do not require the Company to conduct human rights due diligence in the development and use of algorithmic systems.xlviii Proponents believe that enterprise-wide adoption of robust guidelines regarding the design and application of Alphabet technology, and disclosure of where responsibility lies, would benefit all shareholders.

Problem: The Company’s efforts to identify and assess human rights impacts are insufficient.

Statement of Opposition: “In 2019, in collaboration with independent experts using the UN’s Guiding Principles on Business and Human Rights as a framework, Google commissioned a formal human rights impact assessment (“HRIA”) of the Celebrity Recognition tool and technology’s potential impact on human rights. The HRIA played an essential role in shaping the API’s capabilities and the policies established around them and we publicly released a summary of the HRIA.”

Recommendation: While we commend Alphabet for conducting an HRIA of the Celebrity Recognition tool, it covers only one technology. Although Google is a member of the Global Network Initiative (“GNI”) and thus undergoes periodic independent assessments, such assessments are limited to Google’s progress on managing risks associated with government requests that affect privacy and freedom of expression.xlix Alphabet should prioritize conducting HRIAs based on an initial assessment of the Company’s most salient human rights risks associated with high-stakes operations, products, services, and business relationships.

Alphabet fails to disclose sufficient information on how it is assessing and managing its human rights risks.

 

“Consistent with these guidelines, we devote significant resources to ensure that we are aware of, and able to appropriately address the various risks that our businesses face, including potential impacts our businesses may have on human rights.”

While we are encouraged that the Company “devotes resources” to addressing human rights risks, it does not currently disclose how such risks are defined or how resources are deployed, leaving investors in the dark as to whether these efforts are effective and sufficiently comprehensive. For example, while a short summary is available, the results of the GNI assessments of Google’s progress on privacy and freedom of expression are confidential.l The Company’s 2019 Responsible Supply Chain Report likewise fails to disclose how the Company determines and manages salient human rights risks to people in its supply chain, focusing instead on issues the Company determines to be financially material to Google.li

 

Proponents believe that, in order to fulfill its human rights responsibilities and holistically inform investors about material risks, Alphabet should disclose how it identifies salient human rights issues and how it manages them, including how it assesses their financial materiality.

Privacy is a central human rights—and business model—risk, yet the Company’s governance of it is insufficient to protect against that risk.
“Our Audit Committee has considered topics related to human rights, including…our ongoing commitment to privacy across all our product areas”

In 2019, Ranking Digital Rights’ company-specific analysis found that Google should do more to protect privacy by clarifying what information it collects and shares, and for what purposes, and by giving users clear options to control what is collected and shared about them.lii The analysis also identified a due diligence gap in Google’s failure to offer users the option to end-to-end encrypt their private content or communications in Gmail, YouTube, or Google Drive.liii
     

 

TABLE 2: In light of these shortcomings, leadership and oversight at the Board level is necessary to protect shareholder (and human) rights.
Problem | Alphabet Opposing Statement | Recommendations
Human rights-related risks are lumped into general risk management.

“As described in Alphabet’s Corporate Governance Guidelines, our Board of Directors has overall responsibility for risk oversight.…

“Our governance structure empowers our Board of Directors and its committees to holistically review our business and to consider a wide range of risks, including the impact of our products and services on human rights, among others.”

The scope and scale of Alphabet’s human rights risks cannot be effectively addressed by being grouped into the Board’s general risk management responsibilities. Instead, effective human rights risk management requires a standalone mandate with appropriate and specific expertise.
The Board should have a specific mandate for, and active role in, overseeing human rights-related risks.

“Senior management oversees the implementation of the human rights and GNI Principles at Google and provides quarterly updates to our Board of Directors on relevant issues.” (emphasis added)

“The current structure of our Board … allows for regular assessments on a variety of topics, including the potential impacts of our products and services on human rights.” (emphasis added)

The Board’s human rights risk oversight role appears passive. To effectively oversee the vast array of human rights-related risks, including those linked to the Company’s core business, the Proponents believe the Board would need to be more involved than merely receiving “quarterly updates” from management and allowing for regular assessments. Instead, the Board should have an unequivocal mandate and process to actively oversee and anticipate human rights risks.
In the opinion of the Proponents, the audit committee does not have the time or expertise necessary to adequately address human rights risks.
“…our Audit Committee has considered topics related to human rights, including our ongoing work to address harmful content, our ongoing commitment to privacy across all our product areas, and our investments in affordable housing, among others.”

The Proponents are concerned that the Board’s current committee structure cannot give auditing, environmental sustainability, and human rights sufficient attention. Further, in the opinion of the Proponents, the existing Board committee members do not have sufficient relevant human rights experience, and there is currently insufficient clarity regarding how—and whether—consultations with external issue experts occur.

Annex 2: Regulatory risks

 

-California: In 2018, the California Consumer Privacy Act (CCPA) was enacted to protect consumer rights relating to the access to, deletion of, and sharing of personal information collected by businesses. The Attorney General may begin bringing enforcement actions under the CCPA as of July 1, 2020.liv

 

-United States: The Online Privacy Act, if adopted by the US Congress, would enable users to sue companies for data protection violations.lv

 

-European Union: In 2016, the EU adopted the General Data Protection Regulation (GDPR), which regulates data protection and privacy and enables EU data protection regulators to impose fines of up to 4% of a company’s global turnover for serious breaches.lvi In addition, the European Commission has pressured social media platforms to accelerate takedowns of illegal content, including hate speech, through a voluntary EU Code of Conduct,lvii and has since warned the platforms that it could introduce legislation if targets are not met.

 

-Germany: Under the “NetzDG” law,lviii online platforms must remove “content that is manifestly unlawful within 24 hours of receiving the complaint” and remove or block “all unlawful content immediately, this generally being within 7 days of receiving the complaint” or face fines.lix

 

-France: The country is in the process of adopting the so-called Avia law, which would require companies to remove hateful content within 24 hours of receiving a notification about it.lx

 

-United Kingdom: In February 2020, the UK government released a White Paper outlining its new effort to “establish a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.”lxi

 

This regulatory landscape has contributed to a growing number of lawsuits and enforcement actions against the Company:

 

-February 2020: A lawsuit was filed by the New Mexico Attorney General alleging that Google illegally collected data using Chromebooks;lxii and

 

-January 2019: The French data protection authority fined Google about $57 million for failing to sufficiently disclose to users how data is collected across its services.lxiii


Submitted: May 11, 2020

The views expressed are those of the authors and Loring, Wolcott & Coolidge Fiduciary Advisors, LLP as of the date referenced and are subject to change at any time based on market or other conditions. These views are not intended to be a forecast of future events or a guarantee of future results. These views may not be relied upon as investment advice. The information provided in this material should not be considered a recommendation to buy or sell any of the securities mentioned. It should not be assumed that investments in such securities have been or will be profitable. This piece is for informational purposes and should not be construed as a research report.

 

NOTE: This is not a solicitation of authority to vote your proxy. Please DO NOT send us your proxy card; LWCFA is not able to vote your proxies, nor does this communication contemplate such an event. LWCFA urges shareholders to vote FOR Proxy Item Number 7 following the instructions provided in management’s proxy mailing.

iJennifer Elias, “Alphabet, Google’s Parent Company, Hits Trillion-dollar Market Cap for First time,” CNBC, 1/16/20, https://www.cnbc.com/2020/01/16/alphabet-stock-hits-1-trillion-market-cap-for-first-time.html
iiEmil Protalinski, “Android Passes 2.5 Billion Monthly Active Devices,” VentureBeat, 5/7/2019. https://venturebeat.com/2019/05/07/android-passes-2-5-billion-monthly-active-devices/
iiiElizabeth Dwoskin, “YouTube’s arbitrary standards: Stars keep making money even after breaking the rules” The Washington Post, 8/9/2019. https://www.washingtonpost.com/technology/2019/08/09/youtubes-arbitrary-standards-stars-keep-making-money-even-after-breaking-rules/
ivManuel Bronstein, Vice President of Product, Google Assistant, “A More Helpful Google Assistant For Your Every Day,” The Keyword, 1/7/20. https://www.blog.google/products/assistant/ces-2020-google-assistant/
vAvery Hartmans and Mary Meisenzahl, “All the Companies and Divisions under Google’s Parent Company, Alphabet, Which Just Made Yet Another Shake-up to Its Structure,” Business Insider, 2/12/20. https://www.businessinsider.com/alphabet-google-company-list-2017-4
viSundar Pichai, “2018 Founders’ Letter” https://abc.xyz/investor/founders-letters/2018/
viiRuth Porat, CFO, Alphabet and Google at the Morgan Stanley Technology, Media & Telecom Conference on March 2, 2020. https://abc.xyz/investor/static/pdf/morgan_stanley_technology_ruth_porat_03022020.pdf?cache=750ccc5
viiiMadhumita Murgia, “Google Accused by Rival of Fundamental GDPR Breaches,” The Financial Times, 3/16/20. https://www.ft.com/content/66dbc3ba-848a-4206-8b97-27c0e384ff27
ixAlphabet Inc. 2019 Form 10-K https://abc.xyz/investor/static/pdf/20200204_alphabet_10K.pdf?cache=cdd6dbf
xAlphabet Inc. 2019 Form 10-K https://abc.xyz/investor/static/pdf/20200204_alphabet_10K.pdf?cache=cdd6dbf
xiUnited Nations Human Rights Council, “Guiding Principles on Business and Human Rights: Implementing the United Nations ‘Protect, Respect and Remedy’ Framework” 6/16/2011. https://www.ohchr.org/documents/publications/guidingprinciplesbusinesshr_en.pdf
xiiUnited Nations Human Rights Council, “Guiding Principles on Business and Human Rights: Implementing the United Nations ‘Protect, Respect and Remedy’ Framework” 6/16/2011. https://www.ohchr.org/documents/publications/guidingprinciplesbusinesshr_en.pdf, p 15-16.
xiiiTaylor Telford, “Google failed to notify customers it put microphones in Nest security systems” The Washington Post, 2/20/2019. https://www.washingtonpost.com/business/2019/02/20/google-forgot-notify-customers-it-put-microphones-nest-security-systems/
xivEd Pilkington, “Google's secret cache of medical data includes names and full details of millions – whistleblower” The Guardian, 11/12/2019. https://www.theguardian.com/technology/2019/nov/12/google-medical-data-project-nightingale-secret-transfer-us-health-information
xvJenifer Valentino-DeVries, “Tracking Phones, Google Is a Dragnet for the Police” The New York Times, 4/13/2019. https://www.nytimes.com/interactive/2019/04/13/us/google-location-tracking-police.html
xviNathaniel Sobel, “Do Geofence Warrants Violate the Fourth Amendment?” Lawfare, 2/24/2020. https://www.lawfareblog.com/do-geofence-warrants-violate-fourth-amendment
xviiEmily Stewart, “Why Everybody is Freaking out about Political Ads on Facebook and Google” 11/27/2019. https://www.vox.com/recode/2019/11/27/20977988/google-facebook-political-ads-targeting-twitter-disinformation; Scott Spencer, Vice President Product Management, Google Ads, “An Update on Our Political Ads Policy” The Keyword, 11/20/2019. https://www.blog.google/technology/ads/update-our-political-ads-policy/
xviiiGoogle Profile, 2019 Ranking Digital Rights Corporate Accountability Index, https://rankingdigitalrights.org/index2019/assets/static/download/Google2019.pdf
xixCasey Newton, “The Terror Queue: These moderators help keep Google and YouTube free of violent extremism—and now some of them have PTSD” The Verge, 12/16/2019. https://www.theverge.com/2019/12/16/21021005/google-youtube-moderators-ptsd-accenture-violent-disturbing-content-interviews-video
xxCasey Newton, “YouTube Moderators are Being Forced to Sign a Statement Acknowledging the Job Can Give Them PTSD” The Verge, 1/24/2020. https://www.theverge.com/2020/1/24/21075830/youtube-moderators-ptsd-accenture-statement-lawsuits-mental-health
xxiSafiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism, NYU Press, February 2018; and Safiya Umoja Noble, “Google Has a Striking History of Bias Against Black Girls” Time Magazine, 3/26/2018. https://time.com/5209144/google-search-engine-algorithm-bias-racism/
xxiiSustainability Accounting Standards Board, SASB Materiality Map® 2018 https://materiality.sasb.org/
xxiiiGreg Waters, “SASB to Research Content Moderation on Internet Platforms” SASB Blog, 12/4/2019. https://www.sasb.org/blog/sasb-to-research-content-moderation-on-internet-platforms/
xxivBrooke Auxier et al, “Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information” Pew Research Center, 11/15/2019. https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/
xxvBrooke Auxier et al, “Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information” Pew Research Center, 11/15/2019. https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/
xxviJ. Clement, “Google - Statistics & Facts” Statista, 2/5/2020. https://www.statista.com/topics/1001/google/?
xxviiDaisuke Wakabayashi, “The Most Measured Person in Tech Is Running the Most Chaotic Place on the Internet” The New York Times, 4/17/2019. https://www.nytimes.com/2019/04/17/business/youtube-ceo-susan-wojcicki.html
xxviiiDaisuke Wakabayashi and Sapna Maheshwari, “Advertisers Boycott YouTube After Pedophiles Swarm Comments on Videos of Children” The New York Times, 2/20/19. https://www.nytimes.com/2019/02/20/technology/youtube-pedophiles.html
xxixBryan Menegus, “Here’s the Letter 1,400 Google Workers Sent Leadership in Protest of Censored Search Engine for China” Gizmodo, 8/16/18. https://gizmodo.com/heres-the-letter-1-400-google-workers-sent-leadership-i-1828393599; Noam Scheiber and Kate Conger, “The Great Google Revolt” The New York Times Magazine, 2/18/2020. https://www.nytimes.com/interactive/2020/02/18/magazine/google-revolt.html
xxxNoam Scheiber and Kate Conger, “The Great Google Revolt” The New York Times Magazine, 2/18/2020. https://www.nytimes.com/interactive/2020/02/18/magazine/google-revolt.html
xxxiDaisuke Wakabayashi, Erin Griffith, Amie Tsang, and Kate Conger. “Google Walkout: Employees Stage Protest over Handling of Sexual Harassment.” The New York Times, 11/1/2018. https://www.nytimes.com/2018/11/01/technology/google-walkout-sexual-harassment.html
xxxiiActionAid and Global Witness, “Policy Briefing: EU’s Regulation on Investor Disclosure on Sustainability Risks and Due Diligence: How Can It Work Most Effectively For People and Planet?” 12/9/2019. https://www.globalwitness.org/en/campaigns/european-union-brussels-global-witness-eu/eu-investor-disclosure-regulation/

xxxiiiGoogle Profile, 2019 Ranking Digital Rights Corporate Accountability Index, https://rankingdigitalrights.org/index2019/assets/static/download/Google2019.pdf
xxxiv“Microsoft Global Human Rights Statement” https://www.microsoft.com/en-us/corporate-responsibility/human-rights-statement
xxxv“Intel Global Human Rights Principles,” Effective February 2009; last updated November 2019. https://www.intel.co.uk/content/www/uk/en/policy/policy-human-rights.html
xxxvi“Unilever’s Human Rights Policy Statement” https://www.unilever.com/Images/unilever-human-rights-policy-statement_tcm244-422954_en.pdf
xxxviiNestlé, “Creating Shared Value and Meeting Our Commitments” UN Guiding Principles Reporting Framework Index of Answers 2019. https://www.nestle.com/sites/default/files/2020-03/creating-shared-value-ungprf-index-of-answers-2019.pdf, pg. 5
xxxviiiYouTube, “Hate Speech Policy” https://support.google.com/youtube/answer/2801939?hl=en#
xxxixYouTube Help, “The Importance of Context” https://support.google.com/youtube/answer/6345162
xlTwitter, “Hateful Content Policy” states “To help our teams understand the context, we sometimes need to hear directly from the person being targeted to ensure that we have the information needed prior to taking any enforcement action.” https://help.twitter.com/en/rules-and-policies/hateful-conduct-policy
xliEmily Stewart, “Why Everybody is Freaking Out About Political Ads On Facebook and Google” Vox.com, November 27, 2019, https://www.vox.com/recode/2019/11/27/20977988/google-facebook-political-ads-targeting-twitter-disinformation; Scott Spencer, Vice President Product Management, Google Ads “An Update On Our Political Ads Policy,” The Keyword, 11/20/2019. https://www.blog.google/technology/ads/update-our-political-ads-policy/
xlii“Political Advertising On Google” https://transparencyreport.google.com/political-ads/home?hl=en_GB
xliiiGerrit De Vynck, “Google’s Limits on Political Ads Have a Loophole Trump Could Tap,” Bloomberg, 12/2/2019. https://www.bloomberg.com/news/articles/2019-12-02/google-s-limits-on-political-ads-have-a-loophole-trump-could-tap
xlivGoogle Profile, 2019 Ranking Digital Rights Corporate Accountability Index, https://rankingdigitalrights.org/index2019/assets/static/download/Google2019.pdf
xlvNathalie Marechal and Ellery Roberts Biddle, “It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge,” Ranking Digital Rights, 3/16/2020. https://d1y8sb8igg2f8e.cloudfront.net/documents/REAL_FINAL-Its_Not_Just_the_Content_Its_the_Business_Model.pdf
xlvi“Responsible AI Practices,” https://ai.google/responsibilities/responsible-ai-practices/
xlviiNathalie Marechal and Ellery Roberts Biddle, “It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge,” Ranking Digital Rights, 3/16/2020. https://d1y8sb8igg2f8e.cloudfront.net/documents/REAL_FINAL-Its_Not_Just_the_Content_Its_the_Business_Model.pdf
xlviiiNathalie Marechal and Ellery Roberts Biddle, “It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge,” Ranking Digital Rights, 3/16/2020. https://d1y8sb8igg2f8e.cloudfront.net/documents/REAL_FINAL-Its_Not_Just_the_Content_Its_the_Business_Model.pdf
xlixGlobal Network Initiative, “Company Assessments” https://globalnetworkinitiative.org/company-assessments/
lGlobal Network Initiative, “Company Assessments” https://globalnetworkinitiative.org/company-assessments/
li“Google Responsible Supply Chain Report 2019” https://services.google.com/fh/files/misc/google_2019-rsc-report.pdf
liiGoogle Profile, 2019 Ranking Digital Rights Corporate Accountability Index, https://rankingdigitalrights.org/index2019/assets/static/download/Google2019.pdf
liiiGoogle Profile, 2019 Ranking Digital Rights Corporate Accountability Index, https://rankingdigitalrights.org/index2019/assets/static/download/Google2019.pdf
livOffice of the Attorney General, “Attorney General Becerra Publicly Releases Proposed Regulations under the California Consumer Privacy Act” State of California Department of Justice, 10/10/2019. https://oag.ca.gov/news/press-releases/attorney-general-becerra-publicly-releases-proposed-regulations-under-california ; and Attached Fact Sheet https://oag.ca.gov/system/files/attachments/press_releases/CCPA%20Fact%20Sheet%20%2800000002%29.pdf
lvOffice of Representative Anna Eshoo, “Summary of H.R. 4978, the Online Privacy Act” 11/5/2019. https://eshoo.house.gov/sites/eshoo.house.gov/files/migrated/wp-content/uploads/2019/11/One-Pager-Online-Privacy-Act-Eshoo-Lofgren.pdf; and Full Text of H.R.4978 (116th Congress) available at https://www.congress.gov/bill/116th-congress/house-bill/4978/text
lviEU General Data Protection Regulation, “What is GDPR, the EU’s new data protection law?” Accessed 4/29/2020. https://gdpr.eu/what-is-gdpr
lviiThe European Commission, “The EU Code of conduct on countering illegal hate speech online” Accessed 4/29/2020. https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en
lviii“Act to Improve Enforcement of the Law in Social Networks” as adopted by the Bundestag, 7/12/2017. https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/NetzDG_engl.pdf;jsessionid=829D39DBDAC5DE294A686E374126D04E.1_cid289?__blob=publicationFile&v=2
lix“Act to Improve Enforcement of the Law in Social Networks,” German Bundestag, 2017. https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/NetzDG_engl.pdf;jsessionid=829D39DBDAC5DE294A686E374126D04E.1_cid289?__blob=publicationFile&v=2
lxAngelique Chrisafis, “French Online Hate Speech Bill Aims To Wipe Out Racist Trolling” The Guardian, 6/29/19. https://www.theguardian.com/world/2019/jun/29/french-online-hate-speech-bill-aims-to-wipe-out-racist-trolling
lxiUnited Kingdom Home Office, Department for Digital, Culture, Media, and Sport, “Online Harms White Paper” 2/12/20. https://www.gov.uk/government/consultations/online-harms-white-paper/online-harms-white-paper
lxii“Google Sued By New Mexico Over Claims It Spies on US Students,” BBC News, 2/21/20. https://www.bbc.co.uk/news/world-us-canada-51591420
lxiiiAdam Satariano, “Google Is Fined $57 Million Under Europe’s Data Privacy Law” The New York Times, 1/21/19. https://www.nytimes.com/2019/01/21/technology/google-europe-gdpr-fine.html
