By Kelly Sarabyn
Trust in Facebook is at an all-time low. Eighty-one percent of people have little to no confidence that Facebook will protect their data or privacy. This lack of trust has real financial consequences: following the Cambridge Analytica data breach, Facebook’s market value plunged $80 billion. It turns out customer affinity, and alignment with the brand’s mission, have tangible value.
The user data obtained by Cambridge Analytica was allegedly used to help Donald Trump’s campaign, echoing the earlier use of Facebook’s advertising platform by Russian operatives to try to influence the 2016 presidential election. In both cases, the root cause was not a hack or security breach, but rather a savvy exploitation of the tools that Facebook put in place to allow advertisers to leverage users’ information.
Americans responded with outrage at Facebook for allowing these deceptions to happen. The social network apologized, and claimed it would hire more moderators and fix its algorithms to ensure its “guidelines” were being followed. Yet these guidelines were lengthy and obscure: in April, for example, it was revealed that Facebook had reduced the reach of a popular pro-Trump page run by two African-American women because their content was “unsafe.” In true 1984 style, the women were told this decision about their content was “unappealable.”
Confronted by this incident in Congress, CEO Mark Zuckerberg said it was an “enforcement error,” but declined to elaborate on the enforcement or the policies being enforced. Indeed, Zuckerberg’s testimony to Congress on how his company handles and monetizes the massive amount of data it collects was cagey and opaque.
As recently as last year, Facebook updated its mission: to “give people the power to build community and bring the world closer together.” The company’s reactive and fumbling approach to these scandals is public affirmation of a failure to align the operations of the company behind that mission: Facebook’s business model with respect to user data is buried beneath lengthy terms and conditions only a high-priced lawyer might understand, and even Zuckerberg seems unable, or unwilling, to clarify them publicly. The revelation of recent abuses stemming from this shadowy business model exposed a clear disconnect from Facebook’s public purpose, so it’s no surprise any trust the consumer might have had in the company has vanished.
Facebook’s crime isn’t monetizing user data. After all, if the product’s free, the user is the product. As big data technology continues to grow, the practice of collecting and leveraging customer data will only become more common. Companies that place customer data at the core of their business strategy need to align their data practices with their brand story: the brand story sets expectations, and acting consistently with it not only avoids alienating customers, it actively creates and fosters trust. By failing to do this, Facebook has tarnished its public image and lost not only market value, but valuable brand equity.
Twice as many young people trust Google and Amazon to do the right thing compared to Facebook. Americans overall express much greater trust in, liking for, and appreciation of Google and Amazon as well. Yet, like Facebook, both of these companies leverage massive amounts of user data for commercial ends. Amazon, which ranks highest in trust and appreciation surveys, collects and uses its customers’ data to sell more products. So why is there such a difference in customer perception?
Behind closed doors, Facebook’s COO Sheryl Sandberg allegedly asked other Facebook executives, “What business are we in?” and provided the answer as advertising. Publicly, Facebook claims it is in the business of empowering people to build community. The misalignment between internal expectations and the externally articulated mission of the organization leaves Facebook without a coherent vision. When a company’s contradictory stories become public, as they will — and in Facebook’s case, have — it erodes trust in a company.
Facebook was widely criticized, for example, when a leaked document showed the company telling potential advertisers it could identify when teenagers were feeling “insecure,” “worthless,” and “in need of a confidence boost,” and would help brands target and tailor ads to take advantage of these moods. It’s easy to see why this information would help Facebook meet its internal goal of selling advertising: the more information advertisers have about a person’s psychological state, the more likely their ads are to convert.
However, providing detailed psychological data on users’ weaknesses to brands does not empower people to build community. To the contrary, giving companies detailed information to exploit users’ vulnerabilities detracts from users’ ability to build community — they will feel besieged by content that is designed to prey on their vulnerabilities, and less likely to share. As a result, when this conversation between Facebook and a potential advertiser was leaked, users felt betrayed.
Following a spate of bad PR, Facebook has made weak attempts to move its internal business model toward its public purpose, claiming in January, for example, that it was changing its algorithms to prioritize content from friends, family, and groups over content from businesses. But Facebook still faces a fundamental contradiction in its brand by failing to identify whether the hero of its story is advertisers or users.
Contrast Facebook with Amazon’s use of data. Amazon’s animating purpose is to empower customers to quickly and conveniently buy what they want, and that purpose is consistent with its use of a massive stockpile of user data to enhance the customer experience and recommend products tailored to specific users’ interests and needs.
Amazon also offers brands the ability to put interest-based ads in front of its users, but, again, these are marked as ads and tailored to what the user is most likely to purchase. Amazon’s use of user data makes brands partners in delivering on its purpose, so it’s logical that users would embrace it.
This means there is no need for Amazon to obscure its policies, or peddle different stories to different audiences, fomenting distrust. Its easy-to-read FAQ, for example, states: “Information about our customers is an important part of our business, and we are not in the business of selling it to others.” It then lays out, in plain English, the exceptions to that policy.
Retailers have complained about Amazon’s refusal to share more of its user data, but Amazon knows the hero of its story is its customers, not the brands purchasing advertising or selling products on its platform. As a result, Amazon shares user data with retailers only when doing so gives customers better, more targeted product options. This use of customer data does make Amazon money: the better tailored the displayed products are, whether ads or simple recommendations, the more customers will buy on Amazon. But because the practice aligns with Amazon’s core purpose, it also benefits the customer.
Amazon’s not an outlier. Consumers have embraced scores of brands built on tracking and utilizing their data; in each successful case, there is a mutual benefit and one coherent message. Netflix constantly studies its customers, running anonymized analyses of their viewing habits, but it leverages that data to produce original content better suited to its users’ tastes and to improve viewing recommendations. This use of data benefits Netflix, as customers watch more content, as well as the customer, who can more easily discover content they like.
The John Hancock insurance company has taken data collection even further, gathering data on its customers via fitness wearables. By offering customers the option of enrolling in a program where their exercise is tracked via smart wearables, the company learns more about its plan enrollees than ever before, while enrollees earn discounts when the data shows they are exercising regularly. This data collection is transparent, not for sale to third parties, and designed to give customers more options and benefits. It’s more invasive than anything Facebook does, but it is consistent with the brand’s underlying mission of promoting healthy lifestyles, and it feels authentic.
Even the classic consumer loyalty card program shows one of the myriad ways companies can utilize customer data to make money. Customers happily surrender information about what they are purchasing in exchange for discounts or free services, and, over time, for better product selection at stores, more suited to what they actually want to buy.
As customer preferences become more and more specialized, companies will increasingly leverage big data technology to track their customers, and many will rely on this data as part of their business model. Doing this successfully is not only about security practices and clear terms; it requires aligning those practices with a core purpose that both customers and employees believe in.
When that happens, companies can be open and honest about their data practices, which provides the bedrock for customers to form a trustworthy and lasting relationship with the company. As Facebook has learned, violating that commitment is more than just a public relations headache: it causes material damage to the fabric of the community around the company itself.
Kelly Sarabyn is a manager at Woden. Whatever your storytelling needs may be, Woden can help. Read our extensive guide on how to craft your organization’s narrative, or send us an email at firstname.lastname@example.org to discuss how we can help tell your story.