Technology has become society’s answer for everything. The internet has virtually replaced brick-and-mortar libraries, permeated most aspects of research, and become the primary means by which people communicate and share information. However, online information is not always accurate. Worse, it is often edited to purposefully alter perception. A lack of accountability further encourages erroneous internet content: regulations are few, and vague laws leave even fewer methods of enforcement. Monopolies such as Google, Amazon, Facebook, and Microsoft benefit from this unregulated structure, profiting from consumers’ inability to hold them accountable for violations. This creates a conflict with consumer privacy, as people grant access without understanding what they have allowed or how their information can be used.
Neglecting or refusing to update internet policy and law in order to preserve this lack of regulation, enforcement, and accountability is unethical, if not illegal. Violations of consumer trust are not always expressly illegal, yet such violations have an enormous impact on companies that rely on consumer opinion to stay in business. But why would companies maintain trust when there is no competition for their services? At some point, conventional wisdom says, the scale will tip against the company: if Facebook fails to keep users, its power, prestige, profit, and influence will diminish accordingly.
When consumers have influence, they feel invested. When a company instead subverts consumer trust to benefit itself, as in the privacy violations Facebook recently allowed, the consequences are far-reaching. Facebook and similar tech companies feel entitled to require consumers to disclose personal information before granting access to their products. This collection extends to individuals with whom consumers share a connection: a single Facebook account can expose personal information for thousands of people. Even deleting an account will not eliminate the information gathered, as Facebook reserves the right to keep information others have shared after the fact.
Consumers were aware their information was being gathered to personalize commercial ads within Facebook’s application, and they considered this an acceptable trade for free use of applications, operating systems, and connectivity. They were not aware, however, of the extent to which their personal information was made available to third-party companies that obtained it for exploitation. In the Cambridge Analytica scandal, Facebook allowed personal information from 87 million accounts to be harvested and passed to a political consulting firm, which then used psychological profiles derived from those accounts to craft advertisements designed to sway users toward electing Donald Trump. Additionally, Facebook does not monitor or prevent fake news and ads on its platform, including those planted by foreign actors, designed to manipulate the US public and potentially compromise democratic integrity.
After a breach of public trust, regaining the previous level of confidence takes more effort than earning it did initially. In the past, innovators and politicians have used modified business models to influence the public: company mission statements (political platform), management of employees (running a campaign), marketing (lobbyists), and consumers (the public). To monitor social media companies and rebuild consumer trust, it may be advantageous to return to the traditional business tactics originally used to win that trust. The previous business model must be redefined and expanded, however, because current practices do not address how Facebook leverages user data in order to better sell to advertisers. Zuckerberg must confront privacy issues directly when restructuring the business model, rather than appearing to manipulate consumers. The methods behind his initial success should be reapplied: he must cater to the psychological needs of consumers by developing a product they use to enhance their lives, rather than a product that uses them to build external revenue.
The current business model at Facebook does not address how devastating privacy violations can be to the consumer, and the company has failed to monitor such violations. This abuse may be purposeful, for profit, but it can also stem from inadvertent neglect and complacency. Jane Dalton, quoting French president Emmanuel Macron in The Independent in 2018, reported that “Google and Facebook are becoming too big to be governed and may face getting dismantled.” In an article by Jeremy White in 2018, Mark Zuckerberg stated, “it’s not a question of ‘if regulation’, it’s a question of ‘what type.’” The public is concerned both that dismantling social media sites would fundamentally alter them and that government regulation would impose overly restrictive censorship laws. Either outcome would render social media sites unattractive to users.
The best way to protect consumers from abuse without dismantlement or overzealous censorship is for the consumer to become more involved: consumers of social media create the very content of social media and should therefore become real shareholders and stakeholders in its management. This would enable the public to provide input on how the company coordinates, collaborates, updates, synchronizes, and corrects mistakes. Under such a commonwealth model, the public could also be held more liable for the content it provides to social media sites. A law to this effect would therefore not be a platform to limit freedom of speech, enforce politically correct commentary, or censor content, but rather to hold companies – and their users – responsible for posting and hosting illegal content. The potential for neglect and complacency is avoided; the burden of responsibility is shared and enforced by all. The masses feel reinvested, consumers have a product they want, and companies can earn back credibility.
Ensuring this new business model functions as desired means changing laws and policies to prevent future breaches, rather than signing agreements that can be reworded to benefit the company over the consumer. During the transition, laws and policies must not stall in the diagnostics phase; the intent of the law should be followed so that solutions and preventative measures with real consequences can be implemented. Because the company is crowd-owned, interpretation rests on how a reasonable public understands that intent. This prevents laws and policy from being delayed while lawyers sneak legal loopholes into regulations; when such loopholes are found, they should be amended immediately. Policies and laws should be made public so that the public can act as mediator.
To sustain this new business model, the momentum of reaching end-state goals must not distract from assessing and understanding new risks. Mark Zuckerberg seems to have taken this to heart: he has promised to assess the lack of oversight at Facebook, to assess future risk, and to address the lack of strong leadership. Leadership needs to be ethical as well as supervisory: an anti-corruption Chief Compliance Officer should be put in place, willing to take accountability for compliance so that profits do not compete with anti-corruption, compliance, and ethics programs. If people control their own privacy settings, then let them decide what information to give and how much compensation to receive in trade. As companies like eBay do, Facebook can take a percentage of each transaction. This ensures Facebook will have sufficient resources to continue providing a social media platform without allowing the platform to become as large and unmanageable as it is now. Clear procedures and policies should be made available, with clear documentation of expectations. Communication and training will ensure compliance, but training cannot be one-size-fits-all. Monitoring must be sufficient to ensure anti-corruption, though here it would focus on Facebook’s policies rather than on its consumers. This holds owners, executives, employees, and shareholders equally accountable for wrongdoing and provides a consistent means of corrective action.
A new European Union privacy law, the General Data Protection Regulation (GDPR), is already in place and gives consumers greater control over the use of their data. Harper Neidig explains that users will be able to request the personal information companies hold on them, request that their information be deleted, order the cessation of distribution of personal data to third parties, and revoke consent for personal information already shared (Neidig 2018). In this new business model, the public does not partner with the social media platform, but it does control information and how it is shared: nothing can be shared without the specific release of the initial poster. Such release restrictions limit the availability of information according to constraints set by the initial poster, which in turn limits how much manipulation can occur after posting. A full release is considered fully public and available. Releases also ensure information can be traced to the initial poster, so that content laws are enforced, fake news is prevented, and unauthorized information leaks are quickly contained. When each consumer controls their own personal information, selling that information becomes an individual choice.
When such changes are implemented, consumers are more likely to regain trust. Psychologically, Facebook needs to return to a time when the wishes of consumers mattered more than profit. Zuckerberg needs to target connectivity and the exchange of ideas between people once again, rather than targeting people for their private information in order to exploit commercial activity. Consumers should not be treated like credit, but should control the means by which companies earn credibility. When the masses have such influence, they feel invested and protected, and they will trust the future of the company. This may be asking for a new psychology of Facebook; but clearly, in light of recent events, a new psychology is desperately needed.