Facebook chief executive Mark Zuckerberg announced last week that he would spend 2018 fixing the problems with his platform that enable bad actors to do harm, such as Russia's interference in the United States election. As Mr Zuckerberg's former mentor, I applaud this commitment and would like to offer my friend a road map to protect our democracy.
I first noticed bad actors exploiting Facebook in early 2016, and contacted Mr Zuckerberg and Ms Sheryl Sandberg, Facebook's chief operating officer, just before the election. I spent four months trying to convince Facebook that its algorithms and advertising business model were vulnerable to bad actors. They were reluctant to accept my conclusion then and continued to deny and deflect until the end of 2017. The company still argues that it is not responsible for the actions of third parties on its platform.
I can understand that it was initially difficult for Facebook to believe that its product was at fault, but there is no longer any excuse for inaction.
What we need from Mr Zuckerberg is acknowledgment that Facebook has some responsibility for what others do on its platform and that it is prepared to make fundamental changes to limit future harm. This week's announcement of changes to Facebook's News Feed may be a positive step, but it's not a solution. Had this change been in place in 2016, it might even have exacerbated the Russian interference by increasing the exposure of Facebook group users to misinformation.
I recommend that Facebook follow the example of Johnson & Johnson during the Tylenol poisonings in 1982. Johnson & Johnson did not cause the tampering. It was not technically required to take responsibility, but it knew it was the right thing to do. The company took immediate and aggressive action to protect its customers. It took every bottle of Tylenol off every retail shelf and redesigned the packaging to make it tamper-proof. There was a substantial economic cost in the short run, but the company built trust with customers that eventually offset it.
Following this model, the first step for Facebook is to admit it has a problem. Mr Zuckerberg did that in his blog post. The next step is for Facebook to admit that its algorithms and advertising business model invite attacks by bad actors. By giving users only "what they want", Facebook reinforces existing beliefs, makes them more extreme and makes it hard for users to accept unpleasant facts. Instead of bringing people together, Facebook drives us apart.
The same tools that make Facebook so addictive for users and so effective for advertisers are dangerous in the hands of bad actors. And because those tools are automated, Facebook cannot currently prevent harm at the scale at which it occurs. It will happen again and again until Facebook takes aggressive action. The problem cannot be fixed by hiring contractors to review problematic posts. The company needs to change the priorities of its algorithms and retool its business model. It needs to act like Johnson & Johnson.
Facebook also owes its users a personal apology. Thanks to Facebook's negligence, 126 million Americans were exposed to Russian manipulation, and most of them do not realise it. To compensate, Facebook must notify every user touched by Russian election interference with a personal message explaining how the platform was manipulated and how that manipulation harmed users and the country. Each message should include copies of every post, group, event and ad the user received. Facebook is the only entity able to break through to users trapped in its filter bubbles.
US Senator Richard Blumenthal made this request several months ago. Facebook's response was a "portal" that was as hard to find as it was inadequate.
Finally, Mr Zuckerberg should volunteer to testify in an open hearing before Congress. The country needs to hear him explain Facebook's strategy and design choices and justify its refusal to accept responsibility for what bad actors are doing on the platform.
Facebook is tailor-made for abuse by bad actors, and unless the company takes immediate action, we should expect much more of it, including interference in upcoming elections. If Facebook chooses to protect its current business model, it has enough power and influence to skate by without implementing the changes needed to protect democracy and public health in the United States and across the world.
But users and regulators are watching. Mr Zuckerberg and Ms Sandberg have an opportunity to be heroes or villains. The choice is theirs.
• The writer is a managing director at Elevation Partners and was an early investor in Google and Facebook.