Facebook has announced the banning of several individuals it characterized as "dangerous" political figures for their rhetoric: Milo Yiannopoulos, Laura Loomer, Paul Joseph Watson, Alex Jones, Paul Nehlen, and Louis Farrakhan. The company, led by CEO Mark Zuckerberg, removed them for violating its standards, in what the Wall Street Journal called Facebook's "most sweeping" action "yet against online provocateurs".

Far-right extremists need to be shut down, silenced, muted. They should not be allowed to spew their hatred on any social media outlet. Facebook is within its rights to evict these accounts, even if they have done nothing criminal. As a private company, Facebook can police its own content and make whatever decisions it deems appropriate. The First Amendment does not apply here; these figures' free-speech rights are not being violated.

But the decision creates a slippery slope. The ban could have unintended consequences, affecting figures like Minnesota Democrat Rep. Ilhan Omar, who has been accused of hate speech and of making anti-Semitic comments. Since entering office in January, the congresswoman has questioned the loyalty of Americans who support Israel and accused US politicians of being beholden to Israeli money. Criticizing Israel for its occupation of Palestinian land, and criticizing those who support that occupation, is not anti-Semitic, but in many quarters in the US it is seen as such. So it is easy to see how, if Zuckerberg is deciding what can and cannot be posted on Facebook, a figure as controversial as Omar could be banned. Some people support her views on Israel and others don't. Should she be banned to please the latter group?

There are two tricky problems here. The first is that hate speech is not clearly defined; if you live in the anti-Omar camp, you might well categorize her comments as hateful and anti-Semitic, as her opponents claim they are.
Today, people such as Jones and Yiannopoulos are banned. Tomorrow, it might be Omar's turn.

The second problem concerns where the line is drawn. What separates what can and cannot be said, and who is drawing that line? The Facebook ban runs counter to the long-running American tradition of talking about everything, even subjects that are potentially "dangerous", to use Facebook's wording. Facebook, like other social media organizations, has taken the position that ideas and expressions, even if they do not violate the law, can be too dangerous to disseminate and so must be suppressed. But a company, even a private one, excluding certain Americans from its platform makes some people uncomfortable; they believe strong free-speech traditions are in trouble. President Trump, for example, has blasted Facebook and Twitter in a series of tweets, slamming the social media platforms for silencing content.

For many others, though, this Facebook move is a positive development that is long overdue. After all, social media is, in no small part, responsible for the rise of these hate-mongers. They may have the right to spread vile hatred, but Facebook has the right, and the responsibility, not to give them a platform. The First Amendment argument is powerful, and it is unique to America. But it is not unlimited. It is not unrestricted. Shouting fire in a crowded theater when there is no fire endangers people, so some restrictions can be imposed. Kicking purveyors of hate off Facebook will not halt the rise of racism or other bigotry. Nonetheless, it makes their vileness harder to spread. Because Facebook is taking away the speech of private individuals, the ban will probably end up being challenged in a courtroom. Until then, Facebook is making good on its promise to combat hate speech.