Updated from Nov. 14, 2:45 p.m. EST.

Facebook (FB) has acknowledged that fake news is more than just a public relations problem for the site.

On Monday, Facebook updated its policy to "explicitly clarify" that fake news sites can't use the company's Audience Network tool, which lets publishers display Facebook-served ads on their own sites and apps in exchange for a cut of ad revenue, according to the Wall Street Journal. Alphabet's (GOOGL) Google made a similar move, banning misleading sites from its Google AdSense service.

The Pew Research Center found recently that 44% of Americans get at least some of their news from Facebook. That's a lot of people turning toward the Menlo Park, CA-based social media giant to find out what's going on in the world.

Much of the news users spread on Facebook, however, is inaccurate or outright false, a problem that drew widespread attention during this year's presidential election. A recent Buzzfeed analysis of more than 1,000 posts shared on Facebook from several news sites found that 38% of the posts on hyperpartisan right-wing pages were partly or entirely false, as were 19% of those on hyperpartisan left-wing pages. By contrast, 99.3% of the posts on mainstream sites that Buzzfeed analyzed were partly or entirely truthful.

But websites pumping out false content generate more Facebook shares, comments and reactions than traditional mainstream political sites. Buzzfeed found that Occupy Democrats, the hyperpartisan page with the highest median shares per post, earned a median of 10,931 shares per post, while CNN Politics garnered a median of just 50.

With so much inaccurate content flooding news feeds, the New York Times said recently that fake news has become "media's next challenge."


Facebook CEO Mark Zuckerberg defended his company this weekend by saying that 99% of the content users view on the site is "authentic." This followed Zuckerberg's claim last Thursday that "the idea that fake news on Facebook -- of which it's a very small amount of the content -- influenced the election in any way is a pretty crazy idea."

But President Barack Obama condemned fake news on Facebook last week during a Hillary Clinton rally at the University of Michigan prior to Tuesday's election, saying that misinformation on the social media site "creates this dust cloud of nonsense." He added that if these false reports are repeated and shared enough times, "people start believing it."

It's still unclear whether the growing presence of fake news will drive users away from Facebook, however. For the third quarter of 2016, the company saw a 17% year-over-year rise in daily active users and a 16% year-over-year increase in monthly active users, showing that Facebook has plenty of momentum on its side.

David Boardman, dean of Temple University's School of Media and Communication, said the company has not only a "moral" obligation to address the issue but also a "business imperative."

"Growth in the volume of 'fake news' could definitely push users away from Facebook, so they are compelled to address the issue," he said. "Certainly, that will be a challenge. But when, for example, more than a million users share a fabricated news story that Pope Francis had endorsed Donald Trump for president, the implications for both our political system and Facebook's role are powerful and clear."

Hilliard Lyons analyst Stephen Turner said that "following the election, we do expect time usage and sharing [on Facebook] to abate." But he added that users will most likely remain engaged with the site, only tweaking their news feeds to get a more balanced source of political news.

Adjusting news feed settings may not help users avoid fake news, however, as the company's own features and algorithms sometimes spread false content as well.

This summer Facebook came under fire after the company began relying more heavily on algorithms, rather than human curators, to determine its 'Trending Topics.' As a result, computer-generated article descriptions and titles have become prone to error, and incorrect information has spread more readily across Facebook.

In August, a false story about Fox News journalist Megyn Kelly supporting Democratic presidential nominee Hillary Clinton jumped to the top of the trending bar and stayed there for hours before Facebook employees flagged it as fake.

The company has made more efforts to crack down on misinformation. Adam Mosseri, Facebook's VP of product management, said in a statement Thursday that the social network is working to identify inaccurate content through algorithms informed by community feedback and to keep it from spreading on the company's 'Trending' section, but he acknowledged that the work isn't finished.

Zuckerberg seemed more hesitant to take drastic action. He said in a statement this weekend that the company must "proceed carefully" in determining what content is truthful and what is false. "I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves," Zuckerberg said.

Dan Kennedy, an associate professor at Northeastern University's School of Journalism, argued that the social media giant does have an obligation to address its fake news problem. "News from reliable media organizations should be weighted more favorably than content from meme farms so that users see more of the good stuff and less of the bad," he said. Although Kennedy noted that it's unlikely Facebook's false content had an impact on this year's election, he added that the site "is nevertheless journalism's largest and most important distribution channel."

Recode co-founder Kara Swisher said on CNBC's 'Squawk Alley' Friday morning that "[Zuckerberg] will continue to insist that they don't have editorial control over these things... and people shouldn't believe everything they read." But she said the company "does have some [responsibility], to some extent" for the fake news.

"Facebook has to face the fact they have an influence the way television networks have, the way newspapers had, and the question is, what are they going to do about it?" Swisher said.
