Facebook knows what people think of its effect on mental health.
Some worry it contributes to depression, short attention spans or low self-esteem. That it's a time-suck, a distraction and a worrisome presence in their children's lives. They're torn between chasing the rush of "likes" and feeling guilty over time lost staring at a screen.
In a self-aware blog post published Friday, researchers at Facebook attempt to address those concerns. The post asks, "Is spending time on social media bad for us?"
It doesn't answer the question, but the post says Facebook is concerned about the issue and is earnestly investing in research to learn more.
It also announced a feature it thinks might help: Facebook is rolling out a new 30-day "mute" button for people, pages and groups, which it says will give users more control over what they see. The people, pages and groups you mute will not be notified about the action.
The post comes after a number of former employees and investors criticized Facebook publicly. Former executive Chamath Palihapitiya recently said, "I think we have created tools that are ripping apart the social fabric of how society works," although he recently walked that statement back. Sean Parker, who was an early investor in the company, said it was "exploiting a vulnerability in human psychology" and worried about Facebook's impact on relationships, productivity and children's brains.
Facebook also received criticism recently when it released a messaging app aimed at kids.
"We know that people are concerned about how technology affects our attention spans and relationships, as well as how it affects children in the long run," said Facebook's director of research David Ginsberg and research scientist Moira Burke in Friday's post.
They cite studies that say social media can be detrimental to mental health, and others that say the opposite. The authors conclude that social media itself isn't good or bad; how you use it determines its impact on your mental well-being.
A problem with shifting responsibility from the company to the users is that people often can't identify what's harmful to them on a conscious level, according to Holly Shakya, an assistant professor at UC San Diego.
Shakya studies the impact of Facebook on mental health. She says one way social media can be damaging is negative social comparisons. That's when people see carefully curated versions of other people's lives and then feel bad about their own. They likely won't think to mute these kinds of friends, who could number in the hundreds.
She also thinks interacting on social media can detract from real-world relationships.
"This media doesn't give you what a human being really needs to have a meaningful social interaction, like eye contact, physical contact and posture," says Shakya.
Facebook's Ginsberg and Burke say they want Facebook to "enhance" people's meaningful real-world relationships, not detract from them. Their post touts the various ways the company is trying to improve the well-being of users.
"We're working to make Facebook more about social interaction and less about spending time," said the post.
In addition to the new mute feature, Facebook has been making regular changes to the news feed so it shows less clickbait and more posts from close friends. In 2015, it launched the "Take a Break" tool so people could hide posts from exes after a breakup. It also has a number of suicide prevention initiatives and resources, including an AI tool that identifies suicidal posts.
Facebook is investing money into more research on digital distraction and social media's impact on kids, according to the post. The company has its own internal research team that will share findings publicly. It is also hosting a summit for academics and industry leaders next year to discuss the issues.
Even with the best of intentions and a staff of qualified researchers, the company's findings may invite criticism. Facebook is a public company with a vested interest in keeping people logged on and scrolling.
Shakya says she'd be skeptical about a tech company's internal research, but it would depend on what the studies claimed.
"There are companies that really do care about their impact on the world and do self-assessment," she said. "If Facebook just waves its hands and tries to fool people into thinking things are just hunky dory, it's going to backfire."