On September 25, an article posted on Facebook’s newsroom page described how the platform is updating its guidelines and cracking down on illicit drug sales in hopes of curbing substance abuse.
Authored by Monika Bickert, Facebook’s Vice President of Global Policy Management, the article not only delineated how some of the new policies are being enforced but also addressed claims made by the Washington Post that same day.
While Bickert and other company representatives promised a crackdown on sales of illicit substances, the Washington Post claimed that the algorithms behind some of Facebook’s services — particularly those that deliver tailored feeds, such as Facebook’s Marketplace and Instagram — promoted the spread of illicit content.
The publication also claimed that Facebook has been exposing some of the most vulnerable members of its community to drugs and consequently substance abuse.
In response, Facebook’s statement highlighted that its representatives have been working with the Substance Abuse and Mental Health Services Administration to raise awareness about the epidemic, connect users who search for drugs with professional treatment, and support those who may be struggling with substance use disorders.
Along with the recent announcements made by Facebook, specific steps to address these issues were outlined. The steps included: establishing partnerships with outside organizations and trade experts to control illicit sales; flagging and disabling content that doesn’t adhere to established guidelines; proactively investigating accounts, hashtags, groups, and pages that are associated with prohibited content; and developing innovative technology to identify instances when users attempt to sell or trade drugs.
The announcement came after a number of recent promises and pledges made by Facebook employees during congressional hearings, which were held just over a month ago.
But according to the Washington Post article, even though pledges to adopt stricter regulations had been made, Facebook’s services still appear to serve as marketplaces for illicit trades, allowing drugs and other regulated goods to be advertised.
While Bickert’s article recognized that Facebook needed to do more to address the problem, it also claimed that the findings published by the Washington Post were misleading.
Prior to last month’s hearings, Facebook Founder and CEO Mark Zuckerberg had spoken about what his company is doing to increase the safety of its users — according to the most recent statistics available, released in June, the platform has an average of 1.47 billion daily active users and 2.23 billion monthly active users.
Zuckerberg also stated that the company had failed to address and take responsibility for the fact that its tools have been used for harm.
Amid apologies, Zuckerberg made a number of remarks focused particularly on how challenging it has been to fully enforce the community’s standards and policies, given Facebook’s rapid growth.
Facebook’s Marketplace was introduced in 2016 to every user over the age of 18 located in the U.S., Australia, the U.K., and New Zealand. The feature can now be used in more than 70 countries, and more than one in every three American Facebook users utilizes it.
On October 2, the company announced updates made not only to celebrate Marketplace’s second anniversary but also to improve users’ experiences.
This month’s announcement highlighted that new features, which use artificial intelligence, would be introduced to the service to allow people to have more shopping options and conduct faster transactions.
According to Facebook representatives, the new features are efforts to not only create a more reliable and secure community but also to facilitate the detection of inappropriate content.
The new feature directs users who search for hashtags or posts related to drugs to numerous free resources, including confidential referrals to addiction treatment providers and information on substance abuse and its prevention.
Now, those who use the app to look for content associated with certain words or hashtags, such as #opioids or #fentanyl, will see a pop-up that asks “Can we help?” and describes the aforementioned resources. The pop-up also offers three options: get support, see posts anyway, or cancel.
Instagram representatives explained that the new pop-up will be visible to users as they update the app. They also explained that the reason why users may choose to see posts anyway is that certain hashtags are not only used by people who are trying to buy or sell illegal substances but also by those who are struggling and looking for recovery support in the community.