Social Media Regulation In African Nations Will Require More Than International Human Rights Law

from the isps and human rights department

Much emphasis has been placed on content moderation as carried out by platforms – the rules social media companies apply when deciding what content stays online. However, limited attention has been paid to how actors other than the platforms themselves, in this case governments, seek to regulate them.

This piece focuses on African governments, which mostly carry out that regulation through law. These laws can be broadly divided into two categories: direct and indirect regulatory laws. Direct regulatory laws can be seen in countries like Ethiopia and, more recently, Nigeria. They are similar to the German Network Enforcement Act and the French online hate speech law, which directly impose responsibilities on platforms, require them to remove online hate speech within a certain period of time, and punish non-compliance with severe sanctions.

Section 8 of the Ethiopian Proclamation on the Prevention and Suppression of Hate Speech and Disinformation 1185/2020 imposes various responsibilities on social media platforms and related actors. These include suppressing and preventing the spread of disinformation and hate speech on their platforms, observing a twenty-four-hour window within which such content must be removed, and aligning their policies with the first two responsibilities.

The proclamation also assigns responsibility for reporting on, and raising public awareness of, platform compliance to the Ethiopian Broadcasting Authority – the body legally empowered to regulate broadcasting services. The Ethiopian Human Rights Commission (EHRC), Ethiopia’s National Human Rights Institution (NHRI), also shares responsibility for public awareness. But it is the Council of Ministers, the body responsible for implementing laws in Ethiopia, that can provide further guidance on the responsibilities of social media platforms and other private actors.

In Nigeria, the Protection from Internet Falsehood and Manipulation and Other Related Matters Bill has yet to become law. The bill is intended to regulate disinformation and coordinated inauthentic behavior on the Internet. It is modeled on Singapore’s law, which the current United Nations Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression has criticized for the threats it poses to online expression and to rights more generally.

One of the main criticisms of these laws is that they are vague and pose a threat to online expression. For example, the Ethiopian law defines hate speech broadly and does not account for the contextual factors that must be considered when categorizing online speech as hateful. As for the Nigerian bill, there are no clear oversight, accountability, or transparency mechanisms to check the government’s sweeping power to determine what counts as disinformation.

Indirect regulatory laws are used by governments, through their telecommunications regulators, to force Internet Service Providers (ISPs) to block social media platforms, typically on grounds of public emergency or national interest. What constitutes such emergencies or interests is vague and, in many cases, serves as cover for silencing voices or platforms critical of the government.

In January 2021, the Ugandan government ordered ISPs to block Facebook, Twitter, WhatsApp, Signal, and Viber. The order, issued by the communications regulator, came a day after Facebook announced it was taking down pro-government accounts that shared disinformation.

In June 2021, the Nigerian government ordered ISPs to block access to Twitter on the grounds that its activities were capable of undermining Nigeria’s corporate existence. However, there were competing views attributing the suspension to both remote and immediate causes. The remote cause was the role Twitter played in connecting and mobilizing the public during the #EndSARS protests against police brutality, while the immediate cause was attributed to Twitter’s deletion of a tweet by President Muhammadu Buhari that referred to the country’s civil war, contained veiled threats of violence, and violated Twitter’s abusive behavior policy.

In May 2021, Ethiopia had only just lifted a block on social media platforms in six locations across the country. Routine shutdowns like this are commonplace among African governments, often occurring during elections or major political developments.

On closer inspection, the overarching challenge with both forms of regulation is the lack of accountability and transparency, particularly on the part of governments, in enforcing these provisions. Social media platforms are also complicit, as there is little or no information about the kind of pressure they face from these government actors.

In addition to the mainstream debates about the governance of social media platforms, the time has come to consider broader forms of regulation, particularly how they manifest outside Western systems and what threats such regulation poses to online expression.

One solution that has been proposed, but also heavily criticized, is the application of international human rights standards to the regulation of social media. It has been argued that this standard is preferable because of its applicability across contexts. However, its greatest strength also seems to be its greatest weakness – how can this standard be applied in local contexts, given the complexity of governing online speech and the large number of actors involved?

To work toward effective solutions, we need to rethink and redefine the traditional governance roles not only of governments and social media platforms, but also of ISPs, civil society, and NHRIs. For example, the unchecked power of most governments to determine online harm needs to be reconsidered, to ensure there are judicial reviews and human rights impact assessments (HRIAs) of proposed government bans on social media.

ISPs also need to be encouraged to step into the fray, stand up for human rights, and stop blocking social media platforms every time governments make such problematic demands. For example, they should join other stakeholders like civil society and academia in promoting laws and policies that mandate judicial review and HRIA requirements before ISPs comply with government requests to block platforms or even content.

Applying international human rights standards to regulating social media is not where the work stops, but where it begins. First, the actors involved in social media regulation – governments, social media platforms, private actors, local and international civil society organizations, NHRIs, and intergovernmental bodies such as the United Nations and the African Union – must define a typology of harms as well as identify the actors actively involved in such regulation. To ensure they meet the challenges of such regulation, the responsibilities of these actors must be anchored in international human rights standards, but in a way that has these actors actively communicating and working together.

Tomiwa Ilori is currently a PhD student at the Centre for Human Rights, Faculty of Law, University of Pretoria. He also works as a researcher for the Centre’s Expression, Information and Digital Rights Unit.

Techdirt and EFF are collaborating on this Techdirt Greenhouse discussion. On October 6th, from 9:00 a.m. to 12:00 p.m. PT, many of the authors in this series will discuss and debate their pieces in front of a live virtual audience (register here).


Filed Under: africa, content moderation, ethiopia, human rights, isps, nigeria, uganda
