As students sit the last of the Kenya Certificate of Primary Education (KCPE) examinations, two new studies urge us to look beyond the new education system to the wider learning ecosystem if the new Competency Based Curriculum is to succeed.
Nearly 1.5 million candidates have the honour of closing the book on nearly four decades of the 8-4-4 education system this week. Widely criticised for its emphasis on theory and learning to pass exams, the 8-4-4 system also lacked an inclusive disability lens, support for careers outside formal employment, and digital literacy.
The shift to the new CBC is probably Kenya's most disruptive education reform since independence. The challenges are many, and it will take time for the nation's teachers, parents, and students to adapt to and resource the 2-6-6-3 system.
In 2023, about 32.7 per cent of the population (17.8 million people) regularly used the internet, 10.5 million engaged with one or more social media platforms, and as many as 67 per cent of Kenyan children (aged 12-17) could be consuming and creating digital content today. Two new multi-country studies released this week had Kenya in their sights for this reason. Both alert us that raising ethical, empowered, and engaged young citizens requires a 360-degree approach to learning.
The Amnesty International and Algorithmic Transparency Institute report "Driven into the Darkness: How TikTok encourages Self-harm and Suicidal Ideation" found that it can take only 20 minutes for Kenyan, Filipino and American children to encounter harmful mental health content.
Watching mental-health-related videos triggered the "For You" feed to supply a deluge of similar videos romanticising, normalising, or encouraging suicide. Children reported that TikTok affected their ability to sleep, learn and spend social time with friends, leading to anxiety, depression and thoughts of self-harm. The second study, "I feel exposed: Caught in TikTok's Surveillance Web", also released by Amnesty, explains why this is happening. Like other social media platforms, TikTok's business model and algorithms are designed to harvest personal data for the purpose of bombarding users with personalised advertisements. Furthermore, the study argues, the company's protective policies and safeguards are relaxed in countries with weak legal regulation.
With TikTok ranked the sixth most popular platform, generating US$3 million in Kenya-based advertising, and the subject of several published Council for Responsible Social Media and Mozilla Foundation studies, it was not long before the Chinese-owned platform caught the attention of the National Assembly and the Presidency. The two new studies provide further helpful guidance to both offices.
Legislators must go beyond the simplistic knee-jerk "to ban or not to ban" policy options and require TikTok to moderate harmful violent, discriminatory or sexual content away from minors. TikTok must also stop all data harvesting for advertising targeted at minors.
TikTok can also deactivate the "For You" feed by default and let users actively choose what content they wish to see. It is also time the company introduced and published its own periodic human rights impact assessments.
In line with the CBC vision, mass digital rights literacy programmes are now needed. Given the huge understanding gap between most parents and children, digital safety programmes must be designed with children and be based on a localised understanding of young people's interests, interactions, and levels of risk. Adults must go beyond their worst nightmares and avoid narrowing programmes into mere restrictions on young people's use. Restrictive strategies, as adults know from their own teenage years, will only drive online activity underground. To succeed, we must balance digital safety messages with the power of learning, research, business, and active citizenship.
All primary, secondary, and higher learning institutions can follow the example of Moi University, which held a privacy-first debate presided over by Chief Justice Emeritus David Maraga on Thursday.
These discussions can be repeated in our homes and in all the offline and online public spaces young people interact with. Adapting algorithms for the collective good is important but, alone, insufficient. Nurturing 24 million young digital citizens requires so much more from all of us.