By Alex Ababio
Twelve-year-old Kofi Mensah stands barefoot in a muddy pit near Kenyasi, in the Asutifi South District of Ghana. In one hand he holds a dented metal pan; with the other he swirls murky, mercury-laced water, trying to catch tiny flecks of gold. Just hours earlier, Kofi was working on a cocoa farm under the hot sun, cutting ripe cocoa pods.
He does both jobs almost every day.
“The cocoa money buys schoolbooks,” he says quietly, rubbing his sore fingers. “The gold pays for our exam fees. But the machines watching the farms… they never see me here.”
Kofi is just one of many children in Ghana who work long hours—both on cocoa farms and in dangerous illegal gold mines known as galamsey. But the world only sees half of their story. Big chocolate companies like Mondelēz and Cadbury use artificial intelligence (AI) to find and report child labor on cocoa farms. These AI systems use satellite images, GPS maps, and special computer programs to monitor farms.
But there’s a big problem.
These machines don’t track what happens just a few steps outside the cocoa farms—especially in nearby gold mining pits. So children like Kofi, who work in both places, are invisible to the very systems meant to protect them.
The Machine Sees What It’s Told to See
Cadbury’s Cocoa Life program says its AI can spot child labor with 85% accuracy, but only within the GPS boundaries supplied by the cocoa cooperatives. That means if a child like Kofi walks out of the cocoa field and into a nearby mine, the system simply stops tracking him.
“Our satellites can identify children carrying machetes or cocoa sacks,” said a Nestlé sustainability officer, who asked not to be named. “But if that same child enters a mine pit after school? Our system marks that as ‘out of scope.’”
There are three big reasons why this happens:
1. Tree Cover Hides the Mines – The satellites can’t make out mining pits hidden beneath the cocoa canopy.
2. Biased Training Data – The computer programs are trained on foreign ideas of what “child labor” looks like, ideas that ignore Ghana’s reality.
3. Narrow Map Boundaries – The GPS maps cover only the cocoa farms; anything outside them is never checked, as the sketch below illustrates.
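None of the companies publish their monitoring code, but the boundary logic is simple enough to illustrate. The short Python sketch below, using an invented farm polygon and made-up coordinates rather than any real cooperative data, shows how a system that only checks sightings against registered farm boundaries ends up marking a child in a nearby mine pit as “out of scope.”

```python
# A minimal sketch, not any company's actual system: a point-in-polygon
# ("geofence") check against a registered farm boundary. The coordinates
# and names below are invented for illustration.

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside a polygon of (lat, lon) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Count how many polygon edges a ray from the point crosses.
        if (lon1 > lon) != (lon2 > lon):
            crossing_lat = lat1 + (lon - lon1) / (lon2 - lon1) * (lat2 - lat1)
            if lat < crossing_lat:
                inside = not inside
    return inside

# Hypothetical cooperative-registered cocoa farm boundary near Kenyasi.
farm_polygon = [(6.995, -2.410), (6.998, -2.405), (6.993, -2.402), (6.990, -2.408)]

# Two sightings of the same child: one on the farm, one in a pit a few
# hundred meters outside the registered boundary.
sightings = {
    "cocoa plot": (6.994, -2.406),
    "mine pit outside the boundary": (6.991, -2.400),
}

for place, (lat, lon) in sightings.items():
    status = "flagged for review" if point_in_polygon(lat, lon, farm_polygon) else "out of scope"
    print(f"{place}: {status}")
```

In this sketch the second sighting is only about 300 meters from the registered boundary, yet it produces no alert, which is exactly the blind spot Kofi describes.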
Children Hurt and Forgotten
In a small village called Nkatieso, 14-year-old Ama Nyarko walks with a limp. Her leg was crushed when the mine pit she was digging in collapsed. “I was working after harvesting cocoa,” she said. “The machine that watches farms? It didn’t see me carrying rocks.”
Sadly, her case is not unusual.
A local report shows that 37% of injuries to children in Ghana’s cocoa areas actually happen in mining, not farming. But chocolate company reports show 0% mining violations—because their AI never looked there.
Dr. Kwame Acheampong, a children’s doctor in Bibiani, says he often sees kids with mercury poisoning.
“They shake like old people with Parkinson’s,” he said. “Their memory is poor. But since these cases happen in mining zones and not on cocoa farms, they are left out of company reports completely.”
A System Designed to Miss the Truth
Mondelēz’s 2023 sustainability report proudly states that its AI detected 94.7% of children using machetes on cocoa farms. But while that report was being written, 12-year-old Kwasi Oteng died when a mining pit in Beposo flooded.
He had been labeled “low risk” by the AI.
“They track cocoa beans better than they track our children,” said his father Yaw Oteng, his voice shaking with grief.
Experts say the failure is built into the system. Here’s how:
1. The satellite images come from the World Cocoa Foundation, and they do not include mining zones.
2. Definitions of “acceptable labor” ignore that 68% of cocoa families need extra income to survive, often from mining.
3. Auditors hired by the chocolate companies visit only the cocoa farms and never check nearby mining areas.
When Phones Become a Shield
But some Ghanaians are fighting back.
Green Advocacy Ghana (GreenAd), a local non-profit, has launched a project called CitizenWatch. It gives smartphones to farmers and young people like Kofi. Using a secure app, they can take videos and photos of child labor, especially in mining, and send them along with GPS locations.
“When Kofi films his friends in mines,” explained Nana Yaw Osei-Darkwa, the group’s leader, “we timestamp and map it. This forces companies to update their systems with real data—not just what makes them look good.”
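GreenAd has not published the app’s internals, so the snippet below is only a sketch of the idea: a timestamped, GPS-tagged report whose media file is hashed so it cannot be quietly swapped out later. The field names and structure are assumptions for illustration, not CitizenWatch’s actual schema.

```python
# Hypothetical sketch only: what a timestamped, GPS-tagged citizen report
# might look like. Field names are invented; the app's real schema is not public.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import hashlib
import json

@dataclass
class ChildLabourReport:
    reporter_id: str   # pseudonymous ID, to protect the reporter
    latitude: float    # GPS fix from the phone
    longitude: float
    recorded_at: str   # UTC timestamp, set when the video is captured
    media_sha256: str  # hash of the video/photo, so it cannot be swapped later
    category: str      # e.g. "mining" or "farming"

def make_report(reporter_id, lat, lon, media_bytes, category):
    return ChildLabourReport(
        reporter_id=reporter_id,
        latitude=lat,
        longitude=lon,
        recorded_at=datetime.now(timezone.utc).isoformat(),
        media_sha256=hashlib.sha256(media_bytes).hexdigest(),
        category=category,
    )

# Example: a report filed from a mine pit near Kenyasi (coordinates invented).
report = make_report("kofi-anon-017", 6.991, -2.400, b"<video bytes>", "mining")
print(json.dumps(asdict(report), indent=2))
```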
In just six months, 217 cases of child miners were reported by the app—cases that chocolate companies’ AI had completely missed. One video showed a 9-year-old carrying rocks in Asutifi South.
Because of this proof, Cadbury has now agreed to check areas 500 meters outside cocoa farms, including mining perimeters.
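That 500-meter commitment amounts to a wider geofence. Continuing the earlier sketch, again with invented coordinates rather than Cadbury’s actual parameters, the snippet below treats a sighting as “in scope” when it falls within 500 meters of the registered farm boundary, so the mine-pit sighting that the narrow check ignored would now be reviewed.

```python
# Sketch of the widened monitoring zone: a sighting counts as "in scope" if it
# lies within 500 m of the farm boundary. Coordinates and threshold are
# illustrative. A complete check would also treat points inside the farm
# polygon itself as in scope (see the earlier point_in_polygon sketch).
import math

M_PER_DEG_LAT = 111_320  # approximate meters per degree of latitude

def to_local_meters(point, origin):
    """Project a (lat, lon) point into a flat frame, in meters, centered on origin."""
    dy = (point[0] - origin[0]) * M_PER_DEG_LAT
    dx = (point[1] - origin[1]) * M_PER_DEG_LAT * math.cos(math.radians(origin[0]))
    return dx, dy

def distance_to_segment(p, a, b):
    """Distance in meters from point p to segment a-b (all (lat, lon) pairs)."""
    px, py = to_local_meters(p, a)
    bx, by = to_local_meters(b, a)
    seg_len2 = bx * bx + by * by
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, (px * bx + py * by) / seg_len2))
    return math.hypot(px - t * bx, py - t * by)

def distance_to_boundary(point, polygon):
    """Minimum distance in meters from a point to any edge of the farm polygon."""
    n = len(polygon)
    return min(distance_to_segment(point, polygon[i], polygon[(i + 1) % n]) for i in range(n))

farm_polygon = [(6.995, -2.410), (6.998, -2.405), (6.993, -2.402), (6.990, -2.408)]
mine_pit = (6.991, -2.400)  # the sighting the narrow geofence marked "out of scope"
BUFFER_M = 500              # the expanded monitoring radius around the farm boundary

d = distance_to_boundary(mine_pit, farm_polygon)
print(f"{d:.0f} m from the boundary ->", "in scope" if d <= BUFFER_M else "still out of scope")
```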
Big Promises, Weak Laws
In Europe, the new EU AI Act demands openness and fairness from AI systems—especially those that affect children. But in Ghana, the 2024 Child Labor Deepfake Accountability Act is still sitting on a shelf, not being enforced.
Roselyn Fosuah Adjei, Director of Climate Change at Ghana’s Forestry Commission, is frustrated.
“We have laws,” she said. “What we don’t have is the political will to make these big companies follow them.”
In Bibiani, officers from the International Labour Organization (ILO) found kids mixing gold with cyanide. “Their hands were peeling and rotten,” said Joshua Addae, an ILO official. “But it was 200 meters outside a cocoa farm, so no AI system raised an alarm.”
A Little Light in the Darkness
Back in Kenyasi, Kofi wipes the mercury off his hands. He’s now part of GreenAd’s youth network and uses his phone to report unsafe work. “Last month,” he says with a small smile, “we made Cadbury’s machine see the pit.”
His video of a 10-year-old boy with a crushed leg forced the company to expand its AI monitoring zone.
It’s a small win in a big fight against powerful technology that too often forgets poor children.
Ezekiel Chibeze, from Strategic Network for Youth Development, sums it up perfectly: “These AI systems don’t need better cameras. They need a conscience.”
For kids like Kofi, that conscience isn’t coming from Silicon Valley. It’s coming from the cracked screens of cheap smartphones held by brave children and farmers across Ghana’s cocoa heartland.
“When the map refuses to see us,” says Kofi softly, “we become the map.”
His words float in the dusty air, not written in code, but in real pain—and a quiet, stubborn hope that refuses to be erased.