How is your private information protected from AI?


By Koketso Mamabolo

"Data! Data! Data! I can't make bricks without clay." Written over a hundred years ago, these words have taken on a different meaning. When The Copper Beeches was published in June 1892, sharing and receiving information could only happen as fast as horses and steam-powered trains and ships could travel.

It has become cliché, but it is difficult to overstate how much information we have access to at the click of a button – something the protagonist of the story, Sherlock Holmes, might have appreciated. All of Sir Arthur Conan Doyle's stories about the master detective can be searched, downloaded, and printed in less time than it takes to boil a kettle.

If it's not enough that a simple search can reveal information spanning the length and breadth of human knowledge, from the beginning of time to real time, artificial intelligence has now given us a tool that could produce a Sherlock Holmes story for you in minutes. Do you want it as a sitcom? A gothic horror? Disney-style? Maybe The Great Gatsby set in 1980s America? All of that can be done in less time than it would take you to read a page or two of The Copper Beeches.

Whatever shape or form you want, generative AI can do it – if you're willing to sacrifice, unsurprisingly, quality and originality. Since this is a machine and not the great short story writer himself, you would expect a poor imitation of the original. But therein lies the rub. Even to create a "new" Sherlock Holmes story, it has to pull bits and pieces from the original stories, and from the narratives, styles, and details of works beyond Sherlock Holmes. It has to be able to produce something familiar. It needs material to work from. It needs data. To make bricks, it needs clay. Without data, there is no artificial intelligence.
But what happens when the data being fed into these AI models is personal information?

In The Copper Beeches, the protagonist is approached by Violet Hunter, a woman who needs Holmes' help in a peculiar case in which she is being asked to forfeit her personal privacy as part of a new job. While the story covers a few themes, one that sticks out is the tension between what people are asked to sacrifice and what those sacrifices are used for.

What does this have to do with AI? For AI models to do the kinds of things they are doing, and will be able to do, we need to feed them a lot of information. While literature like Sir Arthur Conan Doyle's stories is covered by copyright law, what about personal information? What are the limits, if any, on how personal information can be used by AI?

Enter POPIA

With generative AI seemingly always reaching new milestones, the question of whether policymakers and regulators are prepared and informed enough to adapt is one all countries have to reckon with. South Africa is still catching up in terms of policies and legislative frameworks. As one of the countries with the highest adoption rates in the world, particularly among developing nations, South Africa could become a case study in the impacts of AI.

While a national policy on AI is still being developed, there are already legal frameworks in place which cover some of the concerns. Along with the country's lodestar, the Constitution, the laws that apply to the AI realm include the Consumer Protection Act, the Cybercrimes Act, and the Promotion of Access to Information Act (PAIA). But the key piece of legislation protecting the public is the Protection of Personal Information Act (POPIA).

In order for the public and private sectors to function, they need us – like the woman in The Copper Beeches, who has to cut her hair to get the job – to give up something personal.
In this case it is our information, which we give out all the time, sometimes without thinking. This is especially relevant now that more and more businesses are experimenting with AI agents, at times having to share data which may contain the personal information of a customer or a client. In a survey, McKinsey found that over 60% of businesses are experimenting with AI agents, while 88% are using AI regularly in at least one business function. The healthcare sector is one alarming case, having reported one of the highest AI agent adoption rates.

POPIA compels organisations and businesses to avoid risks and protect the rights of the public. The act sets out data protection principles which apply when using AI to process information. A valid legal basis for processing the information is required – this could be a contractual obligation, consent, or a legitimate interest. For historical data, the original collection must have been in line with the act, and the data can only be used for the original purpose for which it was collected.

Read the full story in the March edition of Public Sector Leaders.

Sources: Government Gazette | Microsoft | Michalsons | Labournet | McKinsey
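The two rules described above – a valid legal basis, and purpose limitation for previously collected data – can be sketched as a simple compliance check. This is a hypothetical illustration only: the class, function, and basis names below are invented for the example and are not an official POPIA tool or API.

```python
from dataclasses import dataclass

# Illustrative legal bases for processing, loosely following the article's
# examples (contractual obligation, consent, legitimate interest).
LEGAL_BASES = {"consent", "contractual_obligation", "legitimate_interest"}

@dataclass
class ProcessingRequest:
    legal_basis: str        # basis claimed for this processing activity
    original_purpose: str   # purpose the data was originally collected for
    requested_purpose: str  # purpose of the new processing (e.g. AI training)

def is_processing_permitted(req: ProcessingRequest) -> bool:
    """Permit processing only if a recognised legal basis exists AND the
    new use matches the purpose the data was originally collected for."""
    has_valid_basis = req.legal_basis in LEGAL_BASES
    purpose_limited = req.requested_purpose == req.original_purpose
    return has_valid_basis and purpose_limited
```

For example, historical customer-support records collected with consent could be processed again for customer support, but re-using the same records to train an AI model would fail the purpose-limitation check in this sketch.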

What’s in store for AI in 2026?


The next year promises to be a pivotal one, where AI evolves from reactive tools into proactive partners that help people, organisations, and governments work more efficiently and creatively.

Why every board needs an AI expert


Boards need someone who can translate between engineers, risk teams, and directors – someone able to challenge optimistic assumptions and set measurable guardrails.

How tech is rewriting Africa’s GDP


From mobile money platforms that have transformed everyday commerce to billion-rand data centre investments, Africa’s digital economy is rewriting the continent’s growth story.