Why boards and CEOs must take data privacy, security and AI seriously

Joel Basoga

What you need to know:

AI raises several legal and ethical issues. First, what data is being used to train or program the AI?

In this information age, the analysis of data and the use of artificial intelligence have enabled organizations to improve customer experience and service delivery, and ultimately to increase revenue and operational efficiency across processes. Therefore, beyond reacting to data incidents, boards must understand the opportunities that data and artificial intelligence present.

The use of data and artificial intelligence has risen to prominence due to advances in technology that allow data to be processed in ways that were not possible 30 years ago. Data is information that is collected, processed or controlled, and includes text, images, sounds, software and databases. Artificial intelligence (AI) is a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments (15 USC 9401).

The most common application of AI is the chatbot, which interacts with customers in place of human customer service agents. ChatGPT is one of the most popular interactive AI systems. AI has also been used to screen job applications, perform routine tasks and provide insights into consumer patterns and habits.

There are various risks associated with data and AI. First, data security incidents in which personal and financial information is accessed have caused immense financial losses to individuals and companies. Second, the use of AI in different settings raises questions of fairness, discrimination and proprietary rights, especially in relation to the data on which AI is trained. How should boards think about these challenges and opportunities?

Despite these opportunities and risks, most boards do not fully appreciate or understand data management issues. Too often, responsibility for data is compartmentalized to the technology officer. While large organizations must divide responsibility for different risk components, the unique opportunities and risks associated with technological issues like data and AI must take a front seat in the boardroom.

Data privacy is now one of the areas of legal practice that has seen the most significant legislative movement worldwide over the last decade.

Boards must be equipped to understand regulatory issues on data. How should personal data be secured? In what circumstances can it be shared with third parties? Can companies store the personal data of Ugandans outside Uganda, for instance on servers in Europe or the United States? What approvals are required for such transfers?

Uganda has a dedicated data privacy regulator: the Personal Data Protection Office.

A corporate entity can face fines of up to 2 percent of its gross turnover for failure to comply with the data privacy laws. Organizations that fail to keep data reasonably secure may be subject to legal proceedings for damages.

Similarly, AI raises several legal and ethical issues. First, what data is being used to train or program the AI? Does the organization have proprietary rights to that training data? Have customers consented to the use of their data in machine-learning algorithms? Is the data used likely to perpetuate discrimination? It is incumbent upon boards to ensure that any use of AI is transparent, fair and in accordance with intellectual property and data privacy laws.

Data protection officers and technology officers should interact more closely with senior executives. Boards should designate a seat for technology experts on their executive committees. Board members should receive regular training to keep abreast of developments in technology and privacy.

Boards must ensure that their organizations have an artificial intelligence and privacy policy in conformity with local and international regulations.

Finally, organizations must act ethically as they adopt or incorporate more technology solutions in their businesses.

Mr Joel Basoga is the head of the Technology practice at H&G Advocates. [email protected]