Wednesday, 3 December, 2025

HPCSA updates rules for SA doctors, medical staff

As AI is increasingly used in healthcare, the Health Professions Council of South Africa (HPCSA) has introduced new regulations to which all doctors and other medical professionals must now adhere.

The new AI regulations have been published together with new regulations on end-of-life procedures, reports BusinessTech.

According to Jay Page, a senior associate at law firm Bowmans, the changes are meant to protect patients while giving practitioners clearer guidance. The updates “show a clear intention to reinforce patient dignity and ensure medical professionals keep up with ethical and technological developments”, Page said.

The first set of updates appears in Booklet 7, linked to withholding or withdrawing life-prolonging treatment.

While the main principle remains unchanged – decisions must always be in the patient’s best interests – the HPCSA now gives more detailed instructions on how those decisions should be made.

Page said a significant update was the introduction of a “patient representative – someone chosen by the patient to make medical decisions for them if they cannot, or do not want to, do so themselves”.

The guidelines now formally define who this representative can be and what their role entails. The HPCSA also articulates that patients must be consulted before any decision to stop or withhold treatment, as long as they are able to take part.

If patients do not want to know the details or be involved in the discussion, the new rules now allow the practitioner to speak directly to the nominated representative instead, said Page.

If a patient cannot make decisions at all, the HPCSA has tightened the rules on who may give consent.

The patient can grant someone written authority to decide for them; otherwise, a court order or other applicable law may give someone that authority.

If neither applies, consent must be obtained in a specific order: first from a spouse or partner, then from a parent or grandparent, then from an adult child, and finally from a sibling.

Doctors may still decide to withhold or withdraw treatment they believe to be futile, even if the family wants it to continue.

However, the HPCSA now states clearly that this can only happen if the decision is in the patient’s best interests.

Page said this is an important addition because it strengthens the protection of vulnerable patients.

The second set of changes appears in Booklet 20, regarding ethical use of AI in healthcare.

Page added that the HPCSA takes a balanced approach, supporting innovation but warning that new technologies carry risks.

These include concerns about data privacy, the potential for discrimination against under-represented groups in datasets, and the lack of proper regulation or standards for many AI tools.

To help manage these risks, the HPCSA has introduced three “pillars of AI”: ethical, legal and technical.

AI tools must respect patient autonomy and confidentiality, comply with South African laws such as the Protection of Personal Information Act, and meet standards for safety, reliability and security.

Most importantly, “AI must always be used to benefit the patient, not to make the practitioner’s job easier”, Page said.

The guidelines also noted that AI may never replace a doctor’s own judgment: a practitioner must always make the final call and cannot pass responsibility to an algorithm.

Another key change is that only validated and culturally appropriate AI tools may be used.

This ties in with recent rules from the South African Health Products Regulatory Authority, which listed the types of AI tools used in healthcare – from imaging software to predictive algorithms and wearable monitoring devices – and outlined how they should be regulated.

The future is AI

Meanwhile, at the recent Wits FHS Prestigious Lecture 2025, advances in AI technology were predicted to ease the difficulties patients, pharmacists and employers face when trying to decipher hard-to-read doctors’ scripts and sick notes.

An ambient scribe, an AI tool that listens in on a medical consultation and automatically generates a written document, could also cut hours of administrative work, ease doctors’ burnout, and redirect more time to actual patient care, said Professor Bruce Bassett, AI Chair at the Wits Machine Intelligence and Neural Discovery (MIND) Institute, who presented at the Wits Faculty of Health Sciences’ 2025 Prestigious Research Lecture, “From Data to Diagnosis: Rethinking Medicine in the Age of AI.”

“The adage ‘garbage in, garbage out’ will continue to be true,” said Bassett, adding that flawed or incomplete data can lead to misdiagnosis and biased or inappropriate treatment. “AI can’t compensate for the absence of African genomic information or inconsistent health records, for example. Humans will always provide the necessary quality.”

Human-led expertise, and particularly AI trained on African-specific data, was central to the lecture’s overall message, with Professor Collen Masimirembwa, Senior Scientist at the Sydney Brenner Institute for Molecular Bioscience at Wits, reminding the audience that if AI is to be meaningfully used in healthcare, data reflecting African biology, epidemiology and treatment responses are critical.

“Africa has the highest genomic diversity in the world, but remains underrepresented in global medical datasets. African populations have about 200 times more genetic variation than Europeans, but diagnostic tools, risk scores and pharmacogenetic studies are based on non-African data,” he said.

Bassett also spoke about multimodal AI systems, using the example of doctors missing pulmonary embolism diagnoses, which AI systems successfully flagged.

“We see a rapid rise of digital biology and the growing ability of AI to synthesise radiology, pathology, genomics and clinical histories to generate preliminary diagnostic hypotheses,” he said.

A panel discussion chaired by Professor Helen Rees, Executive Director of Wits RHI, included Dr Maurice Goodman, Discovery Health’s Chief Medical Officer; Dr Scott Mahoney, Senior Programme Officer for AI and Health at the Gates Foundation; and Dr Aisha Pandor, co-founder and CEO of Pandora Health.

Goodman spoke about the need to equip health professionals with the skills to interpret AI outputs, while Mahoney said AI could only succeed if the continent’s data systems were strengthened.

Pandor stressed the importance of avoiding AI technology that reflects top-down assumptions rather than local realities, while Rees said data quality, regulatory capacity and clarity, and training were as important as the technology itself.

Professor Shabir Madhi, Dean of the Faculty of Health Sciences, said AI presents a unique opportunity to reimagine clinical care, research and education, but emphasised that Africa must become a producer rather than a consumer of AI-driven health solutions.

 

BusinessTech article – New rules for doctors and other medical staff in South Africa (Open access)

 

Wits article – Why AI still needs humans: key lessons from the Wits FHS Prestigious Lecture 2025 (Open access)

 

See more from MedicalBrief archives:

 

Potential of medical liability pitfalls with increasing AI use

 

AI algorithms in diagnosis could harm patients – Dutch study

 

Doctors prone to moral distress when caring for cognitively-impaired elderly

 

Staying out of trouble with Notification of Death certificates

 

What the law requires after procedure-related deaths
