GPhC Clarifies Role of AI in Pharmacy Practice

The General Pharmaceutical Council (GPhC) has published a new position statement clarifying how artificial intelligence (AI) should be used within pharmacy practice, emphasising that professional standards remain paramount. While recognising the growing role of AI in healthcare, the regulator makes it clear that pharmacists, pharmacy technicians and pharmacy owners must continue to meet its existing standards when incorporating such technologies into their work.

Central to the GPhC’s position is the principle of personal accountability. Pharmacy professionals remain fully responsible for their decisions and actions, even when these are informed by AI tools. The regulator stresses that AI must not replace clinical decision-making or professional judgement, but should instead be used as a supportive tool. This reinforces the expectation that patient safety and care quality must always take precedence over convenience or efficiency gains offered by technology.

To comply with GPhC standards, pharmacy professionals are expected to have a clear understanding of how AI tools function, including their intended use, limitations and potential biases. Appropriate training is essential before using such tools in practice. Importantly, any outputs generated by AI must be carefully reviewed to ensure they are accurate, free from bias and not misleading. Transparency is also a key requirement: professionals should be open about their use of AI and be prepared to explain to patients how it is being used in their care, including any associated risks or benefits. Where necessary, patient consent must be obtained.

Data protection and confidentiality obligations remain unchanged. The GPhC highlights that professionals must ensure AI use complies with all relevant requirements in these areas. In addition, any concerns about errors, risks or inappropriate use of AI should be raised in line with the duty of candour. Crucially, AI must never be used in a way that could put patients or the public at risk of harm.

The regulator also outlines expectations for pharmacy owners and superintendent pharmacists. They are required to carry out due diligence to ensure that AI tools are safe and used appropriately. This includes establishing robust governance arrangements, such as risk assessments, data security measures and clear information governance processes. Adequate training must be provided for staff using these tools. Furthermore, AI systems should be subject to ongoing monitoring and review as part of routine quality and risk management processes. During inspections, the GPhC will expect to see evidence that these standards are being upheld.

In addition to practice guidance, the GPhC has issued advice on the use of AI in revalidation and in pharmacy education and training. For revalidation, the regulator makes clear that it is not acceptable to use AI to generate complete submissions or to falsify information. Suspected misuse could lead to further action, including requests for new records or potential investigation under fitness to practise procedures.

In education and training, the guidance encourages ethical, transparent and patient-focused use of AI, while maintaining academic integrity. It aligns with broader regulatory principles, supporting education providers and learners in adopting AI responsibly in the development and delivery of pharmacy training programmes.