Guidance and support for the use of AI in science and applied science
11 March 2024
Amy Brewer, Science Subject Advisor
The rapid and ongoing advances in generative artificial intelligence (AI) tools bring both benefits and challenges to education and assessment. In this blog, I highlight the guidance available for managing AI use in Cambridge Technicals (2012) Level 2 Science and Cambridge Technicals (2016) Level 3 Applied Science. I’ll also explain how to deal with misuse in assessments.
What is AI?
AI tools generate text or images in response to user prompts and questions. Their responses are based on the data sets they have been trained on. ChatGPT is the best-known example of an AI chatbot, but many other chatbots and tools are available.
Artificial intelligence (AI) is playing an increasingly important role in various scientific fields, transforming how researchers conduct experiments, analyse data, and make discoveries. Examples include:
- unveiling hidden patterns in massive datasets such as those used in genomics or climate science
- automating tasks such as image recognition and protein folding
- simulating complex systems like weather patterns and drug-target interactions.
Appropriate use of AI
Whether the use of AI by students is appropriate for a given task will depend on the marking criteria and nature of the task.
The biggest value will be for students who want more help understanding a topic or command words. A simple Google search can throw up lots of sponsored links, and students tend to use only the first hit. A generative AI chatbot like Gemini might give them a broader response that is easy to understand, or they can ask the bot to explain it in a different way.
AI can also generate content. Students could use it to help find images for their presentations, for example. In Level 2 Science unit 2, LO2 states: 'discuss why different types of graphs are used in different contexts'. Students could use generative AI to find example graphs so this discussion can take place. Or in unit 4, students could use AI to identify different types of health and safety regulations and different types of risk avoidance techniques, because the marks are achieved by applying that knowledge to the context or scenario provided.
In Level 3 Applied Science, there are many examples where students could be supported by AI. In unit 5, students need to evaluate the significance and limitations of genetic profiling techniques. They could use AI to discover information about these different techniques and their uses, because the marks come from the evaluation. Alternatively, in unit 7 students must explain how legislation affects food labelling in the UK. Again, simply knowing the legislation is not credit-worthy, so AI can be used to discover it; the marks come from explaining how that legislation affects food labelling.
Inappropriate use of AI
As with plagiarism, students can use AI to create work which they then try to pass off as their own. Where a student has used AI to complete all or some of their work, they are not demonstrating their own knowledge, understanding and application of skills. This may prevent the candidate from presenting their own authentic evidence.
Examples of AI misuse include using or modifying AI responses without acknowledgement, disguising the use of AI, or using it for substantial sections of work. You can support your students by teaching them about appropriate use of AI in science and applied science, demonstrating how to reference AI correctly where its use is appropriate, and having clear policies for AI use within your department.
There are a number of ways AI can be exploited that teachers need to be aware of. Some generative AI tools are very good at preparing graphs from submitted data, writing risk assessments for a practical method, creating comparisons where the criteria say 'describe the similarities and differences', and supporting students with writing conclusions, discussions and justifications for ideas. Generally, AI has struggled to carry out calculations, but this is an area where developments will come swiftly.
It is very easy to ask generative AI to repeat a response in a particular tone or voice, transform information into a table or list, and modify a response so that it avoids detection. Tasks where 'describe' or 'explain' are the command words are the most easily exploited via generative AI. Traditionally, you could spot a student's attempt at plagiarism by searching for key phrases from their work, but this is significantly harder with AI responses.
What to do when candidates misuse AI in assessments
Teachers must not accept work which is not the student’s own. Ultimately the Head of Centre has the responsibility for ensuring that students do not submit inauthentic work.
If you suspect AI misuse and the student has not signed the declaration of authentication, your centre doesn’t need to report the malpractice to OCR. You can resolve the matter prior to the signing of the declarations.
If AI misuse is suspected after formal submission and the signing of the authentication sheet, it should be reported with a JCQ M1 form, as outlined in the JCQ AI guidance available in the Malpractice section of the JCQ website. Please email your completed forms to OCR at malpractice@ocr.org.uk.
Frequently asked questions
For more information about these two qualifications, explore our FAQs.
Further support
Please refer to the JCQ AI use in assessments: Protecting the integrity of assessment document for further information on managing the use of AI within your assessments.
We are also producing a range of support resources, including recorded webinars, on our AI support page.
Stay connected
Share your thoughts in the comments below. If you have any questions, you can email us at science@ocr.org.uk, call us on 01223 553998 or message us on X (formerly Twitter) @OCR_Science. You can also sign up to subject updates to keep up to date with the latest news, updates and resources.
About the author
Amy joined OCR in May 2022 and is a subject advisor for GCSE Sciences and Applied Science. Before joining OCR, Amy taught Chemistry to 11-18 year olds for 16 years and was responsible for planning her school’s science schemes of learning. In addition to her teaching responsibilities, Amy mentored PGCE students, was responsible for the progress of KS5 science students, and is committed to improving diversity and inclusion in the sciences.