By Andrew Cox
What are the essential skills we need to have to use generative AI effectively and ethically – what you might want to call generative AI literacy?
One commonly mentioned dimension of AI skills is knowing how to use prompts effectively to get the kind of answer one wants. Formulating well-defined prompts can ensure you get a more useful answer, such as a bullet-point summary of an article's findings rather than a 500-word summary of the whole text.
Good prompting can really matter. For example, generative AI has been trained to be very polite. If you show it a picture of a dog and assert that it is a cat, there is a fair chance that it will confirm that it is a cat. So how we prompt AI to be direct with us is important.
So a lot of AI literacy talk is about prompting or "prompt engineering".
Prompting is important, but I want to focus in this blog on three other dimensions that are less discussed.
Being reflective
An important dimension of generative AI skills is the need to be reflective about its impact on ourselves as learners.
There have been a number of studies investigating the impact of using generative AI on the effectiveness of learning, as well as its indirect effects, such as on our social engagement in learning.
I have heard a lot of people say that AI just makes them more efficient. But there may be some hidden costs to that efficiency.
It may be more "efficient" to get Gemini to summarise a paper or explain a concept for us. But what depth of learning are we losing when we don't struggle through the hard task of reading the paper for ourselves, and through that process learn how to be better at reading?
It's great that AI can redraft our sentences in a more academic style. But are we losing the skill to write better ourselves and coming to rely too much on the technology?
It may be "efficient" to ask Gemini rather than a teacher or peers to explain a concept that is difficult to understand. But what are we losing in terms of building up the community of learning around us? Yes, Gemini is always on and answers very rapidly, but discussing ideas with peers is one of the most productive parts of learning at university.
I think we need to be reflective about our use of new technologies like generative AI and think honestly about how our uses impact our learning.
Generative AI is a great support to learning. We should be taking time to discover how to use it well. But we need to be alert to some of the less obvious impacts on our learning experience.
- The way our personal data is extracted for profit, as with social media.
- The way BigTech has also appropriated content on the internet to train its models.
- The lack of diversity in the BigTech workforce which leads to it creating services riddled with bias.
- The exploitative way these companies have used cheap and precarious labour in countries like Kenya to do the unpleasant work of filtering out toxic content when training their Large Language Models.
- The way AI has been developed with little regard for the impact on human work, eg in the creative industries.
- The environmental impacts of AI. Google and Microsoft have recently acknowledged falling short of their carbon emission targets precisely because of AI. Image generation is a particularly resource-intensive use.