University of York Library
Library Subject Guides

Digital Creativity: a Practical Guide

AI generation

A practical guide to getting digitally creative and using digital tools and technologies to explore work, ideas, and research.

Using AI generation tools

Artificial intelligence, or AI for short, has brought a range of opportunities (and some issues). We'll explore the world of AI generation tools and some of the considerations you need to bear in mind when using these tools.

What are AI generation tools?

Artificial intelligence, or AI for short, involves using computers to do tasks which mimic human intelligence in some way. There is a vast range of methods for creating artificial intelligence, and an equally wide range of applications for AI.

One area in which AI has been used is for generating content that is similar to other content, sometimes known as generative AI. This is often done using a branch of artificial intelligence known as machine learning, which focuses on getting computers to appear to learn how to do something, like recognising or generating certain kinds of content. Machine learning can be supervised, where the computer is given labelled example data and learns the relationship between inputs and their labels, or unsupervised, where the computer is given unlabelled data and has to find patterns or structure in it for itself.
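The supervised case can be illustrated with a deliberately tiny sketch (all the data and names here are invented for illustration, and real machine learning systems are far more sophisticated): the program is given labelled examples and classifies a new input by comparing it to what it has already seen.

```python
# A toy illustration of supervised machine learning: the program is given
# labelled examples and "learns" to classify new inputs by finding the
# most similar training example. All data here is made up for illustration.

def nearest_neighbour(labelled_examples, new_point):
    """Classify new_point with the label of the closest training example."""
    def distance(a, b):
        # Squared Euclidean distance between two feature tuples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    closest = min(labelled_examples, key=lambda ex: distance(ex[0], new_point))
    return closest[1]

# Training data: (features, label) pairs - imagine two measurements per image.
training = [((1.0, 1.0), "cat"), ((1.2, 0.9), "cat"),
            ((5.0, 5.2), "dog"), ((4.8, 5.5), "dog")]

print(nearest_neighbour(training, (1.1, 1.0)))  # nearest the "cat" examples
print(nearest_neighbour(training, (5.1, 5.0)))  # nearest the "dog" examples
```

The key point is that the program's "knowledge" is nothing more than the labelled examples it was given - which is why the choice of training data matters so much.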

AI generation tools are applications, usually web-based, which allow you to harness the power of this AI generation without needing to know how to create AI or machine learning apps yourself. These tools can do a range of things (you've probably seen some of them in action, especially text generation tools like ChatGPT or Microsoft Copilot), but also come with many caveats and restrictions. Typically, you give these tools some kind of prompt, maybe using text and/or images, and they 'generate' content in return.

Asking Copilot to list 10 examples of ways you could teach generative AI
An example of giving Microsoft Copilot a prompt.

One important thing to be aware of with AI generation tools is that they are all based on datasets. The data that the AI tool has been 'trained' on impacts what results you will get and what it can do. For example, generating images requires a large dataset of existing images, and the AI will create images based on what it learns from these, so certain styles or types of image in the dataset will mean you get particular results. The data that these tools are trained on might be copyrighted or someone's intellectual property (IP), which introduces other issues about what people do with the outputs of generative AI tools.

As we'll explore further on this page, think about what you use generative AI tools for. There is University guidance for students and postgraduate researchers around the use of generative AI in assessment and other work.

As we'll explore further on this page, it is also important to consider what information you are putting into a generative AI tool as part of your prompt or additional files you're providing it to work with. The University's guidance on using Information Classification can help you determine what level your information is, and has some guidance around AI usage with different classifications of data.

What can generative AI tools help you with?

Tools powered by artificial intelligence are everywhere, and they are not just generative AI - for example, everyday applications like Gmail use a range of AI techniques to do things like highlight important emails or suggest how you might want to reply to an email. There's a whole range of tasks that AI has been used to help with, so often you'll have to explore the tools out there to see what might be possible.

Generative AI tools are being used in a huge range of ways. Text generation, for example, is being used to search for information, to create outlines and plans, to proofread text, to explore ideas, and many other things. People are constantly finding new ways to use (and abuse) these tools, so we couldn't ever write an exhaustive list of how you can use any type of generative AI! Always check any University, departmental, funding, and/or publisher/journal guidelines around what uses of generative AI are acceptable for work that you might be submitting.

Tip

Be aware of the limitations of these tools. Always test what you want to do with a tool and the outputs you get before investing too much time in it. Many of these tools are changing rapidly as the technology develops, so keep an eye out for functionality changing. Also be aware of pricing models: many free tools have limitations like watermarks or caps on how much you can create, as they want you to pay for the full version. Sometimes tools change from being free to being paid for as they become popular, so make sure you download or export any creations once you've made them.

Developing ideas

One area that both text and image generation AI tools can be used for is developing ideas. Rather than using the outputs of generative AI as any kind of final work, you treat them as useful prompts to consider as you develop ideas for things like creative projects or standard documents (like asking what people typically include in a cover letter, a lesson plan, etc.). You then take these prompts and write/create your own content, critically evaluating which of the suggestions might be useful and which aren't. Particularly for creative projects, this can help you to go beyond your initial idea. For example, if you were creating a public-facing resource educating someone about generative AI, you could ask a text generation AI tool for a list of ideas of different kinds of resources you might make, or you could use an image generation AI tool to suggest possible design features.

Always bear in mind that this should not replace you critically evaluating what should be included in anything you are creating! Don't just follow the instructions the computer gives you - they often aren't correct, or don't match up with your specific context. If generative AI isn't helping you with ideas, try other methods for exploring ideas instead.

Searching

To explore more about what AI tools can do in terms of finding information and being used as a reference source, see the AI tools page of our Searching for Information guide.

Considering how we view and use AI

How artificial intelligence is developed, used, and viewed in society is a big area of discussion and debate. For example, within XR Stories at York there's work going on around challenging AI stereotypes: how we view AI, how it is represented, and the kinds of images and sounds we use when we think of AI. The world of artificial intelligence has been changing a lot recently, with developments making better generative AI possible, so it can be useful to read around these topics and think more broadly about the societal impacts of AI as well as the technological ones. Lots of research is being done in these areas, including at the University of York.

There are many other ethical considerations in the world of AI. For example, some kinds of AI like machine learning can be based upon existing datasets, with the computer 'taught' from this existing data. What data is chosen as the training data is crucial: using datasets that contain inequality and bias will replicate those inequalities and biases in the AI tool. The data often contains copyrighted material or material that is someone's intellectual property, so they might not want people to be generating new work that is very similar to theirs (though copying work isn't something new to generative AI!).

When you use AI tools, it is good to be critical of artificial intelligence at the same time, and even how we view the tools themselves. Do we see them as 'magic' applications that can create something out of nothing, or complex code that has been designed and written by humans making choices? Does this make a difference to how we use the outputs of these tools? The 'people' side of generative AI is important: not just the people who create the tools, and the people whose work might have been used when the AI tool was trained, but also people whose work is more hidden, like those who label the huge datasets that generative AI is trained on. People run and work at the technology companies who build and sell these AI tools. Basically, there's a lot going on behind the scenes, and responsible use of AI means being aware of this!

Text generation AI tools

One of the most talked about areas of AI generation has been text generation. This involves tools that can take text prompts and generate new text in response, often in the form of a chatbot, a way of appearing to talk to the computer as it gives conversation-style outputs. ChatGPT was an early big name in the area, but there are lots of others now, including Microsoft Copilot and Google Gemini.

As with other kinds of AI generation, there are some important considerations with these tools. They are built on a technology called large language models (LLMs), and are sometimes referred to by that name. LLMs require huge datasets of text, and what that text is and how up to date it is can affect the responses generated. When using text generation tools, it can be worth exploring further what a tool has been trained on and whether it has different versions, to understand what you might be getting out of it.
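To illustrate the basic principle in a deliberately simplified way (real LLMs are vastly more sophisticated, and this tiny sketch is only an analogy), here is a toy text generator that records which word follows which in its training text and then produces new text by sampling those pairs:

```python
import random

# A toy "language model": it records which word follows which in the
# training text, then generates new text by sampling those pairs.
# The core point holds for real LLMs too: the model can only produce
# patterns that are present, in some form, in its training data.

def train_bigrams(text):
    """Build a mapping from each word to the words that follow it."""
    words = text.split()
    model = {}
    for current, following in zip(words, words[1:]):
        model.setdefault(current, []).append(following)
    return model

def generate(model, start, length=8, seed=0):
    """Generate text by repeatedly sampling a likely next word."""
    rng = random.Random(seed)  # fixed seed for a repeatable result
    output = [start]
    for _ in range(length):
        options = model.get(output[-1])
        if not options:
            break
        output.append(rng.choice(options))
    return " ".join(output)

corpus = "the cat sat on the mat and the cat slept on the mat"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Every word the generator produces comes from its training text; train it on different text and you get different output, which is the same reason the training data behind a real LLM shapes what it can and cannot say.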

There are also big questions around ethics and plagiarism, some of which are being explored in current academic research. The use of data both for training these tools and in the prompts that users give them can be a concern, with articles trying to help people know how to remove their data from ChatGPT, for example. Be critical about what information you give to any generative AI tool in the form of prompts - for example, don't feed your research data into them if you don't want that data to be publicly available.

At the University of York, we have access to the free version of Microsoft Copilot, and if you log in with your University username and password, Microsoft will not retain any prompts or responses from Copilot and that data will not be used to train the underlying model. See more on the IT Services guidance on Copilot.

These tools often advertise that they will produce "original content", but verifying that would rely on people checking everything the tool generates before using it. For lots of written content, the person who writes it is credited, and it matters who that is. These kinds of tools should never be used for academic work or anything you want to present as your own work, as it won't be.

Text generation AI tools are also interesting to consider in terms of what they say about work. Is writing copy, captions, video scripts, and more such an unnecessary creative task that AI could do it more efficiently? How might this impact copywriters' jobs? These tools often state that they are making copywriters' jobs easier rather than replacing them, but there's a long history of how technology, work, and human skills interact. The use of generative AI does not happen in a vacuum, and we should consider the broader societal implications of any digital tools that we use.

A blue typewriter
AI text generation is similar to the old idea of monkeys typing Shakespeare - but this time they've been told about the works of Shakespeare and are trying to write a new version. Would that be Shakespeare's work or the monkeys' work?

Another way that these text generation tools are interesting is in how they can assist humans with their writing, rather than do it for them. You can generate content to spark ideas, give you a starting point, or help find words that mean something specific by asking the tool for a word that matches your definition. Again, it is important to consider what you do with the outputs of these tools and whether generative AI is actually the most useful approach for your task (sometimes a simple online search for synonyms of a word, or a template for a certain type of document, would be quicker and easier).

Image generation AI tools

A popular kind of AI generation tool is one that can create images from prompts, often applying particular criteria or using certain models that have particular design styles. These have been used for fun, to make stock images without needing to pay or find copyright-free pictures, and to test the limits of AI. There are a huge number of image generation AI tools out there, and lots of other AI generation tools (and even image and design tools in general) have image generation built in in some way now. Lots of image generation tools cost money, and others give you a limited number of 'credits' to use to make images.

More broadly, image-related AI tools are used for image recognition, which has a varied range of applications in areas like healthcare, environment, farming, and much more. There's a long history of "computer vision"!

A set of images generated by craiyon for the prompt 'AI generation tools', which all feature an abstract brain or face of some kind
An example of AI generated images for the prompt "AI generation tools" created using craiyon (formerly DALL-E mini).

There are many tools out there that allow you to do similar things, from creating images using prompts like the example above to uploading an image and using AI to change that image to be in a different style or more similar to a different image. These can be great for inspiration or fun, as the results are often ridiculous and not what you were actually looking for.

For tools that require prompts, what you get out often depends on using the exact right terminology for that tool. There's a lot of trial and error involved, and be aware of biases in the data underlying these tools, which can impact their results (for example, here are some of the limitations and biases explored for DALL-E mini).

For tools that create images in certain art styles, watch out for where these tools could be imitating the styles of real, living artists in ways those artists might not want. Use these tools critically: they might be good for inspiration, but not something you would use in a final output.

The uncanny valley

If you use AI generators to create images or videos of people, these may suffer from the concept of the uncanny valley: when humanoid objects look just imperfect enough to a human that they provoke uncanny feelings of uneasiness. Have you ever seen a computer-generated person (e.g. in a video game) and they don't quite look right? Or felt unnerved by dolls or humanoid puppets or robots? This can be the uncanny valley effect.

When using AI generation tools, bear this idea in mind if you're creating images of people. You might make your imagery worse if you try to include a human and they don't look real enough. Consider generating more abstract images instead (though AI generators can often make even inanimate objects look a bit "wrong" and unnerving), or find copyright-free stock images from sites like Pixabay and Unsplash.
