Artificial intelligence, or AI for short, has brought a range of opportunities (and some issues). We'll explore the world of AI generation tools and some of the considerations you need to bear in mind when using these tools.
Artificial intelligence, or AI for short, involves using computers to do tasks which mimic human intelligence in some way. There is a vast range of methods for creating artificial intelligence, and of applications for AI.
One area in which AI has been used is for generating content that is similar to other content, sometimes known as generative AI. This is often done using a branch of artificial intelligence known as machine learning, which focuses on getting computers to appear to learn how to do something, like recognising or generating certain kinds of content. Machine learning can be supervised, where the computer is given labelled datasets and told to learn from them, or unsupervised, where the computer is given a goal and parameters and has to try to reach that goal without example data.
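To make the supervised/unsupervised distinction concrete, here is a deliberately tiny sketch in plain Python (not a real machine learning library, and the heights and labels are invented for illustration): the supervised part copies labels from known examples, while the unsupervised part has to find two groups in the data with no labels at all.

```python
# Supervised learning: we have labelled examples (height in cm -> label),
# and predict by copying the label of the closest known example
# (a one-nearest-neighbour classifier).
labelled = [(95, "child"), (110, "child"), (165, "adult"), (180, "adult")]

def predict(height):
    return min(labelled, key=lambda ex: abs(ex[0] - height))[1]

# Unsupervised learning: no labels, just a goal of splitting the values
# into two groups. This repeatedly moves two "centre" values towards the
# average of the points nearest to them (a simple two-means clustering).
def two_means(values, steps=10):
    lo, hi = min(values), max(values)
    for _ in range(steps):
        group_a = [v for v in values if abs(v - lo) <= abs(v - hi)]
        group_b = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo, hi = sum(group_a) / len(group_a), sum(group_b) / len(group_b)
    return sorted([lo, hi])
```

Real generative AI systems are vastly more complex, but the same distinction applies: they are either learning from examples, or searching for structure on their own.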
AI generation tools are applications, usually web-based, which allow you to harness the power of this AI generation without needing to know how to create AI or machine learning apps yourself. These tools can do a range of things (you've probably seen some of them in action, especially text generation tools like ChatGPT), but also come with some caveats and restrictions.
One important thing to be aware of with AI generation tools is that they are all based on datasets. The data that the AI tool has been 'trained' on impacts what results you will get and what it can do. For example, generating images requires a large dataset of existing images, and the AI will create images based on what it learns from these, so the styles or types of image in the dataset will shape the results you get.
To start exploring and investigating tools powered by AI, see the link below, but be critical: these haven't all been tested and you'll want to check what data you're giving them.
As we'll explore further on this page, think about what you use generative AI tools for. There is University guidance for students and postgraduate researchers around the use of generative AI in assessment and other work.
Tools powered by artificial intelligence are everywhere, from Gmail's suggestions to image generators. There's a whole range of tasks that AI has been used to help with, so often you'll have to explore the tools out there to see what might be possible.
Content creation is a rising area for AI involvement, both generating content and assisting with media creation and editing. For example, Runway is a web browser-based video editor that uses AI for a range of features, including removing backgrounds from videos and cleaning up audio, as well as having some standard video editing features.
Another area that AI often helps with is discovery, whether that is finding content you might enjoy because you liked other content or searching for information. This is because AI features can quickly synthesise and search through information, learning from patterns in what people find useful.
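The "you liked X, so you might like Y" idea can be sketched very simply: compare how much overlap there is between what different people liked, and recommend based on the most similar person. The users and genres below are invented for illustration, and real recommendation systems are far more sophisticated, but the underlying pattern-matching idea is the same.

```python
# Invented example data: which genres each (hypothetical) user has liked.
liked = {
    "alice": {"sci-fi", "documentary", "thriller"},
    "bob":   {"sci-fi", "thriller", "horror"},
    "carol": {"romance", "comedy"},
}

def similarity(a, b):
    # Jaccard similarity: shared likes divided by all likes
    # (0 = nothing in common, 1 = identical tastes).
    return len(a & b) / len(a | b)

def most_similar(user):
    # Find the other user whose tastes overlap most with this one;
    # their likes become the basis for recommendations.
    others = [u for u in liked if u != user]
    return max(others, key=lambda u: similarity(liked[user], liked[u]))
```

Here "alice" and "bob" share two genres, so a recommender built this way would suggest to Alice things Bob liked that she hasn't seen yet.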
Be aware of these tools' limitations. Always test what you want to do with the tool and the outputs you get before investing too much time in it. Lots of these tools are changing rapidly as the technology develops, so keep an eye out for functionality changing. Also be aware of pricing models: many free tools have limitations like watermarks or caps on how much you can create, as they want you to pay for the full version. Sometimes tools change from being free to paid-for as they become popular, so make sure you download or export any creations once you've made them.
To explore more about what AI tools can do in terms of finding information and being used as a reference source, see our Searching for Information guide page on AI tools:
How artificial intelligence is developed, used, and viewed in society are big areas of discussion and debate. For example, within XR Stories at York there's work going on around challenging stereotypes of how we view and represent AI: the kinds of images and sounds we use when we think of it.
When you use AI tools, it is good to be critical of artificial intelligence at the same time, including how we view the tools themselves. Do we see them as 'magic' applications that can create something out of nothing, or as complex code designed and written by humans making choices? Does this make a difference to how we use the outputs of these tools?
There are many other ethical considerations in the world of AI. For example, some kinds of AI like machine learning can be based upon existing datasets, with the computer 'taught' from this existing data. What data is chosen as the training data is crucial: using datasets that contain inequality and bias will replicate those inequalities and biases in the AI tool.
A popular kind of AI generation tool is one that can create images from prompts or criteria. These have been used for fun, to make stock images without needing to pay or find copyright-free pictures, and to test the limits of AI.
There are many tools out there that allow you to do similar things, from creating images using prompts, as in the example above, to uploading an image and using AI to change it into a different style or make it more similar to another image. These can be great for inspiration or fun, as the results are often ridiculous and not what you were actually looking for.
For tools that require prompts, what you get out often depends on using the exact right terminology for that tool. There's a lot of trial and error involved, and be aware of biases in the data underlying these tools which can impact their results (for example, here are some of the limitations and biases explored for DALL-E mini).
For tools that create images in certain art styles, watch out for where these tools could be using the art styles of real, living artists. Use these tools critically and think about where they might be imitating living, working artists in ways they might not want. For example, they might be good for inspiration, but not something you would use on a final output.
Similar ideas of turning written text prompts into digital media have been used for items other than images, such as video or 3D models. These have similar considerations to image generation and don't always look particularly "real" but can be interesting to explore and get inspiration from.
If you use AI generators to create images or videos of people, these may suffer from the concept of the uncanny valley: when humanoid objects look just imperfect enough to a human that they provoke uncanny feelings of uneasiness. Have you ever seen a computer-generated person (e.g. in a video game) and they don't quite look right? Or felt unnerved by dolls or humanoid puppets or robots? This can be the uncanny valley effect.
When using AI generation tools, bear this idea in mind if you're creating images of people. You might make your imagery worse if you try to include a human and they don't look real enough. Consider generating more abstract images instead (though AI generators can often make even inanimate objects look a bit "wrong" and unnerving), or finding copyright-free stock images from sites like Pixabay and Unsplash.
There are also a range of AI tools out there that can generate text for you, and this is a rapidly changing and growing area for technology companies. ChatGPT has dominated technology headlines for a while now, and many tech companies are looking to get in on the hype. As a result, lots of text generation AI tools have paid-for versions or elements.
As with other kinds of AI generation, there are some important considerations with these tools. Some text generation tools have a specific focus - e.g. marketing copy - and are unlikely to produce usable text for other contexts, because AI works best when it has a specific focus and can be applied to that one task. These tools will have been trained on marketing copy, so that's the kind of text they will produce. Other tools use much broader datasets, but that may mean they are less good at writing very specific kinds of content.
There are also big questions around ethics and plagiarism, some of which are being explored in current academic research. The use of data, both for training these tools and in the prompts that users give them, can be a concern, with articles trying to help people know how to remove their data from ChatGPT, for example. The tools often advertise that they will produce "original content", but verifying that would rely on people checking everything the tool generates before using it. For lots of written content, the person who writes it is credited, and it matters who writes it. Obviously, these kinds of tools should never be used for academic work or anything you want to present as your own, as it won't be.
However, they are interesting to consider in terms of what they say about work. Is writing copy, captions, video scripts, and more such a routine creative task that AI could do it more efficiently? Does this impact copywriters' jobs? These tools often state that they are making copywriters' jobs easier rather than replacing them, but there's a long history of how technology, work, and human skills interact.
Another way that these text generation tools are interesting is in how they can assist humans with their writing, rather than do it for them. You can generate content to spark ideas, give you a starting point, or help find words that mean something specific by asking the tool for a word that matches your definition. In these cases, it is good to consider what you do with the outputs of these tools, as well as thinking carefully about what information you put into them.
Forthcoming sessions on:
There are more training events at: