ChatGPT & Generative AI at UNSW

Catch up on our live Q&A session on the uses and implications of generative AI tools, such as ChatGPT, at UNSW. This event was held online on Wednesday 7 June 2023.

Event highlights & panel members

Highlights

University's stance on AI: Get insights into the University's perspective on generative AI tools such as ChatGPT.

Future of AI in education: Hear how AI could shape the future of education and what that means for you as a student at UNSW.

Ethical implications: Delve into the ethical considerations surrounding the use of AI, including ways you can use these tools effectively while maintaining your academic integrity.

Panel members

  • Prof. Alex Steel, Pro Vice-Chancellor Education & Student Experience, and Professor at UNSW Law & Justice
  • Dr Sasha Vassar, Lecturer, School of Computer Science and Engineering, UNSW Engineering
  • Prof. Stephen Doherty, Professor of Linguistics, Interpreting, and Translating, UNSW Arts, Design & Architecture
  • Michele Leeming, Manager, Student Conduct and Integrity, Conduct & Integrity Office
  • Dr James Bedford, Academic Learning Facilitator, Pro Vice-Chancellor Education & Student Experience Portfolio

Facilitator: Katia Fenton, Arc Student Director

Your Questions Answered

As part of our live Q&A session, students were invited to submit questions to the panel ahead of time. Below you can find answers to the most frequently asked questions from the event.

UNSW & AI

What is UNSW's position on AI?

UNSW is fundamentally very excited about AI. AI is the future. As a university we are all about critical thinking and making the world a better place. Appropriate uses of AI have huge potential to make people's lives across the world better and richer. The significant question for us is how to use this new technology in a way that is positive, and we see our ability to contribute to that as really significant.

Are students encouraged to use AI?

AI will become a part of the learning experience. We want students to be informed and critical users of AI technologies, and we want to prepare people for exciting careers in the future. UNSW takes the view that we should make as much technology as possible available so that our students can perform at their best.

At a discipline-by-discipline level, what counts as appropriate usage will vary according to the nature of the task. All students and staff will be using AI going forward: it will be built into Microsoft products by the end of the year and it will be in search engines. This will shape how we structure courses, assessments and classroom experiences. UNSW will ensure that all students have access to required technology.

What are some potential future trends or advancements in AI that could impact academia? Do you think AI will destroy the education system as we know it?

AI will change education, but that is nothing new; education is always changing. AI gives us a lot of opportunities to create new and engaging learning experiences and assessments where students can explore ideas, think critically, and stay relevant to the future of work and life. AI will help staff and students, but it will not replace the main things that you learn at university: critical thinking, comfort with uncertainty, appreciating alternative viewpoints, working constructively with others, and being able to identify when to rely on AI and when not to, including understanding its biases and limitations and how those can negatively affect people. These skills will become even more important in a world saturated with AI. Nor will AI replace the role of the educator, who will continue to use their expertise to create and distill resources, give feedback, shape the curriculum, and create classroom experiences. AI doesn't bring new thoughts or ideas to the table in the way that expert educators naturally do, nor does it offer social connection and engagement.

Would UNSW consider having a framework for AI usage, and what might this look like? Different faculties, schools, programs, courses and even tutors are currently approaching the use of AI differently. I am confused and would like to see more consistency.

UNSW is rolling out a new course outline system which will help standardise course outlines and ensure consistent information about the use of AI in courses. However, there is never going to be a single university rule or answer for how you use AI. There will be different instructions about AI in different courses and situations, depending on the purpose of the assessment and the learning outcomes. Sometimes the task will be asking you to demonstrate what you know without AI assistance - exams are a good example of that. But increasingly you'll be asked to demonstrate that you can complete a task using AI as part of your research or writing - when that is the case the assessment instructions should make clear what is expected.

There are three key principles across the university:

  1. Always do what you are asked to do in the assessment; if you don't follow the instructions, you can't get marks.
  2. If you are asked to do your own work, then that is what you should do as we want to see that you have undertaken that learning rather than someone or something else.
  3. When you incorporate ideas that are not your own, you should always acknowledge it. That applies in the world of AI, just as it did before.

How might UNSW, with all its AI research expertise, implement AI enabled systems to better improve the student experience?

There are many opportunities to use generative AI to further improve the student experience, and we will be working as hard as we can on them, carefully considering and addressing any ethical and privacy issues concerning data. Currently, there is early testing of using machine learning to provide more personalised and individual support, and AI could be used to automate and improve processes and remove inconsistencies. There are also many positives in AI's ability to increase learning accessibility, including speech detection and automated captioning. At this stage, though, we are being very cautious around the use of AI until student and staff privacy rights are clear.

Learning & Assessment

How is ChatGPT any different to any other tool? Why is it not treated as such?

AI tools, such as ChatGPT, are tools that all students have access to. Students can learn how to use these tools and apply them when appropriate. In the same way that students must maintain academic integrity with other learning tools, the same is true when using AI. There may be specific instructions in courses about how to use learning tools, including AI, depending on the learning activities and outcomes in a course.

If use of ChatGPT is optional, does that raise the question of grade equity among students?

UNSW aims to design assessments that are fair and equitable for students. If an assessment instruction permits the use of AI, measures will have been taken to ensure equitable access. Using AI in preparation for an assessment - for example as a research tool, a study scheduler, or for inspiration in how to approach the task - is one of a range of study options and guides that students have. AI may not be helpful in every situation. Our Academic Skills services can provide advice on how best to use these study tools.

How will UNSW bring AI into learning (classes and Moodle)?

We have developed instructions for teachers around the use of generative AI in their assessments, and guidelines around the use of AI within classrooms. Some staff have integrated generative AI into their teaching, for example using it to quickly create case studies for students to analyse, or to build understanding of coding and assess its output. Others are encouraging students to be critical users of this technology by looking for bias and asking why that bias exists.

How will UNSW adapt the format of assessments so they continue to accurately reflect a student's understanding of the subject area?

Assessment is all about trying to make sure you can demonstrate what you have learnt - how you can manipulate knowledge, how you can engage in critical thinking, how you can be at the cutting edge of your discipline. Your learning is what we care about. We don't want to have assessments which are based on old-fashioned ways of learning and old-fashioned ways of assessing. We want to make sure that we adapt our assessment approaches to incorporate all these new ways of learning. At the same time, we need to make sure that assessments are valid. We need to balance innovation with validity. So we will be testing new assessments in a mix with traditional assessments. We see our students as partners in developing new engaging and valid assessments using AI.

How is UNSW supporting staff in AI usage? Some staff know more about AI, or feel more comfortable with it, than others.

UNSW is providing staff with a range of ongoing professional development opportunities to improve AI literacy and to integrate AI into teaching and learning strategies where appropriate. A particular emphasis has been assessment redesign to ensure a focus on student learning, whether or not the assessment allows the use of AI. The impacts of AI differ across disciplines, which may result in different approaches and instructions from staff regarding learning activities and assessment.

Tips for Using AI

Are there any legitimate or ethical uses of AI in study and if so what are they and how?

There are many (perhaps too many to mention) 'legitimate' uses of ChatGPT, but each will depend on your supervisor's or course coordinator's instructions. If it is permitted, always acknowledge that you have used an AI generator. The fundamentals of academic integrity haven't changed. The emphasis is on assessing what you know and create as a student. A good use of AI will enable you to learn more and develop; it will not replace learning.

In terms of productive and legitimate ways to use these tools, ChatGPT can be useful for providing feedback on written work or other work you have drafted (but be careful not to upload personal information, and remember whatever you upload may be reused without your control). It is always important that you understand the instructions for the assignment as this may mean that you cannot use AI.

If you can use AI, some examples of how you could use it include:

  1. Asking it to 'act like an expert' or giving it specific prompts, and then providing it with a sample of your writing so it can offer suggestions for improving your work. These suggestions need to be critiqued and carefully evaluated, but the feedback these tools provide can be helpful during the writing and editing process.
  2. If you are struggling with a concept in a lecture or reading, ChatGPT can explain it in a way that is easier to understand.
  3. Some students who have English as an additional language are using ChatGPT to explain phrases or figures of speech they are unfamiliar with.

How can I learn how to use Generative AI effectively, to develop my learning and help me in my studies?

Check out the recording of the Study Hacks workshop which provides tips about using Generative AI. You can talk to your course convenor or tutor about good ways to use AI in different courses, and resources and tips will continue to be added to the Academic Skills webpage.

What do you think of the quality of ChatGPT responses?

We have seen some interesting developments with the output from ChatGPT. At the beginning, a lot of the linguistic structures were vague or generic; it couldn't really give specific examples from Australia or from outside the dates of its training data. With referencing, you could really see that it made up references and scholars, but these things change over time. It also gives confidently wrong answers in STEM areas, such as coding. It is going to be really important for academics and students not to have a static view of what AI is and what it isn't, because it is always going to be evolving.

How secure is the information and prompts that you enter into ChatGPT? Should students avoid putting unpublished research or assignments into ChatGPT?

At this stage, if you sign up to ChatGPT you are agreeing to help OpenAI with their research. So don't give it information that you don't want used for whatever research purposes it may be put to. We are seeing some significant concerns across the world about privacy, images, and the reuse of personal information and copyrighted materials. It will be important to ensure that we use reputable organisations with strong privacy policies.

Should students be concerned about bias in AI information that is generated?

AI apps like ChatGPT may generate outputs that contain biases. You should think critically about outputs and consider, for example, whether there are stereotypes (on gender, sexuality, race, religion etc.), whether some groups or perspectives are not included, whether there is political bias. There may also be bias in output depending on what the prompt was. Remember that AI is only predicting the most likely outputs based on its database, and that AI is trained on existing data and human inputs. That is likely to replicate existing biases.

How can I learn how to make effective prompts?

Watch out for a session in the Study Hacks workshop series, or make an appointment to see an Academic Learning Facilitator.

When you use prompts with AI, who "owns" the prompt response and how can we minimise the risk of plagiarism or other academic misconduct?

Using or even quoting an AI-generated response to a prompt can be considered problematic, as AI isn't really a source in itself; it is an amalgamation of sources and we don't know where they come from. Check with the course convenor whether this is allowed and, if it is, then reference it.

I have two questions about the use of AI in Computer Science. Firstly, can I use it to proofread work I have done to identify issues that may occur? Secondly, can I use these LLMs to write test cases to verify that my work is working as intended (i.e. write and format a list of 500 numbers to input)?

In terms of assessment, the answer is to read the assessment instructions and ask your teacher if you are unsure. As a learning tool, it can be hugely beneficial - you can use it to help explain errors, or why you are not producing the desired output. In terms of testing and writing test cases, it is important to understand how to test your code so you can get used to writing more robust and flexible code. This means you should generate the parameters of your test cases yourself, as part of your learning, as illustrated in the sketch below.
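To make that concrete, here is a minimal sketch in Python of what writing your own test cases might look like. The function format_number_list and the specific parameters are hypothetical examples chosen only for illustration, not part of any course material; the point is simply that you choose the inputs and the expected behaviour yourself, even if an AI tool later helps you explain an error.

# A minimal, hypothetical sketch: the student chooses the test parameters
# (edge cases, sizes, expected results) rather than asking an LLM to invent them.

def format_number_list(numbers):
    """Format a list of numbers as a comma-separated string (example function)."""
    return ", ".join(str(n) for n in numbers)

def test_format_number_list():
    # Parameters chosen by the student: an empty list, a single value,
    # and a larger list of 500 numbers, matching the question above.
    assert format_number_list([]) == ""
    assert format_number_list([7]) == "7"

    large_input = list(range(500))   # self-generated input, not AI-generated
    result = format_number_list(large_input)
    assert result.startswith("0, 1, 2")
    assert result.count(",") == 499  # 500 numbers -> 499 separators

if __name__ == "__main__":
    test_format_number_list()
    print("All self-written test cases passed.")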

Can you use ChatGPT to rephrase sentences or text you have already written, in order to get well-structured content?

It is important to consult the Course Outline for the specific course as many courses allow you to use these tools. Where permitted in the Course Outline, we encourage students to learn how to use tools to improve their academic writing, but there are a few things to consider with AI.

Firstly, don't use them blindly - make sure you reflect on and critique the changes made to your text, understand all of the content that is generated, and intend what is written. Ask yourself: has it improved your work? Are there errors in what it has produced? Sometimes the amount of editing you need to do afterwards is more time consuming than if you had written it yourself.

Secondly, be aware that anything you submit to an AI app is then in the public domain.

Thirdly, it is important to keep drafts of your work so that you can consult previous versions if you need to go back to your original thinking and phrasing. It is also useful if you need to have a conversation with your course convenor about your submitted work.

Finally, remember that use of AI in this way is likely to be flagged by Turnitin as AI generated - make sure that it is permitted in your assessment. You shouldn't just assume it is.

Can students use ChatGPT to carry out a literature review, as long as you then fact-check the information and the references, and acknowledge the use of ChatGPT? If you check everything and change wording, would you then also need to acknowledge the use of ChatGPT?

At this stage, it is not recommended that students use something like ChatGPT 3.5 or ChatGPT 4 as part of their literature review. Beyond providing fake sources very confidently (something which may change with future updates), there are much better ways of finding academic articles. In particular, tools like Elicit and Consensus use generative AI to help find academic sources and can be more helpful. If you are performing a literature review, you can also use Bing in 'creative mode', which accesses the internet - it is free and does not have a 2021 cut-off date. The standard version of ChatGPT 3.5 has a knowledge cut-off date and won't find up-to-date resources, which are crucial for performing a good literature review. When doing a literature review, you want to make sure that you engage with and understand the academic sources being referenced. You may have a different interpretation to one that is generated by AI. We want to know your thoughts. You should reference when you use ChatGPT or other generative AI, and state how you have used it.

How acceptable is it to use ChatGPT to write the contents of a research paper? Should ChatGPT be mentioned as a co-author?

You should not be using AI-generated content in a way that would make an AI tool a co-author. Written work should be yours. If you use ChatGPT or other AI tools, it should be in a way that helps you develop your argument and writing, not to create the writing and argument itself. However, use of ChatGPT should always be referenced and, if you are using AI, think about where the information is going. Don't submit research or information that contains private or personal details, or where copyright matters are at stake, as it is then in the public domain.

If AI can do Excel and other similar work so easily, what is the future focus for current students who are studying those relatively mechanical data tools?

It can suggest formulas and produce some amazing macros, which definitely saves a lot of time. But you still need to apply critical thinking skills to understand what you are trying to show, what the purpose of a particular formula or tool is, how you can interpret the data well, and what it means.

Translation & Grammar

Can I write in my native language and have ChatGPT or another tool translate it?

UNSW is a university that teaches and assesses in English, except in language courses. Using digital translators is often not recommended except in specific circumstances. One of the reasons is because language is not merely words, but also based in particular contexts. A pure translation of words will not necessarily reflect the context. The best way to produce English language work is to write in English. Use of AI translators is likely to appear in a Turnitin report. If your first draft is not in English you cannot assume your marker can read your work to verify that you understood the issues being assessed.

How soon until there is a commercial AI interpreter?

We already have commercial translation and interpreting services. In fact, these services have existed since the 1990s and are now commonplace in the language services industry. It is only in recent years that these tools have become more easily available on personal laptops and smart phones. However, even the best AI system has not replaced human interpreters due to a range of technical (e.g. quality) and legal requirements. Professional translators and interpreters are still very much in demand and these tools are a standard part of their toolkit.

Can we use AI (e.g. ChatGPT or Grammarly) to check for grammar mistakes or vocabulary misuse? Will this be identified as AI-generated writing?

Language editing tools may be acceptable for improving expression in academic writing, but please follow course instructions and let the convenor know so that they can advise whether that is OK in their course. Yes, this will likely be identified as AI-generated writing. This is why it is important to consult the Course Outline for the specific course to see whether the assessment instructions allow you to use these tools. Also make sure you reference the use of Grammarly or other apps in your submission, and keep drafts of your work so that you can consult previous versions and, if asked, show your convenor how you developed your work.

Referencing

How do I reference ChatGPT/AI?

It depends on the referencing guidelines in the course. MLA and APA differ in how they cite ChatGPT, and Harvard is yet to decide. The best advice is to go back to first principles. Work you submit needs to be your own. If you are unsure, you are recommended to include a citation indicating where you used ChatGPT and for what, such as prompts or editing. Quoting ChatGPT can be considered problematic as ChatGPT isn't really a source in itself; it is an amalgamation of sources and we don't know where they come from.

Learn more about how to reference and acknowledge AI tools

Academic Integrity

How does UNSW plan to adapt its academic integrity guidelines to address the use of AI in coursework and assessments?

This is currently under review.

At UNSW, what is the line between using ChatGPT and what is considered plagiarism?

The line will be determined by the instructions provided. If ChatGPT is permitted, it must be referenced appropriately. It is recommended that students keep drafts of their work, the instructions they provided to ChatGPT, and the subsequent outputs.

How can Turnitin differentiate between ChatGPT text and text that is actually original but 'seems' AI generated?

Turnitin does detect a number of AI language generators, not just ChatGPT and Grammarly, and we have seen this in the first term this year. All Turnitin does is flag an answer as having AI-like responses. At that point the marker examines the answer to determine whether to investigate further. All decisions around academic integrity are made based on UNSW staff analysis, not the Turnitin report. We recognise that most students are doing the right thing, and any further inquiries we make of students are made without any pre-judgment.

What is the current UNSW process when an assignment is flagged for having AI content by Turnitin?

If AI generation is detected, it is the same process that we currently have. It is first handled at the faculty level. It will not be unusual for students to be asked to provide drafts of their work or have a conversation about their work so they can demonstrate their knowledge and understanding of the learning for the course or assessment. If a student is unable to articulate how they completed their assignment and unable to show the required learning outcomes, their submission may be referred to the Conduct and Integrity Office and will follow the Student Misconduct Procedures. We want to be clear that the investigation will not start and end with the Turnitin AI report. We will look at all the facts available, including whether the instructions to students were clear.

More support & info

Our Academic Skills Team wants you to make the most of your university studies. Browse their wide range of self-study resources or book a one-to-one consultation with an Academic Learning Facilitator. You can also get in touch with the team at [email protected].

