SFU to share draft artificial intelligence policy this summer

The AI task force created guidelines related to academic integrity, ethics, and other areas

IMAGE: Cash Macanaya / Unsplash

By: Corbett Gildersleve, News Writer

On January 31, SFU released a statement that its Artificial Intelligence Learning and Teaching Task Force had begun developing policy recommendations on the use of generative artificial intelligence (GenAI), such as ChatGPT, for coursework. SFU currently has no AI policy, but the task force will share its guidelines with the public this summer, and the policy is expected to be implemented in the fall.

These guidelines will consider “academic integrity, pedagogy and teaching innovation, governance and ethics, impact assessment and communication, and graduate studies.” The Peak corresponded with Megan Robertson, co-chair of the task force’s pedagogy and innovation subcommittee, about its progress and challenges. 

Robertson said each subcommittee has now “submitted information to the chair” of the task force, Paul Kingsbury. Kingsbury and special advisor Parsa Rajabi have compiled this information into a draft of recommendations, which is currently under review by the task force. 

Robertson said the subcommittee considered the different approaches professors take with AI, as some are “incorporating AI tools into almost every aspect of teaching and learning,” while others see “that AI tools are not effective for the goals of the course.” She also said that while AI has posed some issues pertaining to academic integrity, “the introduction of AI tools is an opportunity to rethink assignments, assessments, and exams. Advocating for instructors to have the time and space to reimagine their courses is an important part of my work.” Robertson is also an educational developer in the curriculum and instruction division at the SFU Centre for Educational Excellence, facilitating workshops with instructors to “develop and update resources” for teaching.

“One of the key recommendations of the task force relates to increased awareness and literacy about what AI tools are, how they work, and the opportunities and risks involved with using them.” — Megan Robertson, co-chair, pedagogy and innovation subcommittee

When asked about the well-known issue of GenAI “hallucinating” and providing made-up information, Robertson said, “One of the key recommendations of the task force relates to increased awareness and literacy about what AI tools are, how they work, and the opportunities and risks involved with using them.” Specifically, she said it was key to ensure “everyone has access to information about protecting personal information, intellectual property, and copyright” moving forward.

She added, “We know from research and use cases that people are least confident in AI outputs when they have knowledge and expertise in the area that they’re asking the AI to generate content about.” She suggested instructors can appropriately “model how they analyze and interpret ideas” with AI by disclosing why they chose to use it.

GenAI is not the only form of artificial intelligence that has been developed. Some examples include expert systems, machine learning, and neural networks. When asked if the subcommittee looked at other types as part of their work, Robertson said the task force “focused on how to develop guidelines and recommendations that will allow instructors to make informed decisions about their individual teaching contexts.” SFU stated they are “hopeful that the task force will approve the guidelines in the coming weeks” for community feedback, and that “information on the work being undertaken by the task force, as well as resources for students and instructors, are available on the AI strategy website.”

A review of AI policies and guidelines at other Canadian universities shows that many follow a similar format. The University of BC, University of Victoria, University of Alberta, University of Saskatchewan, University of Manitoba, University of Toronto, and McGill University all put the onus on students, instructors, staff, and administration to decide whether or not to use GenAI. The University of Victoria, notably, does not allow the use of tools to detect GenAI in assignments, citing concerns about “how they collect and store student information and intellectual property.”
