Global citizenship, technology and Collaborative Online International Learning

Global learning and citizenship are at the heart of Sussex’s Strategic Framework, and many educators are looking to online collaboration as a cost-efficient and sustainable approach to internationalisation in education. Collaborative Online International Learning (COIL) is one example of how virtual exchange and international partnership can be used to promote cross-cultural understanding and collaboration among students from different countries.  

In this blog post from Academic Developer Laura Clarke and Learning Technologist Tyrone Knight, who work with the Business School, we will briefly explain the concept of COIL as a high-impact practice and discuss how faculty can support students by providing specific guidance on using technology to meet COIL’s learning objectives. 

Introduction to COIL 

COIL partnerships develop collaborative projects for students to work on across time zones and countries using easily accessible online tools. COIL can be incorporated into any discipline and allows students to explore course concepts from different cultural perspectives. The amount of integration between partner modules is decided by participating faculty. COIL can take place throughout a whole module or, more commonly, as a smaller part of the module over two or three weeks.  

Diagram modelling how a module at one institution is partnered with a module at an institution in a different country, which may or may not be in the same discipline. Instructors collaborate on planning and design and students collaborate on a project. Instructors only allocate marks to their own students.

COIL Partnerships 

In an effective COIL partnership, teaching staff develop a collaborative project-based activity that encourages active learning and teamwork and incorporates metacognitive reflections to support the development of intercultural competency. Deardorff & Jones define intercultural competencies as ’a person’s ability to interact effectively and appropriately in cross-cultural situations based on his or her intercultural attitudes, knowledge and comprehension, and skills’, and studies have shown that participating in COIL can enhance students’ global learning and foster cultural understanding. Studies include Enriching students’ engaged learning experiences through the Collaborative Online International Learning Project and The effectiveness of collaborative online international learning (COIL) on intercultural competence development in higher education. COIL expands students’ global perspective by providing opportunities to solve problems in different cultural, social, and economic contexts. 

Global citizenship and Cultural awareness 

Developing global citizenship is an important goal of education, but under-represented students in higher education are less likely to participate in study abroad programmes. Thus, COIL is an inclusive and accessible way for all students to deepen their cultural awareness. It not only offers all students an international experience, but it also signposts faculty respect for international and multicultural learning. Borger notes that ’Where educators have established an appreciation of culture and actively demonstrated responsiveness toward diversity in the classroom, minority students have improved feelings of value, are more proficient in learning, and demonstrate increased engagement and achievement.’ 

Technology requirements to support COIL 

Early in the project, academics need to clarify the processes, outputs and evidence students will be expected to produce, as this informs which tools students could use. Doing so helps avoid choosing unsuitable tools, or discovering too late that a tool cannot capture evidence and logs or share materials. These technologies also need to work for the partner institution, taking into account any restrictions that apply in other countries. Institution-paid solutions typically offer wider feature availability and fewer product restrictions, as discussed in this web conferencing tools post.

Technology good practice and providing examples 

Providing generic good-practice advice or examples will help students use their chosen tools well, including guidance on how to run effective virtual meetings and why this matters. 

Academics could suggest tools that will support students working with their international peers, and provide guidance on how to use them. These should be tools you are familiar with and that your institution already supports, which also promotes safer data practices, as mentioned below. At Sussex this is likely to be Zoom.

Accessibility, data protection and censorship 

Academics should highlight important factors students need to consider when choosing and using these technologies with others, including accessibility and data protection under the General Data Protection Regulation (GDPR). 

It is also key to raise the importance of tools and features that support their peers’ needs, like captions, transcriptions, and saving/exporting the chat. For example, this Zoom accessibility page refers to features that enable all participants to contribute to a meeting. 

Convenors should encourage students to use the institution’s ITS-approved platforms and solutions, which are compliant with data protection legislation. Students should be aware of what companies do with their data, what they need to consider, and how the sharing of private data can be minimised. On other platforms, student data may be less secure, used for other purposes, or retained for a long time. 

It should also be remembered that, where students use tools outside institutional systems, the institution will have no access to files and emails in the event of a complaint or misconduct.  

Internet censorship is another factor to consider when collaborating globally, and the maps in this study show how restrictions vary by country. 

Posted in Learning Design

How technology has helped deliver better professional services

This blog usually concerns itself with learning technologies, but this week I am looking at some of the advances in the technology that Professional Services staff use to support teaching, learning, assessment and the general student experience.

I know we all complain when the systems don’t quite do what we want them to, but when I think back to all the paper we used, the time spent handwriting everything, even just the hours we spent folding documents to be posted to students, I’m amazed and gratified at the progress in technology over the last 30-plus years, which has been essential in streamlining and improving the work we do and the service we provide to students as part of Professional Services. When I joined Sussex in 1990 it was as a clerk/typist in the Undergraduate Admissions office in Sussex House. In those days we had one typewriter, and a temperamental machine attached to a dial-up modem, which had a direct link to UCCA, so that we could record any offers made to potential students.

Dissertation dash panic and sorting scripts

By 1996 I was in the Exams and Student Progress team. Everyone now had a PC, and a rudimentary email system. However, most of our work was still done manually. Once a year we had a whole week devoted to taking in students’ essays and dissertations (the origin of the famous ‘dissertation dash’), which took place on Mandela Balcony.  Students would queue up at desks with a sign showing their module code, hand in two physical copies of their work, and sign a paper ‘submission sheet’.  Students had to make sure that they had correctly completed the cover sheets (pink for finalists, yellow for second year, and grey for first year), and a title form signed by their tutor.  When it came time for the doors to close there would be a lot of panicked, anxious students trying to fill out the cover sheets and attach them to their work. Part of my job at that time of day was to walk around the room with a one-hole punch, helping them put their work together, and reassuring them that as they were already in the room they could relax and would be able to submit.

The work would then be taken into a back room and sorted by hand into a pile for each module, in candidate number order.  The next day the scripts would be separated into two piles by pairs of staff, who would check them against the submission sheets, and bundle each pile of scripts up, for collection by the first and second examiners.

A vast improvement to the submission process came when we started to work with scanners, which would scan the barcode on a student’s registration card so that we could log their submissions directly into CMS. This really sped up the process and also meant that submissions were recorded more accurately.

Plotting the Exam Timetable by hand and inputting marks

My then manager, Jackie Marsh, who was in charge of Arts-based exams, would book a room with the manager of the Science-based exams, Tony Durrant, for a week, and together they would create the exams timetable. This would be written out by hand on A3 sheets of paper, and once it was finalised it would be typed out into a Word document, and then hard copies would be posted out to all students taking exams. Eventually we had access to WCM, so the timetable could be posted on our website instead, which cut down on a huge amount of work and paper use. Now that we have so many more students and modules the exam timetablers use the computerised system Optime, which has its own challenges but has brought the process into the 21st century.

One of my jobs in those days was to print out the mark sheets for all submitted assessments and exams.  The mark sheets for submissions would go with each pile of scripts collected in Mandela Balcony, and also with the scripts returned to the office from the exam rooms.  Either the examiners or Subject Secretaries (as the Course Coordinators were then called) would write the marks for each piece of work on the mark sheet and return them to the Exams Office, where we would type them into CMS.  Each mark had to be entered twice, and this was a long, laborious process, not made easier by the number of missing marks we had to chase, and the quality of some examiners’ handwriting!

After the marking process was completed, the Schools would return all the scripts to us (submissions and exams) and we would take them to our archive room in Arts D.

Things are very different today, with most tasks being managed digitally. Canvas (and its predecessor Study Direct) has been used to enhance the students’ learning experience, and the e-submission and feedback systems provide new tools and challenges for Professional Services staff, but the piles of paper are a thing of the past.

Posted in digital skills

Buddycheck: Get ready for a new peer evaluation tool to support group work at Sussex

Coming soon to Canvas – Buddycheck is a new peer evaluation and feedback tool that can help make group work fairer and more transparent, and reduce workloads for staff. It’s Learning Tech that dreams are made of!  

View of five people working on laptops from above
Photo by Marvin Meyer on Unsplash 

Here I explain what it can do, what you can do now to find out more about the app, and the guidance and training we will be offering in September.  

Why a peer evaluation tool? 

Learning how to work effectively as part of inclusive groups and teams is a vital skill and one we need to help our students develop. The improved integration of group work at all levels of the curriculum will contribute towards meeting Learn to Transform strategic priorities, and those emerging from the Curriculum Reimagined project, to put skills development and employability at the heart of the Sussex curriculum. In addition, emerging best practice in assessment in response to generative AI tools, such as ChatGPT, further highlights the need for an increased focus on curriculum design that embeds and expands the use of authentic assessments like group work. 

However, when talking with academics, student dislike of group work is often cited as a reason not to include it in the curriculum. This is in part due to concerns about group assessment being unfair or unrepresentative of group members’ contributions. It likely also reflects the fact that group work is just difficult. However, this doesn’t mean it should be avoided. Instead, we should strive to develop curricula that build our students’ skills and confidence over time.  

To support group work and mitigate such student concerns, peer evaluation and scoring of individual contributions to group work is already used extensively at Sussex. However, existing approaches have several challenges:  

  • Where students are asked to agree and allocate a share of marks amongst themselves the lack of anonymity can be stressful and put pressure on students, e.g. to allocate equal marks to peers 
  • Staff are not party to the reasons for the decisions made within groups, or by individuals, about peer scores 
  • Approaches that seek to preserve anonymous evaluation and scoring, e.g. by asking students to email scores for their peers to be collated by staff, are immensely time consuming 
  • Students rarely receive feedback on their contributions to group work or how they might improve in the future 
  • The approaches used for peer evaluation and scoring can be idiosyncratic and inconsistent over a course of study 

How can Buddycheck help? 

Buddycheck is a simple-to-use peer evaluation and scoring tool, which will be integrated into Canvas from August this year. It will help you to: 

  • Collect your students’ scores and feedback on their own and their peers’ contributions to group work  
  • Provide your students with automated yet personalised feedback on contributions to group work.  
  • Use peer scores to calculate and apply individual weightings to marks awarded for group submissions. 

It can be used formatively or linked to summative assessments. You can ask students to respond to pre-set questions on their contributions to a group project, or devise your own. You can also decide on the level of peer feedback and anonymity, and can adjust how peer scores are weighted and used to calculate individual marks (a sketch of how that arithmetic might work follows the feature list below). Other features include:  

  • Automated reminder emails  
  • Full integration into Canvas meaning it is easy to find, and updates easily when changes are made to group memberships 
  • The instructor dashboard flags high-performing, low-performing or over-confident students  
  • You can choose how to use peer feedback, i.e. for the tutor’s eyes only or as peer-to-peer feedback 
  • Instructors can view all scores and feedback given and received by peers  
  • You can ‘play’ with applying self-scores and adjustment factors and weightings before finalising 
  • You can preview as a student  
  • If applying individual weightings you can overwrite the group marks in Gradebook, or create a new column 
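
To make the weighting idea more concrete, below is a minimal illustrative sketch of how peer scores might be turned into individually weighted marks. It assumes a common WebPA-style calculation (each student’s adjustment factor is their total received score divided by the team’s average total); Buddycheck’s own algorithm and settings may differ, and all names in the snippet are hypothetical.

```python
# Illustrative sketch only: a WebPA-style weighting calculation.
# Buddycheck's actual calculation and options may differ.

def adjustment_factors(peer_scores: dict[str, list[float]]) -> dict[str, float]:
    """Map each student to (their total received score) / (team average total)."""
    totals = {student: sum(scores) for student, scores in peer_scores.items()}
    team_average = sum(totals.values()) / len(totals)
    return {student: total / team_average for student, total in totals.items()}

def individual_marks(group_mark: float,
                     peer_scores: dict[str, list[float]],
                     cap: float = 100.0) -> dict[str, float]:
    """Scale the group mark by each student's adjustment factor, capped at 100."""
    return {
        student: round(min(group_mark * factor, cap), 1)
        for student, factor in adjustment_factors(peer_scores).items()
    }

# Example: a group of four receives 65 for their submission,
# and each student is scored by their three teammates.
scores_received = {
    "Student A": [4, 5, 4],
    "Student B": [3, 3, 2],
    "Student C": [5, 4, 5],
    "Student D": [4, 4, 3],
}
print(individual_marks(65, scores_received))
# {'Student A': 73.5, 'Student B': 45.2, 'Student C': 79.1, 'Student D': 62.2}
```

In Buddycheck itself, the equivalent controls are the adjustment factors and weightings you can ‘play’ with before finalising, as noted in the feature list above.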

An extract from a Buddycheck ‘personal report’ for a student user is provided below. Note how the spider plot shows a somewhat overconfident student the difference between how they rated themselves (Self), how their teammates rated them (Received), and the average rating for the entire team (Team Avg). The personal report then breaks the scores down by question, with feedforward comments on how students might improve.  

An extract from a Buddycheck ‘personal report’, presented as a spider diagram

We road-tested Buddycheck in Semester 1 of 2022/23 with four Module Convenors from three schools, on modules with between 25 and 950 students, covering Levels 4, 5, 6 and 7. All four had a positive experience and told us they found Buddycheck easy to use, that it simplified their peer assessment processes and that they are happy to recommend it to colleagues. 

“… a very user friendly tool that greatly simplifies my peer assessment process. The best point is that it integrates into Canvas very well, so each student receives a private and personal request to complete the evaluation. In this way, there is no longer ‘the pressure to give equal peer marks’ that I observed in the past years when I used my old peer assessment method, and I believe it has made the peer assessment fairer.”  Hsi-Ming Ho, Lecturer in Theoretical Computer Science (Informatics) 

What support is available? 

Buddycheck will be available from August, with ‘how to’ guidance and workshops from Educational Enhancement from September, as well as guidance you can use with your students.  

We also plan to enhance our current guidance on planning group work assignments to include insights into how to encourage inclusive and productive group work, including using peer scoring and evaluation.  

However, if you would like to know more about Buddycheck now, e.g. to inform your curriculum development over the summer, you have three options: 

  1. Take a look at existing guidance: 
  2. Join a 30 minute online briefing session for Sussex staff: 
  3. Or, just drop me a line and I will happily talk you through it. S.hemsley@sussex.ac.uk  
Posted in Learning Technologies, Marking and assessment, Technology Enhanced Learning

The quest for alternative assessment

Formal exams and essays have formed the backbone of academic assessment for centuries, but they are far from perfect, and they don’t suit everyone – tutors and learners alike. Alternative assessment methods have been growing in popularity. Could it be the accelerating adoption of artificial intelligence that finally forces us to fully embrace them?

What are the alternatives?

Alternative assessment refers to any assessment method that is not a traditional timed exam paper or academic essay. This can take different forms, including:

  • Portfolio: a collection of student work that demonstrates their learning over time.
  • Project: a longer-term assignment that requires students to apply their knowledge and skills in a real-world context.
  • Presentation: an opportunity for students to share their learning with an audience.
  • Performance: a task that requires students to demonstrate their skills in a hands-on way.
  • Media: students submit their work using video or audio recordings.

There are numerous benefits to embracing alternative assessment. It makes it possible to assess a wider range of learning outcomes, such as critical thinking, problem solving and creativity, thus potentially providing a more comprehensive assessment of student learning. It can also increase student engagement: alternative assessments can be more engaging and motivating for students, which can lead to improved learning outcomes.

Another benefit is closing the awarding gap and increasing access. Alternative assessments provide students with a variety of ways to demonstrate their learning, which can be beneficial to those with disabilities or from diverse backgrounds. And, of course, it would help to address the challenge of artificial intelligence, because it is much more difficult to use AI to complete an alternative assessment.

Even before the rise of AI tools such as ChatGPT, Dr Carli Rowell (Sociology and Criminology) approached Educational Enhancement to deliver a workshop highlighting different alternative assessment methods. Dr Charlie Crouch (Academic Developer) and Rachael Thomas (Learning Technologist) were delighted to have the opportunity to collaborate on this workshop (Sussex login required).

Professional Services collaboration

Educational Enhancement are often engaged in blue-sky thinking with academic colleagues but are also regularly asked about how changes can be made within current university systems. Accordingly, there was great benefit in collaborating with Professional Services colleagues to identify where and how any changes to assessment could be accommodated.

A beautiful collaborative relationship was born among the Professional Services teams providing support to the Social Sciences cluster of schools.

  • Charlie Crouch (Educational Enhancement) drew from pedagogic literature on the benefits and challenges of alternative assessments, and what has been successful in other institutions, to facilitate the workshop.
  • Amanda Bolt (Academic Quality and Partnerships) gave practical advice about the School Education Committee process for when, how and whether to make changes to accommodate alternative modes of assessment.
  • Anna McCall (Academic Regulations) provided information about assessment regulations and common pitfalls in assessment briefs.
  • Rachael Thomas (Educational Enhancement) provided expert knowledge of specific university systems and instruction on how to set up assignments to accept the different ways students might submit work.

Together, they provided a rounded picture of the process, and were able to respond to queries which arose during the workshop.

The workshop

The workshop started with an activity, asking participants to add post-it notes to a wall, saying why they think we need to assess students. This was followed by an explanation of why alternative assessments are desirable, what they could look like, some case studies and student feedback.

Participants in the workshop were then asked to play a game (described below), to identify different methods of assessment, and how these might work within Sociology and Criminology. After the game, there was an opportunity for participants to ask questions about the practicalities of how they would implement the assessment types they had discussed and identified as appropriate for their learning outcomes.

The game

The larger part of the workshop focused on participants taking part in Assessment: The Game – an activity developed by Ian Turner, a professor in learning and teaching in higher education at the University of Derby, to ’break down existing barriers and preconceptions about assessment modes that can be used in higher education.’

The workshop attendees were split into small groups, and each member was dealt three cards from a deck of 60, each showing a different type of assessment mode. They were invited, as groups, to consider one of the learning outcomes associated with their module and create an assessment which aligned with that learning outcome, using one of the assessment mode cards in their group. They were given 45 minutes to discuss this in their groups, with an opportunity to swap their cards if they couldn’t find a match with one of their learning outcomes.

Review

Feedback during and following the workshop included some robust discussions about concerns that assessments might become less academically rigorous as they become more varied. Inevitably, conversations also touched on the possible implications of ChatGPT for assessment. And there were suggestions for how the skills gap among academic staff and students could be addressed to enable the use of different tools and systems.

The game was well received, with one participant commenting ‘it was really great and worked well’ and recognising that it was structured enough to provide direction, but free enough to allow discussion of issues important to the school. As a result, a strong interest in podcasts was identified and a follow-up workshop on designing podcasts as assessment has been arranged. Everyone involved enjoyed spending an afternoon together dedicated to exploring the possibilities of alternative assessments.

If you would like the Educational Enhancement team to deliver this workshop in your school, please contact tel@sussex.ac.uk.

Posted in Learning Design

Playful Learning: The seriousness of fun

Image generated by AI

’Imagination is more important than knowledge. For knowledge is limited to all we now know and understand, while imagination embraces the entire world, and all there ever will be to know and understand.’

Albert Einstein

In today’s blog post we are going to embark on an adventure into the wonderful world of Playful Learning! What is that, I hear you ask, wonderful imaginary audience? Well, Playful Learning is a pedagogical theory examining how the act of play can be applied to education. Anyone who’s ever watched a toddler examining the world will know that the act of playing is vital to our early-life learning, but can it be applied in Higher Education? Yes, I believe it can, and in this blog post I will attempt to convince you why.

So, let’s dive a bit deeper into what Playful Learning actually is. Essentially, it’s the idea that when students are actively engaged in enjoyable and meaningful experiences, their capacity for learning and retention increases.

Play vs work or fun vs seriousness.

There is often a false dichotomy whereby anything that is playful or fun is seen as the opposite of work and seriousness: the idea being that they are separate spheres that cannot intersect, that if you are having fun then what you are doing cannot be serious, and that if you are playing then you are not working. But this is clearly rubbish. We often do our best work when we are fully engaged and indeed having fun; it’s a win-win for everyone except for antiquated ideas. All intelligent life on earth appears to engage in play, and humans engage in it the most. There is strong evidence that, from an evolutionary standpoint, the act of play has helped create our current intelligence.

This is not to say that we must always be playing and having fun. There are areas of life which we will find boring but must pursue anyway; rather, the idea is to minimise these areas where we can.

There is also something to be said for subjective experience: what one person considers fun may be anathema to someone else. I find it fun to run for many miles in the rain across the Downs, but I’m sure that would be torture to someone else, and no-one wants to be forced into a ’fun‘ experience they won’t enjoy. So Playful Learning is not always the solution for every use case, but it does have its place and some key benefits, outlined below.

Engagement and motivation

We all learn best when we are engaged, and indeed often the hardest part of learning is getting students to the point at which they are fully engaged: when they have intrinsic motivation to pursue a subject and learn as much as they can. What causes us to be unengaged when attempting to learn? Usually, it’s being bored and so losing any motivation to continue. But if we are having fun then our engagement increases: we can stay engaged for longer periods and develop a deeper understanding of our subject.

Play can give agency to students, allowing them direct choice over how they engage with and respond to a Playful Learning activity. This sense of empowerment, letting students decide on their own approach, is vital to ensuring consistent engagement.

Curiosity, creativity, and the acceptance of failure

It’s easy to get stuck in a rut when dealing with knowledge and learning. Being playful can often be novel and foster a sense of exploration and trying things out, which enables students to examine new and innovative approaches.

Tied to this is creating an environment where failure is accepted as a vital part of the learning process, which is important as failure will frequently be encountered when learning anything. In a more ’serious‘ environment students may end up being more risk averse and fearing failure, for example the student who does not want to answer a question posed in a classroom because they worry about feeling stupid in front of their peers. Play can help overcome this, enabling students to feel safe to take risks, learn from their failures and build a level of resilience within a playful environment.

Even as a teacher, running Playful Learning activities holds an element of risk: they may go well, but in some cases they may be a failure. Leading by example, showing your students that it’s okay to try and fail sometimes and to accept and learn from our failures, allows them to do the same. You can change the framing of failure from a negative to a positive.  

Encouraging active participation and collaboration

Too often learning can be a passive process, with students simply listening to someone talk. We know this is not the best way to learn; rather, we want students to be active within the learning process. Play can help to encourage this because, by its nature, it must be active. Whilst it can be done alone, it is far more likely to be a collaborative activity in which students all engage together, such as a simulation, a game, or a building activity. If students are having fun, then they are also more likely to work together.

Play gives students safer ways of working together, having fun together and learning to work as a team. Reluctance to engage in collaborative activities can often be overcome with a fun or playful activity that reduces some of the stress such activities can create.

Conclusions

The present state of the world is filled with stress and division, and this has an effect on the teaching environment, but we can help to counteract it by creating a playful space within our teaching. In a world overshadowed by war, pandemics, advancing AI and societal strain, it can’t hurt us to embrace a sense of playfulness.

If you’d like help exploring or using Playful Learning in your teaching, please get in touch with Educational Enhancement at tel@sussex.ac.uk  

Posted in Learning Design

Academic Developers June round-up

Education Festival 2023  

Crowd of people in lobby of the ACCA building

Last month, 46 colleagues participated in the first ever Education Festival at Sussex. Ahead of the Education Awards in the evening, we enjoyed a half-day of speed presentations, interactive workshops and a solution room dedicated to generative AI. We discussed a broad range of topics such as building inclusive learning communities, assessment and feedback, communicating scholarship, online learning, co-creation and dis-metacognition. To find out more about the day, dedicated to sharing good practice, research and innovation in teaching and learning, you can read our recent blog post.

Education Awards 2023 

Group of Education Award winners on the steps of the ACCA building

The University celebrated and recognised teaching and professional services staff at the Education Awards on 4 May, with a ceremony followed by canapés and drinks in the Attenborough Centre for the Creative Arts. The Educational Enhancement team were thrilled to win an Inclusive Sussex Award. 

Education and Innovation Fund 

Three photos of Dr Victoria Walden (MAH), Dr Dave Smalley (Psychology) and Dr Emily Danvers (ESW).

We’re delighted to announce that the winners of the second round of the Education and Innovation Fund are Dr Victoria Walden (MAH), Dr Dave Smalley (Psychology) and Dr Emily Danvers (ESW). 

There is one more opportunity to apply for this year’s round of funding, and colleagues can apply through this online form. The closing date for the third round of applications is Friday 14 July 2023, and the outcomes will be announced a few weeks later. 

If you have a project that you would like to put forward, we would strongly encourage you to work with your Academic Developer. They are ready to provide advice and guidance on the awards and how best to complete your application. 

Teaching with AI Community of Practice 

Educational Enhancement is launching a Teaching with AI Community of Practice. This will be a cross-disciplinary network for sharing concerns, ideas and examples of practice in teaching and assessment in response to the recent emergence of AI tools such as ChatGPT. Events and resources are coming soon. Please register your interest by emailing Simona Connelly (S.Connelly@sussex.ac.uk). 

Posted in Professional Development

Canvas ‘hacks’ to save you time and stress.

Let me first make it clear I’m not talking about software hacking, but about lifehacks. A lifehack is ‘a tip, trick, or efficient method for doing or managing a day-to-day task or activity’ (source: Dictionary.com).

This post will show you some tools and functions in Canvas that you may not have been aware of, which can save you time and stress.

See what’s what.

Sometimes we have so much information available to us that we can’t find what we are looking for. In Canvas, most teaching staff use the Card View on their Dashboards. This option has some great little tools to make it easier to see what you are looking for.

Hiding/showing cards.

Not every module you have a current role on needs to be on your Dashboard. By using the ‘star’ icon in your list of Modules you can choose which cards appear on your Dashboard. So you can keep your teaching modules where you see them first, while modules where you have less involvement (as moderator, for example) can stay in the Modules list.

List of modules, some with orange star icon and some with white star icon.

Rearranging cards.

Even within the collection of cards you want on your Dashboard there are likely to be some you use more often than others. You can rearrange the cards on your Dashboard into any order you prefer, so the ones at the top can be the ones you access most often.

You can move cards by dragging and dropping, or by using the Move options under the 3 dots on each card.

'Move' tab within 3 dots at top of a module card. Options to move up or down.

Nicknames.

If you have several cards with the same or similar names (such as undergraduate and postgraduate versions) you can add a Nickname that only you will see to help identify them. The real name of the module will still appear underneath your chosen Nickname.

2 screenshots, first showing the box to add a nickname and second, where the chosen nickname will appear on the module card.

Colours

Some people like to use colours to distinguish between things. Whilst not something we would advocate generally, as it is not accessible, here we are talking about things only you will see. So if it helps you, go ahead and use the card colour option to change the colour of the dot, the title, and if there’s no module image, the colour of the card itself.

Colour tab under 3 dots on module card, with grid of colours to select

There is also an option to have a colour overlay on all the cards on your Dashboard. This is on by default, but if you don’t need it, you can turn it off from the 3 dots at the top of the Dashboard. This will generally make your Dashboard look brighter and card images clearer.

Dashboard showing card with overlay colour and setting to change this under 3 dots at top of Dashboard.

For these and more options to personalise your Dashboard see the Canvas Guide (please note, Canvas uses the term Courses for what we call Modules).

Turn back time.

We all make mistakes from time to time, and Canvas gives you a couple of options for undoing them.

Page History

When you are editing a Page it is easy to accidentally delete something important, or maybe you changed your mind about some changes. As an editor you have the option to view all the previous versions of a Page (please note this only applies to Pages, not other types of content). You can select a version to look at it and then click on Restore this revision to make it the current version.

2 screenshots, first showing View Page History option under 3 dots next to Edit button on a Page. Second showing list of versions with option to Restore a version

Because you have this tool available it’s a good idea to Save often while making big changes to a Page so if you do need to go back, you won’t lose too much progress.

Undeleting

If you accidentally delete an item from your Canvas site it is probably not gone for good. If you type /undelete at the end of the site’s URL (so if your module URL was https://canvas.sussex.ac.uk/courses/45679 it would become https://canvas.sussex.ac.uk/courses/45679/undelete) you will see Assignments, Quizzes, Announcements, Pages, Units and more that have been deleted from the site. To restore something just click the Restore button – disaster averted!

Details of a page that has been deleted with a button to 'restore'
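
If it helps to see the pattern spelled out, here is a tiny illustrative snippet (the function name is made up for this example) that builds the undelete address from any Canvas module URL by appending /undelete, exactly as in the example above.

```python
def undelete_url(module_url: str) -> str:
    """Return the 'undelete' page address for a given Canvas module URL."""
    return module_url.rstrip("/") + "/undelete"

# Example using the module URL from the text above
print(undelete_url("https://canvas.sussex.ac.uk/courses/45679"))
# https://canvas.sussex.ac.uk/courses/45679/undelete
```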

As always, if you want any help with these tools, please contact your Learning Technologist via tel@sussex.ac.uk.

Posted in Canvas

Education Festival 2023

Last month, 46 colleagues participated in the first ever Education Festival at Sussex. The sun was shining on the ACCA on May 4th, perfect for a half-day dedicated to sharing good practice, research and innovation in teaching and learning.   

Deputy Pro Vice Chancellor Kelly Coate in front of an orange and white wall.
Kelly Coate

After some refreshments, we began the day with a welcome from Kelly Coate, who placed the Education Festival in its wider context, such as the DARE to Transform network and associated Pedagogical Revolution events. We then began our first panel of speed presentations dedicated to building inclusive learning communities.  

A man standing in front of a screen
Adam Bradley

Up first, Katherine Kruger and Heather Taylor presented reflections on their inclusivity connector project, which explored how students’ racialised identities impact their learning experience here at Sussex. Katherine and Heather shared their students’ reflections so far, including how inclusive initiatives can be othering and the need to make space for students’ own perspectives and experiences in the classroom. Next, Dorina Cadar shared her thoughts on managing emotions in the classroom, including practical advice on how we can respect and use those emotions to support students’ learning experiences. Lastly, Clare Hardman and Adam Bradley presented reflections on their pilot module mentoring project, alongside some humorous illustrations! They explored an issue we’ve all been thinking about a lot recently: encouraging student participation. 

A group of women standing in front of a screen
Tsholo Molefe and Marlene Gadzirayi

The second speed panel focused on assessment and feedback. Josephine Van-Ess introduced self-reflective logs and their potential for life-long learning. Then, Masters students, Tsholo Molefe and Marlene Gadzirayi, drew on their experiences of writing self-reflective logs to explain how powerful they can be, in particular for developing a sense of belonging and empowering students in their learning journey. Sam Hemsley showed us Buddycheck, a new peer evaluation and scoring platform which will be integrated into Canvas from September. The tool simplifies the collection and review of peer comments and scores on contributions to group work, and provides students with automated feedback. Lastly, Jo Tregenza showed us an alternative assessment from a unit of work on the needs of pupils with EAL and bereavement. Over the semester, her students compile sketchbooks full of activities and reflections, with some truly inspiring results.  

A person standing in front of a projector screen
Lucila Newell

After a break (with our Education Festival playlist in the background), we divided into two groups for the interactive sessions. First up, participants had the choice between Lucila Newell and Brena Collyer de Aguiar’s workshop on gamified learning practices, or Carli Rowell’s workshop on staff-student co-creation. Lucila and Brena introduced a new gamified Online Distance Learning module. They explained why gamified learning is effective and allowed attendees to try one of the module challenges by imagining themselves in 2047. Carli introduced us to the pedagogic motivations behind student co-creation and showed us how she co-created one of her modules with her students. Drawing on this experience, Carli then offered a guide to embedding co-creation in the curriculum. Next up, Brena and Helen Todd shared their practical lessons from supporting online students. Examples included providing audio recordings of content so that students have flexibility in where and how they study. Meanwhile, in Marcelo Staricoff’s workshop on dis-metacognition, two volunteers were asked to demonstrate the different experiences of navigating the ‘pit of not knowing’, before we considered practical examples, including framing learning outcomes as questions.  

A group of people in a room

After lunch, we enjoyed another speed panel which focused on different ways of disseminating scholarship. Sue Robbins introduced us to the Infographics Project, which is exploring novel and engaging ways of communicating scholarship. Catrina Hey informed us about an innovative Library pilot project focusing on creating open access teaching resources, and Sarah Watson introduced us to a new online resource dedicated to developing the scholarship of teaching and learning. The new webpage has cross-disciplinary and disciplinary specific resources, as well as a guide to progression for colleagues on Education and Scholarship pathways.  

A group of women sitting at a table with sticky notes
Sarah Watson, Mellow Sadik and Kelly Coate

Lastly, we could not host an Education Festival in 2023 without mentioning the infamous generative AI! Sam Hemsley provided us with a timely post-lunch awakening with the deliberately provocative statement: ‘ChatGPT means the essay is dead’. The statement was certainly divisive! We split into groups to discuss our initial reactions to Sam’s statement before opening our Solution Room. Participants considered whether they agreed with the statement, and what the implications for future assessment could be. Thoughts varied from practical, immediate actions, such as how we can design written assignments to be more robust in the face of AI, to the fundamental, pedagogical underpinnings of how and why we assess students. We ended with a call to action in the form of a new community of practice focused on generative AI. If you’re interested in being a part of this community, please email the Academic Development team, who will let you know what’s coming up. 

Overall, we were thrilled with our first Education Festival at Sussex. The day felt very collegial and ‘warm’, and we left with pages of notes and further avenues for development. Thank you again to all of our speakers and participants. We hope to see you and many more for more education fun at the festival next year.  

Links to presentations in this post are limited to University of Sussex staff only

Posted in Events, Professional Development

About our blog

We are the Educational Enhancement team at the University of Sussex. We publish posts each fortnight about the use of technology to support teaching and learning. Read more about us.
