How the pandemic covertly made teaching, learning and assessment more inclusive

Graphic depicting a computer surrounded by icons indicating a variety of online activity, including search, images, video, mail and chat.
Dan Axson is an Academic Developer in Technology Enhanced Learning here at the University of Sussex. Inclusion, accessibility and digital capabilities are his areas of interest.

Intro

There is little doubt the pandemic has been overwhelmingly challenging for most, but there are areas in which we can be optimistic. In higher education, with teaching, learning and assessment moving online, there is a lot to learn from how things adapted. Through the lenses of digital capability and the Universal Design for Learning (UDL) framework, this blog post explores how we adapted, maps those adaptations to the UDL framework, presents some emerging evidence of their positive impact on inclusion and accessibility, and finally proposes what our next steps might be.

Lens 1: Digital Capabilities

‘Digital capability is the term we use to describe the skills and attitudes that individuals and organisations need if they are to thrive in today’s world.’ (Jisc). Digital capability has benefited a great deal from the pandemic. In our personal lives we used video conferencing to connect with friends and family, and collaboration tools to work from home. The same is true of teaching, learning and assessment. Many of us used technologies that, until March 2020, we’d largely not been aware of. There was little to no incentive, nor reason, to routinely engage with such technologies.

Then a little virus had other ideas. We had to learn how to teach via Zoom, how to better use our virtual learning environment (VLE), how to use chat, discussions and breakout rooms in live sessions. We had to learn how to help our students submit digital files, how to troubleshoot; the list goes on. Those of you who have looked at the framework will recognise where these activities fall within the defined areas of digital capability. No doubt you will recognise more in yourself.

It’s no exaggeration to say that many of us, myself included, had far more digital capability in Spring 2021 than we did going into Spring 2020. No amount of CPD, digital skills conferences, initiatives or job description editing could have hoped to achieve this at that scale and at that pace. It took a global health crisis.

Lens 2: Universal Design for Learning (UDL)

Universal Design for Learning is ‘a framework to improve and optimize teaching and learning for all people based on scientific insights into how humans learn.’ (CAST). Below are examples of how UDL might be, or has been, implemented as a result of the pandemic for each of the three principles: Engagement, Representation, and Action and Expression.

Engagement

  • Checkpoint 7.3: Minimize threats and distractions
    Reducing sensory overload by limiting the number of activities, or the ways in which to contribute, in an online seminar. Where you may have had two to three activities in a seminar, you now had one, with other activities engaged with asynchronously.

Representation

  • Checkpoint 1.1: Offer ways of customising the display of information
    Display information in a flexible format. The use of templates for module sites provided digitally accessible material, presented in a consistent manner. Increased use of video and captioning.

Action and Expression

  • Checkpoint 4.1 Vary the methods for response and navigation
    Alternative methods for engaging. Question and answer in lectures can be a challenge for some to manage on a small single-screen device; switching between Zoom, Poll Everywhere, Padlet and Canvas would pose too much of a challenge. Instead, colleagues are looking at ways of inviting responses before and after the lecture.

As you can see, many of the adaptations weren’t explicitly intended to make things more inclusive and accessible (arguably they should have been); rather, they were an ‘emergency pivot’ for learning continuity.

What impact are we starting to see?

With many factors at play (e.g. safety nets, academic misconduct, extenuating circumstances (EC) claims), there is still a lot to unpick, so what I present below are just my observations. The evidence of impact I am seeing and hearing comes directly from colleagues and students, reported in two key areas: attendance and attainment.

Attendance

Many academics are reporting higher, and more sustained, attendance in online sessions. Where in the past they may have seen a drop-off towards the assessment period, this is not happening at the same level. Similarly, some are reporting that their online sessions are averaging a higher attendance rate than in prior years. It could be that the option to join a session from home means students with caring or similar responsibilities can attend where they otherwise couldn’t. It’s worth noting here that attendance does not equal engagement, though that conversation is for another post.

Attainment

There is cautious optimism when it comes to attainment, and the associated gaps across various student demographics. Some colleagues in course reviews have commented that a reduction in failure rate is attributable to the diversification of assessment types. It’s too early to fully attribute this to online assessment, to changes in assessment design, or, as is more likely, to a combination of factors.

We must approach this with an open mind. For example, with EC claims, was there an increase because they were clearer and better signposted? Was it because of technical issues, in which case we would expect a reduction next time? Or, with home becoming the dominant learning space during the pandemic, were there caring, health or environmental issues? Again, likely a combination.

Your digital capability matters

Many of the positive adaptations to learning and assessment would not have been possible without us collectively improving our digital capability, both institutionally and individually. Our ability to create more inclusive and accessible spaces for learning is in large part dependent on our digital capability.

What now?

I know this raises more questions than it answers at this stage. The ones I’m keen to explore are as follows:

  • Has digital capability widened the range of activities and methods by which a student can engage, thus being more inclusive?
  • Through the lens of UDL, have changes in assessment mode and design made them more inclusive, in turn having the impact of reducing attainment gaps?

Through the two lenses, what can you identify in your practice? What digital capabilities have you developed? What can you now recognise as inclusive practice that maybe wasn’t explicitly so?

For those of you at Sussex, if you would be interested in exploring these with us, or want to find out how we can support your scholarly activities in these areas and more, please get in touch: DARE@sussex.ac.uk

Links

Jisc Digital Capability Framework

Universal Design for Learning Guidelines from CAST

An excellent series of UDL webinars hosted by The Support Centre Inclusive Higher Education (SIHO, Belgium): Towards genuinely inclusive universities


Reflections on different engagement techniques whilst teaching online


Guest post by Seun Osituyo

Seun Osituyo is a Lecturer in Accounting and a fellow of the Higher Education Academy. She teaches Management Accounting, Introduction to Accounting and Auditing. Her research interests include risk disclosure, risk management, sustainability communication and strategic management accounting practices.

Student engagement is a very useful part of our job as tutors, as it helps us check that learning has taken place (Butcher, Davies and Highton, 2006). But what exactly do I mean by student engagement here? Authors such as Axelson and Flick (2011) suggest that student engagement could lie somewhere between a serious commitment and mere appearance in the classroom. Relating student engagement to learning, Astin (1999) suggests that learning could be a reflection of both the “quantity and quality of students’ physical and psychological energy” invested. This brings me to my adapted definition of student engagement –

being present and actively participating during teaching sessions, to check that learning outcomes are achieved.

My quest was to find out what was happening behind those black screens during the synchronous lectures and seminars and to identify other ways to promote active participation in this covid era.

Here I share my thoughts and experience of student engagement from teaching both quantitative and qualitative accounting modules this academic year.

What proved useful in Term 1 of the 2020-21 academic session?

Qualitative accounting modules are largely delivered to students at the more advanced undergraduate level or at postgraduate level. The nature of the topics for the module I taught in Term 1 required students to read selected academic research articles on the related topics each week and prepare to discuss them at the seminar. The terms endogenous connections (making connections with the research articles) and exogenous connections (application) are very useful here (Strømsø, Bråten, and Samuelstuen, 2003). Students were expected not only to read and understand the research articles but to make sense of them and apply them to what they already know and to real-life situations. During my first lecture with the cohort I ran a poll asking students whether they preferred purely calculation-based exam questions or purely discursive exam questions. The responses I received inspired the need to ensure that they were evidently engaged on the discursive topics!

The lecture materials were designed in a way that allowed me to ask questions from time to time, not only about what I had covered but also to see how students would apply a concept to ‘real situations’. For example, when teaching the topic of accountability, students were asked to discuss the question ‘Who is the University of Sussex accountable to?’ in breakout rooms. At the start of most seminars and lectures, I re-emphasised that working in groups might be more beneficial for discursive topics, as students would get to actively learn from each other. The key here was keeping the breakout room discussions very short. Oftentimes I would share a file with instructions and timings (usually five minutes to discuss and two minutes to put a summary of the discussion on a virtual board, e.g. Padlet).

It must be noted that not all students used the breakout rooms. Some used them to discuss the research articles, with one student from each virtual group providing a summary of the ideas on Padlet. Others preferred not to join breakout rooms but provided individual answers on Padlet. Sometimes (on very few occasions) students felt comfortable enough to unmute themselves and discuss their summaries during the seminars; many did not. Taking into consideration people’s individual circumstances, these summaries were then discussed by the tutor. It was interesting to see the different ideas on Padlet, and more so that everyone got to see what other (virtual) groups thought about the articles. In all of this, I said to myself: “as long as we find a way to communicate with one another, there is an opportunity for learning to take place”. This was also useful for me in checking that the learning outcomes were achieved during both the seminars and the lectures. On reflection, I probably could have recorded only the parts where I spoke at the seminar.

My experience in the quantitative accounting module in Term 1 was quite different. Students engaged more using the chat section, possibly because questions were mainly calculation-based. Students would be given a question to answer within a timeframe (e.g. seven minutes) and then put the result in the chat section. I would of course go over the explanation again for students who did not get it right. Verbal responses from students were not popular, especially when all participants were online. On reflection, breakout rooms might also have worked for calculation-based seminars.

What have I tried in Term 2?

OK, I did not stop there. In term 2 I wanted to introduce flipped learning through asynchronous teaching in one of my modules. My reasons for this were as follows.

Firstly, flipped learning provides individual students with the autonomy to learn at their own pace, within reason. Fisher, Perényi and Birdthistle (2018) suggest that, in addition to attempting seminar questions, flipped learning provides students with an individual learning space where they can be confident about their knowledge of concepts. Of course, the use of inappropriate pedagogy might lead students to resist this approach; hence students should be able to see how the flipped learning materials link to the overall module content (Turan and Goktas, 2016). So, in most cases, I would split one of the topic learning outcomes into smaller unit objectives and create content based on these objectives.

Secondly, I observed some colleagues, both in my school and in other institutions, who had used pre-recorded videos and reported good feedback, especially on student engagement.

Thirdly, depending on numerous factors, such as different time zones, some students may be unable to join all synchronous lectures. Yes, synchronous lectures can be recorded, and real-time interaction between the lecturer or tutor and students is beneficial, but if all lectures and seminars are synchronous then, in my opinion, students who are unable to attend the live sessions at all, especially for reasons beyond their control (e.g. broadband issues), may feel excluded and have less opportunity to fully engage with the module. I also wanted to offer students an opportunity to learn both asynchronously and synchronously and to appreciate their distinctive benefits.

Lastly, I still very much wanted to have more interactive live sessions with students. My thinking was that if students understood the basics of what we would cover in the synchronous lecture beforehand, it would yield a more useful and interactive session.

To ensure that students had the opportunity to participate fully during the synchronous lecture, the pre-lecture videos were provided well in advance. I embedded at least one quiz in most pre-lecture videos, which students had to attempt before proceeding to the other parts of the video. I would then collate student responses and discuss them at the start of the synchronous lecture. This was very useful for checking that what I explained in the video had been understood.

Did this improve active participation at the synchronous lecture?

In comparison to last term, I would say yes. More students actively participated during the synchronous lecture: students answered questions when asked, and asked questions of their own. The discussion seemed to flow well. In some cases, we were able to move quickly into numerical questions during the synchronous lecture, which I would ask them to attempt first by applying their knowledge of the concepts covered in the pre-lecture video. This seemed to work well, as some students came up with the right answer.

I am not sure if this is related, but the attendance rate has also been very good on this module. We are now in week 5 and have had at least 70% of a class of over 200 students attending the synchronous lecture every week. This is slightly better than my Term 1 attendance rate, although I would not want to make an unfair comparison here, as this is an entirely different cohort from the students I taught in Term 1 and a different module. Other factors might also have played a role; for example, the synchronous lecture for this module starts at 9 am on Monday.

Student feedback about the pre-lecture videos collected during one of the synchronous lectures suggests that the pre-lecture videos have also been useful for them. Here are just a few comments from the feedback:

“allows you to be flexible, able to watch the videos when it is the best time for you.”

“well paced and interactive”.

Future implications

Providing different opportunities for students to engage, such as flipped learning and group discussions with the aid of technology enhanced learning tools, promotes active participation. One thing is certain: if I had not attended training provided by the Higher Education Academy (now Advance HE) and used some of those skills in my teaching during the pre-covid era, I might have found student engagement in these times a bit challenging. Training provided by TEL Sussex was very useful, and applying these skills has contributed to the more ‘engaged’ classroom I now have.

Group discussions:

These have always been encouraged on discursive modules (Bruun, Lindahl and Linder, 2019). I am now introducing virtual group discussions to first-year students. In my opinion, students should be encouraged to use them more, as, used properly, they stimulate active learning and help build a sense of community. Group discussions can take different forms and do not necessarily need to be verbal, e.g. using virtual boards like Padlet. More importantly, students should be able to communicate with each other in the way they feel works best.

Will I consider using flipped learning in the post-covid era?

Flipped learning is not a new teaching approach and has been used since the 1990s. I have come to appreciate the usefulness of providing pre-lecture activities, especially with the uncertainties facing us tutors, where we are sometimes not able to tell what happens behind those screens. The flipped learning approach makes the class more active: we are able to address any concerns about the introductory aspect of a topic as soon as possible (e.g. at the start of the synchronous lecture) before moving on to the more technical aspects. I felt the teaching and learning flowed more, with a very good attendance rate at the synchronous lectures. Although making these videos can be tedious and time-consuming, the benefits are endless. If practical, I will use them again.

References:

Astin, A. W. (1999). Student involvement: A developmental theory for higher education. Journal of College Student Development, 40(5), 518-529.

Axelson, R. D., & Flick, A. (2010). Defining student engagement. Change: The Magazine of Higher Learning, 43(1), 38-43.

Bruun, J., Lindahl, M., & Linder, C. (2019). Network analysis and qualitative discourse analysis of a classroom group discussion. International Journal of Research & Method in Education, 42(3), 317-339.

Butcher, C., Davies, C., & Highton, M. (2006). Designing learning: From module outline to effective teaching. UK: Routledge.

Fisher, R., Perényi, Á., & Birdthistle, N. (2018). The positive relationship between flipped and blended learning and student engagement, performance and satisfaction. Active Learning in Higher Education, 1469787418801702.

Strømsø, H. I., Bråten, I., & Samuelstuen, M. S. (2003). Students’ strategic use of multiple sources during expository text reading: A longitudinal think-aloud study. Cognition and Instruction, 21(2), 113-147.

Turan, Z., & Goktas, Y. (2016). The flipped classroom: Instructional efficiency and impact on achievement and cognitive load levels. Journal of e-Learning and Knowledge Society, 12(4).

Posted in Guest Post

How can we improve Module Evaluation Questionnaires?


by Junko Winch

Junko Winch is a Lecturer of Japanese at the School of Media, Arts and Humanities at the University of Sussex.

Module Evaluation Questionnaires (MEQs) are an important source of student feedback on teaching and learning. They are also often relied upon as evidence in promotion and teaching cases. However, in their current form they suffer from low response rates, reducing their usefulness and validity. Local practices have grown up to address the need for feedback, but they are inconsistent year on year and across the university. Existing research on teaching evaluations indicates that they are a source of bias and suggests careful design of MEQs.

The MEQ project

The MEQ project was undertaken to inform the University of Sussex’s policy and practice. The output was presented to the University’s Surveys Group to inform the University’s strategic direction.

Research Questions (RQ) and the corresponding methods

Literature review

The literature review revealed tutor- and student-related biases in MEQs. Bias is a source of unreliability, which in turn threatens validity. Validity and reliability are defined in various terms, but for the purposes of this report, validity is defined as “the general term most often used by researchers to judge quality or merit” (Gliner et al., 2009, 102) and reliability as the “consistency with which we measure something” (Robson, 2002, 101).

  • Tutor-related biases
  • Student-related biases

The findings and recommendations

1. The purpose of MEQs

MEQs have three purposes: institutional, teaching-related and academic promotion. To help reduce the bias effects outlined in the literature, full MEQs and other teaching-related data should be provided to promotion panels to avoid the cherry-picking of comments or data by applicants. For example, quantitative data such as the class average attendance rate and the average, minimum and maximum marks, as well as qualitative response analysis, would help build a more accurate overall picture of the class.

2. Analysis of MEQs

The students’ biases mentioned in the literature may make it difficult to rely on MEQs as the sole instrument. Furthermore, the current MEQ statements may confuse students due to their contents and wording.

The following points are suggested:

  • The purpose and goal of the questionnaire should be clearly stated. The purposes of the stakeholders should be taken into account when designing MEQs, to ensure that the intended purpose is achieved.
  • Some statements ask two questions in one. Students may not necessarily answer both, which affects validity.
  • Consideration should be given to words such as ‘satisfied’, which might have different connotations depending on cultures and individuals.

Recommendations

Carefully developed MEQs have potential to offer valuable insights to all stakeholders. The primary recommendation is to undertake a staff-student partnership to agree the purpose of the MEQs and co-design a revised instrument that meets the stated purpose.

Reflections

I engaged in this project as my CPD and appreciate that it has given me various opportunities. For example, I was given the opportunity to write this blog post. Furthermore, giving a presentation to the University Surveys Group reminded me of my doctoral viva, as the group included the Pro Vice Chancellor for Education and Students, the Associate Dean of the Business School and the Deputy Pro Vice Chancellor for Student Experience. When answering the group’s questions, I learned how difficult it is to meet the needs of different perspectives and cultures. For example, I was asked a question from a quality assurance perspective, which was unexpected, as I had written the report from a teaching-staff perspective. The group also included the Student Experience team, which made me consider yet another perspective on MEQs. Furthermore, working with my colleague from the Business School made me realise the cultural differences between that department/academic discipline and my own (the School of Media, Arts and Humanities). Looking back, this was a very valuable experience for me, and I would recommend that any colleagues who wish to join the DARE Scholarship programme undertake a similar project.

References

Carrell, S. E., & West, J. E. (2010). Does professor quality matter? Evidence from random assignment of students to professors. Journal of Political Economy, 118, 409–432.

Gliner, J. A., Morgan, G. A., & Leech, N. L. (2009). Research Methods in Applied Settings: An Integrated Approach to Design and Analysis. New York: Routledge.

Patrick, C. L. (2011). Student evaluations of teaching: Effects of the Big Five personality traits, grades and the validity hypothesis. Assessment & Evaluation in Higher Education, 36(2), 239-249.

Robson, C. (2002). Real World Research (2nd ed.). Oxford: Blackwell.


“The wheels firmly on the bus” Reflections on teaching a new module in the ‘new normal’

by Jeanette Ashton & Paolo Oprandi

About the authors

Jeanette Ashton is a Lecturer in Law and a Non-Practising solicitor, having joined the University of Sussex after 8 years at Brighton University.  She teaches Contract law, Equity and Trusts and Understanding Law.  Her research interests are legal education, whistleblowing and contract law.  She is Employability lead for the Law School and co-leads the CLOCK legal companion scheme.

Paolo Oprandi is a Doctor in Education with a colourful and varied academic background. He is currently working at the University of Sussex as a Senior Learning Technologist in the Technology Enhanced Learning team. He has an interest in technology in teaching, curriculum design and assessment and enhancing the student learning experience so that students can make the most of their years in education.


There is no doubt the 2020/21 academic year has presented educators with unprecedented challenges, and I cannot help feeling a sense of relief at having made it through the autumn semester without, as one of my colleagues put it, ‘the wheels having fallen off the bus’. I want to reflect on the effectiveness of the learning and teaching techniques I used in delivering Understanding Law, a module for first-year non-law students on the Legal Studies pathway, which I convened for the first time.

Introducing Flipped Learning

Dr Paolo Oprandi and I have explored the flipped learning approach. Many advantages have been found with this approach, including more students meeting and exceeding the learning outcomes (Lee & Choi 2019) and more students taking self-regulated approaches to learning (van Alten et al 2020). It is a curriculum design in which the content the student is expected to learn is presented before a face-to-face session via a recorded lecture presentation, an academic paper and/or any other medium that students can engage with in their own time. The face-to-face session is used to discuss, analyse and critique the learning material with the tutor present. It is called a flipped curriculum design because it sits in contrast with curricula that present the learning material during the face-to-face teaching session and confine opportunities for students to discuss, analyse and critique the material to homework tasks and reading groups where the tutor is not present. The major worry for academics taking this approach is whether students will engage with the learning material before the taught session.

Planning the module delivery

With the benefit of having attended various TEL training sessions during the summer, I decided to utilise Panopto to frontload preparation via short, pre-recorded lectures.  To facilitate communication and engagement I planned to use the Padlet tool and Zoom quizzes within live, follow-up lecture sessions.  When planning the module, for around 75 students, I did not know who would be delivering the four two-hour seminars, so I took the decision to run these as synchronous online sessions.   

The key objective of Understanding Law is to give students a solid foundation as to how law is made, interpreted and developed, an overview of human rights law, and the different types of public and private law, alongside a working understanding of the court system of England and Wales.  This equips them with the legal skills necessary for the subsequent modules on the Legal Studies pathway.  The usual form of delivery is via live two-hour lectures and two-hour seminars.  Particularly as this was my first time convening the module, and mindful that arrangements for students starting in September were uncertain, for each topic I decided to record 3 x half-hour content sessions, with a Padlet wall on Canvas for each, which I would then use to build a live dual mode session, complemented by an in-lecture quiz. 

Students’ views

To find out how the students felt about the flipped learning approach and the effectiveness of the Padlet tool, Paolo and I drew up a Qualtrics survey, asking how this approach compared with other modules without pre-recorded material, how effective they had found Padlet, and their thoughts on the live-session quizzes. The response rate was low, to date only around 15% of the cohort. This may be partly due to survey fatigue; perhaps if I had been able to see more students in person, they would have been more inclined to complete it.

Students’ views on pre-recorded content

Despite the limitations, however, the qualitative responses are interesting. On the pre-recorded content, a common response was that it enabled students to manage their time, work independently outside of the live lecture, pause and take notes, and replay parts which needed clarification.

In our study, students were able to access the pre-recorded materials ahead of the live session and the corresponding seminars. I had the peace of mind that, whatever the semester might bring, the content was there. Many of the students appreciated the recorded lectures saying,

 “[They gave] context to the required readings”

“I was able to take notes effectively at my own pace, since I was able to pause the recording and re-listen to parts I was unable to understand the first time I heard it.”

The live sessions that followed the pre-recorded lectures worked well. The first of these was online only, while room capacity issues were finalised; the remainder were dual mode, though in-person attendance dropped off towards the end of the module, with the majority of students choosing to access via Zoom. However, one respondent, whilst noting the benefits of doing the work in their own time, felt that the pre-recorded content ‘lacked a personal feel’.

Personal reflection and students’ views on Padlet

During the sessions I used Padlet both to ask questions about the students’ understanding and to let students pose questions back to me. The combination of building additional content into the session to address questions raised on the Padlet wall, Zoom polls to check understanding, and questions via the Zoom chat function facilitated engagement and connection, albeit in a different way from usual.

This was the first time I had used the Padlet tool. At the beginning of the module, students needed a lot of encouragement to post questions, and it took a little time to direct them away from emailing questions to me and towards posting on the Padlet wall. However, once they got used to this, compared with my experience of other modules, it proved more effective than the Canvas Discussion Board. Perhaps this is because it is more visual, sitting alongside the topic materials rather than being accessed via another window. It was also easier for students to see each other’s questions and to know that these would be covered in the live follow-up session, avoiding duplication. The students appreciated my efforts; one stated,

“[Padlet] was a really effective way of bringing up a question and making the seminar more useful”

On the live session quizzes, the responses were largely positive, for example,

 ‘I found it motivating to stay on task and up-to-date on lectures’

‘I was able to check my own understandings of some terms and the system of English Law’. 

One student stated that the questions were simplistic and that there wasn’t an incentive to get the answers correct, but hopefully they found the seminars, which required them to analyse a case on the theme of the legal topic area, more challenging. I deliberately made the quizzes anonymous to encourage students to answer without fear of getting the answer wrong, and participation in the lecture quizzes was always above 80%, which was encouraging.

The students were asked to rate the effectiveness of the Padlet tool.  Again, acknowledging the low response rate, most responses were ‘highly effective’ or ‘somewhat effective’, with a couple of respondents answering ‘neither effective nor ineffective’.  As I had set up the Padlet walls as anonymous, to encourage students to post questions no matter how minor they might be, it is impossible to say how many students engaged with the Padlet.  The anonymity was appreciated by at least one of the students who mentioned,

“[I found it] good for asking questions anonymously”

However, even if a student did not personally use a Padlet wall, it was effective in ensuring they were all able to access the same information, either through the live session, which they could also access afterwards, or by the answers on the Padlet wall itself, as per the Assessment Padlet wall below:

Screenshot of a Padlet wall with Q&As on assessment.
Assessment Padlet

Final reflections

My concluding thoughts on the learning and teaching experience of the Understanding Law module this semester are that I did not build the same connections with the students as in ‘normal’ times, with their solely in-person two-hour lectures and the rapport those bring. Despite that, I feel that the overall learning experience was positive and that the students were on the whole engaged; the feedback from the tutors running the seminars supported this.

I am looking forward to seeing the AB1 assessments and feel confident that the module has succeeded in getting the students where they need to be for the rest of their pathway programme. The Padlet tool has been effective in facilitating communication with the students and in giving them the opportunity to play an active role in shaping the live sessions (Fuchs, 2014). I have already designed the Padlet walls for my two core spring law modules, this time giving the students the facility to edit and respond to posts to help each other, which I hope will facilitate collaboration. Now just to plan the rest…

References

van Alten, D.C., Phielix, C., Janssen, J. and Kester, L., 2020. Self-regulated learning support in flipped learning videos enhances learning outcomes. Computers & Education, 158, p.104000.

Fuchs, B., 2014. The writing is on the wall: using Padlet for whole-class engagement. LOEX Quarterly, 40(4), p.7.

Lee, J. and Choi, H., 2019. Rethinking the flipped learning pre‐class: Its influence on the success of flipped learning and related factors. British Journal of Educational Technology, 50(2), pp.934-945.

Posted in Guest Post

Guest Post – How to Create Connection in Challenging Times by Jenni Rose

Two stylised hands reaching to one another, each made up of a word cloud with words such as connectors, unite and work with.
Jenni Rose profile picture

Jenni Rose,
Lecturer,
University of Manchester.

Jenni qualified as an Accountant with the ICAEW while working in Audit with KPMG in 2008. The main focus of her teaching is auditing, financial reporting and financial statement analysis, as well as teaching on the MBA at the University. Much of the strength of her teaching comes from developing innovative and creative teaching and learning techniques to increase student engagement, including using the flipped classroom approach and researching efficient teaching excellence.
Profile page on University of Manchester website.

How to Create Connection in Challenging Times

There is no doubt that it is more difficult than usual to create connection – much of our lives are lived online at the moment and we are unable to be in large groups of strangers or with those we love. You might be feeling anxious about missing important occasions, or unsure how to create the connections we need as social creatures.

To frame your reflection, you can first focus on starting where you are, then on using what you do have, and finally on doing what you can.

To ‘start where you are’, consider what connection means to you – does it mean deep, lifelong friendship, or the smaller connections we have with strangers in random situations? What kind of connections do you value? How do you feel after you’ve made a good connection with someone? There is plenty of research on the benefits of connection – a 50% increased chance of longevity (Holt-Lunstad, Smith and Layton, 2010), stronger gene expression for immunity (Cole, 1996), lower levels of anxiety and depression (Seppala, 2014), and higher self-esteem and greater empathy (Seppala, 2014) – but what are your own personal benefits of strong connections?

Brené Brown is famous for talking about the power of vulnerability: the sense of connection you feel to someone is often in line with the vulnerability you show to them. This takes courage and risks being hurt, but can really help with connection.

A final benefit for those who are teaching or learning (or both!) comes from the theory that learning is a social and cultural process (Cole, 1996). Besides the obvious benefits of keeping track of deadlines and the company of someone going through the same process as you, there are also benefits of accelerated learning for students who have strong connections to peers (Gowing, 2019).

So, practically, what can we use to feel connection? First, we need to take a look at what we have control over – we only have control over our behaviour, our reactions and what we focus on. Have a look at this video from our webinar, which talks about the circle of control.

Much of our life at the moment is conducted on Zoom, both professionally and personally, but how can we best use this? I’d say the careful use of video is important. In class, speaking as a lecturer, I can only use my teaching skills on those I can see on video or, to a lesser extent, those who talk – and that visibility increases motivation and accelerates learning. On the other hand, I find a better connection when I’m speaking 1:1 to someone over the phone, as I’m less self-conscious about what I look like. Work out what feels good for you.

Emma Seppala is a researcher focused on connection who emphasises this. She points out that the benefits of connection are closely linked to your own subjective sense of connection. Therefore you need to work out what gives you the strongest sense of connection.

Here are some ways of “doing what you can”, drawn from my own experience and from ideas shared by those who came to the webinar:

  • Look back over old photos and see which evoke feelings of connection
  • Consider how you feel in a room of strangers (do you feel connected or disconnected)
  • Take a walk without your phone – perhaps this will help you feel connected to nature, even though you are on your own
  • Try talking to strangers or shopkeepers (a famously English way to open any conversation is to discuss the weather)
  • Smile at random people and smile at yourself in the mirror
  • Find others who may share a common interest
  • Try different ways of using social media, e.g. how you browse, what you post and what is important to you
  • Try to rekindle an old friendship where you’ve felt a connection

To conclude, if connection is important to you then spend some time reflecting on what gives you the most satisfying feeling of connection and seek it out. Start where you are, motivate yourself by visualising the personal benefits for you of increasing connection in your life. Use what you have, technology-wise or technology-free, whichever gives you the best connection given the circumstance and how you feel. Finally, do what you can; pick one small action you will do today to increase the feeling of connection in your life and immediately start to benefit from the positivity you have created.

References

Dudley-Marling C. (2012) Social Construction of Learning. In: Seel N.M. (eds) Encyclopedia of the Sciences of Learning. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-1428-6_96

Gowing, A. (2019). Peer-peer relationships: A key factor in enhancing school connectedness and belonging. Educational and Child Psychology, 36(2), 64-77.

Holt-Lunstad, J., Smith, T.B. and Layton, J.B. (2010). Social Relationships and Mortality Risk: A Meta-analytic Review. PLoS Med, 7(7): e1000316. https://doi.org/10.1371/journal.pmed.1000316

Seppala, E. (2020). https://emmaseppala.com/connect-to-thrive-social-connection-improves-health-well-being-longevity/ Accessed 1st November 2020.

Posted in Guest Post

Three tips for enhancing students’ engagement with feedback

decorative image - two women discussing over a paper.

In my many years of experience as an educator, I have spent innumerable hours writing feedback on students’ work. I always thought this was one of the most important elements of my role as a teacher, where I had the opportunity to provide personalised guidance to students and connect with them on an individual basis.

But the truth is that not all students engage with feedback and perhaps more crucially not all are able to apply the feedback to the next assessment.

Why does this happen? In a recent discussion I had with students, one of them mentioned (and I paraphrase): “You don’t need feedback for exams, if you pass them”. So is it clear to students what the purpose of feedback is?

According to Carless and Boud (2018) feedback is

“the process through which learners make sense of information from various sources and use it to enhance their work or learning strategies”.

As such, feedback is not something static but a dynamic process that aims to shape an individual’s performance. So what can we do to help students engage with the feedback and enhance their feedback literacy?

Tip 1: Know the value of the feedback you give, make it explicit and clearly communicate that to the students in advance.

First and foremost, as feedback providers, we need to be able to clearly identify the value of the feedback we give. Can the students apply that feedback to another piece of work, would it help them to improve their performance? And if so, do we know when and where the students will need to apply that feedback?

Or are the things that the students didn’t quite get right in this piece of work only relevant for that module and that assessment? In that case, was the comment from the student above actually valid? They passed the exam, and the rest is history now.

If we can ourselves clarify and identify the value of the feedback we give, we can then help the students make those connections as well.

Tip 2: Provide opportunities for giving, receiving and analysing feedback.

But even if we can make this clear to students, how can we ensure that the students can break down the feedback, analyse it, and devise their own strategies for improving their performance?

Here is where developing students’ feedback literacy comes into play. Feedback literacy has been defined as “the understandings, capacities and dispositions needed” (Carless and Boud, 2018) in order to be able to respond to feedback. We need to guide students in the process of understanding the feedback, as well as enabling them to internalise it and become self-regulated learners (Nicol and Macfarlane-Dick, 2006).

Several strategies are available to achieve this.

Firstly, by clearly defining the way that students’ work is assessed, through discussion and analysis of marking criteria early on.

Secondly, by offering opportunities to students to compare their work with the work of others (for instance through exemplars, or peer marking activities). By applying the marking criteria themselves, students will be able to identify what areas they need to work on and how to improve their own work.

Finally, by enabling dialogue and discussions (among peers or between students-staff), we can share examples of strategies on how to apply feedback and what actions need to be taken for further improvement.

If you have used any of these strategies in your teaching, feel free to share your experience in the comments.

Tip 3: Provide a supportive well-planned learning environment

Although there are strategies to support students in developing their feedback literacy skills, Gravett (2020) argues that feedback literacy is better “conceptualised as a complex breadth of dynamic, nuanced, situated feedback literacies” through a sociomaterial lens, where factors such as space, time and power relations come into play. These factors will vary significantly among students, especially when considering how diverse and heterogeneous the student population is.

It is important, therefore, that faculty also aim to provide a supportive and well-planned learning environment that takes into consideration the medium, the space and the time in which feedback is given and analysed by the students, as well as who is providing the feedback. Establishing a supportive, caring relationship with students allows them to be receptive to feedback rather than scared of it.

What strategies could we use for that? What are the limitations? Add your thoughts in the comments.

In summary, what we need to remember as feedback providers is that our students will be receiving feedback in a variety of forms and from a variety of sources; for them to engage with it, they will need to identify its value, be able to decipher it, and be in the right environment to act on it. We may not be able to address all of these factors directly, but considering them when planning our teaching may help us navigate the complexities of students’ engagement with feedback.

References:

Carless, D. and Boud, D. (2018) The development of student feedback literacy: enabling uptake of feedback, Assessment & Evaluation in Higher Education, 43:8, 1315-1325, DOI: 10.1080/02602938.2018.1463354

Gravett, K. (2020) Feedback literacies as sociomaterial practice, Critical Studies in Education, DOI: 10.1080/17508487.2020.1747099

Nicol, D.J., and Macfarlane‐Dick, D. (2006) Formative assessment and self‐regulated learning: a model and seven principles of good feedback practice, Studies in Higher Education, 31:2, 199-218, DOI: 10.1080/03075070600572090

Disclaimer: This is an opinion-based post and is not representative of views held by the University of Sussex.  

Posted in Uncategorised

Introduction to the University of Sussex Scholarship Framework

Welcome to the DARE to Transform Blog.

The blog is one of the University’s outlets for sharing the scholarship of teaching and learning. It forms part of the activities of the Development, Advancement and Recognition in Education (DARE) to Transform Network, established in 2019 to serve as a scholarship and pedagogical research incubator through a community of practice and a range of supporting initiatives that advance teaching, learning and assessment and encourage educational experimentation and enquiry.

Boyer’s groundbreaking work, recognising the different forms of scholarship within the academy and how they intersect (Boyer, 1990), led to the development of the Scholarship of Teaching and Learning (SoTL) as a discipline in its own right. Scholarship should also be viewed as a proactive concept, such that knowledge is actively and continually developed, applied and improved, and collaboratively shared with the wider community. This increasing focus on SoTL has led to the development of Education and Scholarship career frameworks within Higher Education, including at the University of Sussex. The framework for scholarship that Sussex has adopted evolves as colleagues progress through different career levels, moving from a focus on developing the academic’s own knowledge and practice to influencing and leading the field. The Sussex model of scholarship builds on the DART model (Kern et al., 2015), adding mentoring and leadership into the development process. This approach recognises that the curriculum, systems of assessment, student experience initiatives, widening participation activities and community engagement endeavours represent key avenues for both scholarship and pedagogic research. Social media channels, digital repositories, online journals and other digital platforms provide multiple mechanisms for disseminating the outcomes of scholarly activities into society, culture, professional networks, research communities and the wider HE context.

DARE to Transform comprises a number of different streams of activity to support colleagues in developing their scholarship, including a mentoring scheme, invited seminars, case studies featuring scholarship stories and internal and external scholarship opportunities.

The DARE to Transform blog has been established as an open, online publishing outlet for both early career and experienced colleagues across the University to share and disseminate scholarly outputs. These may address a range of thematic areas and take the form of reflective posts, opinion pieces, educational resources or initial findings from action research. Should you wish to write for this blog or get involved in any of the activities associated with DARE to Transform, please email DARE@sussex.ac.uk

You can find further information relating to the blog scope and style guidance here.

DARE is coordinated by Dr Susan Smith (Business School) and Dr David Walker (Student Experience) with support from colleagues across the institution.

References:

Boyer, E. L. (1990). Scholarship reconsidered: Priorities of the professoriate. Princeton, NJ: Princeton University Press.

Kern, B., Mettetal, G., Dixson, M., & Morgan, R. K. (2015). The role of SoTL in the academy: Upon the 25th anniversary of Boyer’s Scholarship Reconsidered. Journal of the Scholarship of Teaching and Learning, 1-14.

Posted in Uncategorised