Blackboard Original and Blackboard Ultra – Key Differences

As we enter the first wave of transitioning to Blackboard Ultra, one of the most common questions I hear is ‘How different is it?’. Although it comes from the same provider, Blackboard Ultra is essentially a different VLE rather than an upgrade of Original. While a lot of the functionality overlaps, what makes it a separate product to me are the differences in the underlying design principles of the platform. Below I outline the three changes that, from my perspective, are most essential:

Content organisation

Learning Module in Ultra

The navigation of content in Blackboard Original was dictated by the left-hand menu. In my experience, this ‘sectioned’ structure frequently led to Blackboard being used as a repository of resources rather than an actual learning environment.

Blackboard Ultra takes a different approach, which to me resembles the design of platforms such as FutureLearn or Coursera, focused on leading a student through a sequential learning journey. As there is no left-hand menu, the content has to be organised on one main page using Folders or Learning Modules.

Learning Modules exist in Blackboard Original, although they are rather underused, which is a shame as they facilitate a sequential organisation of resources and activities. In Blackboard Ultra, Learning Modules are at the forefront of the design. Their added value is providing courses with a more visually appealing interface; for those who have used Moodle, Learning Modules in Ultra resemble the Tiles format. There are some planned improvements to the navigation of Learning Modules which will further enhance their usability.

Read more about Ultra Learning Modules

Documents instead of Items (or rather Blank Pages)

Document in Ultra

Instead of Items, Ultra offers Documents as the key content creation tool. In terms of functionality, Documents resemble Original’s Blank Pages: they enable you to add a collection of different elements to one page, including text and multimedia as well as attachments which, unlike in Original, display directly within the Document. This is another change which to me emphasises the continuous learning journey and sequential engagement with content. One important consideration when using Ultra Documents is paying attention to accessibility principles such as titles, headings and alternative text; otherwise they can easily become a hindrance to screen reader users.

Read more about Ultra Documents

Gradebook (Grade Centre)

The last and possibly the most substantial change is the Gradebook, the equivalent of Original’s Grade Centre. To me, the idea behind the changes in this area is to enable a clearer overview of the different progression metrics and to support monitoring student engagement, in addition to performing administrative processes. The Original Grade Centre had only one view, with rows for students and columns corresponding to submission points, calculations and notes.

The Gradebook has three views:

  • Markable items – a quick view of all graded items, their due dates, and submissions awaiting marking

Markable Items View of the Gradebook in Ultra

  • Marks – the closest to the Grade Centre; includes all submission point columns, along with any added calculation columns, against a list of students

Marks View of the Gradebook in Ultra

  • Students – a quick overview of student activity, including the last access date and overall mark (you can then click on a student to see more details on each individual)

Students View of the Gradebook in Ultra

Read more about the Ultra Gradebook


In the upcoming weeks, I will be spotlighting other changes and new functionalities. In the meantime, if you would like to see a more in-depth comparison of all the differences, these resources may be helpful:

Using Rubrics to Support Self-Regulated Learning

Prefer video over text? Watch this presentation instead (11 min)


Rubrics can be useful in enhancing marking consistency (Jönsson & Svingby, 2007), but as argued by Brookhart (2018, p.10): ‘The value of a rubric lies in its formative potential (Panadero and Jonsson, 2013), where the same tool that students can use to learn and monitor their learning is then used for grading and final evaluation by instructors.’

A recent meta-analysis conducted by Panadero, Jonsson, Pinedo and Fernández-Castilla (2023) revealed a positive and moderate effect of rubrics on student academic performance, self-regulation and self-efficacy. Rubrics have been shown to support students’ ability to establish more accurate goals, monitor progress and reduce cognitive load, thus contributing to self-regulated learning (Brookhart & Chen, 2015; Panadero & Jonsson, 2013; Krebs et al., 2022; Reddy & Andrade, 2010). As argued by the meta-analysis’ authors: ‘By making criteria, performance levels, and (when relevant) scoring strategies explicit, these may become objects of action and reflection themselves (i.e., students can use them to regulate their learning) helping students to improve their learning via self and peer assessment (Nicol, 2021; Panadero et al., 2019). This interpretation is supported by students, who are generally positive about being provided with rubrics and claim to use the rubrics to better understand (and meet) expectations (e.g., Andrade & Du, 2005; Jonsson, 2014; Reynolds-Keefer, 2010).’ (Panadero, Jonsson, Pinedo, & Fernández-Castilla, 2023, p.113).

This ‘formative potential’ of rubrics differentiates them from quality assurance tools such as the University Generic Marking Criteria. The effectiveness of rubrics as a learning tool lies in clear, meaningful and specific criteria (Brookhart, 2013; Nitko & Brookhart, 2007; Popham, 2000; Suskie, 2009) and performance level descriptors referring to observable and measurable qualities which ‘help students envision where they are in their learning and where they should go next’ (Brookhart, 2018, p.2). As defined by Brookhart (2018, p.1):

‘A rubric articulates expectations for student work by listing criteria for the work and performance level descriptions across a continuum of quality.’

The following steps, designed by van Leusen (2013) and adapted by Arcuria and Chaaban (2019), offer a useful framework for designing rubrics:

  1. What knowledge and skills is the assignment designed to assess? (Learning Objective)
  2. What observable criteria represent that knowledge and skills? (Performance Criteria)
  3. How can you best divide those criteria to represent distinct and meaningful levels of student performance? (Performance Levels)
  4. What observable characteristics of students’ work differentiate among the performance levels for each criterion? (Performance Level Descriptors)

The University of Tasmania provides three excellent examples of rubrics worth looking at – Writing Standards Descriptors (for rubrics)

A screenshot of the 'Complete Rubrics - Downloadable' tab


Using Rubrics in Blackboard

Rubrics are available in both Turnitin and Blackboard Assignments. They can also be associated with other gradable items in Blackboard, such as essay, short answer and file response test questions, blogs, journals, wikis, and discussion forums and threads.

Turnitin Rubrics

Rubric types

  1. Creating the rubric (video – 3:01min)
  2. Marking with the rubric (video – 3:35min)
  3. What will the student see

Blackboard Rubrics

Rubric types

  1. Creating the rubric (video – 1:07 min)
  2. Associating rubrics with graded items (video – 1:29 min)
  3. Marking with the rubric
  4. What will the student see

In Blackboard Ultra, you will have the option to generate a rubric based on the course content using the AI Design Assistant.

 

References:

Arcuria, P., & Chaaban, M. (2019). Best practices for designing effective rubrics. ASU TeachOnline.

Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: planning, implementing, and improving assessment in higher education. Jossey-Bass.

Brookhart, S. M. (2013). How to create and use rubrics for formative assessment and grading. Association for Supervision & Curriculum Development.

Brookhart, S. M. (2018). Appropriate criteria: Key to effective rubrics. Frontiers in Education, 3, 22. https://doi.org/10.3389/feduc.2018.00022

Brookhart, S. M., & Chen, F. (2015). The quality and effectiveness of descriptive rubrics. Educational Review, 67(3), 343-368. https://doi.org/10.1080/00131911.2014.929565

Campbell, A. (2005). Application of ICT and rubrics to the assessment process where professional judgement is involved: the features of an e‐marking tool. Assessment & Evaluation in Higher Education, 30(5), 529-537. https://doi.org/10.1080/02602930500187055

Jönsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130–144. https://doi.org/10.1016/j.edurev.2007.05.002

Krebs, R., Rothstein, B., & Roelle, J. (2022). Rubrics enhance accuracy and reduce cognitive load in self-assessment. Metacognition and Learning, 17(2), 627-650. https://doi.org/10.1007/s11409-022-09302-1

Nicol, D. (2021). The power of internal feedback: Exploiting natural comparison processes. Assessment & Evaluation in Higher Education, 46(5), 756-778. https://doi.org/10.1080/02602938.2020.1823314

Nitko, A. J., & Brookhart, S. M. (2007). Educational Assessment of Students (5th ed.). Pearson Education.

Panadero, E., Broadbent, J., Boud, D., & Lodge, J. M. (2019). Using formative assessment to influence self- and co-regulated learning: the role of evaluative judgement. European Journal of Psychology of Education, 34, 535-557. https://doi.org/10.1007/s10212-018-0407-8

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9, 129-144. https://doi.org/10.1016/j.edurev.2013.01.002

Panadero, E., Jonsson, A., Pinedo, L., & Fernández-Castilla, B. (2023). Effects of rubrics on academic performance, self-regulated learning, and self-efficacy: a meta-analytic review. Educational Psychology Review, 35(4), 113. https://doi.org/10.1007/s10648-023-09823-4

Popham, W. J. (2000). Modern educational measurement: Practical guidelines for educational leaders (3rd ed.). Allyn and Bacon.

Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448. https://doi.org/10.1080/02602930902862859

Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). Jossey-Bass.

Wolf, K., & Stevens, E. (2007). The role of rubrics in advancing and assessing student learning. Journal of Effective Teaching, 7(1), 3-14.

van Leusen, P. (2013). Assessments with rubrics. ASU TeachOnline. https://teachonline.asu.edu/2013/08/assessments-with-rubrics/

Using Turnitin Similarity Reports as Formative Feedback

Prefer video over text? Watch this presentation instead: Using Turnitin Similarity Reports as Feedback (10 min)


How do you use Turnitin Similarity Reports in your teaching? The Turnitin similarity checker is commonly seen as a ‘plagiarism detector’, and although it can be a useful indicator of whether or not unacceptable academic practices have occurred, reducing its role to a purely punitive one is a huge missed opportunity.

If we look at the resources from the tool’s provider, their recommendations advocate for a more pedagogical approach: ‘One of the most effective ways to utilise the Turnitin Similarity Report is to guide students to make improvements in their writing and research practices.’ (Understanding the Similarity Report: An educator guide | Turnitin). They even provide a student guide to understanding the Similarity Report, which starts with: ‘Continue reading to better understand what this information means and how you can use it to improve your writing.’ (Understanding the Similarity Report: A student guide | Turnitin).

As suggested by Mphahlele & McKenna (2019, p. 3): ‘Much of the literature in favour of such software argues that its most effective use is as a pedagogical tool, rather than a policing one. It is shown to add an enormously positive aspect to the development of student writing when used in such a way.’ Interestingly, some experimental studies have shown that using Similarity Reports along with clear and specific guidelines and paraphrasing exercises reduces plagiarism, suggesting that Similarity Reports could be used for its prevention rather than just its detection. The research indicates that students are more likely to internalise guidelines on acceptable academic practices when these refer explicitly to their own work (Barrett & Malcolm, 2006). As discussed by Davis and Carroll (2009, p.6): ‘Many students seemed to have a kind of ‘eureka’ moment, when faced with the onscreen evidence of how they had used sources, where they understood more fully about issues related to academic integrity, as they connected to their own work.’ Using Similarity Reports as the reference point, students can engage in the comparison processes that underpin the generation of valuable internal feedback (Nicol, 2021).

The most effective ways of utilising Turnitin Similarity Reports seem to involve an opportunity for students to discuss their reports with staff. Where this is not possible on an individual basis, Davis and Carroll (2009, p.67) suggested small group tutorials with examples highlighting typical mistakes. If a fully self-directed design is applied, clear guidelines are key (Rolfe, 2011). It has also been suggested that limiting the similarity report generation to one attempt may minimise the risk of system evasion (McKeever, 2006). A potential implementation may look like this:

A clearly labelled folder on Blackboard, included in the Assessment and Feedback area:

A folder on Blackboard

Step-by-step instructions with links to further resources:

A crucial factor in the effectiveness of this approach is clear signposting from the tutor. Many other institutions in the sector seem to follow similar practices.

If you are interested in adopting this approach, or would like to share your own way of using Similarity Reports, please get in touch: ania.udalowska@bristol.ac.uk. To conclude, I would like to second this quote from Kaktiņš (2019, p. 432):

‘Overall, it appears that the use of text-matching software in its most authentic iteration is most effectively implemented as part of a holistic approach (Lee & Edwards, 2013) focused on formative learning and the development of students’ responsibility for their work, plus disciplinary measures where required.’

 

References:

Barrett, R., & Malcolm, J. (2006). Embedding plagiarism education in the assessment process. International Journal for Educational Integrity, 2(1), 38-45. https://doi.org/10.21913/IJEI.v2i1.23

Davis, M., & Carroll, J. (2009). Formative feedback within plagiarism education: Is there a role for text-matching software? International Journal for Educational Integrity, 5(2), 58-70. https://doi.org/10.21913/IJEI.v5i2.614

McKeever, L. (2006). Online plagiarism detection services—saviour or scourge? Assessment & Evaluation in Higher Education, 31(2), 155–165. https://doi.org/10.1080/02602930500262460

Mphahlele, A., & McKenna, S. (2019). The use of Turnitin in the higher education sector: Decoding the myth. Assessment & Evaluation in Higher Education, 44(7), 1079–1089. https://doi.org/10.1080/02602938.2019.1573971

Nicol, D. (2021). The power of internal feedback: Exploiting natural comparison processes. Assessment & Evaluation in Higher Education, 46(5), 756–778. https://doi.org/10.1080/02602938.2020.1823314

Rolfe, V. (2011). Can Turnitin be used to provide instant formative feedback? British Journal of Educational Technology, 42(4), 701–710. https://doi.org/10.1111/j.1467-8535.2010.01091.x

Turnitin. (n.d.). Understanding the Similarity Report: A student guide. Turnitin. https://www.turnitin.com/papers/understanding-the-turnitin-similarity-report-student-guide

Turnitin. (n.d.). Understanding the Similarity Report: An educator guide. Turnitin. https://www.turnitin.com/papers/understanding-the-turnitin-similarity-report-instructor-guide

 

Engineering Feedback Practice Sharing Event – Resources

The Engineering Feedback Event was attended by 25 members of staff. The day included some excellent presentations and lively discussions. As part of the student-led workshop, attendees had an opportunity to build models of effective feedback practices using Lego.

Lego model

Thank you to all attendees and speakers!


Here is a list of recordings and other resources from the sessions (to access the resources please log in with your UoB credentials):

Keynote: Moving Feedback Forwards in Higher Education (Professor Naomi Winstone, The Surrey Assessment and Learning Lab)

Closing the Feedback Loop: Insights from Engineering Students​ (Ana Beatriz Quelhas O. e Moreira​, Abdalrahim Naser​, Gen Kawaguchi, John Whittock)

Approaches to Peer Feedback (Dr Joel Ross)

Using Turnitin Similarity Reports as Feedback (Ania Udalowska) 

Video Feedback (Dr Joel Ross) 

Using Blackboard and Turnitin rubrics (Ania Udalowska)

Engineering Feedback Practice Sharing Event – 27th February 2024

On the 27th of February, we will be holding a sharing event, open to students and staff, focusing on exploring good feedback practices. This event will be an opportunity to share examples of excellent feedback practices from engineering colleagues, hear what works for students and learn more about what research says about feedback.

Book onto the event

A graphic showing one individual giving feedback to the other.

Programme: 

09:30: Welcome

09:45: Approaches to Peer Feedback (Dr Joel Ross)

10:15: Using Turnitin Similarity Reports as Feedback (Ania Udalowska) 

10:30: Closing the Feedback Loop: Insights from Engineering Students​ (Ana Beatriz Quelhas O. e Moreira​, Abdalrahim Naser​, Gen Kawaguchi, John Whittock)

11:00: Coffee break

11:15: Approaches to exam feedback (Professor Lucy Berthoud)

11:45: Video & Audio Feedback (Dr Joel Ross) 

12:15: Using Blackboard and Turnitin rubrics (Ania Udalowska)

12:30: Lunch

13:30: Keynote: Moving Feedback Forwards in Higher Education (Professor Naomi Winstone, The Surrey Assessment and Learning Lab) – Read more about the keynote below 

14:30: Student-led Workshop: Revisiting feedback practices (Ana Beatriz Quelhas O. e Moreira​, Gen Kawaguchi, Eliana Garcia Bustos)

15:30: Close


Keynote: Moving feedback forwards in higher education – Professor Naomi Winstone PFHEA NTF, The Surrey Assessment and Learning Lab

Naomi Winstone profile

Naomi is a cognitive psychologist specialising in the processing and impact of instructional feedback and the influence of dominant discourses of assessment and feedback in policy and practice on the positioning of educators and students in feedback processes. Naomi is Professor of Educational Psychology and Director of the Surrey Institute of Education at the University of Surrey, UK. She is also an Honorary Professor in the Centre for Research in Assessment and Digital Learning (CRADLE) at Deakin University, Australia. Naomi is a Principal Fellow of the Higher Education Academy and a UK National Teaching Fellow.

Engineering Digital Education Support

Since September 2023, Engineering has had its own dedicated Digital Education Officer, and it’s me!

Hello!

What’s your remit? 

Broadly speaking, my role is about supporting students and staff in using technology for learning and teaching. This can mean both helping to resolve technical queries with digital education tools such as Blackboard, Turnitin, and Re/Play, and supporting the design of online and blended teaching activities. I also run training for both individuals and teams (Engineering Digital Education Provision) and coordinate various projects. Some of the things I have worked on since I was appointed include:

  • Creating a list of digital education priorities for engineering, based on a staff survey, informal conversations, and the NSS data.  
  • Collating key information regarding the digital exam provision and needs. 
  • Writing a report outlining a rationale for making Turnitin similarity reports visible to students.
  • Implementing some improvements to the engineering online programme – using groups, adaptive release, and Grade Centre smart views.

Are you a part of the DEO? 

No, my role sits within the Faculty Education Team, although I work closely with the DEO, often acting as a link between engineering staff and the central digital education team.

How can I contact you? 

You are welcome to get in touch either via my e-mail: ania.udalowska@bristol.ac.uk or the Engineering Digital Education Support Mailbox: eng-digitaleducation@bristol.ac.uk.