Type of Powerful Assessment - Other


  • Associated co-curricular activities

    The results of these and evidence of their success can be included in an ePortfolio.


  • Enabling Program - Active involvement of both students and tutors in assuring assessment quality

Before the course starts, tutors meet online to go through the assignment task, requirements and rubrics, and to reflect on past student challenges with the assignment and on particular skills that need highlighting to their current students [Calibration]. Tutors describe the assessment task, requirements and rubric to all students in week 1. Three weeks before the assignment's due date, two students volunteer to explain the assessment item the following week. Two weeks before the due date, the two volunteers recap the assessment task, its requirements and the grading rubric to all students in their class, ensuring students hear the assignment explained in their peers' own language and words.

Before submitting their assignments, students mark an exemplar against the marking rubric in pairs and report back on the grade and comments they would give. They then compare their judgements with the grade and comments the exemplar actually received, which is then critically examined. Tutors practise giving written feedback by moderating their written feedback comments on sample de-identified assignments in small-group moderation meetings, ensuring the feedback aligns with task instructions and that everyone shares the same understanding of the requirements and rubrics for each grade [Moderation]. When marking assignments, tutors outline two or three specific things for students to improve on and model exactly what is required. HDs and failures are double marked, and spot checks are conducted by the unit assessor [Quality review].

The benefits for tutors are that they build team spirit and collegiality, which mitigates the isolation of working on different campuses, and that they participate in calibration and moderation as a best-practice assessment approach. This reassures tutors that they are 'on the right page' [Validation and Acknowledgement]. In addition, they learn from students and from each other throughout the process; they can instantly see how their comments have been received and applied in practice, and can then modify or improve their feedback for future marking.

The benefits for students are that they practise critical skills in evaluation and receive further feedback confirming they have fully understood how to apply it, and that they learn a process they can then employ to enhance their performance on the next assignment. The process can extend to real-world learning and practice, whereby students come to understand the power of giving and receiving constructive feedback to improve work practices. The student voice is also present in the feedback process and is taken on board by tutors.

Southern Cross University, Australia
Contact: Suzi Hellmundt (suzi.hellmundt@scu.edu.au)
Julia Doyle (julia.doyle@scu.edu.au)


  • MakerWorks at MIT - At: http://makerworks.mit.edu/

    The mission of MakerWorks is to foster a student community in a hands-on learning environment where modeling, prototyping, and validation resources coexist. MakerWorks provides space and equipment for a community of innovators who focus on deterministic design and problem solving. It is a student-run makerspace where students, faculty, and staff are free to work on any project they choose, and it offers prediction, prototyping, and validation tools to support a wide variety of projects. More at our Member Wiki. Accepted projects: class, research, or personal. Free to use, with charges for some machines and materials.


  • MIT Hackers - At: http://hacks.mit.edu/

     The IHTFP Gallery is dedicated to documenting the history of hacking at MIT.

The word hack at MIT usually refers to a clever, benign, and "ethical" prank or practical joke, which is both challenging for the perpetrators and amusing to the MIT community (and sometimes even the rest of the world!). Note that this has nothing to do with computer (or phone) hacking (which we call "cracking").

See also: https://en.wikipedia.org/wiki/Hacks_at_the_Massachusetts_Institute_of_Technology

Over time, the term has been generalized to describe anybody who possesses great technical proficiency in any particular skill, usually combined with an offbeat sense of humor. Like most art exhibitions, the great majority of hacks are temporary installations; most are removed within a day or so by MIT Physical Plant, the MIT Confined Space Rescue Team (CSRT), or occasionally by the hackers themselves. It is a traditional courtesy to leave a note or even engineering drawings behind, as an aid to the safe de-installation of a hack. MIT hacks can push the limits of technical skill, and sometimes fail in spite of meticulous planning. Even these engineering failures have been acknowledged to have educational value, and sometimes a follow-up attempt succeeds. One hack on the Great Dome is documented as having finally succeeded on the fourth try, after a complete re-engineering of both the installed artifact and the installation method.

(More About the IHTFP Gallery and FAQ.)


  • Programmatic assessment (PA) in medicine: Traditional assessments focus on performance at infrequent single time points (i.e. examinations or tests) and encourage 'binging and purging' of knowledge. Students pass or fail on a limited number of high-stakes assessments. The results of this approach are learning that focuses on memorization rather than transfer of knowledge, and a culture that values marks over feedback. PA uses multiple assessment modalities: online quizzes as part of each week's case-based learning; written and/or online tests every few weeks covering the content of each subsection of the program; cumulative progress tests; direct observation of students performing history-taking and physical examination; simulation; peer feedback; supervisor feedback; students' written reflections on areas of their own strength and weakness; and other assessment exercises as deemed appropriate.

    Performance on these assessments is aggregated on the basis of the competencies needed by practicing physicians, rather than by course or method, and students review their aggregated assessment data with a faculty mentor. PA is being implemented as part of the new MD curriculum in the 2016-2017 school year at the University.

(University of Toronto)