This six-minute screencast (6'06") is a top-level set of guidelines for developing effective teaching materials. For some, it may feel like going over well-worn ground; for others, it may provide pause for thought. Rationalising what constitutes learning materials seems superficially straightforward, but when one considers the different institutional interpretations of what represents 'direct' learning versus 'self-directed' learning, it soon becomes apparent that judgement is needed even here.
These resources from 2013-2017 are being shared to support colleagues new to teaching online in the face of the COVID-19 pandemic.
Tertiary providers are increasingly expected to deliver 'work-ready' graduates. This is a challenge when we must acknowledge that many graduates will begin a career, in one year's time or in three, that does not exist today (Susskind & Susskind, 2017). Identifying the competency frameworks within our disciplines and those of our professional colleagues is a good place to start (Atkinson, 2015). We can then identify a range of graduate attributes that will underpin our programme outcomes and inform the development of real-world assessment.
Challenging Our Assumptions
It is critically important to challenge our assumptions whenever we contemplate introducing any new courses or programmes into our portfolios.
Whether you are designing an individual course or an entire programme, it is important to ‘future-proof’ it to the greatest extent possible, and to ensure that it is consistent and logical. If you see individual courses as self-contained ‘units of learning’ with their own outcomes and assessment, you risk creating problems later on with course substitutions, updating and student continuity.
It is important to question all of our assumptions about the context into which our learning design is intended to fit. However intimately you may feel you know your learning context, the chances are there will be some contextual evolution. Take the time to go through these questions, if only to confirm your assumptions.
Regardless of whether you are charged with designing an entire degree award, a programme or an individual course, you will be doing so within an institutional context. Validating learning is a responsibility of approved degree-awarding institutions in the UK and in many other countries too, although some countries have regional or national validation processes (www.inqaahe.org). Regulations vary marginally between contexts, but they are remarkably consistent in their aspirations despite different levels of detail being required.
You should design your course or programme with reference to the academic regulations, policies and practices implemented by your institution. However, it is important to avoid copying existing learning designs on the assumption that they will automatically be suitable for validation. The regulatory framework also evolves, adjusting over time in response to the dynamic dialogue between innovative course designers and those responsible for institutional quality assurance. Never copy and paste!
You might want to convene a course team and ask:
Course / Module
What credit weighting is my course expected to carry?
At what Level is my course intended to be taught?
Is the same course intended to be assessed at different levels?
Where in the programme sequence is my course intended to appear?
Is my course intended to be flexible enough to be aligned to multiple programmes?
Is my Programme divided into Stages, and are there multiple exit points?
What are the naming conventions within my Programme?
Where does the academic management of the learning sit?
Which School will oversee the quality processes associated with this learning?
Are there graduate attributes at a School level?
How does this learning align with the strategic objectives of the University?
National Quality Assurance Context
Once you have a sense of how your learning design might conceivably fit into the institutional context, but before anything is regarded as fixed, it is prudent to review external contextual influences on learning design. One of the most important is the national, regional or state context.
In the United Kingdom, for example, this oversight is provided by the Quality Assurance Agency (qaa.ac.uk) or QAA. This section is illustrative of the kinds of questions you will need to be asking yourself.
The UK Quality Code for Higher Education is a web-based resource with printable PDFs (qaa.ac.uk) that provides a comprehensive structural guide as to how learning designs should be interpreted. It does not provide a design template, rather it functions more accurately as an evaluative framework. Part A of the code is the most pertinent to the design process at this moment. There are four themes that UK course and programme designers need to consider:
At what Level is the programme’s named award to be made (Graduation level)? In the UK these levels are defined in the Framework for Higher Education Qualifications.
Broad guidance as to the distinguishing characteristics of specific named awards.
Convention determines that certain exit awards have a certain number of credits associated with them. Credit is often defined through the concept of ‘notional student hours’ which might, for example, suggest that 1 credit equates to 10 hours of study. This measurement should include everything the student does, including assessment.
Disciplines, at both undergraduate and postgraduate levels, may have subject benchmarks associated with them. These provide valuable conventional guidance on what is anticipated to be learned by students under specific discipline, or subject, headings.
These may closely relate to professional criteria, which are dealt with next.
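The 'notional student hours' convention above can be made concrete with a short calculation. This is a minimal sketch assuming the illustrative 10-hours-per-credit ratio mentioned earlier; the module names and credit weightings are hypothetical, and your own framework may use a different ratio:

```python
# Illustrative sketch: converting a credit weighting into notional student
# hours, assuming the common convention of 10 hours per credit.
HOURS_PER_CREDIT = 10  # assumed ratio; check your institutional regulations

def notional_hours(credits: int) -> int:
    """Total notional student hours for a module, including assessment."""
    return credits * HOURS_PER_CREDIT

# Hypothetical module sizes, for illustration only.
modules = {"20-credit taught module": 20, "40-credit dissertation": 40}
for name, credits in modules.items():
    print(f"{name}: {notional_hours(credits)} notional hours")
```

Remember that the total should account for everything the student does, including assessment, not just contact time.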
Professional Accreditation and Employment Trends
Now that you know your course or programme is going to fit into your institutional profile, and you are assured that it will meet the quality assurance criteria, you need to ask yourself: 'why would a student want to do this course?'
Given that the design process is likely to take several months, and that it may be a year or two before you enrol your first students, the reality is that your postgraduate students will probably be graduating in two years at the earliest and your undergraduate students in four; a great deal can change.
It is important to build some form of environmental horizon scanning into your design and review processes. This may exist in your practice already, but where it doesn’t it is worth instituting. Gathering White Papers from commercial partners, competitors, clients and employers, as well as press clippings, and exploring changes in the direction that your profession or discipline may be heading, should be the focus of some course team debate.
For more on horizon scanning, you may want to explore this UK government resource.
There is clearly also value in sharing your early programme and module designs with representatives from the professions or disciplines that your students will graduate into. It’s often a good idea to do this very early on in the process, not to ask for validation of your designs, but to capture the widest possible intelligence on future directions.
Here are some basic questions, but you should explore as a course team those questions that seem more appropriate to your evolving context.
What competency frameworks (apprenticeship standards) and professional body guidelines exist in my discipline?
If there is no national guidance, what about international guidance that might be indicative of trends?
Are there globally recognised ethical standards in my discipline?
What internationally agreed accords are under development?
Are competitors working on alternative offerings such as two-year degrees or new degree apprenticeships?
Globalisation vs Localisation
How is my profession or discipline evolving over time, are there identifiable trends?
How important is language ability or digital skills?
Automation / Systematisation
How much of my discipline or profession is data-driven, or knowledge-based, and therefore more prone to automation?
Conversely, are there inter-personal or affective skills that distinguish my discipline and that are likely to require personal presence?
What are the big ideas in my discipline?
Are there new Internet applications that take away part of what has traditionally been seen as a distinguishing feature of my discipline?
It is natural for course teams to be intimately familiar with the scholarship that underpins the ‘content’ that they intend to deliver to students. Harder for most course teams is to get some distance from their own practice and to take a ‘bird’s eye view’ of their design as it emerges.
Again, it is important to be sensitive to the evolving discipline landscape. The best way to do this is to establish some form of ‘environmental scanning’ or ‘horizon scanning’ processes within your design team. Avoid the danger of fixating on a competitor’s advantage, or a particular client’s requirements, by maintaining as broad a view as possible.
Here are four categories you may want to start with. Review sources in each category with the same question; “What does this source tell me about the evolving needs of effective learning design in my discipline?”
Academic Journals in your discipline
Academic Books and Book Chapters in your discipline
Academic publications in related fields that impact, directly or indirectly, on your discipline.
Conference proceedings very often reflect current or near-future implementations of scholarship. They are a great place to get a handle on what is happening ‘now’ and in the near future.
The blogosphere is a great place to source original and innovative approaches. Once you have validated the sources (so that you know the writer has credibility) you may want to track their train of thought over time.
White Papers from software producers (most disciplines make some use of technology!) and publishers are also counted as ‘Grey Literature’. Some software companies have in-house R&D divisions that foreshadow major trends in your discipline.
Personal or Team contacts also provide invaluable accounts of practice that inform the design process. You may find out the difficulties, or advantages, of running virtual scenarios for example and correct your design accordingly.
Evaluating your Contextual Judgements
It is important to return to these questions as you go through the future stages of the 8-SLDF. You will want to revisit these questions each time you have a course team meeting:
Has my institutional strategy or alignment changed in any way?
Have any quality assurance regulations, guidelines or benchmarks changed in any way?
Do I still have all of the external reference points (my horizon-scanning) established to be able to define Programme Outcomes?
What contextual circumstances might suggest that I should do something different from the norm and what external support is needed? And if I’m not doing anything innovative, why not?!
What issues has my horizon scanning produced that others in the School or wider University need to be aware of?
Atkinson, S. P. (2015). Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes. Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education, 10(2), 154–177.
Susskind, R., & Susskind, D. (2017). The Future of the Professions: How Technology Will Transform the Work of Human Experts. OUP Oxford.
It may seem strange to design our evaluation structures before we have even recruited students onto our programmes. We need first to understand the distinction between assessment, feedback and evaluation. It is then important to explore both the evaluation of learning experiences and evaluation for learning, which I will refer to as in-class evaluation for the sake of consistency.
The pages associated with this blog, stage 8 of the 8-SLDF, explore five basic concepts that underpin the evaluation of learning.
Distinguishing between Evaluation, Feedback and Assessment
Measuring Student Performance versus Teacher Performance
In-class evaluation versus Post-Completion evaluation
Progression: Access, Retention, Pass Rates, Grades, Completion and Destination
This post is a summary of the page for Stage 7 of the 8-SLDF, and the fourth element in a constructively aligned course design approach, which is 'feedback throughout'. Closely reflective of both our assessment practice and our learning activities, feedback is best fully integrated into the learning rather than seen as a separate administrative response to submitted work. Designing opportunities for feedback throughout our courses will lead us to adopt variations in our learning activities, and potentially to modify our assessment strategies too. Reviewing our strategies for feedback at this stage in the design process allows us to ensure that we can adjust our ILOs, assessment and activities if necessary to accommodate meaningful feedback throughout.
There are four concepts which we need to clarify or define, for this stage of the 8-SLDF. These are:
Feedback for learning
They all feature in a well-structured feedback approach to any module or programme in higher education, whether it is a classroom/seminar-based, online or blended course. They are explored fully on the Feedback pages.
The third element in a constructively aligned course design, and stage six of the 8-SLDF, is the learning activities that allow students to prepare for the assessment of their learning outcomes. This is not about the content that we share with our students; it is about how we develop an appropriate strategy to do that. Some modules will require a good deal of knowledge to be acquired by novice learners, and a set text and discursive seminars may be the appropriate strategy. Could we use one-minute papers, 'Pecha Kucha', lightning talks, and other techniques to secure student engagement? Alternatively, we might be designing a more advanced module in which a discovery learning approach is more appropriate. Could we use enquiry-based learning models here instead, asking our students to prepare to take a debate position, run a Moot or team-based discussion? The important thing is that we are developing a strategy and practical approaches that build on our design, not seeking innovation for innovation's sake. The pages for this stage of the 8 Stage Learning Design Framework are summarised as:
The content to be taught should serve the student's ability to evidence the ILOs
The skills and attributes that are taught at a topic, week or session level should be designed to rehearse elements of the assessment
Not everything that engages students is directly assessed but everything they are asked to do should be justifiable as informing the assessment and ILOs.
You might want to ask yourselves, as a course design team:
How closely mapped are the ILOs to each topic, week or session outline?
How confident are you that you cover the ILOs appropriately in terms of weighting and importance?
How much variation is there in the learning approaches taken throughout your module?
How are you enabling students to develop skills beyond knowledge acquisition?
This post is a shortened version of a new resource about designing effective ILOs, available here.
ILOs are the detailed explanations, written in language the students will understand before beginning the module or programme, as to what they will be able to DO when they have successfully completed the learning.
Many quality assurance structures, institutional or external, require that Programme and Module Specifications contain details of the 'intended learning outcomes' (ILOs) of the programme of study. ILOs serve to provide students with a ‘checklist’ of the types of skills, attributes, abilities or competencies they should be able to evidence through successfully completing the module or programme.
Intended Learning Outcomes should not be seen as a straitjacket for faculty. Rather, if they are well written, they should provide scaffolding for creativity in teaching and assessment.
Most teachers can identify any number of unintended learning outcomes, depending on the character of the cohort, the changing context in which learning takes place or the emergent nature of the discipline. However, the ILOs are the facets of learning that will be assessed. They should be written knowing that these are the capabilities that will be assessed, not the content knowledge.
Read more for guidance on how to structure effective ILOs that cover all dimensions of learning.
There are five key considerations we should take into account as we either work with media in our learning and teaching process or design learning using media. The relative importance of each of these depends on the level, context and nature of the module or programme, but all should be reflected in any learning design. These five are:
Students: orientation and disposition to learning with and through media.
Staff: abilities to work with media. Their abilities to identify appropriate media and manipulate it as appropriate.
Professional needs: present and future demands of the professional context in using media appropriately.
Content & Resources: Identifying existing, or creating, effective media resources for learning.
Institutional Choices: The constraints and opportunities for learning designers to develop media.
Tertiary providers are increasingly expected to deliver 'work-ready' graduates. This is a challenge when we must acknowledge that many graduates will begin a career that does not exist today. Identifying the competency frameworks within our disciplines, and those of our professional colleagues, is a good place to start. We can then identify a range of graduate attributes that will underpin our programme outcomes and inform the development of real-world assessment.
It is important to question all of our assumptions about the context into which our learning design is intended to fit. Despite the fact that you may feel you know your learning context intimately, the chances are there will be some contextual evolution. How much has your discipline context evolved in recent years?
Our learning programmes are designed to reflect our institutional specialisms and priorities, to play to our strengths. Sometimes we risk forgetting that they will be taken by real people with different dispositions, orientations and perspectives. For this reason, the first stage of my 8 Stage Learning Design Framework is concerned with profiling students. In much the same way that any product designer asks 'who is my customer going to be?', we need to do the same thing in education. I'm sharing here a process in which we describe four (five or six work too) imaginary individuals, as different from each other as we can imagine, as our potential students. I advocate a diversity matrix to build these characters, articulating different dispositional, educational, circumstantial and cultural parameters. We then run these use-cases through four further perspectives to reassess their discipline orientation, learning orientation, personal context and social context. There are overlaps and contradictions, a creative tension that reveals the strengths and relative weaknesses of any potential programme design. See the Research pages for more on Student Profiles.
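The diversity matrix described above can be sketched as a simple data structure: each imaginary student is described against the same parameter groups, and the design team can then interrogate every persona in turn. The persona names and example values below are invented for illustration and are not part of the framework itself:

```python
# Hypothetical sketch of a 'diversity matrix' for student profiling.
# Every persona is described against the same four parameter groups
# (dispositional, educational, circumstantial, cultural); the specific
# values are invented examples, not prescribed categories.

PARAMETERS = ["dispositional", "educational", "circumstantial", "cultural"]

personas = {
    "Amara": {
        "dispositional": "highly self-directed",
        "educational": "recent school leaver",
        "circumstantial": "full-time, on campus",
        "cultural": "first in family to attend university",
    },
    "Ben": {
        "dispositional": "prefers external structure",
        "educational": "returning to study after 15 years in work",
        "circumstantial": "part-time, evenings only",
        "cultural": "English as an additional language",
    },
}

# Check every persona covers all parameter groups, then interrogate
# one dimension across the whole cohort of imagined students.
for name, profile in personas.items():
    assert set(profile) == set(PARAMETERS)
    print(f"{name}: {profile['circumstantial']}")
```

Running a design question (for example, 'does this assessment schedule work for an evenings-only student?') against each persona in turn is what surfaces the creative tension mentioned above.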
This is an introduction to a new resource being shared on this website, the 8-Stage Learning Design Framework, or 8-SLDF for short. The framework provides a supportive step-by-step process to enable faculty and course designers to develop robust and well-aligned programmes or modules. Publication of the 8-SLDF is in preparation, so only brief explanations are provided here, but resources will be shared over time with associated commentaries. These blog posts will find a permanent home on the research pages of this site too.
I believe that the best way of ensuring that students and faculty can both engage in a meaningful, positive and fruitful learning collaboration is by designing courses well.
By well, I mean courses that are constructively aligned, relevant to the real-world experience of students, engaging and transparent. Courses must also be culturally and socially aware. Students need to know why they are being asked to perform learning tasks, and we should always have an answer. Knowing 'why' an activity matters is important because it is the first step in any individual's self-reflective process, their metacognition and the development of their personal epistemologies (Atkinson, 2014). We also need to know 'why' because doing anything for the sake of it is clearly wasteful of our time and energy. We as faculty are valuable players in the relationship between our students, the discipline, our institution and the wider world. Being good at what we do makes a difference. Designing courses that enable us to be better at what we do simply makes sense.
The 8-Stage Learning Design Framework has had a long gestation. Its foundations were built, through my educational development practice, on the work done by John Biggs on constructive alignment (2007) and the SOLO taxonomy (1982). I then incorporated Anderson and Krathwohl's reworking of Bloom's cognitive domain taxonomy (2001) alongside other domain developments, including the original Bloom project's articulation of the affective domain (1956), Dave's psychomotor domain (1967), and my own interpretations of the Metacognitive and Interpersonal domains.
My approach to effective materials design was inspired by the open and distance learning world (pre-digital), particularly by Derek Rowntree (1994) and Fred Lockwood (1994), by my collaborations with Kevin Burden around the DiAL-e Framework (2009), and by my own scholarship around the SOLE Model (2011). More recently I have drawn inspiration from the work of James Dalziel and Gráinne Conole (2016), and Diana Laurillard (2012), in their learning design conceptualisations, particularly as they relate to learning activities.
The result is, I believe, a comprehensive, flexible and adaptable learning design framework, not just for activities but for entire courses, modules and programmes. It is an appropriate framework regardless of discipline, level, context or mode of learning. It is a framework for any adult, formal learning context.
Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing : a revision of Bloom’s taxonomy of educational objectives. New York: Longman.
Atkinson, S. P. (2011). Developing faculty to integrate innovative learning in their practice with the SOLE model. In S. Ferris (Ed.), Teaching, Learning and the Net Generation: Concepts and Tools for Reaching Digital Learners. Hershey, PA: IGI Global.
Atkinson, S. P. (2014). Rethinking personal tutoring systems: the need to build on a foundation of epistemological beliefs. London: BPP University College.
Atkinson, S. P. (2015). Graduate Competencies, Employability and Educational Taxonomies: Critique of Intended Learning Outcomes. Practice and Evidence of the Scholarship of Teaching and Learning in Higher Education, 10(2), 154–177.
Biggs, J., & Collis, K. F. (1982). Evaluating the Quality of Learning: Structure of the Observed Learning Outcome Taxonomy. New York: Academic Press Inc.
Biggs, J., & Tang, C. (2007). Teaching for Quality Learning at University: What the Student does (3rd ed.). Buckingham. GB: Open University Press.
Burden, K., & Atkinson, S. P. (2009). Personalising teaching and learning with digital resources: DiAL-e Framework case studies. In J. O’Donoghue (Ed.), Technology Supported Environment for Personalised Learning: Methods and Case Studies (pp. 91–108). Hershey, PA: IGI Global.
Conole, G. (2016). Theoretical underpinnings of learning design. In J. Dalziel (Ed.), Learning design: conceptualizing a framework for teaching and learning online (pp. 42–62). New York: Routledge.
Dave, R. H. (1967). Psychomotor domain. Presented at the International Conference of Educational Testing, Berlin.
Krathwohl, D. R., Bloom, B. S., & Masia, B. B. (1956). Taxonomy of Educational Objectives, Handbook II: Affective Domain. New York: David McKay Company, Inc.
Laurillard, D. (2012). Teaching as a Design Science (1st ed.). New York: Routledge.
Lockwood, F. (Ed.). (1994). Materials Production in Open and Distance Learning. London: SAGE Publications Inc.
Rowntree, D. (1994). Preparing Materials for Open, Distance and Flexible Learning: An Action Guide for Teachers and Trainers. London: Routledge.