Metaverse explained for University Leaders: What is currently possible within the Metaverse? 2/4

I am not selling anything here. That should be self-evident given that my answer to the question “what is currently possible within the Metaverse?” is: not much. I could even suggest nothing, because ‘it’ doesn’t exist yet, certainly not in the form it aspires to. What we have instead are partial experiences, glimpses into the promise of what the future holds. In part one of this four-part blog, I explored the definitions of what the Metaverse might be. We don’t have it yet.

Recent press coverage (including this from the NYT), in part the reason for the delay in issuing this second of four short articles, has highlighted how deeply unpopular the concept of an immersive working environment in the Metaverse may actually prove to be. Meta’s own Horizon platform, the immersive environment that is the company’s manifestation of the Metaverse, is proving unpopular even amongst its own employees. Essentially, the Metaverse still remains largely the domain of ‘video games’. There is a serious risk of over-inflating the promise of a virtual reality workspace. Just as 3D films repeated the cycle of innovation, technology breakthrough, costly implementation, partial deployment, and customer non-engagement, so the Metaverse risks repeating that trajectory.

If you are looking to review institutional strategies in the light of challenges and opportunities presented by the Metaverse, please feel free to get in touch with spa@sijen.com

Nonetheless, we should discuss what is currently accessible for educators. There is a range of AR and VR visualisations that aid learning. These include 3D visualisations of the human body for medical purposes, and of engineering and architectural designs that aid a deeper understanding of structure. The challenge for academics is to ask themselves whether the learning gained through these 3D renditions adds enough value to warrant the associated costs. If you were a medical science student before these visualisations were available, would these 3D renditions have taught you anything you could not have learned otherwise? Are these 3D images necessarily enhanced by viewing them through VR headsets? They might be a ‘nice to have’, but do they warrant the not insignificant investment in staff training and equipment?

What is currently available in the commercial world, notably in disaster response and security contexts, is a series of hyper-real representations of real-world scenes, as opposed to fantasy worlds, in which skills can be perfected. The most obvious in the public consciousness would be the flight simulators on which pilots learn to master new aircraft. Surgeons have also benefited for some years from VR renditions of difficult procedures that can be rehearsed before opening up a patient. Touching on a humanities field, but still with a foot firmly in the technical realm, the restoration team working on Notre-Dame in Paris collaborates within a VR version of the fire-gutted cathedral, discussing and experimenting with approaches before tackling the real thing.

There is no doubt that the human brain is clever. Having a 3D visualisation of an object or a scene, displayed on a flat screen, satisfies most cognitive engagements. Is immersion in virtual reality either helpful or necessary?

Graphic design and game design students would undoubtedly benefit from practice suites in which to design 3D models and game interactives, saving individual students the cost of investing in kit that is likely to be constantly upgraded as IT equipment manufacturers attempt to recoup their investments. However, unless there is a distinct visualisation requirement, demanded by current or emergent practice within the profession to which university programmes are aligned, I would suggest there is no need to invest heavily in developing the in-house capacity to create VR experiences. It remains cheaper (not cheap, but cheaper) to employ either a third party, or your own student designers, to create experiences.

What is less certain is the role that AR will play in the Metaverse. That’s for next time.


The threat to the integrity of educational assessments is not from ‘essay mills’ but from Artificial Intelligence (AI)

The threat to the integrity of educational assessments is no longer from ‘essay mills’ and contract cheating but from Artificial Intelligence (AI).

It is not so long ago that academics complained that essay mills, ‘contract cheating’ services, and commercial companies piecing together ‘bespoke’ answers to standard essay questions, were undermining the integrity of higher education’s assessment processes. The outputs of these less than ethically justifiable endeavours tried to cheat the plagiarism detection software (such as Turnitin and Urkund) that so many institutions have come to rely on. This reliance, in part the result of increased student-tutor ratios, the use of adjunct markers and poor assessment design, worked for a while. It no longer works particularly well.
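To see why this kind of detection worked for a while, and why it fails against paraphrased text, consider a deliberately minimal sketch of matching-based similarity scoring. This is only an illustrative toy (Jaccard-style overlap of word 5-grams), not the proprietary fingerprinting that Turnitin or Urkund actually use:

```python
def word_ngrams(text, n=5):
    """All overlapping word n-grams in a text, lower-cased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission, source, n=5):
    """Fraction of the submission's n-grams found verbatim in the source."""
    sub = word_ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & word_ngrams(source, n)) / len(sub)

source = "the quick brown fox jumps over the lazy dog near the river bank"
print(overlap_score(source, source))  # a verbatim copy scores 1.0
# a light paraphrase shares no 5-gram with the source and scores 0.0
print(overlap_score("a fast brown fox leaps over a sleepy dog by the riverbank", source))
```

Swap even one word in five and the verbatim-match signal collapses, which is precisely the gap that contract cheating services, and now AI rewriting tools, exploit.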


If you are interested in reviewing your programme or institutional assessment strategy and approaches please get in touch. This consultancy service can be done remotely. Contact me


Many institutions sighed with relief when governments began outlawing these commercial operations (in April 2022 the UK passed the ‘Skills and Post-16 Education Act 2022’, following New Zealand and Australian examples) and went back to business as usual. For the less enlightened this meant a return to setting generic, decontextualised, knowledge-recitation essay tasks. Some have learnt at least to require a degree of contextualisation in their students’ work, introduced internal self-justification and self-referencing, required ‘both sides’ arguments rather than declared positions, and applied the ‘could this already have been written’ test in advance. Banning essay mills, or ‘contract cheating’, is necessary, but it is not enough to secure the integrity of assessment regimes.

Why students plagiarise is worthy of its own post, but suffice it to say it varies greatly depending on the student. A very capable student may simply be terrible at time management and fear running out of time, or feel the assessment is unworthy of them. Another student may be fearful of their ability to express complex arguments and, in pursuit of the best possible grade, plagiarise. Some may simply not have learnt to cite and reference, or to appreciate that rewording someone else’s thoughts without attributing them also constitutes plagiarism. And there is that category of students whose cultural reference point, deference to ‘the words of the master’, makes plagiarism conceptually difficult to understand.

I remember receiving my most blatant example of plagiarism and academic malpractice back in 2006. A student submitted a piece of work that included 600 words copied wholesale from Wikipedia, complete with internal bookmarks and hyperlinks. I suspect the majority of students are now sufficiently digitally literate not to make that mistake, but how many are also now in a position to do what the essay mills used to do for them: stitch together, paraphrase and redraft existing material using freely available AI text generation tools?

As we encourage our students to search the web for sources, how easy is it for them to access some of the readily available, and often free, online tools? These tools include https://app.inferkit.com/demo, which allows you to enter a few sentences and then generate longer texts on the basis of that origin. You can enter merely a title, of at least five words, or a series of sentences into https://smodin.io/writer and have it generate a short essay, free of references. Professional writing tools aimed at marketers, such as https://ai-writer.com, require a subscription to be effective but would allow students to generate passable work. This last tool actually tells you the sources from which its abstractions have been drawn, including academic journals.

You might find it enlightening to take something you have published and put it through one of these tools and evaluate the output.
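To make the mechanics concrete, here is a deliberately crude sketch of the word-substitution that sits at the bottom of the ‘text spinning’ family of tools. The tools linked above use large language models and do far more than this; the toy only shows why re-worded output no longer matches the original text:

```python
import random

# A toy synonym table. Real paraphrasing tools use large language models,
# not lookup tables; this only illustrates the word-swapping principle.
SYNONYMS = {
    "quick": ["fast", "rapid"],
    "jumps": ["leaps", "hops"],
    "lazy": ["idle", "sluggish"],
    "big": ["large", "sizeable"],
}

def toy_paraphrase(text: str, seed: int = 0) -> str:
    """Replace each word that has a listed synonym; leave the rest alone."""
    rng = random.Random(seed)
    out = []
    for word in text.split():
        choices = SYNONYMS.get(word.lower())
        out.append(rng.choice(choices) if choices else word)
    return " ".join(out)

print(toy_paraphrase("the quick brown fox jumps over the lazy dog"))
```

Even this naive substitution is enough to break verbatim matching; an AI rewriter that restructures whole sentences is far harder still to trace.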

It is insufficient to ask the student to generate their own question, or even to ask the student to contextualise their own work; some of the emergent AI tools can take account of context. There is a need to move away from the majority of long-form text assessments. With the exception of those disciplines where writing more than a thousand words at once is justifiable (journalism, policy studies, and some humanities subjects), there is a need to make assessments as close to real-world experience as possible. An assessment needs to be evidently the product of an individual.

Paraphrasing is a skill. A valuable one in a world where most professions do not lack pure information. The issue is to evaluate the quality of that information and then be able to reduce it to a workable volume.

I’ve worked recently with an institution reviewing its postgraduate politics curriculum. I suggested that rather than try to stop students from ‘cheating’ by paraphrasing learned texts, they should encourage the students to learn what they need to do to enhance the output of these AI tools. Using one of these tools to paraphrase, and essentially re-write, a WHO report for health policy makers made it more readable, but it also left out certain details that would be essential for effective policy responses. Knowing how to read the original, how to use a paraphrasing tool, and how to identify and correct the deficiencies of its output was a useful skill for these students.

We cannot stop the encroachment of these kinds of AI text manipulation tools in higher education, but we can make their contemporary use more meaningful to the student.




Image was generated by DALL-E



do-not-reply@: the inefficiencies of email use demonstrated by graduates

Graduates, and their colleagues, born after 1970 are unlikely to have worked in a context in which email was not a primary communication tool. Its inefficiency is manifest but often overlooked.

I went from my undergraduate degree in 1988 into a role as a Logistics Manager for Vichy L’Oreal. The job involved using a stand-alone PC, a telephone to the factory in France, and a weekly Telex to Paris. I was appointed on the basis that I spoke French. I couldn’t write French for caramel au beurre (toffee), but I could speak it. No one asked me at my interview whether I had any computer skills or indeed whether I was numerate. I didn’t have email.

Today employers make similar assumptions about digital competence. I have just finished a consultancy project looking at embedding real-world practices and assessment in a business degree. The external reviews had been good, but the feedback from the destination surveys with employers was less so. Their recent recruits could not communicate appropriately using email.

This challenge was described by Cal Newport in his Harvard Business Review piece “A Modest Proposal: Eliminate Email” (2016). Outlining IBM’s early experience of corporate email in the 1980s, he explains that while the first few days showed positive signs of increased productivity, employees quickly adopted the system for their routine offline communication, and notably communicated ‘vastly more’ than they had before. “Thus — in a mere week or so — was gained and blown the potential productivity gain of email,” Newport cites an IBM manager as observing. Email is now a source of inefficiency, yet its use is so ubiquitous that it is difficult to imagine running an organisation without it.

The review of this particular business degree found much that was positive: lots of collaborative, project-type activities and group assessments. Business simulations using software embedded in the institutional learning management system form the basis of the second year, and a ‘new’ business project provides the foundation for the third year. In many ways it is a progressive design, sharing characteristics with many medical schools’ approaches to ‘problem-based learning’. So why weren’t the employers recruiting from this degree happy?

It turns out that the employers’ biggest complaint was the inability of graduates to use email effectively. Their complaints mirror closely many of my own about the way colleagues routinely use email, so let’s unpack them.

Saying enough, but not too much. Writing a memo or letter by hand required thought before you wrote and a conscious decision whether to ‘go over the page’. It led to shorter, more direct ways of asking questions or answering them. Students who do not use email (preferring messaging apps) are not practised at measuring their message.

Treating email like chat. Students who are invested in the immediacy of communication, and the transient nature of the message, are prone to not investing sufficient thought in the enduring nature of business communication. Microsoft Teams, Yammer, and other messaging apps simply reinforce this behaviour. Communication may be immediate, but it is also ‘cheap’.

Inappropriate use of reply-all. Senior managers expressed some frustration at receiving emails with ‘thanks’ as the entire body, sent to everyone, sometimes dozens of people, simply because the sender hit ‘reply-all’. The question one might ask is: if you received a paper memo from the same person, with the same information, would you go to the trouble of writing a memo back that says ‘thanks’?

Lazy addressing of email. Dragging out an old email in order to identify a sender and then hitting reply fails on two counts. Firstly, without a change to the subject heading, the email may be filed with a conversation thread that is otherwise closed; it may get lost, and prove difficult to retrieve, because the subject no longer matches the content. Secondly, the dreaded ‘reply-all’ means one risks sending an otherwise appropriate message to people you did not mean to contact. Human Resources departments often have stories of ‘misunderstandings’ born merely of email sent inappropriately to the ‘wrong’ people.

Some larger commercial organisations are templating emails, turning them into something more like ‘digital memos’, using mailing aliases, using more BCC, and originating emails from ‘do-not-reply@‘ addresses. I came across one company that has banned all internal email, using Teams for anything internal and reserving email for external communications. There are moves for organisations to reimpose a degree of structure around workflow, and mandates (as French legislation does) protecting employees’ right to ‘unplug’. Organisations are embracing the limitations imposed through tools like Slack, imposing different internal and external channels. Others are simply exploring internal training courses on how to write emails efficiently. Some have even chosen to embrace the paper memo to replace some internal communication. Somewhere, I am sure, someone is ‘reinventing’ Lotus Notes as I write.

There is a growing problem, not just for the younger generations now graduating who have no experience of email efficiency, but for businesses worldwide. My advice to the leadership of the degree that I was consulting on was to make the second- and third-year projects less ‘clinical’, more chaotic in terms of the technology platforms available to students. Students need to learn to communicate using email in its wild and untamed form. That would require students to be educated in how to write digital memos. Then any restricted workflow they may encounter will be a bonus.

I had actually used a PC before I joined Vichy in 1988, but certainly not in the way the role required of me. Fortunately I seemed to have the ‘knack’. As well as the weekly telex and making daily phone calls, I wrote a handful of internal memos (on paper) each week. I was managing a monthly budget of millions of pounds’ worth of stock flows with just that level of communication. I can imagine there are at least three people doing the same job now, engaged in a Sisyphean effort just to manage the email traffic.


Newport, Cal. “A Modest Proposal: Eliminate Email.” Harvard Business Review, February 18, 2016. https://hbr.org/2016/02/a-modest-proposal-eliminate-email.

Image by Sara Kurfeß @stereo.prototype at unsplash.com

 

How do you define hybrid, or hyflex, learning?

I struggled recently to define hybrid learning to a client. They asked how they could go about creating ‘hybrid learning’ for their learners. A reasonable question?

There appears to be some confusion, in practice and in the literature, as to the differences between hybrid, hyflex (hiflex, hi-flex, etc), and blended learning. So, I would like to take a minute to propose some definitional parameters, and wait to see if you agree or disagree.

The terms hybrid and hyflex are, in my mind, essentially the same thing, but they differ from ‘mainstream’ blended approaches. Blended learning, as curricula and teaching practice, determines where a learner studies, and what they are doing in each space. The blend is anticipated and written into the curriculum. The teacher knows what the student will be doing in-person or as a distance learner. Indeed the course is most probably designed ‘flipped-classroom’ style, to optimise the precious time in face-to-face contexts, whether in-person or virtual. There are a few flavours of blended learning, but they are all pre-determined by the course creator.

Hybrid, or hyflex, approaches attempt to give some agency, some control, to the learner as to the nature of their learning experience: the when, where and how. Both aim to empower the student to choose which learning should be studied face-to-face and which should be studied online, and how to go about engaging with that learning. The only apparent difference, largely in US practice, appears to be the unpacking of the distance participation element into asynchronous or synchronous online engagement. To me it is a distinction without a difference.

This hybrid/hyflex nature very often means courses spawn new hybrid ‘spaces’ in which there is an attempt at seamless integration between real-world in-person and virtual learning experiences. This means that designers of courses that aspire to hybrid/hyflex learning may be required to enable the same (or equivalent) learning experiences to be modelled in multiple forms or alternative spaces (Bennett et al., 2020; Goodyear, 2020). This could be a significant burden. Think about it as Universal Design for Learning (UDL) on drugs.

Blended and hybrid/hyflex are in fact all flexible models of delivery. They all make use of different combinations of the two modes of learning, in-person and distance. And they all fall within a regulatory and validation authority that determines the relative openness of programmes of study. Flexible is anything that is less than fixed; it is merely a question of degree. Courses sit on a spectrum of curriculum delivery between rigid and flexible.

I persuaded this particular client that they did not need to go ‘all-in’ and design courses for hybrid delivery. Rather, they simply needed to consider what learning and teaching activities were best suited for ‘away-from-the-classroom’ study and to determine whether these required independent study or collaboration with others. To be a bit more… flexible.

It wasn’t the answer they wanted. After all, being ‘hybrid’ is so very much, you know, ‘now’. But it’s the answer they got.

Dr Simon Paul Atkinson

15 July 2022

Bennett, Dawn, Elizabeth Knight, and Jennifer Rowley. “The Role of Hybrid Learning Spaces in Enhancing Higher Education Students’ Employability.” British Journal of Educational Technology 51, no. 4 (2020): 1188–1202. https://doi.org/10.1111/bjet.12931.

Goodyear, Peter. “Design and Co‐configuration for Hybrid Learning: Theorising the Practices of Learning Space Design.” British Journal of Educational Technology 51, no. 4 (2020): 1045–60. https://doi.org/10.1111/bjet.12925.

Image generated using OpenAI DALL-E

Very Brief Overview of ‘Innovating Pedagogy 2022’

This very brief summary is in no way to be taken as a substitute for reading the full report, or indeed the Executive Summary, which is available here: Innovating Pedagogy 2022

This is the 10th annual report exploring new forms of interactive and innovative practice in teaching, learning and assessment. These innovations already exist in pockets of practice but are not considered mainstream. This report, a collaboration between the Institute of Educational Technology at The Open University, UK, and the Open University of Catalonia, Spain, is the result of a filtering process and is compiled from a review of published studies and other sources.

Hybrid models
Maximising learning flexibility and opportunities. Beyond the strict curriculum delineations of blended learning models, hybrid forms aim to empower the learner to optimise their own choices as to where, when and how to learn. Providing flexible choices requires teachers and institutions to adjust their systemic approaches.
Influencer-led education
Learning from education influencers on social media platforms. Acknowledging the growth of edu-influencers, who optimise their use of social media tools to share their knowledge, experience, and passion for a range of subjects, from the highly specialised to the generic. Evaluating the veracity of the message is a challenge for the learner.
Dual learning scenarios
Connecting learning in classrooms and industry workplaces. A step on from work-integrated learning models, the expectation is that course designers fully meld both formal classroom and work spaces into a coherent experience.
Pedagogies of the home
Understanding the home as a place for cultural learning. Not the same as home-schooling. Rather, it seeks to leverage the wider socio-cultural environment that the learner inhabits. Also recognises the burden on marginalised communities to fully participate.
Pedagogies of microcredentials
Accredited short courses to develop workplace skills. Existing approaches, snippets taken from existing programmes, fail to create an effective learning ecosystem for learners who require support to develop a patchwork portfolio meshing formal, non-formal and informal experiences together.
Pedagogy of discomfort  
Emotions as powerful tools for learning and for promoting social justice. A process of self-examination that requires students to critically engage with their ideological traditions and ways of thinking about issues such as racism, oppression and social injustice.
Pedagogy of autonomy
Building capacity for freedom and independent learning. Explores the notion of incorporating informal, non-formal and formal learning patterns into the learner’s experience, creating self-regulated learners with an emphasis on their metacognitive development and allowing them to reflect their true selves.
Wellbeing education
Promoting wellbeing across all aspects of teaching and learning. Wellbeing education helps students to develop mental health ‘literacy’ by teaching them how to manage their own mental health, recognise possible disorders, and learn how, where and when to seek help.
Watch parties
Watching videos together, whatever the time or place. Leveraging the increased connectivity prompted in response to covid-19, and the move of media providers to provide educational tools, this is the notion of structured engagement around a shared viewing (or listening) experience.
Walk-and-talk
Combining movement and conversation to enhance learning. Not just in service of those in need of emotional support, where the therapeutic benefits have been proven, but across a wide range of learning activities where reflection and thought would be best served by being away from the classroom, outside and mobile.
10 Themes from the 2022 Innovating Pedagogy report

 

Kukulska-Hulme, A., et al. (2022). Innovating Pedagogy 2022: Open University Innovation Report No. 10. The Open University.
 

Dr Simon Paul Atkinson PFHEA / 13 July 2022

Image is generated by OpenAI’s DALL-E2

Teaching about existential threats: why we need to teach concepts, not just facts.

It has now been more than four months since Russia’s invasion of Ukraine, and I have been thinking about how badly we need to be teaching about existential threats. I think we need to develop a curriculum that is open to contemporary real-world challenges.

It has now been more than four months since Russia’s invasion of Ukraine. Like many, I have been ruminating. This post isn’t about that. Or at least not directly. I have been thinking about how badly we need to be teaching about existential threats. I think we need to develop a curriculum that is open to contemporary real-world challenges.

I think global education needs to adjust to new realities. The First World War, the Great War, wasn’t a world war in July 1914. It became one later. The Second World War likewise was not a world war in September 1939, although it engulfed the globe in due course. We are yet to see whether the February 2022 Russian invasion of Ukraine will prove to have been the start of a Third World War. Hopefully it will not become that kind of milestone, but I think we owe it to students to prepare them for that possibility.

Education is different now than it was in 1914 or 1939. Now we have wall-to-wall coverage, ubiquitous social media and real-time battlefield insights piped into children’s, adolescents’ and adults’ television screens, tablets and smartphones. There is almost as much misinformation as there is fact on many platforms; as many well-meaning transmitters of misleading (sometimes factually inaccurate) news as there are respectable and accredited voices. The democratisation of information is a good idea, but it assumes that the individual generators of that information are well-informed, critical and, if opinionated (or biased), state that upfront.

I have learnt a great deal about Ukraine in recent months, some of it from listening to TikTokers and YouTubers. Some summarise Ukrainian news sources for us, and others share their daily lives from war-torn cities. Some review Russian media sources. Some provide valuable daily summaries. Others share emotive responses to news as it happens. I am aware that I am, I believe, relatively digitally literate. I am critical of the sources; I often look up individual commentators on other social media, check their LinkedIn profiles and review past output. This last element is particularly interesting. I have been suspicious, though not dismissive, of social media accounts that started in late February 2022. Some are clearly chasing followers, clicks and likes. Some are clearly trying to provide what they see as a genuine information service. Being able to identify the difference is not always easy. Our students need to learn these skills.

Risk evaluation is a very difficult thing to teach. Each student will have different life experiences that make them more or less fearful of uncertainty. Those who lived through the Cuban missile crisis or the nuclear standoffs in the early 1980s may say they have seen it all before. What is different now is the ubiquitous nature of information, and misinformation, which is in danger of confusing students’ ability to make their own judgements.

Educators are morally obliged to teach the unseen. That includes climate change and the risk of nuclear war. There is not a discipline that cannot leverage the moment. Social sciences and humanities obviously have an edge. Physical sciences are often less flexible in terms of the curriculum. But I think it is important that anyone teaching today pauses before delivering any concept, any idea or thought, and considers whether there is a contemporary example amongst the unseen existential threats that exist. It means sometimes abandoning our own safe assumptions, our own safe havens, and exploring things that we ourselves may see as uncertainties.

There are any number of elements within any curriculum that can leverage the Russia-Ukraine war. Political scientists can explore the notion of the Eurasian multipolar world view. Geographers can explore the Three Seas Initiative, and sociologists can explore the religious realignment we are now seeing amongst the Orthodox churches. Examples are endless if we focus on the concept rather than the content. Beyond any obvious historical comparisons there are lessons to be learnt across all the disciplines using contemporary examples. The learning of concepts, geopolitical perspectives, resource management and cultural power, are all more useful to students than any specific set of facts.

How relevant the current Russian invasion of Ukraine is perceived to be by faculty and their students alike will depend largely on geography. While the conflict itself is seen as a largely European ‘problem’, and its global economic implications are yet to be clearly felt, I appreciate that these reflections are probably more relevant to my European colleagues than to many others. But the principle still stands. All concepts that form any part of the curriculum need to be based within a contemporary world context. We need to leverage the current crisis that is being seen and witnessed by students through the prism of social media. In doing so we can both serve the curriculum and educate students to exercise critical judgement about their sources of information.

Concepts not Content

I am passionate about privileging the teaching and learning of concepts rather than content. Concepts are instruments that serve to identify, define, explain, illustrate and analyse real-life elements and events, past, present or future. These are usually within the confines of a particular geography, social context and within discipline conventions, but when defined well, reach across all cultural boundaries.

There are essentially two kinds of concepts: sensory and abstract. Sensory concepts are tangible; they can be experienced through our senses. Abstract concepts are not directly experienced; they are often not visible and need to be imagined. There is a simple three-step process for you to consider as you build learning with concepts: define, illustrate, and imagine.

Define:

It’s important to keep the definition of a concept at its simplest. It should be a self-contained concept.

Let’s take for example the statement that Regional wars have global consequences.

We could then unpack what we mean by regions, wars and global consequences. The easiest way to validate your concept’s definition is to see how easy it is to state its opposite: regional wars do not have global consequences.

I can already envisage an assessment task that asks students to identify regional conflicts that did not have global consequences, and then has their peers challenge them with alternative perceptions of those consequences (after some enquiry-based learning).

Illustrate:

Illustrating a concept helps learners to categorise new knowledge, to cement that new learning in a hierarchy or order of reality. Illustrations can be examples that demonstrate the truth of the definition, or its opposite. An illustration that does not match the definition also serves to help learners make sense of the definition. So in this example, “regional wars can have global consequences”, I could describe the key protagonists and events that led to the 1956 war in the Middle East between Israel and Arab powers, which had profound long-term implications for European loss of influence and the rise of the United States as a regional power broker.

For its opposite I could take the regional war fought between the Sahrawi Indigenous Polisario Front and Morocco from 1975 to 1991 (and involving Mauritania from 1975 to 1979) for control of Western Sahara, which has had minimal global impact, although it remains a live issue in the region.

Neither was a world war; both were clearly regional conflicts, but with different impacts. A useful conceptual space to unpack thoughts and ideas with students. Learners do not need detailed knowledge of the background histories of the parties to develop an understanding of why these two conflicts had different implications. The challenge for them is to unpack the factors that make up the definition of the concept shared earlier: that regional wars have global consequences.

I am not teaching my students about the 1956 Israel-Arab war or the war in Western Sahara; I am illustrating the factors that make the truth of my definition self-evident. Examples and non-examples both support the interpretation of concepts.

Imagine:

Imagining scenarios in which the concept might be illustrated, perhaps using analogies, can prove very effective. Interpreting analogies requires the learner to deconstruct and reconstruct the elements of the concept; this supports deeper comprehension, improves retention and allows the learner to adapt the meaning of a concept to their own socio-cultural context.

It’s important, as you construct your imagined scenario or analogy, to ground it in the existing, or at least conceivable, experience of your learners. There is a danger that we forget just how culturally diverse our student cohorts are. References to popular culture, national habits and pastimes may mean something to you but are not going to be generally understood.

You could, for example, ask students to imagine a conflict between the country you are teaching in and a neighbouring state, and ask how it might, or might not, have global consequences. I acknowledge that for too many in the world this is not merely an intellectual exercise.

Changing Practice

Concepts are foundational to all new learning, but we in tertiary education are in the habit of burying or obscuring the key concepts amidst the weight of information, and then expecting the learner to be able to think in abstract terms.

A lecturer recently told me that they didn’t have time to change the example they were using to teach supply and demand, a well-developed scenario based on the oil price during the Second Gulf War. I found this very hard to believe, given that the current Russian invasion of Ukraine has had a direct impact on oil and commodity prices globally. Why, I suggested, didn’t they ask the students to fill in the details of the scenario themselves, so that they would better understand the implications of each factor, rather than sharing a pre-prepared example? I suggested that this might also provide an opportunity for students to talk more openly about the threats they may perceive as impacting them personally as a result of this particular war.

Strange as it may seem, I think that the current war provides an important catalyst for the re-evaluation and revitalisation of much of our social science and humanities curriculum. It reminds us that there are existential threats around us, and that these should serve as pivotal points of reference as we explore concepts with our students, enabling them to make meaningful connections.

Students need to be encouraged to seek out sources of information with a critical eye in order to be better prepared for the unforeseen.


Photo by Антон Дмитриев on Unsplash

Ukraine: a teachable moment finding its way into our curricula.

Graphic of Ukrainian Colours
In recent weeks, as the war in Ukraine has unfolded, I have watched educators trying, with significant success, to use events as teachable moments. The intricacies of shifting boundaries and conflicts are used to fuel debates about historical context. Economics teachers use the economic interdependencies between countries, evidenced through oil and gas supplies, phosphates and grains, to great effect. Exploring ethnic identities forms a core part of anthropological and social science conversations. What I see are teachers in the English-speaking liberal democracies, the ‘West’ (where I have sight), teaching this war not as being ‘over there’, as some distant, disconnected experience, but rather in the context of ‘it is happening here’, or at the very least ‘could it happen here?’

Very often teachers are struggling to answer questions from students while still ‘getting through’ the prescribed content, predetermined in curriculum structures and resources imposed from outside. The best national, regional and institutional systems empower teachers to leverage events that are affecting their students. The worst amongst them have rigid content requirements, written by bureaucrats, not by teachers. Concepts are more powerful than content, ideas more enduring than facts. Giving students a framework for critical thought using ideas and concepts allows them to seek out and identify facts and content. Importantly, it empowers the student to make connections between disparate thoughts, across time and geographies.

I think education should be radical: it should be focused on change, not on maintaining the status quo; on transformation, not normalising; on the individual as a member of diverse and overlapping communities, not as a cog in a machine. Radical education should be innovating not perpetuating, enriching not sustaining, challenging not confirming.

Oscar Wilde said that

“The whole theory of modern education is radically unsound. Fortunately in England, at any rate, education produces no effect whatsoever. If it did, it would prove a serious danger to the upper classes, and probably lead to acts of violence in Grosvenor Square.”

True. So little has changed since the 19th century, despite the dawning of a digital information age. In my view, we are still too committed to a curriculum of content rather than of concepts.

Courageous teachers across the world are navigating troubling times with creativity and insight. They are often forced to bend and circumvent an imposed curriculum to make the learning effective and real. Why teach supply and demand to business students using Californian almond production when you can explore the impact of disrupted wheat exports from Ukraine? Why explore the English Reformation when a contemporary example of religious disaggregation is happening today in the Orthodox Churches? Ideally, teachers should have the flexibility to compare and contrast established examples (predetermined resources) with students’ own contemporary comparators.

‘Resilience’: the latest hyped-up term being applied to education.

“If you managed to cover the absences of staff successfully last semester, are you maybe just overstaffed?” If you managed to move all of your learning online in a frantic fortnight with minimal support, well, “how hard can it be, and do you really need all of that expensive support?”

There is a danger of being ‘successful’ in responding to a crisis. Senior management often don’t see the pain and sweat, the family disruption, the anxiety, and stress as it is happening. “Look how resilient you have all been in response to Covid-19, just carry on like that.”

Resilience is very in vogue at the moment. There are any number of workshops and seminars to empower you as an individual to recognise your own resilience. Some generously provide a ‘toolkit’. Others provide just a forum to share stories of resilience. I have been a participant in a number of these sessions in the last 12 months. To borrow a Yogi-ism, ‘It’s déjà vu all over again’. In the 2000s the same workshops were being run for us as managers, using different buzzwords: adaptability and self-awareness.

Adaptability requires a certain degree of intellectual flexibility, but above all it requires that an individual feels secure and trusted. Most individuals can be persuaded to try a different approach, provided that, if it turns out not to work, they won’t be reproached. Most employees will find creative solutions, in collaboration with others, if they feel that their jobs don’t depend on getting it right first time. Employers need to provide safe zones for failure. Employees need to understand their boundaries and self-imposed limitations. How far should you stretch outside of your current experiences, your ‘comfort zone’? This requires one to be self-aware: to know your limits and when it’s OK to step beyond them.

If senior management in tertiary institutions really want to ensure the resilience of their staff, they need to empower even the most junior faculty or support person to make mistakes, and to encourage them to be adaptable and responsive to changing circumstances. They must also ensure that staff are self-aware, willing to declare their own limitations and boundaries. Being able to recognise one’s own limits, and being creative in adapting practices to stretch them, is a practical definition of professional fulfilment.

I can cope with the evolution of language; it is one of the things I love about English. I recognise that running workshops encouraging staff to be adaptable and self-aware might sound a bit 2000s, and the language may need to be spiced up a bit. It just gets a bit tiresome to have old concepts repackaged and presented as something radically new. Personally, I think it better to confront the underlying conditions in which ‘resilience’ is enabled.

Photo by Karim MANJRA on Unsplash

 

Why we need to change how we design courses.

There are many courses out there that do a great job of teaching manual dexterity and physical capabilities. From bricklaying and hairdressing to gas-fitting, there are courses focussed around manual processes. However, there are huge numbers of graduates from tertiary programmes who cannot perform the duties required by employers on day one, simply because they have not learnt how to do something. They may have been told ‘why’, and even ‘what’ is expected, but their learning has not enabled them to perfect the skills associated with the ‘how’.

It remains remarkable to me that so many course and programme specification documents, replete with (sometimes well-formed) learning outcomes, have NO psychomotor outcomes. There are few courses that could not be improved by including an assessed outcome associated with using a tool or technology.

To prove the point, I asked colleagues informally before Christmas whether they could think of a course where there was NO tool or technology use in play. Without further prompting, most agreed that Excel skills, SPSS, CAD tools, even library databases, all required a degree of incremental competence, but that these had not been in any way ‘taught’, let alone assessed, within their courses. One provocateur suggested that their course required only the ability to write and reflect. It took little effort to unpick this, given that writing in this context requires a word-processing package, formatting, style sheets, spell-checking and in-text citations, all of which are assumed graduate skills. This colleague stood their ground, suggesting that they were not employed to teach those skills; that was someone else’s responsibility.

This may be at the root of the challenge. Thirty years ago (when many of our current educational leaders graduated), your three to seven years spent at university were a valuable time spent in proximity to the sources of privileged knowledge: the esteemed professor, or the library. You had a whole life after graduation to develop the rounded skills associated with whatever your chosen lifetime employment might be. That is simply no longer the case. The ‘academy’ no longer contains the privileged knowledge; we have democratised the information sources. Even those who embark on a lifelong vocation will find the landscape around them continuously changing.

Access to LinkedIn Learning resources, and the cornucopia of free web resources, has allowed some institutions to negate whatever obligations for manual dexterity and physical skills development they might feel towards their students. Some courses weave these external resources into the learner’s experience; others totally abdicate responsibility and deem it part of the independent learning required of learners.

One reason for this lack of attention to the acquisition of psychomotor skills is that it is thought harder to assess someone’s psychomotor skill set than it is to test their knowledge and, by extension, their intellectual or cognitive skills. If I can’t meaningfully assess it, I’ll just avoid teaching it. It is also a function of the ‘curse of knowledge’: faculty have acquired their psychomotor skills in a particular technology or tool over an extended period of time, and have failed either to document that learning or indeed to reflect on it.

There are some well-designed courses out there. I hope you designed, or teach on, one. But there is still a significant deficit in the in-course provision of support for the acquisition of psychomotor skills associated with tools and technologies in a range of disciplines. We need to design courses across ALL disciplines that are rooted in the skills graduates require to handle the uncertain information, technology, and socio-cultural environments they face. This means designing courses first around psychomotor skills, then interpersonal and affective skills, then meta-cognitive and cognitive skills. Then, and only then, should we worry about the factual knowledge element. We need programme and course designers to be designing with different priorities if we want to make learning appropriate for the contemporary learner.

Photo by Markus Spiske on Unsplash

FLANZ President’s Review of 2021

2021 may have proven to be only slightly less challenging than 2020, if only because some disruption and tumult were expected. All sectors of education continued to make adjustments to their practices, embed new processes and look to long-term solutions. FLANZ is also developing to handle future challenges.
