The Metaverse explained for university leaders: challenges for universities ahead (3/4)

Press coverage of recent cryptocurrency disruptions and the significant staff reductions at Twitter and Meta is giving pause for thought amongst investors and futurists, as well as university leaders considering the future of the Metaverse.

It is understandable if you feel you cannot keep up with the news. The collapse of the cryptocurrency platform FTX, the apparent meltdown underway at Twitter and the 11-year sentence handed down to Elizabeth Holmes for the Theranos fraud all have something in common. The digital world is sufficiently obscured from the majority of people, sometimes deliberately, and the ‘trust train’ may now have hit the buffers, reminiscent of the end of the dot-com boom.

So what of the metaverse? I did not mean to imply that it is a fanciful dream that will never have an impact on higher education, but I have reservations. I received some negative feedback for comparing the adoption curves of 3D cinema and VR technology. I stand by my contention that such technology developments need to take more account of user expectations, as well as the user experience. Demographic patterns play a huge part in any technological innovation. The challenge for most universities is to decide whether to invest in low-tech entry materials and approaches to build a foundation for future ‘metaverse’ technologies, or to join the narrow range of institutions innovating around these emergent technologies.

If you are looking to review institutional strategies in the light of challenges and opportunities presented by the Metaverse, please feel free to get in touch with spa@sijen.com

The obstacles to entry are less technical than they are related to learning design and delivery. Clearly, having sufficient finance in place is an obstacle for some, but even for those with the cash to spend, knowing where to invest is crucial.

The technologies for building immersive VR experiences already exist, and some are free to try out (Unity.com). The headsets and accessories are, in theory, within reach of those on a medium or high income, although with the current cost-of-living crisis one might anticipate that Meta’s sales expectations for its latest headset will fall short. But creating a test suite, a development platform on which to create immersive VR environments, requires a greater degree of investment.

If I were a betting man (and I am not), I would agree with those who believe that Zuckerberg is willing to sacrifice the social platforms (Facebook, Instagram), with their declining demographic, in favour of speculative investment in the future. The future, for him, is the Metaverse. However, there is no clear evolutionary path to the Metaverse. The obstacles include the challenges mentioned previously: wearable technologies, the computing power to sustain them, the privacy legislative framework, and the broader legal implications. There are billions around the world without internet access, millions without reliable broadband, and millions who do not have the disposable income, time, or inclination to while away hours as a virtual avatar. It might be ‘cool’, but is it really worth the time and effort?

Legal frameworks are struggling to keep up with the rapid technological changes society faces. The European Union is possibly the most active in seeking to impose guardrails around digital technologies. Some of these are privately welcomed by the big technology companies, who lend some of their legal minds in pursuit of meaningful legislation, while other legal restrictions are resisted. Profit still comes first after all.

Esports, a growing share of the online gaming space, certainly benefits from advances in hyper-real 3D immersive technologies. For a business, paying to advertise inside these game spaces, whether on the hoardings around a virtual pitch or track or as branding on virtual apparel, makes sense. Whether this gaming trend will fruitfully spill over into academia, I am doubtful.

What should universities do?

There are things universities should do, in my view, to ensure they are ready to react if I am wrong and VR technologies do become more integrated into the learning experience of a wider group of students.

They should have both a Student Charter and an Information Technology Policy, reviewed annually. Things move that fast. All students and staff should be asked to reassert their commitment each year. The executive summaries of both documents serve to enhance the digital literacy of the entire learning community.

Privacy and ‘netiquette’ are concepts that are intertwined in the experience of staff and students. I can use abusive language, within limits, and ALL CAPS to insult people on Twitter and face little in the way of challenge. If I were to stand in London’s Leicester Square and do the same thing, hurling exactly the same abuse at a passerby, it would not be long before a couple of police officers turned up and moved me on. Failure to comply would likely result in arrest and a charge of disturbing the peace. Imagine that scenario within a virtual world. Who are the police? What penalties would I face, if any? The behavioural norms we associate with the real world fall apart in the digital sphere. That is already true today, given the vile abuse faced by female academics in particular.

Is your institutional policy framework designed to cope with this scenario?

A student group, registered with your Student Union, organises a virtual event hosted on a third-party application (Zoom, for example) using a licence owned by the controversial speaker themselves. The event requires registration, but this too is handled by the speaker, and the event is advertised without any explicit endorsement from the student group, though it is heavily promoted verbally and with paper flyers around campus. During the event, some students mount a protest, disrupting it. The event attracts huge criticism, and excerpts of the Zoom meeting go viral on TikTok and Telegram, with some of the students’ names and affiliations attached. The mainstream press seize on the event as an example of both the ‘no-platform’ policy position you hold and the ‘woke, liberal elite’ attitudes in evidence.

My advice to a recent university client was to run ‘war games’ around exactly these kinds of scenarios with senior student leaders and Heads of Department, because the challenges institutions face are less about being overrun by technological developments than about uncontrollable user scenarios.

And explore AR in the short term. That’s for the final blog in this short series.


Image credit: generated using DALL-E

Metaverse explained for University Leaders: What is currently possible within the Metaverse? (2/4)

I am not selling anything here. That should be self-evident given that my answer to the question “what is currently possible within the Metaverse?” is: not much. I could even say nothing, because ‘it’ does not yet exist, certainly not in the form it aspires to. What we have instead are partial experiences, glimpses of the promise of what the future holds. In part one of this four-part blog, I explored definitions of what the Metaverse might be. We don’t have it yet.

Recent press coverage (including this from the NYT), in part the reason for the delay in issuing this second of four short articles, has highlighted how deeply unpopular the concept of an immersive working environment in the Metaverse may prove to be. Meta’s own Horizon platform, the immersive environment that is the company’s manifestation of the Metaverse, is proving unpopular even amongst its own employees. Essentially, the Metaverse remains largely the domain of ‘video games’. There is a serious risk of over-inflating the promise of a virtual reality workspace. Just as 3D film has repeated the cycle of innovation, technology breakthrough, costly implementation, partial deployment, and customer non-engagement, so the Metaverse risks repeating this trajectory.

If you are looking to review institutional strategies in the light of challenges and opportunities presented by the Metaverse, please feel free to get in touch with spa@sijen.com

Nonetheless, we should discuss what is currently accessible to educators. There is a range of AR and VR visualisations that aid learning. These include 3D visualisations of the human body for medical purposes, and of engineering and architectural designs, which aid a deeper understanding of structure. The challenge for academics is to ask themselves whether the learning gained through these 3D renditions adds enough value to warrant the associated costs. If you were a medical science student before these visualisations were available, would you have learned anything new from them? Are these 3D images necessarily enhanced by viewing them through VR headsets? It might be ‘nice to have’, but does it warrant the not insignificant investment in staff training and equipment?

What is currently available in the commercial world, notably in disaster response and security contexts, is a series of hyper-real representations of real-world scenes, as opposed to fantasy worlds, in which skills can be perfected. The most obvious in the public consciousness are flight simulators, on which pilots learn to master new aircraft. Surgeons have also benefited for some years from VR renditions of difficult procedures that can be rehearsed before opening up a patient. Touching on a humanities field, but still with a foot firmly in the technical realm, the restoration team working on Notre-Dame in Paris collaborates within a VR version of the fire-gutted cathedral, discussing and experimenting with approaches before tackling the real thing.

There is no doubt that the human brain is clever. Having a 3D visualisation of an object or a scene, displayed on a flat screen, satisfies most cognitive engagements. Is immersion in virtual reality either helpful or necessary?

Graphic design and game design students would undoubtedly benefit from practice suites in which to design 3D models and game interactives, saving individual students the cost of investing in kit that is likely to be constantly upgraded as IT equipment manufacturers attempt to recoup their investments. However, unless there is a distinct visualisation requirement asked of by current or emergent practice within the profession to which university programmes are aligned, I would suggest there is no need to invest heavily in developing in-house capacity to create VR experiences. It remains cheaper, not cheap, but cheaper, to employ either a third party or your own student designers to create experiences.

What is less certain is the role that AR will play in the Metaverse. That’s for next time.


Metaverse explained for University Leaders: A simple guide to the immersive future (1/4)

University leaders will doubtless come away from the latest round of late summer conferences with ideas about how to seize some real estate in the Metaverse. With some caveats, I would suggest it is worthwhile for universities to start thinking now about how to harness the potential.


If you are looking to review institutional strategies in light of the challenges and opportunities presented by the Metaverse, please feel free to get in touch.


In four separate postings I want to outline:

  1. What the Metaverse is and is not.
  2. What is currently possible within the Metaverse.
  3. Where the challenges lie for universities journeying into the Metaverse.
  4. The opportunities likely to emerge over the next few years within the Metaverse.

What the Metaverse is and is not.

It is not yet here. The Metaverse is conceived as a series of intertwined digital experiences, from the presentation of personalised content based on physical proximity to the fully immersive virtual reality experience. The Metaverse is not a single ‘place’; it is rather an experience. It is envisaged as an experience in which you, the individual, divide your time between the virtual world and your flesh-and-blood existence.

Definitions are as varied as they are numerous, because the label ‘Metaverse’ carries some commercial cachet. Existing technologies, immersive games and virtual worlds have adopted the label of Metaverse. Even some commercial teleconferencing companies have chosen to use the label to describe their all-walls solutions.

Whilst several technologies play a part in building the Metaverse, including headsets, graphics platforms, blockchain encryption and so on, these individual technologies do not in themselves represent the Metaverse (Gillis, 2022). They are all pieces, yet to come together as a complete pattern. We are some years away from experiences sufficiently integrated to warrant the label of Metaverse. The two experiences underpinned by this array of technologies are Augmented Reality (AR) and Virtual Reality (VR), both covered by the term Extended Reality (XR). AR can be defined as overlaying digital representations on top of what we experience in the real world; it is largely synonymous with Mixed Reality (MR). VR can be defined as the creation of an immersive alternate reality. Big Think has a more detailed series of definitions.

Definitions

Most commercial definitions of the Metaverse emphasise the connectivity between different digital experiences. They already recognise that no one wants to create multiple digital selves to participate in different experiences. How commercial realities will affect this aspiration is uncertain: as anyone who has signed up for multiple streaming services (Netflix, Amazon Prime, Apple TV, etc.) will tell you, it can be frustrating.

My working definition of the Metaverse, for Vice-Chancellors, is:

A series of experiences of augmented reality (AR) and virtual reality (VR), grouped under the banner of extended reality (XR) and facilitated by technologies (headsets, touch-sensitive haptic clothing, etc.). Participation in the Metaverse allows individuals to create a digital version of themselves (a digital twin) and immerse themselves ‘inside’ the internet, as a representation of the digital world.

A shorter version is:

Metaverse is the intertwining of increasingly immersive digital spaces, experienced through XR technologies by you as your digital twin.

Implications

What this will look like in practice is open to question. At one extreme, a favourite film plot for dystopias, individuals will spend most of their time ‘plugged in’ to a virtual ‘Matrix’, experiencing less and less human contact. More positive visions suggest a reality in which working from home does not mean you cannot participate in person at a stand-up: simply pop on the virtual reality headset and hyper-real representations of your team members appear in the space chosen for the meeting. Wearing a touch-sensitive technology suit (a haptic suit) would mean you could shake the hand of a new member and feel the pressure of their handshake. In its most utopian representation, it could be equated to the holodeck from the Star Trek franchise, a fully immersive hyper-real experience.

It is worth remembering that the concept of the Metaverse is not a new one; it has been around for at least 30 years. The term Metaverse is frequently attributed to Neal Stephenson and his 1992 cyberpunk novel Snow Crash. Star Trek’s own television representation of the holodeck began in 1988. Its conceptualisation within education was explored in 1995, when John Tiffin and Lalita Rajasingham described fully immersive virtual learning experiences in their work In Search of the Virtual Class. An inspiring read, all the more so because it is now 26 years old.

The challenge for university leadership is to know whether to invest and get ahead of the wave, uncertain as to the regulatory frameworks that are likely to be imposed, lack of clarity about the implications for personal privacy, and doubt as to which of the big players will set the technological standards that will allow for interoperability.

In summary

The Metaverse IS coming, will be complex, untidy, multispeed, digitally divisive, and fragmented in its realisation and implementation. The Metaverse IS NOT a product or service you can buy for your students.

Next time: What is currently possible within the Metaverse




Gillis, M. (2022, August). Emerging technologies ushering the life sciences industry into the Metaverse, according to Accenture report. Accenture Newsroom. https://newsroom.accenture.com/news/emerging-technologies-ushering-the-life-sciences-industry-into-the-metaverse-according-to-accenture-report.htm
Stephenson, N. (1992). Snow crash (Reissued). Penguin Books.
Tiffin, J., & Rajasingham, L. (1995). In search of the virtual class: Education in an information society. Routledge.

Workshop review: ‘Innovating Pedagogy 2022’

On Thursday 8th September I had the privilege of running an online workshop for FLANZ to explore the potential of a range of different pedagogical approaches that might apply across educational sectors in New Zealand and Australia.

Innovating Pedagogy 2022 is the 10th annual report from the Open University (UK), exploring new forms of innovative and interactive practice in teaching, learning and assessment. These innovations already exist in pockets of practice but are not considered mainstream. The report, a collaboration between the Institute of Educational Technology at The Open University, UK, and the Open University of Catalonia, Spain, is compiled through a filtering process based on a review of published studies and other sources. Ten concepts or themes are identified.

  1. Hybrid models: maximising learning flexibility and opportunities. Beyond the strict curriculum delineations of blended learning models, hybrid forms aim to empower learners to optimise their own choices as to where, when, and how to learn. Providing flexible choices requires teachers and institutions to adjust their systemic approaches.
  2. Influencer-led education: learning from education influencers on social media platforms. Acknowledges the growth of edu-influencers, who optimise their use of social media tools to share their knowledge, experience, and passion for a range of subjects, from the highly specialised to the generic. Evaluating the veracity of the message is a challenge for the learner.
  3. Dual learning scenarios: connecting learning in classrooms and industrial workplaces. A step up from work-integrated learning models; the expectation is that course designers fully meld formal classroom and work spaces into a coherent experience.
  4. Pedagogies of the home: understanding the home as a place for cultural learning. Not the same as home-schooling; rather, it seeks to leverage the wider socio-cultural environment the learner inhabits. It also recognises the burden on marginalised communities to participate fully.
  5. Pedagogies of micro-credentials: accredited short courses to develop workplace skills. Existing approaches, snippets taken from existing programmes, fail to create an effective learning ecosystem for learners who require support to develop a patchwork portfolio meshing formal, non-formal, and informal experiences together.
  6. Pedagogy of discomfort: emotions as powerful tools for learning and for promoting social justice. A process of self-examination that requires students to engage critically with their ideological traditions and ways of thinking about issues such as racism, oppression, and social injustice.
  7. Pedagogy of autonomy: building capacity for freedom and independent learning. Explores incorporating informal, non-formal, and formal learning patterns into the learner’s experience, creating self-regulated learners, with an emphasis on metacognitive development and on allowing learners to reflect their true selves.
  8. Wellbeing education: promoting wellbeing across all aspects of teaching and learning. Wellbeing education helps students develop mental health ‘literacy’ by teaching them how to manage their own mental health, recognise possible disorders, and learn how, where, and when to seek help.
  9. Watch parties: watching videos together, whatever the time or place. Leveraging the increased connectivity prompted in response to covid-19, and the move of media providers to offer educational tools, this is the notion of structured engagement around a shared viewing (or listening) experience.
  10. Walk-and-talk: combining movement and conversation to enhance learning. Not just in service of those in need of emotional support, where the therapeutic benefits have been proven, but across a wide range of learning activities where reflection and thought are best served by being away from the classroom, outside and mobile.
10 Themes from the 2022 Innovating Pedagogy report

The workshop used Mentimeter as an online polling tool. Of the 25 participants, 20 regularly voted, making 659 submissions. The tertiary sector dominated, at 15, with fewer representatives from the Private Training Enterprise and commercial L&D sectors and only one from compulsory education. Only two Australians participated. Despite having laboured the point in all publicity materials that it would be valuable to read the report before participating, only 8 said they had read it (or the summary), with 11 admitting they had not. Of the 17 who responded to the question about their approach to new educational technologies, 12 saw themselves as ‘progressive’, 2 as ‘radical’, and 3 as ‘pedestrian’.

To get participants thinking about each pedagogic approach, we ran a 2×2 square exercise, asking what the relative effort versus impact might be. See the video for responses. Following breakout groups, we ranked the innovations in terms of the attention participants would pay to them in the next 12 months in their personal practice (see screenshot above). The general consensus was that, whilst there was nothing exceptional or radical in any of these innovations, they provided a focus for reflection and were deemed stimulating. Thank you to all who participated.


Kukulska-Hulme, A., et al. (2022). Innovating Pedagogy 2022: Open University Innovation Report (No. 10). The Open University.

The threat to the integrity of educational assessments is not from ‘essay mills’ but from Artificial Intelligence (AI)

The threat to the integrity of educational assessments is no longer from ‘essay mills’ and contract cheating but from Artificial Intelligence (AI).

It is not so long ago that academics complained that essay mills, ‘contract cheating’ services, and commercial companies piecing together ‘bespoke’ answers to standard essay questions were undermining the integrity of higher education’s assessment processes. The outputs of these less-than-ethical endeavours were designed to evade the plagiarism detection software (such as Turnitin and Urkund) that so many institutions have come to rely on. This reliance, in part the result of rising student–tutor ratios, the use of adjunct markers and poor assessment design, worked for a while. It no longer works particularly well.
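To see why matching-based detection struggles, here is a deliberately simplified sketch. The commercial tools mentioned above use proprietary fingerprinting, so this toy word n-gram overlap score is only a stand-in for the general matching idea, not their actual algorithms; the sample sentences are invented for illustration:

```python
# Toy illustration: overlap-based matching catches verbatim copying,
# but scores zero on a paraphrase that preserves the meaning.

def ngrams(text: str, n: int = 3) -> set:
    """Set of word n-grams in a lowercased text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str, n: int = 3) -> float:
    """Jaccard similarity over word n-grams (1.0 = identical n-gram sets)."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    return len(ga & gb) / len(ga | gb) if (ga | gb) else 0.0

source = "essay mills undermine the integrity of assessment in higher education"
verbatim = source
paraphrase = "commercial writing services erode how trustworthy university assessment can be"

print(similarity(source, verbatim))    # 1.0 -> flagged immediately
print(similarity(source, paraphrase))  # 0.0 -> passes unchallenged
```

Because the paraphrase shares no three-word sequence with the source, its overlap score is zero even though the meaning is nearly identical; that gap is exactly what AI rewriting tools exploit.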


If you are interested in reviewing your programme or institutional assessment strategy and approaches please get in touch. This consultancy service can be done remotely. Contact me


Many institutions sighed with relief when governments began outlawing these commercial operations (in April 2022 the UK passed the Skills and Post-16 Education Act 2022, following New Zealand and Australian examples) and went back to business as usual. For the less enlightened, this meant a return to setting generic, decontextualised, knowledge-recitation essay tasks. Some have at least learnt to require a degree of contextualisation in their students’ work, introduced internal self-justification and self-referencing, required ‘both sides’ arguments rather than declared positions, and applied the ‘could this already have been written?’ test in advance. Banning essay mills, or ‘contract cheating’, is necessary, but it is not enough to secure the integrity of assessment regimes.

Why students plagiarise is worthy of its own post, but suffice it to say it varies greatly from student to student. A very capable student may simply be terrible at time management and fear running out of time, or feel the assessment is unworthy of them. Another may be fearful of their ability to express complex arguments and, in pursuit of the best possible grade, plagiarise. Some may simply not have learnt to cite and reference, or to appreciate that rewording someone else’s thoughts without attribution also constitutes plagiarism. And there is that category of students whose cultural reference point, deference to ‘the words of the master’, makes plagiarism conceptually difficult to understand.

I remember receiving my most blatant example of plagiarism and academic malpractice back in 2006: a student submitted a piece of work that included 600 words copied wholesale from Wikipedia, complete with internal bookmarks and hyperlinks. I suspect the majority of students are now sufficiently digitally literate not to make that mistake, but how many are now also in a position to do what the essay mills used to do for them: stitch together, paraphrase and redraft existing material using freely available AI text generation tools?

As we encourage our students to search the web for sources, how easy is it for them to access some of these easily accessible, and often free, online tools? They include https://app.inferkit.com/demo, which lets you enter a few sentences and then generates longer text on that basis. You can enter merely a title of at least five words, or a series of sentences, into https://smodin.io/writer and have it generate a short essay, free of references. Professional writing tools aimed at marketers, such as https://ai-writer.com, would require a subscription to be effective but would allow students to generate passable work. This last tool actually tells you the sources from which its abstractions have been drawn, including academic journals.

You might find it enlightening to take something you have published and put it through one of these tools and evaluate the output.

It is insufficient to ask students to generate their own question, or even to contextualise their own work; some of the emergent AI tools can take account of context. There is a need to move away from the majority of long-form text assessments. With the exception of those disciplines where writing more than a thousand words at once is justifiable (journalism, policy studies, and some humanities subjects), assessments need to be as close to real-world experience as possible, and evidently the product of an individual.

Paraphrasing is a skill, and a valuable one in a world where most professions do not lack raw information. The challenge is to evaluate the quality of that information and then reduce it to a workable volume.

I’ve worked recently with an institution reviewing its postgraduate politics curriculum. I suggested that rather than trying to stop students from ‘cheating’ by paraphrasing learned texts, they should encourage students to learn how to enhance the output of these AI tools. Using one of these tools to paraphrase, and essentially rewrite, a WHO report for health policy makers made it more readable, but it also left out certain details essential to effective policy responses. Knowing how to read the original, use a paraphrasing tool, and then identify and correct the deficiencies of its output was a useful skill for these students.

We cannot stop the encroachment of these kinds of AI text manipulation tools in higher education, but we can make their contemporary use more meaningful to the student.




Image was generated by DALL-E



Psychomotor skills should be at the core of all learning

Any learning design framework that does not address the psychomotor skills is not worth exploring.

There is not a single discipline taught in any formal, non-formal or informal way that does not make use of some tool or technology, instrument or mechanism (aka media) at some point in the process. It makes sense, then, that any curriculum development process should put media at the forefront of its planning. Curricula need to be developed around intended learning outcomes that are clearly articulated around the development of psychomotor skills.

Rather than have intellectual (cognitive) outcomes such as (students will be able to):

Apply transformations and use symmetry to analyse mathematical situations
would it not be better to say:
Utilise graphical representation software in order to analyse mathematical transformations and symmetry

That way the student must demonstrate a practical ability in order to evidence an intellectual skill.

Another example, from a quite different discipline: theatre studies. Rather than say,

Demonstrate an understanding of all aspects of theatrical production including design and technical functions [a real, but poorly written, outcome]
would it not be better to say:
Produce design and technical specifications for a theatrical production

The learner cannot evidence their ability to meet that outcome without also fulfilling the weaker intellectual outcome.

The course design process then becomes skills-focused rather than knowledge-oriented: knowledge is acquired within a practical context. The psychomotor outcomes are not overly specific; they do not say ‘using Algosim to generate mathematical visualisations’ or ‘manage stage plans using ShowNotes’, because naming a particular technology does not allow for the future evolution of those technologies (renaming, rebranding, etc.). The focus is on developing the skills, always with an eye to their transferability across other tools. We should also always teach the ‘paper and pencil’ version alongside, so the increments between origination and implementation are also evident.

The 8-Stage Learning Design Framework has as its third step the ‘Media Choices’, which requires programme and course designers to review the current (and evolving) environment into which graduates will emerge. This should incorporate a review of the tools and technologies that students are expected to use ‘on the job’. Only after this stage is complete is it appropriate to draft Intended Learning Outcomes, then assessment, and then learning & teaching activities.


See Courses on both Designing Effective Intended Learning Outcomes and the Introduction to Five Educational Taxonomies, which includes the Psychomotor domain.



do-not-reply@: the inefficiencies of email use demonstrated by graduates

Graduates, and their colleagues, born after 1970 are unlikely to have worked in a context in which email was not a primary communication tool. Its inefficiency is manifest but often overlooked.

I went from my undergraduate degree in 1988 into a role as a Logistics Manager for Vichy L’Oreal. The job involved using a stand-alone PC, a telephone to the factory in France, and a weekly Telex to Paris. I was appointed on the basis that I spoke French. I couldn’t write French for caramel au beurre (toffee), but I could speak it. No one asked me at my interview whether I had any computer skills or indeed whether I was numerate. I didn’t have email.

Today employers make similar assumptions about digital competence. I have just finished a consultancy project looking at embedding real-world practices and assessment in a business degree. The external reviews had been good, but the feedback from destination surveys with employers was less so. Their recent recruits could not communicate appropriately using email.

This challenge was described by Cal Newport in his Harvard Business Review piece “A Modest Proposal” (2016). Outlining IBM’s early experience of corporate email in the 1980s, he explains that while the first few days showed positive signs of increased productivity, employees quickly adopted the system for their routine offline communication, and notably communicated ‘vastly more’ than they had before. “Thus — in a mere week or so — was gained and blown the potential productivity gain of email,” Newport cites an IBM manager as saying. Email is now a source of inefficiency, yet its use is so ubiquitous that it is difficult to imagine running an organisation without it.

The review of this particular business degree found much that was positive: lots of collaborative project-type activities and group assessments. Business simulations using software embedded in the institutional learning management system form the basis of the second year, and a ‘new’ business project provides the foundation for the third year. In many ways it is a progressive design, sharing characteristics of many medical schools’ approaches to problem-based learning. So why weren’t the employers recruiting from this degree happy?

It turns out that the employers’ biggest complaint was the inability of graduates to use email effectively. Their complaints mirror closely many of my own about the way colleagues routinely use email, so let’s unpack them.

Saying enough, but not too much. Writing a memo by hand required thought before you wrote and a conscious decision about whether to ‘go over the page’. It led to shorter, more direct ways of asking questions or answering them. Students who do not use email (preferring messaging apps) are not practised at measuring their message.

Treating email like chat. Students who are invested in the immediacy of communication, and the transient nature of the message, are prone not to invest sufficient thought in the enduring nature of business communication. Microsoft Teams, Yammer and other messaging apps simply reinforce this behaviour. Communication may be immediate, but it is also ‘cheap’.

Inappropriate use of reply-all. Senior managers expressed some frustration at receiving emails with ‘thanks’ as the entire body, sent to everyone, sometimes dozens of people, simply because the sender hit ‘reply-all’. The question one might ask is: if you received a paper memo from the same person, with the same information, would you go to the trouble of writing a memo back that says ‘thanks’?

Lazy addressing of email. Dragging out an old email in order to identify a sender and then hitting reply fails on two counts. Firstly, without a change to the subject heading, the email may be filed within a conversation thread that is otherwise closed, where it may get lost and prove difficult to retrieve. Secondly, the dreaded ‘reply-all’ means that one risks sending a message to people you did not mean to contact. Human Resources departments often have stories of ‘misunderstandings’ born merely of email sent to the ‘wrong’ people.

Some larger commercial organisations are templating emails, turning them into ‘digital memos’, using mailing aliases, making more use of BCC, and originating emails from ‘do-not-reply@‘ addresses. I came across one company that has banned all internal email, using Teams for anything internal and reserving email for external communications. There are moves to reimpose a degree of structure around workflow, and to issue mandates (as French legislation does) protecting employees’ right to ‘unplug’. Organisations are embracing the limitations imposed by tools like Slack, imposing different internal and external channels. Others are simply exploring internal training courses on how to write emails efficiently. Some have even chosen to bring back the paper memo to replace some internal communication. Somewhere, I am sure, someone is ‘reinventing’ Lotus Notes as I write.

There is a growing problem, not just for the younger generations now graduating with no experience of email efficiency, but for businesses worldwide. My advice to the leadership of the degree I was consulting on was to make the second- and third-year projects less ‘clinical’, more chaotic in terms of the technology platforms available to students. Students need to learn to communicate using email in its wild and untamed form. That would require students to be taught how to write digital memos. Then any restricted workflow they may encounter will be a bonus.

I had actually used a PC before I joined Vichy in 1988, but certainly not in the way the role required of me. Fortunately I seemed to have the ‘knack’. As well as the weekly telex and daily phone calls, I wrote a handful of internal memos (on paper) each week. I was managing monthly stock flows worth millions of pounds with just that level of communication. I imagine there are at least three people doing the same job now, engaged in a Sisyphean effort just to manage the email traffic.


Newport, Cal. “A Modest Proposal: Eliminate Email.” Harvard Business Review, February 18, 2016. https://hbr.org/2016/02/a-modest-proposal-eliminate-email.

Image by Sara Kurfeß @stereo.prototype at unsplash.com

 

How do you define hybrid, or hyflex, learning?

I struggled recently to define hybrid learning to a client. They asked how they could go about creating ‘hybrid learning’ for their learners. A reasonable question?

There appears to be some confusion, in practice and in the literature, as to the differences between hybrid, hyflex (hiflex, hi-flex, etc), and blended learning. So, I would like to take a minute to propose some definitional parameters, and wait to see if you agree or disagree.

The terms hybrid and hyflex are, in my mind, essentially the same thing, but they differ from ‘mainstream’ blended approaches. Blended learning, as curricula and teaching practice, determines where a learner studies and what they are doing in each space. The blend is anticipated and written into the curriculum. The teacher knows what the student will be doing in-person or as a distance learner. Indeed, the course is most probably designed ‘flipped-classroom’ style, to optimise the precious time in face-to-face contexts, whether in-person or virtual. There are a few flavours of blended learning, but they are all pre-determined by the course creator.

Hybrid, or hyflex, approaches attempt to give some agency, some control, to the learner as to the nature of their learning experience: the when, where and how. Both aim to empower the student to choose what learning should be studied face-to-face and what should be studied online, and how to go about engaging with that learning. The only apparent difference, largely in US practice, appears to be the unpacking of the distance participation element into asynchronous or synchronous online engagement. To me it is a distinction without a difference.

This hybrid/hyflex nature very often means courses spawn new hybrid ‘spaces’ in which there is an attempt at seamless integration between real-world in-person and virtual learning experiences. This means that designers of courses that aspire to hybrid/hyflex learning may be required to enable the same (or equivalent) learning experiences to be modelled in multiple forms or alternative spaces (Bennett et al., 2020; Goodyear, 2020). This could be a significant burden. Think of it as Universal Design for Learning (UDL) on drugs.

Blended and hybrid/hyflex are in fact all flexible models of delivery. They all make use of different combinations of the two modes of learning, in-person and distance, and they all fall within a regulatory and validation authority that determines the relative openness of programmes of study. Flexible is anything that is less than fixed; it is merely a question of degree. Courses sit on a spectrum of curriculum delivery between rigid and flexible.

I persuaded this particular client that they did not need to go ‘all-in’ and design courses for hybrid delivery. Rather, they simply needed to consider what learning and teaching activities were best suited for ‘away-from-the-classroom’ study and to determine whether these required independent study or collaboration with others. To be a bit more… flexible.

It wasn’t the answer they wanted. After all, being ‘hybrid’ is so very much, you know, ‘now’. But it’s the answer they got.

Dr Simon Paul Atkinson

15 July 2022

Bennett, Dawn, Elizabeth Knight, and Jennifer Rowley. “The Role of Hybrid Learning Spaces in Enhancing Higher Education Students’ Employability.” British Journal of Educational Technology 51, no. 4 (2020): 1188–1202. https://doi.org/10.1111/bjet.12931.

Goodyear, Peter. “Design and Co‐configuration for Hybrid Learning: Theorising the Practices of Learning Space Design.” British Journal of Educational Technology 51, no. 4 (2020): 1045–60. https://doi.org/10.1111/bjet.12925.

Image generated using OpenAI DALL-E

Very Brief Overview of ‘Innovating Pedagogy 2022’

This very brief summary is in no way to be taken as a substitute for reading the full report, or indeed the Executive Summary, which is available here: Innovating Pedagogy 2022

This is the 10th annual report exploring new forms of interactive and innovative practice in teaching, learning and assessment. These innovations already exist in pockets of practice but are not considered mainstream. The report, a collaboration between the Institute of Educational Technology at The Open University, UK, and the Open University of Catalonia, Spain, is the result of a filtering process, compiled from a review of published studies and other sources.

Hybrid models
Maximising learning flexibility and opportunities. Beyond the strict curriculum delineations of blended learning models, hybrid forms aim to empower learners to make their own choices as to where, when and how to learn. Providing flexible choices requires teachers and institutions to adjust their systemic approaches.
Influencer-led education
Learning from education influencers on social media platforms. Acknowledges the growth of edu-influencers, who optimise their use of social media tools to share their knowledge, experience and passion for a range of subjects, from the highly specialised to the generic. Evaluating the veracity of the message is a challenge for the learner.
Dual learning scenarios
Connecting learning in classrooms and industry workplaces. A step on from work-integrated learning models, the expectation is that course designers fully meld both formal classroom and work spaces into a coherent experience.
Pedagogies of the home
Understanding the home as a place for cultural learning. Not the same as home-schooling; rather, it seeks to leverage the wider socio-cultural environment that the learner inhabits. It also recognises the burden marginalised communities face in participating fully.
Pedagogies of microcredentials
Accredited short courses to develop workplace skills. Existing approaches, snippets taken from existing programmes, fail to create an effective learning ecosystem for learners who require support to develop a patchwork portfolio meshing formal, non-formal and informal experiences together.
Pedagogy of discomfort  
Emotions as powerful tools for learning and for promoting social justice. A process of self-examination that requires students to critically engage with their ideological traditions and ways of thinking about issues such as racism, oppression and social injustice.
Pedagogy of autonomy
Building capacity for freedom and independent learning. Explores the notion of incorporating informal, non-formal and formal learning patterns into the learner’s experience, creating self-regulated learners, with an emphasis on their metacognitive development and on allowing them to reflect their true selves.
Wellbeing education
Promoting wellbeing across all aspects of teaching and learning. Wellbeing education helps students to develop mental health ‘literacy’ by teaching them how to manage their own mental health, recognise possible disorders, and learn how, where and when to seek help.
Watch parties
Watching videos together, whatever the time or place. Leveraging the increased connectivity prompted in response to Covid-19, and the move of media providers to offer educational tools, this is the notion of structured engagement around a shared viewing (or listening) experience.
Walk-and-talk
Combining movement and conversation to enhance learning. Not just used for those in need of emotional support, where the therapeutic benefits have been proven, but across a wide range of learning activities where reflection and thought are best served by being away from the classroom, outside and mobile.
10 Themes from the 2022 Innovating Pedagogy report

 

Kukulska-Hulme, A., et al. (2022). Innovating Pedagogy 2022: Open University Innovation Report (No. 10). The Open University.
 

Dr Simon Paul Atkinson PFHEA / 13 July 2022

Image is generated by OpenAI’s DALL-E2

Teaching about existential threats: why we need to teach concepts, not just facts.

It has now been more than four months since Russia’s invasion of Ukraine. Like many, I have been ruminating. This post is about that. Or at least, not directly. I have been thinking about how badly we need to be teaching about existential threats, and I think we need to develop a curriculum that is open to contemporary real-world challenges.

I think global education needs to adjust to new realities. The First World War, the Great War, wasn’t a world war in July 1914; it became one later. The Second World War likewise was not a world war in September 1939, although it engulfed the globe in due course. We are yet to see whether the February 2022 Russian invasion of Ukraine will prove to have been the start of a Third World War. Hopefully it will not become that kind of milestone, but I think we owe it to students to prepare them for the possibility.

Education is different now than it was in 1914 or 1939. Now we have wall-to-wall coverage, ubiquitous social media and real-time battlefield insights piped into children’s, adolescents’ and adults’ television screens, tablets and smartphones. There is almost as much misinformation as there is fact on many platforms; as many well-meaning transmitters of misleading (sometimes factually inaccurate) news as there are respectable and accredited voices. The democratisation of information is a good idea, but it assumes the individual generators of that information are well informed, critical and, if opinionated (or biased), state that upfront.

I have learnt a great deal about Ukraine in recent months, some of it from listening to TikTokers and YouTubers. Some summarise Ukrainian news sources for us, and others share their daily lives from war-torn cities. Some review Russian media sources. Some provide valuable daily summaries; others share emotive responses to news as it happens. I am, I believe, relatively digitally literate. I am critical of the sources, often look up the individual commentators on other social media, check their LinkedIn profiles and review past output. This last element is particularly interesting. I have been suspicious, though not dismissive, of social media accounts that started in late February 2022. Some are clearly chasing followers, clicks and likes. Some are clearly trying to provide what they see as a genuine information service. Telling the difference is not always easy. Our students need to learn these skills.

Risk evaluation is a very difficult thing to teach. Each student will have different life experiences that make them more or less fearful of uncertainty. Those who lived through the Cuban missile crisis or the nuclear standoffs of the early 1980s may say they have seen it all before. What is different now is the ubiquitous nature of information, and misinformation, which is in danger of confusing students’ ability to make their own judgements.

Educators are morally obliged to teach the unseen. That includes climate change and the risk of nuclear war. There is not a discipline that cannot leverage the moment. Social sciences and humanities obviously have an edge; physical sciences are often less flexible in terms of the curriculum. But I think it is important that anyone teaching today pauses before delivering any concept, any idea or thought, and considers whether there is a contemporary example amongst the unseen existential threats that exist. It means sometimes abandoning our own safe assumptions, our own safe havens, and exploring things that we ourselves may see as uncertainties.

There are any number of elements within any curriculum that can leverage the Russia-Ukraine war. Political scientists can explore the notion of the Eurasian multipolar world view, geographers can explore the Three Seas Initiative, and sociologists can explore the religious realignment we are now seeing amongst the Orthodox churches. Examples are endless if we focus on the concept rather than the content. Beyond any obvious historical comparisons, there are lessons to be learnt across all the disciplines using contemporary examples. The learning of concepts, geopolitical perspectives, resource management and cultural power, is more useful to students than any specific set of facts.

How relevant the current Russian invasion of Ukraine is perceived to be by faculty and their students alike will depend largely on geography. While the conflict itself is seen as a largely European ‘problem’, and its global economic implications are yet to be clearly felt, I understand that these reflections are probably more relevant to my European colleagues than to many others. But the principle still stands. All concepts that form any part of the curriculum need to be based within a contemporary world context. We need to leverage the current crisis that is being witnessed by students through the prism of social media. In doing so we can both serve the curriculum and educate students in critical judgement about their sources of information.

Concepts not Content

I am passionate about privileging the teaching and learning of concepts rather than content. Concepts are instruments that serve to identify, define, explain, illustrate and analyse real-life elements and events, past, present or future. These are usually within the confines of a particular geography, social context and within discipline conventions, but when defined well, reach across all cultural boundaries.

There are essentially two kinds of concepts: sensory and abstract. Sensory concepts are tangible, they can be experienced through our senses. Abstract concepts are not directly experienced, they are often not visible and need to be imagined. There is a simple three step process for you to consider as you build learning with concepts: define, illustrate, and imagine.

Define:

It is important to keep the definition of a concept at its simplest. It should be self-contained.

Let’s take for example the statement that Regional wars have global consequences.

We could then unpack what we mean by regions, wars and global consequences. The easiest way to validate your concept’s definition is to see how easy it is to state its opposite: regional wars do not have global consequences.

I can already envisage an assessment task that asks students to identify regional conflicts that did not have global consequences, and then has their peers challenge them with alternative perceptions of those consequences (after some enquiry-based learning).

Illustrate:

Illustrating a concept helps learners to categorise new knowledge, to cement that new learning in a hierarchy or order of reality. Illustrations can be examples that demonstrate the truth of the definition, or its opposite. An illustration that does not match the definition also serves to help learners make sense of the definition. So for the example “regional wars can have global consequences”, I could describe the key protagonists and events that led to the 1956 war in the Middle East between Israel and the Arab powers, which had profound long-term implications for the loss of European influence and the rise of the United States as a regional power broker.

For its opposite I could take the regional war fought between the Sahrawi Indigenous Polisario Front and Morocco from 1975 to 1991 (and involving Mauritania between 1975 and 1979) for control of Western Sahara, which has had minimal global impact, although it is still a live issue in that region.

Neither was a world war; both were clearly regional conflicts, but with different impacts. A useful conceptual space in which to unpack thoughts and ideas with students. Learners do not need detailed knowledge of the background histories of the parties to develop an understanding of why these two conflicts had different implications. The challenge for them is to unpack the factors that make up the definition of the concept shared earlier, that regional wars have global consequences.

I am not teaching my students about the 1956 Israel-Arab war or the war in Western Sahara; I am illustrating the factors that make the truth of my definition self-evident. Examples and non-examples both support the interpretation of concepts.

Imagine:

Imagining scenarios in which the concept might be illustrated, perhaps using analogies, can prove very effective. Interpreting analogies requires the learner to deconstruct and reconstruct the elements of the concept; this supports deeper comprehension, improves retention and allows the learner to adapt the meaning of a concept to their own socio-cultural context.

It is important that, as you construct your imagined scenario or analogy, you ground it in the existing, or at least conceivable, experience of your learners. There is a danger that we forget just how culturally diverse our student cohorts are. References to popular culture, national habits and pastimes may mean something to you but are not going to be universally understood.

You could for example ask students to imagine a conflict between whichever country you are teaching in and ask how a conflict with a neighbour state might, or might not, have global consequences. I acknowledge for too many in the world this is not merely an intellectual exercise.

Changing Practice

Concepts are foundational to all new learning but we, in tertiary education, are in the habit of burying or obscuring the key concepts amidst the weight of information, and then expecting the learner to be able to think in abstract terms.

I recently had a lecturer tell me that they didn’t have time to change the example they were using to teach supply and demand, a well-developed scenario based on the oil price during the Second Gulf War. I found that very hard to believe, given that the current Russian invasion of Ukraine has had a direct impact on oil and commodity prices globally. Why, I suggested, didn’t they ask the students to fill in the details of the scenario themselves, so that they would better understand the implications of each factor, rather than sharing a pre-prepared example? That might also provide an opportunity for students to talk more openly about the current threats they perceive as impacting on them personally as a result of this particular war.

Strange as it may seem, I think the current war provides an important catalyst for the re-evaluation and revitalisation of much of our social science and humanities curriculum. It reminds us that there are existential threats around us, and that these should serve as pivotal points of reference as we explore concepts with our students, enabling them to make meaningful connections.

Students need to be encouraged to seek out sources of information with a critical eye in order to be better prepared for the unforeseen.


Photo by Антон Дмитриев on Unsplash
