Plagiarism: desperately in need of redefinition in the age of generative AI.

The vernacular definition of plagiarism is often “passing off someone else’s work as your own” or, more fully, in the University of Oxford’s guidance, “Presenting work or ideas from another source as your own, with or without consent of the original author, by incorporating it into your work without full acknowledgement.” This latter definition works better in the current climate, in which generative AI assistants are being rolled out across many word-processing tools. When a student can start a prompt and have the system, rather than another individual, write paragraphs, there is an urgent need to redefine academic integrity.

If they are not your own thoughts committed to text, where did they come from? Any thoughts that are not your own need to be attributed. Generative AI applications are already being used in the way that previous generations made use of Wikipedia: as a source of initial ‘research’, clarification and definitions, and, for the more diligent, perhaps for sources. In the early days of Wikipedia I saw digitally illiterate students copy and paste wholesale blocks of text from the website straight into their submissions, often without even removing the hyperlinks! The character of Wikipedia as a source has evolved. We need to engage in an open conversation with students, and between ourselves, about the purpose of any writing task assigned to a student. We need to quickly move students beyond the unreferenced chatbots into structured and referenced generative AI tools, and deploy what we have learnt about Wikipedia. Students need to differentiate between their own thoughts and triangulate everything else before citing and referencing it.

Image: Midjourney 12/06/23

Empower Learners for the Age of AI: a reflection

During the Empower Learners for the Age of AI (ELAI) conference earlier in December 2022, it became apparent to me personally that not only does artificial intelligence (AI) have the potential to revolutionize the field of education, but that it already is doing so. Beyond the hype and enthusiasm, however, there are enormous strategic policy decisions to be made by governments, institutions, faculty and individual students. Some of the ‘end is nigh’ messages circulating on social media in the light of the recent release of ChatGPT are fanciful click-bait; some, however, fire a warning shot across the bow of complacent educators.

It is certainly true to say that if your teaching approach is to deliver content knowledge and assess the retention and regurgitation of that same content knowledge then, yes, AI is another nail in that particular coffin. If you are still delivering learning experiences the same way that you did in the 1990s, despite Google Search (b.1998) and Wikipedia (b.2001), I am amazed you are still functioning. What the emerging fascination with AI is delivering is an accelerated pace to the self-reflective processes that all university leadership should be undertaking continuously.

AI advocates argue that by leveraging the power of AI, educators can personalize learning for each student, provide real-time feedback and support, and automate administrative tasks. Critics argue that AI dehumanises the learning process, is incapable of modelling the very human behaviours we want our students to emulate, and can be used to cheat. Like any technology, AI also has its disadvantages and limitations. I want to unpack these from three different perspectives: the individual student, faculty, and institutions.


Get in touch with me if your institution is looking to develop its strategic approach to AI.


Individual Learner

For learners whose experience is often orientated around learning management systems, or virtual learning environments, existing learning analytics are being augmented with AI capabilities. Where in the past students might be offered branching scenarios preset by learning designers, the addition of AI functionality offers the prospect of algorithms that more deeply analyze a student’s performance and learning approaches, and provide customized content and feedback tailored to their individual needs. This is often touted as especially beneficial for students who may have learning disabilities or those who are struggling to keep up with the pace of a traditional classroom, but surely the benefit is universal when realised. We are not quite there yet. Identifying ‘actionable insights’ is possible; the recommended actions are harder to define.

The downside for the individual learner will come from poorly conceived and implemented AI opportunities within institutions. Being told to complete a task by a system, rather than by a tutor, will be received very differently depending on the epistemological framework that you, as a student, operate within. There is a danger that companies presenting solutions that may work for continuing professional development will fail to recognise that a 10-year-old has a different relationship with knowledge. As an assistant to faculty, AI is potentially invaluable; as a replacement for tutor direction it will not work for the majority of younger learners within formal learning programmes.

Digital equity becomes important too. There will undoubtedly be students today, from K-12 through to University, who will be submitting written work generated by ChatGPT. Currently free, for ‘research’ purposes (them researching us), ChatGPT is being raved about across social media platforms for anyone who needs to author content. But for every student that is digitally literate enough to have found their way to the OpenAI platform and can use the tool, there will be others who do not have access to a machine at home, or the bandwidth to make use of the internet, or even to have the internet at all. Merely accessing the tools can be a challenge.

The third aspect of AI implementation for individuals is around personal digital identity. Everyone, regardless of their age or context, needs to recognise that ‘nothing in life is free’. Whenever you use a free web service you are inevitably being mined for data, which in turn allows the provider of that service to sell your presence on their platform to advertisers. Teaching young people about the two fundamental economic models that operate online, subscription services and surveillance capitalism, MUST be part of every curriculum. I would argue this needs to be introduced in primary schools and built on in secondary. We know that AI data models require huge datasets to be meaningful, so our data is what fuels these AI processes.

Faculty

Undoubtedly faculty will gain from AI algorithms’ ability to provide real-time feedback and support: to continuously monitor a student’s progress and provide immediate feedback and suggestions for improvement. On a cohort basis this is proving invaluable already, allowing faculty to adjust the pace or focus of content and learning approaches. A skilled faculty member can also, within the time allowed to them, differentiate their instruction, helping students to stay engaged and motivated. Monitoring students’ progress through well-structured learning analytics is already available through online platforms.

What of the in-classroom teaching spaces? One of the sessions at ELAI showcased AI operating in a classroom, interpreting students’ body language, interactions and even eye tracking. Teachers will tell you that class sizes are a prime determinant of student success. Smaller classes mean that teachers can ‘read the room’ and adjust their approaches accordingly. AI could allow class sizes to grow beyond any claim of being manageable by individual faculty.

One could imagine a school built with extensive surveillance capability: every classroom equipped with total audio and visual detection, physical behaviour algorithms, eye tracking and audio analysis. In that future, advocates would suggest that the role of the faculty becomes more that of a stage manager than a subject authority. Critics would argue that a classroom without a meaningful human presence is a factory.

Institutions

The attraction for institutions of AI is the promise to automate administrative tasks, such as grading assignments and providing progress reports, currently provided by teaching faculty. This in theory frees up those educators to focus on other important tasks, such as providing personalized instruction and support.

However, one concern touched on at ELAI was the danger of AI reinforcing existing biases and inequalities in education. An AI algorithm is only as good as the data it has been trained on. If that data is biased, its decisions will also be biased. This could lead to unfair treatment of certain students, and could further exacerbate existing disparities in education. AI will work well with homogeneous cohorts where the perpetuation of accepted knowledge and approaches is what is expected, and less well with diverse cohorts in the context of challenging assumptions.

This is a problem. In a world in which we need students to be digitally literate and AI literate, to challenge assumptions while recognising that some sources are verified and others are not, institutions that implement AI based on existing cohorts are likely to restrict the intellectual growth of those that follow.

Institutions rightly express concerns about the cost both of implementing AI in education and of monitoring its use. While the initial investment in AI technologies may be significant, the long-term cost savings and potential benefits may make it worthwhile. No one can be certain how the market will unfold. It is possible that many AI applications will become so cheap under some model of surveillance capitalism as to be negligible, even free. However, many AI applications, such as ChatGPT, use enormous computing power, with little that is cacheable and retained for reuse, and these are likely to become costly.

Institutions wanting to explore the use of AI are likely to find they are being presented with additional, or ‘upgraded’ modules to their existing Enterprise Management Systems or Learning Platforms.

Conclusion

It is true that AI has the potential to revolutionize the field of education by providing personalized instruction and support, real-time feedback, and automated administrative tasks. However, institutions need to be wary of the potential for bias, aware of privacy issues and very attentive to the nature of the learning experiences they enable.


Get in touch with me if your institution is looking to develop its strategic approach to AI.


Image created using DALL-E

Metaverse explained for University Leaders: What is currently possible within the Metaverse? 2/4

I am not selling anything here. That should be self-evident given that my answer to the question “what is currently possible within the Metaverse?” is, not much. I could even suggest nothing, because ‘it’ doesn’t exist yet, certainly not in the form it aspires to. What we have instead are partial experiences, glimpses of the promise of what the future holds. In part one of this four-part blog, I explored the definitions of what the Metaverse might be. We don’t have it yet.

Recent press coverage (including this from the NYT), in part the reason for the delay in issuing this second of four short articles, has highlighted how deeply unpopular the concept of an immersive working environment in the Metaverse may actually prove to be. Meta’s own Horizon platform, the immersive environment that is the company’s manifestation of the Metaverse, is proving unpopular even amongst its own employees. Essentially, the Metaverse still remains largely the domain of ‘video games’. There is a serious risk of over-inflating the promise of a virtual reality workspace. Just as 3D film has repeatedly cycled through innovation, technology breakthrough, costly implementation, partial deployment and customer non-engagement, so the Metaverse risks repeating this trajectory.

If you are looking to review institutional strategies in the light of challenges and opportunities presented by the Metaverse, please feel free to get in touch with spa@sijen.com

Nonetheless, we should discuss what is currently accessible to educators. There is a range of AR and VR visualisations that aid learning. These include 3D visualisations of the human body for medical purposes, and of engineering and architectural designs that aid a deeper understanding of structure. The challenge for academics is to ask themselves whether the learning gained through these 3D renditions adds enough value to warrant the associated costs. If you were a medical science student before these visualisations were available, would you be likely to learn anything new from them? Are these 3D images necessarily enhanced by viewing them through VR headsets? It might be a ‘nice to have’, but does it warrant the not insignificant investment in staff training and equipment?

What is currently available in the commercial world, notably in disaster response and security contexts, is a series of hyper-real representations of real-world scenes, as opposed to fantasy worlds, in which skills can be perfected. The most obvious in the public consciousness would be the flight simulators on which pilots learn to master new aircraft. Surgeons have also benefited for some years from VR renditions of difficult procedures that can be rehearsed before opening up a patient. Touching on a humanities field, but still with a foot firmly in the technical realm, the restoration team working on Notre-Dame in Paris collaborates within a VR version of the fire-gutted cathedral, discussing and experimenting with approaches before tackling the real thing.

There is no doubt that the human brain is clever. Having a 3D visualisation of an object or a scene, displayed on a flat screen, satisfies most cognitive engagements. Is immersion in virtual reality either helpful or necessary?

Graphic design and game design students would undoubtedly benefit from practice suites in which to design 3D models and game interactives, saving individual students the cost of investing in kit that is likely to be constantly upgraded as IT equipment manufacturers attempt to recoup their investments. However, unless there is a distinct visualisation requirement, asked of by current or emergent practice within the profession to which university programmes are aligned, I would suggest there is no need to invest heavily in developing the in-house capacity to create VR experiences. It remains cheaper, not cheap, but cheaper, to employ either a third party, or your own student designers, to create experiences.

What is less certain is the role that AR will play in the Metaverse. That’s for next time.

If you are looking to review institutional strategies in the light of challenges and opportunities presented by the Metaverse, please feel free to get in touch with spa@sijen.com

How do you define hybrid, or hyflex, learning?

I struggled recently to define hybrid learning to a client. They asked how they could go about creating ‘hybrid learning’ for their learners. A reasonable question?

There appears to be some confusion, in practice and in the literature, as to the differences between hybrid, hyflex (hiflex, hi-flex, etc), and blended learning. So, I would like to take a minute to propose some definitional parameters, and wait to see if you agree or disagree.

The terms hybrid and hyflex are, in my mind, essentially the same thing, but they differ from ‘mainstream’ blended approaches. Blended learning, as curricula and teaching practice, determines where a learner studies, and what they are doing in each space. The blend is anticipated and written into the curriculum. The teacher knows what the student will be doing in-person or as a distance learner. Indeed, the course is most probably designed ‘flipped-classroom’ style, to optimise the precious time in face-to-face contexts, whether in-person or virtual. There are a few flavours of blended learning, but they are all pre-determined by the course creator.

Hybrid, or hyflex, approaches attempt to give some agency, some control, to the learner as to the nature of their learning experience: the when, where and how. Both aim to empower the student to choose which learning should be studied face-to-face and which should be studied online, and how to go about engaging with that learning. The only apparent difference, largely in US practice, appears to be the unpacking of the distance participation element into asynchronous or synchronous online engagement. To me it is a distinction without a difference.

This hybrid/hyflex nature very often means courses spawn new hybrid ‘spaces’ in which there is an attempt at seamless integration between real-world in-person and virtual learning experiences. This means that designers of courses that aspire to hybrid/hyflex learning may be required to enable the same (or equivalent) learning experiences to be modelled in multiple forms or alternative spaces (Bennett et al., 2020; Goodyear, 2020). This could be a significant burden. Think of it as Universal Design for Learning (UDL) on drugs.

Blended and hybrid/hyflex are in fact all flexible models of delivery. They all make use of different combinations of the two modes of learning, in-person and distance, and they all fall within a regulatory and validation authority that determines the relative openness of programmes of study. Flexible is anything that is less than fixed; it is merely a question of degree. Courses sit on a spectrum of curriculum delivery between rigid and flexible.

I persuaded this particular client that they did not need to go ‘all-in’ and design courses for hybrid delivery. Rather, they simply needed to consider what learning and teaching activities were best suited for ‘away-from-the-classroom’ study and to determine whether these required independent study or collaboration with others. To be a bit more… flexible.

It wasn’t the answer they wanted. After all, being ‘hybrid’ is so very much, you know, ‘now’. But it’s the answer they got.

Dr Simon Paul Atkinson

15 July 2022

Bennett, Dawn, Elizabeth Knight, and Jennifer Rowley. “The Role of Hybrid Learning Spaces in Enhancing Higher Education Students’ Employability.” British Journal of Educational Technology 51, no. 4 (2020): 1188–1202. https://doi.org/10.1111/bjet.12931.

Goodyear, Peter. “Design and Co‐configuration for Hybrid Learning: Theorising the Practices of Learning Space Design.” British Journal of Educational Technology 51, no. 4 (2020): 1045–60. https://doi.org/10.1111/bjet.12925.

Image generated using OpenAI DALL-E

Very Brief Overview of ‘Innovating Pedagogy 2022’

This very brief summary is in no way to be taken as a substitute for reading the full report, or indeed the Executive Summary, which is available here: Innovating Pedagogy 2022

This is the 10th annual report exploring new forms of interactive and innovative practice in teaching, learning and assessment. These innovations already exist in pockets of practice but are not considered mainstream. The report, a collaboration between the Institute of Educational Technology at The Open University, UK, and the Open University of Catalonia, Spain, is the result of a filtering process, compiled from a review of published studies and other sources.

Hybrid models
Maximising learning flexibility and opportunities. Beyond the strict curriculum delineations of blended learning models, hybrid forms aim to empower learners to optimise their own choices as to where, when and how to learn. Providing flexible choices requires teachers and institutions to adjust their systemic approaches.
Influencer-led education
Learning from education influencers on social media platforms. Acknowledges the growth of edu-influencers, who optimise their use of social media tools to share their knowledge, experience and passion for a range of subjects, from the highly specialised to the generic. Evaluating the veracity of the message is a challenge for the learner.
Dual learning scenarios
Connecting learning in classrooms and industry workplaces. A step on from work-integrated learning models, the expectation is that course designers fully meld both formal classroom and work spaces into a coherent experience.
Pedagogies of the home
Understanding the home as a place for cultural learning. Not the same as home-schooling. Rather, it seeks to leverage the wider socio-cultural environment that the learner inhabits. Also recognises the burden on marginalised communities to fully participate.
Pedagogies of microcredentials
Accredited short courses to develop workplace skills. Existing approaches, snippets taken from existing programmes, fail to create an effective learning ecosystem for learners who require support to develop a patchwork portfolio meshing formal, non-formal and informal experiences together.
Pedagogy of discomfort
Emotions as powerful tools for learning and for promoting social justice. A process of self-examination that requires students to critically engage with their ideological traditions and ways of thinking about issues such as racism, oppression and social injustice.
Pedagogy of autonomy
Building capacity for freedom and independent learning. Explores the notion of incorporating informal, non-formal and formal learning patterns into the learner’s experience, creating self-regulated learners with an emphasis on their metacognitive development and allowing them to reflect their true selves.
Wellbeing education
Promoting wellbeing across all aspects of teaching and learning. Wellbeing education helps students to develop mental health ‘literacy’ by teaching them how to manage their own mental health, recognise possible disorders, and learn how, where and when to seek help.
Watch parties
Watching videos together, whatever the time or place. Leveraging the increased connectivity prompted by COVID-19, and the move of media providers to provide educational tools, this is the notion of structured engagement around a shared viewing (or listening) experience.
Walk-and-talk
Combining movement and conversation to enhance learning. Not just for those in need of emotional support, where the therapeutic benefits have been proven, but across a wide range of learning activities where reflection and thought would be best served by being away from the classroom, outside and mobile.
10 Themes from the 2022 Innovating Pedagogy report

 

Kukulska-Hulme, A., et al. (2022). Innovating Pedagogy 2022: Open University Innovation Report (No. 10). The Open University.
 

Dr Simon Paul Atkinson PFHEA / 13 July 2022

Image is generated by OpenAI’s DALL-E2

Teaching about existential threats: why we need to teach concepts, not just facts.

It has now been more than four months since Russia’s invasion of Ukraine. Like many, I have been ruminating. This post is about that, or at least not directly. I have been thinking about how badly we need to be teaching about existential threats. I think we need to develop a curriculum that is open to contemporary real-world challenges.

I think global education needs to adjust to new realities. The First World War, the Great War, wasn’t a world war in July 1914; it became one later. The Second World War likewise was not a world war in September 1939, although it engulfed the globe in due course. We are yet to see whether the February 2022 Russian invasion of Ukraine will prove to have been the start of a Third World War. Hopefully it will not become that kind of milestone, but I think we owe it to students to prepare them for that possibility.

Education is different now from what it was in 1914 or 1939. Now we have wall-to-wall coverage, ubiquitous social media and real-time battlefield insights piped into children’s, adolescents’ and adults’ television screens, tablets and smartphones. There is almost as much misinformation as there is fact on many platforms; as many well-meaning transmitters of misleading (sometimes factually inaccurate) news as there are respectable and accredited voices. The democratisation of information is a good idea, but it assumes that the individual generators of that information are well-informed, critical and, if opinionated (or biased), state that upfront.

I have learnt a great deal about Ukraine in recent months, some of it from listening to TikTokers and YouTubers. Some summarise Ukrainian news sources for us, and others share their daily lives from war-torn cities. Some review Russian media sources. Some provide valuable daily summaries. Others share emotive responses to news as it happens. I am, I believe, relatively digitally literate. I am critical of the sources, and often look up the individual commentators on other social media, check their LinkedIn profiles and review their past output. This last element is particularly interesting. I have been suspicious, though not dismissive, of social media accounts that started in late February 2022. Some are clearly chasing followers, clicks and likes. Some are clearly trying to provide what they see as a genuine information service. Being able to identify the difference is not always easy. Our students need to learn these skills.

Risk evaluation is a very difficult thing to teach. Each student will have different life experiences that make them more or less fearful of uncertainty. Those who lived through the Cuban missile crisis or the nuclear standoffs of the early 1980s may say they have seen it all before. What is different now is the ubiquitous nature of information, and misinformation, which is in danger of confusing students’ ability to make their own judgements.

Educators are morally obliged to teach the unseen. That includes climate change and the risk of nuclear war. There is not a discipline that cannot leverage the moment. Social sciences and humanities obviously have an edge; physical sciences are often less flexible in terms of the curriculum. But I think it is important that anyone teaching today pauses before delivering any concept, any idea or thought, and considers whether there is a contemporary example amongst the unseen existential threats that exist. It means sometimes abandoning our own safe assumptions, our own safe havens, and exploring things that we ourselves may see as uncertainties.

There are any number of elements within any curriculum that can leverage the Russia-Ukraine war. Political scientists can explore the notion of the Eurasian multipolar world view. Geographers can explore the Three Seas Initiative, and sociologists can explore the religious realignment we are now seeing amongst the Orthodox churches. Examples are endless if we focus on the concept rather than the content. Beyond any obvious historical comparisons there are lessons to be learnt across all the disciplines using contemporary examples. The learning of concepts, such as geopolitical perspectives, resource management and cultural power, is more useful to students than any specific set of facts.

How relevant the current Russian invasion of Ukraine is perceived to be by faculty and their students alike will depend largely on geography. While the conflict itself is seen as a largely European ‘problem’, and its global economic implications are yet to be clearly felt, I understand that these reflections are probably more relevant to my European colleagues than to many others. But the principle still stands. All concepts that form any part of the curriculum need to be based within a contemporary world context. We need to leverage the current crisis that is being seen and witnessed by students through the prism of social media. In doing so we can both serve the curriculum and educate students with critical judgement about their sources of information.

Concepts not Content

I am passionate about privileging the teaching and learning of concepts rather than content. Concepts are instruments that serve to identify, define, explain, illustrate and analyse real-life elements and events, past, present or future. These are usually within the confines of a particular geography, social context and within discipline conventions, but when defined well, reach across all cultural boundaries.

There are essentially two kinds of concepts: sensory and abstract. Sensory concepts are tangible, they can be experienced through our senses. Abstract concepts are not directly experienced, they are often not visible and need to be imagined. There is a simple three step process for you to consider as you build learning with concepts: define, illustrate, and imagine.

Define:

It is important to keep the definition of a concept at its simplest. It should be self-contained.

Let’s take for example the statement that Regional wars have global consequences.

We could then unpack what we mean by regions, wars and global consequences. The easiest way to validate your concept’s definition is to see how easy it is to state its opposite: Regional wars do not have global consequences.

I can already envisage an assessment task that asks students to identify regional conflicts that did not have global consequences, and then has their peers challenge them with alternative perceptions of those consequences (after some enquiry-based learning).

Illustrate:

Illustrating a concept helps learners to categorise new knowledge, to cement that new learning in a hierarchy or order of reality. Illustrations can be examples that demonstrate the truth of the definition, or its opposite. An illustration that does not match the definition also serves to help learners make sense of the definition. So for the example “Regional wars can have global consequences”, I could describe the key protagonists and events that led to the 1956 war in the Middle East between Israel and Arab powers, which had profound long-term implications for European loss of influence and for the rise of the United States as a regional power broker.

For its opposite I could take the regional war fought between the Sahrawi Indigenous Polisario Front and Morocco from 1975 to 1991 (and involving Mauritania between 1975 and 1979) for control of Western Sahara, which has had minimal global impact, although it is still a live issue in that region.

Neither was a world war; both were clearly regional conflicts, but with different impacts. A useful conceptual space to unpack thoughts and ideas with students. Learners do not need detailed knowledge of the background histories of the parties to be able to develop an understanding of why these two conflicts resulted in different implications. The challenge for them is to unpack the factors that make up the definition of the concept shared earlier, that regional wars have global consequences.

I am not teaching my students about the 1956 Israel-Arab war or the war in Western Sahara, I am illustrating the factors that go into making the truth of my definition self evident. Examples and non-examples both support the interpretation of concepts.

Imagine:

Imagining scenarios in which the concept might be illustrated, perhaps using analogies, can prove very effective. Interpreting analogies requires the learner to deconstruct and reconstruct the elements of the concept; this supports deeper comprehension, improves retention and allows the learner to adapt the meaning of a concept to their own socio-cultural context.

It is important that, as you construct your imagined scenario or your analogy, you ground it in the existing, or at least conceivable, experience that your learners already have or could have. There is a danger that we forget just how culturally diverse our student cohorts are. References to popular culture, national habits and pastimes may mean something to you but are not going to be generally understood.

You could, for example, ask students to imagine a conflict between the country you are teaching in and a neighbouring state, and ask how it might, or might not, have global consequences. I acknowledge that for too many in the world this is not merely an intellectual exercise.

Changing Practice

Concepts are foundational to all new learning, but we in tertiary education are in the habit of burying or obscuring the key concepts amidst the weight of information, and then expecting the learner to be able to think in abstract terms.

I had a lecturer recently tell me that they didn’t have time to change the example they were using to teach supply and demand, a well-developed scenario based on the oil price during the Second Gulf War. I found that very hard to believe, given that the current Russian invasion of Ukraine has had a direct impact on oil and commodity prices globally. Why, I suggested, didn’t they ask the students to fill in the details of the scenario themselves, so that they would better understand the implications of each factor, rather than sharing a pre-prepared example? I suggested that might also provide an opportunity for students to talk more openly about the current threats they perceive as impacting them personally as a result of this particular war.

Strange as it may seem, I think that the current war provides an important catalyst for the re-evaluation and revitalisation of much of our social science and humanities curriculum. It reminds us that there are existential threats around us, and that these should serve as pivotal points of reference as we explore concepts with our students, enabling them to make meaningful connections.

Students need to be encouraged to seek out sources of information with a critical eye in order to be better prepared for the unforeseen.


Photo by Антон Дмитриев on Unsplash

Why I am not a social-constructivist

I have never believed in social-constructivism. At least not in the way the educational anthropologists’ definition of the phenomenon has been distorted and contorted into current practice. Social-constructivists justifiably argue that knowledge is often constructed through social interaction, and further, that the social and cultural context in which that learning occurs is significant. I just don’t believe that it necessarily requires in-person encounters. And I don’t think it applies to all forms of learning and disciplines.

Atharva Tulsi at Unsplash

The fetishism of ‘group-work’, which has grown since the 1980s on the back of a skimming of the literature on social constructivism, and which has been further enabled by digital tools in developed economies, has been applied to nearly all disciplines and all levels. This simply doesn’t make sense. Socialisation matters for children in K-12 as they learn diverse social skills through a subject-based curriculum; at least in theory. Group-work, as applied to much of the university curriculum, has been poorly conceived. Rich courseware should provide a transparent socio-cultural context for its learning. It rarely does. Unless the intention is to refine and extend the processes of socialisation for university students, students can, and should, be empowered to mediate knowledge through their own socio-cultural reality.

When I read, listen, or watch something I am engaged in learning from another human being. Often this learning is asynchronous, sometimes time-displaced to an extreme degree, but there is still evidence of a voice. How well crafted the learning is will depend on the coherence of that voice, but there is always a voice. At the Open University in the early 2000s, Course Teams worked hard to ensure that no matter how many authors contributed to a course, there was a consistent ‘voice’. I just don’t believe it is appropriate to assume that an individual’s learning is somehow enhanced by having ‘horizontal’ conversations with others at the same level of learning as themselves. I agree that one can learn from others. That is not the same as saying one necessarily learns with peers.

Personally, I believe we should be designing learning experiences, and courseware, that the individual student can deploy in their own context. If learners want to learn with others, with whānau (family/community) or colleagues, they can do so. We may want to encourage them to mobilise people around their own learning, and to build networks to support their learning journey. This would be a truer representation of their lifelong learning experience going forward.

I don’t believe we should force students to ‘come and learn with us’. To do so is to perpetuate an archaic model of learning that reinforces notions of power and privilege. It’s a model that centralises access to knowledge and maintains the notion of gatekeepers to learning. We should empower and enable learners through our courseware, not enslave them through it.

Photo by Atharva Tulsi on Unsplash

 

Why there is no place for self-directed learning in formal and non-formal education

I have a problem with the use of the term ‘self-directed learning’. Or, more precisely, with the misuse of the term, certainly as it relates to formal programmes of study as defined by the United Kingdom (QAA) and New Zealand (NZQA) qualifications authorities, among others. The casual use of vernacular language to define specific concepts is a constant problem for me. I would prefer we used the more accurate term ‘independent learning’.

In my worldview, self-directed learning has a specific definition. Based on the work of Malcolm Knowles, self-directed learning requires the learner to have the freedom to decide the outcomes they intend, and the resources and path they will travel, to gain the learning. Knowles’s own definition was that “In its broadest meaning self-directed learning describes a process by which individuals take the initiative, with or without the assistance of others, in diagnosing their learning needs, formulating learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes” (Knowles, 1975, p. 18). My emphasis.

There is an important distinction to be made here between formal, non-formal, informal, and incidental learning. Formal, non-formal and informal learning all have intentionality, the learner intends to learn something. That distinguishes it from incidental learning, which is gained ‘accidentally’, without the learner intending to learn anything. Formal and non-formal learning can be distinguished from informal learning because both forms have some structure, some curriculum, and some prescribed learning goals (UNESCO Institute for Statistics, 2012).

It’s important to make this distinction because so much of the commentary on the web, and even in the academic literature, journals, books, and YouTube or LinkedIn videos, conflates these concepts.

Clearly, any programme or course with a defined curriculum, which accounts for most of the learning that takes place in schools, colleges, polytechnics, and universities, has prescribed learning goals, or outcomes. This makes it literally impossible for the student to be ‘self-directed’. Self-directed learning is, by definition, learning in which individuals decide for themselves what their curriculum will be and what the outcome of their learning will be. Self-directed learning cannot be formal or non-formal learning, because both forms have a curriculum already prescribed.

I would advocate that there are three modalities of learning applicable to contemporary formal (and non-formal) education: taught, guided, and independent learning. The taught modality requires relatively close proximity to the instructor. In Vygotskian language, ‘taught’ constitutes the inner space within the Zone of Proximal Development (ZPD), close to the More Knowledgeable Other (MKO). ‘Guided’ still requires close attention to the voice of the MKO but can be experienced at a distance, spatially and temporally, while still closely following a predetermined path. ‘Independent’ study may deviate to varying degrees from the voice of the MKO, encouraging students to explore their own learning context and real-world experiences, or to identify multiple voices from which to learn, but all within a prescribed learning journey with predetermined outcomes in mind.

We can encourage students to explore specific learning resources and activities more independently, away from the tutor’s gaze and from other students. This is independent study. It is likely still to be guided by the teacher with an agreed set of outcomes in mind; it is just not taught learning.

We would serve the learning community better if we talked about self-directed learning only in the context of informal learning. Self-directed learning requires individuals to decide for themselves the outcomes they want to achieve. When we are talking about learners doing their own thing in the context of formal and non-formal programmes of study, we should describe that as ‘independent study’ or ‘independent learning’.

I would like to see national qualifications authorities adopt these distinctions. ‘Taught’ implies face-to-face, real-time encounters between learners and teachers. ‘Guided’ is more suitable for time-displaced and distance learning, but still requires students to follow the lead given by the voice of the MKO. Independent learning is still constrained by the agreed outcomes, but allows the student to move away from the voice of the teacher and to make autonomous decisions about how best to achieve the prescribed outcomes. There is no place for self-directed learning in formal and non-formal education.

Disclaimer: this post represents a personal view and in no way represents the views of any institution with which I am, or have been, associated.

Knowles, M. (1975). Self-directed Learning: A guide for learners and teachers. Association Press.

UNESCO Institute for Statistics. (2012). International standard classification of education: ISCED 2011. UNESCO Institute for Statistics. http://www.uis.unesco.org/Education/Documents/isced-2011-en.pdf

Photo by Mika Matin on Unsplash

 

Designing Pathways: which way to innovation?

We need to continue to move away from seeing tertiary education as the imparting of knowledge and see it instead as developing the skill of all students to decide which learning pathways best suit their context, prior experience, and aspirations. One of the consistent messages I try to instil in others’ practice is the importance of the social context the student inhabits.

In November 2018 I contributed to an EDEN online webinar talking about ‘Innovative Education’ as part of the 2018 European Distance Learning Week. Here is my presentation, entitled “Designing Pathways: which way to innovation?”

Pedagogy, Andragogy and Transformative Change (23’20”)

This online lecture, first delivered as part of a UK University PGCert for educators, reviews the concepts of pedagogy and andragogy before going on to examine the applicability of Mezirow’s transformative learning theory to professional education. It also identifies Paulo Freire and bell hooks as radical thinkers in education worthy of note. Please note that this lecture was originally intended to be supplemented with a synchronous webinar and additional readings.

These resources from 2013-2017 are being shared to support colleagues new to teaching online in the face of the COVID-19 pandemic. 
