Programme boards, project boards and university committees make decisions about every aspect of HE business activity. I don’t think I have ever sat on so many. I’m honestly not sure whether this is a reflection of my growing role, or a symptom of a widespread rush to governance.
Currently the roles I play on such groups include: Senior Sponsor, Senior Supplier, Senior User, Senior ISG representative, Director of LTW, Assistant Principal, Service Owner, Business Service Owner, Chair, Champion, and ‘person-volunteered-in-their-absence’. My current portfolio of boards (around 20) includes:
Jisc had gathered a community of learning technologists and IT specialists and asked us to think about how we might find an evidence base for TEL.
But I do wonder: Should we even try?
There is a real risk to universities in having the people who are best placed to build and develop excellent new services spending too much of their time on fruitless tasks. I think knowing what kinds of evidence are relevant for which decisions is a leadership skill, and leadership in learning technology is what it’s all about.
That’s not to say we shouldn’t make evidence-based decisions, or decisions based on data. We need to know the difference between evidence and data. But I think ‘technology enhanced learning’ might be a red herring. Or possibly a hen’s tooth. Or maybe both.
Even before the Trump era of post-fact and post-truth there were already many people with strong convictions who would not be persuaded by evidence, however well it was presented.
Sometimes I suspect that people ask for evidence not because they want to make a decision, but because they already have.
Sometimes I suspect that the request for more evidence and more detail is a stalling or blocking tactic. It is just one approach to resistance. No amount of detail will ever be enough, and you’ll spend a long time looking for it.
The evidence-base is not the same as the business-case.
So, in summary: should we spend more time assembling an evidence base for technology enhanced learning?
I vote No. The opportunity cost is too great. It would have to be so broad, yet so detailed, to convince university lecturers that it would quickly become unsustainable. It would be backward-looking and the data unreproducible. It would have little useful link to the real, real-time investment decisions being made for the future. We should not waste that time; we have more urgent things to do.
It’s that time of year again. The OER17 conference will see a gathering of the OER clans in the UK once more. Together we will map the political landscape for OER. I will be arguing that it is OER which will save the HE institutions from Brexit, Trump and possibly Indyref2.
‘It is clear that business models associated with OER are in their infancy and whether any institution pursues models[…….] will be highly dependent on any given institutions business strategy.’ (1)
“The clear identification of ownership and copyright permissions is integral to managing open educational resources. This means that institutions become much more aware of intellectual property in relation to the resources they create and use.” (2)
The senior management briefing papers and guides produced as a result of the JISC/HEA funding programmes (2009-13) offered suggestions to colleagues within institutions on how best to engage with senior stakeholders. They also offered those stakeholders reasons why they might invest in OER as part of strategic planning. And yet, at many OER conferences, workshops and events the questions are still raised: ‘What can we do to get institutional support for our open education practice?’ ‘How can we persuade senior managers?’
What piece of the puzzle is missing? In this presentation I will offer a view from the perspective of one UK HEI’s senior management which I hope will be of interest and use to colleagues working in large institutions at a time of Brexit and Trump. Making a business case for OER is simple if it aligns that activity with institutional strategies for investment, market differentiation, student and staff satisfaction, or IT, IP and mitigation of risk. The context of OER includes a range of views relating to the economics of OER. This short presentation will focus on just one, but one which identifies persuading budget holders within the institution as key to successful, sustainable services.
This session is a presentation rather than a workshop but please feel free to bring a copy of your own University’s strategic mission.
I am often asked by academic colleagues to provide an evidence base for TEL.
When colleagues ask me for evidence they hope I will find for them evidence of exactly this technology being used in exactly the way they teach at exactly the same level at a peer university. And that ideally this evidence will have been published with peer review. And that it will be entirely free from bias.
One thing about academics is they all come from different research backgrounds, different research paradigms, and use different research methods. So what they consider to be evidence strong enough to base decisions on can be very varied.
Another thing about academics is that they may not know much about each other’s research methods, because they mostly spend time writing, researching, conferencing and publishing within their own discipline.
I am always being told that academics spend more time in their discipline networks than in their university, so I do think they might be better placed than I am to discover the practice of their peers.
The University of Edinburgh established a number of TEL chairs to improve the quality of teaching in the disciplines, but it sometimes feels like other colleagues deliberately do not engage with the development of teaching in their own disciplines. I’m not sure why.
I used to teach on the PGcert Learning and Teaching in Higher Education at the University of Leeds, and we always organised a session in which colleagues went around the room just describing how they do research. I think it was eye-opening for all.
We asked them: ‘How do you do research?’ Some do experiments, some do clinical trials, some do text mining, some do field trips, some do focus groups and ethnographic studies; some do qualitative work, others quantitative. Some do practical, some do theoretical. Some are empirical, some not so much. Some wrangle big data live, some seek metaphysical interpretation and engage in hundreds of hours of reading. Doing research in History is quite different from doing research in Chemistry. Even evidence-based medicine and evidence-based practice are not the same. Very few academics outside of Education departments do research in Education.
It is also true that learning technologists are drawn from many discipline backgrounds. Some of us have studied Education, some Computing, some Philosophy, some Medicine, some Geography, some Copyright Law. We will tend to use the research methods with which we are most familiar.
For most early career academics there’s no reward for researching TEL. They are unlikely to want to spend time on that task. They may be happy to contribute to a quick case study. Even then, case studies tend to be based on cohorts, and every teacher will tell you that cohorts can be markedly different for many reasons. There really is very little higher education educational research that is generalisable. A colleague who doesn’t trust your methodology will never trust your findings.
Where colleagues do engage with PGcert Learning and Teaching courses, those courses sometimes aim to do action research on situated practice. Some of this will be about using technology in teaching. For many of the participants this will be their first time doing education research, and they are doing it at a beginner’s Masters level. They will tend to want to use the research methods of their own discipline. So although the case studies exist, they may still fail to persuade each other. The PGcertLTHE community of academic developers does little to gather these case studies together as an evidence base for all. They lock them away on internal wikis with no intention to share openly.
The evidence budget holders want is not the same as the evidence lecturers ask for. Budget holders are more persuaded by market research than by academic research. They want evidence that has been gathered at scale: across whole institutions, across the sector, or across the globe.
Academic colleagues want to be persuaded to spend their own time. Budget holders want to be persuaded to invest the institution’s money and many, many people’s time.
A business case is not the same as an evidence base.
A different kind of evidence entirely.
Senior managers want to know what competitors are doing: they are working to find a value proposition, looking at what other universities are doing and looking for differentiation in the market. They want to know what to buy now for the university of five years’ time, not what a lecturer did somewhere three years ago.
Of the learning technology out there, who is offering the best price, service resilience, future-proofing and information security? What is the full cost of ownership over 5-10 years? We look at what integrates with the systems we already have on campus, offering a streamlined approach. What efficiencies, what re-use, what standardisation, what new business?
We look to other industries and demographic trends of technology use. What devices students bring, what devices people use, how people use technology. We suspect that staff and students are people. We consider their use of personal devices at home and for work, and their expectation that they will find this at university too.
We know they expect choice and a high quality of service. Does it shift time and space? Does it give flexibility? Does it work on my mobile?
Student demand for digitisation is about being able to watch a thing rather than not, being able to find a thing rather than not. Being able to do a thing in the middle of the night.
So budget holders are persuaded by the kind of evidence you find in business and IT disciplines: hype cycles and horizon scanning, evidence of use such as video traffic across the network, evidence of what students use and what students voted for when they elected their reps.
The kind of evidence being gathered by student experience surveys and digital student experience surveys is driving institutional investment from the centre faster and harder than you might think.
Senior managers are looking for solutions at scale.
The thing about working in universities, is you have to be very careful about language. I am very lucky to work at University of Edinburgh. Previously I worked at University of Oxford. In both those places I learned that colleagues will, quite rightly, question you and push you to be clear. And so they should.
At Edinburgh I work alongside a group of digital education researchers who have published their thoughts about technology enhanced learning. It’s a good read. I would encourage you to take a look.
According to Sian, the problem is the words: technology, enhanced and learning.
When we talk about technology in universities we tend to assume we know what we mean by TEL, that there is a shared understanding of the phrase. I’m not sure there is, or should be.
Technology could be a range of things, not just computers, not just online, there might be all kinds of technologies we should investigate which might enhance learning. We should think of performance-enhancing study drugs and quantified-self technologies which might be used by students to enhance their revision timetable or maximise their studying stamina.
For TEL evidence-based research we seem to focus on only a small set of technologies, most of which are not particularly new, and are mostly fairly unremarkable, even invisible, to students: websites, handouts, lecture recordings, tests, wikis, blogs. These days these are hard to distinguish from everyday content for most students, who routinely read online, watch online and chat online. Do we show our age when we refer to these as innovation?
And then there’s the word ‘enhanced’. Enhanced is not the same as support, or change, or disrupt, or transform, all of which might be worth exploring. Enhanced implies that learning is a thing well understood as it is, and that the only thing worth doing with technology is a bit more of that, with some tweaked enhancement. If we approach it like that we find studies which show no significant difference, or not much, and no moves forward are made. And it’s hard to justify investment.
And learning? Do we really mean learning, or is it the teaching that’s to be changed or the education? Or the accessibility, or the discoverability, or the administration?
It does strike me that in this country we have made a rod for our own backs. TEL and VLE are both very UK-specific terms. In other countries Blackboard, Moodle et al are LMSs: Learning Management Systems. ‘Virtual Learning Environment’ promises a lot. It sounds like a platform for virtual worlds and immersive environments and beautifully designed, challenging games.
You know your VLE is never going to deliver that. It won’t even compare to the kind of impressive learning environment offered by a splendid library but because of the name, we seek to find the affordances and cognitive gains instead of just admiring the rather elegant ways it manages groups, integrates with the timetabling system and works on a mobile phone.
Sometimes I wonder in whose interest it is for our tech experts to be tamed, domesticated and confined to a term like ‘TEL’? But I suspect we did this to ourselves. We called them VLEs to convince our senior budget holders to invest, and now we beat on, like boats against the current, borne back ceaselessly into the past, searching endlessly for the evidence.