Category Archives: Uncategorized

More thoughts on: “Digital is not the future – Hacking the institution from the inside” – Technology, practical solutions and further questions

Previously on Screenface.net:

I’ve been participating in an online “hack” looking at “Digital is not the future – Hacking the institution from the inside” with a number of other education designers/technologists.

[Hack event poster]
It’s been pretty great.

I shared some thoughts and summarised some of the discussions tied to the issues we face in supporting and driving institutional change, working with organisational culture and our role as professional staff experts in education design and technology.

There’s still much to talk about. Technology and what we need it to do, practical solutions both in place and under consideration / on the wishlist, further questions and a few stray ideas that were generated along the way.

Technology: 

Unsurprisingly, technology was a significant part of our conversation about what we can do in the education support/design/tech realm to help shape the future of our institutions. The core ideas that came up included what we are using it for, and how we sell it to – and instil confidence in – our clients: teachers, students and the executive.

The ubiquity and variety of educational technologies means that they can be employed in all areas of the teaching and learning experience. It’s not just being able to watch a recording of the lecture you missed or to take a formative online quiz; it’s signing up for a course, finding your way to class, joining a Spanish conversation group, checking for plagiarism, sharing notes, keeping an eye on at-risk students and so much more.

It’s a fine distinction but Ed Tech is bigger than just “teaching and learning” – it’s also about supporting the job of being a teacher or a learner. I pointed out that the recent “What works and why?” report from the OLT here in Australia gives a strong indication that the tools most highly valued by students are the ones that they can use to organise their studies.
Amber Thomas highlighted that “…better pedagogy isn’t the only quality driver. Students expect convenience and flexibility from their courses” and went on to state that “We need to use digital approaches to support extra-curricular opportunities and richer personal tracking. Our “TEL” tools can enable faster feedback loops and personalised notifications”

Even this is just the tip of the iceberg – it’s not just tools for replicating or improving analogue practices – the technology that we support and the work we do offer opportunities for new practices. In some ways this links back closely to the other themes that have emerged – how we can shape the culture of the organisation and how we ensure that we are part of the conversation. A shift in pedagogical approaches and philosophies is a much larger thing than determining the best LMS to use. (But at its best, a shift to a new tool can be a great foot in the door to discussing new pedagogical approaches.)

“It is reimagining the pedagogy and understanding the ‘new’ possibilities digital technologies offer to the learning experience where the core issue is” (Caroline Kuhn)

Lesley Gourlay made a compelling argument for us to not throw out the baby with the bathwater when it comes to technology by automatically assuming that tech is good and “analogue” practices are bad. (I’d like to assume that any decent Ed Designer/Tech knows this but it bears repeating and I’m sure we’ve all encountered “thought leaders” with this take on things).

“we can find ourselves collapsing into a form of ‘digital dualism’ which assumes a clear binary between digital and analogue / print-based practices (?)…I would argue there are two problems with this. First, that it suggests educational and social practice can be unproblematically categorised as one or the other of these, where from a sociomaterial perspective I would contend that the material / embodied, the print-based / verbal and the digital are in constant and complex interplay. Secondly, there perhaps is a related risk of falling into a ‘digital = student-centred, inherently better for all purposes’, versus ‘non-digital = retrograde, teacher-centred, indicative of resistance, in need of remediation’.” (Lesley Gourlay)

Another very common theme in the technology realm was the absolute importance of having reliable technology (as well as the right technology.)

“Make technology not failing a priority. All technology fails sometimes, but it fails too often in HE institutions. Cash registers in supermarkets almost never fail, because that would be way too much of a risk.” (Sonja Grussendorf)

When it comes to how technology is selected for the institution, a number of people picked up on the tension between having it selected centrally vs by lecturers.

“Decentralize – allow staff to make their own technology (software and hardware) choices” (Peter Bryant)

Infrastructure is also important in supporting technologies (Alex Chapman)

Personally I think that there must be a happy medium. There are a lot of practical reasons that major tools and systems need to be selected, implemented, managed and supported centrally – integration with other systems, economies of scale, security, user experience, accessibility etc. At the same time we also have to ensure that we are best meeting the needs of students and academics in a host of different disciplines, and are able to support innovation and agility. (When it comes to the selection of any tool, I think there still needs to be a process in place to ensure that the tool meets the needs identified – including those of various institutional stakeholders – and can be implemented and supported properly.)

Finally, Andrew Dixon framed his VC elevator pitch in terms of a list of clear goals describing the student experience with technology which I found to be an effective way of crafting a compelling narrative (or set of narratives) for a busy VC. Here are the first few:

 

  1. They will never lose wifi signal on campus – their wifi will roam seamlessly with them
  2. They will have digital access to lecture notes before the lectures, so that they can annotate them during the lecture.
  3. They will also write down the time at which difficult sub-topics are explained in the lecture so that they can listen again to the captured lecture and compare it with their notes. (Andrew Dixon)

Some practical solutions

Scattered liberally amongst the discussions were descriptions of practical measures that people and institutions are putting in place. I’ll largely let what people said stand on its own – in some cases I’ve added my thoughts in italics afterwards. (Some of the solutions I think were a little more tongue in cheek – part of the fun of the discussion – but I’ll leave it to you to determine which)

Culture / organisation

Our legal team is developing a risk matrix for IT/compliance issues (me)

(We should identify our work) “not just as teaching enhancement but as core digital service delivery” (Amber Thomas)

“we should pitch ‘exposure therapy’ – come up with a whole programme that immerses teaching staff in educational technology, deny them the choice of “I want to do it the old fashioned way” so that they will realise the potential that technologies can have…” (Sonja Grussendorf)

“Let’s look at recommendations from all “strategy development” consultations, do a map of the recommendations and see which ones always surface and are never tackled properly.” (Sheila MacNeill)

“Could this vision be something like this: a serendipitous hub of local, participatory, and interdisciplinary teaching and learning, a place of on-going, life-long engagement, where teaching and learning is tailored and curated according to the needs of users, local AND global, actual AND virtual, all underscored by data and analytics?” (Rainer Usselmann)

“…build digital spaces to expand our reach and change the physical set up of our learning spaces to empower use of technology…enable more collaborative activities between disciplines” (Silke Lange)

“we need a centralised unit to support the transition and the evolution and persistence of the digital practice – putting the frontliners into forefront of the decision making. This unit requires champions throughout the institutions so that this is truly a peer-led initiative, and a flow of new blood through secondments. A unit that is actively engaging with practitioners and the strategic level of the university” (Peter Bryant)

In terms of metrics – “shift the focus from measuring contact time to more diverse evaluations of student engagement and student experience” (Silke Lange)

“Is there a metric that measures teaching excellence?… Should it be designed in such a way as to minimise gaming? … should we design metrics that are helpful and allow tools to be developed that support teaching quality enhancement?” (David Kernohan) How do we define or measure teaching excellence?

“the other thing that we need to emphasise about learning analytics is that if it produces actionable insights then the point is to act on the insights” (Amber Thomas) – this needs to be built into the plan for collecting and dealing with the data.

Talking about the NSS (National student survey) – “One approach is to build feel-good factor and explain use of NSS to students. Students need to be supported in order to provide qualitative feedback” (David Kernohan)  (I’d suggest that feedback from students can be helpful but it needs to be weighted – I’ve seen FB posts from students discussing spite ratings)

“We should use the same metrics that the NSS will use at a more granular levels at the university to allow a more agile intervention to address any issues and learn from best practices. We need to allow flexibility for people to make changes during the year based on previous NSS” (Peter Bryant)

“Institutional structures need to be agile enough to facilitate action in real time on insights gained from data” (Rainer Usselmann) – in real time? What kind of action? What kind of insights? Seems optimistic

“Institutions need at the very least pockets of innovation /labs / discursive skunk works that have licence to fail, where it is safe to fail” (Rainer Usselmann)

“Teachers need more space to innovate their pedagogy and fail in safety” (Silke Lange)

“Is it unfair (or even unethical) to not give students the best possible learning experience that we can?…even if it was a matter of a control group receiving business-as-usual teaching while a test group got the new-and-improved model, aren’t we underserving the control group?” (me)

“I can share two examples from my own experiences
An institution who wanted to shift all their UG programmes from 3 year to 4 year degrees and to deliver an American style degree experience (UniMelb in the mid 2000s)

An institution who wanted to ensure that all degree programmes delivered employability outcomes and graduate attributes at a teaching, learning and assessment level

So those resulted in;
a) curriculum change
b) teaching practice change
c) assessment change
d) marketing change ” (Peter Bryant)

“One practical option that I’m thinking about is adjusting the types of research that academics can be permitted to do in their career path to include research into their own teaching practices. Action research.” (Me) I flagged this with our Associate Dean Education yesterday and was very happy to hear that she is currently working on a paper for an education focussed journal in her discipline and sees great value in supporting this activity in the college.

“I think policy is but one of the pillars that can reinforce organisational behaviour” (Peter Bryant) – yes, part of a carrot/stick approach, and sometimes we do need the stick. Peter also mentions budgets and strategies; I wonder whether these actually change behaviour, or rather support change already embarked upon.

Technology

“let’s court rich people and get some endowments. We can name the service accordingly: “kingmoneybags.universityhandle.ac.uk”. We do it with buildings, why not with services?” (Sonja Grussendorf) – selling naming rights for TELT systems just like buildings – intriguing

We need solid processes for evaluating and implementing Ed Tech and new practices (me)

Pedagogical

“Could creating more ‘tailored’ learning experiences, which better fit the specific needs and learning styles of each individual learner, be part of the new pedagogic paradigm?” (Rainer Usselmann) (big question though around how this might be supported in terms of workload)

“At Coventry, we may be piloting designing your own degree” (Sylvester Arnab)
“The challenge comes in designing the modules so as to minimise prerequisites, or make them explicit in certain recommended pathways” (Christopher Fryer)
I went on to suggest that digital badges and tools such as MyCourseMap might help to support this model. Sylvester noted that he is aware that “these learning experiences, paths, patterns, plans have to be validated somehow”. Learner convenience over pedagogy – or is it part of pedagogy, in line with adult learning principles of self-efficacy and motivation? In a design-your-own-degree course, how do we ensure that learners don’t just choose the easiest subjects – how do we avoid the trap of having learners think they know enough to choose wisely?

“digital might be able to help with time-shifting slots to increase flexibility with more distributed collaboration, flipped teaching, online assessment” (George Roberts)

 

“At UCL we are in the midst of an institution-wide pedagogic redesign through the Connected Curriculum. This is our framework for research-based education which will see every student engaging in research and enquiry from the very start of their programme until they graduate (and beyond). More at http://www.ucl.ac.uk/teaching-learning/connected-curriculum

The connected bit involves students making connections with each other, with researchers, beyond modules and programmes, across years of study, across different disciplines, with alumni, employers, and showcase their work to the wider world…

There is strong top-down support, but also a middle-out approach with faculties having CC fellows on part-time secondments to plan how to introduce and embed the CC in their discipline.

From a TEL perspective we need to provide a digital infrastructure to support all of this connectivity – big project just getting going. Requirements gathering has been challenging… And we’re also running workshops to help programme and module teams to design curricula that support research-based and connected learning.” (Fiona Strawbridge) – liking this a lot, embedding practice. What relationship do these fellows have with lecturers?

 

“I am imagining that my research personal learning environment would fit perfectly with this approach, as I am thinking of the PLE as a toolbox to do research. There is also a potential there to engage students in open practice, etc.” (Caroline Kuhn)

“There may be a “metapedagogy” around the use of the VLE as a proxy for knowledge management systems in some broad fields of employment: consultancy, financial services, engineering…” (George Roberts)  (which I’d tie to employability)

“We need to challenge the traditional model of teaching, namely didactic delivery of knowledge. The ways in which our learning spaces are currently designed – neat rows, whiteboard at front – afford specific behaviours in staff and students. At the moment virtual learning spaces replicate existing practices, rather than enabling a transformative learning experience. The way forward is to encourage curricula founded on enquiry-based learning that utilise the digital space as professional practitioners would be expected to” (Silke Lange) – maybe, but none of this describes where or how lecturers learn these new teaching skills. Do we need to figure out an evolutionary timeline to get to this place, where every year or semester lecturers have to take one further step, add one new practice?

“Do not impose a pedagogy. Get rid of the curricula. Empower students to explore and to interact with one another. The role of the teacher is as expert, navigator, orienteer, editor, curator and contextualisor of the subject. Use heuristic, problem-based learning that is open and collaborative. Teach students why they need to learn” (Christopher Fryer)

 

This is but a cherry-picked selection of the ideas and actions that people raised in this hack but I think it gives a sense of some of the common themes that emerged and of the passion that people feel for our work in supporting innovation and good practices in our institutions.  I jotted down a number of stray ideas for further action in my own workplace as well as broader areas to investigate in the pursuit of my own research.

As always, the biggest question for me is that of how we move the ideas from the screen into practice.

Further questions

How are we defining pedagogical improvements – is it strictly about teaching and learning principles (i.e. cognition, transfer etc.) or is it broader – is the act of being a learner/teacher a part of this (and thus the “job” of being these people, which includes a broader suite of tools)? (me)

What if we can show how learning design/UX principles lead to better written papers by academics? – more value to them (secondary benefits) (me)

“how much extra resource is required to make really good use of technology, and where do we expect that resource to come from?” (Andrew Dixon)

Where will I put external factors like the TEF / NSS into my research? Is it still part of the organisation/institution? Because there are factors outside the institution like this that need to be considered – govt initiatives / laws / ???

Are MOOCs for recruitment? Marketing? (MOOCeting?)

“How do we demonstrate what we do will position the organisation more effectively? How do we make sure we stay in the conversation and not be relegated to simply providing services aligned with other people’s strategies” (arguably the latter is part of our job)
“How do we embed technology and innovative pedagogical practices within the strategic plans and processes at our institutions?” (Peter Bryant)

Further research

Psychology of academia and relationships between academic and professional staff. (Executive tends to come from academia)

“A useful way to categorise IT is according to benefits realisation. For each service offered, a benefits map should articulate why we are providing the service and how it benefits the university.” (See https://en.wikipedia.org/wiki/Benefits_realisation_management ) (Andrew Dixon)

Leadership and getting things done / implementing change, organisational change

How is organisational (particularly university) culture defined, formed and shaped?

Actor-network theory

Design research

Some ideas this generated for me

Instead of tech tool based workshops – or in addition at least – perhaps some learning theme based seminars/debates (with mini-presentations). Assessment / Deeper learning / Activities / Reflection

Innovation – can be an off-putting / scary term for academics with little faith in their own skills but it’s the buzzword of the day for leadership. How can we address this conflict? How can we even define innovation within the college?

What if we bring academics into a teaching and learning / Ed tech/design support team?

Telling the story of what we need by describing what it looks like and how students/academics use it in scenario / case study format offers a more engaging narrative

What is the role of professional bodies (E.g. unions like the NTEU) in these discussions?

Are well-off, “prestigious” universities the best places to try to innovate? Is there less of a driving urge, no pressing threat to survival? Perhaps this isn’t the best way to frame it – a better question might be: if we’re so great, what should other universities be learning from us to improve their own practices? (And then, would we want to share that knowledge with our competitors?)

“I was thinking about the power that could lie behind a social bookmarking tool when doing a dissertation, not only to be able to store and classify a resource but also to share it with a group of like-minded researchers, and also to see what others have found about the same topic.” (Caroline Kuhn) – kind of like sharing annotated bibliographies?

Bigger push for constructive alignment
I need to talk more about teaching and learning concepts in the college to be seen as the person that knows about it

In conclusion

I’d really like to thank the organisers of the Digital is not the future Hack for their efforts in bringing this all together and all of the people that participated and shared so many wonderful and varied perspectives and ideas. Conversation is still happening over there from what I can see and it’s well worth taking a look.

Rethinking Ed Tech at our university

I mentioned a couple of weeks ago that I’ve embarked on some sort of ramshackle process of evaluating what we’re doing in terms of Ed. Tech and design with some of my fellow Ed Techs and Designers in the colleges and central team. This is with a view to finding ways to work together better, build relationships and ultimately make some recommendations to the higher-ups that may or may not be acted upon. (At the very least I’m optimistic that people on the ground will communicate and collaborate better and with a renewed clarity)

In some ways, we’re racing the clock, as our VC has started his consultation tour as the first part of his review/reform/something process. Best case scenario is that we’ll be able to feed our findings/opinions/fervent wishes into his process and change might be kickstarted. Worst case is – well, let’s not think about that. Something with dragons and ice zombies or something.

So we had our second discussion today and were able to successfully identify six core themes, with some attendant issues and questions, to press on with for more in-depth investigation. The goal is to try to come up with something tangible for each theme every two weeks, through a combination of online and in-person discussions. This will ideally give us a greater sense of what we’re about (I hate to use the term mission statement, but perhaps something less ethereal), which will inform some revised terms of reference for the lower-level parts of the ed. tech governance structure. (This is where I’m expecting the greatest resistance, but who knows.)

These are the themes that we have arrived at. (If you feel that we’ve missed something or over-estimated the importance of something, please feel free to leave a comment.)

Language and philosophy/vision: 

Is it eLearning, blended learning, technology enhanced learning (and teaching), online learning or just plain old teaching and learning? Why? Are we about education innovation or education support? (It’s not simply about the language either – this can be quite political).

What are we ultimately trying to achieve for the learners, the academics, the university, etc?
Are there a set of key principles that guide us?

Best practice

How do we define, encourage and support best practices in teaching and learning? (And in other areas?) How can we best serve teachers and learners? Is it strictly about the cognitive, pedagogical aspects of teaching and learning or do other factors need to come in to the training and advice that we offer including accessibility, equity and pastoral care?

Influencing

What can we do as humble (yet expert) professional support staff to be listened to? How do we take a more substantive role in the decision making processes that directly affect us?

Communication and collaboration

What can we do between our various colleges and teams to work together more effectively and share our skills and knowledge? How can we support wider dissemination of ideas in the university and in the wider education design/technology community?

Transparency

What can we do to build better relationships between the colleges and central teams and to increase understanding of each other’s needs and obligations? Can we simplify the decision making process to streamline approvals for changes and new initiatives?

Governance

How can we make the elements of the existing governance structure work more effectively together and better utilise the resources available?

These are some sensitively phrased questions and ideas to get started – this process is going to be complicated by virtue of the range of different stakeholders with competing priorities and differences of opinion will be inevitable. My hope is that by keeping focus on the mutual benefits – and sticking to the discussion topics – progress will be made.

This is the Padlet in progress – you should be able to add things but not change them.

(I should mention that some of the themes were inspired/expanded by the discussions in the “Digital is not the future” hack – particularly the question of expertise)

Thoughts on: “Digital is not the future – Hacking the institution from the inside” – how education support staff can bring change

One of the best things about my research and being part of a community of practice of education technologists/designers is that there are a lot of overlaps. While I’d hate to jump the gun, I think it’s pretty safe to say that harnessing the group wisdom of this community is going to be a core recommendation when it comes to looking for ways to better support Tech Enhanced Learning and Teaching practices in higher education.

So why not start now?

There’s a lively and active community of Ed Techs online (as you’d hope) and arguably we’re bad at our jobs if we’re not part of it. I saw an online “hack” event mentioned a couple of weeks ago – the “Digital is not the future – Hacking the institution from the inside” discussion/workshop/something and joined up.

[Hack event poster]

There’s a homepage (that I only just discovered) and a Twitter hashtag #futurehappens (that I also wish I’d noticed) and then a Loomio discussion forum thing that most of the online action has been occurring in.

Just a side note on Loomio as a tool – some of the advanced functionality (the voting features) seems promising, but the basics are a little lacking. Following a discussion thread is quite difficult when new posts are displayed separately, showing only part of the topic heading and no preview of the new response (either on screen or in the email digest). Biographical information about participants was also scant.

All the same, the discussions muddled along and there were some very interesting and thoughtful chats about a range of issues that I’ll hopefully do justice to here.

It’s a joint event organised by the London School of Economics (LSE) and the University of the Arts London (UAL) but open to all. Unsurprisingly then, most participants seem to be from the UK, so discussions were a little staggered. There was also an f2f event that generated a sudden surge of slightly decontextualised posts, but there was still quite a bit of interest to come from that (for an outsider).

The “hack” – I have to use inverted commas because I feel silly using the term with a straight face but all power to the organisers and it’s their baby – was intended to examine the common issues Higher Ed institutions face in innovating teaching and learning practices, with a specific focus on technology.

The guiding principles are:

Rule 1: We are teaching and learning focused *and* institutionally committed
Rule 2: What we talk about here is institutionally/nationally agnostic
Rule 3: You are in the room with the decision makers. What we decide is critical to the future of our institutions. You are the institution
Rule 4: Despite the chatter, all the tech ‘works’ – the digital is here, we are digital institutions. Digital is not the innovation.
Rule 5: We are here to build not smash
Rule 6: You moan (rehearse systemic reasons why you can’t effect change – see Rule 3), you get no beer (wine, juice, love, peace, etc)

We have chosen 5 common scenarios which are often the catalyst for change in institutions. As we noted above, you are in the room with the new VC and you have 100 words in each of the scenarios below to effectively position what we do as a core part of the institution. Why is this going to make our institution more successful/deliver the objectives/save my (the VC’s) job? How do we demonstrate what we do will position the organisation effectively? How do we make sure we stay in the conversation and not be relegated to simply providing services aligned with other people’s strategies?

The scenarios on offer are below – they seemed to fall by the wayside fairly quickly as the conversation evolved but they did spark a range of different discussions.

Scenario 1
Strategic review of the institution and budget planning for 2020
Scenario 2
Institutional restructure because of a new VC
Scenario 3
Undertaking of an institution wide pedagogical redesign
Scenario 4
Responding to and implementing TEF
Scenario 5
Institutional response to poor NSS/student experience results

(It was assumed knowledge that TEF is the upcoming UK govt Teaching Excellence Framework – new quality standards – and the NSS is the National Student Survey – student satisfaction feedback largely.)

Discussions centred around what we as Ed. Designers/Techs need to do to “change the discourse and empower people like us to actively shape teaching and learning at our institutions”. Apart from the ubiquitous “more time and more money” issue that HE executives hear from everyone, several common themes emerged across the scenarios and other posts. Thoughts about university culture, our role as experts and technology consistently came to the fore. Within these could be found specific issues that need to be addressed and a number of practical (and impractical) solutions that are either in train or worth considering.

On top of this, I found a few questions worthy of further investigation as well as broader topics to pursue in my own PhD research.

I’m going to split this into a few posts because there was a lot of ground covered. This post will look at some of the common issues that were identified, and from there I will expand on some of the practical solutions that are being implemented or considered, additional questions that this event raised for me, and a few other random ideas that it sparked.

Issues identified

There was broad consensus that we are in the midst of a period of potentially profound change in education due to the affordances offered by ICT and society’s evolving relationship with information and knowledge creation. As Education designers/technologists/consultants, many of us sit fairly low in the university decision-making scheme of things, but our day-to-day contact with lecturers and students (and emerging pedagogy and technology) gives us a unique perspective on how things are and how they might/should be.

Ed Tech is being taken up in universities but we commonly see it used to replicate or at best augment long-standing practices in teaching and learning. Maybe this is an acceptable use but it is often a source of frustration to our “tribe” when we see opportunities to do so much more in evolving pedagogy.

Peter Bryant described it as “The problem of potential. The problem of resistance and acceptance” and went on to ask “what are the key messages, tools and strategies that put the digital in the heart of the conversation and not as a freak show, an uncritical duplication of institutional norms or a fringe activity of the tech savvy?”

So the most pressing issue – and the purpose of the hack itself – is what we can do (and how) to support and drive the change needed in our institutions. Change in our approach to the use of technology enhanced learning and teaching practices and perhaps even to our pedagogy itself.

Others disagreed that a pedagogical shift was always the answer. “I’m not sure what is broken about university teaching that needs fixing by improved pedagogy… however the economy, therefore the job market is broken I think… when I think about what my tools can do to support that situation, the answers feel different to the pedagogical lens” (Amber Thomas).

The very nature of how we define pedagogy arose tangentially a number of times – is it purely about practices related to the cognitive and knowledge building aspects of teaching and learning or might we extend it to include the ‘job’ of being a student or teacher? The logistical aspects of studying – accessing data, communication, administrivia and the other things that technology can be particularly helpful in making more convenient. I noted the recent OLT paper – What works and why? – that found that students and teachers valued these kinds of tools highly. Even if these activities aren’t the text-book definition of pedagogy, they are a key aspect of teaching and learning support and I’d argue that we should give them equal consideration.

Several other big-picture issues were identified – none with simple solutions but all things that must be taken into consideration if we hope to be a part of meaningful change.

  • The sheer practicality of institution-wide change – the many different needs of the various disciplines necessitate some customised solutions.
  • The purpose of universities and university education – tensions between the role of the university in facilitating research to extend human knowledge and the desire of many students to gain the skills and knowledge necessary for a career.
  • The very nature of teaching and learning work can and is changing – this needs to be acknowledged and accommodated. New skills are needed to create content and experiences and to make effective use of the host of new tools on offer. Students have changing expectations of access to information and their lecturers’ time, created by the reality of our networked world. These are particularly pointy issues when we consider the rise of casualisation in the employment of tutors and lecturers and the limits on the work that they are paid to do.

Three key themes emerged consistently across all of the discussion posts in terms of how we education support types (we really do need a better umbrella term) can be successful in the process of helping to drive change and innovation in our institutions: institutional culture, our role as “experts” and technology. I’ll look at culture and expertise for now.

Culture

It was pretty well universally acknowledged that, more than policy or procedure or resourcing, the culture of the institution is at the heart of any successful innovation. Culture itself was a fairly broad set of topics though, ranging across traditional/entrenched teaching and learning practices; how risk-averse or flexible/adaptable an institution is; how hierarchical it is and to what extent the ‘higher-ups’ are willing to listen to those on the ground; willingness to test assumptions; aspirational goals and strategy; and change fatigue.

Some of the ideas and questions to emerge included:

  • How do we change the discourse from service (or tech support) to pedagogy?
  • “The real issue is that money, trust, support, connectedness and strategy all emanate from the top” (Peter Bryant)
  • “the prerequisite for change is an organisational culture that is discursive, open and honest. And there needs to be consensus about the centrality of user experience.” (Rainer Usselmann)
  • “We need to review our decision making models and empower agility through more experimentation” (Silke Lange) – My take on this: probably true, but perhaps not the right thing to say to the executive, given the implication that they’re currently making poor decisions. Perhaps if we frame this more in terms of a commitment to continuous improvement, we might be able to remove the sense of judgement about current practices and decision makers?
  • “we will reduce the gap between the VC and our students… the VC will engage with students in the design of the organisation so it reflects their needs. This can filter down to encourage students and staff to co-design courses and structures, with two way communication” (Steve Rowett)
  • “In the private (start-up) sector, change is all you know. Iterate, pivot or persevere before you run out of money. That is the ‘Lean Start-up’ mantra… create a culture and climate where it is expected and ingrained behaviour then you constantly test assumptions and hypotheses” (Rainer Usselmann)
  • “Theoretical and practical evidence is important for creating rationale and narratives to justify the strategy” (Peter Bryant) – I agree, we need to use a research-led approach to speak to senior academic execs

While change and continuous improvement are important, in many places they have come to be almost synonymous with management. And not always good management – sometimes just management for the sake of appearing to be doing things. It’s also not just about internal management – changes in government, government policy or discipline practices can all necessitate change.

One poster described how change-fatigued lecturers came to respond to an ongoing stream of innovations relating to teaching and learning practice coming down from on high.

I don’t think anyone will be surprised to hear that staff got very good at re-describing their existing, successful practice in whatever the language of the week was.

Culture is arguably the hardest thing to change in an organisation because there are so many different perspectives of it held by so many different stakeholders. The culture is essentially the philosophy of the place and it will shape the kinds of policy that determine willingness to accept risk, open communication, transparency and reflection – some of the things needed to truly cultivate change.

Experts

Our status (as education designers/technologists/support with expertise) in the institution and the extent to which we are listened to (and heard) was seen as another major factor in our ability to support and drive innovation.

There were two key debates in this theme: how we define/describe ourselves and what we do, and how we should best work with academics and the university.

Several people noted the difference between education designers and education technologists.

“Educational developers cannot be ignorant of educational technologies any more than Learning Technologists can be ignorant of basic HE issues (feedback, assessment, teaching practices, course design, curriculum development etc).”

Perhaps it says something about my personal philosophy or the fact that I’ve always worked in small multi-skilled teams but the idea that one might be able to responsibly recommend and support tools without an understanding of teaching and learning practice seems unthinkable. This was part of a much larger discussion of whether we should even be talking in terms of eLearning any more or just trying to normalise it so that it is all just teaching and learning. Valid points were made on both sides.

“Any organisational distinction between Learning & Teaching and eLearning / Learning Technology is monstrous. Our goal should be to make eLearning so ubiquitous that as a word it becomes obsolete.” (Sonja Grussendorf)

“I also think that naming something, creating a new category, serves a political end, making it visible in a way that it might not be otherwise.” (Martin Oliver)

“it delineates a work area, an approach, a mindset even…Learning technology is not a separate, secondary consideration, or it shouldn’t be, or most strongly: it shouldn’t be optional.” (Sonja Grussendorf)

There was also an interesting point made that virtually nobody talks about e-commerce any more, it’s just a normal (and sometimes expected) way of doing business now.

For me, the most important thing is the perception of the people that I am working with directly – the lecturers. While there is a core of early adopters and envelope pushers who like to know that they have someone that speaks their language and will entertain their more “out-there” ideas when it comes to ed tech and teaching practices, many more just want to go about the business of teaching with the new learning tools that we have available.

As the Education Innovation Officer or as a learning technologist, I’m kind of the helpdesk guy for the LMS and other things. Neither of these sets of people necessarily thinks of me in terms of broader educational design (that might or might not be enhanced with Ed Tech). So I’ve been thinking a lot lately about rebranding to something more like Education Support or Education Excellence. (I’ve heard whispers of a Teaching Eminence project in the wind – though I haven’t yet been involved in any discussions.)

But here’s the kicker – education technology and innovation isn’t just about better teaching and learning. For the college or the university, it’s an opportunity to demonstrate that yes, we are keeping up with the Joneses, we are progressive, we are thought leaders. We do have projects that can be used to excite our prospective students, industry partners, alumni, government and benefactors. So on this level, keeping “innovation” or “technology” in a title is far more important. Pragmatically, then, if it can help me to further the work that I’m doing by being connected to the “exciting” parts of university activity, it would seem crazy not to.

There was some debate about whether our role is to push, lead or guide new practice. I think this was largely centred on differences of opinion on approaches to working with academics. As I’ve mentioned, my personal approach is about understanding their specific teaching needs (removing technology from the conversation where possible) and recommending the best solutions (tech and pedagogic). Other people felt that as the local “experts”, we have a responsibility to push new innovations for the good of the organisation.

“Personally I’m not shy about having at least some expertise and if our places of work claim to be educational institutions then we have a right to attempt to make that the case… It’s part of our responsibility to challenge expectations and improve practices as well” (David White)

“we should pitch ‘exposure therapy’ – come up with a whole programme that immerses teaching staff in educational technology, deny them the choice of “I want to do it the old fashioned way” so that they will realise the potential that technologies can have in changing not only, say, the speed and convenience of delivery of materials (dropbox model), but can actually change their teaching practice.” (Sonja Grussendorf)

I do have a handful of personal Ed tech hobby horses that the College hasn’t yet taken an interest in (digital badges, ePortfolios, gamification) that I have advocated with minimal success and I have to concede that I think this is largely because I didn’t link these effectively enough to teaching and learning needs. Other participants held similar views.

Don’t force people to use the lift – convince them of the advantages, or better still let their colleagues convince them. (Andrew Dixon)

A final thought that these discussions triggered in me – though I didn’t particularly raise it on the board – came from the elephant in the room: while we might be the “experts” – or at least have expertise worth heeding – we are here having this discussion because we feel that our advice isn’t always listened to.

Is this possibly due to an academic / professional staff divide? Universities exist for research and teaching and these are the things most highly valued – understandably. Nobody will deny that a university couldn’t function day to day without the work of professional staff, but perhaps there is a hierarchy/class system in which some opinions inherently carry less weight. Not all opinions, of course – academics will gladly accept the advice of the HR person about their leave allocation or the IT person in setting up their computer – but when we move into the territory of teaching and learning, scholarship if you like, perhaps there is an almost unconscious disconnect occurring.

I don’t say this with any particular judgement and I don’t even know if it’s true – my PhD supervisor was aghast when I suggested it to her – but everyone has unconscious biases and maybe this is one of them.

Just a thought anyway but I’d really like to hear your thoughts on this or anything else that I’ve covered in this post.

Next time, on Screenface.net – the role of technology in shaping change and some practical ideas for action.


Thoughts on: “Doing Insider Research in Universities” (Trowler, 2012) Part 3 – Good research design and ethics/politics

I’m not sure that this is how I’ll process all of the books that I read – in fact I’m almost certain that it isn’t – but I’ll continue this series of posts about Trowler – Doing Insider Research in Universities because I have found it to be a great way to dip my toe into the many issues that I will face in my research.

The next two chapters look at value and robustness in insider research (which, again, I take to be about being able to defend your methodology – and choosing a good one) and then the ethics and politics of insider research in your university, which is pretty much unavoidable.

He opens with a discussion of some of the criticisms that case studies face as well as some of the responses to these. I’ll leave it here in full as it sums it up well.

Case study researchers may find their work subject to the following criticisms (Flyvbjerg, 2006): it only yields concrete, practical knowledge rather than the supposedly more valuable general, theoretical and propositional knowledge;  generalization from one case is not possible and so the research does not contribute to the development of knowledge; the research can generate hypotheses, but other methods are necessary to test them and build theory; case study design contains a bias toward verification, i.e., a tendency to confirm the researcher’s preconceived notions. Both Flyvbjerg and Yin (2009) refute these criticisms, but those who research their own universities need to be clear about precisely what their research questions are, what the rationale behind the research design is, and what the truth claims are. This advice holds for any kind of research, but other designs tend to draw less critical fire.

(He also highlights Gomm et al (2000), Simons (2009) and Yin (2009) as great starting points for further investigation of case study methodology)

Trowler dips back into some of the ontological questions that he touched on earlier in the book, comparing the merits of the true vs the useful. (I may be oversimplifying this). This draws on Sayer’s notion of “practical adequacy” in prioritising the usefulness of information. I kind of get it but think I’ll need to dig a little deeper. I can see how some true things mightn’t necessarily be valuable but as for things that are kind of true…?

This is echoed in further discussions of Bassey’s idea of “fuzzy generalizations”. In short, this is about acknowledging that life is complex and theory won’t always accommodate the range of factors at play. So rather than saying that in situation A, if B happens then C will follow, we might say that in situation A, if B happens then C will generally follow between D and E% of the time. It’s not as neat and arguably not as helpful but no doubt more realistic.

In terms of the design of research, Trowler posits twelve questions for researchers to consider to test the rigour and quality of their proposed methods.

1. In designing the research, how do I know that my planned approach will answer the questions I have more fully than any other?
2. How do I design the research to take best advantage of the benefits of insider research while avoiding its pitfalls as far as possible?
3. Conceptually, how do I represent my organization, its culture and its practices? (And how does this representation shape my design?)
4. How and from whom will I secure access to the data I need? (Why them and why not others?)
5. Whom should I inform about the project, and how should I describe it, when I seek ‘informed’ consent? (And how might this affect my data?)
6. How will I ensure that the project is run ethically so that all participants and institutional bodies are protected? (While at the same time being as transparent as possible to readers so they can judge the robustness of my approach and conclusions?)
7. If I am using participant observation, what are the ethical issues in relation to the people I observe in natural settings? (And how might my answer to that question affect my data?)
8. If using interviews, what measures should I take to deal with interview bias? (And will they assure a sufficient degree of robustness?)
9. What should the balance be between collecting naturalistic data and formally ‘collected’ data? (And how can I offer assurances of robustness about conclusions drawn from both?)
10. How should I analyse the different forms of data I have, given that there will almost certainly be a large amount of various sorts? (And how do I ensure that sufficient and appropriate weight is given to each form of data in generating conclusions?)
11. How, and how much, will I communicate my findings to participants to ensure that they are happy with what I intend to make public? (And will this affect the way I present my conclusions to other audiences?)
12. Generally, in what other ways can I satisfy the reader about the robustness of my research and its findings?

(I was a little hesitant to just paste this in holus bolus but it all seems particularly valuable).

In very pragmatic terms, there will also inevitably be a number of ethical and political issues to consider when undertaking insider research in one’s own institution. The question of whether to anonymise the institution and the research participants is a live one – though at this stage, to me, it seems impractical and counterproductive, particularly when it could mean reducing the number of sources of data for fear of not being able to correctly reference them. The lack of transparency could also arguably lessen a reader’s view of the robustness of the research.

I also think that the question of to what extent people change their behaviour under observation is a valid one, however there are no doubt ways to mitigate this.

Politically, senior leaders in the organisation will presumably want to feel confident that the research won’t damage the reputation of the university before granting access. Does this then lead to self-censorship and selective reporting if there are areas where there is room for improvement?

At a higher level of ethical debate, the selection of standpoints from which to pose questions and to begin observations and investigations can raise some concerns. If I fail to incorporate the perspective of voices that are less often heard, am I guilty of perpetuating the status quo?

Lots to think about and to be perfectly blunt, I would be naive not to factor in the question of what asking the wrong person the wrong question might mean for my career prospects in the organisation. Fortunately I have a little time to think about some of these things.


Thoughts on: “Doing Insider Research in Universities” (Trowler, 2012) Part 2 (Being an insider and research methodologies)

I just realised that Part 1 of these posts had been sitting in my drafts folder – waiting for ??? – so if it seems like I’ve had an incredibly productive burst of writing, yeah nah. (Which isn’t to say that I don’t have a fair bit to say about this book now, as I have found it particularly helpful)

One thing I have learned, sadly, is that the Kindle doesn’t sort “highlights” (selections that I’ve made in the text) by chapter, so my notes aren’t as useful to me as I might’ve hoped. I’ll know to add notes to the highlights as well next time, listing the chapter. (It is still pretty handy being able to collate them in the kindle.amazon.com/your_highlights page though)

Chapter 2: Researching Universities

Trowler looks here at some of the knowledge and data issues that we face when researching in universities. Primarily the nature of the different types of knowledge present – implicit and explicit.

Implicit knowledge relates to practice knowledge (a.k.a. phronesis) that has been developed through experience and which can be harder to express in words. Things like how to deliver a rigorous and yet engaging lecture. Because it is lived knowledge, it can be argued that it carries more legitimacy than that which has come from theory. At the same time, it will be very important to remember that there can often be a gap between people’s perspectives of their actions and the actions themselves.

On the other side we find propositional or technical knowledge, which is more like your old fashioned book learning. It is easier to express and generally more objective, but if we take into consideration the fact that unis exist in different contexts and can have their own quirks, its usefulness is likely to vary across different applications.

Trowler references Blackler (1995), who

distinguishes between five types of knowledge identified in the organizational literature: embodied; embedded; embrained; encultured and encoded. These terms describe, respectively, knowledge which lies in muscle memory and is seen in skilled physical performances (performative knowledge); that which is located in systematic routines; knowledge which lies in the brain (cognition); that which is located in shared meaning systems; and finally knowledge which is encoded in text of various sorts.

For the research that I’m considering – which currently seems to be drifting toward the role of the university as a whole (or as an “ecology” if you will), I can see that there may be value in most of these kinds of knowledge. I get the feeling that there is definitely going to be a need to consider some of the theories relating to organisations and management.

In a nutshell, I need to be mindful of the ontological and epistemological questions that come up relating to the nature of reality and of knowledge and have an answer ready for questions about how I have done so.

Chapter 3: Research design, data collection and theory

This is a major section for me as I’ve been out of the formal research loop for a long time. If you don’t read anything else in this book (and methodology is important to you), read this. This chapter runs through major methodological approaches, offering up pros and cons for all of them. Ultimately Trowler’s own research took an ethnographic case study approach in a single site and drew from four sources of data: “interviews, observant participation, documents generated within and without the institution and other studies of it”.

Having stepped us through the alternatives, Trowler’s case for his approach in his research appears sound. (Of course, at this stage I’m going off intuition rather than experience but my understanding is that as long as you can make a strong/valid enough argument for your methodology, you can pretty much do what you like)

He anchors this discussion with a comparison of two ontological perspectives (whether they exist as a dichotomy or as points on a spectrum is debatable, of course). There is the “realist” approach (more quantitative, tied to “correlations of a generalisable nature”) and the “social constructivist” (more qualitative, reality is more subjective, more specific but less generalisable).

A key question raised – and certainly one that I’ll need to consider – is that of single site vs multi site research. Beyond the basic logistical issues, there is also (for me) the matter of one site being insider research and another being outsider research. (Even though if I were to go with a second and even a third site, I have connections and relationships in each.)

To be clearer, at this stage I’m considering primarily researching within my own university (ANU) and possibly the neighbouring University of Canberra. A third option could be my former employer, the Canberra Institute of Technology, a Vocational Education and Training provider. On the one hand, this would introduce a lot more variables to the things being examined – a well-resourced, research “elite” university vs an emerging, “standard” university vs a resource-challenged, teaching-focused adult education provider. All of these variables will necessarily shape the data collected and the questions asked. On the other hand, being able to compare approaches to similar questions (how can we better support TELT practices?) could well be more illuminating.

Trowler identifies six types of comparative projects of interest:

1. The factors influencing the success or otherwise of an innovation (for example around virtual learning environments’ deployment and use)
2. Approaches to management and leadership and their effectiveness
3. The implementation of a national policy
4. Compliance (or otherwise) with national quality (or other) guidelines
5. Professional practices in a discipline or field of study
6. Student responses to an innovation

To be honest, the added complexity feels like an overreach at this juncture – I’d rather do something simpler well. It does open the door for further research down the track of course.

Of the methodological approaches discussed, I’d summarise them as follows:

Ethnographic – Participant observation
Researcher ‘lives with the natives’ for prolonged periods of time; fly-on-the-wall observation; often starts from a particular standpoint; “should be for people and not just about them”; language and textual objects vital.

Action research
Focused on solving particular problems, researcher is often a practitioner, iterative processes

Evaluative research
“Evaluative research in higher education aims to attribute value and worth to individual, group, institutional or sectoral activities happening there”. Care needs to be taken to ensure that it makes a larger contribution (theoretical/methodological/professional) to knowledge in the academic world

Hypothesis testing
Often trying to replicate findings of other studies, may try to improve on methodological flaws in prior studies, applying and extending theory

From here, Trowler moves on to discussing the importance of connecting theory to research. From what I can see, it seems to be partially as a way of finding a standpoint and a set of questions and partially as a way of testing some of the hypotheses.

He offers a solid overview of the characteristics of theory:

1. It uses a set of interconnected concepts to classify the components of a system and how they are related.
2. This set is deployed to develop a set of systematically and logically related propositions that depict some aspect of the operation of the world.
3. These claim to provide an explanation for a range of phenomena by illuminating causal connections.
4. Theory should provide predictions which reduce uncertainty about the outcome of a specific set of conditions. These may be rough probabilistic or fuzzy predictions, and they should be corrigible – it should be possible to disconfirm or jeopardize them through observations of the world. In the hypothetico-deductive tradition, from which this viewpoint comes, theory offers statements of the form ‘in Z conditions, if X happens then Y will follow’.
5. Theory helps locate local social processes in wider structures, because it is these which lend predictability to the social world.
6. Finally, theory guides research interventions, helping to define research problems and appropriate research designs to investigate them.

One of the goals of theory seems to relate to being able to “render the normal strange”. A challenge to face as an insider researcher – and particularly in the face of information sources that will come with their own biases and preconceptions – is to maintain that objectivity.

One way of addressing this is by “the application of Stenhouse’s (1979) notion of the ‘second record’ (the use of a detailed understanding of meaning systems to ‘read’ interview data) to this kind of secondary data about the institution.”

Trowler mentions that he probably collected too much data which caused problems in the data analysis phase – no idea how to deal with that but this is something that I will look for in my further reading about research practices.

This chapter also includes a solid description of the methodology that he used in his own PhD research – I think what I’m going to need to do is to create a spreadsheet or database outlining different methodologies – that should be particularly helpful. (Already partially discussed above).


Thoughts on: “Doing Insider Research in Universities” (Trowler, 2012) Part 1

This book was recommended to me (I’m pretty sure) by Inger Mewburn and while it is relatively short, it is incredibly pertinent to my PhD research. One of the reasons that I chose my topic area (Factors influencing TELT practices in Higher Ed) is the level of access that I have to this world through my day job as a learning technologist. I’m relatively new to the Higher Ed sector, after spending a long time in Vocational Education and Training – the other adult learning sector – and while there are many commonalities, there are more than a few significant differences to navigate.

Trowler seems to have a refreshingly grounded perspective of the Higher Ed sector, celebrating its many strengths but not being afraid to name the areas for improvement. There’s a lot to unpack in this remarkably short (74 pages all up) book so I’m going to break it into chunks.

Chapter 1: Insider research: a brief overview

Launching into an examination of the pros and cons of conducting research in your own educational organisation, Trowler quotes Merton (1972) who suggests that

insider doctrine (that only insiders can do ‘proper’ research because of the depth of their understanding) and the ‘outsider’ doctrine (that only outsiders have the necessary detachment for robust research) are both fallacies because we rarely are ever completely an insider or an outsider

The main thing, Trowler suggests, is that it is vital to state explicitly in your justification of your research methodology exactly how an endogenous (insider) approach might “illuminate areas of interest” and “where it could obscure them” (and the steps taken to avoid this).

In broad terms, the advantages of endogenous research include:

  • “better access to naturalistic data… greater access to the second record (underlying meanings of statements made in person or in print) and hidden transcripts (the occluded articulations of power relations within organisations)”
  • “the researcher is better able to produce ‘emic’ accounts (ones meaningful to actors), especially using an ethnographic approach”
  • “the insider researcher is empowered to deploy naturalistic data (records of activities that are neither elicited nor affected by the actions of social researchers), critical discourse analysis (an approach to the study of the production and effects of texts of all sorts) and phenomenography (an approach to the study of the social world which captures the different ways in which a concept is apprehended, usually through interviews with a range of respondents in a particular social field)”
  • “Being culturally literate; one can deploy different types of data, which are relatively easily available, in interrogating an argument”
  • “There may be a better chance of having a beneficial impact on university practices too, especially if the research project involves action research or when research questions address the implications for policy and practice of the project’s findings (LSE Public Policy Group, 2011)”
  • “In addition, specific groups, previously under-represented or dis-empowered may benefit. Insider research is one way of addressing this issue by shining a light on experiences and ways of knowing of women and other groups”

On the other hand, there can be some significant practical disadvantages to insider research:

  • “One’s involvement as a participant in the site of research may mean loss of the ability to produce good, culturally neutral, ‘etic’ (culturally neutral depictions of the social world, describing behaviours) accounts because it can become difficult to ‘see’ some dimensions of social life; they easily become normalised for the participant (the literature talks about the difficulty for insiders of “rendering the normal strange”: Delamont, 2002).”
  • “Moreover, there may be a conflict between the role as a researcher and one’s professional or student role in the university. There may be issues of power differentials between the researcher and researched, in either direction, which can be very problematic both ethically and methodologically (see Ryan et al, 2011 for a discussion of this…)”
  • “Finally, respondents who know the researcher personally or by reputation may have pre-formed expectations of their alignments and preferences in ways which change their responses to questions (a form of the effect called ‘interview bias’)”

My thoughts: 

As a full-time employee of a university with what I feel are a number of interesting avenues for investigation on offer, I am still highly mindful of the fact that not everyone may be as supportive and open to research into our practices as one might expect. Higher education can be an environment where people are necessarily heavily invested in their own expertise – it is essentially their currency – and even when questions are asked of their teaching (or professional) practices (rather than their discipline knowledge), it can sometimes be taken poorly.

Any number of other political factors can come into play and, rightly or wrongly, need to be considered and accommodated. Anonymising the data is possible but it does feel as though this would impact the narrative aspects of the thesis and, realistically, given the links to papers and conference presentations and the relatively small size of the ed tech community in Australia, it’s hard to believe that people wouldn’t figure out who you are writing about anyway.

Putting these things aside, from a purely practical perspective, as a part-time researcher and full-time employee, being able to work on research (with the blessing of my manager) that directly benefits our organisation during work hours is highly appealing. Maintaining a high degree of mindfulness about keeping an analytical and objective mindset – “rendering the normal strange” – might take a degree of effort but with focus seems achievable.

Merton, R. K. (1972). Insiders and Outsiders: A Chapter in the Sociology of Knowledge. American Journal of Sociology.
LSE Public Policy Group. (2011, April). Maximizing the impacts of your research: A handbook for social scientists. Retrieved March 17, 2016, from http://www.lse.ac.uk/government/research/resgroups/LSEPublicPolicy/Docs/LSE_Impact_Handbook_April_2011.pdf
Ryan, L., Kofman, E., & Aaron, P. (2011). Insiders and outsiders: working with peer researchers in researching Muslim communities. International Journal of Social Research Methodology, 14(1), 49–60. http://doi.org/10.1080/13645579.2010.481835

Thoughts on: What is educational research? Changing perspectives through the 20th century (Nisbet, 2005)

This is a substantial and comprehensive paper that takes us on a journey from the late 19th century to the modern age. Nisbet explores the place of research in education and how it has shifted in focus and character over time in response to the needs and demands of stakeholders.

Nisbet broadly identifies three key shifts in the role of the researcher in education – “from academic theorist in phase 1, through expert consultant in phase 2, to reflective practitioner in phase 3.” Clearly these can never be absolute descriptions, as there will always be a range of types of research occurring at any one time, but they still offer an interesting insight into the ways that cultural norms and trends have shaped research. He also rightly wonders whether “the growing acceptance of research in education… may have had the effect of restricting its scope”.

Rather than trying to cover the entirety of this overview, I’ll look at some of the key ideas. (I’m very much still at the point of needing to better understand the nature of formal research in education, and I heartily recommend this paper if you’re interested in that context; a blog post will do it no justice.)

Early educational research had a highly psychological and quantitative bent

The concept of educational research which was established was experimental, primarily psychological, involving measurement, seeking solutions to the educational problems of the day, and this interpretation monopolised educational research for the first half of the century

Édouard Claparède delved into some interesting questions in the field of cognition in Experimental Pedagogy and the Psychology of the Child (1911), asking:

Before learning anything, it is necessary to learn how to learn (p.57)

How far are the various mental functions able to be independent of each other, or, on the other hand, how far do they reciprocally influence each other? [correlation and factor analysis] (p. 61)

When we educate a certain function, are we acting on others at the same time? [transfer of training] (p. 64)

This heavily science-driven approach persisted for many decades, leading to a flourishing of standardised testing (as well as bizarre experiments in which students were sprayed with a 1% solution of nerve gas to increase alertness). Some of the less radical work on student fatigue led to the contemporary practice of limiting class times to around 40–45 minutes. Broadly, however, education research was seen as an academic pursuit that was largely “out of touch with ‘real problems’”.

Major shifts occurred in the 1960s when governments started to review education systems and practices and saw a need for supporting evidence for change. Researchers became more closely aligned with education departments who started to demand more of what they saw as relevant value for their money.

Relegating research to this instrumental role carries risks: trivialising, in pursuing volatile educational fashions; restrictive, in limiting research within the constraints of existing policy frameworks; potentially divisive, creating an elite group of researchers in alliance with authority; and ultimately damaging, in that it can leave the researchers wholly dependent on their powerful partner. Researchers who decline to accept this requirement, choosing an unpopular or unfashionable line of inquiry, are liable to find that they receive no grants, that their papers are not accepted by journals, or, if published, are not widely read or quoted.

Meanwhile, in the 1970s shifts in the parallel disciplines of psychology and sociology led to a move away from the long-standing quantitative methods in education and towards greater acceptance of qualitative approaches including case studies “exploring issues in more depth with relatively small numbers”. It aimed “at understanding and insight into the complexities of learning and human behaviour”.

This in turn saw a rise in Action Research and the teacher-researcher movement. Greater emphasis on reflective practices by teachers was (is) a key component of this. Other approaches also emerged.

The practice of measurement was also questioned. In-depth interviews provided a different kind of data, approaching a topic from the perspective of the interviewee rather than within a framework decided in advance by the researcher. This phenomenographic method (as it is called) has its roots in the philosophy of phenomenology, which opposes the positivism or naturalism inherent in contemporary science and technology – the standard scientific approach to knowledge by formulating hypotheses and designing experimental procedures to test these – on the grounds that this finds (or negates) only what the researcher is looking for, whereas the open-ended methods of phenomenography produce data for formulating new interpretive constructs. This approach focuses on awareness or ‘encountering’ and accepts the role of description in how we perceive situations and how we interpret or ‘understand’ them. Thus, from the interview transcripts, the researcher derives interpretive categories: for example, the way students speak about their reading and understanding leads to the categorisation of ‘deep’ and ‘surface’ learning (Entwistle, 1981). Recognising the subjectivity involved, the interpretation is supported extensively by excerpts from the interview transcripts.

Further approaches drawn from the worlds of philosophical and sociological theory included:

Garfinkel (1967) introduced the term ‘ethnomethodology’ to describe his approach to research which explores ‘the patterns and structures discernible in societies’

(These) are not a matter of external social constraints, roles or functions imposed on hapless individuals but are produced through cultural and interpretive processes that people collaboratively use to make sense of the world and render it mutually comprehensible (Maclure, 2003, p. 188)

Arguably, these and other postmodern/poststructuralist approaches raise interesting and valuable questions but don’t appear to be used widely in educational research, given their lack of connection to the more pressing practical questions facing educators.

This is evidenced most dramatically – and, I would argue, manifested most clearly in the government obsession with standardised testing and metrics – in the contemporary reality that “evaluation and data gathering studies are more likely to attract funding than theoretical analyses which aim at insights into problems, the enlightenment function of research”. This is also clearly where the booming corporate face of education sits – the textbook publishers (Pearson, McGraw Hill etc.) with their expanding ranges of teaching and testing products.

My thoughts and ideas

As ever, these are a little random in some ways and reflect some of the tangents that I went off on while reading this paper. Some are simply to-do items for further investigation.

Teacher reflection processes – if this is to be taken seriously as a form of professional development – need to be more tightly framed and/or led

How much research can I get my teachers to do into their practices – can we get the uni to support this? Can this become a more substantive part of their teaching professional development? Where is the carrot for this?

I need to look more deeply into the existing OLT research projects

A final quote:

But research has become part of every professional role today, and in education one task of professional development is to weave a research question into the expertise of teachers, leading them to adopt at a personal level the self-questioning approach which leads to reflection and understanding, and from there into action

This reminds me that professional development for higher ed “teachers” should be a vital avenue in my research, and it also makes me think about the value of reflection in a vacuum. Surely communities of practice are a vital element of the step to action?

 

Claparède, E. (1911). Experimental Pedagogy and the Psychology of the Child. Longmans, Green and Company.

Entwistle, N. (1981). Styles of Learning and Teaching. John Wiley.

Thoughts on: A general framework for developing proposals – Developing Effective Research Proposals. (Punch, 2000)

book cover

Writing in this format for gathering my thoughts and collecting useful quotes and ideas from articles/books/etc proved fairly useful to me while completing my Masters so I figured that I’d give it a shot here now.

(Actually it’s funny now going back to that old blog, as the final post was an overview of my thoughts about doing a research methodology subject, which seemed utterly redundant given that it was the final subject in the degree and not an area I felt I was likely to spend any further time on.)

Anyway, while I thought the first of these posts would relate to Paul Trowler’s mini-book on “Doing Insider Research in Universities”, I’m still working my way through (and enjoying) that, and in the meantime I was given Chapter 3 of Punch’s book about research proposals to read at the first of the Thesis Proposal Writing Workshop sessions offered by USyd ESW. (Homework, who knew?)

Punch offers a pragmatic and seemingly reasonable (based on my limited knowledge) approach to framing a research proposal. He readily acknowledges that there can be no single perfect approach, but rather a broad set of guiding principles that should enable one to hone one’s area of research interest down to specific and measurable data collection questions. (This isn’t to say that it won’t be a cyclical, iterative process with some potential dead ends, but ultimately it should result in a product that is “neat, well-structured and easy to follow”.)

Here are some of the key points that I took from the chapter:

  • Three key questions at the heart of the proposal – What, Why and How (how coming later and including when, who and where – i.e. the methodology)
  • The why is important – it provides the justification for the research and will often merit multiple sections
  • Logical flow from research area -> research topic -> general research questions -> specific research questions -> data collection questions

Possible examples:
Research area: youth suicide
Research topic: factors associated with the incidence of youth suicide
General research question: “What is the relationship between family background factors and the incidence of youth suicide?”
Specific research question: “What is the relationship between family income and the incidence of youth suicide?”

The point is to move toward questions that can be directly asked and answered.
“Is it clear what data will be required to answer this question?”
The answers to the general questions are the sum and synthesis of the more specific questions.

Punch prefers the term “indicators” to “factors” (which I have tended to use to date) because “of its wide applicability across different types of research. It applies in quantitative and qualitative contexts, whereas the terms ‘dimensions’, ‘factors’ and ‘components’ have more quantitative connotations.”

He also makes the point that the more well-considered the research questions are, the more they suggest the types of data that are necessary to answer them. “It is the idea behind the expression that ‘a question well asked is a question half-answered.'”

Punch goes on to point out that “should” questions (e.g. Should nurses wear white uniforms?) are unduly complex and require a lot of unpacking to answer. (Who’s to judge “should”?)
A more productive question might be “Do nurses think they should wear white uniforms?” – to which I would add maybe “Why do nurses think they should wear white uniforms?” – which perhaps gets more complicated but can still form a reasonable question to a nurse.

In broad terms, Punch then reiterates the importance of being clear on the what and the why of the research before moving on to methodology. There is some interesting discussion of the value of hypotheses in relation to the research questions – though at this stage I don’t think they will be relevant to my research – which links to aligning theory to the research questions.

Some reflections and questions raised.

At this early stage I’ve been concerned about my lack of a strong research background in terms of knowing what kind of methodology I plan to use. Many of my peers seem to have already mapped out the next 3-6 years and I’m still trying to figure out what I really want/need to find out.

This chapter has reminded me that figuring out the what and why – which I’ve made a modest start on in my mind at least – is vital in informing the next steps in the research.

It has also sparked a few random ideas and questions for me to pursue, which feels like a win.

Why don’t more people use TELT practices in Higher Ed / Adult Ed?
Is the learning technologist a factor? Where do we sit? In Organisation? or separately?
(There’s some crossover with pedagogy maybe. Also compliance and innovation)
How do these factors interrelate?

What if I start out by thinking there is a gap in the literature and there actually isn’t?
What’s the difference between a learning practice and a teaching practice?
Which factors (or sets of factors) impact TELT practices and how do they interrelate?
What actions are needed at what levels & contexts to mitigate the barrier factors?

Just finally, I’ve also decided on some tools to start my documenting process – Zotero and Scrivener. (Probably worthy of posts in their own right). The following bibliographic entry comes from the Zotpress plugin for WordPress and seems to have done a nice job in preview. (I do need to find out what the “official” citation style is. Currently I’m going APA because I like it)

Trying out Mahara – an ePortfolio adventure

Managing your online identity – particularly your professional identity – is arguably now one of the core digital literacies. This is why I’ve long taken an interest in the use of ePortfolio tools.

As the “Education Innovation Officer” in a college at a major Australian university, it’s my job to keep us moving forward and to find the best ways to do so. I’d dabbled with ePortfolios before coming here but had never really used them for a project, and thus had no direct evidence to support a case that they are worth pursuing.

A few months ago I came across CMALT – Certified Member of the Association for Learning Technology – a useful-seeming accreditation and a connection to a community of practice of my peers. The application process lends itself very well to the use of ePortfolios, so I decided to take the Mahara ePortfolio platform out for a spin.

Our IT/Web team was kind enough to install an instance on a local server – a Windows server, sadly (more on that later) – and off I went.

Using Mahara 

Mahara enables users to curate a range of files, text, links and other resources into highly customisable pages. A range of page layouts is available and content can easily be dragged and dropped into the page.

mahara edit content screenshot

These pages can then be gathered into collections and private URLs generated so that the user is able to choose which pages are shared with whom.

In terms of ease of use, so far so good. My biggest concern at this stage was finding a way to provide clear connections between the evidence I was providing and the reflective posts I was making in response to the selection criteria for the CMALT application. Ultimately I decided on footnote-style superscript annotations that referred to numbered sections containing files at the side of the text.

mahara footnotes

The good parts

Using Mahara was a highly intuitive process that made it very easy to quickly produce a professional-looking page. It certainly helped that Mahara is based on Moodle code (to some extent), as I have used Moodle for a number of years, but I feel confident that even a user without Moodle experience would pick it up quickly.

The file management system is similarly quick to pick up, with a simple space that can be used to upload files and organise them into folders.

The range of themes that can be used to style a portfolio (or the whole site) offer a reasonable degree of personalisation and I suspect that it is possible to do a lot more if you are happy to dive into the CSS and tinker.

The lesser parts

As I mentioned earlier, our Mahara instance was installed on a Windows server rather than the recommended platform and this generated a number of back-end error messages. Broadly the system seems to be working fairly well and when it is rolled out officially at a university level, I’m sure that it will be done according to the recommended specs.

For this reason, it’s hard to know whether some things don’t work – most notably the Open Badges plugin – because of our non-compliant server configuration or because of other issues. This was more of a nice-to-have in any case so it hasn’t been a major headache.

One thing that did cause a few more headaches though was the fact that when adding a folder of files to a page, the process for correctly selecting the folder (so that it actually displays the files) is fairly unintuitive in the latest release. A user needs to click the white space next to the folder name – but not the name itself or the folder icon – to select it. This caused me a good few hours of frustration but I have been told it will be addressed in future versions.

The user manual is fairly rich and detailed and there is also a growing community of users, so it isn’t generally too hard to find an answer when you strike a problem.

Wrapping up

I did spend a fair while tweaking my ePortfolio to get it just right, but this was never laborious. My ePortfolio probably doesn’t make full use of the range of tools on offer, but I think it has done the job reasonably well.

Feel free to take a look at it at http://mahara.cbenet.anu.edu.au/view/view.php?t=vex1phdEaZm2QsXRUGc9 and if you have any thoughts or suggestions, please let me know.

Week 2 of the 11.133x MOOC – getting things done. (Gradually)

The second week (well, fortnight really) of the 11.133x MOOC moved us on to developing resources that will help us to evaluate educational technology and make a selection.

Because I’m applying this to an actual project at work (two birds with one stone and all), it took me a little longer than I’d hoped, but I’m still keeping up. It was actually fairly enlightening, because the tool I had assumed we would end up using wasn’t the one that was shown to be the most appropriate for our needs. I was also able to develop a set of resources (and the start of a really horribly messy flowchart) that my team will be able to use for evaluating technology down the road.

I’m just going to copy/paste the posts that I made in the MOOC – with the tools – as I think they explain what I did better than glibly trying to rehash it here on the fly.


 

Four tools for identifying and evaluating educational technology

I’ve been caught up with work things this week, so it’s taken me a little while to get back to this assignment, but I’m glad, as it has enabled me to see the approaches that other people have taken and to clarify my ideas a little.

My biggest challenge is that I started this MOOC with a fairly specific Ed Tech project in mind – identifying the best option in student lecture instant response systems. The assignment however asks us to consider tools that might support evaluating Ed Tech in broader terms and I can definitely see the value in this as well. This has started me thinking that there are actually several stages in this process that would probably be best supported by very different tools.

One thing that I have noticed (and disagreed with) in the approaches some people have taken is that their tools seem to begin with the assumption that the type of technology has already been selected, and the educational/pedagogical strengths of this tool are then assessed. This seems completely backwards to me: I would argue that we need to look at the educational need first and then try to map it to a type of technology.

In my case, the need/problem is that student engagement in lectures is low and a possible solution is that the lecturer/teacher would like to get better feedback about how much the students are understanding in real time so that she can adjust the content/delivery if needed.

Matching the educational need to the right tool

When I started working on this I thought that this process required three separate steps – a flowchart to point to suitable types of technology, a checklist to see whether it would be suitable and then a rubric to compare products.

As I developed these, I realised that we also need to clearly identify the teacher’s educational needs for the technology, so I have added a short survey about this here, at the beginning of this stage.

I also think that a flowchart (ideally interactive) could be a helpful tool in this stage of identifying technology. (There is a link to the beginning of the flowchart below)

I have been working on a model that covers 6 key areas of teaching and learning activity that I think could act as the starting point for this flowchart but I recognise that such a tool would require a huge amount of work so I have just started with an example of how this might look. (Given that I have already identified the type of tool that I’m looking at for my project, I’m going to focus more on the tool to select the specific application)

I also recognise that even for my scenario, the starting point could be Communication or Reflection/Feedback, so this could be a very messy and large tool.

The key activities of teaching/learning are:
• Sharing content
• Communication
• Managing students
• Assessment tasks
• Practical activities
• Reflection / Feedback

I have created a Padlet at http://padlet.com/gamerlearner/edTechFlowchart and a LucidChart at https://www.lucidchart.com/invitations/accept/6645af78-85fd-4dcd-92fe-998149cf68b2 if you are interested in sharing ideas for types of tools or questions, or feel like helping me to build this flowchart.

I haven’t built many flowcharts (as my example surely demonstrates) but I think that if it was possible to remove irrelevant options by clicking on sections, this could be achievable.
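To make the idea concrete, here is a minimal sketch of how such a flowchart might be encoded as a data structure, with the first branching level drawn from the six key activities above. The suggested tool types are placeholder assumptions of mine, not part of any finished model:

```python
# A minimal sketch of the flowchart as a nested dict: each node holds a
# question and a mapping from answers to either further nodes or a list of
# suggested technology types. The tool-type suggestions are illustrative only.
FLOWCHART = {
    "question": "Which teaching/learning activity are you supporting?",
    "answers": {
        "Sharing content": ["lecture capture", "LMS content pages"],
        "Communication": ["discussion forums", "web conferencing"],
        "Managing students": ["LMS gradebook", "attendance tools"],
        "Assessment tasks": ["online quizzes", "assignment submission"],
        "Practical activities": ["simulations", "virtual labs"],
        "Reflection / Feedback": ["ePortfolios", "audience response systems"],
    },
}

def suggest(node, answer):
    """Walk one step of the flowchart: return the suggestions (or next node)."""
    return node["answers"][answer]

print(suggest(FLOWCHART, "Reflection / Feedback"))
# ['ePortfolios', 'audience response systems']
```

An interactive version would simply repeat `suggest` until it reaches a list rather than another question node, hiding the branches the user has clicked away.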

Is the technology worthwhile?

The second phase of this evaluation lets us look more closely at the features of a type of technology to determine whether it is worth pursuing. I would say that there are general criteria that will apply to any type of technology and there would also need to be specific criteria for the use case. (E.g. for my lecture clicker use case, it will need to support 350+ users – not all platforms/apps will do this but as long as some can, it should be considered suitable)

Within this there are also essential criteria and nice-to-have criteria. If a tool can’t meet the essential criteria then it isn’t fit for purpose, so I would say that a simple checklist should be sufficient as a tool will either meet a need or it won’t. (This stage may require some research and understanding of the available options first). This stage should also make it possible to compare different types of platforms/tools that could address the same educational needs. (In my case, for example, providing physical hardware based “clickers” vs using mobile/web based apps)

This checklist should address general needs – which I have broken down into student, teacher and organisational needs that could be applied to any educational need – and should also include scenario-specific criteria.
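The gating logic of such a checklist is simple enough to sketch in a few lines of Python. The criterion names here are hypothetical placeholders of mine, though the 350+ user requirement comes from the clicker use case above:

```python
# Sketch of the checklist gate: a tool that fails ANY essential criterion is
# not fit for purpose; nice-to-have criteria only matter among tools that pass.
# All criterion names and tool data below are illustrative assumptions.
def fit_for_purpose(tool, essentials):
    """Return True only if the tool satisfies every essential criterion."""
    return all(tool.get(criterion, False) for criterion in essentials)

essentials = ["supports_350_users", "works_on_mobile", "meets_privacy_policy"]

clicker_app = {"supports_350_users": True, "works_on_mobile": True,
               "meets_privacy_policy": True, "open_badges": False}
hardware_clickers = {"supports_350_users": False, "works_on_mobile": False,
                     "meets_privacy_policy": True}

print(fit_for_purpose(clicker_app, essentials))        # True
print(fit_for_purpose(hardware_clickers, essentials))  # False
```

Nice-to-have criteria (like the `open_badges` flag above) are then only compared among the tools that clear this gate.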

Evaluating products
It’s hard to know exactly what the quality of the tool or the learning experiences will be. We need to make assumptions based on the information that is available. I would recommend some initial testing wherever possible.
I’m not convinced that it is possible to determine the quality of the learning outcomes from using the tool so I have excluded these from the rubric.
Some of the criteria could be applied to any educational technology and some are specifically relevant to the student response / clicker tool that I am investigating.
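For the products that pass the checklist, a weighted rubric then reduces to a weight-adjusted sum of scores per product. A minimal sketch, with all product names, weights and scores invented purely for illustration:

```python
# Weighted rubric: each product is scored 1-5 per criterion; the total is the
# sum of each score multiplied by that criterion's weight. All names, weights
# and scores below are invented for illustration.
weights = {"ease_of_use": 3, "cost": 2, "question_types": 2, "support": 1}

products = {
    "Product A": {"ease_of_use": 4, "cost": 3, "question_types": 5, "support": 3},
    "Product B": {"ease_of_use": 5, "cost": 2, "question_types": 3, "support": 4},
}

def rubric_total(scores):
    """Weight-adjusted sum of a product's criterion scores."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(products, key=lambda p: rubric_total(products[p]), reverse=True)
for name in ranked:
    print(name, rubric_total(products[name]))
```

The weights make the value judgements explicit, which also makes it easier to defend the final recommendation to the teacher or the executive.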


 

Lecture Response System pitch

This was slightly rushed, but it does reflect the results of the actual evaluation that I have carried out into this technology so far. (I’m still waiting for some questions to be answered about one of the products.)

I have emphasised the learning needs that we identified, looked quickly at the key factors in the evaluation and then presented the main selling points of the particular tool. From there I would encourage the teacher/lecturer to speak further to me about the finer details of the tool and our implementation plan.

Any thoughts or feedback on this would be most welcome.

Edit: I’ve realised that I missed some of the questions – well, kind of.
The biggest challenge will be how our network copes with 350+ people trying to connect to something at once. The internet and phone-texting options were among the appealing parts of the tool in this regard, as they will hopefully reduce this number.

Awesome would look like large numbers of responses to poll questions and the lecturer being able to adjust their teaching style – either re-explaining a concept or moving to a new one – based on the student responses.


 

These are the documents from these two assignments.

  • Lecture Response Systems Pitch
  • Clickers Ed Tech Evaluation Rubric
  • Educational Technology Needs Survey
  • Education Technology Checklist