
Research update #50: Thesis proposal feedback day

I got some rich, detailed feedback on my thesis proposal from Peter. Given that this was firmly entrenched in my mind as a first draft, I was prepared for the worst, so I was fairly happy to find that some of the key issues were things that had already been concerning me.

I need to go over the comments again in more depth, but the biggest areas for improvement so far seem to be that I got a little repetitive in the latter section (and I know this is true) and that I focused too heavily on building and supporting my argument in the literature review rather than just describing the current state of scholarship in the field. That second issue I had been less aware of (or perhaps didn’t quite understand at the time), maybe because I’ve been concerned about snooty examiner academics looking at my idea and saying: so what? I probably also liked the flow that making a case gave me, in that it made it easier to look at the connections between a range of ideas, but I’m sure there will be other ways to do that that are less, I don’t know, needy?

One interesting thing was that I had been concerned I was trying to cover too many things and needed to reel it back in a little to arrive at an achievable project. On the contrary, Peter suggested a couple more avenues that might be interesting to explore and noted the virtual absence of students. I think my project is more about teaching than learning, but I also had a great discussion with a friend last week along the lines of ‘you can learn without a teacher but you can’t teach without a learner’, so there’s some room to move there when we look at the practice of teaching (if that’s what I’m going to do).

One thing that I was perhaps a little wrong about was my sense that older literature might carry more weight or have more gravitas or something. I’ve read a lot and must confess that some of the more recent work has been particularly valuable in some ways (probably that whole standing on the shoulders of giants thing), but I guess I was concerned that overemphasising it might make it look as though I’d only taken a shallow step into the pool – just picked up a handful of recent journals and gone ‘this’ll do’. I do still think that the older literature has value in demonstrating that these issues aren’t new and that, in spite of much discussion of them, little seems to have changed. Room for both, I guess.

There were also a few minor things – formatting and heading size, indenting quotes, etc. – which I’m always fine with because these are the easiest to fix, and the more of this there is, the less there is to improve in the actual writing and arguments and ideas. I also need to be a little more mindful of my broad declarations that a thing is a certain way, and of claiming knowledge of things I couldn’t possibly know. That’s fair too. Sometimes you just get caught up in an idea and go too far.

So there’s still a fair bit to be done but there doesn’t seem to be anything majorly wrong with my key ideas, so I’m happy.

Also, hooray, 50 research updates. I have no idea if anyone reads these – I seem to get maybe 10-15 hits on these posts on average, but I have no idea how much people actually read. If you’re still here, I hope you get something from this. Blogging is a weird form of public personal writing really, when you think about it.

Thoughts on: The Dynamics of Social Practice; Chapter 6 – Circuits of Reproduction (Shove et al., 2012)

This chapter seemed to take forever to work through, possibly because a bit of life stuff came up in the interim but it’s also at the more complex end of the discussion. It concludes the overall examination of the dynamics of social practice and in some ways felt like a boss level that I’d been working up to. Most of the chapter made sense but I must confess that there is a page about cross-referencing practices as entities that I only wrote “wut?” on in my notes. Maybe it’ll make more sense down the road.

The good news is that there was more than enough meaningful content in this chapter to illuminate my own exploration of practices in my research, and it sparked a few new stray ideas for future directions. There’s a decent summary near the end of the chapter that I’ll start with and then expand upon. (The authors do great work with their summaries, bless them.)

In this chapter we have built on the idea that if practices are to endure, the elements of which they are made need to be linked together consistently and recurrently over time (circuit 1). How this works out is, in turn, limited and shaped by the intended and unintended consequences of previous and co-existing configurations (circuit 2). Our third step has been to suggest that persistence and change depend upon feedback between one instance or moment of enactment and another, and on patterns of mutual influence between co-existing practices (circuit 3). It is through these three intersecting circuits that practices-as-entities come into being and through these that they are transformed. (p.93)

So let’s unpack this a little. There are a number of reciprocal relationships and cycles in the lives of practices. The authors discuss these in two main ways – through the impact of monitoring/feedback on practices (both as individual performances and larger entities) and by cross-referencing different practices (again, as performances and entities).

Monitoring practices as performances

A performance of a practice generates data that can be monitored. It might be monitored by the practitioner (as part of the practice) and it might also be monitored by an external actor assessing the performance or its results/outputs (this might be in an education/training context, a regulatory one, or something else). This monitoring then informs feedback, which improves/modifies that performance and/or the next one(s), and so the cycle continues. Potentially, in every new performance, the history of past performances can help refine the practice over time.

This isn’t all that evolves the practice; materials, competences and meanings play their part too, but it is a significant factor.

Monitoring, whether instant or delayed, provides practitioners with feedback on the outcomes and qualities of past performance. To the extent that this feeds forward into what they do next it is significant for the persistence, transformation and decay of the practices concerned… self-monitoring or monitoring by others is part of, and not somehow outside, the enactment of a practice (what are the minimum conditions of the practice?) is, in a sense, integral to the performance. Amongst other things, this means that the instruments of recording (the body, the score sheet, the trainee’s CV) have a constitutive and not an innocent role. (p.83-4)

So far, so good. This also makes me think that many practices are made up of elements or units of practice – let’s call them steps for simplicity. The act of monitoring is just another step. (This does take us into the question of whether something is a practice or a complex/bundle of practices – driving is made up of steering, accelerating, braking, signalling etc. – but nobody says they’re going out accelerating.)

Monitoring practices as entities

To look at a practice as an entity is to look much more at the bigger picture of the practice.

the changing contours of practices-as-entities are shaped by the sum total of what practitioners do, by the variously faithful ways in which performances are enacted over time and by the scale and commitment of the cohorts involved. We also noticed that practices-as-entities develop as streams of consistently faithful and innovative performances intersect. This makes sense, but how are the transformative effects of such encounters played out? More specifically, how are definitions and understandings of what it is to do a practice mediated and shared, and how do such representations change over time (p.84)

An interesting side note when considering the evolution of practice is the contribution of business. This example came up in a discussion of snowboarding:

As the number of snowboarders rose, established commercial interests latched on to the opportunities this presented for product development and profit (p.85)

This ties back to the influence of material elements (new designs and products in this case) on shaping a practice.

Technologies are themselves important in stabilising and transforming the contours of a practice. In producing snowboards of different length, weight, width and form, the industry caters to – and in a sense produces – the increasingly diverse needs of different types of user… Developments of this kind contribute to the ongoing setting and re-setting of conventions and standards. (p.85)

This in turn brings us back to one of the other key roles played by monitoring (and feedback) in terms of practices as entities: describing and defining them. The language that is used to describe a practice and its component parts, and to define what makes good practice, is of vital importance in determining what a practice is and what it becomes.

…if we are to understand what snowboarding ‘is’ at any one moment, and if we are to figure out how the image and the substance of the sport evolves, we need to identify the means by which different versions of the practice-as-entity relate to each other over time. Methods of naming and recording constitute one form of connective tissue. In naming tricks like the ollie, the nollie, the rippey flip and the chicken salad, snowboarders recognise and temporarily stabilize specific moves. Such descriptions map onto templates of performance – to an idea of what it is to do an ollie, and what it means to do one well… in valuing certain skills and qualities above others, they define the present state of play and the direction in which techniques and technologies evolve. (p.85)

So clearly, monitoring, description, standardisation/regulation and the dissemination of all of these are hugely important to practices. Interestingly, a Community of Practice for TEL edvisors that I started recently held its first webinar yesterday, looking at the different standards that Australian universities have for online course design. It probably merits a blog post of its own, but I guess it’s a good example of what needs to happen to improve Technology Enhanced Learning and Teaching practices.

The final piece of the puzzle when it comes to monitoring – which ties back to our webinar nicely once more – is mediation.

Describing and materializing represent two modes of monitoring in the sense that they capture and to some extent formalize aspects of performance in terms of which subsequent enactments are defined and differentiated. A third mode lies in processes of mediation which also constitute channels of circulation. Within some snowboarding subcultures, making and sharing videos has become part of the experience. These films, along with magazines, websites and exhibitions, provide tangible records of individual performance and collectively reflect changing meanings of the sport within and between its various forms. Put simply, they allow actual and potential practitioners to ‘keep up’ with what is happening at the practice’s leading edge(s) (p.86)

I find it quite interesting that monitoring/documenting and sharing the practice is considered an important part of the practice itself. In teaching, I’ve tried to launch projects to support this but management levels have not seen value in them. (I’ll just have to persevere and keep making the argument.)

There’s another nice description of the role of standards:

…standards, in the form of rules, descriptions, materials and representations, constitute templates and benchmarks in terms of which present performances are evaluated and in relation to which future variants develop (p.86)

The discussion of the role of feedback notes that positive feedback can be self-perpetuating, in “what Latour and Woolgar refer to as ‘cycles of credibility’ (1986). Their study of laboratory life showed how the currencies of scientific research – citations, reputation, research funding – fuelled each other. In the situations they describe, research funding led to research papers that enhanced reputations in ways which made it easier to get more research funding and so on” (p.86)

Ultimately, feedback helps to sustain practices (as entities) by keeping practitioners motivated.

At a very basic level, it’s good to know you are doing well. Even the most casual forms of monitoring reveal (and in a sense constitute) levels of performance. In this role, signs of progress are often important in encouraging further effort and investment of time and energy (Sudnow, 1993). The details of how performances are evaluated (when, how often, by whom) consequently structure the careers of individual practitioners and the career path that the practice itself affords. This internal structure is itself of relevance for the number of practitioners involved and the extent of their commitment (p.86)

Cross-referencing practices-as-performances

The act of participating in a performance of a practice means that it has been prioritised over other practices – the time spent on this performance is not available to the others.

…some households deliberately rush parts of the day in order to create unhurried periods of ‘quality’ time elsewhere in their schedule. In effect, short-cuts and compromises in the performance of certain practices are accepted because they allow for the ‘proper’ enactment of others (p.87)

Shove et al examine the importance of time as a tool, a coordinating agent that helps in this process. In a nutshell, it is a vital element of every practice and shapes the interactions between practices (and practitioners). They move on to explore the change from static ‘clock-time’ to a more flexible ‘mobile-time’. Their argument is essentially that our adoption of mobile communication technologies (i.e. smartphones) is giving us a more fluid relationship with time, because we can now call people on the fly to tell them that we are running late.

However some commentators are interested in the ways in which mobile messaging (texting, phoning, mobile emailing) influences synchronous cross-referencing between practices (p.88)

I’ll accept that mobile communication is changing the way we live, but I’m not convinced that it is having the impact on practices that the authors suggest. Letting someone know that you’re running late doesn’t particularly change what is to be done; it just pushes it back. Perhaps letting someone know of a change of venue has more impact, in that it would allow a practice that might not otherwise have occurred to do so, but this doesn’t strike me as something that would happen regularly enough to change our concepts of time or practice.

The authors express this somewhat more eloquently than I:

But is this of further significance for the ways in which practices shape each other? For example, does the possibility of instant adjustment increase the range of practices in which many people are involved? Does real-time coordination generate more or less leeway in the timing and sequencing of what people do? Are patterns of inattention and disengagement changing as practices are dropped or cut short in response to news that something else is going on? Equally, are novel practices developing as a result? In thinking about these questions it is important to consider how technologies of ‘micro’-coordination relate to those that operate on a global scale (p.89)

Another significant idea that this generated for me was that the things we do shape our world, because we design and modify our world to suit the things that we do. This may then change our ability to do those things, and we enter a cycle where practice shapes environment, which shapes practice, and so on.

Or as the authors put it:

we have shown that moments of performance reproduce and reflect qualities of spacing and timing, some proximate, some remote. It is in this sense that individual practices ‘make’ the environments that others inhabit (p.89)

I guess then, the real question is how we as TEL edvisors can make this work for us.

Cross-referencing practices-as-entities

This is the section that lost me a little, but parts made sense. It’s something to do with the way that separate practices might be aggregated as part of a larger issue – for example, eating and exercise both sit within the issue of obesity. Clearly obesity isn’t a practice, but it does encompass both of these practices and creates linkages between them that wouldn’t necessarily otherwise be there. This happens in part by tying in monitoring and creating a discourse/meaning attached to them all.

The authors refer to this combination of the discourse and the monitoring via measurement technologies as “epistemic objects, in terms of which practices are conjoined and associated, one to another” (p.92). (And arguably monitoring and discourse create their own cyclical relationship)

They move on to expand on the significance of the elements of a practice (material, competence and meaning),

this time viewing them as instruments of coordination. In their role as aggregators, accumulators, relays and vehicles, elements are more than necessary building blocks: they are also relevant for the manner in which practices relate to each other and for how these relations change over time. (p.92)

Writing and reading – as competences rather than practices, I guess – occupy a vital space here in terms of the dissemination of practices, meanings and techniques.

They discuss two competing ideas from other scholars in the field (Law and Latour): either practices need elements to remain stable for significant periods of time so that practices can become entrenched, or practices benefit from changes in the elements that enable them to evolve. I don’t actually think that these positions are mutually exclusive.

Other thoughts

I jotted down a number of stray thoughts as I read this chapter that don’t necessarily tie to specific sections, so I’ll just share them as is.

Is a technology a material or does it also carry meanings and competences?

Does research culture/practice negatively impact teaching practice? Isolated and competitive – essentially the antithesis of a good teaching culture.

Does imposter syndrome (in H.E. specifically) inhibit teachers from being monitored/observed for feedback? Does rewarding only teaching excellence inhibit academic professional development in teaching because it stops people from admitting that they could use help? Are teaching excellence awards a hangover from a research culture that is highly competitive?

What if we could offer academics opportunities to anonymously and invisibly self-assess their teaching and online course design?

Is Digigogy (digital pedagogy) the ‘wicked problem’ that I’m trying to resolve in my research – in the same way that ‘obesity’ is an aggregator for exercise and eating as practices? I do like ‘digigogy’ as an umbrella term for TEL practices.

Where do TEL edvisors sit in the ‘monitoring’ space of TEL practices?

This ‘epistemic objects – cycle of monitoring/feedback and discourse’ is probably going to play a part in my research somehow. Maybe in CoPs.

So what am I taking away from this?

I guess it’s mainly that there are a lot of different ways in which practices (and performances of practices) are connected, which impacts how they evolve and spread. Monitoring and feedback – particularly when they are baked into the practice – are a big deal. The whole mobile-time thing feels like an interesting diversion, but the place of technology (and what exactly it is in practice-element terms) will be a factor, given that I’m looking at Tech Enhanced Learning. (To be honest though, I think I’m really looking at Tech Enhanced Teaching.)

In specific terms, it seems more and more like what I need to do is break down all of the individual tasks/activities that make up the practice of ‘teaching’ – or Tech Enhanced Learning and Teaching – and find the crossover points with the activities that make up the practice of a TEL edvisor. I think there is also merit in looking at the competition between research practices and teaching practices for academics, which impacts their teaching in significant ways.

On now to Chapter 7, where the authors promise to bring all the ideas together. Looking forward to it.

Shove, E., Pantzar, M., & Watson, M. (2012). The Dynamics of Social Practice: Everyday Life and How it Changes. SAGE Publications Ltd. http://sk.sagepub.com/books/the-dynamics-of-social-practice


More thoughts on: “Digital is not the future – Hacking the institution from the inside” – Technology, practical solutions and further questions

Previously on Screenface.net:

I’ve been participating in an online “hack” looking at “Digital is not the future – Hacking the institution from the inside” with a number of other education designers/technologists.

It’s been pretty great.

I shared some thoughts and summarised some of the discussions tied to the issues we face in supporting and driving institutional change, working with organisational culture and our role as professional staff experts in education design and technology.

There’s still much to talk about: technology and what we need it to do, practical solutions both in place and under consideration (or on the wishlist), further questions and a few stray ideas that were generated along the way.

Technology

Unsurprisingly, technology was a significant part of our conversation about what we can do in the education support/design/tech realm to help shape the future of our institutions. The core ideas that came up included what we are using it for, and how we sell it and instil confidence in it among our clients – teachers, students and the executive.

The ubiquity and variety of educational technologies mean that they can be employed in all areas of the teaching and learning experience. It’s not just being able to watch a recording of the lecture you missed or to take a formative online quiz; it’s signing up for a course, finding your way to class, joining a Spanish conversation group, checking for plagiarism, sharing notes, keeping an eye on at-risk students and so much more.

It’s a fine distinction but Ed Tech is bigger than just “teaching and learning” – it’s also about supporting the job of being a teacher or a learner. I pointed out that the recent “What works and why?” report from the OLT here in Australia gives a strong indication that the tools most highly valued by students are the ones that they can use to organise their studies.
Amber Thomas highlighted that “…better pedagogy isn’t the only quality driver. Students expect convenience and flexibility from their courses” and went on to state that “We need to use digital approaches to support extra-curricular opportunities and richer personal tracking. Our “TEL” tools can enable faster feedback loops and personalised notifications”

Even this is just the tip of the iceberg – it’s not just about tools for replicating or improving analogue practices; the technology that we support and the work we do offer opportunities for entirely new practices. In some ways this links back closely to the other themes that have emerged – how we can shape the culture of the organisation and how we ensure that we are part of the conversation. A shift in pedagogical approaches and philosophies is a much larger thing than determining the best LMS to use. (But at its best, a shift to a new tool can be a great foot in the door for discussing new pedagogical approaches.)

“It is reimagining the pedagogy and understanding the ‘new’ possibilities digital technologies offer to the learning experience where the core issue is” (Caroline Kuhn)

Lesley Gourlay made a compelling argument for us to not throw out the baby with the bathwater when it comes to technology by automatically assuming that tech is good and “analogue” practices are bad. (I’d like to assume that any decent Ed Designer/Tech knows this but it bears repeating and I’m sure we’ve all encountered “thought leaders” with this take on things).

“we can find ourselves collapsing into a form of ‘digital dualism’ which assumes a clear binary between digital and analogue / print-based practices (?)…I would argue there are two problems with this. First, that it suggests educational and social practice can be unproblematically categorised as one or the other of these, where from a sociomaterial perspective I would contend that the material / embodied, the print-based / verbal and the digital are in constant and complex interplay. Secondly, there perhaps is a related risk of falling into a ‘digital = student-centred, inherently better for all purposes’, versus ‘non-digital = retrograde, teacher-centred, indicative of resistance, in need of remediation’.” (Lesley Gourlay)

Another very common theme in the technology realm was the absolute importance of having reliable technology (as well as the right technology).

“Make technology not failing a priority. All technology fails sometime, but it fails too often in HE institutions. Cash registers in supermarkets almost never fail, because that would be way too much of a risk.” (Sonja Grussendorf)

When it comes to how technology is selected for the institution, a number of people picked up on the tension between having it selected centrally vs by lecturers.

“Decentralize – allow staff to make their own technology (software and hardware) choices” (Peter Bryant)

Infrastructure is also important in supporting technologies (Alex Chapman)

Personally I think that there must be a happy medium. There are a lot of practical reasons that major tools and systems need to be selected, implemented, managed and supported centrally – integration with other systems, economies of scale, security, user experience, accessibility etc. At the same time, we also have to ensure that we are best meeting the needs of students and academics in a host of different disciplines, and are able to support innovation and agility. (When it comes to the selection of any tool I think that there still needs to be a process in place to ensure that the tool meets the needs identified – including those of various institutional stakeholders – and can be implemented and supported properly.)

Finally, Andrew Dixon framed his VC elevator pitch in terms of a list of clear goals describing the student experience with technology which I found to be an effective way of crafting a compelling narrative (or set of narratives) for a busy VC. Here are the first few:

  1. They will never lose wifi signal on campus – their wifi will roam seamlessly with them
  2. They will have digital access to lecture notes before the lectures, so that they can annotate them during the lecture.
  3. They will also write down the time at which difficult sub-topics are explained in the lecture so that they can listen again to the captured lecture and compare it with their notes. (Andrew Dixon)

Some practical solutions

Scattered liberally amongst the discussions were descriptions of practical measures that people and institutions are putting in place. I’ll largely let what people said stand on its own – in some cases I’ve added my thoughts in italics afterwards. (Some of the solutions I think were a little more tongue in cheek – part of the fun of the discussion – but I’ll leave it to you to determine which)

Culture / organisation

Our legal team is developing a risk matrix for IT/compliance issues (me)

(We should identify our work) “not just as teaching enhancement but as core digital service delivery” (Amber Thomas)

“we should pitch ‘exposure therapy’ – come up with a whole programme that immerses teaching staff in educational technology, deny them the choice of “I want to do it the old fashioned way” so that they will realise the potential that technologies can have…” (Sonja Grussendorf)

“Let’s look at recommendations from all “strategy development” consultations, do a map of the recommendations and see which ones always surface and are never tackled properly.” (Sheila MacNeill)

“Could this vision be something like this: a serendipitous hub of local, participatory, and interdisciplinary teaching and learning, a place of on-going, life-long engagement, where teaching and learning is tailored and curated according to the needs of users, local AND global, actual AND virtual, all underscored by data and analytics?” (Rainer Usselmann)

“…build digital spaces to expand our reach and change the physical set up of our learning spaces to empower use of technology…enable more collaborative activities between disciplines” (Silke Lange)

“we need a centralised unit to support the transition and the evolution and persistence of the digital practice – putting the frontliners into forefront of the decision making. This unit requires champions throughout the institutions so that this is truly a peer-led initiative, and a flow of new blood through secondments. A unit that is actively engaging with practitioners and the strategic level of the university” (Peter Bryant)

In terms of metrics – “shift the focus from measuring contact time to more diverse evaluations of student engagement and student experience” (Silke Lange)
“Is there a metric that measures teaching excellence?… Should it be designed in such a way as to minimise gaming?… Should we design metrics that are helpful and allow tools to be developed that support teaching quality enhancement?” (David Kernohan) How do we define or measure teaching excellence?
“the other thing that we need to emphasise about learning analytics is that if it produces actionable insights then the point is to act on the insights” (Amber Thomas) – this needs to be built into the plan for collecting and dealing with the data.

Talking about the NSS (National Student Survey) – “One approach is to build feel-good factor and explain use of NSS to students. Students need to be supported in order to provide qualitative feedback” (David Kernohan). (I’d suggest that feedback from students can be helpful but it needs to be weighted – I’ve seen FB posts from students discussing spite ratings.)

“We should use the same metrics that the NSS will use at a more granular levels at the university to allow a more agile intervention to address any issues and learn from best practices. We need to allow flexibility for people to make changes during the year based on previous NSS” (Peter Bryant)

“Institutional structures need to be agile enough to facilitate action in real time on insights gained from data” (Rainer Usselmann) – in real time? What kind of action? What kind of insights? Seems optimistic

“Institutions need at the very least pockets of innovation /labs / discursive skunk works that have licence to fail, where it is safe to fail” (Rainer Usselmann)

“Teachers need more space to innovate their pedagogy and fail in safety” (Silke Lange)
“Is it unfair (or even unethical) to not give students the best possible learning experience that we can?…even if it was a matter of a control group receiving business-as-usual teaching while a test group got the new-and-improved model, aren’t we underserving the control group?” (me)

“I can share two examples from my own experiences:

An institution who wanted to shift all their UG programmes from 3 year to 4 year degrees and to deliver an American style degree experience (UniMelb in the mid 2000s)

An institution who wanted to ensure that all degree programmes delivered employability outcomes and graduate attributes at a teaching, learning and assessment level

So those resulted in:
a) curriculum change
b) teaching practice change
c) assessment change
d) marketing change” (Peter Bryant)

“One practical option that I’m thinking about is adjusting the types of research that academics can be permitted to do in their career path to include research into their own teaching practices. Action research.” (me) I flagged this with our Associate Dean Education yesterday and was very happy to hear that she is currently working on a paper for an education-focussed journal in her discipline and sees great value in supporting this activity in the college.

“I think policy is but one of the pillars that can reinforce organisational behaviour” (Peter Bryant) – yes, part of a carrot/stick approach, and sometimes we do need the stick. Peter also mentions budgets and strategies; I’d wonder if these don’t so much change behaviour as support change already embarked upon.

Technology

“let’s court rich people and get some endowments. We can name the service accordingly: “kingmoneybags.universityhandle.ac.uk”. We do it with buildings, why not with services?” (Sonja Grussendorf) – selling naming rights for TELT systems just like buildings – intriguing

We need solid processes for evaluating and implementing Ed Tech and new practices (me)

Pedagogical

“Could creating more ‘tailored’ learning experiences, which better fit the specific needs and learning styles of each individual learner be part of the new pedagogic paradigm?” (Rainer Usselmann) (big question, though, around how this might be supported in terms of workload)

“At Coventry, we may be piloting designing your own degree” (Sylvester Arnab)
“The challenge comes in designing the modules so as to minimise prerequisites, or make them explicit in certain recommended pathways” (Christopher Fryer)
I went on to suggest that digital badges and tools such as MyCourseMap might help to support this model. Sylvester noted that he is aware that “these learning experiences, paths, patterns, plans have to be validated somehow”. Learner convenience over pedagogy – or is it part of pedagogy, in line with adult learning principles of self-efficacy and motivation? In a design-your-own-degree course, how do we ensure that learners don’t just choose the easiest subjects – how do we avoid the trap of having learners think they know enough to choose wisely?

“digital might be able to help with time-shifting slots to increase flexibility with more distributed collaboration, flipped teaching, online assessment” (George Roberts)


“At UCL we are in the midst of an institution-wide pedagogic redesign through the Connected Curriculum. This is our framework for research-based education which will see every student engaging in research and enquiry from the very start of their programme until they graduate (and beyond). More at http://www.ucl.ac.uk/teaching-learning/connected-curriculum

The connected bit involves students making connections with each other, with researchers, beyond modules and programmes, across years of study, across different disciplines, with alumni, employers, and showcase their work to the wider world…

There is strong top-down support, but also a middle-out approach with faculties having CC fellows on part-time secondments to plan how to introduce and embed the CC in their discipline.

From a TEL perspective we need to provide a digital infrastructure to support all of this connectivity – big project just getting going. Requirements gathering has been challenging… And we’re also running workshops to help programme and module teams to design curricula that support research-based and connected learning.” (Fiona Strawbridge) – liking this a lot, embedding practice. What relationship do these fellows have with lecturers?


“I am imagining that my research, personal learning environment would fit perfect with this approach as I am thinking the PLE as a toolbox to do research. There is also a potential there to engage student in open practice, etc.” Caroline Kuhn

“There may be a “metapedagogy” around the use of the VLE as a proxy for knowledge management systems in some broad fields of employment: consultancy, financial services, engineering…” (George Roberts)  (which I’d tie to employability)

“We need to challenge the traditional model of teaching, namely didactic delivery of knowledge. The ways in which our learning spaces are currently designed -neat rows, whiteboard at front, affords specific behaviours in staff and students. At the moment virtual learning spaces replicate existing practices, rather than enabling a transformative learning experience. The way forward is to encourage a curricula founded on enquiry-based learning that utilise the digital space as professional practitioners would be expected to” (Silke Lange) – maybe but none of this describes where or how lecturers learn these new teaching skills. Do we need to figure out an evolutionary timeline to get to this place, where every year or semester, lecturers have to take one further step, add one new practice?

“Do not impose a pedagogy. Get rid of the curricula. Empower students to explore and to interact with one another. The role of the teacher is as expert, navigator, orienteer, editor, curator and contextualisor of the subject. Use heuristic, problem-based learning that is open and collaborative. Teach students why they need to learn” (Christopher Fryer)


This is but a cherry-picked selection of the ideas and actions that people raised in this hack but I think it gives a sense of some of the common themes that emerged and of the passion that people feel for our work in supporting innovation and good practices in our institutions.  I jotted down a number of stray ideas for further action in my own workplace as well as broader areas to investigate in the pursuit of my own research.

As always, the biggest question for me is that of how we move the ideas from the screen into practice.

Further questions

How are we defining pedagogical improvements – is it strictly about teaching and learning principles (i.e. cognition, transfer etc.) or is it broader – is the act of being a learner/teacher a part of this (and thus the “job” of being these people, which includes a broader suite of tools)? (me)

What if we can show how learning design/UX principles lead to better written papers by academics? – more value to them (secondary benefits) (me)

“how much extra resource is required to make really good use of technology, and where do we expect that resource to come from?” (Andrew Dixon)

Where will I put external factors like the TEF / NSS into my research? Is it still part of the organisation/institution? Because there are factors outside the institution like this that need to be considered – govt initiatives / laws / ???

Are MOOCs for recruitment? Marketing? (MOOCeting?)

“How do we demonstrate what we do will position the organisation more effectively? How do we make sure we stay in the conversation and not be relegated to simply providing services aligned with other people’s strategies” (arguably the latter is part of our job)
“How do we embed technology and innovative pedagogical practices within the strategic plans and processes at our institutions?” (Peter Bryant)

Further research

Psychology of academia and relationships between academic and professional staff. (Executive tends to come from academia)

“A useful way to categorise IT is according to benefits realisation. For each service offered, a benefits map should articulate why we are providing the service and how it benefits the university.” (See https://en.wikipedia.org/wiki/Benefits_realisation_management ) (Andrew Dixon)

Leadership and getting things done / implementing change, organisational change

How is organisational (particularly university) culture defined, formed and shaped?

Actor-network theory

Design research

Some ideas this generated for me

Instead of tech tool based workshops – or in addition at least – perhaps some learning theme based seminars/debates (with mini-presentations). Assessment / Deeper learning / Activities / Reflection

Innovation – can be an off-putting / scary term for academics with little faith in their own skills but it’s the buzzword of the day for leadership. How can we address this conflict? How can we even define innovation within the college?

What if we bring academics into a teaching and learning / Ed tech/design support team?

Telling the story of what we need by describing what it looks like and how students/academics use it in scenario / case study format offers a more engaging narrative

What is the role of professional bodies (E.g. unions like the NTEU) in these discussions?

Are well-off, “prestigious” universities the best places to try to innovate? Is there less of a driving urge, no pressing threat to survival? Perhaps this isn’t the best way to frame it – a better question might be: if we’re so great, what should other universities be learning from us to improve their own practices? (And then, would we want to share that knowledge with our competitors?)

“I was thinking about the power that could lie behind a social bookmarking tool when doing a dissertation, not only to be able to store and clasify a resource but also to share it with a group of likeminded researcher and also to see what other have found about the same topic.” (Caroline Kuhn) – kind of like sharing annotated bibliographies?

Bigger push for constructive alignment
I need to talk more about teaching and learning concepts in the college to be seen as the person that knows about it

In conclusion

I’d really like to thank the organisers of the Digital is not the future Hack for their efforts in bringing this all together and all of the people that participated and shared so many wonderful and varied perspectives and ideas. Conversation is still happening over there from what I can see and it’s well worth taking a look.


Week 3 of the 11.133x MOOC – On fire

In comparison to last week, I ploughed through this week’s work, probably because I have a separate education tech training event on this week – the ACODE Learning Technology Teaching Institute – and wanted to clear the decks to give it due focus.

This week in MOOC land saw us looking at the evaluation phase of a project. Once more, the videos and examples were a little (ok entirely) focussed on K-12 examples (and a detour into ed tech startups that was slightly underwhelming) but there were enough points of interest to keep my attention.

I even dipped back into GoAnimate for one of the learning activities, which was fun.

The primary focus was on considering barriers to success before starting an implementation, and it did introduce me to a very solid tool created by Jennifer Groff of MIT that provides a map of potential barriers across six areas. I highly recommend checking it out.

Groff, J., & Mouza, C. (2008). A framework for addressing challenges to classroom technology use. AACE Journal, 16(1), 21–46.

As for my contributions, as ever, I’ll add them now.



Barriers and Opportunities for using PollEverywhere in uni lectures

There are a number of potential barriers to the success of this project; however, I believe that several of them have been considered and hopefully addressed in the evaluation process that led us to our choice of PollEverywhere. One of them only came up on Friday though, and it will be interesting to see how it plays out.

1 School – Commercial Relationships

Last week I learnt that a manager in the school that has been interested in this project (my college is made up of four schools) has been speaking to a vendor and has arranged for them to come and make a presentation about their student/lecture response tool (Pearson’s Learning Catalytics). Interestingly, this wasn’t a tool on my radar in the evaluation process – it didn’t come up in my research at all. A brief look at the specs for the tool (without testing) indicates, though, that it doesn’t meet several of our key needs.

I believe that we may be talking to this vendor about some of their other products, but I’m not sure what significance this has in our consideration of this specific product. The best thing that I can do is to put the new product through the same evaluation process as the one I have already selected and make the case based on the selection criteria. We have also purchased a PollEverywhere licence for trialling, so this project will proceed anyway. We may just need to focus on a pilot group from the other schools.

2 School – Resistance to centralisation

Another potential obstacle may come from one of our more fiercely independent schools. They have a very strong belief in maintaining their autonomy and “academic freedom” and have historically resisted ideas from the college.

There isn’t a lot that can be done about this other than inviting them to participate and showcasing the results after the piloting phase is complete.

3 School – Network limitations

This is unfortunately not something that we can really prepare for. We don’t know how well our wireless network will support 300+ students trying to access a site simultaneously. This was a key factor in the decision to use a tool that enables students to participate via SMS/text, website/app and Twitter.

We will ask lecturers to encourage students to use varied submission options. If the tool proves successful, we could potentially upgrade the wireless access points.

4 Teacher – Technical ability to use the tool

While I tried to select a tool that appears to be quite user-friendly, there are still aspects of it that could be confusing. In the pilot phase, I will develop detailed how-to resources (both video and print) and provide practical training to lecturers before they use the tools.

5 Teacher – Technical

PollEverywhere offers a plug-in that enables lecturers to display polls directly in their PowerPoint slides. Lecturers don’t have permission to install software on their computers, so I will work with our IT team to ensure that this is made available.

6 Teacher – Pedagogy

Poorly worded or timed questions could reduce student engagement. During the training phase of the pilot program, I will discuss the approach that the teacher intends to take with their questions. (E.g. consider asking ‘did I explain that clearly?’ vs ‘do you understand that?’)

Opportunities

Beyond the obvious opportunity to enhance student engagement in lectures, I can see a few other potential benefits to this project.

Raise the profile of Educational technology

A successful implementation of a tool that meshes well with existing practice will show that change can be beneficial, incremental and manageable.

Open discussion of current practices

Providing solid evidence of improvements in practices may offer a jumping off point for wider discussion of other ways to enhance student engagement and interaction.

Showcase and share innovative practices with other colleges

A successful implementation could lead to greater collegiality by providing opportunities to share new approaches with colleagues in other parts of the university.

Timeline

This isn’t incredibly detailed yet but it is the direction I am looking at. (The numbers in brackets refer to the barriers above.)

    Develop how-to resources for both students and lecturers (3)
    Identify pilot participants (1,2)
    Train / support participants (3,4,6)
    Live testing in lectures (5)
    Gather feedback and refine
    Present results to college
    Extend pilot (repeat cycle)
    Share with university


Oh and here is the GoAnimate video. (Don’t judge me)

http://goanimate.com/videos/0e8QxnJgGKf0?utm_source=linkshare&utm_medium=linkshare&utm_campaign=usercontent

Week 2 of the 11.133x MOOC – getting things done. (Gradually)

The second week (well fortnight really) of the 11.133x MOOC moved us on to developing some resources that will help us to evaluate education technology and make a selection.

Because I’m applying this to an actual project (two birds with one stone and all) at work, it took me a little longer than I’d hoped but I’m still keeping up. It was actually fairly enlightening because the tool that I had assumed we would end up using wasn’t the one that was shown to be the most appropriate for our needs. I was also able to develop a set of resources (and the start of a really horribly messy flowchart) that my team will be able to use for evaluating technology down the road.

I’m just going to copy/paste the posts that I made in the MOOC – with the tools – as I think they explain what I did better than glibly trying to rehash it here on the fly.



Four tools for identifying and evaluating educational technology

I’ve been caught up with work things this week so it’s taken me a little while to get back to this assignment, but I’m glad, as it has enabled me to see the approaches that other people have taken and to clarify my ideas a little.

My biggest challenge is that I started this MOOC with a fairly specific Ed Tech project in mind – identifying the best option in student lecture instant response systems. The assignment however asks us to consider tools that might support evaluating Ed Tech in broader terms and I can definitely see the value in this as well. This has started me thinking that there are actually several stages in this process that would probably be best supported by very different tools.

One thing that I have noticed (and disagreed with) in the approaches that some people have taken is that their tools seem to begin with the assumption that the type of technology has already been selected, and then the educational/pedagogical strengths of this tool are assessed. This seems completely backwards to me, as I would argue that we need to look at the educational need first and then try to map it to a type of technology.

In my case, the need/problem is that student engagement in lectures is low and a possible solution is that the lecturer/teacher would like to get better feedback about how much the students are understanding in real time so that she can adjust the content/delivery if needed.

Matching the educational need to the right tool

When I started working on this I thought that this process required three separate steps – a flowchart to point to suitable types of technology, a checklist to see whether it would be suitable and then a rubric to compare products.

As I developed these, I realised that we also need to clearly identify the teacher’s educational needs for the technology, so I have added a short survey about this here, at the beginning of this stage.

I also think that a flowchart (ideally interactive) could be a helpful tool in this stage of identifying technology. (There is a link to the beginning of the flowchart below)

I have been working on a model that covers 6 key areas of teaching and learning activity that I think could act as the starting point for this flowchart but I recognise that such a tool would require a huge amount of work so I have just started with an example of how this might look. (Given that I have already identified the type of tool that I’m looking at for my project, I’m going to focus more on the tool to select the specific application)

I also recognise that even for my scenario, the starting point could be Communication or Reflection/Feedback, so this could be a very messy and large tool.

The key activities of teaching/learning are:
• Sharing content
• Communication
• Managing students
• Assessment tasks
• Practical activities
• Reflection / Feedback

I have created a Padlet at http://padlet.com/gamerlearner/edTechFlowchart and a LucidChart at https://www.lucidchart.com/invitations/accept/6645af78-85fd-4dcd-92fe-998149cf68b2 if you are interested in sharing ideas for types of tools or questions, or feel like helping me to build this flowchart.

I haven’t built many flowcharts (as my example surely demonstrates) but I think that if it was possible to remove irrelevant options by clicking on sections, this could be achievable.
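
Just to think this through for myself, here’s a rough sketch in Python of how such a flowchart might work under the hood. Every question, answer and tool type below is a made-up placeholder rather than part of the actual tool: the point is just that each answer you click prunes the branches that no longer apply, until only a suggested tool type remains.

```python
# A minimal sketch of the interactive flowchart idea: each node is either
# a question whose answers lead to further nodes, or a leaf naming a type
# of tool. All questions and tool types here are illustrative placeholders.

TREE = {
    "question": "What is the main teaching/learning activity?",
    "answers": {
        "Reflection / Feedback": {
            "question": "Do you need responses in real time?",
            "answers": {
                "yes": "Student response system (clickers / web polling)",
                "no": "Survey or journal tool",
            },
        },
        "Sharing content": "Lecture capture / content repository",
        "Communication": "Forum / messaging tool",
    },
}

def walk(node):
    """Ask each question in turn; choosing an answer discards the other branches."""
    while isinstance(node, dict):
        print(node["question"])
        options = list(node["answers"])
        for i, option in enumerate(options, 1):
            print(f"  {i}. {option}")
        choice = options[int(input("Choose a number: ")) - 1]
        node = node["answers"][choice]  # prune: only the chosen branch survives
    print("Suggested tool type:", node)

if __name__ == "__main__":
    walk(TREE)
```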

Is the technology worthwhile?

The second phase of this evaluation lets us look more closely at the features of a type of technology to determine whether it is worth pursuing. I would say that there are general criteria that will apply to any type of technology and there would also need to be specific criteria for the use case. (E.g. for my lecture clicker use case, it will need to support 350+ users – not all platforms/apps will do this but as long as some can, it should be considered suitable)

Within this there are also essential criteria and nice-to-have criteria. If a tool can’t meet the essential criteria then it isn’t fit for purpose, so I would say that a simple checklist should be sufficient as a tool will either meet a need or it won’t. (This stage may require some research and understanding of the available options first). This stage should also make it possible to compare different types of platforms/tools that could address the same educational needs. (In my case, for example, providing physical hardware based “clickers” vs using mobile/web based apps)

This checklist should address general needs – which I have broken down into student, teacher and organisational needs – that could be applied to any educational need. It should also include scenario-specific criteria.
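
As a quick illustration of how I imagine the checklist stage working (with made-up criteria and tools, not the real checklist), the essential criteria act as a simple pass/fail filter, while the nice-to-haves only come into play later:

```python
# Sketch of the essential/nice-to-have checklist as a filter.
# Criteria and tools are illustrative placeholders, not actual evaluation data.
ESSENTIAL = {"supports 350+ users", "web and SMS submission"}
NICE_TO_HAVE = {"PowerPoint integration", "Twitter submission"}

tools = {
    "Hardware clickers": {"supports 350+ users", "PowerPoint integration"},
    "Web/mobile polling": {"supports 350+ users", "web and SMS submission",
                           "Twitter submission"},
}

# A tool is only fit for purpose if it meets every essential criterion;
# missing a nice-to-have doesn't rule it out.
shortlist = [name for name, features in tools.items()
             if ESSENTIAL <= features]
print(shortlist)  # -> ['Web/mobile polling']
```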

Evaluating products

It’s hard to know exactly what the quality of the tool or the learning experiences will be. We need to make assumptions based on the information that is available. I would recommend some initial testing wherever possible.

I’m not convinced that it is possible to determine the quality of the learning outcomes from using the tool, so I have excluded these from the rubric.

Some of the criteria could be applied to any educational technology and some are specifically relevant to the student response / clicker tool that I am investigating.
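
And for the rubric stage, something like a weighted scoring of the shortlisted products – again, the criteria, weights and scores below are invented purely for illustration:

```python
# Sketch of the rubric as a weighted comparison of shortlisted products.
# Criteria, weights and scores are invented for illustration only.
WEIGHTS = {"ease of use": 3, "cost": 2, "accessibility": 3, "reporting": 1}

scores = {
    "Product A": {"ease of use": 4, "cost": 2, "accessibility": 3, "reporting": 4},
    "Product B": {"ease of use": 3, "cost": 4, "accessibility": 4, "reporting": 2},
}

def weighted_total(product):
    """Sum each criterion's score multiplied by its weight."""
    return sum(WEIGHTS[criterion] * score
               for criterion, score in scores[product].items())

# Rank the products by their weighted totals, highest first.
for product in sorted(scores, key=weighted_total, reverse=True):
    print(product, weighted_total(product))
```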



Lecture Response System pitch

This was slightly rushed but it does reflect the results of the actual evaluation that I have carried out into this technology so far. (I’m still waiting to have some questions answered from one of the products)

I have emphasised the learning needs that we identified, looked quickly at the key factors in the evaluation and then presented the main selling points of the particular tool. From there I would encourage the teacher/lecturer to speak further to me about the finer details of the tool and our implementation plan.

Any thoughts or feedback on this would be most welcome.

Edit: I’ve realised that I missed some of the questions – well, kind of.
The biggest challenge will be how our network copes with 350+ people trying to connect to something at once. The internet and phone texting options were among the appealing parts of the tool in this regard, as they will hopefully reduce this number.

Awesome would look like large numbers of responses to poll questions and the lecturer being able to adjust their teaching style – either re-explaining a concept or moving to a new one – based on the student responses.



These are the documents from these two assignments.

• Lecture Response Systems Pitch
• Clickers Ed Tech Evaluation Rubric
• Educational Technology Needs Survey
• Education Technology Checklist