Doing Difference

Gah! Talk about technical hiccups. Sorry, readers – for some reason the automatic transfer of posts from my other blog to this one did not go as planned. But the good news is that this means there are many new posts to come! This one is based on a lecture about gender difference given to students on my module Contemporary Issues in Management.


 

We often think about difference as something natural. ‘We’re all different from each other,’ we like to think, ‘everyone is special in some way.’ Yet we rarely think about how collective (and individual) difference is carefully produced through regular maintenance work and activity. Women are often more aware of this than men, because there are certain unspoken rules we are explicitly taught by each other about ‘correct’ or ‘good’ appearances – (unintentionally) smudged eyeliner or mascara is a definite faux pas, and not wearing makeup is itself sometimes a political statement.

But this work of emphasising similarity or difference is evident not only in women’s makeup but in a wide range of other everyday actions and activities. Standing in a queue quietly, with upright stance and facing forward, for instance, signifies something different from standing in a queue while slouching at an angle and talking to a friend nearby. The second type of behaviour conveys the message: ‘I’m in this queue, but it’s not terribly important to me. I can get along just fine without keeping my place in it or getting to the front as quickly as possible.’ It might be seen as a mark of status, or of aggression and control, and may indicate the person’s confidence that they can get the item or service they are queuing for by other means, or at another time.

You might be surprised to find out that, although a contemporary issue, the notion of the performance of difference (in studies of gender, at least) was widely popularised in an article by West and Zimmerman (1987) entitled ‘Doing Gender’. If you follow this link and scroll down you will see the large number of articles this publication has influenced, covering a wide variety of topics in business and organization as well as sociology in general.

At the time of West and Zimmerman’s (1987) writing, there was already a clear divide in the study of gender which distinguished between ‘natural difference’, or sex, and ‘constructed difference’, or gender. The subtleties of this distinction were hotly debated at the time and continue to be discussed as key principles of the study of gender. West and Zimmerman argued that we can think about gender as a performed accomplishment: an outcome of continuous, ongoing work and the performance of everyday activities in ways which align with (and reinforce) expectations about ‘masculinity’ or ‘femininity’. But what happens when some of these characteristics are more valued than others? Or when performing activity in a certain way is a job requirement? Based on research in Manchester, Dr Darren Nixon explored how the huge shift in the UK economy towards service sector work, which often requires subservient (‘feminized’) behaviour, disadvantages working class men looking for work, because throughout their lives they have developed everyday patterns of behaviour based on masculine expectations which are not compatible with this type of work. Having learned to be brash, confident in their skills, aggressively independent and plainspoken, these men find that work in department stores and on perfume counters simply does not ‘fit’.

This approach is important when you consider how frequently research is interested only in the business case for diversity in organizations. The ‘business case’ approach often assumes that our identities are fixed by our own decisions, a result of choices freely made throughout our lifetime. What the performative approach emphasises is that many of these decisions may have slipped by unnoticed in our everyday practices of getting by in the workplace and fitting in. As such, small things such as an organizational dress code, or recruitment policies looking for the ‘proper look’ for an organization, neglect the fact that these practices are learned and performed through association with certain communities. The business case approach also attempts to rationalise people’s complex lives and connections to each other as the choices of individual ‘employment applicants’, thereby justifying ongoing practices of exclusion or even harassment.

When thinking about your own expectations of gendered roles, you might want to consider the sorts of things you would list as measures of ‘appropriate behaviour’ among your own group of friends or acquaintances, and how those expectations might change for work colleagues. Think about what you would regard as a challenge to your identity practices. You might find this discussion of the ‘policing’ of appropriate behaviour in an American high school informative. Such behaviour in school might influence what sort of further education or training you would be likely to consider a good prospect. You could also consider which occupations you find least attractive, or even distasteful, and why.

 

 

Why is gender a contemporary issue?

[This post accompanies a taught programme for undergraduate students at Keele University]
Some of you may well have noticed the media reports back in November that, despite legislation to equalise pay between men and women having been law in many countries for over 50 years, progress in gender equality as indicated by the pay gap is still limited, not only in the UK but worldwide. Such media reports focus attention on the persistence of structural inequality, but there are also persistently wide discrepancies in occupation, and in the gendered expectations of certain types of work and how it is performed.
Our lecture on MAN 30047 from Dr Deborah Kerfoot emphasised the significance of how we think about difference as something that is performed in our everyday actions. The associated reading also draws on Bourdieu’s idea of ‘habitus’: the idea that these repeatedly performed attitudes and behaviours become closely inscribed in our identities and in our bodies. You might be surprised to find out that, although a contemporary issue, the notion of the performance of difference (in studies of gender, at least) was widely popularised in an article by West and Zimmerman (1987) entitled ‘Doing Gender’. If you follow this link and scroll down you will see the large number of articles this publication has influenced, covering a wide variety of topics in business and organization as well as sociology in general.

This approach is important when you consider how frequently research is interested only in the business case for diversity in organizations. The ‘business case’ approach often assumes that our identities are fixed by our own decisions, a result of choices freely made throughout our lifetime. What the performative approach emphasises is that many of these decisions may have slipped by unnoticed in our everyday practices of getting by in the workplace and fitting in. As such, small things such as an organizational dress code, or recruitment policies looking for the ‘proper look’ for an organization, neglect the fact that these practices are learned and performed through association with certain communities. The business case approach also attempts to rationalise people’s complex lives and connections to each other as the choices of individual ‘employment applicants’, thereby justifying ongoing practices of exclusion or even harassment.

These expectations do not only affect workers; they are often part of our social experience in education and become part of how we learn what is appropriate to our identity as we grow and age. An excellent article in The Conversation identifies how we might experience these expectations even as very young children. As such, it is not surprising that we come to see the clear discrepancies between the genders in later life as ‘natural’; after all, very few people have clear memories of their developing opinions and expectations as a very young child.
This in-built bias is often addressed through state-sponsored interventions, such as attempts to increase female participation in education in the STEM subjects. But it is not only women who are disadvantaged. Men are also excluded or discriminated against in particular occupations, even where they can make a genuine claim to merit and, as individuals, work hard to present themselves ‘in the right way’ (i.e. a ‘feminine’ way). This article on a blog featuring work by members of the American Sociological Association highlights how, in some occupations, male workers are simply not tolerated because of public expectations around gender performance and ‘natural’ behaviour.

As a student thinking about your own expectations, you might want to consider the sorts of things you would list as measures of ‘appropriate behaviour’ among your own group of friends or acquaintances, and how those expectations might change for work colleagues. Think about what you would regard as a challenge to your identity practices. You might find this discussion of the ‘policing’ of appropriate behaviour in an American high school informative. Such behaviour in school might influence what sort of further education or training you would be likely to consider a good prospect. Take some time to reflect on this and consider what your experience might suggest about the tendency for workers to become segregated into different occupations according to gender.

So many outlets, so little time!

…or, why this blog isn’t more regularly updated

Followers of this blog may have been wondering for some time where its author has got to. In general this is a consequence of writing for other platforms, including academic journals and books, and blogging internally for my students. In order to be more equitable, I have decided to begin simultaneously posting material from my student-focused blog here, for general readers. If you would prefer to see this content on the original site, however, it can be found at http://man30047.blogspot.com

So what is this other blog about?

The MAN 30047 blog is a companion for students studying the module “Contemporary Issues in Management” at Keele University. This module seeks to strengthen students’ knowledge of management and organisations by emphasising a critical approach to contemporary events. In order to direct everyone’s attention to what happens outside, as well as inside, the classroom, the blog encourages students’ active participation with reflections on guest lecture content, links to other source materials and questions for personal reflection. Students have to draw on and reflect upon their experiences of organisations, including work and education, and share them with the rest of the class. As such, these posts may also be of interest to the general reader.

The taught module relies upon the key text Contemporary Issues in Management, edited by Hamilton, Mitchell and Mangan, published by Edward Elgar. It can be purchased directly from the publisher, or through other booksellers and is available in paperback, hardback and e-book.

Magna Carta and the achievement of moral equality

It may have escaped your notice that the Magna Carta is 800 years old today. On the other hand, Google has doodled it, so you may be sick of hearing about it by now. The amazing thing about the Magna Carta (or ‘Great Charter’, ‘huge paper’, ‘big list’) is that its influence lies more in its symbolic relevance than in its original content.

There is a marvellous exhibition on the history and impact of the Magna Carta currently at the British Library, and if you get a chance to see it you will find that many of the provisions laid down in the Magna Carta were very quickly revoked, changed, addressed in a different charter or simply ignored. Yet the symbolic significance of the Magna Carta lies in its existence as evidence of an agreement between unequal parties: a contract between ruler and ruled.

At Runnymede in June 1215, the then monarch of England, King John (yes, that King John! The one previously at odds with a certain be-stockinged outlaw based in Nottinghamshire) was in a difficult spot. The many wars between his brothers and father had contributed to his loss of England’s lands in northern France, and Richard’s involvement in the Third Crusade had drained the coffers of the Treasury, which were further depleted by John’s attempts to reclaim lands in France. The ‘taxpayers’, that is, the English barons (particularly in the North), were not particularly pleased with John’s failures and his constant demands for money. Nor did he have the support of the church: he had fallen out with the Pope some years previously and had barely returned to his good graces when the English barons marched on London. The conditions of the Magna Carta were forced upon John, who promptly ignored them, perpetuating civil war in England until his death (probably from dysentery) in 1216.

Now, I’m not a historian, so why is this contextual idiosyncrasy around the rule of a despotic monarch of interest? The Magna Carta is held up as a precious moment in history by the legal profession, as it symbolises a key moment when the highest secular power (the monarch) was held to account by (secular) law. This raising of the importance of secular law is perhaps why the Magna Carta was immediately challenged by the Church: the very principle of the administration of justice in this world, rather than in an afterlife, is encoded in the document.

The actual contents of the original document, however, were subject to substantial revision, with few unaltered articles remaining in the revised versions. The most substantial effects at the time pertained to the rights of common people to access and use Royal Forest land to gather firewood and pasture animals; the majority of these rights were protected in a later, separate charter (the Forest Charter) under Henry III.

However, the majority of discussions of the impact of the Magna Carta refer to the principles of government rather than those of land use and ownership. Social reform movements appropriated the symbol of the Magna Carta to challenge the authority of imperial rulers (hence the significance of the Magna Carta in the USA), as well as the legitimacy of the rule of the propertied classes. The Magna Carta is therefore identified as influential in the suffragette movement, the civil rights movement and the human rights movement. And so it is the symbolic nature of the event, rather than the sealed vellum parchment, which marks the Magna Carta as the start of a recognition of universal moral equality.

Nonetheless, it is important to reconsider the content of the original and revised charters. For while the principles of moral equality are enshrined in contemporary legal documents such as the Declaration of Human Rights, these documents focus primarily on civil liberties: the right to liberty and freedom from degrading treatment. The later articles, such as the right to desirable work and an adequate standard of living, are rarely discussed. In the original concerns of the Magna Carta, even the privileged English barons challenged the authority of an absent monarch to determine that people should be unable to heat their homes or feed their families. Moral equality is more than an abstract principle, and considering the vast inequalities growing in our contemporary societies, the Magna Carta is perhaps a better touchstone than some for reconsidering how it ought to be implemented today.

The exhibition on the Magna Carta in the British Library continues until 1st September 2015

The British Library website also has loads of amazing information about the charter

Should LARP be not-for-profit?

If there is one oft-cited rule that almost all LARP organisers face it is this: LARP does not make money. In fact, scraping together the cash to ensure you have enough funds to run a future event, keep the group website registered and online, or to pay for prop storage when that convenient friend’s garage becomes unavailable is the constant worry of anyone trying to keep a LARPing group together.

There is, of course, evidence to the contrary: the professional ‘LARPwrights’ of Nordic LARP, the large festival systems that at least make enough money to pay their employees, the adept entrepreneurs who transform LARP into a training activity, or even the savvy LARPers who run the same game twice to save on props.

But here’s the interesting question: is there something about LARP that would be ‘lost’ if events were run on a for-profit basis?

Technology, resources and learning

The role of technology in the learning process is something to which many hours of management consultancy time have been devoted in recent history. It is also something which academic institutions, sceptical at first, have increasingly embraced with open arms, as we are tweeted, podcast, prezi’d and interactive-whiteboarded into a more enlightened future. Technology has nonetheless been a subject of study within sociology for a number of years. Such studies have emphasised that the tendency to identify technology as an independent ‘actor’ outside of social relations is misleading at best (see Grint and Woolgar 1997). Rather, technology can be understood as an extension of the roles humans might perform, as a heavy weight on a hotel key ‘stands in’ for a door steward, or a ‘sleeping policeman’ prevents speeding by standing in for the real thing. Technologies are also not necessarily limited to artefacts. Winner (1986) argued that technology could be defined in three aspects: as apparatus, technique or organization, though admittedly these are difficult to distinguish in practice. This is one expression of a broader philosophical argument which defines technology as an extension of human capabilities in both an abstract and a practical sense (see Rothenberg 1993; Brey 2000): thus a bicycle extends our capacity for swift movement, and a calculator extends our (individual) capacity for arithmetic. On this understanding of technology, many objects in common use for teaching and learning activities fulfil the definition. Some examples might be:

  • improved memory (often through artefacts of data/information storage and categorisation such as books or databases)
  • faster and more reliable calculation (through mathematical techniques and algorithms, and the encoding of these techniques into everyday items such as calculators or computers)
  • consistent, comprehensible methods of communication (the organization of the written word, in all its forms, represents one of the most significant technologies of our society, and is often manifest in software to support word processing, spell checking and so on)
  • extended voice (projection of messages across space and/or time through recording, telephony, radio, translation software et cetera)
  • extended social reach (adaptation of material for people of different languages or abilities, use of online networks to circulate material more widely or to collaborate in virtual ‘classrooms’)

Such examples suggest that technology, as an extension of human capacities, offers powerful potential to improve the efficiency of the learning experience. Yet technological objects offer only potential; capacities have to be realised. Unfamiliar technologies may be as great an impediment to learning as no technologies at all where there is no support for accessing and learning how to use them (again, see Grint & Woolgar 1997). Artefacts often make no distinction between different types of user or student, and as students transition from the routines of sixth form colleges or other educational establishments to the less cohesive structures of diverse university departments, they are likely to encounter significant changes in their experiences with different technologies. In my experience, the learning ‘gap’ for each student with respect to making the best use of the technology will be different.

Generational distinctions are, in addition, often highlighted as a ‘gap’ between the cultures of the ‘out of touch’ academy and the ‘technologically native’ youth. By implication, engagement with up-to-date technologies by the academy offers a bridge by means of which students may be reached. However, my own (anecdotal) experience suggests this may well be mythical, since I have encountered many academics who are thoroughly competent in the use of the most cutting-edge technology, as well as regular Luddites among the student population. In addition, this understanding of technology does not encompass the full range of learning technologies we have at our disposal: although we might often think of PCs and wireless internet access as the main ‘technologies’ used in contemporary learning, technology as defined above encompasses a wide range of learning resources, including printed books and strategies for organizing learning time. This mythical simplification not only presents technology as simple apparatus but also conceals the work undertaken to learn technique and build organization. Within the myth lie two significant potentials: the first is the importance of breaking down barriers between the academy and the ‘real world’ (of students, not the same as the ‘real world’ of the employers/employed or of politics); the second is the accessibility of knowledge.

Do students care about learning technology?

This post was influenced by a key question in the National Student Survey (NSS), which asks students how well their university has provided access to resources. This concern fundamentally relates to access to knowledge. Writing, as the long-lasting record and encoding of knowledge, is one of the most advanced technologies we possess, yet access to written records is becoming ever more complex. This matter concerns issues as diverse as the opening hours and physical book collections of libraries, and the status hierarchies in academia that inform the choice of subscription-restricted peer-reviewed journals. In this way the written artefact becomes embroiled in complicated networks of access which students have trouble navigating, whether through lack of consultation or through lack of training in the technique and organization of these materials. The accessibility of digital media is also relevant: open access podcasts may be freely available, but only to those lucky or wealthy enough to have a reliable internet connection at home, since IT facilities on campus may often be full to capacity or in use for teaching. When I consulted with students on the problem of accessible materials, it became evident that their main concern was navigating these complex circumstances, and the most effective and immediate solution for my teaching was a ‘low’ technology one: providing guidance on accessing print books in the library, printed copies of notes, and links to easy-to-print, accessible PDF documents for core readings which would not take long to access or could be read on alternative electronic devices such as smartphones or tablets. This is not to suggest, however, that a ‘low’ tech solution is always best: to facilitate revision for students on the same module I have found it very useful to develop a series of flashcards using the website studyblue.com, as these can be easily printed or used on any mobile device.

The term ‘blended learning’ is frequently used to describe the application of technology to programme or session design, particularly the use of online delivery of materials or activities. While the definition of ‘blended learning’ is roving and contested (Oliver & Trigwell 2005), there is some discussion over whether this approach to teaching should also be considered ‘theory’. Considering the theoretical approaches highlighted in my previous entry on learning theory, the addition of technology to the process of learning seems to make no intrinsic assumptions about the learning process, though the application of certain technologies may indicate a sympathy with cognitive approaches, by broadening student choice regarding the order of content, or with behaviourist approaches, where technology is used to monitor assessment outcomes (formative and summative alike). To some degree, the discussion of learning technologies highlights their potential for adaptability, suggesting that the fundamental value of such technology lies in its potential to accommodate a range of individual learning needs in a diverse cohort, such as those posed by disability or diverse learning backgrounds. However, this relies on a considerable evaluation of students’ requirements which, if not conducted comprehensively, may act against students’ interests, as they are required to learn not only the content of a given session but also a way of interacting with the technology. Oliver & Trigwell (2005) highlight that a significant limitation of the ‘blended learning’ approach lies in an absence of analysis from the perspective of the learner, but also that technology does offer the potential for varied experiences. Such varied experiences may contribute towards an enhanced learning experience, or place the student under an additional burden of isolation and estrangement from the rest of the class and the learning material.
I feel it is important to recognise that for many students of differing competencies, too much application of ‘learning technology’ may present exactly this risk: impeding their individual learning rather than facilitating it.

Looking at the theoretical confusion surrounding these issues, it seems that while technology may offer significant potential to improve the learning experience, it may also serve to confuse students, and it can suffer from in-built prejudice regarding access to knowledge. Consequently, I believe the implementation of new learning technologies should take account of students’ needs in a comprehensive way, and students should not suffer the consequences of adoption before analysis.

The importance of feedback: Where and When students learn (to learn)

A reflection on a ‘critical incident’: an occurrence during teaching that encouraged me to reflect on my practice and assumptions.

Should the classroom be the ‘learning’ environment, or should ‘learning’ occur elsewhere? I have always felt that the traditional ‘chalk and talk’ method of lectures assumes the classroom to be the space where students are given a guided tour of the literature on their chosen topic, but that they need to visit those foreign shores themselves in that terrifying time often labelled ‘independent study’. In a previous post I suggested that there are particular theoretical answers to ‘why’ students study (their motivations to learn) that are intrinsically linked to our views on ‘how’ they study. This post focuses on the question of, and assumptions about, where learning takes place. In many contemporary universities, students are now also expected to learn ‘virtually’: in times and spaces facilitated by online materials and engagement with technology and social media. This follows from an assumption that all students are digital ‘natives’ already thoroughly engaged in an electronic world. I have a few concerns with the promotion of technological spaces as learning environments, though I will consider these in detail in a future post.

I teach a course in a business school which deals with theoretical areas of sociology: areas of which students have little prior knowledge, and often limited awareness of the historical context in which the relevant ideas have been developed and applied. The size of the class varies but would often be considered ‘large’ for a humanities subject. Many of the students are from China, some from South-East Asia, a few from Europe, and a mixture from throughout the UK. A common learning or cultural background is therefore not something to be taken for granted. None have been required to have previous experience of social science subjects in order to enrol on the course, and many are simultaneously studying finance, economics or accounting. Those subjects are not wholly quantitative, but quantitative work forms a frequent part of their assessment and skill set. Appreciating that the education system often filters students into ‘good at mathematics and science’ versus ‘good at humanities and languages’, I imagine many students of this background enter my compulsory course feeling at a disadvantage. Equally, the majority of students are in the 18-20 year old demographic, and will have recently come from a setting where learning occurs in the classroom, or in set assignments at regulated intervals. The notion of learning according to a timetable is imprinted, draconian fashion, in their institutionalised bodies.

This reflection is based on the following ‘critical incident’: in a large tutorial group (30 students), after going through discussion of that week’s assigned reading, I was explaining the criteria for the written assessment (a conventional academic essay). This assessment was based on the topics we had just been discussing, and students were concerned about the system of grading. I had provided an overview of how the assessment would be graded as a handout in the previous lecture. One particularly adept student asked: “If so much of the grade is based on study skills, why are you teaching us about these concepts and not teaching us to write essays?” I replied that the university supplied various workshops and activities to hone essay-writing skills, and that this was why I had been repeatedly informing them about these workshops at the beginning of lectures. However, the incident gave me pause: students seemed to expect a ‘classroom model’ of total and complete learning delivery, which was simply not the way I had planned to deliver the module; instead I expected and encouraged students to develop these generic skills through independent study.

This seems to suggest a conflict between the more ‘behaviourist’ model of learning expected by students (see Theories of Learning) and an independent or self-directed model which is significantly more ‘constructivist’ in thinking. One development in ‘constructivist’ approaches to learning attempts to incorporate elements of practice common to the behaviourist model, and is regularly cited: Kolb’s (1984) model of experiential learning.

Kolb’s (1984) model of experiential learning comprises a cycle of learning activity, whereby students progress between different types or styles of activity to learn from active engagement. On the horizontal axis, the model presents observation and action (much like stimulus and response), while the vertical axis attempts to ‘fill in the black box’ with internal cognitive processes.

Kolb’s learning styles (copied from ruspat.wordpress.com)

This distinction between mental processes harks back to the Greek distinction between techne, or hands-on knowledge, and episteme, ‘justified true belief’ (or abstract knowledge). This approach to learning further suggests that learners have particular preferences for different stages in this cycle, and so may be at their most effective in different environments. However, the entire cycle is the aim. The roles of the teacher and the pupil, according to this pluralist approach, are more complex. The teacher is responsible for ensuring that as many of the stages as possible are represented and facilitated through learning activities under their control, but it is the student’s responsibility to actually go through the process. While this process is likely to incorporate many aspects outside the teacher’s control (in particular, concrete experience), and even, in the case of higher education, outside the sphere of the degree programme, it is implicit in these theoretical approaches that student engagement will follow from appropriate course and activity design.

Returning to my course design in light of Kolb’s theory, I had expected that reflective observation and abstract conceptualisation were tasks for the classroom, whereas concrete experience and active experimentation in social and business problems (the topic of the course) and in reading and writing practices (the medium of learning) were outside my control. Although I was certain that work and life experience would help students better appreciate the content of the course, I was not sure whether embedding teaching activities on how to write into the module would in fact benefit students, or whether my role was to communicate more strongly the extent to which they had to engage independently with the process of developing their writing skills.

Allan and Clarke (2007) discuss the difficulties of designing programmes with embedded teaching of study skills, compared to those without. In their research, they distinguish between ‘generic’, ‘subject-related’, and ‘metacognitive’ skills.

Generic skills are those such as effective communication through presentation and/or writing, using information technology, and working with others. The authors found that for some students, formal teaching of these skills could build confidence and improve expertise. Other students did not engage successfully with the activities, for various reasons all relating to a perception that the training lacked relevance.

Subject-related skills are those directly related to the learning activities and assessments specific to the programme or module. For this module, that might include reading and comprehension skills (especially evaluating the author’s meaning against the student’s own interpretation), essay planning and writing, note-taking, and producing answers under exam conditions. In Allan and Clarke’s (2007) study, some students considered these irrelevant, or found that the particular matters relevant to them were not covered in sufficient depth.

Metacognitive skills relate to students’ awareness of their own performance and areas for improvement, and in this sense are directly tied to assessment and feedback. Encouraging these skills specifically means fostering a more reflective awareness and personal development. Students on the whole engaged with activities related to this aspect of teaching.

Following from their research, Allan and Clarke (2007) advocate embedding the teaching of subject-related and metacognitive skills within subject teaching. They further imply that this requires commitment from several lecturers across entire degree programmes, but also indicate that further research is needed to establish whether it is effective for students.

Considering the issue further:

Following this reflection, I have made strong attempts to incorporate some subject-related skills into the course, although I have made a less concerted effort to explicitly address metacognitive skills. To develop reading and comprehension skills, I now provide more preparatory exercises, such as questions on the weekly readings which incorporate components from Bloom’s taxonomy. Students are given access to online flashcard sets which allow them to take multiple-choice tests on these questions, incentivising week-by-week completion and allowing them to check their progress. Essay planning and writing skills are promoted through the provision of resources and a specific session of relevant activities prior to the essay submission, but these are not motivated through clear (behaviourist) rewards.

Due to the limits on classroom time, few explicit sessions on developing metacognitive skills are included in the course. What the module does at present is encourage students to develop these skills through implicit demands made of them in the classroom, such as asking them to relate their answers to their own experience, or to things they may be familiar with from the news or even popular fiction. There is also one explicit session at the beginning of the course, prior to any teaching of content, where metacognitive skills are addressed directly. Allan and Clarke (2007) suggest that these might be embedded in subject teaching, but incorporating them in a single module might be counterproductive given the short timeframe (12 weeks) and the repetition across other modules. At this point, I feel that a single classroom session, with the option of further one-to-one discussion on specific assignments, supports the development of metacognitive skills well without fatiguing students with repetition.

I do have concerns that attempting to ‘embed’ these skills in a module too strongly could result in overburdening or over-assessing students. Many of the generic skills are now being introduced as a compulsory part of initial study at a number of universities, but there is (understandably) no corresponding decrease in expectations for academic content. Where and when students learn is also a key concern of applicants, who often wonder where their money (from the increased student contribution to fees) is being spent. There is a competitive view of contact hours among applicants (and their parents) which seems to demand more classroom time, and which implies that responsibility for learning outcomes rests with the teaching rather than the learning effort. Attempting to placate these demands with unspecified additional time in a room with a tutor, or some nifty new social-networking learning platform, without thorough consideration of where and when students learn metacognitive skills as well as content, seems fraught with peril for any university.

Meaningful Work

Thanks to the ESRC Festival of Social Science, last week, with the support of the New Vic Theatre, Newcastle-under-Lyme, I ran an event asking individuals to consider what they felt stood in the way of meaningful work. While there has been plenty of academic research on this topic, as well as related concerns about the quality of work in the form of ‘good’ or ‘bad’ jobs, the search for meaningful work, as an academic topic and an everyday activity, seems to fade into the background when many people count themselves lucky to be earning enough not to need to rely on food banks just to get by.

The workshop was led by Sue Moffat, director of New Vic Borderlines and advocate of the use of theatrical techniques to get people to engage with each other and express their shared knowledge. As part of the workshop we played games examining how we learn to trust the people we work with, and how a competitive urge developed, encouraging us to challenge some individuals and make alliances with others. We then talked about this as a group, exploring how important social camaraderie at work can be in making it a meaningful experience, and how some types of paid work were only meaningful in enabling independence and freedom to do things in other aspects of life. We also listened to recordings about work, thinking about how the sounds and sensations of working could play a part in bringing meaning to a community as much as to individual people, and reflecting in particular on how the disappearance of those sounds and sensations could leave a feeling of loss.

Much of our later activity, building a narrative around images and objects in the theatre, reiterated these themes about society, community and individual approaches to meaning. Using large metal frames we entangled teacups and wallets, stethoscopes and teddy bears. A story of the voyage towards meaningful work was written, considering the importance of the crew aboard the vessel, the storms and dangers of the deep seas, the provisions needed to survive the trip, and the search for dry land. While these metaphors may seem fanciful, they allowed everyone participating in the workshop to explore their shared experiences easily, based on how they interpreted these objects and events. Throughout, we discovered that meaning was elusive, and could be challenged or built through our relationships with others. We explored how many of our everyday frustrations with work were those which challenged its goals or meanings, and how the money obtained through paid work was not enough to fulfil our desires for a meaningful life, and for meaningful work to occupy it.

For more information about the New Vic Theatre, follow this link.

This event was followed by an evening discussion about what business can do for society, hosted by Keele University Management School. There will be a follow up post on this next week.

Theories about Learning, Motivation and Practice

Following a recent post on a friend’s blog about undertaking postgraduate certificate qualifications in teaching at university, I thought I would start the process I have been promising myself for months now: publishing my blogs on learning to teach. NB: some of this material has recently been submitted for assessment purposes, so enjoy the read but don’t quote it in your own teacher training programme!

I formally started the teaching-at-university programme about six months after I began working at my current university. Unsurprisingly, as at many universities, the programme was not held in high regard by academic staff, mostly because they were compelled to undertake it and had developed (over the course of the PhD, or over many years of research-focused work) some cynicism towards the programme tutors. Broadly, this cynicism related to three factors: (1) a belief that students who are motivated to learn do so regardless of the techniques applied by lecturers; (2) a view that programme tutors did not sufficiently account for the constraints on lecturers arising from large class sizes, limited resources and bureaucratic impediments to change; (3) scepticism about the political aims behind the programme and whether it signified a move to a ‘customer-oriented’ model of teaching that fundamentally undermines the authority of the lecturer as ‘expert’. Following from this third element was a critical attitude towards the political status of universities in the UK and the consequences of changes to student fees and recruitment in the most recent attempt to create a higher education ‘market’. But I’ll come back to this issue in a later post.

Today’s post focuses solely on point (1): theories about learning and the motivation to learn, summarising two broad theoretical approaches: behaviourism and cognitivism. What is interesting is that each model assigns a different role to the teacher, and requires them to engage with students in a different way. Each also suggests that different rewards or learning environments will produce varying results in how much, and how well, students learn.

These approaches to the study of learning have much in common with the fields of psychology and social psychology generally, and as such I have been somewhat sweeping in the assertions that follow. Each has its historical place in influencing learning institutions and systems, and consequently some aspects of learning, teaching and assessment that are often taken for granted can be traced to different parts of these theories.

Behaviourism: looking at external action not internal subjectivities

Behaviourism is one of the earlier approaches to learning, resting on the notion that since the internal workings of the mind are objectively unknowable, only external factors can be studied. “Learning is defined simply as the acquisition of new behaviour” (Pritchard 2008: 6). Central to this is the premise that all creatures respond to stimuli so as to increase positive experience and decrease negative experience. Key theorists include Watson (1958), Skinner (1953) and Thorndike (1966). Historically, this approach to psychology was particularly functionalist, and much of the research in the area focused on ‘conditioning’ subjects into a particular habit of response. You may have heard of famous examples of this sort of research, such as Pavlov’s dog experiments, in which dogs were trained to associate the sound of a bell with food, such that eventually, even when no food was present, the sound of the bell would make them behave as if it were.

While many conditioning experiments may seem crude, or even laughable, by today’s standards, they were incredibly influential in their practical implications. The perspective was not universally well received, however, as it placed human beings in the same category as any other kind of animal. Skinner’s (1971) Beyond Freedom and Dignity is a particularly vehement response to his critics, arguing that humans had to ‘get over’ their belief in their own special status if society was to be functionally improved. The experimental approach was also criticised for oversimplifying the study of behaviour (see Eddie Izzard’s sketch about Pavlov’s cat for a comic example of what happens when not all variables are controlled).

Based on this simple view of student motivation as merely a learned response to stimuli, learning approaches that adopt it might be summarised as ‘carrot’ or ‘stick’ techniques. Approaches as different as the Victorian ‘spare the rod and spoil the child’ and contemporary practices around the need for ‘positive feedback for psychological engagement’ all fit within it. Any focus on rewards for correct behaviour is underpinned by behaviourist theory, whether a directly ‘conditioned’ response or a ‘shaping’ (using goal-setting approaches) towards ideal behaviour.

The limitations usually listed for a behaviourist approach to designing learning activities include a limited or ‘surface’ approach to learning: the desired response can be produced without understanding developing at a ‘deeper’ level, so the approach is confined to rote learning (Pritchard 2008).

An interesting aspect of behaviourism, however, is that it places the responsibility for ‘correct learning’ squarely upon the teacher, provided the student complies with the system. It is the teacher’s responsibility to identify desired behaviours and reward them appropriately. Moreover, students may come from schools or colleges that use this sort of approach, and are therefore already ‘conditioned’, to an extent, to expect this kind of learning activity and reward.

Cognitivism (or Constructivism): Looking inside the black box

A different approach to learning is apparent in cognitivism. Focusing on the workings of the brain from multiple perspectives, cognitivism gives primacy to the idea that learning is an internal process. Much of the research underpinning these theories comes from developmental studies with children, or with those experiencing developmental difficulties. The underlying principle contends (against behaviourism) that learners are active agents in the learning process, and that learning should be approached holistically (an idea associated with ‘gestalt’ theories). This suggests that students respond to patterns as much as to individual stimuli.

Many different approaches tend to be clustered under the cognitivist label. Two early theorists in the area are Piaget (1926) and Vygotsky (1978). While both share similar principles, they differ in the priorities they give to particular aspects of the learning experience. Vygotsky (ibid) focussed on the social interaction between teacher and learner, stressing that it is within this relationship that the teacher can help provide a framework (and break down earlier frameworks) which the learner then strengthens and models for themselves. Piaget, by contrast, stressed that the learner engages independently with artefacts provided by the teacher and develops knowledge which is incorporated into a schema (a sort of subjective framework; see Smith, Dockrell & Tomlinson 1997). Both theorists stress the significance of activity undertaken by the learner, alone or with the teacher, as a key part of the process (Jarvis 2003).

Compared to the behaviourist approach, the constructivist approach, as a consequence of its more subjective understanding of learning (by experience), tends to offer a view of learning that allows pluralistic versions of knowledge (i.e. there is space for more than one ‘correct’ answer or way of doing things). The behaviourist view, by contrast, takes a much stricter position on what does and does not constitute legitimate knowledge, implying a one-way transmission of that knowledge from teacher to learner. The two approaches also commit to different priorities and techniques for the design of the teaching and learning environment. Certain training programmes may naturally tend towards the behaviourist perspective, where some interpretations or behaviours are considered illegitimate, misguided, or even dangerous, whereas disciplinary areas more tolerant of pluralism may incline towards a cognitivist view.

A synthesis of constructivist and behaviourist theoretical leanings is apparent in the majority of current approaches to institutionalised learning, perhaps thanks to inherited behaviourist systems of the past, or the failure of cognitivist learning experiments to revolutionise teaching styles. One frequently used reference point which demonstrates this is Bloom’s (1956) taxonomy of (cognitive) knowledge[1]. Bloom’s taxonomy presents multiple ‘building blocks’ in a progressive hierarchy of knowledge attained through learning, where achieving each stage requires proficiency in the stage below (this strongly informs international standards for comparing levels of achievement across qualifications).

Bloom’s taxonomy (diagram)

The original presents a continuum amounting to a programme suitable for behavioural ‘shaping’, but it also stipulates the cognitive activities students are expected to undertake. Bloom’s framework was revised in 2001 to reflect changes in educational language more comprehensively and to incorporate the type of knowledge the student is expected to master (factual, conceptual, procedural and metacognitive), as well as the cognitive process they engage in to do so (Krathwohl 2002). There have been critiques of Bloom’s taxonomy, however, which suggest that the hierarchy of cognitive approaches should be reversed, and that the production of knowledge in the form of ‘facts’ is a hard-won outcome of the other processes (Wineburg & Schneider 2010). After all, that is how research produces knowledge in scientific endeavour!

Wineburg and Schneider’s (2010) argument can be seen as revisiting Bloom’s framework in a way that highlights a shift away from behaviourist models of learning towards cognitivist approaches. A behaviourist approach to learning, with its focus on stimulus-response-reward, privileges a base of facts accumulated through rote learning, followed by study in the skills of manipulating those facts for logical analysis and evaluation. In this presentation of Bloom’s taxonomy, the teacher provides students with ‘legitimate’ knowledge in the form of facts, then slowly leads them through a process in which each stage is reinforced through reward, often in the form of good test marks, though sometimes using more mundane rewards (such as sweets or book tokens). Wineburg and Schneider (ibid) argue that the taxonomy may instead be read in the opposite direction, with knowledge the outcome of the learning process rather than its base. This derives from a more constructivist approach which builds upon the notion of the learning ‘scaffold’ (see Sylva 1997).

 

[1] It is important to recognise that the committee Bloom chaired intended to encourage a synthesis between three different types of learning: cognitive, affective and psychomotor (see Krathwohl 2002). I have rarely come across discussion of the latter two dimensions at university, which may indicate how far such discussions have penetrated the educational domain.