
sra: RESEARCH MATTERS JUNE 2013

INSIDE Branched survey questions | Ethics: sharing practice and learning | Families in austerity | The Great British Class Survey | Propaganda: power and persuasion | Plus usual news, reviews and briefings

Challenging the abuse of social statistics and social research
By Ceridwen Roberts

I am sure you will have noticed, over the last few weeks, growing concern about the misuse of social science statistics and findings. The UK Statistics Authority has had occasion to publicly rebuke the Department for Work and Pensions for the wrong interpretation of the department’s statistics. Mr Duncan Smith’s claims that 878,000 people dropped their claims for sickness benefits as they did not want to face the new medical assessment, and that 8,000 people moved into work as a result of the benefit cap, are not supported by the official statistics. At the same time, the Education Secretary, Mr Gove, quick to criticise the teaching profession for low standards, has been shown to be less than rigorous with his use of ‘survey’ data, using as evidence surveys described as ‘amateurish, politically biased or irrelevant’ to the political point he was making.

Misusing and misquoting social science data is nothing new – many of us have experienced our press releases or quotes being used out of context and presenting a travesty of our results. And, of course, almost every day we read about or hear of a ‘survey’ trumpeted as revealing startlingly new findings which, on examination, are at best nothing of the kind and at worst seriously wrong. All this gives social science research a bad name, distorts everyday understanding of the world and, if used in political debate, is seen as an attack on democracy.

The Public Administration Select Committee is currently scrutinising the issue of official statistics and their use in government, and conducting ten short studies. While this is to be welcomed, and the reports make interesting reading, this does not really deal with the wider issue. The SRA is increasingly concerned about this problem and the Strategy Group will be discussing this at its forthcoming meeting with the acting chair of the Campaign for Social Science, Professor Michael Harloe. Tackling the misuse of social research needs a co-ordinated effort across the whole spectrum of social science providers.

The website www.fullfact.org, which is supported by the Joseph Rowntree Foundation, the Nuffield Foundation and the Esmée Fairbairn Foundation, aims to make it ‘easier to see the facts and context behind the claims made by the key players in British political debate and press those who make misleading claims to correct the record’. It is having some success, as did the website ‘Straight Statistics’. The natural scientists are tackling the problems of inaccuracy, misuse and misreporting and, we think, so should the social science community.

How to go forward? This is something we do need your help on. Please let us know of instances when your research, or research you are familiar with, has been misused in this way. We need to build up momentum about the problem so that persuading other social scientists and funders that something must be done becomes easier. I would like us to have a section on the website where these instances can be reported, anonymously if need be. But we also want your views on whether this is a problem and what you think we should do. The Strategy Group meets on 27 June. Emails to the office about this before then would be a great start.

Misusing and misquoting social science data is nothing new – many of us have experienced our press releases or quotes being used out of context and presenting a travesty of our results

SRA RESEARCH MATTERS: JUNE 2013: 2

Can we live without the Census? SRA Summer Event
Sarah Cheesbrough, National Audit Office

In 2010, Francis Maude announced government plans to scrap the national Census. With the 2011 Census costing over £480 million, he said it had become an expensive and inefficient way of counting the population. He argued that if we made better use of the data held by the NHS, local councils, the Royal Mail and HMRC or DWP, as well as credit card and utility companies, we could both save money and have more up-to-date information. But if we scrap the Census, do we risk ending up with data that is inadequate and incomplete for the policy decisions and resource allocation that depend on it?

These are the questions being tackled by the Office for National Statistics in its Beyond 2011 programme. The ONS is considering whether, if we continue with a Census, it could be shorter or even conducted on a rolling basis with about 10 per cent of the population each year. Alternatively, how much would administrative data need to be supplemented by large-sample surveys to make sure we know enough about household characteristics? All of these options have implications, not only for the immediate Census user community, but also for wider social research, which often depends on Census data as a starting point for further in-depth investigation or as a barometer against which to check the representativeness of a survey. Even if you do not consider yourself to be a Census user, it is time to take stock of how such changes could affect our work.

Chaired by the BBC’s Mark Easton, this year’s SRA Summer Event, ‘The Census: now and in the future’ will bring together Census experts and commentators to provide a lively and accessible update on the value of the Census and what we might expect by 2021. As well as presenting the latest from the Beyond 2011 programme, speakers will demonstrate how much we rely on the Census and the risks that some of the changes under consideration pose to providing local services in particular. We will look at the administrative data available and touch on how other countries collect key socio-demographic information. By the end of the afternoon, as well as building up a thirst for the drinks reception on the Local Government House roof terrace, we hope to be better judges of whether spending 87p each per year on the Census is, after all, money well spent.
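That 87p figure can be sanity-checked with simple arithmetic. A rough sketch follows; the ten-year spreading of the cost and the roughly 55 million England and Wales population are our assumptions, not figures from the article:

```python
# Back-of-envelope check of the 87p-per-person-per-year figure.
# Assumptions (ours, not the article's): the ~£480m cost is spread
# over the ten-year gap between Censuses, and the relevant
# population (England and Wales) is roughly 55 million.
census_cost_pounds = 480_000_000
years_between_censuses = 10
population = 55_000_000

pence_per_person_per_year = (
    census_cost_pounds / years_between_censuses / population * 100
)
print(f"{pence_per_person_per_year:.0f}p")  # roughly 87p
```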

See page 5 for details.


sra: EDITORIAL

The schlock of the new
SRA chair, Patten Smith, on technology for technology’s sake

Over the past year I have seen a couple of government-issued briefs for complex social research studies which exhibited a disturbing feature that I have not previously seen in such a developed form. This was a strongly-made implication that ‘cutting-edge’ data collection methods (use of web-based questionnaires, smartphone platforms, and so on) would be valued more highly than their traditional counterparts (postal and other paper self-completion questionnaires, for example). In itself, this preference is innocent. Indeed, were it coupled with an appropriate ‘ceteris paribus’ clause, the preference for the modern might be regarded as representing a commendable attempt to welcome the future and hasten the advent of the inevitable. Unfortunately, this was not how the documents read. The preference for the new appeared to be absolute whether or not other things were equal: the implication was that innovative methods were preferred over traditional ones even if the latter would deliver better data or greater survey cost-effectiveness. In summary, it appeared that methodological proposals were to be assessed not merely against traditional criteria – their ability to deliver good quality data at a low cost – but also against a new criterion of novelty.

Taken together with a second, and this time commonly observed, feature of the briefs, this created an unintended irony. This second feature was a powerful insistence that defined response rate targets should be met. In a general sense this was a good thing: it demonstrated a commendable concern with quality, a concern which should help ensure that, even if the high value placed on innovation per se were to render the studies sub-optimal in cost-effectiveness terms, the studies would still be subject to rigorous quality standards. But in the choice of specific quality criterion – response rate – an irony was generated. Why? Because survey researchers who are at the cutting edge of methodological research consider response rate to be a very poor measure of survey quality: the main kind of error arising from non-response is non-response bias, and response rate has been found to be a near-useless measure of this quantity. Adding weight to the irony is the fact that, over recent years, survey methodologists have instigated a range of developments which could undoubtedly have had a place in the studies being tendered: these include new approaches to non-response bias measurement, and the development of adaptive/responsive designs and smart uses of paradata.

To spell out the irony, the briefs placed a high value on the cutting-edge as it related to data collection methods regardless of quality considerations, but ignored the cutting-edge as it related to minimising error in conducting research.
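The point about response rates can be made concrete with a toy illustration (ours, not the editorial’s). For a simple mean, the bias of the unadjusted respondent estimate is (1 − response rate) × (respondent mean − non-respondent mean), so a survey with a higher response rate can easily carry more non-response bias than one with a lower rate:

```python
# Toy illustration of why response rate alone says little about
# non-response bias. For a mean, the bias of the unadjusted
# respondent estimate is (1 - RR) * (respondent mean - non-respondent mean).
def nonresponse_bias(response_rate, mean_respondents, mean_nonrespondents):
    """Bias of the respondent mean relative to the full population mean."""
    return (1 - response_rate) * (mean_respondents - mean_nonrespondents)

# Survey A: modest response rate, but non-respondents resemble respondents.
bias_a = nonresponse_bias(0.50, mean_respondents=40.0, mean_nonrespondents=39.0)
# Survey B: higher response rate, but non-respondents differ sharply.
bias_b = nonresponse_bias(0.70, mean_respondents=40.0, mean_nonrespondents=30.0)
print(round(bias_a, 2), round(bias_b, 2))  # 0.5 3.0
```

Here the survey with the *higher* response rate (B) is six times more biased, because bias depends on how different the non-respondents are, not just on how many there are.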

I have not focused on these studies because they have features which made them exceptional, but rather because I worry that they represent an emerging trend by which technological novelty is coming to be fetishised purely for its own sake – perhaps reflecting broader trends in the ways in which the mass media discusses technological developments. Ultimately, in doing research that is worthy of the name, we should be aiming to maximise the quality of what we do given the inevitable constraints of cost and time. And to do this, we should judge our methods against underlying standards that are immune to the vagaries of technological fashion. We are, of course, often well served by embracing the latest technological developments, but sometimes we are not; and knowing which of these responses is appropriate is surely all-important when decisions are being made about how best to spend public money on important research.

We are, of course, often well served by embracing the latest technological developments, but sometimes we are not; and knowing which of these responses is appropriate is surely all-important


sra: NEWS ROUND UP

SRA Cymru
Members in Wales are getting involved with our breakfast socials. We have held three of these so far this year. They are a great way to meet other social researchers and find out about work going on in local areas.

Our LinkedIn group continues to grow, giving people the opportunity to promote their research, find jobs and connect with people they meet at our local events. In particular, the group has helped to find volunteers for SRA members’ projects, whether as research participants or for research design. Thanks to everyone who is joining, using and making new connections with our group.

We are planning our autumn programme of activities and this will include our first thematic event, which will look at how research evidence is used in healthcare policy. Register for this event, planned for September, at www.the-sra.org.uk

Partnerships with other organisations continue to develop and we will be at the WISERD conference on 25 and 26 June, raising our profile within academia in Wales. We are also developing a programme of events with the Welsh Government, which will feature a cycle of regular annual events promoting surveys which have a significant Welsh Government investment, starting with Understanding Society.

As ever, we are always on the lookout for people who are willing and eager to help us in our endeavour to bring local social researchers together, so do get in touch with us through our LinkedIn group, by email at [email protected], or on Twitter @sracymru

SRA Scotland
We have had a busy few months. In April we were delighted to welcome a new member to the committee, Shanna Dowling from ScotCen. In May, we ran our first evening seminar of the year, which explored the use of forum theatre in research with vulnerable groups.

Keep a look out for upcoming seminars later in the year on social return on investment (SROI) and the Growing Up in Scotland (GUS) study.

We’ve also made a foray into social media as a committee, to help us keep up to date with emerging activity in the social research world and to stay in closer contact with our members. Get involved by joining our LinkedIn group or following us on Twitter – we look forward to connecting with you!

If you’d like to hear more about the work of the committee, or are interested in facilitating a seminar or event, please visit the SRA Scotland page on the SRA website, or contact our chair, Sophie Ellison at [email protected]

Members area now on the SRA website
A new feature on the SRA website allows SRA members access to members-only information. This includes discount codes for training courses and events; the application form for indemnity insurance; and member profiles (visible only to the member) which they can edit and update. We will add more to the section over time – we welcome your ideas for this.

This means we can now make two major improvements: an online directory, which members can choose to join; and the facility to renew membership online with a credit or debit card. The directory is being set up by Ian Henghes of Diditon, who created our new website last year. Online renewals become available once members are registered on the members area, a process we are now starting.

As members’ profiles are set up in the coming weeks, we will email you individually with your username and password so you can get into the members area. Please do take a few minutes to visit the area, check your profile (edit as needed), change your password to something more intelligible, and have a look around.

SRA Ireland
A recent workshop provided participants with an insight into developing outputs from research beyond written reports.

Dr Claire O’Connell, freelance journalist and contributor to The Irish Times, explored the media landscape and discussed how to pitch research stories and work with journalists and others. Kate Morris, project specialist at the Centre for Effective Services, discussed how to present research findings to different audiences, giving practical tips on writing briefing papers and developing podcasts.

We also met to look at the type of events we might hold later this year. We are planning to run a training event on procurement issues and managing research projects to include commissioning research and responding to tenders. We are also planning a seminar series on accessing data and library and archive information.

More information: [email protected]

SRA news and updatesFind out the latest news from the SRA along with details of training and events at www.the-sra.org.uk


Have you published something research-related recently?
We have just started to let SRA members know about work by other members which has been published in hard copy or online. This is now being publicised in our fortnightly e-newsletter, and links to the published items are going up on our website – look under Resources/Publications.

We are looking for a wide range of material – a report, a piece of analysis, a presentation, a briefing document, a ‘think piece’ on a website or a blog, a journal article, a book, and so forth. Basically, anything that is yours, and which may be of interest to social research colleagues.

If you have something suitable, just send us an email with the following details:

1. Title

2. Author name(s) and affiliation(s)

3. Year of publication

4. Web link

You must be the author or a co-author of the piece, and we’ll need a web link leading either to direct online access, or to information about how to get hold of your work. We will publicise this in a new section of our e-newsletter. We may also tweet about it, if the content is freely available online.

We hope this new service to members will benefit everyone. Please email [email protected] with the heading ‘members publications’.

Helen Kara

sra: NEWS ROUND UP

On the move
Zoe Ferguson recently took up post as chief researcher in the Scottish Government. She joined the civil service in 2000 from a background in social research in labour market issues and local economic development, working at the training and employment research unit at Glasgow University and Ekos Ltd, after graduating with a degree in Geography from Glasgow University. She has worked in government research in enterprise and lifelong learning, social justice and organisational development and is now returning to social research after several years working in policy, in education and culture, and most recently, finance. Zoe says, ‘I am delighted to be back in social research at such an exciting time. I think researchers have a crucial role to play as we are all thinking about what kind of nation we want to be, whether independent or part of the UK, and public finances dictate the need for new ideas about how we deliver services. I hope the range of experience I have gained will bring a fresh perspective to challenges for the profession.’

After 34 years in the civil service and almost 20 as chief research officer in the Department of Employment, DfEE, DfES, DCSF and then the Department for Education, Richard Bartholomew will retire at the end of June. A new head of research in DfE is being appointed through a cross-government competition.

Richard Bartholomew is well known across government and in the wider social policy research community, including through his role as joint head of Government Social Research. With his retirement, Jenny Dibden will continue in her role as head of GSR. Over the next year, the government chief scientist and the head of the new What Works initiative will assess the case for appointing a government chief social scientist.

SRA Summer Event

The Census: now and in the future
26 June, Local Government Association, London
Chaired by Mark Easton, BBC Home Affairs

Speakers:
◗ How is the Census faring? Ian Cope, director, Population and Demography, ONS
◗ What’s needed locally? Ludi Simpson, University of Manchester, president of the British Society for Population Studies
◗ Can the use of big data eliminate the need for another traditional Census in 2021? Keith Dugmore, director, Demographic Decisions Ltd and honorary professor, UCL

Discussant: Simon Briscoe, author and journalist
Drinks reception after the event on the LGA’s roof garden overlooking Westminster.
Fee: £65 (SRA members £45)
Online booking on the SRA website: www.the-sra.org.uk/events

Save the date
SRA annual conference 2013: Getting social research into policy and practice
Monday 9 December, British Library

Call for papers: keep checking at www.the-sra.org.uk


To branch or not to branch?
In our second methodology article, Nick Allum and Emily E. Gilbert, University of Essex, explore the benefits of branched survey questions

We are all familiar with the traditional strongly agree/strongly disagree Likert scale. Such scales are quick and easy to administer and understand, and have become the go-to format for measuring attitudes in surveys. However, much research has suggested that their use can lead to problems, including a tendency towards acquiescent responses, non-differentiation or straight-lining in responses, and poor reliability. As a result, better alternatives are being sought.

An idea gaining momentum is the use of branched survey questions. In the branched format, respondents are asked first about the direction of their attitude (positive, negative or neutral) and then, in a follow-up question, about the intensity of the attitude (Krosnick and Berent 1993). The potential advantage of this method is the reduction of cognitive burden on the respondent. It has been argued that branched questions can overcome some of the problems associated with traditional unbranched Likert-type scales.

The branching approach was first used in telephone surveys as a way of asking respondents to place themselves on a scale without them having to remember all the response options on that scale (Schaeffer and Presser, 2003). It therefore allowed seemingly comparable questions to be asked across different modes. Some major surveys, such as the American National Election Studies (ANES), are turning to branched questions across multiple modes in order to provide comparable data and overcome various data quality problems.
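To make the branched format concrete, here is a minimal sketch (ours, not from the article or the ANES instruments) of routing a direction question to an intensity follow-up and recombining the two answers onto a single seven-point scale:

```python
# Minimal sketch of a branched attitude item: a direction question,
# then (if a direction is given) an intensity follow-up, recombined
# into one 7-point score. The wording and scale mapping are
# illustrative assumptions, not taken from any real instrument.
def ask_branched(direction, intensity=None):
    """direction: 'agree' | 'neither' | 'disagree'.
    intensity (asked only when direction is not 'neither'):
    'strongly' | 'somewhat' | 'slightly'.
    Returns a score on a 1-7 scale (7 = strongly agree)."""
    if direction == "neither":
        return 4  # scale midpoint; no follow-up question is asked
    steps = {"strongly": 3, "somewhat": 2, "slightly": 1}
    offset = steps[intensity]
    return 4 + offset if direction == "agree" else 4 - offset

print(ask_branched("agree", "strongly"))     # 7
print(ask_branched("neither"))               # 4
print(ask_branched("disagree", "slightly"))  # 3
```

The point of the routing is that no respondent ever has to hold all seven options in mind at once – each question offers at most three.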

Some research has suggested that responses to branched questions are better at predicting attitudes and behaviours compared with unbranched versions of the same questions. Studies have also shown that branched questions provide more reliable results than their unbranched counterparts. However, it turns out that most of these studies had confounding factors, making it impossible to separate out the effect of branching from the effects of other question design or implementation factors. A notable exception is the paper by Malhotra, Krosnick and Thomas (2009), which finds that branched questions have higher validity than unbranched equivalents.

At Essex University, we are conducting a study to test the performance of branched questions using three waves of the Innovation Panel, the methodological testing panel of Understanding Society (the UK Longitudinal Household Study). Using a split-ballot experiment, we compare branched and unbranched questions in measuring attitudes in a face-to-face survey interview. We have not yet published the results, but, so far, we have not seen differences in data quality between forms; and, critically, we find that branched questions take significantly longer to administer than their unbranched equivalents. As a result, we are unconvinced that branched questions are the way forward, at least for face-to-face interviews. From a total survey costs approach, it probably is not worth the extra administration time (and therefore cost) to use the branched format instead of the standard format. Of course, there are other alternatives to the ubiquitous agree-disagree format, but that is for another article.
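The logic of a split-ballot experiment can be sketched as follows. This is a toy simulation (ours, not the Essex study’s design or data) with made-up timings, just to show how random assignment lets the two question forms be compared directly:

```python
# Toy sketch of a split-ballot design: each respondent is randomly
# assigned to one question form, so any difference in outcomes
# (here, administration time) can be attributed to the form rather
# than to who happened to answer it. All timings are simulated.
import random
import statistics

random.seed(1)
timings = []
for i in range(200):
    form = random.choice(["branched", "unbranched"])
    # Toy data-generating assumption: branched items take ~5s longer,
    # with some individual variation. Real studies would use paradata.
    seconds = random.gauss(25 if form == "branched" else 20, 3)
    timings.append((form, seconds))

for form in ("branched", "unbranched"):
    mean_t = statistics.mean(t for f, t in timings if f == form)
    print(form, round(mean_t, 1))
```

Because assignment is random, the two groups are comparable in expectation, so the difference in mean timings estimates the effect of the form itself.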

Emily E. Gilbert is about to complete a PhD in survey methodology at ISER, University of Essex: [email protected]

Nick Allum is a senior lecturer and director of the masters programme in survey methods for social research at the Department of Sociology, University of Essex: [email protected]

References
Krosnick, J.A. and Berent, M.K. (1993) ‘Comparisons of party identification and policy preferences: The impact of survey question format’: American Journal of Political Science, 37(3): 941-964

Malhotra, N., Krosnick, J.A. and Thomas, R.K. (2009) ‘Optimal design of branching questions to measure bipolar constructs’: Public Opinion Quarterly, 73(2): 304-324

Schaeffer, N.C. and Presser, S. (2003) ‘The science of asking questions’: Annual Review of Sociology, 29(1): 65-88

sra: METHODOLOGY


Ethics: sharing practice and learning
Helen Kara introduces the SRA Ethics Consultancy Forum case studies and reports on a new resource (TEAR) to help with ethical approval

The SRA’s ethics consultancy forum is a free service for SRA members, offering support for particularly difficult ethical dilemmas. Around a dozen people act as volunteer consultants to the forum, bringing a wealth of research experience from different sectors. If you want to present a case for consultation, you can email details to the convenor, Ron Iphofen, at [email protected]. The work of the forum is confidential, and if you wish, even your own identity can be kept secret from the volunteer consultants. Ron will circulate your case by email and the consultants will respond with their views and suggestions. These will be collated and sent back to you, usually within seven working days.

As there is rarely a hard-and-fast answer to any ethical question, you may receive a range of responses. While these are unlikely to provide you with a definitive solution to your dilemma, they are likely to provide a map for navigating through turbulent ethical waters. I used the forum myself once, in the mid-2000s, when I found myself in a particularly delicate situation. The knowledgeable and considered responses from the forum were very useful indeed. Their input enabled me to find solutions that sat well with me then and still feel like the best options now. Of course, managing the situation remained my responsibility throughout; the forum’s input is purely advisory, and if I had decided on a course of action which had turned out badly, I would not have been able to hold the forum, or the SRA, accountable.

My assertion that I found the forum useful would be dismissed by some researchers as anecdotal evidence, and lauded by others as testimonial evidence. It is between such differing standpoints that ethical difficulties often arise. Luckily there is some firmer evidence of the forum’s value on the SRA website, that is, anonymised case studies, prepared with the permission of SRA members who submitted them to the forum. These case studies indicate the diversity of problems which arise, covering areas such as ethical approval, difficulty gaining access to potential participants, international research ethics, covert observation, and making complaints about poorly-conducted research. Some useful links are included, and I would urge you to have a look; I think most researchers would find these cases thought-provoking, perhaps even educational.

I am now a member of the ethics consultancy forum, and link its work to that of the SRA board. Not every member of the forum responds to every case – sometimes people are on holiday, or unwell, or just too busy – but most respond to most. While I don’t want to precipitate a great deluge of cases, I would encourage all SRA members to remember the forum, and to use it in times of need.

It is my great pleasure to announce that the forum has recently developed an exciting international dimension in the shape of Professor Martin Tolich, an ethics expert from New Zealand. Martin is the founder of an excellent initiative, The Ethics Application Repository (TEAR), launched in March 2012 (see www.tear.otago.ac.nz). This is an open-access online repository of exemplary applications to research ethics committees, many of which are posted with related documents such as information sheets for participants and consent form templates. The contents of applications are anonymised, although the researcher(s) concerned must be named. I found the repository slightly awkward to search, but if you browse by title, you will find all the applications listed.

The aim of TEAR is to make ethical approval less secretive and hierarchical, more open and transparent, less focused on form-filling, and more supportive of ethical thinking. Applications are donated by researchers from around the world who want to help others facing the onerous task of applying for ethical approval. Martin would welcome more donations of successful applications and associated documents. SRA members who might be able to add to this very useful resource can reach Martin at [email protected].

http://the-sra.org.uk/sra_resources/research-ethics/ethics-cases

sra: ETHICS

I would encourage all SRA members to remember the forum, and to use it in times of need


Keeping trained and up to date: SRA finds out what members like

sra: FINDINGS

It was encouraging that many of your comments were positive

Many thanks to everyone who took part in our recent survey. We had 250 responses, which is about 42% of the membership – a decent result for this type of questionnaire.

We were most grateful that you took the time to provide very thoughtful answers to the open-ended questions. We are now using these to find ways to improve what the SRA does.

Initially, we’ve shared the open-ended responses to the questions ‘How can we improve our …’ (training, events, website). The responses about training have gone to Simon Haslam, the board member with responsibility for training; the comments on events to the Events Group; responses about the e-newsletter to Gillian Smith who edits the newsletter; and similarly we have shared your comments on the website, this magazine, and so on.

We are doing a qualitative analysis of these and other open-ended responses, thanks to Graham Hughes who, as well as being the SRA treasurer, is a qualitative researcher specialising in this type of analysis.

It was encouraging that many of your comments were positive. However, there is much to discuss. Meanwhile, a few findings below:

1. MEMBER BENEFITS
Your reasons for belonging to the SRA are varied:

90% Keeping up to date with social research news, issues, developments

57% Access to training at discounted rates

56% Opportunities to network/share experiences with other researchers

26% Discounted delegate rates for the SRA annual conference

24% Specific advice or help (eg on ethics issues)

24% Able to get involved in debates and help to shape policy

24% Discounted delegate rates to other events (eg summer event, social media in social research conference)

5% Something else

2. VALUE OF SRA EVENTS
We asked people who attended any of the four main SRA events last year how useful they found each one. The majority of the results were in the very/fairly useful area, but with quite a range of views:

How useful did you find it? (Very useful / Fairly useful / Not very useful / Not at all useful)

2012 annual conference (base: 163): 60% / 33% / 7% / 0%
Summer event (base: 78): 25% / 65% / 10% / 0%
Social media in social research conference (base: 30): 63% / 38% / 0% / 0%
Cathie Marsh lecture (held jointly with the RSS) (base: 70): 33% / 56% / 11% / 0%

3. SRA JOURNAL
We were encouraged by the positive responses to a question about the possibility of starting an SRA journal. This has added support to the sense among SRA board members that there is a need for a resource of this kind:

We are exploring the possibility of starting an SRA journal, with mostly short articles aimed at practitioners of social research. The journal would be posted to members quarterly. Could a journal like this be of interest to you?

38% Yes, definitely

50% Yes, probably

12% Probably not

0% Definitely not

These are just a few items from a very rich data source. Many thanks again for taking part.


Families in austerity
By Suzanne Hall, research director, Ipsos MORI

Stories about life in austerity have become all too familiar reading over recent years, but what have the pressures on household budgets really meant for family life?

Research conducted by Ipsos MORI on behalf of the Family and Childcare Trust – Family Matters – opens a window into the lives of 11 families, providing a vivid illustration of the delicate balancing act necessary to sustain household budgets and keep family life on track in austerity. The families’ stories (gained from participant observation, interviewing, phone conversations, self-completion diaries and participatory photography), together with a survey of over 1,000 parents, demonstrate how mounting financial pressures are increasingly spilling over into family life and putting relationships to the test.

What was striking from this research is how widely financial fragility is felt: more than four in five (85%) parents stated that the financial situation for families has deteriorated in the last year. What’s more, families are paying an emotional price: three in five (60%) parents have experienced increased levels of stress and anxiety as a result of changes in their financial circumstances, and a third (33%) say these changes have caused problems in relationships with family and friends.

We wanted to understand what is behind this fragility, and our research revealed that more often than not, it is one or more of the four Cs: cost of living, cars, credit and childcare.

Against a backdrop of stagnant wages, it is no surprise that the rising cost of living is a constant pressure, with nine in ten (89%) parents mentioning that they have noticed an increase in the amount they are paying for food and household bills. To adapt, families are trading down what they eat – swapping fresh food for frozen – and, at worst, parents are going without so their children can eat.

‘The price of everyday things; fruit, meat, fuel… has just skyrocketed,’ said a lone father.

These strategies save money, but cost individuals in physical and mental health. Parents spoke of the guilt they felt at not being able to afford healthy food for their children, and also mentioned how their own energy levels dipped as they cut back on what they ate.

Cars were often considered a ‘necessary evil’: essential to family life, but also a source of frequent costs – expected and unexpected – that are financial, emotional and psychological. Eight in ten (79%) parents were concerned about their ability to pay for an unexpected major expense, with the car a likely candidate.

Childcare was another significant cost and one that acted as a barrier to work for many potential second earners – usually mothers. While families valued the educational and social benefits offered by formal childcare, its inflexibility and high price meant that, for many, it was incompatible with working life.

‘If I didn’t have Mark’s parents I wouldn’t work. We need the money but at the same time childcare is so expensive that you end up just paying to work,’ said one woman.

Given these pressures, borrowing from family and friends or using high-cost credit was a common coping strategy. However, there was often a ‘relationship premium’ in borrowing from friends and family, which created strain – and, of course, a ‘financial premium’ in using high-cost credit.

‘Short term loans are bad, of course they are, everyone knows they are, but if I didn’t have them then I wouldn’t always be able to get through the month,’ said a single mother of two children.

What was striking about the lives of these families was not only how financially fragile even the relatively well-off were, but how financial fragility time and again translated directly into a deterioration in family relationships. A detailed, longitudinal perspective enabled a fuller appreciation of this multi-faceted fragility, but also emphasised how resilient families can be in even the most challenging circumstances. Nevertheless, it is important not to underestimate exactly how challenging it is, and will continue to be, for families to get by in austerity, especially if they lack the right combination of resources to do so.

For more information see Patrick Butler’s blog piece and the Poverty and Social Exclusion website at: www.poverty.ac.uk/editorial/families-left-fragile-austerity




The Great British Class Survey: first results
By Professor Fiona Devine, OBE, AcSS, School of Social Sciences, University of Manchester

The first set of results from the Great British Class Survey – showcasing a new model of class with seven classes ranging from the elite to the precariat – was announced on 3 April. The findings enjoyed considerable media attention, ranging from coverage on BBC Breakfast to a report in the New York Times. It led 6.9 million people in the UK and abroad to play with the BBC’s ‘class calculator’ to see where they fit in the 21st-century class system.

The results were disseminated on the same day in a plenary session of the annual British Sociological Association conference by Professor Mike Savage of the LSE and me, with the wider academic team in attendance. We drew on an article in the BSA flagship journal, Sociology, which details the theoretical and empirical arguments in full. This, too, enjoyed widespread interest, with the article downloaded 19,642 times in April.

The research was conducted in collaboration with BBC Lab UK, and the BBC producer Michael Orwell worked with us throughout. We devised a questionnaire to define and measure class in a new way. Drawing on the work of Pierre Bourdieu, we defined class in terms of economic capital, cultural capital and social capital. Accordingly, we asked people about their income, the value of their homes and savings (economic capital), their cultural interests and activities (cultural capital), and the number and status of people they know (social capital).

The web survey was launched on 11 January 2011 and had been completed by over 161,400 participants by July 2011. These participants were not, however, representative of the population at large. Accordingly, 1,026 respondents completed a face-to-face, nationally representative sample survey, conducted by GfK using the same questionnaire, in April 2013. We analysed the data using latent class analysis, combining the cases from the two surveys and weighting the GBCS so that its participants together count as the equivalent of one case of the GfK survey.
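The weighting step can be sketched as follows – a purely hypothetical illustration with made-up numbers, not the team’s actual procedure:

```python
# Hypothetical sketch (invented figures): combine a small representative
# survey with a large, skewed web sample, down-weighting the web cases so
# that the whole web sample counts as a single case.
gfk_incomes = [20_000, 35_000, 50_000]   # representative face-to-face cases
gbcs_incomes = [60_000] * 4              # affluent-skewed web-survey cases

cases = [(x, 1.0) for x in gfk_incomes]                        # full weight each
cases += [(x, 1.0 / len(gbcs_incomes)) for x in gbcs_incomes]  # sample ≈ one case

weighted_mean = sum(x * w for x, w in cases) / sum(w for _, w in cases)
print(weighted_mean)  # → 41250.0, pulled towards the representative cases
```

In the real analysis a latent class model was then fitted to the combined, weighted cases; the toy weighted mean simply shows how the down-weighting stops the skewed web sample from dominating.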

We identified a complex map of seven classes. First, we found an elite (6% GfK, 22% GBCS) with high levels of economic capital (especially savings), high ‘highbrow’ cultural capital and high social capital. Second, there is an established middle class (25% GfK, 43% GBCS) with high economic capital, high highbrow and popular or ‘emerging’ cultural capital, and high-status social contacts. Third, we identified a small technical middle class (6% GfK, 10% GBCS) with high economic capital, moderate levels of cultural capital and social contacts of high mean status, though few in number.

Fourth, we found a class of new affluent workers (15% GfK, 6% GBCS) with moderately good economic capital, moderate highbrow and good emerging cultural capital, and a moderate mean score of social contacts. Fifth, we identified a class of emergent service workers (19% GfK, 17% GBCS) with moderately poor economic capital, though with reasonable household income, high emerging (and low highbrow) cultural capital and moderate social contacts. These classes capture the blurring of middle-class and working-class boundaries and fragmentation in the middle of British society.



Sixth, we found a traditional working class (14% GfK, 2% GBCS) with moderately poor economic capital (albeit with reasonable house values), low cultural capital (of both types) and few social contacts of low status. Finally, we found a precariat (15% GfK, <1% GBCS) with poor economic capital and the lowest scores on every other criterion of cultural and social capital. We suggest there is clear polarisation between an elite at the top, with substantial economic, cultural and social capital, and a precariat at the bottom with little in the way of these capitals. The new class model captures these two extremes.

Since the announcement of the results, a further 200,000 people have completed the full web survey. The final sample now stands at 366,626 participants, making it the most popular of the BBC Lab UK experiments. This level of participation seems to suggest that the British are, indeed, somewhat obsessed with class (said in a typically understated British way)! This is the final data set for analysis, since BBC Lab UK stopped capturing data on 1 June.

An even more exciting development is that BBC Lab UK conducted 11 experiments that enjoyed mass participation. The BBC Lab UK platform was designed so that every experiment gathered data into a common database using anonymised unique identifiers that are consistent across all of the experiments. Thus, we know how many participants in the Great British Class Survey also took part in the 10 other experiments. See Table 1 below.

TABLE 1: OVERLAPPING PARTICIPANTS IN THE GBCS AND THE OTHER EXPERIMENTS

The Big Personality Test 16,580

Brain Test Britain 5,194

The Web Behaviour Test 2,896

The Stress Test 11,764

The Big Money Test 17,120

The Big Risk Test 23,420

How Musical Are You? 29,853

The Get Yourself Hired Test 5,981

Test Your Morality 22,183

Can You Compete Under Pressure 11,270

This means we have big data to analyse over the coming months and years. The academic team should be able to delve even deeper into the nature of economic, cultural and social capital. The team is especially interested in doing more analysis on the spatial dimensions of class, the nature of occupations associated with different classes and the role of universities in class reproduction. The wider academic community and beyond will be able to analyse the data sets when they are deposited with data services later this year.

How might this way of examining class be used in practice? Several market researchers have approached us already, so collaborations may lead to engagement with a theoretically informed, multi-dimensional view of class. The academic team has been asked about the policy implications of the research. We have only just begun to think about these issues. One obvious example is to better understand the nature of class polarisation, and especially the substantial economic, cultural and social capital of the elite.

A key question is whether this approach will supplant more orthodox measures of class. The orthodox measure (NS-SEC), with its focus on economic capital as captured by occupational and employment status, remains a very important way of measuring class, not least in official statistics. Measuring class using different capitals is an important complement in describing and explaining class. It will be very interesting to see discussions and collaborations between social scientists enriching the different approaches to understanding class in the future.

More informationSavage, M., Devine, F., Cunningham, N., Taylor, M., Li, Y., Hjellbrekke, J., Le Roux, B., Friedman, S. and Miles, A. (2013) ‘A New Model of Social Class? Findings from the BBC’s Great British Class Survey Experiment’, Sociology, 47 (2) 219-250



Propaganda: power and persuasion
By Ian Cooke, curator, international and political studies, British Library

Propaganda: Power and Persuasion is the new exhibition at the British Library, which runs until 17 September. It examines state use of propaganda from around the world over the past 100 years. Taking a thematic approach, it examines the challenges and methods of propaganda in the context of: nation-building and national competition; demonisation and opposition; war; and public health campaigns. The 200 exhibits include posters, pamphlets, films, recorded sound, bank-notes, postage stamps, even paper bags.

Definitions of propaganda have changed over time, and the term remains controversial today. The exhibition offers various definitions and interpretations. Professor David Welch, author of the accompanying book, suggests that propaganda is simply a tool, something which is itself ethically neutral. Our judgement of whether a piece of propaganda is ‘good’ or ‘bad’ comes from the motivations and objectives to which it is applied.

Propaganda is examined as a form of communication intended to have an effect on the opinions or behaviour of a defined population. Often the desired effect is reinforcement of existing opinions or prejudices rather than radical change, and this makes it difficult to measure its effectiveness. Propaganda is often mobilised in support of policy, and it is difficult to disentangle the impact of propaganda from other issues and incentives. For example, the British AIDS awareness campaigns of the late 1980s are credited with keeping infection rates low compared with countries that acted later on public awareness. Official communications, however, accompanied information from other concerned organisations, notably the Terrence Higgins Trust. Similarly, the low infection rates amongst intravenous drug users are probably more the result of effective needle exchange services than of the shocking posters of the time.

Throughout its evolution, propaganda has used contemporary cultural forms: from the monumental architecture of the ancient world, through print, posters and postcards, to cinema, radio and television. Its challenge is to keep changing, inhabiting the forms and media popular at the time and among the people it aims to influence. Examples in the exhibition include a Chinese poster for the 1950 film ‘The White Haired Girl’. This film, which emphasises the emancipatory power of the Chinese Communist Party, takes the popular form of a musical play and is based on both well-known myths and stories that were circulating by word of mouth in the border provinces during the 1940s. By contrast, during the first majority elections in South Africa in 1994, a variety of posters, and comic books featuring conversations between farm workers, were used in an effort to convince the population of the legitimacy of the elections.

The exhibition ends by looking at what has changed in the 21st century, highlighting two themes: heightened sensitivities over the relationship between state and mass media, especially in the build-up to war; and the impact of social media. For both, the ability to comment and report on events almost as they happen has changed how we receive news and how we expect to interact with those in power. For mass media, this raises challenges about how to compete while ensuring credibility in reporting. For social media, there are questions about how readers can recognise the source of information, and about the potential for states and powerful interest groups to coordinate messages in a way that might not be immediately obvious. What is interesting is that these methods can also be seen amongst groups who have learned to use social media to mobilise grassroots campaigns. The question we are left with is whether there is a need for a new form of awareness to identify the presence of propaganda in social media, or whether new communications have given us all the potential to be propagandists.

sra: EXHIBITION

Propaganda: Power and Persuasion runs at the British Library, London until 17 September. For more information, including the accompanying events programme, see www.bl.uk/propaganda


Doing your literature review
By Jesson, J.K. with Matheson, L. and Lacey, F.M. London: Sage, 2011

Reviewed by Dr Jenni Brooks, research fellow at the Social Policy Research Unit, University of York

sra: REVIEWS

This book explains how to conduct both a traditional literature review and a systematic review, and clearly sets out the differences between the two. It is aimed at postgraduate students in a range of disciplines, and expects some understanding of research methods. The chapter on meta-analysis requires some statistical knowledge.

The book is straightforward and easy to navigate and understand. The language is clear throughout, and unusual terms are explained within the text without being patronising. Each chapter begins with a short summary and a list of points to be aware of, and there is a helpful glossary at the back.

The main strength lies in the book’s practical nature. The authors place great emphasis on the importance of proper searching techniques and encourage the use of specialist librarians. Chapters on reading and note-taking skills contain useful detail often missing from similar books – such as which bits of an article to read first, and how to make and store relevant notes that will be usable later. The examples of how to improve specific passages of writing are very valuable.

I appreciated the way the authors understood the pitfalls facing a relatively novice researcher, and provided techniques to avoid these. For instance, they describe a process for determining whether an article is relevant to the central purpose of your review (so you can avoid getting side-tracked with unnecessary, albeit interesting, reading). The chapter on synthesis and writing up of a review is set out in a very methodical, practical way, making a hard-to-manage process feel manageable.

The systematic review chapters are detailed. However, if I were actually going to do a systematic review, I might want more guidance on such things as determining inclusion and exclusion criteria. But the book would be very useful for deciding whether a systematic review would be appropriate, gauging the level of work it is likely to involve, and understanding and judging other systematic reviews one comes across.

I found just one thing slightly vexing. The authors suggest that to get the most from their section about different types of reviews, the reader should look up particular articles referred to in the text. These were not all freely available.

Overall I enjoyed the book, and would recommend it to postgraduate students or early career researchers who haven’t yet done a systematic review, and are faced with the prospect of trying to decide whether to do one.

Social network analysis: history, theory and methodology
By Christina Prell, London: Sage, 2012

Reviewed by Dr Ron Iphofen, AcSS, independent research consultant

I was keen to discover more about social network analysis (SNA). This book by Christina Prell more than adequately fulfils its promise to meet the needs of those coming to the field for the first time, of intermediate researchers looking to develop their skills, and of teachers seeking a comprehensive account to help inform their students.

After a vital introduction to terms and definitions, a succeeding historical chapter exploring the origins of SNA prompted memories of my earliest interest in this topic. From Elizabeth Bott’s views on the importance of network ‘connectedness’ to conjugal role relationships, through Leon Festinger’s thesis on leadership emergence in small groups, to Kurt Lewin on field theory, I was transported back to the early 70s, when there still appeared to be a relic of interest in Moreno’s foundational work on sociometry. I recall spending some time constructing sociograms while failing to see how they could adequately display the structure of networks more complex than a dyad or a triad. Prell’s exegesis of the contributions from matrix and graph theory accounts for the renewed development of the field and the more sophisticated analyses these permit of a range of larger network systems. Both the multidisciplinary origins and applications – especially with the growth of interest in social capital – appear to have fuelled a sustained interest in the field.
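The shift from hand-drawn sociograms to matrix representations that Prell describes can be illustrated with a toy example (invented names and ties, not drawn from the book):

```python
# Illustrative only: a small undirected network as an adjacency matrix –
# the representation, from matrix and graph theory, that lets SNA scale
# far beyond sociograms of a dyad or triad.
people = ["Ann", "Bob", "Cai", "Dee"]
adj = [
    [0, 1, 1, 0],   # Ann knows Bob and Cai
    [1, 0, 1, 0],   # Bob knows Ann and Cai
    [1, 1, 0, 1],   # Cai knows everyone
    [0, 0, 1, 0],   # Dee knows only Cai
]

# Degree centrality: the number of direct ties each person has
degree = {p: sum(row) for p, row in zip(people, adj)}
print(degree)  # Cai is the most connected
```

The same matrix supports the more sophisticated analyses the review alludes to (paths, centrality measures, clustering), which is why the representation proved so much more durable than the drawing.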

I read the rest of the book as a novice researcher or postgrad student seeking the appropriate analytic method for a substantive topic, and a method that could supplement a repertoire of methodological skills transferable across a range of other research areas. Prell’s subsequent chapters did not disappoint. After a step-by-step explanation of how to study social networks, from assumptions through theoretical frameworks to sampling and question framing, I was offered a range of alternative approaches to suit my chosen research question. Given my own current field of interest, I had hoped for a little more on the ethics of SNA, but what was offered I found succinct, sound and accurate. The rest of the book offers a clear and practical ‘how to’, together with introductions to the favoured computer software applications – always valuable to the neophyte. Indeed, if I were to seek a ‘foothold’ in SNA, this is where I would start.


Doing Q methodological research: theory, method and interpretation
By Simon Watts and Paul Stenner, London: Sage, 2012

Reviewed by Dr Ron Iphofen, AcSS, independent research consultant


I have to be entirely honest with you and explain that I offered to review this book since it was ‘advertised’ with the all-important ‘Q’ omitted from the title. I was attracted then to the idea of a wonderfully reflexive notion of researching researchers. So I admit I was disappointed when the book arrived. However, given timescales and knowing how important it is for authors to have their books reviewed, I felt a moral obligation to complete the task.

Fortunately Q methodology was not an entirely new concept to me. Some years ago, I was seeking ways to address subjectivity in social research without necessarily pursuing the phenomenological route but requiring an approach that promised enhanced rigour and would stand up to independent tests of reliability and validity. Within health sciences, one of the most noted proponents of Q methodology, as mentioned in this text, was Wendy Stainton Rogers. I could see the advantages of the method in her work but was intimidated somewhat by the apparent complexity of the task faced by research participants and the complexity of the statistical analysis – required of me! Instead, I opted to pursue George Kelly’s alternative approach to ‘objectively measuring subjectivity’ via personal construct theory (PCT) and repertory grid analysis. Ironically, after conducting such research and then supervising others, I found that I did have to learn how to conduct factor analyses and also how to adequately explain a complex procedure to research participants.

I tell you this to explain the ‘light’ under which I reviewed Watts and Stenner. They organise the work in terms of theoretical origins – from William Stephenson in the 1930s, who was seeking a systematic way to study subjectivity in opposition to the dominant behaviourist approaches of the time. In some respects, he was seeking to pursue the approach advocated originally by William James, and was able to build upon the correlational statistical techniques developed by Spearman and others. In the ‘method’ section they explain how subjective views, attitudes and statements are rank-ordered by each participant in a Q sort, which can then be subjected to factor analysis. As Stephenson designed it, instead of large numbers of people being given small numbers of tests, small numbers of people are given large numbers of tests. Watts and Stenner then go on to show how to conduct the statistical analysis but then, more importantly, explain how to interpret it.
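Stephenson’s ‘transposed’ logic – correlating persons across statements, rather than variables across persons – can be sketched with invented data (real Q studies use dedicated software and many more statements):

```python
from statistics import mean

# Toy Q sorts (invented): each person rank-orders the same five statements.
sorts = {
    "p1": [1, 2, 3, 4, 5],
    "p2": [1, 3, 2, 4, 5],   # outlook similar to p1
    "p3": [5, 4, 3, 2, 1],   # the mirror image of p1
}

def pearson(x, y):
    """Pearson correlation between two equal-length rankings."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Correlate persons with each other (not variables): this person-by-person
# matrix is what Q methodology then submits to factor analysis.
r12 = pearson(sorts["p1"], sorts["p2"])  # high: shared viewpoint
r13 = pearson(sorts["p1"], sorts["p3"])  # strongly negative: opposed viewpoint
```

Factor analysing this person-by-person matrix then groups participants who sort the statements in similar ways, which is exactly the inversion of the usual test battery that the review describes.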

It is curious that while Q methodologists and PCT’ists start from similar assumptions, adopt similar ‘sampling’ methods and use very similar correlational analyses – factors, rotations, eigenvalues – they rarely refer to each other, whether critically or in terms of better practice. So in reading this book, I came to the view that, once you invest so much time and energy in learning how to conduct such complex research, you simply get on with it and improve how you do it – remaining convinced, from a distinctly ‘vested interest’, that the approach repays the effort. Thus while Watts and Stenner rightly suggest that you can start this book at the methods section – it is clear and straightforward enough – I would advise against it. Read their excellent ‘theory/history’ section first and decide whether this is a warrantable approach.

I remain doubtful about this approach on the grounds of finding research participants who can fully understand what is required of them to conduct the Q sorts, and who are willing to invest the energy and time to do it. It is hard enough to maintain response rates on the simplest surveys. Of course, the section here on recruitment is vital (p.70 et seq.) – this is not an approach suited to either random or opportunity sampling. Participants have to be selected carefully and strategically. In fact, if you are considering this method for a particular research topic, I’d recommend reading this section first to see whether it suits the topic and participants you have in mind.

Of course that is a criticism of the method, not the book. This is a comprehensive, clear, ‘one-stop start’ to the method, with some very useful chapter summaries and appendices – but it is by no means as ‘simple’ as the cover blurb claims. Together with the all-important recommended free software package for the data analysis, this would be a good place to begin if you were thinking of taking this approach. And if anyone ever did conduct a ‘researching the researchers’ study, I’d be interested to know why people like Watts and Stenner chose to pursue this line of work in the first place.


Engage respondents wherever they are

Paper, online and mobile questionnaires in any language
Mobile interviewing for iPad, Android and Windows devices
Include video, clickable images, audio, sliders and file attachments

Gain deeper insight with Snap’s new Smart Reports

Create dynamic repeatable reports tailored specifically to the reader
Freetext analysis options include word clouds
Show results in context

www.snapsurveys.com/snap11

Join us for a preview webinar or request an online demo:

Call 020 7747 8900 or email [email protected]

The Intelligent Survey Solution



Qualitative content analysis in practice
By Margrit Schreier, London: Sage, 2012

Reviewed by Nicola Singleton, independent research consultant and visiting researcher, Department of Addictions, King’s College London

This book has been written for students but would be of value to anyone considering using this analysis method to help them reduce and make sense of a large volume of textual data. The early chapters (1 to 3) explain qualitative content analysis: how it differs from the more familiar quantitative content analysis, what it can and can’t do, and when it is appropriate to use it. It does this by comparing it to other qualitative analysis methods, and also explains how it might be used to complement other approaches. Reading these chapters would be enough for those who just want to understand the method and decide whether or not it would be appropriate for their data.

For those who want to go further and use the method, chapters 4 to 11 give a step-by-step guide to conducting qualitative content analysis. The first six of these chapters are concerned with the initial stages of building, testing and evaluating the coding frame. This may seem excessive, but it reflects a defining feature of the method: it seeks to describe and reduce the data systematically, in a consistent way that should be reproducible. The detailed discussion and examples of data-driven and concept-driven coding may also be more generally useful, and provide food for thought for those developing coding frames for use with other qualitative analysis methods. The next two chapters cover the main analysis phase and the presentation of results. Finally, chapter 12 discusses the use of software to assist with qualitative content analysis, covering various packages without prescribing any one in particular, and highlighting the issues to consider when using them.
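The kind of consistent, reproducible reduction Schreier describes can be illustrated with a toy coding frame (invented categories and segments, not taken from the book):

```python
# Toy illustration: apply a simple keyword-based coding frame segment by
# segment, counting at most one unit of coding per segment per category –
# the systematic reduction step that makes the analysis reproducible.
coding_frame = {
    "cost_of_living": ["price", "bills", "expensive"],
    "childcare": ["childcare", "nursery"],
}

segments = [
    "The price of food has gone up and the bills keep coming.",
    "Childcare is so expensive that you end up paying to work.",
]

counts = {code: 0 for code in coding_frame}
for seg in segments:
    text = seg.lower()
    for code, indicators in coding_frame.items():
        if any(word in text for word in indicators):
            counts[code] += 1

print(counts)
```

A real coding frame is of course built from concepts and the data themselves rather than bare keywords, and is tested for inter-coder consistency before the main analysis; the sketch only shows why a fixed frame makes the reduction repeatable.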

The content is detailed and presented in textbook style, with key points, definitions and beginners’ mistakes scattered throughout, plus frequently asked questions and end-of-chapter questions. These break up the text but also help when skimming. What I found particularly valuable was the liberal use of examples drawn from published papers. These really help to clarify and bring to life the issues raised.

While not a quick read, and only for people likely to be undertaking qualitative analysis, the book is clear. Having read it, I feel confident that I could use it to guide me in undertaking a robust qualitative content analysis of some of my own research data, and quite inspired to do so.

TRUSTEES
Dr Patten Smith (Chair), Ipsos-MORI [email protected]

Susannah Browne, Home Office [email protected]

Sophie Ellison, Blake Stevenson Ltd [email protected]

Jennifer Evans, Cardiff University [email protected]

Judith Hanna (Secretary), Natural England [email protected]

Dr Simon Haslam, FMR Research [email protected]

Graham Hughes (Treasurer) [email protected]

Dr Helen Kara, We Research It Ltd [email protected]

Nick Ockenden, Institute for Volunteering Research [email protected]

Ceridwen Roberts, AcSS, University of Oxford [email protected]

Richard Self, Welsh Local Government Association [email protected]

David Silke, Centre for Housing Research [email protected]

sra: CONTACTS

EDITORIAL POLICY
SRA Research Matters will include any copy that may be of interest to its readers in the social research community. We will notify you if we are unable to include an item. Copy submitted for publication is accepted on the basis that it may be edited to ensure coherence within the publication. The views expressed by individual contributors do not necessarily reflect those of the SRA.

SRA RESEARCH MATTERS PUBLICATION
Next copy date: 26 July (September issue) and 31 October (December issue)

For latest news and details of training from the SRA see online at www.the-sra.org.uk

Edited by Shirley Henderson www.shirleyhenderson.co.uk Design by www.graphics.coop

USEFUL CONTACTS
Commissioning and funding working group: John Wicks, AcSS [email protected]

Public affairs forum: Barbara Doig, AcSS [email protected]

Strategy forum: Ceridwen Roberts, AcSS [email protected]

SRA Scotland: Sophie Ellison [email protected]

SRA Cymru: Richard Self [email protected]; Jennifer Evans [email protected]

SRA Ireland: David Silke [email protected]

SRA Training: Dr Simon Haslam [email protected]; Lindsay Adams, Training administrator [email protected]

Events group: Dr Patten Smith [email protected]

Ethics forum: Dr Ron Iphofen, AcSS [email protected]

SRA Research Matters: Commissioning editor: Ceridwen Roberts, AcSS [email protected]

SRA Newsletter: Gillian Smith [email protected]

Book reviews: Dr Simon Haslam [email protected]

Administrative office: Graham Farrant [email protected]; Gabrielle Elward [email protected]

The Social Research Association (SRA), 24-32 Stephenson Way, London NW1 2HX Tel: 0207 388 2391 Email: [email protected]

The Cathie Marsh Memorial Lecture 2013
Presented by the SRA and the RSS

AT THE ROYAL STATISTICAL SOCIETY IN LONDON, THURSDAY 21 NOVEMBER, 5-7PM

Methods for policy evaluation
There has been considerable recent interest in randomised control trials (RCTs) for evaluating public policy, particularly in education, with Ben Goldacre’s recent contribution and interest from the Secretary of State. But there are also many reasons to be sceptical of the value of applying the medical model to evaluation in complex social contexts. These are the issues we will explore in this year’s Cathie Marsh Memorial Lecture (title to be confirmed).

Speakers: Leon Feinstein, Early Intervention Foundation (formerly Cabinet Office); Jeremy Hardie, LSE
Booking available shortly: www.the-sra.org.uk TheSRAOrg