Week 4 – Going into Europe

One of the UK’s leading constitutional scholars (and David Cameron’s old tutor), Vernon Bogdanor, gave a lecture a few years ago about the early years of European integration.

Britain and Europe since 1945, by Alex May, gives a clear overview of British policy towards Europe in the 55 years after the war.

Finally, the man who signed the treaty bringing us into Europe – Prime Minister Ted Heath – devoted several chapters to the positive arguments for integration in his autobiography, The Course of My Life.

2 thoughts on “Week 4 – Going into Europe”

  1. For those of us who were discussing the EU referendum – here’s an interesting article in today’s Times explaining that the internet polls give Out a small lead while phone polls give In a large lead.

    As it’s behind a paywall, I’ll C&P it here:

    The most reliable polls are at the end of a phone

    Daniel Finkelstein

    If you think the EU referendum looks neck and neck, you’re probably wrong. Much depends on the kind of polling that’s been used

    The pollsters were wrong in the general election. Are they getting it right in the European referendum?

    I have only once been asked to participate in an opinion poll. A woman came round to my parents’ house in the late 1980s and asked me some questions about my preferred washing powder and, by the way, how would I vote if there was going to be a general election tomorrow?

    I told her that my intention was to support the Social Democratic party. I saw her tick the box for the Social and Liberal Democrats and I had to explain to her that, no, that was a different party, could she please try again. It took several minutes before we got it right.

    I learnt a couple of things from this experience. The first is that the SDP was doomed. The other thing — and this may seem odd — was an enhanced respect for opinion pollsters.

    The companies that produce these surveys have to cope with every sort of confounding factor. It’s not just voters who are confused but even the people asking the questions. And although I said confidently that I was going to vote SDP, as I sat on the party’s national executive, by the time there actually was an election I voted Conservative. I got my washing powder right, though.

    Despite this, the pollsters generally do very well. They manage most of the time to give you a pretty good idea of who is going to win. And, as they did last week when measuring the London mayoral race, often they get it pretty much spot on. The record of YouGov in calling elections is strikingly good. The more aware you are of the messiness of the data they get, the more impressive this achievement seems, and the more forgiving you are when a mistake is made.

    Yet also, the more obvious it is that polls are very dependent on how data is treated. The way it is gathered and weighted is critical to the outcome it produces.

    Which brings me to the EU referendum. The problem with answering the question about whether the polls are right is that the polls aren’t all saying the same thing. It sort of looks as if they are, but they aren’t.

    On Monday, ICM showed Leave on 46 per cent and Remain on 44 per cent. Meanwhile, YouGov had Remain on 42 per cent and Leave on 40 per cent. These are fairly representative of most polls in two ways. They show the two sides running about even, and they were conducted online, using panels of voters filling in a computerised survey.

    The minority of polls conducted in a different way — on the phone — are producing different results. The last five phone polls have shown leads for Remain of 7 percentage points, 5 points, 11 points, 10 points and 7 points. When, three weeks ago, ICM conducted online and phone polls on the same days, their online poll showed a Leave lead of 1 point, while their phone poll showed a Remain lead of 7 points.
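Those five phone-poll figures can be summarised in a couple of lines (the leads are taken from the text above; the averaging is just an illustration of the scale of the gap):

```python
# Remain leads (percentage points) in the last five phone polls cited above
phone_leads = [7, 5, 11, 10, 7]

# Simple mean of the five leads
mean_lead = sum(phone_leads) / len(phone_leads)
print(mean_lead)  # 8.0 - an average Remain lead of 8 points
```

An 8-point average lead on the phone against a rough dead heat online is the discrepancy the rest of the article tries to explain.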

    Which, of course, leads to the question: which of these methods (not individual companies, but the overall method) is giving us a clearer picture?

    In a very good paper prepared by Matt Singh, of Number Cruncher Politics, and James Kanagasooriam, of Populus, the authors argue convincingly that it is the phone polls that are getting closer to the truth. And the reasons why they think this are interesting.

    Part of the answer lies with the way that the pollsters treat voters who don’t know. These people are as likely as anyone else to vote, so we want to know what they think. The structure of phone surveys gets more of them to express an opinion, revealing a higher proportion who want to Remain. (By the way, there is broad consensus that the increased tendency of older people to vote, favouring Leave, is roughly cancelled out by more engaged affluent voters favouring Remain.)

    The rest of the answer lies with social attitudes.

    The problem with opinion polling is that in order to ask someone a question, you first have to find them. You need, for instance, a certain number of working women aged 50ish in a two-car family. If you can’t get hold of the first one you try, if they are out, say, or won’t answer, you just look for another. As long as you find a few, you can weight the number so that you have the correct, representative proportion.
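The weighting step described above can be sketched roughly as follows. This is a minimal illustration with made-up demographic groups and figures, not any pollster’s actual method: each respondent’s answer is scaled so that their group’s share of the weighted sample matches its known share of the population.

```python
# Minimal sketch of demographic weighting (illustrative figures only).
# If a group is over- or under-represented among the people you managed
# to reach, its answers are scaled up or down to the population share.

# Population proportions (assumed, for illustration)
population = {"group_a": 0.30, "group_b": 0.50, "group_c": 0.20}

# Respondents actually reached: (group, answer)
sample = [
    ("group_a", "Remain"), ("group_a", "Leave"),
    ("group_b", "Remain"), ("group_b", "Remain"), ("group_b", "Leave"),
    ("group_c", "Leave"),
]

n = len(sample)
counts = {}
for group, _ in sample:
    counts[group] = counts.get(group, 0) + 1

# Weight = target share / achieved share, applied to everyone in the group
weights = {g: population[g] / (counts[g] / n) for g in counts}

tally = {}
for group, answer in sample:
    tally[answer] = tally.get(answer, 0.0) + weights[group]

total = sum(tally.values())
shares = {a: round(100 * v / total, 1) for a, v in tally.items()}
print(shares)  # {'Remain': 48.3, 'Leave': 51.7}
```

The catch the article goes on to describe is precisely what this sketch cannot fix: weighting assumes the respondents you reached in each group answer the same way as the ones you never reached.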

    The problem is that while polls may look the same, they aren’t

    Yet this produces a problem. What if the sort of working women aged 50ish in a two-car family that you can get hold of is different in some systematic way from the ones you can’t get hold of? And it turns out that they are.

    We know this because after the general election a very expensive piece of work was done to measure voter opinion. The British Election Survey was conducted in a much more costly way than is normally practical. A working woman aged etc etc would be identified and contacted repeatedly until that individual responded.

    This gave a more accurate overall picture — the survey result was much closer to the election outcome — and showed different attitudes among voters the more effort you had to go to to find them. The more often you had to contact them before getting an answer, the more likely they were to be socially liberal and the more likely they were to be for remaining in the EU.

    As far as the EU is concerned, the more you are out the more likely you are to be In, and the more you are in, the more likely you are to be Out.

    The phone pollsters as a group (each company has its own methods) seem to be doing better at capturing these more elusive socially liberal people. In addition to any light this might shed on the eventual result, these findings also illuminate the nature of the argument.

    Being in favour of the EU is linked to a university education

    One of the reasons for Ruth Davidson’s stunning success in Scotland was the targeting work based on polling done for her by Andrew Cooper, of Populus, who is also the pollster of the Stronger In campaign. He looked at voters and their broad social attitudes and background, which turned out to be a successful way of looking at things.

    Being in favour of leaving the EU turns out to be very closely associated with, for instance, believing that female and ethnic equality has gone far enough. Being in favour of remaining is closely associated with having a university education.

    While everyone is stressing the arguments about the EU, the result may be determined much more by how voters see themselves than by how they see Europe.


  2. As a committed Inner, I’ve been cheered up quite a bit by this, but I thought it might be interesting for those on both sides of the debate, and also for anyone with an interest in politics, polling and psephology.

    Firstly, it’s interesting to know about the differences between phone and internet polls. I guess if we really want to know what’s happening, we’ll have to pay pollsters more to do it properly by phone.

    But in terms of the EU, this I found interesting:

    “Being in favour of leaving the EU turns out to be very closely associated with, for instance, believing that female and ethnic equality has gone far enough. Being in favour of remaining is closely associated with having a university education.”

    The comments below the Times article are vitriolic at times. The above paragraph suggests that In voters are generally better educated and Out voters are more racist and sexist.

    Now obviously, this is just a correlation, not an absolute rule. It’s not saying that every Outer is a racist or that every Inner is intelligent.

    But it’s interesting that there’s a statistically significant difference.

    It’s also interesting that the two people I was discussing this with – Oliver? and ??? {sorry, can’t remember the names. This is Rory posting btw} – both seem to have been right in predicting a c.54–46 victory for In.

    I guess you’ve been paying closer attention than I have.

    If anyone has anything else to share about this, I’d be happy to read it.

    All the best.

    Rory.

