Wednesday, September 20, 2017

Recent Polling On The Same-Sex Marriage Postal Survey

The national ABS postal "survey" on whether the law should be changed to allow same-sex marriage in Australia is now in its second week.  A number of pieces of polling have been published or alluded to since my last general polling update, but what do they really tell us about the outcome and how reliable are they?  At this stage there is still much that we do not know.  It is too early to be certain Yes has it in the bag, but the widespread narrative that support for Yes is crashing rapidly and that this is another Trump or Brexit coming is so far not that well supported by the evidence.

Public Polling: Ipsos

Firstly, the public polling.  Last week saw an Ipsos poll which buoyed some worried Yes supporters with a 70-26 Yes response, one of the highest Yes votes ever recorded in a poll in Australia.  Indeed, as far as I'm aware, this score has only been exceeded in a few commissioned polls and one Morgan-SMS (a suspect polling method) which did not use an undecided option.  The Ipsos also found a 70% Yes response among the 65% of voters who rated themselves as certain to vote, and found a gender gap with 72% of women and 59% of men saying they were certain to vote.



However, I am treating these Ipsos figures with a lot of caution.  Firstly we know that Ipsos greatly overestimates the Green vote compared to other pollsters and compared to election results, and Green voters very strongly support same-sex marriage.  Although that's probably only worth a point or two on the headline figure for same-sex marriage, it could be a concern if whatever causes Ipsos to over-poll the Green vote also might cause them to over-poll the left end of the Liberal base.  Some evidence for the latter concern might be suspected in the perpetually strong results for Malcolm Turnbull in Ipsos compared to other polls.

The second issue is whether Ipsos, as the only national pollster still using exclusively live-interview phone polling, is affected by "social desirability bias".  When talking to another person on the phone, voters may be more reluctant to say they oppose same-sex marriage, since they may be concerned about offending the interviewer.  When clicking boxes on a computer survey, or pressing numbers on a robo-poll, these issues don't arise.  Some overseas studies (mainly American) have found live phone polling more prone to desirability bias than other polling methods, while some haven't, and some have even challenged whether it's a thing in polls at all, but I don't recommend assuming that it isn't.  A spectacular difference that may have been down to this was the gap between the phone polls and the online polls ahead of the Brexit vote.  The phone polls had large leads for Remain until very late in the piece, while the online polls always had a close contest.

In the Australian context, we don't have enough pollsters of each type to really draw a line between phone polls and other polls.  But I've noticed over time that the live phone polls (Ipsos, old Newspoll phone, old Nielsen) tend to get higher-end results for Yes, while one particular robopoll (ReachTEL) tends to get much closer results than other polls in general.  The closer results in ReachTEL occur not only nationally but also locally.  This was especially pronounced in the early days of ReachTEL, when the pollster used to get awfully bad results for Yes (like 43.0% in 2011), but it has remained to some degree up to the present.

Public Polling: Essential

This week's Essential had a number of interesting features.  See the main poll question and also the question about likelihood of voting.

Firstly, this Essential finds a 55-34 headline result in favour of changing the law to allow same-sex marriage, which is (by a small amount) the highest No response I can find in a published poll that includes a don't-know option since 2009, excluding the early ReachTEL mentioned above.  The Essential continues the trend of all previous public polling: voters who say they will definitely vote are more pro-Yes than those who say they probably won't.

As Peter Brent pointed out on Twitter, this week's Essential sample was a strong one for the Coalition compared to other recent samples, so it's plausible that random error has made this result slightly closer than it would have been.

An unusual aspect of the plebi-survey is that it allows for pollsters to conduct exit polling of a sort while people are still responding.  Thus this particular poll finds 9% of respondents have already voted, and of these 59% say they support changing the law, 37% say they don't, and 5% say they don't know.

The 5% who say they don't know are rather baffling, since after all they have already answered the same question in the survey.  For the Yes side, a concern would be that these may be voters who heeded the "if you don't know, vote No" rubbish from the No campaign.  (I call it rubbish because if you don't know, you don't even have to vote at all.)  The question doesn't yet ask respondents how they actually voted.  There may even be some people who support changing the law but voted No for some bizarre reason, though their numbers are probably very small if so.

Now 59-37 may still sound pretty good for Yes (assuming Yes is happy to win by any reasonable margin), but there are again reasons for caution.  Firstly the sample size: the margin of error on 9% of a one-week Essential sample (less than 100 voters) would be about 10% if it was treated as a random sample, ignoring the increases in effective error margin that come from the use of scaling.  (There is also the point that panel polls are self-selecting samples of a non-random selection of the population anyway, so applying "margin of error" to them is especially difficult).
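
For anyone who wants to check the "about 10%" figure, here is a minimal sketch of the standard binomial margin-of-error calculation.  The sample size of 90 is an assumption for illustration only, and the calculation ignores the weighting and panel issues just described.

```python
import math

# Approximate 95% margin of error for a simple random sample; ignores
# design effects from scaling and the non-random nature of online panels.
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

# Roughly 9% of a one-week Essential sample is somewhere under 100 voters;
# n = 90 is an assumed figure, not Essential's actual sub-sample size.
n = 90
for p in (0.59, 0.37):
    print(f"p = {p:.2f}, n = {n}: +/- {margin_of_error(p, n):.1%}")
# Both come out at roughly +/- 10 percentage points.
```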

More importantly, exit polling of this kind will be stratified, both by time of receipt and by time taken to respond.  The Yes and No votes are unlikely to be constant in time through the survey.  Voters in different areas will receive their survey forms at different rates.  Those in remote areas or whose forms have to be redirected from a different address may receive them more slowly.  When voters get their surveys, those with strong feelings are more likely to vote right away, while those who care little about the issue may defer a decision on voting, and those who are genuinely unsure may take longer to think about it.  Furthermore, campaigning through the survey period will influence vote behaviour.  The first 9% might be representative of the final outcome, but it's not something I'd be at all sure of.

Public Polling: YouGov

YouGov (another online pollster) this week has a 59:33 lead for Yes with 9% unsure.  This is unchanged from a month ago except that in that case 8% were unsure (so the only difference is the unsure figure after rounding.)  So that's one poll not backing the idea that the Yes vote is declining.

Internal Polls

Some internal polls conducted for the two campaigns have been reported.

Two online polls by Newgate Research for the Yes campaign have been reported, the first of which showed 63% for Yes and the second of which showed Yes leading 58.4-31.4.  The second was reported as having Yes down six points and No up two, which suggests something in the various reporting of these polls is not exactly accurate!

Newgate Research is a relatively little-known pollster and has not released any voting intention polling prior to elections based on which its accuracy can be gauged.  Its Senior Director Jim Reed is formerly of Crosby/Textor, the far better known internal pollsters for the Liberal Party.

The reporting of the second Newgate poll was picked up by several people as a classic instance of that poll-reporting trope, the internal polling scare story.  For a rather modest shift off a modest sample size (800) we got that support "has crashed", that "only" two-thirds of voters intended to take part, that this would be a "very low" turnout, "alarmingly" so for Yes campaigners, and so on.  The reporting said the poll was released as a "wake-up call", which rather implies that had its results not supported such a call it wouldn't have been released, ie selective release.  As evidence that Yes was at risk of going down in a screaming heap, a 27-point lead was far from convincing!  However, the idea that No has been making at least modest gains is now supported by the Essential result noted above.

The Newgate poll report noted with concern that young voters were less confident of voting, but Essential has found this too and yet still found that Yes voters are more likely to vote than No voters.  This is apparently explained by Yes supporters saying they're more likely to vote than No supporters across every demographic.

There are some strategic concerns about whether encouraging this style of reporting through the use of internal polls to scare people into not being complacent was a good idea.  Overhyped reports of support for same-sex marriage "crashing" may feed into a bandwagon effect for the No campaign.

The No campaign's Lyle Shelton claims to have internal polling showing a drop in the Yes vote from 65% to 58%, on which basis he claims that Yes has shed "over 1 million votes in ten days".  The latter claim is false as he is forgetting that not everyone will vote.  In any case, no details of pollster or method have been provided.
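
To see why the "over 1 million votes" line only works if everyone votes, here is some back-of-envelope arithmetic.  The enrolment figure of roughly 16 million is approximate, and the turnout scenarios are assumptions, not poll findings.

```python
# Back-of-envelope arithmetic on the "over 1 million votes in ten days" claim.
eligible = 16_000_000       # roughly 16 million enrolled for the survey (approx.)
drop = 0.65 - 0.58          # the claimed 7-point fall in Yes support

for turnout in (1.0, 0.6):  # assumed turnout scenarios
    lost = drop * eligible * turnout
    print(f"turnout {turnout:.0%}: about {lost:,.0f} fewer Yes votes")
# Only with near-universal turnout does a 7-point drop approach 1.1 million
# votes; at 60% turnout it is closer to 670,000.
```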

Another report from the No campaign has support among female voters dropping from 67% to 60% in ten days, and among male voters from 54% to 49% over the same period, with supposedly a large drop in support among Gen X voters and negligible change among young voters.  Again, no details of pollster or of method and again, inconsistency in the different claims about the polling.

ReachTEL has been conducting widespread internal polling for itself - these results won't be released and are for the purposes of prediction-testing.  (The article is fascinating but raises some questions about how the pollster has "about 100 different variables on each individual" and whether this is problematic in privacy terms.)

A more contentious robopoll, conducted by American firm WPA Intelligence, has emerged.  The poll asks respondents their views about same-sex marriage, then presents them with two statements, one pro-same-sex-marriage and one against it.  Then it asks them their view about same-sex marriage again.  This has been claimed to be a "push-poll" but I think it is more likely to be some kind of message-testing exercise, albeit one that should be a lot more upfront about who it is being conducted for.  For a detailed discussion of the differences between push-polling, skew-polling and message-testing see this piece from the 2014 Tasmanian state election.

Alleged Internal Poll (Unverified)

Wirrah Award For Something Fishy (Not Entirely Clear What) 

An especially dubious internal polling report came from one of the usual suspects on this issue, Miranda Devine.  Devine reported on "Exclusive polling by the Coalition for Marriage" in which 5% of 4000 adults surveyed identified as "LGBTIQ" and of these 93% supported same-sex marriage.  So far so unproblematic, although it probably won't stop said so-called Coalition from continuing to claim that only the rights of one point something percent are affected.  However Devine then, apparently in reference to this LGBTIQ cohort, claims:

A staggering 92 per cent would vote “No” or boycott the ballot if the proposed change to the Marriage Act “has not been thought through properly in terms of all of its consequences for the majority of Australians”.

While I don't expect Devine or other such op-ed authors to know better, this simply doesn't pass the smell test.  Far more than 8% of such voters would be immune to such blatant skew-polling and either click yes anyway or else abandon the survey in disgust.  And ditto for these further claims:

And 81 per cent would vote “No” or not vote if it turned out that the “Yes” campaign was “being run and controlled by left-wing activists with a hidden agenda that goes well beyond changes to the Marriage Act”. [..]

The survey also showed that more than three-quarters of LGBTIQ respondents would vote “No” or not vote if changes to the Marriage Act triggered “negative consequences for the majority of Australians, including restrictions on free speech, and penalties for acting according to one’s beliefs about marriage”.

Almost two thirds say they would vote “No” if same-sex marriage resulted in “forced exposure of young children to radical sex education content without parental consent”. A further 12 per cent would boycott the ballot.

And one third would change their vote to “No” or not vote if the “Yes” campaign were “nasty”, understanding that they would bear the backlash.


In other words, LGBTIQ Australians have serious reservations about the consequences of changing the definition of marriage.

For us to even consider taking such implausible claims seriously, they should have been backed by details such as the pollster conducting the polling, the polling methods, and the full list of questions.  But even if these results are genuine and were obtained by a reputable pollster - and not completely concocted or grossly incompetent as they sound - it is obvious that the poll would have engaged in severe skew-polling.

Beware Cherry-Picking

Some media sources that support same-sex marriage have nonetheless been keen to play up the narrative that the Yes vote is crashing and the Yes campaign is terrible.  Partly this makes for a more interesting narrative (there is often more interest in contests perceived to be close or unpredictable) and partly it helps some of them to reach out to No campaigners by saying "well yes I support same-sex marriage but the Yes campaign are doing it all wrong".

A good example was a tweet from Paul Murray that drew a line through Morgan 76% in March 2016, Newspoll 63% in August, Essential 55% now and asked "Time to worry?" But the Morgan result, as already noted, was a dodgy yes/no SMS poll with no undecided option (practically an opt-in); other polls from 2015 and 2016 had an average Yes response of about 62%.  Also, Newspoll has generally had slightly higher Yes votes than Essential, mainly because it tends to have lower don't-know votes.  And why include an ancient Morgan and not a modern Ipsos?

Intentionally or otherwise, anyone can pick points from a sample that support any conclusion they like - that support for same-sex marriage is plummeting, rising or staying still.  But it is better to look at a full range of polls.  An incomplete list may be found on Wikipedia, but be aware that some are commissioned and some are not.

If The Gap Is Narrowing, Why?

The above polling is not really conclusive on whether the gap between Yes and No has narrowed in recent weeks.  On an aggregate of public polls weighted for house effects the gap has narrowed, but only very slightly.  Two of the three public polls do not find narrowing.  Reported internal polls of both sides find narrowing, but both sides perceive themselves as having a reason to claim this: the Yes campaign to discourage complacency and the No campaign to encourage momentum, media attention and a belief that voting No isn't futile.
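
As a rough illustration of what an aggregate "weighted for house effects" involves: each pollster's Yes reading is adjusted by an estimate of how that pollster typically sits relative to the pack before results are compared over time.  The offsets below are invented purely for illustration; they are not my actual house-effect estimates.

```python
# Toy house-effect adjustment (illustrative offsets only, not real estimates).
house_effect = {"Ipsos": +5.0, "Essential": -2.0, "YouGov": 0.0}  # assumed leans
latest_yes   = {"Ipsos": 70.0, "Essential": 55.0, "YouGov": 59.0}  # headline Yes

adjusted = {p: latest_yes[p] - house_effect[p] for p in latest_yes}
print(adjusted)                                   # pollster-by-pollster
print(sum(adjusted.values()) / len(adjusted))     # adjusted average Yes
```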

If we assume there has been a significant drop in the Yes vote, some would blame it on the antics of the more indisciplined or extreme Yes supporters.  That would be a part of the story (if there is one) but my view is that if someone says "I was going to vote Yes but then those nasty Yes supporters were meanies to the poor No supporters, so I'm voting no because they might be nasty to me too" there is a fairly high chance that that person is concern-trolling.  After all I am yet to see anyone at all saying they were going to vote No but switched to Yes because Kevin Rudd's godson was bashed, because actual fascists put up crackpot homophobic posters or because a church refused to marry a couple who had expressed support on Facebook for same-sex marriage.

Referendum-type "debates" often, but not always, see some level of decline in support for the pro-change position.  They tend to, as this one has done, become arguments about and proxies for everything but the issue in question.

They are also prone to objections of the "yes, but not like that" form, and can be said to be "built to fail".  If the legislative model is specified then some people who support the concept may not like the model.  If the legislative model isn't specified (as in this case) then some people will say that since they don't know what they're voting for they cannot vote for it.  Most people seem to either realise it is actually and intentionally an in-principle question, or else not care about the precise details of religious exemptions.

Is This Trump/Brexit/the Republic Again?

There have been many attempts to argue that the postal survey will return an upset No vote, using the election of Donald Trump, the win for Leave in Brexit, and the 1999 defeat of the Republic referendum as precedents.

At this stage these attempts are not convincing.  The final-poll misses in the cases of Trump and Brexit were not massive.  National final polls had Hillary Clinton beating Donald Trump by an average of about four points in the popular vote, and she beat him by two.  The Brexit miss was worse - polls had Remain winning by about three points, and it lost by 3.8.  But neither provides a precedent for the loser to win while trailing by over 20 points (and an average exceeding 30 points on a yes-no basis) in every published poll.
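
To make the "yes-no basis" comparison concrete, here is the conversion applied to the published headline figures quoted above; the undecideds are simply excluded.

```python
# Convert headline results (Yes, No, with don't-knows) to a two-answer basis.
polls = {"Ipsos": (70, 26), "Essential": (55, 34), "YouGov": (59, 33)}

for name, (yes, no) in polls.items():
    yes2 = 100 * yes / (yes + no)
    print(f"{name}: {yes2:.0f}-{100 - yes2:.0f}, lead {2 * yes2 - 100:.0f} pts")
# Ipsos 73-27 (46-pt lead), Essential 62-38 (24), YouGov 64-36 (28):
# raw leads of over 20 points, averaging above 30 on a yes-no basis.
```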

In the case of the Republic proposal in 1999, support for the generic proposal that Australia should be a republic led opposition by around 15 points in published polls.  But the split in pro-republic forces between direct and indirect election models for the President was a massive one that led to very large numbers of republicans voting no in the (pious as it turned out) hope of bringing on a new vote on a direct-election model within a decade.  As a result the defeat of the republic referendum by 10 points is not a useful precedent either.

There is not, at this stage, a substantial faction of Yes supporters pushing to defeat the postal vote in order to pass a version with specific religious safeguards included or excluded later. In general the complaints about lack of clarity regarding exemptions are coming from those who will be voting No anyway, and they are making these complaints in a classic game of FUD to try to pry "uncertains" off the fence.

There is also not, so far as I can tell, a significant boycott from voters who would otherwise vote Yes.  There are some same-sex attracted people who are affected by the vote to such an extent that they do not feel they can participate, but their numbers will be behind the decimal point in the overall survey result.  Calls for a more widespread boycott from unaffected voters appear to have dissipated - indeed I've been tracking these on social media where they have slowed to an infrequent drip and are mostly made by insincere troublemakers.

So the idea that Yes will fail, given the polling, relies on at least one of these:

* The view that the unfolding "debate" (if you can call it that, it's mostly just an exchange of opposing views that was all done to death years ago) will radically transform opinion leading to vast numbers of voters changing their intended answers, probably in the next two weeks, or

* The view that there is some kind of systematic and massive failure in the polling of the question.  This would arise, for instance, from people who said they were certain to respond not actually responding, from social desirability bias contaminating all the different forms of polling, or from all the pollsters oversampling politically engaged voters on a scale that made the UK 2015 flop look tame.  Another possibility is that pollsters turn out to be asking the wrong question and that there is a large difference between how voters respond to the poll question when asked it in a poll, and how they respond to it on the survey form.  (Someone might say they agree the law should be changed but vote No anyway, for example.)

All of these seem to be big stretches, but we should bear in mind that Australian pollsters greatly lack experience in sampling for voluntary voting, which is much harder than sampling for compulsory voting.  Some other possible sources of polling failure might include the unusual nature of the "voting" process and any impacts of progressive exit polling on voter behaviour.  The Yes campaign will need to guard against supporters assuming that Yes will win based on early exit polling and therefore not bothering to vote.  Enthusiasm should be built for the idea of not just winning, but of trying to win by as much as possible.

I did note in the previous article that there was a large failure in polling on this question in Ireland, with polls overestimating Yes on a two-answer basis by nine points and hence the margin by eighteen.  If this happened in Australia, on current polling Yes would still get home, by a margin of something like 57-43. 
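
Here is the arithmetic behind that 57-43 figure, assuming a current two-answer reading of about 66-34 for Yes (a round figure broadly consistent with the public polls above, not an official aggregate).

```python
# Apply an Irish-scale polling error to an assumed Australian two-answer reading.
yes_two_answer = 66.0       # assumed current Yes reading on a two-answer basis
irish_error    = 9.0        # Irish polls overestimated Yes by ~9 pts (two-answer)

adjusted_yes = yes_two_answer - irish_error
print(f"{adjusted_yes:.0f}-{100 - adjusted_yes:.0f}")   # 57-43
# The Yes lead drops by 18 points (from 32 to 14) but Yes still gets home.
```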

On current polling I can't rule out that some combination of campaign shift, multiple polling errors and maybe vandalism could get No over the line.  But it would take more than was seen with Trump, Brexit and 1999 to do it.

An Alternative Approach

An alternative approach to prediction may be seen at equalitypulse.com, which tracks search activity for "vote yes" and "vote no" via Google trends and on this basis has so far been consistently showing No winning by a large margin.  The author used similar methods in a Greek referendum but the claim that they nailed the Irish same-sex marriage debate appears to be retrospective, rather than one where there was a public predictive test in advance of the vote.  It will be interesting to see how this goes, but it is not at all clear to me why "vote yes" should be a commoner search term than "vote no" in a country where the debate has been going on for many years and a great many Australian voters would not Google anything prior to answering.  It may be a pointer to a tendency of undecided voters to investigate or switch to No that may narrow the margin, or there may be something else going on with it.  Comments from anyone with expertise in this area welcome.

Betting

It may also be of interest to note that one bookmaker (Sportsbet) was at $3 for No in late August, dipped down to $2.25 and is now back at $3, while William Hill which was at $2.40 at one point is now at $3.75.  As has been noted here before, betting markets are not great predictors.
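
For readers who like to translate odds into rough probabilities: the naive implied chance of a No win is one divided by the decimal odds, which slightly overstates the true market view because it ignores the bookmaker's built-in margin.

```python
# Naive implied probabilities from the decimal odds quoted above
# (no correction for the bookmaker's margin).
odds_for_no = {"Sportsbet late Aug": 3.00, "Sportsbet dip": 2.25,
               "Sportsbet now": 3.00, "William Hill earlier": 2.40,
               "William Hill now": 3.75}

for label, odds in odds_for_no.items():
    print(f"{label}: No at roughly {1 / odds:.0%}")
# Roughly 33% -> 44% -> 33%, and 42% -> 27%: the markets briefly took No
# more seriously, then drifted back out.
```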

Disclosure

Mr Kenny thinks that posting pictures of your vote on Twitter is virtue-signalling so I thought I should be doubly virtuous and do it on here as well.  

-----------------------------------------------------------------------------------------------------

Update: Western Australian ReachTEL

A WA-specific ReachTEL has been released, which is interesting because ReachTEL has a history of poor results for same-sex marriage compared to other pollsters and yet this result seems strong.  The poll finds 63-37 support for same-sex marriage in what appears to have been a yes/no forced choice. That sounds particularly good (as WA would probably not be so strongly pro-same-sex marriage as, say, Victoria) but the one cautionary note I would apply is that the poll has only asked a question about in-principle support and not how people actually would or will vote.  The use of a forced choice method is interesting because there is a perception that undecided voters skew to No (by as much as 80:20) in exercises like this.
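
To illustrate why a forced-choice reading can look stronger for Yes than one with an undecided option, here is the 80:20 redistribution applied to purely hypothetical figures (these are not the WA ReachTEL numbers).

```python
# Hypothetical poll: Yes 59, No 33, undecided 8 (illustrative figures only).
yes, no, undecided = 59.0, 33.0, 8.0

# Perception mentioned above: undecideds break to No by as much as 80:20.
yes_80_20 = yes + 0.2 * undecided
no_80_20  = no + 0.8 * undecided
print(f"80:20 split: {yes_80_20:.1f}-{no_80_20:.1f}")        # 60.6-39.4

# Versus simply excluding the undecideds (a proportional split):
yes_excl = 100 * yes / (yes + no)
print(f"excluded:    {yes_excl:.1f}-{100 - yes_excl:.1f}")   # 64.1-35.9
```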

The poll apparently finds over 90% saying they had voted or were very likely to, which I suspect reflects a sampling bias in favour of politically engaged voters, with politically disengaged voters especially likely to self-exclude from robo-polls.  However the reporting also says two-thirds of those who do not support change are unlikely to vote or will not vote.  It is impossible to reconcile these two claims so there is a need to see the full poll to identify the error.

The poll dates are not stated but I would expect that the polling mostly predates widespread awareness of the campaign's most publicised incident to date - the largely ineffectual headbutting of Tony Abbott by a drunken self-styled "anarchist" and same-sex marriage supporter (but not "campaigner" as claimed by many sources) in my home city of Hobart.  We will need to wait for more polls to see what impact this incident will have.

Update: Newspoll

Following the headbutting of Tony Abbott (mentioned above) Newspoll has the lead for Yes down ten points to 57-34.  The full tables are here.  At the time of the poll (taken Thursday to Sunday) 15% of respondents had already voted and 67% said they definitely would.   

Newspoll is actually asking the right question, since it is asking how people will vote or have voted.  Among the 82% (that seems high!) who say they have voted or definitely will vote, Yes holds a slightly greater lead, 61-34.  

These results are still healthy for Yes, but given the vagaries in this sort of polling and the large amount of the vote that's not yet in, the Yes side would not want it to slide much more.  Stupid stuff should be reined in and extremists disowned.

Update: Essential Again

I have limited time this morning so see the Guardian's reporting on Essential here.  Essential finds those who have already voted (around a third) breaking 72:26 to in-principle support with an overall figure of 58:33 among all voters.  As noted above, hard yes voters may be more likely to vote early.

(Note added Monday Oct 2: I intend to do a new article soon when the ABS turnout figures appear on Tuesday).

10 comments:

  1. Why on earth would anyone Google "vote yes" or "vote no"? To look for persuasive arguments, or because their friends have told them "hey, look this up - it's really hysterical!" With absolutely zero inside info on this, I would suspect that in many cases it's the latter. I very mildly wonder what you'd find by googling either of those terms - but so mildly that I can't be bothered doing it.
    And our forms haven't come yet, and it's the Thursday of the second week.

  2. Update - 4 forms just arrived. The 2 of us at home will be posting them back in the next 10 mins and the other 2 will do it when they get home. Whoopee!

  3. And, quite unnecessarily, I crammed a YES and a tick and an equals sign into the "yes" box. It won't make any difference to a mark reader, which will simply report a mark in that box and none in the other - I think you could write NO in the yes box and it would still be seen as a yes - but it was satisfying.

    Replies
    1. I do wonder about some of the more eccentric marking choices and how they might be interpreted. In a normal election there would be scrutineers poring over the informal "pile" with No scrutineers trying to argue (unsuccessfully I'd expect) that cartoon genitals in the Yes box were not a clear expression of voter intent, and so on.

    2. In a normal election there are ballot-paper instructions to use 1,2,3,etc. Here we are simply instructed to "mark" one box, not even whether to use a tick or a cross, and we know that they are using mark readers (as used for multi-choice exams for years), not the much more sophisticated character readers. So I'm pretty confident that any squiggle will do, and also pretty confident that a NO (or genitals) in the yes box would count as a yes, though not silly enough to actually do it. The only ones that will go into an "informal" pile are the ones where there is some trace of a mark in both boxes.

      I would have appreciated instructions from the ABS as to what kinds of mark the machines will read - the days are gone when you had to use a special proprietary mark-sense pencil on a mark-sense card, and I'd imagine they will read at least a B pencil and blue or black biro, but what about an H pencil or red biro?

      But I'm sure it won't matter in the end - at least 55% yesses and maybe 65% will put it beyond doubt. (Yes, some after-the-date polling would be interesting.) And Bernardi and Abbott (who seems increasingly to be a disciple of Cory) will predict the end of the world, or at least of civilisation as we know it, and it still won't happen. (I believe some maddies are expecting the planet Nibiru on Saturday. Good timing!)

  4. More questions!

    1) I hadn't thought about exit polling. Do you think there'll be exit polls after the final date for the return of the survey? It will be a really interesting exercise in polling accuracy given there'll be hard data to compare the poll results with on Nov 15.

    2) Any thoughts on the argument that there is an overestimation of turnout on the basis that the more likely you are to respond to an opinion poll, the more likely you are to respond to the postal survey? i.e. a helluva lot of people who don't want to answer an opinion poll will be similarly disinclined to respond to the survey.

  5. 1) I suspect so, or close enough to the final date as to make little difference. I just hope the exit pollsters will increasingly ask voters to report their actual vote, not their in-principle view.

    2) Plausible - this is a bias that is hard for scaling to eradicate. That said, a lot of sources are counting only the stated certain voters in their estimates of turnout, so their estimates may be conservative enough for this to cancel out.

  6. The "mark with an x" showing your negation of that choice, came up for me this time. How do you vote for "no" with a cross, it is a double negative. ? How do you vote "yes" with the universal symbol for no, being an "x" or a cross. A friend called me stuck on the horns of that dilemma.

    So I used a tick instead.

    In other news, the conflation of irrelevancies continues. One bizarre argument I read on FB was that one of the SSM private member's bills previously posted online seemed to permit people to marry, and that a company is also a people, so the SSM bills are seeking to allow companies to be marriageable.

    I see that as farcical, not necessarily insane, but farcical.

    Replies
    1. I used a tick for the same reason, but I'm sure crosses will be counted as votes for (not against) whichever side they are marked on, given that the instructions only say to mark a box, and given that they are commonly used that way on government forms. A lot of people posting their votes on social media have used crosses. I've been involved in an international election where voters are told they can use any of a tick or a cross or a + sign, so I suspect there are some countries in the world where boxes are commonly marked with a +.

    2. Don't overthink it, historvre! The forms are being fed through Optical Mark Readers, which have no intelligence and have no idea what ticks, crosses, maths symbols, letters of the alphabet or anything else mean. They respond to any mark in the two boxed areas. If you want to be really sure that your mark will be read I guess the best would be to cross-hatch the whole of one box, but if the machines are working properly a single line (in any direction) will do. Or even a question mark! As long as you only mark one box.

