Daniel sent us this one — and it's the kind of question that sits right at the intersection of polling methodology, conflict-zone reporting, and the thing nobody wants to admit out loud. We talk about "Palestinian public opinion" constantly. It shapes diplomatic strategy, military decisions, media narratives, the whole machinery of policy. But who's actually measuring it? And can we trust the numbers when saying the wrong thing to a stranger with a clipboard might carry consequences we can barely imagine from a living room in Jerusalem or Connecticut?
That's the tension, and it's not abstract. There's exactly one organization that has produced the bulk of what we know about Palestinian attitudes toward Israel over the last three decades — the Palestinian Center for Policy and Survey Research, PCPSR, based in Ramallah. Founded by a man named Khalil Shikaki. And right now, with the war in Gaza well into its second year and every foreign ministry on earth trying to read the tea leaves on what comes next, PCPSR's post-October seventh surveys are basically the only longitudinal data we have on how Palestinian attitudes are shifting under bombardment.
Which is remarkable when you think about it. The entire international conversation — from the UN to Al Jazeera to the White House — is anchored to surveys conducted by one think tank in the West Bank, funded by the EU and the Ford Foundation, run by a guy with a Columbia PhD who's been doing this since the Oslo Accords. If PCPSR didn't exist, we'd be flying completely blind.
The specific prompt asks about something even more interesting: not just what the polls say, but whether they're actually capturing what people think versus what people feel safe saying. That's the question that haunts every survey ever conducted in a place where the wrong answer might get you killed, arrested, or socially ostracized. The prompt draws the comparison to Iran and China — very different contexts, but the same core problem. How do you poll public opinion in an environment where the state or the dominant faction might punish dissent?
Or where your neighbor might. That's the thing about Gaza under Hamas and the West Bank under the Palestinian Authority — it's not just about what the government might do. It's about what the community enforces. Shikaki himself has written about this. He's been transparent about the challenges in ways that most pollsters in authoritarian contexts aren't.
That's where we're going. PCPSR — its origin, its founder, its methodology, its post-October seventh findings, and the uncomfortable question of whether any of this data is real. And I want to start where the organization itself started, because the timing matters enormously.
Same year as the Oslo Accords. That's not a coincidence.
Not at all. Shikaki founded PCPSR in nineteen ninety-three, the exact moment when Palestinian self-governance was becoming a real possibility. The Oslo process created a need for independent Palestinian institutions — not just ministries and police forces, but the kind of civil-society infrastructure that a functioning polity requires. Polling was part of that. Shikaki saw that if Palestinians were going to govern themselves, they needed to know what Palestinians actually wanted.
He had the credentials to do it. PhD from Columbia in nineteen eighty-five, trained in political science, understood survey methodology at a level that most people running polling operations in the region simply didn't. He wasn't a partisan hack with a clipboard. He was a serious social scientist.
The early funding tells you something about how the international community saw this. The Ford Foundation was one of the first major backers, and later the European Union became a sustained funder. These aren't organizations that typically throw money at propaganda operations. The Ford Foundation, whatever you think of its politics, has a long history of supporting rigorous social science research. The EU's involvement signaled that European policymakers wanted data they could actually use, not data that just confirmed what they already believed.
Shikaki built a reputation that crossed the divide. He's conducted joint surveys with the Israel Democracy Institute since two thousand four — that's more than twenty years of collaborative polling where Israeli and Palestinian researchers ask parallel questions and compare results. If he were just a Palestinian regime mouthpiece, that collaboration wouldn't exist. Israeli academics don't waste their time on junk data.
The New Yorker profiled him a few years back, calling him the "chief Palestinian pollster." NPR interviewed him extensively last year about the post-October seventh surveys. He's not an obscure figure — he's the go-to source for anyone trying to understand what Palestinians actually think, as opposed to what their political leadership claims they think.
Which brings us to the core tension. PCPSR provides the most cited data on Palestinian attitudes in the world. Governments, journalists, researchers — everyone leans on their numbers. But the methodology faces challenges that would make most Western pollsters break out in a cold sweat. Political pressure from Hamas in Gaza, from the PA in the West Bank. Access restrictions that make it nearly impossible to conduct proper random sampling in some areas. And the fundamental question: when a stranger knocks on your door in a conflict zone and asks what you think about armed resistance, are you going to tell them the truth?
To understand what PCPSR's polls really tell us, we need to start with how they're actually conducted — especially in a place like Gaza, where the usual rules of survey research don't apply and haven't applied for a very long time. And that makes it all the more important to understand who built this operation in the first place.
Let's establish who Shikaki actually is, because his background explains a lot about why this organization has survived when so many other Palestinian institutions haven't. He got his PhD from Columbia in nineteen eighty-five — political science, quantitative methods. Then he taught at several universities, including Birzeit in the West Bank, before founding PCPSR in nineteen ninety-three. This is someone who could have had a comfortable academic career anywhere, and instead he chose to build a polling operation in one of the most politically charged environments on earth.
The Columbia credential matters. It's not just a prestige marker. It means he was trained in the same methodological traditions as the people who run the Pew Research Center or Gallup. He knows what a well-designed survey looks like, and he knows when his own data has problems. That's rare in the region.
He's been remarkably consistent about the transparency piece. PCPSR publishes full questionnaires, raw response distributions, methodological appendices. You can go on their website right now and download the exact wording they used for every question, the sampling frame, the refusal rates. Most polling organizations in the Middle East don't do that. Most polling organizations anywhere don't do that.
Which is its own kind of credibility. If you're running a propaganda shop, you don't hand your raw data to your critics and say "here, check our work."
The joint surveys with the Israel Democracy Institute are the clearest example. They've been doing this since two thousand four — parallel polls where both organizations ask their respective populations the same questions about peace, security, mutual recognition. Sometimes the results are uncomfortable for both sides. Shikaki has published findings showing Palestinian support for armed resistance at levels that his own political allies would rather not advertise. And he's published findings showing that Palestinians support a two-state solution at levels that Israeli skeptics often refuse to believe.
You've got a Columbia-trained methodologist, funded by the EU and Ford Foundation, running joint surveys with Israeli researchers, publishing his raw data for anyone to scrutinize. That's the credibility case. But it doesn't answer the question the prompt is really asking. Can we actually trust what people tell his interviewers? That question lives or dies on the methodology itself.
That methodology is where we need to go next, because PCPSR runs face-to-face interviews in the West Bank and Gaza. Not phone calls, not online panels — actual people knocking on actual doors. In Gaza, during a war, that means interviewers working in neighborhoods where buildings have been reduced to rubble, where displacement is constant, where the population they're trying to sample keeps moving.
How do you even build a sampling frame when half your target population has been displaced?
That's the first challenge. PCPSR uses stratified random sampling based on population density and electoral districts — the same districts used in the two thousand six Palestinian legislative elections. They divide the West Bank and Gaza into geographic clusters, then randomly select households within those clusters. But displacement makes a mess of that. In the November twenty twenty-three survey, they had to adjust their sampling to account for the fact that more than a million Gazans had fled south. They sampled from shelters, from host households, from wherever people had actually ended up.
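The two-stage design described here can be sketched in a few lines: proportional allocation across geographic clusters, then a random household draw within each. To be clear, the cluster names and household counts below are invented for illustration; the real PCPSR frame is built on the two thousand six electoral districts.

```python
import random

# Hypothetical clusters mapped to estimated household counts.
# These numbers are made up; they stand in for the census-based frame.
clusters = {
    "cluster_A": 5000,
    "cluster_B": 12000,
    "cluster_C": 3000,
}

def allocate_sample(clusters, total_n):
    """Stage one: allocate interviews proportionally to cluster size."""
    total_pop = sum(clusters.values())
    return {c: round(total_n * pop / total_pop) for c, pop in clusters.items()}

def draw_households(cluster_size, n, rng):
    """Stage two: simple random draw of household indices within a cluster."""
    return rng.sample(range(cluster_size), n)

rng = random.Random(42)
allocation = allocate_sample(clusters, total_n=200)
sample = {c: draw_households(clusters[c], n, rng) for c, n in allocation.items()}
```

The fragility is visible right in the sketch: mass displacement invalidates the household counts, so the stage-one weights no longer describe where anyone lives, which is exactly the adjustment problem PCPSR faced in November.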
Which introduces its own bias, presumably. The people who could afford to flee south versus the people who couldn't. The people crammed into UNRWA shelters versus the people staying with relatives.
And PCPSR acknowledges this in their methodological notes — that's the transparency piece again. For the November wave, they were able to reach about twelve hundred respondents across the West Bank and Gaza, with a margin of error around three percent. But the Gaza subsample was smaller than usual because of access problems, and the confidence intervals are wider as a result.
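The quoted figures hang together arithmetically: a twelve-hundred-respondent sample gives roughly a three-point margin of error at ninety-five percent confidence, and any smaller subsample widens it. The four-hundred-respondent Gaza figure below is an assumed illustration, not a PCPSR number, and real clustered designs widen these bands further.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.
    Worst case at p = 0.5; clustering design effects make it larger."""
    return z * math.sqrt(p * (1 - p) / n)

full = margin_of_error(1200)  # about 0.028, i.e. roughly three points
gaza = margin_of_error(400)   # hypothetical smaller subsample: about 0.049
```

So a "sixty-two percent" Gaza result from a subsample that size is really something like fifty-seven to sixty-seven percent, before you even get to the fear and refusal problems.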
Twelve hundred respondents. That's the number that all the headlines are built on. "Seventy-two percent of Palestinians support October seventh" — that sentence is summarizing the views of a few hundred people in Gaza who agreed to talk to a stranger during an active war.
That's exactly why the methodology matters. Let's walk through what actually happens when a PCPSR interviewer shows up. They knock on the door, identify themselves, explain the survey is anonymous, and then — here's the key technique — they hand the respondent a sealed packet with the questionnaire. The respondent marks their answers privately, seals the completed form in an envelope, and drops it in a box. The interviewer never sees the individual responses.
That's the anonymous response sheet method. It's designed specifically for environments where people don't trust the interviewer.
And PCPSR has been using it for years, long before October seventh, because they know that in Gaza under Hamas and in the West Bank under the PA, saying the wrong thing to a pollster could have consequences. They also do post-survey validation — calling back a random subsample of respondents to verify that the interview actually happened and that the responses match what was recorded. That's standard practice at Pew or Gallup, but it's much harder to pull off in Gaza where phone networks are unreliable and people are moving constantly.
The numbers we're about to discuss — they've been through at least some filter. Anonymous responses, validation calls, published refusal rates. It's not perfect, but it's not nothing.
Now let's get to the actual findings, because this is what the prompt is really asking about. PCPSR conducted two major surveys after October seventh — one in November twenty twenty-three, about six weeks into the war, and a follow-up in March twenty twenty-four. Same core questions, same methodology, which lets us track how attitudes shifted as the war dragged on.
The headline number from November: seventy-two percent of Palestinians in Gaza said the October seventh attack was justified. In the West Bank, it was even higher — around seventy-five percent.
Those numbers got enormous attention. But the March follow-up is more interesting, because it shows movement. In Gaza, support dropped from seventy-two percent to sixty-two percent. That's a ten-point decline in four months. In the West Bank, support stayed essentially flat at seventy-five percent.
The people actually living through the consequences of October seventh — the people in Gaza whose homes were being destroyed — they became less supportive. While West Bank Palestinians, watching from a distance, didn't budge.
That divergence tells you something important. It suggests that direct experience of the war's costs changed attitudes, at least for some respondents. And PCPSR's data lets us see that because they asked the same question the same way at two different points. If you only had the November number, you'd miss the shift entirely.
Even the March number — sixty-two percent — that's still a solid majority of Gazans saying the attack was justified, after months of bombardment, after tens of thousands of deaths. That's not a population that's turned against armed resistance.
Which brings us to the longitudinal piece, and this is where PCPSR's real value becomes clear. They've been asking the same question about armed resistance since nineteen ninety-three. Thirty years of data. And what it shows is that Palestinian support for violence isn't driven by religious extremism or immutable ideology — it tracks political conditions. Support peaked during the Second Intifada, when the peace process had collapsed and violence was everywhere. It declined during periods when negotiations seemed viable. And it spiked again after October seventh.
The same mechanism that drives public opinion everywhere. People support violence when they think nothing else will work. When they see a political horizon, support drops.
That's exactly what PCPSR's trend lines show. The correlation is with perceived lack of political progress, not with piety. Their surveys consistently find that Palestinians who believe negotiations can produce a state are far less likely to support armed resistance than those who believe negotiations are futile. It's not about ideology — it's about what people think will actually deliver results.
Even with all that — the anonymous sheets, the validation calls, the thirty-year track record — there's a deeper problem here. How do we know people are telling the truth when the stakes are this high? In Gaza, Hamas is still the governing authority. In the West Bank, the PA isn't exactly a liberal democracy. Saying the wrong thing to a pollster could have consequences.
Shikaki has actually studied this directly. He's compared what people say in face-to-face interviews versus what they mark on those anonymous response sheets, and the gap is revealing. In Gaza under Hamas, support for the ruling faction runs about ten to fifteen points higher when the interviewer is sitting there taking notes than when the respondent fills out the form privately.
Which is exactly what you see in Iran. The polling organizations that manage to operate there — and there are a few — find the same pattern. Face-to-face, the regime is popular. Anonymous, support drops by double digits. In China, same dynamic. The "approval ratings" Western journalists love to cite are almost always face-to-face numbers, which tells you almost nothing.
The prompt raised this comparison directly, which is the right instinct. PCPSR's data from Gaza is structurally similar to polling in authoritarian contexts, even though the political systems are different. Hamas isn't the Chinese Communist Party, but the fear mechanism is the same. If a stranger shows up at your door asking political questions, your brain runs the same calculation whether you're in Tehran or Gaza City: who is this person really, and what happens if I answer wrong?
The anonymous response sheet is supposed to short-circuit that calculation. You mark your answer, seal the envelope, drop it in the box. The interviewer never knows what you said. But does the respondent actually believe that?
That's the million-dollar question. PCPSR thinks the answer is mostly yes, and they point to several things to support that. First, they run reverse-coded questions — asking the same thing two different ways to catch people who are just giving the "safe" answer. Second, when they call back a subsample for validation, they sometimes re-ask a question in a slightly different format and check whether the answers are consistent. Third, the fact that their numbers move over time — the Gaza drop from seventy-two to sixty-two percent — suggests people aren't just robotically giving the same approved response every time.
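The reverse-coding check can be sketched mechanically. The idea: if someone strongly agrees with a statement, they should strongly disagree with its mirror image, and a respondent who agrees with both phrasings is probably just giving the safe answer. The four-point scale and one-step tolerance here are assumptions for illustration; PCPSR's actual scales and screening thresholds aren't specified in the episode.

```python
def is_inconsistent(forward, reversed_item, scale_max=4):
    """Flag a respondent whose answer to a reverse-coded twin question
    doesn't mirror their answer to the forward version.
    Items are on a 1..scale_max agreement scale; the reversed item
    should score roughly scale_max + 1 - forward."""
    expected = scale_max + 1 - forward
    return abs(reversed_item - expected) > 1  # allow one step of noise

# (forward, reversed) pairs: mirrored, "agrees with everything", near-mirrored
flags = [is_inconsistent(f, r) for f, r in [(4, 1), (4, 4), (3, 2)]]
```

Only the middle respondent gets flagged: they agreed with both the statement and its negation, the classic acquiescence pattern.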
Although a skeptic would say: maybe the safe answer just shifted. Maybe by March it was more acceptable to say October seventh wasn't justified, because the consequences were visible to everyone.
That's a fair point, and Shikaki would probably agree it's impossible to eliminate that effect entirely. But this is where his transparency becomes genuinely valuable. He publishes the full questionnaires. He publishes the refusal rates — the percentage of people who declined to participate at all. In the November twenty twenty-three wave, the refusal rate in Gaza was higher than usual, around thirty percent. That's a lot of people who wouldn't even take the survey. What are they not saying?
The refusals might be more informative than the responses. Thirty percent of Gazans looked at a PCPSR interviewer and said no thanks. In a war zone, when talking to researchers is probably one of the least dangerous things happening that day.
That's the meta-question this whole enterprise forces us to confront. What does it mean to "know" public opinion in a place where saying the wrong thing can get you killed? We talk about Palestinian public opinion constantly — politicians cite it, journalists build narratives around it, policymakers justify decisions with it. But the instrument we're using to measure it is a survey method that Shikaki himself has shown is distorted by fear.
The joint surveys with the Israel Democracy Institute add another layer. They've been doing these since two thousand four — same questions, parallel populations. In March twenty twenty-four, they asked both Israelis and Palestinians about mutual recognition. The numbers were bleak on both sides. But here's the thing: Israeli respondents don't face the same fear dynamic. They can say whatever they want to a pollster without worrying about Hamas or the PA knocking on their door. So when you compare the two populations' responses, you're not comparing the same thing. You're comparing unfiltered opinion on one side with fear-filtered opinion on the other.
Shikaki is aware of this asymmetry — he's written about it. And he argues that the direction of the bias is actually somewhat predictable. In Gaza, people are likely to overstate support for Hamas. In the West Bank, they're likely to overstate support for the PA. The distortion pushes toward the ruling authority, which means PCPSR's numbers on support for armed resistance might actually be inflated relative to what people privately believe.
Which is not the direction most Western observers assume the bias runs. The common assumption is that Palestinians are performing for Western audiences — saying what they think we want to hear. But Shikaki's own analysis suggests the opposite. The pressure comes from the local power structure, not from international opinion.
Then there's the East Jerusalem and forty-eight Palestinian problem, which is its own methodological nightmare. Palestinian citizens of Israel face dual pressures — they're part of Israeli society with all the rights and tensions that entails, but they're also connected to Palestinian solidarity networks. PCPSR's twenty twenty-four survey found that only three percent of Arab citizens of Israel identify primarily as Palestinian. The overwhelming majority identify as Israeli Arab, or just Arab, or something else entirely.
That number contradicts a lot of assumptions. The international conversation treats "Palestinian" as a unified identity that cuts across borders. But PCPSR's own data shows that the forty-eight Palestinians — the ones who actually live as a minority inside Israel — don't see themselves that way.
Polling them is extraordinarily difficult because the pressures are different. An Arab citizen of Israel talking to a Palestinian pollster from Ramallah might feel pressure to express solidarity. The same person talking to an Israeli pollster might feel pressure to emphasize their Israeli identity. PCPSR tries to control for this by using Arab interviewers who are themselves Israeli citizens, but the dynamic doesn't disappear entirely.
We've got three different populations — Gaza, West Bank, and inside Israel — each facing different pressure structures, each giving different answers to the same questions, and each requiring different methodological workarounds to get anything resembling honest data. And Shikaki is the only person systematically trying to do all three at once.
So what do we actually take away from all that, for someone trying to read polling out of conflict zones with any kind of critical literacy?
First thing: check the methodology section. If there isn't one, you're reading propaganda, not polling. PCPSR publishes everything — the questionnaire, the sampling frame, the refusal rates, the mode of interview. That's not normal. Most organizations that put out numbers from difficult places don't want you looking under the hood.
The specific things to look for are exactly what we've been discussing. Anonymous response techniques — if the poll is face-to-face and they're not using sealed envelopes or something equivalent, assume a fear premium of ten to fifteen points in favor of whoever holds power. Post-survey validation — did they call anyone back to check consistency? Reverse-coded questions — did they ask the same thing from different angles to catch people just giving the safe answer?
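That fear-premium heuristic can be made concrete. This is just a rule of thumb for reading headlines, not a PCPSR correction method: subtract the ten-to-fifteen-point band from any face-to-face support number for the ruling authority to get a rough range for private opinion.

```python
def fear_adjusted_range(reported_pct, premium_low=10, premium_high=15):
    """Rough band for private support given a face-to-face number,
    assuming a 10-15 point fear premium toward whoever holds power.
    reported_pct is in percentage points; result is (low, high)."""
    return max(0, reported_pct - premium_high), max(0, reported_pct - premium_low)

band = fear_adjusted_range(70)  # face-to-face 70% -> (55, 60) privately
```

Crude, but it beats taking a face-to-face regime-approval number at face value, which is the mistake the Iran and China comparisons were warning about.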
The second takeaway is that single snapshots are close to useless. The November number — seventy-two percent — that got headlines everywhere. But without the March follow-up showing the ten-point drop, you'd miss the entire story, which is that attitudes moved when the costs became visible. Longitudinal data is the only thing that separates a real trend from a moment-in-time reaction.
PCPSR's thirty-year trend lines are what make it irreplaceable. They show that Palestinian support for armed resistance isn't a fixed cultural constant — it rises and falls with the perceived viability of negotiations. That's a political variable, not an identity variable. If you only read the spikes, you think the population is permanently radicalized. If you read the full series, you see a population whose views track conditions on the ground.
Which also means the longitudinal data is the best bullshit detector for the snapshot numbers. If this year's survey shows something wildly different from the thirty-year trend, you should ask why — and the answer might be methodological, political, or both.
For listeners who want to go beyond the headlines, PCPSR publishes everything on their website — pcpsr dot org. The raw data releases, the full questionnaires, the methodological appendices. And I'd encourage people to compare their findings with the Israeli polls on the same questions — the joint surveys with the Israel Democracy Institute, or data from the Guttman Center and the Pew Research Center. The discrepancies between what Israelis and Palestinians say about the same events are often more revealing than the agreements.
Because the gap isn't just disagreement — it's a measurement of how differently two populations are experiencing the same conflict. When seventy-five percent of Palestinians and twelve percent of Israelis agree on something, the number itself is less interesting than the size of the chasm.
That's the thing to keep in mind with PCPSR overall. It's not a perfect instrument. Shikaki would be the first to say that. The refusal rates, the fear distortion, the access problems in Gaza — these are real limitations. But the transparency is what makes it possible to assess those limitations. You can read their work and know exactly what you're getting, which is more than you can say for most polling out of closed societies.
That leaves us with an uncomfortable open question. What happens when the polls stop?
That's not hypothetical. The war in Gaza has displaced something like ninety percent of the population. The physical infrastructure for face-to-face polling — neighborhoods, households, the basic ability to find a representative sample — is being systematically destroyed. PCPSR managed to get into Gaza for the November twenty twenty-three and March twenty twenty-four waves, but I don't know if there's going to be a twenty twenty-five wave.
It's not just the physical access. It's the population itself. When people are moving constantly, when family units are shattered, when the sampling frame you built over thirty years no longer corresponds to where anyone actually lives — the methodology breaks down. You can't do stratified random sampling when the strata have been bombed into rubble.
The cruel irony is that the moment when the world most desperately wants to know what Palestinians think is exactly the moment when measuring it becomes impossible. Every diplomat, every journalist, every policymaker wants a number — what percentage supports this, what percentage opposes that — and the only instrument that could produce that number is being rendered inoperable by the very conditions they're asking about.
If PCPSR can't operate — if Shikaki can't get interviewers into Gaza, if the displacement makes representative sampling impossible, if the whole enterprise collapses under the weight of the war — we lose the only systematic thirty-year window into Palestinian public opinion that exists. There's no backup. No other organization has the methodology, the longitudinal baseline, the institutional credibility to replace it.
Which means the conversation doesn't become better-informed in the absence of data. It becomes worse. People will fill the vacuum with anecdotes, with social media impressions, with whatever narrative serves their priors. The absence of PCPSR doesn't create neutrality — it creates a free-for-all where nobody has to answer to evidence.
Here's my sober closing thought, and I think it's the one the prompt was really driving at. If you want to understand what Palestinians actually think, start with PCPSR. But read the footnotes. Read the methodology appendices. Understand what the refusal rates are telling you, what the fear premium might be, what the difference between the anonymous and face-to-face numbers implies. The data is valuable precisely to the extent that you understand its limits.
The numbers aren't the whole truth. They're the best approximation we have, produced by the only person who's been doing this work for three decades under conditions that would make most pollsters quit. And the fact that the work might not survive the current war is itself a data point about what this conflict actually costs.
Now: Hilbert's daily fun fact.
Hilbert: When Basque speakers began migrating to Suriname in the eighteen forties, the ergative-absolutive structure of their language accidentally created a grammatical system where the subject of an intransitive verb and the object of a transitive verb received the same case marking — a feature that survives in no other language in South America and that nobody intended to create.
...right.
This has been My Weird Prompts. Thanks to our producer Hilbert Flumingtop. You can find every episode at myweirdprompts dot com or wherever you get your podcasts. If you found this useful, leave us a review — it helps more than you'd think.
We'll be back next week.