Why EU Policymakers Should Not Rely on the Farmers’ Horizon Ipsos Report

Kees Jansen*

The Ipsos market study “Farmers’ Horizon: One Year After Farmers’ Protests” (2025) aims to capture “the pulse of the farmers population and address the current situation” following a period of farmers’ protest in different EU countries. The study seeks to determine whether farmers perceive any change in their financial situation one year after the protests, whether they are satisfied with the measures adopted by the European Union (EU) and national authorities, and what further actions might be undertaken to support EU farmers in coping with ongoing challenges. After a careful initial reading of the report, I conclude that the study is flawed in several fundamental aspects. Although it gives the impression of scientific rigor, methodological robustness, and representativeness, each of these aspects is, in fact, problematic.

With respect to the first point, many survey questions appear biased and seem to reflect the interests of the study’s funder, CropLife Europe, a major lobby organization representing the pesticide industry. This type of market research is ill-suited to understanding changes in perceptions, let alone real changes in the conditions of farming that may have occurred. In fact, the study did not specifically examine the effect of the EU measures taken after the protests. Rather, it appears to advance CropLife Europe’s agenda of portraying EU environmental policies as outdated and in need of change. Moreover, it lacks the depth and the analytical frameworks, typical of sociological, anthropological, political science, or economic research, that are needed to explore perceptions and shifts in perceptions. The conclusions drawn from the data constitute a form of cherry-picking: only ad hoc issues likely to alarm policymakers are highlighted.

For instance, the report concludes that 22% of farmers plan to cease farming. This finding, however, is not directly related to the short-term EU or national policies, or their absence, that motivated the protests (such as the so-called nitrogen crisis in the Netherlands or the reduction of diesel and farm-equipment subsidies in Germany). Rather, farm closures are a long-term outcome of structural political-economic processes of economic pressure and scale enlargement in agriculture. Such developments cannot plausibly change within the one-year scope of the study. Yet the study’s central question remains: one year after the protests, do farmers perceive a change in their financial situation?

From a methodological standpoint, serious concerns arise regarding the suitability of a web-intercept study for investigating complex questions of motivation and perception. This design, by definition, constitutes an unrestricted, self-selected survey. The report fails to specify the websites on which the study was hosted. The average completion time was only 12 minutes, indicating a high likelihood of ‘satisficing’, that is, respondents providing quick, superficial answers rather than thoughtful, cognitively processed responses. Such “underbelly” responses are particularly problematic when questions rely on closed option lists rather than open-ended prompts. Although the report does not specify the question format, it gives the impression that predominantly closed questions were used.

Web-intercept studies may be acceptable when sample representativeness is not essential. However, the report itself claims that the sample is representative, as evidenced by statements such as “half of the European farmers protested” and “51% of farmers are pessimistic about the future of their operations”. Furthermore, the report presents results for each of the nine countries studied (out of the EU’s 27 member states), implying that national differences are analytically significant. Yet, for most of the report, the specific political and social contexts underlying the protests are excluded from the analysis. These contextual differences are, however, crucial for accurately interpreting the data. As a result, the study creates the misleading impression that EU policy alone is the primary cause of the protests. This is not to suggest that EU policy played no role, but rather that, from a methodological perspective, the study’s design is too limited and flawed to support such claims.

Turning again to the issue of representativeness, significant discrepancies emerge when comparing the average farm size of survey respondents with national averages. For example, while the average farm size in Germany is 61 hectares, the average among German respondents was 433 hectares. Similar disparities are observed in all countries included in the study. This points to a major source of sampling bias, likely stemming from the websites on which the survey was advertised, an aspect the report leaves unspecified. In addition, the selection of countries appears biased toward those with larger farmers’ protests, and no justification for this choice is provided. Consequently, the study cannot credibly claim to be representative.

This brief note has not addressed all the weaknesses and flaws of the study, but the issues discussed above suffice to conclude that EU policymakers should not treat this report as scientific evidence. It does not constitute a scientifically robust or methodologically sound representation of farmers’ sentiments, their motivations to protest, or their interpretation of agricultural policy.

* Kees Jansen, Associate Professor, Rural Sociology Group, Wageningen University