People’s beliefs about public opinion on climate change can play a significant role in determining their own attitudes and their likelihood of engaging in climate-friendly behavior. However, limited research exists on the perception of consensus and on effective ways to inform individuals about public opinion. In this study, we examined whether presenting information in two different formats—packed or unpacked—would affect people’s perception of public agreement on climate change. In two experiments (total N = 506; 151 participants from the USA and 355 participants from Norway), participants read about public opinion on different topics related to climate change, either in an “unpacked” way (e.g., 5% strongly oppose, 8% somewhat oppose, 41% somewhat support, and 46% strongly support funding research into renewable energy) or in a “packed” way (e.g., 13% somewhat or strongly oppose, and 87% somewhat or strongly support funding research into renewable energy), before rating the perceived public (dis)agreement about the topics. We hypothesized that presenting information in a packed way would lead to higher perceived agreement, but found no support for this hypothesis. Interestingly, our results showed that participants’ own beliefs or attitudes were positively related to perceived agreement. The findings contribute to the literature on false consensus and motivated reasoning.

Climate change stands as perhaps the most significant challenge of our time (Cook et al., 2016; IPCC, 2014; Tranter & Booth, 2015) and has been recognized by the World Health Organization as a major threat to human health (WHO, 2019). To combat global warming, 195 countries worldwide have committed to limiting temperature rise to below two degrees Celsius compared to preindustrial levels, in the so-called Paris Agreement (United Nations Climate Change, 2021). Achieving these goals necessitates transformative changes across all levels of human activity. At the individual level, everyday choices collectively wield significant influence over the environment (Lahn & Torvanger, 2017; Wolske & Stern, 2018). Consequently, understanding people’s attitudes, views, and beliefs about climate change becomes crucial for effective climate adaptation (Leiserowitz et al., 2010).

Research in social sciences can play an important role in advancing our understanding of how individuals and societies take steps toward mitigating climate change (Nielsen et al., 2021; Steg, 2023). Seminal psychological studies emphasize that attitudes and efficacy beliefs strongly impact behavioral intentions, which, in turn, connect to actual behavior (Ajzen, 1985; Ajzen & Fishbein, 1975; Rodríguez-Barreiro et al., 2013; Schwartz, 1977). Recent research supports the role of intentions as predictors of environmental behavior (Bamberg & Möser, 2007; Rivis et al., 2009). An essential aspect of this involves how perceived social norms—individuals’ perceptions of what others believe and do—shape shared expectations and perceived appropriateness of behaviors within a social group or society.

People’s attitudes and behaviors are influenced by information about others’ actions and beliefs (Cialdini et al., 1990; Stok & de Ridder, 2019; van der Linden, Clarke, et al., 2015; van der Linden, Leiserowitz, et al., 2015). Essentially, normative beliefs—defined as an individual’s perception of whether others think they should engage in a specific behavior (Ajzen, 1985)—and second-order beliefs—defined as the beliefs an individual holds about others’ beliefs (Mildenberger & Tingley, 2019, p. 1279)—can partially shape individuals’ attitudes and guide their subsequent behaviors. For example, we overestimate the extent to which selfishness is the source of other people’s attitudes and behavior (Brick et al., 2021; Miller & Ratner, 1998), and we underestimate how honest others are, leading to reduced trust in others; however, receiving factually correct information about others’ honesty can lead to higher trust and reduced cynicism (Martuza et al., 2024).

In the context of climate change, people may be more inclined to adopt sustainable behaviors if they perceive a broad consensus that climate change is caused by humans and poses a serious threat to humanity. Conversely, if the perceived agreement is underestimated, individuals might think that skepticism about climate change is the norm, reducing their motivation to act sustainably (Andre et al., 2024b; Geiger & Swim, 2016). Understanding what influences perceived consensus can inform strategies to promote pro-environmental attitudes and behaviors.

Among scientists, there is a strong consensus that climate change is occurring and that human activity is the main cause (Doran & Zimmerman, 2009; Tranter & Booth, 2015), and greater climate expertise among scientists correlates with higher agreement on climate change (Cook et al., 2016). Despite this high level of agreement in the scientific community, public concern about climate change varies, and laypeople still debate whether it is caused by humans (Ballew et al., 2020; Funk & Hefferon, 2019; Mildenberger & Tingley, 2019). In recent years, a stream of research has proposed that perceived scientific agreement plays a pivotal role in people’s attitudes toward climate change (van der Linden, Clarke, et al., 2015; van der Linden, Leiserowitz, et al., 2015) as well as vaccinations (van der Linden, Clarke, et al., 2015) and other contested issues (Chinn et al., 2018). The gateway belief model (van der Linden, 2021) proposes that when people perceive a higher level of scientific agreement on an issue, it serves as a ‘gateway’ that leads to changes in related beliefs, such as increased concern about climate change and the belief that it is human-caused. These shifts in belief can, in turn, influence the level of public support for policies addressing the issue.

However, people do not only look to scientific experts when forming their beliefs. The perceived opinion of the general public also matters, and it has long been known that such second-order beliefs may be inaccurate (Fields & Schuman, 1976). In the climate domain, there is evidence that most people report high personal concern about climate change, but underestimate how concerned others are (Andre et al., 2024b; Geiger & Swim, 2016; Leviston et al., 2024; Pearson et al., 2018). This is important as second-order beliefs are predictive of behavior, often more so than personal beliefs (Cialdini et al., 1990). Underestimating public support for climate-related issues can decrease support for climate policies, even among political actors and intellectual elites (Mildenberger & Tingley, 2019). In the extreme, underestimation of public concern leads to pluralistic ignorance, where a majority falsely believes that fewer people share their opinions than is actually the case (Katz & Allport, 1931; Miller & McFarland, 1987; Prentice & Miller, 1993). Pluralistic ignorance can lead to self-silencing about climate change, and correcting people’s second-order beliefs can increase the willingness to discuss climate change (Andre et al., 2024a; Geiger & Swim, 2016). Additionally, second-order beliefs correlate with individual climate beliefs and behaviors; those who think most of their social circle believes in climate change are more likely to believe in it and support carbon dioxide regulation (Van Boven et al., 2018).

Given the diversity of opinions around climate change, effective communication about consensus—both scientific and public—is critical, and a growing body of research emphasizes that communicating consensus can create changes in people’s beliefs and behaviors (Goldberg et al., 2020; Mildenberger & Tingley, 2019). However, there is a lack of research on factors that influence whether people perceive agreement or disagreement between experts, and among the public (Løhre et al., 2019).

In summary, prior work on social norms and climate change has focused on two main areas. First, it has examined how social norms influence individuals’ attitudes and behaviors. Second, it has explored how accurately people perceive social norms, and whether interventions that provide accurate information about social norms can lead to perceptions that are more aligned with reality. So far, studies have shown that people consistently misunderstand prevailing social norms related to climate change concerns (Andre et al., 2024a, 2024b; Leviston et al., 2024). In the current work, we examine how the information presented regarding the social norm informs people’s understanding of the social norm. Specifically, we test whether the way information is presented affects people’s understanding of the public consensus about climate change.

In this study, we explore how the presentation of public opinion on climate change—specifically, whether it is “packed” or “unpacked”—influences how this information is perceived. The terms “packing” and “unpacking” were adopted from support theory (Tversky & Koehler, 1994), which deals with how different ways of presenting information influence perceived probability. In this context, “packing” refers to the practice of consolidating multiple related pieces of information into a single, more condensed unit, while “unpacking” involves providing a more detailed breakdown of the information. For example, the risk of cardiovascular disease can be communicated either as an overall risk or by unpacking it into individual risks for heart attacks, strokes, hypertension, arrhythmias, cardiomyopathy, heart failure, and congenital heart defects.

In the context of communicating climate change information, packing is often observed in surveys that report public opinions and attitudes. This method is typically used by mass media or interest groups in their information campaigns, where they consolidate similar response options. For instance, a report from the Yale Program on Climate Change Communication summarizes that 93% of respondents from Korea report being either “very worried” or “somewhat worried” about climate change, whereas the unpacked breakdown reveals, for example, that 36% of participants report being “somewhat worried” (Leiserowitz et al., 2023). Such consolidation of responses may lead people to evaluate public agreement differently, compared to when the data is presented in a more detailed (unpacked) manner.

A similar argument was made in recent papers demonstrating the so-called “unlikelihood effect” (Ingendahl et al., 2024; Karmarkar & Kupor, 2023). Here, participants in two conditions were informed of a risk (getting a bacterial infection from a flea bite) with a 58% probability. In one condition the risk was communicated as a single probability (“58% of people get this bacterial infection from getting bitten by a siphonaptera flea”), while in the other condition the risk was divided into multiple probabilities (e.g., “8% of people get this bacterial infection from getting bitten by a siphonaptera flea”, “8% of people get this bacterial infection from getting bitten by a culex flea”, etc.). The unlikelihood effect refers to the finding that participants perceived a higher risk when the same probability (58%) was presented as a single consolidated value rather than as multiple detailed probabilities. In a similar fashion, we expect that receiving information about public opinions in a packed way could lead people to perceive a higher consensus than when the opinions are unpacked into a larger number of categories. To illustrate, say that a survey on support for regulation of CO2 shows that 9% strongly oppose it, 14% somewhat oppose, 43% somewhat support, and 34% strongly support it. Here, we would expect that the agreement regarding CO2 regulation will be perceived as greater if people are instead told that 23% strongly or somewhat oppose it, while 77% somewhat or strongly support it. This could happen both because the numerical majority seems larger in the packed format, and/or because the unpacked version demonstrates a greater granularity of opinions.

We define perceived public consensus as whether one thinks people in a referent social group agree about the issue (Kobayashi, 2018). We build on the empirically supported assertion that second-order beliefs about consensus on climate change can contribute to more people adopting sustainable behavior, as people’s inferences about others’ climate beliefs have implications for their own attitudes and beliefs (Farrow et al., 2017; Leviston et al., 2024; Mildenberger & Tingley, 2019; van Valkengoed & Steg, 2019).

To our knowledge, no previous studies have investigated whether the packing and unpacking of information impacts people’s perceptions of public opinion. Rottenstreich and Tversky’s (1997) work on support theory shows that alternative descriptions of the same event can lead to different judgments about the event’s likelihood. They argue that perceived probability is tied to how events are described (Tversky & Koehler, 1994). One illustrative example was provided by Redelmeier et al. (1995), who gave physicians descriptions of a patient experiencing abdominal pains. Half of the physicians received two possible diagnoses and the category “none of the above”, while the other half received five possible diagnoses (including the two given to the first group) and “none of the above”. The physicians who received a larger number of specific diagnoses, i.e., where “none of the above” was partly unpacked into more specific instances, gave a higher probability (69%) to the corresponding set than those who received only two specific diagnoses and a “packed” none of the above (50%). Support theory argues that the perceived likelihood of a hypothesis depends on the amount of supporting evidence presented compared to alternative hypotheses (Tversky & Koehler, 1994).

While support theory describes the phenomenon of packing and unpacking events, it differs slightly from the rationale behind the present study. Support theory focuses on probability judgments, demonstrating that more detailed information makes events seem more likely. It emphasizes cognitive evaluations over social norm assessments. Nonetheless, studies based on support theory indicate that the same event, when described with varying levels of detail, can be interpreted differently, leading to different behavioral consequences. One example extending support theory outside of probability judgments is Van Boven and Epley (2003), who hypothesized that evaluative judgments are susceptible to a similar unpacking effect. They found that more detailed descriptions produced more extreme evaluations compared to less detailed descriptions of the same event. For example, in a scenario describing an oil company convicted of polluting the environment, judgments were harsher when the consequences of the pollution were unpacked (i.e., the pollution was said to have led to an increase in “asthma, lung cancer, throat cancer, and all other varieties of respiratory diseases”) rather than packed (i.e., the pollution was said to lead to an increase in “all varieties of respiratory diseases”).

Our aim was to investigate how presenting public opinions on climate change topics in a packed vs. unpacked format influences perceived agreement. The main hypothesis was that people will perceive higher public agreement when opinions are presented in a packed manner compared to when presented as individual survey options (unpacked).

Table 1 summarizes the hypotheses tested across two studies, and Table 2 details the analyses conducted in those studies.

Table 1.
Hypotheses investigated in the current study
Hypothesis | Description | Study
Pre-registered hypothesis | When public opinions related to climate change are presented in a packed format compared to an unpacked format, participants will perceive a higher level of agreement among the public. | Study 1 & 2
Exploratory hypothesis | Participants’ own beliefs or attitudes will be positively related to their perceptions of public agreement related to climate change topics. | Study 1 & 2
Table 2.
Details of pre-registered analyses and additional analyses
Hypothesis | Analyses | Study
Pre-registered analyses | Mixed ANOVA with question as a within-subject factor (the six different questions about climate change topics) and condition (packed vs. unpacked) as a between-subject factor, followed by separate t-tests for each of the six questions. Perceived consensus was the outcome variable. | Study 1 & 2
Additional secondary analyses | Linear mixed-effects models accounting for participants’ responses across scenarios and experimental conditions, with perceived consensus as the outcome variable; a set of correlation analyses between participants’ own beliefs/attitudes (Study 1) or behavioral intentions (Study 2) and perceived consensus. | Study 1 & 2

Both Study 1 and Study 2 were pre-registered on the Open Science Framework (OSF), and data collection was conducted after the pre-registration. The pre-registrations, together with datasets and R/RMarkdown code, are available on the OSF at https://osf.io/tk76r/. All measures, manipulations, and exclusions for this investigation are reported, and data collection was completed before analyses. Pre-registrations are available on the OSF: Study 1 - https://osf.io/6wy2m; Study 2 - https://osf.io/c92hg.

Method

Power Analysis

There was no empirical work exploring the topic of the current research that could reasonably inform our sample size planning. However, we note Van Boven and Epley’s (2003) study, “The unpacking effect in evaluative judgments”, as the most closely related to our research. We estimated the effect size of unpacking on evaluative judgments in Van Boven and Epley’s study to range between d = .55 and d = .82, depending on the different dependent measures they employed. For an anticipated effect size of d = .50, an a priori power analysis in G*Power (two-tailed test, alpha level of .05, desired power of .90) suggested a total sample size of 172 participants.
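This calculation can also be reproduced outside of G*Power. Below is a minimal sketch in R using the pwr package, assuming an independent-samples t-test comparing the two presentation formats:

```r
# A minimal sketch of the a priori power analysis, assuming a two-sample
# t-test (packed vs. unpacked) with d = 0.50, two-tailed alpha = .05,
# and power = .90.
library(pwr)

res <- pwr.t.test(d = 0.50, sig.level = 0.05, power = 0.90,
                  type = "two.sample", alternative = "two.sided")
res$n               # ~85 participants per group
2 * ceiling(res$n)  # total N = 172, matching the G*Power result
```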

Participants and Procedures

A total of 186 US American Amazon Mechanical Turk participants were recruited. Data was collected in June 2019, and participants received $0.40 for completing the survey. After excluding those who failed two simple attention checks, those who spent less than 60 seconds on the survey, and those who had identical IP addresses, 151 participants remained (59 female, 92 male), with a mean age of 33.8 years (SD = 10.4).

The participants’ self-reported educational backgrounds varied widely. The majority reported completing some college (39%) or vocational/technical school (31%). Participants also self-reported their political ideology on a five-point scale ranging from 1 (very liberal) to 5 (very conservative). Responses indicated a slight liberal lean overall (M = 2.54, SD = 1.23), with most identifying as very liberal (25%), somewhat liberal (28%), or moderate (25%). Regarding political party affiliation, most participants identified as Democrats (44%) or Independents (32%). For more details about educational background and reported political beliefs, see supplementary materials (Tables S2–S4).

Participants were randomly assigned to one of two conditions in a between-subjects design: Presentation format (packed vs. unpacked condition). In the first part of the study, the participants received information about the results of a nationally representative survey of the climate change opinions of US adults (based on data from the Yale Program on Climate Change Communication; see YPCC & Mason 4C, 2024). The results were presented differently in two between-subjects conditions: participants in the “unpacked” condition were told what percentage of respondents chose each of the possible response options, while participants in the “packed” condition were told what percentage of respondents chose two or more related response options. Participants were informed that the presented response patterns were based on a nationally representative survey. To illustrate, one question concerned how worried people are about climate change. The following text was presented to participants in the unpacked condition:

Below is a question concerning worry about global warming given to the participants in the climate change survey, along with the response options and the distribution of results:

How worried are you about global warming?

Response options: “not at all worried”, “not very worried”, “somewhat worried”, “very worried”

16% chose “not at all worried”

26% chose “not very worried”

39% chose “somewhat worried”

19% chose “very worried”

Participants in the “packed” condition were given the same text, but were told that 42% chose “not at all” or “not very” worried, while 58% chose “somewhat” or “very” worried.

After rating perceived agreement for six different climate change questions, participants answered how easy it was to evaluate the level of public agreement. Next, they reported their own opinions about the six different climate change questions they had just read about. Finally, they answered demographic questions, including questions about their party affiliation and political ideology.

Measures

Full instrumentation can be found in the supplementary materials.

Measure of Consensus

After receiving information about the topics in a packed or unpacked format, participants rated whether they perceived there to be agreement or disagreement among the public with regards to each question, on a seven-point scale from “very high disagreement” to “very high agreement”.

Fluency

The participants reported how easy it was to evaluate the level of public agreement, on a seven-point scale with the endpoints “it was very difficult” and “it was very easy.” This measure was included to explore if a potential effect of presentation format was mediated by how easy it was to process information in the two conditions.

Own Opinions

Participants received the same six questions about climate change, asking them to report their own opinions. For example, one of the questions asked participants to report their own opinion on “How much do you think global warming will harm future generations of people?” with the response options “don’t know”, “not at all”, “only a little”, “a moderate amount”, and “a great deal”.

Actual Level of Agreement1

We included a measure of the actual level of agreement present in the public opinions presented to participants, as the effect of packing vs. unpacking could differ depending on the actual level of agreement. For instance, when the actual consensus among public opinion is high (e.g., 80% are “somewhat” or “very” worried), participants are likely to perceive and rate the consensus as high, perhaps especially in the packed condition. Including a measure of actual agreement serves both to validate our measure of perceived agreement (we would expect a positive correlation between actual and perceived agreement) and to test the robustness of potential effects of packing vs. unpacking.

We used reverse-coded values of standardized entropy scores as a measure of the actual level of public agreement. We calculated this score for each of the six scenarios presented to the participants. Entropy scores reflect how spread out or diverse the responses are and serve as a useful proxy measure representing the level of agreement or consensus among a set of responses2. A higher score on this measure indicates a higher level of actual agreement among the public. Please refer to the supplementary section for additional details on the calculations of the actual level of agreement.
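The supplementary materials contain the exact calculations; as a rough sketch of the logic, the score can be computed as one minus the Shannon entropy of the response distribution, normalized by its maximum possible value (the specific normalization shown here is an assumption for illustration):

```r
# Sketch of an entropy-based agreement score: Shannon entropy of the
# response distribution, standardized by its maximum log(k), then
# reverse-coded so higher values indicate higher actual agreement.
# The exact calculation is detailed in the supplementary materials.
agreement <- function(p) {
  k <- length(p)          # number of response options
  p <- p[p > 0]           # treat 0 * log(0) as 0
  H <- -sum(p * log(p))   # Shannon entropy
  1 - H / log(k)          # standardized entropy, reverse-coded
}

# Example: the "worry" question from Study 1 (16% / 26% / 39% / 19%)
agreement(c(.16, .26, .39, .19))  # ~0.04, i.e., responses are quite spread out
```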

Results

Data Analysis

Confirmatory Analysis Based on Pre-Registration

We conducted a mixed ANOVA with questions as a within-subject factor (the six different questions about climate change topics), and with condition (packed vs. unpacked) as a between-subject factor. There was no support for the condition predictor (see Table 3).
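In R, this pre-registered analysis can be sketched with the afex package; the variable names below are illustrative, not those used in our OSF code:

```r
# Sketch of the pre-registered mixed ANOVA, assuming long-format data with
# one row per participant x question; variable names are illustrative.
library(afex)

fit <- aov_ez(
  id      = "SubjectID",           # participant identifier
  dv      = "perceived_agreement", # 7-point consensus rating
  data    = d,
  between = "format",              # packed vs. unpacked
  within  = "question"             # the six climate change questions
)
fit  # afex applies a sphericity correction by default, consistent with
     # the fractional degrees of freedom reported in Table 3
```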

Table 3.
The results of mixed ANOVA with perceived consensus as the outcome variable, Study 1
Predictor | Df (numerator) | Df (denominator) | MSE | F | Generalized eta squared (ges) | p
Presentation format (unpacked vs. packed) | 1.00 | 149.00 | 7.88 | 0.01 | 0.00 | .933
Domain of survey question | 3.11 | 463.69 | 2.69 | 48.75 | 0.14 | .000
Presentation format × Domain of survey question | 3.11 | 463.69 | 2.69 | 0.83 | 0.00 | .482

We followed the ANOVA with six Welch’s t-tests testing the differences in perceived agreement ratings across packed and unpacked conditions. As seen in Table 4, presenting information about public opinions about climate change in a packed vs. unpacked way did not seem to have any effect on the perceived level of public (dis)agreement. Additionally, we found that participants in the packed condition reported it was slightly easier to assess the agreement (M = 5.76, SD = 1.36) than participants in the unpacked condition (M = 5.33, SD = 1.42; t(145.16) = 1.94, p = .055, Cohen’s d = 0.32 [-0.01, 0.64]), although this difference was not statistically significant.
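For a single question, the follow-up test takes the following form (a sketch with illustrative variable names):

```r
# Sketch of one follow-up comparison: Welch's t-test plus Cohen's d with
# a 95% CI; d_worry is assumed to hold one row per participant.
t.test(perceived_agreement ~ format, data = d_worry,
       var.equal = FALSE)  # Welch's test, no equal-variance assumption

library(effectsize)
cohens_d(perceived_agreement ~ format, data = d_worry)  # d with 95% CI
```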

Table 4.
Means and standard deviations for perceived (dis)agreement and results of Welch’s t-tests, Study 1.
Topic | Packed M (SD) | Unpacked M (SD) | Welch’s t-test | Cohen’s d [95% CI]
Worry | 3.75 (1.52) | 3.92 (1.65) | t(146.86) = -0.64, p = .522 | -0.11 [-0.42, 0.21]
Future harm | 5.04 (1.56) | 4.70 (1.78) | t(148.30) = 1.27, p = .206 | 0.21 [-0.11, 0.53]
Harm plants and animals | 4.85 (1.71) | 4.89 (1.60) | t(139.02) = -0.14, p = .887 | -0.02 [-0.35, 0.30]
Personal harm | 3.37 (1.73) | 3.54 (1.74) | t(143.66) = -0.62, p = .539 | -0.10 [-0.42, 0.22]
Support for renewable energy | 5.44 (1.82) | 5.36 (1.42) | t(124.88) = 0.29, p = .769 | 0.05 [-0.28, 0.37]
Support CO2 regulation | 4.87 (1.66) | 4.82 (1.55) | t(138.99) = 0.18, p = .854 | 0.03 [-0.29, 0.35]

We conducted additional analyses that were not specified in the preregistration; therefore, they are reported here as exploratory analyses.

Exploratory Analysis

As an additional exploratory analysis, we conducted a linear mixed-effects analysis to draw summary conclusions based on within-subject responses to the six items and the between-subject experimental conditions. Linear mixed-effects models (LMEMs) have several advantages: they pool all responses into a single analysis while simultaneously accounting for variability within and across participants and climate-related questions.

We constructed the linear mixed model using the lme4 package (Bates et al., 2015) in R. The p values were calculated using the Satterthwaite approximation for mixed-effects regressions (Kuznetsova et al., 2017). The mixed-effects regression analysis included political ideology, presentation format (packed vs. unpacked), and a measure of the actual level of public agreement as fixed-effect predictors of agreement ratings. Participant IDs and question domain (i.e., IDs corresponding to the six climate change questions) were included as random intercepts. We found no indication that experimental condition predicted agreement ratings (see Model 1 of Table 5). However, actual agreement positively predicted perceived agreement, suggesting that participants’ ratings reflected important aspects of the information they were given.
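A sketch of this specification in lme4/lmerTest syntax (SubjectID and groupvar follow the labels in Table 5; the other variable names are illustrative):

```r
# Sketch of Model 1: fixed effects for ideology, actual agreement, and
# presentation format; random intercepts for participants and questions.
library(lme4)
library(lmerTest)  # Satterthwaite-approximated p values

m1 <- lmer(
  perceived_agreement ~ political_ideology + actual_agreement + format +
    (1 | SubjectID) +  # random intercept per participant
    (1 | groupvar),    # random intercept per climate question
  data = d_long
)
summary(m1)
```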

Table 5.
The results of mixed-effects regression analysis with perceived consensus as the outcome variable, Study 1.
Predictor | Model 1: Estimate [95% CI], p | Model 2: Estimate [95% CI], p
Intercept | 3.52 [3.01, 4.03], p < .001 | 2.93 [1.75, 4.11], p < .001
Political ideology | -0.15 [-0.29, 0.00], p = .053 | -0.10 [-0.25, 0.05], p = .202
Actual consensus | 9.85 [8.38, 11.32], p < .001 | 9.42 [3.15, 15.69], p = .003
Presentation format (unpacked vs. packed) | -0.02 [-0.38, 0.35], p = .924 | 0.01 [-0.35, 0.36], p = .975
Own beliefs or attitudes | – | 0.14 [0.02, 0.25], p = .022

Random effects | Model 1 | Model 2
σ² | 1.79 | 1.68
τ00 (SubjectID) | 0.99 | 0.96
τ00 (groupvar) | – | 0.20
ICC | 0.36 | 0.41
N (SubjectID) | 151 | 151
N (groupvar) | – | 6
Observations | 906 | 906
Marginal R² / Conditional R² | 0.119 / 0.433 | 0.122 / 0.482

Note. τ00 = variance of the random intercepts across participants (SubjectID) and question domains (groupvar); σ² = residual variance; ICC = intraclass correlation coefficient.

While packing did not have a main effect on perceived agreement, it is conceivable that any potential effect might depend on other factors, such as the actual level of agreement and participants’ own beliefs. To explore this possibility, we ran a mixed-effects model including presentation format (packed vs. unpacked), participants’ beliefs, actual public agreement, and potential interactions as predictors. One theoretically interesting possibility would be if we observed an interaction between presentation format and actual level of agreement, such that the effect of packing was greater for high consensus than for low consensus items. However, this interaction was not statistically significant (p = .081) and the interaction pattern did not resemble the proposed pattern. A three-way interaction between presentation format, own beliefs and actual agreement was observed; however, caution is warranted in interpreting this result given the sample size and a p-value of .04. The results of these analyses are provided in the supplementary materials section (see Table S5).
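The exploratory moderation analysis can be sketched as a full three-way interaction model (illustrative variable names, as above):

```r
# Sketch of the exploratory moderation model: all two- and three-way
# interactions between presentation format, own beliefs, and actual
# agreement; full results are reported in Table S5 of the supplement.
library(lme4)
library(lmerTest)

m_int <- lmer(
  perceived_agreement ~ format * own_beliefs * actual_agreement +
    (1 | SubjectID) + (1 | groupvar),
  data = d_long
)
summary(m_int)
```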

We also explored whether participants’ personal opinions on specific climate issues were associated with their agreement ratings. To investigate this, we calculated correlations between participants’ opinions and their agreement ratings for these issues. The results, depicted in Figure 1, revealed that four out of the six correlations were positive and statistically significant, with r’s ranging from 0.18 to 0.27 (ps < 0.012). Correlations were not significant for questions related to worry (r = 0.073, p = .330) and personal harm (r = 0.043, p = .560) concerning climate. We observe a similar positive relationship between participants’ own beliefs and perceived agreement ratings in a mixed-effects regression analysis (see Model 2 in Table 5).
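These correlations can be computed question by question; a sketch assuming wide-format data with illustrative column names:

```r
# Sketch of the per-question correlation analyses between own opinions
# and perceived agreement; column names are illustrative.
topics <- c("worry", "future_harm", "harm_plants_animals",
            "personal_harm", "renewable_energy", "co2_regulation")

for (t in topics) {
  ct <- cor.test(d_wide[[paste0("own_", t)]],
                 d_wide[[paste0("agree_", t)]])
  cat(t, ": r =", round(ct$estimate, 2),
      ", p =", round(ct$p.value, 3), "\n")
}
```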

Figure 1.
Correlations between participants’ personal opinions on specific climate issues and their perceived agreement ratings, Study 1

Note. The x-axis of the plots shows participants’ own beliefs or attitudes on a variety of climate-related issues and the y-axis shows perceived agreement reported by the participants.

As part of our robustness check, we conducted the same analyses on the full sample (i.e., the sample without exclusions), and the results were practically identical. Please refer to the supplementary materials section for the results based on the full sample (see Table S12–S13).

Discussion

There are two noteworthy results of this experiment, the first being that we did not find the predicted effect of unpacking on judgments of agreement, and the second being the relationship between own opinions and perceived agreement. We discuss each topic in turn.

For all six questions, the level of public agreement was judged to be similar whether the information about public opinion was presented in a “packed” or “unpacked” fashion. There are several possible reasons why we do not find the predicted effect of unpacking. First, to make the two conditions as directly comparable as possible, all participants were informed that there were several different options for each question. For example, the participants in the “packed” condition were told that for the question about worry about global warming, 42% chose “not at all worried” or “not very worried”, while 58% chose “somewhat worried” or “very worried”. In other words, even though the information is “packed”, the participants were explicitly reminded that within the percentages described, there are differences in opinion strength. Thus, our manipulation may have been too subtle to produce the predicted outcome. Second, our manipulation differs somewhat from previous studies on “unpacking”, which typically compare a general description of an object or event in the packed condition with a description containing one or more concrete examples in the unpacked condition. As an example, Van Boven and Epley (2003), in their Experiment 1, asked participants to evaluate the consequences of an oil spill that led to an increase in “all varieties of respiratory diseases” in the packed condition, as compared to an increase in “asthma, lung cancer, throat cancer, and all other varieties of respiratory diseases” in the unpacked condition. This again relates to the strength of the manipulation we used. A third possibility is, of course, that packing really does not have the predicted effect on judgments of agreement.

An interesting finding was that the level of perceived agreement correlated positively with the participants’ own opinions. For instance, participants who believed global warming would significantly harm future generations tended to perceive higher levels of public agreement with this view. This can be seen as an example of egocentric bias (Fields & Schuman, 1976) or a kind of false consensus effect (Marks & Miller, 1987). What is noteworthy here is that while studies of false consensus traditionally show that people overestimate the percentage that agrees with them, our results reveal that even when people are given the objective percentage, their perception of this agreement correlates with their own attitudes. For example, participants who share the majority opinion may perceive the same percentage as representing stronger agreement compared to those holding a minority opinion. These findings indicate that individual attitudes are associated with perceptions of public opinion, but the directionality and causal nature of this relationship remain unclear.

The results of our first experiment indicated that packing does not influence perceived consensus, though the manipulation was relatively subtle. In Study 2, we aimed to test whether we would observe the hypothesized effect with a stronger manipulation. In addition to manipulating packing, the study also involved the manipulation of social category salience3. Finally, whereas participants in the previous study were asked to evaluate agreement about general climate change beliefs, participants in this study evaluated agreement based on questions about intentions to engage in pro-environmental behaviors.

Method

Power Analysis

The project was unfunded, and the data collection approach involved a convenience sample, i.e., inviting people to take part in the study via advertisements on university campuses and social media sites. We therefore could not reliably anticipate the response rate, and aimed to double the sample size of the previous study. A sensitivity power analysis, based on the achieved sample, shows that we had 90% power (α = 0.05) to detect an effect size of d = 0.34 for the main effect of the presentation format (see Appendix A in the supplement).
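The sensitivity analysis can be approximated in R, assuming the achieved sample splits roughly evenly across the packed and unpacked conditions:

```r
# Sketch of the sensitivity power analysis: solve for the minimal
# detectable effect size given the achieved sample, assuming ~equal
# group sizes (355 participants across two format conditions).
library(pwr)

pwr.t.test(n = 355 / 2, sig.level = 0.05, power = 0.90,
           type = "two.sample", alternative = "two.sided")
# yields d of roughly 0.34-0.35; Appendix A of the supplement reports
# d = 0.34 for the exact group sizes
```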

Participants and Procedures

A total of 422 Norwegians completed an online questionnaire in October 2021. Respondents were recruited by sharing the survey on social media. Participants who spent less than one minute completing the survey, who answered the attention checks incorrectly, or who reported problems with the survey were excluded4. After exclusions, 355 participants were retained (130 men, 219 women, 3 choosing “other/do not wish to answer”, and 2 who did not respond to the question). Participants were between 18 and 82 years old, with a mean age of 36.79 years (SD = 15.42; two did not report their age), and reported a range of educational backgrounds, with most having completed a longer university or college education (45%) or a short university/college program of up to three years (25%). More details about participant backgrounds are presented in the supplementary materials (see Tables S7–S8).

Participants were randomly assigned to one of four conditions in a between-subjects design: 2 (presentation format: packed vs. unpacked) × 2 (social category salience: yes vs. no). The participants received information about the results of “Klimaundersøkelsen 2021”5, a Norwegian survey on climate change-related concerns. The results were presented in two between-subjects conditions in the same way as in Study 1: participants in the “unpacked” condition were told what percentage of respondents chose each of the possible response options, while participants in the “packed” condition were presented with two aggregated percentages that consolidated the responses into two broad categories.

Participants were informed that the presented response patterns were based on results from a survey conducted in different parts of Norway. After rating perceived agreement for six different behaviors concerning climate change issues, participants answered how easy it was to evaluate the level of public agreement. Next, they reported their own behavioral intentions for the six climate measures. The demographic questions included gender, age, and party affiliation. Participants in experimental conditions in which the social category was made salient were reminded that “group affiliation influences how one perceives and talks about climate change” and answered the demographic questions at the very beginning of the survey, while other participants received this reminder and answered the demographic questions at the very end of the survey.

Measures

Full instrumentation can be found in the supplementary material section. Materials for the survey were taken from “Klimaundersøkelsen 2021”, a Norwegian survey that maps behaviors and attitudes toward climate measures in the population. The survey has been conducted in several different Norwegian regions on a regular basis since 2017. The survey contains questions about how likely the respondents think it is that they will adopt different sustainable behaviors over the next two years. Six of these questions were selected for the current purposes, namely questions about recycling of plastic, meat consumption, flying, public transport, bicycling, and energy efficiency. Based on the results of surveys from the Oslo, Viken, and Bergen6 areas in Norway (total N = 3500), we calculated, for each question, the percentage of respondents who had selected each of the following response options: “I have done it/I do it already”, “Very likely”, “Quite likely”, “Quite unlikely”, “Very unlikely”.7

Measure of Consensus

Participants were asked to report if they found there to be agreement or disagreement among the public about each of the six topics from the climate survey. To illustrate, one question, focused on reducing the use of plastic, was formulated in the following way in the unpacked condition:

In the Climate survey 2021 people received the following question about reducing plastic use:

“How likely is it that you will take the following action in the next two years: reduce plastic consumption?”

23% replied “have done or am already doing it”

20% replied “very likely”

36% replied “quite likely”

17% replied “quite unlikely”

4% replied “very unlikely”

In the “packed” condition, participants were simply told that 79% replied very or quite likely, while 21% replied very or quite unlikely. Thus, unlike in Study 1, participants did not receive fully comparable information about all the response options (e.g., those in the packed condition were not told that “have done or am already doing it” was an option), to simplify the presentation and make the manipulation of packing somewhat stronger. Participants were then asked to rate whether there is agreement or disagreement among the public with regards to each question, on a seven-point scale from “very high disagreement” to “very high agreement”. Please refer to the supplementary material section for the full details of the survey measures.

Fluency

The participants reported how easy it was to evaluate the level of public agreement, on a seven-point scale with the endpoints “it was very difficult” and “it was very easy.”

Own Opinions

Six questions asked the participants to report their own behavioral intentions, using the same scale as in the original survey: have done or am already doing it (1), very likely (2), quite likely (3), quite unlikely (4), very unlikely (5). We reverse-coded the responses so that higher scores represented stronger behavioral intentions.

Actual Level of Agreement

As in Study 1, we used reverse-coded values of standardized entropy scores as a measure of the actual level of agreement.

Results

Data Analysis

Confirmatory Analysis Based on Pre-Registration

We conducted a mixed ANOVA with questions as a within-subject factor (the six different questions about climate change topics), and with condition (packed vs. unpacked) as a between-subject factor. There was no support for the condition predictor (see Table 6).

Table 6.
The results of mixed ANOVA with perceived consensus as the outcome variable, Study 2
Predictor | Df (numerator) | Df (denominator) | MSE | F | Generalized eta squared (ges) | p
Presentation format (unpacked vs. packed) | 1.00 | 353.00 | 4.51 | 0.23 | 0.00 | .631
Domain of survey question | 4.46 | 1573.86 | 0.98 | 287.92 | 0.29 | .000
Presentation format × Domain of survey question | 4.46 | 1573.86 | 0.98 | 1.43 | 0.00 | .216

We conducted six Welch’s t-tests testing the differences in perceived agreement ratings across packed and unpacked conditions. As seen in Table 7, presenting information about public opinions about climate change in a packed vs. unpacked way did not seem to have any effect on the perceived level of public (dis)agreement, except for the question on plastic usage. Additionally, we found that participants in the packed condition reported it was easier to assess the agreement (M = 4.85, SD = 1.62) than participants in the unpacked condition (M = 4.22, SD = 1.64; t(352.69) = 3.66, p < .001, Cohen’s d = 0.39 [0.18, 0.60]).

Table 7.
Means and standard deviations for perceived (dis)agreement and results of the Welch’s t-tests, Study 2.
Topic | Packed M (SD) | Unpacked M (SD) | Welch’s t-test | Cohen’s d [95% CI]
Plastic usage | 5.56 (1.07) | 5.32 (1.05) | t(353.00) = 2.08, p = .038 | 0.22 [0.01, 0.43]
Meat consumption | 3.27 (1.40) | 3.21 (1.18) | t(344.84) = 0.42, p = .673 | 0.05 [-0.16, 0.25]
Using public transport | 4.43 (1.26) | 4.34 (1.19) | t(352.47) = 0.64, p = .522 | 0.07 [-0.14, 0.28]
Flying | 3.68 (1.27) | 3.75 (1.22) | t(352.78) = -0.52, p = .606 | -0.06 [-0.26, 0.15]
Home energy usage | 4.68 (1.21) | 4.78 (1.08) | t(350.01) = -0.80, p = .426 | -0.09 [-0.29, 0.12]
Using bicycles | 3.48 (1.38) | 3.43 (1.23) | t(349.27) = 0.39, p = .696 | 0.04 [-0.17, 0.25]

We conducted additional analyses that were not specified in the preregistration and are therefore reported as exploratory analyses.

Exploratory Analysis

The effects of packing and social category salience on perceived agreement were investigated with a linear mixed model analysis. The mixed-effects regression analysis included presentation format (packed vs. unpacked), social category salience (yes vs. no), and the actual level of public agreement as fixed-effect predictors of agreement ratings, with participants’ own behavioral intentions added in a second model. Participant IDs and question domain (i.e., IDs corresponding to the six climate change questions) were included as random intercepts. There was no support for presentation format or social category salience as predictors of agreement ratings, but the actual level of agreement positively predicted perceived agreement (see Model 1 in Table 8). As an alternative to LMEMs, we also conducted a mixed ANOVA, and the results were practically identical (see Table S11 in the supplemental materials).

Table 8.
The results of mixed-effects regression analysis with perceived agreement as the outcome variable, Study 2
Predictor | Model 1: Estimate [95% CI], p | Model 2: Estimate [95% CI], p
Intercept | 3.05 [2.88, 3.23], p < .001 | 2.83 [1.91, 3.76], p < .001
Demographic salience condition (yes vs. no) | 0.02 [-0.16, 0.20], p = .816 | 0.02 [-0.16, 0.20], p = .797
Presentation format (unpacked vs. packed) | -0.04 [-0.22, 0.14], p = .638 | -0.05 [-0.23, 0.13], p = .588
Actual level of agreement | 26.38 [24.48, 28.28], p < .001 | 25.69 [7.07, 44.31], p = .007
Own behavioral intentions | – | 0.07 [0.04, 0.11], p < .001

Random effects | Model 1 | Model 2
σ² | 1.12 | 0.87
τ00 (SubjectID) | 0.57 | 0.59
τ00 (groupvar) | – | 0.30
ICC | 0.34 | 0.51
N (SubjectID) | 355 | 355
N (groupvar) | – | 6
Observations | 2130 | 2130
Marginal R² / Conditional R² | 0.188 / 0.461 | 0.185 / 0.597

Note. τ00 = variance of the random intercepts across participants (SubjectID) and question domains (groupvar); σ² = residual variance; ICC = intraclass correlation coefficient.

As in Study 1, we also wanted to explore whether a potential effect of packing might depend on the actual level of agreement and participants’ own beliefs. A mixed-effects regression analysis showed support for a three-way interaction between the presentation format (packed vs. unpacked), participants’ own behavioral intentions, and the actual level of agreement on perceived agreement – but in a different way than the three-way interaction found in Study 1. The differences in the pattern of results might be due to differences in sample size, variations in the scenarios between the studies, or other unknown factors. Furthermore, the analysis showed statistically significant interactions between presentation format and own beliefs, as well as a potentially more theoretically interesting interaction between presentation format and actual agreement. This second interaction indicated that there was little difference in perceived agreement between packing and unpacking for low consensus items, but that as the actual level of agreement increased, perceived agreement increased more for packed than for unpacked items (see Figure S4 in the supplement). This could suggest that an effect of packing vs. unpacking might be observed for some items, but given the lack of consistent effects for individual items (see Table 7) and the fact that a similar interaction effect was not observed in Study 1, we think it best to treat these results as highly preliminary. More detailed results of these additional analyses are reported in the supplementary materials section (see Table S10, and Figures S3 and S4).

Additionally, we conducted independent correlation analyses between perceived agreement and participants’ behavioral intentions for each of the six questions. As in Study 1, we found positive and statistically significant correlations for most (five out of six) questions (see Figure 2), with r’s ranging from 0.11 to 0.18 (ps < .040). The correlation was not significant for the question related to the behavior of bicycling more often (r = 0.073, p = .170). A statistically significant and positive relationship between participants’ own behavioral intentions and perceived level of agreement was also found using LMEM, see Model 2 in Table 8.

Figure 2.
Correlations between participants’ personal opinions on specific climate issues and their perceived agreement ratings, Study 2

Note. The x-axis of the plots shows participants’ own behavioral intentions on a variety of climate-related issues and the y-axis shows perceived agreement reported by the participants.

Discussion

The results were consistent with Study 1. First, there was no support for the predicted effect of presentation format on the judgment of agreement. The perceived level of public agreement was judged similarly across “packed” or “unpacked” conditions. Second, participants’ own beliefs were positively associated with perceived agreement. Third, actual levels of agreement predicted perceived agreement. Fourth, participants in the packed condition found it easier to evaluate the level of agreement compared to participants in the unpacked condition.

Study 2 additionally manipulated social category salience as a between-subjects factor: the respondents were reminded that social group and group affiliation can affect how we talk about and think about climate change. The participants’ ratings were similar across social category salience conditions.

Null Effect of “Packing”

The results of our study reveal several interesting conclusions about how information presentation and personal beliefs influence perceptions of public consensus on climate change. Notably, contrary to our main prediction, participants rated agreement similarly whether information was presented in a packed or an unpacked format. We cannot conclude from our findings that presenting information as packed vs. unpacked has no influence whatsoever on perceived consensus. However, in two relatively high-powered experiments, we failed to find evidence in support of our hypothesis, even when using a slightly stronger manipulation of packing in Study 2. Thus, if there is an effect of packing on perceived consensus, it might be quite small, it might depend on the characteristics of the stimuli or of the participants, or might require even stronger manipulations of packing.

In our experiments, participants in the “packed” condition were reminded that if (e.g.) 54% chose “very or quite likely”, the remaining 46% chose “very or quite unlikely”. It might be that packing could have a stronger effect if combined with giving more one-sided information (for instance, only stating that 54% say it is likely they will bike more). This would be in line with previous studies showing that people perceive more disagreement between experts when experts use opposite frames (Løhre et al., 2019). However, this would be a heavy-handed intervention, and might also be seen as more manipulative than the transparent information given in our two experiments.

We discuss several potential explanations for the lack of an effect of packing in the limitations section below, but regardless of the explanation for the null results, we believe it is important to communicate these findings to other researchers. Too often, such “negative” results are left in the file drawer (Rosenthal, 1979), leading to large publication bias (Franco et al., 2014) which can only be combatted by more widespread acceptance of null results (Chopra et al., 2023; Munafó & Neill, 2016). In this specific case, we have performed two pre-registered experiments investigating a reasonable hypothesis based on previous related research (e.g., Karmarkar & Kupor, 2023; Van Boven & Epley, 2003), but fail to find support for our predictions. We believe other researchers should be aware of these findings, to ensure better progress for others interested in similar questions.

Perceived Agreement Correlates With Own Attitudes

Across two samples, we found a consistent association between participants’ self-reported attitudes and their perception of consensus. Specifically, individuals’ perceptions of public agreement on climate change beliefs and climate-related behaviors were positively correlated with their own beliefs and attitudes toward the issue. This phenomenon, akin to an egocentric bias or a false consensus effect (Ross et al., 1977), highlights the extent to which personal beliefs can shape second-order beliefs—such as perceived public opinion.

Importantly, our findings extend the traditional understanding of the false consensus effect. Participants were presented with response patterns derived from representative social surveys, which comes as close to an objective depiction of public opinion as one can reasonably get. Despite this, our results suggest that participants’ perceptions of public consensus were associated with their own views. This indicates that personal beliefs may shape how individuals interpret public opinion, even when presented with factual information. Interventions aimed at addressing misperceptions of public opinion may need to consider how individual beliefs influence interpretations of public consensus.

We also explored whether there might be an interaction between presentation format and other factors. For example, the packing of information could shape the extent of the egocentric bias while accounting for the actual public agreement. We performed analyses including a three-way interaction between the presentation format (unpacked vs. packed), participants’ own beliefs and attitudes, and the actual level of agreement on the perceived level of agreement. Although the results across two studies showed statistically significant three-way interaction effects, the pattern of the results was inconsistent across the two studies (please refer to the supplementary materials section for detailed results). We were also interested in whether there was a two-way interaction between actual agreement and presentation format. In Study 1, this interaction was not statistically significant, but in Study 2, there was a statistically significant interaction, with a tendency to perceive higher agreement in the packed condition (as compared to the unpacked condition) as actual agreement increased. This could be seen as consistent with packing first and foremost having an effect for high consensus items. However, since a similar pattern was not observed in Study 1, we interpret these results with caution, but hold the possibility open that presentation format could influence perceived agreement depending on other factors.

Theoretical and Practical Implications

A practical implication of the current findings is that it does not seem to matter much whether information about public opinion is presented in a packed or unpacked format, as recipients (at least in these two studies) perceive similar degrees of public (dis)agreement regardless of the format. However, information format did influence how easy participants found it to rate perceived consensus: they found it easier to evaluate agreement when receiving “packed” information. Together, these two findings indicate that those communicating public opinion can safely use a “packed” format without fear of being manipulative.

These findings extend the literature on the false consensus effect and motivated reasoning by demonstrating their influence in the context of climate change—a critical societal and environmental issue. From a theoretical standpoint, this research highlights the need for more nuanced models of information processing in the context of environmental communication. Understanding that personal biases may significantly affect information processing can guide future research into more effective communication strategies that account for these biases.

Limitations and Future Research Directions

The present investigation has several limitations that warrant consideration when interpreting the findings. First, our materials were limited to general climate change beliefs and climate-related behaviors; we therefore cannot rule out that different conclusions would apply to other aspects of the climate change domain, or to other domains entirely. Second, we drew samples from highly developed societies in the USA and Norway. Consequently, caution is necessary when applying our results to other cultural contexts. Furthermore, even within the USA and Norway, our samples are convenience samples that are hardly representative of the general population. Third, we focus on the perception of consensus in public opinion but do not test how this perception relates to behavior. Fourth, because we designed the packed and unpacked conditions to present comparable information, the manipulation might have been too subtle to induce a noticeable effect. Fifth, while results across the two studies are consistent, the studies differ in several aspects: different samples (US MTurkers vs. Norwegian students/members of the general population), different topics (climate change attitudes vs. pro-environmental behavior intentions) with different levels of actual agreement, and slightly different manipulations of packing.

The sixth limitation concerns the actual level of agreement about the topics given to participants. Across the two studies, the majority response in all questions except one (the question about personal harm in Study 1) gravitated towards the “positive” or “climate concerned” pole. Thus, although our intention was to assess agreement in terms of (lack of) response variability, it is theoretically possible that participants interpreted the questions as asking about whether the public held “positive” opinions. While we cannot entirely rule out this alternative interpretation, the fact that actual agreement positively predicted perceived agreement mitigates this concern.

Seventh, while our design aimed to reflect ecologically valid ways in which public opinion is typically communicated, the simultaneous application of packing and unpacking to both agreement and disagreement may complicate interpretation. For example, packing could potentially make people perceive both agreement and disagreement as greater, with the effects cancelling each other out.8 In our reading, this cancelling would not apply as strongly to high-consensus items, where the combination of a larger numerical majority in the packed condition and the greater granularity of options in the unpacked condition would both pull in the same direction (towards greater perceived agreement in the packed condition). Exploratory analyses showed some support for this idea in Study 2, but not in Study 1. Future research can address these limitations by incorporating a more diverse set of items (e.g., items where the majority opinion is negative or skeptical) and by testing alternative designs that disentangle the effects of packing and unpacking on agreement and disagreement. Stronger or more targeted manipulations may provide greater clarity in understanding how these presentation formats shape perceptions of consensus.
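
To make the cancelling argument concrete, the toy example below packs both poles of a hypothetical four-category distribution (labels and numbers are invented for illustration): the packed total on each side exceeds the largest unpacked category on that side, so packing could inflate perceived agreement and perceived disagreement simultaneously.

```python
def pack(distribution):
    """Collapse a four-option scale into packed 'oppose' vs. 'support'
    totals (hypothetical labels; values are percentages)."""
    return {
        "oppose": distribution["strongly oppose"] + distribution["somewhat oppose"],
        "support": distribution["somewhat support"] + distribution["strongly support"],
    }

# Mid-consensus item: the packed totals (35% oppose, 65% support) are both
# larger than the biggest unpacked category on their side (20% and 40%).
print(pack({"strongly oppose": 15, "somewhat oppose": 20,
            "somewhat support": 25, "strongly support": 40}))

# High-consensus item: the packed support total (93%) dwarfs the packed
# opposition (7%), so both numbers pull towards greater perceived agreement.
print(pack({"strongly oppose": 2, "somewhat oppose": 5,
            "somewhat support": 13, "strongly support": 80}))
```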

Eighth, it is important to note that our findings regarding the relationship between self-reported attitudes and perceived consensus are correlational in nature, and causal inferences cannot be made. While we find it plausible that people’s (presumably) pre-existing beliefs shape the way they perceive public opinions, it is also conceivable that observing the consensus influences how participants report their own beliefs. To clarify the causality and directionality of this relationship, future research should employ experimental or longitudinal methods.

Finally, a limitation of this research is the simplicity of the measures used. Each issue on climate change or related behavior was assessed using a single question, which may have reduced reliability and contributed to the null results. Future research should consider using more comprehensive and validated measures to enhance robustness.

In light of these limitations, it is prudent to approach the conclusions drawn from our research with care. Nevertheless, we believe our findings are a starting point for further research on perceived consensus, perhaps examining other psychological factors, such as identity-consistent perception (Hu et al., 2017) or other types of presentation and framing effects.

In conclusion, while the format in which information about public opinion is presented (packed vs. unpacked) did not significantly influence perceptions of public agreement on climate change in our studies, individuals’ personal beliefs seem to play a pivotal role. The present results highlight the importance of understanding public consensus about climate change and inform practical implications for communication strategies, policy-making, and public engagement. This study also calls for continued exploration of the cognitive mechanisms underpinning consensus perception, particularly in the context of highly polarized issues such as climate change.

Author Contributions

Subramanya Prasad Chandrashekar: Data curation, Formal analysis, Methodology, Software, Visualization, Writing – original draft, Writing – review and editing.

Erik Løhre: Conceptualization, Data curation, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Writing – review and editing.

Jenny Skjellet: Conceptualization, Investigation, Methodology, Writing – review and editing.

Alf Børre Kanten: Conceptualization, Investigation, Methodology, Writing – review and editing.

Conflicts of Interest

The author(s) declared no potential conflicts of interest with respect to the authorship and/or publication of this article.

Data Availability

All materials, data, analysis scripts, and pre-registrations can be found on this paper’s project page on the OSF: https://osf.io/tk76r/

Funding

The project is supported by BI Norwegian Business School.

Ethics

Research was conducted according to the Norwegian Guidelines for Research Ethics in the Social Sciences and the Humanities.

Footnotes

1. The measure of actual level of agreement is part of the exploratory analyses and was included retrospectively, as we realized it was important to control for this factor when analyzing a potential effect of packing vs. unpacking.

2. The calculation of entropy scores is somewhat technical, but simply put: when the reverse-coded standardized entropy measure is close to 0, the distribution of responses is close to uniform (i.e., all categories are equally likely), while when it is close to 1, the distribution is highly skewed (i.e., most responses are concentrated in a few categories, signaling higher agreement).
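
To illustrate, here is a minimal sketch of how such a reverse-coded standardized entropy score can be computed; the response distributions are hypothetical and not taken from our materials.

```python
import math

def agreement_score(percentages):
    """Reverse-coded standardized (Shannon) entropy of a response
    distribution: 0 = perfectly uniform responses (low agreement),
    1 = all responses in a single category (high agreement)."""
    k = len(percentages)                             # number of response categories
    probs = [p / 100 for p in percentages if p > 0]  # 0 * log(0) treated as 0
    entropy = -sum(p * math.log(p) for p in probs)
    return 1 - entropy / math.log(k)                 # normalize by max entropy log(k)

print(agreement_score([25, 25, 25, 25]))  # 0.0: uniform, low agreement
print(agreement_score([2, 6, 40, 52]))    # ~0.31: moderately skewed
print(agreement_score([90, 4, 3, 3]))     # ~0.69: concentrated, high agreement
```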

3. The social category salience manipulation was part of an exploratory question regarding whether group identities may influence the main effect of packing. The manipulation varied whether participants answered demographic questions at the beginning of the survey (before reading and answering the agreement measures) or toward the end of the survey. As this was exploratory and not our main concern, we do not discuss the theoretical bases for this manipulation.

4. As a robustness check, we analyzed the data without excluding participants and found that the results were practically identical (see Table S14 in the Supplementary Material section).

5. The surveys we based our materials on can be found online at the following links: https://www.klimaoslo.no/rapport/klimaundersokelsen-2021-befolkning-oslo/; https://www.klimaoslo.no/rapport/klimaundersokelsen-2021-befolkning-viken/; https://www.bergen.kommune.no/api/rest/filer/V29080390. We have also posted copies of the original surveys on the OSF project page: https://osf.io/tk76r/files/osfstorage

6. Oslo is the capital and largest city of Norway (population about 717,000), Viken is the region surrounding Oslo (population about 1,310,000), and Bergen (population about 292,000) is the second largest city in Norway, located on the west coast of the country. We had access to surveys from these three locations and combined them to get as close as possible to a representative estimate of public beliefs in Norway.

7. Two of the three surveys included the response option “not relevant”. To make the surveys comparable, “not relevant” responses were excluded when calculating average percentages for the three surveys.
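
Concretely, excluding a category amounts to renormalizing the remaining percentages so that they again sum to 100%. The sketch below illustrates this with invented labels and numbers.

```python
def renormalize(responses, drop="not relevant"):
    """Drop one response category and rescale the rest to sum to 100%."""
    kept = {option: pct for option, pct in responses.items() if option != drop}
    total = sum(kept.values())
    return {option: 100 * pct / total for option, pct in kept.items()}

# Hypothetical distribution including a "not relevant" option:
survey = {"strongly oppose": 4, "somewhat oppose": 6,
          "somewhat support": 40, "strongly support": 40,
          "not relevant": 10}
print(renormalize(survey))
# Remaining options rescale to roughly 4.4%, 6.7%, 44.4%, and 44.4%.
```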

8. We thank an anonymous reviewer for suggesting this possibility.

References

Ajzen, I. (1985). From intentions to actions: A theory of planned behavior. In J. Kuhl & J. Beckman (Eds.), Action-control: From cognition to behavior (pp. 11–39). Springer.
Ajzen, I., & Fishbein, M. (1975). A Bayesian analysis of attribution processes. Psychological Bulletin, 82(2), 261–277. https:/​/​doi.org/​10.1037/​h0076477
Andre, P., Boneva, T., Chopra, F., & Falk, A. (2024a). Globally representative evidence on the actual and perceived support for climate action. Nature Climate Change, 14(3), 253–259. https:/​/​doi.org/​10.1038/​s41558-024-01925-3
Andre, P., Boneva, T., Chopra, F., & Falk, A. (2024b). Misperceived social norms and willingness to act against climate change. Review of Economics and Statistics, 1–46. https:/​/​doi.org/​10.1162/​rest_a_01468
Ballew, M. T., Rosenthal, S. A., Goldberg, M. H., Gustafson, A., Kotcher, J. E., Maibach, E. W., & Leiserowitz, A. (2020). Beliefs about others’ global warming beliefs: The role of party affiliation and opinion deviance. Journal of Environmental Psychology, 70, 101466. https:/​/​doi.org/​10.1016/​j.jenvp.2020.101466
Bamberg, S., & Möser, G. (2007). Twenty years after Hines, Hungerford, and Tomera: A new meta-analysis of psycho-social determinants of pro-environmental behaviour. Journal of Environmental Psychology, 27(1), 14–25. https:/​/​doi.org/​10.1016/​j.jenvp.2006.12.002
Bates, D., Kliegl, R., Vasishth, S., & Baayen, H. (2015). Parsimonious mixed models. arXiv preprint arXiv:1506.04967.
Brick, C., Fillon, A., Yeung, S. K., Wang, M., Lyu, H., Ho, J. Y. J., … Feldman, G. (2021). Self-interest is overestimated: Two successful pre-registered replications and extensions of Miller and Ratner (1998). Collabra: Psychology, 7(1), 23443. https:/​/​doi.org/​10.1525/​collabra.23443
Chinn, S., Lane, D. S., & Hart, P. S. (2018). In consensus we trust? Persuasive effects of scientific consensus communication. Public Understanding of Science, 27(7), 807–823. https:/​/​doi.org/​10.1177/​0963662518791094
Chopra, F., Haaland, I., Roth, C., & Stegmann, A. (2023). The null result penalty. The Economic Journal, 134(657), 193–219. https:/​/​doi.org/​10.1093/​ej/​uead060
Cialdini, R. B., Reno, R. R., & Kallgren, C. A. (1990). A focus theory of normative conduct: Recycling the concept of norms to reduce littering in public places. Journal of Personality and Social Psychology, 58(6), 1015. https:/​/​doi.org/​10.1037/​0022-3514.58.6.1015
Cook, J., Oreskes, N., Doran, P. T., Anderegg, W. R., Verheggen, B., Maibach, E. W., & Rice, K. (2016). Consensus on consensus: A synthesis of consensus estimates on human-caused global warming. Environmental Research Letters, 11(4), 048002. https://doi.org/10.1088/1748-9326/11/4/048002
Doran, P. T., & Zimmerman, M. K. (2009). Examining the scientific consensus on climate change. Eos, Transactions American Geophysical Union, 90(3), 22–23. https:/​/​doi.org/​10.1029/​2009EO030002
Farrow, K., Grolleau, G., & Ibanez, L. (2017). Social norms and pro-environmental behavior: A review of the evidence. Ecological Economics, 140, 1–13. https:/​/​doi.org/​10.1016/​j.ecolecon.2017.04.017
Fields, J. M., & Schuman, H. (1976). Public beliefs about the beliefs of the public. Public Opinion Quarterly, 40(4), 427–448. https:/​/​doi.org/​10.1086/​268330
Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345(6203), 1502–1505. https:/​/​doi.org/​10.1126/​science.1255484
Funk, C., & Hefferon, M. (2019). US public views on climate and energy (p. 25). Pew Research Center. https:/​/​www.pewresearch.org/​science/​2019/​11/​25/​appendix-detailed-tables/​
Geiger, N., & Swim, J. K. (2016). Climate of silence: Pluralistic ignorance as a barrier to climate change discussion. Journal of Environmental Psychology, 47, 79–90. https:/​/​doi.org/​10.1016/​j.jenvp.2016.05.002
Goldberg, M. H., van der Linden, S., Leiserowitz, A., & Maibach, E. (2020). Perceived social consensus can reduce ideological biases on climate change. Environment and Behavior, 52(5), 495–517. https:/​/​doi.org/​10.1177/​0013916519853302
Hu, S., Jia, X., Zhang, X., Zheng, X., & Zhu, J. (2017). How political ideology affects climate perception: Moderation effects of time orientation and knowledge. Resources, Conservation and Recycling, 127, 124–131. https:/​/​doi.org/​10.1016/​j.resconrec.2017.09.003
Ingendahl, M., Woitzel, J., & Alves, H. (2024). Who shows the Unlikelihood Effect–and why? Psychonomic Bulletin & Review, 32, 1768–1781. https:/​/​doi.org/​10.3758/​s13423-024-02453-z
IPCC. (2014). Annex II: Glossary. In V. R. Barros, C. B. Field, D. J. Dokken, M. D. Mastrandrea, K. J. Mach, T. E. Bilir, et al. (Eds.), Climate Change 2014: Impacts, Adaptation, and Vulnerability. Part B: Regional Aspects. Contribution of Working Group II to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (pp. 1757–1776). Cambridge University Press.
Karmarkar, U. R., & Kupor, D. (2023). The unlikelihood effect: When knowing more creates the perception of less. Journal of Experimental Psychology: General, 152(3), 906–920. https:/​/​doi.org/​10.1037/​xge0001306
Katz, D., & Allport, F. (1931). Student attitudes: A report of the Syracuse University research study. Craftsman Press.
Kobayashi, K. (2018). The impact of perceived scientific and social consensus on scientific beliefs. Science Communication, 40(1), 63–88. https://doi.org/10.1177/1075547017748948
Kuznetsova, A., Brockhoff, P. B., & Christensen, R. H. B. (2017). lmerTest package: Tests in linear mixed effects models. Journal of Statistical Software, 82(13). https://doi.org/10.18637/jss.v082.i13
Lahn, B., & Torvanger, A. (2017). Klimavalg 2017. CICERO Report.
Leiserowitz, A., Maibach, E., Roser-Renouf, C., & Smith, N. (2010). Climate change in the American mind: Americans’ global warming beliefs and attitudes in January 2010. Yale and George Mason University. Yale Project on Climate Change.
Leiserowitz, A., Verner, M., Goddard, E., Wood, E., Carman, J., Ordaz Reynoso, N., Thulin, E., Rosenthal, S., Marlon, J., & Buttermore, N. (2023). International Public Opinion on Climate Change, 2023 (pp. 1–45). Yale Program on Climate Change Communication.
Leviston, Z., Nangrani, T., Stanley, S. K., & Walker, I. (2024). Consequences of group-based misperceptions of climate concern for efficacy and action. Current Research in Ecological and Social Psychology, 6, 100189. https:/​/​doi.org/​10.1016/​j.cresp.2024.100189
Løhre, E., Sobkow, A., Hohle, S. M., & Teigen, K. H. (2019). Framing experts’ (dis)agreements about uncertain environmental events. Journal of Behavioral Decision Making, 32(5), 564–578. https:/​/​doi.org/​10.1002/​bdm.2132
Marks, G., & Miller, N. (1987). Ten years of research on the false-consensus effect: An empirical and theoretical review. Psychological Bulletin, 102(1), 72–90. https:/​/​doi.org/​10.1037/​0033-2909.102.1.72
Martuza, J., Thorbjørnsen, H., & Sjåstad, H. (2024, June 27). Beliefs versus reality: People overestimate the actual dishonesty of others. https:/​/​doi.org/​10.31234/​osf.io/​nm2cz
Mildenberger, M., & Tingley, D. (2019). Beliefs about climate beliefs: The importance of second-order opinions for climate politics. British Journal of Political Science, 49(4), 1279–1307. https://doi.org/10.1017/S0007123417000321
Miller, D. T., & McFarland, C. (1987). Pluralistic ignorance: When similarity is interpreted as dissimilarity. Journal of Personality and Social Psychology, 53(2), 298. https:/​/​doi.org/​10.1037/​0022-3514.53.2.298
Miller, D. T., & Ratner, R. K. (1998). The disparity between the actual and assumed power of self-interest. Journal of Personality and Social Psychology, 74(1), 53. https:/​/​doi.org/​10.1037/​0022-3514.74.1.53
Munafò, M., & Neill, J. (2016). Null is beautiful: On the importance of publishing null results. Journal of Psychopharmacology, 30(7), 585. https://doi.org/10.1177/0269881116638813
Nielsen, K. S., Clayton, S., Stern, P. C., Dietz, T., Capstick, S., & Whitmarsh, L. (2021). How psychology can help limit climate change. American Psychologist, 76(1), 130–144. https:/​/​doi.org/​10.1037/​amp0000624
Pearson, A. R., Schuldt, J. P., Romero-Canyas, R., Ballew, M. T., & Larson-Konar, D. (2018). Diverse segments of the US public underestimate the environmental concerns of minority and low-income Americans. Proceedings of the National Academy of Sciences, 115(49), 12429–12434. https:/​/​doi.org/​10.1073/​pnas.1804698115
Prentice, D. A., & Miller, D. T. (1993). Pluralistic ignorance and alcohol use on campus: Some consequences of misperceiving the social norm. Journal of Personality and Social Psychology, 64(2), 243. https://doi.org/10.1037/0022-3514.64.2.243
Redelmeier, D. A., Koehler, D. J., Liberman, V., & Tversky, A. (1995). Probability judgment in medicine: Discounting unspecified possibilities. Medical Decision Making, 15(3), 227–230. https:/​/​doi.org/​10.1177/​0272989X9501500305
Rivis, A., Sheeran, P., & Armitage, C. J. (2009). Expanding the affective and normative components of the theory of planned behavior: A meta-analysis of anticipated affect and moral norms. Journal of Applied Social Psychology, 39(12), 2985–3019. https:/​/​doi.org/​10.1111/​j.1559-1816.2009.00558.x
Rodríguez-Barreiro, L. M., Fernández-Manzanal, R., Serra, L. M., Carrasquer, J., Murillo, M. B., Morales, M. J., … del Valle, J. (2013). Approach to a causal model between attitudes and environmental behaviour. A graduate case study. Journal of Cleaner Production, 48, 116–125. https:/​/​doi.org/​10.1016/​j.jclepro.2012.09.029
Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86(3), 638–641. https:/​/​doi.org/​10.1037/​0033-2909.86.3.638
Ross, L., Greene, D., & House, P. (1977). The “false consensus effect”: An egocentric bias in social perception and attribution processes. Journal of Experimental Social Psychology, 13(3), 279–301. https:/​/​doi.org/​10.1016/​0022-1031(77)90049-X
Rottenstreich, Y., & Tversky, A. (1997). Unpacking, repacking, and anchoring: Advances in support theory. Psychological Review, 104(2), 406. https:/​/​doi.org/​10.1037/​0033-295X.104.2.406
Schwartz, S. H. (1977). Normative influences on altruism. In Advances in Experimental Social Psychology (Vol. 10, pp. 221–279). Academic Press. https:/​/​doi.org/​10.1016/​S0065-2601(08)60358-5
Steg, L. (2023). Psychology of climate change. Annual Review of Psychology, 74(1), 391–421. https:/​/​doi.org/​10.1146/​annurev-psych-032720-042905
Stok, F. M., & de Ridder, D. T. (2019). The focus theory of normative conduct. In Social Psychology in Action (pp. 95–110). Springer. https:/​/​doi.org/​10.1007/​978-3-030-13788-5_7
Tranter, B., & Booth, K. (2015). Scepticism in a changing climate: A cross-national study. Global Environmental Change, 33, 154–164. https:/​/​doi.org/​10.1016/​j.gloenvcha.2015.05.003
Tversky, A., & Koehler, D. J. (1994). Support theory: A nonextensional representation of subjective probability. Psychological Review, 101(4), 547–567. https:/​/​doi.org/​10.1037/​0033-295X.101.4.547
United Nations Climate Change. (2021, October). The Paris agreement.
Van Boven, L., Ehret, P. J., & Sherman, D. K. (2018). Psychological barriers to bipartisan public support for climate policy. Perspectives on Psychological Science, 13(4), 492–507. https:/​/​doi.org/​10.1177/​1745691617748966
Van Boven, L., & Epley, N. (2003). The unpacking effect in evaluative judgments: When the whole is less than the sum of its parts. Journal of Experimental Social Psychology, 39(3), 263–269. https:/​/​doi.org/​10.1016/​S0022-1031(02)00516-4
van der Linden, S. (2021). The Gateway Belief Model (GBM): A review and research agenda for communicating the scientific consensus on climate change. Current Opinion in Psychology, 42, 7–12. https:/​/​doi.org/​10.1016/​j.copsyc.2021.01.005
van der Linden, S., Clarke, C. E., & Maibach, E. W. (2015). Highlighting consensus among medical scientists increases public support for vaccines: evidence from a randomized experiment. BMC Public Health, 15, 1–5. https:/​/​doi.org/​10.1186/​s12889-015-2541-4
van der Linden, S., Leiserowitz, A. A., Feinberg, G. D., & Maibach, E. W. (2015). The scientific consensus on climate change as a gateway belief: Experimental evidence. PloS One, 10(2), e0118489. https:/​/​doi.org/​10.1371/​journal.pone.0118489
van Valkengoed, A. M., & Steg, L. (2019). Meta-analyses of factors motivating climate change adaptation behaviour. Nature Climate Change, 9(2), 158–163. https:/​/​doi.org/​10.1038/​s41558-018-0371-y
Wolske, K. S., & Stern, P. C. (2018). Contributions of psychology to limiting climate change: Opportunities through consumer behavior. In Psychology and Climate Change (pp. 127–160). Academic Press. https:/​/​doi.org/​10.1016/​B978-0-12-813130-5.00007-2
World Health Organization (WHO). (2019). Ten threats to global health in 2019. https://www.who.int/news-room/spotlight/ten-threats-to-global-health-in-2019
This is an open access article distributed under the terms of the Creative Commons Attribution License (4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Supplementary Material