Australians overwhelmingly say that transparency is what builds their trust in the credibility of information. The 2024 Digital News Report found that 80% of news consumers ranked transparent sourcing and openly published methodology as the leading reason to believe a story.
Over decades, journalists have developed an array of strategies to persuade sceptical audiences of the legitimacy of their stories, including offering credible sources, developing all aspects of a story to show the nuance behind the headlines, and examining a story from all perspectives.
Similar levels of scepticism are often experienced by research organisations that undertake and disseminate market research, public opinion polls and political polls. And when it comes to persuading their audiences of the credibility of their findings, they have access to similar strategies.
For instance, after the polling miss at the 2019 federal election, when nearly every published poll pointed to a Labor victory, an independent inquiry recommended clearer disclosure of sampling and weighting so that journalists and voters could assess the reliability of polling data for themselves.
Here are five journalism principles that every research team and polling organisation could implement to build trust in their data.
Transparency Builds Trust
Ethical journalists disclose their process – while, of course, protecting the anonymity and safety of their sources – and encourage their readers or audiences to scrutinise the evidence. The Society of Professional Journalists’ Code of Ethics lists “be accountable and transparent” as one of its four primary principles.
Polling already has a parallel standard. The Australian Polling Council (APC) has adopted a ‘Code of Professional Conduct’ and accompanying ethics rules that require APC members to publish both short and detailed methodology statements for all polling data they release.
The media can now hyperlink to these methodology statements, allowing audiences to judge whether the polling is sufficiently rigorous before they share the headlines. See, for example, the ABC’s April 2025 explainer on current federal voting polls.
For B2B and B2C market research, The Research Society’s Code of Professional Behaviour requires its members to publish their methodology and protect respondent rights for every study, which creates accountability from the first brief to final publication. The ADIA Market & Social Research Privacy Code requires clear protections for how personal data is stored, used, and destroyed, because transparency about methods also extends to transparency about privacy practices.
At a minimum, any public-facing methodology dashboard should clearly spell out:
- Sample frame & recruitment method – e.g. third-party mobile records, opt-in online panel, client list.
- Fieldwork dates & pauses – actual start, actual stop, any breaks during the study.
- Screener rules and incidence rate – who qualified and how often?
- Raw n, effective n, design effect – these let the audience gauge the statistical power of the sample (a worked sketch follows this list).
- Weighting variables, targets, and overall efficiency – plus the final rim-weighting score.
- Questionnaire version logs – all wording or flow changes should be time-stamped.
- Quality-control measures – dual-listen back-checks for CATI, speed-trap and straight-line flags for online, and rules for trimming outliers.
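To make the ‘effective n’ and ‘design effect’ items concrete, here is a minimal Python sketch using the standard Kish approximation to derive them from post-weighting respondent weights. The simulated lognormal weights are purely illustrative; in practice you would pass in the final rim weights from your own study.

```python
import numpy as np

def weighting_diagnostics(weights):
    """Kish approximation: effective sample size and design effect.

    weights: per-respondent weights after rim weighting (raking).
    Returns raw n, effective n, design effect and weighting efficiency.
    """
    w = np.asarray(weights, dtype=float)
    n = len(w)
    n_eff = w.sum() ** 2 / (w ** 2).sum()  # Kish effective sample size
    deff = n / n_eff                       # design effect due to weighting
    efficiency = n_eff / n                 # 1.0 = perfectly self-weighting
    return n, n_eff, deff, efficiency

# Illustrative only: simulate weights for a 1,000-person sample
rng = np.random.default_rng(7)
weights = rng.lognormal(mean=0.0, sigma=0.4, size=1000)

n, n_eff, deff, eff = weighting_diagnostics(weights)
print(f"raw n = {n}, effective n = {n_eff:.0f}, "
      f"deff = {deff:.2f}, efficiency = {eff:.0%}")
```

The efficiency figure is what a dashboard would publish alongside raw n: it tells readers how much statistical power the weighting has cost.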
The main takeaway is that organisations commissioning, conducting, and publishing market research studies should share with their audiences how to read the poll and understand its conclusions.
Tell the Whole Story, Not Just the Numbers
Good reporting will share with audiences the narrative behind the headline, which is often a more nuanced story than a reading of the headline alone might suggest. What happened? Why does it matter? What’s the alternative perspective? What might happen next?
That same rule applies to market research dashboards and the reports based on them. A churn rate of 28 per cent means little to readers unless they know which segments are driving it, what external shocks – like price increases or supply hiccups – might be powering the problem, and how big the margin of error is. Numbers without a narrative open the door to misunderstanding and scepticism. Ensure context is provided by:
- Benchmarking everything. Show current n-sizes, weighting efficiency and design effects alongside historic averages.
- Identifying the blind spots. Add a “What this study can’t say” box to account for mode effects, low-incidence strata or self-selection bias.
- Leading with a ‘nut-graf’. Share a three-sentence, plain-English takeaway directly under each key chart, answering the question “So what?” before your audience gets the chance to ask it.
- Annotating charts. Just like a newsroom editor marks up the page proofs in a newspaper, you can use arrows, callouts, and captions to point the reader towards the intended interpretation of the data (see the sketch below).
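As a worked example of the annotation point above, here is a minimal matplotlib sketch that draws a callout on a trend line. The churn figures and the ‘June price rise’ caption are invented for illustration.

```python
import matplotlib.pyplot as plt

# Illustrative numbers only: quarterly churn with a callout on the spike
quarters = ["Q1", "Q2", "Q3", "Q4"]
churn = [21, 22, 28, 26]
x = range(len(quarters))

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(x, churn, marker="o")
ax.set_xticks(list(x), labels=quarters)
ax.set_ylabel("Churn (%)")
ax.set_title("Customer churn by quarter")

# The callout points the reader at the interpretation, not just the data point
ax.annotate(
    "Q3 spike follows the June price rise",
    xy=(2, 28), xytext=(0.1, 27.5),
    arrowprops=dict(arrowstyle="->"),
)

fig.tight_layout()
plt.show()
```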
Contextualised insight moves faster within organisations and leads to action. Brand teams change ad spending, risk committees amend capital models, and communications teams brief spokespeople with greater assurance when they have confidence in the data they’re relying on to make decisions. Simply put, researchers who connect numbers to narrative make the leap from data vendors to strategic partners.
Question Everything – Your Assumptions Too
The job of an investigative reporter is to scrutinise official statements, verify information by triangulating sources, and expose conflicts of interest. A source’s credibility always has to be earned through experience.
Research teams should do the same. At the 2019 Australian federal election, a string of incorrect polls led to claims of “herding”, where research firms may have subconsciously altered their weightings to match their competitors’ results – not because they were being intentionally dishonest, but because they didn’t want to publish results that contradicted everyone else.
Here are some of the journalistic guardrails which researchers can adopt:
- Source variety. When tracking down a story, journalists don’t limit themselves to one witness, and neither should researchers. Mix online panels with CATI or street intercepts to reach the people not captured in the digital sample.
- Pre-mortem drill. In a newsroom, an editor might ask, “What if our source is wrong?”, prompting journalists to do more fact-checking before running a story. Before you go into the field, ask: what if our turnout model collapses? What if the questionnaire is two pages too long? List all the “could go wrongs”, wire in counter-measures – dual-frame sampling, mode-specific wording, shorter instruments – and keep an audit trail of your notes.
- Devil’s-advocate review. Find a colleague who had nothing to do with your study and ask them to look for mistakes in the weighting, quotas, and logic before you lock the data set. Their goal is to try to break your study to ensure the client, regulator or publisher can’t do the same once it’s ready for publication.
Educate the Audience
Transparency means little if your audience doesn’t know how to interpret the data you present. That’s why newsrooms now routinely place explainer pop-ups or methodology drop-downs alongside every chart or interactive graphic. The 2019 election polling review urged pollsters to adopt this mindset to help “enhance the public’s understanding of sampling error and methodological limits.”
There are lots of practical ways that research organisations, news publishers and commissioning clients can help build their audience’s data literacy with every release:
- Plain language glossary: define quota, weighting, herding, and non-response biases in lay terms, and embed this list in every deck.
- Visual reinforcements: micro-infographics which show, for example, how the margin of error expands as sample size shrinks (see the sketch after this list), or how you built a stratified sample.
- “How to read this chart” callouts: three bullet points under every important graphic to alert the reader to caveats and potential misinterpretations.
- Media-ready kit: a one-page disclosure, top-line tables, the questionnaires used, and a two-paragraph lay summary, packaged for journalists, regulators, and stakeholders.
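The margin-of-error visual described above is straightforward to generate because the underlying arithmetic is simple. Below is a minimal Python sketch using the standard formula for a proportion, assuming the worst case p = 0.5 and a 95% confidence level (z = 1.96); feeding in the effective sample size rather than the raw n keeps the figure honest for weighted data.

```python
import math

def margin_of_error(n_eff, p=0.5, z=1.96):
    """Margin of error for a proportion at a 95% confidence level.

    n_eff: effective sample size (use this, not raw n, for weighted data).
    p: assumed proportion; 0.5 is the worst case.
    """
    return z * math.sqrt(p * (1 - p) / n_eff)

# How the margin of error expands as the sample shrinks
for n in (2000, 1000, 500, 250, 100):
    print(f"n = {n:>5}: +/- {margin_of_error(n) * 100:.1f} points")
```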
Presenting your audience with these kinds of bite-sized education materials guards against misreadings and misinterpretations, which is crucial in a social media age where graphs and charts travel faster than the nuance and detail they were meant to capture.
Methodology is Meaning
Without the context of how it was generated and collected, even the best dataset can be misconstrued. The APC code requires long-form disclosure that describes the questionnaire wording, explains call outcomes, and gives the weighting cells, so that any other analyst could, in principle, replicate (or challenge) the results. The same rule applies in journalism, which is why journalists link to raw documents or embed datasets in their stories so that readers can explore them for themselves.
Publish an appendix with your research outlining all of the variables, weights and rules used. Include a data-confidence score on every chart so readers can see which findings are robust and which to treat with caution. This adds context and meaning to the data you’re presenting.
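There is no industry-standard formula for a data-confidence score, so any implementation is a house rule. The sketch below is one hypothetical rule of thumb; the thresholds, inputs and labels are all assumptions to be tuned to your own study designs and risk tolerance.

```python
def confidence_label(n_eff, weighting_efficiency, incidence_known=True):
    """Hypothetical rule-of-thumb confidence label for a single chart.

    The thresholds below are illustrative assumptions, not an industry
    standard: tune them to your own designs and document them in the
    methodology appendix so the score itself stays auditable.
    """
    if n_eff >= 800 and weighting_efficiency >= 0.80 and incidence_known:
        return "High confidence"
    if n_eff >= 300 and weighting_efficiency >= 0.60:
        return "Moderate confidence: read alongside the margin of error"
    return "Indicative only: small or heavily weighted base"

print(confidence_label(n_eff=950, weighting_efficiency=0.85))  # High confidence
print(confidence_label(n_eff=120, weighting_efficiency=0.55))  # Indicative only
```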
Why It Matters
Documented methodologies and transparency lead to much faster buy-in from stakeholders, clients and the public. When data transparency is prioritised, less time is spent questioning a study’s findings, so decisions can be made more quickly and with greater certainty. Transparency also reassures regulators, investor relations teams and compliance teams, who can approve the data with confidence because every assumption and weighting is clearly stated and auditable. Media resilience improves too: when a study makes headline news, the resources are already available to give the findings context, reducing the risk of misquoting or selective interpretation.
When research organisations put transparency front and centre, provide clear context, and make education part of their publishing process, audiences are better able to trust the results.
If your stakeholders are looking for newsroom transparency, ISO-certified rigour, and high-quality data collection, TKW Research has you covered. Learn how we can assist with your next project: browse our full offering on the Solutions Page, read our explainer on ISO compliance, or arrange a consultation with our fieldwork team today.