Often, as I read social science research related to workplace topics, I’m struck by the frequency of press mentions that read like this:

In research by ___, __% of employers/employees said that [surprising finding, but decontextualized]. A majority of them feel that [another finding, sometimes unrelated and only sometimes linked or credited].

A benefit of citing research this way is ease for readers: it introduces findings with enough generality and approachability that a range of readers can see something useful for themselves. But it isn’t as complete as it could be. How might we introduce more specificity and, ultimately, more usability?

As a field of journalists writing about research, and researchers writing about journalism, I think we can give readers more just-in-time information about where studies were conducted, when, and with whom. We can give credit to the people, firms, and institutions that invested in the research, and offer information about why they would undertake it. (Some companies deliberately obscure themselves or the firms they’ve commissioned to have results appear unbiased, such as studies about the percentage of employees who want to bring pets into their offices, conveniently sponsored by a pet goods company. Being more transparent about motivations is a hedge amid this type of publicity-oriented “science” that serves more to market than to inform.)

Vitally, we can offer more actionable details to our readers: What should they and their organizations *do* based on this information? The effect of highlighting these research ingredients should be more grounded information, making research dispatches feel less random.

Raising the bar is a competitive differentiator for Charter, and my mission. This is a call for consistency in how we as journalists and researchers show research highlights. It’s one way to do right by research participants, readers, and researchers, and it can bring rigor to self-reported data. Charter readers depend on us to guide them through the fog of survey-based research that so often dominates workplace coverage.

I don’t say this to shame our peer set, but to propose that we might all be more sophisticated in how we collectively publicize research.

As we assess and report on research related to the future of work at Charter, and as we begin fielding and publishing our own original research, we’re publishing our guidelines for ourselves. This is just a start, and we welcome your suggestions for acknowledging social science findings in popular and niche press.

Let’s consider how we as researchers and journalists might best handle each of these components:

For each component below, we note the questions it helps address and what the recommended approach could read or sound like.

Research methodology
  • To address questions such as: What was the research approach: a survey, interviews, observation, larger scale data analysis, or something else? What did they measure?
  • Could read or sound like: A quantitative study in the form of a five-minute online survey; or in-depth, remotely fielded small group interviews.

Sample size and demographics
  • To address questions such as: Who took part in this research? How do they self-identify, and how did researchers recruit them? Is the sample representative of a wider population of interest?
  • Could read or sound like: 521 U.S.-based business leaders currently working in people operations or HR with 10+ self-reported years of experience.

Timeframe for the research and analysis
  • To address questions such as: When did researchers field the study? Were there any external events that may have affected the results? Are the findings still relevant since that time?
  • Could read or sound like: …research conducted between January 2020 and June 2020, during which the spread of the coronavirus forced many offices to close and employees to work remotely.

Author names and institutions
  • To address questions such as: Who worked on this, and what is their relevant expertise on this topic? Where are the researchers coming from?
  • Could read or sound like: Andrew Crain conducted research and analysis and advises students as a career consultant at the University of Georgia. (At a minimum, share the research affiliation or publisher!)

Conflicts of interest
  • To address questions such as: How was this study funded? Would the results have been shared if the findings were different?
  • Could read or sound like: The June 2022 report on worker organizing and collective action was published by the Worker Empowerment Research Network. WERN supporters include the Ford Foundation, the WorkRise network at the Urban Institute, the Omidyar Network, the Hewlett Foundation, and the MIT Institute for Work and Employment Research.

Question phrasing
  • To address questions such as: How were study questions worded? Is that wording understandable to people who are new to the subjects being studied? What scales were used in surveys?
  • Could read or sound like: On Angela Duckworth’s “grit” scale, respondents were asked to indicate the extent to which they “have overcome setbacks to conquer an important challenge” using a five-point scale (“very much like me” to “not like me at all”).

Relevant limitations
  • To address questions such as: What is important to know about the research sample, methodology, or analysis that might affect the outcomes? In what ways might the study’s results be misinterpreted?
  • Could read or sound like: Students self-selected to respond to the survey, and the final sample may not be representative of the university’s population.

Related literature
  • To address questions such as: What else should readers know about this body of knowledge? Is there equally reliable research that contradicts the findings?
  • Could read or sound like: When considering the number of managers who say they are eager to have teams return to offices more regularly, also consider the Steelcase finding that many employees between 30 and 50 years old prefer working from home for ease of child and elder care responsibilities.
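For teams that track pitches in a spreadsheet or content system, this checklist can also be treated as a structured record. The sketch below is purely illustrative, not part of any Charter system: the StudyCitation class, its field names, and the missing_components helper are hypothetical, and simply show how an editor could flag which components a pitch still lacks before deciding whether to cover it.

```python
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class StudyCitation:
    """One record per study under consideration.

    Each field maps to a component in the checklist above;
    None means the pitch has not yet supplied that component.
    """
    methodology: Optional[str] = None            # e.g. "five-minute online survey"
    sample: Optional[str] = None                 # who took part, how recruited, sample size
    timeframe: Optional[str] = None              # when fielded, relevant external events
    authors_and_institutions: Optional[str] = None
    conflicts_of_interest: Optional[str] = None  # funders, commissioning firms
    question_phrasing: Optional[str] = None      # exact wording and scales used
    limitations: Optional[str] = None
    related_literature: Optional[str] = None     # corroborating or contradicting work

def missing_components(citation: StudyCitation) -> list[str]:
    """Return the checklist components a pitch has not yet supplied."""
    return [f.name for f in fields(citation) if getattr(citation, f.name) is None]

# Example: a pitch that names a method and a sample but omits everything else.
pitch = StudyCitation(
    methodology="Five-minute online survey",
    sample="521 U.S.-based business leaders in people operations or HR",
)
print(missing_components(pitch))
# -> ['timeframe', 'authors_and_institutions', 'conflicts_of_interest',
#     'question_phrasing', 'limitations', 'related_literature']
```

However it is tracked, the point is the same: each component should be answerable before a finding reaches readers.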

What you can expect from us at Charter

How we plan to move forward with sharing others’ published work

Our reporters and editors see dozens of pitches citing research each week, in addition to the findings they seek out. The studies they choose to cover originate from rigorous sources and represent the most recent available knowledge (with a note if findings are more than one year old). They question this and other research- and expert-backed advice, and present only that which is most credible and actionable.


Our 3x/week Charter newsletter offers an opportunity to test how, in a brief format, we can make transparent the origins and key details behind research.

Charter makes readers smarter about the future of work by bridging research to practice, but that requires quick, cogent presentation. In a September 2021 survey of our newsletter subscribers, one person described this as Charter offering “sufficient depth without being overwhelming.”

Commitments we are making in how we cover research

  • We make a point of naming the research source (academic institutions, companies, individuals, think tanks, and more) in-line. We consistently link to studies so readers can go further.
  • We’re clear in how we present studies on the same topic that suggest different results. And we weigh in when there's a buzzy statistic that paints an inaccurate or less-than-nuanced picture that we can fill in.
  • If a press release doesn’t include methodology, timing, and details about the composition and number of people studied, we ask the sender to share those components, which we require for every study we consider covering.
  • Wherever possible, we avoid single sources and the biases they might inadvertently introduce.

Our style guide for publishing our own upcoming findings

As Charter expands frameworks and tools we offer to business leaders, we’re building a new practice to create and distribute original research. The thought leadership we publish will be rigorous with a solutions orientation. We have a strong point of view, informed by data, about how workplaces can grow with flexibility. (Said another way, we won’t assume a detached “view from nowhere” stance, to use a phrase from NYU journalism professor Jay Rosen.)

We assume that our readers have a baseline level of interest and/or knowledge about work issues, and we’ll survey them regularly about their perspectives and information needs. In sharing what we find in our research, you can expect us to:

  • Clearly explain our research fascinations and approaches, including the benefits and limitations of the methods we use;
  • Clearly signal when research is sponsored or underwritten;
  • Show how we recruit research participants and share the exact phrasing we use when asking them questions;
  • Maintain the confidentiality and security of the information that research participants share with us;
  • Share our style guide as we develop it (I’m a fan of Vox’s Language Please for offering such a resource, and the community standards from The Conversation are useful, too);
  • Be forthcoming when we’re wrong, honest about what we don’t know yet, and solicit technical and stylistic review and discussion from our peer set.

This is a living list, and we welcome your thoughts and suggested additions.

Thanks to collaborators on Charter’s editorial team: Cari Nazeer, Michelle Peng, and Kevin Delaney. On our product team, Sam Williams and Gillian Zamora were helpful in publishing this manifesto! And thanks to the News Product Alliance user research community.


Key takeaways:

Charter’s in-house researchers Emily Goligoski and Melissa Zwolinski field and analyze research that is actionable in the workplace transformation space. They have a strong point of view, informed by data, about how workplaces can grow with flexibility.

Some companies deliberately obscure themselves or the firms they’ve commissioned so that results appear unbiased. Being transparent about those motivations is a hedge against publicity-oriented “science” that serves more to market than to inform.

This article features the commitments they make around Charter's original research contributions.