How to have difficult conversations

A practical guide for academic & practitioner research collaborations

This interactive guide walks you through how to have difficult conversations, question by question. Submit your answers online and we will email them to you, or download the complete guide and workbook to structure your own discussions.






Why difficult conversations?

You are entering into a partnership because you have decided that there is enough of an overlap in your interests and actions to pursue a joint agenda. But where do you diverge?

This guide offers advice on how to have the “difficult conversations” that often arise when academic researchers and practitioners decide to collaborate. We home in on pivotal decision points and on issues that are frequently overlooked or brought up too late. The guide builds on MIT GOV/LAB’s experience learning in the field and on many honest conversations with partners reflecting on “what could have gone differently.”

Engaged scholarship
At MIT GOV/LAB, our engaged scholarship model is based on values of equitable exchange between practitioners and academics. This guide is one tool in our engaged scholarship series to support rigorous research that is co-created by partners and grounded in field work.
Feedback is welcome at mitgovlab@mit.edu.
How to use this guide
The guide is structured as a series of questions meant to clarify priorities and spell out assumptions. Each section includes questions for both partners to consider together, plus specific lists aimed at academics and at practitioners.
Instructions
1. Click through all the questions and read the tips provided on each card.
2. Choose one section to start and answer the questions posed to you. You can send the answers to your email right away, or save them to finish the section later.* You also have the option to email your answers to your partner. Each section should be completed separately.
3. Ask your partner to work on the same section, and then talk through your answers. Repeat for the other sections.
4. Set regular check-ins with your partner to review what has changed and adjust as necessary.
* Please note your session expires when you close the webpage, so don't forget to send your answers before closing the tab.
How to start the conversation
When do you broach sensitive topics and how do you explain why these questions matter?
First, to the extent possible, have these conversations face-to-face; doing so is incredibly valuable for building trust and understanding.
Second, starting candid conversations early in the partnership makes for more open and productive decision-making.
Select a section to get started
Seem simple? Good. Relationship-building takes time and patience, but initial optimism is required.
Incentives & Expectations
In the desire to agree on the partnership and get it off to a good start, either side may be cautious about revealing too much of their expectations up front. In our experience, revealing more, earlier, is better. We include questions to unpack why each party is interested in the research partnership and what expectations they have.
Exploratory Phase & Timelines
During initial conversations, both sides usually try to interpret what the other side says and fit it into particular boxes. That’s why it’s so important to build lead time into the project to test the waters, get to know the context, and clarify initial misunderstandings. This process leads to more innovative research questions and helps avoid some of the common preconceptions and biases that can undermine effective collaboration.
Collaborative Decision Making & Team Buy-in
Being open about where you are coming from is a necessary first step in getting to know your partner. As the relationship develops, it is equally important to determine how you will make decisions moving forward. Clarifying roles, responsibilities, and decision-making processes, and ensuring there is real team buy-in, can support productive exchange.
Learning & Dissemination
Practitioners and academics often think differently about how to use results. For academics, it’s typical to think in long timelines, towards peer-reviewed publications. For practitioners, timelines are often shorter, tied to programmatic or funding decisions. Because of these differences, it helps to sketch out the desired outputs and their uses for both parties ahead of time. How can both sides work together to produce results in a range of formats that meet everyone’s needs?
Engaged scholarship tools
Select a tool to download
MIT GOV/LAB learning cases:
  • Teaching on WhatsApp with Grassroot
  • Learning Collaborative
  • Civic Leadership in the Philippines
  • Access to Information with Twaweza
Resources for building partnerships
We don't directly address issues of power and inequity in this guide, but we want to provide some resources for further context.
  • Chicago Beyond: Why am I Always Being Researched? A guidebook for community organizations, researchers, and funders to help us get from insufficient understanding to more authentic truth.
  • Silent Voices Manifesto: new avenues for collaborative research; a call for dialogue on the practice of transnational collaboration in field research.
  • Research 4 impact: a new evidence-based model for how to build relationships between people with diverse forms of expertise.
Authors: Varja Lipovsek & Alisa Zomer
Designers: Susy Tort & Gabriela Reygadas
Let us know what you think! mitgovlab@mit.edu
Thank you
The idea for this guide came out of a 2018 workshop co-hosted with Evidence in Governance and Politics (EGAP) on “Identifying Best Practices for Academic-Practitioner Research Partnerships” in Washington, DC. Thanks to Adam S. Levine, Baruani Mshale, Michael Moses, Leah Rosenzweig, Matthew Lisiecki, and Ingrid Lee, as well as the 2019 APSA Summer Institute for Civically Engaged Research at Tufts University, for providing timely and useful input on early iterations of the guide. Many thanks to our partners Luke Jordan and Katlego Mohlabane from Grassroot for reviewing the learning case in this update to the guide. Thanks to MIT GOV/LAB Faculty Director Professor Lily Tsai for many comments and to Selmah Goldberg for helping to shape this second iteration. Finally, thanks to our editor Maggie Biroscak for helping us translate across audiences and to Nina Gregg for designing the initial interactive tool concept.

Ask each other

Incentives & Expectations


What do you want out of this collaboration? Ask why five times to better understand who or what is really motivating the research study.*

* The “Five Whys” technique was developed by Toyota as a way to approach problem solving that gets to the root of a technical issue by understanding the human dimension (https://hbr.org/2012/02/the-5-whys.html).

How do you see the roles and responsibilities of your partner? For instance, will the practitioner primarily help facilitate the fieldwork (inroads into communities, translation, etc.)? Will the researcher be integrated into the practitioner organization or act more independently?

Ask your practitioner partner

How do you plan to use the results? Is there a specific decision(s) or donor report(s) that the research results will inform?

What is at stake or how important are the results to your organization? For example, are the results important to the core of the organization (mission critical), are you evaluating a key component of your Theory of Change, or do you want to use the results to re-think or re-design a major component of your implementation?

What kind of results do you need? For example, are you hoping to get statistically significant results or show causal impact? Similarly, are you trying to evaluate more than one initiative bundled together? It's important to discuss early on what level of certainty you need for results, because the answers will directly affect possible research designs and budget.  

If you’re testing a particular initiative, are there still opportunities to tweak or change it, or must it remain exactly as is? Are you able to pilot or test implementation ideas before taking a project to scale? Getting a clear sense of what’s already set in stone will help everyone understand the full range of possibilities for the research design.

How will results be received? For example, how would your team deal with mixed, null, or negative results, especially if you have been hoping to confirm a program’s effectiveness? We find it useful to illustrate what the potential outcomes from an experimental study might be, so that the practitioner is very clear on the range of possible results.

Ask your academic partner

Are these data for a dissertation or tenure-track promotion? Are you looking for data that will yield a peer-reviewed paper? Academic milestones often have long-term timelines as well as standards for rigor and method that might dictate the research design. Make sure to ask your academic partner to explain these design elements.  

Are there specific methods that are a must for you (e.g. do you need an experiment)? What other characteristics of the study are non-negotiable for you (e.g. sample size, ability to randomize, geography, etc.)? Conversely, what components of the research design are flexible and can be adjusted to fit practitioner needs? 

Do you foresee yourself as the primary owner of the data? Of all the data (e.g. including descriptive statistics), or primarily the experimental data? What does this mean for the practitioner’s ability to use the data—e.g. can practitioners do their own analysis or produce their own outputs? Is the release of the data time-sensitive for you, and what are those timelines? (More on timelines below.)

What do you need the practitioner to provide? For example, do you need recommendations or contacts for government officials, communities, or research assistants? 


Ask each other

Exploratory Phase & Timelines


What timelines matter most? When do big decisions need to be made? Can you create a common, shared calendar, updated in real time?

Are you able to include a “phase zero” in your plan? Taking time to conduct exploratory research, even for a couple of weeks, and spending time with partners in the field helps you identify innovative research questions together before committing to implementation and study components.

At a minimum, can you design and fund a scoping trip to “ground-truth” in person before setting down major parameters of the research design? We find it useful to discuss what type of research—descriptive, observational, lab-in-the-field—would be ideal and what would be minimal for scoping to inform the next stage of your study, given time and resources.

Can there be a regular check-in time? Can we build pivot or exit points into the partnership? These can be set around certain key moments—for example, after the “phase zero” or after the pilot—at which both parties agree to review the progress, content, and direction of the collaboration, and grant each other the right to re-open discussions on how best to proceed, if at all. These pivot or exit points can be built into a Work Plan or Memorandum of Understanding.

Who is part of the team? Do we have sufficient support for the project? Spell out roles, responsibilities, and time in the field for both teams, including project managers, junior researchers, students, and research assistants.

Ask your practitioner partner

Are you bound by project-reporting or grant timelines? What calendar do you follow? Are there any important funding decisions we should know about? In many cases, practitioners are raising money to support projects and team salaries, which makes these deadlines high-stakes.

What level of results do you need by these various timelines? For example, sometimes descriptive information is enough to update a board or advising committee. Other times, donors might expect experimental results in order to fund the next stage. See Section 4 for more on how results need to be presented to be useful as well as the process and responsibility for producing outputs to inform practitioner decision-making.

Ask your academic partner

What academic timeline matters most and what are the key dates when you need results? For graduate students, important dates might include going on the job market, submitting a dissertation, or graduating. For professors, tenure clocks or promotion timelines can drive the need for publication. Articles for academic journals take on average three to five years from data collection to publication.

How much time are you personally planning to spend in-country for this project? Will you be in-country for key decision points in the study, for example, piloting or the start of the fieldwork? If there are particular moments in the research when you want the academic partner to be available, be sure to say so up-front.

How much time are you expecting to spend on this research? Will you be conducting other research projects at the same time? Do you have enough people on your team to cover all the required fieldwork, data analysis, and writing, or are you expecting the practitioner to provide support?


Ask each other

Collaborative Decision Making & Team Buy-in


How should the research be designed? For example, who should be included in the selection of the core themes to be studied, in the design itself, or in adding items to the data collection tools? Who has the final word on critical components of the intervention design and the research design?

Who are the key decision-makers? What are their roles and responsibilities in the research collaboration? What is the process for making decisions about the project and the research? Understanding who signs off on any implementation changes or resources for research support is important for strengthening cooperation and buy-in. This information is good to document in a Work Plan.

Have you participated in academic-practitioner research collaborations before? Where did they take place? Was fieldwork involved? This background information can be helpful for initial planning and onboarding.

Ask your practitioner partner

Who are the main people at the organization who will be communicating with the academics? It’s great to have a mix—the leadership, but also others who will be directly involved in implementing the initiative or the research, or who have a stake in the results. These could be program leads, monitoring and evaluation staff, and procurement.

What role will the monitoring and evaluation team play in the collaboration? What is their capacity to participate, and what is their interest in gaining practical skills through the collaboration? Monitoring and evaluation teams are in charge of fostering learning in the organization (including the use of evidence to inform programming), so their involvement can help ensure that lessons and insights are absorbed into the organization’s thinking and decision-making.

Are there stakeholders outside the practitioner organization that should be included in some of the initial discussions? These could be government stakeholders, civil society partners, donors, local academics, or others. Of course, the more people there are to consult and manage, the more likely there will be a communication breakdown at some point, but a few carefully selected outside stakeholders could form a useful advisory group.

Ask your academic partner

Who is on the research team? What roles do they play and who is responsible for key decisions on research questions, designs, and implementation?

Would research managers consider working from the practitioner organization’s office for a period of time? Often nothing more than a desk and a wi-fi connection is required, and hosting is a good way for both sides to keep in touch about progress. It makes for easier dialogue with the range of people in the practitioner organization who may be involved in supporting the project, and it expands the possibilities for learning and skills-sharing.

Who at the university is invested—or at least interested—in the collaborative research? Who might spend time discussing expectations and interests with the wider group of professors, students, and research staff? Oftentimes universities can provide in-residence or sabbatical opportunities for practitioner teams to learn new skills and collaborate on writing and research projects.


Ask each other

Learning & Dissemination


What is the internal review process for each entity before we can share results? Most academics, to preserve academic freedom, will not want to have their findings and conclusions approved, but will welcome reviews, comments, and interpretations.

How many iterations or reviews of an output are reasonable? How much lead time does a reviewer need? For example, the academic partner should address at least one round of questions and clarifying comments from practitioners before an output is considered final. Good practice is to jointly review sample outputs that you find useful and discuss what it would take to reproduce them.

Does your university or organization have requirements or a process to follow for joint publications, co-branding, or using each other’s logos? What about sharing news about the collaboration on social or traditional media? Some established organizations may have sophisticated approval processes, so it’s important to check ahead of time to avoid complications from any public-facing communications.

How will we spread the word about the results? Dissemination and marketing are almost always an afterthought, so think early about your communications strategy and what would have the biggest impact for your target audience. This might include resources and budget for a meeting, workshop, or webinar, as well as expert support for writing for popular audiences, editing, and layout and design for print or online multimedia.

Ask your practitioner partner

What are the minimum outputs that you need us to produce from this collaboration? Who is your target audience? Outputs might include descriptive statistics, field reports, final reports, or slide decks. Which are essential? Can you provide examples of what these outputs should look like, or what similar outputs have looked like in the past? Make sure to set expectations for producing outputs written in accessible language, without jargon, for target audiences.

What outputs do you want to produce yourself but would like the academics to review? For example, the team may want to produce a “research brief” for policy audiences based on the findings. Best practice is to ask for academic review to ensure accuracy.

Ask your academic partner

Who owns the data? If academics are conducting quantitative research, it’s possible they will be collecting and storing the raw data, which may contain sensitive or personal information. Discuss whether it is useful to share this data, in what format, and what redactions might be needed for sensitive data.

Will experimental results need to be replicated before they can be used for program or policy recommendations? Replication is when someone independently tries to reproduce causal results from the raw data to check that the analysis is correct. Ask what this means for interpreting your results and why it might be a good idea to spend the extra time and resources on it.

How much time can you set aside for developing non-academic outputs? This includes resources for policy memos and presentations, and support to translate the results from technical academic jargon into more accessible language for diverse audiences. In our experience, the minimum package includes a summary report (with detailed methodology and results), an accompanying slide deck, and descriptive statistics.

Can you offer opportunities for teaching, skills-sharing, and capacity-building for practitioner staff? Explore what would be of interest (e.g., methodological training on sampling, survey tools, piloting, or data analysis). Collaboration and skills-sharing that go beyond the immediate research project are a big part of engaged scholarship.
