Data Strategy 101: How to Gather, Analyze, and Use Data
'Dimeji Togunde, PhD, is the first to acknowledge that his academic background is different from that of many of his peers in international education. With bachelor’s and master’s degrees in statistics, Togunde says he finds it second nature to “collect data and use it as a tool” in his role as vice provost for global education at Spelman College. Knowing how to collect and use data is a skill that international education professionals say is increasingly important as institutions facing financial challenges look for areas of the budget to trim.
“There was this assumption that whatever we did was good—that any sort of study abroad [experience] almost magically produced changes in students,” says Melissa Whatley, PhD, a postdoctoral research scholar at North Carolina State University. “We have to be more intentional, particularly in times like now where funding is scarce and you can’t travel internationally, to justify why international education exists.”
International education scholars have always worked with data—consider mobility numbers and the Open Doors data that aggregate them to reveal global trends. But experts say there is an opportunity to dig deeper, tapping new sources of information and using them in new ways to advocate for programs. Here are a few ways to think about developing a data strategy that serves these needs.
1. Identify your priorities—and your institution’s.
One possible first step, says Whatley, is to check your program website, where objectives are often stated clearly and explicitly. Then step back and do the same with the institution’s website or strategic plans and see where the goals overlap.
“Think about building a data strategy that tells your story in a way that’s aligned with what the campus priorities are,” explains Donald Rubin, PhD, co-principal investigator for the Consortium for Analysis of Student Success through International Education (CASSIE) in the University System of Georgia. “Figure out what are the important conversations going on around campus.”
For example, ongoing conversations about student success and the overall value of a college education provide an opening to demonstrate that students who participate in study abroad or other programs graduate at higher rates or develop competencies desired by employers. This, in turn, helps identify the priorities with the greatest potential impact for stakeholders.
“It’s like backwards design when you’re building a course syllabus,” Whatley says.
2. Make sure goals are measurable.
Many goals are written in aspirational terms, using blanket statements such as “Students will appreciate different cultures,” according to Whatley.
“Reframe your goals so they can be something you provide evidence for,” she says. “Then your data collection plan [follows] from that.” If a program emphasizes providing students with skills for the global workforce, then the data should focus on how students fare in the workforce after they graduate.
When launching new programs, it is important to include ways to measure impact as part of the program design. “In some cases, we’re so busy making the experience and project fly that we don’t think about collecting data, particularly to assess impact on students,” says GianMario Besana, PhD, associate provost for global engagement and online learning at DePaul University.
3. Consider the data you (already) have.
Chances are there is more of it readily available than you might think. “Sometimes people tell me their office doesn’t collect data,” Whatley says. “I guarantee you it’s not true.”
Along with tracking the numbers of inbound and outbound students, a wide range of materials—student applications, records of participation in internationally focused events, and information about majors and courses—all can serve data collection needs. Sometimes, data outside of the department, such as insurance records for students traveling abroad, also can help fill in the blanks in the office’s data.
“It’s a change in mindset—it’s data, not just records I keep,” Whatley explains.
4. Do not let the perfect be the enemy of the good.
To measure impact, DePaul developed an instrument asking participants in the university's global learning experiences whether they feel they acquired new skills that will help them in the future. While Besana plans to improve the survey to focus on more specific skills, “even just from this initial question we're getting a strong signal that what we’re doing is helping students,” he says.
Another strategy: Adapt existing instruments to meet the needs of your department. Spelman adopted the Association of American Colleges & Universities (AAC&U) VALUE rubrics to measure learning outcomes before and after study abroad experiences. The rubrics and student reflections together provide “a set of quantitative and qualitative data to explain their travel experience abroad,” Togunde says. Common measures of learning outcomes also make it possible to exchange information across multiple programs or departments.
There is an added benefit to analyzing existing survey data from other departments or programs with an international lens, says Gundolf Graml, PhD, associate vice president for academic affairs and dean for curriculum and strategic initiatives at Agnes Scott College, where results from National Association of Colleges and Employers (NACE) surveys are used to pinpoint critical skills.
“Rather than adding to the survey fatigue of everyone on campus…you can connect these thoughts to give you a larger narrative around the benefits of internationalization,” Graml says.
5. Build alliances to get better data.
It is also critical to work with other offices that focus on data, including the institutional research or institutional effectiveness offices, which crunch the numbers for reporting, accreditation, and other institutional functions.
“You should have a regular relationship with those people, not just respond to their request every x years for a table to include in an accreditation report,” Rubin says.
Along with helping prepare data for individual requests, colleagues in these offices can help identify areas of overlap with other departments that may result in collaborative research. For example, research that connects study abroad with choices of majors could be of mutual interest, Rubin says. These offices can also help international offices set up data dashboards to track key metrics on their own.
6. Focus on deeper dives—and differentiation.
At the University of Florida, the Office of Global Learning developed a validated survey to measure the impact of study abroad and the overall campus climate to determine if broader internationalization efforts were helping students “become more open-minded and culturally aware,” says the office’s director, Paloma Rodríguez, MA. And while the topline results were positive—showing overall improvements in cultural competencies—taking the time to disaggregate the data helped change the conversation about minority students and international education, which has long focused on deficits and barriers to participation.
Students of color were significantly stronger in measures of intercultural competence across multiple dimensions, according to Rodríguez. “People of color are themselves not aware of these strengths and therefore can’t use them,” she says. At Spelman, differentiating data around areas such as Pell Grant and first-generation status has helped provide “guidance for support and funding for student mobility,” Togunde says.
Differentiated data also can help make a stronger case for international programs. At Agnes Scott, breaking down data by fields and majors has helped “students see not [only] the value of study abroad in general, but how they can be strategic in selecting the programs they might take in a specific field,” says Graml.
7. Build capacity.
Not all international offices or programs have the capacity to analyze and differentiate data. Those that do not can consider recruiting graduate assistants with experience in data analysis. “That may mean looking beyond the disciplines you traditionally hire for in your office,” Whatley says. Staff members can take classes at their own institution to develop needed data skills, or attend professional development workshops like the ones NAFSA has held in recent months.
Offices should be on the lookout for graduate assistants or staff members who have one of the most-needed—and hardest-to-find—skills: data visualization. “It’s not just about collecting data and being able to read it,” Besana says. “Being able to present it is really important.”
8. Periodically review data collection efforts.
A key component of a data collection strategy involves regular review, says Rubin. Along with quality improvement of collection efforts, a review process helps identify emerging areas of importance to programs or the institution, such as the trend toward noncredit education abroad programs or service-learning activities. “Sometimes you don’t anticipate trends that become important,” Rubin says. •
Start Here: Selected Metrics to Track for International Programs and Initiatives
Inbound and Outbound Students
- Demographics and classification
- First-generation students
- First time traveling abroad
- Pell Grant status
- Student learning outcomes, including ability to understand similarities and differences in political, cultural, social, and economic values and measures of intercultural competence
- Scholarships that support student travel
- Locations, institutions, countries, and purpose
Partnerships
- Reciprocity (numbers of inbound vs. outbound students)
- Length of agreements or partnerships
- Collaboration (coauthored publications or grants)
- Faculty visits
Faculty Global Engagement
- Faculty directing courses abroad, their departments, time at institution, and status (tenured, untenured, full or assistant professors)
- Workshops and conference presentations
- International research publications
- Pre- and post-travel data on faculty engagement
- Learning objectives in global courses in different majors
- Tenure profiles and decisions
Source: 'Dimeji Togunde, PhD, Spelman College
- “Foundations for Strategic Analysis of International Education Data” e-Learning Seminar
- “Making Data Come Alive: Diverse Methods for Engaging Stakeholders in Meaningful Assessment” e-Learning Seminar (recording)
- “Assessing Global Learning: Methods, Metrics, and Meaning” e-Learning Seminar
- “Assessment of Global Learning Toward Accreditation” e-Learning Seminar
- “Assessment of Global Learning: Targets vs. Growth” e-Learning Seminar
About International Educator
International Educator is NAFSA’s flagship publication and has been published continually since 1990. As a record of the association and the field of international education, IE includes articles on a variety of topics, trends, and issues facing NAFSA members and their work.
From in-depth features to interviews with thought leaders and columns tailored to NAFSA’s knowledge communities, IE provides must-read context and analysis to those working around the globe to advance international education and exchange.
NAFSA: Association of International Educators is the world's largest nonprofit association dedicated to international education and exchange. NAFSA's 10,000 members are located at more than 3,500 institutions worldwide, in over 150 countries.