At the National Coordinating Center, we collect a lot of data from TPSID projects. We use several methods to communicate findings from these data to our general audiences. We write Annual Reports that provide a complete description of the program and student data collected each year. We prepare other publications, such as Fast Facts focused on specific pieces of data or journal articles that take a deep dive into the data. And we present our data-related findings at professional conferences, through webinars, and to our various affinity groups.
But it’s also important that we communicate data-related findings back to those who report the data to us: the TPSID projects. We view our data collection and analysis work with the TPSIDs as an ongoing conversation, a continual cycle of learning, understanding, making meaning, and implementing changes (by both parties, the TPSIDs and the NCC). As data are reported to us throughout the year by TPSIDs, we are constantly reviewing and reaching out where there are gaps in our understanding of how a program operates. Through our annual data-driven technical assistance process, we gain further understanding of each individual TPSID project, and each TPSID project also learns by viewing its own data. At our annual Project Director Meetings, we engage in data-related activities with attendees. Understanding the data and what they mean helps TPSID projects see trends in the data across all programs, identify their own strengths and areas for improvement, and find ways to enhance their services and supports to students with intellectual disability.
We are always looking for the best ways to engage TPSIDs in understanding their data. Four years ago, I attended a training by Kylie Hutchinson with a former evaluation team colleague, Frank Smith. At the training, “Participatory Data Analysis for Evaluators: Data Parties and Dabbling in the Data,” we learned how to engage stakeholders in making sense of their own data, and how to organize and facilitate “data parties” with those stakeholders. We were really inspired by Kylie’s ideas and by the examples she shared from her own evaluation work with different stakeholder groups.
In the years that followed, we used some of these ideas at our annual TPSID Project Director Meetings. For example, in one activity we gave everyone a handout that included a dot plot for several key TPSID data indicators, such as the percentage of enrollments in inclusive courses and the percentage of students with a paid job while in the program. The dot plot showed the data from two years: an orange dot for 2015-16 and a green dot for 2016-17. We then gave attendees a handout with their own program’s data for these two years and some orange and green stickers. Working together with staff from their program, they placed orange and green stickers on the dot plot chart. Through this activity, programs could see both 1) how their program’s data had changed from one year to the next, and 2) how they compared to TPSIDs as a whole. It was really interesting listening in on the conversations throughout the activity. Takeaways such as “oh wow, we really improved in that area!” and “hmm, we aren’t doing as well as we thought on this one” helped us see that the activity achieved what we set out to do.
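For readers curious about the arithmetic behind those two takeaways, here is a minimal sketch in Python of the comparison each pair of stickers encodes. All of the indicator names and numbers below are made up for illustration; they are not actual TPSID data.

```python
# Hypothetical all-TPSID averages for two key indicators, by year.
all_tpsid = {
    "% enrollments in inclusive courses": {"2015-16": 52.0, "2016-17": 57.0},
    "% students with a paid job":         {"2015-16": 35.0, "2016-17": 41.0},
}

# One hypothetical program's data for the same indicators and years.
program = {
    "% enrollments in inclusive courses": {"2015-16": 48.0, "2016-17": 61.0},
    "% students with a paid job":         {"2015-16": 30.0, "2016-17": 33.0},
}

def year_over_year(data, indicator):
    """Change in an indicator from 2015-16 to 2016-17 (takeaway 1)."""
    return round(data[indicator]["2016-17"] - data[indicator]["2015-16"], 1)

def vs_all_tpsids(indicator, year):
    """Program value, all-TPSID value, and the gap between them (takeaway 2)."""
    p = program[indicator][year]
    a = all_tpsid[indicator][year]
    return p, a, round(p - a, 1)

for indicator in program:
    change = year_over_year(program, indicator)
    p, a, gap = vs_all_tpsids(indicator, "2016-17")
    print(f"{indicator}: {change:+.1f} points year over year; "
          f"{gap:+.1f} points vs. all TPSIDs in 2016-17")
```

The sticker activity does exactly this, but tactilely and conversationally: the vertical position of each sticker is the program’s value, and its distance from the printed dot is the gap from the all-TPSID figure.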
Another activity focused on how programs can use their data to demonstrate the success of programs and students. Inside fortune cookies, we placed slips of paper describing outcomes we want students to achieve, such as “Your students will earn great wealth” and “Your students will earn credentials of great value” (of course we had to write these in fortune cookie language!). Then we put fortune cookies on each table in the main meeting room. Attendees were directed to open their fortune cookie and find others who had the same fortune (they also got to eat their cookie if they wanted!). Once they were in groups, we asked them to answer three discussion questions: What data can we use to determine this outcome? What other data could be useful in determining this outcome? And, in your program, how do you share results with funders and other key stakeholders? At the end, we came back together to share our discussions. Attendees liked the opportunity to get up and move, and to talk with people from other programs. They learned from each other about how to use data to share their successes.
We’ve received a lot of positive feedback on our project director meeting data parties. We even presented our data party approach at the North East Association for Institutional Research (NEAIR) conference in Pittsburgh in 2018. Our poster, “Let’s Have a Data Party!” won first prize in the people’s choice competition that year!
This year, our project director meeting took place virtually. This was our first project director meeting with the new cohort of TPSID projects (funded 2020-2025). Even though they are still in the midst of reporting their own data, we felt it was important to set the stage for the next five years by having them engage in an activity based on data from year 5 of cohort 2. We gave attendees a handout that showed key program and student data indicators graphed over the five years of cohort 2 (figures 16 and 17 of the annual report, for those who are interested), and provided some individual and group reflection questions. We used breakout rooms in Zoom to allow participants to talk with each other before we came back together to reflect as a whole group. It will be easier (and more fun) to engage TPSIDs in these activities once we are able to meet face to face, but this activity set the stage for how we want TPSIDs to engage with their data over the next four years.
How about you? What strategies have you used to engage stakeholders in understanding and learning from their data? I’d love to hear from you if you have ideas! If you are interested in learning more about data parties, we recommend Kylie Hutchinson’s website https://communitysolutions.ca/web/ and book https://communitysolutions.ca/web/evaluation-reporting-guide-2/.
Post Author: Clare Papay wears multiple hats at Think College at the Institute for Community Inclusion, UMass Boston. She is the Evaluation Coordinator for the National Coordinating Center for the TPSID model demonstration project. She is also the co-Principal Investigator for the IES-funded project, “Moving Transition Forward: Exploration of College-based and Conventional Transition Practices for Students with Intellectual Disability and Autism.” Additionally, Dr. Papay serves as co-editor for the Journal of Inclusive Postsecondary Education (JIPE).