Lucy N Smith

Research Consultant + Communications Lead

Lucy has worked with out-of-school time organizations for over six years. Her experience as a coach working directly with high school students fueled her passion for providing high-quality, actionable resources and services to afterschool educators. Her experience as a data analyst and technical specialist has shaped her strong commitment to data-driven quality improvement systems that further positive outcomes for children and families.

Research Consultant and Communications Lead, QTurn LLC, March 2019 – present. Assisted in the start-up of a new performance measurement LLC. Responsibilities include: developing a communication strategy (web content, design, and related blogs); collecting and organizing content for ResearchGate; contributing to the development of an invoice system, along with other administrative tasks; and maintaining responsive, strengths-based relationships with both domestic and international clients.

Research Assistant, David P. Weikart Center for Youth Program Quality at the Forum for Youth Investment, 2016–2019. Responsible for providing training and technical assistance for the Weikart Center’s online data collection/management system (referred to below as Scores Reporter). Led two statewide data collection projects for the U.S. Department of Education’s 21st Century Community Learning Centers programs, which included creating the tools necessary for data collection and providing training and technical assistance. Conducted external assessment interviews and produced initial qualitative analyses. Created and implemented multiple surveys throughout the year in multiple communities. Supported multiple research projects focused on improving the quality of teaching in communities and conducted initial analysis once data was collected. Supported and led logistical coordination of the Social and Emotional Learning Validation Study. Research focus areas included organizational performance, child development, and methodology. Specific skills included data collection (interview and observational), data management (cleaning, aggregation, merging), analytics, and qualitative analysis with a focus on social-emotional learning.

Co-founder and Head Coach, Mock Trial Mentors, 2016–2017. Established a successful, sustainable, and competitive high school mock trial team. Activities included: designing a high school mock trial curriculum that offers opportunities for further exploration of social justice; providing training, mentoring, team development, and academic support; and serving as a liaison to local legal professionals and the University of Michigan mock trial team.

B.A., Political Science – American Politics. University of Michigan.

Webinars and Trainings:

Self-Assessment & Scores Reporter Webinar (23) – Co-presentation and check-in on best practices for the self-assessment process, plus training on the utility and functionality of the Scores Reporter system. My role in these trainings focused on how clients enter and interact with their data.

Network Lead Scores Reporter Walk-Through (2) – A deeper dive into the Scores Reporter system, useful for administrators who oversee data entry and report creation for multiple programs.

Annual Performance Reporting Webinar (3) – Emphasizes the importance of reporting accurate and timely data to Congress; reviews the timeline and process of data collection with programs; and answers questions about the specific data being collected.

Survey Administration Webinar (1) – For clients administering multiple surveys across their network, I review survey protocols and best practices for collecting high-quality data at high response rates.

Annual Performance Reporting (APR) for 21st Century Community Learning Centers (CCLC) – To continue receiving federal funding, states must report program information and performance data demonstrating that their programs meet the requirements of the 21st CCLC grant. For Arkansas and Oklahoma, I developed new data collection processes for each data collection period. These processes include training and technical assistance for the individual sites submitting this information, as well as coordination with the relevant state departments to collect additional information. Once collected, the data is cleaned and entered into the national database for Congress to review and approve funding. There are three data collection points throughout the year, and programs must report for each year they receive funding.


Social and Emotional Learning Validation Study – Collected observational data from programs participating in the study in Michigan (March 2018) and Massachusetts (December 2018). Coordinated the observation schedules of 11 external assessors across 64 sites and the collection of a child-level assessment (Staff Rating of Youth Behavior, SRYB) for over 2,000 youth at two time points.



Youth Program Quality Intervention Reports (20) – These reports are the culmination of a client’s engagement in the Youth Program Quality Intervention (YPQI) process during a program year. They combine data demonstrating the quality of programming delivered, the supports provided by and to staff and managers, and the extent to which the YPQI process was implemented by managers and staff throughout the year. For each of these reports, I was responsible for data collection, baseline data analysis, and proofreading/editing. These reports are often customized to meet the needs of the client; common customization requests include additional survey data, multi-year breakdowns, state assessment data, and attendance analysis.