HBCU Master’s Degree Program – Data Collection Plan

Fayetteville State University | Grant Period: 2023-2029

Purpose of Data Collection

This comprehensive data collection plan ensures systematic tracking of program performance against all six objectives and their associated performance indicators. Data collection supports continuous improvement, accountability, and evidence-based decision-making throughout the grant period.

Key Principles: Data will be collected consistently, analyzed regularly, and used to inform program adjustments. All data collection methods comply with FERPA and institutional research protocols.

Primary Data Sources: Institutional records, online surveys, program logs, faculty reports, and external sources will provide comprehensive evidence of program outcomes and impacts.

  • Institutional Data: Enrollment, graduation, and retention rates from the University Registrar and Institutional Research office
  • Online Surveys: Student feedback, graduate outcomes, and program satisfaction assessments
  • Program Logs: Scholarship awards, participation tracking, and event attendance records
  • Faculty Reports: Publications, presentations, grant activities, and professional development
  • External Sources: Employer surveys, partnership agreements, and state education records

Objective 1: Increase Graduate STEM Enrollment and Degree Completion

Track enrollment growth, degree completion, research engagement, and time-to-degree metrics

Data Point & Description | Data Source | Collection Frequency | Performance Indicator Link
Total Graduate STEM Enrollment
Number of students enrolled in MAT, MEd, MBA, MSN, MS programs in targeted STEM disciplines each fall semester
Institutional Data

University Registrar’s Office / Institutional Research

Annual
Census date each Fall semester
PI 1.1: 3% annual enrollment increase
(Baseline: 201, Target: 207, Year 1: 208)
Enrollment by Program
Breakdown of students by specific degree program (MAT Math, MEd Science, MBA BIDA, MSN, etc.)
Institutional Data

Registrar / Program Coordinators

Annual
Each Fall semester
PI 1.1: Track enrollment trends by program to identify growth areas
Graduate Degree Completions
Number of students earning master’s degrees in targeted STEM disciplines annually
Institutional Data

Registrar’s Office / Institutional Research

Annual
End of each academic year (Summer graduation)
PI 1.2: 3% annual increase in degrees earned
(Baseline: 65 for 2023-2024)
HBCU Scholar Research/Clinical Participation
Number and percentage of scholarship recipients engaged in research projects, apprenticeships, or clinical experiences
Program Logs + Faculty Reports

Program office tracking + Faculty advisor reports

Semester
End of Fall and Spring semesters
PI 1.3: ≥75% of scholars in experiential learning
(Year 1: 80%)
Research Project Titles & Details
Documentation of specific research projects, apprenticeship placements, and clinical experiences with faculty mentors
Student Reports + Program Logs

Student research forms + Program database

Semester
End of each semester
PI 1.3: Qualitative evidence of research engagement and quality
Median Time to Degree Completion
Median time from enrollment to graduation by program type (MAT: 2 years, MEd: 2 years, MBA: 2.5 years, MSN: 2 years, MS: 2 years)
Institutional Data

Registrar / Institutional Research cohort analysis

Annual
End of each academic year
PI 1.4: Median time meets program-specific targets (2-2.5 years)
Student Demographics
Race/ethnicity, socioeconomic status (Pell-eligible), gender breakdown of enrolled students
Institutional Data

Student records / Financial Aid office

Annual
Fall census date
PI 1.1: Monitor diversity and ensure broad participation across student populations

Why This Data Matters

Objective 1 data demonstrates the program’s core impact on access and completion. Enrollment trends show whether financial assistance and support services are effectively attracting students. Research engagement metrics validate the quality of experiential learning. Time-to-degree data ensures efficiency and student success, while completion rates measure the ultimate goal of degree attainment.
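
To make the arithmetic behind PIs 1.1-1.3 concrete, the sketch below shows one way the annual 3% growth targets and the experiential-learning participation share could be computed from collected counts. It is illustrative only: the baseline and Year 1 enrollment figures come from this plan, but the function names and the sample scholar counts are assumptions.

```python
# Illustrative sketch: compounding 3% annual targets for PI 1.1 (enrollment)
# and PI 1.2 (degree completions). Baselines come from the plan; scholar
# counts for PI 1.3 are hypothetical.

BASELINES = {"enrollment": 201, "degrees": 65}  # Fall 2023 / 2023-2024 baselines
GROWTH = 0.03  # 3% annual increase target

def target(baseline: int, year: int) -> int:
    """Target for a given grant year, compounding 3% per year from the baseline."""
    return round(baseline * (1 + GROWTH) ** year)

def check(metric: str, year: int, actual: int) -> str:
    t = target(BASELINES[metric], year)
    status = "met" if actual >= t else "not met"
    return f"{metric} Year {year}: target {t}, actual {actual} ({status})"

if __name__ == "__main__":
    print(check("enrollment", 1, 208))  # Year 1 actual reported as 208 (target 207)
    # PI 1.3: share of HBCU Scholars in experiential learning (>= 75% target)
    scholars_total, scholars_engaged = 20, 16  # hypothetical semester counts
    print(f"Experiential learning participation: {scholars_engaged / scholars_total:.0%} (target >= 75%)")
```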

Objective 2: Strengthen STEM Teacher Preparation

Monitor teacher candidate enrollment, licensure completion, program cost-effectiveness, and infrastructure development

Data Point & Description | Data Source | Collection Frequency | Performance Indicator Link
HBCU-MD STEM Teacher Scholarships Awarded
Number of scholarships awarded annually to teachers, teacher candidates, or residency licensure participants in STEM disciplines
Program Logs

HBCU Grant Office scholarship database

Annual
End of academic year
PI 2.1: Award minimum 10 scholarships annually
Scholarship Recipient Profiles
Background information (current teacher, residency participant, or initial licensure candidate), program of study, demographics
Program Logs + Institutional Data

Scholarship applications + Student records

Annual
At scholarship award time
PI 2.1: Ensure scholarships target appropriate teacher populations
MAT/MEd STEM Program Enrollment
Total number of students enrolled in MAT and MEd programs in STEM teaching disciplines (Math, Science)
Institutional Data

Registrar / College of Education records

Annual
Fall census date
PI 2.2: 3% annual enrollment increase
(Baseline: 27, Year 1: 28)
Teacher Licensure Completions
Number of graduates obtaining initial or advanced teaching licensure in STEM disciplines
Institutional Data + External Data

College of Education + NC DPI licensure records

Annual
End of academic year
PI 2.2: 3% annual increase in completions
(Baseline: 13)
Cost of Attendance per Graduate
Total program cost divided by number of graduates in supported academic programs
Institutional Data + Program Logs

Finance office + Grant expenditure tracking

Annual
End of fiscal year
PI 2.3: Cost per degree ≤ $60,000
(Year 1: $21,743)
COE STEM Education Center Construction Progress
Milestone completion percentages: Planning (Year 1), Construction (Years 2-3), Completion (Year 4)
Project Reports

Facilities Management + Construction contractor updates

Quarterly
Construction progress reports
PI 2.4: Planning 20% in Year 1, construction 2024-2026, completion 2026-2027
(Year 1: 20% complete)
Teacher Placement Data
Employment outcomes for teacher graduates (school district, grade level, subject area)
Graduate Survey + External Data

Alumni surveys + NC DPI employment records

Annual
6-12 months post-graduation
PI 2.2: Validate completions translate to teacher workforce contribution

Why This Data Matters

Objective 2 data addresses the critical teacher shortage in STEM fields. Scholarship and enrollment data demonstrate program reach to teacher candidates. Licensure completion rates validate program effectiveness in preparing qualified STEM teachers. Cost-effectiveness metrics ensure efficient use of grant funds. Infrastructure tracking ensures modern facilities support teacher preparation. Placement data confirms graduates enter the teaching workforce, directly impacting K-12 STEM education quality.
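
Because PI 2.3 drives an annual cost threshold, it may help to spell out the calculation. The sketch below divides total supported-program cost by the number of graduates and checks the $60,000 ceiling; the expenditure and graduate counts shown are placeholders, not grant figures (the plan reports an actual Year 1 value of $21,743 per graduate).

```python
# Illustrative sketch for PI 2.3: cost of attendance per graduate.
# The cost and graduate counts below are placeholders, not program data.

COST_CEILING = 60_000  # PI 2.3 threshold: cost per degree <= $60,000

def cost_per_graduate(total_program_cost: float, graduates: int) -> float:
    """Total cost of supported academic programs divided by degrees awarded."""
    if graduates == 0:
        raise ValueError("No graduates recorded for this fiscal year.")
    return total_program_cost / graduates

if __name__ == "__main__":
    per_grad = cost_per_graduate(total_program_cost=500_000, graduates=14)  # hypothetical
    status = "met" if per_grad <= COST_CEILING else "not met"
    print(f"Cost per graduate: ${per_grad:,.0f} (target <= ${COST_CEILING:,}): {status}")
```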

Objective 3: Enhance Program Relevance Through Expansion

Track new program development, facility upgrades, and alignment with industry demands

Data Point & Description | Data Source | Collection Frequency | Performance Indicator Link
New Concentrations/Certificates Launched
Number and names of new concentrations and certificates implemented (Fintech, Digital Enterprise, Instructional Technology, Cybersecurity, ERP-SAP, etc.)
Institutional Data + Program Reports

Academic Affairs / Program Directors

Semester
When programs launch
PI 3.1: Minimum 3 new concentrations/certificates by Year 4
(Year 1: 2 launched + ERP initiated)
Program Development Milestones
Timeline of curriculum development, approval processes, accreditation reviews for new programs
Program Logs + Status Reports

Program development team + Academic Affairs

Ongoing
Tracked continuously
PI 3.1: Monitor progress toward 3+ program launches
Enrollment in New Concentrations
Number of students enrolled in each new concentration/certificate program each semester
Institutional Data

Registrar / Program Coordinators

Semester
Each Fall and Spring
PI 3.1: Assess demand and viability of new programs
MS in Data Science Proposal Status
Progress through development stages: Intent to Plan, full proposal development, submission to UNC System, approval status
Status Reports + External Data

Program development team + UNC System Office communications

Quarterly
Development phase tracking
PI 3.2: Proposal submission by Year 3
(Year 1: 50% – Intent to Plan submitted)
Facility Upgrades Completed
List and status of laboratory, technology space, and equipment upgrades (Bloomberg Terminals, Data Science Lab, etc.)
Project Logs + Facilities Reports

Facilities Management + IT Services + Grant office

Ongoing
Project completion tracking
PI 3.3: 100% completion by Year 6
(Year 1: 50% complete)
Technology Acquisitions
Equipment, software, and technology purchased and deployed (with costs and implementation dates)
Purchase Logs + Institutional Data

Procurement + IT Services + Grant budget tracking

Ongoing
As purchases occur
PI 3.3: Document technology infrastructure enhancements
Industry Partnership Agreements
Documentation of partnerships with companies (SAP, SAS, IBM, etc.), including services and resources provided
Partnership Docs + Program Logs

Partnership agreements + Program office records

Annual
Partnership review
PI 3.1 & 3.3: Demonstrate industry alignment and support for new programs
Student Satisfaction with Facilities
Survey responses on quality and adequacy of labs, technology spaces, and equipment
Student Survey

End-of-semester program evaluation surveys

Semester
End of Fall and Spring
PI 3.3: Validate that upgrades enhance student learning experience

Why This Data Matters

Objective 3 data demonstrates the program’s responsiveness to evolving industry needs and educational trends. New concentration tracking shows curriculum innovation in high-demand fields like fintech, cybersecurity, and data science. The MS in Data Science development represents a major institutional expansion. Facility upgrade documentation validates investment in learning infrastructure that will serve students beyond the grant period. Industry partnership data confirms programs meet employer needs, enhancing graduate employability.

Objective 4: Support Faculty Research and Development

Monitor faculty scholarly productivity, professional development, and research support

Data Point & Description | Data Source | Collection Frequency | Performance Indicator Link
Faculty Travel Grants Awarded
Number of faculty receiving travel grants for conferences, workshops, and professional development events
Program Logs

Grant office travel grant database

Annual
End of fiscal year
PI 4.1: Minimum 10 faculty travel grants annually
(Year 1: 9)
Conference/Event Details
Names of conferences attended, dates, locations, purpose (presentation, workshop participation, networking)
Faculty Reports + Travel Logs

Travel request forms + Post-travel reports

Ongoing
After each event
PI 4.1: Document types and quality of professional development activities
Faculty Research Mini-Grants Awarded
Number of research mini-grants awarded starting Year 3, with project titles and principal investigators
Program Logs

Grant office mini-grant database

Annual
Starting Year 3 (2025-2026)
PI 4.2: Minimum 5 mini-grants annually beginning Year 3
Mini-Grant Research Outcomes
Publications, presentations, or other scholarly products resulting from mini-grant funded research
Faculty Reports

Required final reports from grant recipients

Annual
End of grant period
PI 4.2: Assess productivity and impact of research support
Faculty Publications
Number and types of publications (journal articles, book chapters, books) by STEM faculty annually
Faculty Reports + Institutional Data

Faculty annual reports + Institutional Research

Annual
End of academic year
PI 4.3: Minimum 50 publications annually
(Year 1: 191 – far exceeded)
Publication Citations
Full citations for faculty publications including journal names, titles, authors, DOIs
Faculty CVs + Institutional Data

Faculty CV repository + Annual activity reports

Annual
Faculty reporting period
PI 4.3: Document scholarly contributions and quality
Conference Presentations
Number and titles of conference presentations, posters, and invited talks by faculty
Faculty Reports

Faculty annual activity reports

Annual
End of academic year
PI 4.3: Measure dissemination of research beyond publications
Faculty-Student Research Collaborations
Number of faculty mentoring students in research, with co-authored publications or presentations
Faculty Reports + Research Logs

Research symposium records + Faculty reports

Annual
End of academic year
PI 4.3: Connect faculty research to student learning outcomes
External Grant Applications & Awards
Number of external grant proposals submitted and awarded by STEM faculty (with funding amounts)
Institutional Data

Office of Sponsored Programs

Annual
Fiscal year end
PI 4.2 & 4.3: Measure broader research competitiveness and capacity building

Why This Data Matters

Objective 4 data demonstrates investment in faculty capacity, which directly benefits students through improved instruction and research mentorship. Travel grant data shows professional development reach. Publication metrics (191 in Year 1 vs. 50 target) indicate exceptionally strong scholarly productivity. Mini-grants (starting Year 3) will seed research projects that enhance FSU’s research profile. Faculty-student collaborations link research support to student outcomes. This data validates that supporting faculty excellence creates a ripple effect throughout the program.
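
One practical detail in tallying PI 4.3 is avoiding double counts when co-authors each report the same publication. The sketch below illustrates one possible approach, de-duplicating by DOI (falling back to title) before counting against the 50-publication target; the record format and sample entries are assumptions, not the program's actual reporting schema.

```python
# Illustrative sketch for PI 4.3: counting unique faculty publications per year
# from faculty activity reports. Record fields and sample entries are
# hypothetical; de-duplication keys on DOI, falling back to a normalized title.

from collections import Counter

TARGET = 50  # PI 4.3: minimum 50 publications annually

reports = [  # hypothetical faculty-report entries
    {"faculty": "A. Smith", "year": 2024, "doi": "10.1000/xyz1", "title": "Paper 1"},
    {"faculty": "B. Jones", "year": 2024, "doi": "10.1000/xyz1", "title": "Paper 1"},  # co-authored, same item
    {"faculty": "B. Jones", "year": 2024, "doi": None, "title": "Book chapter"},
]

def annual_counts(records):
    seen, counts = set(), Counter()
    for r in records:
        key = (r["year"], r["doi"] or r["title"].strip().lower())
        if key not in seen:
            seen.add(key)
            counts[r["year"]] += 1
    return counts

if __name__ == "__main__":
    for year, n in sorted(annual_counts(reports).items()):
        print(f"{year}: {n} unique publications (target >= {TARGET})")
```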

Objective 5: Develop and Revise STEM Curricula

Track course development, instructional tool acquisition, and curriculum quality improvements

Data Point & Description | Data Source | Collection Frequency | Performance Indicator Link
New Courses Developed
List of new STEM courses created with course numbers, titles, credit hours, and program alignment
Institutional Data + Program Reports

Academic Affairs / Curriculum committees

Semester
When approved/launched
PI 5.1: Develop minimum 10 new courses by Year 6
(Year 1: Planning begun for Fintech, Instructional Tech, Data Science)
Course Development Timeline
Development stages for each course: planning, syllabus creation, approval, first offering
Program Logs + Status Reports

Curriculum development team documentation

Ongoing
Continuous tracking
PI 5.1: Monitor progress toward 10-course target
Evidence of Course Development Steps
Documentation of course development process: needs assessment, learning outcomes, curriculum proposals, syllabi drafts, materials developed, pilot feedback
Development Docs + Program Logs

Course development files + Academic Affairs records

Per Course
Throughout development cycle
PI 5.1: Document quality and rigor of course development process
Courses Revised
List of existing STEM courses updated with revision details (content updates, technology integration, alignment improvements)
Institutional Data + Faculty Reports

Curriculum committees / Faculty documentation

Semester
When revisions implemented
PI 5.2: Revise minimum 8 courses by Year 6
(Year 1: Identification phase)
Course Revision Rationale
Documentation of why courses were revised and what improvements were made (industry alignment, pedagogy, technology)
Faculty Reports

Course revision proposals and reports

As Completed
Per revision project
PI 5.2: Demonstrate quality and relevance of revisions
Teacher Licensure Tools Acquired
List of instructional tools purchased (SchoolSims, Pearson products, etc.) with implementation status
Purchase Logs + Implementation Reports

Procurement records + COE usage reports

Ongoing
As acquired
PI 5.3: Acquire licensure tools throughout grant period
(Year 1: SchoolSims acquired)
Licensure Tool Usage Data
Number of students using tools, frequency of use, integration into courses
Faculty Reports + Usage Logs

Platform usage analytics + Faculty reports

Semester
End of Fall and Spring
PI 5.3: Validate tools enhance teacher preparation
Software & Data Licenses Acquired
List of software licenses, data licenses, and equipment purchased (Bloomberg, VMock, SAP, SAS, IBM, nursing simulation, etc.)
Purchase Logs

IT Services + Grant office procurement tracking

Ongoing
As acquired
PI 5.4: Acquire licenses/equipment throughout grant
(Year 1: 4 major acquisitions – exceeded target)
Software/License Integration
Courses using each software/license, number of students with access, certifications earned
Institutional Data + Usage Reports

LMS data + Software usage analytics + Certification records

Semester
End of each term
PI 5.4: Document student engagement with professional tools and certification achievement
Student Satisfaction with Curriculum
Survey responses on course quality, relevance, technology integration, and preparation for careers
Course Evaluations + Program Surveys

Course evaluation system + Program-specific surveys

Semester
End of each course
PI 5.1 & 5.2: Assess student perception of curriculum quality and relevance
Industry Certification Achievement
Number and percentage of students earning industry certifications (SAP, SAS, IBM AI, etc.)
Institutional Data + External Data

Program records + Certification vendor data

Semester
After certification exams
PI 5.4: Validate curriculum prepares students for professional credentials

Why This Data Matters

Objective 5 data demonstrates continuous curriculum improvement and industry alignment. New course development (target: 10) and revisions (target: 8) show responsiveness to emerging fields and pedagogical best practices. Documentation of course development steps provides evidence of rigorous planning, learning outcomes design, and quality assurance processes. Teacher licensure tools (SchoolSims acquired Year 1) enhance preparation quality. Software/license acquisition (4 in Year 1, exceeding targets) provides students with industry-standard tools. Certification achievement validates curriculum rigor and career readiness. This data proves the program stays current with field demands and employer needs.
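
Since PIs 5.1 and 5.2 are cumulative Year 6 targets rather than annual rates, progress tracking amounts to summing course launches and revisions across grant years. A minimal sketch of that roll-up follows; the yearly counts shown are hypothetical.

```python
# Illustrative sketch for PIs 5.1 and 5.2: cumulative progress toward the
# 10-new-course and 8-revised-course targets by Year 6. Yearly counts are
# hypothetical placeholders, not program data.

TARGETS = {"new_courses": 10, "revised_courses": 8}

log = {  # grant year -> counts recorded that year (hypothetical)
    1: {"new_courses": 0, "revised_courses": 0},  # Year 1: planning/identification
    2: {"new_courses": 3, "revised_courses": 2},
    3: {"new_courses": 2, "revised_courses": 3},
}

for metric, goal in TARGETS.items():
    cumulative = sum(year_counts.get(metric, 0) for year_counts in log.values())
    pct = min(cumulative / goal, 1.0)
    print(f"{metric}: {cumulative}/{goal} ({pct:.0%} of Year 6 target)")
```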

Objective 6: Provide Comprehensive Student Support

Monitor graduation outcomes, employment success, and professional development engagement

Data Point & Description | Data Source | Collection Frequency | Performance Indicator Link
Annual Graduation Rate
Number and percentage of students in targeted STEM disciplines who graduate within expected timeframes
Institutional Data

Registrar / Institutional Research

Annual
End of academic year
PI 6.1: Track graduation rates with Year 1 baseline (65 graduates)
Assess annually beginning Year 2
Graduation Rate by Program
Breakdown of graduation rates by specific degree programs (MAT, MEd, MBA concentrations, MSN, MS)
Institutional Data

Institutional Research cohort analysis

Annual
End of academic year
PI 6.1: Identify program-specific trends and areas needing support
Student Persistence & Retention
Fall-to-fall retention rates and semester-to-semester persistence for STEM graduate students
Institutional Data

Registrar / Institutional Research

Semester
Each Fall and Spring
PI 6.1: Early indicators of graduation success; identify at-risk students
Employment Outcomes
Number and percentage of graduates employed in their field of study within 1 year of graduation
Graduate Survey + External Data

Graduate completer surveys + LinkedIn + employer reports

Annual
6-12 months post-graduation, starting Year 3
PI 6.2: Monitor employment outcomes
Tracking begins Year 3 (2025-2026)
Employment Details
Job titles, employers, salary ranges, geographic locations, career advancement for graduates
Graduate Survey

Detailed graduate outcome surveys

Annual
6-12 months post-graduation, starting Year 3
PI 6.2: Validate career relevance and salary outcomes (MBA BIDA avg: $75K)
Graduate Further Education
Number of graduates pursuing doctoral degrees or additional certifications
Graduate Survey

Graduate follow-up surveys

Annual
Starting Year 3
PI 6.2: Additional measure of program success and career trajectory
Scholar Professional Development Participation
Number and percentage of HBCU STEM Scholars attending conferences, workshops, seminars, or other PD activities
Attendance Logs + Event Reports

Event registration systems + Sign-in sheets + Travel records

Ongoing
After each event
PI 6.3: ≥80% scholars participate in PD annually
(Year 1: 100%)
Professional Development Event Details
List of PD opportunities offered: Student Research Symposium, Excellence in Teaching Conference, BIDA/HINF Symposium, external conferences
Program Logs + Event Reports

Event planning documents + Post-event summaries

Ongoing
Per event
PI 6.3: Document range and quality of PD opportunities provided
Academic Support Service Utilization
Usage of tutoring, advising, mentoring, writing support, and other academic services by STEM students
Service Logs + Institutional Data

Academic support center records + Advising logs

Semester
End of Fall and Spring
PI 6.1: Assess whether support services reach students and contribute to retention
Non-Academic Support Service Utilization
Usage of mental health services, financial counseling, career services, and wellness programs
Institutional Data

Student Affairs offices (aggregated, de-identified data)

Semester
End of each term
PI 6.1: Holistic support contributes to student success and wellbeing
Student Satisfaction & Program Feedback
Survey responses on overall program satisfaction, support services quality, academic experience, and recommendations
Student Surveys + Exit Surveys

Mid-program and exit surveys + Focus groups

Semester & Annual
Mid-program + At graduation
PI 6.1 & 6.3: Continuous improvement feedback on all support dimensions
Scholarship Impact Assessment
Student feedback on role of financial support in enrollment decision, persistence, and degree completion
Scholar Surveys

HBCU Scholar-specific surveys + Exit interviews

Annual
Mid-year and exit
PI 6.1: Document impact of financial assistance on access and completion
(Phase I: 67% cited scholarship as primary enrollment reason)

Why This Data Matters

Objective 6 data demonstrates the holistic student support approach. Graduation rates (baseline: 65) show program effectiveness in helping students complete degrees. Employment tracking (beginning Year 3) validates career readiness and demonstrates return on investment for students and the institution. Professional development participation (100% in Year 1, target 80%) shows comprehensive development beyond academics. Academic and non-academic support utilization data guide resource allocation. Student feedback enables continuous improvement. This comprehensive data set proves the program creates an environment where students thrive academically, professionally, and personally.
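
For the retention and employment percentages above, the sketch below shows one reasonable way to compute fall-to-fall retention (excluding graduates from the denominator so completions are not counted as attrition) and the employed-in-field share. Cohort membership and counts are hypothetical.

```python
# Illustrative sketch for Objective 6 tracking: fall-to-fall retention (PI 6.1)
# and employed-in-field share within one year of graduation (PI 6.2).
# Student IDs and counts are hypothetical placeholders.

def fall_to_fall_retention(fall_cohort: set, next_fall_enrolled: set, graduated: set) -> float:
    """Share of a Fall cohort that re-enrolled the following Fall, excluding graduates."""
    still_expected = fall_cohort - graduated
    if not still_expected:
        return 1.0
    return len(still_expected & next_fall_enrolled) / len(still_expected)

def employed_in_field_rate(graduates: int, employed_in_field: int) -> float:
    """PI 6.2: share of graduates employed in their field within one year."""
    return employed_in_field / graduates if graduates else 0.0

if __name__ == "__main__":
    cohort = {f"S{i}" for i in range(1, 41)}    # 40 students (hypothetical)
    graduated = {f"S{i}" for i in range(1, 6)}  # 5 graduated during the year
    returned = {f"S{i}" for i in range(4, 38)}  # enrolled again the next Fall
    print(f"Fall-to-fall retention: {fall_to_fall_retention(cohort, returned, graduated):.0%}")
    print(f"Employed in field within 1 year: {employed_in_field_rate(60, 48):.0%}")  # hypothetical counts
```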

Data Collection Timeline by Grant Year

Years 1-2 (2023-2025)

  • Establish all baseline data
  • Begin tracking enrollment, scholarships, and participation
  • Track facility upgrades and program launches
  • Monitor faculty publications and travel grants
  • Collect course satisfaction data
  • Document professional development participation

Years 3-4 (2025-2027)

  • Begin employment outcome tracking
  • Start faculty research mini-grant data collection
  • Track COE STEM Center construction completion
  • Monitor MS in Data Science proposal approval
  • Assess new concentration enrollment trends
  • Evaluate mid-grant progress on all objectives

Years 5-6 (2027-2029)

  • Complete longitudinal graduation trend analysis
  • Finalize employment outcome assessments
  • Document program sustainability planning
  • Conduct comprehensive program evaluation
  • Prepare final impact reports
  • Assess institutionalization of initiatives

Implementation Best Practices

  • Data Quality: Establish standardized data collection protocols with clear definitions for all metrics to ensure consistency across years
  • Privacy Compliance: All data collection complies with FERPA regulations; student-level data is aggregated and de-identified for reporting
  • Regular Reporting: Quarterly progress reports to grant leadership; annual performance reports to Department of Education
  • Data Management: Maintain centralized database with access controls; assign clear responsibilities for each data collection point
  • Survey Response Rates: Target 80%+ response rates through multiple contact attempts, incentives, and convenient survey timing (a minimal monitoring sketch follows this list)
  • Continuous Improvement: Use data to make real-time program adjustments, not just for retrospective reporting
  • External Evaluation: Independent evaluator reviews data collection methods and findings annually
  • Stakeholder Engagement: Share data summaries with faculty, students, and partners to demonstrate progress and gather feedback
  • Technology Integration: Use automated data collection where possible (LMS analytics, registration systems) to reduce manual burden
  • Documentation: Maintain audit trail for all data sources, collection methods, and analysis procedures
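
As noted in the Survey Response Rates item above, response-rate monitoring is a simple ratio per instrument. The sketch below compares completions against invitations for each survey and flags instruments below the 80% target; survey names and counts are placeholders, not program data.

```python
# Minimal sketch for monitoring survey response rates against the 80% target.
# Survey names and counts are placeholders.

RESPONSE_TARGET = 0.80

surveys = {  # instrument -> (invited, completed), hypothetical figures
    "Exit survey (Spring)": (30, 26),
    "Graduate outcomes (1-year)": (65, 44),
    "Scholar mid-year survey": (20, 20),
}

for name, (invited, completed) in surveys.items():
    rate = completed / invited if invited else 0.0
    flag = "OK" if rate >= RESPONSE_TARGET else "follow up needed"
    print(f"{name}: {completed}/{invited} = {rate:.0%} ({flag})")
```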