CT Agency Suite surveys

QA surveys that feed your audit evidence.

Built-in survey tooling for staff and family feedback. Distribute, collect, analyze, and report on quality-assurance data — with the responses landing inside the same platform that holds the consumers, plans, and audit trail. No exporting to a separate survey vendor and stitching results back into compliance reporting.

  • Built-in — same platform as records
  • Per-consumer — linked to the right record
  • Audit-ready — survey responses are evidence
What QA surveys cover
Three audiences, one tool

Family satisfaction, staff feedback, consumer-experience surveys — all in one place, all linked to the right records.

  • Family/guardian satisfaction surveys
  • Staff working-conditions feedback
  • Consumer experience and outcomes
  • Custom QA surveys per agency need
  • State-required survey distribution
  • Linked to records — per-consumer, per-staff context
  • Audit-ready — responses are evidence
  • Configurable — custom surveys per agency
  • Reporting built in — trends, comparisons, alerts
Why this matters

Most agencies run surveys in a separate tool, then manually stitch the results into compliance reporting.

When QA surveys live in an external survey tool, every response cycle becomes a stitch-back project: export from the survey vendor, import into a spreadsheet, match responses to consumers or staff, aggregate, then generate reports for state submission. The work isn't difficult, but it is tedious, and the disconnect between survey responses and the agency's actual records means insights are slow to surface.

CT Agency Suite includes survey tooling so the responses land inside the same platform that holds the consumers, plans, staff records, and audit trail. A family satisfaction survey for a specific consumer connects to that consumer's record. A staff working-conditions survey connects to the staff member's record. Aggregated reporting feeds compliance dashboards directly — no exports, no spreadsheet bridges.

Surveys are configurable per agency. State-required survey templates ship as defaults; agencies customize for their specific QA program. Distribution methods include email links to families, in-app prompts to staff, and scheduled distribution per QA cycle. Response analysis includes trend tracking, comparison across periods or programs, and alerts when responses fall below thresholds.

What integrated QA surveys deliver
  • Responses linked to records — per-consumer, per-staff, per-program context
  • Audit-ready evidence — state-required surveys feed compliance reporting directly
  • No stitching — no separate survey vendor, no spreadsheet bridges
  • Trend tracking — year-over-year comparisons surface in dashboards
  • Configurable — templates adapt to your agency's specific QA program
Survey tooling capabilities

What the suite's survey tools deliver.

Family satisfaction surveys

Distribute satisfaction surveys to families and guardians per QA cycle. Responses link to the specific consumer's record so trends and concerns surface in context. Compliance reporting on response rates and satisfaction scores generates automatically.

Staff feedback surveys

Working-conditions, training-effectiveness, and engagement surveys for direct-care and office staff. Responses link to staff records so trends per supervisor, per program, or per credential cohort surface clearly. Confidential response options preserve honest feedback.

Consumer experience surveys

Where appropriate, direct consumer-experience surveys collect feedback from the people receiving services. Accessibility-aware survey formats accommodate diverse cognitive and communication needs.

Custom surveys per agency

Beyond the default templates, agencies build custom surveys for their specific QA programs — incident-follow-up surveys, post-training assessments, exit interviews, etc. Question types include scale, multiple choice, free text, and conditional follow-ups.
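As a concrete illustration of conditional follow-ups — the actual survey schema is internal to the platform, and the `show_if` structure below is hypothetical — a follow-up can be modeled as a question that only appears when a prior answer matches a trigger value:

```python
# Hypothetical sketch of conditional follow-up logic; the real schema is internal.

def visible_questions(questions, answers):
    """Return the questions a respondent should see, given answers so far.

    Each question is a dict; a conditional question carries a 'show_if'
    entry of (question_id, trigger_answer).
    """
    visible = []
    for q in questions:
        cond = q.get("show_if")
        if cond is None:
            visible.append(q)  # unconditional question: always shown
        else:
            dep_id, trigger = cond
            if answers.get(dep_id) == trigger:
                visible.append(q)  # shown only when the trigger answer matches
    return visible

survey = [
    {"id": "q1", "type": "scale", "text": "Rate training effectiveness (1-5)"},
    {"id": "q2", "type": "free_text", "text": "What could be improved?",
     "show_if": ("q1", 1)},  # follow-up only after a lowest-score answer
]
```

The same pattern extends to scale, multiple-choice, and free-text types: the trigger just compares against whatever answer shape the prior question produced.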

Distribution and reminders

Distribution via email links, in-app prompts, and scheduled cycles per QA program. Automatic reminders go to recipients who haven't responded, on a configurable cadence. Response rates are tracked per distribution so administrators see participation in real time.
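A minimal sketch of how a configurable reminder cadence can work — the function names and the 7/14-day default here are illustrative, not the platform's actual implementation:

```python
# Hypothetical sketch of a configurable reminder cadence.
from datetime import date, timedelta

def reminder_dates(sent_on, cadence_days=(7, 14)):
    """Dates on which reminders would fire for one recipient."""
    return [sent_on + timedelta(days=d) for d in cadence_days]

def due_reminders(sent_on, today, responded, cadence_days=(7, 14)):
    """Reminders due on `today` — none once the recipient has responded."""
    if responded:
        return []
    return [d for d in reminder_dates(sent_on, cadence_days) if d == today]

# Example: a survey distributed June 1 with the default 7/14-day cadence.
sent = date(2024, 6, 1)
```

Responding at any point simply empties the recipient's remaining reminder schedule, which is why participation tracking and reminders share one distribution record.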

Reporting and trend analysis

Response analysis includes trend tracking across cycles, comparisons across programs or time periods, and alerts when scores fall below configured thresholds. State-required reports generate automatically, with the data already in the right format.
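A below-threshold alert over per-cycle averages can be sketched like this — the function names and data shapes are hypothetical, not the platform's internals:

```python
# Hypothetical sketch of threshold alerting over per-cycle satisfaction scores.

def cycle_averages(responses):
    """Average score per QA cycle.

    `responses` maps a cycle label to the list of numeric scores collected.
    """
    return {cycle: sum(scores) / len(scores)
            for cycle, scores in responses.items() if scores}

def below_threshold(averages, threshold):
    """Cycles whose average falls below the configured alert threshold."""
    return sorted(cycle for cycle, avg in averages.items() if avg < threshold)

# Example: one strong quarter, one quarter that should trip a 3.5 alert.
scores = {"2023-Q4": [4, 5, 4], "2024-Q1": [2, 3, 2]}
```

The same per-cycle averages drive the trend and comparison views; the alert is just a filter over numbers the dashboard already computes.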

What it looks like in practice

A few ways teams use this.

Quarterly family satisfaction cycle

Quarterly family satisfaction survey cycle starts. Distribution lists pull from active consumers; emails go out automatically. Reminders fire after 7 and 14 days for recipients who haven't responded. Responses populate per-consumer records. End-of-quarter report aggregates response rates, satisfaction scores by program, and flags any consumers whose families reported concerns. The QA review meeting has data, not anecdotes.

Staff engagement surveys before annual review

Annual staff engagement survey rolls out two months before annual reviews. Anonymous response mode preserves honesty. Aggregated results break down by supervisor, by program, by tenure cohort. Patterns surface that wouldn't from individual conversations — specific supervisors with consistent low scores, programs with retention issues. Annual review planning is informed by data.

Annual family-feedback cycle

Annual family-feedback survey rolls out — base questions designed by CozziTech, with the agency adding its own questions about each consumer's experience and the SC's work. Distribution goes to all active consumers' family contacts. Responses populate per-consumer records, and aggregated results show per-SC and per-program patterns. The agency uses the data internally to improve service. (When state audits run separately and ask about QA programs, the survey data is also there as evidence.)

Frequently asked

Common QA survey questions.

Can survey responses be anonymous?

Yes. Surveys can be configured as attributed (each response linked to a specific user or family) or anonymous (responses aggregated without individual attribution). Sensitive surveys like staff engagement typically run anonymous; satisfaction surveys often run attributed so concerns can be followed up with specific consumers and families. The configuration is per-survey.
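The per-survey attribution setting can be pictured as a flag that decides whether the respondent's identity is ever written to the stored response — a hypothetical sketch, not the platform's actual storage code:

```python
# Hypothetical sketch: per-survey attribution mode decides whether a stored
# response record keeps the respondent's identity.

def store_response(survey_config, respondent_id, answers):
    """Build the record that would be persisted for one response."""
    record = {"survey": survey_config["id"], "answers": answers}
    if survey_config["mode"] == "attributed":
        record["respondent"] = respondent_id
    # In anonymous mode the identity is never written to the record,
    # so it cannot later leak into reporting.
    return record

# Example configs: one anonymous staff survey, one attributed family survey.
engagement = {"id": "staff-engagement-2024", "mode": "anonymous"}
satisfaction = {"id": "family-satisfaction-q2", "mode": "attributed"}
```

Dropping the identity at write time, rather than hiding it at read time, is what makes the anonymous guarantee hold even for privileged report viewers.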

How does the platform handle accessibility for consumer surveys?

Survey formats support large-text, simplified-language, and audio-supported variants. Question types accommodate diverse communication needs: visual choice scales, yes/no with audio prompts, etc. For consumers needing assistance, surveys can be administered with documented assistance attribution preserved in the response.

Can we send surveys via SMS or only email?

Email is the default distribution method. SMS distribution is available for early-access partners with specific use cases (e.g., communities where SMS is more reliable than email). The survey flow uses the same secure-link approach regardless of channel.

Do state-required survey templates ship by default?

Common state-required survey templates ship as defaults — HCBS satisfaction surveys, Medicaid quality reporting templates, etc. The templates are starting points; most agencies customize for their specific program needs. Custom templates are part of standard configuration.

How does survey data feed compliance reporting?

Survey response data is queryable from the platform's reporting layer. Standard compliance reports (response rates, satisfaction scores, action-taken-on-concerns) generate automatically. Custom report templates can be built for state-specific or program-specific reporting requirements.

Can supervisors see only their team's survey responses?

Yes. Role-based permissions scope survey results to what each role needs. Supervisors see their direct reports; QA leads see agency-wide data; senior leadership sees aggregated reporting. The audit trail tracks who accessed what.
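Role-based scoping of results can be sketched as a filter applied before any data leaves the reporting layer — the role names and row shape below are illustrative, not the platform's permission model:

```python
# Hypothetical sketch of role-based scoping over survey result rows.

def scoped_results(results, role, team=None):
    """Filter result rows to what a role may see.

    `results` is a list of dicts tagged with a 'supervisor' field:
    QA leads see everything, supervisors see only their own team.
    """
    if role == "qa_lead":
        return results
    if role == "supervisor":
        return [row for row in results if row["supervisor"] == team]
    return []  # unknown roles see nothing by default

# Example rows spanning two supervisors' teams.
rows = [
    {"supervisor": "alice", "score": 4},
    {"supervisor": "bob", "score": 2},
]
```

Filtering server-side, per request, is also what lets the audit trail record exactly which scoped slice each viewer accessed.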

QA surveys without the spreadsheet stitching.

Apply for the CT Agency Suite early-access program. We'll walk through your current survey workflow and how the integrated tooling fits.