Member Survey Analysis¶

Survey outcomes for Recovery Cafe San Jose members, organized by domain. Results are grouped into wellbeing at enrollment (baseline), program impact (did the Cafe help?), and change over time (pre-post comparison for members with follow-up surveys).

639 surveys from 588 members: 552 initial and 87 follow-up. 37 members have both an initial and at least one follow-up survey (the longitudinal cohort).

Survey Response Rates (2025-03-19 to 2026-03-19)¶

| Survey Type | Scheduled | Completed | Rate |
|-------------|-----------|-----------|------|
| Initial     | 536       | 552       | 103% |
| 3 Month     | 136       | 41        | 30%  |
| 6 Month     | 77        | 14        | 18%  |
| 9 Month     | 32        | 8         | 25%  |
| 12 Month    | 15        | 11        | 73%  |
| Annual      | 33        | 13        | 39%  |
| Total       | 829       | 639       | 77%  |

"Scheduled" = milestones that fell due based on each member's enrollment date, excluding members disenrolled before the due date. Does not distinguish between surveys not administered vs member declined.


1. Wellbeing at Enrollment¶

How do members rate their hope, coping, connectedness, recovery desire, and health when they first join? This establishes the baseline for measuring change.

Responses are shown as diverging bars: favorable responses (green) extend right, unfavorable (red) extend left, with the percentage in each category labeled. Domains are sorted by percent favorable, highest to lowest.

[Figure: diverging bar chart of baseline ratings for hope, coping, connectedness, recovery desire, and health]

Based on 552 initial surveys. Favorable = top 2 responses, Unfavorable = bottom responses.
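A minimal sketch of the tabulation behind the diverging bars, assuming responses are scored 0-5 (the scale used in the pre-post table in Section 3) with the bottom two options treated as unfavorable; file and column names are placeholders:

```python
import pandas as pd

# Hypothetical long format: one row per (member, domain) initial response.
initial = pd.read_csv("initial_surveys.csv")  # member_id, domain, score

summary = (
    initial.assign(
        favorable=initial["score"] >= 4,    # top 2 response options
        unfavorable=initial["score"] <= 1,  # assumed bottom 2 options
    )
    .groupby("domain")[["favorable", "unfavorable"]]
    .mean()                                 # fraction of responses per category
    .mul(100)
    .round(1)
    .sort_values("favorable", ascending=False)  # chart sort order
)
print(summary)
```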


2. Program Impact — "Has the Recovery Cafe Helped?"¶

On follow-up surveys, members are asked whether the Cafe has helped across several domains. These are the most direct measures of perceived program benefit. Results are shown as percent favorable (agree/yes).

[Figure: percent favorable by "has the Cafe helped?" domain]

Based on 87 follow-up surveys from 73 members. Green = 80%+, yellow = 50-79%, red = <50%.


3. Change Over Time — Pre-Post Comparison¶

For the members who have both an initial and at least one follow-up survey, how did their self-rated wellbeing change? Each member's initial score is compared to their most recent follow-up.

The slope chart shows the direction and magnitude of change. Lines going up-right indicate improvement; lines going down-right indicate decline.

37 members have both initial and follow-up surveys.

[Figure: slope chart of initial vs most recent follow-up scores by domain]
| Domain          | N  | Improved | Same | Declined | Avg Change |
|-----------------|----|----------|------|----------|------------|
| Hope            | 37 | 12 (32%) | 20   | 5 (14%)  | +0.3       |
| Coping          | 37 | 12 (32%) | 15   | 10 (27%) | +0.3       |
| Connectedness   | 37 | 10 (27%) | 18   | 9 (24%)  | +0.1       |
| Recovery Desire | 36 | 7 (19%)  | 25   | 4 (11%)  | +0.2       |
| Physical Health | 37 | 7 (19%)  | 24   | 6 (16%)  | +0.1       |
| Mental Health   | 37 | 17 (46%) | 13   | 7 (19%)  | +0.6       |

Avg Change is in points on the 0-5 response scale; positive values indicate improvement.
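The pairing logic (each member's initial score vs their most recent follow-up) and the summary table above could be computed along these lines; this is a sketch with placeholder file and column names, not the report's actual code:

```python
import pandas as pd

# Hypothetical long format: one row per (member, survey, domain) score, with
# survey_type in {"initial", "follow_up"}.
surveys = pd.read_csv("surveys.csv", parse_dates=["survey_date"])

pre = surveys[surveys["survey_type"] == "initial"]
post = (
    surveys[surveys["survey_type"] == "follow_up"]
    .sort_values("survey_date")
    .groupby(["member_id", "domain"], as_index=False)
    .last()  # most recent follow-up per member and domain
)

paired = pre.merge(post, on=["member_id", "domain"], suffixes=("_pre", "_post"))
paired["change"] = paired["score_post"] - paired["score_pre"]

stats = paired.groupby("domain")["change"].agg(
    n="count",
    improved=lambda s: int((s > 0).sum()),
    same=lambda s: int((s == 0).sum()),
    declined=lambda s: int((s < 0).sum()),
    avg_change="mean",
)
print(stats.round(1))
```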


4. Housing Stability¶

Where are members living, and are they satisfied? Housing is a critical outcome for programs serving people recovering from homelessness.

[Figure: housing status and housing satisfaction distributions]

5. Recovery Profile¶

What substances are members recovering from, and how strong is their desire for recovery?

[Figure: substances in recovery and strength of recovery desire]

Alcohol use (past 30 days): median 0 days, mean 2.4 days (N=356 members in alcohol recovery)

Drug use (past 30 days): median 0 days, mean 3.1 days (N=360 members in drug recovery)


6. Recommendations¶

Survey Process¶

  1. Prioritize follow-up survey completion. We have 552 initial surveys but only 87 follow-ups (16%). Without follow-ups, we cannot measure whether the program is working. The 37 members with pre-post data are too few to draw confident conclusions. See the Survey Completion Tracking report for the gap analysis.

  2. Distinguish "not administered" from "member declined." The response rate table above shows scheduled vs completed, but a missed survey could mean staff never offered it, or the member was asked and declined. These are different problems requiring different solutions. Consider adding a "declined" option in Agency when a survey is offered but not completed, so we can separate process gaps from member choice. Until then, the completion rates should be read as lower bounds on the true response rate: surveys that were never offered count against the rate even though the member had no chance to respond.

  3. Integrate survey timing with the membership lifecycle. The 3/6/9/12-month milestones should be flagged automatically so staff know when a follow-up is due. Consider adding survey due dates to the intervention patterns report; a sketch of such a flag follows this list.

  4. Close the feedback loop. Share aggregated results (not individual responses) with staff and members. People are more willing to complete surveys when they see the data is used. Consider a quarterly "what the surveys tell us" summary for circle facilitators.
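To make item 3's milestone flag concrete, here is a sketch under assumed column names (member_id, enrolled_on, and a milestone_months field on completed surveys); it lists members whose 3/6/9/12-month follow-up falls due in the next 30 days and has not yet been completed.

```python
import pandas as pd

# Hypothetical inputs; column names are placeholders for the Agency export.
members = pd.read_csv("members.csv", parse_dates=["enrolled_on"])
completed = pd.read_csv("surveys.csv", parse_dates=["survey_date"])

today = pd.Timestamp.today().normalize()
flags = []
for months in (3, 6, 9, 12):
    due = members["enrolled_on"] + pd.DateOffset(months=months)
    coming_due = due.between(today, today + pd.Timedelta(days=30))
    # Already completed this milestone? (milestone_months is an assumed field.)
    done = members["member_id"].isin(
        completed.loc[completed["milestone_months"] == months, "member_id"]
    )
    mask = coming_due & ~done
    flags.append(
        members.loc[mask, ["member_id"]].assign(
            milestone=f"{months} Month", due_on=due[mask]
        )
    )

due_list = pd.concat(flags).sort_values("due_on")
print(due_list.to_string(index=False))
```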

Survey Questions¶

  1. Add a "not applicable" option to recovery-specific questions. Members not in recovery from alcohol or drugs must still navigate skip logic. An explicit N/A reduces blank responses and makes the data cleaner.

  2. Separate self-assessment from program attribution. The current survey mixes "rate your hope" (self-assessment) with "did the Cafe help your hope?" (attribution) in the same section. These measure different things and should be analyzed separately (as we do in Sections 1 and 2 above).

  3. Consider adding a social connectedness scale. The current connectedness question is a single item. A validated 3-5 item scale (e.g., items from the Social Connectedness Scale) would be more reliable and sensitive to change over time.

  4. The "Has the Recovery Cafe helped?" questions may produce inflated results. Members who are present and filling out the survey are self-selected — they stayed. Members who left (and might have negative experiences) are not surveyed. Acknowledge this limitation when presenting results. Consider adding a question about what could be improved.

  5. Align with SAMHSA NOMs where possible. The survey already covers most SAMHSA National Outcome Measure domains (substance use, mental health, housing, physical health). Explicitly mapping your questions to NOMs makes results legible to funders and allows benchmarking against national data.

Data Quality¶

  1. Standardize the housing categories. The current survey has 20 housing options, many rarely used. Grouping them into 5-6 categories (housed, family/friends, transitional, shelter, unsheltered, institutional), as done in Section 4 above, would make the data more actionable (see the sketch after this list).

  2. Validate survey dates against enrollment dates. Some initial surveys may be backdated or entered late. Cross-referencing survey dates with Agency enrollment dates would flag data entry issues (also covered in the sketch below).
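A combined sketch of both data-quality checks. The housing labels, the 30-day tolerance, and all file and column names are assumptions to be adjusted against the actual survey wording and Agency schema:

```python
import pandas as pd

# Illustrative mapping only: the raw labels below stand in for the survey's
# 20 housing options and should be replaced with the actual wording.
HOUSING_GROUPS = {
    "Own apartment": "housed",
    "Rented room": "housed",
    "Staying with family": "family/friends",
    "Staying with friends": "family/friends",
    "Transitional housing": "transitional",
    "Emergency shelter": "shelter",
    "Vehicle": "unsheltered",
    "Street/outdoors": "unsheltered",
    "Jail/prison": "institutional",
    "Residential treatment": "institutional",
}

surveys = pd.read_csv("surveys.csv", parse_dates=["survey_date"])
members = pd.read_csv("members.csv", parse_dates=["enrolled_on"])

# Check 1: collapse raw housing options into reporting categories.
surveys["housing_group"] = surveys["housing"].map(HOUSING_GROUPS).fillna("other/unknown")

# Check 2: flag initial surveys dated before enrollment or long after it
# (the 30-day tolerance is an assumption to tune against intake practice).
merged = surveys.merge(members[["member_id", "enrolled_on"]], on="member_id")
gap_days = (merged["survey_date"] - merged["enrolled_on"]).dt.days
suspect = merged[(merged["survey_type"] == "initial") & ((gap_days < 0) | (gap_days > 30))]
print(suspect[["member_id", "survey_date", "enrolled_on"]])
```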