UNITED STATES DIGITAL RESPONSE
Streamlining data collection for the City of San Francisco
ROLE
Lead Researcher
TIMEFRAME
March - June 2024
TEAM
2 Designers, 1 Project Lead, City of SF Stakeholders
OVERVIEW
The United States Digital Response (USDR) is a nonpartisan organization that supports governments, nonprofits, and public entities in responding swiftly to critical public needs. I volunteered to lead research and design during a 16-week engagement with San Francisco’s Department of Emergency Management (DEM).
Our objective was to enhance client intake and data standardization procedures for the Emergency Response Team (ERT) and Healthy Streets Operation Center (HSOC). These teams coordinate the city’s response to unsheltered individuals experiencing homelessness, those with behavioral health challenges, encampments, street cleanliness, and related public safety issues, ensuring that San Francisco's streets are healthy for everyone.
OUTCOMES
My team conducted a two-phase test aimed at simultaneously enhancing the overall user experience of the client intake form for ERT staff in the field and improving the quality of the data collected for HSOC staff evaluating it. The phased tests produced an 84% improvement in data quality compared to the original intake form, while also making it easier for ERT staff to collect data during a client interaction.
QUICKLY GAINING CONTEXT
To come up to speed quickly on the roles and responsibilities of ERT and HSOC team members, my team conducted interviews and ride-alongs with both teams. We created user journeys for each team, assessed pain points during the data collection and analysis process, and created a set of goals to structure the engagement:
Goal 1: Automate data cleaning processes to reduce manual work.
Goal 2: Optimize data formatting and reporting to facilitate easier analysis and answering of common policy questions.
Goal 3: Improve data entry process by addressing issues with redundant questions and free-form answers.
CHALLENGES WITH THE EXISTING FORM
Form data often contained errors, typos, and incorrect formatting, and was sometimes altogether incoherent.
We conducted an audit of the existing intake form and learned that ERT staff had to answer ~30 questions, some of which were highly personal, while trying to maintain a client’s interest. Most of these intakes occur in the middle of the day on the streets of SF, and ERT staff described the process of using a mobile phone for intake as cumbersome and lengthy. The HSOC team had designed the form in Microsoft Forms. It had free-form fields and a laborious and awkward set of branched questions, but despite these challenges, it was free and had been in use for years.
The HSOC team identified a few questions on the form that were problematic for data quality: Today’s Date, Birthday, Staff Initial, Refusal/Acceptance of Shelter, and Location. For ERT staff, the free-form fields meant they could type out answers quickly without worrying about the clunky UX of MS Forms. For the HSOC staff, this was a nightmare. As a result, a single analyst had to manually clean the data as part of their workflow to ensure it could be used for further analysis and policy decisions.
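To give a sense of the manual cleanup burden, here is a minimal sketch of the kind of normalization an analyst might perform on free-form date and initials fields. This is purely illustrative: the field names, accepted formats, and helper functions are assumptions, not the HSOC team’s actual pipeline.

```python
from datetime import datetime
from typing import Optional

# Hypothetical set of date formats seen in free-form submissions.
DATE_FORMATS = ["%m/%d/%Y", "%m-%d-%Y", "%B %d, %Y", "%m/%d/%y"]

def normalize_date(raw: str) -> Optional[str]:
    """Try several common free-form date formats; return ISO 8601 or None.

    A None result would be flagged for manual review, which is exactly
    the labor the form redesign aimed to reduce.
    """
    raw = raw.strip()
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    return None

def normalize_initials(raw: str) -> str:
    """Keep letters only and uppercase them, e.g. 'j.d.' -> 'JD'."""
    return "".join(ch for ch in raw if ch.isalpha()).upper()
```

Replacing these free-form fields with pickers and dropdowns, as the team later did, removes the need for this kind of guesswork entirely.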
Audit of the existing form, as well as a proposal for alternative tools.
After auditing the existing form, we identified areas for improvement. Initially, we proposed alternative tools like Google Forms and JotForm to the HSOC team, believing their modern features would address the identified pain points. However, this raised concerns about pricing and the labor required to migrate the form, especially for a team already stretched thin. Understandably, we moved forward with proposing updates in Microsoft Forms.
TESTING IN TWO PHASES
With multiple ways to test a single problem area, we decided to conduct a two-phase test.
Phase 1: Add helper text within each question + UX improvements
Phase 2: Change free-form fields to drop downs and pickers + UX improvements
Summary of Changes
Each phase ran for 2 weeks, and in between we interviewed ERT and HSOC staff to understand how the changes impacted data gathering during intake and data cleaning on the backend.
TEST INSIGHTS
Phase 2’s interventions improved the quality of submissions by 83.81%.
Changes in Data Quality
After 6 weeks of testing and feedback gathering, we learned that the changes made during both Phase 1 and Phase 2 reduced the need for manual database cleanup overall.
Phase 1’s interventions, which added subtext to the Date, Location, Partner Initials, and People Sharing Vehicle questions, led to a significant reduction in dirty data, cutting it by nearly half compared to the original form.
Phase 2’s interventions, which built on several Phase 1 changes and added Location dropdowns, further improved submission quality, reducing dirty data to less than a sixth of the original percentage.
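The reported figures can be sanity-checked with simple arithmetic: if the dirty-data rate falls to less than a sixth of its baseline, the relative improvement exceeds 83%, consistent with the 83.81% reported. The baseline rate below is a made-up number chosen only to illustrate the calculation; the case study reports relative improvements, not raw rates.

```python
# Assumed baseline share of submissions needing manual cleanup (illustrative).
baseline = 0.30
phase1 = baseline * 0.5  # "cut by nearly half"
phase2 = baseline / 6    # "less than a sixth of the original"

def relative_improvement(old: float, new: float) -> float:
    """Percentage reduction in dirty submissions relative to the baseline."""
    return (old - new) / old * 100

# A drop to exactly one sixth corresponds to ~83.33% improvement, so any
# result below a sixth clears that bar, matching the reported 83.81%.
```

Note that the relative improvement is independent of the assumed baseline rate, since it cancels out of the ratio.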
ERT Staff Feedback
The addition of subtext on all questions was useful for some ERT staff, but not for all. Some members admitted to not reading any of the helper text, while others avidly read and followed the instructions given. This highlights that an open text field with helper text, though it eases form-filling, is harder to control from a data standardization standpoint.
On the other hand, the fixed drop downs for location were well received by all ERT Staff.
OUTCOMES
The HSOC team implemented our recommendations, earning recognition from the Mayor of San Francisco as a model for rapid testing and iteration.
Our 16-week engagement paved the way for a new way of working for the HSOC team. Based on our findings, the team adopted several Phase 2 recommendations, including adding helper text for Date of Birth, Shelter Acceptance/Refusal, Partner Initials, and People Sharing Vehicles. The team also transitioned to a dropdown approach for Location and improved the overall flow of the form.
At the end of our engagement, our team was invited to present our work to the Office of Mayor London Breed of San Francisco. Our collaboration with the HSOC team was recognized for introducing an innovative, rapid testing approach that streamlined decision-making—a notable achievement given the complexities of working on sensitive topics for the city.
Colleagues at USDR alongside members of the HSOC team.