University of Washington
Division of Public Behavioral Health and Justice Policy
Seattle Youth Violence Prevention Initiative Risk Assessment Tool Validation
Findings and Recommendations from Quality Assurance Interviews
Introduction

The University of Washington's (UW) Division of Public Behavioral Health and Justice Policy currently holds the contract with the City of Seattle to conduct a validation analysis of the Seattle Youth Violence Prevention Initiative (SYVPI) Risk Assessment Tool. In support of this analysis, UW staff requested permission from SYVPI Director Mariko Lockhart to conduct brief quality assurance interviews with SYVPI staff who use the Risk Assessment (RA) tool or who manage its use in work with Initiative-involved youth. UW staff participated in existing workgroup meetings with SYVPI staff around adjustments to the risk assessment tool and scheduled subsequent individual agency interviews as a follow-up to that process. The following report describes the interview process, the coding of the information collected, and a summary of the results obtained from the interviews. Recommendations for moving forward are provided.

Methods

Subjects

An email introducing research staff and explaining our role in the Risk Assessment tool validation was sent to points of contact at all three networks, seven case management agencies, and the street outreach group. Interviews were scheduled with 28 staff from the three networks, five of the seven case management agencies, and street outreach. Table 1 includes details on participant sample size per agency. Staff participants included Program Managers, Coordinators, Intake and Referral Specialists, Case Managers, Outreach Workers, and Program Supervisors. All participants were either directly implementing the assessment tool or overseeing the implementation process; at least one interviewee per site was working directly with the tool in their position within the Initiative. All results related to street outreach are identified by group
only in order to protect confidentiality; comments and suggestions were not separated at a personal level; rather, individual results are aggregated by group.

Table 1. Interview Participant Sample

Agency Proxy ID*    Participant Count
N 1                 2
N 2                 3
N 3                 2
CM 1                3
CM 2                2
CM 3                2
CM 4                3
CM 5                3
SO                  8
Total               28

*N = Network; CM = Case Management; SO = Street Outreach

Interview Process Description

The research team developed a semi-structured interview to guide the discussions (see Appendix A). The question list was sent to all participants approximately one to two days prior to each interview. Participants were also encouraged to prepare any additional questions or general information they wanted to share regarding their experiences using the RA tool. Research staff traveled to all agency locations to conduct the interviews, with the exception of one interview that was conducted via phone conference due to scheduling conflicts. Research staff began each interview by presenting and discussing a consent form, which was developed in accordance with sound qualitative research practice and was intended to encourage open dialogue by safeguarding confidentiality. This form
included a brief explanation of the purpose of the interview, procedures, benefits of the study, and other general information regarding confidentiality and the interview process (see Appendix B, Consent Form). After receiving signed copies of the consent forms and addressing any questions raised by participants, research staff began the interview. All interviews lasted between 1 and 1.5 hours. Participants received follow-up emails, including signed copies of their consent forms, and were informed that they would have a chance to provide feedback on the report draft prior to dissemination. Research staff completed the interview process by October 3.

Semi-Structured Interviews: Question Categories

All interviews began with the question, "Can you explain how you are using the Risk Assessment tool in the work you are doing with the youth?" After this, research staff asked questions based on the pre-identified list so that broader themes would emerge and answers could be coded with consistency across agencies/groups. In order to receive feedback on whether the risk assessment tool was valuable for case planning purposes, research staff asked questions such as "How are you currently using the risk assessment tool?", "How is the tool integrated into your decision-making?", "Do items specifically guide your decision-making?", and "Is this tool helpful or meaningful for the work you are doing with these youth?" Additionally, specific questions were asked regarding how the risk assessment tool is being administered by SYVPI staff: "When is the risk assessment tool being administered?", "How is the risk assessment tool being administered?", and "Are all the items answered based on a youth's self-report? If no, how is the other information gathered, if any?"
Themes regarding how staff perceive the risk assessment tool emerged from questions such as "What items tend to be most helpful in your decision-making?", "What items do you find to be the most problematic?", and "What is your overall assessment of the use of the tool?" Database-specific
information was acquired by asking questions like "Can you explain how you currently use the database?", "Who inputs the information from the risk assessment tool into the database?", and "Do you experience any problems with the database?" Themes related to overall feedback about the risk assessment tool and its role within the greater Initiative were identified from responses to questions such as "Do you believe everyone in the Initiative is using the tool in a similar way?", "Are resources available to you through the Initiative if you have questions or need help filling out the tool?", "Does your work environment support you in using the tool for planning purposes?", and "Do you have a clear idea of how the tool is being used in the Initiative overall?"

Qualitative Data Coding Process

Research staff reviewed and synthesized all notes taken during the interviews by creating typologies of codes. The coding occurred in a group discussion among the research staff (four staff: principal investigator, faculty consultant, graduate assistant, and research analyst) in which responses were grouped according to initial impressions and then reorganized and reframed after all participant responses had been addressed. After responses were grouped, category titles were created to describe the focus of each set of responses. Initial findings were incorporated into a matrix that identified whether one or more agencies/groups endorsed a particular item. Items were categorized under broad domains and ranked according to item importance, which was based on consensus among agencies/groups and the number of times the items were mentioned (see Table 2, Appendix C, for the coding theme matrix and results). This matrix was sent to all participants for brief review; research staff received feedback from three agencies/groups. This feedback was taken into consideration, and the matrix was updated and revised accordingly.
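The tallying and ranking logic behind the matrix can be illustrated in code. The following is a minimal sketch only, not part of the study's actual workflow; the item labels, group labels, and the single-criterion ranking rule (count of distinct endorsing agencies/groups) are hypothetical stand-ins for the consensus-based process described above:

```python
from collections import defaultdict

# Hypothetical coded responses as (agency_group, theme_item) pairs,
# standing in for the grouped interview responses described above.
responses = [
    ("N1", "referral form more useful"), ("N2", "referral form more useful"),
    ("N3", "referral form more useful"), ("CM1", "referral form more useful"),
    ("N1", "database unreliable"), ("CM2", "database unreliable"),
    ("SO", "timeframe too short"),
]

# Build the matrix: each theme item maps to the set of distinct
# agencies/groups that endorsed it.
matrix = defaultdict(set)
for agency, item in responses:
    matrix[item].add(agency)

# Rank items by consensus: more distinct endorsing groups means
# higher importance (a simplification of the report's ranking rule).
ranked = sorted(matrix.items(), key=lambda kv: len(kv[1]), reverse=True)

for item, agencies in ranked:
    print(f"{item}: endorsed by {len(agencies)} group(s)")
```

In the actual study, this consensus count was weighed alongside group discussion rather than applied mechanically.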
After the initial coding themes were identified, research staff also conducted a more in-depth review of the notes in order to obtain specific details, quotes, and exact totals for each item.

Results

Overall results suggest that the RA tool is perceived as needing additional clarification around purpose and implementation, but that the tool has the potential to be more useful if revisions are implemented in a number of areas. Six general coding themes related to the tool's purpose and use emerged from the interviews: 1) the risk assessment tool and case planning; 2) how the risk assessment tool is being administered; 3) perceptions of the risk assessment tool; 4) database-specific issues; 5) overall Initiative-related feedback; and 6) improvement suggestions for moving forward.

The Risk Assessment Tool and Case Planning. The most common feedback regarding the RA tool for case management indicated that the risk assessment tool was not very valuable in the case planning process (N = 2; CM = 2; SO); rather, all three networks and three of the case management agencies felt that the SYVPI referral form was more useful than the risk assessment tool for determining service plans. One network noted that the risk assessment is currently not helpful: "it's just a filler and extra paperwork at this point." Three case management agencies and one network indicated common referral sources; these included school, police, parents/guardians, and self-referrals. It is important to note that referral sources varied by agency and geographic location. For instance, in one heterogeneous population, youth were more likely to self-refer because a majority of their friends were in the Initiative and there were not many other readily available services within their community; on the other hand, some networks identified schools as their primary referral sources. The "Brief Statement of Concern" on the referral form was identified as most helpful for case planning by
two case management agencies and the street outreach group. One network and one case management agency indicated that it was extremely helpful to have one person in the network complete the initial risk assessment prior to referring the youth for services. Additionally, one network stated, "The risk assessment could be useful as a screening tool prior to SYVPI enrollment in the schools if revisions were made."

How the Risk Assessment Tool is Being Administered. All agencies/groups noted that the risk assessment tool was administered differentially based on the practices of certain staff or agencies as a whole. Individuals vary, even within an agency, in the method of administration, which includes handing the youth the form to fill out themselves (N = 2), engaging the youth in an interview that staff then use to fill out the form (N = 3; CM = 5; SO), administering the form to a group of youth (N = 1), and filling out the form exclusively using collateral information from the referral source (N = 1; CM = 1). The lack of uniformity appears to relate strongly to confusion over the correct or most effective method of administration, which is discussed in more detail later in the report. In addition, there was general consensus that administration style varies by staff and youth characteristics. For example, one network noted having the youth fill out the assessment themselves if they presented with reservations and seemed uncomfortable during the assessment process, whereas if the youth was more forthcoming initially, the staff member would conduct the assessment using a motivational interviewing technique. There was general consensus that the timeframe for completing the initial assessment was too short, with one network noting that because of the limited timeframe, initial and 6-month assessments are sometimes completed through "backtracking," or completing the assessments after the youth are already enrolled in the Initiative and receiving services.
A number of suggestions were made
regarding a more appropriate timeframe for completing the initial assessment: 1 month, 60 days, and 3 months were identified as timeframes that would likely produce more accurate initial assessments. In addition to administration and timeframe, themes related to the process for collecting information to complete the risk assessment were included in this coding theme. For example, it was indicated that items from the referral form are often used to fill in portions of the initial risk assessment (N = 3; CM = 2), with similar agencies indicating that contacting the referral source to get more information is a common practice (N = 3; CM = 2). The initial risk assessment is often completed after the youth has already been referred to additional services and/or after youth are already receiving services through the Initiative (N = 2; CM = 2). One network and one case management agency indicated that it is helpful when one designated individual completes all the initial risk assessments; however, this practice does not take place if youth are referred to street outreach. Additionally, it was stated that the initial assessment process within the overall Initiative is unclear (N = 1; CM = 1): "Sometimes the initial risk assessment is completed before and sometimes after the referral, so how are we supposed to know exactly when the assessment is supposed to happen?"

Perceptions of the Risk Assessment Tool. Through the interviews, a number of themes emerged, indicating strong perceptions about the tool itself and whether it is useful in the work staff are doing with Initiative-involved youth. With the exception of one case management agency, all staff identified specific items that were helpful/useful and others that were not helpful/useful in their work with the youth; however, there was considerable variation among agencies in the items identified as helpful or not.
These differences varied by staff and agency and seemed to strongly relate to both the role of the agency (e.g., intake vs. case management vs.
outreach) and the types of services agencies were able to provide. For example, one network indicated that the mental health domain items were particularly helpful because their specific agency was not able to provide these services, indicating that a referral should be made. On the other hand, a different agency noted that the mental health items were incredibly difficult to interpret and gather information on: "We are case managers, not psychiatrists or counselors; how are we supposed to know this?" In addition, many of the sites indicated that the initial assessment data may be unreliable, as the youth are often not forthcoming/truthful initially because of the sensitive nature of the items and the lack of initial rapport with staff (N = 3; CM = 4; SO). There was general consensus that the risk assessment items are not currently guiding decision-making but may have the potential to do so if revisions are made (N = 3; CM = 3; SO). Translating the items from the risk assessment into youth-friendly language was indicated as being difficult, especially for mental health domain items and terms like "anti-social" and "pro-social" (N = 3; CM = 3; SO). While a glossary was developed and provided to all Initiative staff to address this issue, only one case management agency mentioned using the glossary to assist in administering the tool. Multiple agencies reported that the RA was too long and too wordy (N = 3; CM = 3); for this reason, a number of case managers indicated a preference for the old case management assessment form (CM = 2). However, when specifically asked whether the current tool should be replaced or significantly changed, the majority of agencies did not want to start over and were supportive of continuing with the current tool if modifications could be made (N = 3; CM = 5; SO), which is discussed further in the Improvement Suggestions section of this report.
Lastly, it was indicated that it would be helpful if the less sensitive domains/items were located at the
beginning of the assessment (N = 1; CM = 1), and that if items are sensitive (often culturally), staff should be able to gather the needed information without bringing up the actual assessment at that point in time (CM = 1).

Database Specific. Because the risk assessment tool validation relies on the quality of data collected, research staff wanted to gain a better understanding of how the risk assessment data are transferred from the paper assessments into the database. A number of issues were identified as affecting staff in their ability to use the SYVPI database. There was general consensus that the current database is unreliable for a number of reasons (N = 3; CM = 5; SO): youth contact information, outdated and/or incomplete reports, loss or misrepresentation of pertinent information, and incomplete data entry due to the amount of time and effort it takes to input the risk assessments into the database. One case manager noted, "The database is challenging to navigate and I don't even use it [database] half the time because there are so many issues." While the SYVPI database manager was identified as an individual who has put in a significant amount of work to address and fix these issues (N = 2; CM = 1), agencies noted that there is still a significant number of unresolved problems that are outside his scope of work (N = 1). As a result, agencies have indicated that they keep their own database systems for tracking youth and for producing reports in order to save time and ensure accuracy of data (N = 3; SO). While hard copies of the risk assessments are kept, not all 6-month assessments are being entered into the database (CM = 1), and some staff do not have access to the database at all (SO). Only one case management agency indicated that training on the database was provided by the database manager, and that it was really helpful, although only three people attended.
Other issues that were identified include not being able to access a youth's initial risk assessment until the 6-month follow-up assessment (CM = 1), not being able to input comments into the database
although this could be really useful for goal measurement (CM = 2), and needing approval to enter risk assessments into the database (CM = 1). One case management agency suggested that a better alternative to the current database would be "a web-based mechanism [...] we would be more willing to use it if it was less work to do so."

Improvement Suggestions for Moving Forward. The last coding theme included specific suggestions from participants about improving the current assessment process. All staff, through direct comment or inference, indicated that the risk assessment tool has the potential to be useful: "we should work with the tool we already have to come up with the most effective way to use it [...] it would be a disservice to everyone if we started over with a completely new tool." One comment indicated that the initial tool is trying to assess too much, and that it may be helpful if the initial assessment looked similar to the referral form in order to establish service need, using the risk assessment to address propensity toward violence after service plans are developed. It was suggested that the risk assessment should be completed by all staff regardless of their role, but that it needs to be perceived as more than just a tool for data collection. A majority of staff indicated uncertainty around whether all staff within the Initiative had received appropriate training on the use of the tool, and said that more training on how to effectively administer it in a standardized manner would be beneficial (N = 2; CM = 3; SO). In every interview it was indicated that the risk assessment tool needs a clear purpose within the Initiative. If the tool is intended to assess risk, agencies reported that a scoring scheme needs to be developed and implemented so staff have a standard way of assessing youth for eligibility (N = 3; CM = 2).

Overall Initiative Related Feedback.
There were a number of themes related to overall feedback about the risk assessment tool and its role within the greater Initiative that emerged
from the interviews. There was general consensus that partnership roles and expectations around collaboration need to be made clearer, and that the Initiative structure needs to account for this (N = 3; CM = 4; SO). For example, a number of case management agencies indicated that they did not fully understand the street outreach role, and that collaboration was lacking between street outreach and other agencies around youth engagement within the school setting (CM = 2). Not only do these roles need to be made clearer, but more collaboration and better communication are needed among the networks, case management, and street outreach throughout all stages of the Initiative. Initiative contracts were discussed because they affect caseloads, timeframes for working with youth, and service provision, all of which can ultimately affect data quality. One network stated, "The Initiative's structure around collaboration is not ideal [...] contracting is difficult and does not allow for much accountability or communication between agencies in terms of monitoring how the risk assessment tool is administered and how youth receive services as a result." In addition to collaboration issues, concerns regarding the Initiative's structure were also discussed. Multiple agencies reported that it is difficult to follow all SYVPI guidelines because they do not directly align with the goals and mission of the agencies where staff are housed (N = 1; CM = 1). For example, one case management agency noted that the overall goals of the Initiative are not very clear and there is often tension between the Initiative's mission statement and the mission of the agency. One case management agency and the street outreach group indicated that they would like to see referrals before accepting youth into their specific services in order to ensure that the referral is a good match. The process for exiting youth from the Initiative was another common theme that emerged from the overall feedback dialogue.
Two case management agencies indicated uncertainty around where youth go or what happens to them
after the exit process occurs. This may be confounded by the general consensus that exit strategies vary by staff or agency practice. There is currently no common exit approach; rather, a variety of strategies are being utilized, including setting agency-based target exit dates, holding internal meetings to discuss exiting cases, or ceasing service provision because youth stop participating or age out of the Initiative. Co-locating networks and case management was cited as helpful or potentially helpful by four of the sites (N = 1; CM = 2). This was seen as promoting communication and collaboration. Lastly, youth demographics and geographic locations were indicated as playing significant roles in the referral process, outreach techniques, and service provision, which also affects collaboration: "Our agency is unique because of the population we serve, but it makes it difficult to collaborate effectively with certain networks or other case management agencies [...] there needs to be a more defined scope of practice between the agencies, and everyone should be working together in a more standard way."

Recommendations

Interviews with SYVPI networks, case management agencies, and street outreach about the risk assessment tool revealed uncertainty about the tool's current benefit for service provision and data collection. At the same time, the clear majority of sites recognized the need for a tool to monitor outcomes and indicated enthusiasm for continuing to work on improvements. The recommendations below reflect a summary of participant reflections as well as our own suggestions for next steps.

1) Continue existing outreach efforts with SYVPI staff members around the risk assessment tool by convening a core workgroup to discuss and develop strategies for the following:
a. The purpose of the risk assessment tool within the Initiative (case planning, data gathering, outcomes monitoring)
b. A set of standard guidelines for tool administration and interpretation
c. Quality assurance methods for providing support to agencies in implementation
d. A communication strategy for informing SYVPI staff members about the strategies above

2) Improve database usability in the following areas:
a. Accuracy of data
b. Accessibility
c. Troubleshooting support

3) Contingent on the recommendations provided by the workgroup, develop a job description for quality assurance monitoring, for either a single individual or as part of an existing range of duties among current supervisory staff.
Appendix A

SYVPI Risk Assessment Tool Validation: Questions for Site Visits

How are you currently using the risk assessment tool?
- When is it being administered?
- How is it being administered?
- Are all the items answered based on youth's self-report? If no, how is the other information gathered, if any?

How is the risk assessment integrated into your decision-making?
- What items tend to be the most helpful in your decision-making, if any? Do these items specifically guide decision-making? If yes, in what way? If no, why do you think this is the case?
- What items do you find to be the most problematic? Are there any items that you have specific issues with on a regular basis? Items that are potentially not answered regularly or are difficult to get information on?

What is your overall assessment of the use of the tool?
- What would you change, if anything?
- Is it a helpful and/or meaningful tool for the work you are doing with these youth?
- Do you feel comfortable asking youth about all the topic areas and/or individual item-related questions?

Do you believe the use of this tool will help assess risk and reduce violence propensity?

Do you believe the use of this tool will help staff manage their cases?

Do you believe everyone in the Initiative is using the tool in a similar way? If yes, in what ways? If no, in what ways?

What kind of trainings have you attended around the use/implementation of the tool?
- Are trainings mandatory? If no, do you feel that you could benefit from attending training(s)?

Are resources available to you through the Initiative if you have questions or need help filling out the tool?
- If yes: What types of resources are available? Are they easy to access? Do you personally use them?
- If no: What do you think would be most beneficial for you?

Is the use of the tool an accepted practice within the Initiative and/or your agency? Does your work environment support you in using the tool for planning purposes?

Who inputs the information from the tool into the SYVPI database system?

Do you have a clear idea of how the tool is being used in the Initiative overall?
Appendix B

UNIVERSITY OF WASHINGTON
CONSENT FORM
Seattle Youth Violence Risk Assessment Tool Validation

Investigator: Sarah C. Walker, Ph.D., Assistant Research Professor, Department of Psychiatry and Behavioral Sciences, (206)

*Please note that we cannot ensure the confidentiality of information sent via email.

Investigator's statement

We are asking you to participate in an interview about the Seattle Youth Violence Prevention Initiative's (SYVPI) Risk Assessment tool validation for quality improvement purposes. The purpose of this consent form is to give you the information you will need to help you decide whether or not to participate in the interview. Please read this form carefully. You may ask questions about the purpose of this interview, what we would ask you to discuss as a participant, and anything else about the interview or this form that is not clear. When all your questions have been answered, you can decide whether or not you want to participate in this interview.

PURPOSE OF THE INTERVIEW

We want to gain a better understanding of the risk assessment tool and how it is implemented and used throughout the various agencies and networks associated with the City of Seattle's Youth Violence Prevention Initiative.

PROCEDURES

If you choose to participate in this interview, we will spend about an hour discussing your experiences using the SYVPI risk assessment tool. We will explore questions like, "How are you currently using the risk assessment tool?", "How is the risk assessment integrated into your decision-making?", "What items tend to be the most or least helpful in your decision-making?", "Do you have a clear idea of how the tool is being used in the Initiative overall?", etc. All of the questions will be provided at least one day prior to the interview for you to review. We encourage you to also add questions or comments that you feel are important to address.
Also, you are not required to answer every question.
BENEFITS OF THE STUDY

You may not directly benefit from taking part in this interview. However, we will use this information to provide feedback to the City of Seattle regarding the current usefulness of the SYVPI risk assessment tool. While we cannot guarantee that this information will lead to change, the information will inform the SYVPI leaders about whether change is recommended.

OTHER INFORMATION

Taking part in this study is voluntary. No personal or agency information will be included in our final report. We will report results as combined data, for example, "two of the three network sites reported that the risk assessment was being administered in multiple ways." We will also send you a copy of the report prior to submitting it to the City of Seattle. We will provide you with one week to respond with comments regarding report content. We will make the final decision about what to report after all comments have been received and reviewed.

We will make every effort to keep your information confidential. All notes taken during the interview will be kept in a locked file cabinet at the University of Washington's Division of Public Behavioral Health and Justice Policy. These notes will not be shared with the City of Seattle or any other SYVPI management or staff. Your agency/personal information and the information you provide us will not be divulged in risk assessment interviews with other agencies/staff. We will not discuss the project outside of the risk assessment evaluation team. Government or university staff sometimes review study information to make sure interviews are being done safely and legally. If a review of this study takes place, your records may be examined; however, the reviewers will protect your privacy and you will not be put at legal risk of harm.

Signature of Investigator        Printed Name        Date

Participant's statement

This study has been explained to me. I volunteer to take part in this interview. I have had a chance to ask questions.
If I have questions about this study or the interview at a later date, I can ask the investigator (listed above) as well as other research staff. I give my permission for the researchers to take notes on what I say during this interview. I will receive a copy of this consent form by email within one week after the interview takes place.

Signature of Participant        Printed Name        Date
Appendix C

Table 2. Coding Theme and Results Matrix
(Each X represents one agency/group endorsing the item.)

Coding Themes from Interviews | Networks* | Case Management* | Street Outreach* | Item Importance**

The Risk Assessment Tool and Case Planning
Referral form more useful for determining service plans | XXX | XXX | | 1
Risk Assessment: Not very valuable for case planning | XX | XX | X | 2
Common referral sources: School, Self-referrals, Friends, Police | XX | XXX | | 2
"Brief Statement of Concern" section on referral form most helpful | | XX | X | 3
Majority of initial assessments completed by Networks | X | X | | 4
Risk Assessment: Could be useful as a screening tool prior to SYVPI enrollment WITH revisions | X | | | 5

How the Risk Assessment Tool is Being Administered
Risk Assessment administered differentially: from collateral vs. in-group vs. individually | XXX | XXXXX | X | 1
Timeframe for completing initial risk assessment is too short | XXX | XXXX | X | 2
Differences in administration style vary by staff member and by youth characteristics | XX | XXXX | X | 3
Using items from referral to fill in portions of the risk assessment | XXX | XXXX | | 3
Calls to referral source (i.e., parents/schools) to get more information if necessary | XXX | XX | | 4
Risk assessment completed AFTER decision-making/case planning | XX | XX | | 5
Helpful when one designated person completes all initial risk assessments | X | X | | 6
Unclear about when initial assessments are supposed to happen within the overall SYVPI process | X | X | | 6
Networks do not do initial assessments for Street Outreach | | | X | 7

Perceptions of Risk Assessment Tool
Indicated items that were both helpful/useful AND items that were not helpful/not useful | XXX | XXXX | X | 1
Initial assessment is unreliable: youth may not be forthcoming/truthful initially because of limited time-frame | XXX | XXX | X | 2
Items do not guide decision-making but may have potential | XXX | XXX | X | 2
Translating items from risk assessment into youth-friendly language can be difficult | XXX | XXX | X | 2
New risk assessment is too long/too wordy | XXX | XXX | | 3
Liked old case management assessment better/felt it was more useful | | XX | | 4
Better if less sensitive domains/items were first in assessment | X | X | | 4
If questions are sensitive (often culturally sensitive), staff gather info without bringing up assessment | | X | | 5
Old risk assessment did not have enough information; new version has too much | | X | | 5
Indicated use of SYVPI risk assessment tool glossary | | X | | 5

Database Specific
Database is unreliable | XXX | XX | | 1
Challenging to enter data because of time constraints | XXX | XX | | 1
Data entry is incomplete: not sure everyone is able to do this for initial and follow-up assessments in a timely manner | XX | XX | | 2
Created own database for tracking and report purposes: easier than dealing with the SYVPI database issues | XXX | | X | 2
Vincent helps troubleshoot database issues, but still a lot of unresolved problems | XX | XX | | 2
Not able to input "comments" into database although this could be really useful | | XX | | 3
Once assessment is in the database, cannot access until 6-month risk assessment | | X | | 4
Indicates needing approval to enter risk assessment data into database | | X | | 4
Training has occurred on the database | | X | | 4
Hard copies of risk assessments are kept: not all 6-month assessments are entered into database | | X | | 4
Can input "comments" section into the database; not able to input comments anywhere else in database | | X | | 4
Staff have access to paper assessments; do not have access to database | | | X | 4
Web-based database would be a better alternative to current database: less work, more user friendly | | X | | 4

Overall Initiative Related Feedback
More collaboration needed among Networks, Case Management, and Street Outreach | XXX | XXXX | X | 1
Partnership roles and expectations around collaboration need to be more clear | XX | XXXX | X | 2
Exit strategies vary (set target exit dates, youth stop participating, manager/staff meetings to discuss exit, etc.) | XX | XXXX | X | 2
Initiative structure is confusing and roles are often unclear | XX | XX | | 3
Initiative contracts are insufficient for caseloads and ultimately affect service provision and use of the tool | X | XX | | 4
Unclear where youth go/what happens to them after exited | X | XX | | 4
Youth population demographic differences create issues around referrals, outreach techniques, treatment, etc.
XXX 4 Would like to see referral prior to accepting case from Network to identify if referral is appropriate fit for services X X 5 Too difficult to follow all guidelines from SYVPI: Mission often clashes with other agency missions/goals X X 5 Helpful when Case Management agencies are co-located with Networks X X 5 Dislikes Motivational Interviewing Technique - restrains staff in ability to do their work X 6 Improvement Suggestions (moving forward) Work with tool we already have but change as needed to be more effective and applicable to the work XXX XXXXX X 1 Risk assessment COULD be helpful for risk assessment purposes WITH revisions XXX XXXXX X 2 Risk assessment tool needs a clear purpose within the Initiative XXX XXXX X 2 More training on how to most effectively administer the risk assessment would be helpful XX XXX X 3 Indicates uncertainty around whether ALL staff within Initiative have appropriate training on the use of the tool XXX XX X 4 Need scoring scheme to make the risk assessment more meaningful and applicable XXX XX 4 The risk assessment currently does not allow for assessing measurable goals/not helpful in measuring outcomes XX XX X 4 The risk assessment is trying to assess too much initially XX XX 5 Risk assessment perceived as a tool for data collection X X 6 Keep risk assessment process intact for Street Outreach role X 7 Risk assessment needs to account for a youth's LGBTQ status, if applicable X 7 NOTES: *X indicates one agency or group **Item importance based solely on consensus between agencies/groups and number of times answered