THE YOUTH COMMUNITY DATA CENTER
Trusted Guidance for Doing Good.
Acknowledgements

This report was prepared by the Dorothy A. Johnson Center's Community Research Institute. We would like to thank the key members of the team responsible for this report:

Joshua Church, M.I.S., Web Developer, Community Research Institute
Julie Strominger, M.S., Research Coordinator, Community Research Institute
Amber Erickson, M.S.W., Research Manager, Community Research Institute
T.J. Maciak, M.S., Senior Programmer, Community Research Institute
Jodi Petersen, Ph.D., Senior Researcher, Community Research Institute
Lisa Venema, Project Manager, Community Research Institute
Eleibny Feliz Santana, M.Sc., Database and System Administrator, Community Research Institute

Dorothy A. Johnson Center for Philanthropy, Grand Valley State University, 2014
Table of Contents

The Youth Community Data Center: Description, Features, and Function
Process Summary: Building YCDC & Collecting Data
YCDC Data Analysis Capabilities
Analysis with Pilot Data
Plan for YCDC Scale Up
Appendix A: Indicator Definitions
Appendix B: Flow Chart Diagram
Appendix C: YCDC Scale Up Cost Estimate
The Youth Community Data Center: Description, Features, & Function

The Youth Community Data Center (YCDC) is an integrated data system designed for use by Expanded Learning Opportunities Network (ELO) members. The need for YCDC became clear after several years of research on the data needs and goals of ELO members. Programs reported that they wanted to be able to use their own data systems for multiple needs and requests. They wanted a system that would allow them to continue their own data collection while contributing to the ELO Network's quality improvement and outcome measurements. The goals of an integrated data system are to 1) align shared indicators across diverse out of school time (OST) programs to see the combined impact on youth outcomes, and 2) share the overall impact of out of school time programs on youth outcomes with the community.

This is a pilot project meant to test the feasibility of a full-scale data system utilized by all ELO members. Two school sites agreed to participate in the pilot: Harrison Park Elementary and Martin Luther King Jr. Leadership Academy. The OST programs included across these two sites are Kent School Services Network (KSSN), The Gerontology Network, Creative Youth Center, Boy Scouts of America, and the athletic programs at the two schools.

YCDC has a public interface accessible to anyone in the community. The features and functions of the public YCDC are described below:

YCDC home page: displays ELO media coverage and provides a program spotlight to highlight an individual program's services.

Program Directory: contains the following information on all programs registered with the data system: name, number of sites, site location, website, phone, email, days/times of programming, schools served, grades served, and topics served.
This is a searchable directory, meaning that interested users can search for programs based on any of the above criteria. Individual programs complete this information when they register with YCDC and can update it at any time. The objective is to serve the needs of community members, program providers, and funders through a current, comprehensive, and interactive directory that can be used as a reference for out of school time programs in the region.

Locator Map: contains geographic points for all registered ELO programs. The map will also offer a search function so that users can enter an address and locate programs within a particular distance. The locator map is intended to provide increased access to and information on out of school time programs for the community.

ELO Network Progress: a dashboard showing aggregate data for all youth in all ELO programs. Additionally, there will be comparison data using youth in the school system who are not attending any ELO programs. Currently, the data elements included are: number of unique youth served; percent of youth reaching a program's attendance benchmark; percent of youth with chronic absence and satisfactory attendance; percent of youth meeting grade level for math and reading Measures of Academic Progress (MAP) scores; percent of youth arrested within the time frame of OST programming; average program quality score; and average social emotional development score. This dashboard meets YCDC's goal of sharing with the community the overall impact of out of school time programs on youth outcomes. (See Appendix A for a definition of each indicator.)

Links: lists hyperlinks to relevant organizations (e.g., United Way 211, all ELO program websites).

In addition to the above features, ELO members who register with YCDC will have access to other functions within the system.
These functions will vary by the role of the user, as described below:

Individual program user: will upload individual program data into the YCDC portal and will be able to access their program's individual and aggregate level data through program reports. Data should be updated quarterly.

Multi-site program user: will upload individual program data for each program site into the YCDC portal and will be able to access individual and aggregate level data for their overall program and each individual site through program reports. Data should be updated quarterly.

ELO administrator user: will have access to aggregate program reports for programs that opt to share their information.
Programs will upload their data on a quarterly basis using the Data Uploader. The Data Uploader allows individual programs and multi-site programs to upload their data in comma-separated values (CSV) format via the YCDC interface. Program data will be merged with data from Grand Rapids Public Schools (GRPS) and the Grand Rapids Police Department (GRPD). The data will be used to create the ELO Network Progress dashboard and program reports. Specifics on these reports are described below:

ELO Network Progress dashboard: links individual youth data across all OST programs and shows aggregate data for the following indicators: number of unique youth served; percent of youth with chronic absence and satisfactory attendance; percent of youth reaching a program's attendance benchmark; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of OST programming; average program quality score; and average social emotional development score.

Individual program report: program-specific, aggregate level data will include: number of participants; average days attended last quarter; percent of youth reaching the program's attendance benchmark; percent of youth with chronic absence and satisfactory attendance; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of the program; average program quality score; and average social emotional development score. The ELO Network Progress dashboard will also be presented for comparison. Individual-level data for program attendance and social emotional development will be available for each program participant.

Multi-site program report: aggregate data will include program-specific data, site-specific data, and overall ELO data.
For the overall program, and for each individual site, data will include: number of participants; average days attended last quarter; percent of youth reaching the program's attendance benchmark; percent of youth with chronic absence and satisfactory attendance; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of the program; average program quality score; and average social emotional development score. Individual-level data for program attendance and social emotional development will be available for each program participant.

Process Summary: Building YCDC & Collecting Data

Building an integrated data center and collecting useful data from community programs necessitated a months-long process of facilitation and negotiation. Many key lessons emerged from this process, which can be used to further improve the system as it scales up to include all ELO members, as well as to inform other initiatives and communities that are interested in integrated data and collective impact.
The idea for an integrated data system stemmed from focus groups held with ELO members. Program providers voiced concerns regarding the original idea of using a new data system requiring additional data entry. The main themes that emerged from the focus groups were: 1) providers did not want to enter data into another data system; they already entered data into multiple systems due to multiple sources of funding; 2) providers would only be interested in a new data system if they were to get more out of it than what they were putting into it; and 3) providers did not want to have to collect data in a uniform way across the network or make many changes to the way they already collect data.

Therefore, the concept behind YCDC was to build a system that allows members to upload their data as is from their current data system, where it would be linked to outside datasets that would provide them with additional information that they do not have, such as school attendance and juvenile offenses. Additionally, the web-based portal is easily accessible and has a user-friendly interface.

Building an online portal and integrated data system began with determining the basic functionality of the system. It was decided that the structure of the portal had to allow for different capabilities based on the roles of those who would be using the portal. Therefore, users must register with the system, provide a unique login name and password, and identify their role within the site. This also ensures the security and confidentiality of the program data entered into the system. Programmers designed the portal to auto-populate various features based on information programs enter into their profile. This information provides the data needed to fill in the Program Directory and Locator Map. Additionally, the search function queries the profile information to return results. The Data Uploader function allows programs to enter their data via a CSV file.
This was determined to be the most flexible option, as most data systems that programs use allow for data to be exported into this format. Additionally, it is most helpful for those who are managing the data, as this format can easily be manipulated for use in integrating datasets and building the aggregate reports.

A major requirement of an integrated data system is the capability to match youths' individual-level data across several datasets. Each program dataset must contain the same identifiers (first name, last name, and date of birth) in order for a master dataset to be created. The work of cleaning, managing, matching, and creating one integrated dataset requires a skilled database administrator (see Appendix B for a graphic representation of the process).
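The matching step described above can be sketched in a few lines. This is an illustrative sketch only, not the YCDC implementation: the column names (`first_name`, `last_name`, `dob`) and the sample CSV exports are hypothetical, and a production system would handle mismatches and duplicates more carefully.

```python
import csv
from io import StringIO

def match_key(row):
    """Build an exact-match key from the shared identifiers:
    first name, last name, and date of birth (case/whitespace normalized)."""
    return (row["first_name"].strip().lower(),
            row["last_name"].strip().lower(),
            row["dob"].strip())

def build_master(datasets):
    """Merge several program datasets into one master record per youth.
    `datasets` maps a program name to an iterable of row dicts."""
    master = {}
    for program, rows in datasets.items():
        for row in rows:
            key = match_key(row)
            record = master.setdefault(key, {"programs": set()})
            record["programs"].add(program)
            # Carry over program-specific fields (everything but identifiers).
            record.update({k: v for k, v in row.items()
                           if k not in ("first_name", "last_name", "dob")})
    return master

# Example: two program CSV exports that share one youth.
scouts = csv.DictReader(StringIO(
    "first_name,last_name,dob,days_attended\nAna,Lopez,2004-05-01,12\n"))
cyc = csv.DictReader(StringIO(
    "first_name,last_name,dob,sessions\nANA,Lopez ,2004-05-01,8\n"))
master = build_master({"Boy Scouts": scouts, "Creative Youth Center": cyc})
# The same youth appears once, linked to both programs.
```

Because the key is an exact match on all three identifiers, any typo in a name or date of birth produces a separate record, which is why the report stresses the cleaning work that precedes this step.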
In order for the design of the portal to be fully functional, programmers and database administrators will need data to be submitted in an appropriate format and manner. This includes variables across all datasets that are clearly labeled; select variables that measure similar indicators so that data can be compared across programs; individual-level data with the same identifiers (name and date of birth); and timely submission of a CSV file. Programmers and database administrators must work closely with researchers and select program staff to communicate their needs so as to ensure the most efficient process.

The process of actually receiving program data to be imported into the portal for the pilot project required many phases of work. First, the research team and the ELO evaluation committee worked with GRPS to identify two schools that could pilot the data system process. Harrison Park Elementary and Martin Luther King Jr. Leadership Academy were chosen based mostly on two factors: 1) there was an established OST program infrastructure, and 2) school principals had been in place for some time and were motivated to assist with the pilot. The research team and the ELO evaluation committee then selected six OST programs between the two schools based on the following criteria:

- Program is a member of ELO.
- Program is providing services within GRPS.
- Number of youth and age range of youth served should be representative of ELO.
- At least one program within each pilot school should be participating in the quality improvement system (implementing the Youth Program Quality Assessment (YPQA)).
- Program needs to be willing to participate and help shape the process.
- Programs should have varying levels of current data capacity.
- Programs should vary in topic to represent the diversity of ELO Network members.

Once the pilot schools and programs had been selected, the next step was to work with programs to decide what data elements to collect.
Indicators from outside datasets (GRPS and GRPD) included school attendance, academic achievement, and juvenile offenses; these were chosen because they are easily accessible, generalizable across ELO programs, and provide information that programs could not typically get on their own. Other data collected across all programs included program quality scores through the YPQA and program attendance; therefore, these indicators were also chosen to be a part of the data system. In order to be flexible and responsive to programs' requests, it was decided that attendance data could be collected in any format as long as programs also provided an attendance benchmark by which to measure their success in that area; this way, the system could report on attendance in a uniform way across all programs (reporting the percent of youth who reached the attendance benchmark).
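The benchmark approach above lets each program keep its own attendance format while the system reports one uniform number. A minimal sketch of that calculation, with hypothetical session counts and a hypothetical 75 percent benchmark:

```python
def pct_reaching_benchmark(attended, possible, benchmark_pct):
    """Percent of youth whose attendance rate met or exceeded the
    program-specific benchmark. `attended` and `possible` are parallel
    lists: sessions attended and sessions offered, one entry per youth."""
    reached = sum(1 for a, p in zip(attended, possible)
                  if p and a / p >= benchmark_pct / 100)
    return round(100 * reached / len(attended))

# A program whose benchmark is attending 75% of offered sessions:
rate = pct_reaching_benchmark(attended=[18, 10, 30],
                              possible=[20, 20, 40],
                              benchmark_pct=75)
# 18/20 (90%) and 30/40 (75%) reach the benchmark; 10/20 (50%) does not.
```

Each program supplies only its own `benchmark_pct`, so the dashboard can publish "percent of youth reaching the program's attendance benchmark" without forcing a shared attendance format.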
Additionally, when programs were asked to provide their definition of success for youth in their program, the most common concept that emerged was some type of increased social emotional development. Researchers decided to create a brief survey to measure social emotional development that could be administered across all ELO programs. This provided another uniform indicator across all programs, which helps when analyzing what specific combination of ELO programs or dosage of time contributes to desired outcomes (in this case, increased social emotional development).

Once the data elements were determined, the process of actually obtaining data began. To access GRPS data, researchers had to work closely with the school system to develop a research request that specifically outlined the requested data elements and how the data would be gathered and kept confidential and secure. Researchers had already acquired GRPD data through another research project. An opt-out consent process was used through the GRPS school system, where parents could indicate that they did not want their child's data from either dataset (GRPS and GRPD) to be used for this project. This opt-out consent process required working closely with the school system to determine appropriate wording and a protocol for the administration and collection of the consent.

In order to obtain program data, researchers needed to take several steps:

- Inform programs and ask for their participation.
- Develop an MOU for each program to stipulate the requirements of the project.
- Work with programs to determine the type of data they collect and in what format. This one step can be time consuming and complicated, as programs often do not have a good sense of how their data is collected and managed. Researchers met with providers to interpret their data and figure out how it could be used to measure ELO's outcomes of interest.
Additionally, researchers worked with providers to set up a Secure File Transfer Protocol (software that allows for the secure transfer of electronic files) so that their data could be sent to researchers. This was a necessary step, as the Data Uploader function of the portal had not yet been built.
- Develop a Data Sharing Agreement with each program that specifies the exact data to be shared and in what format.

The next step was to build an integrated dataset using the data from all the pilot programs. Individual-level data was linked across datasets using full name and date of birth. Because the data is formatted in various ways across the programs, a database administrator must spend time cleaning and restructuring the data so researchers can measure outcomes in a standardized way. This is a time- and energy-intensive process, but it allows programs to largely continue collecting their data as is and eliminates the burden of additional and new data collection.
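The cleaning work described above mostly means normalizing fields so that the same youth, recorded slightly differently by two programs, can still be linked. The sketch below is illustrative only; the date formats listed are assumptions, not a catalog of what the pilot programs actually submitted, and a real pipeline would route unparseable rows to manual review rather than guess.

```python
from datetime import datetime

def normalize_name(name):
    """Collapse case and stray whitespace so that
    'LOPEZ ' and 'Lopez' compare equal."""
    return " ".join(name.split()).lower()

def normalize_dob(raw):
    """Try a few date formats (hypothetical examples) and emit one
    canonical ISO form so dates of birth match across datasets."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y", "%m-%d-%Y"):
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None  # flag the row for manual review instead of guessing

print(normalize_name("  LOPEZ  Garcia "))  # "lopez garcia"
print(normalize_dob("05/01/2004"))         # "2004-05-01"
```

Standardizing before linkage is what makes the exact match on name and date of birth workable across program exports that were never designed to agree with one another.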
The greatest lesson learned from this entire process is that the most difficult work is not in building the technology; it is in fully engaging program providers, listening and responding to their needs, and fostering their cooperation and participation in the data collection process. Because there is no mandate to participate in ELO or an integrated data system, there has to be a real incentive for programs to spend time contributing to a network-wide initiative; in addition, ELO executive committees and researchers need to facilitate opportunities for programs to voice their opinions and perspectives so that any project developments truly reflect their needs. This is the only way to ensure their involvement. More specifically, the process of cataloging programs' data, agreeing on a minimum number of data attributes programs should collect, and carrying out a scheduled data import with the agreed-upon data elements are major challenges that require consistent communication and a democratic style of decision making. The underlying issue that needs to be continually addressed, and is of utmost importance to the school system and police department, is guaranteeing the security and confidentiality of the data. Accomplishing the above tasks is extremely time-consuming, so project planning should account for longer timelines and be flexible to delays.

Program providers who participated in the pilot also gave their perspective on lessons learned. Key points include:

- Each program should receive a clear set of instructions on what is expected, deadlines, and points of contact for various needs. Because providers are very busy, the information they need to participate in YCDC should be inclusive and in one place.
- The social emotional survey is not appropriate for younger children. A different, more age-appropriate measure should be developed.
- MAP score data should include the growth score and not just percent at or above grade level.
Providers serve youth who often have below-average scores. They believe that programming contributes to improved scores, but the improvement may not reach the grade level benchmark; however, this growth is still an important factor when considering a program's impact.
- There should be a plan for how the portal is disseminated to the community, and especially to parents of children in the school system.

Possible ways to encourage program provider engagement with YCDC include:

- Explain how providers can use the directory and locator map to access other programs in their school or neighborhood that they may collaborate with, or refer students to if needed.
- Emphasize the desire to improve ELO as an entity that works toward improving outcomes for all children in the community; the data system can be used to examine how various OST programs interact with each other to affect youth outcomes. This information can be used to make decisions about programming in regard to collaborating with other providers, dosage, areas in need of improvement, and more.
- Demonstrate how the reports can be used for evaluation purposes or grant proposals so that providers are able to use this information in ways that are useful to them.

It is clear that there are multiple layers of benefits to an integrated data system. Programs have accessible and regularly updated information to help improve their services; ELO has data to advocate for its structure and continued sustainability; and funders will have better information on program outcomes to inform their decision making. An important additional step that has not yet been implemented is training on how programs can interpret their reports and use the information for their maximum benefit. This will be necessary in promoting their participation so that they continue to see the benefits of their involvement.

YCDC Data Analysis Capabilities

YCDC has the capability to provide descriptive data on the population served by ELO members as well as a deeper analysis of the cumulative impact of out of school time programming. The system will include data from several sources: registered ELO programs; the Grand Rapids Public School system; the Grand Rapids Police Department; Youth Program Quality Assessment data; and the ELO Network Social Emotional Development Survey. The descriptive information will provide an up-to-date profile containing key indicators related to the ELO population.
These include: number of unique students served; percent of youth with chronic absence and satisfactory attendance; percent of youth reaching a program's attendance benchmark; percent of youth meeting grade level for math and reading MAP scores; percent of youth arrested within the time frame of out of school programming; average program quality score; and average social emotional development score.

Data can also be used to begin to answer ELO's outcomes of interest. Propensity score analysis will be used to match individual-level student data across the several datasets submitted to YCDC, allowing for an analysis of how out of school time programs may or may not work together to produce better outcomes. This analysis will be strengthened by comparing students in OST programming versus students not in OST programming (using student data from the school system and police department).
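Propensity score analysis pairs each OST participant with a comparison student who had a similar estimated probability of enrolling, given background characteristics. The sketch below is illustrative and is not the report's actual procedure: it assumes propensity scores have already been estimated (e.g., by logistic regression on school-system covariates) and shows only a greedy nearest-neighbor matching step; all identifiers, scores, and the caliper value are hypothetical.

```python
def nearest_neighbor_match(treated, controls, caliper=0.05):
    """Greedy 1:1 matching without replacement: each OST participant is
    paired with the unmatched comparison student whose propensity score
    is closest, provided the gap is within the caliper."""
    pairs = []
    available = dict(controls)  # student id -> propensity score
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # each control is used at most once
    return pairs

# Hypothetical estimated enrollment probabilities:
ost = {"y1": 0.62, "y2": 0.35, "y3": 0.90}
non_ost = {"c1": 0.60, "c2": 0.33, "c3": 0.10}
matches = nearest_neighbor_match(ost, non_ost)
# y2 pairs with c2 and y1 with c1 (gaps of 0.02 each);
# y3 finds no control within the 0.05 caliper and is left unmatched.
```

Outcomes (attendance, MAP scores, police contact) would then be compared across the matched pairs, which is what makes the OST versus non-OST comparison more credible than a raw group difference.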
ELO is interested in producing the following outcomes: increased school attendance, decreased police contact, increased social emotional scores, increased school achievement, and increased program quality.

There are several analyses that can be used to explore if and how these outcomes are achieved. Data from YCDC will be able to answer the following questions:

- Is there a key dosage of programming that will produce the desired outcomes? What is it?
- Is there a combination of certain types of programs that will produce the desired outcomes? What is it?
- Are higher social emotional scores associated with an increase in program and/or school attendance?
- Is out of school time program attendance associated with school attendance?
- Do programs that use the Youth Program Quality Assessment tool produce better outcomes than programs that do not use the tool?
- Do programs with higher program quality scores produce better outcomes than those with lower program quality scores?

The integration of several datasets provides the unique opportunity to explore how ELO programs may interact with one another to impact youth in a variety of ways. Additionally, the comparison between students in programming versus students not in programming potentially adds value to any outcome findings. As a result, there is the potential for a sophisticated analysis of youth programming that could affect system- or policy-level change. This level of analysis and the potential findings would not be possible without the cooperation of involved programs and the integration of data. Individual program data can only answer questions related to that program, which is of less value to an entity such as ELO that is interested in the relationship between all member programs and factors like attendance and criminal involvement.
YCDC will be able to provide ELO with current and consistent data that will help inform decision making around program funding, policy, and best practices, which will in turn work to serve youth in the most effective and impactful ways possible.
Analysis with Pilot Data

In order to demonstrate the analysis capabilities of the data system and how select findings will be reported, researchers used the pilot data to provide select descriptive statistics. These statistics are meant only to show how the data can be used; they should not be interpreted as conclusive findings in any way. Student information was obtained from the Grand Rapids Public School system; this data contained attendance information by quarter and MAP score results, reported as national percentiles, for students at Martin Luther King, Jr. Leadership Academy and Harrison Park Elementary School. Program information was collected from programs participating in the YCDC pilot phase: Creative Youth Center, Gerontology Network, Boy Scouts, athletics, and Kent School Services Network. The program information was merged with the GRPS information based on first name, last name, and date of birth.

Unique Participants
The number of unique students served by the selected ELO programs over the 2013-2014 school year was 401 (20 students participated in two or more programs).

School Attendance
Following discussions around how to effectively convey attendance information, the percent of students categorized as chronically absent and the percent of students categorized with satisfactory attendance were calculated:

                          ELO Program Participants    Non-ELO Program Participants
Chronic Absence           27%                         24%
Satisfactory Attendance   47%                         49%

Program Attendance
A benchmark for attendance, by program, was determined and the percent of students that reached the previously set benchmark was calculated. Thirty-eight percent of students in ELO pilot programs reached the attendance benchmark set by the program.
Academic Achievement
Regarding MAP scores, the percent of students performing at grade level or above (defined as performing at the 50th national percentile or above) was calculated for students in ELO pilot programs and students not in ELO pilot programs. This was broken down by grade band (K-2, 2-5, and 6+) and by test (Reading and Math):

                      ELO Program Participants    Non-ELO Program Participants
Math - Grades K-2     17%                         32%
Math - Grades 2-5     12%                         20%
Math - Grades 6+      18%                         19%
Reading - Grades K-2  24%                         28%
Reading - Grades 2-5  25%                         31%
Reading - Grades 6+   27%                         32%

Program Quality
Youth Program Quality Assessments (YPQAs) were performed at select ELO program sites. Self and external scores (on a 1-5 scale) were calculated for four areas of the programs: engagement, supportive environment, interaction, and safe environment:

                         Self-Assessment    External Assessment
Engagement               2.33               4.33
Supportive Environment   3.51               4.37
Interaction              3.22               3.94
Safe Environment         4.61               4.87

Social Emotional Development
Pilot programs administered a social emotional development survey to their youth toward the end of their program. The average social emotional score, on a 1-100 scale, was 83.7. The standard deviation was 11.6.

Police Contact
Information from the Grand Rapids Police Department was used to determine the percent of students in each program who were arrested during the program dates. Less than 5 percent of ELO program participants were arrested during the program dates. For comparison purposes, less than 5 percent of students not in ELO programs were arrested during the school year.
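The academic achievement comparison above reduces to counting, for each group within a subject and grade band, the share of national percentiles at or above 50. A minimal sketch with hypothetical percentile data (the lists below are invented for illustration, not pilot data):

```python
def pct_at_grade_level(percentiles):
    """Share of students at the 50th national percentile or above,
    mirroring the report's definition of meeting grade level."""
    at_level = sum(1 for p in percentiles if p >= 50)
    return round(100 * at_level / len(percentiles))

# Hypothetical MAP math percentiles for one grade band:
elo = [42, 55, 12, 61, 30, 49]       # ELO pilot participants
non_elo = [51, 48, 70, 35, 66, 58]   # comparison students
print(pct_at_grade_level(elo), pct_at_grade_level(non_elo))
```

Repeating this for each subject and grade band, for participants and non-participants, yields the six-row comparison table shown above.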
Plan for YCDC Scale Up

The pilot project showed that a scale up to include all ELO member programs is feasible. There will be additional adjustments as new programs are brought on to the project, but the technology can support the increased load. While engagement, relationship-building, and data sharing with programs, the school system, and the police department are time-consuming and complicated, these parties do have data that is beneficial for an integrated data system and are willing to share it. Additionally, the system infrastructure is supported by a unique team that includes a database administrator, web designer, programmer, project manager, and researchers; this team has the skill sets and available resources to manage the various elements of such a multidimensional project.

A defined step-by-step process for scale up should be developed as soon as funding is available. Providers will need clear instructions on how they can participate, as well as continued engagement that shows how participation will be beneficial. There should be a brief planning period before scale up is implemented in order to carry out any changes the pilot showed to be necessary and to create a protocol for onboarding programs in the most efficient way possible. An estimate of the scale up cost is detailed in Appendix C. There is a yearly cost for hosting the portal and providing quarterly reports, as well as costs associated with incorporating new programs, expanding to additional school systems, and gathering updated police department data.
Appendix A: Indicator Definitions

Number of unique youth served: a count of the unique number of youth served by all ELO programs or an individual program.

Percent of youth reaching a program's attendance benchmark: each program has an individual attendance benchmark (e.g., youth should attend at least 75 percent of program sessions). This indicator measures the percent of youth who met or exceeded that benchmark.

Average days attended last quarter: for program reports only; the average number of days program participants attended the program.

Percent of youth with chronic absence and satisfactory attendance: percent of youth categorized with Chronic Absence or Satisfactory Attendance according to Grand Rapids Public Schools definitions.

Percent of youth meeting grade level for math and reading Measures of Academic Progress (MAP) scores: percent of youth with scores at grade level for the following subjects and grades: Math - Grades K-2; Math - Grades 2-5; Math - Grades 6 and above; Reading - Grades K-2; Reading - Grades 2-5; Reading - Grades 6 and above.

Percent of youth arrested within the time frame of out of school time programming: percent of youth arrested during ELO program participation.

Program quality score: average score for program quality using the Youth Program Quality Assessment (possible scores range between 1 and 5). The self-assessment is completed by the program and the external assessment is completed by an outside evaluator.

Social emotional development score: average score on the social emotional development survey (possible scores range between 1 and 100).
Appendix B: Flow Chart Diagram

[Flow chart: Youth Community Data Center data pipeline, shown in three phases. Contributed Data: program data (Boy Scouts, Creative Youth Center, Gerontology Network, KSSN, and all programs' social emotional survey data), GRPS demographics and MAP test data, and GRPD data. Data Transformation: load data; clean data, standardize fields, assign a unique ID, and populate program participation; merge program data with GRPS demographics and test data using exact matching (a 100% match on first name, last name, and date of birth); filter data by consent (GRPS, ELO); de-identify data. Publication: analysis of the combined GRPS and program data; prepare and publish research datasets. Community Research Institute of Grand Valley State University, 2014.]
Appendix C: YCDC Scale Up Cost Estimate

Yearly Cost: $21,100 (hosting the portal and providing quarterly reports)
- Technology supply/hosting cost: $400
- Technology updates to maintain website: 5 hours DBA, $625; 5 hours Web Designer, $625
- Quarterly reports to ELO ED: 10 hours Senior Researcher/Research Manager, $3,400
- Social Emotional survey administration (programs print surveys and coordinate within sites): 10 hours Senior Researcher/Research Manager, $1,700; 120 hours Project Manager, $10,200; 40 hours student data management/entry, $1,000
- ELO evaluation team meetings: 18 hours Senior Researcher/Research Manager, $3,060

Onboarding of new programs:

Single Site Program: $2,100
- Technology team client meetings: 4 hours Web Designer or DBA, $500
- Research team client meetings, DSA: 6 hours Research Coordinator and 4 hours student, $610
- Portal administration: 2 hours Web Designer or DBA, $250
- Technology report generation: 6 hours Web Designer, $750

Multi-site Program: $3,510
- Technology team client meetings: 4 hours Web Designer or DBA, $500
- Research team client meetings, DSA: 6 hours Research Coordinator and 10 hours student, $760
- Portal administration: 6 hours Web Designer or DBA, $750
- Technology report generation: 12 hours Web Designer, $1,500
Expansion to each additional School District: $2,550
- DSA development and consent processing: 10 hours Senior Researcher/Research Manager, $1,700

GRJOI updated quarterly: $8,100
- 10 hours Project Manager, $850
- Data pulls from GRPD and data integration analysis: 20 hours DBA, $2,500; 20 hours Research Coordinator, $1,700
- Annual longitudinal report: 20 hours Research Coordinator and 10 hours Senior Researcher, $3,400

Expansion of GRJOI (one-time cost): $10,150
- 40 hours GIS Specialist, $5,000
- New DSAs with Kentwood and Wyoming Police, Kent County Court: 30 hours Senior Researcher/Research Manager, $5,100; 30 hours Project Manager, $2,550
- Data merging: 20 hours DBA, $2,500
Dorothy A. Johnson Center for Philanthropy
Grand Valley State University
201 Front Ave. SW, BIK 200
Grand Rapids, MI 49504
www.johnsoncenter.org