Data Buddies: a CRA-W/CDC Project. Tracy Camp, Colorado School of Mines; Tom McKlin, The Findings Group; Betsy Bizot, Computing Research Association
Data Buddies: a CRA-W/CDC Project
BPC Alliance: Widening the Research Pipeline
CDC: focuses on increasing the diversity of those participating in computing.
Executive Committee: Brian Blake, U of Notre Dame (Chair); Juan Gilbert, Clemson (Chair-elect); Charles Isbell, Georgia Tech (at large); Jim Kurose, UMass (CRA rep.); Cynthia Lanius, BellSouth (at large); Manuel Pérez Quiñones, Virginia Tech (Past Chair); Rodrigo Romero, U of Texas at El Paso (IEEE rep.); Barbara Simmons (ACM rep.); Elaine Weyuker, AT&T (at large)
Members: Nancy Amato, Texas A&M University; Jon Bashor, Lawrence Berkeley Labs; Enobong Hannah Branch, Univ. of Massachusetts; Jamika Burge, The Pennsylvania State Univ.; John Cavazos, University of Delaware; Deborah Cooper, Google; Ron Eglash, RPI; Maria Klawe, Harvey Mudd; Chad Jenkins, Brown University; Brandeis Marshall, Purdue University; Ron Metoyer, Oregon State University; Linda Morales, The University of Texas at Dallas; Ann Redelfs, Independent consultant; Erik Russell, CRA; Tiki Suarez-Brown, Florida A&M University; Valerie Taylor, Texas A&M; Pamela Williams, Logistics Management Inst.; Dale-Marie Wilson, UNCC; Bryant York, Portland State
CRA-W: focuses on increasing the number and success of women in computing research.
Board: Nancy Amato, Texas A&M Univ.; Carla Brodley, Tufts University; AJ Brush, Microsoft; Tracy Camp, Colorado School of Mines (co-chair); Sheila Castaneda, Clarke College; Lori Clarke, Univ. of Massachusetts, Amherst; Joanne Cohoon, University of Virginia; Dona Crawford, LLNL; Andrea Danyluk, Williams College; Dilma Da Silva, IBM Watson; Sandhya Dwarkadas, Univ. of Rochester; Carla Ellis, Duke University; Kathleen Fisher, DARPA/Tufts; Tessa Lau, IBM Almaden Research Ctr.; Patty Lopez, Intel; Margaret Martonosi, Princeton Univ.; Kathryn McKinley, Microsoft Research (co-chair); Gail Murphy, Univ. of British Columbia; Lori Pollock, Univ. of Delaware; Susan Rodger, Duke University; Holly Rushmeier, Yale University; Erik Russell, CRA; Mary Lou Soffa, Univ. of Virginia; Manuela Veloso, CMU
Alliance Activities [diagram]: CDC and CRA-W programs serve undergraduates and graduate students and support transitions to academic careers and industry/government labs. Activities include: Collaborative REU (CREU); Distributed REU (DREU); Distinguished Lecture Series; Discipline-Specific Mentoring Workshops; Career Mentoring Workshop; Academic Workshop; CAPP; Grad Cohort; Tapia Celebration; sending students/mentors to conferences; Addressing the Shrinking Pipeline; IEEE-CS Distinguished Visitors Program; and Data Buddies/Evaluation.
CRA-W/CDC BPC Participants, 2007–2011 [chart: participant counts per year for CREU, DREU, DSW, and DLS]
Are we spending our money wisely?
Evaluation
Evaluation: CREU/DREU
CREU Numbers: 542 undergraduate URM+W (underrepresented minority and women) students across the USA have participated in CREU! 223 CREU projects have been funded!
DREU Numbers: 643 undergraduate URM+W students across the USA have participated in DMP/DREU! 96 research universities have hosted DMP/DREU students from 282 institutions!
BPC CREU/DREU Evaluation (in-program): CREU students complete a before-CREU survey and an after-CREU survey; DREU students complete a before-DREU survey and an after-DREU survey.
BPC CREU/DREU Evaluation (in-program)
[Chart: percent of students agreeing before vs. after participation (series: CREU before, CREU after, DREU before, DREU after; scale 0–100%) on three items: experience feeling like a member of a research community (CREU 44%, n=28 → 78%, n=44; DREU 40%, n=65 → 81%, n=116); knowledge about criteria for admission to graduate programs; knowledge about what it is like to do computing research. Remaining before/after values shown: CREU 38% (n=24) → 49% (n=27), DREU 29% (n=49) → 72% (n=105); CREU 38% (n=24) → 68% (n=38), DREU 30% (n=50) → 82% (n=120).]
What did we learn? Learned what increased participants' interest, knowledge, and confidence in computing; learned what happened to our participants; learned how to improve our programs. (Examples: Grad Cohort evaluation, Discipline-Specific Workshops evaluation.)
What did we learn? We learned some valuable things, but we didn't answer the BIG questions.
BPC CREU/DREU Evaluation (in-program): [same before/after chart as above] But do non-participants make similar gains?
What did we learn? Learned that ~30–40% of CREU/DREU participants enrolled in graduate school (mainly Ph.D. programs). But what percent of non-participants enroll in graduate school?
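One way a question like this could eventually be answered, once a comparison group exists, is a simple two-proportion test of participant vs. non-participant graduate-school enrollment rates. The sketch below is illustrative only; the function and all counts are hypothetical placeholders, not CREU/DREU or Data Buddies results.

# Two-proportion z-test: participants vs. comparison group (illustrative sketch).
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(successes_a, n_a, successes_b, n_b):
    # Returns (z, two-sided p-value) for H0: the two enrollment rates are equal.
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)      # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # std. error of the difference
    z = (p_a - p_b) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical counts: 70 of 200 participants vs. 80 of 400 non-participants enroll.
z, p = two_proportion_ztest(70, 200, 80, 400)
print(f"z = {z:.2f}, p = {p:.4f}")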
Are we spending our money wisely?
Data Buddies: a CRA-W/CDC Project
Comparison Groups: delivered to BPC PIs at the BPC meeting in 2007.
Comparison Groups
[Chart: % of under-represented majors by year, 2005–2011 (y-axis roughly 10–16%), comparing a treatment group to a control group; annotations mark which portion of the change is the outcome and which is not part of the outcome.]
Comparison Groups. Good comparison groups are like the program participants in every way except that they have not participated in the program. Assembling such groups is mostly infeasible for local projects (it requires expanding N and expanding the geographic region).
Comparison Groups: Accept some noise in the data; a higher n mitigates noise.
[Diagram: overlap between the treatment program and the comparison group, with noise sources such as students who took a 14-month hiatus from the program, former program participants, students who participate in a BPC Alliance program, an MSP program, or CE21, and those unwilling to be measured.]
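A quick numerical illustration of why a higher n mitigates noise: the standard error of an observed proportion shrinks as 1/sqrt(n), so real differences between groups become easier to separate from sampling noise. The rate and sample sizes below are illustrative, not survey data.

# Standard error of a sample proportion for several sample sizes (illustrative).
from math import sqrt

p = 0.30  # an assumed "true" rate, e.g., intent to attend graduate school
for n in (25, 100, 400, 1600):
    se = sqrt(p * (1 - p) / n)   # standard error shrinks as 1/sqrt(n)
    print(f"n = {n:>4}: SE = {se:.3f}  (~±{1.96 * se:.3f} at 95% confidence)")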
Comparison Groups. Common BPC groups, and comparison groups you might choose for each: high school CS students (school-level AP data); undergraduate and graduate computing students (Data Buddies); middle and high school teachers (teachers in other states identified via CSTA); university faculty. Also consider the literature for comparison-group performance (true for effect sizes, too).
Comparison Groups. What BPC projects measure may be used as measures for both treatment and comparison groups: enrollment/retention/graduation/transition rates of CS students; student and educator content knowledge; motivation to succeed, confidence, gender/race stereotypes, and intention to persist.
Data Buddies: a CRA-W/CDC Project
Why Data Buddies? Our participants are nationwide. Participants from many institutions come to us as individuals. In theory, our comparison group is all computing undergraduates in the US.
Sampling by Department. Don't sample individuals; sample departments, then survey all of their students. Starting point: the departments from which participants came in 2008–2010. Categorize by program type (top-ranked PhD, other PhD, Master's only, Bachelor's only), then invite randomly (see the sketch below).
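A minimal sketch of the stratified sampling idea on this slide, with hypothetical department names and invitation quotas; only the strata (program types) come from the slide itself.

# Stratified random sampling of departments by program type (illustrative sketch).
import random

departments_by_type = {
    "Top-ranked PhD":  ["Dept A", "Dept B", "Dept C", "Dept D"],
    "Other PhD":       ["Dept E", "Dept F", "Dept G", "Dept H", "Dept I"],
    "Master's only":   ["Dept J", "Dept K", "Dept L"],
    "Bachelor's only": ["Dept M", "Dept N", "Dept O", "Dept P"],
}
invitations_per_type = {"Top-ranked PhD": 2, "Other PhD": 2,
                        "Master's only": 1, "Bachelor's only": 2}

random.seed(2011)  # fixed seed so the illustration is reproducible
invited = {
    program_type: random.sample(depts, invitations_per_type[program_type])
    for program_type, depts in departments_by_type.items()
}
for program_type, depts in invited.items():
    print(program_type, "->", depts)  # every student in an invited dept is then surveyed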
Recruiting Departments. Reasons for departments to participate: a department report comparing their students' responses with programs of the same type, and a department stipend. How they participate: a contact person facilitates our surveys of students and faculty and encourages individual participation.
Randomly Selected Departments: Spring 2011
PhD programs ranked 1–36 (8): Georgia Tech, Northwestern, Penn, Stanford, UCLA, UMass Amherst, UNC, Yale
Other PhD programs (13): Dartmouth, New Mexico State, Old Dominion, Syracuse, U of Kansas, Missouri Columbia, Nevada Reno, New Mexico, South Florida, Texas at Dallas, Utah, Washington at St. Louis, Worcester Polytech
Master's (8): CUNY Queens, Miami U Oxford, St. Joseph's, Texas State, Southern, U of Akron, Illinois Springfield, Michigan Flint, and Western Oregon
4-year (17): Albany State, Cal State Stanislaus, CUNY Hunter, Harvey Mudd, Kean, Millersville of PA, Radford, Sonoma State, SUNY Plattsburg and Potsdam, U Hartford, Hawaii Hilo, Houston Downtown, Minnesota Morris, Nebraska Kearney, Puget Sound, and Wellesley
Now what? We have the departments. What are we going to do with them?
Long-Term Outcomes. Participants earn PhDs and join the academic and research workforce. Measured by progress to the next milestone: undergraduates enroll in graduate school; Master's students enroll in a PhD program or take a research job; PhD students find a research/academic position.
Intermediate Outcomes: 1. Awareness of the research option; 2. Interest in continuing and intent to continue; 3. Technical knowledge, skills, and credentials to advance; 4. Career management knowledge; 5. Confidence; 6. Connection to professional networks. Research indicates these factors contribute to progress toward the next milestone.
Survey Content: student experience in their department, with mentoring, with research, and with professional networks; interest and confidence in continuing on a research track; highest intended degree; and, for completing students, plans for the following fall.
Survey Schedule. Year 1: Spring 2011, completing students; Fall 2011, continuing students; Winter 2012, faculty and postdocs. Year 2: Spring 2012, completing students, plus follow-up with last year's respondents who agreed; Fall 2012, continuing students. Continue for 5 years.
Survey Responses
Group | Spring 2011 (graduating) | Fall 2011 (continuing)
Departments | 45 | 50
Undergraduate Students | 706 | 2,329
Graduate Students | 555 | 1,260
Key Result Spring Survey
We have great data. Why graduating students did not participate in undergrad research: all students cited lack of time; women cited not being interested; URM students reported being turned down or that it didn't pay. Undergrad interest in career type: college or university professor, 48.8%; non-research computing industry, 85.6%; computer applications, 83.8%; high school CS teacher, 31.8%.
Next Steps: continue with surveys (faculty survey at the end of this month; next round of graduating students in April); in-depth analysis; perhaps collaborative research; work with others to make this a true community resource.
Data Buddies: a CRA-W/CDC Project
Are we spending our money wisely? Absolutely!
Impact
Alliance NSF Site Visit
Program Improvement
Data Buddies: Get Involved. Become a Data Buddy: receive a department report that compares responses from your students with other programs of the same type; use our results to improve your programs/research; work with us to (maybe) expand our surveys to assist your program; use our survey questions in your evaluations.
For more information, see www.cra.org/databuddies. The initial report on the spring 2011 surveys will soon be finalized and available there; information on becoming a volunteer department is there as well. Or email databuddies@cra.org. QUESTIONS?
National Resource
Alliance Activities: CREU and DREU Undergraduate Mentoring. GOAL: Encourage students to apply to graduate school through research experiences and individual mentoring. ACTIVITIES: participate in research; visit a research department away from the home institution (DREU); present at a conference; write research papers.
CREU/DREU Program Chairs. BPC Distributed REU: Nancy Amato, Texas A&M; Julia Hirschberg, Columbia; Maria Gini, Univ. of Minnesota; Tiffani Williams, Texas A&M. BPC Collaborative REU: Andrea Danyluk, Williams College; Jamika Burge, Information Systems Worldwide.
Alliance Activities: Distinguished Lecture Series. GOAL: Encourage students to apply to graduate school and provide exposure to role models for undergraduate and graduate students. ACTIVITIES: industrial and academic research presentations; grad school panel.
Alliance Activities: Discipline-Specific Mentoring Workshops. GOAL: Connect senior and junior researchers and students within a specialty area to provide discipline-specific mentoring. ACTIVITIES: learn about the best journals and conferences and hot research areas; practice networking skills and build collaborations.
DLS and DSW Program Chairs. BPC Distinguished Lecture Series: Nancy Amato, Texas A&M; Dilma Da Silva, IBM; Chad Jenkins, Brown. BPC Discipline-Specific Workshops: Margaret Martonosi, Princeton; Juan Gilbert, Clemson.