Convergence and Synergy: Social Q&A Meets Virtual Reference Services




Marie L. Radford
School of Communication & Information (SC&I), Rutgers, The State University of New Jersey
4 Huntington St, New Brunswick, NJ 08901
mradford@rutgers.edu

Lynn Silipigni Connaway
OCLC Research
6565 Kilgour Place, Dublin, OH 43017
connawal@oclc.org

Chirag Shah
School of Communication & Information (SC&I), Rutgers, The State University of New Jersey
4 Huntington St, New Brunswick, NJ 08901
chirags@rutgers.edu

ASIST 2012, October 26-31, 2012, Baltimore, MD, USA. Copyright 2012 Chirag Shah, Marie L. Radford, of Rutgers University, and OCLC, Inc.

ABSTRACT

This paper presents a preliminary analysis of online information services that examines the intersection of two areas of research: Virtual Reference Services (VRS) and Social Q&A (SQA). This project embraces the idea of a synergic collaboration within and between VRS and SQA to provide higher quality content to information seekers and a higher level of sustainability. Preliminary findings are described from a grant-funded project, Cyber Synergy: Seeking Sustainability through Collaboration between Virtual Reference and Social Q&A Sites, which is supported by the Institute of Museum and Library Services (IMLS). Early stages of the grant project include VRS transcript analysis and interviews with online Q&A end-users (students) and experts (university librarians). Analysis reveals that VRS questions address a broad range of subjects, predominantly the social sciences and technology. The largest numbers of question types are Procedural and Ready Reference. Accuracy for the subset of Ready Reference questions was found to be high, with 90 percent answered correctly, although only 75 percent of the correct answers included a citation. Interviews with information seekers and providers in both VRS and SQA environments revealed important differences in how experts and end-users perceive online Q&A services, offering clues as to how it may be possible to begin bridging these systems.

Keywords

Online Q&A; virtual reference services; social Q&A; human information behavior; social media.

INTRODUCTION

Digital and on-ground libraries are increasingly adopting a range of virtual reference services (VRS) to meet the needs of their users. In a parallel digital universe, social media sites and applications are joining with mobile technologies to enable the flow of information to be more accessible to larger and more diverse audiences. The existence and fast adoption by information seekers of these modes of communication call for transformative research that addresses innovative models, services, and tools. New approaches are required that investigate ways in which more sustainable service models can be created by seamlessly providing collaboration among the services within and between knowledge institutions, professionals, and subject experts.

This paper examines the intersection of two related areas of research: VRS and Social Q&A (SQA). Its major goals are 1) to summarize research related to both of these different, yet essentially similar, online information services; 2) to suggest ways that an examination of each service could be useful to the other; and 3) to present preliminary findings from a grant-funded project: Cyber Synergy: Seeking Sustainability through Collaboration between Virtual Reference and Social Q&A Sites.
Cyber Synergy is being supported from October 2011 to September 2013 by the Institute of Museum and Library Services (IMLS), Rutgers, The State University of New Jersey, and OCLC (Radford, Connaway, & Shah, 2011-2013; www.oclc.org/research/activities/synergy/default.htm). This research explores the idea of a synergic collaboration within and between VRS and SQA services to provide higher quality content to information seekers and a higher level of sustainability to their respective institutions.

VRS, including live chat and Instant Messaging (IM) formats, have become well-established offerings now found on the great majority of library websites. Including email reference, VRS have existed for more than 20 years (Sloan, 2006); libraries have offered live chat for more than 10 years, with IM reference entering the mainstream in 2008 (Introducing Qwidget, 2008). The ability of libraries to continue offering VRS to users is seriously threatened in this time of deep funding cuts. Concerns regarding the sustainability of VRS have demanded an exploration of how collaborations among libraries and library consortia can optimize the use of their information service resources, and how a possible collaboration between the VRS and SQA communities could leverage elements of each to generate better service and provide access to subject expertise.

BACKGROUND

Literature on VRS

A growing body of research examines the quality of VRS, including attention to accuracy, interpersonal dimensions, and the ability to effectively provide instruction. Connaway and Radford (2011), Radford and Mon (2008), and Ross, Nilsen, and Radford (2009) provide overviews of these findings. While numerous works look at the usage and effectiveness of various VRS services, most are based on the assumption of a single user/patron and a single source of information. Many professionals in the field are challenging this assumption and are looking for innovative ways to address the needs of information seekers in the rapidly changing information landscape, as well as to provide a more effective and sustainable model for VRS services. For instance, Shachaf (2010) asserts that existing reference models that assume dyadic interaction (between two people) are limited. She describes social reference as a collaborative endeavor that "may differ significantly from the traditional (and digital) reference encounter by its social and collaborative nature that involves group effort and open participation" (p. 67). She believes SQA sites to be virtual communities within which social reference encounters could occur.

Pomerantz (2006) agrees that VRS must become more collaborative, although he asserts that library reference has always been a collaborative effort between the librarian and the user, and that reference referrals have typified ongoing and common collaboration between librarians. National VRS cooperatives like OCLC's QuestionPoint (QP) (http://questionpoint.org) have grown from 300 members in 2002 to more than 2,000 today. According to Pomerantz (2006), information seekers want answers, and if VRS does not become a more collaborative effort to provide these answers, users will go elsewhere. Twidale and Nichols (1996) state that "the use of library resources is often stereotyped as a solitary activity, with hardly any mention in the substantial library science and information retrieval literature of the social aspects of information systems" (p. 177). Levy and Marshall (1994) also note that "support for communication and collaboration is important for information-seeking activities, and... indeed, support for the former is needed to support the latter" (p. 164). Hansen (2009) asserts that in these difficult budget times there is an imperative to find the right collaboration model.

Given that an effective solution for creating better and more sustainable VRS is collaboration, a natural question is how it should or could be achieved. Hertzum (2008) asserts that there is a need to establish common ground and information sharing for collaborative work to take place. In the communication field, Gibson and Gibbs (2006) studied virtual teams, collaboration, and innovation, and found barriers to collaboration at a distance. These barriers included geographic dispersion (being physically distant), electronic dependence (reliance on computer-mediated communication, with little or no face-to-face contact), structural dynamism (frequent change in personnel), and national diversity (being from different nations or nationalities).
One barrier they identified, geographic dispersion, also has been studied with regard to email reference (Portree, Evans, Adams, & Doherty, 2008) and found to be a factor in increasing the length of time spent on live chat queries (Connaway & Radford, 2011; Radford & Connaway, 2005-2008).

Literature on SQA

SQA services are community-based and purposefully designed to support people who wish to ask and answer questions by interacting with one another online. People ask questions of the public and expect to receive answers from anyone who knows something related to the question, allowing everyone to benefit from the collective wisdom of many. In essence, SQA enables people to collaborate by sharing and distributing information among fellow users and by making the entire process and product publicly available. This encourages users to participate in various support activities beyond question and answer, including commenting on questions and answers, rating the quality of answers, and voting on the best answers. Within the past few years, various types of SQA services have been introduced to the public, and researchers have begun to evidence interest in information-seeking behaviors in these contexts. The most popular examples of SQA include Yahoo! Answers (http://answers.yahoo.com/) and AnswerBag (http://www.answerbag.com/). The advantages of this approach are low cost (most services are free), quick turnaround due to large community participation, and easy build-up of social capital. On the other hand, there is typically no guarantee regarding the quality of the answers; the asker is simply relying on the wisdom of the crowd.

While services such as Yahoo! Answers offer peer-to-peer question answering, other services facilitate discussion among peers for questioning and answering. Such services allow a group of users to participate in solving a problem posed by a member of that community. This approach is very similar to peer-to-peer Q&A, with the difference that it encourages peers to hold a discussion rather than simply trying to answer the original question. WikiAnswers is a good example of this approach. The advantage is that the asker often receives more comprehensive information that includes not only the answer but also the opinions and views of the participants; in a way, this approach can be seen as a hybrid of Yahoo! Answers and Wikipedia. The disadvantage is that not many questions require discussion-based answering. In addition, even with services such as Yahoo! Answers, users can go back and forth between asking questions and leaving answers or comments, thus inducing an implicit discussion.
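To make the exchange structure just described concrete, the following toy sketch models a public question, its answers, and community voting for a best answer. It is illustrative only; the class and field names are our assumptions, not any particular site's schema.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Answer:
    author: str
    text: str
    votes: int = 0                      # community rating of this answer
    comments: list[str] = field(default_factory=list)

@dataclass
class Question:
    asker: str
    text: str                           # posed in natural language to the community
    answers: list[Answer] = field(default_factory=list)
    best_answer: Answer | None = None   # selected here by community vote

    def select_best(self) -> None:
        """Community voting: the highest-rated answer becomes 'best'."""
        if self.answers:
            self.best_answer = max(self.answers, key=lambda a: a.votes)

# Anyone may answer, comment on, or vote for answers to a public question.
q = Question(asker="alice", text="What is the capital of Australia?")
q.answers.append(Answer(author="bob", text="Canberra", votes=5))
q.answers.append(Answer(author="carol", text="Sydney", votes=1))
q.answers[1].comments.append("Sydney is the largest city, not the capital.")
q.select_best()
print(q.best_answer.text)               # -> Canberra
```

The point of the sketch is that the entire process (question, answers, comments, votes) is public data, which is what enables the support activities beyond question and answer described above.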

What a Convergence of SQA and VRS Offers

In library and information science (LIS), Pomerantz (2006) asserts that it is imperative to sustainability that VRS collaborate to ensure that users receive answers, not just referrals or instruction on how to find answers themselves. A seemingly natural choice for VRS is to collaborate with SQA, which has several components similar to VRS, the major difference being that answers can be provided by virtually anyone in the world. SQA, according to Shah, Oh, and Oh (2009), consists of three components: a mechanism for users to submit questions in natural language, a venue for users to submit answers to questions, and a community built around this exchange. Despite its short history, SQA has already attracted a great deal of attention from researchers investigating information-seeking behaviors (Kim, Oh, & Oh, 2007), selection of resources (Harper, Raban, Rafaeli, & Konstan, 2008), social annotations (Gazan, 2008), user motivations (Shah, Oh, & Oh, 2008), comparisons with other types of question-answering services (Su, Pavlov, Chow, & Baker, 2007), and a range of other information-related behaviors.

Pomerantz (2008a) wrote about the Slam the Boards effort, in which librarians voluntarily pick up questions from SQA services, answer them, and declare themselves to be librarians in order to educate the public about quality answers from information professionals. In addition, Enquire, a 24-hour live question-answering service offered by public librarians across England and Scotland in collaboration with partners in the United States (http://uk.answers.yahoo.com/activity?show=ct91f1tlaa), is answering up to 1,500 questions per month in Yahoo! Answers UK and Ireland within specific subject areas. Enquire has been rated the best answer 79 percent of the time. Although these endeavors provide a library presence in SQA sites and have earned a reputation for good answers, they represent a very small number of libraries and are not operating under an economic model or mechanism to triage questions with the SQA sites.

From the standpoint of user satisfaction with both the answer and the site, SQA sites would benefit from a mechanism to triage questions between sites. The question's topic would be one factor in such a mechanism, but other factors used in evaluating the quality of answers also could be valuable for this purpose. This sort of triage is comparatively simple for a human to perform and, while time-consuming, is in fact commonly performed by librarians for digital reference services (Pomerantz, 2008a; Pomerantz, 2008b; Pomerantz, Nicholson, Belanger, & Lankes, 2004). The QuestionPoint (QP) reference service (http://questionpoint.org), an application for managing a global collaborative of library reference services, also performs this sort of triage automatically, by matching questions to profiles of individual libraries (Kresh, 2000). The level of complexity that such automated triage systems can handle, however, pales in comparison to what a human triage can manage.

The current model of VRS is often monolithic, in the sense that it works as a black box, with questions going in and answers coming out, without an easy way to extend the existing services or combine them with other sites or services.
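To illustrate the profile-based triage described above, the sketch below routes a question to the participating service whose declared subject profile best overlaps the question's topics. This is a deliberately simplified, assumption-laden illustration, not QuestionPoint's actual matching mechanism; the service names, profile contents, and the fall-back-to-SQA rule are all our inventions.

```python
# Minimal sketch of profile-based question triage: route a question to the
# participating service whose declared subject profile best matches it.
# Illustrative only; real systems (e.g., QuestionPoint) use far richer profiles.

SERVICE_PROFILES = {
    "university_library_vrs": {"history", "literature", "chemistry", "citation"},
    "public_library_vrs":     {"genealogy", "local history", "readers advisory"},
    "sqa_community":          {"technology", "travel", "cooking", "opinion"},
}

def triage(question_topics: set[str]) -> str:
    """Return the service whose profile overlaps most with the question's topics."""
    def overlap(service: str) -> int:
        return len(SERVICE_PROFILES[service] & question_topics)
    best = max(SERVICE_PROFILES, key=overlap)
    # Assumed fallback: send unmatched questions to the SQA community,
    # reflecting the complementarity between VRS and SQA discussed here.
    return best if overlap(best) > 0 else "sqa_community"

print(triage({"chemistry", "citation"}))   # -> university_library_vrs
print(triage({"celebrity", "opinion"}))    # -> sqa_community
```

Even this toy version makes the limitation plain: topic overlap alone captures none of the answer-quality factors a human triager weighs.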
This monolithic model is becoming difficult to support, especially in the current economic environment, which has negatively impacted library funding across the country ("Branch Closings and Budget Cuts Threaten Libraries Nationwide," 2008; Goldberg, 2010; Henderson & Bosch, 2010), threatening and often implementing devastating service reductions ("State Budgets Hammer Public Libraries Nationwide," 2009, p. 19). Reference desks (physical or virtual) are sometimes underutilized, or the kinds of information requests coming to them are not appropriate for a given reference service or the available staff (Naylor, Stoffel, & Van Der Laan, 2008). Creating an on-demand collaborative among participating reference services at the same or even different institutions may allow for ways not only to best utilize library resources, but also to provide a more effective service to users.

VRS offer in-depth expert answering to information seekers, whereas SQA services, such as Yahoo! Answers (http://answers.yahoo.com), provide quick answers using crowdsourcing. The former is expected to have better quality, whereas the latter has the advantage of large volume. An interesting avenue to explore is how VRS can collaborate with SQA services to provide more in-depth information when needed. Such collaboration may allow SQA services to offer premium (paid) content to their large user base, and VRS to create a new revenue stream to support their sustainability.

VRS are evolving, with new developments coming at a quickening pace to enhance the user experience, recently allowing access to library services through text messaging, mobile devices, and social networking sites (e.g., Cassell, 2010; Cassell & Hiremath, 2009; Cole & Krkoska, 2011). However, an expanded research agenda that generates empirical data is needed to assess the effectiveness of these services and enhancements. Furthermore, the reduction of library budgets increases the need to identify opportunities to share resources and generate revenue through collaboration, and SQA services may provide such an opportunity.

Research Questions

To address these issues and to further the understanding of user behavior in VRS and SQA, the following research questions are being addressed by Cyber Synergy:

- What is the effectiveness of various VRS and SQA services, the quality of content provided, and their relative merits and shortcomings?
- How does accuracy compare between VRS and SQA sites?
- What lessons can be learned from SQA sites that could be applied to VRS, and vice versa?
- How can VRS become more collaborative, within and between libraries, and tap more effectively into librarians' subject expertise?
- How can we design systems and services within and between VRS and SQA for better quality and sustainability?

The present paper provides some of the early investigations and findings along these questions.

METHOD

This project will span two years, and this paper reports initial transcript analysis for VRS queries and results from interviews with online Q&A end-users and experts (university librarians). These preliminary results will be used to inform the development of later phases of the grant project, which will include comparisons of transcript analyses of both SQA and VRS sessions; additional interviews with VRS users, VRS providers, and SQA users; plus design sessions with experts (in both system design and information professions), which will result in specifications for a system addressing the research question relating to system design.

Method for VRS Query Analysis

A sample of 575 transcripts was drawn from 296,797 QuestionPoint (QP) live chat and Qwidget (QW) sessions from June 2010 to December 2010. Of these, 560 were found to be usable, including 350 live chat sessions from QP and 210 QW sessions. All transcripts were stripped of identifying information (e.g., name, email address, IP address) at OCLC and then subjected to several different analyses by teams of coders, including researchers and graduate assistants from OCLC and Rutgers University, supervised by the co-authors. A team of two coders first classified the subject of the questions from each chat and QW transcript using the Dewey Decimal Classification System's categories. Two additional coders determined the question type (e.g., Ready Reference, Subject Search, Procedural) using criteria and category schemes from Katz (1997), Kaske and Arnold (2002), and Arnold and Kaske (2005). The subset of Ready Reference questions was further analyzed for accuracy, using criteria and category schemes from Arnold and Kaske (2005), Radford and Connaway (2005-2008), and Radford, Connaway, Confer, Sabolsci-Boros, and Kwon (2011). Coders evaluated the accuracy of chat Ready Reference answers by checking responses against authoritative websites, subscription-based databases, and any citations/links provided by the librarian/staff member. Intercoder reliability after discussion to resolve differences was above 90 percent for all analyses of VRS transcripts.

Method for SQA Analysis

In order to address the research question regarding user and expert perspectives on SQA, a series of preliminary interviews was conducted. Interviews can be designed to allow researchers to collect qualitative data and are appropriate for exploratory phases of research projects in library and information science (LIS) contexts (Connaway & Powell, 2010). Here, structured interviews were used to gather perceptions regarding how users judge the quality of answers, and also how experts might view the potential for increased collaboration and the leveraging of subject knowledge that might result from a synergy between SQA and VRS. Ten reference librarians from Rutgers University Libraries, representing multifaceted subject areas including the liberal arts, the arts, the sciences, and business disciplines, were interviewed.
Interview questions for these academic librarian experts centered on their perceptions of face-to-face and VRS reference, approaches to responding to users' needs, selection of resources, important components of quality answers, and their knowledge and opinion of SQA services. Additionally, exploratory interviews were conducted with users of online Q&A systems, including twenty-four undergraduate students from Rutgers University, NJ, and 12 Master's students from eight different universities (i.e., Boston University, Johns Hopkins, Drexel University, Philadelphia College of Osteopathic Medicine, Harvard, Georgetown, Notre Dame, and Northwestern). Convenience sampling was used: undergraduates who volunteered to participate in the study were offered either monetary compensation ($10) or extra credit, and Master's students were offered monetary compensation ($10). Undergraduates were recruited through flyers posted at Rutgers University; the Master's students were recruited using a snowball sampling technique beginning with Rutgers University Master's students. Students who volunteered were selected as participants only if they could identify past use of and/or experience with at least one online Q&A platform. Interview questions for these graduate and undergraduate end-users centered on their use of and experiences in physical and digital libraries, their use of and experience with SQA services, and what they look for in question-answering services. All interviews were audio-recorded and transcribed, and a theme analysis was performed by the research team following the constant comparison method (Glaser & Strauss, 1967), facilitated through the use of NVivo 9 qualitative analysis software (http://www.qsrinternational.com/products_nvivo.aspx).

PRELIMINARY RESULTS FOR VRS

VRS Query Analysis: Dewey Decimal Classification

To better understand the types of subjects received by VRS, the 560 transcripts (350 from QP and 210 from QW) were further analyzed for topicality. 429 questions were identified that could be coded using the Dewey Decimal Classification (Dewey, 2011). The Dewey system was selected as it is a well-recognized and simple subject typology. An additional 261 questions were identified that could not be assigned Dewey numbers (these questions included Procedural, No Subject, Unclear, and Miscellaneous: Directional, Inappropriate, Test).
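The intercoder reliability figures reported in the Method section are simple percent agreement between two coders over the same items. A minimal sketch of that computation follows; the category labels in the example are invented for illustration and are not actual study data.

```python
# Percent agreement between two coders labeling the same items
# (a minimal sketch; the example labels below are made up).
def percent_agreement(coder_a: list[str], coder_b: list[str]) -> float:
    assert len(coder_a) == len(coder_b), "coders must label the same items"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

coder_a = ["Procedural", "Ready Reference", "Holdings", "Subject Search"]
coder_b = ["Procedural", "Ready Reference", "Holdings", "Research"]
print(f"{percent_agreement(coder_a, coder_b):.0f}%")  # -> 75%
```

In the study itself, disagreements were first resolved through discussion, after which agreement exceeded 90 percent for all VRS transcript analyses.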

As can be seen in Table 1, the highest percentages were found for questions related to the Dewey categories of Social Sciences (116, 16.81%) and Technology (80, 11.59%).

Table 1. VRS questions by Dewey Decimal Classification

  Dewey Decimal Classification category            # VRS questions   Percent
  000 Generalities                                       39            5.65%
  100 Philosophy & Psychology                            18            2.61%
  200 Religion                                           15            2.17%
  300 Social Sciences                                   116           16.81%
  400 Language                                           13            1.88%
  500 Natural Science & Mathematics                      33            4.78%
  600 Technology (Applied Sciences)                      80           11.59%
  700 The Arts                                           27            3.91%
  800 Literature & Rhetoric                              58            8.41%
  900 Geography & History                                30            4.35%
  Other (Procedural, Directional, No Subject,
         Holdings, Inappropriate, Test, Unclear)        261           37.83%
  Total VRS questions                                   690          100.00%

Type of Question

From the 560 usable transcripts (350 from QP and 210 from QW), coders identified 575 questions, as some transcripts contained more than one query. These were sorted into nine categories derived from Arnold and Kaske (2005), Radford and Connaway (2005-2008), and Ross et al. (2009): 1) Subject Search, 2) Ready Reference, 3) Procedural, 4) No Question, 5) Holdings, 6) Research, 7) Inappropriate, 8) Directional, and 9) Reader's Advisory. The largest numbers and percentages of questions fell into two categories: Procedural (181, 31 percent) and Ready Reference (179, 31 percent). The next most frequent were Subject Search (97, 17 percent), Holdings (49, 9 percent), and No Question (25, 4 percent). Lower numbers and percentages were found for Research (19, 3 percent), Directional (15, 3 percent), Reader's Advisory (6, 1 percent), and Inappropriate (4, <1 percent).

Accuracy of Ready Reference Answers

The subset of 179 Ready Reference queries was further examined for accuracy; 11 of these were eliminated because they were referred for follow-up to another library/librarian or involved technical difficulties. The remaining 168 queries were checked for accuracy by a team of two coders. Of these, 151 answers (90 percent) were correct, 7 (4 percent) were incorrect, and 10 (6 percent) were coded as Other. Correct answers were further examined to determine which included citations, which did not, and for which no citation was necessary. Correct with Citation accounted for 75 percent (114 of 151 correct answers), Correct without Citation for 14 percent (21 of 151), and Correct, No Citation Needed for 11 percent (16 of 151).

PRELIMINARY RESULTS FOR SQA

This investigation began by identifying a series of factors that users employ when judging the goodness of an answer or, more appropriately, a response. A comprehensive review of the LIS literature, focusing on SQA services and VRS best practices, identified over 200 factors. Scholars argue that many of these factors overlap and can be winnowed down to an exhaustive list of around 10-30 factors; however, a comprehensive, standard list has yet to be generated. From this review, it was determined that these factors could be grouped into three high-level categories: quality, relevance, and satisfaction. These categories account for both the qualities inherent to the objective content of the information and the subjective expectations of the end user, which combine to produce a judgment of information value.
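To make the three-category grouping concrete, the sketch below files a handful of the factors named in this study under quality, relevance, and satisfaction, and scores an answer per category. The specific factor-to-category assignments and the scoring rule are illustrative assumptions, not the project's final coding scheme.

```python
# Illustrative grouping of answer-evaluation factors under the three
# high-level categories identified in the literature review. The specific
# factor-to-category assignments here are assumptions for illustration.
FACTOR_CATEGORIES = {
    "quality":      {"validity", "accuracy", "clarity", "citations"},
    "relevance":    {"topicality", "depth", "completeness"},
    "satisfaction": {"timeliness", "length", "availability"},
}

def judge(answer_factors: set[str]) -> dict[str, float]:
    """Score an answer per category as the share of that category's factors met."""
    return {cat: len(fs & answer_factors) / len(fs)
            for cat, fs in FACTOR_CATEGORIES.items()}

# An answer that is on-topic and verifiable, but slow to arrive:
print(judge({"topicality", "validity", "accuracy"}))
# -> {'quality': 0.5, 'relevance': 0.333..., 'satisfaction': 0.0}
```

The per-category scores capture the idea above: objective content qualities and subjective user expectations are judged separately, then combined into an overall sense of information value.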
These findings extend earlier works that evaluated answers or responses for quality or relevance only (Agichtein, Castillo, Donato, Gionis, & Mishne, 2008; Harper et al., 2008; Kim & Oh, 2009). To determine which factors to classify under the three high-level categories, it was necessary to explore insights garnered from both experts and users regarding the criteria they employ when making value judgments. The interviews with experts (librarians) and end-users (students) constituted an initial attempt to look at both sides of the online Q&A coin, with the aim of comparing SQA and VRS at some level.

Interviews revealed that users and experts most often identified topicality and validity as important factors in making value judgments of information. Undergraduate and graduate students value information that is accurate and will often test the veracity of answers provided to them on an SQA site. If they could verify the truth of an answer, or at least had the opportunity to do so if desired, they were more inclined toward SQA as compared to VRS. Students who reported participating in library instruction sessions emphasized their continued use of the search strategies they had learned to find resources that were more in-depth and on-topic with their information need. This aligns with the experts' similarly indicated valuing of these two attributes. The noted success of the library instruction sessions suggests that future study should further analyze the effectiveness of these sessions in aligning user and expert information behaviors, as well as examine the efficacy of these services across various platforms.

CONCLUSION

The above literature review provides a brief overview of the research related to these different, yet essentially similar, models for delivering online information service to end-users. Traditional VRS, like face-to-face reference services, embraces a one-to-one model (i.e., one asker, one expert professional, usually a librarian), whereas SQA promotes a one-to-many model (i.e., one asker to many potential experts, harnessing the knowledge of the crowd). This paper foregrounds several questions: Can and should these two models converge or be bridged? What would a combined or synergized model look like? Should VRS embrace the wisdom of the crowd, and how could it do so, especially for questions on highly esoteric or complex subjects? The research findings reported here constitute the initial explorations of a larger, two-year project investigating these questions and the idea of a possible convergence and synergy between SQA and VRS (Radford, Connaway, & Shah, 2011-2013).

Analysis of preliminary data has found that VRS successfully address a broad range of subjects, predominantly centered in the social sciences and technology. Classification of a sample randomly drawn from a large corpus found that the largest numbers of question types are Procedural and Ready Reference. Accuracy for the subset of questions typed as Ready Reference, across the wide range of subjects, was found to be high, with 90 percent correct, although only 75 percent of the correct answers included a citation. One application of this finding is that it establishes a benchmark for VRS accuracy. VRS librarians/staff were found to provide a high level of accuracy for informational questions on a variety of topics. The level of accuracy (and user satisfaction) for SQA will be a focus of the grant project's later phases, and an analysis of the reasons why some SQA questions fail to garner any answers is in progress. A typology of attributes of failed informational questions is also being developed.

The interviews conducted to compare VRS and SQA services provided interesting insights from both the experts' (librarians') and the end-users' (students') points of view. The librarians who were interviewed placed equal or higher importance on attaining sustained user satisfaction and meeting service expectations. They also valued teaching search strategies that enable users to develop their searching and information evaluation skills, as opposed to simply delivering e-resources, such as URLs for authoritative websites or scholarly articles from databases, that will most likely be used only once. Most librarians claimed that teaching these strategies was better conveyed via electronic media than in face-to-face settings (see also Radford & Connaway, 2005-2008, which reported a similar finding in interviews with VRS librarians). Students, on the other hand, asserted the importance of topicality, length, visibility, timeliness, clarity, availability, and verifiability. These judgments were conveyed both through their reported information behavior and through their analyses of pre-identified categories culled from the literature review. Unlike the experts, students placed the factors comprising relevance, quality, and satisfaction on equal planes, suggesting that these three criteria have equal importance for information seeking within academic and Everyday Life Information Seeking (ELIS) contexts (Savolainen, 1995).
Additional analyses of these data are in progress, as is analysis of SQA questions by subject and question type, and of the reasons why some questions fail to obtain any answers, all of which will allow for further comparison. Later phases of the project will include in-depth interviews with VRS users and VRS providers, additional interviews with SQA users, plus design sessions with experts, which will result in specifications for a system addressing the research question relating to system design. The findings of the Cyber Synergy research will help scholars to: a) explore how various library services, especially VRS, could better employ existing and frequently underutilized services (e.g., Naylor et al., 2008); b) understand how the subject expertise of librarians could be better leveraged through virtual collaborations; c) develop guidelines for practice and recommendations for the evaluation of VRS and SQA; d) inform systems design; and e) explore ways to connect potential users and information seekers with SQA services, VRS, and social media sites. These investigations could inform system design and the adoption of different service models to create a more open, innovative, and sustainable pathway for the future.

ACKNOWLEDGMENTS

Support for the research study, Cyber Synergy: Seeking Sustainability through Collaboration between Virtual Reference and Social Q&A Sites, was provided by the Institute of Museum and Library Services (IMLS), Rutgers, The State University of New Jersey, and OCLC, Inc. The authors wish to thank the following research assistants who helped with coding and transcript analysis: Erik Choi, Alyssa Darden, Lisa DeLuca, Mary Ennis, Erin Hood, Jaqueline Woolcott, and Vanessa Kitzie.

REFERENCES

Agichtein, E., Castillo, C., Donato, D., Gionis, A., & Mishne, G. (2008). Finding high-quality content in social media. In Proceedings of the International Conference on Web Search and Web Data Mining (pp. 183-194). New York: ACM.

Arnold, J., & Kaske, N. K. (2005). Evaluating the quality of a chat service. portal: Libraries and the Academy, 5(2), 177-193.

Branch closings and budget cuts threaten libraries nationwide. (2008, November 7). American Libraries, 39(11), 22.

Cassell, K. A. (2010). Meeting users' needs through new reference service models. In M. L. Radford & R. D. Lankes (Eds.), Reference renaissance: Current and future trends (pp. 153-160). New York, NY: Neal-Schuman Publishers, Inc.

Cassell, K. A., & Hiremath, U. (2009). Reference and information services in the 21st century: An introduction (2nd ed.). New York: Neal-Schuman Publishers, Inc.

Cole, V., & Krkoska, B. (2011). Launching a Text a Librarian service: Cornell's preliminary experiences. The Reference Librarian, 52(1), 3-8.

Connaway, L. S., & Powell, R. L. (2010). Basic research methods for librarians (5th ed.). Littleton, CO: Libraries Unlimited.

Connaway, L. S., & Radford, M. L. (2011). Seeking synchronicity: Revelations and recommendations for virtual reference. Dublin, OH: OCLC Research. Retrieved February 26, 2012, from http://www.oclc.org/reports/synchronicity/full.pdf

Dewey, M. (2011). Dewey Decimal Classification and Relative Index (23rd ed., J. S. Mitchell, J. Beall, R. Green, G. Martin, & M. Panzer, Eds.). Dublin, OH: OCLC.

Gazan, R. (2008). Social annotations in digital library collections. D-Lib Magazine, 14(11/12). Retrieved from http://www.dlib.org/dlib/november08/gazan/11gazan.html

Gibson, C. B., & Gibbs, J. (2006). Unpacking the concept of virtuality: The effects of geographic dispersion, electronic dependence, dynamic structure, and national diversity on team innovation. Administrative Science Quarterly, 51, 451-495.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. New York: Aldine.

Goldberg, B. (2010). Libraries reach FY2011: Some relieved, all wary. American Libraries, 41(8), 16.

Hansen, M. T. (2009). When internal collaboration is bad for your company. Harvard Business Review, 87(4), 82-88.

Harper, M. F., Raban, D. R., Rafaeli, S., & Konstan, J. K. (2008). Predictors of answer quality in online Q&A sites. In Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems (pp. 865-874). New York: ACM.

Henderson, K. S., & Bosch, S. (2010). Seeking the new normal. Library Journal, 135(7), 36-40.

Hertzum, M. (2008). Collaborative information seeking: The combined activity of information seeking and collaborative grounding. Information Processing & Management, 44(2), 957-962.

Introducing Qwidget, the new QuestionPoint chat widget. (2008). Retrieved from http://www.oclc.org/news/announcements/announcement268.htm

Kaske, N., & Arnold, J. (2002, June). An unobtrusive evaluation of online real time library reference services. Paper presented at the American Library Association Annual Conference, Atlanta, GA.

Katz, W. A. (1997). Introduction to reference work (7th ed.). New York: McGraw-Hill.

Kim, S., & Oh, S. (2009). Users' relevance criteria for evaluating answers in a social Q&A site. Journal of the American Society for Information Science and Technology, 60(4), 716-727.

Kim, S., Oh, J. S., & Oh, S. (2007). Best-answer selection criteria in a social Q&A site from the user-oriented relevance perspective. Proceedings of the 70th Annual Meeting of the American Society for Information Science and Technology, 44(1), 1-15.

Kresh, D. N. (2000). Offering high quality reference service on the Web: The Collaborative Digital Reference Service (CDRS). D-Lib Magazine, 6(6). Retrieved from http://www.dlib.org/dlib/june00/kresh/06kresh.html

Levy, D. M., & Marshall, C. C. (1994). Washington's white horse? A look at assumptions underlying digital libraries. In Proceedings of Digital Libraries '94 (pp. 163-169). College Station, TX: Texas A&M University Press.

Naylor, S., Stoffel, B., & Van Der Laan, S. (2008). Why isn't our chat reference used more? Reference & User Services Quarterly, 47(4), 342-354.

Nielsen, J. (1993). Usability engineering. San Francisco, CA: Morgan Kaufmann.

Pomerantz, J. (2006). Collaboration as the norm in reference work. Reference & User Services Quarterly, 46(1), 45-55.

Pomerantz, J. (2008a). The librarian invasion: Evaluating the Slam the Boards effort. Proceedings of the American Society for Information Science and Technology, 45(1). Retrieved from http://www.ils.unc.edu/~jpom/conf/asist2008_eref_panel.pdf

Pomerantz, J. (2008b). Evaluation of online reference services. Bulletin of the American Society for Information Science and Technology, 34(2), 15-19. Retrieved from http://www.asis.org/bulletin/dec-07/pomerantz.html

Pomerantz, J., Nicholson, S., Belanger, Y., & Lankes, R. D. (2004). The current state of digital reference: Validation of a general digital reference model through a survey of digital reference services. Information Processing & Management, 40(2), 347-363.

Portree, M., Evans, R., Adams, T. M., & Doherty, J. J. (2008). Overcoming transactional distance: Instructional intent in an e-mail reference service. Reference & User Services Quarterly, 48(2), 142-152.

Radford, M. L., & Connaway, L. S. (2005-2008). Seeking synchronicity: Evaluating virtual reference services from user, non-user, and librarian perspectives. Funded by the National Leadership Grants for Libraries program of the Institute of Museum and Library Services (IMLS). Retrieved from http://www.oclc.org/research/activities/synchronicity/default.htm

Radford, M. L., Connaway, L. S., & Shah, C. (2011-2013). Cyber Synergy: Seeking sustainability through collaboration between virtual reference and social Q&A sites. Retrieved from http://www.oclc.org/research/activities/synergy/default.htm

Radford, M. L., Connaway, L. S., Confer, P., Sabolsci-Boros, S., & Kwon, H. (2011). "Are we getting warmer?" Query clarification in live chat virtual reference. Reference & User Services Quarterly, 50(3), 259-279.

Radford, M. L., & Mon, L. (2008). Reference service in face-to-face and virtual environments. In M. L. Radford & P. Snelson (Eds.), Academic library research: Perspectives and current trends (pp. 1-47). Chicago: ACRL.

Ross, C. S., Nilsen, K., & Radford, M. L. (2009). Conducting the reference interview (2nd ed.). New York: Neal-Schuman Publishers, Inc.

Savolainen, R. (1995). Everyday life information seeking: Approaching information seeking in the context of "way of life." Library & Information Science Research, 17(3), 259-294.

Shachaf, P. (2010). Social reference: Toward a unifying theory. Library & Information Science Research, 32(1), 66-76.

Shah, C., Oh, J. S., & Oh, S. (2008). Exploring characteristics and effects of user participation in online social Q&A sites. First Monday, 13(9). Retrieved from http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2182/2028

Shah, C., Oh, S., & Oh, J. S. (2009). Research agenda for social Q&A. Library & Information Science Research, 31(4), 205-209.

Sloan, B. (2006). Twenty years of virtual reference. Internet Reference Services Quarterly, 11(2), 91-95.

State budgets hammer public libraries nationwide. (2009). American Libraries, 40(8/9), 19-21.

Su, Q., Pavlov, D., Chow, J., & Baker, W. (2007). Internet-scale collection of human-reviewed data. In C. L. Williamson, M. E. Zurko, P. E. Patel-Schneider, & P. J. Shenoy (Eds.), Proceedings of the 16th International Conference on World Wide Web (pp. 231-240). New York: ACM.

Twidale, M. B., & Nichols, D. M. (1996). Collaborative browsing and visualisation of the search process. Aslib Proceedings, 48(7-8), 177-182. Retrieved from http://www.comp.lancs.ac.uk/computing/research/cseg/projects/ariadne/docs/elvira96.html