Towards a Semantic Standard for Enterprise Crowdsourcing
A scenario-based evaluation of a conceptual prototype
ECIS, Utrecht, June 7th, 2013
Lars Hetmank
MOTIVATION
What are the Challenges for Enterprise Crowdsourcing?

In a crowdsourcing system, the requester defines and proposes a crowdsourcing task, and a crowd is selected to process it; monitoring allows the running process to be adjusted. Five challenges arise along this cycle:

1. Task specification
2. Task allocation
3. Team formation
4. Monitoring
5. Interoperability with external business applications (knowledge repository, enterprise SNS, HR database, ...)
MY RESEARCH
Methodology

Design-science research approach (Hevner et al., 2004), based on a systematic literature review on crowdsourcing systems (Hetmank, 2013).

Application of the design-science research (DSR) guidelines according to Hevner et al. (2004):

- Design as an artifact: semantic standard, the Enterprise Crowdsourcing Metadata Schema
- Problem relevance: automation and interoperability
- Research contribution: first proof-of-concept
- Design evaluation: scenario building
- Research rigor: based on previous studies on crowdsourcing
- Search process: first step in the development process
- Research communication: technical-oriented and management-oriented audience; conferences, journals, and prototypes
DEVELOPMENT OF A CONCEPTUAL PROTOTYPE
Crowdsourcing Task: metadata elements

- Task description
- Type of the reward
- Nature of the reward
- Complexity
- Type of action
- Modularization
- Target audience
- Visibility
- Confidentiality
- Latency
- Duration
- Submission time
- Closure time
- Human requirement
- Technical resource
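The task elements above can be sketched as a serialized record. This is a minimal, hypothetical illustration: the XML element names, the `CrowdsourcingTask` root, and the flat nesting are assumptions for this sketch, not part of the proposed standard.

```python
# Sketch: wrapping crowdsourcing-task metadata elements in XML.
# Element names mirror the slide; the structure is an assumption.
import xml.etree.ElementTree as ET

def build_task(values: dict) -> ET.Element:
    """Wrap a dict of metadata values in a <CrowdsourcingTask> element."""
    task = ET.Element("CrowdsourcingTask")
    for name, value in values.items():
        child = ET.SubElement(task, name)
        child.text = str(value)
    return task

# A few of the fifteen elements, filled with values from scenario 1.
task = build_task({
    "TaskDescription": "Evaluate product design",
    "Complexity": "Simple",
    "TypeOfAction": "Evaluate",
    "Latency": "Immediate",
    "Duration": "1 minute",
})
print(ET.tostring(task, encoding="unicode"))
```

A schema language (XML Schema, RDF/OWL) would additionally constrain which elements are required and what value ranges they admit.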
Crowdsourcing User: metadata elements

- User identity
- Nationality
- Qualification
- Job title
- Entry date
- Accomplishment
- Department
- Location
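The eight user elements can likewise be sketched as a typed record. The field types and the derived tenure helper are assumptions added for illustration; the example values are taken from the evaluation scenario.

```python
# Sketch: the eight crowdsourcing-user metadata elements as a plain
# Python record. Field names mirror the slide; types are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class CrowdsourcingUser:
    user_identity: str
    nationality: str
    qualification: str
    job_title: str
    entry_date: date
    accomplishment: str   # e.g. a URI pointing to a completed task
    department: str
    location: str

    def tenure_years(self, today: date) -> float:
        """Approximate job tenure in years, derived from the entry date."""
        return (today - self.entry_date).days / 365.25

alan = CrowdsourcingUser(
    "Alan Coulter", "Irish", "Master of Product Design and Development",
    "Chief product designer", date(1993, 4, 1),
    "http://examplecompany.com/cs/task/3241",
    "Product development", "Cork",
)
print(round(alan.tenure_years(date(2013, 6, 7)), 1))
```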
EVALUATION
Evaluation and First Proof-of-Concept

Three scenarios that may occur in real business environments:

1. Evaluate product design proposals
2. Translate a technical specification
3. Build a company-wide virtual library

Four users as a subset of an example crowd.
Scenario 1: task metadata

Task description: Evaluate product design
Target audience: Hybrid
Complexity: Simple
Type of action: Evaluate
Modularization: 10 subtasks (bundled)
Latency: Immediate
Nature of the reward: Fixed and performance-based
Type of the reward: 15 reputation points plus bonus or discount of 5
Submission time: After release
Closure time: After 20 reviews for each product design
Duration: 1 minute
Confidentiality: Low
Visibility: Hidden
Human requirement: Job tenure of more than two years OR master in engineering, product design, marketing OR sales
Technical resource: http://www.flickr.com/photos/new-product-xyz

Scenario 1: two example users

User identity: Alan Coulter | Adèle Girard
Location: Cork | Lyon
Nationality: Irish | French
Job title: Chief product designer | Junior product engineer
Entry date: 1993-04-01 | 2010-02-09
Department: Product development | Product engineering
Qualification: Master of Product Design and Development | Bachelor of Engineering
Accomplishment: http://examplecompany.com/cs/task/3241 | <none>
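A sketch of how the scenario-1 human requirement ("job tenure of more than two years OR a master's degree in one of the listed fields") could be checked against the two example users. The rule encoding below is an assumption for illustration; the conceptual prototype does not prescribe a syntax for conditional expressions.

```python
# Sketch: matching example users against the scenario-1 human requirement.
# The reference date and the string-based degree check are assumptions.
from datetime import date

TODAY = date(2013, 6, 7)  # date of the presentation, used as reference

users = [
    {"user_identity": "Alan Coulter", "entry_date": date(1993, 4, 1),
     "qualification": "Master of Product Design and Development"},
    {"user_identity": "Adèle Girard", "entry_date": date(2010, 2, 9),
     "qualification": "Bachelor of Engineering"},
]

MASTER_FIELDS = ("engineering", "product design", "marketing", "sales")

def meets_requirement(user: dict) -> bool:
    """Tenure of more than two years OR a master's in a listed field."""
    tenure_ok = (TODAY - user["entry_date"]).days > 2 * 365
    q = user["qualification"].lower()
    master_ok = q.startswith("master") and any(f in q for f in MASTER_FIELDS)
    return tenure_ok or master_ok

eligible = [u["user_identity"] for u in users if meets_requirement(u)]
print(eligible)
```

With the scenario data both users qualify, Alan through both branches and Adèle through her tenure since 2010.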
NEXT STEPS
Future Improvements and Research

- Some elements are currently oversimplified and require further refinement in their level of detail
- Facilitate the definition of conditional expressions
- Create additional concepts (contribution, evaluation, and reward mechanism)
- Reuse existing standards, such as FOAF, schema.org, Dublin Core, Activity Streams (activitystrea.ms), SIOC, PROTON, HR-XML, WS-BPEL Extension for People (BPEL4People), or the XML Process Definition Language (XPDL)
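Reusing existing vocabularies could look roughly like the mapping below: schema elements that already have an equivalent term borrow it, and only the remainder gets a schema-specific name. The concrete property choices (foaf:name, foaf:based_near, dc:description) and the `ecs:` fallback prefix are illustrative assumptions, not decided mappings.

```python
# Hypothetical sketch: mapping some prototype elements onto existing
# vocabularies (FOAF, Dublin Core). Property choices are assumptions.
REUSE_MAPPING = {
    # prototype element -> candidate existing term
    "user_identity":    "foaf:name",
    "location":         "foaf:based_near",
    "task_description": "dc:description",
}

def to_qualified(element: str) -> str:
    """Return the reused term for an element, or a namespaced fallback."""
    return REUSE_MAPPING.get(element, f"ecs:{element}")

print(to_qualified("user_identity"))
print(to_qualified("visibility"))
```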
Conclusion

- Foster standardization in the domain of enterprise crowdsourcing
- Five current challenges
- Design-science research approach
- First conceptual prototype:
  - crowdsourcing task (15 metadata elements)
  - crowdsourcing user (8 metadata elements)
DISCUSSION
Mobile: +49 176 24 77 39 65
Website: www.larshetmank.com
Skype: larshetmank
Twitter: twitter.com/larshetmank
Email: lars@hetmank.de