SPEC Research Group. Sam Kounev. SPEC 2015 Annual Meeting. Austin, TX, February 5, 2015




Standard Performance Evaluation Corporation
- OSG: Open Systems Group
- HPG: High Performance Group
- GWPG: Graphics and Workstation Performance Group
- RG: Research Group
More than 80 member organizations & associates; founded 1988

Standard Performance Evaluation Corporation: development of industry-standard benchmarks (CPU, Java, Virtualization, Power, OpenMP, MPI)

Standard Performance Evaluation Corporation: a platform for collaborative research (Cloud, Intrusion Detection Systems, (Big) Data?)

SPEC Research Group (RG)
The Research Group of the Standard Performance Evaluation Corporation
Mission statement:
- Provide a platform for collaborative research efforts in the area of quantitative system evaluation and analysis
- Foster interactions and collaborations between industry and academia
- Scope: computer benchmarking, performance evaluation, and experimental system analysis in general
- Focus on standard scenarios, metrics, benchmarks, analysis methodologies, and tools
Find more information at: http://research.spec.org
2015 SPEC Annual Meeting

Members (Dec 2014)

Scope
- Standard scenarios, metrics, and (research) benchmarks
- Methodologies, techniques, and tools for quantitative analysis
- Measurement, load testing, profiling, workload characterization, dependability & efficiency evaluation of computing systems, etc.
Performance in a broad sense:
- Classical performance metrics: response time, throughput, scalability, and efficiency (including energy efficiency), etc.
- Non-functional system properties under the term dependability: availability, reliability, and security

Selected SPEC RG Activities
Working groups:
- Cloud Computing (Chair: Alexandru Iosup, TU Delft)
- Intrusion Detection Systems (Chair: Marco Vieira, University of Coimbra)
- Big Data Benchmarking (Chair: Tilmann Rabl, bankmark)
- DevOps Performance (Chair: André van Hoorn, University of Stuttgart)
Repository of peer-reviewed tools, experimental data & traces
Maintains a portal for all kinds of performance-related resources
Organization of conferences and workshops

Selected Peer-Reviewed Tools
- Faban: automates the running of multi-tier server performance tests/workloads
- Kieker: application performance monitoring & dynamic software analysis
- Benchmarking of event processing systems
- DiSL: Java bytecode instrumentation
- Performance evaluation of storage systems

Example Tool: LIMBO
Problem: how can load intensity variations (e.g., requests per second) be captured in a compact mathematical model?
LIMBO, the Load Intensity Modeling Tool:
- Automated model extraction from recorded traces
- Creation and composition of custom models
- Emulation of job arrivals for load generation
http://descartes.tools/limbo
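The idea can be illustrated with a minimal sketch (hypothetical code, not LIMBO's actual model format, which is richer): a load-intensity function composed of a trend and a seasonal part, plus emulation of job arrivals that follow it via thinning of a Poisson process.

```python
import math
import random

def intensity(t):
    """Toy load-intensity model (requests/sec): a daily seasonal pattern
    on top of a slow linear trend. Illustrative assumption only."""
    trend = 10 + 0.001 * t                              # slow growth
    seasonal = 5 * math.sin(2 * math.pi * t / 86400)    # 24-hour period
    return max(trend + seasonal, 0.0)

def emulate_arrivals(horizon, seed=42):
    """Generate job arrival timestamps on [0, horizon) seconds by
    thinning a homogeneous Poisson process at rate lam_max."""
    rng = random.Random(seed)
    # Upper bound on the rate, sampled coarsely with a safety margin.
    lam_max = 1.05 * max(intensity(t) for t in range(0, horizon, 60))
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(lam_max)                   # candidate arrival
        if t >= horizon:
            break
        if rng.random() < intensity(t) / lam_max:       # accept w.p. λ(t)/λmax
            arrivals.append(t)
    return arrivals

arrivals = emulate_arrivals(3600)
print(len(arrivals))   # roughly the integral of λ(t) over the first hour
```

The same separation (extract a compact intensity function from a trace, then replay arrivals from it) is what makes such models useful as load-generator input.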

Research Benchmarks
Research benchmarks are not intended for direct comparison and marketing of existing products. Instead, they:
- provide a basis for in-depth quantitative analysis and evaluation,
- offer representative application scenarios and workloads,
- are flexible and customizable to different usage scenarios, and
- provide a range of possible metrics.

Highlights from SPEC RG's 4th Year
- Roughly 45 member organizations
- Two new working groups (RG Big Data and RG DevOps)
- Technical co-sponsorship of QEST 2014 and ICAC 2015
- Keynote presentation at QEST 2014
- Two new tools, DynamicSpotter and LIMBO, accepted
- A technical report on hypercall vulnerabilities published
- ISSRE paper nominated for the Best Paper Award
- RG Newsletter now established as a regular publication

RG DevOps Performance WG (Deployment, Operations, Testing)
Contact:
- Chair: André van Hoorn (University of Stuttgart)
- Vice chair: Andreas Brunnert (fortiss GmbH)
Web site: http://research.spec.org/devopswg/

RG DevOps Performance WG
Mission: to foster and facilitate research in combining measurement-based application performance management (APM) and model-based software performance engineering (SPE) activities for business-critical application systems.
http://research.spec.org/devopswg/

RG DevOps Performance WG
Current member organizations
Contact:
- Chair: André van Hoorn (University of Stuttgart)
- Vice chair: Andreas Brunnert (fortiss GmbH)
- Secretary: Felix Willnecker (fortiss GmbH)
- Release manager: Nils Ehmke (Kiel University)
Bi-weekly calls, currently Fridays, 4:00-5:00 (Berlin/CET)
Next F2F meeting in February 2015 (Würzburg, Germany)

RG DevOps: Topics of Interest
- Performance and workload model extraction
- Dynamic/adaptive instrumentation techniques
- Combination of static and dynamic analysis
- Feedback between development (Dev) and operations (Ops)
- Interchange formats for metrics and models
- Process models to better integrate SPE and APM activities
- Runtime performance management techniques
- Automatic problem detection, diagnosis, and resolution
- Load testing
- Benchmarks for SPE and APM methods, techniques, and tools

RG DevOps: Selected Activities
- Report on DevOps performance (10+ authors)
- Using common industry APM solutions for model generation (in cooperation with Dynatrace); see the poster @ ICPE 2015
- Expert-guided automatic diagnosis of performance problems in enterprise applications
- Mapping the landscape of model extraction approaches

RG Cloud WG
- Chair: Alexandru Iosup
- Secretary: Aleksandar Milenkoski
- Release manager: Nikolas Herbst
http://research.spec.org/working-groups/rg-cloud-working-group.html

RG Cloud WG: Vision
To take a broad approach, relevant for both academia and industry, to cloud benchmarking, quantitative evaluation, and experimental analysis; to develop new methodological elements for gaining a deeper understanding not only of cloud performance, but also of cloud operation and behavior, through diverse quantitative evaluation tools, including benchmarks, metrics, and workload generators.

RG Cloud WG: Topics of Interest
New methodological elements:
- Online workload generation
- Workload characterization and modeling
- Techniques for performance measurement and evaluation, etc.
New quantitative evaluation tools:
- Benchmarks, metrics, workload generators
Targeted topics, usually triggered by a question:
- How to present cloud usage scenarios in a uniform, formal way?
- How to measure elasticity? (And other metric-related questions)
- How to benchmark a PaaS cloud (for specific application domains)?

A Finished Product
A textual and visual formalism for describing cloud usage scenarios: value chains, value chains with mediators, hybrid service provisioning, etc.
A. Milenkoski, A. Iosup, S. Kounev, K. Sachs, P. Rygielski, J. Ding, W. Cirne, and F. Rosenberg. Cloud Usage Patterns: A Formalism for Description of Cloud Usage Scenarios. Technical Report SPEC-RG-2013-001 v.1.0.1, SPEC Research Group - Cloud Working Group, Standard Performance Evaluation Corporation (SPEC), April 2013.
http://research.spec.org/working-groups

Defining Elasticity
Definition: the degree to which a system is able to adapt to workload changes by provisioning and deprovisioning resources in an autonomic manner, such that at each point in time the available resources match the current demand as closely as possible.
N. Herbst, S. Kounev, and R. Reussner. Elasticity in Cloud Computing: What It Is, and What It Is Not. In Proceedings of the 10th International Conference on Autonomic Computing (ICAC 2013), San Jose, CA, June 24-28, 2013.
http://en.wikipedia.org/wiki/elasticity_(cloud_computing)
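One way to make "available resources match the current demand as closely as possible" quantitative is to compare sampled demand and supply curves. The sketch below is a simplified illustration under assumed unit-per-time-step sampling, not the exact metrics standardized in the ICAC paper or BUNGEE:

```python
def provisioning_accuracy(demand, supply):
    """Toy elasticity metrics over equally spaced samples: average
    under-provisioning (demand not met) and average over-provisioning
    (resources idle), both in resource units per time step."""
    assert len(demand) == len(supply)
    n = len(demand)
    under = sum(max(d - s, 0) for d, s in zip(demand, supply)) / n
    over = sum(max(s - d, 0) for d, s in zip(demand, supply)) / n
    return under, over

# Demand rises from 2 to 6 units; the platform reacts one step late.
demand = [2, 4, 6, 6, 6]
supply = [2, 2, 4, 6, 6]
print(provisioning_accuracy(demand, supply))  # (0.8, 0.0)
```

A perfectly elastic system would score (0, 0); the asymmetry between the two numbers distinguishes chronic under-provisioning from wasteful over-provisioning, which a single aggregate could hide.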

Work in Progress: BUNGEE
A framework for benchmarking elasticity. Current focus: IaaS cloud platforms. Planned collaboration with OSG Cloud.
http://descartes.tools/bungee

RG Cloud: Why and How to Join?
Why join?
- Active, growing group doing impactful work on cloud computing
- Participate in the future of standardization work at SPEC
- Discuss how performance can be measured and engineered
- Find out about novel methods and current trends
- Get in contact with leading organizations in the field
- Find potential employees and/or interns
How to join?
- Join SPEC (low-cost yearly membership fee, once per organization)
- Details: http://research.spec.org/faq.html#membershipandmeetings
http://research.spec.org/working-groups/rg-cloud-working-group.html

RG Big Data Working Group
- Differentiating big data systems from traditional large database applications (e.g., OLTP, OLAP)
- Clear goals for the aspects that are important to measure
- Sound rules and metrics to measure performance
- Tools to help instantiate sample big data systems

RG Big Data Working Group
Big Data Benchmark Community (BDBC)
- Original founders: Chaitan Baru (SDSC), Tilmann Rabl (University of Toronto), Milind Bhandarkar (Pivotal/Greenplum), Raghu Nambiar (Cisco), Meikel Poess (Oracle)
- clds.ucsd.edu/bdbc/community
- Refining ideas from BigBench for a SPEC Big Data benchmark
- Published papers in SIGMOD, TPCTC, and the Big Data Journal
- 5 WBDB workshops on Big Data Benchmarking (2012-2014)
- 2 Springer publications

RG Big Data: Targets for 2015
Organizing two workshops:
- 6th WBDB, June 16-17, 2015, Toronto, Canada
- 7th WBDB, September 10-11, 2015, Delhi, India
Publications / conference participation:
- Present the SPEC WG on Big Data at XLDB
- Position paper, targeted for SIGMOD Record
- Survey-style big data paper including the SPEC RG vision on big data, targeted for CACM
Refining ideas introduced in BigBench for a SPEC Big Data benchmark

RG IDS Working Group
Mission statement: dissemination of techniques and tools for the quantitative evaluation of host and network intrusion detection systems, with a focus on intrusion detection systems in dynamic virtualized environments.
Topics of interest:
- Workload generation using attack injection
- IDS evaluation metrics

Attack Injection

RG IDS: Example Results
- A. Milenkoski, B. D. Payne, N. Antunes, M. Vieira, and S. Kounev. Experience Report: An Analysis of Hypercall Handler Vulnerabilities. The 25th IEEE International Symposium on Software Reliability Engineering (ISSRE 2014), Research Track, Italy, 2014.
- A. Milenkoski, B. D. Payne, N. Antunes, M. Vieira, and S. Kounev. hInjector: Injecting Hypercall Attacks for Evaluating VMI-based Intrusion Detection Systems (poster). The 2013 Annual Computer Security Applications Conference (ACSAC 2013), USA, 2013. hInjector available at https://github.com/hinj/hinjector
- A. Milenkoski, M. Vieira, S. Kounev, A. Avritzer, and B. D. Payne. Evaluating Computer Intrusion Detection Systems: A Survey of Common Practices. ACM Computing Surveys. Under revision.
- Collaboration with industry

hInjector Tool
IDS evaluation in virtualized environments:
- Workloads: injection of attacks targeting VMMs; injection of representative hypercall attacks
- Metrics and measurement methodologies: new security-related metrics; attack detection accuracy metrics that take elasticity into account
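The core of attack-injection-based evaluation is comparing what was injected against what the IDS flagged. A minimal sketch of the resulting accuracy metrics (hypothetical code; real evaluations, as in the hInjector work, additionally account for base rates and elasticity effects):

```python
def detection_accuracy(injected, alerts):
    """Toy IDS accuracy metrics from one attack-injection run.
    `injected` and `alerts` are sets of event IDs: the attacks that
    were actually injected and the events the IDS flagged."""
    tp = len(injected & alerts)   # injected attacks that were flagged
    fn = len(injected - alerts)   # injected attacks that were missed
    fp = len(alerts - injected)   # benign events flagged as attacks
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    return precision, recall

injected = {"a1", "a2", "a3", "a4"}
alerts = {"a1", "a2", "b7"}       # IDS caught 2 of 4, plus one false alarm
print(detection_accuracy(injected, alerts))  # precision 2/3, recall 1/2
```

Controlling `injected` is exactly what an injection tool buys the evaluator: without ground truth about which hypercalls were attacks, neither of these metrics can be computed.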

Collaboration with SPEC OSG Power
University of Würzburg (Jóakim von Kistowski, Sam Kounev)
- Accelerator development
- Workload balancing for multi-node systems: prototyping
- Calibrating workloads by core loading, i.e., integrating load balancing into the core architecture of worklet design
- Plans to provide a SERT-based tool for broader testing
- ICPE 2015 industry paper: Analysis of the Influences on Server Power Consumption and Energy Efficiency for CPU-Intensive Workloads
- Tutorial "How to Build a Benchmark" at ICPE 2015, in collaboration with Power, CPU & TPC
- Papers planned for MASCOTS 2015 & ICPE 2016, plus a journal paper
- Will jointly add credibility to the SPEC power work and to the interactions with academia
- ON HOLD: Jóakim and his team managing the open sourcing of worklets (based on the Chauffeur WDK that SPEC is now shipping) as possible future candidates for SERT or SPECpower vNext. Everyone agrees it is a great idea, but the effort required may be too much for the team to allocate.

Example Collaborations with OSG
University of Alberta is a supporting contributor to SPEC OSG:
- Developed a set of inputs for selected benchmarks, to be released soon after the release of SPEC CPU v6
Additional contributions:
- Sanity checks on procedures to generate inputs
- Characterization of benchmark behavior variations
- Suggestions to improve a few benchmarks

Benefits for SPEC
- Involvement of researchers and students in SPEC
- Exchange of ideas and experiences
- Feedback from an academic perspective
- Participation in benchmark development
- Visibility and credibility in the academic community
- Attraction of new members and benchmark projects

Questions? http://research.spec.org/