Performance Evaluation in Database Research: Principles and Experiences
1 Performance Evaluation in Database Research: Principles and Experiences
Stefan Manegold, CWI (Centrum Wiskunde & Informatica), Amsterdam, The Netherlands
2 Disclaimer
There is no single way to do it right, but there are many ways to do it wrong. This is not a mandatory script; it is rather a collection of anecdotes and fairy tales, not always to be taken literally, but all providing some general rules or guidelines on what (not) to do.
3 Outline
1 Planning & conducting experiments
2 Presentation
3 Repeatability
4 Summary
4 Outline
1 Planning & conducting experiments: from micro-benchmarks to real-life applications; choosing the hardware; choosing the software; what and how to measure; how to run; comparison with others; CSI
2 Presentation
3 Repeatability
4 Summary
5 Planning & conducting experiments
- What do you plan to do / analyze / test / prove / show?
- Which data / data sets should be used?
- Which workload / queries should be run?
- Which hardware & software should be used?
- Metrics: What to measure? How to measure? How to compare?
- CSI: How to find out what is going on?
6 Data sets & workloads
Micro-benchmarks, standard benchmarks, real-life applications.
There are no simple general rules for which to use when, but some guidelines for the choice follow.
7 Micro-benchmarks
Definition: a specialized, stand-alone piece of software that isolates one particular piece of a larger system, e.g., a single DB operator (select, join, aggregation, etc.).
8 Micro-benchmarks
Pros:
- Focused on the problem at hand
- Controllable workload and data characteristics: data sets (synthetic & real), data size / volume (scalability), value ranges and distribution, correlation
- Controllable queries: workload size (scalability)
- Allow broad parameter range(s)
- Useful for detailed, in-depth analysis
- Low setup threshold; easy to run
9 Micro-benchmarks
Cons:
- Neglect the larger picture: the contribution of local costs to global/total costs, the impact of the micro-benchmark on real-life applications, and the embedding in the context/system at large
- Generalization of results is difficult; applying the insights to full systems / real-life applications is not obvious
- Metrics are not standardized; comparison?
10 Standard benchmarks
Examples:
- RDBMS, OODBMS, ORDBMS: TPC-{A,B,C,H,R,DS}, OO7, ...
- XML, XPath, XQuery, XUF, SQL/XML: MBench, XBench, XMach-1, XMark, X007, TPoX, ...
- Stream processing: Linear Road, ...
- General computing: SPEC, ...
11 Standard benchmarks
Pros:
- Mimic real-life scenarios
- Publicly available
- Well defined (in theory...)
- Scalable data sets and workloads (if well designed...)
- Metrics well defined (if well designed...)
- Easily comparable (?)
12 Standard benchmarks
Cons:
- Often outdated (standardization takes (too?) long)
- Often compromises
- Often very large and complicated to run
- Limited dataset variation
- Limited workload variation
- Systems are often optimized for the benchmark(s), only!
13 Real-life applications
Pros: there are so many of them; existing problems and challenges.
14 Real-life applications
Cons: there are so many of them; proprietary datasets and workloads.
15 Two types of experiments
Analysis (CSI): investigate (all?) details; analyze and understand behavior and characteristics; find out where the time goes and why!
Publication: sell your story; describe the picture at large; highlight (some) important / interesting details; compare to others.
16 Choosing the hardware
The choice mainly depends on your problem, knowledge, background, taste, etc.: whatever is required by / adequate for your problem.
A laptop might not be the most suitable / representative database server...
17 Choosing the software
Which DBMS to use?
Commercial: require a license; free versions with limited functionality and/or optimization capabilities?; limitations on publishing results; no access to code, optimizers, analysis & tuning tools.
Open source: freely available; no limitations on publishing results; access to the source code.
18 Choosing the software
Other choices depend on your problem, knowledge, background, taste, etc.: operating system, programming language, compiler, scripting languages, system tools, visualization tools.
19 Metrics: What to measure?
Basic: throughput (queries per time); evaluation time: wall-clock ("real"), CPU ("user"), I/O ("system"); server-side vs. client-side; memory and/or storage usage / requirements.
Comparison: scale-up, speed-up.
Analysis: system events & interrupts, hardware events.
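The slide does not spell out the comparison metrics; as a reminder, the usual textbook definitions (with T denoting elapsed time, s the problem size, and N the number of processors or nodes) are roughly:

\[
\text{speed-up}(N) = \frac{T(1,\,s)}{T(N,\,s)}, \qquad
\text{scale-up}(N) = \frac{T(1,\,s)}{T(N,\,N \cdot s)}
\]

Linear speed-up and a scale-up close to 1 are the ideal cases.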
20-23 Metrics: What to measure?
Laptop: 1.5 GHz Pentium M (Dothan), 2 MB L2 cache, 2 GB RAM, 5400 RPM disk; TPC-H (sf = 1); MonetDB/SQL v5.5.0; measured 3rd (& 4th) of four consecutive runs.
[Table of per-query times (milliseconds): server-side user time vs. client-side real time, 3rd and 4th run, together with result sizes (KB/MB) and where the output went (file vs. terminal); the numeric values are not recoverable from the transcription.]
Be aware what you measure!
24 Metrics: How to measure?
Tools, functions and/or system calls to measure time on Unix:
/usr/bin/time, shell built-in time: a command-line tool that works with any executable; reports real, user & sys time (milliseconds); measures the entire process incl. start-up; note: the output format varies!
gettimeofday(): a system function, requires source code; reports a timestamp (microseconds).
25 Metrics: How to measure?
Tools, functions and/or system calls to measure time on Windows:
timeGetTime(), GetTickCount(): system functions, require source code; report a timestamp (milliseconds); resolution can be as coarse as 10 milliseconds.
QueryPerformanceCounter() / QueryPerformanceFrequency(): system functions, require source code; report a timestamp in ticks (with the frequency in ticks per second); resolution can be as fine as 1 microsecond.
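As a minimal sketch of the gettimeofday() approach named above (plain C on a POSIX system; run_query() is a hypothetical placeholder for the code to be measured):

#include <stdio.h>
#include <sys/time.h>

/* hypothetical workload; replace with the code to be measured */
static void run_query(void) { /* ... */ }

int main(void)
{
    struct timeval t0, t1;

    gettimeofday(&t0, NULL);          /* timestamp before the measured code */
    run_query();
    gettimeofday(&t1, NULL);          /* timestamp after the measured code */

    /* elapsed wall-clock time in microseconds */
    long long usec = (t1.tv_sec - t0.tv_sec) * 1000000LL
                   + (t1.tv_usec - t0.tv_usec);
    printf("elapsed: %lld usec\n", usec);
    return 0;
}

Note that this yields wall-clock ("real") time only; for user and system CPU time one would use getrusage() or /usr/bin/time.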
26 Metrics: How to measure?
Use the timings provided by the tested software (DBMS):
IBM DB2: db2batch
Microsoft SQL Server: GUI and system variables
PostgreSQL: postgresql.conf: log_statement_stats = on; log_min_duration_statement = 0; log_duration = on
MonetDB/XQuery & MonetDB/SQL: mclient -lxquery -t; mclient -lsql -t; (PROFILE | TRACE) select ...
27 Metrics: How to measure?
Example output of mclient -lxquery -t -s: reports Trans, Shred, Query, Print and Timer times in msec.
Example output of mclient -lsql -t with PROFILE select ...: reports the result table (name, type, length, value) followed by per-statement times ("#times real 62, user 0, system 0") and a Timer line in msec.
[The remaining numbers of the sample output are not recoverable from the transcription.]
28-29 How to run experiments
"We run all experiments in warm memory."
30 hot vs. cold
Depends on what you want to show / measure / analyze. There is no formal definition, but common sense:
Cold run: a run of the query right after the DBMS is started, with no (benchmark-relevant) data preloaded into the system's main memory, neither by the DBMS nor in filesystem caches. Such a clean state can be achieved via a system reboot, or by running an application that accesses sufficient (benchmark-irrelevant) data to flush filesystem caches, main memory, and CPU caches.
Hot run: a run of a query such that as much (query-relevant) data as possible is available as close to the CPU as possible when the measured run starts. This can, e.g., be achieved by running the query (at least) once before the actual measured run starts.
Be aware of and document what you do / choose.
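The slide does not prescribe a mechanism; one way to approximate the "clean state" for cold runs on Linux is sketched below (the drop_caches interface requires root and clears filesystem caches, not CPU caches; the server control command and data path are hypothetical placeholders):

# stop the server so it cannot keep query-relevant data cached itself
monetdbd stop /path/to/dbfarm         # hypothetical server / path
sync                                  # flush dirty pages to disk
echo 3 > /proc/sys/vm/drop_caches     # drop page cache, dentries and inodes (as root)
monetdbd start /path/to/dbfarm
# the first query issued now approximates a cold run; repeat it for hot runs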
31-34 hot vs. cold & user vs. real time
Laptop: 1.5 GHz Pentium M (Dothan), 2 MB L2 cache, 2 GB RAM, 5400 RPM disk; TPC-H (sf = 1); MonetDB/SQL v5.5.0; measured last of three consecutive runs.
[Table of per-query user and real times (milliseconds) for cold vs. hot runs; the numeric values are not recoverable from the transcription.]
Be aware what you measure!
35-37 Of apples and oranges
Once upon a time at CWI... Two colleagues A & B each implemented one version of an algorithm, A the old version and B the improved new version. They ran identical experiments on identical machines, each for his own code. Though both agreed that B's new code should be significantly better, the results were consistently worse. They tested, profiled, analyzed, argued, wondered, fought for several days, and eventually found out that A had compiled with optimization enabled, while B had not...
38 Of apples and oranges
DBG: configure --enable-debug --disable-optimize --enable-assert
     CFLAGS = "-g [-O0]"
OPT: configure --disable-debug --enable-optimize --disable-assert
     CFLAGS = "-O6 -fomit-frame-pointer -finline-functions -malign-loops=4 -malign-jumps=4 -malign-functions=4 -fexpensive-optimizations -funroll-all-loops -funroll-loops -frerun-cse-after-loop -frerun-loop-opt -DNDEBUG"
39 Of apples and oranges
[Bar chart: relative execution time DBG/OPT per TPC-H query.]
40-44 Of apples and oranges
- Compiler optimization: up to a factor 2 performance difference
- DBMS configuration and tuning: a factor x performance difference (2 <= x <= 10?)
- Self-* is still research; default settings are often too conservative
- Do you know all systems you use/compare equally well?
"Our problem-specific, hand-tuned prototype X outperforms an out-of-the-box installation of a full-fledged off-the-shelf system Y; in X, we focus on pure query execution time, omitting the times for query parsing, translation, optimization and result printing; we did not manage to do the same for Y."
Absolutely fair comparisons are virtually impossible. But: be at least aware of the crucial factors and their impact, and document accurately and completely what you do.
45 Do you know what happens?
Simple in-memory scan: SELECT MAX(column) FROM table
[Chart: elapsed time per iteration in nanoseconds, split into Memory and CPU cost, for the following systems (years not recoverable):]
system: Sun LX, Sun Ultra, SunUltra, DEC Alpha, Origin2000
CPU type: Sparc, UltraSparc, UltraSparcII, Alpha, R12000
CPU speed: 50 MHz, 200 MHz, 296 MHz, 500 MHz, 300 MHz
46-49 Do you know what happens?
Simple in-memory scan: SELECT MAX(column) FROM table
- No disk I/O involved
- Up to 10x improvement in CPU clock speed
- Yet hardly any performance improvement!?
Research: always question what you see!
Standard profiling (e.g., gcc -pg + gprof) does not reveal more (in this case).
Need to dissect CPU & memory access costs: use hardware performance counters to analyze cache hits, cache misses & memory accesses (VTune, oprofile, perfctr, perfmon2, PAPI, PCL, etc.).
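As a rough sketch of how such counters can be read with one of the libraries listed (PAPI, low-level API): event availability depends on the CPU, error handling is omitted, and run_query() is again a hypothetical placeholder for the measured code.

#include <stdio.h>
#include <papi.h>

static void run_query(void) { /* hypothetical workload to be measured */ }

int main(void)
{
    int event_set = PAPI_NULL;
    long long counters[2];

    if (PAPI_library_init(PAPI_VER_CURRENT) != PAPI_VER_CURRENT)
        return 1;
    PAPI_create_eventset(&event_set);
    PAPI_add_event(event_set, PAPI_TOT_CYC);   /* total CPU cycles */
    PAPI_add_event(event_set, PAPI_L2_TCM);    /* L2 total cache misses */

    PAPI_start(event_set);
    run_query();
    PAPI_stop(event_set, counters);

    printf("cycles: %lld, L2 misses: %lld\n", counters[0], counters[1]);
    return 0;
}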
50 Find out what happens!
Simple in-memory scan: SELECT MAX(column) FROM table
[Same chart as on slide 45: elapsed time per iteration (nanoseconds) per system, now dissected into Memory and CPU cost.]
51 Find out what happens!
Use the info provided by the tested software (DBMS):
IBM DB2: db2expln
Microsoft SQL Server: GUI and system variables
MySQL, PostgreSQL: EXPLAIN select ...
MonetDB/SQL: (PLAN | EXPLAIN | TRACE) select ...
52 Find out what happens!
Use profiling and monitoring tools:
- gcc -pg + gprof: reports the call tree, time per function and time per line; requires re-compilation and static linking
- valgrind --tool=callgrind + kcachegrind: reports the call tree, times, instructions executed and cache misses; thread-aware; does not require (re-)compilation; simulation-based, slows down execution by up to a factor 100
- Hardware performance counters to analyze cache hits, cache misses & memory accesses: VTune, oprofile, perfctr, perfmon2, PAPI, PCL, etc.
- System monitors: ps, top, iostat, ...
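Typical command lines for the two profilers above, as a sketch (myprog and its arguments are placeholders):

# gprof: compile & link with profiling support, run, then inspect gmon.out
gcc -pg -O2 -o myprog myprog.c
./myprog <args>
gprof ./myprog gmon.out > profile.txt

# callgrind: no recompilation needed, but expect a large slowdown
valgrind --tool=callgrind ./myprog <args>
kcachegrind callgrind.out.<pid>       # interactive browsing of the call tree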
53 Find out what happens!
TPC-H Q1 (sf = 1), AMD CPU at 1533 MHz, 1 GB RAM.
[Side-by-side listings: MySQL gprof trace vs. MonetDB/MIL trace; the trace contents are not recoverable from the transcription.]
54 Outline
1 Planning & conducting experiments
2 Presentation: guidelines, mistakes
3 Repeatability
4 Summary
55-57 Graphical presentation of results
We all know: "A picture is worth a thousand words." Er, maybe not all pictures...
[Example of an illegible plot, borrowed from T. Grust's slides at the VLDB 2007 panel.]
58-61 Guidelines for preparing good graphic charts
Require minimum effort from the reader (not the minimum effort from you). Try to be honest: how would you like to see it?
[Three example plots of response time vs. number of users for systems A, B and C, with increasingly readable labeling.]
62-63 Guidelines for preparing good graphic charts
Maximize information: try to make the graph self-sufficient.
- Use keywords in place of symbols to avoid a join in the reader's brain
- Use informative axis labels: prefer "Average I/Os per query" to "Average I/Os" to "I/Os"
- Include units in the labels: prefer "CPU time (ms)" to "CPU time"
Use commonly accepted practice: present what people expect.
- Usually axes begin at 0, the factor is plotted on x, the result on y
- Usually scales are linear, increase from left to right, and divisions are equal
- Use exceptions as necessary
64-66 Guidelines for preparing good graphic charts
Minimize ink: present as much information as possible with as little ink as possible. Prefer the chart that gives the most information out of the same data.
[Two example charts over the day of the week: one plotting Availability, one plotting Unavailability.]
67 Reading material
Edward Tufte: The Visual Display of Quantitative Information
68 Common presentation mistakes
Presenting too many alternatives on a single chart. Rules of thumb, to override with good reason:
- A line chart should be limited to 6 curves
- A column or bar chart should be limited to 10 bars
- A pie chart should be limited to 8 components
- Each cell in a histogram should have at least five data points
69-70 Common presentation mistakes
Presenting many result variables on a single chart, commonly done to fit into the available page count :-(
[Example chart plotting utilization, throughput and response time together against the number of users.] Huh?
71-74 Common presentation mistakes
Using symbols in place of text: a plot labeled only with symbols (R; µ=1, µ=2, µ=3; λ) forces the reader to translate them into their meanings (Response time; 1 job/sec, 2 jobs/sec, 3 jobs/sec; Arrival rate).
The human brain is a poor join processor, and humans get frustrated by computing joins.
75-77 Common presentation mistakes
Changing the graphical layout of a given curve from one figure to another.
"What do you mean, my graphs are not legible?"
78-81 Pictorial games
"MINE is better than YOURS!"
[Two bar charts comparing MINE and YOURS (values up to 2600): depending on how the y-axis is scaled, the same numbers look either dramatically different or nearly equal. A-ha.]
Recommended layout: let the useful height of the graph be 3/4 of its useful width. [Same chart, y-axis from 0 to 2600, drawn with this aspect ratio.]
82-83 Pictorial games
Plotting random quantities without confidence intervals.
[Two charts of MINE vs. YOURS, without and with confidence intervals.]
Overlapping confidence intervals sometimes mean the two quantities are statistically indistinguishable.
84-85 Pictorial games
Manipulating cell size in histograms.
[Two histograms of the same response-time data: one with cells [0,2), [2,4), [4,6), [6,8), [8,10), [10,12), one with cells [0,6), [6,12).]
Rule of thumb: each cell should have at least five points. This is not sufficient to uniquely determine what one should do.
86-88 Pictorial games: gnuplot & LaTeX
[Two renderings of the DBG/OPT relative-execution-time plot over the TPC-H queries, with different aspect ratios.]
default: set size ratio 0 1,1
better: set size ratio 0 0.5,...
Rule of thumb for papers: if the width of the plot is x\textwidth, use set size ratio 0 x*1.5,y
89-91 Specifying hardware environments
"We use a machine with 3.4 GHz." 3400x? Under-specified!
92-93 Specifying hardware environments
cat /proc/cpuinfo
processor       : 0
vendor_id       : GenuineIntel
cpu family      : 6
model           : 13
model name      : Intel(R) Pentium(R) M processor 1.50GHz    <- !
stepping        : 6
cpu MHz         :                                            <- throttled down by speed stepping!
cache size      : 2048 KB
fdiv_bug        : no
hlt_bug         : no
f00f_bug        : no
coma_bug        : no
fpu             : yes
fpu_exception   : yes
cpuid level     : 2
wp              : yes
flags           : fpu vme de pse tsc msr mce cx8 mtrr pge mca cmov pat clflush dts acpi mmx fxsr sse sse2 ss tm pbe up bts est tm2
bogomips        :
clflush size    : 64
[The reported cpu MHz and bogomips values are not recoverable from the transcription.]
94-95 Specifying hardware environments
/sbin/lspci -v
00:00.0 Host bridge: Intel Corporation 82852/82855 GM/GME/PM/GMV Processor to I/O Controller (rev 02)
        Flags: bus master, fast devsel, latency 0
        Memory at <unassigned> (32-bit, prefetchable)
        Capabilities: <access denied>
        Kernel driver in use: agpgart-intel
...
01:08.0 Ethernet controller: Intel Corporation 82801DB PRO/100 VE (MOB) Ethernet Controller (rev 83)
        Subsystem: Benq Corporation Unknown device 5002
        Flags: bus master, medium devsel, latency 64, IRQ 10
        Memory at e... (32-bit, non-prefetchable) [size=4K]
        I/O ports at c000 [size=64]
        Capabilities: <access denied>
        Kernel driver in use: e100
        Kernel modules: e100
/sbin/lspci -v | wc: 151 lines, 861 words, 6663 characters
Over-specified!
96 Specifying hardware environments
CPU: vendor, model, generation, clock speed, cache size(s), e.g., 1.5 GHz Pentium M (Dothan), 32 KB L1 cache, 2 MB L2 cache
Main memory: size, e.g., 2 GB RAM
Disk (system): size & speed, e.g., 120 GB laptop ATA 5400 RPM, or 1 TB striped RAID-0 system (5x 200 GB S-ATA 7200 RPM)
Network (interconnection): type, speed & topology, e.g., 1 Gbit shared Ethernet
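A small sketch of how this level of detail can be collected on a Linux machine with standard tools (the device name /dev/sda is a placeholder; adjust to the machine at hand):

# CPU model, clock speed and cache size
grep -m1 "model name" /proc/cpuinfo
grep -m1 "cache size" /proc/cpuinfo

# main memory size
grep MemTotal /proc/meminfo

# disk model and partition sizes (hypothetical device name)
sudo hdparm -I /dev/sda | grep -i "model number"
cat /proc/partitions

# kernel / OS version
uname -a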
97 Specifying software environments
Product names, exact version numbers, and/or the sources they were obtained from.
98 Outline
1 Planning & conducting experiments
2 Presentation
3 Repeatability: portable parameterizable experiments; test suite; documenting your experiment suite
4 Summary
99-101 Making experiments repeatable
Purpose: another human equipped with the appropriate software and hardware can repeat your experiments.
- Your supervisor / your students
- Your colleagues
- Yourself, 3 months later when you have a new idea
- Yourself, 3 years later when writing the thesis or answering requests for that journal version of your conference paper
- Future researchers (you get cited!)
Making experiments repeatable means:
1 Making experiments portable and parameterizable
2 Building a test suite and scripts
3 Writing instructions
102-109 Making experiments portable
- Try to use not-so-exotic hardware
- Try to use free or commonly available tools (databases, compilers, plotters, ...)
- Clearly, scientific needs go first (joins on graphics cards; smart card research; energy consumption studies, ...)
You may omit:
- Using Matlab as the driving platform for the experiments
- 20-year-old software that only works on an old SUN and is now unavailable (code maintenance: if you really love your code, you may even maintain it)
- A 4-year-old library that is no longer distributed and you no longer have (idem)
- Using /usr/bin/time to time execution, parsing the output with perl, dividing by zero
110-112 Which abstract do you prefer?
Abstract (Take 1): We provide a new algorithm that consistently outperforms the state of the art.
Abstract (Take 2): We provide a new algorithm that, on a Debian Linux machine with a 4 GHz CPU, 60 GB disk, DMA, 2 GB main memory and our own brand of system libraries, consistently outperforms the state of the art.
There are obvious, undisputed exceptions.
113-114 Making experiments parameterizable
This is huge.
Parameters your code may depend on:
- credentials (OS, database, other)
- values of important environment variables (usually one or two)
- various paths and directories (see: environment variables)
- where the input comes from
- switches (pre-process, optimize, prune, materialize, plot, ...)
- where the output goes
115-116 Making experiments parameterizable
Purpose: have a very simple means to obtain a test for the values f1 = v1, f2 = v2, ..., fk = vk.
Many tricks; very simple ones:
- argc / argv: specific to each class's main
- Configuration files
- Java Properties pattern + command-line arguments
117 Making experiments parameterizable
Configuration files:
- Omnipresent in large-scale software
- Crucial if you hope for serious installations: see the GNU software install procedure
- Decide on a specific relative directory, fix the syntax
- Report a meaningful error if the configuration file is not found
- Pro: human-readable even without running code
- Con: the values are read when the process is created
118 Making experiments parameterizable
java.util.Properties: flexible management of parameters for Java projects; defaults + overriding.
How does it go: Properties extends Hashtable; Properties is a map of (key, value) string pairs, e.g. {"dataDir", "./data"}, {"doStore", "true"}.
Methods: getProperty(String s), setProperty(String s1, String s2), load(InputStream is), store(OutputStream os, String comments), loadFromXML(...), storeToXML(...)
119 Using java.util.Properties
One possible usage:

class Parameters {
    static Properties prop;
    static String[][] defaults = { { "dataDir", "./data" },
                                   { "doStore", "true" } };

    static void init() {
        prop = new Properties();
        for (int i = 0; i < defaults.length; i++)
            prop.put(defaults[i][0], defaults[i][1]);
    }

    static void set(String s, String v) { prop.put(s, v); }

    static String get(String s) {
        // error if prop is null!
        return prop.getProperty(s);
    }
}
120 Using java.util.Properties
- When the code starts, it calls Parameters.init(), loading the defaults
- The defaults may be overridden later from the code by calling set
- The properties are accessible to all the code
- The properties are stored in one place
- Simple serialization/deserialization mechanisms may be used instead of constant defaults
121 Command-line arguments and java.util.Properties
Better init method:

class Parameters {
    static Properties prop;
    ...
    static void init() {
        prop = new Properties();
        for (int i = 0; i < defaults.length; i++)
            prop.put(defaults[i][0], defaults[i][1]);
        Properties sysprops = System.getProperties();
        prop.putAll(sysprops);   // copy sysprops into (over) prop!
    }
}

Call with: java -DdataDir=./test -DdoStore=false pack.AnyClass
122-123 Making your code parameterizable
The bottom line: you will want to run it in different settings:
- with your or the competitor's algorithm or special optimization
- on your desktop or your laptop
- with a local or remote MySQL server
Make it easy to produce a point. If it is very difficult to produce a new point, ask questions.
You may omit coding like this: "The input data set files should be specified in source file util.GlobalProperty.java."
124-125 Building a test suite
You already have: designs; an easy way to get any measured point.
You need:
- A suited directory structure (e.g.: source, bin, data, res, graphs)
- Control loops to generate the points needed for each graph, under res/, and possibly to produce the graphs under graphs/ (see the sketch below)
- Even Java can be used for the control loops, but... it does pay off to know how to write a loop in shell/perl etc.
You may omit coding like this: "Change the value of the delta variable in distribution.DistFreeNode.java into 1, 5, 15, 20 and so on."
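A minimal sketch of such a control loop in shell; the experiment driver run_experiment.sh, its parameters and the exact file names are hypothetical and follow the source/bin/data/res/graphs layout suggested above:

#!/bin/sh
# sweep two parameters and collect one result file per graph under res/
for m in 1 2 4; do
    for n in 5 10 20; do
        out="res/results-m${m}-n${n}.csv"
        : > "$out"                                # truncate previous results
        for sf in 0.1 1 10; do                    # scale factors on the x-axis
            t=$(./bin/run_experiment.sh -m "$m" -n "$n" -sf "$sf")
            echo "$sf $t" >> "$out"
        done
    done
done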
126-127 Automatically generated graphs
You have: files containing numbers characterizing the parameter values and the results; basic shell skills.
You need: graphs.
Most frequently used solutions: based on Gnuplot; based on Excel or an OpenOffice clone.
Other solutions: R; Matlab (remember portability).
128-131 Automatically generating graphs with Gnuplot
1 Data file results-m1-n5.csv (two columns: scale factor and execution time; the sample contents are not reproduced in the transcription).
2 Gnuplot command file plot-m1-n5.gnu for plotting this graph:

set data style linespoints
set terminal postscript eps color
set output "results-m1-n5.eps"
set title "Execution time for various scale factors"
set xlabel "Scale factor"
set ylabel "Execution time (ms)"
plot "results-m1-n5.csv"

3 Call gnuplot plot-m1-n5.gnu
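To tie this to the control loop sketched earlier, one possible way to generate one such command file per result file and run gnuplot on it (file names follow the previous slides; note that "set data style" is the old gnuplot syntax used there, newer versions write "set style data"):

#!/bin/sh
for f in res/results-*.csv; do
    base=$(basename "$f" .csv)                    # e.g. results-m1-n5
    gnu="graphs/plot-${base#results-}.gnu"        # e.g. graphs/plot-m1-n5.gnu
    cat > "$gnu" <<EOF
set data style linespoints
set terminal postscript eps color
set output "graphs/${base}.eps"
set xlabel "Scale factor"
set ylabel "Execution time (ms)"
plot "$f"
EOF
    gnuplot "$gnu"
done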
132-135 Automatically producing graphs with Excel
1 Create an Excel file results-m1-n5.xls with the column labels "Scale factor" and "Execution time" in row 1.
2 Insert in the area B2-C3 a link to the file results-m1-n5.csv.
3 Create in the .xls file a graph out of the cells A1:B3; choose the layout, colors, etc.
4 When the .csv file is created, the graph is automatically filled in.
Graph generation
You may want to avoid working like this:
"In avgs.out, the first 15 lines correspond to xyzt, the next 15 lines correspond to xyzt, the next 15 lines correspond to Xyzt, the next 15 lines correspond to xyzt, the next 15 lines correspond to XyzT, the next 15 lines correspond to XYZT, and the next 15 lines correspond to XyZT. In each of these sets of 15, the numbers correspond to queries 1.1, 1.2, 1.3, 1.4, 2.1, 2.2, 2.3, 2.4, 3.1, 3.2, 3.3, 3.4, 4.1, 4.2, and 4.3."
either because you want to do clean work, or because you don't want this to happen:
Why you should take care to generate your own graphs
File avgs.out contains average times over three runs, written with "." as the decimal separator (columns a and b).
Copy-paste into OpenOffice on fc8, which expects "," as the decimal separator: the values are misread, and the graph doesn't look good :-(
This is hard to figure out when you have to produce 20 such graphs by hand and most of them look OK.
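Two small sketches of how to defuse this kind of trap at the source; the file names and configuration labels are illustrative assumptions, not part of the original setup. First, pin the locale when writing the numbers, and convert explicitly if a tool on the target machine insists on "," decimals:

#!/bin/sh
# Pin the locale so every number written below uses "." as the decimal
# separator, no matter which machine runs the script.
export LC_ALL=C

# Illustrative averaging step: raw-times.txt holds three run times per
# line; avgs.out gets one average per line.
awk '{ printf "%.2f\n", ($1 + $2 + $3) / 3 }' raw-times.txt > avgs.out

# If a spreadsheet configured for "," decimals has to read the file,
# produce an explicit converted copy instead of relying on copy-paste.
sed 's/\./,/' avgs.out > avgs-comma.csv

Second, the line-counting scheme quoted earlier ("the first 15 lines correspond to ...") disappears if every line carries its own labels; the configuration names below are copied from that quote, and the raw file layout is assumed:

for conf in xyzt Xyzt XyzT XYZT XyZT; do
    for q in 1.1 1.2 1.3 1.4 2.1 2.2 2.3 2.4 3.1 3.2 3.3 3.4 4.1 4.2 4.3; do
        # Average the three runs stored in raw-$conf-q$q.txt (one time per line).
        awk -v c="$conf" -v q="$q" \
            '{ s += $1 } END { if (NR) printf "%s %s %.2f\n", c, q, s / NR }' \
            "raw-$conf-q$q.txt"
    done
done > avgs-labelled.out

With labels in the file, plotting scripts can select rows by name instead of by position, and a reordered or truncated file is noticed immediately.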
Documenting your experiment suite
Very easy if the experiments are already portable and parameterizable, and if the graphs are generated automatically. Specify:
1 What the installation requires; how to install
2 For each experiment
  1 Extra installation, if any
  2 The script to run
  3 Where to look for the graph
  4 How long it takes
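As a sketch only, with every file name, figure, and duration invented for illustration, the top of such a description might look like this:

Requirements:  Linux, gcc, gnuplot; ca. 20 GB free disk space
Install:       ./install.sh  (builds the system and generates the data sets)

Experiment 1:  operator scale-up
  Extra installation:  none
  Script to run:       ./run-exp1.sh
  Graph:               graphs/exp1-scaleup.eps
  Duration:            ca. 2 hours

Experiment 2:  comparison against the baseline system
  Extra installation:  see exp2/INSTALL
  Script to run:       ./run-exp2.sh
  Graph:               graphs/exp2-comparison.eps
  Duration:            ca. 30 minutes

A reader (or a repeatability reviewer) can then reproduce any single graph without reverse-engineering the whole suite.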
Summary & conclusions
Good and repeatable performance evaluation and experimental assessment require no fancy magic, but rather solid craftsmanship.
Proper planning helps to keep you from getting lost and ensures repeatability.
Repeatable experiments simplify your own work (and help others to understand it better).
There is no single way to do it right. There are many ways to do it wrong. We provided some simple rules and guidelines on what (not) to do.
