NUnit 2.6.4 Study
Unit Test Study
Version: 1.3
Status as of: 03.03.2016 09:44:00
Author: Ing. Jaroslav Klimes
Document-ID:
Classification: Internal
Software Quality Lab. All rights reserved. Print date: 03.03.2016
Contents

Document Information
  General
  Document History
  Document Quality Assurance
  Purpose and contents of this document
1. Introduction
2. Configuration
3. Test Specification
  3.1. Configuring your test project in MS Windows and Visual Studio
  3.2. Creating Test Classes
  3.3. Creating Test Methods
  3.4. Working with Assertions
    3.4.1. Classic Assertions
    3.4.2. Constraint-based Assertions
  3.5. Working with attributes
4. Parameterized Tests
5. Test Execution
  5.1. NUnit console
  5.2. NUnit GUI
  5.3. PNUnit runner
  5.4. Third-party runners and other extensions
6. Extensibility
  6.1. Custom attributes
  6.2. Custom categories
  6.3. Custom constraints
  6.4. Addins
7. Conclusion
Appendix
  List of pictures
Document information

General
Relation to other documents: This document is a part of the Unit Test Study by Software Quality Lab.
Distribution:

Document History
Version | Status   | Date       | Name              | Reason for change / Notes
1.1     | extended | 2011-06-09 | Liljana Pendovska | Extended the study with parameterized tests
1.3     | extended | 2015-05-07 | Bernhard Barbisch | Updated to NUnit version 2.6.4

Document Quality Assurance
Role     | Name       | Availability (Org.+Tel.+e-mail) | Activity | Date
Author   | Klimes     |                                 | created  | 2010-09-13
Reviewer | Groiss     |                                 | reviewed | 2010-09-15
Reviewer | Hochrainer |                                 | reviewed | 2010-09-27
Release  | Hochrainer |                                 | released | 2010-10-11
Reviewer | Plasser    |                                 | reviewed | 2015-05-11
Release  | Plasser    |                                 | released | 2015-05-11
Release  | Plasser    |                                 | released | 2015-11-01

Purpose and contents of this document
This document contains an objective analysis of chosen unit test tools and should serve as decision support in the process of acquiring a new unit test tool.
1. Introduction

The open-source testing framework NUnit was developed to perform unit tests in all .NET programming languages, such as C#, VB.NET, C++/CLI or J#. The framework was developed in response to the JUnit framework on the Java platform, with which it shares its basic principles. The original concept of both frameworks is based on the SUnit framework, developed by the American computer scientist Kent Beck in Smalltalk. NUnit has been completely redesigned over time to take advantage of many .NET language features, and its latest versions are written entirely in C#.
2. Configuration

If you don't have NUnit already embedded in your IDE, as is the case e.g. with the freeware IDE SharpDevelop, you will have to download the framework package from http://www.nunit.org. The latest stable release at the time of this study's creation is 2.6.4. Apart from the standard package suited for MS .NET, source packages for contributors or tweakers are also available. The installation program places a number of shortcuts in the start menu that run NUnit under .NET or Mono, a platform compatible with Microsoft's .NET framework.

There are several ways of executing NUnit tests, the most user-friendly of which is the GUI runner. Once the GUI version of the NUnit runner is started, it opens a standalone graphical user interface that can be used to manage and run tests (Picture 1: The embedded integrity test in the NUnit GUI).

First, the tester opens an appropriate test file in order to commence the test execution. The allowed extensions for a file containing NUnit tests are *.exe, *.dll and *.nunit; the latter is an NUnit project file that does not contain any test classes itself, it merely points at another executable or library. The executables and libraries (*.exe and *.dll) containing test classes must be compiled before they can be executed, as the test specification takes place in the development environment.

NUnit stores its configuration in several types of config files. There is a global configuration valid for all tests; it may, however, be overridden by a testset-specific or single-test-specific config file. Configuration files hold e.g. the settings for thread priorities or standard timeouts. When NUnit is run through the runners included in the package, the configuration files nunit.exe.config and nunit-console.exe.config control its operation.
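To illustrate the *.nunit project file format described above, here is a minimal sketch; the assembly name MyTests.dll and the path are assumed for the example:

```xml
<NUnitProject>
  <Settings activeconfig="Debug" />
  <Config name="Debug" binpathtype="Auto">
    <!-- The project file contains no tests itself; it only references
         the compiled assemblies that do. -->
    <assembly path="bin\Debug\MyTests.dll" />
  </Config>
</NUnitProject>
```

Loading such a file in a runner loads the referenced assemblies as if they had been opened directly.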
The NUnit installation contains several pre-prepared test projects that check the integrity of the installed framework and the functionality of its test features, and pinpoint possible problems. It is recommended to run these integrity tests after the installation; they are loaded automatically the first time the test GUI is started, as shown in Picture 1:

Picture 1: The embedded integrity test in the NUnit GUI
3. Test specification

3.1. Configuring your test project in MS Windows and Visual Studio

In general there are two ways of implementing tests for your units: either you implement the test code directly in the assemblies with the tested classes, or you create a standalone testing assembly, which is particularly suitable for larger projects. In both cases a few adjustments have to be made to the project in order to support testing through NUnit:

Additions to the project references: A reference has to be made to the nunit.framework.dll library. It is stored under the NUnit installation folder in the \bin\framework subfolder.

Adjustments in the code: In each namespace containing test classes you have to include the following statement in order to make all test methods work properly:

    using NUnit.Framework;

3.2. Creating Test Classes

When creating tests, your testing classes must have a [TestFixture] attribute. This attribute can be inherited and therefore can also be added to a base class (if the abstract fixture pattern is being implemented).

    [TestFixture]
    public abstract class AbstractFixtureBase
    {
    }

    public class TestFixture : AbstractFixtureBase
    {
    }

Additionally, parameterized and generic fixtures are supported as well.

    [TestFixture("a", 1)]
    [TestFixture(2, true)]
    public class ParameterizedFixture
    {
        public ParameterizedFixture(string s, int i) { }
        public ParameterizedFixture(int i, bool b) { }
    }

    [TestFixture(typeof(ArrayList))]
    public class GenericFixture<T> where T : ICollection, new()
    {
    }

3.3. Creating Test Methods

The testing methods themselves can be placed inside the fixture class and must be marked with one of the following attributes in order to work properly:
[Test]: Defines a simple test method (can be combined with the attribute [TestCaseSource]).

    [Test]
    public void TestMethod()
    {
        // test logic
    }

[TestCase]: Defines the values for one execution of a parameterized test (can be applied multiple times).

    [TestCase(1, "a")] // the first test case
    [TestCase(2, "b")] // the second test case
    public void TestMethod(int i, string s)
    {
        // test logic...
    }

[TestCaseSource]: Defines the values for the executions of a parameterized test through a separate source type.

    private class TestValuesType : IEnumerable
    {
        public IEnumerator GetEnumerator()
        {
            yield return new TestCaseData(1, "a");
            yield return new TestCaseData(2, "b");
        }
    }

    [Test]
    [TestCaseSource(typeof(TestValuesType))]
    public void TestMethod(int i, string s)
    {
        // test logic...
    }

The methods contain the actual test code, using assertions about various properties of the tested units.

3.4. Working with Assertions

Assertions can be applied to the following things in NUnit:

Equality: Checks whether numbers or single-dimensioned arrays are equal.
Identity: Checks whether two objects are identical or one is part of the other.
Condition: Checks whether a condition is met or whether objects are empty/null.
Comparison: Arithmetic comparison (<, <=, >, >=).
Type: Checks whether an object is an instance of a type or the type is assignable.
Exception: Checks whether a specified code snippet throws a certain exception type.
Utility: Provides a way of better control over the test process through special assertions.
String: Compares strings or substrings; allows the usage of regular expressions as well.
Collection: Checks whether two collections contain the same objects (ordered/unordered).
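The assertion categories above can be sketched in a single test method; this is an illustrative example (the fixture and method names are assumptions), mixing the classic static Assert methods with the constraint-based Assert.That style:

```csharp
using System.Collections.Generic;
using NUnit.Framework;

[TestFixture]
public class AssertionExamples
{
    [Test]
    public void DemonstratesAssertionCategories()
    {
        // Equality, classic style
        Assert.AreEqual(4, 2 + 2);

        // Equality, constraint-based style
        Assert.That(2 + 2, Is.EqualTo(4));

        // Condition: null check
        Assert.IsNull(null as string);

        // Comparison
        Assert.That(5, Is.GreaterThan(3));

        // Type: instance-of check
        Assert.IsInstanceOf<string>("abc");

        // Exception: the delegate must throw the given exception type
        Assert.Throws<KeyNotFoundException>(
            () => { var x = new Dictionary<string, int>()["missing"]; });

        // String: substring check
        StringAssert.Contains("Unit", "NUnit");

        // Collection: same elements, order ignored
        CollectionAssert.AreEquivalent(new[] { 1, 2 }, new[] { 2, 1 });
    }
}
```

A failing assertion throws an AssertionException, which the runner reports as a failed test; all assertions after the failing one are skipped.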
Thank you very much for downloading a public preview of our well-known and highly appreciated tool studies. If you are interested in reading the whole document, just request the study you want from our comprehensive set of available studies. For further information on our tool studies, visit our web site at www.software-quality-lab.com or contact info@software-quality-lab.com.