Effective feedback from quality tools during development
EuroSTAR 2004
Daniel Grenner, Enea Systems
Current state
- Project summary of known code issues
- Individual list of known code issues
- Views targeted at individuals / roles
- Information updated daily (at least)
- Well-known project status
This presentation will show you how this is done!
Original state
- 20+ developers
- Not much Java/J2EE experience
- Varying OO experience
- New language, tools, and processes
- How should quality be ensured?
Initial development practices
- Coding standards
- Nightly build
- Unit tests
- Code reviews
- Automated code checks
- Support from mentors
Coding standards
- Improve maintainability
- Make it easier to spot strange code
- Use a tool to format code
Nightly build
- Build the system
- Create documentation
- Generate reports
Unit tests
- JUnit was the obvious choice
- Cactus chosen for in-container tests
- Both fit well into the environment
- Ant tasks useful
- Unit testing course for all developers
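The Ant integration mentioned above can be sketched with Ant's standard junit task; the target names, paths, and classpath reference below are assumptions, not the project's actual build file:

```xml
<!-- Sketch of running JUnit from Ant; names and paths are assumptions. -->
<target name="unit-test" depends="compile">
  <junit printsummary="on" haltonfailure="no" failureproperty="tests.failed">
    <classpath refid="test.classpath"/>
    <!-- XML output can later be merged into the combined report -->
    <formatter type="xml"/>
    <batchtest todir="${reports.dir}">
      <fileset dir="${test.src.dir}" includes="**/*Test.java"/>
    </batchtest>
  </junit>
</target>
```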
Code reviews
- All code should be reviewed by a mentor
- JavaDoc @reviewer tag used to find unreviewed code
- Work-intensive but effective
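Finding unreviewed code with the @reviewer tag can be done with a simple search; this is one possible approach (the file names and layout below are made up for illustration):

```shell
# Hypothetical example: list Java files that lack the @reviewer JavaDoc tag.
mkdir -p /tmp/review-demo/src
cat > /tmp/review-demo/src/Done.java <<'EOF'
/**
 * Order handling.
 * @author jdoe
 * @reviewer mentor1
 */
public class Done {}
EOF
cat > /tmp/review-demo/src/Pending.java <<'EOF'
/**
 * Invoice handling.
 * @author jdoe
 */
public class Pending {}
EOF
# -L lists files WITHOUT a match, i.e. files still waiting for review.
grep -rL '@reviewer' /tmp/review-demo/src --include='*.java'
```

Here only Pending.java is printed, since Done.java already carries the tag.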
Automated code checks
- Let a tool do the repetitive tasks
- Reduces review work
- Allows reviews to focus on difficult problems
- Small amount of work, large payoff
Selecting a tool for automated checks
- Useful initial checks
- Configurable (modify existing checks)
- Extensible (add new checks)
- Integration with the environment
- Easy to use
- License & price
- Possibility to use several tools
Checkstyle
- Checks for:
  - Coding standards
  - Potential bugs
  - Common problems
  - And much more (100+ checks)
- Easy to configure
- Extensible
- Integrates with Ant & several IDEs
- Open source
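A Checkstyle configuration of this era looked roughly as follows; the module names are real Checkstyle checks, but this particular selection is only an example, not the project's configuration:

```xml
<!-- Sketch of a Checkstyle configuration; the selection of checks is an example. -->
<module name="Checker">
  <module name="TreeWalker">
    <!-- Coding-standard checks -->
    <module name="ConstantName"/>
    <module name="JavadocMethod"/>
    <!-- Potential bugs / common problems -->
    <module name="EqualsHashCode"/>
    <module name="MagicNumber"/>
    <!-- Configurable: modify an existing check -->
    <module name="LineLength">
      <property name="max" value="100"/>
    </module>
  </module>
</module>
```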
Mentoring
- 4-7 consultants during the project
- Experts in different areas: Java / J2EE / Swing / UML
- Tasks:
  - Teach
  - Answer questions
  - Create project standards
  - Development support
Tuning the process
- Continuous integration
- Better feedback
- Continuous follow-up
Continuous integration
- Build when new code is checked in
- Avoid problems during the nightly build
- Faster feedback
Feedback from the nightly build
- Build / compile warnings
- JavaDoc warnings
- Checkstyle findings
- JUnit results
- All organized differently, e.g. by file, subproject, or project
- Findings increased due to a lack of individual responsibility
- Need to simplify daily work: all information needed daily should be in a single location
Individual feedback
- Use Checkstyle as the base, but group findings by responsible person (@author) instead of by file
- Added a Checkstyle task to output @author into the results
- Sort the results with XSLT
- Immediate decrease in Checkstyle findings
- Most people wanted good statistics
- Some individuals did not improve their results, but follow-up became easier
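The sorting step can be sketched in XSLT 1.0. This assumes each Checkstyle error element has been extended with an author attribute by the custom task mentioned above; that attribute is not part of standard Checkstyle output:

```xml
<?xml version="1.0"?>
<!-- Sketch: order Checkstyle findings by a custom author attribute. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>
  <xsl:template match="/checkstyle">
    <findings-by-author>
      <xsl:for-each select="file/error">
        <xsl:sort select="@author"/>
        <finding author="{@author}" file="{../@name}"
                 line="{@line}" message="{@message}"/>
      </xsl:for-each>
    </findings-by-author>
  </xsl:template>
</xsl:stylesheet>
```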
Report structure:
- Project summary
- Individual summary
- Details ordered by individual
Integration of other tools
- Integrate all results into a single report:
  - Checkstyle report
  - JUnit reports (based on @author of the test case)
  - Results from the build process: JavaDoc, Javac
  - Code reviews
[Figure: code review, Checkstyle, JUnit, Javac, and JavaDoc results feeding into the combined report]
Separate reports
- Code review information
- Optimizeit
- CodeCoverage
Technical solution
[Figure: XML results passed through XSLT transforms, driven by Ant]
XSLT tasks
- Merge different information sources
- Summarize information, e.g. calculate statistics
- Fill in missing information, e.g. assign an author if it is missing
- Produce readable reports (HTML)
XSLT setup
- One style sheet for each task: simplifies development
- Decorator pattern: more information added at each step
- Merge all / most XML files into one
- Use Ant's style task
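The decorator-style chain above can be sketched with Ant's style task (an alias for xslt); the file and stylesheet names below are assumptions for illustration:

```xml
<!-- Sketch of chaining style sheets in Ant; names are assumptions. -->
<target name="reports">
  <!-- Step 1: add author information to the merged results -->
  <style in="results/merged.xml" out="results/with-authors.xml"
         style="xsl/add-authors.xsl"/>
  <!-- Step 2: calculate per-person statistics -->
  <style in="results/with-authors.xml" out="results/summary.xml"
         style="xsl/summarize.xsl"/>
  <!-- Step 3: produce the readable HTML report -->
  <style in="results/summary.xml" out="reports/summary.html"
         style="xsl/to-html.xsl"/>
</target>
```

Each step reads the previous step's output, so new stylesheets can be inserted into the chain without touching the others.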
Report generation
[Figure: Checkstyle, build, and unit-test results are produced as XML, each transformed with XSLT, merged into one XML file, and finally transformed with XSLT into HTML reports]
Tool requirements
- Automation:
  - Ant (first choice)
  - Command line (if an Ant interface is not available)
  - Assigned person (possible for e.g. a weekly task)
- Output format:
  - XML directly, or possible to convert to XML
- Output content:
  - Preferably with a file reference (enables merging)
Technical details
- More information is provided in the paper
Continuous follow-up
- Set up goals / limits in the project
  - E.g. 80% code coverage during unit tests
  - E.g. < 20 Checkstyle findings per developer (0 issues per person is not always possible during development)
- Update tool settings to allow or disallow new things
  - E.g. when a common coding error is found
  - E.g. when exceptions to rules are needed
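Allowing an exception to a rule can be done with Checkstyle's suppression filter; the check and file pattern below are examples, not the project's actual settings:

```xml
<!-- Sketch: suppressions.xml, referenced from the main configuration via
     <module name="SuppressionFilter"> with a "file" property.
     The check and file pattern are examples. -->
<!DOCTYPE suppressions PUBLIC
    "-//Puppy Crawl//DTD Suppressions 1.0//EN"
    "http://www.puppycrawl.com/dtds/suppressions_1_0.dtd">
<suppressions>
  <!-- Generated code is exempt from the magic-number rule -->
  <suppress checks="MagicNumber" files="generated[\\/].*\.java"/>
</suppressions>
```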
Who should perform follow-up?
- First solution: follow-up by the person responsible for QA; led to discussions about the standards
- Second solution: a non-technical person, which makes discussions about the standards pointless
- Special cases are reported to management
- Use the results e.g. as input to salary reviews
Summary
- Automate as much as possible: the most cost-effective solution
- Simplify for developers
- Target reports to their recipients
- Follow up on the results
More information
- E-mail me: daniel.grenner@enea.se
- Tools:
  - http://ant.apache.org
  - http://checkstyle.sourceforge.net/
  - http://cruisecontrol.sourceforge.net/
  - http://jakarta.apache.org/cactus/
  - http://www.junit.org
Questions?