NatHERS Benchmark Study


NatHERS Benchmark Study
February 2014
Prepared for the Department of Industry by: Floyd Energy, Wayne Floyd
With assistance from Tony Isaacs and Rodger Hills

Table of Contents
1 Executive Summary
  1.1 Study design
  1.2 Key findings
  1.3 Recommendations
2 Introduction
3 Background
  3.1 The Industry
4 Objectives
  4.1 Purpose
  4.2 Deliverables
  4.3 Assumptions
5 Methodology
  5.1 Stage One: System design
  5.2 Stage Two: Participation
  5.3 Stage 3: Data collection and analysis
  5.4 Limitations of the methodology
  5.5 Risk Management
6 Findings summary
  6.1 Statistical significance of sample
  6.2 Queries
  6.3 Assessor demographics
  6.4 Assessor Practices
  6.5 Star Ratings
  6.6 Software tools
  6.7 AAOs
  6.8 Question scores
  6.9 Net Conditioned Floor Area (NCFA)
  6.10 Zoning
  6.11 Site exposure
  6.12 Orientation
  6.13 Windows and skylights
  6.14 Eaves
  6.15 Overshadowing by surrounds

  6.16 Air leakage sites
  6.17 Ceiling penetrations
  6.18 Walls
  6.19 Roofs
  6.20 Floors
7 Recommendations
  7.1 Minimum training and professional development standards for NatHERS assessors
  7.2 NatHERS tool improvements
  7.3 New resource development
  7.4 Monitor progress and future studies
8 Appendices
  Appendix 1: Benefits to assessors of participation in the study
  Appendix 2: Assessor demographics
  Appendix 3: Detailed analysis of assessments
  Appendix 4: Assessor practices
  Appendix 5: House plans
  Appendix 6: Questions asked for each dwelling

Abbreviations
AAO - NatHERS Assessor Accrediting Organisation
ABCB - Australian Building Codes Board
ABS - Australian Bureau of Statistics
ABSA - Association for Building Sustainability Assessors
ACT - Australian Capital Territory
AccuRate - NatHERS benchmark software tool
BASIX - Building Sustainability Index, the web-based tool designed to assess the potential performance of buildings that contain a dwelling against thermal comfort, water and energy criteria
BDAV - Building Designers Association of Victoria
BERS Pro - NatHERS accredited software tool
Cert IV - Certificate IV in NatHERS Assessment
CPD - Continuing Professional Development
DI - Department of Industry
FirstRate5 - NatHERS accredited software tool
NatHERS - Nationwide House Energy Rating Scheme
NCC - National Construction Code
NCFA - Net Conditioned Floor Area
NSW - New South Wales
NT - Northern Territory
QLD - Queensland
RIS - Regulatory Impact Statement
SA - South Australia
SHGC - Solar Heat Gain Coefficient, a property of windows that reflects how much solar radiation is transmitted through the window
SV - Sustainability Victoria
TAS - Tasmania
WA - Western Australia
WERS - Window Energy Rating Scheme

Acknowledgements
The study team would like to thank the following people and organisations for their generous assistance and support:
- ABSA
- BDAV, Kate Bell, CEO
- Sustainability Victoria (FirstRate5 software), Anthony Wright
- SmartRate (BERS Pro software), Michael Plunkett
- Hearne Scientific (distributors of AccuRate), Chris Williams, Barlow Telford
- Members of the expert committees: Ted Harkness, Peter Barlow, Tony Butters, Debbie Bute, Eliza Morawska, Katie Fallowfield, David Canciello, Victoria Prior
And last but not least, the study team would like to thank all participating assessors who generously took time away from their business or leisure to participate in this study. This study represents an important step forward for the NatHERS assessor industry and will help all assessors to produce consistent and reliable ratings for their clients. As the first study of its kind, this study was on its own learning curve, and as a result there were inevitable glitches in its implementation. The study team appreciates your dedication and patience in sticking with us to the end.

1 Executive Summary
1.1 Study design
This study creates a national benchmark by measuring the accuracy of NatHERS assessments. Assessors were randomly allocated one of four houses to rate, and their assessments were compared with a solution set prepared by a committee of expert assessors. Approximately 100 questions about NatHERS data entry for each house were developed to test specifically how assessors applied Technical Notes 1 and 2, whilst also testing the accuracy of general data entry techniques. A study website was developed to provide a portal for assessors to access the study, enter responses to questions, ask questions and gather feedback. A webinar available through YouTube and the study website provided instructions on participating in the study and using the data entry portal. Assessors were initially contacted through Assessor Accrediting Organisations (AAOs) and software providers, who endorsed the study and encouraged assessors to participate. AAOs provided Continuing Professional Development (CPD) points for participation as an incentive.
1.2 Key findings
1.2.1 Statistical significance of participants
Statistical significance depends on the sample size relative to the total population. The measure of statistical significance used for academic publications is 95% certainty with a 5% margin of error. This study obtained a sample of 344 assessors from an estimated population of 1,816 assessors, which gives an error margin of 4.8% at 95% certainty, so the sample was statistically representative of the assessor population as a whole. As the sample was self-selecting and not random, the nature of the bias, that is, whether one would expect a self-selected sample to be more or less accurate than the general population, is not clear. Sub-samples by jurisdiction, AAO and software tool were also examined to determine statistical significance. Only the sub-sample of ABSA assessors fell within the 95% confidence and 5% error margin limit. Sub-samples of BDAV assessors, assessors from WA and Victoria, and each of the software tools fell within an error margin of 10% at 95% confidence. Assessors who were unaccredited represented only 10% of participants. This does not mean that comparisons between sub-samples have no significance, but the differences between sub-samples would need to be larger before they could be concluded to be statistically significant.
1.2.2 Star rating accuracy
Star rating accuracy is the most important element of a NatHERS assessment, as this is directly related to compliance with building regulations and the National Construction Code (NCC). The study found that around 21% of assessments obtained the correct rating, around 37% of ratings were within 0.25 stars, 58% within 0.5 stars and 77% within 1 star (Table 1). Conversely, 64% of assessors had an error greater than 0.25 of a star.

The high error rate in star ratings was consistent with assessors correctly answering only 65% of the questions about data inputs.
Table 1 Proportion of ratings within various star rating error bands
Error in star rating      Cumulative
Within 0.25 stars         37%
Within 0.50 stars         58%
Within 0.75 stars         70%
Within 1.00 stars         77%
Within 1.50 stars         86%
Within 2.00 stars         91%
More than 2.00 stars      100%
1.2.3 NatHERS Technical Notes
NatHERS Technical Notes 1 and 2 cover rules for data entry, including zone types and conditioning, site exposure, and calculation of the area of uninsulated ceiling created by ceiling penetrations. The errors found in this study suggest that these documents are not well understood or widely used by the assessor industry. For example, depending on the complexity of the house plan, only between 3% and 29% of assessors correctly estimated the area of ceiling penetrations.
1.2.4 Application of Technical Note 1 zoning rules
Arguably, the most important element of accurate modelling using NatHERS software is to correctly zone the house plan. A large proportion of assessments were not zoned correctly according to the zoning rules in Technical Note 1. Rooms were incorrectly combined or split, the allocation of other day or other night time-zoned rooms was not well understood, and heating and cooling was incorrectly turned off in these zones. Inspection of individual files showed zoning errors by around 60% of assessors for houses 1 and 4, and by 85% for the remaining houses.
1.2.5 House plan complexity
Assessor errors increased significantly with the increasing complexity of design and documentation. 57% of assessors were within 0.25 stars of the correct rating for house 1, but only 41% for house 2 and 26% for house 3. Houses 2 and 3 were two storey houses (which are inherently more complex to rate than single storey houses), and house 3 had more complex documentation than the simpler house 1. House 4 was an apartment on the 15th floor, and only 19% of assessors got within 0.25 stars of the correct rating. In addition to zoning errors, a significant contributing factor was selecting incorrect wind exposure and height above ground: 41% of assessors selected the wrong exposure for house 4.
1.2.6 Net Conditioned Floor Area (NCFA)
The rating is calculated on the basis of energy loads per square metre of NCFA, so getting this right is vital to an accurate rating. One in five ratings reported an incorrect NCFA (more than 5% difference), which resulted from factors such as failing to turn on heating and cooling in other day and other night classified zones.

1.2.7 Overshadowing
Overshadowing refers to the shading of the rated building by surrounding objects, typically adjacent houses and fences. Incorrectly modelling overshadowing on a heavily overshadowed site can result in rating errors of over 1 star. Houses 1 to 3 each had significant overshadowing from adjacent buildings and fences. Most assessors did not accurately model these obstructions. Errors ranged from miscalculating the height and depth of the overshadowing obstruction to not counting the overshadowing impact of roof ridges and fences. Over 75% of assessors who rated house 1 did not accurately model obstructions.
1.2.8 Systematic process
A wide variety of other errors were also observed that indicated the lack of a systematic approach to the rating process by assessors. For example, 25% of assessments reported incorrect window areas, 85% reported incorrect areas of ceiling penetrations, 35% had incorrect roof solar absorptance, 25% had the incorrect R-value for waffle pod slabs, and the areas of different wall constructions and floor coverings within each house showed error rates ranging from 10% to 70%.
1.2.9 Differentiation by AAO or tool
The responses of assessors were also checked to see whether the use of a particular tool or membership of an AAO was correlated with different accuracy levels. While differences were found, the small sample sizes meant that the differences could be explained by the error margin of the samples.
1.2.10 Information about the assessor industry
Assessors who registered for the study were asked a number of questions which provided valuable information on the assessor industry. Around a quarter of assessors work solely as an assessor. Two thirds of assessors are self-employed, and around half of these are building designers or architects. The average assessor had been in the job for 5.5 years and rated 90 houses per year. Almost all assessors had some building site experience, with around 63% having direct on-site experience.
1.3 Recommendations
It is important that NatHERS assessments are consistently accurate to ensure compliance with building regulations, effectiveness in saving energy, the reputation of the industry, and that consumers get the service they pay for. The requirement that all assessors complete the Certificate IV in NatHERS Assessment by 1 July 2015 is an important step towards improving the skills of assessors. This study highlighted areas where enhancements could be made to NatHERS to increase the accuracy and consistency of NatHERS assessments. These can be grouped into four broad categories.
1.3.1 Mandatory accreditation of assessors
This study found that there was a high level of error in NatHERS ratings irrespective of whether assessors were accredited or not.

The quality assurance (QA) processes of accreditation are a mechanism that could contribute to more accurate assessments; however, there is no such process available to improve the accuracy of unaccredited assessors. Making accreditation mandatory for all assessors is essential to improving the accuracy of assessments.
The error rate showed that there are benefits in further CPD training for assessors in Technical Notes 1 and 2. This additional training should incorporate evaluation to ensure that the content is understood by assessors. The Technical Notes could be improved by providing detailed examples to better explain the principles.
1.3.2 NatHERS tool improvements
NatHERS software tools could be further improved to minimise error rates. This includes improvements such as automatic calculation of ceiling penetration area, better guidance on the allocation of zone occupancy types and the application of heating and cooling, automatic calculation of waffle pod slab R-values, and simplified data entry for overshadowing. The calculation of NCFA is not consistent across tools and should be harmonised, as the rating is determined on the basis of this area.
1.3.3 New resources for assessors
A number of new resources for assessors would help to increase accuracy:
- The NatHERS assessor industry has no standard data entry and error checking procedures. Development of such procedures would facilitate a more systematic approach to rating and would help to eliminate many of the errors found in this study;
- A comprehensive technical manual for assessors which explains how to model all the design and site features which assessors may encounter in the field; and
- Working with product suppliers to develop trade literature which better meets the needs of assessors (69% reported that they had difficulty finding the information they needed in trade literature).
1.3.4 Improving future NatHERS assessor benchmark studies
The proposed Universal Certificate extracts data from the rating file and displays it on the certificate. A similar tool which automatically extracts a larger set of data from NatHERS rating files than the proposed Universal Certificate would have significantly reduced the extent of analysis required, facilitated analysis in greater detail, and made participation in this study faster and easier for assessors. It would also avoid issues of assessor misinterpretation of questions. The proposed universal certificate generator could be modified to do this. It would also significantly reduce the resources needed for QA checking by AAOs and assist in the marking of exams and tests for trainers.
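To illustrate the kind of automated data extraction recommended here, the sketch below writes a small set of fields from a completed assessment to a JSON summary. The field names and record layout are assumptions made for illustration only; they are not the NatHERS rating file format or the proposed Universal Certificate specification.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class RatingSummary:
    # Hypothetical field names for illustration; the real NatHERS file formats
    # and the proposed Universal Certificate fields may differ.
    star_rating: float
    ncfa_m2: float
    heating_load_mj_m2: float
    cooling_load_mj_m2: float
    climate_zone: int
    exposure: str
    zone_count: int

def export_summary(summary: RatingSummary, path: str) -> None:
    """Write the extracted fields as JSON so AAOs, trainers or a study team
    could compare many assessments without opening each rating file."""
    with open(path, "w") as f:
        json.dump(asdict(summary), f, indent=2)

# Invented example values, for illustration only
example = RatingSummary(6.1, 148.2, 210.4, 45.7, 21, "suburban", 9)
export_summary(example, "rating_summary.json")
```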

2 Introduction
In May 2013, the Department of Industry commissioned Floyd Energy to undertake the first national benchmark study to measure the accuracy and consistency of assessments undertaken by Nationwide House Energy Rating Scheme (NatHERS) assessors. This study followed a study in 2012 in which the Association of Building Sustainability Assessors (ABSA) was contracted by the Western Australian Office of Energy to conduct a workshop to trial state-based interim software modelling principles and defaults. The results of this workshop identified a number of areas where the accuracy and consistency of NatHERS assessments could be significantly improved, such as how to consistently treat zoning and floor coverings, and how to make consistent judgements regarding insulation and external screening. As the results were unlikely to be unique to Western Australia, the Department of Industry sought to establish a national baseline of current assessor practices. The intent of the study was to more fully understand problem areas, monitor and evaluate improvements to the Scheme, and to work with all governments to continually improve the consistency and accuracy of NatHERS assessments nationally.
3 Background
NatHERS provides a national framework that allows various computer software tools to rate the potential energy efficiency of Australian houses at the design stage. NatHERS software tools fulfil a regulatory function for assessing compliance of a building design with the thermal performance requirements in the National Construction Code (NCC). The correct application of NatHERS software tools is critical to the success of the Scheme. Although each state and territory government uses the NatHERS accredited software to provide assessments, minimum standards in all other aspects of the Scheme, including how the tools are applied, can vary. The main components of the Scheme include:
- The use of accredited software tools that provide consistent and accurate modelling of building shell performance (assessments) across a diverse range of climates to determine heating and cooling demands;
- Minimum standards for assessors:
  - Accreditation of Assessor Accrediting Organisations (AAOs), incorporating CPD and quality assurance of assessors, a Code of Conduct and mandatory Professional Indemnity Insurance;
  - the Certificate IV in NatHERS Assessment qualification;
  - Consistent use of NatHERS tools through nationally applied Technical Notes; and
  - Working toward a minimum consistent standard across all states and territories; and
- A process of continuous improvement for all aspects of the Scheme.
3.1 The Industry
The NatHERS assessor industry is still in the early years of its development. National regulations for minimum NatHERS performance were introduced in 2003, and a number of states delayed the introduction of these regulations. Prior to 2003 a number of jurisdictions had introduced minimum requirements:

NSW's Energy Smart Homes program required a minimum of 3.5 stars; the ACT required a minimum of 4 stars and extended this as a disclosure rating for houses for sale and rent; and in Victoria, houses requiring a planning permit had to achieve at least 4 stars.
Training requirements to become an assessor have been minimal: a 4 day course covering thermal performance theory, software use and professional practice. A new Certificate IV course has been developed for NatHERS assessors, and all accredited assessors will be required to meet this new standard by 1 July 2015.
Implementation of Quality Assurance (QA) programs has been inconsistent across states and territories:
- In NSW, assessors were required to be accredited with ABSA for thermal performance assessments conducted under BASIX. In 2006 ABSA became the first recognised AAO for NatHERS. Assessors were required to pass entry exams and participate in CPD, and some assessors' work was subject to QA checks each year.
- In Victoria there was a minimal accreditation requirement which involved passing an exam run by training authorities, with occasional random QA checks. When FirstRate5 was released, SV ceased accrediting assessors and this task was given to the Building Designers Association of Victoria (BDAV). BDAV initially accredited assessors under the requirements of SV and was later recognised as a NatHERS AAO.
- The ACT has required licensing of assessors.
- Other states have required no accreditation, but some have recommended that assessors accredited with an AAO be used in preference to unaccredited assessors.
The software and regulations around NatHERS ratings have changed several times:
- Second generation software was introduced progressively from 2006, and all assessors were required to retrain to be able to use the new software.
- AccuRate and BERS Pro released new versions with non-regulatory assessment of appliance efficiencies and water use.
- Accreditation for BERS Pro and FirstRate5 was provisional, and when the fully accredited versions were released the ratings for houses changed slightly.
- National regulation stringency has increased from 4 to 5 stars and then to 6 stars.
- ABSA's rating procedures have been updated three times, culminating in the release of the NatHERS Technical Notes 1 and 2 in 2013.
- The basis of the Window Energy Rating Scheme (WERS) changed without corresponding amendments to the window libraries in NatHERS tools, making the job of identifying the required minimum properties of glazing more difficult.
The NatHERS assessor industry is a young and rapidly changing industry. This environment, and the inconsistent application of QA across Australia, does not lend itself to establishing high standards across the country without a consolidated and consistent approach.

4 Objectives
4.1 Purpose
The purpose of this study was to:
- Collect relevant data about NatHERS assessments to determine levels of consistency and accuracy of assessments nationwide;
- Design a methodology to assess the accuracy of NatHERS assessments;
- Identify ways to improve NatHERS tools and the skills, accuracy and consistency of assessors; and
- Provide a benchmark against which to measure current and future improvements to NatHERS.
4.2 Deliverables
Deliverables for this study included:
- Designing a sound methodology to gather the required NatHERS assessor/assessment information;
- Ensuring participation of assessors was statistically significant;
- Guaranteeing the confidentiality of participants;
- Ensuring the study was conducted with the highest integrity;
- Identifying and managing risks;
- Analysing data to highlight trends and summarise outcomes;
- Advising of appropriate actions to improve the accuracy of assessments; and
- Providing a final report.
4.3 Assumptions
4.3.1 NatHERS Technical Notes 1 and 2
The NatHERS administrator released two Technical Notes in early 2013 which define the way in which all assessors using NatHERS tools are required to rate dwellings. Technical Note 1 gives a general description of data entry techniques, and Technical Note 2 explains how to allow for the loss of ceiling insulation due to the clearances required around some ceiling penetrations. The Technical Notes specifically address the poor rating practices of assessors which had been identified by previous studies (such as the Floyd Energy project in WA) and by AAO QA findings.
4.3.2 Bias
Participation in the study was voluntary. Samples which are self-selected are not random, and this introduces some bias into the study. It is difficult to determine whether this bias would lead to participants being more or less accurate than the general population of assessors. If only those assessors who were confident they did accurate ratings participated, the bias might be expected to cause greater accuracy. On the other hand, because accuracy in this study had no impact on accreditation, those who struggle with the rating process may have participated because they wanted to learn how to improve their ratings, so the accuracy of the sample could be worse than that of the population.

Due to the high number of CPD points allocated by AAOs for participation in the study, assessors' main motivation to participate may simply have been to earn CPD points, in which case the impact on the accuracy of ratings is not clear. The self-selection of assessors for participation is an acknowledged inherent limitation of the methodology of this study; however, the nature of the bias it introduces is unclear.
4.3.3 Accredited vs. non-accredited assessor participation
Although both non-accredited and accredited assessors were invited to participate in this study, it was assumed that most participants would likely be accredited assessors, as they received a significant allocation of CPD points for their efforts at no cost.
5 Methodology
The methodology for this study involved three key stages: (1) system design, (2) participation, and (3) data collection and analysis.
5.1 Stage One: System design
Assessors were randomly assigned one of four house plans that were uniquely designed and peer-reviewed. Assessors then answered around 100 questions (see section 8.6) about their specific assessment through a website designed specifically for this study. Assessors were also asked to upload their rating files to the study website to facilitate deeper analysis if the questions asked did not provide sufficient information. This also helped to resolve any issues around misinterpretation of questions.
5.1.1 House plans
Houses were selected to test the application of specific clauses in the Technical Notes as well as general rating techniques. In developing the plans, the common rating errors reported by AAOs were examined to make sure they were tested. The plans were typical of the different types of dwellings which assessors face in the field:
- House 1 was a single storey, 196 m², 4 bedroom volume builder house provided by Dennis Family Homes;
- House 2 was a two storey, 311 m², 4 bedroom volume builder house provided by Dennis Family Homes. It has some more complicated features, such as lower floor roof spaces adjacent to upper floor rooms and walls which change construction from brick veneer to cement sheet cladding over their height. Two storey homes are also, by nature, more complicated to assess with NatHERS tools because overshadowing must be adjusted to take into account the floor height of the zone, and zone adjacencies between floors are more complicated;
- House 3 was a two storey, 107 m², 3 bedroom town house which was part of a multi-unit town house development (NCC class 1) and was provided by volume builder AV Jennings. Its documentation covers the whole medium density development, and assessors needed to locate the rated unit within these documents; and
- House 4 was a 166 m², 3/4 bedroom apartment on the top floor of a 15 storey building (NCC class 2). This design was developed specifically for this study to test the application of the Technical Note zoning rules for the creation of zones for common corridors in multi-unit buildings.

Section 8.5 shows the plans of the houses and the specific areas of assessor data input these houses were designed to test.
5.1.2 Peer review committee
A peer review committee examined the documentation for each house to ensure it included everything needed to undertake the rating task. Expressions of interest for participation in the committee were requested through AAOs, and a number of nominations were received. In addition to examining each nominee's experience, advice was sought from AAOs and tool developers on the suitability of each nominee to undertake this task. Experts were selected for the committee for each of the software tools: 3 for AccuRate, 4 for BERS Pro and 2 for FirstRate5.
5.1.3 Development of data entry questionnaires for each house
A full QA check of each rating assessment was too time-consuming for the purposes of this study. For this reason, a series of questions was developed for each house to cover those data inputs considered essential to the accuracy of the rating. The questions were designed to focus on key new areas of assessor practice introduced in the Technical Notes released in 2013, and on AAO observations of typical assessor rating errors. The full set of questions asked for each house in each software tool is shown in section 8.6. Around 100 questions were asked for each plan, covering:
- Key rating parameters: star rating, energy loads and NCFA, climate zone and site exposure;
- House zoning, including the total number of zones and the number of zones of each different type;
- Areas and types of key construction elements: walls, floors, ceilings and windows;
- Insulation R-values, particularly where these involved calculation by the assessor, e.g. combined foil and batt insulation in a wall with BERS Pro, or calculation of ceiling penetration areas;
- Number of air leakage sites of different types; and
- Details of overshadowing obstructions.
5.1.4 Study website
A website was developed for the study and served as the assessors' portal. The site:
- Provided key messages about the importance of participation;
- Explained how the study would run and what would be required of assessors;
- Allowed assessors to register their interest in the study;
- Gave a contact to allow assessors to ask questions about the study;
- Provided downloads of key background materials (NatHERS Technical Notes 1 and 2 and instructional notes from the webinar), together with a link to the video of the webinar; and
- Gave study participants access to the Frequently Asked Questions compiled as a result of participant feedback during the study.
5.1.5 Totara system for recording answers to questions on each house
The Totara system is a popular learning management platform used in the corporate sector.

It was used for the NatHERS Benchmark Study to provide participating assessors with a website to enter the answers to questions for each house and to upload their rating files. The questions were divided into subsections, and assessors were given feedback at the completion of each subsection on whether their answers were correct. At the conclusion of the questions, a summary report showing all correct and incorrect answers was provided to each assessor. Totara was selected because it is robust, scalable and fully supported in Australia. The system allowed the study team to load content specific to the study quickly and easily. It provided sophisticated assessment and reporting to ensure tracking of participant progress, scores and final results of the study. Issues and trends that were determined from Totara included:
- Assessor demographics;
- Typical assessment errors;
- Typical errors by rating tool;
- Typical errors by demographic (state, accredited ABSA, accredited BDAV, unaccredited, age, gender, etc.);
- Details of assessment practices;
- Overall deviation of assessors from benchmarks; and
- Proportion of questions correctly answered by each assessor.
5.1.6 Additional data gathered for this study
The study also provided a unique opportunity to gather more data about the assessor industry. In addition to answering questions about the rating of each house, assessors were asked to provide information on their background through the registration process, e.g. number of ratings per year, whether self-employed, number of years since last trained, etc. Section 8.2 lists the questions asked and analyses assessor responses on assessor background. Assessors who completed their ratings were also asked a number of questions about assessor practices, such as how long they spent checking their ratings, the nature of the procedures they use when rating a house, and the ease of finding appropriate thermal performance data in trade literature. Section 8.4 lists the questions asked and analyses the responses to these questions on assessor practices.
5.2 Stage Two: Participation
5.2.1 Access to assessors
Access to accredited assessors was facilitated through the cooperation of AAOs and software suppliers. To comply with privacy requirements, all communication with assessors regarding recruitment into the study was done through these sources. AAOs and software providers added their support to the study in these communications to encourage assessor participation. Reminder emails were sent once per week through these channels. A total of over 17,000 emails were sent to recruit assessors into the study.
5.2.2 Voluntary participation
To maximise participation, a number of strategies were employed, such as AAOs offering CPD points for participation, which saved assessors around $400. The benefits of the study to assessors and to the industry were highlighted during recruitment and explained on the website (see section 8.1 for a description of the benefits to assessors of participating in this study). Assessors were also guaranteed confidentiality. A webinar was conducted in the early stages for assessors who registered their interest, to ensure they could make a fully informed decision.

This study allowed assessors to get feedback on the accuracy of their work for free, without the potentially negative consequences of going through an AAO QA process. AAOs and software developers played a major role by sending out reminders to assessors to (1) participate and/or (2) complete their assessment.
5.2.3 Assessor engagement
A key benefit for assessors of participation was the opportunity to improve the standard of their ratings. On completion of the study, assessors were given access to the report and/or report summary as well as the solution sets for all houses. This feedback encouraged participation and was included to start the process of assisting assessors to improve their standards.
5.2.4 Responding to assessor rating queries during the study
The Zendesk system was used by the study team to respond to assessor queries. Zendesk is a cloud-based customer service system, typically used to provide help desk services in the IT industry and in industries which need to provide technical support for their products. It was selected because it is an efficient method of receiving and responding to participant enquiries. It allowed the study to identify participants' issues quickly and to allocate each query to the study team member with the appropriate expertise to ensure a rapid response. The Zendesk online portal allowed the submission of queries and responses, as well as allowing members of the study team to communicate directly via email regarding the study.
5.2.5 Sample size
Establishing the number of study participants that would constitute a statistically valid sample of assessors required selecting a confidence level and confidence interval. The following description of these terms is taken from the Survey System web site:
The confidence interval (also called margin of error) is the plus-or-minus figure usually reported in newspaper or television opinion poll results. For example, if you use a confidence interval of 4 and 47% of your sample picks an answer, you can be "sure" that if you had asked the question of the entire relevant population, between 43% (47-4) and 51% (47+4) would have picked that answer. The confidence level tells you how sure you can be. It is expressed as a percentage and represents how often the true percentage of the population who would pick an answer lies within the confidence interval. The 95% confidence level means you can be 95% certain; the 99% confidence level means you can be 99% certain. Most researchers use the 95% confidence level. When you put the confidence level and the confidence interval together, you can say that you are 95% sure that the true percentage of the population is between 43% and 51%. People sometimes think that the 95% level is sacred when looking at significance levels. If a test shows a .06 probability, it means that it has a 94% chance of being true. You can't be quite as sure about it as if it had a 95% chance of being true, but the odds still are that it is true.

The 95% level comes from academic publications, where a theory usually has to have at least a 95% chance of being true to be considered worth telling people about. In the business world, if something has a 90% chance of being true (probability = 0.1), it can't be considered proven, but it is probably better to act as if it were true rather than false.
As explained above, in most academic research a 95% confidence level with a 5% confidence interval is used to define a statistically significant sample, i.e. you can be 95% sure that the results represent the population within +/- 5%. To meet this level of confidence for the total assessor population would require a sample of 317 of the estimated total of 1,816 assessors. It would also be ideal to ensure that the results of the study were statistically valid for a variety of assessor sub-samples, e.g. assessors in different jurisdictions (Table 2) or members of individual AAOs (Table 3).
Table 2 Number of participants required to achieve statistically valid samples from each state (columns: within 5%, within 10%; rows: ACT, NSW, NT, QLD, SA, TAS, VIC, WA, total)
Table 3 Number of participants required to achieve statistically valid samples for each type of accreditation (columns: within 5%, within 10%; rows: ABSA, BDAV, none, total)
5.3 Stage 3: Data collection and analysis
5.3.1 The Totara system
The Totara system collected assessor responses to questions in a form that could be imported into Excel for analysis. This study assessed the accuracy of assessments in terms of star rating outcomes and the extent of errors in data inputs that led to divergence from the correct star rating. Where sub-samples were statistically significant, differences between sub-samples were also examined, e.g. the differences in accuracy and data input errors between:
- Each AAO and non-accredited assessors;
- NatHERS tools;
- Assessors in different states; and
- Each of the four house plans used in this study.

The impact of data entry errors on star rating outcomes was also examined, e.g. the average star rating for assessors with correct answers to a particular question was compared with that for assessors with incorrect answers. In some cases the impact of the extent of error in data inputs was also correlated with average star rating outcomes, e.g. the average rating of those assessments with too many thermal zones was compared with the average rating for those with the correct number and those with too few.
5.3.2 Analysis
The analysis for this study focussed particularly on new areas of the Technical Notes:
- Zoning: the process of dividing the rooms of a house into unique thermal zones and assigning occupancy and conditioning, particularly the application of other day and other night occupancies and the combining of rooms into single zones;
- Modelling of surrounding buildings and fences which cast shade on the dwelling;
- Setting of site exposure to wind;
- Modelling of building elements shared with common areas; and
- Calculation of the area of ceiling left uninsulated due to ceiling penetrations.
Other critical areas of rating accuracy were also examined, such as ensuring that the areas of different construction elements were correct, that building element construction details were correctly identified, and that product thermal performance properties were correctly researched.
5.3.3 Maximising assessor understanding of rating questions
For the study results to be valid, it was important that the questions asked were unambiguous. To make the questions as easy to understand as possible, the study provided:
- Instructions for the study to ensure study participants understood how to find the data they needed to enter; and
- A webinar explaining how to prepare the rating and extract the data inputs needed for the study.
Despite all efforts to ensure questions and documentation were easily understood, it became clear that some questions were interpreted incorrectly, which was not anticipated. This was the first time such a study had been attempted and, with the best will in the world, it was likely that the questions would not be clear to all participants. To further assist with the interpretation of assessor responses, each study participant was required to upload their rating data file. This allowed files to be checked to gain a better understanding of where assessors were making mistakes, to check outlier results, and to check whether assessors understood questions when unusually high rates of incorrect answers were observed.
5.4 Limitations of the methodology
Ideally a complete check of each rating file would have been undertaken to determine the extent of data entry errors. The technique used in this study of asking assessors to extract only key data saves a substantial amount of time in analysis, but presents a limited view of the rating. If errors were made in areas of data input which were not included in the data extracted for this study, the resulting rating error cannot be explained.

In addition, if assessors did not understand a question or how to extract the data from the software, they may have reported incorrect data entry where they had actually entered the correct information. This study attempted to overcome these limitations by examining data in the rating files where assessors were found to struggle with particular questions, based on feedback given through the support website, or where there seemed to be an unusually high error rate.
The documentation provided for this study was designed to be sufficiently comprehensive to include all data required for the ratings to be completed. Documentation standards vary considerably in the residential building industry. The documentation provided for this study is likely to be more comprehensive than assessors would usually receive, and it therefore does not test assessors' ability to deal with inadequate documentation. Furthermore, information may not have been presented in the way assessors are used to. Assessors develop relationships with their clients and become accustomed to the way in which their clients present information, e.g. how details of weatherstrips are shown, or taking window dimensions from elevations rather than from a schedule. Therefore, some errors may have been made in data entry for this study which assessors may not normally make when dealing with their regular clients.
This study was an artificial construct. There was no opportunity for client feedback in the usual way that assessors would be accustomed to when dealing with their clients. Furthermore, this study was undertaken by assessors on top of their usual workloads and contained an element of evaluation of their performance, which may have caused some anxiety. Again, these conditions may see assessors make errors where they may not in the field. Assessors may also have been assessing houses in climate zones they were not accustomed to working in. Consequently, the feel they may have developed for what appears to be a reasonable rating outcome for similar dwellings in the climate zones they usually work in may not have been useful in this case. The study may therefore suffer from a Hawthorne effect, where the subjects being studied do not behave as they normally would simply because they know they are being studied. Finally, there is one key aspect of NatHERS assessments which was not tested in this study, namely how assessors optimise the performance of a house to achieve regulatory compliance. This can only be effectively tested if the data entry is correct in the first place.
5.5 Risk Management
5.5.1 Participation rates
A number of strategies were employed to maximise the participation rate in the study, including:

- Engaging AAOs and software tool providers to support the study. Endorsement from these bodies demonstrated the importance of the study to the industry.
- Approaching assessors to participate through AAOs and software providers, with weekly reminders sent throughout the recruitment phase encouraging them to participate.
- Allocating CPD points through each AAO, which allowed assessors to earn these CPD points in one study at no cost, an annual saving of about $400.
- A 4-week timeframe to complete the study. This minimised the intrusion into assessors' work time, requiring no more than about 2 hours per week to participate. An extension of 5 days was granted at the end of the study to encourage those assessors who had commenced the study to complete it.
- Outlining the benefits to assessors of feedback on their practices and of helping to set priorities for training and CPD based on assessors' actual needs (see section 8.1).
5.5.2 Collusion
There was some risk that assessors would discuss the study and enter responses which they had not worked out themselves. To some extent, if this happened, it would reflect what assessors actually do in the field, as around half of assessors said they would consult other, more experienced assessors when confronted with unfamiliar situations. However, this would mask some of the information needs of assessors, so a number of strategies were put in place to address it:
- Four different plans were designed and randomly assigned;
- Assessors were assured that the results were confidential, so that there were no negative consequences for incorrect responses; and
- Assessor responses were scrutinised to isolate any cases with substantially identical responses.
5.5.3 Misunderstanding study questions
A number of strategies were put in place to ensure that assessors understood the questions and that their answers properly reflected their rating:
- Plans were peer reviewed to ensure they contained all the information required;
- An information webinar was held to explain the study and the approach to rating houses required for the study. A link to this webinar was included on the study website so that all participating assessors could access it as often as required;
- A web-based help system was developed to deal with assessor queries. Over 500 queries were addressed during the study, and most were handled within 24 hours; and
- Where error rates were high, individual rating files were reviewed to check whether assessors had entered the correct data. Where assessors did not understand the question, data was taken directly from the rating files.
5.5.4 Confidentiality
Once registered, each assessor was allocated a 12 digit personalised security code. This unique identifier was used to associate assessor responses with rating files, and all personal data was removed. Confidentiality was vital to ensuring a high participation rate, so that assessors felt assured there would be no negative consequences from participating in the study.
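As an illustration of the allocation and anonymisation approach described in this section, the sketch below generates a 12-digit participant code and maps it deterministically to one of the four study houses. The hashing scheme and code format are assumptions made for illustration; this is not a description of the system actually used in the study.

```python
import hashlib
import secrets

def new_security_code() -> str:
    """Generate a 12-digit anonymous participant code."""
    return "".join(str(secrets.randbelow(10)) for _ in range(12))

def assign_house(code: str, n_houses: int = 4) -> int:
    """Deterministically map an anonymous code to one of the four study houses (1-4)."""
    digest = hashlib.sha256(code.encode()).hexdigest()
    return int(digest, 16) % n_houses + 1

code = new_security_code()
print(code, "-> house", assign_house(code))
```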

6 Findings summary
Note: this section is a summary only. More comprehensive discussion and analysis can be found in section 8.3.
Assessors enter thousands of individual data items to establish the rating of a house. Each data item has a different effect on the rating, so the accuracy required of assessors for each data item in order to maintain an accurate assessment also varies. To reflect the varying impact of different types of data on the accuracy of the assessment, different levels of acceptable variation from the correct answers were used. For example, a variation in the NCFA of the building of 2.5% would lead to an error of 0.1 stars, while a variation in the area of ceiling penetrations of 20% would lead to a similar error. In the following sections, the measure of acceptable variation in data entry used was that which would lead to an error of less than or about 0.1 stars, so different percentage error limits are considered acceptable for different types of data entry.
6.1 Statistical significance of sample
To obtain a statistically significant sample of the total assessor population, the minimum number of assessors needed for the study was 317, i.e. the number needed for 95% confidence at a 5% margin of error. 547 assessors registered for the study and 344 completed the rating and entered the details into the study web site. The study sample is therefore a statistically significant sample of the total assessor population. Table 4 shows the final participation rates of assessors by accreditation status, and Table 5 by jurisdiction.
Table 4 Participation rates in the study, by AAO membership (ABSA, BDAV, other, total): registrants, completions and % of registrations completed
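The sample-size and margin-of-error figures quoted in this section (317 assessors required for 95% confidence at a 5% margin, and a 4.8% margin for the 344 completed responses) follow from the standard finite-population formulas. The snippet below is a minimal illustrative sketch, not part of the study's tooling.

```python
import math

Z95 = 1.96   # z-score for a 95% confidence level
P = 0.5      # most conservative assumed proportion

def required_sample(population, margin=0.05):
    """Sample size for a proportion estimate, with finite population correction."""
    x = Z95 ** 2 * P * (1 - P) / margin ** 2        # infinite-population size (~384)
    return population * x / (x + population - 1)

def margin_of_error(sample, population):
    """Margin of error achieved by a given sample drawn from a finite population."""
    fpc = (population - sample) / (population - 1)  # finite population correction
    return Z95 * math.sqrt(P * (1 - P) / sample * fpc)

print(round(required_sample(1816)))          # ~317 assessors needed at 95% / 5%
print(round(required_sample(2000)))          # ~322 if the population were 2,000
print(round(margin_of_error(344, 1816), 3))  # ~0.048, i.e. the 4.8% quoted for the full sample
```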

Table 5 Statistical analysis of sample and sub-sample sizes, by state/territory and accreditation status (ABSA, BDAV, none)*. For the row "95% confident answers are within +/-": ABSA 4.8%, BDAV 8.4%, none 20.6%, all study participants 4.8%.
* This includes assessors registered/licensed in the ACT and SA whether they live in that state or not. As a result, the totals for each state add up to more than the total number of participants in the study.
The statistical significance of the sample can be summarised as follows:
- The overall sample will give answers which are expected to be within 4.8% of the total assessor population's answers at a 95% confidence limit;
- The sample of ABSA members will give answers which are expected to be within 4.8% of total population answers at a 95% confidence limit;
- The sample of BDAV members will give answers which are expected to be within 8.4% of total population answers at a 95% confidence limit;
- The sample of non-accredited assessors was not statistically significant, with a confidence interval of over 20%; and
- Only the samples from NSW, VIC and WA give answers which are expected to be within 10% of total population answers at a 95% confidence limit.
The number of non-accredited assessors was difficult to assess, and the number of accredited assessors is changing as new assessors are trained and old assessors leave the industry. It was therefore important to establish how errors in the estimation of the population size would affect the number of assessors required to achieve a statistically significant sample. If there were 2,000 assessors, the number of study participants needed would increase from 317 to 322. This is still below the total number of participants, so the results from this study remain robust even if assessor numbers have been underestimated.
To determine whether differences in answers between members of different AAOs are statistically significant, the proportion of assessors with different answers would need to exceed the sum of the confidence intervals for the two AAOs. To compare the responses of ABSA and BDAV members, it may be that the sample was in error by +8.4% for BDAV members and -4.8% for ABSA members, so only differences outside the range of 13.2% could strictly be considered a statistically significant difference.
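The "sum of the confidence intervals" rule used above can be expressed as a simple check. The sketch below applies it to figures quoted elsewhere in this report (the 41% vs 28% within-0.25-star rates for ABSA and BDAV members in section 6.7, and the 43% vs 36% rates for AccuRate and BERS Pro users in Table 11); it is an illustration only, not a formal hypothesis test.

```python
def significant_difference(p_a, p_b, margin_a, margin_b):
    """Treat a difference between two sub-sample proportions as significant only
    if it exceeds the sum of the two samples' error margins (the rule used above)."""
    return abs(p_a - p_b) > (margin_a + margin_b)

# ABSA vs BDAV margins quoted above: 4.8% and 8.4%, so the threshold is 13.2%
print(significant_difference(0.41, 0.28, 0.048, 0.084))  # False: a 13-point gap is inconclusive
# AccuRate vs BERS Pro margins: 10.0% and 8.1%, so the threshold is 18.1%
print(significant_difference(0.43, 0.36, 0.100, 0.081))  # False: a 7-point gap is inconclusive
```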

The numbers of current software users are derived from information provided by the NatHERS software tool providers. This does not add up to the same number as the total number of accredited assessors because:
- Some assessors are accredited in multiple software packages;
- Some assessors work for other assessors and use their copy of the software; and
- Almost a quarter of AccuRate assessors report they have not upgraded to AccuRate Sustainability V SP1. Hearne Scientific (distributors of AccuRate) were only able to provide details of the number of users of the most current package, which did not include those who have not upgraded, so the number of assessors using AccuRate may be underestimated.
The statistical significance of the samples obtained for each tool was also evaluated. Table 6 shows the confidence interval at a 95% confidence level for the samples of assessors using each tool in this study.
Table 6 Statistical significance of samples of assessors using different NatHERS tools (total number of users, number of study participants and confidence interval for AccuRate, FirstRate5 and BERS Pro)
None of the samples meets the standard of +/- 5% at 95% confidence; however, all are significant at a 10% confidence interval. To determine whether the differences in answers between users of different packages are statistically significant, the proportion of assessors with different answers would need to exceed the sum of the confidence intervals for the two packages being compared. To compare the responses of AccuRate and BERS Pro users, for example, it may be that the sample was in error by +10.0% for AccuRate users and -8.1% for BERS Pro users, so only differences outside the range of 18.1% could be considered a statistically significant difference.

6.2 Queries
Over the duration of the data entry phase of the study, 311 enquiries were logged, requiring a total of 592 responses (including follow-up clarifications). These enquiries are broken down into the categories shown in Table 7.
Table 7 Enquiries by e-ticket type
Technical enquiries needing expert assessor response:
  House 1 AccuRate: 10
  House 1 BERS Pro: 4
  House 1 FirstRate5: 10
  House 2 AccuRate: 3
  House 2 BERS Pro: 8
  House 2 FirstRate5: 8
  House 3 AccuRate: 7
  House 3 BERS Pro: 7
  House 3 FirstRate5: 11
  House 4 AccuRate: 6
  House 4 BERS Pro: 5
  House 4 FirstRate5: 7
  Total technical: 86
Other enquiries:
  Withdraw from study: 19
  General help enquiry: 206
TOTAL: 311
6.3 Assessor demographics
This analysis includes all 547 assessors who registered for the study. This is the first time such information has been available on the assessor industry, and it provides a very useful snapshot. The following is a summary of the responses to the assessor background questions shown in section 8.2.
Assessors registering interest in this study reported that together they rate around 52,000 houses each year. The Australian Building Codes Board's (ABCB) 6 star Regulatory Impact Statement (RIS)² estimated that the number of building permits assessed using ratings would be around 71% of the 130,000 houses constructed each year. The assessments conducted by study participants therefore represent about half of all new houses assessed using NatHERS tools in Australia each year.
2 Centre for International Economics, Consultation RIS, Proposal to Revise the Energy Efficiency Requirements of the Building Code of Australia for Residential Building Classes 1, 2, 4 and 10, Australian Building Codes Board, Canberra, September 2009; see page 17 for the proportion of houses assessed using NatHERS tools, and p. 190 for numbers of new houses.
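A back-of-envelope check of the "about half" estimate above, using only the figures quoted in this section (71% of roughly 130,000 new houses assessed with NatHERS tools, and around 52,000 ratings per year reported by registrants):

```python
houses_built_per_year = 130_000
share_assessed_with_nathers = 0.71
ratings_reported_by_registrants = 52_000

assessed_with_nathers = houses_built_per_year * share_assessed_with_nathers  # ~92,300 houses
print(ratings_reported_by_registrants / assessed_with_nathers)               # ~0.56, i.e. roughly half
```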

While the most frequent response for number of houses assessed per year was under 10, the average reported number of assessments per registrant was 90 per year, or about 2 per working week. Two-thirds of assessors are self-employed and only one-quarter do ratings as their main business. Of self-employed assessors, almost half are building designers or architects. The average assessor has been doing NatHERS assessments for 5.5 years and was last trained 2.5 years ago. FirstRate5 was the most commonly used package, and the largest number of assessors in the study came from Victoria. 80% of the study sample came from Victoria, NSW or WA. ABSA members represented 57% of registrants, BDAV members 33%, and 10% were not accredited.
6.4 Assessor Practices
The following is a summary of responses to the assessor practices questions shown in section 8.4.
- Assessors reported spending on average 22.5 minutes checking their rating for accuracy once the assessment was completed.
- Assessors reported an average time to rate the houses in this study of 3 to 5 hours, depending on the tool used.
- The most common approaches to checking ratings were using a formal procedure (33%), using a rating checklist (28%) and following an informal procedure (22%).
- Almost half of assessors do not, or only rarely, seek feedback from their clients on their performance.
- When presented with a complex house that is beyond their current experience, just over half of assessors would proceed with the rating without obtaining any advice from a more experienced assessor.
- Only 1% of assessors have never had any building site experience, while 63% have either supervised construction, inspected or approved construction, or worked as a tradesperson or labourer on site.
- Around one in five assessors report some difficulty in visualising a building in 3D from 2D plans.
- 69% of assessors report some difficulty in finding appropriate product thermal performance properties in trade literature. On average, assessors keep trade literature for 22 different products in their professional library.
- The average age of an assessor was 44, and 71% of assessors are male.

6.5 Star Ratings
There was a wide spread of results in star rating accuracy. Figure 1 shows the proportion of the overall sample within various ranges of star rating error. Note that BERS Pro only reports half stars, not decimal stars as the other two tools do. This will slightly skew results to show a greater level of error.
Figure 1 Frequency distribution of star rating errors, showing the difference between assessor ratings and the correct answer
One third of the sample was able to obtain a rating that was within a quarter of a star of the correct result. The average of all rating errors was close to zero: as can be seen in Figure 1, the number of assessors rating too high was greater than the number rating too low, but the size of the errors for those who rated too low was greater than for those who rated too high, so on average the errors cancel out.
Table 8 presents this information in another way: it shows the percentage of ratings within specific star rating error levels, e.g. 20.6% of assessors in the sample obtained the correct rating. At star differences under 0.5 stars the level of inaccuracy for BERS Pro is slightly overstated because BERS Pro only reports ratings in half stars, not decimal stars as reported by AccuRate and FirstRate5. To better reflect the results on a similar basis for all software, Table 9 shows the error ranges down to half a star.

Table 8 Proportion of ratings at various star rating error levels (error range from/to, % of sample)*
* A negative range indicates that the rating was higher than the correct answer.
Obtaining the correct rating does not necessarily mean that the rating was correct in all aspects. If the heating energy load is in error by around the same amount as the cooling energy load, the overall star rating will still be correct.
Table 9 Proportion of ratings within various star rating error bands
Error in star rating      Cumulative
Within 0.25 stars*        37%
Within 0.50 stars         58%
Within 0.75 stars*        70%
Within 1.00 stars         77%
Within 1.50 stars         86%
Within 2.00 stars         91%
More than 2.00 stars      100%
* For 0.25 and 0.75 stars, the level of inaccuracy is slightly overstated as BERS Pro only reports ratings in half stars, not decimal stars as reported by AccuRate and FirstRate5.
House plan complexity
The rating errors found in this study increased with the complexity of the house and documentation. House 1 was a volume builder single storey house; house 2 a volume builder two storey house with a complex roof configuration and mixed wall types; house 3 a small 3 bedroom townhouse where documentation was provided for the entire development and not just the dwelling to be rated; and house 4 an apartment on the top (15th) floor. Assessors were far better at rating house 1 than the other houses in the study, as shown in Table 10: 57% obtained a rating within 0.25 stars of the correct answer and 95% within 1 star. As the complexity of the houses and documentation increased, the level of accuracy reduced. Assessments were within a quarter of a star in only 41% of cases for house 2, 26% for house 3 and 19% for house 4.

Table 10 Proportion of ratings within various star rating error levels

Error in star rating      House 1    House 2    House 3 (Townhouse)    House 4 (Apartment)
Within 0.25 stars         57%        41%        26%                    19%
Within 0.50 stars         77%        70%        45%                    32%
Within 0.75 stars         94%        80%        62%                    41%
Within 1.00 stars         95%        90%        74%                    46%
Within 1.50 stars         97%        95%        91%                    57%
Within 2.00 stars         100%       97%        95%                    72%
From +3.8 to -5.4 stars   100%       100%       100%                   100%

Assessors clearly found rating an apartment very difficult, with less than half the sample obtaining a rating within one star of the correct result for house 4. Error levels for houses 2 and 3 show that assessor error levels increased as the complexity of the house and documentation increased.

6.6 Software tools

Errors in star rating for the three NatHERS tools were different. Table 11 shows the proportion of assessors within 0.25 and 0.75 stars for each software package.

Table 11 Accuracy of assessors using different NatHERS tools

                      AccuRate    BERS Pro    FirstRate5
Within 0.25 stars     43%         36%         32%
Within 0.75 stars     78%         72%         62%

Some care must be taken interpreting these results as the sub-samples are not representative at a 95% confidence level with a 5% confidence limit. The differences in the level of accuracy between AccuRate and FirstRate5 are similar in size to the error margin (confidence limit) of the samples, so in the total population the differences could be negligible.
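The error margin referred to above can be approximated from the sub-sample sizes using the standard normal approximation for a proportion. The sketch below is illustrative only: the per-tool sample sizes shown are assumptions, not figures restated from this study.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% confidence limit for a proportion p observed in a sample of n assessors."""
    return z * math.sqrt(p * (1 - p) / n)

# Proportions within 0.25 stars are from Table 11; the sample sizes are assumed for illustration.
for tool, p, n in [("AccuRate", 0.43, 75), ("BERS Pro", 0.36, 120), ("FirstRate5", 0.32, 150)]:
    print(f"{tool}: {p:.0%} within 0.25 stars, confidence limit about +/-{margin_of_error(p, n):.1%}")
```

With sub-samples of roughly this size, the confidence limits are in the order of +/-7 to 11 percentage points, which is comparable to the gaps between the tools in Table 11 - hence the caution above.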

6.7 AAOs

The differences in star rating errors between assessors with different types of accreditation were also examined. Figure 2 shows the star rating errors for ABSA members, BDAV members and unaccredited assessors.

Figure 2 Rating errors by type of accreditation

Note that the small number of assessors who are not accredited means that this sub-sample is not statistically significant, so comparisons may not reflect actual performance in the field. ABSA assessors achieve a rating within 0.25 stars at a significantly higher rate (41%) than BDAV members (28%) or unaccredited assessors (32%). However, at a limit of 0.75 of a star the results are much closer, though ABSA assessors are still more accurate (70% within 0.75 stars) than BDAV (63%) and unaccredited assessors (59%). The difference in error levels observed at 0.25 stars between ABSA and BDAV assessors is around the same as the sum of the error margins for the two samples; the actual difference could be negligible if the true population values sit at opposite ends of their respective confidence intervals.

This difference in accuracy between ABSA and BDAV members may be due more to the software used than to the performance of the assessors from each AAO: 81% of BDAV members use FirstRate5 while only 22% of ABSA members do. In section 6.6 above, FirstRate5 was found to have a higher rate of error in ratings compared to AccuRate. Note that the average proportion of questions answered correctly was virtually identical between the two AAOs.
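To see how the software mix could account for part of the gap, the per-tool accuracies in Table 11 can be weighted by each AAO's tool usage. The FirstRate5 shares (81% for BDAV, 22% for ABSA) are reported above; the split of the remaining members between AccuRate and BERS Pro is an assumption, so the sketch below is indicative only.

```python
# Proportion of assessors within 0.25 stars for each tool (Table 11).
within_025 = {"AccuRate": 0.43, "BERS Pro": 0.36, "FirstRate5": 0.32}

# Tool shares by AAO: FirstRate5 shares are from the text; the remainder is assumed
# to be split evenly between AccuRate and BERS Pro.
tool_shares = {
    "BDAV": {"FirstRate5": 0.81, "AccuRate": 0.095, "BERS Pro": 0.095},
    "ABSA": {"FirstRate5": 0.22, "AccuRate": 0.39,  "BERS Pro": 0.39},
}

for aao, shares in tool_shares.items():
    expected = sum(share * within_025[tool] for tool, share in shares.items())
    print(f"{aao}: tool mix alone would predict about {expected:.0%} within 0.25 stars")
```

Under these assumptions, the tool mix alone would put ABSA several percentage points ahead of BDAV (roughly 38% versus 33%), which supports the suggestion that at least part of the observed difference is a software effect rather than an assessor effect.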

6.8 Question scores

On average, assessors gave correct answers for two-thirds of the questions. Figure 3 shows the proportion of assessors achieving various score levels.

Figure 3 Overall proportion of questions answered correctly across the sample

The data items covered by each question do not have similar impacts on the star rating. For example, the impact of correctly calculating the area of uninsulated ceiling due to penetrations is different to the impact of the orientation of the dwelling. As a result, there was little correlation observed between the question score and the star rating. Assessors made mistakes across all data entry areas. Only 4% of the assessor sample provided correct answers to 80% or more of the questions. During the rating phase, queries received from assessors showed that some questions were open to misinterpretation; in these cases data was extracted directly from the rating files.

6.9 Net Conditioned Floor Area (NCFA)

The basis of the energy rating is the total energy load of the house divided by the NCFA. If the assessor makes mistakes in calculating the NCFA, the star rating will also be incorrect. The three main errors associated with estimating the NCFA were:

- incorrect data entry for zone sizes;
- heating and cooling zones which should not be heated and/or cooled; and/or
- failing to apply heating and cooling to zones which should be heated and cooled.
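Because the area-adjusted load is simply the total load divided by the NCFA, an error in the NCFA flows directly into the figure the star bands are applied to. The sketch below uses hypothetical loads and areas; in practice an NCFA error usually arises from mis-zoning, which also changes the total load, so this shows only the direct arithmetic effect.

```python
def area_adjusted_load(total_load_mj, ncfa_m2):
    """Energy load per square metre of conditioned floor area (MJ/m2)."""
    return total_load_mj / ncfa_m2

correct     = area_adjusted_load(40_000, 200.0)  # hypothetical house: 40,000 MJ over 200 m2 NCFA
understated = area_adjusted_load(40_000, 195.0)  # the same house with NCFA understated by 2.5%

print(f"correct: {correct:.1f} MJ/m2, with a 2.5% NCFA error: {understated:.1f} MJ/m2")
# A 2.5% understatement of NCFA inflates the MJ/m2 figure by about 2.6%, which for
# the houses in this study corresponds to a shift of roughly 0.1 stars.
```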

Figure 4 shows the distribution of errors in estimating the NCFA of the dwellings, expressed as a percentage of the correct NCFA across the sample.

Figure 4 Errors in estimating Net Conditioned Floor Area (NCFA)

An error in the estimation of the NCFA of up to 2.5% would not have a major impact on the star rating of the house (around 0.1 stars for the houses in this study). 64% of the ratings were within 2.5% of the correct NCFA and 81% fell within 5% of the correct NCFA. This means that almost one in five assessors was making significant errors in calculating the NCFA. Checking the rating files for houses 1 and 2 showed that a major source of the errors in estimating the NCFA was assessors not applying conditioning to zones consistently with Technical Note 1. BERS Pro and FirstRate5 users import the plan, scale it and trace over it to define the zones. There was some concern that incorrect scaling would lead to significant errors; however, no evidence of poor application of scaling in BERS Pro or FirstRate5 could be found. No significant differences were found in the estimation of NCFA between the three software packages. A small systematic difference in the estimation of NCFA for BERS Pro and FirstRate5 was found when developing the solution sets; however, this is not shown in Figure 4 because the graph shows areas relative to the correct answer for each tool. Because FirstRate5 and BERS Pro do not subtract the area of floors in upper storeys above walls in rooms below, some significant differences were identified. Table 12 shows the NCFA calculated by each tool in the solution set files.

Table 12 NCFA calculated by each software tool (AccuRate, BERS Pro and FirstRate5) for each house in the master rating files

In calculating NCFA, errors were greatest in house 2 (a two storey house) because fewer walls on the lower floor lined up with walls on the upper floor, so this house has a greater floor area above ground floor walls.

6.10 Zoning

Zoning refers to the allocation of spaces within the building to particular hours of occupancy and use of heating and cooling. Different types of zones can have significantly different energy loads. The energy loads of bedroom/night time occupied spaces are usually much lower per square metre than living/day time spaces due to differences in the time of occupancy and thermostat settings. If heating or cooling is turned off in a zone, its performance has little impact on the rating. NatHERS Technical Note 1 was specifically designed to improve assessor practices with regard to zoning. AAOs have previously reported significant errors with zoning in QA of assessments: 32.5% of BDAV QA checks and 6.7% of ABSA QA checks found errors in zoning. Common zoning errors included combining too many rooms into one zone (which effectively breaks the cross ventilation model), turning off heating and cooling to zones which should have been conditioned, and allocating living and circulation spaces incorrectly to night-time occupancy.

This study found that assessors were making significant errors in the number of zones and the allocation of occupancy to zones. These errors were similar to those that have previously been reported. Zones were inappropriately combined, and error rates in the allocation of zones to the 'other day time conditioned' category were between 86% and 100% depending on the house. The average ratings of assessors who allocated a number of zones within 1 of the correct number were closer to the correct rating than those who had too few or too many zones. The energy ratings of assessors who had too few zones were significantly higher than those who had too many zones.

The high error rate observed raised issues regarding assessor interpretation of the questions. To ensure that assessor interpretation issues were not clouding the results, the rating files were further examined to investigate zoning practices. Table 13 shows the extent of incorrect zoning found in houses 1, 2 and 4 by software type. House 3 was omitted from this analysis to save time - manually checking rating files is time consuming - and because house 3 did not present any unique zoning issues that were not covered by the other houses.

Table 13 Proportion of assessors with incorrect zoning of houses 1, 2 and 4 by software type*

Software      House 1    House 2    House 4
AccuRate      47%        88%        61%
BERS Pro      74%        92%        53%
FirstRate5    60%        81%        65%

* Zoning was further examined for houses 1, 2 and 4.

6.11 Site exposure

The site exposure factor used in NatHERS tools was taken from the AS Wind Loads for Housing. It provides an estimate of the impact of the surrounding terrain on wind speed. For example, a house on the edge of a cliff is exposed to much higher winds than a house in the inner city. While error rates were low for houses 1 to 3, around 41% of assessors allocated the wrong exposure factor to house 4, an apartment on the top floor of a 15 storey building. An apartment at this height should always be allocated to the 'exposed' category; setting this factor to 'suburban' (the most common error) would increase the rating by around half a star.

6.12 Orientation

Orientation can make a substantial difference to the rating, depending on the concentration of windows in each orientation. Windows on different orientations receive different amounts of solar radiation in summer and winter. House 1 has a fairly even distribution of windows on all sides, but orientation can still affect its rating by half a star. The rating of house 4, which has windows on only two opposite sides, can vary by as much as 2.4 stars at different orientations. The study found that 42% of assessors did not enter the correct orientation of the house. The question used to determine the correct orientation asked the assessor to nominate the orientation of the wall containing the front door. This seemed fairly simple, but inspection of the rating files for house 1 showed that actual orientation error rates were much lower than those found in assessor responses. Allowing for assessor misinterpretation of the question, the error rate for orientation was around 11%.

6.13 Windows and skylights

Significant proportions of the sample made errors in the reporting of data relating to windows and skylights. House 3 included three skylights on the upper floor; however, around 22% of assessors rating house 3 did not identify that there were skylights in the house. The total window area reported by assessors was also evaluated. This analysis showed that, on average across the houses, 81% of assessors entered window areas within 5% of the correct area. Assessor estimates of total window area were also examined by house, to see whether errors in the estimation of window area were associated with particular houses. Table 14 shows the proportion of assessors who entered window areas within 5% of the correct window area for each house.

Table 14 Proportion of assessors with estimates of window area within 5% of the correct area, by house

Assessors had particular difficulty entering correct window areas in houses 3 and 4. House 4 errors were particularly large, due in part to the large area of glass in the shared corridor zone, which Technical Note 1 requires to be created. A number of assessors did not create this corridor zone at all, and in these cases the window area entered would be much lower than the correct answer. The error rate associated with estimating window areas was relatively high given that all plans included window schedules with the height and width of windows clearly shown.

6.14 Eaves

During the rating file checks for house 1, it was observed that the eave offset - the distance between the top of the wall and the underside of the eave - had a high error rate. Table 15 shows the proportion of assessments with the wrong eave offset in the three software packages for house 1. The amount of the offset for house 1 was small, so the impact of the error on the rating was also small. The error rate for AccuRate and BERS Pro users was particularly high.

Table 15 Ratings with incorrect eave offset by software type for house 1

Software      Ratings with incorrect eave offset
AccuRate      89%
BERS Pro      79%
FirstRate5    37%

6.15 Overshadowing by surrounds

A high level of errors was found in assessor responses to questions about overshadowing. To better understand the high proportion of incorrect data entered for overshadowing, the data entered by assessors was examined directly in the rating files. This showed even more errors than the answers to the questions suggested. Table 16 shows the extent of errors in modelling overshadowing objects found in the rating files for house 1.

Table 16 Proportion of ratings with incorrect modelling of obstructions for house 1

Software      Ratings with incorrect modelling of obstructions
AccuRate      74%
BERS Pro      100%
FirstRate5    76%

Typical errors included:

- Only modelling the effect of the closest wall of the adjacent building and not the ridge or the fence;
- Failure to model an obstruction on the adjacent lot where no house was shown on the lot but this was required by the Technical Note;
- Incorrectly calculating the height of the obstruction by not deducting the relative level of the dwelling being modelled;
- In FirstRate5, not connecting the obstruction screen to the wall;
- Not taking into account the height of the dwelling floor level above ground when calculating the height of the obstruction; and
- Incorrectly measuring the distance from the dwelling to the obstruction.

6.16 Air leakage sites

Assessors showed high error rates in defining the number of air leakage sites. In particular, 71% of assessors entered vented down lights in house 3 when there were actually none, because the down lights used were sealed and therefore had no air leakage. The model number of the down light was given, and assessors had to research the down light on the internet to discover whether it allowed air leakage or not. The proportion of assessors correctly entering whether doors were weather-stripped was also low: between 35% and 60% of assessors, depending on the house, correctly identified whether the doors were weather-stripped. Error rates were much lower for other air leakage sites such as exhaust fans and weather-stripping of windows.

6.17 Ceiling penetrations

Technical Note 2 was written to help assessors estimate the area of uninsulated ceiling caused by ceiling penetrations, to ensure that the higher heat losses through these areas of ceiling are properly considered. Depending on the house, only between 3% and 29% of assessors estimated the area of ceiling penetrations within 20% of the correct answer.

6.18 Walls

Assessors were asked to nominate the area of walls in each house allocated to each type of wall construction used in the house. A number of significant errors were found. Table 17 shows the proportion of assessors who reported a wall area for each type of wall that was within 10% of the correct answer.

Table 17 Errors in the identification of wall constructions (proportion of assessors with wall area within 10% of the correct answer)

Type of wall                       House 1    House 2    House 3    House 4
Area BV + ins + wrap correct       90.3%      62.5%      23.4%      NA
Area BV and ins correct            57.0%      87.5%      NA         NA
Area BV no ins correct             41.9%      75.0%      NA         NA
Area WB correct                    NA         69.3%      NA         98.8%
Area BC correct                    75.3%      90.9%      94.8%      100.0%
Area FC sheet correct              NA         37.5%      NA         NA
Area Ins Hebel correct             NA         NA         51.9%      NA
Area Hebel correct                 NA         NA         20.8%      65.1%
Area CSR 360 correct               NA         NA         49.4%      NA

BV: Brick Veneer; ins: insulation; WB: Weatherboard; BC: Brick Cavity; FC: Fibre Cement.

Some of the errors identified may not be as large as they first appear. Assessors may have failed to distinguish between a brick veneer wall with bulk insulation and wall wrap and a brick veneer wall with bulk insulation only; confusing the two types has a small impact on the rated performance of the house. As a result, assessors may have got the total wall area correct but misallocated the wall types. In addition, some of the wall areas were quite small, so a large percentage error may equate to only a few square metres of wall, which again would have little impact on the star rating.

Walls in BERS Pro

Where a combination of foil and bulk insulation is used in walls, assessors using BERS Pro must adjust the R value of the bulk insulation if the insulation fills the cavity between the foil and the internal wall lining, to allow for the elimination of a reflective air space. Study participants using BERS Pro were asked to nominate the wall insulation used and, given the high number of errors, all rating files for house 1 were further examined to check what assessors had actually entered. This investigation confirmed the mistakes noted in the answers to questions, with 91% of BERS Pro assessors not adjusting the R value of the bulk insulation to allow for the loss of the reflective air space, despite the fact that this is reinforced in BERS Pro training. This does not lead to a significant rating error (less than 0.1 stars); however, it is clear that most BERS Pro users do not understand this data entry technique.

6.19 Roofs

The study did not specifically check that assessors entered the correct ceiling insulation, as this was clearly specified on the plans. However, the review of house 1 showed that a small number of assessors (between 5% and 9% depending on software type) did not enter the R value as shown on the plans or, in the case of BERS Pro, made an allowance to the ceiling R value for uninsulated areas of ceiling due to ceiling penetrations when no ceiling penetrations existed. Study participants were asked to nominate the type of roof material and the solar absorptance of the roof material. Plans contained references to the manufacturer, material and colour of the roof material, and assessors had to find out the solar absorptance by researching this on the internet 3.

While virtually all assessors got the material of the roof correct, assessors were less successful at determining the roof solar absorptance. Table 18 shows the proportion of assessors who entered correct answers for roof type and roof solar absorptance. While the majority of assessors did not correctly determine the solar absorptance of the roof, the extent of error in the solar absorptance itself was not large.

Table 18 Assessors correctly answering questions about roof material and roof solar absorptance

House      Roof material    Roof solar absorptance
House 1    -                38.7%
House 2    -                14.8%
House 3    -                50.6%
House 4    -                44.2%

6.20 Floors

Rating errors for floors mainly related to the modelling of waffle pod slab floor constructions and the areas of floor coverings. Table 19 shows the proportion of assessors who correctly entered the waffle R value (1.0) and the concrete thickness of the top layer of the waffle slab (85 mm).

Table 19 Assessors correctly answering questions about waffle pod slabs

House      Waffle R value    Waffle thickness
House 1    -                 68.8%
House 2    -                 61.4%

Around two thirds of assessors correctly answered the waffle pod slab thickness question, and around three quarters entered the correct waffle pod R value. There is a variety of information available to assessors regarding waffle pod slab construction, and it provides different answers for the R value that should be used when modelling waffle pods: a report for Foamex prepared by CSIRO and advice on the BERS Pro website about how to calculate an effective R value for waffle pods give different answers to the Technical Notes. The fact that there were some errors was therefore not surprising.

Study respondents were also asked to nominate the area of floors with different types of floor covering and to note the area of any floor which was above outdoor air, that is, not above ground, a subfloor or another room or dwelling. Table 20 shows the percentage of assessors whose answers were within 10% of the correct answers.

3 The website for one of the roof materials had changed since the materials for the study were prepared, and for one of the houses the specific tile solar absorptance listed on the web site no longer made a clear distinction between the solar absorptance of the terra cotta and concrete tiles. While the FAQ on the Benchmark web site was updated to provide guidance, some assessors may have entered incorrect information.

Table 20 Assessors who rated floor area type correctly (% of answers within 10% of the correct area)

House      Carpet    Floating timber    Ceramic    Above outdoor air
House 1    -         98.9%              86.0%      65.6%
House 2    -         55.7%              59.1%      71.6%
House 3    -         96.1%              53.2%      45.5%
House 4    -         89.5%              68.6%      75.6%

Floor coverings were shown in the plans, with a separate drawing allocated to showing the type of floor covering in each room. Given the clarity of the data provided, the errors in estimating floor covering areas were high.

7 Recommendations

7.1 Minimum training and professional development standards for NatHERS assessors

7.1.1 Assessor Qualifications

The extent of rating errors made by assessors observed in this study demonstrates that the previous approach to NatHERS assessor training has not been adequate. The Certificate IV in NatHERS Assessment has recently been developed and implemented as the qualification for NatHERS assessors, and the extent of errors found in this study highlights the need for such a course. The extent of errors in rating outcomes was found to be independent of the number of years an assessor had been undertaking NatHERS ratings, or how recently they had been trained, demonstrating that even experienced assessors require more training. It is recommended that all governments require assessors to be qualified to the Certificate IV standard. Educational institutions that are developing Certificate IV course resources should use this study as an aid to bridge skill gaps.

7.1.2 Compulsory Continuing Professional Development (CPD) for accredited assessors

CPD is compulsory for NatHERS accredited assessors, and all accredited assessors will be required to upgrade their level of competency to the Certificate IV in NatHERS Assessment by 1 July. In the interim there is much that can be done to address the rating errors found in this study through CPD. CPD programs should be created to address the issues which have been revealed by this report. The priority areas based on this study should be zoning, rating techniques for more complex designs, reading house documentation, rating apartments, rating two-storey dwellings, entering overshadowing details, understanding how to derive thermal performance data from trade literature, taking a systematic approach to house rating, and checking ratings.

AAOs award CPD points for each event attended, and assessors can select which CPD events they attend. Some of ABSA's CPD includes testing to ensure that attendees understand the information presented, but this is generally not the case for BDAV. CPD on the new Technical Notes was delivered for both AAOs by the NatHERS administrator, but it is clear from this report that this did not lead to a high level of understanding.

Not all assessors attended these CPD seminars. To ensure that CPD is effective in raising assessor skills, two recommendations are made. Firstly, CPD should include an assessment element so that it can be demonstrated that assessors have gained skills from their attendance; a minimum number of CPD sessions which include assessment should be required each year. Secondly, CPD sessions on topics which are vital to the way assessors undertake their work, such as those for the Technical Notes or important software updates, should be compulsory. At the time of writing this report, around 30% of BERS Pro and AccuRate users and 50% of FirstRate5 users had received no training or briefing in the new features of the current tools.

7.1.3 Compulsory national accreditation

The extent of rating errors found in this study shows that a consistent national approach to addressing NatHERS assessor standards is very much needed. All assessors who own NatHERS software received several invitations to participate in the study; however, only 22 non-accredited assessors took part. This is only 5.8% of the estimated population of non-accredited assessors, compared to the 29.4% of accredited assessors who took part in this study. Non-accredited assessors are estimated to represent around a quarter of all assessors in the field. They have little incentive to take part in any programs to improve their rating skills.

Because the number of non-accredited assessors who participated was low, no statistically significant differences were observed between non-accredited and accredited assessors. While non-accredited assessors did have lower average question scores and slightly larger average rating errors, the differences were not statistically significant.

This study has found that high levels of errors are being made in the field: only one in five assessors obtained the correct rating, and around one in four ratings were out by more than 1 star. If accredited assessors cannot reach consistently high levels of accuracy in their assessments, with CPD requirements, adherence to a Code of Professional Conduct and QA activities, then it is fair to assume that the accuracy and consistency of assessments by non-accredited assessors is of greater concern. Accredited assessors at least work in an environment which can address these issues; non-accredited assessors do not.

It is therefore recommended that accreditation of assessors be required by all governments to provide a mechanism to improve the accuracy and consistency of NatHERS assessments. This will help to raise standards in the industry and manage risk for those states and territories that currently have no requirements for assessors to meet best practice standards. Further, the more assessors there are accredited, the greater the resources AAOs have to service their members and the more competitive AAOs will become. The lack of universal accreditation of NatHERS assessors is holding back the development of the assessor industry and means that governments cannot be certain that compliance is being met.

7.1.4 Technical Notes 1 and 2

This study shows that many assessors do not understand the Technical Notes correctly, or simply do not use them. Non-accredited assessors may not use them at all. This affects consistency of application of the NatHERS software tools and therefore the consistency and accuracy of assessments.

It is not possible to have consistent outputs if there are variances in the application of the complex NatHERS software tools. Recommendations for improvements include:

- Providing detailed examples of all parts of the Technical Notes to improve assessors' understanding;
- Building more of the principles of the Technical Notes into the software:
  o Assessors made many mistakes in estimating the uninsulated area of ceilings due to ceiling penetrations. Building this calculation into the software would help to avoid data entry error.
  o Assessors also made a large number of mistakes in estimating the site exposure to wind. Asking key questions about the height of the floor above ground and the extent of development in the surrounding area would help to make the selection of the exposure category more objective.
  o Entering overshadowing obstructions also saw assessors make many mistakes. Software could be upgraded to remind assessors to take into account the side fences and roof ridges of neighbouring buildings - which were found to be rarely entered - as well as the closest adjacent wall.

All states should require use of the Technical Notes to ensure that assessors are using the tools consistently. This would be a natural consequence of requiring compulsory national accreditation.

7.1.5 Quality Assurance improvements

One of the main functions of a NatHERS AAO is to provide a system of best practice, which includes CPD, QA measures and a Code of Conduct. This study highlights that assessors do not correctly apply the NatHERS Technical Notes. It is therefore recommended that assessors are provided with further training around Technical Notes 1 and 2 and are assessed to measure competence.

The level of errors found in this study greatly exceeds the level of errors found by AAO QA checks. For example, 32.5% of BDAV QA checks and 6.7% of ABSA QA checks found errors in zoning, while between 47% and 92% of ratings were found to have errors in zoning in this study, depending on the house and tool used. Discovering errors is harder for AAOs than it was in this study, as AAOs have to find errors in rating files without having a solution set. In addition, the documentation provided to assessors for this study met all the requirements of Technical Note 1; this standard of documentation is not regularly achieved in the field, and this may make it more difficult for AAOs to check ratings thoroughly. Nevertheless, the very large differences found are of concern. It is recommended that AAOs upgrade their QA procedures in light of the findings of this report. A common QA procedure for AAOs would also help deliver consistent outcomes across the country.

Improvements to QA systems only improve standards for those who are covered by the system. Non-accredited assessors represent about 25% of assessors and have no way of knowing whether their work is accurate or not, and no framework to help them improve accuracy or consistency.

7.2 NatHERS tool improvements

7.2.1 Incorporating Technical Note principles

Assessors are clearly having trouble interpreting Technical Note principles; for example, errors in zoning were found in 47% to 92% of ratings, depending on the house and tool used. The section above on the Technical Notes highlights three areas where better quality outcomes could be achieved by having the software ask key questions and then determine the appropriate input automatically, lowering the risk of error.

7.2.2 Allocation of rooms to zones and selection of occupancy for zones

A large proportion of the sample incorrectly zoned the dwellings. Simply removing the ability of assessors to turn heating or cooling on or off individually would eliminate a number of errors: if a space is conditioned, there are no circumstances under which Technical Note 1 allows a zone to be only heated or only cooled. The allocation of 'other day time' and 'other night time' occupancies to zones is an area where significant errors were made; by contrast, living, kitchen and bedroom zone allocation was reasonably accurate. If a zone is not a bedroom, living area, garage or kitchen, then it will come under one of the 'other' categories. To improve the allocation of zone types for these rooms, software could be modified to ask a short series of questions to establish the nature of the occupancy and make the appropriate selection for the assessor.

7.2.3 Modelling floor height above ground

Around 70% of assessors did not enter the correct height above ground for house 1, and many left the height at 0. Assessors should not be allowed to leave the height above ground at 0 and could be prompted by the software to enter a non-zero value. Assessors also made a number of mistakes entering overshadowing obstructions, partly because they did not allow for the floor height above ground. By automatically allowing for the impact of floor heights at different levels on the effective height of an obstruction, a number of errors would be eliminated. In AccuRate this might be implemented by entering the obstruction details in the same fashion as a horizontal shading schedule; in FirstRate5, obstructions drawn on one level could be made selectable on all levels. Selecting the correct height above ground can be very difficult on sloping sites. Again, providing a series of questions in NatHERS tools about the site and its levels would help to avoid errors.

7.2.4 Modelling waffle pods

There was a high error rate in modelling waffle pod slabs. Around a quarter of assessors got the added insulation R value wrong and around one third got the thickness of the slab wrong. For complex construction elements like this, where the data input is already a simplification of the actual construction, it would be preferable to take the data entry out of the hands of the user. Ideally assessors should only nominate what they can see on the plans - in this example, the fact that it is a waffle pod and the thickness and type of the waffle pod itself - and the software should set concrete thicknesses and waffle R values from there.
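One way the tools could take this data entry out of assessors' hands is a simple lookup keyed on what is actually visible on the plans. The sketch below only illustrates the idea: the pod types and the mapping to topping thickness and added R value are assumptions for illustration, not values prescribed by the Technical Notes (the study houses used an 85 mm topping and an added R value of 1.0).

```python
# Hypothetical library: the assessor nominates only what the plans show (pod material and depth);
# the software supplies the modelling simplification (concrete topping thickness and added R value).
WAFFLE_POD_LIBRARY = {
    # (pod material, pod depth in mm): (concrete topping in mm, added R value)
    ("EPS", 225): (85, 1.0),
    ("EPS", 300): (85, 1.0),  # assumed to use the same simplification for a deeper pod
}

def waffle_slab_inputs(pod_material: str, pod_depth_mm: int):
    """Return the (topping thickness, added R value) the tool should apply for a nominated pod."""
    return WAFFLE_POD_LIBRARY[(pod_material, pod_depth_mm)]

print(waffle_slab_inputs("EPS", 225))  # -> (85, 1.0)
```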

7.2.5 Modelling in FirstRate5

A number of errors in obstruction modelling by assessors using FirstRate5 were caused when assessors failed to connect the obstruction to a wall. Automatic allocation of shading screens would eliminate a number of errors. FirstRate5 also had a significantly different NCFA to AccuRate for house 2. This was because AccuRate excludes the floor area of internal walls under the upper room floor areas, while FirstRate5 does not; BERS Pro shows a similar difference. Consistent application of floor area calculations should be used across all NatHERS tools.

7.2.6 Modelling in BERS Pro

For walls which contained both reflective foil and bulk insulation, it was found that many assessors did not adjust the bulk insulation R value in BERS Pro to allow for the fact that the insulation removed a reflective air space. A simple check box to indicate whether the inner facing air space has been maintained would eliminate a number of errors. BERS Pro also had a significantly different NCFA to AccuRate for house 2. This was because AccuRate excludes the floor area of internal walls under the upper room floor areas, while BERS Pro does not; FirstRate5 shows a similar difference. Consistent application of floor area calculations should be used across all NatHERS tools.

7.2.7 Version of tool

To maintain accuracy of ratings across the NatHERS assessor industry, it is important that assessors use the correct version of software. Where a new version changes data input procedures or affects rating outcomes, assessors need to be appropriately informed about these changes. Software update procedures for all software tools seem to be inadequate, at least in the lack of training or briefing provided for new versions. It is recommended that regulatory authorities ensure that building surveyors do not accept certificates produced with out-dated software tool versions.

7.3 New resource development

7.3.1 Trade literature

There are many products available which affect NatHERS ratings. Around 75% of the study cohort reported that they had at least some difficulty extracting relevant thermal performance data from trade literature - for example, how to determine the correct R value or construction type to use for reflective foil products. Questions that required some research into product features showed significant error rates. Two strategies were suggested to overcome this:

- Developing standards for industry trade literature to help the building products industry ensure their literature meets the needs of NatHERS assessors. For example, literature for reflective foil products often shows the total R value of an element insulated using the product; this is useful for assessing compliance with elemental provisions, but provides no help to assessors on how to enter the product into rating software, which may require information on the emissivity of surfaces, the width of the air gaps they face, the extent of dust coverage assumed on upward facing surfaces and so on; and
- Working with the building products industry to provide more convenient access to the full range of literature that assessors need. On average, assessors already keep information about 22 products in their professional libraries. This would be an ideal task to be led by AAOs.

7.3.2 Rating procedures

The key to fast, reliable ratings is to be systematic. In the NatHERS industry all assessors have had to develop their own rating systems, and assessors report that they use a variety of different approaches to checking data. Many building industry professions have developed Practice Notes covering the administrative side of their work; the rating industry would benefit from a similar approach. A variety of resources could be developed, including:

- checklists to ensure all required rating inputs have been entered;
- rating procedures to help assessors take a systematic approach to their ratings and so ensure consistent results; and
- procedures to facilitate easy checking of data.

7.3.3 A manual explaining data entry for NatHERS rating tools

This would be a comprehensive resource explaining the essentials of how to rate a house in detail, and how to approach the full range of situations the assessor will face in the field. Such a resource would ideally be full of worked examples so that assessors could relate the examples given to the dwelling they are rating. A number of formats would be suitable: a reference book, a CD/DVD or a website. This would provide an invaluable resource for assessors to address issues that arise with more complex houses. It is impossible to remember the appropriate rating technique for every situation, as most dwellings only require a few of the more complex rating techniques; a resource that assessors could refer to for guidance would improve the accuracy of ratings. Assessors have shown a reluctance to seek advice from more experienced assessors: over 50% of assessors do not seek advice from other, more experienced assessors when confronted with a dwelling which is beyond their area of expertise. More experienced or higher volume assessors also showed no greater level of accuracy (on average) than other assessors, and there are very few mentors or experts in the field who can provide sound advice.

7.4 Monitor progress and future studies

This study has highlighted considerable areas for improvement which could increase the consistency and accuracy of assessments by NatHERS assessors. As the Certificate IV becomes more fully integrated into the industry, NatHERS tools continue to improve, and governments' commitment to CPD and QA increases, a further study of this kind will be useful to monitor, measure and evaluate changes and progress. Recommendations to assist in the next study follow.

7.4.1 Simplifying data extraction for assessors and reducing the time needed for analysis

Extracting data from the rating file was a time consuming process for assessors and, because the data was incomplete, there were challenges for analysis. Issues were resolved through inspection of the rating files, which was resource intensive. It was also difficult to ensure that questions were completely unambiguous. A tool which automatically extracted key data from the rating file and saved it in a spreadsheet compatible format would free assessors from having to extract data themselves and resolve issues over the interpretation of questions, eliminating one source of possible error.

It would also significantly reduce the time needed for analysis of data; a minimal sketch of this kind of extraction step is given at the end of this section. The proposed Universal Certificate will extract key building data from the rating file automatically. This tool could be modified to extract a more complete data set and store it electronically. This modification could provide a data extraction tool which would simplify future benchmark studies. It would also have application for other resource intensive tasks such as QA checks and exam marking for AAOs. By extracting all key data from the rating file, the process of comparing the assessor rating to documentation or solution sets would also be greatly simplified.

Considerations for future studies include:

- Better instruction on how to interpret the orientation of the front door;
- Asking zoning questions on a room-by-room basis rather than on the total number of zones;
- Asking for details of overshadowing obstructions (fence, adjacent wall and roof ridge) universally for all houses with obstructions; and
- Conducting a pilot of the study questions with a broad range of assessors using different software and in different states, to lower the risk of ambiguous questions.

7.4.2 Participation rate

Enlisting the participation of assessors was difficult, and the final sample size was only just over the minimum needed to achieve a statistically representative sample for the entire industry. Several sub-samples were not statistically significant. If benchmark studies are to be conducted in future, it is recommended that participation be a compulsory requirement of accreditation. Not all assessors would need to participate in the study each time, simply enough to ensure the statistical validity of the sample and sub-samples. The recruitment phase, and keeping assessors who registered their interest engaged with the study, was resource intensive. If participation were mandatory, costs could be reduced and the statistical validity of the sample maintained. Compulsory accreditation of all assessors combined with mandatory participation in the study would then provide a mechanism to better measure the accuracy and consistency of NatHERS assessments and target areas for improvement.
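As a companion to the data extraction idea in section 7.4.1, the sketch below shows the kind of lightweight export that would remove the manual step: pulling a handful of key fields from a rating file and writing them to a spreadsheet-compatible CSV. The field names and the assumption that the rating file can be read as JSON are purely illustrative; the NatHERS tools each use their own file formats, and the proposed Universal Certificate extraction is not reproduced here.

```python
import csv
import json

# Fields a future benchmark study might want for every submitted rating (illustrative only).
KEY_FIELDS = ["assessor_id", "house_id", "software", "ncfa_m2",
              "orientation_deg", "total_window_area_m2", "star_rating"]

def export_ratings(rating_paths, out_csv):
    """Pull key fields from each rating file (assumed here to be JSON) into a single CSV."""
    with open(out_csv, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=KEY_FIELDS)
        writer.writeheader()
        for path in rating_paths:
            with open(path) as f:
                data = json.load(f)
            # Missing fields are left blank rather than guessed.
            writer.writerow({field: data.get(field, "") for field in KEY_FIELDS})

# Usage (hypothetical file names):
# export_ratings(["assessor_001_house1.json"], "benchmark_extract.csv")
```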

8 Appendices

8.1 Appendix 1: Benefits to assessors of participation in the study

There are many benefits to the industry and to assessors from participating in this study, even for non-accredited assessors who did not receive a CPD allowance for participation. These benefits were summarised and included in emails for recruitment, and repeated on the study website, as shown below:

- Help to contribute to the most comprehensive rating resource ever developed in this country: information about where assessors are making mistakes helps the AAOs and the NatHERS administrator to develop information and training resources for assessors that focus on the needs of the industry;
- Assist to build and improve the respect and standing of assessors in the building industry: establishing the extent of errors in NatHERS assessments is the first step to raising standards, and this study demonstrates to the building industry that the NatHERS assessor industry is serious about delivering a high quality service;
- Help to ensure quality and consistent rating techniques: by establishing the nature of rating errors, rating guidelines such as the Technical Notes can be improved to clarify those areas where assessors struggle with interpretation;
- Personal and confidential measurement of your skill against a benchmark: a total score was calculated for data entry by each assessor, and this score, together with the distribution of scores shown in this report, helps assessors to see how their rating techniques compare to the industry as a whole;
- Instant feedback on results: as assessors entered data, the correct answers were shown;
- Help to protect your business by limiting errors and possible liability: if assessments are incorrect, assessors are potentially liable for the cost of upgrading dwellings to conform to minimum standards or for the cost of items which were not needed to achieve minimum standards; by helping assessors see where they went wrong, the standard of their rating techniques can be improved, protecting them against future claims;
- Assist to translate your knowledge into rating techniques: seeing practical examples of how rating principles are implemented helps assessors to better understand how to apply these principles in their work;
- Improve ongoing training opportunities and programs for assessors: the information generated by this study will help training and CPD programs to be customised to better meet the needs of assessors;
- Assist you with your next AAO QA review: by showing assessors where they went wrong in this study, they are better placed to make fewer mistakes in the field; and
- Earn CPD points if accredited with an AAO.

8.2 Appendix 2: Assessor demographics

8.2.1 Questions

The following information was taken from the answers to questions which assessors provided on registration. These questions covered the basic demographics of the industry:

- What software tool of choice do you wish to use for the study?
- Who is your accrediting organisation?
- Are you registered in SA? Are you licensed in the ACT?
- How long have you been an assessor?
- How many new house or apartment assessments do you personally complete in a year?
- How many alterations and additions NatHERS assessments do you personally complete in a year?
- How long have you been using your current NatHERS accredited software of choice? (this is not version specific)
- When did you last receive training in your software?
- Are you self-employed? If yes, are assessments your primary occupation? If no, what is the nature of your main business?
- Do you work for an employer? If yes, what is the nature of the main business?
- Do you use your NatHERS accredited software as a design aid? Do you also complete the regulatory assessment?

This provides a picture of the nature of the assessor industry for the first time. The information in this section includes all assessors who registered for the study (547), whether they continued their participation or not (only 344 assessors completed the study). This information was used in the study to determine whether any trends in assessor accuracy could be correlated with experience, type of business, number of ratings and so on.

8.2.2 What tool is used?

AccuRate is the benchmark tool for the NatHERS scheme; however, data entry is quite complex, with all geometric data about the building required to be calculated on an element by element basis. For example, to enter the walls in a building the user must enter, for each wall, the items below (illustrated in the sketch that follows this list):

- the orientation;
- the length and width;
- the colour (solar absorptivity);
- the depth of any horizontal shading device (e.g. eave or pergola) and the distance of its outermost edge above or below the top of the wall;
- the layers of construction materials contained within the wall;
- the height and width of external obstructions and the distance of the obstruction from the wall; and
- the depth of walls projecting at right angles to the wall being entered.
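To give a feel for the volume of element-by-element data this implies, the sketch below models the inputs for a single wall as a simple record. The field names and example values are illustrative assumptions only and do not reproduce AccuRate's actual data entry screens.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WallEntry:
    """Illustrative record of the per-wall data an element-by-element tool requires."""
    orientation_deg: float           # e.g. 0 = north-facing
    length_m: float
    height_m: float
    solar_absorptance: float         # wall colour
    eave_depth_m: float              # horizontal shading device depth
    eave_offset_m: float             # distance of the eave's outer edge above the top of the wall
    construction_layers: List[str]   # outermost to innermost
    obstruction_height_m: float = 0.0
    obstruction_width_m: float = 0.0
    obstruction_distance_m: float = 0.0
    wing_wall_depth_m: float = 0.0   # wall projecting at right angles to this wall

wall = WallEntry(orientation_deg=0.0, length_m=4.2, height_m=2.4, solar_absorptance=0.5,
                 eave_depth_m=0.45, eave_offset_m=0.1,
                 construction_layers=["110 mm brick", "air gap", "R2.5 batt", "10 mm plasterboard"])
```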

FirstRate5 and BERS Pro allow the user to draw in the plan, eaves and obstructions and to select wall types from typical constructions, which greatly increases the speed of the rating. The majority of assessors used BERS Pro or FirstRate5, with FirstRate5 preferred by 43% of assessors compared to 35% for BERS Pro. AccuRate users represent 22% of the population. Note that AccuRate use may be underestimated, as a number of assessors use more than one tool and this study only counts the tool each assessor used for the purposes of this study. Because AccuRate was the first 2nd generation tool released, a number of assessors initially purchased this tool before switching to one of the other tools when they became available. Presumably assessors chose to use the tool that they are most familiar with, so the potential underestimate of AccuRate users does not significantly affect the findings of this study with respect to day-to-day rating practices.

8.2.3 Accreditation Status

57% of the assessor cohort were ABSA members, whilst 33% were BDAV members and 10% were not accredited.

8.2.4 Location

Assessors were also asked where they lived. Because the building industry is smaller in the ACT and SA, a number of assessors who service the industry in these states do not live in the state. Assessors were also asked whether they were registered as a NatHERS assessor in SA or licensed to do assessments in the ACT under the ACT assessor licensing scheme. Figure 5 below shows the location of assessors who registered for the study.

Figure 5 Jurisdiction where assessors live

ABS data 4 on dwelling approvals for the 12 months to July 2013 were examined to investigate the relationship between the percentage of assessors and house construction in each jurisdiction. Table 21 shows that the location of assessors generally represents the extent of house construction in each jurisdiction, except in Queensland where assessor numbers are much lower than the overall level of construction.

Table 21 Percentage of assessors and dwelling approvals in each jurisdiction

Jurisdiction                     Assessors    Dwelling approvals
Australian Capital Territory     2.3%         2.7%
New South Wales                  27.0%        24.7%
Northern Territory               0.9%         1.4%
Queensland                       7.7%         18.5%
South Australia                  5.7%         5.6%
Tasmania                         3.7%         1.1%
Victoria                         34.1%        30.3%
Western Australia                18.8%        15.7%

4 Australian Bureau of Statistics report, Building Approvals, Australia: Table 07 Total Number of Dwelling Units Approved - States and Territories, ABS, Canberra.

8.2.5 Number of years as an assessor

On average, assessors have been practising for 5.5 years; however, 60% of assessors have been practising for 5 years or less. The assessor industry is relatively new - assessments have been required for regulation for only 10 years - and as a result standards of industry practice are not as well entrenched as those in other areas of the building industry. Only 12% of assessors have experience from the days before regulatory requirements. There are therefore not many mentors in the industry for new assessors.

Figure 6 Number of years assessors who registered for the study have been practising as an assessor

8.2.6 Training

Figure 7 shows the average time since assessors were last trained in the use of NatHERS tools. It shows that, while assessors have been rating for 5.5 years on average, it is only 2.5 years on average since they were last trained. This indicates that a number of assessors have retrained, either because they switched tools or purchased another tool, or to keep pace with new versions of their current tool.

Figure 7 Years since assessors were last trained

8.2.7 Assessments per year

Assessors were asked how many houses they rate each year. With around 150,000 new houses built in Australia each year, and only 70% of these assumed to use NatHERS 5, the assessors who registered for the study report that they rate around half of all the new houses in Australia. It may be that assessors who have completed more NatHERS assessments are more familiar with the appropriate techniques and keep more up to date with industry standards such as the Technical Notes issued by the NatHERS administrator.

5 Estimated in the Regulatory Impact Statement for the 6 star regulations: Consultation Regulation Impact Statement (Consultation RIS), Proposal to Revise the Energy Efficiency Requirements of the Building Code of Australia for Residential Buildings Classes 1, 2, 4 and 10, September 2009, p.
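The "around half" figure can be sanity-checked from numbers already reported in this study: 547 registrants averaging 90 assessments a year, against roughly 150,000 new dwellings of which about 70% are assumed to be assessed under NatHERS.

```python
registrants = 547                 # assessors who registered for the study
avg_assessments_per_year = 90     # average reported per registrant
new_dwellings_per_year = 150_000  # approximate new houses built in Australia each year
nathers_share = 0.70              # proportion assumed to be assessed under NatHERS

rated_by_registrants = registrants * avg_assessments_per_year   # about 49,000
nathers_assessed = new_dwellings_per_year * nathers_share       # about 105,000
print(f"registrants cover roughly {rated_by_registrants / nathers_assessed:.0%} of NatHERS-assessed dwellings")
```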

Figure 8 Distribution of annual volume of assessments

8.2.8 Self-employed versus employee

Study participants were asked if they were self-employed and whether the business they work in does NatHERS assessments as its main source of income. Most assessors - around two thirds - are self-employed. Only around 25% of assessors do assessments as their main job role. The largest single category of business for self-employed assessors was building design, at 49% (including architects and designers).

8.2.9 Main occupation

Almost 75% of assessors undertake NatHERS assessments as an adjunct to their main business.

8.2.10 Type of business

Study participants who did not undertake assessments as their main business were asked to describe their main business (Figure 9). Almost half of those who do not undertake assessments as their main occupation are designers or architects. A better understanding among the design professions of how design affects the energy efficiency of a house is important, and it is also important that assessors understand how the building industry works. In this regard, the high number of designers who also do NatHERS assessments can be seen as positive. However, those who do not do assessments as their main job may also find it hard to keep up with the changing requirements of a new industry, and there is some concern about a conflict of interest in assessing your own design for regulatory purposes.

Figure 9 Type of business where assessments are not the main business

8.2.11 Building industry experience

It is important that assessors understand the building industry. Assessors need to read and interpret plans and understand construction techniques to ensure that their recommendations to clients are practical, e.g. not specifying more insulation than can be physically installed in a construction element. Figure 10 shows the previous building industry experience of study participants.

Figure 10 Site experience of study participants

There were no significant differences between the average rating errors of assessors with different site experience. Again, the large range of errors may be masking any differences. And simply because no difference was found does not mean that understanding construction techniques is not important, only that past experience in the building industry is not necessarily a predictor of rating accuracy.

Visualising a dwelling from plans can be critical to ratings, as this helps the assessor to understand such things as identifying bulkheads to attic spaces in houses with sloping ceilings, or walls in split level houses that may back on to two different zones. Ensuring competency in reading plans and understanding construction techniques are important components of the new Cert IV NatHERS assessor qualification. Study participants were asked what previous building industry experience they had and how difficult they found it to visualise a building from 2D drawings. 33% of the sample reported that it was very easy to visualise from plans and 48% reported it was easy; 18% reported it was a little difficult, while only 1% reported that it was difficult to visualise a house from two dimensional plans. Study participants who reported that they had difficulty in visualising plans also had errors in the ratings of their houses which were on average around 0.5 stars greater than those who had no problem.

8.3 Appendix 3: Detailed analysis of assessments

8.3.1 Systematic differences between the software tools

In developing the master rating files for this study, some systematic differences between the three packages were observed. In the sections below, all responses are assessed against the correct answer for the tool used (within a certain percentage or number of stars), so these systematic differences are eliminated from the results. However, the differences are of some concern and may point to the need for improvements to BERS Pro and FirstRate5. The sections below highlight the systematic differences between the software tools found in this study.

8.3.2 Heating and cooling loads

There were small but significant differences in the loads calculated by FirstRate5 and BERS Pro compared to AccuRate, the benchmark tool. As this may be due to differences in the calculation of the NCFA, the total energy loads predicted by each package were also calculated, i.e. NCFA multiplied by MJ/m2 (a brief worked illustration of this check follows section 8.3.3 below). The difference in NCFA explained the differences in loads for house 2; however, the differences were exacerbated for house 3. The differences between the loads predicted by BERS Pro and AccuRate were greater than 5% for houses 2, 3 and 4, while for FirstRate5 the difference was greater than 5% for house 4. If these houses were part of the accreditation protocol, FirstRate5 and BERS Pro would therefore have difficulty passing. Table 22 shows the total energy loads predicted by FirstRate5 and BERS Pro for the four houses as a percentage of the loads predicted by AccuRate.

Table 22 Total heating and cooling loads for BERS Pro and FirstRate5 compared to AccuRate

House/Software    % of total loads compared to AccuRate
House 1 FR        95.8%
House 1 BP        96.0%
House 2 FR        96.4%
House 2 BP        93.7%
House 3 FR        103.8%
House 3 BP        112.2%
House 4 FR        111.4%
House 4 BP        92.2%

8.3.3 Time taken to complete rating

Study participants were asked how long it took them to rate each house. This gives the first comparative data on the duration of data entry for each of the software tools. Unsurprisingly, this showed that AccuRate had the slowest data entry, at an average of 5.3 hours. FirstRate5 and BERS Pro use a similar data entry paradigm (tracing over an image of the plan); BERS Pro users were faster (3.1 hours) than FirstRate5 users (4.1 hours) in completing their ratings. It is understood that BERS Pro training courses focus on speed of data entry to a much greater degree than FirstRate5 courses, and this may explain the faster data entry time.
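Returning to the total-load comparison in section 8.3.2, the point of multiplying back by the NCFA is easiest to see with numbers. The figures below are hypothetical rather than the study houses' actual loads; they simply show how a difference in MJ/m2 can disappear once the tools' different NCFAs are taken into account.

```python
# Hypothetical outputs for the same house from two tools.
tool_a = {"ncfa_m2": 200.0, "load_mj_per_m2": 250.0}
tool_b = {"ncfa_m2": 195.0, "load_mj_per_m2": 256.4}  # smaller NCFA, higher per-m2 load

for name, result in (("Tool A", tool_a), ("Tool B", tool_b)):
    total = result["ncfa_m2"] * result["load_mj_per_m2"]
    print(f"{name}: {result['load_mj_per_m2']:.1f} MJ/m2 x {result['ncfa_m2']:.0f} m2 = {total:.0f} MJ")

# The per-m2 loads differ by about 2.6%, but the total loads are almost identical,
# so the apparent difference would be explained by the NCFA calculation, as for house 2.
```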

55 8.3.4 Star rating differences House complexity Each of the houses present the assessor with differing levels of complexity: House 1 is a simple single storey house and is typical of the sorts of examples used in assessor training, House two is more complex. Being larger it has more zones to define and a greater number of rooms can be combined into zones. The second storey means that the areas between zones floors must be calculated and the brick cladding extends only part of the way up the walls on the second floor. House 3 is a small two storey townhouse with shared walls. Documentation is provided for the development as a whole so assessors need to extract the information which is relevant to the unit rated which is a more demanding task than for houses one and two. House 4 is an apartment on the top floor of a building. This requires the application of some specific modelling techniques that assessors who are not familiar with rating class 2 may struggle with. Figure 11 shows the distribution of star rating errors for each of the four houses. Figure 11 Errors in Rating by house Figure 11 clearly shows that assessors were far better at rating house 1 than the other houses in the study. 57% of assessors rating house 1 were able to get within 0.25 stars of the correct rating and 94% were within 0.75 stars. This is a much lower error level than for other houses, but given the 55 P a g e

simple nature of the house it could be argued that assessments should be even closer. As the complexity of the houses and documentation increased, the level of accuracy fell significantly. Assessments were within a quarter of a star of the correct rating for only 41% of assessments of house 2, 26% of house 3 and 9% of house 4.

Note that not all the errors may be due to assessor error alone. In some cases the source data or the questions on the website may have been open to interpretation. To ensure that misinterpretation of questions did not cloud the findings of this report, the rating files were also interrogated to determine the actual assessor response where there were unusually high error levels or where assessor queries revealed issues of misinterpretation.

Differences in NCFA

Each of the three NatHERS tools enters data in a slightly different way, and the differences may lead to systematic differences in the calculation of NCFA for each house. In AccuRate, quantities are measured and the area calculated, so there is the possibility of human error in the entry of areas. In both BERS Pro and FirstRate5, the plan is traced over and the scale of the drawing is set. Incorrect scaling will introduce a systematic error in the calculation of areas, so there was concern that this could be a common problem. Figure 12 shows the frequency distribution of errors in estimating the NCFA for the three software tools.

Figure 12 Errors in NCFA by software type

There were no statistically significant software-specific differences in NCFA, although users of FirstRate5 and AccuRate on average tend to slightly underestimate NCFA while users of BERS Pro tend to slightly overestimate it. There would therefore appear to be no systematic user error in the estimate of NCFA. There are small systematic differences in the calculation of NCFA for these packages in two storey dwellings, as discussed in section 6.9.

Assessor scores

The Totara system calculates the proportion of questions where the answer fell within the acceptable range, i.e. the correct answer. Figure 13 plots the proportion of correct answers against the difference between the assessor's rating and the correct rating.

Figure 13 Percentage of correct answers versus rating error

As can be seen, the percentage of correct answers is not a good predictor of rating accuracy. There are a number of reasons for this:
Some wrong answers may increase the rating and others may reduce it, so in some cases errors may cancel out; and
Not all wrong answers have the same impact on the accuracy of the rating.

This does have implications for how assessors are examined or audits evaluated. It is clear that a simple percentage of correct results is not a guarantee of achieving the correct rating.

Selecting climate, exposure and orientation

Climate, exposure and orientation are as fundamental to producing an accurate rating as the NCFA. Rules governing data input for these three factors have not changed substantially over the history of the NatHERS scheme.

It was therefore not expected that there would be any substantial errors in these inputs.

Climate

Selecting the appropriate climate zone is fundamental to getting the correct star rating. The climate to be selected was clearly stated in the documentation; however, 7.2% of assessors did not select the correct climate. In those houses where the correct climate was not selected, the average of the absolute value of rating errors was 1.06 stars, compared to the overall average rating error of 0.02 stars. Selecting the correct climate may be a little more straightforward in the field than in the benchmark study, as the client will provide an address; some assessors in this study appear to have selected the climate where they were located. This is more of an administrative oversight than an indication of serious problems with selecting the climate.

Exposure

The exposure factor sets the value of a wind speed modifier in the Chenath engine which adjusts the free stream wind speed in the climate data file to be more representative of wind speeds at the actual site. The factor is selected by assessors on the basis of the exposure of the site and the height of the dwelling above ground. In addition to this factor, users of all NatHERS tools are required to enter the height above ground of the lowest floor in the dwelling being rated. Figure 14 shows an extract from Technical Note 1 explaining how the terrain exposure category is to be set.

Figure 14 Description of terrain exposure categories from Technical Note 1

While assessors generally had a very low error rate in setting the exposure, house 4, an apartment on the 15th floor, had a very high error rate, as shown in Figure 15. Technical Note 1 clearly shows that at the 10th floor or higher, terrain category 1 (Exposed) should be entered. Despite this, many assessors entered the wrong exposure.

Figure 15 below shows the proportion of assessors who entered the incorrect exposure for each house.

Figure 15 Proportion of ratings with incorrect exposure setting by house and software

Incorrect selection of the exposure factor can have a significant impact on the accuracy of a rating. In this sample the average rating error observed among those who entered the wrong exposure was only 0.2 stars worse than for those who entered the correct exposure; however, the actual impact of the error is masked by other errors. When the exposure factor in the master rating file for house 4 in AccuRate is changed from Exposed to Suburban (the most common wrong exposure), the star rating decreases by 0.5 stars. This is because the Exposed setting's higher wind speeds increase air movement through the dwelling, which makes internal conditions comfortable without air conditioning more often and thus lowers cooling loads; the Suburban setting removes this benefit.

Due to the high number of errors observed, the rating files for house 4 were inspected by hand. This revealed that in addition to the error in exposure, the height of the floor above ground had also not been correctly set. The height of the floor above ground is used in combination with the exposure setting to adjust the wind speed and would exacerbate the problems caused by incorrect exposure. Table 23 shows the percentage of assessments with incorrect floor height.
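As a minimal illustration of the checks applied to the house 4 files, the sketch below encodes only the Technical Note 1 rule cited above (dwellings at the 10th floor or higher use terrain category 1, Exposed) together with a crude test of the entered floor height. The function names, the assumed storey height and the flagging tolerance are illustrative assumptions, not part of any NatHERS tool.

    # Illustrative consistency check for the exposure and floor-height inputs in a
    # high-rise apartment.  Only the 10th-floor-or-higher rule quoted from
    # Technical Note 1 is taken from the study; everything else is an assumption.

    def expected_exposure(storey):
        """Return the mandated terrain category where Technical Note 1 is explicit."""
        if storey >= 10:
            return "Exposed"   # terrain category 1 per Technical Note 1
        return None            # below the 10th floor, judged from the site terrain

    def check_inputs(storey, exposure_entered, floor_height_m, storey_height_m=3.0):
        """Flag the two errors observed in the house 4 files (assumed storey height)."""
        issues = []
        mandated = expected_exposure(storey)
        if mandated and exposure_entered != mandated:
            issues.append("exposure should be '%s', file has '%s'" % (mandated, exposure_entered))
        if floor_height_m < 0.5 * storey * storey_height_m:  # crude, illustrative tolerance
            issues.append("floor height %.1f m looks too low for storey %d" % (floor_height_m, storey))
        return issues

    print(check_inputs(storey=15, exposure_entered="Suburban", floor_height_m=0.0))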

60 Table 23 Percentage of assessments with incorrect floor height above ground in each tool for house 4. Software Incorrect floor height above ground AccuRate 56% BERS Pro 67% FirstRate5 29% Orientation Getting the orientation of the building correct is another fundamental rating parameter. Without selecting the correct orientation the rating will be wrong. To illustrate the importance of getting the orientation of the house right, Figure 16 below shows the rating of houses 1 and 4 at eight different orientations. Figure 16 Rating of houses 1 and 4 at eight orientations House 1 windows are fairly evenly distributed on all sides while house 4 has windows on only two opposite sides. The rating of house one varies by as much as 0.5 stars at different orientations. The extent of error in the rating will depend on the extent of the error in orientation: +/- 45 degrees of the correct orientation results in an error of only 0.1 stars for house 1. Houses with windows which are less evenly distributed on all sides of the dwelling will show greater errors. Incorrectly orienting house 4 leads to a rating error between 0.7 and 2.4 stars. In house 4, small errors also significantly 60 P a g e

affect the rating: an error of only 45 degrees will change the rating by between -0.8 and -1.4 stars. While getting the orientation exactly right may not have a significant impact on the rating for every house, in some houses it is critical.

To test whether assessors were correctly orienting their houses, study participants were asked to nominate the orientation of the front door. There appears to have been some confusion regarding this question, and in later benchmark studies it is recommended that instructions for assessors be improved. The extent of error found in the orientation question was very high. Table 24 shows the error rate for the orientation of the front door by house and software.

Table 24 Percentage of assessors with the wrong orientation by house and software type (assessments with incorrect orientation of the front door)
House 1 AR 42%
House 1 FR 48%
House 1 BP 50%
House 2 AR 11%
House 2 FR 43%
House 2 BP 21%
House 3 AR 17%
House 3 FR 20%
House 3 BP 34%
House 4 AR 45%
House 4 FR 76%
House 4 BP 72%

Errors in orientation were lower for AccuRate across the sample, and FirstRate5 had a slightly higher error rate than BERS Pro. Assessors draw in the plan in BERS Pro and FirstRate5 and then set the north point, while AccuRate assessors must enter the orientation for each wall manually. Obtaining the orientation of the wall with the front door is therefore a more complicated visual task in BERS Pro and FirstRate5, so a number of the errors for these tools may be in reporting the orientation rather than in the rating file itself. Further, it appears that although the bearings of the lot were given and all walls were parallel to lot boundaries, some assessors attempted to measure the angle on the plan rather than using the bearings given.

The errors in orientation were much larger than expected, so this issue was examined in more depth by opening the rating files and inspecting the orientation directly. An examination of the responses suggested that assessors may not have properly understood the question; for example, FirstRate5 assessors seemed to be entering the rotation of the north point on the plan screen rather than the orientation of the front door. The rating file inspection showed that the extent of errors was not nearly as high as the answers to the questions about the front door suggested. Table 25 shows the number of assessors who entered the wrong orientation into the benchmark website compared to the number who actually entered orientation data incorrectly for house 1.

62 Table 25 Reported incorrect orientation versus actual incorrect orientation in house 1 Software Wrong orientation Orientation in rating file reported is wrong AccuRate 11 3 BERS Pro 17 5 FirstRate 19 8 Inspection of the rating files showed that the extent of errors in orientation were actually far less in reality than reported. Further, the extent of error was not great, except in a few cases. FirstRate5 users who showed the greatest errors in reported orientation only had a range of incorrect orientation from -9 to +13 degrees. Clearly, a number of assessors did not understand the question, and in later benchmark studies care will need to be taken to provide better instructions to assessors. The extent of errors in setting the orientation was highest in FirstRate5. BERS Pro has a similar method of entering orientation: trace over the plan, then set the north point. BERS Pro displays the orientation represented by the top, bottom, left and right of screen. A similar feature in FirstRate5 may be useful in ensuring that FirstRate5 users make fewer mistakes in setting their orientation. Note that while the orientation is a fundamental parameter for the rating there was no substantial difference between the average rating of assessors who reported the correct orientation and those who did not. There are a number of reasons why rating errors may not be seen with incorrect orientation: A number of assessors used the correct orientation, but misreported it; Those who measured off the plan may have been very close to the correct orientation so the rating error will not be large; and/or The rating differences may have been lost in the variability of other data Zoning NatHERS software divides the house into thermally unique zones. Allocation of zones is done on the basis of: Whether the room is a bedroom or opens only from a bedroom: these rooms are occupied at night (4pm to 9 am) while all other rooms are assumed to be occupied during the day (7am to midnight); The level internal heat loads in the room: bedrooms are assumed to have heat generation from people and lights and a few appliances, living areas have a higher heat generation from people i.e. more than one occupying the zone, and appliances e.g. heat loads from TVs and entertainment devices and higher lighting levels, while kitchens also include loads from cooking; Whether the room is heated and cooled. Laundries, bathrooms and WC s (toilet) with windows and garages are the only spaces which can be assumed to be not heated or cooled, and ALL other zones are considered to be heated or cooled must have BOTH heating and cooling applied; and 62 P a g e

63 The impact on wind driven ventilation. Rooms may be similar in all other respects, but if there are openings between these rooms which obstruct air flow then these rooms must be allocated to a separate thermal zone with a few minor exceptions. Failure to correctly zone a house will lead to significant rating errors. In general, combining rooms when they should have been separated INCREASES the star rating and dividing rooms which should be combined REDUCES the star rating. Technical Note 1 was specifically designed to improve assessor practices with regard to zoning. There were many anecdotal reports of assessors combining too many rooms into the one zone and AAOs auditing showed that this was an issue: 32.5% of BDAV QA checks and 6.7% of ABSA QA checks found errors in zoning. A number of typical errors were found in assessor zoning: combining living areas and adjacent hallways; leaving hallways unconditioned; applying heating and cooling only to those spaces which physically contained or was serviced by a heater or cooler; combining ensuites, walk-in-robes and bedroom spaces incorrectly; and incorrect adjacencies of elements in dwellings with elements which were shared with common areas. Technical Note 1 was released in early 2013 with a view to improving assessor practices and a number of CPD seminars were conducted with AAO s to explain the new rules. Each house used in the study tested the application of specific zoning rules. The following sections report on the questions asked in the studies, which were specifically designed to test whether assessors had correctly applied the new zoning rules in Technical Note Treating a corridor as a separate zone In house 4 there is a small corridor (~5m 2 ) adjacent to the main bedroom. Technical Note clause and its associated Table 4 explain that this should be treated as a separate zone. Most assessors zoned this correctly, however, around 14% of FirstRate5 and BERS Pro assessors did not. Assessors who did not zone this correctly had a far greater error in their overall star rating as shown in the Table 26 Table 26 Zoning of the corridor adjacent to the main bedroom in house 4 Software Correct Zoning of corridor Average star rating error BERS Pro wrong 2.00 right 0.62 FirstRate5 wrong 3.00 right 1.55 Note the amount of the rating error is not likely to be due to the incorrect zoning of this space alone, however, the ability to zone this correctly may be an indicator of the assessors general understanding of Tech Note 1 and therefore is a good indicator of a whole range of errors. This may therefore provide an important indicator of assessor understanding of zoning rules in exams and 63 P a g e

audits. A significant number of FirstRate5 and BERS Pro assessors need further information on how to apply this zoning rule.

Combining zones

Houses 1, 2 and 4: combining the WC and hall

The relevant section of Technical Note 1 deals with combining adjacent rooms into the one zone. If the occupancy of the zones is the same and both zones are conditioned, the spaces can be combined, subject to consideration of air flow. Assessors were asked whether they combined the WC and hall in houses 1, 2 and 4. This was only allowed in house 2. Table 27 shows the percentage of assessors who answered incorrectly for each house by software:

Table 27 Combining the WC and the adjacent hallway (% with incorrect zoning)
AccuRate H1 73.7%
BERS Pro H1 88.2%
FirstRate5 H1 55.0%
AccuRate H2 10.5%
BERS Pro H2 3.4%
FirstRate5 H2 20.0%
AccuRate H4 20.0%
BERS Pro H4 17.2%
FirstRate5 H4 0.0%

Incorrectly combining the hall and WC will not by itself lead to a significant error in the rating. The error itself is therefore not a great concern, but the fact that so many assessors do not understand the application of the zoning rules is significant.

House 2: combining bedrooms and their walk-in robes

Technical Note 1 requires that: "Small air spaces include: store rooms, walk-in-robes (WIR), pantries, linen closets and other small non-habitable rooms, that do not have external operable windows/doors and are ONLY accessed from one parent zone; these are to be combined into the applicable parent zone (refer to Clause 7.4)."

Around 10% of AccuRate and BERS Pro users and 30% of FirstRate5 users did not combine the WIR and bedroom. While no significant star rating difference was observed between those assessors who did and did not include the WIR as part of the bedroom zone, it is still important that assessors apply zoning rules consistently. Whether the WIR is combined with the adjacent bedroom will not have a significant impact on the rating for house 2, and from a thermal performance point of view it is not incorrect to separate these zones; however, it is important that assessors follow the same zoning rules to ensure all houses are rated on a similar basis.
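The two combining rules discussed in this section can be summarised in the simple sketch below. It paraphrases the Technical Note 1 clauses quoted above; the data model and the simplified air-flow test are illustrative assumptions, not the official wording.

    # Sketch of the zone-combining rules discussed above (paraphrased from
    # Technical Note 1); the room model and the air-flow test are simplifications.

    SMALL_NON_HABITABLE = {"walk-in-robe", "pantry", "store room", "linen closet"}

    def combine_into_parent(room):
        """Small non-habitable spaces with no external operable windows/doors and a
        single parent zone are combined into that parent zone."""
        return (room["type"] in SMALL_NON_HABITABLE
                and not room["external_operable_opening"]
                and len(room["accessed_from"]) == 1)

    def may_combine(room_a, room_b, opening_unobstructed=True):
        """Adjacent rooms may share a zone if occupancy and conditioning match,
        subject to consideration of air flow (simplified here to a flag)."""
        return (room_a["occupancy"] == room_b["occupancy"]
                and room_a["conditioned"] and room_b["conditioned"]
                and opening_unobstructed)

    wir = {"type": "walk-in-robe", "external_operable_opening": False, "accessed_from": ["Bed 1"]}
    wc = {"type": "wc", "occupancy": "day", "conditioned": False}
    hall = {"type": "hall", "occupancy": "day", "conditioned": True}
    print(combine_into_parent(wir))   # True: absorbed into the Bed 1 zone
    print(may_combine(wc, hall))      # False here: the WC is unconditioned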

Number of zones

Modelling the house with the correct number of zones and allocating the type of occupancy (i.e. the time of use and whether the zone is conditioned or not) is fundamental to accurately modelling a house for NatHERS. Table 28 shows the percentage of assessors who correctly modelled the number of zones of various types in each of the houses.

Table 28 Assessors with the correct number of zones by zone type (% with correct zoning; columns are bed zones, other day conditioned, other day unconditioned, other night conditioned, living and kitchen)
House 1: 98%, 11%, 68%, 17%, 91%, 99%
House 2: 99%, 9%, 49%, 50%, 48%, 94%
House 3: 99%, 0%, 57%, 90%, 77%, 100%
House 4: 92%, 14%, 37%, 72%, 16%, 100%

While the allocation of bedroom and kitchen zones is well understood, many assessors do not allocate the other day or other night zone types correctly. The allocation of zone types to rooms is fundamental to correct NatHERS modelling, but so too is breaking up the design into the correct number of zones. Table 29 shows the proportion of assessments with the correct total number of zones and those with too many or too few zones. Note that some of the zoning rules in Technical Note 1 state that assessors may combine or split zones and leave the final decision to the assessor. To take account of this acceptable variation, the assessments judged to have the correct number of zones are those within +/- 1 of the correct answer.

Table 29 Assessors who estimated the number of zones within one of the correct answer
2 or more too many: House 1 12.9%, House 2 20.5%, House 3 20.5%, House 4 5.8%
Within one of the correct answer: House 2 36.4%, House 3 67.5%, House 4 64.0%
2 or more too few: House 1 32.3%, House 2 42.0%, House 3 14.3%, House 4 30.2%

Assessors clearly need to understand the zoning rules in Technical Note 1 better, with only one-third to two-thirds of houses having the correct number of zones depending on the house. Note that AccuRate assessors may have been confused as to whether non-habitable zones like subfloor and roof zones should be included (they were to be included) and may therefore have misreported the number of zones when they actually did model the correct number.

The fact that, on average, around one-third of assessors had too few zones is of concern. If zones are combined which should not be combined, ventilation performance is overestimated; this tends to lower the overall modelled cooling energy loads and lead to inappropriately high ratings. As expected, the average rating of houses with too few zones was higher than that of houses with too many zones. As mentioned several times, it was very hard to find any trends in average rating outcomes because the impact of other errors tends to mask the impact of individual errors in data entry.

The fact that any trend can be seen at all makes this finding particularly important. Figure 17 shows how the average rating error varies with the number of zones modelled for each NatHERS tool.

Figure 17 Impact of number of zones on the error in star ratings

Zoning review of house 1 rating files

So many zoning errors were found that the full picture could only be properly investigated by inspecting zoning practices in the rating files by hand, because the questions asked didn't always explain the nature of the errors that were made. This review found even more errors. Table 30 shows the extent of incorrect zoning found in the review of house 1 by software type.

Table 30 Assessors with incorrect zoning of house 1 by software type
AccuRate 47%
BERS Pro 74%
FirstRate5 60%

Figure 18 shows typical zoning errors made by assessors rating house 1.

Figure 18 Typical zoning errors in house 1
Ensuite combined with Bed 1
Hall combined with wet areas
Hall combined with Kitchen/Family
Living combined with Entry
WIR not combined with Bed 1
Robe as a separate zone
Entry combined with Kitchen/Family 1
Entry, Living and Kitchen/Family all one zone

Zoning review of house 2 rating files

The zoning of house 2 in the rating files was also examined to determine the extent and nature of zoning errors. Table 31 shows the extent of incorrect zoning found in this audit by software type.

Table 31 Assessors with incorrect zoning of house 2 by software type
AccuRate 88%
BERS Pro 92%
FirstRate5 81%

Just as with house 1, assessors are not following the Technical Note zoning rules, and the extent of errors was even greater for this house. Figures 19 and 20 highlight the most significant errors made in the zoning of the ground and upper floors of house 2.

Figure 19 Significant errors made in the zoning of the ground floor of house 2
Living, Kitchen/Meals/Family, Entry and Study combined
Pantry and KMF not combined
Stairs not combined with Living
Living, Study and Entry combined in various ways
Games and KMF combined, or Games unconditioned
Kitchen as a separate zone
Laundry, hall and powder combined

Figure 20 Significant errors made in the zoning of the upper floor of house 2
WIRs not combined with bedrooms
Stairs and Living 2 not combined
WC and ensuite combined
Bed 1 and ensuite combined, and ensuite not conditioned
Living 2 zoned as Other Night time

Zoning review of house 4 rating files

The zoning of house 4 in the rating files was also examined to determine the extent and nature of zoning errors. Table 32 shows the extent of incorrect zoning found in this audit by software type.

Table 32 Assessors with incorrect zoning of house 4 by software type
AccuRate 61%
BERS Pro 53%
FirstRate5 65%

Just as with house 1, assessors are not following the Technical Note zoning rules. The rating file review found three additional types of zoning error in house 4:
Failing to create a separate zone for the shared corridor;
Creating separate zones for plumbing ducts and robes; and
Splitting the kitchen/living room into two by creating a separate zone for the kitchen.

The creation of a zone for the shared corridor is a new requirement in Technical Note 1. Around 15% of assessors failed to create this zone or created it incorrectly. Technical Note 1 requires that small spaces for plumbing ducts and robes be included in the adjacent zone. Around 12% of assessors did not do this but created a separate zone for robes and/or plumbing ducts. Around 15% of assessors, in particular BERS Pro users, split the kitchen and living areas into two zones. The extent of the rating error for houses with the kitchen as a separate zone was much larger than the error for those who did not create the kitchen as a separate zone.

Air leakage

Ceiling penetrations affect the area of the ceiling which is uninsulated. Downlights represent the most common form of ceiling penetration in houses, so reporting of ceiling penetrations has been included in this section despite the fact that the area of ceiling penetrations does not affect air leakage.

Correctly identifying air leakage sites is another fundamental rating skill. In house 1 the Kitchen/Living/Dining zone has 12 downlights shown on the plan. The electrical plan lists these as sealed downlights (a detail rarely given on most documentation), and a simple Google search for the product reveals that this type of downlight does not allow air leakage between the room and the attic space above. Assessors were asked to report on the number of sealed and unsealed downlights, sealed and unsealed exhaust fans, doors with weatherstrips, and whether the windows were weather-stripped. All this information was shown on the plans except for whether the downlight was considered sealed or unsealed; the make and model of the downlight was given, and information on the website for the downlights clearly showed that they were sealed. In this section assessors were also asked to calculate the area of uninsulated ceiling around ceiling penetrations (for AccuRate and FirstRate5), a new requirement of Technical Note 2.

Some 15% of assessors entered the downlights in house 1 as unsealed downlights, when they are clearly sealed. This reduces the rating of the house by 0.3 stars. The extent of errors in entering air leakage features and ceiling penetration area is shown in Table 33.
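A rough sketch of the uninsulated-ceiling-area calculation required by Technical Note 2 is given below. The clearance assumed around each downlight and the exhaust fan opening size are hypothetical figures chosen only to illustrate the arithmetic.

    # Illustrative calculation of the ceiling area left uninsulated by penetrations
    # (required by Technical Note 2 for AccuRate and FirstRate5 users).
    # The clearance radius and fan opening area are assumed values.

    import math

    def uninsulated_ceiling_area(n_downlights, clearance_radius_m=0.10, n_fans=0, fan_area_m2=0.25):
        """Total uninsulated area (m2) around downlights and exhaust fans."""
        return n_downlights * math.pi * clearance_radius_m ** 2 + n_fans * fan_area_m2

    # House 1 Kitchen/Living/Dining zone: 12 downlights shown on the electrical plan.
    area = uninsulated_ceiling_area(n_downlights=12)
    ceiling_area_m2 = 60.0  # hypothetical zone ceiling area
    print("Uninsulated: %.2f m2 (%.1f%% of ceiling)" % (area, 100.0 * area / ceiling_area_m2))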

71 Table 33 Errors observed in the data entry for air leakage sites and allowance for ceiling penetrations on insulation House No. vented downlights correct Area uninsulated within 20% of correct answer No. of unsealed exhaust fans correct No. of sealed doors correct weatherstripped windows correct House 1 85% 29% 84% 49% 99% House 2 93% 3% 90% 35% 99% House 3 29% 13% 95% 60% 77% House 4 66% NA 92% 40% 100% The impact of getting these 5 areas of data entry correct on the star rating is shown in Figure 21. Figure 21 Impact of errors in entering air leakage sites and ceiling penetrations on rating errors Errors in the estimation of windows and doors with weatherstrips were quite high. This may be of less consequence because assessors will apply weatherstrips if they are needed to achieve compliance and will report on whether this is needed Wall constructions Correctly identifying wall construction types and their area is another fundamental skill needed to obtain an accurate rating. Assessors were asked to report the area of key wall construction types in each house. Table 34 Errors in the identification of wall constructions shows the percentage of assessors who reported a wall area for each type of wall that was within 10% of the correct answer: 71 P a g e

72 Table 34 Errors in the identification of wall constructions Type of Wall Assessors with wall area within 10% of correct answer House 1 House 2 House 3 House 4 Area BV + ins + wrap correct 90.3% 62.5% 23.4% NA Area BV and Ins correct 57.0% 87.5% NA NA Area BV no ins correct 41.9% 75.0% NA NA Area WB correct NA 69.3% NA 98.8% Area BC correct 75.3% 90.9% 94.8% 100.0% Area FC sheet correct NA 37.5% NA NA Area Ins Hebel correct NA NA 51.9% NA Area Hebel Correct NA NA 20.8% 65.1% Area CSR 360 correct NA NA 49.4% NA NA: Not applicable The responses above show that assessors reported wall areas which were often significantly different to the correct answer. Some of this error may have been due to confusion what data to extract from the software e.g. the were no Brick Cavity walls in any of the houses yet 25% of assessors rating house 1 reported that there were Brick Cavity Walls. The errors in wall area did not seem to produce average rating levels which were substantially different to those who did report the correct areas. This is in part because the R values of the walls were similar e.g. in house 1 there is some BV wall with insulation, and some with bulk insulation and house wrap confusing one with the other won t lead to large errors in the rating Wall insulation in BERS Pro and FirstRate5 Where a combination of foil and bulk insulation is used in walls in BERS Pro assessors must adjust the R value of the bulk insulation if the insulation fills the cavity between the foil and the internal wall lining to allow for the elimination of a reflective air space. Study participants using BERS Pro were asked to nominate the wall insulation used and there were so many wrong answers that this was investigated by opening all rating files for house 1 to check what assessors had actually entered. 91% of BERS Pro assessors did not adjust the insulation R value of the bulk insulation to allow for the loss of the reflective air space, despite the fact that this is reinforced in BERS Pro training. This does not cause a huge rating error (less than 0.1 stars). In FirstRate5 the inner facing air space is automatically deleted when the wall is insulated with bulk insulation and reflective foil. This is a reasonable option for most cases, but does not allow for situations where the air space is easy to maintain e.g. where a rigid board insulation product is used which does not fill the space between the inner wall lining and the foil Roof Just like getting wall constructions right, entering correct data about roofs is also critical to rating accuracy. In cooler climates, roof solar absorptance is not critical because ceilings need to be highly insulated to minimise winter heat loss. In warmer climates, however, current stringency allows houses with low roof insulation levels and getting the roof solar absorptance right will be critical. To illustrate what a large impact roof solar absorptance can have in hot climates, house 1 was modified 72 P a g e

to achieve 5 stars in Darwin (there is a 1 star allowance for the alfresco area) with a dark roof. For every 0.05 reduction in solar absorptance, the rating of the house improved by 0.1 stars, so that the difference between a light roof (absorptance 0.3) and a dark roof (absorptance 0.8) is almost 1 star.

The study did not specifically check that assessors entered the correct ceiling insulation, as this was clearly specified on the plans; however, the rating file review of house 1 showed that a small number of assessors (between 5% and 9% depending on software type) did not enter the R value as shown on the plans or, in the case of BERS Pro, made an allowance to the ceiling R value for uninsulated areas of ceiling due to ceiling penetrations when none existed.

Study participants were asked to nominate the type of roof material and the solar absorptance of the roof material. Plans contained references to the manufacturer, material and colour of the roof material, and assessors had to find the solar absorptance by researching this on the internet (see note 7 below). While virtually all assessors got the material of the roof correct, assessors were less successful at determining the roof solar absorptance. Table 35 below shows the percentage of assessors who entered the correct roof solar absorptance.

Table 35 Assessors correctly answering questions about roofs, showing extent of error
House 1: correct roof solar absorptance 38.7%
House 2: correct roof solar absorptance 14.8%
House 3: correct roof solar absorptance 50.6%
House 4: correct roof solar absorptance 44.2%

Given the importance of roof solar absorptance to the accuracy of the rating in hot climates, the low proportion of correct answers is of concern; however, the extent and range of error was not large: 92% of assessors did get the solar absorptance within 0.1 of the correct answer, and 67% were closer still. This may still represent an impact of between 0.1 and 0.2 stars in hot climates, however, so it is clear that assessors still need some additional assistance with this, though not nearly as much as in other areas.

Floors

Floor areas are critical to an accurate rating because the measure used to calculate the star rating divides the energy loads by the floor area (NCFA). Study participants were also asked a number of other questions about floors: details of the waffle pod slab (slab thickness and insulation R value) and the areas of different floor types and floor coverings.

Note 7: Unfortunately the website for one of the roof materials had changed since the materials for the study were prepared, and for one of the houses the specific tile solar absorptance listed on the website no longer made a clear distinction between the solar absorptance of the terra cotta and concrete tiles. While the FAQ on the Benchmark website was updated to provide guidance, some assessors may have inadvertently entered incorrect information.

Table 36 shows the percentage of assessors who correctly entered the waffle R value (1.0) and the concrete thickness of the top layer of the waffle slab (85 mm).

Table 36 Assessors correctly answering questions about waffle pod slabs
House 1: correct concrete thickness 68.8%
House 2: correct concrete thickness 61.4%

Around two thirds of assessors gave the correct answer for the waffle pod slab thickness and three quarters the correct waffle pod R value. There is a variety of information available to assessors regarding waffle pod slab construction, including a report for Foamex by CSIRO and advice on the BERS Pro website about how to calculate an effective R value for waffle pods, so the fact that there were some errors is not surprising. At the time of conducting the study there was no official advice from the NatHERS Administrator about what R value to use to represent a waffle pod slab, so the correct answer was assumed to be R1.0 as shown in the report for Foamex (a waffle pod manufacturer) written by CSIRO. There was also advice on the BERS Pro website which gave a different answer, so there was already some confusion among assessors about the correct value to use, and this may explain the extent of errors found in this study. Since conducting the study the NatHERS Administrator has released a more detailed study of the effective R value provided by a waffle pod slab, and this will help to eliminate errors in the estimation of waffle pod R values.

Floor coverings can also have a significant impact on the rating. House 1 has a mix of floor coverings, but if all floors were covered with ceramic tiles the rating would improve by 0.2 stars. In timber floored houses, carpet adds around R0.3 insulation to the floor, and in cold climates carpeting all timber floors (if uninsulated) would also improve the rating by 0.2 stars. While this is not as significant an effect on the rating as other areas, it is still important to get this right. Study respondents were also asked to nominate the area of floors with different types of floor covering and to note the area of any floor which was above outdoor air, i.e. not above ground, a subfloor or another room or dwelling.
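As a rough illustration of the floor-covering effect described above, the sketch below adds the approximately R0.3 quoted for carpet to a bare timber floor. The bare-floor R value is a hypothetical figure, and the star impacts quoted in the text (around 0.2 stars) depend on climate.

    # Illustration of the carpet effect quoted above: carpet adds roughly R0.3 to a
    # timber floor.  The bare-floor R value is a hypothetical assumption.

    R_CARPET = 0.3             # added thermal resistance quoted in the text (m2.K/W)
    r_bare_timber_floor = 0.6  # hypothetical uninsulated timber floor system R value

    r_with_carpet = r_bare_timber_floor + R_CARPET
    print("Bare floor:  R%.1f" % r_bare_timber_floor)
    print("With carpet: R%.1f (about %.0f%% more resistance)"
          % (r_with_carpet, 100.0 * R_CARPET / r_bare_timber_floor))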

Table 37 shows the percentage of assessors whose answers were within 10% of the correct answers.

Table 37 Assessors with correct floor type areas (% of answers correct within 10%, and associated average star error for correct and incorrect answers)
House 1: floating timber 98.9%, ceramic 86.0%, above outdoor air 65.6%
House 2: floating timber 55.7%, ceramic 59.1%, above outdoor air 71.6%
House 3: floating timber 96.1%, ceramic 53.2%, above outdoor air 45.5%
House 4: floating timber 89.5%, ceramic 68.6%, above outdoor air 75.6%

Assessors who correctly entered the area of carpet and the area of floor above outdoor air had more accurate average ratings than those who did not, but the trend observed was relatively weak. The fact that the areas of floating timber and ceramic tiles showed no relationship between a correct answer and a more accurate average rating is likely due to other errors masking the impact of these floor coverings.

Windows and skylights

Getting the type, opening style and size of windows correct is another fundamental skill for NatHERS assessors. In the field, achieving compliance with regulatory minimum ratings is highly dependent on window size, orientation and properties. The last half-star improvement to achieve the regulatory minimum is almost always a matter of fine tuning window performance. Improvements to window performance are also much more expensive on a cost per square metre basis than, for example, insulation. This means that accurate data entry for windows is vital to achieving accurate ratings and minimising the cost of compliance.

Study participants were asked to identify the areas and types of the various windows in each house. The plans included the U value and SHGC of each window, so that identifying the window type was simply a matter of identifying the window in the software with these properties. BERS Pro has different sets of generic windows to AccuRate and FirstRate5 and displays different U values and SHGCs to those shown in the other software for the same windows. While all plans included U values and SHGCs for both BERS Pro and the other software, there was some confusion among BERS Pro users about which windows to use in BERS Pro during the study. This confusion may account for some of the errors found in reporting window types. Table 38 aggregates all questions regarding window areas and the type of window used in the rating. If the window areas for each type of window are within 10% of the correct answer, the area of windows is assumed to be correct.
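The window-area check used in this analysis is simple: sum height by width over the window schedule and test whether a reported total falls within the tolerance. The sketch below illustrates this with a hypothetical schedule, not one taken from the study houses.

    # Sketch of the window-area tolerance check described above.
    # The schedule entries are hypothetical, not taken from the study houses.

    def schedule_area(schedule):
        """Total glazed area (m2) from (height_m, width_m) schedule entries."""
        return sum(h * w for h, w in schedule)

    def within_tolerance(reported_m2, correct_m2, tol=0.10):
        """True if the reported area is within tol (10% here) of the correct area."""
        return abs(reported_m2 - correct_m2) <= tol * correct_m2

    schedule = [(1.2, 1.8), (0.6, 0.9), (2.1, 1.8), (1.2, 0.6)]  # hypothetical windows
    correct = schedule_area(schedule)                             # 7.20 m2
    print(within_tolerance(6.7, correct))            # True at the 10% tolerance
    print(within_tolerance(6.7, correct, tol=0.05))  # False at the tighter 5% tolerance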

76 Table 38 Correct responses to questions on window areas of different types in each house Window Data House House 1 House 2 House 3 House 4 Aluminium Single size 92.5% 84.1% 84.4% 88.4% Aluminium Double type 94.6% 87.5% 46.8% 86.0% Aluminium Double Size 91.4% 84.1% 97.4% 81.4% Aluminium Double to corridor type NA NA NA 87.2% Window area in corridor NA NA NA 81.4% Timber single type 91.4% 68.2% 80.5% NA Timber single area 91.4% 68.2% 80.5% NA Window type timber double 82.8% 62.5% 79.2% 89.5% No. awning windows 90.3% 59.1% 57.1% 97.7% No sliding windows 63.4% 52.3% 92.2% 80.2% No fixed windows 36.6% 21.6% 85.7% 36.0% No. Sliding Doors 47.3% 100.0% 92.2% 77.9% Timber doors with glazed inserts are counted as timber framed windows. A number of the errors in the estimation of timber single glazing in house 2 appear to be as a result of assessors not including the glazed area of the entry door. No significant difference in star rating error was seen for those who reported the correct window areas and types compared to those who incorrectly reported window areas and types shown in While assessors may not have correctly allocated the correct type of window if the area was close and a window with similar U value and SHGC was selected then little rating error would be expected. This together with the masking effect of other errors may account for the lack of correlation between incorrect answers and less accurate ratings. The total window area reported by assessors was also examined. This analysis showed that on average 81% of assessors entered window areas within 5% of the correct area. On average the window area entered was 100.3% of the correct area. Assessor estimates of total window area by house and software type were examined to see if errors in the estimation of window area were associated with particular houses or software Table 39 shows the percentage of assessors who entered window areas within 5% of the correct window area for each house as well as the 5 th and 95 th percentile of window areas entered as a percentage of the correct area 90% of assessor estimates of window area fall between these two limits. 76 P a g e

77 Table 39 Assessors with correct estimates of window area by for each house House Assessors within 5% of correct area 5th percentile of areas 95th percentile of areas House % 95% 105% House % 98% 109% House % 89% 123% House % 63% 123% Assessors had particular difficulty entering correct window areas in house 3 and 4. House 4 errors were particularly large and this is due in part to the large area of glass in the corridor zone required to be created by Technical Note 1. A number of assessors did not create this corridor zone at all and in these cases the window area will be significantly lower than the correct answer. All plans included window schedules with the height and width of windows clearly shown. Table 40 shows the percentage of assessments which reported a window area within 5% of the correct area for each software type together with the average window area calculated for all assessments undertaken with this software type as a percentage of the correct answer. Table 40 Assessments with a window area within 5% of the correct answer by software type Software Assessments with correct window Average window area as a % of area correct area AccuRate 75.0% 103% BERS Pro 76.9% 99% FirstRate5 88.4% 100% AccuRate users made the greatest number of errors in reporting window size. This was expected as AccuRate users must enter window height and width by hand. BERS Pro and FirstRate5 users trace window widths off the plan and only need to enter window height which ensures that the window width is right and makes it harder to overlook windows as they are obvious from the plan. The impact of assessor estimates of total window area on rating accuracy was also examined. Table 41 shows the average rating error by house and software type. Table 41 Impact of entering correct window area on star rating error House Correct Window Area Average error in star rating AccuRate BERS Pro FirstRate5 Incorrect Window Area Correct Window Area Incorrect Window Area Correct Window Area Incorrect Window Area House House House House P a g e

It has been difficult to find any trend linking star rating error with responses to individual questions, because the large number of errors made means that the impact on star rating outcomes has been lost in the masking effect of other errors. Despite this, some significant reductions in rating error were observed where assessors did enter a glazing area within 5%, particularly for FirstRate5 users in houses 3 and 4 and BERS Pro users in house 4. The fact that any trend was observed at all, given the confounding effects of errors in other data inputs, emphasises how important it is to enter accurate window areas. Many plans in the field are provided to assessors without the window schedules and full window data that were supplied with the plans in this study, so there is plenty of scope for even larger errors in the field. The key to getting correct window areas is taking a systematic and methodical approach to the data entry of window size and type.

House 3 included 3 skylights on the upper floor. Around 22% of assessors rating house 3 did not identify that there were skylights in the house. While the skylights were shown in lighter grey, they were still clearly marked and easy to see if one zoomed in on the plan sufficiently. This is a significant oversight, and Cert IV and CPD training should include information on identifying the graphic symbols used to indicate skylights.

Eaves

During the rating file checks for house 1 it was observed that the eave offset (the distance between the top of the wall and the underside of the eave) had a very high error rate. Table 42 shows the percentage of assessments with the wrong eave offset in the three software packages for house 1.

Table 42 Ratings with incorrect eave offset by software type
AccuRate 89%
BERS Pro 79%
FirstRate5 37%

The amount of the offset was small, and so the impact of the error on the rating is also small. The error rate for AccuRate and BERS Pro users is nevertheless particularly high.

Overshadowing

Correctly modelling overshadowing by obstructions is another key skill required to achieve accurate NatHERS ratings, and it significantly affects the design strategies needed to achieve compliance with minimum regulatory ratings. Correctly modelling obstructions can mean that glazing which would otherwise create high cooling loads does not need to be tinted or shaded with blinds. It may also mean that glazing which would normally reduce heating loads due to winter solar gains actually increases heating loads. Correctly modelling surrounding overshadowing obstructions will often require the entry of more than one shading screen. Figure 22 shows that in a typical suburban situation three shading screens may need to be modelled.

Figure 22 Modelling overshadowing obstructions

In houses 1 and 2, three shading screens (the fence, the adjacent wall and the roof ridge) must be modelled at each side boundary. The plans showed the relative levels of the floor in the building to be modelled, the height of the adjacent fence, and the relative level of the adjacent wall/eave line and ridges. In these houses, obstruction heights for the adjacent building are calculated by subtracting the relative level of the floor from the relative level of the wall and ridge. This caused some confusion among assessors, and several entered the relative level of the obstructions without taking away the relative floor height of the dwelling being rated.

Technical Note 1 contains specific instructions on the modelling of overshadowing obstructions. Even if neighbouring buildings are not shown on the plan, obstructions are required to be modelled on the lots on either side of the dwelling. The only exception is for south boundaries in cool climates. This is a fairly new procedure for most assessors. House 1 showed only one house on an adjacent lot, but obstructions were required to be modelled on both sides of the building. Several assessors did not enter the obstruction on the lot where no house was shown.

Assessors were asked to nominate the height and width of the obstruction and the distance from the wall to the adjacent obstruction. Dimensions were only required to be input for one shading screen. The fact that there was only one entry when three shading screens were required caused some confusion about which shading screen was to be input. This was an oversight in the design of the study and should be addressed in future studies. In an attempt to overcome this limitation, each rating file submitted for house 1 was checked to see whether assessors had correctly modelled obstructions. Table 43 shows the results of this checking process.
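The obstruction-height calculation described above amounts to subtracting the relative level (RL) of the subject dwelling's floor from the RL of each obstruction. The sketch below shows this for the three shading screens; all RL figures are hypothetical.

    # Sketch of the obstruction-height calculation described above: the height of
    # each shading screen is its relative level minus the RL of the floor of the
    # dwelling being rated.  All RL values below are hypothetical.

    def obstruction_height(rl_obstruction_m, rl_subject_floor_m):
        """Height of a shading screen above the rated dwelling's floor level (m)."""
        return rl_obstruction_m - rl_subject_floor_m

    rl_floor = 100.35  # RL of the subject dwelling's floor (hypothetical)
    screens = {
        "boundary fence":  101.90,
        "neighbour eave":  103.10,
        "neighbour ridge": 105.40,
    }
    for name, rl in screens.items():
        print("%-15s %.2f m above the subject floor" % (name, obstruction_height(rl, rl_floor)))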

Table 43 Extent of errors in modelling of obstructions (proportion of ratings with incorrect modelling of obstructions)
AccuRate 74%
BERS Pro 100%
FirstRate5 76%

The overwhelming majority of assessors did not model obstructions properly. Typical errors included:
Only modelling the effect of the closest wall of the adjacent building and not the ridge or the fence;
Calculating the distance to the obstruction by measuring the distance between the walls of the buildings, not accounting for the fact that it is the eave and not the wall that casts the shadow;
Failing to model an obstruction on the adjacent lot where no house was shown on the lot;
Incorrectly calculating the height of the obstruction by not taking into account the relative level of the dwelling being modelled;
In FirstRate5, not connecting the obstruction screen to the wall;
Not taking into account the height of the dwelling floor level above ground when calculating the height of the fence; and
Incorrectly measuring the distance from the dwelling to the obstruction.

The site plan showing the obstructions was at a different scale to the other plans, which is typical in the building industry, and some assessors appear to have either not taken this into account or scaled incorrectly.

The height of the dwelling floor above ground also has a significant impact on the modelling of overshadowing effects. In checking the rating files for house 1, many assessors were found to have not entered the height of the building above ground correctly. This height also affects the calculated site wind speed. While the impact on the rating may be small, it is still important that assessors model this correctly. Table 44 shows the percentage of assessors who did not model the floor height above ground properly.

Table 44 Extent of errors found in the modelling of floor height above ground
AccuRate 84%
BERS Pro 91%
FirstRate5 39%

The overwhelming majority of these errors were simply not entering the height and leaving it at the default value.

8.4 Appendix 4: Assessor practices

Participants in this study were not only asked to enter the results of their assessment into the survey website, but were also asked several questions about their work practices. The questions asked about rating practices were based on ABSA's Success Traits for assessors, a list of traits that ABSA developed because they are likely to be associated with good assessors. Table 45 describes the success traits and the questions asked in this study about the work practices which would reflect them.

Table 45 Data gathered on assessor work practices

Success traits:
Passion for accuracy, getting it right and checking results.
Methodical and systematic, using documented work procedures and taking the time to work through the details correctly.
Ability to read plans correctly (especially in 3D) and understand building documentation.
Good working knowledge of how buildings go together, building physics, thermal performance of materials, construction practices and construction costs.
Wants to know the leading edge and trends in new materials, innovations, software upgrades and regulatory changes.
Regular use of the rating tools and understanding of their potential and limitations.
Committed to a good energy efficiency outcome rather than any outcome that meets the Code.
Passionate about the sustainability of the built environment and climate change science generally.

Verification questions:
On average, for a typical house rating, how much time do you spend checking your data inputs?
When rating a house, what sort of procedures do you follow?
For most of the ratings I do, how easy is it to visualise how a three dimensional building will be constructed from a 2D set of plans?
What is the extent of on-site building industry experience that you have?
We are interested in finding out how difficult it might be for assessors to get the right thermal performance information on new materials to help them model dwellings properly. How easy do you find it to obtain this information?
How many products do you find you need to keep technical data sheets for in your professional library?
We are interested in how assessors handle the issue of advising clients on the cost impacts of their recommendations. Which of the following best describes your approach?
Do you refer to the National Construction Code (BCA) in the course of your work?
Do you refer to the State or Territory guidelines and practice notes (e.g. the BASIX Protocol in NSW, Building Commission PN in Victoria) in the course of your work?
We are interested in how assessors use the software tools available. Which of the following best describes the version of software you are using?
How often do you contact your clients to ensure they are satisfied with your service?
How do you handle rating situations that you consider are outside your knowledge and expertise?
How do you help clients resolve compliance when a dwelling you have rated does not pass the necessary thermal performance requirements?
How interested are you in broader sustainability, low carbon and climate change science issues that are not associated with thermal performance rating?

The study investigated whether any of these traits were indeed associated with assessors who obtained more accurate ratings. In general, no statistically significant link was found between good practices and rating accuracy. This does not suggest that these success traits are unimportant, simply that, again, the extent of errors was so large that it may have masked differences in rating outcomes.

82 8.4.1 Quality control methods used by assessors Checking procedures Virtually all assessors used some kind of checking procedures although the most common type of procedure was an informal approach (33%) where assessors rate the dwelling using a similar procedure each time. Assessors who use a formal procedure or do a manual check represent only just over one third of the sample. Study participant rating results were checked to see whether the time taken to check ratings or the rating check method they reported led to more accurate ratings. No statistically significant trends were observed linking reported checking and accuracy of rating. This does not mean that checking ratings is of no consequence. Rather, there were so many errors that these mask any trend that can be seen. Furthermore, assessors may check their ratings, but if they don t understand what the correct data is in the first place, this checking will not improve accuracy Feedback from clients Another important aspect of quality control is obtaining feedback from clients. Around half of assessors never or only occasionally seek formal feedback from their clients. (Figure 23) This is not necessarily an indicator of poor practices as feedback may be given by clients on a job by job basis, or the assessor may be rating their own designs and so does not have an external client. Nevertheless seeking feedback from clients is still an important part of running a successful business. This is often something that assessors may find hard as it can be difficult to deal with negative feedback. 82 P a g e

83 Figure 23 Feedback from clients sought by assessors Approach to complex ratings Assessors are sometimes confronted with ratings for complex dwellings which are outside their normal experience and skill set. Study participants were asked what they did in these cases. When confronted with complex dwellings around half of assessors attempt to rate the dwelling without any reference to more experienced colleagues. In a highly competitive rating market, this is understandable: assessors do not want to lose work to other assessors. The analysis in section 6.5.1, shows that as the house becomes more complex the extent of error in the star rating increases, and there are far more complex dwellings being constructed than those in this study. Consequently, not seeking help may not be the best response when confronted with a complex house. However, there was no statistically significant difference in the accuracy of ratings between assessors with only a few years of experience and those with several years of experience, so there appears to be no guarantee that a more experienced assessor will be able to help. 83 P a g e

84 8.5 Appendix 5: House plans Features of houses design to test assessor accuracy Table 46 Specific features of houses designed to test assessor accuracy Area Tested House 1 House 2 House 3 House 4 Constructions Floor construction: waffle pod Correct Areas of floor coverings Garage has solid brick external wall Overshadowing Building on adjacent lot: modelling all 3 screens Procedures for vacant lots Reading plans Understanding relative levels Correct orientation from lot bearings As for house 1 except Garage has brick veneer internal wall Mixed wall materials, some walls have different materials on lower and upper portions As for house 1 As for house 1 Floor heights of levels Mixed wall materials, some walls have different materials on lower and upper portions Modelling of spandrel glass Identification of skylight Construction details of shared walls Area of floor overhanging lower level Building on adjacent lots correct height allowing for floor level of room Wing wall effects of adjacent units Modelling of light court Determining correct house details from plans of multi-unit development Interpretation of orientation from north point where no bearing are given Floor heights of levels Construction of party, internal and external walls nonstandard types Modelling of shared floor construction Correct Areas of floor coverings NA Correct orientation from lot bearings 84 P a g e

Zoning
  House 1: Combining zones: all wet areas can be combined; Beds 2-4 can be combined; WIR and Bed 1 can be combined. Ensure that Kitchen/Family, Entry, Living, Hall and Ensuite are separate zones.
  House 2: Entry, Living, Study and stair zoning: combining all of them is a typical error. Combining appropriate zones: WIRs in bedrooms, pantry with kitchen. Combining all of WIR, Ensuite, WC and Bed 1 is a common error. Zoning and conditioning of powder room and adjacent hall. Inclusion of service ducts in the parent zone.
  House 3: Zoning of WC on the ground floor. Separation of Kitchen/Dining and Living rooms. Zoning of stairwell and service ducts.
  House 4: Modelling of the shared corridor as an unconditioned zone. Treatment of internal hall spaces as separate zones.

Horizontal shade
  House 1: General eave depth and offset. Modelling of outdoor living shade. Treatment of porch and its structure.
  House 2: As for House 1, but treatment of the balcony rather than the porch. Correct eave offsets for the lower portion of walls.
  House 3: Modelling of entry porch depth, offset and wing wall. Modelling of pergola at rear.
  House 4: General eave depth and offset of balcony.

Windows
  House 1: Window size. Openability: some non-standard openability configurations. Finding an appropriate window type from U-value and SHGC. Modelling of glass inserts in the front door.
  House 2: As for House 1.
  House 3: As for House 1.
  House 4: As for House 1.

Ceiling penetrations
  House 1: Allocation of correct uninsulated areas for ceiling fans and downlights where these exceed 0.55% of ceiling area (a worked example of this threshold check follows the table). Determine whether downlights have air leakage.
  House 2: Ensure no downlights are entered. Ensure ceiling registers for ducted heating are not included.
  House 3: Does not include skylights, and no downlights to the roof. Ensure ceiling registers for ducted heating are not included.
  House 4: Allocation of correct uninsulated areas for ceiling fans and downlights where these exceed 0.5% of ceiling area.

Research
  House 1: Foil properties of Enviroseal Roof Tile Plus. Solar absorptance of roof tiles from colour name.
  House 2: Solar absorptance of roof tiles from colour name.
  House 3: Spandrel glazing.

Roof space modelling
  House 2: Three separate roof spaces modelled. Adjacency of internal walls to roof spaces.
  House 3: Modelling of roof not as an attic.

Two storey modelling
  House 1: NA.
  House 2: Proper connection of upper and lower zones. Allowance for floor level when entering overshadowing.
  House 3: As for House 2.

Party wall construction system and adjacency to other units
  House 1: NA.
  House 2: NA.
  House 4: Side walls fully shared on one side, partially shared on the other.
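The ceiling penetration items above hinge on a simple area check: uninsulated openings for downlights and ceiling fans only need to be allocated once their combined area exceeds the stated fraction of the zone's ceiling area (0.5%, or 0.55% as listed for House 1). The sketch below illustrates that arithmetic only; the function names, fixture sizes and default threshold are assumptions made for this example and are not taken from the study or from any NatHERS software tool.

# Illustrative sketch of the ceiling-penetration threshold check described in
# Table 46. All names and the example fixture sizes are assumptions made for
# this example; they are not taken from the study or from any NatHERS tool.
import math

def penetration_area_m2(count: int, diameter_mm: float) -> float:
    """Combined opening area (m^2) of `count` circular fixtures of the given diameter."""
    radius_m = diameter_mm / 1000.0 / 2.0
    return count * math.pi * radius_m ** 2

def exceeds_threshold(ceiling_area_m2: float,
                      fixtures: list[tuple[int, float]],
                      threshold_fraction: float = 0.005) -> bool:
    """True when the total fixture area exceeds the threshold fraction (default 0.5%)
    of the zone's ceiling area, i.e. when an uninsulated area should be allocated."""
    total = sum(penetration_area_m2(n, d) for n, d in fixtures)
    return total > threshold_fraction * ceiling_area_m2

# A 20 m2 zone: six 90 mm downlights stay under 0.5% of ceiling area (0.10 m2),
# but sixteen 100 mm downlights plus a 300 mm exhaust fan exceed it.
print(exceeds_threshold(20.0, [(6, 90.0)]))                 # False (~0.038 m2)
print(exceeds_threshold(20.0, [(16, 100.0), (1, 300.0)]))   # True  (~0.20 m2)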

8.5.2 House 1

[House 1 plan drawings]


8.5.3 House 2

[House 2 plan drawings]


8.5.4 House 3

[House 3 plan drawings]

